
Ariadne Solutions provides AI-enabled data review processes through its lead product, Red Thread.
By Stephanie Pasas-Farmer, Ph.D.
Drug development is at a tipping point. The emergence of novel therapies, personalized, patient-focused medicine, and high-velocity technologies is transforming the way clinical development is conducted. Bioanalysis, however, while foundational to drug development, remains labor-intensive, technically challenging, and often the rate-limiting step to efficient drug development. In spite of recent advances combining traditional and new bioanalytical technologies, there is an urgent need for new approaches in bioanalysis that increase throughput and improve quality, both critical steps to modernizing drug development. The future of bioanalysis depends on innovative technology that continues to evolve its capabilities to make these advances possible. The dearth of cutting-edge bioanalytical auditing tools, and the stark reality that bioanalytical scientists are, in general, a risk-averse group performing high-risk tasks, compound these challenges into a daunting barrier to innovation. This article discusses current bioanalytical trends and explores how artificial intelligence (AI)-enabled applications, paired with bioanalytical expertise, can solve these challenges.
Decades of Decentralization
Over the past 15 to 20 years, the role of bioanalysis has gradually been decoupled and commoditized. As the early focus on small molecules diversified into specialized work driven by new technologies and a growing variety of molecule types, so did the pattern of research and development (R&D) outsourcing. The increased pace of bioanalytical outsourcing coincided with the rise of biologic drugs and combination techniques. The pharmaceutical industry began divesting functional departments and relying on strategic, full-service outsourcing models, rationalized by a desire to efficiently manage rising R&D costs.1 Within bioanalytical departments, outsourced work shifted from predominantly clinical sample analysis, using a method developed and validated by the pharmaceutical company and transferred to a contract research organization (CRO), to method development, validation, and sample analysis all being conducted by the CRO.2 Today, bioanalysis is primarily outsourced and has expanded in scope, driven by more early-stage work and the growth in large molecule drugs. While decades of outsourcing have produced a thriving contract bioanalytical lab industry with centers of excellence, they have also dispersed bioanalytical domain expertise into the contract lab and research organization space, with major consequences.3 One consequence of this change is the continued commoditization of bioanalysis, a trend that fuels the perception that, at best, assay robustness can be taken for granted and, at worst, that bioanalysis is simple and off the critical path of drug development.
Large molecule bioanalysis is complex, and bioanalytical expertise is critical to navigate scientific and regulatory challenges as they arise. This level of expertise is often hard to find. To continue progressing toward the next generation of life-changing and life-saving therapies, it is important that the industry understands the need for specialized bioanalysis in biomarkers, immunogenicity, and cell-based assays. Today, large molecules represent more than 50% of the global biopharma pipeline, with growth projections of more than 70% in the next five years.4
Today: More Complex Drug Development
The surge of technically challenging delivery technologies has significantly increased the complexity of bioanalysis. Reliance on a handful of techniques has given way to a diverse set of technologies required to support challenging biomarker, immunogenicity, pharmacokinetic (PK), and cell-based assay requirements. As drug types have evolved from small molecules to biologics and gene therapies, so has the difficulty of monitoring drug levels in the body, which remains the primary objective of the bioanalytical scientist. In turn, the bioanalytical community has adopted new types of technologies and tools traditionally used in the clinical laboratory setting.5
A notable example is the Meso Scale Discovery (MSD) platform, which over a decade ago introduced electrochemiluminescence (ECL) for quantitative bioanalysis. Since then, ECL has become widely recognized as the gold standard for immunogenicity/anti-drug antibody (ADA) testing.6,7
Flow cytometry is another example of a technology shift in regulated bioanalysis. A well-established tool for whole-cell and biomarker analysis, flow cytometry has moved from its origins in diagnostic clinical laboratories into the regulated bioanalytical laboratory when traditional PK bioassays are not available or applicable. Beyond whole-cell analysis for drugs such as CAR-T cell therapies, flow cytometry has also supported biomarker and ADA testing. However, the current lack of formal validation guidance from the regulatory authorities has been a barrier to adoption for some within the regulated bioanalytical field.8,9
In parallel with drug development complexity is the rise of more complex unmet medical needs. In previous decades, bioanalysis was often focused on a single disease-state marker and a single target, such as lowering cholesterol. Now, oncology treatments represent a larger proportion of the global pipeline and typically involve multiple markers and disease targets.10
The prevalence of more complex disease states with multiple targets has required the bioanalytical groups supporting these projects to develop and validate multiple assays. What was once a single-assay requirement has expanded to between 5 and 10 assays, all of which need to be validated or qualified in a regulated bioanalysis group and laboratory.
Regulatory compliance requirements in biomarker analysis only continue to rise in importance as potential new oncology therapies proliferate. The most recent regulatory guidance, issued in May 2018, now requires biomarker assays to be validated under a regulated bioanalysis approach in support of such complex drug development programs.9 Previously, these assays were either not required, were completed in a discovery mode, or were performed in a clinical testing lab along with other clinical panels. The absence of explicit guidance until 2018 also gave rise to other issues impacting quality and consistency:
- Criteria for assay validation and use in clinical or preclinical sample analysis are still highly variable across functional groups, organizations, laboratories, and even from person to person.
- Best practices have slowly emerged from industry leaders collaborating, but these conversations can be held up by legal concerns, scientific concerns, and a fear of sharing what has worked and what has failed.
- Better tools are needed but new technology is subject to mistrust, long implementation times, and/or “pilotitis.”
- Larger amounts of data can overwhelm traditional practices of data review and quality monitoring, with potential negative consequences for data integrity.
Equally important to technology innovations are subject matter experts who can effectively use the tools to capture and deliver robust, accurate scientific data in an efficient manner.
A Closer Look at Bioanalytical Quality
Bioanalytical quality and consistency continue to impact current treatments. Consider the example of checkpoint inhibitors. In the last decade, six immune checkpoint inhibitors have been approved to treat patients with a range of advanced solid tumors and hematological malignancies and improve their prognosis. However, these therapies have well-documented assay deficiencies, outlined below:
- As of January 2019, three of the six approved monoclonal antibodies (mAbs) had post-marketing commitments related to ADA assays as a condition of regulatory approval.
- Approved products with post-marketing commitments involving ADA assays include Bavencio, Tecentriq, and Imfinzi, a group of mAbs all directed against PD-L1. Speculation centers on the higher dosing requirements associated with PD-L1 antibodies, a factor that complicates developing an ADA assay that meets current regulatory guidelines.
- The post-approval commitment for Bavencio includes these requirements: “Conduct an assessment of treatment-emergent binding and neutralizing anti-drug antibody (ADA) responses with validated assays (including an updated cutpoint for the screening and confirmatory ADA assays and for the neutralizing assay as requested in 3185-5) capable of sensitively detecting ADA responses in the presence of avelumab levels that are expected to be present in the serum at the time of patient sampling. The incidence of treatment-emergent ADA responses will be evaluated in at least 300 avelumab-treated patients.”12
- The post-approval commitment for Tecentriq includes these requirements: “develop and validate an assay for the improved sensitivity for the detection of neutralizing antibodies against atezolizumab in the presence of atezolizumab levels that are expected to be present in samples at the time of patient sampling. Patient samples should be banked for storage until the improved method is available.”13
- The post-approval commitment requirements for Imfinzi include: “Conduct drug tolerance studies for the screening, confirmatory, titering, and triple mutation assays that are in the range of trough concentration of 182 µg/ml to better demonstrate that the assay can detect anti-drug antibodies in the presence of drug.”14
What do these post-marketing commitments tell us? There are major deficiencies in the immunogenicity programs for three drugs. Approval was granted due to high unmet medical need, but all three commitments indicate the assays need to be redeveloped and revalidated. Importantly, the Bavencio example goes beyond revalidation and reanalysis and includes the requirement to analyze 300 subjects receiving drug. This essentially requires an additional clinical trial if the program has not saved samples from an older trial, or if the previous trials do not meet the sampling timepoints laid out by the agency. The examples above highlight the grave challenges the industry faces with data quality and integrity. To address such challenges, we are in dire need of sophisticated solutions to meet regulatory rigor, ensure efficacy, and protect patient safety.
Bioanalytical Challenges and AI
The pressure to accelerate critical tasks, such as data transformation, analysis, and follow-on decision-making, continually expands in all phases of drug development. Within early clinical-stage bioanalysis, data review for large reports ideally requires 14 days to account for the multi-layered input of stakeholders spanning bioanalytical scientists, management, quality control, and quality assurance. However, current cycle times are typically three days, leading to condensed review timelines and overburdened data reviewers.
Further, in spite of all the advances generated from instrumentation, electronic laboratory notebooks, and patient-facing tools, there is no automated end-to-end solution for bioanalysis, and the bioanalytical workflow still ends with manual data review. Even the gold standard of laboratory information management is still semi-automated, requiring manual re-entry of summary data from raw data. Bioanalysis is typically the rate-limiting step for preparation of the clinical study report.
Many current AI applications in drug development focus on time and cost efficiencies involving large data sets and automation. A common mantra is to “automate the mundane to focus on what matters.” AI is well positioned to handle not only high-volume bioanalysis data but also the complexity of the rules that must be applied to link and contextualize disparate data in key functions of assay development and sample analysis.
Bioanalytical Challenges and Ariadne’s Red Thread
Ariadne Solutions is focused on solving these and similar challenges through AI-enabled applications. Red Thread, the company’s lead product, automates data review processes during method validation, sample analysis, report writing, and audit preparation. It accelerates a time-intensive review process and applies sensitive techniques to detect trends of importance or potential compliance issues within bioanalytical data sets.
For validation and sample analysis, Red Thread combines expert systems, computer vision, and natural language processing techniques to streamline data analysis across data sets and rapidly detect patterns of significance. Automation in the drug development space can reduce manual effort from days to hours. In bioanalysis, the data review burden adds up quickly: method validation reports typically involve up to 30 data tables, and assay validation summaries range between 30 and 60 data tables. Interim data reviews powered by Red Thread provide the ability to find problems early and avoid costly re-work.
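To make the interim-review idea concrete, the sketch below illustrates the kind of rule-based acceptance check an automated review of accuracy and precision tables could apply. The function, column names, and thresholds (bias and CV within 15%, widened to 20% at the LLOQ, mirroring common chromatographic-assay acceptance criteria) are assumptions chosen for this example only; this is not a description of Red Thread's internal logic.

```python
# Illustrative only: a minimal rule-based check of the kind an automated
# interim review might apply to an accuracy/precision table. Thresholds and
# column names are assumptions, not the product's actual rules.
import pandas as pd

def flag_accuracy_precision(table: pd.DataFrame) -> pd.DataFrame:
    """Flag QC levels whose bias or CV exceeds their acceptance limit.

    Expects columns: 'level' ('LLOQ', 'Low', 'Mid', 'High'),
    'bias_pct' (relative error, %) and 'cv_pct' (coefficient of variation, %).
    """
    # 15% limit for standard QC levels, widened to 20% at the LLOQ.
    limits = table["level"].map(lambda lvl: 20.0 if lvl == "LLOQ" else 15.0)
    flagged = table.copy()
    flagged["limit_pct"] = limits
    flagged["flagged"] = (flagged["bias_pct"].abs() > limits) | (flagged["cv_pct"] > limits)
    return flagged

# Example: one intra-run accuracy/precision summary table.
run = pd.DataFrame({
    "level": ["LLOQ", "Low", "Mid", "High"],
    "bias_pct": [-18.2, -4.1, 2.3, 16.5],
    "cv_pct": [12.4, 6.8, 5.1, 7.9],
})
print(flag_accuracy_precision(run))  # the High QC exceeds the 15% bias limit
```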
For report writing, natural language processing and computer vision techniques are paired to extract all reviewable data from virtually any format. Reports are auto-generated, eliminating manual re-entry of summary table data.
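As a purely generic stand-in for the extraction step described above (not the product's actual pipeline), the sketch below pulls tabular data out of a PDF report with the open-source pdfplumber library so summary tables never have to be re-typed; the function and file name are hypothetical.

```python
# Illustrative only: generic PDF table extraction so summary data can be
# reused programmatically instead of being re-entered by hand.
import pdfplumber
import pandas as pd

def extract_summary_tables(pdf_path: str) -> list[pd.DataFrame]:
    """Pull every detectable table out of a PDF, one DataFrame per table."""
    frames = []
    with pdfplumber.open(pdf_path) as pdf:
        for page in pdf.pages:
            for table in page.extract_tables():
                header, *rows = table  # first row treated as the header
                frames.append(pd.DataFrame(rows, columns=header))
    return frames

# Usage (hypothetical file name):
# tables = extract_summary_tables("method_validation_report.pdf")
# tables[0].to_csv("accuracy_precision_summary.csv", index=False)
```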
Red Thread also provides an early warning system by monitoring bioanalytical data sets across studies or programs. Three levels categorize risk by the potential severity of impact to the drug development program: quality issues are flagged with green, yellow, and red indicators corresponding to passing data, borderline failures, and failed data, respectively.
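The traffic-light categorization can be pictured as a simple severity mapping. The snippet below is a hypothetical illustration of that idea; the 10% "borderline" margin, names, and logic are invented for the sketch rather than taken from the product.

```python
# Hypothetical illustration of a traffic-light severity mapping for assay
# results; the 10% borderline margin and names are invented for this sketch.
from enum import Enum

class Risk(Enum):
    GREEN = "passing data"
    YELLOW = "borderline failure"
    RED = "failed data"

def categorize(value: float, limit: float, borderline_margin: float = 0.10) -> Risk:
    """Map a result against its acceptance limit to a traffic-light risk level."""
    if value <= limit:
        return Risk.GREEN
    if value <= limit * (1 + borderline_margin):
        return Risk.YELLOW  # just outside the limit: worth an early look
    return Risk.RED

print(categorize(14.0, 15.0))  # Risk.GREEN
print(categorize(16.2, 15.0))  # Risk.YELLOW
print(categorize(22.0, 15.0))  # Risk.RED
```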
One of the major goals of Red Thread and other AI-enabled applications is to build a robust bioanalytical profile of promising drug candidates, while rapidly identifying assay-related problems at the earliest stages of development, before potential risks and development costs escalate.
The Future
AI applications are poised to solve the real problems of escalating inefficiency, rising costs, and a lack of available resources. While the future looks bright, we are in the early stages of transformation.
Bioanalysis is inherently complex, and no new technology can operate in a bubble. Red Thread and other emerging technologies need to be operated by experienced scientists with the expertise to use them well. All new technology will need to evolve rapidly with industry trends, regulatory guidance, and the advanced computing power that makes these applications possible. For now, as AI begins to live up to its potential, many organizations are working to integrate AI into their current business infrastructure in incremental steps.
References
1. Datin, J. (2018, August 22). The Evolution of Bioanalytical Outsourcing Partnerships.
2. Spooner, N. (2017, August 22). Outsourcing strategies in bioanalysis.
3. Hayes, R. (2017, August 1). Bioanalytical outsourcing: transitioning from Pharma to CRO.
4. Datin, J. (2019, August 12). Podcast Feature: Enabling the Breakthroughs of Tomorrow, Today.
5. Degg, B. (2014, November 20). The Development of Bioanalytical Methods for Pharmaceutical and Clinical Research. Chromatography Online, 10(21). Accessed October 29, 2019.
6. D, J. (2015, September 7). Recommendations for the development and validation of confirmatory anti-drug antibody assays.
7. Gunn, G. R., Sealey, D. C. F., Jamali, F., Meibohm, B., Ghosh, S., & Shankar, G. (2016, January 19). From the bench to clinical practice: understanding the challenges and uncertainties in immunogenicity testing for biopharmaceuticals.
8. van der Strate, B., Longdin, R., Geerlings, M., Bachmayer, N., Cavallin, M., Litwin, V., … Fjording, M. S. (2017, August 2). Bioanalysis.
9. Bioanalysis. (2018, November 29).
10. IQVIA Institute Report. (2018). Global Oncology Trends 2018. Accessed October 27, 2019.
11. Center for Drug Evaluation and Research. (2018, December). Biomarker Qualification: Evidentiary Framework. Retrieved October 29, 2019.
12. Department of Health and Human Services. Approval letter. accessdata.fda.gov. (n.d.).
13. Department of Health and Human Services. Approval letter. accessdata.fda.gov. (n.d.).
14. BLA 761069/S-013 Supplement Approval. accessdata.fda.gov. (n.d.).