July 21, 2023 PAO-07-23-RT-01
The potential use cases for AI applications in the pharma space are seemingly only limited by the creativity of the industry. If news stories are correct, billions of dollars have already been invested within the industry to bring AI to reality. We envision three different impacts.
The traditional industrial uses of AI — for supply chain visibility, supplier risk assessments, preventive maintenance of equipment, and even detailed production scheduling — are the most likely to mature first in the pharma space. As the industry levels up its digitalization, these application sets will be the first entrants, with several key producers already building use cases in these areas. The pandemic provided the impetus for this transition, as all producers faced numerous supply chain issues and shortages of excipients, APIs, and starter materials. It stands to reason that moving to AI-centered platforms will bring the industry to new levels of productive efficiency, agility, and flexibility.
It is also easy to envision vast improvements in the business side of pharma organizations. For example, it is very likely that predictive analytics will be part of the standard toolset used for market sizing. The vast stores of demographic and population data, attitude studies, patient experiences, doctor conversion rates, and so on can be leveraged to better estimate forecasts for a new drug. The irony of using AI to improve the transactional intimacy with a potential end consumer should not be overlooked.
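To make the idea of predictive analytics for market sizing concrete, here is a minimal sketch, with entirely invented numbers, of fitting a least-squares trend to historical prescription counts and extrapolating it forward; real forecasts would draw on the demographic, attitudinal, and patient-experience data described above.

```python
# Hypothetical sketch: fitting a simple least-squares trend to historical
# prescription counts to project demand for a new therapy. All data and
# variable names are illustrative, not drawn from any real market study.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x on paired lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# Quarterly prescription counts (made-up numbers)
quarters = [1, 2, 3, 4, 5, 6]
scripts = [120, 150, 180, 200, 240, 260]

a, b = fit_line(quarters, scripts)
forecast_q8 = a + b * 8  # extrapolate two quarters ahead
```

A production model would of course replace the straight line with richer features (demographics, conversion rates, seasonality), but the shape of the exercise is the same: fit on history, project forward.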
On the product development side, the rationale for AI is simple: it will dramatically advance drug discovery and development. AI is exceptional at sorting and applying algorithms and rules to large repositories of data (studies, chemical compounds and molecular structures, patient information, etc.); this will create exceptional efficiencies in how the industry designs molecules. With several AI-developed drugs already in the pipeline, it is hard not to see AI as a dramatic step change for the industry.
A wide-reaching and surprisingly subtle technology such as AI, which is already affecting our lives to a much larger degree than most of us are aware of, is hard to predict — especially in the areas of pharma and biopharma. With the ever-increasing arsenal of documented chemical reactions, I believe a primary function of AI will certainly be to sift through the literature to find ways of making as-yet inaccessible pharmaceuticals or to find new routes to make existing pharmaceuticals in more cost-effective ways. Current AI-based synthesis planning tools leave a lot to be desired, and I am excited to see how future algorithms can streamline the overwhelming task of planning a multi-step synthesis. While of course much more limited in scope, in our own work at Alconox Inc., we are also looking to apply AI to optimize detergent selection in order to improve service to our varied industry customers. Although AI is quite powerful, it still requires human oversight. Thus, while we are excited about the possibilities AI offers, we pride ourselves on our customer service and will always put you directly in touch with me or one of our other Alconox technical support experts.
The applications of AI in general are exciting and a bit terrifying. While the applications for AI in the biopharma field are seemingly endless, I am most excited about the utilization of AI in the real-time monitoring and analysis of clinical trial data in combination with advanced omics technologies, enabling newer “dynamic” clinical trial designs; and connecting this analysis to everything from improving patient recruitment and avoiding obstacles in the regulatory approval process to feeding back into guiding and streamlining discovery efforts. This entire section of the field generates incomprehensible amounts of data within relatively defined constructs, and this is exactly where AI and machine learning really excel — processing, recording, and identifying trends in enormous data sets within closed systems.
It’s great to see increasing adoption of AI and machine learning (ML) tools throughout the life sciences. One area where we believe there is a lot of promise is in the bioprocessing space. These tools are enabling improvements in strain selection, process optimization, scale-up, and control strategies for commercial-scale production. These can help drive improvements at many different stages and hopefully drive down costs of manufacturing.
Additionally, AI and ML tools are well suited to make sense of increasingly large and complex data sets with many applications in earlier stages of R&D. Today, target identification is rate-limited by what we can measure and disease models. As improvements in measurement technologies and automation allow for higher throughput and multiplexed approaches, AI and ML can help us gain insight from information that is simply too complex for humans to process alone. Taken together, these will continue to accelerate the design–test–learn cycle and drive faster time to market for bioproducts.
From early drug discovery to human trials, companies use AI/ML to deliver intelligent solutions to decrease timelines and lower failure rates — improving patient outcomes more quickly.
AI/ML in target discovery is critical because it is the genesis of drug development. In the early 21st century, there has been a significant increase in new drug modalities, such as immuno-oncology (e.g., CAR-T, bispecifics) and RNA therapeutics (e.g., ASO, RNAi). We often see the same targets used for multiple drugs and across many indications. Recycling old drug targets is problematic because they were not developed in the context of new modalities. Therefore, we must find new, optimized drug targets.
At Envisagenics, we use AI/ML to discover and develop immunotherapies and RNA therapeutics using transcriptomics data. We analyze thousands of RNA-seq samples to identify druggable and tumor-specific neoantigens for oncology and RNA splicing modulators for neurodegenerative diseases. We qualify target specificity and safety during the discovery process. In conventional drug discovery, these attributes are evaluated at a later stage — wasting time, resources, and money. Through AI/ML, we deliver novel candidates for modern modalities in a fraction of the time, while de-risking drug assets earlier to increase the probability of clinical success.
AI and machine learning play a critical role in the biopharma industry. The best use cases for these technologies are drug discovery, drug manufacturing, diagnostic assistance, and optimizing medical treatment processes. Over the years, biologics drug discovery in oncology, chronic and rare diseases, and personalized medicine (such as theranostics) has become increasingly competitive and expensive, which has driven pharmaceutical companies to look to AI as a means of reducing research and development costs while avoiding costly errors.
AI has great potential to transform drug discovery by accelerating the R&D timeline to make drugs more affordable and improve the probability of U.S. FDA approval. AI and machine learning algorithms can identify appropriate biomarkers and innovative molecules that may have failed in mid- to late-stage clinical trials and predict how the same compound could be applied to target the same disease in smaller patient populations, or how the drug could be repurposed to enhance patient health in other diseases, particularly cancer and chronic diseases, which are leading causes of death in the United States.
In the process development and manufacturing of biologics sector, AI can provide various opportunities to improve processes. This includes but is not limited to performing quality control testing of the drugs during in-line manufacturing processes, as well as maintenance of batch-to-batch consistency, reducing material waste, reducing production failure rate, and enabling faster production and more consistently meeting the product critical quality attributes (CQAs). The development of predictive biomarkers for better screening and diagnosing patients is continuing to be a key emerging area of AI application. Furthermore, machine learning can help forecast and prevent over-demand and under-demand, as well as fix supply chain challenges and failures during product manufacturing.
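The demand-forecasting idea above can be sketched very simply. The following toy example, with fabricated figures, applies exponential smoothing to historical batch demand to flag a potential supply gap; production systems would of course use far richer models and real planning data.

```python
# Illustrative sketch (not any vendor's actual system): simple exponential
# smoothing over historical batch demand to flag likely over- or
# under-supply. Figures and the threshold logic are invented.

def exponential_smoothing(series, alpha=0.5):
    """Return the smoothed level after processing the whole series."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

monthly_demand = [100, 110, 95, 130, 125, 140]  # batches requested (made up)
planned_supply = 115                            # batches scheduled next month

forecast = exponential_smoothing(monthly_demand)
status = "under-supply risk" if forecast > planned_supply else "over-supply risk"
```

The point of even a sketch like this is that the forecast reacts to the recent upward trend, which a static plan would miss.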
With the increasing complexity of biologics and their manufacturing processes, along with rising demand for efficiency and better product quality, modern manufacturing systems are endeavoring to transfer human knowledge to AI and machine learning. As these tools become more accessible in the coming years, they will facilitate faster and cheaper drug discovery, manufacturing, and commercialization, thereby becoming part of the natural process in the biopharma industry — not only for large multinational companies but also for mid-sized to small pharmaceutical companies and CDMOs.
The increased adoption of AI and machine learning holds significant promise in several areas of the pharmaceutical and biopharmaceutical industry. An exciting contribution that AI can make is to offer proven solutions to technical challenges and quality investigations. In drug discovery and development, AI and machine learning can assist in identifying potential drug candidates, predicting their efficacy, optimizing molecular structures, and reducing the time and cost of the drug discovery process.

Today, we all start from scratch when it comes to problem-solving. We rely on the small group of individuals that a company employs to formulate solutions to given problems, whether those problems are in R&D, manufacturing, or compliance. As a result, the approach to resolving an issue is predetermined by the experiences of that small group. Some in the group may have worked in the industry for many years, while others may be early in their careers and have less experience to offer. The variable competence of a group means that some problems will take longer to resolve and, worse still, may never be solved. If unresolved, the situation can repeat itself and become costly or impact patient safety.

With AI, tapping into a vast pool of industry-wide knowledge becomes possible. There would be no need to reinvent the wheel for a particular OOS investigation or to determine whether a process deviation is harmless or will result in an unacceptable stability profile. We would all become more competent faster, and product quality and patient safety would significantly improve over the status quo. The open question is whether the industry will willingly share such experiences for global use or whether companies will see such knowledge as proprietary to maintain a competitive edge. Such a capability would improve the performance and image of our industry, which is committed to manufacturing and administering safe drugs to a needy patient population.
Proprietary knowledge is best kept at the drug discovery and rational design level, not for compliance or process deviations.
Using AI as a predictive method in the production of critical raw materials would be very exciting. Being able to take historical data from raw material production and link it to final drug output would be incredible. Understanding the variation throughout an entire production process should enable more efficient production and, ultimately, cost savings. It should also enable us to identify the impact of a raw material like ours on other processes further down the line. AI could use all of these data to create a model without bias or conflicting theories. The model could then help dictate operating parameters, from critical raw materials through to finished drug production.
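One simple first step toward the kind of model described above is checking which raw-material attributes actually track final output. This sketch, with fabricated lot data, computes a Pearson correlation between raw-material purity and final yield; the attribute names and numbers are purely illustrative.

```python
# Toy sketch: correlating a raw-material attribute with final product
# yield across historical lots to see which inputs drive output.
# All lot data below are fabricated for illustration.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Historical lots: raw-material purity (%) vs. final drug yield (%)
purity = [97.1, 98.0, 96.5, 99.2, 98.7]
yield_pct = [88.0, 90.5, 86.0, 93.0, 91.5]

r = pearson(purity, yield_pct)
strong_driver = abs(r) > 0.8  # flag attributes worth modeling further
```

Attributes that survive a screen like this would then feed the fuller predictive model linking raw materials to finished drug output.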
Ultimately, AI is about increasing the cycles of innovation that improve the discovery, design, and delivery of the therapies to patients.
The obvious use for AI within the pharma industry is upstream, in the early-phase drug discovery and screening process, where many drug candidates are evaluated to determine which should be selected for clinical testing. Currently, it is common to screen thousands of potential candidates to get to a targeted product for a particular indication. Compiling all the data and feeding them into an AI engine should significantly reduce the number of candidates by utilizing the learning nature of AI to determine the best chemistries to use. AI within these data-rich environments would prove tremendously valuable in developing and executing predictive models. This type of data analysis goes well beyond the tried-and-true design of experiments (DOE) approach and has the potential to significantly decrease development timelines and costs.
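To make the idea of AI-driven candidate triage concrete, here is a deliberately tiny sketch: a nearest-neighbour lookup over made-up molecular descriptors that labels untested compounds by similarity to assayed ones. Real screening models are vastly more sophisticated; this only shows the shape of the approach.

```python
# Toy illustration (not a production screening model): classify an
# untested compound by its nearest assayed neighbour in descriptor
# space. Descriptor values and labels below are invented.

def euclidean(a, b):
    """Euclidean distance between two descriptor tuples."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# (molecular weight / 100, logP) for assayed compounds, with outcomes
training = [
    ((3.2, 1.1), "active"),
    ((4.5, 3.8), "inactive"),
    ((3.0, 0.9), "active"),
    ((5.1, 4.2), "inactive"),
]

def predict(descriptor):
    """Label an untested compound by its nearest assayed neighbour."""
    nearest = min(training, key=lambda t: euclidean(t[0], descriptor))
    return nearest[1]

label = predict((3.1, 1.0))  # prioritize this candidate if "active"
```

Even this crude rule captures the core value proposition: every assay result enriches the model, so each screening round needs fewer wet-lab experiments than the last.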
However, we are most excited to explore the potential for AI in the downstream processing that a CDMO like PCI Pharma Services engages in. We see potential in feeding all of the experimental data from formulation and lyophilization cycle development into AI engines to again reduce the number of experiments and process parameters, by learning and projecting likely successful parameters to establish stable manufacturing conditions. These tools could also be used to run our equipment more efficiently, from fill machines to lyophilizers. Once again, if we have data from each and every run capturing the product's attributes (viscosity, surface tension, density, charge, etc.) along with the types of materials used in processing (e.g., tubing, bags, fill needles), the hope is that we will materially decrease development lead times and save our clients and the healthcare system significant cost.
Another very practical application of AI is its ability to increase quality and mitigate risk. Inspection systems, as an example, are an excellent use of AI principles, and we already see significant breakthroughs in the advancement of inspection technologies. With the deployment and progression of AI solutions, technologies can learn and become far more robust in their ability to view and interpret potential defects and their relative severity. Furthermore, PCI has implemented cobot technologies for certain manufacturing and packaging applications. Application of cobots certainly brings ergonomic and safety benefits, but these learning robots can add significantly more value as smart solutions, as both inspection and control mechanisms, and as affording real-time data capture for process analytical technologies (PAT). The coupling of cobots and machine learning inspection systems opens tremendous value in optimizing manufacturing and packaging operations.
AI and machine learning enable personalized treatment approaches by analyzing individual patient data, including genetic information, medical records, and lifestyle factors. These technologies can identify biomarkers, predict disease progression, and recommend tailored treatment plans, improving patient outcomes and reducing trial and error in treatment selection. They can also analyze large data sets, predict the efficacy and safety of drug candidates, and optimize molecular structures, helping researchers narrow the search space and identify potential therapeutic candidates more efficiently. It's important to note that while AI and machine learning hold great potential in these areas, their adoption is still ongoing, and there are challenges to address, including data privacy, regulatory considerations, and ethical implications.
When coupled with machine learning and predictive analytics, the AI transformation has the potential to address some of the biggest challenges in the biopharma supply chain. Within the process innovation group at Samsung Biologics, our responsibility is to continuously look for opportunities where we can optimize the overall supply chain management process.
There is currently an overall lack of agility in pharmaceutical supply chains; AI-based technologies could provide complete visibility with predictive data and enable us to conduct more thorough yet faster data analyses, which could forecast potential issues at the early drug production stage and prevent production delays or errors. This means we can predict hurdles and properly allocate resources, which would significantly reduce logistics costs and gaps in the pipeline.
One of the most challenging of the remaining questions in gene expression research is how gene sequence information translates into protein function. The advent of structure prediction software, like AlphaFold, has to a large extent solved the problem of predicting protein structure from gene sequence data, but we are still far away from being able to predict protein function from gene sequence data or even from protein structure. If computational methods like AI can improve the predictability of structure/function models, it would not only reduce the need for experimental testing of the function of new proteins, but it would also greatly facilitate rational design of proteins. In turn, this would vastly simplify drug discovery, allowing biotech companies to perform in silico design of novel proteins for specific purposes to a much greater extent than today rather than screening for natural proteins with the desired function.
I’m most excited about the improvements to the manufacturing processes. Biologics, and in particular personalized medicines, are a challenge due to their high degree of variability. But given enough data, advanced analytics will be able to provide incredible insights. Not just on the process control side, but really in how the whole facility operates. By analyzing business data, lab data, process data, and people data, the combination of predictive and generative AI tools will give our industry new ideas on how to organize and operate our facilities.
Of course, drug discovery has been and will continue to be greatly impacted, and this is where the bulk of capital is being allocated right now. This is exciting too, but I think one of today’s biggest challenges facing biologics manufacturers is scale and cost of goods. AI and machine learning (ML) are much more powerful than people in generating insights from large amounts of unstructured data. With manufacturers continuing to build robust data platforms and building the infrastructure to begin making these insights, I think the progress we will make in the near term will be more than impressive.
I am also very excited for the unknown. AI was always a pillar of Industry 4.0, but even five years ago I’m not sure we could imagine the impact it is having today. If you ask me this question in another few years, my answer may be completely different.
There are many opportunities for AI/ML across the entire clinical trials value chain, but in terms of adoption, what's most exciting right now is seeing AI/ML applications that can help us scale and manage some of the most challenging and time-consuming facets of managing clinical data in trials. Everyone across the industry is looking for cycle time improvements and efficiency gains. Clinical data is the currency of life sciences, but, historically, manual methods have persisted within the data processes of clinical development. As the volumes of data collected from patients grew, it became unscalable to manage those data without applying advanced data science approaches like ML models or AI. Now we are in a place where interest and intent in AI is being matched with heightened adoption of approaches that are available today. Using AI models, we can automate data review to identify data quality issues and augment the work of data reviewers so that they can identify issues that would otherwise have taken significant time. AI can detect outliers and data issues and quickly surface these as outputs that a data reviewer — the “human-in-the-loop” — can address and act on, removing the need to manually work through these incredible numbers of individual data points. Applying AI to these clinical data use cases will create efficiencies and decrease cycle times across clinical trials.
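The automated data review described above can be illustrated with a minimal sketch: flagging statistical outliers so that only the surfaced points reach the human-in-the-loop reviewer. The readings and threshold below are invented for demonstration and are far simpler than any real clinical data review model.

```python
# Hypothetical sketch of automated data review: flag measurements whose
# z-score exceeds a threshold so a human reviewer inspects only the
# surfaced points. Readings and threshold are illustrative.

def flag_outliers(values, threshold=2.0):
    """Return indices of values more than `threshold` SDs from the mean."""
    n = len(values)
    mean = sum(values) / n
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return [i for i, v in enumerate(values)
            if abs(v - mean) / std > threshold]

# Systolic blood pressure readings from one trial site (made-up values)
readings = [118, 122, 119, 121, 210, 117, 120]
for_review = flag_outliers(readings)  # only these go to the reviewer
```

Here the reviewer sees one flagged reading instead of scanning all seven, which is the essence of the cycle-time gain, scaled up to millions of data points.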
Most companies are striving to adopt AI and machine learning capabilities to accelerate their biopharma discovery engines. Early adopters are identifying new potential targets for drug development and designing new compounds using AI to augment research expertise and intuition in ways never seen before.
We’re already seeing with our customers how this technology is advancing the design of small molecule compounds and therapeutics for neurological disorders. It will be very exciting to see even better predictions of efficacy and potential side effects using real-world patient data. Neurological disorders are extremely complex, and beginning to understand the root cause of these disorders by combining genomic and proteomic data with our increasing understanding of brain physiology and microanatomy can allow researchers to design more targeted therapeutics with increased efficacy and fewer side effects.
Amyotrophic lateral sclerosis, Huntington's disease, Alzheimer's disease, and Parkinson's disease are all devastating disorders with no known cures. Current treatments can slow the progression and manage symptoms of these diseases, but there is so much left to learn about the underlying causes and mechanisms. AI and machine learning can integrate and analyze the vast amounts of data being generated about these diseases giving us new insights to develop more advanced treatments and eventually cures.
The potential applications of machine learning in pharma/biopharma are vast, and the technology is likely to transform the industry in the years to come. Like many industries, the life sciences industry overall needs to continue to evolve and improve efficiencies. Machine learning can help optimize clinical trials by identifying patient populations that are most likely to respond to a particular therapy. AI algorithms can help identify and screen patients who are suitable for clinical trials by analyzing large data sets and electronic health records. The algorithms can match patients with specific eligibility criteria and engage them through online platforms, leading to faster enrollment and recruitment. These applications could potentially be useful in streamlining the variability and complexity of clinical trial insurance. AI can monitor patients remotely, eliminating time-consuming and costly regular hospital or site visits. AI has the potential to improve the efficiency, accuracy, and safety of clinical trials by optimizing various processes from recruitment to patient monitoring and data analysis.
I’m most excited to see AI and machine learning moving from buzzwords in healthcare to real-world applications powering personalized medicine at scale.
Traditionally, even though we know patients’ medical records provide tremendous insight into how their conditions evolve over time, much of that richness has been inaccessible. AI and machine learning have allowed us to identify patterns within these real-world data at scale, associated with outcomes we care about — presence of a rare disease, for example, or risk of rapid progression. With this information, we can build a “digital fingerprint” for patients we’re focused on, and then use that fingerprint to compare patient records in new data sources. This approach lets us bring another buzzword — personalized medicine — to the forefront of clinical care. By applying AI calibrated using larger data sets to a single patient’s data, clinicians can run digital assessment panels for multiple outcomes at once and help with point-of-care decision-making around diagnoses, treatments, and even clinical trial participation, based on analytically rigorous risk-benefit predictions. For life science companies, this personalization helps highlight patients who could benefit most from available treatments and also spotlights areas of greatest unmet need for future development.
In recent years, AI and machine learning tools have made this kind of personalization possible. We’re now getting past the hype and seeing broader real-world adoption of these tools. As their use grows, I’m excited to see stronger cases built for AI-powered clinical decision support to get the right treatment to the right patient at the right time.
I think AI and machine learning will greatly accelerate biomarker and drug target discovery and lead to more personalized medicines. This has long been a goal of AI in the biotech industry, which has, until now, fallen short — but as AI innovations continue at this breakneck pace, boosting its predictive capabilities with each new cycle, AI-driven biomarker discovery and drug development will reach new heights never before achieved.
At Faeth, we use machine learning to match cancer patients with the best precision nutrition treatment. Our MetabOS™ platform combines machine learning and functional genomics to uncover the precise nutrient vulnerabilities for a tumor based on genotype and organ of origin. We sequence a patient’s tumor biopsy and run the genetic information through the MetabOS™ platform, which identifies the specific nutrients the tumor relies on to survive and resist treatments. We then enroll patients in the trial best fit for that metabolic trait. In the future, we expect to tailor these interventions based on individual RNA sequencing data, advancing towards truly personalized medicine. We’ve generated thousands of potential strategies for nutrient restriction in cancer treatment, illustrating the power of machine learning in uncovering new therapeutic avenues. We envision AI and machine learning not only illuminating current scientific understanding but also sparking new insights in cancer diagnostics and therapeutics. We can go beyond broad treatment groups based on one or two genetic signals to nutrient control tailored to individual genetic signatures.
Artificial Intelligence allows drug development companies to screen a high volume of compounds to assess structure–activity relationships (SAR) for disease indications. It can help shorten and decrease the cost of drug discovery through several pathways. With vast amounts of data, models can predict what might be the most clinically successful candidate.
Once an AI hypothesis is identified, it must be tested. For most drug development companies, that means animal models. MedPharm offers an additional step before the cost, time, and poor translatability of animal testing: we continue to optimize human ex vivo models that assess the hypotheses from AI models, which can improve a compound's development risk profile before animal or human testing.
In one business case, MedPharm partnered with a company that specializes in AI modeling and had used AI to identify a drug candidate that could target atopic dermatitis. Once the mechanism of action was predicted, MedPharm employed its ex vivo models to test specific inflammatory pathways. In this case, MedPharm was tasked with assessing the drug candidate's pharmacological effects using an ex vivo human skin model for atopic dermatitis by evaluating multiple inflammatory pathways, findings that will need to be confirmed in animal and human clinical studies.
More drugs identified by AI or machine learning and further verified by ex vivo models should lead to better drug discovery at a faster rate. For diseases with unmet needs or no existing therapeutic treatments, this could significantly accelerate drug development and increase the chances of success.
When it comes to cell and gene therapy manufacturing, we need to inject AI into areas that can streamline data input, output, and interpretation. Human error is inevitable given the mass quantities and complexities of data generated as part of every manufacturing process. Inserting algorithms that can simplify the process would be extremely beneficial in shortening timelines and increasing therapy accessibility for patients.
However, with only six T cell drugs approved for treating diseases, I believe our industry is still some time away from utilizing the power of AI to meaningfully reduce manufacturing complexity. Machine learning is most effective when working with well-structured data, but this space remains highly siloed and far from standardized. We are at a crossroads as we begin to integrate automation, but early efforts have increased complexity rather than reducing it. That not only bakes in complexity that perpetuates longer timelines, cash burn, and lack of scalability, but it also limits applications for AI tools.
The marriage of cutting-edge AI with cutting-edge cell and gene therapies seems like a natural fit, and the potential benefits of integration are thrilling. But we also know the downside to doing too much too fast; after all, this is a field that created cures for deadly diseases but is still struggling to make enough of them. In order to reach a place where AI can truly change the paradigm, we must continue to focus on simplification.
Technologies such as AI and ML are increasingly being used in biopharmaceutical process development, where multiple parameters can be analyzed automatically during manufacturing stages to ensure product quality remains within specification. Process efficiency can be increased, reducing cost and accelerating timelines.
The advantages of using AI and ML are in the detection and extraction of information from patterns in complex data sets, and they have opened the door to innovative startups with capabilities to identify new drug targets.
There is also potential for AI and ML to assist in identifying which drug candidates will work safely in humans, not only to guide candidate selection but also, with the advent of the FDA Modernization Act authorizing the use of computer models as an alternative to animal testing in some circumstances, to reduce reliance on animal studies. This can again save time and money in the clinical development of new drugs.
Outside of direct impact on drug development, there are other advantages that AI and ML potentially possess. With the ability to analyze real-time data, AI can monitor global health to improve disease surveillance, to alert authorities to potential new outbreaks and facilitate an earlier response. By analyzing data from global news, online search activity, discussion sites, and public health databases, AI can detect patterns that highlight the emergence of a new disease or the wider spread of an existing one, giving public health officials a new tool to fight future pandemics.
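A toy version of the surveillance signal detection described above: raise an alert when today's case count exceeds a multiple of the trailing-week baseline. The counts and threshold are synthetic, and real systems blend many data streams (news, search activity, health databases) rather than a single count.

```python
# Illustrative sketch of outbreak signal detection on synthetic counts:
# alert when today's reports exceed `factor` times the recent mean.

def spike_alert(history, today, factor=2.0):
    """True if today's count is more than `factor` times the recent mean."""
    baseline = sum(history) / len(history)
    return today > factor * baseline

weekly_counts = [12, 9, 14, 11, 10, 13, 12]  # reports per day, past week
alert = spike_alert(weekly_counts, today=40)
```

Even this crude rule shows the principle: a statistical baseline turns a raw stream of reports into an early-warning signal for public health officials.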
Whether we are empowering researchers in the lab to find the next scientific breakthrough or supporting our customers in the manufacturing and testing of novel therapies and diagnostics, AI is at the forefront of nearly every digital initiative now underway in our Life Science business.
As millions of people wait on the promise of new drugs and therapies coming to market, one of the most exciting applications of AI and machine learning systems is to accelerate drug discovery and reduce its cost.
According to Bekryl, a market research firm, AI has the potential to offer more than U.S. $70 billion in savings for the drug discovery process by 2028.
To identify potential target molecules to treat diseases, researchers traditionally carry out large screens of molecular libraries, with numerous rounds of testing needed to arrive at a promising compound.
The sheer size of the libraries used to screen for new drug candidates means it’s now practically impossible for individual researchers to review everything themselves — and that’s where AI and machine learning can help.
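The narrowing step described above can be sketched as a similarity search: ranking library members by Tanimoto similarity to a known active compound. The fingerprints here are tiny hand-made bit sets, not real chemical fingerprints, and the compound names are invented.

```python
# Simplified sketch of similarity-based library triage. Fingerprints are
# represented as sets of "on" bit positions; all values are made up.

def tanimoto(a, b):
    """Tanimoto coefficient between two sets of 'on' fingerprint bits."""
    union = len(a | b)
    return len(a & b) / union if union else 0.0

query = {1, 4, 7, 9}  # bits set in the reference active's fingerprint
library = {
    "cmpd_A": {1, 4, 7, 8},
    "cmpd_B": {2, 3, 5, 6},
    "cmpd_C": {1, 4, 7, 9, 11},
}

# Rank the library by similarity to the known active, best first
ranked = sorted(library, key=lambda k: tanimoto(query, library[k]),
                reverse=True)
top_hit = ranked[0]
```

Applied to millions of compounds rather than three, this is how a machine pre-ranks a library so that researchers only inspect the most promising fraction.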
The Life Science business of Merck KGaA, Darmstadt, Germany, which operates as MilliporeSigma in the United States and Canada, with its deep scientific expertise, helps researchers with efficient technology platforms that drive faster development of better therapies to patients. Our teams have developed software that combines AI, machine learning, and computer-aided drug-design methods, which is a valuable toolkit for our customers in discovery stages.
Good examples are AIDDISON™ drug discovery software and Synthia™ Retrosynthesis Software — which combined can offer scientists intelligent, improved, faster, and novel approaches to drug discovery while considering sustainability in the design. This increases the success rate of lead candidates in drug discovery and development.
While AIDDISON™ drug discovery software can generate new molecules tailored to needs, explore the vast chemical space, and refine the search with machine learning pharmacokinetic models and 3D docking tools, Synthia™ Retrosynthesis Software allows for the identification of safer, more cost-effective, higher-yield routes to chemical targets out of tens of thousands of potential approaches.
By marrying generative AI to chemical synthesis, we can let the computer focus on the more prosaic, time-consuming tasks, freeing up the scientist to conduct the experiments that require a human touch.
In the not-so-distant future, AI and machine learning will be used across biology and medicine to make powerful predictions: who will get sick, what the trajectory of their disease will look like, and what medicine they are most likely to benefit from. This leap in our understanding of disease onset, progression, and treatment will happen thanks to the integration of AI and machine learning using multi-omic data — using the holistic context of genomics, transcriptomics, metabolomics, and proteomics to understand biology and human health like never before.

In the past few months, generative AI tools have improved at a breakneck pace to make highly accurate predictions using contextual clues. This same power can be applied in biology, using different data points across nucleic acids and proteins to fill in gaps in our understanding of the connection between genotype and phenotype, between what’s happening inside our bodies at the molecular level and how that manifests as health … or disease.

Thanks to an influx of biological data, particularly from genetic sequencing, it’s now feasible to apply the predictive power of AI and machine learning to biology. We’ll need data across populations, however, to realize the true strength of these tools: from thousands of patients, at different time points in progression of a disease, such as cancer, to how those patients respond when receiving different drugs. AI can help to identify specific biological signatures that correlate with disease risk, and match patients with the drug that has the greatest chance of helping them.
AI and machine learning are already beginning to greatly enhance the capabilities of the entire industry. The first area is drug discovery, where AI is being deployed to predict protein folding and interaction modeling. As first-generation molecules discovered with the help of various machine learning techniques make their way through the development process, it will be amazing to see their impact both on patients and the industry, and I expect the role of these advanced technologies in the drug development process will explode.
Another space ripe for AI adoption is cell therapy manufacturing, where a prediction algorithm to improve cell culture would be immensely useful. Cell culture is a tedious and complex step in cell therapy production, and each subtype within the diverse set of cell therapy applications utilizes cell culture under a unique set of conditions. Available data sets can be used to train machine learning models, and as more culture experiments feed the models, their predictions will improve. The ability to predict cell culture parameters based on such analytical capabilities would be an exciting advance.
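The core idea — predicting an outcome for proposed culture conditions from accumulated past runs — can be sketched very simply. The run data, parameter names, and a basic nearest-neighbors estimator below are purely illustrative; a production system would use far richer features and a proper ML framework:

```python
# Minimal sketch: estimate a culture outcome from similar historical runs.
# All data points and parameter names are invented for illustration.

def predict_yield(query, history, k=3):
    """k-nearest-neighbors estimate of viable cell density
    from hypothetical (pH, temperature C, culture days) conditions."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(history, key=lambda run: dist(run[0], query))[:k]
    return sum(outcome for _, outcome in nearest) / k

# (pH, temperature C, culture days) -> viable cells (millions/mL), invented
history = [
    ((7.0, 37.0, 5), 2.1),
    ((7.2, 37.0, 5), 2.4),
    ((6.8, 36.5, 6), 1.8),
    ((7.1, 37.5, 4), 2.0),
    ((7.0, 36.0, 7), 1.5),
]

estimate = predict_yield((7.1, 37.0, 5), history, k=3)
```

Each new completed run appended to `history` sharpens future estimates — the "more experiments feed the models" loop described above.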
LabVantage Solutions is excited to see increased adoption of AI and machine learning in the pharma/biopharma industry. One promising area is drug discovery and development, where AI can analyze vast amounts of data to identify potential drug targets and predict the efficacy and safety of new compounds. This accelerates the process and increases success rates.
Another area is clinical trials and patient recruitment. AI and semantic search technologies can efficiently sift through data to identify suitable participants, leading to faster recruitment and improved trial outcomes.
AI also enhances pharmacovigilance and drug safety monitoring by analyzing real-world data to detect patterns indicating safety concerns or adverse reactions. This proactive approach improves patient safety and regulatory compliance.
Manufacturing and quality control processes can benefit enormously from AI-powered analytics. Real-time monitoring and analysis of production data can detect anomalies, predict failures, optimize processes, and ensure adherence to quality standards, leading to increased efficiency and improved product quality.
AI and machine learning are already making an impact in the pharma/biopharma industry and have huge potential for greater contributions in the future. These tools can revolutionize drug discovery, enhance clinical trials, improve drug safety monitoring, and optimize manufacturing processes. LabVantage’s mission is to leverage synergies from our AI and semantic search capabilities to enable pharma and biopharma companies to harness these technologies and drive innovation in their respective fields.
We’ve already seen the impact of these technologies in respiratory trials, where AI-enabled, imaging-based digital biomarkers are delivering highly precise measures of lung physiology to better understand the impact of a therapy, its mechanism of action, and the characteristics of responders. With these insights, sponsors can make faster go/no-go decisions, increase the power of a study, and more tightly define inclusion criteria — all of which accelerate a therapy’s timeline. We are at a point where we can produce up to 15,000 individual metrics per imaging study across over 50 validated imaging biomarkers.
AI can be applied beyond the analysis of an image. We are increasingly using AI/ML in new ways to improve data quality and optimize clinical trial imaging. I’ll share three examples.
We will continue to see increased adoption of AI to streamline the clinical trial process — from recruitment to analysis — so promising therapies can reach patients sooner. Imaging data are highly structured, making them a great source of data for AI applications.
Recent advancements in artificial intelligence are fundamentally reshaping the landscape of opportunities across all major industry sectors and aspects of life. This surge of significant breakthroughs, which further accelerated in recent months, continues to provide notable benefits in areas like healthcare, biotechnology, education, and agriculture, to mention a few. The advent of the fine-tuned skill-as-a-service framework provides innovative tools capable of harnessing significant bodies of knowledge while delivering on-demand expertise. The cascade of knowledge synthesis and problem solving now requires fewer search (and research) cycles, as we shift our focus from serially retrieving answers to requesting machines to deliver an outcome.
Advancements in these technologies are consistently providing superior outcomes across a broad array of tasks. We now have more adaptable automation capabilities at our fingertips, which can seamlessly adjust to new assignments, as we are further removing barriers around querying and programming. The tectonic shift from formal language imperative programming toward natural language declarative programming means that we can describe a desired outcome through natural language, rather than instruct the computer how to go about a task in a step-by-step fashion. We're embracing innovative team dynamics through substantial human–machine collaborations, designed to engage in intricate and lengthy processes. These cooperative efforts span a spectrum of activities, from the less uncertain realm of software development to the highly uncertain pursuit of scientific discovery.
As a result, we should expect more out of our investments in drug discovery, the delivery of care, and the overall prevention of disease.
At heart and by training, I’m a computer scientist, so I’m a strong proponent of the progress we’re seeing in applying AI and machine learning to this field. One of the most interesting areas where AI could play a part in the future is clinical decision making. We have lots of great tools available, but as heuristics get more complicated, it can be difficult to decide which clinical protocol is ideal to follow. AI will be able to tease these decisions and pathways out.
The biopharma space is not like other areas, where you have a plethora of aggregate material to train these models. In this case, we are limited by what we know and what we've already done preclinically and clinically. One thing that would be really interesting as an application of AI is drug–drug combinations. Combinatorics is something that humans struggle with, since combining complex molecules is most often not a matter of straightforward cause and effect. There are a number of companies applying AI to pure drug discovery, but I think that AI’s ability to rapidly evaluate combinations of molecules could lead to situations where 1+1 = 3, from an efficacy perspective.
I’m excited to see AI and machine learning used more in predictive analytics. Especially in our work with Parkinson’s patients, combining high-quality symptom data and AI can enable accurate predictions across a slew of problems currently plaguing the field. For example, AI models could help predict when a patient might need a specific therapy, based on their detailed symptom data. Plus, predictive models can help match people with Parkinson’s to a particular clinical trial that is most likely to address the underlying cause of their symptoms. Especially as the neurodegenerative disease progresses and a patient begins to see less of a benefit from their medication, matching people with Parkinson’s to clinical trials for disease-modifying therapeutics can help give renewed hope. Another way AI can help is distinguishing between the two major types of motor symptoms a person with Parkinson’s experiences: tremors (involuntary muscle movements that are rhythmic) and dyskinesia (involuntary muscle movements that are irregular). A lot of patients themselves are unable to distinguish between tremors and dyskinesia, but treatments for the two distinct symptoms are on either end of the spectrum. It’s key that we improve patient and clinician ability to detect and predict dyskinetic episodes, and AI can help bridge this gap.
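The tremor-versus-dyskinesia distinction above hinges on exactly the kind of signal regularity a model can quantify. As a toy illustration, one could classify a movement episode by how evenly its peaks are spaced — the thresholds, data, and single-feature heuristic here are invented; real systems learn from rich wearable-sensor features:

```python
# Toy sketch of the rhythmic-vs-irregular distinction: measure regularity
# of intervals between movement peaks. Threshold and data are invented.

def classify_movement(peak_times, cv_threshold=0.2):
    """Label a movement episode by variability of inter-peak intervals.

    Rhythmic peaks (low coefficient of variation) suggest tremor;
    irregular peaks suggest dyskinesia.
    """
    intervals = [b - a for a, b in zip(peak_times, peak_times[1:])]
    mean = sum(intervals) / len(intervals)
    variance = sum((i - mean) ** 2 for i in intervals) / len(intervals)
    cv = (variance ** 0.5) / mean  # coefficient of variation
    return "tremor" if cv < cv_threshold else "dyskinesia"

rhythmic = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]     # evenly spaced peaks (seconds)
irregular = [0.0, 0.15, 0.5, 0.6, 1.1, 1.25]  # erratic spacing
```

In practice the input would be accelerometer streams rather than hand-labeled peak times, and a trained classifier would combine many such features — but the underlying signal property being exploited is the same.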
AI in biopharma is most often associated with therapeutics development, where it is routinely used for everything from discovery to clinical trials. But these algorithms are breaking through in diagnostics as well, with a growing list of potential applications.
One is accelerating molecular diagnostics development — a historically slow, laborious process that requires researchers to manually test hundreds of different combinations of sequences and primers to find the right pairing to most accurately and sensitively identify the disease. In particular, finding the right disease-specific target sequence is a challenge because it has to be both unique enough to avoid false positives and unlikely to mutate so there are few false negatives. In the new world of CRISPR-based diagnostics, we also need to add in guide RNAs, adding complexity to a process that was already akin to shooting in the dark and hoping for a hit.
Today, we’re using machine learning to predict and recommend combinations that are most likely to be effective, depending on the specific disease. Wet lab testing confirms the prediction, and data is fed back into the platform to improve the algorithms’ predictive capabilities.
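The predict–test–retrain cycle just described has a simple schematic shape: rank candidates by predicted score, assay the most promising batch, and fold the confirmed results back in. The candidate sequences, scoring model, and "wet lab" oracle below are all stand-ins for illustration:

```python
# Schematic of the predict -> wet-lab -> feedback loop. The sequences,
# model, and assay oracle are toy stand-ins, not real platform components.

def design_loop(candidates, model_score, wet_lab, rounds=2, batch=2):
    """Each round: rank untested candidates by predicted score, assay the
    top batch, and record ground-truth results for future refinement."""
    confirmed = {}
    for _ in range(rounds):
        untested = [c for c in candidates if c not in confirmed]
        ranked = sorted(untested, key=model_score, reverse=True)
        for cand in ranked[:batch]:
            confirmed[cand] = wet_lab(cand)  # ground-truth measurement
    return confirmed

# Toy stand-ins: the "model" favors GC-rich sequences; the "assay"
# simply confirms whether the score clears a bar.
candidates = ["ATGC", "GGCC", "AATT", "GCGC", "TTAA"]
model_score = lambda s: (s.count("G") + s.count("C")) / len(s)
wet_lab = lambda s: model_score(s) > 0.4  # pretend assay: True = works

results = design_loop(candidates, model_score, wet_lab)
```

In a real platform, the assay results would also retrain the scoring model between rounds — the "data is fed back" step — rather than only marking candidates as tested.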
Perhaps the most exciting opportunities for AI enhancements are the ones we don’t yet know about. It’s one of the main reasons why we integrate AI teams into every aspect of the company at Sherlock.
I’m most excited to see increased adoption of AI, and specifically large language models (LLMs), to make automation more accessible to life science labs around the world. Right now, there are robotic technologies available to automate a variety of lab protocols, but these systems are either very limited “one-trick robots” or powerful platforms that are inaccessible to anyone who isn’t a programming expert. With LLMs, however, we can begin to bring down that barrier so automation is accessible for everyone. We’ve demonstrated how this works with our own Opentrons robots: ChatGPT is capable of creating working Python code for an automated experiment, such as performing a serial dilution or setting up a PCR reaction, and this protocol will run successfully on our robots. For legacy robotics, creating a workable protocol like this often takes a great deal of time and extensive automation expertise. LLMs make it possible to design experiments in natural language, without needing to know how to code. But we’ve only just scratched the surface of what LLMs can achieve in the world of lab automation. My dream is a robot in every lab, easily programmable by any scientist with AI, that can execute reliable and reproducible experiments to push the pace of life science discovery even faster.
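Whatever natural-language front end generates the protocol, the dilution math underneath has to be right. A plain-Python sketch of the calculations a generated serial-dilution protocol would encode (volumes and concentrations are illustrative; an actual robot protocol would wrap steps like these in the vendor's protocol API):

```python
# The serial-dilution arithmetic an LLM-generated protocol must get right,
# sketched in plain Python with illustrative numbers.

def serial_dilution(stock_conc, dilution_factor, steps):
    """Concentration in each well of a serial dilution series."""
    concs = [stock_conc]
    for _ in range(steps - 1):
        concs.append(concs[-1] / dilution_factor)
    return concs

def transfer_volume(total_volume, dilution_factor):
    """Volume carried from the previous well into each new well.

    From C1*V1 = C2*V2: V1 = total_volume / dilution_factor."""
    return total_volume / dilution_factor

series = serial_dilution(100.0, 2, 5)  # µM per well: 100, 50, 25, 12.5, 6.25
carry = transfer_volume(200.0, 2)      # µL moved each step into 100 µL diluent
```

The appeal of the LLM approach is that a scientist describes the series ("two-fold, five wells, 200 µL final volume") and the generated code handles this bookkeeping plus the robot commands.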
There are a couple of interesting ways I’m excited to see AI increasingly applied in our industry. One promising route on the R&D side is utilizing AI and machine learning (ML) to make the drug development process more efficient. Leveraging AI/ML will significantly reduce development costs, make medication more affordable, and allow for quicker delivery of therapeutics to the market, providing faster relief to those in need.
From a marketing standpoint, AI/ML provides a massive opportunity to fine-tune pharma messaging to better align with the physician and patient journeys. Rather than parsing through all available data manually, AI can more quickly review and analyze that data, whether it’s data from a physician-used tool or data from a patient-worn device, so that marketers can adjust their messaging strategies appropriately for the specific audience they’re trying to reach. Ultimately, optimizing messaging through AI ensures that pharmaceutical companies are using their marketing dollars more effectively and that the right information is reaching the right physicians and patients at the right time.
The range of analytical technologies deployed in biopharma is constantly growing. I see tremendous opportunities to apply AI algorithms to raw data or metadata from these different platforms, especially where they are evaluating the same molecule, process, or system. Since many of the deployed technologies have some level of information that is confounded with other technologies, in addition to the unique attributes targeted by those methods, this will allow us to leverage a variety of data sources together that are today siloed.
I would also like to see AI applied to biological systems modeling, especially animal modeling. Preclinical testing today is both expensive and time-consuming, adding years onto the drug development process, and is often coupled with complex ethical considerations.
As an industry that is rapidly evolving and increasingly moving toward animal-free alternatives, we see opportunities across the drug discovery and development process to utilize AI and machine learning (ML) to ensure a sustainable future where responsible animal use and patient safety remain top priorities. For Charles River, our Global Technology team is integrated throughout our businesses, exploring alternative digital solutions that will allow for the responsible use of animals, including the opportunity to reduce animal usage. This integration requires close partnership with the industry, working together to ensure programs are designed with a 3Rs (reduction, replacement, and refinement) mindset. One example is already in use in early discovery — Logica, an integrated, iterative AI-driven process that delivers advanceable small molecule leads and candidates. By integrating leading AI and data generation, platforms like Logica can help advance and de-risk candidates, which should lead to reduced attrition in clinical studies.
We’re experiencing a revolution across multiple fronts in the industry. The rise of mRNA therapeutics promises a new class of programmable medicines that code for proteins capable of fundamentally changing cellular physiology to treat or cure diseases. But being able to design and engineer proteins and get them to perform specific functions is incredibly challenging. In the past, engineering a single protein was a monumental feat requiring an arduous trial-and-error process, but now with the computational power of AI and machine learning (ML) and creating bespoke algorithms, we can accomplish in weeks to months what previously took years. Additionally, each iteration and new data generated accelerates the process and improves the likelihood of success. The compression of development timelines coupled with improved chances of success will ultimately translate to safer, more effective therapeutics.
While expediting and improving our ability to engineer therapeutics, AI and ML are revolutionizing our ability to mine and interpret vast quantities of genetic code and information. Although the human genome was mapped more than 20 years ago, our understanding of the nature and organization of DNA is still growing. AI and ML have enabled predictive frameworks to program medicines with prospectively determined outcomes in terms of levels of control of gene expression and cellular programming to treat or even cure diseases. Through computationally guided programmable mRNA medicines designed to behave exactly as we intend, we could potentially address almost any disease.
We are extremely enthusiastic about the increasing implementation of AI in the field of drug discovery, particularly in the advancement of mRNA vaccines and therapeutics. While mRNA technologies have emerged as an innovative and effective approach for treating challenging conditions, certain challenges still exist, such as stability, tissue-targeted delivery, and manufacturing. We believe that there are four significant areas where AI can enhance mRNA technology and assist in designing the next generation of mRNA therapeutics, thus improving efficacy and reducing immunogenicity.
First, AI can assist in optimizing the untranslated regions (UTRs), which can significantly enhance mRNA performance. Secondly, AI can aid in optimizing codons by analyzing transcriptome and proteome data, enabling the identification of codon combinations that yield high protein expression while requiring minimal mRNA levels, thereby increasing potency. Third, AI can contribute to the discovery and optimization of lipid nanoparticles (LNPs) to facilitate efficient and targeted delivery of therapeutic payloads while minimizing systemic spill-over and potential side effects. Lastly, AI can play a role in smart antigen design, predicting future virus mutations and enabling proactive measures against frequent adaptations; this could render continuous vaccine development for new variants, such as those of SARS-CoV-2, unnecessary.
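To make the codon-optimization idea concrete, the simplest possible version picks, for each amino acid, the codon most frequently used in the target expression system. The abbreviated usage table and its numbers below are invented for illustration; real optimizers learn from transcriptome and proteome data and weigh many more factors (secondary structure, GC content, motifs):

```python
# Toy codon optimization: back-translate a protein using the highest-usage
# codon per residue. Usage table is abbreviated and its values are invented.

CODON_USAGE = {  # amino acid -> {codon: relative usage}, illustrative only
    "M": {"ATG": 1.00},
    "K": {"AAA": 0.43, "AAG": 0.57},
    "F": {"TTT": 0.46, "TTC": 0.54},
    "L": {"TTA": 0.08, "TTG": 0.13, "CTG": 0.40, "CTC": 0.20},
}

def optimize(protein):
    """Choose the most-used codon for each residue of the protein."""
    return "".join(max(CODON_USAGE[aa], key=CODON_USAGE[aa].get)
                   for aa in protein)

sequence = optimize("MKFL")
```

The AI-driven versions described above go well beyond this greedy per-residue choice, scoring whole-sequence candidates against expression data — but the search space they navigate is exactly this combinatorial choice of codons.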
We have initiated two collaborations that leverage AI to develop next-generation mRNA candidates and are actively exploring additional opportunities. One collaboration, funded by CEPI, is our partnership with DioSynvax, with the goal of developing mutation-agnostic, broadly active beta-coronavirus vaccines to prevent future pandemics. Additionally, we have teamed up with the prestigious Helmholtz Institute in Munich to optimize UTRs, aiming to significantly improve translation efficiency.
One of the biggest topics that comes up in this space is how we can use AI to better understand data. There are a lot of data generated across the life sciences industry every day. But one of the fundamental issues with these large data sets is trying to analyze them in such a way that we can parse out what is valuable and what is noise. AI and machine learning are important tools that can help us clean up data and reanalyze it for use across different applications.
So, what does that mean on a day-to-day basis? AI has the potential to help us in everything from drug discovery and clinical trial development to marketing and manufacturing. Because of the wealth of knowledge that AI can pull from, it can help us make more strategic decisions as well as automate tasks. For example, AI has the potential to help us discover new molecules, design better trials, and identify quality control issues in manufacturing. It can even help us understand how to best “talk” with HCPs and patients when it comes to commercialization.
Lastly, as we navigate through these exciting times, it’s important to remember two things. (1) Asking the AI software the right questions is an essential part of getting the results that you’re looking for. (2) AI is a powerful tool that we can use to do our work better, but we need to use it responsibly, always keeping the patient at the center of what we do.
AI/machine learning is a technology with the potential to transform drug discovery and development. The first AI-discovered drugs are already in the clinic, with small biotechs and big pharma working on hundreds of projects in this area. When applied to drug discovery, AI may be used to solve several key problems. The first is the identification of novel targets, which may be achieved through the use of language models, such as ChatGPT, which can analyze and understand vast amounts of biological information related to proteins and pathways. Another significant application is designing drugs against known hard targets or designing novel antibodies. Many companies in the AI drug discovery space are working in this direction. A completely novel application of AI is elucidating the mechanism of action of drugs. For example, drugs that are discovered using phenotypic screening methods have a significant advantage, as they work "out of the box" in the live biology of a disease model; however, understanding their mechanism of action has always been very challenging.
In the past decade, revolutionary breakthroughs in therapeutic areas, including oncology and gene therapy, have been enabled by precision medicine. Although tailoring treatments to a person’s unique biology has proved beneficial and life-saving, mental health has failed to innovate in this way. Psychiatric medications are prescribed on a trial-and-error basis. Finally, the field of psychiatry is harnessing big data and machine learning approaches to develop digital tools, diagnostics, and therapeutics that show potential to reveal and treat subpopulations of patients across CNS disease states. The identification of AI-derived brain biomarkers and development of corresponding therapeutics is enabling a precision psychiatry approach.
Until now, psychiatric diagnoses have been based on clinical symptoms, without biological biomarker tests to guide diagnosis, prognosis, or treatment selection. By combining artificial intelligence with data from behavioral tests, brain function measures including electroencephalography activity, and activity trackers, we look to optimize treatments so that their targeted nature benefits individual patients. Consequently, by moving away from the scattershot approach that pervades clinical practice today, the lives of those experiencing mental health conditions will dramatically improve.
AI and machine learning should be valuable throughout the drug development life cycle and then beyond into clinical applications. The process of treating diseases is fundamentally a process of utilizing existing knowledge and then incrementally building on it. You see it in the literature, an ongoing building of the knowledge base. Machine learning and AI are ways of building on the knowledge base but at speeds previously unthinkable. It will transform the way we characterize diseases, develop drugs, and implement them as cures.
I’m excited to see the impact that AI/ML could have in unlocking the full potential of cell and gene therapies. Cell and gene therapies are complex living medicines that have demonstrated considerable efficacy across many areas of unmet need — from serious genetic disorders to oncology and autoimmunity, these innovative new treatments are showing considerable benefit. However, their widespread usage is already being limited by the difficulty and cost of making them. One major challenge associated with the delivery of cell and gene therapies is the significant variability in the biological starting materials used in each batch’s manufacture, something that is further compounded in autologous approaches with the use of patient-derived materials that may have disease and prior treatment effects. Today, manufacturing approaches are standardized, and this variability is not effectively controlled for.
As the next generation of manufacturing technologies is developed, with greater digital capabilities, and real-time monitoring of manufacturing becomes a reality, I can envisage AI/ML approaches that could effectively predict the optimal growth conditions in response to the biological variability of the starting material and then adapt during manufacturing to the way the cells are growing, providing more robust, effective, and optimal manufacturing solutions.
The pharmaceutical industry is rapidly adopting AI and machine learning (ML), and examples of applications include drug discovery, route optimization, predictive maintenance, supply chain management, and demand forecasting.
One area of particular interest to Sterling is chemical process modelling. This can involve running thousands of virtual experiments to select the most promising ones to run physically in the lab. This saves time and money and allows for a more thorough search of the manufacturing recipe space.
Once at full-scale manufacture, process models can be used to inform further optimization. These models are particularly useful when trained on real data observed throughout scale-up. As process models improve, we will be able to control the scale-up and manufacturing process much more effectively. This will lead to more repeatable batches with higher yields and lower impurities.
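The "thousands of virtual experiments" idea reduces to evaluating a surrogate process model across a grid of recipe parameters and shortlisting conditions for the lab. The model form, parameter ranges, and optimum below are invented for illustration; a real surrogate would be fit to lab and plant data:

```python
# Minimal sketch of virtual experimentation: sweep a surrogate process
# model over a recipe-parameter grid and shortlist the best conditions.
# The model and all numbers are invented for illustration.

def surrogate_yield(temp_c, equivalents):
    """Stand-in process model: yield (%) peaks at 60 C, 1.2 equivalents."""
    return 90 - 0.05 * (temp_c - 60) ** 2 - 40 * (equivalents - 1.2) ** 2

def virtual_screen(temps, equivs, shortlist=3):
    """Run every combination in silico; return the top conditions to test."""
    runs = [(surrogate_yield(t, e), t, e) for t in temps for e in equivs]
    runs.sort(reverse=True)  # best predicted yield first
    return [(t, e) for _, t, e in runs[:shortlist]]

temps = range(40, 81, 10)            # 40-80 C in 10 C steps
equivs = [1.0, 1.1, 1.2, 1.3, 1.4]   # reagent equivalents
best = virtual_screen(temps, equivs)
```

Only the shortlisted conditions go to the physical lab, and the resulting data in turn refines the surrogate — the same train-on-scale-up-data loop described above.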
The AI theme for 2023 is almost certainly going to be the advances made in large language models (LLMs). While the future impact of these models is much debated, and nothing of great significance has been applied to chemistry yet, it is only a matter of time. As LLMs become more accessible to smaller niche players in the software industry, I expect a revolution in search interfaces for private, domain-specific company data, such as the artifacts created by experimental scale-up chemists. The building blocks for such applications are there; they are just waiting for someone to put them together.
Nice Insight, established in 2010, is the research division of That’s Nice, A Science Agency, providing data and analysis from proprietary annual surveys, custom primary qualitative and quantitative research, as well as extensive secondary research. Current annual surveys include The Nice Insight Contract Development & Manufacturing (CDMO/CMO) Survey, The Nice Insight Contract Research - Preclinical and Clinical (CRO) Survey, The Nice Insight Pharmaceutical Equipment Survey, and The Nice Insight Pharmaceutical Excipients Survey.