What Piece of Conventional Industry Wisdom Do You Think is Misguided, Outdated, or Just Plain Wrong?

Sep 15, 2023
PAO-09-23-RT-02

Trond Erik Aune, Ph.D., Chief Executive Officer and Co-Founder, Vectron Biosolutions

I think CDMOs have been effective at conveying the message that spending time on strain development is not financially worthwhile for their clients. It certainly isn't necessarily worthwhile for the CDMOs themselves, who benefit financially from weaker strains that require more batches run at larger scale and who typically don't have updated technologies for next-generation strain development. The fact is that it is often possible to improve strains to produce much more of a given protein, especially by taking a tailored approach instead of using the same strains and expression vectors again and again. The resulting reduction in COGS greatly benefits drug development companies and, eventually, patients.

Michael Moussourakis, Vice President of Strategy, Alconox Inc.

The customer is always right. (Which they are, of course.) But is what they are asking for the most efficient or current solution? Have regulatory, quality, cost, time, environmental, worker safety, and other considerations been taken into account? Alconox Inc. is a provider of Class I medical device detergents to regulated industries around the world that require critical cleaning, including medical device, cosmetic, biotech, and pharma. We work closely with our laboratory and manufacturing customers to put them in a position to make the most informed decisions by providing the most current technical data and processes. In addition to providing advice and products to clean surfaces in all facets of the laboratory and manufacturing process, this support helps ensure regulatory compliance. The notion that the customer is always right can therefore be outdated if the data and processes the customer is relying on are not the most current. With an ever-changing landscape of regulations, guidance, and environmental considerations, it can be difficult for a drug or medical device manufacturer to remain up to date. Empowering select qualified suppliers to be part of the design process can be a great path forward.

Mariah Baltezegar, MBA, Vice President and Head, Peri- and Post-Approval Specialized Solutions, PPD, part of Thermo Fisher Scientific

One piece of conventional pharmaceutical research industry wisdom that is being challenged is the belief that all valuable insights must come from randomized controlled trials (RCTs). While RCTs have been the gold standard for evaluating the safety and efficacy of medicinal products, there is growing recognition that real-world data (RWD) and real-world evidence (RWE) can provide complementary insights that RCTs might not capture.

RWD are data collected outside of controlled clinical trials and are contained in places such as electronic health records, claims data, wearables, and patient registries. RWE involves analyzing these data to understand how treatments work in real-world settings. This approach challenges the notion that RCTs are the only valid source of evidence.

RWE can provide insights into diverse patient populations, long-term outcomes, and real-world treatment patterns that might not be fully represented in RCTs. It can also offer information about patient experiences, adherence, and the broader impact of treatments on healthcare systems. As the pharmaceutical industry embraces the value of RWE, there will be a shift toward incorporating both RCTs and RWE to make more informed decisions throughout the drug development life cycle.

Daniella Kranjac, MBA, Founding Partner, Dynamk Capital

Relying on the adage, "ask the customer what they want and deliver that to them." Imagine asking someone in process development or manufacturing in the late '90s, "What type of bioreactor do you want?" They might talk about cleanability, process control, or better or additional sensors, but certainly not a 25-L rocking or 2000-L single-use, stirred-tank bioreactor. Yet these formats are now industry standard. The point is: this misguided approach did not consider then, and would not consider now, that the convergence of technologies can create opportunities and entirely new markets if you ask the right questions. I challenge myself and our portfolio companies to always ask what is needed, not what people want.

Michael Kallelis, Chief Executive Officer, Mikart 

Perhaps the most misguided piece of conventional industry wisdom is that U.S.-based CDMOs cannot develop or manufacture economically. This is often stated as a truth, but in fact it is a misconception. There will always be a low-cost country available to manufacture any given product, but the true cost is not just about labor. Often, materials come from the same sources, and while prices vary from region to region, there is not always a significant advantage in the cost of materials. Cost analyses should also account for transportation, tariffs, insurance, and lead times. Does it make sense to manufacture a liquid halfway around the world for the U.S. market? Manufacturing "in-country" and "for country" often wins over distant, low-labor-cost options. And then there are the quality and compliance issues related to some offshore, low-cost options. Labor practices at some of these "low-cost" options are also not aligned with good corporate ESG principles.

John Mosack, Chief Operating Officer, Tanvex CDMO

One prevalent convention in our industry is the excessive reliance on 'Read and Understand' training for our cGMP operators to learn new SOPs and batch records. This approach is used far too extensively and needs reevaluation. We should allocate more time to coaching and guiding our team members, whether that involves introducing entry-level concepts or enhancing the skills of experienced team members. This coaching and guidance can come from supervisors, peers, or quality personnel on the floor, in real time or in a reasonably timely manner. Our industry heavily depends on the expertise and know-how of our frontline manufacturing personnel to consistently deliver high-quality biologic products. Investing time and resources into more engaging training and knowledge transfer methods can reduce errors, eliminate time wastage, and enhance the overall quality of the manufacturing process and products.

Andrew Handorf, Ph.D., Director, Proposals, Scorpius BioManufacturing

The required timing for analytical method validation (AMV) is a common misconception. I've heard regulatory experts say there is no guidance stating that analytical methods need to be fully validated to release phase III material. This is misguided. As a CDMO, we'd recommend that AMV begin around the phase III timeframe, but a more realistic time to validate methods (and qualify IPC methods) would be prior to or in parallel with process characterization (PC). You collect massive amounts of data during PC, and it'd be a shame to find out down the road that your methods weren't suitable for their intended purpose. This is still only a risk-based recommendation. The actual drop-dead timing for AMV is prior to BLA filing (so you might as well validate methods prior to process validation (PV) to be safe).

Martin Akerman, Ph.D., Chief Technology Officer and Co-Founder, Envisagenics

While a lot has evolved in the pharmaceutical industry, including the advent of AI, the fundamental challenges of drug development remain. In addition, the recent proliferation of new modalities (immuno- and RNA therapeutics), while bringing innovation and hope to both patients and developers, has also added new complexities to the process of drug making. As such, drug development continues to be an expensive and risky proposition.

Despite the exciting transformation of the biopharmaceutical field, efficacy and safety remain the two most critical variables for successful drug discovery. Valuable AI predictive models must address these two critical challenges. 

At Envisagenics, we use AI to mine millions of RNA-splicing events that can elicit novel therapies. Our AI platform SpliceCore® allows us to prioritize novel therapeutic targets that are both effective and safe. To be effective, targets need to be stably expressed and accessible to drugs. To be safe, they need to be specifically expressed in diseased tissues and fundamentally absent in normal tissues.

SpliceCore identifies membrane-bound neoepitopes encoded by mRNAs that are readily translatable into proteins accessible to antibodies on the cell membrane. By screening thousands of tumor samples and vital organs using RNA-seq technologies, SpliceCore eliminates drug candidates with potential toxicity at the onset of the development process.

Richard Schwartz, Head of Life Sciences, Qualtrics

Everything works... until it doesn’t.

Considering the industry commercial model and perspectives on the patient and provider funnel from awareness to advocacy/adherence, the biggest issue we have right now is that it works. It may not always be effective and certainly not efficient, but it is hard to argue that it doesn’t work.

Let’s look at three converging lines. One: industry spent just under $8.1 billion in 2022 on ad campaigns alone, according to Vivvix. Two: the field models have tremendous opportunities for a reset given the number of years since the Department of Health and Human Services Office of Inspector General updated its compliance program guidance. Finally, three: go to any meeting, industry event, or publication and the word omnichannel is used like a conjunction to string thoughts together on what often amounts to conjecture on the customer journey.

The vanishing point, just over the horizon where these lines hopefully meet, requires something beyond the operational metrics like clicks, calls, visits, streams, and downloads. Industry partners report on these and get busy mapping them to financial impact. We give operational data credit for financial impact because it fits the age-old belief in progression down the marketing funnel.

I submit that the often-missing measure is the connective tissue found in experiential metrics. Companies that excel at measures like sentiment, trust, customer effort score (CES), and net promoter score (NPS) outperform the S&P 500. As neuroscience and immunology researcher Leonard Kish said in 2021, "If patient engagement were a drug, it would be the blockbuster of the century and malpractice not to use it."

Gráinne Dunlevy, Ph.D., R&D Director Delivery, Astrea Bioseparations

Fiber chromatography, such as AstreAdept®, has the potential to challenge conventional wisdom in cell and gene therapy bioprocessing in several ways:

Speed and efficiency: Fiber chromatography can significantly reduce processing times compared with traditional chromatography columns. This speed can be particularly valuable in cell and gene therapy manufacturing, where rapid production is often crucial, especially for autologous therapies tailored to individual patients. Faster processing can reduce the overall turnaround time, making therapies available to patients more quickly.

Scale flexibility: Fiber chromatography offers greater flexibility in scale-up and scale-down operations. This adaptability can challenge the conventional idea that cell and gene therapy manufacturing must follow rigid, large-scale processes. With fiber chromatography, it becomes more feasible to match the production scale to the specific needs of individual therapies or patient populations, avoiding excessive overproduction.

Reduced footprint and cost: Fiber chromatography systems are typically more compact than traditional column chromatography setups. They require less facility space and can potentially lower manufacturing costs. This challenges the conventional belief that bioprocessing facilities must be large and expensive to accommodate traditional equipment.

Improved yield and purity: Fiber chromatography has the potential to provide higher product yields and purity levels. One reason for this is that fiber has a more open flow path than traditional chromatography products and so reduces shear stress. This can challenge the notion that achieving high yields and purity in cell and gene therapy manufacturing is inherently difficult. Improved yields can lead to reduced production costs and increased availability of therapies.

Dan Rudmann, Ph.D., Senior Director of Digital Toxicologic Pathology, Charles River Laboratories (CRL)

"The investment needed for implementation of a non-clinical Good Laboratory Practice (GLP)-validated digital pathology workflow is not value added for the industry." Hearing this statement from individuals in our sector is understandable given the limitations that previously faced digital pathology, including poorly designed laboratory information management systems (LIMS), slow and volume-limited whole-slide scanners, insufficient internet bandwidth, low-refresh-rate and/or low-resolution monitors, and expensive hardware (whole-slide scanners, computers, servers) and viewing software. Early attempts ended in expensive systems with substandard performance that frustrated the laboratory and the pathologist. There was also hesitancy concerning regulatory acceptance of a GLP-validated digital workflow.

But today, we can build a digital, bar-code-driven LIMS for the laboratory and outfit a pathologist with a superior digital microscope at a fraction of the cost observed only five years ago. Internet bandwidth is moving to Gbps+, and cloud services offer tiered storage for the laboratory. The workflow can be automated, metadata-rich, and augmented with artificial intelligence (AI) for quality control. Regulators are on board with digital, and recent guidance has been published.

When calculating the value proposition, a laboratory needs to dig into all of its costs, both people- and process-related, for its non-digital system, which starts at the histology laboratory and ends at the pathologist's microscope. There are also opportunity costs of not going digital that need consideration, such as increased efficiency of pathologists' diagnostics, talent recruitment and retention advantages, training of new staff using a digital archive, and the ability to build AI-based computer-assisted diagnostics.

Karen Philip, Executive Director, epocrates Marketing

As the media landscape grows more complex and fragmented, some pharmaceutical clients are adopting hyper-targeted media buying for precision in reaching specific audiences. However, this approach may not fully serve the diverse needs of healthcare providers and patients. 

Healthcare’s intricate complexity and diversity extend beyond demographics, potentially leaving out a wider patient spectrum that could benefit from broader distribution of expertise and insights. Clinicians beyond the targeted audience might miss the opportunity to provide tailored care to their patients. Moreover, hyper-targeting could worsen health disparities by neglecting underserved communities in need of medical attention, contradicting the ethical duty to ensure equitable access to care. 

Hyper-targeted media buying, though precise, often results in missed opportunities for many healthcare providers. The inclusive nature of healthcare, the imperative to address disparities, and the emotional aspect of patient care all highlight the limitations of a solely data-driven approach. A more balanced strategy that embraces wider outreach, inclusivity, and emotional understanding aligns better with the multifaceted healthcare field.

Raj Indupuri, MBA, Chief Executive Officer, eClinical Solutions

Computer software validation (CSV) is an area where industry thinking is particularly outdated, and this has a substantial impact on software implementation and adoption cycle times, costs, and overall ROI. To comply with the regulatory validation requirements of the FDA and other regulators, companies regularly and unnecessarily redo the work of software-as-a-service (SaaS) providers. SaaS providers today spend enormous resources and effort conducting testing and validation to ensure that cloud-based applications are delivered and implemented with high quality, including documented evidence to comply with regulatory guidance. Because of this, sponsors should put the onus on software companies to ensure regulatory compliance and depend on them to provide more "out of the box" capabilities that leverage industry best practices.

Modern regulatory guidance also strongly emphasizes risk-based approaches. As such, validation efforts should focus on the areas with the highest potential impact on product quality, patient safety, and data integrity. Sponsors should conduct regular audits of their SaaS software providers and their validation processes to ensure that guidance is followed effectively, and partner as needed to minimize work redundancies across organizations. A modern approach to validation removes considerable strain on resources and time within SaaS application and platform implementation, streamlining cycle times in the critical phase of adoption that leads to long-term success and ROI for life sciences technologies.

George Magrath, M.D., Chief Executive Officer, Lexitas Pharma Services

The conventional wisdom that FDA approval is the only necessary requirement for a successful new drug is changing rapidly. The FDA has a mission to ensure the safety and efficacy of new medicines and devices, and the agency does a great job of achieving this mission. However, medicines are increasingly required to show clearly and definitively where they fit in the treatment paradigm to gain acceptance by patients, physicians, and payors. If any of those stakeholders do not understand the true clinical benefit of a new medicine, it is unlikely that the medicine will be commercially successful. It is a great shift in our field to have trials centered not just on regulatory endpoints but on outcomes that are clinically meaningful to patients, physicians, and payors.

Christian Schetter, Ph.D., Chief Scientific Officer, Rentschler Biopharma SE

From the perspective of a CDMO with almost 50 years of experience in biotechnology, we know that volume is not the only factor that determines our success. We also need to consider the diversity and complexity of the products we manufacture and the necessity to also establish robust processes for difficult-to-make products, as well as the needs of our clients and their projects. For example, while some products, like blockbuster drugs, may require large-scale production of 10,000 L or more, many others may only need smaller volumes, such as advanced therapy medicinal products (ATMPs) and protein-based therapeutics for rare diseases.

Bioprocess development and optimization are crucial to project success. We recognize that biopharmaceuticals are not trivial commodities that can be easily manufactured. Each therapeutic has its own unique characteristics and challenges, depending on the type of molecule, expression system, purification method, formulation, and delivery. Therefore, we focus on developing and implementing exceptional processes that can handle the complexity and variability of biopharmaceuticals and that can meet the highest standards of quality and regulatory compliance.

Moreover, we believe that competition is not a zero-sum game in our industry. We are all in the same boat, and we all want to see the industry grow and thrive. That is why we at Rentschler Biopharma adopt a coopetition mindset, where we support and cooperate with other players in the industry on issues such as supply chain management and technology innovation. We also form strategic alliances with companies that share the same values and client centricity to leverage our complementary strengths and capabilities and to ultimately create value sustainably for our clients and their patients.

Martin Maiers, Vice President, Research, Be The Match BioTherapies

Allogeneic, off-the-shelf therapies are a promising and growing segment of the global immuno-oncology cell therapy market. For a variety of reasons, this field either ignores histocompatibility outright or has overestimated the potential of engineering to safely eliminate it as a barrier. Histocompatibility is a thriving field in its own right, and I believe that the challenges it presents to allo cell therapies can be addressed by thoughtful design and basic science. We plan to focus on data and diagnostics to assess patient alloantibodies and natural killer (NK) function.

This starts with allo cell bank design informed by the population genetics of the immune genes and alloantibodies. Clinical trials should include a histocompatibility component with characterization of the patient and cell products. The larger the pool of potential starting material, the better the odds of getting the right cells to the right patient.

The HLA system, which is the most medically relevant and polymorphic system in the human genome, is often misunderstood.  The path to successful allo cell therapies goes through it, not around it.

Jayne Hornung, Chief Clinical Officer, MMIT (Managed Markets Insight & Technology), a Norstella company

Market access is all about the payers at launch. That sentiment is completely misguided. Market access starts from the point the product is in R&D at the manufacturer, continues through FDA approval and launch and through the post-market period while the product is generally accessible for patient care, and extends finally through loss of exclusivity. Every point in that journey is an opportunity to inform manufacturers and reach more patients using real-world data. The integration of multiple data sets at every stage of the product life cycle leads to greater market penetration and allows more patients to access lifesaving therapies. A broader market can now be reached by drawing from electronic health records, healthcare and provider organizations, sites of care, linked pharmacy claims and medical data, genomics lab testing data, and patient demographic data to find appropriate patients.

Insurers don't need RWD. That's another oft-heard sentiment that's misguided. Payers need to understand what benefits a drug offers to patients, and RWE can help. If manufacturers can bolster confidence in their products by using RWE to demonstrate improved patient outcomes, reduced hospitalization rates, and shorter lengths of stay, insurers will want to cover those therapies.

Cornell Stamoran, Ph.D., Vice President, Strategy & Government Affairs, Catalent

There is an underappreciation of the critical role that providers of outsourced drug substance and product development and manufacturing have come to play in the global innovator drug development ecosystem. These CDMO companies used to focus primarily on product supply. However, substantial capital inflows over the last decade have enabled emerging pharma innovators to make up the majority of the active pipeline. These companies have strategically turned to CDMOs for help.

CDMOs have in turn invested substantially in adding capabilities, technologies, and integrated development/supply solutions and have helped lead innovation in manufacturing for both conventional and advanced modalities. Today, CDMOs produce an estimated 40% of all drugs globally and have actively supported the development and/or supply of more than 90% of new molecular and biologic entities approved by the FDA over the last five years. CDMOs also played a significant role in supplying vaccines during the COVID-19 pandemic and remain a key enabler of the emerging pharma sector.

Looking ahead, we expect CDMOs to become an even more critical part of the global pharmaceutical innovation and product supply ecosystem.

Jeff Policastro, Senior Account Manager, LabVantage Solutions

In the rapidly evolving landscape of laboratory informatics, a conventional industry wisdom that warrants reevaluation is the notion that the mere adoption of cutting-edge technology will inherently usher in heightened efficiency and productivity. While it's true that innovative tools hold promise, blind faith in their transformative power can be misguided.

One fundamental consideration is the alignment of technology with existing laboratory workflows. The belief that the latest gadget will seamlessly integrate into established processes neglects the reality that hasty implementation can disrupt operations and yield inefficiencies. Moreover, the assumption that technology operates in isolation from human factors ignores the importance of proper training and familiarity for optimal utilization.

The fallacy of "one-size-fits-all" solutions also emerges. Laboratories are diverse in their needs, and inflexible technologies may hinder rather than aid. The costs, both financial and temporal, must also be scrutinized. High upfront expenses, coupled with the complexities of implementation and maintenance, warrant a comprehensive assessment of potential returns.

Data deluge is another critical concern. The blind faith in technology's ability to generate insights without robust data-management strategies can lead to chaos rather than clarity. Moreover, overlooking the significance of change management and staff acceptance can undermine even the most sophisticated systems.

Ultimately, the misconception that technology alone can rectify underlying process inefficiencies is problematic. A holistic perspective, encompassing technology, processes, people, and strategy, is indispensable. Laboratories should tread cautiously, acknowledging that while technology can be an invaluable tool, it thrives when harmonized with thoughtful analysis of context and necessity.

Richard Chen, M.D., Executive Vice President, R&D, and Chief Medical Officer, Personalis, Inc.

In cancer, imaging is the current standard of care for detecting residual disease and recurrence. However, imaging alone may not be the most efficient and reliable tool to determine whether cancer therapy was completely successful. Today, there are new circulating tumor DNA (ctDNA)-based technologies, which leverage tumor-informed sequencing and can detect minute traces of molecular residual disease (MRD) in a patient's blood. By taking this non-invasive approach, physicians can potentially detect MRD earlier than with conventional methods, closely monitor cancer progression, and better optimize treatment regimens.

While the first-generation MRD tests launched about six years ago, there are newer-generation MRD tests, like NeXT Personal®, that are up to 100 times more sensitive than other leading MRD tests and can detect cancer at levels as low as ~1 tumor molecule per million. By "seeing the unseen" and detecting cancer invisible to other available technologies and standards of care, NeXT Personal offers more clarity in cancer detection and prognosis.

With advanced MRD testing that leverages the broad genomic profile of a patient’s tumor comes the promise of earlier detection of cancer recurrence and earlier intervention. As a result, ctDNA-based MRD technologies like NeXT Personal may ultimately help physicians make impactful and personalized therapy decisions, ensuring a more effective, long-term treatment strategy to potentially improve patient outcomes. 

Swati Punatar Reichmuth, MBA, Chief Operating Officer, Rune Labs

Many stakeholders in the neurodegenerative disease space believe that since Parkinson’s disease (PD) is the second most common neurodegenerative disease in the United States, access to care is easily attainable. However, recent studies show that PD is mismanaged, misdiagnosed, or not diagnosed at all, causing many patients to get lost in the healthcare system, unable to receive the best care possible. 

We now know that PD is a multifaceted disorder, so it is difficult for patients to get the treatments they need for multiple reasons, such as the lack of movement disorder specialists in a patient’s local region to help with an accurate diagnosis. This can lead to incorrect treatment strategies by physicians not specifically trained to identify PD-related symptoms. The healthcare community needs to realize that the risk of PD is high in the elderly population. If these patients present with a combination of symptoms, whether motor or non-motor, that is associated with PD, it is crucial that physicians keep the PD diagnosis at the forefront of possibilities.

Furthermore, since the disease is heterogeneous in nature, a patient who is diagnosed with PD needs to receive a tailored approach to treatment. In many cases, standard-of-care medicines may not provide clinical benefit. This "trial-and-error" approach to prescribing significantly contributes to healthcare spending, wasting countless resources. Tools like StrivePD, which can give a longitudinal view of a patient's history, will allow for a better understanding of a patient's symptoms and hopefully inform physicians in making more effective and precise care decisions.

Todd Druley, M.D., Ph.D., Chief Medical Officer, Mission Bio

There is an adage stating, 'always use the right tool for the job,' and that has been the driving force behind precision medicine for 25 years. But despite amazing advances, it seems that the goalposts keep moving, with promises that the next advance will be the true enabler, while much of the industry continues operating under the 20th-century drug development paradigm.

Precision medicines are only as good as the technology used to match the right drug to the right patient. The coarser and more non-specific the diagnostic tool used, the less likely the right connection is going to be made, especially for complex, heterogeneous diseases like cancer. But single-cell multi-omics have finally begun to deliver on those precision medicine promises.

Now, researchers are studying resistance or relapse driven by small numbers of cancer cells. Drugmakers are looking for unique, personalized combinations of genetic changes and protein expression in cancer cells, figuring out who will best respond and what to do if they don’t. Advanced therapy developers are ensuring the quality of their cell-based and genome-edited medicines.

All this is leading to high-resolution, individualized patient testing. Soon, doctors won’t just know if the cancer is back but what specifically to do about it for each patient. Single-cell companion diagnostics will enable better, more precise clinical development and targeted patient treatment. In fact, it’s already begun.

David Radspinner, Ph.D., Chief Executive Officer, VintaBio

At VintaBio, we believe that the rapid push for scale and adjacency to traditional protein therapeutics manufacturing has created strains in the gene therapy manufacturing industry. While there are valid reasons to draw insights from protein therapeutics and to share technology in critical areas, blindly adhering to them can hinder progress in gene therapy.

Gene therapy is a complex and evolving domain that differs significantly from traditional protein production. Replicating protein production models often leads to increased costs and limits our capabilities. It's crucial for the industry to recognize that the development and manufacturing of viral vectors for gene therapy are based on specific biologic process requirements that call for unique approaches.

To effectively address the viral vector bottleneck in cell and gene therapy, we need a discerning approach. This involves carefully assessing technology and cost structures. At VintaBio, we are committed to redefining these paradigms by leveraging our industry expertise to find the optimal balance between tradition and innovation. Our goal is to enhance efficiency, reduce costs, and expand capabilities specific to gene therapy.

While tradition has its merits, it's essential to challenge the one-size-fits-all approach in the life sciences arena. Gene therapy requires tailored strategies, and our willingness to reevaluate conventional wisdom is a crucial step toward unlocking its full potential.

Mark W. Womack, Chief Executive Officer, BioCina

“Drug developers should choose the CDMO that offers the fastest timeline to manufacturing.” I believe this is among the most misguided perspectives in our industry.

If most CDMOs actually delivered on the timelines they promise, it might be a good practice for drug developers to prioritize finding the fastest timelines. The hard truth is that most CDMOs are not highly reliable in delivering what they say they'll do. In addition, many CDMOs, including many of the most well-known, have quality and regulatory issues that put their clients' programs at great risk. And many CDMOs have operations teams with very low average tenure at their company, which too often leads to human errors that result in failed batches.

Timelines that many CDMOs offer are often not remotely realistic and are offered solely to try to win the business, with the CDMOs knowing it's highly unlikely the timelines will be met and that they'll deal with their client's disappointment down the road. This often means actual timelines run far longer than what was agreed and are littered with serious problems: pushing a team too hard to achieve an unrealistic timeline leads to compromises in quality and compliance that result in program pauses and delays.

Drug developers would be far better served by focusing on finding a CDMO proven to be highly reliable at doing what it says it will do, with highly experienced teams and superior quality and regulatory records, and by being wary of CDMOs that offer unrealistic timelines to win business.

John Lee, Ph.D., Head of Cell and Gene Therapy Business Unit, SK pharmteco Cell & Gene, Center for Breakthrough Medicines

In the realm of cell therapy manufacturing, the adage "the process is the product" emphasizes the pivotal role of the manufacturing process in determining the quality of the resulting cell therapy drug product. While a robust Good Manufacturing Practice (GMP) process undeniably stands as a vital component for successful manufacturing (an inconsistent process inevitably results in subpar quality), it's important to recognize that the process alone does not guarantee quality in cell therapy production. Instead, the industry has gained insights indicating that the quality of the starting material holds equal significance in achieving critical quality attributes (CQAs). If the initial material is of poor or suboptimal quality, even a robust process will fall short in producing a high-quality cell therapy drug product. Consequently, the field has embarked on efforts to better delineate the characteristics of the starting material that contribute to improved batch success rates while meeting expected CQAs.

Amit Etkin, M.D., Ph.D., Founder and Chief Executive Officer, Alto Neuroscience

Conventional industry wisdom claims that the brain is too complex to make progress in central nervous system (CNS) drug development. This is patently false and leaves little room to challenge the status quo in understanding underlying disease biology. Until now, the field has done little to identify biological drivers of disease, and large-scale CNS trials don’t routinely measure these markers. While the CNS has proven to be a particularly challenging area for achieving success, I believe that simple, scalable brain-based biomarkers are well within reach; these will not only support higher drug approval rates but also have the potential to maximize clinical impact for patients. A view that the brain is too complex (and therefore not measuring biomarkers at all) does not preclude the possibility of simple solutions, which one can only discover by systematizing large-scale data collection.

Costly, out-of-reach technology is also not necessary to achieve meaningful progress in deciphering the brain. For example, Alto's trials are leveraging EEG, behavioral task performance, wearable devices, and other already-scalable tools to identify clinically relevant biomarkers and likely drug responders. Over the past decade, we've been able to identify and categorize core domains of mental function (cognition, emotion, and sleep processes) to further support these robust drug-response predictions. Instead of the current approach, where patients are grouped largely by symptoms, we're looking toward a future where biomarkers will guide targeted CNS treatments. Precision medicines for the brain will enable biopharma to do better than the trial and error of today, enabling long-overdue advances in CNS drug development for patients desperately in need of new and more effective options.

Rick Finnegan, Chief Operating Officer, Elixirgen Therapeutics

I think the conventional view of what a COO can offer to a biopharma company is a bit outdated. This is particularly true with smaller and earlier-stage companies. The standard used to be that the COO was thought of as a specialist in commercial and manufacturing operations, typically for large, commercial-stage companies where those areas are a greater focus. As such, COOs weren't brought on until later stages of development. These functions are less relevant for earlier-stage companies, where success is determined by development, a phase that includes but is not limited to efficient clinical execution (with a focus on enrollment), a proactive and forward-looking regulatory strategy, and a seamless transition from clinical to commercial manufacturing. While development is often seen as the sole purview of the head of R&D, these aspects require a depth of understanding of commercial and manufacturing operations in which a COO is an expert. Further, adding an industry-seasoned COO who can provide mentoring and serve as a sounding board is an excellent complement to a scientific founder / chief executive officer who may be less experienced. Ultimately, bringing on the right COO early in the process increases the likelihood of success.

Asim Siddiqui, Ph.D., Senior Vice President, Research & Tech Development, Seer

For decades, many within the life sciences industry have stayed away from utilizing mass spectrometry to study molecules like peptides and proteins because the technology developed a reputation for being too difficult to perform, too complex, and too low-throughput. While this may have been true in the past, the field has evolved significantly, and novel mass spec technologies are addressing these challenges.

For example, scientists no longer need to spend months developing complex mass spec protocols to squeeze out the best performance from instruments. Canned workflows covering the majority of use cases can be implemented, making the development of mass spec processes less complicated for many scientists. Additionally, the creation of faster and more accurate analyzers makes deeper coverage, as well as higher sensitivity, more easily accessible than previous generations. Sample preparation workflows upstream of mass spec tackle challenges such as the wide dynamic range in complex biological samples, such as plasma. With the combination of newer mass spec technologies and scalable, robust sample prep, mass spec as a readout is rapidly becoming more accessible to the broad life science community.

These innovations also lower the barriers to conducting at-scale proteomics studies, especially since mass spec–based proteomics is becoming widely recognized as the gold standard for deep molecular profiling. To that end, enabling more of these types of studies will help the scientific community discover next-generation disease biomarkers and develop safer and more effective personalized therapeutics. Taken together, these developments point to us being at a watershed moment in mass spec–based proteomics.

Alec Ford, Chief Executive Officer, Karius

In the healthcare industry, we need continued innovation for diagnosing infectious diseases. Today, we spend 5–10 times more on genomic advances in cancer than we do on genomics in infectious disease. In cancer patients, the pathogen causing an infection is never identified in 70% of cases. As a result, every day in the United States, 800 cancer patients die from infection. Diagnosing an infection in hospitalized patients often requires multiple tests conducted over weeks of hospitalization. Antimicrobial treatment can become guesswork, which can lead to global public health issues like the current surge of antimicrobial resistance (AMR). At Karius, our goal is to help improve the diagnosis of infectious diseases by innovating testing that leverages the latest in genomics and AI. In our most recent trial, our single, 24-hour test found as many cases of pneumonia as 7 days of testing during a hospital stay. The key is to work closely with others in the healthcare industry to help improve patient outcomes and reduce public health threats like AMR. Cancer patients at risk of infection deserve more.

James Atwood, Ph.D., General Manager, Robotics, Opentrons Labworks, Inc.

Conventional wisdom suggests that researchers need to first develop experimental workflows manually and then transition those processes into automation. However, in the biopharma industry, it is increasingly evident that automation is becoming a fundamental part of the drug discovery process. To that end, each industry stakeholder developing experimental workflows, from academics to R&D scientists to reagent manufacturers, needs to take an "automation-first" approach.

Incorporating automation into experimental designs can drastically reduce costs and timelines in drug development. If we want to get patients lifesaving medicines, we need to ensure faster results. Automation can also improve reproducibility and scalability. Monotonous, manual processes are increasingly unmanageable as the life sciences community demands larger sample sizes to ensure accurate results, and larger sample sizes place higher demands on ensuring reproducible analyses. Automating routine processes, such as liquid handling, purifications, and heating and cooling steps, allows scientists to process thousands of samples each day easily and efficiently.

Human errors by lab personnel, from mislabeling samples to incorrect reagent usage, are a major hindrance to the advancement of therapeutics. Automation can significantly reduce these error rates and help decrease unnecessary waste of reagents and labor resources. 

Automation shouldn’t be an afterthought in drug discovery and development. The life sciences community needs to design experimental workflows with scale in mind, and that starts by adopting an automation-first mentality. 

Talat Imran, Chief Executive Officer, Rani Therapeutics

One piece of misguided pharma industry wisdom is the notion that superior efficacy for a drug product solely drives market adoption. The reality is you don’t have to be the best in class to capture an impressive portion of the market. For example, most oral therapies currently on the market are inferior to their injectable counterparts. However, they can still convert patients and drive revenue because of their convenience. 

Take Otezla as an example. Despite a PASI 75 response rate (75% or greater improvement in Psoriasis Area and Severity Index score) that is 60% lower than that of Cosentyx, Otezla generated substantial revenue, capturing 22% of the market in 2018, and was acquired by Amgen for $13.4 billion in 2019. This goes to show that, within therapeutic areas, the competitive landscape reaches a point where everything is good enough. After that, convenience really becomes essential from a differentiation perspective.

Sonja Wustrack, Managing Director of Integrated Evidence Generation, OM1

There are two points to this question around RWD and RWE today. The first piece is the misconception that we primarily need larger volumes of higher quality RWD and better data standards to shift the paradigm for clinical research to incorporate RWE generation. The second piece, which is often cited as a barrier, is the lack of definitive guidelines from the FDA and regulators on the use of RWE to support product approvals.

Instead, the real work that will drive change is around increased collaboration between stakeholders: organizations need to promote new ways of working and a culture of experimentation when it comes to RWD and RWE. Infrastructure, business models, internal resources, and know-how are missing. In addition, new types of data partnerships (non-traditional data vendors) and curation standards are key to rich RWD rather than just volume. Technologies and innovations to extract data, such as artificial intelligence, are critical to enabling a more personalized view of patients. Finally, while regulators are making public statements and publishing support for the use of RWD/RWE in submissions, I believe we cannot wait for prescriptive guidelines; rather, we as an industry need to create the standards. Other pressures and factors are driving an increased focus on RWE within organizations, including pricing pressures and the need to show value.

Carole Nicco, Ph.D., Chief Scientific Officer, BioSenic

Over the past several decades, the industry has regrouped to shift risk associated with R&D toward smaller biotechs. It’s time for large pharmaceutical companies to embrace risk again.

Notably, this shift of R&D to start-ups did not abolish the existing regulatory framework. Instead, more regulations and procedures have accumulated, in a process that can be compared to sedimentation. The pharmaceutical industry responded by shifting more cutting-edge research away from the structures for which the rules were originally designed and toward smaller companies.

This has contributed to a gradual slowdown in the responsiveness and innovative potential of small businesses and start-ups. The estimated time from research to commercialization of a new drug has doubled since the 1970s.

Despite diminishing returns to the approach, pharmaceutical companies have retained elements of R&D focused on improvements in existing molecules, or new molecule–therapeutic target combinations with minimum risk. We are at a turning point where the industry must embrace greater risk and increased R&D investment, despite the challenging regulatory environment. The biggest players especially need to double down on new technologies like artificial intelligence to improve large-scale screening efficiency.

Drugmakers regularly justify the cost of new therapies by pointing to the costs of developing the next generation of treatment. This needs to be even more true. We’ve structured BioSenic with this in mind, leveraging existing therapeutics like arsenic trioxide while developing next-generation technologies like cell and gene therapies, to maintain profitability while still innovating for the future.

Angela Osborne, Chief Executive Officer and Founder, eXmoor

For several years, developers of advanced therapies had an insourcing mindset for process development and clinical scale-up. In many cases, the first move a cell or gene therapy (CGT) biotech would make after raising its first significant venture round would be to build its own manufacturing facility. This was a reflection of the perceived immaturity and limited capacity of the contract development and manufacturing organization (CDMO) field at the time.

Too many have learned the hard way that insourced manufacturing is more expensive and inefficient than they expected, burning through cash before they can even treat a single patient. Because the industry is so young and growing so fast, there aren’t enough experts to help every company build in-house from scratch.

We have been advising on and designing CGT manufacturing facilities for many years, and it's clear to us that the field has reached a tipping point. Manufacturing sites we designed just a few years ago have changed hands several times. At the same time, CDMOs have continued to invest in capacity and are securing the expertise necessary to guide a growing number of projects in parallel. CGT drugmakers no longer need to go it alone to scale into the clinic, and in most cases, it's better if they don't.

Becky Cap, Senior Vice President, Business Development, Advanced Therapies, BioBridge Global

As the cell therapy space matures, there has been a growing appreciation for certain metrics connected to the starting material taken from donors or patients, including baseline cell count and the ability to get to an absolute target count for the cell type of interest. In the past, that meant processing additional blood volumes by keeping a person on the machine longer, prolonging an already lengthy process. We now have clear insight that while this does increase the total cell count, it’s not an efficient way to address the demand. While we want more needles, what we get instead is more haystack.

Drugmakers are also starting to rethink the way they interact with collection sites. While some FDA guidance supports the development of static protocols in isolation, even at early stages, developers have come to appreciate the flexibility they do have and the power of collaboration. For example, protocols include tradeoffs between the standards for who can donate and the number of people who can donate. But there may be instances when common restrictions like COVID status, BMI, or age can unnecessarily limit the ability to recruit.

Additionally, therapeutic developers may be handcuffed in dealing with technical issues like clumping or low cell counts during the collection process. But experienced collection sites can offer guidance before suboptimal protocols get locked in at later stages, or even help develop adaptive protocols to prepare contingencies.

Veerle d’Haenens, General Manager Global Therapeutic Systems and Cell Therapy Technology, Terumo Blood and Cell Technologies

Perhaps no industry is more optimistic about the future than the biopharma sector. Remarkable advances over the past decades keep us focused on innovating for the future, but they have also led to the notion that emerging technologies have the potential to resolve many challenges.

While it's true that cutting-edge developments like cell and gene therapies hold immense promise, they currently cater to a limited demographic, and scale-up will likely remain out of reach for the foreseeable future. It is therefore important to identify new applications of existing technologies that can continue to make meaningful contributions.

Given our technology portfolio and our commitment to patients with blood diseases, we know that one such opportunity is therapeutic apheresis. Cell and gene therapies are increasingly reliant on our automated platforms to isolate the cellular starting material for both their development programs and active patients. But these platforms also have more direct applications for patients with sickle cell disease and other diseases, where procedures like red blood cell exchange can be used for both acute and preventive care.

It’s heartbreaking to see patients suffer today because of logistical hurdles to accessing existing treatments, and it’s insufficient to ask them to wait for the next advances. We must be able to do both together.

Matthew Lakelin, Ph.D., Vice President, Scientific Affairs and Product Development, TrakCel

Until recently, I was always of the opinion that making changes to Good Manufacturing Practice (GMP) processes should only be undertaken at specific points where a natural pause presents itself, for example, when moving from a phase I to a phase II clinical trial. My reasoning for this was that the change would be disruptive, and therefore the advantages of the change would be outweighed by the disadvantages.

However, I recently spoke with a current customer for our Tracking Cells podcast about how they have managed the evolution of their various CGT assets. The conversation turned to how to adopt and apply a new chain-of-identity (COI) standard to an established supply chain model. They delivered this wonderful pearl of wisdom:

“Anything is possible with change control.”

Their point was that if a change has the potential to be beneficial, provided the underpinning support processes are robust enough, there should not be a barrier to deploying these wherever the benefits can be recognized, even if this is mid-stage. This made me rethink my previous opinion regarding changing GMP processes. If you have a rigorous, well-managed change control process, then at least from a GMP perspective, anything is possible at any time.

Jenny Stjernberg, Ph.D., Commercial Director, EMEA, ScaleReady

The cell and gene therapy field is one of constant innovation. Here, therapeutics developers are eager to test and integrate cutting-edge technologies, such as engineering T cell receptors to modify CAR-T cell therapies, optimizing different cell types like natural killer (NK) or stem cells, or adapting novel gene editing tools like CRISPR.

This willingness to explore new technology is a major reason why the future seems so limitless for advanced therapies. But the other side of the coin is that the space is growing so rapidly that experience can be hard to come by. In addition, many developers have too literally retained the conventional wisdom that, for cell-based therapies, "process is product." While this is true in certain contexts, it has led to the unnecessary siloing of information, especially related to process development and scale-up, where standardized best practices are still far off.

In practical terms, the combination of inexperience and opacity means companies find themselves geared to reinvent the wheel, leading to waste that can squander resources. Especially for young companies, this hampers or even ends development before these potential new treatments can progress through human trials. 

Too often, companies attempt to optimize a manufacturing tool entirely independently, a misguided approach that results in unnecessary costs, and in frustration when performance falls below the system's true potential. We see ourselves as partners with a stake in our customers' successes, sharing a genuine commitment to improving their process, not just selling a product. I hope developers take better advantage of vendor support to optimize process development.

Jason C. Foster, Chief Executive Officer, Ori Biotech

There are a number of challenges the CGT field faces that have resulted from trying to apply our past experiences in biologics and small molecule manufacturing directly to this new generation of advanced therapies. For example, in the advanced therapies field, COGS matter, so we need to focus on manufacturability during R&D/PD and not late in the clinical process, as we would traditionally. We cannot manufacture and batch-release personalized living medicines on paper in the same way we manufacture generic tablets; we need fully digitized processes to batch release at scale. We cannot automatically assume that we will get open, early-line market access for products that cost $500,000 to $4 million each, so we need to have pricing/contracting flexibility. We cannot assume that good clinical data at phase II alone will lead to a buyout from a big pharma company; they need to see a path to commercial viability. These old models of thinking were born from our experiences with small and large molecule development in the past and do not apply to this new generation of cell and gene therapies.

Tamara Laskowski, Ph.D., Senior Director, Clinical Development, Personalized Medicines, Lonza

The cell therapy industry is evolving and innovating at an incredibly fast pace. Earlier on, there was a common understanding that manufacturing of cell therapy products required a rigid, lengthy process to ensure that a robust product was generated. Propelled by learnings from clinical trials, there has been a change in how the field determines the best strategies for cell therapy manufacturing. We are observing a stronger trend toward shortening manufacturing time and generating cell therapy products rapidly, so that they can be infused into patients within 24–72 hours, while maintaining, if not improving, the efficacy of the overall manufacturing process. This concept is centered on the fact that, by reducing the time spent expanding cell therapy products ex vivo, we can preserve the product's potency and robustness, administer it to patients sooner, and allow the product to complete its manufacturing in vivo.

Chad Telgenhof, Chief Commercial Officer, Sterling Pharma Solutions

A common misconception within the API CDMO industry is the idea that once a process is validated, it should remain unchanged to avoid regulatory scrutiny. This perspective stems from a time when regulatory agencies emphasized stability and consistency to ensure product safety.

However, in today's dynamic regulatory landscape and with advancements in science and technology, this mindset can hinder progress. Holding onto outdated processes without exploring improvements can lead to missed opportunities for efficiency, cost-effectiveness, supply chain security, and product quality enhancements.

Modern regulatory frameworks, such as quality-by-design (QbD) and risk-based approaches, encourage continuous improvement and adaptation. Sticking to the status quo might now have the undesired effect of attracting regulatory attention if it is perceived as neglecting advancements that could enhance patient safety or product quality.

Furthermore, maintaining a rigid stance against process updates overlooks the potential benefits of innovative technologies and methodologies. Embracing advancements, such as continuous manufacturing, real-time monitoring, and using data analytics, can lead to more robust and reliable processes, reducing variability while improving product quality.

Read Part 1: Q1. How is your organization currently leveraging real-world data/evidence and what uses do you anticipate in the future?