All rights reserved. No part of this publication may be reproduced, duplicated, stored in any retrieval system or transmitted in any form by any means without prior written permission of the Publishers.
The next issue of IBI will be published in Autumn 2025.
International Biopharmaceutical Industry ISSN No. 1755-4578.
The opinions and views expressed by the authors in this journal are not necessarily those of the Editor or the Publisher. Please note that although care is taken in the preparation of this publication, the Editor and the Publisher are not responsible for opinions, views, and inaccuracies in the articles. Great care is taken concerning artwork supplied, but the Publisher cannot be held responsible for any loss or damage incurred. This publication is protected by copyright.
© 2025 Senglobal Ltd.
Volume 8 Issue 1 – Spring 2025
www.international-biopharma.com
04 Foreword
WATCH PAGES
06 Why Renting Product Inspection Systems Makes Strategic Sense for Pharma Manufacturers
When pharmaceutical manufacturers face a spike in demand, new quality control requirements or equipment failure, delays in sourcing inspection systems can lead to costly setbacks. Renting product inspection technology offers a rapid, low-risk way to respond. Christine Gottschalk of Mettler-Toledo explores why pharmaceutical businesses are beginning to turn to rental inspection technologies and the benefits that this provides.
08 The Importance of Mass Spectrometry
Mass spectrometry (MS) is a cornerstone analytical technique in the pharmaceutical and biopharmaceutical industries. Its ability to provide detailed molecular information has made it indispensable in the development, characterisation, and quality control of complex biological products. As the landscape of therapeutics evolves, Alistair Michel of BioChek UK Ltd explains how MS has continued to adapt and how it will shape the future of pharmaceutical development.
REGULATORY AND COMPLIANCE
10 Plan, Lead, Deliver: A Framework for Regulatory Writing Success
Success in preparing a dossier for submission to regulators hinges on the approach the writing team takes. Their leadership and project management skills are as critical as their ability to communicate clearly and impartially. Keith Dawes and Tim Weber of ICON discuss how a well-coordinated strategy can allow medical writers to navigate a complex landscape with confidence and efficiency.
14 Navigating FDA and USP Regulatory Guidance with LAL Reagents for Bacterial Endotoxin Testing
Navigating the regulatory landscape of the Bacterial Endotoxin Test (BET) involves a thorough review of guidelines from regulatory agencies. While there is considerable overlap in the guidance from these agencies, there are also some points of obscurity that may be difficult to work through. Delaney Novak of FUJIFILM Biosciences discusses how to get an accurate understanding of the requirements for BET and stay on top of the ever-changing landscape.
PRECLINICAL
16 The Impact of Clinical Development Decisions on Time to Market and Development Risk in Oncology
Innovation in oncology R&D is being driven by both advances in basic science and increasing investment, but the efficiency of that innovation is becoming a challenge. Matthew Furlow et al. of Intelligencia AI analyse how reducing a programme’s time to market can be valuable. There is both a need and an opportunity for sponsors to consider strategic decisions and their potential impact, in order to chart the best course to market.
RESEARCH / INNOVATION / DEVELOPMENT
22 Modelling the Bone Marrow Niche In Vitro: A Roadmap for Drug Development
The bone marrow niche provides a tightly controlled setting where cellular interactions, biochemical signalling, and mechanical stimuli collectively shape the fate of hematopoietic stem cells (HSCs). Given its role in haematopoiesis, understanding and accurately replicating this niche is crucial for advancing stem cell therapies and developing effective treatments for haematological diseases, such as leukaemia, aplastic anaemia, and myelodysplastic syndromes. Talita Stessuk of Crown Bioscience explores how the evolution of bone marrow niche in vitro models continues to drive advancements in drug development, providing biopharma companies with powerful tools to study stem cell dynamics, haematological disorders, and therapeutic interventions.
TECHNOLOGY
28 Challenging Targets in Antibody Discovery
Antibody-based therapeutics offer a highly specific approach for diseases such as cancer, but advancements in the space rely on working with target proteins in their native conformation. Antibodies represent a rapidly growing therapeutic modality, demonstrating clinical success across a wide variety of diseases. Miao Li of Bio-Rad Laboratories explores how recent advances in technologies for the development of antibodies are further driving this growth, accelerating the discovery process through the rapid generation and screening of antibody candidates.
32 Building the Foundation for Biopharma 4.0 Through Strategic Digital Transformation
Biopharma 4.0 refers to the next-generation modernisation approach to biopharmaceutical development, manufacturing and quality. Jim Sulzberger of Sapio Sciences evaluates how digital transformation serves as the foundation for Biopharma 4.0. In an industry where precision is critical, accelerating development without sacrificing quality is a competitive advantage. Organisations that embrace this transformation are already realising tangible gains in efficiency, reproducibility, and compliance.
MANUFACTURING AND PROCESSING
36 TPDs: Developing the Next Generation of Oral Therapeutics
Targeted Protein Degraders are ushering in a new era of drug development, offering a fundamentally different approach through the selective and irreversible elimination of disease-causing proteins. Although their development is accompanied by a distinct set of scientific and logistical challenges, specialised CDMOs are playing a pivotal role in overcoming these obstacles, supporting everything from advanced formulation to high-containment manufacturing and regulatory navigation. Anshul Gupte of PCI explains that with continued progress in medicinal chemistry, bioengineering, and computational tools, the momentum behind TPDs is only expected to grow.
SUBSECTION: AI AND MACHINE LEARNING
42 Harnessing the Power of Generative AI: Transforming Life Science Manufacturing
David Staunton at Cognizant analyses the challenges of developing and delivering life-changing medicines at an accelerated pace. Traditional data management approaches ultimately restrict a company’s ability to respond effectively to the dynamic market and regulatory landscape. He proposes Generative AI as a transformative solution, revolutionising how data is utilised in pharmaceutical companies.
46 Building a Global Data Foundation for Scaling AI
AI use cases are rippling across commercial biopharma, helping companies make faster and more informed decisions. However, as Karl Goossens at Veeva explores, almost 70% of top generative AI (GenAI) users cite poor data quality as their most significant obstacle to unlocking AI's full potential. As the adoption of AI applications grows, the true competitive edge lies in the quality of the data fuelling them. Those who focus on creating standardised and well-integrated data can unlock that potential, gaining a competitive advantage and driving long-term success.
48 Regulatory Impact Assessment is Obvious Next Target for GenAI, Experts Conclude
When anything in a product’s make-up or manufacture changes, a whole chain of events is triggered, starting with an assessment of the regulatory impact in each jurisdiction, swiftly followed by the required actions. Any delay or omission could be costly, so smart process automation offers attractive potential. Regulatory experts Preeya Beczek and Agnes Cwienczek of ArisGlobal scope the opportunity.
LOGISTICS & SUPPLY CHAIN
50 Avoiding the Domino Effect: How Smarter Temperature Control and Excursion Management Prevent Clinical Trial Disruptions
When investigational medicinal products (IMPs) are exposed to conditions outside of their approved storage range, sponsors need to be ready to act or risk the integrity and success of clinical trials. Sarah McAliskey of Almac explores how to keep patients safe and clinical trials on track. To do so, sponsors must implement a combination of proactive strategies to prevent temperature excursions during drug transportation and storage, together with a robust, time-efficient process that enhances detectability throughout the entire clinical supply chain.
APPLICATION NOTE
24 Modelling Retinal Safety: The Era of Predictive In Vitro Toxicology Testing
Bringing a novel therapeutic to market is a complex and costly journey, and toxicity remains a leading reason for failure along the way. Despite remarkable advances in drug discovery and screening technologies, unanticipated toxic effects are still responsible for high attrition rates during both preclinical and clinical development, draining time, resources, and opportunity. Dr. Maria Georgiou of Newcells Biotech Ltd explains what they are doing to support innovation in drug discovery through cutting-edge iPSC-derived retinal organoid (RO) and retinal pigment epithelium (RPE) technologies.
40 The Science of Cell Line Development for Biologics: Improving Stability and Yield
Mammalian cell line development is essential to biologics manufacturing, ensuring stable, high-yield expression of therapeutic proteins. With expanding biologics pipelines, the industry is continuously innovating to improve productivity, speed to patient, and scalability. Among the most widely used cell lines for biologics production are Chinese Hamster Ovary (CHO) cells, which have become the gold standard for monoclonal antibody and recombinant protein production. Thermo Fisher explains why their adaptability, scalability, and ability to achieve high titers make them essential for developing monoclonal antibodies and other complex biologics.
Process Analytics – Cover your Bioprocess with Precision Analytics you can Trust
Our clients’ QC professionals come to A&M STABTEST for precision, traceability, compliance and non-negotiable high quality. Process engineers choose our process analytics service for our flexibility, fast turnaround times and the high-quality analytical data they need to optimise their bioprocess – from fermentation to downstream processing to fill and finish.
Depending on your project, we start with scientifically sound methods and progress to full GMP status to support your process validation studies. Our process analytics service integrates seamlessly into your workflows, helping you stay ahead of critical quality attributes and regulatory expectations.
Situated in Germany, we offer all our services under one roof:
• LC-MS characterization of your product throughout the DSP
• Detection of host cell DNA and proteins
• Determination of process-related impurities, e.g., antifoam, MTX, EDTA, IPTG, antibiotics, insulin, protein A …
• Extraction studies of single-use materials according to USP <665>
• Full QC-testing of the DP in different process matrices
Take advantage of our flexibility, enabling analysis of a single sample from early development through to hundreds of samples from QbD and process validation campaigns.
Partner with us for analytical solutions that meet today’s challenges and tomorrow’s standards!
For more information on how we can become your reliable analytics partner for your upcoming project, please contact Dr. Steven A. Watt.
This summer issue of IBI will take a closer look at the production of biopharmaceuticals. Bioprocessing is an ever-interesting field where scientific innovation and digital transformation meet regulatory complexity. Leading experts from the industry examine key developments and present novel applications across the value chain – from discovery and development to manufacturing and compliance. As a keen user of mass spectrometry in the past, I am particularly happy to find an article discussing “The Importance of Mass Spectrometry” for the biopharmaceutical industry presented by Alistair Michel of BioChek UK Ltd.
In the Technology section, Miao Li (Bio-Rad Laboratories) explores how advances in antibody development are accelerating therapeutic discovery in “Challenging Targets in Antibody Discovery.” Jim Sulzberger (Sapio Sciences) explains how digital transformation lays the groundwork for Biopharma 4.0 in “Building the Foundation for Biopharma 4.0.”
Under the AI and Machine Learning subsection, David Staunton (Cognizant) analyses the transformative role of Generative AI in “Harnessing the Power of Generative AI,” offering solutions for data bottlenecks in drug manufacturing. Karl Goossens (Veeva) discusses the central role of data quality in “Building a Global Data Foundation for Scaling AI,” while Preeya Beczek and Agnes Cwienczek (ArisGlobal) assess how GenAI can optimise regulatory processes in “Regulatory Impact Assessment is Obvious Next Target for GenAI.” AI can also be a useful tool in preclinical development. Matthew Furlow et al. (Intelligencia AI) highlight how strategic clinical decisions can reduce time to market and risk in oncology in “The Impact of Clinical Development Decisions.”
In Regulatory and Compliance, Keith Dawes and Tim Weber (ICON) provide a practical framework for successful dossier preparation in “Plan, Lead, Deliver.” Delaney Novak (FUJIFILM Biosciences) clarifies the overlapping and nuanced guidelines for bacterial endotoxin testing in “Navigating FDA and USP Regulatory Guidance with LAL Reagents.”
The Research / Innovation / Development section features Talita Stessuk (Crown Bioscience), who presents cutting-edge in vitro models in “Modelling the Bone Marrow Niche In Vitro,” supporting the development of advanced haematological therapies.
In Manufacturing and Processing, Anshul Gupte of PCI discusses the rise of Targeted Protein Degraders and the crucial role of CDMOs in “TPDs: Developing the Next Generation of Oral Therapeutics.” In “The Science of Cell Line Development for Biologics,” Thermo Fisher emphasises the importance of stable, high-yield cell lines, particularly CHO cells, in biologics production.
In the Watch Pages, Christine Gottschalk (Mettler-Toledo) highlights how rental inspection systems offer strategic agility in “Why Renting Product Inspection Systems Makes Strategic Sense,” while Alistair Michel (BioChek UK Ltd) underscores the evolving significance of mass spectrometry in characterising complex biologics in “The Importance of Mass Spectrometry.”
In Logistics & Supply Chain, Sarah McAliskey (Almac) outlines robust strategies to mitigate temperature excursions in clinical trial logistics in “Avoiding the Domino Effect.” Finally, Dr. Maria Georgiou (Newcells Biotech) describes innovative in vitro toxicology testing for ocular safety in “Modelling Retinal Safety.”
The articles carefully curated by IBI clearly underscore the innovative drive of the biopharmaceutical industry, while at the same time keeping a careful eye on quality and the regulatory demands of international pharmaceutical authorities. What the contributing authors and the organisations they represent also clearly show is how different disciplines and stakeholders, from life sciences, physics and IT to lawmakers, interact to translate cutting-edge science into safe and efficacious medication.
I am looking forward to the exciting developments the second half of 2025 has on offer. Have a great summer and stay safe!
Dr. Steven A. Watt, CBDO (Chief Business Development Officer) at A&M STABTEST GmbH
And Finally…
Welcome to the second edition of IBI in 2025. I have recently joined the team at Senglobal and I am excited to be working closely with many of you to get authoritative, contemporary articles and white papers published. In this edition, our contributors cover a wide range of regulatory and discovery articles. As you can see, we have a special subsection in this issue on AI and Machine Learning, which starts on page 42. Please take a look at the informative ideas in the field that have been put forward by our contributors. On page 10 you will find an excellent article by Keith Dawes and Tim Weber of ICON titled “Plan, Lead, Deliver: A Framework for Regulatory Writing Success.” It focuses on how to form a well-coordinated strategy for preparing a dossier for submission, a key step that can sometimes be overlooked.
We will be distributing at the Bio Process Summit and Biologics USA, as well as attending events in the autumn. I look forward to meeting many of you in person and, in the meantime, please keep submitting your white papers to me for future issues!
• Cellia K. Habita, President & CEO, Arianne Corporation
• Deborah A. Komlos, Senior Medical & Regulatory Writer, Clarivate Analytics
• Francis Crawley, Executive Director of the Good Clinical Practice Alliance – Europe (GCPA) and a World Health Organisation (WHO) Expert in Ethics
• Hermann Schulz, MD, Founder, PresseKontext
• Rafael Antunes, Vice President Business Development, Aurisco Pharmaceutical Europe
• Stanley Tam, General Manager, Eurofins MEDINET (Singapore, Shanghai)
• Stefan Astrom, Founder and CEO of Astrom Research International HB
• Steven A. Watt, CBDO (Chief Business Development Officer) at A&M STABTEST GmbH
Why Renting Product Inspection Systems Makes Strategic Sense for Pharma Manufacturers
Production lines don’t pause for procurement. When pharmaceutical manufacturers face a sudden spike in demand, new quality control requirements or equipment failure, delays in sourcing inspection systems can lead to costly setbacks. Renting product inspection technology offers a rapid, low-risk way to respond – whether it's for bridging gaps in delivery, managing seasonal peaks or testing new solutions before purchase.
In this article, we will explore why more pharmaceutical businesses are turning to rental options as a smart, strategic choice.
Meeting Urgent Quality Control Demands Without Capital Outlay
Whether facing an unexpected increase in production or a temporary equipment breakdown, manufacturers need rapid access to reliable product inspection solutions. Renting checkweighing, metal detection, x-ray or vision inspection systems provides a cost-effective, low-risk response. These systems offer the ability to scale production without the delay or financial commitment associated with capital expenditure. For pharmaceutical manufacturers where precision is paramount, this approach enables continuity without compromise.
Bridging Equipment Gaps with No Downtime
Long lead times are often a reality when procuring new inspection equipment. Rental systems, however, are available immediately and can be installed directly into the production line by expert teams. This avoids costly delays and supports uninterrupted output – essential for companies supplying life-saving medicines on tight deadlines. Temporary equipment can also be used as a contingency measure while permanent solutions are being commissioned, giving manufacturers confidence in their ability to keep operations running smoothly.
Adapting to Seasonal Peaks with Agility
Pharmaceutical production often fluctuates throughout the year, with spikes during flu seasons, public health campaigns or new product launches. Rather than investing in permanent equipment that may sit idle during quieter periods, renting inspection systems allows for temporary increases in capacity. This offers operational flexibility while maintaining rigorous quality control. It also helps manufacturers respond quickly to unexpected surges in demand, without overextending internal resources or compromising delivery timelines.
‘Test Before You Invest’: A Smarter Route to Permanent Solutions
Choosing the right inspection technology is a significant decision, especially in environments with complex packaging formats or sensitive formulations. Renting allows pharmaceutical manufacturers to trial systems in live production conditions, verifying performance and compatibility before committing to a purchase. This approach reduces risk and builds confidence in investment decisions.
Rapid Re-Inspection Capabilities for Recall or Quality Verification
In the event of a product recall or the need for additional quality assurance, rental equipment provides a fast and efficient route to re-inspection. A manufacturer that initially employed metal detection might, for example, rent x-ray technology to detect other contaminants such as glass and plastics – all without disrupting routine operations. This is particularly valuable when batches need to be rechecked at short notice, protecting brand integrity and maintaining supply commitments with minimal interruption.
Versatile Technologies Available for Hire
A wide range of product inspection technologies is available for rent, offering pharmaceutical manufacturers the flexibility to maintain high standards without committing to long-term investments. Checkweighing systems, for example, from basic to highly precise models, are commonly used to monitor fill levels and control product weight – essential for accurate dosing – and can also verify that the instruction leaflet is present. Similarly, metal detection solutions can be applied across various formats, including conveyor, pipeline and free-fall setups, and are capable of identifying ferrous, non-ferrous and stainless-steel contaminants.
X-ray inspection technology is another rental solution, particularly effective for detecting a broad spectrum of foreign bodies, even within foil blister packs, and can simultaneously perform additional quality control tasks such as verifying pack integrity and counting components. Further, vision inspection systems help verify labels and printed information, supporting compliance with pharmaceutical labelling standards.
In many cases, combination systems that integrate checkweighing with either metal detection or x-ray (with vision inspection as a third technology option) into a single unit are also a viable solution, making efficient use of space and streamlining quality checks.
Rental solutions are typically supported with expert installation and technical guidance, allowing for quick and effective integration into existing production lines.
Fast Access and Comprehensive Support
Rental systems can be deployed almost immediately. Suppliers often offer remote demonstrations, testing in controlled environments and on-site installations. Throughout the rental period, customers can benefit from technical training, responsive service support and access to application expertise – helping maintain consistent performance at every stage.
Conclusion
For manufacturers seeking to drive performance without the constraints of large upfront investments, rental inspection technology opens new possibilities. It provides a practical route to flexibility, innovation and growth – helping businesses meet today’s demands while building a foundation for tomorrow’s success.
To explore the Mettler-Toledo rental solutions or to trial systems under the 'Test Before You Invest' programme, visit: www.mt.com/pi-testbeforeinvest-pr
Christine Gottschalk
As Head of PI Test and Demo Center, Christine supports the Mettler-Toledo team across Europe by supplying them with product inspection solutions that help food and pharmaceutical customers improve their quality management. She leads the ‘Test Before You Invest’ approach, wherein customers can find the right product inspection solutions for their applications by experiencing demos, renting machines and testing products. Christine has worked for Mettler-Toledo Product Inspection for over 10 years. Since 2018 she has been responsible for the Test and Demo Center, including rentals and service support throughout Europe.
The Importance of Mass Spectrometry
Mass spectrometry (MS) has long been a cornerstone analytical technique in the pharmaceutical and biopharmaceutical industries. Its ability to provide detailed molecular information has made it indispensable in the development, characterisation, and quality control of complex biological products. As the landscape of therapeutics evolves to include biologics, cell and gene therapies, and nucleic acid-based treatments, MS continues to adapt, offering solutions to emerging analytical challenges.
For those seeking a deeper technical insight into this topic, my colleague, Dr. Milena Quaglia, has written a comprehensive review article on the evolution of mass spectrometry for the analysis of bioproducts, which is available on the RSSL website. Readers are encouraged to refer to that article for more detailed discussion and reference information.
Biopharmaceutical Foundations
Traditional biologics, such as monoclonal antibodies (mAbs), remain a critical part of pharmaceutical pipelines. Characterisation of these large and heterogeneous molecules requires detailed structural information to confirm identity, assess post-translational modifications, and evaluate stability. MS plays a pivotal role here through peptide mapping, intact mass analysis, and glycan profiling.
MS-based methods offer unparalleled resolution and accuracy when assessing changes to mAbs over time or in response to formulation conditions. Moreover, regulatory guidance from agencies such as the FDA and EMA increasingly expects high-resolution analytical data during product development. As such, MS provides not only research support but also fulfils critical roles in meeting compliance and quality expectations.
Cell and Gene Therapy
Cell and gene therapies (CGTs) represent some of the most innovative and complex modalities currently in clinical development and commercial use. These therapies rely on delivering genetic material to modify or correct cellular function. Viral vectors, such as adeno-associated viruses (AAVs), are among the most common delivery vehicles.
One of the key analytical challenges in this space is the characterisation of viral vector particles; specifically, understanding the ratio of full to empty capsids, detecting aggregates, and confirming identity. While techniques such as qPCR and ELISA are commonly used, MS is emerging as a powerful complementary tool.
Native MS allows the analysis of intact AAV capsids in their natural, non-denatured form. This technique offers insights into capsid assembly, particle integrity, and batch heterogeneity. When combined with ion mobility spectrometry, it can provide data on size and conformation, offering a deeper understanding of the structural landscape of viral vectors.
Additionally, proteomic workflows can be used to characterise the proteins making up viral capsids or the host cell protein (HCP) impurities that may co-purify during production. Given the immunogenic risk posed by HCPs, their detection and quantification are essential. MS-based proteomics, including data-independent acquisition (DIA) and parallel reaction monitoring (PRM), provides sensitive and specific options for profiling these impurities.
Characterisation of Gene Therapy Components
In gene therapies using plasmids or mRNA, another layer of complexity is introduced. Mass spectrometry can contribute to the characterisation of nucleic acid structures and the assessment of degradation products. Although MS of large RNA molecules presents challenges due to size and ionisation efficiency, progress is being made in the use of top-down and bottom-up strategies to confirm sequence integrity and identify chemical modifications.
Lipid nanoparticles (LNPs), which are increasingly used as delivery systems for RNA therapies, also require detailed compositional analysis. MS can be employed to study the lipid components, identify impurities, and ensure batch-to-batch consistency, especially when combined with chromatography-based separations.
Advanced Instrumentation and Workflows
The evolution of instrumentation has significantly expanded the capabilities of MS. High-resolution accurate-mass (HRAM) systems, such as Orbitrap and time-of-flight (TOF) analysers, are now standard in many labs due to their sensitivity and resolution. Coupling these instruments with ultra-high-performance liquid chromatography (UHPLC) or capillary electrophoresis (CE) provides powerful multidimensional analytical platforms.
Ion mobility spectrometry (IMS) adds a further separation dimension based on molecular shape and size, which is particularly useful when analysing conformers or aggregates. When applied to biotherapeutics or viral vectors, IMS-MS workflows can uncover hidden heterogeneity and identify minor variants that may impact safety or efficacy.
The adoption of automation and data-processing software also continues to grow, facilitating high-throughput workflows, minimising human error, and improving data reproducibility. These developments are critical in CGT pipelines, where speed to market can be a competitive differentiator.
From Discovery to QC
Initially, MS was primarily a tool for discovery and development. However, its role in quality control (QC) is growing. With improvements in robustness and user-friendly software, MS-based methods are now being validated for lot release testing. In CGTs, where conventional methods may not suffice, MS offers precise identification and quantitation capabilities suitable for QC environments.
Regulatory acceptance is increasing, particularly when MS methods are shown to offer advantages over traditional techniques. As such, companies are investing in the development of platform MS methods that can be adapted across a range of products.
Outlook and Challenges
Despite its advantages, the integration of MS into routine CGT workflows is not without obstacles. Challenges include the complexity of sample preparation, especially for cell-based therapies, and the need for standardised protocols. Instrument cost and the requirement for skilled operators can also be limiting factors for smaller organisations or early-stage developers. Nonetheless, ongoing advances in miniaturisation, automation, and data interpretation are likely to mitigate these issues. Cross-industry initiatives aimed at harmonising analytical standards are also expected to support broader adoption.
The push toward digital transformation in pharma and biopharma is likely to further embed MS into data-driven decision making. The ability of MS to provide comprehensive, high-resolution molecular information positions it as a cornerstone technology not just for traditional biologics but also for next-generation therapies.
Conclusion
Mass spectrometry is no longer just a research tool; it is rapidly becoming a critical enabler of modern medicine. From the detailed characterisation of biologics to the emerging applications in gene therapy and viral vector analysis, MS provides unmatched analytical power. As therapeutic modalities continue to diversify, the role of MS will only grow, shaping the future of pharmaceutical development and quality control.
Alistair Michel is an immunologist with a BSc (Hons) from the University of Edinburgh and an MSc from Imperial College London. He is a member of the British Society for Immunology and the Royal Society of Biology. With over 20 years of experience as a bioanalytical scientist, he has developed, validated, and optimised a range of analytical methods, including ELISA, other immunoassays, enzyme-based assays, and gel-based techniques, in GxP-compliant laboratories. Alistair currently works at BioChek UK Ltd, where he leads technical process development for ELISA products and plays a central role in bridging R&D and production to drive continuous improvement.
Regulatory and Compliance
Plan, Lead, Deliver: A Framework for Regulatory Writing Success
Q: Is your team simply writing a dossier, or are they following a clear strategy?
A: The most effective teams do both, but strategy should come first. Preparing a submission package entails developing a detailed project plan and adopting a well-coordinated strategy to usher a dossier through the various stages of compilation, review, quality control, editing, and completion.
Success in preparing a dossier for submission to regulators hinges on the approach the writing team takes, including how they plan, communicate, coordinate, and resolve differences. In other words, their leadership and project management skills are every bit as critical as their ability to communicate clearly, conclusively, and impartially.
The Need for Leadership at Multiple Levels
The complexity of preparing a regulatory submission calls for strong leadership across multiple levels to ensure that deadlines are met, divergent views explored, quality issues resolved, and resources allotted efficiently. Otherwise, there is a risk that the project can be derailed, delaying submission and, ultimately, product availability for patients.
The scope of the project requires that an experienced senior manager be responsible for overseeing the team of medical writers and their work. This leader is charged with setting the direction, creating alignment across the diverse teams, promoting transparency, and ensuring accountability for meeting deadlines and quality standards.
The individual medical writers who are drafting the various document modules must also demonstrate leadership skills in executing the overall plan and accepting responsibility for meeting expectations with their contribution.
The Essential Pre-work
Ideally, a medical writing team is established and led by a senior manager. This team will consist of experienced senior medical writers who act as document owners for each component of the submission. They are primarily responsible for developing each document and may be supported by one or two medical copy editors or other support writers (to prepare patient narratives and appendices), a regulatory publisher, and, if needed, a clinical trial transparency associate. The medical writing team can also interact with important representatives from other groups associated with the submission, such as Medical Affairs, Statistics, Pharmacokinetics, Regulatory Affairs, and Clinical Operations, in a wider cross-functional team.
The medical writing team has much to do in advance of entering the first keystroke, beginning with convening a kick-off meeting to train all involved on their role, explain the tasks ahead, and clarify the standards expected. Members should also understand the methods for communicating with one another and the pathway for escalating issues or sharing achievements.
The next step is to develop a project plan with input from all key stakeholders to define the scope of the project, assign responsibilities, and lay out a detailed timeline. While it is helpful to include day-to-day milestones and interdependencies in the timeline, the team should avoid revising the entire schedule whenever a step slips by a day or two.
At this point, the team should agree on key messages and a storyboard of how they’ll be presented, since defining the end message in advance will help maintain focus throughout the process. The plan should include a checklist that maps out what information will be needed for each module and who will be responsible for securing it. This is an extensive effort that can’t be completed in an afternoon.
It is vital to involve reviewers at this early stage to brief them on what will be expected of them to avoid conflicts at a later stage – conflicts that put the timeline at risk. Will their focus be on scientific accuracy, compliance, or formatting? The goal is to prevent the sudden appearance of a “wild card” reviewer who weighs in at the end of the process, perhaps disagreeing with content that has already passed multiple approval stages.
For efficiency’s sake, reviewers should be instructed to:
• Provide clear and constructive comments (rather than open-ended questions) and alternative text where applicable. Conceptual comments or those that invite further discussion can delay progress.
• Hold discussions outside of the document review system as needed to reach consensus.
• Refrain from making editorial comments, as these will be addressed later.
• Make global comments once if they apply to multiple sections.
Best Practices for Managing the Project
To enhance the quality of the submission and shorten the preparation timeline, medical writers should adopt a well-coordinated strategy that entails:
• Convening regular status meetings of the cross-functional team. These meetings allow all involved to stay informed, the lead writer to stay abreast of co-authors’ progress, and team members to share helpful tips and tricks.
• Creating a shell document using source documents such as the protocol and including pre-agreed results text, in-text tables, and conclusions (based on key messages), with extensive placeholders for the study results. The existing text can be reviewed and agreed upon during the shell development; only the draft results will need to be reviewed later, as the rest of the document will have been “locked down.”
• Holding structured comments resolution meetings (CRMs) with mandatory attendance. Addressing conflicting or non-consolidated comments from reviewers is typically one of the greatest and most time-consuming challenges medical writers face. Such meetings should be scheduled as soon as the overall timeline is agreed upon, and if a key decision maker cannot attend, a suitable backup should be appointed. To control the process, comments should be circulated prior to the meeting and categorised as “accepted without discussion,” “rejected with reasons,” or “require further discussion at the CRM.” It is helpful to set time limits on each discussion and to annotate adopted resolutions in the draft.
• Employing technology to the fullest extent possible. Centralised authoring/review platforms are available to monitor progress, track changes, control versions, collaborate in real time, and ensure adherence to timelines. Such automated tools also facilitate the flow of information between modules. Comprehensive, electronic documentation provides an audit trail for accountability and compliance as well as facilitating communication across geographically dispersed teams.
• Applying rigorous quality controls to maintain consistency in terminology, standards, and information across documents. Consistency across documents is, in fact, the biggest driver of quality in the process. Ideally, quality review teams should not have been involved in preparing the draft so that they can bring a fresh perspective and minimal bias to the task. Their ability to spot inconsistencies will be aided by providing them with a style guide or cheat sheet on what to consider. Customised checklists can also help them ensure that the document aligns with regulatory guidelines around document content, structure, and formatting. Quality control reviews should be conducted on a rolling basis as sections are ready, rather than once all components are completed.
• Convening a signature meeting for final approval.
Navigating a complex submission landscape with confidence and efficiency demands that medical writers carry out a well-coordinated strategy. Through proper planning, following a set of established best practices, and relying on available tools, medical writers can not only transform complex data into clear, concise, and scientifically robust documents, but they can also minimise the risk of queries and delays along the way.
Keith Dawes
Keith Dawes, PhD, Senior Director, Medical Writing, ICON has over 20 years of medical writing experience in CROs, Pharma and medical communications. He has extensive experience in writing and managing regulatory submissions. Since 2017, Keith has also managed ICON’s writing team in Europe, India and China.
Tim Weber
Tim Weber, PhD, Senior Director, Medical Writing, ICON has more than 25 years of CRO medical writing experience, including over 15 years as a manager of medical writing staff in North America. Before moving into management, his primary areas of focus included GI disorders, oncology, immunology, and respiratory. He has written and managed regulatory submissions in all of these therapeutic areas.
Company Profile
InsideReg is a specialist regulatory affairs consultancy, with full-service capabilities, founded in 2018 by ex-MHRA Assessor and CHMP Expert, Dr. Laura Millichamp. InsideReg offers regulatory affairs advice and support throughout the development of medicinal products from proof of concept to Marketing Authorisation and commercialisation. All treatment modalities are supported, including biologicals, ATMPs, small molecule, microbiome and radiopharmaceuticals; and all indications, including oncology, neurodegenerative disorders, pain, cardiovascular, rare and paediatric indications.
The Consulting team comprises highly experienced Clinical, Non-Clinical, CMC and Regulatory professionals, all of whom bring decades of experience to the team, and many are ex-Regulatory Authority Assessors. InsideReg focuses on delivering agile, innovative, and solution-oriented support, ensuring that regulatory strategies align with broader business goals and adapting its services to meet the evolving needs of clients. Whether working with start-ups, mid-sized companies, or large pharmaceutical firms, the consultancy offers scalable solutions that align with each client’s stage of development and regulatory maturity. A summary of some of the key services provided is outlined below.
CORE SERVICES
• Early Regulatory Strategy
The regulatory roadmap starts with a landscape analysis of the current market and regulatory expectations, and a gap analysis of the non-clinical data generated to date, detailing any additional studies that may be needed. A conceptual clinical development plan is then developed, considering future commercial product positioning, highlighting opportunities for differentiation from competitors.
A robust and clear regulatory roadmap is then developed, with clearly defined data requirements, optimised to ensure the most expeditious path to market is selected. Recent case studies include repositioning a product from the 505(b)(1) to the 505(b)(2) pathway to leverage published data, reduce clinical trial burden and shorten time to market from approximately 10 years to approximately 4 years.
• Clinical Trial Applications
Approximately 70% of InsideReg clients are in Phase 1 or 2 of development. InsideReg clinical experts include specialist Phase 1 and Phase 2 clinical study managers who lead the CRO relationship from first RfP to trial close-out, working seamlessly with both the Sponsor and the InsideReg regulatory team. The regulatory team generates all the documentation required for a successful CTA (e.g. clinical study protocol authoring, IND/IMPD and IB), checking all the documents to ensure a high-quality submission. Any potential questions are identified in advance so that the short timelines for responses (e.g. 2 days in Canada, 12 days in the EU) can be met with ease. Recent case studies include preparation and submission of CTAs in the EU, UK, Canada and Switzerland, through the appropriate Agency portals (CTIS, ESG NextGen and IRAS), for a global clinical trial.
• Scientific Advice
InsideReg Expert teams frequently support Regulatory Authority meetings (e.g. EOP2, Type C, Pre-IND) with EU Authorities and FDA on a range of topics and indications. With a deep knowledge of the current regulatory landscape and Authority expectations, a clear direction and strategy is set from the outset, precisely positioning the data to maximise the chances of agreement. Full support is provided with briefing book authoring and meeting preparation, identifying possible areas for discussion focus, assigning team roles and responsibility and preparing presentation slides. Rehearsals ensure clear messaging and presentation of a highly professional image to the Regulatory Authorities.
• Orphan Drug Designation
InsideReg regulatory experts have extensive experience with ODDs, having supported over 250 ODD applications in both the EU and US. Once a good probability of success based on clear prevalence and a robust data set has been identified through an initial feasibility assessment, an ODD is prepared and submitted. InsideReg’s EU presence enables them to hold the ODD on behalf of non-EU clients.
• Marketing Authorisation and Lifecycle maintenance
InsideReg Regulatory Experts have in-depth experience of the available submission pathways and are ideally placed to advise on the best strategy for each product, considering submission pathways such as 505(b)(1) or (2) in the US, Centralised or Decentralised procedures in the EU, EU Member State preferences and the current regulatory landscape in each market.
In-house developed checklists, trackers and project plans are finalised to manage the process smoothly from start to finish. InsideReg Expert teams identify any potential Major Objections and authority questions in advance and develop risk mitigation plans if needed. This precision-driven approach results in a high-quality dossier, instilling confidence in the Assessing team and maximising the chances of success.
Following approval, InsideReg’s dedicated lifecycle maintenance team prepares and submits variations on behalf of clients to support timely and accurate maintenance of INDs and product licences.
WHY CHOOSE InsideReg?
• Proven Track Record: Over two decades of successful regulatory support.
• Tailored Solutions: Services customised to each client’s needs and goals.
• Expert Team: Professionals with both industry and regulatory authority experience.
• Agile and Innovative: Emphasis on flexible, forward-thinking approaches.
• End-to-End Support: Guidance from early development through to post-launch.
Contact: Laura Millichamp, Founder
InsideReg, Rue du Château 4, 1350 Orbe, Switzerland
W: www.insidereg.com E: contact@insidereg.com T: +41 21 802 18 77
Regulatory and Compliance
Navigating FDA and USP Regulatory Guidance with LAL Reagents for Bacterial Endotoxin Testing
Navigating the regulatory landscape of the Bacterial Endotoxin Test (BET) involves a thorough review of guidelines from regulatory agencies such as the FDA, USP, and AAMI. While there is considerable overlap in the guidance from these agencies, there are also some points of obscurity that may be difficult to work through. A common reservation for those who perform compendial BET is whether the reagent they are using is FDA-licensed or not, and what that means in terms of FDA acceptance and USP chapter <85> compliance. An example of this will be looked at closely using FUJIFILM Wako’s kinetic chromogenic LAL reagent, Limulus Colour KY.
In 1987, the FDA published guidance for LAL testing titled Guideline on Validation of the Limulus Amebocyte Lysate Test as an End-Product Endotoxin Test for Human and Animal Parenteral Drugs, Biological Products, and Medical Devices. In 2011, the FDA determined that this document was obsolete and withdrew the guidance in favour of USP and AAMI guidelines. There was dissonance between the requirements that the FDA stated in the document and those of the USP and AAMI. This included discrepancies regarding endotoxin limits, qualification and validation procedures, and medical device testing.
One source of uncertainty was the requirement for use of FDA-licensed LAL reagents. The 1987 guidance stated that LAL reagents used for endotoxin testing should be licensed by the FDA Center for Biologics Evaluation and Research (CBER); however, the USP does not include that same explicit requirement in chapter <85> on Bacterial Endotoxin Testing. The 1987 guidance stated that manufacturers “shall use an LAL reagent licensed by CBER in all validation, in-process, and end-product LAL tests.” In contrast, USP <85> states that LAL reagent “refers only to a product manufactured in accordance with the regulations of the competent authority.” At the time, this was interpreted to mean that LAL reagents should be FDA-licensed. However, this guidance has since been replaced by the FDA’s 2012 Guidance for Industry: Pyrogen and Endotoxins Testing: Questions and Answers, and the interpretation of that statement in USP <85> has been further clarified.
In the 2012 FDA Guidance, the agency instead deferred to the USP and AAMI guidelines on the Bacterial Endotoxin Test, stating that the “FDA has found that the published USP and AAMI documents describing methods and calculation of pyrogen and endotoxins testing limits provide industry with appropriate information. We also note the continued development of USP Chapters <85> and <161> and FDA guidance documents.” The Agency withdrew the 1987 Guidance because it “no longer reflects the Agency’s current thinking on the topic.” Additionally, the 2012 guidance did not put forth the same requirement that manufacturers must use FDA-licensed reagents for endotoxin testing, regardless of whether they are used for in-process or end-product testing.
With the 1987 Guidance withdrawn and the 2012 Guidance taking its place, there were no longer explicit requirements from the FDA that LAL reagents for endotoxin testing must be licensed. In the same year, the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use provided harmonisation between major international pharmacopeial texts on BET.
ICH Q4B Annex 14 imparts harmonisation between USP <85>, Japanese Pharmacopeia (JP) 4.01, and European Pharmacopeia (Ph. Eur.) 2.6.14 texts on Bacterial Endotoxin Testing. ICH Q4B Annex 14 section 2.1 states that “…the analytical procedures described in the official pharmacopoeial texts, Ph.Eur. 2.6.14. Bacterial Endotoxins, JP 4.01 Bacterial Endotoxins Test, and USP General Chapter <85> Bacterial Endotoxins Test, can be used as interchangeable…”
Additionally, the guideline deems that there is interchangeability between the three reference standards (RSE) of the pharmacopeial bodies:
“The USP, JP, and Ph.Eur. reference standards are considered interchangeable as they have been suitably calibrated against the WHO (World Health Organisation) International Standard for Endotoxin.”
The FDA’s 2013 Guidance for Industry discusses the ICH Q4B Annex 14 harmonisation (Guidance for Industry: Evaluation and Recommendation of Pharmacopoeial Texts for Use in the ICH Regions). This guidance provides the agency’s current thoughts and practices for Bacterial Endotoxin Testing and ICH Q4B Annex 14 harmonisation. In reference to section 2.1 of the guidance, the document states that “the pharmacopeial texts referenced in section II.A (2.1) of this annex can be considered interchangeable.” The guidance also states that the FDA may still require method suitability testing for the specific product or material being examined, irrespective of the origin of the method. In other words, method suitability testing may be requested for FDA-licensed and non-licensed reagents alike. It is important to note, however, that method suitability, or interference testing, is already a required preliminary step for verification of BET under the USP. This does not add an additional requirement; rather, it elaborates on and emphasises the thinking of the USP. In summary, regardless of the origin or licensure of a reagent, preliminary method suitability testing should be performed for all LAL reagents prior to compliant routine testing, following the Test for Interfering Factors in USP <85>.
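To make the preceding point more concrete, the sketch below illustrates the kind of check that interference (method suitability) testing involves: recovery of a positive product control (PPC) spike is compared against the 50–200% acceptance window that USP <85> applies to photometric techniques such as the kinetic chromogenic method. The numbers are invented purely for illustration and the snippet is not part of any vendor’s software.

```python
def spike_recovery(measured_spiked_eu_ml: float,
                   measured_unspiked_eu_ml: float,
                   added_spike_eu_ml: float) -> float:
    """Percent recovery of the known endotoxin spike (positive product control)."""
    return 100.0 * (measured_spiked_eu_ml - measured_unspiked_eu_ml) / added_spike_eu_ml

# Illustrative values only: a 0.5 EU/mL spike into a diluted test solution.
recovery = spike_recovery(measured_spiked_eu_ml=0.62,
                          measured_unspiked_eu_ml=0.08,
                          added_spike_eu_ml=0.5)

# USP <85> accepts photometric-method spike recoveries between 50% and 200%;
# results outside that window point to interference that must be resolved
# (e.g., by further dilution or sample treatment) before routine testing.
status = "pass" if 50 <= recovery <= 200 else "investigate interference"
print(f"PPC recovery: {recovery:.0f}% -> {status}")
```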
With the licensing requirement of the 1987 guidance withdrawn and subsequent pharmacopeial harmonisation for BET, what does this mean for pharmaceutical manufacturers?
In order to examine the relationship between the FDA, USP, and ICH guidelines for Bacterial Endotoxin Testing, FUJIFILM Wako’s Limulus Colour KY reagent will be assessed as a case study. The kinetic chromogenic LAL reagent is manufactured by FUJIFILM Wako in Japan. Since the reagent is manufactured outside the US, it does not fall under FDA CBER’s licensing jurisdiction as a manufactured biological product. However, as a test reagent for Bacterial Endotoxin Testing, it is accepted by FDA’s Center for Drug Evaluation and Research (CDER) for all testing requiring USP <85>.
Limulus Colour KY is manufactured in Japan under requirements following Japanese Pharmacopeia 4.01 on BET, which includes the standardisation of matched control standard endotoxin (CSE) to JP-RSE. Under the ICH Q4B Annex 14 guidance, JP 4.01 and JP-RSE are both considered interchangeable with USP <85> and USP-RSE, respectively. This means that Limulus Colour KY reagent, although manufactured in Japan, is fully USP <85> compliant and can be used for compliant endotoxin testing. Because the reagent follows the guidelines set forth by the USP and ICH, the FDA accepts Limulus Colour KY for use as a kinetic chromogenic LAL reagent in the United States. This includes usage of the reagent for validation, in-process, and end-product testing.
When navigating the world of Bacterial Endotoxin and Pyrogen testing, there are various guidelines from different regulatory agencies to consider. In order to get an accurate understanding of the requirements for BET, it is important to examine the relationship between these guidelines and agencies. USP Chapter <85> and harmonised texts JP 4.01 and Ph. Eur. 2.6.14 provide the overarching requirements for BET. Guidance texts, such as the FDA Guidance for Industry documents, are able to fill in any gaps that may be left unanswered by the regulatory chapters and pharmacopeia.
While the technologies and methodologies for BET continue to change and adapt, so does the regulatory landscape.
REFERENCES
1. United States Pharmacopeia. Chapter 85. The Bacterial Endotoxins Test.
2. Japanese Pharmacopeia. Section 4.01. Bacterial Endotoxins Test.
3. European Pharmacopeia. Section 2.6.14. Bacterial Endotoxins.
4. International Conference on Harmonization of Technical Requirements for Registration of Pharmaceuticals for Human Use. ICH Harmonized Tripartite Guideline. Evaluation and Recommendation of Pharmacopoeial Texts for Use in the ICH Regions On Bacterial Endotoxins Test General Chapter. Q4B Annex 14.
5. FDA Guidance for Industry Q4B Evaluation and Recommendation of Pharmacopoeial Texts for Use in the ICH Regions Annex 14 Bacterial Endotoxins Test General Chapter (2013).
6. FDA Guidance for Industry: Pyrogen and Endotoxins Testing: Questions and Answers (2012).
7. FDA Guideline on Validation of the Limulus Amebocyte Lysate Test as an End-Product Endotoxin Test for Human and Animal Parenteral Drugs, Biological Products, and Medical Devices (1987).
Delaney Novak
Delaney Novak is a Technical Specialist for the Pyrogen Testing Division of FUJIFILM Biosciences. She holds a B.S. in Environmental Science alongside a minor in Biology. She enjoys working in a collaborative environment and is always open to addressing new challenges and answering complex questions. She applies these skills to further support any technical needs or concerns you may have in your bacterial endotoxin testing endeavors.
The Impact of Clinical Development Decisions on Time to Market and Development Risk in Oncology
Key Findings
• Every month a programme can accelerate to market adds an estimated $5–6M of net present value per billion dollars of peak-year sales (see the worked example after this list)
• We have identified clinical development decisions that entail trade-offs between accelerating time to market and improving likelihood of approval
• We have identified common clinical development decisions that have a relatively neutral impact on time to market and likelihood of approval
• We have examined skipping trial phases and aggregating trial phases, which have disparate impacts on time to market and likelihood of approval
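To put the first key finding into perspective, here is a back-of-the-envelope illustration; the six months of acceleration and the $2B peak-year sales figure are assumed purely for the example, and $5.5M/month is simply the midpoint of the $5–6M range above:

\[
\Delta\mathrm{NPV} \;\approx\; \underbrace{\$5.5\,\mathrm{M}}_{\text{per month per \$1B peak sales}} \times \underbrace{6\ \text{months}}_{\text{assumed acceleration}} \times \underbrace{2}_{\text{assumed peak-year sales, \$B}} \;\approx\; \$66\,\mathrm{M}
\]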
Why Time to Market Matters in Oncology Drug Development
Innovation in oncology R&D has been driven both by advances in basic science (e.g., improved understanding of hallmarks of cancer) and steadily increasing biopharma R&D investment (see Figure 1). This growing investment has been justified by oncology’s position as a therapeutic area that offers an efficient path to marketable drugs, driven by a high percentage of unmet needs and relatively short development times.
However, the efficiency of innovation in oncology R&D is becoming a challenge. Success rates for oncology clinical trials (Phase 1–3) have declined steadily from 2015 to 2023 (see Figure 1). Notably, the distribution of clinical trials by phase has remained relatively consistent over this period, meaning the decline cannot be explained by a shift in the phase mix and instead reflects genuinely lower success rates within each phase.
There is debate regarding accelerating treatments and addressing unmet patient needs for cancer medicines. The more quickly and efficiently new cancer treatments are commercialised, the better the industry can address patients’
medical needs, as well as the financial and operational goals of optimising R&D investment and return on investment. As an industry, we achieve these goals through strategic decisions that drive clinical trial design or sequencing, or operational decisions that define clinical trial execution once designed. This analysis focuses on the former, specifically on the following strategic decisions:
1. Pursuit of accelerated approval: Seeking approval for drugs that target serious conditions and fill unmet medical needs based on surrogate endpoints.
2. Use of surrogate endpoints (for regular approvals): Employing markers that have been proven (or are likely) to predict clinical benefits.
3. Front-loading indication expansion: Initiating clinical programmes to expand approved indications for drugs before they demonstrate clinical benefit in their initial registrational clinical trials.
4. Study of patient subpopulations (as opposed to allcomers): Targeting focused subgroups (e.g., those defined by biomarker, risk status, age) rather than entire tumour types.
5. Innovative programme design – aggregating trial phases (for regular approvals): Designing and running programmes that combine sequential phases into a single trial (e.g., a combined Phase 1/2 trial in place of separate Phase 1 and Phase 2 trials).
6. Innovative programme design – skipping trial phases (for regular approvals): Designing and running programmes that omit a phase (e.g., initiating a Phase 2 trial in a specific tumour type without a Phase 1 trial dedicated to that tumour type).
7. Choice of novel (rather than non-novel) clinical trial comparators: Comparing against a novel medicine (i.e., medicines other than chemotherapy and other generic cytotoxic regimens, radiotherapy, generic hormonal therapies, surgery, and placebo/observation/best supportive care).
The U.S. National Academies of Sciences, Engineering, and Medicine (NASEM) have identified most of these decisions as mechanisms to address inefficiency in cancer drug development. However, these decisions often involve difficult trade-offs between accelerating time to market and improving the probability of programme success.
The industry often finds itself facing such trade-offs, as illustrated by Roche’s CEO, Dr. Thomas Schinecker, who said at the company’s 2024 Pharma Day:
“We have in fact made decisions in some cases to stop studies that we did not feel met the bar. One example was in the tiragolumab space. There were two – at least a couple of studies, including a Phase 3 trial that we terminated, because we didn’t feel like the aggregate evidence – emerging evidence justified that investment. So, certainly there are cases where ethically, it doesn’t make sense to stop Phase 3 studies midstream, but we are intentional about redirecting resources no matter where those resources are, if that is what the bar says to do.”
Methodology
For this analysis we examined over 20,000 industry-sponsored interventional oncology clinical programmes (around 12,400 clinical trials) included in Intelligencia AI’s proprietary database. We further focused on a subset of 10,129 programmes that were completed (“historical”) and resulted either in approval (regular or accelerated) or discontinuation. These historical programmes were further classified according to the seven strategic decisions described earlier.
Details of historical trial cohorts corresponding to each strategic decision are described in Figure 2.
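To make the cohort construction concrete, the short sketch below shows how completed programmes might be flagged against a strategic decision and then compared on success rate and time to market. The column names, example records, and flagging rule are illustrative assumptions for exposition only, not Intelligencia AI's actual schema or data.

import pandas as pd

# Hypothetical programme-level records; column names and values are illustrative only.
programmes = pd.DataFrame([
    {"programme_id": "P001", "outcome": "approved",     "approval_type": "accelerated",
     "comparator": "novel",     "phases_skipped": True,  "years_to_market": 5.1},
    {"programme_id": "P002", "outcome": "discontinued", "approval_type": None,
     "comparator": "non-novel", "phases_skipped": False, "years_to_market": None},
    {"programme_id": "P003", "outcome": "approved",     "approval_type": "regular",
     "comparator": "non-novel", "phases_skipped": True,  "years_to_market": 6.8},
])

# Flag one strategic decision, then compare the cohorts with and without it.
programmes["pursued_accelerated"] = programmes["approval_type"] == "accelerated"

def cohort_summary(df: pd.DataFrame, flag: str) -> pd.DataFrame:
    """Success rate and mean time to market for programmes with/without a decision flag."""
    grouped = df.groupby(flag)
    return pd.DataFrame({
        "n": grouped.size(),
        "success_rate": grouped["outcome"].apply(lambda s: (s == "approved").mean()),
        "mean_years_to_market": grouped["years_to_market"].mean(),
    })

print(cohort_summary(programmes, "pursued_accelerated"))

The same pattern extends to the other decisions simply by flagging a different column before grouping.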
Executive Summary
To evaluate each of these strategic decisions, we assessed their association with time to market and historical success rate. The results are summarised in Figure 3. Consistent with the theme of difficult trade-offs, decisions to pursue accelerated approval, test against a novel comparator, or front-load indication expansion are high-risk, high-reward. Clinical programmes employing these decisions on average reach the market more quickly than comparable programmes that do not, though their success rates are correspondingly lower. Unsurprisingly, these high-risk, high-reward decisions are less common.
Using surrogate endpoints (in pursuit of regular approval) or testing within patient subpopulations are relatively more common decisions with a neutral observed effect on time to market and historical success rate. Aggregating trial phases (in pursuit of regular approval) is surprisingly common, despite slightly increasing time to market and adding risk compared to keeping the phases separate.
By contrast, skipping trial phases (in pursuit of regular approval), the most common decision in this analysis, significantly
reduces time to market while also improving historical success rate relative to not skipping trial phases. The following sections provide detailed insights into each strategic decision.
1. Decisions Requiring Difficult Trade-offs
Among the high-risk, high-reward strategic decisions, we find that pursuing accelerated approval, testing against novel comparators, and front-loading indication expansion require difficult trade-offs: each is associated with a shorter time to market, but with a lower success rate than its alternative.
Accelerated approvals, first instituted by the FDA in 1992, have historically reduced time to market for oncology drugs by 28% compared to regular approvals (see Figure 4). However, for trials started in the past five years, this time-saving advantage has decreased to 18%. This decrease is due in part to increased scrutiny around accelerated approvals; since 2020, taking into account the 2021 Oncologic Drugs Advisory Committee (ODAC) review of dangling checkpoint inhibitor approvals and the impact of Project Confirm, 13 accelerated approvals have been withdrawn. The accelerated approval approach also carries additional risk: programmes pursuing accelerated approval
Figure 3: Reducing time to market vs. undertaking risk in clinical development
Figure 2: Description of analysis cohorts
have historically shown a lower success rate compared to programmes pursuing regular approval (1.5% vs. 4.4%).
The choice of comparator in a clinical trial is less “flexible” given the need to demonstrate benefit against a standard of care. Still, one would expect that choosing a novel comparator could increase time to market, given the higher “bar” to cross for clinical evidence. Figure 5 illustrates that comparing against novel medicines is associated with decreased time to market, but with a lower success rate (0.8% vs. 3.0%). This difference may be attributable to a greater proportion of novel-comparator programmes focusing on advanced/metastatic (and, further, later-line advanced/metastatic) patient populations relative to non-novel comparator programmes (where we see proportionately more local/regional and first-line advanced/metastatic patient populations).
The final “risk vs. reward” decision we studied is front-loading indication expansion. Interestingly, the vast majority of second indications (following an initial “parent” approval) for a given asset come within the “front-loaded” period described above. Accordingly, we must simulate what “non-front-loaded” programmes would look like. We observe in Figure 6 that second indications initiated in the “front-loaded” period reach market in very similar timeframes as initial approvals (5.7 vs. 5.8 years).
If we further model “front-loaded” programmes shifting into the “non-front-loaded” window, time to market increases by one year (to 6.7 years, making front-loading 15% faster). While there isn’t a basis of comparison for success rates between “front-loaded” and “non-front-loaded” decisions, programmes successfully achieving multiple approvals are quite rare, with historical programmes achieving a 1.0% success rate.
2. Decisions With Neutral Impact
Here, we explore strategic decisions with a “neutral” impact on both time to market and success rate when compared to their alternatives. Given their prevalence in oncology, insights around these decisions can help shape realistic expectations regarding time to market and the likelihood of success for programmes implementing these decisions.
Focusing first on the study of patient subpopulations, we observe a significant shift over the past decade. Historically, the majority of FDA oncology approvals were for all-comers populations (see Figure 7). However, in the past 10 years, approvals for patient subpopulations have risen sharply, nearly matching those for all-comers. This trend highlights a paradigm shift toward precision medicine.
Figure 4: Time to market comparison between regular and accelerated approvals
Figure 5: Time to market comparison between novel and non-novel comparators
Figure 6: Time to market comparison for indication expansion approaches
Testing medicines in subpopulation indications has historically shown a neutral to slightly slower time to market (see Figure 8A). But in the past six years, subpopulation approvals appear demonstrably slower to market (see Figure 8B), reflecting both increased relative investment in subpopulation indication trials and associated advances in treating subpopulations (e.g., second- and third-generation targeted inhibitors in non-small-cell lung cancer). The subpopulation approach also carries some risk, with a historical success rate of 1.4% (reflecting higher unmet needs and patient types that are more difficult to treat), relative to 2.4% for all-comers programmes.
Surrogate endpoints have been used extensively to support accelerated approvals. To explore their role further, we examined the use of surrogate endpoints in regular approvals. As shown in Figure 9, surrogate endpoints are associated with a modest (5%) decrease in time to market for regular approvals, while their impact on success rates remains small (2.7% vs 2.9%).
The potential of surrogate endpoints to accelerate access to new medicines, particularly in indications with extended overall survival, has driven ongoing research to strengthen the
evidence supporting their use. Most recently, the FDA Oncologic Drugs Advisory Committee (ODAC) unanimously voted in favour of authorising Minimal Residual Disease (MRD) testing as a surrogate endpoint in multiple myeloma.
3. Decisions on Aggregation vs. Skipping a Phase
For the final part of our analysis, we examined a pair of decisions with disparate impacts.
Figure 7: Number of approvals for subpopulation vs. all-comers indications over time
Figure 9: Time to market comparison between regular approvals with and without surrogate endpoints
Figure 8a/8b: Time to market comparison between trials testing in subpopulations vs. trials testing in all-comers, 1989–2023
1. Aggregating trial phases, while common, appears less successful than running distinct trial phases
2. The common decision to skip a clinical trial phase appears relatively successful.
We expected that aggregating trial phases in a clinical programme would reduce time to market, due to the lowered operational requirements of running a single trial instead of two. This acceleration is particularly evident for Phase 1 and 2 trials, where aggregated Phase 1 and 2 trials are, on average, 23% faster compared to a sequential Phase 1 to Phase 2 approach (5.0 vs. 6.5 years).
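As a quick check on the relative figures quoted throughout this analysis, the time saving of a faster approach is simply the difference in duration divided by the slower path's duration, as the minimal sketch below illustrates using the aggregated versus sequential Phase 1/2 figures above.

def relative_saving(faster_years: float, slower_years: float) -> float:
    """Fraction of time saved relative to the slower approach."""
    return (slower_years - faster_years) / slower_years

# Aggregated Phase 1/2 vs. sequential Phase 1 then Phase 2 (years, as quoted above).
print(f"{relative_saving(5.0, 6.5):.0%}")  # prints 23%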
However, Figure 10 shows that across the entire drug development timeline, aggregating trial phases is associated with a slightly slower time to market relative to following the complete phase path, and also results in a lower success rate (1.3% vs 4.4%). Our interpretation is that the initial acceleration of time to market for programmes with aggregated trial phases is ultimately offset by factors such as increased FDA scrutiny or coordination requirements, more complex statistical considerations, the need to reassess dosing regimens, or undisclosed decisions by programme sponsors to delay certain stages. Despite the limited overall benefit, this strategic approach remains a common practice in oncology clinical development.
Similarly, we expected that skipping an entire phase of a clinical programme would lead to a significant decrease in time to market, which is confirmed in Figure 11. This approach additionally increases the success rate (4.5% vs 1.2%) compared to not skipping trial phases.
This strategic decision is understandably the most common among those we analysed, and its most common form is to skip a dedicated Phase 1 trial for a specific tumour type and proceed directly into Phase 2, likely leveraging knowledge of potential efficacy and optimal dosing from other studies and/or approved indications. This specific approach, and the associated de-risking of clinical programmes, may also partly explain the observed benefits to success rates.
Putting Our Findings into Perspective
The oncology sector and the patients it serves continue to face a significant unmet need for new treatments, while biopharma
companies are under pressure to deliver returns from growing R&D investments. Setting aside competitive dynamics, each month a programme can accelerate to market adds an estimated $5–6M of net present value per billion dollars of peak year sales. Competitive dynamics, such as reaching the market before an in-class competitor, could amplify this impact. This makes strategic decisions like those described above financially impactful, especially for potential blockbusters and mega-blockbusters.
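To see how a figure of that order can arise, the back-of-envelope sketch below values one additional month of peak-year sales for a $1B product and discounts it back to the decision point. The margin, discount rate, and time horizon are illustrative assumptions chosen for exposition, not the inputs of this analysis, but they land in a similar range.

# Back-of-envelope NPV of one month of acceleration for a $1B peak-year product.
# All parameters below are illustrative assumptions, not the study's model inputs.
peak_year_sales_m = 1000.0   # $M of peak annual sales
net_margin = 0.30            # assumed net contribution margin
discount_rate = 0.11         # assumed annual discount rate
years_to_peak = 15           # assumed years from decision point to peak sales

extra_month_profit_m = (peak_year_sales_m / 12) * net_margin
npv_per_month_m = extra_month_profit_m / (1 + discount_rate) ** years_to_peak
print(f"~${npv_per_month_m:.1f}M per month of acceleration")  # roughly $5M under these assumptions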
We have found that while accelerated approval can be useful in accelerating time to market, it is not a panacea, as it results in a lower success rate compared to regular approvals. Moreover, as the benefits of accelerated approval in reducing time to market have diminished over time, and scrutiny around confirmatory trials intensifies, sponsors will need to carefully select programmes for accelerated approval. Sponsors should focus on high unmet need indications and/or treatments that offer clinical benefit that far surpasses the standard of care.
Understanding the Financial Impact
We have also learned that while novel comparators may add risk to a programme, they don’t appear to impact time to market significantly (in aggregate and at present). However, as programmes in local/regional or early advanced/metastatic settings start to test more against novel comparators, we may start to see time to market align more closely with programmes using non-novel comparators. Since choice of comparator is often more constrained by clinical guidelines and standard of care, these findings can help set expectations for such comparisons when they are required.
For medicines with multi-indication potential, we find that front-loading indication expansion is associated with reduced time to market. While this decision does carry some risk to the success rate, the risk is comparable to other strategic decisions in this analysis. As such, it represents a promising strategy for medicines targeting multiple tumour types.
Analysing approaches with a relatively neutral impact on both time to market and success rate, we find first that testing a medicine in patient subpopulations should be guided by the medicine’s specificity (e.g., to a mutation, cell-surface antigen or risk signature) rather than being applied as an overarching strategy.
Figure 10: Time to market comparison for innovative trial designs (aggregating trial phases) within a programme, regular approvals
Figure 11: Time to market comparison for innovative trial designs (skipping trial phases) within a programme, regular approvals
The Role of Surrogate Endpoints
For surrogate endpoints, while we find that there is a neutral impact in aggregate for regular approvals, there remains significant potential to reduce time to market for programmes pursuing early-stage cancers and/or chronic malignancies like multiple myeloma or chronic lymphocytic leukaemia. Considering their relatively neutral impact in aggregate, these approaches should be reserved for situations where they make strategic sense, based on the nature of the product and target indication.
Skipping vs. Aggregating Trial Phases: Which Works Best?
Aggregating or skipping trial phases are both common strategic decisions. Skipping trial phases, however, is a safer and more effective means of reducing time to market than aggregating clinical trial phases. It also appears to be an “open secret” within the industry, as evidenced by being the most common strategic decision in this analysis.
Moreover, since Phase 1 trials dedicated to specific tumour types are most frequently skipped, this decision is likely most suitable for programmes that have already undergone more advanced development in other indications, where clinical activity and dosing have already been established. While this approach appears promising, it should be noted that the FDA imposes higher requirements for approval of this and other adaptive design approaches, which are more easily implementable for subsequent indications where clinical activity is better established. Additionally, the FDA’s Project Optimus programme may complicate this strategy’s use by requiring more robust dosing data for specific tumour types, which decreases the utility of Phase 1 basket trials. On the other hand, aggregating trials shows some benefit in accelerating early-phase clinical programme development. These aggregations must be planned both with the FDA and within the context of the entire clinical programme to ensure that time savings realised in early phases of development lead to overall time to market reduction.
Final Thoughts: Navigating the Trade-offs
Even though no individual strategic decision can guarantee success on its own, shaving even a few months off a programme's time to market can be highly valuable. Our analysis highlights the
need and opportunity for sponsors to consider these strategic decisions and their potential impact, both before initiating and during oncology clinical programmes, in order to chart the best course to market.
Andreas Dimakakos PhD, MBA, PMP, Director of Scientific Insights, Intelligencia AI
Evgenia Kyriakidou
Business Insights Associate, Intelligencia AI
Contributors:
Joshua Hattem, Principal, R&D Advisory Lead, ZS
Panos Karelis, VP of Commercial, Intelligencia AI
Modelling the Bone Marrow Niche In Vitro: A Roadmap for Drug Development
The bone marrow niche provides a tightly controlled setting where cellular interactions, biochemical signalling, and mechanical stimuli collectively shape the fate of hematopoietic stem cells (HSCs). Given its role in haematopoiesis, understanding and accurately replicating this niche is crucial for advancing stem cell therapies and developing effective treatments for haematological diseases, such as leukaemia, aplastic anaemia, and myelodysplastic syndromes.
Cutting-edge in vitro models hold great promise for drug development by offering a more predictive platform for studying HSC biology, disease mechanisms, and therapeutic responses. They can serve as valuable tools for testing novel drugs, screening potential treatments for disorders associated with bone marrow, and optimising stem cell-based therapies for transplantation and regenerative medicine.
Challenges in Bone Marrow Niche Modelling
Recreating the bone marrow niche in vitro presents significant challenges. The niche consists of a mix of cellular components, including mesenchymal stromal cells, osteoblasts, endothelial cells, and immune cells, all embedded within a structurally complex extracellular matrix (ECM). Additionally, biochemical gradients, mechanical forces, and fluid dynamics contribute to the functional integrity of the niche, making it difficult to fully replicate in traditional in vitro systems.
The bone marrow niche also undergoes changes in response to ageing, disease, and therapeutic interventions. Effective models must capture these dynamic processes to be relevant for drug testing and development. These complexities require innovative bioengineering strategies to create physiologically relevant models that can accurately mimic the in vivo bone marrow environment.
Modelling Approaches of the Bone Marrow Niche
Recent advancements in tissue engineering, biomaterials, and microfluidic technologies have led to the development of sophisticated bone marrow niche in vitro models that more closely resemble physiological conditions.
• Scaffold-based 3D cell cultures provide structural support and facilitate cell-cell interactions in a three-dimensional environment, enhancing cellular functionality and response.
• Microfluidic systems and organ-on-a-chip platforms offer precise control over fluid flow, nutrient exchange, and biochemical gradients, enabling researchers to study niche dynamics in real-time.
• Co-culture models, which integrate multiple niche cell types, allow for the study of critical interactions that influence HSC behaviour and disease progression.
• Other emerging advancements in bioprinting, self-assembling organoids, and dynamic bioreactors are further expanding the repertoire of bone marrow niche models, improving their predictive power in drug development.
By integrating advanced biomaterials, microfluidic platforms, and co-culture systems, researchers have been able to create increasingly accurate representations of the bone marrow niche that offer greater biological relevance compared to traditional two-dimensional cultures. By providing a more physiologically relevant environment for drug testing, these models have the potential to accelerate the development of safer and more effective therapies while also reducing the reliance on animal models.
As research in this field continues to evolve, these advanced models are expected to revolutionise the way we study the
Figure 1: 3D In Vitro Bone Marrow Model (Image provided by Nikon BioImaging Center)
Figure 2: Enhanced Pre-Vascular Branch Thickening via Fibronectin Over 12 Days. Blue, Nuclei; Red, Actin/Cytoskeleton; Green, Fibronectin.
bone marrow niche, accelerating the discovery of innovative treatments for haematological diseases and improving patient outcomes.
Applications of Bone Marrow Niche Models in Drug Development
Beyond fundamental research, these engineered bone marrow niche models have significant translational applications.
Understanding HSC Regulation
In vitro models provide a platform to study the molecular mechanisms that govern HSC fate decisions, including self-renewal, differentiation, and quiescence. This knowledge is crucial for developing targeted therapies for haematological disorders, enhancing regenerative medicine approaches, and improving stem cell transplantation outcomes.
Studying Haematological Diseases
Diseases such as leukaemia, aplastic anaemia, and myelodysplastic syndromes significantly alter the structure and function of the bone marrow niche, disrupting haematopoiesis and impairing normal stem cell behaviour. In vitro models provide a valuable tool for researchers to dissect the underlying disease mechanisms, including aberrant signalling pathways, genetic mutations, and microenvironmental changes.
Developing Stem Cell Therapies
Bone marrow niche models play a critical role in advancing stem cell-based therapies by providing a controlled environment to study the complex interactions between HSCs and their microenvironment. By mimicking physiological conditions, these models contribute to the development of safer and more effective treatments for haematological disorders, enhancing the success rates of stem cell therapies and bone marrow transplants.
Drug Development
By providing a physiologically relevant environment that closely mimics the bone marrow niche, these models enhance the accuracy and efficiency of drug screening processes. They allow researchers to evaluate the effects of potential therapeutics on HSCs and their supportive microenvironment under controlled conditions, enabling more precise assessments of drug efficacy, toxicity, and off-target effects.
Conclusion
The evolution of bone marrow niche in vitro models continues to drive advancements in drug development, providing biopharma companies with powerful tools to study stem cell dynamics, haematological disorders, and therapeutic interventions. Scaffold-based 3D cell cultures, microfluidic systems, and co-culture approaches are at the forefront of this innovation, offering more reliable and predictive platforms for testing next-generation therapies.
As the field continues to evolve, bioengineered bone marrow niche models are expected to play an increasingly pivotal role in both research and clinical applications. By bridging the gap between in vitro studies and in vivo physiology, these models hold the promise of revolutionising our understanding of the bone marrow microenvironment, ultimately leading to improved treatments for haematological disorders and enhanced regenerative medicine strategies.
Talita Stessuk
Talita Stessuk, PhD, works as a scientist in the Ex Vivo Patient Tissue Platform, at Crown Bioscience. She holds a PhD in Biotechnology from the University of São Paulo (Brazil) and served as a postdoctoral researcher at Radboud UMC and TUe (The Netherlands). She has a strong foundation in regenerative medicine and tissue regeneration. With an extensive background in pre-clinical and clinical research, Talita has experience in the application of mesenchymal stromal cells (MSCs) for the regeneration of different tissues, including bone tissue engineering. At Crown Bioscience, Talita spearheaded the development of the 3D Bone Marrow Niche (BMN) platform, advancing research in haematological malignancies. She is an expert in advanced 3D culture systems, high throughput screening, high content image analysis, 3D immunofluorescence, and multi-colour flow cytometry. Passionate about pushing the boundaries of scientific innovation, Talita is committed to reproducing the tumour microenvironment in order to optimise the pre-clinical screening of bone marrow cancers.
Modelling Retinal Safety: The Era of Predictive In Vitro Toxicology Testing
Bringing a novel therapeutic to market is a complex and costly journey, and toxicity remains a leading reason for failure along the way. Despite remarkable advances in drug discovery and screening technologies, unanticipated toxic effects are still responsible for high attrition rates during both preclinical and clinical development, draining time, resources, and opportunity. To remain competitive, drug developers increasingly require tools that can predict tissue-specific toxicity earlier in the pipeline, particularly in complex and sensitive organs like the retina.
Cytotoxicity, the potential of a compound to induce cell damage or death, is a central parameter in assessing drug safety. Whether optimising the therapeutic index of a CNS active compound, evaluating delivery vectors for gene therapy, or assessing biocompatibility of new materials, cytotoxicity data is crucial. Moreover, understanding the mechanism of how cells die, whether through apoptosis, necrosis, or other regulated forms of cell death, such as ferroptosis, is equally important. These distinct mechanisms have varied implications for tissue integrity and long-term clinical outcomes. With increasing demand for more predictive and mechanistic toxicology models, researchers are shifting toward high-content, human-relevant systems that provide earlier, more detailed insight into compound behaviour.
The retina, as one of the most metabolically active and structurally complex tissues in the human body, is especially vulnerable to off-target toxic effects. Both systemic drugs and locally administered ocular therapeutics can affect the neurosensory retina or the retinal pigment epithelium (RPE), leading to irreversible vision damage. Historically, retinal safety evaluation has relied heavily on animal models or human retinal explants. However, these approaches present significant limitations. Rodents, for example, lack a macula, the central region responsible for high-acuity vision, while human explants are variable, low-throughput, and difficult to source. These limitations, coupled with ethical and cost concerns, are driving the industry toward predictive and scalable in vitro models.
Regulatory bodies and funding agencies are aligning with the direction of leveraging data from in vitro models to predict clinical outcomes. The National Institutes of Health (NIH) has launched initiatives to reduce reliance on animal models and promote the development of advanced human-based systems such as organoids and tissue chips. Simultaneously, the Food and Drug Administration (FDA) is encouraging adoption of in vitro models that better predict human responses. These policies are accelerating the development and adoption of next-generation in vitro systems that combine biological relevance, reproducibility, and scalability.
Human Relevant Predictive Retinal Models
Among the most promising of these technologies are human
induced pluripotent stem cell (iPSC)-derived 3D retinal organoids (ROs) and RPE monolayers. These models closely recapitulate the structural and functional characteristics of the human retina. By day 150 of differentiation, ROs contain all major retinal cell types, including photoreceptors. Similarly, iPSC-derived RPE monolayers exhibit hallmark features such as pigmentation, cellular polarity, and phagocytic activity, and form a tight epithelial barrier which can be quantitatively assessed via trans-epithelial electrical resistance (TEER). At Newcells Biotech, both ROs and RPEs are derived from the same donor iPSC line, enabling isogenic, parallel assessments across different retinal compartments. This integrated approach significantly enhances the translational relevance and depth of toxicity data, offering a window into tissue-specific responses that animal models cannot always match.
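TEER itself is a simple, well-defined measurement: the blank-corrected electrical resistance multiplied by the effective membrane area, as the minimal sketch below shows. The example values are arbitrary, and acceptance thresholds vary by protocol.

def teer_ohm_cm2(measured_ohm: float, blank_ohm: float, area_cm2: float) -> float:
    """Unit-area trans-epithelial electrical resistance: (R_sample - R_blank) * area."""
    return (measured_ohm - blank_ohm) * area_cm2

# Example values are arbitrary; confluent RPE monolayers are commonly reported in the
# hundreds of ohm*cm2, but acceptance criteria vary by protocol and cell line.
print(teer_ohm_cm2(measured_ohm=1450.0, blank_ohm=120.0, area_cm2=0.33))  # ~438.9 ohm*cm2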
These platforms have already been successfully adopted for cytotoxicity and efficacy screening across a range of drug modalities. At Newcells Biotech, we have developed and validated a range of readouts to evaluate both overall cell health and the underlying mechanisms of toxicity. Our methodologies support the assessment of metabolic activity, membrane disruption, and cell death pathways in both our RO and RPE models. In the RPE model specifically, additional assays provide valuable insights into tissue functionality, including tight junction integrity, pigmentation levels, and other physiologically relevant changes. To further advance mechanistic understanding of cytotoxicity, we also offer high-resolution single-cell readouts for both RO and RPE models, allowing us to accurately distinguish between viable, apoptotic, and necrotic cells at the single-cell level. This powerful approach has been validated in both organoids and RPE monolayers, enabling detailed analysis of cell health and death pathways within complex, heterogeneous populations.
Accurate Assessment of AAV Transduction Efficiency and Safety in Retinal Organoids
Differentiating between modes of cell death is especially critical in the retina, where apoptosis and necrosis can have distinct implications for tissue integrity, long-term function, and therapeutic reversibility. An example of this approach in action is a recent project aimed at evaluating both the transduction efficiency and safety profile of a novel AAV gene therapy vector. In this study, wild-type human retinal organoids were transduced with two doses of the client’s AAV vector alongside one of Newcells’ validated control vectors. ROs at three differentiation stages (day 120, 150, and 180) were included to evaluate transduction efficiency across different developmental timepoints. Over a 28-day post-transduction period, brightfield and live-cell fluorescence imaging were performed weekly to monitor fluorescent reporter expression and morphological integrity.
Representative images from day 150 ROs transduced with Newcells’ control vector are shown in Figure 1. Results demonstrated a time- and dose-dependent increase in GFP expression across all treated groups, while untransduced control organoids showed no detectable signal (Figure 1A, B). At endpoint, immunofluorescent staining of RO sections revealed localised transgene expression within the photoreceptor layer, including both rods and cones, indicating effective and cell-type-specific transduction at the tested doses and durations (Figure 1C).
Cytotoxicity and Cell Death Mode Evaluation in Retinal Organoids
Given the importance of safety assessment in gene therapy
development, cytotoxicity evaluations were conducted in parallel to determine the impact of AAV treatment on cell health. A multiparametric approach was used to assess both metabolic activity and membrane integrity, delivering a robust, complementary picture of cell viability and treatment-related toxicity. Differences in membrane integrity, as measured by Annexin V and 7AAD staining, were detected between untransduced and AAV-treated groups, prompting further mechanistic investigation.
To explore cell death pathways, organoids were dissociated and analysed at the single-cell level to distinguish viable,
Figure 2: (A) Representative gating strategy using unstained and positive control samples to establish boundaries for Annexin V and 7-AAD staining. (B) Representative flow cytometry scatter plots of 7-AAD (y-axis) vs. Annexin V (x-axis) in dissociated ROs after 28 days of treatment. Shown are dot plots from three conditions: Untransduced, AAV2.7m8 CAG-GFP-treated, and a positive control for apoptosis. (C) Quantitative analysis of cell populations based on Annexin V and 7-AAD staining. Bar graph indicates the percentages of viable (Annexin V⁻/7-AAD⁻), early apoptotic (Annexin V⁺/7-AAD⁻), and late apoptotic (Annexin V⁺/7-AAD⁺) cells. (n=1)
Figure 1: A) Live cell imaging of ROs transduced with AAV2.7m8 CAG-GFP compared to untransduced control ROs, up to 28 days post-transduction. Each RO was treated with 1E09 viral genomes (vg). Scale bar = 600 µm. B) Quantification of mean eGFP fluorescence intensity per RO over 28 days post-transduction. Data are presented as mean ± SEM. Each data point represents the average fluorescence from six organoid images. C) Immunofluorescent images of RO sections 28 days post-transduction, showing co-localisation of GFP (green) with the pan-photoreceptor marker Recoverin (RCVRN, red). White arrows indicate regions of co-localisation. Scale bar = 50 µm.
apoptotic, and necrotic populations. Across all AAV-treated RO ages, brightfield imaging revealed no visible signs of toxicity or reduction in viability over the 28-day period. Flow cytometry analysis further confirmed that AAV-treated and untransduced ROs exhibited comparable levels of early and late apoptosis, suggesting that the viral vector had no measurable cytotoxic effect under the tested conditions. In contrast, organoids treated with a cytotoxicity positive control showed a distinct population shift in Annexin V and 7-AAD staining, validating the assay’s sensitivity and utility for detecting apoptotic responses (Figure 2).
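The population percentages reported from this kind of gating reduce to straightforward boolean logic on the two markers. The sketch below illustrates the idea; the intensity values and gate thresholds are hypothetical and are not those used in this study.

import numpy as np

# Hypothetical per-cell fluorescence intensities and gating thresholds.
annexin_v = np.array([120, 850, 900, 95, 1100, 80])
aad_7 = np.array([60, 70, 950, 55, 1020, 65])
annexin_gate, aad_gate = 400, 400

viable = (annexin_v < annexin_gate) & (aad_7 < aad_gate)
early_apoptotic = (annexin_v >= annexin_gate) & (aad_7 < aad_gate)
late_apoptotic = (annexin_v >= annexin_gate) & (aad_7 >= aad_gate)

print(f"viable {viable.mean():.0%}, early apoptotic {early_apoptotic.mean():.0%}, "
      f"late apoptotic {late_apoptotic.mean():.0%}")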
Together, these findings highlight the safety of the tested AAV vector while underscoring the value of integrating iPSC-derived ROs, validated transduction protocols, and high content analytical readouts into a unified workflow. The resulting dataset offered clear, quantitative insights into both AAV uptake and safety, supporting more informed, lower-risk decisions in therapeutic development.
Future Developments
Looking ahead, Newcells Biotech continues to support innovation in drug discovery through its cutting-edge iPSC-derived RO and
RPE technologies. Through the development of high content, human-relevant models, we are empowering our partners to gain earlier and more predictive insights into compound safety, accelerating the journey toward safer, more effective therapeutics. As we expand our assay portfolio and increase the physiological complexity of our models, Newcells is committed to shaping the future of predictive toxicology and redefining how retinal safety is evaluated in preclinical development.
Dr. Maria Georgiou
Dr. Maria Georgiou is a Senior Scientist at Newcells Biotech Ltd, specialising in stem cell research and retinal biology. She holds a PhD from Newcastle University and brings over eight years of experience in developing in vitro models to support drug discovery. Dr. Georgiou has expertise in iPSC-derived retinal models, contributing to advancements in assay development, gene therapies and disease modelling, helping to enhance the translational relevance of preclinical research.
Challenging Targets in Antibody Discovery Technology
Antibodies represent a rapidly growing therapeutic modality, demonstrating clinical success over a wide variety of diseases, including cancer, autoimmune disorders, and infections. Their ability to bind with high specificity and their inherent stability enable a targeted therapeutic approach, minimising off-target effects and reducing drug-drug interactions.1 Recent advances in technologies for the development of antibodies are further driving this growth, accelerating the discovery process through the rapid generation and screening of antibody candidates.
Phage Display Technology for the Identification and Development of Therapeutic Antibodies
Phage display technology has played a crucial role in the discovery and optimisation of antibodies for a wide range of clinical and research applications, with the greatest impact seen in the development of antibody-based drugs. This well-established approach enables the identification of fully human therapeutic monoclonal antibodies (mAbs) from a diverse collection of antibody fragments presented on the surface of bacteriophages, known as a phage display library.2
The process for the selection of antibodies in phage display is known as “biopanning”, where immobilised target antigens
bind phages displaying complementary antibodies. Unbound phages are then removed through rigorous washing steps, while surface-bound phages are recovered and amplified (Figure 1). High-affinity phage clones are enriched through repeated rounds of selection and amplification, and resulting antibody fragments are isolated, characterised, and expressed as recombinant proteins.
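The enrichment logic of biopanning can be illustrated with a toy calculation: in each round, binders are retained at a far higher rate than non-binders, and the recovered pool is re-amplified. The retention probabilities below are arbitrary assumptions chosen only to show how a rare high-affinity clone can come to dominate the library within a few rounds.

# Toy biopanning model: binder vs. non-binder fractions over selection rounds.
binder_fraction = 1e-6        # assumed starting abundance of high-affinity clones
binder_retention = 0.30       # assumed probability a binder survives washing
background_retention = 1e-4   # assumed carry-over rate of non-binders

for round_no in range(1, 5):
    binders = binder_fraction * binder_retention
    background = (1 - binder_fraction) * background_retention
    binder_fraction = binders / (binders + background)  # composition of the re-amplified pool
    print(f"round {round_no}: binders make up {binder_fraction:.2%} of the library")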
Challenging Targets: Transmembrane Proteins
There are currently over 200 therapeutic antibodies approved or under review in the USA or EU, with more than half of these targeting membrane proteins. However, only three of the antibodies target complex membrane proteins, highlighting a huge opportunity for novel antibody therapeutics targeting complex proteins.
Transmembrane (TMEM) proteins are embedded in the lipid bilayer and are capable of interacting with both the intracellular and extracellular environments, facilitating transport of molecules across the membrane, transmitting signals from the extracellular environment to the cell interior, or participating in cellular processes. TMEM proteins make up only ~20% of the human proteome, but are the target of ~50% of small-molecule drugs on the market due to their critical roles in cell-to-cell communication and drug delivery and absorption.3,4 It has also been demonstrated that mutations in TMEM proteins can alter their folding or assembly and are
Figure 1: Phage display selection by biopanning. Iterative rounds of biopanning enrich high-affinity phage clones, screening phage display libraries to identify antibody candidates with desirable properties. Figure recreated from Alfaleh et al. 2020 under a Creative Commons Attribution License (CC BY 4).
contributors to diseases, including cancers and cardiovascular disease, making them a constant focus for biological research.5,6 However, despite the central role of TMEM proteins in cellular processes, structural information is still limited, leading to challenges associated with their production. When separated from the surrounding lipid bilayer, these proteins exhibit unstable conformations, becoming insoluble and displaying a loss of activity. Additionally, the position of TMEM proteins in the membrane can occlude binding sites, complicating the process of generating and evaluating candidate antibodies. These characteristics, combined with low protein expression and poor immunogenicity, have hindered the study of such proteins and thus limited the development of effective targeted therapeutics. G protein-coupled receptors (GPCRs) form the largest family of TMEM proteins, with more than 800 unique proteins, and, consistent with other TMEM proteins, represent a challenging target due to their complex structure with seven transmembrane domains, their limited exposed epitopes, their position within the lipid bilayer, and their multiple active conformations.
CXCR4, a transmembrane GPCR, along with its canonical ligand, CXCL12, is one of the most common chemokine receptors associated with the progression of over 23 types of cancer, including breast, ovarian, and prostate cancer.7 Over recent years, many CXCR4 antagonists have been evaluated through clinical trials, but Plerixafor, a small molecule drug for multiple myeloma and non-Hodgkin lymphoma, is currently the only approved CXCR4-targeting therapeutic on the market.8 The therapeutic antibody ulocuplumab is an anti-CXCR4 monoclonal antibody which blocks the interaction between CXCR4 and CXCL12 to prevent tumour progression, and initially showed promising results in the treatment of cancers.9
Stabilisation of Transmembrane Proteins Using Nanomembrane Technology
Traditional methods for the stabilisation of TMEM proteins frequently fail to fully preserve the native protein structure and, therefore, offer limited structural information. Detergent-based screening is the conventional method for solubilising membrane proteins, but there are challenges associated with choosing the optimal detergent for the target protein to ensure protein stability. This approach allows for characterisation of TMEM proteins in an aqueous environment but is expensive and time-consuming.10
Termini restraining offers an alternative approach to protein stabilisation where the two termini of membrane proteins are restrained using a self-assembling protein coupler, which retains the protein function, but is limited to small membrane proteins with even numbers of transmembrane helices.11
To overcome these challenges, a combined approach of a nanomembrane platform technology with a custom therapeutic antibody discovery platform has been designed to facilitate the development of antibody therapeutics for difficult targets such as membrane proteins.
A nanomembrane platform technology can work to stabilise the TMEM proteins, holding them in their native conformations, allowing for direct reconstitution of membrane proteins from crude cell membranes into stable, lipid-containing nanoparticles (Figure 2). This process is facilitated by a “flexible” scaffold protein, which self-assembles into disc-like nanoparticles and allows for reconstitution of any TMEM protein, regardless of size or shape.12 Use of such nanoparticles eliminates the need for protein engineering and maintains the structural and functional integrity of TMEM proteins within a native lipid environment.
The second stage of the process could harness the capabilities of a human antibody phage display library to identify lead antibody candidates. This platform has built-in SpyTag technology to simplify and accelerate the process, including the selection step that uses a proprietary SpyDisplay selection system to covalently display Fabs on filamentous phage (Figure 3). The platform is also capable of generating and screening bispecific antibodies using SpyLock technology, allowing researchers to identify effective combinations of antibodies to take forward to clinical development.13,14
Novel Antibodies to Target CXCR4
The combined nanomembrane platform technology and antibody discovery platform could be deployed to identify multiple novel human CXCR4-targeting antibodies by using purified stabilised CXCR4 nanodiscs as the antigen. Data presented at PEGS Boston 2025 showed that the 48 identified antibodies demonstrated comparable or superior affinity to ulocuplumab, which is currently in clinical trials. These antibodies have also demonstrated their ability to bind to endogenously expressed CXCR4 on Jurkat cells (an immortalised line of human T lymphocytes). Four of
Figure 2: Nanomembrane platform technology stabilises transmembrane proteins, allowing for direct reconstitution of proteins from crude cell membranes.
them were further described with favourable developability profiles. The antibodies also deactivated adenylate cyclase in a concentration-dependent manner, showing an effective block of CXCR4 signalling. Additionally, these antibodies also inhibit CXCL12 mediated cell migration in tumour cells and could therefore be more effective at halting cancer growth, requiring lower therapeutic doses, resulting in fewer side effects and ultimately improving patient outcomes.15 Results from this study represent the potential for the combined platform technologies to identify and characterise novel antibody candidates that show comparable or superior performance to other clinically developed therapeutic antibodies.
The combined platforms also hold promise in characterising newly discovered antibodies. Traditional techniques such as surface plasmon resonance and bio-layer interferometry can be limited by the instability of TMEMs. Use of a nanomembrane stabilisation technology and purified proteins can facilitate rapid initial in vitro characterisation and accurate assessment of antibody binding kinetics and affinities, further accelerating the development of therapeutic antibodies.
Tackling Challenging Targets to Advance Antibody Therapeutics
Antibody-based therapeutics offer a highly specific approach for diseases such as cancer, but advancements in the space are reliant on working with target proteins in their native conformation. Development of a nanomembrane platform technology, used in combination with a highly diverse phage display antibody discovery platform, is facilitating a deeper understanding of complex and unstable proteins, opening new avenues for the discovery of novel antibodies against TMEM proteins, and enabling researchers to unlock the full potential of these crucial, but challenging, therapeutic targets.
REFERENCES
1. Castelli M S et al, The pharmacology and therapeutic applications of monoclonal antibodies. Pharmacology Research and Perspectives, 7(6), e00535, 2019, https://doi.org/10.1002/prp2.535
2. Hentrich C et al, Monoclonal Antibody Generation by Phage Display. In Elsevier eBooks (pp. 47–80), 2018, https://doi.org/10.1016/ b978-0-12-811762-0.00003-7
3. Overington J P et al, How many drug targets are there? Nature Reviews Drug Discovery, 5(12) 993–6 2006: https://doi.org/10.1038/nrd2199
4. Gromiha M M et al, Bioinformatics approaches for functional annotation of membrane proteins. Briefings in Bioinformatics
5. Errasti-Murugarren E et al, Membrane Protein Stabilization Strategies for Structural and Functional Studies, Membranes (Basel) 11(2), 155, 2021, https://doi.org/10.3390/membranes11020155
6. Zaucha Jan et al, Mutations in transmembrane proteins: diseases, evolutionary insights, prediction and comparison with globular proteins, Briefings in bioinformatics, 22(3), bbaa132, 2020, https:// doi.org/10.1093/bib/bbaa132
7. Sun, Xueqing et al, CXCL12/CXCR4/CXCR7 chemokine axis and cancer progression, Cancer Metastasis Reviews, 29(4), 709-22, 2010, https://doi.org/10.1007/s10555-010-9256-x
9. Kashyap M K et al, Ulocuplumab (BMS-936564 / MDX1338): a fully human anti-CXCR4 antibody induces cell death in chronic lymphocytic leukemia mediated through a reactive oxygen species-dependent pathway, Oncotarget 7(3) 2809-22, 2016 https://doi.org/10.18632/ oncotarget.6465
10. Ratkeviciute G et al, Methods for the solubilisation of membrane proteins: the miscellaneous world of membrane protein solubilisation, 49(4), 1763-1777, 2021, https://doi.org/10.1042/BST20210181
11. Liu S et al, Termini restraining of small membrane proteins enables structure determination at near-atomic resolution, 6(51), 2020, https://doi.org/10.1126/sciadv.abe3717
12. Engineering biologics towards challenging membrane protein targets: https://www.genengnews.com/topics/drug-discovery/ engineering-biologics-toward-challenging-membrane-proteintargets/
13. Kellmann S J et al, SpyDisplay: a versatile phage display selection system using SpyTag/SpyCatcher technology, mAbs 15(1), 2022, https://doi.org/10.1080/19420862.2023.2177978
14. Hentrich C et al, Engineered Reversible Inhibition of SpyCatcher Reactivity Enables Rapid Generation of Bispecific Antibodies. Nature communications, 15(1), 5939, 2024, https://doi.org/10.1038/ s41467-024-50296-y
15. Kellmann S J et al, Rapid lead generation with the Pioneer Antibody Discovery Platform, PEGS Boston, Boston, 12-15 May 2025, Unpublished conference paper. Bio-Rad Laboratories Inc., Neuried, Germany, 2025
Miao Li
Miao Li is a Market Development Manager for Bio-Rad Laboratories' newly developed Pioneer Antibody Discovery Platform. She has been in the biotechnology industry for six years, focusing on innovative antibody discovery technologies. Miao holds a bachelor’s degree from China Agricultural University and a Ph.D. in biochemistry from Kansas State University.
Figure 3: The Fab is covalently displayed via SpyTag/SpyCatcher ligation on the filamentous phage carrying the SpyCatcher fused to the pIII coat protein.
Building the Foundation for Biopharma 4.0 Through Strategic Digital Transformation Technology
What is Biopharma 4.0?
Biopharma 4.0 refers to the next-generation modernisation approach to biopharmaceutical development, manufacturing, and quality. It integrates advanced digital, automation, and data-driven technologies to maintain or enhance safety, quality, identity, potency, and purity (SQuIPP), while accelerating timelines across the therapeutic lifecycle, including research, development, and production. Inspired by the broader Industry 4.0 movement,1 it applies both digital tools and physical innovations to reshape how therapies are discovered, developed, and delivered.2
This transformation allows biopharma organisations to improve efficiency, reduce costs, and uphold compliance, all while delivering high-quality products to patients more quickly. Adopting the principles of Biopharma 4.0 also enables platformability – the ability to establish standardised manufacturing unit operations and analytical methods to produce a specific therapeutic modality from a specific source to meet established specifications.3
Core Components of Biopharma 4.0 – and The Scientific Relevance
Foundational capabilities such as cloud infrastructure, cybersecurity, data integrity, and compliance are essential to any digital transformation effort. The scientific value of Biopharma 4.0 lies in how it builds upon this foundation to enable a connected, data-driven environment that enhances experimental design, accelerates development cycles, and improves process control. By integrating advanced digital tools directly into discovery, development, manufacturing and quality operations, scientific teams can achieve deeper process understanding, increased reproducibility, and real-time adaptability.
Once an effective digital transformation strategy is deployed, several core components representing the most impactful Biopharma 4.0 technologies can be adopted.
An important tool is artificial intelligence (AI), which can transform large volumes of structured and unstructured data into actionable insights. AI and machine learning algorithms can identify subtle patterns, support and train models to predict clinical outcomes for therapeutic types, determine how to optimise those outcomes, and execute experiments to test them. These capabilities enhance scientific interpretation and reduce the time from observation to invention, improving outcomes for patients from discovery through post-clinical trial regulatory approval.
Complementary to AI is the use of digital twins,4 virtual replicas of a manufacturing process or its respective unit operation components that continuously ingest live and
historical data. By coupling mechanistic or statistical models with real-time process information, digital twins allow scientists and engineers to simulate, monitor, and optimise complex experiments before, or in place of, physical execution. This predictive capability reduces development risk, streamlines tech transfer, and supports more robust process design.
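In its simplest form, a digital twin couples a process model to incoming measurements and corrects its state as data arrive. The toy sketch below illustrates that loop for a hypothetical cell-growth model; the growth rate, correction gain, and sensor readings are illustrative assumptions, not a validated process model.

import math

# Toy digital-twin loop: a mechanistic growth model corrected by live sensor data.
mu = 0.04                     # assumed specific growth rate (1/h)
gain = 0.5                    # how strongly measurements correct the model state
state = 1.0                   # current estimate of viable cell density (1e6 cells/mL)

sensor_readings = [1.05, 1.08, 1.15, 1.18]   # hypothetical hourly measurements
for hour, measured in enumerate(sensor_readings, start=1):
    predicted = state * math.exp(mu * 1.0)              # model prediction one hour ahead
    state = predicted + gain * (measured - predicted)   # blend prediction with live data
    print(f"hour {hour}: predicted {predicted:.3f}, measured {measured:.3f}, "
          f"updated estimate {state:.3f}")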
To enable these data-driven systems, Internet of Things (IoT) networks connect instruments, sensors, and production equipment, allowing continuous, high-resolution data capture. This connectivity supports remote monitoring, real-time visibility into process conditions, and closed-loop control, all of which are critical for responsive and consistent bioprocess operations.
Automation and robotics can replace or augment manual, variable-prone tasks with digitally orchestrated workflows. From executing manufacturing processes to performing analytical methods, automation reduces hands-on time for scientists and engineers while ensuring consistent outcomes that are less prone to human error.
Process analytical technology (PAT) embeds quality control directly into the process. By using inline, real-time analytical methods to monitor critical process parameters (CPPs) that may impact the critical quality attributes (CQAs) of a drug substance or product, biopharma organisations can implement adaptive control strategies based on Quality by Design (QbD) principles. This ensures that quality is maintained proactively rather than tested retrospectively.
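Conceptually, a PAT control strategy is a closed loop in which an inline measurement of a critical process parameter drives an adjustment that keeps quality attributes within their design space. The sketch below shows a deliberately simple proportional correction; the setpoint, gain, and readings are illustrative assumptions rather than a validated control strategy.

# Toy PAT loop: proportional correction of a feed rate based on an inline CPP reading.
glucose_setpoint = 4.0        # assumed target inline glucose concentration (g/L)
feed_rate = 10.0              # current feed rate (mL/h)
gain = 2.0                    # assumed mL/h adjustment per g/L of deviation

inline_readings = [4.6, 4.2, 3.7, 3.9]   # hypothetical inline analyser values
for reading in inline_readings:
    deviation = reading - glucose_setpoint
    feed_rate = max(0.0, feed_rate - gain * deviation)  # feed less when glucose runs high
    print(f"glucose {reading:.1f} g/L -> feed rate adjusted to {feed_rate:.1f} mL/h")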
Finally, continuous manufacturing and process intensification are enabled by these components. By replacing batch-based manufacturing processes with technologies that couple unit operations – such as continuous chromatography or single-pass tangential flow filtration (TFF) – the cost of goods (COGs) for lifesaving therapeutics can be significantly reduced. This supports steady-state manufacturing and more efficient resource utilisation while maintaining regulatory compliance and product quality.
Together, these components form an interconnected scientific framework that enables greater agility, precision, and insight, accelerating innovation while maintaining rigorous process control.
Lab Informatics at the Core of Digital Transformation
Effective data management strategies are critical at every stage of therapeutic discovery, development, production, and quality. Research illustrates that next-generation technology stacks, centred around lab informatics, are essential to establishing a competent digital transformation framework.5
Modern lab informatics plays a central role as the digital backbone for many organisations. Laboratory Information
Management Systems (LIMS), Electronic Lab Notebooks (ELNs), and Scientific Data Management Systems (SDMS) each have distinct, essential functions within this framework.
LIMS help coordinate lab workflows, manage sample tracking, and maintain consistency across instruments and sites. ELNs provide a structured way to record protocols, observations, and decisions, making experimental details easier to find and share. SDMS ensure that raw data from instruments is stored securely, captured in context, and clearly linked to the experiments they support.
Individually, these systems offer incremental benefits. When connected, they form a cohesive digital backbone that eliminates redundancy, streamlines documentation, and improves the speed and clarity of decision-making. Data moves cleanly from experiment to review, and historical data becomes more accessible due to the reduced data silos these tools help eliminate.
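One way to picture this connected backbone is as linked records: a LIMS sample entry, an ELN experiment, and SDMS-managed raw data files that reference one another so results can be traced end to end. The minimal data model below is a hypothetical, vendor-neutral illustration, not any particular system's schema.

from dataclasses import dataclass, field
from typing import List

@dataclass
class RawDataFile:            # managed by an SDMS
    path: str
    instrument: str

@dataclass
class Experiment:             # recorded in an ELN
    eln_entry_id: str
    protocol: str
    raw_files: List[RawDataFile] = field(default_factory=list)

@dataclass
class Sample:                 # tracked in a LIMS
    sample_id: str
    material: str
    experiments: List[Experiment] = field(default_factory=list)

sample = Sample("S-0001", "mAb drug substance, lot 23")
exp = Experiment("ELN-0042", "SEC purity assay")
exp.raw_files.append(RawDataFile("/sdms/2025/S-0001/sec_run1.raw", "HPLC-01"))
sample.experiments.append(exp)

# Traceability: from a sample to every raw file supporting its results.
for e in sample.experiments:
    for f in e.raw_files:
        print(sample.sample_id, "->", e.eln_entry_id, "->", f.path)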
The common challenges faced by labs today – including disconnected systems, custom workarounds, and missing data – are well documented and are often addressed through the adoption of a modern lab informatics platform. For digital transformation, informatics is a strategic layer that connects people, processes, and technologies into a unified ecosystem.
Implementation of High Throughput Purification in Tandem with Digital Tools
Janssen Pharmaceuticals recognised that traditional purification processes are often slow and inefficient, lacking the automation and scalability required to support the rapid pace of modern medicinal chemistry.6 Drug discovery typically involves purifying large libraries of compounds, ranging from milligram to gram scale, which demands high-throughput systems capable of processing hundreds of samples per day. However, most existing automated workflows rely solely on reversed-phase chromatography, limiting flexibility and missing the benefits of orthogonal techniques like supercritical fluid chromatography (SFC), which offers faster, greener separations with complementary selectivity. Additionally, method development remains a bottleneck, as selecting the optimal chromatographic conditions is time-consuming and critical to achieving high-resolution separations.
Previously, purification groups at different Janssen sites used disconnected tools or spreadsheets for tracking and managing workflows, leading to inconsistencies and inefficiencies. Compounding these issues, the growing volume of analytical data generated by ultraviolet (UV) detection, mass spectrometry (MS), charged aerosol detection (CAD), and nuclear magnetic resonance (NMR) creates a need for automated, intelligent data processing tools to drive decision-making.
Technology
To overcome these challenges, Janssen digitally transformed their workflow using a LIMS, which enabled them to take advantage of two tenets of Biopharma 4.0: robotics and automation.
In doing so, Janssen was able to connect every step of the workflow, from sample submission to assay-ready compounds. Sample tracking and method selection became automated over time, and data now flows cleanly across multiple orthogonal analytical and semi-preparative purification steps, such as RP-HPLC-MS, SFC-MS, and high-throughput NMR. By also integrating data analysis tools, the platform processes LC-MS and CAD data to identify the best purification conditions using criteria like retention time, peak purity, and resolution. The system automatically generates focused gradients for preparative runs, manages fraction QC, and handles redissolution into DMSO for biological testing.
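As a simplified illustration of the kind of decision logic described – not Janssen's actual implementation – the sketch below scores screening runs on peak purity and resolution, selects the best method, and centres a focused preparative gradient on the analytical retention time. The weights, thresholds, and data are invented for illustration.

```python
# Simplified sketch of automated purification-method selection: score
# screening runs on purity and resolution, pick the best method, and derive
# a focused gradient window around the target peak. All numbers and weights
# are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class ScreeningRun:
    method: str               # e.g. "RP-HPLC-MS pH 3" or "SFC-MS 2-EP column"
    retention_time_min: float
    peak_purity_pct: float
    resolution: float          # resolution from the nearest impurity peak

def score(run: ScreeningRun) -> float:
    # Weighted score; the weights are placeholders for illustration.
    return 0.6 * run.peak_purity_pct + 40.0 * min(run.resolution, 2.0)

def focused_gradient(rt_min: float, window_min: float = 1.5) -> tuple[float, float]:
    """Centre a narrow preparative gradient around the analytical retention time."""
    return max(rt_min - window_min, 0.0), rt_min + window_min

runs = [
    ScreeningRun("RP-HPLC-MS pH 3", 4.2, 91.0, 1.8),
    ScreeningRun("SFC-MS 2-EP column", 2.9, 96.5, 2.4),
]
best = max(runs, key=score)
print(best.method, focused_gradient(best.retention_time_min))
```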
By implementing a modern lab informatics platform to digitally transform its workflow, Janssen was able to apply the Biopharma 4.0 components of robotics and automation, allowing the company to purify thousands of compounds quickly and consistently across global sites, improving efficiency and reducing drug discovery cycle times.
Summary
Digital transformation serves as the foundation for Biopharma 4.0, streamlining data management across the value chain while enabling integration of advanced technologies such as automation, PAT, and continuous bioprocessing. In an industry where precision is critical, accelerating development without sacrificing quality is a competitive advantage.
Importantly, transitioning to Biopharma 4.0 does not need to be complex or disruptive. It can begin with strategic, manageable steps, such as strengthening data management practices, aligning leadership on a digital transformation roadmap, or implementing foundational tools like LIMS, ELNs, or SDMS. These early investments lay the groundwork for a scalable digital backbone, enabling further adoption of Biopharma 4.0 technologies as organisational readiness and needs evolve.
Alongside the technologies, organisations must also consider the organisational and managerial factors that influence the success of a Biopharma 4.0 strategy.7 Implementing new tools and systems may require the organisation to reassess roles, workflows, and decision-making processes, changes that affect how scientific work gets done day-to-day and how teams interact across discovery, development, manufacturing, and quality functions.
Equally important is ensuring that technology investments support broader company goals, whether that's improving supply chain resilience, speeding time to market, or meeting compliance and regulatory requirements. A Biopharma 4.0 strategy is most likely to succeed when digital tools and operational needs are aligned with existing and future business priorities.
Organisations that embrace this transformation are already realising tangible gains in efficiency, reproducibility,
and compliance. At the heart of this shift are lab informatics platforms, empowering real-time decision-making, driving data integrity, and anchoring the digital infrastructure required to compete in a rapidly advancing biopharmaceutical landscape.
3. Poell, T., Nieborg, D. and van Dijck, J. (2019). Platformisation. Internet Policy Review, 8(4).
4. Chen, Y., et al. (2020). Digital Twins in Pharmaceutical and Biopharmaceutical Manufacturing: A Literature Review. Processes, 8, 1088.
5. Yang, H.S., et al. (2023). Building the Model. Archives of Pathology & Laboratory Medicine, 147(7), 826–836.
6. Chen, K., et al. (2025). Automated high-throughput RP-HPLC-MS and SFC-MS analytical and purification platforms to support drug discovery. Journal of Chromatography. A, 1742, 465648.
7. Miozza, M., Brunetta, F. and Appio, F.P. (2024). Digital transformation of the Pharmaceutical Industry: A future research agenda for management studies. Technological Forecasting and Social Change, 207, 123580.
Jim Sulzberger
Jim Sulzberger, Director of CMC, Scientific Office, Sapio Sciences is an accomplished leader in the biopharmaceutical industry with over 17 years of experience spanning process development, analytical science, and CMC strategy. Currently Director of CMC at Sapio Sciences, he has held key technical and leadership roles at COUR Pharmaceuticals, Aldevron, Bio-Rad Laboratories, Pall Corporation, and Celgene. His work has focused on developing scalable, cost-effective manufacturing processes and analytical methods for complex biologics, including monoclonal antibodies, recombinant proteins, and nanoparticle-based therapeutics. Jim’s strengths include process intensification, chromatography, and regulatory submissions, with a hands-on approach that bridges science and strategy. He holds an MS in Chemistry from Northeastern Illinois University and a BS in Biology from the University of Illinois at Chicago, along with a certificate in Enterprise Transformation from MIT Sloan. His innovations include a provisional patent for a novel two-step purification process and impactful contributions to the advancement of continuous chromatography and downstream process development.
Manufacturing & Processing
TPDs: Developing the Next Generation of Oral Therapeutics
Targeted Protein Degraders (TPDs) are transforming the landscape of modern drug discovery and development, introducing a ground-breaking method for tackling diseases previously deemed untreatable. In contrast to traditional small-molecule inhibitors that only block protein function, TPDs harness the body’s natural degradation systems to eliminate harmful proteins altogether. This pioneering approach paves the way for novel therapies in oncology, neurodegenerative diseases, and immune-related conditions. Reflecting this potential, investment in TPDs has surged; venture funding soared from $33 million in 2017 to $707 million in 2022, marking an increase of more than 2,000%. The global TPD market, valued at roughly $544.4 million in 2024, is projected to expand at a compound annual growth rate (CAGR) of 20.8% between 2025 and 2030. However, despite their vast potential, TPDs still encounter major obstacles, including formulation complexity, limited bioavailability, and challenges in scaling up manufacturing.
Engineering Precision
The scientific foundation of Targeted Protein Degraders (TPDs) lies in their unique ability to selectively and irreversibly eliminate specific proteins by exploiting the body’s ubiquitinproteasome system. Unlike traditional therapies, such as tyrosine kinase inhibitors (TKIs), which focus on blocking protein function and often provide only temporary suppression, TPDs offer a more permanent solution by binding to the target protein and guiding it toward degradation. This results in a more sustained therapeutic effect and helps overcome the issue of drug resistance. TPDs are especially valuable in situations where conventional small-molecule inhibitors fail to achieve adequate specificity or efficacy.
In oncology, TPDs are showing strong therapeutic potential, with active investigations underway for their use in treating lung cancer, breast cancer, multiple myeloma, and lymphoma. Their applications are not limited to cancer; research is expanding into neurodegenerative disorders like Parkinson’s disease, where protein misfolding and aggregation are central to disease progression. TPDs also hold significant promise for immunological conditions, offering a novel way to selectively target and eliminate proteins that drive chronic inflammation and immune system dysregulation. With thousands of proteins performing diverse roles in the human body, the scope of TPDs is vast, making them one of the most compelling emerging areas in modern drug development.
Leading TPD strategies currently under exploration include proteolysis-targeting chimeras (PROTACs) and molecular glues, both of which have already progressed to clinical trials. PROTACs are bifunctional molecules that simultaneously bind to the target protein and an E3 ubiquitin ligase, initiating the
degradation cascade. In contrast, molecular glues facilitate the degradation process by strengthening the interaction between a target protein and a ubiquitin ligase, subtly recruiting the body’s degradation machinery with greater efficiency. As both technologies continue to evolve, they are expected to fundamentally reshape the future of therapeutic intervention.
Development Challenges with TPDs
The therapeutic promise of Targeted Protein Degraders (TPDs) is substantial, yet their development presents a series of complex challenges. Unlike conventional small-molecule drugs, which generally align with well-characterised pharmacokinetic and pharmacodynamic behaviours, TPDs often possess atypical properties that complicate formulation and delivery.
A key hurdle in formulation arises from the structural complexity of TPDs. Their relatively high molecular weights, along with the inclusion of multiple functional groups required for effective target binding and E3 ligase recruitment, frequently result in poor solubility and limited membrane permeability. These physicochemical characteristics can significantly impair oral bioavailability, prompting the need for advanced formulation techniques to enhance absorption. Among the two main classes of TPDs, molecular glues, being smaller and structurally simpler, have demonstrated better potential for oral delivery, whereas PROTACs typically demand additional solubility enhancement strategies.
To address solubility limitations, researchers are investigating a variety of formulation methods, including hot-melt extrusion, spray drying, nano-milling, and lipid-based delivery systems. Each approach offers specific advantages depending on the unique attributes of the compound. For instance, spray drying and hot-melt extrusion can increase the bioavailability of poorly soluble drugs by generating amorphous solid dispersions. Nano-milling enhances dissolution by reducing particle size, while lipid-based formulations exploit the body’s endogenous lipid absorption pathways to facilitate drug uptake.
Further complicating development is the fact that many TPDs do not conform to Lipinski’s Rule of Five, a widely accepted guideline for predicting the oral bioavailability of small-molecule drugs. TPDs often exhibit characteristics such as elevated molecular weight, limited permeability, and high numbers of hydrogen bond donors and acceptors. These deviations necessitate comprehensive solubility screening and early-stage optimisation to ensure alignment between the formulation strategy and the intended route of administration. Moreover, excipients play a pivotal role in stabilising the final dosage form and ensuring content uniformity, particularly important for TPDs, which often require low drug loads due to their high potency.
Phase-Appropriate Development
When working with complex modalities such as TPDs,
pharmaceutical development must be approached as a continuous, adaptive process rather than a fixed set of activities. A scientifically rigorous, strategically flexible, and operationally collaborative framework is essential to guiding molecules from discovery to commercial readiness.
Scientifically, the foundation lies in Quality by Design (QbD). This approach begins with a clear understanding of the desired product attributes, such as quality, safety, and efficacy, defined through the Target Product Profile (TPP). However, the TPP is not static. It starts broadly during early development and becomes more refined as the programme progresses, ultimately aligning with regulatory expectations and real-world patient needs.
Strategically, a phase-appropriate plan is vital. Each phase should build intelligently upon the last, with clear inputs, outputs, and technical goals tailored to where the product is in its lifecycle. In early-phase development, speed is paramount. Simple formulations, such as a neat API in capsule, can provide sufficient stability for first-in-human studies. As the product advances, more robust and scalable formulations are introduced to meet the demands of later-stage trials and commercial production.
Operationally, communication and collaboration are key. Development cannot occur in silos; alignment across drug substance, CMC and clinical teams is critical. CDMOs must be able to interpret the sponsor’s vision and translate it into viable, scalable, and patient-friendly formulations without locking themselves into inflexible paths. This requires not only scientific expertise but also a willingness to adapt based on new
data or shifting priorities.
PCI applies this philosophy through a stage-gated pharmaceutical development process that mirrors the clinical lifecycle, spanning formulation and analytical development, toxicology batches, and Phase I/II/III studies. As development progresses, the focus shifts from flexibility to robustness, with increasing attention to identifying and controlling Critical Material Attributes (CMAs), Critical Quality Attributes (CQAs), and Critical Process Parameters (CPPs). These elements inform the control strategies and specifications that underpin a successful product.
Given the concurrent scale-up of drug substance and drug product, close coordination is especially important. Understanding how an API’s characteristics, such as morphology or impurity profile, might evolve during scale-up is essential to avoid formulation issues downstream. In the context of targeted therapies like TPDs, where precision is paramount, the margin for error is razor-thin. A phase-appropriate, science-led development strategy is not just ideal, it’s imperative.
Manufacturing and Stability
Manufacturing TPDs demands a highly sophisticated approach that accounts for their potency, scalability, and stability. Due to their precise mechanism of action, TPDs are frequently highly potent compounds, which necessitates rigorous containment protocols to ensure safety during production. As a result, manufacturing environments must be equipped with high-containment capabilities – something not all facilities possess. This limitation makes strategic partnerships with
contract development and manufacturing organisations (CDMOs) a critical component of TPD development and scale-up.
Containment is only part of the equation; the structural complexity that grants TPDs their therapeutic precision can also render them more vulnerable to chemical or physical degradation under various conditions. As these compounds move from laboratory to large-scale production, ensuring their stability becomes a significant focus. Maintaining formulation integrity throughout scale-up and commercial manufacturing requires the judicious selection of excipients, optimised processing techniques, and advanced analytical tools for continuous monitoring. Robust in-process controls are essential not only for maintaining batch-to-batch consistency but also for ensuring that the final drug product meets all regulatory and quality requirements.
The CDMO Role
As drug development becomes increasingly complex, biotech and pharmaceutical companies are increasingly turning to contract development and manufacturing organisations (CDMOs) to navigate the unique challenges of TPDs. A strong CDMO partner provides essential support across the development lifecycle, offering deep expertise in the handling of highly potent compounds, advanced formulation technologies, and contained manufacturing. Their infrastructure is critical in supporting the seamless transition of TPDs from preclinical stages through to commercial-scale supply.
A particularly crucial area of support lies in analytical development. The characterisation of TPDs requires a suite of highly specialised techniques, including surface plasmon resonance, mass spectrometry, fluorescence polarisation, X-ray crystallography, and bio-NMR spectroscopy. These advanced methods are indispensable for assessing key attributes such as protein binding efficiency, degradation kinetics, and molecular stability. Given the complexity of these analytical tools and the expertise required to operate them effectively, many pharmaceutical companies partner with CDMOs to gain access to the necessary infrastructure and know-how.
In addition to its strengths in manufacturing and analytical services, PCI Pharma Services leverages more than 35 years of experience in the development of highly potent drug products. The company has also formed strategic partnerships to further expand its capabilities in solubility enhancement and particle engineering, two areas that are particularly relevant to the development of TPDs. Through cutting-edge techniques such as hot-melt extrusion, amorphous solid dispersion formation, and nano-milling, PCI's partner network helps optimise bioavailability and ensure consistent product performance, enabling the successful development of TPD candidates that might otherwise be limited by poor solubility and inconsistent pharmacokinetics.
TPDs: Looking Ahead
The field of TPD research is advancing at a remarkable pace, fuelled by ongoing innovation in medicinal chemistry, structural biology, and computational modelling. Over the next five to ten years, the pipeline of new TPD candidates is expected to grow substantially, with a strong focus on enhancing oral
bioavailability and expanding the spectrum of druggable proteins. Continued progress in the discovery and refinement of E3 ligase ligands will further improve the specificity and efficiency of TPD design, enabling more precise and effective therapeutic outcomes.
Artificial intelligence and machine learning are also set to play transformative roles in TPD discovery. By streamlining compound identification and optimising lead selection, AI-driven high-throughput screening is poised to reduce development timelines and increase the likelihood of clinical success. These technologies will enable researchers to prioritise the most promising molecules for synthesis and biological evaluation, accelerating the journey from concept to candidate.
As regulatory pathways for TPDs continue to evolve, early engagement with regulatory agencies will be essential for ensuring alignment and avoiding delays. Proactive dialogue can help clarify requirements and facilitate smoother approvals. Throughout this process, CDMOs will remain invaluable partners, bringing deep regulatory experience, advanced formulation capabilities, and scalable manufacturing infrastructure to support the development of high-quality, stable TPD-based therapies.
Summary
Targeted Protein Degraders are ushering in a new era of drug development, offering a fundamentally different approach through the selective and irreversible elimination of disease-causing proteins. Although their development is accompanied by a distinct set of scientific and logistical challenges, specialised CDMOs are playing a pivotal role in overcoming these obstacles, supporting everything from advanced formulation to high-containment manufacturing and regulatory navigation. With continued progress in medicinal chemistry, bioengineering, and computational tools, the momentum behind TPDs is only expected to grow. This promising modality stands poised to deliver transformative therapies across a broad range of diseases, potentially reshaping the future of modern medicine.
Anshul Gupte, Ph.D., RAC Drugs, joined PCI in March 2024 as Vice President, Pharmaceutical Development. He brings over 16 years of experience across solid orals, liquids, sterile, and topical dosage forms, with contributions to numerous global regulatory submissions. Anshul previously held senior development roles at Catalent and Metrics Contract Services and led complex generics formulation at Mayne Pharma in Australia.
Application Note
The Science of Cell Line Development for Biologics: Improving Stability and Yield
Mammalian cell line development is essential to biologics manufacturing, ensuring stable, high-yield expression of therapeutic proteins. With expanding biologics pipelines, the industry is continuously innovating to improve productivity, speed to patient, and scalability. Among the most widely used cell lines for biologics production are Chinese Hamster Ovary (CHO) cells, which have become the gold standard for monoclonal antibody and recombinant protein production. Their adaptability, scalability, and ability to achieve high titers make them essential for developing monoclonal antibodies and other complex biologics.
Cell Line Development Steps for Monoclonal Antibody Production
Developing a stable, high-yield cell line requires multiple steps, including the genetic modification of mammalian cells to integrate the gene encoding the protein of interest into the host genome. This is followed by rigorous screening, characterisation, and banking to ensure high productivity and product stability.
Development of Cell Lines for Bispecific Antibodies
Developing cell lines for bispecific antibodies is particularly challenging due to the inherent complexity of these molecules. Bispecific antibodies are designed to bind two different antigens simultaneously, adding both structural and functional complexities.
Efficiently expressing bispecific antibodies requires cells to produce two different heavy chains and two different light chains, which must correctly pair to form functional bispecific molecules. Incorrect pairing often results in product-related impurities, such as homodimers, which are less effective and difficult to remove during purification due to their similar physical and chemical properties.
The Role of CHO Cell Lines in Biologics Manufacturing
CHO cells are the predominant mammalian cell lines used for producing therapeutic proteins and monoclonal antibodies. First isolated in 1956, CHO cells have been extensively optimised to create subclones that deliver higher yields and improved product quality. Initially adherent, they have since been adapted to suspension culture, allowing them to grow freely in suspension without the need for an attachment surface.
Key stages in monoclonal antibody cell line development include:
1. Selection of host cell line – Choosing an appropriate cell line that is amenable to genetic manipulation and capable of high-yield protein production.
2. Transfection – This initial step involves integrating the gene of interest into the host genome. Transfection can be achieved through physical methods like electroporation, or chemical methods such as lipofection or calcium phosphate precipitation.
3. Stable pool generation – Post-transfection, cells incorporating the gene of interest are selected using selection and antibiotic-resistance markers. Common systems include dihydrofolate reductase (DHFR)/methotrexate (MTX) selection and the glutamine synthetase (GS) system.
4. Single-cell cloning – Ensuring the monoclonality of cell lines is crucial. This step involves isolating single cells to establish monoclonal cell lines, which guarantee consistent production of the target protein. Regulatory authorities require stringent standards of monoclonality. Advanced equipment like the Beacon from Berkeley Lights uses microfluidics and OptoElectroPositioning (OEP) technology to move cells in and out of NanoPen chambers, significantly reducing development timelines compared to traditional methods like limiting dilution.
5. Screening and isolation – This step involves assessing a large number of clones for yield, quality, and manufacturability. High-throughput equipment like the ambr 250 system is typically used to screen multiple clones efficiently.
6. Cell line stability studies – The stability of established cell lines is evaluated to ensure that clones can maintain consistent titer and product quality over multiple generations.
7. Master cell banking and characterisation – Master cell banks of the lead clones are created to ensure an adequate source of cells for future large-scale production. These banks are comprehensively characterised to meet regulatory requirements.
Some widely used CHO cell lines include:
• CHO-K1: CHO-K1 remains one of the most widely used CHO cell lines due to its adaptability, genetic stability, and scalability in biologics manufacturing. However, not all CHO-K1 cell lines are created equal. Advances in cell line engineering have led to next-generation CHO-K1 variants that optimise titer expression, gene stability, and process scalability.
For example, Thermo Fisher Scientific’s CHO-K1 high-titer cell line leverages proprietary transposase technology to enhance gene integration and achieve titers of up to 8 g/L. This next-generation CHO-K1 platform improves expression stability and process efficiency, accelerating IND submission timelines and reshaping expectations for biologics development.
• CHO-S: CHO-S is a suspension-adapted variant of Chinese Hamster Ovary cells, making it highly suitable for large-scale bioreactor production. Its capability for proper protein folding and post-translational modifications, coupled with high yields, renders it ideal for industrial-scale production of biotherapeutics and research in cellular biology. Additionally, CHO-S is widely accepted by regulatory agencies for the production of biopharmaceuticals, further solidifying its role in the industry.
• CHO-DG44: This CHO variant, characterised by a double deletion of the dihydrofolate reductase (DHFR) gene, enables the use of methotrexate (MTX) for selection and gene amplification. This attribute facilitates high-level expression of recombinant proteins. Like other CHO cell lines, CHO-DG44 is leveraged for large-scale production due to its high yields, proper protein folding, and effective post-translational modifications.
• CHO-DXB11: Also deficient in DHFR, CHO-DXB11 is utilised for similar purposes as CHO-DG44. It holds historical significance as the first CHO cell line used in the biotechnology era for the production of recombinant mammalian proteins. CHO-DXB11 was pivotal in producing large quantities of human tissue plasminogen activator (TPA). However, its utility was somewhat limited by the potential for reversion of DHFR activity under mutagenic conditions.
These CHO variants have distinct characteristics that make them well-suited for a range of biotechnology and biopharmaceutical applications. Selecting the right cell line depends on multiple factors, including growth characteristics, gene amplification capabilities, protein production efficiency, and regulatory compliance.
Advancing Biologics Through Cell Line Innovation
Cell line development is a fundamental step in biologics manufacturing, shaping the efficiency, scalability, and success of monoclonal antibody and recombinant protein production. From host cell selection and genetic modification to screening and banking, each stage plays a critical role in ensuring high-yield, stable expression.
CHO cell lines continue to set the industry standard, with advancements in engineering enabling higher titers, improved gene stability, and more efficient pathways to IND submission. Whether for monoclonal antibodies, bispecifics, or other complex biologics, selecting the right cell line is key to optimising production and meeting regulatory standards.
As cell line development evolves, next-generation CHO-K1 platforms, such as Thermo Fisher Scientific’s high-titer CHO-K1 cell line, are pushing the boundaries of productivity and process efficiency. With innovations in transposase technology and gene integration, these advancements are reshaping expectations for biologics development, offering more scalable and efficient solutions for today’s fast-moving pipelines.
Learn about Thermo Fisher Scientific’s Path to IND and First-in-Human platform to understand how we leverage these techniques for our clients to increase stability and yield, while reducing the timeline, all without compromising quality.
Thermo Fisher Scientific provides industry-leading pharmaceutical services solutions for drug development, clinical trial logistics and commercial manufacturing. With more than 60 locations worldwide, the company offers integrated, end-to-end capabilities across all phases of development. Pharmaceutical and biotech companies of all sizes gain instant access to a global network of facilities and technical experts. Thermo Fisher delivers integrated drug development and clinical services tailored to fit each drug development journey, ensuring high quality, reliability and compliance as a leading pharmaceutical services provider.
Harnessing the Power of Generative AI: Transforming Life Science Manufacturing
The life science industry faces the challenges of developing and delivering life-changing medicines at an accelerated pace, all while maintaining quality and safety and ensuring sustainable practices. This challenge is magnified by the sheer volume and complexity of data generated throughout the product life cycle. When leveraged strategically, this data can significantly enhance the effectiveness of pharmaceutical manufacturing. In many cases, the data necessary to address these challenges already exists within organisations. However, traditional data management approaches, often characterised by complex, fragmented systems with manual processes, hinder access to critical insights, impede innovation and slow down decision-making processes – ultimately restricting a company’s ability to respond effectively to the dynamic market and regulatory landscape.
Generative AI offers a transformative solution by revolutionising data utilisation in pharmaceutical companies. It provides an intuitive interface that simplifies access to previously fragmented information, empowering users to extract valuable insights and drive improvements across the entire manufacturing lifecycle. By moving beyond traditional data processing and pattern application, generative AI enables informed, data-driven decisions at all organisational levels.
The Power of Integrated AI: A Holistic System
The convergence of generative AI with other AI-powered tools creates a more powerful and comprehensive solution than any of these components in isolation. Technologies such as machine learning, mathematical modelling and digital twins (virtual replicas of physical assets or processes) form a synergistic ecosystem that elevates decision-making across the entire manufacturing life cycle. This integrated approach leverages the unique strengths of each tool, enabling manufacturers to tackle their most pressing challenges, from identifying the "next best action" for manufacturing and lab personnel to optimising maintenance schedules and IT/OT operations.
Incorporated into a comprehensive digital transformation strategy, this AI ecosystem has the potential to revolutionise how life science companies operate. By providing real-time, actionable insights based on in-depth data analysis, it can answer critical questions, optimise processes and drive continuous improvement. It is no surprise that the life science industry is increasingly embracing this technology as a cornerstone of its business transformation initiatives.
Unlocking Value:
Generative AI’s Impact on Pharmaceutical Companies
Generative AI, while a relatively new entrant in the technological
landscape, has rapidly emerged as a potential solution to long-standing challenges faced in the pharmaceutical industry. From accelerating drug discovery and development through the analysis of vast datasets to optimising manufacturing processes and supply chains, the theoretical applications seem endless. Beyond this theoretical promise, generative AI is already proving its worth in the pharmaceutical industry, providing tangible results that address key business challenges and create significant value.
• Accelerating Time to Market:
By automating complex tasks like process fit-gap analysis and lab method transfer, generative AI significantly shortens the new product introduction (NPI) timeline. This means a faster time to market, a crucial competitive edge in a dynamic industry, as well as substantial cost savings.
• Enhancing a Data-driven Workforce:
Generative AI democratises access to data, empowering decision-makers at all levels. AI-powered assistants act as virtual collaborators, providing real-time insights and guidance, enabling staff to make informed decisions and execute tasks more effectively and leading to increased capacity.
• Building Resilient Supply Chains:
The pharmaceutical supply chain is complex and vulnerable to disruptions. Generative AI offers a powerful tool for mitigating risk and ensuring supply chain continuity. By analysing vast datasets, AI can predict disruptions, identify bottlenecks and recommend proactive measures to ensure a secure and reliable supply of critical medicines.
• Pioneering Sustainable Practices:
Sustainability is a growing concern for consumers and investors. Generative AI enables the creation of digital twins, virtual replicas of manufacturing processes that can provide real-time insights into resource consumption. By analysing data on carbon, energy, water, solvent and other waste, companies can identify optimisation opportunities and make informed decisions that promote sustainability and reduce their environmental impact.
• Fostering a Culture of Continuous Improvement:
Generative AI is a catalyst for operational excellence. By defining standard work processes and identifying deviations in real time, AI allows for continuous improvement through data-driven insights, leading to increased efficiency and reduced costs.
Realising Tangible Benefits: Generative AI in Action
The transformative impact of generative AI in the life science space is already being realised in practical applications that
yield measurable results for pharmaceutical companies. These real-world examples demonstrate how this technology is reshaping the industry and driving tangible benefits across the entire value chain.
A key advantage of generative AI is its ability to democratise data access. Traditionally, data has often been siloed within organisations, limiting its availability and utility. Generative AI breaks down these barriers, empowering executives, supervisors, technicians and operators at all levels with the information they need to make informed decisions. This democratisation of data fosters a data-driven culture, where decisions are grounded in evidence and insights, leading to enhanced manufacturing efficiency and improved outcomes.
Generative AI assistants further amplify the impact of democratised data. These intelligent agents provide real-time guidance and recommendations, supporting both good manufacturing practice (GMP) and non-GMP decision-making processes. By integrating with existing systems and leveraging large language models (LLMs), these assistants offer context-specific insights, helping staff navigate complex tasks, troubleshoot issues and optimise operations. This enhances individual performance and contributes to a more agile and responsive organisation.
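In outline, such an assistant often follows a retrieve-then-generate pattern. The sketch below is schematic rather than a description of any particular product: the retrieval step is a naive keyword match (real systems would use embeddings and validated sources), the example records are invented, and llm_complete is a hypothetical stand-in for whichever LLM service an organisation actually uses.

```python
# Schematic outline of a grounded GenAI assistant: retrieve the most
# relevant internal records for a question, then pass them to a large
# language model as context. `llm_complete` is a placeholder for a real
# LLM service; the records below are invented examples.
def retrieve(question: str, records: list[dict], top_k: int = 2) -> list[dict]:
    """Naive keyword-overlap retrieval; production systems would use embeddings."""
    words = set(question.lower().split())
    ranked = sorted(records, key=lambda r: -len(words & set(r["text"].lower().split())))
    return ranked[:top_k]

def build_prompt(question: str, context: list[dict]) -> str:
    snippets = "\n".join(f"- [{r['source']}] {r['text']}" for r in context)
    return (
        "Answer using only the context below and cite the source of each fact.\n"
        f"Context:\n{snippets}\n\nQuestion: {question}"
    )

def llm_complete(prompt: str) -> str:
    raise NotImplementedError("Replace with a call to your chosen LLM service.")

records = [
    {"source": "SOP-114", "text": "Filter integrity test must be repeated after any filter change."},
    {"source": "Batch 2231", "text": "Deviation DEV-88 raised for pH excursion during hold step."},
]
prompt = build_prompt("What should the operator do after a filter change?",
                      retrieve("filter change", records))
print(prompt)
```

For GMP-relevant use, the retrieval sources, prompt, and model outputs would all sit inside the validation and review controls discussed later in this article.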
The potential applications of generative AI in the life science ecosystem are vast and varied. For GMP material supply, generative AI can optimise efficiency and security by streamlining supplier management processes, ensuring timely delivery of high-quality materials and reducing the risk of disruptions. This translates to a more reliable output of finished products, whether for clinical trials or commercial markets.
Moreover, generative AI enhances collaboration between pharmaceutical organisations and their contract manufacturing partners. By facilitating seamless communication and data sharing, AI can help eliminate inefficiencies caused by miscommunication, delays and errors. This fosters a more collaborative and productive relationship, ultimately benefiting both parties.
Generative AI also allows companies to make more informed post-commercial decisions. By analysing real-world data on product performance, patient outcomes and market trends, AI can provide valuable insights into product life cycles, enabling companies to optimise marketing strategies, identify new indications and maximise return on investment.
Building a Robust AI Strategy: Key Considerations for Success
While the potential benefits of generative AI are undeniable, its successful integration into pharmaceutical manufacturing requires careful planning and a comprehensive strategy. As with any disruptive technology, there are challenges and pitfalls to navigate to ensure that generative AI delivers on its promise.
1. Cost-Benefit Analysis:
Implementing generative AI can be a significant investment. Companies must carefully evaluate the total cost of ownership, including hardware, software, training and ongoing maintenance. A thorough cost-benefit analysis is essential. This will determine whether the value that generative AI adds to the business in terms of enhanced efficiency, greater resilience and improved sustainability justifies the expense and disruption of implementation. It can also ensure that the investment aligns with the company’s strategic goals.
2. Ensuring GMP Compliance:
The use of generative AI in GMP decision-making requires rigorous validation and robust processes to ensure compliance with regulatory standards. Companies must establish clear guidelines and protocols for how AI-generated outputs will be evaluated, validated and incorporated into decisionmaking processes. This will build trust and confidence in the technology, both internally and among regulators.
3. Data Integrity and Accessibility:
Generative AI models rely on high-quality, consistent data to produce accurate and reliable results. However, many pharmaceutical organisations struggle with fragmented data systems, disparate data formats and data integrity issues.
Before embarking on a generative AI journey, companies must invest in data harmonisation, ensuring that data is clean, standardised and easily accessible across the organisation.
4. Validation and Reliability:
The consistency and repeatability of AI-generated outputs are critical, especially in heavily regulated industries like pharmaceutical manufacturing. Rigorous testing and validation processes are necessary to ensure that AI models produce reliable results in different scenarios. Expert support may be required to design and execute these validation protocols effectively.
5. Navigating Cultural Shifts:
One of the most significant challenges in implementing generative AI is managing the cultural impact. Introducing new technology can often be met with resistance or scepticism from employees who are accustomed to traditional ways of working. Effective change management is paramount to ensuring a smooth transition. Companies must proactively engage employees at all levels, fostering understanding and buy-in for the AI-driven transformation.
By addressing these key considerations, pharmaceutical companies can lay a solid foundation for successfully implementing generative AI. This strategic approach will mitigate potential risks and maximise the value that this technology can bring to organisations.
Implementing Generative AI: Lessons Learned for Success
The journey of integrating generative AI into life science manufacturing is not without its complexities. However, by learning from the experiences of early adopters, companies can navigate this path more effectively and maximise the value this technology brings.
1. Link AI Solutions to Measurable Business Outcomes:
From the outset, it is crucial to establish a clear link between generative AI solutions and measurable improvements in key business drivers, such as increased capacity, accelerated product launches or reduced waste. This demonstrates the return on investment and ensures that AI initiatives are aligned with the organisation’s strategic goals. By tracking and quantifying the impact of AI, companies can continuously refine their strategies and optimise the value derived from this technology.
2. Focus on Capturing Tacit Knowledge Proactively:
Rather than attempting to retrospectively incorporate tacit knowledge into generative AI systems, companies should prioritise capturing this knowledge from the moment AI is activated. This approach streamlines the integration of disparate data systems and accelerates the time to value. By proactively capturing knowledge as it is generated, companies can build a robust knowledge base that continuously evolves and improves, ensuring that AI models are always up to date and relevant.
3. Embrace the Transformation Journey:
Embarking on a digital transformation journey can be daunting, especially when it involves cutting-edge technologies like generative AI. However, the potential
rewards are significant. Companies should not hesitate to take the first step. By actively exploring and evaluating generative AI tools, engaging with experts and experimenting with pilot projects, organisations can gain valuable insights and experience, paving the way for a smoother and more successful implementation.
The sooner companies begin to incorporate generative AI into their operations, the sooner they can unlock its potential. By learning from the experiences of others and adopting a proactive, strategic approach, life science companies can position themselves at the forefront of innovation and reap the benefits of this groundbreaking technology.
The Future of Generative AI in Life Science Manufacturing
The integration of generative AI in life sciences manufacturing is still in its early stages, but its potential is vast. As the technology continues to advance, we can expect to see even more innovative applications of AI in areas such as process optimisation, quality control, and supply chain management within drug manufacturing.
The industry is also seeing the emergence of Agentic AI, an evolution of generative AI in which AI systems can autonomously execute tasks, make decisions, and even collaborate to achieve specific goals, promising to further revolutionise automation and problem-solving within the industry. This represents a significant leap forward, moving beyond simple generation to more dynamic and proactive AI-driven processes.
By embracing this transformative technology and partnering with experienced digital transformation providers, life science organisations can position themselves at the forefront of innovation and secure a brighter future for the industry and the patients it serves.
David Staunton
David Staunton (BEng Hons. Aerospace, MSc Project Management) is the Life Science Manufacturing Leader of Transformation at Cognizant and drives business outcome transformation with its Tier 1 clients. David has over 25 years' experience in delivering digital, MES, data and automation projects and services for pharma and biotech companies. In consistent dialogue with CIOs, global VPs of engineering, quality VPs and site heads of GMP manufacturing and lab facilities, David leads the delivery of transformation services for Industry 4.0. He is an internationally recognised speaker on Industry 4.0 and lectures on a variety of UCD Master's programmes. An innovator in the life science industry, David recognises the transformative potential of AI, heralding it as a game changer in areas from drug discovery to operational efficiency and ESG, and in transforming the economics of making medicine. He brings a practical and contemporary approach from engagements with Tier 1 pharma companies that are steering the sector towards a future where accelerated launch, predictive analytics, and unparalleled efficiency are the norm.
AI and Machine Learning Subsection
Building a Global Data Foundation for Scaling AI
AI use cases are rippling across commercial biopharma, helping companies make faster, more informed decisions. Yet almost 70% of top generative AI (GenAI) users cite poor data quality as their most significant obstacle in unlocking AI's full potential. As the adoption of applications grows, the true competitive edge lies in the quality of the data fuelling them.
To fully harness AI, commercial leaders are establishing a scalable, seamlessly connected data foundation across markets, functions, and disease areas. Without it, companies’ AI pilots could amount to isolated experiments. Those who focus on creating standardised and well-integrated data can unlock AI’s full potential to gain a competitive advantage and drive long-term success.
Data Consistency and Connectivity: The Foundation of AI
Commercial biopharma teams are uniquely positioned to strategically leverage AI as they collect vast amounts of data, including customer, sales, medical engagement, and social media activity. The next step is to harmonise the data – essentially to “speak the same language” to generate accurate and scalable insights.
Consider a common scenario: One system lists a healthcare professional (HCP) as “John Smith” and another as “J. Smith.” Or perhaps "cardiology" is recorded in one database while “heart medicine” appears in another. AI may fail to connect the variations, leading to errors, duplication, and unreliable insights. These inconsistencies often stem from diverse data sources that don’t speak to each other, creating friction for AI and significantly reducing its ability to provide value.
In another example, a biopharma’s HCP database had over 25,000 specialty classifications, rendering AI-driven insights nearly impossible. The company resolved the issue by implementing global data standards, significantly improving accuracy and scalability.
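A minimal sketch of what such standardisation can look like in code is shown below, using invented mappings and records: local specialty labels are mapped onto one controlled vocabulary and HCP names are normalised before records from different systems are compared.

```python
# Toy illustration of global data standardisation: map local specialty
# labels onto one controlled vocabulary and normalise HCP names before
# matching records across systems. The mappings and records are invented.
SPECIALTY_MAP = {
    "cardiology": "CARDIOLOGY",
    "heart medicine": "CARDIOLOGY",
    "kardiologie": "CARDIOLOGY",
}

def normalise_name(name: str) -> str:
    """Crude normalisation: lower-case, drop the dots from initials."""
    return " ".join(part.strip(".").lower() for part in name.split())

def standardise(record: dict) -> dict:
    return {
        "name_key": normalise_name(record["name"]),
        "specialty": SPECIALTY_MAP.get(record["specialty"].strip().lower(), "UNMAPPED"),
    }

crm_record = {"name": "John Smith", "specialty": "Cardiology"}
claims_record = {"name": "J. Smith", "specialty": "heart medicine"}
print(standardise(crm_record), standardise(claims_record))
# Both now map to CARDIOLOGY; fuzzy name matching (not shown) would then
# decide whether "john smith" and "j smith" refer to the same HCP.
```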
While AI continues to improve in handling inconsistencies, its success still hinges on the quality of the data it's trained on. This is especially critical in commercial biopharma, where data is often fragmented, sparse and inconsistent, disrupting AI’s ability to generate meaningful insights.
Bayer AG's Journey to AI-ready and Globally Standardised Data
Overcoming data consistency challenges requires an organisation-wide approach. Some biopharma leaders are already making strides by prioritising global data standardisation to connect data and run advanced analytics initiatives.
For example, Bayer AG sought to create a 360-degree customer view to provide its field teams with comprehensive
insights before engaging with HCPs. However, data silos across geographies made it challenging to achieve a unified view.
Stefan Schmidt, group product manager at Bayer AG, led the company’s data harmonisation efforts. Schmidt understood that AI insights would remain unreliable without a centralised, accurate data foundation. “Our global data landscape was fragmented – different countries relied on different sources. To see the full picture, we needed a unified customer master,” Schmidt explains.
By harmonising data across geographies and functions, Bayer eliminated inconsistencies and improved accessibility. The company consolidated key data sources – CRM, engagement history, and customer profiles – into a single, intuitive platform for its sales teams.
“In just weeks, we developed a solution that our teams genuinely valued,” Schmidt shares. With a single, connected source of truth, Bayer AG is now positioned for scalable, AI-driven insights across the organisation.
How Commercial Leaders Can Scale AI
Bayer AG's experience demonstrates the power of a globally standardised data foundation and the importance of making it a strategic priority to scale the impact of AI.
To avoid the common pitfalls, commercial leaders must address three key data challenges:
1. Business: Moving AI Pilots from Isolation to Execution
A clear AI strategy, aligned with business priorities, is the strongest predictor of success. Many organisations run local pilots without considering scalability, repeatedly building country-specific solutions based only on country data. This approach prevents data from being connected across countries and limits AI’s ability to generate cross-country insights.
To effectively scale AI efforts, commercial leaders should:
• Align AI priorities with long-term business goals to ensure they address high-impact opportunities rather than short-term experimentation.
• Collaborate across functions – data, analytics, digital, and IT – to build a scalable AI roadmap with defined resources, timelines, and investments.
• Establish governance structures that support AI adoption at an enterprise level, ensuring consistency and alignment across regions.
2. Data and Analytics: Establishing Global Data Standards
Once a strategic direction is set, data and analytics teams can ensure access to high-quality, globally standardised, connected data. Piecing together country-specific data will make deploying initiatives across different markets challenging.
To overcome fragmentation, organisations should:
• Standardise data structures globally, ensuring that AI models trained in one region can be applied seamlessly worldwide.
• Invest in connectable data assets that integrate customer, sales, and engagement data across the organisation.
• Continuously refine data quality, ensuring AI models are built on accurate, harmonised data that supports enterprisewide decision-making.
3. Digital and IT: Reducing Integration Complexity
Technology teams play a pivotal role in making AI scalable by reducing data friction, eliminating costly integrations, and breaking down data silos.
To support AI efforts, technology teams should:
• Align data models across systems to prevent inefficient data mapping and redundant integrations.
• Evaluate process inefficiencies such as third-party access (TPA) agreements that slow down data flow and require unnecessary administrative work.
• Implement scalable data governance frameworks that streamline AI deployment across multiple markets.
Your Data Defines AI's Possibilities
AI adoption in commercial biopharma is accelerating, increasing
the need for high-quality, connected data for more personalised engagement.
Approaching data standardisation with the same urgency as defining AI strategy and infrastructure is critical. After all, the real question isn't, “How can I use AI?” but “How can I make my data work for AI?”
Karl Goossens is Director of OpenData Strategy at Veeva. Since joining the company in 2021, he has played a key role in building Veeva’s Commercial Analytics capabilities in Europe, before shifting his focus to advancing the company’s OpenData offering. Karl holds a Master’s in Process Engineering from ETH Zurich and an MBA from IESE Business School in Barcelona.
AI and Machine Learning Subsection
Regulatory Impact Assessment is Obvious Next Target for GenAI, Experts Conclude
When anything in a product's make-up or manufacture changes, a whole chain of events is triggered, starting with an assessment of the regulatory impact in each regulatory jurisdiction, swiftly followed by required actions. Any delay or omission could be costly, so smart process automation offers attractive potential. Regulatory experts Preeya Beczek and ArisGlobal's Agnes Cwienczek scope the opportunity.
AI’s potential to transform speed and efficiency, and improve accuracy, has already been demonstrated across a number of Regulatory use cases. These early successes have instilled industry confidence in what the technology can do.
Next in companies’ sights is regulatory impact assessment as part of overall product change control (the structured process to assess, document and implement changes to products). Intensifying pressure to bring medicines and updates to market quickly is galvanising Regulatory teams to work in more streamlined ways. Without recourse to artificial intelligence (AI), this is becoming increasingly unviable.
As well as being crucial for maintaining product quality, safety, and efficacy, effective product change control is critical in ensuring compliance with regulations and managing risk. Understanding the ramifications of any changes to products (planned or reactive) requires meticulous regulatory impact assessments for each market affected.
Yet this process adds little immediate value beyond risk mitigation and is notoriously protracted and labour-intensive, not to mention time-pressured. It is becoming increasingly challenging too, as regulators expand and refine their expectations in response to their own efforts to balance speed to market with drug quality and patient safety. The appeal of AI-based intervention, as an aid to expedited delivery, is clear. In an industry survey last year, regulatory impact assessment was cited as one of the top target use cases for AI automation support by senior regulatory professionals.
Addressing Key Process Pain Points
So, how can AI help? As the technology’s capabilities continue to advance and mature, its scope to streamline processes is becoming both sharper and more comprehensive. Not only can AI find and distil key insights at speed, it can also be taught what’s important and differentiate at a highly granular level what good looks like, what is urgent and crucial, and what specifically needs to be enhanced and adapted to maintain compliance and a positive regulator response.
Core AI capabilities are already established in a life sciences regulatory context. Already today, pharma companies are using AI to hone marketing authorisation applications and maintain
registrations. This is bringing welcome relief to Regulatory departments beset with multiplying and evolving workloads, entrenched manual processes (e.g. creating documents or summaries from scratch, extracting data manually, uploading agency correspondences), and poor visibility across departmental boundaries (due to siloed systems, duplicated information recording, and disjointed ways of working).
Regulatory impact assessment is the logical next target for AI. Fronting any process involving a product change, the activity bears an inherent time pressure. If an urgent safety change comes in, the associated regulatory impact assessment typically needs to be performed within hours, not days.
That’s irrespective of the extensive scouring and reviewing of information this will entail. Among the immediate considerations are: “What did we present to the authority last time?”, “What does our label say?”, “How soon must the change be implemented/within what timeframe, and which documentation is required?” All of which requires extensive searching and referencing of diverse and often unconnected sources, including manual lookup of non-indexed (unstructured) data buried in static documents.
Often, the investigative work extends beyond HQ too, spanning feedback loops from affiliates about the current status and local regulations, information which may be recorded in different languages.
When a product change triggers a regulatory impact assessment, this will typically happen first above the individual country level. That assessment then has to be repeated to some degree by the local operation where national licences are involved. Each country will then decide whether and when it needs to make a change (e.g. reflect it in product labelling) and update its registration or notify the relevant health authority. Is it a case of “do first, then tell”, for instance? And what of the manufacturing sites where the product is held? When will the change be rolled out? Will a grace period be required? How urgent is it: can it wait for the next print run?
Moreover, much of this activity will need to take place in parallel, to support forward planning – demand planning, supply planning, materials availability, and so on. And the associated safety/regulatory changes will need to follow this chain of events very promptly.
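To make the moving parts of the cascade just described more concrete, the sketch below shows one way a product change and its per-market assessments might be modelled. It is purely illustrative: the field names, urgency categories and helper method are assumptions made for this article, not any particular vendor's or company's data model.

```python
# Illustrative only: a minimal data model (assumed names) for tracking a product
# change and the per-market regulatory impact assessments it triggers.
from dataclasses import dataclass, field
from datetime import date
from enum import Enum
from typing import Optional


class ChangeUrgency(Enum):
    URGENT_SAFETY = "urgent safety"   # assessment needed within hours
    ROUTINE = "routine"               # can follow standard change-control timelines


@dataclass
class MarketAssessment:
    country: str
    submission_required: bool          # does the national licence need updating?
    do_and_tell: bool                  # implement first, then notify the authority?
    labelling_update_needed: bool
    grace_period_days: int = 0         # e.g. existing printed stock may be used up
    implement_by: Optional[date] = None


@dataclass
class ProductChange:
    change_id: str
    description: str
    urgency: ChangeUrgency
    markets: list[MarketAssessment] = field(default_factory=list)

    def markets_needing_immediate_action(self) -> list[MarketAssessment]:
        """Markets where a submission is required and no grace period applies."""
        return [m for m in self.markets
                if m.submission_required and m.grace_period_days == 0]
```

Even a skeleton like this makes it easier to see at a glance which markets cannot wait for the next print run and which can absorb the change on a longer timeline.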
All of this adds to the complexity of product change control, and the work it generates. In the 2024 survey, 55% of senior regulatory professionals actively expressed interest in advanced, AI-enabled technology support for the task, to relieve the intensity of the workload and speed up delivery. As many as 97% agreed that AI-enabled automation would be useful in identifying the direct impact of product changes.
Where to Start
The potential for AI to help cut through regulatory impact assessment as part of product change control is significant, but where should organisations start?
A practical approach is to break the end-to-end process down and consider the individual stages where intelligent automation could really make a difference. With so many variables in the make-up, structure, and focus of individual organisations, it is unlikely that one size will fit all. Data and technology-readiness will have a bearing on what’s possible now and what is likely to deliver the best results.
Starting small is advisable, focusing on one or two particular product lines, or a specific region or country. The key is to identify a painful problem that needs to be overcome, where AI could present a solution.
As AI assumes the detailed exploratory work, process stakeholders (central regulatory professionals, local regulatory representatives, plus those operating at a manufacturing level, demand and supply chain level, and in Quality and Safety) can start to align more closely and collaborate more effectively on next actions.
Even just speeding up the review process in the initial assessment (locating and searching all of the information, and determining where efforts need to be concentrated) will empower teams to move more swiftly in determining and executing next moves. The ability to automatically scan the latest regulatory intelligence in different markets, and consult previous Agency exchanges, can then help further expedite next steps, or at least pinpoint where supplementary insights may be needed where the latest local requirements are less clear.
The more embedded AI becomes in the end-to-end process, the more the gains will multiply. Where an AI tool is pulling information from several sources into one place, teams can be ready to review and validate the findings. Generative AI tools can help with structured content authoring, meanwhile, or swiftly bring a document from version zero to a solid first draft, knowing what data to pull in, and where to find it.
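As a rough illustration of that “pull it into one place, then draft” pattern, the sketch below gathers change-relevant fragments from several hypothetical sources and assembles a first-draft summary for human review. In a real deployment, a generative model and validated connectors would do this work; here a simple keyword filter and a plain template stand in for them, and all source names and content are invented.

```python
# Illustrative only: aggregate change-relevant snippets from siloed sources and
# produce a "version zero" draft for a human reviewer to validate and refine.
from dataclasses import dataclass


@dataclass
class SourceExtract:
    source: str      # e.g. a prior variation approval letter or the current label
    excerpt: str


def collect_relevant_extracts(sources: dict[str, str], keywords: list[str]) -> list[SourceExtract]:
    """Return sentences from each source that mention any of the change keywords."""
    hits: list[SourceExtract] = []
    for name, text in sources.items():
        for sentence in text.split("."):
            if any(k.lower() in sentence.lower() for k in keywords):
                hits.append(SourceExtract(source=name, excerpt=sentence.strip()))
    return hits


def draft_assessment(change_summary: str, extracts: list[SourceExtract]) -> str:
    """Assemble a first-draft impact assessment summary for human review."""
    lines = [f"Proposed change: {change_summary}", "", "Relevant prior information:"]
    lines += [f"- ({e.source}) {e.excerpt}" for e in extracts]
    lines += ["", "Reviewer to confirm: submission category, implementation timeline, labelling impact."]
    return "\n".join(lines)


# Example usage with invented content
sources = {
    "2023 variation approval letter": "The agency approved the supplier change. Stability data must accompany future changes.",
    "Current label": "Store below 25 degrees C. The excipient list includes lactose.",
}
print(draft_assessment("Replace excipient supplier",
                       collect_relevant_extracts(sources, ["supplier", "excipient"])))
```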
Governance becomes very important in all of this, as regulatory teams turn increasingly to AI to take over the administrative heavy lifting. However smart and well-trained the AI capabilities are, professionals should not defer to the technology to make decisions for them. Cross-functional teams will still need to agree whether and where a change is applicable, whether it needs to be made now or can be deferred, and when it should be reported to the relevant regulatory body, for instance.
Organisations must have a comprehensive plan to share the vision and upskill their people, too. This will help ensure that teams across their R&D operations are optimally equipped to harness data and documents more efficiently, understanding both the potential of, and the need for care and due diligence in, applying AI-powered automation to regulatory processes.
REFERENCES
1. Survey: Unsustainable Regulatory Workloads Leave No Choice About AI Adoption, ArisGlobal/Censuswide, November 2024: https://www.arisglobal.com/media/press-release/survey-regulatory-workloads-ai-adoption/ (full survey report: https://www.arisglobal.com/resources/regulatory-industry-survey/)
Preeya Beczek
Preeya Beczek, MD of consulting firm Beczek.COM, is a regulatory affairs and compliance expert with over 26 years’ industry experience.
Email: preeya@beczek.com Web: www.beczek.com/wp/
Agnes Cwienczek
Agnes Cwienczek is Director of Product Management at ArisGlobal, with a remit including the provision of business process and data management expertise in the areas of Regulatory Information Management, Document Management, Submission Management and Labelling Management. Agnes previously worked at Merck in Global Regulatory and Quality Assurance, during her two decades at the frontline of regulatory information management.
Smarter Temperature Control and Excursion Management Prevent Clinical Trial Disruptions
When investigational medicinal products (IMPs) are exposed to conditions outside of their approved storage range, sponsors need to be ready to act or risk the integrity and success of clinical trials. Temperature excursion events not only have the potential to compromise the stability, efficacy and safety of the affected IMP; they may also distort clinical outcomes by undermining the reliability of trial data, increase risk to patients and trigger regulatory concerns via unreported or poorly documented excursions. The commercial and operational impacts associated with temperature control failures can be equally significant, with the potential to delay timelines, increase costs and cause reputational damage.
Considering the growing complexity of modern clinical trials and the sensitivity of many biologics and advanced therapies currently in the R&D pipeline, avoiding this domino effect requires effective mitigation and management. To keep patients safe and clinical trials on track, sponsors must implement a combination of proactive strategies to prevent temperature excursions from occurring during drug transportation and storage, and implement a robust, time-efficient process that enhances detectability throughout the entire clinical supply chain.
Strategies for Enhanced Temperature Excursion Prevention
In clinical trials, much like in medicine, prevention is better than cure. While it's unrealistic to eliminate every excursion risk, minimising their occurrence is crucial to protecting patient safety and maintaining study timelines.
The foundation of any effective prevention strategy lies in understanding the risks, where they exist and what happens when they are realised. IMP can be exposed to adverse temperatures during storage or transit. From there, two outcomes are possible. Either the excursion is reported, or it isn’t. Both carry significant consequences.
Reported excursions can result in drug rejections, delays in resupply, and, in worst-case scenarios, patient dropouts. This can derail trials and force sponsors to invest in additional recruitment activity. Conversely, unreported excursions may lead to compromised products reaching patients, potentially causing adverse reactions – an even more critical concern.
Prevention begins with pinpointing when and where excursions are likely to occur. While the audited facilities of manufacturers and CMOs offer robust control and monitoring, visibility sharply declines once IMP begins its journey to sites and patients. At customs, for example, shippers can sit for long periods of time, and boxes can be opened or stored incorrectly, increasing risk. At clinical sites, which often lack the rigorous processes of earlier stages, risk and detectability challenges peak, making this the most vulnerable point in the supply chain from a temperature management perspective.
With risk hotspots clearly identified, sponsors can implement proactive strategies. These include using validated phase-change shippers that maintain strict temperature ranges and considering real-time temperature monitoring devices, balancing cost with benefit. Collaborating with experienced distribution partners and selecting premium courier services that understand the nuances of temperature-controlled shipments is equally vital. This includes ensuring proper handling, such as timely dry ice replenishment. But perhaps the most overlooked element is site-level compliance management. Educating clinical site staff on proper shipment handling (such as stopping temperature monitors to prevent false excursions) and embedding robust compliance processes for the return of all temperature data from shipments are essential components of a successful prevention plan.
Improving Detectability with a Focus on People, Data, and Technology
Decreasing the likelihood of temperature excursions with effective prevention strategies is half the battle; increasing the detectability of excursions when they do occur is the other half. To achieve this, sponsors need to adopt robust excursion management processes built around people, data, and technology. By doing so, sponsors can enhance patient centricity by meeting their primary objective of ensuring patients have access to safe, timely IMP. It also means that trials can continue to run smoothly and successfully, which, of course, benefits future patients.
Before exploring how to improve detectability, particularly at clinical sites, it’s important to consider why there’s a need to. While many sponsors have confidence in their existing processes for identifying and reporting site-based temperature excursions, they are often not as robust as they may seem.
This was a lesson a large pharmaceutical company recently learned the hard way during a routine FDA audit, where 12 previously unreported temperature excursions – some dating back over two years – were discovered in site temperature logs. Despite frequent site visits, these excursions had gone undetected by the sponsor’s clinical research associate, whose broad remit limited their ability to closely monitor temperature data. To address the issue, temperature management specialists from Almac conducted a chronological review of kit-level data to evaluate the cumulative impact of the excursions.
This comprehensive analysis identified the number of excursions associated with each kit, enabling a swift adjudication process. Fortunately, the findings confirmed that no product rejections were necessary, as the safety and efficacy of the affected kits was proven to have remained uncompromised. This enabled the sponsor to progress with their clinical trial
but also highlighted a clear need for a more robust process moving forward. As such, this sponsor has since extended its partnership with Almac to boost site compliance with monthly proactive collection and review of site temperature logs by Almac’s global team of temperature management experts. This is drastically reducing the window of risk and, by storing data centrally, is enhancing visibility to support more timely detection of excursions and improved patient safety. Improving detectability with a focus on people, data and technology can help other sponsors avoid a similar situation, which in this case had the potential to cause catastrophic disruption and reputational damage.
Where the people element is concerned, it’s important to answer three key questions: what can you ask clinical sites to do? What can your team manage? And when do you need help from external parties? For sites, processes need to be easy, site staff need clarity and direction, and time needs to be spent considering how sites can be best supported to remain compliant with reporting processes.
For sponsor teams, consider what resource is available and whether the level of experience aligns with what’s being asked. Consider also the size of the study and the impact it will have on workload. Finally, think about whether the team will have centralised access to data. It may be prudent to bring in experts from external parties to support excursion management, especially if sponsors are working across different time zones, struggling to manage site compliance, finding it difficult to determine what has been shipped versus what data has been returned, or attempting to operate with disparate data.
To improve detectability of excursions in the supply chain, it’s important to make data a priority. It is not sufficient to simply collect data relating to excursions: the absence of data does not prove the absence of excursions, and regulators are acutely aware of this. A robust approach demands full visibility of, and access to, all relevant temperature data. Understanding what data is needed to both satisfy regulators and inform effective adjudication is key, as is knowing when the data is needed: for instance, when temperature monitor data from shipments should be available, when excursions should be notified, and how this data will be managed at clinical site storage facilities. Introducing mechanisms to easily collate and locate data, being able to identify when there are gaps in the data, having a process to manage stability updates and establishing a process to cumulatively track excursions are other important considerations.
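As a simple illustration of the “absence of data is not absence of excursions” point, the short sketch below reconciles what has been shipped against the temperature data actually returned, so that gaps are flagged for follow-up rather than assumed to be clean. The shipment identifiers are invented for the example and do not reflect any particular system.

```python
# Illustrative reconciliation of shipped kits vs. returned temperature data.
def find_missing_monitor_data(shipped_ids: set[str], returned_ids: set[str]) -> set[str]:
    """Shipments with no returned monitor data; these are gaps to chase, not clean records."""
    return shipped_ids - returned_ids


# Example usage with invented shipment identifiers
shipped = {"SHP-1001", "SHP-1002", "SHP-1003"}
returned = {"SHP-1001", "SHP-1003"}

for shipment in sorted(find_missing_monitor_data(shipped, returned)):
    print(f"Follow up with site: no temperature data returned for {shipment}")
```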
If these questions cannot be easily answered, it’s probably time to think about introducing technology. Knowing when it makes sense to introduce systems to streamline temperature management processes, how different stakeholders in the supply chain will need to interact with it and what support and implementation will entail are all points to consider. From temperature management systems to IRT, ensuring appropriate
levels of system validation and justifying investment with a cost/benefit analysis is a sound first step.
A Smarter Approach to Decreasing Response Time
There’s a lot to consider before making an informed adjudication decision – from monitor data to confirmation of what exactly was contained in the compromised shipment, to which drug was affected by the excursion at a clinical site, to stability information and so on. Piecing this data together from various locations, such as email and quality management systems, slows down the process. It also leaves sponsors vulnerable to vital data falling through the gaps and decisions being based on incomplete information. Minimising the risk associated with temperature excursions demands a smarter approach to decreasing response time, while prioritising complete and trustworthy data needed to support effective adjudications.
This is made possible by centralising all temperature data into one validated, integrated and easily accessible system that provides instant visibility over a study drug’s complete temperature history. Furthermore, with data centralised into one intuitive and intelligent temperature management system, significant efficiencies can be gained via process automation.
Auto adjudication functionality, for example, can provide sponsors with an immediate, on-screen update on whether supplies affected by an in-transit excursion are acceptable for use. This decreases response time, meaning patients can be dosed safely and to schedule with acceptable drug, even after an alarm during transit. This ensures continued patient centricity and streamlined resource management by helping to avoid the need to reschedule patients while awaiting the outcome of in-transit excursion adjudications. Embracing automation is also good for site staff, who benefit from a quick, easy, no-logon-required validated process that ensures the correct monitor is uploaded and associated with the correct shipment. This means sites can easily report and obtain rapid, trusted resolutions for in-transit excursions.
Centralising data and embracing smart technology can also help sponsors calculate the impact of cumulative excursion events with ease. By tracking and deducting all time out of conditions at the kit level in a validated system, it is possible to enhance patient safety by ensuring that a drug’s cumulative time out of conditions never exceeds its allowable stability limits. Full lifecycle reporting and centralised data also facilitate kit-level visibility of remaining time out of conditions once kits are allocated med IDs. A final benefit is audit readiness: with an end-to-end audit trail of deductions made throughout the supply chain, sponsors can access the information they need to satisfy regulators in a few clicks.
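The following sketch illustrates the underlying bookkeeping in simplified form: each excursion is deducted from a kit-level allowance of time out of conditions, and an adjudication outcome falls out of the remaining budget. It is an assumed, single-temperature-band simplification for illustration only, not a description of any validated system, and the figures are invented.

```python
# Illustrative only: cumulative time-out-of-conditions tracking at kit level,
# with a simple adjudication outcome derived from the remaining allowance.
from dataclasses import dataclass, field


@dataclass
class Kit:
    kit_id: str
    allowed_hours_out: float                  # stability budget for time out of conditions
    excursions: list[float] = field(default_factory=list)   # duration of each event, in hours

    def record_excursion(self, hours: float) -> None:
        """Log a new excursion so it is deducted from the kit's cumulative allowance."""
        self.excursions.append(hours)

    @property
    def remaining_hours(self) -> float:
        return self.allowed_hours_out - sum(self.excursions)

    def adjudicate(self) -> str:
        """Immediate outcome: acceptable to use, or quarantine pending expert review."""
        return "ACCEPTABLE" if self.remaining_hours >= 0 else "QUARANTINE"


# Example usage with invented figures
kit = Kit(kit_id="KIT-0042", allowed_hours_out=72.0)
kit.record_excursion(6.5)    # in-transit alarm
kit.record_excursion(12.0)   # site refrigerator failure
print(kit.adjudicate(), f"({kit.remaining_hours:.1f} hours of allowance remaining)")
```

In practice the deduction logic would also account for temperature bands and product-specific stability data, but the principle of a running, auditable budget per kit is the same.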
Supplying with Care in a Complex Clinical Trials Landscape
In today’s complex clinical trials landscape, avoiding the domino effect caused by temperature excursions is not just a matter of operational efficiency – it’s a critical component of patient safety and trial integrity. By implementing proactive strategies to prevent temperature excursions from occurring during drug transportation and storage, enhancing detectability with a focus on people, data and technology, and decreasing response time, sponsors can minimise risk. In doing so, it’s possible to streamline clinical trial operations, uphold regulatory compliance and – crucially – promote safe, timely supply to patients.
Sarah McAliskey
Sarah McAliskey is Temperature Services Manager at Almac Clinical Services. In this role, she works closely with clients to understand the challenges they face in distributing temperature-sensitive drug product and to advise on the best solutions for efficient management of temperature data on a global scale. Sarah joined Almac in 2016 and has worked with a large number of pharma and biotech companies, initially developing proposals to fulfil packaging and distribution needs and aiding in the successful delivery of a range of clinical trials. Prior to joining Almac, she worked in the clinical diagnostics industry, gaining experience in solutions for shipping temperature-sensitive product in challenging climates. Sarah holds a BEd in Business Studies from Queen’s University Belfast.
Media and Communications
IPI
Peer Reviewed, IPI looks into the best practice in outsourcing management for the Pharmaceutical and BioPharmaceutical industry.
www.international-pharma.com
JCS
Peer Reviewed, JCS provides you with the best practice guidelines for conducting global Clinical Trials. JCS is the specialist journal providing you with relevant articles which will help you to navigate emerging markets.
www.journalforclinicalstudies.com
IAHJ
Peer Reviewed, IAHJ looks into the entire outsourcing management of the Veterinary Drug, Veterinary Devices & Animal Food Development Industry.
www.international-animalhealth.com
IBI
Peer reviewed, IBI provides the biopharmaceutical industry with practical advice on managing bioprocessing and technology, upstream and downstream processing, manufacturing, regulations, formulation, scale-up/technology transfer, drug delivery, analytical testing and more.
www.international-biopharma.com
PNP
Pharma Nature Positive is a platform for all stakeholders in this industry to influence decision-making by regulators, governments, investors and other service providers to achieve Nature Net Positive Results. This journal will give pharma the ability to choose the right services to attain this goal.
www.pharmanaturepositive.com
PHARMA POD
‘Putting science into conversation, and conversation into science.’ Join some of the most esteemed and integral members of the Drug Discovery & Development world as they offer insights and introspection into the latest movements, discoveries and innovations within the industry.
senglobalcoms.com
Cell 2024: Accelerating Cell & Gene Therapy from Research to Commercialisation
Cell 2024, held in London from 6–8 November 2024, was a three-day summit dedicated to advancing the full cell and gene therapy (CGT) value chain – from discovery through to scalable manufacturing and regulatory strategy. Cell 2024 brought together over 450 attendees from biopharma, biotech, academia, regulatory bodies and technology providers, making it one of Europe’s most focused and collaborative CGT events. With 14 content tracks and a bustling exhibition floor, the event offered an essential platform for networking, knowledge exchange, and identifying new innovations across the cell therapy pipeline.
Exploring the Full CGT Value Chain
Cell 2024 offered tailored programming across three co-located conferences:
• Cell Culture & Bioprocessing
• Advanced Therapy Development
• Cell & Gene Therapy Manufacturing
Each programme delivered a mix of keynote talks, technical case studies, and interactive panel discussions, covering critical areas such as cell line engineering & development, supply chain & logistics, and iPSC and stem cell therapy development. In total, over 80 speakers shared their insights, including senior experts from Bayer, Resolution Therapeutics, Galapagos, the MHRA, Cell & Gene Therapy Catapult, GOSH, and King’s College London.
From Scientific Innovation to Strategic Insight
The content at Cell 2024 spanned both early-stage R&D and commercial readiness, offering a unique perspective on how to navigate complex CGT development challenges.
Highlights included:
• A keynote from Ruben Rizzi, Senior Vice President, Global Regulatory Affairs at BioNTech SE, exploring novel cell therapy approaches in solid tumours.
• A closed-door executive panel with insights from regulatory leaders and investors on the global CGT investment landscape.
• Technical sessions covering real-world GMP implementation, AI in quality control, and next-generation viral and non-viral delivery tools.
Connecting Key Players in Cell & Gene Therapy
Cell 2024 was designed with collaboration in mind. Attendees could engage in prearranged 1:1 meetings, interactive roundtables, an evening drinks reception and networking breaks.
The exhibition floor featured 30+ solution providers offering tools and platforms for CGT research, manufacturing, characterisation and compliance, helping attendees discover new partnerships and technologies to accelerate their pipelines.
Key attendees included senior professionals from Pfizer, UCB, BioNTech SE, AviadoBio and iBET, as well as fast-growing start-ups and academic spinouts.
Looking Ahead to Cell 2025
With the success of Cell 2024 still fresh, Oxford Global unveiled its plans for Cell 2025, to be held on 11–12 November 2025 in London, UK. Promising to be the largest event yet, with over 1,000 attendees, 230 hours of pre-arranged 1:1 meetings, and 30+ hours of sessions, Cell 2025 is poised to build on the momentum of its predecessor and introduce more groundbreaking features.
At the core of Cell 2025 is a continued commitment to end-to-end value chain coverage. From discovery and development to manufacturing, supply chain, logistics, and commercialisation, the agenda covers every critical stage, featuring Cell Culture & Bioprocessing, Advanced Therapy Development, and Cell & Gene Therapy Manufacturing programmes.
In response to feedback and market trends, the 2025 agenda will place special emphasis on:
• AI & Machine Learning in Biomanufacturing: Exploring how data-driven tools are revolutionising process efficiency, predictive quality control, and adaptive systems.
• Plug-and-Play Platforms: As CGT development moves towards modular design, Cell 2025 will offer real-world insights into scalable platforms accelerating therapy timelines.
• Stem Cell & microRNA Technologies: With regenerative medicine gaining traction, next year’s conference will feature deep dives into stem cell bioprocessing and gene regulation strategies.
What Attendees Can Expect
Cell 2025 aims to be more than a scientific conference; it’s a marketplace for innovation, an incubator for partnerships, and a forum for the exchange of transformative ideas. Attendees can expect:
• Interactive Thought Leadership: Over 30 hours of expert-led presentations, collaborative roundtables, and live Q&As.
• A Global Technology Showcase: Premier vendors and tech innovators displaying the latest advancements in bioprocessing, analytics, automation, and digital twin technology.
• Enhanced Start-Up Zone & Poster Presentations: New voices and ideas from across the world of emerging biotech and academia.
• The Cell Leaders Awards 2025: Celebrating trailblazers in advanced therapy R&D, the awards ceremony will recognise individuals and organisations driving impactful change in the field. Nominate yourself or your team here: https://hubs.la/Q03pVyrs0
Find out more and register your interest now to be part of the movement advancing cell & gene innovation – faster, smarter, and together. https://hubs.la/Q03pVqsh0
SOLVING TODAY’S CHALLENGES, LEADING TO TOMORROW’S ADVANCES
August 18-21, 2025 | Boston, MA
Omni Boston Hotel at the Seaport + Virtual (NEW VENUE!)
1,500 Attendees | 300 Presentations | 14 Conference Tracks | 90 Sponsors/Exhibitors
Stream #1: UPSTREAM PROCESSING
Stream #2: DOWNSTREAM PROCESSING
Stream #3: AI AND DIGITALIZATION (New for 2025)
Stream #4: ANALYTICAL & QUALITY
Stream #5: GENE THERAPY
Stream #6: CELL THERAPY
Stream #7: RNA AND GENETIC MEDICINES
Stream #8: FORMULATION AND STABILITY
Where partnering becomes the science of connection
The premier biopharma partnering event, in Vienna, Austria, November 3-5, 2025. This flagship gathering is set to bring together over 5,500 life science professionals from 60+ countries, facilitating over 30,000 one-to-one meetings.
BIO-Europe offers unparalleled networking, innovative partnering programs, and industry insights, making it essential for anyone in the biotechnology value chain. With additional digital partnering days and the powerful partneringONE® platform, BIO-Europe 2025 promises maximum ROI and opportunities to drive your business forward.
Register now before July 25, 2025 and save up to €1,500.
Drug Discovery 2025
A festival of life science
21–22 October 2025 | Exhibition Centre Liverpool
Drug Discovery 2025 is Europe’s largest free-to-attend life science conference, bringing together 3,000+ researchers to explore cutting-edge innovations and technologies in drug discovery.
80+ speakers
150 scientific talks
200+ exhibitors
400+ scientific posters
3,000+ delegates
12 scientific tracks
Breakthrough Zone and Innovation Prize
Early Careers Professionals activities
Register here, free
Subscribe today at www.international-biopharma.com or email info@senglobalcoms.com
Advertisers Index
17th Annual Bioprocessing Summit – Page 55
A&M STABTEST – Page 3
Asymchem Laboratories – IBC
Bio-Europe – Page 56
Catalyst Clinical Research – Page 39
Collaborative Drug Discovery Inc – Page 35
Crown Bioscience – Page 5
Drug Discovery 2025 – Page 57
InsideReg – Page 13
Lightcast Discovery – BC
Newcells Biotech – Page 27
PCI Pharma Services – Page 45
Richter Biologics GmbH & Co. KG – Page 11
Senglobal Ltd – Page 53
Steribar Systems Ltd – Page 29
Taconic Biosciences – IFC
I hope this journal guides you progressively through the maze of activities and changes taking place in the biopharmaceutical industry.
IBI is also now active on social media. Follow us on:
Traditional antibody discovery assesses function too late, advancing ineffective binders and missing rare, potent candidates. The next-generation Envisia platform is purpose-built for function-focused screening at single-cell resolution, enabling earlier and more accurate selection of antibodies with true therapeutic potential.
Key benefits of Envisia
Dynamic, real-time functional insights
Multi-assay functional screening
Faster hit identification and single-cell recovery
Discover how Envisia can transform your antibody discovery. Visit www.lightcast.bio