
IBI - Volume 9 Issue 1



Enhancing Compliance and Efficiency

With a unified quality management system

CiPA-Ready hiPSC Cardiomyocytes

For Cardiac Safety Assessment

Operational Excellence at Scale: Why Integrated Manufacturing is the Future

How AI is Transforming Microbial Ingredient Production

Sponsor Company:

Cell and gene therapy page 38

PARI
Wallonia

DIRECTOR: Mark A. Barker

INTERNATIONAL MEDIA DIRECTOR: Anthony Stewart anthony@senglobalcoms.com

MANAGING EDITOR: Alice Phillips alice@senglobalcoms.com

EDITOR: Carla Devine carla@senglobalcoms.com

DESIGN DIRECTOR: Jana Sukenikova www.fanahshapeless.com

FINANCE DEPARTMENT: Akash Sharma accounts@senglobal.co.uk

COVER IMAGE: iStockphoto ©

PUBLISHED BY:

Senglobal Ltd., 2 Larch Court, London, United Kingdom, SE1 3GB

Tel: +44 (0) 203 3420130

Email: info@senglobalcoms.com www.international-biopharma.com

All rights reserved. No part of this publication may be reproduced, duplicated, stored in any retrieval system or transmitted in any form by any means without prior written permission of the Publishers.

The next issue of IBI will be published in Summer 2026.

International Biopharmaceutical Industry ISSN No. 1755-4578.

The opinions and views expressed by the authors in this journal are not necessarily those of the Editor or the Publisher. Please note that although care is taken in the preparation of this publication, the Editor and the Publisher are not responsible for opinions, views, and inaccuracies in the articles. Great care is taken concerning artwork supplied, but the Publisher cannot be held responsible for any loss or damage incurred. This publication is protected by copyright.

© 2026 Senglobal Ltd.

Volume 9 Issue 1 – Spring 2026

04 Foreword

TALKING POINT

06 Wallonia’s Long Game in Biopharma

The southern region of Belgium, Wallonia, has grown into a major European hub for biopharma research and manufacturing. Ludovic Waha of Wallonia Export & Investment Agency (AWEX) discusses some of the factors that have made Wallonia, and Belgium more broadly, an appealing destination for life science researchers, entrepreneurs and investors.

REGULATORY AND COMPLIANCE

08 Enhancing Compliance and Efficiency with a Unified Quality Management System: Why CDMOs should consolidate their QMS landscape and how to do it well

Quality management systems (QMS) that are not continuously challenged and optimised pose compliance risks, slow down operations and consume valuable resources. Birte Müller of Richter BioLogics evaluates how selecting a unified, GMP-compliant quality management system with foresight is an investment in an efficient working environment.

12 2026 Predictions: How AI and Regulation Will Reshape Biopharma Execution

Biopharmas have been advancing in how they connect data and processes across clinical, regulatory, safety and quality. Rik van Mol of Veeva argues that in 2026, the organisations that move fastest will be those that can automate data flow across clinical, regulatory, safety and quality on an inspection-ready development foundation and apply AI in effective ways that teams can trust.

PRECLINICAL

14 Bridging the Translational Gap: How PDX Models are Transforming Breast Cancer Research, Drug Discovery and Precision Oncology

Breast cancer is the most common cancer in women worldwide. It is highly heterogeneous with distinct breast cancer subtypes, posing challenges for diagnosis and treatment. Malathi Raman Srivastava of CancerTools offers insight into how breast cancer PDX models can pave the way to developing more effective personalised treatments and better outcomes for breast cancer patients.

RESEARCH/INNOVATION/DEVELOPMENT

16 CiPA-Ready hiPSC Cardiomyocytes for Cardiac Safety Assessment

Drug-induced arrhythmias remain a significant challenge in pharmaceutical development, often resulting in late-stage failure, regulatory rejection and post-market withdrawals. Vijay Bhaskar Reddy Konala and Amit Khanna of Yashraj Biotechnology highlight how CiPA-ready hiPSC-CM systems provide a robust foundation for accurate, efficient and ethically sound assessment of proarrhythmic risk, cardiotoxic liability and overall cardiac safety across the drug development continuum.

20 The Hidden Friction in Pharma-CRO Collaboration: When Data Sharing Undermines Data Security

The challenges of collaboration between pharmaceutical companies and contract research organisations (CROs) are widely discussed across the industry. Abraham Wang of CDD Vault argues that as the industry moves toward increasingly collaborative models of discovery, the ability to design data environments that preserve context, security and scientific continuity may be as vital as the experiments themselves.

TECHNOLOGY

22 How AI is Transforming Microbial Ingredient Production

Population growth, shrinking arable land and climate change are intensifying pressure on global food systems. Julie Rojas and Heykel Trabelsi of Smey explore how AI can increasingly complement, rather than replace, established biological modelling, enabling faster discovery, more efficient engineering and more reliable manufacturing of food and cosmetic ingredients.

MANUFACTURING AND PROCESSING

26 The ADC Era: Bridging Innovation and Manufacturability for Precision Biopharmaceuticals

Antibody–drug conjugates (ADCs) are at the forefront of targeted therapy, combining the precision of monoclonal antibodies with the efficacy of cytotoxic agents. Richard Lewis and Dr. Mattia Cassanelli of Biopharma Group argue that the most efficient organisations choose targeted equipment investments, purchasing instrumentation and equipment alongside specialist training that amplifies flexibility, supports scalability and complements their in-house capabilities for use on subsequent internal R&D projects.

32 Operational Excellence at Scale: Why Integrated Manufacturing is the Future Biopharmaceutical Standard

Demand for end-to-end, integrated manufacturing capabilities continues to amplify in the biopharmaceutical industry amid lingering operational and regulatory challenges. Kevin Sharp of Samsung Biologics argues that by consolidating technical, operational and digital capabilities, integrated CDMOs can deliver tangible outcomes across the value chain.

34 Vertical Integration: Building Resilience and Scalability in Nucleic Acid Therapeutics Manufacturing

The biopharmaceutical sector is entering a new phase shaped by the clinical and commercial momentum of nucleic acid therapeutics (NATs), which include mRNA-based products and oligonucleotide therapeutics such as siRNA and ASOs, as well as sgRNA used in gene editing programs. David Butler of Hongene suggests that for developers, shifting to an integrated partner will mean moving beyond fragmented vendor relationships toward a collaborative model that prioritises de-risked scaling, execution efficiency and programme continuity.

CELL AND GENE THERAPY SUBSECTION

38 Mitigating At-Risk Cell and Gene Therapy Application with Rapid Sterility Testing

Modern medicine is predicated on the safety provided by sterility assurance and aseptic practices. Delaney Novak and Timothy Francis of FUJIFILM Biosciences examine how RiboNAT™ retains the benefits of NAT for sterility assurance while also minimising the drawbacks associated with DNA-based methods, including false positives and decreased sensitivity.

42 From Policy to Practice: How Regulatory Momentum for New Approach Methodologies Is Accelerating the Adoption of Organ-on-a-Chip Technology

The biopharmaceutical industry is at an inflexion point. For decades, animal models have served as the backbone of preclinical research, supporting target validation, efficacy assessment and safety evaluation. Lorna Ewart of Emulate looks ahead to how the role of Organ-on-a-Chip technology is likely to evolve in tandem with broader changes in regulatory science.

46 Bridging the ADMET Translational Gap: How New Approach Methodologies and Organ-on-a-Chip Technology are Redefining Drug Development

Drug developers face challenges in translating preclinical findings to safe and effective human trials. Traditionally, preclinical testing relies on a phased approach that combines simple in vitro assays and whole animal in vivo studies. Dr. Yassen Abbas of CN Bio evaluates the translational gap that remains between these findings and the clinical outcomes.

50 The CRISPR Revolution: Redefining Therapeutic Strategy and Pharmaceutical Innovation

For decades, medicine has focused primarily on disease management. Dr. Koushik Yetukuri and Siddharth Kumar of the Chalapathi Institute of Pharmaceutical Sciences evaluate how genome editing is now transforming that paradigm as CRISPR technology enables precise modification of DNA within human cells, allowing scientists to correct defective genes at their source.

APPLICATION NOTE

29 A Quantitative Rationale for Hybrid Buffer Preparation Strategies in Biopharmaceutical Manufacturing

Biopharmaceutical manufacturing continues to evolve as upstream titres increase, therapeutic modalities diversify and continuous processing approaches gain wider adoption. Shannon J. Ryan of Avantor argues that by segmenting buffer demand, applying multi-criteria decision frameworks and maintaining flexibility as priorities evolve, organisations can ensure that buffer operations support overall process goals.

54 KODA: Advancing Laboratory Sample Management Through 2D Data-Matrix Readers

In modern scientific environments, the reliability of sample identification underpins the integrity of research, diagnostics and patient care. Megan Ricketts of Steribar highlights how the company is helping laboratories move confidently toward secure, data-driven operations by combining technical expertise with genuine customer engagement.

LABORATORY FOR ANALYSIS AND STABILITY TESTING

30 YEARS GMP EXPERIENCE

Full spectrum NBE/NCE analysis of human and veterinary drug substances and products at one site

FROM DISCOVERY TO DELIVERY … AND BEYOND Including development-related studies, technology development + validation, ready-for-market product testing, batch release, stability testing

SAMPLE STORAGE/ MANAGEMENT

ICH compliance, customizable conditions, and scalable capacity.

IN THE HEART OF EUROPE close to Pharma Hub Frankfurt Airport, temperature-controlled sample pick-up service possible

OUR ANALYTICAL PROTEIN CAPABILITIES INCLUDE:


Identity & Structure

LC–MS/MS peptide mapping, intact mass analysis, sequence confirmation

Purity & Heterogeneity

HPLC / UPLC (HILIC, RP, SEC, CEX/AIX) with UV, fluorescence, CAD, CE (CGE-SDS, CGE, CZE), icIEF, glycan profiling

Biological Activity

Potency assays, cell-based assays, ELISA

steven.watt@am-labor.de

www.am-labor.de

Foreword

Living in Germany, what typically comes to mind when I think of our Belgian neighbours are images of legendary cycling races over cobblestones and muddy winter tracks, exceptional chocolate, fantastic beer and, of course, a certain statue cheerfully urinating beer while occasionally dressed in a football shirt. It is a country rich in culture, character and quirks.

And yet, every time I attend biotechnology and biopharmaceutical events in Belgium, particularly in Wallonia, I find myself genuinely astonished. There is a rare intensity to the ecosystem, a high density of outstanding universities, deeply rooted research excellence and a startup scene that is not only vibrant but remarkably well connected. What stands out most is how naturally these elements interact. Science, entrepreneurship and policy do not operate in silos; instead, they reinforce one another in a way that many larger European neighbours, despite their scale and resources, still struggle to replicate.

This unique environment is no coincidence. It reflects a long-term vision, sustained public and private investment, and a clear commitment to translating scientific strength into tangible economic and societal impact. Our opening article, “Wallonia’s Long Game in Biopharma,” captures this trajectory, offering valuable insight into how the region has positioned itself as a leading European hub for life sciences.

This issue of the IBI Journal highlights several innovations that exemplify the direction in which the field is moving, particularly in cell engineering and advanced model systems that enable the development and manufacturing of increasingly individualised therapies. Technologies such as patient-derived xenograft (PDX) models and Organ-on-a-Chip platforms are redefining how we test efficacy and safety, bringing us closer to therapies that are tailored to the biology of individual patients and designed to maximise clinical outcomes.

Our feature on gene therapy further underscores this shift. We are now at a point where the conversation is moving beyond

And Finally…

Welcome to the first IBI journal of 2026. I hope everyone has had a good start to the year. It’s been a positive start over here at Senglobal, and thanks to our brilliant contributors we are able to bring you a spring edition full of interesting articles. This issue features a subsection on cell and gene therapy, which starts on page 38. We kick off this section with a feature from Delaney Novak and Timothy Francis of FUJIFILM Biosciences titled “Mitigating At-Risk Cell and Gene Therapy Application with Rapid Sterility Testing”. Lorna Ewart of Emulate offers an incredibly

IBI – Editorial Advisory Board

• Alistair Michel (MRSB) – Senior Scientist, Reading Scientific Services Ltd

• Cellia K. Habita, President & CEO, Arianne Corporation

• Deborah A. Komlos, Senior Medical & Regulatory Writer, Clarivate Analytics

• Francis Crawley, Executive Director of the Good Clinical Practice Alliance – Europe (GCPA) and a World Health Organisation (WHO) Expert in Ethics

• Hermann Schulz, MD, Founder, PresseKontext

managing disease toward the possibility of curing it. Advances in translational oncology demonstrate how PDX models are unlocking more precise approaches to breast cancer treatment, addressing the complexity and heterogeneity of the disease. At the same time, regulatory momentum is accelerating the adoption of Organ-on-a-Chip technologies, signalling a broader transformation in how preclinical research is conducted and validated.

Technological progress is also reshaping safety assessment. The development of CiPA-ready hiPSC-derived cardiomyocytes represents a significant step forward in predicting cardiac risk, offering more reliable, efficient and ethically grounded alternatives to traditional models. These systems provide a stronger foundation for evaluating cardiotoxicity across the drug development continuum and reducing late-stage failures.

Meanwhile, genome editing technologies such as CRISPR are redefining the very foundations of therapeutic strategy. By enabling precise modification of DNA within human cells, they open the door to correcting diseases at their source, fundamentally changing how we think about intervention and long-term outcomes.

Taken together, the contributions in this issue highlight a sector in transition, one that is becoming more precise, more predictive and more collaborative. They also underscore the importance of ecosystems like Wallonia’s, where the right conditions allow innovation not only to emerge, but to thrive and scale.

I hope this issue provides both insight and inspiration and encourages continued dialogue across disciplines, borders and sectors as we collectively shape the future of biopharma.

If any Belgians are reading, I am a big fan of the cobble-classics and your beer. I am open to invitations!

Dr. Steven A. Watt, CBDO (Chief Business Development Officer) at A&M STABTEST GmbH

interesting piece titled “From Policy to Practice: How Regulatory Momentum for New Approach Methodologies Is Accelerating the Adoption of Organ-on-a-Chip Technology”, which starts on page 42. This takes a look at the role organ-on-a-chip technology can have in the biopharmaceutical industry and how this can evolve with changes in regulatory science.

I hope you enjoy the articles ahead. With several conferences on our distribution list in the coming months, I may well have the opportunity to meet some of you soon. Either way, I look forward to receiving your white papers for our next issue!

Editor

• Lorna M. Graham, BSc Hons, MSc, Director, Project Management, Worldwide Clinical Trials

• Rafael Antunes, Vice President Business Development, Aurisco Pharmaceutical Europe

• Stanley Tam, General Manager, Eurofins MEDINET (Singapore, Shanghai)

• Stefan Astrom, Founder and CEO of Astrom Research International HB

• Steven A. Watt, CBDO (Chief Business Development Officer) at A&M STABTEST GmbH

RICHTER BIOLOGICS

Suhrenkamp 59, 22335 Hamburg, Germany

Phone: +49 40 55290-801

BusinessDevelopment@richterbiologics.eu

CONTACT US TO BRING YOUR PROJECT TO SUCCESS! BEST IN CLASS BIOLOGICS CDMO FOR OVER 35 YEARS!

YOUR PRODUCT – OUR COMPETENCE AND DEDICATION

Richter Biologics is your professional and experienced partner offering CDMO solutions from gene to product all from one source.

Richter Biologics: your expert for late stage and commercial production.

Wallonia’s Long Game in Biopharma

The southern region of Belgium, Wallonia, has grown into a major European hub for biopharma research and manufacturing. Ludovic Waha, Senior Life Sciences Business Developer at the Wallonia Export & Investment Agency (AWEX), discusses some of the factors that have made Wallonia, and Belgium more broadly, an appealing destination for life science researchers, entrepreneurs and investors.

How recently has the Wallonia region begun to prioritise investment in the biotech and pharma industries?

Most countries in Europe decided to strengthen their life science industry after the COVID period. For Belgium and Wallonia, we have been supporting and financing this industry for a much longer time, about 20 years. Historically, a number of companies, with the support of the Government, grew and became major global players such as GSK, Baxter and UCB. This success paved the way for new medium-sized and smaller companies and led to the emergence of a genuine, dynamic ecosystem.

The rapid expansion of this ecosystem has been built on a winning trio of strong collaboration between universities and research centres, public authorities, and the industrial sector, each playing a complementary and essential role in innovation, growth and industrial development.

What are some of the ways that the region has facilitated that growth?

We had some challenges years ago, one of which was real estate. I had the opportunity to meet CEOs from foreign companies who wanted to come to Wallonia. They explained to me, “We are looking for 10,000 square meters of labs,” and I was not able to give them solutions. At this stage, the government, but also the private sector, decided to invest in science parks. Today, we have the ability to promote labs and offices anywhere in the region and there are more than 100,000 square meters available for new companies arriving in the country.

Belgium has 11 million inhabitants, and the industry invests around €6 billion each year. That is a huge investment for a small country compared with Germany or France. We also have public funding for companies: the public funding agencies can propose dilutive funding from a few euros up to several million to support the development of companies.

How has it worked in terms of attracting talent and corporate investment from abroad?

Everyone in the world knows that Belgium is strong in the biopharma industry. We actively encourage corporate investment through a range of incentives. These include investment grants, support for R&D activities and tax incentives designed to lower labour costs and benefit from a reduced corporate tax rate. Furthermore, starting from this year, we are opening a brand-new EU Biotech Campus. It’s a 25-million-Euro investment, and with this biotech campus we’ll be able to attract talent from the whole of Europe and also worldwide.

What are some of the other advantages that Belgium and the Wallonia region can offer the biopharma sector?

We are really strong in terms of clinical trials; we rank number two per capita in Europe after Denmark. Wallonia offers a highly supportive environment to move rapidly from early development to clinical validation, thanks to a dense network of hospitals and specialised service providers able to manage regulatory requirements and accompany companies throughout the clinical trial process.

In broader terms of quality of life, what might a job-hunter find appealing about life in Wallonia?

You probably know that we can offer chocolate and beer, but on top of that, the quality of life is really high. Wallonia is located in the very heart of Europe, and within less than two hours you can be in London, Amsterdam or Paris. The cost of living is also much lower: housing per square metre in London, Paris or Frankfurt is three to four times more expensive.

Ludovic Waha

Ludovic Waha is a Senior Business Developer for Life Sciences at AWEX in Wallonia, Belgium, supporting local companies in their international development and promoting the region abroad. He holds a master’s degree in economics from the Louvain-la-Neuve University (Belgium), with a background in finance. Ludovic joined AWEX 15 years ago as a Project Manager for Foreign Investment Attraction before moving into the Life Sciences unit.

AT-Closed Vial® for advanced therapies

The Annex 1–compliant AT Closed Vial® delivers proven cryogenic CCI at -196°C with fully validated design, materials, manufacturing, and filling process. Trusted in 19 commercial product approvals.

Optimize your aseptic workflow with our single use liquid processing systems.

Regulatory and Compliance

Enhancing Compliance and Efficiency with a Unified Quality Management System

Why CDMOs should consolidate their QMS landscape and how to do it well

Quality management systems (QMS) that are not continuously challenged and optimised pose compliance risks, slow down operations and consume valuable resources. For contract development and manufacturing organisations (CDMOs), the challenge is amplified: systems must meet both regulatory requirements and diverse customer expectations, often across multiple sites and modalities. This article outlines how a validated, integrated QMS eliminates fragmented system landscapes, prevents media breaks and delivers measurable gains in speed, consistency and audit readiness. Using training, document and deviation management as core examples, it highlights practical levers, realistic workload calculations and a platform-based selection approach that scales.

When a QMS Turns into a Wave

In growing companies and fast-changing environments, adding “one more form,” “one more step,” or “one more approval” can feel like a pragmatic response to a change request by authorities, customers or internal findings. Over time, that mindset produces layered and locally optimised systems: paper here, spreadsheets there, a point solution somewhere else; each justified at the time, none optimised end-to-end. The result is a QMS wave: complex to navigate, costly to maintain and hard to govern consistently across sites or product lines.

Three questions reveal whether action is necessary:

• Are paper-based processes still used for core GMP activities despite growth, site diversity, or hybrid work?

• Do we accept media breaks, manual transfers and parallel systems that are interpreted differently from site to site?

• Are special cases embedded in the base workflow, rather than documented as controlled exceptions?

Most companies agree that the answer to the first two questions should be a clear no, but start debating the third one. A crucial shift is recognising that exception-proofing standard flows increases complexity, slowing decision making while adding little real compliance value. Optimisation is not about recreating the status quo in an app; it is about simplifying work, standardising and eliminating exceptions.

Where to Start?

Any initiative for optimising the QMS should start with a detailed understanding of business processes and cycle times. Only by quantifying the baseline can improvements be assessed both from a compliance perspective and an economic one. Map ownership, handoffs and media breaks, then design scalable, automated workflows that eliminate manual rework, duplicate data entry and signature loops. The following sections outline three key areas of a QMS, setting out the challenges and benefits of optimisation in each area.

Training: Making Complexity Manageable

Regulations require structured training programmes, ongoing refreshers and effectiveness checks linked to roles and responsibilities. While this can be done without a digital system, it is inefficient and error-prone at scale.

The Challenge

During onboarding, an employee must receive numerous trainings. A manufacturing employee at a CDMO with a diverse product portfolio can quickly be assigned more than 200 GMP-related trainings during onboarding, plus additional non-GMP trainings such as safety. Prioritising and structuring the content to enable effective training consumes resources for training delivery and coordination. As a company grows, the manual effort multiplies and leaves supervisors and training coordinators facing an almost unsolvable task. The consequences are missed trainings, resulting in deviations, and poorly trained employees.

Not only initial training, but also continuous training is required. For a production site with 200 employees, this can amount to 28,000 individual trainings per year. If only five minutes are needed for documentation per training, that equals over 2,300 hours per year, slightly more than one full-time employee. Documentation alone does not ensure that outstanding training is identified and followed up on.
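As a rough sanity check, the workload arithmetic above can be reproduced in a few lines. This is a minimal sketch: the training figures are the article's illustrative assumptions, and the annual working hours per full-time employee are an additional assumption of this sketch, not taken from the article.

```python
# Back-of-the-envelope estimate of manual training documentation effort,
# using the article's illustrative figures for a 200-person site.
TRAININGS_PER_YEAR = 28_000     # ~140 recorded trainings per employee
MINUTES_PER_RECORD = 5          # documentation time per training record
FTE_HOURS_PER_YEAR = 2_000      # assumed annual hours per full-time employee

hours = TRAININGS_PER_YEAR * MINUTES_PER_RECORD / 60
ftes = hours / FTE_HOURS_PER_YEAR

print(f"{hours:.0f} hours/year of documentation (~{ftes:.1f} FTE)")
# -> 2333 hours/year of documentation (~1.2 FTE)
```

Even with these conservative inputs, documentation alone consumes slightly more than one full-time position before any training content has been delivered or checked for effectiveness.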

Advantages of Well-Designed Training Management

When setting up a training system, a training matrix should be established that assigns training needs based on roles and responsibilities. Building such a matrix requires initial effort, but afterwards only a small amount of effort is needed to keep it up to date. Based on this matrix, all employees are trained uniformly, continuously and transparently, for employees, supervisors and Quality Assurance alike.

Implementing such a training system significantly reduces administrative tasks and shifts from chasing signatures to improving content (e.g., short videos for critical tasks, microlearning for recurrent risks), freeing time for careful training delivery and effectiveness checks, resulting in consistent, auditable and scalable training.

Document Management: From Paper Repositories to a Validated Document Management System (DMS)

Documents are the backbone of the QMS: SOPs, manufacturing instructions, risk analyses, validation and qualification reports and assessments, and executed GMP protocols. In mid-sized biotech/CDMO environments, the annual volume often reaches tens of thousands of documents. With such large quantities, it is difficult to keep a filing system consistent when it relies solely on the manual efforts of individual employees.

The Challenge

All documents must follow a defined process for creation, review, approval, publication, updating and archiving.

Regulatory and Compliance

Depending on company size and portfolio, a large number of GMP documents accumulate quickly. When looking at the different types of documents, differences quickly become apparent, resulting in different requirements for workflows. A global Standard Operating Procedure (SOP) or a manufacturing instruction typically requires a more complex workflow than monitoring protocols or cleaning reports. At CDMOs, customer involvement may be necessary. Nevertheless, all these documents belong to the DMS.

Another challenge arises when using protocols that are filled out by hand during manufacturing or testing. These documents must be registered and accounted for. If a medium-sized biotechnology company handles around 50,000 documents per year, this alone would occupy two full-time employees at a rate of five minutes per document. This does not yet include the effort required for archiving, which, in the case of paper documents, requires physical space in addition to human resources. Archiving becomes more complex when electronic signatures are used outside a validated electronic DMS: 21 CFR Part 11 compliance must be verified, and the documents, including all certificates, must be archived securely and unalterably in compliance with GMP.

A real-life challenge is that electronically signed documents created outside the electronic DMS must be transferred for archiving. Depending on the number of people responsible for the documents, the level of care can vary greatly, leading to an incomplete electronic archive, which in turn represents a high compliance risk. Documents that cannot be found are considered nonexistent.

Advantages of Well-Designed Document Management

When setting up a document management system, the existing document structure is analysed to identify differences and similarities. Instructional documents must certainly be treated differently from reporting documents. This results in different requirements, which are reflected in perhaps three different lifecycle processes. Firstly, the documents must be mapped to the new lifecycles. Once this initial work has been done, new documents are automatically assigned to the necessary workflows, user tasks are created and finally, the documents are clearly archived, providing employees with a central searchable library as a single source of truth for all authorised documents. This, in turn, has a direct impact on effectiveness. In addition, the system takes care of protocol creation and accounting, freeing up staff for other tasks.
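The routing step described above can be pictured as a simple lookup: each document type maps to one of a handful of lifecycle workflows, so new documents are assigned automatically instead of being filed by hand. The sketch below is hypothetical; the type names and the three lifecycles are invented for illustration and do not describe any specific DMS product.

```python
# Hypothetical mapping of document types to lifecycle workflows.
# Three lifecycles, as suggested in the text; all names are illustrative.
LIFECYCLES = {
    "sop": "instructional",                       # full review/approval/training flow
    "manufacturing_instruction": "instructional",
    "monitoring_protocol": "reporting",           # lighter review/approval flow
    "cleaning_report": "reporting",
    "executed_batch_record": "executed",          # register, reconcile, archive
}

def assign_lifecycle(doc_type: str) -> str:
    """Route a new document to its lifecycle, failing loudly on unknown types."""
    if doc_type not in LIFECYCLES:
        raise ValueError(f"No lifecycle defined for document type {doc_type!r}")
    return LIFECYCLES[doc_type]

print(assign_lifecycle("sop"))              # -> instructional
print(assign_lifecycle("cleaning_report"))  # -> reporting
```

Failing loudly on an unknown type is the point of the design: a silently mis-filed document is exactly the "cannot be found, therefore nonexistent" compliance risk described earlier.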

The use of fully validated and carefully designed DMS provides a tool that complies with regulations and can be used throughout a company's entire GMP area, improving its compliance.

Deviation Management: Standardise to Avoid Ping-pong

The handling of deviations offers a variety of possible workflows that all have the same goal: identify root causes, assess risk and implement effective actions. The freedom to design any workflow can result in slow, approval-heavy processes that cloud accountability and extend cycle times.

The Challenge

When developing these workflows, responsibilities and exact procedures must be defined in detail. Who acts when, and what is their task at that point? It must be defined at which point quality assurance must be involved. As a CDMO, customers must be informed of or involved in the investigation and assessment at defined stages. If the workflow is not fully thought through and every eventuality is included in the standard workflow, a company quickly runs the risk of ending up with inefficient processes and a ping-pong game between departments.

For example, if a deviation process involves five steps with two to three departments each, 10–15 approvals may be required. Each task within the deviation workflow requires a brief familiarisation with the deviation topic, at least five minutes, before the task can be performed. There is a difference between having to gain an overview two times versus five times. By consolidating work steps and eliminating only two individual steps involving two departments, a medium-sized company with 1,000 deviations per year could save over 300 hours of familiarisation time. Please keep in mind that during this time, no root cause analysis has been performed and no corrective or preventive measures have been defined.
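The savings claimed above follow directly from the stated assumptions, as this minimal sketch shows. All figures are the article's illustrative numbers, not measured data.

```python
# Familiarisation overhead saved by consolidating deviation workflow steps,
# using the article's illustrative figures.
DEVIATIONS_PER_YEAR = 1_000
STEPS_ELIMINATED = 2             # workflow steps removed by consolidation
DEPARTMENTS_PER_STEP = 2         # departments involved in each removed step
MINUTES_PER_FAMILIARISATION = 5  # minimum re-reading time before each task

tasks_saved_per_deviation = STEPS_ELIMINATED * DEPARTMENTS_PER_STEP  # 4 fewer hand-offs
hours_saved = (DEVIATIONS_PER_YEAR * tasks_saved_per_deviation
               * MINUTES_PER_FAMILIARISATION) / 60

print(f"{hours_saved:.0f} hours of familiarisation time saved per year")
# -> 333 hours of familiarisation time saved per year
```

Note that this counts only the overhead of repeatedly getting back up to speed; none of it contributes to root cause analysis or corrective and preventive actions.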

Another challenge to keep in mind is multi-site collaboration. If manufacturing is located at a different site than quality control, which may provide data for deviation assessment, both departments need access to all relevant data. Without electronic workflows, forms are filled out manually or electronically and circulated by hand or email. The risk that an item is overlooked on a desk or in an inbox is significant, and that in itself constitutes a compliance risk.

Advantages of Well-Designed Deviation Management

When setting up a well-thought-out deviation management system, it is important to examine the requirements of the majority of deviations. Special cases should not be considered when creating standard workflows, as these usually complicate the forms and require additional steps. By reducing the number of topics to those that are truly relevant, processes can be simplified. Of course, special cases must be carefully documented, but additional attachments are likely to suffice for this purpose. Don't fall into the trap of thinking that managers can only stay informed about what's going on in their department by signing documents. It's better to reduce the number of signatures and keep managers informed using well-designed reporting tools. This requires well-written, comparable deviation reports and sensibly linked data records.

Adhering to these principles results in a deviation management system that not only complies with regulations and ensures products and systems meet quality standards, but is also transparent and does not waste valuable resources.

How to Choose the Right System for Your Company

The number of providers of electronic QMS seems endless, so choosing the right system for your company is essential. Before a decision can be made, it is necessary to determine what the expectations are. Start by answering the following questions:

• Which QMS processes should be available?

• Do you want a fully validated standard system, or do you want a completely customised one?

• Should the system be scalable for growing companies or adding new processes?


• Are there interfaces with existing systems (e.g. ERP) that are essential?

• Do you want the market leader where your company is one of many customers or a small provider where your company is the most important client?

• Which stakeholders need to be involved in the decision-making?

• Do you want regular software updates, or would you prefer no changes after implementation?

This list is only a starting point for launching an optimisation project for your QMS. Take care not to get lost in the details, especially at the beginning.
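One lightweight way to keep such an evaluation structured and auditable is a weighted scoring matrix, agreed with the stakeholder group before demos begin. The criteria, weights, vendor names and scores below are purely hypothetical placeholders; the point is the method, not the numbers:

```python
# Minimal weighted scoring matrix for QMS vendor selection.
# Criteria, weights, vendors and scores are hypothetical examples.

criteria_weights = {
    "validated standard workflows": 0.30,
    "ERP interface": 0.25,
    "scalability": 0.25,
    "update policy fits change control": 0.20,
}

# Scores from 1 (poor) to 5 (excellent), agreed by the stakeholders.
vendors = {
    "Vendor A": {"validated standard workflows": 5, "ERP interface": 3,
                 "scalability": 4, "update policy fits change control": 4},
    "Vendor B": {"validated standard workflows": 3, "ERP interface": 5,
                 "scalability": 3, "update policy fits change control": 5},
}

def weighted_score(scores: dict) -> float:
    """Sum of each criterion score multiplied by its weight."""
    return sum(criteria_weights[c] * s for c, s in scores.items())

ranking = sorted(vendors, key=lambda v: weighted_score(vendors[v]),
                 reverse=True)
for name in ranking:
    print(f"{name}: {weighted_score(vendors[name]):.2f}")
```

Recording the weights up front also documents why a system was chosen, which is useful evidence during later audits of the selection process.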

Returning to the example of deviation management, let's look at how standard workflows can help. If, as suggested above, special cases are left out when defining workflows, the standard workflows most providers offer will cover the majority of cases. If these out-of-the-box workflows come from widely used providers, they can be considered industry standards that are frequently reviewed by authorities and customers for GMP compliance. Using standard workflows from established software providers can therefore be associated with a lower compliance risk.

Conclusion: Unified Systems Pay Off Twice

The individual aspects demonstrate the significant potential for optimisation. If the topics are addressed separately by different, well-chosen project teams, improvements in compliance and efficiency will certainly occur. However, this approach can also result in multiple systems that do not communicate with each other, or do so only with additional effort, leading to manual transfers, duplicate master data maintenance and suboptimal use of digital capabilities. Topics should therefore not be considered in isolation; project planning should include an overview of which QMS processes must be considered, so that candidate systems can be evaluated for suitability across all aspects, resulting in a well-thought-through roadmap. If the selection is made carefully and a single platform is chosen, further efficiency gains follow automatically: employees need to learn only the general operation of one system plus the specifics of individual applications, and data sets do not need to be created multiple times but can be shared by different applications within the platform.

The conclusion is clear: optimisation in the areas of training, document and deviation management requires standardisation and automation. Thoroughly evaluating candidate systems across all relevant areas prevents a fragmented landscape. Investing in a validated, integrated QMS platform ensures scalability, audit readiness and operational excellence, delivering freed-up human resources, shorter cycle times and higher data quality. Selecting a unified, GMP-compliant quality management system with foresight is therefore an investment in an efficient working environment, one that allows the costs of digital systems to be amortised over time.

Birte Müller

In her 15 years of experience in quality assurance, Birte Müller has gained extensive knowledge of biotech products’ entire life cycle. She has a strong interest in digitalisation and has been involved in various systems implementations, such as Quality Management Systems and Laboratory Information Management Systems. Most recently, in addition to her GMP compliance team leader role at Richter BioLogics, she led the implementation of a document management and training system.


Regulatory and Compliance

2026 Predictions: How AI and Regulation Will Reshape Biopharma Execution

Biopharmas have been steadily improving how they connect data and processes across clinical, regulatory, safety and quality. In 2026, the operational focus will shift to creating even greater flow: connected execution across teams, supported by a technology backbone that improves data transparency, traceability and inspection readiness as Europe's regulatory expectations keep evolving. In parallel, AI will move from early capability augmentation toward agentic AI embedded in systems, operating within compliance guardrails.

Below are five predictions for where life sciences is headed in 2026.

Europe’s Regulatory Landscape Will Push Teams Toward Inspection-ready Execution by Design

In 2026, regulatory change in Europe will feel less like a series of one-off milestones and more like a steady operating reality. Clinical teams will be firmly in a CTIS-first world under the EU Clinical Trials Regulation, which continues to raise expectations for consistency across countries, faster coordination and complete, traceable documentation, alongside broader moves toward more structured submissions such as eCTD 4.0. As organisations settle into this mode, the pressure shifts from “getting it done” to “getting it right, every time” with fewer exceptions and less tolerance for local workarounds.

At the same time, ICH E6(R3) continues to move the industry toward a more explicit, risk-based approach to GCP. Sponsors will increasingly be expected to demonstrate how quality is designed into a study and how oversight is executed across partners, data sources and systems. That narrows the gap between operations and compliance. It also changes what “inspection readiness” means day to day. It is not a scramble at the end. It is a continuous state that depends on clear process ownership, consistent documentation and a reliable trail of decisions.

Finally, structured data requirements will keep advancing. IDMP is one important signal of where regulators are heading, toward standardised product and substance data that can be reused and reconciled across the lifecycle. In practice, 2026 will reward companies that reduce manual handoffs between clinical, regulatory, safety, and quality and instead run on shared data and common process standards. That is how teams increase speed while staying audit-ready.

AI-readiness Will be Key as the Industry Builds Toward Agentic AI

In 2026, many companies will have moved past the novelty phase of AI. Early initiatives have shown varied success in targeted areas, such as summarisation, classification, extraction and draft generation. They also surfaced a consistent limitation: AI is only as dependable as the data, processes and governance underneath it. As expectations rise, especially with the EU AI Act shaping how regulated industries think about responsible AI, sponsors will increasingly treat AI-readiness as an operational capability, not a collection of pilots.

This is where the conversation shifts from “Can AI help?” to “Can AI help in a way we can trust, explain and scale?” The path to that outcome is not mysterious, but it is demanding:

• Harmonised data and metadata so AI outputs are grounded and consistent

• Standardised workflows so tasks can be executed with clear control points

• Strong governance so responsibility, validation and monitoring are explicit

• Audit-friendly traceability so decisions can be understood and defended

These foundations are also what make the next phase possible: agentic AI. In 2026, more organisations will begin operationalising controlled, task-based agents that can initiate workflows, check completeness, summarise outcomes, flag exceptions and route work to the right people. The winners will be the companies that pair AI with disciplined processes and connected data, so that agents improve cycle time and quality without introducing unacceptable risk.

New EMA–FDA Principles Will Further Shape Biopharma’s AI Operating Model

That shift toward AI-readiness is now being reinforced by regulators, not just through European policy signals, but through growing alignment on what “good” looks like when sponsors use AI across the drug development lifecycle. In January, the EMA and FDA published joint guiding principles for good AI practice in drug development, marking a turning point toward a more disciplined, operational approach to AI in biopharma.

For sponsors, the value of the principles is that they establish a shared reference point for what will matter as AI moves from pilots into regulated workflows. They do not prescribe specific technologies. Instead, they emphasise how AI should be designed, assessed, deployed and managed over time, including a clearly defined context of use, strong data governance and documentation, lifecycle controls, and clear, essential information to support oversight.

In practice, this raises the bar for any AI used in GxP-relevant activities. If AI supports protocol development, safety signal triage, document classification or quality decision-making, sponsors need to be able to support oversight with credible evidence. That includes tracing inputs back to authoritative sources, understanding how a model is trained or configured for its intended context of use, and showing where human responsibility sits for decisions, including when outputs are reviewed, overridden or escalated.


These expectations have real architectural consequences. Fragmented point solutions make end-to-end assurance harder to demonstrate across clinical, regulatory, safety and quality, especially as systems and processes evolve. Data provenance can be lost in handoffs and local workarounds, and documentation becomes difficult to keep consistent and verifiable over time, precisely where regulators are signalling the need for stronger lifecycle management and defensible records.

An industry cloud with a shared data foundation makes these expectations more achievable. When data is harmonised, workflows are standardised and governance is built in, sponsors can better contextualise, audit and trust AI-supported outputs without turning every question into an investigation. In that sense, regulatory alignment around AI reinforces the industry’s broader move toward connected execution. AI reliability depends on data quality and provenance, and both depend on coherent operational design.

Over the course of the year, many biopharmas will use the EMA–FDA principles as a blueprint, not to deploy AI everywhere, but to deploy it where guardrails are explicit, performance can be monitored and outputs can withstand regulatory scrutiny. Regulation is not constraining innovation. It is clarifying the conditions under which innovation can scale responsibly.

Clinical Trial Data Flow Will Advance Recruitment and Improve Patient Access and Experience

The flow of clinical data between sites and sponsors will yield faster, more efficient trials. Study information will go straight to physicians to connect their patients with relevant research. New embedded AI will connect trial data between sponsors and sites so that physicians can search for treatment and trial options based on a patient's conditions or test results. This direct-to-physician approach will reduce the industry's reliance on sites to find study participants, helping meet recruitment goals sooner and improving patients' access to clinical trials.

With less burden from patient recruitment requirements and modern technology, sites will see the promise of eliminating paper and manual source data verification (SDV) for clinical research associates (CRAs) become a reality. eSource tools will better connect upstream and downstream clinical data sources, first with EHRs, so that patient health data can merge more efficiently with trial data. When connected with EDC, source forms will be defined by a trial definition, so data can flow faster and with more clarity to the sponsor. This data flow will streamline study visits for patients and advance trials for sites and sponsors.

Agentic AI Lab Assistants Will Drive Connectivity and Speed

Labs will move beyond chatbots to embed agentic lab assistants that connect highly specific tasks in a regulated environment. QC labs are turning their attention to the efficiency potential of AI agents and steering effort toward activating them across people and processes. However, the technology ecosystems in QC labs are fragmented and paper-based processes persist. Companies will modernise and consolidate systems, standardise data and workflows, and integrate quality assurance to reap the productivity gains of QC-specific AI.

Lab analysts will work alongside agents capable of starting workflows, summarising outcomes, and observing and analysing trends. This will advance proactive risk management by identifying issues early on and driving right-first-time execution. The outcome will be a highly effective and efficient QC lab where people and agents work together to shorten batch cycle times.

What These Shifts Mean for 2026

Across these predictions, there is convergence toward execution. Regulatory expectations for transparency, traceability and consistent oversight are increasing. At the same time, AI is driving a deeper look into operational foundations, because agents cannot scale on fragmented data and inconsistent processes. In 2026, the organisations that move the fastest will be the ones that can automate data flow across clinical, regulatory, safety, and quality with an inspection-ready development foundation and apply AI in effective ways that teams can trust. They will shift investment from isolated pilots to repeatable operating models, shared data standards, standardised workflows and governance that makes accountability explicit. They will simplify system landscapes so AI can work end-to-end, not inside silos, and so performance can be monitored and improved over time. The outcome is practical and measurable: fewer handoffs, fewer surprises, stronger compliance and faster delivery of therapies for patients.

Rik Van Mol leads R&D and Quality strategy in Europe and is responsible for overall growth and execution in the European market.

Pre-clinical

Bridging the Translational Gap: How PDX Models are Transforming Breast Cancer Research, Drug Discovery and Precision Oncology

Breast cancer is the most common cancer in women worldwide. It is highly heterogeneous with distinct breast cancer subtypes, posing challenges for diagnosis and treatment. There are four key molecular breast cancer subtypes classified based on the expression of hormone and growth factor receptors.1

Luminal A subtype tumours are characterised by expression of the oestrogen receptor (ER+) and/or progesterone receptor (PR+) and the absence of human epidermal growth factor receptor 2 (HER2-). Clinically, they are low grade and slow growing, and they have the best prognosis, with a lower incidence of relapse and a higher survival rate. Luminal B subtype tumours are of higher grade and have a worse prognosis than Luminal A tumours. They are ER+, can be PR-positive or PR-negative, are HER2-, and are generally of intermediate/high histologic grade. These tumours may benefit from hormonal therapy along with chemotherapy. The HER2-positive subtype is characterised by high HER2 expression and is ER- and PR-. These tumours grow faster than the luminal ones, and prognosis has improved since the introduction of HER2-targeted therapies.

The triple-negative breast cancer (TNBC) subtype is characterised by the lack of expression of any of the above receptors (ER-/PR-/HER2-). This is the most challenging breast cancer subtype as it is more aggressive and does not respond to hormonal therapies or HER2-targeted therapies. Treatment usually involves chemotherapy, and patients have a higher risk of early recurrence.

There is therefore a need for breast cancer models that capture its heterogeneous subtypes to uncover novel disease mechanisms underlying its complexity and for more precise drug development. Patient-Derived Xenograft (PDX) models have emerged as a leading breast cancer in vivo system to study the intricacies of cancer biology and are created by transplanting patient tumour tissue into immunodeficient mice.

PDX models faithfully recapitulate patient tumour characteristics, including intratumour heterogeneity, genomic features, metastatic patterns and drug responses, more representatively than traditional cell line and animal models. These highly translatable models are ideally suited to support more accurate advanced tumour modelling and preclinical in vivo drug validation studies. At the forefront of this field is Professor Alana Welm, based at the University of Utah, who has shared insights from her group's groundbreaking research with us in this article.

Journey into Breast Cancer Research

Professor Welm’s journey into breast cancer research began after her PhD, with a desire to make an impact on human health. She explains, “I wanted to do some research that was more directly applicable to human health. So, for my postdoc, I joined J. Michael Bishop’s laboratory in UCSF, where I got to learn a lot about oncogenes and started working on breast cancer metastasis.”

Here, she discovered the difficulties in studying metastatic disease: “One of the challenges I realised was that human breast cancer cell lines are poorly metastatic. This is why I set out to make PDX models.”

Professor Welm hypothesised that growing these cancer cells in vivo, as opposed to on plastic, might enable a “metastatic memory” in the cells or allow them to “interact with their environment in a more physiologically relevant way.”

Pioneering Complex Breast Cancer PDX Models at the Huntsman Cancer Institute

In 2007, Professor Welm established her own laboratory at the University of Utah’s Huntsman Cancer Institute and began developing novel breast cancer PDX models. These spanned different breast cancer types. Her research focuses on solving the problem of breast cancer metastasis, using in vivo PDX modelling of complex and heterogeneous breast cancers.

Professor Welm’s group have generated a large biobank of PDX models2 that represent breast cancer patients affected by the most advanced and lethal forms of the disease. This includes aggressive, metastatic and treatment-resistant subtypes, providing a truer representation of the entire spectrum of disease than previously available.

Capturing the Complexity of Breast Cancer

Professor Welm’s work focuses on the importance of studying metastatic disease, the killer in breast cancer. She endeavours to make sure her models are representative of metastatic disease and highlights the challenges faced and strategies used to ensure this.

Image: Schematic illustrating the establishment of PDX models by grafting tumour tissues from a patient into immunodeficient mice, created with Biorender.
Image: Professor Alana Welm, University of Utah

Welm said, “We try really hard to make sure that our models are representative of metastatic disease, but it is hard to get metastatic samples from patients… What we really try to get are the pairs, primary metastatic pairs or longitudinal samples, because then people could use them to study how the tumour evolves in the patient or how it evolves resistance to therapies.”

These matched samples provide researchers with the tools needed to study the biology underpinning metastases. In addition to representing metastatic disease, Professor Welm and her team have worked hard to capture the spectrum of disease experienced across breast cancer. When asked about key subsets in her collection, she notes, “I think a really unique set are the oestrogen-receptor positive (ER+) tumours because they are harder to grow, so there are only a few of them. We characterise all of them for their oestrogen dependence.”

She further explains that the collection also includes: “models of ER+ breast cancers with naturally occurring mutations in the oestrogen receptor, which occur in humans with hormone therapy.”

The biobank also contains a vast collection of triple-negative breast cancer (TNBC) models, which is essential. Professor Welm explains further: “TNBC is a vastly heterogeneous subtype. It’s really the absence of a subtype, I guess. We need a lot of those as well to just represent human breast cancer.”

Unlocking Unexpected Discoveries Using These Models

Professor Welm shared with us one of her most unexpected discoveries from using these PDX models: “I think the biggest finding that we didn't expect was that the ability to generate a PDX model actually predicted distant recurrence for TNBC patients. That was a complete accident. It's like a functional test for aggressivity.” This discovery led to further innovations, Professor Welm explains: “It inspired our functional precision oncology trials. We knew these patients would have a bad outcome and yet we were growing their tumours. So, we thought we need to do something about this.”

This work evolved into developing matching organoids for higher throughput in vitro drug screening, enhancing the potential for personalised treatment responses.

Advancing Breast Cancer Research Through Collaboration

When asked about her goals for implementing PDX models in breast cancer research, Professor Welm emphasised the importance of addressing unmet clinical needs.

“Well, I think for us, the goal is to use these models to research areas of the greatest medical need in breast cancer, which are the recurrent drug-resistant metastatic tumours. There are a lot of primary tumours that we could study, we could make models of, but those might not represent the disease where we need to make new therapies.”

Professor Welm has deposited 53 of these PDX models, including models of the most advanced and lethal forms of breast cancer, such as aggressive, metastatic and treatment-resistant subtypes, with CancerTools, a global, non-profit research tools provider. CancerTools, part of Cancer Research UK, connects scientists worldwide with academically developed cancer research tools, accelerating discovery and innovation through open, collaborative science. This partnership aims to further accelerate breast cancer research, drug discovery and precision oncology by making these breast cancer PDX models more accessible to the global cancer research and drug discovery community.

For Professor Welm, sharing these models through CancerTools honours both scientific principles and patient wishes: “It's important to us that these patient-derived models are made available to the scientific community to advance research on breast cancer, not only because it's our obligation as cancer researchers to do so, but because this is what the patients wanted when they donated their tissue to research.”

However, some remaining challenges with PDX models include low take rates, high costs and long timelines involved in generating and maintaining these models. The development of new immunodeficient mice and/or better methods of tumour transplantation should help to overcome these limitations.3 The lack of a functional immune system in current PDX models is another challenge, since tumour-immune interactions are an important part of tumour behaviour and therapeutic response. To overcome this disadvantage, mice engrafted with a human immune system are expected to be a promising tool for the next generation of PDX models.3

In conclusion, these highly translatable breast cancer PDX models enable scientists to study how different breast cancer subtypes respond to potential therapies and provide more predictive responses than traditional models, thereby derisking preclinical drug validation. Moreover, these models also offer scientists unprecedented opportunities to study breast cancer metastasis, drug resistance and tumour evolution in a physiologically relevant context. This paves the way to developing more effective personalised treatments and better outcomes for breast cancer patients.

REFERENCES

1. Orrantia-Borunda et al. Subtypes of Breast Cancer. In: Mayrovitz HN, editor. Breast Cancer [Internet]. Brisbane (AU): Exon Publications; 2022 Aug 6. Chapter 3.

2. Guillen et al. A human breast cancer-derived xenograft and organoid platform for drug discovery and precision oncology. Nat Cancer. 2022 Feb;3(2):232-250. PMID: 35221336.

3. Murayama et al. Patient-Derived Xenograft Models of Breast Cancer and Their Application. Cells. 2019 Jun 20;8(6):621. PMID: 31226846.

Malathi Raman Srivastava

Malathi Raman Srivastava is a Senior Product Marketing Manager at CancerTools, managing the breast and lung cancer product portfolios, including patient-derived xenograft (PDX) models. She previously worked at Bit Bio Ltd as a Product Development Manager and at Takara Bio Europe as a Senior Product Manager. She holds a PhD from Imperial College London and has done post-doctoral research at the Leeds Institute of Molecular Medicine in Leeds, U.K.

CiPA-Ready hiPSC Cardiomyocytes for Cardiac Safety Assessment

Drug-induced arrhythmias remain a significant challenge in pharmaceutical development, often resulting in late-stage failure, regulatory rejection and post-market withdrawals. Traditional safety assays focused on hERG potassium channel inhibition and QT prolongation, while effective at detecting torsadogenic compounds, frequently overestimate risk for multichannel-acting drugs and underestimate chronic structural toxicity.1,2 Human induced pluripotent stem cell-derived cardiomyocytes (hiPSC-CMs) have emerged as a superior human-relevant alternative, particularly when integrated into the Comprehensive in vitro Proarrhythmia Assay (CiPA) framework.2,3 CiPA-ready hiPSC-CM platforms offer a convergent solution that combines physiological relevance, mechanistic insight and regulatory alignment, positioning them as pivotal tools for next-generation cardiac safety assessment.1,2,3

The Need for Better Cardiac Safety Testing

Cardiovascular toxicity remains a leading cause of late-stage drug development failure and post-marketing withdrawal. Historical analyses show that 8.7% of drugs withdrawn between 1960 and 1999, and 14% withdrawn between 1953 and 2013, were due to unanticipated cardiotoxic effects.4,5 These failures span diverse therapeutic classes: antiarrhythmics like dronedarone, anticancer agents including anthracyclines, and non-cardiac drugs such as antidiabetics, fluoroquinolones and SSRIs.1,6,7,8

Current regulatory paradigms mandate hERG channel testing and QT interval assessment.1 While valuable, these approaches have inherent limitations. Single-channel focus can overestimate clinical risk for drugs with beneficial multichannel effects (e.g., verapamil blocking both hERG and L-type calcium channels, or ranolazine modulating late sodium current), leading to unnecessary rejection of potentially useful compounds.1,9 Additionally, conventional assays fail to capture chronic functional decline, structural injury, metabolic dysfunction and complex arrhythmic mechanisms.1

The CiPA initiative integrates three complementary methodologies: (1) quantitative in vitro ion channel electrophysiology, (2) in silico human ventricular action potential modelling and (3) human hiPSC-CM functional assays capturing integrated cellular responses.2,3 This multi-tiered approach enables evaluation of repolarisation, QT prolongation, beat rate, conduction, contractility and emergent arrhythmic phenotypes within a human-relevant context.1

Generating and Characterising CiPA-Ready hiPSC-Derived Cardiomyocytes

Successful CiPA implementation requires consistent production of large cell batches with reproducible structural, molecular and electrophysiological properties. hiPSC lines are expanded feeder-free on hESC-qualified matrix in defined medium and passaged at ~80% confluency to preserve genomic stability and differentiation competence.1 Cardiomyocyte differentiation employs monolayer-based protocols with stage-specific Wnt modulation, progressing through mesodermal induction and cardiac progenitor specification to autonomously contracting ventricular-like cardiomyocytes by days 10–15. Phenotypic characterisation is essential for platform qualification. Immunofluorescence microscopy and flow cytometry of day 20 YBLiCardio cells demonstrate high expression of canonical sarcomeric markers: cardiac troponin T (cTnT, ~91%), alpha-actinin (~93%) and the myosin light chain isoforms MLC2v and MLC2a (~80–81%), supporting their classification as mature, ventricular-enriched cardiomyocytes suitable for electrophysiological safety assessment (Figure 1).1,11

Figure 1. Directed differentiation and characterisation of hiPSC-derived cardiomyocytes. (A) Schematic of the stepwise differentiation from hiPSC expansion through mesoderm and cardiac progenitor stages to cardiomyocytes. (B) Representative phase-contrast images at defined time points illustrating morphological changes during differentiation. (C) Flow cytometric analysis confirming expression of the cardiac markers cardiac troponin, α-actinin, MLC2v and MLC2a in the differentiated population. (D) Immunofluorescence staining of differentiated cardiomyocytes for cardiac troponin, α-actinin, MLC2v and MLC2a (scale bar: 10 μm).

Quantitative RT-PCR reveals a sharp temporal transition during differentiation. Cardiac-specific genes (TNNT2, MYL2, MYH7) remain undetectable in undifferentiated hiPSCs but increase dramatically after day 15, while pluripotency-associated transcripts (hTERT) progressively decline, confirming authentic ventricular cardiomyocyte identity.1

Transcriptomic Maturation and Alignment with Adult Heart

A persistent criticism of hiPSC-CMs is their "fetal-like" phenotype, potentially limiting translational relevance for adult cardiac safety. Comparative RNA sequencing of hiPSC-CMs at sequential stages (days 6, 15, 30) versus adult human heart RNA reveals time-dependent maturation.1 Principal component analysis shows undifferentiated hiPSCs segregating distinctly from cardiomyocytes and heart tissue, while later-stage hiPSC-CMs progressively cluster toward the adult heart reference, indicating convergent transcriptional programmes.1

Focused gene set analysis demonstrates substantial alignment with native heart tissue. Pluripotency markers (NANOG, SOX2) decline progressively, cardiac transcription factors (GATA4, TBX5, NKX2-5, MEF2C) approach adult levels, sarcomeric genes (ACTC1, ACTN2) show robust expression and ion channel genes critical for depolarisation/repolarisation (SCN5A, KCNQ1, KCNH2) exhibit expression levels comparable to native myocardium.1 Day 30 hiPSC-CMs show enhanced expression of functional genes (MYL2, CASQ2, CAMK2D) relative to day 15, confirming progressive maturation toward an adult-like phenotype.1
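The principal component analysis described above can be sketched in outline with standard numerical tools. The example below uses a small synthetic expression matrix (made-up values, not the study's data) purely to illustrate the mechanics of projecting samples onto principal components and checking which groups cluster together:

```python
# Illustrative PCA on a synthetic gene-expression matrix (samples x
# genes). Values are invented; in the study, the inputs would be
# normalised RNA-seq profiles for hiPSC-CM time points and adult
# heart tissue.
import numpy as np

rng = np.random.default_rng(0)
pluripotent = rng.normal(0.0, 0.5, size=(3, 50))  # undifferentiated-like
cardiac = rng.normal(3.0, 0.5, size=(4, 50))      # later-stage CMs + heart
X = np.vstack([pluripotent, cardiac])

# PCA via SVD of the mean-centred matrix
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
coords = Xc @ Vt[:2].T  # project samples onto the first two PCs

# Samples sharing a transcriptional programme separate cleanly on PC1,
# mirroring how later-stage hiPSC-CMs cluster toward the heart reference.
gap = abs(coords[:3, 0].mean() - coords[3:, 0].mean())
print(coords.shape)  # (7, 2)
```

With real data, the same projection would be preceded by library-size normalisation and log transformation, but the clustering logic is identical.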

Functional Pharmacology: Acute and Chronic Responses

Functional validation using CardioExcyte or FLEXcyte MEA platforms enables simultaneous recording of extracellular field potentials across 96 wells, systematically evaluating acute and chronic drug responses.1

Acute studies with canonical ion channel modulators demonstrate expected pharmacological signatures. Nifedipine (L-type Ca²⁺ blocker) produces concentration-dependent amplitude reduction and contraction shortening, consistent with suppressed calcium-dependent excitation–contraction coupling.1 Lidocaine (Na⁺ blocker) induces negative inotropy and upstroke/recovery prolongation.1 E-4031 (hERG blocker) produces marked beat duration prolongation, negative inotropy and spontaneous arrhythmias, mirroring its known proarrhythmic liability.1 Isoproterenol (β-agonist) increases amplitude and beat rate at submicromolar concentrations, while desensitisation at high doses illustrates biphasic adrenergic modulation.1 Carbachol (muscarinic agonist) prolongs action potential duration and decreases beat rate, indicating functional parasympathetic modulation and a mixed atrial–ventricular phenotype.1

Chronic monitoring over 72 hours reveals time- and concentration-dependent functional deterioration. Doxorubicin (anthracycline) induces progressive amplitude reduction with eventual beat cessation at higher concentrations.1 Sunitinib (multi-targeted tyrosine kinase inhibitor) produces concentration-dependent reductions in amplitude and beat rate with rapid functional collapse.1 In contrast, erlotinib (a lower-risk tyrosine kinase inhibitor) shows minimal effects, supporting low-risk classification.1 Pentamidine (hERG trafficking blocker) produces a gradual amplitude decline with eventual beat cessation, plus duration prolongation and altered contractile slopes.1

These datasets illustrate how well-characterised hiPSC-CM systems connect molecular targets to integrated electrophysiological and contractile phenotypes over clinically relevant timescales.1

CiPA Risk Stratification and Compound Classification

CiPA-ready platforms detect and classify drug-induced QT prolongation and proarrhythmic risk concordant with human clinical outcomes1,2,3 by measuring field potential duration (FPD) normalised to vehicle control.1 In this study, FPD prolongation bands of ≤113% (low), 113–139% (intermediate) and ≥165% (high) were used for TdP risk categorisation, aligned with ranges reported in HESI/CiPA validation studies.1,14
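As an illustration of how such banding translates into analysis code, the sketch below applies the quoted thresholds to an FPD value normalised to vehicle control. Note that the published bands leave readings between 139% and 165% unassigned; treating them as intermediate here is an assumption of this sketch, not part of the cited protocol.

```python
def classify_tdp_risk(fpd_percent_of_control: float) -> str:
    """Map normalised field potential duration (FPD, % of vehicle control)
    to a TdP risk band, using the thresholds quoted in the text."""
    if fpd_percent_of_control <= 113:
        return "low"
    if fpd_percent_of_control >= 165:
        return "high"
    # 113-139% is the published intermediate band; values between 139%
    # and 165% fall outside the quoted bands and are treated as
    # intermediate here (an assumption of this sketch).
    return "intermediate"

# Normalise a raw FPD reading against its vehicle control (values invented):
drug_fpd_ms, vehicle_fpd_ms = 480.0, 400.0
pct = 100.0 * drug_fpd_ms / vehicle_fpd_ms  # 120.0% of control
print(classify_tdp_risk(pct))  # intermediate
```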

Applied to the HESI CiPA reference panel, robust hiPSC-CM platforms show: (1) minimal FPD changes for low-risk agents (mexiletine, ranolazine); (2) graded prolongation for intermediate-risk compounds (antipsychotics, antihistamines); and (3) marked prolongation and electrophysiological instability for high-risk compounds (dofetilide, sotalol, quinidine, bepridil) with strong clinical TdP associations.1

Notably, high-quality platforms refine the classification of borderline compounds. Droperidol and domperidone, originally designated intermediate-risk, manifest high-risk-like FPD prolongation and arrhythmias in sensitive assays, consistent with clinical concerns and primary cardiomyocyte studies.1 Enhanced detection of chlorpromazine risk, historically underestimated in some systems, underscores the advantage of platforms with mature ion channel expression and ventricular enrichment.1 CiPA-ready platforms thus function not merely as confirmatory tools but as sensitive systems capable of refining and reclassifying clinical risk categories (Figure 2).1

Figure 2. Classification of compound-induced QT prolongation risk in hiPSC-derived cardiomyocytes. (A) Summary of predefined thresholds used to categorise compounds as low, intermediate, or high risk based on percentage changes in field potential duration (FPD) relative to control. (B–D) Dose-dependent effects on FPD induced by representative low-, intermediate- and high-risk compounds, with values normalised to vehicle-treated controls. (E) Heatmap displaying all tested compounds ranked by normalised FPD, annotated with established risk categories and grouped according to increasing QT prolongation, with colour coding indicating low (green), intermediate (yellow) and high (red) risk. Data are presented as mean ± SD (n = 3–5 independent experiments); *p < 0.05, **p < 0.01 versus control.

Regulatory Landscape and Future Directions

These advances align with an evolving regulatory environment explicitly supporting non-animal, human-relevant methodologies. The FDA Modernization Act 2.0 (2022) endorses advanced in vitro and in silico approaches in drug development and regulatory safety assessment, reinforcing the importance of human-relevant platforms like hiPSC-CMs for safer drug discovery with reduced animal reliance.1,13

The field is advancing toward three-dimensional, mechanically and electrically conditioned tissue constructs that emulate native myocardial architecture.1,11 Ventricular-enriched hiPSC-CMs optimised for 2D CiPA assays serve as building blocks for 3D engineered heart tissues and bioprinted patches, where mechanical loading and electrical pacing further enhance maturation and tissue-level biomechanics.1,11 Integration with controlled mechanical stretch and pacing enhances sarcomeric organisation, calcium handling and force generation, progressively closing the translational gap between in vitro models and adult human cardiac physiology.1

CiPA-qualified hiPSC-CM platforms supported by rigorous phenotypic, transcriptomic and functional validation demonstrating HESI concordance are well positioned for future regulatory cardiac safety packages.1,2,3 Their scalability, high-throughput compatibility, technical accessibility and potential extension into 3D cardiac organoids make them attractive for applications ranging from discovery triage to mechanistic risk assessment.1,2,3,16 Their human derivation and mechanistic fidelity align with contemporary regulatory emphasis on translationally relevant, ethically sound and scientifically rigorous safety assessment.1

Conclusion

CiPA-qualified hiPSC-derived cardiomyocytes represent a convergent solution to long-standing challenges in cardiac safety testing and drug development. These human-relevant, mechanistically informative, technically scalable platforms align with emerging non-animal testing expectations. By combining high-purity ventricular-like phenotypes with comprehensive structural, molecular and functional characterisation, CiPA-ready hiPSC-CM systems provide a robust foundation for accurate, efficient and ethically sound assessment of proarrhythmic risk, cardiotoxic liability and overall cardiac safety across the drug development continuum. As regulatory frameworks evolve to support non-animal testing, these platforms are poised to play an increasingly central role in protecting patients while accelerating safer, more effective pharmaceutical development.

REFERENCES

1. Konala, V.B.R., Kuhikar, R., More, S., Gossmann, M., Lickiss, B., Linder, P., Sarkar, J., Bhanushali, P. & Khanna, A. CiPA-qualified human iPSC-derived cardiomyocytes: A new frontier in toxicity testing by evaluating drug-induced arrhythmias. Toxicol. In Vitro 108, 106100 (2025).

2. Sager, P.T., Gintant, G., Turner, J.R., Pettit, S. & Stockbridge, N. Rechanneling the cardiac proarrhythmia safety paradigm: A meeting report from the cardiac safety research consortium. Am. Heart J. 167, 292–300 (2014).

3. Colatsky, T., Fermini, B., Gintant, G., Pierson, J.B., Sager, P., Sekino, Y., Strauss, D.G. & Stockbridge, N. Comprehensive in vitro proarrhythmia assay (CiPA) – update on progress. J. Pharmacol. Toxicol. Methods 81, 15–20 (2016).

4. Fung, M., Thornton, A., Mybeck, K. et al. Evaluation of the characteristics of safety withdrawal of prescription drugs from worldwide pharmaceutical markets – 1960 to 1999. Ther. Innov. Regul. Sci. 35, 293–317 (2001).

5. Onakpoya, I.J., Heneghan, C.J. & Aronson, J.K. Post-marketing withdrawal of 462 medicinal products because of adverse drug reactions: A systematic review of the world literature. BMC Med. 14, 10 (2016).

6. Singh, B.N., Connolly, S.J., Crijns, H.J.G.M. et al. Dronedarone for maintenance of sinus rhythm in atrial fibrillation or flutter. N. Engl. J. Med. 357, 987–999 (2007).


7. Frothingham, R. Rates of torsades de pointes associated with ciprofloxacin, ofloxacin, levofloxacin, gatifloxacin, and moxifloxacin. Pharmacotherapy 21, 1468–1472 (2001).

8. Funk, K.A. & Bostwick, J.R. A comparison of the risk of QT prolongation among SSRIs. Ann. Pharmacother. 47, 1330–1341 (2013).

9. Johannesen, L., Vicente, J., Mason, J.W., Sanabria, C., Waite-Labott, K., Hong, M., Guo, P., Lin, J., Sorensen, J.S., Galeotti, L. et al. Differentiating drug-induced multichannel block on the electrocardiogram: Randomized study of dofetilide, quinidine, ranolazine, and verapamil. Clin. Pharmacol. Ther. 96, 549–558 (2014).

10. Seyhan, A.A. Lost in translation: the valley of death across preclinical and clinical divide – identification of problems and overcoming obstacles. Transl. Med. Commun. 4, 18 (2019).

11. Ronaldson-Bouchard, K., Ma, S.P., Yeager, K., Chen, T., Song, L., Sirabella, D., Morikawa, K., Teles, D., Yazawa, M. & Vunjak-Novakovic, G. Advanced maturation of human cardiac tissue grown from pluripotent stem cells. Nature 556, 239–243 (2018).

12. Blinova, K., Dang, Q., Millard, D. et al. Comprehensive translational assessment of human-induced pluripotent stem cell-derived cardiomyocytes for evaluating drug-induced arrhythmias. Toxicol. Sci. 155, 234–247 (2017).

13. Text of the FDA Modernization Act 2.0. 117th Congress, S.5002 (2022). Available at: https://www.congress.gov/bill/117th-congress/senate-bill/5002/text, visited on 24 Jan 2026.

14. Gintant, G., Sager, P.T. & Stockbridge, N. Evolution of strategies to improve preclinical cardiac safety testing. Nat. Rev. Drug Discov. 15, 457–471 (2016).

15. Lundy, S.D., Zhu, W.Z., Regnier, M. & Laflamme, M.A. Structural and functional maturation of cardiomyocytes derived from human pluripotent stem cells. Stem Cells Dev. 22, 1991–2002 (2013).

16. U.S. Food and Drug Administration. Roadmap to reducing animal testing in preclinical safety studies. FDA (2019). Available at: https://www.fda.gov/files/newsroom/published/roadmap_to_reducing_animal_testing_in_preclinical_safety_studies.pdf, visited on 24 Jan 2026.

Vijay Bhaskar Reddy Konala

Vijay Bhaskar Reddy Konala is a Principal Scientist at Yashraj Biotechnology and the first author of the study, with more than 15 years of experience in stem cell research. He was instrumental in study design and execution.

Email: vijay.konala@yashraj.com

Rutuja Kuhikar

Rutuja Kuhikar is a scientist and has optimised the hiPSC-cardiomyocyte differentiation protocol.

Email: rutuja.kuhikar@yashraj.com

Shruti More

Shruti More is a scientist and has been instrumental in performing the validation studies of hiPSC-cardiomyocytes.

Email: shruti.more@yashraj.com

Matthias Gossmann

Matthias Gossmann is the CEO of Innovitro and has led the MEA studies of hiPSC-Cardiomyocytes.

Email: gossmann@innovitro.de

Bettina Lickiss

Bettina Lickiss is a scientist at Innovitro and has been involved in conducting the MEA studies.

Email: lickiss@innovitro.de

Peter Linder

Peter Linder is a simulation scientist at Innovitro and performed the simulation studies for this work.

Email: linder@innovitro.de

Jaganmay Sarkar

Jaganmay Sarkar is a scientist and has generated multiple batches of hiPSC-cardiomyocytes.

Email: jaganmay.sarkar@yashraj.com

Paresh Bhanushali

Paresh Bhanushali is responsible for the overall study operations and fund approvals.

Email: paresh.bhnushali@yashraj.com

Amit Khanna

Amit Khanna is the corresponding author and lead scientist responsible for the overall study design, execution and cross-functional collaborations.

Email: amit.khanna@yashraj.com

The Hidden Friction in Pharma-CRO Collaboration: When Data Sharing Undermines Data Security

The challenges of collaboration between pharmaceutical companies and contract research organisations (CROs) are widely discussed across the industry. Ask leaders on either side of the partnership what challenges they encounter and the answers tend to converge: communication is inefficient, timelines slip, integrity is questioned, and quality can be inconsistent. These issues are real and they have been examined extensively.

What is discussed far less often, however, is a deeper, more structural problem, one that does not always manifest as an immediate failure. Instead, it steadily erodes research efficiency over time. It is a problem that becomes most visible in large, long-running collaborations and at precisely the moments when data-driven decisions matter most.

At the heart of many pharma-CRO tensions lies an uncomfortable reality: data sharing and data security are rarely achieved simultaneously. In practice, organisations are forced to trade one for the other, either constraining access and slowing decision-making or widening sharing at the cost of long-term data control and protection.

Beyond Execution: A Structural Mismatch

Most post-mortems of pharma-CRO collaborations focus on execution. CROs are said to lack rigour, communication between teams is described as inefficient and project management processes are blamed for delays.1 While these factors certainly play a role, they tend to obscure the more fundamental issue.

Drug discovery data is not easily modularised. It is not a finished product that can be cleanly separated from the process that produced it. Experimental results are inseparable from their historical context: why an experiment was designed a certain way, what failed before it worked, which assumptions were revised along the way and which paths were deliberately abandoned.

Yet many collaboration models implicitly assume that data can be segmented, delivered and consumed independently of this context. This assumption creates a structural mismatch between how science actually progresses and how collaboration is operationalised.

How the Problem Appears in Practice

In day-to-day projects, this mismatch rarely appears dramatic. In fact, it often looks deceptively normal. CRO teams typically retain the full experimental narrative, while sponsors receive curated result files. The reasoning behind experimental designs, intermediate failures and exploratory iterations may be discussed in meetings or emails and yet never fully captured in a shared system of record.

While a project is in motion, the arrangement may appear sufficient. However, complications often emerge when a programme transitions phases, leadership changes, or critical go/no-go decisions must be made. At these junctures, teams discover that vital context is unrecoverable. Data may satisfy contractual requirements, though remain scientifically deficient. The result is a subtle but consequential failure: data has been delivered, yet it cannot adequately inform decision-making.

Why “Data Delivery” Falls Short of True Collaboration

Many collaboration models still operate under the assumption that timely delivery of complete datasets constitutes success. This mindset may have worked when drug discovery programmes were more linear and less interconnected. It is now increasingly misaligned with modern R&D.

Today’s research environments are iterative and cumulative. Experiments build on one another, hypotheses evolve over time and data must be reinterpreted repeatedly as new insights emerge. Decisions rely not only on final results, but on historical comparisons, negative data and experimental rationale.

File-based handoffs (spreadsheets, PDFs and static reports) falter in this environment. Detached from their broader context, data becomes static rather than explanatory. Security, meanwhile, relies heavily on contractual agreements and procedural discipline instead of granular, system-level controls. Consequently, when a collaboration ends, the underlying knowledge often leaves with it.

None of this reflects a lack of effort or expertise; rather, it reveals collaboration models that were not designed for data-centric, longitudinal science.

Research / Innovation / Development

Why Some Organisations Prefer to Build In-House

This structural tension helps explain why large pharmaceutical companies often attempt to internalise as much research as possible. The motivation is not necessarily distrust of CRO capabilities, but control over continuity.

Within a single organisation, data systems, access permissions, experimental context and decision histories are naturally integrated. Collaboration challenges still exist, but they are managed internally rather than across corporate boundaries.

The cost of this approach is significant. Building and maintaining internal R&D capacity requires deep resources, mature infrastructure and long-term investment. For many biotech companies and innovation-focused pharma organisations, this path is neither feasible nor desirable.

As a result, the industry remains caught in a persistent contradiction: CROs are essential for speed and scale, yet sponsors often struggle to fully trust that the data generated through these partnerships will remain secure, interpretable and reusable over time.

Toward Sustainable, Scalable Collaboration

When pharma-CRO collaborations function smoothly over the long term, it is rarely the result of heavier processes or more frequent communication. Instead, success tends to stem from a shared data foundation established at the outset of the project.

In these cases, data is created and managed within a controlled collaborative environment. Access is role-based and purpose-driven. Every action is recorded and traceable. Experimental context is preserved alongside results. When a collaboration ends, the accumulated knowledge remains intact within the sponsor’s ecosystem rather than dispersing across disconnected files and inboxes.
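As a minimal sketch of the pattern just described (all class, role and record names are hypothetical, not drawn from any named platform), role-based access combined with an append-only audit trail might look like this:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical role-to-permission map: CRO scientists create data,
# sponsor reviewers may only read it.
ROLE_PERMISSIONS = {
    "cro_scientist": {"read", "write"},
    "sponsor_reviewer": {"read"},
}

@dataclass
class SharedRecord:
    record_id: str
    content: dict
    audit_log: list = field(default_factory=list)

    def access(self, user: str, role: str, action: str):
        allowed = action in ROLE_PERMISSIONS.get(role, set())
        # Every attempt is logged, allowed or not, so history stays traceable.
        self.audit_log.append({
            "user": user, "role": role, "action": action,
            "allowed": allowed,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        if not allowed:
            raise PermissionError(f"{role} may not {action} {self.record_id}")
        return self.content

# Experimental context (rationale) travels with the result itself.
record = SharedRecord("assay-042", {"result": "IC50 = 1.2 uM",
                                    "rationale": "repeat of failed run 3"})
record.access("alice", "sponsor_reviewer", "read")   # permitted, logged
try:
    record.access("alice", "sponsor_reviewer", "write")
except PermissionError:
    pass                                             # denied, still logged
print(len(record.audit_log))  # 2
```

Real platforms add authentication, encryption and immutable storage on top, but the core idea is the same: access decisions and scientific context live in one system of record.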

This approach addresses a central challenge in modern R&D collaboration: how to allow external partners to contribute as if they were internal teams, without transferring ownership or control of critical scientific assets.

Solving this problem requires more than just better project management. It requires deliberate design of research data systems that treat collaboration, security and long-term usability as interdependent rather than competing goals.

The Cost of Ignoring the Issue

Failure is an inherent part of drug discovery and few organisations are surprised by unsuccessful experiments. What proves far more damaging is discovering, at a critical decision point, that years of work cannot be confidently reused or re-examined.

Organisations invest enormous resources into generating data. When that data cannot be safely, completely and reliably leveraged over time, the loss extends beyond individual projects; it undermines institutional learning itself.

As R&D becomes increasingly distributed across organisations, geographies and specialities, the consequences of fragmented data practices will only intensify. Collaboration will continue to expand; however, without strategic oversight of how scientific data is shared and governed, its value will continue to decay.

A Final Reflection

For leaders overseeing active research programmes, one question is worth asking: how much of the data generated today would you trust to guide decisions two years from now, after teams change and priorities shift?

In conversations across pharmaceutical companies and CROs of all sizes, the underlying concerns are strikingly consistent, yet they are rarely addressed at a systemic level. As the industry moves toward increasingly collaborative models of discovery, the ability to design data environments that preserve context, security and scientific continuity may be as vital as the experiments themselves.

Ultimately, the future of collaborative drug discovery will be shaped not only by who conducts the science but by how the science is captured, shared and sustained.

REFERENCES

1. McKinsey & Company. Building a shared vision for pharma R&D supplier partnerships. Available at: https://www.mckinsey.com/industries/life-sciences/our-insights/building-a-shared-vision-for-pharma-r-and-d-supplier-partnerships

Abraham Wang

Abraham Wang is Director of Marketing at Collaborative Drug Discovery, provider of the CDD Vault research informatics platform. He brings extensive experience spanning technical roles in early-stage drug discovery and strategic product marketing across the life sciences industry. Abraham combines scientific expertise with market insight to help research organisations accelerate innovation and make data-driven decisions in a competitive global landscape.

How AI Is Transforming Microbial Ingredient Production

Population growth, shrinking arable land and climate change are intensifying pressure on global food systems. Microorganism-based ingredients offer a compelling alternative; many fungi, yeasts and microalgae contain 50–70% protein by dry weight, and modern metabolic engineering now enables microbes to produce specialised food proteins. Fermentation already underpins the production of vitamins, flavours, sweeteners, pigments, enzymes and cosmetic actives, while oleaginous microbes provide sustainable routes to edible and functional oils.

Recent advances in synthetic biology and multi-omics have made microbial platforms more predictable and versatile. Yet major challenges remain in discovering the right ingredients, engineering efficient production pathways and ensuring quality at scale. This article explores how artificial intelligence (AI) can accelerate these steps from ingredient discovery to bioprocess optimisation.

Ingredient Innovation Through AI-Driven Biotechnology: Revolutionising Sourcing and Yield Optimisation

AI-driven approaches tackle ingredient innovation through three complementary strategies:

1. Screening natural microbial diversity to identify organisms that naturally produce compounds of interest.

2. Enzymatic engineering for optimising catalytic processes and discovering novel enzymes.

3. Metabolic pathway engineering for creating entirely new biosynthetic routes through microbial cell factories.

Together, these methodologies are reshaping how industries think about ingredient sourcing, moving from synthetic chemistry to synthetic biology.

1. AI in Natural Ingredient Discovery

To get a picture of what a microorganism can and cannot produce, one needs a detailed map of its genes and metabolites. Classical approaches usually rely on similarity searches against previously characterised proteins, yet most protein functions remain unknown; only 0.3% of the 250 million protein sequences catalogued in the UniProt database carry a functional annotation.1 This leads, down the road, to an incomplete picture of an organism's metabolic potential, that is, which compounds it can produce. Machine learning (ML) models have proven effective in predicting proteins’ enzymatic activity, reaching 87% accuracy.2 Yu and colleagues demonstrated the value of their model by validating its predictions on halogenases, a class of enzymes that is difficult to predict but plays an important role in the synthesis of drugs and other bioactives, reaching an accuracy of 86–100%.2 Prediction of protein structure by AlphaFold2 and others has further boosted functional annotation through structure similarity.1

The same principle, inferring biological function directly from sequence data, can be extended beyond enzymes to other metabolic traits, for instance, oils. Transformer models applied to genomic sequences can predict a yeast’s propensity to accumulate lipids, enabling the discovery of novel oleaginous yeasts with unique fatty-acid profiles.3 Some of these profiles could potentially replace up to 74 plant-based oils.
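The underlying idea of sequence-to-function prediction, turning a raw sequence into numeric features that can be compared or learned from, can be shown with a toy k-mer baseline. This is deliberately far simpler than the contrastive deep-learning model of Yu et al.; the sequences and labels below are invented placeholders.

```python
from collections import Counter

def kmer_features(seq: str, k: int = 3) -> Counter:
    """Count overlapping k-mers: a classic baseline featurisation for
    sequence-function models (real systems use learned embeddings)."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def similarity(a: Counter, b: Counter) -> int:
    """Shared k-mer count: a crude stand-in for embedding similarity."""
    return sum((a & b).values())

# Toy "annotated database" of two protein families (invented sequences).
annotated = {"halogenase-like": "MKVLITGG", "kinase-like": "AGTPLQRS"}
feats = {name: kmer_features(s) for name, s in annotated.items()}

# Annotate an uncharacterised query by its most similar known family.
query = kmer_features("MKVLITGA")
best = max(feats, key=lambda name: similarity(query, feats[name]))
print(best)  # halogenase-like
```

The limitation the article points to is visible even here: a query with no close homologue in the database gets no meaningful annotation, which is exactly the gap learned models aim to close.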

2. AI-Enhanced Enzymatic Engineering: Optimising Nature's Catalysts

Engineering proteins is common in drug manufacturing, agriculture, consumer products and more. Antibodies, for instance, are engineered to enhance their binding and specificity as therapeutics, whereas the stability and activity of enzymes can be improved under process conditions to achieve more efficient chemical syntheses. AI models integrating sequence, structural and high-throughput experimental data help prioritise variants with enhanced activity, selectivity and robustness.4 These methods allowed Climax Foods to design a casein analogue, produced by precision fermentation, that replicates the taste and texture of real cheese in plant-based products.5

Generative AI represents a fundamental shift from analysing and modifying natural proteins to designing them from scratch. Protein design models (e.g., ProteinMPNN6) and diffusion-based networks enable function-first design: users specify catalytic requirements, and the models generate stable, foldable scaffolds beyond evolutionary space. Autonomous platforms coupling large language models with robotic biofoundries require only a target function and fitness metrics,7 unlocking enzymes for non-natural substrates and achieving catalytic efficiencies exceeding those of directed evolution.

Ginkgo Bioworks showcased the power of generative AI in enzyme engineering through a landmark project on a key enzyme in central carbon metabolism, one that had seen only a two-fold performance gain in over 50 years of traditional research. Using its AI platform Owl, Ginkgo designed and screened iterative libraries beginning with 2,000 variants and, within just four AI-guided generations, achieved a 10-fold increase in catalytic efficiency. This breakthrough, validated through extensive activity assays and protein characterisation, shows how generative models can rapidly learn sequence–function relationships and produce highly customised biocatalysts.

Another example of generative AI applied to novel ingredient design is TastePepAI, a platform for de novo design of multifunctional taste peptides with customisable flavour profiles.8 The system is built on a tailored variational autoencoder that learns from known sweet, salty and umami peptides while avoiding undesirable profiles such as bitterness. This allows the model to generate entirely new peptide sequences predicted to have specific taste properties. Candidate peptides are then screened for safety using SpepToxPred, an AI toxicity predictor optimised for short food peptides. In experimental validation, the team identified 73 new taste peptides, all of which showed the expected flavour characteristics in electronic-tongue testing and demonstrated excellent biocompatibility. Such AI-accelerated workflows now make it possible to design enzymes for ingredient manufacturing, such as lipases for emollient esters or glycosyltransferases for robust prebiotic oligosaccharides, faster and more precisely.

3. AI-Powered Metabolic Pathway Engineering: Redesigning Production from First Principles

Several graph-based algorithms and other non-machine-learning approaches have been successfully used to guide metabolic pathway engineering. These methods rely on curated biochemical reaction databases, such as MetaCyc and KEGG, and have enabled the biosynthesis of a range of target molecules, including vanillin in engineered yeast. Integrating ML into these established retro-biosynthetic workflows shows strong potential to accelerate pathway design by reducing the number of trial-and-error iterations typically required in metabolic engineering.9 Such hybrid approaches could also unlock the production of molecules previously considered technically or economically infeasible.

Recent AI-enabled platforms illustrate this promise. BioNavi-NP, for example, identifies biosynthetic pathways for over 90% of test compounds and achieves 1.7-fold higher accuracy than conventional rule-based tools.10 These computational advances support more accurate and scalable strain-design pipelines. As a practical demonstration, Moreno-Paz et al. reported a 68% increase in p-coumaric acid titre in yeast after two machine-learning-guided Design–Build–Test–Learn (DBTL) cycles, underscoring how ML can directly enhance microbial production performance.11
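The graph-search idea underlying these pathway-design tools can be sketched on a toy reaction network. The compounds and one-step conversions below are illustrative placeholders, not entries from MetaCyc or KEGG, and real tools search far larger networks with enzyme feasibility scoring on top.

```python
from collections import deque

# Toy reaction network: substrate -> products reachable in one enzymatic
# step. Entries are illustrative placeholders, not curated database data.
reactions = {
    "glucose":         ["pyruvate"],
    "pyruvate":        ["acetyl-CoA", "l-phenylalanine"],
    "l-phenylalanine": ["cinnamic-acid"],
    "cinnamic-acid":   ["p-coumaric-acid"],
    "acetyl-CoA":      ["malonyl-CoA"],
}

def shortest_pathway(start: str, target: str):
    """Breadth-first search for a fewest-step biosynthetic route."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for product in reactions.get(path[-1], []):
            if product not in seen:
                seen.add(product)
                queue.append(path + [product])
    return None  # no route in this network

print(shortest_pathway("glucose", "p-coumaric-acid"))
```

ML enters this workflow by ranking candidate routes and predicting which enzymatic steps are likely to work in a given host, which is where the reduction in trial-and-error iterations comes from.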

Manufacturing Optimisation and Quality Control

Optimising and controlling fermentation remains one of the greatest challenges in microorganism-based ingredient production. These steps are essential for reducing variability, lowering production costs and achieving competitiveness as a food source. This challenge is amplified in fermented foods that depend on complex environmental microflora rather than a single organism, making them inherently susceptible to microbial shifts, batch-to-batch inconsistency and safety risks.

1. AI/ML for Real-Time Process Monitoring

The integration of IoT technologies into fermenters has transformed how fermentation is monitored and controlled. Real-time sensors and cloud-based data pipelines now enable high-resolution tracking of key variables with greater precision and scalability.12 Smart biosensors, including electronic noses and tongues, further strengthen process transparency by detecting biochemical markers, such as glucose or amino acids, at extremely low concentrations. Recent advances in AI-enhanced biosensors improve contaminant detection and overall process reliability. Dynamic regulation of fermentation parameters helps to maintain optimal culture conditions while limiting human intervention. Reinforcement learning approaches have been used for adaptive control strategies that adjust process variables in real time.

The emergence of deep reinforcement learning (DRL) has strengthened adaptive process control by combining reinforcement learning with deep neural networks, enabling faster and more accurate real-time decision-making. For example, a DRL controller improved penicillin yield by 14%, highlighting its promise for real-world fed-batch fermentation systems.13
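For intuition, the sketch below shows the much simpler classical baseline that RL-based controllers generalise: a fixed-gain proportional feedback loop holding glucose near a setpoint in a toy fed-batch simulation. All values are invented for illustration; an RL agent would learn the control policy from reward signals instead of using a hand-tuned gain.

```python
# Toy fed-batch loop: proportional feedback nudges the feed rate so that
# glucose tracks a setpoint despite constant consumption by the culture.
# All numbers are illustrative, not drawn from any real process.
setpoint, glucose = 5.0, 8.0      # target and starting glucose (g/L)
consumption = 0.6                 # glucose used by the culture (g/L per h)
gain = 0.5                        # fixed controller gain (hand-tuned)

for hour in range(24):
    error = setpoint - glucose
    # Feed a base rate matching consumption, corrected in proportion
    # to the error; feed rate cannot go negative.
    feed_rate = max(0.0, consumption + gain * error)
    glucose += feed_rate - consumption   # simple hourly mass balance

print(round(glucose, 2))  # settles at the 5.0 g/L setpoint
```

A learned controller earns its keep when the plant is nonlinear and time-varying (growth phases, changing yields), where a single fixed gain like this one performs poorly.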

2. Digital Twins

Digital twins (DTs) are emerging as a powerful addition to this ecosystem. A DT is a virtual representation of a biological process that integrates multi-omics data (genomics, transcriptomics, proteomics, metabolomics) alongside real-time sensor data from bioreactors. Together, these layers create a dynamic virtual model of organism growth and metabolism under varying conditions. Such systems allow in silico testing of genetic modifications, medium compositions or operating conditions, potentially reducing months or years of iterative experimentation.14 Although digital twins have proven effective in multiple industries, their adoption in biomanufacturing remains early-stage due to biological complexity and stringent regulatory constraints.

Promising examples already exist, notably in traditional fermented ingredients. In the beer industry, hybrid digital twins combining multiple ML models have been used to predict beer quality throughout fermentation.15 Similarly, AI-driven modelling in wine fermentation enables dynamic condition adjustment, flavour optimisation and quality forecasting.16 In algae-based bioactive compound production, AI/ML systems have been used to continuously update operational parameters and predict biomass yields.16

It is important to recognise that most bioprocess digital twins are not purely ML-driven. Physics-based models, particularly mass-balance ordinary differential equations, remain widely used because they rely on fewer parameters and do not require the extensive datasets needed for ML methods. ML approaches, while powerful, are often limited by the high cost and labour required to generate sufficiently large and diverse experimental datasets. Hybrid methods that couple ML models with optimisation algorithms have also demonstrated benefits in fermentation optimisation: for instance, a back-propagation neural network (BPNN) coupled with an adaptive genetic algorithm (AGA) improved lincomycin fermentation efficiency and increased yield by 8%.17
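The physics-based core of such a twin can be sketched as a pair of mass-balance ODEs with Monod growth kinetics, integrated here by explicit Euler steps. The parameter values are invented for illustration; a real twin would fit them to sensor data and update them online.

```python
# Minimal mass-balance model: Monod kinetics for biomass (X) and
# substrate (S) in a batch culture, integrated by explicit Euler steps.
# Parameter values are illustrative, not fitted to any real process.
mu_max = 0.4    # maximum specific growth rate (1/h)
K_s = 0.5       # half-saturation constant (g/L)
Y_xs = 0.5      # biomass yield (g biomass per g substrate)

X, S = 0.1, 10.0    # initial biomass and substrate (g/L)
dt = 0.01           # Euler step (h)

for _ in range(int(48 / dt)):        # simulate 48 h of batch growth
    mu = mu_max * S / (K_s + S)      # Monod specific growth rate
    dX = mu * X                      # dX/dt = mu * X
    dS = -dX / Y_xs                  # dS/dt = -(mu / Y_xs) * X
    X += dX * dt
    S = max(0.0, S + dS * dt)        # substrate cannot go negative

print(round(X, 2), round(S, 2))  # biomass plateaus once substrate runs out
```

Mass balance pins the endpoint: X + Y_xs * S is conserved, so biomass plateaus near 0.1 + 0.5 × 10 = 5.1 g/L as substrate is exhausted. Only three parameters govern the model, which is why such mechanistic cores remain attractive when experimental data are scarce.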

Conclusion

AI is rapidly transforming microbial ingredient innovation, from predicting protein function to designing enzymes, rewiring metabolic pathways and modernising process control. Its performance, however, depends on access to diverse, high-quality biological datasets, while publicly available genomic and functional data tend to be skewed toward a narrow group of model organisms. Initiatives such as IMG/M,18 the Global Microbial Gene Catalogue19 and large-scale efforts like BaseData20 are beginning to close this gap.

As these resources expand, AI will increasingly complement, rather than replace, established biological modelling, enabling faster discovery, more efficient engineering and more reliable manufacturing of food and cosmetic ingredients.

REFERENCES

1. Wang X, Yang P, Zhao B, Liu S. AI-assisted food enzymes design and engineering: a critical review. Systems Microbiology and Biomanufacturing. 2022 Oct 1;3(1):75–87.

2. Yu T, Cui H, Li JC, Luo Y, Jiang G, Zhao H. Enzyme function prediction using contrastive learning. Science. 2023 Mar 31;379(6639):1358–63.

3. Lawrence C. Creating biotech for the post-deforestation era: SMEY launches NOY, a “Neobank of Yeasts” for cultivated oils (Internet). Tech.eu. 2025 (cited 2025 Nov 26). Available from: https://tech.eu/2025/06/26/oil-biotech-for-the-post-deforestation-era-smey-launches-noy-a-neobank-of-yeasts-for-cultivated-oils/

[Figure: AI-driven microbial ingredient discovery pipeline (Source: SMEY). 1) Microbe genome exploration: functional prediction of enzymes from genomics data (homology-based and AI-based) with AI structure prediction. 2) Enzyme design: activity prediction, functional improvement through AI-guided mutations and de novo enzyme design by AI, yielding improved and new enzymes. 3) Metabolic pathway discovery and engineering: metabolic network reconstitution and biosynthetic engineering toward an ideal microbial compound producer (e.g., carotenoids, oils, peptides, cosmetics, drugs). 4) Smart fermentation: an AI-metabolic hybrid model optimises the fermentation process, with a digital twin providing early anomaly detection, higher yields and stable production.]

4. Orsi E, Schada von Borzyskowski L, Noack S, Nikel PI, Lindner SN. Automated in vivo enzyme engineering accelerates biocatalyst optimization. Nature Communications (Internet). 2024 Apr 24;15(1):3447. Available from: https://www.nature.com/articles/s41467-024-46574-4

5. BabyBel maker Bel Group, biotech Climax Foods partner to refine plant-based cheese with AI (Internet). FoodNavigator-USA.com. 2023 (cited 2025 Nov 27). Available from: https://www.foodnavigator-usa.com/Article/2023/04/12/babybel-maker-bel-group-biotech-climaxfoods-partner-to-refine-plant-based-cheese-with-ai/

6. Jin S, Wu Q, Fu G, Lu D, Wang F, Deng L, et al. Breaking Evolution’s Ceiling: AI-Powered Protein Engineering. Catalysts. 2025 Sep 2;15(9):842.

7. Singh N, Lane S, Yu T, Lu J, Ramos A, Cui H, et al. A generalized platform for artificial intelligence-powered autonomous enzyme engineering. Nature Communications (Internet). 2025 Jul 1;16(1). Available from: https://www.nature.com/articles/s41467-025-61209-y

8. Yue J, Li T, Ouyang J, Xu J, Tan H, Chen Z, et al. TastepepAI: An artificial intelligence platform for taste peptide de novo design. PLoS Computational Biology. 2025 Oct 16;21(10):e1013602–2.

9. Gricourt G, Meyer P, Duigou T, Faulon JL. Artificial Intelligence Methods and Models for Retro-Biosynthesis: A Scoping Review. ACS Synthetic Biology. 2024 Jul 24;13(8):2276–94.

10. Zheng S, Zeng T, Li C, Chen B, Coley CW, Yang Y, et al. Deep learning driven biosynthetic pathways navigation for natural products with BioNavi-NP. Nature Communications (Internet). 2022 Jun 10 (cited 2023 Jun 1);13(1):3342. Available from: https://www.nature.com/articles/s41467-022-30970-9

11. Moreno-Paz S, van, Elif Eliana, Zwartjens P, Gosiewska S, Santos, et al. Machine Learning-Guided Optimization of p-Coumaric Acid Production in Yeast. ACS Synthetic Biology. 2024 Mar 28;13(4):1312–22.

12. Yee CS, Nur Asyiqin Zahia-Azizan, Muhamad, Mohd A, Raja Balqis Raja-Razali, Muhammad Ameer Ushidee-Radzi, et al. Smart Fermentation Technologies: Microbial Process Control in Traditional Fermented Foods. Fermentation (Internet). 2025 Jun 5;11(6):323–3. Available from: https://www.mdpi.com/2311-5637/11/6/323

13. Li H, Qiu T, You F. AI-based optimal control of fed-batch biopharmaceutical process leveraging deep reinforcement learning. Chemical Engineering Science. 2024 Jun;292:119990.

14. Helmy M, Elhalis H, Rashid MM, Selvarajoo K. Can digital twin efforts shape microorganism-based alternative food? Current Opinion in Biotechnology (Internet). 2024 Jun (cited 2025 May 20);87:103115. Available from: https://doi.org/10.1016/j.copbio.2024.103115

15. Dazzarola C, Tighe R, Pérez-Correa JR, Saa PA. Toward a digital twin for beer quality control: development of a digital model integrating industrial process data and model-based fermentation descriptors. Journal of Food Engineering. 2025 Jul 1;112726.

16. Khosravi P, D’Aiello C. Enhancing Wine Fermentation: The Role of AI-Driven Predictive Modeling in Flavor Optimization. Communications in Computer and Information Science. 2025 Jan 1;64–83.

17. Guo B, Lu X, Jiang X, Shen XL, Wei Z, Zhang Y. Artificial Intelligence in Advancing Algal Bioactive Ingredients: Production, Characterization, and Application. Foods. 2025 May 17;14(10):1783.

18. Li Z, Atique F, Shahzad M, Rehman KU. Intelligent Control Strategy Based on Back-Propagation Neural Network with Adaptive Genetic Algorithm for Lincomycin Fermentation Process. Industrial Biotechnology. 2022 Mar 30;18(2):98–105.

19. JGI IMG: Integrated Microbial Genomes & Microbiomes (Internet). 2018. Available from: https://img.jgi.doe.gov/

20. Letunic I. GMGC: Global Microbial Gene Catalog (Internet). 2020 (cited 2025 Nov 27). Available from: https://gmgc.embl.de/

21. Vince O, Oldach P, Pereno V, Leung MHY, Greco C, Minto-Cowcher G, et al. Breaking Through Biology’s Data Wall: Expanding the Known Tree of Life by Over 10x using a Global Biodiscovery Pipeline. bioRxiv. 2025 Jun 14.

Julie Rojas

Julie Rojas, PhD, is a computational biologist and Scientist AI at SMEY and holds a doctorate from the Ludwig-Maximilians-Universität München (Germany). She develops machine learning models to predict lipid production and profiles in wild yeast strains for sustainable ingredient discovery. With over six years of experience in bioinformatics and multi-omics analysis, she combines genomics, lipidomics and statistical modelling to accelerate microbial innovation in oils, enzymes and fermentation-based bioprocesses.

Email: julie.r@smey.cc

Heykel Trabelsi

Heykel Trabelsi, PhD, is a Product Manager at SMEY and holds a doctorate in Biotechnology from the Université catholique de Louvain (Belgium). His work focuses on microbial strain engineering for the production of high-value biosourced ingredients. At SMEY, he contributes to the development of a novel yeast neobank (NOY), enabling lipid production from non-GMO yeasts and the design of tailored oils and fats.

Email: heykel.t@smey.cc

The ADC Era: Bridging Innovation and Manufacturability for Precision Biopharmaceuticals

Antibody–drug conjugates (ADCs) are at the forefront of targeted therapy, combining the precision of monoclonal antibodies with the potency of cytotoxic agents to deliver highly effective treatments directly to diseased cells while limiting systemic toxicity. The promise of ADCs to transform disease treatment since their inception in the 1960s is undeniable,1 but so are the challenges of their development and manufacture.2 As their clinical impact grows, so does the need to refine development economics and efficiencies: ADCs are inherently complex, requiring specialised payload handling in high-containment environments. Historically, these difficulties have translated into extended timelines, elevated costs and operational bottlenecks. As the field advances, with growing numbers of ADC-based therapies reaching clinical trials and increasing demand for large-scale manufacture, addressing bottlenecks by selecting the correct equipment for each specific step becomes ever more important, whether by purchasing equipment for in-house use or by working with a trusted CDMO partner.

Although the first antibody–drug conjugates (ADCs) were tested in animal models in the 1960s,1 and the first ADC-based clinical trials were conducted in the 1980s, the first ADC drug, Mylotarg (gemtuzumab ozogamicin), was only approved in 2000. Mylotarg had a rocky start: a black box warning was added to its packaging a year after approval, and it was withdrawn from the US market in 2010 due to hepatotoxicity before being reintroduced in 2017 by Pfizer following further clinical trials.3 Despite this initial setback, the field progressed, with several ADC-based therapies approved in the 2010s, including the blockbuster Adcetris (brentuximab vedotin), heralding a new dawn for ADC modalities and an explosion of interest in this class of drugs.

To keep pace with the rise in the number of ADCs approved as treatments for disease, advances in process design, manufacturability and integrated development have also seen great leaps forward. Furthermore, as knowledge and expertise around large-scale manufacturing have improved, so has the ability to streamline drug development programmes, ensuring predictability and cost-effectiveness. Organisations that take a holistic overview, for example by aligning early scientific decisions with scalable process design and informed equipment strategy, are uncovering opportunities to reduce costs without compromising performance or safety. This integrated approach strengthens data integrity, supports risk assessment and helps minimise deviations during scale-up and clinical submissions, creating a more predictable and efficient development pathway. Although many groups now hold specialised capabilities within the ADC ecosystem, it is the integration of these capabilities, whether through developing bespoke in-house solutions, working with a single CDMO, or engaging a series of highly specialised partners within a consortium, that shapes the most efficient and commercially viable development paths.

Rethinking ADC Complexity: Interdependence Drives Cost

The technical architecture of an ADC is well established with all three components, the antibody, linker and payload, bringing their own distinct manufacturing and regulatory requirements. For example, the antibody must maintain structural integrity through upstream expression and downstream purification; the linker, which can be prone to aggregation or unintended cleavage, must remain stable during storage and conjugation; and the payload must be handled under strict containment while delivering predictable drug-to-antibody ratios (DARs). Poor decisions early in a project amplify downstream inefficiencies. For example, unstable linkers may require additional purification, hydrophobic payloads may complicate conjugation control and marginally stable liquid formulations may necessitate costly cold-chain logistics. Understanding these interdependencies is the foundation of both technical and economic success.
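As an aside on the DAR bookkeeping above: the reported average DAR is simply the drug-load species distribution (for instance, derived from hydrophobic interaction chromatography peak areas) weighted by drug load. A minimal sketch, using an invented distribution for a hypothetical cysteine-conjugated ADC:

```python
# Illustrative calculation of the average drug-to-antibody ratio (DAR)
# from a species distribution, e.g. HIC peak-area fractions.
# The distribution below is invented for demonstration only.

def average_dar(species_fractions):
    """species_fractions maps drug load n -> fraction of the ADC population."""
    total = sum(species_fractions.values())
    if abs(total - 1.0) > 1e-6:
        raise ValueError(f"fractions must sum to 1, got {total}")
    return sum(n * f for n, f in species_fractions.items())

# Hypothetical distribution: mostly DAR-2/4/6 species with small tails
distribution = {0: 0.05, 2: 0.30, 4: 0.40, 6: 0.20, 8: 0.05}
print(f"average DAR = {average_dar(distribution):.2f}")  # -> average DAR = 3.80
```

A drift in this weighted average between batches is exactly the kind of conjugation-control signal the interdependencies above make so costly to ignore.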

The integration of biologics with synthetic chemistry is a key area of focus due to the requirement to manage the needs of each component alongside the breadth of expertise required. Each stage of the manufacture of ADCs requires specialised equipment, robust process control and keen analytical monitoring to ensure product quality and process reproducibility. Integration of these processes is essential to reduce risk, minimise product loss and ensure enhanced scalability to secure commercial and clinical supply:

• Bioreactors are used for mAb production and enable precise control over cell growth, nutrient feed, pH, temperature and oxygen content, with both stainless steel and single-use formats available to suit specific client needs. Advanced bioreactors support real-time monitoring through Process Analytical Technology (PAT), which allows rapid identification of any deviations that could negatively impact the manufacture.

• Tangential flow filtration (TFF) and chromatography are both used for mAb purification, with stainless steel and single-use formats available. During these processes, minimal shear stress alongside control of concentration and temperature are essential for antibody integrity. Crucially, TFF supports ultra, micro, nano and diafiltration operations, enabling gentle and efficient buffer exchange, concentration and impurity removal for ADCs, while preserving conjugate integrity and providing scalable, consistent purification from development through manufacturing.

• Control over the reaction conditions during conjugation is vital, with concentration, homogeneity and solvent removal all extremely important to ensure reproducible DARs.

• Modern isolator technologies and modular suites allow safe handling of potent payloads without the heavy capital burden associated with traditional facility design. These modular solutions reduce downtime, simplify cleaning and allow facilities to adapt more easily to shifting project needs.

• Single-use systems continue to gain traction, particularly in early clinical production. Their ability to minimise cleaning validation, reduce cross-contamination risk and enable rapid changeover allows companies to move faster with fewer operational overheads.

Variability in any single element can cascade across the process. The whole manufacturing sequence from bioreactor to purification, conjugation, lyophilisation and aseptic fill-finish behaves as a tightly coupled system, as shown in Figure 1.

Designing for Manufacturability: Preventing Costly Challenges Early

R&D choices determine how simple (or how burdensome) the manufacturing path will be. Companies that intentionally design for scalability and manufacturability experience shorter timelines, fewer surprises and significantly lower costs than those that don’t. Investment in and development of in-house capabilities, especially in relation to the purchase of specific equipment and instrumentation, is of increasing importance. In addition, investment in personnel and paying attention to key processes early can pay dividends, for example, by inclusion of manufacturability evaluations, early risk assessment and guidance on how formulation behaviour, conjugation strategy and downstream processing will interact at scale. If an in-house solution is not possible, then working with a single CDMO or a series of highly specialised partners within a consortium can also be a viable route forward, allowing ADC developers to capitalise on the specialist knowledge and expertise within niche CDMO providers. Put simply, knowledge is power.

CASE STUDY

Working with Clients to Optimise ADC Manufacture and Remove Bottlenecks

A mid-stage oncology ADC programme faced significant challenges during Phase II clinical development due to instability in the liquid formulation. The molecule exhibited linker hydrolysis, aggregation and loss of potency, resulting in limited shelf life and an over-reliance on cold-chain storage. These issues risked delaying pivotal studies and increasing both material use and manufacturing cost.

To address these challenges, the development and manufacturing process was re-engineered by working with the client to develop an integrated approach that aligned upstream production, conjugation control, solvent management, formulation optimisation and lyophilisation strategy.

Key elements included:

1. Consistent monoclonal antibody production

High-yield upstream expression was achieved using a 50 L single-use bioreactor, followed by tangential flow filtration (TFF) for buffer exchange and concentration. This provided a stable, well-characterised starting material for conjugation.

2. Safer, efficient solvent removal

Vacuum evaporation supported controlled solvent reduction while minimising operator exposure to high-potency intermediates.

3. Optimised formulation and freeze-drying cycle

Pre-lyophilisation characterisation, including thermal analysis and controlled nucleation studies, enabled the design of a reproducible pilot-scale freeze-drying cycle. This improved cake structure, reduced variability and supported a robust stability profile.

4. Contained aseptic fill–finish

The final drug product was filled under isolator conditions, ensuring sterility and operator protection while supporting efficient turnaround of high-potency batches.

The resulting lyophilised ADC demonstrated >95% reconstitution recovery, retained potency through 12 months of ambient storage and no longer required frozen or refrigerated distribution. This provided significant logistical advantages and reduced storage and transport costs. The integrated development strategy also supported seamless transition into GMP manufacture at multi-kilogram scale, reducing risk and improving regulatory readiness.

Figure 1: The five critical stages of ADC preparation, where TFF is tangential flow filtration.

Manufacturing & Processing

Equipment Choices:

Lyophilisation as a Tool for Long-term Stability

Lyophilisation is emerging as one of the most important cost-shaping technologies in ADC development; in fact, over 80% of commercially approved ADCs to date are formulated as lyophilised products.4 This is because many ADCs degrade rapidly in aqueous form, for example through linker hydrolysis, deamidation and payload-induced destabilisation, often rendering liquid formulations unsuitable beyond short-term storage. Freeze-drying provides a route to restore shelf life, reduce reliance on cold-chain distribution and support more flexible global supply. However, achieving the benefits of lyophilisation depends on precise cycle design, industrial expertise and infrastructure capable of handling highly potent active pharmaceutical ingredients (HPAPIs) classified under Occupational Exposure Band 6 (OEB 6), the highest safety classification for hazardous substances.

During the lyophilisation process, freezing behaviour determines both the ice morphology and the subsequent sublimation pathways, and there are many aspects to consider. Rapid cooling may protect product integrity in some cases, whereas controlled nucleation,5 in which ice nucleation is induced at a higher temperature so that freezing proceeds more slowly, enables larger crystal formation and reduced randomness in crystal structure, which may deliver more uniform cake structures overall. Controlled nucleation has several other benefits, including more reproducible product quality and consistency, faster reconstitution during end-use, improved mechanical stability and reduced costs. Primary drying must balance heat input and chamber pressure to avoid collapse, while secondary drying must achieve the residual moisture content necessary for long-term stability without damaging the antibody, linker or payload.
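The heat-input/chamber-pressure balance in primary drying can be illustrated with a common textbook steady-state model (not one the article describes): the per-vial sublimation rate is driven by the gap between the vapour pressure of ice at the product temperature and the chamber pressure, divided by the product resistance. The sketch below uses a widely cited correlation for the vapour pressure of ice; the vial area, fill mass and constant resistance `Rp` are assumptions for illustration only, so the absolute times are not meaningful, only the trends.

```python
import math

# Simplified steady-state primary-drying estimate with constant product
# resistance -- a lumped textbook model, far simpler than real cycle design.
# Vial/product parameter values below are assumed for illustration.

def ice_vapor_pressure_torr(T_kelvin):
    """Widely used correlation for the vapour pressure of ice (Torr)."""
    return 2.698e10 * math.exp(-6144.96 / T_kelvin)

def primary_drying_hours(ice_mass_g, T_ice_K, P_chamber_torr,
                         vial_area_cm2=3.8, Rp=2.0):
    """Rp: product resistance in cm2*Torr*h/g (assumed constant here)."""
    dP = ice_vapor_pressure_torr(T_ice_K) - P_chamber_torr
    if dP <= 0:
        raise ValueError("chamber pressure must sit below ice vapour pressure")
    rate_g_per_h = vial_area_cm2 * dP / Rp   # sublimation rate per vial
    return ice_mass_g / rate_g_per_h

# Lowering chamber pressure (at fixed product temperature) speeds sublimation
t_100mTorr = primary_drying_hours(3.0, 248.15, 0.100)
t_50mTorr  = primary_drying_hours(3.0, 248.15, 0.050)
print(f"{t_100mTorr:.1f} h at 100 mTorr vs {t_50mTorr:.1f} h at 50 mTorr")
```

In practice resistance grows with dried-cake depth and the product temperature must stay below collapse, which is why cycle design remains a specialist exercise.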

Working with an organisation that has world-leading expertise and access to state-of-the-art equipment across both ADC manufacture and subsequent drug product preparation, such as lyophilisation, can be invaluable. Such a partner can offer insights into the design and validation of the freeze-drying approach, streamlining cycle development, strengthening process control and reducing the risk of scale-up challenges.

Developing In-House Expertise or Using a CDMO Partner?

As ADC programmes progress towards more advanced stages, the practical value of developing in-house expertise becomes clear, although the breadth of expertise required, from handling high-potency materials to conjugation chemistry, purification, analytical characterisation, formulation development, lyophilisation and aseptic processing, can render it impractical for many organisations to build complete internal capabilities. In this case, enlisting a partner with technical expertise in a specialised area can significantly accelerate development and remove the need for large capital investments.

The ADC field is continually evolving, with more complex constructs, more potent payloads and more ambitious clinical targets constantly being developed. Rather than building capabilities indiscriminately, the most efficient organisations choose targeted equipment investments, purchasing instrumentation and equipment alongside specialist training that amplifies flexibility, supports scalability and complements their in-house capabilities for use on subsequent internal R&D projects. If this isn’t possible, then working with a trusted CDMO partner with expertise in a specific area, such as lyophilisation or formulation development, can be a valuable alternative.

REFERENCES

1. Fu, Z., Li, S., Han, S., Shi, C. & Zhang, Y. Antibody drug conjugate: the “biological missile” for targeted cancer therapy. Signal Transduct. Target. Ther. 7, 93 (2022). https://doi.org/10.1038/s41392-022-00947-7

2. Beck, A., Goetsch, L., Dumontet, C. et al. Strategies and challenges for the next generation of antibody–drug conjugates. Nat. Rev. Drug Discov. 16, 315–337 (2017). https://doi.org/10.1038/nrd.2016.268

3. Ali, S., Dunmore, H.-M., Karres, D. et al. The EMA Review of Mylotarg (Gemtuzumab Ozogamicin) for the Treatment of Acute Myeloid Leukemia. The Oncologist, 24, e171–e179 (2019). https://doi.org/10.1634/theoncologist.2019-0025

4. Wen, L., Zhang, Y., Wang, S. S. et al. Fundamental properties and principal areas of focus in antibody–drug conjugates formulation development. Antibody Therapeutics, 8, 99–110 (2025). https://doi.org/10.1093/abt/tbaf005

5. Geidobler, R. & Winter, G. Controlled ice nucleation in the field of freeze-drying: Fundamentals and technology review. Eur. J. Pharm. Biopharm., 85, 214–222 (2013). https://doi.org/10.1016/j.ejpb.2013.04.014

Richard Lewis

Richard Lewis has been with Biopharma Group, primarily in sales, for nearly 15 years, developing a wealth of experience across the upstream and downstream process workflow offered. As part of his role, he regularly visits clients, seeing first-hand the latest developments in the field and enabling Biopharma Group to continue supporting their customers with the latest innovations. He is currently Director of the BPS Capital Equipment portfolio.

Email: rlewis@biopharma.co.uk

Dr. Mattia Cassanelli

Mattia Cassanelli completed a PhD in chemical engineering from the University of Birmingham, where his research considered the impact of drying techniques, such as supercritical fluid drying and freeze drying, on hydrocolloid structure. Since joining Biopharma Group in 2018, he has held roles that are at the interface of consultancy and sales. He is currently Director of CDMO Services.

Email: mcassanelli@biopharma.co.uk

A Quantitative Rationale for Hybrid Buffer Preparation Strategies in Biopharmaceutical Manufacturing

Biopharmaceutical manufacturing continues to evolve as upstream titres increase, therapeutic modalities diversify and continuous processing approaches gain wider adoption. These advances place increasing demand on downstream operations, where buffer preparation remains a critical aspect of manufacturing productivity. Traditionally, buffers were rarely a focal point of strategic discussions, but this paradigm is changing as deficiencies in buffer availability, quality or timing disrupt schedules, increase labour burden and delay batch release. As process productivity and portfolio complexity grow, buffer logistics become proportionally more complex, making buffer strategy a foundational element of operational performance rather than a peripheral concern.

Historically, made-in-house (MIH) buffer preparation, based on powder dispensing, dilution with water for injection (WFI) and manual adjustment of pH and conductivity, has been the default approach in many facilities. At scale, MIH can offer favourable raw material economics and a high degree of internal control. However, as production volumes increase, MIH approaches can introduce significant challenges, including ergonomic and environmental health and safety (EHS) risks associated with powder handling, increased facility footprint requirements for hold vessels and downstream delays linked to quality control (QC) testing and release. Deviations or failed releases in buffer preparation can propagate through an entire manufacturing campaign, amplifying operational risk.

These challenges are compounded in facilities that support multiple programmes or modalities. A manufacturing site dedicated to a single high-volume monoclonal antibody may be well served by MIH strategies, but the addition of cell and gene therapy programmes, niche recombinant proteins or clinical-scale campaigns alters the optimisation landscape. In such environments, MIH may be inefficient for small or intermittent buffer demands while lacking the flexibility required for rapidly changing schedules. As a result, reliance on a single buffer preparation method increasingly proves insufficient.

Buffer Preparation Strategies

Modern facilities typically select among three buffer preparation strategies in addition to MIH:

• In-line dilution (ILD)

• In-line conditioning (ILC)

• Ready-to-use (RTU) buffers

As described above, MIH involves internal preparation using powders or liquid concentrates, typically in stainless steel or single-use vessels. ILD generates working-strength buffers by diluting concentrated stock solutions on demand, while ILC produces buffers by blending defined acid, base and salt streams to achieve target pH and conductivity in real time. RTU buffers are procured as fully formulated 1X solutions from external suppliers.

Each approach presents distinct advantages and limitations. MIH often offers the lowest raw material cost for large volumes but is constrained by labour, footprint and release timelines. RTU buffers can minimise operational burden and EHS risk for low-volume or high-variability applications, but may introduce higher material costs and storage requirements. ILD and ILC can reduce footprint and improve responsiveness, but they require investment in equipment, automation and analytical verification, as well as robust operational expertise.
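At its core, the ILD approach described above reduces to a mass-balance flow split: the concentrate stream times its fold-concentration must equal the outlet stream at working strength. A minimal sketch, with a 10X concentrate and a 20 L/min outlet as invented example values:

```python
# Flow-split sketch for in-line dilution (ILD): a buffer concentrate is
# blended with WFI on demand to deliver working-strength (1X) buffer.
# The 10X / 20 L/min example values are invented for illustration.

def ild_flow_split(stock_conc_x, target_flow_lpm):
    """Return (concentrate_flow, wfi_flow) in L/min for a 1X outlet.

    Species mass balance: stock_conc_x * Q_conc = 1 * Q_out.
    """
    if stock_conc_x <= 1.0:
        raise ValueError("stock must be more concentrated than the 1X target")
    conc_flow = target_flow_lpm / stock_conc_x
    wfi_flow = target_flow_lpm - conc_flow
    return conc_flow, wfi_flow

conc, wfi = ild_flow_split(stock_conc_x=10.0, target_flow_lpm=20.0)
print(f"concentrate {conc:.1f} L/min + WFI {wfi:.1f} L/min of 1X buffer")
# -> concentrate 2.0 L/min + WFI 18.0 L/min
```

The arithmetic is trivial; the engineering burden lies in metering those two streams accurately enough, which is exactly why ILD and ILC demand validated flow measurement and inline verification.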

Decision Drivers Beyond Material Cost

While direct material cost is frequently the starting point for buffer strategy discussions, effective decision-making requires a broader, multi-criteria perspective. Cost considerations must include not only raw materials but also capital investment, consumables, labour and quality compliance activities. Supply chain factors such as storage footprint, warehouse capacity and buffer availability under just-in-time conditions are equally important. Operational strategy further encompasses speed of delivery, recipe flexibility and EHS considerations related to powder handling and the use of concentrated acids or bases.

An MIH-only strategy may appear favourable when evaluated solely on raw material cost, but often becomes less attractive once labour intensity, space utilisation and deviation risk are incorporated. Conversely, RTU buffers may appear costly until the operational value of reduced preparation time, improved safety or avoidance of schedule disruptions is considered. ILD and ILC can offer substantial gains in agility and footprint efficiency, provided that metering accuracy, automation integration and verification strategies are appropriately implemented. Consequently, the optimal buffer preparation strategy is context-dependent and may evolve as facility priorities change.

Advantages of Hybrid Buffer Strategies

Increasingly, biomanufacturers are implementing hybrid buffer strategies, and quantitative, multi-criteria assessments consistently demonstrate that these outperform single-method approaches across a range of operational scenarios. By aligning buffer preparation methods with demand characteristics, hybrid models enable cost efficiency where volume is high and operational flexibility where variability or risk is greatest. High-volume, stable buffers are often best served by MIH or ILC approaches, which leverage economies of scale and continuous supply. Medium-volume buffers frequently align well with ILD strategies that balance footprint reduction with cost control. Low-volume or specialised buffers are often best addressed through RTU solutions that minimise operational complexity and EHS exposure.

With this as a baseline, Table 1 summarises eight criteria commonly incorporated into such assessments, spanning cost, supply chain and operational strategy dimensions. When these criteria are evaluated and scored using a weighted model, hybrid strategies typically prove more favourable than single-method baselines.

While each criterion represents an important consideration, weighting them according to organisational priorities and aggregating their combined impact yields a weighted score for assessing overall favourability (Figure 1). A higher weighted score indicates a more favourable buffer process.
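The weighted model can be sketched in a few lines. The example below uses five hypothetical criteria (the article's tool uses eight) with invented 1–5 scores and two invented weight sets; it is only meant to show how re-weighting priorities can flip which method scores as more favourable:

```python
# Minimal weighted-scoring sketch of the multi-criteria assessment described
# above. Criteria, 1-5 scores and weight sets are hypothetical placeholders,
# not the article's actual eight-criteria model or its reported scores.

def weighted_score(scores, weights):
    """Weighted average score: higher means a more favourable strategy."""
    if set(scores) != set(weights):
        raise ValueError("criteria mismatch between scores and weights")
    return sum(scores[c] * weights[c] for c in scores) / sum(weights.values())

mih = {"cost": 5, "labour": 2, "footprint": 2, "EHS": 2, "speed": 2}
rtu = {"cost": 2, "labour": 5, "footprint": 4, "EHS": 5, "speed": 4}

w_cost = {"cost": 5, "labour": 1, "footprint": 1, "EHS": 1, "speed": 1}
w_ehs  = {"cost": 1, "labour": 1, "footprint": 1, "EHS": 5, "speed": 1}

for weights, label in [(w_cost, "cost-weighted"), (w_ehs, "EHS-weighted")]:
    print(f"{label}: MIH {weighted_score(mih, weights):.2f}, "
          f"RTU {weighted_score(rtu, weights):.2f}")
```

With these invented numbers, MIH wins under cost-heavy weights and RTU wins once EHS is overweighted, mirroring the reprioritisation dynamic in the CIP example that follows.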

Illustrative Application of a Buffer Assessment Framework

A hypothetical example, shown in Figure 2, illustrates how shifting priorities can influence buffer strategy decisions. Consider a facility that prepares clean-in-place (CIP) solutions in-house and initially prioritises direct raw material cost. In this scenario, a multi-criteria assessment weighted toward cost yields a favourable baseline score (2.81) for MIH preparation. However, as higher-concentration CIP solutions are introduced, EHS risks associated with handling corrosive materials increase, prompting a reassessment of priorities. When EHS is weighted more heavily, the favourability of MIH decreases substantially to 1.80.

A subsequent evaluation demonstrates that transitioning to RTU CIP solutions improves the consolidated score (2.92) under the revised priority structure, reflecting improved alignment with safety and operational objectives.

Aligning Buffer Strategies with Facility Design and Organisational Capabilities

MIH operations benefit from investments in contained powder transfer systems, ergonomic aids and dust control measures, which often yield rapid returns through reduced safety incidents. RTU adoption requires realistic assessments of storage capacity, secondary containment, temperature control and shelf-life alignment with campaign schedules. Across all methods, automation integration and validation pathways must be clearly defined, with robust governance of recipes, master data and change control.

Table 1. Favourability of buffer preparation methods can be assessed across eight criteria.
Figure 1. A buffer assessment tool allows organisations to evaluate buffer preparation methods based on their priorities, overweighting the importance of the top three criteria.

Application Note

Successful implementation of hybrid buffer strategies depends on attention to technical details. For ILD and ILC systems, metering accuracy must be validated across relevant viscosity and temperature ranges, with dual verification approaches such as mass flow measurement and gravimetric checks for critical buffers. Inline pH and conductivity analytics should be integrated with appropriate alarms and interlocks to prevent out-of-spec blending. Single-use components offer speed and flexibility at low to moderate volumes but may become cost-inefficient at higher throughput, where reusable systems with validated cleaning can offer lower total cost of ownership.

Buffer strategy decisions are also shaped by organisational capabilities and facility layout. Equipment placement influences material flow, operator movement and safety, while poor layout choices can negate the benefits of advanced preparation technologies. Different buffer methods shift complexity across functions. MIH relies heavily on skilled operators and QC throughput, ILD and ILC emphasise automation and metrology expertise, and RTU strategies increase dependence on supplier management and incoming quality control.

As priorities shift, whether due to safety incidents, capital constraints or pipeline changes, the buffer preparation mix must be able to adapt without requiring wholesale facility redesign.

Facilities seeking to modernise buffer preparation without disrupting ongoing operations can adopt a phased approach. Initial efforts should focus on baseline characterisation of buffer demand by volume and process step, coupled with quantification of cost, footprint, labour and EHS impacts. Pilot implementations can then target specific high-impact buffers, with clearly defined performance metrics. Successful pilots can be expanded into broader hybrid strategies, supported by standardised recipes, aligned quality processes and supplier agreements. Ongoing governance ensures that buffer strategies remain aligned with evolving operational priorities.

Conclusion

The task of buffer preparation may lack visibility, but it exerts a disproportionate influence on manufacturing performance. In an environment characterised by higher productivity, tighter timelines and increasing portfolio diversity, hybrid buffer strategies provide a pragmatic and quantitatively defensible approach. By segmenting buffer demand, applying multi-criteria decision frameworks and maintaining flexibility as priorities evolve, organisations can ensure that buffer operations support, rather than constrain, the broader manufacturing strategy.

This article was originally published in European Biopharmaceutical Review.

Shannon J. Ryan

Shannon J. Ryan, PhD, is the senior director of application services at Avantor, where he leads a team of subject matter experts supporting clients to efficiently design and develop scalable biomanufacturing processes from development through commercial scale. Shannon has more than 20 years of experience in biotechnology with a focus on process development, scale-up and process intensification. He holds a PhD in organic chemistry from Colorado State University in Fort Collins, Colorado, and a B.A. in biology and chemistry from Central College in Pella, Iowa.

Figure 2. Assessments and quantitative scores using the buffer assessment tool in a hypothetical example.

Manufacturing & Processing

Operational Excellence at Scale: Why Integrated Manufacturing is the Future Biopharmaceutical Standard

Demand for end-to-end, integrated manufacturing capabilities continues to grow in the biopharmaceutical industry amid lingering operational and regulatory challenges.

The integrated manufacturing model positions contract development and manufacturing organisations (CDMOs) to offer three competitive advantages to their clients: simplified processes across the value chain, standardised operations for consistency, and scalable production with agility. By unifying traditionally fragmented steps into a continuum, CDMOs can turn logistical complexities into operational excellence, achieving efficiency, speed and scale without compromising quality, and ultimately earning the trust of clients, regulatory authorities and the industry.

Conventional, siloed manufacturing models are becoming outdated as the industry adapts to tighter regulations and shifting dynamics. Delays in technology transfers, cracks in data integrity and inconsistent batch outcomes not only harm operations but also erode client trust. On the other hand, CDMOs that implement end-to-end, integrated operations stay ahead of these headwinds. They can reinforce digital infrastructures through electronic manufacturing batch records (eMBRs) and manufacturing execution systems (MESs), ensure that technology transfers are both rapid and reliable, consistently release batches on time and continuously enhance their track record of client satisfaction.

Building Blocks for Integration

Integrated CDMOs combine technical and operational expertise that is built incrementally over time. Flexibility drives this buildup, starting with adaptable facility design and modular bioprocessing strategies, extending through digitalisation and anchoring in a robust quality culture, all reinforced by strategic partnerships.

• Facility Design: A CDMO facility must be built to evolve in a continuously changing industry landscape. Hybrid stainless-steel and single-use systems provide both long-term capacity and agile changeover readiness, while modular cleanrooms and utility backups facilitate reconfiguration without interrupting operations. By analysing cumulative process and product data, CDMOs can design facilities to accommodate mainstream industry demands and reserve modularity for specific niches, thereby avoiding costly over-engineering. The outcomes are shorter product changeovers, faster adaptability to new modalities and efficiency that scales with client demand.

• Modular Bioprocess Strategies: Process adaptability underpins integration. In upstream operations, process analytical technology with Raman probes monitors nutrient concentrations and cell metabolism in real time, reducing risks and triggering automated adjustments. Dual-feeding strategies combine bolus and continuous methods to accommodate a wider range of biologics, while interchangeable centrifuge bowl designs optimise yield across varying cell densities. In downstream processing, multi-train systems and adaptable chromatography columns enable parallel processing, manage complex molecules and match purification scales to client requirements, keeping projects on track despite shifting demands.

• Digitalisation: eMBRs and MESs lay the digital foundation for integration. By linking production floor systems with quality management, these digital advancements provide real-time traceability, ensure proactive process control and support faster batch release. Beyond compliance, digitalisation allows knowledge from one project to be captured and applied to the next. This accelerates learning cycles, reinforces traceability and issue-response capabilities, and enables data-driven flexibility across all stages of manufacturing.

• Quality Culture: No operation is considered minor. From labelling sampling bags to logging pH variations in a batch record, teams operate on a right-first-time principle. A culture of continuous improvement embeds regulatory readiness, translates data into actionable insights and reinforces client trust.

• Strategic Partnerships: Through strategic partnerships, CDMOs collect operational insights and project data, then leverage them to enhance their facility flexibility, modular strategies and optimised workflows. This iterative cycle builds an integrated infrastructure that adapts, scales and performs reliably under dynamic market and regulatory conditions.

Competitive Advantages of End-to-end Integration

The integrated manufacturing model offers three defining advantages to clients: it streamlines processes across the value chain, standardises operations for consistency and scales production with agility. Together, these benefits accelerate time-to-market, enhance operational excellence and build trust.

• Streamlined Processes for Risk Mitigation and a Higher Batch Success Rate: Streamlined operations stabilise production, control deviations and enforce quality oversight. Uniform training, analytics and troubleshooting protocols reduce downtime and ensure batch-to-batch consistency. Integrated validation systems synchronise process performance qualification campaigns across drug substances and drug products through shared protocols and real-time aggregated data. Parallel runs and consolidated reporting remove handoffs, minimising manual work and freeing time for rigorous corrective actions. Unified serialisation optimises supply chain management, reduces waste and supports strategic resource planning.

• Standardised Operations for Faster Technology Transfers and Supply: If streamlined processes create stability, standardisation embeds consistency. Shared infrastructure, retained process knowledge and digital continuity facilitate rigorous gap assessments and, thereby, expedite technology transfers without compromise. Integrated digital systems unify documentation practices and governance frameworks, preemptively equipping CDMOs for evolving regulatory compliance. One-team rapport forged among scientists across the manufacturing, validation and quality units drives timely scale-ups, batch releases and filings according to client requirements. Speed and reliability are embedded in standardised operations.

• Scalability for Manufacturing Readiness and First-to-market Advantage: Scalability drives integration by turning stability and consistency into agility and resilience. CDMOs centralise supply across the clinical and commercial phases, align regulatory requirements and accelerate filings. Scalability also tackles a key industry risk: the manufacturing capacity squeeze. Without a clear path to scale, timelines slip and opportunities vanish. CDMOs address these capacity constraints by adopting standardised facility designs that enable robust technology transfers, embedding automation and digital-first operations to cut errors and deliver right-first-time production. In concert with streamlined processes and standardised strategies, scalability elevates integration to a strategic engine, securing readiness, reducing risk and enabling a first-to-market advantage.

Operational Excellence at Scale

Integrated manufacturing systems are not new in industrial production environments. Manufacturing industries such as automotive and aerospace have long relied on modular facilities, digitalised production records and standardised process flows to achieve manufacturing stability and efficiency across varied product lines.

The biopharmaceutical industry, however, has faced distinct constraints. The industry's evolving regulations, strict quality controls and product complexity have made the adoption of integrated manufacturing architectures slower and less uniform. Biopharmaceutical products require customised development pathways, controlled cleanroom operations and robust analytical validations at each stage. These factors have disincentivised infrastructure designs that tightly link upstream and downstream activities or that unify development with commercial scale-up.

Nonetheless, industry demand has shifted. With accelerated development timelines, the diversification of therapeutic modalities and the pressure to reduce the cost of goods sold across biologics portfolios, the need for integrated systems has heightened.

As a result, end-to-end integration becomes a strategic imperative. As pipelines grow increasingly complex and regulatory pressures intensify, CDMOs that adopt integrated models secure their roles as trusted partners. Success depends on delivering speed, scale and quality within a unified, accountable framework.

Fragmented models fail to meet today’s demand for complex modalities and accelerated timelines. Integrated operations, from adaptable facilities and modular bioprocessing to digitalisation and embedded quality, create the infrastructure to perform, adapt and excel. They accelerate technology transfers, boost batch success, streamline validations, simplify supply chains and strengthen client relationships.

By consolidating technical, operational and digital capabilities, integrated CDMOs transform from outsourcing vendors into strategic partners. They deliver tangible outcomes across the value chain, enable faster delivery of life-saving therapies and define the benchmark for operational excellence at scale in biopharmaceutical manufacturing.


Kevin Sharp

Kevin Sharp is Executive Vice President and Head of Sales & Operations at Samsung Biologics, which he joined in 2017. He brings extensive U.S. pharmaceutical experience, including serving as Director of Business Development at Contract Pharmacal Corp. Previously, he spent over nine years at GSK in various business development and procurement roles across vaccines, pharmaceuticals and consumer healthcare. Kevin completed the Executive Development Program at Northwestern University's Kellogg School of Management.

Manufacturing & Processing

Vertical Integration: Building Resilience and Scalability in Nucleic Acid Therapeutics Manufacturing

The Shift Towards an Integrated Supply Model

The biopharmaceutical sector is entering a new phase shaped by the clinical and commercial momentum of nucleic acid therapeutics (NATs), which include mRNA-based products and oligonucleotide therapeutics such as siRNA and ASOs, as well as sgRNA used in gene editing programmes. Together, these technologies are expanding treatment options by enabling precise genetic targeting and adaptable therapeutic design.

Since the first nucleic acid-based therapeutic approval in 1998, the field has expanded to more than 20 approved products, with hundreds of ongoing clinical trials.1 As more programmes move toward larger studies and commercial supply, manufacturing expectations are tightening around consistency and scalable execution. The field’s expansion is exposing vulnerabilities that challenge scalability, quality consistency and speed to clinic.

In response, some contract development and manufacturing organisations (CDMOs) are adopting a vertically integrated supply model that unites raw material synthesis with GMP production of drug substance and drug product within the same company. By aligning key production stages, this approach can support reliability and scalability while strengthening operational control.

This article examines how vertically integrated models are being applied to mRNA and oligonucleotide manufacturing and how this approach is influencing the standards of efficiency and quality in NAT development and manufacturing.

Managing NAT Manufacturing Complexity as Demand Increases

The therapeutic potential of NATs has advanced rapidly in recent years, driven by their programmable nature and high specificity, which support targeted intervention for diseases previously considered undruggable.1,2 Reflecting this shift, the global market for NATs is projected to reach USD 16.4 billion by 2030, growing at a compound annual growth rate (CAGR) of 16.3% as the technology matures towards wider clinical and commercial use.3
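As a quick sanity check on the cited figures, the compound-growth arithmetic can be run directly. The 2024 base year is an assumption taken from the cited forecast's 2024–2030 window; the implied base value is derived here, not quoted from the report.

```python
# Compound-growth check of the cited projection: USD 16.4bn by 2030 at
# a 16.3% CAGR. Base year 2024 is assumed from the forecast window.

def implied_base(future_usd_bn: float, cagr: float, years: int) -> float:
    """Invert future = base * (1 + cagr) ** years to recover the base."""
    return future_usd_bn / (1 + cagr) ** years

base_2024 = implied_base(16.4, 0.163, 2030 - 2024)
print(f"Implied 2024 market size: ~USD {base_2024:.1f}bn")  # ~USD 6.6bn
```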

Meeting this growing demand will be challenging. The molecular complexity of NATs necessitates highly specialised production capabilities, and existing manufacturing infrastructure must adapt to meet the large volumes of NATs that are expected to be needed in the future to support cardiometabolic disease targets such as APOC3, LPA, PCSK9, AGT, HSD and INHBE.2,4 To support this expanding pipeline, supply chains must also scale to accommodate unprecedented volumes while maintaining the rigorous quality standards required for commercial approval.

As demand for NATs accelerates, developers are also under pressure to manage increasingly complex workflows that often span multiple vendors and geographies. In this fragmented environment, supply chain resilience is a practical determinant of speed to clinic and manufacturing consistency, particularly when programmes are scaling and have a low tolerance for disruption. A fragmented supply chain makes coordination more difficult, increasing the risk of unwanted variability and delays, as seen during the COVID-19 pandemic, when accelerated timelines, raw material supply shortages and constrained fill–finish capacity amplified disruption risk.5

Fragmented Supply Chains Compound NAT Programme Risks

In conventional NAT manufacturing, production is typically fragmented across multiple vendors spanning raw materials (including building blocks and regulatory starting materials [RSMs]), cGMP drug substance production, drug product formulation and fill–finish, and analytical testing. Each handoff of materials or documentation between organisations introduces operational risk, increasing the likelihood of misalignment or delays that can compound as programmes advance.

Fragmentation also makes investigation and issue resolution more complicated. When problems occur, root cause analyses and corrective actions can become more difficult and time-consuming when decisions, data and batch records are dispersed across multiple stakeholders and quality systems.

For some developers, supplier scheduling dynamics add another layer of uncertainty. Smaller programmes may be deprioritised in vendor queues, particularly when competing against higher volume accounts, which could impact timelines that are already sensitive to upstream or downstream handoffs.

Managing Complexity With a Vertically Integrated CDMO Partner

As demand for NATs continues to grow, the limitations of a fragmented supply chain model will become more pronounced. A vertically integrated CDMO model mitigates these constraints by consolidating all manufacturing activities within one organisation, from building block and RSM production through to cGMP CDMO services. With the elimination of vendor handoffs, project management and documentation are better coordinated, traceability is strengthened and development timelines can be accelerated.

This structure also enhances regulatory readiness. When batch records, raw material traceability and quality documentation are maintained within a single organisation, it streamlines audit preparation and simplifies the compilation of data packages for regulatory submissions.

Key Features of a Robust Vertically Integrated CDMO Partner:

1. Full Spectrum Capabilities from Raw Materials to cGMP Services

The CDMO should be able to manage the entire supply chain, from building block and RSM production through to cGMP drug substance manufacturing, drug product formulation, fill–finish, packaging and QC, as a "one-stop shop" under one roof.

2. Technical Leadership, Facilities and Workforce Readiness

A vertically integrated CDMO will have end-to-end operations overseen by expert technical leaders and executed in modern facilities using state-of-the-art equipment by an appropriately trained workforce.

3. Built for Regulatory Compliance

Facilities will be cGMP-compliant and supported by a robust quality system that has successfully supported regulatory submissions to global health agencies. Traceability infrastructure will enable lot-level tracking from building blocks through to the released batch, with comprehensive documentation and batch records.

4. Capacity to Support Scale-up and Continuity

Scalable infrastructure supports continuity with a single partner from early development through to commercial production. A vertically integrated CDMO’s capacity will support scale-up and parallel projects without delays, with dedicated facilities, flexible layouts and adaptable workflows as indicators of long-term fit.

5. A Cross-functional Project Team That is an Extension of Your Team

Bringing programme leadership and manufacturing teams together helps to ensure that technical decisions are made quickly and issues are resolved efficiently. When the CDMO works closely with the sponsor as a partner, it’s easier to stay aligned and respond quickly when problems arise.

6. Next-generation Technologies

As part of a full spectrum of end-to-end capabilities, a vertically integrated CDMO will be capable of leveraging next-generation technologies, such as chemoenzymatic ligation and liquid-phase synthesis, where doing so enhances scalability and sustainability and controls costs.

7. Organisational Flexibility Aligned to Different Developer Profiles

A vertically integrated CDMO provides the flexibility to support different types of developers and clinical stages. Staying with one partner from early development through to late-stage manufacturing helps maintain quality, consistency and smooth execution as the programme grows.

Modality-specific Considerations for Vertically Integrated Manufacturing

The potential benefits of a vertically integrated approach become even clearer when viewed through the lens of specific NAT modalities, which differ in chemistry, raw materials and process sensitivities, creating distinct pressure points across their manufacturing workflows:

Oligonucleotide-based Therapeutics

Oligonucleotide synthesis relies on specialised RSMs, including phosphoramidites, loaded solid supports and ligands such as GalNAc. These inputs, which are manufactured from building blocks such as nucleosides, are typically treated as the point where cGMP controls begin for oligonucleotide manufacture, with stringent expectations for sourcing justification and quality control.

• From these RSMs, oligonucleotide drug substance manufacture involves tightly controlled chemical steps for chain elongation, including protection and deprotection steps plus sequential coupling and oxidation steps.

• Chemoenzymatic ligation may optionally be used for siRNAs and other oligonucleotides. Where it is used, enzymes and buffer components are introduced as raw materials, which also require sourcing and control as part of the manufacturing strategy.

• In a vertically integrated model, aligning RSM manufacture with downstream drug substance production can simplify traceability and change control when specifications or processes evolve.

mRNA-based Therapeutics

The workflow to manufacture mRNA therapeutics involves complex, interdependent steps with sensitivity to raw material inputs and process conditions. Consolidating control over key raw materials such as modified nucleotides, enzymes, cap analogs and DNA templates can help maintain alignment across the workflow and reduce variability as scale increases.

• Downstream of the raw materials, integrated management across cGMP mRNA production by IVT, purification, LNP formulation, fill–finish and packaging supports faster technical iteration, because changes can be evaluated within the context of a single quality environment and the need for transfer between vendors is eliminated.

Gene Editing Programmes That Use Long Guide RNA

In addition to all the components described in the workflow for mRNA therapeutics, gene editing programmes also require coordinated manufacture of guide RNA, such as sgRNA and pegRNA. These long, chemically modified RNA molecules can be produced with exceptionally high purity using chemoenzymatic ligation with a CDMO partner that is competent with that technology.

Implications for the Future of NAT Manufacturing

As the NAT industry continues its march toward large-scale volumes to address chronic global health burdens, the need for more efficient and scalable manufacturing will grow. By consolidating materials, processes and documentation "under one roof", from production of building blocks and RSMs like phosphoramidites to final drug product fill–finish, vertically integrated CDMOs eliminate the vendor handoffs that have historically hindered speed and increased operational risk. This integrated model ensures that quality is embedded throughout the manufacturing lifecycle, with a transparent and traceable trail of documentation that simplifies regulatory submissions and strengthens long-term supply security.

Figure: A vertically integrated oligonucleotide and mRNA manufacturing platform in which a single CDMO partner controls the entire supply chain, from raw material building blocks through to drug substance and drug product.

For developers, shifting to an integrated partner means moving beyond fragmented vendor relationships toward a collaborative model that prioritises de-risked scaling, execution efficiency and programme continuity. As the global NAT pipeline continues to diversify and expand, those who leverage this model will be best positioned to bridge the gap between clinical promise and commercial reality, bringing transformative genetic medicines to patients with greater speed and confidence.

REFERENCES

1. Belgrad, J., Fakih, H. H., & Khvorova, A. (2024). Nucleic acid therapeutics: Successes, milestones, and upcoming innovation. Nucleic Acid Therapeutics, 34(2), 52–72. https://doi.org/10.1089/nat.2023.0068

2. Dzau, V. J., & Hodgkinson, C. P. (2024). RNA therapeutics for the cardiovascular system. Circulation, 149(9), 707–716. https://doi.org/10.1161/CIRCULATIONAHA.123.067373

3. Nucleic Acid Therapeutics Market Report, Industry and Market Size & Revenue, Share, Forecast 2024–2030. Strategic Market Research. https://www.strategicmarketresearch.com/market-report/nucleic-acid-therapeutics-market

4. Obexer, R., Nassir, M., Moody, E. R., Baran, P. S., & Lovelock, S. L. (2024). Modern approaches to therapeutic oligonucleotide manufacturing. Science, 384(6692), eadl4015. https://doi.org/10.1126/science.adl4015

5. Kis, Z., Kontoravdi, C., Shattock, R., & Shah, N. (2020). Resources, production scales and time required for producing RNA vaccines for the global pandemic demand. Vaccines, 9(1), 3. https://doi.org/10.3390/vaccines9010003

David Butler

David Butler, PhD, is the Chief Technology Officer at Hongene Biotech Corporation, a vertically integrated raw materials supplier and CDMO services provider that is focused on the RNA manufacturing space, the scope of which includes mRNA, vaccines, oligonucleotides, gene editing and gene therapy. Previously in his career, he led organisations driving drug discovery and development of oligonucleotide therapeutics. He also spent time as a Principal Scientist, developing early LNP technologies for siRNA delivery that were the progenitors of those used for mRNA-related products today.

Mitigating At-Risk Cell and Gene Therapy Application with Rapid Sterility Testing

Modern medicine is predicated on the safety provided by sterility assurance and aseptic practices. The United States Pharmacopeia defines these two important concepts in USP <1116>: "Sterile means having a complete absence of viable microorganisms or organisms that have the potential to reproduce", while "aseptic describes the process for handling sterilised materials in a controlled environment designed to maintain microbial contamination at levels known to present minimal risk". From these two concepts arise the quality control regulations and practices that guide the medical and pharmaceutical world today.

Building on these principles, the manufacture of sterile products requires numerous aseptic techniques and tests to assure sterility. These include adequate facilities with controlled particulate levels, regular environmental monitoring and bioburden testing of incoming raw materials. These procedures all lead to the final product, which is then subjected to the sterility test.

Traditional Sterility Testing Methods

The sterility test is a simple and accurate test that has remained largely unchanged since being adopted by the British Pharmacopoeia in 1932. It has since been accepted globally and harmonised; examples of its requirements can be found in USP <71>, European Pharmacopoeia (Ph. Eur.) 2.6.1 and ICH Q4B Annex 8. The test involves inoculating the sample into growth media, followed by incubation for 14 days. The media are then visually inspected for growth; a negative test indicates that the sample tested is free of viable microorganisms.

Although the sterility test is an excellent test to ensure the safety of a product, several considerations must be made to ensure its efficacy. First, a negative result does not confirm that the entire manufactured lot of product is sterile. USP <71> makes it clear that the test results apply only to the sample that was tested. The purpose of this limitation is to emphasise the importance of the entire aseptic process in providing confirmation of sterility. Sterility is never proved by a single test; rather, it is assured by the contamination control strategies and in-process testing that support confidence in the final result.
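The statistical basis of this limitation can be made concrete. The binomial sketch below is a standard illustration, not drawn from USP text: with a lot contamination rate p and n units sampled, the probability that the sample contains at least one contaminated unit is 1 − (1 − p)^n.

```python
# Standard binomial illustration of why a passing sterility test cannot
# prove lot sterility (figures are hypothetical, not from USP <71>).

def detection_probability(p: float, n: int) -> float:
    """Probability that an n-unit sample from a lot with contamination
    rate p contains at least one contaminated unit."""
    return 1.0 - (1.0 - p) ** n

# Even at a 1% contamination rate, a 20-unit sample catches the
# problem only about 18% of the time.
print(f"{detection_probability(0.01, 20):.1%}")  # → 18.2%
```

This is why sterility assurance rests on the whole contamination control strategy rather than on the final test alone.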

Secondly, the suitability of the test must be proven for each product tested. The chapter guidance makes it clear that the test results are not reliable unless samples spiked with known microorganisms have been detected by the test.

Thirdly, the entire process of performing the test needs to be kept in an aseptic environment to prevent environmental contamination leading to false positives. With these considerations, the sterility test has kept modern medicine safe.

Challenges with Traditional Sterility Testing Methods

With the correct contamination control strategies and test suitability considerations, the sterility test becomes a highly effective method for screening pharmaceutical injectables, medical devices, ophthalmic solutions and other topical solutions needing sterility assurance. However, despite these strengths, the traditional sterility test has one major, unmitigable limitation: the 14-day time to result. For many products this is not an issue, as the hold time is minuscule compared to the product's shelf life. But for products that require a short turnaround, there is a risk that they will be administered before the final metric of sterility assurance has been determined.

Regulatory agencies refer to products that involve modifications of cells, proteins and tissues that are then administered to the patient as Advanced Therapy Medicinal Products (ATMPs). The need for a measure of sterility within hours for these products has led the USP and Ph. Eur. to draft chapters addressing Rapid Microbiological Methods (RMMs) to fill this need for rapid sterility assurance. The corresponding chapters are USP <1071> Rapid Microbial Tests and Ph. Eur. 5.1.6 Alternative Methods for Control. The requirements in these chapters for a rapid sterility method include the ability to produce results in less than 24 hours, the ability to detect both a low quantity and a wide range of viable microorganisms, and the ability to test low sample volumes while testing multiple samples at once. The goal of these requirements is to reduce the risk posed by administering a product before sterility results are available, as well as to detect microorganisms that would not be found by the traditional sterility test, including those in antibiotic-containing samples and culture-negative infectious agents.

Rapid Sterility Methods

Challenges with Growth-based Rapid Sterility Tests

In response to these needs, several rapid microbiological methods for sterility testing have been presented. These solutions can be categorised as growth-based or non-growth-based methods. All report results in Colony Forming Units (CFU), corresponding to the traditional sterility test's ability to detect a single microorganism that propagates.

The USP has accepted two growth-based alternatives to the sterility test (<71>). The first, described in <72>, uses respiration-based techniques to detect contamination. This test resembles the traditional sterility test in that the sample is inoculated into growth media and then incubated; however, microorganisms are detected not just by visible growth but by an indicator that reveals carbon dioxide. Because carbon dioxide is a byproduct of respiration, the test effectively detects all respiring, and therefore viable, microorganisms. The method can detect between 1 and 10 CFU per sample and provides sterility assurance within 7 days.

Subsection: Cell and Gene Therapy

The second growth-based method involves the detection of ATP (<73>). A luminescence indicator reveals ATP, providing sterility assurance within 5 days with a detection limit of 1–10 CFU. Because ATP is a product of cellular respiration, the test provides a broad, accurate measure of viable microorganisms.

Although these methods provide an equivalent measure of sterility in less than half the time of the traditional test, they still fall short of the sub-24-hour result that many RMM applications require. They also remain susceptible to sample components that can interfere with growth.

Challenges with non-growth-based rapid sterility tests

Although the USP has not yet issued a compendial chapter on any non-growth method for sterility assurance, USP <1071> and Ph. Eur. 5.1.6 do provide guidance on the adoption of several non-growth techniques for rapid sterility indication. First, solid-phase cytometry is an extremely useful tool for visually scanning an entire sample for microorganisms. It gives rapid results within a few hours at a detection level of 1–10 CFU, on par with the USP's recommended sensitivity for rapid methods. Although the method is simple, reliable, sensitive and rapid, it performs best on samples free of other cells in suspension as part of the product. For the cell and gene therapies that make up the majority of ATMPs, it can therefore encounter difficulties.

A final method mentioned as a solution is nucleic acid amplification testing (NAT) using PCR analysis. These tests amplify target genes that are conserved among bacterial and fungal DNA, and with conventional qPCR analysis they deliver a broad-coverage result within a few hours. However, they have two notable limitations. First, DNA is a stable molecule, so these tests are susceptible to false positives caused by residual DNA. Second, they do not match the sensitivity of the other RMMs, with quantification limits typically in the range of 10–100 CFU.

Many alternatives to the traditional sterility test exist. However, for an ATMP that contains a high concentration of cells, needs results within 24 hours, needs detection below 10 CFU and potentially contains residual bacterial or fungal DNA, no existing alternative method meets every requirement. With these limitations in mind, a test has been developed to address them.

RIBONAT™ PERFORMANCE

RiboNAT™ Methodology

The RiboNAT™ Rapid Sterility Test kit utilises the nucleic acid amplification test (NAT) method, in which ribosomal RNA (rRNA) is detected using RT-rt PCR. Rather than detecting genomic DNA (gDNA), as other DNA-based microbial methods do, detecting rRNA allows for higher sensitivity and fewer false positives from the residual DNA of dead microorganisms. Highly conserved ribosomal subunits are targeted for both bacteria and fungi, providing specificity that prevents cross-reactions with human cells while supporting a wide detection range of microorganisms. With this rapid microbial test method, sterility assurance for short shelf-life products can be confirmed before patient application, obviating at-risk treatment.

The RiboNAT™ kit consists of three distinct parts: a pre-treatment step, an RNA isolation step and a final measurement and detection step, allowing the full assay flow to be completed within seven hours. In the pre-treatment process, the sample is cultured both aerobically and anaerobically while free DNA and DNA from dead microorganisms are inactivated. The second step involves a two-stage DNA degradation process using DNase, plus magnetic beads for the extraction and purification of microbial RNA. In the final detection step, the purified ribosomal RNA is detected using one-step RT-rt PCR. The reagents employed in the RiboNAT™ Rapid Sterility Test are divided into three individual kit components, RNA Isolation Kit 1, RNA Isolation Kit 2 and the Detection Kit, to accommodate the distinct stages of the assay.

RiboNAT™ Sensitivity and Minimisation of False Positives

Through the detection of rRNA rather than gDNA, RiboNAT™ provides a wider, more sensitive detection range covering anaerobic and aerobic bacteria as well as fungi in a single assay. Because RNA is present in far greater quantity than DNA, RiboNAT™ has enhanced detection sensitivity compared with conventional DNA-based NAT methods, validated to as low as 9 CFU/mL. The standard protocol involves incubating the samples for 3 hours; with an extended incubation period, sensitivity can be increased for certain samples. With a 14-hour incubation period, three microorganisms tested (Aspergillus brasiliensis, Clostridium sporogenes, Cutibacterium acnes) were detectable at 2 CFU/mL. Six microorganisms specified in the Strains of the Test Microorganisms Suitable for Use in the Growth Promotion Test and the Method Suitability Test of USP Chapter <71> were used to validate the sensitivity of the RiboNAT™ assay. All strains were prepared at 9 CFU/mL and were successfully detected with threshold cycle (Ct) values below 35. In addition, the six strains were spiked into HEK293, MSC and T-cell suspension samples and detected following the RiboNAT™ assay protocol; all six remained detectable at 9 CFU/mL in these cell suspension preparations.
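As a rough illustration of the call logic behind Ct-based detection, the sketch below applies the Ct < 35 positivity cutoff described above. The helper function, well names and spiked Ct values are hypothetical, included only to show how such a cutoff turns raw amplification data into positive/negative calls; only the threshold itself is taken from the validation data.

```python
# Minimal, hypothetical sketch of a Ct-based positive/negative call for a
# PCR sterility assay. Only the Ct < 35 positivity cutoff comes from the
# validation data described above; wells with no amplification curve are
# represented here as None.

CT_CUTOFF = 35.0  # Ct values below this are treated as detections

def classify_well(ct):
    """Return 'positive' if amplification crossed the threshold before the
    cutoff cycle, otherwise 'negative' (including no-amplification wells)."""
    if ct is None:  # no amplification curve observed
        return "negative"
    return "positive" if ct < CT_CUTOFF else "negative"

# Illustrative wells: two spiked samples and a sterile control
wells = {
    "C. sporogenes spike": 28.4,      # hypothetical Ct value
    "A. brasiliensis spike": 31.7,    # hypothetical Ct value
    "sterile PBS control": None,      # no amplification
}
calls = {name: classify_well(ct) for name, ct in wells.items()}
```

In practice a validated assay would also check amplification-curve shape and internal controls; the cutoff comparison shown here is only the final step of that decision.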

By utilising an RNA-based detection method, false positives were significantly minimised compared with conventional DNA-based NAT methods for sterility assurance. RiboNAT™ not only includes a DNA inactivation step but also targets rRNA instead of gDNA, reducing false positives from dead microorganisms and residual DNA. When sterile PBS was tested with both a commercial DNA extraction kit and the RiboNAT™ sterility kit, amplification curves were observed with the DNA-based method, but no amplification was detected using RiboNAT™.

RiboNAT™ Applications

Traditional growth-based sterility test methods that are compendial in USP <71> require a 14-day incubation period, which is not suitable for products that have short shelf lives or are prepared for immediate use. These products, which include compounded sterile preparations (CSPs), nuclear medicine products and Advanced Therapy Medicinal Products (ATMPs), if not tested with a rapid microbial method for sterility assurance, can pose a risk to patient safety if administered before sterility testing is completed (USP <1071>).

Manufacturers of ATMPs, including cell and gene therapy products, as well as other immediate-use and short shelf-life products that are typically administered at-risk, can benefit from the accelerated sterility assurance timeline provided by the RiboNAT™ Rapid Sterility Test kit. With only a 3-hour incubation required, the entire assay can be completed within 7 hours, in a single working day.

RiboNAT™ can also benefit products that encounter difficulties with other DNA-based NAT methods. Products with small batch volumes may find it particularly practical, as only 1 mL of product is required for each of the aerobic and anaerobic sample preparations. In addition, multiple samples can be tested within one assay, allowing increased throughput where needed.

Conclusion

Compendial sterility testing requires a 14-day incubation period, which can be incompatible with products that have short shelf-lives or are manufactured for immediate use. For products that are not suitable for traditional growth-based sterility testing and are administered at-risk, the RiboNAT™ Rapid Sterility Test kit offers a rapid microbial method to assure product sterility before patient application. RiboNAT™ retains the benefits of NAT for sterility assurance while also minimising the drawbacks associated with DNA-based methods, including false positives and decreased sensitivity.

REFERENCES

1. European Pharmacopoeia, 2.6.1 Sterility, Ph. Eur. 12.1 edition, 2025.

2. European Pharmacopoeia, 5.1.6 Alternative Methods for Control of Microbiological Quality, Ph. Eur. 12.1 edition, 2025.

3. International Council for Harmonisation, Q4B Annex 8: Evaluation and Recommendation of Pharmacopoeial Texts for Use in the ICH Regions – Sterility Test, 2013.

4. United States Pharmacopeia, <71> Sterility Tests, USP–NF 2024, Issue 1, 2024.

5. United States Pharmacopeia, <1071> Rapid Microbial Tests for Release of Sterile Short-Life Products: A Risk-Based Approach, USP–NF 2024, Issue 1, 2024.

6. United States Pharmacopeia, <1116> Microbiological Evaluation of Clean Rooms and Other Controlled Environments, USP–NF 2024, Issue 1, 2024.

Timothy Francis

Timothy Francis is the Senior Technical Specialist for the Pyrogen Testing Products Division of FUJIFILM Biosciences. He holds a B.S. in Biology and an M.S. in Curriculum and Instruction in Science. He came into this role with five years of experience teaching the natural sciences at college level. He is proficient at taking the complex, technical aspects of a topic and breaking them down into clear, understandable pieces that all connect back to the big picture. He draws upon this experience to provide professional technical support and training for the PYROSTAR™ line and to help you with your technical needs.

Delaney Novak

Delaney Novak is a Technical Specialist for the Pyrogen Testing Division of FUJIFILM Biosciences. She holds a B.S. in Environmental Science alongside a minor in Biology. She enjoys working in a collaborative environment and is always open to addressing new challenges and answering complex questions. She applies these skills to further support any technical needs or concerns you may have in your pyrogen testing endeavours.


From Policy to Practice: How Regulatory Momentum for New Approach Methodologies Is Accelerating the Adoption of Organ-on-a-Chip Technology

A Turning Point for Drug Development Models

The biopharmaceutical industry is at an inflexion point. For decades, animal models have served as the backbone of preclinical research, supporting target validation, efficacy assessment and safety evaluation. While these models have enabled countless therapeutic advances, their limitations in predicting human outcomes are increasingly well documented. Persistent late-stage attrition, unexpected toxicities and species-specific differences continue to highlight the translational gap between preclinical promise and clinical reality.

At the same time, a diverse set of human-relevant scientific tools has matured rapidly. Collectively referred to as New Approach Methodologies (NAMs), these approaches are reshaping how researchers think about safety and efficacy assessment. The term “New Approach Methodologies” was formally introduced in 2016 to describe a broad range of techniques, technologies and strategies designed to inform regulatory decision-making without relying on animal testing. While the acronym “NAMs” is sometimes informally used to mean “non-animal methods” or “new alternative methods,” its regulatory meaning is more specific. NAMs are fit-for-purpose approaches that generate data relevant to hazard identification, risk assessment or safety evaluation in a manner that can support regulatory review.

Importantly, NAMs are not a single test or platform. Rather, they encompass a wide scientific ecosystem, including:

• In vitro systems, such as human cell–based assays, 3D organoids and microphysiological systems, including Organ-on-a-Chip technologies

• In silico models, including quantitative structure–activity relationship (QSAR) models, physiologically based pharmacokinetic (PBPK) simulations and artificial intelligence or machine learning approaches

• Omics technologies, such as transcriptomics, proteomics and metabolomics, which provide mechanistic insight into biological responses

• In chemico assays, which assess chemical reactivity directly

• Integrated frameworks, such as Integrated Approaches to Testing and Assessment (IATA), that combine multiple data streams into weight-of-evidence evaluations

What unites these diverse approaches is not simply the absence of animal use, but a shared emphasis on human-relevant biology, mechanistic understanding and data integration.

Within this broader NAM landscape, Organ-on-a-Chip technology has emerged as a particularly compelling solution. By combining human cells with microengineering, controlled fluid flow and physiologically relevant mechanical cues (Figure 1), these platforms seek to replicate key aspects of organ-level function in vitro. They represent a bridge between traditional cell culture and whole-organism biology, offering dynamic, tissue–tissue interfaces and functional readouts that were previously difficult to capture outside of animal models.

The growing sophistication of NAMs such as Organ-Chips coincides with a shift in regulatory policy and scientific priorities. Agencies in the United States, the United Kingdom and other regions have signalled increasing openness to alternative approaches and have articulated structured plans to expand their use. This convergence of technological maturity and regulatory momentum marks a turning point. The key question facing the industry now is not whether NAMs will play a role in the future of drug development, but how quickly and strategically they will be integrated and how the balance between traditional animal models and human-relevant systems will evolve in the years ahead.

Figure 1: Diagram of an Organ-Chip. The original Organ-Chip design, Chip-S1 Stretchable Chip, consists of two parallel microfluidic channels (1, 6), with distinct channels for epithelial cells (2) and endothelial cells (5) and their respective cell culture media. A porous membrane (4) separates the two channels while enabling inter-channel communication and cell migration. Vacuum channels (3) alongside the microfluidic channels provide tunable mechanical stretch across the membrane. Image credit: Emulate, Inc.

Regulatory Signals: From Encouragement to Structural Change

Regulatory agencies have long endorsed the principles of the 3Rs (replacement, reduction and refinement of animal use in research), but a pivotal change occurred in the United States with the passage of the FDA Modernization Act 2.0 in late 2022. It removed the explicit mandate that required the use of animal models in preclinical studies, instead allowing sponsors to use “nonclinical tests or studies”, including non-animal models when scientifically appropriate. The significance of this change was both symbolic and practical. For decades, animal testing had been embedded in regulatory frameworks as the default expectation. By updating the statutory language, the FDA Modernization Act 2.0 formally acknowledged that scientific progress had expanded the toolkit available to developers. It signalled that human-relevant NAMs were not merely supplementary, but potentially sufficient in defined contexts.

However, the passage of the Act did not immediately trigger a widespread shift in industry practice. Several factors contributed to this measured response. First, while the legislation removed a legal requirement, it did not automatically establish detailed guidance on how and when alternatives would be accepted in place of animal data. Sponsors remained cautious, recognising that regulatory review ultimately depends on the totality of evidence and scientific justification. Second, broader adoption depends on experience and confidence in the performance of emerging platforms. As with any new scientific approach, organisations seek assurance that systems are robust, reliable and reproducible across studies and settings. Building that confidence requires well-characterised datasets, cross-laboratory experience and clarity around the context of use. Third, global harmonisation remained incomplete. Pharmaceutical development is inherently international and sponsors must consider requirements across multiple regulatory jurisdictions. Until broader alignment emerged, companies were hesitant to diverge dramatically from established global practices.

The next inflexion point came in April 2025, when the U.S. FDA announced a formal initiative to phase out animal testing requirements in specific areas over time, accompanied by a roadmap outlining how that transition would be achieved. Unlike the statutory revision in 2022, which simply removed an obligation, the 2025 announcement articulated and actively endorsed a new direction of travel. Crucially, the roadmap framed the transition as data-driven and iterative. Rather than setting arbitrary deadlines, it outlined measurable milestones, identifying priority areas where animal models are known to have limited predictive value, supporting comparative studies to benchmark NAM performance and increasing reviewer training to ensure consistent evaluation of alternative data streams. This approach acknowledged both the urgency of modernisation and the need for scientific rigour.

The FDA’s announcement had an impact beyond U.S. borders. Regulatory science operates within a global ecosystem and leadership from one major agency often influences others. In the months following the FDA’s roadmap, other governments, including the United Kingdom, European Union and parts of Asia-Pacific, reiterated or expanded their own commitments to reducing reliance on animal testing. In the United Kingdom, policymakers highlighted NAMs within broader life sciences strategies and research funding priorities. In Europe and parts of Asia-Pacific, agencies signalled growing openness to incorporating advanced in vitro and computational methods into safety and efficacy assessments. A summary of major regulatory agency updates can be found in the table below.

Date | Agency | Milestone & Impact
Dec 2020 | FDA | ISTAND Programme Launch. Opens formal pathway for novel Drug Development Tools, such as Organ-Chips.
Dec 29, 2022 | US Congress | FDA Modernization Act 2.0. Removes statutory animal test mandate; defines “nonclinical tests” to include in vitro, in silico and microphysiological systems.
Feb 6, 2024 | US Congress | FDA Modernization Act 3.0 (introduced). Directs FDA to build a routine qualification pathway for NAMs.
Sep 24, 2024 | FDA | First Organ-Chip accepted into ISTAND. Emulate’s submission of the Liver-Chip S1 for predicting DILI establishes evidentiary precedent.
Apr 10, 2025 | FDA | Roadmap & Phase-Out Plan. Animal studies to become “the exception”; prioritises MPS data and AI-driven models.
Apr 29, 2025 | NIH | Funding Priorities Shift. Grants now favour human-based technologies over animal-only studies.
May 29, 2025 | US Navy | Ends Cat & Dog Experiments. Signals wider federal move toward NAMs.
Jul 7, 2025 | NIH | Bars Animal-Only Proposals. Requires at least one validated human-relevant method in funded research.
Nov 11, 2025 | UK Government | Announces a plan similar to the US FDA’s to phase out animal testing, with specific milestones and £75 million in new funding to help accomplish that goal.
Jan 22, 2026 | US Environmental Protection Agency | Reaffirms its commitment to phasing out animal testing by 2035.
Early 2026 | European Medicines Agency | Expected to issue guidelines on the reduction of animal usage in preclinical research.

Table 1: A Timeline of Recent Regulatory Milestones

While the specific policies differ across jurisdictions, a clear pattern has emerged: regulatory authorities are no longer simply permitting alternatives; they are actively encouraging their development and structured integration. Importantly, though, none of these developments implies the immediate disappearance of animal models. Instead, these policies will foster a period of co-existence that will enable a gradual but deliberate rebalancing through incremental integration of NAMs, focusing first on where they address known limitations of existing models and where they can provide additional insight.


Organ-on-a-Chip Technology: Maturing to Meet Regulatory Demand

As regulatory expectations evolve, so too has the maturity of Organ-on-a-Chip technology itself. These systems were initially developed to capture key aspects of human tissue structure and function by combining primary or stem cell–derived human cells with microengineering (such as controlled fluid flow, shear stress and mechanical stretch), alongside in vivo-relevant factors such as a tissue–vascular interface and appropriate extracellular matrices. Early demonstrations focused on physiological credibility, reproducing barrier integrity, tissue–tissue interfaces, mechanical stretch or metabolic function in ways that conventional static culture could not. As the technology has developed, its value proposition has shifted accordingly, with the emphasis increasingly on decision support: can results from a human-relevant microphysiological system influence a go/no-go determination, clarify a mechanistic ambiguity or strengthen a risk assessment? This reframing is essential for adoption within pharmaceutical organisations, where new methods must operate within established pipelines and development timelines.

As a decision-making tool, Organ-Chips are often introduced in areas where species differences are well recognised or where conventional models have shown limited predictive value. Human-specific toxicities that fail to manifest in standard preclinical species, mechanistic investigations requiring direct access to human tissue responses and de-risking programmes involving novel modalities or first-in-class targets are common entry points. In these settings, the goal is not wholesale replacement but enhanced confidence. By generating data in parallel with established in vitro and in vivo approaches, teams can assess concordance, explore discrepancies and build institutional familiarity without jeopardising regulatory-facing workflows.

As an example of how Organ-Chip data can enhance decision-making, consider recent work that evaluated a human Liver-Chip model in the context of drug-induced liver injury (DILI), one of the most common causes of clinical attrition and post-market drug withdrawals. In a study analysing nearly nine hundred Liver-Chips across 27 compounds with well-characterised clinical hepatotoxicity profiles, researchers benchmarked the performance of the Liver-Chip model against the historical behaviour of these drugs in humans and conventional preclinical models.1 The Liver-Chip correctly identified approximately 87% of compounds known to cause DILI in patients (despite their having passed animal testing), with no false positives for non-hepatotoxic compounds, demonstrating both high sensitivity and specificity for the context of DILI prediction. These results suggest that Liver-Chips can outperform conventional preclinical approaches, offering earlier and more human-relevant indications of liver risk. By incorporating such data alongside animal and traditional in vitro models, teams can gain deeper mechanistic insight and more confidence in safety assessments earlier in development, ultimately reducing unnecessary downstream attrition and enhancing portfolio decision quality. This example highlights how Organ-Chip data, when aligned with defined contexts of use and performance benchmarks, can move beyond proof-of-concept toward practical integration in drug development strategies. Indeed, this Liver-Chip model is now in the final phase of the FDA’s ISTAND programme to qualify its use as a Drug Development Tool for the prediction of DILI in preclinical studies.2
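The performance figures quoted above can be stated in standard confusion-matrix terms. The sketch below is illustrative only: the compound counts are hypothetical numbers chosen to be consistent with the reported ~87% sensitivity and zero false positives across 27 compounds, not the study's actual tabulation.

```python
# Sensitivity and specificity from confusion-matrix counts. The counts
# below are hypothetical, chosen only to be consistent with the reported
# ~87% sensitivity and zero false positives across 27 compounds.

def sensitivity(tp, fn):
    """Fraction of truly hepatotoxic compounds the assay flagged."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of non-hepatotoxic compounds the assay cleared."""
    return tn / (tn + fp)

tp, fn = 20, 3   # hypothetical: DILI-positive compounds flagged / missed
tn, fp = 4, 0    # hypothetical: non-hepatotoxic compounds cleared / flagged

sens = sensitivity(tp, fn)   # 20/23 ≈ 0.87
spec = specificity(tn, fp)   # 4/4 = 1.0 (no false positives)
```

The same two ratios underpin any claim of "high sensitivity and specificity", which is why context-of-use discussions with regulators typically fix the compound set and the positive/negative definitions before such figures are computed.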

This example also highlights a central theme in discussions of broader adoption of Organ-Chips and other NAMs: technological validation, which has multiple aspects to consider.

Technical robustness is, of course, one of the most important factors. Is the system reproducible within and across laboratories, stable over time and compatible with standardised analytical methods? Without consistent performance, even the most biologically compelling system will struggle to gain traction in risk-averse environments.

Equally important is clarity surrounding the context of use. Organ-Chip platforms are not designed to answer every preclinical question, nor should they be positioned as universal solutions. Their application should be defined around a specific objective: predicting a specific class of toxicities, interrogating a defined pathway or modelling a particular organ-level response, for example.

Comparative data will further strengthen confidence in the use of Organ-Chip data. Robust studies that evaluate Organ-Chip outputs alongside legacy animal data, established in vitro assays or known clinical outcomes will help stakeholders understand where these systems provide convergent insight and where they diverge. Notably, divergence is not inherently problematic; in some cases, differences illuminate the limitations of traditional models or reveal human-specific mechanisms that animal studies could not capture. Transparent reporting of both positive and negative results is essential to advancing collective understanding and value attribution.

Finally, regulatory engagement itself will also play a critical role in building confidence. Early dialogue around study design, endpoints and interpretation can clarify expectations and reduce uncertainty. As reviewers gain experience evaluating NAM-derived datasets, confidence will increase on both sides of the submission process.

Looking ahead, the role of Organ-on-a-Chip technology is likely to evolve in tandem with broader changes in regulatory science. If the past several years have marked the beginning of structural regulatory change, the coming years will test the industry’s ability to convert that momentum into operational practice. The trajectory suggests not an overnight transformation, but rather a steady rebalancing, guided by human relevance, scientific rigour and a shared commitment to improving translational outcomes.

REFERENCES

1. https://www.nature.com/articles/s43856-022-00209-1

2. https://www.fda.gov/drugs/drug-safety-and-availability/fdas-istand-pilot-program-accepts-submission-first-organ-chip-technology-designed-predict-human-drug

Lorna Ewart

Lorna Ewart, PhD, is Chief Scientific Officer at Emulate. With more than two decades of experience in pharmaceutical R&D, Dr Ewart has been a driving force in advancing bioscience and drug safety innovation. At Emulate, she leads the biological sciences division and plays a pivotal role in shaping the company's scientific vision, guiding its collaborations across academia, industry and regulatory agencies.


Bridging the ADMET Translational Gap: How New Approach Methodologies and Organ-on-a-Chip Technology Are Redefining Drug Development

Drug developers face challenges in translating preclinical findings into safe and effective human trials. Traditionally, preclinical testing relies on a phased approach that combines simple in vitro assays and whole-animal in vivo studies; however, a translational gap remains between these findings and clinical outcomes.

The translational gap stems from the limited ability of simple in vitro assays to accurately predict human responses, combined with the interspecies limitations of animal studies, which remain a core requirement of global regulatory bodies before human trials unless no suitable model exists. The transition from preclinical models to human subjects therefore requires navigating complex pharmacokinetic (PK) and toxicology uncertainties that vary significantly by drug modality and therapeutic area.

Our inability to adequately address these uncertainties is reflected in consistently high drug attrition rates. Toxicity remains one of the most significant causes of drug attrition, accounting for 30% of drug failures in the clinic.1 Poor human translation in preclinical PK studies is also a major driver of clinical failures, with PK and bioavailability the third most common cause of attrition, at 16% of all failures.2

The relatively poor predictability of animal models, alongside a pull for more ethical and sustainable science, has led to recent regulatory shifts from the US FDA and the UK Government, highlighting their respective commitments to phasing out animal testing in favour of animal alternatives, known as New Approach Methodologies (NAMs).

We now stand at a pivotal moment in drug development, with regulators signalling a future where NAMs, such as in silico models and microphysiological systems (MPS), are more than just peripheral tools; they will now become central to defining how new medicines are discovered, evaluated and approved. To facilitate broader adoption and expedite the pace of change, numerous efforts are being deployed to validate and standardise NAMs for use in regulatory submissions, with many drug developers already using a broad range of NAMs for internal decision making.

This article will explore how NAMs can be utilised in combination to predict a translatable first-in-human dose to support earlier pipeline decision-making and enable developers to build greater confidence in their lead candidates.

Why Is There Momentum Behind NAMs' Use to Bridge the ADMET Translational Gap?

In 2025, the FDA announced its decision to phase out animal testing requirements for monoclonal antibodies, followed by other drugs, signalling a clear shift towards more human-relevant approaches for preclinical safety and toxicity testing. Later in the year, the UK government announced its plans to phase out animal testing faster wherever reliable and effective alternative methods can replace it while offering the same level of safety for human exposure. Specifically called out was a goal to reduce pharmacokinetic studies using dogs and non-human primates by 2030.

Another clear window into the inevitability of NAMs moving from promise to regulatory reality can be found in the FDA’s Innovative Science and Technology Approaches for New Drugs (ISTAND) Programme. While the sample size is modest, the dataset reveals patterns about regulatory expectations, timelines and which technologies are most likely to succeed. As of 21st January 2026, six of the 15 accepted technologies being evaluated in the Drug Development Tool database are MPS, including Organ-on-a-Chip technologies, used for toxicity testing and drug dosing.

Alongside regulatory announcements, partnerships between MPS providers and the industry’s support network of CROs further demonstrate the growing acceptance of MPS platforms as strategic tools that address real-world translational gaps and enable better preparedness for the clinic. From our experience, most contract research studies are currently focused on the use of MPS to solve late-stage challenges, such as exploring dose setting for drugs with a narrow therapeutic window or seeking clarity where there is conflicting data from preclinical species. However, the real efficiency and benefits of MPS come from integrating the technology earlier to guide decision making rather than troubleshooting. As more companies see how MPS helps bridge the translatability gap, adoption is likely to shift increasingly toward earlier stages.

The power of NAMs to change how lead candidates are selected will come from their use in combination, rather than in isolation. For example, data generated by MPS becomes far more insightful when combined with in silico tools, such as physiologically based pharmacokinetic (PBPK) models, machine learning and computational toxicology frameworks, to bridge the gap between in vitro findings and predicted human responses.

The Limitations of Standard In Vitro DMPK Tools

Caco-2 epithelial cells have long been heralded as the gold standard for studying intestinal absorption and permeability. However, these immortalised colorectal adenocarcinoma cells have well-documented limitations in fully replicating the complexity of the human intestine. Notably, expression of key enzymes involved in the hydrolysis of ester prodrugs is neither physiological nor human-relevant in Caco-2 cells, compromising accurate prediction of the metabolism and absorption of ester-containing drugs. Furthermore, the model operates in isolation and therefore cannot profile the combined contribution of the gut and liver to bioavailability. Because of these shortcomings, Caco-2 assays often necessitate additional animal studies, which would not have been required had more advanced and appropriate models been used.

Subsection: Cell and Gene Therapy

Enabling Human-Relevant Insights for Bioavailability and Oral PK Profiling Using MPS

A critical differentiator of MPS is the ability to measure and model drug exposure within the tissue environment. Unlike static cultures, MPS incorporate perfusion, microfluidics and multi-organ communication, all of which influence drug distribution. For regulatory submissions, simply stating the nominal concentration added to the medium is insufficient. Researchers must be able to articulate what exposure the tissues experienced over time and whether clinically relevant maximum concentration (Cmax), area under the curve (AUC) or steady-state levels were achieved.
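As an illustration of these exposure metrics, the sketch below derives Cmax and AUC from sampled concentration–time points using the linear trapezoidal rule. All numbers are invented for demonstration and are not taken from any specific MPS study.

```python
# Illustrative sketch only: deriving exposure metrics from sampled
# medium concentrations in a perfused system. All data are invented.

def exposure_metrics(times_h, concs_uM):
    """Return (Cmax, AUC 0-t) using the linear trapezoidal rule."""
    cmax = max(concs_uM)
    auc = 0.0
    for (t0, c0), (t1, c1) in zip(
        zip(times_h, concs_uM), zip(times_h[1:], concs_uM[1:])
    ):
        auc += 0.5 * (c0 + c1) * (t1 - t0)  # trapezoid for each interval
    return cmax, auc

times = [0, 1, 2, 4, 8, 24]               # sampling times (h)
concs = [0.0, 8.2, 6.1, 3.4, 1.2, 0.1]    # measured concentrations (µM)
cmax, auc = exposure_metrics(times, concs)
print(f"Cmax = {cmax} µM, AUC(0–24 h) ≈ {auc:.2f} µM·h")
```

Steady-state levels under continuous perfusion would be read from the plateau of the same curve; PBPK tools fit richer compartmental models to data of this kind.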

Establishing pharmacokinetic/pharmacodynamic (PK/PD) relationships within MPS assays is essential for demonstrating translational validity. It allows regulators to understand not just what happened in the system, but why it happened and how it relates to human dosing. Moreover, this alignment strengthens the case for reducing animal studies where discrepancies in absorption, metabolism or clearance can obscure human-relevant outcomes.

There are examples of MPS approaches that address these challenges, including a Gut/Liver MPS described by Abbas et al. in 2025, comprising entirely primary human gut and liver tissues and capable of generating data-rich insights into both oral and intravenous drug dosing.4 The system recapitulates two critical determinants of human PK: intestinal permeability and first-pass metabolism. By linking gut and liver tissues under perfusion to mimic blood flow, the model replicates the sequence of drug absorption and metabolism, enabling mechanistic insights with significantly more accuracy than animal models or conventional static cultures.

The study also demonstrated how data derived from the Gut/Liver MPS-based bioavailability assay can be quantitatively integrated with in silico computational modelling, enabling the prediction of tissue-specific PK parameters, generating robust and reliable data to close the translational gap and enabling an improved estimation of preclinical bioavailability. Understanding oral bioavailability is particularly crucial as it shapes dose predictions, therapeutic efficacy expectations, safety margins and formulation strategies.
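The bioavailability logic referenced here reduces to the product of three fractions (the Fa, Fg and Fh of Figure 2). A minimal sketch, using invented illustrative values rather than measured MPS outputs:

```python
# Illustrative sketch (not the published model): oral bioavailability
# as the product of the fractions defined in Figure 2.
# Input values below are invented for demonstration.

def oral_bioavailability(fa, fg, fh):
    """F = Fa (absorbed) x Fg (escapes gut metabolism) x Fh (escapes liver)."""
    for name, frac in (("Fa", fa), ("Fg", fg), ("Fh", fh)):
        if not 0.0 <= frac <= 1.0:
            raise ValueError(f"{name} must lie in [0, 1]")
    return fa * fg * fh

# e.g. 90% absorbed, 80% escapes gut metabolism, 70% escapes first pass
print(round(oral_bioavailability(0.9, 0.8, 0.7), 3))  # → 0.504
```

The value of the MPS/in silico combination is in estimating each fraction separately and mechanistically, rather than fitting a single lumped F to animal data.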

The Limitations of Standard Liver Safety Tools

Liver toxicity remains one of the most significant causes of drug attrition, accounting for 18% of drug development failures due to adverse reactions.5 Existing methods to assess liver toxicity are poorly suited to predicting results in humans. Both standard 2D and simple 3D cell cultures lack the complexity and longevity to predict or understand the mechanism of complex or latent events, whilst animal models can inaccurately reflect human outcomes due to differences in genetics, metabolism and immunological responses. As a result, unsafe drug candidates may progress too far, and potentially life-saving ones may be misclassified and abandoned.

Advancing Liver Safety and DILI Assessment with MPS

Liver MPS (or Liver-on-a-chip) are essentially co-cultures of primary human cells, including hepatocytes and Kupffer cells, which are crucial for maintaining liver function and detecting inflammatory responses.6 They are cultured under perfusion to simulate the liver microenvironment, including blood flow and mechanical shear stress. Perfusion is essential for promoting high metabolic activity, culture longevity (for repeat dosing studies) and liver-specific biomarker detection, including albumin and ALT/AST, for enhanced data translatability. Earlier and more advanced assessment of human liver toxicity using MPS enables promising drugs to undergo structural modifications and move forward with more confidence or caution.

It is important to note that not all MPS offer the same degree of data output for mechanistic insights, sensitivity for predicting complex toxicities such as cholestasis, or longevity.3,7 Thus, an understanding of which platforms are better suited to the different stages of drug discovery and development is required. The most advanced in vitro tools enable the sensitive evaluation of compound toxicity with full dose-response curves and deep mechanistic insights that go beyond intrinsic DILI predictions.3,5–9 Advanced MPS models capture a wide spectrum of biological signals, from simple cell health markers to complex multiomics datasets. The key to their successful implementation is to align endpoints with the biological questions and anticipated clinical context. For example, when assessing hepatotoxicity, combining functional outputs (albumin secretion), structural changes (histology or imaging) and mechanistic biomarkers (stress pathways, metabolic flux) provides a multidimensional view of drug response that is far more compelling to regulators than any single readout. By understanding the underlying mechanism, MPS provides foresight into potential DILI liabilities, which can subsequently be managed via more informed risk mitigation strategies.

Figure 1. A schematic diagram of one well of a Dual-organ Multi-chip plate, demonstrating how MPS incorporates perfusion to replicate the sequence of drug absorption and metabolism.
Figure 2. Combining data derived from a Gut/Liver MPS with a mechanistic mathematical model accurately predicts key ADME parameters and oral bioavailability by estimating the fraction absorbed (Fa), the fraction escaping gut metabolism (Fg) and the fraction escaping hepatic metabolism (Fh).

In addition to predicting human drug-induced liver injury, Liver MPS can also be used to predict interspecies differences before in vivo testing, or to address discrepancies between traditional human in vitro methods and in vivo animal studies that make it challenging to predict safety risks during preclinical testing. In these scenarios, cross-species MPS models further expand the in vitro to in vivo extrapolation (IVIVE) capabilities of MPS-based assays by offering rapid, comparative studies of human and animal responses.10

MPS & In Silico Approaches as a Cornerstone of Future Drug Development

MPS are no longer experimental curiosities or ethical alternatives; they are a regulatory-endorsed approach with research demonstrating scientific superiority in generating truly translational human-relevant data. When thoughtfully designed, rigorously executed and integrated with computational approaches, MPS assays provide compelling evidence to support regulatory submissions and, in some cases, fill critical gaps left by traditional animal models.

Embracing NAMs, including MPS technologies, offers a strategic advantage for ADME and toxicity studies when they are utilised in combination to predict a translatable first-in-human dose that balances tolerability against an optimal pharmacokinetic profile. Going forward, the earlier use of NAMs will support earlier pipeline decision-making, greater confidence in lead candidates, mechanistically informed regulatory packages, the transition away from animal testing and an accelerated path to market.

The strategic use of MPS platforms will increasingly define the scientific backbone of future drug development. For researchers across the biotech and pharmaceutical sectors, now is the time to invest in the frameworks, skills and partnerships needed to run MPS assays with the level of rigour regulators expect. Those who do will be well positioned to accelerate innovation and bring safer, more effective medicines to patients faster and with science that truly reflects human biology.

REFERENCES

1. Sun, D. et al. Why 90% of clinical drug development fails and how to improve it?, Acta Pharmaceutica Sinica B, 12(7), pp. 3049–3062 (2022)

2. Waring, M.J. et al. An analysis of the attrition of drug candidates from four major pharmaceutical companies, Nature Reviews Drug Discovery, 14(7), pp. 475–486 (2015)

3. Rubiano, A. et al. Characterizing the reproducibility in using a liver microphysiological system for assaying drug toxicity, metabolism, and accumulation, Clinical and Translational Science, 14(3), pp. 1049–1061 (2021)

4. Abbas, Y. et al. A primary human gut/liver microphysiological system to estimate human oral bioavailability, Drug Metabolism and Disposition, 53(9), p. 100130 (2025)

5. Onakpoya, I.J., Heneghan, C.J. and Aronson, J.K. Post-marketing withdrawal of 462 medicinal products because of adverse drug reactions: A systematic review of the World Literature, BMC Medicine, 14(1) (2016)

6. Novac, O. et al. Human liver microphysiological system for assessing drug-induced liver toxicity in vitro, Journal of Visualized Experiments [Preprint], (179) (2022)

7. Nitsche, K.S. et al. Exploring the potential of liver microphysiological systems of varied configurations to model cholestatic chemical effects, Archives of Toxicology [Preprint] (2025)

8. Sarkar, U. et al. Integrated assessment of diclofenac biotransformation, pharmacokinetics, and omics-based toxicity in a three-dimensional human liver-immunocompetent Coculture system, Drug Metabolism and Disposition, 45(7), pp. 855–866 (2017)

9. Kopp, B. et al. Liver-on-chip model and application in predictive genotoxicity and mutagenicity of drugs, Mutation Research - Genetic Toxicology and Environmental Mutagenesis, 896, p. 503762 (2024)

10. Negi, C.K. et al. Comparative analysis of species-specific hepatocyte function and drug effects in a liver microphysiological system (PhysioMimix LC12) and 96-well plates, ACS Pharmacology & Translational Science, 8(11), pp. 4138–4158 (2025)

Dr. Yassen Abbas is Biology Group Leader at CN Bio, leading a team advancing organ-on-a-chip applications in ADME and oral bioavailability, digital twins, and CAR-T therapy models. His background includes work at the European Space Agency, and a PhD and postdoctoral research at the University of Cambridge on tissue-engineered models of human reproduction. He has authored peer-reviewed publications, including six first-author papers, and is named on three patent applications.

Dr. Yassen Abbas
Figure 3. The drug discovery and development stages where human and cross-species Liver MPS advance liver safety and DILI assessments

UNCOVER THE POTENTIAL OF MULTI-COLUMN CHROMATOGRAPHY

A versatile technology to intensify purification processes

Octave™ multi-column chromatography (MCC) systems support 1 to 8 column processes to address downstream challenges for a range of molecules and process intensification.

A multi-column chromatography platform

Octave systems integrate with MCC-optimized SkillPak™ pre-packed columns containing our best-in-class resins to provide a comprehensive MCC solution.

A team of experts to support your work

Our team of chromatography experts provides our biopharma partners with solutions to develop safe and efficient therapies.


The CRISPR Revolution: Redefining Therapeutic Strategy and Pharmaceutical Innovation

For decades, medicine has focused primarily on disease management. Patients with genetic disorders often required lifelong therapy, controlling symptoms but rarely addressing the underlying cause. Genome editing is now transforming that paradigm. CRISPR technology enables precise modification of DNA within human cells, allowing scientists to correct defective genes at their source rather than merely compensating for them. What once seemed like theoretical molecular biology has evolved into a clinically validated therapeutic platform.

For pharmacy and biotechnology professionals, CRISPR is no longer an emerging concept. It is a disruptive force reshaping drug discovery, clinical development, manufacturing strategy and value-based reimbursement models. Its impact extends well beyond rare genetic diseases, signalling a structural shift in how medicines are designed and delivered.

Understanding CRISPR: From Bacterial Immunity to Programmable Editing

CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) was first identified as part of a bacterial adaptive immune system. Bacteria use CRISPR-associated (Cas) proteins to recognise and cleave viral DNA.

Scientists rapidly recognised that this system could be re-engineered as a programmable gene-editing tool. By designing guide RNA sequences, Cas enzymes can be directed to precise genomic loci, enabling targeted modification.1
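The targeting logic described above can be illustrated with a toy target-site search. The sketch below scans an arbitrary demo DNA string for 20-nt protospacers followed by an NGG PAM (the motif recognised by SpCas9); it covers the forward strand only, and real guide design also evaluates the reverse strand and off-target scores.

```python
# Illustrative sketch: locating candidate SpCas9 target sites by scanning
# for 20-nt protospacers followed by an NGG PAM. Forward strand only;
# the demo sequence is arbitrary.

def find_spcas9_sites(seq):
    """Return (position, protospacer, PAM) for each NGG PAM site."""
    seq = seq.upper()
    sites = []
    for i in range(20, len(seq) - 2):
        pam = seq[i:i + 3]
        if pam[1:] == "GG":  # NGG PAM immediately 3' of the protospacer
            sites.append((i - 20, seq[i - 20:i], pam))
    return sites

demo = "ATGCACCTGACTCCTGAGGAGAAGTCTGCCGTTACTGCCCTGTGG"
for pos, guide, pam in find_spcas9_sites(demo):
    print(pos, guide, pam)
```

A guide RNA whose spacer matches one of these protospacers would direct Cas9 to cut three base pairs upstream of the PAM, which is the basis of the programmability the text describes.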

Initially described as “molecular scissors”, CRISPR systems created double-strand DNA breaks (DSBs). Today, the platform has evolved into something more sophisticated, capable not only of cutting DNA but rewriting, inserting or regulating genetic information with increasing precision.

The Moment That Changed Everything

In December 2023, the field reached a historic milestone. The U.S. Food and Drug Administration approved Casgevy, the first CRISPR-based therapy for sickle cell disease and β-thalassemia.

This therapy works by editing a patient’s own stem cells to reactivate fetal haemoglobin, effectively compensating for the defective gene responsible for the disease. For many patients, it represents a one-time treatment with long-term benefits.2

The approval was more than regulatory success. It was proof that genome editing could meet the highest standards of safety and effectiveness.

From Cutting DNA to Rewriting It

Early CRISPR systems created double-strand breaks in DNA. While effective, this approach relied on the cell’s natural repair mechanisms, which sometimes introduced unintended changes.

Newer technologies are far more refined.

Base editors can change a single DNA letter without cutting both strands. Prime editors can insert, delete or replace small DNA segments with improved precision. These innovations reduce the risk of large genomic rearrangements and increase safety.

In simple terms, CRISPR is evolving from scissors into a precision pen.

Expanding Beyond Rare Genetic Diseases

Initially, CRISPR therapies targeted rare inherited disorders. But the field is rapidly expanding.

In 2024–2025, new clinical trials began exploring CRISPR-based treatments for cardiovascular disease by editing genes such as PCSK9 to permanently lower cholesterol levels. Researchers are also advancing therapies for liver diseases, eye disorders and certain cancers.

This shift signals something profound: gene editing is moving from niche applications toward common health conditions that affect millions.3

Figure 1. CRISPR–Cas9 uses a guide RNA to direct precise DNA cleavage, enabling targeted gene correction or insertion through cellular repair mechanisms.


The Technological Evolution: Beyond Double-Strand Breaks

First-generation CRISPR platforms relied on Cas9-mediated double-strand breaks (DSBs), leveraging endogenous repair pathways such as non-homologous end joining (NHEJ) and homology-directed repair (HDR). While transformative, DSB-dependent editing carries risks of large deletions, chromosomal rearrangements and p53 activation.4

Next-generation technologies are addressing these limitations:

• Base editing – enables precise single-nucleotide transitions without DSB formation.

• Prime editing – introduces templated insertions, deletions and substitutions with improved predictability.

• Compact nucleases (e.g., Cas12f variants) – enhance packaging compatibility with viral vectors.

• CRISPR-associated transposases and integrases – are being engineered to facilitate targeted multi-kilobase insertions in post-mitotic tissues.

These refinements collectively reduce genomic instability and expand the therapeutic window, particularly for in vivo applications.

The Delivery Challenges

Editing DNA is only part of the therapeutic equation. The genome-editing machinery must be delivered efficiently and safely to the appropriate target cells within the body.

Two principal delivery platforms currently dominate translational research. Viral vectors, particularly adeno-associated viruses (AAVs), offer high transduction efficiency and sustained expression but are constrained by limited cargo capacity and the potential for pre-existing or treatment-induced immune responses.

Non-viral systems, especially lipid nanoparticles (LNPs), have demonstrated effective systemic delivery, most notably to hepatocytes, and have benefited from advances pioneered in mRNA vaccine technology. Ongoing formulation refinements aim to expand tissue tropism beyond the liver while improving specificity and tolerability.5

Overcoming delivery barriers remains central to extending CRISPR therapeutics to less accessible organs such as the brain, lungs and cardiovascular tissues.

Beyond Permanent Editing: Epigenome and RNA Approaches

Permanent genomic modification is not always clinically desirable. Epigenome editing platforms enable reversible gene activation or silencing through targeted methylation or chromatin remodelling without altering DNA sequence. RNA-targeting systems allow transient transcript modification, providing temporal control over gene expression.

These complementary strategies expand CRISPR’s versatility and may prove especially valuable in neurology and oncology, where controlled modulation is preferable to permanent alteration.

Transforming Pharmaceutical Research and Development

CRISPR’s influence extends well beyond therapeutic applications. Genome-wide CRISPR knockout and activation screens have surpassed RNA interference (RNAi) approaches in target validation and resistance mapping. Functional genomics programmes now rely heavily on CRISPR libraries to identify disease-driving genes with greater precision.

Engineered organoids and disease models incorporating CRISPR modifications provide more predictive translational systems. Genetic barcoding and lineage tracing offer high-resolution insights into tumour evolution and clonal dynamics.

For biotech strategists, CRISPR simultaneously functions as:

• A discovery accelerator

• A precision therapeutic platform

• A competitive differentiator in pipeline innovation

Manufacturing, Economics and Regulatory Considerations

Scaling CRISPR therapies presents complex manufacturing and regulatory challenges. GMP-grade production of nucleases, guide RNAs and delivery vehicles requires stringent analytical validation.

Long-term pharmacovigilance must assess durability, immunogenicity, insertional risks and off-target effects. Additionally, the high upfront cost of potentially curative interventions necessitates innovative reimbursement frameworks, including outcome-based agreements.

Figure 2. Adeno-associated viral (AAV) vector architecture showing capsid structure, inverted terminal repeats (ITRs) and limited genomic cargo capacity.
Figure 3. In vivo CRISPR delivery workflow illustrating lipid nanoparticle (LNP)-mediated systemic administration of gene-editing components via intravenous infusion.


Clinical applications remain confined to somatic editing. Germline modification is not permitted under current regulatory frameworks and remains subject to strict international oversight.

2025 Clinical Highlights

The momentum continued with novel IND approvals and trial initiations:

• YOLT-101 gained clinical authorisation in both the United States and China to treat heterozygous familial hypercholesterolemia using base editing to disrupt the PCSK9 gene, making it the first in vivo gene-editing therapy targeting cardiovascular disease to enter global clinical evaluation.

• Multiple CRISPR trials have advanced into Phase I/II for conditions such as hyperlipidemia (CTX310), refractory lipid disorders (VERVE-201) and additional cardiometabolic targets, expanding CRISPR’s reach beyond traditional genetic diseases into broader health conditions.

• Prime editing, a newer CRISPR variation that avoids double-strand breaks, reported promising early human data showing safety and functional restoration in disorders like chronic granulomatous disease.

Clinical Success Stories: Impact Beyond Numbers

Perhaps the most compelling narrative of CRISPR’s maturation is real patients experiencing real outcomes.

Beyond large-scale indications, CRISPR has enabled highly personalised therapies. An infant born with a rare metabolic disorder (carbamoyl phosphate synthetase I deficiency) received a tailored CRISPR intervention in 2025, becoming the first child treated with a personalised genome-editing therapy developed in record time.

Outlook: The Strategic Future of Genome Editing

The convergence of precision editing, compact nucleases, AI-guided design and scalable delivery technologies positions CRISPR as a foundational platform for next-generation therapeutics.

Over the coming decade, stakeholders can anticipate:

• Expansion into cardiometabolic and neurodegenerative diseases

• Increased clinical adoption of base and prime editing platforms

• Deeper integration of CRISPR into early-stage discovery pipelines

• Emergence of modular, platform-based development strategies

CRISPR is redefining not only therapeutic development but also value generation, shifting medicine from chronic management toward durable intervention.

For pharmacy and biotechnology professionals, the conclusion is clear: genome editing is not a peripheral innovation. It represents a structural transformation of the therapeutic landscape.

REFERENCES

1. Pacesa M., Pelea O., Jinek M. Past, present, and future of CRISPR genome editing technologies.

2. Wang J. Y., Doudna J. A. CRISPR technology: A decade of genome editing is only the beginning.

3. Zhang X., Ma D., Liu F. CRISPR Technology and Its Emerging Applications.

4. Ratan Z. A. et al. CRISPR-Cas9: a promising genetic engineering approach in cancer research.

5. Li T. et al. CRISPR/Cas9 therapeutics: progress and prospects.

Dr. Koushik Yetukuri

Dr. Koushik Yetukuri is an Associate Professor at Chalapathi Institute of Pharmaceutical Sciences (Autonomous), Guntur, and Academic Coordinator for the M.Pharmacy programme in Regulatory Affairs. With over 13 years of experience in industry and academia, his expertise includes pharmaceutical quality systems, regulatory affairs and validation. He has published 20+ research articles, contributed to international book chapters and serves on editorial boards of peer-reviewed journals. His research focuses on QbD, DoE, nanoformulations and advanced drug delivery systems. He received the Young Scientist Award in 2024 and is a lifetime member of APTI.

Siddharth Kumar

Mr. Siddharth Kumar is an II/IV B. Pharmacy student at Chalapathi Institute of Pharmaceutical Sciences (Autonomous), Guntur. He is an enthusiastic undergraduate with a strong interest in pharmaceutics and drug development, actively participating in seminars and workshops to strengthen his academic foundation and future career in the pharmaceutical field.

Media and Communications

IPI

Peer Reviewed, IPI looks into the best practice in outsourcing management for the Pharmaceutical and BioPharmaceutical industry.

www.international-pharma.com

JCS

Peer Reviewed, JCS provides you with the best practice guidelines for conducting global Clinical Trials. JCS is the specialist journal providing you with relevant articles which will help you to navigate emerging markets.

www.journalforclinicalstudies.com

IAHJ

Peer Reviewed, IAHJ looks into the entire outsourcing management of the Veterinary Drug, Veterinary Devices & Animal Food Development Industry.

www.international-animalhealth.com

IBI

Peer reviewed, IBI provides the biopharmaceutical industry with practical advice on managing bioprocessing and technology, upstream and downstream processing, manufacturing, regulations, formulation, scale-up/technology transfer, drug delivery, analytical testing and more.

www.international-biopharma.com

PNP

Pharma Nature Positive is a platform for all stakeholders in this industry to influence decision-making by regulators, governments, investors and other service providers to achieve Nature Net Positive Results. This journal will give pharma the ability to choose the right services to attain this goal.

www.pharmanaturepositive.com

PHARMA POD

‘Putting science into conversation, and conversation into science.’ Join some of the most esteemed and integral members of the Drug Discovery & Development world as they offer insights and introspection into the latest movements, discoveries and innovations within the industry.

senglobalcoms.com

KODA: Advancing Laboratory Sample Management Through 2D Data-Matrix Readers

In modern scientific environments, the reliability of sample identification underpins the integrity of research, diagnostics and patient care. As laboratories continue to expand in scale and complexity, the demand for robust, accurate and scalable tracking systems has intensified. 2D Data-Matrix technology has emerged as a powerful solution, offering durable marking and seamless integration with digital workflows. Among the organisations driving this transformation is Steribar, a specialist manufacturer whose roots lie in both healthcare and industrial traceability. Through continuous innovation and a focus on practical solutions, Steribar has helped laboratories move confidently toward secure, data-driven operations.

2D Data-Matrix Specialists

Founded in 2003, Steribar emerged at the intersection of clinical practice and industrial tracking, with a clear purpose to develop robust solutions for environments where safety, accuracy and traceability are critical. By combining extensive knowledge from healthcare with proven technical expertise, the company initially set out to address the urgent need to track surgical instruments. Recognising the potential of developing barcode technology, Steribar adopted the 2D Data-Matrix standard for its compact footprint, exceptional data density and durability in demanding clinical settings. Dedicated scanners were engineered to decode these small, often low-contrast markings with consistent precision. This commitment to practical, high-performance systems ultimately expanded beyond surgical environments, shaping the advanced laboratory solutions that KODA provides today.

Story of Steribar

Steribar’s origins can be traced to the late 1990s, when growing concerns around patient safety and infection control were driving profound changes in how medical equipment was managed. At the centre of this story is Director Liz Clynes, who began her professional career as a laboratory technician in the chemical industry before moving into clinical and later senior management roles within the UK National Health Service. Her hands-on experience in laboratories and clinical environments gave her a deep appreciation of the practical challenges faced by frontline staff. This background would later prove influential in shaping Steribar’s pragmatic and user-focused approach to technology.

During this period, the healthcare sector was grappling with the implications of Bovine Spongiform Encephalopathy (BSE) and Creutzfeldt-Jakob Disease (CJD). These fatal, neurodegenerative conditions are caused by prions that accumulate in the brain and induce abnormal folding of healthy proteins. Unlike bacteria or viruses, prions are exceptionally resistant to conventional sterilisation methods, including high temperatures, radiation and chemical disinfectants. This resilience raised serious concerns around surgical instruments potentially acting as vectors for disease transmission. As a result, a robust tracking system became essential to upholding patient safety and public health.

Whilst Liz was confronting these challenges within her clinical practice, a parallel strand of expertise had been developing close to home. Liz’s husband, Mike Clynes, worked closely with Richard Byng to engineer bespoke traceability solutions for demanding industrial settings. Their work encompassed temperature monitoring on poultry farms, asset logging at the Sellafield nuclear facility and VIN tracking for Triumph Motorcycles. These applications required accurate reading of small, low-contrast and often imperfectly marked identifiers. Through this experience, they developed some of the earliest affordable 2D Data-Matrix readers capable of decoding “difficult” marks on metal surfaces.

In the kind of coincidence that so often underpins meaningful innovation, these principles proved directly transferable to healthcare. 2D Data-Matrix codes emerged as the most robust and information-dense marking method suitable for surgical instruments with limited surface area. Their compact footprint allowed unique identifiers to be permanently etched onto tools without compromising structural integrity or usability. By addressing this challenge, Steribar developed expertise in building readers for difficult-to-read codes. This capability soon attracted interest beyond healthcare, particularly from manufacturers of coded laboratory tubes and racks.

Today, Steribar is recognised as a world-leading designer and manufacturer of high-performance 2D Data-Matrix readers for smarter laboratory sample management. Although the company has long been something of an “industry secret” through OEM partnerships, its technology has been embedded in large laboratories and automated systems for decades. With the launch of the KODA range, Steribar is now bringing its specialist support directly to laboratories of all sizes.

Laboratory Sample Management

Modern laboratories are handling increasing volumes of samples, driven by advances in genomics, diagnostics, drug discovery and biobanking. Despite this growth, many laboratories still rely on paper-based processes or manual data entry to track samples. These approaches are inherently vulnerable to transcription errors, misplaced records and incomplete audit trails. Even small mistakes can have significant consequences, ranging from wasted time and resources to compromised data integrity. In regulated environments, such weaknesses may also jeopardise compliance with international standards and accreditation requirements.

2D Data-Matrix barcoding offers a powerful solution to these challenges by linking each physical sample directly to its digital record. A single scan can instantly capture a unique identifier and populate a spreadsheet, database or Laboratory Information Management System (LIMS) with accurate information. Unlike handwritten labels, Data-Matrix codes remain readable even when very small, partially damaged or exposed to harsh laboratory conditions. This makes them particularly well-suited to tubes, vials and cryogenic storage formats where space is limited and durability is essential. By replacing manual transcription with automated capture, laboratories dramatically reduce the scope for human error.
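The scan-to-record step described above can be sketched as follows. The identifier value and field layout are invented for illustration, and a CSV file stands in for the spreadsheet, database or LIMS the scan would normally populate.

```python
# Hypothetical sketch: logging a scanned 2D Data-Matrix identifier to a
# CSV file (standing in for a spreadsheet, database or LIMS).
# The identifier and field layout are invented for illustration.
import csv
import datetime
import io

def log_scan(csv_file, sample_id, location):
    """Append one scan event: identifier, storage location, timestamp."""
    writer = csv.writer(csv_file)
    timestamp = datetime.datetime.now().isoformat(timespec="seconds")
    writer.writerow([sample_id, location, timestamp])

buf = io.StringIO()  # in-memory stand-in for an open CSV file
log_scan(buf, "SB10293847", "Freezer 2 / Rack A1")
print(buf.getvalue().strip())
```

Because the scanner supplies the identifier directly, no human ever retypes it, which is precisely where transcription errors are eliminated.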

In high-throughput laboratories, hundreds of samples may be received, processed and stored every day. Manually labelling and tracking these samples is not only inefficient but also impractical at scale. Barcoded tubes and racks allow samples to be identified, located and retrieved quickly and unambiguously at any point in their lifecycle. This end-to-end traceability also reduces the risk of sample mix-ups and cross-contamination, which is critical in clinical and research settings.

Barcode systems also support seamless coordination among laboratory teams and collaborators. Samples are often shared across multiple experiments, departments or even organisations, each with different users and workflows. With a consistent barcode-based system, any authorised user can immediately identify a sample, check its status and update its record after use. This clarity reduces disruption, avoids duplication of effort and supports a more structured and transparent workflow. Ultimately, 2D Data-Matrix readers enable laboratories to move away from fragmented, paper-heavy processes toward integrated, digital-first operations.
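The shared status-and-audit-trail behaviour described above can be sketched as follows (a hedged illustration only; the class, field and status names are hypothetical, not part of any Steribar product):

```python
from datetime import datetime, timezone

class SampleRecord:
    """Minimal shared record for a barcoded sample, with an audit trail."""

    def __init__(self, sample_id):
        self.sample_id = sample_id
        self.status = "registered"
        self.history = []  # list of (timestamp, user, new_status)

    def update(self, user, new_status):
        # Every authorised update is timestamped, so any collaborator
        # can see who last handled the sample and when.
        self.history.append(
            (datetime.now(timezone.utc).isoformat(), user, new_status)
        )
        self.status = new_status

rec = SampleRecord("FD0123456789")
rec.update("alice", "aliquoted")
rec.update("bob", "stored")
```

Because every change is appended rather than overwritten, the record doubles as the audit trail that regulated environments require.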

The KODA Readers

Building on decades of expertise in unlocking codes, Steribar developed the KODA range to address the diverse needs of modern laboratories. Each reader is engineered to solve specific workflow challenges while sharing a common emphasis on reliability, ease of use and seamless integration.

• KODA Sense serves as the flagship rack reader for routine laboratory use. Designed for SBS-format racks, it features automatic rack detection, triggering immediate scanning upon placement. Orientation correction algorithms ensure accurate decoding even when racks are inverted. Capable of scanning an entire rack in under one second, KODA Sense supports high-throughput laboratories, research institutes and biobanks where speed and reliability are essential.

• KODA Cryo addresses the unique challenges of frozen sample workflows. Conventional scanners often incorporate protective glass windows that become obscured by frost and condensation when exposed to temperature differentials. KODA Cryo employs an open, glass-free design that eliminates these issues, allowing racks and cryoboxes to be scanned directly from freezers. By removing the need for thawing prior to identification, this approach preserves sample integrity, reduces handling time and maintains cold-chain continuity.

• KODA Slim is optimised for automation and space-constrained environments. With an SBS-compatible footprint, it integrates seamlessly into robotic systems and automated liquid-handling platforms. Its design permits racks to be scanned slightly above the reader surface, facilitating efficient robotic arm operation. Compatibility across multiple rack formats ensures flexibility.

• KODA Solo provides a compact, plug-and-play solution for single-tube scanning. Supporting 2D Data-Matrix, 1D barcodes and QR codes, it offers visual and audible confirmation of successful scans. Affordability and simplicity make it particularly suitable for bench-top workflows, academic laboratories and smaller facilities seeking to transition from manual to digital identification.
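For illustration, the step of turning a rack reader's flat scan result into labelled SBS well positions, including the kind of 180° orientation correction a rack reader performs when a rack is placed inverted, might look like the sketch below (a hypothetical data format, not Steribar's actual output):

```python
import string

def rack_map(codes, rows=8, cols=12, inverted=False):
    """Map a flat list of decoded tube codes to SBS well positions (A1..H12).

    codes:    row-major list of rows*cols decoded payloads ('' = empty well).
    inverted: if True, the rack was placed rotated 180 degrees, so the
              scan order is reversed before positions are assigned.
    """
    if len(codes) != rows * cols:
        raise ValueError("expected %d codes, got %d" % (rows * cols, len(codes)))
    if inverted:
        codes = codes[::-1]  # simple 180-degree orientation correction
    wells = {}
    for i, code in enumerate(codes):
        if code:  # skip empty wells
            wells[string.ascii_uppercase[i // cols] + str(i % cols + 1)] = code
    return wells

codes = ["T%03d" % n for n in range(96)]
normal = rack_map(codes)
flipped = rack_map(codes, inverted=True)
```

For a standard 96-position SBS rack this yields a dictionary keyed A1 through H12, which is the form most LIMS imports expect.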

The KODA readers are compatible with tubes from all manufacturers and integrate with all known LIMS platforms. KODA Capture software provides live scan feedback and multiple viewing modes for verification. Devices can be moved between workstations with minimal configuration, supporting flexible laboratory layouts. By combining decoding performance with usability, the KODA range translates technical expertise into accessible, laboratory-ready solutions.

Innovation & Customer Support

Innovation at Steribar is driven by close collaboration with customers and partners. All KODA products are designed, built and tested at the company’s UK facility, granting the team complete control over quality and performance. Hardware and software development are carried out in-house, enabling rapid iteration and continuous improvement. Feedback from real-world laboratory use is actively encouraged and carefully evaluated. When a suggested change genuinely improves usability or performance, it is often adopted as part of the standard product evolution.

Customer support is viewed as an extension of the product rather than an afterthought. From initial consultation through installation and ongoing use, Steribar works closely with laboratories to ensure readers are configured correctly and integrated smoothly. Licence-free software allows devices to be moved easily between computers, reducing administrative overhead. Clear documentation, intuitive interfaces and minimal training requirements further lower barriers to adoption.

Ultimately, Steribar’s journey from surgical instrument tracking to advanced laboratory sample management reflects a consistent focus on safety, traceability and usability. The same principles that once helped address concerns around BSE and CJD now underpin solutions for modern, data-driven laboratories. Through the KODA range, Steribar aims to make high-quality 2D Data-Matrix reading accessible to organisations of all sizes. By combining technical expertise with genuine customer engagement, the company continues to support laboratories in building safer, more efficient and more resilient workflows.

Megan Ricketts

Megan Ricketts is an Assistant at Steribar in the Content & Marketing Department.

Email: megan.ricketts@steribar.com

Anglonordic exclusively brings together European investors and R&D companies from the Nordics and the UK. With an established format of panel discussions, parallel technology and biotech investment rooms, plus 1:1 meetings, this conference provides exceptional value. The reception is open to all registered delegates on the evening of 22 April at no extra charge.

10% discount with code IBI10

For more information about the conference visit:

Register today.

Advance discount registration rates through 19 May!

Keynote Speaker

Willem Mulder, PhD Professor of Precision Medicine

Radboud University Medical Center and Eindhoven University of Technology

Inventor, Entrepreneur

slas.org/europe2026

Drug Discovery USA 2026

ELRIG is a not-for-profit organisation with 25 years’ experience connecting the global life science and drug discovery industry through open-access, free-to-attend events.

Would you like to reach an audience that is 75% pharma and biotech companies? Exhibit with us: email sales@elrig.org for details.

15 speakers

20+ exhibitors

250+ attendees

4 tracks of consecutive science

16–17 June 2026, Pfizer, Boston (MA), USA

Innovation Award

Early Career Professional (ECP) Poster Award

A&M Stabtest GmbH – Page 3
Anglonordic Life Science Conference – Page 58
Aseptic Tech – Page 7
Biopharma Group – Page 37
Bio IT World – Page 60
CDD Vault – Page 25
Drug Discovery USA 2026 – Page 61
GenXPro GmbH – BC
PEGS Boston – Page 60
PCI Pharma Services – Page 41
PharmExcel – Page 11
Richter Biologics GmbH & Co. KG – Page 5
Scientific Laboratory Show – IBC
Senglobal Ltd – Page 53
SLAS Europe 2026 – Page 59
Steribar Systems Ltd – Page 57
Tosoh Bioscience – Page 49
Wallonia Trade & Investment Office – IFC

Subscribe today at www.international-biopharma.com or email info@senglobalcoms.com

I hope this journal guides you progressively through the maze of activities and changes taking place in the biopharmaceutical industry.

IBI is also now active on social media. Follow us on:

www.facebook.com/Biopharmaceuticalmedia www.plus.google.com/biopharmaceuticalmedia www.twitter.com/biopharmace www.biopharmaceuticalmedia.tumblr.com/

Join scientists, researchers and laboratory leaders from across the pharmaceutical, biotechnology and life sciences sectors for a full day of innovation, insight and networking. Discover the technologies, equipment and services supporting drug discovery, analytical development, quality control and laboratory operations.

13 MAY East Midlands Conference Centre, Nottingham

top scientific suppliers, all housed under one roof

Presented by

STOP SEQUENCING. START DISCOVERING.

Mastering RNA Complexity with the GXP BioInfonomics Suite

In the race for the next breakthrough, data noise is your greatest enemy. Extract the biological truth with GenXPro’s definitive analytical engine for mRNA, small RNA, and epigenetic data.

THE BIOINFORMATICS CORE

• Interactive Discovery: Move beyond static PDFs. Use our Suite for real-time Pathway Mapping (GSEA, GO), Volcano Plots, and Heatmaps.

• Minimized Bias – High Accuracy: Our patented TrueQuant (TQ) technology eliminates PCR artifacts, providing the high-fidelity input required for AI-driven Drug Discovery.

• TQ pan RNA-Seq Integration: Simultaneously resolve the coding and non-coding landscape (mRNA, lncRNA, small RNA) in one unified pipeline.

• Cross-Domain Mastery: Expertly engineered pipelines for Human Oncology, Plant Science, and Non-Model Organisms – handling complexity where others fail.

THE PROFESSIONAL EDGE

• Scientific Partnership: Direct, PhD-level bioinformatics support – no automated “Black Box” reports.

• Data Sovereignty: 100% GDPR-compliant, German-based infrastructure for sensitive R&D assets.

• De Novo Excellence: Specialized tools for assembly and annotation of non-model species.

EXPLORE THE SUITE

Scan to Test-Drive our Engine: Experience the BioInfonomics Suite instantly.

Username: TUser

Password: testtest
