Time to experiment, time to clinic, and time to market are becoming ever more important for developers and manufacturers of biopharmaceuticals. Here we share insights into tools that can make process development more efficient.
The process development landscape is changing and facing new challenges
Historically, the development timeline for a pharmaceutical drug has been 8 to 10 years. However, we are now facing shorter timelines for decision making with the use of expedited programs, such as fast track and breakthrough therapy designations, which can significantly shrink the development timeline. The consequence of this change is that process development and CMC activities have become even more critical factors in time to market, with many performed in parallel with clinical trials.
These pressures, compounded by the need to improve facility efficiency, have led many companies to reuse knowledge to make product development faster and more predictable for current and future projects. This is typically referred to as a platform approach, whether for development, manufacturing, or analysis. Each platform can then contain standard operating procedures, checklists, and equipment or raw material selections to ensure consistent performance.
However, today’s pipelines contain a growing number of more complex and diversified molecules, adding significant challenges for process development organizations. For some of these new molecules, the platform approach to analysis, manufacturing process, or facility fit does not apply and must be adapted. This demands greater effort from development teams to reach the “standard level” and increases the risk of delays in process development.
The platforms and the tools used to support process development are therefore continuously evolving. Today, advanced approaches are emerging to support smarter process development. Such tools include high-throughput development, mechanistic modeling, and fiber-based chromatography.
High-throughput process development
With more drug candidates entering today’s pipelines, the total number of experiments has increased dramatically. Facilitating these activities is vital to efficiency in process development. One approach to doing so is high-throughput process development (HTPD). HTPD is enabled by parallelization and automation of the experimentation itself. Some of the requirements for performing HTPD are:
- Appropriate tools to perform parallel experiments
- Experimental design to test the process development space
- Methods for data analysis and management
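As a minimal illustration of the first two requirements, a parallel screen can be laid out as a full-factorial experimental design. The factor names and levels below are hypothetical, chosen only to show how such a design maps onto a 96-well plate:

```python
from itertools import product

# Hypothetical factor ranges for a plate-based binding screen
ph_levels = [4.5, 5.0, 5.5, 6.0]         # buffer pH
salt_mM = [0, 50, 100, 150, 200, 250]    # NaCl concentration (mM)
replicates = 2

# Full-factorial design run in duplicate: 4 x 6 x 2 = 48 wells,
# so the screen fits twice on one 96-well plate
design = [
    {"pH": ph, "NaCl_mM": s, "replicate": r}
    for ph, s in product(ph_levels, salt_mM)
    for r in range(1, replicates + 1)
]
print(len(design))  # 48 experimental runs
```

In practice a fractional or response-surface design is often preferred over a full factorial to keep the run count down as the number of factors grows.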
One example that illustrates the need for HTPD workflows is the generation of stable cell lines. In contrast to legacy workflows where only one product candidate is used to generate a stable cell line, some candidates today are developed in parallel in a streamlined cell line development (CLD) workflow. These candidates could be variants of the monoclonal antibody (mAb) construct with, e.g., optimized vs. wildtype gene sequences, choice of signal peptide, or heavy chain (HC) to light chain (LC) ratio.
As a result, top clones for each construct need to be assessed for titer, viable cell density (VCD), viability, and product quality. In addition, a bioreactor performance study is usually run on top clones to optimize culture conditions and assess process performance, titer, VCD, viability, and product quality. These studies require high-throughput tools that are representative of the bioreactor scales used when moving the project toward pre-clinical and clinical manufacturing.
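At its core, the clone assessment described above is a multi-criteria ranking problem. A minimal sketch, with entirely illustrative clone names and numbers, might filter on a viability cut-off and then rank the survivors by titer:

```python
# Hypothetical clone screening data (names and values are illustrative only)
clones = [
    {"clone": "A1", "titer_g_L": 2.1, "vcd_e6_mL": 12.0, "viability_pct": 96},
    {"clone": "B4", "titer_g_L": 3.4, "vcd_e6_mL": 10.5, "viability_pct": 91},
    {"clone": "C2", "titer_g_L": 3.0, "vcd_e6_mL": 14.2, "viability_pct": 88},
    {"clone": "D7", "titer_g_L": 2.8, "vcd_e6_mL": 11.1, "viability_pct": 95},
]

# Keep clones above a viability cut-off, then rank by titer, descending
candidates = [c for c in clones if c["viability_pct"] >= 90]
top = sorted(candidates, key=lambda c: c["titer_g_L"], reverse=True)[:2]
print([c["clone"] for c in top])  # ['B4', 'D7']
```

A real workflow would of course weigh product quality attributes alongside titer and viability rather than using a single hard cut-off.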
In the downstream space, HTPD has been successfully applied in many industrial labs for more than a decade. One of the most common tools in downstream HTPD is the 96-well plate filled with resin, a format well suited to screening binding, elution, and wash conditions. These plates are operated in static batch mode, either manually or on a robotic platform. Mini columns with volumes of 50 to 600 µL are another commonly used HTPD format, operated in dynamic mode. Both 96-well plates and mini columns are run in parallel and use relatively small amounts of sample, which greatly increases the amount of data generated. This data enables better decisions, making the two approaches valuable tools that often complement each other during process development.
Mechanistic modeling
A mechanistic model in chromatography is a mathematical representation of the physicochemical interactions that occur during chromatography. It uses equations that describe how molecules diffuse in the liquid, move through the stagnant film around chromatography beads, diffuse inside the pores of a bead, and finally interact with ligands, including how molecules compete for ligands when binding. These phenomena are expressed as differential equations containing a number of parameters that must be estimated numerically by fitting data from chromatographic runs, so that the model reproduces the actual chromatographic behavior. Once the model has been calibrated, you can use it to simulate chromatographic behavior and experiment in silico.
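As a toy illustration of the kind of equations involved, the sketch below integrates Langmuir binding kinetics for a simple batch uptake experiment and checks the end point against the equilibrium isotherm. All parameter values are assumed, and a full mechanistic column model would also include convection, film, and pore diffusion terms:

```python
# Minimal sketch (illustrative parameters): Langmuir adsorption kinetics
# for a batch uptake experiment, integrated with explicit Euler.
#   dq/dt = ka * c * (qmax - q) - kd * q
# with the liquid-phase mass balance c = c0 - phi * q
# (phi = resin volume / liquid volume)
ka, kd, qmax = 0.5, 0.05, 60.0   # rate constants and capacity (assumed)
c0, phi = 2.0, 0.02              # initial conc. (g/L), phase ratio
dt, t_end = 0.01, 30.0

q, t = 0.0, 0.0
while t < t_end:
    c = c0 - phi * q                            # mass balance
    q += dt * (ka * c * (qmax - q) - kd * q)    # Euler step of the kinetics
    t += dt

# At equilibrium, q should satisfy the Langmuir isotherm
# q_eq = qmax * K * c / (1 + K * c), with K = ka / kd
K = ka / kd
c = c0 - phi * q
q_eq = qmax * K * c / (1 + K * c)
print(q, q_eq)  # the integrated value approaches the isotherm prediction
```

Model calibration works in the opposite direction: the rate constants and capacity are the unknowns, estimated by fitting simulated curves like this one to measured chromatographic data.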
There are many applications of mechanistic modeling today. For example, it is possible to predict step elution conditions for cation exchange and the impact of bed height variability on process performance, as well as explain certain deviations in manufacturing. Mechanistic modeling can be used as a risk assessment tool to guide and perhaps reduce process characterization efforts, and it can also be used to predict scale-up from lab scale columns to process columns.
There are several opportunities for mechanistic models to increase process development efficiency. For instance, in pre-clinical phases, it is possible to reduce the time needed to develop a process for toxicity runs. In Phase 1, you can reduce the number of chromatographic runs per step, and in Phase 2, you can predict scale-up and support tech transfer activities. In Phase 3, mechanistic models can inform the risk assessment of which parameters have the biggest impact on product quality, productivity, or process economics. In the late, i.e., commercial, stage, they can support identification of root causes and the management of deviations.
Combining HTPD and mechanistic modeling
While the theory of mechanistic modeling is not new, adoption in the biopharmaceutical industry has been hampered by a lack of user-friendly software and computing power. Now, software specifically designed for mechanistic modeling in biopharmaceutical process development is commercially available, and computing capacity is constantly improving. Furthermore, by marrying mechanistic modeling with HTPD, you can potentially create a tool that is even stronger than each one would be on its own. One example of the benefits of combining HTPD and mechanistic modeling is the development of validated scale-up and scale-down models, which is important both in process development and for troubleshooting of the manufacturing process.
HTPD tools are typically not validated as scale-down models because they exhibit offsets that make them less representative of a larger-scale process. With the assistance of mechanistic modeling, you can explain these offsets and why the tools do not show the same chromatographic behavior as a larger column. By compensating for, or recalculating, the data with a mechanistic model, you get, in principle, a way to validate the HTPD format as a scale-down model. This enables more experiments in a validated format that is smaller and run in parallel, leading to a more robust and optimal process design.
Building capabilities in HTPD and mechanistic modeling requires significant effort, with initial and ongoing investments in both competence development and equipment. If you have only one or two candidates for pre-clinical or clinical testing, it may be better to work with experienced partners, such as vendors and/or CMOs, that have the right capabilities, allowing a small, emerging company to utilize the latest technology without building these capabilities in-house.
New emerging technologies
To support improved process development for biomanufacturing, other new technologies are emerging. Rapid purification using fiber-based chromatography is one example of such innovations. This technology utilizes the high flow rates and high capacities of cellulose fibers. Fiber-based chromatography can offer improved protein capture compared with conventional chromatography, with residence times measured in seconds rather than minutes. These are some examples of applications that can benefit from such short residence and cycle times:
- Screening situations where speed is key, e.g., extensive process characterization work or multiple feed studies
- Cell line screening from Ambr™ systems (multiple samples, high volume) or Fibro plus ÄKTA™ connected to an autosampler to run multiple samples in an automated way
- Rapid titer determination
- Sensitive mAb and mAb-conjugates that benefit from quick processing and short elution time
- Low titer situations in research applications
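The difference in residence time follows directly from the ratio of bed (or fiber) volume to volumetric flow rate. A small sketch of that arithmetic, using assumed volumes and flow rates rather than any vendor's specifications:

```python
def residence_time_s(volume_mL: float, flow_mL_min: float) -> float:
    """Residence time = bed volume / volumetric flow rate, in seconds."""
    return volume_mL / flow_mL_min * 60.0

# Illustrative (assumed) numbers, not product specifications:
# a packed resin column run at a ~4 min residence time vs a
# fiber unit run at many unit volumes per minute.
resin = residence_time_s(volume_mL=1.0, flow_mL_min=0.25)  # 240 s
fiber = residence_time_s(volume_mL=0.4, flow_mL_min=6.0)   # ~4 s
print(resin, fiber)
```

Seconds-scale residence times are what make the rapid cycling and high sample throughput in the applications above practical.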
Another benefit is that, in contrast to the mini column units discussed earlier, fiber-based chromatography provides full chromatograms, improving quality monitoring.
Process developers in the biopharmaceutical industry face new challenges: shorter timelines under accelerated development programs, new molecular entities, and increasing regulatory expectations on process understanding. It is therefore more important now than ever to understand and utilize modern tools and solutions to improve efficiency in process development.