Highlights from the 5th International HTPD Conference, Part 1
The High-Throughput Process Development (HTPD) Conference Series is the key international forum for the presentation and discussion of topics relevant to high-throughput process development and “smart process development” for biopharmaceuticals.
This article is the first of three from the 2019 conference and focuses on summarizing some of the major conclusions from the pre-conference day, which to a large extent covered the fundamentals and the evolution of HTPD, as well as the plenary talk. The other two articles will cover the rest of the conference on the topics of ‘Applications of modern high-throughput process development’ and ‘Smart process development and a look into the future’.
High-throughput process development (HTPD) has today become a standard methodology in the biopharmaceutical industry. HTPD was initially applied in the downstream space for chromatography resin screening in 96-well filter plates. It then became a method to obtain and analyze the large amounts of process development data needed to understand existing unit operations for Quality by Design (QbD). Today, critical material attributes (CMAs), critical process parameters (CPPs), and critical quality attributes (CQAs) are characterized via HTPD to enable the systematic, risk-based understanding that QbD requires.
Advances in techniques related to target and contaminant characterization, molecular modeling, quantitative structure-activity relationships (QSAR), at-line and on-line process monitoring, as well as improved understanding of separation-related transport and surface phenomena, are converging to support a “new dawn” for bioprocessing.
Rather than just characterizing the nuances of existing processes, HTPD now supports in silico methods to cost-effectively design optimal processes and separation approaches. HTPD-type approaches are also expanding “upstream” (e.g., HTPD for cell culture and viral vector development) and further “downstream” to formulation (e.g., drug product stability).
Evolution of process development methodology
Moving from OFAT to raw material insights and mechanistic modeling
Peter Hagwall (Cytiva) kicked off the pre-conference by reviewing trends in today’s manufacturing and process development landscape, especially the need to speed up process development to match accelerated clinical development timelines.
He overviewed the evolution of process development methods: process development started with the one-factor-at-a-time (OFAT) approach in the early days of chromatography. The next leap forward was the development of the design of experiments (DoE) approach, which reduced the number of experiments while maintaining a high level of process understanding and led to the emergence of HTPD around 10 years ago.
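The experiment-count savings behind the OFAT-to-DoE shift can be sketched in a few lines: a full two-level factorial design versus a half-fraction generated with the defining relation C = AB. The factor names are hypothetical placeholders, and coded levels (-1/+1) stand in for real set points.

```python
from itertools import product

# Three process factors, each at two coded levels (-1 = low, +1 = high).
# Hypothetical names for illustration only.
factors = ["pH", "conductivity", "load_density"]

# One-factor-at-a-time: vary one factor while the others stay at baseline.
ofat_runs = [dict(zip(factors, levels))
             for levels in [(-1, -1, -1), (1, -1, -1), (-1, 1, -1), (-1, -1, 1)]]

# Full 2^3 factorial design: every combination of levels (8 runs),
# which also resolves factor interactions that OFAT misses.
full_factorial = [dict(zip(factors, levels))
                  for levels in product([-1, 1], repeat=3)]

# Half-fraction (2^(3-1)) using the defining relation C = A*B: only 4 runs,
# with main effects still estimable (confounded with 2-factor interactions).
half_fraction = [dict(zip(factors, (a, b, a * b)))
                 for a, b in product([-1, 1], repeat=2)]

print(len(ofat_runs), len(full_factorial), len(half_fraction))  # 4 8 4
```

The half-fraction matches the OFAT run count yet still supports a statistical model of all three main effects, which is the core of the efficiency gain DoE brought to process development.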
Standard HTPD tools available today include microwell plates (e.g., PreDictor 96-well plates), which provide speed and minimal sample consumption, and larger volume RoboColumn® units, which allow HTPD to deliver more accurate investigation of column-based mass transfer and operational (pressure-flow) properties.
Peter also highlighted how critical it is to have a well-characterized process, especially from a raw material variability standpoint, to secure robustness via a solid control strategy.
For chromatography resins, ligand density is a key variable to characterize, as it is the resin attribute most likely to affect production processes.
Further reading on the issues described by Peter can be found here.
Peter went on to describe how the emerging mechanistic modeling technique shows great promise to reduce experimentation further through computer simulation based on a physicochemical understanding of the phenomena involved in chromatography. Mechanistic modeling supports the implementation of QbD-compliant process development. This QbD methodology is based on a science- and risk-based approach that centers development activities on what is most important to ensure patient safety and drug efficacy.
Fig 1. Evolution of process development methodology
He linked the progress in DoE and mechanistic modeling to the position of strength HTPD has today and detailed how HTPD can help companies deal with modern PD challenges related to:
- Modular and multiproduct manufacturing
- Global and reliable raw material and process supply
- Platform process development, and modification
- The diversity of emerging and existing targets
Enhanced process development is also supported, for example, by use of the DoE capabilities built into ÄKTA and other chromatography systems.
These various approaches allow companies to deal with common “product post-launch” challenges such as raw material variables, process variables, and product variables (e.g., contaminant levels).
Multiwell-plates and minicolumn HTPD tools
How today’s HTPD tools complement each other in downstream PD to save time and sample
Peter’s presentation led to Eggert Brekkan (Cytiva) providing more detail in “Downstream tools and when to use them”. He discussed the design space insights to be gained from using HTPD and QbD in downstream process development, probably the most mature area of HTPD.
Eggert noted the significance of various HTPD-related data, such as binding capacity, wash and elution conditions, adsorption isotherms, Henry’s constant (Kp) screening, resin screening, and resin cleaning conditions. He gave recommendations on the choice between two highly complementary HTPD tools: 96-well plates and minicolumns. For example, 96-well plates enable the study of static binding capacities, while minicolumns can be operated in a true chromatography mode, enabling dynamic binding capacity studies. When using 96-well plates, binding capacity studies are best performed with 2 or 6 µL resin per well, while plates with 20 or 50 µL resin are recommended for elution studies. Important RoboColumn unit operation variables, such as loading time, sample dilution on loading, and residence time, are similar to those of conventional lab-scale chromatography columns.
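As a minimal illustration of the static binding capacity measurement that 96-well plates enable, the sketch below computes capacity from a single hypothetical filter-plate well; all concentrations and volumes are invented for illustration.

```python
def static_binding_capacity(c0_mg_ml, c_sup_mg_ml, liquid_ul, resin_ul):
    """Static binding capacity (mg protein per mL resin) from one filter-plate well.

    c0_mg_ml: protein concentration loaded; c_sup_mg_ml: concentration left in
    the supernatant after incubation to equilibrium; volumes in microlitres.
    """
    bound_mg = (c0_mg_ml - c_sup_mg_ml) * liquid_ul / 1000.0  # mg adsorbed to resin
    return bound_mg / (resin_ul / 1000.0)                     # normalize per mL resin

# Hypothetical well: 200 µL of a 2.0 mg/mL load over 6 µL resin,
# with 0.8 mg/mL remaining in the supernatant at equilibrium.
q = static_binding_capacity(2.0, 0.8, 200, 6)
print(round(q, 1))  # 40.0 (mg/mL resin)
```

Running one such calculation per well across a 96-well plate is what turns a single incubation into a full screen of binding conditions.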
Eggert discussed some important practical considerations when applying downstream HTPD (see Table 1). Efficient mixing when working with 96-well filter plates is very important, as incomplete mixing often leads to data variation. Sample clarification is also an important variable. Detergents, antifoam agents, and other additives often significantly affect HTPD results.
Table 1. Comparison of HTPD formats
| 96-well filter plates | RoboColumn minicolumns |
| --- | --- |
| No chromatogram | Chromatogram after fraction analysis |
| Static binding capacity | Dynamic binding capacity |
| Very fast | |
| Low sample consumption | Low sample consumption |
| Manual or robotic system | Mostly run on robotic systems |
HTPD in upstream development
How more clones can be evaluated under process relevant conditions
Andreas Castan (Cytiva) continued the workshop with a presentation on HTPD in upstream processing to enhance cell line development (CLD) and upstream process development (UPD). Figure 2 illustrates the objective in each step from CLD to a developed upstream process, the number of clones evaluated, and typical cell culture volumes.
Fig 2. HTPD workflows in cell line development and upstream process development
Andreas pointed out that high-throughput (HT) workflows allow several gene constructs to be evaluated in CLD, which increases the chance of success and of finding high producers. He also discussed the time gained with an improved CLD process compared to a traditional one (10 to 16 versus 40 weeks). Andreas gave an overview of cell culture systems for HT workflows, discussing deep-well plates, spin tubes, shake flasks, and microbioreactors as scale-down models for large-scale bioreactors. He pointed out the importance of validating the scale-down model to ensure that clones are ranked correctly with respect to titer and product quality. HT workflows in upstream development require HT titer methods and product quality assessment in HT format, and available technologies were presented. Finally, Andreas described how data acquisition, data management, and data analysis can be accomplished in upstream HTPD workflows.
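One common way to check that a scale-down model ranks clones correctly, in the sense Andreas emphasized, is a rank correlation between titers measured in the scale-down system and in the large-scale bioreactor. The sketch below computes Spearman’s rank correlation in plain Python on invented titer data.

```python
def rank(values):
    # 1-based average ranks, with ties sharing their mean rank.
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    # Pearson correlation applied to the ranks of x and y.
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical titers (g/L) for six clones in the scale-down model vs the bioreactor.
scale_down = [1.2, 2.8, 0.9, 3.5, 2.1, 1.7]
bioreactor = [1.4, 2.6, 1.0, 3.8, 2.0, 1.9]
print(round(spearman(scale_down, bioreactor), 2))  # 1.0 (identical clone ranking)
```

A coefficient near 1 means the scale-down system would select the same top clones as the production-scale bioreactor; a markedly lower value flags a scale-down model that needs rework before it can drive clone selection.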
Helping to gain better process understanding
Tobias Hahn from GoSilico™ GmbH presented an overview of the digitization of biopharma downstream processing, which included an introduction to mechanistic modeling as well as the resources available for such modeling. A mechanistic model is a mathematical representation of the physicochemical interactions during, for example, chromatography. It is built from equations that describe how molecules diffuse in a liquid, how they move through the stagnant film around chromatography beads, how they diffuse inside the pores of a bead, and finally how the molecules interact with ligands.
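As a toy illustration of the mechanistic-modeling idea (not GoSilico’s actual ChromX implementation), the sketch below integrates a lumped kinetic model of batch protein adsorption: Langmuir binding kinetics coupled to a liquid-phase mass balance, stepped with explicit Euler. All parameter values are hypothetical.

```python
def batch_adsorption(c0=2.0, qmax=60.0, ka=0.5, kd=0.05,
                     phase_ratio=0.03, dt=0.01, t_end=60.0):
    """Return (liquid concentration c, resin load q) at t_end.

    c0 in mg/mL, qmax in mg/mL resin, phase_ratio = V_resin / V_liquid.
    Hypothetical parameters; a real column model would add axial dispersion,
    film mass transfer, and pore diffusion terms.
    """
    q = 0.0
    t = 0.0
    while t < t_end:
        c = c0 - phase_ratio * q           # liquid-phase mass balance
        dq = ka * c * (qmax - q) - kd * q  # Langmuir adsorption kinetics
        q += dq * dt                       # explicit Euler step
        t += dt
    return c0 - phase_ratio * q, q

c_eq, q_eq = batch_adsorption()
print(round(c_eq, 2), round(q_eq, 1))
```

Even this minimal model reproduces the saturation behavior measured in plate experiments, and fitting its rate constants to HTPD data is, in miniature, how mechanistic models turn experimental points into an extrapolatable description of the step.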
GoSilico uses these capabilities and its ChromX™ software to support customers’ interests in better understanding issues related to molecule throughput, process understanding, regulatory risks, production challenges, and process transfer. Tobias explained that the primary reason industry is interested is to gain process understanding, including the impact of chromatography resin variability. He noted that common DoE analyses generate data but not knowledge. Mechanistic models can help make domain knowledge available and allow for extrapolation; as such, they provide a good basis for root cause analysis of process challenges.
Regulatory agencies, such as FDA, are well aware of the roles of mechanistic models for biopharmaceutical drug development and manufacture. For a recent article, see http://digitalcommons.unl.edu/usfda/23.
Fig 3. Knowledge pyramid for developing mathematical models, and schematics of types of models. (Source: Chatterjee, S., Moore, C. M. V., and Nasr, M. M. An Overview of the Role of Mathematical Models in Implementation of Quality by Design Paradigm for Drug Development and Manufacture.)
HTPD method adoption across all company functions
A worthwhile journey, but one that can also be outsourced to save time
Jennifer Pollard (MSD Biologics, Merck & Co.) presented on the current state of MSD Biologics’ use of HTPD across all functions and showed how Merck has progressed over the past decade to become a leader in HTPD use. Jennifer noted that from 2000 to 2007, MSD started with low-level use of TECAN® robotic workstations. From 2008 to 2011, HTPD began but was driven at the program level.
In 2011, she brought a dedicated technical leader (J. Welsh) onboard to support HTPD as a company-wide resource. This led to an interest in customized HTPD solutions. In the period 2014 to 2018, Merck & Co supported a University College London PhD student to work on scaled-down ultracentrifugation. By 2016, HTPD was used in 80% of PD projects, with standard workflows developed. There is now even greater company-wide coordinated use of HTPD data. The large amount of supplemental data that comes from HTPD has encouraged internal support for dedicated investment and resources.
Lessons learned on the HTPD journey of Merck & Co are:
- HTPD requires a strategy with vision for end-user model and value.
- Requires centralization of HTPD as a core competency.
- Set up a dedicated HTS Group in PD.
- Implementation is driven at the Group Leader/Project Leader level by individuals with a strategic desire to implement.
- Important to have the right people—engineering fundamentals more important than experience with robotics. People need to be dedicated to HTPD, not just doing it on the side. It takes three to six months to learn TECAN programming.
- Develop a two-year plan which includes development of needed workflows.
For organizations lacking internal HTPD resources, outsourcing parts of process development might be a better option. The workshop ended with a presentation by Charlotte Brink from Cytiva. She presented a case study on the use of HTPD to support development of a cation exchange chromatography step for a biosimilar mAb.
This work has been performed by Cytiva’s Fast Trak organization that supports biopharmaceutical companies with process development and other related services.
During the short project (three to four weeks), support for resin selection and identification of conditions and critical parameters was accomplished. Some of this work was also presented at HTPD 2017 and can be found in the extended abstract book from that meeting.
Keynote presentation on tools for in silico facilitated PD
In his keynote presentation, “In silico facilitated downstream process development”, Professor Cramer highlighted the interplay between the various in silico modeling tools developed in his lab, high-throughput experimentation to facilitate more rapid process development, and the development of more robust and integrated downstream processes.
He showed that several different in silico modeling tools are available today, for example:
- Mechanistic column models
- Molecular dynamics
- Quantitative structure activity relationship (QSAR) models
- In silico process ranking tools
- Protein docking software
- Surface property maps
- Big data driven tools.
Professor Cramer noted that QSAR is often limited by data and that it is important to train with the right data set. He then gave examples of how to use the above approaches to understand individual chromatographic unit operations as well as integrated downstream processes.
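Professor Cramer’s point that QSAR models are only as good as their training data can be illustrated with a deliberately tiny example: a one-descriptor linear QSAR fit by closed-form least squares. The descriptor (a surface hydrophobicity score) and retention values are invented for illustration.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Toy training set: hydrophobicity descriptor -> retention (arbitrary units).
hydrophobicity = [0.1, 0.3, 0.5, 0.7]
retention      = [1.0, 1.9, 3.1, 4.0]

slope, intercept = fit_line(hydrophobicity, retention)
predict = lambda x: slope * x + intercept

# Interpolation within the trained descriptor range is reasonable;
# a descriptor of 5.0 is far outside the data, so the model's prediction
# there is an untrustworthy extrapolation - the "limited by data" problem.
print(round(predict(0.4), 2), round(predict(5.0), 1))
```

A real QSAR uses many descriptors and nonlinear learners, but the lesson scales: a model trained on four mAb-like molecules has no basis for predicting the behavior of a molecule whose descriptors lie outside that training set.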
Some recent work on mAb target interactions with contaminants was noted (Ranjan, S. et al. Investigation of cathepsin D–mAb interactions using a combined experimental and computational tool set. Biotech. and Bioeng. 116, 1684–1697). Professor Cramer’s talk ended with a vision of where the HTPD field can go in the future: for example, synergistic unit operations to provide robust two-column (capture and polish) mAb purification processes, as well as the use of QSAR and mechanistic modeling to address issues in formulation.
Overall, the conference attendees were appreciative of the HTPD Conference Series and are looking forward to “HTPD VI” in two years. They are optimistic about the future of HTPD, especially regarding the development of better models and analytical methods, with the following observations:
- There is a need for better programming tools for some of the equipment used in HTPD.
- Use of HTPD in biopharmaceutical industry is expanding in terms of application areas and focus.
- Analytical methodologies remain key to the value and accuracy of HTPD. There is a common desire to continue to discuss and share best practices at the next HTPD meeting.
Over half of the meeting attendees were working for companies whose processing interests included viral vectors.
HTPD platforms appear amenable to new target molecule formats as such formats (e.g., bispecific antibodies, various vaccine types, nucleic acid-based therapeutics) often present novel separation challenges.
Smart PD is poised to make exciting strides in the next few years as we move to more accurate digital twins and training sets on the road to a more in silico process development future. Present challenges continue to involve real-time process data acquisition, data management, and standardization.
The future should see more collaboration between users and vendors in regard to enhancing HTPD capabilities so as to reflect industrial processing (e.g., clarification methods, flow through chromatography, membrane chromatography), new separation modalities (e.g., convective media), connected and continuous processing, adapting in silico approaches and models to more complex industrial operations.
Read the other reports from the HTPD conference: Applications of modern high-throughput process development and Smart process development and a look into the future.