With advancing digitalization, mechanistic modeling has established itself as a method of choice for improved process understanding, intensified process development, and high process performance. Mechanistic models have become a valuable tool for describing chromatographic separation processes. Digital twins based on mechanistic models enable in-depth understanding of even very complex separation processes and are applicable at any point of a therapeutic product’s life cycle, reducing the cost and time of downstream process development.

However, the main hurdle in every mechanistic modeling project is the often-cumbersome model calibration. Multicomponent feedstocks lead to high-dimensional parameter estimation problems with many unknown biomolecule-specific thermodynamic parameters. Unsupervised model calibration approaches may result in unreasonable correlations and physically unrealistic parameter estimates.

This article outlines considerations within a standardized workflow to ensure good modeling practice. General considerations are complemented by an application example showcasing the optimization of an ion-exchange polishing step. The target biomolecule, a monoclonal antibody, was separated from charge variants using a precharacterized f(x) column for rapid calibration.

1. Set the scope

To follow good modeling practice, the first and most important step is to set the scope and context of use and to define the project goal. These considerations drive the experimental design of the calibration runs and the definition of a model validation strategy. It is crucial to understand the process challenges and to define the expected model capability, to ensure that the model will later meet the desired requirements.

Find a general guide to setting the scope here.

2. Select variable and non-variable process parameters and chromatography setup

After defining the project scope, the variable and non-variable process parameters of the project are selected. Variable parameters are the parameters that are varied to fulfill the scope of the project (e.g., salt concentration for elution, pH at different steps of the process, or load density). The non-variable parameters refer to the chromatography setup and include, for instance, the scale of column and system used for calibration. Other non-variable parameters that are part of the chromatographic separation problem at hand may include resin type, buffer system, or suitable analytical methods to distinguish target and impurities.

Identifying the process parameters that will not be varied to achieve the defined project goal is the first step in narrowing down the boundaries for the calibration experiments. The setup and process parameters can be selected based on prior experimental knowledge and experience, for example from processes with a comparable molecule or from a resin screening.

3. Define boundaries for variable process parameters

The third step is identifying the boundaries for the variable process parameters that are relevant to meeting the project goal. By setting these boundaries for process parameters like pH, phase duration, or load density, we effectively narrow down the parameter space that is going to be explored to find optimal conditions. Setting these boundaries also serves as a requirement to define the experimental design for model calibration. It allows us to focus our efforts on exploring the most promising regions of the parameter space. In many cases, these boundaries can be set based on experience or existing experimental knowledge such as plate screening studies or dynamic binding capacity runs.
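As a minimal sketch, such boundaries can be captured in a simple data structure that later drives the experimental design and in silico optimization. All parameter names and values below are illustrative placeholders, not recommendations:

```python
# Hypothetical boundaries for the variable process parameters of an
# ion-exchange polishing step; all values are illustrative placeholders.
parameter_bounds = {
    "pH": (5.0, 7.5),                     # buffer pH range to explore
    "elution_salt_mM": (50.0, 500.0),     # salt concentration for elution
    "load_density_g_per_L": (10.0, 40.0), # column load density
}

def within_bounds(point, bounds=parameter_bounds):
    """Check that a candidate operating point lies inside all boundaries."""
    return all(lo <= point[name] <= hi for name, (lo, hi) in bounds.items())
```

Recording the boundaries explicitly, rather than implicitly in individual method files, makes it easy to verify that every planned calibration and optimization run stays inside the intended parameter space.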

4. Plan and execute your calibration experiments

Experimental design and considerate execution are key to successful process development. The previously defined boundaries, project goal, and available resources should be considered for the experimental design. The aim of experiments performed for model calibration is to capture all relevant chromatographic effects and phenomena. Consequently, experimental variation (e.g., gradients and steps, high and low load density) needs to be reflected by the experimental plan.

Furthermore, the predictive power of the model relies on the experimental data it is built on. A mechanistic model for chromatography combines fluid-dynamic effects occurring in the system and column with thermodynamic effects taking place between biomolecule and ligand. To be able to describe both effects, system and column characterization experiments as well as adsorption experiments with biomolecule feed should be performed. Reliable experimental data is crucial to ensure that the calibrated model fits its purpose.
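A calibration plan that reflects the experimental variation described above (gradients and steps, high and low load density) can be sketched as a small set of run definitions. The run types and values here are hypothetical examples, not a prescribed design:

```python
# Hypothetical calibration plan for an ion-exchange step: linear gradient
# elutions (LGE) of different lengths at low load, one LGE at high load to
# capture displacement effects, and one step elution. Values are illustrative.
gradient_lengths_cv = [10, 20, 30]   # gradient length in column volumes
low_load, high_load = 2.0, 30.0      # load density in g/L resin

plan = [{"type": "LGE", "gradient_CV": g, "load_g_per_L": low_load}
        for g in gradient_lengths_cv]
plan.append({"type": "LGE", "gradient_CV": 20, "load_g_per_L": high_load})
plan.append({"type": "step", "elution_salt_mM": 250, "load_g_per_L": low_load})
```

Varying one aspect at a time (gradient length at fixed load, then load at fixed gradient) keeps the information content of each run easy to attribute to specific model parameters.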

5. Data integrity check

The next step is to start building the virtual representation of the chromatographic process by importing data into GoSilico chromatography modeling software. A data integrity check should be performed before starting model calibration. System characteristics reflected in the experimental data, such as dead volumes and mixing effects, need to be accounted for to model the conductivity trace accurately.

6. Select a model

A mechanistic model for liquid chromatography accounts for the fluid-dynamic phenomena of system and column as well as the thermodynamic phenomena describing the interaction of biomolecules and ligand. Model selection follows a bottom-up approach: the selected column, pore, and isotherm models should be as simple as possible and only as complex as necessary. The chosen models must describe the effects present in the data while avoiding the overparameterization that comes with overly complex models whose parameters lose physical meaning.
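To make the simplicity-versus-complexity trade-off concrete, consider the steric mass action (SMA) isotherm, a widely used thermodynamic model for ion-exchange chromatography. Unlike a simple Langmuir isotherm, it couples protein binding to salt concentration, characteristic charge, and steric shielding, which is exactly the kind of added complexity that should only be introduced when the data require it. The sketch below solves the single-component SMA equilibrium numerically; all parameter values are hypothetical:

```python
from scipy.optimize import brentq

def sma_equilibrium(c_protein, c_salt, keq, nu, sigma, capacity):
    """Solve the single-component steric mass action (SMA) equilibrium
    q = keq * c_protein * ((capacity - (nu + sigma) * q) / c_salt)**nu
    numerically for the bound protein concentration q [M]."""
    f = lambda q: q - keq * c_protein * ((capacity - (nu + sigma) * q) / c_salt) ** nu
    q_max = capacity / (nu + sigma)   # steric saturation limit
    return brentq(f, 0.0, q_max * (1.0 - 1e-12))

# Illustrative parameters: characteristic charge nu = 5, steric factor
# sigma = 10, ionic capacity 0.5 M, dilute protein load at 0.1 M salt.
q = sma_equilibrium(c_protein=1e-3, c_salt=0.1, keq=10.0,
                    nu=5.0, sigma=10.0, capacity=0.5)
```

Each additional isotherm parameter (here nu and sigma beyond a Langmuir-type keq and capacity) must be identifiable from the calibration data; otherwise a simpler isotherm is the better choice.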

Find a general guide to model selection here.


Fig 4. Fluid-dynamic and thermodynamic models

7. Calibrate your model

Model calibration, or parameter estimation, is the process of measuring or mathematically estimating the parameters of generally applicable model equations so that they represent the actual physical system. The input data for model calibration are the wet-lab experiments and the respective offline analytical data. The aim of calibration is to identify parameter values such that the simulated chromatograms match the experimental measurements. Specific attributes of a chromatogram carry information about specific parameters; exploiting this relationship for parameter determination helps avoid parameter correlations and physically meaningless estimates.
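The inverse-estimation idea can be sketched with a deliberately simple toy example: fitting the parameters of a single-peak model to a synthetic "experimental" chromatogram by least squares. A real calibration fits the parameters of the transport and isotherm models instead; the Gaussian peak here is purely illustrative:

```python
import numpy as np
from scipy.optimize import least_squares

def simulate(params, t):
    """Toy 'chromatogram': one Gaussian peak standing in for a full
    mechanistic simulation. params = (height, retention time, width)."""
    h, mu, sigma = params
    return h * np.exp(-0.5 * ((t - mu) / sigma) ** 2)

# Synthetic 'experimental' data with known true parameters plus noise.
rng = np.random.default_rng(42)
t = np.linspace(0.0, 20.0, 400)
y_exp = simulate((1.0, 10.0, 1.5), t) + rng.normal(0.0, 0.01, t.size)

# Estimate parameters by minimizing the simulated-minus-measured residual.
fit = least_squares(lambda p: simulate(p, t) - y_exp, x0=(0.8, 9.0, 1.0))
h_est, mu_est, sigma_est = fit.x
```

The optimizer adjusts the parameters until the simulated trace matches the measurement, which is exactly the logic of mechanistic model calibration, only with physically meaningful transport and thermodynamic parameters in place of peak height, position, and width.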


Fig 5. Model calibration

For example, linear gradient elution (LGE) experiments in ion-exchange chromatography at low column loading can be used to determine the biomolecule's characteristic charge and equilibrium constant using the Yamamoto approach. At least two, ideally three, LGE experiments with different gradient lengths are needed to derive both parameters from the gradient slopes and the salt concentrations at peak elution. The Yamamoto approach is readily available in GoSilico chromatography modeling software.
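In a common linearized form of the Yamamoto relation, log(GH) = (nu + 1) * log(I_R) - log(Keq * Lambda^nu * (nu + 1)), where GH is the normalized gradient slope, I_R the salt concentration at peak elution, and Lambda the ionic capacity of the resin. A linear regression of log(GH) against log(I_R) over a few LGE runs then yields the characteristic charge nu and the equilibrium constant Keq. A sketch with synthetic, noise-free data and an assumed ionic capacity (all values hypothetical):

```python
import numpy as np

LAMBDA = 0.5  # ionic capacity of the resin [M]; assumed pre-characterized

def yamamoto_fit(gh, i_r, ionic_capacity=LAMBDA):
    """Estimate characteristic charge nu and equilibrium constant keq from
    normalized gradient slopes GH and peak elution salt concentrations I_R
    via the linearized Yamamoto relation:
        log(GH) = (nu + 1) * log(I_R) - log(keq * Lambda**nu * (nu + 1))
    """
    slope, intercept = np.polyfit(np.log(i_r), np.log(gh), 1)
    nu = slope - 1.0
    keq = np.exp(-intercept) / (ionic_capacity ** nu * (nu + 1.0))
    return nu, keq

# Synthetic data generated from known parameters (nu = 5, keq = 10) to show
# that the regression recovers them exactly in the noise-free case.
nu_true, keq_true = 5.0, 10.0
i_r = np.array([0.30, 0.35, 0.40])  # peak elution salt concentration [M]
gh = i_r ** (nu_true + 1) / (keq_true * LAMBDA ** nu_true * (nu_true + 1))
nu_est, keq_est = yamamoto_fit(gh, i_r)
```

With real, noisy data the three gradient lengths provide the redundancy needed for a meaningful regression, which is why two runs are the minimum and three are recommended.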

After the charge and equilibrium parameters are identified, further parameters such as the binding kinetics and pore diffusion characteristics can be determined, for example from a step elution experiment. The biomolecule-ligand interaction at high load density, including displacement or repulsion between biomolecules, can be determined from an additional LGE at high load. This step-by-step approach mitigates the risk of unreasonable correlations and physically unrealistic parameter estimates.

Find a general guide to model calibration here.

8. Analyze parameter uncertainty

After model calibration, the model quality needs to be investigated. The first indicator for model quality is the visual fit. A high-quality model describes all calibration runs accurately.

Once a good visual fit is obtained, the model quality can be evaluated more rigorously. To investigate the parameter sensitivity, the 95% confidence intervals of the estimated parameters need to be determined. These can be calculated directly in GoSilico chromatography modeling software. Well-determined parameters have narrow confidence intervals; broad confidence intervals indicate that the respective parameters could not be determined accurately from the calibration data.
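Outside dedicated software, approximate confidence intervals for a least-squares fit can be derived from the Jacobian at the optimum: the parameter covariance is estimated as s^2 * (J^T J)^-1, with s^2 the residual variance. The toy sketch below fits a single Gaussian peak to synthetic data and derives approximate 95% intervals; it illustrates the linearized-statistics idea only and is not the software's actual algorithm:

```python
import numpy as np
from scipy.optimize import least_squares

def peak(p, t):
    """Toy single-peak model; p = (height, retention time, width)."""
    h, mu, sigma = p
    return h * np.exp(-0.5 * ((t - mu) / sigma) ** 2)

rng = np.random.default_rng(1)
t = np.linspace(0.0, 20.0, 400)
y = peak((1.0, 10.0, 1.5), t) + rng.normal(0.0, 0.01, t.size)

fit = least_squares(lambda p: peak(p, t) - y, x0=(0.8, 9.0, 1.0))

# Linearized covariance: s^2 * (J^T J)^-1, s^2 = residual variance.
dof = t.size - fit.x.size
s2 = 2.0 * fit.cost / dof            # least_squares cost = 0.5 * sum(res**2)
cov = s2 * np.linalg.inv(fit.jac.T @ fit.jac)
ci_halfwidth = 1.96 * np.sqrt(np.diag(cov))  # approximate 95% intervals
```

A parameter whose confidence half-width is large relative to its estimate is poorly determined by the data, which is the signal to add targeted calibration experiments.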

To investigate how the parameter uncertainty affects the model prediction, the parameter space can be sampled within the calculated confidence intervals and the calibration experiments re-simulated for each sample. This sampling helps determine whether the parameters are determined precisely enough for the intended application. Based on the outcome of the parameter uncertainty analysis, additional calibration experiments can be performed to improve the model quality.
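The propagation idea can be sketched with the Yamamoto relation from the calibration step: sample the charge nu and equilibrium constant Keq within hypothetical confidence intervals, and for each sample compute the predicted peak elution salt concentration via the inverted relation I_R = (GH * Keq * Lambda^nu * (nu + 1))^(1 / (nu + 1)). All interval bounds and constants below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)
LAMBDA, GH = 0.5, 4e-4  # ionic capacity [M] and normalized gradient slope

# Hypothetical 95% confidence intervals from a previous calibration.
nu_interval = (4.8, 5.2)
keq_interval = (8.0, 12.0)

nu = rng.uniform(*nu_interval, size=1000)
keq = rng.uniform(*keq_interval, size=1000)

# Predicted peak elution salt concentration [M] for each parameter sample.
i_r = (GH * keq * LAMBDA ** nu * (nu + 1.0)) ** (1.0 / (nu + 1.0))
spread = i_r.max() - i_r.min()  # uncertainty of the predicted elution point
```

If the resulting spread of the predicted elution point is wider than the process can tolerate, the parameter determination is not yet sufficient and further calibration experiments are warranted.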

9. Validate your model

Once calibrated, the model must be validated experimentally. Validation can be done with one or multiple experiments. A common approach is to perform validation runs at in silico optimized process conditions. Another possibility is to choose process conditions at the edges of failure. Experiments at different scales can also be included, provided that all systems and columns used have been characterized.

The validation is typically done with respect to peak shapes and positions as well as critical quality attributes. The validation runs can be imported to GoSilico chromatography modeling software to compare experimental data and model prediction.

10. Apply and further develop your mechanistic model


Once successfully calibrated and validated, the model can be used along the entire development life cycle of the product and process. Validated mechanistic models can extrapolate to process conditions outside the calibration space.

To ensure a suitable balance between effort and benefit, keep the desired model application in mind throughout the modeling workflow, as different model purposes have different implications for the required model quality and capability.

Read more about model application examples here.


Fig 9. Model applications along the lifecycle of a biomolecule

Summary

This clearly defined, straightforward workflow significantly facilitates the calibration and application of mechanistic models: it increases the speed of model calibration, reduces model uncertainty, and mitigates the risk of parameter overfitting. Following this approach allows mechanistic modeling to unfold its full potential as an in silico process development tool.

