Protein purification workflow: A retrospective 10 years in the making

Abstract

The explosion of orphan drugs and biosimilars over the last decade, as well as the practical realization of personalized medicine, has resulted in a complete re-engineering of the discovery and manufacturing of protein therapeutics. Whereas the pharmaceutical industry had once completely focused its efforts on finding molecules that would service a broad patient population and could be produced in mass quantities, pricing pressures and smaller target markets for many therapeutic categories have changed the focus to smaller, more selective production runs.

Initiatives like the Dark Proteome, together with high-throughput, deep-diving metabolomics, dramatically improved our understanding of cellular function and pathophysiology; combined with computational methods, these advances allowed protein therapeutics to be redesigned to increase efficacy while limiting side effects.

The 25,000-liter tanks that once dominated cavernous production facilities have been replaced, for the most part, with portable, single-use production pods that can be quickly linked together to form completely continuous protein expression and purification streams. And yet, remarkably, the material being produced in these smaller tanks is significantly more stable and requires lower doses than similar biotherapeutics produced in the larger facilities.

So how did we get to this point? What innovations occurred a decade ago that led us to this reality?

Looking back

March 2016—Although the protein-based therapeutics market continues to be dominated by monoclonal antibodies (mAbs), other proteins and peptides are being explored as potential therapeutic and diagnostic resources. And just like mAbs, each category offers advantages and challenges.

“To overcome the size and stability limitations of mAbs, a large body of work has focused on the generation of small non-antibody scaffolds for human therapy and imaging applications,” wrote MedImmune’s Lutz Jermutus and colleagues in a recent review of molecules such as affibodies, bicyclic peptides and Kunitz domains.

“The lack of intrinsic half-life extension immediately puts scaffolds at a disadvantage by limiting their ability to accumulate at a site of pharmacological action,” they cautioned. “Consequently, most scaffold-based drug candidates require the use of protein engineering or formulation to ensure continuous exposure after dosing.”

Similarly, Keld Fosgerau and Torsten Hoffmann of Zealand Pharma described efforts to convert peptides ranging from hormones to growth factors to anti-infectives into therapeutics. “Naturally occurring peptides are often not directly suitable for use as convenient therapeutics because they have intrinsic weaknesses, including poor chemical and physical stability, and a short circulating plasma half-life,” the researchers wrote in a review. “Some of these weaknesses have been successfully resolved through what we term the ‘traditional design’ of therapeutic peptides.

“We foresee that medicinal peptide chemistry could be applied increasingly in a modular fashion, comparable to micro-electronics or LEGO bricks,” they continued. “The intelligent assembly of known modules with unique properties and functions could lead to the construction of multifunctional molecular medicines with improved efficacy, pharmacokinetic properties, and targeted delivery.”

Much like the combinatorial chemistry advancements that have already occurred, the success of protein and peptide design relies not only on computational tools to predict the changes most likely to improve efficacy or safety, and expression systems in which to generate those changes, but also on an array of analytical tools to verify and validate the modifications.

Roche Diagnostics’ Dietmar Reusch and colleagues described their efforts to develop a high-throughput system to analyze the glycosylation state of recombinantly expressed IgGs. By adapting a DNA analyzer, they were able to examine up to 96 samples simultaneously using capillary gel electrophoresis (CGE) and laser-induced fluorescence (LIF) detection. “Using the 96-capillary setting, and allowing for the 45-min run-time for one CGE-LIF analysis, up to 3000 samples can be analyzed in a day,” the authors reported. “Because the method is rather simple and does not rely on mass spectrometry (MS), we expect that it can also be run in non-specialized laboratories or in an at-line analytics setting near the fermentation process.”
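
The throughput figure the authors quote follows from simple arithmetic; a minimal sketch, using only the 96-capillary count and 45-minute run time given in the text:

```python
# Daily throughput of a 96-capillary CGE-LIF system with a 45-min run time.
capillaries = 96
run_time_min = 45
minutes_per_day = 24 * 60

runs_per_day = minutes_per_day // run_time_min   # 32 back-to-back runs
samples_per_day = capillaries * runs_per_day     # 96 samples per run

print(samples_per_day)  # 3072, i.e. "up to 3000 samples" per day
```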

Another challenge for protein therapeutics is that they are produced in living organisms that contribute their own proteins to the mix. In many cases, target proteins are expressed into the growth medium, limiting host cell protein (HCP) contamination. And whenever possible, the initial enrichment step involves affinity chromatography (e.g., mAbs and Protein A resin). But even with these steps, it is vital to know that the observed biological effects result from the therapeutic candidate and not from a contaminant. In addition, even if HCPs don’t affect efficacy, they may yet trigger side effects.

“To date, proteomics-based profiling of HCPs has been almost universally performed by highly resolving 2D gel electrophoresis followed by MS identification of excised spots, usually with the purpose of identifying proteins differentially expressed under specific conditions,” explained Andrew Goetze and colleagues at Amgen. “These approaches are inherently non-quantitative in an absolute sense, but can be used to make relative comparisons.”

The researchers analyzed mAb isolates with 2D liquid chromatography prior to MS analysis (MSE). By overloading the column(s) to improve sensitivity for rare molecules, the researchers were able to consistently identify and quantitate HCPs present down to 20 ppm.
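
HCP levels like the 20 ppm floor cited above are conventionally expressed as a mass ratio of ng HCP per mg of product. A quick sketch of that conversion (the example masses are illustrative, not from the study):

```python
# ppm (parts per million by mass) = ng of HCP per mg of product,
# since 1 ng / 1 mg is a 1e-6 mass ratio.

def hcp_ppm(hcp_ng: float, product_mg: float) -> float:
    """Parts-per-million HCP relative to product mass."""
    return hcp_ng / product_mg

# 2 µg of a residual host cell protein in 100 mg of purified mAb:
level = hcp_ppm(hcp_ng=2000, product_mg=100)
print(level)  # 20.0 ppm -- the reported quantitation floor
```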

“We anticipate that rapid advances in instrumentation will soon facilitate more confident MSE analysis of downstream samples, including mAb drug product,” they concluded. “This, in turn, should allow more rational evaluation of potential HCP-related safety risks to patients based on identification and quantification of individual HCPs. This can be especially pertinent for comparing biosimilar mAbs, which may represent the end products of very different cell culture and purification processes.”

Just because a molecule looks good in the discovery phase, however, doesn’t mean it will be satisfactory as it moves into small-scale production and ultimately into manufacturing. For this reason, the historic divisions between discovery and development have started to dissolve. The result is the sharing of expertise to understand developability of candidate therapeutics.

From discovery to development

“Companies don’t have the resources to pursue research discoveries that just aren’t capable of being made into a product,” says Michael Ultee, founder of Ulteemit BioConsulting. They are therefore forced to determine ways in which to transform a promising lead into something that can be more easily produced.

Fortunately, he suggests, researchers now have a better understanding of the molecules they are trying to produce and can link critical quality attributes (CQAs)—such as free sulfhydryl groups or batch-variable glycosylation—back to therapeutic characteristics such as efficacy and immunogenicity.

“These are not things that normally discovery scientists would be screening for,” offers Tom Ransohoff, vice president of BioProcess Technology Consultants. “It is this improved methodology and understanding of what’s important that’s led to the concept of quality by design,” or QbD.

“People are looking at the structures up front and asking, ‘Is this important to the activity of the molecule?’” Ultee says. “If it is, we’ll find some way to work with it. If not, let’s engineer it out with a more manufacturing-friendly structure.”

Thus, rather than wait until the discovery phase has been completed and then re-engineer the candidate protein to suit development, questions about CQAs are being asked much earlier. “The other reason is that we want to do things faster,” Ransohoff explains. “If we can get a jump-start on development even before we’ve selected our final product, we may be able to save months in the development program without spending a huge amount more money.”

He adds, “When I started in the industry over 25 years ago, whatever the process gave us was what we were going to put into humans to see if it worked. And then if it worked, we were just going to try to make it the same way and scale it up.” Now, he says, researchers are moving to a mind-set of defining up front what they want the product to do and then designing into the molecule the CQAs that it needs to meet those objectives.

Similarly, many of the high-throughput resources that served discovery-phase characterization are helping optimize developability of the redesigned candidates. “On the upstream side, it’s the development of high-efficiency, well-controlled minibioreactors,” Ultee says, suggesting that developers can vary growth conditions or test expression systems in 24 or 48 small bioreactors, each of which can be monitored for pH, CO2 and O2 levels, stirring speed and other factors.

On the downstream side, there have been significant advances in the automated screening of chromatographic resins to optimize protein-isolation methods. For example, microtiter plates filled with a variety of resins can be monitored for basic on/off binding, washing and elution via automated liquid handling. “This is something that we used to do with packed chromatography columns at a much larger scale, one run at a time,” Ransohoff recalls. “Now we can do 96 experiments at a time to look at optimizing our downstream processes more efficiently and with less material.”

“What it doesn’t fully capture, however, is the dynamic nature of chromatography processes,” Ransohoff shared. This is where high-throughput chromatography systems that rely on small, prepacked columns are finding increased use. “This has enabled process development scientists to not just use their best hunch for what might work, but to try several iterations and do in a week what otherwise might take months,” Ultee explains.
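
The plate-based screen described above reduces, in essence, to scoring recovery across a grid of conditions and picking the best. A hypothetical sketch, with invented resin names and numbers purely for illustration:

```python
# Hypothetical resin screen: each well is one resin/pH/salt combination;
# recovery = eluted mass / loaded mass. All values below are invented.
loaded_mg = 1.0

# (resin, pH, NaCl mM) -> mg eluted, e.g. read from a plate assay
eluted = {
    ("ResinA", 5.0, 50): 0.62,
    ("ResinA", 7.0, 150): 0.88,
    ("ResinB", 5.0, 50): 0.71,
    ("ResinB", 7.0, 150): 0.54,
}

# Pick the condition with the highest recovery across all 96 (here, 4) wells.
best = max(eluted, key=lambda cond: eluted[cond] / loaded_mg)
print(best, eluted[best] / loaded_mg)
```

In practice each well would also be scored for purity and aggregate content, but the selection logic is the same.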

The result is a much more complete picture of the design space for any individual step. At the same time, just as chromatographic separation is moving into high-throughput, so too must the analytics. “That’s the other piece of the puzzle that is really important in driving this forward and that people are working on,” Ransohoff says. “It’s still obviously a work in progress, and that ties back to the whole integration of discovery and development, too.”

Scale and single-use

Improvements in the design of the target product and the steps required to produce it have also led to shrinking differences in scale between pilot projects and manufacturing. “A lot of the giant tanks that were built years ago are now no longer really being used as much, except by someone who has an existing process,” Ultee says. “The most famous case is Genentech, where they built a new facility with four 25,000-L bioreactors and then never really used them. The titers got high enough that they could get by with smaller bioreactors.”

Increasing titers from cell culture has meant companies can now harvest as much protein from smaller bioreactors as they used to harvest from large stainless steel vats. “You can get by with 2000-L production at 5 g/L,” Ultee says.

“The other trend that’s happened is that the molecules have gotten more specific and selective, so that you’re able to use lower dosages,” he explains. “For some of the early antibody products, the dosages were quite high.” He offers the examples of Herceptin and Avastin, where each dose for a patient may have been half a gram. Newer products, however, have benefited from advances in medical science and molecular design, making it possible to administer much lower doses. Therefore, less material needs to be produced. “These trends have all come together to reduce the size of the bioreactor or enable single-use bioreactors to be a broad platform across the industry.”
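
The figures Ultee quotes make the sizing argument concrete. A back-of-the-envelope sketch using the 2000-L volume, 5 g/L titer and half-gram dose from the text (ignoring downstream purification losses):

```python
# Bioreactor sizing arithmetic from the quoted figures.
volume_l = 2000
titer_g_per_l = 5
dose_g = 0.5   # "half a gram" per dose for early mAb products

batch_yield_g = volume_l * titer_g_per_l   # 10,000 g = 10 kg per batch
doses_per_batch = batch_yield_g / dose_g   # 20,000 half-gram doses

print(batch_yield_g, doses_per_batch)
```

Newer, lower-dose molecules stretch each batch even further, which is why single-use bioreactors at this scale can serve whole product lines.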

According to Ransohoff, the influx of single-use or disposable technologies in the industry offers benefits such as significant reductions in the amount of capital required to build a manufacturing facility and in the amount of time needed to get a manufacturing facility up and running. It also impacts process portability.

“No matter how hard you try, if you’re building a stainless steel facility in two different locations, they don’t wind up being exactly the same,” Ransohoff explains. “So moving a process from one facility to another requires a certain amount of effort and investment in technology transfer.”

Ransohoff adds, “That becomes easier and easier as we understand our processes better, but if you have a single-use process where you’re buying all of the equipment every batch you make, and you’re just inflating the bags and setting them up in a facility, it makes the differences between facilities far less significant.”

According to Ultee, “A lot of companies are going all single-use as their production module. This is certainly the case with Catalent.”

Shortly after announcing its entry into the antibody-drug conjugate market in 2013, Catalent Biologics opened its new single-use biomanufacturing facility, quadrupling its capacity. Company president Barry Littlejohns stated the move to single-use would reduce the risk of cross contamination and give the company flexibility and scale.

Late last year, the company announced a further expansion of its single-use capabilities with a focus on start-ups looking to accelerate proof of concept and achieve subsequent rounds of funding. “Other companies have whole single-use modules,” Ultee adds, citing Patheon’s FlexFactory.

The move to single-use often starts with CMOs, Ultee suggests, because they have so many different products being manufactured and therefore require greater operational flexibility. That said, Pfizer partnered with Jacobs Engineering in 2014 to develop a Rapid Deployment Module that integrates modular equipment, fully automated control systems and single-use technology.

Continuously improving

Another advance that has facilitated the move to smaller bioreactors is the adoption of continuous culturing techniques, whereby growth medium into which target protein is being secreted is harvested on the fly and replaced with fresh medium. This allows cultures to be run at higher cell densities than in batch mode, and potentially grown for months rather than weeks.
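
The stability argument for continuous harvest comes down to residence time: in a well-mixed perfusion culture, a secreted molecule spends on average only V/Q in the reactor before harvest, rather than accumulating for the full run as in batch mode. A sketch with illustrative numbers (none are from the article):

```python
# Mean residence time of secreted product in a perfusion bioreactor:
# tau = V / Q under a well-mixed approximation. Numbers are illustrative.
volume_l = 200               # working volume
perfusion_rate_l_day = 200   # harvest/feed rate: 1 reactor volume per day

tau_days = volume_l / perfusion_rate_l_day   # ~1 day at culture pH/temperature
batch_duration_days = 12                     # assumed fed-batch run length

print(tau_days, batch_duration_days)
```

The shorter exposure is what limits modifications such as deamidation, as Cooney and Konstantinov note below.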
 
Continuous bioproduction is particularly useful when the protein candidate is unstable or labile, but it may offer broader benefits.

“There may be quality benefits even with the production of stable proteins as glycosylation profiling changes … have been reported in batch systems, most likely due to post-secretion enzymatic modification or degradation,” reported MIT’s Charles Cooney and Genzyme’s Konstantin Konstantinov in a white paper presented to the 2014 International Symposium on Continuous Manufacturing of Pharmaceuticals.

“Modifications, such as deamidation, that are associated with prolonged molecule exposure to bioreactor pH and temperature conditions can also be minimized with continuous cultivation,” they continued. “The operation of a process at an optimal steady state facilitates the application of QbD principles. The high degree of automation in continuous processing further increases operational consistency as it minimizes manual operations and subjective decision-making.”

At the same time, the current technologies are not easily implemented, cautions Eric Langer, managing partner of BioPlan Associates. “Even though approaches like perfusion have been around for many decades, they tend to be perceived as complex, risky in terms of contamination events, and [they] require unique sets of expertise that are often difficult to hire in.”

Furthermore, “until downstream technologies are introduced, the full benefits of a truly continuous approach will not be realized,” he adds. “There are a number of technologies in development, but until more end users see continuous bioprocessing as a viable manufacturing strategy, and until these technologies are accepted by regulators, the path forward will be a slow one.”

Although some prefer the familiarity of batch mode, other companies endeavor to bring continuous bioprocessing to market. Ultee describes the technology as a simulated moving bed (SMB) approach, in which sample and buffers switch among multiple columns running in series. Although SMB has been around for decades, says Ransohoff, it relied on mechanical parts that were largely anathema to biopharmaceutical development. “Rotating valves with sliding seal surfaces, where the surfaces are wetted, is okay in chemical process industries, but it is absolutely not okay in biopharma industries where the possibility of bacterial growth is always there,” he says. Thus, SMB developers had to re-engineer systems. 

Looking beyond chromatography, Cooney points out that many of the technologies already in use in batch mode are inherently continuous, citing as examples centrifugation and tangential flow filtration. “I believe experience—aka confidence—will slowly convert the batch to continuous workflows, as material use and thus costs are reduced with continuous,” he says.

One step in helping companies adapt to continuous methods will be the development of near-real-time monitoring, akin to process analytical technology (PAT). “People have different schools of thought on PAT across the industry,” Ransohoff offers. “Some people feel like once you really understand your process, the types of things you need to measure to determine whether your process is in control should be relatively simple and straightforward and fully applicable to PAT.” 

But “other people see more value in bringing more sophisticated analytical techniques closer to the process,” he continues. These techniques have included everything from quantitative PCR to monitor gene expression in cultures to size-exclusion chromatography, differential scanning calorimetry and resonant mass measurement to monitor protein aggregation.

Although Ultee is loath to suggest process monitoring is likely to offer real-time analysis, if only because most of the analyses take too long, he notes that monitoring efforts have increased in recent years. “You can look at the titer with rapid online sampling,” he says, but if you’re looking for process impurities such as HCPs, you are likely relying on an ELISA assay that may take hours. However, he is quick to point to technological improvements in HPLC and UPLC that have enabled process engineers to study batch variations much more quickly. “They are tiny columns that are typically run in less than a few minutes,” he says. “So you’re getting opportunities to get more data closer to real-time.”

BioPlan’s Langer is more bullish: “According to our 12th Annual Report and Survey of Biomanufacturing, there is a large and rapidly growing need for more and better types of analytical technologies and assays. And the need for real-time monitoring is only going to increase.”

How these production and analytical innovations affect protein purification research over the next decade, however, is anyone’s guess. Clearly, evolution, optimization and streamlining to create more efficient and effective workflow processes will be a primary focus.


References

Vazquez-Lombardi R, Phan TG, Zimmerman C, Lowe D, Jermutus L, Christ D. “Challenges and opportunities for non-antibody scaffold drugs” Drug Discov Today. 2015; 20(10):1271-1283.

Fosgerau K, Hoffmann T. “Peptide therapeutics: current status and future directions” Drug Discov Today. 2015; 20(1):122-128.

Reusch D, Haberger M, Kailich T, Heidenreich A-K, Kampe M, Bulau P, Wuhrer M. “High-throughput glycosylation analysis of therapeutic immunoglobulin G by capillary gel electrophoresis using a DNA analyzer” mAbs. 2014; 6(1):185-196.

Zhang Q, Goetze AM, Cui H, Wylie J, Trimble S, Hewig A, Flynn GC. “Comprehensive tracking of host cell proteins during monoclonal antibody purifications using mass spectrometry” mAbs. 2014; 6(3):659-670.

Cooney CL, Konstantinov KB. “White paper on continuous bioprocessing” J Pharm Sci. 2015; 104(3):813-820.