The biomanufacturing industry is undergoing a major shift, from single-product processes and stainless steel infrastructure to flexible, multi-product facilities using single-use technology. Though single-use technology is being widely adopted, automation, measurement, and data integration still lag behind what is needed to make full use of it.
Next Challenges in Biomanufacturing
The next set of biomanufacturing challenges goes hand in hand: (1) scale-down of the bioprocess, and (2) perfusion/continuous processing (CP). The first is driven by steadily rising production titers; a 6 g/L titer is no longer uncommon, and this figure already exceeds the limits of many downstream processing capabilities. The progress in production titers has created a gap in single-use processing, where upstream productivity is mismatched with the throughput and capacity of the downstream equipment. Even for a typical 200 liter bioreactor – far below the 2,000 liter scale – companies will need to run multiple cycles downstream (specifically chromatography) or run parallel single-use units to meet throughput demands, raising important questions about how to automate and configure the plant for that level of throughput. Chromatography equipment will have to be configured differently to handle the large volume, as will filtration equipment, because high protein loads will clog filters. In these cases, companies must ask themselves:
- Will we have automation to wash the filter?
- Will we have intermediate cleaning steps?
- How do we create the unit operation to handle that amount of protein?
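To see why a modest single-use bioreactor can already strain a capture step, it helps to put numbers on the mismatch. The sketch below estimates how many chromatography cycles one harvest requires; the column size, binding capacity, and loading factor are illustrative assumptions, not vendor specifications.

```python
# Rough sizing sketch: how many capture chromatography cycles does a
# single harvest need? All figures are illustrative assumptions.

def chromatography_cycles(bioreactor_volume_l, titer_g_per_l,
                          column_volume_l, binding_capacity_g_per_l,
                          capacity_utilization=0.8):
    """Return the number of capture cycles needed for one harvest."""
    protein_mass_g = bioreactor_volume_l * titer_g_per_l
    mass_per_cycle_g = (column_volume_l * binding_capacity_g_per_l
                        * capacity_utilization)
    full_cycles, remainder = divmod(protein_mass_g, mass_per_cycle_g)
    return int(full_cycles) + (1 if remainder > 0 else 0)

# Example: 200 L harvest at 6 g/L (1,200 g of protein) on an assumed
# 20 L column with 40 g/L dynamic binding capacity, loaded to 80%:
print(chromatography_cycles(200, 6, 20, 40))  # 2
```

Doubling the titer to 12 g/L under the same assumptions pushes this to four cycles, which is where automated cycling or parallel single-use units become unavoidable.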
CP offers many benefits, including increased productivity, but it brings complexities in process control, data recording, and regulatory compliance. In the past, when engineers designed an end-to-end process, it was divided into unit operations with clear boundaries: a unit operation batch followed by a transition point, then the next unit operation and its transition point, and so forth. A batch comprised this series of unit operations, and regulation was organized around those clearly bounded steps.
With CP, there is continuous flow of material from the bioreactor through the last polishing step. How is a batch defined, now that product runs through all the unit operations continuously end to end? How is a process deviation documented when a batch fails? What are the affected materials in a failure? If a failure is detected at the end of the process, how does one know when the batch went bad, and can one keep the materials previously produced? With batch operations, it is much easier to determine the affected lots in a deviation.
The benefits of CP coupled with single-use technology are higher throughput, increased flexibility, elimination of cleaning issues, and reduced operating costs. As downtime is minimized and titers increase, products can be processed much more quickly than in standard batch operations. But as facilities scale processes down and link everything together, automation must be improved.
Measurement in Continuous Processing
The addition of CP also drives the need for analytics. Continuous operations do not have the same intermediate cut-off points to perform analytics that batch operations do, so there is a need for in-line measurement to continuously measure the protein or antibody quantity at a certain point.
This presents a challenge for the analyzers themselves, as well as the issue of how process data is fed back and how the process reacts to it. In some cases, the analyzers simply do not exist yet, but where they do exist, they may take considerable time to provide results. If there is a reduction in quality or a failure during processing, this lag makes it difficult to determine the point in time at which the issue began and diagnose the quality or contamination issue. In the meantime, the CP process is producing materials and incurring operating expenses.
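The traceability question above can be made concrete: when an analyzer finally flags a failure, the earliest potentially affected material must be back-calculated from the analyzer's dead time and the residence time upstream of the sampling point. This is a hedged sketch with illustrative parameter names, not a validated deviation procedure.

```python
# Back-calculating the affected-material window in a continuous
# process. Parameter names and the safety margin are assumptions.

def affected_window_start(detection_time_h, analyzer_delay_h,
                          residence_time_h, safety_margin_h=0.5):
    """Earliest process time (h) whose material may be affected,
    given when the failure was reported, how long the analyzer takes
    per result, and the residence time from the bioreactor to the
    sampling point."""
    return (detection_time_h - analyzer_delay_h
            - residence_time_h - safety_margin_h)

# Failure flagged at t = 48 h; the analyzer needs 2 h per result;
# material takes 6 h to travel from bioreactor to sampling point:
print(affected_window_start(48, 2, 6))  # 39.5
```

Every hour of analyzer lag widens this window, which is why faster in-line measurement directly reduces the amount of material that must be quarantined after a deviation.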
Automation: Addressing Challenges and Making Use of Existing Infrastructure
Automation will play a major role in facilitating new processing strategies such as semi-continuous, perfusion, and continuous and will help bridge the gap for companies with existing infrastructure. It will enable more efficient dosing/feeding in the bioreactor, buffer dilution, pH changes in the process, and creation of gradients in the chromatography. In a continuous process with large amounts of buffers, automation will allow facilities to dilute concentrated buffers in-line to avoid sizeable liquid storage.
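The in-line buffer dilution mentioned above reduces to a simple mass balance: the concentrate and diluent flow rates must be split so that the blended stream hits the target concentration at the target total flow. A minimal sketch, with assumed parameter names:

```python
# In-line dilution sketch: blending an N-fold buffer concentrate with
# water (e.g. WFI) to 1x strength at a target total flow rate.
# Mass balance: C_conc * Q_conc = C_target * Q_total.

def dilution_flow_rates(concentrate_factor, total_flow_l_per_min):
    """Return (concentrate_flow, water_flow) in L/min for an N-fold
    concentrate diluted to 1x at the given total flow."""
    concentrate_flow = total_flow_l_per_min / concentrate_factor
    water_flow = total_flow_l_per_min - concentrate_flow
    return concentrate_flow, water_flow

# Diluting a 10x buffer concentrate to 1x at 5 L/min total flow:
conc_flow, water_flow = dilution_flow_rates(10, 5.0)
print(conc_flow, water_flow)  # 0.5 4.5
```

The automation layer's job is then to hold this flow ratio under control (with feedback from conductivity or pH) so the facility can store a 10x concentrate in one-tenth the tank volume.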
Updating Existing Infrastructure
One of the attractive qualities of single-use is that the components are easy to modify, unlike stainless steel equipment, which can require pipe cutting, welding, and the addition of ports. In a single-use context, the alteration is as simple as adding another tube, a sensor, or a junction by melting plastic. Though modification in the single-use sphere is relatively uncomplicated, automation must be added to maximize the potential of the technology. Flexible plug-and-play control systems will become critical in enabling older generations of single-use equipment to carry out more complex operations.
Additionally, components such as pumps and valve controls must be scaled down and made more accurate for continuous and smaller-scale processing. These physical improvements will be necessary as dose amounts become smaller – at a certain point, companies may hit a limit where a dose is a single droplet coming from a tube, which puts the burden of accuracy (or availability/existence) on the system components themselves.
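The droplet limit above is easy to quantify: once a pump can only deliver whole droplets, the droplet volume sets a floor on dosing resolution. The droplet size below is an illustrative assumption.

```python
# Dosing resolution at the droplet limit. A typical droplet volume is
# assumed here for illustration; real values depend on tubing and
# fluid properties.

def dosing_resolution_error(dose_ul, droplet_ul=25.0):
    """Worst-case relative dosing error when the smallest deliverable
    increment is one droplet."""
    return droplet_ul / dose_ul

# A 100 uL dose delivered in ~25 uL droplets can be off by up to 25%:
print(dosing_resolution_error(100))  # 0.25
# A 1 mL dose under the same assumption is accurate to within 2.5%:
print(dosing_resolution_error(1000))  # 0.025
```

This is why accuracy at small scale becomes a hardware problem: no control algorithm can dose more finely than the physical components allow.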
Global Process Optimization: From Islands of Data to Integrated Facilities
In the past, automation was typically built around the unit operations of a batch process, but as processes move from single process steps to continuous operation, these “islands of data” must be tied together through a layer of automation for communication. There is an interdependency between measurement and real-time “action” that requires global process optimization.
OPC, an open connectivity protocol (originally built on Microsoft OLE/COM technology and now maintained by the OPC Foundation) that allows for real-time vertical integration, provides connections between different kinds of controllers (e.g. Siemens or Emerson DeltaV), enables data storage in one large repository, and allows all the unit steps to interact through the upper layer. A facility can also add a manufacturing execution system (MES), such as Werum or Syncade (Emerson), so that the plant network architecture consists of multiple layers, which perform the following functions:
- MES layer: allows the user to create and optimize a recipe based on all of the unit steps, and subsequently operates the automation islands in synchrony with one another, accounting for transition steps to complete the batch. This upper layer contains the master recipe and electronic records of each batch.
- Harmonization layer: OPC provides open connectivity between controllers and the MES layer, effectively tying the islands of data together.
- Control layer: oversees and controls each unit operation and gathers process data in one repository (called a historian database).
- Measurement layer: provides the process parameter information to the control and MES systems
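The division of labor between these layers can be illustrated with a toy sketch: controllers expose readings under their own vendor-specific signal names, a harmonization layer (standing in for OPC) maps them to common tags, and a historian stores everything in one repository. All names here are illustrative; this is not a real OPC implementation.

```python
# Toy sketch of the layered architecture: vendor-specific controller
# signals are harmonized into common tags and logged centrally.

from datetime import datetime, timezone

class Historian:
    """Control-layer repository: one time-stamped record per reading."""
    def __init__(self):
        self.records = []

    def log(self, tag, value):
        self.records.append((datetime.now(timezone.utc), tag, value))

def harmonize(controller_readings, tag_map, historian):
    """Harmonization layer: translate vendor-specific signal names
    into common tags and store the values in one place."""
    for vendor_name, value in controller_readings.items():
        historian.log(tag_map[vendor_name], value)

historian = Historian()
# Two "controllers" reporting data under different naming conventions:
harmonize({"AI_4711_PV": 7.02}, {"AI_4711_PV": "bioreactor.pH"}, historian)
harmonize({"uv280": 1.35}, {"uv280": "chromatography.uv_au"}, historian)

print([tag for _, tag, _ in historian.records])
# ['bioreactor.pH', 'chromatography.uv_au']
```

With all tags in one repository, the MES layer above can evaluate the whole batch record rather than reconciling separate islands after the fact.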
Without the “upper layer,” data is not integrated and there is nothing to tie the transitions between operations together. The integrated approach is more streamlined because a facility’s information systems and actions can function as a whole. If a problem is detected in one part of the process, the batch can be stopped before consuming a significant amount of materials (if a protein were of suboptimal quality, a facility would not want to waste chromatography resins purifying a protein that would never go into an end product).
The benefit of integrated facilities is to provide user flexibility, and, in the case of….