
Process optimisation and technology transfer


As part of a series of articles based on a recent online roundtable event entitled ‘From research to manufacturing: Overcoming data challenges in the drug development cycle’, hosted by Scientific Computing World, we asked our panellists about how better data policies in drug development can lead to better process optimisation and technology transfer.

 

The speed and volume of the discovery process can create bottlenecks, though, meaning in silico techniques should extend beyond discovery and into process development and tech transfer.

 

“The adoption of AI in drug discovery means more and more candidates are coming through with a higher likelihood of making it through clinical trials,” says DataHow’s von Stosch. “As AI is also being used in clinical trials, that part should accelerate as well. That all adds up to a much faster timeline in drug development. However, we have the same process development capacity. We are not able to keep up with the increased throughput we’re seeing.

 

“So, we need to increase efficiency on the process development side. While automation is one way of doing that, a much more efficient solution is to use in silico process development data to increase capacity.

 

“Several pharmaceutical companies have built a process model that allows data from phase one to be used in phases two or three, and lets process characterisation studies be carried out partially in silico. Now, though, you can even do this using data from several different molecules or from several different gene therapies together – this can lead to huge savings. This is commonly referred to as ‘transfer learning’.”
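As a rough sketch of what that kind of transfer learning could look like in code – purely illustrative, with synthetic data and a generic scikit-learn model standing in for a real process model – a network is pre-trained on pooled runs from earlier molecules and then fine-tuned on the handful of runs available for a new candidate:

```python
# Sketch: transfer learning for a process model (hypothetical inputs/outputs).
# Pre-train on pooled runs from earlier molecules, then fine-tune on the
# small dataset available for the new candidate.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical process inputs (pH, temperature, feed rate) and an output (titre)
X_hist = rng.uniform([6.8, 35.0, 0.5], [7.4, 37.5, 2.0], size=(300, 3))  # earlier molecules
y_hist = 2.0 * X_hist[:, 2] - 0.5 * np.abs(X_hist[:, 0] - 7.1) + rng.normal(0, 0.1, 300)

X_new = rng.uniform([6.8, 35.0, 0.5], [7.4, 37.5, 2.0], size=(12, 3))    # new candidate: few runs
y_new = 1.8 * X_new[:, 2] - 0.7 * np.abs(X_new[:, 0] - 7.0) + rng.normal(0, 0.1, 12)

scaler = StandardScaler().fit(X_hist)
model = MLPRegressor(hidden_layer_sizes=(32,), warm_start=True,
                     max_iter=2000, random_state=0)

# Step 1: pre-train on the pooled historical molecules
model.fit(scaler.transform(X_hist), y_hist)

# Step 2: continue training (fine-tune) on the new molecule's small dataset;
# warm_start=True keeps the learned weights instead of re-initialising them
model.max_iter = 200
model.fit(scaler.transform(X_new), y_new)

print(model.predict(scaler.transform(X_new[:3])))
```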

 

Recycling data from manufacturing back to in silico experiments is key to refining recipes in drug development and streamlining processes.

 

“You want to increase the number of in silico experiments,” says CCDC’s Back, “and you want to use as much manufacturing data as you can. One of the challenges in that area is getting the manufacturing data into the right format so that you can leverage it in the future. You might be outsourcing manufacturing to external organisations, so you need to ensure they receive data from you in a form they can act on, but also that they can feed data back to you so that you can learn from it and use it.

 

“In research, we’re better at storing our data but, in GMP manufacturing sites, they have different priorities. Moving them to the point where you get that data – where you can learn from it and hopefully train machine learning models on it – is probably a little bit further away.”

 

Exactly when process optimisations start and end is often down to practicality and finances. “Process development doesn’t always get to the point where it is fully optimised,” says CPI’s Kürten. “Often, it stops as soon as you get a process that you can run in a plant. In the biopharma space, the product is made in very small volumes, but at a very high value. So, even a large pharma company will not necessarily have enough data to build a substantive model that actually gives you insights, because it won’t produce enough of the product, or won’t want to do enough development on it.

 

“At CPI, we are putting together collaborative projects where people share those learnings and those insights to help on the process manufacturing side of things.”

 

The challenge of technology transfer – moving from theoretical data into the manufacturing space – involves many different personas, according to Curlew’s Lynch. “Depending on the modality, you would have a data team; then, as you transition to manufacturing, you might have non-core pharma personnel involved, particularly if it goes out to a third party. It’s about aligning those different personas and providing the manufacturing group with the right information to proceed.

 

“The ISA-88 standard (a set of guidelines for designing, implementing and operating batch control systems) is key here, as it helps keep the recipe separate from the equipment, meaning the former could be moved to different providers without necessarily needing the exact same equipment.”
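To illustrate that recipe/equipment separation in the loosest terms – a simplified, hypothetical data model, not the ISA-88 object model itself – a recipe can be expressed as equipment-independent phases that are only bound to a given site’s equipment modules at execution time:

```python
# Sketch of ISA-88's recipe/equipment separation (hypothetical, simplified model).
from dataclasses import dataclass

@dataclass
class Phase:
    """An equipment-independent processing step from the master recipe."""
    name: str
    parameters: dict

# The recipe says WHAT to do, with target parameters, but names no equipment.
master_recipe = [
    Phase("charge_buffer", {"volume_l": 200}),
    Phase("mix", {"duration_min": 30, "temperature_c": 25}),
    Phase("transfer", {"target": "hold_vessel"}),
]

# Each site maps phase names onto its own equipment modules (hypothetical IDs).
site_a_equipment = {"charge_buffer": "DOSING-SKID-01", "mix": "STR-500L", "transfer": "PUMP-07"}
site_b_equipment = {"charge_buffer": "DOSING-SKID-A", "mix": "SUB-1000L", "transfer": "PUMP-12"}

def bind_recipe(recipe, equipment_map):
    """Bind the equipment-independent recipe to one site's equipment at run time."""
    return [(phase.name, equipment_map[phase.name], phase.parameters) for phase in recipe]

# The same recipe runs at either site, on different kit, without being rewritten.
print(bind_recipe(master_recipe, site_a_equipment))
print(bind_recipe(master_recipe, site_b_equipment))
```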

 

Kürten agrees that there are cultural differences between the experimenters and the manufacturers. “There are different ontologies and business architectures on the development side compared to manufacturing,” he says. “As you move from your development plant to your manufacturing plant, there will be different kinds of people from different cultures and they might even have different terms for the same things that they’re doing.

 

“Successful partnerships have a structured way of talking about data to help bridge that gap between the process development people, who are used to one way of working and one way of describing what they’re doing, and the manufacturing people. It might not even just be the distinction between GMP and development, but also just the different mindsets between people.

 

“The key is developing an ontology or architecture that is powerful enough to describe the process to everyone involved, yet flexible enough to allow new things to be added.”
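One very simple way to picture such a shared, extensible vocabulary – all terms below are invented examples, not a real ontology – is a set of canonical names with site-specific synonyms mapped onto them, to which new concepts can be added without disturbing existing entries:

```python
# Sketch: a tiny extensible vocabulary mapping development and manufacturing
# terms onto shared canonical names (all terms here are hypothetical examples).
canonical_terms = {
    "cell_culture_duration": {"dev": "culture time", "gmp": "fermentation duration"},
    "harvest_titre": {"dev": "final titre", "gmp": "harvest concentration"},
}

def to_canonical(term: str) -> str | None:
    """Return the canonical name for a site-specific term, if one is registered."""
    for canonical, synonyms in canonical_terms.items():
        if term == canonical or term in synonyms.values():
            return canonical
    return None

# New concepts can be registered without touching existing entries.
canonical_terms["viable_cell_density"] = {"dev": "VCD", "gmp": "viable cell count"}

print(to_canonical("fermentation duration"))  # -> cell_culture_duration
print(to_canonical("VCD"))                    # -> viable_cell_density
```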

 

There are a number of mitigation strategies organisations can take to minimise disruption in technology transfer – and it’s not just as simple as writing a guide, as von Stosch explains. “There is often room for interpretation between the way a transfer guide is written and how the person that later reads that guide understands it,” he says. “That ‘room for interpretation’ can increase tech transfer times, as it takes a lot of back and forth to smooth out the misunderstandings.

 

“It’s important to record data points that may be less obvious, because once it’s recorded, there can be no ambiguity about that data. A more systematic, digitally-supported knowledge transfer can save time.”

 

ChemPlus’s Green says modelling has a part to play right up until the manufacturing stage: “At the manufacturing end, you have quality controls and traditional ways of rejecting batches,” he says, “but modelling might offer you the opportunity to investigate why that is happening, or even predict a rejection before you observe it.”
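A minimal sketch of the predictive use Green describes, using entirely synthetic data and a generic classifier rather than any particular vendor’s tool: a model trained on in-process measurements from past batches estimates the risk that a running batch will be rejected, before the final QC result is in.

```python
# Sketch: predicting batch rejection from in-process data (synthetic example).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Hypothetical in-process measurements for 200 historical batches:
# [pH deviation, temperature deviation, impurity signal]
X = rng.normal(0, 1, size=(200, 3))
# In this toy dataset, batches with high impurity plus temperature drift were rejected
rejected = (X[:, 2] + 0.5 * X[:, 1] + rng.normal(0, 0.3, 200) > 1.0).astype(int)

clf = LogisticRegression().fit(X, rejected)

# Score a batch that is still running, from its in-process readings so far
current_batch = np.array([[0.2, 1.5, 1.8]])
print(f"Predicted rejection risk: {clf.predict_proba(current_batch)[0, 1]:.2f}")
```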

 

Back says there are lots of examples of stakeholders working together to improve the feedback and optimisation loop between manufacturing and experimentation. “There was some work done a few years ago in the ADDoPT (Advanced Digital Design of Pharmaceutical Therapeutics) consortium by Perceptive Engineering (now part of Applied Materials), working with PSE (Process Systems Enterprise, now part of Siemens) to integrate soft sensors into manufacturing processes. It meant that if you observed some form of deviation in the output you were getting from your PAT (Process Analytical Technology), the equipment settings would be sent off to a model, which would process that change and tweak the settings of whatever piece of kit was connected to it. So, as well as getting the data structure right, this kind of modelling can help adapt processes on the fly in a way that’s acceptable.”
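As a loose illustration of that kind of soft-sensor feedback loop – not the ADDoPT or Perceptive Engineering implementation; the model, signals and correction rule below are all invented – a PAT reading is passed through a simple soft-sensor model and, when the predicted quality attribute drifts outside its band, a small setpoint correction is applied:

```python
# Sketch of a PAT-driven soft-sensor feedback loop (illustrative only; all
# signals, models and correction rules here are invented).

def soft_sensor(pat_spectrum_peak: float, feed_rate: float) -> float:
    """Hypothetical soft sensor: estimate a quality attribute from PAT + settings."""
    return 0.8 * pat_spectrum_peak + 0.3 * feed_rate

def control_step(pat_spectrum_peak: float, feed_rate: float,
                 target: float = 5.0, tolerance: float = 0.2) -> float:
    """Return an adjusted feed-rate setpoint if the predicted attribute drifts."""
    predicted = soft_sensor(pat_spectrum_peak, feed_rate)
    deviation = predicted - target
    if abs(deviation) > tolerance:
        # Simple proportional correction, clamped to a safe operating range
        feed_rate = min(max(feed_rate - 0.5 * deviation, 0.5), 2.0)
    return feed_rate

# Simulated loop: the PAT signal drifts upward and the setpoint is trimmed back
feed_rate = 1.2
for pat_peak in [5.5, 5.8, 6.1, 6.4]:
    feed_rate = control_step(pat_peak, feed_rate)
    print(f"PAT peak {pat_peak:.1f} -> feed-rate setpoint {feed_rate:.2f}")
```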

 

Collaboration is also at the heart of the work that Lighten and his team do at Cell and Gene Therapy Catapult. “In the field of cell and gene therapies,” he says, “new companies are spawning organically to solve a biological problem (e.g. augmenting our immune systems to cure complex diseases), while perhaps neglecting core elements of the data integrity standards recommended by the Medicines and Healthcare products Regulatory Agency (MHRA, UK) or the Food and Drug Administration (FDA, USA) in their data governance guidance for GxP sectors (good laboratory practice, good clinical practice, good manufacturing practice, good distribution practice and good pharmacovigilance practice). There are GMP/digital partners, who have been applying rigorous manufacturing and data integrity standards in other sectors for decades, that can help them address these issues from development to manufacturing.”

 

Lighten also says that digitisation of batch manufacturing records has huge potential for time savings. “We have recently collaborated with Autolomous to develop a digital version of our paper-based batch manufacturing record – an eBMR – that exploits manufacturing system connectivity (e.g. APIs) to pull together data from the manufacturing process. The tricky part is implementing an orchestration layer that allows seamless ‘plug and play’ interoperability among diverse manufacturing equipment and systems. The development of personalised medicine will only add to the potential for more data silos unless we solve this interoperability issue and carefully design how data is shared and protected between systems and users.”
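A very rough sketch of the orchestration idea Lighten describes – hypothetical adapters and field names, not the Autolomous eBMR or any real instrument API – is to wrap each piece of equipment in a common adapter interface and assemble the batch record by querying whichever adapters are registered for that batch:

```python
# Sketch: pulling equipment data into an electronic batch manufacturing record
# via a common adapter interface (hypothetical adapters, not a real eBMR API).
from abc import ABC, abstractmethod
from datetime import datetime, timezone

class EquipmentAdapter(ABC):
    """Common interface so the eBMR layer does not depend on any one vendor's API."""
    @abstractmethod
    def fetch_records(self, batch_id: str) -> list[dict]: ...

class BioreactorAdapter(EquipmentAdapter):
    def fetch_records(self, batch_id: str) -> list[dict]:
        # In practice this would call the instrument's own API or data historian
        return [{"source": "bioreactor-01", "parameter": "pH", "value": 7.1}]

class CentrifugeAdapter(EquipmentAdapter):
    def fetch_records(self, batch_id: str) -> list[dict]:
        return [{"source": "centrifuge-02", "parameter": "speed_rpm", "value": 3000}]

def assemble_ebmr(batch_id: str, adapters: list[EquipmentAdapter]) -> dict:
    """Build one batch record by querying every registered equipment adapter."""
    return {
        "batch_id": batch_id,
        "generated_at": datetime.now(timezone.utc).isoformat(),
        "entries": [rec for adapter in adapters for rec in adapter.fetch_records(batch_id)],
    }

print(assemble_ebmr("BATCH-0001", [BioreactorAdapter(), CentrifugeAdapter()]))
```

New equipment then only needs a new adapter, rather than a change to the record-assembly logic, which is one way of reading the ‘plug and play’ interoperability goal.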

 

The full report is available to download as a White Paper, which also covers: Data collection and formats; Data silos and how to avoid them; Data ontologies and efficiency of process development; Cultural change and the digitisation journey; and The shift to in silico for experiments.
 

You can register to download the White Paper here.

The roundtable and series of articles is sponsored by Siemens Digital Industries.
