Quality by Design (QbD) is a concept that aims to ensure optimum product quality by building into the overall process a precise understanding of, and the ability to control, all the parameters that may affect each material, its processing, and the manufacture of the final product. The approach goes hand in hand with in-line process analytical technology (PAT), which monitors key quality attributes and process parameters in real time, allowing immediate adjustment and potentially removing the need for laboratory analysis of final products prior to batch release.
QbD can be applied in any industry, and both the US Food and Drug Administration (FDA) and the European Medicines Agency (EMA) now have guidelines directing companies in the pharmaceutical and biotech sectors to base their product development and regulatory submissions on QbD principles. This has led both R&D and manufacturing organisations to embrace new thinking and new informatics technologies.
QbD affects both process development and QA/QC, suggest Kjell Francois, business team manager at Siemens, and Trish Meek, director of product strategy at Thermo Fisher Scientific. ‘PAT gives developers key data that can inform and help to direct changes in process parameters for scale-up, for example,’ Meek comments.
Combining real-time data with LIMS results
The same is true in pharmaceutical manufacturing and in upstream bio-production processes, such as fermentation, adds Francois. ‘Data collected in real time from individual PAT instruments, such as NIR (near-infrared) and Raman spectroscopy, laser diffraction and the latest ultra-performance liquid chromatography (UPLC) systems, can be reviewed alongside data from the LIMS (laboratory information management system), which collates and exports results from the laboratory testing of raw materials, work that is not carried out in real time. This collective information can then be interpreted to choreograph automation systems and maintain optimum manufacturing or processing parameters.’
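By way of illustration, the sketch below joins a stream of in-line PAT readings with batch-level LIMS results. Every dataset, column name and specification limit here is invented for the purpose of the example, rather than drawn from any particular vendor system.

```python
import pandas as pd

# Hypothetical in-line PAT readings, one row per real-time measurement
pat = pd.DataFrame({
    "batch_id": ["B001", "B001", "B002"],
    "timestamp": pd.to_datetime(
        ["2024-01-05 10:00", "2024-01-05 10:01", "2024-01-05 11:30"]),
    "nir_moisture_pct": [1.8, 2.4, 1.6],   # real-time NIR moisture estimate
})

# Hypothetical raw-material results exported from the LIMS (not real time)
lims = pd.DataFrame({
    "batch_id": ["B001", "B002"],
    "raw_material_assay_pct": [99.2, 98.7],  # offline laboratory assay
    "moisture_limit_pct": [2.0, 2.0],        # specification limit per batch
})

# Marry the two sources on batch identity, then flag out-of-spec readings
combined = pat.merge(lims, on="batch_id")
combined["out_of_spec"] = (
    combined["nir_moisture_pct"] > combined["moisture_limit_pct"])
print(combined[["batch_id", "timestamp", "nir_moisture_pct", "out_of_spec"]])
```

In a real deployment the flagged rows would feed the automation layer rather than a print statement, but the join itself is the essence of combining real-time data with LIMS results.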
Changing roles
QbD and PAT are also changing the role of the wet chemistry lab, Francois suggests. Real-time PAT will reduce the need for offline QC testing, but laboratories will still play a key role in optimising the measurement techniques and deriving the quality models and calibration curves needed to implement these in-line technologies. ‘The responsibility of the laboratory will shift from product testing to an almost research-driven role that will provide the models necessary to implement the PAT.’
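Deriving a calibration curve of this kind can be as simple as regressing offline reference assay values against the corresponding in-line sensor response. The sketch below, using invented numbers, fits a linear calibration and applies it to a new in-line reading.

```python
import numpy as np

# Hypothetical calibration data: in-line NIR absorbance for samples whose
# API concentration was determined offline by a reference laboratory method
absorbance = np.array([0.10, 0.21, 0.29, 0.41, 0.52])   # sensor response
concentration = np.array([2.0, 4.1, 5.9, 8.0, 10.2])    # mg/mL, reference assay

# Derive a linear calibration: concentration = slope * absorbance + intercept
slope, intercept = np.polyfit(absorbance, concentration, deg=1)

def predict_concentration(a: float) -> float:
    """Convert a raw in-line absorbance reading into a concentration."""
    return slope * a + intercept

print(f"calibration: c = {slope:.2f} * A + {intercept:.2f}")
print(f"new in-line reading 0.35 -> {predict_concentration(0.35):.2f} mg/mL")
```

Real spectroscopic models are usually multivariate (whole spectra rather than a single absorbance), but the laboratory's new role, providing and maintaining the model behind the probe, is the same.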
QbD and PAT will also facilitate a shift from batch to continuous manufacturing, Francois continues. ‘Running a continuous production line is only feasible if you have in-line technologies that can measure and monitor critical parameters in real time. Otherwise, you are running blind at high throughput until offline analyses are carried out.’
Marrying data
In-line PAT capabilities are accelerating the drive to integrate data from multiple sources, Meek suggests. ‘Gone are the days when data from laboratory information management systems (LIMS), laboratory execution systems (LES), scientific data management systems (SDMS) and PAT can be viewed as disparate and in separate layers. Only when you pull together data from all these systems, and in particular marry process and configuration data with that from in-line PAT analyses, can you realistically take full advantage of the tools available to interrogate and make sense of big data.’
Thermo Fisher’s SampleManager enables raw data management of the x-y pairs that make up mass spectrometry, chromatography and FT-IR data. ‘We have a web-based laboratory execution module that supports all the laboratory methods, and all that data is then rolled into a central database, along with other laboratory management tools. So now you can create dashboards, views and real-time analyses of what is happening in your laboratory, holistically. Statistical quality control (SQC) is also embedded, so that you can carry out trend charting and adjust your SQC limits in real time.’
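As a rough sketch of this kind of real-time SQC trending (not SampleManager’s actual interface), the following recomputes a centre line and three-sigma control limits as each new result arrives, flagging any result that falls outside them.

```python
import statistics

class ControlChart:
    """Minimal Shewhart-style individuals chart: the centre line and
    +/- 3-sigma control limits are recomputed as new results stream in."""

    def __init__(self):
        self.values = []

    def add(self, x: float) -> bool:
        """Record a result; return True if it breaches the current limits."""
        breach = False
        if len(self.values) >= 2:  # need at least two points for a stdev
            mean = statistics.mean(self.values)
            sd = statistics.stdev(self.values)
            breach = abs(x - mean) > 3 * sd
        self.values.append(x)
        return breach

chart = ControlChart()
for result in [10.1, 9.9, 10.0, 10.2, 9.8, 12.9]:
    if chart.add(result):
        print(f"out-of-control signal at {result}")
```

A production system would add the usual run rules and a minimum baseline before alarming, but the principle, limits that move with the data rather than being fixed at validation time, is what ‘adjusting SQC limits in real time’ amounts to.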
Change in mind-set
QbD and PAT have thus led to a complete change in mind-set, towards collective multivariate analyses rather than a bolting-together of disparate univariate analyses, Francois continues. ‘In the bio-production sector, for example, our customers are looking at integrating new types of analytical instrumentation and methodologies into their processes, and combining the data from these technologies with classic process parameters. It’s opening doors for much better controlled production and, ultimately, immediate product release. This goal has already been realised for the manufacture of Merck’s Januvia tablets, which are released onto the US market based entirely on in-line testing, with no QA analyses carried out in the laboratory.’
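The difference between univariate and multivariate thinking is easy to demonstrate. In the hypothetical sketch below, two correlated process variables each sit comfortably within their own three-sigma limits, yet a Hotelling-style multivariate score (a squared Mahalanobis distance) immediately flags the combination as abnormal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical historical batches: temperature and residual moisture are
# correlated in normal operation (hotter drying gives lower moisture)
latent = rng.normal(size=60)
temperature = 70.0 + 0.5 * latent
moisture = 1.8 - 0.1 * latent + rng.normal(scale=0.02, size=60)
good = np.column_stack([temperature, moisture])

mean = good.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(good, rowvar=False))

def t2_score(x: np.ndarray) -> float:
    """Hotelling-style T2: squared Mahalanobis distance from normal operation."""
    d = x - mean
    return float(d @ cov_inv @ d)

# Each variable sits within ~1.5 sigma of its own mean, so two separate
# univariate checks would pass -- but the combination (hot AND wet) breaks
# the learned correlation and produces a very large multivariate score
suspect = np.array([70.5, 1.95])
print(f"T2 = {t2_score(suspect):.1f}")
```

This is the weakness of ‘bolting together disparate univariate analyses’: each chart is blind to the relationships between variables that multivariate models capture.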
Multivariate analyses for one process or product may also be relevant to other products or processes, and can potentially help to direct process or manufacturing development, Meek adds. ‘Companies have realised that if they are going to take advantage of all this data and use it as part of their QbD approach, they need it centralised and collated into a single, holistic solution. That opens up opportunities to leverage multivariate data for advanced analytics, not just for a single product, but potentially for designing processes and predicting bottlenecks or manufacturing problems for similar products in development. This ability to transfer knowledge and make it relevant to multiple products is in part what’s driving paperless laboratory initiatives and convergent informatics. R&D departments are talking to manufacturing departments, because PAT data from process and manufacturing stages can also feasibly be used to inform compound selection and product development at an early stage.’
A single interface for integration
The application of QbD and PAT requires flexible technologies that can capture real-time data and present it, in combination with off-line analytical and process data from LIMS and automation systems, to enable immediate decision-making and changes to process parameters. Siemens’ SIPAT platform, for example, has been developed as a single interface to facilitate the implementation of PAT in a pharmaceutical manufacturing setting, Francois claims. ‘The system can integrate with, configure and control a range of analytical instrumentation, and collects, stores and reports all analytical data. SIPAT is also easily integrated with existing LIMS and automation systems to provide a single platform for accessing, combining, querying and analysing multivariate data in a range of formats, and for making immediate changes to parameters to maintain optimum processing and manufacturing quality.’
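The ‘single interface’ idea can be sketched generically: each instrument driver, whatever its native protocol, is wrapped to expose the same configure-and-measure contract, so the platform layer above sees one API. The Python below is purely illustrative and bears no relation to SIPAT’s actual programming interface.

```python
from abc import ABC, abstractmethod

class PATInstrument(ABC):
    """Hypothetical common interface: every instrument driver exposes the
    same configure/measure contract, whatever its native protocol."""

    @abstractmethod
    def configure(self, settings: dict) -> None: ...

    @abstractmethod
    def measure(self) -> dict:
        """Return a dict of named quality attributes."""

class NIRProbe(PATInstrument):
    def configure(self, settings: dict) -> None:
        self.averaging = settings.get("averaging", 4)

    def measure(self) -> dict:
        return {"moisture_pct": 1.9}       # placeholder for a real driver call

class RamanProbe(PATInstrument):
    def configure(self, settings: dict) -> None:
        self.exposure_ms = settings.get("exposure_ms", 500)

    def measure(self) -> dict:
        return {"polymorph_ratio": 0.97}   # placeholder for a real driver call

# The platform layer sees one interface, not N instrument protocols
instruments = [NIRProbe(), RamanProbe()]
snapshot = {}
for inst in instruments:
    inst.configure({})
    snapshot.update(inst.measure())
print(snapshot)   # {'moisture_pct': 1.9, 'polymorph_ratio': 0.97}
```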
Enabling efficient process design
The SIPAT model builder and report manager enable the user to search for and request data, including context and calibration data, so that all necessary information can be used in modelling software packages. ‘With SIPAT you get all your relevant quality and process data presented in one single tool,’ Francois explains. ‘SIPAT can retrieve all the process, analytical and LIMS data from a single time point or multiple time points, and feed that data into mathematical packages and software, including Siemens’ own tools and third-party platforms. It means that you can view your processes from a much wider perspective, and gain new insights into the impact of changing either individual parameters or multiple parameters. The ability to use data from multiple source layers, such as LIMS, PAT and automation, will ultimately enable the design of much more efficient processes and higher-quality products, which brings us right back to the underlying concept of QbD.’
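Retrieving data from several source layers at matching time points is, at its core, a time-series join. The generic sketch below (again with invented data and column names, not SIPAT’s API) aligns each in-line PAT measurement with the most recent automation-system reading, so the combined table can be handed to a modelling package.

```python
import pandas as pd

# Hypothetical time-stamped data from two source layers
process = pd.DataFrame({
    "time": pd.to_datetime(["10:00", "10:05", "10:10"]),
    "temperature_c": [69.8, 70.4, 70.1],    # automation-system readings
})
pat = pd.DataFrame({
    "time": pd.to_datetime(["10:01", "10:06", "10:11"]),
    "nir_moisture_pct": [1.9, 1.7, 1.8],    # in-line PAT measurements
})

# Align each PAT measurement with the most recent process reading,
# tolerating small clock offsets between the two systems
aligned = pd.merge_asof(
    pat.sort_values("time"), process.sort_values("time"),
    on="time", direction="backward", tolerance=pd.Timedelta("5min"),
)
print(aligned)
```

Once aligned, the table can be fed to whatever mathematical package is in use to explore how changing one or several parameters affects the measured quality attributes.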
This ability to manage, query, report and manipulate multivariate data fits with the ultimate goal of big data management and integration: getting the maximum utility from data drawn from disparate sources. We may be on the brink of a new generation of tools that make smarter use of data from multiple layers that might otherwise remain dead and unusable.