In today’s world, where drug development integrates science and technology, consumer safety is paramount in the pharmaceutical industry. Any drug released to the public must meet strict regulatory standards and be supported by robust data demonstrating that the results accurately reflect the development process used.
Maintaining and assuring data accuracy and reliability across processes, including compound synthesis, production optimisation, quality control and product release, is a critical aspect of drug development. Without managing data integrity across all phases of discovery and development, process errors can go unchecked, procedure and activity tracking becomes unattainable, and large volumes of data can be lost. The result can be sub-par products that present hazards to human health.
Data integrity is defined as the completeness, accuracy and consistency of data, indicated by the absence of inconsistencies or alterations to prescribed methods. Companies strive for data integrity and are required to maintain it through compliance with regulations set by government agencies and safety organisations. Because the pharmaceutical industry develops medicines to improve health, it is held to higher standards of data management than most other industries.
Ensuring data integrity is not a simple task. With such a complex chain of events and multiple groups working together to achieve an end goal, preserving data integrity across an entire process is an enormous challenge. Consider the need to track every lab employee’s activity, every instrument calibration, use and reading, every compound analysis step and every quality control point. The data collection alone can quickly become a significant undertaking.
Taking an informatics view of data acquisition and management builds in and automates data integrity.
Guiding principles
The pharmaceutical industry generates a massive amount of information through the research, discovery and development of drug products. From gaining a new understanding of metabolic pathways that influence drug efficacy, to synthesising new compounds discovered to be a key ingredient in a new treatment, labs constantly generate data. Due to this regular influx of information, labs can adhere to certain principles that help maintain quality in their processes and output.
Regulatory agencies, including the US Food and Drug Administration (FDA) and the European Medicines Agency (EMA), recommend five key principles to support good industry practice. This set of principles, known by the acronym ALCOA, governs the way data can be generated, collected and managed. According to ALCOA, all data associated with a pharmaceutical product should be attributable, legible, contemporaneous, original and accurate, in addition to complete, consistent and enduring. Following these principles allows pharmaceutical labs to ensure a quality standard that can stand the test of time and regulatory reviews.
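To make the principles concrete, the sketch below shows one way a lab result might carry ALCOA metadata as a structured record. It is illustrative only: the LabResultRecord class and its field names are hypothetical, not drawn from any particular informatics product.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)  # frozen: the captured record cannot be silently edited later
class LabResultRecord:
    """Hypothetical result record carrying the metadata ALCOA calls for."""
    sample_id: str       # which sample the result belongs to
    analyst_id: str      # attributable: who generated the data
    instrument_id: str   # attributable: which instrument produced it
    value: float         # the measured result itself
    unit: str            # legible: units stored alongside the value
    method_id: str       # original/accurate: the prescribed method used
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )                    # contemporaneous: captured at the time the work is done

# The record is created as the measurement is taken, not transcribed afterwards.
result = LabResultRecord(
    sample_id="BATCH-042-S1",
    analyst_id="jdoe",
    instrument_id="HPLC-07",
    value=99.2,
    unit="% label claim",
    method_id="QC-SOP-113",
)
```

Making the record immutable mirrors the original and enduring expectations: once written, a value is corrected by a new, attributed entry rather than by overwriting the old one.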
In addition to keeping data consistent with the ALCOA principles, labs are required to comply with regulations designed to assure regulators and consumers of quality. Compliance can only be confirmed through regular audits and reviews of laboratory processes and data management systems. To demonstrate compliance successfully, evidence of a monitored production chain across the complex phases of drug development must be readily available.
The data collection web
Obtaining accurate and integrated data across all networks within the pharmaceutical development, manufacturing and supply chains can mean the difference between success and failure. Data comes from everywhere within a pharmaceutical network and creates a web of information that can be organised into one location for analysis. Following the ALCOA principles and complying with strict regulations on data management – to ensure a high level of data integrity – allows the industry to maintain public trust, regulatory compliance and brand value.
The need for good data collection and management systems is central to any lab’s priorities. If a lab analyst records out-of-specification data from a quality control check in an uncontrolled worksheet, then validates and passes the product on official reporting documents, the data chain has been broken. Even if a second quality control check validated the product, the data was not managed appropriately, leading to disputable results and a product that cannot move forward. Aggregating instrument use and all activities, monitoring automated SOPs, and using electronic laboratory notebooks (ELNs) can significantly reduce the time spent on data management while improving accountability and accuracy.
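As a rough illustration of keeping that chain intact, the following sketch records a QC result directly against specification limits and into an append-only audit log, so an out-of-specification value is flagged at the point of entry rather than on an uncontrolled worksheet. The specification limits, function name and record structure are assumptions for the example, not a real LIMS API.

```python
from datetime import datetime, timezone

# Hypothetical specification limits for a quality-control attribute.
SPEC_LIMITS = {"assay_pct": (98.0, 102.0)}

audit_trail = []  # stands in for a controlled, append-only LIMS audit log

def record_qc_result(sample_id, attribute, value, analyst_id):
    """Record a QC result and flag it as in or out of specification."""
    low, high = SPEC_LIMITS[attribute]
    status = "PASS" if low <= value <= high else "OUT_OF_SPECIFICATION"
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "sample_id": sample_id,
        "attribute": attribute,
        "value": value,
        "status": status,
        "analyst_id": analyst_id,
    }
    audit_trail.append(entry)  # nothing is written outside the audit trail
    return status

# An out-of-specification value cannot quietly pass on a separate worksheet:
print(record_qc_result("BATCH-042-S1", "assay_pct", 96.4, "jdoe"))  # OUT_OF_SPECIFICATION
```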
To err is to be human
Human error is inevitable in any setting, whether in the lab or in the office. Companies can introduce automated systems and regulated processes to minimise the effect of errors on the bottom line. In the pharmaceutical industry, however, human error and variability can be detrimental: mistakes that create downstream inconsistencies can ultimately result in an ineffective or unsafe product.
Because of the extreme pressures on labs handling drug discovery, development and production, automation becomes even more integral to each process. Adopting electronic SOPs and ELNs, and integrating these with a scientific data management system (SDMS), is the first step in increasing accountability and minimising errors. Electronic processes such as these create automatic checkpoints within each phase of a protocol or longer-term procedure, so that any inconsistency the system observes can be caught and corrected before moving on to the next step.
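A minimal sketch of that checkpoint idea is shown below: an electronic SOP is modelled as an ordered list of steps, each paired with a check that must pass before the procedure continues. The step names, tolerance and functions are invented for illustration.

```python
# Hypothetical electronic SOP: each step carries a checkpoint that must pass
# before the next step can begin.

def check_balance_calibrated(ctx):
    return ctx.get("balance_calibrated", False)

def check_weight_in_range(ctx):
    # Illustrative tolerance: 100 mg +/- 0.5 mg
    return 99.5 <= ctx.get("sample_weight_mg", 0.0) <= 100.5

SOP_STEPS = [
    ("Verify balance calibration", check_balance_calibrated),
    ("Weigh 100 mg of sample",     check_weight_in_range),
]

def run_sop(context):
    """Walk the SOP in order; stop at the first failed checkpoint."""
    for name, checkpoint in SOP_STEPS:
        if not checkpoint(context):
            return f"HALTED at '{name}': inconsistency must be corrected"
    return "SOP completed: all checkpoints passed"

print(run_sop({"balance_calibrated": True, "sample_weight_mg": 100.2}))
print(run_sop({"balance_calibrated": False}))
```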
This is extremely valuable for manual procedures, including working with instrumentation, running through a QC protocol, or measuring values. For example, users not adhering to SOPs for analytical techniques such as GC-MS and LC-MS used in quality control can compromise the data integrity chain. Manual integration of chromatogram peaks can vary between analysts, whereas the controls built into modern chromatography data system (CDS) platforms deliver reproducible results using sophisticated identification and integration algorithms. An integrated laboratory information management system (LIMS) can incorporate an automated CDS to enter, store, track and trace data for every step, as well as alert the user if a method strays from the prescribed approach.
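The sketch below illustrates the kind of method-adherence check such a system might perform, comparing the parameters actually used in a run against the prescribed method and raising an alert on any deviation. The parameter names, tolerances and values are hypothetical and do not correspond to a specific CDS or LIMS.

```python
# Hypothetical prescribed LC-MS method parameters held in the LIMS.
PRESCRIBED_METHOD = {
    "column_temp_c": 40.0,
    "flow_rate_ml_min": 0.3,
    "injection_volume_ul": 5.0,
}

# Illustrative tolerances around each prescribed value.
TOLERANCES = {
    "column_temp_c": 1.0,
    "flow_rate_ml_min": 0.02,
    "injection_volume_ul": 0.1,
}

def check_method_adherence(acquired_params):
    """Return the parameters that stray from the prescribed method."""
    deviations = []
    for key, prescribed in PRESCRIBED_METHOD.items():
        actual = acquired_params.get(key)
        if actual is None or abs(actual - prescribed) > TOLERANCES[key]:
            deviations.append((key, prescribed, actual))
    return deviations

run_params = {"column_temp_c": 40.2, "flow_rate_ml_min": 0.35, "injection_volume_ul": 5.0}
for name, expected, actual in check_method_adherence(run_params):
    print(f"ALERT: {name} = {actual}, prescribed {expected}")
```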
The success of a connected lab
By adopting integrated data systems, from LIMS and CDS to ELNs and SDMS, pharmaceutical companies can immediately address consistency issues, clearly define protocols and accountability, and centralise data collection and maintenance. These systems enhance data quality throughout the development process, supporting good laboratory practice (GLP) and good manufacturing practice (GMP).
Integrated informatics synchronises and centralises activities for all members of a lab network. These platforms can increase operational efficiency, effectiveness and flexibility, improve brand integrity, and reduce both costs and issues. By easing data retrieval for further analysis and review, adverse events can be flagged before they affect the development pipeline, and corrective actions and quick decisions can be made before an error amplifies or even occurs.
These benefits result in higher accuracy, faster processing and better visibility. A LIMS can securely archive instrument readings, method parameters and test data associated with a particular sample, in addition to any user interactions made with the software. As a result, every activity can be electronically documented, along with the identity of the individual who performed it, ensuring complete transparency throughout an entire workflow.
Comprehensive audit trails and fully searchable workflows reduce the need for manual data recording and the risk of inaccurate entries. A search for activities with too few steps, or for workflows that have been interrupted or aborted, can quickly surface records that supervisors can use to investigate nonconforming procedures.
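A simplified sketch of such a search is shown below: audit-trail entries, each attributed to a user, are filtered for workflows with fewer steps than the prescribed procedure or with an interrupted or aborted status. The entry fields and expected step count are assumptions made for the example.

```python
# Hypothetical audit-trail entries as a LIMS might store them: every activity
# is attributed to a user and carries the workflow it belongs to.
audit_trail = [
    {"workflow_id": "WF-101", "user": "jdoe",   "steps_logged": 8, "status": "completed"},
    {"workflow_id": "WF-102", "user": "asmith", "steps_logged": 3, "status": "aborted"},
    {"workflow_id": "WF-103", "user": "jdoe",   "steps_logged": 5, "status": "completed"},
]

EXPECTED_STEPS = 8  # number of steps the prescribed workflow should contain

def flag_for_review(entries):
    """Return workflows with too few steps, or that were interrupted or aborted."""
    return [
        e for e in entries
        if e["steps_logged"] < EXPECTED_STEPS or e["status"] != "completed"
    ]

for entry in flag_for_review(audit_trail):
    print(f"Review {entry['workflow_id']} (user {entry['user']}): "
          f"{entry['steps_logged']} steps, status {entry['status']}")
```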
Further analysis of integrated LIMS data can also assist in forecasting future demands and planning. Examining useable data allows managers to audit current protocols or update regulations to ensure compliance with upcoming changes.
Managing control
Implementing and integrating process workflows keeps a lab running at its highest efficiency, and in turn eases reviews, maintains audit trails and allows better control of experiments. Process traceability makes data retrieval simple: when a process is reviewed for compliance and quality, the relevant information can be searched for and easily recalled. Not only does this safeguard robust protocols and adherence to regulatory standards, it also immediately calls out any inconsistency or user error. By integrating all systems into one centralised LIMS, reviews become a straightforward exercise that demonstrates quality, rather than one that alerts regulators to inaccurate results and indefensible products.
Integrated data systems are key to ensuring superior data quality and adherence to the ALCOA principles. An integrated LIMS works within the pharmaceutical data chain to streamline and automate procedures and improve governance of manual input. Connected systems provide a level of traceability across the product workflow that instils end-user trust in product safety. As regulatory authorities continue to raise expectations for data integrity, pharmaceutical manufacturers are increasingly looking to go beyond basic data management by using integrated informatics to track entire processes from discovery to market.