
Whitepaper: Delivering quality results faster in a pharmaceutical platform

How an Integrated Development Platform can meet the needs of both analytical development AND QC teams

Analytical Development and QC teams are continually challenged to deliver results faster. In the absence of a modern informatics infrastructure, however, analytical teams can only manage their workload reactively and deal with problems retrospectively, which reduces agility and capacity. Any initiative to address these problems by ‘going electronic’ needs to go further than simply replacing paper notebooks.

In the whitepaper ‘The Integrated Development Platform – A Modern Approach to Transform the Commercialization of Innovative Therapeutics’ we describe how deploying a common data management environment across the Development organization can expedite the development and transfer of robust processes, reduce costs, improve quality and enhance corporate image. In addition to exploring the benefits Analytical teams can expect to achieve by taking a platform approach, this whitepaper also explains how an Integrated Development Platform can meet the needs of both analytical development and QC.


Shorter Turnaround Time

The development of a novel therapeutic requires tens of discrete and diverse analytical tests, both during processing and on the final material. Making effective development decisions relies on receiving timely results, but this is severely constrained when the communication between process and analytical testing teams is based on exchanging files via email.

A modern Integrated Development Platform will include built-in requesting capabilities, which give laboratory managers early insight into the workload coming their way so they can make more informed scheduling decisions. The result is better utilization of resources: not just people, but also less waste of costly materials and less idle time for expensive instruments.

Analysts also receive far richer information about the samples they receive than is possible using typical requesting systems.

Knowing how the material the sample was taken from was made, and understanding the purpose of the process scientist’s experiment, the analyst can quickly determine the appropriate test strategy. For example, they can check how the material was processed prior to receipt, find out which buffer or solvent was used, and adapt their assay procedure accordingly, such as by selecting the most appropriate marker or standard.

An understanding of the flow of people and materials within a laboratory can help optimize processes, particularly when workstations are implemented. It is possible to optimize the flow of data too. Deploying a data management platform in analytical development and QC laboratories can simplify the way people record and use experimental data, particularly when coupled with the implementation of the most suitable data entry device for a given procedure. For example, for some workstations a fixed touchscreen can be both simpler and safer to use than a portable tablet computer.

When moving to an electronic data capture system, there is a tendency to simply replicate paper processes electronically. However, as paper forms are one-dimensional, the way a procedure is laid out in a paper-based form often doesn’t reflect the order in which analysts fill it in. Once a procedure can be executed electronically, the information captured in an audit trail provides insight into how analysts interact with a procedure – a valuable resource for identifying ways in which the procedural flow can be improved or simplified.

By integrating inventory and barcoding with experimental execution, error-prone and unnecessary manual data entry can be reduced – an analyst will be able to scan a reagent and automatically populate details such as batch number and expiry date within the electronic execution record. In addition to eliminating transcription errors, electronic experimental execution also enables visual alerting of potential issues, such as a protocol deviation or the attempted use of an out of date reagent.
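To make this mechanism concrete, here is a minimal sketch, assuming a simple in-memory inventory keyed by barcode; the function, field and barcode names are illustrative and not taken from any specific product.

```python
from datetime import date

# Hypothetical inventory lookup keyed by barcode; a real deployment would query
# the platform's inventory module rather than a local dictionary.
REAGENT_INVENTORY = {
    "RGT-000123": {"name": "Acetonitrile", "batch": "B2211", "expiry": date(2026, 6, 30)},
}

def record_reagent_use(barcode, execution_record, today=None):
    """Populate the execution record from a scanned barcode and flag expired reagents."""
    today = today or date.today()
    reagent = REAGENT_INVENTORY.get(barcode)
    if reagent is None:
        raise ValueError(f"Unknown barcode {barcode}: reagent not registered in inventory")

    # Auto-populate the details an analyst would otherwise transcribe by hand.
    execution_record.setdefault("reagents", []).append(
        {"barcode": barcode, "name": reagent["name"],
         "batch": reagent["batch"], "expiry": reagent["expiry"].isoformat()}
    )

    # Visual alert: an out-of-date reagent is recorded as a deviation instead of passing silently.
    if reagent["expiry"] < today:
        execution_record.setdefault("deviations", []).append(
            f"Reagent {reagent['name']} (batch {reagent['batch']}) expired on {reagent['expiry'].isoformat()}"
        )

record = {}
record_reagent_use("RGT-000123", record)
```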

Integration with analytical instruments delivers similar benefits. For example, the ability to set up a worklist within the experimental record, submit it to a Chromatography Data System (CDS) and subsequently retrieve the results directly into the experiment streamlines the flow of data and eliminates manual data transfer steps that add no value and introduce errors.
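That flow could look something like the sketch below; the Worklist structure and the submit/fetch functions are placeholders for whatever the CDS or instrument integration layer actually exposes, not a real vendor API.

```python
from dataclasses import dataclass, field
from typing import List, Dict

@dataclass
class WorklistEntry:
    sample_id: str
    method: str
    injection_volume_ul: float

@dataclass
class Worklist:
    instrument: str
    entries: List[WorklistEntry] = field(default_factory=list)

def submit_worklist(worklist: Worklist) -> str:
    """Stand-in for submitting the worklist to the CDS; returns a run identifier."""
    # A real integration would call the CDS vendor's interface or an
    # instrument integration layer here; we simply simulate a run ID.
    return f"RUN-{worklist.instrument}-{len(worklist.entries):03d}"

def fetch_results(run_id: str) -> Dict[str, float]:
    """Stand-in for retrieving processed results (e.g. peak areas) back into the experiment."""
    return {"main_peak_area": 1532.4}   # placeholder value

# Build the worklist inside the experimental record, submit it, then pull results back.
wl = Worklist(instrument="HPLC-01",
              entries=[WorklistEntry("S-001", "Assay-RP-01", 10.0)])
run_id = submit_worklist(wl)
results = fetch_results(run_id)
```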

However, connecting directly to every piece of analytical equipment will result in unnecessary complexity and introduce multiple points of failure. A one-size-fits-all approach to interfacing with equipment is rarely the best option – for each instrument, consider whether it is appropriate to connect directly, via integration with a CDS, through a dedicated instrument integration layer, or not at all. Integration with some instruments will add little or no improvement in quality or efficiency. It is often cost-effective to take a holistic view of technological change in the laboratory environment and upgrade particularly old instruments to simplify connectivity.

Fewer Repeats

Unnecessarily repeating work is an expensive business. Take an HPLC experiment, for instance – the two most evident costs are those of reagents and time. The reagent costs can exceed $150 per run, especially when using a more expensive solvent such as acetonitrile. Add to that personnel time – not simply the time spent re-executing the method, but also the time spent investigating the problem and writing up and reviewing the experiment.

An often-neglected cost of an experiment is that of the instrument itself: the cost of capital equipment such as an analytical instrument is spread over its useful life. The contribution to the cost of an individual experiment is based on the expected number of runs in its lifetime and can equate to hundreds of dollars per run.

Finally, there’s the opportunity cost – spending time repeating work means that other work has to be rescheduled and overall capacity is reduced. All in all, a single repeated HPLC experiment can cost the business between $1,500 and $2,500. Consider the range of different assays and the current frequency of repeats in your own laboratory and it’s evident that reducing the number of unnecessarily repeated experiments can save tens of thousands of dollars each year.
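To make the arithmetic behind that range concrete, here is a minimal worked example. The reagent (over $150 per run) and instrument (hundreds of dollars per run) figures come from the text above; the labour hours, hourly rate and opportunity cost are illustrative assumptions, not quoted figures.

```python
# Illustrative cost model for a single repeated HPLC experiment.
reagents = 150            # solvent and consumables per run (from the text)
instrument_per_run = 200  # capital cost spread over expected lifetime runs (from the text: hundreds of dollars)
labour_hours = 6          # re-execution + investigation + write-up + review (assumed)
labour_rate = 100         # fully loaded cost per analyst hour (assumed)
opportunity_cost = 750    # rescheduled work and reduced capacity (assumed)

total = reagents + instrument_per_run + labour_hours * labour_rate + opportunity_cost
print(f"Estimated cost of one repeated HPLC experiment: ${total:,}")   # ~$1,700, within the $1,500-$2,500 range
```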

The two most common reasons for needing to repeat an experiment are missing information and deviations during execution, such as using out-of-date reagents or equipment that is out of calibration. When filling in paper forms, it is possible for analysts to omit information by mistake, resulting in an incomplete experimental record. A reliance on paper or disconnected file-based systems to store experimental data not only means it takes time to find vital information, it often results in lost or inaccessible data.

From a GLP perspective, if you don’t have proof that an experiment was conducted, it’s effectively the same as never having done it in the first place.

Paper forms don’t prevent errors or enforce the order of steps in an experimental workflow. Analysts can record values without noticing they are out of range, and this may not always be spotted during review. In fact, experiments that shouldn’t pass acceptance criteria sometimes do and, as a consequence, critical issues might only be spotted at some point further down the line.


A single deviation that is not spotted in a timely manner can have significant cost and time implications. For example, if a sample from the first unit operation to purify a drug substance erroneously passes an offline analytical test, the material being processed can end up undergoing additional unit operations before the problem is detected.

Thus, the cost to the organization is not just that of repeating a single test but also the cost of repeating the entire process (including equipment, consumables and labour costs) and the cost of repeating all the associated analytical tests. During scale-up, a single error in offline testing can cost the business in excess of $100,000.

What can be done about this? Replacing paper forms with a robust Integrated Development Platform ensures information is centralized and accessible, eliminating the problem of lost information. As we will discuss in the following pages, an Integrated Development Platform supports a preventative approach to compliance and quality by ensuring that all critical data is captured at the point of conducting the experiment and highlighting out of range values as soon as they are entered, enabling issues to be addressed in real time.

Consistency, Compliance and Higher Quality

As analytical teams grow, there tends to be a corresponding decrease in consistency between different groups and sites. Different sites can use completely different procedures, and analysts can run the same procedure in startlingly different ways. The transition from paper and file-based data silos is an opportunity to review the different ways of working across analytical teams and to identify and address inconsistencies and ‘grey areas’. Similarly, leveraging emerging analytical standards and terminologies, such as those developed by the Allotrope Foundation and similar industry initiatives, will reduce ambiguity and ensure harmonization in the way experimental procedures and results are documented.

Where experimental data is recorded on paper, the QA process for checking and approving analytical results involves a reviewer manually checking each and every aspect of an experimental write-up to ensure there are no issues and documenting any deviations. This laborious task adds 20-30% to the overall time required to deliver results.

Most compliance issues are the result of transcription errors, calculation errors, using an outdated protocol, or missing or incorrectly formatted data, such as using the wrong date format. By working in collaboration with QA teams, it is possible to pinpoint the errors that take up a disproportionate amount of review time. Once such issues have been identified and prioritized, an Integrated Development Platform can be easily configured to either prevent them (such as ensuring the current version of an SOP is used) or, in the small number of cases where prevention is not possible, to flag the issue to the analyst or reviewer.
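As an illustration of such configurable checks, here is a minimal sketch: one rule prevents execution against an outdated SOP version, another flags an unexpected date format for the analyst or reviewer. The registry, field names and rule logic are hypothetical.

```python
import re

# Assumed registry of currently effective SOP versions.
CURRENT_SOP_VERSIONS = {"SOP-HPLC-ASSAY": "4.2"}
ISO_DATE = re.compile(r"^\d{4}-\d{2}-\d{2}$")

def check_record(record):
    """Raise on issues that should be prevented outright; return flags for the reviewer otherwise."""
    flags = []

    # Prevent: executing against an outdated SOP version.
    sop, version = record["sop_id"], record["sop_version"]
    if CURRENT_SOP_VERSIONS.get(sop) != version:
        raise ValueError(f"{sop} v{version} is not the current effective version")

    # Flag: dates not in the expected ISO format are surfaced rather than blocked.
    for key, value in record.items():
        if key.endswith("_date") and not ISO_DATE.match(str(value)):
            flags.append(f"{key} '{value}' is not in YYYY-MM-DD format")

    return flags

print(check_record({"sop_id": "SOP-HPLC-ASSAY", "sop_version": "4.2", "analysis_date": "12/03/2024"}))
```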

This engenders more care and attention during experimental execution and increases the quality of both the work and the resulting data.

The rich, linked information managed by an Integrated Development Platform supports the identification, validation and monitoring of the critical quality attributes (CQAs) of a product. By incorporating CQAs and tolerance levels within electronic experimental procedures, any issues with the quality of material being tested can be identified and addressed early. As will be described later, done correctly, this does not need to reduce the flexibility required by Analytical Development.

A comprehensive audit trail is a must, but when this is coupled with an automatically generated summary of all the deviations within an experimental record, QA can implement an ‘audit by exception’ approach. This has been demonstrated to accelerate the review process and free up QA resources to focus on critical issues. Through an iterative process, quality can be built into every procedure and a majority of common issues can be eliminated, resulting in ‘right first time’ experiment execution and improved quality management.

Faster Investigations and Reduced Downtime

Investigations into quality issues or defects can have far-reaching consequences. Not only do they cause significant delays to the development and manufacturing process, they reduce the capacity of the organization to work on other projects. Without access to comprehensive information, problems can only be dealt with after the fact. An Integrated Development Platform enables investigations to be conducted more efficiently and helps identify issues before they become costly problems.

When registration and storage information is integrated with testing data, it is possible to manage and track samples and materials throughout their lifecycle – from preparation through to testing and disposal. Test results are automatically associated with a specific batch, process step and/or ingredient, which also ensures traceability of information. By using a common platform, process and analytical scientists can collaborate more easily, for example when troubleshooting the reason for a decrease in yield.

Access to end-to-end information about a product and its associated process promotes organizational learning.

For example, it enables analytical scientists to more readily identify the source of impurities (such as unexpected peaks in chromatograms) because they can rapidly find out whether a new reagent supplier was used during processing. In fact, an Integrated Development Platform will become a growing knowledge base that enables tracking of impurities and supports impurity lifecycle management.

If assay results are not managed centrally, it is difficult to assess performance across tests. As a consequence, performance assessments, such as reviewing variations in inter-assay control results, are only conducted every 2-3 weeks, or often even less frequently. An Integrated Development Platform makes it possible to perform on-demand performance trending for a specific test, identify if it is starting to go out of specification and investigate the cause before it becomes an issue.
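What on-demand trending might look like in practice is sketched below: recent inter-assay control results are compared against limits derived from the historical baseline, and a consistent drift to one side of the mean is flagged before a specification is breached. The limits and data are purely illustrative.

```python
from statistics import mean, stdev

def trend_check(control_results, recent_n=5):
    """Compare the most recent control results against limits derived from the historical baseline."""
    history, recent = control_results[:-recent_n], control_results[-recent_n:]
    centre, spread = mean(history), stdev(history)
    lower, upper = centre - 2 * spread, centre + 2 * spread

    return {
        # Individual recent results outside the +/- 2 SD band.
        "out_of_limits": [x for x in recent if not (lower <= x <= upper)],
        # All recent results on the same side of the historical mean suggests a drift.
        "drifting": all(x > centre for x in recent) or all(x < centre for x in recent),
    }

# The last few control results sit consistently above the historical mean.
print(trend_check([10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 10.3, 10.4, 10.4, 10.5, 10.6]))
```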

When equipment calibration and use are captured directly within experimental records, the need for separate logbooks or spreadsheets is eliminated. This avoids duplication of information and makes it easy to check whether equipment is performing within predefined limits and to identify opportunities for preventative maintenance. Since reactive (or ‘breakdown’) maintenance is typically 3-5 times more expensive than preventative maintenance, the availability of on-demand equipment performance data helps avoid costly downtime.

Similarly, problems with consumables can be identified faster, and the inherent traceability of information makes it possible to identify potential problems, such as an issue related to a change of supplier, that may affect other ongoing experiments and to proactively correct or terminate them, potentially saving tens of thousands of dollars in re-work and development delays.

End-to-end Analytical Testing

On the surface, the needs of Analytical Development and Quality Control teams can be perceived as sufficiently different to necessitate using different systems to manage data and execute experiments. The need for stringency in QC can seem at odds with the exploratory, dynamic nature of Analytical Development. But when Analytical Development and QC use different systems, the process of transferring and validating new methods can be onerous and becomes a bottleneck in the overall development process.

Is it possible to balance GLP and non-GLP requirements in a single system? The characteristics of result data that are valued by QC – accuracy, consistency and reliability – are also essential for Analytical Development to be able to learn from their experiences and develop new methods more efficiently. The main issue with both teams using the same system is that providing the flexibility required by Analytical Development can result in complexity, which introduces validation challenges.

Modern data management platforms can provide automatic GLP-strength auditing in the background, without constraining Analytical Development. This results in end-to-end traceability, which will simplify regulatory inspections.

Procedures created by Analytical Development can be designed and documented with the future need for GLP validation in mind from the outset – the key to success is simplicity. Developing a single ‘monolithic’ electronic procedure for a given method results in a complex network of interlinked verification checks, which can introduce unnecessary points of failure and make updates and validation challenging. A more effective strategy is to take a ‘building block’ approach and create methods from pre-existing method sections or ‘blocks’, each of which performs a specific task, such as equipment calibration or buffer preparation, coupled with integrated applications that deliver critical cross-functional capabilities, such as inventory and request management. Employing standard inputs and outputs ensures connectivity between individual ‘blocks’ and with analytical instruments. This results in methods that are flexible enough to be used in Analytical Development yet can be ‘locked down’ and validated more easily for use by QC.
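A minimal sketch of the ‘building block’ idea, under the assumption that each block performs one task and exchanges data through a standard input/output shape, so blocks can be recombined into new methods and individually locked down for validation. All names are illustrative.

```python
from typing import Callable, Dict, List, Optional

# Each block takes and returns a plain dict with agreed keys, so any block can feed
# any other block (or an instrument interface) without bespoke glue code.
Block = Callable[[Dict], Dict]

def calibrate_balance(ctx: Dict) -> Dict:
    ctx["balance_calibrated"] = True               # in practice: read status from the equipment record
    return ctx

def prepare_buffer(ctx: Dict) -> Dict:
    ctx["buffer"] = {"name": "PBS", "ph": 7.4}     # illustrative preparation step
    return ctx

def run_assay(ctx: Dict) -> Dict:
    assert ctx.get("balance_calibrated"), "calibration block must run first"
    ctx["result"] = {"purity_pct": 98.7}           # placeholder for instrument-derived results
    return ctx

def execute_method(blocks: List[Block], ctx: Optional[Dict] = None) -> Dict:
    """Run the blocks in order, passing the shared context from one to the next."""
    ctx = ctx or {}
    for block in blocks:
        ctx = block(ctx)
    return ctx

# Analytical Development can assemble blocks freely; QC executes a locked-down, validated sequence.
qc_method = [calibrate_balance, prepare_buffer, run_assay]
print(execute_method(qc_method))
```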

Summary

An Integrated Development Platform will expedite analytical testing by streamlining workflows, reducing repeats and simplifying investigations. By using a common platform across Analytical Development and QC, quality can be built in from the outset. The need to translate a method from one system to another is eliminated, and QC can start using a new, high-quality method sooner.

IDBS understands the challenges faced by biopharmaceutical organizations. Our Integrated Development Platform is the culmination of almost 30 years of experience and knowledge of capturing and managing biopharmaceutical data. Global pharmaceutical companies, emerging biotechs and leading CDMOs have partnered with IDBS and have leveraged our unparalleled know-how to transform their Development operations.

If you need to deliver quality results faster, speak to one of our experts today or email us at: info@idbs.com
www.idbs.com/contact
