Biologics is the fastest-growing sector in pharmaceutical therapeutics, and before regulatory bodies will approve a drug, scientists must demonstrate batch-to-batch consistency and prove the efficacy of each bioprocess step through in-depth analytical testing and characterization.
Throughout the process development lifecycle, samples of biological material require testing and analysis across a wide range of method types, each with specific instrumentation. While platform analytical methods for monoclonal antibodies are well documented and follow a prescribed path for qualification and validation, analytical method development teams are often required to develop and qualify bespoke methods for testing alternative biological entities. From supporting cell line development teams with titer assays that measure protein expression, to evaluating product charge and glycosylation in upstream samples, to assessing product- and process-related impurities such as host cell proteins in downstream processing, analytical scientists are relied upon to assess drug product quality and measure key attributes. They must then feed this information back to the relevant process teams so those teams can improve their processes. Given the huge volumes of data generated across analytical testing, traceability, accuracy and context are vital for usability.
Alongside the demands for speed in drug design and development, analytical and QC teams are feeling the pressure. When paper and Microsoft Excel files are used to share data, not all experimental information is recorded contemporaneously, which can delay the identification of problems; late flagging of issues can lead to serious GMP consequences in the form of Deviations and CAPAs (Corrective And Preventive Actions). Given that developing a new therapeutic involves numerous analytical tests, such delays can significantly reduce agility and capacity in the lab.
As pressure mounts to produce more biotherapeutics faster and at lower cost, so does the strain on analytical teams running tests day to day. Modern informatics software, such as an integrated research execution platform, enables organizations to meet these challenges head-on.
Focus on effective data capture to improve data integrity and productivity
Traceability and repeatability are essential to product safety when labs put hundreds of samples through analytical assessment every day. Each sample generates numerous data points across different analytical procedures, such as ELISA, HPLC and gel-based assays, where samples may be tested as replicates under different conditions. All of this must be recorded accurately, and quickly, to ensure adherence to compliance procedures and to keep experiments "right-first-time".
Paper-based lab notebooks are not synonymous with productivity and speed. Recording information on paper takes time, and the resulting data cannot be transferred or shared easily and is vulnerable to loss and damage. When compiling data for reports or performing troubleshooting exercises without access to all the relevant metadata, process data and results data, experiments may have to be repeated.
Repeating tests accumulates costs, which include the auxiliary expenses associated with testing such as purchased buffers, columns and plates. Now imagine repeating these experiments for any of a multitude of reasons: performance issues with the experiment, inaccurate results, deviations, incomplete data, expired reagents and so on. Not only do the reagents and time add to the costs, but repetition also delays other projects and therefore reduces the capacity of the laboratory overall.
An integrated, electronic system can help mitigate these issues, while also catching any deviations from prescribed methods or SOPs (Standard Operating Procedures) – a vital capability when researchers test and process hundreds of samples a day. While in practice identifying a problem may be simple, flagging issues or deviations at the point of execution can prevent a chain of downstream actions and save organizations both time and money.
Data should be securely captured and converted into information with context, so analytical scientists can interpret, analyze and make decisions from it. Moreover, these results need to be shared with the relevant process development teams, whether internal or external, who need to see the link between the sample results and the parent material from which the samples were taken.