Data standards for the lab

IDBS Blog | 27th June 2024

Standardized and structured data equate to smarter decision-making


By Craig Williamson, Lead Platform Product Manager (Integrations Strategy), IDBS

A recent study by the Capgemini Research Institute shows that 75% of pharma organizations have taken steps to mature digitally by investing in digital transformation, which could greatly streamline their data management. Yet 90% still report facing data-related challenges. Poor data management, particularly poor data capture, can result in rework, repetition and delayed submissions – which can cost hundreds of thousands of dollars per day.

Better data management can be achieved via data standards and structured data. Before we go too deep into this discussion, let us define what we mean by these terms. Data standards govern how data is managed, structured, formatted, defined and exchanged. Structured data is data organized in a standardized manner, with an easily identifiable structure to make it simpler to analyze and interpret.
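To make the distinction concrete, here is a minimal sketch (not IDBS code; all names are illustrative) of what structured data capture looks like in practice: each measurement is recorded against a fixed, named schema rather than as free text, so downstream tools can find, validate and analyze it reliably.

```python
# Illustrative only: a structured record for one lab measurement.
# Field names, sample IDs and values are hypothetical examples.

from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class Measurement:
    sample_id: str    # attributable: which sample the value belongs to
    analyst: str      # attributable: who recorded it
    quantity: str     # what was measured, e.g. "concentration"
    value: float
    unit: str         # units captured explicitly, never implied
    recorded_at: str  # contemporaneous ISO 8601 timestamp

def capture(sample_id: str, analyst: str, quantity: str,
            value: float, unit: str) -> dict:
    """Return a structured, machine-readable record of one measurement."""
    rec = Measurement(sample_id, analyst, quantity, value, unit,
                      datetime.now(timezone.utc).isoformat())
    return asdict(rec)

record = capture("CL-0042", "jsmith", "concentration", 1.25, "mg/mL")
print(record["quantity"], record["value"], record["unit"])
```

Because every record carries the same named fields, the same query or report template works across every experiment – the property that unstructured notebook text lacks.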

Last May, the Food & Drug Administration (FDA) instituted new requirements for data standards that will apply to most study data submitted to the Center for Drug Evaluation and Research (CDER) and the Center for Biologics Evaluation and Research (CBER). Any electronic submission of study data that does not conform to required standards may be refused filing.

Dave Watrous, previously Strategy Director, BioPharma Lifecycle Management, and now VP of Sales, Customer Success and Marketing at IDBS, touts the benefits of data standards and structured data in a recent MedCity News article. His message is clear: An investment in better data management will result in faster and smarter decision-making – and achieve digital maturity – throughout the biopharma lifecycle.

Leverage data standards for improved information sharing

Data standards, as defined by the FDA, require rigorous adherence to underlying data capture principles to preserve the integrity of drug development data. CDER and CBER encourage sponsors and applicants to implement and use data standards as early as possible in the product development lifecycle, creating an expectation that these underlying standards are followed from the outset.

In fact, the FDA looks to the ALCOA+ (Attributable, Legible, Contemporaneous, Original, Accurate, Complete, Consistent, Enduring and Available) principles to define data integrity expectations when aligning experimental data to data standards. ALCOA+ helps integrate data between teams and partners, ensures high-quality data standards are upheld, and promotes the reuse of data throughout the biopharma lifecycle. Similarly, the F.A.I.R. (Findable, Accessible, Interoperable and Reusable) data principles call for systems to exchange data in a machine-readable format so that all four principles can be achieved.

Watrous warns that even with these foundations, shared data alone is not sufficient to unlock the opportunities that collaboration throughout the lifecycle can provide – especially if your data is maintained in disparate systems. It is not uncommon for scientists and process engineers to spend countless hours finding, reconciling and assembling data stored in myriad places.

Consider the real-world example of a biotech company struggling to generate cell line reports because of hybrid tracking systems consisting of paper, an electronic lab notebook and Excel spreadsheets. Given the hybrid paper/digital operating environment, it was estimated that more than 32,000 resource hours per year were spent manually connecting processes, data, materials, equipment and people. Ultimately, it took six to eight weeks to generate a final cell line development report.

Digital workflows enable process understanding

IDBS worked with the biotech to map out the cell line development process to determine how data was structured and how the cell culture material and data flowed. It was decided that implementing digital workflows at the point of experimental execution and data creation would embed the required data standards via structured data capture. A digital platform, such as IDBS Polar, provides the foundation for real process understanding by providing high-quality data capture and automatically mapping the complex relationships between material attributes, process parameters and product quality across unit operations.

After implementing IDBS’ digital workflows for structured data capture, the biotech calculated actual time savings of more than 14,000 resource hours per year. The time to generate a final cell line development report was calculated to drop to two to three weeks – a reduction of more than 50%.

The team reported additional benefits, including the ability to generate technical reports against searchable data, enabling scientists to search for cell culture data directly; reduced error resolution times because alerts were added to flag data errors and inconsistencies in the clone selection process, which called immediate attention to issues; and a 25% increase in lab capacity to support additional projects.

Semantic enrichment ensures interoperability

Watrous says that digital workflows also enable the alignment of scientific terms with standard terminologies and ontologies. To ensure data is interoperable, it should be accompanied by a semantic enrichment layer.

In the case of F.A.I.R., for example, an ontology management system (OMS) can be implemented to provide a data-centric method for aligning data to a set of common standards, resulting in improved interoperability. Time spent wrangling data to reconcile terms that simply label the same thing differently – for example, concentration vs conc. – is a key challenge in the industry. Deploying ontologies, public or proprietary, across the enterprise is key to data findability and interoperability. This is accomplished by capturing process and analytical data in full context from the beginning of the workflow. Combining an OMS with a digital platform, like IDBS Polar, synchronizes terminology and unlocks value from interoperable and standardized data to accelerate scientific discovery and development.
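The term-alignment step described above can be sketched in a few lines. This is a hypothetical illustration, not how any particular OMS works: a mapping from ad-hoc labels to canonical terms lets records captured with different vocabularies ("conc.", "Conc") be harmonized into one interoperable form.

```python
# Hypothetical vocabulary for illustration; a real ontology would be far
# larger and typically managed in a dedicated ontology management system.
CANONICAL_TERMS = {
    "conc.": "concentration",
    "conc": "concentration",
    "concentration": "concentration",
    "temp": "temperature",
    "temp.": "temperature",
    "temperature": "temperature",
}

def normalize_field(name: str) -> str:
    """Map a raw column label to its canonical term, if known."""
    key = name.strip().lower()
    return CANONICAL_TERMS.get(key, key)  # unknown terms pass through

def harmonize(record: dict) -> dict:
    """Rewrite all field names in a record to canonical terms."""
    return {normalize_field(k): v for k, v in record.items()}

print(harmonize({"Conc.": 1.25, "Temp": 37.0}))
# → {'concentration': 1.25, 'temperature': 37.0}
```

Once every team's data passes through the same normalization, a search for "concentration" finds all concentration values regardless of how each lab originally labeled the column – the findability and interoperability the F.A.I.R. principles call for.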

Proactive digital mapping eliminates bottlenecks

Advancing digital collaboration through data standards will also accelerate scientific discovery and development. In fact, BioPhorum’s Vision for Digital Maturity stresses that standardized data and data structures improve and advance digital collaboration between sponsors and contract organizations – collaborations that have often relied on bespoke point-to-point solutions.

BioPhorum states that achieving digital maturity means transferring data to partners via open, standard and structured platforms. Watrous advises realistically assessing every partner’s capability to develop a sound digital strategy. He says: “Sponsors, collaborators, contract organizations and IT service providers must all utilize standards-based digital engagement to realize the benefits that automated insights can unlock for patients and the industry.” He adds that structured data must be generated and communicated with all stakeholders.

However, BioPhorum points out that standards must accommodate future innovations in a biotech industry that is constantly innovating new therapies. The group maintains that “special structures” should be part of the standard to allow for innovation and new elements to be added to the standard in the future.

Taking a proactive approach to digital maturity to include data structure and data standards upfront will eliminate bottlenecks down the road. And while collaboration between sponsors and contract organizations has not yet reached full digital maturity, a collaborative approach early in the process will increase speed, reduce cost and ensure improvements in data quality.

 

About the author

Craig Williamson, Lead Platform Product Manager (Integrations Strategy), IDBS

As a Lead Platform Product Manager at IDBS, Craig is responsible for the platform’s data tier, termed the “Data Backbone”, which organizes platform data in compliance with the F.A.I.R. data principles so that all platform components can contribute data equitably. Craig is also responsible for IDBS’ out-of-box integrations to common informatics systems.

Craig is an organic chemist by training and has enjoyed a prior career in R&D, creating novel drug candidates and developing formulations to combat Alzheimer’s disease. 

 

References:

  1. Capgemini. (n.d.). Large pharma organizations to invest nearly 7% of revenue on building connected cutting-edge lab environments by 2025. Retrieved from https://www.capgemini.com/us-en/news/press-releases/large-pharma-organizations-to-invest-nearly-7-of-revenue-on-building-connected-cutting-edge-lab-environments-by-2025/
  2. IDBS. (2024, March). Digitalization best practice for biotech. Retrieved from https://www.idbs.com/2024/03/digitalization-best-practice-for-biotech/
  3. U.S. Food and Drug Administration. (n.d.). Study data submission to CDER and CBER. Retrieved from https://www.fda.gov/industry/study-data-standards-resources/study-data-submission-cder-and-cber
  4. MedCity News. (2023, June). Biopharma supply chain manufacturing. Retrieved from https://medcitynews.com/2023/06/biopharma-supply-chain-manufacturing/
  5. IDBS. (2022, December). Streamlined cell line development. Retrieved from https://www.idbs.com/2022/12/streamlined-cell-line-dev/
  6. BioPhorum. (n.d.). Vision for digital maturity in the integration between biomanufacturers and partner organizations. Retrieved from https://www.biophorum.com/download/vision-for-digital-maturity-in-the-integration-between-biomanufacturers-and-partner-organizations/