Agile process development


How deploying a common data management environment can benefit process development teams



The development of novel therapeutic products involves a complex network of internal departments and external partners. Different groups across Process Development organizations typically use a variety of approaches to capture, manage, analyse and report data.

These can range from paper and Excel to in-house-developed electronic systems or commercial solutions. Regardless of their IT maturity, the lack of a consistent approach has a negative impact on the agility of the business, including:

  • Process scientists are unable to identify issues in a timely manner, resulting in a reactive approach to problems
  • Reports take time to compile and are out of date as soon as they are created
  • Critical decisions are made based on incomplete information

In the whitepaper ‘The Integrated Development Platform – A Modern Approach to Transform the Commercialization of Innovative Therapeutics’ we describe how deploying a common data management environment across the Development organization can expedite the development and transfer of robust processes, reduce costs, improve quality and enhance corporate image. This whitepaper describes the business benefits Process Development teams can expect to achieve by taking a platform approach.

Connect People and Processes

A lack of consistency in the capture, management and analysis of data is a barrier to both collaboration and understanding, and hampers the development of novel products and processes. By deploying a single data management and process execution platform across Process Development, communication between groups is improved and it becomes possible to trace information and material throughout the development lifecycle.

When development and testing teams use the same platform, the requesting process is streamlined – it is easy for process scientists to create, update and track testing requests. Because of the richer information afforded by an Integrated Development Platform, process scientists can see more than just a result – they can see context-rich information about how their material was tested. Process and analytical scientists can collaborate more easily, for example when troubleshooting to find the source of an impurity.

To deliver on its full potential, an effective data management platform should encompass inventory information as well as assay results. Incorporating inventory into the scientist’s workflow applications significantly reduces the amount of manual data entry – a scientist can scan the barcode on a consumable’s container and automatically populate details such as the material type, batch number and expiry date. In addition to eliminating transcription errors, an effective user interface will also provide visual alerting to potential issues, such as a protocol deviation or the attempted use of an out-of-date reagent. All these elements increase the quality of the work done and reduce the time spent re-capturing information that already exists.
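As a sketch of how a barcode scan might populate consumable details and flag an expired reagent, the snippet below uses a hypothetical in-memory inventory; the names (`Consumable`, `scan`, the barcode values) are illustrative and not taken from any specific product:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Consumable:
    material_type: str
    batch_number: str
    expiry_date: date

# Hypothetical inventory; a real platform would query its inventory service.
INVENTORY = {
    "BC-0001": Consumable("Buffer A", "LOT-42", date(2030, 1, 1)),
    "BC-0002": Consumable("Reagent X", "LOT-07", date(2020, 6, 30)),
}

def scan(barcode: str, today: date) -> Consumable:
    """Resolve a scanned barcode to its consumable record,
    raising an alert if the reagent is out of date."""
    item = INVENTORY[barcode]
    if item.expiry_date < today:
        raise ValueError(
            f"{item.material_type} ({item.batch_number}) expired {item.expiry_date}"
        )
    return item
```

A workflow application would call `scan` at the point of use, auto-filling the record on success and surfacing the alert visually on failure.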

The ability to exchange information with laboratory instruments at the point of process execution (e.g. submitting worklists and retrieving results from a Mass Spectrometer) also streamlines workflows and eliminates manual data transfer steps. But care must be taken to avoid an ‘integration spaghetti’ – connecting to each and every instrument results in an unnecessarily complex ecosystem that is difficult to maintain and introduces many potential points of failure.

Connecting via a dedicated instrument integration layer can address this issue. Similarly, care must be taken to identify which integrations actually add value and to avoid excessive effort in developing and maintaining interfaces to old instruments.
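The idea of a dedicated integration layer can be sketched as a single registry of instrument adapters behind one common contract, so workflow applications never talk to instruments directly. Everything here (`InstrumentDriver`, `MassSpecAdapter`, `IntegrationLayer`) is hypothetical, with vendor APIs stubbed out:

```python
from typing import Protocol

class InstrumentDriver(Protocol):
    """Common contract every instrument adapter implements."""
    def submit_worklist(self, samples: list[str]) -> None: ...
    def fetch_results(self) -> dict[str, float]: ...

class MassSpecAdapter:
    """Illustrative adapter; a real one would wrap the vendor's API or file drop."""
    def __init__(self) -> None:
        self._queue: list[str] = []

    def submit_worklist(self, samples: list[str]) -> None:
        self._queue.extend(samples)

    def fetch_results(self) -> dict[str, float]:
        return {sample: 0.0 for sample in self._queue}  # placeholder results

class IntegrationLayer:
    """Single registry the rest of the platform calls into,
    avoiding point-to-point 'integration spaghetti'."""
    def __init__(self) -> None:
        self._drivers: dict[str, InstrumentDriver] = {}

    def register(self, name: str, driver: InstrumentDriver) -> None:
        self._drivers[name] = driver

    def driver(self, name: str) -> InstrumentDriver:
        return self._drivers[name]
```

Adding a new instrument then means writing one adapter, rather than touching every application that needs its data.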

An Ergonomic Laboratory Environment

The layout of most laboratories is usually the incremental result of accommodating new instruments or more people over time, rather than something that has been achieved by systematic design. Deploying an Integrated Development Platform not only enhances the connectivity between Process Development teams and processes, it is also an opportunity to improve how people connect with, and utilize, laboratory systems and equipment on a day-to-day basis.

Implementing ergonomically designed workstations within a laboratory environment has been demonstrated to reduce the movement of people and materials.

This approach can be extended to streamline the way people capture and use data in the lab to reduce non-value added data-related activities.

Data capture should be as simple as possible, with an uncluttered interface that requires minimal user interaction and leverages alternatives to manual keyboard entry where possible.

To maximize the benefits of an ergonomic laboratory, the data entry mechanism should be tailored to the specific process. For example, fixed touchscreens are better suited to clean room environments than portable tablet computers. Use of such devices will also improve safety – paper records can become coated in chemicals, making it necessary to read them in a fume hood.

As technologies evolve, voice recognition and wearables will provide alternatives to typing, and laboratory work and process execution will become even more agile. When it’s easy to capture accurate information, the quality and richness of the data is increased.

Obtain Deep Product & Process Understanding

Searching for information can feel like a full-time job. Using disconnected point solutions which typically store information in a person- or experiment-centric manner compounds the problem that critical information frequently resides in people’s heads. Collating online, at-line and offline data across groups and siloed systems is a complex undertaking, often resulting in gaps in information. Maintaining data from the execution of a process and everything connected to it in one place overcomes these problems.

Centralizing testing data, process parameters and information about the usage of equipment and consumables results in a context-rich electronic record describing the development and optimization of a product and process.

This approach also results in standardization across development projects and is an opportunity to align nomenclature and terminology between groups and sites and reduce ambiguity – ‘Day 0’ of a process means the same thing to everyone.

Unlike static reports created in Microsoft Word or PowerPoint, data is stored in a format that enables it to be reused for different purposes, both for routine reports and to answer ad hoc questions.

The harmonization of data and the connectedness of information make it simple to compare, visualize and statistically analyze data across processes, unit operations and products. Since data is available in near real-time, it becomes possible to monitor whether a process is in control, identify issues before they become problems and course correct pre-emptively. Troubleshooting analyses can draw from information across the entire development process, thus expediting investigations and outcomes.
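A minimal example of such near real-time monitoring is a Shewhart-style check that flags any new observation falling outside the mean ± 3σ of a historical baseline. The function below is an illustrative sketch, not a prescription for a full control strategy:

```python
from statistics import mean, stdev

def out_of_control(history: list[float], latest: float, k: float = 3.0) -> bool:
    """Flag a new observation that falls outside mean ± k·sigma of the
    historical baseline (a simple individuals-chart rule)."""
    centre = mean(history)
    sigma = stdev(history)
    return abs(latest - centre) > k * sigma
```

With centralized, near real-time data, such a rule can run automatically as each result lands, alerting scientists before a drifting parameter becomes a batch failure.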

Making data accessible across the development process not only makes it simpler to apply existing knowledge, it also ensures full traceability of information, equipment and materials, enabling more effective utilization of resources and facilitating the process of batch release. It also fosters a ‘self-service’ approach: scientists and managers are able to find what they need, when they need it, avoiding laborious searches for critical data. They can track the status of a request or process and make decisions based on consistent, up-to-date information.

Making effective use of prior knowledge and the re-use of data and processes wherever and whenever possible are Lean Drug Development principles. With everything in one place, it’s possible to perform historical analyses, identify and measure key performance indicators (KPIs) and reveal opportunities for operational improvement.

Develop Robust Processes with Quality Built In

The biopharmaceutical industry is moving away from a ‘Quality by QC’/’Quality after Design’ approach and implementing a Quality by Design (QbD) philosophy. The International Council for Harmonisation’s (ICH) guidelines emphasize the need for “documented evidence that the process, operated within established parameters, can perform effectively and reproducibly to produce a drug substance or intermediate meeting its predetermined specifications and quality attributes.”

When Process Development data is stored in silos, demonstrating to regulatory authorities that a comprehensive exploration of the relationships between critical process parameters (CPPs) and critical quality attributes (CQAs) has been conducted is challenging.

As illustrated by Genentech’s first two QbD-based submissions, this can be the difference between rejection and approval.

Centralizing information across the development process facilitates the characterization of complex interactions, such as understanding the impact of changes to upstream processes on the purity of batches of finished product and the impact of hold times and storage conditions between processing steps. If clinical data is available, it becomes possible to identify correlations between the performance of a specific batch and the parameters used in its production. Hence, access to a comprehensive pool of connected, context-rich development data is a key enabler of QbD and pivotal to developing a risk-based quality control strategy and enabling a lifecycle approach to process validation.

Streamline the Transition From Development to Manufacturing

Is Pilot Plant the last stage of Development or the first stage of Manufacturing? Whatever your perspective, it is important to ensure a smooth flow of information from Process Development, through Pilot Plant and into Manufacturing. Selecting the right data management solution for Pilot Plant can avoid unnecessary bottlenecks.

In the quest to achieve organizational improvements through better data management, the transfer of information at the individual level should not be overlooked. For Pilot Plants wanting 24/7 continuity, deploying an Integrated Development Platform can ensure consistency from person to person, reducing the handover period between shifts.

There is a tendency to try to extend Manufacturing systems, such as a Manufacturing Execution System (MES), into Pilot Plant. However, such systems typically need to include process exception rules, which creates considerable effort for process engineers in Pilot Plant to keep track of product and process recipe changes, because every change is deemed an exception. The dynamic nature of Pilot Plant means its data and process management needs are closer to those of earlier Process Development stages. Similarly, Manufacturing Support groups also benefit from more flexibility than an MES can provide.

However, the need for flexibility must be balanced with the need for GMP validation. If not managed appropriately, flexibility can result in complexity, which introduces validation challenges. Hence, any workflows, protocols and templates deployed in an Integrated Development Platform should be designed and documented with the future need for GMP validation in mind from the outset, so that Development processes can be transferred to Pilot Plant. Applying the auditing requirements of GMP throughout the development process also ensures traceability, which helps facilitate regulatory inspections.

Conversely, the increased focus on precision medicine has implications for Manufacturing, with the ultimate goal of producing smaller batches, closer to the patient. Such ‘factories of the future’ will require greater agility than can be supported by a typical MES.

Regardless of the stage at which an MES is introduced, manually translating a new process for execution in Manufacturing is a laborious task. However, the ANSI/ISA-88 (or S88 for short) standard provides a means of harmonizing process descriptions, and the associated BatchML (Batch Markup Language) format facilitates the transfer of a process ‘recipe’ between electronic systems. Hence, by leveraging BatchML, an Integrated Development Platform can ensure the smooth and accurate transfer of a validated process to a Manufacturing environment.
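To illustrate the principle of a machine-readable recipe, the snippet below serializes a process description as a simplified S88-style XML document. The element names (`MasterRecipe`, `ProcessStep`, `Parameter`) are deliberately illustrative and do not reproduce the actual BatchML schema:

```python
import xml.etree.ElementTree as ET

def recipe_to_xml(product: str, steps: list[dict]) -> str:
    """Serialize a process recipe as a simplified S88-style XML document.
    Element names are illustrative, not the real BatchML vocabulary."""
    root = ET.Element("MasterRecipe", {"product": product})
    for i, step in enumerate(steps, start=1):
        step_el = ET.SubElement(
            root, "ProcessStep", {"sequence": str(i), "name": step["name"]}
        )
        for param, value in step["parameters"].items():
            ET.SubElement(step_el, "Parameter", {"name": param, "value": str(value)})
    return ET.tostring(root, encoding="unicode")
```

Because the receiving system parses the same structured document, the recipe arrives without the transcription step (and transcription errors) of a manual hand-off.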


An Integrated Development Platform for data management and process execution increases the connectivity between Process Development teams. It eliminates repetitive manual data tasks and reduces non-value-added activities, such as searching for and collating data for reports and investigations.

By centralizing context-rich information from across the organization, process scientists and managers can gain new insights and expedite the validation and transfer of robust, scalable processes to manufacturing.

IDBS understands the challenges faced by biopharmaceutical organizations. Our Integrated Development Platform is the culmination of almost 30 years of experience and knowledge of capturing and managing biopharmaceutical data.

Global pharmaceutical companies, emerging biotechs and leading CDMOs have partnered with IDBS and have leveraged our unparalleled know-how to transform their Development operations.

If you’d like to increase the agility of your process development organization, speak to one of our experts today or email us at:
