Upstream Processing

An approach tailored to your upstream workflows

Upstream processing for biotherapeutics can involve different expression systems and culture processes, such as microbial fermentation and mammalian cell culture, for an increasingly wide variety of products such as recombinant proteins, vaccines, monoclonal antibodies, and viruses for cell and gene therapies.

The IDBS Bioprocess Solution aids in each of these processes. Our workflows systematically capture process parameters, equipment and material details, instrument data, and cell line information, while providing automatic calculations and visualization of results, including full sample tracking. Inventory and request management is an integral part of the solution, and our workflows provide both the flexibility needed for early development and the comprehensive documentation needed for later development. Upstream workflows include:

  • Cell expansion: provides the link to cell line development, including vial thawing and seed train expansion
  • Bioreactor preparation and production: encompasses batch, fed-batch, and perfusion modes of operation
  • Harvest: using centrifugation and/or filtration for clarification and product recovery
  • Cell lysate preparation: for intracellular products
  • Resuspension and wash: for insoluble (inclusion-body) products
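
To make the data-capture idea above concrete, here is a minimal, hypothetical sketch in Python of the kind of structured record such a workflow might produce; the class and field names are invented for illustration and are not the IDBS data model.

```python
# Hypothetical sketch only: the kind of structured record an upstream
# workflow might capture. Class and field names are illustrative, not
# the IDBS data model.
from dataclasses import dataclass, field
from datetime import date


@dataclass
class Material:
    name: str
    batch_number: str
    supplier: str
    expiry_date: date


@dataclass
class UnitOperation:
    step: str                                      # e.g. "Cell expansion", "Harvest"
    parameters: dict = field(default_factory=dict) # process parameters for the step
    materials: list = field(default_factory=list)  # Material records used
    samples: list = field(default_factory=list)    # sample IDs for tracking


@dataclass
class UpstreamBatch:
    batch_id: str
    cell_line: str
    operations: list = field(default_factory=list)

    def all_samples(self) -> list:
        """Full sample trace across every step of the batch."""
        return [s for op in self.operations for s in op.samples]


# Record a fed-batch production step against a batch
batch = UpstreamBatch(batch_id="UPB-001", cell_line="CHO-K1 clone 7")
batch.operations.append(UnitOperation(
    step="Bioreactor production (fed-batch)",
    parameters={"temperature_C": 36.8, "pH": 7.0, "DO_percent": 40.0},
    samples=["UPB-001-D03-S1", "UPB-001-D05-S1"],
))
print(batch.all_samples())
```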

Siloed data makes it difficult for researchers to understand the context of the experiment and make informed decisions accordingly. Accessible data provides an interconnected web of information that is context-rich and pivotal to enabling Quality by Design (QbD).

Integrated inventory enables visibility and control over samples, materials and equipment

Recording data electronically is highly valuable for reducing the simple mistakes and inaccuracies that come from accidental human error. In practice, automated processes include scanning materials with a barcode to enter required information such as batch number, supplier and expiry date. Such processes will also flag up any issues or inconsistencies with the product – say, if the item is out of date or simply incorrect – providing quality checking in real time.
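
As a rough illustration of that real-time check (the lookup record and field names below are hypothetical, not an IDBS API), a scanned material could be validated against the expected item and its expiry date before it is accepted:

```python
# Hypothetical sketch: validate a scanned material in real time.
# The record fields and rules are illustrative, not an IDBS API.
from datetime import date


def check_scanned_material(record: dict, expected_name: str, today: date) -> list:
    """Return a list of issues with the scanned item; an empty list means it passes."""
    issues = []
    if record["name"] != expected_name:
        issues.append(f"wrong item: expected {expected_name}, scanned {record['name']}")
    if record["expiry_date"] < today:
        issues.append(f"batch {record['batch_number']} expired on {record['expiry_date']}")
    return issues


# A record as it might come back from a barcode lookup
scanned = {
    "name": "Glucose feed",
    "batch_number": "GF-2291",
    "supplier": "Acme Reagents",
    "expiry_date": date(2024, 1, 31),
}

for issue in check_scanned_material(scanned, expected_name="Glucose feed", today=date(2024, 6, 1)):
    print("FLAG:", issue)
```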

If there is a problem with the materials or equipment, the experiment will often have to be repeated. On average, 10% to 20% of experimental work has to be run again due to issues with data integrity and accessibility. Re-running individual experiments or repeating tasks can have an enormous impact on both the day-to-day lives of scientists and the overall profitability of the business.

Take HPLC as an example – this common test is vital to analyzing samples in conjunction with upstream processes, but the reagents used are expensive at about $150 per run, and executing and interpreting the test takes time. On top of that, time is spent investigating any issues with the method execution, writing the report and reviewing the processes.

The goal is to catch these deviations early in the process of innovation and solve them in real time.

Having all relevant data in a single, centralized location also makes it easier to characterize complex interactions, such as the effect of storage times and conditions between steps in the process. In addition, it contributes to understanding how changes to upstream processes affect the purity and yield of the final product. Data is the key to identifying correlations between batches of the finished product and the parameters used in the process to develop it.
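
As a sketch of the kind of analysis this enables (the batch data and column names below are invented purely for illustration), correlating process parameters with final-product attributes becomes a few lines of work once the data sits in one place:

```python
# Illustrative only: correlate upstream process parameters with final-product
# quality attributes across batches. Values and column names are invented.
import pandas as pd

batches = pd.DataFrame({
    "hold_time_h":   [2, 8, 4, 12, 6, 10],
    "temperature_C": [36.5, 37.0, 36.8, 37.2, 36.9, 37.1],
    "titer_g_per_L": [3.1, 2.6, 3.0, 2.3, 2.8, 2.5],
    "purity_pct":    [98.2, 96.9, 97.8, 96.1, 97.4, 96.5],
})

# Pearson correlation of each process parameter with each quality attribute
print(batches.corr().loc[["hold_time_h", "temperature_C"],
                         ["titer_g_per_L", "purity_pct"]])
```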

An integrated development platform drives efficiency and quality

To tighten the timeframes involved, workflows need to be streamlined, samples and materials managed efficiently, and the reporting phase sped up, while still observing good practice guidelines (GxP) where appropriate. From templates that enforce business rules at data entry, to barcodes, queries and integrations that reduce transcription errors, using software to document experimental data encourages the highest level of accuracy.
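
A minimal sketch of how a template might enforce such business rules (the rule definitions and field names are hypothetical, not how IDBS templates are configured): required fields and acceptable ranges are declared once, and any entry that violates them is flagged.

```python
# Hypothetical sketch: enforce simple business rules on a data-entry record.
# Rule definitions and field names are illustrative only.
RULES = {
    "pH":            {"required": True, "min": 6.6, "max": 7.4},
    "temperature_C": {"required": True, "min": 30.0, "max": 38.0},
    "operator":      {"required": True},
}


def validate_entry(entry: dict) -> list:
    """Check an entry against the template rules; return any violations."""
    errors = []
    for name, rule in RULES.items():
        value = entry.get(name)
        if value is None:
            if rule.get("required"):
                errors.append(f"{name} is required")
            continue
        if "min" in rule and value < rule["min"]:
            errors.append(f"{name}={value} is below the allowed minimum {rule['min']}")
        if "max" in rule and value > rule["max"]:
            errors.append(f"{name}={value} is above the allowed maximum {rule['max']}")
    return errors


print(validate_entry({"pH": 7.9, "temperature_C": 36.8}))
# ['pH=7.9 is above the allowed maximum 7.4', 'operator is required']
```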

When it comes to efficiency, reducing reporting timelines and automating data entry through integrations and templates saves both time and money. The data documentation stage alone can take an average of 7 hours per scientist per week. With an integrated development platform, this arduous task can be cut down by at least 50%.

Similarly, an average scientist will spend at least one hour a week searching for information and experiments. Having all relevant information in a single place, including the cell line history, gives easy access to data that was previously difficult to find, reducing the time spent searching by 90%. Along with ensuring a high standard of traceability, this makes documenting and reporting effortless.

Moreover, streamlining process execution helps improve data integrity and ensures adherence to Standard Operating Procedures (SOPs) in the laboratory.

Discover more about The IDBS Bioprocess Solution and how it can help your lab workflows in upstream bioprocessing.

 
