
IDBS Blog | 31st October 2023

Software validation keeps GxP systems current and speeds up time-to-market


By Fran Carmody, Services Enablement Manager, IDBS

Industry insiders warn that with more scrutiny on software processes, they expect the Food and Drug Administration (FDA) to issue more warning letters on product software validation1 – the process of ensuring that a software product meets a user’s demands and expectations. In recent years, global regulatory agencies, including the European Medicines Agency (EMA), have followed the FDA’s lead in citing deficiencies in computer and software system validation, particularly regarding GxP-regulated activities.2

As part of the 2002 final guidance on General Principles of Software Validation, the FDA requires that pharmaceutical companies demonstrate and document that their software accurately and consistently produces results that meet predetermined guidelines for compliance and quality management.3 Requirements to maintain a GxP environment and the need to understand how the system works mean that life sciences researchers must ensure that any updated software continues to meet the needs of all who use it and that it functions as designed.

But, as Jim Brooks, GxP Solution Owner at IDBS, writes in a recent Technology Networks article,4 validating new software, such as a laboratory information management system (LIMS), can take up to a year to complete, eating up resources that could be spent on development, manufacture, testing and delivery of much-needed medications. He is optimistic that, through the following best practices, the validation process and ultimate system implementation can be hastened with careful evaluation and continuous improvement.

A risk-based approach to software validation ensures more effective system monitoring

Brooks first recommends taking a risk-based approach toward validation. Last year, the International Society for Pharmaceutical Engineering (ISPE) published the second edition of GAMP 5, “A Risk-Based Approach to Compliant GxP Computerized Systems”.5 Its purpose is to ensure that validation activities align with the complexity of the system’s design and the risk associated with its intended use.

Traditional system validation relied on a document-centered process of checklists, templates and predefined procedures to test and document system components. Thus, validation efforts focused on creating documentation to demonstrate regulatory compliance rather than on ensuring the correct operation of the system’s high-risk functionality. This approach made the validation process more burdensome than necessary and missed the opportunity to gain knowledge about the system’s optimum use in the GxP environment.6

As a result, critical thinking was not always being applied during the risk assessment process, and the FDA recognized that further guidance was required to ensure risk-based approaches focused on intended use were being applied. So in 2018, the FDA published its Q&A guidance for Data Integrity and Compliance With Drug cGMP,7 which states that validation for consistent intended performance should include a risk-based approach. Here, critical thinking is used to prioritize the validation effort based on the level of risk associated with the computer system.6
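To make the idea concrete, the risk-based prioritization described above can be sketched in a few lines of code. This is a minimal illustration only, not an IDBS or GAMP 5 artifact: the function names, scoring factors and thresholds are hypothetical examples of the common impact × likelihood × detectability pattern.

```python
# Hypothetical sketch of risk-based validation prioritization.
# Scoring factors and thresholds are illustrative, not from GAMP 5.
from dataclasses import dataclass


@dataclass
class SystemFunction:
    name: str
    impact: int         # 1 (low) .. 3 (high): effect on patient safety / data integrity
    likelihood: int     # 1 .. 3: chance of failure (custom code > configured > standard)
    detectability: int  # 1 (easily detected) .. 3 (hard to detect)

    @property
    def risk_score(self) -> int:
        return self.impact * self.likelihood * self.detectability


def validation_rigor(fn: SystemFunction) -> str:
    """Map a risk score to an illustrative level of validation effort."""
    if fn.risk_score >= 18:
        return "full challenge testing"
    if fn.risk_score >= 8:
        return "targeted functional testing"
    return "leverage vendor testing / verify configuration"


functions = [
    SystemFunction("audit trail", impact=3, likelihood=2, detectability=3),
    SystemFunction("electronic signatures", impact=3, likelihood=2, detectability=2),
    SystemFunction("report formatting", impact=1, likelihood=1, detectability=1),
]

# Highest-risk functionality gets the most validation attention.
for fn in sorted(functions, key=lambda f: f.risk_score, reverse=True):
    print(f"{fn.name}: score {fn.risk_score} -> {validation_rigor(fn)}")
```

In this sketch, high-impact GxP functionality such as the audit trail lands in the highest testing tier, while low-risk cosmetic features defer to vendor testing, which is the prioritization critical thinking is meant to produce.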

Leverage, don’t duplicate, vendor activities

ISPE GAMP 5 stresses the importance of life sciences companies involving technology suppliers in the validation process and taking advantage of their knowledge and experience.5 Brooks says vendors can provide documentation and qualification services to support the validation process. With cloud-based software, vendors are well-positioned to provide even more value to customers during deployments and upgrades by performing risk-based testing directly on the customer’s environment.

During a recent validation webinar,8 IDBS customers shared their software validation experiences. One of those clients, Merry Danley, IT Director at Q2 Lab Solutions, says that vendors offering cloud-based software as a service (SaaS) will have performed all the necessary testing so there is no need for the user to duplicate those efforts, saving time and resources. “There is no need to repeat something internally that you are outsourcing,” she says.

IDBS Cloud Validation Services,9 for instance, transform validation through test automation and the standardized validation documentation that regulatory agencies require, reducing both cost and time.

During the webinar presentation,8 Ryan McGee, Associate Director of Bioanalysis at Incyte and an IDBS system user, explains that he relies on IDBS to validate that the software operates as intended, while he and his team focus on the specific workflows needed for day-to-day operations. He says: “Having that assurance that the vendor has done its due diligence allows us to take a risk-based approach and move forward with less testing on the back end.”

Thus, when choosing a new software supplier, review their validation documentation as part of their service package. This documentation will serve as a starting point for your validation process, saving time in the long run.

Automate vendor test scripts to increase confidence

Brooks agrees with industry guidance that automation can play a role in achieving control and quality and in reducing risk during the validation process. Many technology vendors offer test scripts that increase confidence that the system is operating as intended. Such scripts run automatically and alert users when something is out of specification. For example, automated operational qualification (OQ) testing during the implementation stage involves testing the system to confirm it operates as intended and within manufacturer-approved operating ranges.

Additionally, vendors may offer customer-specific performance qualification (PQ) testing to help verify use cases. This can prove cost-effective to the user.

System users can focus attention on high-risk areas using PQ test scripts, leveraging risk assessments created by vendors. These high-risk areas may include audit trails, electronic signatures, regulatory aspects and data integrity. It is important to note here that 65% of FDA warning letters in 2021 cited data integrity issues.10

Danley says she can see value in setting up automated test scripts to run on a predefined basis and receive alerts when something fails, as long as those failures are monitored and addressed.8

Automating test scripts can help support the system throughout its lifecycle, rather than requiring time and resources to develop and run the scripts manually. Automating tests that are repeated often yet unlikely to vary can reduce timelines while still ensuring quality.
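The pattern described in this section, recurring automated checks that alert only on failure, can be sketched as follows. This is a minimal illustration under stated assumptions: the check functions and the alert channel are hypothetical placeholders, not IDBS APIs, and in practice the run would be triggered by a scheduler and the checks would query the real system.

```python
# Hypothetical sketch: run a set of validation checks and alert on failures.
# Check functions and the alert mechanism are illustrative placeholders.
from typing import Callable

Check = Callable[[], bool]


def check_audit_trail_enabled() -> bool:
    # A real PQ script would query the system's configuration here.
    return True


def check_esignature_required_on_approval() -> bool:
    # Likewise a placeholder for a real system query.
    return True


def run_validation_checks(checks: dict[str, Check],
                          alert: Callable[[str], None]) -> list[str]:
    """Run every check, alert on each failure, and return the failed names."""
    failures: list[str] = []
    for name, check in checks.items():
        try:
            ok = check()
        except Exception:
            ok = False  # treat an erroring check as a failure
        if not ok:
            failures.append(name)
            alert(f"Validation check failed: {name}")
    return failures


# Example run; in practice a scheduler (e.g. cron) would trigger this
# on the predefined basis Danley describes.
alerts: list[str] = []
failed = run_validation_checks(
    {"audit trail enabled": check_audit_trail_enabled,
     "e-signature on approval": check_esignature_required_on_approval},
    alert=alerts.append,
)
```

Because passing checks are silent and only failures generate alerts, the monitoring burden stays low, provided, as Danley notes, that someone actually watches and addresses the failures.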

Validate software updates faster for continued improvement

Whenever a change is made in a regulated environment, like life sciences, validation is necessary. So whether switching to a new software vendor or receiving regular system updates from your current vendor, validation is required. Cloud-based platforms, like IDBS Polar, will regularly release updates, patches and other changes—all of which require validation. You want to maintain control over these updates and ensure the system continues to perform in a validated state.

Brooks says these changes don’t have to mean more work. For example, only validate the new functionalities that will be used by your organization. He says this creates “an opportunity for continuous improvement that will make your organization more agile.”4

McGee says: “We recently received upgrades to our existing solution and we chose to focus our validation efforts on the new functionality of the software that we would use in our daily workflow.”8

In the Technology Networks article, Brooks notes that it is important for the vendor to have detailed knowledge of its software and the updates being made. That knowledge streamlines the validation process, making it possible to assess risks and implement new software features in less time and with fewer resources.

Approach software validation as an opportunity to continually improve

Brooks says these best practices begin with partnering with a vendor that understands life sciences regulatory and quality requirements. But no validation is without a few bumps, agree Danley and McGee.

“There will always be challenges when it comes to validation,” says Danley.8 “But it all comes down to continuously improving. Understand your system. Know what you are getting. Don’t just validate to check a box. Take the opportunity to become more efficient and make the system most useful for you.”

 

About the author

Fran Carmody, Services Enablement Manager at IDBS

Fran is responsible for unlocking the value of the IDBS platform to solve some of the hardest challenges in emerging and global BioPharma companies and their contractor partners. Before that, Fran led the Global Professional Services team, responsible for customer transformation to a culture of data with the implementation of the IDBS technology stack.  

Before joining the IDBS team in 2012, Fran was a consumer of IDBS technologies while working with several start-up biotech companies, focused on novel immunotherapies, including monoclonal antibodies and small molecules.  

Fran has a dual BA in Biology and Psychology from Skidmore College.  

 

References

  1. Kaminski, E. (2023). 2023 FDA warning letters and software validation. Ketryx. Retrieved from [https://www.ketryx.com/blog/2023-fda-warning-letters-and-software-validation]
  2. Unger, B. (2017). An analysis of FDA warning letters on data governance & data integrity. Pharmaceutical Online. Retrieved from [https://www.pharmaceuticalonline.com/doc/an-analysis-of-fda-warning-letters-on-data-governance-data-integrity-0001]
  3. Compliance Online. (n.d.). The fundamentals of FDA software validation. Retrieved from [https://www.complianceonline.com/resources/fundamentals-of-fda-software-validation.html]
  4. Brooks, J. (2023). Improving time-to-value for GxP computer systems. Technology Networks. Retrieved from [https://www.technologynetworks.com/informatics/articles/improving-time-to-value-for-gxp-computer-systems-372720]
  5. Wyn, S., & Clark, C. (2023). What you need to know about GAMP 5 guide, 2nd Edition. Pharmaceutical Engineering. Retrieved from [https://ispe.org/pharmaceutical-engineering/january-february-2023/what-you-need-know-about-gampr-5-guide-2nd-edition?]
  6. Baker, P., & Khaja, H. (2021). A risk-based approach to GxP computer systems validation using critical thinking. Agilent. Retrieved from [https://www.agilent.com/about/data-integrity/en/risk-based-approach-to-csv.html]
  7. FDA. (2018). Data integrity and compliance with drug cGMP: Questions and answers. Retrieved from [https://www.fda.gov/regulatory-information/search-fda-guidance-documents/data-integrity-and-compliance-drug-cgmp-questions-and-answers]
  8. Danley, M., Brooks, J., McGee, R., & Curtis, D. (2022). Smarter, faster, better validation. Retrieved from [https://www.idbs.com/2022/06/smarter-faster-better-validation/]
  9. IDBS. (2021). IDBS Cloud Validation Services. Retrieved from [https://www.idbs.com/cloud-validation-services/]
  10. Jansen, D. (2022). How to avoid warning letters for data integrity non-conformances. MasterControl. Retrieved from [https://www.mastercontrol.com/gxp-lifeline/data-integrity-trends/]

Further reading

Infosheet: IDBS Cloud Validation Services

On-demand webinar: Smarter, faster, better validation

Blog: Deploying and managing cloud microservices in GxP environments – Part 1

Check out our Blog section for more like this
