In 2001, the FDA published the first version of the Bioanalytical Method Validation (BMV) Guidance for the bioanalysis industry.
Over the next 17 years, the FDA issued a revised draft and then, in 2018, a final version that incorporated suggestions from scientists working in the laboratory.
Scientists were encouraged to follow all versions of the document; however, “there’s no substitute for Good Science.” From the beginning, the finalized 2018 version lays out expectations for upholding the seven essentials of bioanalytical method validation – selectivity, sensitivity, accuracy, precision, reproducibility, limit of quantitation, and stability – along with their definitions and importance.
A quick scan reveals the major changes and how they affect bioanalytical researchers.
Method development – a greater emphasis
Towards the start of the document, there are updates to bioanalytical method development – “Freshly prepared Quality Controls are recommended for precision and accuracy analyses during method development, as stability data are generally not available at this time.”
Making fresh quality controls daily creates additional preparation work for method validation scientists. While it’s good practice, it’s also time-consuming and laborious. Compliance on this point could help a laboratory avoid a future Form 483, the notice FDA investigators issue when an inspection raises concerns about operations.
The Technology Piece?
Technology could help enforce compliance with this new guideline by alerting users, in real time, to an expired QC in their experiment. This would ensure a fresh QC is made that day and the bioanalyst can proceed with their work, documenting appropriately to complete the experiment.
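The real-time alert described above can be sketched as a simple date check. This is a minimal illustration with hypothetical function and field names, not a description of any particular LIMS feature; it assumes a QC counts as “fresh” only if it was prepared on the day of the run.

```python
from datetime import date

# Minimal sketch (hypothetical names): flag QCs whose preparation date is
# not the day of the run, assuming "fresh" means prepared that same day.
def expired_qcs(qc_prep_dates, today=None):
    """qc_prep_dates: {qc_id: date prepared}. Returns IDs needing re-prep."""
    today = today or date.today()
    return [qc_id for qc_id, prep in qc_prep_dates.items() if prep != today]

run_qcs = {"QC-Low": date(2024, 5, 1), "QC-Mid": date(2024, 5, 2)}
print(expired_qcs(run_qcs, today=date(2024, 5, 2)))  # → ['QC-Low']
```

A system running this check at run setup could block the experiment until the flagged QCs are re-prepared and documented.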
Internal standards – requirements for monitoring variability
When it comes to chromatographic assays, the Internal Standard response has a significant impact on the results since it is used to normalize the integrated areas of the chromatographic peaks as part of the data analysis. “For CCs (chromatographic assays), the IS response should be monitored for variability. An SOP should be developed a priori to address issues with IS variability.”
Generating an official SOP takes considerable work and thought. Add the requirement for review by several approving parties, each deciding which elements are essential, and you are looking at a considerably extended timeline.
Plotting an internal standard (ISTD) graph presents another issue – if a LIMS cannot produce the plot, external software such as Excel may have to be used to calculate the necessary statistics and flag any threshold outliers. Excel can process the data, but this approach leaves considerable room for human error, and every formula must be reviewed and locked down to ensure data integrity.
The Technology Piece?
Software that features an advanced spreadsheet technology can put individual runs through ISTD assessment and highlight for review any outliers that exceed the upper or lower threshold. Comparing ISTD response between runs – whether against an original validation run or a passing sample analysis run from the day before – can determine whether the ISTD solution is stable. If a change in the mass spectrometer has affected the assay, or if the extraction was problematic, the comparison between runs will reveal it clearly.
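The outlier check described above can be sketched as a comparison of each sample’s ISTD response against the run mean. The ±50% band below is an illustrative threshold of the kind an SOP might define, not a value mandated by the guidance, and the function name is hypothetical.

```python
# Minimal sketch: flag ISTD responses outside a tolerance band around the
# run mean. The ±50% band is an illustrative SOP choice, not a guidance
# requirement.
def flag_istd_outliers(responses, tolerance=0.50):
    """responses: {sample_id: ISTD peak area}. Returns flagged sample IDs."""
    mean = sum(responses.values()) / len(responses)
    low, high = mean * (1 - tolerance), mean * (1 + tolerance)
    return [s for s, r in responses.items() if not (low <= r <= high)]

run = {"S1": 100_000, "S2": 104_000, "S3": 40_000, "S4": 98_000}
print(flag_istd_outliers(run))  # → ['S3']
```

Running the same check against a prior accepted run’s responses, rather than the current run’s own mean, gives the between-run comparison described above.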
Documenting and reporting
Updates on reporting are mentioned later in the guidelines:
“Table of calibrator concentration and response function results of all runs (pass and fail) with accuracy and precision,” and “Table of QC results of all runs (pass and fail) with accuracy and precision results of the QC samples and between run accuracy and precision results from successful runs.”
To follow this recommendation with a traditional LIMS, all runs in the study would have to be unaccepted so that any failing calibrator deactivated from the curve can be re-activated to pull the complete data table. This also affects sample and QC results: for QCs, Excel might have to be used to calculate intra- and inter-run statistics, with and without failing QCs, to meet the new guidance.
At this point in the process, the data has already been through scientific and SOP review, so project managers and report writers need to be cautious. Bringing the failed calibrators back in can change the data analysis – one must be careful to reactivate the right ones so as not to skew the data.
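The intra- and inter-run statistics mentioned above reduce to %bias (accuracy) and %CV (precision) calculations. A minimal sketch, with hypothetical run data, computing QC statistics with and without a failing run:

```python
import statistics

# Minimal sketch (hypothetical data): accuracy (%bias) and precision (%CV)
# for one QC level, with and without runs flagged as failing.
def accuracy_precision(values, nominal):
    mean = statistics.mean(values)
    bias = (mean - nominal) / nominal * 100    # accuracy, % bias vs nominal
    cv = statistics.stdev(values) / mean * 100  # precision, % CV
    return round(bias, 1), round(cv, 1)

qc_mid = {"Run1": [202.0, 198.0, 205.0], "Run2": [240.0, 236.0, 244.0]}
failing = {"Run2"}
all_vals = [v for vals in qc_mid.values() for v in vals]
pass_vals = [v for run, vals in qc_mid.items() if run not in failing for v in vals]
print(accuracy_precision(all_vals, nominal=200.0))   # → (10.4, 9.6)
print(accuracy_precision(pass_vals, nominal=200.0))  # → (0.8, 1.7)
```

Keeping both result sets side by side is exactly what the guidance’s pass-and-fail tables ask for.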
The Technology Piece?
With a sophisticated electronic data management system, duplicate tables for calibrators and QCs can be generated simultaneously, saving time and reducing the risk of changing data.
Further reporting recommendations follow – “Table of re-injected runs with results from original and re-injected runs and reason(s) for reinjection.” If something external has affected the run, it can be re-injected. However, the original run and the re-injected run results must be side by side and reviewed thoroughly to ensure the data is accurate before generating reports.
In a LIMS, accepting both the original and re-injected results causes confusion about which result to report. Project managers and report writers must follow a sequence of events to ensure they are tabling the appropriate data. This process is time-intensive, and it has to be reviewed to confirm the correct data ends up in the final report.
With advanced spreadsheet technology, such as IDBS’ E-WorkBook and the Advance spreadsheet, the original and re-injected run values can be highlighted so that they are reported under the correct run type. This eliminates any manual data transfer to comply with guideline standards.
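The side-by-side tabling described above amounts to pairing results keyed by sample ID. A minimal sketch, with hypothetical function and field names:

```python
# Minimal sketch (hypothetical names): table original and re-injected
# results side by side with the documented reason for re-injection, so each
# value is reported under the correct run type.
def reinjection_table(original, reinjected, reasons):
    """original/reinjected: {sample_id: result}. Returns report rows."""
    return [
        {
            "sample": sample,
            "original": original[sample],
            "reinjected": result,
            "reason": reasons.get(sample, "not documented"),
        }
        for sample, result in reinjected.items()
    ]

orig = {"S1": 12.4, "S2": 15.1}
reinj = {"S2": 14.8}
why = {"S2": "autosampler pressure fault"}
print(reinjection_table(orig, reinj, why))
```

Because each row carries both values and the reason together, the report writer never has to reconstruct the sequence of events from separate accepted runs.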
Alterations with the scientists in mind
Some updates to the guidelines have changed for the better – “The sponsor should prepare any calibration standards and QCs from separate stock solutions. However, if the sponsor can demonstrate the precision and accuracy in one validation run using calibrators and QCs prepared from separate stock solutions, then the sponsor can use calibrators and QCs prepared from the same stock solution in subsequent runs.”
In the past, separate stock solutions were required to prepare calibration standards and QCs, particularly for small-molecule studies; if the two stocks compared within 5%, they could be used to prepare the known samples. Under the new guideline, once two separate stocks have been shown to compare in a precision and accuracy (P&A) run, a single stock may be used to prepare knowns from then on, saving time and expensive reagents. This should also yield better precision and accuracy results in subsequent runs and, therefore, a better method.
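The 5% stock comparison mentioned above is straightforward arithmetic. A minimal sketch, with hypothetical response data, comparing the mean responses of two stocks against the 5% limit:

```python
# Minimal sketch (hypothetical data): compare two stock solutions by mean
# response and check whether they agree within a 5% limit.
def stocks_comparable(resp_a, resp_b, limit_pct=5.0):
    """resp_a/resp_b: replicate responses per stock. Returns (% diff, pass)."""
    mean_a = sum(resp_a) / len(resp_a)
    mean_b = sum(resp_b) / len(resp_b)
    diff_pct = abs(mean_a - mean_b) / ((mean_a + mean_b) / 2) * 100
    return round(diff_pct, 2), diff_pct <= limit_pct

print(stocks_comparable([1.010, 1.004, 0.998], [0.996, 1.002, 0.990]))
```

A passing comparison here, demonstrated once in a P&A run, is what lets the laboratory switch to a single stock for subsequent runs.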
The Technology Piece?
Where current technology falls short, innovative software can be leveraged to address these changes. With the flexibility and minimal re-configuration of E-WorkBook and the Advance spreadsheet, bioanalytical researchers can quickly meet new guidance standards and remain in compliance without heavy lifting – both today and against any future challenges raised by the BMV.
You can find the full finalized 2018 BMV guidance here.
If you’d like to know more about how technology can transform your bioanalysis operations, then click here.