NGS (next-generation sequencing) can help deliver personalised medicine and deepen our understanding of disease; of that I have no doubt. But like the technologies that came before it, such as HTS (high-throughput screening), combinatorial chemistry and molecular modelling, it is not a silver bullet. It is a tool that can aid the cause; alone, it cannot solve the problem.
We have seen it over and over again: a new technology, method or thingamabob that promises to solve every problem. In the Wild West they called it snake oil, and we all know what that means!
To leverage technologies such as NGS properly, they need to be "tamed". By that I mean used effectively, in a manner that provides consistent, reproducible, context-rich results. To do this, the results have to be generated in the same fashion as any other analytical method, with good control of all the variables that can affect them, so that results can be compared and used in decision-making.
The obvious goal for NGS is to move into clinical diagnostics, and this would be a huge step forward for the provision of personalised medicine. To get there, however, the technology must be used in conjunction with informatics that provides this extra layer of control and context for the results obtained. I am talking, of course, about ELNs and good scientific data management.
There are also many similarities between the front end of the lab process across all these analytical areas: the requirements for lab asset handling, traceability of information and exception reporting. This suggests that NGS labs will benefit from the same informatics tools, namely ELNs, LIMS and data management.
Of course I may be a little biased, but I think our data solutions are light years ahead of the rest.
Perhaps it’s time to have a look at why they are causing such a buzz in the market?