A fascinating New York Times article on how US cancer centers are racing to map patients’ genes has made me think about how personalized/precision medicine can be approached successfully. The article refers to “an arms race within the war on cancer” as major academic medical centers spend and recruit heavily. As the US medical establishment moves towards the routine sequencing of every patient’s genome in the quest for precision medicine, most pundits predict a time when whole genome sequencing is ubiquitous throughout health care.
Although this fundamental shift will present many challenges, any successful approach to personalized/precision medicine has to be centered on the data. Much industry analysis revolves around how to bring real-world data providers and their data together for the benefit of patients, providers and researchers. When it comes to analyzing data to achieve better patient outcomes, new research and targeted therapies are only as good as the quality of the data the analysis is based on. Big may be beautiful, but when it comes to data, it’s worthless unless it is of high quality and we can make sense of it.
So we all need a smart – not just a BIG – approach to the highly complex datasets we are going to use to drive future medicine. That means sharing interoperable data among the emerging new groups of stakeholders and ensuring protection is afforded to personal data. The opportunities to drive precision medicine through better understanding of the real world are broad and deep. A keen focus on the quality and interoperability of the data is, as always, vital.