The term personalized medicine was coined in the 1990s to describe the application of new methods of high-throughput biological analysis to drug discovery and validation, eventually leading to bespoke medical interventions. Over the last decade, the concept of personalized medicine has gradually been replaced by the broader term precision medicine. The goal of precision medicine is to optimize therapeutic benefit for specific groups of patients, thereby improving outcomes and reducing costs. This paradigm has been driven by the recognition that complex, chronic diseases require a tailored approach to patient management if value-based care is to be realized.
A cornerstone of precision medicine is the use of data to drive medical decision making. In 2013, the US Institute of Medicine argued for improvements in the use of data to inform interventions and health system performance. Research in cardiovascular disease has uncovered gaps in quality of care, with significant variation in patient outcomes and the cost of care. To enable precision medicine, effective collection and use of data are essential.
Advances in computational capacity and computer science have led to the development of analytical platforms that can analyze large and diverse data sets. Big data analytics can be defined as the application of computational algorithms to large volumes of data to identify hidden patterns and make predictions that could improve patient outcomes. Machine learning (ML) goes a step further, applying analytical techniques that enable computers to learn directly from data. Both approaches require large volumes of high-quality data to effectively impact precision medicine. Patients with long-term chronic conditions generate a plethora of data as they journey through the healthcare system. The challenge has been to effectively capture and exploit that data.
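To make the idea of "learning from data" concrete, the sketch below shows the smallest possible example: rather than hand-coding a diagnostic cutoff for a biomarker, a program searches historical records for the threshold that best separates patients with and without an adverse event. All values and names here are synthetic illustrations, not clinical data or any published model.

```python
# Minimal illustration of learning a decision rule from data instead of
# hard-coding it: find the biomarker cutoff that maximizes accuracy on
# a (synthetic) historical cohort.

def learn_threshold(values, outcomes):
    """Return the cutoff that best separates events from non-events."""
    best_threshold, best_accuracy = None, -1.0
    for candidate in sorted(set(values)):
        # Predict an event whenever the biomarker meets the candidate cutoff.
        correct = sum(
            (v >= candidate) == bool(o) for v, o in zip(values, outcomes)
        )
        accuracy = correct / len(values)
        if accuracy > best_accuracy:
            best_threshold, best_accuracy = candidate, accuracy
    return best_threshold, best_accuracy

# Synthetic cohort: biomarker level and whether an event occurred (1 = yes).
levels = [0.2, 0.4, 0.5, 1.1, 1.3, 1.8, 2.0, 2.4]
events = [0,   0,   0,   0,   1,   1,   1,   1]

threshold, accuracy = learn_threshold(levels, events)
print(threshold, accuracy)  # → 1.3 1.0: the cutoff is inferred, not hand-set
```

Real ML systems replace this brute-force, single-variable search with models over thousands of variables, but the principle is the same: the rule comes from the data.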
The major sources of data in cardiovascular medicine include patient records in Electronic Health Records (EHRs), administrative databases, and clinical registries. Increasingly, patient-generated data from biometric wearables, mobile health applications, and social media are being integrated into data analytics platforms. The quantitative data derived from laboratory testing, biomarker data, and “omics” data have been particularly influential in developing insights for precision medicine. One of the greatest challenges in the field of medical analytics is data quality. EHRs contain multiple data sources, many of which are unstructured or heterogeneous, rendering them difficult to analyze. For instance, data entered into EHRs may consist of physician notes, quantitative tests, and images. Assimilating different data types into a format that can be interrogated by computational algorithms is time-consuming. Unstructured data such as written notes may require manual processing before they can be analyzed effectively. However, as EHRs also typically contain a high proportion of data from laboratory tests, their quantitative nature makes these data points easier to incorporate into analyses.
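The contrast between structured and unstructured EHR data can be sketched in a few lines. In the hypothetical record below (field names and values are illustrative assumptions, not a real EHR schema), the laboratory results are immediately usable as numeric features, while the free-text note needs extra processing, here a simple pattern match, before a blood-pressure reading can join the analysis.

```python
import re

# Hypothetical mixed EHR extract: structured labs plus an unstructured note.
record = {
    "labs": {"ldl_mg_dl": 162, "hba1c_pct": 6.1},
    "note": "Pt reports chest tightness. BP 148/92 on exam. Non-smoker.",
}

def to_feature_row(rec):
    """Flatten one mixed record into a numeric feature dict."""
    # Quantitative lab results are already analysis-ready.
    features = dict(rec["labs"])
    # The free-text note needs extraction work, e.g. pulling a
    # blood-pressure reading out with a regular expression.
    match = re.search(r"BP (\d+)/(\d+)", rec["note"])
    if match:
        features["systolic_bp"] = int(match.group(1))
        features["diastolic_bp"] = int(match.group(2))
    return features

row = to_feature_row(record)
print(row)  # labs pass straight through; the BP values had to be extracted
```

Production pipelines use natural language processing rather than hand-written patterns, but the asymmetry is the same: quantitative data flows in directly, while text must first be converted into structured form.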
To date, data analytics techniques drawing on EHR and laboratory data have been implemented in a number of studies to predict mortality, hospital readmission, and post-operative complications in cardiovascular care. For example, a team from Kaiser Permanente Medical Care Program developed a predictive model through interrogation of EHRs to determine the likelihood of deterioration amongst hospitalized patients with cardiovascular disease. A team from Boston University recently tested a series of machine learning algorithms for their ability to predict hospitalization events based on the analysis of EHR data. They were able to demonstrate a detection rate as high as 82 percent, offering the potential for significant cost savings. A collaboration between the U.K. and Singapore went further, combining machine learning and natural language processing to identify patients at high risk of developing cardiovascular disease. These approaches have used a variety of data sources and analytics techniques to generate models that could be implemented in precision medicine.
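The studies above do not publish their code here, but the general shape of such a risk model can be sketched with a toy logistic regression fit by gradient descent. The features (age in decades, prior admissions), labels, and numbers below are entirely synthetic assumptions for illustration; they do not reproduce any published model or result.

```python
import math

# Toy EHR-derived hospitalization risk model (synthetic data).
# Each row: (age_in_decades, prior_admissions); label 1 = later hospitalized.
X = [(5.1, 0), (6.3, 2), (4.4, 0), (7.0, 3), (5.8, 1), (6.9, 2)]
y = [0, 1, 0, 1, 0, 1]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit(X, y, lr=0.1, epochs=3000):
    """Fit logistic regression weights by stochastic gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of the log-loss w.r.t. the logit
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

w, b = fit(X, y)

def predict(xi):
    """Return the modeled probability of hospitalization for one patient."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)

print([round(predict(xi)) for xi in X])
```

A real model would be trained on thousands of patients and many more variables, and validated on held-out data; the point here is only to show how historical records become a probability of a future event.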
It is clear that clinical laboratories handle large volumes of data from routine testing that can be used to inform medical practice. However, laboratories can also play a role in generating data through biomarker or “omics” studies that could be influential in developing precision medicine further. Clinical laboratories process and retain vast numbers of biological specimens that could be interrogated for research purposes. Through partnerships with research organizations, clinical laboratories are well-positioned to provide high-quality specimens, complete with medical data that could aid in biomarker and “omics” investigations. Several service providers have emerged to facilitate the efficient and ethical transfer of samples between laboratories and research organizations. A deeper understanding of the role that genetics and other key biomarkers play in disease would, in turn, strengthen the ability to apply precision medicine in routine clinical care.
It is expected that data analytics research will reveal new applications for clinical tests and their role in value-based healthcare. However, it should not be assumed that data analytics alone will improve patient care or outcomes. The management of patients with chronic conditions such as cardiovascular disease is complex and much work remains before precision medicine will become a part of routine clinical care.
Dr. Sophie Laurenson is a scientist and social entrepreneur. Her company develops medical technology for resource-limited settings and offers consultancy services for businesses operating in emerging markets through africanpuffling.com.