When It Comes to Precision Medicine, It's All About the Data

April 18, 2019

The longstanding promise of precision medicine is that it will give clinicians the tools and therapies they need to consistently deliver the right treatment to the right patient, while simultaneously reducing waste and yielding cost savings for health systems. However, while the pace of discovery within the field has been remarkable, its ultimate success depends on the ability of scientists, clinicians, and health systems to collaborate and share critical data and information assets.

To be truly successful, all stakeholders in the precision medicine ecosystem must be involved in shaping how data is collected, analyzed, and used so that:

•  Providers have information at the point of decision, in the context of their clinical workflow.

•  Patients can define preferences about the use and sharing of their genomic and other health-related information.

•  Researchers can identify and adopt best practices for research using electronic health record (EHR)-linked genomic information.

•  Health systems can offer tools and technologies that enable everyone involved to make more informed decisions.

In practice, integrating high-quality data into a healthcare system must be a priority, so that the best possible information is available for patient care and research. In turn, this builds a collective understanding of how precision medicine innovations affect population health and care delivery within healthcare systems.

 

Prescription? It’s all in the Data

Making full use of precision medicine’s multidimensional data streams necessitates standardized methods of data aggregation and analysis as well as leveraging emerging technologies, such as machine learning, natural-language processing, and artificial intelligence. The application of these new analytical methods enables healthcare professionals to recognize the patterns of health and disease and to create more efficient and sustainable models of care. One of these methods is data preparation.

Self-service data preparation supports integrating and standardizing large data sets, at scale, and provides healthcare professionals with the ability to quickly and efficiently parse and separate multiple data elements that most molecular biology tools cram together in a single field. Leveraging data preparation, organizations can easily combine, clean, and shape Big Data so it can be useful for analytics.
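As a concrete illustration of the parsing problem described above, consider a variant annotation that a sequencing pipeline emits as a single pipe-delimited field. The sketch below is a hypothetical example (the field name, separator, and column names are assumptions, not any specific tool's format) of the kind of split a data preparation step performs:

```python
# Hypothetical illustration: many molecular biology tools cram several
# data elements into one field, e.g. "BRCA1|c.68_69delAG|pathogenic".
# A data-preparation step splits these into separate, queryable columns.

def split_annotation(record, field="annotation", sep="|",
                     parts=("gene", "hgvs", "significance")):
    """Return a copy of `record` with the crammed field split into columns."""
    cleaned = dict(record)
    values = cleaned.pop(field, "").split(sep)
    # Pad with None so records with missing trailing parts don't break.
    values += [None] * (len(parts) - len(values))
    cleaned.update(zip(parts, values))
    return cleaned

rows = [
    {"patient_id": "P001", "annotation": "BRCA1|c.68_69delAG|pathogenic"},
    {"patient_id": "P002", "annotation": "TP53|c.524G>A"},
]
prepared = [split_annotation(r) for r in rows]
# prepared[0] -> {"patient_id": "P001", "gene": "BRCA1",
#                 "hgvs": "c.68_69delAG", "significance": "pathogenic"}
```

Once the elements live in separate columns, they can be filtered, joined, and aggregated like any other analytic data, which is the point of doing this preparation at scale.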

 

Precision Medicine in Action: How PrecisionProfile Leverages Data Preparation

Our customer, PrecisionProfile, provides an example of how one organization leveraged a combination of technologies to help oncologists and research scientists analyze genomic profiles.

To advance cancer research and treatment, PrecisionProfile has developed a genomic analytics platform that provides analytics for researchers, universities, and oncologists to uncover shared disease drivers. Most importantly, it helps physicians develop treatment plans based on protocols that were effective for similarly affected patients. Because many research sequences and processes are repeatable, scientists can now replicate a specific sequence for new sets of demographics or drugs as new cancer types, patients, and clinical data become available.

In this setting, Paxata's data preparation technology is used to streamline the treatment research process: researchers upload patient-specific information, normalize variances in that information, and then match patients with similar genetic characteristics and cancers to find the drugs and treatment protocols that have proven most effective.
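The normalize-then-match workflow described above can be sketched in miniature. This is a hedged illustration, not Paxata's or PrecisionProfile's actual implementation; the synonym table, field names, and overlap-count scoring are all assumptions made for the example:

```python
# Illustrative sketch of the workflow: normalize variant spellings of
# gene names, then rank prior patients' protocols by genetic overlap.

SYNONYMS = {"her-2": "ERBB2", "her2": "ERBB2"}  # assumed synonym table

def normalize(genes):
    """Map variant spellings and synonyms to one canonical gene symbol."""
    return {SYNONYMS.get(g.strip().lower(), g.strip().upper()) for g in genes}

def match_protocols(new_patient_genes, cohort):
    """Return prior patients' protocols, ranked by genetic overlap."""
    target = normalize(new_patient_genes)
    scored = []
    for rec in cohort:
        overlap = target & normalize(rec["genes"])
        if overlap:
            scored.append((len(overlap), rec["protocol"]))
    return [protocol for _, protocol in sorted(scored, reverse=True)]

cohort = [
    {"genes": ["HER2", "TP53"], "protocol": "trastuzumab-based"},
    {"genes": ["KRAS"], "protocol": "regimen-B"},
]
ranked = match_protocols(["her-2"], cohort)  # -> ["trastuzumab-based"]
```

A production system would match on far richer features (specific variants, cancer type, clinical history) and measured outcomes, but the shape of the task is the same: normalization first, so that "her-2" and "ERBB2" count as the same characteristic, then similarity-based retrieval.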

The ability to bring together public and private genomic data, information from a patient’s EHR and clinical records from outside sources lets clinicians focus on researching treatment options, not on managing data. Other benefits and results include:

•  Reducing the cycle time of a genome clinical study from 1-3 months to 2-8 hours.

•  Decreasing time from physician diagnosis to treatment from 2-3 weeks to hours.

•  Empowering oncologists to leverage data to recommend personalized cancer treatment plans with the highest probability of success.

•  Reducing the spend on drugs that may be ineffective.

•  Enabling physicians and healthcare organizations to increase patient volume by 10 percent.

While no single technology for precision medicine has yet emerged, it's clear that organizations need to know, and learn from, the data they have; integrate and blend it with information from partners and public sources; standardize different formats; and move data from one clinical setting to another. With the growing volume of IoT data, insights from Fitbit and other wearables, and medical information from iPhones and other mobile apps, everyone involved in precision medicine will need to rely on some aspect of data preparation to digest data for analysis, create personalized treatments, and deliver precision in improving patient outcomes.
