What is the future of big data in health care?


Big data might be daunting, but Matthew J. Becker, Senior Director and Global Head of Statistical Programming at inVentiv Health Clinical, says the goal remains simple for those in health care and life sciences: cure disease and improve health outcomes.

Speaking last week at SAS Global Forum, Becker noted that most people had probably heard of the three big data Vs – volume, velocity and variety – but he believes two more Vs are emerging:

  1. Value is increasingly important because multiple stakeholders place different value on the same data. For instance, the data from a patient’s glucose monitor has a different value for the doctor, the device manufacturer and the patient. 
  2. Validity is also an important consideration, as accuracy and completeness factor heavily into decision-making.

Four distinct big data pools exist in the US health care domain. Becker said there is currently very little overlap in the ownership and integration of these pools, though integrating them will be critical to making big strides with big data in health care:

  • Pharmaceutical R&D data
  • Clinical data
  • Activity (claims) and cost data
  • Patient behavior and sentiment data

Becker went on to describe challenges and trends related to big data in health care, the evolving role of the data scientist and roadblocks to success. His parting words of advice cautioned conference attendees against taking on too much, too soon when dealing with big data.

“We’re all seeing there are things in our companies we need to improve. Take it one step at a time… Don’t try to pull all your petabytes of data from claims, the patient experience, sentiment. Start with one problem at a time.”

Highlights from the presentation:

  • Trends include earlier or immediate clinical trial decision-making (“go/no go”), collaboration and partnership networks, data governance for multiple data sources and stricter regulations.
  • The role of the data scientist is emerging as organizations look for people who can make sense of the variety of data, model complex business problems and demonstrate the intellectual curiosity and business knowledge to apply context to the data.
  • The size of data does not eliminate the need for quality. It will be crucial not just to analyze the data, but also to scrutinize the methods used to gather it.
  • Tackling big data is not just an IT role, but is cross-functional. Organizations need to understand the goal in analyzing data.

About Author

Becky Graebe

Director, Communications

In addition to traditional employee communication efforts at SAS, Becky Graebe oversees an award-winning global intranet and a variety of enterprise social media channels. Her goal is to create a working environment where SAS employees around the world feel connected and inspired to share fresh ideas, solutions and expertise with colleagues and customers. Having studied at Southern Methodist University and earned her degree from Stetson University, she now serves on the Employee Communications Section board for the National Public Relations Society of America, is an active member of Triangle Women in Communications, and volunteers with Citizen Schools and the Wake County Support Circle Program.