Five pearls of wisdom from health analytics executive panel

At the 2014 SAS Health Analytics Executive Conference last week, I had the privilege of hosting a panel discussion with three industry leaders representing a cross-section of large, pioneering health and life sciences organizations. Each is on the leading edge of using analytics to pave the path toward value-based, patient-centered care. The panelists joining me were:

  • Terhilda Garrido, Vice President of Health IT Transformation and Analytics with Kaiser Permanente
  • Patrick McIntyre, Senior Vice President of Health Care Analytics, WellPoint, and
  • Sai Venkat, Senior Director of Enterprise Architecture for Janssen Pharmaceuticals, Johnson & Johnson.
Executive Leadership panelists at the SAS Health Analytics Executive Conference.

Several themes emerged from the discussion; five in particular are, in my opinion, especially valuable for others embarking on the same journey toward analytic maturity:

  1. Put the patient at the center. Given that face-to-face health care delivery represents a fraction of the opportunity for improving health and wellness, we need to start thinking about how to establish an “always on” health care system – one that provides individuals with convenient access to a seamless blend of online and face-to-face care services – in the same way that they expect from any other consumer-centric organization. Start to think beyond the static patient portal as the only way to extend interaction beyond the walls of your organization, and think about how to create mass personalization by using data to understand patients as unique individuals.
  2. Collaborate. We heard that there is a growing realization that collaborations are required to gather a truly comprehensive understanding of individual patients as well as populations of patients. Providers, health plans and big pharma have started to form strategic alliances that, over time, can contribute to development of a 360-degree view of the patient – something that would be impossible for any single organization to develop in isolation. This holds the promise of:
    • More rapid and targeted drug development
    • Greater ability to predict which patients are at higher risk and could benefit from earlier intervention
    • Development of win-win risk sharing arrangements
    • Control of spiraling health care costs.
  3. Cross-pollinate. The panelists agreed that too often there are multiple groups within their organizations, each tackling the same types of analytic challenges. In response, each organization is focused on pooling expertise and sharing best practices, analytic approaches and insights across the enterprise. For some this means developing a formal Center of Excellence (COE); for others it means establishing routine mechanisms for analytic teams to collaborate across traditional departmental boundaries. Whichever approach is taken, each panelist agreed that they are focused on keeping IT, analytics and business leadership in tight lock-step to ensure data-driven initiatives deliver maximum business benefit.
  4. Bring in novel data sources. Our panelists talked about the opportunities associated with bringing EMR and claims data together to build a broad and deep view of the patient. They also described initiatives underway for enriching existing data with novel big data sources. All organizations should consider the potential value of leveraging socioeconomic, consumer, geographic and social media data, while also keeping an eye on a wave of data emerging from mobile apps, wearable devices and other sensor technology. While blending data in this way requires careful attention to ensure that patient privacy is protected, the panel agreed that they are each considering how to leverage this wave of health care big data to improve the health and quality of care for their patients.
  5. Think business, not analytics. The most sophisticated analytics endeavors won’t make a difference if they don’t change or improve something in the real world. Make sure the analytics you conduct is prioritized carefully, aligns with key business strategies and has a clear mechanism to effect change, rather than simply delivering reports or risk scores. While this sounds a bit like motherhood and apple pie, we’re all familiar with cases where increasingly scarce analytic talent isn’t used as effectively as possible or is working in a reactive mode rather than in a proactive, strategic fashion. Analytic maturity means being joined at the hip with the front-line needs of the business.

What will separate the leaders in the changing health care landscape?

As health care continues to evolve, two things are certain: first, consumerism is here to stay and the old adage that the customer is king will rock today’s health care industry. And second, the value-based health care train has left the station and there is no turning back.

Organizations that embrace data to excel in customer experience while being able to measurably move the needle on the cost/quality equation will likely be around in 2020. Those that don’t probably won’t be.

If you missed our Virtual Conference, I invite you to watch the Executive Leadership Panel, as well as several others available on-demand.


How analytics informs the new health care ecosystem

This week, we had the opportunity to hear from industry leaders at the SAS HLS Virtual Conference, where we learned how #HealthAnalytics can inform the future of the health and life sciences industries.

We were fortunate to have John Balla, a SAS expert in customer intelligence, join us for the conference. John has experience with everything from start-ups to Fortune 100 global operators, but he isn’t an industry insider. So as someone who probably knows a little more than the average consumer about the use of analytics, he brings a unique perspective to putting the customer at the center. John’s written an overview of the day in which he considers what the big themes discussed mean to consumers. In his blog post, he addresses four major topics:

Big changes

The three greatest drivers of change in the healthcare ecosystem right now are:

  • Technology and innovation
  • Government mandates
  • Empowered consumers

How Health Analytics moves the needle

Moving the needle comes from the power of forward-looking predictive analytics, which can change the game for the industry (and the patient).

Health analytics is the treatment plan

Dr. Mostashari summed up the problem: health care “costs too damn much and we’re not getting enough value for it.” But analytics isn't only about controlling costs; it's about getting more value for individual patients.

Caring for the patient as a consumer

Ultimately, health analytics can both support individuals’ ability to make choices about their care more easily and help drive better outcomes.

I’ve only scratched the surface in this summary; I encourage you to read John’s post in its entirety.



Health care transformation and the power of prediction

Dr. Farzad Mostashari, a SAS user for 25 years, kicked off the 11th annual SAS Health Analytics Executive Conference yesterday, and made some bold statements regarding transformation and analytics in health care.

“Transformation faces resistance,” he said, and it’s constantly faced with the ‘we’ve been doing it this way for a long time’ mentality. With the change from volume to value, providers and payers alike are frustrated as outcomes and incomes are becoming interrelated. But Mostashari says frustration is in fact a good thing.  Here’s why.

The shift to value-based care is creating a playing field for disruption across the entire ecosystem of health care. And all of these changes are being driven by one thing: cost. We’ve heard time and time again that we simply cannot continue spending more on health care each year.  Federal governments, state governments, employers and employees just can’t afford it.

So, Mostashari posed a critical question: What’s going to happen when that cost is revenue? Everyone’s thinking about “who’s going to win,” and as innovators commonly quote, “it’s not disruption unless someone loses their job.” Mostashari questioned the audience of health care leaders: “Are we going to see companies go out of business? Will it be you? What are you going to do?” He encouraged these leaders to embrace disruption so that “you’re not one of those.”

Additionally, Mostashari enlightened the room with his discussion on reactive versus proactive analytics, and provided several examples to drive home the criticality of the latter.  Most of us, for example, know about the failures of managed care from the 80s and 90s, and because of this, some of us are skeptics about the current reform. But, what’s different this time around? We have tools now we could only dream of before, Mostashari confidently pointed out.  We can now predict.

We can predict ER visits, predict who will get sick and when, predict outbreaks across communities, and so much more. He highlighted that when people talk about population health, it’s usually a “bunch of reports,” without any insights or action; no insights on how it’s actually going to change anything. He emphasized using the power of prediction and putting yourself on the line.  Moreover, he compared the use of predictive analytics by the Obama 2012 campaign to health care, saying that in health care, every day is “model validation day” – aka election day. Every day someone gets sick, someone is diagnosed with cancer, someone has a heart attack; we have to put ourselves on the line and use prediction to optimize health care.  Every other part of the economy does, and it’s long overdue for health care to get on board.


#HealthAnalytics virtual conference coming soon to your screen

Anticipation is building for our 2014 Health and Life Sciences Virtual Conference to take place next Wednesday, May 14th, and we’d like to extend the invitation to you.  We’ve got a great lineup of speakers that promise to get into the big issues facing health care and life sciences today, including how health analytics can be applied to solve them.  And it can all be seen on that screen right in front of you.

The star power starts first thing in the morning. Our first keynote will be delivered by Farzad Mostashari, MD (@Farzad_MD), the visionary former National Coordinator for Health Information Technology and current Visiting Fellow at The Brookings Institution.  Of course most of us know Dr. Mostashari as an advocate for electronic medical records.  In his keynote, we anticipate that he'll explore why “data is oxygen for innovation,” and how data from EMRs are creating an ecosystem of applications and services with the promise of solving some of our most pressing health care problems.  This is a talk not to be missed, and a great way to kick off the conference!

If history is a guide, one of the most interesting sessions will be when different industry representatives participate in a panel on the main stage.  Invariably, we find commonalities in the analytics challenges across payer, provider and life sciences industries.  Moderated by our own CMO, Dr. Graham Hughes (@GrahamHughesMD), you’ll hear from executives from Johnson & Johnson, Kaiser Permanente, WellPoint and McKinsey who have succeeded with analytics. This is a great chance to learn what works, and what their vision is for analytics in their organizations.

We’ll have a number of sessions in the afternoon to choose from that cover topics such as data transparency, consumer engagement, and a session on developing an analytics strategy and culture.

You’ll want to make sure you stay with us through the afternoon for our second keynote, as you’ll get to hear from John F. Crowley, Chairman and CEO of Amicus Therapeutics, and the inspiration for the major motion picture Extraordinary Measures. Crowley has a unique perspective as CEO of a 100-person publicly held biopharmaceutical company working on the development of drugs to treat a range of human genetic diseases. His involvement in the biotech industry goes back over a decade and stems from the 1998 diagnosis of his two youngest children, Megan and Patrick, with Pompe disease, a rare and fatal neuromuscular disorder. Crowley embarked on his journey as an entrepreneur to find a treatment that would save his kids' lives – and the lives of thousands of others.

Please join us on Wednesday, May 14, 2014 – complimentary registration can be found here. And please be sure to participate with us on social media by following @SASAnalytics and using #healthanalytics.


Medicare physician files aren't big data, but they are a big deal

With the release of the most recent Medicare Provider Utilization and Payment Data set, there have been a lot of reports discussing this as “big data.”  And while this release may contain a lot of records (over 9 million), or be a bigger file than is normally downloaded (1.7 GB), it’s misleading to call it big data.

The data contained in this file meets only one of the criteria to be considered big data – volume.  Not velocity.  Not variety.  For us to consider this big data, it would need to be “streaming in at unprecedented speed and must be dealt with in a timely manner.”  Of course, the current format – a static file – has no speed. And when the data, in the form of claims, was coming to CMS, I don’t think we could say it was arriving at unprecedented speed.  As for variety, this doesn't apply either.  This IS a big file, but it contains the same types of records, so there is no challenge of managing, merging and governing different varieties of data. (This is not to diminish the work of CMS in preparing this information.  As their methodology document discusses, much work was done to create these files.)

So why am I compelled to point out that this is not big data?

We can all agree that we've been hearing a great deal about the promise that big data holds for changing health care.  If we can gain analytical insights from this big data we’ve been hearing about, then caregivers and the health care ecosystem can make better health care decisions.  This most certainly is still the case, but as an industry, we’re still novices at mining big data to create insight.

It’s important that we don’t dilute this promise by misunderstanding what big data really is – even for the sake of clickable headlines (yep – I see the big bold letters above and know I’m guilty too). Big data holds promise for new learnings because it contains a variety of data, so we can see how various elements correlate to each other.  And it comes at us fast.  It may contain machine-generated data from sensors – recording health care information at a rate we've never been able to capture before.  And of course there’s the volume of it, which is important because only now do we have the analytical capacity to make sense of all of these elements without having to sample small portions because of technology constraints.

But not being big data doesn't mean it’s not a big deal.  I’m both optimistic for what releasing this kind of data means for transparency efforts and cautious about what we can learn.

Access to open data:  Release of this data is a continuation of the open data revolution, and we should cheer the commitment to transparency and accountability that it represents.

Drawing conclusions carefully:  The data may be interesting, but we may not learn a whole lot from this particular data set.  My takeaway from Lisa Rosenbaum’s New Yorker piece is that conclusions from this data should be drawn carefully, and with appropriate context.  For instance, understanding industry practices such as an entire practice billing under one physician's name is critical to not jumping to false conclusions about the scale of payments made to any one provider.

A big win anyway

Celebrate the win for consumers.  Though this will not transform health care in one fell swoop, it is an important step of many to make our health care system more transparent, which will ultimately benefit all of us as consumers.


GRID delivers fast, high availability clinical research

Many biopharmaceutical companies are opting for modern grid architectures for their SAS computing infrastructures. They’ve used SAS for decades in clinical research to clean and analyze clinical study data in order to determine safety and efficacy for drugs in their pipeline. During this time, clinical trial sponsors and CROs built traditional server-based SAS computing infrastructures to analyze and report clinical trials data.

But recently, many companies have discovered the flexibility, scalability and cost advantages of a modern grid architecture.

What’s driving the need for a new approach?

While the legacy server-based architecture met the SAS infrastructure requirements of clinical development for decades, a number of factors have prompted many companies to examine a more modern grid architecture.

Practically every biopharmaceutical company and CRO provides a SAS computing infrastructure for their clinical development organization. Usually, the SAS infrastructure consists of one or more servers located either in a single data center or scattered across the globe based on the business requirements of the organization.

These servers may have widely different speeds and capacities, and may not even have the same SAS software configuration. This lack of consistency places a burden on the user community to completely understand the pros and cons of each server in the configuration when submitting their jobs for processing.

Imagine being the new employee who unknowingly submits a long running, resource intensive “rush job” to the slowest available server and storage configuration – and then wonders why the job is taking so long to complete!

And some companies are running their clinical development SAS infrastructure on a server and storage platform that includes aging or end-of-life hardware. As a result, these companies are typically dealing with issues such as compute or storage capacity shortages, leading to performance problems with jobs running too slowly. In addition, single points of failure in the server or storage infrastructure can increase downtime.

Modern clinical development is expanding the business requirements for the clinical development SAS infrastructure through:

  • More data.
  • More complex questions and analyses.
  • Greater need for speed and faster time to insights.

Modern grid architecture provides the cure

Many biopharmaceutical sponsors and CROs are choosing to transition to a flexible, modern grid architecture for their clinical development SAS computing infrastructure. A well-planned grid architecture can result in a SAS computing infrastructure that can grow incrementally and cost effectively, provide high availability, and meet changing business demands with dynamic workload balancing.

Once a grid architecture is in place, SAS Grid Manager provides workload balancing, determining where and how each SAS job is distributed across the grid based on the attributes of the job and the available processing capacity.

In addition to workload balancing, SAS Grid Manager provides:

  • High availability when using a grid architecture that limits single points of failure.
  • Flexible, low-cost scalability by adding commodity computer infrastructure as needed.
  • The ability to scale with minimal disruption.
  • Job execution time improvements with grid-enabled code.
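
SAS Grid Manager's scheduler is far more sophisticated than this, but the core idea behind dynamic workload balancing – dispatch each job to the node with the most spare capacity – can be sketched in a few lines of Python. The node names, slot counts and job costs below are purely hypothetical, and real grid schedulers weigh many more attributes (queues, priorities, resource requirements):

```python
# Simplified, hypothetical sketch of dynamic workload balancing:
# greedily assign each job to the node with the most free slots.

def dispatch(jobs, nodes):
    """Assign each (job, cost) pair to the node with the most remaining capacity."""
    assignments = {}
    for job, cost in jobs:
        # pick the node with the greatest remaining capacity
        best = max(nodes, key=nodes.get)
        assignments[job] = best
        nodes[best] -= cost          # consume that node's free slots
    return assignments

nodes = {"node_a": 8, "node_b": 4, "node_c": 6}   # free CPU slots per node
jobs = [("etl_job", 3), ("model_fit", 4), ("report", 2)]
print(dispatch(jobs, nodes))
```

The greedy choice here is what spares the new employee in the anecdote above: the scheduler, not the user, decides which server is least loaded at submission time.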

If you want to learn more about how to implement grid computing in the life sciences industry, this white paper provides some excellent insights into what you can accomplish to create a modern SAS programming environment in a global pharmaceutical company. A grid architecture may be what you need to help your company respond in a cost-effective manner to increasing data volumes, more complex analyses and the need for faster time to insight.


Collaborating to cure cancer

It started about three years ago with a white board session in a conference room in Cary: “What if we could share, integrate and analyze our collective historical cancer research data in a single location?” Thanks to the incredible efforts of the CEO Roundtable on Cancer and its member organizations, the lofty goal of sharing cancer research data is now a reality.

Last week the Project Data Sphere initiative officially launched its groundbreaking platform that makes historical data and analytics broadly accessible to those seeking to advance cancer treatment research.

The number of individuals and organizations that have come together simply because it’s the right thing to do is amazing and inspiring.

While this project involves many stakeholders, SAS’ role has been to provide the analytics technology to enable researchers to discover insights that previously would have been hidden in the data. At SAS, we've  been able to use our extensive experience in hosting data from other highly regulated industries, such as banking, to help develop a system that can handle the ocean of oncology research data that must be prepared, analyzed, reported and shared in a secure manner.

Those participating in Project Data Sphere welcome data from the many organizations that have data to give. And we embrace the researchers who will come to the site to analyze the data and uncover breakthroughs in cancer research down the road.

The Project Data Sphere initiative can make a difference in cancer research, but this launch milestone doesn’t mean we’re going to sit back and rest. On the contrary, we’ve already begun working on next steps to ensure everything is in place to keep researchers and their studies moving forward.

This is just the beginning.


Can data transparency save the world?

Imagine for-profit organizations sharing data to solve significant health problems. Day-to-day adversaries getting together to create a massive pool of clinical data that can help researchers one day find a cure for the most puzzling diseases.

It’s no longer a dream. With Project Data Sphere, an initiative of the CEO Roundtable on Cancer’s Life Sciences Consortium, researchers from life sciences organizations, hospitals and other institutions can share and analyze cancer research data gleaned from across the industry. The result? An impressive pool of data designed to jump-start cancer research.

Project Data Sphere was a topic during a data transparency discussion at the 2014 SAS Executive Forum. Dr. Kald Abdallah, vice president of immunology for Sanofi US, talked about the project and the need for more ambitious cancer research. While mortality rates for illnesses like heart disease continue to drop, cancer death rates are climbing.

Part of the reason for this, Abdallah said, is the decentralized way that research is conducted. Pharmaceutical companies and research organizations have been doing individual bits of research but have found it difficult to share information.

“Cancer is very complicated, and we’re not finding solutions fast enough,” Abdallah said. “The question is ‘How do we change that paradigm so that we can find solutions – and improve cancer care – faster?’”

The answer came when the CEO Roundtable on Cancer established Project Data Sphere as a way for pharmaceutical and life sciences companies to share their clinical trial data. Once a proprietary asset, this information is being aggregated, and in April, it will be available for researchers across the globe. SAS plays a role in Project Data Sphere, providing data hosting and analytics tools to researchers.

“The only commitment we ask from researchers is that they acknowledge that their findings came from Project Data Sphere,” Abdallah said. That’s not much to ask for, he said, given the scope of the program. Now, researchers in China and India can use the same data as North American researchers, adding a new dimension to cancer research.

Cancer is only one area where increased data transparency is changing lives. Mike Wirth is a special adviser on business and technology reengineering to Virginia’s Secretary of Health. Wirth works with state agencies across Virginia’s local governments to aggregate information to serve under-privileged children.

Established under the Comprehensive Services Act for At-Risk Youth and Families, the program administers a fund that purchases services for at-risk children throughout the state. The goal, Wirth said, is to ask critical questions and improve the lives of Virginia’s youth.

“We have to know several things,” Wirth said. “Are services available to the children who need them? Are services being provided in accordance with each child’s needs? Are funds being spent wisely? Are programs meeting measurable goals?”

In the past, answering those questions was difficult: Virginia’s 95 counties and 38 independent cities each can have local pools of data about the services provided to children in their areas. To solve the problem, Wirth and his team first took on a pilot project that helped prove the concept to both the general assembly and the localities. From there, they established a baseline of expenditures to understand “typical” spending for specific services.

Once they had that baseline and an integrated data set, the next phase was to look for outliers. Through data visualization, Wirth’s team can “see” an individual child as well as his or her support network. If the child has a darker color, it means they are more at risk. A lighter color means that the data shows that the child is getting necessary support.

“I can start to slide the time [metric] and see kids getting brighter or darker,” Wirth said. “If their symbol gets darker, I can drill in and see what’s happening there. Is it that they are in a bad situation, or is the provider overburdened? It’s a really powerful way to visualize and see the data.”
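
As a rough illustration of the risk-to-color mapping Wirth describes – a toy sketch, not the actual Virginia system, with made-up scores and scale – a child's risk score at a point in time can be translated into a grayscale shade, darker meaning higher risk:

```python
# Toy illustration (hypothetical scores and scale): map a risk score
# to a grayscale hex color, where darker means higher risk.

def risk_shade(score, max_score=100):
    """Return a grayscale hex color: 0 -> white (low risk), max -> black."""
    score = max(0, min(score, max_score))        # clamp to the valid range
    level = int(255 * (1 - score / max_score))   # darker as risk rises
    return f"#{level:02x}{level:02x}{level:02x}"

# "sliding the time metric" re-colors the child year by year
timeline = {"2012": 20, "2013": 55, "2014": 85}
for year, score in timeline.items():
    print(year, risk_shade(score))
```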


The time is now for health care price transparency

The second annual Report Card on State Price Transparency Laws was released earlier this week, and the grades came as a shock to many. Even with the profusion of dialogue and activity in 2013 around health care price transparency, the US is nowhere near where it needs to be in providing consumers timely, accessible price information. In fact, the 2014 grades are worse; unlike last year, not one state earned an “A” grade. Though the lower grades are in part due to additional evaluation criteria, we still have to ask whether we're headed in the right direction. What changes do we need to make? And most importantly, why do we need to make these changes?

I’ve discussed before on this blog the multiple reasons why health care price transparency is important, but with today’s evolving health care market, the number one reason for change is to aid consumer decision-making. Consumers are playing an increasing role in making decisions around their care and they still have very little means of knowing health care prices. This lack of information contributes to both individual and systemic escalation of health care costs. Further, it’s something consumers are demanding – especially those with high deductible plans and those “shopping” for various health care procedures like an MRI or routine screenings. And the research suggests more than that; consumers want accessible websites that provide these cost comparisons across providers.

So what do states need to do in 2014 to boost their transparency grades?

  1. Legislation. This is a team effort; stakeholders – including consumers, payers and providers - need to come together to advocate for and enact price transparency laws. As Francois DeBrantes of HCI3 stated, “…without legislation, there’s simply no assurance that consumers will have long term access to any pricing information.”
  2. All-payer claims database (APCD). APCDs are the best source of information to provide price information to consumers.  While many states have an APCD or are in the process of developing one, others like New Jersey and Washington have not succeeded in passing legislation for one, despite widespread support. APCDs are no longer a nice-to-have, but a necessity.
  3. Public website. States with an all-payer claims database or other repositories of price and quality information must make the data available to consumers. Further, they’ve got to provide the right and relevant data in an easy-to-use manner. In other words – just having a public website doesn’t cut it. For example, New Hampshire didn’t get any points for its public website this year because it was disabled and unavailable to consumers for an extended period of time.

Taking these steps toward price transparency will bring us closer to effective consumer decision-making. It doesn’t stop at cost, though: quality information must also be included to drive the best decisions. Creating price transparency is a significant step that will drive positive change within the US health care system.


The paparazzi experience

Do you know the look on someone’s face, or in their eyes, when something clicks and you just know they’ve had an epiphany? Many of us refer to that as the light bulb turning on. The past day and a half at the SAS Global Forum Executive Conference has been filled with light bulbs flashing on over and over throughout the convention center. It was like the paparazzi were everywhere!

Making decisions without insight

The massive keynote ballroom had six huge blue ring-type structures dangling high from the ceiling, giving all pause to wonder, and offered uniquely arranged seating options ranging from traditional straight-back rows of chairs to leather lounge and mod pod chairs to high-top bar-style tables and chairs.

The keynotes by retired General Colin Powell, former senior CIA and FBI official Philip Mudd and business advisor Geoffrey Moore were fascinating discussions about the challenges leaders face in making accurate, timely decisions. They offered some great ideas for overcoming those challenges, and cautioned that leaders must balance instinct with relevant insight from data in systems of record and analysis from systems of intelligence.

The concepts were complex but energizing. They talked about many ideas, including ways to approach automation of transactional data for national security, leveraging unstructured text and social media banter and better understanding our own businesses and exposures.

Every industry represented discovered new ways to approach existing data and leverage a variety of other data sources to master their own business ecosystem. Among my healthcare colleagues, we began to explore what these new angles of thought now meant to providing quality care, engaging patients in their health decisions, improving the care continuum with physician contact points and detecting symptoms early to avoid a catastrophic health incident.

What were those six big blue rings?

As the hall emptied, the rings slowly descended, unfurling blue curtains – each forming a ringed room with its own audio and presentation capabilities. The rooms hosted three rounds of six concurrent presentations that filled the afternoon with stories, examples and lessons in areas such as:

  • Analytic approaches to better understanding your customer.
  • Using unstructured text sources to improve business processes.
  • Creating innovation or idea labs to tighten the ratio between effort and insight.
  • Streamlining detection and prevention of improper payments and fraud.
  • Maximizing the learning curve through data and analytic visualization.

The light bulbs continued going on for us as we listened. We were asked to approach each session considering eight metrics that matter in using data in our businesses:

  • ROI – Return on investment and return on insight are a must.
  • Data Governance – Is it relevant?
  • Productivity – Will this support the analytic life cycle?
  • Timeliness – Will the analytic value be delivered in the time window I need?
  • Accuracy – Are we spending the time to ensure the data is accurate? What is the impact if not?
  • Effectiveness – What is the impact on the pool of staff talent? Is it time to bring on a data scientist?
  • Empowerment – Will this increase self-sufficiency?
  • Maturity – Where are we in terms of people, process and culture – beyond technology?

The quote of the day

There is one simple observation made during the conference that sticks with me. Before a packed audience, Bryan Sivak – an entrepreneur and innovator currently filling the CTO role at the US Department of Health and Human Services – shared how increasing amounts of data are being liberated for use in analytics across a wide variety of public-private partnerships, with a direct impact on improving healthcare and lowering costs. He established an IDEA Lab that focuses on rapidly moving from concept to operationalizing solutions that depend on data and analysis. At the end, someone in the audience rose and said, “You are a breath of fresh air!” and the audience nodded quickly in affirmation.

Not the most profound quote perhaps but indicative of the value each session brought to all of us in attendance. Indeed fresh air was found in every tent that afternoon.

What does this all mean to the attendees, me included, when we get back to our own desks? Stay tuned for my next post.
