Health analytics – Rapidly gaining ground

Having long ago witnessed the power of analytics to improve performance, efficiency, cost and quality of online banking and investment services, I have been an advocate and evangelist of its power to do the same in health care. That’s one reason why I’m excited about the recent tidal wave of news, articles, blogs, announcements and public dialogue about the value of analytics in health care.

A recent Health Affairs article captured my attention partly because it was authored exclusively by industry clinicians and academicians, not by technology vendors. The article, Big Data In Health Care: Using Analytics To Identify And Manage High-Risk And High-Cost Patients,^ acknowledges “unprecedented opportunities” to use big data to reduce the costs of health care. Perhaps most importantly, the authors identify six specific opportunities where analytics could and should be used to reduce cost:

  1. Identify early – and proactively manage – high-cost patients.
  2. Tailor interventions to reduce readmissions (see the sketch after this list).
  3. Estimate risk to improve triage.
  4. Filter out noise to detect valid signals of decompensation.
  5. Predict severity of illness to prevent adverse events.
  6. Optimize treatment for diseases affecting multiple organ systems. 
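
As a concrete illustration of opportunity 2 (and, by extension, 3 and 5), the model behind a readmission-reduction program is often a straightforward classifier trained on historical encounters and used to flag high-risk patients for tailored follow-up. The sketch below is a minimal, hypothetical example in Python with scikit-learn; the feature names and the synthetic data are invented for illustration and are not drawn from the Health Affairs article.

```python
# Minimal, hypothetical sketch of a 30-day readmission risk model.
# Feature names and data are invented for illustration only.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000
encounters = pd.DataFrame({
    "age": rng.integers(18, 95, n),
    "prior_admissions": rng.poisson(1.2, n),
    "length_of_stay": rng.integers(1, 21, n),
    "num_chronic_conditions": rng.poisson(2.0, n),
})
# Synthetic outcome: risk rises with prior admissions and comorbidity burden.
logit = -3 + 0.6 * encounters["prior_admissions"] + 0.4 * encounters["num_chronic_conditions"]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X_train, X_test, y_train, y_test = train_test_split(encounters, y, test_size=0.25, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

scores = model.predict_proba(X_test)[:, 1]            # per-patient risk scores
print("AUC:", round(roc_auc_score(y_test, scores), 3))
# Patients above a chosen threshold would be flagged for tailored follow-up.
flagged = X_test[scores > 0.3]
```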

It's encouraging to see health care executives acknowledging the need for analytical competency in their organizations; to see the US Congress acknowledging the need for data transparency and interoperability; to hear clinicians asking for analytically derived decision support tools; to watch prestigious academic organizations expanding advanced degree programs in health informatics and biostatistics; and to hear health IT organizations demanding interoperability of data between EMRs (electronic medical records) and their other systems.

I'm delighted by the article above, and by the early wins and impressive results generated in these six areas by friends and colleagues who are using advanced analytics to surface insights in their organizations across the globe. For example, our friends at the UNC School of Medicine are exploring the utility of big data for predicting exacerbations in diabetic patients, an innovation with the potential to simultaneously address items 1 (high-cost patients), 2 (tailored interventions) and 5 (predicting to prevent) from the list above.

Another example is the work being done at the Department of Orthopedic Surgery at Denmark's Lillebælt Hospital to use text analytics in automated clinical audits to detect and correct errors. The Lillebælt innovation demonstrates the efficiency gains made possible only through automation and the power to prevent patient injury at a scale which would otherwise be cost-prohibitive.
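
To be clear, what follows is not the Lillebælt team's actual method; it's only a minimal sketch of what a rule-based text audit can look like. The notes, keywords and audit rule below are invented for illustration: scan free-text surgical notes for records that never mention prophylaxis and flag them for human review. A production system would rely on a full text-analytics pipeline and clinically validated rules.

```python
# Minimal, hypothetical sketch of an automated clinical text audit.
# The rule, keywords and notes are invented for illustration only.
import re

notes = {
    "patient-001": "Hip replacement performed. Post-op prophylaxis with heparin ordered.",
    "patient-002": "Knee arthroplasty performed. Patient mobilized on day 1.",
    "patient-003": "Hip replacement performed. Prophylaxis: low-molecular-weight heparin.",
}

# Hypothetical audit rule: surgical notes should document prophylaxis.
surgery_pattern = re.compile(r"\b(replacement|arthroplasty)\b", re.IGNORECASE)
prophylaxis_pattern = re.compile(r"\b(prophylaxis|heparin)\b", re.IGNORECASE)

flagged = [
    patient_id
    for patient_id, text in notes.items()
    if surgery_pattern.search(text) and not prophylaxis_pattern.search(text)
]
print("Records needing review:", flagged)   # -> ['patient-002']
```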

Perhaps the most exciting news of late is the announcement that Dignity Health is partnering with SAS to build a cloud-based big data analytics platform to enable value-based health care. In my opinion, this announcement represents a systemwide commitment to adopting health analytics as a core competency and puts Dignity Health on the road to realizing value in all six of the areas identified by Bates and colleagues, and many more too numerous to list.

These are leading indicators that health care is modernizing and, I’m confident, will ultimately showcase the power of analytics to improve health care. The bottom line: Advanced health analytics is gaining ground in the industry and is picking up speed as more and more providers realize The Power to Know®.

Reference:

^ Bates, D.W., Saria, S., Ohno-Machado, L., Shah, A., and Escobar, G. (July 2014). "Big Data In Health Care: Using Analytics To Identify And Manage High-Risk And High-Cost Patients." Health Affairs 33, no. 7.


The value of big data – Part 3: “Big something”

I seem to write quite a few posts downplaying the idea of big data, but to be quite honest, buzzwords like that tend to annoy me. They take attention away from the underlying problems, and the more we use these terms, the less real meaning they seem to have. Saying “I have big data” seems to be a reflex, much like that embarrassing moment when the person behind the ticket counter tells you to enjoy the movie and you say “Thanks, you too.” You aren’t quite thinking about what you are saying, but you mean well when you do it.

While the term big data is traditionally defined in a relative manner (allowing everyone to share in the joy of having it), I think it should be reserved for specific things. If, for instance, you have big data because you bought far more data than you can consume simply because you had money allocated for it, you have a “big spender.” If you have big data but you still don’t know how to analyze and utilize the data you’ve had for the last five years, then you have a “big dreamer.” And finally, if you have big data and you don’t have any idea how you got there, well, then you simply have a “big problem.”

The term big data should be reserved for when otherwise carefully curated and managed sources of data and analytics undergo a fundamental paradigm shift that changes their volume, variety, velocity and/or value. When your previously well-managed EMR analytics group is suddenly able to use text mining to look at 10 years of unstructured data, you have big data. When your data provider is now able to give you daily feeds of data instead of quarterly, you have big data. When your genetic assay test drops from $400 to $4 and you can collect 100x as much information as you could before, you have big data.

This doesn’t mean that the people with a “big spender,” “big dreamer” or “big problem” don’t have real issues that advances in analytics and technology may be able to resolve. Rather, the problems they need to solve are of a different nature. While it may seem like we are all swimming in the same big data pool, it’s important to keep in mind where you dove in from before you start trying to swim.


Desiderata for enterprise health analytics in the 21st century

With apologies and acknowledgments to Dr. James Cimino, whose landmark paper on controlled medical terminologies still sets a challenging bar for vocabulary developers, standards organizations and vendors, I humbly propose a set of new desiderata for analytic systems in health care. These desiderata are, by definition, a list of highly desirable attributes that organizations should consider as a whole as they lay out their health analytics strategy – rather than adopting a piecemeal approach. They form the foundation for the collaboration that we at SAS have underway with Dignity Health.

The problem with today’s business intelligence infrastructure is that it was never conceived of as a true enterprise analytics platform, and it definitely wasn’t architected for the big data needs of today or tomorrow. Many, in fact probably most, health care delivery organizations have allowed their analytic infrastructure to evolve in what a charitable person might describe as controlled anarchy. There has always been some level of demand for executive dashboards, which has led to IT investment in homegrown, centralized, monolithic, relational database-centric enterprise data warehouses (EDWs) with one or more online analytical processing-type systems (such as Crystal Reports, Cognos or BusinessObjects) grafted on top to create the end-user-facing reports. Over time, departmental reporting systems have continued to grow up like weeds, and data integration and data quality have become a cottage industry that can never keep up with end-user demands. Something has to change. We’re working with Dignity Health to showcase what an advanced enterprise analytics architecture looks like and the transformations that it can enable.

Here are the desiderata that you should consider as you develop your analytic strategy:

  1. Define your analytic core platform and standardize. As organizations mature, they begin to standardize on the suite of enterprise applications they will use. This helps to control processes and reduces the complexity and ambiguity associated with having multiple systems of record. As with other enterprise applications, such as the electronic health record (EHR), you need to define which processes require high levels of centralized control and which can be configured locally. For the EHR, it’s important to have a single architecture for enterprise orders management, rules, results reporting and documentation engines, with support for local adaptability. Similarly, with enterprise analytics it’s important to have a single architecture for data integration, data quality, data storage, enterprise dashboards and report generation – as well as forecasting, predictive modelling, machine learning and optimization.
  2. Wrap your EDW with Hadoop. We’re entering an era where it’s easier to store everything than decide which data to throw away. Hadoop is an example of a technology that anticipates and enables this new era of data abundance. Use it as a staging area and ensure that your data quality and data transformation strategy incorporates and leverages Hadoop as a highly cost-effective storage and massively scalable query environment.
  3. Assume mobile and web as primary interaction. Although a small number of folks enjoy being glued to their computer, most don’t. Plan for this by making sure that your enterprise analytic tools are web-based and can be used from anywhere on any device that supports a web browser.
  4. Develop purpose-specific analytic marts. You don’t need all the data all the time. Pick the data you need for specific use cases and pull it into optimized analytic marts. Refresh the marts automatically based on rules, and apply any remaining transformation, cleansing and data augmentation routines on the way inbound to the mart.
  5. Leverage cloud for storage and Analytics as a Service (AaaS). Cloud-based analytic platforms will become more and more pervasive due to the price/performance advantage. There’s a reason that other industries are flocking to cloud-based enterprise storage and computing capacity, and the same dynamics hold true in health care. If your strategy doesn’t include a cloud-based component, you’re going to pay too much and be forced to innovate at a very slow pace.
  6. Adopt emerging standards for data integration. Analytic insights are moving away from purely retrospective dashboards toward real-time notification and alerting. Getting data to your analytic engine in a timely fashion becomes essential; therefore, look to emerging standards like FHIR, SPARQL and SMART as ways to provide two-way integration of your analytic engine with workflow-based applications (a minimal sketch follows this list).
  7. Establish a knowledge management architecture. Over time, your enterprise analytic architecture will become full of rules, reports, simulations and predictive models. These all need to be curated in a managed fashion to allow you to inventory and track the lifecycle of your knowledge assets. Ideally, you should be able to include other knowledge assets (such as order sets, rules and documentation templates), as well as your analytic assets.
  8. Support decentralization and democratization. Although you’ll want to control certain aspects of enterprise analytics through some form of Center of Excellence, it will be important to provide controlled access so that regional and point-of-service teams can innovate at the periphery without having to submit change requests to a centralized team. Centralized models can never scale to meet demand, and local teams need to be given some guardrails within which to operate. Make sure to have this defined and managed tightly.
  9. Create a social layer. Analytics aren’t static reports anymore. The expectation from your users is that they can interact with, comment on and share the insights that they develop and that are provided to them. Folks expect two-way communication with report and predictive model creators, and they don’t want to wait to schedule a meeting to discuss it. Overlay a portal layer that encourages and anticipates a community of learning.
  10. Make it easily actionable. If analytics are just static reports, drill-downs or risk scores, users will start to ignore them. Analytic insights should be thought of as decision support, and the well-learned rules from EHRs apply to analytics too: provide the insights in the context of the user’s workflow, make it easy to understand what is being communicated, and make it easily actionable – allow users to take recommended actions rather than trying to guess what they might need to do next.
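
To make item 6 a bit more concrete, here is a minimal sketch of the read half of that two-way integration: a RESTful FHIR query that pulls lab observations out of a clinical system so they can feed an analytic mart or a real-time model. It’s written in Python with the requests library; the endpoint, patient ID, and the omission of paging, error handling and authentication are all simplifications for illustration.

```python
# Minimal sketch of pulling observations from a FHIR server toward an
# analytic mart. The base URL and patient ID are placeholders; paging,
# error handling and authentication are omitted for brevity.
import requests

FHIR_BASE = "https://fhir.example.org/r4"          # hypothetical endpoint
params = {
    "patient": "example-patient-id",
    "code": "http://loinc.org|4548-4",             # LOINC code for hemoglobin A1c
    "_count": 50,
}
bundle = requests.get(f"{FHIR_BASE}/Observation", params=params, timeout=30).json()

rows = []
for entry in bundle.get("entry", []):
    obs = entry["resource"]
    value = obs.get("valueQuantity", {})
    rows.append({
        "patient": params["patient"],
        "effective": obs.get("effectiveDateTime"),
        "value": value.get("value"),
        "unit": value.get("unit"),
    })

# 'rows' could now be loaded into a purpose-specific analytic mart (item 4)
# or scored by a predictive model whose output flows back into the workflow.
print(rows)
```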

Thanks for reading, and please let me know what you think. Do these desiderata resonate with you? Are we missing anything essential? Or is this a reasonable baseline for organizations to get started?

We’ll be sure to update you as our collaboration with Dignity Health progresses.


The value of big data – Part 2: The perceived ultimatum of big data

There almost seems to be a perception that the value of big data is only truly realized when we process all of it. The way we talk about big data, while generally optimistic, has an almost ominous feel to it. As though, if we fail to tackle all of big data in the next 12 months, Rumpelstiltskin will come to take away our firstborn.

I would counter that anyone who says they can “tackle and resolve” their big data within 12 months does not have true big data. They may have a lot of data, and they may not have enough resources to consume it, but they don’t face a challenge on the scale of big data. At the same time, I would say that even those with the biggest of big data can start deriving meaningful value from that data quickly.

Successful delivery against big data requires meeting two general objectives. The first (and usually the most focused upon) is the enhanced software and hardware infrastructure that can be used to churn through the entire universe of data. This is often an expensive and potentially time-consuming task. If it weren’t, you probably wouldn’t give the process a second thought; you’d just go along your merry way.

The second objective (and one that can potentially be tackled more readily) is a focus on learning and understanding the constituent data. Regardless of the source of your big data, it can all be thought of as the amalgamation of many pieces of “small data.” These individual sources may be nuanced and complex, but when taken on one at a time, the challenge they pose is greatly diminished.

My first job out of school involved the analysis of a large national survey. The volume of data would seem almost laughably small compared to the sources we look at now, and yet the team dedicated to analyzing it was larger than the entire analytics group at many companies I’ve talked to in the health care industry. Even with this pool of dedicated, highly trained and focused individuals, I don’t think any of us would say that we had mined the data to exhaustion. There was constantly more value we could extract, and likely there always will be.

Certainly we hit points of diminishing returns, where truly insightful new discoveries in the data were few and far between. And I know for a fact that my colleagues and I would have loved the opportunity to bring this data together with other sources and see what we could find out – but we didn’t have that luxury. That said, I can’t remember a week going by when someone didn’t have a new idea, method or approach to tackle the data that brought out a little something new.

The point is that there is still value remaining in the data you can access and evaluate now. I don’t mean to downplay the value of what a big data-enabled platform can deliver, but rather to remind us all that even small, focused, incremental growth in understanding and utilizing the smaller building blocks of data will not only prepare us for reaping the long-term value of the big data conglomeration, but will likely provide meaningful insight along the way.

In Part 3 of this series, I’ll narrow down how to define big data, which is critical to understanding how best to utilize it.


The value of big data – Part 1: Big board games

I collect board games, and probably, I collect too many of them. Each game is different and has its own charm and value. Some are fun for large groups, others work best when you play them one-on-one. Sometimes what draws me to a game is a great theme and sometimes it’s a novel mechanic. Regardless, there is something about all of them that makes me want them.

My shelves are getting full of great games that I’ve never played; some still have the shrink-wrap on them, never opened and likely never to be. To be honest, I have a problem. I have a wealth of entertainment value but no real way to appreciate it. When I buy games, I ignore the fact that I already have 20 games sitting ready to be played that haven’t been touched. And among those games I have played, rarely have I truly mined their depths; rather, I end up playing them once or twice and moving on before getting all their worth.

Yes, I have a problem, and I call that problem “Big Board Games.” I have games of great variety and volume, acquired at high velocity and all having (at least to me) great value. Most of you reading this probably think my games are a waste of money or, at best, an interesting quirk for a statistician. But if I had told the same story in the context of data, we would think of it as a source of universal pride and envy.

It is a rare case that we stumble upon big data by accident or surprise. It is a deliberate process by which we desire and acquire new data sources. Some sources come from changes in technology that allow us to capture more detailed or more frequent data, perhaps a new type of genetic assay. Some sources come from vendors who sell EMR, claims or other rich sources of data. Sometimes we get lucky and find new ways to analyze previously inaccessible data; for instance, text mining of EMR notes from a previously purchased source. But all of it we knowingly capture, and all of it comes at a cost to us.

Granted, I’m a bit long-winded in getting to the point, but the point is a valid one. The acquisition of big data in and of itself isn’t a point of pride. Rather, the presence of big data indicates a data acquisition strategy that may be out of pace with the corresponding analytic support capability. We will all face times when we step into the realm of big data; this is a necessary product of growth and exploration. But those periods should be short-lived and carefully evaluated. We should be asking ourselves what pushed us into a state of big data, whether the corresponding value is worth the cost of acquisition – and, more importantly, whether it is worth the cost of utilization.

When I store a new batch of board games on my shelves at home, I find I often have an introspective moment. As I rearrange and push aside old games I’m forced to ask myself whether these new games truly add value to my collection and enjoyment. Through acquisition of games, even great games, my ability to play isn’t increased, yet the potential pool for demand is growing constantly. Some games must be diminished, others forgotten almost completely; and yet, while I’m aware of all of this potential waste and lost value, there’s always one other great game that I just have to get next.

And so it goes with big data, as we’ll see in Parts 2 and 3 of this series.


Clinical trial data transparency – making the commitment

A great deal of progress has been made to increase data transparency for clinical trials. The sponsors who have rapidly increased transparency deserve our gratitude. Thanks to them, now more than ever, further progress is possible through commitments to action on data transparency by all sponsor organizations.

Data transparency gives authorized researchers access to the clinical trial data on which the regulatory decisions for new medicines are based. In late 2013, I wrote about “Five Steps to Success” with clinical trial data transparency. These five steps were:

  1. Decide to participate.
  2. Make the data decisions.
  3. Implement a governance process.
  4. Deploy a secure analytics repository.
  5. Inform the research community and the public.

Considerable progress in a very short time

In a very short timeframe, many biopharmaceutical companies have elected to make clinical trial data and supporting documents available to researchers through various approaches, such as:

  1. Clinicalstudydatarequest.com, which currently provides study information for eight clinical trial sponsors.
  2. University-affiliated “independent review committees.” At least one company is using the Yale School of Medicine’s Open Data Access Project (YODA) to gather and review researcher requests. In addition, the Duke Clinical Research Institute (DCRI) is providing an independent review committee to review researcher data access requests for a biopharmaceutical company.
  3. Consortiums such as Project Data Sphere – an initiative of the CEO Roundtable on Cancer’s Life Sciences Consortium.
  4. Pharma company websites. Some organizations are providing the ability to request clinical data access from their own company website.

While the list of organizations that have committed to greater transparency grows with each passing month, some sponsors of clinical trials have not taken action.

Some companies are waiting

Although a great deal of progress has been made in a very short time by the early pioneers of clinical trial data transparency, some biopharmaceutical organizations are waiting to take action toward providing researchers with access to anonymized patient-level data from their clinical trials. Why take a wait-and-see approach? Perhaps these organizations are waiting for a regulatory mandate. Or maybe they hope the stakeholders requesting greater transparency will be satisfied by the current progress.

The bottom line is that the stakeholders demanding greater transparency into clinical trial results expect full participation and not partial transparency.

Commitment requires action

Delaying the decision to participate isn’t a sound strategy, and here’s why:

  • High participation rates from all sponsors can help to prevent further scrutiny from regulatory authorities and other stakeholders.
  • Regulatory mandates may be more onerous if the entire industry is not showing progress by taking immediate action to implement data transparency.
  • The stakeholders are resolute. You can expect the advocates for greater transparency to be extremely dedicated – even tenacious – in pursuing access to clinical study data.

Increased access to clinical trial data for authorized researchers can be a valuable resource for the discovery of new medical knowledge. The new insights will have the potential to improve public health for decades to come. Greater transparency also helps to increase public trust in the industry. Commitment and action from all sponsors of clinical research will help to make data transparency a reality.

To learn more about how SAS can help, visit our clinical data transparency page on sas.com.


Visual analytics – A step in the right direction

I just returned from the invigorating Healthcare Analytics Symposium & Expo in Chicago. The event was focused on the analytical competencies needed to navigate the shifting landscape of health care reform and traverse the gap between FFS (fee for service) and FFV (fee for value). Although the event is only in its third year, I wasn't surprised to see attendance double from last year. The word is out: analytical competency will be critical to everyone’s success in this industry.

Two themes seemed to dominate the Q&A, hallway discussions and conversations with visitors to the SAS booth: first, the difference between business intelligence (BI) and analytics; and second, the need to predict and manage both clinical and financial risk. This was very encouraging to me, because last year’s themes were very different. Last year, many participants assumed that BI (or "reporting") and analytics were synonymous, and that their most urgent need was to build a clinical data warehouse (CDW) for reporting and scorecards. Fortunately, many have since learned the differences between reporting and predictive modeling and are beginning to warm to the notion that a CDW is not necessarily their best first step on the road to analytical competence.

Necessity is the mother of invention: most health care organizations have lots of data trapped in their EMR, but don't yet have a CDW to extract and organize the data for reporting. As such, almost everyone I spoke with this year was seeking advice and answers to the following two questions:

  1. How can I do basic yet meaningful clinical analytics without a data warehouse?
  2. How can I do basic yet meaningful clinical analytics without a staff of bioinformaticians?

The answer to both questions is surprisingly simple – use SAS® Visual Analytics.

A mere two years ago, the answers to both of those questions were far more complex. Now, things are much simpler. SAS Visual Analytics is a powerful yet intuitive tool for exploring data and building reports that doesn't require a prebuilt warehouse with cubes and marts. It's so easy to use that it requires no training to get started. Best of all, you don't have to be a biostatistician to perform correlation analysis or create a quick forecast.

This solution represents tremendous opportunity for the use of analytics in health care for several reasons:

  • It reduces the barrier to entry for non-statisticians to engage in analytical decision-making.
  • It dramatically reduces the time-to-value for investment in analytical capabilities.
  • It reduces dependence on already over-burdened HIT organizations.
  • It's a steppingstone toward a more sophisticated level of analytical maturity.
  • It makes analytical insights accessible to clinicians who don't have the time or energy to gather them any other way.

The conference reminded me that many health care organizations are just beginning their journey to analytical competence. An old adage comes to mind, “Think big, start small, and move fast!” SAS Visual Analytics provides the opportunity to do just that; don't wait on that warehouse ... get started now. Go online and give it a try.


Prescription for analytics: Ten smart moves to develop your strategy and culture

The health care industry has only scratched the surface of what big data analytics can do to advance the triple aim. It’s an open secret that if health and life science companies could analyze the big health data they already have – even before bringing in novel data sources – they could gain unprecedented insight to make enormous improvements in health care delivery, costs and outcomes.

However, much of this opportunity is untapped – not due to technologic or even data constraints, but cultural ones. I see it every day in my work as part of SAS Best Practices. Changing the culture is about adapting mindsets and business processes – creating an environment where people understand, value and demand fact-based decisions and strategies.

At the 11th annual SAS Health Analytics Executive Conference last week, I spoke with experts who shared what’s worked for them as they developed an analytic culture in their organizations. I’ve boiled this valuable information down to ten smart approaches for transforming from within:

  1. Think big, start small and move fast.
    It’s great to have a vision, a strategy and a dream of where you want to get to, but start small and move fast with getting those things done. Too broad a vision and you’ll never get off the ground. And because breaking new ground is inherently risky, it’s important to not only move fast but to fail fast and move on quickly if a particular analytic experiment comes up empty.
  2. Don’t baffle them with equations.
    Most people have a lingering math phobia from school days. They see a statistician coming and fear you’re going to smother them with equations. Tell stories and provide relatable analogies, and people don’t have to be afraid.
  3. Present a business case, not an analytics case.
    The quants amongst us think the beauty of analytics should be self-evident. Even so, you have to present your case in business terms, with a plan to deliver quantifiable costs and benefits. This makes it hard for that C-(Anything)-O to say no.
  4. Show them you’re really going to help them.
    Understand – and communicate to your consumers – that an analytics project is not about generating another report. Rather, the objective is to integrate new insights into business processes and practices to help the folks on the ground do what they need – and want – to do more efficiently and effectively.
  5. Show tangible results in their terms.
    Show how you saved money, avoided an erroneous conclusion or suboptimal decision, improved the ability to respond to a patient’s needs, positively impacted outcomes or made a better drug development determination earlier in the process. When you can do that, you gain credibility. Push turns to pull.
  6. Get people to believe in models.
    Yes, scientists and medical professionals are analytical by nature. This doesn’t mean, however, they naturally gravitate to or embrace analytics. Sometimes you have to convince them that models really can represent complex systems. Not sure how? See #s 2, 4 and 5.
  7. Enable capabilities, not applications.
    When you build a broad capability for analytic discovery – the best performance envelope you can – people will surprise you with the innovations they can bring to the table. To commandeer an old saying: teach them to fish. Provide the pole. And perhaps even an initial pond. With some coaching, they’ll take it from there.
  8. Show them the possibilities.
    Business users don’t know what the math can do, so they’re not going to ask you for it. Data scientists might have to be the ones saying, “Here’s something really valuable we can do for you.” Then their eyes start to light up. But remember, action trumps interesting. Make sure you’re speaking to real-world problems and opportunities that matter to them. Not just to you.
  9. Consider an analytics center of excellence (COE).
    Most organizations have pockets of analytics expertise, but you gain great synergies and economies of scale by bringing your brightest stars together to focus on the most important analytical approaches. A center of excellence mobilizes analytic resources for the good of the enterprise – not just for specific business units or one-off projects. Used correctly, a COE can play a critical role in transforming your organizational culture into one that embraces the value of analytics-driven decisions.
  10. Make it irresistibly sexy.
    If you’ve laid the groundwork – such as building an advanced analytics laboratory – show it off! Invite folks in and experiment together. Share the technology in a very visual way. People will see that when they ask what-if questions, they can get answers instantly. That’s exciting, and inspires everyone to keep asking more questions.

If you missed our Virtual Conference, I invite you to watch the session, An Analytic Prescription: Developing a Robust Strategy and Culture, as well as several others available on-demand.


Caution: Don’t underestimate the promise of analytics

Big data analytics have ushered in the potential to dramatically improve health care costs, the care experience and patient outcomes. Masses of genomic data, clinical trial data, electronic health records (EHRs), claims data and research study data can be brought together to reveal important discoveries, support better operational and medical decisions and, ideally, bring analytic insight right to the point of care.

As Director of Business Strategies for SAS Best Practices, I’ve seen a lot of change in the last 15 years in both the awareness and the execution of analytics programs to drive value. At the 11th annual SAS Health Analytics Executive Conference, I had the privilege of talking with some folks who are leading the analytics charge in pharmaceutical development and integrated health care delivery. In a panel discussion, they described cases where their organizations are applying analytics to improve quality, close care gaps and accelerate scientific discovery.

  • Quality. Preventable medical errors are a leading cause of death in the US – higher than diabetes. Imagine the opportunity to use data and analytics to reduce such events. Analysis of EHRs can be used to automatically detect misdiagnoses, monitor medication use and better assess risks to provide the optimal therapeutic interventions.
  • Completeness of care. The health care industry lags financial services and retail in the ability to create a 360-degree view of the customer/patient/member and deliver the right insights at the right moment. Imagine the potential to deliver analytic results into the exam room while the provider is with the patient, making care decisions. The potential exists to customize not only what, but how, information is communicated to a patient and to adjust therapeutic approaches throughout the continuum of care based on a comprehensive understanding of not only a person’s medical history, but their personal preferences, family history and demonstrated behaviors (just to name a few).
  • Speed of scientific discovery. Did you know that it takes 12 to 14 years and at least $1 billion to develop a drug? That staggering fact was shared by one of our conference executives from a leading pharmaceutical company. But analytics can change that. Life sciences organizations are using analytics to design better clinical development programs that identify effective drugs more quickly, and also to identify subgroups of patients for whom a therapy will work particularly well.
  • Real-world understanding. Pharmaceutical companies are working to integrate clinical trial data with real-world evidence. The scientific mandate: we need to do the correct science to create the correct drugs. But there’s also the practical mandate: how are we positively affecting or changing people’s lives?

The bottom line

Those in the health and life sciences professions have a mission to help those they serve. But whether your organization is categorized as for-profit or not-for-profit, if your bottom line isn’t in the black, you’re not going to be around to help anyone. Analytics can be used to identify waste and inefficiency as well as to optimize the allocation of limited resources – skilled providers, capital, facilities and so on. Smart, lean operational decisions become ever more critical as we work to transform the health care system from a fee-for-service model to value-based care.

It’s easy to get people to agree in concept about the value of analytics, but it’s not always easy to move from vision to action, from high-level roadmap to implementation. In my next posting I’ll talk about why the industry isn’t exploiting analytics to the degree that it can – and should. Since I work at SAS, I can assure you it’s not a lack of data or analytical horsepower.

If you missed our Virtual Conference, I invite you to watch the session, An Analytic Prescription: Developing a Robust Strategy and Culture, as well as several others available on-demand.


ANALYTICS power life sciences organizations

My favorite sessions at the SAS Health Analytics Executive Conference are the ones where life science customers describe case studies about the value of analytics in their organization. This year’s conference featured outstanding presentations from Xiaoying Wu, MD of Janssen; Alay Shah of Cardinal Health; and Craig Willis of Physicians Pharmacy Alliance.

Data-driven adaptive clinical trial design

Dr. Xiaoying Wu described a major project at Janssen Research and Development to build a data warehouse that supports modelling and simulation and clinical trial protocol design. The project supports adaptive clinical trial design and increases the organization's capability for data-driven decision making.

Before building the data warehouse, it used to take 2 to 14 weeks just to identify the data necessary to answer a question such as “What is the response rate in the placebo arm for disease X?” (a minimal query sketch follows the list below). The tasks would include items such as:

  • Identifying all trials for disease X
  • Locating key documents
  • Locating available data and securing access to the data
  • Understanding previous analyses, and
  • Identifying the right population – among others.
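
For a sense of why a standardized warehouse changes the economics of such questions, here is a minimal, hypothetical sketch (in Python with pandas) of the placebo-arm query once trial data are organized in a common structure. The table layout, column names and values are invented for illustration and are not Janssen's actual data model.

```python
# Hypothetical sketch of the kind of question a standardized clinical data
# warehouse makes quick to answer: placebo-arm response rate for disease X.
# The table layout, columns and values are invented for illustration.
import pandas as pd

subjects = pd.DataFrame({
    "trial_id":   ["T01", "T01", "T02", "T02", "T02"],
    "indication": ["disease_X"] * 5,
    "arm":        ["placebo", "active", "placebo", "placebo", "active"],
    "responder":  [0, 1, 1, 0, 1],
})

placebo = subjects[(subjects["indication"] == "disease_X") & (subjects["arm"] == "placebo")]
print("Placebo response rate:", placebo["responder"].mean())    # pooled across trials
print(placebo.groupby("trial_id")["responder"].mean())          # per-trial rates
```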

Janssen R&D is now able to more rapidly provide insights for adaptive trial design, which will ultimately reduce the cost of clinical trials and shorten time to market – while better meeting the needs of patients. The project was successful thanks to excellent collaboration between business and IT.

GRID unites silos

A second case study of great collaboration between IT and business users was presented by Alay Shah from Cardinal Health.

The grid implementation at Cardinal Health brought together multiple groups across the organization to use the same platform. Their grid now supports about 800 users, and the excellent support from IT now allows SAS programmers to spend more of their time solving business problems.

As a result of the grid implementation, Cardinal Health was able to eliminate silos, reduce annual server repair and maintenance costs, improve server utilization and improve availability for the SAS platform. In addition, programs now run faster. For example, the runtime for price audits was reduced from 3 hours to 30 minutes!

Alay noted that the most important achievement from this project was to improve the relationship between the SAS users in the business and the internal IT services group.

Transforming their business with analytics

Craig Willis from Physicians Pharmacy Alliance told the captivating story of how his fast-growing company is transforming its business with analytics, especially SAS® Visual Analytics and SAS® Office Analytics. Physicians Pharmacy Alliance is a leading medication care management pharmacy that improves adherence and lowers the cost of health care for complex patients.

Physicians Pharmacy Alliance works with managed Medicaid plans to develop a series of analytic algorithms to identify complex patients who are likely to have challenges adhering to their medical regimen. The company needed to find a more scalable way to grow their business than their existing tools provided.

Physicians Pharmacy Alliance has used analytics to transform their business through:

  • Comprehensive claims analysis to drive efficient patient identification
  • Integration of multiple data sources to draw a clear link between adherence and reduced medical spend
  • Automated contractual reporting for their customers.

Learn more

Find out more about Health Analytics by viewing the OnDemand version of the 2014 SAS Health Analytics Conference.
