Some organizations I visit don’t seem to have changed their analytics technology environment much since the early days of IT. I often encounter companies with 1970s-era statistical packages running on mainframes or large servers, data warehouses (a technology that originated in the 1980s), and a raft of reporting applications. These tools usually continue to work, and there is a natural, but dangerous, human tendency to leave well enough alone.
Vendors have produced a variety of new tools for predictive and prescriptive analytics, and they offer many custom analytics solutions for industry-specific problems such as anti-money laundering in banking or churn prevention in telecom. Visual analytics are easier to generate and far more appealing, and some tools even recommend which visuals would best depict a particular type of data or variable relationship.
New tools are leading to a dramatic increase in the speed and scale at which analytics can be performed, and to much greater integration with business processes. I often refer to this set of changes as “Analytics 3.0,” and many large, established firms have adopted these approaches. They make it possible to make analytical decisions in near-real time, which often yields benefits in the form of increased conversions, optimized operations, and other improvements. And the process of generating analytical models has become substantially more agile.
Not taking advantage of these analytics modernization opportunities has substantial competitive implications. The five-step process I outline below will get you started on the road to analytics modernization.
Step 1: Assess technical and human capabilities. As you might expect, modernization often begins with an assessment. What can’t we do today that we would like to do? What new technology would enable a faster or better approach to analytics? Where in the current analytical architecture are existing technologies outdated?
As an organization begins to identify promising new analytical technologies, it should also assess its human capabilities. Some data management technologies, including Hadoop, require special skills that may be difficult or expensive to hire in some markets. New analytical software may require familiarity with different statistical or mathematical methods than current staff have mastered. Some skill requirements, of course, can be addressed through retraining rather than new hires.
Step 2: Proof of concept for new technology. Since analytics modernization involves use of a new set of technologies, companies will usually want to gain experience with them on a small scale before jumping in wholeheartedly. Modernization initiatives often proceed quickly to proofs of concept (POCs) rather than full production projects. POCs make it possible to rapidly confirm that a new technology can deliver value and that it can function within an existing technology architecture.
Step 3: Begin redesigning the work process. At the same time new technologies are being explored, companies will want to begin addressing process changes that the new technologies enable. It is great that an analytical result can now be obtained in seconds or minutes rather than hours or days, but that must lead to changes in the business or decision process in which the result is employed. It may mean, for example, that multiple modeling alternatives can be explored, or that customers can be given a much faster response. It’s never too early to begin designing the new technology-enabled process. Early process prototypes can be refined and put into production over time just as technology POCs lead to production applications.
Step 4: Address business/IT relationships. Companies that succeed with leading-edge analytics on a large scale tend to have close and collaborative relationships between business leaders and IT organizations. On a small scale, of course, it’s possible for business groups to adopt and implement analytical technology on their own. However, as applications and architectures grow, the majority of business groups will find it burdensome to maintain these initiatives by themselves. Collaboration between business and IT groups from the beginning will minimize disruptive midstream handoffs.
In some cases, IT/business partnerships can accelerate analytical initiatives at an early stage. At Intel, for example, a partnership between the IT and Strategy groups to advance analytics across the company led to a company-wide analytics summit, identification of barriers, and approaches to educating senior management about business opportunities from analytics.
Step 5: Measure outcomes. Since the analytics technology environment will continue to change and require new investment, it’s important to measure each project involving new technology in order to demonstrate the value of modernization. The easiest way to measure outcomes is at the level of individual applications. A common approach is to assess how a decision was made before the new analytics were in place, and what the old approach cost and what value it produced; that baseline can then be compared with any new approach.
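To make that comparison concrete, here is a minimal sketch of an application-level calculation. All of the figures and parameter names are hypothetical placeholders, not benchmarks; the point is simply to net the value of decisions against the cost of the process, before and after modernization.

```python
# Minimal sketch of application-level outcome measurement.
# All numbers below are hypothetical and should be replaced with
# figures gathered during the before/after assessment.

def net_value(decisions_per_year: int, value_per_decision: float,
              annual_cost: float) -> float:
    """Annual value generated by a decision process, net of its cost."""
    return decisions_per_year * value_per_decision - annual_cost

# Old approach: slower, batch-oriented decisions (hypothetical figures).
old = net_value(decisions_per_year=50_000, value_per_decision=4.00,
                annual_cost=150_000)

# New approach: near-real-time decisions reach more cases and lift value
# per decision, at a higher technology cost (hypothetical figures).
new = net_value(decisions_per_year=200_000, value_per_decision=5.50,
                annual_cost=400_000)

print(f"Old approach net value:  ${old:,.0f}")
print(f"New approach net value:  ${new:,.0f}")
print(f"Incremental benefit:     ${new - old:,.0f}")
```

Even a rough calculation like this, repeated for each modernized application, builds the track record needed to justify the next round of technology investment.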
There is little doubt that we are in the age of analytics, and strong analytical capabilities demand modern analytics technologies. Those companies that have invested in new technologies have achieved great results. I know of few types of investments that will yield greater returns.
Learn more about analytic modernization from SAS and Deloitte.
1 Comment
We are on the verge of a disruptive industrial transformation, driven by the rapidly expanding information emitted from every part of our industrial base. In health care, we are seeing the convergence of exponentially growing ‘omic’ data, the advancing field of biomedical informatics, and the emerging frontier of population health management. This information should be harnessed with cognitive technologies such as machine learning, neural networks, and artificial intelligence to automate tasks, augment performance, and amplify predictive outcomes.
To optimize outcomes, the information generated from these converging sources should be stored safely in databases that are architecturally designed for rapid retrieval, analysis, and manipulation.