Assessing and Advancing Analytics Maturity: Part 1


Analytics maturity is a hot topic right now.  Many come to SAS for answers on how to assess their analytics maturity and advance their use of analytics, especially at the corporate level.  I want to share the highlights of what we usually prescribe, from a best-practices perspective, for advancing analytics maturity.  We often give customers these four tips:

Good analytics always begins with quality, pre-staged data

Data used for predictive modeling typically differs from other kinds of enterprise data, particularly purely operational or transactional records.  Advanced analytics almost always requires transactional observations to be pre-summarized and/or transposed into what is called a “one-row-per-subject” analytic base table or data mart.  This is a crucial step for analytic modeling and a topic finally getting more focus and attention.  An example of a one-row-per-subject table would be a unique list of customers with their past two years of monthly purchases aggregated into 24 columns – the ‘one subject’ being ‘customer’ in this case.
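To make the idea concrete, here is a minimal sketch of that transposition step.  The record layout (customer ID, month index, purchase amount) and the names are hypothetical, not a SAS-specific format; the point is simply turning many transaction rows into one wide row per customer.

```python
from collections import defaultdict

# Hypothetical transactional records: (customer_id, month_index 0-23, amount).
transactions = [
    ("C001", 0, 120.00),
    ("C001", 0, 35.50),
    ("C001", 5, 80.00),
    ("C002", 3, 200.00),
]

# Aggregate into one row per subject (the customer), with 24 monthly
# purchase totals spread across 24 columns.
abt = defaultdict(lambda: [0.0] * 24)
for customer, month, amount in transactions:
    abt[customer][month] += amount

# abt["C001"] is now a single 24-column row: both month-0 purchases
# are summed into one cell, ready for modeling.
```

In a real project this reshaping is typically done in the database or with a dedicated data step, but the structure of the output – one subject per row, one period per column – is the same.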

Furthermore, the maxim “garbage in, garbage out” is inherently true for all analytic modeling efforts.  In fact, many modifications and quality checks usually need to be performed on data in order to refine it for consumption by specific statistical techniques.  One such modification might be to add new columns consisting of external or explanatory variables.  These additions might include demographic or macro-economic variables not typically found inside traditional enterprise data warehouses.  Another type of data modification is adding what are called ‘derived fields’ – columns created by transforming existing inputs.  Changes like these are usually necessary to obtain the most value from a modeling data set, and together they comprise a process commonly called ‘analytics data-staging’.
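A few common derived-field transformations can be sketched in a couple of lines.  The field names and values below are invented for illustration; the transformations themselves (ratios, rates, log transforms to tame skewed inputs) are the kinds of changes typically meant here.

```python
import math

# One customer row with hypothetical raw inputs.
row = {"total_spend": 1200.0, "num_orders": 8, "tenure_months": 24}

# Derived fields: new columns built by transforming existing ones.
row["avg_order_value"] = row["total_spend"] / row["num_orders"]     # ratio
row["spend_per_month"] = row["total_spend"] / row["tenure_months"]  # rate
row["log_spend"] = math.log1p(row["total_spend"])                   # tame skew
```

None of these values exist in the source data, yet fields like these often carry more predictive signal than the raw inputs they were derived from.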

Improving analytics maturity involves re-engineering people, process, and technology, not just acquiring better analytic modeling capabilities

Increasing analytics maturity at a departmental level must include a holistic view of the infrastructure components that directly and indirectly impact the success of predictive modeling efforts.  Software and hardware capabilities are major contributors to how an organization is able to conduct analytics, but other factors need to be examined as well.  For instance, outmoded business processes need to be identified, evaluated for effectiveness, and replaced if necessary.  Another key component of analytics maturity is how well the staff has been trained to incorporate analytics into their daily decision making.  Sometimes that means hiring new people; other times it means educating the staff on new choices they need to make based on factual information and statistical predictions.

There are always things that can be done to improve analytics usage, regardless of where you are maturity-wise

Regardless of how you assess yourself, your business unit, or your division, there are concrete things you can do to improve your analytics maturity.  Perhaps within your company predictive analytics are largely limited to production applications, and the full benefits of analytics are not understood by a majority of your colleagues.  You can begin improving your tactical capabilities by helping to prioritize future analytical projects and showcasing successful outcomes.  Publicizing success stories and telling people where they can go for help is also a great way to get upper management interested.  You might even be able to start a monthly user group to discuss analytics usage within your company.  These sorts of activities will ultimately propel you toward becoming a more sophisticated analytics practitioner and your company toward being more analytics-driven.

Automating data preparation is fundamental to making analytics professionals more productive

At higher levels of analytics functioning, once analytics professionals have been identified as having common data consumption needs, efforts need to be made to automate the time-consuming data preparation steps that are precursors to more advanced modeling efforts.  We estimate that a full 80% of model development effort is spent shaping and preparing the data for predictive analysis.  The implication is that the IT department and a variety of business units need to be brought together in a true collaborative environment in order to reduce redundant effort and make data preparation tasks more efficient.  IT professionals are often unaware of how best to support advanced analytics needs, so education has to be the focus of getting groups with different goals and agendas talking with one another.  At SAS we offer assessment workshops to help with this collaboration, and there is a nice article written by Dr. Anne Robinson, Director of Analytics at Verizon Wireless, where she shares five tips for bridging the business-IT gap.
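One simple way to automate shared prep work is to capture each recurring step as a named function and chain them into a pipeline every modeler reuses, rather than each person rebuilding the steps by hand.  This is a minimal sketch under that assumption; the step names and record layout are hypothetical.

```python
# Shared, automated data-prep steps: each one is written once and
# reused by every modeler who consumes this data.
def drop_bad_rows(rows):
    # Quality check: discard records with missing or negative amounts.
    return [r for r in rows if r.get("amount") is not None and r["amount"] >= 0]

def add_derived_fields(rows):
    # Derived field: flag large purchases for downstream models.
    for r in rows:
        r["is_large"] = r["amount"] >= 100.0
    return rows

PIPELINE = [drop_bad_rows, add_derived_fields]

def run_pipeline(rows):
    # Apply every shared prep step in order.
    for step in PIPELINE:
        rows = step(rows)
    return rows

raw = [{"amount": 250.0}, {"amount": None}, {"amount": 40.0}]
staged = run_pipeline(raw)
```

Even a sketch like this shows the payoff: when a quality rule changes, it changes in one place, and every downstream modeling effort picks it up automatically instead of re-implementing the fix.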

Next time I will lay out some distinctions that can be used to characterize different levels of analytics maturity, as well as discuss why assigning labels to one group or to a company can be a difficult and sometimes counter-productive endeavor.  Please stay tuned!

Meanwhile, what have you found to be best practices?


About Author

Phil Weiss

Analytics Systems Manager

Phil Weiss is an Advisory Solution Architect in Sales and Marketing Support. He was an accomplished application developer and statistical consultant for 10 years before joining SAS. His technical specialties are time series forecasting, high-performance analytics and distributed processing systems. Phil has written a book on the history of Lake Tahoe’s oldest licensed casino, the Cal-Neva Resort, a place once owned by Frank Sinatra.

