Big Data is dead. Long live big data.
Of course that’s not true. Finding good, relevant data IS important. Having a wide range of analytic techniques and approaches DOES provide competitive advantage. But when data outgrows our ability to simply process it, only analytics can extract organizational and market insights from it.
So, an even more important question than, "How BIG is our data?" is, "How much VALUE is our data analysis bringing?" In a recent post about "not so big data," James M. Connolly points out succinctly that valuable insights can be found in relatively small data with properly evaluated, operational analyses.
I propose that the next big focus needs to be on evaluating derived analytic value. This means systematically comparing the outcomes of analytically driven decisions and campaigns. Done consistently, these comparisons provide "meta analytics" that answer the key question of which models are best in which contexts.
Without such evaluations, we might as well go back to writing business information down on notepads.
And yet, time and time again, organizations treat analytic feedback as little more than a pesky side note to their bigger-and-better data and algorithm initiatives. In my opinion, there’s more value to be had in a well-fed, automated modeling feedback loop than in the most powerful algorithms running on the biggest data. Decision management software plays an important role in this task.
Still, defining analytic value can be particularly elusive. What do we need to measure?
Unlike the ball pit at IKEA, where the solution is always, "Add more balls," I believe the answer here is, "Add more science."
Let’s take a common example: targeted marketing for customers. How should we evaluate an analytic approach against a business-expert approach? Answer: Use a more scientific method. Test two treatments, one defined analytically, the other by carefully curated business rules. Then measure outcomes to evaluate which performs better. Very often the most valuable approach will combine models with business rules.
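To make that concrete, here is a minimal champion/challenger sketch in Python. The customer records, the random split, and the simulated outcomes are all illustrative assumptions; in practice the assignments would drive your campaign system, and the outcomes would come back from it.

```python
import random

# Hypothetical customer records; in practice these come from your CRM.
customers = [{"id": i} for i in range(10_000)]

# Randomly assign each customer to a treatment arm, so that any
# difference in outcomes reflects the treatment, not who we chose.
random.seed(42)
for c in customers:
    c["arm"] = random.choice(["model", "rules"])

# After the campaign runs, record outcomes (1 = responded, 0 = not).
# Outcomes are simulated here purely for illustration.
for c in customers:
    base_rate = 0.07 if c["arm"] == "model" else 0.05
    c["responded"] = 1 if random.random() < base_rate else 0

def response_rate(arm: str) -> float:
    group = [c for c in customers if c["arm"] == arm]
    return sum(c["responded"] for c in group) / len(group)

print(f"model arm response rate: {response_rate('model'):.3f}")
print(f"rules arm response rate: {response_rate('rules'):.3f}")
```

The essential design choice is the random assignment: it is what lets you attribute a difference in response rates to the treatment itself rather than to how the groups were selected.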
Once you measure outcomes, a grounded discussion about the value of analytics can begin. Is that value simply the net lift of the analytic approach? That depends, of course, on whether the lift can be converted directly into value (especially added, incremental business value).
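As a back-of-the-envelope illustration of that conversion, suppose the measured lift translates directly into extra responses with a known margin per response. Every figure below is an invented assumption, not a benchmark:

```python
# Convert measured lift into net incremental value.
# All numbers are illustrative assumptions.
n_targeted      = 5_000    # customers in the analytic arm
rate_model      = 0.07     # response rate, analytic targeting
rate_rules      = 0.05     # response rate, business-rules targeting
margin_per_sale = 40.0     # incremental margin per response (dollars)
modeling_cost   = 2_500.0  # cost of building and scoring the model

extra_responses   = (rate_model - rate_rules) * n_targeted
incremental_value = extra_responses * margin_per_sale - modeling_cost
print(f"extra responses: {extra_responses:.0f}")
print(f"net incremental value: ${incremental_value:,.2f}")
```

Even this crude arithmetic surfaces the right question: the model has to pay for its own development and scoring before it adds any value.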
On top of that, there can also be a myriad of confounding explanations and questions, including:
- The targeted customers were simply the ones who would have responded anyway. Did our promotion just cut margin on customers who were going to respond regardless, while missing chances to lure new, potentially high-value customers?
- The two groups were not alike due to (insert your choice of thousands of possible reasons). This points to a lack of consistency in execution.
- The offer itself is biased towards the analytic approach.
- The results just don’t make sense and we don’t accept a:
  - Mathematical or statistical explanation.
  - Black-box explanation.
- It’s a fluke, test it again.
- It’s a fluke, test it again!!!
- It just doesn’t feel right.
- Your value measurement isn’t correct. We wanted new customers to respond; our group got 7 percent and yours only 5 percent. (Whether a gap like that could itself be a fluke is exactly what the significance test sketched after this list answers.)
- Yeah, so they responded to your campaign; how do we know it’s not a one-shot?
- Yeah, but what was the cost of generating the model results? (This is actually a great question we should be asking ourselves.)
- The model’s chosen variables don’t make intuitive business sense. What about A, B, C and D?
- Our results indicate otherwise. We actually won, straight up!
- Your predictive model acted on something that hasn’t happened yet. How can we be sure it was actually going to happen? When we act reactively, we know 100 percent that we are not wasting resources.
- Your predictive model has great results! But it gives no clear action or indication, and not enough time to adequately influence the outcome.
- Your predictive model has great results! What about downturn conditions? Black swans?
- Similarly, your model is based on past behavior. How can you assure me that such behavior will remain constant?
- I heard you should use deep learning, neural networks, or (insert trendy analytic method) for this type of business problem. Why didn’t you?
- Did you take into consideration that customer group A is fundamentally different from customer group B?
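Several of these objections, the "fluke" ones and the 7-percent-versus-5-percent one in particular, have a standard statistical answer: a two-proportion z-test tells you whether a gap of that size is plausible by chance at your sample size. A minimal sketch, with invented counts:

```python
from math import sqrt, erf

def two_proportion_z(successes_a: int, n_a: int,
                     successes_b: int, n_b: int):
    """Two-sided z-test for a difference in response rates."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative counts: 70 responders out of 1,000 targeted
# versus 50 responders out of 1,000 targeted.
z, p = two_proportion_z(70, 1_000, 50, 1_000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

At 1,000 customers per arm, 7 percent versus 5 percent gives a p-value of roughly 0.06, so the skeptics would have a point; at 10,000 per arm, the same gap would be decisive. Sample size, not intuition, settles the "fluke" question.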
Sound familiar? Some of these questions may seem a bit cheeky, I know. But defining and consistently measuring the value of your analytics can help you answer them all, from the most serious to the most absurd.
Enterprise decision management industrializes the creation of "meta analytics," providing a solid base for these discussions and keeping your analytics, and your analysts, working where they bring the most value.
In my next blog post, I’ll elaborate on some ways to clarify the above topics while moving closer to objective analytic value metrics.