With all the hype over big data, we often overlook the importance of modeling as its necessary counterpart. There are two independent limiting factors when it comes to decision support: the quality of the data, and the quality of the model. Most of the big data hype assumes that the data is always the limiting factor, and while that may be true for a majority of projects, I'd venture that bad or inadequate models share more of the blame than we care to admit.
It's a balancing act between the quantity and quality of our data on the one hand and the quality and fitness for purpose of our models on the other, a relationship that can frequently get significantly out of balance. More likely still, complete mismatches between data and modeling crop up all over our organizations: in one instance, remarkable models are starved for good data, while elsewhere volumes of sensor or customer data sit idle with no established approach to exploration, analysis and action.
This imperative to balance the data with the model reminds me of an espionage story from WWII. In early 1941, a German Abwehr operative, turned double agent, was dispatched to the United States with a 'shopping list' of intelligence concealed on a microdot, which he presented to the head of the FBI, J. Edgar Hoover. Included on this ‘shopping list’ was a request, on behalf of Germany’s ally, Japan, for depth measurements of Pearl Harbor. Hoover took a keen interest in the microdot technology, something the United States was not at the time using, but showed no particular concern regarding the contents of the list. He did pass the material on to both Army and Navy intelligence, but, regrettably, both of those agencies also failed to demonstrate any curiosity about the German interest in Pearl Harbor -- a military base far removed from any possible German military concern.
Context is everything. The data was there, as was the communication and decision support process. What was missing was the (mental) model that might have connected the data, the request for depth readings, to the broader complex reality of the global political and military situation.
My previous post, “Why Build Models?”, which focused on the three standard model configurations and other ways modeling can be put to use, elicited this insightful comment from Douglas Hicks, principal at D. T. Hicks Cost Measurement & Management Consulting:
“Recognizing that all of our decisions are based on our models of reality, not reality itself, is a key to understanding decision making. Too many individuals concentrate their efforts on perfecting “the data” that they then proceed to process through models that have little or no semblance of reality. Good models can still generate quality decision support information when the data is less than perfect, but bad models can only generate misleading decision support information – even when the data is perfect. Data is great, but it cannot become quality, actionable information unless it is processed through a valid model.”
The late Alfred Oxenfeldt, professor of Business Economics and Marketing at Columbia University, wrote: “It is our models of phenomena that determine our behavior, not the phenomena themselves. The validity of our decisions depends on our perception and understanding of reality. Good decisions require good models and the caliber of our decisions reflects the quality and validity of our models”.
A more robust approach to modeling can come from many different quarters:
- Do you need a more robust approach to forecasting than what simple trends and moving averages provide?
- Do you need a better way to quickly and easily build and deploy hundreds of predictive models across multiple segments?
- Should machine learning be part of your modeling repertoire?
- Could you make use of rapid prototyping that puts the power of model building directly into the hands of business users?
- Never underestimate how coaching, consulting and expert advice can quickly move you up the steep part of the modeling learning curve.
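To make the first question above concrete, here is a small hypothetical sketch (all numbers and function names invented for illustration) of why a simple moving average can fall short: on a steadily trending series, the moving average forecasts yesterday's level, while even a basic least-squares trend line, the next rung up in model robustness, follows the direction of travel.

```python
# Hypothetical illustration (invented data): on a trending series, a
# moving-average forecast lags behind, while a simple least-squares trend
# line tracks it. Richer models (exponential smoothing with trend, ARIMA,
# machine learning) extend this same idea.

def moving_average_forecast(series, window=3):
    """Forecast the next value as the mean of the last `window` observations."""
    return sum(series[-window:]) / window

def trend_forecast(series):
    """Fit y = a + b*t by least squares and extrapolate one step ahead."""
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    cov = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    var = sum((t - t_mean) ** 2 for t in range(n))
    slope = cov / var
    intercept = y_mean - slope * t_mean
    return intercept + slope * n  # forecast for the next period

# Demand rising by 10 units per period:
demand = [100, 110, 120, 130, 140, 150]

print(moving_average_forecast(demand))  # 140.0 -- lags the trend
print(trend_forecast(demand))           # 160.0 -- follows it
```

The moving average will always trail a trending series by roughly half its window; a model that represents the trend explicitly does not.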
Lastly, have you outgrown spreadsheets? Is it time to move off them as your primary modeling tool? Are you allowing the column/row/worksheet format to limit the depth, breadth and complexity of the reality you are attempting to model? As Oxenfeldt said, your models determine your behavior, and a two-dimensional modeling approach cannot help but restrict your perception, insight and actions.
Data and models, however, exist not in a vacuum but in the context of a larger decision support system, which is where you should always begin the process: What problems must the business address? What questions does the business need answered? What insights does the business need to innovate? What business decisions and actions need quantitative support? Build your models to support these issues and only then pursue the required data. Big data is the servant, not the master.