One of my first jobs was as an intern with the City of Wilson, NC. For a summer in the mid-90s, I served the citizens of Wilson by performing some highly strategic work in the city manager's office. I answered phones, fixed the copier and, on occasion, wrote speeches for the mayor (his Fourth of July speech to the local VFW was a real corker).
Beyond learning how to fix a paper jam, that summer opened my eyes to what it took to run an organization, even a medium-sized city like Wilson. I sat back and watched the city manager deal with a variety of competing priorities and constituents. From downtown redevelopment to crime, he often had to go with gut feel for the situation versus an informed, data-driven decision – especially when the issue involved multiple departments. After all, the city's IT systems were a mish-mash of COBOL and other legacy systems, and the data was often vague, confusing and hard to assemble.
I was reminded of this when reading Bill Coleman's blog on State and Local Connection, where he provided an example of the data integration challenges that lead to a confused view of municipal issues, including whether a neighborhood is in decline. Bill is an expert on this topic, having served as the town manager in Cary, NC (which, although it's called a "town," is three times the size of Wilson).
Bill faced hundreds – probably thousands – of these scenarios in his tenure. A governing body will ask staff to review a complaint from constituents. The manager asks each department to weigh in, each using their own slice of data to evaluate the situation. In Bill's scenario, the city manager got four different, and often conflicting, views based on the data silos.
This situation is not confined to municipal governments. Whether it's a local government or a private-sector business, organizations rarely have a uniform approach to data management. For example, some departments or divisions have invested the time and energy to create high-quality data, often due to a business imperative (such as compliance or fraud detection). Other departments or divisions may lag behind because there is no business need to stay on the "cutting edge." What's often lacking is a consistent approach to managing data at the highest levels.
The nirvana is a data management (or, more precisely, data governance) framework that adds some consistency to the way that information is collected, managed and distributed. A data governance project can often start with some basic, fundamental questions, like "How do we define a customer," or "What assets in the organization do we need to track?" The program then moves to more complex issues, such as creating business rules that can be enforced to ensure consistent data is available.
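To make that concrete, here's a minimal sketch of what one enforceable business rule might look like in code. The field names and validation rules (a required customer ID, a five-digit ZIP code) are hypothetical illustrations, not anyone's actual standard:

```python
# A minimal sketch of a data governance "business rule": every department
# records a customer with the same required fields and formats.
# Field names and rules here are hypothetical illustrations.

REQUIRED_FIELDS = {"customer_id", "name", "zip_code"}

def validate_customer(record):
    """Return a list of rule violations for one customer record."""
    violations = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        violations.append(f"missing fields: {sorted(missing)}")
    zip_code = record.get("zip_code", "")
    if zip_code and not (zip_code.isdigit() and len(zip_code) == 5):
        violations.append(f"bad zip code: {zip_code!r}")
    return violations

# A record from a department that lags behind the standard:
print(validate_customer({"name": "J. Smith", "zip_code": "2789"}))
# → ["missing fields: ['customer_id']", "bad zip code: '2789'"]
```

The point isn't the code itself; it's that once a rule like this is written down and agreed on, every department can be measured against the same definition.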
Once those fundamental data governance tasks are underway, you can think about integrating data or creating a master data management (MDM) environment. To Bill's point, that's where more useful, understandable data really begins. By creating consistent rules, applying them across the IT environment, and integrating data, you get a more accurate snapshot of what's going on.
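As a rough illustration of that integration step, consider each department holding its own slice of data about the same parcel. Once records share a common key, a master view can consolidate them. The department names, fields, and values below are invented for the example:

```python
# A minimal sketch of MDM-style integration: each department holds its
# own slice of data about the same parcel, and a master record merges
# them under one shared key. Departments and fields are hypothetical.

def build_master(*silos):
    """Merge per-department records (keyed by parcel ID) into master records."""
    master = {}
    for silo in silos:
        for parcel_id, fields in silo.items():
            master.setdefault(parcel_id, {}).update(fields)
    return master

planning = {"P-101": {"zoning": "residential"}}
police   = {"P-101": {"service_calls": 12}}
tax      = {"P-101": {"assessed_value": 185000}}

print(build_master(planning, police, tax))
# One parcel, one consolidated view across three departments.
```

Real MDM platforms do far more (matching records without a shared key, resolving conflicts, tracking lineage), but the payoff is the same: one snapshot instead of four conflicting ones.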
That's the nirvana of data. Unfortunately, while we've made great strides in the field of data management, managers often struggle with the same problems I saw in 1996. Data obviously has value, but only in the context of how it can be understood and consumed. One of the many challenges data management professionals face today is understanding managers' needs for information - and how they can improve the quality of data to support those needs.
Do you have any examples of managers forced to make decisions in the absence of good data?