Are data governance and MDM still inseparable?

Yes. But since this post needs to be more than a one-word answer to its title, allow me to elaborate.

Data governance (DG) enters the discussion of every enterprise information initiative. Whether DG should be the opening salvo of those discussions is akin to asking which came first, the chicken or the egg. However, any initiative that believes its manifest destiny is to expand across the organization and pervade every nook and cranny of the enterprise eventually needs DG. Master data management (MDM) is no exception.

One thing differentiating MDM is its focus on providing the organization with a single view of master data entities (parties, products, locations, assets). It does this by consolidating, standardizing and matching common data elements to achieve a more consistent view of these entities across the organization, creating their best data representations – often referred to as the organization’s single version of the truth. Another important aspect is the party-role relationship, which is where MDM manages the data and relationships underlying the high-level terminology commonly used in business discussions about the party master data entity (e.g., customer, supplier, employee). This complex challenge is perhaps best exemplified when the customer role crashes the MDM party.
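
To make that consolidate-standardize-match cycle concrete, here is a minimal Python sketch. The record layout, the email-based match rule and the most-recently-updated survivorship rule are illustrative assumptions on my part, not the behavior of any particular MDM product.

def standardize(record):
    # Normalize common party attributes so records can be compared.
    return {
        "name": record["name"].strip().upper(),
        "email": record["email"].strip().lower(),
        "source": record["source"],
        "updated": record["updated"],  # ISO date string, e.g. "2015-03-15"
    }

def merge(duplicates):
    # Survivorship (an assumption): the most recently updated record wins.
    return max(duplicates, key=lambda r: r["updated"])

def build_golden_records(raw_records):
    # Match rule (an assumption): records sharing an email are the same party.
    groups = {}
    for record in map(standardize, raw_records):
        groups.setdefault(record["email"], []).append(record)
    return [merge(group) for group in groups.values()]

crm = {"name": " Jane Doe", "email": "JDoe@example.com", "source": "CRM", "updated": "2015-01-02"}
billing = {"name": "JANE DOE", "email": "jdoe@example.com ", "source": "Billing", "updated": "2015-03-15"}
print(build_golden_records([crm, billing]))  # one golden Jane Doe record survives

Real match rules are, of course, fuzzier than an exact email key, but the shape of the process is the same.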

As you might imagine, or may have painfully experienced, what MDM is attempting to accomplish involves much more than data and technology. For example, even as a concept MDM’s single version of the truth must contend with the various versions of verisimilitude believed by people across the enterprise. And as an implementation, MDM must consider the lesson learned by many an enterprise data warehouse past (and possibly present): Just because you beautifully build it doesn’t mean people will dutifully use it.

People, and the unique corporate culture they embody, can make or break MDM. This is where DG comes in. DG provides the guiding principles and context-specific policies that frame the processes and procedures of MDM.

An example of a DG guiding principle for MDM is “master data will be managed as a shared asset to maximize business value and reduce risk.” DG policies for MDM will provide context for the specific business uses of master data, such as the different ways billing and marketing define who a customer is, and who has the authority to access sensitive data (e.g., social security and credit card numbers) describing those customers. DG will connect the dots between these principles and policies and MDM processes and procedures, making sure principles are followed, policies are enforced, and any changes are clearly communicated across the organization.
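
As a rough illustration of how such a policy might be enforced in code, consider the following sketch. The roles, field names and masking rule are hypothetical assumptions, not a prescribed implementation.

SENSITIVE_FIELDS = {"ssn", "credit_card"}
AUTHORIZED_ROLES = {"billing"}  # per the (hypothetical) policy, only billing sees these

def apply_policy(record, role):
    # Mask sensitive attributes for any role the policy does not authorize.
    if role in AUTHORIZED_ROLES:
        return dict(record)
    return {field: ("***" if field in SENSITIVE_FIELDS else value)
            for field, value in record.items()}

customer = {"name": "Jane Doe", "ssn": "123-45-6789", "credit_card": "4111111111111111"}
print(apply_policy(customer, "marketing"))  # ssn and credit_card masked
print(apply_policy(customer, "billing"))    # full record returned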

While it is possible, and definitely easier, to start MDM without DG, MDM isn’t done until it invites DG to the party (and to the product, location and asset too). This is why DG and MDM still are, and forever will be, inseparable.

Struggling with data governance alignment? Look to history.

If your organization is large enough, it probably has multiple data-related initiatives going on at any given time. Perhaps a new data warehouse is planned, an ERP upgrade is imminent or a data quality project is underway.

Whatever the initiative, it may raise questions around data governance – closely followed by discussions about the need to "align" with the business. Aligning data governance to business value is where many initiatives falter, because it is not always easy to demonstrate tangible value.

Big data, big governance

Traditional data governance is all about establishing a boundary around a specific data domain. This translates to establishing authority to define key business terms within that domain; establishing business-driven decision-making processes for changing those terms and the rules that apply to them; defining content standards (e.g., metadata and data quality rules); and outlining an ongoing process for measuring and monitoring.
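
For illustration only, those content standards and that measuring-and-monitoring process can be reduced to a small Python sketch, where each data quality rule is defined once and yields a pass rate that can be tracked over time. The rules and field names here are hypothetical.

import re

# Hypothetical content standards for a "customer" domain.
RULES = {
    "customer_id_present": lambda row: bool(row.get("customer_id")),
    "email_is_well_formed": lambda row: bool(re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", row.get("email", ""))),
}

def measure(rows):
    # Ongoing monitoring: the pass rate per rule is the metric to watch.
    return {name: sum(1 for row in rows if rule(row)) / len(rows)
            for name, rule in RULES.items()}

rows = [
    {"customer_id": "C1", "email": "jane@example.com"},
    {"customer_id": "", "email": "not-an-email"},
]
print(measure(rows))  # {'customer_id_present': 0.5, 'email_is_well_formed': 0.5}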

The recent data explosion highlights the point that data governance is critical to organizations' success. In fact, the need for a mature data governance framework is accepted more than ever. But despite this acknowledgement, established methods for governing data have not been challenged or altered.

Is effective data governance possible in an era of big data?

"A man's gotta to know his limitations."
—Clint Eastwood as "Dirty" Harry Callahan, Magnum Force

Let's go back in time to 2005, well before the arrival of what we now call Big Data.

A decade ago, YouTube had only just launched. Facebook was still limited to college students. No one talked about cloud computing.

Seems like a long time ago, right?

ESP can determine if big data is eventful

Many recent posts on this blog have discussed various aspects of event stream processing (ESP) where data is continuously analyzed while it’s still in motion, within what are referred to as event streams. This differs from traditional data analytics where data is not analyzed until after it has stopped moving and has been stored.
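
As a toy illustration of the difference, here is a moving average computed per event, in arrival order, rather than a query run later against stored data. The window size and alert threshold are arbitrary assumptions, and this is plain Python, not any ESP engine's API.

from collections import deque

def monitor(events, window=5, threshold=100.0):
    # Evaluate every event as it arrives -- the data never has to stop moving.
    recent = deque(maxlen=window)
    for event in events:
        recent.append(event["value"])
        if len(recent) == window and sum(recent) / window > threshold:
            yield {"sensor": event["sensor"], "moving_avg": sum(recent) / window}

stream = ({"sensor": "pump-1", "value": 95.0 + i} for i in range(20))
for alert in monitor(stream):
    print(alert)  # alerts fire mid-stream, as soon as the average crosses 100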

Event stream processing – Tips 2 and 3: Understand the life cycle of the data, collection and consumption

Determining the life cycle of event stream data requires us to first understand our business and how fast it changes. If event data is analyzed, it makes sense that the results of that analysis would feed another process – for example, a customer relationship management (CRM) system or a campaign management system like SalesForce.com. Here are some questions I would ask:

Can ESP bridge the data quality gap?

As consumers, we all too often find the quality of our day governed by the outcome of computed events. My recent online shopping experience was a great example of how computed events can conspire to make (or break) a relaxing day.

We had ordered grocery delivery with a new service provider. Our existing provider gave amazing service – but at a higher cost – so we were keen to see how the competition fared.

The first order was a success. It arrived on time and at a considerable cost savings.

The second order was a disaster. It also highlights a data quality gap that I believe is a perfect scenario for event stream processing.

Event stream processing – Tip 1: Don’t be overwhelmed

I believe most people become overwhelmed when considering the data that can be created during event processing. Number one, it is A LOT of data – and number two, the data needs real-time analysis. For the past few years, most of us have been analyzing data after we collected it, not during the event itself.

Embedding event stream analytics

In my last two posts, I introduced some opportunities that arise from integrating event stream processing (ESP) within the nodes of a distributed network. We considered one type of deployment that includes the emergent Internet of Things (IoT) model in which there are numerous end nodes that monitor a set of sensors, perform some internal computations, and then generate data that gets pushed back into the network. Most scenarios assume these data streams are accumulated at a central server that analyzes the data and then subjects it to existing predictive and prescriptive analytical models. Then, the models generate notifications or trigger the desired, automated actions.

The conclusion we came to, though, is that forcing all the decisions to be made at the central server might be a heavier burden than necessary, because this approach requires a full round trip for communication (from sensors to end node to network to central server, then back through the network to the end node and its controllers, for example).
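
A rough sketch of the alternative – local rules at the end node, with only routine data forwarded centrally – might look like the following. The thresholds, payloads and actions are illustrative assumptions.

def actuate(reading):
    # Immediate local action at the end node -- no round trip required.
    print(f"local action: throttling {reading['sensor']} at {reading['value']}")

def edge_node(readings, local_limit=80.0):
    # Runs on the end node: decide clear-cut cases locally, forward the rest.
    for reading in readings:
        if reading["value"] > local_limit:
            actuate(reading)
        else:
            yield reading  # routine data still flows to the central models

def central_server(forwarded):
    # Runs centrally: heavier predictive and prescriptive analytics, over less traffic.
    values = [r["value"] for r in forwarded]
    print(f"central model saw {len(values)} readings, mean = {sum(values) / len(values):.1f}")

readings = [{"sensor": "valve-3", "value": v} for v in (42.0, 55.5, 91.2, 60.1)]
central_server(list(edge_node(readings)))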
