Data should not be good for goodness sake

As this is the week of Christmas, many of us, myself included, have Christmas songs stuck in our heads. One of these jolly jingles is “Santa Claus Is Coming To Town,” which includes the line: “He knows if you’ve been bad or good, so be good for goodness sake!” The lyric is a playful warning to children that if they are not going to be good for goodness sake, then they should at least be good so that Santa Claus will bring them Christmas presents.


Creating the MDM demo

In our recent client engagements where organizations are implementing one or more master data management (MDM) projects, I have been advocating that a task to design a demonstration application be added to the early part of the project plan. Many early MDM implementers seem to have taken the “Field of Dreams” approach to adoption: “build the master data environment and the business applications will come.”

However, we often underestimate the complexity of integrating business applications with an existing master data environment, especially if there has not been an adequate process for assessing how the application’s owners expect to use master data. This is compounded when the business users do not really understand how MDM works in the first place!
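To make the idea concrete, here is a minimal, hypothetical sketch (my illustration, not anything from the original post) of the kind of demonstration application I have in mind: a tiny script that pulls a golden customer record from a stand-in master data hub and shows a business screen consuming it. The hub, field names and IDs are all assumptions.

    # Minimal, hypothetical MDM demo sketch (illustration only).
    # The "master hub" here is an in-memory dict standing in for a real
    # master data service; field names and IDs are made up.

    MASTER_CUSTOMER_HUB = {
        "CUST-0001": {
            "golden_name": "Acme Industries Ltd.",
            "source_records": {"CRM": "A-1001", "ERP": "900-22"},
            "billing_country": "US",
        },
    }

    def get_master_record(customer_id: str) -> dict:
        """Simulate a lookup against the master data environment."""
        record = MASTER_CUSTOMER_HUB.get(customer_id)
        if record is None:
            raise KeyError(f"No golden record for {customer_id}")
        return record

    def demo_order_screen(customer_id: str) -> None:
        """A stand-in 'business application' that consumes master data."""
        master = get_master_record(customer_id)
        print(f"Order entry for: {master['golden_name']}")
        print(f"Linked source records: {master['source_records']}")

    if __name__ == "__main__":
        demo_order_screen("CUST-0001")

Even something this small gives business users a tangible picture of what “consuming master data” means before the real integration work begins.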


Dead, or as good as dead: The end of an era for thin clients

In an era long gone by (actually not so long ago), we all interacted with computers via terminals connected to mainframe or minicomputer systems. Sometimes you had to book a slot for when you could access and exploit computer resources. The user was subject to interrupted connections or very poor response times during busy periods. For example, there was a time when, every afternoon, a keypress on a mainframe terminal was followed by a 10-second wait for the letter to show up on the display!

What followed was the era of "full" client applications on our local machines – we all had the power to do things whenever and wherever we wanted. In this "utopia," we had broken free of the data center and its control. Naturally, the next phase was frustration caused by slow performance and by grumblings about access to data. The loss of data due to failures and a lack of proper backups was the final straw.


Cracking the code to successful conversions: Decommission plan

I have a rule – any conversion or upgrade will require the creation of a decommission plan. A decommission plan should include the following:

  1. A list and definition of each database, table and column (source and target).
  2. A list and definition of each of the current programs in use (you may want to include the ones that are not used).
  3. A list and definition of each report and interface from the current application.

Some companies may have a specific format to use for the decommission plan – if so, use it! I usually use spreadsheets for these lists, and leave plenty of room for analysis of what is salvageable. Really, why create new programs or reports if we can repoint an old program and change some variables – right?
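As a rough illustration only (the post recommends spreadsheets, and the categories, object names and fields below are my own assumptions), the same inventory could be kept as a small script that writes the plan out as a CSV with room for salvageability notes:

    # Hypothetical decommission-plan inventory sketch (illustration only).
    import csv
    from dataclasses import dataclass, asdict

    @dataclass
    class DecommissionItem:
        category: str        # "table", "program", "report", "interface"
        name: str
        definition: str
        source_or_target: str
        salvageable: str     # analysis notes: reuse, repoint, or retire

    ITEMS = [
        DecommissionItem("table", "CUST_MASTER.ADDR_LINE1", "Customer street address",
                         "source", "maps to new CUSTOMER.ADDRESS_1"),
        DecommissionItem("program", "ORD_EXTRACT_V2", "Nightly order extract",
                         "source", "repoint connection string, keep logic"),
        DecommissionItem("report", "MONTHLY_SALES_RPT", "Sales by region report",
                         "source", "rebuild against target schema"),
    ]

    with open("decommission_plan.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(asdict(ITEMS[0]).keys()))
        writer.writeheader()
        writer.writerows(asdict(item) for item in ITEMS)

The point is not the tooling; it is that every database object, program, report and interface ends up in one list with a recorded decision about what happens to it.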


Cracking the code to successful conversions: Physical data models

The physical data model should represent exactly the way the tables and columns are designed in the database management system. I recommend keeping storage, partitioning, indexing and other physical characteristics in the data model if at all possible. This will make upkeep and comparison with the development, test or production databases much easier.
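As a hedged illustration of why this pays off (the table, attributes and catalog values below are made up for the example), here is a small sketch that diffs the physical characteristics recorded in the model against what an environment actually reports:

    # Hypothetical sketch: keeping physical characteristics in the model and
    # diffing them against a database environment. The catalog values are
    # hard-coded stand-ins for a real metadata query.

    MODEL = {
        "ORDERS": {
            "tablespace": "TS_ORDERS",
            "partitioned_by": "ORDER_DATE (monthly)",
            "indexes": ["PK_ORDERS", "IX_ORDERS_CUST_ID"],
        },
    }

    # What the development / test / production catalog actually reports.
    DEV_CATALOG = {
        "ORDERS": {
            "tablespace": "TS_ORDERS",
            "partitioned_by": "ORDER_DATE (monthly)",
            "indexes": ["PK_ORDERS"],   # index missing in this environment
        },
    }

    def compare(model: dict, catalog: dict, environment: str) -> None:
        """Report where an environment drifts from the physical data model."""
        for table, expected in model.items():
            actual = catalog.get(table, {})
            for attribute, value in expected.items():
                if actual.get(attribute) != value:
                    print(f"[{environment}] {table}.{attribute}: "
                          f"model={value!r} vs database={actual.get(attribute)!r}")

    compare(MODEL, DEV_CATALOG, "development")

If the physical details never make it into the model, drift like the missing index above has to be hunted down by hand.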


Thoughts on dynamic data provenance

We've explored data provenance and the importance of data lineage before on the Data Roundtable (see here). If you are working in a regulated sector such as banking, insurance or healthcare, it is especially important right now: lineage is one of the essential elements of data quality that regulators look for in their assessments.

Data lineage is vital to data quality management because we need to know where data originates in order to build data quality rules to measure, monitor and improve it, ideally at the source.
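A minimal, hypothetical sketch of that idea (the systems, fields and rule below are assumptions, not anything prescribed in the post): record where an attribute originates, then run the quality rule against that source so defects are caught before they flow downstream.

    # Hypothetical sketch of pairing lineage with a data quality rule applied
    # at the source. Systems, fields and the rule are made up.

    LINEAGE = {
        "customer.email": {
            "originates_in": "CRM",
            "flows_through": ["staging.crm_extract", "warehouse.dim_customer"],
        },
    }

    def email_rule(value: str) -> bool:
        """A simple completeness/format check to run where the data originates."""
        return bool(value) and "@" in value

    SOURCE_SAMPLE = ["jane.doe@example.com", "", "not-an-email"]

    failures = [v for v in SOURCE_SAMPLE if not email_rule(v)]
    origin = LINEAGE["customer.email"]["originates_in"]
    print(f"{len(failures)} of {len(SOURCE_SAMPLE)} records fail the rule; "
          f"fix them in {origin} before they reach "
          f"{LINEAGE['customer.email']['flows_through'][-1]}.")

Knowing the lineage is what lets you point the rule, and the remediation effort, at the originating system rather than at the warehouse.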


Can big data and bad administration coexist?

Around the time of the publication of Too Big to Ignore, I spoke at a large conference. The conference sponsor—a very large company that we'll call ABC here—was struggling to find speakers. (I suppose that there weren't too many "big data experts" back then.) What unfolded over the next two months convinced me that, at an organizational level, ABC was incapable of acting with the agility that big data demands.



Next-gen 360° view of customers

For a long time, master data management (MDM) practitioners boasted about their ability to build a 360° view of customers by aggregating and proactively managing information coming from various business applications such as CRM systems, ERP applications, and other operational systems.

But was it really a 360° view? What about transactional and historical data? What about external data sources like social media? What about unstructured content such as emails or call records?
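To illustrate what a wider view might look like, here is a small, hypothetical sketch (all source systems, fields and values are invented for the example) that folds transactional, social and unstructured inputs into one customer profile alongside the usual CRM and ERP records:

    # Hypothetical sketch of widening a customer view beyond CRM/ERP records.

    STRUCTURED = {
        "CRM": {"name": "Jane Doe", "email": "jane.doe@example.com"},
        "ERP": {"customer_no": "900-22", "credit_limit": 5000},
    }
    TRANSACTIONS = [{"order_id": "A-17", "amount": 129.00, "date": "2014-11-02"}]
    SOCIAL = [{"network": "Twitter", "handle": "@janedoe", "sentiment": "positive"}]
    UNSTRUCTURED = [{"type": "email", "subject": "Delivery delay complaint"}]

    def build_profile() -> dict:
        """Assemble one profile from structured, transactional and unstructured sources."""
        profile = {"identity": {}, "activity": {}, "interactions": {}}
        for system, attributes in STRUCTURED.items():
            profile["identity"][system] = attributes
        profile["activity"]["orders"] = TRANSACTIONS
        profile["interactions"]["social"] = SOCIAL
        profile["interactions"]["content"] = UNSTRUCTURED
        return profile

    if __name__ == "__main__":
        import json
        print(json.dumps(build_profile(), indent=2))

The structured identity data is only one slice of the picture; the activity and interaction slices are where the missing 180° or so of the "360° view" tend to live.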
