Cracking the code for successful conversions: Detailed information analysis and facilitated sessions

Facilitated sessions are an excellent way of gaining additional insight into the requirements and expectations for any conversion project. In my experience, most participants in a facilitated session arrive with different agendas and different priorities for the project, so the facilitator must be neutral.

A session with 5-8 active participants usually stays interesting and under control. If you can, send preparation materials to the participants ahead of time.

Most companies bring in consultants to facilitate the process. A good facilitator should not come from a business unit or have a major stake in the project's outcome. An understanding of team dynamics and of the applications being converted is always a plus!

Personal data marketplaces

One of my favorite business books is Chris Anderson's Free: The Future of a Radical Price. The book serves as the basis for the über-popular freemium model embraced by so many companies today. In my favorite passage, Anderson recounts the tale of an Amazon free-shipping promotion in Europe in the late 1990s. Because of an obscure French law, however, Amazon had to charge a nominal sum in that country (something like a few francs). In England, there was no such law.

Big data versus the not-so-humble opinion

Henrik Liliendahl Sørensen recently blogged about the times when a HiPPO (Highest Paid Person’s Opinion) outweighs data in business decision-making. While I have seen plenty of hefty opinions trump high-quality data, those opinions did not always come from the highest paid person.

The stubborn truth is that we all hold our opinions in high regard regardless of what we base our opinions on.

“Opinions are robust: they persist without support,” Beston Jack Abrams explained. While you would think that data should shake such wobbly foundations, as Abrams further explained, “nothing causes greater adherence to an opinion than opposition to it.”

What do Hadoop superheroes do now that Hallows' Eve has come and gone?

Great works of fiction are filled with dynamic duos. Sherlock Holmes and Dr. Watson. Rosencrantz and Guildenstern. And, of course, superheroes like Batman and Robin. On Thursday, Nov. 5 at 1 p.m. ET, two real-world Hadoop superheroes – Arun C. Murthy, co-founder of Hortonworks, and Paul Kent, vice president of big data at SAS – are teaming up to take on Hadoop YARN in this webcast: Combine SAS High-Performance Capabilities with Hadoop YARN…with SAS and Apache Hadoop.

What is transaction synchronization? And why does MDM care?

To paraphrase my last post: as master data practitioners, we are often too focused on pulling data from source systems to populate a master entity model, and not focused enough on understanding how dependencies across business processes influence the proper synchronization of master entity data. This is often a byproduct of batch updates performed on a periodic basis with data extracted from transaction systems.
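The lag inherent in periodic batch updating can be illustrated with a toy sketch (all names here are hypothetical, not from any particular MDM product): a transaction system changes a customer's address mid-day, but the master entity model only catches up when the nightly batch pull runs.

```python
# Hypothetical records: a transaction system and a master entity model
# that is refreshed only by a periodic batch pull.
transaction_system = {"cust_1001": {"address": "12 Oak St"}}
master_data = {"cust_1001": {"address": "12 Oak St"}}

# A business process updates the source system mid-day...
transaction_system["cust_1001"]["address"] = "98 Elm Ave"

# ...so any process reading master data before the batch runs sees stale data.
assert master_data["cust_1001"]["address"] != transaction_system["cust_1001"]["address"]

def nightly_batch_sync(source, master):
    """Periodic pull: copies current source values into the master model."""
    for key, record in source.items():
        master[key] = dict(record)

nightly_batch_sync(transaction_system, master_data)
assert master_data["cust_1001"]["address"] == "98 Elm Ave"
```

Between the mid-day update and the nightly pull, every consumer of the master model is making decisions on stale data, which is exactly the synchronization gap described above.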

Cracking the code to successful conversions: Detailed information analysis via existing sources

Completing any conversion successfully requires a good understanding of the existing environment and platform.  In fact, I would not move forward without the background analysis work products. You may want to consider some analysis of what is already in place, and understand:

  1. What is the objective of the conversion?  (Always a good critical success factor to keep in mind)
  2. Do we convert EVERYTHING?  (Consider your project scope)
  3. How different is the converted data?  (A different data type in the new database/platform may require us to CAST data to a new type in the conversion programs.  This can add more complexity to those programs.)
  4. Do we have a tremendous number of data quality issues?
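Points 3 and 4 above often meet in practice: casting legacy data to the target platform's types is where quality issues surface. A minimal sketch (the column names and formats are assumptions for illustration, not from any real system) shows how a conversion routine can cast each field and count the rows that fail rather than silently dropping them:

```python
from datetime import datetime

# Hypothetical legacy rows: the source platform stores everything as strings.
source_rows = [
    {"cust_id": "1001", "balance": "2500.75", "opened": "2015-03-14"},
    {"cust_id": "1002", "balance": "N/A",     "opened": "2016-01-09"},
]

def cast_row(row):
    """Cast legacy string fields to the target schema's types.

    Returns (converted_row, errors) so data quality issues are
    counted and reported instead of silently discarded."""
    converted, errors = {}, []
    try:
        converted["cust_id"] = int(row["cust_id"])
    except ValueError:
        errors.append(("cust_id", row["cust_id"]))
    try:
        converted["balance"] = float(row["balance"])
    except ValueError:
        errors.append(("balance", row["balance"]))
    try:
        converted["opened"] = datetime.strptime(row["opened"], "%Y-%m-%d").date()
    except ValueError:
        errors.append(("opened", row["opened"]))
    return converted, errors

results = [cast_row(r) for r in source_rows]
bad = [(r, e) for r, e in results if e]
print(f"{len(bad)} of {len(results)} rows had casting problems")
```

Profiling the source this way before writing the real conversion programs gives you an early read on both the casting complexity (point 3) and the volume of quality issues (point 4).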
Ideas for justifying your data quality existence

Conference season is hotting up in the UK, and there are no doubt lots of practitioners putting the finishing touches to their data quality presentations.

One interesting observation I’ve encountered is a high churn rate amongst data quality professionals, particularly within the leadership community.

Their decision to quit is not always voluntary.

Many data quality presenters focus on the tactics of delivering a data quality project, but fail to explain what the organisation gained in terms of tangible business benefits. Whilst the presenter may just be aiming to impart their knowledge to a captive audience, I suspect a lack of robust performance measurement is often at fault.

Treat your data steward like a rock star

Every day of the year, there's a holiday celebrating one thing or another. In fact, you probably didn't know that Oct. 22 WAS CAPS LOCK DAY. Whoops. Or, if you're like me, you completely spaced on Oct. 26. It was Mother-in-Law Day. Boy, we'll be hearing about that for the next few months.

A few years ago, some of us in the data management community noted that there was an under-appreciated group that deserved more recognition. That's why we started International Data Stewards Day – a once-a-year celebration of the people who care about, obsess about and fuss with the data that drives your organization.

This year, it's back – and bigger than ever. The theme for the 2014 International Data Stewards Day is that data stewards are the rock stars of your organization. OK, maybe they're not larger-than-life characters who command the stage and play extended guitar solos.

Data-based television

This isn't 1950. Half of the population is not crowded around a TV at night watching three shows. For a long time now, traditional TV networks have been struggling. This is no blip. More people than ever are cutting the cord. Traditional media outlets are scared—and they should be.

Big data and omission neglect

In my previous post, I used the book Mastermind: How to Think Like Sherlock Holmes by Maria Konnikova to explain how additional information can make us overconfident even when it doesn’t add to our knowledge in a significant way. Knowing this can help us determine how much data we really need to drive our decisions.

Another important concept Konnikova described is what is known as omission neglect.

“We fail to note what we do not perceive up front,” Konnikova explained, “and we fail to inquire further or to take the missing pieces into account as we make our decision. Some information is always available, but some is always silent—and it will remain silent unless we actively stir it up.” This is why noise is sometimes necessary.
