I think NOT! I have read a lot of articles lately on data integration and quality, especially for companies embarking on large conversions. Every one of them stated that addressing data quality is a critical step for success in any conversion. So why do companies have a tendency to overlook or jerry-rig data quality?
An example of overlooking data quality issues is when we accept ANY and ALL data during the conversion and simply convert the identifiers to something that won't show up in reporting (like negative identifiers). While this gets the job done faster, every application and report then has to use a view that excludes the negative identifiers: a bit of overhead for SQL, and one more thing I have to remember. Wouldn't it be better to investigate the interfaces and really understand which other applications require this data?
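To make that overhead concrete, here is a minimal sketch of the kind of view every report ends up depending on. The table and column names (customer, customer_id, and so on) are purely illustrative assumptions, not from any particular system:

```sql
-- Hypothetical workaround view: every application and report has to
-- remember to query the view instead of the base table, just to hide
-- the rows that were converted with negative identifiers.
CREATE VIEW customer_reporting AS
SELECT customer_id,
       customer_name,
       created_date
FROM   customer
WHERE  customer_id > 0;  -- silently excludes the "converted anyway" rows
```

That WHERE clause is the hidden tax: it has to be repeated, or at least remembered, everywhere the data is consumed.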
An example of jerry-rigging the data is when corrupt or "uncleansed" data is changed as it comes into the new environment, but the source systems AGAIN are not investigated to understand how this information is used and why it is needed. HELLO, MDM! MDM, in its essence, reviews the integration and need of master data across the enterprise. Not all data, just the data that is needed. MDM initiatives need to address these types of data quality issues. I am pretty sure I cannot be successful on ANY project without first addressing the data quality of the source systems involved in the conversion.
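Investigating the source systems does not have to be a huge effort to start. A simple profiling query run against the source, before anything is silently rewritten in flight, shows how widespread the bad values are and gives you something concrete to take back to the system owners. This is a minimal sketch under assumed names (src_customer, ref_state, state_code); adapt it to whichever master data element is in question:

```sql
-- Hypothetical profiling query against the SOURCE system:
-- how many rows violate the rule, and which values are doing the damage?
SELECT s.state_code,
       COUNT(*) AS bad_rows
FROM   src_customer s
LEFT JOIN ref_state r
       ON r.state_code = s.state_code
WHERE  r.state_code IS NULL
GROUP  BY s.state_code
ORDER  BY bad_rows DESC;
```

Numbers like these are what turn "the data is dirty" into a conversation the source system owners can actually act on.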