Modernization describes the necessary evolution of the information technologies that organizations rely on to remain competitive in today’s constantly changing business world. New technologies – many designed to better leverage big data – challenge existing data infrastructures and business models. This forces enterprises to modernize their approach to data management…
Master data management (MDM) is distinct from other data management disciplines because of its primary focus: giving the enterprise a single view of the master data that represents key business entities, such as parties, products, locations and assets. MDM achieves this by standardizing, matching and consolidating common data elements across traditional and big data sources…
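To make the standardize-match-consolidate idea above concrete, here is a minimal sketch in Python. The record fields, the email-based match rule and the longest-name survivorship rule are illustrative assumptions, not any particular MDM product’s behavior:

```python
from collections import defaultdict

def standardize(record):
    """Normalize common variations so equivalent values compare equal."""
    return {
        "name": record["name"].strip().upper(),
        "email": record["email"].strip().lower(),
        "source": record["source"],
    }

def match_key(record):
    """Deliberately simple match rule: records sharing an email describe the same party."""
    return record["email"]

def consolidate(records):
    """Collapse matched records into one golden record per party."""
    groups = defaultdict(list)
    for r in map(standardize, records):
        groups[match_key(r)].append(r)
    golden = []
    for email, dupes in groups.items():
        golden.append({
            "email": email,
            # Crude survivorship rule: the most complete (longest) name wins.
            "name": max((d["name"] for d in dupes), key=len),
            "sources": sorted({d["source"] for d in dupes}),
        })
    return golden

crm = [{"name": "Jon Smith ", "email": "JSmith@example.com", "source": "CRM"}]
web = [{"name": "Jonathan Smith", "email": " jsmith@example.com", "source": "Web"}]
print(consolidate(crm + web))
# [{'email': 'jsmith@example.com', 'name': 'JONATHAN SMITH', 'sources': ['CRM', 'Web']}]
```

A real hub would use fuzzier matching (phonetic keys, edit distance, probabilistic scoring) and richer survivorship rules, but the shape of the pipeline is the same.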
I've spent much of my career managing the quality of data after it was moved from its sources to a central location, such as an enterprise data warehouse. Nowadays, not only do we have a lot more data, but a lot of it is in motion. One of the…
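As a sketch of what quality checks on data in motion can look like, the following Python generator validates records as they pass through a stream rather than after they land. The rule names and record shape here are assumptions for illustration:

```python
def quality_gate(stream, rules, rejects):
    """Yield records that pass every rule as they flow by; quarantine failures in rejects."""
    for record in stream:
        failed = [name for name, rule in rules.items() if not rule(record)]
        if failed:
            rejects.append((record, failed))  # quarantine rather than silently drop
        else:
            yield record

# Hypothetical in-flight rules for an event feed
rules = {
    "has_user_id": lambda r: bool(r.get("user_id")),
    "valid_timestamp": lambda r: isinstance(r.get("ts"), int) and r["ts"] > 0,
}

events = [{"user_id": "u1", "ts": 1700000000}, {"user_id": "", "ts": -5}]
rejects = []
clean = list(quality_gate(events, rules, rejects))
print(len(clean), "passed;", len(rejects), "quarantined")  # 1 passed; 1 quarantined
```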
In my previous post I discussed the practice of putting data quality processes as close to data sources as possible. Historically this meant data quality happened during data integration in preparation for loading quality data into an enterprise data warehouse (EDW) or a master data management (MDM) hub. Nowadays, however, there’s a lot of…
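A minimal Python sketch of that historical pattern: cleansing and validation run inside the integration step, so only rows that pass ever reach the warehouse. The rules are invented, and load_into_edw stands in for a real warehouse loader:

```python
import re

def cleanse(row):
    """Standardize values during integration, before anything reaches the EDW."""
    row = dict(row)
    row["email"] = row.get("email", "").strip().lower()
    row["country"] = row.get("country", "").strip().upper()[:2]  # assumes ISO-2 codes
    return row

def passes_quality(row):
    """Gate rule: reject rows downstream reports can't use."""
    return bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", row["email"]))

def load_into_edw(rows):
    """Stand-in for a real warehouse loader."""
    print(f"loaded {len(rows)} row(s)")

source_rows = [
    {"email": " Ana@Example.COM ", "country": "us"},
    {"email": "not-an-email", "country": "CAN"},
]
load_into_edw([r for r in map(cleanse, source_rows) if passes_quality(r)])  # loads 1 row
```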
Throughout my long career of building and implementing data quality processes, I've consistently been told that data quality could not be implemented within data sources, because doing so would disrupt production systems. Therefore, source data was often copied to a central location – a staging area – where it was cleansed, transformed, unduplicated, restructured…
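The staging-area workflow itself can be sketched in a few functions. The steps and field names are illustrative assumptions, and a real implementation would operate on staging tables rather than in-memory lists:

```python
def stage(source_rows):
    """Copy source data untouched so production systems are never modified."""
    return [dict(r) for r in source_rows]

def cleanse(rows):
    """Fix formatting problems in the staged copy."""
    return [{**r, "name": r["name"].strip().title()} for r in rows]

def unduplicate(rows):
    """Keep the first occurrence of each natural key."""
    seen, out = set(), []
    for r in rows:
        key = (r["name"], r["dob"])
        if key not in seen:
            seen.add(key)
            out.append(r)
    return out

def restructure(rows):
    """Reshape into the target model, e.g. splitting name into parts."""
    for r in rows:
        first, _, last = r["name"].partition(" ")
        yield {"first_name": first, "last_name": last, "dob": r["dob"]}

source = [{"name": "  ada lovelace ", "dob": "1815-12-10"},
          {"name": "Ada Lovelace", "dob": "1815-12-10"}]
print(list(restructure(unduplicate(cleanse(stage(source))))))
# [{'first_name': 'Ada', 'last_name': 'Lovelace', 'dob': '1815-12-10'}]
```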
While it’s obvious that chickens hatch from eggs laid by other chickens, what’s less obvious is which came first – the chicken or the egg? This classic conundrum has long puzzled non-scientists and scientists alike. There are almost as many people on Team Chicken as there are on Team Egg…
Many data quality issues result from the distance separating data from the real-world object or entity it attempts to describe. This is the case with master data, which describes parties, products, locations and assets. Customer master data (customer being one of the roles within party) is rife with such examples, especially…
Data quality has always been relative and variable: relative to a particular business use, and variable by user. Data of sufficient quality for one business use may be insufficient for another, and data considered good by one user may be considered bad by others.
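One way to make “relative and variable” concrete is to score the same records against a different rule set per business use. The uses and rules below are invented for illustration:

```python
records = [
    {"email": "a@example.com", "phone": None,         "postcode": "10001"},
    {"email": None,            "phone": "5551234567", "postcode": "10001"},
]

# Each business use declares its own definition of "good enough".
fitness_rules = {
    "email_campaign": lambda r: r["email"] is not None,
    "call_center":    lambda r: r["phone"] is not None,
    "geo_reporting":  lambda r: r["postcode"] is not None,
}

for use, rule in fitness_rules.items():
    fit = sum(rule(r) for r in records)
    print(f"{use}: {fit}/{len(records)} records fit for use")
# The same data scores 1/2, 1/2 and 2/2 – quality depends on the use.
```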