Streaming technologies have been around for years, but as Felix Liao recently blogged, the number and types of use cases that can take advantage of these technologies have now increased exponentially. I've blogged about why streaming is the most effective way to handle the volume, variety and velocity of big data.
Historically, before data was managed it was moved to a central location. For a long time that central location was the staging area for an enterprise data warehouse (EDW). While EDWs and their staging areas are still in use – especially for structured, transactional and internally generated data – big ...
Our world is now so awash in data that many organizations have an embarrassment of riches when it comes to the data available to support the operational, tactical and strategic activities of the enterprise. Such a data-rich environment is highly susceptible to poor-quality data. This is especially true when swimming in data lakes ...
Most enterprises employ multiple analytical models in their business intelligence applications and decision-making processes. These analytical models include descriptive analytics that help the organization understand what has happened and what is happening now, predictive analytics that determine the probability of what will happen next, and prescriptive analytics that focus on what action should be taken next.
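To make those three flavors concrete, here is a minimal Python sketch (using a hypothetical daily sales series, nothing from the original post) that pairs a descriptive summary with a naive predictive forecast and a simple prescriptive rule:

```python
from statistics import mean

# Hypothetical daily sales figures, for illustration only
daily_sales = [120, 135, 128, 150, 162, 158, 171]

# Descriptive: what has happened and what is happening now
avg_sales = mean(daily_sales)
print(f"Average daily sales: {avg_sales:.1f}, most recent day: {daily_sales[-1]}")

# Predictive: a naive least-squares trend forecast of what will happen next
n = len(daily_sales)
x_bar, y_bar = mean(range(n)), mean(daily_sales)
slope = sum((x - x_bar) * (y - y_bar) for x, y in enumerate(daily_sales)) / \
        sum((x - x_bar) ** 2 for x in range(n))
forecast = y_bar + slope * (n - x_bar)
print(f"Forecast for tomorrow: {forecast:.1f}")

# Prescriptive: a simple rule that recommends an action based on the forecast
if forecast > avg_sales * 1.1:
    print("Recommendation: increase inventory ahead of expected demand.")
else:
    print("Recommendation: hold inventory at current levels.")
```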
Data governance plays an integral role in many enterprise information initiatives, such as data quality, master data management and analytics. It requires coordinating a complex combination of factors, including executive sponsorship, funding, decision rights, arbitration of conflicting priorities, policy definition, policy implementation, data stewardship and change management. With so much overhead involved in ...
Data governance has been the topic of many of the recent posts here on the Data Roundtable. And rightfully so, since data governance plays such an integral role in the success of many enterprise information initiatives – such as data quality, master data management and analytics. These posts can help you prepare for discussing ...
As I've previously written, data analytics historically analyzed data after it stopped moving and was stored, often in a data warehouse. But in the era of big data, data needs to be continuously analyzed while it’s still in motion – that is, while it’s streaming. This allows for capturing the real-time value of data.
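As a rough illustration of what analyzing data in motion can look like, here is a minimal Python sketch, assuming a simulated sensor feed rather than any particular streaming platform, that evaluates every event the moment it arrives instead of waiting for it to land in storage:

```python
import random
import time
from collections import deque

def sensor_stream(n_events=20):
    """Simulate a feed of sensor readings arriving over time (hypothetical data)."""
    for _ in range(n_events):
        yield random.gauss(100.0, 5.0)
        time.sleep(0.05)  # events arrive continuously, not from a warehouse table

# Analyze each event as it arrives: maintain a sliding window
# and flag anomalies immediately, while the data is still in motion.
window = deque(maxlen=10)
for reading in sensor_stream():
    window.append(reading)
    moving_avg = sum(window) / len(window)
    if abs(reading - moving_avg) > 10:
        print(f"Alert: reading {reading:.1f} deviates from moving average {moving_avg:.1f}")
```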
Lately I've been binge-watching a lot of police procedural television shows. The standard format for almost every episode is the same. It starts with the commission or discovery of a crime, followed by forensic investigation of the crime scene, analysis of the collected evidence, and interviews or interrogations with potential suspects. It ends ...
Critical business applications depend on the enterprise creating and maintaining high-quality data. So, whenever new data is received – especially from a new source – it’s great when that source can provide data without defects or other data quality issues. The recent rise in self-service data preparation options has definitely improved the quality of ...
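One lightweight way to guard against defects from a new source is to run every incoming record through a set of validation rules on arrival. The sketch below is a minimal Python example; the field names and rules are hypothetical:

```python
import re

REQUIRED_FIELDS = {"customer_id", "email", "signup_date"}
EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_record(record: dict) -> list:
    """Return a list of data quality issues found in a single record."""
    issues = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        issues.append(f"missing fields: {sorted(missing)}")
    email = record.get("email", "")
    if email and not EMAIL_PATTERN.match(email):
        issues.append(f"malformed email: {email!r}")
    return issues

# Two example records from a hypothetical new source
incoming = [
    {"customer_id": 1, "email": "pat@example.com", "signup_date": "2016-05-01"},
    {"customer_id": 2, "email": "not-an-email"},
]

for record in incoming:
    problems = validate_record(record)
    if problems:
        print(f"Record {record.get('customer_id')}: {problems}")
```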
Data access and data privacy are often fundamentally at odds with each other. Organizations want unfettered access to the data describing their customers. Meanwhile, customers want their data – especially their personally identifiable information – to remain as private as possible. Organizations need to protect data privacy by granting data access only to authorized users.
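A common way to balance the two is to let a user's role determine what they see. The sketch below is a minimal Python illustration of role-based masking; the roles, fields and masking rules are hypothetical:

```python
PII_FIELDS = {"email", "ssn"}
AUTHORIZED_ROLES = {"privacy_officer", "fraud_analyst"}

def mask_value(value: str) -> str:
    """Keep the first two characters and mask the rest."""
    return value[:2] + "*" * max(len(value) - 2, 0)

def read_customer(record: dict, role: str) -> dict:
    """Return the record, masking PII unless the role is authorized."""
    if role in AUTHORIZED_ROLES:
        return dict(record)  # full access for authorized roles
    return {key: (mask_value(str(value)) if key in PII_FIELDS else value)
            for key, value in record.items()}

customer = {"name": "Pat Lee", "email": "pat@example.com", "ssn": "123-45-6789"}
print(read_customer(customer, role="marketing"))        # PII masked
print(read_customer(customer, role="privacy_officer"))  # PII visible
```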