Data preparation, data integration, data quality, data security – all of that sounds like a compulsory exercise for IT and is nowhere near as sexy as hype topics like data science, the Internet of Things or artificial intelligence. Yet data management carries at least as much weight in a business context – whether for optimizing
Get faster value out of your data by empowering business users to work with data on their own.
Phil Simon weighs in on the value of getting your own hands dirty using self-service data prep.
Get on with your day faster by taking a self-service approach to data preparation.
David Loshin recommends enforcing governed standards to help avoid conflicting analytical results.
Phil Simon chimes in on the immediacy of enterprise data.
Helmut Plinke explains why modernizing your data management is essential to supporting your analytics platform.
When developing SAS applications, you can feed database tables into your application by using the libname access engine, either by referring directly to a database table or via SAS or database views that themselves refer to one or more of the database tables.
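A minimal sketch of those two approaches, assuming a SAS/ACCESS engine for Oracle and a hypothetical sales schema; the library name mydb, the orders table, its column names and the connection options are illustrative assumptions, not details taken from the post:

/* Assign a library through a SAS/ACCESS libname engine (Oracle here; credentials are placeholders) */
libname mydb oracle user=myuser password=mypass path=orapath schema=sales;

/* Option 1: refer to the database table directly */
data work.big_orders;
  set mydb.orders;            /* reads the DBMS table through the libname engine */
  where amount > 1000;
run;

/* Option 2: go through a SAS view that itself refers to the database table */
proc sql;
  create view work.v_orders as
    select order_id, customer_id, amount
    from mydb.orders
    where order_date >= '01JAN2020'd;   /* hypothetical columns and filter */
quit;

data work.report;
  set work.v_orders;          /* the application consumes the view, not the table itself */
run;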
I recently discussed in this space, together with my colleague Gerhard Svolba, the role that data quality and data governance play in data management for analytics. But what exactly makes up modern data management, and what role do new technologies like Hadoop and company play in it? And what will future collaboration
Even if Gartner has declared the hype over: no company can get around big data and the analysis of the corresponding (often unstructured) data volumes. But what challenges do big data and the developments that come with it pose for data management? How can data scientists, IT and business departments work together today? And where do
The rise of self-service analytics, and the idea of the ‘citizen data scientist’, has also brought a number of issues to the fore in organizations. In particular, two common areas of discussion are the twin pillars of data quality and data preparation. There is no doubt that good quality, well-prepared
"IT doesn't deliver, and the business side doesn't know what data it will want today or tomorrow"… Both are right – a dilemma that ends in self-help. The hunger for information persists, and whatever isn't delivered gets procured by other means. Namely: the SAP screen, Excel, database(s),
One aspect of high-quality information is consistency. We often think about consistency in terms of consistent values. A large portion of the effort expended on “data quality dimensions” essentially focuses on data value consistency. For example, when we describe accuracy, what we often mean is consistency with a defined source
It's that time of year again where almost 50 million Americans travel home for Thanksgiving. We'll share a smorgasbord of turkey, stuffing and vegetables and discuss fun political topics, all to celebrate the ironic friendship between colonists and Native Americans. Being part Italian, my family augments the 20-pound turkey with pasta –
.@philsimon says don't treat data self-service as a binary.
Most enterprises employ multiple analytical models in their business intelligence applications and decision-making processes. These analytical models include descriptive analytics that help the organization understand what has happened and what is happening now, predictive analytics that determine the probability of what will happen next, and prescriptive analytics that focus on
.@philsimon on the need to adopt agile methodologies for data prep and analytics.
In Part 1 of this two-part series, I defined data preparation and data wrangling, then raised some questions about requirements gathering in a governed environment (i.e., ODS and/or data warehouse). Now – all of us very-managed people are looking at the horizon, and we see the data lake. How do
Lately I've been binge-watching a lot of police procedural television shows. The standard format for almost every episode is the same. It starts with the commission or discovery of a crime, followed by forensic investigation of the crime scene, analysis of the collected evidence, and interviews or interrogations with potential suspects. It ends
.@philsimon chimes in on new data-gathering methods and what they mean for analytics.
I'm a very fortunate woman. I have the privilege of working with some of the brightest people in the industry. But when it comes to data, everyone takes sides. Do you “govern” the use of all data, or do you let the analysts do what they want with the data to
.@philsimon on the downside of the Band-Aid approach.
Critical business applications depend on the enterprise creating and maintaining high-quality data. So, whenever new data is received – especially from a new source – it’s great when that source can provide data without defects or other data quality issues. The recent rise in self-service data preparation options has definitely improved the quality of
Hadoop has driven an enormous amount of data analytics activity lately. And this poses a problem for many practitioners coming from the traditional relational database management system (RDBMS) world. Hadoop is well known for having lots of variety in the structure of data it stores and processes. But it's fair to
.@philsimon continues his series on data prep and analytics.
In my last post, I talked about how data still needs to be cleaned up – and data strategy still needs to be re-evaluated – as we start to work with nontraditional databases and other new technologies. There are lots of ways to use these new platforms (like Hadoop). For example, many
I'm hard-pressed to think of a trendier yet more amorphous term today than analytics. It seems that every organization wants to take advantage of analytics, but few really are doing that – at least to the extent possible. This topic interests me quite a bit, and I hope to explore
What does it really mean when we talk about the concept of a data asset? For the purposes of this discussion, let's say that a data asset is a manifestation of information that can be monetized. In my last post we explored how bringing many data artifacts together in a
If your enterprise is working with Hadoop, MongoDB or other nontraditional databases, then you need to evaluate your data strategy. A data strategy must adapt to current data trends based on business requirements. So am I still the clean-up woman? The answer is YES! I still work on the quality of the data.
A long time ago, I worked for a company that had positioned itself as basically a third-party “data trust” to perform collaborative analytics. The business proposition was to engage different types of organizations whose customer bases overlapped, ingest their data sets, and perform a number of analyses using the accumulated