Blend, cleanse and prepare data for analytics, reporting or data modernization efforts
Jim Harris says event stream processing determines if big data is eventful and relevant enough to process and store.
The metaphors we choose to describe our data are important, for they can either open up the potential for understanding and insight, or they can limit our ability to effectively extract all the value our data may hold. Consisting as it does of nothing but electric potentials, or variations in
Determining the life cycle of event stream data requires us to first understand our business and how fast it changes. If event data is analyzed, it makes sense that the results of that analysis would feed another process. For example, a customer relationship management (CRM) system or campaign management system like
As consumers, the quality of our day is all too often governed by the outcome of computed events. My recent online shopping experience was a great example of how computed events can transpire to make (or break) a relaxing event. We had ordered grocery delivery with a new service provider. Our existing provider
I believe most people become overwhelmed when considering the data that can be created during event processing. Number one, it is A LOT of data – and number two, the data needs real-time analysis. For the past few years, most of us have been analyzing data after we collected it,
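This excerpt contrasts analyzing data after it has all been collected with analyzing it as events arrive. As a minimal illustration of that difference (not code from the post; the sample values are made up), a statistic can be updated incrementally so a result is available the moment each event is observed:

```python
# Illustrative sketch only: streaming (per-event) analysis vs. after-the-fact analysis.
# The event values below are hypothetical.

def streaming_mean(events):
    """Update a running mean as each event arrives, instead of
    collecting everything first and analyzing it afterwards."""
    count, total = 0, 0.0
    for value in events:
        count += 1
        total += value
        yield total / count  # a result is available immediately, per event

# Each reading is analyzed the moment it is observed.
for current_mean in streaming_mean([3.2, 4.1, 2.8, 5.0]):
    print(f"running mean so far: {current_mean:.2f}")
```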
You've probably heard many times about the fantastic untapped potential of combining online and offline customer data. But relax, I’m going to cut out the fluff and address this matter in a way that makes the idea plausible and its objectives achievable. The reality is that while much has been
In my last two posts, I introduced some opportunities that arise from integrating event stream processing (ESP) within the nodes of a distributed network. We considered one type of deployment that includes the emergent Internet of Things (IoT) model in which there are numerous end nodes that monitor a set of sensors,
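To make the edge-node idea concrete, here is a minimal sketch (not SAS code, and not from the post) of an end node that watches its sensors and forwards only noteworthy readings upstream; the sensor names, threshold, and forwarding function are illustrative assumptions:

```python
# Hypothetical edge-node filtering in an IoT deployment: discard routine
# readings locally, forward only events that matter for central analysis.

SENSOR_READINGS = [
    {"sensor": "temp-01", "value": 21.4},
    {"sensor": "temp-01", "value": 87.9},   # anomalous spike
    {"sensor": "temp-02", "value": 22.1},
]

THRESHOLD = 80.0  # assumed alert threshold

def forward_upstream(event):
    # Stand-in for sending the event to a central collector.
    print(f"forwarding {event}")

for reading in SENSOR_READINGS:
    if reading["value"] > THRESHOLD:
        forward_upstream(reading)
```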
In my previous post, I discussed the similarities, differences and overlap between event stream processing (ESP) and real-time processing (RTP). In this post, I want to highlight three things that need to get real. In other words, three things that should be enhanced with real-time capabilities, whether it’s ESP, RTP or
You might have lots of data on lots of customers, but imagine if you could suddenly add in a huge dollop of new, highly informative data that you weren’t able to access before. You could then use analytics to extract some really important insights about these customers, allowing you to
In my last post, we examined the growing importance of event stream processing to predictive and prescriptive analytics. In the example we discussed, we looked at how all the event streams from point-of-sale systems from multiple retail locations are absorbed at a centralized point for analysis. Yet the beneficiaries of those
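As a rough sketch of that centralized absorption (purely illustrative, not the system described in the post; store names and amounts are invented), event streams from multiple locations can be consumed at one point and summarized there:

```python
# Illustrative sketch: point-of-sale events from several retail locations
# absorbed at a central point and aggregated for analysis.
from collections import defaultdict

POS_EVENTS = [
    {"store": "Store-12", "sku": "A100", "amount": 19.99},
    {"store": "Store-07", "sku": "B205", "amount": 4.50},
    {"store": "Store-12", "sku": "B205", "amount": 4.50},
]

revenue_by_store = defaultdict(float)
for event in POS_EVENTS:            # the central point consumes every stream
    revenue_by_store[event["store"]] += event["amount"]

for store, revenue in revenue_by_store.items():
    print(f"{store}: {revenue:.2f}")
```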
As we enter the era of “everything connected,” we cannot forget that gathering data is not enough. We need to process that data to gain new knowledge and build our competitive advantage. The Internet of Things is not just a consumer thing – it also makes our businesses more intelligent. Whenever
.@philsimon says that you shouldn't bring a knife to a gun fight.
(Otherwise known as Truncate – Load – Analyze – Repeat!) After you’ve prepared data for analysis and then analyzed it, how do you complete this process again? And again? And again? Most analytical applications are created to truncate the prior data, load new data for analysis, analyze it and repeat
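A minimal sketch of that Truncate – Load – Analyze – Repeat cycle, using SQLite purely for illustration (the table, columns, and batches are assumptions, not the application discussed in the post):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (customer TEXT, amount REAL)")

def truncate_load_analyze(rows):
    conn.execute("DELETE FROM staging")                           # Truncate prior data
    conn.executemany("INSERT INTO staging VALUES (?, ?)", rows)   # Load new data
    total, = conn.execute("SELECT SUM(amount) FROM staging").fetchone()  # Analyze
    return total

# Repeat: each new batch completely replaces the previous one.
print(truncate_load_analyze([("Ann", 10.0), ("Bo", 5.5)]))
print(truncate_load_analyze([("Cy", 7.25)]))
```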
Well OK, so there is an "i" in science, but being a data scientist is certainly not a lonesome job. Engagement with other team members is essential with data analytics work, so you never really work in isolation. Without the rest of the team, we would fail to ask all
Event stream processing (ESP) and real-time processing (RTP) so often come up in the same conversation that it raises the question of whether they are one and the same. The short answer is yes and/or no. But since I don’t need the other kind of ESP to know that you won’t
What sends a data management product to the top of the “hot” list? In a word – speed. Especially when that speed can gracefully accommodate the huge world of streaming data from the Internet of Things. One of SAS’ hottest (and recently enhanced) products, SAS Event Stream Processing is an
@philsimon on the need to adopt new tools to understand events.
Over the past year and a half, there has been a subtle shift in media attention from big data analytics to what is referred to as the Internet of Things, or IoT for short. The shift in focus is not intended to diminish the value of big data platforms and
Once you have assessed the types of reporting and analytics projects and activities to be done by the community of data analysts and consumers, and have assessed their business needs and requirements for performance, you can then evaluate – with confidence – how different platforms and tools can be combined to satisfy
Data governance and data virtualization can become powerful allies. The word governance is not to be understood here as a law, but more as support and vision for business analytics applications. Our governance processes must become agile in the same way our business is transforming. Data virtualization, being a very versatile
In my previous post, I used junk drawers as an example of the downside of including more data in our analytics just in case it helps us discover more insights, only to end up with more flotsam than findings. In this post, I want to float some thoughts about a two-word concept
@philsimon says that, yes, we can learn a great deal.
In April, the free trial of SAS Data Loader for Hadoop became available globally. Now, you can take a test drive of our new technology designed to increase the speed and ease of managing data within Hadoop. The downloads might take a while (after all, this is big data), but I think you’ll
As the age-old idiom goes, the early bird gets the worm and the early adopter gets the break. New technologies give clear advantages to those organisations that figure out how to use them effectively before their early-adopting competitors do, an advantage that recedes as others catch up, and the technology
In the last post, we talked about creating the requirements for the data analytics, and profiling the data prior to load. Now, let’s consider how to filter, format and deliver that data to the analytics application. Filter – the act of selecting the data of interest to be used in the
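As a small sketch of those filter, format and deliver steps (illustrative only; the field names, filter condition, and output file are assumptions, not the post's example):

```python
import csv
import datetime

raw_records = [
    {"id": "1", "signup": "2015-03-02", "region": "EMEA", "spend": "120.5"},
    {"id": "2", "signup": "2015-04-17", "region": "APAC", "spend": "80.0"},
]

# Filter: select only the records of interest to the analysis.
of_interest = [r for r in raw_records if r["region"] == "EMEA"]

# Format: cast fields into the types the analytics application expects.
formatted = [
    {"id": int(r["id"]),
     "signup": datetime.date.fromisoformat(r["signup"]),
     "spend": float(r["spend"])}
    for r in of_interest
]

# Deliver: write the prepared data where the analytics application reads it.
with open("prepared_data.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "signup", "spend"])
    writer.writeheader()
    for row in formatted:
        writer.writerow(row)
```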
In the era of big data, we collect, prepare, manage, and analyze a lot of data that is supposed to provide us with a better picture of our customers, partners, products, and services. These vast data murals are impressive to behold, but in painting such a broad canvas, these pictures
One area that often gets overlooked when building out a new data analytics solution is the importance of ensuring accurate and robust data definitions. This is one of those issues that is difficult to detect because unlike a data quality defect, there are no alarms or reports to indicate a
What data do you prepare for analysis? Where does that data come from in the enterprise? Hopefully, by answering these questions, we can understand what is required to supply data for an analytics process. Data preparation is the act of cleansing (or not) the data required to meet the business
.@philsimon on what we can learn from Seattle's juggernaut.