Remember when performance reporting took place monthly, or even annually? When executives solemnly received a static report in a board meeting, which told them what had happened a month or so ago, often longer, and they had to make decisions about what to do next? This period actually isn’t all that long ago, although in some ways it seems like more than a lifetime. Back then, the processes for cleaning and assuring data took so long that waiting ‘just’ a month seemed like progress.
The rise of streaming data, however, and the arrival of tools to handle it, have ensured that performance analysis, and action in response, can now take place in real time. Immediate responses to the information within the data are often driven by algorithms programmed to take the most appropriate action. But despite the huge rise in the availability of streaming data, there are still many people—and organisations—who have not yet tapped into this resource, and who still see it simply as a by-product of the Internet of Things (IoT).
Is your organisation fit for intelligent connectivity? Participate in the analytics platform study and find out what you might be missing.
Analysing streaming data
The real usefulness of streaming data lies in the power of new tools to handle and analyse it, such as event stream processing, which encompasses edge analytics. Without a good analysis tool, after all, data is just data, regardless of its type or source. Edge analytics is one of the most powerful tools available for this purpose, handling data while it is still in motion, rather than once it has been stored. It can apply all types of data analytics and algorithms to the data while it is ‘on the fly’, to help analysts discover patterns, and even forecast from the data. You could say that it enables analysts to pause time, analyse the data, then set time running again.
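To make the ‘data in motion’ idea concrete, here is a minimal sketch of windowed stream processing in Python. The class name and the simulated readings are illustrative, not part of any SAS product: the point is simply that a summary statistic is updated as each event arrives, before anything is written to storage.

```python
from collections import deque

class StreamWindow:
    """Keeps a sliding window over an unbounded stream and computes
    summary statistics on data 'in motion', before it is ever stored."""

    def __init__(self, size):
        self.window = deque(maxlen=size)  # old events fall off automatically

    def push(self, value):
        self.window.append(value)

    def mean(self):
        return sum(self.window) / len(self.window) if self.window else 0.0

# Feed a simulated sensor stream and read the rolling mean as events arrive.
w = StreamWindow(size=3)
for reading in [10, 12, 14, 40]:
    w.push(reading)
print(w.mean())  # mean of the last 3 readings: (12 + 14 + 40) / 3 = 22.0
```

Real engines add parallelism, fault tolerance and richer operators, but the pattern is the same: compute continuously over a moving window rather than batch-process at rest.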
Edge analytics also has a huge range of other applications, many of which are likely to prove much more significant in the longer term. It is, however, taking a while for everyone to catch up with this potential.
I see event stream processing widely applicable across the range of cloud applications that generate data and where machines talk to each other. For example, it can be used in telecoms to help optimise networks, by measuring the data flows around the infrastructure, and managing those flows better. Other applications are used in financial services and capital markets, including trading floors.
Managing risk and preventing fraud
The ability to handle and analyse data in real time, and generate optimal solutions, is becoming a critical part of risk management. One good example is the detection and prevention of fraud. Long gone are the days when fraud reports waited weeks to be assessed; even waiting until the markets close for the day seems slow and outdated now. Risk and liquidity can now be managed during the trading day, with huge benefits to the organisations involved, enabling them to reduce their exposure.
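A toy version of this idea: flag each transaction the moment it deviates sharply from recent history, instead of waiting for an end-of-day batch. The function, the threshold and the sample amounts are all illustrative assumptions, not a production fraud model.

```python
import statistics

def stream_fraud_flags(amounts, history=5, threshold=3.0):
    """Flag transactions that deviate sharply from the recent history,
    as each one arrives -- no end-of-day batch needed."""
    recent = []
    for amount in amounts:
        if len(recent) >= history:
            mu = statistics.mean(recent)
            sigma = statistics.stdev(recent)
            if sigma > 0 and abs(amount - mu) / sigma > threshold:
                yield amount  # suspicious: act immediately
        recent.append(amount)
        recent = recent[-history:]  # keep only the sliding history

# A run of ordinary payments, then one outlier.
txns = [20, 22, 19, 21, 20, 5000, 23]
print(list(stream_fraud_flags(txns)))  # [5000]
```

Real systems use far richer features and models, but the architecture is the same: score each event in flight, so the response can be immediate.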
Edge analytics in retail
Edge analytics also has potential in retail. Data from points of sale or even the shelves themselves can be used to manage restocking. This, in turn, feeds into supply chain management in a broader sense, all the way up and down the chain. It can be used on the shop floor, in factories, in distribution: the list is more or less endless. It can be used also in customer services, for example, to analyse customer interactions with chatbots, and ensure human intervention at the right time. At the moment, this is one of the big challenges with introducing AI systems in customer care: seamless integration between humans and algorithms, so that customer experience does not suffer.
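The restocking example above can be sketched in a few lines. The SKUs, quantities and reorder logic here are hypothetical; the point is that each point-of-sale event is processed as it occurs, triggering a supply-chain action the moment shelf stock crosses a threshold.

```python
def restock_orders(pos_events, stock, reorder_point=5, order_qty=20):
    """Consume point-of-sale events as they occur and raise a restock
    order the moment an item's shelf count crosses its reorder point."""
    orders = []
    for sku, qty_sold in pos_events:
        stock[sku] -= qty_sold
        if stock[sku] <= reorder_point:
            orders.append((sku, order_qty))
            stock[sku] += order_qty  # assume the order replenishes the shelf
    return orders

# Simulated shelf levels and a short stream of sales events.
stock = {"jeans-501": 8, "tee-basic": 30}
sales = [("jeans-501", 2), ("tee-basic", 1), ("jeans-501", 2)]
print(restock_orders(sales, stock))  # [('jeans-501', 20)]
```

Scaled up and down the chain, the same event-driven trigger connects the shop floor to distribution and manufacturing.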
Power and application
As the use of cloud-based applications spreads, more and more systems and processes will generate streaming data. Currently, the most relevant data streams include:
- Event streams containing sensor data from physical assets: GPS-based location data from vehicles or smartphones, temperature or accelerometer readings from sensors, RFID tag reads, and heartbeats from patient monitors.
- Signals from supervisory control and data acquisition (SCADA) systems on machines – this has been the fastest-growing category lately.
- Information report streams, such as news feed articles, market data, weather reports, tweets and other social media updates. External social sources may contain valuable information relevant to decision making within the company. For example, consider Deloitte’s recent acquisition of a predictive social intelligence platform to support the monitoring of brand and reputational events.
Managing and getting benefit from this data should, in my view, be every business’s priority. With such effective tools now available, the excuses for not doing so are wearing a bit thin.
Do you want to link edge and streaming analytics to ‘fast data’? SAS can help you define fast, data-driven decisions.
Read a true story on how Levi Strauss & Co. uses SAS Analytics to analyze millions of consumer demand signals, to create a supply chain geared to the preferences of individual customers.
Learn more about SAS Event Stream Processing.
We hosted a digital panel discussion on Twitter covering this theme. Read the highlights as a Storify: Reimagining the analytics platform