Sensors first appeared many decades ago and have existed in various forms ever since, even though they've only entered the popular vocabulary over the past few years thanks to the Internet of Things.
How do sensors work? A sensor detects events, or changes in quantities, and provides a corresponding output, generally as an electrical or optical signal.
Today, sensors are used in everyday objects such as touch-sensitive elevator buttons and lamps that dim or brighten when you touch the base, as well as in a large number of places most people are unaware of: manufacturing, medicine, robotics, cars, airplanes and aerospace.
The largest challenges when it comes to sensors occur after measurements have been made. At that point, you have to decide: Where do I collect the data being generated, and how can I use it, for example, to improve my operations by decreasing variability and improving quality?
To capture and collect the measurements coming from sensors, solutions have emerged such as operational historians. An operational historian is a database application that logs, or historizes, time-based data flowing from sensors. Historians are optimized for time-series data and are designed to answer questions such as: "What was today's hourly unit production standard deviation?"
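As an illustrative sketch of the kind of query a historian answers, here is that hourly standard-deviation question worked out in plain Python. The timestamps and production counts are hypothetical sample data, not from any real historian:

```python
from collections import defaultdict
from datetime import datetime
from statistics import pstdev

# Hypothetical sensor log: (timestamp, units produced) pairs, as a
# historian might store them. The values are illustrative only.
readings = [
    ("2016-05-01 09:05", 120), ("2016-05-01 09:40", 118),
    ("2016-05-01 10:10", 131), ("2016-05-01 10:55", 90),
    ("2016-05-01 11:20", 125), ("2016-05-01 11:45", 127),
]

# Group readings by the hour they fall in, then compute the
# standard deviation of production within each hour.
by_hour = defaultdict(list)
for stamp, units in readings:
    hour = datetime.strptime(stamp, "%Y-%m-%d %H:%M").replace(minute=0)
    by_hour[hour].append(units)

hourly_stdev = {hour: pstdev(values) for hour, values in by_hour.items()}
for hour, sd in sorted(hourly_stdev.items()):
    print(hour.strftime("%H:00"), round(sd, 2))
```

A real historian runs queries like this continuously over millions of points; the grouping-then-aggregating shape of the computation is the same.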
These historian software solutions have all developed complementary tools for reporting and monitoring, detecting trends or correlations that indicate a problem and alerting an operator to take immediate action before equipment damage occurs. In addition, analytic solutions are now widely used to analyze the data stored in these systems, providing advanced predictive analytics and optimizing equipment maintenance operations.
Until recently, that was the state of the art for generating value out of sensor data.
The Internet of Things: The next big data explosion
Then, over the last two years, two major changes have been shaking the sensor world:
- The size of sensors has dramatically decreased: Technological progress now allows more and more sensors to be manufactured at a microscopic scale, leading to micro-sensors built with technologies like MEMS. This means sensors can now be embedded in places that were previously impractical, such as clothing.
- Wireless connectivity and communication technologies have improved enormously, and nearly all types of electronic equipment can now provide wireless data connectivity, allowing sensors embedded in connected devices to send and receive data over the network.
These two technological improvements have ignited the Internet of Things (IoT). It is now possible to put sensors in all types of equipment or devices and to continuously generate measurements and send them in real time to other systems over the network. Whether it's a car or a pacemaker, the data flows in a constant stream from the device to the network, and sometimes back to the device. This has given rise to massive amounts of data, and as a result the Internet of Things is seen as a major contributor to the Volume, Variety and Velocity of big data.
Organizations today are investing heavily in capturing and storing as much data as possible, but making effective use of data while it is still in motion, and extracting valuable information from it, is a challenge that has grown bigger than ever.
What is even more critical is applying analytics to these streams of data before they are stored for post-event analysis.
The application of analytics allows organizations to detect patterns and anomalies that can have a considerable impact on how a business is run. For example, trends in traffic flow can help make traffic controls and diversions more effective, avoiding bottlenecks and potential accidents; in industry, the detection of furnace temperature anomalies can immediately trigger remedial measures and actions.
These examples can be extended across many different domains, and it is easy to see how understanding the activities and interactions between connected devices might help to better understand and improve business processes and customer interactions.
When it comes to exploiting the Internet of Things, it is interesting to compare it with the traditional approach, where analytics is applied to stored data, essentially providing analytical processing of events that occurred in the past. With the Internet of Things, the emphasis is on identifying and examining patterns of interest as events occur, and before the data is stored, enabling real-time action on those events of interest.
The new goal is now to analyze the data as close to the occurrence of the event as possible before its value is lost.
Using high-performance, in-memory analytics solutions significantly speeds up the exploitation of the massive amounts of data held in big data stores. Nevertheless, real-time decision making on streaming data requires a technology able to process data at very high speed, and this is done using an event stream processing engine.
When combined with analytics solutions, event stream processing creates an analytic platform with the ability to manage data, execute analytics and deliver insights in real time on the huge volumes of streaming data generated across many business domains.
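To make the idea of processing events as they flow concrete, here is a minimal generator-based sketch of a windowed aggregation, the kind of continuous query an event stream processing engine runs. It is a toy stand-in, not an engine; the readings and window size are illustrative assumptions:

```python
from collections import deque

def sliding_average(events, window=5):
    """Continuously emit the average of the last `window` readings.

    Unlike a store-then-query historian, each incoming event
    immediately yields an updated result downstream.
    """
    buf = deque(maxlen=window)
    for value in events:
        buf.append(value)
        yield sum(buf) / len(buf)

# A hypothetical stream of sensor readings, processed one event at a time.
stream = iter([10, 20, 30, 40, 50, 60])
for avg in sliding_average(stream, window=3):
    print(avg)
```

The generator shape matters: results are pushed out per event with bounded memory, which is exactly what lets stream processing act on data before its value is lost.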
So are you ready for streaming data? It is shifting the way we do business today by applying analytics faster than you'd think, and I am sure you wouldn't want to miss the train.