The role of a big data lab in your IoT programme

The Internet of Things (IoT) is growing rapidly, and growth over the next few years is expected to be exponential. IoT is, in fact, starting to deliver: what began as a collection of clever ideas with limited application is now clearly capable of revolutionising huge swathes of industry, from agriculture to healthcare. Some sources suggest it may prove as significant as a second industrial revolution.

Challenges with streaming data

The usefulness of IoT depends on being able to harness and use the data generated by its many interconnected devices. Data alone is not intrinsically useful; it is only by qualifying, filtering, collating and interpreting it that we can really start to make use of it. In other words, IoT without analytics is not much use. Interpretation is what transforms data into knowledge, and knowledge into insight.
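To make the point concrete, here is a minimal sketch (not from the original article) of what qualifying, filtering, collating and interpreting a stream of readings can look like in practice. The device names, temperature range and alert threshold are hypothetical, and a real deployment would consume readings from a streaming source such as an MQTT topic rather than a hard-coded list.

```python
# Illustrative sketch of qualify -> filter -> collate -> interpret for
# streaming IoT data. All values and device names are hypothetical.
from collections import defaultdict, deque
from statistics import mean

VALID_RANGE = (-40.0, 85.0)   # plausible operating range for a temperature sensor
WINDOW_SIZE = 5               # readings per device kept for a rolling average
ALERT_THRESHOLD = 70.0        # interpret: flag devices that run hot

windows = defaultdict(lambda: deque(maxlen=WINDOW_SIZE))

def process(reading):
    """Qualify and filter one reading, then collate it into a per-device window."""
    device, value = reading["device"], reading["value"]
    # Qualify/filter: discard readings outside the sensor's physical range.
    if not (VALID_RANGE[0] <= value <= VALID_RANGE[1]):
        return None
    windows[device].append(value)
    # Interpret: a rolling average smooths noise and reveals a trend.
    avg = mean(windows[device])
    if avg > ALERT_THRESHOLD:
        return f"{device}: rolling average {avg:.1f} exceeds {ALERT_THRESHOLD}"
    return None

# Simulated stream: in practice this would arrive continuously from devices.
stream = [
    {"device": "pump-1", "value": 68.0},
    {"device": "pump-1", "value": 999.0},   # faulty reading, filtered out
    {"device": "pump-1", "value": 72.5},
    {"device": "pump-1", "value": 74.0},
]

for r in stream:
    alert = process(r)
    if alert:
        print(alert)
```

Even in this toy form, the shape of the problem is visible: raw readings only become useful once they are validated, combined and compared against something meaningful to the business.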

This, however, is easier said than done, because using and managing big (and streaming) data is very different from managing other kinds of information. Most companies agree that it will become increasingly vital, but as Andreas Goedde has pointed out, it is difficult to know where big data and analytics will be most useful. In other words, what are the best use cases?

Many companies that have tried big data analytics have struggled to generate the expected gains quickly. This may be because getting the best out of big data requires agility and a tolerance of failure. It is not enough to put a single solution in place; companies need to learn to ‘fail fast’ and move on. There is not enough time to spend years implementing an unsuccessful solution. That path leads to madness, and to falling behind the competition.

Experimenting is key to success

The key to success with big data is likely to be experimentation. Many companies are finding that a big data lab or similar set-up helps them explore options in big data and analytics in a protected way, without risking disruption to their wider systems. Once a solution to a particular problem has proven itself in the lab, it can then be rolled out more widely across the company.


Big data lab benefits

First, it is easier to experiment in a formal ‘lab’ setting. If you are not worrying about potential disruption, you can try things out without fear of failure. This supports the ‘fail fast’ approach: test, discard what doesn’t work, and try something else. The willingness to test and fail, made explicit by having a formal lab, gives those involved far more freedom.

Secondly, the challenges of testing and experimentation are very different from those of implementation at scale. It is often hard to tell whether a failure to capitalise on an innovation stems from the original idea or from the way it was implemented.

A big data lab separates the two phases, enabling fast and effective testing of the original idea first, to make sure it works, before it is handed over to those with better implementation skills.

Then there is an even more important factor: a big data lab can be seen as ‘institutionalising curiosity’. This matters because it is very hard to know what you don’t know. Just as some of the greatest scientific discoveries have come about by chance (think of penicillin, for example), a big data lab gives people the freedom simply to play. And out of that play may come some seriously interesting insights. Maybe not the first time, or even the second or third, but at some point.

 

About Author

Casper Pedersen

Thought Leader - Big Data, SAS Denmark

Committed to achieving and exceeding demanding targets and business objectives while remaining focused on providing an exceptional standard of service.
