Using virtual reality to understand big data


What if virtual reality technology allowed you to immerse yourself in big data? That could be your future – and sooner than you think.

It happened in August 2013 at the University of Washington: the first direct brain-to-brain communication, with one researcher controlling another researcher’s brain over Skype. Maybe in the distant future, big data insights will be injected directly into the brain, too. Luckily, the near future offers many ways to rethink how we deliver business data to the brain. And they are far more natural – not to mention much less creepy – than wiring the brain directly to back-end servers.

Your brain on big data

We’ve all heard a lot about big data. Yet in spite of all the talk, we haven’t seen a commensurate increase in the amount of business data actually going to the eyeball and, more importantly, into the brain for processing.

The human brain can handle more data – a lot more – than traditional data visualizations provide. The optic nerve has an estimated bandwidth of about 8 megabits/second. But we use less than 1 kilobit/second of bandwidth when we’re simply reading words on a screen.
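Those figures are worth a quick back-of-the-envelope check. Using the rough numbers above, here's how much of the optic nerve's capacity goes unused while reading:

```python
# Back-of-the-envelope comparison using the rough figures from the text.
optic_nerve_bps = 8_000_000   # ~8 megabits/second, estimated optic nerve bandwidth
reading_bps = 1_000           # <1 kilobit/second while reading words on a screen

ratio = optic_nerve_bps / reading_bps          # headroom factor
utilization = reading_bps / optic_nerve_bps * 100

print(f"Reading uses about {utilization:.4f}% of the channel")   # 0.0125%
print(f"Roughly {ratio:,.0f}x more visual bandwidth is available")  # 8,000x
```

In other words, a screen full of text leaves something like 8,000x of visual bandwidth on the table.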

Consider this example: Imagine your kids playing on the beach at sunset. Which of these three options would best capture the magic of the moment?

  A. A grainy, low-resolution photograph taken with a 19th-century camera.
  B. A five-second high-definition video clip.
  C. A 360-degree interactive video that puts the viewer in the middle of the action with the kids, able to see from multiple angles and hear their laughter in surround sound, with a real-time feed to the grandparents’ houses.

A photograph in this example is akin to traditional executive-style reporting (static pie charts and bar charts), which is great for boiling vast amounts of data down to a few high-level points. For some, that level of data visualization is all they’ll ever want or need. High-definition technologies aren’t needed because there isn’t a lot of information to communicate.

Watching the high-definition video clip for five seconds gives you a much better feel for the situation than looking at the grainy photograph for the same amount of time. But you’re still stuck with one perspective and you’re looking in from the outside – you aren’t immersed and you don’t have control.

The future of data perception could look most like option C, the 360-degree interactive video with surround sound. That’s the option that engages more of your optic nerve and, therefore, more of your brain. Your optic nerve’s bandwidth isn’t fully utilized when you’re reading data on a piece of paper or on a screen. The full bandwidth is engaged when you’re driving, playing soccer or looking for a lost kid in the shopping mall. In other words, we consume more data input when we’re moving about the real world – the 3-D world. Using audio in effective ways is just icing on the cake.

Getting big data into your brain

How can we use computers to get more data through the optic nerve? Fact is, we’re already doing it. That’s what virtual worlds in video games do. When you’re in the virtual world of the video game, you perceive meaning and formulate actions without having to consciously think about it. Behind that virtual world: a bunch of zeros and ones.

Given that a virtual world environment allows your brain’s intuition to take over, let’s consider what this would look like in business. Instead of starting with an aggregated view of data – like a dashboard – and then clicking, clicking, clicking down through low-resolution views, you start with a high-resolution, detail-first view by throwing massive amounts of data on the screen. Then, the brain aggregates the data into patterns as part of regular perception. The brain can see the forest and the trees, and can easily change focus between the different levels of detail. Why hide all of that valuable detail behind a photo-realistic speedometer?
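To make the forest-and-trees point concrete, here’s a toy sketch (with made-up numbers, not real business data) of how a single dashboard-style aggregate can hide exactly the structure a detail-first view preserves:

```python
import random
import statistics

random.seed(1)
# Two very different usage patterns in the raw data (hypothetical example)
cluster_a = [random.gauss(20, 2) for _ in range(500)]
cluster_b = [random.gauss(80, 2) for _ in range(500)]
detail = cluster_a + cluster_b

# The dashboard-style aggregate: one number, forest only
print(f"mean = {statistics.fmean(detail):.1f}")  # ~50, a value almost no point has

# A detail-first view keeps every point, so the two clusters
# (the trees) remain visible to perception
near_mean = sum(1 for x in detail if abs(x - 50) < 5)
print(f"points within 5 of the mean: {near_mean} of {len(detail)}")
```

The aggregate reports a “typical” value that describes almost none of the underlying data; throwing all the points on the screen lets the eye do the aggregation without losing the clusters.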

So, my research question is: Can we use virtual reality techniques to create “data worlds” that deliver massive amounts of business data to the brain?

The answer is yes. Different technologies are converging to make real-time data immersion possible. Here’s how:

1. Event stream processing.

Start with a real-time streaming engine that can push the data, like the SAS® Event Stream Processing Engine. It can handle millions of events per second with processing times of just a couple of milliseconds or even sub-millisecond. Due to its world-leading performance, major financial institutions use the SAS Event Stream Processing Engine for pre-trade pricing of orders. The engine can push way more data than the optic nerve – let alone human cognition – can handle. Fortunately, ESP can analyze and distill data in-stream and can even work with DS2, the powerful new programming language in SAS 9.4, for your analytical needs.
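To illustrate in-stream distillation – this is not the SAS Event Stream Processing Engine’s actual API, just a hypothetical Python sketch – here’s a stream filter that forwards only the anomalous events worth pushing into a human’s field of view:

```python
from collections import deque
import random
import statistics

def trade_stream(n):
    """Simulated stream of trade events (a stand-in for a real feed)."""
    for i in range(n):
        yield {"id": i, "price": random.gauss(100, 2)}

def distill(events, window=100, threshold=3.0):
    """In-stream distillation: forward only anomalous events.

    Keeps a sliding window of recent prices and yields an event only
    when it deviates from the window mean by more than `threshold`
    standard deviations. This mimics how an ESP engine reduces
    millions of raw events to the few worth showing a human.
    """
    recent = deque(maxlen=window)
    for ev in events:
        if len(recent) >= 10:
            mu = statistics.fmean(recent)
            sigma = statistics.pstdev(recent) or 1.0
            if abs(ev["price"] - mu) / sigma > threshold:
                yield ev
        recent.append(ev["price"])

random.seed(0)
alerts = list(distill(trade_stream(10_000)))
print(f"{len(alerts)} of 10,000 events survived distillation")
```

The point of the sketch: the engine sees everything, but only a tiny distilled fraction needs to reach the optic nerve.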

2. Immersive user interface.

Next, you need the immersive user interface. Virtual reality (VR) headsets, such as the Oculus Rift, will soon appear in the gaming market at reasonable prices. These headsets combined with the SAS Event Stream Processing Engine will enable users to navigate vast amounts of business data in infinite space and in real time. If you can’t picture your analysts wearing headsets, don’t worry; the gaming world has long supported point-of-view virtual worlds on flat displays – think World of Warcraft, Minecraft, Second Life, etc.

3. Data-driven audio.

For either a VR headset or a virtual world on a flat screen, data-driven audio can help in a variety of ways. Surround sound makes it possible for an alert (a bell or siren) to seemingly emerge from a point in space. If you hear a bell ring from over your right shoulder, you’ll know to swivel your view to the right to see the data that triggered the alert so you can act on it. Audio can also give a data stream its own ambient sound, just as you can hear a babbling brook when hiking in the woods before you can see it.
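As a sketch of the idea – a simple two-speaker stand-in for full VR spatialization, with illustrative function names – here’s how an alert’s direction in the data world could be mapped to stereo gains using constant-power panning:

```python
import math

def pan_gains(azimuth_deg):
    """Constant-power stereo panning for a data alert.

    azimuth_deg: direction of the alerting data point relative to the
    viewer, -90 (hard left) to +90 (hard right). Returns (left, right)
    gain multipliers for the alert sound. A real VR setup would use
    full 3-D spatialization; this is the two-speaker sketch.
    """
    # Map azimuth to a pan angle in [0, pi/2]
    theta = (azimuth_deg + 90) / 180 * (math.pi / 2)
    return math.cos(theta), math.sin(theta)

# An alert fired by data over the viewer's right shoulder:
left, right = pan_gains(+60)
print(f"left gain {left:.2f}, right gain {right:.2f}")  # right channel dominates
```

Constant-power panning keeps the alert equally loud wherever it sits, so perceived position carries the information without perceived volume changing.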

4. Motion-controlled navigation.

Once immersed in data, you need to navigate, filter, slice, drill into and do all the other things you need to do with data. Why settle for a mouse and keyboard? The Leap Motion Controller and Microsoft Kinect can interpret complex hand gestures and other body movements. Likewise, the gyroscopes in tablets provide interesting input capabilities. Tilting, sliding and even shaking a tablet can send you on interesting journeys through your virtual data reality.
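Here’s a hypothetical sketch of the tablet case: mapping tilt readings to a velocity through the data world (the function and thresholds are illustrative, not any device’s actual API), with a dead zone so the view stays put while the tablet is held roughly level:

```python
def tilt_to_velocity(pitch_deg, roll_deg, dead_zone=5.0, max_speed=10.0):
    """Map tablet tilt to a navigation velocity through the data world.

    pitch_deg/roll_deg: hypothetical readings from the device's motion
    sensors. Tilts inside the dead zone are ignored so the view holds
    steady; beyond it, speed scales linearly up to max_speed at
    45 degrees of tilt.
    """
    def axis(angle):
        if abs(angle) < dead_zone:
            return 0.0
        sign = 1.0 if angle > 0 else -1.0
        magnitude = min((abs(angle) - dead_zone) / (45.0 - dead_zone), 1.0)
        return sign * magnitude * max_speed

    return axis(pitch_deg), axis(roll_deg)  # (forward/back, left/right)

print(tilt_to_velocity(2.0, 30.0))  # (0.0, 6.25): level pitch holds, roll glides right
```

The dead zone is the design choice that matters here: without it, sensor jitter from a hand-held tablet would keep the data world drifting even when the user intends to stand still.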

5. Data immersion in social spaces.

Tablets can bring data immersion into a meeting room, but what if an entire social space were designed for data immersion? All of the walls could be large computer screens, as could the ceiling and even the floor. The whole room would be like a VR headset, with the advantage that people could have natural, high-bandwidth human communications with each other while they are immersed. Fields of data could extend in all directions at their feet, while “data weather” – a skyscape generated entirely from business data – would hover above their heads.

Concluding thoughts

Technological advances are opening up new worlds of possibility for the way we’ll consume data in the near future. The advantage is clear. After all, if you need surveillance across millions of transactions per second so you can recognize and respond to anomalies immediately, a pretty dashboard that shows a static view won’t suffice. What you really need is to be able to see what lies ahead and what might be coming at you from any direction. That’s the future of data perception, and we’ll be there sooner than you think.

Learn more about data visualization.


About Author

Michael Thomas

Sr Systems Architect

Michael Thomas is a Senior Systems Architect at SAS. He holds two patents on high-throughput visualization of real-time streaming analytics. He is the author of three books, several papers for SAS Global Forum and several articles. He is also a longtime member of the SAS Chess Club.
