Tag: high-performance computing

In the first installment of this series on Hadoop, I shared a little of Hadoop's genesis, framing it within four phases of connectivity that we are moving through. I also stated my belief that Hadoop has already arrived in the mainstream, and that we are currently moving from phase three, connecting people, to phase four
The tools for analytics are getting more sophisticated as data becomes more voluminous, says Jim Sterne, President of Target Marketing, in the video below. The real magic still comes from human ingenuity, explains Sterne, but it helps to give analysts the tools they need to make that magic happen. Hear
It's true. "Big data" can be both a problem and an opportunity. Many organizations have struggled to manage, much less profit from, the deluge. In 2012, look for big data to spur demand for big data analytics. New developments in high-performance computing as well as increased demand for visualization and text
The promise of high-performance analytics, as I understand it, is this: regardless of how you store your data or how much of it there is, complex analytical procedures can still access that data, run a series of calculations on it, and provide answers quickly, accurately, and using the full
The basic big data problem is simple to understand: we create too much data to store and analyze all of it. The problem gets bigger, however, when you consider the related factors: our problems themselves are getting bigger, the analytics needed to solve them are more complex, and the data is