SAS Voices
News and views from the people who make SAS a great place to work

If a picture is worth a thousand words, then visualizing data in Hadoop is worth a billion. Over the last few years, organizations have rushed to leverage the low-cost distributed computing and storage power of Hadoop clusters. As Hadoop environments mature and move away from their initial focus of
You’ve likely heard the news that Google DeepMind’s “AlphaGo” computer not only beat a human expert at the game of Go, defeating the European Go champion, Fan Hui, in five straight games, but also beat the reigning world champion grandmaster, South Korea’s Lee Sedol, 4 games to 1. Go
Modeling risk to meet regulatory requirements is costly and complex. Because of that, some have suggested that financial services institutions (FSIs) move toward a set of standardized models. The argument is that central banks and regulatory authorities could then more easily monitor systemic risk and compare apples to apples. But
According to Lloyd Dean, president and CEO, "At Dignity Health, we are committed to developing partnerships and opportunities that harness the tremendous potential of technology, from improving the patient experience to providing caregivers with tools that will support their day-to-day care decisions." Dignity Health, one of the largest health systems
As the big data era continues to evolve, Hadoop remains the workhorse for distributed computing environments. MapReduce has been the dominant workload in Hadoop, but Spark, thanks to its superior in-memory performance, is seeing rapid adoption. As the Hadoop ecosystem matures, users need the flexibility to use either traditional MapReduce
Streaming analytics is a red-hot topic in many industries. As the Internet of Things continues to grow, the ability to process and analyze data from new sources such as sensors, mobile phones, and web clickstreams will set you apart from your competition. Event stream processing is a popular way to