Rowing: What has 150 years of data taught us?

Every year rowers get faster, records are broken and medals are won. But can this trajectory continue? Rowing lends itself well to data analysis, and at the British Rowing Sports Science and Medicine Conference earlier this year I shared some of the insights the rowing community has gleaned from 150 years of performance and race data.

Caversham, United Kingdom: the GBR M8+ at the European Championships team announcement for crews competing in Belgrade in May, at the GBR Rowing training base near Reading, Tuesday 13 May 2014. [Mandatory Credit: Peter Spurrier/Intersport Images]


Each decade, the men have improved by approximately two percent, with a 25 to 30 percent increase in average velocity over the last century and a half. That said, the margins for improvement are getting smaller. So how, in the face of these diminishing returns, can rowers continue to improve? To understand this, we have to go back to where the gains have historically been made.

The formula for rowing success seems simple – increase propulsive power and decrease power losses. In other words – go faster, get rid of wasted effort and the stuff slowing you down.
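As a back-of-the-envelope aside on the figures above (my own arithmetic, not a number from the talk), a per-decade gain can be compounded over the 15 decades in question:

```python
# Illustrative check: compound a ~2% per-decade gain over 15 decades (150 years).
per_decade_gain = 0.02
decades = 15

total_gain = (1 + per_decade_gain) ** decades - 1
print(f"Compounded gain over 150 years: {total_gain:.1%}")  # ~35%
```

A fully compounded 2 percent per decade slightly overshoots the quoted 25 to 30 percent range, which suggests the average per-decade improvement has sat a little under 2 percent.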


Use Hadoop to visualize what wasn’t visible before

You might have lots of data on lots of customers, but imagine if you could suddenly add in a huge dollop of new, highly informative data that you weren't able to access before. You could then use analytics to extract some really important insights about these customers, allowing you to improve the goods and services they receive from you, leading to stronger customer loyalty and higher business revenues.


But how might this happen? It's probably easiest to explain what's now possible that wasn't possible (or practical) before with a simple example.

Consider an insurance company that provides household insurance on a particular property, and the estate agent that sold that property to its current owner. Now think about the data the insurance company will typically hold on this property: basic information about the property itself, its location and number of bedrooms, plus the usual data about the homeowner(s) and their claims history.
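A minimal sketch of that enrichment idea, with invented field names and values (the post doesn't specify a schema), might merge the two views of the same property:

```python
# Hypothetical records: the insurer's basic view of a property, and the
# richer attributes an estate agent might hold for the same property.
insurer = {"P123": {"bedrooms": 3, "postcode": "RG4", "claims": 1}}
agent = {"P123": {"floor_area_m2": 110, "year_built": 1987, "garden": True}}

# Enrich each insurer record with the agent's attributes where available.
enriched = {pid: {**rec, **agent.get(pid, {})} for pid, rec in insurer.items()}
print(enriched["P123"])
```

In practice this kind of join would run over millions of records on a platform like Hadoop, but the principle is the same: previously separate data sources keyed on the same entity.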



Hadoop is dead! Really?

After reading Gartner's 2015 Hadoop adoption study results by analysts Nick Heudecker and Merv Adrian, the first thing that comes to my mind is Goethe's phrase from Egmont, "Himmelhoch jauchzend, zu Tode betrübt." Translated: heavenly joy, deadly sorrow.

What happened to yesterday's hype around the cute yellow elephant - which I still think can repeat the success of Linux in data centers - and what happened to all the talk about its potential to revolutionize the market for data storage and processing?

Let's take a step back and look at what has changed. From a technology perspective, the Hadoop ecosystem has progressed significantly over the past few years. In some areas it still requires a lot of expertise and knowledge to get things done. Other areas, like loading data into Hadoop and visualizing data stored in Hadoop, have become more user-friendly, even for non-techie users. Tools like the SAS Data Loader for Hadoop now make the platform accessible for business users.



VirtualOil: volatility and the value of a hedge

This month we take a fresh analytical view of our hypothetical VirtualOil portfolio by comparing the forward price of WTI (the green line) to the prompt-month price (the red line). The resulting graphic (chart 1) demonstrates the relative stability of the 48-month forward price in contrast to a very active spot price, though the forward price has been on a downward trend since 2011. The blue line shows the forward premium of that 48-month price over the prompt price.


Chart 1: Prompt, Price and Premium
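As a minimal numeric illustration of the relationship behind the blue line, with invented prices (not values read off the chart):

```python
# Hypothetical WTI prices, USD per barrel (illustrative only).
prompt_price = 60.00   # prompt-month (spot) price, the red line
forward_48m = 72.00    # 48-month forward price, the green line

# The forward premium (blue line) is the forward price minus the prompt
# price; a positive premium means the curve is in contango.
forward_premium = forward_48m - prompt_price
print(f"Forward premium: ${forward_premium:.2f}/bbl")
```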



How to prevent a failed proof of concept

A proof of concept (POC) is the smartest way for customers to evaluate whether a product meets the required objectives, and the best way for vendors to demonstrate why they feel they are best placed to resolve the current outstanding problems. However, not all POCs are successful. Let's explore why.

What is a failed proof of concept? 

A failed POC is one that has one of the following end results:

  1. The vendor(s) fail to prove the concept as originally conceived.
  2. The concept is proven but does not provide the expected outcome in terms of value.
  3. The POC fails to satisfy the intended stakeholders.
  4. The POC results are inconclusive, leaving the customer confused and the vendor frustrated, with both claiming they have done little wrong.

How do you avert failure – or at least fail fast?



Three ways to modernize and expand your analytics programs

The cottage industry was based on workers buying raw materials, bringing them home and producing hand-crafted items to sell. The system worked, but it was slow, tedious and expensive, producing goods that were affordable only to the rich.

The Industrial Revolution changed all that. The factory system brought machines and workers into factories that reliably and quickly produced mass quantities of items at a much lower cost.

You can easily see the connection to the analytical process. Too often today, the analytics process is run like a cottage industry: Workers get raw data from IT, analyze it in silos and produce predictive insights for their individual business units.


Exploring my personal IoT data with SAS

After acquiring personal IoT data in part 1 and cleaning it up in part 2 of this series, we are now ready to explore the data with SAS Visual Analytics. Let's see which answers we can find with the help of data visualization and analytics!

I followed the general exploratory workflow described by the Visual Analytics mantra:

"Analyze first, show the important, zoom, filter and analyze further, details on demand."


There’s no ‘i’ in data science team

Well OK, so there is an "i" in science, but being a data scientist is certainly not a lonesome job. Engagement with other team members is essential in data analytics work, so you never really work in isolation. Without the rest of the team, we would fail to ask all the right questions of the data needed to solve critical business issues, and the hard-earned insights produced would go unused or misunderstood by the organisation we're working with.

So, what are the key ingredients of a data science team that you should be looking for? It is, in fact, a group of employees with quite diverse roles. My SAS colleague Jennifer Nenandic highlighted these in her recent blog post, How to build a data science dream team. I’ve summarised the star players here.

The translator (AKA, the business manager)

The translator is a subject matter expert with lots of business acumen: an understanding of the issue from a business perspective. As the team focuses on one analytics effort after another, the translator's role will change. Their job is to help the rest of the team understand the business context of the challenge. They are involved from the beginning of the project, helping to set the scene, right through to the end, when the results are presented. With the business context in mind, they can also help prepare and present the results, as well as the return on investment from the project.


Costing in a shared services environment

We often hear questions like: Are the shared service chargebacks to my business units' cost centers accurate and transparent? Will I save any money by using a centralized shared service? Why should I consider a centralized shared service?

These are all good questions. To answer them, you need to understand your organization's cost structure. That brings us to the age-old debate: Is traditional cost accounting sufficient to provide insight into what things cost and why they cost what they do? Answering that depends on our ability to understand the root causes of costs, as well as the cause-and-effect relationship between resources and outputs.

Many organizations today are exploring options to optimize services, particularly support and back-office services, by addressing root causes of inefficiency, including resource-sharing constraints and poorly aligned processes and supporting systems.

For example, IT services, call centers and order processing are typical candidates for process alignment and cost reduction initiatives. More complex organizations are even looking at sharing sales, field service and logistics operations across diverse business units. Many believe these improvements can be realized through the implementation of shared service centers.
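One common way to make shared-service chargebacks transparent is driver-based allocation. The sketch below is purely illustrative: the unit names, figures and single cost driver are invented, and the post does not prescribe this method.

```python
# Allocate a shared call centre's cost to business units in proportion
# to the calls each unit generated (hypothetical numbers).
shared_cost = 100_000.0
calls_by_unit = {"retail": 6_000, "commercial": 3_000, "claims": 1_000}

total_calls = sum(calls_by_unit.values())
chargeback = {unit: shared_cost * calls / total_calls
              for unit, calls in calls_by_unit.items()}
print(chargeback)  # retail bears 60% of the shared cost
```

Tying each unit's charge to a cost driver it can influence is one answer to the "accurate and transparent" question raised above.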



Three ways to improve your analytics architecture

Do you “buy and build as you go” with your analytics architecture? Most companies do, and have for decades. The result is a heterogeneous environment for analytics with a variety of hardware, software, databases and analytical applications used in silos. There’s tremendous duplication of data and inconsistency in the analytical process, leading to a lot of wasted time and money.

What can be done to improve your analytics architecture? It's much like improving a home – you can renovate your existing house (modernize), expand the existing structure (extend) or knock down what you have and rebuild (innovate). Let's take a closer look at these three scenarios.
