5 ways for retailers to thrive in post-Brexit Britain

Black Friday 2016 took everybody by surprise. The biggest shopping day of the year is a crucial date in any retailer’s calendar. And rightly so. Auditors and analysts predicted that 2016 would see the majority of consumers splashing out more cash than ever before on everything from scented candles to big screen TVs.

But things went a little differently than expected. Online sales failed to achieve the expected double-digit surge. Whilst IMRG had forecast sales hitting £1.27bn (a 16 per cent spike on last year), the figure fell to £1.1bn. Rather than spending more on the big day, shoppers waited to purchase what they would have bought anyway, and just 21 per cent of the UK opted to take advantage of the discounts.

So what went wrong?

I think Brexit is a pivotal factor in the drop in sales that's now referred to as Flop Friday.


4 steps to get started with data governance

Data governance seems to be the hottest topic at data-related conferences this year, and the question I get asked most often is, “Where do we start?” Followed closely by: how do we do it, what skills do we need, and how do we convince the rest of the organisation to get on board and not see this as the team trying to “control” how they work?

The last question is a legacy of overzealous data governance workers from the 1980s; the first few questions, however, are an outcome of the data-driven world we now find ourselves in.

Moore’s Law has continued to hold true, and its exponential nature now sees organisations swimming not only in data, but in new and external data sources. So where you start in 2016 is quite different from where people started back in the 80s.

For those starting out, trying to show value quickly can be a daunting task -- especially when the organisation sees data governance as a necessary evil. So here are four steps to consider as you get started.

1. Don’t talk about data governance at all, if you can help it

While the definition of data governance is still correct, and the need for it is more important now than ever, I’d suggest you update the terminology you use with your organisation so that it's less about control and rigidity, and more about frameworks, guidelines and decision-making routes.

Many large organisations are still recovering from the hangover of allocating data governance roles back in the 80s, where they were executed with gusto and diligence.

Creating a new term for data governance is not easy, but using more friendly terminology is. For example, you could describe it along the lines of data policy management or co-ordination, and I’m sure there are a lot of other suggestions out there. The goal is to remove the idea of hard absolutes that used to come with this program and replace it with a different approach to managing data for the good of the business.

2. Focus on critical outcome-based activities

The second step is to focus on outcome-based activities. The need to govern data is based on the goal of ensuring we make the right decisions and take the right actions at the right time. Those decisions and actions should be based on the current objectives of the organisation.

Identifying the critical data underlying these objectives gives us our starting point. Obviously over time these objectives will change, as will the data we manage.
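One way to make this step concrete is to keep an explicit mapping from objectives to the data elements that underpin them. Here's a minimal sketch; the objective names and field names are purely illustrative assumptions, not from the article:

```python
# Hypothetical sketch: map current business objectives to the critical
# data elements that underpin them. All names here are illustrative.
objectives = {
    "reduce_customer_churn": ["customer_id", "contract_end_date", "support_tickets"],
    "improve_delivery_times": ["order_id", "dispatch_timestamp", "delivery_postcode"],
}

# The union of these fields defines the starting scope for governance work.
critical_data = sorted({field for fields in objectives.values() for field in fields})
print(critical_data)
```

As the article notes, objectives change over time; keeping the mapping explicit makes it easy to recompute the governed scope when they do.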

3. Set benchmarks and measure, measure, measure

Once these critical data sets have been identified, you must create data quality benchmarks. Creating a data quality benchmark is key in setting a stake in the ground upon which KPIs and measurement metrics can be based. It’s also the starting point in understanding which data cannot be relied upon in its current form. This is when data cleansing/standardisation needs to occur, to rectify the issues that currently reside in the data. This progress needs to be monitored and measured.
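A benchmark like the one described above can be as simple as measuring completeness and validity for each critical field and recording the figures as KPIs. The sketch below assumes a toy record set and a UK-postcode validity rule; both are illustrative, not from the article:

```python
# Hypothetical data quality benchmark: measure completeness and validity
# for one critical field, then track the ratios as KPIs over time.
import re

records = [
    {"customer_id": "C001", "postcode": "SW1A 1AA"},
    {"customer_id": "C002", "postcode": ""},
    {"customer_id": "C003", "postcode": "not-a-postcode"},
    {"customer_id": "C004", "postcode": "EC1A 1BB"},
]

# Simplified UK postcode pattern, for illustration only.
POSTCODE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}$")

def benchmark(rows, field, rule):
    """Return completeness and validity ratios for one field."""
    present = [r[field] for r in rows if r[field]]
    valid = [v for v in present if rule(v)]
    return {
        "completeness": len(present) / len(rows),
        "validity": len(valid) / len(rows),
    }

kpi = benchmark(records, "postcode", lambda v: bool(POSTCODE.match(v)))
print(kpi)  # e.g. {'completeness': 0.75, 'validity': 0.5}
```

Re-running the same benchmark after each cleansing pass is what turns the "stake in the ground" into measurable progress.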

4. Set up the framework to manage data and exceptions

As a good colleague once said, the goal here is not to become a laundromat. In order to ensure data cleansing does not become a regular occurrence, guidelines and policies need to be put into place. These may be business rules for ensuring accuracy at data creation/capture, or they may be training programs for new users to understand why data needs to be captured in a certain way, and the fallout of not capturing it correctly.

Once these policies are in place, data governance frameworks can be updated to deal with exceptions. Exceptions are not necessarily wrong, hence the need for a framework to identify the cause and determine which level of the data organisation should deal with them.
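An exception-routing framework can be sketched as a set of rules, each naming the level of the data organisation responsible for investigating a failure. The rules, role names and records below are illustrative assumptions:

```python
# Hypothetical exception-routing sketch: each rule pairs a failure check
# with the level of the data organisation that should investigate it.
RULES = [
    ("missing_email", lambda r: not r.get("email"), "data steward"),
    ("negative_amount", lambda r: r.get("amount", 0) < 0, "finance data owner"),
]

def route_exceptions(rows):
    """Return (record index, rule name, responsible level) for each failure."""
    exceptions = []
    for i, row in enumerate(rows):
        for name, failed, level in RULES:
            if failed(row):
                exceptions.append((i, name, level))
    return exceptions

rows = [
    {"email": "a@example.com", "amount": 10},
    {"email": "", "amount": -5},
]
print(route_exceptions(rows))
# → [(1, 'missing_email', 'data steward'), (1, 'negative_amount', 'finance data owner')]
```

Because an exception is not necessarily an error, the routing only identifies who should look at it; the responsible level then decides whether the data or the rule needs to change.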

When setting up a framework, use the above benchmark data to illustrate the “why.” This is key to the organisation understanding the need for what you're asking of them.

Over the years, I’ve found the benchmarking and measurement step by far the most valuable in terms of both providing a starting point from which to improve, but also in helping the organisation and the executives understand the fallout of unreliable data.

Providing evidence of problems using empirical data that’s attached to the organisation’s objectives helps provide focus. It’s also key in identifying the ROI of a data project that requires a governance approach, and allows a data team to quickly show value via the measurement of data on its journey to reliability.

However, this is only one side of the coin. Often those identified to become the stewards of the data are seen as the owners and problem solvers. And in an organisation full of people who are all attached to the data, that’s an unrealistic expectation.

For more on this, take a look at the following article: 5 models for data stewardship.


Have a holly, jolly smart grid

Ready for another (soon to be classic) fun, custom-made for IT holiday jingle? Fire up the hot chocolate, gather around the water cooler and belt this one out at the upcoming office party:

Have a holly, jolly smart grid
It's about time we upgraded our electric infrastructure
I don't know if there will be outages
But I hope you have an IoT-connected smart meter this year!

Have a holly, jolly smart grid
And when you walk down the street, enjoy the wifi
Enabling you to say hello to all the friends you know
And to link/connect/friend everyone you meet!

Oh, ho the hotspots go
Broadcasting where you can see
Somebody waits for you
Snapchat her once for me!

Have a holly, jolly smart grid
And in case you've installed solar/wind
Oh, by golly have a holly, jolly smart grid this year!

Happy holidays to you and yours!


No data scientist? No analytics platform? No problem.

“Analytics” and “data scientist” aren’t new terms, but they are trending buzzwords. The popularity of these concepts has created a false impression: Analytics are mysterious abstractions that can only be decoded if you have a white lab coat and an advanced degree in computer science.

The reality couldn’t be more different. It’s true that data scientists are statistical whiz kids who skillfully manipulate massive amounts of data and generate intelligence with data mining, machine learning and predictive modeling. But with the right set of tools, non-data scientists can also reap the benefits of advanced analytics. A streamlined, automated data integration process and robust visual exploration tools can make analytics the most powerful collaborative tool available to your organization.



Drivers for the digitalization of insurance

The insurance industry is becoming increasingly focused on the digitalization of its business processes. There are many factors driving digitalization, but it’s clear that a reliable and meaningful database is the basic prerequisite for a successful digitalization strategy.

Insurance companies are increasingly prioritizing digitalization, not because this issue is currently "in," but because twelve changes in the insurance business process environment are forcing companies to act.


Clinical research data sharing promises new cures and treatments

Clinical research generates extensive amounts of data, yet most of it is siloed or generally unavailable to a larger pool of willing potential researchers. If this data were liberated to the masses, we would venture into a world of endless possibilities where the search for new cures and treatments could be accelerated. From curing cancer to combating cardiovascular disease, clinical research data sharing has the potential to benefit many health care goals.

Clinical research data sharing for cardiovascular scientific discovery

The Duke Clinical Research Institute (DCRI) is at the forefront of this movement to engage in data sharing and research transparency, as noted by Drs. Eric Peterson and Michael Pencina in the short video below.

"We're trying to promote the message that more, better research can be done if we come together and share," says Pencina in the video.



4 basics disruptive innovators must get right

Being an industry disruptor is a lonely place to be – but if you’re successful, the rewards are well worth the initial risk. Betting big on your new way of doing things takes courage, and is only the first step in a risky process.

Your next critical step is to quickly find and convince your target market to take that ride along with you.

So what does it take to leave that well-travelled highway and take off on your own uncharted course?

At its simplest, it’s all about being agile. If you can’t respond quickly to market feedback, your route could easily turn into a dead end. So what does agility look like when you’re (often) a small start-up with a lot to prove? It means you listen, learn, adapt and act in real or, more likely, near real time.


Over the wires and through the cloud

It's that festive time of year again, so you may want to build yourself a fire and grab a cup of hot chocolate as you prepare for a rousing round of holiday/IT joy.

Grab your co-workers and gather around the water cooler while singing along to this post and others from years past like: "Twas the night before big data"; "Rudolph the red-nosed Bayesian network?"; "Big data is coming to town!"; and, of course, who could forget (you can't, even if you want to!): "The twelve days of big data analytics."

Over the wires and through the cloud
To Grandmother's smart house we go.
The GPS knows the way to guide the car
Through white and drifted snow.

Over the wires and through the cloud
Oh, how the data does blow up.
Text streams and bytes get stored
As sensors send data all around as we go.

Over the wires and through the cloud
The connected car throws off a full day's worth of data
Oh, hear the streaming data ringing ting-a-ling-ling,
For it is a cloud IoT day

Over the wires and through the cloud
Trot faster my NiFi way
We love the sound of music as it streams o'er the ground
For it is a cloud IoT day

Over the wires and through the cloud
And straight through the smart grid we go.
We seem to go so dreadfully slow when our wifi fails to connect
Waiting on fiber to be installed is so frustrating.

Over the wires and through the cloud
Now Grandma's hotspot I can connect to.
Hurrah for IoT; the streaming movie is almost done;
Hurrah for the spark that helps run our connected car.

Enjoy the holidays!!!
NOTE: Soundtrack will soon be available on 8-track as well as cassette (Google these if you aren't familiar with these terms :).


Rise of the CDO reflects the rising role of data

As technology evolves, so do the c-suite roles related to technology. In particular, the roles of Chief Digital Officer and Chief Data Officer – both referred to as CDO – have seen rapid changes.

This post will document the changes I've observed in these two roles, and answer questions I've heard as our customers have been navigating the changing technology landscape.


Cybersecurity: A conflict of old and new

The cybersecurity challenge exemplifies how global threats have evolved and how governments must combat them. For all the complexity of the Cold War, United States defense officials knew which nations posed the biggest threat.

The world is much different today. As General Michael Hayden (ret.), former Director of the National Security Agency (NSA) and Central Intelligence Agency (CIA), explained at the SAS Government Leadership Forum, the biggest threat to the United States no longer comes from conquering states – although that's still a concern – but from failing countries or individual actors.

Technology has created a lower barrier of entry for those looking to do massive harm, and cybersecurity is at the battlefront.

New world, new threats

“The industrial era tended to aggregate power toward the center,” Hayden said. “But we're no longer living in that era. We're in a post-industrial era, which pushes power to the edges and down. In other words, the things we used to fear coming at us from malevolent state powers are now within the reach of sub-state groups, gangs and even individuals.”

This dynamic creates a complex world that's difficult for defense organizations to navigate. With a range of attacks from different actors, government needs to know where these attacks come from. Advanced analytics can provide this information so agencies can not only see what's happening on their network, but who's causing the problems.

How analytics can help

Tom Sabo, a solutions architect at SAS Federal, gave some insight into how analytics can help fend off attacks. As Tom explained, we live in a world of people with different digital experiences. Everyone born after 1980 is known as a “digital native.” They grew up with technology as a major facet of their lives. Those born before 1980 are known as “digital immigrants” who became immersed in technology later in life.

Attackers tend to be digital natives, while the defenders tend to be digital immigrants. That's especially true inside government. Advanced analytics, Sabo said, can help narrow the gap between the two sides.

Fritz Lehman, EVP and Chief Customer Officer for SAS, echoed this statement. He said that with the increase in data, cybersecurity has become more difficult than ever, and that everyone, including government, needs a way to make sense of this data.

“Data is coming at us in bigger and faster ways than it ever has before. So we're constantly trying to figure out a way to take this flood of data and give you faster insights,” he said. “We're in a point in time where there's so much we need to know immediately, but data is getting bigger.”

The key for government is to equip a wide range of employees with data monitoring tools, and not limit data monitoring to a handful of data scientists. The world is getting more complex, but advanced analytics can help reduce the noise and provide for better cybersecurity defense.

Please check out these on-demand presentations and others from the SAS Government Leadership Forum.
