If you're like most large organizations, you've got a bunch of business intelligence and analytic applications at your fingertips. In my prior organization, a large insurance company, I believe the magic number of tools the organization had to support was north of 40. No kidding. Now, this is a pretty typical situation - large divisions within the company often run independently of each other, you may have inherited technologies from mergers and acquisitions, or you really need a capability that only a specific tool provides.
A couple of years ago, I was working in a division of this insurer that desperately needed an overhaul of its information delivery and analytic capabilities in order to achieve its business objectives. A new business intelligence team was created to address this need. The team began working on modernizing both information delivery and infrastructure. We built a nice data mart and gave the user community fancy new, shiny tools to access the data. This included a robust reporting solution with three separate applications for drill-down reporting, canned reporting and ad hoc reporting, as well as two new analytic tools, SAS Enterprise Guide and JMP. We rolled out a combination of internal and external training. “Go forth and be happy!” we said to the users, thinking they'd be thankful for all of the wonderful capabilities we'd bestowed upon them.
Yet our user adoption rates were abysmal. What could we have possibly forgotten? Where did we go wrong? Why weren't our users buying us flowers and gift baskets as tokens of their undying gratitude? First of all, five new capabilities were TOO many at one time. We hadn't done a good job of segmenting our users, especially when it came to the analytic tools, and we'd segmented our toolsets just as poorly. Users never knew which tool they were supposed to use in which situation, and as a result, they retreated to their comfort zones.
How did we overcome the problem? Tracking user adoption rates after deployment was essential. We wouldn't have known there was a problem if we hadn't been really engaged with our user community. We discovered how important it was to understand each individual's business processes, pain points and key motivating factors to adopt a new way of doing things (surprisingly, people don't like change, even when you're fixing something that's driving them crazy). People are afraid, and they may even feel threatened. Taking a three-day process and turning it into a ten-minute process can be intimidating if the end user doesn’t know what to do with all that extra time.
We retrenched and came up with a more personalized approach to training. Within our own group, we designated leads for each technology. As the lead for SAS and JMP, I instituted a program I call the “SAS Buddy” method (and certainly this could apply to any tool).
Basically, it requires a lot of meetings, either one on one or in very small groups, between an experienced user and a neophyte. The experienced user helps the new user identify target projects that are a good fit for the new tool, assesses the complexity of each project, makes recommendations, and provides an ear to listen and help as the user starts out. External classes are a key part of this process, too, but there is no substitute for hands-on, meaningful project work.
In smaller groups, users will feel more comfortable asking questions and trying new things. As these users gain skill and confidence, they'll become SAS Buddies to other new users. At the start, I was the SAS Buddy for 50 new users – and over time a good handful took over the onboarding process for new users in their area.
There are some other key success factors, including:
- Abandon the “one size fits all” approach to training and customize learning opportunities.
- Make people accountable for learning the new technologies.
- Give them time to learn - let them carve out a percentage of their time to use and explore their new tools.
- Carefully document the strengths and weaknesses of each tool and provide users with the appropriate guidance on when to use what.
- Continually survey and assess your community's needs - make sure you don't have usability or perception issues around what a tool can or cannot do (all it takes is one person to spread a rumor!).
If you don't have any experts in your organization, form a small user group and learn together. Grab the laptops, find a conference room and share things that you've learned. Be a buddy. You'll be surprised at the power of collaboration.