Don’t ignore the next great analytic competitive advantage


This guest post was written by Andy Pulkstenis, Director of Advanced Analytics for State Farm Insurance. He leads a team of advanced analytics professionals providing statistical analysis and predictive modeling support for the enterprise across a variety of business units. His background includes more than a decade of experience improving business strategies with designed multivariate experiments. Pulkstenis will be a presenter at the Analytics 2015 conference in Las Vegas, Oct. 26-27. We hope to see you there.

In my 20-year applied analytics career, I’ve been fortunate to witness the evolving landscape of business analytics. One notable shift came when companies finally discovered the power of predictive modeling. Initially it was a tough sell in a world then dominated by tradition, experience, and classic MBA methodology; now it’s difficult to imagine any company being a serious contender without predictive modeling in its analytic arsenal. Today, when you examine most market leaders, statistical modeling is as firmly entrenched in the corporate culture as Microsoft Windows, SAS, khakis, and snarky Dilbert cartoons. Predictive analytics finally made it, but its cousin experimental design (i.e. statistical testing, MVT, DOE, A/B testing, test-and-learn, etc.) remains largely on the outside looking in.

Despite its potential to radically transform currently held anecdotal beliefs about a business, unlock new or deeper insights into the drivers of customer behavior, and truly optimize strategy delivery, statistical testing has been embraced very slowly by the applied analytics community, even amid a growing number of success stories. I can say with confidence that my conference presentations on business experimentation are consistently the best talks on testing at a given event – unfortunately because I’m typically the only speaker there talking about the topic!

I suspect this slow adoption rate stems from blindness to the power of testing, misplaced fears about the complexity of implementation (in reality the degree of difficulty is on par with building and implementing predictive models), and a scarcity of skilled corporate practitioners (outside of manufacturing, agriculture, and biostatistics, that is). We ignore this valuable tool at our own peril.

Many of us face highly competitive business environments shaped by regulatory limitations, practical or economic constraints, or unique consumer dynamics. Until now, advanced analytics has provided a bit of a competitive differentiator, reshuffling the deck and re-sorting the corporate winners and losers. But what happens when nearly everyone in a market is using models and data science on a daily basis? Where can we go for that additional analytic competitive advantage? One answer may be statistical testing.

Even in a highly analytic culture that has embraced advanced analytics and modeling, business experimentation has significant value-add:

  • Testing can be used to further optimize strategy assignment, improving customer value as learned insights enable you to offer customers what they truly want or need as individuals at a given point in the customer lifecycle.
  • It can lower operational costs by discovering efficiencies (and identifying inefficiencies) in your processes or operations center activities.
  • Rigorous experimentation can inform product or strategy development, whether it’s a new credit card configuration, marketing message, internet offering, customer retention effort, or something else entirely.
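
Even the first of these steps, a simple A/B test, deserves a rigorous read of the results rather than an eyeball comparison. As a minimal sketch (not any specific State Farm methodology, and using entirely hypothetical conversion numbers), here is a two-sided z-test comparing the conversion rates of a control cell and a challenger cell:

```python
from statistics import NormalDist

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)               # pooled rate under H0
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))           # two-sided p-value
    return z, p_value

# Hypothetical campaign: control converts 520/10,000, challenger 600/10,000
z, p = two_proportion_ztest(520, 10_000, 600, 10_000)
```

With these illustrative counts the difference clears the conventional 0.05 significance bar, but the same arithmetic on a smaller sample often would not, which is exactly why the formal test matters.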

To learn more about the power of business testing, how we are building a culture of testing and experimentation at State Farm, and how to start or improve testing at your company, drop by my breakout session “Do You Know or Do You Think You Know? Building a Test-and-Learn Culture at State Farm” at Analytics 2015 in Las Vegas on Monday, October 26th, 2015.


About Author

Maggie Miller

Education and Training

Maggie Miller was formerly a communications specialist at SAS. You'll likely find her writing blogs, shooting videos and sharing it all on social media. She has nearly ten years of journalism experience that she brings to her writing to help you learn and grow with SAS. Follow on Twitter @maggiemiller0

Comments

  1. Andy,

    Thank you for this article; it made me much more confident in my approach to testing new dogmas.
    I would like a better understanding of how to determine what targeting scale to use.
    Do I need to test it on 'no-filter-audience' or 'my-audience-filter'?


  2. Andy Pulkstenis on

    Hello Maik,

    Thanks for the question and sorry for the delayed response but I was out of office last week and didn't see it until this morning.

    Unfortunately, I haven't found many strong published references for how to do testing in a business setting. There are some analytics books that mention it briefly, but not in any depth beyond the kind of detail I provided here. I found one book that appeared to address it more comprehensively a while back but didn't agree with much of the content I saw inside. So I typically point folks towards the following well-regarded resources:

    1) If you have a basic or higher Statistics background, any experimental design book by Douglas Montgomery is a great place to start. It's the text used by many, many high quality graduate programs in stats, and although it covers many topics you will likely never need in business in addition to the ones you will, it's an excellent overall resource.

    2) SAS used to offer a great course called "Design of Experiments for Direct Marketing," but when I checked a few months ago they were not offering any US sessions, with none planned. Perhaps that has changed. The course notes are a really nice intro to applied business testing, because while direct marketing is only one application, the principles apply broadly across applications. If you can get the course notes (they used to be available through SAS for purchase), anyone with a little SAS and statistics background can work through them very easily. I believe SAS continues to widely offer a couple of JMP-based experimental design classes, but since I'm more of a SAS guy, I'm not as familiar with the details of the JMP training content. If you have some familiarity with JMP, I would guess those courses offer the same high quality of content as the SAS class I mentioned; the topics covered almost certainly differ somewhat, but I know JMP has a rather extensive DOE package.

    3) Finally, Harvard Business Review has some really nice high-level overviews with good examples of business testing. One I often share with business partners to get them ready for a testing conversation is "The Discipline of Business Experimentation" and another good but somewhat old one now (2009) is "How to Design Smart Business Experiments." You should be able to find both through internet searches, but like I said, they are extremely high-level.

    Good luck in your quest for more information. If you find a great published resource, maybe post it back here for the rest of us!

    • Hi Andy,

      Thanks for the detailed response. And thanks for introducing the topic, you have certainly piqued my interest. I look forward to reviewing the resources and hope to catch up in some capacity, maybe at a future SAS user group meeting.

  3. Mr. Pulkstenis,
    I am new to the topic of experimental design in a business setting, or as you say, business experimentation, and would like to learn more. I am not planning to attend the Analytics summit but instead was hoping you could point me to some published resources. I was unable to find related papers in the SAS archives. Thanks in advance. Ms. Miller, maybe you can point me in the right direction. Thanks again.

  4. Andy Pulkstenis on

    Dr. Pettit,
    Thanks for the kind words! I have found that many companies that have started testing stop at A/B testing or super simple full factorials of 4 or 6 combinations. When a company moves from no testing to A/B testing, the results are so powerful I think it creates an illusion that 'this' must be the destination they were aiming for, when in fact it's just the first step and many additional benefits are waiting for them. Next is moving to multivariate testing to understand interactions and the impact from individual components, then reduced designs which allow more ambitious and comprehensive tests, then incorporating covariates into the analysis to find the best winning combination for each segment or individual instead of one winner for all - that's a lot of additional progress and business value being left on the table simply because the first step had such a significant impact it's hard for them to imagine that more exists. Your iceberg analogy is very fitting.

  5. Andy, great article. It is so interesting that the whole side of engineering DOE is somewhat missing in the conversations and discussions around big data and now of course big data analytics. Truth be told there is an 'iceberg' of methods, with only a few (A/B, Test/Control) showing at the top. I believe, as you do, that the winners will not be afraid to step outside and embrace other areas of expertise where these MVT, Fractional Factorial, etc. etc. designs are well developed. We are at such a great phase in our industry where we can utilize all of these methods and processes to our advantage. Kudos, and look forward to seeing you at the upcoming event!

  6. Andy Pulkstenis on

    Just wanted to let people know that I will be monitoring the comments for the immediate future in case anyone had questions related to business experiments or challenges you are facing with testing at your company, so feel free to shoot them along here and I'll do my best to respond.
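
Editor's note: the "reduced designs" discussed in the comments above can be made concrete in a few lines. As a hypothetical illustration (not State Farm's tooling or any commenter's code), this generates a classic half-fraction of a 2^4 factorial using the defining relation I = ABCD, cutting 16 runs down to 8 while keeping all four main effects estimable:

```python
from itertools import product

def half_fraction_2_4():
    """2^(4-1) half-fraction with defining relation I = ABCD (resolution IV)."""
    runs = []
    for a, b, c in product((-1, 1), repeat=3):  # full 2^3 factorial on A, B, C
        d = a * b * c                           # alias factor D with the ABC interaction
        runs.append((a, b, c, d))
    return runs

design = half_fraction_2_4()
# 8 distinct runs instead of the 16 a full 2^4 factorial would need;
# every run satisfies a*b*c*d == +1, the defining relation.
```

The trade-off is that D is aliased with the ABC interaction (and each two-factor interaction with another), which is exactly the kind of assumption a practitioner weighs when choosing how far to fractionate a design.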
