Is experimental design the red-headed stepchild of modeling?


Andy Pulkstenis of State Farm thinks it is, stating that this red-headed stepchild among modeling techniques is where predictive modeling was ten years ago. He opened his talk, "Do You Know or Do You Think You Know? Creating a Testing Culture at State Farm," at the A2012 conference in Las Vegas with these assertions. He went on to lay out what it takes to be a testing culture, walked through a proof-of-concept test conducted on a website layout, and then gave his blueprint for building a testing culture.

You know you are in a testing culture when...

  1. Experimental design is an ingrained part of your strategy
  2. You know when A/B testing is "good enough" and when more "advanced" techniques (e.g. multifactor tests) are advised
  3. Internal standards and best practices exist for test design
  4. Formal internal governance is practiced

Design of experiments can be overwhelming and even scary if testing isn't part of your culture, so Andy started his campaign for testing at State Farm with a fairly simple test of a website layout. The metric of interest was quote starts - how many customers clicked through to get a "quote" on an insurance policy. They had four test factors (levers to play with), but each one had multiple options: should the page have tabs or no tabs, emphasize the quote or a phone number to call, and so on, for 24 possible combinations (known as "test cells"). He used a statistical technique called D-optimal design to choose 12 of these combinations to test, allowing them to infer the results of the remaining 12. The tested combinations showed statistically significant differences, and the model's predictions were close to the observed results (increasing faith among the doubters).
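To make the idea concrete, here is a minimal sketch of greedy D-optimal subset selection - picking the 12 of 24 candidate test cells that maximize det(X'X) for a main-effects model. The factor names and level counts are illustrative assumptions (not State Farm's actual factors), and the greedy loop is a simple stand-in for the exchange algorithms used by tools like PROC OPTEX:

```python
import itertools
import numpy as np

# Hypothetical factors loosely modeled on the website test described
# above; names and level counts are illustrative, not State Farm's.
factors = {
    "tabs":     [0, 1],       # no tabs / tabs
    "emphasis": [0, 1],       # emphasize quote / emphasize phone number
    "hero":     [0, 1],       # image variant A / B
    "layout":   [0, 1, 2],    # three layout variants
}
# Full factorial: 2 * 2 * 2 * 3 = 24 candidate test cells
candidates = list(itertools.product(*factors.values()))

def model_row(cell):
    """Main-effects model row: intercept, effect-coded 2-level
    factors, and two dummy columns for the 3-level factor."""
    t, e, h, l = cell
    return [1.0, 2*t - 1, 2*e - 1, 2*h - 1,
            float(l == 1), float(l == 2)]

X = np.array([model_row(c) for c in candidates])

def d_optimal_greedy(X, n_runs, ridge=1e-6):
    """Greedily pick n_runs distinct rows of X maximizing det(X'X)."""
    chosen = []
    M = ridge * np.eye(X.shape[1])  # ridge keeps M invertible early on
    for _ in range(n_runs):
        best_i, best_det = None, -np.inf
        for i in range(len(X)):
            if i in chosen:
                continue
            d = np.linalg.det(M + np.outer(X[i], X[i]))
            if d > best_det:
                best_i, best_det = i, d
        chosen.append(best_i)
        M += np.outer(X[best_i], X[best_i])
    return chosen

design = d_optimal_greedy(X, 12)
print(len(design), len(set(design)))  # 12 distinct cells out of 24
```

Fitting the main-effects model to the 12 tested cells then lets you predict the outcome for the 12 untested ones - the inference step described above.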

They learned not only which design was better but also why, uncovered interaction effects between factors that they had been told were unlikely to occur, achieved results 12% better than their existing design, and generated enough belief in experimental design to warrant interest in trying again. His next test considered seven test factors spanning 384 possible web pages. Testing a subset of the combinations led to a winning page with a 54% increase in quote starts and a 10% increase in quote completes.
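The cell counts show why optimal subsets matter: combinations multiply across factors, so testing every page quickly becomes infeasible. The level counts below are illustrative guesses consistent with the totals reported in the talk (the post does not give the actual levels):

```python
from math import prod

# Illustrative level counts consistent with the totals in the post:
# 24 test cells in the first test, 384 possible pages in the second.
first_test = prod([2, 2, 2, 3])            # 4 factors -> 24 cells
second_test = prod([2, 2, 2, 2, 2, 4, 3])  # 7 factors -> 384 pages
print(first_test, second_test)  # 24 384
```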

Andy's blueprint for success in creating a testing culture includes six recommendations:

  1. Be a teacher, educating your colleagues on experimental design
  2. Be opportunistic, always on the lookout for a chance to test
  3. Be persistent, because new ideas can take a lot of determination to take hold
  4. Be ambitious, looking for at least moderate impact
  5. Be careful in executing this approach with your business partners
  6. Be ready to tell your "success story" with a pre-built deck before you need it

Resistance, in Andy's experience, is usually of three types - regulation ("our regulators won't understand this"), complication ("this is too hard for us to take on"), or organizational tradition ("we tried it before but it didn't work"). His advice was to be prepared with counterpoints relevant to your company, such as "testing is done extensively in pharma and financial services, both heavily regulated industries" or "did you know WD-40 got its name because the previous 39 versions didn't work?"

A firmwide engagement model to succeed at design of experiments will include a testing team comprising the business partner, statistical designers, experts in implementation/execution, and structural and creative designers. It's important to introduce testing into the strategic approval process as well as the structural approval of subject matter experts. And finally, be sure to build a test results repository, so you can revisit past designs even after the designers have moved on.

Ironic fact: Andy mentioned that he uses the OPTEX procedure in SAS/QC®, which was written by SAS R&D Director Randy Tobias. Not only is Randy a red-head, he is also an expert in the field, a co-author of a book on experimental designs and a Fellow of the American Statistical Association!


About Author

Polly Mitchell-Guthrie

R&D Project and Program Management

Polly Mitchell-Guthrie leads the Advanced Analytics Customer Liaison Group in R&D, connecting with customers to improve SAS products. At SAS for 14 years, Polly has held a variety of roles in finance and alliances, and the Global Academic Program. She has a BA and MBA from the University of North Carolina at Chapel Hill.
