If you currently implement or have thought about implementing nudges in your communication plans, how will you know if these are effective? The answer lies in rigorous and specific testing. Apart from a few cases, we simply don’t know whether nudges actually work as promised. We can give them the benefit of the doubt when they use experimentally tested techniques, but doubt should remain until actual evidence is gathered.
The bright spot in impact testing, if it is to be called that, lies in the marketing realm. Marketing optimization specialists spend their careers fine-tuning communications and materials for their impact on sales. The best of them rigorously use randomized controlled trials (also known in that world as A/B or split tests) to measure the causal impact of a particular change in wording or design, aided by tools such as Adobe Target, Optimizely, Eloqua, or Apptimize.
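Once an A/B test has run, the core analysis is simply asking whether the difference in conversion rates between the two versions is larger than chance would explain. As a minimal sketch (the function name and the conversion counts below are hypothetical, invented for illustration), a two-proportion z-test can be computed with nothing but the standard library:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing conversion rates of variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that A and B convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical example: control wording (A) vs. new wording (B)
z, p = two_proportion_z_test(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Note that a result like p ≈ 0.06 here would be suggestive but not conclusive; tools such as those named above automate this arithmetic, but the logic is the same.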
While many of those results are kept private, you can find a remarkable, and publicly accessible, collection of hundreds of experimental tests at WhichTestWon.com. Even there, however, the results should be taken with a grain of salt.
The above resources are a great place to learn more about A/B testing and about prior tests that have been conducted, so you don't have to reinvent the wheel. If you do run A/B tests of your own, we would love to hear about them: please share your results!