A few years ago, when Facebook engagement ads were just taking off, Kevin Colleran, then still working for the social media behemoth (he was employee number 10 and its first sales executive; now, what else, a venture capitalist), told me that the way to make your Facebook ads really effective was to give the network three or four versions and let Facebook test them in a real environment. That way Facebook could virtually guarantee the efficacy of your brand message.
He mentioned that you’d be surprised what performed best. For example, you may have thought that videos would drive deeper engagement, and you’d be wrong. You could hypothesize anything, in fact, but why bother when it was so easy to get proof of what worked simply by trying a few options?
I asked Kevin how many agencies took Facebook up on the offer, and he answered: hardly any. Unless they were a direct response firm, it just wasn’t in their DNA. So Facebook would itself initiate comparisons for brands in order to prove the value.
This week Wired had a great piece on A/B testing and how it has become the “open secret of high-stakes web development.” It’s the formula by which almost all of Silicon Valley (maybe not Apple) improves its online products. Real-time focus-group testing in real-life environments.
It’s really just a technique that derives from classic direct marketing: beat the control. In the days of envelopes and stamps, however, it took multiple tries, and that could mean many months, as you had to conceive, write, print, mail, analyze the data and try again.
On the web, of course, the process takes but a few hours. Change a color, an image or a headline and the impact on action taken could be significant. You may never know why, but that’s not the point.
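For readers who want to see the mechanics behind “trying a few options,” here’s a minimal sketch of how a result from such a test is typically judged: a two-proportion z-test comparing conversion rates for two variants. The numbers and the function name are hypothetical, purely for illustration; real testing platforms handle this (and much more) for you.

```python
import math

def ab_test_z(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    # Standard error of the difference between the two proportions
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z

# Hypothetical numbers: headline A vs. headline B, 10,000 visitors each
p_a, p_b, z = ab_test_z(conversions_a=200, visitors_a=10_000,
                        conversions_b=260, visitors_b=10_000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}")
# |z| > 1.96 means the difference is significant at the 95% confidence level
```

With these made-up numbers, B’s 2.6% beats A’s 2.0% with a z-score of roughly 2.8, comfortably past the 1.96 threshold. The point of the post stands either way: you don’t need to know why B won, only that it did.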
Yet many ad agencies now getting into the digital business – creating websites, apps, online experiences and more – remain averse to A/B testing, or at least unaware of its potential. Why? For no other reason than the old linear process by which we made advertising – strategy, concept, approval, production, distribution – remains embedded in muscle memory. Or, even more likely, because most ad agencies, along with plenty of companies in other industries, still practice HiPPO decision making: the highest-paid person’s opinion determines what to do.
But read the Wired piece. Consider the dramatic improvements that A/B testing can yield – as well as the frequency with which the HiPPOs are wrong – and you’ll surely conclude the strategy has a place in anything we ever do online: ads, websites, apps, social engagement.
Maybe I should have done two versions of this post to see which one gets the most traffic.
Your thoughts? Are you using A/B testing for any of your online initiatives? Why not give it a try?