Why Best Practices in Conversion Rate Optimization Aren’t Always Best

Intuition: it’s really a wonderful thing. Sometimes it ‘shows us the way’ and leads us to make the right decisions. But sometimes it’s so flawed that we wonder how we could ever have made that choice. The key to making the right decisions is not to ask whether or not to listen to your intuition, but to know when you should (and shouldn’t) trust it.

Conversion rate optimization is one of those cases where your gut can lead you astray. Even best practices, as described here, occasionally fail to produce results. Today we’ll explore some counter-intuitive findings in CRO where best practices weren’t always ‘best’.

Can Encouraging Your Visitors NOT to Try Your Offer Increase Your Profits?

One company tried this approach (from the WhichTestWon archives). GetItFree added a “No thanks, I’m not interested” link on their landing page offer which, as expected, caused conversions on that offer to go down. But something quite unexpected happened: people didn’t actually abandon the page, but continued to browse other offers. This resulted in an overall increase in revenue, since the secondary offers generated more than the primary one.

Some really cool conclusions can be drawn from this:

  • If people are not interested in what you have to offer on your landing page, maybe you should provide a “substitution”, as GetItFree did, which redirects visitors to your other offers.
  • You should measure and track your visitors every step of the way. If this company hadn’t done that, they wouldn’t have realized that their visitors were purchasing something else. If they had measured only the conversion rate on that specific page, they would have concluded “wow, people aren’t buying” and quit testing.

How Changing a Word from Plural to Singular Produced a Far Better Conversion Rate

One of conversion rate optimization’s best practices is that if you want radically different results, you have to try radically different tests.

This wasn’t the case with Brad Geddes. He has graciously shared some surprising improvements from some simple changes. One change was really surprising: Changing a word from plural to singular in an ad produced a far, far better conversion rate.

Now, this is one of these things that doesn’t make any sense at all. Why would changing a simple word have such a profound impact?

The interesting thing about our minds is that they can come up with rationalizations once we present them with a conclusion. Maybe that word was the “word” people were looking for when searching for that result. Maybe they perceived the “singular” version to be less expensive. But be honest: Would you ever think of testing such a small change? That’s what these case studies are for, giving you unexpected ideas.

Can a Mobile Version of a Website Be Worse for Mobile than a Desktop Version?

It doesn’t make sense. A desktop version should provide a worse user experience on mobile; users would probably have to scroll horizontally, pinch to zoom, and so on. Yet an experiment by Jeff Allen demonstrated that the desktop version actually outperformed the mobile-specific landing page by 4 percentage points (15% vs. 11% conversion rate).

Remember, best practices mean, at best, “good ideas”. They are, after all, practices, not proofs or the Holy Grail of conversion rate optimization. I can’t think of a better best practice than optimizing sites for a specific screen, yet, as you can see, this recommendation fails in some cases.

I think what often confuses people about best practices is the wording itself. “Best” means there isn’t anything higher, right? Here’s another experiment to demonstrate that isn’t the case:

Privacy Policy Wording Can Have a Big Effect on Conversion

In this wonderful experiment, Michel Lykke tested the effect of privacy policy wording on conversion. Truth be told, there isn’t a single best practice on whether, or how, to place privacy policy notifications near your sign-up form.

Now, pause for a moment. What type of wording do you think would produce better results?

  1. Wording focused on the prevention side, saying you will never send bad emails to your users?
  2. Wording focused on the promotion side, saying that their information is safe with you?

It turned out that the first, prevention-focused message lowered the conversion rate, while the second one increased it.

As more and more experiments come out on this topic, I’m pretty sure there will be a “best practice” on how to do this. We’re smart, however, and realize that “best” practices don’t necessarily mean “best” when it comes to conversion rate optimization.

When Going Off the Funnel Has a Positive Impact

John Doherty decided to follow one “best practice” which recommends eliminating any links that let people go “off the funnel”. For example, if you offer a web design service on your landing page, you don’t want links where visitors can wander off to your personal blog, and so on.

What he discovered, however, is that the bounce rate increased and the conversion rate went down when they removed the navigation that led to other areas of the site.

The most likely reason was that people felt ‘trapped’. I personally know that feeling: you’re on a site and can’t click on anything except that damn “Sign up” button. What if I wanted to find out more about the company? What if I wanted to see more of their products, who they are, what third-party sources have to say about them?

Some Quick Tips for Testing

All of these examples have shown that if there’s one thing that can be considered ‘best’, it’s testing. Everyone’s customers are different and each and every situation is different, so it’s essential that we test. Many people get testing wrong, however, and I wanted to focus on one particular thing to be aware of: interpreting your results.

When you start testing with a tool, it’s good to run an A/A test first. What’s that? It’s when you test your existing design against an identical copy of your existing design. Since both “variants” are the same, any measured difference is pure noise; if the tool reports a statistically significant winner, something is skewing your results.
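The logic of an A/A check can be sketched with a standard two-proportion z-test. This is a minimal illustration, not any particular tool’s implementation; the visitor and conversion counts below are made up for the example.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# A/A test: both "variants" are the same design, so any difference is noise.
# The counts here are hypothetical.
z = two_proportion_z(conv_a=120, n_a=2400, conv_b=131, n_b=2450)
significant = abs(z) > 1.96  # ~95% confidence threshold
print(f"z = {z:.2f}, significant: {significant}")
```

In a healthy setup the A/A comparison should come out non-significant the vast majority of the time (about 95% of runs at this threshold); a tool that repeatedly declares an A/A winner can’t be trusted with your A/B results either.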

Another thing is the interpretation of the results themselves. The actual dates when you run the test matter: avoid major holidays or an industry-specific event. Or say some major news outlet mentions your market and all of a sudden you have thousands of people wanting to find out more. They go to Google or some other search engine, click on your site and buy out of curiosity. You see an increased conversion rate and determine that it’s due to the new landing page, but you would, of course, be wrong.
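One practical way to catch that kind of distortion is to break conversion rate down by traffic source before drawing conclusions. The sketch below uses made-up visit data and hypothetical source labels purely to show the idea:

```python
from collections import defaultdict

# Hypothetical visit log: (traffic_source, converted) pairs.
visits = [
    ("search", True), ("search", False), ("search", False), ("search", False),
    ("news_spike", True), ("news_spike", True), ("news_spike", False),
    ("direct", False), ("direct", True),
]

totals = defaultdict(lambda: [0, 0])  # source -> [conversions, visits]
for source, converted in visits:
    totals[source][0] += int(converted)
    totals[source][1] += 1

for source, (conv, n) in totals.items():
    print(f"{source}: {conv}/{n} = {conv / n:.0%}")
```

If one source converts wildly better than your usual traffic during the test window, the sensible move is to segment it out (or rerun the test later) rather than credit the new landing page.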

Conclusion

Pay careful attention to these points if you want your tests to be the best you’ll do.

Conversion rate principles are just good ideas to start from, but that’s all. If you’re hoping for results, you cannot, under any circumstances, take them for granted. With testing tools so abundant (and affordable) these days, not testing would simply be a waste of money.

Hope you enjoyed this article!

Author:

Darren is passionate about psychology and how it applies to web design & development. His current project is FinderMind.com, concentrating on providing the most useful advice for finding anyone.