My notes from the 2015 London Conversion Conference:
So what did I take away from attending the London Conversion
Conference on 29th October? Well, to keep it simple, here are my top
insights and key take-outs from the conference.
Psychology is our guide:
Behaviour drives preferences. The consistency bias means that if we begin engaging with a website we soon convince ourselves that we must like it.
Although we might not like pop-up windows, they work. Use
them to collect email addresses, with the offer of a special deal, to engage visitors who are not ready to buy or sign up to your product.
Use emotional hooks to motivate visitors at the beginning of
the funnel, as behaviour is largely driven by our unconscious brain when we begin our search on a website. When it comes to making a payment, however, we definitely use our conscious brain. Make people feel competent by asking them an easy question first, as this helps to encourage the consistency effect.
It’s not just about A/B testing:
Conversion rate optimisation is much more than just A/B testing. We should be trying to better understand visitors through Voice of
Customer research, using customer journey analysis to identify potential problems with the customer experience, fixing bugs, conducting usability testing and undertaking many other activities that reduce friction and improve the customer journey.
Design with care:
Visual cues are important. Flat designs may be trendy, but strong visual cues help guide users and reduce cognitive load by making the
desired journey easier to follow.
Responsive design is a characteristic and not a solution:
Responsive design can create many problems that remain hidden unless you analyse visitor behaviour and conversion rates by device. Mobile and tablet users often have different goals from those using a desktop computer. Further, many responsive sites have not been designed with the mobile user in mind and so prove frustrating and poorly designed for the non-desktop user.
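The by-device analysis above can be sketched in a few lines. This is a minimal illustration using made-up session records, not real data from any particular analytics tool:

```python
from collections import defaultdict

# Hypothetical session records: (device, converted) pairs.
# In practice these would come from your analytics export.
sessions = [
    ("desktop", True), ("desktop", False), ("desktop", True),
    ("mobile", False), ("mobile", False), ("mobile", True),
    ("tablet", False), ("tablet", True),
]

totals = defaultdict(int)
conversions = defaultdict(int)
for device, converted in sessions:
    totals[device] += 1
    conversions[device] += converted  # True counts as 1

for device in sorted(totals):
    rate = conversions[device] / totals[device]
    print(f"{device}: {rate:.0%} ({conversions[device]}/{totals[device]})")
```

A blended, all-device conversion rate would hide exactly the kind of mobile shortfall this breakdown exposes.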
Voice of Customer:
Use online Voice of Customer tools on your website to find out what visitors are looking for and what they couldn’t find. This can be
invaluable in providing insights for developing hypotheses and for simply fixing things.
Ensure you analyse VoC responses by device on responsive sites as the experience and problems will vary according to the size of the
screen and browser. Responsive design is a characteristic, not a solution.
Copy when you find a good idea:
Some experts claim you shouldn’t copy your competitors or best-in-class websites: it may not work on your website, and your site will end up looking just like your competitors’. However, if you are selective and A/B test before you implement, this can be a very fruitful approach. Sometimes it won’t work, but there will also be occasions
when it does. Our ability to copy good ideas is one of the reasons for the amazing success of the human race, so don’t knock it.
Scale is important in A/B testing:
Estimates of the proportion of A/B tests that fail range from 60% to 80%. This means that you need to build your testing roadmap to get a sufficiently high number of tests in the pipeline to give you a reasonable return on investment.
Run tests in parallel:
Have multiple testing streams to optimise the number of tests you can run in parallel. Provided tests are not on the same section of a page and have different hypotheses, you may be able to run them simultaneously. You can also create cohorts if you have concerns, but most of the evidence suggests that interaction between tests is
often no higher than 5% or 10%.
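One reason interaction effects stay small is that parallel tests usually bucket visitors independently, so each test's variants see a balanced mix of the other test's traffic. A minimal sketch of that kind of assignment (the test names and visitor IDs here are invented for illustration):

```python
import random

def assign(visitor_id: int, test_name: str,
           variants=("control", "variant")) -> str:
    # Deterministic seeded bucketing: the same visitor always sees the
    # same variant on repeat visits, and the seed mixes in the test
    # name so assignments across tests are independent of each other.
    rng = random.Random(f"{test_name}:{visitor_id}")
    return rng.choice(variants)

visitor = 42
print(assign(visitor, "headline_test"), assign(visitor, "cta_test"))
```

Because the two assignments are independent, analysing either test on its own remains valid; cohorting (excluding one test's traffic from another) is only needed when you suspect the hypotheses genuinely overlap.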
Create a testing culture:
Sharing test results can be difficult. To make this more interesting, share surprising results to engage your audience. Try to develop a
‘prove yourself wrong’ culture to make failure more acceptable, as this is how we learn. Innovation often relies on failure to generate the new ways of thinking that break the mould. Involve people from all areas of your organisation and hold regular workshops or presentations on how testing is being used to improve the performance of your business.
Speak with your fellow delegates and speakers:
I always find that engaging with other delegates and speakers is very valuable. Last year I found out about Hotjar through talking
to one of the speakers, and that saved my organisation over £1,000,000 a year. This time I spoke with ex-colleague and author Colin McFarland about why you should run tests in parallel and his innovative approach to testing at Skyscanner.
- About the author: Neal provides digital optimisation consultancy services and has worked for brands such as Deezer.com, Foxybingo.com, Very.co.uk and partypoker.com. He identifies areas for improvement using a combination of approaches including web analytics, heuristic analysis, customer journey mapping, usability testing and Voice of Customer feedback. Aligning each stage of the customer journey with the organisation’s business goals helps to improve conversion rates and revenues significantly, as almost all websites benefit from a review of customer touch points and user journeys.
- Neal has had articles published on website optimisation on Usabilla.com and as an ex-research and insight manager on the GreenBook Blog research website. If you wish to contact Neal please send an email to email@example.com. You can follow Neal on Twitter @northresearch and view his LinkedIn profile.