Why You Shouldn’t Use Focus Groups to Evaluate Your Home Page! Pitfalls of A/B Testing (Part 2)

image of focus group viewing facility

I recently attended a conference on conversion optimisation. One of the speakers talked about how they had used focus groups to provide feedback on alternative designs for their new home page. One of the designs received much more favourable feedback than any of the other variants. This gave the client confidence that this design would win over their visitors. However, when they used A/B testing to measure the performance of each variant the preferred design completely bombed. They were relieved that they had decided to test the design before changing their home page.

The outcome of the test did not surprise me. How we articulate what we think about a user interface design is very different from how we are likely to behave in reality. When we browse the internet we are normally seeking to complete a task that takes us closer to achieving a personal goal. This is not a group experience.

Further, neuroscience indicates that most of our decisions are taken intuitively, often with little conscious awareness. We use behavioural shortcuts to minimise cognitive load, and our emotions can override our rational thought processes.

How often do you browse the internet with a group of people you have never met whilst being observed through a one-way mirror? Website navigation and task completion are more often than not undertaken by an individual without any group interaction. This may be changing to an extent with mobile devices, but even then we usually share the experience with people we know.

Context is critical in how we respond to any product or service. And yet focus groups are often conducted in a completely alien and artificial environment which does not match the reality of website browsing. Viewing facilities exacerbate this problem as they create a laboratory atmosphere.

Sometimes it is suggested that focus groups be used before designs are drawn up to understand users’ wants, needs and likes. I would also caution against this, as focus groups are problematic on a number of counts:

  • A single person can dominate the discussion and respondents will sometimes change their opinion to conform to the majority or because they mistake confidence for knowledge.
  • People over-analyse and rationalise the topic when we are looking for their emotional and intuitive response to achieving a goal. People will say one thing but behave in a totally different way in reality, because we are extremely poor at predicting our own future behaviour and prone to the influence of short-term emotions and cognitive biases.
  • Presentation is critical. Despite explaining to people that a design is a mock-up or not fully complete you can guarantee that the stage of development of a design will influence how people respond to it.
  • People dislike uncertainty, and when it exists they look to the actions of others for guidance. ‘Groupthink’ may also set in if a group is too homogeneous and seeks a consensus of opinion rather than holding a show of hands. This is particularly worrying as participants in focus groups are normally recruited on the basis of shared demographic characteristics.

As a result I would certainly avoid using focus groups for evaluating new website pages or journeys. All research methods have their limitations but focus groups are especially problematic and inappropriate in the context of website design.

Individual usability interviews can provide much more meaningful insights into how well a website or web page works. Learning to listen and observe rather than asking direct questions is likely to provide more useful feedback on how to improve your site. There are also other online research tools that can provide relevant sources of insight. These include:

  • Eye tracking: AttentionWizard.com uses algorithms to predict which page elements visitors are likely to look at before the page goes live.
  • Usability: UserTesting.com and Loop11.com allow you to submit tasks that participants should try to complete, and provide feedback through recordings of users.
  • Feedback from usability & conversion experts: ConceptFeedback.com.

Thank you for reading my post. If you found it useful, please share it using the social media icons on the page.

You can view my full Digital Marketing and Optimization Toolbox here.

To browse links to all my posts on one page please click here.


Further reading:



Consumerology: The Truth about Consumers and the Psychology of Shopping (new revised edition, including a new preface from the author)


Website Optimization: An Hour a Day


Don’t Make Me Think!: A Common Sense Approach to Web Usability


  • About the author: Neal provides digital optimisation consultancy services and has worked for brands such as Deezer.com, Foxybingo.com, Very.co.uk and partypoker.com. He identifies areas for improvement using a combination of approaches including web analytics, heuristic analysis, customer journey mapping, usability testing, and Voice of Customer feedback. By aligning each stage of the customer journey with the organisation’s business goals, he helps to improve conversion rates and revenues significantly, as almost all websites benefit from a review of customer touch points and user journeys.
  • Neal has had articles published on website optimisation on Usabilla.com and, as an ex-research and insight manager, on the GreenBook Blog research website. If you wish to contact Neal please send an email to neal.cole@outlook.com. You can follow Neal on Twitter @northresearch and view his LinkedIn profile.

Why Do People Prefer Gut Instinct to Research?


Source: Dilbert.com

Business people pride themselves on their decision making, and many businesses embed market and competitor research into this process. However, business people are prone to the same human frailties as the rest of us, and these can discourage the use of research and insight.

Behavioural science suggests that as people gain experience and knowledge in their area of expertise they have a tendency to become overconfident and complacent about their ability to understand the past and predict the future. Our brains assume that we are living in a simpler, more predictable world than is really the case.

Image of Dilbert cartoon where boss has gut instinct

Source: Dilbert.com

This is one of the most useful insights of behavioural economics, and yet professionally a difficult truth to acknowledge when we like to be seen as an expert in our field. Indeed, we are sometimes informed that decisions have been made on the advice of an ‘expert’ as if this guarantees the quality of the process.

As humans we are certainly prone to the illusion of understanding. Our minds create narrative fallacies from our continuous attempt to make sense of the world.

We notice the small number of unusual events that happen rather than the multitude of events that failed to occur.

Our memory is selective and biased by the workings of our mind. We construct vivid accounts of the past based on memories that change every time we recall them but believe they are a true reflection of past events.

We suffer from a tendency to like (or dislike) everything about a person. This  helps generate a simpler and more coherent representation of the world than is really the case. We fill gaps in our knowledge about a person using guesses that fit our emotional response.

Short-term emotions are probably the most powerful force in our decision making arsenal. Many of our judgements and decisions are directly influenced by feelings of liking and disliking rather than rational deliberation.

We hate uncertainty and suppress ambiguity because inconsistencies slow our thought processes and interfere with the clarity of our feelings. People are attracted towards confidence and we prefer decision makers that demonstrate such qualities above someone who may be equally competent but wants to think through a decision before giving an answer.

Image of Dilbert cartoon where boss wants to use his gut rather than read document

Source: Dilbert.com

People are heavily influenced by the What You See Is All There Is (WYSIATI) rule. We naturally work with the fragmentary information that we have access to as if it were all there is to know. The paradox is that it is easier to construct a consistent story when you have little knowledge. People make fallible guesses from incomplete information by making a leap of faith about how things should work.  Steven Pinker points out our only defence is that it worked sufficiently well in the world of our ancestors.

“Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.” Daniel Kahneman, Thinking, fast and slow.

This can lead us to define our choices too narrowly and consequentially reduces our options. Research and working collaboratively can help by widening our horizons and introducing new insights to challenge our perception of the topic.

We are also very good at changing our beliefs after an unpredicted event without being aware of it. We often unconsciously adjust our view of the world and find it difficult to recall what we believed before the event. This leads us to evaluate the quality of decisions by the nature of the outcome rather than the process by which the decision was made.

“Asked to reconstruct their former beliefs, people retrieve their current ones instead.” Daniel Kahneman, Thinking, fast and slow.

The danger here is that people get blamed for a decision that resulted in a negative outcome despite the unpredictability of the event. In corporate decision making this can result in people relying on bureaucratic solutions to avoid blame which leads to extreme risk aversion.

Image of Dilbert cartoon where boss explains decision making process

Source: Dilbert.com


It can also result in business people receiving unjustified rewards (e.g. bonuses) for irresponsible risk taking that happened to pay off. This could be seen prior to the 2008 financial crisis, when banks and other financial institutions were paying risk takers massive remuneration packages for activities that put the whole financial system at risk. Due to the complexity of some of the assets, their AAA ratings proved to be illusory.

Kahneman asserts that any comparison of how successful or not companies have been is to a large extent a comparison between how lucky or not they have been. In every story of a successful company there will have been moments when the destiny of a firm could easily have turned in an instant.

So, is the analysis of the situation more important, or is it the process that is the key? Research conducted by Dan Lovallo and Oliver Sibony studied 1,048 major business decisions over 5 years. They found that “process mattered more than analysis by a factor of 6”.

“I have no use whatsoever for projections or forecasts. They create an illusion of apparent precision. The more meticulous they are, the more concerned you should be. We never look at projections …” Warren Buffett

This does not mean that analysis is unimportant and should not be undertaken. Rather it should be treated as only part of the jigsaw. When making decisions it is essential that we explore uncertainties and encourage discussion of opinions that may contradict the views of senior stakeholders.


Image of Dilbert cartoon where data with human biases is given to boss

Source: Dilbert.com

We do not normally associate intelligence and a high IQ with stupidity. But research suggests that our propensity to make rash, foolish or irrational decisions is often not related to our IQ. No one is immune from making daft decisions, and our reliance on IQ and educational qualifications as an indicator of competence can be a recipe for disaster. When the business culture gives too much reverence to people with certain qualifications and skills, this can lead to rewarding decisions based mainly upon intuition rather than evidence.

CAVE Men! Colleagues Against Virtually Everything. People invest a lot of time and effort into their existing strategy or ideas. Dan Ariely calls this the not-invented-here bias. People have a tendency to value their own ideas significantly more than others’ ideas. This can result in an obsessive focus on poor ideas and probably explains some of the less successful decisions that we come across in business.

Confirmation bias also means that we tend to ignore information that does not align with our existing beliefs. We subconsciously seek and are drawn to evidence that confirms our view of the world. People are very good at overlooking facts that undermine their opinions and will follow the crowd that most closely supports those beliefs.

Image of Dilbert cartoon with confirmation bias

Source: Dilbert.com

So where does this leave the customer insight professional? It demonstrates the need for a comprehensive strategy for promoting the use of insight and collaboration to facilitate innovation and evidence based decision making.

  • Stakeholder management is essential not just to obtain the buy-in and support of senior management, but also to counteract many of the myths about how research and insight is undertaken.
  • Use storytelling to engage people at an emotional level. Our brains become more active when we are told a story, not only the language processing part of our brain, but also other areas we would normally use to experience the events of the story in real life. Some evidence suggests that our brains can synchronize with the brains of the person telling a story.
  • Spend time getting to understand your audience and their preconceptions. “What You See Is All There Is” tends to be strongly influenced by survey research when it comes to insight. Voice Of the Customer surveys are often given as an example of what research involves and yet these can be fundamentally flawed by relying on asking people direct questions.
  • Never underestimate the importance of how choices are presented and ensure you are fully prepared so that you avoid uncertainty about your recommended approach.
  •  Immerse yourself in the customer facing side of your business by meeting and observing how your organisation interacts with your customers. Don’t rely on third parties or management to identify the real challenges customer facing staff have to deal with.
  • Identify information gaps to highlight the need for research and insight. Our illusion of understanding sometimes needs reminding how little we really know about the world.
  • Challenge default methods of conducting research. Examine the potential for alternative approaches to insight, including experiments, observation and collaborative methods.
  • Encourage a culture of experimentation. For instance use A/B and multivariate testing on your website to understand what content most engages and motivates your existing and potential customers. See my post on Which A/B testing tool should you choose for more details.

Image of Dilbert cartoon about AB testing

Source: Dilbert.com

  • To counter hindsight bias, always ask key stakeholders before you commission a project what they expect the outcome/findings to be. You can then use these as hypotheses to prove or disprove their views.
  • Encourage all areas of the business to share insights and engage with the research process. It shouldn’t just be marketing and customer services that buy-in to customer insight. This helps avoid group think by bringing diversity into the decision making process.
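The A/B testing approach recommended above ultimately comes down to comparing conversion rates between a control and a variant and deciding whether the difference is real or just noise. A minimal sketch of that comparison, using a standard two-proportion z-test (the function name and the visitor/conversion numbers are hypothetical, for illustration only):

```python
import math

def ab_test_z(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test for an A/B test; returns z and two-sided p-value."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis of no difference
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via the error function)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical result: control converted 200/10,000 (2.0%), variant 250/10,000 (2.5%)
z, p = ab_test_z(200, 10_000, 250, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # z ≈ 2.38, p ≈ 0.017
```

With these made-up numbers the uplift would be statistically significant at the 5% level, which illustrates why stakeholders' pre-test hypotheses can be tested against data rather than debated.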


Related posts: 

To access links to all my posts go to my archive.

Why are Voice Of the Customer surveys fundamentally flawed? A behavioural economics view of VOC surveys.

5 ways to get more valuable insights from your Voice Of the Customer programme. Strategies to ensure your VOC programme has impact and delivers genuine insights.

What does mountaineering tell us about human motivations? A look at how what people say is not consistent with what they do and what really motivates us humans.

Are most of our decisions the result of others’ behaviour and opinions? How social learning and copying drives much of our behaviour.

What do business people really think about market research? The misconceptions about market research from my personal experience on the client-side.

Website optimisation toolbox. A/B testing tools and more to conduct experiments on your website to improve your conversion rate.

Are people more rational when buying financial services and big ticket items? A review of how rational people are when making purchasing decisions, and how we use rules-of-thumb to reduce cognitive load and assist decision making.

Thank you for reading my post. If you found it useful, please share it using the social media icons on the page.


Further reading:



Thinking, Fast and Slow


Herd: How to Change Mass Behaviour by Harnessing Our True Nature


How to Get People to Do Stuff: Master the Art and Science of Persuasion and Motivation


 Decoded: The Science Behind Why We Buy


  • About the author: Neal provides digital optimisation consultancy services and has worked for brands such as Deezer.com, Foxybingo.com, Very.co.uk and partypoker.com. He identifies areas for improvement using a combination of approaches including web analytics, heuristic analysis, customer journey mapping, usability testing, and Voice of Customer feedback.
  • Neal has had articles published on website optimisation on Usabilla.com and, as an ex-research and insight manager, on the GreenBook Blog research website. If you wish to contact Neal please send an email to neal.cole@conversion-uplift.co.uk. You can follow Neal on Twitter @northresearch, see his LinkedIn profile or connect on Facebook.


Should testing your calls to action be a default strategy? – Common pitfalls of A/B testing (Part 1).


Calls to action (CTAs) are an important element of website optimisation. However, sometimes CTA testing appears to be the default strategy for optimising web pages. This may be based on the assumption that a web page is already largely optimised, but whatever the reason, it is guaranteed to lead to a sub-optimal website.

Firstly, no web page is ever fully optimal, however much you test it. The target is continually moving as visitor behaviour changes over time and competitors launch new offers or improve their websites. Technological developments may disrupt the market, and new products or services result in a constantly changing landscape.

Image of Widerfunnel.com lift model
Source: Widerfunnel.com

But most importantly website optimisation needs to begin with your visitors and your value proposition. Before evaluating your CTAs you need to understand what your visitors are looking for and how your value proposition can help meet their explicit (category specific) and implicit (psychological) goals. Explicit goals, such as quality or style of merchandise, tend to be the more rational and conscious motivators that visitors articulate when evaluating which sites they will consider.

However, implicit or psychological goals, such as excitement or security, are important because these allow brands to differentiate themselves and generate an emotional response from customers. If a site wants to optimise relevancy, it has to connect with visitors at both of these levels.

Image of implicit goals
Source: Decode Marketing

Next you need to align your website goals with your business goals and prioritize your goals to determine your key conversion optimisation objective. Your conversion optimisation goals should be as closely related to revenue generators as possible. Examples include:

  • Purchase conversion rate
  • Average order value
  • Account opening conversion rate
  • Deposit conversion rate
  • Average deposit value
  • Return on advertising spend
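The first few goals above are simple ratios over your analytics data. As a minimal sketch (the function name and the visitor and order figures are hypothetical), purchase conversion rate and average order value can be computed like this:

```python
def conversion_metrics(visitors, order_values):
    """Purchase conversion rate, average order value (AOV) and revenue per visitor."""
    conversion_rate = len(order_values) / visitors          # orders per visitor
    aov = sum(order_values) / len(order_values) if order_values else 0.0
    revenue_per_visitor = sum(order_values) / visitors      # ties CR and AOV to revenue
    return conversion_rate, aov, revenue_per_visitor

# Hypothetical period: 2,000 visitors placed 4 orders
orders = [25.0, 40.0, 35.0, 60.0]
cr, aov, rpv = conversion_metrics(2000, orders)
print(f"conversion rate = {cr:.2%}, AOV = {aov:.2f}, revenue/visitor = {rpv:.3f}")
# → conversion rate = 0.20%, AOV = 40.00, revenue/visitor = 0.080
```

Revenue per visitor is simply conversion rate multiplied by AOV, which is why optimising either metric in isolation can mislead: a variant that lifts conversion rate while depressing order value may leave revenue unchanged.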

To implement a conversion strategy you should also ensure:

  • Your web analytics measure all primary (e.g. account opening) and secondary conversion goals (e.g. newsletter subscription) on pages and flows where you have a reasonable number of visitors. Don’t assume everything is automatically tracked.
  • You have appropriate A/B and multivariate testing tools to run experiments on your most visited pages. Do these allow you to segment and personalise experiences to benefit fully from online testing? Conversion optimisation is a journey that usually requires travelling through a number of distinct stages.


  • Establish a detailed testing roadmap that includes key landing pages, your home page, product pages, shopping cart, registration pages, and other important pages and flows. It is essential to consider both new and returning visitors. Review expert literature (see reading list below) on how to build and enhance your optimisation strategy.

6 types of tests to optimise a website page


  • Engage areas in your organisation that can assist in the process of improving the customer experience, including IT, UX, usability, marketing, and copywriters.
  • Build a technology roadmap to evaluate and implement new tools (e.g. live chat or App testing) and functionality to further enhance your optimisation capabilities.

When you have established a truly comprehensive conversion optimisation strategy, your CTA tests will become a small but important element of your testing program. Finally, be bold and have fun with your experiments.

Thank you for reading my post. If you found it useful, please share it using the social media icons on the page.

You can view my full Digital Marketing and Optimization Toolbox here.

To browse links to all my posts on one page please click here.


Further reading:



You Should Test That: Conversion Optimization for More Leads, Sales and Profit or The Art and Science of Optimized Marketing


Landing Page Optimization: The Definitive Guide to Testing and Tuning for Conversions


Website Optimization: An Hour a Day


Your Customer Creation Equation: Unexpected Website Formulas of The Conversion Scientist TM

  • Neal has had articles published on website optimisation on Usabilla.com and, as an ex-research and insight manager, on the GreenBook Blog research website. If you wish to contact Neal please send an email to neal.cole@outlook.com. You can follow Neal on Twitter @northresearch, check out the Conversion Uplift Facebook page or connect on LinkedIn.