
How we improved a client’s conversion rate

In my last post, I talked about the importance of research – both quantitative and qualitative – in directing your website redesign process, whether that's a completely new site design or a more progressive, iterative approach.

I touched on the importance of data analysis from services such as Google Analytics and mentioned doing some reports for clients on how their website is performing. Since there's been interest in this, I'll explain a bit more about what we're doing. Identifying details about the client have been left out.

The problem

A client had had their long-running website redesigned multiple times over the years. A few months after the latest redesign, they contacted me with concerns: while they had observed an increase in traffic, there had been a drop-off in the number of people signing up to the mailing list – a list from which the client received a significant proportion of their business leads.

The suggested solution

The client suggested that I increase the size and visual boldness of the mailing list opt-in button on the homepage, or add more text to make it more compelling, as a way to try to solve the problem.

Researching the data

Normally, clients come with a suggested solution to their problem. We all think we know the right way to solve a problem – and sometimes we’re right.

Instead, where possible, we should use research to inform our decisions – whether it’s based on data from our own site or from applicable well-run studies by others. Where it’s not possible (and it’s an important problem we’re trying to solve), we should use data to test the effectiveness of our changes (such as using A/B testing).

Instead of making a quick design change for the client, I wanted to do some research first. Delving into the Google Analytics figures, I brainstormed a bunch of questions I was curious about, such as:

  • How do visitor numbers compare to the previous year's data? That is, could we simply be seeing seasonal variability? Most businesses naturally fluctuate in a yearly pattern.
  • Where are the visitors coming from, and what pages are they landing on? For example, if a large number of visitors are searching for something that isn't the core business, the design may not be at fault at all – it could just be the wrong type of traffic, which won't convert into business leads no matter how the website is designed.
  • What are people clicking on when they arrive on the homepage? Could it be that the opt-in button is being clicked, but when visitors reach the page with the subscription form, they're not filling it in or completing it?
  • What proportion of people are visiting the homepage? Is the opt-in button clearly visible to non-homepage visitors? How does it perform on those pages?
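
To give a feel for what exploring these questions can look like in practice, here's a minimal sketch that tackles the first two from a daily export of the Analytics data. The file name and column names (analytics_export.csv, sessions, landing_page, source_medium) are assumptions for the example – the real export depends on how the report is configured.

```python
import pandas as pd

# Hypothetical daily export from Google Analytics with columns:
# date, sessions, landing_page, source_medium
df = pd.read_csv("analytics_export.csv", parse_dates=["date"])

# Is it seasonal variability? Compare the same months across years.
df["year"] = df["date"].dt.year
df["month"] = df["date"].dt.month
yoy = df.groupby(["year", "month"])["sessions"].sum().unstack("year")
print(yoy)  # same months side by side, one column per year

# Where are visitors coming from, and what are they landing on?
top_landing = (
    df.groupby(["source_medium", "landing_page"])["sessions"]
      .sum()
      .sort_values(ascending=False)
      .head(10)
)
print(top_landing)
```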

Asking the right questions for your needs and taking the time to explore the data is so important.

I spent some time in Google Analytics looking at the data, identifying key dates (e.g. when redesigns went live) and defining measures of interest, then prepared an initial benchmarking report that included a detailed comparison of visitors (and the types of visitors) before and after the last redesign.

The findings were really interesting to the client, because they didn't completely line up with what the client thought they were seeing and weren't necessarily intuitive:

Site traffic was slightly down overall since the redesign – comparing like-for-like periods – primarily due to fewer direct visitors. This could be down to seasonal interest in this sort of business, or to other marketing efforts being less effective, and it can be addressed separately from the opt-in issue.

Interestingly, the redesigned site had a greater number and a greater proportion of visitors reaching the subscribe page, but fewer people and a lower proportion converting (i.e. completing the subscription form).

In short: the key issue was the subscription page itself, not the button to get to the subscription page.
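
To make that finding concrete, here's the shape of the before/after funnel comparison behind it, as a minimal sketch. The figures below are placeholders for illustration, not the client's actual numbers; in practice they come from segmenting the Analytics data by date range.

```python
# Placeholder figures for illustration only - not the client's real data.
funnel = {
    "before_redesign": {"site_sessions": 10_000, "subscribe_page_sessions": 400, "signups": 120},
    "after_redesign":  {"site_sessions": 9_500,  "subscribe_page_sessions": 450, "signups": 90},
}

for period, f in funnel.items():
    reach_rate = f["subscribe_page_sessions"] / f["site_sessions"]
    completion_rate = f["signups"] / f["subscribe_page_sessions"]
    print(f"{period}: {reach_rate:.1%} of sessions reach the subscribe page, "
          f"{completion_rate:.1%} of those complete the form")

# A pattern like this - more people reaching the page, fewer completing it -
# points at the subscription page itself rather than the button leading to it.
```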

Improving the conversion rate

From this, we were able to focus right in on improving the subscription page's content and design, monitoring conversion rates to see what worked and what didn't.

One approach is to use A/B testing – i.e. test multiple subscribe page options at the same time by randomizing which subscription page new visitors to the site see. There are tools to help do this for you, but they're not required. There are pitfalls with A/B testing too – such as ensuring you're testing for statistical significance and testing for long enough (to remove effects relating to time of day or week).
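
If you run the test yourself, the significance check can be as simple as a two-proportion z-test on the number of conversions in each variant. Here's a minimal sketch using only the Python standard library, with made-up counts:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_a, p_b, z, p_value

# Made-up counts: variant A (current page) vs variant B (revised page)
p_a, p_b, z, p = two_proportion_z_test(conv_a=120, n_a=2400, conv_b=165, n_b=2450)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z={z:.2f}  p={p:.3f}")
```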

With a series of adjustments to the subscription page, the conversion rate is back up and staying up. We also have regular automated monitoring of the data to analyze and report back to the client on.
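
As a rough sketch of what that monitoring might look like (the file name, column names, and threshold here are assumptions for the example), a scheduled job can recompute the weekly conversion rate and flag any sharp drop for follow-up:

```python
import pandas as pd

ALERT_THRESHOLD = 0.8  # flag weeks that fall below 80% of the trailing average

# Hypothetical weekly export with columns: week_start, subscribe_page_sessions, signups
weekly = pd.read_csv("weekly_subscriptions.csv", parse_dates=["week_start"])
weekly["conversion_rate"] = weekly["signups"] / weekly["subscribe_page_sessions"]

# Trailing average of previous weeks (excluding the current week)
trailing = weekly["conversion_rate"].rolling(8, min_periods=4).mean().shift(1)
latest = weekly.iloc[-1]

if latest["conversion_rate"] < ALERT_THRESHOLD * trailing.iloc[-1]:
    print(f"Conversion rate dipped to {latest['conversion_rate']:.1%} "
          f"in the week of {latest['week_start']:%Y-%m-%d} - worth a closer look.")
else:
    print("Conversion rate is within the normal range.")
```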