A/B tests on user group signup adverts

To recruit people to our user group, we ran a display advert on the website. I created an A/B test using four different adverts, and one of them was a runaway winner.

The aim of the advert was to encourage people who use the website to sign up and become part of the user group. I tested the baseline against three variants on other popular pages throughout the website. The baseline advert ran all the time on the homepage, but the homepage is not our most visited page, with just 3% of total site visits compared to 11% for our jobs page. So the A/B testing ran on the A-Z, contacts, forms and jobs pages, as well as several other popular content areas of the site.

Here are the contenders:

Variety 1
Variety 2
Variety 3

Baseline [blue] The baseline advert was a text box with simple content and a clear call to action.

Variety 1 [yellow] The first variety of the advert was a graphic, showing the website homepage and touting the user group. Wording in the advert hit on the senses (take part, view, hear) and the text was centralised.

Variety 2 [orange] The second variety was similar to the first with the same wording but used a photo of a test tube. The wording was left-justified.

Variety 3 [green] The third variety was text-based, carefully designed to appear like a badly-written personals ad clipped from a newspaper.

After running the test for a few days, I removed one of the poor-performing varieties. After nine days, I removed the remaining poor performer and ran a follow-up test with the baseline advert and the likely winner. The total test took 12 days.

A/B testing - round 1

The results were conclusive, almost from the start of the test. The personals ad outperformed the other adverts with a conversion rate of 1.33%. According to Google Website Optimizer, this winning advert had a 99.9% chance of outperforming all the others. The baseline was the closest contender with a 0.48% conversion rate.
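The figures above can be sanity-checked with a short sketch. The impression counts below are hypothetical (the article only gives the conversion rates), and the "chance to beat" calculation is a standard normal approximation to two binomial proportions, not Website Optimizer's own (unpublished) method:

```python
from math import erf, sqrt

def conversion_rate(conversions, impressions):
    """Conversion rate = successful actions / adverts served."""
    return conversions / impressions

def chance_to_beat(conv_a, n_a, conv_b, n_b):
    """Approximate probability that variant A's true rate exceeds B's,
    using a normal approximation to the two binomial proportions."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    var_a = p_a * (1 - p_a) / n_a
    var_b = p_b * (1 - p_b) / n_b
    z = (p_a - p_b) / sqrt(var_a + var_b)
    # Standard normal CDF expressed via the error function
    return 0.5 * (1 + erf(z / sqrt(2)))

# Hypothetical traffic numbers chosen to match the article's rates
rate_winner = conversion_rate(133, 10000)   # 1.33%
rate_base   = conversion_rate(48, 10000)    # 0.48%
print(round(rate_winner * 100, 2), round(rate_base * 100, 2))
print(chance_to_beat(133, 10000, 48, 10000))
```

With a sample of this size, the gap between 1.33% and 0.48% is large enough that the approximation also puts the winner's chance of beating the baseline above 99.9%.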

This is another test that shows how plain text beats graphic content and how users are often blind to snazzy, photographic advertising. On this occasion, both text-based adverts outperformed the photographic adverts. One could still argue that the winning advert is a graphic. But it only depicts words and characters, with a dirty grey background.

Plain text beats graphic content: A/B testing

A/B testing can help to pinpoint the type of content that staff are most attracted to. I ran an experiment to find out which advert was most effective in an internal campaign.

The problem

The intranet staff directory gets a lot of complaints, "It's never up to date" being the main one. Last year, we ran a campaign to encourage staff to update their records. Incidentally, our intranet staff directory is not linked to any central human resources database or to Outlook contacts. We are stuck with a system designed outside of the central intranet team, and the best we can do is try to keep the information current.

On occasion, we have previously used a standard graphic advert (the yellow original) to encourage staff to check their directory details. It wasn't a great advert and it wasn't very effective. For this latest campaign, I decided to test the original advert against other variations to see if they sparked any interest from staff. I designed a series of adverts with different themes, some static, some animated and one plain text advert. I ran the test over a period of weeks to find out which was the most effective at encouraging staff to take action and update their details.

Setting up the test

I used Google Website Optimizer to handle the A/B testing. I ran the adverts on the page where staff would normally go to search for phone numbers, and configured Website Optimizer to serve the various adverts, using the original yellow graphic advert as the control. I defined a successful conversion as a member of staff going to the "How to update your record" page and clicking on the option to update their details.
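The setup above boils down to naming a control, a set of variants, and a conversion goal. A minimal sketch of that definition, with hypothetical page paths and variant names (Website Optimizer configured this through its web interface, not code):

```python
# Hypothetical experiment definition mirroring the setup described above.
EXPERIMENT = {
    "name": "peoplefinder-advert-test",
    "control": "original-yellow-graphic",
    "variants": ["animated-theme-1", "animated-theme-2", "plain-text"],
    "goal_page": "/how-to-update-your-record",
    "goal_action": "click-update-details",
}

def is_conversion(visited_page: str, clicked_action: str) -> bool:
    """A conversion = reaching the goal page AND clicking the update option."""
    return (visited_page == EXPERIMENT["goal_page"]
            and clicked_action == EXPERIMENT["goal_action"])
```

Defining the conversion as a click on the goal page, rather than a mere page view, filters out staff who land on the instructions but never act on them.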

Testing process and results

Website Optimizer uses cookies to serve one of the predefined adverts to a visitor, and it will always display the same advert to the same visitor. It then measures how successful each advert is, depending on which actions the visitor takes.
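The key property is "sticky" assignment: the same visitor always sees the same advert, so each person counts toward exactly one variant. A minimal sketch of that idea, hashing a (hypothetical) cookie value to pick a variant deterministically:

```python
import hashlib

VARIANTS = ["original", "theme-a", "theme-b", "plain-text"]

def assign_variant(visitor_id: str) -> str:
    """Deterministically map a visitor's cookie id to one advert.

    Hashing the id (rather than picking at random on each visit)
    guarantees the same visitor always gets the same variant,
    which is the role the cookie plays in the real tool.
    """
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

# The same visitor id always resolves to the same advert
print(assign_variant("visitor-42") == assign_variant("visitor-42"))
```

This is only an illustration of the principle; Website Optimizer itself stored the chosen variation in the cookie rather than recomputing it.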

After a few weeks, the numbers in Website Optimizer start to solidify and patterns emerge in the results. The coloured lines show the conversion rates for each advert. This is a great visual tool for getting buy-in from stakeholders; the charts clearly show which variations are outperforming the rest.

In this test, the plain text advert was getting the best conversions and showing up as the likely winner. The last stage in the testing process is a follow-up experiment using just the original control advert and the proposed likely winner.

Peoplefinder A/B test follow-up

It's very satisfying to see the green light that signifies the end of the test and a definite winner. Up against all the fancy graphics and animations, the plain text advert had a 98.9% chance of beating the original.

The experiment showed that the text advert was the best approach to take for this problem, for our staff on our intranet. With user testing, the results are often unexpected, reinforcing why we must always design for the user.
