In business, interaction with your audience is key. Clear and consistent communication is the first thing people notice about a company, and a company's ability to engage a customer right off the bat can make or break the relationship. For those with an online presence, a clear marker of effective engagement is the conversion rate: whether the people who visit your site go on to become customers or clients. This is where A/B testing comes in.
A/B testing divides traffic to a particular page, sending half to one instance and half to another. By measuring the conversion rate of each instance you get an idea of its efficacy. The difference between the two can be as small and subliminal as a change in font or as large and overt as completely rewritten copy. With enough time you can measure what success your proposed changes have had or, conversely, what damage they have done.
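To make the mechanics concrete, here is a minimal sketch of how a split like this is typically implemented. The function names (`assign_variant`, `conversion_rate`) and the experiment label are illustrative assumptions, not any real testing platform's API; the key idea is that hashing a visitor ID gives each visitor a stable bucket while splitting overall traffic roughly 50/50.

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "homepage-test") -> str:
    """Deterministically assign a visitor to variant 'A' or 'B'.

    Hashing the visitor ID together with the experiment name means the
    same visitor always sees the same version on repeat visits, while
    traffic as a whole still splits roughly 50/50.
    """
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

def conversion_rate(conversions: int, visitors: int) -> float:
    """Conversions divided by visitors, expressed as a percentage."""
    return 100.0 * conversions / visitors if visitors else 0.0
```

In practice the assignment would be stored in a cookie or session so the visitor's experience stays consistent, and each variant's visits and enquiries would be logged for the comparison described above.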
These tests should be carried out only after rigorous internal development; they aren't designed for floating new ideas, but for establishing whether changes you are ready to make will improve your conversion rate. If you plan on updating the look of a website, the initial phase of the design process should happen in-house, and only once you have decided on the new direction should you implement A/B testing. You should then allow enough time for a reasonable sample size to pass through the test before making a decision. Whilst it may seem that a change in the look of a website initially prompts more interest, with proper time given to the test any anomalous results will be teased out.
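"A reasonable sample size" can be estimated up front rather than guessed. The sketch below uses the standard normal-approximation formula for comparing two proportions; the function name and the example rates are illustrative assumptions, and the defaults correspond to the conventional 95% confidence and 80% power.

```python
import math

def sample_size_per_variant(p1: float, p2: float,
                            z_alpha: float = 1.96,   # 95% confidence
                            z_beta: float = 0.84) -> int:  # 80% power
    """Visitors needed in EACH variant to reliably detect a change
    from a baseline conversion rate p1 to a hoped-for rate p2,
    using the normal approximation for two proportions."""
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# e.g. detecting a move from a 4% to a 5% enquiry rate:
needed = sample_size_per_variant(0.04, 0.05)
```

The takeaway is that small differences in conversion rate demand thousands of visitors per variant, which is why cutting a test short on an early impression is so risky.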
Let’s look at a Dots+Circles client, Birtsmorton Court, as an example of how A/B testing works and what it shows us. Birtsmorton Court operates as a wedding venue and wanted a more 'feminine' feel to its website. Dots+Circles made the changes and then set them up for A/B testing, so we can look at the implementation, how the process worked, and what insight was gained.
As the picture shows, the only real difference is a change in the colour scheme; the functionality and layout of the website remain largely unchanged. Unlike mechanical adjustments on the server side, which can be tested for practicality and effectiveness in-house, the benefits of aesthetic changes can only be confirmed by public testing, where the inherently subjective responses can be measured statistically.
The instances ending with the suffix '_paris' represent the conversion rate, in terms of enquiries, for the amended versions; the other results represent the same data for the original versions. As you can see, the original design performs markedly better in terms of conversion. In this case the client was advised to keep the original design, as analysis of the results indicated that a drop-off would occur if they went with the more feminine look. This form of statistical observation is key when demonstrating the viability of design changes to clients. While the virtues of aesthetic choices can be entirely subjective, being able to demonstrate the subliminal effect of updating the look of a website allows an objective choice to be made.
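How do you know a gap like this is a real effect and not noise? A standard approach is a two-proportion z-test on the enquiry counts. This is a generic sketch with invented figures for illustration, not Birtsmorton Court's actual data:

```python
import math

def z_test_two_proportions(conv_a: int, n_a: int,
                           conv_b: int, n_b: int) -> tuple[float, float]:
    """Two-sided z-test for a difference between two conversion rates.

    Returns (z, p_value); a p_value below 0.05 is conventionally
    taken as evidence the difference is real rather than chance.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: 120 enquiries from 2,000 visits on the original
# vs 80 enquiries from 2,000 visits on the '_paris' variant.
z, p = z_test_two_proportions(120, 2000, 80, 2000)
```

With figures like these the p-value comes out well below 0.05, which is the statistical backing behind advising a client to keep the original design.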
Visual design choices are a core component of user-centred design. Where the functionality and interactive elements of a website should make it as easy as possible for a potential customer to become a realised customer, the overall look of the site should appeal to new and existing customers alike. These decisions are subjective and their effect on the audience, whether positive or negative, is often subliminal, as most people won't consciously pick up on subtle design changes. Through proper public testing you can gain real insight into how choices ranging from the purely aesthetic to the adjustment of elements on a page, for example their positioning or scale, produce completely different reactions from a website’s visitors.
In our example the original colour palette proved more beneficial and a potential dent to conversion rates was avoided. One notable issue with A/B testing is that if the change does affect conversion, half of the sample group will experience the worse rate for the duration of the test. In this case the new colour scheme saw a drop-off in enquiries during the test, so some potential customers may have been missed. However, the marginal dip seen during testing is almost always outweighed by the improved results of making an informed decision on whether or not to approve the new look.
Design and implementation will always be an iterative process, and any business with ambition should be continually improving customer-facing elements such as websites with a view to growing its client base. Aesthetic choices will be discussed and new layouts put together, but as with every step of the development process, proper testing remains essential. With well-planned application, the A/B testing model gives both clients and developers an accurate picture of the effectiveness of these potential adjustments and allows both parties to make collaborative, thoughtful, and educated decisions.