
A/B testing

A/B testing is a great method for comparing different versions of a design with a clear goal.

How it works

A/B testing is mostly used on websites, e.g. to compare which design leads to the highest conversion rate, but it can also be used to compare different ways of solving a specific task in a product. The cheapest way to do the latter is to prototype the different versions and run usability tests. The method can also be used to compare a prototype to an existing solution.
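In a live setting, variants are typically assigned at random but consistently, so a returning user always sees the same version. Below is a minimal sketch of one common approach, deterministic bucketing by hashed user ID; the function name, experiment name, and IDs are illustrative assumptions, not tied to any specific tool:

import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    # Hash user ID + experiment name so the same user always gets the
    # same variant, and different experiments split independently.
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

print(assign_variant("user-1234", "checkout-redesign"))  # prints "A" or "B"

Including the experiment name in the hash matters: it keeps a user's bucket in one test independent of their bucket in any other test running at the same time.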

Phase: Validate
Time: 1-4 hours
Who: Product Manager and/or Product Owner, Business Analyst, UX Designer, Developer, Quality Assurer
Before

1. Define what to test. When running an A/B test, the metric must be very clear, e.g. conversion rate or task success. What you choose should be based on user feedback. It also pays to estimate up front how many users each variant needs (see the sample-size sketch below these steps).

2. Prepare prototype(s) or test variations. Make sure everything is ready to launch the A/B test. A/B tests can be run in production environments, through usability testing or through tools like UsabilityHub.

3. Do a dry run. Before launching the test live, try it out with a team member to see if they are able to understand what you wish to compare. Look out for confusion.
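For step 1, it helps to know how many users each variant needs before a difference in the metric becomes trustworthy. Here is a minimal sketch using the standard two-proportion sample-size formula; the baseline and target rates are illustrative assumptions, not from this guide:

from math import sqrt, ceil

def sample_size_per_variant(p1: float, p2: float) -> int:
    # Users needed per variant to detect a change from rate p1 to p2,
    # assuming a two-sided test at alpha = 0.05 (z = 1.96) and 80% power
    # (z = 0.84): the standard two-proportion formula.
    z_alpha, z_beta = 1.96, 0.84
    p_bar = (p1 + p2) / 2
    n = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
         + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2 / (p2 - p1) ** 2
    return ceil(n)

# Example: detecting a lift in conversion rate from 10% to 12%
print(sample_size_per_variant(0.10, 0.12))  # about 3,800 users per variant

Note how sensitive the number is to the size of the lift: smaller differences need far more users, which is why prototypes and usability tests are the cheaper option for coarse comparisons.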

During

4. Send it out! Share the A/B test with your intended target group and await the results.

After

5. Analyse the data. Gather your team and look at the data together. Align on which variation performed better, and check that the difference is larger than chance alone would produce (see the significance-test sketch below these steps).

6. Present and share the results with the rest of the team and internal stakeholders.

7. Put the data to good use! Depending on whether you tested prototypes or something in production, the next step would be to fully launch the variation that performed better. If you still see room for improvement, this would also be the time to put what you learnt into the next iteration.
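For step 5, a common way to check that the winning variation did not win by chance is a two-proportion z-test on the conversion counts. A minimal sketch with illustrative numbers, not from this guide:

from math import sqrt, erf

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    # Two-sided p-value for the difference between two conversion rates.
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under "no difference"
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal CDF tail

# Example: 100 of 1,000 users converted on A, 140 of 1,000 on B
print(f"p-value: {two_proportion_z_test(100, 1000, 140, 1000):.3f}")  # ~0.006

A small p-value on its own is not the whole story: compare the result against the baseline you defined before the test (see Tips) when arguing for the change.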

Tips
  • Have a clear baseline for metrics so that you have something to compare your results against. This will help you argue for change once your results are in. 


Tool recommendations

Below you will find some tool recommendations, but please note that you are free to choose whatever tool you prefer. For some tools we have group-wide licenses, in which case access can be requested via licenses@visma.com. Remember that you always need cost approval from your immediate manager.

Google Meet
Real-time meetings using your browser; share your video, desktop, and presentations with teammates and customers.

Maze
Maze powers your product research workflow with continuous user insights, fueling better product decision-making and business growth.
Want to learn more?

Get in touch with Visma UX
