A/B testing
A/B testing lets you create variations for a number of page elements (blocks, images, content, buttons, form fields, and so on), then compare which variation performs best. A conversion is recorded when a website visitor takes a desired action, for example buying an item, filling in a form, or clicking a link. A/B testing measures the number of conversions obtained from the original (control) versus the variation (challenger), and the version that generates the most conversions during the testing period is typically promoted to the design for that page. Optimizely A/B testing has several predefined conversion goals you can use when setting up a test, and Optimizely developers can also create customized conversion goals.
See also Optimizely Web, a powerful front-end A/B and multi-page experimentation product.
The Optimizely Digital Experience Platform contains many features to support you in your daily work. Depending on how your solution is set up, some features described in this documentation may not be available to you. Contact your system administrator to find out more. See Optimizely World for technical information.
How it works
Let's say you want to know whether a different advertisement can generate more interest from your site visitors. Using A/B testing, you create two page versions with two different advertisements that link to a target page (the page that defines the end goal of a conversion path). You set the A/B test to use the conversion goal Landing Page, which measures how many visitors click the advertisements and reach the target page.
- When a visitor views your A/B test page, the visitor sees either the original (A / Control) or the variation (B / Challenger). A/B testing logs which version the visitor sees. If a visitor returns to the test page, the visitor sees the same version (A or B) throughout the duration of the test. However, if they clear their cookies and revisit the test page, they are considered a new visitor in the test. (A minimal sketch of this kind of cookie-based assignment follows this list.)
- If a visitor clicks the advertisement, the target page appears and A/B testing logs the action as a conversion.
- When the test duration completes, the version that achieves the best results (that is, the most clicks) is declared the winner of the test.
- Depending on your site configuration, you can manually pick a winner (usually the one with the most conversions), or the winner is published automatically when the test completes. Test winners are only published automatically if the test results are statistically significant. For more information, see the Statistical significance section below.
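Optimizely CMS handles this visitor assignment for you, and the exact mechanism is not part of this documentation. Purely as an illustration, the following minimal sketch shows how sticky, cookie-based assignment of this kind can work; the names (assign_variant, abtest_variant) and the logic are assumptions, not the product's implementation.

```python
import random

# Illustrative only: Optimizely CMS performs this bookkeeping for you.
# The cookie name, participation value, and function are assumptions.

COOKIE_NAME = "abtest_variant"      # hypothetical cookie key
PARTICIPATION_PERCENTAGE = 100      # share of website visitors included in the test

def assign_variant(visitor_cookies: dict) -> str:
    """Return 'A', 'B', or 'excluded' for a visitor, reusing any earlier choice."""
    # A returning visitor keeps the value stored in the cookie, so they see the
    # same version for the whole test. Clearing cookies loses this value, which
    # is why such a visitor counts as a new participant.
    if COOKIE_NAME in visitor_cookies:
        return visitor_cookies[COOKIE_NAME]

    # New visitor: first decide whether they participate at all...
    if random.uniform(0, 100) >= PARTICIPATION_PERCENTAGE:
        variant = "excluded"        # excluded visitors always see version A
    else:
        # ...then split the participants roughly 50/50 between A and B.
        variant = random.choice(["A", "B"])

    visitor_cookies[COOKIE_NAME] = variant   # persist for later visits
    return variant
```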
Statistical significance
Statistical significance is a calculation that determines whether test results can be considered significant. It is a function of the number of views and conversions for each variant. If one version is winning by a wide margin but has a relatively low number of views, it can still be calculated as the statistically significant winner of the test, whereas a test with far more views, but where the variants' conversion rates run much closer, can produce results that are not considered significant. Theoretically, statistical significance could be checked at any point during a test, but A/B testing lets the test finish before running the calculation that determines whether the results are significant.
So, how many views are needed to reach statistical significance? It depends on the margin by which the winning variant is ahead: the wider the gap between the conversion rates, the fewer views are needed.
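The exact calculation Optimizely uses is not described here. As an illustration of the general idea only, the following sketch applies a standard two-proportion z-test to two made-up result sets: a wide margin with few views, and a narrow margin with many views.

```python
from math import sqrt
from statistics import NormalDist

def z_score(views_a, conv_a, views_b, conv_b):
    """Two-proportion z-test statistic for conversion rates (illustrative only)."""
    p_a = conv_a / views_a                      # conversion rate of A / Control
    p_b = conv_b / views_b                      # conversion rate of B / Challenger
    p = (conv_a + conv_b) / (views_a + views_b) # pooled conversion rate
    se = sqrt(p * (1 - p) * (1 / views_a + 1 / views_b))  # standard error
    return (p_b - p_a) / se

# Made-up example: a wide margin with few views...
z_small = z_score(views_a=200, conv_a=10, views_b=200, conv_b=30)
# ...versus a narrow margin with many views.
z_large = z_score(views_a=5000, conv_a=500, views_b=5000, conv_b=520)

critical = NormalDist().inv_cdf(0.975)          # two-sided threshold at 95% confidence
print(f"{z_small:.2f} vs {critical:.2f}")       # ~3.33 > 1.96: significant
print(f"{z_large:.2f} vs {critical:.2f}")       # ~0.66 < 1.96: not significant
```

In this made-up data, the first test would count as significant at a 95% confidence level while the second would not, even though the second test has far more views.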
Confidence level
The confidence level set in Advanced Options is used in the significance calculation to specify how much variance the results may have before they are considered statistically significant. The higher the confidence level you select, the more certain the calculation has to be that a variant is winning by a statistically significant margin. In other words, it sets the tolerance for standard deviation in the results before they are viewed as significant. Typically, the more data a test collects, the lower the standard deviation becomes, and thus the higher the confidence in the results.
Once the test has completed and results are calculated using the selected confidence level, the reporting appears at the top of the Pick a Winner screen.
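Continuing the illustrative sketch above (again, an assumption about the general approach, not the product's implementation), a higher confidence level simply raises the threshold that the test statistic must clear:

```python
from statistics import NormalDist

# Two-sided critical values for some common confidence levels (illustrative).
for confidence in (0.90, 0.95, 0.99):
    threshold = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    print(f"{confidence:.0%} confidence -> |z| must exceed {threshold:.2f}")

# 90% confidence -> |z| must exceed 1.64
# 95% confidence -> |z| must exceed 1.96
# 99% confidence -> |z| must exceed 2.58
```

A result that clears the 1.64 threshold but not 2.58 would therefore be reported as significant at 90% confidence but not at 99%.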
Starting an A/B test
- Start with a published version of a page or block as the original (A / Control). For example, you have a site devoted to air travel tips and want to get visitors interested in exploring your site. Will a fancy graphic button get more click-throughs (visitors following the link) than a plain text button?
- Create a draft by changing the button or making some other change to the page:
- Select Publish? > A/B Test changes. Do not publish the changed page. The A/B test view appears showing A / Control and B / Challenger thumbnail images.
If you are using Content approvals, set your draft to Ready for Review and let it be approved before you start the A/B test.
- Configure your A/B test by setting the following options:
Test Goal: Enter your hypothesis for the test. This is for your information only.
Conversion goals: Select the conversion goal or goals that you want to measure. (Conversion goals are also known as key performance indicators, KPIs: measurements of actions on web pages, such as completed purchases, pages visited, or time spent on the site.) You can add up to five conversion goals for the A/B test, and under Advanced Options, you can decide if some goals are more or less important than others. The available conversion goals are:
- Landing Page. Select a target page to which the visitor is taken when the visitor clicks through. Only a click-through is counted as a conversion.
- Site Stickiness. Select a target page and a timeout period (1-60 minutes). The A/B test counts a conversion if a visitor goes from the target page to any other page on your website during the time period. If the visitor closes the browser then opens your target page again within the specified time period, a new page view is not counted. However, a conversion is counted if the visitor goes from the target page to another page during the second visit.
- Time on Page. Enter a time in seconds. The A/B test counts a conversion when a visitor stays on the test page for at least the defined time.
- Add to Cart. Select a product a site visitor can add to a cart. If a visitor adds that product to a cart, it is counted as a conversion.
- Average Order. Select this conversion goal to track completed orders on each of the test pages. The conversion goal totals up the values of all Optimizely Commerce carts created by visitors included in the A/B test. When picking a winner, the test determines which page variant produces the highest average value across all those carts. If a visitor creates multiple carts, all the (purchased) carts are included in the total, which means that the visitor can "convert" several times during the test. On Optimizely Commerce websites using different currencies, the test converts all carts to the same currency. (A worked sketch of this average calculation follows these configuration steps.)
- Purchase Product. Select a product a site visitor can buy. If a visitor buys that product, it is counted as a conversion.
You need Optimizely Commerce to use Commerce-related conversion goals such as Add to Cart, Purchase Product, and Average Order.
Participation percentage: Enter the percentage of total website traffic to include in your A/B test. If you set it to 100%, all website visitors participate in the test. Half of the test participants see version A, and half see version B.
However, you may not want that many visitors to see version B if it includes something that might be unsuccessful. In that case, lower the percentage of visitors included in the test. Visitors who are not included in the test see version A, and only visitors included in the test are counted in the statistics.
Test duration: Specify the number of days you want the test to run.
Start test: Select one of the following options. You can stop the test at any time before the specified number of days has passed.
- Start test immediately. Select this option and click Start Test after you have specified the test parameters.
- Schedule for later. Select this option and a date picker appears. Select a date and time to start the test. Click the Schedule Test button after you have specified the test parameters.
Advanced Options:
- Balance the importance of test goals. Select whether one goal is more important, or less important, than the others. If two conversion goals are set to High (or Low), it is the same as leaving both at Medium: they have the same importance and therefore do not weight the test result. Similarly, if you add a single conversion goal, the selected weight has no effect on the test result.
- Confidence level. Select the confidence level of statistical significance you want from the results that you gather. The higher the confidence level you select, the more certain the calculation has to be that the winning variant is ahead by a statistically significant margin.
- Click Start Test if you set the test to start immediately, or Schedule Test if you scheduled the test for later.
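As a worked illustration of the Average Order goal described above, the sketch below compares the average cart value per variant. The cart values and the function name are made up, and the real calculation is performed for you by Optimizely Commerce (including conversion to one currency); this is not the product's implementation.

```python
# Illustrative only: the cart values below are made up.
carts_a = [120.0, 80.0, 95.0]            # purchased carts from visitors who saw A
carts_b = [200.0, 60.0]                  # purchased carts from visitors who saw B

def average_order(cart_values):
    """Average value of all purchased carts attributed to one variant."""
    return sum(cart_values) / len(cart_values) if cart_values else 0.0

avg_a = average_order(carts_a)           # 98.33
avg_b = average_order(carts_b)           # 130.00
leader = "B / Challenger" if avg_b > avg_a else "A / Control"
print(f"A: {avg_a:.2f}, B: {avg_b:.2f} -> leading variant: {leader}")
```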
Viewing a running A/B test
- To view a running A/B test, open the page you are testing, and click View Test on the notification bar.
The test results are displayed, and a flame graphic shows which version is leading.
Beneath the two page thumbnails, you can view the test data collected so far, such as views, number of conversions, and conversion rates. If you are measuring against multiple conversion goals, you see how each goal is performing and what weight each goal is given. If you are measuring against one goal only, you see the test data and a pie chart visualizing the conversion rate. The conversion rate is shown as a percentage, or as an amount if you are using the Average Order KPI. (The pie chart is not displayed for the Average Order KPI.)
A/B testing normally calculates the views as the number of times a page has been displayed to a visitor. However, when you are testing a block, A/B testing counts the number of times the block has been requested by Optimizely CMS. If you have a condition set on your block so it is only displayed to certain visitor groups, for example, a view may be counted even though the block has not been displayed to a visitor.
Statistical significance of the test is calculated when the test is finished. Before that, it is not possible to say whether the test results are significant or not.
- You can select the following actions from the Options menu:
- Pick The Winner. If you see enough data before the test completes, you can stop the test and pick a winner. For example, perhaps the changed page is a clear runaway winner such that another few days of testing may not significantly affect the result.
If you select Pick The Winner, the Pick the Winner view appears with the current leader highlighted in green. Click Pick The Winner on the version you want to publish, and that version is published automatically. After you select a winner, the losing version is kept in the Versions gadget as a historical record.
- Abort A/B test. Stop the test and discard the results.
Picking a winner
Depending on your site configuration, a test winner can be published automatically at the end of the test, or you can publish it manually, during or after the A/B test.
Publishing a test winner automatically
An administrator can set up your site to automatically publish A/B test winners at the end of a test, if the test result is statistically significant. If this setting is enabled, it affects all tests on your site. As soon as a test finishes, the test winner is published. However, if the test result is not statistically significant, you have to manually publish one of the test versions.
Publishing a test winner manually
If you have publishing rights, you can publish a test winner while the test is running or wait until it finishes.
- To view a finished A/B test, open the page you are testing, and click Pick winner on the notification bar.
The test results are displayed.
At the top of the test result screen, you can see if the results are statistically significant.
The test winner is highlighted with a green background and a trophy graphic. The Pick The Winner button of the test winner is green, but you can publish either version.
Beneath the two page thumbnails, you can view test data, such as views, number of conversions, and conversion rates. If you are measuring against multiple conversion goals, you see how each goal has performed.
- Click Pick The Winner on the version you want to publish and it is published immediately.
The loser is still available in the Versions gadget.
Managing A/B tests
You cannot edit the test settings or the content of a page while the test is running, because doing so could invalidate the results. If you need to change the test settings or something on the test page, you must cancel the test, make your changes, and start the test over. You can cancel the test from the Options menu in the test view or from the test page. If you open a draft of the test page, the Options menu is called Publish?.
Use the Tasks tab in the navigation pane to find A/B tests.
- Scheduled Tests. Displays links to tests that are scheduled to run at a later time.
- Active Tests. Displays links to active test pages that are collecting data. Click an item to display the test page, where you can click the View test link to display a snapshot of the result data.
- Completed Tests. Displays links to completed tests. Data is no longer being collected, and a winner has not yet been published.
- Archived Tests. Displays links to completed tests where a winner has been published.
For other statuses in the Tasks bar, see Controlling the publishing process.
Viewing completed and archived tests
In the Tasks pane, you can see all A/B tested pages by selecting Active Tests, Completed Tests, or Archived Tests. (A completed test is a test that is finished but whose winner has not yet been published. An archived test is a completed test whose winner has been published.)
To view the individual tests run on a specific page, add the Archived Tests gadget to the navigation or assets pane and open a tested page. The gadget displays all archived tests run on the current page. Click a test in the gadget to view the test details.