Quality Assurance (QA) is the last step before going live with your A/B test. We cannot stress enough how important thorough QA is: we need to be certain that the changes will look and work as expected. The best way to ensure that you run high-quality, error-free and reliable tests is through a consistent QA process.
While QAing, you need to consider several aspects of an experiment to ensure a positive user experience and reliable test results. We recommend having a templated QA sheet that you can reuse for each test, containing details of all the test cases to carry out. Don't worry if you don't have one; we're here to help by highlighting the 7 essential areas you should cover.
Design is the first point of contact for the user, so it is crucial to leave a positive impression. Any discrepancies or inconsistencies in the design can deter users, leading to exits and potentially impacting conversion rate. To mitigate these risks, check the variation against the approved designs on the devices and screen sizes your users actually use, and make sure nothing else on the page has shifted or broken.
You need to pay extra attention to this check because ensuring components function as expected is paramount. Glitches or malfunctions can disrupt the user journey, leading to frustration and abandonment.
Checking that tracking fires as expected is crucial so that we can accurately measure the impact of the CRO experiment and gain insight into user behaviour and conversion rate performance.
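For example, if your site pushes analytics events to a Google Analytics-style dataLayer, a quick way to confirm events fire while you click through the variation is to log every push from the browser console. This is only a rough sketch and the dataLayer setup is an assumption about your stack; your testing tool may provide its own event debugger.

// Console sketch, assuming a Google Analytics-style dataLayer exists on the page.
const dl = ((window as any).dataLayer = (window as any).dataLayer || []);
const originalPush = dl.push.bind(dl);

// Wrap push so every tracking event is logged while you test the variation.
dl.push = (...events: object[]) => {
  events.forEach((e) => console.log('dataLayer event fired:', e));
  return originalPush(...events);
};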
Tests need to be shown to the right audiences, at the right time and on the right pages.
Any inaccuracies or misconfigurations in the targeting conditions can wrongly include or exclude users from the experiment, leading to skewed results and flawed conclusions.
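As a purely illustrative sketch, the logic a targeting condition evaluates looks roughly like the function below; the page, device and cookie rules are hypothetical, and in practice your tool builds this from its own configuration.

// Hypothetical client-side targeting check, for illustration only.
function shouldEnterExperiment(): boolean {
  const onTargetPage = window.location.pathname.startsWith('/checkout'); // hypothetical page rule
  const isMobile = /Mobi/i.test(navigator.userAgent);                    // hypothetical device rule
  const isInternalUser = document.cookie.includes('internal_user=true'); // hypothetical exclusion
  return onTargetPage && !isMobile && !isInternalUser;
}

console.log('Would this visitor enter the test?', shouldEnterExperiment());

Walking through each condition like this during QA makes it obvious which rule is including or excluding the visitor you are testing as.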
Including the right amount of traffic in the test, and splitting it correctly between variations, is important for reaching the required sample size and drawing meaningful insights.
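As a rough guide, the classic two-proportion formula gives a feel for the sample size needed per variation. The sketch below assumes 95% confidence and 80% power, and the baseline conversion rate and uplift plugged in are purely illustrative.

// Rough two-proportion sample size estimate per variation, with fixed z-scores
// for 95% confidence (1.96) and 80% power (0.84).
function sampleSizePerVariation(baselineRate: number, relativeUplift: number): number {
  const zAlpha = 1.96;
  const zBeta = 0.84;
  const p1 = baselineRate;
  const p2 = baselineRate * (1 + relativeUplift);
  const variance = p1 * (1 - p1) + p2 * (1 - p2);
  return Math.ceil(((zAlpha + zBeta) ** 2 * variance) / (p2 - p1) ** 2);
}

// Example: a 3% baseline conversion rate and a 10% relative uplift need
// roughly 53,000 visitors per variation.
console.log(sampleSizePerVariation(0.03, 0.10));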
Defining the right primary and secondary goals/objectives is the foundation for measuring the outcome of the test. When those goals are clear and aligned with business objectives and user expectations, we can effectively measure and optimise the user experience.
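One simple habit is writing the goals down in a shared, unambiguous format before the test is built; the structure and metric names below are hypothetical and only illustrate the idea.

// Hypothetical example of documenting primary and secondary goals up front.
const experimentGoals = {
  primary: { name: 'checkout_completed', type: 'conversion' },
  secondary: [
    { name: 'add_to_basket_rate', type: 'conversion' },
    { name: 'average_order_value', type: 'revenue' },
  ],
} as const;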
Almost all testing tools can integrate with other tools so that you can analyse the data in multiple places. Some common examples are Google Analytics, Mixpanel, Adobe Analytics, Hotjar and Contentsquare. When you're running a campaign, make sure you have the relevant integrations enabled and configured so that data is collected smoothly and accurately across platforms.
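As a hedged example, many teams forward the experiment name and variation to their analytics platform via a dataLayer push so results can be segmented there; the event and field names below are hypothetical, and your tool's native integration may handle this automatically once enabled.

// Hypothetical exposure event forwarded to analytics via the dataLayer.
(window as any).dataLayer = (window as any).dataLayer || [];
(window as any).dataLayer.push({
  event: 'ab_test_exposure',        // hypothetical event name
  experiment_name: 'pdp_cta_test',  // hypothetical experiment identifier
  variation_name: 'variation_1',
});

During QA, confirm this data appears in each integrated platform with the correct experiment and variation values.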
There are several ways to QA your campaign. The exact options vary from tool to tool, but you can follow any of the methods below depending on your technical background and hands-on experience with the testing tool.
Most testing tools offer dedicated applications known as 'QA assistants' or provide QA preview links. These allow QA teams to thoroughly evaluate all of the checks we've outlined previously without pushing the changes live to your users, and they make it easy to 'force' yourself into whichever variation you want before carrying out the relevant checks.
With this approach, we push the test live but add an additional targeting condition that checks whether a certain cookie has been set on the browser. This is a good way to truly test the experience as if it were live, especially if we see unexpected behaviour with the QA links or QA assistants. Only visitors who have this cookie value in their browser can see the experiment, so to see the changes yourself you will have to set the cookie manually using the browser developer tools.
To do this, open your browser's developer tools and add the cookie from the console (or via the Application/Storage panel).
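As a minimal sketch, assuming your targeting condition looks for a cookie like the hypothetical one below, you could set it from the console:

// The cookie name and value are hypothetical and must match whatever your
// targeting condition is configured to check for.
document.cookie = 'ab_qa_mode=true; path=/; max-age=86400'; // valid for 24 hours

// Quick check that it was set before you reload the page:
console.log(document.cookie.includes('ab_qa_mode=true'));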
Similar to QAing with a cookie, with query parameters we change the targeting conditions of the test to look for a specific value in the URL that a normal user would never have. For example:
www.exampledomain.com?abtestparam=value
Be careful: if there is already another parameter in the URL, you will need to make sure you append yours with an ampersand:
www.exampledomain.com?param1=value1&abtestparam=value2
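For reference, the check your targeting condition effectively performs can be reproduced in the console; the parameter name here mirrors the example URL above.

// Reads the current URL and checks for the QA parameter from the example above.
const params = new URLSearchParams(window.location.search);
const inQaMode = params.get('abtestparam') === 'value';
console.log('QA parameter present:', inQaMode);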
The IP targeting method is easy to set up and doesn't require much technical knowledge. Look up your IP address and add it to the test's targeting conditions; if you want others to see your QA version (e.g. colleagues in the same office), you can also set an IP range in your tool.
Once done, push your campaign live and you can QA your experiment.
Since this step of building an experiment is crucial, below are some common mistakes to avoid.
There are lots of emulation tools on the market that offer scalability, accessibility and a huge variety of operating systems and hardware profiles. You can easily pick the OS and browser of your choice; however, this is not the recommended way to QA your test campaign, largely because emulators rarely reproduce real-world rendering, touch behaviour, performance and network conditions exactly.
We recommend avoiding emulators for your CRO experiment QA and using actual devices instead.
Just because your campaign runs smoothly in one browser doesn't mean it will behave the same in another; some CSS rules, for example, are not supported across all browsers. Use analytics data to check which browsers are most popular with your users, although all browsers should be considered.
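One quick sanity check during QA is the standard CSS.supports API, which tells you whether the browser you are testing in understands a rule your variation relies on; the features checked below are just examples.

// Console sketch: feature-detect CSS rules before trusting them cross-browser.
console.log('CSS grid supported:', CSS.supports('display', 'grid'));
console.log(':has() selector supported:', CSS.supports('selector(:has(a))'));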
Likewise, a campaign that runs seamlessly on desktop won't necessarily behave the same on mobile. Your test should be device friendly from both a design and a functionality perspective.
Chat through our services with our team today and find out how we can help.