Recommendations on how to approach QA / preview of SiteSpect experiments prior to activation.
Always QA a campaign thoroughly before activating it in a production environment. Be sure to preview the campaign on multiple browsers, and on mobile devices if those are part of the target audience.
Test any actions a user might perform and verify that the variation changes persist as expected. Try navigating away from and back to the target page, and adjust filters, options, and input values.
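If you want to repeat this persistence check without clicking through the site by hand each time, a short automated sketch can help. The example below uses Playwright with TypeScript; the preview URL, the `#sort-select` filter control, and the `[data-test="variation-banner"]` selector are all placeholders for whatever your variation actually changes, so treat it as a starting point rather than a SiteSpect-provided tool.

```typescript
// persistence.spec.ts - a minimal sketch, not a SiteSpect utility.
// Assumes: PREVIEW_URL points at a preview link you generated in SiteSpect,
// and '[data-test="variation-banner"]' marks an element your variation adds or changes.
import { test, expect } from '@playwright/test';

const PREVIEW_URL = process.env.PREVIEW_URL ?? 'https://www.example.com/?preview-link-placeholder';

test('variation change persists across user actions and navigation', async ({ page }) => {
  await page.goto(PREVIEW_URL);

  // The variation should be visible on first load of the target page.
  const variationElement = page.locator('[data-test="variation-banner"]');
  await expect(variationElement).toBeVisible();

  // Perform a typical user action (hypothetical filter control).
  await page.locator('#sort-select').selectOption('price-asc');
  await expect(variationElement).toBeVisible();

  // Navigate away and back; the change should still be applied.
  await page.goto(new URL('/another-page', PREVIEW_URL).toString());
  await page.goBack();
  await expect(variationElement).toBeVisible();
});
```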
It is also important to test the campaign across different URLs and page types on your site. Make sure the desired experience triggers where it should and does not trigger where it should not. You can use the Summary tab in a preview to review the counted status.
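A similar sketch can sweep a list of page types to confirm the experience fires only where intended. The paths and the `[data-test="variation-banner"]` marker below are illustrative placeholders; the preview Summary tab remains the authoritative place to confirm counted status.

```typescript
// trigger-scope.spec.ts - a minimal sketch under the same assumptions as above.
import { test, expect } from '@playwright/test';

const PREVIEW_URL = process.env.PREVIEW_URL ?? 'https://www.example.com/?preview-link-placeholder';

// Placeholder paths: replace with page types from your own site and brief.
const pages: Array<{ path: string; shouldTrigger: boolean }> = [
  { path: '/category/shoes', shouldTrigger: true },  // experiment should fire here
  { path: '/product/12345', shouldTrigger: true },
  { path: '/checkout', shouldTrigger: false },       // and must not fire here
  { path: '/help/returns', shouldTrigger: false },
];

for (const { path, shouldTrigger } of pages) {
  test(`variation ${shouldTrigger ? 'appears' : 'is absent'} on ${path}`, async ({ page }) => {
    // Establish the preview assignment first, then visit the page under test
    // within the same browser context.
    await page.goto(PREVIEW_URL);
    await page.goto(`https://www.example.com${path}`);

    const marker = page.locator('[data-test="variation-banner"]');
    if (shouldTrigger) {
      await expect(marker).toBeVisible();
    } else {
      await expect(marker).toHaveCount(0);
    }
  });
}
```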
Another useful approach is to work through the four core sections of the campaign builder and review the critical elements in each, as outlined below.
- General
  - Is the experiment in the right Set Type for traffic management (Overlay vs. Non-Overlay)?
  - Is the traffic % / assignment frequency set to the required value as outlined in the brief?
- Variations
  - Are the campaign's triggers pointing to the correct pages for the experiment? If so, the counted status in the preview Summary tab should only show Yes after you have visited the test page in a preview session.
  - Do the variant(s) display as described in the brief across the latest versions of the major browsers and device types (Chrome, Safari, Firefox, Edge)?
  - Are there other experiments running on the same locations/pages, and do they conflict? If so, use Advanced Preview Settings to layer multiple experiments into the same preview session(s) and confirm whether any issues arise.
  - During your preview session(s), do you notice any console errors in the browser's developer tools? Are these present only for the variation group(s), or also in the control? (A sketch for capturing these automatically follows this list.)
- Metrics
- Is the right KPI selected to drive the experiment alerts?
- Are the correct metrics populated in the experiment as per the brief?
- Are the metrics working as expected and triggering on the correct pages during your preview session(s)?
- Audiences
  - Are the correct audience includes and excludes populated in the experiment?
  - Are they working as expected? This can be validated using Advanced Preview Settings to spoof many of the audience criteria in your preview session(s); a supplementary browser-automation sketch also follows this list.
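To make the console-error check from the Variations section repeatable, something like the sketch below can be used; running the same spec under Playwright's Chromium, Firefox, and WebKit browser projects also gives a rough cross-browser smoke test. The two preview URLs are placeholders for the control and variation preview links generated in SiteSpect; this is an illustrative harness, not part of SiteSpect itself.

```typescript
// console-errors.spec.ts - a minimal sketch; the preview URLs are placeholders
// for the control and variation preview links generated in SiteSpect.
import { test, expect } from '@playwright/test';

const previews = {
  control: process.env.PREVIEW_URL_CONTROL ?? 'https://www.example.com/?control-preview-placeholder',
  variation: process.env.PREVIEW_URL_VARIATION ?? 'https://www.example.com/?variation-preview-placeholder',
};

for (const [group, url] of Object.entries(previews)) {
  test(`no console or page errors in the ${group} preview`, async ({ page }) => {
    const errors: string[] = [];

    // Uncaught exceptions thrown by page scripts.
    page.on('pageerror', (err) => errors.push(`pageerror: ${err.message}`));
    // Anything logged at console.error level.
    page.on('console', (msg) => {
      if (msg.type() === 'error') errors.push(`console: ${msg.text()}`);
    });

    await page.goto(url);
    await page.waitForLoadState('networkidle');

    // If errors appear only for the variation group, the campaign code is the likely cause;
    // if they also appear in control, they probably pre-date the experiment.
    expect(errors).toEqual([]);
  });
}
```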
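Advanced Preview Settings remains the primary way to spoof audience criteria inside SiteSpect, and IP-based criteria (such as geo targeting) still need to be handled there. For client-visible criteria such as user agent, device size, or locale, an automated browser can apply similar overrides as a supplementary check. The values below are examples only, and the variation marker is the same hypothetical selector used in the earlier sketches.

```typescript
// audience-spoof.spec.ts - a supplementary sketch; Advanced Preview Settings
// is still the primary way to spoof audience criteria in SiteSpect.
import { test, expect } from '@playwright/test';

// Example overrides for client-visible audience criteria (values are illustrative).
test.use({
  userAgent:
    'Mozilla/5.0 (iPhone; CPU iPhone OS 17_0 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.0 Mobile/15E148 Safari/604.1',
  viewport: { width: 390, height: 844 }, // roughly phone-sized
  locale: 'de-DE',
});

test('included audience sees the variation', async ({ page }) => {
  await page.goto(process.env.PREVIEW_URL ?? 'https://www.example.com/?preview-link-placeholder');
  // Hypothetical marker added by the variation; flip the expectation when testing an exclude rule.
  await expect(page.locator('[data-test="variation-banner"]')).toBeVisible();
});
```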