#CRO #Digital Marketing

How to Win at CRO and the Four Common Mistakes to Avoid

Conversion Rate Optimisation (CRO) is the ongoing process of identifying where users experience friction on a website or app. Through detailed analysis of both quantitative and qualitative data, the biggest sources of customer pain are pinpointed and ideas are formulated to improve the customer experience.

“We have tried that and it does not work”.

This is a common response from businesses that invest little, or nothing at all, in an optimisation team. However, it overlooks the fact that an idea still needs to be correctly transformed into a hypothesis and then diligently executed to have any chance of success. A flaw in implementation, at any stage, can take a good idea and negate its potential positive impact.

Here are four common pain points and the solutions that will improve your overall CRO strategy.

1. Badly formed hypotheses

A hypothesis turns your idea into a formal theory that can be proven right or wrong. However, a hypothesis based on limited data is missing the foundations of a well-crafted idea for experimentation.

Solution: Investigate data sources that support an idea that aligns with a customer challenge. Then combine this data with the business objectives to identify goals that track and define success and failure. Be sure to stick to a structured format for each hypothesis, i.e. based on [data], if [cause], then [impact], because [rationale]. For example: based on analytics showing high drop-off at the payment step, if we reduce the number of form fields, then checkout completion will increase, because visitors abandon long forms.

2. Poor coding

Experiments require a combination of HTML, CSS and JavaScript. Rushing the development process, cutting corners or relying on WYSIWYG visual editors to build experiences can lead to sub-optimal code that is broken, slows down your webpage or breaks other pages on the website.

Solution: Get a front-end developer who has the necessary skills to build out test experiences. Brief them on what is required, hold regular check-ins to ensure everything is going to plan and trust their ability to deliver.
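One pattern a developer might use is defensive experiment code. The sketch below (function and variable names are illustrative, not from any testing tool) applies a variation as a series of reversible steps and rolls everything back if any step fails, so a half-broken change never reaches the visitor:

```javascript
// Sketch: apply an experiment as reversible steps. If any step throws
// (e.g. a selector no longer matches), undo the steps already applied
// so the visitor falls back to the unmodified control experience.
function applyExperiment(steps) {
  const undos = [];
  try {
    for (const { apply, undo } of steps) {
      apply();
      undos.push(undo);
    }
    return true;
  } catch (e) {
    // Roll back in reverse order, then report failure.
    while (undos.length) undos.pop()();
    return false;
  }
}

// Usage with a stand-in element (in the browser this would be a DOM node):
const cta = { text: 'Buy now' };
const applied = applyExperiment([
  {
    apply: () => { cta.text = 'Start your free trial'; },
    undo: () => { cta.text = 'Buy now'; },
  },
]);
```

The rollback path matters as much as the happy path: a variation that silently reverts is sub-optimal, but one that leaves the page half-changed is broken.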

3. Lack of quality assurance

Not all visitors use Chrome, and there is a range of devices and browsers, each with varying versions. What works in Internet Explorer 11 may not work in Internet Explorer 10.

Solution: Create a report from your analytics software (Google Analytics or Adobe Analytics) detailing which device and browser combinations are the most used. Follow a QA checklist in these popular browsers to ensure the experiments work as expected before going live. For older browsers, or those you are unable to check, exclude those visitors from ever seeing the test.
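One way to implement that exclusion is a small eligibility check against the browser and version combinations your QA checklist has actually covered. The allowlist below is purely illustrative, not a recommendation:

```javascript
// Sketch: only show the test to browser/version combinations that passed QA.
// The minimum versions here are placeholders for your own QA'd list.
const qaApproved = {
  chrome: 100,   // lowest major version covered by the QA checklist
  firefox: 100,
  safari: 14,
  edge: 100,
};

function isEligibleForTest(browser, majorVersion) {
  const minVersion = qaApproved[browser.toLowerCase()];
  // Unknown or too-old browsers are excluded rather than risked.
  return minVersion !== undefined && majorVersion >= minVersion;
}
```

Visitors who fail the check simply see the control experience, so an untested browser can never surface a broken variation.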

4. Inaccurate data

If you can't measure it, you can't improve it. You need to collect and trust the data. Running an experiment without getting the necessary data removes your ability to assess the impact of any changes made.

Solution: Integrate with your analytics platform to allow segmentation and filtering of reporting based on the experiment variation seen. Ensure your data set only includes those who saw your test. Run results for the main KPI through a statistical significance calculator to ensure the results are reliable for detailed analysis.
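The calculation behind most significance calculators for conversion rates is a two-proportion z-test. A minimal sketch (two-tailed, using a standard normal-CDF approximation; the 0.05 threshold is the conventional choice, not a requirement) looks like this:

```javascript
// Sketch of a two-tailed two-proportion z-test: did variation B convert
// at a genuinely different rate from control A?
function zTest(convA, totalA, convB, totalB) {
  const pA = convA / totalA;
  const pB = convB / totalB;
  const pPool = (convA + convB) / (totalA + totalB);
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / totalA + 1 / totalB));
  const z = (pB - pA) / se;
  const pValue = 2 * (1 - normalCdf(Math.abs(z)));
  return { z, pValue, significant: pValue < 0.05 };
}

// Standard normal CDF via the Abramowitz & Stegun 26.2.17 approximation
// (accurate to ~1e-7 for x >= 0, which is all zTest needs).
function normalCdf(x) {
  const t = 1 / (1 + 0.2316419 * x);
  const d = 0.3989423 * Math.exp(-x * x / 2);
  const tail = d * t * (0.3193815 + t * (-0.3565638 +
    t * (1.781478 + t * (-1.821256 + t * 1.330274))));
  return 1 - tail;
}

// e.g. 100/1000 conversions on control vs 150/1000 on the variation
const result = zTest(100, 1000, 150, 1000);
```

Running the numbers through a check like this, or a trusted calculator, stops you from calling a winner on noise.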


It is easy to become risk averse and cut back on optimisation. Instead, you should increase testing velocity and expand the number of stakeholders involved.

Analyse, test, and learn. A data-driven and customer-focused approach is key to improving business performance.