A first in analytics history

For many years, the continuous testing of analytics implementations has been a great challenge for analytics experts around the world. The most common way to test an implementation is to run a test scenario and check whether the right data are measured for that scenario. For instance, you would place a test order to check whether the chosen products and their prices are measured, by looking at the data on that specific page. If your findings meet the predefined specifications, you assume that the implementation sends a correct measurement in all situations.
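Such a scenario test boils down to comparing a captured analytics payload against a predefined specification. A minimal sketch, assuming a stubbed payload stands in for whatever hit your analytics tool actually receives (the field names here are illustrative, not from any real tool):

```python
def check_order_measurement(measured: dict, expected: dict) -> bool:
    """Return True if every field in the specification was measured correctly."""
    return all(measured.get(key) == value for key, value in expected.items())

# Stubbed payload a test order might send to the analytics tool.
measured_hit = {"product": "SKU-123", "price": 19.95, "quantity": 1}
expected_spec = {"product": "SKU-123", "price": 19.95, "quantity": 1}

print(check_order_measurement(measured_hit, expected_spec))  # → True
```

The point of the article is that passing this one check says nothing about scenarios you did not run.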

Multiple products

However, that assumption is not necessarily correct. It could just as well be that the implementation is correct when someone orders only one product, but that an order of multiple products produces a false measurement. Many marketers with extensive experience in analytics will confirm that the chance of such an error is quite high. So if you only test a single-product order, a faulty implementation may go live. Of course, this error will become clear over time and get fixed, but you would still have a long period of incorrect data.
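The failure mode described above can be made concrete with a hypothetical sketch: a tracking function that happens to work for single-product orders but silently drops items when an order contains more than one product. A single-product test scenario cannot tell the buggy and correct versions apart:

```python
def build_hit_buggy(order):
    # Bug: only the first product in the order is measured.
    return {"products": [order[0]["sku"]], "revenue": order[0]["price"]}

def build_hit_correct(order):
    # Correct: every product and the full revenue are measured.
    return {"products": [p["sku"] for p in order],
            "revenue": sum(p["price"] for p in order)}

single = [{"sku": "A", "price": 10.0}]
multi = [{"sku": "A", "price": 10.0}, {"sku": "B", "price": 5.0}]

# The single-product scenario passes for both implementations:
print(build_hit_buggy(single) == build_hit_correct(single))  # → True
# Only a multi-product scenario exposes the bug:
print(build_hit_buggy(multi) == build_hit_correct(multi))    # → False
```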

Many scenarios

So is the solution to perform many test scenarios with variations in products, browsers, pages and so on? You could do so and try to cover all frequently occurring issues. But even with multiple testing tools, there is no other way than performing all scenarios one by one. You would face a very time-consuming process that would take approximately two or three times as long as the actual implementation.


Luckily, there are applications called crawlers, which can open multiple pages and review the data being sent. They present you with a quick overview of a large number of pages and the data they send to your analytics tools. They do, however, have one major disadvantage: they cannot access the most important parts of the website, such as forms and payment pages, at least not without additional effort, because they cannot fill in forms by themselves. And those are exactly the pages you really want to measure and get correct data from!
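A toy sketch of what a crawler does, and where it stops. Real crawlers fetch pages over HTTP; here an in-memory "site" stands in for the network, and the `track(...)` payload format is an assumption for illustration. The crawler extracts the analytics call from each page it can reach, but the checkout page, which sits behind a form, yields nothing:

```python
import re

# In-memory stand-in for a website; keys are URLs, values are page HTML.
site = {
    "/home": '<script>track({"page": "/home"})</script>',
    "/product": '<script>track({"page": "/product", "price": 19.95})</script>',
    "/checkout": '<form action="/pay">...</form>',  # only reachable by submitting a form
}

def crawl(pages):
    """Extract the analytics payload from every page, or None if no hit is found."""
    report = {}
    for url, html in pages.items():
        match = re.search(r"track\((\{.*?\})\)", html)
        report[url] = match.group(1) if match else None
    return report

for url, payload in crawl(site).items():
    print(url, "->", payload)  # "/checkout" reports None: the crawler is blind there
```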

A welcome solution

Neither scenario tests nor crawlers can ensure that all data sent to your analytics tools are correct. That is why a solution like Qmon is crucial for monitoring the quality of your data: for the first time in analytics history, it is possible to continuously test 100% of the data being sent to your web analytics tool.
