Top 20 reasons why people misinterpret data and reports

The following are the top reasons why most people misinterpret analytics data and reports:

#1 Lack of context

Different people analyse and interpret the same data or chart differently. It all depends upon the context in which they analyse and interpret it. The better you understand the context, the more accurately you will interpret the data.

Your odds of interpreting analytics data accurately are far better if you know the context beforehand. It is only by gaining a deep understanding of your business that you can really know the context in which you should analyse and interpret the analytics data.

To get the highest possible return on your analytics, you need to know the context beforehand. Otherwise, you may end up wasting a lot of your time doing meaningless analysis. 

Optimisers who do not understand their business well do not understand the context in which to analyse and interpret the data. Hence they suffer from what I call directional issues.

Directional issues are the inability to move in the right direction, at the right time. They are the inability to determine:

  • What data needs to be collected and analysed, and when.
  • What data to look at.
  • What data should be overlooked.
  • Where to look in any analytics reports.
  • How to translate business objectives into measurable goals.

Just because you have access to data, it does not automatically mean that you should go ahead and analyse it. The cornerstone of every successful analysis is ‘moving in the right direction’. 

The direction in which your analysis will move will determine the direction in which your marketing campaigns and eventually your company will move in order to get the highest possible return on investment. 

In order to find the right direction, you need to know the context in which you should analyse and interpret the analytics data. To know the context, you need to acquire a great understanding of your business, industry, target market, competition, and business objectives. 

If you do not have that understanding, or if you do not know the context before you start analysing and interpreting analytics reports, you are already moving in the wrong direction. That direction will almost always deliver sub-optimal results for your company.

#2 Not understanding the intent

You will not be able to recognise cognitive biases (like selection bias, confirmation bias, attribution bias, etc.) in data interpretation and reporting if you do not understand the intent behind a piece of data, information, or visualization.

For example, why is an agency using a 28-day click attribution window and not a 7-day one? Why are the marketing reports dominated by view-through conversions? What is the intent behind these choices?

#3 Attribution Bias

Attribution bias occurs when you make a snap assumption, judgement, or decision based on limited information and limited clarity about context and intent.

Jumping to conclusions, quickly providing recommendations, judging a book by its cover, and fighting for a cause you know little about are all examples of attribution bias. The way to fight attribution bias is to ask questions and acquire deep knowledge and understanding.

#4 Not understanding statistical significance

Statistical significance means statistically meaningful.

A statistically significant result is one that is unlikely to have occurred by chance, whereas a statistically insignificant result is one that is likely to have occurred by chance. The term ‘statistical significance’ is used a lot in conversion optimization, especially in A/B testing. If the result of your A/B test is not statistically significant, any uplift you see in the test results is unlikely to translate into increased sales.

When you do not understand statistical significance, you often end up drawing conclusions from a sample that is too small to be statistically significant.

Then you could be making decisions like the one below:

“Campaign A has got 10 conversions, Campaign B has got 20 conversions. Therefore Campaign B is performing better than Campaign A. We should invest more in Campaign B.”
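To make this concrete, here is a minimal sketch of a two-proportion z-test for the campaign figures above, using only the Python standard library. The visit counts (1,000 and 1,200) are hypothetical assumptions added purely so the calculation can be run.

```python
# Minimal two-proportion z-test sketch (standard library only).
# Conversion counts come from the example above; the visit counts are
# hypothetical, added only to make the calculation possible.
from statistics import NormalDist

conv_a, visits_a = 10, 1000   # Campaign A: 10 conversions (assumed 1,000 visits)
conv_b, visits_b = 20, 1200   # Campaign B: 20 conversions (assumed 1,200 visits)

rate_a = conv_a / visits_a
rate_b = conv_b / visits_b

# Pooled conversion rate and standard error under the null hypothesis
# that both campaigns convert at the same underlying rate.
pooled = (conv_a + conv_b) / (visits_a + visits_b)
se = (pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b)) ** 0.5

z = (rate_b - rate_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-tailed

print(f"Campaign A: {rate_a:.2%}, Campaign B: {rate_b:.2%}")
print(f"z = {z:.2f}, p-value = {p_value:.3f}")  # ~0.18 here, well above 0.05
```

With a p-value of roughly 0.18, the ‘Campaign B is better’ conclusion has no statistical backing; the gap could easily be noise.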

#5 Simpson’s paradox

Simpson’s paradox occurs when a trend that appears in several different groups of data disappears or reverses when the groups are combined. It happens when you cannot see the forest for the trees, when you cannot see the big picture.
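Here is a toy numeric illustration in plain Python. The segment and campaign figures are made up for this sketch, but they show the mechanics: Campaign A wins in every segment, yet Campaign B looks better once the segments are pooled.

```python
# Toy illustration of Simpson's paradox with made-up campaign data:
# Campaign A beats Campaign B in every segment, but B looks better
# once the segments are combined.
data = {
    "mobile":  {"A": (50, 1000), "B": (8, 200)},    # (conversions, visits)
    "desktop": {"A": (20, 200),  "B": (80, 1000)},
}

totals = {"A": [0, 0], "B": [0, 0]}
for segment, campaigns in data.items():
    rates = {}
    for name, (conversions, visits) in campaigns.items():
        rates[name] = conversions / visits
        totals[name][0] += conversions
        totals[name][1] += visits
    print(f"{segment:8}: A = {rates['A']:.1%}, B = {rates['B']:.1%}")

combined = {name: conv / visits for name, (conv, visits) in totals.items()}
print(f"combined: A = {combined['A']:.1%}, B = {combined['B']:.1%}")
# mobile  : A = 5.0%,  B = 4.0%   -> A wins
# desktop : A = 10.0%, B = 8.0%   -> A wins
# combined: A = 5.8%,  B = 7.3%   -> B appears to win
```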

Simpson’s paradox also occurs when you lack a multi-disciplinary approach to solving a business problem. As a result, you cannot make unexpected connections between different disciplines (like marketing and analytics), and all of your recommendations seem biased, siloed, and not very useful.

#6 Causal reductionism (causal oversimplification)

Causal reductionism means reducing an outcome to a single, simple cause. For example: X resulted in Y, therefore X is the only cause of Y.

Other examples:

“Our sales never increased despite posting on LinkedIn for months”.

But what if your offer wasn’t good, you never pitched on LinkedIn, or your posts never aligned with your business goals?

“Our sales increased by 20% ever since we started advertising on Facebook”

But what if the 20% increase in sales is mainly because of the 10% increase in organic search traffic and a 2% decline in traffic from paid search?

“Campaign A generated 100 orders. Therefore Campaign A was the only cause of the 100 orders.”

But what if Campaign A got assistance in sales from Campaign C and Campaign D?

Whenever you assume that there is a single, simple cause of an outcome, you are falling prey to ‘causal reductionism’ (the fallacy of the single cause).

This fallacy makes it difficult for you to understand the true customer purchase journey and attribute conversions correctly. Analytics reports are not ‘what you see is what you get’; they are ‘what you interpret is what you get’. There is almost always more than one cause of an outcome.

Failing to recognise causal reductionism is the root cause of most of the problems around attribution.

#7 Dunning-Kruger effect

It is a type of cognitive bias in which people believe that they are smarter and more capable than they really are. 

This results in not asking questions (or not asking enough of them), making far more assumptions, jumping to conclusions, talking in absolutes, taking on the entire burden of proof, and so on.

The Dunning-Kruger effect is a major problem for many people who have accumulated considerable work experience over the years and who now see themselves as subject matter experts.

They fear that asking too many questions will undermine their professional credibility and make them look clueless. So they either don’t ask questions or don’t ask enough of them.

But here is the thing: you can never be completely sure about someone else’s business. You do not know everything that is going on inside their company, so you should not be 100% sure that what you are recommending is going to work.

#8 Streetlight effect

The streetlight effect is a type of observational bias in which you get your information from wherever it is easiest to look.

For example, 

  • looking only at the top 10 results in your analytics reports
  • always looking at the same set of dashboards
  • not segmenting the data enough
  • relying only on standard reports

#9 Confirmation bias

Confirmation bias occurs when you actively seek information and patterns in data that confirms your existing theories or beliefs.

As a result, you may start seeing a pattern in the data that does not really exist, or you test a hypothesis under the assumption that it is true, or you overemphasise the data that confirms your existing beliefs.

#10 Selection bias

Selection bias occurs when you select a sample that is not representative of the data as a whole. It often results in drawing the wrong conclusions. Selection bias frequently skews A/B test results and creates imaginary lifts in sales.
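As a rough sketch of how this plays out, the simulation below (all numbers invented) estimates a site-wide conversion rate from a sample that happens to contain only returning visitors, who convert at a much higher rate than the site as a whole:

```python
# Selection bias sketch with invented numbers: estimating a site-wide
# conversion rate from a sample of returning visitors only.
import random

random.seed(42)

# Hypothetical population: 80% new visitors (~1% conversion rate),
# 20% returning visitors (~5% conversion rate).
population = (
    [("new", random.random() < 0.01) for _ in range(80_000)]
    + [("returning", random.random() < 0.05) for _ in range(20_000)]
)

true_rate = sum(converted for _, converted in population) / len(population)

# Biased sample: only returning visitors make it into the analysis.
biased = [converted for visitor_type, converted in population if visitor_type == "returning"]
biased_rate = sum(biased) / len(biased)

print(f"true site-wide conversion rate:   {true_rate:.2%}")    # ~1.8%
print(f"biased estimate (returning only): {biased_rate:.2%}")  # ~5.0%
```

Any ‘lift’ measured against the biased estimate would be imaginary, which is exactly how selection bias inflates A/B test results.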

#11 Observational bias

Observational bias occurs when you take things at face value. It often results in observational errors. For example, campaign A has a 5% conversion rate and campaign B has a 10% conversion rate.

You could assume, on the basis of that observation alone, that campaign B converts better. But there is a good possibility that the difference in conversion rates is not statistically significant.

#12 Cumulative error

A cumulative error is an error that grows as the data sample, and the analysis built on it, grows. One wrong conclusion can lead to several more wrong conclusions, which could eventually make your entire analysis flawed.

#13 Relying on anecdotal evidence

When you rely on anecdotal evidence to make decisions or to further your research, it can result in a flawed test design, misinterpretation of data, misallocation of resources, and even significant monetary loss.

For example, suppose you doubled the conversion rate for two of your clients just by changing the colour of their CTA button to red. Based on this anecdotal evidence, you could conclude that changing the CTA colour to red is a best practice that is likely to increase the conversion rate.

You need to remember that one, or even a few (possibly isolated), examples cannot stand alone as definitive proof of a greater premise.

#14 Not understanding the Pareto principle (80/20 rule)

According to the Pareto principle, 80% of the outcomes come from 20% of the inputs. In other words, only about 20% of the data available to you is important enough to produce 80% of the outcome you want. The rest is mostly noise.

Optimisers often focus on analysing the 80% of the data that produces only 20% of the results. This happens because they are unable to conduct a highly focused and meaningful analysis from the very start.
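A small sketch of what a Pareto-style triage can look like in practice. The campaign names and revenue figures below are hypothetical; the idea is simply to rank items by contribution and stop once you have covered roughly 80% of the total, because that is the slice of the data worth a deep analysis.

```python
# Rank campaigns by revenue and find the smallest subset that accounts
# for roughly 80% of the total. All figures are hypothetical.
revenue_by_campaign = {
    "brand_search": 52_000,
    "retargeting": 31_000,
    "prospecting_social": 9_000,
    "display": 4_500,
    "newsletter": 2_500,
    "affiliates": 1_000,
}

total = sum(revenue_by_campaign.values())
running, top_contributors = 0, []

for campaign, revenue in sorted(revenue_by_campaign.items(),
                                key=lambda item: item[1], reverse=True):
    running += revenue
    top_contributors.append(campaign)
    if running / total >= 0.80:
        break

print(f"{len(top_contributors)} of {len(revenue_by_campaign)} campaigns "
      f"drive {running / total:.0%} of revenue: {top_contributors}")
# 2 of 6 campaigns drive 83% of revenue: ['brand_search', 'retargeting']
```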

#18 Woozle effect

The Woozle effect is evidence by citation. It occurs when a claim that lacks evidence comes to be treated as true over time simply because it is cited frequently.

For example, a blog named ‘A’ claims that a very high bounce rate is bad for your website, without citing any real evidence. Over time, 10 other blogs make the same claim, citing blog A. Eventually, the claim that a very high bounce rate is bad comes to be treated as a fact in your industry.

So when you look at your own reports and see a very high bounce rate, your default conclusion could be that it is bad. Be wary of claims presented as facts. Do your own research and draw your own conclusions. Do not rely on case studies for data interpretation or for making business and marketing decisions.

#19 Correlation Causation fallacy

Just because two things appear to be correlated, it does not necessarily mean that one caused the other.

For example, 

Your client: “Our website traffic went down by 50% last week. We also switched to enhanced ecommerce the same week” 

Now if you draw the following conclusion then you are likely falling prey to the correlation causation fallacy:

“Their website traffic went down by 50% last week because of switching to enhanced ecommerce tracking.”

You can always find some relationship between two variables/events if you really want to. However, the mere presence of a relationship between two variables doesn’t imply that one causes the other.

In other words, correlation doesn’t imply causation.
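A quick sketch of how easily a strong correlation can appear without any causal link. The weekly figures below are made up; both metrics simply scale with overall traffic, so they correlate almost perfectly with each other even though neither causes the other. (statistics.correlation requires Python 3.10 or later.)

```python
# Two metrics that both grow with overall traffic correlate strongly
# even though neither causes the other. All figures are made up.
from statistics import correlation  # Python 3.10+

weeks = range(1, 11)
traffic = [10_000 + 1_500 * week for week in weeks]      # common driver
email_signups = [round(t * 0.020) for t in traffic]      # scales with traffic
support_tickets = [round(t * 0.004) for t in traffic]    # also scales with traffic

r = correlation(email_signups, support_tickets)
print(f"Pearson r between signups and support tickets: {r:.3f}")  # ~1.000
# The near-perfect correlation does not mean signups cause support
# tickets (or vice versa); both are driven by the same traffic trend.
```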

#20 Not understanding the maths and stats behind web analytics and conversion optimization

This reason should probably be near the top of this list, but I have deliberately put it last in the hope of making you realise how important it is to understand the maths and stats behind web analytics.

Analysing data without a basic understanding of maths and statistics will most likely result in drawing the wrong conclusions and losing money.

One of the best ways to interpret data accurately is by correctly using maths and statistics. Learning maths and statistics is an excellent way to develop your logical and critical thinking. It makes you a better marketer and, of course, a better analyst. 

The knowledge of maths and statistics will help you interpret data accurately and quickly spot anomalies in it.

For example, if you see that a reported conversion rate is based on a very small data sample, you know instantly that it is unlikely to be statistically significant (i.e. statistically meaningful).

Google Analytics reports are full of averages, and if you do not know how averages work, you can easily misinterpret them. You may then draw below-average, or even poor, insights from your GA reports.

For example, one of the most misunderstood ratio metrics is the conversion rate. Because of poor statistics skills, many optimisers have no idea that the conversion rate can also negatively correlate with sales and profit. 

They think that the conversion rate always positively correlates with sales and profit (i.e. as the conversion rate increases, sales always increase and so does profit). However, this is not always true.
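A tiny worked example with hypothetical numbers shows how this can happen: traffic drops, the conversion rate rises, and sales and revenue still fall.

```python
# Hypothetical two-month comparison: the conversion rate goes up while
# orders and revenue go down, because traffic fell sharply.
months = {
    "January":  {"visits": 10_000, "orders": 200},
    "February": {"visits": 5_000,  "orders": 150},
}
avg_order_value = 80  # assumed constant for simplicity

for month, figures in months.items():
    conversion_rate = figures["orders"] / figures["visits"]
    revenue = figures["orders"] * avg_order_value
    print(f"{month}: conversion rate {conversion_rate:.1%}, "
          f"orders {figures['orders']}, revenue ${revenue:,}")

# January:  conversion rate 2.0%, orders 200, revenue $16,000
# February: conversion rate 3.0%, orders 150, revenue $12,000
# The conversion rate rose by 50% while orders and revenue fell by 25%.
```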

No matter how bad your analytics data is, not using the correct maths and statistics will only make it worse.

Here are a few questions for your consideration:

Q1. When your website conversion rate jumps from 10% to 12%, is this a 2% rise in conversion rate or a 20% rise in conversion rate?

Q2. Can you double your sales by simply doubling your marketing budget?

Q3. If the average time on your website is ten minutes, does that mean visitors actually spent ten minutes on the site, on average?

Q4. If the campaign A conversion rate is 1% and the campaign B conversion rate is 5%, does that mean campaign B is performing better than campaign A?

The corporate world is not very forgiving of mistakes made by optimizers. If we report that the jump in conversion rate from 10% to 12% is a 2% rise in conversion rate, our entire analysis becomes questionable.

We instantly cast a shadow over the rest of our analysis. The thought that will pop up in the mind of the recipient of our report is, “what else have they done wrong?”
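For Q1, the arithmetic works out like this:

```python
# Q1: the conversion rate jumps from 10% to 12%.
old_rate, new_rate = 0.10, 0.12

absolute_change = (new_rate - old_rate) * 100          # in percentage points
relative_change = (new_rate - old_rate) / old_rate     # relative to the old rate

print(f"absolute change: {absolute_change:.0f} percentage points")  # 2 points
print(f"relative change: {relative_change:.0%}")                    # a 20% rise
# The jump is a rise of 2 percentage points, which is a 20% rise in
# the conversion rate, not a 2% rise.
```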

The role of maths and statistics in the world of web analytics is not clear to many optimisers. Not many talk or write about the usage of maths and statistics in conversion optimisation. 

That is why I have written an entire book, ‘Maths and Stats for Web Analytics and Conversion Optimization’, to fill this knowledge gap.

This expert guide will teach you how to leverage the knowledge of maths and statistics to accurately interpret analytics data and reports. 
