Fake News & Fake Views

Everyone agrees that marketing decisions are only as good as the data behind them. This is why it is essential to have clean and reliable data flowing through your Google Analytics and into your spreadsheets. So what can we do about the fake website views and false ad clicks that are disrupting our data?

Joe Hopkins
Digital Marketing Executive

so what’s the deal?

'What?' you may be asking. 'Fake views!' you may be exclaaiming. And worse, 'false clicks!' - which, as you may realise, are the economic equivalent of burning money.

Since the dawn of the internet, scripts, spiders and crawlers have been running around the digital world like ghosts in the Matrix.

Many of them are harmless and do more good than harm. However, some are programmed to disrupt and interfere with programs and data. It should come as no surprise to any digital marketer that Google Analytics data and Google Ads data are not 100% reliable. Have you ever tried to get your conversions and your goals to align? There will always be some discrepancies and statistics that don't line up.

So which one is more reliable and which one can we trust? How do we know we can believe the data we are being fed? I didn’t see anyone visit my website last month… How do I know Google didn’t make the whole thing up and charge me? And on top of all of this...

what are bots?

A bot (short for robot, obviously) is a software application programmed to run automated tasks online - and there are billions of them! Distilnetworks.com published a report (awesomely) titled the 2019 Bad Bot Report, which claimed that 37.9% of all online traffic in 2018 came from bots. As a consequence, you may be wasting your marketing efforts analysing pages' worth of data when almost 40% of that data is bogus!

Some of these little critters do good things though - search engine spiders and chatbots, for example, help to make the digital world go round. These bots are responsible for keeping Google updated with your website content and rewarding well-optimised pages with higher quality scores and better organic results. But others, named (yep, you guessed it) bad bots, have malicious intentions.

OK, so they may not sound that scary, but they can cause a lot of damage - they can be used for political manipulation, for example, and to generate fake website views and clicks. These nuisances need to be identified and controlled for website creators and digital marketers to remain confident that they are engaging with the correct audiences.

how can bots help your marketing strategy grow?

As mentioned, there are a few good bots out there, the most notable being Googlebot. A Googlebot - a.k.a. a robot, a web crawler, a spider or a user agent (like in The Matrix again!) - is a bot that crawls webpages and scrapes their content for Google's index.

In traditional SEO there are over 200 ranking factors to pay attention to, but the big ones revolve around user experience. Website designers have to find a balance between satisfying human users and satisfying Googlebot.

You can keep these bots happy by making sure your internal links, sitemaps and fetch requests are up to date and in the correct format. These bots also crawl through Google My Business listings, directories and links from other websites to get a broad view of your site. Oh, and keep your sitemap itself up to date and resubmit it through Google Search Console whenever your pages change.
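For anyone curious what a valid sitemap actually looks like, here is a minimal sketch (in Python, with placeholder URLs) that writes a basic sitemap.xml in the standard sitemaps.org format:

```python
# Minimal sketch: write a basic sitemap.xml in the sitemaps.org format.
# The URLs below are placeholders - swap in your own pages.
from datetime import date

pages = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
    "https://www.example.com/contact/",
]

entries = "\n".join(
    f"  <url>\n    <loc>{url}</loc>\n    <lastmod>{date.today().isoformat()}</lastmod>\n  </url>"
    for url in pages
)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>"
)

with open("sitemap.xml", "w") as f:
    f.write(sitemap)
```

Point Google Search Console at the resulting file and Googlebot should pick up new or changed pages much faster.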

SEO professionals also use bots to analyse market trends, optimise performance and uncover technical SEO issues - making for better SEO practices, excellent user experiences and improved overall quality scores. In order to be indexed and returned in search engine results, a site needs to be crawled accurately, so an SEO webmaster ought to put due time and attention into optimising for Googlebot.

identifying and separating humans and programs

So onto the real question - how can we identify these digital pests and stop them clicking around our sites and more importantly interfering with our data?

This is a difficult question to answer - where do we begin in telling the difference between a person clicking around our site and a bot? A well-developed bot can behave identically to a human when it comes to viewing and exploring a site - they even scroll down pages and view items in a store! These programs are built to imitate the way humans engage with a website, so it can get tricky telling one from the other - just like in some kind of post-apocalyptic sci-fi trilogy about a war between machines and humans.

As Kenneth Colby, an American psychiatrist who dedicated his life to the theory and application of computer science to psychiatry, said:

"Before there were computers, we could distinguish persons from non-persons on the basis of an ability to participate in conversations. But now, we have hybrids operating between person and non persons with whom we can talk in ordinary language."

So, simply put, it is near impossible to tell a well-built bot from a human on the basis of browsing behaviour alone.

Thankfully, CAPTCHAs and login requirements are reliable ways to keep most bots out - or at least to stop them signing up for newsletters with fake email addresses!

Adding a CAPTCHA to a website is not as difficult as it sounds. reCAPTCHA is the most widely used and recognised version; it was acquired by Google in 2009, has been refined ever since, and now appears across all kinds of sites.
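Under the hood, checking a reCAPTCHA response on your server comes down to a single POST to Google's siteverify endpoint. Here is a minimal sketch in Python (the secret key and token variable are placeholders you would swap for your own values):

```python
# Sketch: verify a reCAPTCHA token on the server after a form is submitted.
# RECAPTCHA_SECRET is the secret key from the reCAPTCHA admin console (placeholder here).
import requests

RECAPTCHA_SECRET = "your-secret-key"

def is_human(recaptcha_token, client_ip=None):
    # The token is what the widget posts back as 'g-recaptcha-response'.
    payload = {"secret": RECAPTCHA_SECRET, "response": recaptcha_token}
    if client_ip:
        payload["remoteip"] = client_ip  # optional extra signal
    resp = requests.post(
        "https://www.google.com/recaptcha/api/siteverify",
        data=payload,
        timeout=5,
    )
    return resp.json().get("success", False)
```

If the check fails, simply reject the form submission - the fake newsletter sign-ups stop there.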

If you are still concerned about bot traffic, you can also subscribe to the IAB/ABC International Spiders and Bots List, which gives you access to industry-maintained bot detection and filtering lists. This may be overkill for smaller sites, and it won't make your website immune to bots, but it will filter most known non-human traffic out of your reporting.
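The licensed IAB/ABC list itself can't be reproduced here, but the idea behind list-based filtering is simple: drop any hit whose user agent matches a known bot signature. A rough sketch, using a small illustrative sample of signatures rather than the real list:

```python
# Sketch: drop hits whose user agent matches a known bot signature.
# The signatures below are a tiny illustrative sample, not the licensed IAB/ABC list.
BOT_SIGNATURES = ["googlebot", "bingbot", "ahrefsbot", "semrushbot", "python-requests", "curl"]

def is_bot(user_agent):
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

# Example hits - in practice these would come from your own logs.
hits = [
    {"ip": "203.0.113.7", "ua": "Mozilla/5.0 (compatible; Googlebot/2.1)"},
    {"ip": "198.51.100.4", "ua": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"},
]

human_hits = [h for h in hits if not is_bot(h["ua"])]
print(human_hits)
```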

stopping data disruption

Google Analytics has a handy little checkbox in the view settings ('Exclude all hits from known bots and spiders'), which filters the bots and spiders Google recognises out of your data. This is exceptionally helpful when reviewing data in Google Analytics and wanting a realistic view of impressions, clicks and page visits.

You can also identify suspicious traffic through repeat visits or outdated browsers and set up a view-level filter to exclude the corresponding IP addresses. (The 'Referral Exclusion List' under Property > Tracking Info, by contrast, is for stopping unwanted referral domains - such as your own payment pages - from skewing the same reports.)
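If you want to build your own IP exclusion list, a rough sketch like the one below can surface candidates from a standard web server access log - the log path, hit threshold and 'outdated browser' markers are all assumptions to tune for your own traffic:

```python
# Sketch: flag IPs with an unusually high hit count or an outdated browser string,
# as candidates for a Google Analytics exclude filter. Path and threshold are placeholders.
import re
from collections import Counter

LOG_PATH = "access.log"              # combined log format assumed
HIT_THRESHOLD = 500                  # hits per log period we treat as suspicious
OUTDATED = ["MSIE 6.0", "MSIE 7.0"]  # example outdated browser markers

# Capture the client IP and the quoted user-agent field from each log line.
line_re = re.compile(r'^(\S+) .*?"[^"]*" \d+ \S+ "[^"]*" "([^"]*)"')

hits = Counter()
flagged = set()

with open(LOG_PATH) as f:
    for line in f:
        m = line_re.match(line)
        if not m:
            continue
        ip, user_agent = m.group(1), m.group(2)
        hits[ip] += 1
        if any(marker in user_agent for marker in OUTDATED):
            flagged.add(ip)

flagged.update(ip for ip, count in hits.items() if count > HIT_THRESHOLD)
print(sorted(flagged))
```

The printed IPs are only candidates - sanity-check them before adding an exclude filter, since offices and proxies can also generate heavy repeat traffic.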

in conclusion 

We will never truly be able to track all of our website traffic with 100% accuracy or identify every repeat visitor, thanks to people blocking cookies, visitors switching between multiple devices and ever-advancing bots. However, by setting up a Google Analytics account correctly we can measure over 95% of genuine visitors and gain a clearer insight into our Google data.
