I suggested a while back that Google could use clickstream – user tracking data – to modify SERPs.
I now believe Google is already applying this data by tracking how long a user stays on a site – and whether they click back to the results because the site is irrelevant, or quickly click out via another link.
The suggestion was made to me by Per Gustafsson, a Swedish webmaster, who had also seen the topic discussed at a recent SEO event in Sweden.
It remained very interesting speculation – after all, we know Google has had a problem with MFA (made-for-Adsense) sites in the past.
So what if Google were using its extensive collection of data – tracked via Google Search clicks – together with Adsense stats on bounce rates, time on site, and time to exit, combining the two streams to evaluate whether a site is really relevant or not?
Processing this data algorithmically and applying it as a modifier on ranking factors would certainly be a potentially efficient and clever way to help reduce the presence of sites delivering a poor user experience.
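To make the idea concrete, here is a minimal sketch of what such a modifier could look like. To be clear, this is pure guesswork on my part: the signal names, the weights, and the blending formula are all assumptions for illustration, not anything Google has confirmed.

```python
# Hypothetical sketch only -- Google's actual signals and weights are unknown.
# Assumes per-site aggregates are available: SERP click-backs (bounces),
# average dwell time, and visits that exited via an ad (Adsense click-outs).

def quality_modifier(clicks, quick_backs, avg_dwell_secs, ad_clickouts,
                     dwell_target=60.0):
    """Return a multiplier in (0, 1] applied to a site's base ranking score.

    quick_backs : searches where the user quickly returned to the SERP
    ad_clickouts: visits that exited via an ad rather than site content
    dwell_target: seconds of dwell time treated as a "good" visit (assumed)
    """
    if clicks == 0:
        return 1.0  # no behavioural data: leave the ranking untouched
    bounce_rate = quick_backs / clicks            # SERP click-back signal
    exit_rate = ad_clickouts / clicks             # Adsense click-out signal
    dwell_score = min(avg_dwell_secs / dwell_target, 1.0)  # time-on-site signal
    # Blend the signals into a penalty; these weights are pure guesswork.
    penalty = 0.5 * bounce_rate + 0.3 * exit_rate + 0.2 * (1.0 - dwell_score)
    return max(1.0 - penalty, 0.1)  # floor, so no site is zeroed out entirely

# A site where 60% of searchers bounce, visits average 15 seconds,
# and 30% of visitors leave via an ad gets its score roughly halved.
base_score = 0.8
adjusted = base_score * quality_modifier(1000, 600, 15, 300)
```

The design point is that the behavioural data acts only as a modifier on existing ranking factors, which is exactly why it would be cheap to apply: no re-crawling or content analysis, just a per-site multiplier derived from logs Google already holds.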
I may also have just seen this in action looking at the stats for one of my sites.
Take a look at this:
Traffic is steady through the first week of December, rising slightly, then on Dec 8th there was a sudden surge of traffic.
The reason is that this is a tech site which includes a significant section on satellite broadcasting, and the surge was people looking for information on broadcasts of Ricky Hatton’s boxing match in Las Vegas.
The result wasn’t just a surge in traffic, but also a big surge in Adsense clicks, as those users clicked out through the ads rather than exploring the site.
The site certainly was never optimised for boxing fights, let alone Ricky Hatton, so the clickstream data would probably suggest the site was not relevant and offered a lower quality user experience.
So it is very interesting indeed to notice the obvious drop in average traffic to the site the following week, with unique visitor traffic falling from an average of around 4800/day to 4000/day – a drop of around 17%.
Obviously, without knowing how Google may or may not work with such data directly, we cannot draw any firm conclusions.
After all, the first week of traffic may itself have been inflated by viewers looking for broadcast information in advance of the fight.
However, it is also possible that a very visible clickstream effect is in play, driven by the site’s Adsense data.
It’s also difficult to determine average traffic for the site, as it was created by merging two other sites together about two months ago. Traffic has been increasing overall by around 20% anyway, and the lower stats actually show a 20% growth on last month.
The question is whether the 20% growth is natural – or stunted by clickstream data.
Here’s the lesson I am learning from this, though: if I am creating a site whose primary purpose is to move traffic to an external site, then I need to build useful content on the site from the start.
That way users can find what they need and then move off as required, rather than being bounced out too quickly.
This also serves as yet another warning shot across the bows of social media marketing: if you bring in traffic surges but do not gain decent links from doing so, and additionally push that traffic offsite via ads, then you could again invite Google to judge your site as low quality.
Obviously there’s a lot of speculation here, but the point isn’t about being right or wrong; it’s a warning about how to future-proof sites against being devalued in some way through the use of clickstream data.
In my opinion, a lot of working with Google successfully is centred around future-proofing strategies, so I am already going to revise my approach with this in mind.