Personalised Penguins and Pandas in Venice
Over the past year, we’ve seen more profound change in the way that Google ranks websites than I can remember. And irrespective of the reaction from the SEO community, all of these changes have been about improving the experience that individuals have with Google.
The first of these big updates, Panda, came last February, but you can trace its history back to Eric Schmidt’s famous comments in October 2008:
“Brands are the solution, not the problem. Brands are how you sort out the cesspool.”
Google Panda destroyed the idea of sprawling websites made up of cookie-cutter pages as a viable option for SEO rankings. Google introduced Panda to counter the spread of low-quality content that was detrimental to user experience. At the time, Panda was cast as an algorithm update that would raise the quality of content that people saw when they used Google to find information. The way Google introduced Panda, and advised webmasters what to expect, was telling:
- Would you trust the information presented in this article?
- How much quality control is done on content?
- Is the site a recognized authority on its topic?
- For a health related query, would you trust information from this site?
- Would you recognize this site as an authoritative source when mentioned by name?
- Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
Content farm websites, which built up massive numbers of thin pages by paying fractions of a penny per word, took a big hit. Deservedly.
More recently, and still in the same vein, we had Penguin. Penguin’s not about the quality of the page you’re looking at, but the reliability of the pages that link to it. The right response to Panda was to invest more in your user experience, improving the content on your website to increase the level of trust and authority that it held within a niche. Great content attracts links, because it’s the kind of thing that people are willing to “share with a friend”. But people can’t share content that they never find in the first place. The unintended consequence of Panda was to push low-quality content into an invisible tier of the internet, where it was used solely for building links.
Rather than fill their own website with thousands of pages of garbage, publishers created networks of content that they could reuse (spin) multiple times purely to build links. Spun content is content that has had individual words and phrases changed to differentiate it from the original version. It can be produced automatically, which makes it cheap, and it’s also a lot easier to spot than many people think. This approach worked pretty well, and until comparatively recently you would see many questionable websites ranking for big-money terms. This false authority creates yet another trust issue: if it’s easy to rank simply by using garbage content and posting thousands of links through a gigantic network, then users can’t really trust the search results. Here’s an example:
Starting with this:
Once upon a time there was a young SEO professional called James who joined an agency called Latitude to make a name for himself in the city.
You get this:
A long time ago there was a youthful Search Engine Optimisation specialist named James who entered a consultancy named Latitude to become well known in the city.
Which evolves into this:
A lengthy period in the past there was a spritely seek motor improvement doyen baptised James who inserted a troupe termed Latitude to develop legendary status in the conurbation.
Spun content is easy to spot because it stops making sense pretty quickly the more you adapt it. Google have a big database of language that makes sense, because they have a big database of content that demonstrates authority and is trusted by users. By comparing the garbage-level spun content found on a lot of sites against that database, it becomes easy to wipe those sites out.
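As a toy illustration of why this works, here’s a minimal sketch in Python. It uses a simple bigram-overlap heuristic (my own assumption for demonstration, not Google’s actual method): score a sentence by how many of its word pairs also occur in a reference sample of natural language. The heavily spun version of the sentence above scores far lower than the original:

```python
# Toy spin detector: score text by the fraction of its word bigrams
# (adjacent word pairs) that appear in a reference corpus of natural text.
# A stand-in for Google's "big database of language that makes sense".

def bigrams(text):
    words = text.lower().split()
    return set(zip(words, words[1:]))

# Hypothetical reference corpus, standing in for a large body of trusted text.
corpus = bigrams(
    "once upon a time there was a young seo professional called james "
    "a long time ago there was a specialist who joined an agency "
    "to make a name for himself in the city"
)

def naturalness(text):
    """Fraction of the text's bigrams that also occur in the corpus."""
    pairs = bigrams(text)
    return len(pairs & corpus) / len(pairs)

original = "Once upon a time there was a young SEO professional called James"
spun = "A lengthy period in the past there was a spritely seek motor improvement doyen"

print(naturalness(original) > naturalness(spun))  # → True: the spun copy scores far lower
```

The more aggressively the text is spun, the fewer of its word pairs ever appear in natural language, so each “evolution” of the sentence drops further down the scale.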
Penguin appears to be a filter that spots trash and discounts it and the links that come from it. It’s not been widely welcomed by a lot of people, because it appears to throw false positives a lot of the time – although as the filters get refined, it will get better.
Google’s been getting more and more personal over the past few years. You only have to look at your ad preferences page to get a feeling for what they know about you as a user. We’ve had personalised results based on location for a while, but the recent Venice Update changed things a lot. Rather than the older version of the localised results where you’d see a number of map listings inserted into the results page, these are now indistinguishable from the rest of the search results for selected queries.
Useful? Well, as noted above, Google have a lot of search history data from a lot of people. They measure interest in different search terms over time, and iterate their algorithm based on that. If people weren’t refining their searches based on local preferences (e.g. “taxi” vs “taxi Warrington”), then Google wouldn’t be able to assign a local bias to the results. Venice results are useful for users and give them better results when they search for a keyword where a local business provides a more appropriate answer.
In the past, a challenge like Venice would inevitably have elicited an SEO response of building a dedicated page to target every different neighbourhood in the country, but in a post-Panda/post-Penguin era, this isn’t an option.
The challenge for an SEO campaign is that Venice makes the traditional idea of rankings obsolete. Personalised results in general mean that different people will see Google in different ways, and that’s generally a good thing. As any advertiser will tell you, the more appropriate your audience is, the more likely they are to buy from you.
Venice is interesting because it genuinely is the final nail in the coffin of absolute rankings, as measured by a tool like AWR, as a metric of success. Ranking software isn’t everyone, and it isn’t everywhere. It’s a stateless piece of software sitting on a proxy that hides its location from the real world, and although it gives an indication of where things are heading for a website, the picture it paints becomes increasingly inaccurate as results become more focused on individuality, because it takes no notice of the social relationships and recommendations that people make to each other.
Penguins, Pandas, and romantic Italian Cities are all lovely fluffy things, and in many ways appropriate to the kind of social layer that Google is applying to their results. In a personalised world, the only way you can measure success is at the bottom line: more people buying more things. With Google’s recent changes and the changes to come, the only way to encourage that is to provide an amazing experience for your customers and be the company that they can trust.
Speaking of lovely fluffy things, here’s a picture of a search engine spider courtesy of my 3-year-old girl: