Musings On Google’s Next Algorithm Change Targeting Over-Optimized Sites


For those wondering what the next website-plummeting algorithm change might entail, Matt Cutts of Google gave a hint in his talk at SXSW in late March 2012. In that talk, he spoke of surfacing sites with great content whose owners may not have optimized them to the highest level possible. Given Google's goal of delivering the best possible user experience, this theme isn't new, but it may indicate that the search engine is improving the ways its bots process on-site content. What remains to be seen is how the bots will determine the highest-quality content from among a myriad of competing pages.


Targeting "over-optimized" sites may also mean that some on-page SEO techniques will be de-emphasized: stacking keyword-loaded links in page footers, high keyword densities in content, keyword-stuffed title tags, and so on. Pages that combine over-optimization of these techniques with content that the new and improved bots judge to be low quality would likely suffer big hits on the search engine results pages.
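To make the keyword-density idea concrete, here is a minimal sketch of how a crawler might measure what share of a page's words a single keyword accounts for. This is purely illustrative; the function name, the tokenization, and the sample thresholds are assumptions, not anything Google has published about its own signals.

```python
import re

def keyword_density(text, keyword):
    """Return the fraction of words in `text` that match `keyword`
    (case-insensitive). Illustrative only -- not Google's actual metric."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A caricature of keyword-stuffed copy:
page = ("Buy cheap widgets! Our cheap widgets are the best cheap "
        "widgets online. Cheap widgets shipped fast.")

density = keyword_density(page, "cheap")
# "cheap" is 4 of the 16 words here (25%) -- an obvious red flag,
# since naturally written copy rarely repeats one term that heavily.
```

Real ranking systems would weigh many signals together, but even a crude ratio like this makes mechanically repeated keywords easy to flag.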


To be sure, the biggest hits will be experienced by sites using abusive practices to over-optimize their pages. Google has always reserved its highest level of wrath for those it determines are gaming or manipulating search results for undeserved rankings. Websites found to be using black hat techniques will likely fall the furthest in the rankings or find themselves delisted entirely, as tends to happen when highly manipulative practices are discovered.


While many of the issues Google tackles are clearly black and white, rating the quality of content takes it into a very subjective area. It will be interesting to see how that plays out.