
Confusion in Recent Google Updates

This post originally appeared on my blog.

Google pushed out some updates recently that have left SEO experts and spammers, as well as the average web developer or content author, a bit confused. It seems that some sites have been losing traffic and attributing the change to the wrong update. Some of this has also percolated up to my clients in the form of fear-mongering and misinformation, so I'll try to provide a quick overview of what has happened.

Exact Match Domains (EMD)

For years, securing a keyword-stuffed domain name for your product or service was considered the coup de grâce of SEO. Frankly, on Google, this was true. For instance, if my company, Algonquin Studios, wanted to rank highly for the search phrases web design buffalo or buffalo web design, then I might register the domains WebDesignBuffalo.com and BuffaloWebDesign.com. I could even register domains for nearby cities, like RochesterWebDesign.com, TorontoWebDesign.com, ClevelandWebDesign.com, and so on, with the intent to drive traffic to my Buffalo-based business.

Google has finally taken steps to prevent that decidedly spammy, user-unfriendly practice. With the EMD update, Google will look at the domain name and compare it to the rest of the site. If the site is a spammy, keyword-stuffing, redirection mess, then it will probably be penalized. If the domain name matches my company name, product, or service and (for this example) the business is located in the area specified by the domain, then the site will probably not experience any change.

In all, Google expects this will affect 0.6% of English-US queries.

Panda/Penguin

While spammers panicked about this change, some non-spammy sites noticed changes at about the same time. This may have been due to the Panda and Penguin updates, which rolled out around the same time and have been rolling out all along.

Considering the Panda update was affecting 2.4% of English search queries, that's already four times the impact of the EMD update. And considering that Google pushes out updates all the time, tracing any change in your Google result position to one single update is going to be tough.

A couple of tweets from Matt Cutts, head of the web spam team at Google, help cut to the source instead of relying on SEO middlemen to misrepresent the feedback:

This one details the number of algorithm changes that regularly happen:

The trick is trying to recognize what on your site might have been read as spam and adjust it to be user-friendly, not to try to tweak your site to beat an ever-changing algorithm.

Disavowing Links

This one is a source of confusion even for a web developer like me.

The only feature Google has added that I think takes potential fun away from blogs (or any site that allows commenting) is the tool to disavow links. This tool allows a site owner to essentially tell Google not to count certain links pointing at the site when calculating PageRank.
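For those who haven't seen it, the tool works by uploading a plain-text file listing the links you want ignored. A minimal sketch of such a file, per Google's documented format (the domains and URL here are made up for illustration):

```text
# Disavow file uploaded through Google's disavow links tool.
# Lines starting with "#" are comments and are ignored.

# Disavow all links from an entire domain:
domain:spammy-link-network.example

# Disavow links from a single page:
http://paid-links.example/our-clients.html
```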

One reason I don’t like it is that it allows sites that have engaged in black-hat SEO tactics and have ultimately been penalized by Google to undo the now-negative effects of paid links, link exchanges and other link schemes that violate Google’s terms. While this is good for sites that have been taken to the cleaners by SEO scammers, I still don’t like how easily they could be excused.

Another reason I don’t like it is that all those liars, cheaters, scammers, spammers, thieves and crooks who have spam-posted to my blog can go and disassociate those now-negative links to their sites. Sadly, associating their sites with filth of the lowest order by careful keyword linking (as I have done at the start of this paragraph) is the only ammo I have with which to take pot-shots at their spam juggernauts.

This new tool means you might no longer see spammers harassing you to remove their own spammy comments from your blog, which is unfortunate, because ignoring those requests seemed only fair.

Just this morning Matt Cutts tweeted a link to a Q&A to answer some questions about the tool:

The post includes some answers intended to address concerns like mine.

Meta Keywords, Redux

As I have said again and again, the use of meta keywords is pointless in all search engines, but especially in Google. This doesn’t stop SEO snake-oil salesmen from misrepresenting a recent Google update to their prospects.

Last month Google announced its news keywords meta tag, which does not follow the same syntax that traditional (and ignored) keyword meta tags follow. An example of the new syntax:

<meta name="news_keywords" content="World Cup, Brazil 2014, Spain vs Netherlands">
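For context, here is a minimal, hypothetical article head showing where the tag sits, alongside the traditional keywords meta tag that Google ignores; the title and keyword values are invented for illustration:

```html
<!-- Hypothetical news article <head>; all values are illustrative only. -->
<head>
  <title>Spain vs. Netherlands: World Cup Final Preview</title>
  <!-- The new tag, read by Google News: -->
  <meta name="news_keywords" content="World Cup, Brazil 2014, Spain vs Netherlands">
  <!-- The traditional tag, which Google ignores: -->
  <meta name="keywords" content="world cup, soccer, football, spain, netherlands">
</head>
```

Note that the syntax mirrors the familiar meta tag form; only the name attribute differs, and only Google News pays attention to it.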

From the announcement, you can see this is clearly targeted at news outlets and publishers that are picked up by Google News (your blog about cats or your drunk driving lawyer web site won’t benefit):

The goal is simple: empower news writers to express their stories freely while helping Google News to properly understand and classify that content so that it’s discoverable by our wide audience of users.

For further proof, the documentation for this feature is in the Google News publishers help section.

In short, unless your site is a valid news site, don’t get talked into using this feature and fire the SEO team that tries to sell you on it.
