Dewey: Different Results Among Various Data Centers

Dewey rolled out in the last week of March 2008 and continued into early April. While there was no consensus on the exact changes the update entailed, the webmaster community reported a significant shift in the algorithm, with a large-scale shuffling of results. Some also speculated that Google was pushing other products, such as Books, but there was no evidence to support this.

What’s It For

It’s highly likely that the Dewey update was an effort to improve relevancy in the SERPs. The previous year’s updates, notably Universal Search, marked Google’s expansion into blended results. With news, books, videos, images, and maps integrated into the search results, it’s no wonder the webmaster community was kept on its toes in the months after the update rolled out.

What Were Its Effects

As opposed to previous algorithm updates, which each affected smaller portions of the webmaster community, the Dewey update was described in one forum thread as an “old-style Google shake-up.” Many blogs that previously held top spots in the SERPs lost their high rankings, and some were delisted entirely. This happened even to websites that didn’t violate any of Google’s Webmaster Guidelines.

The SERPs themselves changed. People reported getting different results for the same keywords, and these differences were believed to vary according to the data center that served them. Additionally, users in the UK observed websites that bought links taking the top spots in the SERPs for their keywords.

What It Means for You

Paying for links is still practiced today. Some websites that have built a massive readership charge for guest posting. However, these sites typically retain full editorial control over the topics published on their blog, as well as the number of links that can be included in your content and whether those links are tagged with the dofollow or nofollow attribute.

What you should avoid altogether is using automated link-building systems, which are intended to game Google’s algorithm through black hat tactics. This is a form of spamdexing, or spamming the index. The search engine has measures in place that can detect massive amounts of suspicious activity coming from your site.

Here are the three ways that Google detects paid links:

  • Spam Reports – Your competitors can underhandedly use your paid links against you by reporting to Google that you’re using this tactic. The search engine considers paying for link building as part of a link scheme, and they encourage webmasters to report anyone who tries to manipulate PageRank.
  • Manual Spam Fighters – Google is serious about upholding the credibility of their search engine; that’s why they employ spam fighters who manually assess the quality of the SERPs.
  • Google Algorithm – As mentioned above, the team has come up with measures to detect paid links and any related suspicious activity that aims to game their system.

Instead, these are the white hat methods you should religiously practice:

  • Offer Top-Notch Content and Service – Write high-quality content that addresses your target audience’s specific needs. It must provide them with valuable knowledge and insight they can act on. Don’t stop at your content; go the extra mile with your customer service as well by engaging with your audience on your website and social media accounts.
  • Use Appropriate Keywords – Keywords are the words and phrases that users type into their queries to find answers to their problems. Research the right search terms for your brand using Google’s autocomplete feature: type in the word you want to rank for and note the suggestions, which show what users frequently ask about. You can also look at the keywords your competitors are using and incorporate those into your content.
  • Include Descriptive Meta Tags – A meta tag, or meta description, is a snippet of text that summarizes a page’s content. While it doesn’t appear on the page itself, it’s prominently displayed in the SERPs beneath the page title and URL. Its ideal length is between 150 and 200 characters. It should also be unique for each page and include your primary keywords.
  • Improve Your Site Architecture – Your site’s navigation and structure play a significant role in how users perceive your authority and trustworthiness as a brand. The search engine pays close attention to how people interact with your website and how long they stay on your pages. To enhance the user experience, organize and arrange your domain so that it flows naturally for your target audience.