Penguin 2.0: A More Comprehensive Algo Update


Penguin 2.0, also counted as the fourth Penguin release, rolled out on May 22, 2013. The filter was created to fight webspam and other black hat techniques that aim to manipulate rankings. While many webmasters anticipated a significant shakeup in the SERPs for this update, it did not hit with the same magnitude as the original filter’s rollout.

What’s It For

Unlike previous Penguin updates, which were only data refreshes, Penguin 2.0 changed the algorithm itself. The filter focuses on detecting websites that attempt to game the system and rank highly in the SERPs even when their content provides little value to users.

Google put this measure in place to spot tactics such as keyword stuffing, hidden content, and link schemes, so it can keep giving searchers the information they need to answer their queries. Along with Panda, which focuses on content, Penguin ensures that the search engine promotes high-quality websites by cracking down on the ones that use webspam techniques to deceive users.

What Were Its Effects

In an announcement on his blog, Matt Cutts, Google’s head of webspam, reported that approximately 2.3 percent of English searches were affected to a degree a regular user might notice. The update was more comprehensive than the first version and was expected to have a broader influence overall.

Several porn and gaming websites, such as extremetube.com and 2dplay.com, were found to have lost search visibility for some major keywords. Major brands like the Salvation Army and CheapOair also landed on the Penguin 2.0 losers list.

The most probable reason big brands were hit is that they had cut corners on their sites’ SEO, while others reportedly carried large numbers of untrusted and thin links. On the other hand, the top subdomains after Penguin 2.0 rolled out belonged to Wikipedia, YouTube, and Twitter.

What It Means for You

The Penguin filter is more straightforward to optimize for than Panda, where it can be challenging to pinpoint exactly which attributes make up high-quality content. Essentially, you need to stick to white hat SEO techniques to boost your rankings, or to ensure your site maintains its place in the SERPs once this algorithm runs.

Here are a few steps you can take for your Penguin SEO strategy:

  • Know Where Your Backlinks Come From – Links continue to be a significant signal Google uses to decide whether a website deserves to show up on the first page of search results. That’s why link building remains a useful practice for any site.
    When other blogs cite you as a reference and link to your page, search bots view this as a vote of confidence in the validity of the information you’re publishing. There is a downside, though: the credibility of the sites linking to you, that is, the sources of your backlinks, affects how their quality is perceived.
    Avoid low-quality links by investing time and effort in reaching out to reputable websites to see if they’ll let you contribute a guest post with your link. Put thought into your anchor text as well, and make sure the keywords you use are relevant to your brand; the first sketch after this list shows one way to audit your anchor-text distribution.
  • Know Which Pages Should Be Prioritized For Crawling – Along with a sitemap, you need to lay out your crawl path so the essential pages get the most link juice from Google. Typically, your home page should have the most backlinks since it’s a user’s primary entry point from Google to the rest of your website.
    As much as possible, avoid a deep site structure, since you’ll only waste effort on pages that may never get indexed by search bots. Also remember to add internal links from higher-level pages so that visitors (and crawlers) are more likely to reach the pages beneath them; the second sketch after this list measures how many clicks deep each page sits.
  • Know What Each Page in Your Domain Contains – Understanding the structure of your website lets you detect current SEO issues as well as potential problems. One common predicament for site operators is duplicate or thin content hiding somewhere among the hundreds of pages in their domain.
    One thing you can do is scour the different sections of your website with a Google site search using the operator “site:[yourdomain.com]”. With this, you can inspect the folders and subdomains, determine how many pages each one has, and compare those counts with the number you believe each section should contain. Use this information to investigate areas with inflated counts; the third sketch after this list automates the tally from your XML sitemap.
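
To make the first step concrete, here is a minimal Python sketch of an anchor-text and referring-domain audit. It assumes you have exported your backlinks to a CSV file; the file name (backlinks.csv) and the column headers (source_url, anchor_text) are placeholders to adapt to whatever your export tool actually produces.

```python
# A minimal sketch: tally referring domains and anchor texts from a backlink
# export. The file name and column names below are assumptions; adjust them
# to match the CSV your link-analysis tool actually gives you.
import csv
from collections import Counter
from urllib.parse import urlparse

referring_domains = Counter()
anchors = Counter()

with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        referring_domains[urlparse(row["source_url"]).netloc] += 1
        anchors[row["anchor_text"].strip().lower()] += 1

# Domains sending a suspiciously large share of links, and anchor texts that
# repeat too often (a sign of over-optimized, exact-match anchors).
print("Top referring domains:", referring_domains.most_common(10))
print("Top anchor texts:", anchors.most_common(10))
```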
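
For the crawl-path step, the sketch below runs a breadth-first crawl from the home page and reports each page’s click depth, so you can spot important pages buried too far from the entry point. The start URL (https://example.com/) and the page cap are placeholders, and it relies on the widely used requests and beautifulsoup4 packages; treat it as a starting point rather than a full crawler.

```python
# A minimal click-depth audit: breadth-first crawl from the home page,
# recording how many clicks each internal page sits from the entry point.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START = "https://example.com/"   # placeholder: your home page
MAX_PAGES = 200                  # keep the sketch polite; raise for a full audit

host = urlparse(START).netloc
depth = {START: 0}
queue = deque([START])

while queue and len(depth) < MAX_PAGES:
    url = queue.popleft()
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException:
        continue
    soup = BeautifulSoup(resp.text, "html.parser")
    for a in soup.find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        # Follow only internal links we haven't seen yet.
        if urlparse(link).netloc == host and link not in depth:
            depth[link] = depth[url] + 1
            queue.append(link)

# Pages more than three clicks deep are candidates for better internal linking.
for url, d in sorted(depth.items(), key=lambda item: -item[1]):
    if d > 3:
        print(d, url)
```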
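
For the page-inventory step, scraping Google “site:” results programmatically isn’t practical, so this sketch instead counts pages per top-level folder from your XML sitemap, which gives you comparable numbers to check against your expectations. The sitemap URL is a placeholder, and nested sitemap index files are not handled here.

```python
# A minimal sketch that inventories pages by top-level folder using the XML
# sitemap, as a scripted complement to manual "site:" searches.
from collections import Counter
from urllib.parse import urlparse
from urllib.request import urlopen
import xml.etree.ElementTree as ET

SITEMAP = "https://example.com/sitemap.xml"  # placeholder: your sitemap URL
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urlopen(SITEMAP) as resp:
    tree = ET.parse(resp)

sections = Counter()
for loc in tree.findall(".//sm:loc", NS):
    path = urlparse(loc.text.strip()).path
    top = path.strip("/").split("/")[0] or "(root)"
    sections[top] += 1

# Compare these counts against how many pages you expect each section to have;
# inflated sections are where duplicate or thin pages tend to hide.
for section, count in sections.most_common():
    print(f"{section}: {count} pages")
```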