Panda 3.1: Pressing On For Relevant Results

Panda 3.1 was a data refresh that affected a small portion of queries. Since the Panda filter runs separately from the core algorithm, the refresh folded newly indexed pages into its evaluation, checking that each piece of content provides relevant, valuable, and unique information to users.

What’s It For

Panda 3.1 continued to refine the algorithm so that the search engine could display more high-quality sites. Google reportedly added quality signals involving duplicate content, usability, and engagement, raising the standard that content marketing for websites had to meet.

This aligns with Google's overarching goal of providing the most relevant answers to users. Panda was explicitly created to devalue websites with low-quality content while rewarding authority sites that put thought and research into the subject matter they blog about.

What Were Its Effects

For Panda 3.1, Matt Cutts announced that the data refresh would affect less than one percent of searches. Given the lack of outcry from the webmaster community, the update did appear to have inconsequential effects on rankings, unlike Panda 2.5, which Google also deemed minor but which led to massive changes in the SERPs: some websites suffered significant drops in traffic while others climbed to the top spots in the results pages.

What It Means for You

The search engine has always encouraged site operators to make sure that each page on their website carries unique content that isn't duplicated anywhere else. Google defines duplicate content as a considerable block of text, whether internal or external to your site, that matches or closely resembles content found elsewhere.

They clarified that websites that unknowingly host duplicate content aren't automatically penalized; penalties are reserved for domains found to be intentionally deceiving users or manipulating rankings through the practice. Even so, many site owners, especially those in e-commerce, worry about having multiple internal URLs that direct users to the same content.

An example is when customers reach an item through URLs that carry the same query parameters in a different order, such as “www.sitesample.com/boots.asp?color=black&brand=example” and “www.sitesample.com/boots.asp?brand=example&color=black,” with both URLs pointing to the same product page.

Google admitted that this kind of duplicate content can have negative repercussions for your website's rankings, though it doesn't result in penalties per se. Rather, the affected page, and your website overall, may be devalued simply because of how the search engine works.

These are the adverse effects when search bots crawl different URLs but end up with identical content:

  • Diluted Link Popularity – Each duplicate URL accumulates its own share of link juice, which means your site won't reap the full SEO benefit of a high-traffic page if search bots don't know which URL to credit with its popularity.
  • Non-User-Friendly URLs – Duplicate content can also push the search engine to display long, parameter-laden URLs that look cluttered and unreadable. A URL built from dictionary words that searchers can remember improves user experience and brand recognition.

Multiple URLs pointing to a single piece of content can be addressed through URL canonicalization. The Big Daddy update of December 2005 improved how Google consolidates redirects and canonical URLs, and the rel=”canonical” HTML tag, introduced in 2009, together with 301 redirects, now gives webmasters a way to tell Google which link they want displayed on the SERPs.
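
As a minimal sketch, reusing the hypothetical boots URLs from the example above (the preferred parameter order here is an arbitrary choice), the canonical link element sits in the page's head and points every variation at one preferred URL:

    <!DOCTYPE html>
    <!-- Hypothetical product page reachable at both
         boots.asp?color=black&brand=example and
         boots.asp?brand=example&color=black.
         The canonical link tells search engines which URL to index
         and to credit with the page's link popularity. -->
    <html>
    <head>
      <meta charset="utf-8">
      <title>Example Brand Boots in Black</title>
      <!-- Note: "&" is escaped as "&amp;" inside an HTML attribute. -->
      <link rel="canonical" href="http://www.sitesample.com/boots.asp?brand=example&amp;color=black">
    </head>
    <body>
      <p>Product details go here.</p>
    </body>
    </html>

A 301 redirect from the non-preferred URL to the preferred one achieves the same consolidation at the server level, which suits variations that shouldn't remain reachable at all.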

Here are a few tips from Google on how to reduce duplicate content issues for your website:

  • Have a Uniform Way of Linking Internally – Consistency is crucial when linking to content within your domain. You don't want one page to point visitors to “www.sitesample.com/page/home/” while another points to “www.sitesample.com/page/home.htm/.” While both essentially lead to the same page, search bots treat them as different, so pick one format and stick with it across your website.
  • Maximize Top-Level Domains – Country-specific top-level domains are particularly beneficial for companies that cater to a global market. Use them to help the search engine serve country-specific content, such as “www.sitesample.au” for blog posts written specifically for Australian consumers.
  • Be Mindful of Content Syndication – Google doesn't penalize blogs that syndicate their content, and syndication won't devalue your website as long as you and your publishing partner include the appropriate markup, such as a rel=”canonical” link or a noindex meta tag, so that the link juice gets credited to your domain.
  • Refrain from Publishing Placeholder Pages – Google and searchers don't want to land on pages that are devoid of content. As much as possible, hold off on publishing placeholder pages until you have enough unique content to fill them. If you must take them live, mark them with a noindex meta tag. Both of these tags appear in the sketch after this list.
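
For illustration (the page title and URLs here are hypothetical), a syndicated copy hosted on a partner's domain can credit the original article with a cross-domain canonical link, while a bare placeholder page can carry a noindex meta tag until its content is ready:

    <!DOCTYPE html>
    <!-- Hypothetical syndicated copy hosted on a publishing partner's site.
         The cross-domain canonical points back to the original post so
         ranking credit consolidates on www.sitesample.com. -->
    <html>
    <head>
      <meta charset="utf-8">
      <title>Choosing the Right Boots (Syndicated)</title>
      <link rel="canonical" href="http://www.sitesample.com/blog/choosing-the-right-boots">
      <!-- For a placeholder page that must go live before it has unique
           content, use a noindex meta tag instead:
           <meta name="robots" content="noindex"> -->
    </head>
    <body>
      <p>Syndicated article body goes here.</p>
    </body>
    </html>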
