On 24th February 2011, Google made a change to its search algorithm [the Panda update] aimed at lowering the rankings of low-quality websites and bringing high-quality sites to the top of the search results. This change affected 12 percent of all search results. Websites that placed a lot of advertisements on their web pages saw their rankings decline, while social networking sites and news websites saw their rankings rise. Google also reported that data on sites users chose to block showed an 84 percent overlap with the websites negatively impacted by the Panda update.
The intention behind the Panda update is to help web surfers find high-quality sites as opposed to sites built from content farming and aggregator/spam sites. Panda's main goal is to down-rank websites that offer a poor user experience.
Factors to keep in mind that can make a website vulnerable to Panda:
- Is the information presented in the webpage trustworthy?
- Is the content of the webpage written by an expert who is well versed with the industry?
- Does the website have redundant or duplicate content?
- Is the content free from spelling mistakes and grammatical errors?
- Does the webpage provide value when compared to the other webpages that show up in the search result?
- What kind of quality control measures have been put into place to ensure the quality of the content?
- Does the content provided in the webpage offer a complete description of the topic?
- Would web surfers complain when they see your webpages?
- Is the web content too short and lacking important information?
- Does your webpage have a high bounce rate?
- Does your webpage have low-quality inbound links pointing to it?
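Two of the checklist items above, thin content and duplicate content, are easy to audit programmatically. The sketch below is a minimal illustration, not Google's method; the word-count threshold and the use of an exact-match hash for "duplicate" are assumptions chosen for simplicity.

```python
# Minimal sketch of auditing page text against two Panda checklist items:
# thin content and duplicate content. Thresholds are hypothetical.
import hashlib

MIN_WORDS = 300  # assumed threshold below which content counts as "too short"

def audit_pages(pages):
    """pages: dict mapping URL -> plain-text body. Returns (url, issue) pairs."""
    issues = []
    seen = {}  # content hash -> first URL carrying that content
    for url, text in pages.items():
        words = text.split()
        if len(words) < MIN_WORDS:
            issues.append((url, "thin content: only %d words" % len(words)))
        # Exact-duplicate check: hash the normalized text.
        digest = hashlib.sha256(" ".join(words).lower().encode()).hexdigest()
        if digest in seen:
            issues.append((url, "duplicate of " + seen[digest]))
        else:
            seen[digest] = url
    return issues
```

For example, feeding two URLs with the same short body flags both as thin content and the second as a duplicate of the first. A real audit would also need near-duplicate detection (e.g. shingling), since spun content rarely matches byte-for-byte.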
The latest version of the Panda update is 3.6, which was rolled out on 27th April 2012. On 24th April 2012, Google launched another algorithm update called the Penguin update. The Penguin update, also known as the Webspam update, is designed to identify webpages that spam the Google search engine using tactics that violate Google's guidelines, such as cloaking, keyword stuffing, and link schemes.
Factors to keep in mind to avoid making your website vulnerable to Penguin:
- Do not use hidden links or hidden text
- Do not create similar content across your domains or sub-domains
- If your website has an Affiliate Program then make sure it adds value to the web surfer by providing relevant and unique content
- Avoid sending automated queries to Google [example: using software to make your submissions]
- Avoid the use of doorway pages
- Ensure that your webpages are free from malware and Trojans, and do not engage in any malicious behavior
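The first item on the list above, hidden text and hidden links, can be spotted with simple heuristics. The sketch below is an illustrative check using only Python's standard library; the list of "hiding" styles is an assumption and far from exhaustive (real spam detection also considers off-screen positioning, matching text/background colors, and more).

```python
# Minimal sketch flagging elements hidden via inline CSS, one of the
# Penguin red flags. Heuristics are illustrative, not Google's method.
from html.parser import HTMLParser

# Assumed set of inline styles commonly used to hide text or links.
HIDDEN_STYLES = ("display:none", "visibility:hidden", "font-size:0")

class HiddenTextDetector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.flags = []  # (tag, normalized style) pairs that look hidden

    def handle_starttag(self, tag, attrs):
        style = dict(attrs).get("style", "").replace(" ", "").lower()
        if any(hidden in style for hidden in HIDDEN_STYLES):
            self.flags.append((tag, style))

def find_hidden_elements(html):
    """Return a list of (tag, style) pairs whose inline style hides content."""
    detector = HiddenTextDetector()
    detector.feed(html)
    return detector.flags
```

Running it on `'<div style="display: none">buy cheap keywords</div>'` flags the `div`, while clean markup returns an empty list.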
The Panda 3.6 and Penguin updates are warnings against over-optimization that degrades the user experience of web surfers searching on Google. Google as a search engine will lose its effectiveness if it serves low-quality search results, so it has to keep improving the web surfer's experience. The safest route to protect your website from Panda and Penguin penalties is to use only White Hat SEO techniques that adhere to Google's SEO guidelines.