Google developed the Panda update after conducting extensive research. The quality of websites was checked in many ways: Google employed human evaluators over a long period of time to determine and evaluate the quality of many websites. A number of factors were used to determine the quality of any given website. Some of these parameters reflect the quality of a website at the design stage; other factors, such as download speed, content trustworthiness, and the number of backlinks, were also used. A new algorithm measured similarities between pages to identify duplicate content across websites. The aim of the new algorithm was to evaluate the quality of a website. This new Google algorithm gave less priority to one of the most significant features of the previous PageRank measurements, and it appears to have targeted keyword stuffing instead. This was something that web designers and developers did not expect. At the very first stage of the Penguin release in April 2012, more than three percent of English-language search queries were significantly affected. Generally speaking, the recent Panda update has affected the ranking of complete websites rather than individual web pages.
Google Panda - Based on the research I did, Google Panda and Google Penguin seem to perform similar functions: one takes a broader approach while the other takes a more specific one. If a web page has more than forty percent duplicate content, the page is in danger of being flagged by the Panda algorithm. Panda also looks for exact duplicate designs; it does not like all the pages on a site to have exactly the same layout. The Panda algorithm searches for thin textual content as well: if a web page has just one or two sentences, Panda will pick that up. Some web designers aim to jam-pack web pages with advertisements rather than putting textual content in them. This is absolutely not a good practice. On the other hand, having bland text content is also unacceptable. Some web designers have this bad habit: they put in a lot of text but nothing interesting beyond the text itself. Their pages are not balanced with images, graphs, or audio, and this produces poor usage metrics. Panda also picks up garbage or nonsense text pages, as well as pages with too many outbound links. If your outlinks point to low-quality or spam pages, Panda will catch that.
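Google does not publish how Panda measures the forty-percent duplicate-content threshold mentioned above. As a minimal sketch, one common text-similarity technique is to break each page's text into word n-gram "shingles" and compare the sets with Jaccard similarity; the function names and the three-word shingle size here are assumptions for illustration.

```python
# Hypothetical duplicate-content check: word-shingle Jaccard similarity.
# This is an illustrative sketch, not Google's actual (unpublished) method.

def shingles(text, n=3):
    """Return the set of n-word shingles (overlapping word windows) in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=3):
    """Jaccard similarity between the shingle sets of two texts (0.0 to 1.0)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page = "the quick brown fox jumps over the lazy dog near the river bank"
copy = "the quick brown fox jumps over the lazy dog near the old mill"
# Under this sketch, a page would be "in the danger zone" when similarity
# to another page exceeds the forty percent mentioned in the text.
print(similarity(page, copy) > 0.40)
```

Shingling is preferred over comparing raw word sets because it preserves word order, so two pages that merely share vocabulary score much lower than a page that copies whole phrases.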
When Panda goes through your website, it takes a copy of the index and acts like a spell checker or grammar checker. If Panda finds any problematic behavior on these pages, it can flag them as spam.
Google Penguin - This is the other main algorithm developed by Google to check the quality of a web page. It checks both on-site pages and off-site backlinking pages. These are the kinds of things the Penguin algorithm seems to be looking for. Firstly, Penguin seems to look for non-viewable keywords, that is, hidden keywords: you can have other words hidden, but not the keywords. Having repetitive keywords is also a bad practice, and Penguin is very concerned about it. When you read through your web content, if it sounds repetitive to you, it will sound repetitive to Penguin as well. Keyword density cannot be too high; keeping keyword density under four percent is a good practice. Finally, non-editorial keywords will also be picked up by Penguin. If your website has a big block of text in the side navigation bar or in the footer section that is stuffed with keywords, that is a bad practice. Understand that Google is not penalizing you for having a side navigation bar or a footer; but if your main purpose in having one is to fill it with keywords, that is exactly what Google is looking for. If Penguin finds any of the above misbehaviors on a web page, it will place a spam flag on the page, which seriously affects its Google search results.