Google Penguin Update: Fears, Risks and New Opportunities
On 24 April, Google announced a new algorithm update, which was later named the Penguin Update. Whenever there is an update to Google's algorithms, panic breaks out in the forums, generating a lot of misinformation and confusion.
Google makes dozens and dozens of changes to its algorithms in the course of a month, and the most important ones are given names, sometimes bizarre, to make the more significant changes easier to identify. While Panda, launched last year, aimed to counter content farms, Penguin aims to fight web spam.
At this link you can find all of Google's major algorithmic changes since 2000. For "untainted" information on Google and SEO, I recommend keeping up to date with sources such as SEOmoz, Search Engine Land, Search Engine Watch and Seroundtable, just to name a few; or, if you don't like English, the SEO blog of Giorgio Tave or the discussions started by Enrico Altavilla on his social profiles.
Let's come to the Penguin Update and the fears surrounding it. Much has been said about it; in short, this algorithm aims to clean spam out of the SERPs more effectively. Google has in fact been pursuing this goal all along, but spammers develop new techniques from time to time, pushing Google to develop and modify its algorithms.
Among the common characteristics of the sites affected by the Penguin Update we find:
o keyword stuffing;
o excessive use of exact-match keyword domains;
o excessive use of exact-match anchor text in inbound links;
o links from low-quality article marketing / press release sites;
o links from blog networks and paid link circuits.
That Google disliked the techniques above has been known for some years now, but that there could be problems for sites that have always operated white hat was not foreseeable. From what has emerged in forums, potentially clean websites (at least reportedly) have been penalized or warned by email (via Google Webmaster Tools) about unnatural links that "tend to modify PageRank". This shows that the algorithm is never exact and, as always, cannot work perfectly. A form has been made available to webmasters for requesting reconsideration of their site, even if I do not know what can really be "reconsidered" about a site, given the automated nature of the update and the long list of sites that may feel unfairly hit by Penguin.
The analysis by Micro Site Masters on the distribution of anchor text in inbound links is interesting: it was carried out on a sample of sites that appear to have been penalized by Penguin. The analysis tried to understand up to what percentage you can do link building with the exact keyword of interest before being penalized. It seems that up to 60-65% of links with the exact keyword of interest could still work well and be safe. However, these data should be taken with a grain of salt and probably evaluated case by case, SERP by SERP (SERP competitiveness, keyphrase).
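To make the threshold concrete, here is a minimal sketch in Python of the kind of check that analysis suggests: compute what share of inbound links use the exact money keyword as anchor text. The anchor list and keyword are hypothetical examples, not real data from the study.

```python
def exact_match_ratio(anchors, keyword):
    """Return the fraction of inbound links whose anchor text is exactly the keyword."""
    if not anchors:
        return 0.0
    exact = sum(1 for a in anchors if a.strip().lower() == keyword.lower())
    return exact / len(anchors)

# Hypothetical anchor texts, as exported from a backlink tool
anchors = ["cheap flights", "cheap flights", "www.example.com",
           "cheap flights", "Example Travel", "click here"]

ratio = exact_match_ratio(anchors, "cheap flights")
print(f"{ratio:.0%} exact-match anchors")
if ratio > 0.60:  # the 60-65% danger zone reported in the analysis
    print("Warning: anchor text profile may look unnatural")
```

In this example the ratio is 50%, under the reported danger zone; in a real audit you would run it on the full backlink export, and judge the threshold against the competitiveness of each SERP.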
In fact, the main concern with Penguin lies in the anchor text of inbound links and in the links themselves. While for keyword stuffing (a technique forbidden for years now, even though we know Google is imperfect when we see such sites ranking well for keyword "something") you can manually fix the site and try to restore the situation, for links it is a bit more complex. We might be able to act on purchased links, or on those we placed within our own network of sites, but what if the links came spontaneously and it is practically impossible to contact each webmaster to request removal? That is where the problem lies.
Although there are (for now) no guides with solutions for every case (each case should be analyzed in detail), the folks at SEOmoz have given some indications that might be useful for everyone to understand, in case of a suspected penalty, whether the issue was created by one's own SEO actions or not.
Check the rate of link acquisition
Using professional services such as Majestic SEO, Open Site Explorer or Ahrefs, you can inspect your link graph for patterns, like the one below, in the rate of link acquisition. It's clear that a situation like that should be of concern.
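As a rough sketch of what those tools show, the check below groups backlinks by the month they were first seen and flags months whose link count spikes far above the rest. The backlink data is invented for illustration; in practice you would feed in an export from one of the services above.

```python
from collections import Counter
from datetime import date

# Hypothetical (url, first_seen) pairs, as exported from a backlink tool
backlinks = [
    ("http://a.example/post", date(2012, 1, 10)),
    ("http://b.example/page", date(2012, 1, 22)),
    ("http://c.example/",     date(2012, 3, 2)),
    # a burst of 20 links appearing in the same month
    *[(f"http://spam{i}.example/", date(2012, 4, 5)) for i in range(20)],
]

per_month = Counter(d.strftime("%Y-%m") for _, d in backlinks)
for month in sorted(per_month):
    count = per_month[month]
    # compare each month against the average of the other months
    avg_other = (sum(per_month.values()) - count) / max(len(per_month) - 1, 1)
    spike = " <- suspicious spike" if count > 3 * avg_other else ""
    print(f"{month}: {count} new links{spike}")
```

The "3x the average of other months" cutoff is an arbitrary assumption for the sketch; what matters is that a sudden burst of acquired links stands out against the site's normal growth.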
Links repeated in multiple parts of a page
You should also check links that are duplicated, such as those included in a blogroll or in a footer, which end up repeated across pages. Using Google Webmaster Tools, you can see which sites present this situation and eventually take action.
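A minimal sketch of that check, on hypothetical crawl data: map each outbound link to the set of pages it appears on, and flag links repeated across many pages of the same linking site, which typically indicates a blogroll or footer placement.

```python
from collections import defaultdict

# Hypothetical map: page on the linking site -> outbound links found on it
pages = {
    "http://blog.example/post-1": ["http://mysite.example/", "http://other.example/"],
    "http://blog.example/post-2": ["http://mysite.example/"],
    "http://blog.example/about":  ["http://mysite.example/"],
}

link_pages = defaultdict(set)
for page, links in pages.items():
    for link in links:
        link_pages[link].add(page)

for link, found_on in sorted(link_pages.items()):
    if len(found_on) > 1:  # same link repeated on several pages: likely sitewide
        print(f"{link} appears on {len(found_on)} pages (possible blogroll/footer link)")
```

Google Webmaster Tools already surfaces this per-domain; the point of the sketch is simply that many pages linking with the same URL from one site count as one editorial decision, not many.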
Check the distribution of anchor text in links
This is the part that will take the most time, because it is the subject of several analyses. Before evaluating the distribution of anchor text, you must filter out of the analysis:
o dead links;
o linking domains that have been deindexed for some reason;
o links set to nofollow;
o all links that are duplicated on one or more pages.
Once you have obtained the "clean", filtered list of links, organize it on the basis of:
o links with the exact keyword of interest;
o links with broad-match keywords;
o links with brand + keyword of interest;
o links within an image;
o non-optimized links.
Again, if there is reason to worry, once the filters are applied the situation will stand out like the one shown in the chart below.
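The filtering and grouping steps above can be sketched as follows. All records, the keyword and the brand are hypothetical; the matching rules (substring checks, dedupe by source + anchor) are simplifying assumptions, not how any backlink tool actually classifies.

```python
from collections import Counter

KEYWORD = "cheap flights"   # exact keyword of interest (hypothetical)
BRAND = "example travel"    # brand name (hypothetical)

# Hypothetical backlink records, as exported from a backlink tool
links = [
    {"source": "a.example", "anchor": "cheap flights"},
    {"source": "b.example", "anchor": "cheap flights to rome"},
    {"source": "c.example", "anchor": "example travel cheap flights"},
    {"source": "d.example", "anchor": "", "image": True},
    {"source": "e.example", "anchor": "click here"},
    {"source": "f.example", "anchor": "cheap flights", "nofollow": True},
    {"source": "a.example", "anchor": "cheap flights"},              # duplicate
    {"source": "g.example", "anchor": "cheap flights", "dead": True},
]

def keep(link, seen):
    """Apply the four filters above: dead, deindexed domain, nofollow, duplicate."""
    key = (link["source"], link["anchor"])
    if link.get("dead") or link.get("deindexed") or link.get("nofollow") or key in seen:
        return False
    seen.add(key)
    return True

def bucket(link):
    """Assign a cleaned link to one of the five groups above."""
    a = link["anchor"].lower().strip()
    if link.get("image"):
        return "image link"
    if a == KEYWORD:
        return "exact keyword"
    if BRAND in a and KEYWORD in a:
        return "brand + keyword"
    if KEYWORD in a:
        return "broad match"
    return "non-optimized"

seen = set()
clean = [l for l in links if keep(l, seen)]
dist = Counter(bucket(l) for l in clean)
total = len(clean)
for group, n in dist.most_common():
    print(f"{group}: {n}/{total} ({n / total:.0%})")
```

On this toy sample, three of the eight records are filtered out and the remaining five are spread evenly across the groups; on a real profile, a heavy concentration in the "exact keyword" bucket is what the chart would make visible.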
Opportunities for Marketers
When I started doing SEO, trained mainly by marketers and not by developers, people told me that the SEO was a figure born among developers, and that one had to have the skills of a developer: server administration, several programming languages, and so on. The more time passes, the more I see how marketing skills are becoming one of my competitive advantages and how the situation is reversing: it is the technicians who are acquiring (for better or worse) marketing and community skills. And as time passes I am more and more convinced that content marketing, the construction of meaning and community relations will guide the future of search engines. Also because they are the things a spammer finds hardest to fake.
So my suggestion is (if you are not doing so already) to stay away from the classic Magician Otelma tricks of SEO and give value to content production. Each web page should have value for the user, and links and social signals are real endorsements. This does not mean you should not take care of your websites: clean graphics, clean and lightweight code, loading speed, trusted hosting, and attention to duplicate content and canonicalization will always be important. But instead of wasting time doing article marketing, buying links, building networks and so on, I would focus on producing content (research, infographics, posts, presentations, useful widgets, mobile apps, etc.) that can attract links naturally, and on social media, creating relationships, engagement, a real sense of community and an exchange of information with users.