
Google Penguin Update Guide: - How to tell if your sites were affected - How to recover an affected site - Best SEO practices in a post-Google Penguin world


The latest "Google Penguin Update" has badly stirred up the SEO community. There have been reports of businesses losing millions of dollars, and of hundreds of affiliate marketers losing hundreds or thousands of dollars per day from this update. We have seen a number of well-performing sites pushed back to the later pages of Google, while several "blackhat" sites still stand tall on the first page. Until today, we had very little idea of what actually happened. Google's algorithm is proprietary, meaning no one outside Google knows exactly how it ranks a site; we can only draw conclusions from data. Analyzing the data sets we have gathered over the past months has enabled us to see where people went wrong, where they were right, and what we need to do now. People who do not have large data sets like these are forced to rely on others, or to run their own process of elimination to see what "works." First, let's take a look at the piece of Matt Cutts's latest blog post that he intentionally highlighted, and that is now being re-posted all over the internet:

Ok, it is very clear that Google does not want to see "spun" content anywhere, anymore. For those of you who do not know what spun content is, here is a little snippet:


{This|The latest|This last|The previous} {Google update|Penguin update|Google algorithm update} {was difficult on all of us|made life very hard|was not an easy thing to deal with}.

Blackhat SEO applications take this "spun" input and choose one word or group of words from inside each set of brackets {} to produce a sentence. The problem is that people got lazy and started using programs to spin text automatically, which produced content that is not readable. That is something Google explicitly frowns upon in its policies. Anyway, let's get to the question you've all been waiting for:
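To make the mechanics concrete, here is a minimal sketch of how a spinner resolves that syntax. The function name and structure are my own illustration, not any particular tool's code: it repeatedly finds an innermost {...|...} group and substitutes one randomly chosen alternative.

```python
import random
import re

def spin(text, rng=random):
    """Resolve spintax like {a|b|c} into one concrete sentence.

    Repeatedly replaces the innermost {...|...} group with a randomly
    chosen alternative until no groups remain.
    """
    pattern = re.compile(r"\{([^{}]*)\}")
    while True:
        match = pattern.search(text)
        if match is None:
            return text
        choice = rng.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]

template = ("{This|The latest} {Google update|Penguin update} "
            "{was difficult on all of us|made life very hard}.")
print(spin(template))
```

Run it a few times and you get a different, superficially unique sentence each time, which is exactly why this technique was so popular for mass-producing "content" and why Google now targets it.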

How can I bring my site(s) back from the Google Penguin Update? I will preface this by saying that the solution will be different for everyone, but we are going to go over what we found to be the most common mistakes people made. The first step in finding the mistakes you've made is to run a complete backlink check on your affected site and see what anchor text you were using and what types of links they were. There have been huge debates recently about anchor text over-optimization in SEO. For those of you who are not familiar with anchor text, see the illustration below.

Anchor text is the visible, clickable text of a hyperlink - the words that link to the destination website.


For the past decade, more or less, webmasters have stuck by one rule: build as many links as possible using the exact word or words you are targeting in the search engines as the anchor text. Within the past few years, Google has caught on to this and is now imposing serious penalties. We took a look at a blackhat site that we have been monitoring for a long time, which we will keep anonymous for obvious reasons. The chart below illustrates that approximately 70% or more of its anchor text was the exact keyword it was targeting!

[Chart: Anchor Text in Penalized Site A - legend: Primary Keyword, Secondary Keyword, Other Keywords, Blank/No Data]

It really started to make sense when we analyzed 5 more penalized sites (moved from page 1 to page 10 or worse) and found that every one of them had 70% or more of its anchor text set to its primary target keyword. What does this mean? Think about a normal website's natural link progression. When people link to it, they use all sorts of anchor text, such as "click here," "check out my site," "like our page," or the company name. Very rarely do "normal" people link to a site with an anchor like "blue widgets." Google now sees this and has built the check into its algorithm, making it much harder for blackhat SEOs to build links with automation software. Some people are calling this an "over-optimization penalty"; others are calling it just a plain old update. Whatever it is, it is bad news for people who do not diversify their backlinks.


What about the type of backlink? The type of backlink matters just as much as the anchor text. There are many types of backlinks, including:

- blog comments
- forum profile comments
- links from press releases
- links from "web 2.0 properties"
- links from blogs or blog networks
- links from article directories
- links from help sites or wiki sites
- links from social platforms or "social bookmarks"
- and many, many more

These categories are based on the type of website the link comes from, or the platform it runs on. To answer the question: in this update, Google is placing less emphasis on the type of link than ever before. That does not mean you can go out and order 100k blog comments as long as you vary your anchor text. Google still considers that spam and will punish you accordingly. Again, referencing Matt Cutts' blog post: "...Of course, most sites affected by this change aren’t so blatant. Here’s an example of a site with unusual linking patterns that is also affected by this change. Notice that if you try to read the text aloud you’ll discover that the outgoing links are completely unrelated to the actual content..."

What we can gather from this is that Google is now taking a very hard look at what type of site yours is, and what types of sites link to it. For example, if you have a site all about frogs and 100 links come to it from completely unrelated sites, Google sees that those links are not relevant and will devalue, discount, or even penalize them. There are some exceptions: "general" blogs will carry links from many niches, social sites will link to anything, and so on. This mainly concerns links coming from sites with a singular focus. My advice is to stay away from links on sites whose focus differs from that of your own. The solution is to do what is natural: only place links on sites that are relevant to yours.


SEO is not dead. As long as there are search engines, there will be ways to optimize a given site to rank better on them. At Elite Strategies, we do not give in to the temptation of whatever the latest and greatest link-building tool or tactic is. We have seen many Google updates in the past and we will see many more. A small percentage of our sites were penalized by this update, mainly due to the practices of SEO companies employed before we were contracted. We are currently working to bring those sites back and are already seeing positive movement.

How do you assess your site for bad practices?

1. Get a complete report of all of the sites that link to you. You can do this with free methods such as Google Webmaster Tools, or with paid tools such as Majestic SEO or Ahrefs.
2. Look at these links, and in particular at their anchor texts. Put the anchor texts in a spreadsheet and measure the percentage each one accounts for. If there is a heavy emphasis on one or two particular keyphrases, chances are you are not looking good.
3. Look at the linking websites themselves. Are they related to your content?
4. Look at the text on the linking site immediately surrounding the anchor text. Is it readable? Does it come across as irrelevant or spun?

If you have fallen victim to these practices, it is nearly impossible to remove the links. You can submit a "reconsideration request" to Google, but the chances that it will work are slim. The best thing you can do is build newer, good links in an attempt to outweigh the bad ones.
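If you have your backlink export handy, the spreadsheet step above can also be done with a few lines of script. This is a sketch with made-up sample data, not output from any real tool; `anchor_distribution` is a name I chose for illustration.

```python
from collections import Counter

def anchor_distribution(anchors):
    """Given a list of anchor-text strings (e.g. copied out of a
    backlink report), return each anchor's share of the total as a
    percentage, largest first."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {anchor: round(100 * n / total, 1)
            for anchor, n in counts.most_common()}

# Hypothetical export where most links use the exact target keyword.
anchors = (["blue widgets"] * 7
           + ["click here", "my company name", "check out my site"])
for anchor, pct in anchor_distribution(anchors).items():
    print(f"{anchor}: {pct}%")
```

In this made-up sample, "blue widgets" comes out at 70% of all anchors, which is exactly the kind of skew that the penalized sites we analyzed showed.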

How will Google know how to rank my site for my keyword if I am not using anchor text containing the keyword? This is a really good question. Google scans your code, domain, images, structure, and links, and makes the determination based on that. You will be surprised at how well Google ranks sites for "the correct" keyword. If you find you are not ranking for what you want, write better content, look into "LSI" content, improve your on-site factors, and build better links.


Pro Tips 

 

- Make sure that your site looks good. If the layout is shoddy or the site is difficult to navigate, chances are that is a factor in your bad rankings. Google's algorithm targets sites that are poorly designed, poorly coded, or not functioning correctly.
- Add content often. Google likes to see sites that are updated frequently, and rewards them.
- Create social media accounts for your brand. Facebook and Twitter are a great place to start. Add widgets for these accounts to your site, engage your customers, and update frequently.
- Make sure your site does not have any of these extremely old and outdated features:
  o keywords stuffed in the title, meta tags, and body
  o a page for each surrounding city (we've seen this time and time again)
  o over-linking inside your pages
  o gateway pages (welcome pages)
  o sites made entirely in Flash (Flash is dead)
  o unoptimized images

Thank you all for reading, we are so happy to contribute to the community. Please feel free to leave feedback on our blog by navigating to or by leaving us an email at

