Diversity of Link Building Processes and Types of Sites That Link to Your Website
If your links are natural, they will come from a wide variety of websites. It’s a red flag if all your links come from only one or two domains or types of websites. For example, if every link comes from social bookmarking or blog websites, that is an unnatural link profile. It looks suspicious and can cause trouble.

If you see too many links coming from a single source, it may be time to add more diversity to your link profile.
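The concentration check described above can be sketched in a few lines. This is an illustrative snippet, not a production tool; the domains and the 50% threshold are invented for the example.

```python
from collections import Counter
from urllib.parse import urlparse

def domain_concentration(backlink_urls):
    """Return each referring domain's share of the total backlink count."""
    domains = [urlparse(u).netloc for u in backlink_urls]
    counts = Counter(domains)
    total = len(domains)
    return {d: n / total for d, n in counts.items()}

# Hypothetical backlink export for a site under review.
links = [
    "https://blog-a.example/post1",
    "https://blog-a.example/post2",
    "https://blog-a.example/post3",
    "https://news-b.example/story",
]
shares = domain_concentration(links)

# Flag any single domain supplying more than half of all links.
flagged = [d for d, s in shares.items() if s > 0.5]
```

Here `blog-a.example` supplies three of the four links, so it would be flagged for review; a healthy profile spreads that share across many unrelated domains.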

Lack of Natural Links to Your Website
Natural links are, well, natural. They form a natural link profile and create more tolerance for deviations caused by poorly executed or past SEO. If 90% of the links are real links created by real people, then even if you have some spammy links, your chances of being penalized drop significantly. Conversely, if all the links come from SEO, it is very difficult to generate a 100% natural link profile, anchor text distribution, etc., for a website. Additionally, not only do you need to satisfy all of Google’s criteria, you also need to correctly anticipate every future algorithm update that Google will implement. If a website has no natural buzz, then we work hard to find ways to create some discussion around the website to supplement the SEO links.
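The 90% rule of thumb above amounts to measuring what fraction of a profile was earned naturally. A minimal sketch, assuming each link has already been labeled by hand or by a classifier (the labels and domains below are invented):

```python
def natural_share(links):
    """Fraction of a link profile tagged as naturally earned."""
    if not links:
        return 0.0
    return sum(1 for l in links if l["origin"] == "natural") / len(links)

# Hand-labeled example profile (hypothetical).
profile = [
    {"domain": "forum.example", "origin": "natural"},
    {"domain": "news.example",  "origin": "natural"},
    {"domain": "blog.example",  "origin": "natural"},
    {"domain": "dir.example",   "origin": "seo"},
]
share = natural_share(profile)  # 0.75
```

A share near 0.9 or above gives the buffer described in the paragraph; a share near zero means every deviation from a natural profile is exposed.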

Concerns Surrounding Duplicate Content
Duplication is a concern for many websites today. Often, a site's text is plagiarized by a competitor or copied by a well-meaning employee without the company even being aware that potential damage is being done in the process. Best SEO practice recommends that every page on a website contain content that is 100% original. So if a portion of that text is copied and used on another website - whether in a classified ad, a directory listing, a respected blog, or even by the dreaded competition - the site may get penalized by Google for the duplication. Google does not really care where the content originated or which website holds first rights of use. The best remedy for duplication problems is a regular check-in process to verify whether any copying has occurred, followed by revisions to restore all web page text to 100% original status. We use Copyscape.com to determine the exact words, pages, and content that may have been plagiarized, then develop an action plan to correct such matters.
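Copyscape is a hosted service, but the underlying idea can be illustrated with word shingles: break each text into overlapping k-word windows and compare the overlap. This is a simplified sketch (the sentences are made up), not Copyscape's actual method.

```python
def shingles(text, k=5):
    """Set of overlapping k-word windows ("shingles") from the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets: |A & B| / |A | B|."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

page = "our widgets are hand built in small batches by local artisans"
copy = "our widgets are hand built in small batches overseas at scale"
score = jaccard(shingles(page), shingles(copy))  # 0.4 - partial copy
```

A score near 1.0 indicates a near-verbatim copy; scores in the middle, like this one, suggest a partially rewritten lift worth investigating.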

Manual Analysis With Google Webmaster Tools
As a final check, we like to log in to Google Webmaster Tools. We use this tool to tell us two things:
(a) Manual penalty. If Google has issued a manual penalty, there will be a notification here. If we see such a penalty, we can take specific steps to clean it up, and we will most likely use the Google disavow tool to further clean up the links.

(b) Manual review of the links. We expect that the technique above will give us a very good idea of what’s going on, but we want to be 100% sure that we know exactly what the problem is. Our team uses Google Webmaster Tools to download all the links to your site that Google has found. We can then spot-check this list and see if anything worrisome needs to be fixed.
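A spot check like this can be partially automated. The sketch below scans a downloaded list of linking URLs for suspicious referring domains; the keyword list and URLs are invented examples, and a real review would still eyeball the results.

```python
from urllib.parse import urlparse

# Hypothetical red-flag keywords often seen in spammy referring domains.
SUSPICIOUS = ("casino", "pills", "payday")

def worrisome(link_urls):
    """Return links whose referring domain contains a suspicious keyword."""
    return [u for u in link_urls
            if any(word in urlparse(u).netloc for word in SUSPICIOUS)]

# Hypothetical rows from a "Links to Your Site" export.
export = [
    "https://respected-blog.example/our-review",
    "https://cheap-pills.example/landing?x=1",
    "https://payday-loans.example/links.html",
]
hits = worrisome(export)  # the two spammy domains
```

The surviving `hits` list becomes the starting point for removal requests or a disavow file.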

We do not presume to know what statistical techniques Google used to build their models.
However, certain types of techniques are regularly used to answer these kinds of classification problems. In particular, we prefer to use a gradient boosting algorithm. Most of us think of statistical analysis as a process where one enters some variables into an Excel spreadsheet and makes a nice graph with a linear regression that shows an upward or downward trend. You can see this on the right. Unfortunately, this grossly over-simplifies complex problems and often produces a crude result: everything above the line is treated as different from everything below the line, when clearly it is not. In the example graph presented here, plenty of penalized sites get missed because they fall below the line, while perfectly decent sites above the line still get hit.
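To make the gradient boosting idea concrete, here is a minimal pure-Python sketch for squared loss: each round fits a one-split "stump" to the current residuals and adds a scaled copy of it to the running model. The features and labels are invented toy data, not our production model.

```python
class Stump:
    """One-split regression tree fit to the current residuals."""
    def fit(self, X, resid):
        best_err = float("inf")
        for j in range(len(X[0])):                      # try each feature
            for t in sorted({row[j] for row in X}):     # try each threshold
                left = [resid[i] for i, row in enumerate(X) if row[j] <= t]
                right = [resid[i] for i, row in enumerate(X) if row[j] > t]
                if not left or not right:
                    continue
                lm, rm = sum(left) / len(left), sum(right) / len(right)
                err = (sum((v - lm) ** 2 for v in left)
                       + sum((v - rm) ** 2 for v in right))
                if err < best_err:
                    best_err = err
                    self.j, self.t, self.lm, self.rm = j, t, lm, rm
        return self

    def predict(self, row):
        return self.lm if row[self.j] <= self.t else self.rm

def gradient_boost(X, y, rounds=20, lr=0.5):
    """Return a prediction function built from `rounds` shrunken stumps."""
    pred = [0.0] * len(y)
    stumps = []
    for _ in range(rounds):
        # For squared loss, the negative gradient is just the residual.
        resid = [yi - pi for yi, pi in zip(y, pred)]
        s = Stump().fit(X, resid)
        stumps.append(s)
        pred = [p + lr * s.predict(row) for p, row in zip(pred, X)]
    return lambda row: sum(lr * s.predict(row) for s in stumps)

# Toy rows: [exact-match anchor share, average link trust]; label 1 = penalized.
X = [[0.05, 0.9], [0.10, 0.8], [0.08, 0.7], [0.50, 0.9],
     [0.70, 0.2], [0.60, 0.3], [0.80, 0.1], [0.05, 0.1]]
y = [0, 0, 0, 0, 1, 1, 1, 1]

model = gradient_boost(X, y)
risk = model([0.75, 0.15])  # sits in the penalized cluster, so close to 1
```

Unlike a single regression line, the ensemble output is a risk score built from many small corrections, which is what lets it capture thresholds and interactions a straight line cannot.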

Factors Included in Our Analysis
Our process groups trust metrics and anchor text usage together. It also enables our team to analyze these factors in combination to determine the big picture for a website. This methodology gives us a model we use to determine not only increased Penguin vulnerability, but also which factors contribute to that vulnerability and to what degree. Here are some of the factors we evaluate:
  • Exact-match anchor text
  • Phrase-match anchor text
  • Commercial anchor text
  • Site-wide links (links to your site that appear on every page of an external website)
  • MozTrust and Domain Authority (Moz), and Citation Flow compared with Trust Flow (Majestic), for the existing sites linking to your website.
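As a sketch of how the anchor-text factors above might be bucketed before modeling, the snippet below classifies anchors against a target keyword. The keyword, commercial word list, and ordering of checks are assumptions for illustration, not our exact rules.

```python
TARGET = "blue widgets"                        # hypothetical money keyword
COMMERCIAL = ("buy", "cheap", "discount", "deal")  # assumed commercial cues

def classify_anchor(anchor):
    """Bucket one anchor text into the categories listed above."""
    a = anchor.lower().strip()
    if a == TARGET:
        return "exact-match"
    if any(word in a for word in COMMERCIAL):
        return "commercial"
    if TARGET in a:
        return "phrase-match"
    return "other"

labels = [classify_anchor(a) for a in [
    "Blue Widgets",              # exact-match
    "best blue widgets online",  # phrase-match
    "buy widgets cheap",         # commercial
    "click here",                # other
]]
```

The resulting category counts, normalized into shares, are the kind of per-site features that feed the model described above.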

Copyright © 2013 Bergstrom Inc. All rights reserved.