Google: Don't Build Too Many Links Too Fast or We May Ignore Them.

Google's algorithms can tell if you are building links unnaturally. CREDIT: THE WEBMASTER

By Jonathan Griffin. Editor, SEO Consultant, & Developer.


In a Google Webmaster Hangout on Friday, Google’s John Mueller said that their algorithms might interpret a sudden rush of new links as being against the Webmaster Guidelines. In response, Google may take action, such as ignoring the links or penalizing the site manually.

Mueller was asked the following question:

“If I started link building and built 200 links in two days and then built zero links for two years, will Google see this as blackhat and penalize me?”

I’ve summarized the main points of the response below:

  • Building no links for two years is NOT Black Hat. As such, it won’t be penalized.

  • It’s not necessarily the number of links that you are building, but that Google may see that those links are being built “in a way that would not align with our Webmaster Guidelines.”

  • If those links are unnatural, Google’s algorithms may ignore them, and you may receive a manual penalty from the webspam team.

  • Mueller clarifies that it is “not so much the number of links in the time that you’re talking about. It’s really just the type of links that you’re building or the type of link building that you’re doing.”

  • In summary, Mueller indicated that Google could tell whether you are dropping links randomly on sites, buying them, exchanging links, or using a weird link network. It’s not about “the number in the period of time, it’s really the type of activity.”

You can watch Mueller’s full response in the video below:

Let’s put Google’s approach here into a bit of context.

Before September 2016, Google’s Penguin algorithm would automatically penalize whole sites if they had poor link profiles.

In September 2016, Google Penguin 4.0 changed everything. Instead of demoting whole sites, Google started to devalue the spammy links themselves.

Google’s Gary Illyes said at the time, “Traditionally, webspam algorithms demoted whole sites. With this one we managed to devalue spam instead of demoting AND it’s also more granular AND it’s realtime.”

Fast forward to today, and there has been a lot of discussion of EAT. EAT stands for Expertise, Authoritativeness, and Trustworthiness, and it is an important part of how Google assesses quality, as evidenced by the Search Quality Rater Guidelines.

According to Gary Illyes, EAT is primarily based on links and mentions from authoritative sites.

Building 200 links in just a few days would most likely involve posting large numbers of links on low-quality sites with little expertise or trust.

While the webspam algorithm is far more complicated than this example suggests, I think it does demonstrate that Google can assess the trustworthiness of a backlink when deciding which links to devalue or ignore.