As SEOs and content marketers, we talk about backlinks a lot. And we know that not all links are created equal. We have the metrics, DA and DR. We know the power of relevancy. We've also started paying more attention to the traffic of the sites giving links.
But one thing worth emphasizing is zooming out and looking at the link graph.
The key concept to introduce, and to keep reminding ourselves of, is that links are not isolated events. Each link connects nodes, and together those nodes form a link graph, similar to the human brain with its connected neurons.
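To make the node-and-edge framing concrete, here is a minimal sketch of a link graph as a directed adjacency structure. The site names are hypothetical examples, not real link data:

```python
# A link graph: pages are nodes, links are directed edges (source -> target).
from collections import defaultdict

link_graph = defaultdict(set)

def add_link(source, target):
    """Record a directed edge: the source page links to the target page."""
    link_graph[source].add(target)

# Hypothetical example links
add_link("nytimes.com/budgeting", "nerdwallet.com/budgeting-guide")
add_link("blog-a.com/saving-tips", "nerdwallet.com/budgeting-guide")
add_link("blog-a.com/saving-tips", "lendingtree.com/loans")

def inbound_count(page):
    """Count how many distinct pages link to this page."""
    return sum(1 for targets in link_graph.values() if page in targets)

print(inbound_count("nerdwallet.com/budgeting-guide"))  # 2
```

Viewed this way, a single new backlink is just one more edge; what matters is the shape of the whole graph around a page.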
Two ideas worth thinking of:
1. A link graph should represent the corpus of a subject.
NerdWallet is a personal finance site that produces high-quality content. It's an emblematic example of a brand that covers the entire corpus of the subject. If there's a personal finance topic, they'll write about it.
In the same vein, we'd expect a large share of the most authoritative websites on the internet to link to NerdWallet's articles whenever those sites discuss a topic NerdWallet has covered. Competitors won't link, but quality sites that cite their sources will.
When Google decides whether to rank NerdWallet or LendingTree for a given query, it should evaluate the corpus of content and the link graph as a whole to determine ranking, rather than weighing individual silver-bullet links.
2. The reduced link graph can be useful for filtering out spam and noise.
Imagine two web pages covering the same subject. One has links from the New York Times, 3 other newspapers, and 10 blog posts. The other has 50 links from random local small businesses and 100 spammy websites. If Google builds a reduced link graph based on certain authority metrics, the small-business and spam links get thrown out.
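The filtering step can be sketched in a few lines. The authority scores and the cutoff below are invented for illustration; Google's actual metrics and thresholds are unknown:

```python
# Sketch of a reduced link graph: keep only inbound links whose source
# clears an authority threshold. All scores here are made up.
authority = {
    "nytimes.com": 94,
    "localpaper.com": 62,
    "random-biz.com": 12,
    "spam-site.net": 2,
}

inbound_links = ["nytimes.com", "localpaper.com", "random-biz.com", "spam-site.net"]

THRESHOLD = 50  # arbitrary cutoff for this sketch

reduced = [src for src in inbound_links if authority.get(src, 0) >= THRESHOLD]
print(reduced)  # ['nytimes.com', 'localpaper.com']
```

The low-authority sources simply never enter the graph that rankings are computed on, which is the intuition behind the term "reduced."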
Explained well by Roger Montti:
These are what I call Links for Inclusion and Validating Links. I call them links for inclusion because in my opinion, based on my Penguin algorithm research, Google is creating at least two link graphs. One that excludes pages that are spammy and one that includes non-spammy pages.
This creates a situation where you must have links from the right sites in order to be considered for ranking. This is something called the Reduced Link Graph. It’s called the reduced link graph because it’s not a complete map of every page on the Internet. The map is excluding sites that are bad actors. The map of the Internet is reduced to sites that matter.
At scale, would rankings suffer globally with the reduced link graph in use? I think there's a cascading effect: if there aren't strong enough signals, Google expands the link graph.
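One way to picture that cascading effect is a loop that starts with a strict authority cutoff and relaxes it until enough link signals survive. This is purely a sketch of the conjecture above; the cutoffs, the minimum-signal count, and the scores are all invented:

```python
# Conjectural sketch: start with a strict reduced graph and progressively
# relax the authority cutoff until enough link signals remain.
def link_signals(inbound, authority, cutoffs=(80, 50, 20), min_signals=3):
    kept = []
    for cutoff in cutoffs:
        kept = [src for src in inbound if authority.get(src, 0) >= cutoff]
        if len(kept) >= min_signals:
            break  # enough signal at this level; stop expanding
    return kept

# Hypothetical example data
authority = {"nytimes.com": 94, "localpaper.com": 62,
             "blog-a.com": 35, "blog-b.com": 30}
inbound = ["nytimes.com", "localpaper.com", "blog-a.com", "blog-b.com"]

kept = link_signals(inbound, authority)
print(kept)  # all four sources survive once the cutoff drops to 20
```

Here the strictest cutoff keeps only one link, so the loop relaxes twice before finding enough signal, which mirrors the idea of Google expanding the graph when the reduced one is too sparse.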
This is all conjecture.
We don't know for sure how much weight some parts of the link graph carry over others; we're constantly poking at it, trying to get to the truth.