This article is more of a question than an answer. But I think it is an interesting question.
As you doubtless know, when trying to figure out the value of your website, Google puts a great deal of emphasis on the number and quality of links that point to your pages from other websites.
While the Googlebot does a good job of discerning what your website is about, it cannot judge the quality of what you publish. Google depends heavily on inbound links to give it some insight into the quality of your website.
For instance, if Starbucks were to place a link to my CoffeeDetective.com website on its homepage, Google would consider this a massive thumbs-up and would likely increase its internal page rank of my site accordingly.
The model Google uses is much the same as the way peer review is used to judge the quality of academic papers. Reviews from respected sources are a good indicator that the paper is of high quality.
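Google's actual ranking system is proprietary and far more elaborate, but the published 1998 PageRank paper describes roughly this "links as votes" idea: a page's score is the sum of the scores of the pages linking to it, each divided by how many outbound links that page has. The graph below is invented purely for illustration; this is a toy sketch, not Google's algorithm.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank. links maps each page to the list of pages it links to.

    Simplification: dangling pages (no outlinks) simply leak rank,
    so scores won't sum to 1 -- fine for comparing relative scores.
    """
    pages = set(links)
    for targets in links.values():
        pages.update(targets)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline; the rest flows along links.
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                share = rank[page] / len(targets)  # vote split across outlinks
                for t in targets:
                    new_rank[t] += damping * share
        rank = new_rank
    return rank

# Hypothetical graph: "bighub" is itself well linked, so its single link
# to "mysite" carries far more weight than the lone link to "othersite"
# from an obscure page.
graph = {
    "a": ["bighub"], "b": ["bighub"], "c": ["bighub"],
    "bighub": ["mysite"],
    "small1": ["othersite"],
}
ranks = pagerank(graph)
```

Running this, "mysite" ends up scoring well above "othersite", even though each received exactly one inbound link: the authority of the linker is what matters.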
The fact that this model is drawn from the offline world is significant.
In the world of Web 1.0 it made perfect sense to judge the quality of a website by the quality of its inbound links.
But does it make as much sense in the world of Web 2.0, when more and more people are linking to each other, without any expectation of being linked to by large, highly regarded websites?
To illustrate the point:
Traditionally, if my website about a new productivity application I have developed is linked to by Mashable.com, my site will probably get an influx of direct traffic and a thumbs-up from Google.
But what if that scenario were a little different? What if that application were aimed at freelance writers in particular, and word spread through small blogs, Twitter lists and Facebook groups? What if I reached my target audience without ever becoming a blip on the radar of Mashable or any other “authoritative” website?
If I am wrong, I hope someone will let me know. But if my software site were wildly popular and of tremendous quality, but never came to the attention of big, relevant websites, and therefore never attracted links from them, would Google ever figure out how good and valuable my website is?
To put it another way, if my site is linked to by thousands of individuals through social media, but never by Mashable or CNET, will I ever rank highly in the search results?
I am asking this question because we are moving very quickly from a one-to-many environment online – CNET as one authoritative source for many readers – to a many-to-many environment – with millions of individuals linking to and learning from each other.
The “authority” of each individual is negligible, but their combined influence can be significant.
Can Google determine quality by looking at a landscape made up of these many-to-many links?
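In principle, the simple link-voting model already allows for this, because contributions from linkers just add up. The numbers below are entirely hypothetical, but they show how a crowd of low-authority links can, arithmetically, outweigh one authoritative link:

```python
# Each inbound link contributes roughly:
#   damping * (linker_rank / linker_outlink_count)
# (all figures here are made up for illustration)
damping = 0.85

# One link from a high-authority page with many outbound links:
big_site = damping * (0.30 / 50)            # ~0.0051

# A thousand links from tiny pages, each with negligible rank:
small_sites = 1000 * damping * (0.001 / 10) # ~0.085
```

On these invented numbers the crowd wins by more than an order of magnitude. The open question is whether the ranking signals actually in use weigh a large cloud of small sources this generously in practice.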
You might argue that any really good website will eventually come to the attention of authoritative sites, which will then link to it.
Maybe, but that still seems to be a model based on Web 1.0. “Let’s cross our fingers and hope a big website notices us.”
As we move forward, I think it is important that websites and blogs are able to acquire authority and page rank based on a high volume of links from a large cloud of small, individual sources.