Quality is king in search engine rankings. Spam sites using the latest techniques do make their way to the top, but their rankings are fleeting and quickly forgotten. Quality sites are the only ones that maintain consistently high rankings.
This wasn’t always true. A few years ago it was very easy to rank highly for competitive search terms. I know that for a fact because I was able to build my company from scratch just by using thin affiliate sites. Everybody knows that there has been a drastic change since then. At least on Google, it is increasingly difficult for sites without real content to rank highly, no matter how many backlinks they get.
Why is this?
Search engines use “quality signals” to rank websites automatically. Traditionally, those signals were things you might expect: the presence of the searched words in the body of the web page and in the anchor text of links coming from other pages. There is increasing evidence that Google is looking at other quality signals to judge the relevance of websites.
Here are some of the so-called quality signals Google might be using to provide better results:
- Voluntary Quality Raters. It is well known that Google pays more than 10,000 people to do searches and rate the results they see. It is safe to assume that these raters do not visit every single result to check its quality, so having great titles and meta descriptions is a good way to appeal to them. For example: if I am a Quality Rater and I see one listing that says “Advanced search engine marketing tips to succeed online” and another that says “Search engine optimization, SEO, PPC, SEM, search engine positioning…”, which one do you think I'd rate as spam?
- Involuntary Quality Raters. You are a Quality Rater, too, and you didn’t even know it! Each time you do a search, scan the results, and retype the search, Google’s software takes notice. Even when you do a search, click on some of the results, shake your head, and keep looking, Google is listening. Your behavior signals to Google that the results it gave you might not be very relevant after all.
(As an aside: EGOL at SEOmoz pointed out an interesting idea: theoretically, Google could use the personalized information it gathers from us (Google Reader, Toolbar, Web History, etc.) to tell not only what we like but also what we know about. For example, if my profile tells them that I read a lot about science, they might trust my clicks when I search for 'genetic research'. If my passion were celebrity gossip, maybe less so!)
- Visitor actions. Are visitors doing searches and staying on your site, or are they hitting the back button and reformulating the search? You want to provide sticky content and, most important of all, content that matches the visitor’s query. You can provide the best page in the world about baseball caps, for example, but if the user was searching for baseball scores, that’s a strikeout.
- Social bookmarking/rating. Unique and useful content is what drives visitors to bookmark and thumb-up your pages. Content that earns many bookmarks and votes is content that search engines can trust as a quality signal.
- Analytics (visit length, bounce rate, return visits). This might come as a surprise to the overly naïve, but Google gives Google Analytics away for free for a reason. My opinion is that analytics provides quality information that no other method can (except maybe the Google Toolbar). High bounce rates, short visit lengths, and few return visits are signals that scream: SPAM. Many webmasters don’t install analytics for this very reason; they are afraid. My opinion, though, is that Google uses the information in aggregate form, not against specific sites. (A toy sketch of how such metrics might be computed follows this list.)
- Feed subscribers/readers. One of the best measures of a blog's success is its number of RSS subscribers. A lot of subscribers means the blog is of high quality. No wonder Google recently bought FeedBurner and made its paid features free; the data is worth a lot more than the subscription fees. I am experimenting with FeedBurner’s FeedFlares, and I have a nice idea that I plan to share later to help you increase your RSS subscribers. Stay tuned.
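To make the behavioral signals above concrete, here is a minimal sketch, in Python, of how bounce rate, average visit length, and “pogo-sticking” (clicking a result and bouncing straight back to the search page) could be computed from visit logs. To be clear, everything in it, from the field names to the ten-second threshold, is an invented illustration of the idea, not Google's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical, simplified visit-log records. The fields are invented
# for illustration; no real search engine's schema is implied.
@dataclass
class Visit:
    query: str                 # the search that led to the visit
    pages_viewed: int          # pages seen on the site during the visit
    seconds_on_site: float     # total time spent on the site
    returned_to_results: bool  # did the user go back to the results page?

def quality_signals(visits: list[Visit]) -> dict:
    """Aggregate the behavioral signals discussed above for one site."""
    total = len(visits)
    # Bounce: a single-page visit.
    bounces = sum(1 for v in visits if v.pages_viewed == 1)
    # Pogo-sticking: a quick return to the results page to try another
    # result (the 10-second threshold is an arbitrary choice here).
    pogo = sum(1 for v in visits
               if v.returned_to_results and v.seconds_on_site < 10)
    return {
        "bounce_rate": bounces / total,
        "pogo_stick_rate": pogo / total,
        "avg_visit_seconds": sum(v.seconds_on_site for v in visits) / total,
    }

if __name__ == "__main__":
    log = [
        Visit("baseball caps", 5, 240.0, False),
        Visit("baseball scores", 1, 6.0, True),   # wrong intent: a strikeout
        Visit("baseball caps", 3, 95.0, False),
    ]
    print(quality_signals(log))
```

The toy log shows the point of the baseball example: the visitor who wanted caps stays and browses, while the one who wanted scores is gone in six seconds and back on the results page. In aggregate, that difference is exactly the kind of pattern such data would reveal.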
What does it all mean?
What I’m trying to get across here is that chasing search engine algorithms to discover how they rank websites is a losing proposition. It is practically impossible to keep track of so many variables, and many of them are outside the SEO's control anyway.
The good news is that it is getting easier for sites that are good for users to rank high. As I mentioned before, your time is better spent building useful, original content for your blog. With good content, visitors will recommend it, bookmark it, and subscribe to your feed, and links and high rankings will follow. Quality has always been king, and it looks like it’s going to stay that way.
Hafiz Rahman
July 12, 2007 at 11:07 pm
Thank you very much for this post! Just by seeing the (temporary) results, it's indeed very tempting to try using the "latest techniques" you mentioned above. Writing quality content does take patience and perseverance, and I agree that it will be a better investment in the long run. Now the only missing piece of the puzzle is how to keep ourselves from getting discouraged when the temporary results aren't as good as hoped. Any ideas?
Hamlet Batista
July 13, 2007 at 8:28 am
Hafiz, Thanks for your comment. If you are tempted to take the 'dark path', I guess you will have to work on many sites, trying different techniques. Hopefully some of them will work. The pros create custom tools for this purpose. That is how they don't get discouraged.
Jez
July 13, 2007 at 4:18 am
As more metrics are introduced and more information is harvested directly from users, do you see the influence of backlinks being superseded?
Hamlet Batista
July 14, 2007 at 6:17 am
<blockquote>do you see the influence of back-links being superseded?</blockquote> Jez, they don't need to replace them. Links are simply another quality signal, and all quality signals can be faked. The more signals they can get, the better. I think they will give more weight to the ones that are more difficult for spammers to reproduce.
Matt Ellsworth
July 13, 2007 at 7:14 am
Great thoughts on this subject. I totally forgot about the paid content readers - I never realized that there were 10,000 of them. I hear what you are saying about analytics and why some are worried about it. I have no idea what they do with all the data but they sure are collecting lots of it.
Hamlet Batista
July 13, 2007 at 8:56 am
Matt, Thanks for your visit. I am sure they collect a lot of data from us. I think the intelligence data is helping them in many ways to improve both their organic and paid results.
Mutiny Design
July 13, 2007 at 3:02 pm
Do you know if the content readers just look at on-page content? Do they look at your HTML or your incoming links? I am not sure how successfully Google would be able to use data collected from Analytics, although I have thought they would be stupid not to consider using it. If bounce rates became a big ranking factor, I could easily create a script that connects to proxy servers to visit my site and browse around it. Or you could make a script that goes to your competitors' sites and instantly leaves. Visit length is another ambiguous one: I could have a site whose whole purpose is to get people to phone me as quickly as possible; the visit length would be short, but the user got what they wanted. Does that mean my site is low quality? Social bookmarking/rating is another good idea, but how difficult is it for me to set up a few hundred Digg accounts and create a script to go and Digg whatever article I send to it? Other than the human editors and AI, I can't see Google coming up with anything concrete. Everything they have ever done has been spammed.
Hamlet Batista
July 14, 2007 at 6:30 am
<blockquote>Do you know of the content reader just look at on page content? Do they look at your HTML or you incoming links?</blockquote> I assume you are talking about Google Reader. I think it is safe to assume the Google FeedFetcher can easily extract the links from your posts. <blockquote>Other than the human editors and AI, I can’t see Google coming up with anything concrete. Everything they have ever done has been spammed.</blockquote> I agree with you. SPAM will always be there. SPAM is easy to identify for humans but not for machines. Automation gives them scalability. It would be impossible to create the massive index manually, but that comes at the cost of letting spammers through. As they employ more humans and other quality signals, they will be able to further reduce SPAM.
Dink
July 15, 2007 at 3:17 pm
<blockquote>I agree with you. SPAM will always be there. SPAM is easy to identify for humans but not for machines.</blockquote> As a spammer, I can tell you that not all spam is as easily identified as you may think. The WWW is truly world wide. A large (and growing) number of webmasters are not native English speakers. The great content they create often doesn't read the same as that written by those whose native tongue is English. I do agree that spam will always be there. As long as an important portion of our content discovery is tied to a search engine, we'll be there to game it. BTW...nice blog.
Hamlet Batista
July 16, 2007 at 12:20 pm
Dink, Thanks for your compliments and comment. I agree that there is some really clever spam that is hard to detect, but remember that it only takes a motivated competitor to report you. I know that for a fact.
AussieWebmaster
July 13, 2007 at 7:22 am
The Google Army... the sad thing is these guys have much more impact than the DMOZ editors ever did. And I turned down doing this a few years ago... thought my spare time was worth more than the $10-20 an hour they were paying!!!
Hamlet Batista
July 13, 2007 at 8:59 am
Frank, Thanks for your comment. Most people in the third world don't make $10-$20 an hour, and it is definitely a lot for a college student. No wonder they have so many quality raters. ;-)
Paul Montwill
July 15, 2007 at 11:11 pm
Hamlet, thanks for mentioning FeedFlare. I installed FeedBurner on my website and I can see the number of subscribers growing. I would like to build on this, so I will investigate FeedFlare. Interaction is king! And one more thing: the more I read about Google, the more I am amazed at how complex these search engines are; the amount of data and factors they calculate is just unbelievable! SERP = visitors x Google applications used x millions of factors most SEOs are aware of x millions of other factors we have no idea about.
Geoff
July 16, 2007 at 3:06 am
There have been lots of articles on why content is king. This one, however, is a great look at some elements that get overlooked: the human elements. Nice, fresh take on a subject I thought was exhausted long ago.
Hamlet Batista
July 16, 2007 at 12:21 pm
Geoff, Thanks for your comment. I am glad that you enjoyed the post.
Paul Montwill
July 16, 2007 at 11:26 pm
The human element is getting even bigger with the growing number of Google Apps and with groups of friends on Digg and similar sites helping to promote their projects (Digg has banned many users for that).
Rex
May 11, 2008 at 6:42 pm
1. I did not know about "voluntary quality raters". 2. Analytics: I saw my homepage jump to first place on Google for one of my most important keywords shortly after installing the Analytics script on all of my site's pages. My site has existed for less than two years and beat out one with over ten thousand inbound links that has been around for ten years. Perhaps this is anecdotal evidence for your assertion that Analytics data is factored into SERP rankings. Once again, I'm learning something on your site that I have not seen or heard anywhere else. Thanks!
Tina
May 26, 2008 at 9:00 am
..... And if you are still looking for quality, why are the duplicate-content writers still on top? I agree that content is king for better SERPs, but Google is a genius, big time... either way, the more you work, the higher you rank. For both parties, perhaps? :)
Curt
August 11, 2008 at 9:59 am
Great post. I was just discussing this with another blogger who spends a lot of time on SEO strategies. I agree with you: SEO strategies are no longer worth the effort. You are better off spending time writing better articles, adding features, or marketing your site.