Optimizing for highly competitive keywords requires a completely different strategy than optimizing for non-competitive ones. First, let’s clarify a few points. When I talk about long tail or fat head keywords, I am talking in relation to the search demand for those particular keywords. I am not talking about the supply (the number of sites competing for those keywords). Although demand and competition are generally in direct proportion, there are exceptions, such as unexploited niches.
In this post, let’s just explore the simple case where you are targeting non-competitive keywords. They have decent demand but not a lot of competition. You may be asking yourself how you can tell the competitive and non-competitive keywords apart.
My strategy is to perform the same search several times, but using advanced modifiers to look for the keyword in the title, in the URL, in the text and in the incoming links. If there are a large number of sites for the incoming links search when compared to the other searches, it’s a good indicator of heavy competition. Why? Because as I mentioned before, this tells me search engines are relying more on the off-page metrics to rank the pages than the on-page ones.
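As a rough sketch, the check above can be reduced to a single ratio. Google's restriction operators (`intitle:`, `inurl:`, `intext:`, `inanchor:`) are real, but the function name, threshold interpretation, and numbers below are illustrative assumptions, not part of the original technique:

```python
def competition_signal(counts):
    """Estimate keyword competitiveness from the result counts of the same
    search run with intitle:, inurl:, intext: and inanchor: restrictions.

    counts: dict with keys 'intitle', 'inurl', 'intext', 'inanchor',
    each holding the number of results the restricted search reports.
    A high return value means far more pages carry the keyword in their
    incoming links than on the page itself, i.e. rankings are being
    driven by off-page factors and competition is heavy.
    """
    on_page = max(counts['intitle'], counts['inurl'], counts['intext'])
    if on_page == 0:
        return float('inf') if counts['inanchor'] else 0.0
    return counts['inanchor'] / on_page

# Hypothetical counts: the inanchor: search dwarfs the on-page searches,
# which suggests heavy off-page competition for this keyword.
signal = competition_signal(
    {'intitle': 12_000, 'inurl': 9_500, 'intext': 48_000, 'inanchor': 2_400_000}
)
print(signal)  # 2,400,000 / 48,000 = 50.0
```

A ratio near or below 1 would suggest the opposite: an unexploited niche where on-page optimization alone can compete.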
An alternative approach is to check the number of back links of the top search results. I don't use that because not all links to a site are counted for each specific ranking. For example, a site like Amazon has millions of back links and the fact that some pages rank for particular product searches doesn't mean all their links contributed to that ranking. It is also important to note that, as I explained before, not all top-ten results will remain there. It is very important to identify the web authorities when doing your competitive research.
So now, you have used my technique and you have some really good keywords that don't have a lot of competition. The goal is to use on-page optimization techniques. First, study how your chosen competitor (be sure it’s a web authority) incorporates the keywords into its content. Then incorporate the keywords into your own content so that the metrics are the same or similar.
Relevance Metrics
Relevance metrics are the quality signals or key differentiators search engines use to organize web pages. We need to look for density, prominence, weight, proximity and rarity at a minimum. Let me explain these metrics in more detail:
- Density is how many times a keyword appears in the content compared to the rest of the words on the page.
- Prominence is how high on the page the keywords appear compared to other keywords.
- Weight is the relative size of the font and capitalization used for the keywords.
- Proximity is only relevant for phrases of two words or more and is about how closely the words are placed together.
- Rarity, also known as term weight in Information Retrieval science, is a measure of how rare the keyword is across all documents in the collection; the rarer the term, the more weight it carries.
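A minimal sketch of how three of these metrics could be computed for a single-word keyword. The exact formulas search engines use are unknown, so the ones below (including the classic inverse-document-frequency form of rarity) are illustrative assumptions; proximity (which needs multi-word phrases) and weight (which needs markup) are omitted for brevity:

```python
import math
from collections import Counter

def relevance_profile(keyword, words, corpus_doc_count, docs_with_keyword):
    """Compute density, prominence and rarity for `keyword` over a page
    tokenized into a list of lowercase words. Formulas are illustrative."""
    total = len(words)
    counts = Counter(words)
    # Density: share of the page's words that are the keyword.
    density = counts[keyword] / total if total else 0.0
    # Prominence: how early the first occurrence appears (1.0 = first word).
    first = words.index(keyword) if keyword in counts else total
    prominence = 1.0 - first / total if total else 0.0
    # Rarity (term weight): inverse document frequency over the corpus.
    rarity = math.log(corpus_doc_count / (1 + docs_with_keyword))
    return {'density': density, 'prominence': prominence, 'rarity': rarity}

page = "seo tools help you analyze seo metrics for any page".split()
profile = relevance_profile("seo", page, corpus_doc_count=1_000_000,
                            docs_with_keyword=25_000)
print(profile['density'])  # 2 occurrences in 10 words = 0.2
```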
Most SEOs like to use fixed numbers for these metrics, but I prefer to use the same metrics as the competing web authority I am trying to match or beat. The idea is that search engines should see the same or a similar relevance profile when they evaluate your page for those keywords as when they evaluate your chosen competitor.
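Matching the authority's relevance profile rather than fixed numbers can then be a simple per-metric comparison. A sketch, with made-up metric values:

```python
def profile_gap(mine, competitor):
    """Per-metric difference between my page's relevance profile and the
    chosen web authority's; values near zero mean the profiles match."""
    return {metric: mine[metric] - competitor[metric] for metric in competitor}

gap = profile_gap({'density': 0.020, 'prominence': 0.90},
                  {'density': 0.035, 'prominence': 0.75})
print({metric: round(value, 3) for metric, value in gap.items()})
```

A negative gap on a metric suggests the page underuses the keyword relative to the authority; a large positive gap may indicate overuse.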
This work is definitely easier to accomplish with a dedicated tool like a keyword density analyzer. The SEO suite I plan to release next month in public beta also incorporates such functionality.
Looking ahead
How profitable your SEO efforts are going to be depends primarily upon how much demand your keywords have and your ability to meet that demand better than your competitors. On the other hand, how likely you are to get the necessary rankings in the first place is going to depend largely on how much competition you face in the SERPs and on developing your winning strategy.
For competitive keyword phrases we need a more involved strategy that I am going to explain in a follow-up post…
Jez
August 16, 2007 at 1:00 am
Some very good pointers there, thanks!
Kent Schnepp
August 16, 2007 at 10:02 am
Hamlet - Excellent points. What impact do you think prominence and weight have on relevancy scores? Is it significant?
Hamlet Batista
August 16, 2007 at 11:13 am
Kent - Thanks for your comment. I think it is important to consider them as they are referenced in the research papers. On the other hand, it is hard to tell which metrics have greater impact, as they can tweak their formulas all the time. They probably use different weights for different searches. The best approach is to measure the web authorities for your target keywords. Compare their scores for the metrics you are interested in measuring and make your own assumptions.
Ivan
August 18, 2007 at 4:31 pm
Hamlet, Your approach is really interesting, searching for authorities with the targeted keywords and comparing their site's metrics with your site's. But that would be a kinda difficult task when it comes to comparing metrics for a lot of sites (authorities). I guess having software to do this for you would be nice. Also, it would be a plus if that software could show the data (metrics) in a useful and easy way for the user.
Hamlet Batista
August 22, 2007 at 11:13 am
Ivan - Thanks for your comment. Stop giving clues about what my SEO software does ;-)
Mutiny Design
August 19, 2007 at 1:44 pm
Some interesting tactics here Hamlet. It's given me an idea for a new SEO tool. Also, looking forward to your suite. One note I have to add is to look for emerging industries; to give you a few examples, 'Eastern Europe Investment Properties', 'iPhone' etc. As these are new industries, the competition will be far easier to beat than in long-established industries, such as 'cheap airlines'. Whereas most people are looking to jump on existing Web 2.0 bandwagons and become the kings of blogs, social bookmarking etc., I am looking to find untouched areas and trying to figure out what Web 3.0 might comprise. Currently I am working on a service which has been in demand for a couple of years, but no one has realised the potential....
Hamlet Batista
August 22, 2007 at 11:16 am
Mutiny - Excellent contribution! The main advantage of targeting untapped markets is that you get to be the first. <blockquote> Currently I am working on a service which has been in demand for a couple of years, but no one has realised the potential….</blockquote> I have to say that is very clever and that is how you become successful.
Mutiny Design
August 27, 2007 at 11:04 am
I have actually evolved this theory since reading your blog :) A good comparison for this would be blogging. If you were to start up a brand new blog to try and get recognition in your field about web standards and CSS, you would probably have a tough time getting readers, advertisers etc. because the market is saturated. However, if you were to start up a new blog about technologies which may be part of Web 3.0 or could be considered Web 2.5 - such as Apollo, XSLT, XPath, ECMA 4 or XForms - you would probably not get a very big following to start with. However, your readers would be much more interested in your blog because you would be one of few or even the only source of quality information on that topic. As the technologies become more prevalent, more people will find you, and so long as you have a quality blog you should be recognised as an authority.
Hamlet Batista
August 28, 2007 at 2:13 pm
Mutiny - that is why I decided to focus on advanced SEM, instead of just another SEO blog. ;-)