
Friday, January 19, 2007
Google Webmaster Central To Solve Canonical Issues
Site Navigation and Search Engine Optimization
No Graphical Menus, Please!
Graphical menus may look good, but they prevent you from exploiting one of the most important factors - anchor text. Search engines cannot recognize text on images. However, they give a lot of weight to anchor text (the text of your links). Use simple text links for navigation.
Put your keywords in the anchor text of all your navigational links.
Let's say that you optimize your main page for the phrase "keyword1 keyword2". You gain almost no benefit from links such as "products", "information", "articles", "about" etc. Use longer navigational links with anchor text like "keyword1 keyword2 products", "keyword1 keyword2 articles" and so on. Work as many instances of your key phrases into your navigational links as you reasonably can. Anchor text helps both the page the link is placed on and the page it points to. To accommodate the longer links, you may need to redesign your whole site. Do it. It's worth it.
Use good and descriptive anchor title tags
Your navigational links should look like the example below: a plain text link whose anchor text contains your keywords and whose title attribute carries a short keyword-rich description. The anchor title tags are given a tiny amount of weight, but it is better to have them than not.
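For illustration, here is a minimal sketch of two such navigational links, assuming a hypothetical site optimized for the phrase "blue widgets" (the file names and keywords are made up):
<a href="blue-widgets-products.html" title="Blue Widgets Products">Blue Widgets Products</a>
<a href="blue-widgets-articles.html" title="Blue Widgets Articles">Blue Widgets Articles</a>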
Divert PageRank to your most important pages
To do it, follow these two rules:
1. Link to your most important pages from every page of your site using appropriate anchor text
2. PageRank is divided among all links on a page. Remove all unimportant links from every page and you will concentrate your PageRank power into your important pages. For example, you may not need a link to your Contacts page from every other page. For every page, think about which unimportant links you can remove without making your site strange and unusable, and do remove them.
Quick Search Engine Optimization Check List
1. Capitalize
Google keeps capitalization information and capitalized keywords are given slightly more weight. Go over all page elements and capitalize the keywords, wherever applicable. Capitalize your titles, alt image tags, anchor title tags, anchor texts etc.
2. Write your Alt Image Tags
Don't leave an image without an alt image tag. Use your keywords in the alt image tags. Capitalize. Check your whole site for images without alt tags (a short example follows this checklist).
3. Write your Anchor Title Tags
All your links should use anchor title tags with keyword rich descriptions, written in the same form as the navigation link example shown earlier.
4. Use simple Text Navigation
Search engines don't recognize text on images. Use keyword rich text links for navigation.
5. Rewrite your Anchor Texts?
On-page links containing your keywords increase the relevancy of the page they are placed on. If you have a page about colon cancer, use links such as "Colon Cancer Info", "About Colon Cancer", "Colon Cancer Articles".
6. Bold your Keywords
Wherever applicable, bold your keywords. Bolded on page text carries more weight. Note: bolded links don't help.
7. Rework your Title?
If your page targets a short keyword phrase, use two instances of your keywords in your page title. That will help a lot.
8. Use more Keyword Phrases
Think about including more keyword phrases throughout your page. Use direct phrases, because they carry a lot more weight. Example: use "Search Engine Optimization" instead of "Search Engine …some words.. Optimization".
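To illustrate points 2, 6 and 7 together, here is a rough sketch of a page head, an image tag and a bolded keyword in the body text, assuming the hypothetical target phrase "blue widgets" (all file names and wording are made up):
<head>
<title>Blue Widgets - Affordable Blue Widgets For Every Home</title>
</head>
<body>
<img src="blue-widgets-assembly.jpg" alt="Blue Widgets Assembly Diagram" width="400" height="300">
<p>Our <b>Blue Widgets</b> come in three sizes and ship worldwide.</p>
</body>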
How does Google Crawl the Web?
The major problem that crawlers face is the growth of the web. Should a search engine crawl new pages or refresh old ones? There are too many pages to crawl and search engines must choose wisely. It is important to crawl documents that change frequently and documents that are of high quality as often as possible.
Crawling Priority
Search engines assign a crawling priority to every page. Crawling priority is a number that denotes the importance of a page in relation to crawling. Pages with a higher crawling priority number will be crawled before pages with a smaller priority number.
Main Factors that determine Google's Crawling Priority
PageRank - pages with a higher PageRank have a higher crawling priority
Number of slashes ('/') in the URLs - pages with fewer slashes in their URLs have a higher crawling priority because they tend to change more often. In other implementations, Google uses the number of slashes ('/') in the links that point to a page. Getting a link from a page with a lot of slashes in its URL results in a smaller crawling priority number.
New Sites and Crawling
It is really frustrating to release a new site, and discover that in the following 3 months Google has crawled just 5% of its pages.
In order for a new page A to get crawled:
1. Google must crawl/index a page B that contains a link to page A
2. Google will discover a new page A sooner if page B itself has a high crawling priority (if you get a link from a low PageRank page, Google might crawl it 6 weeks later and only then find out about the existence of your site)
The worst situation happens when you have a new site with a lot of pages that are more than 2 clicks away from the home page. These new pages might get crawled months later, because the pages that link to the third-level and deeper pages are also new (they have to be found and crawled first, and at the same time they have a low crawling priority).
Tips to get a new site crawled faster
1. Start working on getting incoming links. That is the fastest way to get your home page crawled. Obtaining many incoming links will get the PageRank of your home page up, which will propagate to the second, third etc. level pages and bring them up in the crawling queue.
2. Use an internal linking structure that minimizes the path (number of links it takes to get) from your home page to the majority of your pages
3. Point 2 can be augmented by the use of site maps. Site maps can link to fourth-, fifth- etc. level pages (a small sketch follows this list).
4. Avoid having pages with too many slashes in the URLs such as mydomain.com/articles/section/1/article/2/article_title/
5. On sites with a tree-linking structure such as directories, you can rotate the third-level categories displayed on your home page. Usually on such sites, the home page lists the main categories and under each main category there are links to some of its subcategories. Rotate the subcategory links.
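As a sketch of point 3, a plain HTML site map page can simply list deep pages so that they sit one click away from wherever the site map itself is linked (all URLs here are hypothetical):
<ul>
<li><a href="/articles/blue-widgets-history.html">Blue Widgets History</a></li>
<li><a href="/articles/blue-widgets-maintenance.html">Blue Widgets Maintenance</a></li>
<li><a href="/categories/garden-widgets.html">Garden Widgets</a></li>
</ul>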
How does Google Rank Pages?
Google has dramatically changed the way they rank pages in the last few years. What worked once upon a time does not seem to work now. What worked before? PageRank and keyword rich incoming links. Webmasters just swapped keyword-rich links, bought high PageRank keyword-rich links etc. Optimized sites could dominate the results, but Google wants genuine sites that have natural incoming links (votes).
In my opinion, Google has introduced new ranking scores that operate mostly on whole sites (domains) and has lowered the weight of some of the older page-based scores. Some sites get high rankings for almost anything (Amazon, for example). I have noticed that the rankings of all my pages within a site jump or fall uniformly. Basically, these new scores tell Google how reputable, trusted and high-quality a whole site is, and they influence the rankings of all pages on that site.
When your site (not page) gets a higher overall domain score, you get more traffic/rankings to all your pages and vice versa. These new domain based scores are connected with the infamous SandBox effect. To me, a sandboxed site is a site with a low general domain score.
How did Google change the weight of the older factors (PageRank, anchor text)?
1. Google de-emphasized the weight of anchor text keywords. Why? Because webmasters tend to link more to home pages than to internal pages, so giving too much weight to the anchor text of links pushes up the rankings of home pages and makes internal pages harder to rank. Webmasters countered this over-emphasis of anchor text with two strategies:
* textual keyword-rich internal navigation
* buying / swapping keyword-rich links that pointed to internal pages
In Google's eyes, when a site has thousands of natural incoming links to the home page and internal pages, this site must be of very high-quality and even the internal pages with zero incoming links deserve to rank high. That can be achieved by lowering the weight of anchor text and introducing domain based quality scores. I also think that Google spreads the anchor text value of links to all related pages on a site. What does it mean? If you have 3 pages about widgets, an external link to one of the widget pages with anchor text "widgets" may also help the other two widget pages.
2. Google in my opinion has changed the way they calculate PageRank. I believe the toolbar PageRank does not reflect the real way they calculate PageRank at all.
Consider this fact: when you get a high PageRank with site-wide incoming links, your rankings don't get the same boost as when you get the very same high PageRank with links from a lot of unique domains. The toolbar PageRank is useless.
Link popularity is still important, but we don't know what modified version of calculating PageRank Google uses.
What are the major new domain based factors that Google introduced?
Google seems to be going in the direction of their "Information Retrieval Based On Historical Data" patent and its follow-up patents.
Domain Age
That has become too obvious. Newer sites are less trusted than older sites. This is a factor you cannot control.
Freshness/Staleness
Google factors in how fresh or stale your site is.
Your site is considered fresh when: you have recently updated it (changed content or added new pages), it has acquired new incoming links recently, and most of the sites that link to it are themselves fresh (recently updated, recently linked to).
Your site becomes stale (outdated) when: you have not updated it recently, you have no new incoming links, and the sites that link to you have not been updated or linked to.
Content Updates/Changes
Have you updated your site recently? If all your competitors constantly update their sites but you don't, your rankings may go down.
User Behavior
Your pages get ranked high for some keywords. Do the searchers actually click on your pages? If they don't, your rankings may be lowered. When searchers frequently click on your ranked pages, that is a good thing in the eyes of Google.
Query Based (In my opinion, Content is King factors)
Every time one of your pages gets ranked high (top 30) for some keywords that tells Google that you have good content. Before you get ranked high for competitive keywords, you need to get ranked high for non-competitive ones. When you have a lot of unique content, there is no way your pages will not get ranked high for at least some non-competitive keywords. The more content you have the more times you get pages ranked high for non-competitive keywords, which helps in the future rankings of more competitive keyphrases.
Google may influence the rankings for certain queries by looking at the rankings of their related keyphrases. When you rank high for some keyphrases, this helps you rank high for related queries. Content is King. Don't over-optimize, but use a variety of related words. That works better in the long term.
User Maintained Data, Traffic etc.
Do your visitors bookmark your site? Do they stay long at your site? Do they come back?
Focus on the visitors and Google will find it out and boost your rankings.
Linkage of Independent Peers
How many unique domains link to your site? Usually, the more the better, unless the links grow too fast.
Anti-Spam Factors
Google tries to detect when you do aggressive link building. Again, don't overdo it. Focus on content. Over-optimizing content also does not help much, because it lowers your chances of ranking well for related or synonymous keyphrases.
In a nutshell, how do I get top Google rankings?
1. Add new content frequently (at least once a week).
2. Write long in-depth content instead of short pages. Longer pages will always outperform shorter ones.
3. Don't over-optimize. Think about making your navigation readable, your text readable and include more keywords only if you think it is appropriate to your users. Write naturally and include related terms, synonyms etc.
4. Don't be over-aggressive with link building.
5. Link out to other good sites/pages in your articles.
6. Cross-reference your content by putting links within your content to other pages on your site.
7. Use long descriptive anchor text. Keyword density in content/links is a myth.
8. Publish unique content. Forget about duplicate content. It does not work on Google.
9. Have some patience. Let your site age and don't stop working on any factor (content, links).
10. If you have to ease off on something, ease off on link building rather than on adding fresh, unique, high-quality content.
11. Content is King.
When you do all of the above, you will get a very high domain based score and you will easily outrank the over-optimized competition.
How to get more clicks/traffic out of your top search engine rankings?
It stands to reason that changes in the descriptions of your organic search results can lead to different CTRs (click-through rates). When search engines list results, they show keyword rich text snippets taken from the page text, meta description tags or DMOZ/Yahoo directory listings. These keyword rich snippets are shown together with the titles of your pages.
Search for your keywords on Google and notice what text descriptions Google shows for your top ranked pages. Think about it - can you improve the link description so that more surfers click on your link? Most often, you can.
There are two issues here - which description will lead to the highest CTR and how to make Google show exactly this optimal text snippet. Let's tackle these two problems one by one.
How to write a link description that improves the CTR?
The first rule here is that the description must contain your keywords. This is important because surfers like to see the keywords in the description, and if the description lacks the keywords, Google may not show it at all (remember, Google chooses to show link descriptions that have the keywords in them).
The link description will have more clicks when it offers benefits and when there is a call to action (example: download something free). The best way to test it is to run an AdWords campaign. I use text that has proven to offer the highest CTR on my AdWords campaigns.
You can also use common sense. Write down a couple of text descriptions and try them for a period of time (either on AdWords or on your pages). I prefer AdWords because I can test different texts simultaneously.
How to make Google show your preferred link description?
Write the keyword rich link description in your meta description tag. Most often when Google finds the keywords in the meta description tag, it shows the text in the tag as a link description. That works on the premise that the meta description tag will provide a better human edited description than text snippets taken out from the page text (often text snippets lead to low CTR descriptions).
Your meta description tag must use a short well written keyword rich description that targets the most important/competitive/highest conversion keywords.
If the page targets many keywords, use the meta description tag for the most important ones and edit the page text around your less important keywords until it produces a better snippet for them.
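As a sketch, assuming a page that targets the hypothetical phrase "blue widget repair", such a tag might look like this:
<meta name="description" content="Blue Widget Repair - step-by-step guides, a free diagnostic checklist and tips for fixing common blue widget problems.">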
Testing the idea
On one of my top ranked pages, I plugged a meta description from my AdWords campaign and I immediately saw better CTR (19% higher than before). On another page I got a 50% boost because the link description shown previously by Google was screaming "buy this stuff" and now the description looks like an interesting article title (actually it is a sales letter disguised as an article). Surely numbers will vary from page to page and from one link description to another.
All in all I think it is in many cases easier to increase your organic listings' CTR than to move up in the rankings for competitive keywords.
One last tip for your page titles - use a title of the type "list of your keywords - yoursite.com", not "yoursite.com - list of your keywords". The first title type has a higher CTR.
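In other words (the site name and keywords here are hypothetical), prefer
<title>Blue Widget Repair Guides - example-widgets.com</title>
over
<title>example-widgets.com - Blue Widget Repair Guides</title>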
Articles From seoguide.org
Thursday, January 18, 2007
Web Directories and Specialized Search Engines
What are Google Alternatives
The first alternative to Google is obvious – optimize for the other major search engines, if you have not done so already. Yahoo! and MSN (to a lesser degree) can bring you enough visitors, though sometimes it is virtually impossible to optimize for all three of them at the same time because of the differences in their algorithms. You could also optimize your site for (or at least submit to) some of the other search engines (Lycos, Excite, Netscape, etc.), but bearing in mind that together they hardly account for more than 3-5% of Web search traffic, do not expect much.
Another alternative is to submit to search directories (also known as Web directories) and specialized search engines. Search directories might sound so pre-Google, but submitting to the right directories might prove better than optimizing for MSN, for example. Specialized search engines and portals have the advantage that the audience they attract consists of people who are interested in a particular topic, and if this is your topic, you can reach your target audience directly. It is true that specialized search engines will not bring you as many visitors as topping Google would, but the quality of these visitors is extremely high.
Naming all Google alternatives would make for a long list and is outside the scope of this article, but to be a little more precise about what alternatives exist, we cannot skip SEO instruments like posting to blogs and forums, or paid advertisements.
Web Directories
What is a Web Directory?
Web directories (or as they are better known – search directories) existed before the search engines, especially Google, became popular. As the name implies, web directories are directories where different resources are gathered. Similarly to desktop directories, where you gather files in a directory based on some criterion, Web directories are just enormous collections of links to sites, arranged in different categories. The sites in a Web directory are listed in some order (most often alphabetic but it is not necessarily so) and users browse through them.
Although many Web directories offer a search functionality of some kind (otherwise it would be impossible to browse thousands of pages for, let's say, Computers), search directories are fundamentally different from search engines in two ways – most directories are edited by humans, and URLs are not gathered automatically by spiders but submitted by site owners. The main advantage of Web directories is that no matter how clever spiders become, when there is a human to view and check the pages, there is less chance that pages will be classified in the wrong categories. The disadvantages of the first difference are that the lists in Web directories are sometimes outdated, if no human was available to do the editing and checking for some time (but this is not that bad, because search engines also deliver pages that do not exist anymore), and that sometimes you might have to wait half a year before being included in a search directory.
The second difference – no spiders – means that you must go and submit your URL to the search directory, rather than sit and wait for the spider to come to your site. Fortunately, this is done only once for each directory, so it is not that bad.
Once you are included in a particular directory, in most cases you can stay there as long as you wish to and wait for people (and search engines) to find you. The fact that a link to your site appears in a respectable Web directory is good because first, it is a backlink and second, you increase your visibility for spiders, which in turn raises your chance to be indexed by them.
Examples of Web Directories
There are hundreds, if not thousands, of search directories, but undoubtedly the most popular one is DMOZ. It is a general-purpose search directory and it accepts links to all kinds of sites. Other popular general-purpose search directories are Google Directory and Yahoo! Directory. The Best of the Web is one of the oldest Web directories and it still keeps to high standards in selecting sites.
Besides general-purpose Web directories, there are incredibly many topical ones. For instance, The Environment Directory lists links to environmental sites only, while The Radio Directory lists thousands of radio stations worldwide, arranged by country, format, etc. There are also many local and national Web directories, which accept links only to sites about a particular region or country and which can be great if your site targets a local or national audience. It is not possible even to list the topics covered by specialized search directories, because the list would get incredibly long. Using Google and specialized search resources like The Search Engines Directory, you can find on your own many directories that are related to your area of interest.
Specialized Search Engines
What is a Specialized Search Engine?
Specialized search engines are one more tool to include in your SEO arsenal. Unlike general-purpose search engines, specialized search engines index pages for particular topics only and very often there are many pages that cannot be found in general-purpose search engines but only in specialized ones. Some of the specialized search engines are huge sites that actually host the resources they link to, or used to be search directories but have evolved to include links not only to sites that were submitted to them. There are many specialized search engines for every imaginable topic and it is always wise to be aware of the specialized search engines for your niche. The examples in the next section are by no means a full list of specialized search engines but are aimed to give you the idea of what is available. If you search harder on the Web, you will find many more resources.
Examples of Specialized Search Engines
Specialized search engines are probably not as numerous as Web directories, but there is certainly no shortage of them either, especially if one counts sites with password-protected databases accessible only from within the site as specialized search engines. As with Web directories, a full list of specialized search engines would be really, really long (and constantly changing), so instead here are some links to lists of search engines: Pandia Powersearch, Webquest, Virtual Search Engines, the already mentioned The Search Engines Directory, etc. What is common to these lists is that they offer a selection of specialized search engines arranged by topic, so they are a good starting point for the hunt for specialized search engines.
Importance of Sitemaps
Sitemaps are not a novelty. They have always been part of best Web design practices, but with the adoption of sitemaps by search engines they have become even more important. However, it is necessary to clarify that if you are interested in sitemaps mainly from an SEO point of view, you cannot rely on the conventional sitemap alone (though currently Yahoo! and MSN still keep to the standard html format). For instance, Google Sitemaps uses a special (XML) format that is different from the ordinary html sitemap for human visitors.
One might ask why two sitemaps are necessary. The answer is obvious - one is for humans, the other is for spiders (for now mainly Googlebot, but it is reasonable to expect that other crawlers will join the club shortly). In that regard, it is necessary to clarify that having two sitemaps is not regarded as duplicate content. In 'Introduction to Sitemaps', Google explicitly states that using a sitemap will never lead to a penalty for your site.
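For reference, a minimal XML sitemap is just a list of URLs with optional hints about freshness and importance; the URLs, dates and values below are hypothetical, and the exact elements and namespace are described in Google's sitemap documentation:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/articles/blue-widgets.html</loc>
  </url>
</urlset>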
Why Use a Sitemap
Using sitemaps has many benefits, not only easier navigation and better visibility by search engines. Sitemaps offer the opportunity to inform search engines immediately about any changes on your site. Of course, you cannot expect that search engines will rush right away to index your changed pages but certainly the changes will be indexed faster, compared to when you don't have a sitemap.
Also, when you have a sitemap and submit it to the search engines, you rely less on external links to bring search engines to your site. Sitemaps can even help with messy internal links - for instance, if you accidentally have broken internal links or orphaned pages that cannot be reached in any other way (though there is no doubt that it is much better to fix your errors than to rely on a sitemap).
If your site is new, or if you have a significant number of new (or recently updated pages), then using a sitemap can be vital to your success. Although you can still go without a sitemap, it is likely that soon sitemaps will become the standard way of submitting a site to search engines. Though it is certain that spiders will continue to index the Web and sitemaps will not make the standard crawling procedures obsolete, it is logical to say that the importance of sitemaps will continue to increase.
Sitemaps also help in classifying your site content, though search engines are by no means obliged to classify a page as belonging to a particular category or as matching a particular keyword only because you have told them so.
Having in mind that the sitemap programs of major search engines (and especially Google) are still in beta, using a sitemap might not generate huge advantages right away but as search engines improve their sitemap indexing algorithms, it is expected that more and more sites will be indexed fast via sitemaps.
Generating and Submitting the Sitemap
The steps you need to perform in order to have a sitemap for your site are simple. First, you need to generate it, then you upload it to your site, and finally you notify Google about it.
Depending on your technical skills, there are two ways to generate a sitemap - to download and install a sitemap generator or to use an online sitemap generation tool. The first is more difficult but you have more control over the output. You can download the Google sitemap generator from here. After you download the package, follow the installation and configuration instructions in it. This generator is a Python script, so your Web server must have Python 2.2 or later installed, in order to run it.
The second way to generate a sitemap is easier. There are many free online tools that can do the job for you. For instance, have a look at this collection of Third-party Sitemap tools. Although Google says explicitly that it has neither tested, nor verified them, this list will be useful because it includes links to online generators, downloadable sitemap generators, sitemap plugins for popular content-management systems, etc., so you will be able to find exactly what you need.
After you have created the sitemap, you need to upload it to your site (if it is not already there) and notify Google about its existence. Notifying Google includes adding the site to your Google Sitemaps account, so if you do not have an account with Google, it is high time to open one. Another detail that is useful to know in advance is that in order to add the sitemap to your account, you need to verify that you are the legitimate owner of the site.
Currently Yahoo! and MSN do not support sitemaps, or at least not in the XML format used by Google. Yahoo! allows webmasters to submit “a text file with a list of URLs” (which can actually be a stripped-down version of a sitemap), while MSN does not offer even that, though there are rumors that it indexes sitemaps when they are available on the site. Most likely this situation will change in the near future and both Yahoo! and MSN will catch up with Google, because user-submitted sitemaps are simply too powerful an SEO tool to be ignored.
How to Build Backlinks
Getting Backlinks the Natural Way
The idea behind including backlinks as part of the PageRank algorithm is that if a page is good, people will start linking to it. And the more backlinks a page has, the better. But in practice it is not exactly like this. Or at least you cannot always rely on the fact that your content is good and people will link to you. Yes, if your content is good and relevant you can get a lot of quality backlinks, including links from sites on the same topic as yours (and these are the most valuable kind of backlinks, especially if the anchor text contains your keywords), but what you get without effort could be less than what you need to successfully promote your site. So, you will have to resort to other ways of acquiring quality backlinks, as described next.
Ways to Build Backlinks
Even if plenty of backlinks come to your site the natural way, additional quality backlinks are always welcome and the time you spend building them is not wasted. Among the acceptable ways of building quality backlinks are getting listed in directories and posting in forums, blogs and article directories. The unacceptable ways include inter-linking (linking from one site to another site that is owned by the same owner or exists mainly to serve as a link farm), linking to spam sites or sites that host any kind of illegal content, purchasing links in bulk, linking to link farms, etc.
The first step in building backlinks is to find the places from which you can get quality backlinks. A valuable assistant in this process is the Backlink Builder tool. When you enter the keywords of your choice, the Backlink Builder tool gives you a list of sites where you can post an article, message, posting, or simply a backlink to your site. After you have the list of potential backlink partners, it is up to you to visit each of the sites and post your content with the backlink to your site in it.
You might wonder why sites such as those listed by the Backlink Builder tool provide such a precious asset as backlinks for free. The answer is simple – they need content for their site. When you post an article or submit a link to your site, you do not get paid for this. You provide them for free with something they need – content – and in return they provide you for free with something you need – quality backlinks. It is a fair trade, as long as the sites where you post your content or links are respected ones and you don't post fake links or content.
Getting Listed in Directories
If you are serious about your Web presence, getting listed in directories like DMOZ and Yahoo is a must – not only because this is a way to get some quality backlinks for free, but also because this way you are easily noticed by both search engines and potential visitors. Generally inclusion in search directories is free but the drawback is that sometimes you have to wait a couple of months before you get listed in the categories of your choice.
Forums and Article Directories
Generally search engines index forums so posting in forums and blogs is also a way to get quality backlinks with the anchor text you want. If the forum or blog is a respected one, a backlink is valuable. However, in some cases the forum or blog administrator can edit your post, or even delete it if it does not fit into the forum or blog policy. Also, sometimes administrators do not allow links in posts, unless they are relevant ones. In some rare cases (which are more an exception than a rule) the owner of a forum or a blog would have banned search engines from indexing it and in this case posting backlinks there is pointless.
While forum postings can be short and do not require much effort, submitting articles to directories can be more time-consuming, because articles are generally longer than posts and need careful thinking while writing them. But it is also worth it, and it is not that difficult to do.
Content Exchange and Affiliate Programs
Content exchange and affiliate programs are similar to the previous method of getting quality backlinks. For instance, you can offer to interested sites RSS feeds for free. When the other site publishes your RSS feed, you will get a backlink to your site and potentially a lot of visitors, who will come to your site for more details about the headline and the abstract they read on the other site.
Affiliate programs are also good for getting more visitors (and buyers) and for building quality backlinks but they tend to be an expensive way because generally the affiliate commission is in the range of 10 to 30 %. But if you have an affiliate program anyway, why not use it to get some more quality backlinks?
News Announcements and Press Releases
Although this is hardly an everyday way to build backlinks, it is an approach that gives good results if handled properly. There are many sites (for instance, here is a list of some of them) that publish news announcements and press releases for free or for a fee. A professionally written press release about an important event can bring you many, many visitors, and the backlink from a respected site to yours is a good boost to your SEO efforts. The tricky part is that you cannot issue press releases if there is nothing newsworthy. That is why we say that news announcements and press releases are not a routine way to build backlinks.
Backlink Building Practices to Avoid
One of the practices to be avoided is link exchange. There are many programs that offer to barter links. The principle is simple – you put a link to a site, they put a backlink to your site. There are a couple of important things to consider with link exchange programs. First, take care about the ratio between outbound and inbound links. If your outbound links are many times more numerous than your inbound links, this is bad. Second (and more important) is the risk that your link exchange partners are link farms. If this is the case, you could even be banned from search engines, so it is too risky to indulge in link exchange programs.
Linking to suspicious places is something else you must avoid. It is true that search engines do not punish you for backlinks from such places, because it is assumed that you have no control over who links to you. But if you enter a link exchange program with so-called bad neighbors and you link to them, this can be disastrous to your SEO efforts. For more details about bad neighbors, check the Bad Neighborhood article. Also, beware of getting tons of links in a short period of time, because this looks artificial and suspicious.
Optimizing Flash Sites
If there is a really hot potato that divides SEO experts and Web designers, it is Flash. Undoubtedly a great technology for including sound and pictures on a Web site, Flash movies are a real nightmare for SEO experts. The reason is pretty prosaic – search engines cannot index (or at least not easily) the contents inside a Flash file, and unless you feed them with the text inside a Flash movie, you can simply count this text as lost for boosting your rankings. Of course, there are workarounds, but until search engines start indexing Flash movies as if they were plain text, these workarounds are just a clumsy way to optimize Flash sites, although they are certainly better than nothing.
Why Search Engines Dislike Flash Sites?
Search engines dislike Flash Web sites not because of their artistic qualities (or the lack of these) but because Flash movies are too complex for a spider to understand. Spiders cannot index a Flash movie directly, as they do with a plain page of text. Spiders index filenames (and you can find tons of these on the Web), but not the contents inside.
Flash movies come in a proprietary binary format (.swf) and spiders cannot read the insides of a Flash file, at least not without assistance. And even with assistance, do not count on spiders crawling and indexing all your Flash content. This is true for all search engines. There might be differences in how search engines weigh page relevancy, but in their approach to Flash, at least for the time being, search engines are really united – they hate it, but they index portions of it.
What (Not) to Use Flash For?
Despite the fact that Flash movies are not spider favorites, there are cases when a Flash movie is worth the SEO efforts. But as a general rule, keep Flash movies at a minimum. In this case less is definitely better and search engines are not the only reason. First, Flash movies, especially banners and other kinds of advertisement, distract users and they generally tend to skip them. Second, Flash movies are fat. They consume a lot of bandwidth, and although dialup days are over for the majority of users, a 1 Mbit connection or better is still not the standard one.
Basically, designers should stick to the principle that Flash is good for enhancing a story, but not for telling it – i.e. you have some text with the main points of the story (and the keywords that you optimize for) and then you have the Flash movie to add further detail or just a visual representation of the story. In that connection, the greatest SEO sin is to have the whole site made in Flash! This is simply unforgivable, and do not even dream of high rankings!
Another “No” is to use Flash for navigation. This applies not only to the starting page, where it was once fashionable to splash a gorgeous Flash movie, but to the rest of the site's navigation as well. Although it is a more common mistake to use images and/or JavaScript for navigation, Flash banners and movies must not be used to lead users from one page to another. Text links are the only SEO-approved way to build site navigation.
Workarounds for Optimizing Flash Sites
Although a workaround is not a solution, Flash sites still can be optimized. There are several approaches to this:
* Input metadata
This is a very important approach, although it is often underestimated and misunderstood. Although metadata is not as important to search engines as it used to be, Flash development tools allow you to easily add metadata to your movies, so there is no excuse to leave the metadata fields empty.
* Provide alternative pages
For a good site it is a must to provide html-only pages that do not force the user to watch the Flash movie. Preparing these pages requires more work, but the reward is worth it, because not only users but search engines as well will see the html-only pages.
* Flash Search Engine SDK
This is the life-belt: the most advanced tool for extracting text from a Flash movie. One of the handiest applications in the Flash Search Engine SDK is the tool named swf2html. As its name implies, this tool extracts text and links from a Macromedia Flash file and writes the output into a standard HTML document, thus saving you the tedious job of doing it manually. However, you still need to look at the extracted contents and correct them if necessary. For example, the order in which the text and links are arranged might need a little restructuring in order to put the keyword-rich content in the title and headings, or at the beginning of the page. Also, you need to check that there is no duplicate content among the extracted sentences and paragraphs. The font color of the extracted text is another issue: if it is the same as the background color, you will run into hidden-text territory.
* SE-Flash.com
Here is a tool that visually shows which parts of your Flash files are visible to search engines and which are not. This tool is very useful even if you already have the Flash Search Engine SDK installed, because it provides one more check of the accuracy of the extracted text. Besides, it is not certain that Google and the other search engines use the Flash Search Engine SDK to get contents from a Flash file, so this tool might give completely different results from those the SDK produces.
These approaches are just some of the most important examples of how to optimize Flash sites. There are many other approaches as well. However, not all of them are brilliant and clear, and some sit on the boundary of ethical SEO – e.g. creating invisible layers of text that are delivered to spiders instead of the Flash movie itself. Although this technique is not wrong in itself (there is no duplicate or fake content), it is very similar to cloaking and doorway pages, and it is better to avoid it.
What is Robots.txt
It is great when search engines frequently visit your site and index your content, but often there are cases when indexing parts of your online content is not what you want. For instance, if you have two versions of a page (one for viewing in the browser and one for printing), you'd rather have the printing version excluded from crawling, otherwise you risk a duplicate content penalty. Also, if you happen to have sensitive data on your site that you do not want the world to see, you will prefer that search engines do not index these pages (although in this case the only sure way of not indexing sensitive data is to keep it offline on a separate machine). Additionally, if you want to save some bandwidth by excluding images, stylesheets and JavaScript from indexing, you also need a way to tell spiders to keep away from these items.
One way to tell search engines which files and folders on your Web site to avoid is with the use of the Robots metatag. But since not all search engines read metatags, the Robots metatag can simply go unnoticed. A better way to inform search engines of your wishes is to use a robots.txt file.
What Is Robots.txt?
Robots.txt is a text (not html) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but generally search engines obey what they are asked not to do. It is important to clarify that robots.txt is not a way of preventing search engines from crawling your site (i.e. it is not a firewall, or a kind of password protection); putting up a robots.txt file is something like putting a note “Please, do not enter” on an unlocked door – you cannot prevent thieves from coming in, but the good guys will not open the door and enter. That is why we say that if you have really sensitive data, it is too naïve to rely on robots.txt to protect it from being indexed and displayed in search results.
The location of robots.txt is very important. It must be in the main directory because otherwise user agents (search engines) will not be able to find it – they do not search the whole site for a file named robots.txt. Instead, they look first in the main directory (i.e. http://mydomain.com/robots.txt) and if they don't find it there, they simply assume that this site does not have a robots.txt file and therefore they index everything they find along the way. So, if you don't put robots.txt in the right place, do not be surprised that search engines index your whole site.
The concept and structure of robots.txt was developed more than a decade ago, and if you are interested in learning more about it, visit http://www.robotstxt.org/ or go straight to the Standard for Robot Exclusion, because in this article we will deal only with the most important aspects of a robots.txt file. Next we continue with the structure of a robots.txt file.
Structure of a Robots.txt File
The structure of a robots.txt file is pretty simple (and barely flexible) – it is essentially a list of user agents and disallowed files and directories. Basically, the syntax is as follows:
User-agent:
Disallow:
“User-agent:” names the search engine crawler a record applies to, and “Disallow:” lists the files and directories to be excluded from indexing. In addition to “User-agent:” and “Disallow:” entries, you can include comment lines – just put the # sign at the beginning of the line:
# All user agents are disallowed to see the /temp directory.
User-agent: *
Disallow: /temp/
The Traps of a Robots.txt File
When you start making complicated files – i.e. you decide to allow different user agents access to different directories – problems can start, if you do not pay special attention to the traps of a robots.txt file. Common mistakes include typos and contradicting directives. Typos are misspelled user-agents, directories, missing colons after User-agent and Disallow, etc. Typos can be tricky to find but in some cases validation tools help.
The more serious problem is with logical errors. For instance:
User-agent: *
Disallow: /temp/
User-agent: Googlebot
Disallow: /images/
Disallow: /temp/
Disallow: /cgi-bin/
The above example comes from a robots.txt that is meant to keep all agents out of the /temp directory and additionally keep Googlebot out of /images/ and /cgi-bin/. The trap is that the records do not add up: a crawler follows only the record that most specifically matches its user agent, so Googlebot will obey the “User-agent: Googlebot” record and ignore the general “*” record entirely. In this example that is fine, because the Googlebot record repeats “Disallow: /temp/”, but if you forget to repeat a general rule in the more specific record, Googlebot will happily crawl the directory you thought you had already blocked. You see, the structure of a robots.txt file is simple, but serious mistakes can still be made easily.
Tools to Generate and Validate a Robots.txt File
Having in mind the simple syntax of a robots.txt file, you can always read it to see if everything is OK but it is much easier to use a validator, like this one: http://tool.motoricerca.info/robots-checker.phtml. These tools report about common mistakes like missing slashes or colons, which if not detected compromise your efforts. For instance, if you have typed:
User agent: *
Disallow: /temp/
this is wrong because the hyphen between “User” and “agent” is missing and the syntax is incorrect – the line should read “User-agent: *”.
In those cases when you have a complex robots.txt file – i.e. you give different instructions to different user agents, or you have a long list of directories and subdirectories to exclude – writing the file manually can be a real pain. But do not worry – there are tools that will generate the file for you. What is more, there are visual tools that let you point and click to select which files and folders are to be excluded. But even if you do not feel like buying a graphical tool for robots.txt generation, there are online tools to assist you. For instance, the Server-Side Robots Generator offers a dropdown list of user agents and a text box for you to list the files you don't want indexed. Honestly, it is not much help unless you want to set specific rules for different search engines, because in any case it is up to you to type the list of directories, but it is more than nothing.
Promoting Your Site to Increase Traffic
1. Submitting Your Site to Search Directories, forums and special sites
After you have finished optimizing your new site, the time comes to submit it to search engines. Generally, with search engines you don't have to do anything special in order to get your site included in their indices – they will come and find you. Well, it cannot be said exactly when they will visit your site for the first time and at what intervals they will visit it later, but there is hardly anything that you can do to invite them. Sure, you can go to their Submit a Site pages and submit the URL of your new site, but by doing this do not expect that they will hop over to you right away. What is more, even if you submit your URL, most search engines reserve the right to judge whether to crawl your site or not. Anyway, here are the URLs for submitting pages in the three major search engines: Google, MSN, and Yahoo.
In addition to search engines, you may also want to have your site included in search directories as well. Although search directories also list sites that are relevant to a given topic, they are different from search engines in several aspects. First, search directories are usually maintained by humans and the sites in them are reviewed for relevancy after they have been submitted. Second, search directories do not use crawlers to get URLs, so you need to go to them and submit your site but once you do this, you can stay there forever and no more efforts on your side are necessary. Some of the most popular search directories are DMOZ and Yahoo! (the directory, not the search engine itself) and here are the URLs of their submissions pages: DMOZ and Yahoo!.
Sometimes posting a link to your site in the right forums or special sites can do miracles in terms of traffic. You need to find the forums and sites that are leaders in the fields of interest to you but generally even a simple search in Google or the other major search engines will retrieve their names. For instance, if you are a hardware freak, type “hardware forums” in the search box and in a second you will have a list of sites that are favorites to other hardware freaks. Then you need to check the sites one by one because some of them might not allow posting links to commercial sites. Posting into forums is more time-consuming than submitting to search engines but it could also be pretty rewarding.
2. Specialized Search Engines
Google, Yahoo!, and MSN are not the only search engines on Earth, nor even the only general-purpose ones. There are many other general-purpose and specialized search engines, and some of them can be really helpful for reaching your target audience. You just can't imagine for how many niches specialized search engines exist – from law, to radio stations, to education! Some of them are actually huge sites that gather Web-wide resources on a particular topic, but almost all of them have sections for submitting links to external sites of interest. So, after you find the specialized search engines in your niche, go to their sites and submit your URL – this could prove more traffic-worthy than striving to get to the top of Google.
3. Paid Ads and Submissions
We have already mentioned some other alternatives to search engines – forums, specialized sites and search engines, search directories – but if you need to make sure that your site will be noticed, you can always resort to paid ads and submissions. Yes, paid listings are a fast and guaranteed way to appear in search results, and most of the major search engines accept payment to put your URL in the Paid Links section for keywords of interest to you. But you must also keep in mind that users generally do not trust paid links as much as they trust the normal ones – in a sense it looks like you are bribing the search engine to place you where you can't get on your own, so think twice about the pros and cons of paying to get listed.
Static Versus Dynamic URLs
There are a couple of good reasons why static URLs score better than dynamic URLs. First, dynamic URLs are not always there – i.e. the page is generated on request after the user performs some kind of action (fills a form and submits it or performs a search using the site's search engine). In a sense, such pages are nonexistent for search engines, because they index the Web by crawling it, not by filling in forms.
Second, even if a dynamic page has already been generated by a previous user request and is stored on the server, search engines might just skip it if it has too many question marks and other special symbols in it. Once upon a time search engines did not index dynamic pages at all, while today they do index them but generally slower than they index static pages.
The idea is not to revert to static HTML only. Database-driven sites are great but it will be much better if you serve your pages to the search engines and users in a format they can easily handle. One of the solutions of the dynamic URLs problem is called URL rewriting. There are special tools (different for different platforms and servers) that rewrite URLs in a friendlier format, so they appear in the browser like normal HTML pages.
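For example, on an Apache server with the mod_rewrite module enabled, a rule placed in an .htaccess file can map a static-looking URL onto the real dynamic script; the paths, parameter name and pattern below are purely illustrative, and other platforms use different tools:
RewriteEngine On
RewriteRule ^articles/([0-9]+)/?$ /article.php?id=$1 [L]
With such a rule you can publish links like /articles/42/, which search engines crawl as ordinary static URLs, while the server quietly serves them from article.php behind the scenes.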
Visual Extras and SEO
1. Images
Images are an essential part of any Web page, and from a designer's point of view they are not an extra but a mandatory item for every site. However, here designers and search engines are at opposite poles, because for search engines every piece of information that is buried in an image is lost. When working with designers, it sometimes takes a while to explain to them that having textual links (with proper anchor text) instead of shiny images is not a whim and that clear text navigation really is mandatory. Yes, it can be hard to find the right balance between artistic performance and SEO-friendliness, but since even the finest site is lost in cyberspace if it cannot be found by search engines, a compromise in its visual appearance cannot be avoided.
With all that said, the idea is not to skip images altogether – that would only produce a plain and ugly site. Rather, images should be used for illustration and decoration, not for navigation or – even worse – for displaying text (in a fancy font, for example). And most importantly, always describe the image in the alt attribute of the <img> tag, so that search engines have some text to work with.
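For instance, a decorative image and a navigational link on the dog adoption site could be marked up like this (the file names and wording are made up purely for illustration):
Example:
<img src="golden-retriever.jpg" alt="Golden Retriever Waiting for Adoption">
<a href="dog-adoption-articles.html" title="Dog Adoption Articles">Dog Adoption Articles</a>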

2. Animation and Movies
The situation with animation and movies is similar to that with images – they are valuable from a designer's point of view, but search engines do not love them. For instance, it is still pretty common to have an impressive Flash introduction on the home page. You just cannot imagine what a disadvantage this is with search engines – it is a number one rankings killer! And it gets even worse if you use Flash to tell a story that could have been written in plain text and hence crawled and indexed. One workaround is to provide search engines with an HTML version of the Flash movie, but in this case make sure that you have excluded the original Flash movie from indexing (this is done with the robots.txt file, which is beyond the scope of this beginners' tutorial); otherwise you can be penalized for duplicate content.
There are rumors that Google is building new search technology that will allow searching inside animation and movies, and that the .swf format will contain new metadata that search engines can use. Until then, you'd better either refrain from using (too much) Flash, or at least provide a textual description of the movie – for example, as alternative content inside the markup that embeds the movie, so that it is visible to users and crawlers that cannot play Flash.
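A minimal sketch of what that could look like (the file name and the description are made up for illustration) – the text inside the object is shown to anyone, including search engine spiders, who cannot render the movie:
Example:
<object type="application/x-shockwave-flash" data="dog-shelter-intro.swf" width="600" height="400">
A short introduction to our shelter and the dogs currently waiting to be adopted.
</object>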
3. Frames
It is good news that frames are slowly but surely disappearing from the Web. Five or ten years ago they were an absolute hit with designers, but never with search engines. Search engines have difficulties indexing framed pages because the URL stays the same no matter which of the separate frames is open. This was a real problem, because there were actually three or four pages behind a single URL, while for search engines one URL is one page. Of course, search engines can follow the links to the pages in the frameset and index them, but it is a hurdle for them.
If you still insist on using frames, make sure that you provide a meaningful description of the site in the <noframes> tag.
Example:
<noframes>
This site is best viewed in a browser that supports frames.
Welcome to our site for prospective dog adopters! Adopting a homeless dog is a most noble deed that will help save the life of the poor creature.
</noframes>
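For context, this is roughly what the framed structure that causes the problem looks like (the file names are made up for illustration) – the visitor always sees the URL of this one frameset page, while the navigation and the content actually live in separate files:
Example:
<frameset cols="25%,75%">
<frame src="navigation.html">
<frame src="content.html">
</frameset>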
4. JavaScript
This is another hot potato. Everybody knows that pure HTML is powerless to build complex sites with a lot of functionality (HTML was never intended to be a programming language for building Web applications, so nobody expects it to handle writing to a database or even storing session information) as required by today's Web users, and that is why other languages (like JavaScript or PHP) come to enhance HTML. For now, search engines simply ignore the JavaScript they encounter on a page. As a result, first, if you have links that are generated inside JavaScript code, chances are that they will not be spidered. Second, if a lot of JavaScript sits in the HTML file itself (rather than in an external .js file that is invoked when necessary), it clutters the HTML file, and spiders might just skip it and move on to the next site.
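To illustrate the point about links, compare a script-driven "link" with a plain text link (the page name is made up for illustration) – a spider can follow the second one but will most likely never discover the first:
Example:
<span onclick="window.location='dog-breeds.html'">Dog Breeds</span>
<a href="dog-breeds.html" title="Dog Breeds">Dog Breeds</a>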
Content Is King
For company sites that are focused on manufacturing rather than on writing, constantly adding text can be a problem – company sites are generally not reading rooms or online magazines that update their content daily, weekly, or monthly. But even for company sites there are reasonable solutions. No matter what your business is, one thing is for sure – it is always relevant to include a news section on your site. It can be company news or RSS feeds, but it will keep the ball rolling.
1. Topical Themes or How to Frequently Add Content to Your Site
If you are doing the SEO for an online magazine, you can consider yourself lucky – fresh content comes in all the time and you just need to occasionally arrange a heading or two, or a couple of paragraphs, to make the site SEO-friendly. But even if you are doing SEO for an ordinary company site, it is not all that bad – there are ways to constantly get fresh content that fits the topic of the site.
One of the intricacies of optimizing a company site is that it has to stay serious. Also, if your content smells like advertising and has no practical value for your visitors, it is not that valuable for SEO either. For instance, if you are a trade company, you can have promotional texts about your products, but keep in mind that these texts must be informational, not just sales hype. And if you have a lot of products to sell, frequently get new products, or run periodic promotions of particular products and product groups, you can post all of this to your site and you will have fresh, topical content.
Also, depending on what your business is about, you can include different kinds of self-updating information, like lists of hot new products, featured products, and discounted items, or even online calculators and order trackers. Unlike promotional pages, this might not bring you many new visitors or improve your rankings by itself, but it is better than nothing.
One more potential traffic trigger for company sites is a news section. Here you can include news about past and coming events, post reports about various activities, announce new undertakings, etc. Some companies go even further – their CEO keeps a blog, where he or she writes in a more informal style about what is going on in the company, in the industry as a whole, or in the world in general. These blogs do attract readers, especially if the information is candid rather than just the official story.
An alternative way to get fresh, free content is RSS feeds. RSS feeds are gaining more and more popularity, and with a little bit of searching you can get free syndicated content for almost any topic you can think of.
2. Bold and Italic Text
When you have lots of text, the next question is how to make the important items stand out from the crowd – for both humans and search engines. While search engines (and their spiders – the programs that crawl the Web and index pages) cannot read text the way humans do, they do have ways of getting at the meaning of a piece of text. Headings are one possibility; bold and italic are another way to emphasize a word or a couple of words that are important. Search engines read the <b> and <i> text and get the idea that what is in bold and/or italic is more important than the rest of the text. But do not use bold and italic too much – this will spoil the effect rather than make the whole page a search engine favorite.
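For instance, a sentence on the dog site could emphasize its key phrases like this (the wording is made up for illustration):
Example:
<p>Our shelter publishes new <b>dog adoption</b> listings every week, along with <i>dog care</i> tips for new owners.</p>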
3. Duplicate Content
When you get new content, there is one important issue – is this content original? If it is not, i.e. it is stolen from another site, it will get you into trouble. But even if it is not illegal – say you obtained it for free from an article feed – keep in mind that you might not be the only one on the Web who has this particular material. If you have the rights to do so, you can change the text a little, so that it is not an exact copy of another page and cannot be labeled "duplicate content" by search engines. If you do not manage to escape the duplicate content filter that search engines have introduced in their attempts to filter out stolen, scraped, or simply copied content, your pages could be removed from the search results!
Duplicate content became an issue when tricky webmasters started making multiple copies of the same page (under different names) in order to fool search engines into thinking they have more content than they actually do. As a result of this malpractice, search engines responded with a duplicate content filter that removes suspicious pages. Unfortunately, this filter sometimes removes quite legitimate pages, like product descriptions that a manufacturer supplies to all its resellers and that must be kept exactly the same.
As you can see, duplicate content can be a serious problem, but it is not an obstacle that cannot be overcome. First, periodically check the Web for pages that are similar to yours; you can use http://copyscape.com for that. If you identify pages that are similar to yours (and it is not you who illegitimately copied them), you can notify the webmasters of the respective sites and ask them to remove the pages. You could also change the text on your site a little, hoping that this way you will avoid the duplicate content penalty. Even with product descriptions, you can add your own commentary or opinion on the same page, and this can be a way out.
Try the Similar Page Checker to check the similarity between two URLs.
Metatags
The meta Description tag is one more place for you to write a description of your site, thus pointing search engines to the themes and topics your Web site is relevant to. It does not hurt to include at least a brief description, so don't skip it. For instance, for the dog adoption site, the meta Description tag could be something like this:
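Example (the exact wording below is just a hypothetical illustration):
<meta name="description" content="Advice, articles, and listings for people who want to adopt a homeless dog and give it a loving home.">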
A potential use of the meta Keywords tag is to include a list of keywords that you think are relevant to your pages. The major search engines give this tag little or no weight, but it is still a chance for you to emphasize your target keywords. You may also consider including alternative spellings (or even common misspellings) of your keywords in the meta Keywords tag. For instance, if I were writing the meta Keywords tag for the dog adoption site, I would do it roughly as shown below. It is at most a small boost towards the top of the rankings, but why miss the chance?
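Example (the keyword list is just a hypothetical illustration):
<meta name="keywords" content="dog adoption, adopt a dog, homeless dogs, puppy adoption, canine adoption">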
The meta Robots tag deserves more attention. In this tag you specify pages that you do NOT want crawled and indexed. It often happens that your site contains material that you need to keep there but do not want indexed. Listing such pages in the meta Robots tag is one way to exclude them from indexing (the other way is a robots.txt file, which is generally the better approach).
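For example, a page that should stay out of the index could carry a tag like this (noindex blocks indexing, and nofollow additionally tells spiders not to follow the links on that page):
Example:
<meta name="robots" content="noindex, nofollow">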
Links – Another Important SEO Item
Probably the word that goes best with the Web is "links". That is what hypertext is all about – you link to pages you like and get linked to by pages that like your site. The Web is woven out of interconnected pages, and spiders follow the links when indexing it. If not many sites link to you, it might take ages for search engines to find your site, and even when they do, you are unlikely to rank high, because the quality and quantity of links is part of the algorithms search engines use to calculate relevancy.
2. Inbound and Outbound Links
Put in layman's terms, there are two types of links that are important for SEO – inbound and outbound links. Outbound links are links that start from your site and lead to another one, while inbound links, or backlinks, come from an external site to yours, e.g. if a.com links to mydomain.com, the link from a.com is an inbound link for mydomain.com.
Backlinks are very important because they are supposed to be a measure of the popularity of your site among the Web audience. It is necessary to say that not all backlinks are equal. There are good and bad backlinks. Good backlinks come from reputable places – preferably from sites with a similar theme – and they do boost search engine ranking. Bad backlinks come from suspicious places – like link farms – and are something to be avoided. If such a site backlinks to you without your knowledge and consent, you may want to drop its webmaster a line, asking him or her to remove the backlink.
If you are not heavily backlinked, don't worry - buying links is an established practice and if you are serious about getting to the top, you may need to consider it. But before doing this, you should consider some free alternatives. For instance, some of the good places where you can get quality backlinks are Web directories like http://dmoz.org or http://dir.yahoo.com.
First, look for suitable sites to backlink to you using the Backlinks Builder below. After you identify potential backlinks, it's time to contact the webmaster of the site and start negotiating terms. Sometimes you can agree on a barter deal – i.e. a link exchange, where they put N links to your site on their site and you put N links to their site on yours – but keep in mind that this is a risky deal and one you should generally try to avoid.
Internal links (i.e. links from one page to another page on the same site) are also important, though not as much as backlinks. In this connection it is necessary to say that using images for links might be prettier, but it is an SEO killer. Instead of having buttons for links, use simple text links. Since search engines read the text on a page, they can't see all the designer miracles, like gradient buttons or Flash animations, so when possible either avoid them or provide a meaningful textual description in the alt attribute of the image and the title attribute of the link.
3. Anchor text
Anchor text is the most important part of a backlink. While it does matter where a link comes from (a reputable place or a link farm), what matters even more is the actual text of the link. Put simply, the anchor text is the word or words you click on to open the hyperlink – e.g. if the text "the best search engine" is a hyperlink pointing to google.com, then "the best search engine" is the anchor text of that link. You might have a backlink from a valuable site, but if its anchor text is something like "an example of a complete failure", you will hardly be happy with it.
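By the same logic, a backlink that helps the dog adoption site would look something like this (the linking markup is sketched here for illustration, reusing the dog-adopt.net example domain from later in this tutorial):
Example:
<a href="http://dog-adopt.net/" title="Dog Adoption Advice">Dog Adoption Advice</a>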
When you check your backlinks, always check what their anchor text is and whether it contains a keyword. It is a great SEO boost to have a lot of backlinks from quality sites whose anchor text includes your keywords. You can check the anchor text of inbound backlinks with the Backlink Anchor Text Analyzer tool below. Besides the anchor text itself, the text around it is also important.
4. Link Practices That Are To Be Avoided
Similar to keyword stuffing, purchasing links in bulk is a practice to be avoided. It looks suspicious if you bartered 1,000 links with another site in a day or two. What is more, search engines keep track of link farms (networks of sites that exist mainly to sell and exchange links in bulk), and since bought links are a way to manipulate search results, this practice gets punished by search engines. So avoid dealing with link farms, because it can cause more harm than good. Outbound links from your site to known Web spammers or "bad guys" are also to be avoided.
As mentioned, link exchange is not a clean deal. Even if it boosts your ranking, it can have many negative aspects in the long run. First, you do not know if the other party will keep their promise – they might remove some of the links to you. Second, they might change the context in which the link appears. Third, it looks really suspicious if you seem to be "married" to another site, with 50% or more of your inbound and outbound links going to or coming from it.
Where links are concerned, one aspect to keep in mind is the ratio between inbound and outbound links. Generally speaking, if your outbound links number ten times your inbound links, that is bad, but it also varies on a case-by-case basis. If your site links to many news sources or carries RSS feeds, then having many outbound links is the inevitable price of fresh content.
Keywords – the Most Important Item in SEO
1. Choosing the Right Keywords to Optimize For
It seems that the time when you could easily top the results for a one-word search string is long gone. Now that the Web is so densely populated with sites, it is next to impossible to achieve consistently top ratings for a one-word search string. Achieving consistently top ratings for two- or three-word search strings is a more realistic goal. If you examine closely the dynamics of search results for popular one-word keywords, you might notice that it is easy to be in the first ten results one week and to have fallen out of the first 30 the next, because the competition for popular one-word keywords is so fierce that other sites quickly replace you.
Of course, you can include one-word strings in your keywords list but if they are not backed up by more expressions, do not dream of high ratings. For instance, if you have a site about dogs, “dog” is a mandatory keyword but if you do not optimize for more words, like “dog owners”, “dog breeds”, “dog food”, or even “canine”, success is unlikely, especially for such a popular keyword. The examples given here are by no means the ultimate truth about how to optimize a dog site but they are good enough to show that you need to think broad when choosing the keywords.
Generally, when you start optimization, the first thing to consider is which keywords describe the content of your site best and which are most likely to be used by users to find you. Ideally, you know your users well and can guess correctly what search strings they are likely to use to search for you. One issue to consider is synonyms. Very often users will use a different word for the same thing. For instance, in the example with the dog site, "canine" is a synonym, and some users will certainly search with it, so it does not hurt to include it now and then on your pages. But do not rush to optimize for every synonym you can think of – search engines themselves have algorithms that include synonyms in the keyword match, especially for languages like English.
Instead, think of more keywords that are likely to be used to describe your site. Thinking thematically is especially good, because search engines tend to rate a page higher if it belongs to a site whose theme fits the keyword string. In this respect it is important that your site is concentrated around a particular theme – i.e. dogs. It might be difficult to think of all the relevant keywords on your own, but that is what tools are for. For instance, the Website Keyword Suggestions Tool below can help you see how search engines determine the theme of your web site and which keywords fit into this theme. You can also try Google's Keyword Tool to get more suggestions about which keywords are hot and which are not.
When choosing the keywords to optimize for, you need to consider not only their relevancy to your site but also the expected monthly number of searches for those particular keywords. Very often narrower searches are more valuable, because the users they bring to your site are the ones really interested in your product. To continue the dog example, you might discover that the "adopt a dog" keyphrase brings you more valuable visitors because you have a special section on your site with advice on what to look for when adopting a dog. This page is of no interest to current dog owners, but only to potential dog owners, who might be fewer in number but are your target audience, and the overall effect of attracting this niche can be better than attracting everybody who is interested in dogs in general. So, when you look at the number of search hits per month, consider the unique hits that fit into the theme of your site.
2. Keyword Density
After you have chosen the keywords that describe your site and are supposedly of interest to your users, the next step is to make your site keyword-rich and to achieve good keyword density for your target keywords. Keyword density is a common measure of how relevant a page is. Generally, the idea is that the higher the keyword density, the more relevant the page is to the search string. The recommended density is 3-7% for the major two or three keywords and 1-2% for minor keywords. Try the Keyword Density Checker below to determine the keyword density of your website.
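As a quick illustration with made-up numbers: on a page of 500 words, a major keyword that appears 20 times has a keyword density of 20 / 500 = 4%, which falls inside the recommended 3-7% range, while a minor keyword used 7 times would sit at 7 / 500 = 1.4%.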
Although there are no strict rules, try optimizing for a reasonable number of keywords – 5 or 10 is OK. If you attempt to optimize for a list of 300, you will soon see that it is just not possible to maintain good keyword density for more than a few keywords without making the text sound artificial and stuffed with keywords. And what is worse, there are severe penalties (including a ban from the search engine) for keyword stuffing, because it is considered an unethical practice that tries to manipulate search results.
3. Keywords in Special Places
Keywords matter not only in terms of quantity but in terms of placement as well – keywords in the page title, the headings, and the first paragraphs count for more than keywords at the bottom of the page. The reason is that the URL (and especially the domain name), file names and directory names, the page title, and the headings of the separate sections carry more weight than ordinary text on the page. Therefore, all else being equal, if you have the same keyword density as your competitors but you also have keywords in the URL, this will boost your ranking considerably, especially with Yahoo!.
a. Keywords in URLs and File Names
The domain name and the whole URL of a site tell a lot about it. The presumption is that if your site is about dogs, you will have “dog”, “dogs”, or “puppy” as part of your domain name. For instance, if your site is mainly about adopting dogs, it is much better to name your dog site “dog-adopt.net” than “animal-care.org”, for example, because in the first case you have two major keywords in the URL, while in the second one you have no more than one potential minor keyword.
When hunting for keyword-rich domain names, don't get greedy. While from an SEO point of view it is better to have five keywords in the URL, just imagine how long and hard to memorize such a URL would be. So you need to strike a balance between keywords in the URL and site usability, which says that more than three words in the URL is way too much. You will probably not be able to come up with tons of good suggestions on your own, and even if you manage to think of a couple of good domain names, they might already be taken. In such cases, tools like the tool below can come in very handy.
File names and directory names are also important. Often search engines will give preference to pages that have a keyword in the file name. For instance http://mydomain.com/dog-adopt.html is not as good as http://dog-adopt.net/dog-adopt.html but is certainly better than http://mydomain.com/animal-care.html. The advantage of keywords in file names over keywords in URLs is that they are easier to change, if you decide to move to another niche, for example.
b. Keywords in Page Titles
The page title is another special place, because the contents of the <title> tag are what search engines display as the clickable heading of your listing in the search results, and keywords there are given considerable weight.
Unlike URLs, with page titles you can get a little wordy. If we go on with the dog example, the <title> tag could comfortably hold the main keyphrase together with a few descriptive words, as in the sketch below.
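Example (the wording is just a hypothetical illustration):
<title>Dog Adoption – Advice and Listings for Adopting a Homeless Dog</title>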
c. Keywords in Headings
Normally headings separate a page into related subtopics, and from a literary point of view it may be pointless to have a heading after every other paragraph, but from an SEO point of view it is extremely good to have as many headings on a page as possible, especially if they have the keywords in them.
There are no technical length limits for the contents of the <h1>, <h2>, <h3>, ... <h6> tags, but common sense says that overly long headings are bad for page readability. So, like with URLs, you need to be wise about the length of headings. Another issue to consider is how the heading will be displayed. If it is a Heading 1 (<h1>), this generally means a larger font size, and in that case it is advisable to have fewer than 7-8 words in the heading; otherwise it might spread over two or three lines, which is not good, so avoid it if you can.
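For instance, the dog adoption site could structure a page with keyword-rich headings like these (made up for illustration):
Example:
<h1>Adopt a Dog</h1>
<h2>Why Adopting a Homeless Dog Is a Good Idea</h2>
<h2>Dog Adoption Articles and Advice</h2>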
Introduction – What Is SEO
One of the basic truths in SEO is that even if you do everything that needs to be done, this does not automatically guarantee you top ratings; but if you neglect the basic rules, it will certainly not go unnoticed. Also, if you set realistic goals – i.e. to get into the top 30 results in Google for a particular keyword rather than to be number one for 10 keywords in 5 search engines – you will feel happier and more satisfied with your results.
Although SEO helps to increase the traffic to one's site, SEO is not advertising. Of course, you can be included in paid search results for given keywords but basically the idea behind the SEO techniques is to get top placement because your site is relevant to a particular search term, not because you pay.
SEO can be a 30-minute job or a permanent activity. Sometimes it is enough to do some generic SEO in order to get high in search engines – for instance, if you are a leader for rare keywords, then you do not have a lot to do in order to get decent placement. But in most cases, if you really want to be at the top, you need to pay special attention to SEO and devote significant amounts of time and effort to it. Even if you plan to do some basic SEO, it is essential that you understand how search engines work and which items are most important in SEO.
1. How Search Engines Work
The first basic truth you need to learn about SEO is that search engines are not humans. While this might be obvious for everybody, the differences between how humans and search engines view web pages aren't. Unlike humans, search engines are text-driven. Although technology advances rapidly, search engines are far from intelligent creatures that can feel the beauty of a cool design or enjoy the sounds and movement in movies. Instead, search engines crawl the Web, looking at particular site items (mainly text) to get an idea what a site is about. This brief explanation is not the most precise because as we will see next, search engines perform several activities in order to deliver search results – crawling, indexing, processing, calculating relevancy, and retrieving.
First, search engines crawl the Web to see what is there. This task is performed by a piece of software called a crawler or a spider (or Googlebot, as is the case with Google). Spiders follow links from one page to another and index everything they find on their way. Having in mind the number of pages on the Web (over 20 billion), it is impossible for a spider to visit a site daily just to see if a new page has appeared or if an existing page has been modified. Sometimes crawlers will not visit your site for a month or two, and during this time your SEO efforts will not be rewarded; there is nothing you can do about it except wait.
What you can do is to check what a crawler sees from your site. As already mentioned, crawlers are not humans and they do not see images, Flash movies, JavaScript, frames, password-protected pages and directories, so if you have tons of these on your site, you'd better run the Spider Simulator below to see if these goodies are viewable by the spider. If they are not viewable, they will not be spidered, not indexed, not processed, etc. - in a word they will be non-existent for search engines.
After a page is crawled, the next step is to index its content. The indexed page is stored in a giant database, from where it can later be retrieved. Essentially, the process of indexing is identifying the words and expressions that best describe the page and assigning the page to particular keywords. For a human it will not be possible to process such amounts of information but generally search engines deal just fine with this task. Sometimes they might not get the meaning of a page right but if you help them by optimizing it, it will be easier for them to classify your pages correctly and for you – to get higher rankings.
When a search request comes in, the search engine processes it – i.e. it compares the search string in the request with the indexed pages in its database. Since it is likely that more than one page (in practice, millions of pages) contains the search string, the search engine starts calculating the relevancy of each page in its index to that search string.
There are various algorithms to calculate relevancy. Each of these algorithms has different relative weights for common factors like keyword density, links, or metatags. That is why different search engines give different search results pages for the same search string. What is more, it is a known fact that all major search engines, like Yahoo!, Google, MSN, etc. periodically change their algorithms and if you want to keep at the top, you also need to adapt your pages to the latest changes. This is one reason (the other is your competitors) to devote permanent efforts to SEO, if you'd like to be at the top.
The last step in search engines' activity is retrieving the results. Basically, it is nothing more than simply displaying them in the browser – i.e. the endless pages of search results that are sorted from the most relevant to the least relevant sites.
2. Differences Between the Major Search Engines
Although the basic principle of operation of all search engines is the same, the minor differences between them lead to major differences in result relevancy. Different factors are important for different search engines. There was a time when SEO experts joked that the algorithms of Yahoo! were intentionally made just the opposite of Google's. While this might have a grain of truth in it, it is a matter of fact that the major search engines like different things, and if you plan to conquer more than one of them, you need to optimize carefully.
There are many examples of the differences between search engines. For instance, for Yahoo! and MSN on-page keyword factors are of primary importance, while for Google links are very, very important. Also, for Google sites are like wine – the older, the better – while Yahoo! generally has no expressed preference towards sites and domains with tradition (i.e. older ones). Thus you might need more time for your site to mature enough to be admitted to the top in Google than in Yahoo!.