Search engines: Keeping them all happy

You have probably already heard that the ubiquitous, industry-leading search engine Google has some new competitors on the rise. Former powerhouse search engine Yahoo, following its absorption of Inktomi, is poised to make a move.



MSN Search, backed by the engineering and marketing power of the Microsoft Corporation, has designs on an enlarged market share as well.



Smaller search engines, including Teoma, are also looking for their spot at the search engine table.



When you design and launch a new website, aiming for strong result placements on Google remains very important. You must never forget the rising players in the search engine game, however.



They have some similarities, and some differences, in what it takes to achieve high rankings in their results. You need to know what makes them happy too.



As part of their drive to win the screens and minds of internet searchers, the various search engines offer on-site search interfaces to attract your queries. By making their technology available to webmasters, they appear to be attempting to gain some grassroots search loyalty.



Site design and site promotion must take all of the search engines and their whims into account to gain high rankings.



All search engines use complex mathematical formulas, known as algorithms, to calculate the position of web pages in their search results. The basic functions of the search engines are the same: they send out computerized robots, known as spiders, to crawl the internet in search of web pages to add to their indexes.



Upon receiving a search request, the search engines examine all of the pages within their respective indexes, and return what their algorithms compute to be the most relevant results.

The various algorithms are unique to each search engine. In a broad sense, they use similar indicators of web page importance.



All search engines currently consider incoming links, on-page content, and off-page factors in their calculations. Because of those similarities, the designer of a new website has some guidelines as to what is required to achieve high rankings.



There is some speculation that the new MSN Search will rely on on-page wording in place of link and off-page factors. The thought is that MSN will use word context as its primary search technique.



The word context system places the emphasis on strong copywriting skills. How the words are used in the on-page copy would determine the relevance of a web page. That sounds like a real advantage for blogs.



Increasing the weight of contextual keywords in a search algorithm would not only work for MSN Search, but would also benefit a site in the other major search engines. The addition of well-written content to any website will help its placements in the search results.



By providing a solid reason for better-written content, the addition of context-based search could enhance the results in all of the search engines, regardless of algorithm. Content is a major part of all search engine results; in fact, content is what the search engines attempt to provide.



A decline in terms-of-service violations such as hidden text and doorway pages might be a positive result. Since those spam techniques cannot provide any contextual relation to the on-page content, they would be less of a threat to honest webmasters.



Such a search method would lessen the importance of keywords and incoming links. It is highly unlikely, however, that their value would be removed entirely. For now, that remains only a possibility, as MSN continues to use calculations similar to those of the other search engines.



Whichever search engine you target, there are many common requirements that must be considered when you design your website. These are concepts of search engine optimization that are common to all search engine algorithms, to a greater or lesser degree. By building them into your site design from the beginning, your website gets a head start in the search engine placements race.



Make certain that your newly designed site includes a good site map. Fortunately, most blogging tools provide one as part of the system.



A site map gives you double service: it helps your visitors find their way around, and it helps the various search engine spiders crawl and index all of your web pages. Most blogging platforms provide archive links and links to previous posts, and many provide categories as well.
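
If your blogging tool does not build one for you, a site map can be as simple as a single page of descriptive links. A minimal sketch, with an invented herb-gardening blog and placeholder URLs:

    <!-- sitemap.html: a hypothetical hand-built site map page -->
    <html>
      <head>
        <title>Site Map - Herb Gardening Blog</title>
      </head>
      <body>
        <h1>Site Map</h1>
        <ul>
          <!-- one descriptive, keyword-bearing link per page or section -->
          <li><a href="/herb-garden-design/">Herb Garden Design</a></li>
          <li><a href="/growing-basil-indoors/">Growing Basil Indoors</a></li>
          <li><a href="/archives/">Post Archives by Month</a></li>
        </ul>
      </body>
    </html>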



Be sure that the links to your various pages include the keywords for each specific page. Try to avoid general and vague terms like "home", "about us", etc. Use keywords that highlight the subject matter of the page. This not only helps the search engine spiders, but assists your visitors as well.
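
As a quick sketch of the difference, with page names invented for illustration, compare a vague link with a keyword-rich one:

    <!-- vague anchor text: tells the spiders and your visitors nothing -->
    <a href="/page2.html">Click here</a>

    <!-- descriptive anchor text: the keywords match the page it points to -->
    <a href="/herb-garden-design/">Herb Garden Design Tips</a>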



Place a unique title on each post on your blog. Use the main keywords for that page within the title tags. All of the search engine spiders can use that information to help rank your page higher in the search engine results pages (SERPs).
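
In plain HTML, that title lives in the head of the page. A small sketch, with an invented post topic:

    <head>
      <!-- a unique, keyword-bearing title for this post alone -->
      <title>Herb Garden Design: Choosing Plants for a Shady Yard</title>
    </head>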



You get the added benefit of having more pages indexed by the search engines, meaning more entry points to your site. It is also generally thought that larger sites with more pages do better in the SERPs than smaller sites.



Place your keywords higher on the page when you create your content. Some search engines apparently prefer the keywords to appear sooner, emphasizing their importance. Make certain that the keywords appear throughout the written copy, to give all of the search engine spiders something tasty to nibble on.



Since there is some difference in the spiders' crawling preferences, you cover all the bases by sprinkling your keywords early, in the middle, and late in your copy. Pay particular attention to what appears at the top of the page, since that is generally what the spiders encounter first.
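
One way to picture that placement, using an invented keyword phrase, is a post body along these lines:

    <body>
      <!-- keywords high on the page, in the heading -->
      <h1>Herb Garden Design for Small Yards</h1>
      <!-- ...and early in the opening copy -->
      <p>Good herb garden design starts with choosing the right spot.</p>
      <!-- again in the middle of the copy -->
      <p>Raised beds make herb garden design easier in heavy clay soil.</p>
      <!-- ...and once more near the end -->
      <p>With these herb garden design basics, even a balcony can grow fresh herbs.</p>
    </body>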



Be careful with keyword density, which is simply the number of times a keyword appears divided by the total word count of the page; a keyword used five times in a 250-word post, for example, gives a density of two percent. While Yahoo and MSN Search may tolerate heavier keyword use, Google apparently has an upper limit in mind. Use your keywords in the most appropriate locations, where they flow naturally in your copy, and you should experience no problems. Solid placement of keywords is a good idea; overuse could be considered spamming, and that's a no-no.



While the addition of meta tags to your site may not carry any noticeable weight in the Google algorithm, such is not the case for the other search engines. There is some evidence that Yahoo and MSN Search consider meta tags in their search calculations. The use of meta tags was certainly a strong consideration in the Inktomi algorithm.



When you create a list of meta tags, use only the keywords that actually appear in that page's content. Overuse of keywords, especially words that are unrelated to the on-page content, could be viewed as spam by the search engines. Keep your meta tags precise and targeted for that page alone.
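
A sketch of page-specific meta tags, again with an invented topic, might look something like this:

    <head>
      <title>Herb Garden Design: Choosing Plants for a Shady Yard</title>
      <!-- only keywords that actually appear in this page's copy -->
      <meta name="keywords" content="herb garden design, shade herbs, raised beds">
      <!-- a short description targeted at this page alone -->
      <meta name="description" content="How to design a small herb garden for a shady yard, with shade-tolerant herbs and raised beds.">
    </head>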



When designing your website, it’s important to keep the differences and similarities between the various search engines in mind.



While there are some differences in the algorithms utilized by the search engines, there are many features that are common to all of them.




