May 24, 2011

SEO Search Engine Friendly Web Directories List

A Web Directory or link directory is a directory on the World Wide Web that specializes in linking to other web sites and categorizing those links.

Sr. No. / Web Directory URL / IP Address

1. Add URL The Directory (50.6.179.155)
2. Submit To Directory (76.163.25.106)
3. Submit To Dir (76.163.25.108)
4. Add URL To Dir (76.163.25.128)
5. Add Website To Dir (76.163.25.129)
6. Submit Site To Directory (76.163.25.130)
7. List Your Site To Dir (76.163.25.131)
8. List Site To Dir (76.163.25.136)
9. Add Site To Dir (76.163.25.137)
10. Submit Site To Dir (96.0.255.52)
11. Submit To Dir Now (96.0.255.53)
12. Join Directory (98.130.211.58)
13. Best Dir Listing (98.130.211.59)
14. Add Link Z Directory (98.130.211.60)
15. Submit URL To Dir (98.130.211.83)
16. URL The Directory (98.130.211.84)
17. Dandy Web Directory (98.130.211.85)
18. Zoot Web Directory (98.130.211.86)
19. Add URL The Dir (98.130.211.231)
20. Submit URL The Dir (98.130.222.35)
21. Submit URL Web Dir (98.130.222.36)
22. Add URL Web Dir (98.130.222.72)
23. Submit URL Web Directory (98.130.222.115)
24. Submit URL Planet (98.130.222.116)
25. The Submit URL Planet (98.130.222.136)
26. The Add URL Planet (98.130.222.137)
27. Add URL Planet (98.130.222.138)
28. Add URL On Web (98.130.222.148)
29. Submit URL On Web (98.130.222.149)
30. Add Submit URL Web Dir (98.130.222.150)
31. Add Submit URL Web Directory (98.130.222.152)
32. Zingy Web Directory (98.130.222.153)
33. The Top Web Directory (76.163.25.61)
34. Tizzy Web Directory (76.163.25.122)
35. Add Link Web Dir (76.163.25.177)
36. Submit Link Web Dir (76.163.25.178)
37. The Add Link Web Dir (76.163.25.240)
38. The Submit Link Web Dir (76.163.25.241)
39. Top Submit Link Dir (98.130.222.158)
40. Top Add Link Dir (96.0.254.104)
41. Best Add Link Dir (96.0.254.37)
42. Best Submit Link Dir (96.0.254.43)
43. Add URL Orb (96.0.254.99)
44. Submit URL Orb (96.0.254.57)
45. Submit URL Web Orb (96.0.254.45)
46. Add URL Web Orb (96.0.254.53)
47. The Add URL Orb (96.0.254.85)
48. The Submit URL Orb (96.0.254.58)
49. Add Link Orb (96.0.254.76)
50. Submit Link Orb (96.0.254.52)
51. Add Submit URL Orb (96.0.254.98)


May 4, 2011

Free High PR Article Directory List

Article directories are websites where users submit unique articles to be categorized and included in a specific niche. Well-written content articles released for free distribution have the potential of increasing the authoring business's credibility within its market as well as attracting new clients.
  1. a1-articledirectory.com
  2. articlebliss.com
  3. articlekarma.com
  4. goodinfohome.com
  5. lifeweightloss.com
  6. articledirectorylive.com
  7. articlenexus.com
  8. articlesinsight.com
  9. e-zinearticles.info
  10. ezine-submission.com
  11. freearticlez.com
  12. article-voip.com
  13. articlemarketing.org
  14. articles.incentivesearch.com
  15. articlewealth.com
  16. bestpublicspeakingarticles.com
  17. clicknews.biz
  18. fivestararticles.com
  19. newagelivingarticles.com
  20. wefindyouarticles.com
  21. 321articles.com
  22. articles2go.co.uk
  23. articleshmarticle.com
  24. articlesmart.org
  25. didarticles.com
  26. electrictext.com
  27. healthandwealth4you.com
  28. theoriginalarticle.com
  29. article.directory4u.org
  30. usrealestatesite.com
  31. britishrealestateinfo.com
  32. article-smart.com
  33. articlearty.com
  34. my-resource.com
  35. business.freearticledirectories.com
  36. spotyourarticle.com
  37. articledirectorycentral.com
  38. articlesidea.com
  39. articleuser.com
  40. bizarticlesonly.com
  41. medicalsupportforum.com
  42. gurumarketingarticles.com
  43. newarticleseek.com
  44. articledashboard.net
  45. articlevines.com
  46. freeseowebmastertools.com
  47. articleoncall.com
  48. sales-articles.com
  49. kruppenterprises.com
  50. ezineproarticles.com
  51. free-find-articles.com
  52. goarticles.com
  53. articledashboard.com
  54. articlesbase.com
  55. amazines.com
  56. articlesnatch.com
  57. articlerich.com
  58. earticlesonline.com
  59. sooperarticles.com
  60. addarticles.org
  61. widbox.com
  62. articlealley.com
  63. bdv-articles.co.uk
  64. thearticlesbase.com
  65. basearticles.com
  66. articlecorp.com
  67. articlemonkeys.com
  68. articlesservices.com
  69. informationhut.com
  70. articlecoop.com
  71. topmalldirectories.org
  72. articleforsubmission.com
  73. 1articlesdirectory.com
  74. akerpub.com
  75. articlecamp.com
  76. articledirectory.com
  77. articleonlinedirectory.com
  78. articleintelligence.com
  79. articlepantry.com
  80. articlesurge.com
  81. findezinearticles.com/directory
  82. articleshouse.info
  83. articles-unlimited.com
  84. articlesolve.com
  85. articles.directorygold.com
  86. ebusiness-articles.com
  87. ezeen.net
  88. articlesgrowth.info
  89. realarticle.com
  90. freebie-articles.com
  91. fyifiles.com
  92. kallblad.com
  93. muvee.co.in
  94. intelligentseoarticles.com
  95. article4wealth.com
  96. articlesthewebdirectory.com
  97. articlesthedirectory.com
  98. article.casinowinz.com
  99. article.mytoureguide.com
  100. article.shopvianet.com

April 28, 2011

Optimize your page content


There are countless tips for optimizing your page’s content so that it will be more relevant to a given search. Each engine ranks pages differently, so most tips are not universal. However, there is one tip that overrides them all:

Create pages that emulate the statistics of pages that already rank at or near the top of the search results. These statistics include:

Frequency of the keyword on the page – this does not mean more keywords are better. Instead, emulate the keyword counts of top-ranking pages as closely as possible. Be careful not to base your entire strategy on the statistics of a single top-ranking page. The content of a top-ranking page could easily have changed since it was last indexed. Therefore, the pages ranking in the Top 10 may not always represent exactly what the engine is currently looking for. Using averages is one way to combat this problem.

Total words on the page – mimic the approximate number of words of a top-ranking page on your own page.

Weight of the keywords on the page (i.e., frequency divided by total words) – too high a weight is just as bad as too low a weight.

Location of the keyword on the page (i.e., title, heading, etc.) – a keyword is given more relevance by an engine when it appears in the engine’s preferred areas.

Prominence – generally, the closer to the front of the area you can place the keyword, the better.

Proximity – the closer that the words of a phrase appear together, the better.

Off-page criteria (i.e., link popularity, click-through popularity, etc.) – even when you’ve done everything else right, don’t forget the off-page factors!
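Several of the statistics above (frequency, total words, weight) are simple to compute. As a rough illustration, here is a minimal Python sketch using a naive whitespace tokenizer; a real page would need its HTML stripped first:

```python
def keyword_weight(text: str, keyword: str) -> float:
    """Keyword weight: frequency of the keyword divided by total words."""
    words = text.lower().split()
    if not words:
        return 0.0
    frequency = words.count(keyword.lower())
    return frequency / len(words)

# 2 occurrences of "gourmet" out of 10 words -> weight 0.2
page = "gourmet foods and gourmet gift baskets from silver platter foods"
print(keyword_weight(page, "gourmet"))
```

Comparing this number against the average weight of the top-ranking pages, rather than a single page, follows the averaging advice above.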

Bonus tip 1: In general, you should try including your keyword or phrase in the Title tag, Heading tags, the link text, and to a lesser extent, your Meta keyword and description tags. There are other areas in which you may want to include the keyword, depending on the engine. For example, Google is known to give a ranking boost to keywords that are in bold or large print.
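To see which of those areas (title, headings, link text, bold) a keyword actually lands in, a page can be scanned with Python's standard html.parser. The tag list below is an illustrative assumption, not any engine's official weighting:

```python
from html.parser import HTMLParser

class KeywordAreaScanner(HTMLParser):
    """Record which tracked page areas contain a given keyword."""
    TRACKED = {"title", "h1", "h2", "h3", "a", "b", "strong"}

    def __init__(self, keyword: str):
        super().__init__()
        self.keyword = keyword.lower()
        self.stack = []   # currently open tracked tags
        self.areas = set()

    def handle_starttag(self, tag, attrs):
        if tag in self.TRACKED:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()

    def handle_data(self, data):
        if self.stack and self.keyword in data.lower():
            self.areas.add(self.stack[-1])

html_page = "<title>Gourmet Foods</title><h1>Our gourmet range</h1><a href='/'>home</a>"
scanner = KeywordAreaScanner("gourmet")
scanner.feed(html_page)
print(sorted(scanner.areas))  # ['h1', 'title']
```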

Bonus tip 2: Naming your page after your keyword and/or obtaining a domain name with your keyword in it will often boost your rankings. 

Bonus tip 3: If you run a regional business where most of your business is local, it’s critical that you include your full company address on every page of your site. Otherwise, people could search for “Ford dealer in Chicago” and your site would not appear if your company address was buried only on your contact page. Also, take advantage of proximity by putting the word “Chicago” as close to the phrase “Ford dealer” as possible. Lastly, make sure the address is in text form, since search engines can’t read your address out of a graphical logo on your page.

Bonus tip 4: Don’t spam the engines. Every engine has its pet peeves, so make sure you know what they are and avoid them. Unfortunately, generalized tips will only take you so far. That’s why we developed the Page Critic feature of WebPosition Gold and have continued to fine-tune it over the years. We also update the advice every month to keep pace with the changes at each engine. The Critic gives specific advice for the keyword, the web page, and the engine that you select, so you’re not overwhelmed with advice that doesn’t apply, and you don’t find yourself wasting hours trying to count and locate keywords on your page and your competitors’ pages.

April 26, 2011

Proper submission to the major Directories is critical


One of the most significant changes in search engine marketing in recent years has been the rise in popularity of human-reviewed directories and catalogs like LookSmart, Yahoo!, and Open Directory. Some search engines prominently display directory listings for many popular searches. MSN is a prime example.

Do a search on MSN, and you’ll generally find the first page of results dominated by LookSmart directory listings. Some of the other major engines also list directory results prominently or at least emphasize them in various ways. You can recognize directory listings since they are often called “Web site results” rather than “Web page results.”

Once you submit to a directory, it’s difficult to go back and correct mistakes later, so it’s of the utmost importance to get it right the first time. There are many strategies for achieving great visibility with the directories. Some of them involve keyword placement and some involve human psychology. Tips for the other directories can be found in WebPosition Gold’s Submitter. Read all the information you can about submitting to each directory before you submit. Even if you’ve already submitted poorly, you should find some strategies to help you reverse the damage.

April 25, 2011

Target the right keywords

For those of you who are new to search engine marketing, a keyword is simply a word or phrase that people would search on to find your web site. You might think that choosing the right keywords to target would be a no-brainer. However, you’d be surprised at how many people jump in, optimize their web site, and achieve top rankings, then cry out in dismay when their hit counter registers no more visitors than it did before they went to all the work of tuning up their site.

What happened? They failed to choose keywords that people were actually looking for. Therefore, I recommend you:
 
  • Brainstorm a list of keywords and phrases that apply to your web site’s products and services. Try to place yourself in the shoes of the web searcher. Avoid generalities like “small business”. Yes, you may sell a product for small businesses, but who is going to search for “small business” when they are looking for a new Windows accounting program?
     
  • Take advantage of excellent services like Wordtracker to tell you which keywords are popular but not so competitive as to make a top ranking next to impossible. There’s a fine line between targeting keywords that are too general or competitive versus keyword phrases that are so specific that few people ever think to search for them. Wordtracker handles both with ease. In addition, Wordtracker will do much of the brainstorming for you by taking a couple of keywords and producing a broad list of related words and phrases from which to choose.
Again, please don’t make the mistake of picking the wrong keywords. Nothing is more disappointing than taking the time to achieve a top ranking and then seeing no increase in traffic from all your efforts. Also, don’t pick keywords that are too popular or broad, like “games” or “entertainment.” Not only are such visitors unlikely to buy your product, but the amount of work needed to gain that ranking will not be worth the trouble. You’ll then join the ranks of misinformed critics screaming “search engine optimization doesn’t work – don’t waste your time.” Search engine optimization works, and works well, if you take the time to do it right.

April 23, 2011

Google LocalRank

On February 25, 2003, Google patented a new algorithm for ranking pages called LocalRank. It is based on the idea that pages should be ranked not by their global link citations, but by how they are cited among pages that deal with topics related to the particular query. The LocalRank algorithm is not used in practice (at least, not in the form described in the patent). However, the patent contains several interesting innovations we think any SEO specialist should know about. Nearly all search engines already take into account the topics to which referring pages are devoted. The algorithms actually used seem rather different from LocalRank, but studying the patent will allow us to learn general ideas about how this may be implemented.

While reading this section, please bear in mind that it contains theoretical information rather than practical guidelines.

The following three items comprise the main idea of the LocalRank algorithm:

1. An algorithm is used to select a certain number of documents relevant to the search query (let it be N). These documents are initially sorted by some criteria (this may be PageRank, relevance or a group of other criteria). Let us call the numeric value of this criterion OldScore.

2. Each of the N selected pages goes through a new ranking procedure and gets a new rank. Let us call it LocalScore.

3. The OldScore and LocalScore values for each page are multiplied, to yield a new value – NewScore. The pages are finally ranked based on NewScore.

The key procedure in this algorithm is the new ranking procedure, which gives each page a new LocalScore rank. Let us examine this new procedure in more detail:

0. An initial ranking algorithm is used to select N pages relevant to the search query. Each of the N pages is allocated an OldScore value by this algorithm. The new ranking algorithm only needs to work on these N selected pages.

1. While calculating LocalScore for each page, the system selects those pages from N that have inbound links to this page. Let this number be M. At the same time, any other pages from the same host (as determined by IP address) and pages that are mirrors of the given page will be excluded from M.

2. The set M is divided into subsets Li. These subsets contain pages grouped according to the following criteria:

1. Pages belonging to one (or similar) hosts. Pages whose first three octets in their IP addresses are the same get into one group; that is, pages whose IP addresses belong to the range xxx.xxx.xxx.0 to xxx.xxx.xxx.255 are considered as belonging to one group.

2. Pages that have the same or similar content (mirrors).

3. Pages on the same site (domain).

3. Each page in each Li subset has rank OldScore. The one page with the largest OldScore rank is taken from each subset; the rest of the pages are excluded from the analysis. Thus, we get a subset of pages K referring to this page.

4. Pages in the subset K are sorted by the OldScore parameter, then only the first k pages (k is some predefined number) are left in the subset K. The rest of the pages are excluded from the analysis.

5. LocalScore is calculated in this step. The OldScore values of the remaining k pages are combined with the help of the following formula:

   LocalScore = Σ i=1..k (OldScore(i))^m

   Here m is some predefined parameter that may vary from one to three. Unfortunately, the patent for the algorithm in question does not describe this parameter in detail.

After LocalScore is calculated for each page from the set N, NewScore values are calculated and pages are re-sorted according to the new criteria. The following formula is used to calculate NewScore:

NewScore(i) = (a + LocalScore(i)/MaxLS) * (b + OldScore(i)/MaxOS)

i is the page for which the new rank is calculated.

a and b are numeric constants (there is no more detailed information in the patent about these parameters).

MaxLS is the maximum LocalScore among those calculated.

MaxOS is the maximum value among the OldScore values.

Now let us put the math aside and explain these steps in plain words.

In step 0) pages relevant to the query are selected. Algorithms that do not take into account the link text are used for this. For example, relevance and overall link popularity are used. We now have a set of OldScore values. OldScore is the rating of each page based on relevance, overall link popularity and other factors.

In step 1) pages with inbound links to the page of interest are selected from the group obtained in step 0). The group is whittled down by removing mirror and other sites in steps 2), 3) and 4) so that we are left with a set of genuinely unique sites that all share a common theme with the page that is under analysis. By analyzing inbound links from pages in this group (ignoring all other pages on the Internet), we get the local (thematic) link popularity.
 
LocalScore values are then calculated in step 5). LocalScore is the rating of a page among the set of pages that are related by topic. Finally, pages are rated and ranked using a combination of LocalScore and OldScore.
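Putting the five steps together, here is a minimal Python sketch of the whole re-ranking procedure. The sample pages and links, the /24 same-host heuristic, and the choices m = 2, a = b = 1, k = 10 are illustrative assumptions; the patent leaves these parameters open.

```python
from collections import defaultdict

def subnet(ip: str) -> str:
    """First three octets of an IP address (the /24 group from step 2)."""
    return ".".join(ip.split(".")[:3])

def local_rank(pages, links, m=2, a=1.0, b=1.0, k=10):
    """Re-rank pages with LocalRank-style scores.

    pages: {page: {"ip": str, "old_score": float}} -- the relevant set N.
    links: set of (source, target) pairs among those pages.
    """
    local = {}
    for target in pages:
        # Step 1: relevant pages linking to this one, excluding its own host.
        m_set = [src for (src, tgt) in links
                 if tgt == target and src != target
                 and subnet(pages[src]["ip"]) != subnet(pages[target]["ip"])]
        # Steps 2-3: group by /24 subnet, keep the best OldScore per group.
        groups = defaultdict(list)
        for src in m_set:
            groups[subnet(pages[src]["ip"])].append(src)
        best = [max(g, key=lambda p: pages[p]["old_score"])
                for g in groups.values()]
        # Step 4: keep only the top k of those pages by OldScore.
        best = sorted(best, key=lambda p: pages[p]["old_score"],
                      reverse=True)[:k]
        # Step 5: LocalScore = sum of OldScore^m over the remaining pages.
        local[target] = sum(pages[p]["old_score"] ** m for p in best)
    # NewScore(i) = (a + LocalScore(i)/MaxLS) * (b + OldScore(i)/MaxOS)
    max_ls = max(local.values()) or 1.0
    max_os = max(v["old_score"] for v in pages.values()) or 1.0
    return {p: (a + local[p] / max_ls) * (b + pages[p]["old_score"] / max_os)
            for p in pages}

pages = {
    "a": {"ip": "98.130.222.35", "old_score": 0.9},
    "b": {"ip": "76.163.25.106", "old_score": 0.6},
    "c": {"ip": "96.0.254.99", "old_score": 0.3},
}
links = {("b", "a"), ("c", "a"), ("c", "b")}
scores = local_rank(pages, links)
print(max(scores, key=scores.get))  # 'a'
```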

April 20, 2011

Creating Correct Content


The content of a site plays an important role in site promotion for many reasons. We will describe some of them in this section. We will also give you some advice on how to populate your site with good content.

Content uniqueness. Search engines value new information that has not been published before. That is why you should compose your own site text and not plagiarize excessively. A site based on materials taken from other sites is much less likely to get to the top in search engines. As a rule, the original source material always ranks higher in search results.

While creating a site, remember that it is primarily created for human visitors, not search engines. Getting visitors to visit your site is only the first step and it is the easiest one. The truly difficult task is to make them stay on the site and convert them into purchasers. You can only do this by using good content that is interesting to real people.

Try to update information on the site and add new pages on a regular basis. Search engines value sites that are constantly developing. Also, the more useful text your site contains, the more visitors it attracts. Write articles on the topic of your site, publish visitors' opinions, create a forum for discussing your project. A forum is only useful if the number of visitors is sufficient for it to be active. Interesting and attractive content guarantees that the site will attract interested visitors.

A site created for people rather than search engines has a better chance of getting into important directories such as DMOZ and others.

An interesting site on a particular topic has much better chances to get links, comments, reviews, etc. from other sites on this topic. Such reviews can give you a good flow of visitors while inbound links from such resources will be highly valued by search engines.

As a final tip, there is an old German proverb: "A shoemaker sticks to his last," which means, "Do what you can do best.” If you can write breathtaking and creative textual prose for your website, then that is great. However, most of us have no special talent for writing attractive text, and we should rely on professionals such as journalists and technical writers. Of course, this is an extra expense, but it is justified in the long term.

April 19, 2011

Link Farms

One unsuccessful method of increasing link popularity is by becoming part of a “Link Farm.” Link Farms are networks of sites that have all agreed to link to one another. Each of the sites in the link farm has a page containing links to all other sites that are part of the farm. One of the problems with this strategy is that the content of the sites that are linked together does not necessarily share any common theme. Therefore, even though your site may have a seemingly high link popularity score, most of the links carry very little weight since they lack relevancy to your site. Some link farms have tried to improve on this by categorizing their members and only establishing links between members that have related content.

While this is an improvement, it still does not address the core issue – a link farm is an artificial method of improving link popularity, and most search engines have mechanisms in place to either ignore or actually penalize link farm participants.

A couple of years ago, I attended a search engine strategies conference and someone in the audience asked about link farms. AltaVista’s chief scientist was sitting on the panel of experts and he laughed when he heard the question. He then proceeded to identify several well-known link farms by name and stated that AltaVista did not consider links from sites in those networks to have any validity whatsoever. If a link farm is promoting itself on the web, chances are that search engines are already aware of it and will ignore any links from sites that are part of that network.

April 8, 2011

Link partners

Another factor that some search engines consider part of the link popularity equation is link relevancy. When an engine finds a link from one site to another site, it compares the content of both the sites to determine if the sites are thematically similar. A link from a site that contains similar content to your site is assigned a greater weight than a link from an unrelated site.

Let’s look at Silver Platter Foods again. Silver Platter is a web site consisting of a number of pages offering gourmet foods and goods. A search engine will look at its web pages and see that they are well optimized and targeted. But how does the search engine know Silver Platter Foods is a reputable company worthy of high rankings? The competition among gourmet foods companies on the Internet is intense. How can a search engine determine which sites should be assigned the highest rankings for searches relating to gourmet foods?

It makes sense for a search engine to look beyond the optimization of Silver Platter’s pages and consider the quantity and quality of incoming links to the Silver Platter site. If Mike Merlot perfected a web page pertaining to pinot noir wines and also had links to this page from various well-known vineyards in Napa Valley and Italy and from large gourmet web sites such as gourmet.com, that page would probably score highly for a search on pinot noir. Search engines would see that other related sites with high link popularity considered the Silver Platter site to be worth linking to. In a sense, having a large, well-known site in your industry linking to your site can be considered akin to an endorsement of your site from a reputable source.

The best way to increase your site’s link popularity is to find web sites within your industry and persuade them to become your link partners, meaning that they will link to your site in exchange for your site linking back to theirs. Link partners do not have to be limited to commercial sites. In fact, some of the best link partners are educational or informational sites. As long as a site contains content that is related in some way to your site’s content, it can be a worthwhile link partner.

March 17, 2011

Benefits of Social Bookmarking


Social Bookmarking

Increased Traffic - The more people read your bookmarked blog post or web article and vote for it, the more popular it becomes and the more traffic your site receives.

Increased Visibility & Branding - Submitting to Social Bookmarking Sites puts your blog/website out there in front of millions of people, increasing your visibility ten notches and getting you more branding.

Get More Links - Each of these popular social bookmarking sites sees a tremendous volume of traffic on a daily basis. If your blog/website has good content, chances are that several of these users will link to it, getting you relevant links which will ultimately lead to an increase in your search engine rankings.

March 5, 2011

Why Inbound Links to sites are taken into account

As you can see from the previous section, many factors influencing the ranking process are under the control of webmasters. If these were the only factors then it would be impossible for search engines to distinguish between a genuine high-quality document and a page created specifically to achieve high search ranking but containing no useful information. For this reason, an analysis of inbound links to the page being evaluated is one of the key factors in page ranking. This is the only factor that is not controlled by the site owner.

It makes sense to assume that interesting sites will have more inbound links. This is because owners of other sites on the Internet will tend to have published links to a site if they think it is a worthwhile resource. The search engine will use this inbound link criterion in its evaluation of document significance.

Therefore, two main factors influence how pages are stored by the search engine and sorted for display in search results:

- Relevance, as described in the previous section on internal ranking factors.

- Number and quality of inbound links, also known as link citation, link popularity or citation index. This will be described in the next section.

March 3, 2011

Build a Link Wheel to Get More Traffic

How To Create A Link Wheel
One great way to get a ton of traffic to your websites is to build a link wheel. A link wheel builds backlinks to your website's main page, which gives your website a higher search engine ranking.

Instructions

1. Write a Squidoo lens about your website topic. Then put a link in your Squidoo lens back to your website. You can open an account at Squidoo by clicking on the link in the resources section of this article.

2. Write an article about your topic and submit it to any of the online article directories. You can use GoArticles or EzineArticles or any of the many other article directories online. Include a link back to your Squidoo lens in your article or resource box.

3. Create a HubPages page about your topic and link it back to the article that you wrote in step 2. You can open a HubPages account by clicking on the link in the resources section.

4. Create a Google Knol about your topic and include a link back to your HubPages page.

5. Make a one-page Blogger blog and link it back to your Google Knol. You can actually use the same Blogger blog to place links for many different link wheels.

6. At this point you can either choose to leave the wheel "open" as it is, or you can close the wheel by linking your website back to your Blogger blog. Either way, you will see your website rise in the search engines over the next week or two.

February 15, 2011

Hidden Text, a Deceptive SEO Method


The last two issues are not really mistakes but deliberate attempts to deceive search engines using illicit SEO methods. Hidden text (when the text color coincides with the background color, for example) allows site owners to cram a page with their desired keywords without affecting page logic or visual layout. Such text is invisible to human visitors but will be seen by search robots. The use of such deceptive optimization methods may result in banning of the site. It could be excluded from the index (database) of the search engine. 

One-pixel links, SEO deception
This is another deceptive SEO technique. Search engines consider the use of tiny, almost invisible, graphic image links just one pixel wide and high as an attempt at deception, which may lead to a site ban.

February 14, 2011

One Page SEO – One Keyword Phrase


For maximum SEO try to optimize each page for its own keyword phrase. Sometimes you can choose two or three related phrases, but you should certainly not try to optimize a page for 5-10 phrases at once. Such phrases would probably produce no effect on page rank. 

SEO and the Main page
Optimize the main page of your site (domain name, index.html) for the word combinations that are most important. This page is most likely to get to the top of search engine lists. My SEO observations suggest that the main page may account for up to 30-40% of the total search traffic for some sites.

Common SEO mistakes

Graphic header
Very often sites are designed with a graphic header. Often, we see an image of the company logo occupying the full page width. Do not do it! The upper part of a page is a very valuable place where you should insert your most important keywords for best SEO. In the case of a graphic image, that prime position is wasted since search engines cannot make use of images. Sometimes you may come across completely absurd situations: the header contains text information, but to make its appearance more attractive, it is created in the form of an image. The text in it cannot be indexed by search engines and so it will not contribute toward the page rank. If you must present a logo, the best way is to use a hybrid approach – place the graphic logo at the top of each page and size it so that it does not occupy the entire width. Use a text header to make up the rest of the width.

Graphic navigation menu
The situation is similar to the previous one – internal links on your site should contain keywords, which will give an additional advantage in seo ranking. If your navigation menu consists of graphic elements to make it more attractive, search engines will not be able to index the text of its links. If it is not possible to avoid using a graphic menu, at least remember to specify correct ALT attributes for all images. 

Script navigation
Sometimes scripts are used for site navigation. As an seo worker, you should understand that search engines cannot read or execute scripts. Thus, a link specified with the help of a script will not be available to the search engine, the search robot will not follow it and so parts of your site will not be indexed. If you use site navigation scripts then you must provide regular HTML duplicates to make them visible to everyone – your human visitors and the search robots. 

Session Identifier
Some sites use session identifiers. This means that each visitor gets a unique parameter (&session_id=) when he or she arrives at the site. This ID is added to the address of each page visited on the site. Session IDs help site owners to collect useful statistics, including information about visitors' behavior. However, from the point of view of a search robot, a page with a new address is a brand new page. This means that, each time the search robot comes to such a site, it will get a new session identifier and will consider the pages as new ones whenever it visits them. 

Search engines do have algorithms for consolidating mirrors and pages with the same content. Sites with session IDs should, therefore, be recognized and indexed correctly. However, it is difficult to index such sites and sometimes they may be indexed incorrectly, which has an adverse effect on seo page ranking. If you are interested in seo for your site, I recommend that you avoid session identifiers if possible.
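On the site side, the same consolidation can be approximated by canonicalizing URLs before submission or analysis. Here is a sketch using only the standard library; the session parameter names are assumptions to adjust for your own site:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Assumed session-parameter names; extend for your own site.
SESSION_PARAMS = {"session_id", "sessionid", "sid", "phpsessid"}

def canonical_url(url: str) -> str:
    """Drop session-identifier query parameters so one page maps to one URL."""
    parts = urlsplit(url)
    query = [(key, value) for key, value in parse_qsl(parts.query)
             if key.lower() not in SESSION_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(query), parts.fragment))

print(canonical_url("http://example.com/page?id=7&session_id=abc123"))
# http://example.com/page?id=7
```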

Redirects
Redirects make site analysis more difficult for search robots, with resulting adverse effects on seo. Do not use redirects unless there is a clear reason for doing so.

February 10, 2011

Ranking Factors

Description Meta tag
This is used to specify page descriptions. It does not influence the SEO ranking process but it is very important. A lot of search engines (including the largest one – Google) display information from this tag in their search results if this tag is present on a page and if its content matches the content of the page and the search query. 

Experience has shown that a high position in search results does not always guarantee large numbers of visitors. For example, if your competitors' search result description is more attractive than the one for your site then search engine users may choose their resource instead of yours. That is why it is important that your Description Meta tag text be brief, but informative and attractive. It must also contain keywords appropriate to the page. 

Keywords Meta tag
This Meta tag was initially used to specify keywords for pages but it is hardly ever used by search engines now. It is often ignored in seo projects. However, it would be advisable to specify this tag just in case there is a revival in its use. The following rule must be observed for this tag: only keywords actually used in the page text must be added to it. 
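The rule above, that only keywords actually used in the page text belong in this tag, is easy to check automatically. A rough sketch (it handles single-word keywords only, for simplicity):

```python
import re

def unused_meta_keywords(page_text, meta_keywords):
    """Return meta keywords that never appear in the visible page text."""
    words = set(re.findall(r"\w+", page_text.lower()))
    return [kw for kw in meta_keywords if kw.lower() not in words]

print(unused_meta_keywords("Our seo software guide",
                           ["seo", "software", "hosting"]))
# ['hosting']
```

Any keyword the check returns should either be added to the page text or dropped from the tag.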

Site structure

Number of pages
The general SEO rule is: the more, the better. Increasing the number of pages on your website increases the visibility of the site to search engines. Also, if new information is being constantly added to the site, search engines consider this as development and expansion of the site. This may give additional advantages in ranking. You should periodically publish more information on your site – news, press releases, articles, useful tips, etc. 

Navigation menu
As a rule, any site has a navigation menu. Use keywords in the menu links; this gives additional seo significance to the pages to which the links refer. 

Keywords in page names
Some SEO experts consider that using keywords in the name of an HTML page file may have a positive effect on its search result position. 

Avoid subdirectories
If there are not too many pages on your site (up to a couple of dozen), it is best to place them all in the root directory of your site. Search engines consider such pages to be more important than ones in subdirectories.

February 9, 2011

Keyword Density and Search Engine Optimization

«TITLE» tag
This is one of the most important tags for search engines. Make use of this fact in your seo work. Keywords must be used in the TITLE tag. The link to your site that is normally displayed in search results will contain text derived from the TITLE tag. It functions as a sort of virtual business card for your pages. Often, the TITLE tag text is the first information about your website that the user sees. This is why it should not only contain keywords, but also be informative and attractive. You want the searcher to be tempted to click on your listed link and navigate to your website. As a rule, 50-80 characters from the TITLE tag are displayed in search results and so you should limit the size of the title to this length.
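The 50-80 character guideline above lends itself to a simple automated check. A rough sketch (the function name and threshold are illustrative, not part of any search engine's rules):

```python
def check_title(title, keywords, max_len=80):
    """Flag a TITLE that may be truncated or that lacks any target keyword.

    max_len follows the 50-80 character range quoted above; adjust as needed.
    """
    issues = []
    if len(title) > max_len:
        issues.append("title may be truncated in search results")
    if not any(kw.lower() in title.lower() for kw in keywords):
        issues.append("no keyword found in title")
    return issues

print(check_title("Buy blue widgets online - Widget Shop", ["widgets"]))
# []
```

Running such a check over all pages catches titles that are too long or missing their keywords before the pages go live.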

Keywords in links
A simple SEO rule – use keywords in the text of page links that refer to other pages on your site and to any external Internet resources. Keywords in such links can slightly enhance page rank.

«ALT» attributes in images
Any page image has a special optional attribute known as "alternative text". It is specified using the HTML «ALT» attribute. This text will be displayed if the browser fails to download the image or if the browser image display is disabled. Search engines save the value of image ALT attributes when they parse (index) pages, but do not use it to rank search results.

Currently, the Google search engine takes into account text in the ALT attributes of those images that are links to other pages. The ALT attributes of other images are ignored. There is no information regarding other search engines, but we can assume that the situation is similar. We consider that keywords can and should be used in ALT attributes, but this practice is not vital for seo purposes.
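Since (per the paragraph above) only the ALT text of images that act as links is assumed to matter, those are the attributes worth auditing. A minimal sketch using the standard library's HTML parser:

```python
from html.parser import HTMLParser

class LinkedImageAlts(HTMLParser):
    """Collect the ALT text of images that appear inside <a> links."""
    def __init__(self):
        super().__init__()
        self.depth = 0   # nesting level of currently open <a> tags
        self.alts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.depth += 1
        elif tag == "img" and self.depth > 0:
            alt = dict(attrs).get("alt")
            if alt:
                self.alts.append(alt)

    def handle_endtag(self, tag):
        if tag == "a" and self.depth:
            self.depth -= 1

parser = LinkedImageAlts()
parser.feed('<a href="/page"><img src="x.png" alt="seo software"></a>'
            '<img src="y.png" alt="ignored">')
print(parser.alts)  # ['seo software']
```

Only the first image is collected because the second one is not wrapped in a link.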

February 4, 2011

Keyword Density and SEO

Keyword page density is a measure of the relative frequency of the word in the text expressed as a percentage. For example, if a specific word is used 5 times on a page containing 100 words, the keyword density is 5%. If the density of a keyword is too low, the search engine will not pay much attention to it. If the density is too high, the search engine may activate its spam filter. If this happens, the page will be penalized and its position in search listings will be deliberately lowered.

The optimum value for keyword density is 5-7%. In the case of keyword phrases, you should calculate the total density of each of the individual keywords comprising the phrases to make sure it is within the specified limits. In practice, a keyword density of more than 7-8% does not seem to have any negative seo consequences. However, it is not necessary and can reduce the legibility of the content from a user’s viewpoint. 
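The density calculation described above is straightforward to implement. A minimal sketch (the word-splitting regex is a simplification of real tokenization):

```python
import re

def keyword_density(text, keyword):
    """Keyword occurrences as a percentage of total words on the page."""
    words = re.findall(r"[\w'-]+", text.lower())
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

# 5 occurrences in a 100-word text, as in the example above.
text = " ".join(["seo"] * 5 + ["filler"] * 95)
print(keyword_density(text, "seo"))  # 5.0
```

A result inside the 5-7% window suggested above would pass; a much higher figure warrants rewording.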

Location of keywords on a page
A very short rule for SEO experts – the closer a keyword or keyword phrase is to the beginning of a document, the more significant it becomes for the search engine. 

Text format and SEO
Search engines pay special attention to page text that is highlighted or given special formatting. We recommend:

- Use keywords in headings. Headings are text highlighted with «H» HTML tags. The «h1» and «h2» tags are the most effective. Currently, CSS allows you to redefine the appearance of text highlighted with these tags, so «H» tags are used less than they once were, but they are still very important in seo work.

- Highlight keywords with bold fonts. Do not highlight the entire text! Just highlight each keyword two or three times on the page. Use the «strong» tag for highlighting instead of the more traditional «B» bold tag.
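The "two or three highlights per page" guideline above can be verified with a quick count. A rough sketch (regex-based HTML matching is a simplification; it assumes well-formed, non-nested tags):

```python
import re

def bold_keyword_count(html, keyword):
    """Count keyword occurrences inside <strong> (or <b>) elements."""
    hits = 0
    for tag, body in re.findall(r"<(strong|b)>(.*?)</\1>", html,
                                flags=re.I | re.S):
        hits += len(re.findall(r"\b" + re.escape(keyword) + r"\b",
                               body, flags=re.I))
    return hits

print(bold_keyword_count(
    "<p><strong>seo</strong> tips and <b>seo</b> tricks</p>", "seo"))
# 2
```

A count of two or three matches the recommendation; zero or a dozen both suggest the page needs attention.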

February 3, 2011

Internal ranking factors

Several factors influence the position of a site in the search results. They can be divided into external and internal ranking factors. Internal ranking factors are those that are controlled by seo aware website owners (text, layout, etc.) and will be described next. 

Web page layout factors relevant to seo

Amount of text on a page

A page consisting of just a few sentences is less likely to get to the top of a search engine list. Search engines favor sites that have a high information content. Generally, you should try to increase the text content of your site in the interest of seo. The optimum page size is 500-3000 words (or 2000 to 20,000 characters). 

Search engine visibility increases as the amount of page text grows, because a longer page is more likely to match occasional and incidental search queries. This factor can sometimes bring in a large number of visitors.

Number of keywords on a page

Keywords must be used at least three to four times in the page text. The upper limit depends on the overall page size – the larger the page, the more keyword repetitions can be made. Keyword phrases (word combinations consisting of several keywords) are worth a separate mention. The best seo results are observed when a keyword phrase is used several times in the text with all keywords in the phrase arranged in exactly the same order. In addition, all of the words from the phrase should be used separately several times in the remaining text. There should also be some difference (dispersion) in the number of entries for each of these repeated words. 

Let us take an example. Suppose we optimize a page for the phrase "seo software" (one of our seo keywords for this site). It would be good to use the phrase "seo software" in the text 10 times, the word "seo" 7 times elsewhere in the text and the word "software" 5 times. The numbers here are for illustration only, but they show the general seo idea quite well.
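Counting the phrase and its component words separately, as the example above suggests, can be automated. A minimal sketch (note that the per-word counts here include occurrences inside the phrase itself):

```python
import re

def phrase_and_word_counts(text, phrase):
    """Count whole-phrase matches plus the total count of each component word."""
    t = text.lower()
    phrase_count = len(re.findall(r"\b" + re.escape(phrase.lower()) + r"\b", t))
    words = re.findall(r"\w+", t)
    return phrase_count, {w: words.count(w) for w in phrase.lower().split()}

text = ("seo software helps with seo. "
        "good software makes seo software easier.")
print(phrase_and_word_counts(text, "seo software"))
# (2, {'seo': 3, 'software': 3})
```

Comparing the two figures shows whether the individual words also appear on their own, with some dispersion, as recommended.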

February 2, 2011

Common search engine principles

To understand seo you need to be aware of the architecture of search engines. They all contain the following main components:

Spider - a browser-like program that downloads web pages.

Crawler – a program that automatically follows all of the links on each web page.

Indexer - a program that analyzes web pages downloaded by the spider and the crawler. 

Database – storage for downloaded and processed pages.

Results engine – extracts search results from the database. 

Web server – a server that is responsible for interaction between the user and other search engine components. 

Specific implementations of search mechanisms may differ. For example, the Spider+Crawler+Indexer component group might be implemented as a single program that downloads web pages, analyzes them and then uses their links to find new resources. However, the components listed are inherent to all search engines and the seo principles are the same. 
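The single-program combination just described can be sketched in a few lines. This toy version works on an in-memory "web" instead of real HTTP, purely to show how the spider, crawler and indexer roles interlock:

```python
import re
from collections import deque

# A toy "web": page URL -> HTML body. Stands in for real HTTP fetching.
PAGES = {
    "/": '<a href="/a">a</a> welcome page',
    "/a": '<a href="/">home</a> seo tips page',
}

def crawl(start):
    """Spider + crawler + indexer in one loop: fetch, follow links, index words."""
    index, queue, seen = {}, deque([start]), {start}
    while queue:
        url = queue.popleft()
        html = PAGES.get(url, "")                    # spider: download the page
        for link in re.findall(r'href="([^"]+)"', html):  # crawler: find links
            if link not in seen:
                seen.add(link)
                queue.append(link)
        text = re.sub(r"<[^>]+>", " ", html)         # indexer: strip the markup
        for word in re.findall(r"\w+", text.lower()):
            index.setdefault(word, set()).add(url)   # word -> pages containing it
    return index

index = crawl("/")
print(sorted(index["page"]))  # ['/', '/a']
```

A real spider would fetch over HTTP, respect robots.txt and parse HTML properly, but the division of labor is the same.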

Spider. This program downloads web pages just like a web browser. The difference is that a browser displays the information presented on each page (text, graphics, etc.) while a spider does not have any visual components and works directly with the underlying HTML code of the page. You may already know that there is an option in standard web browsers to view source HTML code. 

Crawler. This program finds all links on each page. Its task is to determine where the spider should go either by evaluating the links or according to a predefined list of addresses. The crawler follows these links and tries to find documents not already known to the search engine. 

Indexer. This component parses each page and analyzes the various elements, such as text, headers, structural or stylistic features, special HTML tags, etc.

Database. This is the storage area for the data that the search engine downloads and analyzes. Sometimes it is called the index of the search engine.

Results Engine. The results engine ranks pages. It determines which pages best match a user's query and in what order the pages should be listed. This is done according to the ranking algorithms of the search engine. It follows that page rank is a valuable and interesting property and any seo specialist is most interested in it when trying to improve his site search results. In this article, we will discuss the seo factors that influence page rank in some detail. 

Web server. The search engine web server usually contains an HTML page with an input field where the user can specify the search query he or she is interested in. The web server is also responsible for displaying search results to the user in the form of an HTML page. 

History of Search Engines

In the early days of Internet development, its users were a privileged minority and the amount of available information was relatively small. Access was mainly restricted to employees of various universities and laboratories who used it to access scientific information. In those days, the problem of finding information on the Internet was not nearly as critical as it is now.

Site directories were one of the first methods used to facilitate access to information resources on the network. Links to these resources were grouped by topic. Yahoo, the first project of this kind, opened in April 1994. As the number of sites in the Yahoo directory inexorably increased, the developers of Yahoo made the directory searchable. Of course, it was not a search engine in its true form because searching was limited to those resources whose listings were put into the directory. It did not actively seek out resources, and the concept of seo was yet to arrive.

Such link directories have been used extensively in the past, but nowadays they have lost much of their popularity. The reason is simple – even modern directories with lots of resources only provide information on a tiny fraction of the Internet. For example, the largest directory on the network is currently DMOZ (or Open Directory Project). It contains information on about five million resources. Compare this with the Google search engine database containing more than eight billion documents.

The WebCrawler project started in 1994 and was the first full-featured search engine. The Lycos and AltaVista search engines appeared in 1995, and for many years AltaVista was the major player in this field.

In 1997 Sergey Brin and Larry Page created Google as a research project at Stanford University. Google is now the most popular search engine in the world.

Currently, there are three leading international search engines – Google, Yahoo and MSN Search. They each have their own databases and search algorithms. Many other search engines use results originating from these three major search engines and the same seo expertise can be applied to all of them. For example, the AOL search engine (search.aol.com) uses the Google database while AltaVista, Lycos and AllTheWeb all use the Yahoo database.

January 29, 2011

Introduction to SEO

This document is intended for webmasters and site owners who want to investigate the issues of SEO (Search Engine Optimization) and promotion of their resources. It is mainly aimed at beginners, although I hope that experienced webmasters will also find something new and interesting here. There are many articles on seo on the Internet and this text is an attempt to gather some of this information into a single consistent document.

   Information presented in this text can be divided into several parts:

   - Clear-cut seo recommendations, practical guidelines.
   - Theoretical information that we think any seo specialist should know.
   - Seo tips, observations, recommendations from experience, other seo sources, etc.

January 19, 2011

Doorway page

Doorway pages are web pages that are created for spamdexing, that is, for spamming the index of a search engine by inserting results for particular phrases with the purpose of sending visitors to a different page. They are also known as bridge pages, portal pages, jump pages, gateway pages, entry pages and by other names. Doorway pages that redirect visitors without their knowledge use some form of cloaking.

January 17, 2011

What Is The Google Dance?

Whenever we are at trade shows, run seminars, or speak at symposiums we get asked the question "what is the Google dance?" We've heard a few different things referred to as "the Google Dance", but only one is really correct. It's the period when Google is rebuilding its rankings, and results fluctuate widely for a 3 to 5 day period.

How Often Does The Google Dance Happen?
The name "Google Dance" was used in the past to describe the period during which a major index update of the Google search engine was being implemented. These major Google index updates occurred on average every 36 days, or about 10 times per year. An update was most easily identified by significant changes in search results and by an updating of Google's cache of all indexed pages. These changes would be evident from one minute to the next. But the update did not proceed as an instantaneous switch from one index to another; in fact, it took several days to complete the full update of the index.

Because Google, like every other search engine, depends on its users knowing that it delivers authoritative, reliable results 24 hours a day, seven days a week, updates pose a serious issue. It cannot shut down for maintenance, and it cannot afford to go offline for even one minute. Hence, we had the Dance. Every search engine goes through it, some more or less often than Google. However, it is only because of Google's reach that we pay attention to its rebuild more than to that of any other engine.

Since August 2003, the famous / infamous Google Dance is no more. Or rather it has become less dramatic. Google now performs updates every week, with most movement occurring on Mondays. These ongoing updates feature mostly minor algorithm and index updates.

So, during any month there will be minor changes in rankings. This is because Google's bot or spider is always running and finding new material. It also happens because the bot may have detected that a website no longer exists, and needs to be deleted from the index. During the Dance, the Googlebot will revisit every website, figure out how many sites link to it, and how many it links out to, and how valuable these links are.

Because Google is constantly crawling and updating selected pages, its search results will vary slightly over the course of the month. However, it is only during the Google Dance that these results can swing wildly. You also need to consider that Google has multiple data centers, sharing more than 10,000 servers. Somehow, the updates to the index that occur during the month, outside of the Google Dance, have to be propagated throughout. It is a constant process for Google, and for every other search engine. These ongoing, incremental updates only affect parts of the index at any one time.

Checking The Google Dance

Until January 2004, Google had 12 main www servers online, which were as follows:

    * www-ex.google.com - (where you get when you type www.google.com)
    * www-sj.google.com - (which can also be accessed at www2.google.com)
    * www-va.google.com - (which can also be accessed at www3.google.com)
    * www-dc.google.com
    * www-ab.google.com
    * www-in.google.com
    * www-zu.google.com
    * www-cw.google.com
    * www-fi.google.com - found in May 2003.
    * www-gv.google.com - found in August 2003.
    * www-gv2.google.com - found in September 2003.
    * www-kr.google.com - found in October 2003.

At some point in January, these servers stopped accepting connections, and the only servers easy to connect to were:

    * www.google.com
    * www2.google.com
    * www3.google.com

As well as the numeric address databases - which people keep discovering, and kindly help us keep abreast of.

    * 216.239.37.99
    * 216.239.39.99
    * 216.239.41.99
    * 216.239.51.99
    * 216.239.37.104
    * 216.239.41.104
    * 216.239.37.147
    * 64.233.161.98
    * 64.233.161.99
    * 64.233.161.104
    * 64.233.161.105

At any time during an index update you could check the Google servers, and they would sometimes display wildly differing results; thus they were said to be "dancing", hence the name "Google Dance".

In the past, the easiest way to check whether the Google Dance was happening was to go to www.google.com and do a search. Look at the blue bar at the top of the page; it would show words like "Results 1 - 10 of about 626,000. Search took 0.48 seconds". Then check the same search on www2.google.com and www3.google.com. If you saw a different number of total pages for the same search, then the Google Dance was on. You could also check all the variations above: www2 is really www-sj, and www3 is www-va. We found that all the others needed their full www-extension.google.com in the URL to test them properly. Once the numbers and the order of results on all the www servers were the same, you knew the dance was over.

Importance Of The Google Dance

For most people, this event in and of itself was not important. However, for anyone in the search engine optimization industry it was a period of note. Pages got temporarily dropped, sometimes for a day, and people panicked. Then they were re-added, often better placed than before, and things calmed down. It is interesting to see how overpoweringly important this one engine is. For more information on this search engine, or any other, please read our sections on The Search Engines.

About Google Instant

Google Instant is a new search enhancement that shows results as you type. We are pushing the limits of our technology and infrastructure to help you get better search results, faster. Our key technical insight was that people type slowly, but read quickly, typically taking 300 milliseconds between keystrokes, but only 30 milliseconds (a tenth of the time!) to glance at another part of the page. This means that you can scan a results page while you type.

The most obvious change is that you get to the right content much faster than before because you don’t have to finish typing your full search term, or even press “search.” Another shift is that seeing results as you type helps you formulate a better search term by providing instant feedback. You can now adapt your search on the fly until the results match exactly what you want. In time, we may wonder how search ever worked in any other way.

Benefits
Faster Searches: By predicting your search and showing results before you finish typing, Google Instant can save 2-5 seconds per search.

Smarter Predictions: Even when you don’t know exactly what you’re looking for, predictions help guide your search. The top prediction is shown in grey text directly in the search box, so you can stop typing as soon as you see what you need.

Instant Results: Start typing and results appear right before your eyes. Until now, you had to type a full search term, hit return, and hope for the right results. Now results appear instantly as you type, helping you see where you’re headed, every step of the way.