April 28, 2011

Optimize your page content


There are countless tips for optimizing your page’s content so that it will be more relevant to a given search. Each engine ranks pages differently, so most tips are not universal. However, there is one tip that overrides them all:

Create pages that emulate the statistics of pages that already rank at or near the top of the search results. These statistics include:

Frequency of the keyword on the page – this does not mean more keywords are better. Instead, emulate the keyword count of top-ranking pages as closely as possible. Be careful not to base your entire strategy on the statistics of a single top-ranking page. The content of a top-ranking page could easily have changed since it was last indexed. Therefore, a page ranking in the Top 10 may not represent exactly what the engine is looking for today. Using averages across several top-ranking pages is one way to combat this problem.

Total words on the page – mimic the approximate word count of a top-ranking page on your own page.

Weight of the keywords on the page (i.e., frequency divided by total words) – too high a weight is just as bad as too low a weight.

Area or location of the keywords on the page (i.e., title, heading, etc.) – a keyword is given more relevance by an engine when it appears in the engine’s preferred areas.

Prominence – generally, the closer to the front of the area you can place the keyword, the better.

Proximity – the closer the words of a phrase appear together, the better.

Off-page criteria (i.e., link popularity, click-through popularity, etc.) – even when you’ve done everything else right, don’t forget the off-page factors!
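The on-page statistics above can be measured rather than guessed at. Below is a minimal Python sketch of that idea; the helper names (`page_stats`, `average_stats`), the prominence scale, and the sample inputs are all illustrative assumptions, not part of any engine’s actual scoring.

```python
import re

def page_stats(text, keyword):
    """Compute the on-page statistics described above for one page area.

    `text` is the plain text of a page area (title, heading, body, etc.)
    and `keyword` is a single word; both are illustrative inputs.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    keyword = keyword.lower()
    total = len(words)
    positions = [i for i, w in enumerate(words) if w == keyword]
    frequency = len(positions)
    # Weight: frequency divided by the total words in the area.
    weight = frequency / total if total else 0.0
    # Prominence: 1.0 means the keyword leads the area, 0.0 means it
    # appears only at the very end (a simple linear scale; an assumption).
    prominence = (1 - positions[0] / (total - 1)) if frequency and total > 1 else 0.0
    return {"frequency": frequency, "total_words": total,
            "weight": round(weight, 3), "prominence": round(prominence, 3)}

def average_stats(pages, keyword):
    """Average the statistics of several top-ranking pages, since any
    single page may have changed since it was last indexed."""
    stats = [page_stats(p, keyword) for p in pages]
    return {k: round(sum(s[k] for s in stats) / len(stats), 3)
            for k in stats[0]}
```

In practice you would run this per area (title, headings, body) on each top-ranking page for your keyword, then tune your own page toward the averages rather than toward any single competitor.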

Bonus tip 1: In general, you should try including your keyword or phrase in the Title tag, Heading tags, the link text, and to a lesser extent, your Meta keyword and description tags. There are other areas in which you may want to include the keyword, depending on the engine. For example, Google is known to give a ranking boost to keywords that are in bold or large print.

Bonus tip 2: Naming your page after your keyword and/or obtaining a domain name with your keyword in it will often boost your rankings. 

Bonus tip 3: If you run a regional business where most of your business is local, it’s critical that you include your full company address on every page of your site. Otherwise, people could search for “Ford dealer in Chicago” and your site would not appear if your company address was buried only on your contact page. Also, take advantage of proximity by putting the word “Chicago” as close to the phrase “Ford dealer” as possible. Lastly, make sure the address is in text form, since search engines can’t read your address out of a graphical logo on your page.

Bonus tip 4: Don’t spam the engines. Every engine has its pet peeves, so make sure you know what they are and avoid them. Unfortunately, generalized tips will only take you so far. That’s why we developed the Page Critic feature of WebPosition Gold and have continued to fine-tune it over the years. We also update the advice every month to keep pace with the changes at each engine. The Page Critic gives specific advice for the keyword, the web page, and the engine that you select, so you’re not overwhelmed with advice that doesn’t apply, and you won’t find yourself wasting hours trying to count and locate keywords on your page and your competitors’ pages.

April 26, 2011

Proper submission to the major directories is critical


One of the most significant changes in search engine marketing in recent years has been the rise in the popularity of human-reviewed directories and catalogs like LookSmart, Yahoo!, and the Open Directory. Some search engines prominently display directory listings for many popular searches. MSN is a prime example.

Do a search on MSN, and you’ll generally find the first page of results dominated by LookSmart directory listings. Some of the other major engines also list directory results prominently, or at least emphasize them in various ways. You can recognize directory listings because they are often labeled “Web site results” rather than “Web page results.”

Once you submit to a directory, it’s difficult to go back and correct mistakes later. Therefore, it’s of utmost importance to get your submission right the first time. There are many strategies for achieving great visibility with the directories. Some of them involve keyword placement and some involve human psychology. Tips for the other directories are found in the directory submission section of WebPosition Gold’s Submitter. Read all the information you can about submitting to each directory before you submit. Even if you’ve already submitted poorly, you should find some strategies to help you reverse the damage.

April 25, 2011

Target the right keywords

For those of you who are new to search engine marketing, a keyword is simply a word or phrase that people would search on to find your web site. You might think that choosing the right keywords to target would be a no-brainer. However, you’d be surprised at how many people jump in, optimize their web site, and achieve top rankings, only to cry out in dismay when their hit counter registers no more visitors than it did before they went to all the work of tuning up their site.

What happened? They failed to choose keywords that people were actually searching for. Therefore, I recommend you:
 
  • Brainstorm a list of keywords and phrases that apply to your web site’s products and services. Try to place yourself in the shoes of the web searcher. Avoid generalities like “small business”. Yes, you may sell a product for small businesses, but who is going to search for “small business” when they are looking for a new Windows accounting program?
     
  • Take advantage of excellent services like Wordtracker to tell you which keywords are popular but not so competitive as to make a top ranking next to impossible. There’s a fine line between targeting keywords that are too general or competitive versus keyword phrases that are so specific that few people ever think to search for them. Wordtracker handles both with ease. In addition, Wordtracker will do much of the brainstorming for you by taking a couple of keywords and producing a broad list of related words and phrases from which to choose.
Again, please don’t make the mistake of picking the wrong keywords. Nothing is more disappointing than taking the time to achieve a top ranking and then seeing no increase in traffic from all your efforts. Also, don’t pick keywords that are too popular or broad, like “games” or “entertainment.” You’ll not only get visitors that are unlikely to buy your product, but the amount of work needed to gain that ranking will not be worth the trouble. You’ll then join the ranks of misinformed critics screaming “Search engine optimization doesn’t work – don’t waste your time.” Search engine optimization works, and works well, if you take the time to do it right.

April 23, 2011

Google LocalRank

On February 25, 2003, Google patented a new algorithm for ranking pages called LocalRank. It is based on the idea that pages should be ranked not by their global link citations, but by how they are cited among pages that deal with topics related to the particular query. The LocalRank algorithm is not used in practice (at least, not in the form described in the patent). However, the patent contains several interesting innovations we think any SEO specialist should know about. Nearly all search engines already take into account the topics to which referring pages are devoted. The algorithms they actually use seem rather different from LocalRank, but studying the patent will allow us to learn general ideas about how such ranking may be implemented.

While reading this section, please bear in mind that it contains theoretical information rather than practical guidelines.

The following three items comprise the main idea of the LocalRank algorithm:

1. An algorithm is used to select a certain number of documents relevant to the search query (let it be N). These documents are initially sorted by some criteria (this may be PageRank, relevance or a group of other criteria). Let us call the numeric value of this criterion OldScore.

2. Each of the N selected pages goes through a new ranking procedure and gets a new rank. Let us call it LocalScore.

3. The OldScore and LocalScore values for each page are multiplied, to yield a new value – NewScore. The pages are finally ranked based on NewScore.

The key procedure in this algorithm is the new ranking procedure, which gives each page a new LocalScore rank. Let us examine this new procedure in more detail:

0. An initial ranking algorithm is used to select N pages relevant to the search query. Each of the N pages is allocated an OldScore value by this algorithm. The new ranking algorithm only needs to work on these N selected pages.

1. While calculating LocalScore for each page, the system selects those pages from N that have inbound links to this page. Let this number be M. At the same time, any other pages from the same host (as determined by IP address) and pages that are mirrors of the given page will be excluded from M.

2. The set M is divided into subsets Li. These subsets contain pages grouped according to the following criteria:

1. Belonging to the same (or similar) hosts. Pages whose first three IP address octets are the same get into one group; that is, pages whose IP addresses belong to the range xxx.xxx.xxx.0 to xxx.xxx.xxx.255 are considered as belonging to one group.

2. Pages that have the same or similar content (mirrors).

3. Pages on the same site (domain).

3. Each page in each Li subset has an OldScore rank. The page with the largest OldScore is taken from each subset, and the rest of the pages are excluded from the analysis. Thus, we get a subset of pages K referring to this page.

4. Pages in the subset K are sorted by the OldScore parameter, then only the first k pages (k is some predefined number) are left in the subset K. The rest of the pages are excluded from the analysis.

5. LocalScore is calculated in this step. The OldScore parameters of the remaining k pages are combined with the help of the following formula:

   LocalScore = Σ (OldScore(i))^m, where the sum runs over the k pages left in subset K.

   Here m is some predefined parameter that may vary from one to three. Unfortunately, the patent for the algorithm in question does not describe this parameter in detail.

After LocalScore is calculated for each page from the set N, NewScore values are calculated and pages are re-sorted according to the new criteria. The following formula is used to calculate NewScore:

NewScore(i)= (a+LocalScore(i)/MaxLS)*(b+OldScore(i)/MaxOS)

i is the page for which the new rank is calculated.

a and b are numeric constants (the patent gives no more detailed information about these parameters).

MaxLS is the maximum LocalScore among those calculated.

MaxOS is the maximum value among the OldScore values.

Now let us put the math aside and explain these steps in plain words.

In step 0) pages relevant to the query are selected. Algorithms that do not take into account the link text are used for this. For example, relevance and overall link popularity are used. We now have a set of OldScore values. OldScore is the rating of each page based on relevance, overall link popularity and other factors.

In step 1) pages with inbound links to the page of interest are selected from the group obtained in step 0). The group is whittled down by removing mirror and other sites in steps 2), 3) and 4) so that we are left with a set of genuinely unique sites that all share a common theme with the page that is under analysis. By analyzing inbound links from pages in this group (ignoring all other pages on the Internet), we get the local (thematic) link popularity.
 
LocalScore values are then calculated in step 5). LocalScore is the rating of a page among the set of pages that are related by topic. Finally, pages are rated and ranked using a combination of LocalScore and OldScore.
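The whole procedure described above can be sketched in a few lines of Python. This is a theoretical illustration of the patent’s steps, not a working ranking system: the input dictionaries are made-up, steps 1–3 (removing same-host pages and mirrors) are assumed to have already been applied to the link data, and the constants k, m, a, and b are the undocumented parameters from the patent, given arbitrary values here.

```python
def local_rank(old_score, links_to, k=10, m=2, a=0.5, b=0.5):
    """Sketch of the LocalRank re-ranking described in the patent.

    old_score: {page: OldScore} for the N pages selected in step 0.
    links_to:  {page: set of pages (from the same N) linking to it},
               assumed already cleaned of same-host pages and mirrors
               (steps 1-3), so each referrer is a genuinely distinct site.
    k, m, a, b: constants the patent leaves unspecified (assumptions).
    """
    local_score = {}
    for page in old_score:
        referrers = links_to.get(page, set())
        # Step 4: keep only the k referrers with the highest OldScore.
        best = sorted(referrers, key=lambda p: old_score[p], reverse=True)[:k]
        # Step 5: LocalScore is the sum of OldScore^m over those pages.
        local_score[page] = sum(old_score[p] ** m for p in best)

    max_ls = max(local_score.values()) or 1  # MaxLS (guard against all-zero)
    max_os = max(old_score.values()) or 1    # MaxOS
    # Final step: NewScore(i) = (a + LocalScore(i)/MaxLS) * (b + OldScore(i)/MaxOS)
    new_score = {page: (a + local_score[page] / max_ls) *
                       (b + old_score[page] / max_os)
                 for page in old_score}
    return sorted(new_score, key=new_score.get, reverse=True)
```

Note how a page with a mediocre OldScore can still outrank a stronger one if it attracts more links from within the topical set – which is precisely the point of the algorithm.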

April 20, 2011

Creating Correct Content


The content of a site plays an important role in site promotion for many reasons. We will describe some of them in this section. We will also give you some advice on how to populate your site with good content.

Content uniqueness. Search engines value new information that has not been published before. That is why you should compose your own site text and not plagiarize excessively. A site based on materials taken from other sites is much less likely to get to the top in search engines. As a rule, the original source material ranks higher in search results.

While creating a site, remember that it is primarily created for human visitors, not search engines. Getting visitors to visit your site is only the first step and it is the easiest one. The truly difficult task is to make them stay on the site and convert them into purchasers. You can only do this by using good content that is interesting to real people.

Try to update information on the site and add new pages on a regular basis. Search engines value sites that are constantly developing. Also, the more useful text your site contains, the more visitors it attracts. Write articles on the topic of your site, publish visitors' opinions, create a forum for discussing your project. A forum is only useful if the number of visitors is sufficient for it to be active. Interesting and attractive content guarantees that the site will attract interested visitors.

A site created for people rather than search engines has a better chance of getting into important directories such as DMOZ and others.

An interesting site on a particular topic has much better chances to get links, comments, reviews, etc. from other sites on this topic. Such reviews can give you a good flow of visitors while inbound links from such resources will be highly valued by search engines.

As a final tip, there is an old German proverb: "A shoemaker sticks to his last," which means, "Do what you can do best." If you can write breathtaking and creative prose for your web site, that is great. However, most of us have no special talent for writing attractive text, and we should rely on professionals such as journalists and technical writers. Of course, this is an extra expense, but it is justified in the long term.

April 19, 2011

Link Farms

One unsuccessful method of increasing link popularity is by becoming part of a “Link Farm.” Link Farms are networks of sites that have all agreed to link to one another. Each of the sites in the link farm has a page containing links to all other sites that are part of the farm. One of the problems with this strategy is that the content of the sites that are linked together does not necessarily share any common theme. Therefore, even though your site may have a seemingly high link popularity score, most of the links carry very little weight since they lack relevancy to your site. Some link farms have tried to improve on this by categorizing their members and only establishing links between members that have related content.

While this is an improvement, it still does not address the core issue – a link farm is an artificial method of improving link popularity, and most search engines have mechanisms in place to either ignore or actually penalize link farm participants.

A couple of years ago, I attended a search engine strategies conference and someone in the audience asked about link farms. AltaVista’s chief scientist was sitting on the panel of experts and he laughed when he heard the question. He then proceeded to identify several well-known link farms by name and stated that AltaVista did not consider links from sites in those networks to have any validity whatsoever. If a link farm is promoting itself on the web, chances are that search engines are already aware of it and will ignore any links from sites that are part of that network.

April 8, 2011

Link partners

Another factor that some search engines consider part of the link popularity equation is link relevancy. When an engine finds a link from one site to another site, it compares the content of both the sites to determine if the sites are thematically similar. A link from a site that contains similar content to your site is assigned a greater weight than a link from an unrelated site.

Let’s look at Silver Platter Foods again. Silver Platter is a web site consisting of a number of pages offering gourmet foods and goods. A search engine will look at its web pages and see that they are well optimized and targeted. But how does the search engine know Silver Platter Foods is a reputable company worthy of high rankings? The competition among gourmet foods companies on the Internet is intense. How can a search engine determine which sites should be assigned the highest rankings for searches relating to gourmet foods?

It makes sense for a search engine to look beyond the optimization of Silver Platter’s pages and consider the quantity and quality of incoming links to the Silver Platter site. If Mike Merlot perfected a web page pertaining to Pinot Noir wines and also had links to this page from various well-known vineyards in Napa Valley and Italy and from large gourmet web sites such as gourmet.com, that page would probably score highly for a search on Pinot Noir. Search engines would see that other related sites with high link popularity considered the Silver Platter site to be worth linking to. In a sense, having a large, well-known site in your industry linking to your site can be considered akin to an endorsement of your site from a reputable source.

The best way to increase your site’s link popularity is to find web sites within your industry and persuade them to become your link partners, meaning that they will link to your site in exchange for your site linking back to theirs. Link partners do not have to be limited to commercial sites. In fact, some of the best link partners are educational or informational sites. As long as the site contains content that is related in some way to your site’s content, it can be a worthwhile link partner.