Tuesday, September 25, 2007

LSI - What and Why

What is LSI?

LSI, or Latent Semantic Indexing, is one of the latest weapons search engines use to rate websites in order to give searchers the best possible search experience. LSI algorithms evaluate the overall theme of a website, placing emphasis on the quality and freshness of its content. A great deal of importance is given to the way a website is constructed and internally linked: preference goes to sites whose related content pages are cross-linked, while indiscriminate cross-linking between unrelated pages can cause ranking positions to suffer.

In an effort to weed out irrelevant search results, new LSI algorithms evaluate websites much as a human would. Are the internally linked pages related to each other in subject matter, or are there pages about car insurance cross-linked to pages about fishing equipment? Irrelevant cross-linking is frowned upon and will hurt the ranking of your pages.
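To make the idea concrete, here's a minimal sketch (my own illustration, not anything published by the search engines) of the technique LSI takes its name from: latent semantic analysis. Pages are represented as term vectors, reduced with a truncated SVD to a handful of latent 'topics', and compared by cosine similarity in that topic space - the same kind of comparison that could flag a car insurance page cross-linked to a fishing page. The sample pages, the component count and the choice of Python with scikit-learn are all assumptions for the sketch.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Three toy "pages": two about car insurance, one about fishing.
pages = [
    "cheap car insurance quotes and low auto insurance rates",
    "compare auto insurance policies and car coverage online",
    "fishing equipment rods reels and tackle for bass fishing",
]

# Build a term-document matrix, then reduce it with a truncated SVD
# to uncover latent topics shared across pages.
tfidf = TfidfVectorizer().fit_transform(pages)
topics = TruncatedSVD(n_components=2).fit_transform(tfidf)

# Cosine similarity in topic space: thematically related pages score high.
sims = cosine_similarity(topics)
print(f"insurance vs insurance: {sims[0][1]:.2f}")  # high
print(f"insurance vs fishing:   {sims[0][2]:.2f}")  # low

On these toy pages, the two insurance documents land close together in the latent space while the fishing page lands far away, which is exactly the kind of theme check described above.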

Incoming links are also evaluated on a theme basis. If related websites link to you, you'll score points; unrelated incoming links can detract from your overall ranking score.

In many cases, LSI algorithms use a form of artificial intelligence to gauge the quality of webpage content. If your content is nonsensical or appears to be machine-generated, your ranking position will suffer.

Why LSI?

In days gone by, a webmaster could attain high search engine rankings simply by stuffing a webpage with keywords. You'd find keywords in the title, the headings, the image alt tags and generously sprinkled throughout the page content. Put in enough keywords and you could convince the search engines to place you high on the results pages.
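As a toy sketch of the crude signal those early engines rewarded (again my own illustration, in Python), keyword density is just the number of times a keyword appears divided by the page's total word count:

# Keyword density: the metric keyword stuffers optimized for.
def keyword_density(text, keyword):
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words)

page = "car insurance cheap car insurance best car insurance rates"
print(f"{keyword_density(page, 'insurance'):.0%}")  # 33% - classic stuffing

A page like that one, with a third of its words being the target keyword, is unreadable to a human but looked highly 'relevant' to a density-based ranker.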

Of course, the fast-buck artists took immediate advantage and easily achieved top rankings. Garbage pages, stuffed with keywords, littered the best positions in the search engines.

The search engines fought back and started placing more importance on incoming links than on raw keyword density. The hucksters responded with link farms - sites set up solely to artificially inflate the number of a site's incoming links. The search engines countered by examining the relevancy of those incoming links, and sites that used link farms saw their coveted top positions plummet into oblivion.

A few enterprising individuals even formed a syndicate of sorts that allowed member sites with similar content to cross-link, again falsely inflating the importance of many websites. A great many people pay nearly $10,000 a year to participate in such programs, even though those practices may soon become ineffective.

Why do search engines care about users?

At first thought, many people wonder why search engines even care about users. After all, search engines are free to use; they don't make money directly when you search for free mp3 downloads or look up the latest stock market news. So why should they be concerned whether visitors enjoy their search experience? Well, search engines do make money from searchers, indirectly. Take Google, for example, with its AdWords advertisements - the small ads that appear on the right-hand side of the search results pages.

If you happen to click on one of those ads, the advertiser pays Google a certain amount of money, so Google earns revenue from each and every click. When you consider the huge amount of traffic Google receives daily and the large number of clicks on those ads, it adds up to a tidy sum. Of course, if the search results don't help you find what you're looking for, you'll switch to a better search engine and Google's income will drop.

Google absolutely does not want that to happen, so they're constantly refining their search results to keep you coming back and clicking those revenue-generating ads. They want to make sure that when you search for a certain topic, the results are genuinely related to what you're looking for.

Change with the times or die

As search engines refine their indexing algorithms, webmasters find it increasingly difficult to achieve and maintain top search rankings. Keyword stuffing, cloaking and link farms no longer work. Computer-generated content, which can quickly and easily produce hundreds or thousands of 'optimized' pages, will no longer fool the search engines. Any underhanded method developed to 'game' the search engines won't work for long, if at all.

Actually, I think the LSI algorithms are of great benefit to webmasters. Just build your sites to conform and you can achieve the top rankings you want without having to chase the latest 'trick' that works this week but stops working next week.

Search engines are big business, and if you don't play by their rules, you'll find yourself sitting on the sidelines, so you'd better pay close attention to LSI. If you want high rankings, give the search engines exactly what they're looking for, then concentrate on running your business instead of chasing the next blackhat method of getting those fleeting, artificially high rankings.
Article Directory: http://www.articledashboard.com
By: Carson Danfield

Carson Danfield is a successful 'Under the Radar' internet entrepreneur and can show you the 'Right Way' to do things.
Want to implement LSI for your website without the steep learning curve? Want LSI that's push-button easy? Get it now at Traffic-Trix.com/SILO3
