There are quite a few things to consider when doing this. In the book "Winning the Search Engine Wars" by SearchEngineNews.com, the authors list six factors in building a search engine friendly website:
* Use valid HTML and CSS code
* Avoid Flash, JavaScript, and frames as much as possible
* Use URLs properly
* Maintain a well-organized website and directory structure
* Keep your pages as small as possible
* Make proper use of the "robots.txt" file
HTML and CSS code
Web pages should, at a minimum, conform to W3C guidelines. That means closing every tag that needs to be closed and avoiding deprecated or malformed markup that can confuse a search engine spider, so that your pages can be crawled and cached properly. At the very least, the code must render and function correctly.
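As a minimal sketch (the title, file names, and content are placeholders), a well-formed HTML5 document with every required tag properly closed might look like this:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <!-- A descriptive title helps spiders understand the page -->
  <title>Example Page</title>
  <link rel="stylesheet" href="style.css">
</head>
<body>
  <h1>Example Page</h1>
  <p>Every opening tag has a matching closing tag.</p>
</body>
</html>
```

Markup like this can be checked against the W3C's own validator at validator.w3.org.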
Flash, JavaScript, and frames
We all know that Flash, JavaScript, and frames can be eye candy that makes a site very attractive. On the other hand, heavy use of them slows pages down: if your website takes too long to download, most people, who try to make the most of their time, will simply cancel the page and go surf another site. So to build a search engine friendly website, avoid Flash, JavaScript, and frames, or, where they are really needed, limit their use.
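When JavaScript really is needed, one common mitigation (a sketch, not from the book; the script and page names are hypothetical) is to provide a plain-HTML fallback inside a `<noscript>` element, so spiders and visitors without scripts still reach the content:

```html
<script src="fancy-menu.js"></script>
<noscript>
  <!-- Plain links that spiders can read even when scripts are ignored -->
  <ul>
    <li><a href="/products.html">Products</a></li>
    <li><a href="/contact.html">Contact</a></li>
  </ul>
</noscript>
```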
URL
Another important thing to consider in building a search engine friendly website is your URLs. Search engine spiders find and process web pages by following links from one page to another, so build links that can be easily followed. Broken links are also widely believed to degrade your rank in search results.
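For example (the URLs here are hypothetical), a plain anchor with a static, descriptive URL is easy for a spider to follow, while navigation hidden behind a script can be a dead end:

```html
<!-- Easy to follow: a plain link with a clean, descriptive URL -->
<a href="/articles/seo-basics.html">SEO Basics</a>

<!-- Hard to follow: spiders generally do not execute the script below -->
<span onclick="location.href='/articles/seo-basics.html'">SEO Basics</span>
```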
Accessibility
Consider also using a site map to organize your website. If a page is buried too deep, meaning the visitor has to click through several pages to reach it, search engines will have a harder time indexing it. A well-organized site structure helps ensure that search engines process every page on your site.
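A site map can be as simple as an HTML page of links, or an XML file submitted to search engines. A minimal sketch of the XML form under the sitemaps.org protocol (the URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
  </url>
  <url>
    <!-- Listing deep pages here lets spiders find them directly -->
    <loc>https://www.example.com/articles/seo-basics.html</loc>
  </url>
</urlset>
```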
File Size
File size also plays a major role in building a search engine friendly website. A page larger than 100 KB (or 150 KB, depending on the search engine) may not be fully cached, which means it may not appear in search results. The safest approach is to keep each page of your website under 100 KB.
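One quick way to stay under that limit is to measure the page before publishing. A sketch using standard shell tools (the file name and URL are placeholders):

```shell
# Create a sample page, then count its bytes;
# keep the result under roughly 100 KB (102400 bytes).
printf '%s' '<html><body>Hello</body></html>' > page.html
wc -c < page.html

# For an already-published page, curl can report the downloaded size:
# curl -so /dev/null -w '%{size_download}\n' https://www.example.com/
```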
Robots.txt
The robots.txt file is another important thing to consider, as this file tells search engine spiders which parts of your website they may crawl and index. Spiders work with limited capacity, so robots.txt helps direct them toward the pages you really want visitors to find.
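A sketch of a robots.txt file (the directory names are placeholders) that lets all spiders crawl the site except a couple of directories not meant for visitors:

```
# Applies to all spiders
User-agent: *
# Keep crawlers out of pages not meant for visitors
Disallow: /private/
Disallow: /drafts/
```

Note that the file must be named exactly robots.txt and placed at the root of the site to take effect.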
Info by: Leia Mahalo