File and Directory Structure

Tuesday, September 2, 2008

Directory structure

Most search engines don't index much beyond two directory levels deep. Within those directories they will typically index 40 to 50 files, working alphabetically. It is therefore crucial to place your most important pages at the first or second directory level and to limit each directory to about 50 files. Name your files and directories with your keywords. Don't use underscores to separate keywords; use hyphens instead. Don't stuff too many keywords into your file or directory names: make them keyword rich but not overly long. Name image files after keywords, which is particularly important now that many search engines offer image search. Name your PDF files after your keywords as well.
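As a small sketch of the naming advice above (the helper name and example titles are my own, not part of any standard tool), a page title can be turned into a keyword-rich, hyphen-separated file name like this:

```python
import re

def seo_filename(title, extension):
    """Build a hyphen-separated, lowercase file name from a page title."""
    slug = title.lower()
    # Replace underscores, spaces, and punctuation runs with a single hyphen
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    slug = slug.strip("-")
    return f"{slug}.{extension}"

print(seo_filename("Organic Herb Gardening Tips", "html"))
# organic-herb-gardening-tips.html
```

The same rule applies to image and PDF names, e.g. `seo_filename("Herb Garden Photo", "jpg")` gives `herb-garden-photo.jpg`.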

Entry pages

Entry pages are the pages that bring you traffic, and each one should be optimized and submitted to directories and search engines. Make these pages stand alone, like your home page: when visitors land on an entry page, they need to know where they are, who your organization is, and what the page is about. Include full navigation on every entry page and make it obvious what the page and the site are about. Don't assume visitors will find the index page first.
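As an illustration (the site name, titles, and links are hypothetical), a stand-alone entry page carries its own branding and full site navigation rather than assuming the visitor arrived via the home page:

```html
<!-- Hypothetical entry page: identifies the site, the organization, and the topic -->
<html>
<head><title>Organic Herb Gardening Tips - Example Gardens</title></head>
<body>
  <h1>Example Gardens</h1>
  <!-- Full site navigation, repeated on every entry page -->
  <ul>
    <li><a href="/index.html">Home</a></li>
    <li><a href="/herb-gardening-tips.html">Herb Gardening Tips</a></li>
    <li><a href="/contact.html">Contact</a></li>
  </ul>
  <h2>Organic Herb Gardening Tips</h2>
  <p>Page content goes here.</p>
</body>
</html>
```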

Robots.txt file

Before indexing a site, search engine robots check a plain text file named robots.txt in the root of the server. This file implements the Robots Exclusion Protocol, which lets the website administrator declare which parts of the site are off-limits to specific robot user-agent names. For example, an administrator can disallow access to the Common Gateway Interface (CGI), private, and temporary directories because pages in those areas should not be indexed.
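As an example (the directory paths are hypothetical), a robots.txt that blocks the CGI, private, and temporary directories for all robots would look like:

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
Disallow: /tmp/
```

`User-agent: *` applies the rules to every robot; a rule block can instead name a specific crawler's user-agent to restrict only that robot.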


posted by Alenjoe @ 2:36 PM
