File and Directory Structure
2. Entry pages. Pages that bring you traffic are entry pages, and each should be optimized and submitted to directories and search engines. Make these pages stand-alone, like your home page. When visitors land on an entry page, they need to know where they are, who your organization is, and what the page is about. Include full navigation on every entry page and make it obvious what the page and the site are about. Don’t assume visitors will find the index page first.
3. Robots.txt file. Before indexing a site, search engine robots check a special plain text file called robots.txt in the root of the server. Robots.txt implements the Robots Exclusion Protocol, which lets the website administrator define which parts of the site are off-limits to specific robot user agent names. For example, web administrators can disallow access to Common Gateway Interface (CGI), private, and temporary directories because they do not want pages in those areas indexed. Learn more about search engine indexing and robots.txt files.
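A minimal robots.txt along these lines might look like the following sketch. The directory names are illustrative; substitute the actual paths on your server:

```
# Rules for all robots (the * wildcard matches any user agent name)
User-agent: *
# Keep robots out of script, private, and temporary areas
Disallow: /cgi-bin/
Disallow: /private/
Disallow: /tmp/
```

An empty `Disallow:` line (or no robots.txt at all) permits indexing of the whole site; note that the protocol is advisory, so well-behaved robots honor it but it is not an access control mechanism.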




