Creating robots.txt Files
A robots.txt file is a plain text file that you create in the root directory of your website. It tells search engine crawlers which pages they may visit and include in their search index, and which pages they should NOT crawl or include in their results.
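For example, a minimal robots.txt (placed at the site root, so it is reachable at a URL like https://example.com/robots.txt) might look like the following sketch; the directory name is an illustrative placeholder:

```
# Apply these rules to all crawlers
User-agent: *

# Keep crawlers out of the temporary files area
Disallow: /tmp/

# Anything not disallowed is crawlable by default
```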
Google provides an overview of the robots.txt file and how to create and use one.
Keep in mind two things when generating a robots.txt file:
1. Treat the contents as a general guideline for what crawlers should and should not index. The major search engines (Google, Yahoo, Ask, Bing, etc.) will most likely stay away from the places you disallow, but poorly written or intentionally malicious crawlers may ignore the file entirely and use your Disallow entries as a map to the content you did not want indexed.
2. Because of point 1 above, do NOT list files and directories that nobody has any business visiting in the Disallow sections of your robots.txt. If hackers want to find your admin directory to break into your site, specifically disallowing the admin area points them to it like a beacon! It is best to leave such directories out of the file altogether.
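To illustrate point 2, an entry like the following (the path is a hypothetical example) does more harm than good, because robots.txt is publicly readable and the line advertises the sensitive location to anyone who fetches it:

```
User-agent: *
# BAD: this reveals the admin path to anyone who reads robots.txt
Disallow: /secret-admin/
```

A better approach is to omit the directory from robots.txt entirely and protect it with real access control, such as a password or server-level authentication.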