A robots.txt file tells search engine robots which pages on your website you want indexed and which they should disregard when crawling the site. Pages with sensitive content can thus be excluded from search results at the client's request.
When implementing an SEO campaign for a website, it is important to review and revise the robots.txt file once the site has gone live and the campaign has been in action for a while. A robots.txt file is always located in the root directory of a website, and when a search engine robot visits the site, it looks to this file for instructions. Updated instructions make it aware that the website content has changed and that there are new files to be indexed.
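As a point of reference, a minimal robots.txt might look like the following. The paths and sitemap URL here are purely illustrative, not taken from any particular site:

```
User-agent: *
Disallow: /private/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

The `User-agent: *` line applies the rules to all crawlers, `Disallow` blocks a path, and the optional `Sitemap` line points robots at the file listing the pages you do want indexed.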
Secondly, the robots.txt file may be excluding content that needs to be indexed, and a revision of this file can correct the problem. A review will also highlight any other potential problems that can arise as a result of the robots.txt file.
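A common example of this kind of accidental exclusion is a `Disallow` rule that is broader than intended. In the hypothetical before-and-after below, the first rule blocks the entire blog rather than just its admin area; the revision narrows it:

```
# Before: unintentionally blocks every URL starting with /blog
User-agent: *
Disallow: /blog

# After: blocks only the admin area, leaving blog posts indexable
User-agent: *
Disallow: /blog/admin/
```

Because matching is done by path prefix, `Disallow: /blog` also matches `/blog/latest-post/`, which is exactly the sort of issue a robots.txt review is meant to catch.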
The chief aim of a robots.txt review is to ensure that nothing on the website is being excluded for any reason other than that the client asked for it.
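One way to carry out such a review is to test representative URLs against the live rules. This is a minimal sketch using Python's standard-library `urllib.robotparser`; the rules and URLs are invented for illustration:

```python
from urllib import robotparser

# Hypothetical robots.txt content, as it might appear at /robots.txt
rules = """\
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Check whether crawlers may fetch pages that should (or should not) be indexed
print(rp.can_fetch("*", "https://example.com/index.html"))      # True
print(rp.can_fetch("*", "https://example.com/private/a.html"))  # False
```

Running a list of URLs that the client wants indexed through `can_fetch` quickly flags any page the current robots.txt is excluding unintentionally.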