What is a robots.txt file? This file tells search engine robots which pages on your website should be indexed and which should be disregarded when crawling. Pages with sensitive content can thus be excluded from search results at the client's request.
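As an illustration, a minimal robots.txt might look like the following (the directory names here are hypothetical, not taken from any specific site):

```
# Applies to all crawlers
User-agent: *
# Keep sensitive areas out of search results
Disallow: /private/
Disallow: /admin/
```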
When implementing an SEO campaign for a website, you should review and revise the robots.txt file once the site has gone live and the campaign has been running for a while. A robots.txt file is always located in the root directory of a website, and when a search engine robot visits the site, it looks to this file for instructions. Updated instructions make the robot aware that the website content has changed and that there are new files to be indexed.
The robots.txt file may be excluding content that needs to be indexed, and a revision of this file can correct the problem. A review will also highlight any other potential issues caused by the robots.txt file.
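A review like this can be partly automated. The sketch below uses Python's standard-library `urllib.robotparser` to check which URLs a given set of rules would block; the rules and paths are hypothetical examples, not from any real site.

```python
# Sketch: verify what a robots.txt file actually blocks, using only
# Python's standard library. The rules and paths below are illustrative.
from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: *",
    "Disallow: /private/",  # sensitive content the client wants hidden
]

rp = RobotFileParser()
rp.parse(rules)

# Confirm the sensitive area is blocked and public pages are not.
blocked = rp.can_fetch("*", "/private/report.html")
allowed = rp.can_fetch("*", "/products/index.html")
```

Running checks like these against a list of URLs that *should* be indexed will quickly surface any pages the robots.txt file is excluding by mistake.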
The main aim of a robots.txt file review is to ensure that nothing on the website is being excluded for any reason other than the client's explicit request.