Is the Robots.txt file necessary for SEO?

Plenty of blog posts detail the advantages and features of robots.txt files, but does your site even need one? And does it have any impact on your search engine optimization strategy?

A robots.txt file is used to block search engine spiders from crawling individual pages or directories. On most sites, private information is already blocked because it is delivered over SSL, so there is no point in having a robots.txt file.
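For context, blocking a page or directory takes only a couple of directives. A minimal sketch (the directory name here is purely illustrative, not one your site necessarily has):

```text
# Applies to all crawlers; keeps them out of one example directory
User-agent: *
Disallow: /private-reports/
```

Everything not listed under a `Disallow` rule remains crawlable by default, which is exactly why an empty or missing robots.txt changes nothing for most sites.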

But Steve, you say, Google recommends using a robots.txt file to “tell the crawlers which directories can or cannot be crawled.” Well sure, but if you have no problem with Googlebot crawling everything, there is no need for a robots.txt file, right? Why tell it to do what it is already going to try to do anyway: crawl everything?

There are only two cases in which I recommend using a robots.txt file. The first (and I think the Google Webmaster guidelines say it best) is “to prevent crawling of search results pages or other auto-generated pages that don’t add much value for users coming from search engines.” The second is when you cannot verify your site’s XML sitemap for some reason (something that should be fixed); in the meantime, you can use robots.txt to specify the sitemap’s location on your server.
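Both of those cases can be covered in one short file. A hypothetical sketch, where the `/search/` path and the sitemap URL are illustrative placeholders for your own:

```text
# Keep auto-generated search results pages out of the index path
User-agent: *
Disallow: /search/

# Point crawlers at the XML sitemap's location
Sitemap: https://www.example.com/sitemap.xml
```

The `Sitemap:` directive is a standalone line that is not tied to any `User-agent` group, so this works even if the rest of the file allows everything.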
