Google Search Console Announces Updated robots.txt Testing Tool

Big news from the Google Webmaster Central Blog: Google has updated its robots.txt testing tool. Websites can now test changes to their robots.txt files before pushing them live.

As a refresher: the robots.txt file tells search engines which pages on a site to crawl and which not to crawl. “Why not crawl all my pages, though?” Nearly every website has URLs there is no good reason for a search engine to crawl: login screens for your backend, comment sections, and other pages that have no business being crawled or indexed.
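As an illustration, a bare-bones robots.txt along these lines (the paths here are hypothetical; yours will differ) tells well-behaved crawlers to skip a site’s admin and comment URLs while leaving everything else open:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /comments/
```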

That’s pretty much the brass tacks of what a robots.txt file does for a website. However, creating one can be a tricky process if you’re not well-versed in how to go about it. In Google’s effort to help every website be the “best” it possibly can be, they have announced an updated tool to help you out! According to Google, “[With this new tool, you] can test new URLs to see whether they’re disallowed for crawling. To guide your way through complicated directives, it will highlight the specific one that led to the final decision. You can make changes in the file and test those too; you’ll need to upload the new version of the file to your server afterward to make the changes take effect.”
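If you want to sanity-check URLs against a draft robots.txt outside of Search Console, you can do something similar locally. Here’s a minimal sketch using Python’s standard urllib.robotparser; the directives and URLs are made-up examples, and this is not the same engine Google’s tool uses:

```python
# Minimal sketch: check URLs against a draft robots.txt locally with
# Python's standard-library urllib.robotparser. The directives and URLs
# below are hypothetical examples, not taken from Google's tool.
from urllib.robotparser import RobotFileParser

draft_robots = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /comments/
"""

parser = RobotFileParser()
parser.parse(draft_robots.splitlines())

for url in ("https://www.example.com/wp-admin/login.php",
            "https://www.example.com/blog/latest-post/"):
    verdict = "allowed" if parser.can_fetch("*", url) else "disallowed"
    print(f"{url} -> {verdict}")
```

Run against the draft above, the first URL comes back disallowed and the second allowed, which is the same kind of per-URL verdict the updated tool surfaces, along with the specific directive responsible.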

For more information about this tool and how it can benefit you and your site, check out the Webmaster Central blog post.
