Bing’s new robots.txt tester can help SEOs identify crawling issues


Bing has added a robots.txt tester to its Webmaster Tools, the company announced Friday. The new feature allows SEOs to analyze their robots.txt files and highlights issues that may prevent Bing from crawling their sites optimally.

The robots.txt tester and editor within Bing Webmaster Tools.

How it works. SEOs can use the tool to test and validate their robots.txt file, or to check whether a specific URL is blocked, which directive is blocking it, and for which user agent.
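That per-URL, per-user-agent check is the same logic the Robots Exclusion Protocol defines, and it can be sketched outside Bing's tool with Python's standard `urllib.robotparser`. The robots.txt rules below are hypothetical, chosen only to show one URL blocked for one crawler but not the other:

```python
from urllib import robotparser

# A hypothetical robots.txt with separate groups for Bing's two crawlers.
robots_txt = """\
User-agent: bingbot
Disallow: /private/

User-agent: adidxbot
Disallow: /checkout/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check the same URLs against both crawler identities, much as Bing's
# tester does when toggling between Bingbot and AdIdxbot.
for agent in ("bingbot", "adidxbot"):
    for path in ("/private/page.html", "/checkout/cart"):
        print(agent, path, "allowed" if rp.can_fetch(agent, path) else "blocked")
```

Running this shows `/private/page.html` blocked only for bingbot and `/checkout/cart` blocked only for adidxbot, which is exactly the kind of per-agent discrepancy the tester surfaces.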

Changes can also be made to robots.txt files using the editor. The test functionality can check the submitted URL against the content of the editor, allowing SEOs and site owners to check the URL for errors on the spot.

The edited robots.txt file can be downloaded for offline updates and, if changes have been made elsewhere, the fetch option can be used to retrieve the latest live version of the file.

The tester operates as Bingbot and AdIdxbot (the crawler used by Bing Ads) would, and there’s an option to toggle between the two. The tool also enables SEOs to submit a request to let Bing know that their robots.txt file has been updated.

Why we care. Following the required formats and syntax related to robots.txt can be complex, which may lead to errors that result in suboptimal crawling. This tool can help highlight crawling issues for SEOs and webmasters, enabling them to troubleshoot their robots.txt files more easily.

About The Author

George Nguyen is an editor for Search Engine Land, covering organic search, podcasting and e-commerce. His background is in journalism and content marketing. Prior to entering the industry, he worked as a radio personality, writer, podcast host and public school teacher.
