New Robots.txt Tester from Google

Google has just launched a new robots.txt testing tool in Google Webmaster Tools. The updated tester highlights errors that prevent Google from crawling pages on your website, lets you edit your robots.txt file and test individual URLs to see whether they are blocked, and also lets you view older versions of the file.

If you find that Google is not crawling a specific page or a part of your site, you can use the robots.txt tester, which can be found under the Crawl section of Google Webmaster Tools, to test whether there is an issue with the robots.txt file. In many cases, it is an erroneous directive in that file which prevents websites from being properly indexed.
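As a purely illustrative example (the paths here are hypothetical), a single leftover directive like the one below is enough to hide an entire section of a site from every crawler, including Googlebot:

    User-agent: *
    # Left over from a staging setup; blocks every URL under /products/
    Disallow: /products/
    Disallow: /tmp/

Pasting a URL such as /products/blue-widget into the tester would show it as blocked and point to the Disallow line responsible.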

Easy to Understand

One of the biggest improvements in the new tool is that it highlights the specific directive that Google followed when deciding to block a URL from crawling. This can be invaluable when there are multiple conflicting directives. You can make changes to the file and test them on the fly, and once you have found settings that work well you can upload the new version of the file to your web server and see the changes take effect.
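As a hypothetical illustration, a file might contain two rules that both match the same URL:

    User-agent: *
    Disallow: /blog/
    Allow: /blog/featured-post/

Google resolves such conflicts by applying the most specific (longest) matching rule, so /blog/featured-post/ remains crawlable even though the rest of /blog/ is disallowed, and the tester will highlight which of the two rules was used for the URL you enter.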

You will also be able to review past versions of the file and look through logs to see whether there are any problems with crawling your website. There are many things that might stop Google from properly crawling your site. For example, if requesting the robots.txt file returns a 500 server error, the search engine will usually pause its crawl of the site until that error is resolved.

Every Webmaster Should Check for Crawl Errors

It is a good idea for webmasters to check the status of their robots.txt file in Webmaster Tools, even if they think there are no problems. In many cases, errors can be very subtle. For example, your CSS or JS files may be blocked while the HTML content itself loads properly. This can cause your page to render poorly for Googlebot; that alone will not prevent the page from being indexed, but it could have an adverse impact on your rankings. It takes just a few minutes to check for errors, and it could reveal some easy-to-fix issues that will greatly benefit your position in the SERPs.
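For instance, a blanket rule like the hypothetical one below blocks an entire assets directory, CSS and JavaScript included; adding more specific Allow rules is one way to let Googlebot fetch those resources while keeping the rest of the directory blocked:

    User-agent: *
    Disallow: /assets/
    # More specific rules so Googlebot can still render the page properly
    Allow: /assets/css/
    Allow: /assets/js/

Testing a .css or .js URL in the tool before and after the change is a quick way to confirm the fix.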

 
