Monday, July 21, 2014

Hello, new robots.txt Tester Update



The robots.txt Tester in Google Webmaster Tools has received an update that helps you detect errors that appear when Google tries to crawl your site, lets you edit your file and check that individual URLs aren't blocked, and makes older versions of the file available for review.

If Google cannot crawl a page on your site, the robots.txt Tester, found under the Crawl section, now lets you check whether anything in your file is blocking Googlebot.

To guide you through complicated directives, the tool highlights the directive that leads to the final decision. It also lets you edit the file and test your changes immediately.
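As a rough local approximation of that kind of check, a short sketch with Python's standard urllib.robotparser module can evaluate a robots.txt file against specific URLs. The file contents and URLs below are made-up examples, and Python applies rules in file order while Google uses the most specific (longest) matching rule, so the official Tester remains the authority for conflicting Allow/Disallow pairs.

from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents; the Allow line is listed first because
# Python's parser uses first-match order, whereas Google picks the most
# specific matching rule.
ROBOTS_TXT = """\
User-agent: *
Allow: /private/public-page.html
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

for url in ("http://www.example.com/private/public-page.html",
            "http://www.example.com/private/secret.html"):
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "blocked"
    print(url, "->", verdict)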

To make the changes take effect, you just have to upload the new version of the file to your server. For instance, when Googlebot gets a 500 error for the robots.txt file, it usually stops crawling the website altogether.
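A quick way to spot that situation is to check what HTTP status your server actually returns for robots.txt. The minimal sketch below uses Python's urllib and a placeholder domain; it only reports the status code and does not reproduce Googlebot's behaviour.

import urllib.error
import urllib.request

# Placeholder domain; substitute your own site.
ROBOTS_URL = "https://www.example.com/robots.txt"

try:
    with urllib.request.urlopen(ROBOTS_URL, timeout=10) as response:
        print(ROBOTS_URL, "returned HTTP", response.status)
except urllib.error.HTTPError as err:
    # A 5xx status here is the situation described above: Googlebot
    # typically pauses crawling of the site until robots.txt is reachable.
    print(ROBOTS_URL, "returned HTTP", err.code)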
John Mueller, Webmaster Trends Analyst, noted that even if you believe your robots.txt file works perfectly, it is worth making sure that it produces no warnings or errors.

He also mentioned that many of these problems usually go unnoticed, so you should double-check whether you are blocking any JS or CSS files from crawling.
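One way to run that double-check outside of Webmaster Tools, sketched below under the assumption that you know which assets your pages load, is to point urllib.robotparser at your live robots.txt and test those CSS and JS URLs; the domain and asset paths here are placeholders.

from urllib.robotparser import RobotFileParser

# Placeholder domain and asset paths; list the CSS/JS files your pages load.
parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()

assets = [
    "https://www.example.com/css/style.css",
    "https://www.example.com/js/app.js",
]

for url in assets:
    if not parser.can_fetch("Googlebot", url):
        print("Blocked for Googlebot:", url)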

