
Crawl Errors Update from Google Webmaster Tools

One of the most important things for webmasters to do is register and verify their site on Google Webmaster Tools.

Google Webmaster Tools provides a variety of information: site configuration details (sitemaps, sitelinks, crawler access, etc.) and details about your links, keywords, and search queries in the "Your site on the web" section. The Diagnostics section provides vital information such as HTML suggestions, malware, crawl errors, and crawl stats. The Labs section shares interesting details like site performance and instant previews.

Most webmasters check Google Webmaster Tools at least once a day, and one vital section to review is Crawl Errors. The Google Webmaster Central blog has now posted an update about crawl errors.

The post starts by stating that enhancements have been made to crawl errors, which are now divided into two sections:

    Site errors
    URL errors

Site Errors

These categories were created to make errors more user friendly. Errors such as DNS resolution failures, connectivity issues, and problems fetching robots.txt files were previously reported as URL errors; henceforth they will be reported under Site Errors, since they are not related to a specific URL. If these errors occur at a high frequency, alerts will be sent to you. If your site is free of them, as most sites are, you will simply see friendly check marks indicating that your website has no such issues.
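If you want to double-check these three conditions yourself before waiting on Googlebot, a minimal sketch along the following lines (using only Python's standard library and a hypothetical example.com domain) can confirm that DNS resolves, the server accepts connections, and robots.txt can be fetched:

```python
import socket
import urllib.request

SITE = "example.com"  # hypothetical domain; replace with your own

# 1. DNS resolution: does the hostname resolve to an IP address?
ip = socket.gethostbyname(SITE)
print(f"DNS OK: {SITE} -> {ip}")

# 2. Server connectivity: can we open a TCP connection on port 80?
with socket.create_connection((SITE, 80), timeout=10):
    print("Connectivity OK: server accepted the connection")

# 3. robots.txt fetch: does the file come back with a 200 status?
with urllib.request.urlopen(f"http://{SITE}/robots.txt", timeout=10) as resp:
    print(f"robots.txt OK: HTTP {resp.status}, {len(resp.read())} bytes")
```

If any of these three steps fails, that failure corresponds to one of the Site Error categories above.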

URL Errors

If Google Webmaster Tools indicates URL errors, it means "it was able to resolve your DNS, connect to your server, fetch and read your robots.txt file, and then request this URL, but something went wrong after that". Separate categories will be displayed for Google News or mobile (CHTML/XHTML) errors.
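To see first-hand what Googlebot ran into for a particular URL, a quick status check like the sketch below (a hypothetical http://example.com/old-page URL is assumed) reports the HTTP response code, which is typically the cause of a URL error such as a 404 or 500:

```python
import urllib.request
import urllib.error

URL = "http://example.com/old-page"  # hypothetical URL reported as an error

try:
    with urllib.request.urlopen(URL, timeout=10) as resp:
        print(f"{URL} returned HTTP {resp.status}")
except urllib.error.HTTPError as err:
    # 4xx/5xx responses raise HTTPError; err.code is the status Googlebot would see
    print(f"{URL} returned HTTP {err.code} ({err.reason})")
except urllib.error.URLError as err:
    # Connection-level failures (DNS, refused connection, timeout)
    print(f"{URL} could not be fetched: {err.reason}")
```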

Error Presentation

Previously, up to 100,000 errors of each type were shown, but there was no way to tell which errors mattered most. Keeping this in mind, Webmaster Tools will now present the 1,000 most important errors for each category, and the webmaster can view details about them and work on fixing them.

Sites with more than 1,000 errors will still be able to see the total error count for each category, and for those who need further detail, Google is considering adding an API to download all errors.

One more important update is that "blocked by robots.txt" errors have been removed, as Google Webmaster Tools considers pages blocked via robots.txt to have been intentionally blocked by the webmaster. These URLs will soon be reported under Crawler Access in Site Configuration instead.
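If you want to confirm that a page really is blocked on purpose, Python's built-in robots.txt parser can answer the same question Googlebot asks; the sketch below assumes a hypothetical example.com site and /private/ path:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical site and path; substitute your own domain and the URL in question
parser = RobotFileParser("http://example.com/robots.txt")
parser.read()

url = "http://example.com/private/report.html"
if parser.can_fetch("Googlebot", url):
    print(f"Googlebot is allowed to crawl {url}")
else:
    print(f"{url} is blocked by robots.txt (intentional, not a crawl error)")
```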

User-Friendly Error Details

Clicking on an error URL opens an additional pane with useful information, such as when Googlebot last tried to crawl the URL, when the problem first occurred, and a brief explanation of the error. From the details pane you can click through to the URL itself to check the error. There are also options to mark the error as fixed, see other pages that link to the URL, and more such useful details, and you can have Googlebot fetch the URL to double-check whether the error is fixed.

The prioritized list points you toward actions such as "fixing broken links on your own site, fixing bugs in your server software, updating your Sitemaps to prune dead URLs, or adding a 301 redirect to get users to the 'real' page." Priority is decided based on a number of factors, such as whether the URL is included in your Sitemap, how many internal links point to it, and various other signals. If you are a user with full permissions you can mark an error as fixed, and it will be removed from your top-errors list unless Googlebot encounters the same error again.
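For the 301 redirect fix in particular, it is worth verifying that the old URL really answers with a permanent redirect before marking the error as fixed. A small check like the one below (hypothetical old and new URLs are assumed) reads the status code and Location header without following the redirect:

```python
import http.client

HOST = "example.com"                       # hypothetical domain
OLD_PATH = "/old-page"                     # hypothetical dead URL from the error report
EXPECTED = "http://example.com/new-page"   # hypothetical "real" page it should redirect to

# HEAD request without following redirects, so we see the raw answer for the old URL
conn = http.client.HTTPConnection(HOST, timeout=10)
conn.request("HEAD", OLD_PATH)
resp = conn.getresponse()
location = resp.getheader("Location")

if resp.status == 301 and location == EXPECTED:
    print(f"301 in place: {OLD_PATH} -> {location}")
else:
    print(f"Not fixed yet: HTTP {resp.status}, Location: {location}")
conn.close()
```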

Hopefully these changes will help webmasters get their site errors corrected sooner.

