Google Webmaster Tools is a web service that allows webmasters to view the status of their sites as seen by Google and the Googlebot crawlers. In addition to indexing your site structure and content, the Googlebot crawlers also record data on performance and availability, which Google uses when calculating page rank. While static errors like broken links can be identified easily, intermittent crawl errors can be much more difficult to address.

The small sample size of crawls on your site and their sporadic nature can make it impossible to understand the root cause of errors using Webmaster Tools alone. Also, because crawl errors lack time stamps, you cannot correlate them with other environment changes such as traffic levels or infrastructure changes.

Zeroing in on these failures requires a more granular approach. With a basic Response Check watching the problematic URL, you will learn how a page performs over time as well as what happens to its availability during specific events like code pushes or periods of peak traffic.
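The idea behind such a check can be sketched in a few lines: poll the URL on a fixed interval and log a timestamped status code, so that intermittent failures can later be lined up against deploys or traffic spikes. This is a minimal illustration, not Rigor's implementation; the URL and interval are placeholders.

```python
import time
import urllib.request
import urllib.error
from datetime import datetime, timezone

def check_url(url, timeout=10):
    """Return (status_code, elapsed_seconds) for a single GET.

    status_code is None when the server is unreachable."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            status = resp.status
    except urllib.error.HTTPError as e:
        status = e.code          # e.g. a 502 from the load balancer
    except (urllib.error.URLError, TimeoutError):
        status = None            # connection failure or timeout
    return status, time.monotonic() - start

def monitor(url, interval=60):
    """Log one timestamped check per interval, indefinitely."""
    while True:
        status, elapsed = check_url(url)
        stamp = datetime.now(timezone.utc).isoformat()
        print(f"{stamp} {url} status={status} elapsed={elapsed:.2f}s")
        time.sleep(interval)
```

The timestamp is the important part: it is exactly what the Webmaster Tools crawl-error report lacks.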

As an example, if we look at the crawl errors below, we notice that a handful of 502 errors were picked up on the 13th. When we checked the links associated with these errors, they were back online and loading properly.

To look further into the cause of these issues, we pulled up the uptime reports for the site in question and noticed spotty availability between 4:30pm and 6:30pm:

This particular site consists of multiple web servers sitting behind a load balancer. When we look at the web servers individually, we find that one of them is experiencing 100% downtime during that same timeframe.
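Checking only the load-balanced hostname can hide a single failing backend, since the balancer routes most requests to the healthy nodes. A rough sketch of probing each server directly (the internal hostnames below are hypothetical) shows how the unhealthy node surfaces:

```python
import urllib.request
import urllib.error

# Hypothetical internal addresses for each web server behind the balancer.
BACKENDS = [
    "http://web1.internal:8080/",
    "http://web2.internal:8080/",
]

def probe(url, timeout=5):
    """Return the HTTP status code, or None if the server is unreachable."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code
    except (urllib.error.URLError, TimeoutError):
        return None

def find_unhealthy(urls):
    """Return the backends that did not answer with a 200."""
    return [u for u in urls if probe(u) != 200]
```

Running a check like this per server, rather than only against the public URL, is what isolated the outage to a single machine in this case.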

We now know that these errors were caused by an outage on a specific web server, and because we have the time frame, we can isolate the cause to a configuration change on the application server. We can see that the site started having issues shortly after the change and returned to normal when the change was rolled back.

By keeping a pulse on your site with a 24×7 monitoring system like Rigor, you will be able to find and fix the intermittent errors surfaced in Webmaster Tools. Get started with a trial of your own.
