
Every day, Rigor robots make 1.5 million visits to over 350,000 web pages and load 23 million assets in browsers around the world. Sifting through that volume of data lets us detect many internet outages in real time. Yesterday we saw a sizable outage across many of our customer and research accounts.

When we dug into specific runs, we kept seeing the same issue: a piece of JavaScript on each page was unable to connect back to its server and blocked other content from loading. Below is a waterfall chart from one of our customers who experienced latency due to the beacon. (Not familiar with waterfall charts? Check out “How to Read a Waterfall Chart”.)

[Waterfall chart: RUM beacon latency on a customer page]

Websites such as TOMS, Citrix, Airbnb, and hundreds more experienced multi-second delays to their page loads because of issues loading a JavaScript beacon (also known as a tag). Ironically, this beacon is used to track end users’ web performance experience.

After some digging we quickly discovered that the culprit was New Relic’s browser collection beacon. This piece of javascript is how New Relic creates their Real User Monitoring (RUM) data. Our suspicion was confirmed shortly thereafter via New Relic’s public status page as seen in the picture below:

[Screenshot: New Relic’s public status page reporting the browser collection beacon outage]

In all, the New Relic outage lasted nearly two full hours and caused significant UX problems for end users. This got me thinking about the two blogs I wrote in December describing the benefits and shortcomings of RUM and how it complements synthetic monitoring.

One key differentiator between RUM and Synthetics that I failed to mention in those posts is the external nature of Synthetics. For all the benefits of RUM, loading any external content on your site carries risk. Synthetic monitoring, by contrast, runs entirely outside your site and requires no client-side installation or web tag. As the New Relic outage so clearly illustrated, RUM at its worst can be the source of serious performance problems, and at its best it is still one more asset your users must load at a time when web pages are already growing, and slowing, at an alarming rate.
