Recently, Rigor conducted a test on a comScore top 10 sports media website to understand some of the causes of downtime and latency for a highly trafficked publisher. The goal was to see whether the main sources of latency were internal resources or third-party resources, and to better identify the role of ads and ad servers in page performance. What we found was pretty surprising.
Here are some high level performance stats that we observed over the 2 week period:
- Average Page Size: 3.29 MB
- Average Response Time: 11 seconds
- Average Number of Page Requests: 326.3
What we found was that over two weeks, publisher downtime attributable to first-party content was less than 0.03%. By contrast, third-party resources accounted for 1.33% of downtime, dropping the total uptime of the website from 99.97% to 98.64%.
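The uptime math here is simple but worth making explicit; a minimal sketch in Python, using the downtime percentages measured in this test:

```python
# Downtime percentages observed over the two-week test period.
first_party_downtime = 0.03   # % of runs failing due to first-party content
third_party_downtime = 1.33   # % of runs failing due to third-party resources

# Uptime if only first-party failures counted, vs. actual total uptime.
uptime_first_party_only = 100 - first_party_downtime
total_uptime = 100 - first_party_downtime - third_party_downtime

print(f"{uptime_first_party_only:.2f}% -> {total_uptime:.2f}%")
# prints "99.97% -> 98.64%"
```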
While images accounted for nearly 40% of the total page weight (1.21 MB of 3.31 MB) and roughly 40% of the requests (133 out of 326.3), they accounted for only about 1 second of load time, or roughly 10% of the total latency.
Ad servers accounted for only 14% of the total page weight (0.46 MB out of 3.31 MB), but were responsible for roughly a third of the total requests (105.6 out of 326.3). As a result, they contributed an average of 5 seconds of latency, over 40% of the total page load time. To make matters worse, of the 50 failed runs that occurred over the sample period, 48 were attributable to requests made by ads.
Other Third Parties
All other third parties accounted for roughly 1 second of page latency while making roughly 10% of all page requests (33.5 out of 326.3) and 12% of the total page weight (0.42 MB out of 3.31 MB).
We configured four checks that tested the website's desktop homepage and gathered data over a two-week period. The first was our baseline test, which loaded the typical webpage a user would see. The second loaded the webpage but prevented the browser from requesting images on the page. The third loaded the page but prevented the browser from requesting any ads. The final test loaded only first-party resources and prevented the browser from loading any third-party resources. Each check ran in a Firefox browser every 5 minutes from 11 different locations (both domestic and international) in a round-robin format.
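The request-blocking behind these four checks can be sketched as a URL filter. This is only an illustration: the domain lists below are hypothetical stand-ins, since the real checks were configured in Rigor's monitoring platform rather than in code like this.

```python
from urllib.parse import urlparse

# Hypothetical domain lists -- stand-ins for the publisher's real configuration.
FIRST_PARTY = {"example-sports-site.com", "cdn.example-sports-site.com"}
AD_DOMAINS = {"adserver.example.net", "ads.exchange.example"}
IMAGE_EXTENSIONS = (".png", ".jpg", ".jpeg", ".gif", ".webp")

def _matches(host: str, domains: set) -> bool:
    """True if host is one of the domains or a subdomain of one."""
    return any(host == d or host.endswith("." + d) for d in domains)

def should_block(url: str, check: str) -> bool:
    """Decide whether a request should be blocked for a given check.

    check is one of: "baseline", "no-images", "no-ads", "first-party-only".
    """
    parsed = urlparse(url)
    host = parsed.hostname or ""
    if check == "baseline":
        return False  # load everything, as a real user would
    if check == "no-images":
        return parsed.path.lower().endswith(IMAGE_EXTENSIONS)
    if check == "no-ads":
        return _matches(host, AD_DOMAINS)
    if check == "first-party-only":
        return not _matches(host, FIRST_PARTY)
    raise ValueError(f"unknown check: {check}")
```

Comparing response times across the four filters is what lets you attribute latency to images, ads, or other third parties rather than to first-party content.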
- Homepage Baseline – 11s response time, 98.64% uptime (50 errors)
Once we collected the data, we viewed the performance history charts for each of these tests and noticed a disparity in performance and uptime between the checks that contained ads and those that did not. We then drilled into the failed runs on the baseline chart and examined the waterfall chart of HAR data to confirm that ad domains were causing a significant share of the performance hangups.
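This kind of per-domain attribution can be reproduced with a short script over any standard HAR 1.2 file; a minimal sketch (the HAR snippet below is synthetic, not data from this test):

```python
from collections import defaultdict
from urllib.parse import urlparse

def latency_by_host(har: dict) -> dict:
    """Sum each request's total 'time' (ms) from a HAR log, grouped by hostname."""
    totals = defaultdict(float)
    for entry in har["log"]["entries"]:
        host = urlparse(entry["request"]["url"]).hostname or "unknown"
        totals[host] += entry.get("time", 0)
    return dict(totals)

# Synthetic HAR fragment for illustration only.
har = {"log": {"entries": [
    {"request": {"url": "https://example-sports-site.com/"}, "time": 300},
    {"request": {"url": "https://adserver.example.net/tag.js"}, "time": 1200},
    {"request": {"url": "https://adserver.example.net/pixel"}, "time": 400},
]}}

print(latency_by_host(har))
# prints {'example-sports-site.com': 300, 'adserver.example.net': 1600}
```

Sorting the resulting totals makes it easy to spot which third-party domains dominate the waterfall.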
In this instance, ads were the main source of latency and downtime for this website, but we know the publisher in question cannot simply remove its sole source of revenue from the site. However, there are multiple ad servers and ad exchanges out there. By testing your websites for performance and quantifying the impact of third parties, you can keep your ad vendors and technology partners from getting complacent. The second you spot performance degradation, make your vendor aware. If a third party is consistently degrading performance, start shopping around for a new vendor. Only then will we as an industry start to see performance gains in the ad world. For publishers and ad servers looking to optimize first-party content, here is a list of best practices.
Publishers shouldn’t be the only ones advocating for performance gains. More than anyone else, ad buyers should be promoting performance when determining how their ads will be delivered to their audience. A recent study found that, “If a page takes 8+ seconds to load, visitors will spend only 1% of their time on page looking at ads.” We are all stakeholders in the goal to make a faster internet. Let’s make it happen.