
At Rigor, we spend our days enabling organizations of all kinds to improve the performance and reliability of their websites and web applications.

When we first started out, we focused primarily on eCommerce organizations. This made sense as the ROI model was easy to understand and pitch:

  • Every 1/10 of a second shaved off load time correlates with a 1% increase in sales.
  • A 1-second delay causes a 7% loss in conversions, an 11% decline in page views, and a 16% decrease in customer satisfaction.
  • The average online shopper expects your page to load in 3 seconds or less; after 3 seconds, 57% of visitors will abandon the page.
  • 20% of users abandon their carts because of long load times, and about 67% of online shoppers cite slow load times as the top reason they would abandon a purchase.

And the list goes on and on.

Publishers and Advertisers

As we grew, we soon found success talking with a new type of user: publishers. For those unfamiliar with publishers and the “Online Media” industry, these organizations publish content online to generate advertising dollars based on impressions and ad click-throughs. This includes traditional news sites like CNN and The Weather Channel, recipe sites like Allrecipes, and now even cable networks like Viacom. These companies constantly measure user engagement with a variety of metrics:

  • Unique Visitors – the number of distinct visitors to the site in a given timeframe (ideally, the same user is never counted more than once)
  • Page Views – the average number of pages viewed by a user in a given session
  • Social Media Shares – the number of times their content is shared on social networks such as Twitter or LinkedIn
  • Ad Clickthroughs – the number of users who click the ads on a page
  • Time on Site – the average length of time a user stays engaged on the website

Poor performance can take a serious toll on many of these KPIs.

For example, studies have shown that a website that loads in more than 5 seconds has a bounce rate twice as high as a website that loads in 1 second. Bounce rate is the percentage of users who abandon a website after viewing only one page. So if a slow load time can double a website’s bounce rate, then poor performance can be tied directly to decreases in page views, time on site, and ad clickthroughs.

If we set aside the impact of performance on engagement metrics and simply look at the relationship between performance and revenue, we find that the overall performance of a publisher’s website has a direct impact on the money it makes. In fact, in 2009, Bing found that a 2-second delay reduced advertising revenue per user by 4.3%.

That was nearly six years ago, when desktop and mobile advertising was a fraction of what it is today. The image below illustrates the dramatic shift in web advertising over the past two decades:

[Image: the evolution of digital media advertising]

As online advertising has continued to grow, so has the size of the websites that display it. On March 1st, 2015, the average page size of the top 1,000 most-visited websites exceeded 2 MB!

[Image: average bytes per page for the top 1,000 websites, March 1st, 2015]

The increasing weight and bloat of web pages has created a serious roadblock for publishers. As page views and clickthroughs have driven ad rates down, advertisers have begun to hone in on time on site, believing it to be a better indicator of ad viewability. Why does this matter? Because 54% of digital ads aren’t viewable!

Some of these “non-viewable” ads sit at the bottom of a page, “below the fold.” Their poor viewability is less troubling because the placements are cheap and advertisers know up front they will be seen less often. But cheap below-the-fold placements don’t come close to accounting for the full figure. So how did we get to 54%?

The Interactive Advertising Bureau (IAB) defines a viewable ad as having “a minimum of 50 percent of the pixels in view for a minimum of 1 second.”
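
To make that definition concrete, here is a minimal sketch, for illustration only, of how a page could check the 50%-of-pixels-for-1-second rule itself. It is not any vendor’s measurement code: the element id "ad-slot" and the trackViewability helper are assumed names, and the approach simply polls getBoundingClientRect.

```typescript
// A rough sketch of the IAB rule: at least 50% of the ad's pixels in
// view for at least one continuous second. "ad-slot" is an assumed id.

function isAtLeastHalfVisible(el: Element): boolean {
  const rect = el.getBoundingClientRect();

  // Size of the portion of the element that overlaps the viewport.
  const visibleWidth = Math.min(rect.right, window.innerWidth) - Math.max(rect.left, 0);
  const visibleHeight = Math.min(rect.bottom, window.innerHeight) - Math.max(rect.top, 0);
  if (visibleWidth <= 0 || visibleHeight <= 0) return false;

  const totalArea = rect.width * rect.height;
  return totalArea > 0 && (visibleWidth * visibleHeight) / totalArea >= 0.5;
}

function trackViewability(ad: Element, onViewable: () => void): void {
  let visibleSince: number | null = null;

  const timer = window.setInterval(() => {
    if (isAtLeastHalfVisible(ad)) {
      if (visibleSince === null) {
        visibleSince = Date.now();                 // 50% threshold just crossed
      } else if (Date.now() - visibleSince >= 1000) {
        window.clearInterval(timer);               // stop polling once counted
        onViewable();                              // record a viewable impression
      }
    } else {
      visibleSince = null;                         // dropped below 50%; reset the clock
    }
  }, 100); // poll roughly every 100 ms
}

const slot = document.getElementById("ad-slot");
if (slot) {
  trackViewability(slot, () => console.log("Viewable impression"));
}
```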

What researchers found is that ads often never load, load too late in a user’s session to score an impression, or fail to appear at all because of poor page design (think of a desktop page unoptimized for mobile). Regardless of the cause, advertisers have had enough of getting only about half of the ads they pay for in front of actual viewers.

[Image: viewability of display ads, 2014, by Augustine Fou]

In 2015, agencies and the brands they represent are taking a stand against poor ad viewability. Many agencies now state that they will pay only for the percentage of ads that are actually viewable, or they negotiate additional ad placements to make up for the shortfall.

So what are publishers doing to combat this growing issue? Making performance a top priority. Publisher sites tend to be incredibly heavy on the front end, requiring a large number of assets (sometimes up to 400 for a single page!) to load before the page can render. That bloat leaves a lot of room for analysis and improvement.

Every day we help organizations like The Weather Channel, Vice Media, Condé Nast, and The Daily Beast track KPIs such as the ones below (a short sketch of how a few of them can be measured in the browser follows the list):

  • Page size
  • Load time
  • Asset count
  • Third-party domain performance
  • Responsive design performance across devices
  • Geographic availability
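
As an illustration only (not how our monitoring actually works), the sketch below uses the browser’s Resource Timing API to approximate a few of these numbers from inside a page: asset count, rough page weight, and the set of third-party domains. The summarizeResources name is hypothetical, and transferSize reports 0 for cached responses and for cross-origin resources that don’t send Timing-Allow-Origin, so treat the byte total as a lower bound.

```typescript
// An illustrative sketch using the Resource Timing API to approximate
// asset count, page weight, and third-party domain usage from inside a
// page. transferSize is 0 for cached responses and for cross-origin
// resources without Timing-Allow-Origin, so the byte total is a floor.

function summarizeResources(): void {
  const resources = performance.getEntriesByType(
    "resource"
  ) as PerformanceResourceTiming[];

  // Approximate page weight: bytes transferred over the network.
  const totalBytes = resources.reduce((sum, r) => sum + r.transferSize, 0);

  // Distinct hostnames other than the page's own.
  const thirdPartyDomains = new Set(
    resources
      .map((r) => new URL(r.name).hostname)
      .filter((host) => host !== location.hostname)
  );

  console.log(`Asset count: ${resources.length}`);
  console.log(`Approx. page weight: ${(totalBytes / 1024).toFixed(1)} KB`);
  console.log(`Third-party domains: ${thirdPartyDomains.size}`);
}

// Run after the load event so most resources have already been recorded.
window.addEventListener("load", summarizeResources);
```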

These organizations know that as viewability measurement solutions across devices expand, real-time bidding platforms will optimize towards viewable impressions. In other words, these solutions will automatically start spending more on highly viewable ad inventory.

Consequently, advertising dollars will flow towards publishers that improve and maximize viewability. The publishers we work with have already started the process of optimizing their pages and inventory so they can readily monetize the effect of good web design on ad viewability.

Serving content to the user faster isn’t just a necessity for retailers. Optimizing web performance makes sense for every revenue model. Just ask our customers.
