Above All Else, Ad Blocking is a User Experience Problem
We’re slowly learning the consequences of ad blocking going mainstream. Strategies from publishers and responses from readers have been as inventive as you’d expect.
Various payment strategies for single articles have spun up.
Ad-lite tiers are now a thing, intended to reward people who agree to turn off their ad blockers.
Web sites have started blocking visitors who use adblockers.
Adblockers, in turn, have agreed to play nice with some publishers, letting “well-behaved” ads (those that don’t autoplay sound or video) through.
Unfortunately, as a result of this struggle, some adblockers have started to block parts of sites that people actually want to see.
It’s dizzying, and it seems to ignore the root of the issue: ad blocking is a user experience problem.
Why Are People Blocking Ads Anyway?
Modern ads rarely travel alone: they bring tracking scripts, extra third-party requests, and heavyweight creative with them. All the time it takes to get that stuff in front of the viewer can be just as annoying as the old “you’ve won our millionth visitor prize” pop-up.
So people start blocking.
And it works.
Web sites can load up to 30 seconds faster with ads blocked. And here’s where all the other problems begin.

An adblocker works by aborting requests to third-party domains. If you’re on website.com and it starts loading content from ads.com, the adblocker aborts the request and you get a blank white iframe beside your article. By the same token, if website.com starts loading from cdn.com, it’s possible your adblocker can stop that too. If that’s where your images are coming from, your users might not get to see them all. In the end, they get a faster website, but they don’t get the overall experience that you have built for them.
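To make the mechanism concrete, here is a minimal sketch of host-based request blocking, the simplest form of what an adblocker does. The blocklist and domains are made up for illustration; real adblockers use far richer filter syntax (path patterns, element hiding, and so on):

```python
from urllib.parse import urlparse

# Hypothetical blocklist of known ad/tracker hosts (illustrative only).
BLOCKED_HOSTS = {"ads.com", "tracker.example"}

def should_block(page_url: str, request_url: str) -> bool:
    """Return True if a request from this page should be aborted."""
    page_host = urlparse(page_url).hostname
    req_host = urlparse(request_url).hostname
    # First-party requests are always allowed.
    if req_host == page_host:
        return False
    # Third-party requests are blocked only if the host is on the list.
    return req_host in BLOCKED_HOSTS

# The ad request is aborted; the CDN request only survives because
# cdn.com happens not to be on the list.
print(should_block("https://website.com/article", "https://ads.com/banner.js"))  # True
print(should_block("https://website.com/article", "https://cdn.com/hero.jpg"))   # False
```

This is also why legitimate assets get caught in the crossfire: from the blocker’s point of view, cdn.com is just another third-party host, and one overly broad filter rule is enough to blank out your images.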
Here’s Why Speed Matters
Adobe and PageFair have found that 67% of people using adblockers would be willing to view still image and text ads if privacy and performance concerns were addressed by publishers. But how do we get there?
One of the first lessons I learned from the internet was one of image resolution. When trying to make my own Red Hot Chili Peppers posters, I realized that 72 dpi pictures on forums weren’t going to cut it. These images had been optimized for the web, not for a teenager’s wall.
Most, if not all, images can be served on the web at 70% quality with no appreciable difference to the viewer.
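As a rough illustration of the savings, here is a small sketch using the Pillow imaging library, re-encoding the same image at quality 95 and quality 70. The in-memory noise image is a stand-in for a real photo, and the exact percentage will vary by image:

```python
import os
from io import BytesIO

from PIL import Image  # Pillow; assumed to be installed

# Random-noise stand-in for a detailed photo (noise compresses poorly,
# which makes the quality difference easy to see).
img = Image.frombytes("RGB", (200, 200), os.urandom(200 * 200 * 3))

full = BytesIO()
img.save(full, format="JPEG", quality=95)

compressed = BytesIO()
img.save(compressed, format="JPEG", quality=70)  # the "70% quality" rule of thumb

savings = 1 - len(compressed.getvalue()) / len(full.getvalue())
print(f"quality 70 is {savings:.0%} smaller than quality 95")
```

The same two-line `save(..., quality=70)` call is all it takes to apply this across an image pipeline, which is why images are such cheap wins.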
Images are low-hanging fruit that can make a big impact. And there are plenty more places to look. It’s important to make sure HTML, JavaScript, and CSS are properly minified and formatted. For larger publishers who have already invested in a CDN to get resources closer to the end user, it’s important to continually verify that the CDN is properly configured.

Many CDN and CMS platforms offer services that attack the image issue I mention above. Dynamic image resizing tends to be “set it and forget it” — until it quietly stops working. If I had a dollar for every time a client assured me that their CDN/CMS was serving the most optimized version of their images… Human error and bugs happen. Continuous monitoring is the only way to assure that your CDN is behaving the way you expect it to.

WebPagetest is a good tool to evaluate the page weight of your site, and there are tools available that automatically run your page through WebPagetest every time you deploy. From there you can take the info about how images are being served back to your design team and ask them to use Photoshop to get them back within your best practices.
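The post-deploy check described above can be reduced to a simple budget comparison. This sketch assumes you already have a list of (URL, transfer size) pairs from a test run — for example, parsed out of WebPagetest results — and flags anything over a per-resource budget; the budget value and resource list here are made up:

```python
# Assumed best practice: individual images under ~200 KB.
IMAGE_BUDGET_BYTES = 200_000

def oversized_resources(resources, budget=IMAGE_BUDGET_BYTES):
    """resources: iterable of (url, size_in_bytes) pairs from a test run.

    Returns the pairs whose transfer size exceeds the budget.
    """
    return [(url, size) for url, size in resources if size > budget]

# Hypothetical results from one run.
run = [
    ("https://cdn.com/hero.jpg", 1_400_000),    # unoptimized original
    ("https://cdn.com/hero-opt.jpg", 120_000),  # resized and recompressed
    ("https://website.com/app.js", 90_000),
]

for url, size in oversized_resources(run):
    print(f"over budget: {url} ({size / 1000:.0f} KB)")
```

Wired into a deploy pipeline, a check like this turns “the CDN is probably serving optimized images” into something that fails loudly the moment it stops being true.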
Or, you can automate the whole process. Rigor’s engineering team has open-sourced the code we use to optimize our own continuous delivery (CD) process. Using the code outlined here, you can automatically check your site for resources that fall outside of your best practices and get the optimized versions without ever having to go outside the dev team.
Ad blocking is a problem that isn’t going anywhere anytime soon. Most attempts to stop people from using adblockers have so far been worse than the negative experience of slow-loading ads. Publishers should take steps to lower the overall page weight and load time by watching key performance metrics, especially image size. Tools such as WebPagetest and Rigor can help teams of all sizes monitor continuously for performance defects that could bloat the page without detection.
Adobe’s research referenced throughout [PDF]