The CPDoS attack, discovered in October 2019, and the attention it drew across the technology sphere brought to light an issue many of us overlook: the boundaries of our web applications have expanded, increasing the number of systems and services that must be monitored and secured.
Let’s dive into the specifics of the CPDoS attack, explore how it is just a transient example of a more fundamental issue, and discuss how you can protect yourself from future issues.
One site. Lots of locations.
We often talk about websites or applications as existing singularly, but this is not the case at all. Modern sites or apps don’t exist in one single, specific place; they are fragmented across many virtual and physical systems, in different locations, in a variety of ways:
- Applications can be multitiered, with different systems living in different (physical or virtual) servers or entirely different data centers.
- Applications can be serverless, running on entirely ephemeral infrastructure that can change with each invocation.
- Applications can require third-party content, which is loaded from different systems. In the case of programmatic ads, the systems that are used can change on each and every page view.
- Applications often leverage CDNs to have global reach. This means parts of your site are copied and cached on dozens or hundreds of different points-of-presence around the globe.
Out of these four, CDNs create the largest amount of fragmentation. This is by design: CDNs are only able to work because they keep copies of your site around the world.
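To see what "copies of your site" means in practice, consider how an edge cache decides whether two requests can share a cached response. The sketch below is a simplified, hypothetical model (not any specific CDN's implementation), assuming a cache key built from the host, the path, and only those request headers the origin names in its Vary response header:

```python
from hashlib import sha256

def cache_key(host, path, headers, vary=()):
    """Build a simplified CDN cache key: two requests that produce
    the same key are served the same cached copy of the resource."""
    parts = [host, path]
    # Only headers listed in the origin's Vary response header
    # participate in the key; everything else is ignored.
    for name in sorted(h.lower() for h in vary):
        parts.append(f"{name}={headers.get(name, '')}")
    return sha256("|".join(parts).encode()).hexdigest()

# Two visitors asking for the same page share one cached copy...
k1 = cache_key("shop.example.com", "/", {"accept-language": "en-US"})
k2 = cache_key("shop.example.com", "/", {"accept-language": "de-DE"})
assert k1 == k2  # the language header is ignored unless Vary names it

# ...unless the origin varies the response on that header.
k3 = cache_key("shop.example.com", "/", {"accept-language": "de-DE"},
               vary=("Accept-Language",))
assert k1 != k3
```

Because each point-of-presence maintains its own cache keyed this way, the same URL can be backed by dozens of independent cached copies around the world — which is exactly the fragmentation described above.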
Regardless of the cause, this fragmentation makes it hard to understand the state of your application. Because your app doesn’t exist in a single place, how visitors access your app determines which pieces of your application and infrastructure are used.
For example, you could load your website from California and find that everything is working. However, that doesn’t mean your site is working. It only means that all the systems and infrastructure that were used to deliver your site in California were working.
Your site may not be working when accessed from Europe, or inside Mainland China, because there could be problems with those fragments of your site from those locations.
The CPDoS attack provides an example of how only some of those fragments can fail.
What Is a CPDoS Attack?
As Akamai so succinctly put it, a CPDoS attack works “by using a malicious header in the HTTP request to cause an error message to be cached, rather than the intended content. The error page is subsequently served to users in place of the legitimate content.”
In short, the CPDoS attack could be used to purposely corrupt some of the copies of your site that are cached in the CDN’s points-of-presence. Then, if someone tried to access your site during the attack, they could be served a corrupted copy of your site, depending on their geographic location.
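The mechanics can be sketched with a toy cache and origin. The names here are hypothetical, and the model loosely follows the "HTTP Header Oversize" variant of CPDoS: an oversized header passes through the cache untouched, makes the origin return an error, and the cache naively stores that error for everyone else:

```python
MAX_ORIGIN_HEADER = 64  # toy limit: origin rejects longer header values

def origin(path, headers):
    """Toy origin server: errors out on oversized request headers."""
    if any(len(v) > MAX_ORIGIN_HEADER for v in headers.values()):
        return 400, "Bad Request"
    return 200, f"Welcome to {path}"

class EdgeCache:
    """Toy CDN point-of-presence that keys its cache on the path only,
    so request headers never affect which cached copy is served."""
    def __init__(self):
        self.store = {}

    def get(self, path, headers=None):
        if path not in self.store:
            # Cache miss: fetch from origin and (naively) cache whatever
            # comes back -- including error responses.
            self.store[path] = origin(path, headers or {})
        return self.store[path]

pop = EdgeCache()
# The attacker sends one request with a malicious, oversized header...
status, _ = pop.get("/", {"X-Oversized": "A" * 1000})
assert status == 400
# ...and every later visitor hitting this PoP gets the cached error,
# even though their own requests are perfectly valid.
status, body = pop.get("/", {})
assert status == 400
```

Note that only the PoPs the attacker actually reaches get poisoned — visitors routed to other PoPs still see the legitimate page, which is why the failure appears only in some geographic locations.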
Do You Need to Worry About the CPDoS Attack Now?
No. Shortly after the CPDoS method of attack was discovered, all major CDNs had addressed the issue.
Does that mean you can breathe a sigh of relief, and is this attack the only way that a cache can be corrupted? Unfortunately, no.
The Call Is Coming from Inside the House!
Coordinated attacks aren’t the only way that a cache can be corrupted. Issues can also be introduced internally, with no malicious intent, that can cause errors within a CDN. These errors can, as with a CPDoS attack, cause certain users to have a poor experience that leads to higher-than-normal bounce rates.
For example, Rigor recently worked with a large, well-known retailer that has both a strong brick-and-mortar presence and an international ecommerce business. Their dev team pushed out a configuration of the site that worked fine on its U.S. CDN. Since they were monitoring their online store from inside the U.S., it appeared that the update was error-free.
However, the configuration was, in fact, returning errors – but only for locations outside the U.S. Because they were limited in their monitoring, they had no way of being alerted to the errors or of understanding their impact.
By implementing Rigor, this retailer’s team began global performance monitoring – reviewing their CDNs around the world. With expanded, robust performance monitoring in place, they were alerted about the errors and were able to address and remediate them.
Performance Monitoring: Think Globally, Not Locally
Your site or application does not exist in only one location; it exists in many. And if you use a CDN, copies of it exist in hundreds of locations.
If you need to be alerted to problems with your site – whether by malicious poisoning attack like CPDoS or by accidental errors in your code or configurations – you must take a global view of testing. This approach requires a robust monitoring and optimization platform that can give you insight into your site’s performance from locations across the globe.
Rigor can test the performance and experience of your sites and APIs across the world – from both fixed and private locations, depending on your needs. You can then be alerted when your site is unavailable or returning errors (like 400s or 500s), or when more advanced criteria – such as performance budgets or performance timings – fail to meet your thresholds.
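Those alert criteria can be sketched as a simple evaluation over per-location check results. This is an illustrative model with hypothetical names and data, not Rigor’s actual API:

```python
from dataclasses import dataclass

@dataclass
class CheckResult:
    location: str       # where the synthetic check ran
    status: int         # HTTP status returned from this location
    load_time_ms: int   # observed page load time

def alerts(results, budget_ms=3000):
    """Flag any location that returns an error (4xx/5xx) or
    blows its performance budget."""
    out = []
    for r in results:
        if r.status >= 400:
            out.append(f"{r.location}: HTTP {r.status}")
        elif r.load_time_ms > budget_ms:
            out.append(f"{r.location}: {r.load_time_ms}ms over "
                       f"{budget_ms}ms budget")
    return out

results = [
    CheckResult("us-east", 200, 1200),
    CheckResult("eu-west", 503, 900),    # erroring fragment
    CheckResult("ap-south", 200, 4100),  # slow fragment
]
assert alerts(results) == [
    "eu-west: HTTP 503",
    "ap-south: 4100ms over 3000ms budget",
]
```

A check run only from "us-east" in this example would report nothing wrong — the same blind spot the retailer above hit with U.S.-only monitoring.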
Remember, if you are only testing your site locally – or even from just one or two additional locations – then you could potentially miss a cache issue, whether from the next attack on a vulnerability or from an error in your code.
Take advantage of Rigor’s ability to monitor your site globally and to alert you about the site defects that can impact your performance and user experience. Reach out today for more information and a free trial.