Last week I wrote a blog post that defined and analyzed the differences between the two predominant end-user monitoring techniques leveraged in the market today: Real User Monitoring (RUM) and Synthetic Monitoring.
I ended that post with a list of benefits provided by synthetic monitoring tools. This week I want to detail some of the shortcomings of using a “synthetic-only” approach and discuss how leveraging synthetic monitoring coupled with RUM is the most effective and holistic monitoring practice.
When it comes to alerting on performance problems or major service disruptions, testing pre-production environments, or baselining performance, synthetic monitoring is unrivaled. However, synthetic monitoring is not without its deficiencies. As its name suggests, it does not measure the experience of actual users. Instead, it generates scripted traffic from data centers owned by the vendor and/or the customer, built specifically for the purpose of measuring performance.
Consequently, visibility into performance is limited to the set of "synthetic tests" built and managed by the user. For example, if CNN creates a synthetic test for the homepage of its site, it will have visibility into the performance of that specific page, but will be blind to performance problems elsewhere on the site.
As the example above suggests, the primary problem with synthetic monitoring is scaling the scope of your application monitoring. For synthetic monitoring to be valuable, you must identify and monitor all of your business-critical web pages, services, and transactions. With a synthetic-only approach, any high-traffic or mission-critical page or service you fail to cover with a test becomes a blind spot: performance issues there simply go undetected.
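To make the coverage limitation concrete, a synthetic check is, at its core, a scripted request run on a schedule from a known location. Here is a minimal sketch using only Python's standard library; the function name, the fields in the result, and the example URL list are all hypothetical, not any particular vendor's API:

```python
import time
import urllib.request

def run_synthetic_check(url: str, timeout: float = 10.0) -> dict:
    """Fetch a single URL and record coarse timing, like a basic synthetic probe."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = resp.read()
            return {
                "url": url,
                "ok": 200 <= resp.status < 400,
                "status": resp.status,
                "latency_s": round(time.perf_counter() - start, 3),
                "bytes": len(body),
            }
    except Exception as exc:
        # Failures (timeouts, DNS errors, refused connections) are what
        # synthetic monitoring alerts on.
        return {"url": url, "ok": False, "error": str(exc),
                "latency_s": round(time.perf_counter() - start, 3)}

# A synthetic-only setup sees exactly the URLs listed here -- any page
# missing from this list is invisible to the monitor.
PAGES_UNDER_TEST = ["https://example.com/"]
```

The hard-coded `PAGES_UNDER_TEST` list is the crux of the scaling problem described above: the monitor's visibility is only ever as broad as the list someone remembers to maintain.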
RUM is valuable because, unlike synthetic monitoring, it captures the performance experienced by the actual users of your website or web application, regardless of their devices, browsers, or geography. In this sense, it is essential to the business's understanding of real-world performance.
Because there is no need to pre-define your important use cases (as synthetic monitoring requires), RUM is great for generating reports and analyzing trends. As users move through the application, all of the performance timings are captured, so no matter which pages they visit, performance data will be available. This is particularly important for large sites or complex apps, where the functionality or content is constantly changing.
By leveraging RUM, a business can better understand its users and identify the areas on its site that require the most attention. Moreover, RUM can help to understand the geographic or channel distribution trends of your users. Knowing your user trends helps you better define your business plan and, from a monitoring perspective, allows you to identify key areas to target for optimization and performance improvements.
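The analysis described above boils down to aggregating timing beacons collected from real sessions. The sketch below shows one plausible shape of that aggregation in plain Python; the beacon field names (`page`, `load_ms`) are illustrative assumptions, not a real vendor schema:

```python
from collections import defaultdict

def summarize_beacons(beacons):
    """Group RUM beacons by page and report traffic plus 95th-percentile load time.

    Each beacon is a dict like {"page": "/checkout", "load_ms": 1834}.
    """
    by_page = defaultdict(list)
    for b in beacons:
        by_page[b["page"]].append(b["load_ms"])

    summary = {}
    for page, timings in by_page.items():
        timings.sort()
        # Nearest-rank 95th percentile: index ceil(0.95 * n) - 1.
        idx = max(0, -(-95 * len(timings) // 100) - 1)
        summary[page] = {"views": len(timings), "p95_ms": timings[idx]}
    return summary

# Unlike a synthetic test list, the set of pages here emerges from
# whatever real users actually visited.
beacons = [
    {"page": "/", "load_ms": 900}, {"page": "/", "load_ms": 1200},
    {"page": "/checkout", "load_ms": 2100}, {"page": "/checkout", "load_ms": 3400},
]
```

A report like this is how RUM surfaces the pages that deserve attention: a slow, heavily trafficked page stands out immediately, even if no one ever thought to write a test for it.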
Summary of RUM Benefits
- Understand HOW your application is being used
- Understand the real geographic distribution of your users and the impact of that distribution on the end user experience
- Understand network or channel distribution and flow of your users
- Ensure full visibility of application usage and performance
Synthetic and RUM: Better Together
If your organization has access to synthetic tools, RUM allows you to create synthetic tests that are more representative of your end users' experience. A best practice is to use RUM to identify target areas for optimization, and then create synthetic tests to monitor those pages from the relevant geographic areas and channels going forward. This allows you to isolate and diagnose the root cause of latency or intermittent performance problems that are impossible to pinpoint with RUM data alone. A good illustration of this process can be found in this blog post published by the performance team at Gilt Groupe.
In short, pairing RUM with synthetic monitoring yields a more cost-effective, accurate, and comprehensive monitoring practice.