Each month Rigor sponsors the Atlanta Web Performance Meetup at our home in the Atlanta Tech Village. This event is a chance for attendees to network with other website developers, designers, and system administrators who all want to improve the user experience, lower abandonment rates, and raise conversion rates by making their websites fast.
Various web performance meetups have been founded across the country, with the most notable being the NY Web Performance Meetup and the San Francisco and Silicon Valley Web Performance Meetup.
Our August meetup speaker was Kurk Johnson, Senior Performance Engineer at InComm in Atlanta. Kurk's talk focused on the considerations involved in selecting the best performance testing tools and products for your needs. Selecting the correct performance test product is not always an easy task for developers, and oftentimes, when companies have an urgent performance need on their site, they aren't sure which direction to go.
Kurk Johnson, Sr Performance Engineer at InComm, presents to the Atlanta Web Performance Meetup
In his presentation, Kurk outlined steps teams can take to get started in this process. Here are my three distinct takeaways from Kurk’s presentation:
1. Use a tool that can test performance in pre-production
Why is so much emphasis put on performance and tools that are used to test it? Because organizations can suffer serious consequences as a result of their site malfunctioning or loading slowly (just ask GQ).
The right performance product should have the ability to test performance in pre-production, and if it doesn’t, start looking for a new tool! Make sure to look for certain key features that will ensure your success, such as real-time analytics, a good “kill switch”, etc.
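To make the "kill switch" idea concrete, here is a minimal sketch of what that feature does inside a load-testing tool: the run aborts automatically as soon as the live error rate crosses a threshold, instead of letting a failing test keep hammering the environment. The function names and the 5% threshold are illustrative assumptions, not from Kurk's talk.

```python
# Hypothetical sketch of a load-test "kill switch" (illustrative names and
# thresholds, not from any particular tool).

ERROR_RATE_LIMIT = 0.05  # abort if more than 5% of requests fail
MIN_SAMPLE = 20          # wait for a minimum sample before judging

def run_load_test(send_request, total_requests: int) -> dict:
    """send_request() returns True on success; the run stops early if errors spike."""
    sent = errors = 0
    aborted = False
    for _ in range(total_requests):
        if not send_request():
            errors += 1
        sent += 1
        # Kill switch: once we have enough samples, bail out on a high error rate
        if sent >= MIN_SAMPLE and errors / sent > ERROR_RATE_LIMIT:
            aborted = True
            break
    return {"sent": sent, "errors": errors, "aborted": aborted}
```

In a real product this check runs against live analytics rather than a simple counter, which is why real-time visibility and the kill switch tend to go hand in hand.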
Introducing performance defects or bugs into production can have many negative repercussions, including a loss of revenue, loss of customers, underuse of expensive company systems, a backlog of customer orders, and negative publicity that could impact your brand. Testing in pre-production helps minimize these risks.
By catching performance issues in pre-production, teams are able to triage and resolve performance issues before they have a widespread impact on their users. Check out our blog outlining tips for integrating performance testing into pre-production.
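One common way to wire this into pre-production is a "performance budget" gate in the build pipeline: the staging run's metrics are compared against agreed limits, and the build fails if any limit is exceeded. The sketch below assumes hypothetical metric names and budget values; in practice the numbers would come from your load-test or synthetic monitoring run.

```python
# Minimal sketch of a pre-production performance gate.
# Metric names and budget values are illustrative assumptions.

PERFORMANCE_BUDGETS = {
    "ttfb_ms": 500,         # time to first byte
    "page_load_ms": 3000,   # full page load
    "error_rate_pct": 1.0,  # failed requests
}

def check_budgets(metrics: dict, budgets: dict = PERFORMANCE_BUDGETS) -> list:
    """Return a list of budget violations; an empty list means the build may proceed."""
    violations = []
    for name, limit in budgets.items():
        value = metrics.get(name)
        if value is not None and value > limit:
            violations.append(f"{name}: {value} exceeds budget {limit}")
    return violations

# Example: metrics captured from a hypothetical staging load test
staging_metrics = {"ttfb_ms": 420, "page_load_ms": 3400, "error_rate_pct": 0.2}
violations = check_budgets(staging_metrics)
if violations:
    print("Performance gate failed:")
    for v in violations:
        print(" -", v)
```

A gate like this is what lets the team triage a regression in staging, before it ever reaches users.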
2. Know when you are in need of a new tool
How should you know when your team needs to start evaluating new tools? According to Kurk, there are several different scenarios that should cause you to reevaluate your current testing process.
For instance, if your company has gone through internal changes resulting in the creation of a new performance team, you may want to consider evaluating new products. Kurk also considers poor support from your existing vendor or a lack of a robust user community as reasons to evaluate new performance products. Ultimately, if the product or tool you are using does not provide the support and training your team needs, your investment in performance is all but useless.
More obvious reasons for evaluating new performance management products include poor value, cumbersome licensing, or an expiring contract.
Also, importantly, keep in mind that a change may need to be made if your existing product is inadequate. For instance, is your current solution scalable? Is it missing key features? Does it integrate poorly with other products, such as your build or deploy system? Depending on your answer to these questions, it may be time to see what else the market has to offer.
3. Performance isn’t a finishable feature!
The job of the performance engineer does not stop once an automated performance testing tool is in place. Your team should use the outputs from these products to analyze data and provide the development team with recommendations for correcting problem areas.
Performance teams should have regular sessions to discuss scripting, share tips, increase team knowledge, and talk about lessons learned.
Not to mention, no single automated performance test tool can offer all the features you need. There will always be dependencies on other tools to fill the gaps. According to Kurk, APM and database tools are commonly used to fill the gaps left open by performance tools.
Want to learn more about the Atlanta Web Performance Meetup? View upcoming and past event topics on our Meetup page.