Locust Performance Testing with AI and Observability with Lars Holmberg
About this listen
Performance testing often fails for one simple reason: teams can't see where the slowdown actually happens.
In this episode, we explore Locust load testing and why Python-based performance testing is becoming the go-to choice for modern DevOps, QA, and SRE teams. You'll learn how Locust enables highly realistic user behavior, massive concurrency, and distributed load testing — without the overhead of traditional enterprise tools.
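As a rough sketch of what that looks like in practice (the endpoints, task weights, and wait times below are illustrative, not details from the episode), a Locust test is just a Python class describing user behaviour:

```python
# locustfile.py - illustrative sketch; endpoints and weights are hypothetical
from locust import HttpUser, task, between


class ShopUser(HttpUser):
    # Simulated think time between tasks, to mimic a real user's pacing
    wait_time = between(1, 5)

    @task(3)
    def browse_products(self):
        # Weighted 3:1 - most virtual users spend their time browsing
        self.client.get("/products")

    @task(1)
    def checkout(self):
        # A smaller share of virtual users complete a purchase
        self.client.post("/checkout", json={"item_id": 42})
```

Each simulated user runs its tasks in a lightweight greenlet, which is where the large concurrency on modest hardware comes from.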
We also dive into:
- Why Python works so well for AI-assisted load testing
- How Locust fits naturally into CI/CD and GitHub Actions (a headless run sketch follows this list)
- The real difference between load testing and performance testing
- How observability and end-to-end tracing eliminate guesswork
- Common performance testing mistakes even experienced teams make
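On the CI/CD point, a hedged sketch of how such a test might run headless in a pipeline step; the user count, duration, and host below are assumptions, not details from the episode:

```
locust -f locustfile.py --headless \
  --users 100 --spawn-rate 10 --run-time 2m \
  --host https://staging.example.com \
  --csv results
```

In headless mode Locust exits with a non-zero code when requests fail, which is typically what a GitHub Actions step uses to fail the build, and the --csv output can be kept as a build artifact.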
Whether you're a software tester, automation engineer, or QA leader looking to shift performance testing left, this conversation will help you design smarter tests and catch scalability issues before your users do.