Hi Daan,

Yes - the plan is to run these API performance benchmarks regularly for each release, using the previous release as the baseline, specifically to detect regressions early and track trends over time. We’ve also discussed integrating these results into the QA dashboard so they’re visible and comparable per release; this aligns well with what Bobby was exploring during the Milan hackathon.

Thanks for the suggestion - it’s very much in line with where we want to take this.

Best regards,
Rosi
________________________________
From: Daan Hoogland <[email protected]>
Sent: 12 December 2025 11:35
To: [email protected] <[email protected]>
Cc: users <[email protected]>
Subject: Re: CloudStack API Performance Testing Results - 4.20.2 and 4.22.0 Releases

Nice, thanks for sharing, Rosi. Will you (as in, do you plan to) share these data on a regular, per-release basis? I heard Bobby say he is working on a dashboard at the hackathon in Milan. Would it be an idea to add this? (Probably already planned, so forgive me if I ask ignorant questions.)

On Fri, Dec 12, 2025 at 8:55 AM Rositsa Kyuchukova <[email protected]> wrote:

> Hi All,
>
> I'm sharing our latest API performance benchmarking results from ShapeBlue's performance tests. We conduct these tests to ensure CloudStack upgrades maintain or improve API response times at enterprise scale.
>
> As CloudStack deployments grow, API performance becomes critical for management operations and user experience. We systematically benchmark each release to:
>
> - Detect performance regressions
> - Verify database query optimization at scale
> - Provide the community with real-world performance data
>
> Our testing focuses on the most common list operations that query large datasets.
>
> Test Environment
>
> Infrastructure:
>
> - Test Controller VM: Oracle Linux 8 server (running the apache/cloudstack-csbench tool <https://github.com/apache/cloudstack-csbench/tree/main>)
> - Management Server Under Test: CloudStack instance deployed as a VM with a dedicated vCenter resource pool:
>   - 6 vCPUs (6.0 GHz)
>   - 32 GB RAM
>   - 20 GB disk storage
>   - OS: Oracle Linux 8
>   - Hypervisor: VMware vSphere
>   - Database: MySQL
> - KVM Host: Runs CloudStack system infrastructure (1 SSVM + 1 Console Proxy)
> - Storage: NFS (2 pools)
>
> Database Scale (Mock Objects):
>
> - 1 Zone, 1,000 Clusters
> - 2,453 Mock Routing Hosts (database entries simulating a large deployment)
> - 2,370 VMs, 4,740 Volumes, 2,370 Networks
>
> This configuration tests database query performance under enterprise-scale loads without requiring thousands of physical resources.
>
> Testing Methodology
>
> We developed an automation script that orchestrates the complete performance testing workflow. The script handles benchmark execution, result validation, CSV archival, and automated diff report generation.
>
> Workflow:
>
> 1. Restore to a baseline snapshot with mock data populated in the database
> 2. Run the baseline benchmark:
>    - Measures API response times against the current CloudStack version (e.g., 4.21.0.0)
>    - Runs 100 iterations per API (~40 minutes; configurable)
>    - Validates results and archives CSV reports
> 3. Upgrade CloudStack to the target version (e.g., release candidate 4.22.0.0)
> 4. Run the comparison benchmark:
>    - Performs identical API performance tests against the upgraded version
>    - Compares response times against the baseline
> 5. Generate an automated diff report showing performance deltas (absolute time + percentage changes); a sketch of this computation follows below
>
> APIs Benchmarked:
>
> - listAccounts, listDomains, listHosts, listNetworks, listVirtualMachines, listVolumes
>
> Performance Threshold: API degradation >20% requires investigation.
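To make step 5 concrete, here is a minimal sketch of how such a diff report could be produced. It is not the actual automation script: the CSV layout and column names ("api", "avg_seconds") are hypothetical placeholders for whatever the benchmark runs emit, and the real tooling may differ.

```python
#!/usr/bin/env python3
"""Minimal sketch of the diff-report step (step 5 above).

Assumes each benchmark run produced a CSV with hypothetical columns
"api" and "avg_seconds"; the actual csbench output format may differ.
"""
import csv
import sys

THRESHOLD_PCT = 20.0  # degradation above this requires investigation


def load_results(path):
    """Return {api_name: average response time in seconds} from one run's CSV."""
    with open(path, newline="") as f:
        return {row["api"]: float(row["avg_seconds"]) for row in csv.DictReader(f)}


def diff_report(baseline_csv, patch_csv):
    """Print per-API absolute and percentage deltas, flagging >20% regressions."""
    base = load_results(baseline_csv)
    patch = load_results(patch_csv)
    for api in sorted(base):
        if api not in patch:
            continue
        delta = patch[api] - base[api]
        pct = (delta / base[api]) * 100 if base[api] else 0.0
        flag = "INVESTIGATE" if pct > THRESHOLD_PCT else "ok"
        print(f"{api:22s} {base[api]:6.2f}s -> {patch[api]:6.2f}s "
              f"({delta:+.2f}s, {pct:+.2f}%)  {flag}")


if __name__ == "__main__":
    diff_report(sys.argv[1], sys.argv[2])  # e.g. baseline.csv patch.csv
```

For example, fed the listNetworks numbers from the 4.22.0.0 run below (6.77 s baseline, 7.0 s patched), this would report +0.23 s / +3.40%, well under the 20% investigation threshold.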
> Test Results
>
> Release 4.20.1 → 4.20.2 (Maintenance Release)
>
> API                   Base (s)   Patch (s)   Diff     Change
> listAccounts          6.0        5.99        -0.01    -0.17%
> listDomains           0.77       0.76        -0.01    -1.3%
> listHosts             1.44       1.49        +0.05    +3.47%
> listNetworks          7.68       7.7         +0.02    +0.26%
> listVirtualMachines   3.11       3.2         +0.09    +2.89%
> listVolumes           1.04       1.04        0.0      0.0%
>
> Observations:
>
> - Minor improvements in listAccounts and listDomains
> - Slight regressions in listHosts (+0.05 s, +3.47%) and listVirtualMachines (+0.09 s, +2.89%), both within typical benchmark variance and with no meaningful performance impact
> - All changes are well below 5%
>
> Conclusion: Very stable maintenance release.
>
> Release 4.21.0.0 → 4.22.0.0 (LTS Release)
>
> API                   Base (s)   Patch (s)   Diff     Change
> listAccounts          7.2        7.2         0.0      0.0%
> listDomains           1.62       1.68        +0.06    +3.7%
> listHosts             1.32       1.32        0.0      0.0%
> listNetworks          6.77       7.0         +0.23    +3.4%
> listVirtualMachines   2.78       2.85        +0.07    +2.52%
> listVolumes           0.9        0.94        +0.04    +4.44%
>
> Observations:
>
> - Largest absolute change: listNetworks (+0.23 s, +3.4%)
> - All APIs show <5% variation
> - No performance improvements, but stable behavior
>
> Conclusion: Minor variations within the acceptable range.
>
> Overall Assessment
>
> Both releases demonstrate stable API performance at enterprise scale:
>
> - 4.20.1 → 4.20.2: Excellent stability with minor improvements
> - 4.21.0.0 → 4.22.0.0: Acceptable variations for a major LTS release
>
> All tested APIs remain well below our 20% investigation threshold. The largest absolute slowdown across both releases is listNetworks (+0.23 s in 4.22.0.0), representing only a 3.4% increase.
>
> Best regards,
> Rosi
>
> Rositsa Kyuchukova
> Senior QA Engineer
> d: +44 203 603 0540 | e: [email protected] | w: www.shapeblue.com | t: @shapeblue
> a: 3 London Bridge Street, 3rd floor, News Building, London, United Kingdom

-- 
Daan
