Cockroach Labs recently released its annual cloud report, identifying Google Cloud Platform (GCP) as the best overall provider. The 2021 Cloud Report compares AWS, Azure, and GCP on benchmarks that reflect critical applications and workloads.
The team at Cockroach Labs assessed 54 machines and conducted almost 1000 benchmark runs to measure four main areas, each with a different benchmark tool: CPU performance (using CoreMark), network performance (using Netperf), storage I/O performance (using FIO), and OLTP performance (using a derivative of TPC-C).
While the report selects GCP as the best cloud for performance and throughput, AWS offers the best network latency but lags in storage I/O and single-core CPU performance, according to the company behind the open-source CockroachDB database. GCP delivered the fastest processing rates on all four throughput benchmarks: network throughput, storage I/O read throughput, storage I/O write throughput, and maximum tpm (transactions per minute) throughput. Amazon's Graviton2-based machines performed best for multi-core CPU performance, while Azure, thanks to its ultra disks, came out on top for read and write IOPS.
The goals of the report were to determine the most cost-efficient cloud provider, evaluate performance tradeoffs, and assess the cost/benefit of different disks and processors. John Kendall, product manager at Cockroach Labs, explains how their report differs:
We delivered the only cloud performance report to compare the three major cloud providers on micro and industry benchmarks that reflect critical OLTP applications and workloads.
Now in its third year, the report previously saw its testing methodology challenged by the cloud providers for relying on default configurations, a choice that Cockroach Labs addressed in a separate article. Jadon Naas, product development lead at InMotion Hosting, is concerned that the report is limited to three providers:
Is the cloud market so consolidated and non-competitive that only GCP, AWS, and Azure warrant mention? (...) I totally get they do not have time to test every cloud under the sun, and these are the three major cloud providers. I worry, though, about this kind of reporting sending a message that other cloud providers or cloud solutions are after-thoughts or not worth testing (i.e. using).
The results are not only a comparison between the largest vendors: because the report tests different options for OLTP applications and workloads on each provider, including different processors, it also helps developers who run deployments on a single cloud.
By the time the results were out, providers had already released new products and enhancements; for example, the gp2 volumes on AWS used in some benchmarks have since been superseded by the new gp3 volumes. As the methodology is available on GitHub, together with all the resources, scripts, and configurations, developers can perform further benchmarks with newer or different options. According to the documentation, "it would be possible to extend this binary to run on other platforms relatively easily, but requires some work to handle cloud-specific tasks – namely, getting machine metadata".
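To illustrate the cloud-specific task the documentation mentions, each provider exposes an HTTP instance-metadata service that a benchmark harness can query to identify the machine it runs on. The following sketch is hypothetical (it is not the report's actual code); the endpoints and required headers, however, are the ones documented by AWS, GCP, and Azure:

```python
# Hypothetical helper: build the machine-type metadata request for each cloud.
# Endpoints are the providers' documented instance-metadata services; the
# helper itself is an illustration, not code from the Cockroach Labs harness.
from urllib.request import Request

METADATA_ENDPOINTS = {
    # AWS IMDS: link-local address, no extra headers needed for IMDSv1
    "aws": ("http://169.254.169.254/latest/meta-data/instance-type", {}),
    # GCP: metadata server requires the Metadata-Flavor header
    "gcp": ("http://metadata.google.internal/computeMetadata/v1/instance/machine-type",
            {"Metadata-Flavor": "Google"}),
    # Azure IMDS: requires the Metadata header and an api-version parameter
    "azure": ("http://169.254.169.254/metadata/instance/compute/vmSize"
              "?api-version=2021-02-01&format=text",
              {"Metadata": "true"}),
}

def metadata_request(provider: str) -> Request:
    """Return a ready-to-send request for the machine type of this instance."""
    url, headers = METADATA_ENDPOINTS[provider]
    return Request(url, headers=headers)
```

Sending these requests only works from inside a VM on the respective cloud, which is why supporting a new platform requires extra work: the harness must know where and how to ask for machine metadata on that platform.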
Access to the full report is free but requires registration.