The CloudXPRT v1.2 update package is now available!

We’re happy to announce that the CloudXPRT v1.2 update package is now available! The update prevents potential installation failures on Google Cloud Platform (GCP) and Microsoft Azure, and ensures that the web microservices workload runs on Ubuntu 22.04. It also moves to newer software components, including Kubernetes v1.23.7, Kubespray v2.18.1, and Kubernetes Metrics Server v1, and incorporates some additional minor script changes.

The CloudXPRT v1.2 web microservices workload installation package is available at the CloudXPRT.com download page and the BenchmarkXPRT GitHub repository.

Before you get started with v1.2, please note the following updated system requirements:

  • Ubuntu 20.04.2 or 22.04 for on-premises testing
  • Ubuntu 18.04, 20.04.2, or 22.04 for CSP (AWS/Azure/GCP) testing

Because CloudXPRT is designed to run on high-end servers, physical nodes or VMs under test must meet the following minimum specifications:

  • 16 logical or virtual CPUs
  • 8 GB of RAM
  • 10 GB of available disk space (50 GB for the data analytics workload)
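If you’d like to confirm that a node or VM meets these minimums before installing, the short Python sketch below is one way to do it. This is our illustration, not part of the CloudXPRT package, and it assumes a Linux host (which fits the Ubuntu targets above); the thresholds come straight from the list above.

```python
import os
import shutil

# Minimums from this post; use 50 for MIN_DISK_GB if you plan to run
# the data analytics workload.
MIN_CPUS = 16
MIN_RAM_GB = 8
MIN_DISK_GB = 10

cpus = os.cpu_count() or 0

# Linux-only: total physical memory = page size * number of physical pages.
ram_gb = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / 1024**3

disk_gb = shutil.disk_usage("/").free / 1024**3

print(f"CPUs: {cpus} (need {MIN_CPUS})")
print(f"RAM:  {ram_gb:.1f} GB (need {MIN_RAM_GB})")
print(f"Disk: {disk_gb:.1f} GB free (need {MIN_DISK_GB})")

ok = cpus >= MIN_CPUS and ram_gb >= MIN_RAM_GB and disk_gb >= MIN_DISK_GB
print("Meets CloudXPRT minimums" if ok else "Below CloudXPRT minimums")
```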

The update package includes only the updated v1.2 test harness and the updated web microservices workload. It does not include the data analytics workload. As we stated in a previous blog post, now that we’ve published the web microservices package, we will assess user interest in a possible refresh of the v1.1 data analytics workload. For now, the v1.1 data analytics workload will remain available via CloudXPRT.com as a reference resource for users who have worked with the package in the past.

Please let us know if you have any questions about the CloudXPRT v1.2 test package. Happy testing!

Justin

CloudXPRT status and next steps

We developed our first cloud benchmark, CloudXPRT, to measure the performance of cloud applications deployed on modern infrastructure as a service (IaaS) platforms. When we first released CloudXPRT in February of 2020, the benchmark included two test packages: a web microservices workload and a data analytics workload. Both supported on-premises testing and cloud service provider (CSP) testing with Amazon Web Services (AWS), Google Cloud Platform (GCP), and Microsoft Azure.

CloudXPRT is our most complex benchmark, requiring sustained compatibility between many software components across multiple independent test environments. As vendors roll out updates for some components and stop supporting others, it’s inevitable that something will break. Since CloudXPRT’s launch, we’ve become aware of installation failures when setting up CloudXPRT on Ubuntu virtual machines with GCP and Microsoft Azure. Additionally, while the web microservices workload continues to run in most instances with a few configuration tweaks and workarounds, the data analytics workload fails consistently due to compatibility issues with MinIO, Prometheus, and Kafka within the Kubernetes environment.

In response, we’re working to fix problems with the web microservices workload and bring all necessary components up to date. We’re developing an updated test package that will work on Ubuntu 22.04, using Kubernetes v1.23.7 and Kubespray v2.18.1. We’re also updating Kubernetes Metrics Server from v1beta1 to v1, and will incorporate some minor script changes. Our goal is to ensure successful installation and testing with the on-premises and CSP platforms that we supported when we first launched CloudXPRT.

We are currently focusing on the web microservices workload for two reasons. First, more users have downloaded it than the data analytics workload. Second, we think we have a clear path to success. Our plan is to publish the updated web microservices test package and see what feedback and interest we receive from users about a possible data analytics refresh. The existing data analytics workload will remain available via CloudXPRT.com for the time being to serve as a reference resource.

We apologize for the inconvenience that these issues have caused. We’ll provide more information about a release timeline and final test package details here in the blog as we get closer to publication. If you have any questions about the future of CloudXPRT, please feel free to contact us!

Justin

We welcome your CloudXPRT results!

We recently published a set of CloudXPRT data analytics and web microservices workload test results submitted by Quanta Computer, Inc. The Quanta submission is the first set of CloudXPRT results that we’ve published using the formal results submission and approval process. We’re grateful to the Quanta team for carefully following the submission guidelines, enabling us to complete the review process without a hitch.

If you are unfamiliar with the process, you can find general information about how we review submissions in a previous blog post. Detailed, step-by-step instructions are available on the results submission page. As a reminder for testers who are considering submitting results for July, the submission deadline is tomorrow, Friday, July 16, and the publication date is Friday, July 30. We list the submission and publication dates for the rest of 2021 below. Please note that we do not plan to review submissions in December, so if we receive results submissions after November 30, we may not publish them until the end of January 2022.

  • August: submission deadline Tuesday 8/17/21; publication date Tuesday 8/31/21
  • September: submission deadline Thursday 9/16/21; publication date Thursday 9/30/21
  • October: submission deadline Friday 10/15/21; publication date Friday 10/29/21
  • November: submission deadline Tuesday 11/16/21; publication date Tuesday 11/30/21
  • December: no submission or publication dates (see the note above)

If you have any questions about the CloudXPRT results submission, review, or publication process, please let us know!

Justin

Default requirements for CloudXPRT results submissions

Over the past few weeks, we’ve received questions about whether we require specific test configuration settings for official CloudXPRT results submissions. Currently, testers have the option to edit up to 12 configuration options for the web microservices workload and three configuration options for the data analytics workload. Not all configuration options have an impact on testing and results, but a few of them can drastically affect key results metrics and how long it takes to complete a test. Because new CloudXPRT testers may not anticipate those outcomes, and so many configuration permutations are possible, we’ve come up with a set of requirements for all future results submissions to our site. Please note that testers are still free to adjust all available configuration options—and define service level agreement (SLA) settings—as they see fit for their own purposes. The requirements below apply only to results testers want to submit for publication consideration on our site, and to any resulting comparisons.


Web microservices results submission requirement

Starting with the May results submission cycle, all web microservices results submissions must have the workload.cpurequests value, which lets the user designate the number of CPU cores the workload assigns to each pod, set to 4. Currently, the benchmark supports values of 1, 2, and 4, with a default of 4. While 1 or 2 CPU cores per pod may be more appropriate for relatively low-end systems or configurations with few vCPUs, a value of 4 is appropriate for most datacenter processors, and it often enables CSP instances to operate within the benchmark’s default maximum 95th percentile latency SLA of 3,000 milliseconds.
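For reference, here is a minimal sketch of how you might set that value programmatically before a run. It assumes a flat "workload.cpurequests" key in config.json, matching the option name used above; the actual layout of config.json in your CloudXPRT installation may differ, so check the file and the white paper first.

```python
import json

# Hypothetical path; point this at your CloudXPRT config.json.
CONFIG_PATH = "config.json"

with open(CONFIG_PATH) as f:
    config = json.load(f)

# Required for results submissions: 4 CPU cores per pod. We assume a flat
# "workload.cpurequests" key, matching the option name in this post; your
# config.json layout may differ.
config["workload.cpurequests"] = 4

with open(CONFIG_PATH, "w") as f:
    json.dump(config, f, indent=2)
```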

In future CloudXPRT releases, we may remove the option to change the workload.cpurequests value from the config.json file and simply fix the value in the benchmark’s code to promote test predictability and reasonable comparisons. For more information about configuration options for the web microservices workload, please consult the Overview of the CloudXPRT Web Microservices Workload white paper.


Data analytics results submission requirement

Starting with the May results submission cycle, all data analytics results submissions must have the best reported performance (throughput_jobs/min) correspond to a 95th percentile SLA latency of 90 seconds or less. We have received submissions where the throughput was extremely high, but the 95th percentile SLA latency was up to 10 times the 90 seconds that we recommend in CloudXPRT documentation. High latency values may be acceptable for the unique purposes of individual testers, but they do not provide a good basis for comparison between clusters under test. For more information about configuration options with the data analytics workload, please consult the Overview of the CloudXPRT Data Analytics Workload white paper.
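In other words, the best reported result should be the highest throughput among the data points that meet the 90-second SLA, not the highest throughput overall. Here is a minimal sketch of that selection; the record layout and field names are illustrative, not CloudXPRT’s actual output format.

```python
# Each record represents one measured load point from a data analytics run.
# Field names are illustrative; they are not CloudXPRT's output format.
results = [
    {"throughput_jobs_per_min": 40.2, "p95_latency_s": 62.0},
    {"throughput_jobs_per_min": 55.7, "p95_latency_s": 88.5},
    {"throughput_jobs_per_min": 71.3, "p95_latency_s": 214.0},  # violates the SLA
]

SLA_SECONDS = 90  # 95th percentile latency ceiling for submissions

# Keep only the points that meet the SLA, then report the best throughput.
compliant = [r for r in results if r["p95_latency_s"] <= SLA_SECONDS]
best = max(compliant, key=lambda r: r["throughput_jobs_per_min"])
print(f"Best reportable throughput: {best['throughput_jobs_per_min']} jobs/min")
# -> Best reportable throughput: 55.7 jobs/min
```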

We will update CloudXPRT documentation to make sure that testers know to use the default configuration settings if they plan to submit results for publication. If you have any questions about CloudXPRT or the CloudXPRT results submission process, please let us know.

Justin

The CloudXPRT v1.1 general release is tomorrow!

We’re happy to announce that CloudXPRT v1.1 will move from beta to general release status tomorrow! The installation packages will be available at the CloudXPRT.com download page and the BenchmarkXPRT GitHub repository. You will find more details about the v1.1 updates in a previous blog post, but the most prominent changes are the consolidation of the five previous installation packages into two (one per workload) and added support for Ubuntu 20.04.2 for on-premises testing.

Before you get started with v1.1, please note the following updated system requirements:

  • Ubuntu 20.04.2 or later for on-premises testing
  • Ubuntu 18.04, or 20.04.2 or later, for CSP (AWS/Azure/GCP) testing

CloudXPRT is designed to run on high-end servers. Physical nodes or VMs under test must meet the following minimum specifications:

  • 16 logical or virtual CPUs
  • 8 GB of RAM
  • 10 GB of available disk space (50 GB for the data analytics workload)

We have also made significant adjustments to the installation and test configuration instructions in the readmes for both workloads, so please revisit these documents even if you’re familiar with previous test processes.

As we noted during the beta period, we have not observed any significant differences in performance between v1.01 and v1.1, but we haven’t tested every possible test configuration across every platform. If you observe different results when testing the same configuration with v1.01 and v1.1, please send us the details so we can investigate.

If you have any questions about CloudXPRT v1.1, please let us know!

Justin

The CloudXPRT v1.1 beta is available!

Last week, we announced that a CloudXPRT v1.1 beta was on the way. We’re happy to say that the v1.1 beta is now available to the public on a dedicated CloudXPRT v1.1 beta download page. While CloudXPRT v1.01 remains the officially supported version on CloudXPRT.com and in our GitHub repository, interested testers can use the v1.1 beta in new environments while we finalize the v1.1 build for official release. You are welcome to publish results from the beta, as we do not expect results to change in the final, official release.

As we mentioned in last week’s post, the CloudXPRT v1.1 beta includes the following changes:

  • We’ve added support for Ubuntu 20.04.2 or later for on-premises testing.
  • We’ve consolidated and standardized the installation packages for both workloads. Instead of one package for the data analytics workload and four separate packages for the web microservices workload, each workload has a single installation package that supports on-premises testing and testing with all three supported CSPs.
  • We’ve incorporated Terraform to help create and configure VMs, which prevents problems when testers do not allocate enough storage per VM prior to testing.
  • We’ve replaced the Calico network plugin in Kubespray with Weave, which helps to avoid some of the network issues testers have occasionally encountered in the CSP environment.

Please feel free to share the link to the beta download page. (To avoid confusion, the beta will not appear in the main CloudXPRT download table.) We can’t yet state definitively whether results from the new version will be comparable to those from v1.01. We have not observed any significant differences in performance, but we haven’t tested every possible test configuration across every platform. If you observe different results when testing the same configuration with v1.01 and v1.1 beta, please send us the details so we can investigate.

If you have any questions about CloudXPRT or the CloudXPRT v1.1 beta, please let us know!

Justin
