

Reports of CloudXPRT installation failures

Recently, CloudXPRT testers have reported installation failures while attempting to set up CloudXPRT on Ubuntu virtual machines in Google Cloud Platform (GCP) and Microsoft Azure. We have not yet determined whether the installation process fails consistently on these VMs or whether the problem occurs only under specific conditions. We believe these failures are limited to GCP and Azure; you should still be able to install and run CloudXPRT successfully on Amazon Web Services virtual machines and on-premises hardware.

We apologize for the inconvenience that this issue causes for CloudXPRT testers and will let the community know as soon as we identify a reliable solution. If you have encountered any other issues during CloudXPRT testing, please feel free to contact us!

Justin

We welcome your CloudXPRT results!

We recently published a set of CloudXPRT Data Analytics and Web Microservices workload test results submitted by Quanta Computer, Inc. The Quanta submission is the first set of CloudXPRT results that we’ve published using the formal results submission and approval process. We’re grateful to the Quanta team for carefully following the submission guidelines, enabling us to complete the review process without a hitch.

If you are unfamiliar with the process, you can find general information about how we review submissions in a previous blog post. Detailed, step-by-step instructions are available on the results submission page. As a reminder for testers who are considering submitting results for July, the submission deadline is tomorrow, Friday, July 16, and the publication date is Friday, July 30. We list the submission and publication dates for the rest of 2021 below. Please note that we do not plan to review submissions in December, so if we receive results submissions after November 30, we may not publish them until the end of January 2022.

August

Submission deadline: Tuesday 8/17/21

Publication date: Tuesday 8/31/21

September

Submission deadline: Thursday 9/16/21

Publication date: Thursday 9/30/21

October

Submission deadline: Friday 10/15/21

Publication date: Friday 10/29/21

November

Submission deadline: Tuesday 11/16/21

Publication date: Tuesday 11/30/21

December

Submission deadline: N/A

Publication date: N/A

If you have any questions about the CloudXPRT results submission, review, or publication process, please let us know!

Justin

Default requirements for CloudXPRT results submissions

Over the past few weeks, we’ve received questions about whether we require specific test configuration settings for official CloudXPRT results submissions. Currently, testers can edit up to 12 configuration options for the web microservices workload and three for the data analytics workload. Not all of these options affect testing and results, but a few can drastically change key results metrics and how long a test takes to complete. Because new CloudXPRT testers may not anticipate those outcomes, and because so many configuration permutations are possible, we’ve established a set of requirements for all future results submissions to our site. Please note that testers are still free to adjust all available configuration options—and define service level agreement (SLA) settings—as they see fit for their own purposes. The requirements below apply only to results that testers want to submit for publication consideration on our site, and to any resulting comparisons.


Web microservices results submission requirement

Starting with the May results submission cycle, all web microservices results submissions must have the workload.cpurequests value, which designates the number of CPU cores the workload assigns to each pod, set to 4. Currently, the benchmark supports values of 1, 2, and 4, with a default of 4. While 1 or 2 CPU cores per pod may be more appropriate for relatively low-end systems or configurations with few vCPUs, a value of 4 is appropriate for most datacenter processors, and it often enables CSP instances to operate within the benchmark’s default maximum 95th percentile latency SLA of 3,000 milliseconds.

In future CloudXPRT releases, we may remove the option to change the workload.cpurequests value from the config.json file and simply fix the value in the benchmark’s code to promote test predictability and reasonable comparisons. For more information about configuration options for the web microservices workload, please consult the Overview of the CloudXPRT Web Microservices Workload white paper.
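
For testers who script their pre-submission checks, the minimal Python sketch below shows one way to verify the setting before submitting results. The option name (workload.cpurequests) and the config.json file name come from the text above, but the flat, top-level JSON layout is an assumption made purely for illustration, so adjust the lookup to match the actual structure of your configuration file.

```python
import json
import sys

REQUIRED_CPU_REQUESTS = 4     # required for results submissions
SUPPORTED_VALUES = {1, 2, 4}  # values the benchmark currently supports

def check_cpu_requests(path="config.json"):
    """Warn if a config file would not meet the submission requirement.

    Assumes workload.cpurequests appears as a flat, top-level key in
    the JSON; adjust the lookup if your file nests it differently.
    """
    with open(path) as f:
        config = json.load(f)

    value = config.get("workload.cpurequests")
    if value not in SUPPORTED_VALUES:
        sys.exit(f"Unsupported workload.cpurequests value: {value!r}")
    if value != REQUIRED_CPU_REQUESTS:
        print(f"workload.cpurequests is {value}; submissions require "
              f"{REQUIRED_CPU_REQUESTS}.")
    else:
        print("config.json meets the web microservices submission requirement.")

if __name__ == "__main__":
    check_cpu_requests()
```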


Data analytics results submission requirement

Starting with the May results submission cycle, all data analytics results submissions must have the best reported performance (throughput_jobs/min) correspond to a 95th percentile SLA latency of 90 seconds or less. We have received submissions in which the throughput was extremely high but the 95th percentile SLA latency was up to 10 times the 90 seconds that we recommend in CloudXPRT documentation. High latency values may be acceptable for the unique purposes of individual testers, but they do not provide a good basis for comparison between clusters under test. For more information about configuration options for the data analytics workload, please consult the Overview of the CloudXPRT Data Analytics Workload white paper.
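
To make the rule concrete, the hypothetical Python sketch below selects the best reportable throughput from a set of example runs, discarding any run whose 95th percentile latency exceeds 90 seconds. The run values and field names are invented for the example and do not reflect CloudXPRT’s actual output format.

```python
# Hypothetical example: choosing the best reportable data analytics result.
# Each run records throughput (jobs/min) and 95th percentile latency (s).
# All values and field names below are invented for illustration.
runs = [
    {"throughput_jobs_per_min": 48.0, "p95_latency_s": 62.0},
    {"throughput_jobs_per_min": 57.0, "p95_latency_s": 88.0},
    {"throughput_jobs_per_min": 71.0, "p95_latency_s": 540.0},  # exceeds the SLA
]

P95_LATENCY_LIMIT_S = 90.0  # submission requirement described above

# Keep only the runs that satisfy the latency requirement, then take the
# highest throughput among them; that is the best reportable result.
eligible = [r for r in runs if r["p95_latency_s"] <= P95_LATENCY_LIMIT_S]
if eligible:
    best = max(eligible, key=lambda r: r["throughput_jobs_per_min"])
    print(f"Best reportable throughput: {best['throughput_jobs_per_min']} "
          f"jobs/min (p95 latency {best['p95_latency_s']} s)")
else:
    print("No run meets the 90-second p95 latency requirement.")
```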

We will update CloudXPRT documentation to make sure that testers know to use the default configuration settings if they plan to submit results for publication. If you have any questions about CloudXPRT or the CloudXPRT results submission process, please let us know.

Justin

The Overview of the CloudXPRT Data Analytics Workload white paper is now available!

Today, we expand our portfolio of CloudXPRT resources with a paper on the benchmark’s data analytics workload. While we summarized the workload in the Introduction to CloudXPRT white paper, the new paper goes into much greater detail.

In addition to providing practical information about the data analytics installation package and minimum system requirements, the paper describes the workload’s test configuration variables, structural components, task workflows, and test metrics. It also discusses interpreting test results and the process for submitting results for publication.

CloudXPRT is the most complex tool in the XPRT family, and the new paper is part of our effort to create more—and better—CloudXPRT documentation. We plan to publish additional CloudXPRT white papers in the coming months, with possible future topics including the impact of adjusting specific test configuration options, recommendations for results reporting, and methods for analysis.

We hope that the Overview of the CloudXPRT Data Analytics Workload paper will serve as a go-to resource for CloudXPRT testers, and will answer any questions you have about the workload. You can find links to the paper and other resources in the Helpful Info box on CloudXPRT.com and the CloudXPRT section of our XPRT white papers page.

If you have any questions, please let us know!

Justin

The CloudXPRT learning tool is now live!

We’re happy to announce that the CloudXPRT learning tool is now live! We designed the tool to serve as an information hub for common CloudXPRT topics and questions, and to help tech journalists, OEM lab engineers, and everyone who is interested in CloudXPRT find the answers they need as quickly as possible.

The tool features four primary areas of content:

  • The Q&A section provides quick answers to the questions we receive most from testers and the tech press.
  • The CloudXPRT: the basics section describes specific topics such as the benchmark’s target platforms, workloads, companion cloud software, and hardware and software requirements.
  • The Testing and results section covers the testing process, metrics, and how to publish results.
  • The Cloud primer section provides brief, easy-to-understand definitions of key cloud computing terms and concepts.

The first screenshot below shows the home screen. To illustrate how some of the pop-up information sections appear, the second screenshot shows part of the Key terms and concepts module in the Cloud primer section. 

We’re excited about the new CloudXPRT learning tool! If you have any questions about the tool, or suggestions for additional content to include in it, please let us know!

Justin

Check out our new CloudXPRT video!

Many businesses want to move critical applications to the cloud, but choosing the right cloud-based infrastructure as a service (IaaS) platform can be a complex and costly project. We developed CloudXPRT to help speed up and simplify the process by providing a powerful benchmarking tool that allows users to run multiple workloads on cloud platform software in on-premises and popular public cloud environments.

To help spread the word about what CloudXPRT can do and why it matters to businesses, we’ve published a new video, Choose the best IaaS configuration for your business with CloudXPRT, on YouTube and CloudXPRT.com. If you know anyone who is evaluating cloud options, or who would be interested in CloudXPRT testing or results, we encourage you to share the video with them. As always, if you have any questions about CloudXPRT, please let us know!

Justin

Video: Choose the best IaaS configuration for your business with CloudXPRT.
