
Default requirements for CloudXPRT results submissions

Over the past few weeks, we’ve received questions about whether we require specific test configuration settings for official CloudXPRT results submissions. Currently, testers can edit up to 12 configuration options for the web microservices workload and three for the data analytics workload. Not all of these options affect testing and results, but a few can drastically change key results metrics and how long it takes to complete a test. Because new CloudXPRT testers may not anticipate those effects, and because so many configuration permutations are possible, we’ve established a set of requirements for all future results submissions to our site. Please note that testers are still free to adjust all available configuration options and define service level agreement (SLA) settings as they see fit for their own purposes. The requirements below apply only to results testers want to submit for publication consideration on our site, and to any resulting comparisons.


Web microservices results submission requirement

Starting with the May results submission cycle, all web microservices results submissions must have the workload.cpurequests value, which designates the number of CPU cores the workload assigns to each pod, set to 4. Currently, the benchmark supports values of 1, 2, and 4, with 4 as the default. While 1 or 2 CPU cores per pod may be more appropriate for relatively low-end systems or configurations with few vCPUs, a value of 4 is appropriate for most datacenter processors, and it often enables CSP instances to operate within the benchmark’s default maximum 95th percentile latency SLA of 3,000 milliseconds.
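
If you’d like to double-check this setting before you submit, the sketch below shows one way to do so programmatically. It’s illustrative only: it assumes a config.json layout in which cpurequests nests under a workload object, so adjust the field names to match the file that ships with your CloudXPRT package.

```go
// check_cpurequests.go: a quick sanity check (not part of CloudXPRT itself) that a
// web microservices config.json sets workload.cpurequests to 4 before you submit results.
// The field layout below is an assumption based on the "workload.cpurequests" name used
// in this post; adjust it to match your actual config.json.
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"os"
)

type config struct {
	Workload struct {
		CPURequests int `json:"cpurequests"` // CPU cores assigned to each pod: 1, 2, or 4
	} `json:"workload"`
}

func main() {
	data, err := os.ReadFile("config.json")
	if err != nil {
		log.Fatalf("reading config.json: %v", err)
	}
	var c config
	if err := json.Unmarshal(data, &c); err != nil {
		log.Fatalf("parsing config.json: %v", err)
	}
	if c.Workload.CPURequests != 4 {
		fmt.Printf("workload.cpurequests is %d; set it to 4 for results you plan to submit\n",
			c.Workload.CPURequests)
		return
	}
	fmt.Println("workload.cpurequests is 4; OK to submit")
}
```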

In future CloudXPRT releases, we may remove the option to change the workload.cpurequests value from the config.json file and simply fix the value in the benchmark’s code to promote test predictability and reasonable comparisons. For more information about configuration options for the web microservices workload, please consult the Overview of the CloudXPRT Web Microservices Workload white paper.


Data analytics results submission requirement

Starting with the May results submission cycle, all data analytics results submissions must have the best reported performance (throughput_jobs/min) correspond to a 95th percentile SLA latency of 90 seconds or less. We have received submissions where the throughput was extremely high, but the 95th percentile SLA latency was up to 10 times the 90 seconds that we recommend in CloudXPRT documentation. High latency values may be acceptable for the unique purposes of individual testers, but they do not provide a good basis for comparison between clusters under test. For more information about configuration options for the data analytics workload, please consult the Overview of the CloudXPRT Data Analytics Workload white paper.
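
To make the requirement concrete, here is a minimal sketch of how a tester might pick the reportable result: keep only runs whose 95th percentile latency is 90 seconds or less, then take the highest throughput among them. The result structure and the sample numbers are hypothetical; CloudXPRT’s actual output files may use different names and formats.

```go
// best_compliant_run.go: an illustrative (not official) way to choose the data analytics
// result to report under this requirement: the highest throughput_jobs/min whose
// 95th percentile latency is 90 seconds or less.
package main

import "fmt"

type result struct {
	ThroughputJobsPerMin float64 // throughput_jobs/min
	P95LatencySeconds    float64 // 95th percentile latency in seconds
}

// bestCompliant returns the highest-throughput run that meets the latency SLA,
// and false if no run qualifies.
func bestCompliant(runs []result, slaSeconds float64) (result, bool) {
	var best result
	found := false
	for _, r := range runs {
		// Ignore any run that violates the SLA, no matter how high its throughput.
		if r.P95LatencySeconds > slaSeconds {
			continue
		}
		if !found || r.ThroughputJobsPerMin > best.ThroughputJobsPerMin {
			best, found = r, true
		}
	}
	return best, found
}

func main() {
	// Hypothetical runs, not real CloudXPRT output.
	runs := []result{
		{ThroughputJobsPerMin: 42.0, P95LatencySeconds: 75},
		{ThroughputJobsPerMin: 58.5, P95LatencySeconds: 310}, // fast, but far outside the SLA
		{ThroughputJobsPerMin: 47.3, P95LatencySeconds: 88},
	}
	if best, ok := bestCompliant(runs, 90); ok {
		fmt.Printf("reportable best: %.1f jobs/min at %.0f s p95 latency\n",
			best.ThroughputJobsPerMin, best.P95LatencySeconds)
	} else {
		fmt.Println("no run met the 90-second SLA")
	}
}
```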

We will update CloudXPRT documentation to make sure that testers know to use the default configuration settings if they plan to submit results for publication. If you have any questions about CloudXPRT or the CloudXPRT results submission process, please let us know.

Justin

The Overview of the CloudXPRT Data Analytics Workload white paper is now available!

Today, we expand our portfolio of CloudXPRT resources with a paper on the benchmark’s data analytics workload. While we summarized the workload in the Introduction to CloudXPRT white paper, the new paper goes into much greater detail.

In addition to providing practical information about the data analytics installation package and minimum system requirements, the paper describes the workload’s test configuration variables, structural components, task workflows, and test metrics. It also discusses interpreting test results and the process for submitting results for publication.

CloudXPRT is the most complex tool in the XPRT family, and the new paper is part of our effort to create more—and better—CloudXPRT documentation. We plan to publish additional CloudXPRT white papers in the coming months, with possible future topics including the impact of adjusting specific test configuration options, recommendations for results reporting, and methods for analysis.

We hope that the Overview of the CloudXPRT Data Analytics Workload paper will serve as a go-to resource for CloudXPRT testers, and will answer any questions you have about the workload. You can find links to the paper and other resources in the Helpful Info box on CloudXPRT.com and the CloudXPRT section of our XPRT white papers page.

If you have any questions, please let us know!

Justin

The CloudXPRT learning tool is now live!

We’re happy to announce that the CloudXPRT learning tool is now live! We designed the tool to serve as an information hub for common CloudXPRT topics and questions, and to help tech journalists, OEM lab engineers, and everyone who is interested in CloudXPRT find the answers they need as quickly as possible.

The tool features four primary areas of content:

  • The Q&A section provides quick answers to the questions we receive most from testers and the tech press.
  • The CloudXPRT: the basics section covers topics such as the benchmark’s target platforms, workloads, companion cloud software, and hardware and software requirements.
  • The Testing and results section covers the testing process, metrics, and how to publish results.
  • The Cloud primer section provides brief, easy-to-understand definitions of key cloud computing terms and concepts.

The first screenshot below shows the home screen. To illustrate how some of the pop-up information sections appear, the second screenshot shows part of the Key terms and concepts module in the Cloud primer section. 

We’re excited about the new CloudXPRT learning tool! If you have any questions about the tool, or suggestions for additional content to include in it, please let us know!

Justin

Check out our new CloudXPRT video!

Many businesses want to move critical applications to the cloud, but choosing the right cloud-based infrastructure as a service (IaaS) platform can be a complex and costly project. We developed CloudXPRT to help speed up and simplify the process by providing a powerful benchmarking tool that allows users to run multiple workloads on cloud platform software in on-premises and popular public cloud environments.

To help spread the word about what CloudXPRT can do and why it matters to businesses, we’ve published a new video, Choose the best IaaS configuration for your business with CloudXPRT, on YouTube and CloudXPRT.com. If you know anyone who is evaluating cloud options, or who would be interested in CloudXPRT testing or results, we encourage you to share the video with them. As always, if you have any questions about CloudXPRT, please let us know!

Justin

Video: Choose the best IaaS configuration for your business with CloudXPRT.

A CloudXPRT build with bug fixes is on the way

We want to let CloudXPRT testers know that updated installer packages are on the way. The packages will include several fixes for bugs that we discovered in the initial CloudXPRT Preview release (build 0.95). The fixes do not affect CloudXPRT test results, but they do make installation easier and remove potential sources of confusion during setup and testing.

Along with a few text edits and other minor fixes, we made the following changes in the upcoming build:

  • We updated the data analytics setup code to prevent error messages that occurred when the benchmark treated one-node configurations as a special case.
  • We configured the data analytics workload to use a go.mod file for all the required Go modules. With this change, we can explicitly state the release versions of the necessary Go modules, so updates to the latest Go release won’t break the benchmark. This change also removes the need to include large gosrc.tar.gz files in the source code. (See the go.mod sketch after this list.)
  • We added a cleanup utility script for the web microservices workload. If something goes wrong during configuration or a test run, testers can use this script to clean everything and start over.
  • We fixed an error that prevented the benchmark from successfully retrieving the cluster_config.json file in certain multi-node setups.
  • In the web microservices workload, we changed the output format of the request rate metric from integer to float. This change allows us to report workload data with a higher degree of precision.
  • In the web microservices workload, we added an overall summary line to the results log file that reports the best throughput numbers from the test run.
  • In the web microservices code, we modified a Kubernetes option that the benchmark used to create the Cassandra schema. Prior to this change, the option generated an inconsequential but distracting error message about TTY input.
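
To illustrate the go.mod change mentioned above, here is a minimal sketch of the kind of file we mean. The module path and dependency versions below are placeholders rather than CloudXPRT’s actual dependencies; the point is that pinning explicit versions keeps a new Go release or an upstream update from breaking the benchmark, and removes the need to vendor source in large gosrc.tar.gz archives.

```
// Hypothetical go.mod; module paths and versions are placeholders, not CloudXPRT's real dependencies.
module example.com/cloudxprt/data-analytics

go 1.14

require (
	example.com/some/dependency v1.2.3
	example.com/another/dependency v0.4.0
)
```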

We haven’t set the release date for the updated build yet, but when we do, we’ll announce it here in the blog. If you have any questions about CloudXPRT, please let us know!

Justin

The CloudXPRT Preview is here!

The CloudXPRT Preview installation packages are now available on CloudXPRT.com and the BenchmarkXPRT GitHub repository! The CloudXPRT Preview includes two workloads: web microservices and data analytics (you can find more details about the workloads here). Testers can use metrics from the workloads to compare IaaS stack (both hardware and software) performance and to evaluate whether any given stack is capable of meeting SLA thresholds. You can configure CloudXPRT to run on local datacenter, Amazon Web Services, Google Cloud Platform, or Microsoft Azure deployments.

Several different test packages are available for download from the CloudXPRT download page. For detailed installation instructions and hardware and software requirements for each, click the package’s readme link. The Helpful Info box on CloudXPRT.com also contains resources such as links to the CloudXPRT master readme and the CloudXPRT GitHub repository. Soon, we will add a link to the CloudXPRT Preview source code, which will be freely available for testers to download and review.

All interested parties may now publish CloudXPRT results. However, until we begin the formal results submission and review process in July, we will publish only results we produce in our own lab. We anticipate adding the first set of those within the coming week.

We’re thankful for all the input we received during the initial CloudXPRT development process, and we welcome feedback on the CloudXPRT Preview. If you have any questions about CloudXPRT, or would like to share your comments and suggestions, please let us know.

Justin
