As we’ve been working
on improvements and updates for CloudXPRT, we’ve been using feedback from
community members to determine which changes will help testers most in the
short term. To make some of those changes available to the community as soon as
possible, we plan to release a beta version of CloudXPRT v1.1 in the coming weeks.
During the v1.1 beta
period, the CloudXPRT v1.01 installation packages on CloudXPRT.com and in our GitHub repository will remain the officially supported
version of CloudXPRT. However, interested testers can experiment with the v1.1
beta version in new environments while we finalize the build for official release.
The CloudXPRT v1.1
beta includes the following primary changes:
- We’re adding support for Ubuntu 20.04.2 or later, the number one
request we’ve received.
- We’re consolidating and standardizing the installation packages
for both workloads. Instead of one package for the data analytics workload and
four separate packages for the web microservices workload, each workload will
have two installation packages: one for all on-premises testing and one for
testing with all three supported CSPs.
- We’re incorporating Terraform to help create and
configure VMs, which will help to prevent situations in which testers do not
allocate enough storage per VM before testing.
- We use Kubespray to manage Kubernetes
clusters, and Kubespray uses Calico as the default network plug-in. Calico has not always worked
well for CloudXPRT in the CSP environment, so we’re replacing Calico with Weave.
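For context on the network plug-in change: Kubespray selects the CNI plug-in through the `kube_network_plugin` variable in the cluster inventory's group variables. A minimal sketch of the change is below; the exact file path varies across Kubespray versions, so treat it as an assumption and check your own inventory layout.

```yaml
# inventory/mycluster/group_vars/k8s-cluster/k8s-cluster.yml
# (path is an assumption; it differs between Kubespray versions)
# Replace Kubespray's default CNI plug-in (calico) with Weave:
kube_network_plugin: weave
```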
At the start of the
beta period, we will share a link to the v1.1 beta download page here in the
blog. You’ll be free to share this link. To avoid confusion, we will not add the
beta download to the v1.01 downloads available on CloudXPRT.com.
As the beta release
date approaches, we’ll share more details about timelines, access, and any additional
changes to the benchmark. If you have any questions about the upcoming
CloudXPRT v1.1 beta, please let us know!
Soon, we’ll be publishing
a CloudXPRT white paper that focuses on the benchmark’s data analytics
workload. While we summarized the workload in the Introduction to CloudXPRT white paper, the new
paper will discuss the workload in much greater detail, just as the Overview of
the CloudXPRT Web Microservices Workload paper did for that workload.
In addition to
providing practical information about the installation package and minimum
system requirements for the data analytics workload, the paper will describe
test configuration variables, structural components, task workflows, and test
metrics. It will also include guidance on interpreting test results and submitting
them for publication.
As we’ve noted, CloudXPRT is one of the more complex tools in the XPRT family,
with no shortage of topics to explore. Possible future topics include the
impact of adjusting specific test configuration options, recommendations for
results reporting, and methods for results analysis. If there are specific
topics that you’d like us to address in future white papers, please feel free
to send us your ideas!
We hope that the
upcoming Overview of the CloudXPRT Data Analytics Workload paper
will serve as a go-to resource for CloudXPRT testers, and will answer any
questions you have about the workload. Once it goes live, we’ll provide links
in the Helpful Info box on CloudXPRT.com and the CloudXPRT section of our XPRT white papers page.
If you have any questions, please let us know!
We want to let CloudXPRT testers know that we’re close to
releasing an updated version (build 1.01) with two minor bug fixes, an improved
post-test results processing script, and an adjustment to one of our test
configuration recommendations. None of these changes will affect performance or
test results, so scores from previous CloudXPRT builds will be comparable to
those from the new build.
The most significant changes in CloudXPRT build 1.01 are as follows:
- In previous builds, some testers encountered warnings during setup telling them to update their version of Kubernetes Operations (kops) when testing on public-cloud platforms (the CloudXPRT 1.00 recommendation is kops version 1.16.0). We are adjusting the kops installation steps in the setup instructions for the web microservices and data analytics workloads to prevent these warnings.
- In previous builds, the post-test cleanup instructions for public-cloud testing environments did not always delete all of the resources that CloudXPRT creates during setup. We are updating the instructions to ensure a more thorough cleanup process. This change applies to the test instructions for the web microservices and data analytics workloads.
- We are reformatting the optional results graphs the web microservices postprocess program creates to make them easier to interpret.
- In previous builds, the recommended time interval for the web microservices workload is 120 seconds if the hpamode option is enabled and 60 seconds if it is disabled. Because we’ve found that the 60-second difference has no significant impact on test results, we are changing the recommendation to 60 seconds for both hpamode settings.
We hope these changes
will improve the CloudXPRT setup and testing experience. We haven’t set the
release date for the updated build yet, but when we do, we’ll announce it here
in the blog. If you have any questions about CloudXPRT, or would like to report
bugs or other issues, please feel free to contact us!
Soon, we’ll be expanding
our portfolio of CloudXPRT resources with a white paper that focuses on the benchmark’s
web microservices workload. While we summarized the workload in the Introduction to CloudXPRT white paper, the new paper will discuss the
workload in much greater detail.
In addition to providing practical information about the web microservices installation packages and minimum system requirements, the paper will describe the workload’s test configuration variables, structural components, task workflows, and test metrics. It will also discuss interpreting test results and the process for submitting results for publication.
As we’ve noted, CloudXPRT is one of the more complex tools in the XPRT family, with no shortage of topics to explore further. We plan to publish a companion overview for the data analytics workload, and possible future topics include the impact of adjusting specific test configuration options, recommendations for results reporting, and methods for analysis.
We hope that the
upcoming Overview of the CloudXPRT Web Microservices Workload paper will
serve as a go-to resource for CloudXPRT testers, and will answer any questions
you have about the workload. Once it goes live, we’ll provide links in the
Helpful Info box on CloudXPRT.com and the CloudXPRT section of our XPRT white papers page.
If you have any questions,
please let us know!
This week, we made
some changes to the CloudXPRT results viewer that we think will simplify the results-browsing experience and
allow visitors to more quickly and easily find important data.
The first set of
changes involves how we present test system information in the main results
table and on the individual results details pages. We realized that there was
potential for confusion around the “CPU” and “Number of nodes” categories. We
removed those and created the following new fields: “Cluster components,”
“Nodes (worker + control plane),” and
“vCPUs (worker + control plane).” These new categories better describe test
configurations and clarify how many CPUs engage with the workload.
The second set of
changes involves the number of data points that we list in the table for each web
microservices test run. For example, previously, we published a unique entry
for each level of concurrency a test run recorded. If a run scaled to 32
concurrent instances, we presented the data for each concurrency level in its own row. This
helped to show the performance curve during a single test as the workload
scaled up, but it made it more difficult for visitors to identify the best
throughput results from an individual run. We decided to consolidate the
results from a complete test run on a single row, highlighting only the maximum
number of successful requests (throughput). All the raw data from each run remains
available for download on the details page for each result, but visitors don’t
have to wade through all that data to find the configuration’s main “score.”
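The consolidation described above amounts to a simple group-and-max over the per-concurrency rows. A minimal sketch in Python, assuming illustrative field names ("run_id", "concurrency", "throughput") rather than the viewer's actual schema:

```python
# Sketch: collapse per-concurrency rows from a web microservices test run
# into one row per run, keeping the row with the best (maximum) throughput.
# Field names and values are illustrative, not CloudXPRT's actual schema.

def consolidate(rows):
    """Return one summary dict per run_id: the row with maximum throughput."""
    best = {}
    for row in rows:
        run = row["run_id"]
        if run not in best or row["throughput"] > best[run]["throughput"]:
            best[run] = row
    return list(best.values())

rows = [
    {"run_id": "A", "concurrency": 8,  "throughput": 410.0},
    {"run_id": "A", "concurrency": 16, "throughput": 655.5},
    {"run_id": "A", "concurrency": 32, "throughput": 601.2},
    {"run_id": "B", "concurrency": 8,  "throughput": 395.8},
]

summary = consolidate(rows)
# Run A is summarized by its 16-instance row, its best throughput.
```

The per-concurrency detail is still useful for seeing the scaling curve, which is why the raw data stays downloadable on each result's details page.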
We view the development of the CloudXPRT results viewer as an ongoing process. As we add results and receive feedback from testers about the data presentation formats that work best for them, we’ll continue to add more features and tweak existing ones to make them as useful as possible. If you have any questions about CloudXPRT results or the results viewer, please let us know!
The CloudXPRT Preview installation packages are now available on CloudXPRT.com and the BenchmarkXPRT GitHub repository! The CloudXPRT Preview includes two workloads: web microservices and data analytics (you can find more details about the workloads here). Testers can use metrics from the workloads to compare IaaS stack (both hardware and software) performance and to evaluate whether any given stack is capable of meeting SLA thresholds. You can configure CloudXPRT to run on local datacenter, Amazon Web Services, Google Cloud Platform, or Microsoft Azure deployments.
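One way to use CloudXPRT metrics for the SLA evaluation mentioned above: given per-concurrency measurements, find the best throughput a stack achieves while still meeting a latency threshold. The sketch below uses made-up numbers and field names purely for illustration:

```python
# Sketch: check whether an IaaS stack can meet an SLA threshold using
# per-concurrency results. All numbers and field names are illustrative.

SLA_P95_LATENCY_MS = 3000  # example SLA: 95th-percentile latency cap

results = [
    {"concurrency": 8,  "p95_latency_ms": 850,  "throughput": 410.0},
    {"concurrency": 16, "p95_latency_ms": 1900, "throughput": 655.5},
    {"concurrency": 32, "p95_latency_ms": 4200, "throughput": 601.2},
]

def max_sla_throughput(results, threshold_ms):
    """Best throughput among load levels that still meet the SLA,
    or None if the stack misses the threshold at every measured load."""
    compliant = [r for r in results if r["p95_latency_ms"] <= threshold_ms]
    if not compliant:
        return None
    return max(compliant, key=lambda r: r["throughput"])

best = max_sla_throughput(results, SLA_P95_LATENCY_MS)
# Here the 16-instance level is the best SLA-compliant result.
```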
Several different test packages are available for
download from the CloudXPRT download
page. For detailed installation instructions and
hardware and software requirements for each, click the package’s readme link. The
Helpful Info box on CloudXPRT.com also contains resources such as links to the
CloudXPRT master readme and the CloudXPRT GitHub repository. Soon, we will add
a link to the CloudXPRT Preview source code, which will be freely available for
testers to download and review.
All interested parties may now publish CloudXPRT
results. However, until we begin the formal results submission and review process in July, we will publish only results we
produce in our own lab. We anticipate adding the first set of those within the coming weeks.
We’re thankful for all the input we received during the initial CloudXPRT development process, and we welcome feedback on the CloudXPRT Preview. If you have any questions about CloudXPRT, or would like to share your comments and suggestions, please let us know.