Last week, we announced that a CloudXPRT v1.1
beta was on the way. We’re happy to say that the v1.1 beta is now available to
the public on a dedicated CloudXPRT v1.1 beta download page. While CloudXPRT v1.01
remains the officially supported version on CloudXPRT.com and in our GitHub
repository, interested testers can use the v1.1
beta version in new environments as we finalize the v1.1 build for official
release. You are welcome to publish results, as we do not expect them to
change in the final, official release.
As we mentioned in
last week’s post, the CloudXPRT v1.1 beta includes the following changes:
We’ve added support for Ubuntu 20.04.2 or later for on-premises testing.
We’ve consolidated and standardized the installation packages
for both workloads. Instead of one package for the data analytics workload and
four separate packages for the web microservices workload, each workload has a
single installation package that supports on-premises testing and testing with
all three supported CSPs.
We’ve incorporated Terraform to create and
configure VMs, which helps prevent the problems that can arise when testers do
not allocate enough storage per VM prior to testing (see the sketch after this list).
We’ve replaced the Calico network plugin in Kubespray with Weave, which helps to avoid some
of the networking issues testers have occasionally encountered in CSP environments.
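To illustrate the storage problem the Terraform change addresses, here is a minimal pre-flight sketch in Python. It is not part of CloudXPRT, and the 64 GB threshold is an assumed placeholder; the actual per-VM storage requirement appears in each workload’s readme.

```python
# Hypothetical pre-flight check (not part of CloudXPRT): verify that a
# test VM's filesystem is large enough before starting a benchmark run.
import shutil
import sys

MIN_DISK_BYTES = 64 * 1024**3  # assumed per-VM minimum; see the readme

def check_disk(path: str = "/") -> bool:
    """Return True if the filesystem holding `path` meets the minimum size."""
    usage = shutil.disk_usage(path)
    total_gib = usage.total / 1024**3
    ok = usage.total >= MIN_DISK_BYTES
    print(f"{path}: {total_gib:.1f} GiB total "
          f"({'OK' if ok else 'below assumed minimum'})")
    return ok

if __name__ == "__main__":
    sys.exit(0 if check_disk() else 1)
```

With Terraform provisioning the VMs in the v1.1 beta, this kind of manual verification should rarely be necessary, because the disk size is declared up front in the VM configuration.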
Please feel free to
share the link to the beta download page. (To avoid confusion, the beta will
not appear in the main CloudXPRT download table.) We can’t yet state
definitively whether results from the new version will be comparable to those
from v1.01. We have not observed any significant differences in performance,
but we haven’t tested every possible test configuration across every platform.
If you observe different results when testing the same configuration with v1.01
and v1.1 beta, please send us the details so we can investigate.
If you have any questions about CloudXPRT or the CloudXPRT v1.1 beta, please let us know!
CloudXPRT is undoubtedly
the most complex tool in the XPRT family of benchmarks. To run the cloud-native
benchmark’s multiple workloads across different hardware and software platforms,
testers need two things: (1) at least a passing familiarity with a wide range
of cloud-related toolkits, and (2) an understanding that changing even one test
configuration variable can affect test results. While the complexity of CloudXPRT
makes it a powerful and flexible tool for measuring application performance on
real-world IaaS stacks, it also creates a steep learning curve for new users.
Benchmark setup and
configuration can involve a number of complex steps, and the corresponding
instructions should be thorough, unambiguous, and intuitive to follow. For all
of the XPRT tools, we strive to publish documentation that provides quick,
easy-to-find answers to the questions users might have. Community members have asked
us to improve the clarity and readability of the CloudXPRT setup,
configuration, and individual workload documentation. In response, we are
working to create more—and better—CloudXPRT documentation.
If the benchmark’s complexity seems intimidating, know that helping you get
started is one of our highest priorities. In
the coming weeks and months, we’ll be evaluating all of our CloudXPRT
documentation, particularly from the perspective of new users, and will release
more information about the new documentation as it becomes available.
We also want to remind
you of some of the existing CloudXPRT resources. We encourage everyone to check
out the Introduction to CloudXPRT and Overview of the CloudXPRT Web Microservices Workload white papers. (Note
that we’ll soon be publishing a paper on the benchmark’s data analytics
workload.) Also, a couple of weeks ago, we published the CloudXPRT learning tool, which we designed to serve as an information
hub for common CloudXPRT topics and questions, and to help tech journalists,
OEM lab engineers, and everyone who is interested in CloudXPRT find the answers
they need as quickly as possible.
Thanks to all who let us know that there was room for improvement in the CloudXPRT documentation. We rely on that kind of feedback and always welcome it. If you have any questions or suggestions regarding CloudXPRT or any of the other XPRTs, please let us know!
We’re happy to announce
that the CloudXPRT learning tool is now live! We
designed the tool to serve as an information hub for common CloudXPRT topics
and questions, and to help tech journalists, OEM lab engineers, and everyone
who is interested in CloudXPRT find the answers they need as quickly as possible.
The tool features four
primary areas of content:
The Q&A section provides quick answers to the questions we
receive most from testers and the tech press.
The CloudXPRT: the basics section describes specific topics such
as the benchmark’s target platforms, workloads, companion cloud software, and
hardware and software requirements.
The Testing and results section covers the testing process,
metrics, and how to publish results.
The Cloud primer section provides brief, easy-to-understand definitions of
key cloud computing terms and concepts.
The first screenshot below shows the home screen. To illustrate how some of the pop-up information sections appear, the second screenshot shows part of the Key terms and concepts module in the Cloud primer section.
We’re excited about the new CloudXPRT learning tool! If you have any questions about the tool, or suggestions for additional content to include in it, please let us know!
In addition to
providing practical information about the installation package and minimum
system requirements for the data analytics workload, the upcoming Overview of
the CloudXPRT Data Analytics Workload paper will describe
test configuration variables, structural components, task workflows, and test
metrics. It will also include guidance on interpreting test results and submitting
them for publication.
As we’ve noted, CloudXPRT is one of the more complex tools in the XPRT family,
with no shortage of topics to explore. Possible future topics include the
impact of adjusting specific test configuration options, recommendations for
results reporting, and methods for results analysis. If there are specific
topics that you’d like us to address in future white papers, please feel free
to send us your ideas!
We hope that the
upcoming Overview of the CloudXPRT Data Analytics Workload paper
will serve as a go-to resource for CloudXPRT testers, and will answer any
questions you have about the workload. Once it goes live, we’ll provide links
in the Helpful Info box on CloudXPRT.com and the CloudXPRT section of our XPRT white papers page.
Soon, we’ll be expanding
our portfolio of CloudXPRT resources with a white paper that focuses on the benchmark’s
web microservices workload. While we summarized the workload in the Introduction to CloudXPRT white paper, the new paper will discuss the
workload in much greater detail.
In addition to providing practical information about the web microservices installation packages and minimum system requirements, the paper will describe the workload’s test configuration variables, structural components, task workflows, and test metrics. It will also discuss how to interpret test results and the process for submitting results for publication.
As we’ve noted, CloudXPRT is one of the more complex tools in the XPRT family, with no shortage of topics to explore further. We plan to publish a companion overview for the data analytics workload, and possible future topics include the impact of adjusting specific test configuration options, recommendations for results reporting, and methods for analysis.
We hope that the
upcoming Overview of the CloudXPRT Web Microservices Workload paper will
serve as a go-to resource for CloudXPRT testers, and will answer any questions
you have about the workload. Once it goes live, we’ll provide links in the
Helpful Info box on CloudXPRT.com and the CloudXPRT section of our XPRT white papers page.
The CloudXPRT Preview period has ended, and CloudXPRT version 1.0 installation packages are now available on CloudXPRT.com and the BenchmarkXPRT GitHub repository! Like the Preview build, CloudXPRT version 1.0 includes two workloads: web microservices and data analytics (you can find more details about the workloads here). Testers can use metrics from the workloads to compare IaaS stack (both hardware and software) performance and to evaluate whether any given stack is capable of meeting SLA thresholds. You can configure CloudXPRT to run on local datacenter, Amazon Web Services, Google Cloud Platform, or Microsoft Azure deployments.
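To make the SLA idea concrete, here is a minimal Python sketch of the evaluation concept. It does not use CloudXPRT’s actual output format (see each workload’s readme for that); the result values and the latency threshold below are invented for illustration.

```python
# Illustrative only: find the highest throughput a stack sustained without
# violating an assumed latency SLA. CloudXPRT's real result files differ.

# Hypothetical per-load-level results:
# (concurrent requests, requests per second, 95th-percentile latency in seconds)
results = [
    (8,   95.0, 0.8),
    (16, 180.0, 1.4),
    (32, 310.0, 2.7),
    (64, 405.0, 4.9),  # violates the SLA below
]

SLA_P95_SECONDS = 3.0  # assumed latency threshold

passing = [(c, rps) for c, rps, p95 in results if p95 <= SLA_P95_SECONDS]
if passing:
    concurrency, best_rps = max(passing, key=lambda t: t[1])
    print(f"Meets the SLA up to {concurrency} concurrent requests "
          f"at {best_rps:.0f} requests/sec")
else:
    print("The stack did not meet the latency SLA at any tested load level")
```

The benchmark performs this kind of evaluation itself; the point is simply that CloudXPRT’s metrics tie throughput to a latency constraint rather than reporting throughput alone.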
Several different test packages are available for download from the CloudXPRT download page. For detailed installation instructions and hardware and software requirements for each, click the package’s readme link. On CloudXPRT.com, the Helpful Info box contains resources such as links to the Introduction to CloudXPRT white paper, the CloudXPRT master readme, and the CloudXPRT GitHub repository.
The GitHub repository also contains the CloudXPRT
source code. The source code is freely available for testers to download and modify.
Performance results from this release are comparable
to those from the CloudXPRT Preview build. Testers who wish to
publish results on CloudXPRT.com can find more information about the results
submission and review process in the blog. We post the monthly results cycle schedule on the results page.
We’re thankful for all the input we received during the CloudXPRT development process and Preview period. If you have any questions about CloudXPRT, please let us know.