We want to let CloudXPRT testers know that we’re close to
releasing an updated version (build 1.01) with two minor bug fixes, an improved
post-test results processing script, and an adjustment to one of our test
configuration recommendations. None of these changes will affect performance or
test results, so scores from previous CloudXPRT builds will be comparable to
those from the new build.
The most significant changes in CloudXPRT build 1.01 are as follows:
In previous builds, some testers encountered warnings during setup to update the version of Kubernetes Operations (kops) when testing on public-cloud platforms (the CloudXPRT 1.00 recommendation is kops version 1.16.0). We are adjusting the kops installation steps in the setup instructions for the web microservices and data analytics workloads to prevent these warnings.
In previous builds, the post-test cleanup instructions for public-cloud testing environments did not always delete all of the resources that CloudXPRT creates during setup. We are updating the instructions to ensure a more thorough cleanup process. This change applies to the test instructions for the web microservices and data analytics workloads.
We are reformatting the optional results graphs the web microservices postprocess program creates to make them easier to interpret.
In previous builds, the recommended time interval for the web microservices workload was 120 seconds if the hpamode option was enabled and 60 seconds if it was disabled. Because we’ve found that the 60-second difference has no significant impact on test results, we are changing the recommendation to 60 seconds for both hpamode settings.
We hope these changes
will improve the CloudXPRT setup and testing experience. We haven’t set the
release date for the updated build yet, but when we do, we’ll announce it here
in the blog. If you have any questions about CloudXPRT, or would like to report
bugs or other issues, please feel free to contact us!
The CloudXPRT Preview period has ended, and CloudXPRT version 1.0 installation packages are now available on CloudXPRT.com and the BenchmarkXPRT GitHub repository! Like the Preview build, CloudXPRT version 1.0 includes two workloads: web microservices and data analytics (you can find more details about the workloads here). Testers can use metrics from the workloads to compare IaaS stack (both hardware and software) performance and to evaluate whether any given stack is capable of meeting SLA thresholds. You can configure CloudXPRT to run on local datacenter, Amazon Web Services, Google Cloud Platform, or Microsoft Azure deployments.
Several different test packages are available for download from the CloudXPRT download page. For detailed installation instructions and hardware and software requirements for each, click the package’s readme link. On CloudXPRT.com, the Helpful Info box contains resources such as links to the Introduction to CloudXPRT white paper, the CloudXPRT master readme, and the CloudXPRT GitHub repository.
The GitHub repository also contains the CloudXPRT
source code. The source code is freely available for testers to download and review.
Performance results from this release are comparable
to performance results from the CloudXPRT Preview build. Testers who wish to
publish results on CloudXPRT.com can find more information about the results
submission and review process in the blog. We post the monthly results cycle schedule on the results page.
We’re thankful for all the input we received during the CloudXPRT development process and Preview period. If you have any questions about CloudXPRT, please let us know.
Many businesses want
to move critical applications to the cloud, but choosing the right cloud-based
infrastructure as a service (IaaS) platform can be a complex and costly project.
We developed CloudXPRT to help speed up and simplify the process by providing a
powerful benchmarking tool that allows users to run multiple workloads on cloud
platform software in on-premises and popular public cloud environments.
We want to let CloudXPRT testers know that updated installer packages are on the way. The packages will include several fixes for bugs that we discovered in the initial CloudXPRT Preview release (build 0.95). The fixes do not affect CloudXPRT test results, but do help to facilitate installation and remove potential sources of confusion during the setup and testing process.
Along with a few text edits
and other minor fixes, we made the following changes in the upcoming build:
We updated the data analytics setup code to prevent error messages that occurred
when the benchmark treated one-node configurations as a special case.
We configured the data analytics workload to use a go.mod file for all of the
required Go modules. With this change, we can explicitly state the release
versions of the necessary Go modules, and updates to the latest Go release won’t
break the benchmark. This change also removes the need to include large gosrc.tar.gz
files in the source code.
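For readers unfamiliar with Go modules, a go.mod file pins each dependency to an explicit release. The sketch below is purely illustrative; the module path and dependencies shown are hypothetical, not CloudXPRT’s actual ones:

```go
// go.mod (illustrative): pins dependency versions so that new upstream
// releases cannot silently change what the benchmark builds against.
module example.com/data-analytics // hypothetical module path

go 1.14

require (
    k8s.io/apimachinery v0.17.4 // pinned release version
    k8s.io/client-go v0.17.4    // pinned release version
)
```

Because the versions are stated explicitly, `go build` resolves the same module releases on every machine, which is what makes the benchmark robust against newer Go releases.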
We added a cleanup utility script for the web microservices workload. If something
goes wrong during configuration or a test run, testers can use this script to
clean everything and start over.
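As an illustration of what such a cleanup utility might do, here is a minimal sketch; the namespace, paths, and dry-run behavior are our own assumptions, not the actual script’s contents:

```shell
#!/bin/sh
# Hypothetical cleanup sketch: removes leftover Kubernetes resources and
# local output from a failed run. DRY_RUN=1 (the default here) only prints
# each command; set DRY_RUN=0 to actually execute them.
DRY_RUN=${DRY_RUN:-1}

run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "would run: $*"
  else
    "$@"
  fi
}

# The namespace and directory names below are illustrative.
run kubectl delete namespace cloudxprt-web --ignore-not-found
run rm -rf ./output
```

A dry-run default like this lets a tester preview exactly which resources a cleanup pass would remove before committing to it.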
We fixed an error that prevented the benchmark from successfully retrieving the cluster_config.json
file in certain multi-node setups.
In the web microservices workload, we changed the output format of the request
rate metric from integer to float. This change allows us to report workload
data with a higher degree of precision.
In the web microservices workload, we added an overall summary line to the results log
file that reports the best throughput numbers from the test run.
In the web microservices code, we
modified a Kubernetes option that the benchmark used to create the Cassandra
schema. Prior to this change, the option generated an inconsequential but
distracting error message about TTY input.
We haven’t set the release date for the updated build yet, but when we do, we’ll announce it here in the blog. If you have any questions about CloudXPRT, please let us know!
We’re happy to announce that the CloudXPRT results viewer is now live with results from the first few rounds of CloudXPRT
Preview testing we conducted in our lab. Here are some tips to help you
navigate the viewer more efficiently:
Click the tabs at the top of the table to switch from Data analytics
workload results to Web microservices workload results.
Click the header of any column to sort the data on that
variable. Single-click to sort A to Z, and double-click to sort Z to A.
Click the link in the Source/details column to visit a detailed
page for that result, where you’ll find additional test configuration and
system hardware information and the option to download results files.
By default, the viewer displays eight results per page, which
you can change to 16, 48, or Show all.
The free-form search field above the table lets you filter for
variables such as cloud service or processor.
We’ll be adding more features, including expanded filtering and
sorting mechanisms, to the results viewer in the near future. We’re also
investigating ways to present multiple data points in a graph format, which
will allow visitors to examine performance behavior curves in conjunction with
factors such as concurrency and resource utilization.
We welcome your CloudXPRT results submissions! To learn about
the new submission and review process we’ll be using, take a look at last week’s blog.
If you have any questions or suggestions for ways that we can
improve the results viewer, please let us know!
The CloudXPRT Preview installation packages are now available on CloudXPRT.com and the BenchmarkXPRT GitHub repository! The CloudXPRT Preview includes two workloads: web microservices and data analytics (you can find more details about the workloads here). Testers can use metrics from the workloads to compare IaaS stack (both hardware and software) performance and to evaluate whether any given stack is capable of meeting SLA thresholds. You can configure CloudXPRT to run on local datacenter, Amazon Web Services, Google Cloud Platform, or Microsoft Azure deployments.
Several different test packages are available for
download from the CloudXPRT download
page. For detailed installation instructions and
hardware and software requirements for each, click the package’s readme link. The
Helpful Info box on CloudXPRT.com also contains resources such as links to the
CloudXPRT master readme and the CloudXPRT GitHub repository. Soon, we will add
a link to the CloudXPRT Preview source code, which will be freely available for
testers to download and review.
All interested parties may now publish CloudXPRT
results. However, until we begin the formal results submission and review process in July, we will publish only results we
produce in our own lab. We anticipate adding the first set of those within the coming weeks.
We’re thankful for all the input we received during the initial CloudXPRT development process, and we welcome feedback on the CloudXPRT Preview. If you have any questions about CloudXPRT, or would like to share your comments and suggestions, please let us know.