We're happy to announce that the CloudXPRT v1.2 update package is now available! The
update prevents potential installation failures on Google Cloud Platform and
Microsoft Azure, and ensures that the web microservices workload works on
Ubuntu 22.04. The update incorporates newer software components, including
Kubernetes v1.23.7, Kubespray v2.18.1, and Kubernetes Metrics Server v1, along
with some additional minor script changes.
The CloudXPRT v1.2 web microservices workload installation package is available on the
CloudXPRT.com download page and in the BenchmarkXPRT GitHub repository.
Before you get started with v1.2, please note the following updated system requirements:
- Ubuntu 20.04.2 or 22.04 for on-premises testing
- Ubuntu 18.04, 20.04.2, or 22.04 for CSP (AWS/Azure/GCP) testing
Because CloudXPRT is designed to run on high-end servers, the physical nodes or
VMs under test must meet the following minimum specifications:
- 16 logical or virtual CPUs
- 8 GB of RAM
- 10 GB of available disk space (50 GB for the data analytics workload)
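For a quick pre-flight check, the minimums above can be scripted. This is only a sketch: the thresholds come from the list above, but the choice of `/` as the mount point to check for free space is an assumption, and the 10 GB figure is the web microservices target (substitute 50 GB if you plan to run the data analytics workload).

```python
# Sketch of a pre-flight check against the CloudXPRT minimum specs above.
# Assumption: free-space is checked on "/"; adjust for your install path.
import os
import shutil

MIN_CPUS = 16      # logical or virtual CPUs
MIN_RAM_GB = 8     # GB of RAM
MIN_DISK_GB = 10   # GB free (use 50 for the data analytics workload)

def check_node(path="/"):
    """Return a list of problems; an empty list means the node qualifies."""
    problems = []

    cpus = os.cpu_count() or 0
    if cpus < MIN_CPUS:
        problems.append(f"need >= {MIN_CPUS} logical CPUs, found {cpus}")

    # Total physical RAM via sysconf (Linux/Unix only).
    ram_gb = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / 1024**3
    if ram_gb < MIN_RAM_GB:
        problems.append(f"need >= {MIN_RAM_GB} GB RAM, found {ram_gb:.1f} GB")

    disk_gb = shutil.disk_usage(path).free / 1024**3
    if disk_gb < MIN_DISK_GB:
        problems.append(f"need >= {MIN_DISK_GB} GB free disk, found {disk_gb:.1f} GB")

    return problems

if __name__ == "__main__":
    issues = check_node()
    print("Node meets CloudXPRT minimums" if not issues else "\n".join(issues))
```

Running the script on each node or VM before installation can save a failed setup pass later.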
The update package includes only the updated v1.2 test harness and the updated web
microservices workload. It does not include the data analytics workload. As we
stated in the blog,
now that we’ve published the web microservices package, we will assess the
level of interest users express in a possible refresh of the v1.1 data
analytics workload. For now, the v1.1 data analytics workload will continue to
be available via CloudXPRT.com
for some time to serve as a reference resource for users who have worked with
the package in the past.
Please let us know if you have any questions about the CloudXPRT v1.2 test package. Happy testing!
Last month, we announced
that we’re working on an updated CloudXPRT web microservices test package. The purpose
of the update is to fix installation failures on Google Cloud Platform and
Microsoft Azure, and ensure that the web microservices workload works on Ubuntu
22.04, using updated software components such as Kubernetes v1.23.7, Kubespray
v2.18.1, and Kubernetes Metrics Server v1. The update also incorporates some
additional minor script changes.
We are still testing the updated test package with on-premises hardware and Amazon
Web Services, Google Cloud Platform, and Microsoft Azure configurations. So
far, testing is progressing well, and we feel increasingly confident that we
will be able to release the updated test package soon. We would like to share a
more concrete release schedule, but because of the complexity of the workload
and the CSP platforms involved, we are waiting until we are certain that
everything is ready to go.
The name of the updated package will be CloudXPRT v1.2, and it will include only the
updated v1.2 test harness and the updated web microservices workload. It will
not include the data analytics workload. As we stated in last month’s blog, we plan
to publish the updated web microservices package and see what kind of interest
we receive from users in a possible refresh of the v1.1 data analytics workload.
For now, the v1.1 data analytics workload will continue to be available via CloudXPRT.com
for some time to serve as a reference resource for users who have worked with
the package in the past.
As soon as possible, we'll provide more information about the CloudXPRT v1.2 release
date here in the blog. If you have any questions about the update or CloudXPRT
in general, please feel free to contact us!
We developed our first cloud benchmark, CloudXPRT,
to measure the performance of cloud applications deployed on modern infrastructure
as a service (IaaS) platforms. When we first released CloudXPRT in
February of 2021, the benchmark included two test packages: a web microservices
workload and a data analytics workload. Both supported on-premises and cloud
service provider (CSP) testing with Amazon Web Services (AWS), Google Cloud
Platform (GCP), and Microsoft Azure.
CloudXPRT is our most complex benchmark, requiring sustained compatibility between many
software components across multiple independent test environments. As vendors
roll out updates for some components and stop supporting others, it’s
inevitable that something will break. Since CloudXPRT’s launch, we’ve become
aware of installation failures while attempting to set up CloudXPRT on Ubuntu
virtual machines with GCP and Microsoft Azure. Additionally, while the web
microservices workload continues to run in most instances with a few
configuration tweaks and workarounds, the data analytics workload fails
consistently due to compatibility issues with Minio, Prometheus, and Kafka
within the Kubernetes environment.
In response, we're working to fix problems with the web microservices workload and
bring all necessary components up to date. We’re developing an updated test
package that will work on Ubuntu 22.04, using Kubernetes v1.23.7 and Kubespray
v2.18.1. We’re also updating Kubernetes Metrics Server from v1beta1 to v1, and will
incorporate some minor script changes. Our goal is to ensure successful
installation and testing with the on-premises and CSP platforms that we
supported when we first launched CloudXPRT.
We are currently focusing on the web microservices workload for two reasons.
First, more users have downloaded it than the data analytics workload. Second, we
think we have a clear path to success. Our plan is to publish the updated web
microservices test package, and see what feedback and interest we receive from
users about a possible data analytics refresh. The existing data analytics workload
will remain available via CloudXPRT.com for the time being to serve as a
reference resource for users who have worked with the package in the past.
We apologize for the inconvenience that these issues have caused. We'll provide
more information about a release timeline and final test package details here
in the blog as we get closer to publication. If you have any questions about
the future of CloudXPRT, please feel free to contact us!
Some CloudXPRT testers have reported installation failures while attempting to set
up CloudXPRT on Ubuntu virtual machines with Google Cloud Platform (GCP) and
Microsoft Azure. We have not yet determined whether the installation process
fails consistently on these VMs or whether the problem occurs only under specific
conditions. We believe these failures occur only with GCP and Azure, so you should
still be able to successfully install and run CloudXPRT on both Amazon Web
Services virtual machines and on-premises gear.
We apologize for the inconvenience that this issue causes for CloudXPRT testers
and will let the community know as soon as we identify a reliable solution. If
you have encountered any other issues during CloudXPRT testing, please feel
free to contact us!
We recently published a set of CloudXPRT Data Analytics and Web Microservices
workload test results
submitted by Quanta Computer, Inc.
The Quanta submission is the first set of CloudXPRT results that we’ve
published using the formal results submission and approval process.
We’re grateful to the Quanta team for carefully following the submission
guidelines, enabling us to complete the review process without a hitch.
If you are unfamiliar
with the process, you can find general information about how we review
submissions in a previous blog post.
Detailed, step-by-step instructions are available on the results submission page.
As a reminder for testers who are considering submitting results for July, the
submission deadline is tomorrow, Friday July 16, and the publication date is
Friday July 30. We list the submission and publication dates for the rest of
2021 below. Please note that we do not plan to review submissions in December,
so if we receive results submissions after November 30, we may not publish them
until the end of January 2022.
- August: submission deadline Tuesday 8/17/21; publication date Tuesday 8/31/21
- September: submission deadline Thursday 9/16/21; publication date Thursday 9/30/21
- October: submission deadline Friday 10/15/21; publication date Friday 10/29/21
- November: submission deadline Tuesday 11/16/21; publication date Tuesday 11/30/21
- December: submission deadline N/A; publication date N/A
If you have any questions about the CloudXPRT results submission, review, or publication process, please let us know!
We recently received questions about whether we accept CloudXPRT
results submissions from testing on pre-production gear, and how we would handle
any differences between results from pre-production and production-level tests.
To answer the first question, we are not opposed to pre-production
results submissions. We realize that vendors often want to include benchmark
results in launch-oriented marketing materials they release before their
hardware or software is publicly available. To help them do so, we’re happy to
consider pre-production submissions on a case-by-case basis. All such submissions
must follow the normal CloudXPRT results
submission process, and undergo
vetting by the CloudXPRT Results Review Group according to the standard review
and publication schedule. If we decide to publish pre-production results on our site, we
will clearly note their pre-production status.
In response to the second question, the CloudXPRT Results Review Group will handle any challenges to published results or perceived discrepancies between pre-production and production-level results on a case-by-case basis. We do not currently have a formal process for challenges; anyone who would like to initiate a challenge or express comments or concerns about a result should address the review group via email@example.com. Our primary concern is always to ensure that published results accurately reflect the performance characteristics of production-level hardware and software. If it becomes necessary to develop more policies in the future, we’ll do so, but we want to keep things as simple as possible.
If you have any questions about the CloudXPRT results submission process, please let us know!