Earlier this month, we discussed
the possibility of using a periodic results submission process for CloudXPRT
instead of the traditional rolling publication process that we’ve used for the
other XPRTs. We’ve received some positive responses to the idea, and while
we’re still working out some details, we’re ready to share the general
framework of the process we’re planning to use.
- We will establish
a results review group, which only official BenchmarkXPRT Development Community
members can join.
- We will update
the CloudXPRT database with new results once a month, on a pre-published
schedule.
- Two weeks before each
publication date, we will stop accepting submissions for consideration for
that review cycle.
- One week before each
publication date, we will send an email to the results review group that
includes the details of that month’s submissions for review.
- The results
review group will serve as a sanity check and a forum for comments on the
month’s submissions, but we reserve the right of final approval for
publication.
- We will not
restrict publishing results outside of the monthly review cadence, but we
will not automatically add those results to the results database.
- We may add
externally published results to our database, but will do so only after
vetting, and only on the designated day each month.
Our goal is to
strike a balance between allowing the tech press, vendors, or other testers to
publish CloudXPRT results on their own schedule and building a curated
results database that OEMs or other parties can use to compete for the best
results.
We’ll share more
details about the review group, submission dates, and publication dates soon.
Do you have questions or comments about the new process? Let us
know what you think!
A few months
ago, we wrote about the possibility of creating a datacenter XPRT. In the
intervening time, we’ve discussed the idea with folks both inside and outside the
XPRT Community. We’ve heard from vendors of datacenter products, hosting/cloud
providers, and IT professionals who use those products and services. A common
thread that emerged was the need for a cloud benchmark that can accurately
measure the performance of modern, cloud-first applications deployed on modern infrastructure
as a service (IaaS) platforms, whether those platforms are on-premises, hosted
elsewhere, or some combination of the two (hybrid clouds). Regardless of where
clouds reside, applications are increasingly using them in latency-critical,
highly available, and high-compute scenarios.
Existing datacenter benchmarks do not give a clear indication of how applications will
perform on a given IaaS infrastructure, so the benchmark should use cloud-native
components on the actual stacks used for on-prem and public cloud management.
We are planning to call the benchmark CloudXPRT. Our goal is for CloudXPRT to address the needs described above while also including the elements that have made the other XPRTs successful. We plan for CloudXPRT to
- Be relevant to on-prem (datacenter), private, and public cloud deployments
- Run on top of cloud platform software such as Kubernetes
- Include multiple workloads that address common scenarios like web
applications, AI, and media analytics
- Support multi-tier workloads
- Report relevant metrics, including both throughput and critical
latency for responsiveness-driven applications, and maximum throughput for
applications that depend on batch processing
The CloudXPRT workloads will use cloud-native components on an actual stack to provide
end-to-end performance metrics that allow users to choose the best IaaS
configuration for their business.
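
To make the metrics goal concrete, here’s a minimal sketch of how a benchmark can
derive both kinds of numbers from per-request timings: throughput as completed
requests per unit of time, and critical latency as a high-percentile response
time. This is illustrative only, not CloudXPRT code, and the timing data is
made up.

```python
# Illustrative sketch only (not CloudXPRT code).
# Derives throughput and tail latency from hypothetical per-request timings.
from statistics import quantiles

# Hypothetical (start, end) timestamps, in seconds, for five completed requests.
request_times = [(0.00, 0.12), (0.05, 0.19), (0.10, 0.21), (0.15, 0.33), (0.20, 0.31)]

latencies = [end - start for start, end in request_times]
elapsed = max(end for _, end in request_times) - min(start for start, _ in request_times)

throughput = len(request_times) / elapsed        # requests per second over the run
p95_latency = quantiles(latencies, n=100)[94]    # 95th-percentile ("tail") latency

print(f"Throughput: {throughput:.1f} req/s")
print(f"95th-percentile latency: {p95_latency * 1000:.0f} ms")
```

Responsiveness-driven applications are usually judged by a tail percentile
rather than an average, because averages hide the slow requests users actually
notice; batch-oriented applications care mainly about the sustained throughput
number.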
We’ve been building and testing preliminary versions of CloudXPRT for the last few months.
Based on the progress so far, we’re aiming to have a Community Preview of
CloudXPRT ready in mid- to late March, with a version for general availability ready
about two months later.
In the coming weeks, we’ll be sharing more information about CloudXPRT
and continuing to talk with interested parties about how they can help. We’d
love to hear what workflows would be of most interest to you and what you would
most like to see in a datacenter/cloud benchmark. Please feel free to contact us!