
Tag Archives: white paper

Looking back on 2022 with the XPRTs

Around the beginning of each new year, we like to take the opportunity to look back and summarize the XPRT highlights from the previous year. Readers of our newsletter are familiar with the stats and updates we include each month, but for our blog readers who don’t receive the newsletter, we’ve compiled some highlights from 2022 below.

Benchmarks
In the past year, we released WebXPRT 4 and the CloudXPRT v1.2 update package.

XPRTs in the media
Journalists, advertisers, and analysts referenced the XPRTs thousands of times in 2022. It’s always rewarding to know that the XPRTs have proven to be useful and reliable assessment tools for technology publications around the world. Media sites that used the XPRTs in 2022 include AnandTech, Android Authority, Benchlife.info (China), BodNara (South Korea), ComputerBase (Germany), DISKIDEE (Belgium), eTeknix, Expert Reviews, Gadgets 360, Hardware.info (The Netherlands), Hardware Zone (Singapore), ITC.ua (Ukraine), ITmedia (Japan), Itndaily.ru (Russia), Notebookcheck, PCMag, PC-Welt (Germany), PCWorld, TechPowerUp, Tom’s Guide, TweakTown, and ZOL.com (China).

Downloads and confirmed runs
In 2022, we had more than 10,800 benchmark downloads and 183,300 confirmed runs. Users have run our most popular benchmark, WebXPRT, more than 1,135,500 times since its debut in 2013! WebXPRT continues to be a go-to, industry-standard performance benchmark for OEM labs, vendors, and leading tech press outlets around the globe.

XPRT media, tools, and publications
Part of our mission with the XPRTs is to produce tools and materials that help testers better understand the ins and outs of benchmarking in general and the XPRTs in particular. To help achieve this goal, we published a variety of new resources in 2022.

We’re thankful for everyone who used the XPRTs, joined the community, and sent questions and suggestions throughout 2022. We’re excited to see what’s in store for the XPRTs in 2023!

Justin

The Exploring WebXPRT 4 white paper is now available

This week, we published the Exploring WebXPRT 4 white paper. It describes the design and structure of WebXPRT 4, including detailed information about the benchmark’s harness, HTML5 and WebAssembly (WASM) capability checks, and changes we’ve made to the structure of the performance test workloads. We explain the benchmark’s scoring methodology, how to automate tests, and how to submit results for publication. The white paper also includes information about the third-party functions and libraries that WebXPRT 4 uses during the HTML5 and WASM capability checks and performance workloads.

The Exploring WebXPRT 4 white paper promotes the high level of transparency and disclosure that is a core value of the BenchmarkXPRT Development Community. We’ve always believed that transparency builds trust, and trust is essential for a healthy benchmarking community. That’s why we involve community members in the benchmark development process and disclose how we build our benchmarks and how they work.

You can find the paper on WebXPRT.com and our XPRT white papers page. If you have any questions about WebXPRT 4, please let us know, and be sure to check out our other XPRT white papers.

Justin

Default requirements for CloudXPRT results submissions

Over the past few weeks, we’ve received questions about whether we require specific test configuration settings for official CloudXPRT results submissions. Currently, testers can edit up to 12 configuration options for the web microservices workload and three for the data analytics workload. Not all of these options affect testing and results, but a few can drastically change key results metrics and how long it takes to complete a test. Because new CloudXPRT testers may not anticipate those outcomes, and because so many configuration permutations are possible, we’ve established a set of requirements for all future results submissions to our site. Please note that testers are still free to adjust all available configuration options, and to define service level agreement (SLA) settings, as they see fit for their own purposes. The requirements below apply only to results testers want to submit for publication consideration on our site, and to any resulting comparisons.


Web microservices results submission requirement

Starting with the May results submission cycle, all web microservices results submissions must have the workload.cpurequests value, which lets the user designate the number of CPU cores the workload assigns to each pod, set to 4. Currently, the benchmark supports values of 1, 2, and 4, with a default of 4. While 1 or 2 CPU cores per pod may be more appropriate for relatively low-end systems or configurations with few vCPUs, a value of 4 is appropriate for most datacenter processors, and it often enables CSP instances to operate within the benchmark’s default maximum 95th-percentile latency SLA of 3,000 milliseconds.

In future CloudXPRT releases, we may remove the option to change the workload.cpurequests value from the config.json file and simply fix the value in the benchmark’s code to promote test predictability and reasonable comparisons. For more information about configuration options for the web microservices workload, please consult the Overview of the CloudXPRT Web Microservices Workload white paper.
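
As a rough illustration, here is a minimal Python sketch of how a tester might pin that value before a run. It assumes workload.cpurequests appears as a flat, top-level key in config.json; the actual file may structure the setting differently, so treat this as a sketch under that assumption rather than the benchmark’s documented format.

```python
import json

CONFIG_PATH = "config.json"  # path to the CloudXPRT web microservices config file

# Load the existing configuration, pin CPU cores per pod to the required
# value of 4, and write the file back out. We assume a flat key named
# "workload.cpurequests" at the top level; if your config nests it under
# a "workload" object instead, adjust the lookup accordingly.
with open(CONFIG_PATH) as f:
    config = json.load(f)

config["workload.cpurequests"] = 4  # 1, 2, and 4 are supported; 4 is required for submissions

with open(CONFIG_PATH, "w") as f:
    json.dump(config, f, indent=2)
```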


Data analytics results submission requirement

Starting with the May results submission cycle, all data analytics results submissions must have the best reported performance (throughput_jobs/min) correspond to a 95th-percentile SLA latency of 90 seconds or less. We have received submissions where the throughput was extremely high, but the 95th-percentile SLA latency was up to 10 times the 90 seconds that we recommend in CloudXPRT documentation. High latency values may be acceptable for the unique purposes of individual testers, but they do not provide a good basis for comparison between clusters under test. For more information about configuration options for the data analytics workload, please consult the Overview of the CloudXPRT Data Analytics Workload white paper.
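
To make the requirement concrete, here is a minimal Python sketch of the selection logic: among a set of result records, only runs whose 95th-percentile latency meets the 90-second SLA are eligible, and the best reported throughput comes from that compliant set. The field names and numbers are hypothetical, not CloudXPRT’s actual output format.

```python
# Hypothetical result records: throughput in jobs/min and the corresponding
# 95th-percentile latency in seconds. Field names are illustrative only.
results = [
    {"throughput_jobs_per_min": 52.0, "p95_latency_s": 71.5},
    {"throughput_jobs_per_min": 58.3, "p95_latency_s": 88.9},
    {"throughput_jobs_per_min": 64.1, "p95_latency_s": 912.0},  # p95 latency far above 90 s
]

SLA_LATENCY_S = 90.0  # submissions must meet a 95th-percentile latency of 90 s or less

# Keep only the runs that satisfy the latency SLA, then report the best
# throughput among them; that value is the one a submission should cite.
compliant = [r for r in results if r["p95_latency_s"] <= SLA_LATENCY_S]
best = max(compliant, key=lambda r: r["throughput_jobs_per_min"])
print(f"Best compliant throughput: {best['throughput_jobs_per_min']} jobs/min")
```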

We will update CloudXPRT documentation to make sure that testers know to use the default configuration settings if they plan to submit results for publication. If you have any questions about CloudXPRT or the CloudXPRT results submission process, please let us know.

Justin

Coming soon: a white paper about the CloudXPRT web microservices workload

Soon, we’ll be expanding our portfolio of CloudXPRT resources with a white paper that focuses on the benchmark’s web microservices workload. While we summarized the workload in the Introduction to CloudXPRT white paper, the new paper will discuss the workload in much greater detail.

In addition to providing practical information about the web microservices installation packages and minimum system requirements, the paper describes the workload’s test configuration variables, structural components, task workflows, and test metrics. It also discusses interpreting test results and the process for submitting results for publication.

As we’ve noted, CloudXPRT is one of the more complex tools in the XPRT family, with no shortage of topics to explore further. We plan to publish a companion overview for the data analytics workload, and possible future topics include the impact of adjusting specific test configuration options, recommendations for results reporting, and methods for analysis.

We hope that the upcoming Overview of the CloudXPRT Web Microservices Workload paper will serve as a go-to resource for CloudXPRT testers, and will answer any questions you have about the workload. Once it goes live, we’ll provide links in the Helpful Info box on CloudXPRT.com and the CloudXPRT section of our XPRT white papers page.

If you have any questions, please let us know!

Justin

The Introduction to CloudXPRT white paper is now available!

Today, we published the Introduction to CloudXPRT white paper. The paper provides an overview of our latest benchmark and consolidates CloudXPRT-related information that we’ve published in the XPRT blog over the past several months. It describes the CloudXPRT workloads, explains how to choose and download installation packages and how to submit CloudXPRT results for publication, and discusses possibilities for additional development in the coming months.

CloudXPRT is one of the most complex tools in the XPRT family, and there are more CloudXPRT-related topics to discuss than we could fit in this first paper. In future white papers, we will discuss in greater detail each of the benchmark workloads, the range of test configuration options, results reporting, and methods for analysis.

We hope that Introduction to CloudXPRT will provide testers who are interested in CloudXPRT with a solid foundation of understanding on which they can build. Moving forward, we will provide links to the paper in the Helpful Info box on CloudXPRT.com and the CloudXPRT section of our XPRT white papers page.

If you have any questions about CloudXPRT, please let us know!

Justin

The Introduction to AIXPRT white paper is now available!

Today, we published the Introduction to AIXPRT white paper. The paper serves as an overview of the benchmark and a consolidation of AIXPRT-related information that we’ve published in the XPRT blog over the past several months. For folks who are completely new to AIXPRT and veteran testers who need to brush up on pre-test configuration procedures, we hope this paper will be a quick, one-stop reference that helps reduce the learning curve.

The paper describes the AIXPRT toolkits and workloads, and it explains how to adjust key test parameters (batch size, level of precision, number of concurrent instances, and default number of requests), use alternate test configuration files, understand and submit results, and access the source code.
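
As a loose illustration of those four parameters, the Python sketch below writes a hypothetical alternate configuration file. The field names are assumptions for illustration only; consult the white paper for the exact schema your AIXPRT toolkit expects.

```python
import json

# Hypothetical alternate test configuration illustrating the four key
# parameters the paper covers. These field names are illustrative, not
# AIXPRT's actual schema.
alternate_config = {
    "batch_sizes": [1, 2, 4, 8],   # inputs processed per inference request
    "precision": "int8",           # level of precision, e.g., fp32, fp16, or int8
    "concurrent_instances": 2,     # inference instances run in parallel
    "total_requests": 100,         # number of requests per workload run
}

# Write the sketch out as a standalone file a tester could point the
# benchmark at in place of the default configuration.
with open("alternate_config.json", "w") as f:
    json.dump(alternate_config, f, indent=2)
```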

We hope that Introduction to AIXPRT will prove to be a valuable resource. Moving forward, readers will be able to access the paper from the Helpful Info box on AIXPRT.com and the AIXPRT section of our XPRT white papers page. If you have any questions about AIXPRT, please let us know!

Justin
