

Improving WebXPRT-related tools and resources

As we move forward with the WebXPRT 4 development process, we’re also working on ways to enhance the value of WebXPRT beyond simply updating the benchmark. Our primary goal is to expand and improve the WebXPRT-related tools and resources we offer at WebXPRT.com, starting with a new results viewer.

Currently, users can view WebXPRT results on our site in two primary ways, each of which has advantages and limitations.

The first way is the WebXPRT results viewer, which includes hundreds of PT-curated performance scores from a wide range of trusted sources and devices. Users can sort entries by device type, device name, device model, overall score, date of publication, and source. The viewer also includes a free-form filter for quick, targeted searches. While the results viewer contains a wealth of information, it does not let users view and compare multiple results at once in graphs or charts. It also offers no easy way to access the additional device data and subtest scores we have for many entries.
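
To give a concrete sense of what those sort and filter operations do, here is a minimal Python sketch of the same ideas. The Result fields here are hypothetical stand-ins for illustration; they are not the viewer's actual schema.

```python
from dataclasses import dataclass

# Hypothetical record layout for illustration only; this is not
# the results viewer's actual schema.
@dataclass
class Result:
    device_type: str
    device_name: str
    device_model: str
    overall_score: float
    publication_date: str  # e.g., "2021-07-16"
    source: str

def free_form_filter(results, text):
    """Keep entries where any field contains the search text."""
    text = text.lower()
    return [r for r in results
            if any(text in str(value).lower() for value in vars(r).values())]

def sort_by_overall_score(results, descending=True):
    """Sort entries by overall score, highest first by default."""
    return sorted(results, key=lambda r: r.overall_score, reverse=descending)
```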

The second way to view WebXPRT results on our site is the WebXPRT Processor Comparison Chart. The chart uses horizontal bar graphs to compare test scores from the hundreds of published results in our database, grouped by processor type. Users can click the average score for a processor to view all the WebXPRT results we currently have for that processor. The chart's visual format and its automated "group by processor type" feature are very useful, but the chart lacks the sorting and filtering capabilities of the viewer, and navigating to the details of an individual test takes multiple clicks.
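
Conceptually, the chart's grouping feature reduces to a group-and-average step like the one in this hypothetical Python sketch (the record keys are assumptions, not our actual database fields):

```python
from collections import defaultdict
from statistics import mean

def average_scores_by_processor(results):
    """Group overall scores by processor type and average each group,
    mirroring the chart's group-by-processor behavior. The dictionary
    keys shown are hypothetical stand-ins."""
    groups = defaultdict(list)
    for result in results:
        groups[result["processor"]].append(result["overall_score"])
    return {processor: mean(scores) for processor, scores in groups.items()}
```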

In the coming months, we’ll be working to combine the best features of the results viewer and the comparison chart into a single powerful WebXPRT results database tool. We’ll also be investigating ways to add new visual aids, navigation controls, and data-handling capabilities to that tool. We want to provide a tool that helps testers and analysts access the wealth of WebXPRT test information in our database in an efficient, productive, and enjoyable way. If you have ideas or comments about what you’d like to see in a new WebXPRT results viewing tool, please let us know!

Justin

We welcome your CloudXPRT results!

We recently published a set of CloudXPRT Data Analytics and Web Microservices workload test results submitted by Quanta Computer, Inc. The Quanta submission is the first set of CloudXPRT results that we’ve published using the formal results submission and approval process. We’re grateful to the Quanta team for carefully following the submission guidelines, enabling us to complete the review process without a hitch.

If you are unfamiliar with the process, you can find general information about how we review submissions in a previous blog post. Detailed, step-by-step instructions are available on the results submission page. As a reminder for testers who are considering submitting results for July, the submission deadline is tomorrow, Friday July 16, and the publication date is Friday July 30. We list the submission and publication dates for the rest of 2021 below. Please note that we do not plan to review submissions in December, so if we receive results submissions after November 30, we may not publish them until the end of January 2022.

  • August: submission deadline Tuesday 8/17/21; publication date Tuesday 8/31/21
  • September: submission deadline Thursday 9/16/21; publication date Thursday 9/30/21
  • October: submission deadline Friday 10/15/21; publication date Friday 10/29/21
  • November: submission deadline Tuesday 11/16/21; publication date Tuesday 11/30/21
  • December: no submission review or publication (see the note above)

If you have any questions about the CloudXPRT results submission, review, or publication process, please let us know!

Justin

How to submit WebXPRT results for publication

It’s been a while since we last discussed the process for submitting WebXPRT results to be considered for publication in the WebXPRT results viewer and the WebXPRT Processor Comparison Chart, so we thought we’d offer a refresher.

Unlike sites that publish all results they receive, we hand-select results from internal lab testing, user submissions, and reliable tech media sources. In each case, we evaluate whether the score is consistent with general expectations. For sources outside of our lab, that evaluation includes confirming that there is enough detailed system information to help us determine whether the score makes sense. We do this for every score on the WebXPRT results page and the general XPRT results page. All WebXPRT results we publish automatically appear in the processor comparison chart as well.

Submitting your score is quick and easy. At the end of the WebXPRT test run, click the Submit your results button below the overall score, complete the short submission form, and click Submit again. The screenshot below shows how the form would look if I submitted a score at the end of a WebXPRT 3 run on my personal system.

After you submit your score, we’ll contact you to confirm how we should display the source. You can choose one of the following:

  • Your first and last name
  • “Independent tester” (for those who wish to remain anonymous)
  • Your company’s name, provided that you have permission to submit the result in its name. To use a company name, we ask that you provide a valid company email address.


We will not publish any additional information about you or your company without your permission.

We look forward to seeing your score submissions, and if you have suggestions for the processor chart or any other aspect of the XPRTs, let us know!

Justin

Publishing CloudXPRT results from testing on pre-production gear

We recently received questions about whether we accept CloudXPRT results submissions from testing on pre-production gear, and how we would handle any differences between results from pre-production and production-level tests.  

To answer the first question: we are not opposed to pre-production results submissions. We realize that vendors often want to include benchmark results in launch-oriented marketing materials they release before their hardware or software is publicly available. To help them do so, we’re happy to consider pre-production submissions on a case-by-case basis. All such submissions must follow the normal CloudXPRT results submission process and undergo vetting by the CloudXPRT Results Review Group according to the standard review and publication schedule. If we decide to publish pre-production results on our site, we will clearly note their pre-production status.

In response to the second question, the CloudXPRT Results Review Group will handle any challenges to published results or perceived discrepancies between pre-production and production-level results on a case-by-case basis. We do not currently have a formal process for challenges; anyone who would like to initiate a challenge or express comments or concerns about a result should address the review group via benchmarkxprtsupport@principledtechnologies.com. Our primary concern is always to ensure that published results accurately reflect the performance characteristics of production-level hardware and software. If it becomes necessary to develop more policies in the future, we’ll do so, but we want to keep things as simple as possible.

If you have any questions about the CloudXPRT results submission process, please let us know!

Justin

Default requirements for CloudXPRT results submissions

Over the past few weeks, we’ve received questions about whether we require specific test configuration settings for official CloudXPRT results submissions. Currently, testers can edit up to 12 configuration options for the web microservices workload and three for the data analytics workload. Not all of these options affect testing and results, but a few can drastically change key results metrics and how long a test takes to complete. Because new CloudXPRT testers may not anticipate those outcomes, and because so many configuration permutations are possible, we’ve established a set of requirements for all future results submissions to our site. Please note that testers are still free to adjust all available configuration options, and to define service level agreement (SLA) settings, as they see fit for their own purposes. The requirements below apply only to results testers want to submit for publication consideration on our site, and to any resulting comparisons.


Web microservices results submission requirement

Starting with the May results submission cycle, all web microservices results submissions must have the workload.cpurequests value, which designates the number of CPU cores the workload assigns to each pod, set to 4. Currently, the benchmark supports values of 1, 2, and 4, with a default of 4. While 1 or 2 CPU cores per pod may be more appropriate for relatively low-end systems or configurations with few vCPUs, a value of 4 is appropriate for most datacenter processors, and it often enables CSP instances to operate within the benchmark’s default maximum 95th percentile latency SLA of 3,000 milliseconds.

In future CloudXPRT releases, we may remove the option to change the workload.cpurequests value from the config.json file and simply fix the value in the benchmark’s code to promote test predictability and reasonable comparisons. For more information about configuration options for the web microservices workload, please consult the Overview of the CloudXPRT Web Microservices Workload white paper.
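
In the meantime, testers preparing a submission can run a quick pre-flight check along these lines. This is a minimal sketch that assumes a simple nested JSON layout; the real config.json structure may differ, so treat the lookup as illustrative:

```python
import json

REQUIRED_CPU_REQUESTS = 4  # required value for results submissions to our site

def check_cpu_requests(path="config.json"):
    """Confirm that workload.cpurequests is set to the required value.
    The nested layout assumed here is illustrative; adjust the lookup
    to match the actual structure of your config.json."""
    with open(path) as f:
        config = json.load(f)
    value = config.get("workload", {}).get("cpurequests")
    if value != REQUIRED_CPU_REQUESTS:
        raise ValueError(f"workload.cpurequests is {value}; "
                         f"submissions require {REQUIRED_CPU_REQUESTS}")
    print("workload.cpurequests meets the submission requirement")
```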


Data analytics results submission requirement

Starting with the May results submission cycle, all data analytics results submissions must have the best reported performance (throughput_jobs/min) correspond to a 95th percentile SLA latency of 90 seconds or less. We have received submissions where the throughput was extremely high, but the 95th percentile SLA latency was up to 10 times the 90 seconds that we recommend in CloudXPRT documentation. High latency values may be acceptable for the unique purposes of individual testers, but they do not provide a good basis for comparison between clusters under test. For more information about configuration options with the data analytics workload, please consult the Overview of the CloudXPRT Data Analytics Workload white paper.
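
In other words, the throughput figure you report must come from a run that also meets the 90-second latency bound. The selection rule looks like this hypothetical Python sketch (the field names are assumptions, not actual CloudXPRT output keys):

```python
MAX_P95_LATENCY_SECONDS = 90  # latency bound for submitted results

def best_compliant_throughput(runs):
    """Return the run with the highest throughput (jobs/min) among runs
    whose 95th percentile latency meets the 90-second bound, or None
    if no run qualifies. Field names are hypothetical."""
    compliant = [r for r in runs
                 if r["p95_latency_seconds"] <= MAX_P95_LATENCY_SECONDS]
    if not compliant:
        return None
    return max(compliant, key=lambda r: r["throughput_jobs_per_min"])
```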

We will update CloudXPRT documentation to make sure that testers know to use the default configuration settings if they plan to submit results for publication. If you have any questions about CloudXPRT or the CloudXPRT results submission process, please let us know.

Justin

The CloudXPRT results viewer is live

We’re happy to announce that the CloudXPRT results viewer is now live with results from the first few rounds of CloudXPRT Preview testing we conducted in our lab. Here are some tips to help you to navigate the viewer more efficiently:

  • Click the tabs at the top of the table to switch from Data analytics workload results to Web microservices workload results.
  • Click the header of any column to sort the data on that variable: single-click to sort A to Z, and double-click to sort Z to A.
  • Click the link in the Source/details column to visit a detailed page for that result, where you’ll find additional test configuration and system hardware information and the option to download results files.
  • By default, the viewer displays eight results per page, which you can change to 16, 48, or Show all.
  • The free-form search field above the table lets you filter for variables such as cloud service or processor.

We’ll be adding more features, including expanded filtering and sorting mechanisms, to the results viewer in the near future. We’re also investigating ways to present multiple data points in a graph format, which will allow visitors to examine performance behavior curves in conjunction with factors such as concurrency and resource utilization.

We welcome your CloudXPRT results submissions! To learn about the new submission and review process we’ll be using, take a look at last week’s blog.

If you have any questions or suggestions for ways that we can improve the results viewer, please let us know!

Justin
