
Category: results submission

Default requirements for CloudXPRT results submissions

Over the past few weeks, we’ve received questions about whether we require specific test configuration settings for official CloudXPRT results submissions. Currently, testers have the option to edit up to 12 configuration options for the web microservices workload and three configuration options for the data analytics workload. Not all configuration options have an impact on testing and results, but a few of them can drastically affect key results metrics and how long it takes to complete a test. Because new CloudXPRT testers may not anticipate those outcomes, and because so many configuration permutations are possible, we’ve come up with a set of requirements for all future results submissions to our site. Please note that testers are still free to adjust all available configuration options—and define service level agreement (SLA) settings—as they see fit for their own purposes. The requirements below apply only to results that testers want to submit for publication consideration on our site, and to any resulting comparisons.

Web microservices results submission requirement

Starting with the May results submission cycle, all web microservices results submissions must have the workload.cpurequests value, which lets the user designate the number of CPU cores the workload assigns to each pod, set to 4. Currently, the benchmark supports values of 1, 2, and 4, with 4 as the default. While 1 or 2 CPU cores per pod may be more appropriate for relatively low-end systems or configurations with few vCPUs, a value of 4 is appropriate for most datacenter processors, and it often enables cloud service provider (CSP) instances to operate within the benchmark’s default maximum 95th percentile latency SLA of 3,000 milliseconds.

In future CloudXPRT releases, we may remove the option to change the workload.cpurequests value from the config.json file and simply fix the value in the benchmark’s code to promote test predictability and reasonable comparisons. For more information about configuration options for the web microservices workload, please consult the Overview of the CloudXPRT Web Microservices Workload white paper.
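As a quick illustration, here’s a minimal sketch of a pre-submission check, written in Python. It assumes, purely for illustration, that config.json stores the setting under a top-level "workload.cpurequests" key; the actual file may organize its options differently, so consult the white paper for the real schema.

```python
import json

# Illustrative assumption: config.json stores the setting under a
# top-level "workload.cpurequests" key. The real CloudXPRT config
# file may organize its options differently.
CONFIG_PATH = "config.json"

with open(CONFIG_PATH) as f:
    config = json.load(f)

# Results submitted for publication must use 4 CPU cores per pod.
if config.get("workload.cpurequests") != 4:
    print("workload.cpurequests is not set to 4; this configuration "
          "is not eligible for results submission.")
else:
    print("workload.cpurequests meets the submission requirement.")
```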

Data analytics results submission requirement

Starting with the May results submission cycle, all data analytics results submissions must have the best reported performance (throughput_jobs/min) correspond to a 95th percentile SLA latency of 90 seconds or less. We have received submissions in which the throughput was extremely high, but the 95th percentile SLA latency was up to 10 times the 90 seconds we recommend in the CloudXPRT documentation. Such high latency values may be acceptable for the unique purposes of individual testers, but they do not provide a good basis for comparison between clusters under test. For more information about configuration options for the data analytics workload, please consult the Overview of the CloudXPRT Data Analytics Workload white paper.
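For reference, here’s a rough sketch of how a tester might check a run against this requirement before submitting. It computes the 95th percentile of a set of per-job latencies with the nearest-rank method and compares it to the 90-second threshold; the latency values are made up, and CloudXPRT’s own percentile calculation may differ.

```python
import math

def p95_latency(latencies_sec):
    """Return the 95th percentile of per-job latencies (in seconds),
    using the nearest-rank method."""
    ordered = sorted(latencies_sec)
    rank = math.ceil(0.95 * len(ordered))  # smallest rank covering 95% of jobs
    return ordered[rank - 1]

# Illustrative latency values in seconds, not real CloudXPRT output.
latencies = [42.0, 55.3, 61.8, 63.0, 70.2, 77.1, 84.5, 88.9, 95.4, 101.7]
if p95_latency(latencies) <= 90.0:
    print("95th percentile latency meets the 90-second requirement.")
else:
    print("95th percentile latency exceeds 90 seconds; not eligible for submission.")
```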

We will update CloudXPRT documentation to make sure that testers know to use the default configuration settings if they plan to submit results for publication. If you have any questions about CloudXPRT or the CloudXPRT results submission process, please let us know.

Justin

The CloudXPRT results viewer is live

We’re happy to announce that the CloudXPRT results viewer is now live with results from the first few rounds of CloudXPRT Preview testing we conducted in our lab. Here are some tips to help you navigate the viewer more efficiently:

  • Click the tabs at the top of the table to switch from Data analytics workload results to Web microservices workload results.
  • Click the header of any column to sort the data on that variable. Single-click to sort A to Z, and double-click to sort Z to A.
  • Click the link in the Source/details column to visit a detailed page for that result, where you’ll find additional test configuration and system hardware information and the option to download results files.
  • By default, the viewer displays eight results per page, which you can change to 16, 48, or Show all.
  • The free-form search field above the table lets you filter for variables such as cloud service or processor.

We’ll be adding more features, including expanded filtering and sorting mechanisms, to the results viewer in the near future. We’re also investigating ways to present multiple data points in a graph format, which will allow visitors to examine performance behavior curves in conjunction with factors such as concurrency and resource utilization.

We welcome your CloudXPRT results submissions! To learn about the new submission and review process we’ll be using, take a look at last week’s blog.

If you have any questions or suggestions for ways that we can improve the results viewer, please let us know!

Justin

The CloudXPRT Preview results submission schedule

A few weeks ago, we shared the general framework of the periodic results publication process we will use for CloudXPRT. Now that the CloudXPRT Preview is live, we’re ready to share more details about the results review group; the submission, review, and publication cycles; and the schedule for the first three months.

The results review group
The CloudXPRT results review group will serve as a sanity check and a forum for comments on each month’s submissions. All registered BenchmarkXPRT Development Community members who wish to participate in the review process can join the group by contacting us via email. We’ll confirm receipt of your request and add you to the review group mailing list. Any non-members who would like to join the review group can contact us, and we’ll help them become community members.

The submission, review, and publication cycle
We will update the CloudXPRT results database once a month on a published schedule. Testers can submit results through the CloudXPRT results submission page at any time, but we will close submissions for each review cycle two weeks prior to that cycle’s publication date. One week prior to each publication date, we will email details of that month’s submissions to the results review group, along with the deadline for sending feedback.

Schedule for the first three publication cycles
We will publish results to the database on the last business day of each month and will close the submission window at 11:59 PM on the business day that falls two weeks earlier (with occasional adjustments for holidays). The schedule will be available at least six months in advance on CloudXPRT.com.

The schedule for the first three cycles is as follows:

  • July: submission deadline Friday 7/17/20; publication date Friday 7/31/20
  • August: submission deadline Monday 8/17/20; publication date Monday 8/31/20
  • September: submission deadline Wednesday 9/16/20; publication date Wednesday 9/30/20

As a reminder, members of the tech press, vendors, and other testers are free to publish CloudXPRT results at any time. We may choose to add such results to our database on the monthly publication date, after first vetting them.

We look forward to reviewing the first batch of results! If you have any questions about CloudXPRT or the results submission or review process, let us know!

Justin

The CloudXPRT Preview is here!

The CloudXPRT Preview installation packages are now available on CloudXPRT.com and the BenchmarkXPRT GitHub repository! The CloudXPRT Preview includes two workloads: web microservices and data analytics (you can find more details about the workloads here). Testers can use metrics from the workloads to compare IaaS stack (both hardware and software) performance and to evaluate whether any given stack is capable of meeting SLA thresholds. You can configure CloudXPRT to run on local datacenter, Amazon Web Services, Google Cloud Platform, or Microsoft Azure deployments.

Several different test packages are available for download from the CloudXPRT download page. For detailed installation instructions and hardware and software requirements for each, click the package’s readme link. The Helpful Info box on CloudXPRT.com also contains resources such as links to the CloudXPRT master readme and the CloudXPRT GitHub repository. Soon, we will add a link to the CloudXPRT Preview source code, which will be freely available for testers to download and review.

All interested parties may now publish CloudXPRT results. However, until we begin the formal results submission and review process in July, we will publish only results we produce in our own lab. We anticipate adding the first set of those within the coming week.

We’re thankful for all the input we received during the initial CloudXPRT development process, and we welcome feedback on the CloudXPRT Preview. If you have any questions about CloudXPRT, or would like to share your comments and suggestions, please let us know.

Justin

More information about the CloudXPRT results submission process

Earlier this month, we discussed the possibility of using a periodic results submission process for CloudXPRT instead of the traditional rolling publication process that we’ve used for the other XPRTs. We’ve received some positive responses to the idea, and while we’re still working out some details, we’re ready to share the general framework of the process we’re planning to use.

  • We will establish a results review group, which only official BenchmarkXPRT Development Community members can join.
  • We will update the CloudXPRT database with new results once a month, on a pre-published schedule.
  • Two weeks before each publication date, we will stop accepting submissions for consideration for that review cycle.
  • One week before each publication date, we will send an email to the results review group that includes the details of that month’s submissions for review.
  • The results review group will serve as a sanity check and a forum for comments on the month’s submissions, but we reserve the right of final approval for publication.
  • We will not restrict publishing results outside of the monthly review cadence, but we will not automatically add those results to the results database.
  • We may add externally published results to our database, but will do so only after vetting, and only on the designated day each month.

Our goal is to strike a balance between allowing the tech press, vendors, and other testers to publish CloudXPRT results on their own schedules, and building a curated results database that OEMs and other parties can use to compete for the best results.

We’ll share more details about the review group, submission dates, and publication dates soon. Do you have questions or comments about the new process? Let us know what you think!

Justin

Our results database, your resource

Testers who have started using the XPRT benchmarks recently may not know about one of the free resources we offer. The XPRT results database currently holds more than 2,400 test results from over 90 sources, including major tech review publications around the world, OEMs, and independent testers. It offers a wealth of current and historical performance data across all the XPRT benchmarks and hundreds of devices.

We update the results database several times a week, adding selected results from our own internal lab testing, end-of-test user submissions, and reliable tech media sources. (After you run one of the XPRTs, you can choose to submit the results, but they don’t automatically appear in the database.)

Before adding a result, we evaluate whether the score makes sense and is consistent with general expectations, which we can do only when we have sufficient system information details. For that reason, we encourage testers to disclose as much hardware and software information as possible when publishing or submitting a result.

We encourage visitors to our site to explore the XPRT results database. There are three primary ways to do so. The first is by visiting the main BenchmarkXPRT results browser, which displays results entries for all of the XPRT benchmarks in chronological order. Users can narrow the results by selecting a benchmark from the drop-down menu and can type values, such as a vendor name or the name of a tech publication, into the free-form filter field. For results we produced in our lab, clicking “PT” in the Source column takes you to a page with additional disclosure information for the test system. For sources outside our lab, clicking the source name takes you to the original article or review that contains the result.

The second way to access our published results is by visiting the results page for each individual XPRT benchmark. Go to the page of the benchmark you’re interested in, and look for the blue View Results button. Clicking it takes you to a page that displays results for only that benchmark. You can use the free-form filter on that page to narrow the results, and the Benchmarks drop-down menu to jump to the other individual XPRT results pages.

The third way to view information in our results database is with the WebXPRT Processor Comparison Chart. When we publish a new WebXPRT result, the score automatically appears in the processor comparison chart as well. For each processor, the chart shows a bar representing the average score. Mousing over the bar displays a popup indicating the number of WebXPRT results we currently have for that processor, and clicking the bar lets you view those results. You can change the number of results the chart displays on each page, and use the drop-down menu to toggle between the WebXPRT 3 and WebXPRT 2015 charts.

We hope you’ll take some time to browse the information in our results database. We welcome your feedback about what you’d like to see in the future and suggestions for improvement. Our database contains the XPRT scores that we’ve gathered, but we publish them as a resource for you. Let us know what you think!

Justin
