

Improving the CloudXPRT results viewer

This week, we made some changes to the CloudXPRT results viewer that we think will simplify the results-browsing experience and allow visitors to more quickly and easily find important data.

The first set of changes involves how we present test system information in the main results table and on the individual results details pages. We realized that there was potential for confusion around the “CPU” and “Number of nodes” categories. We removed those and created the following new fields: “Cluster components,” “Nodes (work + control plane),” and “vCPUs (work + control plane).” These new categories better describe test configurations and clarify how many CPUs engage with the workload.
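For example, a published entry might now look something like this (the values below are hypothetical and serve only to show the shape of the new fields):

    Cluster components: Ubuntu, Docker, Kubernetes
    Nodes (work + control plane): 3 + 1
    vCPUs (work + control plane): 48 + 4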

The second set of changes involves the number of data points we list in the table for each web microservices test run. Previously, we published a unique entry for each level of concurrency a test run recorded. For example, if a run scaled to 32 concurrent instances, we presented the data for each instance in its own row. This helped to show the performance curve during a single test as the workload scaled up, but it made it more difficult for visitors to identify the best throughput results from an individual run. We decided to consolidate the results from a complete test run on a single row, highlighting only the maximum number of successful requests (throughput). All the raw data from each run remains available for download on the details page for each result, but visitors don’t have to wade through all that data to find the configuration’s main “score.”
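Conceptually, the consolidation step is simple. Here’s a minimal Python sketch, assuming hypothetical per-concurrency records like the ones below (this illustrates the idea only; it is not the results viewer’s actual code):

    # Hypothetical per-concurrency data points from one web microservices run.
    # Previously, each of these appeared as its own row in the table.
    run_results = [
        {"concurrency": 8,  "successful_requests_per_sec": 410.2},
        {"concurrency": 16, "successful_requests_per_sec": 765.9},
        {"concurrency": 32, "successful_requests_per_sec": 701.3},
    ]

    # Keep only the data point with the best throughput; that becomes the
    # single row we now display for the whole run.
    best = max(run_results, key=lambda r: r["successful_requests_per_sec"])
    print(best)  # {'concurrency': 16, 'successful_requests_per_sec': 765.9}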

We view the development of the CloudXPRT results viewer as an ongoing process. As we add results and receive feedback from testers about the data presentation formats that work best for them, we’ll continue to add more features and tweak existing ones to make them as useful as possible. If you have any questions about CloudXPRT results or the results viewer, please let us know!

Justin

We’re working on an update for the AIXPRT OpenVINO workload

Shortly after the initial AIXPRT release, we noted that each of the toolkits AIXPRT uses (Intel OpenVINO, TensorFlow, NVIDIA TensorRT, and Apache MXNet) is on its own development schedule, and new versions will sometimes appear with little warning. When this happens, we’ll have to respond by updating specific AIXPRT installation packages, giving AIXPRT testers relatively short notice.

This is one of those times! Intel recently released OpenVINO 2020.3 Long-Term Support (LTS), and we’re planning to update the AIXPRT OpenVINO packages with the LTS version. The LTS version targets environments that benefit from maximum stability and don’t require a constant stream of new tools and feature changes. In other words, it’s well suited for a benchmark, and we think it’s a good fit for AIXPRT moving forward.

We don’t yet know what impact the new version will have on AIXPRT OpenVINO test results. A substantial part of the development process will involve testing the new packages on a variety of platforms to see how performance changes. We’ll communicate our findings here in the blog, so AIXPRT testers will know what to expect.

Thankfully, the modular nature of the AIXPRT installation packages ensures that we don’t need to revise the entire AIXPRT suite every time a toolkit update goes live. If you test with only TensorFlow, TensorRT, or MXNet, or a combination of those toolkits, this update won’t affect your testing.

We’re not ready to commit to a release date for the new build, but we anticipate releasing it in September.

If you have any questions about AIXPRT or OpenVINO, please let us know!

Justin

Now available: An updated CloudXPRT Preview build and source code

Today, we published an updated CloudXPRT Preview build (v0.97), along with the build’s source code. The new build fixes a few minor bugs and makes several improvements to facilitate installation, setup, and testing. The fixes do not affect CloudXPRT test results, so results from the new build are comparable to results from the original build (v0.95). You can find more detailed information about the changes in last week’s blog.

The CloudXPRT Preview v0.97 source code is available to the public via the CloudXPRT GitHub repository. As we’ve discussed in the past, publishing XPRT source code is part of our commitment to making the XPRT development process as transparent as possible. By allowing all interested parties to download and review our source code, we’re encouraging openness and honesty in the benchmarking industry and are inviting the kind of constructive feedback that helps to ensure that the XPRTs continue to contribute to a level playing field.

While the CloudXPRT source code is available to the public, our approach to derivative works differs from some open-source models. Traditional open-source models encourage developers to change products and even take them in different directions. Because benchmarking requires a product that remains static to enable valid comparisons over time, we allow people to download the source, but we reserve the right to control derivative works. This discourages a situation where someone publishes an unauthorized version of the benchmark and calls it an “XPRT.”

We encourage you to download and review the source and send us any feedback you have. Your questions and suggestions may influence future versions of CloudXPRT.

If you have any questions about CloudXPRT or the source code, please let us know!

Justin

The CloudXPRT Preview results submission schedule

A few weeks ago, we shared the general framework of the periodic results publication process we will use for CloudXPRT. Now that the CloudXPRT Preview is live, we’re ready to share more details about the results review group; the submission, review, and publication cycles; and the schedule for the first three months.

The results review group
The CloudXPRT results review group will serve as a sanity check and a forum for comments on each month’s submissions. All registered BenchmarkXPRT Development Community members who wish to participate in the review process can join the group by contacting us via email. We’ll confirm receipt of your request and add you to the review group mailing list. If you’re not yet a community member and would like to join the review group, contact us and we’ll help you become a member.

The submission, review, and publication cycle
We will update the CloudXPRT results database once a month on a published schedule. Testers can submit results through the CloudXPRT results submission page at any time, but we will close submissions for each review cycle two weeks prior to that cycle’s publication date. One week prior to each publication date, we will email details of that month’s submissions to the results review group, along with the deadline for sending feedback.

Schedule for the first three publication cycles
We will publish results to the database on the last business day of each month and will close the submission window at 11:59 PM on the business day that falls two weeks earlier (with occasional adjustments for holidays). The schedule will be available at least six months in advance on CloudXPRT.com.
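For readers who want to project future dates themselves, here’s a rough Python sketch of that rule (it ignores the occasional holiday adjustments we mentioned):

    from datetime import date, timedelta

    def last_business_day(year, month):
        """Last weekday of the month (ignores holidays)."""
        # Start from the last calendar day of the month.
        d = date(year + month // 12, month % 12 + 1, 1) - timedelta(days=1)
        while d.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
            d -= timedelta(days=1)
        return d

    def submission_deadline(year, month):
        """Two weeks before the publication date."""
        return last_business_day(year, month) - timedelta(days=14)

    pub = last_business_day(2020, 7)
    print(pub, submission_deadline(2020, 7))  # 2020-07-31 2020-07-17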

The schedule for the first three cycles is as follows:

July
Submission deadline: Friday 7/17/20
Publication date: Friday 7/31/20
August
Submission deadline: Monday 8/17/20
Publication date: Monday 8/31/20
September
Submission deadline: Wednesday 9/16/20
Publication date: Wednesday 9/30/20

As a reminder, members of the tech press, vendors, and other testers are free to publish CloudXPRT results at any time. We may choose to add such results to our database on the monthly publication date, after first vetting them.

We look forward to reviewing the first batch of results! If you have any questions about CloudXPRT or the results submission or review process, let us know!

Justin

The CloudXPRT Preview is here!

The CloudXPRT Preview installation packages are now available on CloudXPRT.com and the BenchmarkXPRT GitHub repository! The CloudXPRT Preview includes two workloads: web microservices and data analytics (you can find more details about the workloads here). Testers can use metrics from the workloads to compare IaaS stack (both hardware and software) performance and to evaluate whether any given stack is capable of meeting SLA thresholds. You can configure CloudXPRT to run on local datacenter, Amazon Web Services, Google Cloud Platform, or Microsoft Azure deployments.

Several different test packages are available for download from the CloudXPRT download page. For detailed installation instructions and hardware and software requirements for each, click the package’s readme link. The Helpful Info box on CloudXPRT.com also contains resources such as links to the CloudXPRT master readme and the CloudXPRT GitHub repository. Soon, we will add a link to the CloudXPRT Preview source code, which will be freely available for testers to download and review.

All interested parties may now publish CloudXPRT results. However, until we begin the formal results submission and review process in July, we will publish only results we produce in our own lab. We anticipate adding the first set of those results within the coming week.

We’re thankful for all the input we received during the initial CloudXPRT development process, and we welcome feedback on the CloudXPRT Preview. If you have any questions about CloudXPRT, or would like to share your comments and suggestions, please let us know.

Justin

The CloudXPRT Preview is almost here

We’re happy to announce that we’re planning to release the CloudXPRT Preview next week! After we take the CloudXPRT Preview installation and source code packages live, they will be freely available to the public via CloudXPRT.com and the BenchmarkXPRT GitHub repository. All interested parties will be able to publish CloudXPRT results. However, until we begin the formal results submission and review process in July, we will publish only results we produce in our own lab. We’ll share more information about that process and the corresponding dates here in the blog in the coming weeks.

We do have one change to report regarding the CloudXPRT workloads we announced in a previous blog post. The Preview will include the web microservices and data analytics workloads (described below), but will not include the AI-themed container scaling workload. We hope to add that workload to the CloudXPRT suite in the near future, and are still conducting testing to make sure we get it right.

If you missed the earlier workload-related post, here are the details about the two workloads that will be in the preview build:

  • In the web microservices workload, a simulated user logs in to a web application that does three things: provides a selection of stock options, performs Monte Carlo simulations with those stocks, and presents the user with options that may be of interest. The workload reports performance in transactions per second, which testers can use to directly compare IaaS stacks and to evaluate whether any given stack is capable of meeting service-level agreement (SLA) thresholds. (For a feel for this kind of simulation, see the first sketch after this list.)
  • The data analytics workload calculates XGBoost model training time. XGBoost is a gradient-boosting framework that data scientists often use for ML-based regression and classification problems. In the context of CloudXPRT, the workload evaluates how well an IaaS stack enables XGBoost to speed and optimize model training. The workload reports latency and throughput rates. As with the web microservices workload, testers can use this workload’s metrics to compare IaaS stack performance and to evaluate whether any given stack is capable of meeting SLA thresholds. (See the second sketch after this list.)
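To give a feel for the kind of computation the web microservices workload exercises, here is a toy Monte Carlo stock-price sketch in Python. It illustrates the general technique only; the model, parameters, and values are our own assumptions, not the workload’s actual code.

    import math
    import random

    def simulate_final_price(s0, mu, sigma, steps, trials):
        """Average simulated end-of-period price via geometric Brownian motion."""
        dt = 1.0 / steps
        total = 0.0
        for _ in range(trials):
            price = s0
            for _ in range(steps):
                # Random step with drift mu and volatility sigma.
                price *= math.exp((mu - 0.5 * sigma**2) * dt
                                  + sigma * math.sqrt(dt) * random.gauss(0, 1))
            total += price
        return total / trials

    # Hypothetical stock: $100 today, 5% drift, 20% volatility, 252 trading days.
    print(simulate_final_price(100.0, 0.05, 0.20, 252, 1000))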
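Similarly, here is a minimal sketch of timing XGBoost model training in Python. The synthetic dataset, parameters, and timing approach are assumptions for illustration; the actual workload uses its own data and configuration.

    import time
    import numpy as np
    import xgboost as xgb

    # Synthetic regression data standing in for the workload's real dataset.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((10000, 20))
    y = X @ rng.standard_normal(20) + rng.standard_normal(10000) * 0.1

    dtrain = xgb.DMatrix(X, label=y)
    params = {"objective": "reg:squarederror", "tree_method": "hist"}

    # The metric of interest is simply how long training takes on this stack.
    start = time.perf_counter()
    xgb.train(params, dtrain, num_boost_round=100)
    print(f"Training time: {time.perf_counter() - start:.2f} s")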

The CloudXPRT Preview provides OEMs, the tech press, vendors, and other testers with an opportunity to work with CloudXPRT directly and shape the future of the benchmark with their feedback. We hope that testers will take this opportunity to explore the tool and send us their thoughts on its structure, workload concepts and execution, ease of use, and documentation. That feedback will help us improve the relevance and accessibility of CloudXPRT testing and results for years to come.

If you have any questions about the upcoming CloudXPRT Preview, please feel free to contact us.

Justin
