
Category: Datacenter

Coming soon: a white paper about the CloudXPRT web microservices workload

Soon, we’ll be expanding our portfolio of CloudXPRT resources with a white paper that focuses on the benchmark’s web microservices workload. While we summarized the workload in the Introduction to CloudXPRT white paper, the new paper will discuss the workload in much greater detail.

In addition to providing practical information about the web microservices installation packages and minimum system requirements, the paper will describe the workload’s test configuration variables, structural components, task workflows, and test metrics. It will also discuss how to interpret test results and the process for submitting results for publication.

As we’ve noted, CloudXPRT is one of the more complex tools in the XPRT family, with no shortage of topics to explore further. We plan to publish a companion overview for the data analytics workload, and possible future topics include the impact of adjusting specific test configuration options, recommendations for results reporting, and methods for analysis.

We hope that the upcoming Overview of the CloudXPRT Web Microservices Workload paper will serve as a go-to resource for CloudXPRT testers, and will answer any questions you have about the workload. Once it goes live, we’ll provide links in the Helpful Info box on CloudXPRT.com and the CloudXPRT section of our XPRT white papers page.

If you have any questions, please let us know!

Justin

CloudXPRT version 1.0 is here!

The CloudXPRT Preview period has ended, and CloudXPRT version 1.0 installation packages are now available on CloudXPRT.com and the BenchmarkXPRT GitHub repository! Like the Preview build, CloudXPRT version 1.0 includes two workloads: web microservices and data analytics (you can find more details about the workloads here). Testers can use metrics from the workloads to compare IaaS stack (both hardware and software) performance and to evaluate whether any given stack is capable of meeting SLA thresholds. You can configure CloudXPRT to run on local datacenter, Amazon Web Services, Google Cloud Platform, or Microsoft Azure deployments.
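To make the SLA idea concrete, here is a minimal Python sketch of the kind of evaluation these metrics support. The numbers, data layout, and threshold below are hypothetical illustrations, not CloudXPRT’s actual output format:

# Hypothetical per-load-level measurements from a web microservices run:
# (concurrent instances, throughput in requests/second, 95th-percentile latency in ms)
results = [
    (8, 410, 820),
    (16, 760, 1450),
    (32, 1180, 2900),
    (48, 1240, 4100),
]

SLA_LATENCY_MS = 3000  # hypothetical service-level threshold

# A stack "meets" the SLA at the highest throughput whose latency
# stays at or under the threshold.
passing = [r for r in results if r[2] <= SLA_LATENCY_MS]
best = max(passing, key=lambda r: r[1]) if passing else None
print(best)  # -> (32, 1180, 2900): best SLA-compliant throughput

In this illustration, the 48-instance level delivers more raw throughput, but its latency breaks the threshold, so the 32-instance level is the best SLA-compliant result.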

Several different test packages are available for download from the CloudXPRT download page. For detailed installation instructions and hardware and software requirements for each, click the package’s readme link. On CloudXPRT.com, the Helpful Info box contains resources such as links to the Introduction to CloudXPRT white paper, the CloudXPRT master readme, and the CloudXPRT GitHub repository.

The GitHub repository also contains the CloudXPRT source code, which is freely available for testers to download and review.

Performance results from this release are comparable to those from the CloudXPRT Preview build. Testers who wish to publish results on CloudXPRT.com can find more information about the results submission and review process in the blog. We post the monthly results cycle schedule on the results submission page.

We’re thankful for all the input we received during the CloudXPRT development process and Preview period. If you have any questions about CloudXPRT, please let us know.

Justin

The ongoing evolution of the BenchmarkXPRT Development Community

This November will mark the tenth anniversary of the BenchmarkXPRT Development Community, which we originally called the HDXPRT Development Community. Since the early days of HDXPRT, our community has grown to include about 275 members from over 85 companies and organizations, and we’ve added seven benchmarks to the XPRT family. We initially mailed HDXPRT DVDs to testers interested in a new way to evaluate PC performance, and now thousands of users around the world download our benchmarks and rely on them to help measure the performance of everything from tablets to laptops to high-end datacenter hardware.

As the XPRTs continue to grow and evolve, we’ve worked to make sure that the resources that we offer—and the ways we offer them—continue to meet the needs of XPRT testers and community members. As we expand in the AI and datacenter spaces with AIXPRT and CloudXPRT, our user group is becoming larger and more diverse than ever. We have already made some changes to better serve this expanding group, and will be making additional changes over the months ahead.

The first set of changes relates to our community membership model. Originally, membership in the BenchmarkXPRT Development Community required a $20 fee and provided access to preview versions of new benchmarks, the ability to submit ideas for future benchmarks, and regular updates through our monthly newsletter and community announcements. To remove the financial obstacle to joining, we introduced a fee waiver process a few years ago.

Also, we know that some OEM employees and members of the tech press are interested in the XPRTs, but are unable to join the community for one reason or another. With these people in mind, we recently experimented with making the CloudXPRT Preview publicly available. Releasing preview builds to all who are interested makes it more likely that users will incorporate the XPRTs into their test suites, and we have decided to adopt this practice for other benchmarks going forward.

In the coming months, we’ll be updating parts of our website to increase access to XPRT content. For example, certain content, such as the source code for most of the XPRTs, is currently available only to members. We plan to remove the login requirement for access to this material.

Please keep in mind that membership in the BenchmarkXPRT Development Community continues to offer exclusive opportunities. Members can join groups such as the CloudXPRT Results Review Group and offer direct input into the design of future benchmarks. Members also receive our monthly newsletters.

If you have any questions about the XPRTs or community membership, please feel free to ask!

Justin

Improving the CloudXPRT results viewer

This week, we made some changes to the CloudXPRT results viewer that we think will simplify the results-browsing experience and allow visitors to more quickly and easily find important data.

The first set of changes involves how we present test system information in the main results table and on the individual results details pages. We realized that there was potential for confusion around the “CPU” and “Number of nodes” categories. We removed those and created the following new fields: “Cluster components,” “Nodes (work + control plane),” and “vCPUs (work + control plane).” These new categories better describe test configurations and clarify how many CPUs engage with the workload.

The second set of changes involves the number of data points we list in the table for each web microservices test run. Previously, we published a unique entry for each level of concurrency that a test run recorded. For example, if a run scaled to 32 concurrent instances, we presented the data for each instance in its own row. This helped show the performance curve during a single test as the workload scaled up, but it made it more difficult for visitors to identify the best throughput results from an individual run. We decided to consolidate the results from a complete test run on a single row, highlighting only the maximum number of successful requests (throughput). All the raw data from each run remains available for download on the details page for each result, but visitors no longer have to wade through all that data to find the configuration’s main “score.”
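In rough terms, the consolidation works like the following Python sketch. The field names and values are hypothetical stand-ins, not the viewer’s actual data model:

# Hypothetical raw rows: one per concurrency level within each test run.
raw_rows = [
    {"run_id": "run-01", "concurrency": 8,  "successful_requests": 410},
    {"run_id": "run-01", "concurrency": 16, "successful_requests": 760},
    {"run_id": "run-01", "concurrency": 32, "successful_requests": 1180},
    {"run_id": "run-02", "concurrency": 8,  "successful_requests": 395},
    {"run_id": "run-02", "concurrency": 16, "successful_requests": 742},
]

# Collapse to one summary row per run, keeping the maximum throughput.
best_by_run = {}
for row in raw_rows:
    best = best_by_run.get(row["run_id"])
    if best is None or row["successful_requests"] > best["successful_requests"]:
        best_by_run[row["run_id"]] = row

for row in best_by_run.values():
    print(row)  # one row per test run in the main table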

We view the development of the CloudXPRT results viewer as an ongoing process. As we add results and receive feedback from testers about the data presentation formats that work best for them, we’ll continue to add more features and tweak existing ones to make them as useful as possible. If you have any questions about CloudXPRT results or the results viewer, please let us know!

Justin

The CloudXPRT Preview results submission schedule

A few weeks ago, we shared the general framework of the periodic results publication process we will use for CloudXPRT. Now that the CloudXPRT Preview is live, we’re ready to share more details about the results review group; the submission, review, and publication cycles; and the schedule for the first three months.

The results review group
The CloudXPRT results review group will serve as a sanity check and a forum for comments on each month’s submissions. All registered BenchmarkXPRT Development Community members who wish to participate in the review process can join the group by contacting us via email. We’ll confirm receipt of your request and add you to the review group mailing list. If you’d like to join the review group but aren’t yet a community member, contact us and we’ll help you become one.

The submission, review, and publication cycle
We will update the CloudXPRT results database once a month on a published schedule. Testers can submit results through the CloudXPRT results submission page at any time, but we will close submissions for each review cycle two weeks prior to that cycle’s publication date. One week prior to each publication date, we will email details of that month’s submissions to the results review group, along with the deadline for sending pre-publication feedback.

Schedule for the first three publication cycles
We will publish results to the database on the last business day of each month and will close the submission window at 11:59 PM on the business day that falls two weeks earlier (with occasional adjustments for holidays). The schedule will be available at least six months in advance on CloudXPRT.com.
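As a rough illustration of how those dates fall, here is a minimal Python sketch that uses only the standard library; it ignores the occasional holiday adjustments mentioned above:

import calendar
from datetime import date, timedelta

def last_business_day(year, month):
    # Start from the last calendar day of the month...
    day = date(year, month, calendar.monthrange(year, month)[1])
    # ...and step back past Saturday (weekday 5) and Sunday (weekday 6).
    while day.weekday() >= 5:
        day -= timedelta(days=1)
    return day

def submission_deadline(publication_day):
    # Two weeks (14 days) earlier lands on the same weekday, so if the
    # publication date is a business day, the deadline is one too.
    return publication_day - timedelta(days=14)

for month in (7, 8, 9):
    pub = last_business_day(2020, month)
    print(submission_deadline(pub).strftime("%a %m/%d/%y"),
          "->", pub.strftime("%a %m/%d/%y"))

Running it reproduces the dates in the schedule below.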

The schedule for the first three cycles is as follows:

July
Submission deadline: Friday 7/17/20
Publication date: Friday 7/31/20
August
Submission deadline: Monday 8/17/20
Publication date: Monday 8/31/20
September
Submission deadline: Wednesday 9/16/20
Publication date: Wednesday 9/30/20

As a reminder, members of the tech press, vendors, and other testers are free to publish CloudXPRT results at any time. We may choose to add such results to our database on the monthly publication date, after first vetting them.

We look forward to reviewing the first batch of results! If you have any questions about CloudXPRT or the results submission or review process, let us know!

Justin

The CloudXPRT Preview is here!

The CloudXPRT Preview installation packages are now available on CloudXPRT.com and the BenchmarkXPRT GitHub repository! The CloudXPRT Preview includes two workloads: web microservices and data analytics (you can find more details about the workloads here). Testers can use metrics from the workloads to compare IaaS stack (both hardware and software) performance and to evaluate whether any given stack is capable of meeting SLA thresholds. You can configure CloudXPRT to run on local datacenter, Amazon Web Services, Google Cloud Platform, or Microsoft Azure deployments.

Several different test packages are available for download from the CloudXPRT download page. For detailed installation instructions and hardware and software requirements for each, click the package’s readme link. The Helpful Info box on CloudXPRT.com also contains resources such as links to the CloudXPRT master readme and the CloudXPRT GitHub repository. Soon, we will add a link to the CloudXPRT Preview source code, which will be freely available for testers to download and review.

All interested parties may now publish CloudXPRT results. However, until we begin the formal results submission and review process in July, we will publish only results we produce in our own lab. We anticipate adding the first set of those within the coming week.

We’re thankful for all the input we received during the initial CloudXPRT development process, and we welcome feedback on the CloudXPRT Preview. If you have any questions about CloudXPRT, or would like to share your comments and suggestions, please let us know.

Justin
