
Category: Hosted cloud

We’ve fixed an installation bug in the CloudXPRT Data Analytics Workload package

Yesterday, we published an updated CloudXPRT Data Analytics workload package that fixes a problem during the package installation process. CloudXPRT uses the Helm utility, which serves as a package manager for the Kubernetes container orchestration system. Helm accesses files in a default repository, and the version of Helm that we originally used with CloudXPRT tries to access files that are no longer available. We fixed the problem by updating the code to use the latest version of Helm.
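For testers who want to double-check their environment before reinstalling, one simple sanity check is to confirm which Helm client release is on the system. The sketch below shells out to the Helm CLI and assumes, purely for illustration, that any Helm 3.x release is recent enough; that threshold is our own placeholder, not an official CloudXPRT requirement, and the script is not part of the benchmark’s own installation code.

```python
import re
import subprocess

# Assumed minimum Helm major version, for illustration only.
MIN_MAJOR = 3

def helm_client_version():
    """Return the installed Helm client version as a string, e.g. '3.5.4'."""
    result = subprocess.run(
        ["helm", "version", "--short"],
        capture_output=True, text=True, check=True,
    )
    match = re.search(r"v?(\d+)\.(\d+)\.(\d+)", result.stdout)
    if not match:
        raise RuntimeError(f"Could not parse Helm version from: {result.stdout!r}")
    return ".".join(match.groups())

if __name__ == "__main__":
    version = helm_client_version()
    major = int(version.split(".")[0])
    if major < MIN_MAJOR:
        print(f"Found Helm {version}; consider upgrading before installing the workload package.")
    else:
        print(f"Found Helm {version}; version check passed.")
```

If a check like this reports an older release, installing the current Helm version and then re-running the CloudXPRT installation should clear the kind of missing-file errors described above.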

This update does not change how the benchmark workload runs and has no impact on benchmark results. We apologize if this bug caused headaches for any testers during installation, and we appreciate your patience as we worked on a fix.

As a reminder for testers interested in experimenting with the CloudXPRT Data Analytics workload, the Overview of the CloudXPRT Data Analytics Workload paper is now available. You can find links to the paper and other resources in the Helpful Info box on CloudXPRT.com and the CloudXPRT section of our XPRT white papers page.

If you have any questions, or have encountered any obstacles during testing, please let us know!

Justin

The Overview of the CloudXPRT Data Analytics Workload white paper is now available!

Today, we expand our portfolio of CloudXPRT resources with a paper on the benchmark’s data analytics workload. While we summarized the workload in the Introduction to CloudXPRT white paper, the new paper goes into much greater detail.

In addition to providing practical information about the data analytics installation package and minimum system requirements, the paper describes the workload’s test configuration variables, structural components, task workflows, and test metrics. It also discusses interpreting test results and the process for submitting results for publication.

CloudXPRT is the most complex tool in the XPRT family, and the new paper is part of our effort to create more—and better—CloudXPRT documentation. We plan to publish additional CloudXPRT white papers in the coming months, with possible future topics including the impact of adjusting specific test configuration options, recommendations for results reporting, and methods for analysis.

We hope that the Overview of the CloudXPRT Data Analytics Workload paper will serve as a go-to resource for CloudXPRT testers, and will answer any questions you have about the workload. You can find links to the paper and other resources in the Helpful Info box on CloudXPRT.com and the CloudXPRT section of our XPRT white papers page.

If you have any questions, please let us know!

Justin

Improved CloudXPRT documentation is coming soon

CloudXPRT is undoubtedly the most complex tool in the XPRT family of benchmarks. To run the cloud-native benchmark’s multiple workloads across different hardware and software platforms, testers need two things: (1) at least a passing familiarity with a wide range of cloud-related toolkits, and (2) an understanding that changing even one test configuration variable can affect test results. While the complexity of CloudXPRT makes it a powerful and flexible tool for measuring application performance on real-world IaaS stacks, it also creates a steep learning curve for new users.

Benchmark setup and configuration can involve a number of complex steps, and the corresponding instructions should be thorough, unambiguous, and intuitive to follow. For all of the XPRT tools, we strive to publish documentation that provides quick, easy-to-find answers to the questions users might have. Community members have asked us to improve the clarity and readability of the CloudXPRT setup, configuration, and individual workload documentation. In response, we are working to create more—and better—CloudXPRT documentation.

If you’re intimidated by the benchmark’s complexity, know that helping you is one of our highest priorities. In the coming weeks and months, we’ll be evaluating all of our CloudXPRT documentation, particularly from the perspective of new users, and we’ll release more information about the new documentation as it becomes available.

We also want to remind you of some of the existing CloudXPRT resources. We encourage everyone to check out the Introduction to CloudXPRT and Overview of the CloudXPRT Web Microservices Workload white papers. (Note that we’ll soon be publishing a paper on the benchmark’s data analytics workload.) Also, a couple of weeks ago, we published the CloudXPRT learning tool, which we designed to serve as an information hub for common CloudXPRT topics and questions, and to help tech journalists, OEM lab engineers, and everyone who is interested in CloudXPRT find the answers they need as quickly as possible.

Thanks to all who let us know that there was room for improvement in the CloudXPRT documentation. We rely on that kind of feedback and always welcome it. If you have any questions or suggestions regarding CloudXPRT or any of the other XPRTs, please let us know!

Justin

The CloudXPRT learning tool is now live!

We’re happy to announce that the CloudXPRT learning tool is now live! We designed the tool to serve as an information hub for common CloudXPRT topics and questions, and to help tech journalists, OEM lab engineers, and everyone who is interested in CloudXPRT find the answers they need as quickly as possible.

The tool features four primary areas of content:

  • The Q&A section provides quick answers to the questions we receive most often from testers and the tech press.
  • The CloudXPRT: the basics section describes specific topics such as the benchmark’s target platforms, workloads, companion cloud software, and hardware and software requirements.
  • The Testing and results section covers the testing process, metrics, and how to publish results.
  • The Cloud primer section provides brief, easy-to-understand definitions of key cloud computing terms and concepts.

The first screenshot below shows the home screen. To illustrate how some of the pop-up information sections appear, the second screenshot shows part of the Key terms and concepts module in the Cloud primer section. 

We’re excited about the new CloudXPRT learning tool! If you have any questions about the tool, or suggestions for additional content to include in it, please let us know!

Justin

Next up: a white paper about the CloudXPRT data analytics workload

Soon, we’ll be publishing a CloudXPRT white paper that focuses on the benchmark’s data analytics workload. We summarized the workload in the Introduction to CloudXPRT white paper, but the new paper will discuss it in much greater detail, just as the Overview of the CloudXPRT Web Microservices Workload paper did for the web microservices workload.

In addition to providing practical information about the installation package and minimum system requirements for the data analytics workload, the paper will describe test configuration variables, structural components, task workflows, and test metrics. It will also include guidance on interpreting test results and submitting them for publication.

As we’ve noted, CloudXPRT is one of the more complex tools in the XPRT family, with no shortage of topics to explore. Possible future topics include the impact of adjusting specific test configuration options, recommendations for results reporting, and methods for results analysis. If there are specific topics that you’d like us to address in future white papers, please feel free to send us your ideas!

We hope that the upcoming Overview of the CloudXPRT Data Analytics Workload paper will serve as a go-to resource for CloudXPRT testers, and will answer any questions you have about the workload. Once it goes live, we’ll provide links in the Helpful Info box on CloudXPRT.com and the CloudXPRT section of our XPRT white papers page.

If you have any questions, please let us know!

Justin

Fixes for minor CloudXPRT bugs are on the way

We want to let CloudXPRT testers know that we’re close to releasing an updated version (build 1.01) with two minor bug fixes, an improved post-test results processing script, and an adjustment to one of our test configuration recommendations. None of these changes will affect performance or test results, so scores from previous CloudXPRT builds will be comparable to those from the new build.

The most significant changes in CloudXPRT build 1.01 are as follows:

  • In previous builds, some testers testing on public-cloud platforms encountered setup warnings prompting them to update their version of Kubernetes Operations (kops) (the CloudXPRT 1.00 recommendation is kops version 1.16.0). We are adjusting the kops installation steps in the setup instructions for the web microservices and data analytics workloads to prevent these warnings (see the version-check sketch after this list).
  • In previous builds, the post-test cleanup steps for public-cloud testing environments do not always delete all of the resources that CloudXPRT creates during setup. We are updating the instructions to ensure a more thorough cleanup process. This change applies to the test instructions for the web microservices and data analytics workloads.
  • We are reformatting the optional results graphs that the web microservices postprocess program creates to make them easier to interpret.
  • In previous builds, the recommended time interval for the web microservices workload is 120 seconds if the hpamode option is enabled and 60 seconds if it is disabled. Because we’ve found that the 60-second difference has no significant impact on test results, we are changing the recommendation to 60 seconds for both hpamode settings.
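As a rough illustration of the kops item above, the sketch below compares the locally installed kops release against the version that CloudXPRT 1.00 recommends. The version parsing and the warn-only behavior are assumptions we made for this example; they are not taken from the benchmark’s own setup scripts.

```python
import re
import subprocess

# CloudXPRT 1.00 recommends kops 1.16.0 for public-cloud testing.
RECOMMENDED = (1, 16, 0)

def installed_kops_version():
    """Return the installed kops version as a tuple of ints, e.g. (1, 16, 0)."""
    output = subprocess.run(
        ["kops", "version"],
        capture_output=True, text=True, check=True,
    ).stdout
    match = re.search(r"(\d+)\.(\d+)\.(\d+)", output)
    if not match:
        raise RuntimeError(f"Could not parse kops version from: {output!r}")
    return tuple(int(part) for part in match.groups())

if __name__ == "__main__":
    found = installed_kops_version()
    found_str = ".".join(str(n) for n in found)
    wanted_str = ".".join(str(n) for n in RECOMMENDED)
    if found != RECOMMENDED:
        print(f"kops {found_str} detected; CloudXPRT 1.00 recommends kops {wanted_str}.")
    else:
        print(f"kops {found_str} matches the CloudXPRT 1.00 recommendation.")
```

Running a quick check like this before setup can help testers confirm that their public-cloud environment matches the recommendation before the benchmark’s own scripts flag a mismatch.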


We hope these changes will improve the CloudXPRT setup and testing experience. We haven’t set the release date for the updated build yet, but when we do, we’ll announce it here in the blog. If you have any questions about CloudXPRT, or would like to report bugs or other issues, please feel free to contact us!

Justin
