
Tag Archives: benchmark

CloudXPRT is on the way

A few months ago, we wrote about the possibility of creating a datacenter XPRT. In the intervening time, we’ve discussed the idea with folks both in and outside of the XPRT Community. We’ve heard from vendors of datacenter products, hosting/cloud providers, and IT professionals that use those products and services.

The common thread that emerged was the need for a cloud benchmark that can accurately measure the performance of modern, cloud-first applications deployed on modern infrastructure as a service (IaaS) platforms, whether those platforms are on-premises, hosted elsewhere, or some combination of the two (hybrid clouds). Regardless of where clouds reside, applications are increasingly using them in latency-critical, highly available, and high-compute scenarios.

Existing datacenter benchmarks do not give a clear indication of how applications will perform on a given IaaS infrastructure. To do so, a benchmark should use cloud-native components on the actual stacks used for on-premises and public cloud management.

We are planning to call the benchmark CloudXPRT. Our goal is for CloudXPRT to address the needs described above while also including the elements that have made the other XPRTs successful. We plan for CloudXPRT to

  • Be relevant to on-prem (datacenter), private, and public cloud deployments
  • Run on top of cloud platform software such as Kubernetes
  • Include multiple workloads that address common scenarios like web applications, AI, and media analytics
  • Support multi-tier workloads
  • Report relevant metrics, including both throughput and critical latency for responsiveness-driven applications, and maximum throughput for applications that depend on batch processing
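To make the two metric types in that last bullet concrete, here is a minimal Python sketch (ours, not CloudXPRT code) that derives both from a list of per-request timings: throughput for batch-oriented work, and a tail-latency percentile for responsiveness-driven work.

```python
def throughput(num_requests, window_seconds):
    """Requests completed per second over a measurement window."""
    return num_requests / window_seconds

def percentile_latency(latencies_ms, pct=95):
    """Tail latency: the value below which pct% of observed latencies fall."""
    ordered = sorted(latencies_ms)
    # Nearest-rank method: pick the observation at the pct-th percentile rank.
    rank = max(0, int(round(pct / 100 * len(ordered))) - 1)
    return ordered[rank]

latencies = [12, 15, 11, 48, 14, 13, 95, 16, 12, 14]  # milliseconds
print(throughput(len(latencies), 2.0))    # 5.0 requests per second
print(percentile_latency(latencies, 95))  # 95 ms
```

A batch workload cares mostly about the first number; an interactive web application cares about the second, because a handful of slow outliers dominates the user experience even when the average looks fine.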

CloudXPRT’s workloads will use cloud-native components on an actual stack to provide end-to-end performance metrics that allow users to choose the best IaaS configuration for their business.

We’ve been building and testing preliminary versions of CloudXPRT for the last few months. Based on the progress so far, we are shooting to have a Community Preview of CloudXPRT ready in mid- to late-March with a version for general availability ready about two months later.

Over the coming weeks, we’ll be working on getting out more information about CloudXPRT and continuing to talk with interested parties about how they can help. We’d love to hear what workflows would be of most interest to you and what you would most like to see in a datacenter/cloud benchmark. Please feel free to contact us!

Bill

The XPRTs in 2019: Looking back on an exciting and productive year

2019 is winding down, and we want to take this opportunity to review another exciting and productive year for the BenchmarkXPRT Development Community. Readers of our newsletter are familiar with the stats and updates we post in each month’s mailing, but we know that not all our blog readers receive the newsletter, so we’ve compiled the highlights below.

Trade shows
Earlier this year, Justin attended CES in Las Vegas and Mark travelled to MWC Barcelona. These shows help us keep up with the latest industry trends and gather insights that help to lay the groundwork for XPRT development in the years ahead.

Benchmarks
In the past year, we released MobileXPRT 3, HDXPRT 4, and AIXPRT, our new AI benchmark tool that helps you evaluate a system’s machine learning inference performance. There’s much more to come in 2020 with AIXPRT and several other projects, so expect more news about benchmark development early in the year.

Web mentions
In 2019 so far, journalists, advertisers, and analysts have referenced the XPRTs over 5,000 times, including mentions in more than 190 articles and 1,350 device reviews. This represents a more than 50% increase over 2018.

Downloads and confirmed runs
To date, we’ve had more than 24,800 benchmark downloads and 153,000 confirmed runs in 2019, increases of more than 8% and 10%, respectively, over 2018. Within the last month, our most popular benchmark, WebXPRT, passed the 500,000-run milestone! WebXPRT continues to be an industry-standard performance benchmark upon which OEM labs, vendors, and leading tech press outlets rely.

XPRT Tech Spotlight
We put 47 new devices in the XPRT Tech Spotlight throughout the year and published updated back-to-school, Black Friday, and holiday showcases to help buyers compare devices.

Media and interactive tools
We published a new XPRTs around the world infographic and an interactive AIXPRT installation package selector tool. We’ve received a lot of positive feedback about the tool. We encourage you to give it a try if you’re curious about AIXPRT but aren’t sure how to get started.

We’re thankful for everyone who used the XPRTs, joined the community, and sent questions and suggestions throughout 2019. This will be our last blog post for 2019, but there’s much more to come in 2020, including some exciting new developments. Stay tuned in early January for updates!

Justin

AIXPRT’s unique development path

With four separate machine learning toolkits on their own development schedules, three workloads, and a wide range of possible configurations and use cases, AIXPRT has more moving parts than any of the XPRT benchmark tools to date. Because there are so many different components, and because we want AIXPRT to provide consistently relevant evaluation data in the rapidly evolving AI and machine learning spaces, we anticipate a cadence of AIXPRT updates in the future that will be more frequent than the schedules we’ve used for other XPRTs in the past. With that expectation in mind, we want to let AIXPRT testers know that when we release an AIXPRT update, they can expect minimized disruption, consideration for their testing needs, and clear communication.

Minimized disruption

Each AIXPRT toolkit (Intel OpenVINO, TensorFlow, NVIDIA TensorRT, and Apache MXNet) is on its own development schedule, and we won’t always have a lot of advance notice when new versions are on the way. Hypothetically, a new version of OpenVINO could release one month, and a new version of TensorRT just two months later. Thankfully, the modular nature of AIXPRT’s installation packages ensures that we won’t need to revise the entire AIXPRT suite every time a toolkit update goes live. Instead, we’ll update each package individually when necessary. This means that if you only test with a single AIXPRT package, updates to the other packages won’t affect your testing. For us to maintain AIXPRT’s relevance, there’s unfortunately no way to avoid all disruption, but we’ll work to keep it to a minimum.

Consideration for testers

As we move forward, when software compatibility issues force us to update an AIXPRT package, we may discover that the update has a significant effect on results. If we find that results from the new package are no longer comparable to those from previous tests, we’ll share the differences that we’re seeing in our lab. As always, we will use documentation and versioning to make sure that testers know what to expect and that there’s no confusion about which package to use.

Clear communication

When we update any package, we’ll make sure to communicate any updates in the new build as clearly as possible. We’ll document all changes thoroughly in the package readmes, and we’ll talk through significant updates here in the blog. We’re also available to answer questions about AIXPRT and any other XPRT-related topic, so feel free to ask!

Justin

Planning for the next CrXPRT

We’re currently planning the next version of CrXPRT, our benchmark that evaluates the performance and battery life of Chromebooks. If you’re unfamiliar with CrXPRT, you can find out more about how it works both here in the blog and at CrXPRT.com. If you’ve used CrXPRT, we’d love to hear any suggestions you may have. What do you like or dislike about CrXPRT? What features do you hope to see in a new version?

When we begin work on a new version of any benchmark, one of our first steps is to determine whether the workloads will provide value during the years ahead. As technology and user behavior evolve, we update test content to keep it relevant. For example, we periodically replace workload photos with ones that use more contemporary resolutions and file sizes.

Sometimes the changing tech landscape prompts us to remove entire workloads and add new ones. The Photo Collage workload in CrXPRT uses Portable Native Client (PNaCl) technology, for which the Chrome team will soon end support. CrXPRT 2015 has a workaround for this issue, but the best course of action for the next version of CrXPRT will be to remove this workload altogether.

The battery life test will also change. Earlier this year, we started to see unusual battery life estimates and high variance when running tests at CrXPRT’s default battery life test length of 3.5 hours, so we’ve been recommending that users perform full rundowns instead. In the next CrXPRT, the battery life test will require full rundowns.

We’ll also be revamping the CrXPRT UI to improve the look of the benchmark and make it easier to use, as we’ve done with the other recent XPRT releases.

We really do want to hear your ideas, and any feedback you send has a chance to shape the future of the benchmark. Let us know what you think!

Justin

AIXPRT is here!

We’re happy to announce that AIXPRT is now available to the public! AIXPRT includes support for the Intel OpenVINO, TensorFlow, and NVIDIA TensorRT toolkits to run image-classification and object-detection workloads with the ResNet-50 and SSD-MobileNet v1 networks, as well as a Wide and Deep recommender system workload with the Apache MXNet toolkit. The test reports results at FP32, FP16, and INT8 levels of precision.
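AIXPRT’s toolkits handle precision conversion internally, but as a rough illustration of what INT8 precision means, here is a toy symmetric quantization of FP32 values in Python. This is our sketch for explanation only, not any toolkit’s actual method.

```python
def quantize_int8(values):
    """Map FP32 values onto the signed INT8 range [-127, 127] with one shared scale."""
    scale = max(abs(v) for v in values) / 127.0
    quantized = [int(round(v / scale)) for v in values]
    # Dequantizing shows the small rounding error that lower precision introduces.
    dequantized = [q * scale for q in quantized]
    return quantized, dequantized, scale

weights = [0.42, -1.27, 0.05, 0.9981]
q, dq, s = quantize_int8(weights)
print(q)  # [42, -127, 5, 100]
```

Running inference with 8-bit integers like these is typically much faster and more memory-efficient than FP32, at the cost of the rounding error visible in the dequantized values, which is why AIXPRT reports the precision level alongside each result.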

To access AIXPRT, visit the AIXPRT download page. There, a download table displays the AIXPRT test packages. Locate the operating system and toolkit you wish to test and click the corresponding Download link. For detailed installation instructions and information on hardware and software requirements for each package, click the package’s Readme link. If you’re not sure which AIXPRT package to choose, the AIXPRT package selector tool will help to guide you through the selection process.

In addition, the Helpful Info box on AIXPRT.com contains links to a repository of AIXPRT resources, as well as links to XPRT blog discussions about key AIXPRT test configuration settings such as batch size and precision.

We hope AIXPRT will prove to be a valuable tool for you, and we’re thankful for all the input we received during the preview period! If you have any questions about AIXPRT, please let us know.

Coming soon: An interactive AIXPRT selector tool

AI workloads are now relevant to all types of hardware, from servers to laptops to IoT devices, so we intentionally designed AIXPRT to support a wide range of potential hardware, toolkit, and workload configurations. This approach provides AIXPRT testers with a tool that is flexible enough to adapt to a variety of environments. The downside is that the number of options makes it fairly complicated to figure out which AIXPRT download package suits your needs.

To help testers navigate this complexity, we’ve been working on a new interactive selector tool. The tool is not yet live, but the screenshots and descriptions below provide a preview of what’s to come.

The tool will include drop-down menus for the key factors that go into determining the correct AIXPRT download package, along with a description of the options. Users can proceed in any order but will need to make a selection for each category. Since not all combinations work together, each selection the user makes will eliminate some of the options in the remaining categories.
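The elimination logic described above can be sketched in a few lines of Python. The package combinations below are hypothetical placeholders for illustration, not the actual AIXPRT download table.

```python
# Hypothetical package list for illustration; see AIXPRT.com for the real table.
PACKAGES = [
    {"os": "Ubuntu",  "toolkit": "OpenVINO",   "target": "CPU"},
    {"os": "Ubuntu",  "toolkit": "TensorFlow", "target": "CPU"},
    {"os": "Ubuntu",  "toolkit": "TensorRT",   "target": "GPU"},
    {"os": "Windows", "toolkit": "OpenVINO",   "target": "CPU"},
]

def remaining_options(selection):
    """Given partial category selections, return the values still valid per category."""
    matches = [p for p in PACKAGES
               if all(p[k] == v for k, v in selection.items())]
    options = {}
    for pkg in matches:
        for category, value in pkg.items():
            options.setdefault(category, set()).add(value)
    return options

# Choosing TensorRT eliminates Windows and CPU from the remaining menus.
print(remaining_options({"toolkit": "TensorRT"}))
```

Each selection simply filters the list of valid packages, and the surviving packages determine which menu entries stay clickable; selections can therefore be made in any order.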

[Screenshot: AIXPRT selector tool category menus]

After a user selects an option, a check mark appears on the category icon, and the selection for that category appears in the category box (e.g., TensorFlow in the Toolkit category). This shows users which categories they’ve completed and the selections they’ve made. After a user selects options in more than one category, a Start over button appears in the lower-left corner. Clicking this button clears all existing selections and provides users with a clean slate.

Once every category is complete, a Download button appears in the lower-right corner. Clicking this button opens a popup with a link to the correct download package and its associated readme file.

[Screenshot: AIXPRT selector tool download popup]

We hope the selector tool will help make the AIXPRT download and installation process easier for those who are unfamiliar with the benchmark. Testers who already know exactly which package they need will be able to bypass the tool and go directly to a download table.

The tool will debut with the AIXPRT 1.0 GA in the next few days, and we’ll let everyone know when that happens! If you have any questions or comments about AIXPRT, please let us know.

Justin
