

Principled Technologies and the BenchmarkXPRT Development Community make the AIXPRT source code available to the public

Durham, NC, February 18 — Principled Technologies and the BenchmarkXPRT Development Community have released the source code for the AIXPRT benchmark to the public. AIXPRT is a free tool that lets users evaluate a system’s machine learning inference performance by running common image-classification, object-detection, and recommender system workloads.

“Publishing the AIXPRT source code is part of our commitment to making the XPRT development process as transparent as possible,” said Bill Catchings, co-founder of Principled Technologies, which administers the BenchmarkXPRT Development Community. “By allowing all interested parties to download and review our source code, we’re taking tangible steps to improve openness in the benchmarking industry.”

To access the AIXPRT source code, visit the AIXPRT GitHub repository at https://github.com/BenchmarkXPRT/AIXPRT.

AIXPRT includes support for the Intel® OpenVINO™, TensorFlow™, and NVIDIA® TensorRT™ toolkits to run image-classification and object-detection workloads with the ResNet-50 and SSD-MobileNet v1 networks, as well as the MXNet™ toolkit with a Wide and Deep recommender system workload. The test supports FP32, FP16, and INT8 precision levels.
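
For readers who are curious what that kind of measurement involves, the short Python sketch below times batched image-classification inference and reports throughput. It is only an illustration of the general approach AIXPRT automates, not AIXPRT code; the randomly initialized ResNet-50, batch size, and iteration count are arbitrary choices for the example.

# Illustrative sketch only; not part of the AIXPRT harness.
import time
import numpy as np
import tensorflow as tf

# Random weights are fine for a timing sketch; accuracy is irrelevant here.
model = tf.keras.applications.ResNet50(weights=None)
batch = np.random.rand(8, 224, 224, 3).astype(np.float32)

# Warm up so one-time graph-building costs don't skew the measurement.
model.predict(batch, verbose=0)

iterations = 20
start = time.perf_counter()
for _ in range(iterations):
    model.predict(batch, verbose=0)
elapsed = time.perf_counter() - start

print(f"Throughput: {iterations * batch.shape[0] / elapsed:.1f} images/sec")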

To access AIXPRT, visit www.AIXPRT.com.

AIXPRT is part of the BenchmarkXPRT suite of performance evaluation tools, which includes WebXPRT, CrXPRT, MobileXPRT, TouchXPRT, and HDXPRT. The XPRTs help users get the facts before they buy, use, or evaluate tech products such as computers, tablets, and phones.

To learn more about the BenchmarkXPRT Development Community, go to www.BenchmarkXPRT.com or contact a BenchmarkXPRT Development Community representative directly by sending a message to BenchmarkXPRTsupport@PrincipledTechnologies.com.

About Principled Technologies, Inc.
Principled Technologies, Inc. is a leading provider of technology marketing, as well as learning and development services. It administers the BenchmarkXPRT Development Community.

Principled Technologies, Inc. is located in Durham, North Carolina, USA. For more information, please visit www.PrincipledTechnologies.com.

Company Contact
Justin Greene
BenchmarkXPRT Development Community
Principled Technologies, Inc.
1007 Slater Road, Ste. 300
Durham, NC 27704
BenchmarkXPRTsupport@PrincipledTechnologies.com

The AIXPRT source code is now public

This week, we have good news for AIXPRT testers: the AIXPRT source code is now available to the public via GitHub. As we’ve discussed in the past, publishing XPRT source code is part of our commitment to making the XPRT development process as transparent as possible. With other XPRT benchmarks, we’ve made the source code available only to community members. With AIXPRT, we have released the source code more widely. By allowing all interested parties, not just community members, to download and review our source code, we’re taking tangible steps to improve openness and honesty in the benchmarking industry, and we’re encouraging the kind of constructive feedback that helps ensure the XPRTs continue to contribute to a level playing field.

Traditional open-source models encourage developers to change products and even take them in new and different directions. Because benchmarking requires a product that remains static to enable valid comparisons over time, we allow people to download the source code and submit potential workloads for future consideration, but we reserve the right to control derivative works. This discourages a situation where someone publishes an unauthorized version of the benchmark and calls it an “XPRT.”

We encourage you to download and review the source and send us any feedback you may have. Your questions and suggestions may influence future versions of AIXPRT. If you have any questions about AIXPRT or accessing the source code, please feel free to ask! Please also let us know if you think we should take this approach to releasing the source code with other XPRT benchmarks.

Justin

A preview of the new CrXPRT 2 UI

As we get closer to the CrXPRT 2 Community Preview (CP), we want to provide readers with a glimpse of the new CrXPRT 2 UI. In line with the functional and aesthetic themes we used for the latest versions of WebXPRT, MobileXPRT, and HDXPRT, we’re implementing a clean, bright look with a focus on intuitive navigation. The screenshots below show how we’ve used that approach to rework the home, battery life test, performance test, and battery life test results screens. (We’re still tweaking the UI, so the screens you see in the CP may differ slightly.)

On the home screen, we kept the performance test and battery life test buttons, but made it clearer that you can choose only one. We also added a link to the user manual to the bottom ribbon for quick access.

If you choose to run a battery life test and click Next, the screen below appears. The CrXPRT 2 battery life test requires a full rundown, so you’ll need to charge your device to 100 percent before you can start the test. Once you’ve done that, enter a name for the test run, unplug the system, and click Start. (Note that you no longer need to enter values for screen brightness and audio levels.)

The CrXPRT 2 performance test includes updated versions of six of the seven workloads in CrXPRT 2015. (As we discussed in a previous blog post, newer versions of Chrome can’t run the Photo Collage workload without a workaround, so we removed it from CrXPRT 2.) To run the performance test, enter a name for the test run, customize the workloads if you wish, and click Start.

For the results screens, we wanted to highlight the most important end-of-test information while still offering clear paths for options such as getting additional details on the test, submitting results, and running the test again. Below, we show the results screen from a battery life test. Note the “Main menu” link in the upper-left corner, which we added to all screens to give users a quick way to navigate back to the home screen.

CrXPRT 2 development and testing are still underway. We don’t yet have an exact release date for the CP, but once we do, we’ll announce it here in the blog.

What do you think about the new CrXPRT 2 UI? Let us know!

Justin

The XPRT activity we have planned for the first half of 2020

Today, we want to let readers know what to expect from the XPRTs over the next several months. Timelines and details can always change, but we’re confident that community members will see releases of a CloudXPRT Community Preview (CP), an updated AIXPRT, and CrXPRT 2 during the first half of 2020.

CloudXPRT

Last week, Bill shared some details about our new datacenter-oriented benchmark, CloudXPRT. If you missed that post, we encourage you to check it out and learn more about the need for a new kind of cloud benchmark, and our plans for the benchmark’s structure and metrics. We’re already testing preliminary builds, and aim to release a CloudXPRT CP in late March, followed by a version for general availability roughly two months later.

AIXPRT

About a month ago, we explained how the number of moving parts in AIXPRT will necessitate a different development approach than we’ve used for other XPRTs. AIXPRT will require more frequent updating than our other benchmarks, and we anticipate releasing the second version of AIXPRT by mid-year. We’re still finalizing the details, but it’s likely to include the latest versions of ResNet-50 and SSD-MobileNet, selected SDK updates, ease-of-use improvements for the harness, and improved installation scripts. We’ll share more detailed information about the release timeline here in the blog as soon as possible.

CrXPRT 2

As we mentioned in December, we’re working on CrXPRT 2, the next version of our benchmark that evaluates the performance and battery life of Chromebooks. You can find out more about how CrXPRT works both here in the blog and at CrXPRT.com.

We’re currently testing an alpha version of CrXPRT 2. Testing is going well, but we’re tweaking a few items and refining the new UI. We should start testing a CP candidate in the next few weeks, and will have firmer information for community members about a CP release date very soon.

We’re excited about these new developments and the prospect of extending the XPRTs into new areas. If you have any questions about CloudXPRT, AIXPRT, or CrXPRT 2, please feel free to ask!

Justin

CloudXPRT is on the way

A few months ago, we wrote about the possibility of creating a datacenter XPRT. In the intervening time, we’ve discussed the idea with folks both in and outside of the XPRT Community. We’ve heard from vendors of datacenter products, hosting/cloud providers, and IT professionals that use those products and services.

The common thread that emerged was the need for a cloud benchmark that can accurately measure the performance of modern, cloud-first applications deployed on modern infrastructure as a service (IaaS) platforms, whether those platforms are on-premises, hosted elsewhere, or some combination of the two (hybrid clouds). Regardless of where clouds reside, applications are increasingly using them in latency-critical, highly available, and high-compute scenarios.

Existing datacenter benchmarks do not give a clear indication of how applications will perform on a given IaaS infrastructure, so the new benchmark should use cloud-native components on the actual software stacks used to manage on-premises and public clouds.

We are planning to call the benchmark CloudXPRT. Our goal is for CloudXPRT to address the needs described above while also including the elements that have made the other XPRTs successful. We plan for CloudXPRT to

  • Be relevant to on-prem (datacenter), private, and public cloud deployments
  • Run on top of cloud platform software such as Kubernetes
  • Include multiple workloads that address common scenarios like web applications, AI, and media analytics
  • Support multi-tier workloads
  • Report relevant metrics, including both throughput and critical latency for responsiveness-driven applications, and maximum throughput for applications that depend on batch processing

CloudXPRT’s workloads will use cloud-native components on an actual stack to provide end-to-end performance metrics that allow users to choose the best IaaS configuration for their business.
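
As a rough illustration of the difference between those two kinds of metrics, the Python sketch below summarizes a set of synthetic per-request timings as overall throughput and as tail latency. The numbers are made up for the example; CloudXPRT’s actual workloads and reporting may look quite different.

# Illustrative sketch with synthetic data; not CloudXPRT output.
import random
import statistics

random.seed(42)

# Pretend we measured 10,000 requests over a 60-second run.
run_seconds = 60.0
latencies_ms = [random.lognormvariate(3.0, 0.5) for _ in range(10_000)]

# Throughput matters most for batch-style work; tail latency matters most
# for responsiveness-driven applications.
throughput_rps = len(latencies_ms) / run_seconds
p95_ms = statistics.quantiles(latencies_ms, n=100)[94]
p99_ms = statistics.quantiles(latencies_ms, n=100)[98]

print(f"Throughput:        {throughput_rps:.0f} requests/sec")
print(f"95th pct. latency: {p95_ms:.1f} ms")
print(f"99th pct. latency: {p99_ms:.1f} ms")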

We’ve been building and testing preliminary versions of CloudXPRT for the last few months. Based on the progress so far, we aim to have a Community Preview of CloudXPRT ready in mid- to late March, with a version for general availability ready about two months later.

Over the coming weeks, we’ll share more information about CloudXPRT and continue talking with interested parties about how they can help. We’d love to hear which workflows would be of most interest to you and what you would most like to see in a datacenter/cloud benchmark. Please feel free to contact us!

Bill

The XPRTs in 2019: Looking back on an exciting and productive year

2019 is winding down, and we want to take this opportunity to review another exciting and productive year for the BenchmarkXPRT Development Community. Readers of our newsletter are familiar with the stats and updates we post in each month’s mailing, but we know that not all our blog readers receive the newsletter, so we’ve compiled the highlights below.

Trade shows
Earlier this year, Justin attended CES in Las Vegas and Mark traveled to MWC Barcelona. These shows help us keep up with the latest industry trends and gather insights that lay the groundwork for XPRT development in the years ahead.

Benchmarks
In the past year, we released MobileXPRT 3, HDXPRT 4, and AIXPRT, our new AI benchmark tool that helps you evaluate a system’s machine learning inference performance. There’s much more to come in 2020 with AIXPRT and several other projects, so expect more news about benchmark development early in the year.

Web mentions
In 2019 so far, journalists, advertisers, and analysts have referenced the XPRTs over 5,000 times, including mentions in more than 190 articles and 1,350 device reviews. This represents a more than 50% increase over 2018.

Downloads and confirmed runs
To date, we’ve had more than 24,800 benchmark downloads and 153,000 confirmed runs in 2019, increases of more than 8% and 10%, respectively, over 2018. Within the last month, our most popular benchmark, WebXPRT, passed the 500,000-run milestone! WebXPRT continues to be an industry-standard performance benchmark upon which OEM labs, vendors, and leading tech press outlets rely.

XPRT Tech Spotlight
We put 47 new devices in the XPRT Tech Spotlight throughout the year and published updated back-to-school, Black Friday, and holiday showcases to help buyers compare devices.

Media and interactive tools
We published a new XPRTs around the world infographic and an interactive AIXPRT installation package selector tool. We’ve received a lot of positive feedback about the tool. We encourage you to give it a try if you’re curious about AIXPRT but aren’t sure how to get started.

We’re thankful for everyone who used the XPRTs, joined the community, and sent questions and suggestions throughout 2019. This will be our last blog post for 2019, but there’s much more to come in 2020, including some exciting new developments. Stay tuned in early January for updates!

Justin
