Category: OpenVINO

AIXPRT Community Preview 2 is almost here!

In last week’s blog, we predicted that the second AIXPRT Community Preview (CP2) would be ready for release later this month. Since then, the development process has accelerated, and we now expect to release CP2 as early as tomorrow, May 10.

Those who have access to the existing AIXPRT Community Preview GitHub repository will be able to access CP2 the same way as before. In addition to making the build available on GitHub, we’ll also post CP2 on an AIXPRT tab in the XPRT Members’ Area (login required). If you don’t have a BenchmarkXPRT Development Community membership, please contact us and we’ll help you register.

Testing with AIXPRT CP2 in Ubuntu will be the same as with the first CP, and none of the CP2 changes will affect results. In Windows, testers will be able to use OpenVINO to target a system’s CPU and GPU, and TensorFlow to target CPUs. We’re still investigating ways to support TensorFlow GPU and TensorFlow-TensorRT testing in Windows.
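
For readers new to OpenVINO, targeting the CPU versus the GPU essentially comes down to the device name you hand the toolkit when it compiles a model. The snippet below is a minimal sketch using OpenVINO's Python API with a hypothetical ResNet-50 IR file; it is illustrative only, not AIXPRT's harness code, and the AIXPRT configuration handles device selection for you.

```python
# Minimal sketch of OpenVINO device targeting, assuming a recent OpenVINO
# Python package and a hypothetical ResNet-50 IR file (resnet50.xml/.bin).
# Illustrative only; AIXPRT's harness handles device selection itself.
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("resnet50.xml")                      # hypothetical model file
compiled_cpu = core.compile_model(model, device_name="CPU")  # target the CPU
# compiled_gpu = core.compile_model(model, device_name="GPU")  # target a supported GPU

dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)
request = compiled_cpu.create_infer_request()
request.infer({0: dummy_input})                              # one inference on the CPU
output = request.get_output_tensor(0).data
print(output.shape)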

We’re also continuing to work on the improvements to the AIXPRT results viewer that we mentioned last week. We won’t be able to implement all of the changes by tomorrow, but rather than waiting until we’re finished, we’ll be rolling out improvements as they become ready.

We’ll continue to keep everyone up to date with AIXPRT news here in the blog. If you have any questions or comments, please let us know.

Justin

An update on AIXPRT development

It’s been almost two months since the AIXPRT Community Preview went live, and we want to provide folks with a quick update. Community Preview periods for the XPRTs generally last about a month. Because of the complexity of AIXPRT and some of the feedback we’ve received, we plan to release a second AIXPRT Community Preview (CP2) later this month.

One of the biggest additions in CP2 will be the ability to run AIXPRT on Windows. AIXPRT currently requires test systems to run Ubuntu 16.04 LTS. This is fine for testers accustomed to Linux environments, but presents obstacles for those who want to test in a traditional Windows environment. We will not be changing the tests themselves, so this update will not influence existing results from Ubuntu. We plan to make CP2 available for download from the BenchmarkXPRT website for people who don’t wish to deal with GitHub.

Also, after speaking with testers and learning more about the kinds of data points people are looking for in AIXPRT results, we’ve decided to make significant adjustments to the AIXPRT results viewer. To make it easier for visitors to find what they’re looking for, we’ll add filters for key categories such as batch size, toolkit, and latency percentile (e.g., 50th, 90th, 99th), among others. We’ll also allow users to set desired ranges for metrics such as throughput and latency.
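
As a rough illustration of the kind of filtering we have in mind, the sketch below applies those categories to a few made-up result records. The field names and values are hypothetical stand-ins, not actual AIXPRT results or the viewer's implementation.

```python
# Hypothetical sketch of filtering benchmark result records by category and
# by metric ranges. The records below are made up for illustration only.
results = [
    {"toolkit": "OpenVINO",   "batch_size": 1,  "latency_pct": 99, "latency_ms": 7.2,  "throughput_ips": 410},
    {"toolkit": "TensorFlow", "batch_size": 32, "latency_pct": 90, "latency_ms": 55.1, "throughput_ips": 980},
    {"toolkit": "OpenVINO",   "batch_size": 8,  "latency_pct": 50, "latency_ms": 12.4, "throughput_ips": 720},
]

def filter_results(records, toolkit=None, batch_size=None, latency_pct=None,
                   min_throughput=None, max_latency_ms=None):
    """Return only the records that match every supplied filter."""
    out = []
    for r in records:
        if toolkit is not None and r["toolkit"] != toolkit:
            continue
        if batch_size is not None and r["batch_size"] != batch_size:
            continue
        if latency_pct is not None and r["latency_pct"] != latency_pct:
            continue
        if min_throughput is not None and r["throughput_ips"] < min_throughput:
            continue
        if max_latency_ms is not None and r["latency_ms"] > max_latency_ms:
            continue
        out.append(r)
    return out

print(filter_results(results, toolkit="OpenVINO", min_throughput=500))
```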

Finally, we’re adding a demo mode that displays images and other information on screen while a test is running, to give users a better idea of what is happening. While we haven’t seen results change when running in demo mode, users should not publish demo results or use them for comparison.

We hope to release CP2 in the second half of May and a general availability (GA) version in mid-June. However, this project has more uncertainties than we usually encounter with the XPRTs, so that timeline could easily change.

We’ll continue to keep everyone up to date with AIXPRT news here in the blog. As always, we appreciate your suggestions. If you have any questions or comments about AIXPRT, please let us know.

Bill

All about the AIXPRT Community Preview

Last week, Bill discussed our plans for the AIXPRT Community Preview (CP). I’m happy to report that, despite some last-minute tweaks and testing, we’re close to being on schedule. We expect to take the CP build live in the coming days, and will send a message to community members to let them know when the build is available in the AIXPRT GitHub repository.

As we mentioned last week, the AIXPRT CP build includes support for the Intel OpenVINO, TensorFlow (CPU and GPU), and TensorFlow with NVIDIA TensorRT toolkits to run image-classification workloads with ResNet-50 and SSD-MobileNet v1 networks. The test reports FP32, FP16, and INT8 levels of precision. Although the minimum CPU and GPU requirements vary by toolkit, the test systems must be running Ubuntu 16.04 LTS. You’ll be able to find more detail on those requirements in the installation instructions that we’ll post on AIXPRT.com.
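
AIXPRT reports its results in terms of throughput and latency. As a generic illustration of how those metrics are commonly derived (not AIXPRT's exact implementation), the sketch below times repeated calls to a placeholder inference function and computes images-per-second throughput along with 50th-, 90th-, and 99th-percentile latencies.

```python
# Generic sketch of deriving throughput and percentile latencies from an
# inference loop. run_inference() is a placeholder, not an AIXPRT function.
import time

def run_inference(batch_size):
    """Stand-in for a real framework call (OpenVINO, TensorFlow, etc.)."""
    time.sleep(0.005)

def measure(batch_size=1, iterations=100):
    latencies_ms = []
    for _ in range(iterations):
        start = time.perf_counter()
        run_inference(batch_size)
        latencies_ms.append((time.perf_counter() - start) * 1000.0)
    latencies_ms.sort()
    pct = lambda p: latencies_ms[min(int(len(latencies_ms) * p / 100), len(latencies_ms) - 1)]
    throughput = batch_size * iterations / (sum(latencies_ms) / 1000.0)  # images/sec
    return {"throughput_ips": round(throughput, 1),
            "p50_ms": round(pct(50), 2),
            "p90_ms": round(pct(90), 2),
            "p99_ms": round(pct(99), 2)}

print(measure(batch_size=1, iterations=100))
```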

We’re making the AIXPRT CP available to anyone interested in participating, but you must have a GitHub account. To gain access to the CP, please contact us and let us know your GitHub username. Once we receive it, we’ll send you an invitation to join the repository as a collaborator.

We’re allowing folks to quote test results during the CP period, and we’ll publish results from our lab and other members of the community at AIXPRT.com. Because this testing involves so many complex variables, we may contact testers if we see published results that seem significantly different from those of comparable systems. During the CP period, we’ll provide detailed instructions on the AIXPRT results page explaining how to send in your results for publication on our site. For each set of results we receive, we’ll disclose all of the detailed test, software, and hardware information that the tester provides. In doing so, our goal is to make it possible for others to reproduce the test and confirm that they get similar numbers.

If you make changes to the code during testing, we ask that you email us and describe those changes. We’ll evaluate whether those changes should become part of AIXPRT. We also require that testers not publish results from modified versions of the code during the CP period.

We expect the AIXPRT CP period to last about four to six weeks, placing the public release around the end of March or beginning of April. In the meantime, we welcome your thoughts and suggestions about all aspects of the benchmark.

Please let us know if you have any questions. Stay tuned to AIXPRT.com and the blog for more developments, and we look forward to seeing your results!

JNG

An update on the AIXPRT Request for Comments preview

As we approach the end of the original feedback window for the AIXPRT Request for Comments preview build, we want to update folks on the status of the project and what to expect in the coming weeks.

First, thanks to those who’ve downloaded the AIXPRT OpenVINO package and sent in their questions and comments. We value your feedback, and it’s instrumental in making AIXPRT a better tool. We’re currently working through some issues with the TensorFlow and TensorRT packages, and hope to add support for those to the RFC preview build repository very soon.

We’re also hoping to have a full-fledged community preview (CP) ready in mid to late February. Like our other community previews, the AIXPRT CP would be solid enough to allow folks to start quoting numbers. We typically make our benchmarks available to the general public four to six weeks after the community preview period begins, so if that schedule holds, it would place the public AIXPRT release around the end of March.

In light of the schedule described above, you still have time to gain access to the AIXPRT RFC preview build and give your feedback, so let us know if you’d like to check it out. The installation and testing process can take less than an hour, but getting everything properly set up can take a few tries. We are hard at work trying to make that process more straightforward. We welcome your input on all aspects of the benchmark, including workloads, ease of use, metrics, scores, and reporting.

Thanks for your help!

Justin

The AIXPRT Request for Comments preview build

In the next few days, we’ll be publishing the AIXPRT Request for Comments (RFC) preview build, an early version of one of the tools we’re developing to help evaluate machine learning performance.

We’re inviting folks to run the workload and send in their thoughts and suggestions. Only BenchmarkXPRT Development Community members have access to our RFCs and the opportunity to provide feedback. However, because we’re seeking broad input from experts in this field, we’ll gladly make anyone interested in participating a member.

This AIXPRT RFC preview build includes support for the Intel OpenVINO computer vision toolkit to run image classification workloads with ResNet-50 and SSD-MobileNet v1 networks. The test reports FP32 and FP16 levels of precision. The system requirements are:

  • Operating system: Ubuntu 16.04
  • CPU: 6th to 8th generation Intel Core or Xeon processors, or Intel Pentium processors N4200/5, N3350/5, N3450/5 with Intel HD Graphics
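
To give a sense of what an image-classification input looks like, here’s a rough sketch of typical ResNet-50 preprocessing (resize to 224 x 224, convert to NCHW layout). The file name and normalization are illustrative assumptions; AIXPRT’s own scripts prepare the inputs, and its exact pipeline may differ.

```python
# Rough sketch of typical ResNet-50 input preprocessing (224x224, NCHW).
# The file name and normalization are illustrative assumptions; AIXPRT's
# own scripts prepare the inputs, and its exact pipeline may differ.
import cv2
import numpy as np

def preprocess(image_path, size=224):
    img = cv2.imread(image_path)              # BGR, HWC, uint8
    img = cv2.resize(img, (size, size))
    img = img.astype(np.float32) / 255.0      # scale to [0, 1]
    img = np.transpose(img, (2, 0, 1))        # HWC -> CHW
    return np.expand_dims(img, axis=0)        # add batch dimension -> NCHW

batch = preprocess("example.jpg")             # hypothetical image file
print(batch.shape)                            # (1, 3, 224, 224)
```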


We welcome input on all aspects of the benchmark, including scope, workloads, metrics and scores, user experience, and reporting. We will add support for TensorFlow and TensorRT to the AIXPRT RFC preview build during the preview period. We are accepting feedback through January 25th, 2019, after which we’ll collect and evaluate responses before publishing the next build. Because this is an RFC release, we ask that testers do not publish scores or use the results for comparison purposes.

We’ll send out a community announcement when the RFC preview build is officially available, and we’ll also post an announcement and RFC preview build user guide on AIXPRT.com. We’re hosting the AIXPRT RFC preview build in a dedicated GitHub repository, so please contact us at BenchmarkXPRTsupport@principledtechnologies.com to gain access.

This is just the next step for AIXPRT. With your help, we hope to add more workloads and other frameworks in the coming months. We look forward to receiving your feedback!

Bill
