

Understanding AIXPRT’s default number of requests

A few weeks ago, we discussed how AIXPRT testers can adjust the key variables of batch size, levels of precision, and number of concurrent instances by editing the JSON test configuration file in the AIXPRT/Config directory. In addition to those key variables, there is another variable in the config file called “total_requests” that has a different default setting depending on the AIXPRT test package you choose. This setting can significantly affect a test run, so it’s important for testers to know how it works.

The total_requests variable specifies how many inference requests AIXPRT will send to a network (e.g., ResNet-50) during one test iteration at a given batch size (e.g., batch 1, 2, or 4). This simulates the inference demand that end users place on the system. Because we designed AIXPRT to run on different types of hardware, it makes sense to set the default number of requests for each test package to suit the most likely hardware environment for that package.

For example, testing with OpenVINO on Windows aligns more closely with a consumer (i.e., desktop or laptop) scenario than testing with OpenVINO on Ubuntu, which is more typical of server/datacenter testing. Desktop testers require a much lower inference demand than server testers, so the default total_requests settings for the two packages reflect that. The default for the OpenVINO/Windows package is 500, while the default for the OpenVINO/Ubuntu package is 5,000.

Also, setting the number of requests so low that a system finishes each workload in less than 1 second can produce high run-to-run variation, so our default settings represent a lower boundary that will work well for common test scenarios.

Below, we provide the current default total_requests setting for each AIXPRT test package:

  • MXNet: 1,000
  • OpenVINO Ubuntu: 5,000
  • OpenVINO Windows: 500
  • TensorFlow Ubuntu: 100
  • TensorFlow Windows: 10
  • TensorRT Ubuntu: 5,000
  • TensorRT Windows: 500


Testers can adjust these variables in the config file according to their own needs. Finding the optimal combination of machine learning variables for each scenario is often a matter of trial and error, and the default settings represent what we think is a reasonable starting point for each test package.

To adjust the total_requests setting, start by locating and opening the JSON test configuration file in the AIXPRT/Config directory. Below, we show a section of the default config file (CPU_INT8.json) for the OpenVINO-Windows test package (AIXPRT_1.0_OpenVINO_Windows.zip). For each batch size, the total_requests setting appears at the bottom of the list of configurable variables. In this case, the default setting is 500. Change the total_requests numerical value for each batch size in the config file, save your changes, and close the file.

[Screenshot: the total_requests entries in the default OpenVINO-Windows config file]
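
If you are changing the value for several batch sizes at once, a short script can save some hand editing. The sketch below is a minimal example and is not part of AIXPRT; it assumes only what we describe above (a JSON config file in AIXPRT/Config that contains one total_requests value per batch size) and rewrites every occurrence of that key to a single new value. The file path and the new value are placeholders you would adjust for your own setup.

```python
import json
from pathlib import Path

# Illustrative helper, not part of AIXPRT: it assumes only that the config file
# is JSON and that the requests-per-batch-size setting is stored under a
# "total_requests" key, as described in the post.
CONFIG_PATH = Path("AIXPRT/Config/CPU_INT8.json")  # default OpenVINO/Windows config file
NEW_TOTAL_REQUESTS = 1000  # example value; choose what suits your test scenario

def set_total_requests(node, value):
    """Recursively set every 'total_requests' key found in the JSON structure."""
    if isinstance(node, dict):
        for key, child in node.items():
            if key == "total_requests":
                node[key] = value
            else:
                set_total_requests(child, value)
    elif isinstance(node, list):
        for child in node:
            set_total_requests(child, value)

config = json.loads(CONFIG_PATH.read_text())
set_total_requests(config, NEW_TOTAL_REQUESTS)
CONFIG_PATH.write_text(json.dumps(config, indent=4))
```

If you want different values for different batch sizes, edit the individual entries in the file by hand instead.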

Note that if you are running multiple concurrent instances, OpenVINO and TensorRT automatically distribute the number of requests among the instances. MXNet and TensorFlow users must manually allocate the instances in the config file. You can find an example of how to structure manual allocation here. We hope to make this process automatic for all toolkits in a future update.
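
To make the idea of manual allocation concrete, here is a purely illustrative bit of arithmetic; it is not the config syntax from the linked example, just a sketch of how a run's total request count might be divided evenly across concurrent instances.

```python
# Illustrative only: computes an even split of a run's total requests across
# concurrent instances. The actual MXNet/TensorFlow config syntax is shown in
# the manual-allocation example linked above.
def split_requests(total_requests: int, instances: int) -> list[int]:
    base, remainder = divmod(total_requests, instances)
    # Each instance gets an equal share; any remainder goes to the first few.
    return [base + (1 if i < remainder else 0) for i in range(instances)]

print(split_requests(1000, 2))  # [500, 500]
print(split_requests(500, 3))   # [167, 167, 166]
```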

We hope this information helps you understand the total_requests setting, and why the default values differ from one test package to another. If you have any questions or comments about this or other aspects of AIXPRT, please let us know.

Justin

AIXPRT is here!

We’re happy to announce that AIXPRT is now available to the public! AIXPRT includes support for the Intel OpenVINO, TensorFlow, and NVIDIA TensorRT toolkits to run image-classification and object-detection workloads with the ResNet-50 and SSD-MobileNet v1 networks, as well as a Wide and Deep recommender system workload with the Apache MXNet toolkit. The test reports results at FP32, FP16, and INT8 levels of precision.

To access AIXPRT, visit the AIXPRT download page. There, a download table displays the AIXPRT test packages. Locate the operating system and toolkit you wish to test and click the corresponding Download link. For detailed installation instructions and information on hardware and software requirements for each package, click the package’s Readme link. If you’re not sure which AIXPRT package to choose, the AIXPRT package selector tool will help to guide you through the selection process.

In addition, the Helpful Info box on AIXPRT.com contains links to a repository of AIXPRT resources, as well as links to XPRT blog discussions about key AIXPRT test configuration settings such as batch size and precision.

We hope AIXPRT will prove to be a valuable tool for you, and we’re thankful for all the input we received during the preview period! If you have any questions about AIXPRT, please let us know.

How to use alternate configuration files with AIXPRT

In last week’s AIXPRT Community Preview 3 announcement, we mentioned the new public GitHub repository that we’re using to publish AIXPRT-related information and resources. In addition to the installation readmes for each AIXPRT installation package, the repository contains a selection of alternative test config files that testers can use to quickly and easily change a test’s parameters.

As we discussed in previous blog entries about batch size, levels of precision, and number of concurrent instances, AIXPRT testers can adjust each of these key variables by editing the JSON file in the AIXPRT/Config directory. While the process is straightforward, editing each of the variables in a config file can take some time, and testers don’t always know the appropriate values for their system. To address both of these issues, we are offering a selection of alternative config files that testers can download and drop into the AIXPRT/Config directory.

In the GitHub repository, we’ve organized the available config files first by operating system (Linux_Ubuntu and Windows) and then by vendor (All, Intel, and NVIDIA). Within each section, testers will find preconfigured JSON files set up for several scenarios, such as running with multiple concurrent instances on a system’s CPU or GPU, running with FP32 precision instead of FP16, etc. The picture below shows the preconfigured files that are currently available for systems running Ubuntu on Intel hardware.

[Screenshot: the preconfigured config files available for Ubuntu systems on Intel hardware in the AIXPRT GitHub repository]

Because potential AIXPRT use cases cut across a wide range of hardware segments, including desktops, edge devices, and servers, not all AIXPRT workloads and configs will be applicable to each segment. As we move towards the AIXPRT GA, we’re working to find the best way to parse out these distinctions and communicate them to end users. In many cases, the ideal combination of test configuration variables remains an open question for ongoing research. However, we hope the alternative configuration files will help by giving testers a starting place.

If you experiment with an alternative test configuration file, please note that it should replace the existing default config file. If more than one config file is present, AIXPRT will run all the configurations and generate a separate result for each. More information about the config files and detailed instructions for how to handle the files are available in the EditConfig.md document in the public repository.

We’ll continue to keep everyone up to date with AIXPRT news here in the blog. If you have any questions or comments, please let us know.

Justin

An update on AIXPRT development

It’s been a while since we last discussed the AIXPRT Community Preview 3 (CP3) release schedule, so we want to let everyone know where things stand. Testing for CP3 has taken longer than we predicted, but we believe we’re nearly ready for the release.

Testers can expect three significant changes in AIXPRT CP3. First, we updated the Ubuntu test packages to support the current LTS release. During the initial development phase of AIXPRT, Ubuntu version 16.04 LTS (Long Term Support) was the most current LTS version, but version 18.04 is now available.

Second, we have added TensorRT test packages for Windows and Ubuntu. Previously, AIXPRT testers could test only the TensorFlow variant of TensorRT. Now, they can use TensorRT to test systems with NVIDIA GPUs.

Third, we have added the Wide and Deep recommender system workload with the MXNet toolkit. Recommender systems are AI-based information-filtering tools that learn from end users’ input and behavior patterns and try to present those users with optimized outputs that suit their needs and preferences. If you’ve used Netflix, YouTube, or Amazon, you’ve encountered recommender systems that learn from your behavior.

Currently, the recommender system workload in AIXPRT CP3 is available for Ubuntu testing, but not for Windows. Recommender system inference workloads typically run on datacenter hardware, which tends to be Linux based. If enough community members are interested in running the MXNet/Wide and Deep test package on Windows, we can investigate what that would entail. If you’d like to see that option, please let us know.

As always, if you have any questions about the AIXPRT development process, feel free to ask!

Justin

Navigating the AIXPRT Community Preview download page just got easier

AIXPRT Community Preview 2 (CP2) has been generating quite a bit of interest among the BenchmarkXPRT Development Community and members of the tech press. We’re excited that the tool has piqued curiosity and that folks are recognizing its value for technical analysis. When talking with folks about test setup and configuration, we keep hearing the same questions:

  • How do I find the exact toolkit or package that I need?
  • How do I find the instructions for a specific toolkit?
  • What test configuration variables are most important for producing consistent, relevant results?
  • How do I know which values to choose when configuring options such as iterations, concurrent instances, and batch size?


In the coming weeks, we’ll be working to provide detailed answers to questions about test configuration. In response to the confusion about finding specific packages and instructions, we’ve redesigned the CP2 download page to make it easier for you to find what you need. Below, we show a snapshot from the new CP2 download table. Instead of having to download the entire CP2 package that includes the OpenVINO, TensorFlow, and TensorRT in TensorFlow test packages, you can now download one package at a time. In the Documentation column, we’ve posted package-specific instructions, so you won’t have to wade through the entire installation guide to find the instructions you need.

[Screenshot: the redesigned AIXPRT CP2 download table]

We hope these changes make it easier for people to experiment with AIXPRT. As always, please feel free to contact us with any questions or comments you may have.

Justin

Making AIXPRT easier to use

We’re glad to see so much interest in the AIXPRT CP2 build. Over the past few days, we’ve received two questions about the setup process: 1) where to find instructions for setting up AIXPRT on Windows, and 2) whether we could make it easier to install Intel OpenVINO on test systems.

In response to the first question, testers can find the relevant instructions for each framework in the readme files included in the AIXPRT install package. Instructions for Windows installation are in section 3 of the OpenVINO and TensorFlow readmes. Whether you’re running AIXPRT on Ubuntu or Windows, please be sure to read the “Known Issues” section of the readme, as there may be issues relevant to your specific configuration.

The readme files for each respective framework in the CP2 package are located here:

  • AIXPRT_0.5_CP2\AIXPRT_OpenVINO_0.5_CP2.zip\AIXPRT\Modules\Deep-Learning
  • AIXPRT_0.5_CP2\AIXPRT_TensorFLow_0.5_CP2.zip\AIXPRT\Modules\Deep-Learning
  • AIXPRT_0.5_CP2\AIXPRT_TensorFlow_TensorRT_0.5_CP2.zip\AIXPRT\Modules\Deep-Learning


We’re also working on consolidating the instructions into a central document that will make it easier for everyone to find the instructions they need.

In response to the question about OpenVINO installation, we’re working on an AIXPRT CP2 package that includes a precompiled version of OpenVINO R5.0.1 for easy installation on Windows via a few quick commands, and a script that installs the necessary OpenVINO dependencies. We’re currently testing the build, and we’ll make it available to testers as soon as possible.

The tests themselves will not change, so the new build will not influence existing results from Ubuntu or Windows. We hope it will simply facilitate the setup and testing process for many users.

We appreciate each bit of feedback that we receive, so if you have any suggestions for AIXPRT, please let us know!

Justin
