If you’re not sure which AIXPRT package to choose, this tool will guide you through the process. The boxes below present drop-down menus for the key factors that go into determining the correct AIXPRT package. Click the drop-down arrow to see a description of the available options in each category, and make a selection. You may proceed in any order, but not all combinations work together, so each selection will eliminate some of the options in the remaining categories.
After you select an option, a check mark appears on the category icon, and your selection appears in the category box. After you select options in more than one category, a Start over button appears in the lower-left corner; clicking it clears all existing selections and resets the tool. Once every category is complete, a Download button appears in the lower-right corner. Clicking it opens a popup with links to the correct download package and its associated readme file.
Select your test hardware’s operating system below. Systems must be running Ubuntu 18.04 LTS or Windows 10.
Because AI use cases cut across many hardware segments, we built the benchmark to be very flexible. Our goal is to eventually serve as many of these segments as we can, but not all AIXPRT workloads and configurations apply to every segment. To proceed, select the type of hardware you intend to test. Depending on your selection, you may need to tailor AIXPRT’s test configuration files to better match your system.
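To illustrate the kind of tailoring this may involve, the sketch below shows a hypothetical workload configuration file in the JSON style AIXPRT packages use. The field names and values here (such as `concurrent_instances` and `batch_sizes`) are assumptions for illustration only; consult the readme included with your download package for the exact schema your package expects.

```json
{
  "module": "OpenVINO",
  "runtype": "performance",
  "workloads_config": [
    {
      "workload_name": "resnet-50",
      "requested_config": {
        "hardware": "cpu",
        "precision": "fp32",
        "total_requests": 100,
        "concurrent_instances": 2,
        "batch_sizes": [1, 2, 4]
      }
    }
  ]
}
```

On a low-power edge device, for example, you might reduce values such as the number of concurrent instances or the batch sizes so the workload better reflects how that system would run inference in practice.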
AIXPRT includes support for the Intel OpenVINO, TensorFlow, and NVIDIA TensorRT toolkits to run image-classification and object-detection workloads, as well as a Wide and Deep recommender system workload with the Apache MXNet toolkit. Select the toolkit that corresponds to the workload you wish to run.
Depending on the operating system, host hardware, and selected toolkit, AIXPRT’s workloads can target CPUs, Intel processor graphics, NVIDIA GPUs, other discrete GPUs, or Intel Visual Processing Units (VPUs). If you cannot select your desired target hardware below, you will need to change your selection for one or more of the other categories.
To run image-classification and object-detection workloads using Intel OpenVINO, TensorFlow, or NVIDIA TensorRT, select ResNet-50 and SSD-MobileNet v1. To run a recommender system workload, select Wide and Deep.