Category: Windows

HDXPRT: See how your Windows PC handles real-world media tasks

Many of our blog readers first encountered the XPRTs when reading about a specific benchmark, such as WebXPRT, in a device review. Because these folks might be unfamiliar with our other benchmarks, we like to occasionally “reintroduce” individual XPRTs. This week, we invite you to get to know HDXPRT.

HDXPRT, which stands for High-Definition Experience & Performance Ratings Test, was the first benchmark published by the HDXPRT Development Community, which later became the BenchmarkXPRT Development Community. HDXPRT 4, the latest version, evaluates the performance of Windows 10 and Windows 11 devices while handling real-world media tasks such as photo editing, video conversion, and music editing. HDXPRT uses real commercial applications, such as Adobe Photoshop Elements and CyberLink MediaEspresso, to complete its workloads. The benchmark then produces easy-to-understand results that are relevant to buyers shopping for new Windows systems.

The HDXPRT 4 setup process takes about 30 minutes on most systems. The length of the test can vary significantly depending on the speed of the system, but for most PCs that are less than a few years old, a full three-iteration test cycle takes under two hours.

HDXPRT is a useful tool for anyone who wants to evaluate the real-world content-creation capabilities of a Windows PC. To see test scores from a variety of Windows devices, go to HDXPRT.com and click View Results.

Want to run HDXPRT?

Download HDXPRT from HDXPRT.com. The HDXPRT user manual provides information on minimum system requirements, as well as step-by-step instructions for configuring your system and kicking off a test.

Want to dig into the details?

The HDXPRT source code is available upon request. If you’d like to access the source code, please send your request to benchmarkxprtsupport@principledtechnologies.com. Build instructions are also available.

If you haven’t used HDXPRT before, give it a shot and let us know what you think!

Justin

Adobe PSE 2020 and HDXPRT 4

HDXPRT 4, our benchmark for assessing Windows performance on real-world media tasks, runs tests that use real commercial applications such as Adobe Photoshop Elements (PSE) 2020. Last fall, we informed HDXPRT testers that Adobe had started requiring a user ID to download the free Adobe Photoshop Elements 2020 trial package. Previously, testers could download the trial without setting up an account.

Recently, Adobe made additional changes to the access path for the PSE 2020 installation package. The package is no longer available on the PSE downloads page, but users who previously purchased their copy or registered it with Adobe can access the package on another page. However, this approach does not work for users who want to temporarily use the trial version for HDXPRT 4 testing.

We have found a third-party location, ProDesignTools, that currently offers a free, straightforward PSE 2020 installation package download with no requirements for registration or transmission of personal information. In our testing so far, the installation package (PhotoshopElements_2020_LS30_win64_ESD.zip) has been functioning as expected, and HDXPRT 4 is running the PSE-based workloads without any issues.

Unfortunately, we cannot guarantee that ProDesignTools will continue to offer a free PSE 2020 installation package download, and we’re not aware of an alternative Adobe download path at this time. We apologize for the inconvenience!

Justin

Using WebXPRT 3 to compare the performance of popular browsers in Windows 10 and Windows 11

People choose a default web browser based on several factors. Speed is sometimes the deciding factor, but privacy settings, memory load, ecosystem integration, and web app capabilities can also come into play. Regardless of the motivations behind a person’s go-to browser choice, the dominance of software-as-a-service (SaaS) computing means that new updates are always right around the corner. In previous blog posts, we’ve talked about how browser speed can increase or decrease significantly after an update, only to swing back in the other direction shortly thereafter. OS-specific optimizations can also affect performance, such as with Microsoft Edge on Windows and Google Chrome on Chrome OS.

Windows 11 began rolling out earlier this month, and tech press outlets such as AnandTech and PCWorld have used WebXPRT 3 to evaluate the impact of the new OS—or specific settings in the OS—on browser performance. Our own in-house tests, which we discuss below, show a negligible impact on browser performance when updating our test system from Windows 10 to Windows 11. It’s important to note that depending on a system’s hardware setup, the impact might be more significant in certain scenarios. For more information about such scenarios, we encourage you to read the PCWorld article discussing the impact of the Windows 11 default virtualization-based security (VBS) settings on browser performance in some instances.

In our comparison tests, we used a Dell XPS 13 7390 with an Intel Core i3-10110U processor and 4 GB of RAM. For the Windows 10 tests, we used a clean Windows 10 Home image updated to version 20H2 (19042.1165). For the Windows 11 tests, we updated the system to Windows 11 Home version 21H2 (22000.282). On each OS version, we ran WebXPRT 3 three times on the latest versions of five browsers: Brave, Google Chrome, Microsoft Edge, Mozilla Firefox, and Opera. For each browser, the score we post below is the median of the three test runs.
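
As a quick illustration of how we report scores, here is a minimal sketch of the median-of-three calculation; the run scores in it are hypothetical placeholders, not numbers from our testing.

  import statistics

  # Three hypothetical WebXPRT 3 run scores for one browser on one OS
  # image; placeholders only, not results from our tests.
  runs = [183, 187, 185]

  # Reporting the median of three runs keeps a single unusually fast or
  # slow run from skewing the reported score the way an average would.
  print(statistics.median(runs))  # 185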

In our last round of tests on Windows 10, Firefox was the clear winner. Three of the Chromium-based browsers (Chrome, Edge, and Opera) produced very close scores, and Brave lagged by about 7 percent. In this round of Windows 10 testing, performance improved slightly across every browser, and Google Chrome edged ahead of Firefox.

In our Windows 11 testing, we were interested to find that, without exception, browser scores were slightly lower than in our Windows 10 testing. However, none of the decreases were statistically significant, and most users performing daily tasks are unlikely to notice that degree of difference.

Have you observed any significant differences in WebXPRT 3 scores after upgrading to Windows 11? If so, let us know!

Justin

Testing XPRT compatibility with Windows 11

Last week, Microsoft announced that the Windows 11 GA build will officially launch on Tuesday, October 5, earlier than the initial late-2021 estimate. Windows 11 will ship on select new laptops and roll out to existing Windows 10 PCs that satisfy specific system requirements, but only some of those PCs will be eligible for the update right away. Through a phased Windows Update process, additional Windows 10 PCs will gain access to the update throughout the first half of 2022.

Between the phased Windows 11 rollout and the pledge Microsoft has made to continue Windows 10 support through October 2025, it will likely be a while before the majority of Windows users transition to the new version. We hope the transition period will go smoothly for the XPRTs. However, because we designed three of our benchmarks to run on Windows 10 (HDXPRT 4, TouchXPRT 2016, and AIXPRT), we might encounter compatibility issues with Windows 11.

Over the coming weeks, we’ll be testing HDXPRT 4, TouchXPRT 2016, and AIXPRT on beta versions of Windows 11, and we’ll test again after the GA launch. In addition to obvious compatibility issues and test failures, we’ll note any changes we need to make to our documentation to account for differences in the Windows 11 installation or test processes.

We hope that testers will be able to successfully use all three benchmarks on both OS versions throughout the transition process. If problems arise, we will keep our blog readers informed while exploring solutions. As always, we’re also open to feedback from the community, so if you are participating in the Windows Insider Program and have encountered Windows 11 beta compatibility issues with any of the Windows-focused XPRTs, please let us know!

Justin

HDXPRT 4 v1.2 and the HDXPRT 4 source code package are available

This week, we have good news for HDXPRT 4 testers. A few weeks ago, we discussed the fact that Adobe removed the trial version of Adobe Photoshop Elements (PSE) 2018 from the PSE download page. HDXPRT 4 used PSE 2018 for the Edit Photos scenario, so this change meant that new HDXPRT testers would not be able to successfully install and run the benchmark.

Fortunately, we were able to adapt the Edit Photos scripts to use the new trial version of PSE 2020, and have incorporated those changes in an updated HDXPRT 4 build (v1.2). It’s available for download on HDXPRT.com, along with an updated user manual. Apart from slightly different instructions for installing the trial version of PSE 2020, all aspects of the installation and test process remain the same. We tested the new build and found that individual workload and overall scores did not vary significantly, so scores from the new build will be comparable to existing HDXPRT 4 scores.

We also posted the HDXPRT 4 source code and build instructions on the HDXPRT tab in the Members’ Area (login required). If you’d like to review XPRT source code, but haven’t yet joined the community, we encourage you to join! Registration is quick and easy, and if you work for a company or organization with an interest in benchmarking, you can join for free. Simply fill out the form with your company e-mail address and select the option to be considered for a free membership. We’ll contact you to verify the address and then activate your membership.

We apologize to HDXPRT testers for the inconvenience over the last several weeks, and we thank you for your patience while we worked on a solution. If you have any questions about HDXPRT or the community, please feel free to ask!

Justin

Understanding AIXPRT’s default number of requests

A few weeks ago, we discussed how AIXPRT testers can adjust the key variables of batch size, level of precision, and number of concurrent instances by editing the JSON test configuration file in the AIXPRT/Config directory. In addition to those key variables, the config file contains a variable called “total_requests” whose default setting differs depending on the AIXPRT test package you choose. This setting can significantly affect a test run, so it’s important for testers to know how it works.

The total_requests variable specifies how many inference requests AIXPRT will send to a network (e.g., ResNet-50) during one test iteration at a given batch size (e.g., batch 1, 2, or 4). This simulates the inference demand that end users place on the system. Because we designed AIXPRT to run on different types of hardware, it makes sense to set the default number of requests for each test package to suit the most likely hardware environment for that package.

For example, testing with OpenVINO on Windows aligns more closely with a consumer (i.e., desktop or laptop) scenario than testing with OpenVINO on Ubuntu, which is more typical of server/datacenter testing. Desktop testers require a much lower inference demand than server testers, so the default total_requests settings for the two packages reflect that. The default for the OpenVINO/Windows package is 500, while the default for the OpenVINO/Ubuntu package is 5,000.

Also, setting the number of requests so low that a system finishes each workload in less than 1 second can produce high run-to-run variation, so our default settings represent a lower boundary that will work well for common test scenarios.
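
To make that concern concrete, here is a rough back-of-the-envelope sketch; the per-inference latency is a hypothetical figure, and the calculation ignores batching and concurrency, so treat it as an illustration of scale rather than a model of AIXPRT’s timing.

  # Rough run-length estimate for one test iteration, assuming requests
  # complete one after another. The 2 ms latency is hypothetical; real
  # latency depends on the network, batch size, precision, and hardware.
  per_inference_seconds = 0.002

  for total_requests in (10, 500, 5000):
      run_seconds = total_requests * per_inference_seconds
      print(f"{total_requests:>5,} requests -> ~{run_seconds:.2f} s per iteration")

Under those assumed numbers, the shortest run finishes in well under a second, the regime where timer resolution and background activity can dominate the result.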

Below, we provide the current default total_requests setting for each AIXPRT test package:

  • MXNet: 1,000
  • OpenVINO Ubuntu: 5,000
  • OpenVINO Windows: 500
  • TensorFlow Ubuntu: 100
  • TensorFlow Windows: 10
  • TensorRT Ubuntu: 5,000
  • TensorRT Windows: 500


Testers can adjust these variables in the config file according to their own needs. Finding the optimal combination of machine learning variables for each scenario is often a matter of trial and error, and the default settings represent what we think is a reasonable starting point for each test package.

To adjust the total_requests setting, start by locating and opening the JSON test configuration file in the AIXPRT/Config directory. Below, we show a section of the default config file (CPU_INT8.json) for the OpenVINO-Windows test package (AIXPRT_1.0_OpenVINO_Windows.zip). For each batch size, the total_requests setting appears at the bottom of the list of configurable variables; in this case, the default setting is 500. Change the numerical value of total_requests for each batch size in the config file, save your changes, and close the file.

[Screenshot: the total_requests settings in the default OpenVINO-Windows config file (CPU_INT8.json)]
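
If you are changing several batch-size entries at once, a short script can be quicker than hand-editing. The sketch below is our own illustration, not a tool that ships with AIXPRT: it walks the parsed JSON and sets every total_requests key it finds, so the only things it assumes are the key name and the file path discussed above.

  import json

  CONFIG_PATH = "AIXPRT/Config/CPU_INT8.json"  # the config file for your test package
  NEW_TOTAL_REQUESTS = 1000                    # the value you want to test with

  def set_total_requests(node, value):
      """Recursively set every 'total_requests' key in the parsed config.

      Walking the whole tree avoids hard-coding the exact schema, which
      differs between test packages; only the key name is assumed.
      """
      if isinstance(node, dict):
          for key, child in node.items():
              if key == "total_requests":
                  node[key] = value
              else:
                  set_total_requests(child, value)
      elif isinstance(node, list):
          for child in node:
              set_total_requests(child, value)

  with open(CONFIG_PATH) as f:
      config = json.load(f)

  set_total_requests(config, NEW_TOTAL_REQUESTS)

  with open(CONFIG_PATH, "w") as f:
      json.dump(config, f, indent=4)

As always, keep a backup copy of the original config file so you can restore the defaults later.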

Note that if you are running multiple concurrent instances, OpenVINO and TensorRT automatically distribute the number of requests among the instances. MXNet and TensorFlow users must manually allocate the instances in the config file. You can find an example of how to structure manual allocation here. We hope to make this process automatic for all toolkits in a future update.

We hope this information helps you understand the total_requests setting, and why the default values differ from one test package to another. If you have any questions or comments about this or other aspects of AIXPRT, please let us know.

Justin
