
Tag Archives: Windows

Best practices for WebXPRT testing

One of the strengths of WebXPRT is that it’s a remarkably easy benchmark to run. Its upfront simplicity attracts users with a wide range of technical skills—everyone from engineers in cutting-edge OEM labs to veteran tech journalists to everyday folks who simply want to test their gear’s browser performance. With so many different kinds of people running the test each day, it’s certain that at least some of them use very different approaches to testing. In today’s blog, we’re going to share some of the key benchmarking practices we follow in the XPRT lab—and encourage you to consider—in order to produce the most consistent and reliable WebXPRT scores.

We offer these best practices as tips you might find useful in your testing. Each step relates to evaluating browser performance with WebXPRT, but several of these practices will apply to other benchmarks as well.

  • Test with clean images: In the XPRT lab, we typically use an out-of-box (OOB) method for testing new devices. OOB testing means that, other than running the initial OS and browser updates that users are likely to run after first turning on the device, we change as little as possible before testing. This approach gives the most accurate picture of the performance retail buyers will see when they first purchase a device, before they install additional software. That said, the OOB method is not appropriate for certain types of testing, such as when you want to compare largely identical systems or when you want to remove as much pre-loaded software as possible, and it is less relevant to users who want to see how their device performs as currently configured.
  • Browser updates can have a significant impact: Most people know that different browsers often produce different performance scores on the same system. They may not know that there can be shifts in performance between different versions of the same browser. While most browser updates don’t have a large impact on performance, a few updates have increased (or even decreased) browser performance by a significant amount. For this reason, it’s always important to record and disclose the extended browser version number for each test run. The same principle applies to any other relevant software.
  • Turn off automatic updates: We do our best to eliminate or minimize app and system updates after initial setup. Some vendors are making it more difficult to turn off updates completely, but you should always double-check update settings before testing. On Windows systems, the same considerations apply to turning off User Account Control notifications.
  • Let the system settle: Depending on the system and the OS, a significant amount of system-level activity can be going on in the background after you turn it on. As much as possible, we like to wait for a stable baseline (idle time) of system activity before kicking off a test. If we start testing immediately after booting the system, we often see higher variance in the first run before the scores start to tighten up.
  • Run the test more than once: Because of natural variance, our standard practice in the XPRT lab is to publish a score that represents the median of three to five runs, if not more. If you run a benchmark only once and the score differs significantly from other published scores, your result could be an outlier that you would not see again under stable testing conditions or over the course of multiple runs.
  • Clear the cache: Browser caching can improve web page performance, including the loading of the types of JavaScript and HTML5 assets that WebXPRT uses in its workloads. Depending on the platform under test, browser caching may or may not significantly change WebXPRT scores, but clearing the cache before testing and between each run can help improve the accuracy and consistency of scores.
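The median-of-runs practice described above is easy to script. Here is a minimal sketch in Python; the score values are made up purely for illustration:

```python
from statistics import median, pstdev

# Hypothetical overall WebXPRT scores from five consecutive runs
# on the same system (higher is better).
scores = [253, 249, 255, 251, 250]

# Report the median rather than the mean so that a single outlier
# run does not skew the published result.
print("median score:", median(scores))

# A quick spread check: high run-to-run variation suggests the
# system had not settled or background activity interfered.
print("standard deviation:", round(pstdev(scores), 2))
```

A large spread relative to the median is a good cue to let the system settle and run the benchmark a few more times before publishing a number.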

We hope these tips will serve as a good baseline methodology for your WebXPRT testing. If you have any questions about WebXPRT, the other XPRTs, or benchmarking in general, please let us know!

Justin

XPRT possibilities with ChromeOS Flex

Recently, Tom’s Guide published an interesting article about how they used ChromeOS Flex to turn a ten-year-old Apple MacBook Pro into a functioning Chromebook by replacing the laptop’s macOS operating system with ChromeOS. ChromeOS Flex is a free Google tool that allows users to create a bootable USB drive that they can then use to install ChromeOS on a wide variety of hardware platforms that traditionally run other operating systems such as macOS or Windows. Because ChromeOS is a cloud-first, relatively low-overhead operating system, the ChromeOS Flex option could breathe new life into an old laptop that you have lying around.

Never having encountered a MacBook Pro running ChromeOS, we were interested to learn about Tom’s Guide’s experience running XPRT benchmarks in this new environment. WebXPRT 4, WebXPRT 3, and the CrXPRT 2 performance test apparently ran without any issues, but we have not yet seen a CrXPRT 2 battery life result from a ChromeOS Flex environment. We plan to experiment with this soon.

We were happy to publish the results on our site, and will consider any ChromeOS Flex results we receive for publication. If you submit results from ChromeOS Flex testing, we ask that you use the “Additional information” field in the results submission form to clarify that you ran the tests in a ChromeOS Flex environment. This will prevent any possible confusion when we see a submission that lists a traditional macOS or Windows hardware platform along with a ChromeOS version number.

Do you have experience running CrXPRT or WebXPRT with ChromeOS Flex? We’d love to hear about it!

Justin

HDXPRT: See how your Windows PC handles real-world media tasks

Many of our blog readers first encountered the XPRTs when reading about a specific benchmark, such as WebXPRT, in a device review. Because these folks might be unfamiliar with our other benchmarks, we like to occasionally “reintroduce” individual XPRTs. This week, we invite you to get to know HDXPRT.

HDXPRT, which stands for High-Definition Experience & Performance Ratings Test, was the first benchmark published by the HDXPRT Development Community, which later became the BenchmarkXPRT Development Community. HDXPRT 4, the latest version, evaluates the performance of Windows 10 and Windows 11 devices while handling real-world media tasks such as photo editing, video conversion, and music editing. HDXPRT uses real commercial applications, such as Photoshop and MediaEspresso, to complete its workloads. The benchmark then produces easy-to-understand results that are relevant to buyers shopping for new Windows systems.

The HDXPRT 4 setup process takes about 30 minutes on most systems. The length of the test can vary significantly depending on the speed of the system, but for most PCs that are less than a few years old, a full three-iteration test cycle takes under two hours.

HDXPRT is a useful tool for anyone who wants to evaluate the real-world, content-creation capabilities of a Windows PC. To see test scores from a variety of Windows devices, go to HDXPRT.com and click View Results.

Want to run HDXPRT?

Download HDXPRT from HDXPRT.com. The HDXPRT user manual provides information on minimum system requirements, as well as step-by-step instructions for configuring your system and kicking off a test.

Want to dig into the details?

The HDXPRT source code is available upon request. If you’d like to access the source code, please send your request to benchmarkxprtsupport@principledtechnologies.com. Build instructions are also available.

If you haven’t used HDXPRT before, give it a shot and let us know what you think!

Justin

Testing XPRT compatibility with Windows 11

Last week, Microsoft announced that the Windows 11 GA build will officially launch on Tuesday, October 5, earlier than the initial late-2021 estimate. The update will start rolling out with select new laptops and existing Windows 10 PCs that satisfy specific system requirements, and only some Windows 10 PCs will be eligible for the update right away. Through a phased Windows Update process, additional Windows 10 PCs will gain access to the update throughout the first half of 2022.

Between the phased Windows 11 rollout and the pledge Microsoft has made to continue Windows 10 support through October 2025, it will likely be a while before the majority of Windows users transition to the new version. We hope the transition period will go smoothly for the XPRTs. However, because we designed three of our benchmarks to run on Windows 10 (HDXPRT 4, TouchXPRT 2016, and AIXPRT), we might encounter compatibility issues with Windows 11.

Over the coming weeks, we’ll be testing HDXPRT 4, TouchXPRT 2016, and AIXPRT on beta versions of Windows 11, and we’ll test again after the GA launch. In addition to obvious compatibility issues and test failures, we’ll note any changes we need to make to our documentation to account for differences in the Windows 11 installation or test processes.

We hope that testers will be able to successfully use all three benchmarks on both OS versions throughout the transition process. If problems arise, we will keep our blog readers informed while exploring solutions. As always, we’re also open to feedback from the community, so if you are participating in the Windows Insider Program and have encountered Windows 11 beta compatibility issues with any of the Windows-focused XPRTs, please let us know!

Justin

Understanding AIXPRT’s default number of requests

A few weeks ago, we discussed how AIXPRT testers can adjust the key variables of batch size, levels of precision, and number of concurrent instances by editing the JSON test configuration file in the AIXPRT/Config directory. In addition to those key variables, there is another variable in the config file called “total_requests” that has a different default setting depending on the AIXPRT test package you choose. This setting can significantly affect a test run, so it’s important for testers to know how it works.

The total_requests variable specifies how many inference requests AIXPRT will send to a network (e.g., ResNet-50) during one test iteration at a given batch size (e.g., Batch 1, 2, 4, etc.). This simulates the inference demand that end users place on the system. Because we designed AIXPRT to run on different types of hardware, it makes sense to set the default number of requests for each test package to suit the most likely hardware environment for that package.

For example, testing with OpenVINO on Windows aligns more closely with a consumer (i.e., desktop or laptop) scenario than testing with OpenVINO on Ubuntu, which is more typical of server/datacenter testing. Desktop testers require a much lower inference demand than server testers, so the default total_requests settings for the two packages reflect that. The default for the OpenVINO/Windows package is 500, while the default for the OpenVINO/Ubuntu package is 5,000.

Also, setting the number of requests so low that a system finishes each workload in less than 1 second can produce high run-to-run variation, so our default settings represent a lower boundary that will work well for common test scenarios.

Below, we provide the current default total_requests setting for each AIXPRT test package:

  • MXNet: 1,000
  • OpenVINO Ubuntu: 5,000
  • OpenVINO Windows: 500
  • TensorFlow Ubuntu: 100
  • TensorFlow Windows: 10
  • TensorRT Ubuntu: 5,000
  • TensorRT Windows: 500


Testers can adjust these variables in the config file according to their own needs. Finding the optimal combination of machine learning variables for each scenario is often a matter of trial and error, and the default settings represent what we think is a reasonable starting point for each test package.

To adjust the total_requests setting, start by locating and opening the JSON test configuration file in the AIXPRT/Config directory. Below, we show a section of the default config file (CPU_INT8.json) for the OpenVINO-Windows test package (AIXPRT_1.0_OpenVINO_Windows.zip). For each batch size, the total_requests setting appears at the bottom of the list of configurable variables. In this case, the default setting is 500. Change the total_requests numerical value for each batch size in the config file, save your changes, and close the file.

[Screenshot: the total_requests setting in the config file]
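For readers following along without the file in front of them, the relevant section looks roughly like the sketch below. The workload name, field names, and nesting here are hypothetical stand-ins; only total_requests itself, and the OpenVINO/Windows default of 500, come from the discussion above. Consult the actual CPU_INT8.json in your package for the real schema:

```json
{
  "workloads": [
    {
      "name": "resnet-50",
      "runs": [
        {
          "batch_size": 1,
          "concurrent_instances": 1,
          "precision": "int8",
          "total_requests": 500
        },
        {
          "batch_size": 2,
          "concurrent_instances": 1,
          "precision": "int8",
          "total_requests": 500
        }
      ]
    }
  ]
}
```

To raise or lower the inference demand, edit the total_requests value in each batch-size entry, then save the file before starting a run.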

Note that if you are running multiple concurrent instances, OpenVINO and TensorRT automatically distribute the number of requests among the instances. MXNet and TensorFlow users must manually allocate the instances in the config file. You can find an example of how to structure manual allocation here. We hope to make this process automatic for all toolkits in a future update.

We hope this information helps you understand the total_requests setting, and why the default values differ from one test package to another. If you have any questions or comments about this or other aspects of AIXPRT, please let us know.

Justin

A necessary update for HDXPRT 4

If you tried to install HDXPRT 4 over the past few days, you likely noticed that Adobe Photoshop Elements 2018, the version the Edit Photos scenario uses, is no longer available on the Adobe Photoshop Elements download page. In the past, Adobe provided access to multiple older versions of its software for some time after a new release, but the company appears to be moving away from that practice. We have not yet found an alternative way for users to download PSE 2018 on a trial basis. Unfortunately, this means testers will be temporarily unable to complete the HDXPRT 4 installation process.

We’re adapting the scripts in the HDXPRT 4 Edit Photos scenario to use PSE 2020. As soon as we finish, we’ll start testing, with a focus on determining whether the change significantly affects the individual workload or overall scores.

We apologize for the inconvenience that this issue causes for HDXPRT testers. We’ll continue to update the community here in the blog about our progress with the new build. If you have any questions or comments, please let us know.

Justin
