BenchmarkXPRT Blog

Category: Windows

Best practices for WebXPRT testing

One of the strengths of WebXPRT is that it’s a remarkably easy benchmark to run. Its upfront simplicity attracts users with a wide range of technical skills—everyone from engineers in cutting-edge OEM labs to veteran tech journalists to everyday folks who simply want to test their gear’s browser performance. With so many different kinds of people running the test each day, it’s certain that at least some of them use very different approaches to testing. In today’s blog, we’re going to share some of the key benchmarking practices we follow in the XPRT lab—and encourage you to consider—to produce the most consistent and reliable WebXPRT scores.

We offer these best practices as tips you might find useful in your testing. Each step relates to evaluating browser performance with WebXPRT, but several of these practices will apply to other benchmarks as well.

  • Test with clean images: In the XPRT lab, we typically use an out-of-box (OOB) method for testing new devices. OOB testing means that, other than running the initial OS and browser updates that users are likely to run after first turning on the device, we change as little as possible before testing. This approach gives the most accurate picture of the performance retail buyers will experience from a new device before they install additional software. That said, the OOB method is not appropriate for certain types of testing, such as when you want to compare largely identical systems or remove as much pre-loaded software as possible, and it is less relevant to users who simply want to see how their device performs as currently configured.
  • Browser updates can have a significant impact: Most people know that different browsers often produce different performance scores on the same system. They may not know that there can be shifts in performance between different versions of the same browser. While most browser updates don’t have a large impact on performance, a few updates have increased (or even decreased) browser performance by a significant amount. For this reason, it’s always important to record and disclose the extended browser version number for each test run. The same principle applies to any other relevant software.
  • Turn off automatic updates: We do our best to eliminate or minimize app and system updates after initial setup. Some vendors are making it more difficult to turn off updates completely, but you should always double-check update settings before testing. On Windows systems, the same considerations apply to turning off User Account Control notifications.
  • Let the system settle: Depending on the system and the OS, a significant amount of system-level activity can be going on in the background after you turn it on. As much as possible, we like to wait for a stable baseline (idle time) of system activity before kicking off a test. If we start testing immediately after booting the system, we often see higher variance in the first run before the scores start to tighten up.
  • Run the test more than once: Because of natural variance, our standard practice in the XPRT lab is to publish a score that represents the median of three to five runs, if not more (see the short sketch after this list). If you run a benchmark only once and the score differs significantly from other published scores, your result could be an outlier that you would not see again under stable testing conditions or over the course of multiple runs.
  • Clear the cache: Browser caching can improve web page performance, including the loading of the types of JavaScript and HTML5 assets that WebXPRT uses in its workloads. Depending on the platform under test, browser caching may or may not significantly change WebXPRT scores, but clearing the cache before testing and between each run can help improve the accuracy and consistency of scores.
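
To make the “run the test more than once” tip concrete, here’s a minimal Python sketch of how you might boil several runs down to a single reported score. The scores in the example are hypothetical placeholders rather than published results, and the helper function is just an illustration; it isn’t part of WebXPRT or our lab tooling.

    # Minimal sketch: report the median of several benchmark runs.
    # The scores below are hypothetical placeholders, not published results.
    from statistics import median

    def reported_score(run_scores):
        """Return the median of three or more runs as the reported score."""
        if len(run_scores) < 3:
            raise ValueError("Collect at least three runs before reporting a score.")
        return median(run_scores)

    runs = [221, 218, 224]  # e.g., WebXPRT overall scores from three runs
    print(f"Runs: {runs} -> reported score (median): {reported_score(runs)}")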

We hope these tips will serve as a good baseline methodology for your WebXPRT testing. If you have any questions about WebXPRT, the other XPRTs, or benchmarking in general, please let us know!

Justin

XPRT possibilities with ChromeOS Flex

Recently, Tom’s Guide published an interesting article about how they used ChromeOS Flex to turn a ten-year-old Apple MacBook Pro into a functioning Chromebook by replacing the laptop’s macOS operating system with ChromeOS. ChromeOS Flex is a free Google tool that allows users to create a bootable USB drive that they can then use to install ChromeOS on a wide variety of hardware platforms that traditionally run other operating systems such as macOS or Windows. Because ChromeOS is a cloud-first, relatively low-overhead operating system, the ChromeOS Flex option could breathe new life into an old laptop that you have lying around.

Never having encountered a MacBook Pro with ChromeOS, we were interested to learn about Tom’s Guide’s experience running XPRT benchmarks in this new environment. WebXPRT 4, WebXPRT 3, and the CrXPRT 2 performance test apparently ran without any issues, but we have not yet seen a CrXPRT 2 battery life result from a ChromeOS Flex environment. We plan to experiment with this soon.

We were happy to publish the results on our site, and will consider any ChromeOS Flex results we receive for publication. If you submit results from ChromeOS Flex testing, we ask that you use the “Additional information” field in the results submission form to clarify that you ran the tests in a ChromeOS Flex environment. This will prevent any possible confusion when we see a submission that lists a traditional macOS or Windows hardware platform along with a ChromeOS version number.

Do you have experience running CrXPRT or WebXPRT with ChromeOS Flex? We’d love to hear about it!

Justin

HDXPRT: See how your Windows PC handles real-world media tasks

Many of our blog readers first encountered the XPRTs when reading about a specific benchmark, such as WebXPRT, in a device review. Because these folks might be unfamiliar with our other benchmarks, we like to occasionally “reintroduce” individual XPRTs. This week, we invite you to get to know HDXPRT.

HDXPRT, which stands for High-Definition Experience & Performance Ratings Test, was the first benchmark published by the HDXPRT Development Community, which later became the BenchmarkXPRT Development Community. HDXPRT 4, the latest version, evaluates the performance of Windows 10 and Windows 11 devices while handling real-world media tasks such as photo editing, video conversion, and music editing. HDXPRT uses real commercial applications, such as Adobe Photoshop Elements and CyberLink MediaEspresso, to complete its workloads. The benchmark then produces easy-to-understand results that are relevant to buyers shopping for new Windows systems.

The HDXPRT 4 setup process takes about 30 minutes on most systems. The length of the test can vary significantly depending on the speed of the system, but for most PCs that are less than a few years old, a full three-iteration test cycle takes under two hours.

HDXPRT is a useful tool for anyone who wants to evaluate the real-world, content-creation capabilities of a Windows PC. To see test scores from a variety of Windows devices, go to HDXPRT.com and click View Results.

Want to run HDXPRT?

Download HDXPRT from HDXPRT.com. The HDXPRT user manual provides information on minimum system requirements, as well as step-by-step instructions for configuring your system and kicking off a test.

Want to dig into the details?

The HDXPRT source code is available upon request. If you’d like to access the source code, please send your request to benchmarkxprtsupport@principledtechnologies.com. Build instructions are also available.

If you haven’t used HDXPRT before, give it a shot and let us know what you think!

Justin

Adobe PSE 2020 and HDXPRT 4

HDXPRT 4, our benchmark for assessing Windows performance on real-world media tasks, runs tests that use real commercial applications such as Adobe Photoshop Elements (PSE) 2020. Last fall, we informed HDXPRT testers that Adobe had started requiring a user ID to download the free Adobe Photoshop Elements 2020 trial package. Previously, testers could download the trial without setting up an account.

Recently, Adobe made additional changes to the access path for the PSE 2020 installation package. The package is no longer available on the PSE downloads page, but users who previously purchased their copy or registered it with Adobe can access the package on another page. However, this approach does not work for users who want to temporarily use the trial version for HDXPRT 4 testing.

We have found a third-party location, ProDesignTools, that currently offers a free, straightforward PSE 2020 installation package download with no requirements for registration or transmission of personal information. In our testing so far, the installation package (PhotoshopElements_2020_LS30_win64_ESD.zip) has been functioning as expected, and HDXPRT 4 is running the PSE-based workloads without any issues.

Unfortunately, we cannot guarantee that ProDesignTools will continue to offer a free PSE 2020 installation package download, and we’re not aware of an alternative Adobe download path at this time. We apologize for the inconvenience!

Justin

Using WebXPRT 3 to compare the performance of popular browsers in Windows 10 and Windows 11

People choose a default web browser based on several factors. Speed is sometimes the deciding factor, but privacy settings, memory load, ecosystem integration, and web app capabilities can also come into play. Regardless of the motivations behind a person’s go-to browser choice, the dominance of software-as-a-service (SaaS) computing means that new updates are always right around the corner. In previous blog posts, we’ve talked about how browser speed can increase or decrease significantly after an update, only to swing back in the other direction shortly thereafter. OS-specific optimizations can also affect performance, such as with Microsoft Edge on Windows and Google Chrome on Chrome OS.

Windows 11 began rolling out earlier this month, and tech press outlets such as AnandTech and PCWorld have used WebXPRT 3 to evaluate the impact of the new OS—or specific settings in the OS—on browser performance. Our own in-house tests, which we discuss below, show a negligible impact on browser performance when updating our test system from Windows 10 to Windows 11. It’s important to note that depending on a system’s hardware setup, the impact might be more significant in certain scenarios. For more information about such scenarios, we encourage you to read the PCWorld article discussing the impact of the Windows 11 default virtualization-based security (VBS) settings on browser performance in some instances.

In our comparison tests, we used a Dell XPS 13 7390 with an Intel Core i3-10110U processor and 4 GB of RAM. For the Windows 10 tests, we used a clean Windows 10 Home image updated to version 20H2 (19042.1165). For the Windows 11 tests, we updated the system to Windows 11 Home version 21H2 (22000.282). On each OS version, we ran WebXPRT 3 three times on the latest versions of five browsers: Brave, Google Chrome, Microsoft Edge, Mozilla Firefox, and Opera. For each browser, the score we report below is the median of the three test runs.

In our last round of tests on Windows 10, Firefox was the clear winner. Three of the Chromium-based browsers (Chrome, Edge, and Opera) produced very close scores, and the performance of Brave lagged by about 7 percent. In this round of Windows 10 testing, performance on every browser improved slightly, with Google Chrome taking a slight lead over Firefox.

In our Windows 11 testing, we were interested to find that without exception, browser scores were slightly lower than in Windows 10 testing. However, none of the decreases were statistically significant. Most users performing daily tasks are unlikely to notice that degree of difference.
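
If you’d like to make a similar before-and-after comparison on your own system, the short Python sketch below shows one rough way to compare the medians of two sets of runs and weigh the difference against run-to-run spread. The scores are hypothetical placeholders, not our published results, and this is simply an illustration rather than the exact analysis we performed.

    # Minimal sketch: compare median WebXPRT 3 scores from two OS builds.
    # All scores are hypothetical placeholders, not our published results.
    from statistics import median

    win10_runs = [206, 208, 205]  # hypothetical scores for one browser on Windows 10
    win11_runs = [204, 203, 206]  # hypothetical scores for the same browser on Windows 11

    m10, m11 = median(win10_runs), median(win11_runs)
    change_pct = (m11 - m10) / m10 * 100
    spread = max(max(win10_runs) - min(win10_runs),
                 max(win11_runs) - min(win11_runs))

    print(f"Windows 10 median: {m10}, Windows 11 median: {m11}")
    print(f"Change: {change_pct:+.1f}% (largest run-to-run spread: {spread} points)")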

Have you observed any significant differences in WebXPRT 3 scores after upgrading to Windows 11? If so, let us know!

Justin

Testing XPRT compatibility with Windows 11

Last week, Microsoft announced that the Windows 11 GA build will officially launch on Tuesday, October 5, earlier than the initial late-2021 estimate. The update will start rolling out with select new laptops and existing Windows 10 PCs that satisfy specific system requirements, and only some Windows 10 PCs will be eligible for the update right away. Through a phased Windows Update process, additional Windows 10 PCs will be able to access the update throughout the first half of 2022.

Between the phased Windows 11 rollout and the pledge Microsoft has made to continue Windows 10 support through October 2025, it will likely be a while before the majority of Windows users transition to the new version. We hope the transition period will go smoothly for the XPRTs. However, because we designed three of our benchmarks to run on Windows 10 (HDXPRT 4, TouchXPRT 2016, and AIXPRT), we might encounter compatibility issues with Windows 11.

Over the coming weeks, we’ll be testing HDXPRT 4, TouchXPRT 2016, and AIXPRT on beta versions of Windows 11, and we’ll test again after the GA launch. In addition to watching for obvious compatibility issues and test failures, we’ll note any changes we need to make to our documentation to account for differences in the Windows 11 installation or test processes.

We hope that testers will be able to successfully use all three benchmarks on both OS versions throughout the transition process. If problems arise, we will keep our blog readers informed while exploring solutions. As always, we’re also open to feedback from the community, so if you are participating in the Windows Insider Program and have encountered Windows 11 beta compatibility issues with any of the Windows-focused XPRTs, please let us know!

Justin
