BenchmarkXPRT Blog

Tag Archives: Performance

Best practices for WebXPRT testing

One of the strengths of WebXPRT is that it’s a remarkably easy benchmark to run. Its upfront simplicity attracts users with a wide range of technical skills—everyone from engineers in cutting-edge OEM labs to veteran tech journalists to everyday folks who simply want to test their gear’s browser performance. With so many different kinds of people running the test each day, it’s certain that at least some of them use very different approaches to testing. In today’s blog, we’re going to share some of the key benchmarking practices we follow in the XPRT lab—and encourage you to consider—in order to produce the most consistent and reliable WebXPRT scores.

We offer these best practices as tips you might find useful in your testing. Each step relates to evaluating browser performance with WebXPRT, but several of these practices will apply to other benchmarks as well.

  • Test with clean images: In the XPRT lab, we typically use an out-of-box (OOB) method for testing new devices. OOB testing means that, other than running the initial OS and browser updates that users are likely to run after first turning on a device, we change as little as possible before testing. This approach gives the most accurate picture of the performance retail buyers are likely to see when they first purchase a device, before they install additional software. That said, the OOB method is not appropriate for certain types of testing, such as when you want to compare largely identical systems or when you want to remove as much pre-loaded software as possible, and it is less relevant to users who want to see how their device performs as currently configured.
  • Browser updates can have a significant impact: Most people know that different browsers often produce different performance scores on the same system. They may not know that there can be shifts in performance between different versions of the same browser. While most browser updates don’t have a large impact on performance, a few updates have increased (or even decreased) browser performance by a significant amount. For this reason, it’s always important to record and disclose the extended browser version number for each test run. The same principle applies to any other relevant software.
  • Turn off automatic updates: We do our best to eliminate or minimize app and system updates after initial setup. Some vendors are making it more difficult to turn off updates completely, but you should always double-check update settings before testing. On Windows systems, the same considerations apply to turning off User Account Control notifications.
  • Let the system settle: Depending on the system and the OS, a significant amount of system-level activity can be going on in the background after you turn it on. As much as possible, we like to wait for a stable baseline (idle time) of system activity before kicking off a test. If we start testing immediately after booting the system, we often see higher variance in the first run before the scores start to tighten up.
  • Run the test more than once: Because of natural variance, our standard practice in the XPRT lab is to publish a score that represents the median of three to five runs, if not more. If you run a benchmark only once and the score differs significantly from other published scores, your result could be an outlier that you would not see again under stable testing conditions or over the course of multiple runs.
  • Clear the cache: Browser caching can improve web page performance, including the loading of the types of JavaScript and HTML5 assets that WebXPRT uses in its workloads. Depending on the platform under test, browser caching may or may not significantly change WebXPRT scores, but clearing the cache before testing and between each run can help improve the accuracy and consistency of scores.
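As a concrete illustration of the multi-run practice above, here's a minimal Python sketch for summarizing a set of runs. The function name, the sample scores, and the five-percent spread threshold are our own illustrative choices, not part of WebXPRT; the idea is simply to publish the median and to flag a run set whose spread suggests unstable testing conditions.

```python
import statistics

def summarize_runs(scores, spread_threshold=0.05):
    """Summarize repeated benchmark runs: report the median score and flag
    the set if the spread between runs exceeds the given fraction of the median."""
    if len(scores) < 3:
        raise ValueError("collect at least three runs before publishing a score")
    median = statistics.median(scores)
    spread = (max(scores) - min(scores)) / median
    return {
        "median": median,
        "spread": round(spread, 3),
        "stable": spread <= spread_threshold,  # wide spread -> investigate before publishing
    }

# Example: five hypothetical WebXPRT runs on one system
runs = [301, 305, 303, 298, 304]
print(summarize_runs(runs))
```

If the `stable` flag comes back false, that's a cue to recheck update settings, let the system settle, and rerun rather than publish an outlier.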

We hope these tips will serve as a good baseline methodology for your WebXPRT testing. If you have any questions about WebXPRT, the other XPRTs, or benchmarking in general, please let us know!

Justin

February 2025 WebXPRT 4 browser performance comparisons

Once or twice per year, we refresh our ongoing series of WebXPRT comparison tests to see if software version updates have reordered the performance rankings of popular web browsers. We published our most recent comparison last June, when we used WebXPRT 4 to compare the performance of five browsers—Brave, Google Chrome, Microsoft Edge, Mozilla Firefox, and Opera—on a Lenovo ThinkPad T14s Gen 3. When assessing performance differences, it’s worth noting that all the browsers—except for Firefox—are built on a Chromium foundation. In the last round of tests, the scores were very tight, with a difference of only four percent between the last-place browser (Brave) and the winner (Chrome). Firefox’s score landed squarely in the middle of the pack.

Recently, we conducted a new set of tests to see how performance scores may have changed. To maintain continuity with our last comparison, we stuck with the same ThinkPad T14s as our reference system. That laptop is still in line with current mid-range laptops, so our comparison scores are likely to fall within the range of scores we would see from a typical user today. The ThinkPad is equipped with an Intel Core i7-1270P processor and 16 GB of RAM, and it’s running Windows 11 Pro, version 23H2 (22631.4890).

Before testing, we installed all current Windows updates, and we updated each of the browsers to the latest available stable version. After the update process was complete, we turned off updates to prevent any interference with test runs. We ran WebXPRT 4 five times on each of the five browsers. In Figure 1 below, each browser’s score is the median of the five test runs.

In this round of tests, the gap widened a bit between first and last place scores, with a difference of just over six percent between the lowest median score of 303 (Brave) and the highest median score of 322 (Firefox).

Figure 1: The median scores from running WebXPRT 4 five times with each browser on the Lenovo ThinkPad T14s Gen 3.
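For readers who want to reproduce the percentage above, the gap between the lowest and highest median scores is a simple relative difference. This short Python check (using the Brave and Firefox medians reported above; the function name is our own) confirms the just-over-six-percent figure:

```python
def percent_gap(lowest: float, highest: float) -> float:
    """Relative difference of the highest median score versus the lowest, in percent."""
    return (highest - lowest) / lowest * 100

# Medians from this round: Brave scored 303 (lowest), Firefox 322 (highest).
gap = percent_gap(303, 322)
print(f"{gap:.2f}%")  # prints "6.27%"
```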

In this round of tests, the distribution of scores indicates that most users would not see a significant performance difference if they switched between the latest versions of these browsers. The one exception may be a change from the latest version of Brave to the latest version of Firefox. Even then, the quality of your browsing experience will often depend on other factors. The types of things you do on the web (e.g., gaming, media consumption, or multi-tab browsing), the type and number of extensions you’ve installed, and how frequently the browsers issue updates and integrate new technologies—among other things—can all affect browser performance over time. It’s important to keep such variables in mind when thinking about how browser performance comparison results may translate to your everyday web experience.

Have you tried using WebXPRT 4 in your own browser performance comparison? If so, we’d love to hear about it! Also, please let us know if there are other types of WebXPRT comparisons you’d like to see!

Justin

Shop confidently this holiday season with the XPRTs!

The holiday shopping season is upon us, and trying to find the right tech gift for your friends or loved ones (or yourself!) can be a daunting task. If you’re considering new phones, tablets, Chromebooks, laptops, or desktops as gifts this year—and are unsure where to get reliable device information—the XPRTs can help!

The XPRTs provide industry-trusted and time-tested measures of a device’s performance that can help you cut through the fog of competing marketing claims. For example, instead of guessing whether the performance of a new gaming laptop justifies its price, you can use its WebXPRT performance score to see how it stacks up against both older models and competitors while tackling everyday tasks.

A great place to start looking for device scores is our XPRT results browser, which lets you access our database of more than 3,700 test results—across all the XPRT benchmarks and hundreds of devices—from over 155 sources, including major tech review publications around the world, OEMs, our own Principled Technologies (PT) testing, and independent submissions. For tips on how to use the XPRT results browser, check out this blog post.

Another way to view information in our results database is the WebXPRT 4 results viewer, an information-packed, interactive tool we created to help people explore data from the set of almost 800 WebXPRT 4 results we’ve curated and published to date on our site. You’ll find detailed instructions for using the viewer in this blog post.

If you’re considering a popular device, it’s likely that a recent tech press review includes an XPRT score for it. To find those scores, go to your favorite tech review site and search for “XPRT,” or enter the name of the device and the appropriate XPRT (e.g., “iPhone” and “WebXPRT”) in a search engine. Here are a few recent tech reviews that used the XPRTs to evaluate popular devices:

In addition to XPRT-related resources in the tech press, here at PT we frequently publish reports that evaluate the performance of hot new consumer devices, and many of those reports include WebXPRT scores. For example, check out the results from our extensive testing of a Dell Latitude 7450 AI PC or our in-depth evaluation of three new Lenovo ThinkPad and ThinkBook laptops.

The XPRTs can help you make better-informed and more confident tech purchases this holiday season. We hope you’ll find the data you need on our site or in an XPRT-related tech review. If you have any questions about the XPRTs, XPRT scores, or the results database, please feel free to ask!

Justin

Shopping for back-to-school tech? The XPRTs can help!

For many students, the first day of school is just around the corner, and it’s now time to shop for new tech devices that can help set them up for success in the coming year. The tech marketplace can be confusing, however, with so many brands, options, and competing claims to sort through.

Fortunately, the XPRTs are here to help!

Whether you’re shopping for a new phone, tablet, Chromebook, laptop, or desktop, the XPRTs can provide industry-trusted performance scores that can give you confidence that you’re making a smart purchasing decision.

The WebXPRT 4 results viewer is a good place to start looking for device scores. The viewer displays WebXPRT 4 scores from over 700 devices—including many of the latest releases—and we’re adding new scores all the time. To learn more about the viewer’s capabilities and how you can use it to compare devices, check out this blog post.

Another resource we offer is the XPRT results browser. The browser is the most efficient way to access the XPRT results database, which currently holds more than 3,700 test results from over 150 sources, including major tech review publications around the world, manufacturers, and independent testers. It offers a wealth of current and historical performance data across all the XPRT benchmarks and hundreds of devices. You can read more about how to use the results browser here.

Also, if you’re considering a popular device, there’s a good chance that a recent tech review includes an XPRT score for that device. There are two quick ways to find these reviews: You can either (1) search for “XPRT” on your preferred tech review site or (2) use a search engine and input the device name and XPRT name, such as “Dell XPS” and “WebXPRT.”

Here are a few recent tech reviews that use one of the XPRTs to evaluate a popular device:

Lastly, here at Principled Technologies, we frequently publish reports that evaluate the performance of hot new consumer devices, and many of those reports include WebXPRT scores. For example, check out our extensive testing of HP ZBook G10 mobile workstations or our detailed comparison of Lenovo ThinkPad, ThinkBook, and ThinkCentre devices to their Apple Mac counterparts.

The XPRTs can help anyone stuck in the back-to-school shopping blues make better-informed and more confident tech purchases. As this new school year begins, we hope you’ll find the data you need on our site or in an XPRT-related tech review. If you have any questions about the XPRTs, XPRT scores, or the results database, please feel free to ask!

Justin

Best practices in benchmarking

From time to time, a tester writes to ask for help determining why they see different WebXPRT scores on two systems that have the same hardware configuration. The scores sometimes differ by a significant percentage. This can happen for many reasons, including different software stacks, but score variability can also result from different testing behavior and environments. While a small amount of variability is normal, these types of questions provide an opportunity to talk about the basic benchmarking practices we follow in the XPRT lab to produce the most consistent and reliable scores.

Below, we list a few basic best practices you might find useful in your testing. Most of them relate to evaluating browser performance with WebXPRT, but several of these practices apply to other benchmarks as well.

  • Test with clean images: We typically use an out-of-box (OOB) method for testing new devices in the XPRT lab. OOB testing means that, other than running the initial OS and browser updates that users are likely to run after first turning on the device, we change as little as possible before testing. We want to assess the performance that buyers are likely to see when they first purchase the device, before installing additional apps and utilities. This is the best way to provide an accurate assessment of the performance retail buyers will experience. While OOB is not appropriate for certain types of testing, the key is not to test a device that’s bogged down with programs that will influence results.
  • Turn off automatic updates: We do our best to eliminate or minimize app and system updates after initial setup. Some vendors are making it more difficult to turn off updates completely, but you should always double-check update settings before testing.
  • Get a baseline for system processes: Depending on the system and the OS, a significant amount of system-level activity can be going on in the background after you turn it on. As much as possible, we like to wait for a stable (idle) baseline of system activity before kicking off a test. If we start testing immediately after booting the system, we often see higher variance in the first run before the scores start to tighten up.
  • Hardware is not the only important factor: Most people know that different browsers produce different performance scores on the same system. However, testers aren’t always aware of shifts in performance between different versions of the same browser. While most updates don’t have a large impact on performance, a few updates have increased (or even decreased) browser performance by a significant amount. For this reason, it’s always worthwhile to record and disclose the extended browser version number for each test run. The same principle applies to any other relevant software.
  • Use more than one data point: Because of natural variance, our standard practice in the XPRT lab is to publish a score that represents the median from three to five runs, if not more. If you run a benchmark only once, and the score differs significantly from other published scores, your result could be an outlier that you would not see again under stable testing conditions.
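To make the version-recording habit above routine, a small script can snapshot the software stack before each run. This is a sketch for a Linux test system; the browser binary names are illustrative and you’d adapt them for your platform (on Windows, the browser’s About page shows the same extended version string).

```shell
#!/bin/sh
# Sketch: log the exact browser builds and OS details before a benchmark run,
# so every published score can be traced to a specific software stack.

LOGFILE="run-conditions.log"

log_version() {
    # Record the extended version string if the binary exists, else note its absence.
    if command -v "$1" >/dev/null 2>&1; then
        printf '%s: %s\n' "$1" "$("$1" --version 2>/dev/null)" >> "$LOGFILE"
    else
        printf '%s: not installed\n' "$1" >> "$LOGFILE"
    fi
}

printf 'Run started: %s\n' "$(date -u +%Y-%m-%dT%H:%M:%SZ)" > "$LOGFILE"
printf 'Kernel: %s\n' "$(uname -sr)" >> "$LOGFILE"

for browser in google-chrome firefox brave-browser; do
    log_version "$browser"
done

cat "$LOGFILE"
```

Keeping one such log per run makes it easy to explain score differences later, since two “identical” systems often turn out to have been tested on different browser builds.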

We hope these tips will help make your testing more accurate. If you have any questions about the XPRTs, or about benchmarking in general, feel free to ask!

Justin

WebXPRT passes the million-run milestone!

We’re excited to see that users have successfully completed over 1,000,000 WebXPRT runs! If you’ve run WebXPRT in any of the 924 cities and 81 countries from which we’ve received complete test data—including newcomers Bahrain, Bangladesh, Mauritius, the Philippines, and South Korea—we’re grateful for your help. We could not have reached this milestone without you!

As the chart below illustrates, WebXPRT use has grown steadily since the debut of WebXPRT 2013. On average, we now record more WebXPRT runs in one month than we recorded in the entirety of our first year. With over 104,000 runs so far in 2022, that growth is continuing.


For us, this moment represents more than a numerical milestone. Developing and maintaining a benchmark is never easy, and a cross-platform benchmark that will run on a wide variety of devices poses an additional set of challenges. For such a benchmark to succeed, developers need not only technical competence but also the trust and support of the benchmarking community. WebXPRT is now in its ninth year, and its consistent year-over-year growth tells us that the benchmark continues to hold value for manufacturers, OEM labs, the tech press, and end users like you. We see it as a sign of trust that folks repeatedly return to the benchmark for reliable performance metrics. We’re grateful for that trust, and for everyone who has contributed to the WebXPRT development process over the years.

We’ll have more to share related to this exciting milestone in the weeks to come, so stay tuned to the blog. If you have any questions or comments about WebXPRT, we’d love to hear from you!

Justin
