
Category: Benchmark metrics

XPRTs in the press

Each month, we send a newsletter to members of the BenchmarkXPRT Development Community. In the newsletter, we recap the latest updates from the XPRT world and provide a summary of the previous month’s XPRT-related activity, including uses or mentions of the XPRTs in the tech press. More people read the weekly XPRT blog than receive the monthly newsletter, so we realized that some blog readers may be unaware of the wide variety of tech outlets that regularly use or mention the XPRTs.

So for today’s blog, we want to give readers a sampling of the XPRT press usage we see on a weekly basis. Recent mentions include:

  • Tom’s Guide used HDXPRT 4 to compare the performance of the Geekom Mini IT8 and Dell OptiPlex 7090 Ultra small-form-factor PCs.
  • Intel used WebXPRT 4 test data in promotional material for their line of 12th Gen Intel Core processors (Alder Lake). Hundreds of press outlets then republished the presentation.
  • AnandTech used WebXPRT 4 to evaluate the Cincoze DS-1300 Industrial PC.
  • ZDNet used CrXPRT 2 in a review titled The best Chromebooks for students: Student-proof laptops.
  • PCWorld used CrXPRT 2 to provide data for an article listing their top Chromebook recommendations.
  • TechPowerUp used WebXPRT 3 to compare the browser performance of Intel Core i9-12900KS processor-based systems and other Intel- and AMD-processor-based systems.
  • Other outlets that have published articles, ads, or reviews mentioning the XPRTs in the last few months include: Android Authority, ASUS, BenchLife, Gadgets 360, Good Gear Guide, Hardware.info, Hot Hardware, ITHardware (Poland), ITMedia (Japan), Itndaily (Russia), Mobile01.com (China), Notebookcheck, PCMag, ProClockers, Sohu.com (China), Tom’s Hardware, and Tweakers.

If you don’t currently receive the monthly BenchmarkXPRT newsletter, but would like to join the mailing list, please let us know! We will not publish or sell any of the contact information you provide, and will only send the monthly newsletter and occasional benchmark-related announcements such as patch notifications or new benchmark releases.

Justin

A great start for WebXPRT 4!

WebXPRT 4 has been available to testers since the end of December, and we’re excited to see that the benchmark is already gaining significant traction in the tech press and testing communities. Several tech publications have already published reviews that feature WebXPRT results, and the number of WebXPRT 4 runs is growing by about fifty percent each month, more than twice the rate of growth for WebXPRT 3 after launch.

As WebXPRT 4 use continues to grow, and more tech publications and OEM labs add WebXPRT 4 to their benchmark suites, we encourage you to keep an eye on the WebXPRT 4 results viewer. The viewer currently has about 120 test results, and we’ll continue to populate the viewer with the latest PT-curated WebXPRT 4 results each week.

You don’t have to be a tech journalist to publish a WebXPRT 4 result, however. We publish any results—including individual user submissions—that meet our evaluation criteria. To submit a result for publication consideration, simply follow the straightforward submission instructions after the test completes. Scores must be consistent with general expectations and must include enough detailed system information that we can determine whether the score makes sense. If you’ve tested with WebXPRT 4 on a new device, or any device or device configuration that’s not already present in the results viewer, we encourage you to send in the result. We want to hear from you!

Justin

Using WebXPRT 4 to compare the performance of popular browsers

From time to time, we like to run a series of in-house WebXPRT comparison tests to see if recent updates have changed the performance rankings of popular web browsers. We published our most recent comparison last October, when we used WebXPRT 3 to compare Windows 10 and Windows 11 browser performance on the same system. Now that WebXPRT 4 is live, it’s time to update our comparison series with the newest member of the XPRT family.

For this round of tests, we used a Dell XPS 13 7390, which features an Intel Core i3-10110U processor and 4 GB of RAM, running Windows 11 Home updated to version 21H2 (22000.593). We installed all current Windows updates and tested on a clean system image. After the update process completed, we turned off updates to prevent them from interfering with test runs. We ran WebXPRT 4 three times each across five browsers: Brave, Google Chrome, Microsoft Edge, Mozilla Firefox, and Opera. The posted score for each browser is the median of the three test runs.
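To illustrate the scoring method above, here is a minimal sketch of taking the median of three runs per browser. The run scores below are made up for the example; the post does not list the individual run scores.

```python
# Hypothetical WebXPRT 4 scores for three runs per browser.
# These numbers are illustrative only, not the actual test data.
from statistics import median

runs = {
    "Brave":   [201, 204, 203],
    "Chrome":  [208, 210, 209],
    "Edge":    [213, 214, 215],
    "Firefox": [195, 197, 196],
    "Opera":   [202, 203, 204],
}

# The posted score for each browser is the median of its three runs.
posted = {browser: median(scores) for browser, scores in runs.items()}
```

Using the median rather than the mean keeps a single outlier run from skewing the posted score.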

In our previous round of tests with WebXPRT 3, Google Chrome narrowly beat out Firefox in Windows 10 and Windows 11 testing, but the scores among three of the Chromium-based browsers (Chrome, Edge, and Opera) were close enough that most users performing common daily tasks would be unlikely to notice a difference. Brave performance lagged by about 7 percent, a difference that may be noticeable to most users. This time, when testing updated versions of the browsers with WebXPRT 4 on Windows 11, the rankings changed. Edge was the clear winner, with a 2.2 percent performance advantage over Chrome. Firefox came in last, about 3 percent slower than Opera, which was in the middle of the pack. Performance from Brave improved to the point that it was no longer lagging the other Chromium-based browsers.
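Percentage comparisons like the ones above are typically computed relative to the slower score. A small sketch, using hypothetical scores chosen only to reproduce a gap of about 2.2 percent:

```python
# Typical way to express one browser's advantage over another:
# (faster - slower) / slower * 100. Scores here are hypothetical.
def percent_advantage(faster: float, slower: float) -> float:
    return round((faster - slower) / slower * 100, 1)

# Two made-up overall scores roughly 2.2 percent apart.
gap = percent_advantage(232, 227)
```
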

Do these results mean that Microsoft Edge will always provide you with a speedier web experience? A device with a higher WebXPRT score will probably feel faster during daily use than one with a lower score. For comparisons on the same system, however, the answer depends in part on the types of things you do on the web, how the extensions you’ve installed affect performance, how frequently the browsers issue updates and incorporate new web technologies, and how accurately each browser’s default installation settings reflect how you would set up that browser for your daily workflow.

In addition, browser speed can increase or decrease significantly after an update, only to swing back in the other direction shortly thereafter. OS-specific optimizations can also affect performance, such as with Edge on Windows 11 and Chrome on Chrome OS. All these variables are important to keep in mind when considering how WebXPRT results translate to your everyday experience.

Do you have insights you’d like to share from using WebXPRT to compare browser performance? Let us know!

Justin

Exploring the WebXPRT 4 results viewer

Now that WebXPRT 4 is live, we want to remind readers about the features of the WebXPRT 4 results viewer. We’re excited about this new tool, which we view as an ongoing project that we will expand and improve over time. The viewer currently has over 100 test results, and we’re just getting started. We’ll continue to actively populate the viewer with the latest PT-curated WebXPRT 4 results for the foreseeable future.

The screenshot below shows the tool’s default display. Each vertical bar in the graph represents the overall score of a single test result, with bars arranged from lowest to highest. To view a single result in detail, the user hovers over a bar until it turns white and a small popup window displays the basic details of the result. Once the user clicks to select the highlighted bar, the bar turns dark blue, and the dark blue banner at the bottom of the viewer displays additional details about that result.

In the example above, the banner shows the overall score (227), the score’s percentile rank (98th) among the scores in the current display, the name of the test device, and basic hardware disclosure information. Users can click the Run info button to see the run’s individual workload scores.
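One plausible way to derive a percentile rank like the "98th" shown in the banner is to take the share of displayed scores at or below the selected score. The viewer's exact method isn't documented in the post, so this is an assumption, and the score list below is made up:

```python
# Sketch of a percentile rank among the currently displayed scores,
# assuming an "at or below" definition. Data here is hypothetical.
def percentile_rank(score: float, all_scores: list[float]) -> int:
    at_or_below = sum(1 for s in all_scores if s <= score)
    return round(100 * at_or_below / len(all_scores))

# 120 made-up overall scores, matching the viewer's result count.
scores = list(range(100, 220))
rank = percentile_rank(217, scores)  # a score near the top of the range
```
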

The viewer includes a drop-down menu to quickly filter results by major device type categories, and a tab that allows users to apply additional filtering options, such as browser type, processor vendor, and result source. The screenshot below shows the viewer after I used the device type drop-down filter to select only laptops.

The screenshot below shows the viewer as I use the filter tab to explore additional filter options, such as browser type.

The viewer also lets users pin multiple specific runs, which is helpful for making side-by-side comparisons. The screenshot below shows the viewer after I pinned four runs and viewed them on the Pinned runs screen.

The screenshot below shows the viewer after I clicked the Compare runs button: the overall and individual workload scores of the pinned runs appear as a table.

We’re excited about the WebXPRT 4 results viewer, and we want to hear your feedback. Are there features you’d really like to see, or ways we can improve the viewer? Please let us know, and send us your latest test results!

Justin

Updated system configuration recommendations for CrXPRT 2 battery life tests

Recently, we heard from a BenchmarkXPRT Development Community member who was testing Chromebooks in their lab. On a few of the Chromebooks, they saw sporadic CrXPRT 2 battery life test failures where CrXPRT 2 would successfully complete a battery life test and produce a result for the initial run, but then fail at the end of later runs.

After a considerable amount of troubleshooting, they determined that the issue seemed to be related to the way some systems automatically shut down before the battery is completely exhausted, and the way some systems will automatically boot up once the tester plugs in the power adapter for charging. This member found that when they added a few system configuration steps before battery life tests and made slight changes to their post-test routine, the systems that had previously experienced consistent failures would successfully complete battery life tests and produce results.

The added steps are quick and straightforward, and we decided to add them to the Configuring the test device and Running the tests sections of the CrXPRT 2 user manual. We hope this updated guidance will help to prevent future frustration for CrXPRT 2 testers.

If you have any questions or comments about the CrXPRT 2 battery life test, please feel free to contact us!

Justin

Why we don’t control screen brightness during CrXPRT 2 battery life tests

Recently, we had a discussion with a community member about why we no longer recommend specific screen brightness settings during CrXPRT 2 battery life tests. In the CrXPRT 2015 user manual, we recommended setting the test system’s screen brightness to 200 nits. Because the amount of power that a system directs to screen brightness can have a significant impact on battery life, we believed that pegging screen brightness to a common standard for all test systems would yield apples-to-apples comparisons.

After extensive experience with CrXPRT 2015 testing, we decided to not recommend a standard screen brightness with CrXPRT 2, for the following reasons:

  • A significant number of Chromebooks cannot produce a screen brightness of 200 nits. A few higher-end models can do so, but they are not representative of most Chromebooks. Some Chromebooks, especially those that many school districts and corporations purchase in bulk, cannot produce a brightness of even 100 nits.
  • Because of the point above, adjusting screen brightness would not represent real-life conditions for most Chromebooks, and the battery life results could mislead consumers who want to know the battery life they can expect with default out-of-box settings.
  • Most testers, and even some labs, do not have light meters, and the simple brightness percentages that the operating system reports produce different degrees of brightness on different systems. For testers without light meters, a standardized screen brightness recommendation could discourage them from running the test.
  • The brightness controls for some low-end Chromebooks lack the fine-tuning capability that is necessary to standardize brightness between systems. In those cases, an increase or decrease of one notch can swing brightness by 20 to 30 nits in either direction. This could also discourage testing by leading people to believe that they lack the capability to correctly run the test.

In situations where testers want to compare battery life using standardized screen brightness, we recommend using light meters to set the brightness levels as closely as possible. If the brightness levels between systems vary by more than a few nits, and if the levels vary significantly from out-of-box settings, the publication of any resulting battery life results should include a full disclosure and explanation of test conditions.

For the majority of testers without light meters, running the CrXPRT 2 battery life test with default screen brightness settings on each system provides a reliable and accurate estimate of the type of real-world, out-of-box battery life consumers can expect.

If you have any questions or comments about the CrXPRT 2 battery life test, please feel free to contact us!

Justin
