Recently, a tester contacted us with details from a CrXPRT 2 performance test run that they’d successfully completed on… an Apple MacBook Pro! Because CrXPRT 2 is a Chrome Web App that we designed for Chrome OS, it was quite a surprise to hear that it is now possible to run CrXPRT 2 on non-Chrome OS platforms by using FydeOS.
FydeOS is an operating system based on a fork of the Chromium OS project. Developers originally intended FydeOS to be a Google-independent, Chrome-like alternative for the Chinese educational market, but FydeOS is now available to the English-speaking consumer and enterprise markets as well. FydeOS users can run a Chrome-like OS on something other than a Chromebook or a Chromebox, such as a PC, Mac, virtual machine, or even a Raspberry Pi device. Additionally, FydeOS supports Android, Chrome OS, and Linux apps, and users can run those apps at the same time on the same screen.
We have not yet conducted any testing with FydeOS in our lab, but we wanted to pass along this information to any readers who may be interested. If the OS operates as described, it may provide a way for us to experiment with using CrXPRT 2 in some interesting cross-platform tests.
WebXPRT 4 has been available to testers since
the end of December, and we’re excited to see that the benchmark is already
gaining significant traction in the tech press and testing communities. Several
tech publications have already published reviews that feature WebXPRT results,
and the number of WebXPRT 4 runs is growing by about fifty percent each month, more
than twice the rate of growth for WebXPRT 3 after launch.
As WebXPRT 4 use continues to grow,
and more tech publications and OEM labs add WebXPRT 4 to their benchmark
suites, we encourage you to keep an eye on the WebXPRT 4 results viewer.
The viewer currently has about 120 test results, and we’ll continue to populate
the viewer with the latest PT-curated WebXPRT 4
results each week.
You don’t have to be a tech
journalist to publish a WebXPRT 4 result, however. We publish any results—including
individual user submissions—that meet our evaluation criteria. To submit a result
for publication consideration, simply follow the straightforward submission instructions
after the test completes. Scores must be consistent with general expectations and
must include enough detailed system information that we can determine whether
the score makes sense. If you’ve tested with WebXPRT 4 on a new device, or any
device or device configuration that’s not already present in the results
viewer, we encourage you to send in the result. We want to hear from you!
From time to time, we like to run a series of in-house WebXPRT comparison tests to see if recent updates have changed the performance rankings of popular web browsers. We published our most recent comparison last October, when we used WebXPRT 3 to compare Windows 10 and Windows 11 browser performance on the same system. Now that WebXPRT 4 is live, it’s time to update our comparison series with the newest member of the XPRT family.
For this round of tests, we used a Dell
XPS 13 7390, which features an Intel Core i3-10110U processor and 4 GB of RAM, running
Windows 11 Home updated to version 21H2 (22000.593). We installed all current
Windows updates and tested on a clean system image. After the update process
completed, we turned off updates to prevent them from interfering with test
runs. We ran WebXPRT 4 three times each across five browsers: Brave, Google
Chrome, Microsoft Edge, Mozilla Firefox, and Opera. The posted score for each
browser is the median of the three test runs.
In our previous round of tests with WebXPRT 3, Google Chrome narrowly beat out Firefox in Windows 10 and Windows 11 testing, but the scores among three of the Chromium-based browsers (Chrome, Edge, and Opera) were close enough that most users performing common daily tasks would be unlikely to notice a difference. Brave performance lagged by about 7 percent, a difference that may be noticeable to most users. This time, when testing updated versions of the browsers with WebXPRT 4 on Windows 11, the rankings changed. Edge was the clear winner, with a 2.2 percent performance advantage over Chrome. Firefox came in last, about 3 percent slower than Opera, which was in the middle of the pack. Performance from Brave improved to the point that it was no longer lagging the other Chromium-based browsers.
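The scoring arithmetic behind these comparisons is simple: take the median of each browser’s three runs, then compute percent differences between the medians. The sketch below illustrates that math with purely hypothetical scores (the run values here are made up for illustration, not our published results):

```python
from statistics import median

# Hypothetical WebXPRT 4 scores from three runs per browser.
# These numbers are illustrative only, not our published results.
runs = {
    "Edge":    [231, 228, 233],
    "Chrome":  [226, 224, 227],
    "Opera":   [221, 219, 222],
    "Brave":   [220, 218, 221],
    "Firefox": [214, 212, 215],
}

# The posted score for each browser is the median of its three runs.
scores = {browser: median(r) for browser, r in runs.items()}

# Percent advantage of the top-scoring browser over each of the others.
top = max(scores.values())
advantage = {b: round(100 * (top - s) / s, 1) for b, s in scores.items()}
```

With these made-up numbers, the median filters out any single outlier run, and the `advantage` values show how a roughly 2 percent gap between medians is calculated.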
Do these results mean that Microsoft
Edge will always provide you with a speedier web experience? A device with a
higher WebXPRT score will probably feel faster during daily use than one with a
lower score. For comparisons on the same system, however, the answer depends in
part on the types of things you do on the web, how the extensions you’ve
installed affect performance, how frequently the browsers issue updates and
incorporate new web technologies, and how accurately each browser’s default
installation settings reflect how you would set up that browser for your daily browsing.
In addition, browser speed can
increase or decrease significantly after an update, only to swing back in the
other direction shortly thereafter. OS-specific optimizations can also affect
performance, such as with Edge on Windows 11 and Chrome on Chrome OS. All these
variables are important to keep in mind when considering how WebXPRT results
translate to your everyday experience.
Do you have insights you’d like to share from using WebXPRT to compare browser performance? Let us know!
We recently received a
question from a member of the tech press about whether we would be willing to
supply them with the WebXPRT 4 source code, along with instructions for how to
set up a local instance of the benchmark for their internal testbed. We were
happy to help, and they are now able to automate WebXPRT 4 runs within their
own isolated network.
If you’re a new XPRT
tester, you may not be aware that we provide free access to the source code for
each of the XPRT benchmarks. Publishing XPRT source code is part of our
commitment to making the XPRT development process as transparent as possible.
By allowing all interested parties to access and review our source code, we’re encouraging
openness and honesty in the benchmarking industry and are inviting the kind of
constructive feedback that helps to ensure that the XPRTs continue to
contribute to a level playing field.
While XPRT source code
is available to the public, our approach to derivative works differs from some
open-source models. Traditional open-source models encourage developers to
change products and even take them in different directions. Because benchmarking
requires a product that remains static to enable valid comparisons over time,
we allow people to download the source, but we reserve the right to control
derivative works. This discourages a situation where someone publishes an
unauthorized version of the benchmark and calls it an “XPRT.”
Accessing XPRT source code is a straightforward process. The source code for CloudXPRT is freely available in our CloudXPRT GitHub repository. If you’d like to download and review the source code for WebXPRT 4 or any of the other XPRTs, or get instructions for how to build one of the benchmarks, all you need to do is contact us at firstname.lastname@example.org. Your feedback is valuable!
Last March, we discussed the Chrome OS team’s original announcement that they would be phasing out support for Chrome Apps altogether in June 2021, and would shift their focus to Chrome extensions and Progressive Web Apps. The Chrome OS team eventually extended support for existing Chrome Apps through June 2022, but as of this week, we see no indication that they will further extend support for Chrome Apps published with general developer accounts. If the end-of-life schedule for Chrome Apps does not change in the next few months, both CrXPRT 2 and CrXPRT 2015 will stop working on new versions of Chrome OS at some point in June.
To maintain CrXPRT
functionality past June, we would need to rebuild the app completely—either as
a Progressive Web App or in some other form. For this reason, we want to
reassess our approach to Chrome OS testing, and investigate which features and
technologies to include in a new Chrome OS benchmark. Our current goal is to
gather feedback and conduct exploratory research over the next few months, and begin
developing an all-new Chrome OS benchmark for publication by the end of the year.
While we will discuss ideas for this new Chrome OS benchmark in future blog posts, we welcome ideas from CrXPRT users now. What features or workloads would you like the new benchmark to retain? Would you like us to remove any components from the existing benchmark? Does the battery life test in its current form suit your needs? If you have any thoughts about these questions or any other aspects of Chrome OS benchmarking, please let us know!
Now that WebXPRT 4 is live, we want to remind readers about the features of the WebXPRT 4 results viewer.
We’re excited about this new tool, which we view as an ongoing project that we
will expand and improve over time. The viewer currently has over 100 test
results, and we’re just getting started. We’ll continue to actively populate
the viewer with the latest PT-curated WebXPRT 4
results for the foreseeable future.
The screenshot below shows the tool’s default display. Each vertical bar in the graph represents the overall score of a single test result, with bars arranged from lowest to highest. To view a single result in detail, the user hovers over a bar until it turns white and a small popup window displays the basic details of the result. Once the user clicks to select the highlighted bar, the bar turns dark blue, and the dark blue banner at the bottom of the viewer displays additional details about that result.
In the example above, the banner shows the overall score (227), the score’s
percentile rank (98th) among the scores in the current display, the name
of the test device, and basic hardware disclosure information. Users can click the
Run info button to see the run’s individual workload scores.
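The percentile rank shown in the banner is relative to whatever set of results is currently displayed. The viewer’s exact formula isn’t published, but one common definition (the percentage of displayed scores at or below the selected score) can be sketched like this, using hypothetical display data:

```python
def percentile_rank(score, all_scores):
    """Percent of results scoring at or below `score`.

    This is one common definition of percentile rank; the results
    viewer's exact formula is an assumption here, not documented.
    """
    at_or_below = sum(1 for s in all_scores if s <= score)
    return round(100 * at_or_below / len(all_scores))

# Hypothetical set of overall scores currently displayed in the viewer.
display_scores = [98, 112, 134, 150, 163, 178, 190, 205, 219, 227]
```

For example, with the hypothetical scores above, a result of 163 sits at the 50th percentile of the displayed set; filtering the display (say, to laptops only) changes the set and therefore the rank.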
The viewer includes a drop-down menu to quickly filter results by major device type categories, and a tab that allows users to apply additional filtering options, such as browser type, processor vendor, and result source. The screenshot below shows the viewer after I used the device type drop-down filter to select only laptops.
The screenshot below shows the viewer as I use the filter tab to explore additional filter options, such as browser type.
The viewer also lets users pin multiple specific runs, which is helpful for making side-by-side comparisons. The screenshot below shows the viewer after I pinned four runs and viewed them on the Pinned runs screen.
The screenshot below shows the viewer after I clicked the Compare runs button: the overall and individual workload scores of the pinned runs appear as a table.
We’re excited about the WebXPRT 4 results viewer, and we want to hear your feedback. Are there features you’d really like to see, or ways we can improve the viewer? Please let us know, and send us your latest test results!