
Tag Archives: Mozilla

Comparing the performance of popular browsers with WebXPRT 4

If you’ve been reading the XPRT blog for a while, you know that we occasionally like to revisit a series of in-house WebXPRT comparison tests to see if recent updates have changed the performance rankings of popular web browsers. We published our most recent comparison last April, when we used WebXPRT 4 to compare the performance of five browsers on the same system.

For this round of tests, we used a Dell XPS 13 7390, which features an Intel Core i3-10110U processor and 4 GB of RAM, running Windows 11 Home updated to version 22H2 (22621.1105). We installed all current Windows updates and updated each of the browsers under test: Brave, Google Chrome, Microsoft Edge, Mozilla Firefox, and Opera.

After the update process completed, we turned off updates to prevent them from interfering with test runs. We ran WebXPRT 4 three times on each of the five browsers. The score we post for each browser is the median of the three test runs.
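For readers who want to run a similar comparison themselves, the scoring step is simple. The TypeScript sketch below is our own illustration of taking the median of three runs per browser; the browser names and scores are placeholders, not results from our testing.

```typescript
// Illustration only: pick the median of three WebXPRT runs per browser.
// Browser names and scores below are placeholders, not our test results.
function medianOfThree(runs: [number, number, number]): number {
  return [...runs].sort((a, b) => a - b)[1]; // middle value after sorting
}

const runsByBrowser: Record<string, [number, number, number]> = {
  "Browser A": [210, 214, 212],
  "Browser B": [205, 207, 206],
};

for (const [browser, runs] of Object.entries(runsByBrowser)) {
  console.log(`${browser}: median score ${medianOfThree(runs)}`);
}
```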

In our last round of tests, Edge was the clear winner, with a 2.2 percent performance advantage over Chrome. Firefox came in last, about 3 percent slower than Opera, which was in the middle of the pack. With updated versions of the browsers, the only change in rank order was that Brave moved into a tie with Opera.

While the rank order from this round of tests was very similar to the previous round, we did observe two clear performance trends: (1) the range between high and low scores was tighter, dropping from a difference of 7.8 percent to 4.3 percent, and (2) every browser demonstrated improved performance. The chart below illustrates both trends. Firefox showed the single largest score improvement at 7.8 percent, but the performance jump for each browser was considerable.

Do these results mean that Microsoft Edge will always provide a speedier web experience, or that Firefox will always be slower than the others? Not necessarily. It’s true that a device with a higher WebXPRT score will probably feel faster during daily web activities than one with a much lower score, but your experience depends in part on the types of things you do on the web, along with your system’s privacy settings, memory load, ecosystem integration, extension activity, and web app capabilities.

In addition, browser speed can noticeably increase or decrease after an update, and OS-specific optimizations can affect performance, such as with Edge on Windows 11 and Chrome on Chrome OS. All these variables are important to keep in mind when considering how WebXPRT results translate to your everyday experience.

Have you used WebXPRT to compare browser performance on the same system? Let us know how it turned out!

Justin

Considering a battery life test for WebXPRT 4

A few weeks ago, we discussed the beginnings of a WebXPRT 4 development plan, and asked for reader feedback about potential workload changes. So far, the two most common feedback topics have been the possible addition of a WebAssembly workload, and the feasibility of including a browser-based battery life test. Today, we discuss what a WebXPRT 4 battery life test would look like, and some of the challenges we’d have to overcome to make it a reality.

Battery life tests fall into two primary categories: simple rundown tests and performance-weighted tests. Simple rundown tests measure battery life under extreme conditions, such as long idle periods or continuous loops of movie playback, but they do not reflect the wide-ranging mix of activities that characterizes a typical day for most users. While they can be useful for very specific apples-to-apples comparisons, these tests have limited value when it comes to giving consumers a realistic estimate of the battery life they would experience during everyday use.

In contrast, performance-weighted battery life tests, such as the one in CrXPRT 2, attempt to reflect real-world usage. The CrXPRT battery life test simulates common daily usage patterns for Chromebooks by including all the productivity workloads from the performance test, plus video playback, audio playback, and gaming scenarios. It also includes periods of wait/idle time. We believe this mixture of diverse activity and idle time better represents typical real-life behavior patterns. This makes the resulting estimated battery life much more helpful for consumers who are trying to match a device’s capabilities with their real-world needs.
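To make the idea concrete, here’s a rough TypeScript sketch of how a performance-weighted rundown loop might be structured. This is our own illustration of the general approach, not CrXPRT source code; the workloads array and readBatteryLevel function are hypothetical stand-ins for real workload and battery-reading implementations.

```typescript
// Hypothetical outline of a performance-weighted battery rundown loop.
// The workloads and readBatteryLevel function are stand-ins, not CrXPRT code.
type Workload = () => Promise<void>;

const idle = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

async function runRundown(
  workloads: Workload[],
  readBatteryLevel: () => Promise<number> // returns 0.0 to 1.0
): Promise<void> {
  const start = Date.now();
  let iterations = 0;
  // Cycle through realistic activity plus idle time until the battery
  // reaches a cutoff level (here, 5 percent).
  while ((await readBatteryLevel()) > 0.05) {
    for (const workload of workloads) {
      await workload();   // productivity, video, audio, or gaming scenario
      await idle(30_000); // wait/idle period between activities
    }
    iterations++;
  }
  const elapsedMinutes = (Date.now() - start) / 60_000;
  console.log(`Battery reached cutoff after ${iterations} iterations (${elapsedMinutes.toFixed(1)} minutes).`);
}
```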

From a technical standpoint, WebXPRT’s cross-platform nature presents us with several challenges that we did not face while developing the CrXPRT battery life test for Chrome OS. While the WebXPRT performance tests run in almost any browser, cross-browser differences in battery life reporting may restrict the battery life test to a single browser. For instance, Mozilla has deprecated the battery status API for Firefox, and we’re not yet sure if there’s another approach that would work. If a WebXPRT 4 battery life test supported only a single browser, such as Chrome or Safari, would you still be interested in using it? Please let us know.
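For anyone experimenting with this, here’s a minimal sketch of reading the battery level through the Battery Status API (navigator.getBattery()), which Chromium-based browsers still expose but Firefox does not. The BatteryManager typing is our own, since the API isn’t part of the standard TypeScript DOM typings, and this is only an illustration of the kind of call a browser-based test would depend on.

```typescript
// Minimal sketch: read the battery level via the Battery Status API.
// Chromium-based browsers expose navigator.getBattery(); Firefox does not.
interface BatteryManager {
  level: number;           // 0.0 (empty) to 1.0 (full)
  charging: boolean;
  dischargingTime: number; // seconds remaining, or Infinity if unknown
}

async function snapshotBatteryLevel(): Promise<number | null> {
  const nav = navigator as Navigator & { getBattery?: () => Promise<BatteryManager> };
  if (!nav.getBattery) {
    console.warn("Battery Status API unavailable; the tester would have to record the level manually.");
    return null;
  }
  const battery = await nav.getBattery();
  console.log(`Battery at ${(battery.level * 100).toFixed(0)}%, charging: ${battery.charging}`);
  return battery.level;
}
```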

A browser-based battery life workflow also presents other challenges that we do not face in native client applications such as CrXPRT:

  • A browser-based battery life test would require the user to check the starting and ending battery capacities, with no way for the app to independently verify data accuracy.
  • The battery life test could require more babysitting in the event of network issues. We can catch network failures and try to handle them by reporting periods of network disconnection (see the sketch after this list), but those interruptions could influence the battery life duration.
  • The factors above could make it difficult to achieve repeatability. One way to address that problem would be to run the test in a standardized lab environment with a steady internet connection, but a long list of standardized environmental requirements would make the battery life test less attractive and less accessible to many testers.
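As a simple illustration of the network-monitoring piece mentioned in the second bullet above, a test page could listen for the browser’s standard online and offline events and log each disconnection window. This sketch is our own, not an existing WebXPRT feature.

```typescript
// Sketch: log periods of network disconnection during a long test run.
// Uses the standard online/offline events; not an actual WebXPRT feature.
interface Outage {
  start: number; // timestamp (ms) when the connection dropped
  end?: number;  // timestamp (ms) when it came back
}

const outages: Outage[] = [];

window.addEventListener("offline", () => {
  outages.push({ start: Date.now() });
});

window.addEventListener("online", () => {
  const current = outages[outages.length - 1];
  if (current && current.end === undefined) {
    current.end = Date.now();
    const seconds = (current.end - current.start) / 1000;
    console.log(`Network was down for ${seconds.toFixed(0)} seconds.`);
  }
});
```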

Our intention with today’s blog is not to make a WebXPRT 4 battery life test seem like an impossibility. Rather, we want to share our perspective on what the test might look like, and describe some of the challenges and considerations in play. If you have thoughts about battery life testing, or experience with battery life APIs in one or more of the major browsers, we’d love to hear from you!

Justin

Using WebXPRT 3 to compare the performance of popular browsers (Round 2)

It’s been nine months since we last published a WebXPRT 3 browser performance comparison, so we decided to put the newest versions of popular browsers through their paces to see if the performance rankings have changed since our last round of tests.

We used the same laptop as last time: a Dell XPS 13 7390 with an Intel Core i3-10110U processor and 4 GB of RAM running Windows 10 Home, updated to version 1909 (18363.1139). We installed all current Windows updates and tested on a clean system image. After the update process completed, we turned off updates to prevent them from interfering with test runs. We ran WebXPRT 3 three times on five browsers: Brave, Google Chrome, Microsoft Edge, Mozilla Firefox, and Opera. The posted score for each browser is the median of the three test runs.

In our last round of tests, the four Chromium-based browsers (Brave, Chrome, Edge, and Opera) produced scores that were nearly identical. Only Mozilla Firefox produced a significantly different (and better) score. The parity of the Chromium-based browsers was not surprising, considering they are built on the same Chromium foundation.

In this round of testing, the Chromium-based browsers again produced very close scores, although Brave’s performance lagged by about 4 percent. Firefox again separated itself from the pack with a higher score. With the exception of Chrome, which produced the same score as last time, every browser’s score was slightly lower than before. There are many possible reasons for this, including increased overhead in the browsers or changes in Windows, and the respective slowdowns will probably be unnoticeable to most users during everyday tasks.

Do these results mean that Mozilla Firefox will provide you with a speedier web experience? As we noted in the last comparison, a device with a higher WebXPRT score will probably feel faster during daily use than one with a lower score. For comparisons on the same system, however, the answer depends in part on the types of things you do on the web, how the extensions you’ve installed affect performance, how frequently the browsers issue updates and incorporate new web technologies, and how accurately each browser’s default installation settings reflect how you would set up that browser for your daily workflow.

In addition, browser speed can increase or decrease significantly after an update, only to swing back in the other direction shortly thereafter. OS-specific optimizations can also affect performance, such as with Edge on Windows 10 and Chrome on Chrome OS. All of these variables are important to keep in mind when considering how browser performance comparison results translate to your everyday experience.

What are your thoughts on browser performance? Let us know!

Justin
