
How we evaluate new WebXPRT workload proposals

A key value of the BenchmarkXPRT Development Community is our openness to user feedback. Whether it’s positive feedback about our benchmarks, constructive criticism, ideas for completely new benchmarks, or proposed workload scenarios for existing benchmarks, we appreciate your input and give it serious consideration.

We’re currently accepting ideas and suggestions for ways we can improve WebXPRT 4. We are open to adding both non-workload features and new auxiliary tests, which can be experimental or targeted workloads that run separately from the main test and produce their own scores. You can read more about experimental WebXPRT 4 workloads here. However, a recent user question about possible WebGPU workloads has prompted us to explain the types of parameters that we consider when we evaluate a new WebXPRT workload proposal.

Community interest and real-life relevance

The first two parameters we use when evaluating a WebXPRT workload proposal are straightforward: are people interested in the workload, and is it relevant to real life? We originally developed WebXPRT to evaluate device performance using the types of web-based tasks that people are likely to encounter daily, and real-life relevance continues to be an important criterion for us during development. There are many technologies, functions, and use cases that we could test in a web environment, but only some of them are both relevant to common applications or usage patterns and likely to be interesting to lab testers and tech reviewers.

Maximum cross-platform support

Currently, WebXPRT runs in almost any web browser, on almost any device that has a web browser, and we would ideally maintain that broad level of cross-platform support when introducing new workloads. However, technical differences in the ways that different browsers execute tasks mean that some types of scenarios would be impossible to include without breaking our cross-platform commitment.
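
For example, a workload built on an emerging API such as WebGPU (which a reader recently asked about) would need a capability check before it could run everywhere. The sketch below is purely illustrative and is not WebXPRT code; it assumes a browser environment and shows one way a test harness might probe for WebGPU support before enabling such a workload.

```typescript
// Illustrative only: probe for WebGPU support before enabling a
// browser-specific workload. Not actual WebXPRT harness code.
async function supportsWebGPU(): Promise<boolean> {
  // navigator.gpu is defined only in browsers that expose WebGPU
  const gpu = (navigator as any).gpu;
  if (!gpu) {
    return false;
  }
  try {
    // requestAdapter() resolves to null when no suitable adapter exists
    const adapter = await gpu.requestAdapter();
    return adapter !== null;
  } catch {
    return false;
  }
}
```

A workload gated behind a check like this would run on only a subset of today's browsers, which is exactly the kind of trade-off that pushes a proposal toward an auxiliary test rather than the main suite.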

One reason we’re considering auxiliary WebXPRT workloads, such as a battery-life rundown, is that they would allow WebXPRT to offer additional value to users while maintaining the cross-platform nature of the main test. Even if a battery-life test ran on only one major browser, it could still be very useful to many people.

Performance differentiation

Computer benchmarks such as the XPRTs exist to provide users with reliable metrics that they can use to gauge how well target platforms or technologies perform certain tasks. With a broadly targeted benchmark such as WebXPRT, if the workloads are so heavy that most devices can’t handle them, or so light that most devices complete them without being taxed, the results will have little to no use for OEM labs, the tech press, or independent users when evaluating devices or making purchasing decisions.

Consequently, with any new WebXPRT workload, we try to find a sweet spot in terms of how demanding it is. We want it to run on a wide range of devices—from low-end devices that are several years old to brand-new high-end devices and everything in between. We also want users to see a wide range of workload scores and resulting overall scores, so they can easily grasp the different performance capabilities of the devices under test.

Consistency and replicability

Finally, workloads should produce scores that consistently fall within an acceptable margin of error and are easy to replicate with additional testing on comparable gear. Some web technologies are very sensitive to uncontrollable or unpredictable variables, such as internet speed. A workload that measures one of those technologies would be unlikely to produce results that are consistent and easy to replicate.
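
As a rough illustration of what we mean by consistency, here is a minimal sketch (not WebXPRT code) that computes the coefficient of variation across repeated runs of the same workload on the same device; the run scores and the 3 percent threshold are hypothetical values chosen for the example.

```typescript
// Illustrative consistency check: coefficient of variation (CV)
// across repeated runs of one workload on one device.
function coefficientOfVariation(scores: number[]): number {
  const mean = scores.reduce((sum, s) => sum + s, 0) / scores.length;
  const variance =
    scores.reduce((sum, s) => sum + (s - mean) ** 2, 0) / scores.length;
  return Math.sqrt(variance) / mean;
}

const runs = [211, 208, 214];          // hypothetical scores
const cv = coefficientOfVariation(runs);
const isConsistent = cv < 0.03;        // 3% threshold, illustrative only
```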

We hope this post will be useful for folks who are contemplating potential new WebXPRT workloads. If you have any general thoughts about browser performance testing, or specific workload ideas that you’d like us to consider, please let us know.

Justin

WebXPRT’s mirror host site in Singapore

If you’ve ever spent time exploring WebXPRT.com, you may have noticed a line that says, “If you are in East Asia, you can run WebXPRT from our Singapore host,” followed by a hyperlink with Simplified Chinese characters. We realize that some people may not know why we have a WebXPRT mirror host site in Singapore—or how to use it—so today’s post will cover the basics.

When we first released WebXPRT 2013, some users in mainland China reported slow download times when running the benchmark. These slowdowns affected initial page and workload content load times, but not workload execution, which happens locally. As a result, subtest and overall scores were still consistent with expectations for the devices under test, but it took longer than normal for test runs to complete. In response, we set up a mirror host site in Singapore to facilitate WebXPRT testing in China and other East Asian countries. We continued this practice with subsequent WebXPRT versions, and we currently offer Singapore-based instances of WebXPRT 4, WebXPRT 3, and WebXPRT 2015.

The link to WebXPRT 4 Singapore on WebXPRT.com

The default UI language on the Singapore site is Simplified Chinese, but users can opt to change the language to English or German. Apart from a different default language, the WebXPRT mirror instances hosted in Singapore are identical to the instances on the main WebXPRT site. If you test a device on WebXPRT Singapore and WebXPRT.com, you should see similar performance scores from both sites.

The start page for WebXPRT 4 Singapore, with the default Simplified Chinese UI

We hope that the WebXPRT mirror host site in Singapore will make it easier for people in East Asia to use the benchmark. Do you find the site useful? If so, we’d love to hear from you! Also, if you encounter any unexpected issues or interruptions while testing, please let us know!

Justin

WebXPRT runs: A decade of growth

In our last blog post, we celebrated the 10-year anniversary of the WebXPRT launch by looking back on the WebXPRT team’s accomplishments over the last decade. The incremental steps and milestone improvements we discussed all contributed to carving out a lasting place for WebXPRT in the benchmarking world and helped to grow its reputation as a reliable, effective, and easy-to-use measurement tool.

WebXPRT’s growth is most evident when we look at the rising number of completed test runs over the last 10 years. Since the first WebXPRT launch in 2013, we’ve seen a steady increase in the number of tests people are running. To put the increase in perspective, we had more runs last month alone (17,300) than we recorded in the first 10 months that WebXPRT was available (11,984).

That growth has helped us to reach and surpass the million-run mark, but the most exciting aspect of seeing a consistent increase in WebXPRT testing is the knowledge that the benchmark is proving to be useful to more people in more places around the world. In our next blog post, we’ll discuss WebXPRT’s truly global reach and some of the surprising cities and countries where people have been using it to test their gear.

We’re grateful for all the testers who have helped WebXPRT grow during the last decade. If you have any questions or comments about using WebXPRT, let us know!

Justin

Celebrating 10 years of WebXPRT!

We’re excited to announce that it’s been 10 years since the initial launch of WebXPRT! In early 2013, we introduced WebXPRT as a unique browser performance benchmark in a market space that was already crowded with a variety of specialized measurement tools. Our goal was to offer a benchmark that could compare the performance of almost any web-enabled device, using scenarios created to mirror real-world tasks. We wanted it to be a free, easily accessible, easy-to-run, useful, and appealing testing option for OEM labs, vendors, and the tech press.

When we look back on the last 10 years of WebXPRT, we can’t help but conclude that our efforts have been successful. Since those early days, the WebXPRT market presence has grown from humble beginnings into a worldwide industry standard. Hundreds of tech press publications have used WebXPRT in thousands of articles and reviews, and testers have now run the benchmark well over 1.1 million times.

Below, I’ve listed some of the WebXPRT team’s accomplishments over the last decade. If you’ve been following WebXPRT from the beginning, this may all be familiar, but if you’re new to the community, it may be interesting to see some of the steps that contributed to making WebXPRT what it is today.

In future blog posts, we’ll look at how the number of WebXPRT runs has grown over time, and how WebXPRT use has grown among OEMs, vendors, and the tech press worldwide. Do you have any thoughts that you’d like to share from your WebXPRT testing experience? If so, let us know!

Justin

Comparing the performance of popular browsers with WebXPRT 4

If you’ve been reading the XPRT blog for a while, you know that we occasionally like to revisit a series of in-house WebXPRT comparison tests to see if recent updates have changed the performance rankings of popular web browsers. We published our most recent comparison last April, when we used WebXPRT 4 to compare the performance of five browsers on the same system.

For this round of tests, we used a Dell XPS 13 7390, which features an Intel Core i3-10110U processor and 4 GB of RAM, running Windows 11 Home updated to version 22H2 (22621.1105). We installed all current Windows updates and updated each of the browsers under test: Brave, Google Chrome, Microsoft Edge, Mozilla Firefox, and Opera.

After the update process completed, we turned off updates to prevent them from interfering with test runs. We ran WebXPRT 4 three times on each of the five browsers. The score we post for each browser is the median of the three test runs.
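
To make the scoring method concrete, here is a minimal sketch of the median-of-three selection we describe above; the run scores are hypothetical, not our actual results.

```typescript
// Report the median of three runs: sort numerically, take the middle.
function medianOfThree(a: number, b: number, c: number): number {
  return [a, b, c].sort((x, y) => x - y)[1];
}

const reportedScore = medianOfThree(198, 203, 201); // 201
```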

In our last round of tests, Edge was the clear winner, with a 2.2 percent performance advantage over Chrome. Firefox came in last, about 3 percent slower than Opera, which was in the middle of the pack. With updated versions of the browsers, the only change in rank order was that Brave moved into a tie with Opera.

While the rank order from this round of tests was very similar to the previous round, we did observe two clear performance trends: (1) the range between high and low scores was tighter, dropping from a difference of 7.8 percent to 4.3 percent, and (2) every browser demonstrated improved performance. The chart below illustrates both trends. Firefox showed the single largest score improvement at 7.8 percent, but the performance jump for each browser was considerable.
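
For readers who want to reproduce this kind of comparison, one common way to express the high-to-low spread is as a percentage of the lowest score. The sketch below is illustrative only, and the example scores are hypothetical rather than our published results.

```typescript
// Spread between the highest and lowest scores, expressed as a
// percentage of the lowest score. Example values are hypothetical.
function spreadPercent(scores: number[]): number {
  const high = Math.max(...scores);
  const low = Math.min(...scores);
  return ((high - low) / low) * 100;
}

const spread = spreadPercent([250, 244, 241, 243, 240]); // ≈ 4.2
```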

Do these results mean that Microsoft Edge will always provide a speedier web experience, or that Firefox will always be slower than the others? Not necessarily. It’s true that a device with a higher WebXPRT score will probably feel faster during daily web activities than one with a much lower score, but your experience depends in part on the types of things you do on the web, along with your system’s privacy settings, memory load, ecosystem integration, extension activity, and web app capabilities.

In addition, browser speed can noticeably increase or decrease after an update, and OS-specific optimizations can affect performance, such as with Edge on Windows 11 and Chrome on Chrome OS. All these variables are important to keep in mind when considering how WebXPRT results translate to your everyday experience.

Have you used WebXPRT to compare browser performance on the same system? Let us know how it turned out!

Justin

Looking back on 2022 with the XPRTs

Around the beginning of each new year, we like to take the opportunity to look back and summarize the XPRT highlights from the previous year. Readers of our newsletter are familiar with the stats and updates we include each month, but for our blog readers who don’t receive the newsletter, we’ve compiled some highlights from 2022 below.

Benchmarks
In the past year, we released WebXPRT 4 and the CloudXPRT v1.2 update package.

XPRTs in the media
Journalists, advertisers, and analysts referenced the XPRTs thousands of times in 2022. It’s always rewarding to know that the XPRTs have proven to be useful and reliable assessment tools for technology publications around the world. Media sites that used the XPRTs in 2022 include AnandTech, Android Authority, Benchlife.info (China), BodNara (South Korea), ComputerBase (Germany), DISKIDEE (Belgium), eTeknix, Expert Reviews, Gadgets 360, Hardware.info (The Netherlands), Hardware Zone (Singapore), ITC.ua (Ukraine), ITmedia (Japan), Itndaily.ru (Russia), Notebookcheck, PCMag, PC-Welt (Germany), PCWorld, TechPowerUp, Tom’s Guide, TweakTown, and ZOL.com (China).

Downloads and confirmed runs
In 2022, we had more than 10,800 benchmark downloads and 183,300 confirmed runs. Users have run our most popular benchmark, WebXPRT, more than 1,135,500 times since its debut in 2013! WebXPRT continues to be a go-to, industry-standard performance benchmark for OEM labs, vendors, and leading tech press outlets around the globe.

XPRT media, tools, and publications
Part of our mission with the XPRTs is to produce tools and materials that help testers better understand the ins and outs of benchmarking in general and the XPRTs in particular. To help achieve this goal, we published the following in 2022:

We’re thankful for everyone who used the XPRTs, joined the community, and sent questions and suggestions throughout 2022. We’re excited to see what’s in store for the XPRTs in 2023!

Justin
