One
of our goals during the ongoing WebXPRT 4 development process is to be as
responsive as possible to user feedback, and we want to emphasize that it’s not
too late to send us your ideas. Until we finalize the details for each workload
and complete the code work for the preview build, we still have quite a bit of
flexibility around adding new features.
Just
this week, a community member raised the possibility of a WebXPRT 4 feature that
would enable user-specific test ID numbers or accounts. One possible implementation
of the idea would allow a user to sign up for a WebXPRT test account as an
individual or on behalf of their organization. The test accounts would be both
free and optional; you could continue to run the benchmark without an account,
but running it with an account would let you save and view your test history. Another
implementation option we are considering would let users generate a permanent
user ID number for themselves or their organization. They could then use that
number to tag and search for their automated test runs in our database, without
having to log into an account.
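To make the second option more concrete, here is a minimal Python sketch of how a permanent test ID might tag automated runs, assuming WebXPRT 4 accepted the ID as a URL parameter. The testid parameter name and the ID format are our illustrative assumptions, not a confirmed design.

```python
# Hypothetical sketch of the permanent test ID idea. The "testid" URL
# parameter is an assumption for illustration; WebXPRT does not currently
# support it.
import uuid

def generate_test_id() -> str:
    # Generate the ID once, then store and reuse it for every run.
    return uuid.uuid4().hex[:12]

def tagged_run_url(test_id: str) -> str:
    # Tag an automated run so its result could later be found by this ID.
    base = "https://www.principledtechnologies.com/benchmarkxprt/webxprt/"
    return f"{base}?testid={test_id}"  # hypothetical parameter

print(tagged_run_url(generate_test_id()))
```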
Our biggest question at the moment is whether our user base would be interested in WebXPRT user accounts or test IDs. If this concept piques your interest, or you have suggestions for implementation, please let us know!
Last November, we published our WebXPRT 3 browser performance comparison, so we decided it was time to see whether the performance rankings of popular browsers have changed in the nine months since.
For this round of tests, we used the same laptop as last time:
a Dell
XPS 13 7390 with an Intel
Core i3-10110U processor and 4 GB of RAM running Windows 10 Home, updated to
version 1909 (18363.1556). We installed all current Windows updates and tested
on a clean system image. After the update process completed, we turned off
updates to prevent them from interfering with test runs. We ran WebXPRT 3 three
times each on five browsers: Brave, Google Chrome, Microsoft Edge, Mozilla Firefox,
and Opera. For each browser, the score we post below is the median of the three
test runs.
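For readers who script their own comparisons, the scoring logic is easy to reproduce. Here is a small Python sketch of the median-of-three approach; the numbers are placeholders, not our published scores.

```python
# Median-of-three scoring, as described above. The run scores below are
# placeholders for illustration, not our published results.
from statistics import median

runs = {
    "Brave":   [215, 217, 218],
    "Chrome":  [230, 231, 233],
    "Edge":    [229, 231, 232],
    "Firefox": [268, 270, 272],
    "Opera":   [229, 230, 231],
}

for browser, scores in runs.items():
    print(f"{browser}: median of three runs = {median(scores)}")
```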
In our last
round of tests, the four
Chromium-based browsers (Brave, Chrome, Edge, and Opera) produced very close scores,
though Brave’s score was about four percent lower. In this round
of testing, performance improved for all four of the Chromium-based browsers.
Chrome, Edge, and Opera still produced very close scores, but Brave’s
performance still lagged, this time by about seven percent.
Firefox separated itself from the pack with a much higher score
and has been the clear winner in all three rounds of testing. During our second
round of testing in November, every browser except for Chrome saw slightly slower
performance than in the first round. In these latest tests, all the
Chromium-based browsers produced significantly higher scores than in the second
round. When discussing browser performance, it’s important to remember that there
are many possible reasons for these performance changes—including changes in browser
overhead or changes in Windows—and most users may not notice the changes during
everyday tasks.
Do these results mean that
Mozilla Firefox will always provide you with a speedier web experience? As we
noted in previous comparisons, a device with a higher WebXPRT score will
probably feel faster during daily use than one with a lower score. For comparisons
on the same system, however, the answer depends on several factors, such as the
types of things you do on the web, how the extensions you’ve installed affect
performance, how frequently the browsers issue updates and incorporate new web
technologies, and how accurately each browser’s default installation settings
reflect how you would set up that browser for your daily workflow.
In addition, browser speed can
increase or decrease significantly after an update, only to swing back in the
other direction shortly thereafter. OS-specific optimizations can also affect
performance, such as with Edge on Windows 10 or Chrome on Chrome OS. All these
variables are important to keep in mind when considering how browser
performance comparison results translate to your everyday experience.
Do you have insights
you’d like to share from using WebXPRT to compare browser performance? Let us know!
As we move forward with the WebXPRT 4 development
process, we’re also working on ways to enhance the value of WebXPRT beyond simply
updating the benchmark. Our primary goal is to expand and improve the WebXPRT-related
tools and resources we offer at WebXPRT.com, starting with a new results
viewer.
Currently, users can view
WebXPRT results on our site in two primary ways, each of which has advantages and
limitations.
The first way is the WebXPRT results viewer, which includes hundreds of
PT-curated performance scores from a wide range of trusted sources and devices.
Users can sort entries by device type, device name, device model, overall
score, date of publication, and source. The viewer also includes a free-form
filter for quick, targeted searches. While the results viewer contains a wealth
of information, it does not let users view and compare multiple results at once
with graphs or charts. It also offers no easy way to access the additional
test-device data and subtest scores that we have for many entries.
The second way to view WebXPRT
results on our site is the WebXPRT Processor Comparison
Chart. The
chart uses horizontal bar graphs to compare test scores from the hundreds of
published results in our database, grouped by processor type. Users can click
the average score for a processor to view all the WebXPRT results we currently
have for that processor. The chart’s visual format and its automated
“group by processor type” feature are very useful, but the chart lacks the
sorting and filtering capabilities of the viewer, and navigating to the details
of individual tests takes multiple clicks.
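To give a concrete, if simplified, picture of how those two feature sets could fit together, here is a rough Python sketch that pairs viewer-style free-form filtering with chart-style group-by-processor averaging. The fields and sample rows are illustrative, not data from our database.

```python
# Rough sketch of a combined results tool: free-form filtering (like the
# results viewer) plus per-processor averaging (like the comparison chart).
# All data below is made up for illustration.
from collections import defaultdict
from dataclasses import dataclass
from statistics import mean

@dataclass
class Result:
    device_type: str
    device_name: str
    processor: str
    overall_score: float
    source: str

results = [
    Result("Laptop", "Example Notebook A", "Intel Core i3-10110U", 178.0, "PT"),
    Result("Laptop", "Example Notebook B", "Intel Core i3-10110U", 182.0, "PT"),
    Result("Desktop", "Example Desktop C", "Intel Core i7-10700", 265.0, "PT"),
]

def filter_results(rows, needle):
    # Viewer-style free-form filter: match the search term in any field.
    needle = needle.lower()
    return [r for r in rows
            if any(needle in str(v).lower() for v in vars(r).values())]

def average_by_processor(rows):
    # Chart-style grouping: average overall score per processor.
    groups = defaultdict(list)
    for r in rows:
        groups[r.processor].append(r.overall_score)
    return {cpu: mean(scores) for cpu, scores in groups.items()}

print(average_by_processor(filter_results(results, "laptop")))
```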
In the coming months, we’ll be working to combine the best features of the results viewer and the comparison chart into a single powerful WebXPRT results database tool. We’ll also be investigating ways to add new visual aids, navigation controls, and data-handling capabilities to that tool. We want to provide a tool that helps testers and analysts access the wealth of WebXPRT test information in our database in an efficient, productive, and enjoyable way. If you have ideas or comments about what you’d like to see in a new WebXPRT results viewing tool, please let us know!
We
recently published a set of CloudXPRT Data Analytics and Web Microservices
workload test results
submitted by Quanta Computer, Inc.
The Quanta submission is the first set of CloudXPRT results that we’ve
published using the formal results submission and approval process.
We’re grateful to the Quanta team for carefully following the submission
guidelines, enabling us to complete the review process without a hitch.
If you are unfamiliar
with the process, you can find general information about how we review
submissions in a previous blog post.
Detailed, step-by-step instructions are available on the results submission page.
As a reminder for testers who are considering submitting results for July, the
submission deadline is tomorrow, Friday July 16, and the publication date is
Friday July 30. We list the submission and publication dates for the rest of
2021 below. Please note that we do not plan to review submissions in December,
so if we receive results submissions after November 30, we may not publish them
until the end of January 2022.
August
Submission deadline: Tuesday 8/17/21
Publication date: Tuesday 8/31/21
September
Submission deadline: Thursday 9/16/21
Publication date: Thursday 9/30/21
October
Submission deadline: Friday 10/15/21
Publication date: Friday 10/29/21
November
Submission deadline: Tuesday 11/16/21
Publication date: Tuesday 11/30/21
December
Submission deadline: N/A
Publication date: N/A
If you have any questions about the CloudXPRT results submission, review, or publication process, please let us know!
In early May, we sent
a survey to members of the tech press who regularly use WebXPRT in articles and
reviews. We asked for their thoughts on several aspects of WebXPRT, as well as what
they’d like to see in the upcoming fourth version of the benchmark. We also
published the survey questions here in the blog, and invited
experienced WebXPRT testers to send their feedback as well. We received some
good responses to the survey, and for the benefit of our readers, we’ve
summarized some of the key comments and suggestions below.
One respondent stated that WebXPRT is demanding enough to test
performance but suggested that, to better simulate modern web usage, we should
consult the most up-to-date studies of common browser tasks and web technologies. This
suggestion lines up with our intention to study the feasibility of adding a WebAssembly workload.
One respondent liked the fact that, unlike many other browser
benchmarks, WebXPRT tests more than just JavaScript calculation speed.
One respondent suggested that we include a link to a WebXPRT
white paper within the UI, or at least a guide describing what happens during
each workload.
One respondent said they would like WebXPRT to automatically produce a
results file on the local test system.
One respondent said that WebXPRT has a relatively long runtime
for a browser benchmark, and they would prefer that the runtime not increase in
WebXPRT 4.
We received no direct calls for a battery life test, likely because many
testers already have scripts or methodologies in place for battery testing,
but one tester suggested adding the ability to loop the test so users can measure
performance over varying lengths of time (see the sketch after this list).
There were no requests to bring back any aspects of WebXPRT 2015
that we removed in WebXPRT 3.
There were no reports of significant connection issues when
testing with WebXPRT.
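As a thought experiment on that looping suggestion, the Python sketch below repeats the benchmark for a fixed wall-clock duration and collects each score. The run_webxprt() helper is a placeholder for whatever browser-automation harness a tester already uses; WebXPRT itself does not ship one.

```python
# Sketch of the looping idea from the survey feedback. run_webxprt() is a
# stand-in for the tester's own automation (e.g., a Selenium script);
# WebXPRT does not provide this helper.
import time

def run_webxprt() -> float:
    """Launch the browser, run one WebXPRT pass, and return the score."""
    raise NotImplementedError("plug in your own automation here")

def loop_for(minutes: float) -> list[float]:
    scores = []
    deadline = time.monotonic() + minutes * 60
    while time.monotonic() < deadline:
        scores.append(run_webxprt())  # one full benchmark iteration
    return scores
```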
We greatly appreciate the members of the tech press who responded to the survey. We’re still in the planning stages of WebXPRT 4, so there’s time for anyone to send comments or ideas to benchmarkxprtsupport@principledtechnologies.com. We look forward to hearing from you!
Device reviews in publications
such as AnandTech, Notebookcheck, and PCMag, among many others, often feature
WebXPRT test results, and we appreciate the many members of the tech press who
use WebXPRT. As we move forward with the WebXPRT 4 development process, we’re especially
interested in learning what longtime users would like to see in a new version
of the benchmark.
In previous posts,
we’ve asked people to weigh in on the potential addition of a WebAssembly workload or a battery life test. We’d also like to ask experienced testers some other
test-related questions. To that end, this week we’ll be sending a WebXPRT 4
survey directly to members of the tech press who frequently publish WebXPRT
test results.
Regardless of whether you are a member of the tech press, we invite you to participate by sending your answers to any or all of the questions below to benchmarkxprtsupport@principledtechnologies.com by the end of May.
Do you think WebXPRT 3’s selection of workload scenarios is representative of modern web tasks?
How do you think WebXPRT compares to other common browser-based benchmarks, such as JetStream, Speedometer, and Octane?
Are there web technologies that you’d like us to include in additional workloads?
Are you happy with the WebXPRT 3 user interface? If not, what UI changes would you like to see?
Are there any aspects of WebXPRT 2015 that we changed in WebXPRT 3 that you’d like to see us change back?
Have you ever experienced significant connection issues when testing with WebXPRT?
Given the array of workloads, do you think the WebXPRT runtime is reasonable? Would you mind if the average runtime were a bit longer?
Are there any other aspects of WebXPRT 3 that you’d like to see us change?
If you’d like to discuss any topics
that we did not cover in the questions above, please feel free to include additional
comments in your response. We look forward to hearing your thoughts!