BenchmarkXPRT Blog banner

Category: Performance benchmarking

The XPRTs in 2020: a year to remember

As 2020 comes to a close, we want to take this opportunity to review another productive year for the XPRTs. Readers of our newsletter are familiar with the stats and updates we include each month, but for our blog readers who don’t receive the newsletter, we’ve compiled some highlights below.

Benchmarks
In the past year, we released CrXPRT 2 and updated MobileXPRT 3 for testing on Android 11 phones. The biggest XPRT benchmark news was the release of CloudXPRT v1.0 and v1.01. CloudXPRT, our newest benchmark, accurately measures the performance of cloud applications deployed on modern infrastructure-as-a-service (IaaS) platforms, whether those platforms are hosted on-premises or in private or public clouds.

XPRTs in the media
Journalists, advertisers, and analysts referenced the XPRTs thousands of times in 2020, and it’s always rewarding to know that the XPRTs have proven to be useful and reliable assessment tools for technology publications such as AnandTech, Ars Technica, ComputerBase, Gizmodo, HardwareZone, Laptop Mag, Legit Reviews, Notebookcheck, PCMag, PCWorld, Popular Science, TechPowerUp, Tom’s Hardware, VentureBeat, and ZDNet.

Downloads and confirmed runs
So far in 2020, we’ve had more than 24,200 benchmark downloads and 164,600 confirmed runs. Our most popular benchmark, WebXPRT, just passed 675,000 runs since its debut in 2013! WebXPRT continues to be a go-to, industry-standard performance benchmark for OEM labs, vendors, and leading tech press outlets around the globe.

Media, publications, and interactive tools
Part of our mission with the XPRTs is to produce materials that help testers understand the ins and outs of benchmarking in general and the XPRTs in particular. Toward that goal, we published the following in 2020:

We’re thankful for everyone who has used the XPRTs, joined the community, and sent questions and suggestions throughout 2020. This will be our last blog post of the year, but there’s much more to come in 2021. Stay tuned in early January for updates!

Justin

The AIXPRT learning tool is now live (and a CloudXPRT version is on the way)!

We’re happy to announce that the AIXPRT learning tool is now live! We designed the tool to serve as an information hub for common AIXPRT topics and questions, and to help tech journalists, OEM lab engineers, and everyone who is interested in AIXPRT find the answers they need in as little time as possible.

The tool features four primary areas of content:

  • The Q&A section provides quick answers to the questions we receive most from testers and the tech press.
  • The AIXPRT: the basics section describes specific topics such as the benchmark’s toolkits, networks, workloads, and hardware and software requirements.
  • The testing and results section covers the testing process, metrics, and how to publish results.
  • The AI/ML primer provides brief, easy-to-understand definitions of key AI and ML terms and concepts for those who want to learn more about the subject.

The first screenshot below shows the home screen. The second shows how the pop-up information sections appear, using the Inference tasks (workloads) entry in the AI/ML primer section as an example.

We’re excited about the new AIXPRT learning tool, and we’re also happy to report that we’re working on a version of the tool for CloudXPRT. We hope to make the CloudXPRT tool available early next year, and we’ll post more information in the blog as we get closer to taking it live.

If you have any questions about the tool, please let us know!

Justin

We’ve updated MobileXPRT 3 to address issues with Android 11

This week, we published an updated MobileXPRT 3 build, version 3.116.0.4, on MobileXPRT.com and in the Google Play Store. The new build addresses an issue we recently discovered: MobileXPRT 3 was crashing after installation on some Android 11 phones. The cause was a combination of permissions requirements and a new storage strategy called scoped storage. By default, scoped storage restricts an app’s storage access to app-specific directories and media, and prohibits general access to external or public directories. It also prevents third-party apps such as email clients or file managers from accessing MobileXPRT 3 results files. Granting broader access requires an opt-in permissions prompt that MobileXPRT 3 did not have prior to this week’s release.
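For context, scoped storage became mandatory for apps targeting Android 11 (API level 30). The hypothetical manifest fragment below sketches the transition: the requestLegacyExternalStorage flag let apps targeting Android 10 defer scoped storage, but Android 11 ignores it for apps targeting API level 30, so file I/O must move to app-specific directories instead. The package name is illustrative, not MobileXPRT’s actual identifier.

```xml
<!-- Hypothetical manifest fragment; the package name is illustrative. -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.example.mediabenchmark">
    <application
        android:requestLegacyExternalStorage="true">
        <!-- Defers scoped storage on Android 10 only. Apps targeting
             Android 11 (API level 30) must instead write to app-specific
             directories, such as the path returned by
             Context.getExternalFilesDir(null), which require no
             permissions prompt under scoped storage. -->
    </application>
</manifest>
```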

MobileXPRT 3.116.0.4 points all of the benchmark’s file references to its private directory and allows users to zip results files and attach them to results submission emails. Neither change affects the testing process or test scores. If you have any questions or comments about the new MobileXPRT 3 build, please let us know!

Justin

The XPRTs can help with your holiday shopping

The biggest shopping days of the year are fast approaching, and if you’re researching phones, tablets, Chromebooks, or laptops in preparation for Black Friday and Cyber Monday sales, the XPRTs can help! One of the core functions of the XPRTs is to help cut through all the marketing noise by providing objective, reliable measures of a device’s performance. For example, instead of trying to guess whether a new Chromebook is fast enough to handle the demands of remote learning, you can use its CrXPRT and WebXPRT performance scores to see how it stacks up against the competition when handling everyday tasks.

A good place to start your search for scores is our XPRT results browser. The browser is the most efficient way to access the XPRT results database, which currently holds more than 2,600 test results from over 100 sources, including major tech review publications around the world, OEMs, and independent testers. It offers a wealth of current and historical performance data across all the XPRT benchmarks and hundreds of devices. You can read more about how to use the results browser here.

Also, if you’re considering a popular device, chances are good that someone has already published an XPRT score for that device in a recent tech review. The quickest way to find these reviews is by searching for “XPRT” within your favorite tech review site, or by entering the device name and an XPRT name (e.g., “Apple iPad” and “WebXPRT”) in a search engine. Here are a few recent tech reviews that use one or more of the XPRTs to evaluate a popular device:


The XPRTs can help consumers make better-informed and more confident tech purchases this holiday season, and we hope you’ll find the data you need on our site or in an XPRT-related tech review. If you have any questions about the XPRTs, XPRT scores, or the results database, please feel free to ask!

Justin

Thinking ahead to the next HDXPRT

We’re currently formulating our 2021 development roadmap for the XPRTs. In addition to planning CloudXPRT and WebXPRT updates, we’re discussing the possibility of releasing HDXPRT 5 in 2021. It’s hard for me to believe, but it’s been about two and a half years since we started work on HDXPRT 4, and February 2021 will mark two years since the first HDXPRT 4 release. Windows PCs are more powerful than ever, so it’s a good time to talk about how we can enhance the benchmark’s ability to measure how well the latest systems handle real-world media technologies and applications.

When we plan a new version of an XPRT benchmark, one of our first steps is updating the benchmark’s workloads so that they will remain relevant in years to come. We almost always update application content, such as photos and videos, to contemporary file resolutions and sizes. For example, we added both higher-resolution photos and a 4K video conversion task in HDXPRT 4. Are there specific types of media files that you think would be especially relevant to high-performance media tasks over the next few years?

Next, we will assess the suitability of the real-world trial applications used by the photo-editing, music-editing, and video-conversion test scenarios. Currently, these are Adobe Photoshop Elements, Audacity, CyberLink MediaEspresso, and HandBrake. Can you think of other applications that belong in a high-performance media processing benchmark?

In HDXPRT 4, we gave testers the option to target a system’s discrete graphics card during the video conversion workload. Has this proven useful in your testing? Do you have suggestions for new graphics-oriented workloads?

We’ll also strive to make the UI more intuitive, to simplify installation, and to reduce the size of the installation package. What elements of the current UI do you find especially useful or think we could improve? 

We welcome your answers to these questions and any additional suggestions or comments on HDXPRT 5. Send them our way!

Justin

Using WebXPRT 3 to compare the performance of popular browsers (Round 2)

It’s been nine months since we published a WebXPRT 3 browser performance comparison, so we decided to put the newest versions of popular browsers through their paces to see whether the performance rankings have changed since our last round of tests.

We used the same laptop as last time: a Dell XPS 13 7390 with an Intel Core i3-10110U processor and 4 GB of RAM, running Windows 10 Home updated to version 1909 (18363.1139). We installed all current Windows updates and tested on a clean system image. After the update process completed, we turned off updates to prevent them from interfering with test runs. We ran WebXPRT 3 three times on each of five browsers: Brave, Google Chrome, Microsoft Edge, Mozilla Firefox, and Opera. The posted score for each browser is the median of the three test runs.
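The scoring step above can be sketched with a short script. The numbers below are placeholders for illustration, not our measured results:

```python
from statistics import median

# Hypothetical raw WebXPRT 3 scores (higher is better); three runs per
# browser. These values are placeholders, not actual test results.
runs = {
    "Brave":   [148, 151, 150],
    "Chrome":  [155, 153, 156],
    "Firefox": [168, 170, 169],
}

# Report the median of the three runs for each browser, as described above.
for browser, scores in runs.items():
    print(f"{browser}: {median(scores)}")
```

Using the median rather than the mean keeps a single outlier run from skewing the posted score.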

In our last round of tests, the four Chromium-based browsers (Brave, Chrome, Edge, and Opera) produced scores that were nearly identical. Only Mozilla Firefox produced a significantly different (and better) score. The parity of the Chromium-based browsers was not surprising, considering they have the same underlying foundation.

In this round of testing, the Chromium-based browsers again produced very close scores, although Brave’s performance lagged by about 4 percent. Firefox again separated itself from the pack with a higher score. With the exception of Chrome, which produced the same score as last time, every browser’s score was slightly lower than before. There are many possible reasons for this, including increased overhead in the browsers or changes in Windows, and the slowdown for each browser will probably be unnoticeable to most users during everyday tasks.

Do these results mean that Mozilla Firefox will provide you with a speedier web experience? As we noted in the last comparison, a device with a higher WebXPRT score will probably feel faster during daily use than one with a lower score. For comparisons on the same system, however, the answer depends in part on the types of things you do on the web, how the extensions you’ve installed affect performance, how frequently the browsers issue updates and incorporate new web technologies, and how accurately each browser’s default installation settings reflect how you would set up that browser for your daily workflow.

In addition, browser speed can increase or decrease significantly after an update, only to swing back in the other direction shortly thereafter. OS-specific optimizations can also affect performance, such as with Edge on Windows 10 and Chrome on Chrome OS. All of these variables are important to keep in mind when considering how browser performance comparison results translate to your everyday experience.

What are your thoughts on browser performance? Let us know!

Justin
