
Category: AnandTech

The XPRTs in 2020: a year to remember

As 2020 comes to a close, we want to take this opportunity to review another productive year for the XPRTs. Readers of our newsletter are familiar with the stats and updates we include each month, but for our blog readers who don’t receive the newsletter, we’ve compiled some highlights below.

Benchmarks
In the past year, we released CrXPRT 2 and updated MobileXPRT 3 for testing on Android 11 phones. The biggest XPRT benchmark news was the release of CloudXPRT v1.0 and v1.01. CloudXPRT, our newest benchmark, can accurately measure the performance of cloud applications deployed on modern infrastructure-as-a-service (IaaS) platforms, whether those platforms are hosted on-premises or in private or public clouds.

XPRTs in the media
Journalists, advertisers, and analysts referenced the XPRTs thousands of times in 2020, and it’s always rewarding to know that the XPRTs have proven to be useful and reliable assessment tools for technology publications such as AnandTech, Ars Technica, ComputerBase, Gizmodo, HardwareZone, Laptop Mag, Legit Reviews, Notebookcheck, PCMag, PCWorld, Popular Science, TechPowerUp, Tom’s Hardware, VentureBeat, and ZDNet.

Downloads and confirmed runs
So far in 2020, we’ve had more than 24,200 benchmark downloads and 164,600 confirmed runs. Our most popular benchmark, WebXPRT, just passed 675,000 runs since its debut in 2013! WebXPRT continues to be a go-to, industry-standard performance benchmark for OEM labs, vendors, and leading tech press outlets around the globe.

Media, publications, and interactive tools
Part of our mission with the XPRTs is to produce materials that help testers better understand the ins and outs of benchmarking in general and the XPRTs in particular. To help achieve this goal, we’ve published the following in 2020:

We’re thankful for everyone who has used the XPRTs, joined the community, and sent questions and suggestions throughout 2020. This will be our last blog post of the year, but there’s much more to come in 2021. Stay tuned in early January for updates!

Justin

WebXPRT 3: relevant, reliable, and easy to use

WebXPRT continues to be the most widely used XPRT benchmark, with just over 625,000 runs to date. Since its first release in 2013, WebXPRT has been popular with device manufacturers, developers, tech journalists, and consumers because it’s easy to run, it runs on almost anything with a web browser, and its workloads reflect the types of web-based tasks that people are likely to encounter on a daily basis.

We realize that many folks who follow the XPRTs may be unaware of the wide variety of WebXPRT uses that we frequently read about in the tech press. Today, we thought it would be interesting to bring the numbers to life. In addition to dozens of device reviews, here’s a sample of WebXPRT 3 mentions over the past few weeks.

As we plan for the next version of WebXPRT, we want to be sure we build a benchmark that continues WebXPRT’s legacy of relevant workloads, ease of use, and broad compatibility. We know what works well in our lab, but to build a benchmark that meets the needs of a diverse group of users all around the world, it’s important that we hear from all types of testers. We recently discussed some of the new technologies that we’re considering for WebXPRT 4, so please don’t hesitate to let us know what you think about those proposals, or send any additional ideas you may have!

Justin

The XPRTs in action

In the near future, we’ll update our “XPRTs around the world” infographic, which provides a snapshot of how people are using the XPRTs worldwide. Among other stats, we include the number of XPRT web mentions, articles, and reviews that have appeared during a given period. Recently, we learned how one of those statistics—a single web site mention of WebXPRT—found its way to consumers in more places than we would have imagined.

Late last month, AnandTech published a performance comparison by Andrei Frumusanu examining the Samsung Galaxy S9’s Snapdragon 845 and Exynos 9810 variants and a number of other high-end phones. WebXPRT was one of the benchmarking tools used. The article stated that both versions of the brand-new S9 were slower than the iPhone X and, in some tests, were slower than even the iPhone 7.

A CNET video discussed the article and the role of WebXPRT in the performance comparison, and the article has been reposted to hundreds of tech media sites around the world. A quick survey shows reposts in Albania, Bulgaria, Denmark, Chile, the Czech Republic, France, Germany, Greece, Indonesia, Iran, Italy, Japan, Korea, Poland, Russia, Spain, Slovakia, Turkey, and many other countries.

The popularity of the article is not surprising, for it positions the newest flagship phones from the industry’s two largest phone makers in a head-to-head comparison with a somewhat unexpected outcome. AnandTech did nothing to stir controversy or sensationalize the test results, but simply provided readers with an objective, balanced assessment of how these devices compare so that they could draw their own conclusions. The XPRTs share this approach.

We’re grateful to Andrei and others at AnandTech who’ve used the XPRTs over the years to produce content that helps consumers make informed decisions. WebXPRT is just part of AnandTech’s toolkit, but it’s one that’s accessible to anybody free of charge. With the help of BenchmarkXPRT Development Community members, we’ll continue to publish XPRT tools that help users everywhere gain valuable insight into device performance.

Justin

Quarterly review

It’s been one of our busiest quarters ever! Here’s a quick review of what’s been happening:

The XPRTs were on the road a lot!

While I was at CES, I was lucky enough to be able to sit down and talk on the record with a couple of community members:

Many thanks to them for being so generous with their time and their insights.

We also gave folks a lot to look at:

That is a great start to the year, but we’re going to top it: next week, we’ll kick off Q2 with one of our biggest announcements ever!

Eric

Focusing the spotlight

As you may have heard, the Samsung Galaxy S7 is featured in the XPRT Weekly Tech Spotlight this week. As we were testing it, we noticed that our WebXPRT scores were about 8 percent lower than those reported by AnandTech.

The folks at AnandTech do a good job on their reviews, so we wanted to understand the discrepancy in scores. The S7 comes in a couple of models, so we started by verifying that our model was the same as theirs. It was.

The next step was to check their configuration against ours, and this is where we found the difference. Both phones were running the same version of Android, but the S7 AnandTech tested used Chrome 48, while the S7 we tested came preloaded with Chrome 49. In our testing, we’ve noticed that upgrading from Chrome 48 to Chrome 49 has a noticeable performance impact on certain devices. On the Samsung Galaxy S6, the scores went down about 10 percent. In all cases we’ve seen, the decrease is driven largely by the Stock Option Pricing workload.
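For readers curious about why a browser update can move scores this much: the Stock Option Pricing workload is the kind of numerically heavy JavaScript whose timing depends on the browser’s engine. The sketch below (written in TypeScript) is purely illustrative and is not WebXPRT’s actual workload code; it simply times a generic Black-Scholes option-pricing loop of the sort that can speed up or slow down from one Chrome release to the next.

// Illustrative only: a generic Black-Scholes kernel, not WebXPRT's code.
// Timing a tight floating-point loop like this is one way a browser update
// can shift a score without any hardware change.

// Abramowitz & Stegun approximation of the standard normal CDF.
function normCdf(x: number): number {
  const k = 1 / (1 + 0.2316419 * Math.abs(x));
  const poly = k * (0.31938153 + k * (-0.356563782 +
    k * (1.781477937 + k * (-1.821255978 + k * 1.330274429))));
  const approx = 1 - (Math.exp(-0.5 * x * x) / Math.sqrt(2 * Math.PI)) * poly;
  return x >= 0 ? approx : 1 - approx;
}

// Price a European call option: spot s, strike k, rate r, volatility v, years t.
function blackScholesCall(s: number, k: number, r: number, v: number, t: number): number {
  const d1 = (Math.log(s / k) + (r + 0.5 * v * v) * t) / (v * Math.sqrt(t));
  const d2 = d1 - v * Math.sqrt(t);
  return s * normCdf(d1) - k * Math.exp(-r * t) * normCdf(d2);
}

// Time a batch of pricings, roughly the way a benchmark workload would.
const start = Date.now();
let sum = 0;
for (let i = 0; i < 1_000_000; i++) {
  sum += blackScholesCall(100 + (i % 50), 100, 0.03, 0.2 + (i % 10) / 100, 1);
}
console.log(`checksum ${sum.toFixed(2)} in ${Date.now() - start} ms`);

Running a loop like this on the same device under two different browser versions is a quick way to see this kind of effect for yourself.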

This isn’t the first time we’ve written about browser versions affecting results. WebXPRT is a browsing benchmark, and the browser has a legitimate impact on performance. When you’re comparing results, it’s always important to look at all the factors involved.

Justin

Last week in the XPRTs

We published the XPRT Weekly Tech Spotlight on the Samsung Galaxy S7.
We added two new BatteryXPRT ’14 results.
We added one new MobileXPRT ’15 result.
We added four new WebXPRT ’15 results.

A clarification from Brett Howse

A couple of weeks ago, I described a conversation I had with Brett Howse of AnandTech. Brett was kind enough to send a clarification of some of his remarks, which he gave us permission to share with you.

“We are at a point in time where the technology that’s been called mobile since its inception is now at a point where it makes sense to compare it to the PC. However we struggle with the comparisons because the tools used to do the testing do not always perform the same workloads. This can be a major issue when a company uses a mobile workload, and a desktop workload, but then puts the resulting scores side by side, which can lead to misinformed conclusions. This is not only a CPU issue either, since on the graphics side we have OpenGL well established, along with DirectX, in the PC space, but our mobile workloads tend to rely on OpenGL ES, with less precision asked of the GPU, and GPUs designed around this. Getting two devices to run the same work is a major challenge, but one that has people asking what the results would be.”
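Brett’s point about GPU precision is easy to see in miniature. The toy sketch below is my own illustration in TypeScript, not code from AnandTech’s or the XPRTs’ tests: the same running sum computed in 64-bit floating point and in simulated 32-bit floating point (via Math.fround) drifts apart, a small-scale version of why a desktop OpenGL workload and a reduced-precision OpenGL ES workload aren’t doing directly comparable work.

// Illustrative only: contrast how the same math diverges at lower precision.
// Desktop GPU workloads typically run at 32-bit float precision or better,
// while mobile OpenGL ES shaders often request mediump, which can be narrower
// still; here we simply compare 64-bit and 32-bit accumulation on the CPU.

function sumDouble(n: number): number {
  let s = 0;
  for (let i = 1; i <= n; i++) s += 1 / i;          // 64-bit accumulation
  return s;
}

function sumSingle(n: number): number {
  let s = 0;
  for (let i = 1; i <= n; i++) {
    s = Math.fround(s + Math.fround(1 / i));        // round every step to 32-bit
  }
  return s;
}

const n = 10_000_000;
console.log("64-bit sum:", sumDouble(n));
console.log("32-bit sum:", sumSingle(n));
// The two results visibly diverge; work written for one precision regime
// can't simply be rescored against work written for another.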

I really appreciate Brett taking the time to respond. What are your thoughts on these issues? Please let us know!

Eric
