Category: Performance of computing devices

The value of speed

I was reading an interesting article on how high-end smartphones like the iPhone X, Pixel 2 XL, and Galaxy S8 generate more in-game revenue than cheaper phones do.

One line stood out to me: “With smartphones becoming faster, larger and more capable of delivering an engaging gaming experience, these monetization key performance indicators (KPIs) have begun to increase significantly.”

It turns out the game companies totally agree with the rest of us that faster devices are better!

Regardless of who is seeking better performance (consumers or game companies), the obvious question is how to determine which models are fastest. Many folks rely on device vendors' claims about how much faster the new model is. Unfortunately, vendors don't always say what they base those claims on, and even when they do, it's hard to know whether the numbers are accurate and applicable to the way you use your device.

The key part of any answer is performance tools that are representative, dependable, and open.

  • Representative – Performance tools need realistic workloads that do the things you actually care about.
  • Dependable – Good performance tools run reliably and produce repeatable results, both of which require significant development and testing work. (The sketch after this list shows one simple way to check repeatability yourself.)
  • Open – Performance tools that let people access the source code, and even contribute to it, keep things above board and reassure you that you can rely on the results.
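
To make the "dependable" criterion concrete, here's a minimal sketch, in TypeScript, of one way to check repeatability yourself. The runBenchmark function is a hypothetical stand-in for invoking whatever tool you use and collecting its overall score; it is not part of any XPRT tool.

    // Hypothetical stand-in for running any benchmark and returning its
    // overall score; here it just simulates a noisy result.
    async function runBenchmark(): Promise<number> {
      return 440 + Math.random() * 10;
    }

    // Run the benchmark several times and report the mean score and the
    // coefficient of variation (CV), a simple measure of repeatability.
    async function measureRepeatability(runs: number): Promise<void> {
      const scores: number[] = [];
      for (let i = 0; i < runs; i++) {
        scores.push(await runBenchmark());
      }
      const mean = scores.reduce((a, b) => a + b, 0) / scores.length;
      const variance =
        scores.reduce((sum, s) => sum + (s - mean) ** 2, 0) / scores.length;
      const cv = (Math.sqrt(variance) / mean) * 100; // in percent
      console.log(`mean: ${mean.toFixed(1)}, CV: ${cv.toFixed(2)}%`);
    }

    measureRepeatability(5);

A small coefficient of variation across repeated runs on an unchanged system is a good sign that the differences you see between devices reflect the devices themselves, not noise.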

Our goal with the XPRTs is to provide performance tools that meet all these criteria. WebXPRT 3 and all our other XPRTs exist to help accurately reveal how devices perform. You can run them yourself or rely on the wealth of results that we and others have collected on a wide array of devices.

The best thing about good performance tools is that everyone, even vendors, can use them. I sincerely hope that you find the XPRTs helpful when you make your next technology purchase.

Bill

The XPRTs in action

In the near future, we’ll update our “XPRTs around the world” infographic, which provides a snapshot of how people are using the XPRTs worldwide. Among other stats, we include the number of XPRT web mentions, articles, and reviews that have appeared during a given period. Recently, we learned how one of those statistics—a single web site mention of WebXPRT—found its way to consumers in more places than we would have imagined.

Late last month, AnandTech published a performance comparison by Andrei Frumusanu examining the Samsung Galaxy S9’s Snapdragon 845 and Exynos 9810 variants and a number of other high-end phones. WebXPRT was one of the benchmarking tools used. The article stated that both versions of the brand-new S9 were slower than the iPhone X and, in some tests, were slower than even the iPhone 7.

A CNET video discussed the article and the role of WebXPRT in the performance comparison, and the article has been reposted to hundreds of tech media sites around the world. A quick survey shows reposts in Albania, Bulgaria, Denmark, Chile, the Czech Republic, France, Germany, Greece, Indonesia, Iran, Italy, Japan, Korea, Poland, Russia, Spain, Slovakia, Turkey, and many other countries.

The popularity of the article is not surprising, for it positions the newest flagship phones from the industry’s two largest phone makers in a head-to-head comparison with a somewhat unexpected outcome. AnandTech did nothing to stir controversy or sensationalize the test results, but simply provided readers with an objective, balanced assessment of how these devices compare so that they could draw their own conclusions. The XPRTs share this approach.

We’re grateful to Andrei and others at AnandTech who’ve used the XPRTs over the years to produce content that helps consumers make informed decisions. WebXPRT is just part of AnandTech’s toolkit, but it’s one that’s accessible to anybody free of charge. With the help of BenchmarkXPRT Development Community members, we’ll continue to publish XPRT tools that help users everywhere gain valuable insight into device performance.

Justin

Here’s to 100 more!

This week’s Essential Phone entry marks the 100th device that we’ve featured in the XPRT Weekly Tech Spotlight! It’s a notable milestone for us as we work toward our goal of building a substantial library of device information that buyers can use to compare devices. In celebration, I thought it would be fun to share some Spotlight-related stats.

Our first Spotlight entry was the Google Pixel C way back on February 8, 2016, and we’ve featured a wide array of devices since then:

  • 33 phones
  • 16 laptops
  • 16 tablets
  • 16 2-in-1s
  • 6 small-form-factor PCs
  • 5 desktops
  • 5 game consoles
  • 3 all-in-ones

In addition to a wide variety of device types, we try to include a wide range of vendors. So far, we’ve featured devices from Acer, Alcatel, Alienware, Amazon, Apple, ASUS, BLU, CHUWI, Dell, Essential, Fujitsu, Google, HP, HTC, Huawei, Intel, LeEco, Lenovo, LG, Microsoft, NVIDIA, OnePlus, Razer, Samsung, Sony, Syber, Xiaomi, and ZTE. We look forward to adding many more to that list during the year ahead.

XPRT Spotlight is a great way for device vendors and manufacturers to share PT-verified specs and test results with buyers around the world. If you’re interested in sending in a device for testing, please contact XPRTSpotlight@PrincipledTechnologies.com.

There’s a lot more to come for XPRT Spotlight, and we’re constantly working on new features and improvements for the page. Are there any specific devices or features that you would like to see in the Spotlight? Let us know.

Justin

Digging deeper

From time to time, we like to revisit the fundamentals of the XPRT approach to benchmark development. Today, we’re discussing the need for testers and benchmark developers to consider the multiple factors that influence benchmark results. For every device we test, all of its hardware and software components have the potential to affect performance, and changing the configuration of those components can significantly change results.

For example, we frequently see significant performance differences between browsers on the same system. In our recent recap of the XPRT Weekly Tech Spotlight's first year, we highlighted an example of how testing the same device with the same benchmark can produce different results, depending on the software stack under test. In that instance, the Alienware Steam Machine entry included a WebXPRT 2015 score for each of the two browsers that consumers were likely to use. The first score (356) represented the SteamOS browser app in the SteamOS environment, and the second (441), roughly 24 percent higher, represented the Iceweasel browser (a Firefox variant) in the Linux-based desktop environment. Including only the first score would have given readers an incomplete picture of the Steam Machine's web-browsing capabilities, so we thought it was important to include both.

We also see performance differences between versions of the same browser, a fact that's especially relevant to those who use frequently updated browsers such as Chrome. And even benchmarks that measure the same general area of performance (web browsing, for example) usually test very different things.

OS updates can also have an impact on performance. Consumers might base a purchase on performance or battery life scores and end up with a device that behaves much differently when updated to a new version of Android or iOS, for example.

Other important factors in the software stack include pre-installed software, commonly referred to as bloatware, and the proliferation of apps that sap performance and battery life.

This is a much larger topic than we can cover in the blog. Let the examples we’ve mentioned remind you to think critically about, and dig deeper into, benchmark results. If we see published XPRT scores that differ significantly from our own results, our first question is always “What’s different between the two devices?” Most of the time, the answer becomes clear as we compare hardware and software from top to bottom.
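
If you record results yourself, capturing the software stack alongside every score makes those comparisons far easier later on. Here's a minimal, hypothetical sketch in TypeScript for browser-based benchmarks like WebXPRT; the ResultRecord shape and recordResult helper are our own invention, not part of any XPRT tool.

    // Record the software stack alongside a benchmark score so that
    // differing results can be explained later. Runs in a browser.
    interface ResultRecord {
      score: number;
      userAgent: string; // identifies browser name/version and hints at the OS
      platform: string;  // coarse OS/architecture hint
      timestamp: string;
    }

    function recordResult(score: number): ResultRecord {
      return {
        score,
        userAgent: navigator.userAgent,
        platform: navigator.platform,
        timestamp: new Date().toISOString(),
      };
    }

    // Example: log the context for a WebXPRT 2015 score of 441.
    console.log(recordResult(441));

Even this small amount of context is often enough to answer the "what's different?" question months after a test.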

Justin

IDF16 Shenzhen

I just spent the last couple of days at IDF16 Shenzhen. It was a great opportunity to talk with folks about the XPRTs, see demos of upcoming technology, and think about where the XPRTs might go next.

The technology and product demos covered a lot of ground: I saw everything from the latest computers to games to VR to body monitoring.

Of particular interest to me were the future-looking technologies beyond the usual array of notebooks, tablets, and servers. I saw drones that could follow and film a person while avoiding obstacles such as trees. I saw a number of demos using the Oculus Rift. I got to see robot demos that accomplished impressive things with fairly off-the-shelf technology. I would have had myself scanned and a small 3D model of myself printed, but I was pressed for time and the line was too long.

I was particularly interested in a mirror that could scan a person and report on aspects of their health. I also found somewhat amusing a technology demo that could “beautify” a person in real time for teleconferencing services such as Skype. While I might quibble about the definition of beautify, the idea of real-time video enhancement is intriguing. (Given the raw material I gave it to work with, it was no easy task to accomplish!) Maybe I won’t need to shave before my next WebEx meeting…

All of these technologies hint at areas the XPRTs may go in the future. While I don’t think we’re quite ready for BeautificationXPRT, there may well be workloads we should consider, such as pathfinding, real-time video enhancement, health monitoring, virtual reality, and gaming. Please let us know your thoughts about what near-term technologies we should consider for future workloads.

Exciting times definitely lie ahead, both for technology and for the XPRTs!

Bill

A clarification from Brett Howse

A couple of weeks ago, I described a conversation I had with Brett Howse of AnandTech. Brett was kind enough to send a clarification of some of his remarks, which he gave us permission to share with you.

“We are at a point in time where the technology that’s been called mobile since its inception is now at a point where it makes sense to compare it to the PC. However we struggle with the comparisons because the tools used to do the testing do not always perform the same workloads. This can be a major issue when a company uses a mobile workload, and a desktop workload, but then puts the resulting scores side by side, which can lead to misinformed conclusions. This is not only a CPU issue either, since on the graphics side we have OpenGL well established, along with DirectX, in the PC space, but our mobile workloads tend to rely on OpenGL ES, with less precision asked of the GPU, and GPUs designed around this. Getting two devices to run the same work is a major challenge, but one that has people asking what the results would be.”

I really appreciate Brett taking the time to respond. What are your thoughts on these issues? Please let us know!

Eric
