
Airborne

I’m old enough that I’ve never really understood the whole selfie thing. However, it’s clearly never going away, and I’m fascinated–although a little creeped out–by the development of selfie drones. It’s amazing that we have so quickly reached the point where you can buy a drone that will literally fit in your pocket.

As an example of how sophisticated these devices can be, consider the Zero Zero Robotics Hover Camera Passport. It's capable of capturing 4K UHD video and 13-megapixel images, it can track faces or bodies, and it uses sensors, including sonar, to measure its distance from the ground. All in a package that's about the size of an old VHS tape.

A while back, we talked about the new ways people are finding to use technology, and how the XPRTs need to adapt. While I don't think we'll be seeing a DroneXPRT any time soon, we've been talking about bringing the technologies that make these devices possible, including machine learning, computer vision, and 4K video, into the XPRTs.

What new devices fascinate you? Which technologies are going to be most useful in the near future? Let us know!

Eric

How do you say that?

I recently saw this video and heard something I had never imagined: “Next we tested with what I assume is pronounced web-export.” I’ve had people ask whether XPRT is an acronym, but I’ve never heard it pronounced “export.”

How do we pronounce XPRT? The same way we pronounce “expert.” So, it’s “Benchmark expert,” “Web expert,” “Touch expert,” and so on. CrXPRT is pronounced “C-R expert” and HDXPRT is pronounced “H-D expert.”

When I was working in Australia, I got teased about my accent quite a bit, and my case-hardened American R was a particular target. So, when I say the letters out loud, it comes out something like “eks-pee-arrr-tee” (arrr, like a pirate would say it), and “expert” is the closest match. This is true for most Americans. However, in many other accents, it’s more like “eks-pee-ah-tee,” and “ex-paht” is much closer to “export.”

Yes, I think way too much about this stuff.

Eric

HDXPRT’s future

While industry pundits have written many words about the death of the PC, Windows PCs are going through a renaissance. No longer do you just choose between a desktop and a laptop in beige or black. There has been an explosion of choices.

Whether you want a super-thin notebook, a tablet, or a two-in-one device, the market has something to offer. Desktop systems can be small devices on your desk, all-in-ones with the PC built into the monitor, or old-style boxes that sit on the floor. You can go with something inexpensive that will be sufficient for many tasks or invest in a super-powerful PC capable of driving today’s latest VR devices. Or you can get a new Microsoft Surface Studio, an example of the new types of devices entering the PC scene.

The current proliferation of PC choices means that tools that help buyers understand the performance differences between systems are more important than they have been in years. Because HDXPRT is one such tool, we expect demand for it to increase.

We have many tasks ahead of us as we prepare for this increased demand. The first is to release a version of HDXPRT 2014 that doesn’t require a patch. We are working on that and should have something ready later this month.

For the other tasks, we need your input. We believe we need to update HDXPRT to reflect the world of high-definition content. It’s tempting to simply change the name to UHDXPRT, but this was our first XPRT and I’m partial to the original name. How about you?

As for tests, what should a 2017 version of HDXPRT include? We think 4K-related workloads are a must, but we aren’t sure whether 4K playback tests are the way to go. What do you think? We also need to update other content, such as photo and video resolutions, and replace outdated applications with current versions. Would a VR test be worthwhile?

Please share your thoughts with us over the coming weeks as we put together a plan for the next version of HDXPRT!

Bill

Tracking device evolution with WebXPRT ’15, part 2

Last week, we used the Apple iPhone as a test case to show how hardware advances are often reflected in benchmark scores over time. When we compared WebXPRT 2015 scores for various iPhone models, we saw a clear trend of progressively higher scores as we moved from phones with an A7 chip to phones with A8, A9, and A10 Fusion chips. Performance increases over time are not surprising, but WebXPRT ’15 scores also showed us that upgrading from an iPhone 6 to an iPhone 6s is likely to have a much greater impact on web-browsing performance than upgrading from an iPhone 6s to an iPhone 7.

This week, we’re revisiting our iPhone test case to see how software updates can boost device performance without any changes in hardware. The original WebXPRT ’15 tests for the iPhone 5s ran on iOS 8.3, and the original tests for the iPhone 6s, 6s Plus, and SE ran on variants of iOS 9. We updated each phone to iOS 10.0.2 and ran several iterations of WebXPRT ’15.

Upgrading from iOS 8.3 to iOS 10 on the iPhone 5s caused a 17% increase in web-browsing performance, as measured by WebXPRT. Upgrading from iOS 9 to iOS 10 on the iPhone 6s, 6s Plus, and SE produced web-browsing performance gains of 2.6%, 3.6%, and 3.1%, respectively.
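For readers who want to check these figures against their own results, each gain is simply the relative change between a phone’s median score before and after the OS upgrade. Below is a minimal sketch in Python of that calculation; the 100 and 117 in the example are hypothetical placeholder scores, not the actual medians from our database.

def percent_gain(before, after):
    # relative change between two WebXPRT '15 overall scores, in percent
    return (after - before) / before * 100

# Hypothetical example: a phone that scored 100 before an OS update
# and 117 after shows a 17.0% gain
print(round(percent_gain(100, 117), 1))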

The chart below shows the WebXPRT ’15 scores for a range of iPhones, with each iPhone’s iOS version upgrade noted in parentheses. The dark blue columns on the left represent the original scores, and the light blue columns on the right represent the upgrade scores.

Oct 27 iPhone chart

As with our hardware comparison last week, the scores shown are the median of a range of results for each device in our database, drawn both from our own testing and from device reviews by popular tech media outlets.

These results reinforce a message we repeat often: many factors other than hardware influence performance. Designing benchmarks that deliver relevant and reliable scores requires taking all of those factors into account.

What insights have you gained recently from WebXPRT ’15 testing? Let us know!

Justin

Tracking device evolution with WebXPRT ’15

The XPRT Spotlight on the Apple iPhone 7 Plus gives us a great opportunity to look at the progression of WebXPRT 2015 scores for the iPhone line and see how hardware and software advances are often reflected in benchmark scores over time. This week, we’ll see how the evolution of Apple’s mobile CPU architecture has boosted web-browsing performance. In a future post, we’ll see the impact of iOS development.

As we’ve discussed in the past, multiple factors can influence benchmark results. While we’re currently using the iPhone as a test case, the same principles apply to all types of devices. We should also note that WebXPRT is an excellent gauge of expected web-browsing performance during real-world tasks, which is different from pure CPU performance in isolation.

When looking at WebXPRT ’15 scores in our database, we see that iPhone web-browsing performance has more than doubled in the last three years. In 2013, an iPhone 5s with an Apple A7 chip earned an overall WebXPRT ’15 score of 100. Today, a new iPhone 7 Plus with an A10 Fusion chip reports a score somewhere close to 210. The chart below shows the WebXPRT ’15 scores for a range of iPhones, with each iPhone’s CPU noted in parentheses.

Oct 20 iPhone chart

Moving forward from the A7 chip in the iPhone 5s to the A8 chip in the iPhone 6 and the A9 chip in the iPhone 6s and SE, we see consistent score increases. The biggest jump, at over 48%, appears in the transition from the A8 to the A9 chip, implying that folks upgrading from an iPhone 6 or 6 Plus to anything newer would notice a huge difference in web performance.

In general, folks upgrading from an A9-based phone (6s, 6s Plus, or SE) to an A10-based phone (7 or 7 Plus) could expect an increase in web performance of over 6.5%.

The scores we list represent the median of a range of scores for each device in our database. These scores come from our own testing, as well as from device reviews from media outlets such as AnandTech, Notebookcheck, and Tom’s Hardware. It’s worth noting that the highest A9 score in our database (AnandTech’s iPhone SE score of 205) overlaps with the lowest A10 Fusion score (Tom’s Hardware of Germany’s iPhone 7 score of 203), so while the improvement in median scores is clear, performance will vary according to individual phones and other factors.
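To clarify how we arrive at the single score we list for each device, here’s a minimal sketch in Python of the median-based approach described above. The individual result values are hypothetical placeholders for illustration, not the actual contents of our database.

from statistics import median

# Pool the WebXPRT '15 results we have for a device (our own runs plus
# published reviews), then report the median.
iphone_7_results = [203, 207, 210, 214, 216]   # hypothetical values
iphone_se_results = [190, 196, 200, 203, 205]  # hypothetical values

print(median(iphone_7_results))   # 210
print(median(iphone_se_results))  # 200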

Soon, we’ll revisit our iPhone test case to see how software updates can boost device performance without any changes in hardware. For more details on the newest iPhones, visit the Spotlight comparison page to see how iPhone 7 and 7 Plus specs and WebXPRT scores stack up.

Justin

Taking a detour

Back in April, Bill announced that we would be starting development of a cross-platform benchmark. This announcement generated a lot of interest, and we got lots of good feedback and ideas.

We knew from the start that getting a cross-platform benchmark right would be hard. However, it proved to be even trickier than we thought. As I explained before, a benchmark not only has to run well, but its results must also be fair to all the platforms involved. Meeting both of these requirements has been a challenge.

While we’ve been devoting a great deal of effort and resources to the cross-platform benchmark, some increasingly popular new use cases have been receiving less attention than they deserve. We’ve decided that the cross-platform benchmark is not the best use of the Community’s resources, and we’re going to put it on the shelf for a while. Doing so will free up resources and let us really dig into some newer technologies.

Thanks again to everyone who responded.

Eric
