

Notes from the lab

This week’s XPRT Weekly Tech Spotlight featured the Alcatel A30 Android phone. We chose the A30, an Amazon exclusive, because it’s a budget phone running Android 7.0 (Nougat) right out of the box. That may be an appealing combination for consumers, but running a newer OS on inexpensive hardware such as what’s found in the A30 can cause issues for app developers, and the XPRTs are no exception.

Spotlight fans may have noticed that we didn’t post a MobileXPRT 2015 or BatteryXPRT 2014 score for the A30. In both cases, the benchmark did not produce an overall score because of a problem that occurs during the Create Slideshow workload. The issue involves text relocation and stems from significant changes in the Android development environment.

As of Android 5.0, on 64-bit devices, the OS doesn’t allow native code executables to perform text relocation. Instead, it is necessary to compile the executables using position-independent code (PIC) flags. This is how we compiled the current version of MobileXPRT, and it’s why we updated BatteryXPRT earlier this year to maintain compatibility with the most recent versions of Android.
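For readers who want to see what that looks like in practice, here is a minimal sketch of the kind of build change involved. It is not the MobileXPRT build itself, and the file and output names are made up; the relevant parts are the -fPIE/-fPIC flags.

/* pic_example.c -- an illustrative sketch, not MobileXPRT source code.
 *
 * Android 5.0 and later expect native executables to be built as
 * position-independent code so they load without text relocations.
 * A typical build command (names here are hypothetical):
 *
 *   clang -fPIE -pie -o native_workload pic_example.c
 *
 * Shared libraries loaded by an app are built with -fPIC instead.
 */
#include <stdio.h>

int main(void) {
    /* Placeholder for the real workload; the flags above are what matter. */
    printf("built as a position-independent executable\n");
    return 0;
}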

However, the same approach doesn’t work for SoCs built on the older 32-bit ARMv7-A architecture, such as the A30’s Qualcomm Snapdragon 210, so testers may encounter this issue on other devices with low-end hardware.

Testers who run into this problem can still use MobileXPRT 2015 to generate individual workload scores for the Apply Photo Effects, Create Photo Collages, Encrypt Personal Content, and Detect Faces workloads. Also, BatteryXPRT will produce an estimated battery life for the device, but since it won’t produce a performance score, we ask that testers use those numbers for informational purposes and not publication.

If you have any questions or have encountered additional issues, please let us know!

Justin

Celebrating one year of the XPRT Weekly Tech Spotlight

It’s been just over a year since we launched the XPRT Weekly Tech Spotlight by featuring our first device, the Google Pixel C. Spotlight has since become one of the most popular items at BenchmarkXPRT.com, and we thought now would be a good time to recap the past year, offer more insight into the choices we make behind the scenes, and look at what’s ahead for Spotlight.

The goal of Spotlight is to provide PT-verified specs and test results that can help consumers make smart buying decisions. We try to include a wide variety of device types, vendors, software platforms, and price points in our inventory. The devices also tend to fall into one of two main groups: popular new devices generating a lot of interest and devices that have unique form factors or unusual features.

To date, we’ve featured 56 devices: 16 phones, 11 laptops, 10 two-in-ones, 9 tablets, 4 consoles, 3 all-in-ones, and 3 small-form-factor PCs. The operating systems these devices run include Android, Chrome OS, iOS, macOS, OS X, Windows, and an array of vendor-specific OS variants and skins.

As much as possible, we test using out-of-the-box (OOB) configurations. We want to present test results that reflect what everyday users will experience on day one. Depending on the vendor, the OOB approach can mean that some devices arrive bogged down with bloatware while others are relatively clean. We don’t attempt to “fix” anything in those situations; we simply test each device “as is” when it arrives.

If devices arrive with outdated OS versions (as is often the case with Chromebooks), we update to current versions before testing, because that’s the best reflection of what everyday users will experience. In the past, that approach would’ve been more complicated with Windows systems, but Microsoft’s shift to “Windows as a service” ensures that most users receive significant OS updates automatically by default.

The OOB approach also means that the WebXPRT scores we publish reflect the performance of each device’s default browser, even if it’s possible to install a faster browser. Our goal isn’t to perform a browser shootout on each device, but to give an accurate snapshot of OOB performance. For instance, last week’s Alienware Steam Machine entry included two WebXPRT scores, a 356 on the SteamOS browser app and a 441 on Iceweasel 38.8.0 (a Firefox variant used in the device’s Linux-based desktop mode). That’s a significant difference, but the main question for us was which browser was more likely to be used in an OOB scenario. With the Steam Machine, the answer was truly “either one.” Many users will use the browser app in the SteamOS environment and many will take the few steps needed to access the desktop environment. In that case, even though one browser was significantly faster than the other, choosing to omit one score in favor of the other would have excluded results from an equally likely OOB environment.

We’re always looking for ways to improve Spotlight. We recently began including more photos for each device, including ones that highlight important form-factor elements and unusual features. Moving forward, we plan to expand Spotlight’s offerings to include automatic score comparisons, additional system information, and improved graphical elements. Most importantly, we’d like to hear your thoughts about Spotlight. What devices and device types would you like to see? Are there specs that would be helpful to you? What can we do to improve Spotlight? Let us know!

Justin

We haven’t mentioned this in a while

I had a conversation with a community member yesterday who wanted to know whether we would test his device with one of the XPRTs. The short answer is “Absolutely!” The somewhat longer answer follows.

If you send us a device you want us to test, we will do so, with the appropriate set of XPRTs, free of charge. You will know that an impartial, third-party lab has tested your device using the best benchmarking practices. After we share the results with you, you will have three options: (1) to keep the results private, (2) to have us make the results public immediately in the appropriate XPRT results databases, or (3) to delay releasing the results until a future date. Regardless of your choice, we will keep the device so that we can use it as part of our testbed for developing and testing future versions of the XPRTs.

When we add the results to our online databases, we will cite Principled Technologies as the source, indicating that we stand behind the results.

The free testing includes no collateral beyond publishing the results. If you would like to publicize them through a report, an infographic, or any of the other materials PT can provide, just let us know and the appropriate person will contact you to discuss how much those services would cost.

If you’re interested in getting your device tested for free, contact us at BenchmarkXPRTSupport@principledtechnologies.com.

Eric

Question we get a lot

“How come your benchmark ranks devices differently than [insert other benchmark here]?” It’s a fair question, and the reason is that each benchmark has its own emphasis and tests different things. When you think about it, it would be unusual if all benchmarks agreed.

To illustrate the phenomenon, consider this excerpt from a recent browser shootout in VentureBeat:

 
While this looks very confusing, the simple explanation is that the different benchmarks are testing different things. To begin with, SunSpider, Octane, JetStream, Peacekeeper, and Kraken all measure JavaScript performance. Oort Online measures WebGL performance. WebXPRT measures both JavaScript and HTML5 performance. HTML5Test measures HTML5 compliance.

Even with benchmarks that test the same aspect of browser performance, the tests differ. Kraken and SunSpider both test the speed of JavaScript math, string, and graphics operations in isolation, but run different sets of tests to do so. Peacekeeper profiles JavaScript from sites such as YouTube and Facebook.

WebXPRT, like the other XPRTs, uses scenarios that model the types of work people do with their devices.

It’s no surprise that the order changes depending on which aspect of the Web experience you emphasize, in much the same way that the most fuel-efficient cars might not be the ones with the best acceleration.
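To make that concrete, here is a small sketch with invented numbers (they don’t come from any real devices or benchmarks) showing how two devices can trade places depending on what a benchmark emphasizes.

/* rank_flip.c -- illustrative only; every score below is made up. */
#include <stdio.h>

int main(void) {
    /* Device A: strong JavaScript engine, weaker WebGL support.
       Device B: the opposite. */
    double a_js = 120.0, a_webgl = 40.0;
    double b_js = 90.0,  b_webgl = 80.0;

    printf("JavaScript-focused benchmark: A=%.0f, B=%.0f -> A ranks first\n", a_js, b_js);
    printf("WebGL-focused benchmark:      A=%.0f, B=%.0f -> B ranks first\n", a_webgl, b_webgl);
    return 0;
}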

This is a bigger topic than we can deal with in a single blog post, and we’ll examine it more in the future.

Eric

What’s in a name?

A couple of weeks ago, the Notebookcheck German site published a review of the Huawei P8lite. We were pleased to see they used WebXPRT 2015, and the P8lite got an overall score of 47. This week, AnandTech published their review of the Huawei P8lite. In their review, the P8lite got an overall score of 59!

Those scores are very different, but it was not difficult to figure out why. The P8lite comes in two versions, depending on your market. The version Notebookcheck used is based on HiSilicon’s Kirin 620 SoC, while the version AnandTech used is based on Qualcomm’s Snapdragon 615. It’s also worth noting that the phone Notebookcheck tested was running Android 5.0, while the phone AnandTech tested was running Android 4.4. With different hardware and different operating systems, it’s no surprise that the results were different.

One consequence of the XPRTs being used across the world is that it is not uncommon to see results from devices sold in different markets. As we’ve said before, many things can influence benchmark results, so don’t assume that two devices with the same name are identical.

Kudos to both AnandTech and Notebookcheck for their care in presenting the system information for the devices in their reviews. The AnandTech review even included a brief description of the two models of the P8lite. This type of information is essential for helping people make informed decisions.

In other news, Windows 10 launched yesterday. We’re looking forward to seeing the TouchXPRT and WebXPRT results!

Eric

Device or computer?

As you may have noticed, I am fascinated by performance.  I’m also an avid cyclist and techno geek.  The recent start of the Tour de France has turned my thoughts to the technology of bikes and their accessories.  As with most technology, the latest models promise to be faster, lighter, and better.

One accessory of particular interest to me is the bike “computer.” When I first started serious riding six years ago, bike computers were pretty minimal devices. They were generally a small LCD display that connected via wires to two sensors. One sensor counted how quickly a magnet on the pedal passed by to determine the cyclist’s cadence (pedal strokes per minute). The other sensor counted how quickly a magnet on one of the wheels passed by; knowing the circumference of the wheel, the unit calculated the cyclist’s speed and distance traveled. Sure, there had to be a processor of some sort in those bike computers, but I always refused to call them computers. Speedometer/odometer seemed more accurate to me.
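To illustrate the arithmetic involved, here is a rough sketch. The numbers are my own, picked for the example, and don’t come from any particular product.

/* wheel_speed.c -- the kind of math a wheel-magnet unit performs.
   All values below are illustrative. */
#include <stdio.h>

int main(void) {
    double circumference_m = 2.105;      /* typical 700x25c road wheel */
    double wheel_revs_per_second = 4.0;  /* counted from the wheel magnet */

    double speed_m_per_s = circumference_m * wheel_revs_per_second;
    double speed_km_per_h = speed_m_per_s * 3.6;
    double distance_per_minute_m = speed_m_per_s * 60.0;

    printf("speed: %.1f km/h\n", speed_km_per_h);                    /* about 30.3 km/h */
    printf("distance per minute: %.0f m\n", distance_per_minute_m);  /* about 505 m */
    return 0;
}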

Now, however, I have on my bike a Garmin Edge 500 (https://buy.garmin.com/shop/shop.do?cID=160&pID=36728). It is a small device, less than 2 inches by 3 inches, that attaches to my handlebars and determines my speed and distance via a built-in GPS. It determines altitude by detecting changes in barometric pressure and measures temperature with a built-in thermometer. It communicates wirelessly with my heart rate monitor. It can also talk wirelessly to other devices, like a cadence sensor or a power meter that measures the power applied to the pedals. The LCD screen is customizable and allows me to display the information I most care about while riding. The Edge 500 collects all of the data and can upload it via a computer to the Garmin Connect Web site.

By any definition of computer, the Edge 500 seems to qualify.  I still don’t call it a computer, however. Calling it a speedometer/odometer would be silly.  I tend to refer to it as my Garmin.  The line between computer and device is definitely getting blurrier.

We are all surrounded by more and more computing devices, whether they are desktops, notebooks, tablets, smartphones, or bike computers. On some of those, performance is critical, while on others, fast enough is all we care about. On which devices do you think performance is important? Even as we start work on HDXPRT 2012, we are constantly examining other areas and types of devices that need benchmarks. Let us know your thoughts!

Bill
