
Category: Performance testing on tablets

We’ve updated MobileXPRT 3 to address issues with Android 11

This week, we published an updated MobileXPRT 3 build, version 3.116.0.4, on MobileXPRT.com and in the Google Play Store. The new build addresses an issue we recently discovered: MobileXPRT 3 was crashing after installation on some Android 11 phones. The cause was a combination of new permissions requirements and a storage model called scoped storage. By default, scoped storage restricts an app’s storage access to app-specific directories and media, and prohibits general access to external or public directories. It also prevents third-party apps such as email clients or file managers from accessing MobileXPRT 3 results files. Under this default, access to shared storage requires an opt-in permissions prompt that MobileXPRT 3 did not include prior to this week’s release.

MobileXPRT 3.116.0.4 points all of the benchmark’s file references to its private directory and allows users to zip results files and attach them to results submission emails. Neither change affects the testing process or test scores. If you have any questions or comments about the new MobileXPRT 3 build, please let us know!
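The zip-and-attach flow is easy to picture. As a generic sketch (this is an illustration of the idea, not MobileXPRT’s actual code, and the directory names are made up), bundling a results directory into a single archive that a user could attach to an email might look like this:

```python
import zipfile
from pathlib import Path

def zip_results(results_dir: str, archive_path: str) -> str:
    """Bundle every file under results_dir into one zip archive.

    Keeping results in the app's private directory satisfies scoped
    storage; zipping them gives the user a single file to share.
    """
    results = Path(results_dir)
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in results.rglob("*"):
            if f.is_file():
                # Store paths relative to the results directory
                zf.write(f, f.relative_to(results))
    return archive_path
```

On Android, the archive would then be handed to an email client via a share intent rather than written to a public directory.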

Justin

The XPRT Spotlight Black Friday Showcase helps you shop with confidence

Black Friday and Cyber Monday are almost here, and you may be feeling overwhelmed by the sea of tech gifts to choose from. The XPRTs are here to help. We’ve gathered the product specs and performance facts for some of the hottest tech devices in one convenient place—the XPRT Spotlight Black Friday Showcase. The Showcase is a free shopping tool that provides side-by-side comparisons of some of the season’s most popular smartphones, laptops, Chromebooks, tablets, and PCs. It helps you make informed buying decisions so you can shop with confidence this holiday season.

Want to know how the Google Pixel 4 stacks up against the Apple iPhone 11 or Samsung Galaxy Note10 in web browsing performance or screen size? Simply select any two devices in the Showcase and click Compare. You can also search by device type if you’re interested in a specific form factor such as consoles or tablets.

The Showcase doesn’t go away after Black Friday. We’ll rename it the XPRT Holiday Showcase and continue to add devices such as the Microsoft Surface Pro X throughout the shopping season. Be sure to check back in and see how your tech gifts measure up.

If this is the first you’ve heard about the XPRT Tech Spotlight, here’s a little background. Our hands-on testing process equips consumers with accurate information about how devices function in the real world. We test devices using our industry-standard BenchmarkXPRT tools: WebXPRT, MobileXPRT, TouchXPRT, CrXPRT, BatteryXPRT, and HDXPRT. In addition to benchmark results, we include photographs, specs, and prices for all products. New devices come online weekly, and you can browse the full list of almost 200 that we’ve featured to date on the Spotlight page.

If you represent a device vendor and want us to feature your product in the XPRT Tech Spotlight, please visit the website for more details.

Justin

The MobileXPRT 3 source code is now available

We’re excited to announce that the MobileXPRT 3 source code is now available to BenchmarkXPRT Development Community members!

Download the MobileXPRT 3 source here (login required).

We’ve also posted a download link on the MobileXPRT tab in the Members’ Area, where you will find instructions for setting up and configuring a local instance of MobileXPRT 3.

As part of our community model for software development, source code for each of the XPRTs is available to anyone who joins the community. If you’d like to review XPRT source code, but haven’t yet joined the community, we encourage you to join! Registration is quick and easy, and if you work for a company or organization with an interest in benchmarking, you can join the community for free. Simply fill out the form with your company e-mail address and select the option to be considered for a free membership. We’ll contact you to verify the address and then activate your membership.

If you have any other questions about community membership or XPRT source code, feel free to contact us. We look forward to hearing from you!

Justin

MobileXPRT 3 is here!

We’re excited to announce that MobileXPRT 3 is now available to the public! MobileXPRT 3 is the latest version of our popular tool for evaluating the performance of Android devices. The BenchmarkXPRT Development Community has been using a community preview for several weeks, but now anyone can run the tool and publish their results.

Compatible with systems running Android 5.0 and above, MobileXPRT 3 includes the same performance workloads as MobileXPRT 2015 (Apply Photo Effects, Create Photo Collages, Create Slideshow, Encrypt Personal Content, and Detect Faces to Organize Photos), plus a new optical character recognition-based workload called Scan Receipts for Spreadsheet.

MobileXPRT 3 is available at MobileXPRT.com and on the Google Play Store. Alternatively, you can download the app using either of the links below:



After trying out MobileXPRT 3, please submit your scores here and send any comments to BenchmarkXPRTsupport@principledtechnologies.com. To see how other systems perform, go to MobileXPRT.com and click View Results, where you’ll find scores from a wide range of Android devices. We look forward to seeing your results!

Justin

Notes from the lab: choosing a calibration system for MobileXPRT 3

Last week, we shared some details about what to expect in MobileXPRT 3. This week, we want to provide some insight into one part of the MobileXPRT development process: choosing a calibration system.

First, some background. For each of the benchmarks in the XPRT family, we select a calibration system using criteria we’ll explain below. This system serves as a reference point, and we use it to calculate scores that will help users understand a single benchmark result. The calibration system for MobileXPRT 2015 is the Motorola DROID RAZR M. We structured our calculation process so that the mean performance score from repeated MobileXPRT 2015 runs on that device is 100. A device that completes the same workloads 20 percent faster than the DROID RAZR M would have a performance score of 120, and one that performs the test 20 percent more slowly would have a score of 80. (You can find a more in-depth explanation of MobileXPRT score calculations in the Exploring MobileXPRT 2015 white paper.)
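The arithmetic behind this calibration scheme fits in a few lines. The function below is an illustrative sketch of the scoring idea as described above, not the benchmark’s actual source code: it scales a device’s speed relative to the calibration device so that the calibration device itself scores 100.

```python
def calibrated_score(device_time: float, calibration_time: float) -> float:
    """Score a device relative to the calibration device.

    Scores are proportional to speed (inverse of completion time), so
    the calibration device scores 100 by definition: a device that runs
    the workloads at 1.2x the calibration device's speed scores 120,
    and one that runs at 0.8x its speed scores 80.
    """
    return 100.0 * calibration_time / device_time
```

For example, if the calibration device completes the workloads in 10 minutes, a device that needs 12.5 minutes (0.8x the speed) scores 80.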

When selecting a calibration device, we look for a relevant reference point in today’s market. The device should be neither too slow to handle modern workloads nor so fast that it outscores most devices on the market. It should represent a level of performance close to what the majority of consumers experience, and one that will continue to be relevant for some time. This approach helps to build context for the meaning of the benchmark’s overall score. Without that context, testers can’t tell whether a score is fast or slow just by looking at the raw number. When compared to a well-known standard such as the calibration device, however, the score has more informative value.

To determine a suitable calibration device for MobileXPRT 3, we started by researching the most popular Android phones by market share around the world. It soon became clear that in many major markets, the Samsung Galaxy S8 ranked first or second, or at least appeared in the top five. As last year’s first Samsung flagship, the S8 is no longer on the cutting edge, but its specs are comparable to those of many current mid-range phones, and the hardware should remain relevant for a couple of years.

For all of these reasons, we made the Samsung Galaxy S8 the calibration device for MobileXPRT 3. The model in our lab has a Qualcomm Snapdragon 835 SoC and 4 GB of RAM, and runs Android 7.0 (Nougat). We think it strikes the balance we’re looking for.

If you have any questions or concerns about MobileXPRT 3, calibration devices, or score calculations, please let us know. We look forward to sharing more information about MobileXPRT 3 as we get closer to the community preview.

Justin

More on the way for the XPRT Weekly Tech Spotlight

In the coming months, we’ll continue to add more devices and helpful features to the XPRT Weekly Tech Spotlight. We’re especially interested in adding data points and visual aids that make it easier to quickly understand the context of each device’s test scores. For instance, those of us who are familiar with WebXPRT 3 scores know that an overall score of 250 is pretty high, but site visitors who are unfamiliar with WebXPRT probably won’t know how that score compares to scores for other devices.

We designed Spotlight to be a source of objective data, in contrast to sites that provide subjective ratings for devices. As we pursue our goal of helping users make sense of scores, we want to maintain this objectivity and avoid presenting information in ways that could be misleading.

Introducing comparison aids to the site is forcing us to make some tricky decisions. Because we value input from XPRT community members, we’d love to hear your thoughts on one of the questions we’re facing: How should our default view present a device’s score?

We see three options:

1) Present the device’s score in relation to the overall high and low scores for that benchmark across all devices.
2) Present the device’s score in relation to the overall high and low scores for that benchmark across the broad category of devices to which that device belongs (e.g., phones).
3) Present the device’s score in relation to the overall high and low scores for that benchmark across a narrower sub-category of devices to which that device belongs (e.g., high-end flagship phones).

To think this through, consider WebXPRT, which runs on desktops, laptops, phones, tablets, and other devices. Typically, the WebXPRT scores for phones and tablets are lower than scores for desktop and laptop systems. The first approach helps to show just how fast high-end desktops and laptops handle the WebXPRT workloads, but it could make a phone or tablet look slow, even if its score was good for its category. The second approach would prevent unfair default comparisons between different device types but would still present comparisons between devices that are not true competitors (e.g., flagship phones vs. budget phones). The third approach is the most careful, but would introduce an element of subjectivity because determining the sub-category in which a device belongs is not always clear cut.
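To make the trade-off concrete, here is a small sketch with entirely made-up numbers (not actual XPRT data) showing how the same score reads differently depending on the range it is normalized against:

```python
def relative_position(score: float, low: float, high: float) -> float:
    """Return where a score falls between a low and high bound, from 0.0 to 1.0."""
    return (score - low) / (high - low)

# Hypothetical score ranges, for illustration only:
ALL_DEVICES = (50, 300)   # option 1: all devices, fast desktops included
PHONES = (50, 130)        # option 2: the broad phone category

phone_score = 120
against_all = relative_position(phone_score, *ALL_DEVICES)    # looks slow
within_phones = relative_position(phone_score, *PHONES)       # near the top
```

Against all devices, this hypothetical phone sits at 0.28 of the range and looks slow; within its own category, it sits at 0.875 and looks excellent. The choice of default range changes the story the same score tells.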

Do you have thoughts on this subject, or recommendations for Spotlight in general? If so, let us know.

Justin
