BenchmarkXPRT Blog

Category: Android

Running Android-oriented XPRTs on Chrome OS

Since last summer, we’ve been following Google’s progress in bringing Android apps and the Google Play store to Chromebooks, along with its plan to gradually phase out support for Chrome apps over the next few years. Because we currently offer apps that assess battery life and performance for Android devices (BatteryXPRT and MobileXPRT) and Chromebooks (CrXPRT), the way this situation unfolds could affect the makeup of the XPRT portfolio in the years to come.

For now, we’re experimenting to see how well the Android app/Chrome OS merger is working with the devices in our lab. One test case is the Samsung Chromebook Plus, which we featured in the XPRT Weekly Tech Spotlight a few weeks ago. Normally, we would publish only CrXPRT and WebXPRT results for a Chromebook, but installing and running MobileXPRT 2015 from the Google Play store was such a smooth and error-free process that we decided to publish the first MobileXPRT score for a device running Chrome OS.

We also tried running BatteryXPRT on the Chromebook Plus, but even though installation was quick and easy and the test kicked off without a hitch, we could not generate a valid result. Typically, the test would complete several iterations successfully but then terminate before producing a result. We’re investigating the problem and will keep the community up to date on what we find.

In the meantime, we continue to recommend that Chromebook testers use CrXPRT for performance and battery life assessment. While we haven’t encountered any issues running MobileXPRT 2015 on Chromebooks, CrXPRT has a proven track record.

If you have any questions about running Android-oriented XPRTs on Chrome OS, or insights that you’d like to share, please let us know.

Justin

Digging deeper

From time to time, we like to revisit the fundamentals of the XPRT approach to benchmark development. Today, we’re discussing the need for testers and benchmark developers to consider the multiple factors that influence benchmark results. For every device we test, all of its hardware and software components have the potential to affect performance, and changing the configuration of those components can significantly change results.

For example, we frequently see significant performance differences between different browsers on the same system. In our recent recap of the XPRT Weekly Tech Spotlight’s first year, we highlighted an example of how testing the same device with the same benchmark can produce different results, depending on the software stack under test. In that instance, the Alienware Steam Machine entry included a WebXPRT 2015 score for each of the two browsers that consumers were likely to use. The first score (356) represented the SteamOS browser app in the SteamOS environment, and the second (441) represented the Iceweasel browser (a Firefox variant) in the Linux-based desktop environment. Including only the first score would have given readers an incomplete picture of the Steam Machine’s web-browsing capabilities, so we thought it was important to include both.

We also see performance differences between different versions of the same browser, a fact especially relevant to those who use frequently updated browsers, such as Chrome. Even benchmarks that measure the same general area of performance, for example, web browsing, are usually testing very different things.

OS updates can also have an impact on performance. Consumers might base a purchase on performance or battery life scores and end up with a device that behaves much differently when updated to a new version of Android or iOS, for example.

Other important factors in the software stack include pre-installed software, commonly referred to as bloatware, and the proliferation of apps that sap performance and battery life.

This is a much larger topic than we can cover in the blog. Let the examples we’ve mentioned remind you to think critically about, and dig deeper into, benchmark results. If we see published XPRT scores that differ significantly from our own results, our first question is always “What’s different between the two devices?” Most of the time, the answer becomes clear as we compare hardware and software from top to bottom.

Justin

Celebrating one year of the XPRT Weekly Tech Spotlight

It’s been just over a year since we launched the XPRT Weekly Tech Spotlight by featuring our first device, the Google Pixel C. Spotlight has since become one of the most popular items at BenchmarkXPRT.com, and we thought now would be a good time to recap the past year, offer more insight into the choices we make behind the scenes, and look at what’s ahead for Spotlight.

The goal of Spotlight is to provide PT-verified specs and test results that can help consumers make smart buying decisions. We try to include a wide variety of device types, vendors, software platforms, and price points in our inventory. The devices also tend to fall into one of two main groups: popular new devices generating a lot of interest and devices that have unique form factors or unusual features.

To date, we’ve featured 56 devices: 16 phones, 11 laptops, 10 two-in-ones, 9 tablets, 4 consoles, 3 all-in-ones, and 3 small-form-factor PCs. The operating systems these devices run include Android, Chrome OS, iOS, macOS, OS X, Windows, and an array of vendor-specific OS variants and skins.

As much as possible, we test using out-of-the-box (OOB) configurations. We want to present test results that reflect what everyday users will experience on day one. Depending on the vendor, the OOB approach can mean that some devices arrive bogged down with bloatware while others are relatively clean. We don’t attempt to “fix” anything in those situations; we simply test each device “as is” when it arrives.

If devices arrive with outdated OS versions (as is often the case with Chromebooks), we update to the current version before testing, because that’s the best reflection of what everyday users will experience. In the past, that approach would’ve been more complicated with Windows systems, but Microsoft’s shift to “Windows as a service” ensures that most users receive significant OS updates automatically by default.

The OOB approach also means that the WebXPRT scores we publish reflect the performance of each device’s default browser, even if it’s possible to install a faster browser. Our goal isn’t to perform a browser shootout on each device, but to give an accurate snapshot of OOB performance. For instance, last week’s Alienware Steam Machine entry included two WebXPRT scores: 356 on the SteamOS browser app and 441 on Iceweasel 38.8.0 (a Firefox variant used in the device’s Linux-based desktop mode). That’s a significant difference, but the main question for us was which browser was more likely to be used in an OOB scenario. With the Steam Machine, the answer was truly “either one.” Many users will use the browser app in the SteamOS environment, and many will take the few steps needed to access the desktop environment. In that case, even though one browser was significantly faster than the other, choosing to omit one score in favor of the other would have excluded results from an equally likely OOB environment.

We’re always looking for ways to improve Spotlight. We recently began including more photos for each device, including ones that highlight important form-factor elements and unusual features. Moving forward, we plan to expand Spotlight’s offerings to include automatic score comparisons, additional system information, and improved graphical elements. Most importantly, we’d like to hear your thoughts about Spotlight. What devices and device types would you like to see? Are there specs that would be helpful to you? What can we do to improve Spotlight? Let us know!

Justin

BatteryXPRT 2014 gets an update

After Android 7 (Nougat) was released on select devices this past fall, we discovered an issue with BatteryXPRT on devices running Android 7 and above. The battery life test was completing accurately and reliably, but it was not producing a performance score.

The problem was a result of significant changes in the Android development environment. Android 7 restricted the flags used for different target architectures when linking native code components, which caused issues while executing part of the Create Slideshow workload. We resolved the issue by changing the linker flags. We also migrated the BatteryXPRT code from the Eclipse and Android SDK development environments to the up-to-date Android Studio environment, which allowed us to rebuild the app in a way that maintains compatibility with the most recent versions of Android.
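The post doesn’t show the actual build changes, but in an ndk-build project, a fix of this kind generally means adjusting the linker flags passed for each target architecture. The module name, source file, and specific flags below are purely illustrative, not BatteryXPRT’s real build files:

```makefile
# Android.mk (illustrative sketch only, not BatteryXPRT's actual build file)
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
LOCAL_MODULE    := slideshow-native   # hypothetical module name
LOCAL_SRC_FILES := slideshow.c        # hypothetical source file

# Android 7+ enforces position-independent code and rejects text
# relocations in shared libraries, so compile with -fPIC throughout...
LOCAL_CFLAGS    += -fPIC

# ...and vary the linker flags by target ABI where a given
# architecture needs different treatment
ifeq ($(TARGET_ARCH_ABI),armeabi-v7a)
    LOCAL_LDFLAGS += -Wl,--fix-cortex-a8
endif

include $(BUILD_SHARED_LIBRARY)
```

The key point is that ndk-build lets `LOCAL_LDFLAGS` be set conditionally per `TARGET_ARCH_ABI`, which is the kind of per-architecture linker change the post describes; after the migration to Android Studio, equivalent settings live in the module’s build configuration.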

Today, we’re releasing a new build of BatteryXPRT 2014 (v104) at BatteryXPRT.com and the Google Play store. Scores from this build are comparable with previous BatteryXPRT scores, and if you’re testing with a version of BatteryXPRT that you downloaded from the Google Play store, you should receive the new build via an app update.

Click here to download the new BatteryXPRT installer (330 MB) directly from our site.

For users who have limited bandwidth or trouble accessing the Google Play store, downloading the APK files (26.7 MB total) may make installation easier.

Download the updated BatteryXPRT APK (2.8 MB) directly from our site.

Download the updated BatteryXPRT Tests APK (23.9 MB) directly from our site.

If you have any questions about the update or any other XPRT-related topic, feel free to contact us at BenchmarkXPRTSupport@principledtechnologies.com.

Justin

CrXPRT’s future

This week, we’re continuing our review of the XPRT portfolio by discussing the future of CrXPRT. CrXPRT, designed for use with Chrome OS, is a tool for evaluating the performance and battery life of Chromebooks as they handle everyday tasks. The app’s performance test, which measures Chromebook speed, produces an overall score and individual scores for each workload. The battery life test produces an estimated battery life and a separate performance score. CrXPRT is easy to install and use, and like BatteryXPRT, it evaluates battery life in half a workday.

We developed CrXPRT in response to the growing popularity of Chromebooks, especially in the education sector. The number of OEMs manufacturing Chromebooks has grown dramatically, along with the range of Chromebook price points and form factors. That growth shows no signs of slowing down, so CrXPRT is more relevant than ever as a tool for helping consumers make informed buying decisions.

As Chromebook market share continues to grow, however, it’s clear that significant changes to the Chrome OS environment are on the way. One big change is Google’s decision to bring Android apps, and the Google Play store itself, to Chrome OS. Another change is the plan to “begin the evolution away” from the Chrome apps platform and phase out Chrome app support on other platforms within the next two years.

There are also reports of a hybrid Android-Chrome OS operating system. Codenamed “Andromeda,” it would unite the Android and Chrome OS environments in a manner similar to the way Microsoft Continuum allows Windows 10 to run on a wide variety of device types. Details on Andromeda are few and far between, but it would obviously be a game changer.

The Google Play store rollout to select Chromebooks is already well underway. As for the other changes, it remains to be seen exactly when and how they will be implemented. The Chromium team did state that all types of Chrome apps will remain supported and maintained on Chrome OS for the foreseeable future.

For us, it’s important to maintain the ability to measure both performance and battery life on Chromebooks. The current version of CrXPRT does the job well, so we don’t see a need for a new version until the situation becomes more clear. In the meantime, we’ll be keeping an eye on Chrome-related news.

As always, we’re interested in your feedback. If you have any thoughts on CrXPRT 2015 or the future of Chromebook evaluation, let us know!

Justin

Exploring virtual reality

We’ve talked a lot in recent weeks about new technologies we are evaluating for the XPRTs. You may remember that back in June, we also wrote about sponsoring a second senior project with North Carolina State University. Last week, the project ended with the traditional Posters and Pies event. The team gave a very well-thought-out presentation.

NCSU VR blog pic 1

As you can tell from the photo below, the team designed and implemented a nifty virtual reality app. It’s a room escape puzzle, and it looks great!

NCSU VR blog pic 2

The app is a playable game with the ability to record gameplay for repeatable tests. It also includes a prerecorded session that allows you to test a device without playing the game. Finally, the app can launch directly into that prerecorded session without using a viewer, which will be handy for testing multiple devices.

The team built the app using the Google Cardboard API and the Unity game engine, which allowed them to create Android and iOS versions. We’re looking forward to seeing what that may tell us!

After Posters and Pies, the team came to PT to present their work and answer questions. We were all very impressed with their knowledge and with how well thought out the application was.

NCSU VR blog pic 3

Many thanks to team members Christian McCurdy, Gregory Manning, Grayson Jones, and Shon Ferguson (not shown).

NCSU VR blog pic 4

Thanks also to Dr. Lina Battestilli, the team’s technical advisor, and Margaret Heil, Director of the Senior Design Center.

We are currently evaluating the app, and expect to make it available to the community in early 2017!

Eric

 
