
Category: XPRT Weekly Tech Spotlight

Here’s to 100 more!

This week’s Essential Phone entry marks the 100th device that we’ve featured in the XPRT Weekly Tech Spotlight! It’s a notable milestone for us as we work toward our goal of building a substantial library of device information that buyers can use to compare devices. In celebration, I thought it would be fun to share some Spotlight-related stats.

Our first Spotlight entry was the Google Pixel C way back on February 8, 2016, and we’ve featured a wide array of devices since then:

  • 33 phones
  • 16 laptops
  • 16 tablets
  • 16 2-in-1s
  • 6 small-form-factor PCs
  • 5 desktops
  • 5 game consoles
  • 3 all-in-ones



In addition to a wide variety of device types, we try to include a wide range of vendors. So far, we’ve featured devices from Acer, Alcatel, Alienware, Amazon, Apple, ASUS, BLU, CHUWI, Dell, Essential, Fujitsu, Google, HP, HTC, Huawei, Intel, LeEco, Lenovo, LG, Microsoft, NVIDIA, OnePlus, Razer, Samsung, Sony, Syber, Xiaomi, and ZTE. We look forward to adding many more to that list during the year ahead.

XPRT Spotlight is a great way for device vendors and manufacturers to share PT-verified specs and test results with buyers around the world. If you’re interested in sending in a device for testing, please contact XPRTSpotlight@PrincipledTechnologies.com.

There’s a lot more to come for XPRT Spotlight, and we’re constantly working on new features and improvements for the page. Are there any specific devices or features that you would like to see in the Spotlight? Let us know.

Justin

Find the perfect tech gift with the XPRT Spotlight Black Friday Showcase

With the biggest shopping day of the year fast approaching, you might be feeling overwhelmed by the sea of tech gifts to choose from. Luckily, the XPRTs are here to help. We’ve gathered the product specs and performance facts for the hottest tech devices in one convenient place—the XPRT Spotlight Black Friday Showcase. This free shopping tool provides side-by-side comparisons of some of the season’s most coveted smartphones, laptops, Chromebooks, tablets, and PCs. Most importantly, it helps you make informed buying decisions so you can breeze through this season’s holiday shopping.

Want to know how the Google Pixel 2 compares to the Apple iPhone X or Samsung Galaxy Note 8 in web browsing performance or screen size? Simply select any two devices and click the compare button to see how they stack up against each other. You can also search by device type if you’re interested in a specific form factor such as consoles or tablets.

The Showcase doesn’t go away after Black Friday. We’ll rename it the XPRT Holiday Buying Guide and continue to add devices throughout the shopping season. So be sure to check back in and see how your tech gifts measure up.

If this is your first time reading about the XPRT Weekly Tech Spotlight, here’s a little background. Our hands-on testing process equips consumers with accurate information about how devices function in the real world. We test devices using our industry-standard BenchmarkXPRT tools: WebXPRT, MobileXPRT, TouchXPRT, CrXPRT, BatteryXPRT, and HDXPRT. In addition to benchmark results, we include photographs, specs, and prices for all products. New devices come online weekly, and you can browse the full list of almost 100 that we’ve featured to date on the Spotlight page.

If you represent a device vendor and want us to feature your product in the XPRT Weekly Tech Spotlight, please visit the website for more details.

Do you have suggestions for the Spotlight page or device recommendations? Let us know!

Justin

The XPRT Spotlight Back-to-School Roundup

Today, we’re pleased to announce our second annual XPRT Spotlight Back-to-School Roundup, a free shopping tool that provides side-by-side comparisons of this school year’s most popular Chromebooks, laptops, tablets, and convertibles. We designed the Roundup to help buyers who are choosing devices for education, such as college students picking out a laptop or school administrators selecting devices for an entire grade. The Roundup makes those decisions easier by gathering the product and performance facts these buyers need in one convenient place.

We tested the Roundup devices in our lab using the XPRT suite of benchmark tools. In addition to benchmark results, we also provide photographs, device specs, and prices.

If you haven’t yet visited the XPRT Weekly Tech Spotlight page, check it out. Every week, the Spotlight highlights a new device, making it easier for consumers to shop for a new laptop, smartphone, tablet, or PC. Recent devices in the spotlight include the Samsung Chromebook Pro, Microsoft Surface Laptop, Microsoft Surface Pro, OnePlus 5, and Apple iPad Pro 10.5”.

Vendors interested in having their devices featured in the XPRT Weekly Tech Spotlight or next year’s Roundup can visit the website for more details.

We’re always working on ways to make the Spotlight an even more powerful tool for helping with buying decisions. If you have any ideas for the page or suggestions for devices you’d like to see, let us know!

Justin

Best practices

Recently, a tester wrote in and asked for help determining why they were seeing different WebXPRT scores on two tablets with the same hardware configuration. The scores differed by approximately 7.5 percent. This can happen for many reasons, including differences in the software stack, but score variability can also result from differences in testing behavior and environment. While some degree of variability is natural, the question gives us a great opportunity to talk about the basic benchmarking practices we follow in the XPRT lab, practices that help us produce the most consistent and reliable scores possible.
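To put a number like that in context, here’s a quick back-of-the-envelope calculation in Python. The two scores below are hypothetical values chosen only to illustrate a gap of roughly 7.5 percent; they are not the tester’s actual results.

  # Hypothetical WebXPRT scores from two tablets with identical hardware;
  # chosen only to illustrate a ~7.5 percent gap, not the tester's actual results
  score_a = 215
  score_b = 200

  # Relative difference, expressed against the lower score
  gap_percent = (score_a - score_b) / score_b * 100
  print(f"Score gap: {gap_percent:.1f}%")  # prints 7.5%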

Below, we list a few basic best practices you might find useful in your testing. While we present them largely in the context of WebXPRT’s focus on evaluating browser performance, several of these practices apply to other benchmarks as well.

  • Test with clean images: We use an out-of-box (OOB) method for testing XPRT Spotlight devices. OOB testing means that, other than the initial OS and browser updates that users are likely to run after first turning on the device, we change as little as possible before testing. We want to assess the performance that buyers are likely to see when they first purchase the device, before they install additional apps and utilities, because that is the performance retail buyers will actually experience. While OOB testing is not appropriate for every scenario, the key is to avoid testing a device that’s bogged down with programs that unnecessarily influence results.
  • Turn off updates: We do our best to eliminate or minimize app and system updates after initial setup. Some vendors are making it more difficult to turn off updates completely, but you should always account for update settings.
  • Get a feel for system processes: Depending on the system and the OS, quite a lot of system-level activity can be going on in the background after you first boot the device. As much as possible, we like to wait for system activity to settle to a stable, idle baseline before kicking off a test. If we start testing immediately after booting the system, we often see higher variability in the first run before the scores start to tighten up.
  • Disclosure is not just about hardware: Most people know that different browsers will produce different performance scores on the same system. However, testers aren’t always aware of shifts in performance between different versions of the same browser. While most updates don’t have a large impact on performance, a few have shifted browser performance, up or down, by a significant amount. For this reason, it’s always worthwhile to record and disclose the full browser version number for each test run. The same principle applies to any other relevant software.
  • Use more than one data point: Because of natural variability, our standard practice in the XPRT lab is to publish a score that represents the median from at least three to five runs (see the sketch after this list). If you run a benchmark only once, and the score differs significantly from other published scores, your result could be an outlier that you would not see again under stable testing conditions.

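To make that last point concrete, here’s a minimal sketch of how we might summarize a handful of runs. The run scores below are hypothetical.

  import statistics

  # Hypothetical WebXPRT scores from five consecutive runs on the same device
  runs = [203, 198, 207, 201, 199]

  # Reporting the median rather than a single run keeps one outlier
  # from skewing the published result
  median_score = statistics.median(runs)
  print(f"Median of {len(runs)} runs: {median_score}")  # prints 201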

We hope those tips will make testing a little easier for you. If you have any questions about the XPRTs, or about benchmarking in general, feel free to ask!

Justin

HDXPRT: see how your Windows PC handles media tasks

Over the last several weeks, we reminded readers of the capabilities and benefits of TouchXPRT, CrXPRT, and BatteryXPRT. This week, we’d like to highlight HDXPRT. HDXPRT, which stands for High Definition Experience & Performance Ratings Test, was the first benchmark published by the HDXPRT Development Community, which later became the BenchmarkXPRT Development Community. HDXPRT evaluates how well Windows devices handle real-world media tasks such as photo editing, video conversion, and music editing, using real commercial applications including Photoshop and iTunes. HDXPRT presents results that are relevant and easy to understand.

We originally distributed HDXPRT on installation DVDs, but HDXPRT 2014, the latest version, is available for download from HDXPRT.com. HDXPRT 2014 is for systems running Windows 8.1 and later. The benchmark takes about 10 minutes to install, and a run takes less than two hours.

HDXPRT is a useful tool for anyone who wants to evaluate the real-world, content-creation capabilities of a Windows PC. To see test results from a variety of systems, go to HDXPRT.com and click View Results, where you’ll find scores from many different Windows devices.

If you’d like to run HDXPRT:

Simply download HDXPRT from HDXPRT.com. The HDXPRT user manual provides information on minimum system requirements, as well as step-by-step instructions for how to configure your system and kick off a test. Testers running HDXPRT on Windows 10 Creators Update builds should consult the tech support note posted on HDXPRT.com.

If you’d like to dig into the details:

Check out the Exploring HDXPRT 2014 white paper. In it, we discuss the benchmark’s three test scenarios in detail and show how we calculate the results.

If you’d like to dig even deeper, the HDXPRT source code is available to members of the BenchmarkXPRT Development Community, so consider joining today. Membership is free for employees of any company or organization with an interest in benchmarks, and there are no obligations after joining.

If you haven’t used HDXPRT before, give it a shot and let us know what you think!

On another note, Bill will be attending Mobile World Congress in Shanghai next week. Let us know if you’d like to meet up and discuss the XPRTs or how to get your device in the XPRT Spotlight.

Justin

Digging deeper

From time to time, we like to revisit the fundamentals of the XPRT approach to benchmark development. Today, we’re discussing the need for testers and benchmark developers to consider the multiple factors that influence benchmark results. For every device we test, all of its hardware and software components have the potential to affect performance, and changing the configuration of those components can significantly change results.

For example, we frequently see significant performance differences between different browsers on the same system. In our recent recap of the XPRT Weekly Tech Spotlight’s first year, we highlighted an example of how testing the same device with the same benchmark can produce different results, depending on the software stack under test. In that instance, the Alienware Steam Machine entry included a WebXPRT 2015 score for each of the two browsers that consumers were likely to use. The first score (356) represented the SteamOS browser app in the SteamOS environment, and the second (441) represented the Iceweasel browser (a Firefox variant) in the Linux-based desktop environment. Including only the first score would have given readers an incomplete picture of the Steam Machine’s web-browsing capabilities, so we thought it was important to include both.
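For a rough sense of how large that gap is, a quick calculation using the two published scores shows that the desktop-environment result is roughly 24 percent higher:

  # The two published WebXPRT 2015 scores for the Alienware Steam Machine
  steamos_score = 356      # SteamOS browser app in the SteamOS environment
  iceweasel_score = 441    # Iceweasel in the Linux-based desktop environment

  gap_percent = (iceweasel_score - steamos_score) / steamos_score * 100
  print(f"Iceweasel score is {gap_percent:.0f}% higher")  # prints 24% higher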

We also see performance differences between different versions of the same browser, a fact that’s especially relevant to those who use frequently updated browsers such as Chrome. In addition, even benchmarks that measure the same general area of performance, such as web browsing, usually test very different things.

OS updates can also have an impact on performance. Consumers might base a purchase on performance or battery life scores and end up with a device that behaves much differently when updated to a new version of Android or iOS, for example.

Other important factors in the software stack include pre-installed software, commonly referred to as bloatware, and the proliferation of apps that sap performance and battery life.

This is a much larger topic than we can cover in the blog. Let the examples we’ve mentioned remind you to think critically about, and dig deeper into, benchmark results. If we see published XPRT scores that differ significantly from our own results, our first question is always “What’s different between the two devices?” Most of the time, the answer becomes clear as we compare hardware and software from top to bottom.

Justin
