The XPRT Spotlight Black Friday Showcase helps you shop with confidence

Black Friday and Cyber Monday are almost here, and you may be feeling overwhelmed by the sea of tech gifts to choose from. The XPRTs are here to help. We’ve gathered the product specs and performance facts for some of the hottest tech devices in one convenient place—the XPRT Spotlight Black Friday Showcase. The Showcase is a free shopping tool that provides side-by-side comparisons of some of the season’s most popular smartphones, laptops, Chromebooks, tablets, and PCs. It helps you make informed buying decisions so you can shop with confidence this holiday season.

Want to know how the Google Pixel 4 stacks up against the Apple iPhone 11 or Samsung Galaxy Note10 in web browsing performance or screen size? Simply select any two devices in the Showcase and click Compare. You can also search by device type if you’re interested in a specific form factor such as consoles or tablets.

The Showcase doesn’t go away after Black Friday. We’ll rename it the XPRT Holiday Showcase and continue to add devices such as the Microsoft Surface Pro X throughout the shopping season. Be sure to check back in and see how your tech gifts measure up.

If this is the first you’ve heard about the XPRT Tech Spotlight, here’s a little background. Our hands-on testing process equips consumers with accurate information about how devices function in the real world. We test devices using our industry-standard BenchmarkXPRT tools: WebXPRT, MobileXPRT, TouchXPRT, CrXPRT, BatteryXPRT, and HDXPRT. In addition to benchmark results, we include photographs, specs, and prices for all products. New devices come online weekly, and you can browse the full list of almost 200 that we’ve featured to date on the Spotlight page.

If you represent a device vendor and want us to feature your product in the XPRT Tech Spotlight, please visit the website for more details.

Justin

Three years of the XPRT Weekly Tech Spotlight

February marked the three-year anniversary of the XPRT Weekly Tech Spotlight, and we now have over 150 devices in our Spotlight library! We started the Spotlight to provide consumers with objective information on device hardware and performance, and to provide vendors with a trusted third-party showcase for their gear. Each week, we measure and verify the Spotlight device’s specs ourselves, never relying on vendor-published data. We also test each device with every applicable XPRT benchmark, and publish the data that lets consumers know how a device measures up to its competitors.

Over the past three years, we’ve featured a wide array of devices:

  • 49 phones
  • 28 laptops
  • 26 tablets
  • 24 2-in-1 devices
  • 12 small-form-factor PCs
  • 7 desktops
  • 6 game consoles
  • 6 all-in-ones

In addition to a wide variety of device types, we try to include a wide range of vendors. We’ve featured devices from ACEPC, Acer, Alcatel, Alienware, Amazon, Apple, ASUS, Barnes and Noble, BlackBerry, BLU, CHUWI, Dell, Essential, Fujitsu, Fusion5, Google, Honor, HP, HTC, Huawei, Intel, LeEco, Lenovo, LG, Microsoft, MINIX, Motorola, Nokia, NVIDIA, OnePlus, Razer, Samsung, Sony, Syber, Xiaomi, and ZTE.

XPRT Spotlight is a great way for device vendors and manufacturers to share PT-verified specs and test results with buyers around the world. We test many of the devices that appear each year and will test—at no charge—any device a manufacturer or vendor sends us. If you’d like us to test your device, please contact us at XPRTSpotlight@PrincipledTechnologies.com.

There’s a lot more to come for the XPRT Spotlight, and we’re constantly working on new features and improvements for the page. Are there any specific devices or features that you would like to see in the Spotlight? Let us know.

Justin

More, faster, better: The future according to Mobile World Congress 2019

More is more data, which the trillions of devices in the coming Internet of Things will be pumping through our air into our (computing) clouds in hitherto unseen quantities.

Faster is the speed at which tomorrow’s 5G networks will carry this data—and the responses and actions from our automated assistants (and possibly overlords).

Better is the quality of the data analysis and recommendations, thanks primarily to the vast army of AI-powered analytics engines that will be poring over everything digital the planet has to say.

Swimming through this perpetual data tsunami will be we humans and our many devices, our laptops and tablets and smartphones and smart watches and, ultimately, implants. If we are to believe the promise of this year’s Mobile World Congress in Barcelona—and of course I do want to believe it, who wouldn’t?—the result of all of this will be a better world for all humanity, no person left behind. As I walked the show floor, I could not help but feel and want to embrace its optimism.

The catch, of course, is that we have a tremendous amount of work to do between where we are today and this fabulous future.

We must, for example, make sure that every computing node that will contribute to these powerful AI programs is up to the task. From the smartphone to the datacenter, AI will end up being a very distributed and very demanding workload. That’s one of the reasons we’ve been developing AIXPRT. Without tools that let us accurately compare different devices, the industry won’t be able to keep delivering the levels of performance improvements that we need to realize these dreams.

We must also think a lot about how to accurately measure all other aspects of our devices’ performance, because the demands this future will place on them are going to be significant. Fortunately, the always evolving XPRT family of tools is up to the task.

The coming 5G revolution, like all tech leaps forward before it, will not come evenly. Different 5G devices will end up behaving differently, some better and some worse. That fact, plus our constant and growing reliance on bandwidth, suggests that maybe the XPRT community should turn its attention to the task of measuring bandwidth. What do you think?
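
To make the core idea concrete, here is a minimal sketch in Python of the simplest possible bandwidth measurement: time the transfer of a payload of known size and divide by the elapsed time. The URL is a hypothetical placeholder, not a real test file, and a serious tool would add warm-up passes, parallel streams, and repeated runs; this is only an illustration of the calculation, not a preview of any XPRT.

    # Minimal sketch: estimate download throughput by timing one transfer.
    # TEST_URL is a hypothetical placeholder, not a real test payload.
    import time
    import urllib.request

    TEST_URL = "https://example.com/testfile.bin"

    def measure_download_mbps(url: str) -> float:
        """Download the payload once; return throughput in megabits per second."""
        start = time.perf_counter()
        with urllib.request.urlopen(url) as response:
            data = response.read()          # pull the entire payload
        elapsed = time.perf_counter() - start
        return len(data) * 8 / elapsed / 1_000_000

    print(f"Throughput: {measure_download_mbps(TEST_URL):.1f} Mbps")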

One thing is certain: we at the BenchmarkXPRT Development Community have a role to play in building the tools necessary to test the tech the world will need to deliver on the promise of this exciting trade show. We look forward to that work.

More on the way for the XPRT Weekly Tech Spotlight

In the coming months, we’ll continue to add more devices and helpful features to the XPRT Weekly Tech Spotlight. We’re especially interested in adding data points and visual aids that make it easier to quickly understand the context of each device’s test scores. For instance, those of us who are familiar with WebXPRT 3 scores know that an overall score of 250 is pretty high, but site visitors who are unfamiliar with WebXPRT probably won’t know how that score compares to scores for other devices.

We designed Spotlight to be a source of objective data, in contrast to sites that provide subjective ratings for devices. As we pursue our goal of helping users make sense of scores, we want to maintain this objectivity and avoid presenting information in ways that could be misleading.

Introducing comparison aids to the site is forcing us to make some tricky decisions. Because we value input from XPRT community members, we’d love to hear your thoughts on one of the questions we’re facing: How should our default view present a device’s score?

We see three options:

1) Present the device’s score in relation to the overall high and low scores for that benchmark across all devices.
2) Present the device’s score in relation to the overall high and low scores for that benchmark across the broad category of devices to which that device belongs (e.g., phones).
3) Present the device’s score in relation to the overall high and low scores for that benchmark across a narrower sub-category of devices to which that device belongs (e.g., high-end flagship phones).

To think this through, consider WebXPRT, which runs on desktops, laptops, phones, tablets, and other devices. Typically, the WebXPRT scores for phones and tablets are lower than scores for desktop and laptop systems. The first approach helps to show just how fast high-end desktops and laptops handle the WebXPRT workloads, but it could make a phone or tablet look slow, even if its score is good for its category. The second approach would prevent unfair default comparisons between different device types but would still present comparisons between devices that are not true competitors (e.g., flagship phones vs. budget phones). The third approach is the most careful, but would introduce an element of subjectivity because determining the sub-category in which a device belongs is not always clear-cut.
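
To see how much the choice of comparison group matters, consider a small illustrative sketch in Python. All of the scores below are made up for the example (they are not real WebXPRT results), but they show how the same device score can look middling against all devices yet strong within its own category.

    # Hypothetical scores illustrating the three default-view options.
    def relative_position(score: float, group: list) -> float:
        """Place a score on a 0-1 scale between a group's low and high."""
        low, high = min(group), max(group)
        return (score - low) / (high - low)

    all_devices = [55, 78, 102, 131, 180, 244, 262]  # option 1: every device
    phones      = [55, 78, 102, 131, 150]            # option 2: broad category
    flagships   = [102, 131, 150]                    # option 3: sub-category

    device_score = 131
    for label, group in [("all devices", all_devices),
                         ("phones", phones),
                         ("flagship phones", flagships)]:
        print(f"vs. {label}: {relative_position(device_score, group):.0%}")

    # Prints roughly 37%, 80%, and 60%: the same score reads very
    # differently depending on the comparison group.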

Do you have thoughts on this subject, or recommendations for Spotlight in general? If so, let us know.

Justin

The value of speed

I was reading an interesting article on how high-end smartphones like the iPhone X, Pixel 2 XL, and Galaxy S8 generate more money from in-game revenue than cheaper phones do.

One line stood out to me: “With smartphones becoming faster, larger and more capable of delivering an engaging gaming experience, these monetization key performance indicators (KPIs) have begun to increase significantly.”

It turns out the game companies totally agree with the rest of us that faster devices are better!

Regardless of who is seeking better performance—consumers or game companies—the obvious question is how you determine which models are fastest. Many folks rely on device vendors’ claims about how much faster the new model is. Unfortunately, vendors don’t always specify what those claims are based on. Even when they do, it’s hard to know whether the numbers are accurate and applicable to how you use your device.

The key part of any answer is performance tools that are representative, dependable, and open.

  • Representative – Performance tools need to have realistic workloads that do things that you care about.
  • Dependable – Good performance tools run reliably and produce repeatable results, both of which require significant development and testing work. (The sketch after this list shows one simple repeatability check.)
  • Open – Performance tools that allow people to access the source code, and even contribute to it, keep things above the table and reassure you that you can rely on the results.
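
As one concrete example of the “dependable” criterion, here is a minimal Python sketch of a simple repeatability check: run a benchmark several times and report the coefficient of variation (CV), the standard deviation as a fraction of the mean. The run scores below are made-up stand-ins for five runs of any tool.

    # Quantify repeatability across repeated runs with the coefficient
    # of variation; a lower CV means steadier, more repeatable results.
    import statistics

    runs = [247.1, 249.8, 248.5, 250.2, 247.9]  # hypothetical repeated scores

    mean = statistics.mean(runs)
    cv = statistics.stdev(runs) / mean
    print(f"mean {mean:.1f}, CV {cv:.2%}")
    # A CV of a fraction of a percent, as here, is the kind of
    # consistency a dependable tool should deliver.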

Our goal with the XPRTs is to provide performance tools that meet all these criteria. WebXPRT 3 and all our other XPRTs exist to help accurately reveal how devices perform. You can run them yourself or rely on the wealth of results that we and others have collected on a wide array of devices.

The best thing about good performance tools is that everyone, even vendors, can use them. I sincerely hope that you find the XPRTs helpful when you make your next technology purchase.

Bill

The XPRTs in action

In the near future, we’ll update our “XPRTs around the world” infographic, which provides a snapshot of how people are using the XPRTs worldwide. Among other stats, we include the number of XPRT web mentions, articles, and reviews that have appeared during a given period. Recently, we learned how one of those statistics—a single website mention of WebXPRT—found its way to consumers in more places than we would have imagined.

Late last month, AnandTech published a performance comparison by Andrei Frumusanu examining the Samsung Galaxy S9’s Snapdragon 845 and Exynos 9810 variants and a number of other high-end phones. WebXPRT was one of the benchmarking tools used. The article stated that both versions of the brand-new S9 were slower than the iPhone X and, in some tests, were slower than even the iPhone 7.

A CNET video discussed the article and the role of WebXPRT in the performance comparison, and the article has been reposted to hundreds of tech media sites around the world. A quick survey shows reposts in Albania, Bulgaria, Denmark, Chile, the Czech Republic, France, Germany, Greece, Indonesia, Iran, Italy, Japan, Korea, Poland, Russia, Spain, Slovakia, Turkey, and many other countries.

The popularity of the article is not surprising, for it positions the newest flagship phones from the industry’s two largest phone makers in a head-to-head comparison with a somewhat unexpected outcome. AnandTech did nothing to stir controversy or sensationalize the test results, but simply provided readers with an objective, balanced assessment of how these devices compare so that they could draw their own conclusions. The XPRTs share this approach.

We’re grateful to Andrei and others at AnandTech who’ve used the XPRTs over the years to produce content that helps consumers make informed decisions. WebXPRT is just part of AnandTech’s toolkit, but it’s one that’s accessible to anybody free of charge. With the help of BenchmarkXPRT Development Community members, we’ll continue to publish XPRT tools that help users everywhere gain valuable insight into device performance.

Justin
