
Celebrating one year of the XPRT Weekly Tech Spotlight

It’s been just over a year since we launched the XPRT Weekly Tech Spotlight by featuring our first device, the Google Pixel C. Spotlight has since become one of the most popular items at BenchmarkXPRT.com, and we thought now would be a good time to recap the past year, offer more insight into the choices we make behind the scenes, and look at what’s ahead for Spotlight.

The goal of Spotlight is to provide PT-verified specs and test results that can help consumers make smart buying decisions. We try to include a wide variety of device types, vendors, software platforms, and price points in our inventory. The devices also tend to fall into one of two main groups: popular new devices generating a lot of interest and devices that have unique form factors or unusual features.

To date, we’ve featured 56 devices: 16 phones, 11 laptops, 10 two-in-ones, 9 tablets, 4 consoles, 3 all-in-ones, and 3 small-form-factor PCs. The operating systems these devices run include Android, ChromeOS, iOS, macOS, OS X, Windows, and an array of vendor-specific OS variants and skins.

As much as possible, we test using out-of-the-box (OOB) configurations. We want to present test results that reflect what everyday users will experience on day one. Depending on the vendor, the OOB approach can mean that some devices arrive bogged down with bloatware while others are relatively clean. We don’t attempt to “fix” anything in those situations; we simply test each device “as is” when it arrives.

If devices arrive with outdated OS versions (as is often the case with Chromebooks), we update to current versions before testing, because that’s the best reflection of what everyday users will experience. In the past, that approach would’ve been more complicated with Windows systems, but Microsoft’s shift to “Windows as a service” means that most users now receive significant OS updates automatically by default.

The OOB approach also means that the WebXPRT scores we publish reflect the performance of each device’s default browser, even if it’s possible to install a faster browser. Our goal isn’t to perform a browser shootout on each device, but to give an accurate snapshot of OOB performance. For instance, last week’s Alienware Steam Machine entry included two WebXPRT scores, a 356 on the SteamOS browser app and a 441 on Iceweasel 38.8.0 (a Firefox variant used in the device’s Linux-based desktop mode). That’s a significant difference, but the main question for us was which browser was more likely to be used in an OOB scenario. With the Steam Machine, the answer was truly “either one.” Many users will use the browser app in the SteamOS environment and many will take the few steps needed to access the desktop environment. In that case, even though one browser was significantly faster than the other, choosing to omit one score in favor of the other would have excluded results from an equally likely OOB environment.

We’re always looking for ways to improve Spotlight. We recently began including more photos for each device, including ones that highlight important form-factor elements and unusual features. Moving forward, we plan to expand Spotlight’s offerings to include automatic score comparisons, additional system information, and improved graphical elements. Most importantly, we’d like to hear your thoughts about Spotlight. What devices and device types would you like to see? Are there specs that would be helpful to you? What can we do to improve Spotlight? Let us know!

Justin

Mobile World Congress 2017 and the territories ahead

Walking the halls of this year’s Mobile World Congress (MWC)—and, once again, I walked by every booth in every one of them—it was clear that mobile technology is expanding faster than ever into new tech territories.

On the device front, cameras and camera quality have become a pitched battleground, with mobile phone makers teaming with camera manufacturers to give us better and better images and video. This fight is far from over, too, because vendors are exploring many different ways to improve mobile phone camera quality. Quick charging is a hot new trend we can expect to hear more about in the days to come. Of course, apps and their performance continue to matter greatly, because if you can do something from any computer, you’d better be able to do at least some of it from your phone.

The Internet of Things (IoT) grabbed many headlines; vendors are still selling more dreams than reality, but some industries are living this future now. The proliferation of IoT devices will result, of course, in massive increases in the amount of data flowing through the world’s networks, which in turn will require more and more computing power to analyze and use. That power will need to be everywhere, from massive datacenters to the device in your hand, because the more data you have, the more you’ll want to customize it to your particular needs.

Similarly, AI was a major theme of the show, and it’s also likely to suck up computing cycles everywhere. The vast majority of the work will, of course, end up in datacenters, but some processing is likely to be local, particularly in situations, such as real-time translation, where we can’t afford significant comm delays.

5G, the next big step in mobile data speeds, was everywhere. Most companies seemed to agree that the new standard is still years away, but all were excited about what it will make possible. When you can stream 4K movies to your phone wirelessly while simultaneously receiving and customizing analyses of your company’s IoT network, you’re going to need a powerful, sophisticated device running equally powerful and sophisticated apps.

Everywhere I looked, the future was bright—and complicated, and likely to place increasing demands on all of our devices. We’ll need guides as we find our paths through these new territories and as we determine the right device tools for our jobs, so the need for the XPRTs will only increase. I look forward to seeing where we, the BenchmarkXPRT Development Community, take them next.

Mark

A new reality

A while back, I wrote about a VR demo built by students from North Carolina State University. We’ve been checking it out over the last couple of months and are very impressed. This workload will definitely heat up your device! While the initial results look promising, this is still an experimental workload and it’s too early to use results in formal reviews or product comparisons.

We’ve created a page that tells all about the VR demo. As an experimental workload, the demo is available only to community members. As always, members can download the source as well as the APK.

We asked the students to try to build the workload for iOS as a stretch goal. They successfully built an iOS version, but this was at the end of the semester and there was little time for testing. If you want to experiment with iOS yourself, look at the build instructions for Android and iOS that we include with the source. Note that you will need Xcode to build and deploy the demo on iOS.

After you’ve checked out the workload, let us know what you think!

Finally, we have a new video featuring the VR demo. Enjoy!


Eric

The XPRTs embrace virtual reality

Durham, NC – Virtual reality (VR) continues to open new and exciting worlds of possibility, but how do consumers know if their tech can handle its computing demands? The XPRTs have long provided the tools everyone needs to inform purchases, and now they’re turning to VR. For more, see the video at https://youtu.be/liqJyKsDp-c.

Principled Technologies (PT) and the BenchmarkXPRT Development Community, which PT administers, recently sponsored a senior project at the NC State Computer Science Department Senior Design Center. PT advised NC State team members Christian McCurdy, Gregory Manning, Grayson Jones, and Shon Ferguson in the development of a prototype virtual reality evaluation tool.

“Developing the VR benchmark with XPRT tools as guidelines taught us to test for a more real-world scenario instead of trying for numbers that consumers might not understand,” said student Grayson Jones.

Learn more about the XPRTs and try the VR demo at http://www.principledtechnologies.com/benchmarkxprt/vr-demo.

About Principled Technologies, Inc.

Principled Technologies, Inc. is the leading provider of technology marketing and learning & development services. It administers the BenchmarkXPRT Development Community.

Principled Technologies, Inc. is located in Durham, North Carolina, in NC’s Research Triangle Park Region. For more information, please visit www.principledtechnologies.com.

Company Contact

Eric Hale
Principled Technologies, Inc.
1007 Slater Road, Suite #300
Durham, NC 27703
ehale@principledtechnologies.com

About the BenchmarkXPRT Development Community

The BenchmarkXPRT Development Community is a forum where registered members can contribute to the process of creating and improving the XPRTs. For more information, please visit www.BenchmarkXPRT.com.

About the NC State University Computer Science Department Senior Design Center

The Senior Design Center brings together Computer Science seniors and sponsor companies to work on a specific project for a semester. This collaboration provides valuable hands-on experience for the students and important project results for the companies sponsoring them. For more information, please visit https://sdc.csc.ncsu.edu.

Experience is the best teacher

One of the core principles that guide the design of the XPRT tools is that they should reflect the way real-world users use their devices. The XPRTs try to use applications and workloads that reflect what users do and the way that real applications function. How did we learn how important this is? The hard way—by making mistakes! Here’s one example.

In the 1990s, I was Director of Testing for the Ziff-Davis Benchmark Operation (ZDBOp). The benchmarks ZDBOp created for its technical magazines became the industry standards, because of both their quality and Ziff-Davis’ leadership in the technical trade press.

WebBench, one of the benchmarks ZDBOp developed, measured the performance of early web servers. We worked hard to create a tool that used physical clients and tested web server performance over an actual network. However, we didn’t pay enough attention to how clients actually interacted with the servers. In the first version of WebBench, the clients opened connections to the server, did a small amount of work, closed the connections, and then opened new ones.

When we met with vendors after the release of WebBench, they begged us to change the model. At that time, browsers opened relatively long-lived connections and did lots of work before closing them. Our model was almost the opposite of that. It put vendors in the position of having to choose between coding to give their users good performance and coding to get good WebBench results.
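To make the difference concrete, here is a minimal Python sketch (not WebBench code, which was a C-based tool from the 1990s) contrasting the two client models: opening a fresh connection for every request, as the first WebBench did, versus issuing all requests over one long-lived keep-alive connection, as browsers of the era did. It spins up a hypothetical local test server so the two patterns can be run side by side.

```python
# Illustrative sketch only: the original WebBench client model (a new TCP
# connection per request) vs. the long-lived connections real browsers used.
import http.client
import http.server
import threading
import time

class TinyHandler(http.server.BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"  # HTTP/1.1 keeps connections alive by default

    def do_GET(self):
        body = b"ok"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging

def fetch_new_connection_each_time(host, port, n):
    """WebBench-v1 style: open a connection, do one request, close, repeat."""
    for _ in range(n):
        conn = http.client.HTTPConnection(host, port, timeout=5)
        conn.request("GET", "/")
        assert conn.getresponse().read() == b"ok"
        conn.close()  # connection torn down after a single request

def fetch_over_one_connection(host, port, n):
    """Browser style: one long-lived connection carries every request."""
    conn = http.client.HTTPConnection(host, port, timeout=5)
    for _ in range(n):
        conn.request("GET", "/")
        assert conn.getresponse().read() == b"ok"
    conn.close()

if __name__ == "__main__":
    server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), TinyHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    port = server.server_address[1]

    for fn in (fetch_new_connection_each_time, fetch_over_one_connection):
        start = time.perf_counter()
        fn("127.0.0.1", port, 50)
        print(f"{fn.__name__}: {time.perf_counter() - start:.3f}s")
    server.shutdown()
```

A benchmark built around the first function rewards servers optimized for rapid connection setup and teardown; one built around the second rewards sustained per-connection throughput, which is what the browsers of the day actually exercised.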

Of course, we were horrified by this, and worked hard to make the next version of the benchmark reflect more closely the way real browsers interacted with web servers. Subsequent versions of WebBench were much better received.

This is one of the roots from which the XPRT philosophy grew. We have tried to learn and grow from the mistakes we’ve made. We’d love to hear about any of your experiences with performance tools so we can all learn together.

Eric

A new HDXPRT 2014 build is available

Last fall, we identified a way to run HDXPRT 2014, originally developed for Windows 8, on Windows 10. The method involved overwriting the HDXPRT CPU-Z files with newer versions and performing a few additional pre-test configuration steps. You can read more details about those steps here.

Today, we’re releasing a new build of HDXPRT 2014 (v1.2) that eliminates the need to overwrite the CPU-Z files. The new build is available for download at HDXPRT.com. Please note that the app package is 5.08 GB, so allow time and space for the download process.

We also updated the HDXPRT 2014 User Manual to reflect changes in pre-test system configuration and to include the settings we recommend for newer builds of Windows 10.

The changes in the new build do not affect results, so v1.2 scores are comparable to v1.1 scores on the same system.

The new build ran well during testing in our labs, but issues could emerge as Microsoft releases new Windows updates. If you have any questions about HDXPRT or encounter any issues during testing, we encourage you to let us know.

We look forward to seeing your test results!

Justin
