
Tag Archives: laptops

The XPRTs can help with your holiday shopping

The biggest shopping days of the year are fast approaching, and if you’re researching phones, tablets, Chromebooks, or laptops in preparation for Black Friday and Cyber Monday sales, the XPRTs can help! One of the core functions of the XPRTs is to help cut through all the marketing noise by providing objective, reliable measures of a device’s performance. For example, instead of trying to guess whether a new Chromebook is fast enough to handle the demands of remote learning, you can use its CrXPRT and WebXPRT performance scores to see how it stacks up against the competition when handling everyday tasks.

A good place to start your search for scores is our XPRT results browser. The browser is the most efficient way to access the XPRT results database, which currently holds more than 2,600 test results from over 100 sources, including major tech review publications around the world, OEMs, and independent testers. It offers a wealth of current and historical performance data across all the XPRT benchmarks and hundreds of devices. You can read more about how to use the results browser here.

Also, if you’re considering a popular device, chances are good that someone has already published an XPRT score for that device in a recent tech review. The quickest way to find these reviews is by searching for “XPRT” within your favorite tech review site, or by entering the device name and XPRT name (e.g. “Apple iPad” and “WebXPRT”) in a search engine. Here are a few recent tech reviews that use one or more of the XPRTs to evaluate a popular device:


The XPRTs can help consumers make better-informed and more confident tech purchases this holiday season, and we hope you’ll find the data you need on our site or in an XPRT-related tech review. If you have any questions about the XPRTs, XPRT scores, or the results database, please feel free to ask!

Justin

CrXPRT is more valuable than ever

Digital Trends recently published an article discussing various rumors about the future of the Google Pixelbook line. Pixelbooks were some of the first Chromebooks with high-end hardware specs, and they were priced accordingly. Whether or not the rumors discussed in the article turn out to be true, the author points out that the Pixelbook prompted several other vendors, such as HP and Lenovo, to take a chance on high-end Chromebooks. It seems like high-end Chromebooks are here to stay, but given the unique constraints of the Chrome OS environment, buyers are often unsure if it’s worth it to shell out the extra money for a premium model.

We developed CrXPRT to help buyers answer these questions. CrXPRT is a benchmark tool that measures the battery life of your Chromebook as well as how fast it handles everyday tasks like playing video games, watching movies, editing pictures, and doing homework. The performance test gives you individual workload scores and an overall score based on the speed of the device. The battery life test produces an estimated battery life time, a separate performance score, and a frames-per-second (FPS) rate for a built-in HTML5 gaming component.
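The authoritative scoring details are in the CrXPRT white paper; purely as a hedged illustration of how an overall score can be derived from individual workload results, here is a minimal sketch. The workload times and the calibration constant below are invented, not CrXPRT’s real values.

```python
from math import prod

def overall_score(workload_times, calibration=1000.0):
    """Combine per-workload completion times (seconds) into one score.

    Uses a geometric mean of the times, inverted so that faster
    devices earn higher scores. The calibration constant and the
    sample data below are illustrative, not CrXPRT's real values.
    """
    geo_mean = prod(workload_times) ** (1.0 / len(workload_times))
    return calibration / geo_mean

# Hypothetical per-workload times for one Chromebook
times = [4.2, 7.8, 3.1, 9.5]
print(round(overall_score(times), 1))
```

A geometric mean is a common choice for this kind of composite because it keeps any single workload from dominating the overall result.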

You don’t have to be a tech journalist or even a techie to use CrXPRT. To learn more, check out the links below.

Testing the performance or battery life of your Chromebook

Simply download CrXPRT from the Chrome Web Store. Installation is quick and easy, and the CrXPRT 2015 user manual provides step-by-step instructions. A typical performance test takes about 15 minutes, and a battery life test will take 3.5 hours once the system is charged and configured for testing. If you’d like to see how your score compares to other Chromebooks, visit the CrXPRT results page.

Want to know more?

Read the Exploring CrXPRT 2015 white paper, where we discuss the concepts behind CrXPRT, its development process, and the app’s structure. We also describe the component tests and explain the statistical processes used to calculate expected battery life.
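The white paper describes CrXPRT’s actual statistical model; as a loose sketch of the general idea behind projecting battery life from a measured drain, here is a simple linear extrapolation. The measurement values are invented, and this generic technique is not necessarily the exact model CrXPRT uses.

```python
def estimate_battery_life(test_hours, start_pct, end_pct):
    """Extrapolate full-charge battery life from a partial drain.

    If a fixed workload ran for test_hours and drained the battery
    from start_pct to end_pct, a simple linear projection gives the
    hours a full 100% charge would last. Generic illustration only,
    not CrXPRT's actual statistical model.
    """
    drained = start_pct - end_pct
    if drained <= 0:
        raise ValueError("battery did not drain during the test")
    return test_hours * 100.0 / drained

# Hypothetical run: 3.5 hours drained the battery from 100% to 65%
print(round(estimate_battery_life(3.5, 100, 65), 1))  # 10.0 hours
```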

BenchmarkXPRT Development Community members also have access to the CrXPRT source code, so if you’re interested, join today! There’s no obligation, and membership is free for anyone at a company or organization with an interest in benchmarks.

Give CrXPRT a try and let us know what you think!

Justin

Three years of the XPRT Weekly Tech Spotlight

February marked the three-year anniversary of the XPRT Weekly Tech Spotlight, and we now have over 150 devices in our Spotlight library! We started the Spotlight to provide consumers with objective information on device hardware and performance, and to provide vendors with a trusted third-party showcase for their gear. Each week, we measure and verify the Spotlight device’s specs ourselves, never relying on vendor-published data. We also test each device with every applicable XPRT benchmark, and publish the data that lets consumers know how a device measures up to its competitors.

Over the past three years, we’ve featured a wide array of devices:

  • 49 phones
  • 28 laptops
  • 26 tablets
  • 24 2-in-1 devices
  • 12 small-form-factor PCs
  • 7 desktops
  • 6 game consoles
  • 6 all-in-ones


In addition to a wide variety of device types, we try to include a wide range of vendors. We’ve featured devices from ACEPC, Acer, Alcatel, Alienware, Amazon, Apple, ASUS, Barnes & Noble, BlackBerry, BLU, CHUWI, Dell, Essential, Fujitsu, Fusion5, Google, Honor, HP, HTC, Huawei, Intel, LeEco, Lenovo, LG, Microsoft, MINIX, Motorola, Nokia, NVIDIA, OnePlus, Razer, Samsung, Sony, Syber, Xiaomi, and ZTE.

XPRT Spotlight is a great way for device vendors and manufacturers to share PT-verified specs and test results with buyers around the world. We test many of the devices that appear each year and will test—at no charge—any device a manufacturer or vendor sends us. If you’d like us to test your device, please contact us at XPRTSpotlight@PrincipledTechnologies.com.

There’s a lot more to come for the XPRT Spotlight, and we’re constantly working on new features and improvements for the page. Are there any specific devices or features that you would like to see in the Spotlight? Let us know.

Justin

HDXPRT 4 is here!

We’re excited to announce that HDXPRT 4 is now available to the public! Just like previous versions of HDXPRT, HDXPRT 4 uses trial versions of commercial applications to complete real-world media tasks. The HDXPRT 4 installation package includes installers for some of those programs, such as Audacity and HandBrake. For other programs, such as Adobe Photoshop Elements and CyberLink Media Espresso, users will need to download the necessary installers prior to testing by using the links and instructions in the HDXPRT 4 User Manual.

In addition to the photo-editing, music-editing, and video-conversion workloads from prior versions of the benchmark, HDXPRT 4 includes two new Photoshop Elements scenarios. The first uses an AI tool that corrects closed eyes in photos, and the second creates a single panoramic photo from seven separate photos.

HDXPRT 4 is compatible with systems running Windows 10, and is available for download at HDXPRT.com. The installation package is about 4.8 GB, so the download may take several minutes. The setup process takes about 30 minutes on most computers, and a standard test run takes approximately an hour.

After trying out HDXPRT 4, please submit your scores here and send any comments to BenchmarkXPRTsupport@principledtechnologies.com. To see test results from a variety of systems, go to HDXPRT.com and click View Results. We look forward to seeing your results!

Updates on HDXPRT 4 and MobileXPRT 3

There’s a lot going on with the XPRTs, so we want to offer a quick update.

On the HDXPRT 4 front, we’re currently testing community preview candidate builds across a variety of laptops and desktops. Testing is going well, but as is often the case prior to a release, we’re still tweaking the code as necessary when we run into bugs. We’re excited about HDXPRT 4 and look forward to the community seeing how much faster and easier to use it is than previous versions. You can read more about what’s to come in HDXPRT 4 here.

On the MobileXPRT 3 front, proof-of-concept testing for the new and updated workloads went well, and we’re now working to implement the new UI. Below, you can see a mockup of the new MobileXPRT 3 start screen for phones. The aesthetic is completely different from that of MobileXPRT 2015, and is in line with the clean, bright look we used for WebXPRT 3 and HDXPRT 4. We’ve made it easy to select and deselect individual workloads by tapping the workload name (deselected workloads are grayed out), and we’ve consolidated common menu items into an Android-style taskbar at the bottom of the screen. Please note that this is an early view and some aspects of the screen will change. For instance, we’re certain that the final receipt-scanning workload won’t be called “Optical character recognition.”

We’ll share more information about HDXPRT 4 and MobileXPRT 3 in the coming weeks. If you have any questions about HDXPRT or MobileXPRT, or would like to share your ideas, please get in touch!

Justin

[Image: mockup of the MobileXPRT 3 main screen on a phone]

More on the way for the XPRT Weekly Tech Spotlight

In the coming months, we’ll continue to add more devices and helpful features to the XPRT Weekly Tech Spotlight. We’re especially interested in adding data points and visual aids that make it easier to quickly understand the context of each device’s test scores. For instance, those of us who are familiar with WebXPRT 3 scores know that an overall score of 250 is pretty high, but site visitors who are unfamiliar with WebXPRT probably won’t know how that score compares to scores for other devices.

We designed Spotlight to be a source of objective data, in contrast to sites that provide subjective ratings for devices. As we pursue our goal of helping users make sense of scores, we want to maintain this objectivity and avoid presenting information in ways that could be misleading.

Introducing comparison aids to the site is forcing us to make some tricky decisions. Because we value input from XPRT community members, we’d love to hear your thoughts on one of the questions we’re facing: How should our default view present a device’s score?

We see three options:

1) Present the device’s score in relation to the overall high and low scores for that benchmark across all devices.
2) Present the device’s score in relation to the overall high and low scores for that benchmark across the broad category of devices to which that device belongs (e.g., phones).
3) Present the device’s score in relation to the overall high and low scores for that benchmark across a narrower sub-category of devices to which that device belongs (e.g., high-end flagship phones).

To think this through, consider WebXPRT, which runs on desktops, laptops, phones, tablets, and other devices. Typically, the WebXPRT scores for phones and tablets are lower than scores for desktop and laptop systems. The first approach helps to show just how fast high-end desktops and laptops handle the WebXPRT workloads, but it could make a phone or tablet look slow, even if its score was good for its category. The second approach would prevent unfair default comparisons between different device types but would still present comparisons between devices that are not true competitors (e.g., flagship phones vs. budget phones). The third approach is the most careful, but would introduce an element of subjectivity because determining the sub-category in which a device belongs is not always clear cut.
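To make the trade-off between the first two options concrete, here’s a hedged sketch of how a default view might place a score between a range’s low and high. The device scores and ranges below are invented; WebXPRT’s real score ranges differ.

```python
def relative_position(score, low, high):
    """Return where a score falls between low and high, as 0.0 to 1.0."""
    return (score - low) / (high - low)

# Invented WebXPRT 3 overall score ranges
all_devices = (40, 260)   # option 1: across every device type
phones_only = (40, 130)   # option 2: across phones as a broad category

phone_score = 120
print(round(relative_position(phone_score, *all_devices), 2))   # looks slow
print(round(relative_position(phone_score, *phones_only), 2))   # near the top of its class
```

Against all devices, this hypothetical phone sits around a third of the way up the scale; against other phones, it sits near the top, which is exactly the distortion the second and third options try to avoid.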

Do you have thoughts on this subject, or recommendations for Spotlight in general? If so, let us know.

Justin
