Want to know how your device performs? Explore the XPRT results database

If you only recently started using the XPRT benchmarks, you may not know about one of the free resources we offer—the XPRT results database. Our results database currently holds more than 3,650 test results from over 150 sources, including global tech press outlets, OEM labs, and independent testers. It serves as a treasure trove of current and historical performance data across all the XPRT benchmarks and hundreds of devices. By comparing these results to the results the same XPRTs produce on your device, you can get a sense of how well your device performs.

We update the results database several times a week, adding selected results from our own internal lab testing, reliable media sources, and end-of-test user submissions. (After you run one of the XPRTs, you can choose to submit the results, but don’t worry—this is opt-in. Your results do not automatically appear in the database.) Before adding a result, we also look at any available system information and evaluate whether the score makes sense and is consistent with general expectations.

There are three primary ways that you can explore the XPRT results database.

The first is by visiting the main BenchmarkXPRT results browser, which displays results entries for all of the XPRT benchmarks in chronological order (see the screenshot below). You can filter the results by selecting a benchmark from the drop-down menu. You can also type values, such as a vendor name (e.g., Dell) or the name of a tech publication (e.g., PCWorld), into the free-form filter field. For results we’ve produced in our lab, clicking “PT” in the Source column takes you to a page with additional configuration information for the test system. For sources outside our lab, clicking the source name takes you to the original article or review that contains the result.

The second way to access our published results is by visiting the results page for an individual XPRT benchmark. Start by going to the page of the benchmark that interests you (e.g., CrXPRT.com) and looking for the blue View Results button. Clicking the button takes you to a page that displays results for only that benchmark. You can use the free-form filter on the page to filter those results, and you can use the Benchmarks drop-down menu to jump to the other individual XPRT results pages.

The third way to view our results database is with the WebXPRT 4 results viewer. The viewer provides an information-packed, interactive tool with which you can explore data from the curated set of WebXPRT 4 results we’ve published on our site. We’ll discuss the features of the WebXPRT 4 results viewer in more detail in a future post.

You can use any of these approaches to compare the results of an XPRT on your device with our many published results. We hope you’ll take some time to explore the information in our results database and that it proves to be helpful to you. If you have ideas for new features or suggestions for improvement, we’d love to hear from you!

Justin

Another milestone for WebXPRT!

Back in November, we discussed some of the trends we were seeing in the total number of completed and reported WebXPRT runs each month. The monthly run totals were increasing at a rate we hadn’t seen before. We’re happy to report that the upward trend has continued and even accelerated through the first quarter of this year! So far in 2024, we’ve averaged 43,744 WebXPRT runs per month, and our run total for the month of March alone (48,791) was more than twice the average monthly run total for 2023 (24,280).

The rapid increase in WebXPRT testing has helped us reach the milestone of 1.5 million runs much sooner than we anticipated. As the chart below shows, it took about six years for WebXPRT to log the first half-million runs and nine years to pass the million-run milestone. It’s only taken about one-and-a-half years to add another half-million.

This milestone means more to us than just reaching some large number. For a benchmark to be successful, it should ideally have widespread confidence and support from the benchmarking community, including manufacturers, OEM labs, the tech press, and other end users. When the number of yearly WebXPRT runs consistently increases, it’s a sign to us that the benchmark is serving as a valuable and trusted performance evaluation tool for more people around the world.

As always, we’re grateful for everyone who has helped us reach this milestone. If you have any questions or comments about using WebXPRT to test your gear, please let us know! And, if you have suggestions for how we can improve the benchmark, please share them. We want to keep making it better and better for you!

Justin

XPRT mentions in the tech press

One of the ways we monitor the effectiveness of the XPRT family of benchmarks is to regularly track XPRT usage and reach in the global tech press. Many tech journalists invest a lot of time and effort into producing thorough device reviews, and relevant and reliable benchmarks such as the XPRTs often serve as indispensable parts of a reviewer’s toolkit. Trust is hard-earned and easily lost in the benchmarking community, so we’re happy when our benchmarks consistently achieve “go-to” status for a growing number of tech assessment professionals around the world.

Because some of our newer readers may be unaware of the wide variety of outlets that regularly use the XPRTs, we occasionally like to share an overview of recent XPRT-related tech press activity. For today’s blog, we want to give readers a sampling of the press mentions we’ve seen over the past few months.

Recent mentions include:

Each month, we send out a BenchmarkXPRT Development Community newsletter that contains the latest updates from the XPRT world and provides a summary of the previous month’s XPRT-related activity, including new mentions of the XPRTs in the tech press. If you don’t currently receive the monthly BenchmarkXPRT newsletter but would like to join the mailing list, please let us know! There is no cost to join, and we will not publish or sell any of the contact information you provide. We will send only the monthly newsletter and occasional benchmark-related announcements, such as news about patches or new releases.

Justin

Working with the WebXPRT 4 source code

In our last blog post, we discussed the WebXPRT 4 source code and how you can contact us to request free access to the build package. In this post, we’ll address two questions that users sometimes ask about code access. The first question is, “How do I build a local instance of WebXPRT?” The second is, “What can I do with it?”

How to build a local WebXPRT 4 instance

After we receive your request, we’ll send you a secure link to the current WebXPRT 4 build package, which contains all the necessary source code files and installation instructions. You will need a system to use as a server, and you will need to be familiar with Apache, PHP, and MySQL configuration to follow the build instructions. WebXPRT 4 uses a LAMP (Linux, Apache, MySQL, and PHP) setup on the “server” side, but it’s also possible to set up an instance with a WAMP or XAMPP stack.

The build instructions include a step-by-step methodology for setup. If you are familiar with LAMP stack configuration, the build and configuration process should take about two to three hours, depending on whether your LAMP-related extensions and libraries are current.
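If you’d like a quick way to confirm that the basic prerequisites are in place before you begin, the short Python sketch below checks whether the Apache, PHP, and MySQL binaries are on your PATH and prints their versions. It’s a hypothetical helper, not part of the official build package; the command names (apache2, php, mysql) are assumptions that may differ on your system (for example, httpd on some distributions, or the paths a WAMP/XAMPP install uses), and the actual requirements are defined in the build instructions we send you.

#!/usr/bin/env python3
# Hypothetical pre-flight check for a local WebXPRT 4 build environment.
# The commands below are assumptions; adjust them for your own stack
# (e.g., "httpd" instead of "apache2", or the binaries bundled with WAMP/XAMPP).

import shutil
import subprocess

CHECKS = {
    "Apache": ["apache2", "-v"],
    "PHP": ["php", "--version"],
    "MySQL": ["mysql", "--version"],
}

def main() -> None:
    for name, cmd in CHECKS.items():
        if shutil.which(cmd[0]) is None:
            print(f"{name}: not found ({cmd[0]} is not on the PATH)")
            continue
        # Print only the first line of the version output as a quick summary.
        result = subprocess.run(cmd, capture_output=True, text=True)
        lines = (result.stdout or result.stderr).splitlines()
        print(f"{name}: {lines[0] if lines else '(no version output)'}")

if __name__ == "__main__":
    main()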

What you can do with a local WebXPRT 4 instance

We allow users to set up their own WebXPRT 4 instances for purposes of review, internal testing, or experimentation.

One use-case example is internal OEM lab testing. Some labs use WebXPRT to conduct extensive testing on preproduction hardware, and they follow stringent security guidelines to avoid the possibility of any hardware or test information leaving the lab. Even though we have our own strict policies about how we handle the small amount of data that WebXPRT gathers from tests, a local WebXPRT 4 instance provides those labs with an extra layer of security for sensitive tests.

We do ask that users publish results only from tests that they run on WebXPRT.com. As we mentioned in our most recent post, benchmarking requires a product that is consistent to enable valid comparisons over time. We allow people to download the source, but we reserve the right to control derivative works and which products can use the name “WebXPRT.” That way, when people see WebXPRT scores in tech press articles or vendor marketing materials, they can run their own tests on WebXPRT.com and be confident that they’re using the same standard for comparison.

If you have any questions about using the WebXPRT 4 source code, let us know!

Justin

Accessing the WebXPRT 4 source code

If you’re new to the XPRTs, you may not be aware that we provide free access to XPRT benchmark source code. Publishing XPRT source code is part of our commitment to making the XPRT development process as transparent as possible. By allowing interested parties to access and review our source code, we’re encouraging openness and honesty in the benchmarking industry. We’re also inviting constructive feedback that can help ensure that the XPRTs continue to improve and contribute to a level playing field for all the types of products they measure.

While we do offer free access to the XPRT source code, we’ve decided to offer the code upon request instead of using a permanent download link. This approach prevents bots or other malicious actors from downloading the code. It also has the benefit of allowing us to interact with users who are interested in the source code and answer any questions they may have. We’re always keen to learn what others think about the XPRTs and the types of work they measure.

We recently received some questions about accessing the WebXPRT 4 source code, which made us realize that we needed to provide a clearer way for people to request the code. In response, we added a “Request WebXPRT 4 source code” link to the gray Helpful Info box on WebXPRT.com (see it in the screenshot below). Clicking the link will allow you to email the BenchmarkXPRT Support team directly and request the code.

After we receive your request, we’ll send you a secure link to the current WebXPRT 4 build package. For those users who wish to set up a local instance of WebXPRT 4 for their own internal testbeds, the package will contain all the necessary files and installation instructions. We allow folks to set up their own instances for purposes of review, internal testing, or experimentation, but we ask that users publish results only from tests run on the official WebXPRT 4 site.

While we offer free access to XPRT source code, our approach to derivative works differs from some traditional open-source models that encourage developers to change products and even take them in different directions. Because benchmarking requires a product that remains static to enable valid comparisons over time, we allow people to download the source, but we reserve the right to control derivative works. This discourages a situation where someone publishes an unauthorized version of the benchmark and calls it an “XPRT.”

If you have any questions about accessing the WebXPRT 4 source code, let us know!

Justin

WebXPRT in PT reports

We don’t just make WebXPRT—we use it, too. If you normally come straight to BenchmarkXPRT.com or WebXPRT.com, you may not even realize that Principled Technologies (PT) does a lot more than just manage and administer the BenchmarkXPRT Development Community. We’re also the tech world’s leading provider of hands-on testing and related fact-based marketing services. As part of that work, we’re frequent WebXPRT users.

We use the benchmark when we test devices such as Chromebooks, desktops, mobile workstations, and consumer laptops for our clients. (You can see a lot of that work and many of our clients on our public marketing portfolio page.) We run the benchmark for the same reasons that others do—it’s a reliable and easy-to-use tool for measuring how well devices handle web browsing and other web work.

We also sometimes use WebXPRT simply because our clients request it. They request it for the same reason the rest of us like and use it: it’s a great tool. Regardless of job titles and descriptions, most laptop and tablet users surf the web and access web-based applications every day. Because WebXPRT is a browser benchmark, higher scores suggest that a device is likely to provide a better online experience.

Here are just a few of the recent PT reports that used WebXPRT:

  • In a project for Dell, we compared the performance of a Dell Latitude 7340 Ultralight to that of a 13-inch Apple MacBook Air (2022).
  • In a study for HP, we compared the performance of an HP ZBook Firefly G10, an HP ZBook Power G10, and an HP ZBook Fury G10.
  • Finally, in a set of comparisons for Lenovo, we evaluated the system performance and end-user experience of eight Lenovo ThinkBook, ThinkCentre, and ThinkPad systems along with their Apple counterparts.

All these projects, and many more, show how a variety of companies rely on PT—and on WebXPRT—to help buyers make informed decisions.

P.S. If we publish scores from a client-commissioned study in the WebXPRT 4 results viewer, we will list the source as “PT”, because we did the testing.

By Mark L. Van Name and Justin Greene
