
Category: History of benchmarking

HDXPRT: See how your Windows PC handles real-world media tasks

Many of our blog readers first encountered the XPRTs when reading about a specific benchmark, such as WebXPRT, in a device review. Because these folks might be unfamiliar with our other benchmarks, we like to occasionally “reintroduce” individual XPRTs. This week, we invite you to get to know HDXPRT.

HDXPRT, which stands for High-Definition Experience & Performance Ratings Test, was the first benchmark published by the HDXPRT Development Community, which later became the BenchmarkXPRT Development Community. HDXPRT 4, the latest version, evaluates the performance of Windows 10 and Windows 11 devices while handling real-world media tasks such as photo editing, video conversion, and music editing. HDXPRT uses real commercial applications, such as Photoshop and MediaEspresso, to complete its workloads. The benchmark then produces easy-to-understand results that are relevant to buyers shopping for new Windows systems.

The HDXPRT 4 setup process takes about 30 minutes on most systems. The length of the test can vary significantly depending on the speed of the system, but for most PCs that are less than a few years old, a full three-iteration test cycle takes under two hours.

HDXPRT is a useful tool for anyone who wants to evaluate the real-world, content-creation capabilities of a Windows PC. To see test scores from a variety of Windows devices, go to HDXPRT.com and click View Results.

Want to run HDXPRT?

Download HDXPRT from HDXPRT.com. The HDXPRT user manual provides information on minimum system requirements, as well as step-by-step instructions for configuring your system and kicking off a test.

Want to dig into the details?

The HDXPRT source code is available upon request. If you’d like to access the source code, please send your request to benchmarkxprtsupport@principledtechnologies.com. Build instructions are also available.

If you haven’t used HDXPRT before, give it a shot and let us know what you think!

Justin

A huge milestone for XPRT runs and downloads!

We’re excited to have recently passed an important milestone: one million XPRT runs and downloads! Most importantly, that huge number does not just reflect past successes. As the chart below illustrates, XPRT use has grown steadily over the years. In 2021, we recorded, on average, more XPRT runs and downloads in a single month (23,395) than we recorded in the entire first year of tracking these stats (17,051).

We reached one million runs and downloads in about seven and a half years. At the current rate, we’ll reach two million in roughly three and a half more years. With WebXPRT 4 on the way, there’s a good chance we can reach that mark even sooner!
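The projection above is easy to check with the figures from the post. A quick sketch (using only the numbers stated here):

```python
# Sanity check of the milestone figures and the projection above.
monthly_rate_2021 = 23_395   # average runs + downloads per month in 2021
first_year_total = 17_051    # total runs + downloads in the first year of tracking

# A single average 2021 month already exceeds the entire first year's total.
assert monthly_rate_2021 > first_year_total

# Years needed to accumulate the next million at the 2021 rate.
years_to_next_million = 1_000_000 / (monthly_rate_2021 * 12)
print(round(years_to_next_million, 2))  # about 3.56, i.e. roughly three and a half years
```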

As always, we’re grateful to all the testers who have helped us reach this milestone. If you have any questions or comments about using any of the XPRTs to test your gear, let us know!

Justin

Adapting to a changing tech landscape

The BenchmarkXPRT Development Community started almost 10 years ago with the development of the High Definition Experience & Performance Ratings Test, also known as HDXPRT. Back then, we distributed the benchmark to interested parties by mailing out physical DVDs. We’ve come a long way since then, as testers now freely and easily access six XPRT benchmarks from our site and major app stores.

Developers, hardware manufacturers, and tech journalists—the core group of XPRT testers—work within a constantly changing tech landscape. Because of our commitment to providing those testers with what they need, the XPRTs grew as we developed additional benchmarks to expand the reach of our tools from PCs to servers and all types of notebooks, Chromebooks, and mobile devices.

As today’s tech landscape continues to evolve at a rapid pace, our desire to play an active role in emerging markets continues to drive us to expand our testing capabilities into areas like machine learning (AIXPRT) and cloud-first applications (CloudXPRT). While these new technologies carry the potential to increase efficiency, improve quality, and boost the bottom line for companies around the world, it’s often difficult to decide where and how to invest in new hardware or services. The ever-present need for relevant and reliable data is the reason many organizations use the XPRTs to help make confident choices about their company’s future tech.

We just released a new video that helps to explain what the XPRTs provide and how they can play an important role in a company’s tech purchasing decisions. We hope you’ll check it out!

We’re excited about the continued growth of the XPRTs, and we’re eager to meet the challenges of adapting to the changing tech landscape. If you have any questions about the XPRTs or suggestions for future benchmarks, please let us know!

Justin

Experience is the best teacher

One of the core principles that guides the design of the XPRT tools is that they should reflect the way real-world users use their devices. The XPRTs aim to use applications and workloads that match what users actually do and the way real applications function. How did we learn how important this is? The hard way: by making mistakes! Here’s one example.

In the 1990s, I was Director of Testing for the Ziff-Davis Benchmark Operation (ZDBOp). The benchmarks ZDBOp created for its technical magazines became the industry standards, because of both their quality and Ziff-Davis’ leadership in the technical trade press.

WebBench, one of the benchmarks ZDBOp developed, measured the performance of early web servers. We worked hard to create a tool that used physical clients and tested web server performance over an actual network. However, we didn’t pay enough attention to how clients actually interacted with the servers. In the first version of WebBench, the clients opened connections to the server, did a small amount of work, closed the connections, and then opened new ones.

When we met with vendors after the release of WebBench, they begged us to change the model. At that time, browsers opened relatively long-lived connections and did lots of work before closing them. Our model was almost the opposite of that. It put vendors in the position of having to choose between coding to give their users good performance and coding to get good WebBench results.
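The contrast between the two connection models can be sketched in a few lines of Python. This is a hypothetical illustration using only the standard library, not WebBench code; the local test server and its endpoint are stand-ins:

```python
# Sketch of the two HTTP connection models described above:
# per-request connections (the first WebBench) vs. a long-lived,
# keep-alive connection (how browsers of the era actually behaved).
import http.client
import http.server
import threading

# A minimal local HTTP server so the example is self-contained.
class Handler(http.server.BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"  # HTTP/1.1 enables persistent (keep-alive) connections
    def do_GET(self):
        body = b"ok"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):
        pass  # silence per-request logging

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

def per_request_connections(n):
    """Model 1: open a connection, do a little work, close it, repeat."""
    ok = 0
    for _ in range(n):
        conn = http.client.HTTPConnection("127.0.0.1", port)
        conn.request("GET", "/")
        if conn.getresponse().read() == b"ok":
            ok += 1
        conn.close()  # tear down after a small amount of work
    return ok

def persistent_connection(n):
    """Model 2: one long-lived connection reused for many requests."""
    ok = 0
    conn = http.client.HTTPConnection("127.0.0.1", port)
    for _ in range(n):
        conn.request("GET", "/")
        if conn.getresponse().read() == b"ok":
            ok += 1  # connection stays open between requests
    conn.close()
    return ok

per_request_ok = per_request_connections(10)
keep_alive_ok = persistent_connection(10)
server.shutdown()
```

Both loops do the same work, but Model 1 pays connection-setup and teardown costs on every request, so a server optimized for it rewards very different code paths than one optimized for the keep-alive traffic real browsers generated.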

Of course, we were horrified by this, and worked hard to make the next version of the benchmark more closely reflect the way real browsers interacted with web servers. Subsequent versions of WebBench were much better received.

This is one of the roots from which the XPRT philosophy grew. We have tried to learn and grow from the mistakes we’ve made. We’d love to hear about any of your experiences with performance tools so we can all learn together.

Eric

Another great year

A lot of great stuff happened this year! In addition to releasing new versions of the benchmarks, videos, infographics, and white papers, we released our first-ever German UI and sponsored our first student partnership at North Carolina State University. We visited three continents to promote the XPRTs and saw XPRT results published on six continents (we’re still working on Antarctica).

Perhaps most exciting, we reached our fifth anniversary. Users have downloaded or run the XPRTs over 100,000 times.

As great as the year has been, we are sprinting into 2016. Though I can’t talk about them yet, there are some big pieces of news coming soon. Even sooner, I will be at CES next week. If you would like to talk about the XPRTs or the future of benchmarking, let me know and we’ll find a time to meet.

Whatever your holiday traditions are, I hope you are having a great holiday season. Here’s wishing you all the best in 2016!

Eric

Chaos and opportunity

With both E3 and Apple’s WWDC happening this week, there’s been a lot of news. There’s also been a lot of hyperbolic commentary. I am not about to get into the arguments about the PS4 vs. the Xbox One or iOS 7 vs. Android.

It was Tim Cook’s presentation at WWDC that really got my attention. It’s unusual for an executive presentation to focus so much attention on a particular competitor, but Android was clearly on his mind. At one point, he focused harsh attention on fragmentation in the Android market, calling it “terrible” for developers. You can see the video here, at about the 74-minute mark.

As we saw in the 90s, chaos can breed innovation. At that time, the paradigm was that Macs always worked, but if you wanted the most advanced hardware, you got a PC. I remember the editors at MacWorld, who deeply, truly loved the Mac, lusting after the small, light, cheap (by the standards of the time) notebooks PC users could get.

That being said, we understand the challenges of developing in the Android market. As I said in It’s finally here!, the Android ecosystem is sufficiently diverse that we know the benchmark will encounter configurations we’ve not seen before. If you have any problems with the MobileXPRT CP, please let us know at benchmarkxprtsupport@principledtechnologies.com. We want the benchmark to be the best it can be.

Eric
