Today we’re releasing the HDXPRT 4 Community Preview (CP). Just like previous versions of HDXPRT, HDXPRT 4 uses trial versions of commercial applications to complete workload tasks. For some of those programs, such as Audacity and HandBrake, HDXPRT 4 includes installers in the HDXPRT installation package. For other programs, such as Adobe Photoshop Elements 2018 and CyberLink MediaEspresso 7.5, users need to download the necessary installers prior to testing by using the links and instructions in the HDXPRT 4 User Manual.
In addition to the photo-editing, music-editing, and video-conversion workloads from prior versions of the benchmark, HDXPRT 4 includes two new Photoshop Elements scenarios. The first uses an AI tool that corrects closed eyes in photos, and the second creates a single panoramic photo from seven separate photos.
HDXPRT 4 is compatible with systems running Windows 10, and the installation package is slightly smaller than previous versions at just over 4.7 GB.
Because this is a community preview, it is available only to community members, who may download the preview from the HDXPRT tab in the Members’ Area. Because we expect results from CP testing to be comparable to results from the general release, members may publish their CP test results.
After you try the CP, please send us your comments. If you send information that’s relevant to the entire community, we may post an anonymous version of your comments to the forum. Thanks for your participation!
For BenchmarkXPRT Development Community members anticipating the HDXPRT 4 Community Preview (CP), we want to thank you for your patience and explain where we are in the release process.
This past month has brought a flurry of activity in the Windows 10 development world. We’ve been testing HDXPRT 4 extensively on each of the new prerelease builds available through the Windows Insider Program. While testing on a recent Windows 10 Redstone 5 preview build, we began to see inconsistent HDXPRT 4 workload scores on some systems. The difference between those workload scores and scores on the same systems with previous Windows 10 builds was significant enough for us to decide that the best course of action is to hold off on the CP until we understand the issue. We don’t want to release a CP only to run into serious problems with an imminent Windows release. We want to take the time to figure out what’s going on and get it right.
We hope to resolve these issues and publish the HDXPRT 4 CP as soon as possible. Thanks again for your patience. We’ll update the community soon with more information on the anticipated release schedule.
This week, we’re sharing a little more about the upcoming HDXPRT 4 Community Preview. Just like previous versions of HDXPRT, HDXPRT 4 will use trial versions of commercial applications to complete workload tasks. We will include installers for some of those programs, such as Audacity and HandBrake, in the HDXPRT installation package. For other programs, such as Adobe Photoshop Elements 2018 and CyberLink MediaEspresso 7.5, users will need to download the necessary installers prior to testing, using links and instructions that we will provide. The HDXPRT 4 installation package is just over 4.7 GB, slightly smaller than previous versions.
I can also report that the new version requires fewer pre-test configuration steps and a full test run takes much less time than before. Some systems that took over an hour to complete an HDXPRT 2014 run are completing HDXPRT 4 runs in about 25 minutes.
We’ll continue to provide more information as we get closer to releasing the community preview. If you’re interested in testing with HDXPRT 4 before the general release but have not yet joined the community, we invite you to join now. If you have any questions or comments about HDXPRT or the community, please contact us.
A few months ago, we discussed some initial ideas for the next version of HDXPRT, including updating the benchmark’s workloads and real-world trial applications and improving the look and feel of the UI. This week, we’d like to share more about the status of the HDXPRT development process.
We’re planning to keep HDXPRT’s three test categories: editing photos, editing music, and converting videos. We’re also planning to use the latest trial versions of the same five applications included in HDXPRT 2014: Adobe Photoshop Elements, Apple iTunes, Audacity, CyberLink MediaEspresso, and HandBrake. The new versions of each of these programs include features and capabilities that may enhance the HDXPRT workloads. For example, Adobe Photoshop Elements 2018 includes interesting new AI tools such as “Open Closed Eyes,” which purports to fix photos ruined by subjects who blinked at the wrong time. We’re evaluating whether any of the new technologies on offer will be a good fit for HDXPRT.
We’re also evaluating how the new Windows 10 SDK and Fall Creators Update will affect HDXPRT. It’s too early to discuss potential changes in any detail, but we know we’ll need to adapt to new development tools, and it’s possible that the Fluent Design System will affect the HDXPRT UI beyond the improvements we already had in mind.
As HDXPRT development progresses, we’ll continue to keep the community up to date. If you have suggestions or insights into the new Fall Creators Update or any of HDXPRT’s real-world applications, we’d love to hear from you! If you’re just learning about HDXPRT for the first time, you can find out more about the purpose, structure, and capabilities of the test here.
A while back, I wrote about a VR demo built by students from North Carolina State University. We’ve been checking it out over the last couple of months and are very impressed. This workload will definitely heat up your device! While the initial results look promising, this is still an experimental workload and it’s too early to use results in formal reviews or product comparisons.
We’ve created a page that tells all about the VR demo. As an experimental workload, the demo is available only to community members. As always, members can download the source as well as the APK.
We asked the students to try to build the workload for iOS as a stretch goal. They successfully built an iOS version, but this was at the end of the semester and there was little time for testing. If you want to experiment with iOS yourself, look at the build instructions for Android and iOS that we include with the source. Note that you will need Xcode to build and deploy the demo on iOS.
After you’ve checked out the workload, let us know what you think!
Finally, we have a new video featuring the VR demo. Enjoy!
One of the core principles that guides the design of the XPRT tools is that they should reflect the way real-world users use their devices. The XPRTs try to use applications and workloads that reflect what users do and the way that real applications function. How did we learn how important this is? The hard way—by making mistakes! Here’s one example.
In the 1990s, I was Director of Testing for the Ziff-Davis Benchmark Operation (ZDBOp). The benchmarks ZDBOp created for its technical magazines became the industry standards, because of both their quality and Ziff-Davis’ leadership in the technical trade press.
WebBench, one of the benchmarks ZDBOp developed, measured the performance of early web servers. We worked hard to create a tool that used physical clients and tested web server performance over an actual network. However, we didn’t pay enough attention to how clients actually interacted with the servers. In the first version of WebBench, the clients opened connections to the server, did a small amount of work, closed the connections, and then opened new ones.
When we met with vendors after the release of WebBench, they begged us to change the model. At that time, browsers opened relatively long-lived connections and did lots of work before closing them. Our model was almost the opposite of that. It put vendors in the position of having to choose between coding to give their users good performance and coding to get good WebBench results.
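To make the mismatch concrete, here is a minimal Python sketch (not WebBench code, which was a compiled network test tool) contrasting the two client models against a throwaway local HTTP server. The server, paths, and request counts are illustrative assumptions; the point is only the connection lifecycle.

```python
# Illustrative sketch (not WebBench code): contrast a per-request
# connection model with a long-lived reused connection, using a tiny
# local HTTP server so the example is self-contained.
import http.client
import http.server
import threading

class _Handler(http.server.BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"      # enables keep-alive connections
    def do_GET(self):                  # tiny fixed response
        body = b"ok"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):      # silence per-request logging
        pass

def start_server():
    server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), _Handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

def short_lived_model(port, n):
    """First-version WebBench style: open a connection, do a small
    amount of work, close it, then open a new one for the next request."""
    for _ in range(n):
        conn = http.client.HTTPConnection("127.0.0.1", port)
        conn.request("GET", "/")
        assert conn.getresponse().read() == b"ok"
        conn.close()                   # torn down after every request

def long_lived_model(port, n):
    """Browser style of that era: one long-lived connection carries
    many requests before it is finally closed."""
    conn = http.client.HTTPConnection("127.0.0.1", port)
    for _ in range(n):
        conn.request("GET", "/")
        # draining each response body lets the connection be reused
        assert conn.getresponse().read() == b"ok"
    conn.close()

server = start_server()
port = server.server_address[1]
short_lived_model(port, 5)   # 5 requests, 5 connections
long_lived_model(port, 5)    # 5 requests, 1 connection
server.shutdown()
```

A server tuned to win under the first model optimizes connection setup and teardown; one tuned for real browsers optimizes work done on an established connection, which is exactly the conflict the vendors described.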
Of course, we were horrified by this, and worked hard to make the next version of the benchmark reflect more closely the way real browsers interacted with web servers. Subsequent versions of WebBench were much better received.
This is one of the roots from which the XPRT philosophy grew. We have tried to learn and grow from the mistakes we’ve made. We’d love to hear about any of your experiences with performance tools so we can all learn together.