BenchmarkXPRT Blog banner

Month: October 2012

TouchXPRT in the fast lane

I titled last week’s blog “Putting the TouchXPRT pedal to the metal.” The metaphor still applies. On Monday, we released TouchXPRT 2013 Community Preview 1 (CP1).  Members can download it here.

CP1 contains five scenarios based on our research and community feedback. The scenarios are Beautify Photo Album, Prepare Photos for Sharing, Convert Videos for Sharing, Export Podcast to MP3, and Create Slideshow from Photos.

Each scenario gives two types of results. There’s a rate, which allows for simple “bigger is better” comparisons. CP1 also gives the elapsed time for each scenario, which is easier to grasp intuitively. Each approach has its advantages. We’d like to get your feedback on whether you’d like us to pick one of those metrics for the final version of TouchXPRT 2013 or whether it makes more sense to include both. You’ll find a fuller description of the scenarios and the results in the TouchXPRT 2013 Community Preview 1 Design overview.
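To make the trade-off between the two result types concrete, here is a minimal sketch of how a bigger-is-better rate relates to a smaller-is-better elapsed time. The function name and the reference work amount are illustrative assumptions, not CP1 internals:

```python
# Hypothetical illustration of the two result types a scenario can report.
# The "work_units" reference amount is an assumption for the example.

def rate_from_elapsed(elapsed_seconds, work_units=100.0):
    """Convert a scenario's elapsed time into a bigger-is-better rate
    (work units completed per minute)."""
    return work_units * 60.0 / elapsed_seconds

# A faster run (shorter elapsed time) yields a higher rate:
print(rate_from_elapsed(120.0))  # 50.0 units/min
print(rate_from_elapsed(60.0))   # 100.0 units/min
```

The point of the sketch: the two metrics carry the same information, but the rate inverts the scale so that higher is always better, while the elapsed time stays in units people can picture directly.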

While you’re looking at CP1, we’re getting the source ready to release. To check out the source, you’ll need a system running Windows 8 with Visual Studio 2012 installed. We hope to release it on Friday. Keep your eye on the TouchXPRT forums for more details.

Post your feedback to the TouchXPRT forum, or e-mail it to TouchXPRTSupport@principledtechnologies.com.  Do you want more scenarios? Different metrics? A new UI feature? Let us know! Make TouchXPRT the benchmark you want it to be.

As I explained last week, we released CP1 without any restrictions on publishing results. It seems that AnandTech was the first to take advantage of that. Read AnandTech’s Microsoft Surface Review to see TouchXPRT in action.

We are hoping that other folks take advantage of CP1’s capability to act as a cross-platform benchmark on the new class of Windows 8 devices. Come join us in the fast lane!

Bill

Comment on this post in the forums

Putting the TouchXPRT pedal to the metal

Since we announced TouchXPRT early this year, we’ve been following a typical benchmark development path. We started with the most important question—“What are people likely to do with a touch-based Windows 8 device?”—and built from there. We looked at what people are doing now with iOS- and Android-based devices. We worked with early Windows 8 units. We studied app stores. We spoke with members of the development community. And so on. When we were done studying, we moved to coding.

We’re making great progress, but something has been nagging at us: When Windows 8 tablets and other devices ship next week, there just won’t be much in the way of tools for measuring their performance when running Windows 8 apps. Sure, you may be able to use standard benchmarks to assess the performance of typical desktop applications, but that won’t tell you how the devices will perform with tablet apps.

So, we’ve decided to put the pedal to the metal and provide everyone in our development community with a special treat. Sometime next week, before Windows 8 ships, we plan to release a sneak preview of TouchXPRT, the TouchXPRT 2013 Community Preview 1 (CP1).

CP1, as its name makes clear, is not the final TouchXPRT release. It is, though, a useful tool for beginning to measure Windows 8 device performance. It is also a great way for everyone in the community to see the current state of our thinking and to provide us feedback—rather than read a design spec, you can actually run this version of the tool and see what you think! (If you would like to read the informal design spec, check out http://www.hdxprt.com/forum/touchxprt2013cp1.php .)

To make the tool easier to evaluate and more useful to all of us, we’re also taking two more unusual steps:

1. We’re not putting any publication restrictions on this preview release. Test at will, and publish your findings.

2. We’re releasing the source code to all community members. If you’re curious about not just what we’re doing but how we’re doing it, you can find out.

We hope these steps will speed acceptance of TouchXPRT 2013 and foster more and faster feedback. Releasing a preview version is more work, because we have to do much of the work of a software release on less-than-final code, but we believe the value to our community justifies the effort.

Next week, when we release CP1, I’ll go over more details, the known limitations, and how you can get us your feedback—feedback we very much want.

Between now and then, we’ll be readying CP1 for your use.

Bill

Comment on this post in the forums

Looking for a winner

This week, PT published its first two public reports using HDXPRT 2012: Performance comparison: Dell Latitude E5430 vs. HP ProBook 4440s and Performance comparison: Dell Latitude E5430 vs. Lenovo ThinkPad L430. You should check them out.

Of course, you can find the HDXPRT results from these reports in the HDXPRT 2012 results database along with results from the characterization study we did last month. The results database is a repository of HDXPRT results you can use to compare system performance. The database includes full disclosure information and lets you sort by a number of criteria, including any HDXPRT score, the processor, the amount of RAM, the graphics card, and so on.

Looking at the results in the database got me wondering who has the mightiest machine out there. The current winner is a custom-built system with an Intel Core i7 3770 and 8 GB of RAM. It has an HDXPRT 2012 Create HD score of 248.

Records are meant to be broken, and I know someone out there can grind that score to dust.  So, we’re going to have a contest. The first person to submit a set of HDXPRT results with a score above 248 will win at least bragging rights and maybe a prize if we can find something suitable around our offices.

You’ll find instructions for submitting results at Submit your HDXPRT 2012 results.

I can’t wait to see your results!

Eric

Comment on this post in the forums

Make sure your voice is heard

One thing about the community model we use for developing HDXPRT is that it depends on the community. Your input is essential to making the benchmark the best it can be. As the community grows, we’re learning more about your priorities.

During the development of HDXPRT 2012, we made the decision to remove the playback tests from the benchmark. While the design document called for the playback test to include 4K H.264, Windows Media Player does not play that format by default. Because less demanding codecs were not differentiating systems, and because the stars used to report the results confused some people, it seemed like a reasonable decision. Bill announced the decision in a blog post, More HDXPRT 2012 changes.

Fast forward to September 18, when Bill hosted the HDXPRT 2012 Webinar. During the Q&A session, a new member of the community said that the playback tests from HDXPRT 2011 were what got him interested in the benchmark. For now, he has to use HDXPRT 2011 for those tests, although, as per Bill’s original blog post, we may release a more demanding playback test as a standalone inspection test later this year.

The suggestion period for HDXPRT 2013 started on October 1. Now is the time to let us know which tests are the most useful to you. If there are tests you’d like us to add, tests you’d like us to change, or applications you’d like us to consider, we need to know that, too. You can post your suggestions to the forum in the HDXPRT 2013 Suggestions section or mail them to hdxprtsupport@hdxprt.com.

In November, we’ll develop an RFC for HDXPRT 2013 and send it to the community for review.

While the suggestions we receive early have the best chance of being implemented, comments we receive after the formal suggestion period still get our attention. We’re always listening. Contact us anytime and make sure that HDXPRT 2013 includes the things that are important to you.

Eric

Comment on this post in the forums
