
Top 5 reasons for meeting us at Computex in Taipei

As I’ve mentioned before, Bill Catchings from PT will be at the upcoming Computex show in Taipei to debut HDXPRT 2011. At the same time, back home in North Carolina we’ll be mailing copies of the benchmark DVDs to all the members of the HDXPRT Development Community.

If you’re one of the lucky folks who gets to attend Computex, we’d love it if you would come by Bill’s room in the Hyatt (we’ll publicize the room number as soon as we know it), see the benchmark in action, and give us your thoughts about it. I know the show is huge and full of attractions, so I thought I’d give you the top five reasons you ought to make room in your schedule to visit with us.

5. Free snacks! We don’t know what they are yet, or even how we’ll persuade the hotel to let us have them, but we’re committed to providing something to quench your thirst and something to quell your hunger.

4. A break from the crowds. Not only do you get to sit, drink, eat, and see a great new benchmark, you get to do so in the quiet and luxury of a Taipei hotel suite. No more bumping shoulders with fellow show attendees or fighting to get to a place quiet enough that you can talk; in that room, you can relax.

3. You can affect the industry! The support for HDXPRT is growing. More and more organizations are using it. We don’t just want to show it to you; we want you to tell us what you think about it. Your opinions count, and they could help drive the design of the next version of the benchmark, HDXPRT 2012. Yeah, that’s right: the one in development isn’t out, and I’m already talking about the next one. Sue me: I like to live on the edge.

2. You don’t want to make Bill cry. Imagine him, sitting alone in the room, laptop humming, ready to demonstrate this cool new testing tool, and no one to keep him company. His sadness would be so profound that I can’t bear to think of what he might do. You can’t let that happen.

1. It’s way cooler to get your HDXPRT DVDs in person! That’s right: Bill’s not just going to show you the benchmark, he’s going to give you your very own copy! He’ll probably shake your hand, too, and thank you for coming. Admit it: that’s cooler than getting it in the mail (which is also pretty darn good—and which will happen to you if you join the HDXPRT Development Community).

Mark Van Name


What to do, what to do

When you set out to build an application-based benchmark like HDXPRT, you face many choices, but two are particularly important: which applications to run, and which functions to perform in each application.

With HDXPRT the answers were straightforward, as they should be.

The applications we chose reflect a blend of market leaders and programs with emerging but important features, along with the input of our community members.

The functions we perform in each application are representative of common uses of those programs, and they too reflect the input of the community.

What’s so important here is the last clause of each of those paragraphs:  your input defines this benchmark.

As we finish HDXPRT 2011 and then move to the 2012 version, we’ll begin the development cycle anew. When we do, if you want to make sure we choose the applications and functions that matter most to you, then participate: tell us what you want, and let us hear your voice. We will respond to all input. We can’t guarantee to accept every suggestion (goals and desires sometimes conflict), but we can guarantee that you will hear back from us and that we will explain the rationale for our decisions.

Mark Van Name


Putting HDXPRT in some benchmark context

Benchmarks come in many shapes and sizes.  Some are extremely small, simple, and focused, while others are large, complex, and cover many aspects of a system.  To help position HDXPRT in the world of benchmarks, let me share with you a little taxonomy that Bill and I have long used.  No taxonomy is perfect, of course, but we’ve found this one to be very helpful as a general categorization tool.

From the perspective of how benchmarks measure performance, you can divide most of them into three groups.

Inspection tools use highly specialized tests to target very particular parts of a system. Back in the day, lo these many decades ago (okay, it was only two decades, but in dog years two tech decades is like five generations), some groups used a simple no-op loop to measure processor performance. I know, it sounds dumb today, but for a short time many felt it was a legitimate measure of processor clock speed, which is one aspect of performance. Similarly, if you want to know how fast a graphics subsystem can draw a particular kind of line, you can write code to draw lines of that type over and over.
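
To make that concrete, here is a minimal sketch in C of what an inspection-style test might look like. Everything in it (the operation, the iteration count) is invented for illustration; it simply times a tight loop of one operation and reports a rate, much as those old clock-speed loops did.

    #include <stdio.h>
    #include <time.h>

    /* A minimal inspection-style test: time a tight loop of one simple
       integer operation. The operation and iteration count are invented
       for illustration. */
    int main(void)
    {
        const long iterations = 100000000L;
        volatile long sink = 0;  /* volatile keeps the compiler from deleting the loop */
        struct timespec start, end;

        clock_gettime(CLOCK_MONOTONIC, &start);
        for (long i = 0; i < iterations; i++)
            sink += i;           /* stand-in for the operation under inspection */
        clock_gettime(CLOCK_MONOTONIC, &end);

        double seconds = (end.tv_sec - start.tv_sec)
                       + (end.tv_nsec - start.tv_nsec) / 1e9;
        printf("%.0f operations per second\n", iterations / seconds);
        return 0;
    }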

These tools have very limited utility, because they don’t do what real users do, but for people working close to hardware, they can be useful.

Moving closer to the real world, synthetic benchmarks are specially written programs that simulate the kinds of work their developers believe real users are doing. So, if you think your target users are spending all day in email, you could write your own mini email client and time functions in it.  These tools definitely move closer to real user work than inspection tools, but they still have the drawback of not actually running the programs real people are using.
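
As a sketch of the synthetic approach, here is a toy “email client” in C that formats and stores messages so we can time that work. Every detail (the message format, the counts) is invented for illustration; a real synthetic benchmark would model its target users’ workload far more carefully.

    #include <stdio.h>
    #include <time.h>

    /* A toy synthetic benchmark: simulate an email client by formatting
       and storing messages, then report how long that work took. The
       message format and count are invented for illustration. */
    #define MESSAGES 100000

    static char mailbox[MESSAGES][128];

    int main(void)
    {
        struct timespec start, end;

        clock_gettime(CLOCK_MONOTONIC, &start);
        for (int i = 0; i < MESSAGES; i++) {
            /* simulate composing a message: format headers and a body */
            snprintf(mailbox[i], sizeof mailbox[i],
                     "From: user%d@example.com\nSubject: test %d\n\nHello.",
                     i, i);
        }
        clock_gettime(CLOCK_MONOTONIC, &end);

        double seconds = (end.tv_sec - start.tv_sec)
                       + (end.tv_nsec - start.tv_nsec) / 1e9;
        printf("Simulated %d messages in %.3f seconds\n", MESSAGES, seconds);
        return 0;
    }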

Application-based benchmarks take that last step by using real applications, the same programs that users employ in the real world. These benchmarks cause those applications to perform the kinds of actions that real users take, and they time those actions.  You can always argue about how representative they are—more on that in a future blog entry, assuming I don’t forget to write it—but they are definitely closer to the real world because they’re using real applications.
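
At its simplest, the application-based approach looks like this sketch: launch a real program to do real work and time it with a wall clock. The command below is purely a placeholder (any real media tool and input file would do), and a full benchmark like HDXPRT drives complete consumer applications through scripted user actions, which this sketch does not attempt.

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* The simplest form of an application-based test: run a real program
       on real work and time it. The command is a placeholder; substitute
       any real application and input you want to measure. */
    int main(void)
    {
        struct timespec start, end;

        clock_gettime(CLOCK_MONOTONIC, &start);
        int status = system("ffmpeg -y -i input.mp4 -s 640x360 output.mp4");
        clock_gettime(CLOCK_MONOTONIC, &end);

        if (status != 0) {
            fprintf(stderr, "application reported failure\n");
            return 1;
        }
        double seconds = (end.tv_sec - start.tv_sec)
                       + (end.tv_nsec - start.tv_nsec) / 1e9;
        printf("Task took %.1f seconds\n", seconds);
        return 0;
    }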

With all of that background, HDXPRT becomes easy to classify:  it’s an application-based benchmark.

Mark Van Name


An example of the community in action

Last week, I hosted a Webinar on HDXPRT. We’ll make a recording of it available on the site fairly soon. Multiple members attended. As I was going through the slides and discussing various aspects of the benchmark, a member asked about installing the benchmark from a USB key or a server. My response was the simple truth: we hadn’t considered that approach. As I then elaborated, we clearly should have thought about it, because those capabilities would be useful in just about every production lab out there, including ours here at PT. I concluded by saying that we’d look into it.

I’m not naming the member only because, when big companies are involved, I’m never sure whether public credit will be welcome or will cause someone trouble, and I don’t want to create hassle for anyone. He should, though, feel free to step forward and claim the well-deserved credit for the suggestion.

Less than a week after the Webinar, I’m happy to be able to report that the team has done more than look into these capabilities; it’s implemented them! So, the next Beta release, Beta 2, which we’ll be releasing any time now (maybe even before we post this blog entry), lets you install the benchmark from a network share or a USB key.

I know this is a relatively small thing, but I think it bears reporting because it is exactly the way the community should work. A member brought the benefits of his experience to bear in a great bit of feedback, and now the benchmark is better for it—and so are all of us who use it.

Keep the good ideas coming!

Mark Van Name


Our community’s goal

Computer system performance evaluation has a long and complex history. Many of the earliest tests were simple, short code snippets, such as Whetstone, that did little more than give an indication of how fast a particular computer subsystem could operate. Unfortunately, such simple benchmarks quickly lost their value, in part because they were very crude measures, and in part because the software tools on the systems they measured could easily optimize for them. In some cases, a compiler could even recognize a test and “optimize” the code by simply producing the final result!
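
To see how a compiler could pull that off, consider this sketch; the loop itself is invented, but the effect is real. Because the result is computable at compile time, a modern optimizer is free to replace the entire loop with its final value, so the “test” finishes instantly and measures nothing.

    #include <stdio.h>

    /* This loop looks like work, but its result is computable at compile
       time, so an optimizing compiler may replace the whole loop with a
       constant, and the "benchmark" then measures nothing. */
    int main(void)
    {
        long sum = 0;
        for (long i = 0; i < 1000000; i++)
            sum += i;
        printf("%ld\n", sum);  /* many compilers fold this to 499999500000 */
        return 0;
    }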

Over time, though, benchmarks have become more complex and more relevant. Whole organizations exist, and have existed, to build benchmarks. Notable ones include the Ziff-Davis Benchmark Operation (ZDBOp), which the Ziff-Davis computer magazines funded in the 1990s and which Mark and I ran; the Standard Performance Evaluation Corporation (SPEC), which its member companies fund and of which PT is a member; and the Business Applications Performance Corporation (BAPCo), which its member companies also fund. Each of these organizations has developed widely used products, such as Winstone (ZDBOp), SPEC CPU (SPEC), and SYSmark (BAPCo). Each organization has also faced challenges of its own. In the case of ZDBOp, for example, Ziff-Davis could no longer support the costs of developing its benchmarks, so it discontinued the group. SPEC continues to develop good benchmarks, but its process can sometimes mean years pass between versions.

The goal of HDXPRT and the HDXPRT Development Community (HDC) is to explore a new way to develop benchmarks. By drawing on the expertise and experience of a community of interested people, we hope to be able to develop benchmarks in an open and collaborative environment while keeping them timely.

HDXPRT 2011 is the first test of this approach. We believe that it and subsequent versions of it, as well as other benchmarks, will give the industry a new model for creating world-class performance measurement tools.

If you’re not a member of the HDC, please consider joining us and helping define the future of performance evaluation.

Bill


HDXPRT 2011 Beta Released for Testing

The HDXPRT Development Community, created by Principled Technologies (PT), is pleased to announce the distribution of the HDXPRT 2011 Beta to its registered members.

HDXPRT 2011 is a benchmark for evaluating the capabilities of PCs using real-world media scenarios and common consumer media applications.

PT has invited Community members to test the HDXPRT 2011 Beta and provide assistance in evaluating the benchmark. Their feedback will help resolve any remaining issues before PT releases the benchmark to the public. The deadline for Beta testing feedback is April 29, 2011.

Participation in the HDXPRT 2011 Beta program is available only to registered members of the HDXPRT Development Community. Membership is open to anyone willing to pay the nominal annual membership fee. Community members have the opportunity to help shape future versions of the benchmark. To register for the HDXPRT 2011 Beta program and receive access to members-only content, go to http://www.hdxprt.com/forum/register.php and complete the registration process.

To see the latest information on the benchmark’s development, visit the official HDXPRT Development Community Web site, http://www.hdxprt.com. The Community also has a presence on Facebook and Twitter.

About HDXPRT
HDXPRT, the High Definition eXperience & Performance Ratings Test, is a software tool for assessing the capabilities of PCs at handling real-world media scenarios and common consumer applications. HDXPRT 2011 is currently planned for release in the second quarter of 2011. It includes tests for popular consumer usage models such as high-definition video transcoding, High Dynamic Range (HDR) photo manipulation, Windows 7 Drag & Drop transcoding for portable media players, and HD Flash video playback.

About Principled Technologies, Inc.
Principled Technologies, Inc. is a leading provider of technology marketing and assessment services. The founders, Mark Van Name and Bill Catchings, have worked together in technology assessment for over 25 years. As journalists, they published over a thousand articles on a wide array of technology subjects. They created and led the Ziff-Davis Benchmark Operation, which developed such industry-standard benchmarks as Ziff Davis Media’s Winstone and WebBench. They have also co-founded or led several other technology testing firms including ZD Labs, eTesting Labs, and VeriTest.

Principled Technologies, Inc. is located in Durham, North Carolina, USA. For more information, please visit http://www.principledtechnologies.com.

Company Contact
Eric Hale
Principled Technologies, Inc.
1007 Slater Road
Suite 300
Durham, NC 27703
ehale@principledtechnologies.com
www.principledtechnologies.com

