BenchmarkXPRT Blog banner

Knowing when to wait

Mark mentioned in his blog entry a few weeks ago that waiting sucks.  I think we can all agree with that sentiment.  However, an experience I had while in Taipei for Computex made me reevaluate that thinking a bit.  

I went jogging one morning in a park near my hotel.  It was relatively small, just a quarter mile around the pond that took up most of it.  I was one of only a couple of people jogging, but the park was full of people.  Some were walking around the pond.  There were also groups doing some form of Tai Chi in clearings around the pond.  The path I was on was narrow.  At times, there was no way to get around the people walking without running into the ones doing Tai Chi.  That in turn meant running in place at times.  Or, put another way, waiting.  

Everyone was polite at the encounters, but the contrast between me jogging and the folks doing Tai Chi was stark.  I wanted to run my miles as quickly as possible.  Those doing Tai Chi were decidedly not in a rush.  They were doing their exercises together with others.  The goal was to do them at the proper pace in the proper way.  

That got me to thinking about waiting on my computer.  (Hey, time to think is one of the main reasons I exercise!)  There are times when waiting for a computer infuriates me.  Other times, however, the computer is fast enough.  Or even too fast, like when I’m trying to scroll down to the right cell in Excel and it jumps down to a whole screen full of empty cells.  This phenomenon, of course, relates to benchmarks.  Benchmarks should measure those operations that are slow enough to hurt productivity or are downright annoying.  There is less value in measuring operations that users don’t have to wait on. 

Have you had any thoughts about what makes a good benchmark?  Even if you weren’t exercising when you had them, please share them with the community. 

Bill

Comment on this post in the forums

Home sweet home

After a long set of flights back from Computex in Taipei, I’m finally home in North Carolina. Unfortunately, I’m still not quite sure what time zone I’m in!

While awake in the middle of the night, I’ve been thinking about some of the things I saw at Computex.  While I was there, it seemed like a jumble of notebooks, power supplies, gaming rigs, motherboards, cases, Hello Kitty accessories, and some things I still can’t identify.  Many of the things I saw were not brand new, but it was my first chance to see them up close.  Some involved technologies still on the horizon, like Intel’s Ultrabook concept and Microsoft’s Windows 8.  I also saw all sorts of combinations of phones, 4G, and other devices.

One thing that stood out to me was the number and variety of tablets.  They came in a range of sizes (and screen resolutions).  There were quite a few vendors, including some I would not have expected but was pleasantly surprised to encounter, like Viewsonic and Shuttle.  The OS choices included Android, WebOS, and MeeGo.  ASUS had a couple of interesting hybrid approaches, such as the Eee Pad Transformer and the Padfone.  The former is a 10.1-inch tablet that plugs into a keyboard.  The Padfone is a smartphone that plugs into the back of a larger (10.1-inch) touch screen to act as a tablet.

All of these tablet choices, as well as the iPad they all must compete against, left me wondering how to choose among them.  Part of the choice comes down to size and features.  As always, however, performance plays a key role.  My tolerance for waiting on a tablet is even lower than it is for waiting on my PC.  The problem is how to make valid comparisons across such a wide range of platforms.  I’d love to hear what you think about performance testing on tablets.  Is it useful?  What are the best ways to accomplish it?

Finally, thanks to all the folks who came by and visited our suite at Computex.  I enjoyed getting the chance to meet some of the members of the HDXPRT Development Community.  And, hopefully, I convinced more folks to join.

Bill


Computex – Taipei

It’s hot and muggy here in Taipei. Just like home in North Carolina!

Weather aside, Taipei is definitely not Raleigh. Taipei is a big city with tall buildings. Right next to the hotel is Taipei 101, which was the world’s tallest building for a few years. The streets are full of cars and motor scooters. People here walk quickly and purposefully. All of Computex seems to be filled with similar purpose and drive. It reminds me quite a bit of COMDEX in Vegas in its prime. Technology has taken over a city only too glad to embrace it. In next week’s blog, I’ll let you know about some of the cool things on display here.

I’ve had some interesting HDXPRT meetings so far. One of them helped me remember some of the non-technical challenges of a successful benchmark. We’ve mentioned benchmark challenges like reliability (it needs to run when you need it to run) and repeatability (it needs to give similar results—within a few percent—each time you run it). I discussed with folks from one PC performance Web site the importance of a benchmark having some permanence. If the benchmark changes too frequently, you can’t compare the current product with the one you reviewed a couple of months ago. With HDXPRT, our goal is an annual cycle. That should allow comparing to older results while still keeping the benchmark current.
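The repeatability requirement above, similar results within a few percent on every run, can be sketched as a simple check. This is only an illustration: the scores and the 3% threshold are hypothetical, not actual HDXPRT values.

```python
def percent_spread(scores):
    """Return the max-min spread of the scores as a percentage of their mean."""
    mean = sum(scores) / len(scores)
    return (max(scores) - min(scores)) / mean * 100

# Hypothetical benchmark scores from three runs of the same system.
runs = [412.0, 405.5, 409.8]

spread = percent_spread(runs)
print(f"Run-to-run spread: {spread:.1f}%")

# "Within a few percent" -- 3% is an arbitrary threshold for this sketch.
if spread > 3.0:
    print("Warning: results vary too much to compare reliably.")
```

A spread under the threshold means reviewers can treat differences between systems as real rather than run-to-run noise.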

If you’re here in Taipei for Computex, please come on by the Hyatt. We can talk about HDXPRT, benchmarks in general, or what you’d most like to see in the future of performance evaluation. If nothing else, come by and escape the humidity! Drop us an email at hdxprt_computex@principledtechnologies.com to set up a time to come on over.

Bill


HDXPRT 2011 Media Benchmark Now Available

The HDXPRT Development Community, which Principled Technologies (PT) administers, is pleased to announce the release of the HDXPRT 2011 benchmark.

HDXPRT 2011 is a benchmark for evaluating the capabilities of PCs in consumer digital media usages.

PT will ship the release discs to all registered HDXPRT Development Community members. The public can access the benchmark via direct download from the official Community Web site, http://www.hdxprt.com.

The official global HDXPRT 2011 launch will take place in early June during Computex Taipei. Bill Catchings, CTO of PT, will lead informational seminars about the benchmark and its potential to shape the way we measure the performance of PCs manipulating consumer digital media.

Members of the HDXPRT Development Community significantly influenced the development of the benchmark by providing feedback on the initial design specifications and participating in Beta testing.

Visit the official HDXPRT Development Community Web site, http://www.hdxprt.com, to learn how you can participate in the development of future versions of HDXPRT and stay up to date on the latest information regarding the benchmark. The Community also has a presence on Facebook and Twitter.

About HDXPRT
HDXPRT, the High Definition eXperience & Performance Ratings Test, is a software tool for evaluating the capabilities of PCs at handling real-world digital media scenarios and common consumer applications. It includes tests for popular consumer usage models such as high-definition video transcoding, high dynamic range (HDR) photo manipulation, Windows 7 Drag and Drop transcoding for portable media players, and HD Flash video playback.

About Principled Technologies, Inc.
Principled Technologies, Inc. is a leading provider of technology marketing and assessment services. The founders, Mark Van Name and Bill Catchings, have worked together in technology assessment for over 25 years. As journalists, they published over a thousand articles on a wide array of technology subjects. They created and led the Ziff-Davis Benchmark Operation, which developed such industry-standard benchmarks as Ziff Davis Media’s Winstone and WebBench. They have also co-founded or led several other technology testing firms, including ZD Labs, eTesting Labs, and VeriTest.

Principled Technologies, Inc. is located in Durham, North Carolina, USA. For more information, please visit http://www.principledtechnologies.com.

Company Contact
Eric Hale
Principled Technologies, Inc.
1007 Slater Road
Suite 300
Durham, NC 27703
ehale@principledtechnologies.com
www.principledtechnologies.com

Waiting sucks

You know it does.  Time is the most precious commodity, the one thing you can never get back.  So when someone or something makes you wait, it sucks.

It particularly sucks when you have to wait on your PC.  It’s your computer, after all, and it should do the work and be quick about it.  For many tasks, it is quick, almost instantaneous.  Some, though, require so much work that the computer can spend a lot of time doing them, leaving you waiting. Tasks that involve working with different types of media often fall into that category.

Which is exactly why we have HDXPRT.

It gives you a way to compare how long different PCs require to perform some common media-manipulation tasks.  Because those times can be significant—sometimes many seconds, but also sometimes many minutes—HDXPRT can give you valuable information that you can factor into your PC buying plans.

After all, the faster a PC is at this sort of work, the less time you’ll spend waiting on it—and that’s a good thing.

Mark Van Name


Top 5 reasons for meeting us at Computex in Taipei

As I’ve mentioned before, Bill Catchings from PT will be at the upcoming Computex show in Taipei to debut HDXPRT 2011. At the same time, back home in North Carolina we’ll be mailing copies of the benchmark DVDs to all the members of the HDXPRT Development Community.

If you’re one of the lucky folks who gets to attend Computex, we’d love it if you would come by Bill’s room in the Hyatt (we’ll publicize the room number as soon as we know it), see the benchmark in action, and give us your thoughts about it. I know the show is huge and full of attractions, so I thought I’d give you the top five reasons you ought to make room in your schedule to visit with us.

5. Free snacks! We don’t know what they are yet, or even how we’ll persuade the hotel to let us have them, but we’re committed to providing something to quench your thirst and something to quell your hunger.

4. A break from the crowds. Not only do you get to sit, drink, eat, and see a great new benchmark, you get to do so in the quiet and luxury of a Taipei hotel suite. No more bumping shoulders with fellow show attendees or fighting to get to a place quiet enough that you can talk; in that room, you can relax.

3. You can affect the industry! The support for HDXPRT is growing. More and more organizations are using it. We don’t just want to show it to you; we want you to tell us what you think about it. Your opinions count, and they could help drive the design of the next version of the benchmark, HDXPRT 2012. Yeah, that’s right: the one in development isn’t out, and I’m already talking about the next one. Sue me: I like to live on the edge.

2. You don’t want to make Bill cry. Imagine him, sitting alone in the room, laptop humming, ready to demonstrate this cool new testing tool, and no one to keep him company. His sadness would be so unbearable that I can’t bear to think of what he might do. You can’t let that happen.

1. It’s way cooler to get your HDXPRT DVDs in person! That’s right: Bill’s not just going to show you the benchmark, he’s going to give you your very own copy! He’ll probably shake your hand, too, and thank you for coming. Admit it: that’s cooler than getting it in the mail (which is also pretty darn good—and which will happen to you if you join the HDXPRT Development Community).

Mark Van Name

