
Search Results for: results viewer

How to submit WebXPRT 4 results for publication

Each new XPRT benchmark release attracts new visitors to our site. Those who haven’t yet run any of our benchmarks may not know how everything works. For those folks, as well as longtime testers who may not be aware of everything the XPRTs have to offer, we like to occasionally revisit the basics here in the blog. Today, we cover the simple process of submitting WebXPRT 4 test results for publication in the WebXPRT 4 results viewer.

Unlike sites that publish every result users submit, we publish only results that meet our evaluation criteria, whether they come from internal lab testing, user submissions, or reliable tech media sources. Scores must be consistent with general expectations and, for sources outside of our lab, must include enough detailed system information that we can determine whether the score makes sense. Every score in the WebXPRT results viewer and on the general XPRT results page meets these criteria.

Everyone who runs a WebXPRT 4 test is welcome to submit scores for us to consider for publication. The process is quick and easy. At the end of the WebXPRT test run, click the Submit your results button below the overall score, complete the short submission form, and click Submit again. Please be as specific as possible when filling in the system information fields. Detailed device information helps us assess whether individual scores represent valid test runs. The screenshot below shows how the form would look if I submitted a score at the end of a WebXPRT 4 run on my personal system.

After you submit your score, we’ll contact you to confirm how we should display the source. You can choose one of the following:

  • Your first and last name
  • “Independent tester” (for those who wish to remain anonymous)
  • Your company’s name, provided that you have permission to submit the result on its behalf. If you want to use a company name, please provide a valid company email address.


We will not publish any additional information about you or your company without your permission.

We look forward to seeing your score submissions. If you have suggestions for the WebXPRT 4 results viewer or any other aspect of the XPRTs, let us know!

Justin

We fixed two bugs affecting the WebXPRT 4 Preview results page

We launched a preview of the WebXPRT 4 results viewer just before the new year and have since published over 75 results from a wide range of devices. We appreciate the results submissions we’ve received from independent testers so far, and will continue to populate the viewer with WebXPRT 4 Preview results from both our own testing and PT-curated external submissions.

If you’ve run the test and have tried to submit results, you may have encountered one or both of the following bugs, depending on the device type you’re testing:

  • You filled out the results submission form, but the Submit button didn’t seem to do anything.
  • The test automatically downloaded the results CSV file multiple times.

We’ve identified the causes of the two bugs, and have instituted fixes. The bug fixes do not affect the benchmark’s workloads or scores. If you tested the WebXPRT 4 Preview and were frustrated by the results submission bugs, we apologize for the inconvenience, and invite you to retry submitting your results.

If you have any questions or comments about the WebXPRT 4 Preview or the results viewer, please feel free to contact us!

Justin

Improvements to the AIXPRT results table

Over the last few weeks, we’ve gotten great feedback about the kinds of data points people are looking for in AIXPRT results, as well as suggestions for how to improve the AIXPRT results viewer. To make it easier for visitors to find what they’re looking for, we’ve made a number of changes:

  • You can now filter results in categories such as framework, target hardware, batch size, and precision, and can set minimum throughput and maximum latency thresholds. When you select a value from a drop-down menu or enter text, the results change immediately to reflect the filter.
  • You can search for variables such as processor vendor or processor speed.
  • The viewer displays eight results per page by default and lets you change this to 16, 48, or Show all.
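To illustrate the idea, the filtering behavior described above can be sketched as a set of predicates combined with AND over the result records. This is a hypothetical sketch only, not the viewer’s actual code; all field names and values below are assumptions:

```python
# Hypothetical sketch of results-table filtering (not the actual viewer code).
# Filters combine with AND; min/max bounds apply to throughput and latency.

def filter_results(results, framework=None, precision=None,
                   min_throughput=None, max_latency=None):
    """Return the results that satisfy every supplied filter."""
    def keep(r):
        if framework is not None and r["framework"] != framework:
            return False
        if precision is not None and r["precision"] != precision:
            return False
        if min_throughput is not None and r["throughput"] < min_throughput:
            return False
        if max_latency is not None and r["latency"] > max_latency:
            return False
        return True
    return [r for r in results if keep(r)]

# Illustrative records with made-up values.
results = [
    {"framework": "TensorFlow", "precision": "int8", "throughput": 410.2, "latency": 18.5},
    {"framework": "OpenVINO",   "precision": "fp16", "throughput": 388.7, "latency": 22.1},
    {"framework": "TensorFlow", "precision": "fp32", "throughput": 150.3, "latency": 41.0},
]
print(filter_results(results, framework="TensorFlow", min_throughput=200))
```

Leaving a filter unset (None) simply skips that predicate, which mirrors how an empty drop-down leaves the table unfiltered.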


The following features, which were already present in the viewer, can help you navigate more efficiently:

  • Click the tabs at the top of the table to switch from ResNet-50 network results to SSD-MobileNet network results.
  • Click the header of any column to sort the data on that variable. One click sorts A-Z and two clicks sort Z-A.
  • Click the link in the Source column to visit a detailed page on that result. The page contains additional test configuration and system hardware information and lets you download results files.


We hope these changes will improve the utility of the results table. We’ll continue to add features to improve the experience. If you have any suggestions, please let us know!

Justin

Sharing results

A few weeks back, I wrote about different types of results from benchmarks. HDXPRT 2011’s primary metric is an overall score. One of the challenges of a score, unlike a metric such as minutes of battery life, is that it is hard to interpret without context. Is 157 a good score? The use of a calibration, or base, system helps a bit, because if that system has a score of 100, then a 157 is definitely better. Still, two scores do not give you a lot of context.
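One common way a calibrated score works, shown here as an assumption rather than HDXPRT’s documented formula, is to normalize a system’s completion time against the base system, which is defined to score 100. The timing values below are made up for illustration:

```python
# Hypothetical illustration of a calibrated score (not HDXPRT's documented
# formula): normalize completion time against a base system defined as 100.

def calibrated_score(base_seconds, system_seconds):
    """Higher is better: a system twice as fast as the base scores 200."""
    return round(100 * base_seconds / system_seconds)

print(calibrated_score(300, 300))  # the base system itself
print(calibrated_score(300, 191))  # a faster system
```

Under this kind of scheme, a 157 means the system finished the work roughly 1.57 times as fast as the base system, which is exactly the context a bare number lacks.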

To help make comparisons easier, we are releasing a set of results from our testing at http://hdxprt.com/hdxprt2011results. With the results viewer we’ve provided, you can sort the results on a variety of fields and filter them for matching text. We’ve included results from our beta testing and our results white papers.

We’ll continue to add results, but we want to invite members of the HDXPRT Development Community to do the same. We would especially like to get any results you have published on your Web sites. Please submit your results using this link: http://www.hdxprt.com/forum/2011resultsubmit. We’ll give them a sanity check and then include them in the results viewer. Thanks!

Bill


WebXPRT can help you choose the right back-to-school tech

For many students, the excitement and anticipation of a new school year are right around the corner! In addition to being an opportunity to dive into new subjects, meet new people, and make progress toward learning goals, the back-to-school season often provides students and teachers with a chance to shop for new technology to meet their needs in the coming year. The tech marketplace can be confusing, however, with a slew of brands, options, and claims competing for back-to-school dollars.

Never fear: WebXPRT can help!

Whether you’re shopping for a new phone, tablet, Chromebook, laptop, or desktop, WebXPRT can provide industry-trusted performance scores that can help give you confidence that you’re making a smart purchasing decision.

And in this age of AI, WebXPRT performance scores do account for specific AI tasks. The benchmark includes timed AI tasks in two workloads, which reflect the types of light browser-side inference tasks that are now quite common in consumer-oriented web applications and extensions. You can read more about that in previous blog entries on the “Organize Album using AI” and “Encrypt Notes and OCR Scan” workloads.

To see how devices stack up, the WebXPRT 4 results viewer is a good place to start. The viewer displays the WebXPRT 4 scores of over 975 devices—including many of the hottest new releases—and we’re adding more scores all the time. To learn more about the viewer’s capabilities and how you can use it to compare devices, check out this blog post.

Another way to find WebXPRT scores is to go directly to the tech press. If you’re considering a popular device, there’s a good chance that a recent tech review includes a WebXPRT score for that device. There are two quick ways to find these reviews: You can either (1) search for “WebXPRT” on a tech review site or (2) use a search engine and enter the device name and WebXPRT as search terms, such as “Lenovo ThinkPad X1 Carbon” and “WebXPRT.”

Here are a few recent articles and tech reviews that used WebXPRT:


If you’re excited about the opportunity to buy new tech for school, WebXPRT can provide you with the information you need to make more confident tech purchases. As this new school year begins, we hope you’ll find the data you need on our site or in an XPRT-related tech review. Feel free to contact us if you have any questions about WebXPRT or WebXPRT scores!

Justin

The XPRTs: What would you like to see in 2025?

If you’re a new follower of the XPRT family of benchmarks, you may not be aware of one of the characteristics of the XPRTs that sets them apart from many benchmarking efforts—our openness and commitment to valuing the feedback of tech journalists, lab engineers, and anyone else who uses the XPRTs on a regular basis. That feedback helps us to ensure that as the XPRTs grow and evolve, the resources we offer will continue to meet the needs of those who use them.

In the past, user feedback has influenced specific aspects of our benchmarks, such as the length of test runs, UI features, results presentation, and the addition or subtraction of specific workloads. More broadly, we have also received suggestions for entirely new XPRTs and ways we might target emerging technologies or industry use cases.

As we look forward to what’s in store for the XPRTs in 2025, we’d love to hear your ideas about new XPRTs—or new features for existing XPRTs. Are you aware of hardware form factors, software platforms, new technologies, or prominent applications that are difficult or impossible to evaluate using existing performance benchmarks? Should we incorporate additional or different technologies into existing XPRTs through new workloads? Do you have suggestions for ways to improve any of the XPRTs or XPRT-related tools, such as results viewers?

We’re especially interested in your thoughts about the next steps for WebXPRT. If our recent blog posts about the potential addition of an AI-focused auxiliary workload, what a WebXPRT battery life test would entail, or possible WebAssembly-based test scenarios have piqued your interest, we’d love to hear your thoughts!

We’re genuinely interested in your answers to these questions and any other ideas you have, so please feel free to contact us. We look forward to hearing your thoughts and working together to figure out how they could help shape the XPRTs in 2025!

Justin

