
Tag Archives: BenchmarkXPRT Development Community

More information about the CloudXPRT results submission process

Earlier this month, we discussed the possibility of using a periodic results submission process for CloudXPRT instead of the traditional rolling publication process that we’ve used for the other XPRTs. We’ve received some positive responses to the idea, and while we’re still working out some details, we’re ready to share the general framework of the process we’re planning to use.

  • We will establish a results review group, which only official BenchmarkXPRT Development Community members can join.
  • We will update the CloudXPRT database with new results once a month, on a pre-published schedule.
  • Two weeks before each publication date, we will stop accepting submissions for consideration for that review cycle.
  • One week before each publication date, we will send an email to the results review group that includes the details of that month’s submissions for review.
  • The results review group will serve as a sanity check process and a forum for comments on the month’s submissions, but we reserve the right of final approval for publication.
  • We will not restrict publishing results outside of the monthly review cadence, but we will not automatically add those results to the results database.
  • We may add externally published results to our database, but will do so only after vetting, and only on the designated day each month.
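
To make the cadence concrete, here's a quick Python sketch of how each cycle's key dates relate to its publication date. The dates below are hypothetical placeholders; we'll pre-publish the real schedule.

  from datetime import date, timedelta

  # Hypothetical publication dates; the real schedule will be pre-published.
  publication_dates = [date(2020, 10, 26), date(2020, 11, 23), date(2020, 12, 28)]

  for pub_date in publication_dates:
      submission_cutoff = pub_date - timedelta(weeks=2)  # submissions close two weeks out
      review_email = pub_date - timedelta(weeks=1)       # review group email goes out one week out
      print(f"Publish {pub_date}: cutoff {submission_cutoff}, review email {review_email}")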

Our goal is to strike a balance between letting the tech press, vendors, and other testers publish CloudXPRT results on their own schedules, and building a curated results database in which OEMs and other parties can compete for the best results.

We’ll share more details about the review group, submission dates, and publication dates soon. Do you have questions or comments about the new process? Let us know what you think!

Justin

Our results database, your resource

Testers who have started using the XPRT benchmarks recently may not know about one of the free resources we offer. The XPRT results database currently holds more than 2,400 test results from over 90 sources, including major tech review publications around the world, OEMs, and independent testers. It offers a wealth of current and historical performance data across all the XPRT benchmarks and hundreds of devices.

We update the results database several times a week, adding selected results from our own internal lab testing, end-of-test user submissions, and reliable tech media sources. (After you run one of the XPRTs, you can choose to submit the results, but they don’t automatically appear in the database.)

Before adding a result, we evaluate whether the score makes sense and is consistent with general expectations, which we can do only when we have sufficient system information details. For that reason, we encourage testers to disclose as much hardware and software information as possible when publishing or submitting a result.
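
As a purely illustrative sketch (not our actual vetting code), a first-pass sanity check might require a minimum set of disclosure fields and compare a submitted score against the prior scores for the same processor:

  from statistics import mean

  REQUIRED_FIELDS = {"processor", "memory", "os", "browser"}  # hypothetical disclosure set

  def passes_first_pass(submission, prior_scores, tolerance=0.30):
      # Without sufficient system information, we can't evaluate the score.
      if not REQUIRED_FIELDS <= submission.keys():
          return False
      prior = prior_scores.get(submission["processor"], [])
      if not prior:
          return True  # no history for this processor; falls to manual review
      expected = mean(prior)
      return abs(submission["score"] - expected) <= tolerance * expected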

We encourage visitors to our site to explore the XPRT results database. There are three primary ways to do so. The first is by visiting the main BenchmarkXPRT results browser, which displays results entries for all of the XPRT benchmarks in chronological order (see the screenshot below). Users can narrow the results by selecting a benchmark from the drop-down menu and can type values, such as vendor or the name of a tech publication, into the free-form filter field. For results we produced in our lab, clicking “PT” in the Source column takes you to a page with additional disclosure information for the test system. For sources outside our lab, clicking the source name takes you to the original article or review that contains the result.

The second way to access our published results is by visiting the results page for each individual XPRT benchmark. Go to the page of the benchmark you’re interested in, and look for the blue View Results button. Clicking it takes you to a page that displays results for only that benchmark. You can use the free-form filter on that page to narrow the results, and can use the Benchmarks drop-down menu to jump to the other individual XPRT results pages.

The third way to view information in our results database is with the WebXPRT Processor Comparison Chart. When we publish a new WebXPRT result, the score automatically appears in the processor comparison chart as well. For each processor, the chart shows a bar representing the average score. Mousing over a bar displays a popup indicating the number of WebXPRT results we currently have for that processor, and clicking the bar lets you view those results. You can change the number of results the chart displays on each page, and you can use the drop-down menu to toggle between the WebXPRT 3 and WebXPRT 2015 charts.
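
The chart's per-processor bars amount to averaging the published scores for each processor; here's a minimal sketch with made-up numbers:

  from collections import defaultdict

  # Hypothetical WebXPRT 3 scores, keyed by processor name.
  results = [
      ("Intel Core i5-8250U", 178), ("Intel Core i5-8250U", 182),
      ("Intel Core i7-8550U", 201),
  ]

  scores_by_cpu = defaultdict(list)
  for cpu, score in results:
      scores_by_cpu[cpu].append(score)

  for cpu, scores in scores_by_cpu.items():
      # The bar shows the average; the mouse-over popup shows len(scores).
      print(f"{cpu}: average {sum(scores) / len(scores):.0f} across {len(scores)} results")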

We hope you’ll take some time to browse the information in our results database. We welcome your feedback about what you’d like to see in the future and suggestions for improvement. Our database contains the XPRT scores that we’ve gathered, but we publish them as a resource for you. Let us know what you think!

Justin

The CrXPRT 2 Community Preview is available!

We’re excited to announce that the CrXPRT 2 Community Preview (CP) is now available! BenchmarkXPRT Development Community members can access the preview using a direct link posted on the CrXPRT tab in the XPRT Members’ Area (login required), where they will also find the CrXPRT 2 CP user manual.

You can find more information about the key differences between CrXPRT 2015 and CrXPRT 2 in last week’s blog entry. During the preview period, testers are welcome to publish CP test scores, but note that CrXPRT 2 overall performance scores and battery-life measurements are not comparable to CrXPRT 2015 scores.

If you have any questions about CrXPRT 2 or joining the community, please let us know!

Justin

The XPRTs in 2019: Looking back on an exciting and productive year

2019 is winding down, and we want to take this opportunity to review another exciting and productive year for the BenchmarkXPRT Development Community. Readers of our newsletter are familiar with the stats and updates we post in each month’s mailing, but we know that not all our blog readers receive the newsletter, so we’ve compiled the highlights below.

Trade shows
Earlier this year, Justin attended CES in Las Vegas and Mark traveled to MWC Barcelona. These shows help us keep up with the latest industry trends and gather insights that help lay the groundwork for XPRT development in the years ahead.

Benchmarks
In the past year, we released MobileXPRT 3, HDXPRT 4, and AIXPRT, our new AI benchmark tool that helps you evaluate a system’s machine learning inference performance. There’s much more to come in 2020 with AIXPRT and several other projects, so expect more news about benchmark development early in the year.

Web mentions
In 2019 so far, journalists, advertisers, and analysts have referenced the XPRTs over 5,000 times, including mentions in more than 190 articles and 1,350 device reviews. This represents a more than 50% increase over 2018.

Downloads and confirmed runs
To date, we’ve had more than 24,800 benchmark downloads and 153,000 confirmed runs in 2019, increases of more than 8% and 10%, respectively, over 2018. Within the last month, our most popular benchmark, WebXPRT, passed the 500,000-run milestone! WebXPRT continues to be an industry-standard performance benchmark upon which OEM labs, vendors, and leading tech press outlets rely.

XPRT Tech Spotlight
We put 47 new devices in the XPRT Tech Spotlight throughout the year and published updated back-to-school, Black Friday, and holiday showcases to help buyers compare devices.

Media and interactive tools
We published a new XPRTs around the world infographic and an interactive AIXPRT installation package selector tool. We’ve received a lot of positive feedback about the tool. We encourage you to give it a try if you’re curious about AIXPRT but aren’t sure how to get started.

We’re thankful for everyone who used the XPRTs, joined the community, and sent questions and suggestions throughout 2019. This will be our last blog post for 2019, but there’s much more to come in 2020, including some exciting new developments. Stay tuned in early January for updates!

Justin

The MobileXPRT 3 source code is now available

We’re excited to announce that the MobileXPRT 3 source code is now available to BenchmarkXPRT Development Community members!

Download the MobileXPRT 3 source here (login required).

We’ve also posted a download link on the MobileXPRT tab in the Members’ Area, where you will find instructions for setting up and configuring a local instance of MobileXPRT 3.

As part of our community model for software development, source code for each of the XPRTs is available to anyone who joins the community. If you’d like to review XPRT source code, but haven’t yet joined the community, we encourage you to join! Registration is quick and easy, and if you work for a company or organization with an interest in benchmarking, you can join the community for free. Simply fill out the form with your company e-mail address and select the option to be considered for a free membership. We’ll contact you to verify the address and then activate your membership.

If you have any other questions about community membership or XPRT source code, feel free to contact us. We look forward to hearing from you!

Justin

Principled Technologies and the BenchmarkXPRT Development Community release a preview of AIXPRT, a tool designed to help testers evaluate machine learning performance

Durham, NC, March 4 — Principled Technologies and the BenchmarkXPRT Development Community have released the AIXPRT Community Preview. AIXPRT is a free tool that makes it easier to evaluate a system’s machine learning inference performance by running several common image-classification workloads.

The AIXPRT Community Preview build includes support for the Intel® OpenVINO™, TensorFlow™, and TensorFlow with NVIDIA® TensorRT™ toolkits to run image-classification workloads with ResNet-50 and SSD-MobileNet v1 networks. The test reports results at FP32, FP16, and INT8 levels of precision.
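
To give a feel for the kind of work AIXPRT times, here is a minimal single-image ResNet-50 inference sketch in TensorFlow. It uses the stock Keras model and a random input tensor, and it is not AIXPRT's own test harness:

  import time

  import numpy as np
  import tensorflow as tf

  # Stock ImageNet-trained ResNet-50 (downloads weights on first run).
  model = tf.keras.applications.ResNet50(weights="imagenet")

  # Random stand-in for a real 224x224 RGB photo.
  image = np.random.uniform(0, 255, (1, 224, 224, 3)).astype("float32")
  image = tf.keras.applications.resnet50.preprocess_input(image)

  start = time.perf_counter()
  predictions = model.predict(image)
  print(f"Top class index: {predictions.argmax()}; latency: {time.perf_counter() - start:.3f} s")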

“This AIXPRT preview build is the next step towards our goal of making it easier for folks to evaluate how well systems handle machine learning tasks,” said Bill Catchings, co-founder of Principled Technologies, which administers the BenchmarkXPRT Development Community. “We invite all industry experts and interested parties to try out the AIXPRT Community Preview and send us their feedback.”

The AIXPRT Community Preview is available to anyone with a GitHub account who is interested in participating. To request access, please contact the BenchmarkXPRT Development Community by sending a message to BenchmarkXPRTsupport@PrincipledTechnologies.com.

AIXPRT is part of the BenchmarkXPRT suite of performance evaluation tools, which includes WebXPRT, MobileXPRT, TouchXPRT, CrXPRT, HDXPRT, and BatteryXPRT. The XPRTs help users get the facts before they buy, use, or evaluate tech products such as computers, tablets, and phones.

To learn more about AIXPRT, go to www.AIXPRT.com. To learn more about the BenchmarkXPRT Development Community, go to www.BenchmarkXPRT.com.

About Principled Technologies, Inc.
Principled Technologies, Inc. is a leading provider of technology marketing, as well as learning and development services. It administers the BenchmarkXPRT Development Community.

Principled Technologies, Inc. is located in Durham, North Carolina, USA. For more information, please visit www.PrincipledTechnologies.com.

Company Contact
Justin Greene
BenchmarkXPRT Development Community
Principled Technologies, Inc.
1007 Slater Road, Ste. 300
Durham, NC 27704

BenchmarkXPRTsupport@PrincipledTechnologies.com
