
Month: March 2022

Accessing XPRT source code

We recently received a question from a member of the tech press about whether we would be willing to supply them with the WebXPRT 4 source code, along with instructions for setting up a local instance of the benchmark for their internal testbed. We were happy to help, and they are now able to automate WebXPRT 4 runs within their own isolated network.
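
For readers who are curious about what a local instance can involve, here is a minimal, purely illustrative sketch that serves a downloaded copy of the benchmark as static web content using Python’s built-in web server. The directory name and port are assumptions we made up for this example; the actual, supported setup steps are in the instructions we provide along with the source code.

  # Illustrative sketch only: serve a downloaded copy of the benchmark as
  # static content on an isolated testbed network. The directory name and
  # port are assumptions; follow the setup instructions provided with the
  # source code for the supported configuration.
  import functools
  import http.server
  import socketserver

  WEBXPRT_DIR = "webxprt4-local"   # assumed location of the downloaded source
  PORT = 8080                      # any free port on the testbed

  handler = functools.partial(http.server.SimpleHTTPRequestHandler,
                              directory=WEBXPRT_DIR)

  with socketserver.TCPServer(("", PORT), handler) as httpd:
      print(f"Serving the local copy at http://localhost:{PORT}/")
      httpd.serve_forever()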

If you’re a new XPRT tester, you may not be aware that we provide free access to the source code for each of the XPRT benchmarks. Publishing XPRT source code is part of our commitment to making the XPRT development process as transparent as possible. By allowing all interested parties to access and review our source code, we’re encouraging openness and honesty in the benchmarking industry and are inviting the kind of constructive feedback that helps to ensure that the XPRTs continue to contribute to a level playing field.

While XPRT source code is available to the public, our approach to derivative works differs from some open-source models. Traditional open-source models encourage developers to change products and even take them in different directions. Because benchmarking requires a product that remains static to enable valid comparisons over time, we allow people to download the source, but we reserve the right to control derivative works. This discourages a situation where someone publishes an unauthorized version of the benchmark and calls it an “XPRT.”

Accessing XPRT source code is a straightforward process. The source code for CloudXPRT is freely available in our CloudXPRT GitHub repository. If you’d like to download and review the source code for WebXPRT 4 or any of the other XPRTs, or get instructions for how to build one of the benchmarks, all you need to do is contact us at benchmarkxprtsupport@principledtechnologies.com. Your feedback is valuable!

Justin

Chrome OS support for CrXPRT apps ends in June 2022

Last March, we discussed the Chrome OS team’s original announcement that they would be phasing out support for Chrome Apps altogether in June 2021, and would shift their focus to Chrome extensions and Progressive Web Apps. The Chrome OS team eventually extended support for existing Chrome Apps through June 2022, but as of this week, we see no indication that they will further extend support for Chrome Apps published with general developer accounts. If the end-of-life schedule for Chrome Apps does not change in the next few months, both CrXPRT 2 and CrXPRT 2015 will stop working on new versions of Chrome OS at some point in June.

To maintain CrXPRT functionality past June, we would need to rebuild the app completely—either as a Progressive Web App or in some other form. For this reason, we want to reassess our approach to Chrome OS testing, and investigate which features and technologies to include in a new Chrome OS benchmark. Our current goal is to gather feedback and conduct exploratory research over the next few months, and begin developing an all-new Chrome OS benchmark for publication by the end of the year.

While we will discuss ideas for this new Chrome OS benchmark in future blog posts, we welcome ideas from CrXPRT users now. What features or workloads would you like the new benchmark to retain? Would you like us to remove any components from the existing benchmark? Does the battery life test in its current form suit your needs? If you have any thoughts about these questions or any other aspects of Chrome OS benchmarking, please let us know!

Justin

Exploring the WebXPRT 4 results viewer

Now that WebXPRT 4 is live, we want to remind readers about the features of the WebXPRT 4 results viewer. We’re excited about this new tool, which we view as an ongoing project that we will expand and improve over time. The viewer currently has over 100 test results, and we’re just getting started. We’ll continue to actively populate the viewer with the latest PT-curated WebXPRT 4 results for the foreseeable future.

The screenshot below shows the tool’s default display. Each vertical bar in the graph represents the overall score of a single test result, with bars arranged from lowest to highest. To view a single result in detail, the user hovers over a bar; the bar turns white, and a small popup window displays the basic details of that result. When the user clicks to select the highlighted bar, it turns dark blue, and the dark blue banner at the bottom of the viewer displays additional details about that result.

In the example above, the banner shows the overall score (227), the score’s percentile rank (98th) among the scores in the current display, the name of the test device, and basic hardware disclosure information. Users can click the Run info button to see the run’s individual workload scores.
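
As a rough illustration of the math behind that percentile rank, the sketch below shows one common way to compute it: the percentage of currently displayed scores that fall at or below a given result. This is only an example for clarity, not the viewer’s actual code, and the exact formula the viewer uses may differ.

  # Illustration only (not the viewer's code): percentile rank computed as
  # the percentage of currently displayed scores at or below a given score.
  def percentile_rank(score, displayed_scores):
      at_or_below = sum(1 for s in displayed_scores if s <= score)
      return round(100 * at_or_below / len(displayed_scores))

  # A score of 227 that is at or above 98 percent of the scores in the
  # current display would report a percentile rank of 98.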

The viewer includes a drop-down menu to quickly filter results by major device type categories, and a tab that allows users to apply additional filtering options, such as browser type, processor vendor, and result source. The screenshot below shows the viewer after I used the device type drop-down filter to select only laptops.

The screenshot below shows the viewer as I use the filter tab to explore additional filter options, such as browser type.
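
Conceptually, these filters simply narrow the set of displayed results by the attributes reported with each run. The short sketch below shows the idea with a few made-up records; the field names and values are assumptions for illustration only, not the viewer’s actual data format.

  # Illustration only: narrowing a set of results by device type and
  # browser, the way the viewer's drop-down menu and filter tab do.
  # Field names and values are made up for this example.
  results = [
      {"device": "laptop", "browser": "Chrome", "cpu_vendor": "Intel", "score": 227},
      {"device": "phone",  "browser": "Safari", "cpu_vendor": "Apple", "score": 182},
      {"device": "laptop", "browser": "Edge",   "cpu_vendor": "AMD",   "score": 204},
  ]

  # Device type drop-down: keep only laptops
  laptops = [r for r in results if r["device"] == "laptop"]

  # Filter tab: narrow further by browser type
  chrome_laptops = [r for r in laptops if r["browser"] == "Chrome"]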

The viewer also lets users pin multiple specific runs, which is helpful for making side-by-side comparisons. The screenshot below shows the viewer after I pinned four runs and viewed them on the Pinned runs screen.

The screenshot below shows the viewer after I clicked the Compare runs button: the overall and individual workload scores of the pinned runs appear as a table.

We’re excited about the WebXPRT 4 results viewer, and we want to hear your feedback. Are there features you’d really like to see, or ways we can improve the viewer? Please let us know, and send us your latest test results!

Justin

How to submit WebXPRT 4 results for publication

Each new XPRT benchmark release attracts new visitors to our site. Those who haven’t yet run any of our benchmarks may not know how everything works. For those folks, as well as longtime testers who may not be aware of everything the XPRTs have to offer, we like to occasionally revisit the basics here in the blog. Today, we cover the simple process of submitting WebXPRT 4 test results for publication in the WebXPRT 4 results viewer.

Unlike sites that publish all results that users submit, we publish only results—from internal lab testing, user submissions, and reliable tech media sources—that meet our evaluation criteria. Scores must be consistent with general expectations and, for sources outside of our lab, must include enough detailed system information that we can determine whether the score makes sense. Every score in the WebXPRT results viewer and on the general XPRT results page meets these criteria.

Everyone who runs a WebXPRT 4 test is welcome to submit scores for us to consider for publication. The process is quick and easy. At the end of the WebXPRT test run, click the Submit your results button below the overall score, complete the short submission form, and click Submit again. Please be as specific as possible when filling in the system information fields. Detailed device information helps us assess whether individual scores represent valid test runs. The screenshot below shows how the form would look if I submitted a score at the end of a WebXPRT 4 run on my personal system.

After you submit your score, we’ll contact you to confirm how we should display the source. You can choose one of the following:

  • Your first and last name
  • “Independent tester” (for those who wish to remain anonymous)
  • Your company’s name, provided that you have permission to submit the result in its name. If you want to use a company name, please provide a valid company email address.


We will not publish any additional information about you or your company without your permission.

We look forward to seeing your score submissions. If you have suggestions for the WebXPRT 4 results viewer or any other aspect of the XPRTs, let us know!

Justin
