

How to use the WebXPRT language options

In September, the Chinese tech review site KoolCenter published a review of the ASUS Mini PC PN51 that included a screenshot of the device’s WebXPRT 4 test result screen. The screenshot showed that the testers had enabled the WebXPRT Simplified Chinese UI. Users can choose from three language options in the WebXPRT 4 UI: Simplified Chinese, German, and English. We included Simplified Chinese and German because of the large number of test runs we see from China and Central Europe. We wanted to make testing a little easier for users who prefer those languages, and are glad to see people using the feature.

Changing languages in the UI is straightforward. Locate the Change Language? prompt under the WebXPRT 4 logo at the top of the Start screen, and click or tap the arrow beside it. When the drop-down menu appears, select the language you want. The Start screen switches to that language, and the in-test workload headers and the results screen also appear in your chosen language.

The screenshots below show the Change Language? drop-down menu and how the Start screen appears when you select Simplified Chinese or German. Be aware that if you have a translation extension installed in your browser, the extension may override your selection and revert the UI to the default English. You can avoid this conflict by temporarily disabling the translation extension for the duration of your WebXPRT testing.
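If you're curious about what a switch like this involves under the hood, the sketch below shows one common way a web page can apply a language selection on the client side. It's a generic TypeScript illustration, not WebXPRT's actual code; the element IDs, data attributes, and translated strings are placeholders we made up for the example.

  // A minimal sketch of a client-side language switcher, assuming hypothetical
  // element IDs, data-i18n attributes, and placeholder translations.
  type Locale = "en" | "de" | "zh-CN";

  const strings: Record<Locale, Record<string, string>> = {
    "en":    { startButton: "Start",   changeLanguage: "Change Language?" },
    "de":    { startButton: "Starten", changeLanguage: "Sprache ändern?" },
    "zh-CN": { startButton: "开始",     changeLanguage: "更改语言？" },
  };

  function applyLocale(locale: Locale): void {
    // Announce the page language so browsers and extensions see the switch.
    document.documentElement.lang = locale;
    // Swap every tagged string for its translation.
    document.querySelectorAll<HTMLElement>("[data-i18n]").forEach((el) => {
      const key = el.dataset.i18n!;
      const text = strings[locale][key];
      if (text) el.textContent = text;
    });
  }

  // Hypothetical drop-down wiring.
  document.getElementById("language-select")?.addEventListener("change", (e) => {
    applyLocale((e.target as HTMLSelectElement).value as Locale);
  });

Browser translation features generally infer page language from markup and content like this, which is one reason an aggressive translation extension can end up overriding a page's own language handling.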

If you have any questions about WebXPRT’s language options, please let us know!

Justin

The WebXPRT 4 results calculation white paper is now available

Last week, we published the Exploring WebXPRT 4 white paper. The paper describes the design and structure of WebXPRT 4, including detailed information about the benchmark’s harness, HTML5 and WebAssembly capability checks, and the structure of the performance test workloads. This week, to help WebXPRT 4 testers understand how the benchmark calculates results, we’ve published the WebXPRT 4 results calculation and confidence interval white paper.

The white paper explains the WebXPRT 4 confidence interval, how it differs from typical measures of benchmark variability, and the formulas the benchmark uses to calculate the individual workload scenario scores and the overall score. The paper also provides an overview of the statistical techniques WebXPRT uses to translate raw timings into scores.

To supplement the white paper’s discussion of the results calculation process, we’ve also published a results calculation spreadsheet that shows the raw data from a sample test run and reproduces the calculations WebXPRT uses to produce workload scores and the overall score.
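If you'd like a feel for the general shape of these calculations before digging into the paper, here's a simplified sketch: it normalizes raw timings against a calibration baseline, combines subscores with a geometric mean, and computes a 95 percent confidence interval with a normal approximation. The baseline timings, workload names, and scaling constant are placeholders we invented for the example; the white paper and spreadsheet remain the authoritative description of the real formulas.

  // A simplified, hypothetical illustration of turning raw workload timings into
  // scores. The calibration times, scaling constant, and z-value are placeholders;
  // they are not the values WebXPRT 4 uses.
  const CALIBRATION_MS: Record<string, number> = {
    photoEnhancement: 2000, // hypothetical baseline timings
    organizeAlbum: 1500,
    salesGraphs: 1200,
  };

  function workloadScore(workload: string, iterationTimesMs: number[]): number {
    const mean = iterationTimesMs.reduce((a, b) => a + b, 0) / iterationTimesMs.length;
    return (CALIBRATION_MS[workload] / mean) * 100; // faster than baseline => higher score
  }

  function overallScore(subscores: number[]): number {
    // A geometric mean keeps one dominant workload from swamping the others.
    const logSum = subscores.reduce((a, s) => a + Math.log(s), 0);
    return Math.exp(logSum / subscores.length);
  }

  function confidenceInterval95(samples: number[]): [number, number] {
    const n = samples.length;
    const mean = samples.reduce((a, b) => a + b, 0) / n;
    const variance = samples.reduce((a, b) => a + (b - mean) ** 2, 0) / (n - 1);
    const halfWidth = 1.96 * Math.sqrt(variance / n); // normal approximation
    return [mean - halfWidth, mean + halfWidth];
  }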

The paper is available on WebXPRT.com and on our XPRT white papers page. If you have any questions about the WebXPRT results calculation process, please let us know!

Justin

The Exploring WebXPRT 4 white paper is now available

This week, we published the Exploring WebXPRT 4 white paper. It describes the design and structure of WebXPRT 4, including detailed information about the benchmark’s harness, HTML5 and WebAssembly (WASM) capability checks, and changes we’ve made to the structure of the performance test workloads. We explain the benchmark’s scoring methodology, how to automate tests, and how to submit results for publication. The white paper also includes information about the third-party functions and libraries that WebXPRT 4 uses during the HTML5 and WASM capability checks and performance workloads.
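As a point of reference, capability checks of this general kind are usually straightforward feature probes. The snippet below shows a few standard browser feature checks in TypeScript; it illustrates the general technique only and is not WebXPRT 4's actual test code, which the white paper describes.

  // Generic browser capability checks of the kind a web benchmark might run
  // before starting its workloads. These are standard feature probes, not
  // WebXPRT 4's own checks.
  function supportsWebAssembly(): boolean {
    return typeof WebAssembly === "object" &&
           typeof WebAssembly.instantiate === "function";
  }

  function supportsCanvas2D(): boolean {
    const canvas = document.createElement("canvas");
    return canvas.getContext("2d") !== null;
  }

  function supportsWebWorkers(): boolean {
    return typeof Worker !== "undefined";
  }

  const report = {
    webAssembly: supportsWebAssembly(),
    canvas2d: supportsCanvas2D(),
    webWorkers: supportsWebWorkers(),
  };
  console.log("Capability check:", report);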

The Exploring WebXPRT 4 white paper promotes the high level of transparency and disclosure that is a core value of the BenchmarkXPRT Development Community. We’ve always believed that transparency builds trust, and trust is essential for a healthy benchmarking community. That’s why we involve community members in the benchmark development process and disclose how we build our benchmarks and how they work.

You can find the paper on WebXPRT.com and our XPRT white papers page. If you have any questions about WebXPRT 4, please let us know, and be sure to check out our other XPRT white papers.

Justin

We’ve moved WebXPRT Singapore to a new hosting environment

When we first released WebXPRT 2013, some users in mainland China reported slow download times when running the benchmark. In response, we set up a mirror host site in Singapore to facilitate WebXPRT testing in China and other East Asian countries. We continued this practice with subsequent WebXPRT versions, and currently offer Singapore-based instances of WebXPRT 4, WebXPRT 3, and WebXPRT 2015.

Until this past month, we used an Amazon Web Services (AWS) EC2-Classic environment to host the Singapore mirror site. Because Amazon retired the EC2-Classic environment, we had to migrate each of the WebXPRT Singapore instances to a new AWS Virtual Private Cloud (VPC) environment.  
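We won't walk through our exact migration steps here, but for readers unfamiliar with the process, the usual EC2-Classic-to-VPC pattern is to capture the old instance as an image and launch that image into a subnet of the new VPC. The sketch below outlines that pattern with the AWS SDK for JavaScript v3; the image name, instance type, and IDs are hypothetical, and this is not our migration script.

  // A rough sketch of the general EC2-Classic-to-VPC migration pattern:
  // image the old instance, then launch that image into a VPC subnet.
  import { EC2Client, CreateImageCommand, RunInstancesCommand } from "@aws-sdk/client-ec2";

  const ec2 = new EC2Client({ region: "ap-southeast-1" }); // Singapore region

  async function migrateToVpc(classicInstanceId: string, vpcSubnetId: string) {
    // Capture the existing instance as an AMI.
    const image = await ec2.send(new CreateImageCommand({
      InstanceId: classicInstanceId,
      Name: "webxprt-singapore-migration", // hypothetical image name
    }));

    // Launch the image into a subnet of the new VPC.
    // (In practice, wait for the AMI to reach the "available" state first.)
    await ec2.send(new RunInstancesCommand({
      ImageId: image.ImageId,
      InstanceType: "t3.medium", // hypothetical size
      MinCount: 1,
      MaxCount: 1,
      SubnetId: vpcSubnetId,
    }));
  }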

We do not expect the new environment to affect WebXPRT Singapore testing or results, and have not yet observed any significant differences in WebXPRT performance scores while testing on the new site. If you have a different experience when testing on the new site or encounter interruptions when trying to access the test, please let us know!

Justin

WebXPRT passes the million-run milestone!

We’re excited to see that users have successfully completed over 1,000,000 WebXPRT runs! If you’ve run WebXPRT in any of the 924 cities and 81 countries from which we’ve received complete test data—including newcomers Bahrain, Bangladesh, Mauritius, the Philippines, and South Korea—we’re grateful for your help. We could not have reached this milestone without you!

As the chart below illustrates, WebXPRT use has grown steadily since the debut of WebXPRT 2013. On average, we now record more WebXPRT runs in one month than we recorded in the entirety of our first year. With over 104,000 runs so far in 2022, that growth is continuing.

For us, this moment represents more than a numerical milestone. Developing and maintaining a benchmark is never easy, and a cross-platform benchmark that runs on a wide variety of devices poses an additional set of challenges. For such a benchmark to succeed, developers need not only technical competence but also the trust and support of the benchmarking community. WebXPRT is now in its ninth year, and its consistent year-over-year growth tells us that the benchmark continues to hold value for manufacturers, OEM labs, the tech press, and end users like you. We see it as a sign of trust that folks repeatedly return to the benchmark for reliable performance metrics. We’re grateful for that trust, and for everyone who has contributed to the WebXPRT development process over the years.

We’ll have more to share related to this exciting milestone in the weeks to come, so stay tuned to the blog. If you have any questions or comments about WebXPRT, we’d love to hear from you!

Justin

Helpful tips for WebXPRT 4 results submission

Back in March, we discussed the WebXPRT 4 results submission process and reminded readers that everyone who runs a WebXPRT 4 test is welcome to submit scores for us to consider for publication in the WebXPRT 4 results viewer. Unlike sites that publish every result that users submit, we publish only results that meet our evaluation criteria. Among other things, scores must be consistent with general expectations and must include enough detailed system information to help us assess whether individual scores represent valid test runs. Today, we offer a couple of tips to increase the likelihood that we will publish your WebXPRT 4 test results.

Tip 1: Specify your system’s processor

While testers usually include detailed information for the device, model number, operating system, and browser version fields, we receive many submissions with little to no information about the test system’s processor.

In the picture below, you can see an example of the level of detail that we require to consider a submission. We need the full processor name, including the manufacturer and model number (e.g., Intel Core i9-9980HK, AMD Ryzen 3 1300X, or Apple M1 Max). Note that we do not require the processor speed reported by the system.
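If you're not sure how your system reports its processor, there are many ways to look it up. As one example, on a machine with Node.js installed, the short script below prints the full model string the operating system exposes; this is just a convenience for gathering the information and is not part of the WebXPRT submission process.

  // Print the full processor model string on a system with Node.js installed.
  import * as os from "os";

  const cpu = os.cpus()[0];
  console.log(`Processor: ${cpu.model}`);           // e.g., "Intel(R) Core(TM) i9-9980HK CPU @ 2.40GHz"
  console.log(`Logical cores: ${os.cpus().length}`);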

Tip 2: Include a valid email address

Submissions also frequently omit a valid email address. While we understand the privacy concerns related to submitting a personal or corporate email address, we need a valid address that we can use as a point of contact to confirm test-related information when necessary. We don’t use those addresses for any other purpose, such as selling them, sharing them with third parties, or adding them to a mailing list.

We hope this information explains why we might not have published your results. We look forward to receiving your future score submissions. If you have any questions about the submission process, please let us know!

Justin
