If you’ve ever spent time exploring WebXPRT.com, you may have noticed a line that says, “If you are in East Asia, you can run WebXPRT from our Singapore host,” followed by a hyperlink with Simplified Chinese characters. We realize that some people may not know why we have a WebXPRT mirror host site in Singapore—or how to use it—so today’s post will cover the basics.
When we first released WebXPRT 2013, some users in mainland China reported slow download times when running the benchmark. These slowdowns affected initial page and workload content load times, but not workload execution, which happens locally. As a result, subtest and overall scores were still consistent with expectations for the devices under test, but it took longer than normal for test runs to complete. In response, we set up a mirror host site in Singapore to facilitate WebXPRT testing in China and other East Asian countries. We continued this practice with subsequent WebXPRT versions, and currently offer Singapore-based instances of WebXPRT 4, WebXPRT 3, and WebXPRT 2015.
The default UI language on the Singapore site is Simplified Chinese, but users can opt to change the language to English or German. Apart from a different default language, the WebXPRT mirror instances hosted in Singapore are identical to the instances on the main WebXPRT site. If you test a device on WebXPRT Singapore and WebXPRT.com, you should see similar performance scores from both sites.
We hope that the WebXPRT mirror host site in Singapore will make it easier for people in East Asia to use the benchmark. Do you find the site useful? If so, we’d love to hear from you! Also, if you encounter any unexpected issues or interruptions while testing, please let us know!
In our last blog post, we celebrated the 10-year anniversary of the WebXPRT launch by looking back on the WebXPRT team’s accomplishments over the last decade. The incremental steps and milestone improvements we discussed all contributed to carving out a lasting place for WebXPRT in the benchmarking world and helped to grow its reputation for being a reliable, effective, and easy-to-use measurement tool.
WebXPRT’s growth is most evident when we look at the rising number of completed test runs over the last 10 years. Since the first WebXPRT launch in 2013, we’ve seen a steady increase in the number of tests people are running. To put the increase in perspective, we had more runs last month alone (17,300) than we recorded in the first 10 months that WebXPRT was available (11,984).
That growth has helped us to reach and surpass the million-run mark, but the most exciting aspect of seeing a consistent increase in WebXPRT testing is the knowledge that the benchmark is proving to be useful to more people in more places around the world. In our next blog post, we’ll discuss WebXPRT’s truly global reach and some of the surprising cities and countries where people have been using it to test their gear.
We’re grateful for all the testers who have helped WebXPRT grow during the last decade. If you have any questions or comments about using WebXPRT, let us know!
We’re excited to announce that it’s been 10 years since the initial launch of WebXPRT! In early 2013, we introduced WebXPRT as a unique browser performance benchmark in a market space that was already crowded with a variety of specialized measurement tools. Our goal was to offer a benchmark that could compare the performance of almost any web-enabled device, using scenarios created to mirror real-world tasks. We wanted it to be a free, easily accessible, easy-to-run, useful, and appealing testing option for OEM labs, vendors, and the tech press.
When we look back on the last 10 years of WebXPRT, we can’t help but conclude that our efforts have been successful. Since those early days, the WebXPRT market presence has grown from humble beginnings into a worldwide industry standard. Hundreds of tech press publications have used WebXPRT in thousands of articles and reviews, and testers have now run the benchmark well over 1.1 million times.
Below, I’ve listed some of the WebXPRT team’s accomplishments over the last decade. If you’ve been following WebXPRT from the beginning, this may all be familiar, but if you’re new to the community, it may be interesting to see some of the steps that contributed to making WebXPRT what it is today.
In future blog posts, we’ll look at how the number of WebXPRT runs has grown over time, and how WebXPRT use has grown among OEMs, vendors, and the tech press worldwide. Do you have any thoughts that you’d like to share from your WebXPRT testing experience? If so, let us know!
February 28, 2013 was a momentous day for the BenchmarkXPRT Development Community. On that day, we published a press release announcing the official launch of the first version of the WebXPRT benchmark, WebXPRT 2013. As difficult as it is for us to believe, the 10-year anniversary of the initial WebXPRT launch is in just a few months.
We introduced WebXPRT as a truly unique browser performance benchmark in a field that was already crowded with a variety of measurement tools. Since those early days, the WebXPRT market presence has grown from a small foothold into a worldwide industry standard. Over the years, hundreds of tech press publications have used WebXPRT in thousands of articles and reviews, and the WebXPRT completed-runs counter rolled over the 1,000,000-run mark.
New web technologies are continually changing the way we use the web, and browser-performance benchmarks should evaluate how well new devices handle the web of today, not the web of several years ago. While some organizations have stopped development for other browser performance benchmarks, we’ve had the opportunity to continue updating and refining WebXPRT. We can look back at each of the four major iterations of the benchmark—WebXPRT 2013, WebXPRT 2015, WebXPRT 3, and WebXPRT 4—and see a consistent philosophy and shared technical lineage contributing to a product that has steadily improved.
As we get closer to the 10-year anniversary of WebXPRT next year, we’ll be sharing more insights about its reach and impact on the industry, discussing possible future plans for the benchmark, and announcing some fun anniversary-related opportunities for WebXPRT users. We think 2023 will be the best year yet for WebXPRT!
We’ve designed each of the XPRT benchmarks to assess the performance of specific types of devices in scenarios that mirror the ways consumers typically use those devices. While most XPRT benchmark users are interested in producing official overall scores, some members of the tech press have been using the XPRTs in unconventional, creative ways.
One example comes from Tweakers, a popular tech review site based in the Netherlands. (The site is in Dutch, so the Google Translate extension in Chrome was helpful for me.) When Tweakers uses WebXPRT to evaluate consumer hardware, they also measure the sound output of each device. Tweakers then publishes the LAeq metric for each device, giving readers a sense of how loud a system may be, on average, while it performs common browser tasks.
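Tweakers doesn’t describe its measurement pipeline in detail, but LAeq itself is a standard acoustics metric: the A-weighted sound pressure, averaged on an energy basis over the measurement interval and expressed in decibels. Here’s a minimal sketch of the calculation, assuming you already have A-weighted pressure samples in pascals:

```python
import math

P_REF = 2e-5  # reference sound pressure: 20 micropascals

def laeq(pressure_samples):
    """Equivalent continuous sound level in dB from A-weighted
    pressure samples (pascals), averaged over the whole run."""
    mean_square = sum(p * p for p in pressure_samples) / len(pressure_samples)
    return 10 * math.log10(mean_square / P_REF**2)

# A device holding a steady 0.02 Pa during a run measures 60.0 dB
print(round(laeq([0.02] * 1000), 1))
```

Because the average is energy-based rather than arithmetic, short loud events (such as a fan spinning up during a heavy workload) pull LAeq upward more than a simple average of decibel readings would suggest.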
Other labs and tech publications have also used the XPRTs in unusual ways, such as automating the benchmarks to run during screen burn-in tests or custom battery-life rundowns. If you’ve used any of the XPRT benchmarks in creative ways, please let us know! We are interested in learning more about your tests, and your experiences may provide helpful information that we can share with other XPRT users.
This week, we published the Exploring WebXPRT 4 white paper. It describes the design and structure of WebXPRT 4, including detailed information about the benchmark’s harness, HTML5 and WebAssembly (WASM) capability checks, and changes we’ve made to the structure of the performance test workloads. We explain the benchmark’s scoring methodology, how to automate tests, and how to submit results for publication. The white paper also includes information about the third-party functions and libraries that WebXPRT 4 uses during the HTML5 and WASM capability checks and performance workloads.
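The white paper is the authoritative source for WebXPRT 4’s automation interface. As a rough illustration of the general approach, browser-benchmark automation of this kind typically works by appending settings to the test URL as query parameters. In the sketch below, the base URL is WebXPRT’s real landing page, but the parameter names are hypothetical placeholders, not WebXPRT’s documented ones:

```python
from urllib.parse import urlencode

# Real landing page; the query-parameter names used below are
# hypothetical placeholders -- see the Exploring WebXPRT 4 white
# paper for the officially supported automation parameters.
BASE_URL = "https://www.principledtechnologies.com/benchmarkxprt/webxprt/"

def automation_url(base=BASE_URL, **params):
    """Append automation settings to a benchmark URL as query parameters."""
    return base + "?" + urlencode(sorted(params.items()))

# e.g. start automatically and run the suite three times
print(automation_url(autostart=1, runs=3))
```

A harness script can then open the resulting URL in a fresh browser session and collect the score when the run completes.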
The Exploring WebXPRT 4 white paper promotes the high level of transparency and disclosure that is a core value of the BenchmarkXPRT Development Community. We’ve always believed that transparency builds trust, and trust is essential for a healthy benchmarking community. That’s why we involve community members in the benchmark development process and disclose how we build our benchmarks and how they work.