For the past few months, we’ve been recommending that CrXPRT 2 testers not use the battery life test until we find a solution to a recurring error on Chrome v89.x and later. The error prevents the test from completing and producing a battery life estimate. Sometimes the CrXPRT battery life test stops running after only a few workload iterations; at other times, it almost reaches completion before producing the error.
We are cautiously optimistic that we’ve identified both the problem and a potential fix. We believe the problem stems from fluctuations in the time it takes the benchmark to communicate with Chrome to collect and store battery life information. While we haven’t identified the root cause of the fluctuations, adjusting the CrXPRT code to make it less sensitive to the fluctuations appears to be an effective fix. We have incorporated those adjustments into an updated, unpublished version of the app package, and we can now complete CrXPRT 2 battery life tests on Chrome v89.x and later with no failures.
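For readers curious about what “less sensitive” looks like in practice, here is a minimal sketch of that kind of tolerance adjustment: retrying a slow battery status query instead of treating the first delayed response as fatal. The function name, retry count, and timeout values below are hypothetical illustrations and not CrXPRT’s actual code, though navigator.getBattery() is the real Chrome API for this kind of reading.

// Hypothetical sketch only, not CrXPRT's actual code: tolerate slow
// battery status replies instead of failing on the first delayed one.
async function readBatteryLevel(maxAttempts = 3, timeoutMs = 5000): Promise<number> {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      // Race Chrome's Battery Status API against a timer so one slow
      // reply cannot stall the workload iteration indefinitely.
      const battery = await Promise.race([
        (navigator as any).getBattery() as Promise<{ level: number }>,
        new Promise<never>((_, reject) =>
          setTimeout(() => reject(new Error('battery query timed out')), timeoutMs)),
      ]);
      return battery.level; // fraction of charge remaining, 0.0 to 1.0
    } catch (err) {
      if (attempt === maxAttempts) throw err; // give up only after several tries
      await new Promise((resolve) => setTimeout(resolve, 1000)); // pause, then retry
    }
  }
  throw new Error('unreachable');
}

The principle is simply to keep a single slow exchange with the browser from aborting the entire run.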
We are calling this a potential fix because we’re still testing across several different Chromebook models to ensure consistency. In some of our testing, the variance in estimated battery life results has been a little higher than we’d like, so we’re taking time to determine whether that variance is present across all systems or only on specific hardware.
We’d like to apologize once again for the inconvenience that this error is causing CrXPRT 2 testers. As soon as we better understand the viability of the current fix as a long-term update, we’ll let you know!
Recently, Microsoft announced that the Windows 11 GA build will officially launch on Tuesday, October 5, earlier than the initial late-2021 estimate. The update will start rolling out to select new laptops and to existing Windows 10 PCs that satisfy specific system requirements, and only some Windows 10 PCs will be eligible for the update right away. Through a phased Windows Update process, additional Windows 10 PCs will be able to access the update throughout the first half of 2022.
Given the phased Windows 11 rollout and the pledge Microsoft has made to continue Windows 10 support through October 2025, it will likely be a while before the majority of Windows users transition to the new version. We hope the transition period will go smoothly for the XPRTs. However, because we designed three of our benchmarks to run on Windows 10 (HDXPRT 4, TouchXPRT 2016, and AIXPRT), we might encounter compatibility issues with Windows 11.
In the coming weeks, we’ll be testing HDXPRT 4, TouchXPRT 2016, and AIXPRT on beta
versions of Windows 11, and we’ll test again after the GA launch. In addition to watching for obvious compatibility issues and test failures, we’ll note any changes we
need to make to our documentation to account for differences in the Windows 11
installation or test processes.
We hope that testers will be able to successfully use all three benchmarks on both OS versions throughout the transition process. If problems arise, we will keep our blog readers informed while exploring solutions. As always, we’re also open to feedback from the community, so if you are participating in the Windows Insider Program and have encountered Windows 11 beta compatibility issues with any of the Windows-focused XPRTs, please let us know!
In May, we surveyed longtime WebXPRT users regarding the types of changes they would like to see in a future WebXPRT 4. We sent the survey to journalists at several tech press outlets and invited our blog readers to participate as well. We received some very helpful feedback. As we explore new possibilities for WebXPRT 4, we’ve decided to
open an updated version of the survey. We’ve adjusted the questions a bit based
on previous feedback and added some new ones, so we invite you to respond even
if you participated in the original survey.
We recently published a set of CloudXPRT Data Analytics and Web Microservices
workload test results
submitted by Quanta Computer, Inc.
The Quanta submission is the first set of CloudXPRT results that we’ve
published using the formal results submission and approval process.
We’re grateful to the Quanta team for carefully following the submission
guidelines, enabling us to complete the review process without a hitch.
If you are unfamiliar
with the process, you can find general information about how we review
submissions in a previous blog post.
Detailed, step-by-step instructions are available on the results submission page.
As a reminder for testers who are considering submitting results for July, the
submission deadline is tomorrow, Friday, July 16, and the publication date is Friday, July 30. We list the submission and publication dates for the rest of
2021 below. Please note that we do not plan to review submissions in December,
so if we receive results submissions after November 30, we may not publish them
until the end of January 2022.
August 2021
Submission deadline: Tuesday 8/17/21
Publication date: Tuesday 8/31/21

September 2021
Submission deadline: Thursday 9/16/21
Publication date: Thursday 9/30/21

October 2021
Submission deadline: Friday 10/15/21
Publication date: Friday 10/29/21

November 2021
Submission deadline: Tuesday 11/16/21
Publication date: Tuesday 11/30/21

December 2021
Submission deadline: N/A
Publication date: N/A
If you have any questions about the CloudXPRT results submission, review, or publication process, please let us know!
We’re excited to have recently passed an important milestone: one million XPRT runs and downloads! Most importantly, that huge number doesn’t just reflect past successes. As the chart below illustrates, XPRT use has grown steadily over the years. In 2021, we have recorded, on average, more XPRT runs and downloads in a single month (23,395) than we recorded during the entire first year we tracked these stats (17,051).
We reached one million
runs and downloads in about seven and a half years. At the current rate, we’ll
reach two million in roughly three and a half more years. With WebXPRT 4 on the way, there’s a good chance we can reach that mark even sooner!
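For those who want the math behind that estimate: at the 2021 average of 23,395 runs and downloads per month, the next million would take 1,000,000 ÷ 23,395 ≈ 43 months, or about three and a half years.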
As always, we’re grateful to all the testers who have helped us reach this milestone. If you have any questions or comments about using any of the XPRTs to test your gear, let us know!
We recently received questions about whether we accept CloudXPRT
results submissions from testing on pre-production gear, and how we would handle
any differences between results from pre-production and production-level tests.
To answer the first question, we are not opposed to pre-production
results submissions. We realize that vendors often want to include benchmark
results in launch-oriented marketing materials they release before their
hardware or software is publicly available. To help them do so, we’re happy to
consider pre-production submissions on a case-by-case basis. All such submissions
must follow the normal CloudXPRT results submission process and undergo
vetting by the CloudXPRT Results Review Group according to the standard review
and publication schedule. If we decide to publish pre-production results on our site, we
will clearly note their pre-production status.
In response to the second question, the CloudXPRT Results Review Group will handle any challenges to published results or perceived discrepancies between pre-production and production-level results on a case-by-case basis. We do not currently have a formal process for challenges; anyone who would like to initiate a challenge or express comments or concerns about a result should address the review group via firstname.lastname@example.org. Our primary concern is always to ensure that published results accurately reflect the performance characteristics of production-level hardware and software. If it becomes necessary to develop more policies in the future, we’ll do so, but we want to keep things as simple as possible.
If you have any questions about the CloudXPRT results submission process, please let us know!