In October, we shared an early preview of the new results viewer tool that we’ve been developing in parallel with WebXPRT 4. The WebXPRT 4 Preview is now available to the public, and we’re excited to announce that the new results viewer is also live. We already have over 65 test results in the viewer, and in the weeks leading up to the WebXPRT 4 general release, we’ll be actively populating the viewer with the latest PT-curated WebXPRT 4 Preview results.
We encourage readers to visit the blog for details about the viewer's features, and to take some time to explore the data. We're excited about this new tool, which we view as an ongoing project with room for expansion and improvement based on user feedback.
If you have any questions or comments about the WebXPRT 4 Preview or the new results viewer, please feel free to contact us!
Last week, we shared some new details about the changes we're likely to make in WebXPRT 4, and a rough target date for publishing a preview build. This week, we're excited to share an early preview of the new results viewer tool that we plan to release in conjunction with WebXPRT 4. We hope the tool will help testers and analysts access the wealth of WebXPRT test results in our database in an efficient, productive, and enjoyable way. We're still ironing out many of the details, so some aspects of what we're showing today might change, but we'd like to give you an idea of what to expect.
The screenshot below shows the tool's default display. In this example, the viewer displays over 650 sample results, from a wide range of device types, that we're currently using as placeholder data. The viewer will include several sorting and filtering options, such as device type, browser type, processor vendor, and the source of the result.
Each vertical bar in the graph represents the overall score of a single test result, and the graph presents the scores in order from lowest to highest. To view an individual result in detail, the user simply hovers over and selects the bar representing the result. The bar turns dark blue, and the dark blue banner at the bottom of the viewer displays details about that result.
In the example above, the banner shows the overall score (250) and the score’s percentile rank (85th) among the scores in the current display. In the final version of the viewer, the banner will also display the device name of the test system, along with basic hardware disclosure information. Selecting the Run details button will let users see more about the run’s individual workload scores.
We're still working on a way for users to pin or save specific runs. This would let users easily find the results that interest them, or possibly select multiple runs for a side-by-side comparison.
We’re excited about this new tool, and we look forward to sharing more details here in the blog as we get closer to taking it live. If you have any questions or comments about the results viewer, please feel free to contact us!
As we move forward with the WebXPRT 4 development process, we're also working on ways to enhance the value of WebXPRT beyond simply updating the benchmark. Our primary goal is to expand and improve the WebXPRT-related tools and resources we offer at WebXPRT.com, starting with a new results viewer.
Currently, users can view WebXPRT results on our site in two primary ways, each of which has advantages and limitations.
The first way is the WebXPRT results viewer, which includes hundreds of PT-curated performance scores from a wide range of trusted sources and devices. Users can sort entries by device type, device name, device model, overall score, date of publication, and source. The viewer also includes a free-form filter for quick, targeted searches. While the results viewer contains a wealth of information, it does not let users view or compare multiple results at once with graphs or charts. Another limitation is that it offers no easy way to access the additional test-device data and subtest scores that we have for many entries.
The second way to view WebXPRT results on our site is the WebXPRT Processor Comparison Chart, which uses horizontal bar graphs to compare test scores from the hundreds of published results in our database, grouped by processor type. Users can click the average score for a processor to view all the WebXPRT results we currently have for that processor. The visual aspect of the chart and its automated "group by processor type" feature are very useful, but the chart lacks the sorting and filtering capabilities of the viewer, and navigating to the details of individual tests takes multiple clicks.
In the coming months, we’ll be working to combine the best features of the results viewer and the comparison chart into a single powerful WebXPRT results database tool. We’ll also be investigating ways to add new visual aids, navigation controls, and data-handling capabilities to that tool. We want to provide a tool that helps testers and analysts access the wealth of WebXPRT test information in our database in an efficient, productive, and enjoyable way. If you have ideas or comments about what you’d like to see in a new WebXPRT results viewing tool, please let us know!
We recently published a set of CloudXPRT Data Analytics and Web Microservices workload test results submitted by Quanta Computer, Inc. The Quanta submission is the first set of CloudXPRT results that we've published using the formal results submission and approval process. We're grateful to the Quanta team for carefully following the submission guidelines, enabling us to complete the review process without a hitch.
If you are unfamiliar with the process, you can find general information about how we review submissions in a previous blog post. Detailed, step-by-step instructions are available on the results submission page.
As a reminder for testers who are considering submitting results for July, the submission deadline is tomorrow, Friday, July 16, and the publication date is Friday, July 30. We list the submission and publication dates for the rest of 2021 below. Please note that we do not plan to review submissions in December, so if we receive results submissions after November 30, we may not publish them until the end of January 2022.
August
Submission deadline: Tuesday 8/17/21
Publication date: Tuesday 8/31/21
September
Submission deadline: Thursday 9/16/21
Publication date: Thursday 9/30/21
October
Submission deadline: Friday 10/15/21
Publication date: Friday 10/29/21
November
Submission deadline: Tuesday 11/16/21
Publication date: Tuesday 11/30/21
December
Submission deadline: N/A
Publication date: N/A
If you have any questions about the CloudXPRT results submission, review, or publication process, please let us know!
It's been a while since we last discussed the process for submitting WebXPRT results to be considered for publication in the WebXPRT results browser and the WebXPRT Processor Comparison Chart, so we thought we'd offer a refresher.
Unlike sites that publish all the results they receive, we hand-select results from internal lab testing, user submissions, and reliable tech media sources. In each case, we evaluate whether the score is consistent with general expectations. For sources outside of our lab, that evaluation includes confirming that there is enough detailed system information to help us determine whether the score makes sense. We do this for every score on the WebXPRT results page and the general XPRT results page. All WebXPRT results we publish automatically appear in the processor comparison chart as well.
Submitting your score is quick and easy. At the end of the WebXPRT test run, click the Submit your results button below the overall score, complete the short submission form, and click Submit again. The screenshot below shows how the form would look if I submitted a score at the end of a WebXPRT 3 run on my personal system.
After you submit your score, we'll contact you to confirm how we should display the source. You can choose one of the following:
Your first and last name
"Independent tester" (for those who wish to remain anonymous)
Your company's name, provided that you have permission to submit the result in its name. To use a company name, we ask that you provide a valid company email address.
We will not publish any additional information about you or your company without your permission.
We look forward to seeing your score submissions, and if you have suggestions for the processor chart or any other aspect of the XPRTs, let us know!
Today, we published the Introduction to CloudXPRT white paper. The paper provides an overview of our latest benchmark and consolidates CloudXPRT-related information that we've published in the XPRT blog over the past several months. It describes the CloudXPRT workloads, explains how to choose and download installation packages and how to submit CloudXPRT results for publication, and discusses possibilities for additional development in the coming months.
CloudXPRT is one of the most complex tools in the XPRT family, and there are more CloudXPRT-related topics to discuss than we could fit in this first paper. In future white papers, we will discuss in greater detail each of the benchmark workloads, the range of test configuration options, results reporting, and methods for analysis.
We hope that Introduction to CloudXPRT will provide testers who are interested in CloudXPRT with a solid foundation of understanding on which they can build. Moving forward, we will provide links to the paper in the Helpful Info box on CloudXPRT.com and in the CloudXPRT section of our XPRT white papers page.
If you have any questions about CloudXPRT, please let us know!