
Tag Archives: Intel

Local AI and new frontiers for performance evaluation

Recently, we discussed some ways the PC market may evolve in 2024, and how new Windows on Arm PCs could present the XPRTs with many opportunities for benchmarking. In addition to a potential market shakeup from Arm-based PCs in the coming years, there’s a much broader emerging trend that could eventually revolutionize almost everything about the way we interact with our personal devices—the development of local, dedicated AI processing units for consumer-oriented tech.

AI already impacts daily life for many consumers through technologies such as predictive text, computer vision, adaptive workflow apps, voice recognition, smart assistants, and much more. Generative AI-based technologies are rapidly establishing a permanent, society-altering presence across a wide range of industries. Aside from some localized inference tasks that the CPU and/or GPU typically handle, the bulk of the heavy compute power that fuels those technologies has been in the cloud or in on-prem servers. Now, several major chipmakers are working to roll out their own versions of AI-optimized neural processing units (NPUs) that will enable local devices to take on a larger share of the AI load.

Examples of dedicated AI hardware in recently released or upcoming consumer devices include Intel’s new Meteor Lake NPU, Apple’s Neural Engine for M-series SoCs, Qualcomm’s Hexagon NPU, and AMD’s XDNA 2 architecture. The potential benefits of localized, NPU-facilitated AI are straightforward. On-device AI could reduce power consumption and extend battery life by offloading those tasks from the CPU and GPU. It could alleviate certain cloud-related privacy and security concerns. Without the delays inherent in cloud queries, localized AI could execute inference tasks much closer to real time. NPU-powered devices could fine-tune applications around your habits and preferences, even while offline. You could pull and use relevant data from cloud-based datasets without pushing private data in return. Theoretically, your device could know a great deal about you and enhance many areas of your daily life without passing all that data to another party.

Will localized AI play out that way? Some tech companies envision a role for on-device AI that enhances the abilities of existing cloud-based subscription services without decoupling personal data. We’ll likely see a wide variety of capabilities and services on offer, with application-specific and SaaS-determined privacy options.

Regardless of the way on-device AI technology evolves in the coming years, it presents an exciting new frontier for benchmarking. Not all NPUs will be created equal, and that’s something buyers will need to understand. Some vendors will optimize their hardware for computer vision, others for large language models or AI-based graphics rendering, and so on. It won’t be enough for businesses and consumers to simply know that a new system has dedicated AI processing capabilities. They’ll need to know whether that system performs well while handling the types of AI-related tasks they do every day.

Here at the XPRTs, we specialize in creating benchmarks built around real-world scenarios that mirror the tasks people do in their daily lives. That approach means that when people use XPRT scores to compare device performance, they’re using a metric that can help them make a buying decision that will benefit them every day. We look forward to exploring ways to bring XPRT benchmarking expertise to the world of on-device AI.

Do you have ideas for future localized AI workloads? Let us know!

Justin

The evolving PC market brings new opportunities for WebXPRT

Here at the XPRTs, we spend a lot of time examining what’s next in the tech industry because the XPRTs have to keep up with the pace of innovation. In our recent discussions about 2024, a major recurring topic has been the potential impact of Qualcomm’s upcoming line of SoCs designed for Windows on Arm PCs.

Now, Windows on Arm PCs are certainly not new. Since Windows RT launched on the Arm-based Microsoft Surface RT in 2012, various Windows on Arm devices have come and gone, but none of them—except for some Microsoft SQ-based Surface devices—have made much of a name for themselves in the consumer market.

The reasons for these struggles are straightforward. While Arm-based PCs have the potential to offer consumers the benefits of excellent battery life and “always-on” mobile communications, the platform has historically lagged Intel- and AMD-based PCs in performance. Windows on Arm devices have also faced the challenge of a lack of large-scale buy-in from app developers. So, despite the past involvement of device makers like ASUS, HP, Lenovo, and Microsoft, the major theme of the Windows on Arm story has been one of very limited market acceptance.

Next year, though, the theme of that story may change. If it does, WebXPRT 4 is well-positioned to play an important part.

At the recent Qualcomm Technology Summit, the company unveiled the new 4nm Snapdragon X Elite SoC, which includes an all-new 12-core Oryon CPU, an integrated Adreno GPU, and an integrated Hexagon NPU (neural processing unit) designed for AI-powered applications. Company officials presented performance numbers that showed the X Elite surpassing the performance of late-generation AMD, Apple, and Intel competitor platforms, all while using less power.

Those are massive claims, and of course the proof will come—or not—only when systems are available for test. (In the past, companies have made similar claims about Windows on Arm advantages, only to see those claims evaporate by the time production devices show up on store shelves.)

Will Snapdragon X Elite systems demonstrate unprecedented performance and battery life when they hit the market? How will those devices stack up against Intel’s Meteor Lake systems and Apple’s M3 offerings? We don’t yet know how these new devices may shake up the PC market, but we do know that 2024 looks like it will present us with many golden opportunities for benchmarking. Amid all the marketing buzz, buyers everywhere will want to know about potential trade-offs between price, power, and battery life. Tech reviewers will want to dive into the details and provide useful data points, but many traditional PC benchmarks simply won’t work with Windows on Arm systems. As a go-to, cross-platform favorite of many OEMs that runs on just about anything with a browser, WebXPRT 4 is in a perfect position to provide reviewers and consumers with relevant performance comparison data.

It’s quite possible that 2024 will be the biggest year for WebXPRT yet!

Justin

Using WebXPRT 3 to compare the performance of popular browsers

Microsoft recently released a new Chromium-based version of the Edge browser, and several tech press outlets have released reviews and results from head-to-head browser performance comparison tests. Because WebXPRT is a go-to benchmark for evaluating browser performance, PCMag, PCWorld, and VentureBeat, among others, used WebXPRT 3 scores as part of the evaluation criteria for their reviews.

We thought we would try a quick experiment of our own, so we grabbed a recent laptop from our Spotlight testbed: a Dell XPS 13 7390 running Windows 10 Home 1909 (18363.628) with an Intel Core i3-10110U processor and 4 GB of RAM. We tested on a clean system image after installing all current Windows updates, and once the update process completed, we turned off updates to prevent them from interfering with test runs. We ran WebXPRT 3 three times in each of six browsers: a new browser called Brave, Google Chrome, the legacy version of Microsoft Edge, the new version of Microsoft Edge, Mozilla Firefox, and Opera. The posted score for each browser is the median of the three test runs.
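For anyone who wants to aggregate their own WebXPRT runs the same way, here’s a minimal sketch of the median-of-three-runs calculation. The browser labels and scores below are placeholders for illustration, not our test results.

```python
from statistics import median

# Placeholder data: three WebXPRT 3 runs per browser
# (illustrative labels and scores, not our published results)
runs = {
    "browser_a": [183, 181, 185],
    "browser_b": [176, 179, 178],
    "browser_c": [162, 164, 160],
}

# The posted score for each browser is the median of its three runs
for browser, scores in runs.items():
    print(f"{browser}: median WebXPRT 3 score = {median(scores)}")
```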

As you can see in the chart below, five of the browsers (legacy Edge, Brave, Opera, Chrome, and new Edge) produced nearly identical scores. Mozilla Firefox was the only browser that produced a significantly different score. The parity among Brave, Chrome, Opera, and the new Edge is not that surprising, considering that they are all Chromium-based browsers. The rank order and relative scaling of these results are similar to those published by the tech outlets mentioned above.

Do these results mean that Mozilla Firefox will provide you with a speedier web experience? Generally, a device with a higher WebXPRT score is probably going to feel faster to you during daily use than one with a lower score. For comparisons on the same system, however, the answer depends in part on the types of things you do on the web, how the extensions you’ve installed affect performance, how frequently the browsers issue updates and incorporate new web technologies, and how accurately the browsers’ default installation settings reflect how you would set up the same browsers for your daily workflow.

In addition, browser speed can increase or decrease significantly after an update, only to swing back in the other direction shortly thereafter. OS-specific optimizations can also affect performance, such as with Edge on Windows 10 and Chrome on Chrome OS. All of these variables are important to keep in mind when considering how browser performance comparison results translate to your everyday experience. In such a competitive market, and with so many variables to consider, we’re happy that WebXPRT can help consumers by providing reliable, objective results.

What are your thoughts on today’s competitive browser market? We’d love to hear from you.

Justin

How to use alternate configuration files with AIXPRT

In last week’s AIXPRT Community Preview 3 announcement, we mentioned the new public GitHub repository that we’re using to publish AIXPRT-related information and resources. In addition to the installation readmes for each AIXPRT installation package, the repository contains a selection of alternative test config files that testers can use to quickly and easily change a test’s parameters.

As we discussed in previous blog entries about batch size, levels of precision, and number of concurrent instances, AIXPRT testers can adjust each of these key variables by editing the JSON file in the AIXPRT/Config directory. While the process is straightforward, editing each of the variables in a config file can take some time, and testers don’t always know the appropriate values for their system. To address both of these issues, we are offering a selection of alternative config files that testers can download and drop into the AIXPRT/Config directory.
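For illustration, here’s a minimal sketch of the kind of JSON a tester might edit. The field names and values below are placeholders meant to show the idea rather than the exact AIXPRT schema, so refer to the config files in the repository for the real structure.

```json
{
  "workload": "resnet-50",
  "hardware": "gpu",
  "precision": "fp16",
  "batch_sizes": [1, 2, 4, 8],
  "concurrent_instances": 2
}
```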

In the GitHub repository, we’ve organized the available config files first by operating system (Linux_Ubuntu and Windows) and then by vendor (All, Intel, and NVIDIA). Within each section, testers will find preconfigured JSON files set up for several scenarios, such as running with multiple concurrent instances on a system’s CPU or GPU, running with FP32 precision instead of FP16, etc. The picture below shows the preconfigured files that are currently available for systems running Ubuntu on Intel hardware.

[Screenshot: the preconfigured config files currently available for Ubuntu systems with Intel hardware in the AIXPRT public GitHub repository]

Because potential AIXPRT use cases cut across a wide range of hardware segments, including desktops, edge devices, and servers, not all AIXPRT workloads and configs will be applicable to each segment. As we move towards the AIXPRT GA, we’re working to find the best way to parse out these distinctions and communicate them to end users. In many cases, the ideal combination of test configuration variables remains an open question for ongoing research. However, we hope the alternative configuration files will help by giving testers a starting place.

If you experiment with an alternative test configuration file, please note that it should replace the existing default config file. If more than one config file is present, AIXPRT will run all the configurations and generate a separate result for each. More information about the config files and detailed instructions for how to handle the files are available in the EditConfig.md document in the public repository.
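For testers who want to script that swap, a minimal sketch might look like the following. The filenames here are placeholders; EditConfig.md in the repository has the actual file names and the official instructions.

```python
import shutil
from pathlib import Path

config_dir = Path("AIXPRT/Config")     # default config location
backup_dir = Path("config_backup")
backup_dir.mkdir(exist_ok=True)

# Move the existing config files to a backup folder so that only the
# alternative config is present; if multiple configs remain, AIXPRT runs
# each one and reports a separate result for each.
for existing in config_dir.glob("*.json"):
    shutil.move(str(existing), str(backup_dir / existing.name))

# Drop in the alternative config downloaded from the public GitHub
# repository (placeholder filename).
shutil.copy("resnet50_gpu_fp16.json", config_dir / "resnet50_gpu_fp16.json")
```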

We’ll continue to keep everyone up to date with AIXPRT news here in the blog. If you have any questions or comments, please let us know.

Justin

A new HDXPRT 4 build is available!

A few weeks ago, we announced that a new HDXPRT 4 build, v1.1, was on the way. This past Monday, we published the build on HDXPRT.com.

The new build includes an updated version of HandBrake, the application HDXPRT uses for certain video conversion tasks. HandBrake 1.2.2 supports hardware acceleration with AMD Video Coding Engine (VCE), Intel Quick Sync Video (QSV), and the NVIDIA video encoder (NVENC). By default, HDXPRT 4 v1.1 uses the encoder available through a system’s integrated graphics, but testers can target discrete graphics by changing a configuration file flag before running the benchmark. HDXPRT will then use the encoder provided by the discrete graphics hardware. This configuration setting takes effect only when more than one of the supported encoders (VCE, QSV, or NVENC) is present on the system.

As we mentioned before, in all other respects, the benchmark has not changed. That means that, apart from a scenario where a tester changes the targeted graphics hardware, scores from previous HDXPRT 4 builds will be comparable to those from the new build.

The updated HDXPRT 4 User Manual contains additional information and instructions for changing the configuration file flag. Please contact us if you have any questions about the new build. Happy testing!

Justin

Answering questions about the AIXPRT Community Preview

Over the last two weeks, we’ve received a few questions about the AIXPRT Community Preview. Specifically, community members have asked about the project’s focus, possible future steps, and the results table. We decided to answer each of these here in the blog, since others are likely to have the same questions. We encourage folks to submit any new questions they may have.

PT previously stated that AIXPRT would be focused on edge devices. The current published results are from desktops and laptops. Is the focus of AIXPRT changing?

In the past, we did say that the focus of AIXPRT would be edge inference devices. After receiving a great deal of feedback, we’ve come to understand that focus is probably too restrictive. PCs and laptops are running machine learning inference, and a decent amount of inference is taking place on cloud servers, at least until phones are capable enough to handle those workloads themselves. We now see all of these devices as potential targets for AIXPRT.

How did you choose the current results in your database?

We ran the AIXPRT CP on some of the systems we used during development and testing. We will continue to publish additional results as we test available systems in our lab. We’d love to get results from the community that cover a wider base of devices.

Will you be publishing results from servers?

We welcome server results submissions from the community, and will review them for publication on our site.

Will AIXPRT ever be available for Windows systems?

This is a possibility we’re actively exploring, and we hope to be able to share more about it soon.

What’s the best way to navigate the results table?

AIXPRT can run three toolkits, use two networks, and target CPU or GPU hardware. Together, these configuration options produce a lot of data points. To make it easier to handle all these variables, we’re working to improve the navigation, sorting, and filtering capabilities of the results table. In the meantime, a few tips:

  • There are two tabs at the top of the table, one for the ResNet-50 network and one for the SSD-MobileNet network. You can click the tabs to move between results for these networks.
  • Clicking any of the column headers will sort the data in that column A-Z (with the first click) or Z-A (with a second click).
  • To see whether an individual test targeted a system’s CPU or GPU, read the description in the Summary column, e.g., Intel Core i7-7600U GPU / OpenVINO.
  • Clicking the entry in the Source column will take you to a more detailed page listing additional test configuration and system hardware information.


We’ll continue to share more information about AIXPRT in the coming weeks. Do you have additional questions or comments about AIXPRT? Let us know.

Justin
