BenchmarkXPRT Blog

Category: Virtual reality

Check out our CES 2023 recap video!

We have a very talented studio team here at Principled Technologies, and this week, the team worked with the XPRTs to put together a CES 2023 recap video. In it, I discuss why we traveled to CES, my overall impressions from the show, and how the ideas and technologies presented at the show may influence the development of future XPRT benchmarks. If you missed last week’s post about my initial thoughts on the advances in AR and VR technology at the show, or you didn’t get a chance to see some of our photos from the show on social media, this week’s video provides a good summary of our CES-related activity.

To view the video, you can follow this link or click the screenshot below. If you followed CES this year and have any thoughts about how the XPRTs can help to evaluate emerging technologies, we’d love to hear from you!

Justin

CES 2023: Adapting to changing realities

The last time the XPRTs attended the Consumer Electronics Show in Las Vegas was in January 2020, shortly before shutdowns due to the global pandemic began. More than 171,000 people attended that year’s show; the 2021 show was entirely virtual; and CES shortened the 2022 show after many exhibitors and media pulled out during the Omicron surge. While some aspects of the event are returning to normal this year, about one-third of the typically jam-packed Las Vegas Convention Center space is empty, and only about 100,000 people are likely to attend. Nevertheless, the show is still enormous and full of fascinating new technology.

Just one day into the show, I’ve already noticed some interesting changes in the virtual reality (VR) and augmented reality (AR) areas since I last attended in 2020. One change is a significant expansion in the sensory capabilities of VR equipment. For a long time, VR technologies have focused almost solely on visual and audio input technology and the graphics-rendering capabilities necessary for lag-free, immersive experiences. In 2020, I saw companies working on various types of haptic feedback gear, including full-body suits, that pushed the boundaries of VR beyond sight and sound. Now, several companies are demonstrating significant progress in “real-feel touch” technologies for VR. One such company is HaptX, which is developing a set of gloves (see the picture below) that pump air through “microfluidic actuators” so that users can feel the size and shape of virtual objects they interact with in a VR environment. While we often think of VR being used for gaming and entertainment, advances in realistic, multi-sensory capabilities can lead to VR becoming a valuable tool for all kinds of industrial and professional training applications.

A show attendee tries out HaptX gear.

Another change I’ve noticed is how AR seems poised to move from demos to everyday life by means of integration with all types of smartphone apps. I enjoyed speaking with a representative from a Korean AR company called Arbeon. Arbeon is developing an app that will allow users to point their phone’s camera at an object (a wine bottle in the picture below) and see an array of customizable, interactive AR animations surrounding the object. You’ll be able to find product info, see and leave feedback similar to “likes” and reviews, attach emojis, tag friends, and even purchase the product, all from your phone’s AR-enhanced camera and screen. It’s an interesting concept with a wide range of potential applications. While VR is here to stay and getting better all the time, I personally think that AR will become much more integrated into everyday life in the coming years. I also think AR apps for phones will allow the technology to take off more quickly in the near term than clunkier options like AR eyeglasses.

The large screen displays how Arbeon’s AR phone app interacts with objects like a wine bottle.

Of course, thinking about AR has led me to wonder if we’ll be able to incorporate AR-related workloads into future XPRTs. As new technologies place new and unprecedented levels of processing demand on our computing hardware, the need for objective performance evaluation will continue. Providing reliable, objective performance data is why the XPRTs exist, and planning for the future of the XPRTs is why we’re at CES 2023. If you have any thoughts about how the XPRTs can help to evaluate new technologies, we’d love to hear from you!

Justin

Considering WebAssembly for WebXPRT 4

Earlier this month, we discussed a few of our ideas for possible changes in WebXPRT 4, including new web technologies that may work well in a browser benchmark. Today, we’re going to focus on one of those technologies, WebAssembly, in more detail.

WebAssembly (WASM) is a binary instruction format that works across all modern browsers. WASM provides a sandboxed environment that operates at near-native speeds and takes advantage of common hardware capabilities across platforms. WASM’s capabilities offer web developers a great deal of flexibility for running complex client applications in the browser. That level of flexibility may enable workload scenario options for WebXPRT 4 such as gaming, video editing, VR, virtual machines, and image recognition. We’re excited about those possibilities, but it remains to be seen which WASM use cases will meet the criteria we look for when considering new WebXPRT workloads, such as relevancy to real life, consistency and replicability, and the broadest-possible level of cross-browser support.
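To give a concrete sense of what a “binary instruction format” means in practice, here’s a minimal sketch (not a WebXPRT workload) that hand-assembles the bytes of a tiny WASM module exporting an `add` function, then compiles and runs it from JavaScript. The byte layout follows the WebAssembly binary format specification:

```javascript
// Minimal WebAssembly module, hand-assembled per the WASM binary format.
// It exports a single function add(a, b) that returns a + b as an i32.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, // magic number: "\0asm"
  0x01, 0x00, 0x00, 0x00, // binary format version 1
  // Type section: one function type, (i32, i32) -> i32
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f,
  // Function section: one function, using type index 0
  0x03, 0x02, 0x01, 0x00,
  // Export section: export function 0 under the name "add"
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00,
  // Code section: local.get 0, local.get 1, i32.add, end
  0x0a, 0x09, 0x01, 0x07, 0x00, 0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,
]);

// Compile and instantiate synchronously (fine for a tiny module like this;
// larger modules should use the asynchronous WebAssembly.instantiate APIs).
const module = new WebAssembly.Module(bytes);
const instance = new WebAssembly.Instance(module);
const add = instance.exports.add;

console.log(add(3, 4)); // 7
```

Real workloads would, of course, compile C/C++ or Rust to WASM with a toolchain rather than writing bytes by hand, but the sandboxed compile-and-instantiate flow shown here is the same one a browser benchmark would exercise.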

One WASM workload that we’re investigating is a web-based machine learning workload with TensorFlow for JavaScript (TensorFlow.js). TensorFlow.js offers pre-trained models for a wide variety of tasks, including image classification, object detection, sentence encoding, and natural language processing. TensorFlow.js originally used WebGL technology on the back end, but now it’s possible to run the workload using WASM. We could also use this technology to enhance one of WebXPRT’s existing AI-themed workloads, such as Organize Album using AI or Encrypt Notes and OCR Scan.

We can’t yet say that a WASM workload will definitely appear in WebXPRT 4, but the technology is promising. Do you have any experience with WASM, or ideas for WASM workloads? There’s still time for you to help shape the future of WebXPRT 4, so let us know what you think!

Justin

Potential web technology additions for WebXPRT 4

A few months ago, we invited readers to send in their thoughts and ideas about web technologies and workload scenarios that may be a good fit for the next WebXPRT. We’d like to share a few of those ideas today, and we invite you to continue to send your feedback. We’re approaching the time when we need to begin firming up plans for a WebXPRT 4 development cycle in 2021, but there’s still plenty of time for you to help shape the future of the benchmark.

One of the most promising ideas for WebXPRT 4 is the potential addition of one or more WebAssembly (WASM) workloads. WASM is a low-level, binary instruction format that works across all modern browsers. It offers web developers a great deal of flexibility and provides the speed and efficiency necessary for running complex client applications in the browser. WASM enables a variety of workload scenario options, including gaming, video editing, VR, virtual machines, image recognition, and interactive educational content.
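Because browser support for WASM features varies, any cross-browser benchmark would likely need to feature-detect WASM before dispatching such a workload. A minimal sketch of what that check might look like (the `wasmSupported` helper and the fallback choice here are hypothetical, not part of WebXPRT):

```javascript
// Detect basic WebAssembly support by compiling the smallest valid module
// (magic number plus version, no sections). Returns true when the
// environment can compile WASM, false otherwise.
function wasmSupported() {
  try {
    if (typeof WebAssembly !== 'object' ||
        typeof WebAssembly.instantiate !== 'function') {
      return false;
    }
    const module = new WebAssembly.Module(
      new Uint8Array([0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00])
    );
    return module instanceof WebAssembly.Module;
  } catch (e) {
    return false;
  }
}

// A harness could then choose a workload implementation accordingly,
// falling back to plain JavaScript where WASM is unavailable.
const backend = wasmSupported() ? 'wasm' : 'js';
console.log(backend);
```

A try/catch around compilation, rather than a simple `typeof` check, catches environments that expose the API but block compilation (for example, under certain content-security policies).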

In addition, the Chrome team is dropping Portable Native Client (PNaCl) support in favor of WASM, which is why we had to remove a PNaCl workload when updating CrXPRT 2015 to CrXPRT 2. We generally model CrXPRT workloads on existing WebXPRT workloads, so familiarizing ourselves with WASM could ultimately benefit more than one XPRT benchmark.

We are also considering adding a web-based machine learning workload with TensorFlow for JavaScript (TensorFlow.js). TensorFlow.js offers pre-trained models for a wide variety of tasks, including image classification, object detection, sentence encoding, and natural language processing. We could also use this technology to enhance one of WebXPRT’s existing AI-themed workloads, such as Organize Album using AI or Encrypt Notes and OCR Scan.

Other ideas include using a WebGL-based workload to target GPUs and investigating ways to incorporate a battery life test. What do you think? Let us know!

Justin

AI is the heartbeat of CES 2019

This year’s CES features a familiar cast of characters: gigantic, super-thin 8K screens; plenty of signage promising the arrival of 5G; robots of all shapes, sizes, and levels of competency; and acres of personal grooming products that you can pair with your phone. In all seriousness, however, one main question keeps coming to mind as I walk the floor: Are we approaching the tipping point where AI truly starts to affect most people in meaningful ways on a daily basis? I think we’re still a couple of years away from ubiquitous AI, but it’s the heartbeat of this year’s show, and it’s going to play a part in almost everything we do in the very near future. AI applications at this year’s show include manufacturing, transportation, energy, medicine, education, photography, communications, farming, grocery shopping, fitness, sports, defense, and entertainment, just to name a few. The AI revolution is just starting, but once it gets going, AI will continually reshape society for decades to come. This year’s show reinforces our decision to explore the roles that the XPRTs, beginning with AIXPRT, can play in the AI revolution.

Now for the fun stuff. Here’s a peek at a couple of my favorite displays so far. As is often the case, the most awe-inducing displays at CES are those that overwhelm attendees with light and sound. LG’s enormous curved OLED wall, dubbed the Massive Curve of Nature, was truly something to behold.


Another big draw has been Bell’s Nexus prototype, a hybrid-electric VTOL (vertical takeoff and landing) air taxi. Some journalists can’t resist calling it a flying car, but I refuse to do so, because it has nothing in common with cars apart from the fact that people sit in it and use it to travel from place to place. As Elon Musk once said of an earlier, but similar, concept, “it’s just a helicopter in helicopter’s clothing.” Semantics aside, it’s intriguing to imagine urban environments full of nimble aircraft that are quieter, easier to fly, and more energy efficient than traditional helicopters, especially if they’re paired with autonomous driving technologies.


Finally, quite a few companies are displaying props that put some of the “reality” back into “virtual reality.” Driving and flight simulators with a full range of motion that are small enough to fit in someone’s basement or game room, full-body VR suits that control your temperature and deliver electrical stimuli based on game play (yikes!), and portable roller-coaster-like VR rides were just a few of the attractions.


It’s been a fascinating show so far!

Justin

XPRT collaborations: North Carolina State University

For those of us who work on the BenchmarkXPRT tools, a core goal is involving new contributors and interested parties in the benchmark development process. Adding voices to the discussion fosters the collaboration and innovation that lead to powerful benchmark tools with lasting relevance.

One vehicle for outreach that we especially enjoy is sponsoring a student project through North Carolina State University. Each semester, the Senior Design Center in the university’s Department of Computer Science partners with external companies and organizations to provide student teams with an opportunity to work on real-world programming projects. If you’ve followed the XPRTs for a while, you may remember previous student projects such as Nebula Wolf, a mini-game that shows how well different devices handle games, and VR Demo, a virtual reality prototype workload based on a room escape scenario.

This fall, a team of NC State students is developing a software console for automating machine learning tests. Ideally, the tool will let future testers specify custom workload combinations, compute a performance metric, and upload results to our database. The project will also assess the impact of the framework on performance scores. In fact, the console will perform many of the same functions we plan to implement with AIXPRT.

The students have worked very hard on the project, and have learned quite a bit about benchmarking practices and several new software tools. The project will wrap up in the next couple of weeks, and we’ll share additional details as soon as possible. Early next year, we’ll publish a video about the experience.

If you’d like to join the NC State students and hundreds of other XPRT community members in the future of benchmark development, please let us know!

Justin
