CES 2024: All The Things I Put On My Face
CES 2024 was supposed to be all about AI. To a large extent, it was – there were hundreds (literally) of new PCs launched with Intel’s first processor to include an NPU. Rabbit launched an adorable AI companion device, WEHEAD showed off a terrifying disjointed face for ChatGPT, and I even found an “AI Backpack” in Eureka Park! However, while every keynote name-checked AI, I spent more time at CES 2024 with computers on my face than using LLMs, and much of that face-borne computing was powered by Qualcomm, not Intel. In fact, by the end of the week in Las Vegas I had put nearly a dozen smart glasses and spatial computing rigs on my face, giving me a chance to assess the state of the category just ahead of Apple Vision Pro availability.
Apple vs Qualcomm?
Apple kicked off the spatial computing party by announcing pricing and availability details for the Apple Vision Pro. I tried an Apple Vision Pro back at WWDC, not at CES, but it makes this list anyway because Apple’s entry into spatial computing has catalyzed the market, giving the competition a target to hit – or avoid. No one has announced a directly comparable product coming to market this year, but some are clearly in development. Other companies are deliberately choosing different development paths or market niches rather than taking on Apple directly.
Qualcomm is happy to enable all of it: lightweight AR glasses with the Snapdragon AR1 Gen 1, price-sensitive XR with the Snapdragon XR2 Gen 2, and high-end resolution and passthrough with the Snapdragon XR2+ Gen 2, which was announced just ahead of CES (press release here). The creation of the XR2+ appears to have been a direct response to the Apple Vision Pro; like Apple’s headset, devices built on this chipset will support 4.3K resolution per eye, run untethered from a PC, and handle twelve simultaneous cameras for low-latency passthrough. Qualcomm also increased clock speeds on the CPU and GPU. It’s a testament to the power of the XR2 platform that Qualcomm was able to spin out a “plus” version in a matter of months rather than requiring an entirely new generation.
Consumer Things I Put On My Face
The most surprising AR demo was from XREAL, where the Air 2 Ultra is a phone-tethered attempt at full AR glasses capabilities. I spoke with XREAL co-founder Peng Jin, and the ambition here is significant: XREAL wants to build the type of AR glasses you see in science fiction and Marvel movies. These are true AR glasses, so the limitations of passthrough aren’t relevant – you can see the real world just fine. Instead, you have to deal with field of view and the quality of the microdisplays. XREAL still has room to grow the field of view (a somewhat modest 52 degrees), but the image is crisp, opaque (at least in the relatively dim booth environment), and colorful. The hand tracking performed reasonably well, you can interact with digital objects mapped to the real world, and all of the demos worked. XREAL now needs to build out an entire software ecosystem, which is likely a bigger challenge than the hardware. However, XREAL has sold 350,000 smart glasses so far, and the atmosphere at the booth was electric. In just the few minutes I waited to get into the demo booth, a half dozen attendees asked how they could get demos of the XREAL Air 2 Ultra (it was appointment only), though PR was able to squeeze in a well-known software developer. The XREAL Air 2 Ultra goes on sale in March for $699.
The XREAL Air 2 Ultra must be tethered to specific phone models. However, XREAL has announced a partnership with Qualcomm for future product development, so I expect future XREAL glasses may be able to offload the silicon to a separate device with processing, connectivity, and a dedicated battery. That would remove weight and heat from your face, which should allow for much higher processing capabilities.
I have been trying on TCL’s RayNeo glasses since they were first introduced as hand-wired prototypes at CES 2022 (yes, that CES). That prototype turned into last year’s China-only RayNeo Air, which sold out a run of 50,000 units within minutes of going on sale. The next-generation RayNeo glasses move from Qualcomm’s Snapdragon XR2 platform to the AR-specific Snapdragon AR1. The RayNeo X2 Lite (the name may change) is planned as a commercial product for this fall. Unlike its predecessor, it will be coming to global markets, but no pricing has been announced.
The RayNeo X2 Lite includes cameras, speakers, microphones, and extremely bright full-color microdisplays, but it does not attempt to map digital objects onto things in the room. Demos included turn-by-turn directions, live translation, and talking to an AI avatar and getting it to dance awkwardly for me. You control the glasses with voice, gestures along the temples, and a touchscreen ring that reminded me of North’s joystick ring controller. I found the RayNeo X2 Lite light and comfortable – a key design goal for Qualcomm’s Snapdragon AR1 platform.
ASUS showed off the AirVision M1 glasses in its invitation-only booth alongside a room full of PCs and mobile devices to pair them with. These are fairly straightforward wearable displays, similar to existing products from Lenovo and XREAL. The AirVision M1 has 1,100-nit brightness and a 57-degree field of view, and you can launch multiple virtual displays in Windows. Pricing and availability were not announced.
Meta did not have a booth on the show floor, but I was able to pick up a review pair of Ray-Ban Meta Smart Glasses from EssilorLuxottica, Ray-Ban’s parent company. I first got a demo of these at Qualcomm’s Snapdragon Summit and they have been widely covered in the press, but I am eager to spend some real time with them, especially as Meta rolls out new AI capabilities. They certainly look great.
Amazon’s booth at CES was mostly filled with Echo speakers and Alexa-compatible devices from third parties, but the section getting the biggest crowd was the latest Echo Frames. I got a pair of the Carrera-designed version just ahead of the show, so I didn’t need to push through the wall of people for a demo. Amazon omits cameras and displays; its smart glasses are essentially a pair of Alexa-enabled Echo Buds embedded into a glasses frame. The improvements this time around are design (the Carrera sunglass version looks like regular sunglasses, not a tech product), sound quality, and battery life. Echo Frames are basically perfect for listening to podcasts while walking a dog around a suburban neighborhood, but fall short on utility otherwise. Earbuds are better for commuting, and there aren’t enough computing or imaging use cases to supplement your phone.
Enterprise Things I Put On My Face
Samsung and Google were listed as the launch customers for Qualcomm’s Snapdragon XR2+ Gen 2, but neither company has released any details whatsoever. At CES, Sony announced that it will be using the Snapdragon XR2+ Gen 2 to power an unnamed Android-powered VR headset aimed at corporate design professionals, supposedly launching later this year. It publicly showed off a render of a flip-up headset alongside a ring and handheld controller duo used for creating 3D models and collaborating in VR on design reviews.
Sony is targeting a subset of Apple’s use cases, completely ignoring the consumer market (in an alternate universe, Sony would be launching the VRman and we’d be watching to see how Apple and Nokia would respond). Sony is not planning to sell the headset on its own, but as part of solutions offered by its partners. The sole software partner Sony listed at CES is Siemens, which will include the headset as part of a package with Siemens NX Immersive Designer. This use case and go-to-market strategy suggest Sony will be making these in relatively small volumes, and it puts Sony in competition with Varjo and Campfire as much as Apple. Campfire makes a headset and software system specifically for design consultations. It’s not a direct comparison because Siemens is tackling more of the process, but for design review sessions Campfire’s software is refined, easy to use, and works on a range of devices, not just Campfire’s own headset. Sony was showing off an actual prototype of its headset behind closed doors at CES, but I did not get a demo, so I did not put it on my face.
Varjo already claims 25% of the Fortune 500 as customers, and it sells extremely high-resolution VR headsets for simulation and enterprise design and engineering. I covered Varjo’s XR-4 launch late last year, but was not able to get heads-on until CES. I was impressed with the sheer graphics capabilities you get with this system, which was tethered to a Windows PC with an Intel i7 processor and an NVIDIA 4000 series GPU. You can interact in real time with 5K-per-eye, fully 3D environments rendered on the fly, complete with ray tracing. That’s something you can’t do on an Apple Silicon M2. While future versions of the Apple Vision Pro will undoubtedly get more powerful chips, Varjo and NVIDIA are not standing still either.
AR passthrough was not as impressive as I had hoped, with a fair amount of distortion around the edges and significant latency in my demo. This was disappointing. Meta’s much less expensive Quest 3 has passthrough that isn’t especially high resolution and doesn’t like it when you move your hands near the headset, but it provides AR passthrough video clear and stable enough to play games without motion sickness. Apple Vision Pro is on another level: it can fool you into thinking you are looking at the real world. Despite its resolution and horsepower, the Varjo XR-4 cannot, at least with the current software build.
However, the XR-4 Focal Edition’s variable focus capability was mind-blowing. I wear progressive lenses because I am old, but in VR you typically just correct for distance; with passthrough you also typically have a single focal length, so looking at tiny text in the real world – say, on an instrument panel – is impossible. With the XR-4 Focal Edition and distance corrective lens inserts, I was able to look at things at any distance, and the system would compensate so that everything was in focus, even though my eyes on their own can’t see anything clearly at that distance. Amazing.
The most unusual productivity-oriented wearable at CES had to be the Sightful Spacetop, exhibited at Showstoppers. The system tethers glasses-with-a-microdisplay-screen to a laptop base that has no screen of its own. The use case mirrors a key Apple Vision Pro demo – creating multiple large, repositionable virtual displays for your computer. In Apple’s case, that computer is a MacBook Pro. In Sightful’s case, it’s a 2 lb wedge-shaped 5G device that runs Android and Sightful software. The idea is that you get flexibility, complete privacy, and constant connectivity. The contraption is powered by a Qualcomm Snapdragon XR2, and I’m due to get a review unit.
Accessibility Things I Put On My Face
Accessibility is the best use of technology, and some of my favorite demos at the show were aimed at solving problems with your eyes and ears.
Ocutrx Technologies developed a prototype OcuLenz AR headset to help people with macular degeneration literally read around their blind spots. This is another Qualcomm Snapdragon XR2 product. Once the unit is calibrated, whatever you look at gets split up and reassembled around the area where you have trouble seeing. This is obviously an extremely specific use case, and reading in a bit of a distorted bubble may take some getting used to, but, still, this was extremely impressive.
Showstoppers had a bounty of spatial computing this year. In addition to Sightful’s Spacetop and Ocutrx, Japan’s ViXion was showing off the ViXion01 multi-focal visor. This thin wearable definitely gives off Star Trek Geordi La Forge vibes, and pulls off the magic trick of correcting vision and focus at any distance. The (approximately $700) ViXion01 uses a time-of-flight sensor to measure how far away you are looking and then adjusts tiny lenses accordingly. Unfortunately, those lenses are really tiny. While I could see using these as an electronic jeweler’s loupe, regular prescription glasses, with your own brain and eyes doing the focusing at different distances, are much more practical.
The most mass-market accessibility product I demoed at CES was the upcoming Nuance Audio glasses with built-in, completely invisible hearing aids. Nuance Audio is a new brand EssilorLuxottica created after acquiring Israeli startup Nuance Hearing. The glasses are designed to address mild to moderate hearing loss by using head tracking and beam-forming microphones to amplify just the voice or sound of whatever you are looking at and then aiming that sound at your ears from the temples of the glasses. Nobody needs to know that you are using them, and they bypass the social signaling problem of earbuds (people assume that you can’t or don’t want to hear them when you wear earbuds). My demo in the EssilorLuxottica booth in the noisy LVCC North Hall was effective.
There are lots of new OTC earbuds and hearing aids being developed, but they often fail to reach the market. EssilorLuxottica has the cheat code for this type of product: the company owns nearly every major eyeglass brand and many major eyeglass retailers. It knows what type of glasses people want because it already makes the glasses that people buy. The Nuance Audio glasses will undercut hearing aid pricing but will still cost well over $1,000 (possibly closer to $2,000). However, EssilorLuxottica also has relationships with eye doctors, national health systems, and insurance companies, which should bring the end-user cost down. These should be available for sale in late 2024.
To discuss the implications of this report on your business, product, or investment strategies, contact Techsponential at avi@techsponential.com.