Apple presents more pieces of its future AR headset puzzle

This story is part of WWDC 2022, CNET’s full coverage from and about Apple’s annual developer conference.

What’s happening

Apple didn’t announce a virtual or augmented reality headset at its WWDC developer conference, but it did announce new software tools that push AR and social connection further.

Why it matters

Apple’s planned headset will likely work with Macs, iPads and iPhones, building on ideas already developed in the company’s software. Meanwhile, many other companies, including Google and Qualcomm, are advancing their AR ambitions.

What’s next

A VR headset might not arrive until 2023, although it’s possible it’ll be announced sooner.

Anyone who followed Apple’s WWDC developer conference last week for news on its long-awaited AR/VR headset must have been disappointed. The event had news on the Mac, iPad, iPhone, Apple Watch, the smart home and even cars, but AR was barely mentioned.

After reports of delays, Apple may not end up releasing this headset until 2023. That could mean another year of waiting… or it could mean an event later this year previewing what’s to come. Right now there are still a lot of unknowns, but that doesn’t mean there weren’t clues, along with new software that looks very useful for an upcoming headset.

Apple already has a well-developed set of AR tools for the iPhone and iPad, including depth-sensing lidar scanners that can build the kind of real-world “mesh” that VR and AR headsets need to layer virtual objects convincingly. There’s a set of tools that recognizes text and objects from camera feeds, much like Google Lens. The Apple Watch even has accessibility-focused gesture recognition features.
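For a sense of what that lidar “mesh” looks like in code, here is a minimal sketch (my own assumed setup, not from the article) of how an iOS app opts into ARKit’s lidar-based scene reconstruction:

```swift
import ARKit

// Build a world-tracking configuration with lidar scene reconstruction
// enabled. This only works on lidar-equipped iPhones and iPads, so we
// check support first and return nil otherwise.
func makeMeshingConfiguration() -> ARWorldTrackingConfiguration? {
    guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else {
        return nil
    }
    let configuration = ARWorldTrackingConfiguration()
    configuration.sceneReconstruction = .mesh  // or .meshWithClassification
    configuration.planeDetection = [.horizontal, .vertical]
    return configuration
}
```

Running this configuration on an `ARSession` causes `ARMeshAnchor` updates to arrive through the session delegate as the device scans the room, which is the real-world mesh virtual objects can be anchored against.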

What WWDC 2022 showed was a handful of improvements that, the more I think about it, seem aimed at laying down some additional groundwork before a headset arrives.


SharePlay arriving in Messages makes it a connected social framework that could be useful in a VR headset.


FaceTime and Messages: Tools for Apple’s Metaverse?

As businesses have pivoted to metaverse messaging over the past year, it’s typically been code for massively redesigned cross-platform social interactions. “Social” is a strange thing for Apple, which is not a social media-focused company like Meta, Snap or Niantic.

Apple has FaceTime and Messages, which form a proprietary network on Apple devices that could be the glue for connecting with people on headsets. Apple’s SharePlay framework, introduced in 2021 with iOS 15, tries to make collaborating and watching or reading content with others more instantaneous. SharePlay’s reach has expanded in iOS 16. Much of it looks like the kind of adhesive that Apple’s metaverse ambitions need.
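Under the hood, SharePlay experiences are built on the GroupActivities framework. The sketch below shows the general shape of that API; the “Watch Together” activity name and title are hypothetical, but the `GroupActivity` protocol, its metadata, and the activation calls are the real iOS 15-and-later API:

```swift
import GroupActivities

// A hypothetical shared activity. Conforming to GroupActivity is all
// SharePlay needs to offer this experience over a FaceTime call.
struct WatchTogetherActivity: GroupActivity {
    var metadata: GroupActivityMetadata {
        var metadata = GroupActivityMetadata()
        metadata.title = "Watch Together"  // hypothetical title
        metadata.type = .watchTogether
        return metadata
    }
}

// Ask the system whether to start the activity for the group, then
// activate it so every participant's app joins the shared session.
func startSharedActivity() async {
    let activity = WatchTogetherActivity()
    switch await activity.prepareForActivation() {
    case .activationPreferred:
        _ = try? await activity.activate()
    default:
        break
    }
}
```

On a headset, the same activation flow could presumably place a shared experience in front of everyone on the call, which is why SharePlay looks like groundwork.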

Apple already has Memoji avatars, and it keeps adding more sharing tools that link apps and collaborative content through Messages and FaceTime. These features, added in iOS 16, iPadOS 16 and MacOS Ventura, might make it easier to share things on phones and Macs, but on a headset they might be essential shortcuts for quickly connecting with others. Meta’s VR headsets rely on friends and party-based connections via Messenger for social features; Apple could take the same route. SharePlay coming to Game Center, Apple’s neglected social gaming hub, seems like an equally useful tool for future cross-play experiences.


RoomPlan scans a large room, along with its furniture, creating a 3D scan.


Could RoomPlan be a stepping stone to mixed reality at home?

Apple announced a stealthy new tool in its upcoming ARKit 6: a room-scanning technology called RoomPlan. At first glance, the feature looks like Apple’s own version of lidar-based room scanning, similar to what other companies have developed. The tool recognizes furniture, walls, windows and other elements in the room, and quickly creates a 3D scan of a space for uses like home improvement or construction.
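A rough sketch of driving the new API might look like the following. The wiring here is my own assumption rather than Apple’s sample code, but `RoomCaptureSession`, its delegate callbacks and the `CapturedRoom` type are the RoomPlan API shipping with iOS 16:

```swift
import RoomPlan

// Drive a RoomPlan scan and observe recognized room elements.
// Requires a lidar-equipped device running iOS 16 or later.
final class RoomScanner: RoomCaptureSessionDelegate {
    let session = RoomCaptureSession()

    func start() {
        session.delegate = self
        session.run(configuration: RoomCaptureSession.Configuration())
    }

    // Called repeatedly as RoomPlan recognizes walls, windows,
    // doors and furniture while the user walks the room.
    func captureSession(_ session: RoomCaptureSession,
                        didUpdate room: CapturedRoom) {
        print("walls: \(room.walls.count), objects: \(room.objects.count)")
    }

    // Called when scanning ends; the raw data can then be turned into
    // a final 3D model for home improvement or, speculatively, as a
    // layout for mixed-reality experiences.
    func captureSession(_ session: RoomCaptureSession,
                        didEndWith data: CapturedRoomData, error: Error?) {
        // Hand data to RoomBuilder to produce the finished CapturedRoom.
    }
}
```

It’s that structured output, a labeled model of the room rather than a raw depth mesh, that makes RoomPlan interesting for the mixed-reality ideas below.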

Or, perhaps, it could enable new forms of mixed reality. “Consider using people’s environments as the canvas for your experiences,” Apple’s WWDC developer video detailing RoomPlan explains, adding that you can “even embed people’s spaces into the game you’re building.” While a lidar depth map could already overlay AR objects, what this new technology could do is bring a room’s layout into virtual reality and overlay virtual things onto it. I saw this idea a while ago in VR headsets that used cameras to scan environments and bring them into VR, creating a sort of mixed-reality feel, and it might be exactly the kind of mixed reality that Apple’s long-awaited camera-equipped VR headset could start to enable.


Background video quality for AR will improve in ARKit 6.


4K video for AR sounds like a path to headsets

Another new feature in ARKit 6 seems notable: AR effects will now work with 4K video input. It’s an odd feature at first glance for phones, which have screens too small to appreciate 4K AR, though it might be useful for capturing video with overlaid AR effects. But increasing the video quality for AR would be extremely useful in VR headsets that use onboard cameras to combine VR with a video feed of the outside world, a technique called mixed-reality passthrough.
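Opting into the higher-quality feed is a small change in code. This is a hedged sketch of the new ARKit 6 property, which returns nil on hardware that can’t supply a 4K camera stream:

```swift
import ARKit

// Request ARKit 6's 4K video format when the device supports it;
// otherwise fall back to the default camera format.
func make4KConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    if let fourKFormat =
        ARWorldTrackingConfiguration.recommendedVideoFormatFor4KResolution {
        configuration.videoFormat = fourKFormat
    }
    return configuration
}
```

A sharper camera feed matters little on a phone screen, but it’s exactly what a passthrough headset would need to make the outside world look convincing.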

VR headsets that use passthrough mixed reality, like the hardware Apple is expected to eventually release, rely on a high-quality video stream to make superimposed virtual objects look realistic. On the Oculus Quest 2, this video stream is grainy and black and white. On the professional-grade Varjo XR-3, where I had a chance to try more advanced passthrough mixed reality, it’s in color and much higher resolution.

The new ARKit features also appear to be faster at recognizing room details for quick AR overlays and motion capture tracking. Both would also be useful and necessary in a headset with AR functionality.

A 3D map view of an airport.

Apple is adding more cities to its set of enhanced, location-based augmented reality-ready destinations in Maps.


Apple is growing the list of cities where location-specific AR could work

A number of companies have recently expanded their mapping initiatives to work with AR so that future glasses can recognize “persistent” AR objects in real-world locations. Google is expanding AR to work in many of its Street View-compatible Maps locations, and Niantic is building a crowdsourced map of playable areas for AR games. Snap has been scanning cities with lidar. Apple has scanned cities with lidar too, but only a handful. More are being added this year, but that means there are only certain places where location-specific AR will work reliably with Apple’s AR toolkits. Apple isn’t expected to have everyday wearable AR glasses for a while, and that makes sense: aside from concerns about battery life, cost and safety, AR glasses will need a global mapping framework, which is only half-built at the moment.

There’s still a lot we don’t know

Despite numerous reports of an imminent headset (or one arriving in 2023), we still don’t know anything definitive about what Apple has planned. These bits of speculation about Apple’s new software tools are hardly proof of anything… but they do show that many of the necessary pieces are coming together in plain sight.

