XR Developer News - June 2025

This might just be the longest edition of XR Developer News yet, with Apple's WWDC and Augmented World Expo happening almost simultaneously. Lots of interesting news, so I hope this roundup is useful!
Apple
WWDC is always fun. Apple announces relatively little on the software and developer side throughout the year, but once a year the floodgates open to spectacular effect. This year was no different.
While Vision Pro is still the only XR hardware Apple offers and there have been no real signs of any new iterations of Vision hardware, it's obvious from looking at WWDC that Apple is still full steam ahead on visionOS. They're clearly in it for the long run, building a solid operating system and developer platform.
Amongst the many small tweaks in visionOS 26 there was also a surprising number of pretty major features and improvements, making for quite an interesting set of announcements. The areas where Apple is ahead of Meta in shipping actual software, like Personas (Codec Avatars at Meta) and Widgets (the ill-fated Augments at Meta), are especially strong indications that Apple is serious about competing in this space.
WWDC of course always kicks off with the keynote. The visionOS section is around 8 minutes starting at 1:06:20, and worth watching as a whole.
There is, however, a wealth of deep-dive sessions, which have also been published.

While I won't go into full detail on each visionOS session here, I've tried to put together a bit of a viewing guide, so you can more easily find the sessions that are interesting to you.
- Mandatory viewing is the overview session What's new in visionOS 26, which expands the 8-minute keynote section into a more in-depth 40-minute overview. It covers many different topics, such as on-device LLM support for things like speech-to-text, faster 90 Hz hand tracking, co-located mixed reality, improvements to building for the web and support for a range of new 3D-native video formats.

- Then there are a few fundamentals sessions for anyone building native visionOS software in Xcode and Swift rather than in Unity or on the web. What’s new in RealityKit and Set the scene with SwiftUI in visionOS are great if you're active in that area, a bit less so if you're on other development stacks. Still, it's interesting to see where Apple is taking the native approach.
In general it's always fascinating to see Apple completely ignore Unity, Unreal and the like at WWDC, as if they don't exist and native is the only way to go. It always feels a bit tone deaf. I haven't seen any news from Unity on visionOS 26 yet, but that will undoubtedly follow, as Unity keeps actively working on its visionOS support.
- From there it goes into more specialised topics. A good example is Explore spatial accessory input on visionOS, which covers third-party controllers like the PS VR2 Sense controllers (remember when Apple was dismissing those as clumsy?) and the new Logitech Muse stylus.
- The same goes for Share visionOS experiences with nearby people, which shows how powerful co-located and remote multi-user experiences are starting to become. It's a typical example of something which is not that useful right now, with the very small number of Vision Pros in the wild, but will be great in a few years.
- What’s new in Metal rendering for immersive apps goes pretty hardcore into technical details, but also covers one of the major new features: rendering on a Mac and streaming the result to a Vision Pro, so you can make use of the full graphical power of the Mac. Streaming from a Windows PC has been a thing on Quest for a long time, so it's nice to see the equivalent arrive on the Mac as well.
- If you are into web development for visionOS, you have to check out What's new for the spatial web and read the relevant blog post and deep dive post.
- I've always been a bit conflicted about the web on visionOS, because Apple doesn't support true mixed reality WebXR experiences with video passthrough. What's allowed is limited to 2D windows in space with a few 3D model pop-outs, or fully immersive virtual reality scenes without passthrough. Yet true mixed reality is exactly the core use case of visionOS, so it's strange to see it disallowed. By handicapping web support on visionOS this way (which Meta and Google don't do on Horizon OS and Android XR), Apple makes the web a much less interesting direction on the device for now. A small feature-detection sketch after this viewing guide shows what this looks like in practice.
- For anything to do with enterprise apps, check out Explore enhancements to your spatial business app.
- There was a surprising number of talks on video support, including the starter session Explore video experiences for visionOS, followed by deeper dives into the coding side in Support immersive video playback in visionOS apps, the file formats in Learn about the Apple Projected Media Profile and the true high end of video options in Learn about Apple Immersive Video technologies. It's less my cup of tea, but should be fascinating if video is your focus.
Finally, on the design side I'd recommend two sessions:
- Design widgets for visionOS goes into great detail on the new widget system. Somehow Apple's design chops are always impressive to see. It's a typical Apple system, in that it's not very flexible with a lot of guardrails, but it just looks so damn nice.
- My main doubt with widgets is that they really shine as a passive thing, which is just there when you're using a Vision Pro for another task. And honestly, I think with the limited comfort of the current Vision Pro, it's mostly a device you put on for targeted use cases and take off after, so there isn't that much passive time with the headset on in which those widgets are truly useful. But that still leaves it as a really nice foundation for when comfort improves and wear times increase in future hardware iterations.

- The other design session worth watching is Design hover interactions for visionOS, which also shows the great care that goes into the little details in visionOS. It's a joy to see the level of polish of these things.

- Finally, if you're into the specialised topic of custom environments, check out Optimize your custom environments for visionOS; for most people it won't be one to watch.
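Coming back to the web point above: the practical consequence of Apple's WebXR stance is easy to see with the standard WebXR feature-detection API. Below is a minimal sketch in TypeScript (the small XRSystemLike interface is just a local typing convenience, not part of any SDK); as of this writing you would expect visionOS Safari to report only 'immersive-vr' as supported, while the browsers on Horizon OS and Android XR also report 'immersive-ar', though the exact behaviour of course depends on browser version and settings.

```typescript
// Minimal local typing for the slice of the WebXR API used here,
// so the snippet compiles without extra type packages.
interface XRSystemLike {
  isSessionSupported(mode: "immersive-vr" | "immersive-ar" | "inline"): Promise<boolean>;
}

async function checkImmersiveSupport(): Promise<void> {
  const xr = (navigator as Navigator & { xr?: XRSystemLike }).xr;
  if (!xr) {
    console.log("WebXR is not available in this browser.");
    return;
  }

  // 'immersive-ar' = mixed reality with passthrough, 'immersive-vr' = fully immersive.
  const [arSupported, vrSupported] = await Promise.all([
    xr.isSessionSupported("immersive-ar"),
    xr.isSessionSupported("immersive-vr"),
  ]);

  if (arSupported) {
    console.log("Passthrough mixed reality sessions are supported here.");
  } else if (vrSupported) {
    console.log("Only fully immersive VR sessions are supported (no passthrough).");
  } else {
    console.log("No immersive WebXR sessions are supported in this browser.");
  }
}

checkImmersiveSupport();
```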
It's worth spending a bit more time on the new version of Personas. In what I can only describe as typical Meta style, Apple has managed to push out a major and very impressive improvement to Personas on the same hardware, purely through software updates. It has been fun to see the response.
For more solid roundups and analysis, check out the ones by Justin Ryan and David Heaney.
The beta of visionOS 26 is available now, and it will launch in the usual September/October timeframe.
Snap
The other big event this month was the mother of all XR conferences: Augmented World Expo (AWE). There was a stream of news from the conference and Snap took a very prominent spot within that.
Both founders had a keynote slot, each with interesting news. Evan Spiegel started off and dropped the news that in 2026 Snap will launch Specs, the next iteration of Spectacles, and that (in contrast to 2024's Spectacles) it will be sold to the general public.

Another important announcement was a partnership with Niantic, bringing Niantic's mapping technology to Lens Studio, Snap OS and Spectacles. This is pretty major news and a very sensible partnership, as both parties are very complementary to each other. I'm really looking forward to seeing this roll out.
And the announcements kept on coming: Snap had been hinting for a while that this was on its way, and CTO Bobby Murphy confirmed in his keynote that WebXR support is coming to Snap Spectacles 'later this fall'. More pretty major news and a really good step. Spectacles and Snap OS are really starting to shape up well.
Moving from the future back to the present, there was also a bunch of practical news from Snap this month:
- Lens Studio 5.10 (announcement), 5.10.1 (focused on Spectacles) and 5.11 were released. The 5.10.x releases in particular were very heavy on AI and machine learning features.

- Snap OS on Spectacles received the June update (v5.62) in parallel to Lens Studio 5.10. In addition to AI features, the addition of Bluetooth Low Energy support is an interesting one.
- Snap released a mobile app and web tool to generate Lenses, aimed at a broader audience than developers.
- Alessio Grancini from the Snap team has been pumping out interesting videos on Snap Spectacles development like there's no tomorrow, for instance on specialist topics like SnapML, but also general things like a Spectacles development overview and an intro for Unity developers. Check out his channel for more interesting videos.
Niantic
Let's jump to Snap's new partner Niantic. In the midst of its transition into Niantic Spatial, CTO Brian McClendon gave a keynote at AWE which unfortunately hasn't been published yet as far as I can tell. Niantic did publish a blog post though with the main announcements.
To dive into some specifics this month:
- v3.14 of the Niantic Lightship Unity SDK was released with some small updates.
- It seems one of the casualties of the transition to Niantic Spatial is the Niantic Maps SDK. Niantic recently sent out an email saying it will be sunset in October, with a deprecation roadmap rolling out over the coming months.
- Another change seems to be a rebranding of Niantic Studio to 8th Wall Studio. This seems to have happened pretty silently over the last two months or so. It makes complete sense though, as Studio is very much an 8th Wall product, and the new name actually fits better than the previous one.
- Meanwhile 8th Wall Studio received a bunch of updates which added support for Prefabs and a range of other things (video).

Meta
While Apple and Snap were present in a big way, Meta was relatively quiet this month, at least when it came to big splashes. Of course the new Oakley Meta glasses were announced, but I won't cover those here as they don't have an XR component. There was plenty of smaller news though:
- Meta published a blog post with a quarterly developer recap. Interesting to read, because it contains something for almost anyone, whether you're building in Unity, Unreal or on the web. Understandably, that blog post glossed over the latest drama on developer platform quality.
- The documentation for the Mixed Reality Utility Kit has been overhauled. While you're at it, check out this new video by Roberto Coviello on colocated experiences in MRUK.
- Meta's Spatial SDK, which allows native Android developers to build for Horizon OS, has received a number of major upgrades since its initial release. Dilmer Valecillos covered the improvements in a new video as well.
- I'm still mightily annoyed that Meta and Google are developing competing Android frameworks for this use case, putting Android developers in the strange position of having to build XR support into their Android apps twice, in two different ways, if they want to support both Horizon OS and Android XR.
- If you're developing for Quest on web, make sure you check out the new possibilities around colocation and payments.
Google, XREAL & Lynx
To get back to AWE, Google had a presence there as well in several forms. It held its own keynote, which was mostly a repeat of earlier announcements, except for the mention that it is planning to release an SDK for the prototype glasses it has been demoing (around minute 11 in the video) and that Qualcomm's Snapdragon Spaces is basically being folded into Android XR (minute 16).
It's still going to be a while until Android XR really kicks off, with the latest rumours putting the launch of Samsung's headset in the September/October timeframe, but Google is definitely pushing forward. And so is Godot.

XREAL was also present at AWE, but its Project Aura Android XR device is not coming until 2026 and wasn't available for testing at AWE. So far it seems like it's just renders. In a similar vein, Lynx is also targeting 2026 for its Android XR device.
Other XR hardware, software and news
- Unreal Fest was held this month, and although Epic doesn't focus much on XR and it doesn't really feature in the keynote, the keynote is still worth watching to keep up to date with what's happening on Epic's side of the fence, for instance with Unreal Engine 5.6.
- Pico released v3.2 of its Unity and Unreal SDKs.
- Open standards body Khronos Group keeps on building out the very useful OpenXR standard, increasingly with augmented/mixed reality features. This makes every developer's life easier, so it's great to see.

- To close off, first some sad news: MIXED announced that it is shutting down. I was very sad to hear it, because it has been a very thorough and high-quality publication in the space. All the best to the editors!
- Then the happy news: I've joined DNA.inc as Head of XR and will be working on building out the XR team there. Really looking forward to it!
Upcoming XR events
- August 24-26, 2025 - AWE Asia in Singapore.
- September 17-18, 2025 - Meta Connect.
- December 8-10, 2025 - United XR Europe in Brussels, Belgium. A new event resulting from a merger between AWE EU and Stereopsia. Call for Speakers closes July 15.
A bit about this newsletter
Each month I try to round up all the interesting developments in the XR developer landscape: new hardware and software releases, events, interesting tooling, and so on. Feel free to reach out to me on LinkedIn, for instance if I missed anything that definitely should be in next month's roundup.
Want to know even more about XR Developer News? Now you can!