XR + AI: BATTLE OF THE GIANTS
I just came back from Meta Connect 2025, and it feels like the perfect time to reflect once again on the state of XR and the global strategy here at Tab To Tap.
Fair warning: this post is going to be mostly my predictions and analysis of the current XR landscape. I might end up being totally wrong, so feel free to tag me on social media and debate this!
Native Development is the way to go
I created Tab To Tap in January, and my goal has always been to be the go-to guy for native development on XR platforms. That decision came from years of Unity3D development and deep frustration with the tooling available for building apps that need a polished, beautiful user interface. The mature tooling from mobile development was a godsend when visionOS was released, and I was more than happy to jump on it and say no to Unity's new pricing, especially since PolySpatial, their solution for targeting visionOS, was only available on the Pro plan.
Since June 2023, my investment in native development for XR devices has been mainly focused on the visionOS platform and the Apple Vision Pro, which I view as the best platform for mixed reality development, or spatial computing, as Apple calls it. For a few months in 2025, I also tried to target companies interested in AndroidXR on the Google side and in the Meta Spatial SDK on Meta's side. But as I quickly found out, there was a clear lack of interest, and it didn't help to be targeting so many platforms at once. That lack of interest, in my opinion, comes mainly from the fact that no AndroidXR device is publicly available yet, as we're still waiting for the launch of Project Moohan. The Meta Spatial SDK is also still pretty new, and many developers on Meta's side are still Unity developers who won't easily move to it, so Meta will instead have to turn existing Android mobile developers into native XR developers. I think they have a great team working on this SDK, and I really wish Meta had pushed it more during Connect 2025.
Is Apple behind?
I often find myself in debates about how Apple seems so far behind in AI, or how they completely flopped with the Vision Pro. It's obviously not reassuring to see so many leaders at Apple depart, or their top engineers leave to join Meta or OpenAI. And the general public's lack of enthusiasm for the Vision Pro can be just as discouraging. Both are valid concerns, but let's discuss what Apple is clearly doing well on both fronts.
There is a lot to say about their AI. They're aware of the shortcomings, and I do think they can turn things around, as I said in a previous post. But to sum it up: a lot of people are really underestimating their investment in running AI locally. They're doing two things very well to achieve that.
First, the Foundation Models framework, which gives apps on-device access to Apple's large language models, is a great example of things to come on the software side. A lot of companies will see their business collapse once there is no point in paying a subscription for a "general" AI SaaS service. Apple is also well positioned, with its deep commitment to privacy through Private Cloud Compute (PCC), to win over users who deeply distrust Meta given its history with data privacy, especially on the B2B side, where the Vision Pro is actually getting some traction.
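To give a feel for how low the barrier is, here's a minimal sketch of calling the on-device model with the Foundation Models framework, as presented at WWDC25. The prompt and function name are my own illustration; the point is that there's no network call, no API key, and no subscription involved.

```swift
import FoundationModels

// Hypothetical helper: ask the on-device model for a one-sentence summary.
// LanguageModelSession runs entirely on the user's device.
func summarize(_ text: String) async throws -> String {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Summarize this in one sentence: \(text)"
    )
    return response.content
}
```

A few lines like these replace what used to require a paid cloud API, which is exactly why I think some SaaS business models are in trouble.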
And secondly, let's not forget that all of this is only possible because of their investment in their own silicon: the A- and M-series processors, the C1 and C1X cellular modems, and now the Apple N1 chip, which brings Wi-Fi 7, Bluetooth 6, and Thread to all the latest iPhones. They are building the best hardware to run AI in your pocket.
And outside your pocket, if on top of that we look at their work on the iPhone Air, they have every ingredient to make a great Vision Air, but also another XR product that is a much better fit for the mass market: glasses.
Because let's be honest, headsets will stay a niche (a very promising niche, especially in entertainment, productivity, and the enterprise), but glasses are what the masses really want and will proudly wear around their peers, outside and inside.
Glasses are here, is Meta the new Apple?
With Meta announcing their Meta Ray-Ban Display, I saw a lot of comments saying that Apple is once again doomed. Meta's "move fast and break things" approach does seem very much alive, as we saw again with the multiple failed demos of the device during Connect 2025. It's every entrepreneur's and developer's nightmare to see that happen. So obviously I feel bad for everyone who worked hard on this, but it's also symptomatic of what Meta is still missing: good, stable software. Something Apple has been much better at providing so far.
But in their defense, we can applaud the fact that they're taking risks, even though it looked like a very messy Connect in my opinion. Some people do enjoy this more human aspect compared to the highly polished Apple keynotes. So I guess these are two ways of doing things that will appeal to different people. Meta quoting Steve Jobs during the keynote also sent a strong signal that they're aiming to beat Apple on this new terrain. But I think they still have a long way to go before that ever happens, and I doubt they'll ever be seen as the new Apple. They can, and should, be their own thing.
On the other hand, that does not mean Apple shouldn't take more risks and do things differently. If they keep doing things the way they do today, they risk looking like they only care about elderly people and health. Those are subjects that are less than sexy for younger generations, but at the same time it's a very sound strategy given the aging population worldwide. One critical thing to remember is that glasses are often a fashion statement, something Meta truly understood by partnering with EssilorLuxottica to get the Ray-Ban and Oakley brands.
But that partnership does not fix Meta's big weakness: they don't have their own ecosystem and currently rely on Apple and Google. Given the $800 price point of the Display glasses, I bet most of the people who own them will have iPhones. If you looked carefully around Meta Connect, most attendees were iPhone users too. So in the end, Apple will still be part of this new AI-glasses chapter even if it isn't releasing hardware at the moment. And in Meta's defense, they have sold millions of Meta Ray-Ban units so far, so they're on a very good path. While they may not own an ecosystem, they own the social media platforms where the photos and videos taken with these devices will be shared.
But even if Apple does release glasses, can its brand win against a well-established one like Ray-Ban? After all, even at WWDC you could see plenty of people wearing Meta Ray-Bans. So whether they like it or not, Apple and Meta will be relying on each other to push the future of glasses for a while, at least until Apple releases glasses of its own, and then we'll see more clearly which path consumers take.
Developer ecosystem is key
Having good hardware will not be enough. If you want a thriving ecosystem, you need developers on board. We're still a long way from having an AI generate apps on the fly for you! (This take might not age well.)
Both Apple and Meta will need developers paving the way for them. Meta will launch their Meta Wearables Device Access Toolkit soon, but it will be very limited for the time being: it will basically require an iOS/Android app requesting access to the glasses' camera feed and microphone. We will not be able to build apps that run on the glasses directly. This isn't very surprising, because I don't think the current glasses are powerful enough to run actual apps. So apps on glasses will be extensions of your existing apps, meaning they will rely heavily on iOS, Android, and your phone. The good news, though, is that all Meta Ray-Ban glasses are compatible, so you can already target millions of devices.
It's my opinion that Apple has been preparing us for this with App Intents and features like Live Activities. The first glasses they release might have fairly limited capabilities, similar to what we're seeing with the Meta Ray-Ban. As developers, we will first have to use the glasses to improve the user experience of apps that rely heavily on camera streams and microphones, so it makes sense that live streaming and sports are the first use cases to really shine with these glasses, especially since the first version of the SDK does not allow use of the display at all. And according to the rumors floating around, Apple might also release glasses without a display first.
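Why do App Intents matter for hypothetical glasses? Because an intent exposes an app action to the system, so Siri or any future lightweight surface can trigger it without launching the app's UI. A minimal sketch (the intent and parameter names here are my own invention for illustration):

```swift
import AppIntents

// Hypothetical example: expose a "start workout" action to the system.
// Any system surface (Siri, Shortcuts, and perhaps one day glasses)
// could invoke it without opening the app on screen.
struct StartWorkoutIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Workout"

    @Parameter(title: "Workout Name")
    var name: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // App-specific logic would go here.
        return .result(dialog: "Starting \(name).")
    }
}
```

If you structure your app around intents like this today, a display-less or glanceable device becomes just another surface that calls them, which is exactly the kind of groundwork I suspect Apple has been laying.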
Preparing for the future
So with everything said, what to do now? Here are my key takeaways and my strategy for the foreseeable future:
I will continue to focus on native development. It's even clearer to me that Unity's position as the preferred tool for XR development is dying, as every platform is pushing its own tooling, and cross-platform development becomes less and less interesting for a solo developer or small company. For genuine cross-platform needs, I see a switch to Godot over the next few years, if the engine is mature enough by then.
Apple will remain my platform of choice for development, for all the reasons I gave in this article. The Meta Ray-Ban Display will be a nice dev kit for prototyping the future of glasses, and I'll play around with their iOS SDK. I'm also getting the Halo from Brilliant Labs this year!
I'm doing most of my business with the Vision Pro currently, and what will be key to consolidating that business is the reception of the rumored refreshed Vision Pro, which will hopefully arrive in the next two months. A few months ago, I joined STUDIO•84, a collective of companies working solely on immersive experiences for the Vision Pro, and we have much to show you in the future.