What Wasn’t Announced

And then there was the stuff Meta didn’t talk about.

Content

If you’ve been following the conversation around Connect this year, a major theme is that folks were let down by a lack of major game updates and announcements. Neither of the two biggest games on the horizon, Assassin’s Creed Nexus VR and Asgard’s Wrath 2, will be ready in time for the launch of Quest 3. Players will need to wait until November 16th, and December 15th, respectively, to get their hands on those two anchor titles.

Meta showed off Assassin’s Creed during Connect, but impressions I’ve seen have been so-so up to this point and the hype seems rather muted for such a big franchise. I don’t think anyone can really tell at this point if the game will be a hit, a flop, or ‘just ok’.

And then there’s Grand Theft Auto: San Andreas VR, which Meta didn’t even mention at the conference. Considering the game was announced two years ago now—and it isn’t looking ready for the 2023 holiday season—people are starting to get worried that it might never arrive.

The biggest game news at Connect was the announcement of LEGO Bricktales, which is certainly a good IP to add to Meta’s store. But the game was revealed awkwardly on stage as part of a sizzle reel without much fanfare. As a result, I’ve seen minimal discussion about it surrounding the event. It’s also not launching until December.

All of that leaves Quest 3 without any immediate killer launch content. There will be plenty of fun games to explore for those who are brand new to VR with Quest 3, but for the most part players will be diving into content that was made with the three-year-old Quest 2's limitations in mind. Those who have already taken the plunge will be waiting a month or two for new games and graphical enhancements for their shiny new Quest 3.

Looming Debt

Photo by Road to VR

The overall UX/UI situation has been one of the worst parts of the Quest headsets for years now. The company keeps putting out industry-leading hardware, but has seemingly little vision for how to make the software experience seamless. From the friends and family around me who bought their own Quest 2 over the years, I've seen the struggle that a poor UX can bring. Basic things like watching your own video content, finding and connecting with friends, or getting quickly into apps are far harder to do than they could be. The interface is surprisingly buggy, stuttery, internally inconsistent, and regularly changes.

In short, the Quest UX/UI needs a radical overhaul, but Meta continues forward with meandering changes that don't address core issues. Overall, its announcements (or lack thereof) imply the company is unaware of usability issues with its headset that could be fixed in software.

And now, before the company has really even settled on a clear interface paradigm, new distractions are sure to complicate things.

Meta has been slowly introducing ‘direct touch’ into the Quest interface, allowing people to touch the interface panels like a touchscreen rather than use laser pointers at a distance. On Quest 3 it seems this might be enabled by default.

And while I’m the first one to hate on laser pointers in VR, the company has seemingly lost the institutional knowledge of Oculus Dash—otherwise Meta would recall that it already made the mistake of building an interface that didn’t know if it was supposed to be made for fingers or laser pointers, and thus ended up working poorly for both. I’m betting we’ll see similar issues arise from the current Quest interface.


Meanwhile, the company wants to introduce ‘Augments’, little apps that live in the room around you, which will likely act as a spatial interface that’s very different from the current set of flat panels that make up the headset’s UI. Without a clear direction to guide this mix of flat and spatial design, users are only going to be more confused by an increasingly inconsistent set of interactions.

This accumulating UX/UI debt is very likely to come due when Apple rolls out Vision Pro next year. While the headset is in a different price class entirely, Apple has clearly thought about the core interface experience and social underpinnings of its headset—and given developers tools to build along a consistent set of rules. And Meta won’t be able to ignore this for long.

– – — – –

Overall, it’s exciting to have a new and impressive headset hitting the market. It’s good for everyone that Meta has had the backbone to keep XR alive while people collectively figure out where it does and doesn’t fit. But this is the last time the company will get to release a product into a domain that it singularly dominates. With Vision Pro finally about to deprive Meta of the ability to set its own pace, you can expect the XR landscape to look quite different by the time Connect rolls around next year.




Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • Andrew Jakobs

    Again, the video for IOBT with the article doesn't show what it actually does for upper-body/elbow/arm tracking, unlike another video on the Meta developer YouTube channel. Legs are just a part of it, but the upper-body tracking is really interesting. I tried to post the link to the YouTube video with the original article, but it was never approved.

    • Arno van Wingerde

      I wonder if a few very cheap motion sensors, with a band around the leg at the knees and feet, would not help the “problem” for those who need it, e.g. for a dance app. For me, putting them on would be too much effort in most cases, but the cost should be minimal if you just want to track 4 points.

      • There is the Sony Mocopi, which offers FBT via this approach, but it has drift problems from what I’ve heard.

        Eons ago there was this really cool project called TelariaVR. It used a strap, IMU, and pressure sensor on the bottom of the foot, which allowed for some really cool interactions. But it never went anywhere unfortunately. Here’s the proof of concept for it:

        youtu . be/hVZZ9JntP3o?si=PqXYUS_7kWu6NaQ5

      • Christian Schildwaechter

        TL;DR: Meta’s software based FBT will most likely help those using hardware based tracking more than hardware tracking will help those using Meta’s software based FBT, but effectively almost everybody using VR will significantly benefit from FBT getting more support and moving out of its niche.

        Adding cheap motion trackers probably would help with apps that already support FBT, but there are very few of these to begin with. The problem with current FBT is friction and default configuration. There are actually a lot of solutions available for many problems, but whenever you increase complexity and cost, you lose a lot of users. With all hardware based full body tracking systems, you have to make some significant investment in time and money.

        Adding Vive or Tundra trackers is the closest to a plug-and-play solution, where all you have to remember is charging the trackers, but you have to use lighthouse tracking and spend about USD 125 per tracking point. Sony’s Mocopi is quite user friendly in combination with their phone app, at about USD 75/tracked point in their standard six-tracker config, but as an IMU based system it has to be recalibrated regularly. The comparable SlimeVR core set comes down to USD 32/tracked point, with less polished software and the same IMU issues, but a lot of community support. And you can push that down to USD 15 by building them yourself and picking the cheapest components, plus a lot of work that can maybe be justified as a learning experience, but not as a money saving measure when applying minimum wage.

        Only very few users interested in FBT will be willing to effectively spend hundreds of dollars and/or varying amounts of time on setup and repeated recalibration. And of those, a lot will still get annoyed by having to strap several trackers to their limbs instead of jumping into VR within a few seconds. Consequently there is little incentive for developers to integrate FBT into their apps or games to appeal to these few users, and even less incentive to rely on it for game mechanics.

        This causes a typical chicken and egg problem, and condemns the tech to a niche. That’s even true for optional peripherals from the original manufacturer, e.g. the Gear VR initially relied on a small trackpad on the HMD, and only later versions got an additional 3DoF controller. But since the largest part of the user base didn’t have one, most apps never supported it.

        Making FBT (approximation) part of the default configuration on Quest via software solves a lot of that. For one, there is now a default option to integrate FBT into apps, so a lot of developers will start to experiment with it, even if it turns out to be inferior to current solutions. Once it gets integrated into more apps, SlimeVR trackers etc. should see a huge boost even though there is now a free alternative, simply due to there being more use cases. And the hardware trackers will probably still significantly improve the tracking in places where Meta has to rely on smart guessing, or provide an upgrade option for Quest 2 users stuck with only guesstimated virtual legs. Integrating it in a useful way into games will take much longer, as a lot of it will work best (or only) on Quest 3, and Quest 2 will dominate the install base for a long time.

    • Christian Schildwaechter

      I guess you are referring to “Get Moving: the Latest from Movement SDK” youtu_be/B-pN-UzpnT4 . It’s an interesting 30-minute video, though clearly targeting developers and rather slow paced; end users would probably prefer a 2-minute cut showing only the relevant demonstration parts.

      It is very unfortunate that ad based monetization now drives sites to make users stay as long as possible instead of referring them to other/original sources, and to toss out links in comments to discourage spam. Both break one of the fundamental principles of the web: linking between relevant information as a quick way for people to get a more detailed picture, based on their own interests and needs. Instead we now have to rely on workarounds like misspelled URLs to evade spam filters, and while I usually despise someone replying with “google it”, it is at least an option, as long as the title of the relevant page or video is included as a reference.

      The term micro-transactions is now mostly associated with predatory gaming monetization, but Tim Berners-Lee’s original concept of the WWW from 1989 included not only links, but also back-links and true micro-transactions, to allow content creators to get paid for the information they provide, instead of having to drown everything in ads. And I often wish both the back-links and the monetization had been properly established. I would be very willing to pay a few cents to access information that costs me a hundred times that in reading time when applying minimum wage, if I could get directly to the relevant information instead of first wading through tons of links to other articles, unrelated videos, walls of ads, and now also AI generated fluff.

      Just being able to see which other sources have linked to an article could get rid of a lot of clickbait and spam, and make linking to external sources more attractive to reputable news sites, as users could use those links as a filter to get to the good stuff. Without a standardized option for (sub-)cent transactions, we now have a web with fewer useful connections and a lot of extra effort spent sorting through the increasingly aggressive layers added to pay for the creation of “free” information. Or monthly subscriptions that only make sense if you use just a few information sources, but not for accessing a large network of connected information, as the web was intended to be.

  • Ad

    Hard not to be extremely pessimistic about the future of this industry.

    • The XR industry in the sense of productivity features in businesses, education and military/aviation will be doing fine.
      It’s the game and entertainment sector that’s still wonky.

      • Lucidfeuer

        Military yes, businesses and education what have you been smoking?

    • shadow9d9

      Considering how incredible pancake lens clarity is, combined with the potential for XR, plus ringless controllers, plus AV1 for wireless PCVR streaming… VR is more exciting than ever before.

    • Arno van Wingerde

      Sure, turnover going up by a measly order of magnitude in 3 years, from $6M/month to $60M/month, is a clear sign the industry is dying! You are one of those glass-half-empty types, I guess?

      • Traph

        60 million bucks a month – impressive, very nice.

        Now let’s see Reality Labs’ 2022 operating loss.

        Yes I realize this is not an apples to apples comparison, so please hold back the “um ackshually”. The larger point is that the Oculus VR market is heavily warped by Zuck Bucks and it’s astonishing to consider that another order of magnitude increase over the next three years would still put Meta ~10-15 years in the hole just from 2022 losses alone.

      • Lucidfeuer

        You clearly don’t work in this industry

  • Dragon Marble

    I don’t think Meta’s MR push is a “response” to Apple. These things take years to develop. It’s just that both companies see the same potential, and the technology is now good enough. Apple seems to be all in on MR while Meta is still testing the water.

    • Christian Schildwaechter

      I don’t think this was meant as Meta starting to develop tech in order to react to Apple. We don’t even know what they spend the USD 10bn a year at MRL on, but it is safe to assume that they have a huge number of projects we have never heard of, and only a few will ever make it to market.

      So at this point a reaction to Apple is less a new development and more picking things they had already developed internally, like the micro gestures or Augments, and making them publicly known. Which is fine in principle: a smart move for PR to prevent anybody pointing out features that AVP might have and the Quest 3 might not, and also cheap, as they very likely already existed somewhere at MRL.

      The problem is the lack of a consistent UI philosophy. I have no doubt that they have had Augments for a long time, as those are mostly an extension of the spatial anchors already available on Quest 2 for hanging virtual pictures on your MR walls. I was still somewhat baffled by them presenting the concept, as it mostly makes sense for an HMD where passthrough is basically always on, even during the use of applications. Which is the case for the AVP, where we haven’t even really seen apps using their fully immersive VR mode. But the Quest 3 first and foremost runs the same software as the Quest 2, and the UI is used mostly as a launcher to start apps that take over the whole view. An Augments clock or chat window or weather widget positioned in MR is of only limited use on Quest, so why is this feature getting so much exposure now, long before there is a reason for users to really stay in MR on Quest?

      Meta will probably be able to counter every AVP feature with something very similar from their own labs, but this may actually backfire. Apple is often criticized for not having certain features that e.g. Android offers, and it is regularly a deliberate design choice. They aim for a very consistent set of core features that will work flawlessly together in a way that intuitively makes sense, and things that don’t fit get cut, even if the users complain. That can be very annoying, but is the basis for their high scores in usability.

      In contrast, Meta is already somewhat known for its inconsistent UI/UX, with several reasons for that mentioned in the article. If, instead of first fixing the base, they now start throwing extra features on top of it, like Augments for the still unproven MR, or micro-gestures without the eye tracking based UI that Apple uses them for, they’ll just make it more confusing.

      A designer knows he has achieved perfection not when there is nothing left to add, but when there is nothing left to take away.

      Antoine de Saint-Exupéry

      • Dragon Marble

        There are at least two reasons to stay in Q3’s MR home: watching movies or playing Xbox Cloud games — especially if you want to do it together with someone sitting next to you.

        • Arno van Wingerde

          I already have a device that can do that: an OLED TV that blows the Quest 3 away. AFAIK it can also do Xbox cloud gaming.

          • Dragon Marble

            Not in 3D. Also, the TV I want is too big for my room.

          • JanO

            To my understanding, games will only display on a 2D virtual screen…. Where did you get that this involved actual 3D gaming?

          • Dragon Marble

            I was talking about movies. I don’t know about games. But they should add 3D support. Otherwise, I can’t really think of a flat game I want to play.

      • One thing that people very often overlook is the “it just works” mentality of Apple products. They don’t need to be first. They don’t need to be affordable (at least in first-gen products of a new category). They just make the overall sum of the parts better than the competition. That’s where the AVP will fit right in.

      • Arno van Wingerde

        I agree with your observation that the author portrays this as a reaction to Apple’s Vision Pro. It’s not like Mark saw the announcement and quickly included those features in the Quest 3…
        And the Quest 3 really is the first MR device, affordable or not; after all, the Vision Pro is not available, whereas the Quest 3 is… so the “affordable” part could also be a reference to the Quest Pro…

      • Lucidfeuer

        The sole fact that they probably trademarked this “MR” thing as a reaction to Apple’s headset shows how disgustingly mediocre and illegitimate Meta is. Augments, as an obsolete marketing artifice, show they’re going nowhere.

        I don’t even think Apple markets the Vision as “MR”; they seem to actually know better than to do so.

  • That is a great summary! Thanks for putting in your thoughts and comparisons to the AVP, especially in regards to the cursed UI.
    Yes, the air is getting thin for Zuckerberg. We will see Apple, probably Valve and also Samsung come swinging at them by the end of next year.

  • Dawid

    I am concerned about the Quest 3’s optimal head strap position. I have seen that for many people it pushes on and folds their ears. Even Mark Zuckerberg himself has this issue. I hope it is only incorrect adjustment and not a design fail.

    • Dragon Marble

      What I found (on Quest 2) is that you just need to push the interface (the hard part attached to the soft strap) up a little. It doesn’t affect the comfort at the back. People seem to forget that it is a soft strap and can be bent.

    • Octogod

      It has my favorite strap yet. It’s soft, yet easy to get a snug fit. I wouldn’t worry.

  • This is a great article, but I think it misses two things:
    1. The announcement of the Ray-Ban Meta
    2. The big attention Meta gave to AI. The Quest was a side dish, while the main dish was Meta AI, Llama 2, and the integration between AI and the Ray-Ban smartglasses. This shows a lot about Meta’s priorities at the moment

    • Steve R

      Agree, the new Ray Bans with AI assistant are a big deal.

      Yes, they are (correctly) giving a lot of attention/priority to AI. But Quest was still announced first in the keynote (they could have led with AI). I think XR is still getting plenty of attention/priority.

    • Christian Schildwaechter

      I wonder how much of the focus on AI was actually for investors. Connect is an XR developer conference, so the emphasis on AI was somewhat odd. Not the use of AI, which has driven a lot of Meta’s XR research for years, like the predictive body tracking or integrating neural chips into SoCs to power life-like mapping of scanned user faces onto generic avatars with very low power requirements.

      AI/machine learning now drives a lot of tech, because it can be a lot cheaper to train a network to correctly guess a result than to run hardware that actually calculates it, which will help a lot in the future with e.g. rendering at high resolutions. But that happens mostly in the background, and companies like Apple almost never explicitly mention it; it is just an implementation detail of new features, like the API or language used. Whenever it is emphasized, it is usually for marketing purposes, because everybody saw the results of ChatGPT and therefore now knows that this is “the future”.

      The same is probably true for Meta, whose massive spending at MRL has drawn a lot of criticism. So now they attach the more trendy AI to the apparent money pit XR, to make it look more appealing to their investors. And they actively avoided the term “Metaverse” due to it now being mostly associated with a sort of overhyped cloud-cuckoo-land. But I doubt that their strategy has really changed all that much. Becoming a/the dominant player on the potentially dominant medium of the future is still the target and will still take decades. AI has always been part of the toolbox, and Meta has released several very powerful open source products over the last few years, including the LLMs currently en vogue, and published research papers on how to use them not just to create text or images, but to create whole worlds as 3D geometry and populate them.

      It’s just that the public all of a sudden became aware that more AI is coming, and Meta is riding the wave of public interest to get investors to keep letting them burn through USD 10bn a year at MRL, because now it all will use AI, and that’s all they really want/need to know. The users will only notice the effects, like their legs staying in plausible positions instead of flailing around, and can blissfully ignore whether this is due to better camera tracking, smarter IK, improved apps, machine learning, or a combination of all of them.

    • Lucidfeuer

      Nobody cares about Ray-Ban Meta, it’s not like the Spectacle made any waves, anywhere. And yet this might, as you said, have been the only interesting announcement of Connect.

  • Steve R

    In one of the sessions they announced that mixed reality passthrough will be the default mode.
    IMO this is very big for usability.

    • Octogod

      Do you know which one?

      • Steve R

        Unlocking the Magic of Mixed Reality. Time 3:30 on the Youtube version.

        • Octogod

          Thank you!

  • Arno van Wingerde

    Hm… my definition of augments: “virtual junk in the living room”.

  • Christian Schildwaechter

    It’s often sad to see how much further VR could already be, if it would have attracted a larger user base. When Valve/HTC introduced lighthouse tracking in 2016, which is based on a lot of small, rather simple sensors, they expected the cost to significantly drop thanks to mass production and economies of scale. Had this worked out, we would probably have sub-millimeter precision trackers for USD 10 or less by now, and attach them to everything from arms and legs to hats, ping pong paddles, coffee mugs and pets. And there would never have been legless avatars that finally got approximated legs for standard movements with years of delay.

    Instead the number of VR users stayed rather small, with way too many even leaving after a short while. And we only get a rather limited selection of either cheap, effectively subsidized hardware from big companies with a lot of strings attached, or high margin, small scale production products, often targeting business customers with matching prices. Thanks to open source you can now build your own simple eye tracking solution from individually sourced components for less than USD 25, which is a lot more than integrating the same features into a mass produced HMD would cost, while a commercial eye tracking module that should require even less hardware than the DIY solution comes at five times its price, if you can get it at all.

    I was so convinced in 2014-2016 that VR would take the world by storm and sell in the millions, quickly driving down costs. Had this happened, we would now all use FBT in much more extensive and complex virtual worlds with much higher visual fidelity, thanks to ETFR and more developer interest. I am fully aware of the many problems VR still has, and even a huge success wouldn’t have significantly sped up SoC, display, or lens development due to fundamental problems. Nonetheless I’m still trying to figure out where it all went wrong, and how we ended up in a slow paced and expensive niche, despite all the amazing possibilities the technology can provide.

    • Guest

      It went wrong by repeating history, and none of the tech giants have learned from it.

    • XRC

      It’s my second go-round with VR, having realized the huge potential during my interaction with Virtuality in ’91-92 while a student studying industrial design. We had some early industrial headsets running on our Silicon Graphics workstations at university.

      Since getting the Vive Pre in 2016 I’ve been super impressed with the equipment’s ability to generate presence, despite current limitations, and for me SteamVR tracking is a key ingredient.

      As esteemed Stanford professor Jeremy Bailenson said in a recent interview when asked about the five most important aspects of presence: “tracking, tracking, tracking, tracking, tracking”.

      Also thought it would become more popular, but here we are in ’23…

  • Octogod

    Well said.

    More dire: while the drop from $60M to near $40M happened within a year, it also came with a much higher number of apps in the store and on App Lab. So the average ROI on Quest games has dropped massively.

    Meta hasn’t learned that ‘Move Fast and Break Things’ doesn’t work when the thing being broken is the UI/UX on your face. People don’t adapt to the new interface; they just don’t engage with it at all. It has the opposite effect of increasing engagement.

    And Connect had a handful of MR demos, but 3 of the 5 were FPS wave shooters. They were the only ones I saw people stop playing mid-session and go “we got it”. It’s clear the tech is there, but even with a year and a half of developer exploration, the experiences are not.

    I’m bullish on Quest 3. But Meta needs to focus on why people buy the headset, not on winning a war with Apple.

  • Cragheart

    I am all for stylized, not fully realistic graphics, but I think the current “Metaverse version 0.01” just doesn’t look good enough. I am not expecting 100% realism or anything like that, but Horizon Worlds needs a significant graphical update and an increase in the maximum number of people who can gather in one place at the same time. It’s rather cringy and weird at the moment, IMO. VRChat seems to make more sense in 2023.