The Meta 2 development kit, made available for pre-order by the company today, isn’t perfect. But then again, neither was the Rift DK1, and it didn’t need to be. It only needed to be good enough that a tiny bit of imagination was all people needed to extrapolate and see what VR could be.

See Also: ‘Meta 2’ AR Glasses Available to Pre-order, 1440p with 90 Degree FOV for $949

Without an inexpensive and widely available development kit like the DK1, VR would not be where it is today. There are lots of great companies and incredible products now in the VR space, but there’s no arguing that Oculus catalyzed the re-emergence of consumer VR, or that the Rift DK1 was the opening act of a play that’s still on its way to a grand climax.


Augmented reality, on the other hand, hasn’t yet seen a device like the Rift DK1—an imperfect, but widely available and low cost piece of hardware that makes an effortless jumping-off point for the imagination.

Meta’s first development kit, the Meta 1, was not that device. It was too rudimentary; it still asked for too much faith from the uninitiated. But Meta 2 could change all of that.

New Display Tech

Meta 2’s biggest change is a drastically different display system, which delivers a hugely improved field of view and resolution compared to its predecessor. The unit now sports a 2560×1440 resolution (1280×1440 per eye) and a 90 degree diagonal field of view, compared to the original’s 960×540 per eye and paltry 36 degree field of view.
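For a rough sense of what that trade looks like, here’s a back-of-envelope pixels-per-degree comparison using only the specs above. It assumes both quoted FOVs are diagonal and uniform across the image and ignores optical distortion; treat it as an approximation, not a measured figure.

```python
import math

def pixels_per_degree(w_px: int, h_px: int, diag_fov_deg: float) -> float:
    """Approximate angular pixel density along the display diagonal."""
    return math.hypot(w_px, h_px) / diag_fov_deg

# Per-eye specs as quoted above (assumed diagonal FOVs)
meta2 = pixels_per_degree(1280, 1440, 90)  # ~21 px/deg
meta1 = pixels_per_degree(960, 540, 36)    # ~31 px/deg
print(f"Meta 2: {meta2:.0f} px/deg vs Meta 1: {meta1:.0f} px/deg")
```

By these numbers, Meta 2 actually gives up some angular sharpness in exchange for a far larger window on the world; the headline improvement is the field of view.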


The new display tech uses a smartphone display mounted above the plastic shield, which serves as the reflective medium to bounce light from the display into your eyes. On the inside, the plastic shield is shaped concavely to aim one half of the smartphone’s display into your left eye and the other half into your right. These concave segments use a faintly silvered coating to reflect the light coming from the angle of the smartphone display while still allowing light to pass through from the outside, effectively creating a transparent display that doesn’t darkly tint the outside world.
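One consequence of this combiner approach, true of optical see-through displays in general rather than anything specific to Meta, is that the display can only add light to the world; it can’t darken it. A toy compositing model makes the point; the transmission and reflection coefficients here are illustrative assumptions, not Meta’s specs.

```python
import numpy as np

T = 0.80  # fraction of world light transmitted through the shield (assumed)
R = 0.20  # fraction of display light reflected toward the eye (assumed)

def perceived(world_rgb: np.ndarray, display_rgb: np.ndarray) -> np.ndarray:
    """The eye sees transmitted world light plus reflected display light."""
    return np.clip(T * world_rgb + R * display_rgb, 0.0, 1.0)

bright_room = np.array([0.9, 0.9, 0.9])
black_pixel = np.array([0.0, 0.0, 0.0])
# A 'black' pixel adds nothing, so it simply reads as transparent:
print(perceived(bright_room, black_pixel))  # [0.72 0.72 0.72]
```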

You can see here which sections of the plastic shield are silvered for reflectivity.

Meta has honed this display system quite well and the result is a reasonably sharp and bright AR image that appears to float out in front of you. The improvement in the field of view from Meta 1 to Meta 2 is night and day. It makes a huge difference to immersion and usability.


Those of you with us in the Rift DK1 days will recall that one of the device’s smart choices was to saddle up with mobile device displays, effectively piggybacking on the benefits of a massive industry which was independently driving the production of affordable, high performance, high resolution displays. Meta has also made this call and it will likely pay dividends compared to more fringe proprietary display tech like waveguides and tiny projectors that don’t get the ‘free R&D’ of healthy consumer electronics competition.

Hands-on Demo


I joined Meta at their offices in Redwood City, CA recently to see the Meta 2 development kit firsthand. They took me into a fairly dim room and fitted the device onto my head. For a development kit, the fit is reasonably good; most of the weight sits atop your head, unlike many VR headsets which squeeze against the back of your head and your face for grip. The plastic shield in front of your eyes doesn’t actually touch your face at all.

It should be noted that Meta 2 is a tethered device which requires the horsepower of a computer to display everything I was about to see.


Meta walked me through several short demos showing things like a miniature Earth floating on the table in front of me with a satellite orbiting it, an augmented reality TV screen, and a multi-monitor setup, all projected out into my reality right before me. When I held my hand up in front of the AR projections, I saw reasonably good occlusion: the objects would disappear behind my hand (as if they were really behind it), which was enough to show me where things are headed.

In the multi-monitor segment of the demo I was able to reach my hand out into each monitor and make a fist as a grabbing gesture, allowing me to rearrange the monitors as desired. On one of the monitors I saw an e-commerce website mocked up in a web browser, and when I touched the picture of a shoe on the AR display, a 3D model of the shoe floated out into the space in front of me.
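The interaction model is simple to sketch. What follows is a hypothetical reconstruction of the fist-grab behavior as I experienced it; the names and thresholds are mine, not anything from Meta’s SDK.

```python
from dataclasses import dataclass

@dataclass
class Panel:
    position: tuple          # (x, y, z) of the AR monitor in world space
    held: bool = False

def update_grab(panel: Panel, hand_pos: tuple, hand_is_fist: bool,
                grab_radius: float = 0.15) -> None:
    """Close a fist near a panel to grab it; move to drag; open to release."""
    dist = sum((a - b) ** 2 for a, b in zip(hand_pos, panel.position)) ** 0.5
    if hand_is_fist and (panel.held or dist < grab_radius):
        panel.held = True
        panel.position = hand_pos   # panel follows the closed fist
    else:
        panel.held = False          # opening the hand drops the panel
```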


With the hand-tracking built into Meta 2, I was able to reach out and use the fist gesture to move the shoe about (somewhat haphazardly, due to poor hand-tracking), and also use two fists in unison as a sort of pinch-zoom to make the shoe larger and smaller. The hand-tracking performance in this demo left much to be desired, but the company showed me some substantial improvements to hand-tracking coming down the pipe, and said that the Meta 2 development kit will ship with those improvements built in. We plan to go into more detail on the newer hand-tracking soon.
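The two-fisted zoom maps naturally onto scaling by the ratio of the current distance between the hands to the distance when the gesture began. Again, a hypothetical sketch rather than Meta’s actual gesture pipeline:

```python
import math

class TwoFistScale:
    """Scale an object by the changing distance between two closed fists."""
    def __init__(self):
        self.start_dist = None
        self.start_scale = 1.0

    def update(self, left: tuple, right: tuple,
               both_fists: bool, obj_scale: float) -> float:
        if not both_fists:
            self.start_dist = None          # gesture ended; keep current scale
            return obj_scale
        dist = math.dist(left, right)
        if self.start_dist is None:         # gesture just began; set baseline
            self.start_dist, self.start_scale = dist, obj_scale
        return self.start_scale * (dist / self.start_dist)
```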


One ostensibly ‘big’ moment of the demo was a “holographic call” in which I saw a Meta representative projected on the table in front of me, live from a nearby room. While neat, this sort of functionality presupposes that we all have a Kinect-like device set up in our homes somewhere, allowing the other user to see us in 3D as well. In this demo, since I had no such camera pointed at me, the person I was seeing had no way to see me on the other end.


An Obvious Flaw (Didn’t Stop the Rift DK1 Either)

All of the demos I saw were fine and dandy and did a good job of showing some practical future uses of augmented reality. The new Meta 2 display technology sets a good minimum bar for what feels sharp and immersive enough to be useful in an AR context, but the headtracking performance was simply not up to par with the other aspects of the system.

If you want great AR, you need great headtracking at the very minimum.

A major part of believing that something is really part of your reality is that when you look away and look back, it hasn’t moved at all. If there’s an augmented reality monitor floating above my desk, it lacks a certain amount of reality until it can appear to stay properly affixed in place.
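To make that concrete: a world-locked object is re-rendered every frame through the inverse of the tracked head pose, which means any error in the pose estimate shows up directly as the object appearing to move. A minimal rigid-transform sketch (my illustration, not Meta’s renderer):

```python
import numpy as np

def view_space_position(obj_world: np.ndarray,
                        head_rot: np.ndarray,
                        head_pos: np.ndarray) -> np.ndarray:
    """Where a fixed world-space object lands relative to the head.

    Assumes the head pose maps head space to world space
    (p_world = R @ p_head + t), so the inverse is R^T @ (p - t).
    """
    return head_rot.T @ (obj_world - head_pos)
```

If head_rot or head_pos lags behind your real head motion, this position, and therefore the rendered object, lags right along with it.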


In order to achieve that, you need to know precisely where the headset is in space and where it’s going. Meta 2 does on-board, inside-out positional and rotational tracking with sensors on the headset. And while you couldn’t say it doesn’t work, it is far from a solved problem. If you turn your head about the scene with any reasonable speed, you’ll see the AR world become completely de-synced from the real world as the tracking simply fails to keep up.
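One standard mitigation, and plausibly part of whatever tracking improvements end up shipping, is predicting the head pose forward by the expected motion-to-photon latency using gyro rates. A crude single-axis sketch; real systems fuse camera and IMU data (e.g. with a Kalman filter) and predict all six degrees of freedom:

```python
def predict_yaw(yaw_deg: float, gyro_yaw_rate_dps: float,
                latency_s: float) -> float:
    """Dead-reckon yaw forward by the motion-to-photon latency."""
    return yaw_deg + gyro_yaw_rate_dps * latency_s

# Turning your head at 200 deg/s with 50 ms of uncorrected latency
# leaves the AR layer 10 degrees behind the real world:
print(200 * 0.050)  # 10.0
```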


Even under demo conditions and slow head movements, notice how bouncy the AR objects are in this clip.

Projected AR objects will fly off the table until you stop turning your head, at which point they’ll slide quickly back into position. The whole thing is jarring and means the brain has little time to build the AR object into its map of the real world, breaking immersion in a big way.


Granted, computer vision remains one of the most significant challenges facing augmented reality in general, not just Meta.

But, returning to the Rift DK1: Oculus knew they needed lower latency and, quite importantly, positional tracking. They didn’t have a solution when they shipped the DK1, so the unit simply went out the door with rotational tracking only and more latency than they would have preferred. But even while lacking positional tracking altogether, the DK1 went on to play an important part in the resurgence of VR by doing enough things well enough, and by being that affordable, accessible jumping-off point for the imagination.

Meta 2 is in the same boat (albeit somewhat more expensive, though still sub-$1,000). It has a number of great starting points and a few major things that still need to be solved. But with any luck, it will be the jumping-off point that the world needs to see a VR-like surge of interest and development in AR.

AR is Where VR Was in 2013

The careful reader of this article’s headline (Meta 2 Could Do for Augmented Reality What Rift DK1 Did for Virtual Reality) will spot a double entendre.

Many people right now think that the VR and AR development timelines are right on top of each other—and it’s hard to blame them because superficially the two are conceptually similar—but the reality is that AR is, at best, where VR was in 2013 (the year the DK1 launched). That is to say, it will likely be three more years until we see the ‘Rifts’ and ‘Vives’ of the AR world shipping to consumers.

Without such recognition, you might look at Meta’s teaser video and call it overhyped. But with it, you can see that it might just be spot on, albeit on a delayed timescale compared to where many people think AR is today.

Like Rift DK1 in 2013, Meta 2 isn’t perfect in 2016, but it could play an equally important role in the development of consumer augmented reality.




Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • Zach Gray

    Someone needs to hack Lighthouse headset tracking onto this, obviously. Of all the ideas out there, that one seems like the one that could get the most widespread and generic use. You install 2+ in a space, and everything can use it. Would be easy to install in conference rooms, offices, classrooms.

    • Jason Hunter

      We need technology that works without external equipment.

      • ryanpamplin

        Agreed! Meta 2 will ship with excellent inside out tracking.

        • CURTROCK

          Kudos on an EXCELLENT Meta2 promo video. One of the best I’ve seen.

      • Zach Gray

        This is already tethered to a PC. That’s external equipment. The tracking doesn’t look perfect, so until it is, bootstrap it with external tracking so application and dev work can happen without stuff wobbling all over the place.

    • yag

      That would be a solution, but I hope we’ll solve that on-board inside-out tracking thing as soon as possible (see VPU); we really want AR to be as mobile as possible.

  • CURTROCK

    RoadtoAR? Lol. Great report on some very promising tech. Does indeed give me déjà vu going back to the DK1 days. This is a hella cool year for “Nerd-Tech”. Loving it.

    • yag

      Well, we may talk more about AR than VR soon (maybe sooner than we think, thanks to the VR boom).

  • mellott124

    Great article. Tracking will make or break this. Let’s hope it improves.

    It looks impressive for what you pay.

    • Jason Wilhelm

      Precise head tracking is only the first issue that needs to be solved with AR. Other issues that are less often discussed are:

      1. Filtering or darkening the visual scene so that darker objects and shadows can be composited realistically into reality. (A virtual object sitting on a real table can’t realistically be grounded on the surface without proper shadows.)

      2. Illumination mapping and reconstruction is required for virtual objects to depict realistic illumination and color from the sources and ambient light within a scene. This contributes to casting properly oriented shadows as well.

      3. Precision occlusion mapping and scene reconstruction even behind semitransparent materials.

      I think the challenges facing AR are much more difficult to solve satisfactorily than for VR, but fortunately there are many overlapping issues that can be solved at the same time.

      • realtrisk

        All good points. When I think about it, AR has a lot more work to do than VR, since VR is in control of every element you see, but AR is not, and must adapt accordingly. Makes me realize it’s probably farther away than I originally thought.

        • Jason Wilhelm

          You’re right on, but I hope it’s good enough now to be useful even if these harder challenges lie somewhere in the future.

      • Jack H

        There are existing methods for both occlusion and light mapping.
        I’m working on my own, but well-known methods include selective filtering with a transmissive LCD panel or imaging onto a reflective LCoS panel.
        For reflection mapping, the standard procedure is to use at least one HDR camera to light your SLAM 3D model and use image-based lighting (raytracing) as originally proposed by Paul Debevec.
        Regards,
        Jack
        Halo AR

  • Kevin White

    I kind of wish we could make the distinction between AR (Google Glass, HUDs, 2D additions to the world, mainly data and overlays, where “head-tracking” doesn’t apply) and MR (HoloLens and Magic Leap and Meta: 3D additions to the world that understand and respect depth and surfaces, head-tracking implemented, often 3D polygon constructs, often stereoscopic 3D). We all know what VR is, but in my opinion AR puts people in mind of something much more rudimentary than what Magic Leap, HoloLens, Meta, etc. have in store, so MR (Mixed Reality) would be a better moniker.

    • Jesse Kindwall

      I disagree. When I think “AR”, I think of all those things you attributed to “MR”, and always have. The way I see it, “MR” is a useless term for something we already had a name for, dreamed up by marketing execs to try and distinguish their product from others.

      • Jim Cherry

        Would calling it AR Advanced, or 2G AR, or AR++ be better? Calling Google Glass and AR apps on your phone the same thing as HoloLens is kind of like calling military drones and quadcopter toys the same thing, so I can definitely see the need for some distinction.

      • Kevin White

        When a lot of people hear “AR” they picture Google Glass, or the things where you watch something pop up on the table while looking at your phone. That’s very, very different from Meta 2. But you want them all lumped together and just called “AR”?

        • You are confusing AR (Augmented Reality) with just an HMD (head-mounted display). This isn’t a common mistake, at least amongst the few people who even know what “AR” stands for. If I mentioned AR to any of my friends, they’d think I was talking about an Assault Rifle.

      • brandon9271

        I agree. When I hear about AR I think of Magic Leap and HoloLens. Any other device that simply displays information is just a HUD, and calling it AR is wrong. Mixed Reality is a term that’s new to me. I haven’t heard it used very often and honestly think it muddies the water a bit. It doesn’t really matter though. Before long AR and VR will be interchangeable. Soon VR devices will all have front-facing cameras and AR devices will have “blinders” that allow them to function as VR devices. Then the next generation of devices will be able to do both AR and VR with whichever method works best.

    • yag

      Headtracking is the distinction between AR devices (Meta, Magic Leap…) and head-mounted displays like Google Glass.

  • psuedonymous

    “but the reality is that AR is, at best, where VR was in 2013 (the year the DK1 launched).”

    I’d argue that it’s closer to where VR was in the 1990s with the Cyberface 3: the optics design plan was there (‘large’ dual flat-panel displays with non-rectilinear optics), but the displays and tracking were not. Likewise, the Meta 2 is a planar display with no lightfield capability, or at least no live refocusing (go ask Steve Mann about the importance of having AR objects at the correct focal depth), and as of yet NOBODY has demonstrated acceptable performance from unstructured inside-out optical tracking. Before the DK2, sufficiently well-performing optical tracking existed commercially (used in mocap), and even existed partially for the consumer (TrackIR, for all their unethical business practices, did have a product that worked).

    Meta and HoloLens are the first glimpses of what is to come, just as VR was in the 90s. But there’s a yawning gulf of technical advancement required before our ability starts approaching our expectations.

    • Kutastha

      Fine. We can say that Meta and Hololens are equivalent to VR in the 90s. On the other hand, I’ve been hearing that Magic Leap is equivalent to where VR was in 2014.

  • youngshyne

    After experiencing HoloLens, I don’t think they have a chance to be successful. They need to sell before both HoloLens and Magic Leap push hard on marketing.

  • Ben, great and well-balanced article. Thanks for the words and perspective. This is the best article I’ve seen so far, properly placing the Meta2 at the forefront of a major industry birth.

  • brandon9271

    Imagine this device using Lighthouse tracking and Leap Motion for occlusion? It would probably be badass! :)

  • Nifty! We haven’t seen nearly enough real-time occlusion with AR. (I have serious doubts about that Magic Leap video…)

  • kerouac99

    I think there is a lot of confusion caused by classifying all AR as VR with a real-world background. I’ve been using AR in education for the last 6/7 years and these see-through applications don’t really provide a link to the real world. Yes it tracks head movement/gestures, but where is the actual tracking of objects/image triggers that grounds the experiences to the physical space in front of the user? To me this is the unique selling point of AR, and although I will be buying the Meta dev kit, this is where a lot of these sorts of applications fall down for me. Although the Epson Moverio was awful to look at, it provided a simple way of porting existing AR experiences to a wearable device, without a lot of the issues (latency etc.) described here. I have to ask: why couldn’t the demo examples in the article be produced in VR? Sell me the USP.

  • eco_bach

    Ben, great balanced review. How does the head tracking of HoloLens compare with Meta 2?