Total Cinema 360 Brings Disney’s The Lion King Musical to VR

Disney and Total Cinema 360 have released a 360-degree video documenting the opening five minutes (read: “The Circle of Life”) of The Lion King musical to VR platforms, including Littlstar’s iOS and Android apps, YouTube 360, Vrideo’s app and MilkVR for Samsung Gear VR users.

Samsung’s New Gear VR Advert Shows How to Sell VR

Samsung’s Gear VR officially launched in its first consumer-ready form last week, and as the first commercial VR headset launch in decades, it raises a question: how do you sell a technology famously difficult to demonstrate other than in person? Samsung’s new advert takes a very effective stab at the problem.

The First 6 Gear VR Apps & Games New Owners Should Try

Consumer VR is finally here with this week’s release of the Samsung Gear VR. If you’ve never owned a Gear VR (two previous ‘Innovator Edition’ models have been released since last year) but you’ve managed to get your hands on a pre-ordered headset and compatible phone, you’re ready to plug into the world of virtual reality. But what should you download (and buy) first?

Epic Games’ Ray Davis on Lessons Learned from ‘Bullet Train’

Ray Davis is the Studio Manager of Epic Games Seattle, and he talks about working on Bullet Train, Epic’s latest VR tech demo, which uses the Oculus Touch controllers and debuted at the Oculus Connect 2 gathering. I had a chance to catch up with Ray at the Seattle VR conference, where he told me about the iterative design process behind Bullet Train, the evolution of the teleportation approach to VR locomotion, and how the team discovered the innovative bullet grab and throw game mechanic.

LISTEN TO THE VOICES OF VR PODCAST

Ray Davis talks about some of the goals and motivations behind Bullet Train. Epic wanted to create an immersive VR experience that was interactive and dynamic, designed for anyone to go through regardless of their level of gaming experience. Lead VR engineer Nick Whiting and Creative Director Nick Donaldson collaborated on creating Bullet Train, and they wanted to explore what it means to have hand presence within a VR experience.

Ray says that there’s an art to constructing a competitive death match environment in terms of the player flows and different pickups that encourage different pathways throughout the environment. It’s not just a matter of teleporting from location to location, and Nick Donaldson took a lot of that into consideration when creating Bullet Train.

Bullet Train has definitely been the most comfortable first-person shooter experience I’ve had in VR so far. This level of comfort is largely thanks to the teleportation mechanic used to move between different waypoints set on a subway train and out into the station. There’s a ghosting trail visible after you teleport that helps orient you to your new location. Ray says they thought a lot about ways to design the experience so that you have enough visual cues to maintain your orientation as you teleport between the various waypoints.
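
The mechanic Ray describes can be sketched as interpolating a short ‘ghost’ trail of intermediate positions between the origin and destination waypoints. This is an illustrative sketch only, not Epic’s actual implementation; the function name and parameters are invented:

```python
# Illustrative sketch of a teleport ghost trail: a handful of
# intermediate positions between the origin and the destination
# waypoint. Briefly visualizing these after a teleport gives the
# player a cue of the path they jumped along, helping them
# re-orient at the new location. (Hypothetical code, not Epic's.)

def ghost_trail(origin, target, steps=5):
    """Return `steps` positions from just past `origin` to `target`."""
    ox, oy, oz = origin
    tx, ty, tz = target
    trail = []
    for i in range(1, steps + 1):
        t = i / steps  # 0 < t <= 1, ending exactly at the target
        trail.append((ox + (tx - ox) * t,
                      oy + (ty - oy) * t,
                      oz + (tz - oz) * t))
    return trail

# Example: teleport from the station platform to a point on the train.
trail = ghost_trail((0.0, 0.0, 0.0), (10.0, 0.0, 2.0), steps=4)
```

The last position always lands exactly on the target waypoint, so the trail doubles as a sanity check that the player ends up where they aimed.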

Ray says that the game design process at Epic Games has always been very organic and iterative. His advice is to just make a VR experience, see what people try to do in it, and then implement those things if they haven’t been implemented yet. This is how they discovered the bullet grabbing and throwing mechanic: they noticed that people kept trying to catch the bullets, so they went ahead and added that feature. He says their ultimate goal is to create an experience so intuitive that people forget they’re controlling a game, and that they can get into a flow where they’re reacting with unconscious muscle memory.

Ray says that it’s ultimately a lot of fun to develop for virtual reality when you’re the target audience, because you’re the best expert in what you find fun and engaging—especially when they could look to their favorite Hollywood action movies and see what they could start to recreate within their VR experience. There are still a number of design challenges in moving something like Bullet Train from a novel tech demo to a full-fledged game, and Ray didn’t mention any specific plans for what the future of Bullet Train might be. But it wouldn’t be surprising if they continued to refine and develop the concept after giving more than 500 demos over the last couple of months.

Experiments like these also let ad hoc teams at Epic dogfood the Unreal Engine, and a lot of the resulting feedback and improvements make the engine better and better suited to creating different virtual reality experiences. Ray says that part of the culture at Epic Games is to make things, and then try to give away as much of those innovations as possible.

Finally, Ray sees VR and AR converging and eventually replacing our screen-based interfaces on monitors, laptops, tablets, and phones. He believes VR and AR will continue to unlock fundamental changes in how we gather and consume information, as well as how we connect with each other.

Become a Patron! Support the Voices of VR Podcast on Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

WakingApp Raises $4.3M to Build an Accessible VR/AR Content Creation Platform

WakingApp announced recently that they’ve completed a series C funding round to secure $4.3M to realise their vision of an immersive content creation platform that everyone can use.

WEARVR Weekly Top Ten VR Downloads – November #3

Looking for something fun to play in VR? We’ve got the top 10 downloads from the last week on the WEARVR app marketplace, a cross-platform repository of virtual reality experiences.

First Look: PresenZ Shows Navigable Pre-rendered Room-scale VR Scenes

PresenZ is a system developed by VFX studio Nozon which combines some of the essential benefits of pre-rendered imagery with those of real-time interactivity. For the first time the company is demonstrating a room-scale version of a Presenz-enabled scene.

Crash Course: Pre-rendered vs. Real-time CGI

To explain the difference between pre-rendered and real-time CGI, let me quickly take you back to the early days of hand-drawn animation.

Many of the cherished classics from this age, like Pinocchio (1940), were created by having an artist sit down to draw a series of pictures, and then having those pictures played back quickly after one another to create the illusion of continuous motion. Because the amount of time it takes to draw each frame is much longer (let’s say 5 minutes) than the amount of time each frame is displayed (1/30th of a second, in the case of 30 FPS playback), interactivity is impractical—you might ask for a character to make a movement in the scene that requires 1,000 frames, and it would take more than three days just to draw the frames for that movement. However, if the artist can draw simply enough—like a stick-figure flip-book animation—they may be able to produce your desired result in a matter of minutes.

This is very much like the difference between pre-rendered and real-time CGI. Imagine now that instead of an artist drawing each frame, a computer is doing the drawing. A very detailed frame may take the computer five minutes to draw, depending upon how powerful it is. At a drawing (or ‘rendering’) rate of one frame per five minutes, interactivity is still impractical because you may ask for a change that takes several minutes to be produced. It makes more sense to plan what you want the computer to draw ahead of time, then simply gather up all of the frames after they have been drawn and then play them back to back to create a sense of continuous motion. This is pre-rendered CGI: the frames are drawn (‘rendered’) before (‘pre’) viewing.
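
The trade-off is easy to make concrete with the figures from the analogy above (five minutes per detailed frame, 30 FPS playback); the numbers below are a back-of-the-envelope sketch, not production benchmarks:

```python
# Back-of-the-envelope arithmetic from the analogy above:
# a detailed frame takes ~5 minutes to render, while 30 FPS
# playback displays each frame for only 1/30th of a second.

render_minutes_per_frame = 5
playback_fps = 30

# Pre-rendering a 1,000-frame movement ahead of time:
frames = 1000
render_days = frames * render_minutes_per_frame / 60 / 24  # ~3.5 days
playback_seconds = frames / playback_fps                   # ~33 s of footage

# For real-time interactivity, each frame must instead be drawn
# within its own display slot:
realtime_budget_ms = 1000 / playback_fps  # ~33 ms per frame
```

Three and a half days of rendering buys only about 33 seconds of playback—which is exactly why pre-rendered pipelines plan everything in advance, while real-time pipelines must fit each frame into a ~33 ms budget.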

For this reason, pre-rendered CGI is the preferred method for creating very high fidelity imagery, like that seen from major animation studios like DreamWorks and Pixar. The graphics in films from these studios far surpass what your computer or Xbox can render, because pre-rendered films can afford to spend many seconds, minutes, or even hours on each frame, so long as you will view them all strung together at some point in the future. But that means no interactivity, because the viewer is not there during the rendering process to dictate the action (and even if they were, it would take too long to see the results).

See Also: DreamWorks Reveals Glimpse of 360 Degree ‘Super Cinema’ Rendering for VR Films

But if you make the frames simple enough (like the stick figure flip book animation) such that the computer can draw many per second, you can reach a level of practical interactivity where you can ask the computer for a change and see the resulting frames nearly instantly. This is real-time CGI: the frames are rendered as fast as they are being displayed (‘real-time’). Real-time CGI opens the door to interactivity, like being able to pick up a virtual object or press a button to open a hatch; depending upon the user’s input, new frames can be drawn quickly enough to show the result of an action instantly, rather than minutes, hours, or days later.

For this reason real-time CGI is the preferred method for creating games. Interactivity is a crucial element of gaming, and thus it makes sense to bring the graphics down to a point that the computer can render frames in real-time such that the user can press a button to dictate the action in the scene.

So simply put, pre-rendered CGI excels in visuals while real-time CGI excels in interactivity. They’re not fundamentally different except for how long it takes to draw the action.

The Promise of Presenz

For the entire history of CGI, creators have had to choose between the high fidelity visuals of pre-rendered CGI or the interactivity of real-time CGI. Then along comes Presenz, promising to mash together these two formerly incompatible benefits into a single solution.

Nozon first revealed Presenz in early 2015, demonstrating positional tracking parallax—the ability to look around objects as you move your head through 3D space—in a small area around the user’s head, in a scene with animated pre-rendered CGI visuals of a complexity that would otherwise bring a real-time game engine to a halt.

I’ve seen it for myself and it’s everything they say it is: the visuals of pre-rendered CGI with the positional tracking parallax that’s normally only possible with a scene rendered in real-time. The area in which you can move your head about the scene is only about a meter square. If you hit the edge of that area, the scene fades out, as the view from beyond that space has not been pre-rendered. This is of course a limitation if users want to be able to crouch, jump, or walk around a larger scene.

For the first time the company is now showing a room-scale Presenz-enabled scene which is navigable with the HTC Vive & Lighthouse tracking system. In the video above, we see a pre-rendered scene which can be navigated from one end to the other seamlessly, just like a real-time experience. Nozon calls the scene’s viewable area the ‘zone of view’, rather than the singular ‘point of view’ you would be stuck with using a traditional pre-rendering approach.

So what’s the downside to this seemingly magic solution to the pros and cons balance of pre-rendered vs. real-time CGI? Well for one, interactivity is currently limited. You may be able to navigate through the scene, but interaction in the traditional real-time sense is not possible as the scene is still pre-rendered. Nozon says that they’re developing the ability to add real-time interactive elements into their Presenz scenes, but so far they’ve only demonstrated support for pre-rendered animations.

Another downside to Presenz is file size. Relatively simple scenes can quickly climb into the gigabytes (likely scaling with the size of the zone of view), though Nozon says they are working on compression schemes which “make it possible to reduce [the file size of] some scenes by a factor 10.”

Framerate is another downside. A Presenz scene can currently only be animated at up to 25 FPS (though the headset still views the scene at its own native refresh rate). It isn’t clear yet whether this is a technical limitation or a means of keeping file size down.

PresenZ vs. Lightfields

Those of you following along carefully will probably notice some commonalities between the Presenz solution and lightfields. I certainly did, and so I queried Nozon about the differences between the two. The company insists that, despite the similarities, Presenz is a patented solution which differs from lightfields. My efforts to understand the precise differences didn’t get very far as the company is understandably careful not to dig into the specifics of their technology.

See Also: OTOY Shows Us Live-captured Light Fields in the HTC Vive

However, Nozon’s Matthieu Labeau provided me with a broad comparison between the two solutions (the skeptical reader will understand that this list is likely to lean in Nozon’s favor):

PresenZ

Benefits

  • File sizes manageable by today’s computers: about 15-20 MB per frame without temporal compression. We expect to reach 20-30 MB/second or better with the implementation of temporal compression.
  • Can be plugged into any high-end renderer.
  • Production companies can keep their pipelines as-is, along with all their previous 3D assets.
  • Capable of animated content.

Current Limitations

  • Specular lighting is ‘baked in’ (this can be solved in the future), but we don’t believe it’s an immersion breaker in the meantime.
  • Large file sizes mean slow downloads for the moment.

Computer Minimum Specs

  • Oculus recommended spec + RAID 0 SSD

Lightfield

Benefits

  • Specular lighting, reflections, and transparency are not baked in, so they react realistically to positional tracking.

Current Limitations

  • Still imagery only.
  • Compromises made for manageable data size: capture of a small volume that is scaled up when viewed. That changes the scale of the scene and limits immersion (everything feels big and far away).

Computer Minimum Specs

  • Unknown, but believed to require high-end GPUs. To our knowledge no standard computer can handle animation in this format.
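
Taking Nozon’s own figures at face value, the bandwidth problem is straightforward to quantify. The sketch below uses only the numbers quoted above (15-20 MB per frame, 25 FPS animation, a 20-30 MB/s target after temporal compression); it is rough arithmetic, not a measurement:

```python
# Rough data-rate estimate from the figures Nozon quotes:
# 15-20 MB per frame without temporal compression, animated
# at up to 25 FPS, targeting 20-30 MB/s after compression.

mb_per_frame = (15, 20)
fps = 25
target_mb_per_s = (20, 30)

# Uncompressed stream rate, low and high ends:
raw_rate = tuple(m * fps for m in mb_per_frame)  # MB/s

# Implied temporal-compression ratio needed to hit the target:
min_ratio = raw_rate[0] / target_mb_per_s[1]
max_ratio = raw_rate[1] / target_mb_per_s[0]
```

An uncompressed stream of roughly 375-500 MB/s also goes some way to explaining the RAID 0 SSD in the recommended spec—that rate comfortably exceeds what a single conventional hard drive can sustain.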

Both technologies are still in development so the list here is likely to be in flux for a while to come. Either way, Presenz seems to be making good headway in combining the benefits of pre-rendered and real-time CGI, though there are still a number of limitations that will need sorting before broad application of the technology is possible.

Samsung Launches Gear VR Headset in the US, Coming ‘Soon’ to Europe

Samsung today became the first company in decades to bring a consumer-oriented virtual reality headset to market, as its Samsung Galaxy phone-powered Gear VR launched in the US.

Analysis of Gear VR’s Place at the Inception of the Consumer Virtual Reality Ecosystem

Ben Lang jokes that we’ve been at ‘Year Zero’ of VR for three years now, and the official release of the Gear VR today marks the first official launch of a consumer-ready virtual reality head-mounted display. Ben joins me on the podcast to talk about some of the technical details that allow the Gear VR to drive such a compelling virtual reality experience, as well as some analysis of the larger virtual reality market. There are high expectations that virtual reality will grow and evolve into the ultimate potential we all hope it can reach, so we take a look at how the smartphone market evolved over time and what we can expect to see over the next year.

Oculus Lists Gear VR ‘Sold Out’ at Best Buy, Amazon & Samsung Still Stocked

Pre-orders for the Gear VR, the mobile VR headset made in collaboration between Samsung and Oculus, opened up on November 10th. With the headset due to be released on the 20th, it looks like Best Buy has sold out of its initial stock, but other stores remain stocked.

Madrona Venture Group on Why Seattle is a Hotbed for VR and AR

Matt McIlwain is a managing director at the Madrona Venture Group, which recently announced its first investment in the virtual reality space with a $4 million Series A round of funding for Envelop VR. Matt explains why Seattle is one of the top hotbeds for augmented and virtual reality: it hosts a wide variety of hardware, software, gaming, and cloud computing companies, including HTC, Valve, Oculus, Facebook, Amazon (AWS & Twitch), Microsoft (Xbox & HoloLens), and Nintendo of America. Matt also talks about Madrona’s investment in Envelop VR, as well as its strategy of finding companies building horizontal software for AR and VR, along with other vertical commercial opportunities in the new models of distribution that AR and VR enable.

Review: ‘EVE: Gunjack’ Sets the Bar for Gear VR Turret Shooters

EVE: Gunjack, the mobile counterpart of the hugely anticipated VR multiplayer space shooter EVE: Valkyrie, is released tomorrow on Samsung Gear VR. We go hands-on with the consumer release and find that CCP Games has succeeded in setting a new benchmark for visual fidelity in mobile virtual reality.

Nokia Ozo Price, Specs, and Release Date to be Revealed at November 30th Event

Nokia has sent out invites for a forthcoming event in Los Angeles on November 30th asking guests to “join us for the exclusive unveiling” of Ozo, the company’s professional VR camera.

Virtuix Omni with HTC Vive & Lighthouse Enables Fully Decoupled Locomotion

As developers continue to experiment with a range of VR navigation techniques, Virtuix’s Omni treadmill gives gamers a way to physically walk and run around virtual worlds in an otherwise limited space. The company’s latest developments bring compatibility with the HTC Vive and its Lighthouse tracking tech, enabling ‘decoupled’ manipulation of walking, looking, and aiming.

Crytek’s Beautiful ‘Back to Dinosaur Island’ VR Tech Demo Now Available on Oculus Rift

Crytek has today released the Back to Dinosaur Island tech demo for download. The five-minute demo, which shows a perturbed T-Rex, is compatible with the Oculus Rift DK2 headset and built on CryEngine.
