ARKTIKA.1, an Oculus exclusive due to launch later this year, is shaping up to be one of VR's best-looking games to date. It will come as no surprise that the title is being developed by 4A Games, the studio behind the Metro series (and its stunning next installment, Metro Exodus). One important part of making a game look great is skilled use of effects: dynamic elements like particles, smoke, muzzle flashes, explosions, and lighting. But the methods for making great-looking effects in traditional games face new challenges in VR, especially when teetering on the edge between visual fidelity and the high performance required for smooth VR rendering. In this guest article, 4A Games explores their approach to making effects in ARKTIKA.1.

Guest Article by Nikita Shilkin

Nikita Shilkin is a Senior VFX Artist at 4A Games. Before that, he worked on films and ads as a Generalist Artist, and then as a VFX/On-set Supervisor on sci-fi and other films.

Update (2/22/18): Following the launch of ARKTIKA.1, Shilkin has published a new video further detailing the effects he created for the game.

https://vimeo.com/253734727

Since this article was initially published, we've also published our ARKTIKA.1 review and a behind-the-scenes article exploring the artwork and insights behind the game's development.

Original Article (8/13/17): To give an idea of my prior work, here are some of the scenes I've worked on:

https://vimeo.com/224552800
https://vimeo.com/224553306

At the moment I am working on effects for the ARKTIKA.1 project. It is a sci-fi VR shooter with the company's traditional focus on immersing the audience through story and high-quality visuals, which make it possible to talk about it as an AAA product. To begin, I would like to note that making effects for VR is essentially no different from producing them for ordinary games, with the exception of a few nuances that I have noticed during production.
The first and most important: the player's freedom and, as a consequence, the unpredictability of almost all of their actions.

The second: a focus on performance. The requirement of a constant 90 frames per second constrains your technical and creative freedom, forcing you to constantly balance game quality against player comfort.

The third: the final checkpoint is the headset. Due to differences in resolution, gamma, and the nature of virtual reality, what looked wonderful and beautiful in the editor might not look so good inside a headset.

Based on these three rules, we can start analyzing the production. So, let's begin with some core things.

Weapons

Since we are talking about VR, we don't have a fixed camera, animations, timings, or other constant values, which means we can never know how the player will shoot or from which side they will see the weapon. The only way out is to make the effect work beautifully from all sides. The first standard mistake is trying to make one mind-blowing sequence which, unfortunately, will only work with a classic fixed camera and becomes ridiculous when the weapon is turned. The solution is quite simple: no matter how complex the effect is, break it into simple fixed parts along all three directions. That way you get not only volume, but also a visual randomness that makes each shot unique.

https://gfycat.com/gifs/detail/AcceptableAdolescentAntarcticfurseal
https://gfycat.com/gifs/detail/LimpLargeChihuahua

Above: (left) a muzzle flash made with volume in all directions; (right) a typical 'first person' muzzle flash looks great from a static camera angle behind the weapon, but breaks down when seen from other directions.

https://gfycat.com/gifs/detail/DeterminedFixedArchaeocete

Since VR features neither a classic gun sight nor a screen center, and aiming with iron sights or a scope is not a common thing, the weapon's projectiles should be clearly visible.
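The three-direction breakdown of a muzzle flash described above can be sketched roughly as follows. This is a minimal illustration, not 4A's actual code; all names and numeric ranges are assumptions for the sake of the example.

```python
import random

AXES = ("x", "y", "z")  # one simple sprite plane per axis

def build_muzzle_flash(rng=random.random):
    """Assemble a flash from fixed parts laid out along all three axes so
    it reads as a volume from any viewing direction; the small random
    variation makes each shot look unique."""
    parts = []
    for axis in AXES:
        parts.append({
            "axis": axis,                     # plane the sprite lies in
            "scale": 0.8 + 0.4 * rng(),       # randomized size per shot
            "roll": 360.0 * rng(),            # random rotation in its plane
            "lifetime": 0.04 + 0.02 * rng(),  # seconds; flashes are brief
        })
    return parts

flash = build_muzzle_flash()
```

Because the parts are rebuilt per shot rather than authored as one fixed sequence, there is no single "correct" camera angle for the effect to break from.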
Most players will rely on this, making corrections based on the bullets and their impacts. In this regard, there are several tips:

- The muzzle flash must not block the sight of the bullet.
- The bullet should be clearly visible (size, brightness, length).
- The lower the rate of fire, the better bullets read with trails behind them; the faster the rate, the higher the brightness should be.
- Don't be lazy: create different bullets with distinct impacts for every weapon, as this also helps the player understand the shooting direction.

https://gfycat.com/gifs/detail/DrearyGorgeousCapeghostfrog

And finally, a little piece of advice: if you have firearms (or any other weapons with smoke particles), put the smoke into a separate system, away from the flame, and set it free in the world; that looks interesting.

https://gfycat.com/gifs/detail/OldWeepyAoudad

Distortion

I love using distortion in different situations, since with the right approach you can achieve a sense of additional volume from the refraction of other particles, as well as a liquid effect, which helped me when I was working on plasma and related effects. You should know not to add distortion to the muzzle flash, as it will make your players feel dizzy.

https://gfycat.com/gifs/detail/CheapSecondaryIraniangroundjay

Even if it seems to look safe, the headset might give a different feeling. Here are some examples of non-aggressive distortion that are comfortable in VR.

https://gfycat.com/gifs/detail/BelatedUnrulyIrishsetter
https://gfycat.com/gifs/detail/CookedAggravatingGangesdolphin
https://gfycat.com/gifs/detail/CandidHideousHochstettersfrog

VR Effects in Practice

Now that we've sorted out some weapon matters, I'd like to share my experience using effects in different situations, with a short explanation for each.
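The rate-of-fire tips above amount to a simple mapping from fire rate to tracer readability. A sketch of one possible mapping is below; the specific constants and function name are illustrative assumptions, not values from the game.

```python
def tracer_params(rounds_per_second):
    """Map rate of fire to tracer visibility: slow weapons get long,
    readable trails, while fast weapons trade trail length for brightness.
    The constants are illustrative only."""
    trail_length = max(0.2, 2.0 / rounds_per_second)       # meters, clamped
    brightness = min(1.0, 0.4 + 0.05 * rounds_per_second)  # 0..1 range
    return trail_length, brightness

# A slow pistol gets a long trail; a fast SMG gets a short, bright tracer.
pistol = tracer_params(2)
smg = tracer_params(20)
```

The exact curve matters less than the monotonic relationship: as fire rate rises, per-bullet trails shrink and brightness compensates, so the player can always read the shooting direction.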
It's not a good idea to take control of the player's camera in VR, since that always leads to unavoidable motion sickness and separation from the world inside the headset. Neither is camera shake, which might seem like a good idea for a robot landing. It turns out that using two opposing vertical movements (for example, stones falling from top to bottom and a cloud of dust rising from the floor to the ceiling) while the player is in the center is a nice way to fake this effect without shaking the player's camera. You simply use the world around the player instead of a static floor and ceiling.

https://gfycat.com/gifs/detail/FabulousAnimatedAmethystsunbird

Let's talk about performance. The ability to reuse content is a necessity, since every extra texture hurts the frame rate. All the steam effects in this room (below) use the same static texture, but with different post-processing. The most important thing is to understand the nature of the effect's behavior, its speed and inertia; then, with conventional tools like rotation, motion, the alpha channel, and color curves, you can get anything, both beautiful and cheap for the engine.

https://gfycat.com/gifs/detail/SaneFixedKouprey

I want to show another example: a completely static room where, thanks to quite simple effects and correct lighting, you get a sense of dynamics, which in general is cheaper than animating individual objects. Of course the animation would be much better, but it would take the lion's share of the available resources.

https://gfycat.com/gifs/detail/DependableEnergeticHapuka

Talking about effects, it's hard not to discuss the software side. Since I've worked on some film projects, I love Houdini, Maya, and even 3ds Max. Unfortunately, we do not use Houdini in our pipeline, but just look at version 16 and you'll understand how easily everything can be done now when it comes to game development; I'm very happy with how the tool is evolving, becoming absolutely universal.
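The steam example above, one static texture with many looks, can be sketched as a small data pattern. The asset name, fields, and values here are hypothetical, purely to illustrate varying cheap per-emitter parameters instead of authoring new textures.

```python
BASE_TEXTURE = "steam_static_01"  # hypothetical single shared asset

def steam_variant(drift_speed, tint, alpha_curve, spin):
    """Describe one emitter that reuses BASE_TEXTURE with its own motion,
    color, fade curve, and rotation (all assumed field names)."""
    return {
        "texture": BASE_TEXTURE,     # the same texture for every variant
        "drift_speed": drift_speed,  # upward drift in m/s
        "tint": tint,                # RGB color multiplier
        "alpha_curve": alpha_curve,  # fade-in / hold / fade-out keys
        "spin": spin,                # rotation in degrees per second
    }

slow_vent = steam_variant(0.5, (1.0, 1.0, 1.0), (0.0, 1.0, 0.0), 15.0)
hot_pipe = steam_variant(2.0, (0.9, 0.95, 1.0), (0.0, 0.8, 0.0), 45.0)
```

The design point is that every variant costs only a handful of parameters, while a new texture would cost memory and streaming bandwidth.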
I use Houdini out of an old habit (I just love the nodes), the regular free version, for making references. For example, I had to show the animator what kind of animation I wanted from a robot charging through a wall, and instead of a thousand words it was easier to use a flipbook from Houdini.

https://gfycat.com/gifs/detail/WavyUltimateKusimanse

VR makes it extremely interesting to study small details, play with physics, and much, much more, so do not forget about this opportunity when creating your own worlds!

https://gfycat.com/gifs/detail/RemoteOldfashionedFlickertailsquirrel

The Art of Faking and Optimizing

The most important thing to remember about creating effects for VR is the art of faking and optimizing: on the one hand, the effects should be voluminous, while on the other, any real simulation will hurt performance, and the balance is important if you really want quality.

https://gfycat.com/gifs/detail/ScentedGlumAfricangoldencat

There are several tips for optimizing a game:

- Use RGB channels wisely. Often, for a smoke effect you only need one of the channels, while the remaining ones can hold, for example, a brightness or transparency map.
- Try not to use lit particles too often; save them for noticeable cases.
- Remember overdraw. Aim for quality, not quantity, in each particle, minimizing the overall count.
- Don't make highly recognizable unique sprites; it is better to build the picture inside the editor than to render a beautiful sequence up front. That way you can reuse one texture repeatedly without it becoming conspicuous.
- If there is an opportunity to do something procedurally in a shader, do it. The fewer textures used, the fewer streaming problems.
- If you have particles colliding with static geometry, try to minimize it by all means; or, when the distance is constant, break the effect into two separate ones, a drop and a rebound, to reduce the collision calculations.
- Do not use distortion on gunshots.

The game is still in production, so I was only able to show a small part of the content. If the topic seems interesting to you, I can share more in the future.

Arktika.1 is due to launch for the Oculus Rift in Q3 2017. This article was originally shared on 80 Level, and is published here with permission from 4A Games.