Using EmberGen in CG and Motion Design


How to create volumetric fire and smoke, fast and cheap

Every CGI production studio aiming to deliver cinematics such as video game trailers and intros faces multiple challenges, like finding the right balance between cost and quality to meet clients’ demands and create captivating, VFX-laden experiences that make the product stand out. At PLAYSENSE, we save a lot of time and money by adopting cutting-edge software as soon as it appears on the market. Igor Sandimirov, one of our leading Motion Designers with 15+ years of experience, recounts how his department used EmberGen to promptly deliver complex volumetric fire and smoke effects for a story-driven in-game event trailer.

Motion Design tech stack at PLAYSENSE

Around 99% of our CGI content is delivered with Adobe After Effects, Cinema 4D, X-Particles, and Redshift. Sometimes we also employ Unreal, Blender, Octane, and Eve. EmberGen earned its place in this tech stack thanks to its great performance and usability.

What’s the main problem with fire and smoke effects?

Pyro effects such as fire and smoke are physics simulations, which are in general very demanding in terms of performance, time, and software cost. To create pyro effects, we either use real footage or additional specialized software. The latter is expensive, requires powerful workstations, and takes time to master. All of that can adversely affect the cost and speed of a project.

How can EmberGen help with creating pyro effects?

Here comes EmberGen: still in beta, but a quite workable one. Everything runs on your system’s GPU, including the physics calculations and the rendering. Most other software suites relegate physics to the CPU, slowing down the process.

EmberGen features a convenient, visual node structure. Launch it, and the fire starts burning straight away in real time. If you change the structure, adding new nodes or modifying parameters, everything is reflected in the viewport immediately.

Create a box, mark it as a collider, and the interactions start at that very moment. Then you can add one more emitter, change its shape, adjust the amount of fire and smoke, or change the burning mode so that colored smoke is emitted.

With the beta, you get over 100 presets for almost any imaginable case. This played a decisive role for us, and we started using EmberGen in an actual project that was in the works. We would find a preset, understand how it works, and tweak it to our needs. The speed at which EmberGen worked was the killer feature, letting us quickly implement scenes that would otherwise have taken a week to render.

How are volumetric fire and smoke implemented?

In case you’re not that into volumetrics, here’s a general outline of the process of creating fire and smoke.


The ‘brain’ of the setup is the Simulation node: a solver that calculates the physics of the process. To do that, it needs input data, which is provided by:

  • Emitter: something that releases the fuel that burns. In the case of a burning match, its head is the emitter.
  • Collider: an object that serves as an obstacle for smoke and fire. For example, a pot hanging over a fire must have flames licking it but never passing through it.
  • Forces: influences on the simulation. Wind can blow the flames aside and alter their shape, and there are more effects like turbulence.

All of this is ‘fed’ to the solver, where we set the size (in voxels) of the container in which the simulation is calculated. The larger the container, the more time the calculations take.
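To get a feel for how the container size drives simulation cost, here is a rough back-of-the-envelope sketch. The bytes-per-voxel figure is an illustrative assumption, not EmberGen’s actual number:

```python
# Rough cost estimate for a voxel container: memory (and work) grow with
# the total voxel count, i.e. cubically with the container's edge length.
# The 16 bytes-per-voxel figure is an illustrative assumption.

def container_stats(x, y, z, bytes_per_voxel=16):
    voxels = x * y * z
    return voxels, voxels * bytes_per_voxel / 2**20  # (count, MiB)

for edge in (128, 256, 512):
    voxels, mib = container_stats(edge, edge, edge)
    print(f"{edge}^3 -> {voxels:,} voxels, ~{mib:.0f} MiB")
```

Doubling each dimension multiplies the voxel count by eight, which is why keeping containers modest is what keeps the simulation interactive.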

After tuning the lighting and other scene settings in the Volume node, you can render the simulation you’ve set up straight away (mesh rendering and shading aren’t supported in EmberGen yet). The simulation can also be exported as a VDB sequence to be processed in Maya, Houdini, Cinema 4D, Blender, etc.
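A VDB export typically lands on disk as one file per frame. As a small illustration of handing such a sequence to another package, here’s one way to collect the files in frame order (the `name_####.vdb` naming pattern is an assumption; match the regex to your exporter’s settings):

```python
import re

# Collect a VDB-per-frame sequence in numeric frame order.
# The trailing "_<digits>.vdb" naming pattern is an assumption;
# adjust the regex to whatever your exporter writes.
FRAME_RE = re.compile(r"_(\d+)\.vdb$")

def sort_vdb_sequence(filenames):
    def frame(name):
        return int(FRAME_RE.search(name).group(1))
    return sorted((f for f in filenames if FRAME_RE.search(f)), key=frame)

files = ["smoke_0010.vdb", "smoke_0002.vdb", "smoke_0001.vdb", "notes.txt"]
print(sort_vdb_sequence(files))
# -> ['smoke_0001.vdb', 'smoke_0002.vdb', 'smoke_0010.vdb']
```

Sorting numerically (rather than lexically) avoids frame-order bugs once the sequence passes frame 9999 or the zero-padding changes.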

EmberGen specifics

In EmberGen, we first set the fuel quantity (Fuel Rate) in the Emitter node together with the temperature and pressure, and define whether the smoke appears immediately or in the process of burning. We can also bring some Force into the Emitter node to give the flames a certain shape. Alongside the Emitter nodes, we can add Collider nodes (for the flames to flow around them) and Forces affecting the entire simulation. Here we also set the container size and the simulation type (Burning, Colored Smoke, etc.).

After this, we can render the project into VDB format for use in other graphics suites, or into a so-called Flipbook to work on the project in game engines like Unreal and Unity. To do either, we select Render Passes in the Capture node. EmberGen has all the render passes needed for quality compositing.
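When a simulation is baked into a flipbook, the game engine plays it back by sampling one tile of the atlas per frame. A minimal sketch of that lookup (the row-major, top-left layout is an assumption; engines differ on texture-origin conventions):

```python
# Map a flipbook frame index to its UV rectangle in an atlas of
# cols x rows tiles. Assumes frames are laid out row-major starting
# at the top-left tile; flip the row axis for bottom-origin engines.

def flipbook_uv(frame, cols, rows):
    frame %= cols * rows             # loop the animation
    col, row = frame % cols, frame // cols
    w, h = 1.0 / cols, 1.0 / rows
    return (col * w, row * h, w, h)  # (u, v, width, height)

print(flipbook_uv(0, 8, 8))   # -> (0.0, 0.0, 0.125, 0.125)
print(flipbook_uv(9, 8, 8))   # -> (0.125, 0.125, 0.125, 0.125)
```

The same arithmetic works whether the shader animates UVs directly or blends between the current and next tile for smoother playback.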

Here’s how some render passes look.


Doing fire and smoke for a World of Tanks video

Here’s a video we made for World of Tanks, “Battle Pass Season V. Operation Phantom: Part 2”. You should start with Part One, though: everyone on the project was keen on bringing it as close to movie quality as possible, and they did a great job.

Creating thick exhaust plumes

Our first task for the video above was creating rocket launch effects on top of gameplay footage (after the 2:00 mark). The long shot: a rocket slowly blasting off far from the camera, with a long exhaust plume. The middle shot: a rocket lifting off from behind a crag on the location. And the close shot: a rocket launch right next to our character’s tank.

We rendered the plume and the smoke part by part for the rocket leaving the silo. This approach sped up the work: the separate projects were simple, and we simply composited them in Adobe After Effects.

Billows of smoke are always a sight to behold. And if we change some parameters (like the turbulence seed), we get new smoke silhouettes at a negligible cost. We used such footage, already rendered, for the project, just modifying the cloudiness of the smoke, its speed, density, dispersion, and other ‘stats’, essentially in real time.

Working this fast had a nice side effect: thanks to the extra time, the client was able to comment on every little detail, and we addressed it all within a couple of iterations.

Making billows of volumetric smoke

The second task was to make billows of smoke burst out of the tunnel mouth at the start of the airstrip (you can see it after the 10:00 mark). A series of explosions happens in the underground complex and then all that mish-mash gets outside to blanket the part of the desert where our heroes’ tanks are standing.

We made a mesh of the tunnel entrance, stuffed a smoke emitter inside, and added some pressure. (During post-processing, it was all sped up x2). We also tuned motion blur and smoke density (to impress the client and the viewers).

The client took a look and decided that the scene would benefit from more fire. No problem: we just added a bit of fuel to the emitter and cranked up the temperature. Then we increased the amount of smoke, making it thicker and billowing even more as it burst out of the tunnel. (Also, here we sped everything up x3.)

It was quite hard to match the camera angle of the gameplay footage we used, as camera import wasn’t supported. It also wasn’t possible to bring the footage into the background, so we simply set the right angle manually.

How to cobble together a mushroom cloud

The third task was creating a huge mushroom cloud with a volume of dust slowly settling down. We wouldn’t have delivered that without EmberGen. We probably could have found suitable footage, but massive problems with lighting would then have emerged. So we found the preset we needed, and as the client requested lots of fire, we added an extra emitter at the base of the mushroom cloud that burned more than it smoked. For compositing, we used the beauty pass and (separately) the emission pass to light up the fire. That said, all the passes needed for serious compositing are present in EmberGen, as shown above.

Still, there was a complication related to the dust in the air around the mushroom cloud. The shot isn’t static: the camera is moving, and you can observe the parallax effect. As our version of EmberGen didn’t support animated camera import, the dust clouds engulfing the desert around the fiery mushroom had to be rendered in Cinema 4D. Since the gameplay footage wasn’t static, we exported the dust into VDB, tracked the camera, rendered it all in Redshift, and added the layer with the dust.

And here’s the closing shot: the view from the pilot’s cabin.


EmberGen: the prospects

EmberGen has every chance of becoming one of the most sought-after software suites. The developers are constantly improving it and have already released a version with animated mesh support. Camera import and export are either already present or about to be implemented. The UI has been improved, procedural animation has been added, and the quality of shading and visualization has been raised. The full-scale release should happen next year.

EmberGen is a leap into the future of real-time rendering. Game engines are catching up with offline renderers in their rendering capabilities, the offline ones are starting to use RTX for acceleration, and complex effects are becoming accessible and easy to implement.

#CGI #playsense #motiondesign #vfx #howtovfx #embergen
