What you’ll learn:
VFX stage of the 3D animation pipeline
When watching a 3D animated film or sequence, you might think everything you see is created in 3D animation software. Most of it is, of course, but some elements are better handled as visual effects than animated directly in 3D: very complex elements like hair, dust, or water. For realism, these elements must react and behave according to physics, and that level of complexity is difficult or impossible to animate manually with keyframes. That's when VFX comes to the rescue!
There's a lot to cover, so let's get started!
For projects as complex as 3D animation, which can involve teams of dozens to hundreds of people all working toward the same goal, you need a system to keep things organized and running smoothly, and to ensure that everyone knows not only their own job, but where they fit in the overall process and who to hand their work off to for the next step. In the industry, this organizational system is known as a "pipeline."
The 3D animation pipeline generally consists of three parts: pre-production, production, and post-production. The project is planned out, visualized, and designed in pre-production. All the elements needed for the final images are created in production. Then, in post-production, the team takes those elements and assembles the final product.
Without the pipeline in place, it would be very difficult to keep the project on task, on time, and at the quality demanded.
Here, we’re going to discuss a specific and somewhat misunderstood stage of the 3D animation pipeline – the VFX stage.
In the 3D animation pipeline, the VFX stage takes on a slightly different meaning than what you’re probably thinking. In traditional film, the VFX stage generally comes in post-production, after the initial edit, but in the 3D animation pipeline, it happens during production. While a 3D animator creates almost everything in 3D software, certain other elements are better handled as VFX, such as fur, hair, water, explosions, lightning, fire, cloth, dust, and other complex textures. These complex motions or elements are generally too resource-intensive, too difficult, or just plain impossible to do in 3D software.
Artists usually turn to physics-based systems to simulate the kinds of motions and textures involved. They set the parameters, and the system does all the calculations needed for the simulation, instead of the 3D artist trying to figure it out or keyframe it. The system takes into account environmental factors like gravity or air movement.
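As a toy illustration of that idea (not any particular tool's solver, and with entirely made-up parameter values), here is how a physics-based system can advance an object's motion from a couple of artist-set parameters, with no keyframes involved:

```python
GRAVITY = -9.8      # m/s^2, an artist-set parameter
WIND = 1.5          # constant sideways push, also artist-set
DT = 1.0 / 24.0     # one frame at 24 frames per second

def step(pos, vel):
    """Advance one frame using simple Euler integration."""
    x, y = pos
    vx, vy = vel
    vx += WIND * DT                  # the environment pushes the object
    vy += GRAVITY * DT               # gravity pulls it down
    return (x + vx * DT, y + vy * DT), (vx, vy)

# Drop an object from 10 units up and let the system do the work.
pos, vel = (0.0, 10.0), (0.0, 0.0)
for frame in range(24):              # simulate one second of motion
    pos, vel = step(pos, vel)
# the object drifts sideways with the wind while falling under gravity
```

The artist only chooses the parameters; the motion itself falls out of the calculation, frame by frame.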
There are several different types of effects which come into play here.
Hair and fur simulation is pretty much just what it sounds like: a simulation system creates hair or fur which reacts not only to environmental factors like wind or rain, but also to the movements of the head or body of the character it's attached to. It's an incredibly complex simulation which can require resources out of the reach of all but high-budget projects.
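One common textbook approach to strand simulation (a deliberately simplified sketch, not what any particular package does internally) models a single hair as a chain of points joined by fixed-length segments, with the root pinned to the head and gravity acting on the rest:

```python
GRAVITY = -9.8
DT = 1.0 / 24.0
SEG_LEN = 0.2        # rest length of each segment (arbitrary units)
N = 8                # points in the strand

pts = [(i * SEG_LEN, 0.0) for i in range(N)]   # strand starts horizontal
prev = [p for p in pts]                        # previous positions (Verlet)

for _ in range(120):                           # five seconds at 24 fps
    # Verlet integration: velocity is implied by the previous position.
    for i in range(1, N):                      # index 0 stays pinned to the head
        x, y = pts[i]
        px, py = prev[i]
        prev[i] = pts[i]
        pts[i] = (x + (x - px), y + (y - py) + GRAVITY * DT * DT)
    # Constraint pass: snap each segment back to its rest length.
    for i in range(1, N):
        x0, y0 = pts[i - 1]
        x1, y1 = pts[i]
        dx, dy = x1 - x0, y1 - y0
        dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
        scale = SEG_LEN / dist
        pts[i] = (x0 + dx * scale, y0 + dy * scale)
# the free end swings down under gravity while the root stays put
```

A real groom multiplies this by hundreds of thousands of strands, plus collisions with the body and each other, which is where the heavy resource cost comes from.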
Body simulation covers not just "bodies" in the sense of human or animal bodies, but any physical objects. Rigid body simulation handles inflexible objects like wood, stone, or glass, and is used to simulate impacts, collisions, shattered glass, and various other interactions between objects. Soft body simulation covers less-rigid materials like cloth, skin, soil, and bodily tissues: anything which may bend, contort, or ripple instead of breaking or shattering like a rigid body would.
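The simplest rigid-body collision response can be sketched in a few lines (an illustrative toy, with an assumed restitution value, not any engine's actual solver): when an object hits the ground, its velocity is reflected and scaled by a restitution coefficient, where 1.0 means perfectly bouncy and 0.0 means no bounce at all.

```python
GRAVITY = -9.8
DT = 1.0 / 24.0
RESTITUTION = 0.6      # made-up value for a hard, rubber-like body

y, vy = 5.0, 0.0       # drop the object from 5 units up
for _ in range(240):               # ten seconds at 24 fps
    vy += GRAVITY * DT
    y += vy * DT
    if y <= 0.0:                   # collision with the ground plane
        y = 0.0
        vy = -vy * RESTITUTION     # reflect and damp the velocity
# each bounce is lower than the last, as with a real dropped object
```

Full rigid body systems add rotation, friction, and contact between many objects at once, but the reflect-and-damp idea is the same.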
Fluid simulation doesn't cover only water and other liquids, but also gases, gels, and anything else which moves like a fluid, including thick fog, gelatin, oatmeal, and many other similar substances. Fluid simulators generally base their calculations on the equations of real-world fluid physics.
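Real fluid solvers discretize equations like the Navier-Stokes equations over a grid; as a heavily simplified illustration (assumed grid size and rate, one dimension only), here is just the diffusion part, spreading a concentrated puff of "smoke" outward:

```python
N = 11
DIFF = 0.1                      # made-up diffusion rate per step

density = [0.0] * N
density[N // 2] = 1.0           # a concentrated puff in the middle cell

def diffuse(d):
    """One explicit diffusion step: each cell moves toward its neighbors' average."""
    new = d[:]
    for i in range(1, N - 1):   # boundary cells stay fixed at zero
        new[i] = d[i] + DIFF * (d[i - 1] + d[i + 1] - 2.0 * d[i])
    return new

for _ in range(100):
    density = diffuse(density)
# the sharp puff has smeared out into a smooth, low bump
```

Production solvers add advection, pressure, and viscosity terms in two or three dimensions, which is why fluid simulation is so computationally demanding.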
Particle simulators create points in space which take on the characteristics of scattered phenomena like rain, smoke, dust, flocks and swarms, crowds, advancing armies, flying sparks, and much more. Artists assign behavioral and visual properties to the particles so they look and move however the artist wants. As in other types of simulation, artists define physics-based parameters like gravity or wind, and the particles react accordingly, creating amazingly realistic cloud-like effects like snowstorms, migrating bees, attacking birds, or whatever else the sequence calls for.
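A bare-bones particle system can be sketched like this (a generic toy, with invented property ranges, not any product's generator): each particle gets randomized behavioral properties at birth, and all of them respond to the same shared physics parameters.

```python
import random

random.seed(7)                     # deterministic for the example

GRAVITY = (0.0, -9.8)              # shared, artist-set physics parameters
WIND = (2.0, 0.0)
DT = 1.0 / 24.0

class Particle:
    """One point in space with randomized behavioral properties."""
    def __init__(self):
        self.x, self.y = 0.0, 0.0              # emitted from the origin
        self.vx = random.uniform(-1.0, 1.0)    # slight sideways spread
        self.vy = random.uniform(4.0, 6.0)     # sprayed upward
        self.life = random.uniform(0.5, 2.0)   # seconds until it fades

    def update(self):
        self.vx += (GRAVITY[0] + WIND[0]) * DT
        self.vy += (GRAVITY[1] + WIND[1]) * DT
        self.x += self.vx * DT
        self.y += self.vy * DT
        self.life -= DT

particles = [Particle() for _ in range(500)]
for _ in range(24):                # one second at 24 fps
    for p in particles:
        p.update()
    particles = [p for p in particles if p.life > 0.0]
# short-lived particles have expired; survivors have drifted with the wind
```

Scale the count into the millions, attach a sprite or mesh to each point, and you have the dust storms and swarms described above.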
Professional effects software like VEGAS Effects often includes particle generators suited to exactly this kind of work.
The particle generators in VEGAS Effects put incredible power at your fingertips. With particle generation, you can conjure swarms of insects, engulf towns in massive dust storms, create fantastic portals to other worlds, fill your entire environment with smoke or fog, or anything your imagination dreams up.
Import 3D models and wrap your particle clouds around them in 3D space, or wrap them around 2D objects as well. Either way, the particle generators let you create astounding, cohesive worlds.
The particles react to real-world physics that you define, including gravity and wind effects. Bounce sparks off the ground. Blow dust clouds into powerful tornadoes. Let your imagination be your only limit!
Make VEGAS Effects part of your 3D animation pipeline!