Capacity: The Journey from Pre-Rendered to Real-Time | SIGGRAPH 2019 | Unreal Engine

>>Just to introduce myself, I am Benji Thiem, Co-Creative Director at Capacity. We are a design and animation studio that has been around for over 14 years. Originally, we got our start in the broadcast space. One of the things I want to talk about is the changes the industry has been going through lately and how we have gradually gotten more and more into real-time work, because there are some very pivotal projects that brought us to the point we are at today.

Back in the day, I remember when we first started using Cinema 4D and it had just introduced global illumination. It took forever, but everybody was doing it, because even though it took forever, the quality difference was enough to justify it. Jumping ahead, we built a team full of generalist artists, all Cinema 4D artists, and we were one of the first studios to adopt Octane as a rendering toolset. That was a crucial point in our history, because GPU rendering really revolutionized what we were doing in a creative field: it gave us immediate feedback, so you can see, as you are working on something, what the final result is going to be. And that really primed us for what the future was going to hold as well.
So I put together a few snippets here. None of this is Unreal work; it is all work we have done over the years for clients, all rendered on GPU in Octane. One of the things I want you to think about is that these are some of the most production-heavy moments from a lot of the pieces we have done. The emphasis here is that we are not always striving for 100 percent realism; we are striving to fulfill a creative goal. The most important thing is that we have the flexibility to follow our vision and make something happen out of it.

So that is where our heads were at when we were thinking: okay, how do we get more flexibility? How do we get more creative freedom? How does a small team of artists take their abilities and apply them to a big production without getting completely snowed under by render times, technical issues, and all of the things we are used to dealing with?
So I wanted to do a little case study of a project where the whole team decided to jump into Unreal. It was really a leap of faith for us. In 2017, the president of Promax Games, Steve Kazanjian, approached us to be the design partner on the Promax Games 2018 Summit. This was a conference to celebrate gaming and the people who work in gaming. Since a lot of our clients as a company are Activision Blizzard, Nintendo, and Psyonix, we had already been doing a lot of work in this space, so we were super excited about the opportunity to do a piece where we could really showcase our love for gaming, with a progression where you could see different types of games taking place.

One of our goals for this project was to take an entire team of people who were not used to working in real-time, have everybody dive into Unreal, and do the whole project in Unreal. At the time, we had one artist, Alex Kintner, who was actually our concept artist but was pretty comfortable in Unreal. He was instructing us on some things, but we did a lot of research ourselves, and we decided to dive in and just see what we could make. The concept was to recreate San Francisco in a stylized fashion and create a vivid world that was a celebration of all kinds of different game genres.
So at this point our artists were very comfortable, because we were all working in Cinema: we were concepting, we were modeling things out, we were rigging. These are all things everybody was familiar with. We were just having a blast, thinking of all the crazy things we could add into this piece. But it was a little unclear what hurdles we were going to run into when we actually did this.

So with Alex's help, we jumped into Unreal. The first big question was: which of the animation systems we had created in Cinema 4D could we bring into Unreal, and which ones would we have to rebuild in Unreal? We did a lot of exploration. We ended up doing some of the particles in Cascade, which worked out really well. We built procedurally animated textures using Unreal's node-based material system. And we created as much as we could directly in Unreal, including building all the environments.
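Those animated textures were assembled in the material graph rather than in script, but purely as an illustration of what a simple setup like that involves, here is a minimal sketch of building a panning-texture material with the editor's Python MaterialEditingLibrary. The asset names, paths, and speed values are placeholders, not Capacity's actual setup.

```python
import unreal

# Hypothetical content paths -- substitute your own project assets.
texture = unreal.EditorAssetLibrary.load_asset("/Game/Textures/T_Noise")

# Create a new material asset.
asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
mat = asset_tools.create_asset("M_PanningNoise", "/Game/Materials",
                               unreal.Material, unreal.MaterialFactoryNew())

# Add a Panner node feeding a TextureSample node.
panner = unreal.MaterialEditingLibrary.create_material_expression(
    mat, unreal.MaterialExpressionPanner, -600, 0)
panner.set_editor_property("speed_x", 0.1)
panner.set_editor_property("speed_y", 0.05)

sample = unreal.MaterialEditingLibrary.create_material_expression(
    mat, unreal.MaterialExpressionTextureSample, -300, 0)
sample.set_editor_property("texture", texture)

# Wire Panner -> TextureSample UVs, then route the sample into Emissive Color.
unreal.MaterialEditingLibrary.connect_material_expressions(panner, "", sample, "UVs")
unreal.MaterialEditingLibrary.connect_material_property(
    sample, "", unreal.MaterialProperty.MP_EMISSIVE_COLOR)
unreal.MaterialEditingLibrary.recompile_material(mat)
```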
Then came a challenge for our animators and traditional Cinema 4D artists: how do we take all the animation we had been doing in Cinema for our animatics and translate it over to Unreal? At the time there was not really clear documentation on how to do this, so there was a lot of experimentation involved. What do you have to do to bake out a Character animation and bring it into Unreal? What do you do if it is a dynamic Object and you want to bake that out and bring it into Unreal? So we did a lot of experimentation, and there were certain things at the time that did not work as well. But I am happy to say that the process has now become completely seamless. We have it down to where, even without baking Characters, we can bring them from Cinema to Unreal.
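For reference, the bake-it-out-and-import path we experimented with first can also be automated through the editor's Python API. This is a minimal sketch, assuming a character baked out of Cinema 4D as an FBX with its skeleton and animation; the file and content paths are placeholders.

```python
import unreal

# Hypothetical source file and destination folder -- adjust for your project.
fbx_path = "C:/exports/Hero_baked.fbx"
destination = "/Game/Characters/Hero"

# Configure the FBX importer for a skeletal mesh plus its baked animation.
options = unreal.FbxImportUI()
options.import_mesh = True
options.import_as_skeletal = True
options.import_animations = True

task = unreal.AssetImportTask()
task.filename = fbx_path
task.destination_path = destination
task.options = options
task.automated = True  # suppress import dialogs
task.save = True

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```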
There is also a new Datasmith plugin, which was just announced, that allows you to open an entire Cinema 4D scene in Unreal. You save out your Cinema 4D file using the Save for Melange feature, which is being rebranded as the Save for Cineware feature, open it in Unreal, and the hierarchies and everything come through. The process is becoming so seamless now. We built this whole project using a much more difficult process, and now it is so much easier.
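As a sketch of what that scene import can look like when scripted: the Datasmith importers expose a Python entry point, shown below with a placeholder file path. Whether a given Cinema 4D export goes through this exact call depends on the plugin version, so treat it as an assumption rather than the documented C4D workflow.

```python
import unreal

# Hypothetical exported scene -- the path is a placeholder.
scene = unreal.DatasmithSceneElement.construct_datasmith_scene_from_file(
    "C:/exports/city_block.c4d")

if scene is not None:
    # Import the whole hierarchy into the project's content folder.
    scene.import_scene("/Game/Datasmith/CityBlock")
```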
So I am going to loop the final piece that we made while I talk about some of the things we got out of this process of working with Unreal. First of all, what really blew our minds was that everything we conceived in the creative brainstorming, we were actually able to include. Once we had a base (we had built out a map, we had the Characters), iteration, changing things, and adding things were so quick and so easy, because everything is happening in real-time. This is actually a pretty long piece, but we were not faced with weeks of rendering. We were not faced with all the usual problems you run into. So that was a huge thing.
Leading right off of that is the concept of a production timeline. When you are working in a traditional pre-rendered pipeline, you have to schedule 25 to 30 percent of your project just for rendering, compositing, and fine-tuning. While it is still good practice to have a locked-in date where you lock picture, it is important to note that with a project like this, where everything exists in real-time, there is no need to stop changing things at a certain point. The day before a project is due, you can change the Character animation, bring it back through the process, and re-render it in a matter of minutes. It completely changes the way you can view production, and it puts a lot of control in the creatives' hands, which is what is important to us.
That also leads to the idea of compositing. Compositing is really interesting, because even back when we were doing GPU rendering, we started to realize that a lot of the reason our composites were so huge was that we were trying to avoid having to go back and render things. You try to build in flexibility by rendering all these passes so that you can change things in the future. Well, when you have an Unreal workflow, there is no reason not to just go back into the project, make the visual tweaks you want there, and re-render, because the render time is almost non-existent. The interesting thing, too, is that there are a lot of built-in tools for color correction, lens effects, and things like that, which you would normally do in post (post-post, as we like to call it), and now you can just handle them in Engine. So my recommendation is that if you are going to jump into this process, take it as far as you absolutely can within Unreal; from there, you might only have to do a couple of tweaks in compositing.
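As an illustration of what handling it in Engine can look like, here is a minimal sketch of nudging the grade on a Post Process Volume from the editor's Python API. The specific settings and values are placeholder assumptions, not the grades used on this project, and it assumes the level contains at least one Post Process Volume.

```python
import unreal

# Find the first Post Process Volume in the open level.
actors = unreal.EditorLevelLibrary.get_all_level_actors()
volume = next(a for a in actors if isinstance(a, unreal.PostProcessVolume))

settings = volume.settings

# Example tweaks: mild bloom and a slight desaturation.
settings.override_bloom_intensity = True
settings.bloom_intensity = 0.3
settings.override_color_saturation = True
settings.color_saturation = unreal.Vector4(0.9, 0.9, 0.9, 1.0)

# Write the modified settings struct back onto the volume.
volume.settings = settings
```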
And then the other factor is the Sequencer. A lot of people do not really understand the power of the Sequencer, but you can do entire edits in Unreal, so you do not have to render out shot by shot and bring everything into Premiere or After Effects to put it all together. You could technically have your whole Sequence living with the audio in Unreal and preview it right there. These are all things that are really helping our production along the way.
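To give a flavor of that, here is a minimal sketch of kicking off a render of a master Level Sequence from script, using the Sequencer capture API as it existed around this time. The sequence path is a placeholder and the default capture settings are assumed.

```python
import unreal

# Hypothetical master sequence containing the whole edit.
capture = unreal.AutomatedLevelSequenceCapture()
capture.level_sequence_asset = unreal.SoftObjectPath(
    "/Game/Cinematics/MasterEdit.MasterEdit")

def on_finished(success):
    # Called back by the editor when the capture stops.
    unreal.log("Render finished, success: {}".format(success))

unreal.SequencerTools.render_movie(capture, unreal.OnRenderMovieStopped(on_finished))
```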
One happy surprise: we recently did a two-minute 4K trailer for a gaming company. Usually the delivery process is interesting. You are delivering this huge render, and sometimes the client needs some kind of flexibility, or they might need to change things in the future. Normally we are faced with: okay, which files do I collect? Do I make them an After Effects collect that has certain components of this? Do I also have to save out the 3D files that went into that collect so they have the flexibility they need? Well, with this trailer, we delivered the finished piece, rendered at 4K, and alongside it we delivered a single eight-gigabyte Unreal project that could render every single shot in that trailer. The client could go in and make any last-minute changes they needed, for legal reasons or whatever, and it was not a hassle for us and there were no overage charges; it all made total sense. So the delivery process is changing, the way we see compositing is changing. So many things are being affected by this.
Another thing I want to mention that is a huge workflow change: as traditional artists, we were not really used to the idea of version control. Normally, the way we would do things is, we would have our 3D files, we would have an artist work on them, and then we would hand the files back and forth, and the next artist would have to figure out, okay, this is how they set it up, and this is where I go from there. With Unreal, it is all set up in a way that you can apply something like Perforce version control, and then all of a sudden you can split out your scene. You have one shot, and you can say: one artist is now working on the environment, one artist is working on the lighting, one artist is working on Cinematic Cameras and animations. They are all working simultaneously, and with one sync of a button, it all combines into a master file that can be rendered out. When you are up against a deadline and you do not know how you are going to pass all your files back and forth in time to get a render up and out, it is so helpful to have that simultaneous workflow that syncs; combined with rendering in real-time, you can get shots out the door faster than ever.
So these are all things we have learned over the process of working with Unreal, and this was over a year ago. Since then, we have done a virtual set for NFL RedZone and a lot of smaller projects with Ross Video. Recently, we did an entire trailer with Psyonix for the new Radical Summers event that was all rendered in Unreal, and we even included a lot of ray tracing features as well. This type of work is just taking off for us. It is an amazing process, we are super into it, and we have a couple of big projects launching right now. By the way, if you want to check out the Radical Summers trailer, I could not show it here because there are so many IPs involved, but it is a really cool piece, and it is on YouTube; just look up Radical Summers Rocket League. You can see where we are taking this process.

So we are in the midst of exploring how far we can push the production, how much realism we can add in, and what we can do with Character animation. We are super excited to be able to work with the new tools that MAXON and Unreal have been working on together. It is a bright future for us, and we just wanted to be here to share what we are doing and where we see the industry going. Thank you! [APPLAUSE]

Comments

  1. But if we need more deformers than skinning and blend shapes, Alembic isn't there yet, right? In Unreal, my tests a few weeks ago imported corrupted (less than 10% integrity), while Unity imported the same files correctly, but with the known issue that it doesn't read materials.

    Should I look into the Datasmith plugin mentioned here, or will it still be limited to skinning and morphs?

  2. Very nice.
    "We deliver the project to a client who can change it at the last minute, with no overage charges" – that is an interesting concept 😀

  3. Just a question…
    Could I use this Datasmith plugin to import an entire animated scene from Cinema directly?
    I mean, considering some meshes would be treated as "skeletal meshes" and others wouldn't have a skeleton :/
