‘Super Giant Robot Brothers’: A Milestone in Virtual Animation Production

Director Mark Andrews and producers Paul Fleschner and Adam Maier chronicle their trailblazing journey to create – entirely in Unreal Engine – the 3DCG animated comedy series premiering today on Netflix.

They are super. They are very, very big. They are mechanical. They are related. And rivalrous! OK, let’s be honest, these character attributes alone should be enough to propel an animated series into the must-see category. But what makes Super Giant Robot Brothers even more remarkable is that it was created entirely inside Unreal Engine – the first series to be so produced, and a potentially prophetic milestone in the still-evolving history of twenty-first-century animation.

Directed by storied Oscar-winning animation honcho Mark Andrews (Brave), SGRB is the story of two – you guessed it – giant robots, who discover their blood relation while battling intergalactic evil. Originally developed by Victor Maldonado and Alfredo Torres under the auspices of Reel FX Animation Studios, the series grew into a two-minute proof-of-concept test produced by Paul Fleschner (The Origins of Wit and Humor) and video game animation veteran Adam Maier. Then came one of those unforeseeable, serendipitous concatenations of circumstances in which these three visionaries came together and, with the unstinting support of Reel FX and Netflix, endured a trial by fire wherein they attempted to create an actual series in a new way. The rest, as they say, is the still-evolving history of cutting-edge animation.

We corralled Andrews, Fleschner, and Maier, and encouraged them to relate the story of the first series created wholly in engine. Graciously, they obliged.

Check out the exciting trailer, then enjoy the interview:

Dan Sarto: What was the genesis of the show and how did you all get involved?

Mark Andrews: Reel FX had this project that was developed by Victor Maldonado and Alfredo Torres, and they showed me this two-minute test that they did in Unreal Engine, and I just got it. It had everything that I love personally. It's sci-fi. It's got giant robots fighting kaiju. And the robots happen to be brothers, who have a sibling rivalry. They were just so goofy, and they had this kind of Spock-McCoy dynamic – logic vs. goofball emotion – and it just clicked. So, a few months later, Victor and Al went off to do a feature, and Reel FX asked me if I would like to jump in and direct the show.

Paul Fleschner: Adam and I met Mark prior to the Disney-Fox merger, when we were at Fox VFX Lab, which was this incipient Unreal Engine virtual production initiative. We stayed in touch with Mark because he really appreciated the idea that you can use this real-time retargeting and virtual camera to inject live-action sensibilities and workflows into animation. We produced the two-minute test that Mark was talking about as a proof of concept, and then this perfect thing came together with Mark coming on the project. He was the filmmaker who could be the evangelist to all of the animators, to anyone who might be skeptical about this process. So it's just been an absolute blast working with Mark and Reel FX, and building this pipeline for their work.

Adam Maier: We came into Reel FX to talk to them about this new workflow, this new pipeline, and this new way of working with creatives. But, for me personally, the story of Super Giant Robot Brothers was perfect. I have a twin brother, and I found a lot of Shiny in myself and a lot of Thunder in my brother, so I could identify with the characters right away. I actually have an older sister, as well, who's had to deal with that nonsense of competing twin brothers. So it really drew me into the story. And then, of course, I jumped at the opportunity to work with Mark again, who’s a phenomenal storyteller and someone who really believes in this way of using technology to help in the storytelling process.

DS: I just wrote the preface to The Animation Field Guide for Unreal and Epic Games. And one of my comments was, "What does it all mean?" There's certainly more and more attention paid to the real-time rendering engine as an iterative, creative tool – some of it obvious and some not as obvious to people who don't use the technology themselves. So, can you explain exactly how you’ve used Unreal Engine, why this is so unique at this time, and where it’s leading in terms of a real, robust set of production tools?

MA: I think there are two main things. Speaking as a visual storyteller, it allows me to work in a context that I can't normally get in animation. Animation is a process that pulls everything apart and puts it in buckets, and once you’ve filled one bucket, then you're allowed to go to the next bucket, and then the next bucket. So you're building one step at a time, and everything is out of context. Character designs are out of context because they're not created inside the environment. The environments are built out of context. Everything is out of context, because we don't have the final until too late in the game.

What being in real-time does is it brings everything to one place. My assets are there. My backgrounds are there. My characters are there. The lighting is there. The visual effects are there. So as a creator, I'm making my decisions and I'm reacting in context. If I see that something's wrong, or if we see that something doesn't fit, or if we see that proportions are off on a character, we can change them in real-time. So I get to be further along in the process a lot sooner. And, from a creative storytelling point of view, that’s huge. It's what live-action has had almost the entire time and animation has been struggling to get to.

I have to give a lot of props to Reel FX as one of the first companies to go all in and create a production animation pipeline that utilizes Unreal Engine. We weren't doing a test. We were on the clock. And Adam and other people who were versed in programming and Unreal really had to crack it open and figure out what was missing. 90% of what Unreal does is amazing, but there was still that 10% it didn't do well, because that's the difference between what you need to make a video game and what you need to produce a film. There were huge growing pains as we were making this thing, but we got over those barriers because we had to. We had a deadline. That was the great push on the technology.

PF: I came in at the very end, on editorial. For me, and I think for Adam, an emergent phenomenon – in addition to the spontaneity of live-action and virtual camera – was discovering that, wow, we have coverage. We have wides, mediums, and closeups, and we're giving editors options. And this is something that's been a part of live-action for a century, but seeing animation editors come alive and have such zest working with Mark was awesome.

MA: I love this stuff – I love every shot, I love every damn take – so the only way for me to actually make something that's good is to hand it off to somebody else and step back. And then I come back and look at it objectively. I can go, "Oh, that's really cool. There's something here that's funny, let's milk this. Let’s cut this or let's put this in." So it was really a fabulous back and forth. It always is in editorial, but here we had so much to work with and play with. And if there was something that we didn't have, I would just call up my DP and Adam, who were on the stage, and I'd say, "I need this shot." I might draw a little sketch. "Give me versions of that." And they'd go off and film coverage of that idea. A couple days later, it ends up in the edit.

PF: Adam literally helped design our VCam interface. We worked with Technoprops at ILM and with another engineering firm and Reel FX to put together something that would really give Mark what he wanted on set. But we discovered early on how generative VCam could be – that you could shoot a couple hundred shots a day and literally have to start doing selects to feed editorial.

AM: For me, there are two really important parts of the process that the technology helps with. The first is context, as Mark mentioned, and the other is immediacy. Those two things put the creative power back in the hands of the creatives as much as possible. The director, the cinematographer, the editor, are all able to work together to create the best story possible in a really collaborative environment, because they're there together and what they're seeing and what they're able to change is immediate.

On the production side, it does very similar things. Immediacy in lighting allows the lighters to see the results of a light change instantly. We’re basically removing the majority of the compositing pipeline, because everything that was done in compositing can be done directly in the scene with the lights, with color correction, without needing to go to an external application. And because renders are so fast, artists can iterate more quickly, more efficiently, on their shots, so they get more stabs at creating something great, which we think helped bring up the quality of the show over the course of the production. At the end of the day, we built the technology to get out of the way and just support the creative process as much as possible.
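
To make the idea of grading in the scene rather than in an external compositing package more concrete, here is a minimal sketch using Unreal's built-in Python editor scripting – a generic illustration, not Reel FX's actual tooling, with hypothetical light and saturation values:

```python
import unreal

# Minimal sketch: adjust a light and a color grade directly in the level,
# the kind of change that would otherwise happen in an external comp app.
# All values below are hypothetical illustrations.

for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    # Bump the key light's intensity; the viewport updates immediately.
    if isinstance(actor, unreal.DirectionalLight):
        light = actor.get_component_by_class(unreal.DirectionalLightComponent)
        light.set_intensity(8.0)  # hypothetical new intensity (lux)

    # Color-correct in scene via a post-process volume instead of in comp.
    if isinstance(actor, unreal.PostProcessVolume):
        settings = actor.settings  # the struct is returned by copy
        settings.override_color_saturation = True
        settings.color_saturation = unreal.Vector4(1.0, 1.0, 1.0, 1.05)
        actor.set_editor_property("settings", settings)  # write it back
```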

DS: Everyone that I talk to in animation and visual effects always says, "The technology is there to serve the creative hand." It's like, how often does that really happen?

MA: You're absolutely right. It's finally here. You have to be making something large-scale that has a deadline to make sure the technology can deliver what it's promised to deliver. And that's what we have had in this process, in building this pipeline. We actually put it through its paces and it dazzled us.

DS: So tell me briefly, in terms of the final pixel, how the episodes rolled out from this pipeline.

PF: We worked directly with the post house and just exported EXRs from the engine. We can get into the nitty gritty of how we ensured that we had the level of quality that we needed, but, yes, this was a final pixel delivery to the post house.
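
For readers curious what exporting final-pixel EXRs from the engine can look like, here is a minimal sketch using the Movie Render Queue's Python API in the Unreal Editor – the sequence, map, resolution, and output path are hypothetical stand-ins, not the show's actual delivery setup:

```python
import unreal

# Queue a Level Sequence for rendering as an EXR image sequence.
subsystem = unreal.get_editor_subsystem(unreal.MoviePipelineQueueSubsystem)
queue = subsystem.get_queue()
job = queue.allocate_new_job(unreal.MoviePipelineExecutorJob)
job.sequence = unreal.SoftObjectPath("/Game/Sequences/EP101_SC010")  # hypothetical
job.map = unreal.SoftObjectPath("/Game/Maps/RobotCity")              # hypothetical

config = job.get_configuration()
# Request EXR frames rather than a compressed movie format.
config.find_or_add_setting_by_class(unreal.MoviePipelineImageSequenceOutput_EXR)
output = config.find_or_add_setting_by_class(unreal.MoviePipelineOutputSetting)
output.output_directory = unreal.DirectoryPath("D:/Renders/EP101_SC010")  # hypothetical
output.output_resolution = unreal.IntPoint(3840, 2160)

# Render in-editor; frames land in the output directory for the post house.
subsystem.render_queue_with_executor(unreal.MoviePipelinePIEExecutor)
```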

DS: That sounds pretty groundbreaking. Did you know in advance that you would be able to output this way?

PF: Our DP Enrico Targetti has a live-action background, but he’s also an Unreal artist, and he’s worked as a colorist in the past. So he's very fluent in that workflow. We connected him with Epic's color scientists over a year and a half ago, getting tests going, making sure that we were going to get what he needed. And he actually suggested things to them that made their way into the engine. It's one example of how collaboration has been a part of changing the engine to facilitate this.

AM: The two-minute test we did was actually to show that this was possible, though we knew we had a long way to go. But with Epic’s support and the team at Reel FX, we were able to build this, though there were challenges along the way.

DS: How much of what you came up with is proprietary to Reel FX? And how challenging will it be for others to do what you have just done?

PF: Without David Ross, Legal Counsel for Reel FX, on the phone, I don't know that we can answer that question [all laugh]. But I can say that the collaboration with Unreal Engine has always been in the spirit of it being for many users. There are bespoke tools that we need for our pipeline in our show, and there are general innovations that Epic is working into the engine. I think that it's good for everyone.

MA: I think we can talk about Ozone and Rich Hurrey’s team at Kitestring. Ozone was our secret weapon in this whole production pipeline, because Rich and his team developed a rigging tool in Unreal. So you don't have to take the characters out and rig them someplace else and then use plugins to make it work in Unreal. It already works in Unreal and it works in real-time. This was a huge, huge boon for us in production. We couldn't necessarily have done this show at the level and with the quality we wanted without Rich and his team.

AM: Game companies have been working on graphics for years and years and years, and that's why we're where we are today, where we can have ray tracing in real-time, those kinds of things. But something that's been missing in the game space for a long time has been the ability to deform the characters, which really allows animators to hit the quality bar that they need to. And that's what this technology allows us to do – get much more powerful deformations, so that the characters can look and move so much better than they could before.

MA: And that's where most of the skepticism comes from. When you say “game engine,” everybody thinks it's going to look like a video game. And what they’re talking about is that the animation, the facial animation specifically, is so poor in games. They haven't worked on that enough. Rich and his team solved that, so we used it. And you'd never know that this was done in engine.

DS: What were the biggest challenges that you faced on this production? Were they primarily technical, or was it as much about overcoming skepticism?

MA: From a directorial and story perspective, what I wanted to do, what I wanted the show to be, I was able to say the proof is in the pudding. So if there was skepticism or pushback, I didn’t have to get into debates. I’d say, "Give me five minutes on stage. I will shoot the scene. You will see it at the end of the day." So you get that real-time feedback, which takes care of those worries about whether it's going to work or not. I remember, when we first rolled off the stage with our first episode, how ecstatic everybody was. Any worries got erased right then and there, and this was still rough footage. But the energy, the camera dynamics, the editing dynamics, all that stuff was there. If my execs want to change anything, even that’s easy in this process, because I already have all the assets done.

AM: At the beginning of this, there were a ton of challenges. We were building a new pipeline and way of working. We were building a whole new division. This is the first series that Reel FX had fully developed and produced, and we were doing this all in the midst of COVID. So there were a ton of things to worry about. I will say that, on the creative side of things, there were no issues because we had Mark Andrews. He's not only one of the best directors out there, but also one of the best at working with teams. He knows all the artists by name. He knows the story backwards and forwards. And the producers did a fantastic job of putting together a good team that believed in what we were doing and believed in the process. So the real challenge was building this pipeline from the ground up, but it was a challenge that we were really excited to take on. It felt like a startup in many ways.

DS: Last question. I don't know whether you've given this much thought, but how much time do you think you saved? How many fewer people did you need? Can you quantify it at all?

MA: That's a great question. I think it's tricky, because you could have done this in a traditional way in the same amount of time that we did it. You could've done it in two years, piece of cake, but I wouldn't have been able to direct every episode. You would need at least four or five directors. You would need a much larger animation crew – probably double or triple the size to get where we got. And just to add another qualifier to this: in traditional television animation, you're given about six to eight weeks to storyboard one episode.

We had an episode filmed every two weeks. That's in the can. So I'm six weeks ahead of everybody else doing it this way. And those time savings obviously equate to money. The much smaller crew saves money. So we do it in the same amount of time, but we get much better quality. There's much more authorship from the crew members themselves. It isn't prescriptive, as animation usually is – "I want this shot for this and that's it." There's no play and exploration.

So we buy all that time back where I could improv on stage. I could go in with an idea and find something completely new. In the traditional ways, you really can't. Your hands are a little bit more tied. Don't get me wrong, we have great animated shows that have been working this way. But if you give the creative and the crew much more freedom, where the tools can move as fast as these people can think, they do much better work at the end of the day. It's transformative.

PF: You get to story faster, and you get to 3D story faster. It's really a shift in mindset – a hybrid between live-action and animation. It's something different, and you have a different feel to the show. I also don't want to undervalue the gratitude that we feel for our partnership with Reel FX and Netflix. They let us do this whole series this way and they supported it.

MA: Yeah, Reel FX and Netflix jumping into this and being supportive of this brand-new mystery process – we walked up to them and said, "We can turn lead into gold," and they said, "Okay."

Dan Sarto is Publisher and Editor-in-Chief of Animation World Network.