James Knight Talks the Exploding World of Digital Filmmaking Technology

The veteran visual effects and performance-capture expert, and current M&E director at AMD, talks about exciting, seismic changes taking place in visual effects and virtual production.

As the Global Director of Media and Entertainment at AMD, James Knight leads a team that supervises alliances with video streaming, VFX, virtual production, and post-production computing application vendors. Knight and his colleagues also work directly with the world's most prominent media brands, studios, and streaming platforms, ultimately supporting a wide range of entertainment content creators - and software vendors like Adobe and Autodesk - by making sure their applications “run screamingly well” on AMD’s CPUs, such as the Ryzen™ Threadripper™ PRO. Which they do… on projects too numerous to mention, though we will mention current box office champ Avatar: The Way of Water.

Over a two-decade career in production, post-production, visual effects, and virtual production, Knight has worked on numerous films, including serving as performance capture project manager on James Cameron’s seminal, Oscar-winning 2009 epic, Avatar. Knight is a member of the Visual Effects Society, the British Academy of Film and Television Arts (BAFTA), and the Academy of Motion Picture Arts and Sciences.

In an interview conducted at the 2022 VIEW Conference in Turin, where he spoke on several panels, and a recent follow-up call, we spoke with Knight about how AMD is helping to bring the latest digital technology to the M&E space, how previsualization is helping executives make more informed production decisions, and what the future looks like for virtual production in the visual effects industry.

Dan Sarto: We’re witnessing this incredible explosion of digital tool-driven content creation that has allowed a kind of storytelling that wasn’t possible just a few years ago. How do you and AMD view this new creative landscape, and how does it impact your own planning regarding product development and marketing?

James Knight: For one thing, we view it as an awful lot of fun. I think people tend to lose sight of that. The other way we view it is as a collaborative enterprise. Of course, we’re fully cognizant that it’s a business, but working with filmmakers isn’t about saying, "So, how many of these do you want?" and then saying, “See you later.” Oftentimes we're working directly on projects, or with assets generated for older projects, to optimize future generations of our technology.

One of the things my team did a couple of years ago was take a whirlwind tour of the studios and engineers in L.A. to find out what they liked and what they wanted. We take it quite seriously. And the influence that filmmaking and content creation has had on the development of our CPUs is definitely noticeable. Media and entertainment, content creation, drives innovation in other verticals. It really does.

DS: Similarly, I would imagine that bringing some of the engineers and executives in to meet the folks who are using the products in media and entertainment gives AMD a deeper, more nuanced perspective on your business in this space.

JK: It does do that. I bring them out to Los Angeles so they can see our technology being used in the field – at the studios, at the visual effects houses, sometimes on set. Without that, it would be like trying to explain how to drive a car without actually driving one. You've read the manual, but you've never actually done it. How are you supposed to market a product, how are you supposed to iterate a product, if you don't actually go and see how it's being used by the people that are buying it? And that's a large element in how we lead AMD's initiative into media and entertainment content creation.

DS: It's very important, because CG technology is so ubiquitous in content creation. Unfortunately, there's still a huge number of people involved in decision-making for this content that really have very little concept of how computers fit into the picture. They don't understand how much the artist's hand still goes into content creation. They really think, "Well, it's done with computers." And that can have a stifling impact on what gets made, how it gets made, and how it gets financed.

JK: I have a somewhat cynical view on that, which is that executives sometimes want to be ignorant about the process so that they can push harder without being concerned about the actual details. But, as far as changing people’s perceptions, I think what happened with the original Avatar is instructive. I don't think we realized it at the time, but it was quite a pioneering film. Some would argue that virtual production really was born with that.

One of the brilliant minds on that film was Glenn Derry, who was the virtual production supervisor. His company got bought by 20th Century Fox and was turned into Fox Visual Effects Labs. What Glenn was doing was virtual rapid prototyping. His company used Unreal Engine to rapidly prototype films virtually, essentially making the film before it was greenlit – visualizing it using game engines and rendering it at as high a quality as possible. The idea was that you'd show it to film execs and their families. It's one thing to read it and imagine it, but another to actually see it. So, I think we played a big role in that, and NVIDIA's GPUs played a big role in that as well.

I think that executives and producers are starting to understand technology's role in getting projects greenlit, and in being able to iterate, spitball in a fashion, where they can actually see something, they can visualize it. I think technology is making it easier to get things into production.

DS: I know that, for years, folks had ways to previs things faster and to make them look better. And they would still be greeted by, "Wait, is that how it's going to look when it’s finished?" The executives had no idea how to look at it. With some executives, it caused more problems than just letting them see it on the page.

JK: Yeah, I've seen how it causes issues with production companies and studios. It didn't make sense to them. But when you explain it in detail and in economic terms, they understand. Say you want to shoot two days on a busy street corner, which is going to be expensive with an expensive crew. So, you tell them that you’re going to scan that corner in, or create it in Maya, and then you're going to sit on a mocap stage with a virtual camera for a couple of days, and figure out where you want the camera to be pointing at what time during the day, so you’re not discovering that on set with 150 people [sitting around] on payroll. They'll go, "Oh, that's a good reason to do previs." All of a sudden, they see the benefits in that aspect of virtual production.

DS: I know this is a really broad question, but how do you see technology from your company, and the companies your products are integrated with, impacting storytelling in the next handful of years?

JK: Technology affects the story by making more time for the art. Deadlines don't move and there's a lot of money involved, and a lot of scheduling involved, so you only have a certain amount of time to create your art. So, one way in which technology's going to continue to help that is in the ever-evolving struggle to give artists more time with their art. Each year, or thereabouts, we're coming out with a beefier CPU and, for a couple of months, we give artists more time with their art. But human beings, being human, will then go, "Oh, you have a 64-core CPU. This is amazing. Now I know what I'll do with 128 cores or even more." So, it's that delicate balance.

Another way in which technology impacts storytelling is that when you have a higher-capacity CPU, you can plug other technologies into the system, in addition to those NVIDIA GPUs or the network interface card. So, all of a sudden, you can stuff more into a computer, other bits of technology that make other things possible. It could be anything from a microphone that detects you're about to start rolling and lowers the fan speed, to a setup where every other frame on an LED wall is green.

There's some technology at AMD now, our FPGAs, that's in all the RED cameras and all the ARRI Alexa cameras. So, there's a lot we are doing there to streamline the story recording process so they can get to post-production much quicker. Taking footage directly from the camera, whether it's just live-action or simul-cam comped at the time of capture, we're doing some streamlining there. So, I think making the storytelling process easier, making the tools much easier to use, just equals better storytelling. We are just the tools – we want to get out of the way and make storytelling a lot easier.

DS: What are the technology concerns of people who are investing in their own IP, creating content that they're hoping to sell and get distributed? As you're trying to reach these people in various ways, what is it that resonates with them and gets them to devote their limited time and resources to pay attention?

JK: I think they think about security. When you have an idea, you don't want someone to rip it off or do their own version of it. Other than that, there’s no simple answer. If they're a filmmaker or content creator, they’re going to phone their friends, they're going to go to a Visual Effects Society event, or an Academy function or a BAFTA function – they're going to check in with their community. That’s why, as a technology provider, you've got to get stuck in with the community. You can't just drop-ship a few CPUs or the latest computer and say, "See you, guys"; you've got to get stuck in and develop relationships and make sure they know that this stuff works. That's how we are working hard to become part of the fabric of visual effects.

DS: In your travels through the M&E space, what are some of the things you're seeing being done that excite you the most?

JK: I love the fact that the studios are really starting to take virtual production seriously, and they're looking to understand what those two words together mean now – the use of real-time visual effects. I really think it's the largest area of innovation in visual effects. I love that game development has had an effect on feature film production. It seems like they benefit from each other and they're all kind of watching. You'll now see filmmakers at GDC, the Game Developers Conference. 20 years ago, when you heard “games,” you’d think of teenage boys playing Atari. And that's not the case. What I love is the cross-pollination of two otherwise separate industries coming together.

DS: I think what needs to be emphasized, and what your experience in this industry clearly demonstrates, is that this isn’t some kind of intellectual exercise. People are delivering on projects, and you and AMD are right there alongside them helping them get it done.

JK: It wasn't until I moved over to the technology side of visual effects that I realized how incredibly tethered art and technology are. Technology's always been involved in art. The pen and the pencil are forms of technology. Even words on a page are a kind of technology. Art and technology have always driven each other. If you can visualize something with art, then it can also influence technology. They both inspire each other.

Dan Sarto is Publisher and Editor-in-Chief of Animation World Network.