CinemaTech
[ Digital cinema, democratization, and other trends remaking the movies ]


Tuesday, August 30, 2005

Videogame / Movie Convergence


VFX World has a really swell Q&A with Cliff Plumer, chief technology officer at Industrial Light & Magic, posted just yesterday.

Most interesting to me was the discussion of ILM "borrowing" some of the software that LucasArts (a sister company) originally developed for video games. The video game artists use it to animate characters and create scenes; the movie artists are starting to use the same stuff for creating "previsualizations" of movie scenes. Previsualizations can help reduce the costs of special effects-laden sequences by getting the director and the post-production crew singing from the same hymnal.

From Barbara Robertson's interview:

    CP: Now that we’re all under one roof [at the Letterman Digital Arts Center in San Francisco], ILM can take advantage of their game engine, and LucasArts artists have access to things we take for granted. Look at something like crowd simulations. They have been big in effects for the last few years, but have been used in games for a long time. We can take advantage of their AI engines, their game engines and integrate them into the visual pipeline.

    The big win for ILM, though, is in previsualization. A visual effects supervisor can sit with the director and have a synthetic scene move around in realtime. The director can block in a scene and do a camera move with a virtual camera. It feeds the whole post process.

    BR: But that isn’t new, is it?

    CP: It hasn’t been intuitive. Previs in the past has been a scaled-down post-production operation. The director comes in, we make a change and show it to him. What we’re saying is, “Let’s make this like photography; do it in realtime.” This is something we’ve been developing in conjunction with LucasArts — to hand the previs to the director. It’s almost like a game.

    BR: Who has used it?

    CP: It hasn’t been used on a film that’s been released, but it’s in use. It’s also still under development. Ask me again in a couple months.

    BR: So does this mean vfx artists will no longer do previs?

    CP: The game engine part is designed to work in realtime. The director can plan how to shoot a live action or block a CG scene. Contained in the application are libraries of lenses and so forth. But, we can also record the camera moves, create basic animations and block in camera angles. And instead of handing rendered animatics to the CG pipeline, we have actual files — camera files, scene layout files, actual assets that can feed into the pipeline. It gives the crew input into what the director is thinking.

    BR: Was this something George Lucas used for Star Wars?

    CP: No. It was driven by Star Wars. It was something George has felt strongly about. But, the tools weren’t ready for him.

    BR: Can you see anyone other than directors and vfx supervisors using it?

    CP: DPs might use it as well to place lights and see how a scene could be lit or shot. It isn’t solely for directors.

    BR: Do you expect the game engine to be used on post-production?

    CP: No. Only during early stages in previs. At this stage, the game engine can’t hold as complex a scene as we need for film effects. To get the complexity we need, we’d be compromising the realtime performance.

Later, they talk about "sharing assets" among films and videogames - reusing digital characters, props, or scenery.

It's not hard to imagine a time when the same team of people works together to create a movie and a game. They're essentially creating a digital environment and cast of characters, and perhaps doing some live-action photography with actors to complement that. One "product" is a videogame where the player (or players, in the case of a "massively multiplayer game") can move through that environment at will, racking up points or killing people or solving mysteries; another "product" is a movie -- a linear narrative with a beginning, middle, and end, featuring the same characters and environments.

What gets really interesting is when the director starts producing multiple versions of a movie, perhaps based on feedback from her audience... or when audiences themselves start using the environments and characters crafted by the creative team to tell their own stories.

How far off is this convergence of movies and games? Some would say it's already happening; I think it arrives in a really significant way within the next four years.

1 Comment:

  • Scott - again a really nice article.

    The first use of game engines for previs that I'd heard of was when they used a hacked and modded Unreal Tournament engine for Spielberg to make that future Vegas-looking place (the pleasure island thing) in A.I. They had a greenscreen shoot, and the camera's orientation and moves were fed to an attached computer so a realtime composite of the live actors over the proxy 3D scene could be rendered... in real time.

    On-set previsualization as opposed to preproduction previsualization, but cool all the same. I think the details were discussed in the Cinefex of the time.

    -mike

    By Mike Curtis, at 4:37 PM

