CinemaTech
[ Digital cinema, democratization, and other trends remaking the movies ]

AD: Fans, Friends & Followers

Monday, January 07, 2008

Cinital's new take on green screen tech

In Sunday's Boston Globe, I wrote about a small company called Cinital that's trying to bring down the price of high-quality, real-time green screen compositing. What's novel about their approach is that the camera can move anywhere it wants -- or change focus -- and the background responds appropriately.

From the story:

    Ordinarily, it's hard to tell what live actors will look like once these digital backgrounds are laid in; that work, called "compositing," is usually done afterward by visual effects specialists. But the concept behind Mack's company is to mix the actors and the backgrounds in real time, so the director can see what the final shot will look like by glancing at a high-definition monitor - and reduce or eliminate the costs of all that laborious, after-the-fact compositing.

    Mack's Cinital system could be used on as many as 20 TV productions and a handful of feature films this year, says Sam Nicholson, chief executive of Stargate Digital, a South Pasadena, Calif., visual-effects firm that bought the first system. One of the first projects to which Cinital contributed is NBC's new made-for-TV movie "Knight Rider," which airs next month.

    "This year is going to be a watershed year for us and this technology," says Nicholson, whose company has contributed visual effects to movies like "Charlie's Angels 2" and TV shows like "Heroes."


So far, the prototype system has been used for pre-viz on the forthcoming "Knight Rider" TV movie and an episode of "Saving Grace."

I also shot some video of the system, and Cinital founder Eliot Mack.
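For readers curious what "real-time compositing" means under the hood, here's a minimal sketch of the basic idea - keying out the green screen and blending in a background per pixel. This is a toy difference key written for illustration, not Cinital's actual algorithm, and the `green_strength` parameter is an invented knob:

```python
import numpy as np

def chroma_key_composite(fg, bg, green_strength=1.0):
    """Composite a green-screen foreground over a background.

    fg, bg: float32 RGB images in [0, 1] with the same shape (H, W, 3).
    Toy difference key for illustration only (not Cinital's method).
    """
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    # A pixel counts as "green screen" to the degree its green channel
    # exceeds the brighter of its red and blue channels.
    spill = g - np.maximum(r, b)
    alpha = 1.0 - np.clip(spill * green_strength, 0.0, 1.0)
    alpha = alpha[..., None]  # broadcast over the 3 color channels
    return alpha * fg + (1.0 - alpha) * bg
```

A real system does this (plus spill suppression, edge softening, and camera tracking) at HD resolution, every frame, fast enough to feed the director's monitor live.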


13 Comments:

  • wow - from someone who works in vfx that looks very impressive. going to be interesting when this hits digital rebel affordability - which probably won't be long.

    By Blogger deepstructure, at 1:29 PM  

  • I'm curious what the reality is here. Are they just applying a blur to the background? Because that will fall apart fast. Do they have some type of multilayered focus image for the background? If you're shooting in 4k for your foreground actor - how many k would the background have to be to do that pan move in it... 12k?

    Either this technology is a big yawner or it's unbelievably innovative.... it would all be in the details which weren't given. I've seen some ways of shooting spaces with multilayers that you could later change the lighting - that was impressive - yet - also somewhat impractical for the shooting. This reminds me of that technology

    By Blogger The Unknown Filmmaker, at 6:41 AM  

  • Very interesting. Not just live chroma-key but real-time compositing.

    QUESTIONS
    1. What kind of lens and tripod sends all this data to the computer? We're not talking prosumer cameras, right?

    2. How far can you pan before hitting the edge of the background shot?

    3. Can this be synchronized with real-time virtual sets? That is, a 3-D CGI set being panned, zoomed and focused. (This would solve #2 above.)

    4. What about dolly and crane moves?

    Rob:-]

    By Blogger Rob:-], at 3:17 PM  

  • All my questions were answered on their web site.

    What I didn't ask was, how much?

    By Blogger Rob:-], at 3:42 PM  

  • They're talking about $85K, which doesn't include an HD camera. They were at NAB last year, and I suspect they'll be back again this year...

    By Blogger Scott Kirsner, at 3:46 PM  

  • I went to the Cinital site and downloaded their example quicktime. It is a shot of two people in a car driving down a motorway.

    Not to knock the tech involved - I think that it's a great concept, but I was very disappointed with the results - It SO looked like a process shot :(

    If this is the case playing off my laptop then what about on a TV or even worse, on a cinema screen?

    I think that it's a great idea and I truly hope that they can find sufficient development dollars to take it further. I would love to be able to use this kit, especially on a feature film that I am planning to shoot on my RED One cameras.

    By Blogger randomfactor, at 10:34 AM  

  • I agree with you, randomfactor, about the look. I think the point is that it's like a pre-vis. That is, the cast and crew can get a pretty good idea of what the final shot is going to look like and make adjustments to improve it on the next take.

    The postproduction process would then take the extra time to pull a perfect key, hand-rotoscoping if required, to get the finished frames.

    It's a production tool, not a post production tool. Somebody correct me if I'm wrong here.

    Peace,

    Rob:-]

    By Blogger Rob:-], at 12:52 PM  

  • "But the concept behind Mack's company is to mix the actors and the backgrounds in real time, so the director can see what the final shot will look like by glancing at a high-definition monitor - and reduce or eliminate the costs of all that laborious, after-the-fact compositing."

    i'd say at the minimum they're trying to be a production tool and what they're really trying to do is put post-production compositing out of work. :)

    By Blogger deepstructure, at 1:39 PM  

  • I looked at the site and my question about blur was answered - yes, it's flat plane blur which is non-cinematic.

    Honestly - I think this will end up being more a tool/toy for industrial and television usages and not for cinema. I feel like it is solving something which isn't a major problem. If you look at their sample keys - they're not very good. That's a problem. Yes, they track - but not with any parallax in the backframe, yes the focus changes, but on a flatplane. These are the same problems post has.

    In cinema - anything that can be pushed to post will be - and everything this device does can be done in post. I just don't feel like the cinema industry is hungry for this. News and industrial might eat it up like hotcakes, I don't know.

    By Blogger The Unknown Filmmaker, at 4:24 PM  

  • Actually, the background defocus is calculated on a per-pixel basis, depending on the 3D background object's distance from the lens and the lens parameters. The video backgrounds shown, however, are playing on a planar 3D element in the 3D scene, and thus defocus like a 2-D element.

    What I need to do is show an example of walking the focus range backwards and forwards through a 3D scene, so that the photographically accurate depth of field calculations are more obvious. I'll add that to our next site update so that it's more clear what we are doing.

    By Blogger Unknown, at 6:20 PM  
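The per-pixel defocus Mack describes can be sketched with the standard thin-lens circle-of-confusion formula: the blur diameter for a point depends on its distance from the lens, the focus distance, and the lens parameters. The function below is an illustrative model with made-up example lens values, not Cinital's actual calibration:

```python
def blur_circle_px(depth_m, focus_m, focal_mm=50.0, f_number=2.8,
                   sensor_width_mm=24.0, image_width_px=1920):
    """Circle-of-confusion diameter in pixels for a point at depth_m.

    depth_m: lens-to-point distance (m); focus_m: focus distance (m).
    Thin-lens approximation; lens values are illustrative assumptions.
    """
    f = focal_mm / 1000.0  # focal length in meters
    s = focus_m
    # Blur-circle diameter on the sensor (meters):
    #   c = (f^2 / (N * (s - f))) * |depth - s| / depth
    coc_m = (f * f / (f_number * (s - f))) * abs(depth_m - s) / depth_m
    coc_mm = coc_m * 1000.0
    # Convert sensor millimeters to image pixels.
    return coc_mm * image_width_px / sensor_width_mm
```

Evaluating this per pixel against the 3-D scene's depth gives photographically plausible depth of field, whereas a video background on a single planar element gets one uniform blur - exactly the distinction Mack draws above.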

  • The camera does indeed track in 3-D, generating true parallax. The example shots don't have dolly moves because I didn't have a nice 3D background scene that would show off the camera motion. We certainly should include that in our site, however, to make it more clear.

    We handle the live production/post production question by providing support for both workflows. The HDSDI output can be recorded and used as a final or as a proxy, depending on the goals of the production, and is timecode-matched to the camera motion data files that are recorded. These can be brought into all of the major 2D and 3D VFX systems to enable integration into a traditional workflow.

    By Blogger Unknown, at 6:44 PM  

  • hey eliot, thanks for the further clarification. you'd definitely do well to showcase those elements on your site. very impressive tech. best of luck with it (though you may put me out of business!). :)

    By Blogger deepstructure, at 6:49 PM  

  • No problem. I wouldn't worry much about being replaced; what we're aiming at is automating the parts of VFX production that aren't much fun to do anyway.

    Trying to post-track a bunch of markers that are out of focus, or rotoscoping hair when the actor walks in front of a marker, is not usually very enjoyable. Most artists would prefer to focus on making great shots, rather than struggling after the fact with compromises made on set.

    It's simply nicer to begin a composite with pristine tracking data and a clean greenscreen plate where potential lighting problems were already caught and fixed on set.

    By Blogger Unknown, at 7:10 PM  
