Hugo

The newest stereo 3D film sensation promises to be Martin Scorsese’s Hugo, just in time for the holidays. The film is the director’s first 3D venture and is based on The Invention of Hugo Cabret, a children’s graphic novel written and illustrated by Brian Selznick. It’s the story of twelve-year-old Hugo, an orphan who lives in the walls of a busy Paris train station. Hugo gets wrapped up in a mystery involving his father and a strange mechanical man.

Scorsese – who’s as much a film buff as an award-winning director – has a deep appreciation for the art form of past 3D films, like Dial M for Murder. In adapting this fantastical story, Scorsese and his Oscar-winning team have an ideal vehicle to show what stereo 3D can do in the right hands when approached with care. Unlike the groundbreaking Avatar, which relied heavily on motion capture and synthetic environments, Hugo is a more traditionally cinematic production, with real sets and actors, grounded in the classical language of filmmaking.

Hugo started production in 2010 using then-prototype ARRI ALEXA cameras, which were configured into special 3D camera rigs by Vince Pace. The ALEXA was the choice of cinematographer Bob Richardson for its filmic qualities. Camera signals were captured as 1920 x 1080 video with the Log-C color profile to portable HDCAM-SR recorders. Hugo will be the first 3D release produced with this particular equipment complement. With post for Hugo in its final stages, I had a chance to speak with two of Scorsese’s key collaborators, Rob Legato (visual effects supervisor and second unit director of photography) and Thelma Schoonmaker (film editor).

Developing the pipeline

Rob Legato has been key to the visual effects in many of Scorsese’s films, including The Aviator. For Hugo, Legato handled effects, second unit cinematography and, in fact, developed the entire start-to-finish stereo 3D post pipeline. Legato started our conversation with the back story, “I had done a small film with the ARRI D-21 and Bob [Richardson] loved the look of the camera. He liked the fact that it was produced by a traditional film camera manufacturer, so when the ALEXA came out, he was very interested in shooting Hugo with it. In order to make sure that the best possible image quality was maintained, I developed a DI workflow based on keeping all the intermediate steps in log space until the end. All effects work stayed in log and dailies color correction was done in log, so that no looks were baked in until the final DI stage. We used LUTs [color look-up tables] loaded into [Blackmagic Design] HDLink boxes for monitoring on set and downstream of any of the visual effects.”
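To make that concept concrete, here’s a minimal Python sketch of the monitoring idea: the log-encoded pixels pass through untouched for editorial and VFX, while a viewing LUT is applied only on the monitor path. The simple 1D .cube parsing and the file name are hypothetical stand-ins; on the production this job was done by hardware HDLink boxes loaded with the actual LUTs.

```python
import numpy as np

# A minimal sketch (hypothetical file format handling): the camera's
# log-encoded pixels stay untouched for post, while a 1D viewing LUT is
# applied only to the monitoring path -- the same idea as loading a LUT
# into an HDLink box downstream of the VFX chain.

def load_cube_1d(path):
    """Parse a simple 1D .cube LUT file: one RGB triple per line."""
    values = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith(("#", "TITLE", "LUT_1D_SIZE")):
                continue
            values.append([float(v) for v in line.split()[:3]])
    return np.asarray(values)  # shape (N, 3)

def apply_viewing_lut(log_frame, lut):
    """Map normalized log-space pixels (0..1) through the LUT, per channel."""
    positions = np.linspace(0.0, 1.0, lut.shape[0])
    out = np.empty_like(log_frame)
    for c in range(3):  # R, G, B
        out[..., c] = np.interp(log_frame[..., c], positions, lut[:, c])
    return out

# log_frame = read_frame(...)   # stays in log for editorial and VFX
# monitor = apply_viewing_lut(log_frame, load_cube_1d("logc_to_709.cube"))
```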

“The original dailies were color corrected for editorial on a Baselight unit and that information was saved as metadata. We had both an Avid Media Composer and a Baselight system set up at my home facility, The Basement. Thelma cuts on Lightworks, but by mirroring her edits on Media Composer, I had the information in a form ready to distribute to the visual effects designers. I could load the color grades developed by Marty, Bob and the colorist for each scene into my Baselight, so that when I turned over finished VFX shots to Thelma, they would have the same look applied as those shots had from the dailies. That way a VFX shot wouldn’t be jarring when Thelma cut it back into the sequence, because it would match the same grade.”

Working in the language of stereo 3D

The key to the look of Hugo is the care put into the stereo 3D images. In fact, it’s very much a handcrafted film. Legato continued, “All the 3D imagery was done in-camera. You could never accomplish this type of look and emotional feel with the post production rotoscoping techniques used to turn 2D films into 3D. Stereo was designed into the film from the very beginning. Not 3D gags, but rather a complete immersive style to the sets, lighting, camera moves and so on. Marty and Bob would watch the shots on set in 3D wearing their glasses. Performances, lighting, stereography and the position of items in the set were all tweaked to get the best results in 3D. The sets were designed for real depth, including elements like steam and particles in the air. You feel what it’s like to be in that space – emotionally. In the end, the story and the look are both a real love affair with motion pictures.”

One of the common complaints from stereo 3D critics is that cinematographers cannot use shallow depth-of-field for storytelling. Legato responded, “Marty and Bob’s approach was to create those depth cues through lighting. We erred on the side of more focus, not less – more in the style of Gregg Toland’s work on Citizen Kane. Monitoring in stereo encouraged certain adjustments, like lighting little parts of the set in the background to gain a better sense of depth and control where the audience should focus its attention. That’s why stereoscopic post conversion of 2D films doesn’t work. You cannot put off any part of the art form until later. You lose the taste of the artists. You lose the emotional advantage and the subtlety, because the process hasn’t been vetted by decisions made on the set through staging.”

A tailored approach

At the time of this interview, the film was in the final stages of stereo adjustments and color grading. Legato explained, “Unlike a 2D film, the finishing stage includes a final pass to tweak the 3D alignment. That is being handled by Vince Pace’s folks with Marty and Bob supervising. When they are done, that information will go to the colorist to be integrated into the grade. Greg Fisher has been our colorist throughout the film. Often you don’t have the same colorist for dailies as for the DI, but this is a color workflow that works best for Bob. By establishing a look during dailies and then carrying that data to the end with the same colorist – plus using Baselight at both ends – you get great continuity to the look. We tailored the most comfortable style of working for us, including building small 3D DI theaters in England and New York, so they could be available to Marty where he worked. That part was very important in order to have proper projection at the right brightness levels to check our work. Since the basic look has already been established for the dailies, now Greg can concentrate on the aesthetics of refining the look during the DI.”

Cutting in 3D

Thelma Schoonmaker has been a close collaborator with Martin Scorsese as the editor for most of his films. She’s won Best Editing Oscars for The Departed, The Aviator and Raging Bull. Some editors feel that the way you have to cut for a stereo 3D release cramps their style, but not so with Schoonmaker. She explained, “I don’t think my style of cutting Hugo in 3D was any different than for my other films. The story really drives the pace and this is driven by the narrative and the acting, so a frenetic cutting style isn’t really called for. I didn’t have to make editorial adjustments based on 3D issues, because those decisions had already been made on set. In fact, the stereo qualities had really been designed from take to take, so the edited film had a very smooth, integrated look and feel.”

Often film editors do all their cutting in 2D and then switch to 3D for screenings. In fact, Avatar was edited on an older Avid Media Composer Adrenaline system without any built-in stereo 3D capabilities. Those features were added in later versions. Hugo didn’t follow that model. Schoonmaker continued, “I cut this film in 3D, complete with the glasses. For some basic assemblies and roughing out scenes, I’d sometimes switch the Lightworks system into the 2D mode, but when it came time to fine-cut a scene with Marty, we would both have our glasses on during the session and work in 3D. These were flip-up 3D glasses, so that when we turned to talk to each other, the lenses could be flipped up so we weren’t looking at each other through the darker shades of the polarized glass.”

Thelma Schoonmaker has been a loyal Lightworks edit system user. The company is now owned by EditShare, which was eager to modify the Lightworks NLE for stereo 3D capabilities. Schoonmaker explained, “The Lightworks team was very interested in designing a 3D workflow for us that could quickly switch between 2D and 3D. So, we were cutting in 3D from the start. They were very cooperative and came to watch how we worked in order to upgrade the software accordingly. For me, working in 3D was a very smooth process, although there were more things my two assistants had to deal with, since ingest and conforming are a lot more involved.”

Prior to working on Hugo, the seasoned film editor had no particular opinion about stereo 3D films. Schoonmaker elaborated, “Marty had a very clear concept for this film from the beginning and he’s a real lover of the old 3D films. As a film collector, he has his own personal copies of Dial M for Murder and House of Wax, which he screened for Rob [Legato], Bob [Richardson] and me with synced stereo film projection. Seeing such pristine prints, we could appreciate the beauty of these films.”

Editing challenges

The film was shot in 140 production days (as well as 60 second-unit days) and Thelma Schoonmaker was cutting in parallel to the production schedule. Principal photography wrapped in January of this year, with subsequent editing, effects, mix and finishing continuing into November. Schoonmaker shared some final thoughts, “I’m really eager to see the film in its final form like everyone else. Naturally I’ve been screening the cuts, but the mix, final stereo adjustments and color grading are just now happening, so I’m anxious to see it all come together. These finishing touches will really enhance the emotion of this film.”

She continued, “Hugo is a fairy tale. It is narrative-driven rather than being based on characters or environments. That’s unlike some of Scorsese’s other films, like Raging Bull, Goodfellas or The Departed, where there is a lot of improvisation. Marty injected some interesting characters into the story, like Sacha Baron Cohen as the station inspector. These are more fleshed out than in the book and it was one of our challenges to weave them into the story. There are some great performances by Asa Butterfield, who plays Hugo, and Ben Kingsley. In fact, the boy is truly one of a great new breed of current child actors. The first part of the film is practically like a silent movie, because he’s in hiding, yet he’s able to convey so much with just facial emotions. As an editor, there was a challenge with the dogs. It took a lot of footage to get it right [laughs]. Hugo ends as a story that’s really about a deep love of film and that section largely stayed intact through the editing of the film. Most of the changes happened in the middle and a bit in the front of the film.”

From the imagery of the trailers, it’s clear that Hugo has received a masterful touch. If, like me, you’ve made an effort to skip the 3D versions of most of the recent popular releases, then Hugo may be just the film to change that policy! As Rob Legato pointed out, “Hugo is a very immersive story. It’s the opposite of a cutty film and is really meant to be savored. You’ll probably have to see it more than once to take in all the detail. Everyone who has seen it in screenings so far finds it to be quite magical.”

Some additional stories are featured in the Editors Guild Magazine, Post magazine (plus another from Post) and FXGuide.

And even more from Rob Legato and Thelma Schoonmaker.

Written for DV magazine (NewBay Media, LLC)

© 2011 Oliver Peters

Photo phun

I’m strictly an amateur when it comes to photography, though I still like to take my share of snapshots. Sometimes I’m lucky. As a holiday break I decided to play around with a hodge-podge of images – some from holiday times or winter locations and others not.

These were processed through Lightroom and Photoshop as well as the photo plug-in versions of Tiffen Dfx and Magic Bullet Looks. On some of these I was going for rich images, some for effects and others for a pseudo-painterly look. Although these were all still photos, the same looks and processes are applicable to video color grading and stylizing effects.

After the New Year I’ll be back with more standard film and video fare.

Merry Christmas and Happy Holidays!

A season for giving

The holiday times mean many things to different people. For some, it’s a religious festival. For others, it’s a time of reflection and a chance to concentrate more on family. As many have said, it’s a time of giving. Along those lines, one solution that is more in keeping with the sentiments of the season is to give to a favored charity rather than putting more items under the tree.

A few years back I edited a documentary entitled “Blindsided” – about a teenager (Jared) who had gone blind from a little-known disease, Leber’s Hereditary Optic Neuropathy (LHON). The documentary has aired a number of times in the subsequent years on HBO and HBO Family, but for me it struck a special chord. To an editor, colorist and writer, eyesight is especially valuable and it’s hard to imagine living without it.

The research into the disease is moving into an important phase and is in need of additional funding. If this post strikes a similar chord with you and you are able to make a contribution, then I’m sure they would appreciate your support.

LHON Research Project c/o Development Office
Bascom Palmer Eye Institute
P.O. Box 016880
Miami, FL 33101-6880

Click this link to learn more about LHON.

Thank you and Merry Christmas!

Oliver Peters

Rethinking NLE design

The launch of Apple Final Cut Pro X stood the editing industry on its collective ear. It also sparked some very interesting – and occasionally heated – discussions about what the design of a modern, nonlinear editing application really should be. The NLEs we all use today are a collection of terminology, tools, designs and workflows borrowed from different technologies.

Film editing was the first “nonlinear” editing method for picture and sound, because new clips could be spliced into the assembled sequence of clips. The clips that followed would simply move down in order with a corresponding change in overall duration. You could build a movie, reel or scene in any order, since these were all independent pieces that you were free to re-arrange.

Videotape editing was “linear”, because edited sequences were locked to their recorded location on a master videotape reel. Ironically, the earliest tape editing involved physically splicing pieces of magnetic tape and was by definition also nonlinear – even if that wasn’t as practical as it was in film. Once electronic editing came about, clips were locked against time as a function of their recorded position on the tape, so making changes that altered clip order and/or program duration required the re-recording of all the footage that followed the point of the change. You tended to build a program in a linear fashion, from start to finish.

Although electronic, random-access, nonlinear editing had a two-decades-long evolution, most younger editors recognize the original EMC2 and Avid Media Composer of the late 80s as the starting point for today’s software-based NLEs. These two pioneering systems mashed up functions from a variety of disciplines and have become the basis for current NLE design. Insert or splice-in edits and roller-style trimming came from film, as did bins and clips. Overwrite edits, a two-up (source-record) window display and digital video effects came from linear video editing. Copy-and-paste methods came from general computing and word processing. We all understood this paradigm … until June 2011.

Enter Final Cut Pro X. Along with some exciting new features, Apple decided to rethink what an NLE should be, amidst screams that they had thrown out 100 years of film editing grammar. For evidence, the critics point to the unified viewer, the magnetic timeline and the trackless timeline. As Apple is wont to do, the ProApps team designed the NLE that they thought was best for the future and not one built on evolutionary upgrades to the previous version’s feature set.

If you break the concept down, they have in many ways gone back to the core of editing as practiced by a film editor. That is, giving you tools built on editing story-centric timelines. Unlike previous NLE sequences, which place clips along a timeline based on a location in time, the FCP X timeline represents a series of connected clips organized along storylines. Clips have no fixed position in time, but rather a parent-child relationship to each other. The single, unified viewer and the behavior of the magnetic timeline are actually quite similar to editing film on a Moviola or flatbed. Having combined A/V clips with locked, sync audio is not unlike the relationship of sprocketed film sound, where the physical sprockets prevent sync errors.
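As a thought experiment, the parent-child model might look something like the following sketch. This is speculative code illustrating the concept, not Apple’s actual data model: a clip’s position is derived from its order in the storyline, and connected clips ride along with their parent, so sync can’t drift when the edit ripples.

```python
from dataclasses import dataclass, field

# A speculative sketch of the connected-clip concept -- not Apple's
# internal design. Position is computed from storyline order rather
# than stored as a timecode, and children stay attached to parents.

@dataclass
class Clip:
    name: str
    duration: float                                 # seconds
    connected: list = field(default_factory=list)   # (offset, Clip) pairs

    def connect(self, child, offset):
        """Attach a clip (music, a title) relative to this clip's start."""
        self.connected.append((offset, child))

@dataclass
class Storyline:
    clips: list = field(default_factory=list)

    def start_of(self, clip):
        """A clip's start time falls out of its order, not a stored value."""
        t = 0.0
        for c in self.clips:
            if c is clip:
                return t
            t += c.duration
        raise ValueError("clip not in storyline")

story = Storyline()
a, b = Clip("scene 1", 12.0), Clip("scene 2", 8.0)
story.clips.extend([a, b])
b.connect(Clip("music cue", 8.0), offset=0.0)

story.clips.insert(0, Clip("new opening", 5.0))  # everything ripples down...
print(story.start_of(b))                         # ...and the cue stays with scene 2
```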

So far the metaphors hold up, but the trackless timeline feels odd to me (at least for now) – especially for audio. The concept of tracks precedes the origins of editing. Think of a musical score used by a conductor. Multiple parallel staves define the music played by each instrument – violins on one, flutes on another and so on. This is no different from tracks in NLE software and it’s a concept that we’ve understood for hundreds of years. The same principles apply to film sound mixing, multi-channel audio production, live concert venues through a mixing board, as well as Apple Logic and GarageBand.

In an effort to simplify the navigation of NLE track-patching and management of target selection that tracks invariably require, Apple has chosen to make a more fluid timeline. Audio and video clips are not locked into any rigid vertical order. In fact, audio and video clips can live in completely opposite places from where a track-based system would organize them.

This reduces the number of keystrokes you need to edit a clip to the timeline, but it makes it much harder to engage in any type of track-based audio mixing. It also makes it all but impossible to use the timeline as a sort of visual scratch pad in the way many film editors use it. To mitigate the impact, Apple has added Roles, with an eye towards managing some of this via metadata. We’ll see if the industry warms to that. I think it will take a few more updates to fully flesh out this feature.
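Conceptually, Roles turn what used to be a track assignment into a metadata query. Here’s a tiny, hypothetical illustration (the clip names and role tags are invented): pulling a dialogue stem no longer depends on where the clips sit in the timeline.

```python
# A hypothetical illustration of the Roles idea: each audio clip carries
# a role tag, so building a stem is a metadata query rather than a
# track selection. All names below are invented for the example.

clips = [
    {"name": "interview_01", "role": "dialogue"},
    {"name": "score_cue",    "role": "music"},
    {"name": "door_slam",    "role": "effects"},
    {"name": "vo_take3",     "role": "dialogue"},
]

def stem(role):
    """Gather every clip tagged with a role, regardless of timeline layout."""
    return [c["name"] for c in clips if c["role"] == role]

print(stem("dialogue"))  # ['interview_01', 'vo_take3']
```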

Most of the concepts employed in FCP X are hardly new to NLEs. Many have been used in various other products at one time or another. Non-standard editing interfaces have generally not withstood the test of time. Witness Montage, EdiFlex, the original Adobe Premiere, ImMix, EditDroid and Quantel Harry. All of these used a layout that was vastly different from what we are used to now. Even Lightworks, which started out with a much different layout, eventually came to one based on the source-record-timeline model.

Apple is the only company that could do this and get away with it – and quite possibly have it pay off in the end. Avid would never do it. They’ve had Avid DS for years. It’s a great tool, but different enough from Media Composer that the veteran Avid base of Media Composer, Symphony and NewsCutter editors simply rejected early efforts to make it the flagship editing model. Apple, on the other hand, can afford to temporarily alienate its users without much effect on the bottom line. As such, they have time to hone the model, stomp out the bugs and gradually wait it out while a loyal core of new (and old) users grows to like the design of the product.

For me, FCP X (at version 10.0.2) is still a love-hate relationship. The redesign aids some parts of my workflow and hinders others. For example, on my own system, I work with two displays and no external broadcast monitor. The unified viewer gives me a larger video window within the UI. Running the Events browser on the left in the list view lets me quickly and visually identify footage as I arrow down through the list. The clip I’m on appears in the filmstrip at the top. On other NLEs this normally requires double-clicking each clip to load it into the source window and scrub through it to see what it is. Or setting the bin to thumbnail view, which I find useless most of the time.

FCP X isn’t the best tool for all projects, yet. It may never work for feature film editors the way “classic” FCP did. Right or wrong, Apple certainly isn’t waiting to hear what a small and notoriously change-resistant niche thinks about FCP X. After all, “video literacy” will likely drive tremendous sales for the product. Whatever the future holds, it sure has us thinking about what we need and don’t need in editing software. We’re not just comparing features, but really thinking about which NLE actually lets us cut better and have more fun while we are doing it.

Click here for Part II.

©2011 Oliver Peters

Goodies for the holidays

Time to catch up on the little odds and ends that will make your editing workday more pleasant. If you’re the buyer, it’s hard to ignore low-cost values, much less powerful free products. A number of developments over the past year have made this a great time if you’re ready to spruce up your system.

Starting out with free, look no further than DaVinci Resolve Lite. This is an industrial-strength color grading tool that’s available as a free download. Blackmagic Design recently removed the limitation on the number of nodes you could use with Lite, so if you don’t need to render bigger than HD sizes and you’re not working on stereo 3D projects, then Resolve Lite does everything you need. This includes XML roundtrips from FCP 7 and FCP X, plus the ability to read AAF files and MXF media from Avid systems.

Automatic Duck has been a valuable conduit for getting from point A to point B, but now the keepers of the Duck are under Adobe’s wing. Thanks to this deal, all of Automatic Duck’s plug-in products are available as free downloads from their website.

Apple Final Cut Pro X uses effects that are all based on Motion 5 presets. The beauty of the system is that enterprising users can create new filters simply by modifying effects in Motion and publishing the presets to FCP X. There are several editors around the globe who have produced a wealth of such presets as free downloads. Check here for some resources.

Speaking of FCP X, a number of traditional plug-in vendors have started to shore up the ecosystem with existing products that can install in a number of hosts, including FCP 7, FCP X, Motion 4, Motion 5 and After Effects. These include Noise Industries (and its partners), CHV, CoreMelt, CrumplePop, Motion|VFX and others. One handy set of filters and transitions comes from Digital Heaven. Although the DH products aren’t necessarily the flashiest, they are probably some that you’ll use frequently, because they are designed to fill the most common needs of daily editing.

New cameras like the ARRI ALEXA, RED One and RED EPIC have introduced editors to the need to understand and work with LUTs. Pomfort has been developing products for the ALEXA, including a “look” filter that works with FCP 7 and FCP X to adjust the Log-C curve to Rec 709 video. In addition, it can also read and apply custom looks designed in the Pomfort Silverstack software.
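To give a sense of what such a filter does under the hood, here’s a simplified Python sketch built from ARRI’s published Log-C decode constants (EI 800) and the standard Rec 709 transfer curve. Shipping plug-ins also apply a color matrix and smarter highlight handling, so treat this as illustrative math, not the Pomfort implementation.

```python
import numpy as np

# Simplified Log-C-to-Rec 709 conversion. The decode constants below are
# ARRI's published Log-C values for EI 800; a real viewing transform
# would add a color matrix and highlight roll-off on top of this.

CUT, A, B = 0.010591, 5.555556, 0.052272
C, D, E, F = 0.247190, 0.385537, 5.367655, 0.092809

def logc_to_linear(t):
    """Invert the Log-C curve back to scene-linear values."""
    t = np.asarray(t, dtype=np.float64)
    return np.where(t > E * CUT + F,
                    (10.0 ** ((t - D) / C) - B) / A,
                    (t - F) / E)

def linear_to_rec709(x):
    """Apply the standard Rec 709 transfer function (gamma encoding)."""
    x = np.clip(x, 0.0, 1.0)
    return np.where(x < 0.018, 4.5 * x, 1.099 * np.power(x, 0.45) - 0.099)

# Shadows, mids and highlights from a normalized Log-C frame:
print(linear_to_rec709(logc_to_linear([0.1, 0.4, 0.8])))
```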

Nick Shaw is a post consultant who developed one of the earliest Log-C-to-709 plug-ins for FCP 7. He hasn’t forgotten that this “legacy” NLE will see plenty of life in the next year or more. Shaw has developed another Antler Post FCP 7 plug-in to take RED footage that’s been rendered with the RedLogFilm gamma curve and correct it to 709 (plus burn-ins). As with his ALEXA filter, this effect is intended for use when offline editing with dailies in FCP 7.

Final Cut Pro X threw everyone for a loop, because there was no way to go between it and other applications. An update in September added FCP XML, which was promptly utilized by Assisted Editing to develop Project X27. FCP XML and the existing XML format (FCP 7 and Premiere Pro) aren’t the same, so this application handles the translation from an FCP X timeline (“project”) to an FCP 7 timeline (“sequence”) by interpreting one XML into the other. It only goes from X to 7, which some might find odd, since so many have asked for translation in the other direction. In reality, this is incredibly useful, because now you can bring your cuts into FCP 7 and from there to Color, Soundtrack Pro or Pro Tools. You can also export an XML which can be opened in Premiere Pro. Not to mention that since Automatic Duck’s plug-ins are free, you also have an avenue to go from FCP X to Avid systems or After Effects as needed. Assisted Editing also produces Event Manager, another productivity tool that manages access to Events and Projects while using FCP X.
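To make the translation idea concrete, here’s a bare-bones sketch of the general approach: walk the clips in an FCP X spine and emit FCP 7-style clipitems on a single video track. The element handling here is loose and illustrative – both schemas carry far more structure (frame rates, media references, effects) – and this is in no way how Project X27 works internally.

```python
import xml.etree.ElementTree as ET

def parse_seconds(value):
    """FCPXML times look like '5s' or '1001/30000s' (rational seconds)."""
    value = value.rstrip("s")
    if "/" in value:
        num, den = value.split("/")
        return float(num) / float(den)
    return float(value)

def fcpx_spine_to_xmeml(fcpxml_path, fps=25):
    """Translate the primary storyline into a single FCP 7 video track.

    Illustrative only: ignores connected clips, audio, effects and rates.
    """
    spine = ET.parse(fcpxml_path).getroot().find(".//spine")
    xmeml = ET.Element("xmeml", version="5")
    seq = ET.SubElement(xmeml, "sequence")
    video = ET.SubElement(ET.SubElement(seq, "media"), "video")
    track = ET.SubElement(video, "track")
    for clip in spine.findall("clip"):
        offset = parse_seconds(clip.get("offset", "0s"))
        duration = parse_seconds(clip.get("duration", "0s"))
        item = ET.SubElement(track, "clipitem")
        ET.SubElement(item, "name").text = clip.get("name", "untitled")
        ET.SubElement(item, "start").text = str(round(offset * fps))
        ET.SubElement(item, "end").text = str(round((offset + duration) * fps))
    return ET.ElementTree(xmeml)

# fcpx_spine_to_xmeml("project.fcpxml").write("sequence.xml")
```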

This is just the start, as the NLE market is very fluid and will continue to be so well into 2012. I expect to see at least two major plug-in packages come out for FCP X: GenArts Sapphire Edge and Magic Bullet Looks. Both are preset-based and depend on a custom browser. Right now there’s a hitch that won’t allow FCP X to send frames to an external application, so Magic Bullet Mojo, which uses slider controls, will work in FCP X, but Looks won’t. Both companies are in testing and my guess is that this will be resolved with the FCP X update due in early 2012.

©2011 Oliver Peters