Inside Llewyn Davis


Fans of Joel and Ethan Coen’s eclectic brand of filmmaking should be thrilled with their latest effort, Inside Llewyn Davis. The story follows Llewyn Davis, a struggling folk singer in the Greenwich Village scene around 1960 – just before Bob Dylan’s early career there. Davis is played by Oscar Isaac, who most recently appeared in The Bourne Legacy. The story was inspired by the life of musician Dave Van Ronk, as chronicled in the book “The Mayor of MacDougal Street”. Although this is the Coen brothers’ most recent release, the film was actually produced in 2012 in true indie filmmaking fashion – without any firm commitment for distribution. It was picked up by CBS Films earlier this year.

The Coen brothers tackle post with a workflow that is specific to them. I had a chance to dig into that world with Katie McQuerrey, who is credited as an additional editor on Inside Llewyn Davis. McQuerrey started with the Coen brothers as they transitioned into digital post, helping to adapt their editorial style to Apple Final Cut Pro. For many of their films, she’s worn a number of hats – helping to coordinate the assistant editors, acting as a conduit to other departments and, in general, serving as another set of eyes, ears and brain while Ethan and Joel are cutting their films.

McQuerrey explained, “Ethan and Joel adapted their approach from how they used to cut on film. Ethan would pull selects from film workprint on a Moviola and then Joel would assemble scenes from these selects using a KEM. With Final Cut Pro, they each have a workstation and these are networked together. No fancy SAN management. Just Apple file sharing and a Promise storage array for media. Ethan will go through a project, review all the takes, make marks, add markers or written notes and pass it over to Joel. Ethan doesn’t actually assemble anything to a timeline. He’s only working within the bins of the broader project. All of the timeline editing of these scenes is then done by Joel.” (Although there’s been press about the Coen brothers planning to use Adobe Premiere Pro in the future, this film was still edited using Apple Final Cut Pro 7.)

Inside Llewyn Davis was filmed on 35mm over the course of a 45-day production in 2012. It wrapped on April 4th and was followed by a 20 to 24-week post schedule, ending in a final mix by the end of September. Technicolor in New York provided lab and transfer services for the production. They scanned all of the raw 35mm negative once to 2K DPX files and performed a “best light” color correction pass on those files for dailies. In addition, Technicolor synced the sound from the mono mix of production mixer Peter Kurland’s location recordings. These were delivered to the editorial team as synced ProRes files.

McQuerrey said, “Ethan and Joel don’t cut during the shooting. That doesn’t start until the production wraps. Inside Llewyn Davis has a look for many of the scenes reminiscent of the era. [Director of photography] Bruno Delbonnel worked closely with [colorist] Peter Doyle to establish a suggested look during the dailies. These would be reviewed on location in a production trailer equipped with a 50” Panasonic plasma that Technicolor had calibrated. Once the film was locked, then Technicolor conformed the DPX files and Bruno, Ethan and Joel supervised the DI mastering of the film. Peter graded both the dailies and the final version using a [Filmlight] Baselight system. Naturally, the suggested look was honed and perfected in the final DI.”

Inside Llewyn Davis is about a musician and music is a major component of the film. The intent was to be as authentic as possible. McQuerrey continued, “There was no lip-syncing to the playback of a recorded music track. Peter [Kurland] recorded all of these performances live on set and that’s what ended up in the final mix. For editing, if we ever needed to separate tracks, then we’d go back to Peter’s broadcast wave file multi-track recordings, bring those into Final Cut and create ‘merged clips’ that were synced. Since Ethan and Joel’s offices are in a small building, the assistants had a separate cutting room at Post Factory in New York. We mirrored the media at both locations and I handled the communication between the two offices. Often this was done using Mac screen sharing between the computers.”

The Coen brothers approach their films in a very methodical fashion, so editing doesn’t present the kinds of challenges that might be the case with other directors. McQuerrey explained, “Ethan and Joel have a very good sense of script time to film time. They also understand how the script will translate on screen. They’ll storyboard the entire film, so there’s no improvisation for the editor to deal with. Most scenes are filmed with a traditional, single-camera set-up. This film was within minutes of the right length at the first assembly, so most of the editorial changes were minor trims and honing the cut. No significant scene lifts were made. Joel’s process is usually to do a rough cut and then a first cut. Skip Lievsay, our supervising sound editor, will do a temp mix in [Avid] Pro Tools. This cut with the temp mix will be internally screened for ‘friends and family’, plus the sound team and visual effects department. We then go back through the film top to bottom, creating a second cut with another temp mix.”

“At this stage, some of the visual effects shots have been completed and dropped into the cut. Then there’s more honing, more effects in place and finally another temp mix in 5.1 surround. This will be output to D5 for more formal screenings. Skip builds temp mixes that get pretty involved, so each time we send OMF files and change lists. Sound effects and ADR are addressed at each temp mix. The final mix was done in five days at Sony in Los Angeles with Skip and Greg Orloff working as the re-recording mixers.”

Even the most organized production includes some elements that are tough to cut. For Inside Llewyn Davis, this was the cross-country driving sequence that covers about one-and-a-half reels of the film. It includes another Coen favorite, John Goodman. McQuerrey described it this way: “The driving scenes were all shot as green-screen composites. There are three actors in the car at all times, plus a cat. It’s always a challenge to cut this type of scene, because you are dealing with the continuity from take to take of all three actors in a confined space. The cat, of course, is less under anyone’s control. We ‘cheated’ that a bit using seamless split-screens to composite the shots in a way that the cat was in the right place. All of the windows had to be composited with the appropriate background scenery.”

“The most interesting part of the cut was how the first and last scenes were built. The beginning of the movie and the ending are the same event, but the audience may not realize at first that they are back at the beginning of the story. This was filmed only one time, but each scene was edited in a slightly different way, so initially you aren’t quite sure if you’ve seen this before or not. Actions in the first scene are abbreviated, but are then resolved with more exposition at the end.”

Originally written for Digital Video magazine

©2013 Oliver Peters

The NLE that wouldn’t die

It’s been 18 months since Apple launched Final Cut Pro X and the debate over it continues to rage without let-up. Apple likely has good sales numbers to deem it a success, but if you look around the professional world, with a few exceptions, there has been little or no adoption. Yes, some editors are dabbling with it to see where Apple is headed – and yes, some independent editors are using it for demanding projects, including commercials, corporate videos and TV shows. By comparison, though, look at what facilities and broadcasters are using – or what skills are required for job openings – and you’ll see a general scarcity of FCP X.

Let’s compare this to the launch of the original Final Cut Pro (or “legacy”) over 12 years ago. In a similar fashion, FCP was the stealth tool that attracted individual users. The obvious benefit was price. At that time a fully decked out Avid Media Composer was a turnkey system costing over $100K. FCP was available as software for only $999. Of course, what gets lost in that comparison is that the Avid price included the computer, monitors, wiring, broadcast I/O hardware and storage. All of this would have to be added to the FCP side and, in some cases, wasn’t even possible with FCP. In the beginning it was limited to DV and FireWire only. But FCP introduced some key advantages over Avid systems right from the start. These included blend modes, easy in-timeline editing, After Effects-style effects and a media architecture built upon the open, extensible and ubiquitous QuickTime foundation. Over the years, a lot was added to make FCP a powerful system, but at its core, all the building blocks were in place from the beginning.

When uncompressed SD – and later HD – became the must-have items, Avid was slow to respond. Apple’s partners were able to take advantage of the hardware abstraction layer to add codecs and drivers, which expanded FCP’s capabilities. Vendors like Digital Voodoo, Aurora Video Systems and Pinnacle made it possible to edit something other than DV. Users have them to thank – more so than Apple – for growing FCP into a professional tool. By the time FCP 5 and 6 rolled around, the Final Cut world was well established, with major markets poised to shift to FCP as the dominant NLE. HD, color correction and XML interchange had all been added and the package was expanded with an ecosystem of surrounding applications. By the time of the launch of the last Final Cut Studio (FCP 7) in 2009, Apple’s NLE seemed unstoppable. Unfortunately FCP 7 wasn’t as feature-packed as many had expected. Along with a reluctance to chuck recently purchased PowerMac G5 computers, a number of owners simply stayed with FCP 5 and/or FCP 6.

When Apple discusses the number of licensees, you have to parse how it defines the actual purchases. While there are undoubtedly plenty of FCP X owners, the claim is that more seats of FCP X have been sold than of FCP 7. Unfortunately it’s hard to know what that really means. Since it’s a comparison to FCP 7 – and not every FCP 1-6 owner upgraded to 7 – it could very well be that the X number isn’t all that large. Even though Apple EOL’ed (end-of-lifed) Final Cut Studio with the launch of FCP X, it continued to sell new seats of the software through its direct sales and reseller channels. In fact, Apple seems to still have it available if you call the correct 800 line. When Apple says it has sold more of X than of 7, is it counting total sales of 7 (including those made after the launch of X) or only sales before that point? An interesting statistic would be the number of seats of Final Cut Studio (FCP 7) sold since the launch of FCP X as compared to before. We’ll never know, but it might actually be a larger number. All I know is that the system integrators I personally know, who have a long history of selling and servicing FCP-based editing suites, continue to install NEW FCP 7 rooms!

Like most drastic product changes, once you get over the shock of the new version, you quickly realize that your old version didn’t instantly stop working the day the new version launched. In the case of FCP 7, it continues to be a workhorse, although the 32-bit architecture is pretty creaky. Toss a lot of ProRes 4444 at it and you are in for a painful experience. There has been a lot of dissatisfaction with FCP X among facility owners, because it upends much of their existing workflows. There are additional apps and utilities to fill the gap, but many of these constitute workarounds compared to what could be done inside FCP 7.

Many owners have looked at alternatives. These include Adobe Premiere Pro, Avid Media Composer/Symphony, Media 100 and Autodesk Smoke 2013. If they are so irritated at Apple as to move over to Windows hardware, then the possibilities expand to include Avid DS, Grass Valley Edius and Sony Vegas. Several of these manufacturers have introduced cross-grade promotional deals to entice FCP “legacy” owners to make the switch. Avid and Adobe have benefited the most in this transition. Editors who were happy with Avid in the past – or work in a market where Avid dominates – have migrated back to Media Composer. Editors who were hoping for the hypothetical FCP 8 are often making Adobe Premiere (and the Production Premium bundle) their next NLE of choice. But ironically, many owners and users are simply doing nothing and continuing with FCP 7 or even upgrading from FCP 6 to FCP 7.

Why is it that FCP 7 isn’t already long gone or on the way out by now? Obviously the fact that change comes slowly is one answer, but I believe it’s more than that. When FCP 1.0 came on the scene, its interface and operational methodology fit into the existing NLE designs. It was like a “baby Avid” with parts of Media 100 and After Effects dropped in. If you cut on a Media Composer, the transition to FCP was pretty simple. Working with QuickTime made it easy to run on most personal machines without extra hardware. Because of its relatively open nature and reliance on industry-standard interchange formats (many of which were added over time), FCP could easily swap data with other applications using EDLs, OMFs, text-based log files and XML. Facilities built workflows around these capabilities.

FCP X, on the other hand, introduced a completely new editing paradigm that not only changed how you work, but even the accepted nomenclature of editing. Furthermore, the UI design even did things like reverse the behavior of some keystrokes from how similar functions had been triggered in FCP 7. In short, forget everything you know about editing or using other editing software if you want to become proficient with FCP X. That’s a viable concept for students who may be the professional editors of the future. Or for non-full-time editors who occasionally have to edit and finish professional-level productions as one small part of their job. Unfortunately, it’s not a good approach if you want to make FCP X the ubiquitous NLE in established professional video environments, like post houses, broadcasters and large enterprise users.

After all, if I’m a facility manager and you can’t show me a compelling reason why this is better and why it won’t require a complete internal upheaval, then why should I change? In most shops, overall workflow is far more important than the specific features of any individual application. Gone are the differences in cost, so it’s difficult to make a compelling argument based on ROI. You can no longer make the (false) argument of 1999 that FCP will only cost you 1% of the cost of an Avid. Or use the bogus $50K edit suite ad that followed a few years later.

Which brings us to the present. Avid systems were the first NLEs where I was in the driver’s seat. I’ve literally cut on dozens of edit systems, but for me, Final Cut Pro “legacy” fit my style and preferences best. I would have loved a 64-bit version with a cleaned-up user interface, but that’s not what FCP X delivers. It’s also not exactly where Premiere Pro CS6 is today. I deal with projects from the outside – either sent to me or at shops where I freelance. Apple FCP 7 and Avid Media Composer continue to be what I run into and what is requested.

Over the past few months I’ve done quite a few complex jobs on FCP X, when I’ve had the ability to control the decision. Yet, I cannot get through any complex workflow without touching parts of Final Cut Studio (“legacy”) to get the job done. FCP X seems to excel at small projects where speed trumps precision and interoperability. It’s also great for individual owner-operators who intend to do everything inside FCP X. But for complex projects with integrated workflows, FCP 7 is still decidedly better.

As was the case with early FCP, where most of the editing design was there at the start, I now feel that with the FCP X 10.0.6 update most of its editing design is also in place. It may never become the tool that marches on to dominate the market. FCP “legacy” had that chance and Apple walked away from it. It’s dubious that lightning will strike twice, but 18 months is simply too short a timeframe in which to say anything that definitive. All I know is that for now, FCP 7 continues as the preferred NLE for many, with Media Composer a close second. Most editors, like old dogs, aren’t too eager to learn new tricks. At least that’s what I conclude, based on my own ear-to-the-ground analysis. Check back this time next year to see if that’s still the case. For now, I see the industry continuing to live in a very fractured, multi-NLE environment.

©2012 Oliver Peters

Hemingway & Gellhorn

Director Philip Kaufman has a talent for telling a good story against the backdrop of history. The Right Stuff (covering the start of the United States’ race into space) and The Unbearable Lightness of Being (the 1968 Soviet invasion of Prague) made their marks, and now the latest, Hemingway & Gellhorn, continues that streak.

Originally intended as a theatrical film, but ultimately completed as a made-for-HBO feature, Hemingway & Gellhorn chronicles the short and tempestuous relationship between Ernest Hemingway (Clive Owen) and his third wife, Martha Gellhorn (Nicole Kidman). The two met in 1936 in Key West, traveled to Spain to cover the Spanish Civil War and were married in 1940. They lived in Havana and after four years of a difficult relationship were divorced in 1945. During her 60-year career as a journalist, Gellhorn was recognized as one of the best war correspondents of the last century. She covered nearly every conflict up to and including the U.S. invasion of Panama in 1989.

The film also reunited another team – Kaufman and film editor Walter Murch – who had last worked together on The Unbearable Lightness of Being. I recently spoke with Walter Murch upon his return from the screening of Hemingway & Gellhorn at the Cannes Film Festival. Murch commented on the similarities of these projects, “I’ve always been attracted to the intersection of history and drama. I hadn’t worked with Phil since the 1980s, so I enjoyed tackling another film together, but I was also really interested in the subject matter. When we started, I really didn’t know that much about Martha Gellhorn. I had heard the name, but that was about it. Like most folks, I knew the legend and myth of Hemingway, but not really many of the details of him as a person.”

This has been Murch’s first project destined for TV, rather than theaters. He continued, “Although it’s an HBO film, we never treated it as anything other than a feature film, except that our total schedule, including shooting, was about six months long, instead of ten or more months. In fact, seeing the film in Cannes with an audience of 2,500 was very rewarding. It was the first time we had actually screened in front of a theatrical audience that large. During post, we had a few ‘friends and family’ screenings, but never anything with a formal preview audience. That’s, of course, standard procedure with the film studios. I’m not sure what HBO’s plans are for Hemingway & Gellhorn beyond the HBO channels. Often some of their films make it into theatrical distribution in countries where HBO doesn’t have a cable TV presence.”

Hemingway & Gellhorn was produced entirely in the San Francisco Bay area, even though it is a period film and none of the story takes place there. All of the visual effects – which included placing the actors into scenes built from real archival footage – were done by Tippett Studio, supervised by Christopher Morley. Murch explained, “We had done something similar in The Unbearable Lightness of Being. The technology has greatly improved since then, and we were able to do things that would have been impossible in 1986. The archival film footage quality was vastly different from the ARRI ALEXA footage used for principal photography. The screenplay was conceived as alternating between grainless color and grainy monochrome scenes to juxtapose the intimate events in the lives of Hemingway and Gellhorn with their presence on the world stage at historical events. So it was always intended for effect, rather than trying to convince the audience that there was a completely continuous reality. As we got into editing, Phil started to play with color, using different tinting for the various locations. One place might be more yellow and another cool or green and so on. We were trying to be true to the reality of these people, but the film also has to be dramatic. Plus, Phil likes to have fun with the characters. There must be balance, so you have to find the right proportion for these elements.”

The task of finding the archival footage fell to Rob Bonz, who started a year before shooting. Murch explained, “An advantage you have today that we didn’t have in the ‘80s is YouTube. A lot of these clips exist on-line, so it’s easier to research what options you might have. Of course, then you have to find the highest quality version of what you’ve seen on-line. In the case of the events in Hemingway & Gellhorn, these took place all over the world, so Rob and his researchers were calling all kinds of sources, including film labs in Cuba, Spain and Russia that might still have some of these original nitrate materials.”

This was Walter Murch’s first experience working on a film recorded with an ARRI ALEXA. The production recorded 3K ARRIRAW files using the Codex recorder and it was then the editorial team’s responsibility to convert these files for various destinations, including ProRes LT (1280 x 720) for the edit, H.264 for HBO review and DPX sequences for the DI. Murch was quite happy with the ALEXA’s image. He said, “Since these were 3K frames we were able to really take advantage of the size for repositioning. I got so used to doing that with digital images, starting with Youth Without Youth, that it’s now just second nature. The ALEXA has great dynamic range and the image held up well to subtle zooms and frame divisions. Most repositionings and enlargements were on the order of 125% to 145%, but there’s one blow-up at 350% of normal.”

In addition to Bonz, the editorial team included Murch’s son Walter (first assistant editor) and David Cerf (apprentice). Walter Murch is a big proponent of using FileMaker Pro for his film editor’s code book and explained some of the changes on this film. “Dave really handled most of the FileMaker jiu-jitsu. It works well with XML, so we were able to go back and forth between FileMaker Pro and Final Cut Pro 7 using XML. This time our script supervisor, Virginia McCarthy, was using ScriptE, which also does a handshake with FileMaker, so that her notes could be instantly integrated into our database. Then we could use this information to drive an action in Final Cut Pro – for instance, the assembly of dailies reels. FileMaker would organize the information about yesterday’s shooting, and then an XML out of that data would trigger an assembly in Final Cut, inserting graphics and text as needed in between shots. In the other direction, we would create visibility-disabled slugs on a dedicated video track, tagged with scene information about the clips in the video tracks below. Outputting XML from Final Cut would create an instantaneous continuity list with time markers in FileMaker.”
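
To make the shape of that FileMaker/XML handshake a bit more concrete, here’s a minimal Python sketch of one leg of it – pulling clip names and logging notes out of an FCP 7 XML export so they could be loaded into a database. It’s illustrative only: the element names follow the general xmeml layout, the file names are placeholders, and the actual FileMaker layouts and ScriptE fields used on the film aren’t documented here.

    # Minimal sketch: pull clip metadata out of an FCP 7 (xmeml) export so it
    # could be loaded into a FileMaker-style code book. The 'scene' and 'lognote'
    # fields are typical xmeml logging fields - verify them against your own export.
    import csv
    import xml.etree.ElementTree as ET

    def clips_from_xmeml(xml_path):
        root = ET.parse(xml_path).getroot()  # root element of an FCP 7 export is <xmeml>
        rows = []
        for clip in root.iter("clip"):
            logging = clip.find("logginginfo")
            rows.append({
                "name": clip.findtext("name", default=""),
                "duration_frames": clip.findtext("duration", default=""),
                "scene": logging.findtext("scene", default="") if logging is not None else "",
                "lognote": logging.findtext("lognote", default="") if logging is not None else "",
            })
        return rows

    def write_csv(rows, csv_path):
        # FileMaker can import a CSV like this directly; an XML import works similarly.
        with open(csv_path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["name", "duration_frames", "scene", "lognote"])
            writer.writeheader()
            writer.writerows(rows)

    if __name__ == "__main__":
        write_csv(clips_from_xmeml("dailies_day_12.xml"), "dailies_day_12.csv")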

The way Walter Murch organizes his work is a good fit for Final Cut Pro 7, which he used on Hemingway & Gellhorn and continues to use on a current documentary project. In fact, at a Boston FCP user gathering, Murch showed one of the most elaborate screen grabs of an FCP timeline that you can imagine. He takes full advantage of the track structure to incorporate temporary sound effects and music cues, as well as updated final music and effects.

Another trick he mentioned to me was something he referred to as a QuickTime skin. Murch continued, “I edit with the complete movie on the timeline, not in reels, so I always have the full cut in front of me. I started using this simple QuickTime skin technique with Tetro. First, I export the timeline as a self-contained QuickTime file and then re-import the visual. This is placed on the upper-most video track, effectively hiding everything below. As such, it’s like a ‘skin’ that wraps the clips below it, so the computer doesn’t ‘see’ them when you scroll back and forth. The visual information is now all at one location on a hard drive, so the system isn’t bogged down with unrendered files and other clutter. When you make changes, then you ‘razor-blade’ through the QuickTime and pull back the skin, revealing the ‘internal organs’ (the clips that you want to revise) below – thus making the changes like a surgeon. Working this way also gives a quick visual overview of where you’ve made changes. You can instantly see where the skin has been ‘broken’ and how extensive the changes were. It’s the visual equivalent of a change list. After a couple of weeks of cutting, on average, I make a new QuickTime and start the process over.”

Walter Murch is currently working on a feature documentary about the Large Hadron Collider. Murch, in his many presentations and discussions on editing, considers the art part plumbing (knowing the workflow), part performance (instinctively feeling the rhythm and knowing, in a musical sense, when to cut) and part writing (building and then modifying the story through different combinations of picture and sound). Editing a documentary is certainly a great example of the editor as writer. His starting point is 300 hours of material following three theorists and three experimentalists over a four-year period, including the catastrophic failure of the accelerator nine days after it was turned on for the first time. Murch, who has always held a love and fascination for the sciences, is once again at that intersection of history and drama.


Originally written for Digital Video magazine (NewBay Media, LLC).

©2012 Oliver Peters

Avid Media Composer Tips for the FCP Switcher

The new (or renewed) interest in Avid Media Composer was fueled by the launch of Apple’s Final Cut Pro X and aided by Avid’s cross-grade promotional price. This move has many experienced FCP editors investigating the nuances of what seems to some like a completely foreign interface. Tutorial DVDs, like Steve Hullfish’s Avid Media Composer for Final Cut Pro Users from Class On Demand, are a great start. With experience, editors start to see more similarities than differences. Here are a few pointers of mine to help you find your comfort zone.

Multiple sequences  – FCP editors like working with multiple, tabbed sequences in the timeline window and find this lacking in Media Composer. In fact, it’s there, just not in the same way. Under the pulldown menu in the upper right corner of the record window (Canvas in FCP parlance), Media Composer editors have quick access to any previously opened sequence. Although only one is displayed in the timeline at any given time, you can quickly switch to another sequence by selecting it in this menu.

Displaying source waveforms and sequence clips – In FCP 7 or earlier, having tabbed sequences makes it easy to use a section of one sequence as a source to paste into another sequence. In addition, when reviewing audio-only clips, the waveform is displayed in the Viewer to easily mark in and out points based on the waveform. A similar feature exists in Media Composer. At the lower left corner of the timeline window is a toggle icon, which switches the window display between the source (Viewer) and record (Canvas) timelines. To use a sequence as an edit source, simply drag it from the bin into the source window. Then toggle the source/record icon to display the source’s timeline. The playhead bar or current timeline indicator will be bright green whenever you are working in the source timeline. If you have an audio-only clip, doing the same and enabling waveforms will display the source clip track with its corresponding waveform pattern.

Multiple projects – Another FCP favorite is the ability to have more than one project open at once. The Media Composer equivalent to this is Open Bin. This enables the editor to access any bin from any other project. Opening bins from other projects gives you full access to the master clips and associated metadata. In turn, that information is tracked as part of your current project going forward. Media Composer’s Find command is also active with these other bins.

Oversized images – Media Composer has traditionally been locked into standard broadcast NTSC, PAL and HD frame sizes. When it comes to dealing with higher-resolution images, Media Composer offers two solutions: the built-in Avid Pan & Zoom plug-in and Avid FX. Both of these function nicely for doing camera-style moves on bigger-than-TV-raster images. If you want to preserve the resolution of a 3500 x 3500 pixel still photo while zooming into it, Media Composer is perfectly capable of accomplishing the task. Edit a placeholder clip to the timeline and apply the Pan & Zoom filter to it. From the effects editor, access the high-res file, which will replace the placeholder content. Finally, program your keyframes for the move.

Avid FX – One of the “stealth” features of Media Composer is that it comes with Avid FX, an OEM version of Boris RED designed to integrate into Avid host systems. Apply an Avid FX filter to a clip and launch the separate interface to open a complete effects, titling and compositing environment. Not only can you apply effects to your timeline clips or import and manipulate high-res stills, but you can also import other QuickTime movies from your drives that aren’t part of the Avid project. Avid FX also includes its own set of Boris Continuum Complete and Final Effects Complete filters, even if the BCC or FEC products haven’t been separately installed into Media Composer.

Open timeline/frame rate mix and match – FCP is known for dealing with a wide range of formats, but in fact, Media Composer is superb at mixing frame rates and sizes in real-time. For example, freely mix SD and HD clips on an HD timeline and Media Composer will automatically scale up the SD clips. However, you can quickly change the format of the project to SD and Media Composer takes care of applying the opposite scale parameters, so that SD clips appear normal and HD clips are scaled down to fit into the SD raster. In addition, codecs and frame rates can also be mixed within the same sequence. Put 23.976fps clips into a 29.97fps timeline or vice versa and Avid’s FluidMotion technology takes care of cadence and frame rate adjustments for seamless playback and blending in real-time.

Smart Tool – Easy editing within the timeline by using the mouse or keystrokes has long been a hallmark of FCP. Avid’s product designers sought to counter this (and add their own twists) with the introduction of Smart Tool. When enabled, it offers contextual timeline editing. Hovering the cursor over the top or bottom of a clip – or close to the top, bottom, right, center or left side of a cut – will change the function you can perform when clicking and dragging with the mouse. It takes a while to get comfortable with Smart Tool and many veteran Avid editors aren’t big fans. Nevertheless, editors who’ve adapted to it can quickly re-arrange clips and trim edit points in much the same way as FCP editors use the Selection and Rolling Edit keys.

Perforation-slipping – Edit accuracy on either NLE is limited to a one-frame interval, but a little-known feature is the ability to trim the sync of audio to less than one video frame. Few editors know how to do that in either application. While FCP allows sample-based adjustments, Media Composer does this with a technique borrowed from its film editing heritage. Start by creating your project as a film project (even for non-film media), which enables the Slip Perf Left/Right commands. Standard 35mm film uses four sprocket holes (perforations) per frame, which equates to quarter-frame accuracy. Using the Slip Perf commands is a great way to adjust the accuracy of sound sync on subclips in double-system recordings, such as when an HDSLR camera and separate audio recorder were used.
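
To put rough numbers on that quarter-frame precision, here’s a quick back-of-the-envelope calculation (my own arithmetic, not an Avid tool) of what a single perf of slip amounts to at common frame rates, in milliseconds and in 48 kHz audio samples.

    # Back-of-the-envelope arithmetic: what one perf (1/4 frame) of slip
    # represents at common frame rates, in milliseconds and 48 kHz samples.
    SAMPLE_RATE = 48000  # Hz

    for fps in (23.976, 24.0, 25.0, 29.97):
        frame_ms = 1000.0 / fps
        perf_ms = frame_ms / 4.0
        perf_samples = SAMPLE_RATE / fps / 4.0
        print(f"{fps:>6} fps: 1 perf ~ {perf_ms:.2f} ms ~ {perf_samples:.0f} samples")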

AMA – The first instinct of many young editors is to start a project by dragging media from anywhere on the hard drives into the project and start editing. FCP facilitated this style of working, whereas Media Composer traditionally was structured with a rigid routine for importing and capturing media. Avid Media Access (AMA) is a solution to bring more of that “drag-and-drop” world into Media Composer. You may either Link to AMA File(s) or Link to AMA Volume(s), depending on whether you wish to mount entire drives or partitions, such as with P2 cards, or simply want to import individual files. Conceived as a process similar to FCP’s Log and Transfer, AMA also lets you edit directly from native files. The recommended workflow is a hybrid approach: load your clips via AMA; cull down selects; and then transcode just those clips into optimized Avid media. In either case, you have immediate access to native media, as long as there’s an AMA camera plug-in for it. This includes RED, Canon XF, P2, XDCAM and most QuickTime-based camera file formats, such as ARRI ALEXA and Canon HDSLRs.

ProRes – Ever since Apple introduced the ProRes codec family, it has been gaining share as an acquisition, edit and delivery format – thanks to the ubiquitous nature of QuickTime. It has become ingrained into many FCP editing workflows, so media compatibility is important in considering a switch. Media Composer version 6 now supports native ProRes media. Apple ProRes QuickTime files can be imported on both Mac and PC versions, but in addition, Mac Media Composers can render and export natively to ProRes, as well. When the files are ingested using the standard (non-AMA) method, the ProRes files are “fast imported”. This means the media is copied without conversion and the container is rewrapped from a .mov to .mxf format. Avid takes care of maintaining proper levels without the dreaded QuickTime gamma shift. For FCP editors moving to Media Composer, this means easy access to existing media files from past projects using one of two import methods – native ProRes import or AMA linking.

Resolve roundtrip – Both NLEs have good, built-in color correction tools, but Final Cut users who needed more grading horsepower turned to Apple Color. With the demise of Color, DaVinci Resolve has taken up the mantle as the preferred desktop grading tool. It becomes the perfect outboard grading companion when used in conjunction with Media Composer. Even the free Resolve Lite version can read and write MXF media. Start by exporting an AAF file for your edited Media Composer sequence. Make sure the Resolve Media Pool includes the location of your Avid media files. Then import the AAF into Resolve, which in turn links to the MXF media. Grade the clips as usual, render and export using the Avid roundtrip preset along with the corresponding AAF file. The folder of rendered Resolve media (in the MXF format) should be renamed with a number and moved into the Avid MediaFiles/MXF subfolder. When the corresponding AAF file is imported into your Media Composer project, the new sequence will be automatically relinked to these graded MXF files.
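
The folder renaming step is the part that’s easiest to fumble by hand, so here’s a minimal sketch of that one housekeeping move, assuming the conventional Avid MediaFiles/MXF layout with numbered subfolders. The paths are placeholders, and Media Composer should be closed while media is moved so that it rescans its media databases on relaunch.

    # Sketch: move a folder of Resolve-rendered MXF files into Avid MediaFiles/MXF
    # under the next free numbered subfolder. Paths are illustrative placeholders -
    # adjust for your system and quit Media Composer before moving media.
    import shutil
    from pathlib import Path

    resolve_renders = Path("/Volumes/Media/ResolveRenders/MyShow_Reel1")
    avid_mxf_root = Path("/Volumes/Media/Avid MediaFiles/MXF")

    used_numbers = [int(p.name) for p in avid_mxf_root.iterdir()
                    if p.is_dir() and p.name.isdigit()]
    destination = avid_mxf_root / str(max(used_numbers, default=0) + 1)

    shutil.move(str(resolve_renders), str(destination))
    print(f"Moved renders to {destination}; now import the matching AAF to relink.")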

Open I/O – FCP users have enjoyed a wide range of capture and output hardware options and that same flexibility came to Media Composer with version 6. Both Media Composer and Symphony run as software-only applications, as well as with capture devices from any of five third-party companies: AJA, Blackmagic Design, BlueFish444, Matrox and MOTU. Plus Avid’s own gear, of course. Set up your Media Composer project as you normally would and the card or external device just works. No need to directly alter any card settings. Since the hardware from most existing FCP installations can still be used, Media Composer becomes another available application for the arsenal. This has made switching a “no-brainer” for many FCP editors.

Two links for the best 200 Media Composer tutorials for beginners on the web (Mac readers should view these in Firefox, not Safari or Chrome):

Douglas Bruce’s First 100 Basic MC Tutorials

Douglas Bruce’s Next 100 Basic MC Tutorials

Originally written for DV magazine/Creative Planet/NewBay Media, LLC

©2012 Oliver Peters

Final Cut Pro X roundtrips

XML (eXtensible Markup Language) has become a common method of data interchange between post production applications. Standard XML variations are like languages – one version can be as different from another as German is from French; thus, translation software is required. Apple’s Final Cut Pro X was updated to include XML interchange, but this new version of XML (labeled FCP XML) is completely different from the XML format used in FCP 7. Stretching the language analogy, FCP 7’s XML is as different from FCP X’s XML as English is from Russian.

The underlying editing structure of Final Cut Pro 7 is based on the relationship of clips against time and tracks. FCP X links one object to another in a trackless parent-child connection, so there is no easy and direct translation of complex projects between the two versions. Some interchange between Final Cut Pro X and 7 has been achieved by CatDV, DaVinci Resolve and Assisted Editing’s Xto7 for Final Cut Pro and 7toX for Final Cut Pro. These offer migration of edited sequences when you stay within the parameters that FCP XML currently exposes to developers. I’ll concentrate on Resolve, Xto7 and 7toX – as these have the most direct application for editors.
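
If you’re scripting around these exports, the two dialects are at least easy to tell apart before choosing a translator – a minimal sketch, assuming only that legacy exports use an <xmeml> root element and FCP X exports use <fcpxml>:

    # Sketch: check whether an exported XML file is FCP 7 (xmeml) or
    # FCP X (fcpxml) flavor by inspecting its root element.
    import xml.etree.ElementTree as ET

    def xml_flavor(path):
        root_tag = ET.parse(path).getroot().tag
        if root_tag == "xmeml":
            return "FCP 7 'legacy' XML"
        if root_tag == "fcpxml":
            return "FCP X XML"
        return f"unknown ({root_tag})"

    print(xml_flavor("my_sequence.xml"))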

Blackmagic Design DaVinci Resolve

DaVinci Resolve offers an exchange in both directions between Resolve and Final Cut Pro 7 or X. (It also allows Avid roundtrips using AAF and MXF media.) This is intended as a color-correction roundtrip, so you can go from FCP 7 or FCP X to Resolve and back; but you can also go from X to Resolve to 7 and the other way around. (Note: With the FCP X 10.0.3 update, you will also need to update your version of Resolve, as the XML format was enhanced with that release.) For this article, let’s stick with Resolve’s position as a professional grading tool that can augment FCP X.

1. Start by cutting your project in FCP X. Avoid compound clips and speed ramps and remember that effects are not passed through FCP XML at this time. Highlight the project in the project browser and export an FCP XML file.

2. Launch DaVinci Resolve and make sure Media Storage includes the location of your source media files. Import the FCP XML file, which will link to these clips. Check your configuration settings to make sure the frame rate matches. I have noticed that 23.98 sequences are often identified as 24fps. Reset these to 23.98. Proceed to color grade the timeline.

3. Open the Render module and select FCP XML roundtrip from the Easy Set-up pulldown menu and assign the handle length. Individual new clips with modified file names will be rendered to an assigned folder, using Resolve’s source-mode rendering. These correspond to the timeline.

4. From the Conform tab, export an FCP X XML file.

5. Return to Final Cut Pro X and import the FCP XML file from Resolve. The graded clips will automatically be imported into a new Event and this will complete the roundtrip. The new, imported project will be video-only. As a safe step, I recommend that you copy-and-paste all of the clips from this project (the “from Resolve” timeline) into a new, fresh project.

6. Take the audio mix from the original (before Resolve) project – using either a mixdown or a compound clip – and edit it as a connected clip to the new timeline containing the graded clips. Lastly, re-apply any effects, such as transforms, crops, filters, speed ramps or stabilization.


Assisted Editing Xto7 for Final Cut Pro / 7toX for Final Cut Pro

When Final Cut Pro X was launched, the biggest shock was the fact that you couldn’t migrate sequences from previous versions into the new application. Intelligent Assistance / Assisted Editing developed two translation apps as conduits between the two formats of XML. Xto7 for Final Cut Pro translates sequences (Projects) from FCP X to FCP 7, whereas 7toX for Final Cut Pro translates complete projects, bins and/or sequences from FCP 7 to FCP X. Both are available on the Mac App Store, but check the info on the Intelligent Assistance website for limitations and restrictions in what comes across in these translations.

First, let’s look at Xto7. At first blush, one might ask, “What good is going from FCP X to FCP 7?” In reality, it’s a very useful tool, because it empowers FCP X users with a whole range of post production solutions. FCP X is a closed application that as yet offers none of the versatility of Final Cut Studio (FCP 7) or Adobe Creative Suite. With Xto7, an editor can perform the creative cut in FCP X and then use Color, Soundtrack Pro, After Effects, Premiere Pro, Audition, Pro Tools, Smoke and other applications for finishing. In fact, since Automatic Duck has made its plug-ins available for free, this path also enables an editor to move from FCP X to Avid Media Composer by way of FCP 7 and Automatic Duck Pro Export FCP.

1. Start in FCP X. Cut your project, but avoid a few known issues, like speed ramps and compound clips. (Check with Assisted Editing for more specifics.) Also, don’t apply effects, as they won’t translate. Highlight the project in the project browser and export an FCP XML file.

2. Launch Xto7 and navigate to the FCP XML file.

3. You have two choices: Send to Final Cut Pro 7 or Save Sequence XML. The first option opens the timeline as a new FCP 7 project. The second saves an XML file that can later be imported into FCP 7, but also Adobe Premiere Pro or Autodesk Smoke.

4. Once inside FCP 7, you have access to all the usual effect filters and roundtrip tools. This includes creating an EDL for grading or an OMF file for a Pro Tools mixer. Or sending to Color for a grading roundtrip or to Soundtrack Pro for a mix. Likewise, if you opened the XML into Premiere Pro, you could send the audio to Audition for a mix or to After Effects for effects, grading and compositing using Dynamic Link.

If you want to go in the other direction, from legacy Final Cut projects or sequences to Final Cut Pro X, then 7toX for Final Cut Pro is the tool to use. Again, check the website for translation limitations.

1. Open your project in FCP 7 and make sure your media all properly connects.

2. Highlight the project, bin or sequence you’d like to export. Then export an XML file.

3. Launch 7toX and select the exported XML file to open. Then choose the option to “open in FCP X”.

FCP X will launch, import the items into a new Event and relink to the media. Edited FCP 7 sequences will show up in the Event as a Compound clip and will be located in a Keyword Collection labeled FCP 7 Sequences.

None of these processes is perfect yet, but these are just some examples of how a new ecosystem is growing up around Apple Final Cut Pro X. This controversial editing tool may not be right for everyone, but solutions like DaVinci Resolve and Xto7 / 7toX for Final Cut Pro mean you aren’t stranded on an island.

Written for DV magazine (NewBay Media LLC)

©2012 Oliver Peters

RED post for My Fair Lidy

I’ve worked on various RED projects, but a recent interesting example is My Fair Lidy, an independent film produced through the Valencia College Film Production Technology program. This was a full-blown feature shot entirely with RED One cameras. In this program, professional filmmakers with real projects in hand partner with a class of eager students seeking to learn the craft of film production. I’ve edited two of these films produced through the program and assisted in various aspects of post on many others. My Fair Lidy – a quirky comedy directed by program director Ralph Clemente – was shot in 17 days this summer at various central Florida locations. Two RED Ones were used – one handled by director of photography Ricardo Galé and the second by student cinematographers. My Fair Lidy was produced by SandWoman Films and stars Christopher Backus and Leigh Shannon.

There are many ways to handle the post production of native RED media and I’ve covered a number of them in these earlier posts. There is no single “best way” to handle these files, because each production is often best-served by a custom solution. Originally, I felt the way to tackle the dailies was to convert the .r3d camera files into ProRes 4444 files using the RedLogFilm profile. This gives you a very flat look and a starting point very similar to ARRI ALEXA files shot with the Log-C profile. My intention would have been to finish and grade straight from the QuickTimes and never return to the .r3d files, unless I needed to fix some problems. Neutral images with a RedLogFilm gamma setting are very easy to grade and they let the colorist swing the image for different looks with ease. However, after my initial discussions with Ricardo, it was decided to do the final grade from the native camera raw files, so that we had the most control over the image, plus the ability to zoom in and reframe using the native 4K files as a source.

The dailies and editorial flow

My Fair Lidy was lensed with a 16 x 9 aspect ratio, with the REDs set to record 4096 x 2304 (at 23.98fps). In addition to a RED One and a healthy complement of grip, lighting and electrical gear, Valencia College owns several Final Cut Pro post systems and a Red Rocket accelerator card. With two REDs rolling most of the time, the latter was a godsend on this production. We had two workstations set up – one as the editor’s station with a large Maxx Digital storage array and the other as the assistant’s station. That system housed the Red Rocket card. My two assistants (Kyle Prince and Frank Gould) handled all data back-up and conversion of 4K RED files to 1920 x 1080 ProResHQ for editorial media. Using ProResHQ was probably overkill for cutting the film (any of the lower ProRes codecs would have been fine for editorial decisions) but this gave us the best possible image for any potential screenings, trailers, etc.

Redcine-X was our tool for .r3d media organization and conversion. All in-camera settings were left alone, except the gamma adjustment. The Red Rocket card handles the full-resolution debayering of the raw files, so conversion time is close to real time. The two stations were networked via AFP (Apple’s file-sharing protocol), which permitted the assistant to handle his tasks without slowing down the editor. In addition, the assistant would sync and merge audio from the double-system sound (multi-track audio recordings) and enter basic scene/take descriptions. Each shoot day had its own FCP project, so when done, project files and media (.r3d, ProRes and audio) were copied over to the editor’s Maxx array. Master clips from these daily FCP projects were then copied-and-pasted (and media relinked) into a single “master edit” FCP project.

For reasons of schedule and availability, I split the editing responsibilities with a second film editor, Patrick Tyler. My initial role was to bring the film to its first cut and then Patrick handled revisions with the producer and director. Once the picture was locked, I rejoined the project to cover final finishing and color grading. My Fair Lidy was on a very accelerated schedule, with sound design and music scoring running on a parallel track. In total, post took about 15 weeks from start to finish.

Finishing and grading

Since we didn’t use FCP’s Log and Transfer function nor the in-camera QuickTime reference files as edit proxies, there was no easy way to get Apple Color to automatically relink clips to the original .r3d files. You can manually redirect Color to link to RED files, but this must be done one shot at a time – not exactly desirable for the 1300 or so shots in the film.

The recommended workflow is to export an XML from FCP 7, which is then opened in Redcine-X. It will correctly reconnect to the .r3d files in place of the QuickTime movies. From there you export a new XML, which can be imported into Color. Voila! A Color timeline that matches the edit using the native camera files. Unfortunately for us, this is where reality came crashing in – literally. No matter what we did, using both XMLs and EDLs, everything that we attempted to import into Color crashed the application. We also tried ClipFinder, another free application designed for RED media. It didn’t crash Color, but a significant number of shots were incorrectly linked. I suspect some internal confusion because of the A and B camera situation.

On to Plan B. Since Redcine-X correctly links to the media and includes not only controls for the raw settings, but also a healthy toolset for primary color correction, why not use it for part of the grading process? Follow that up with a pass through Color to establish the stylistic “look”. This ended up working extremely well for us. Here are the basic steps I followed.

Step 1. We broke the film into ten reels and exported an XML file for each reel from FCP 7.

Step 2. Each reel’s XML was imported into Redcine-X as a timeline. I changed all the camera color metadata for each shot to create a neutral look and to match shots to each other. I used RedColor (slightly more saturated than RedColor2) and RedGamma2 (not quite as flat as RedLogFilm), plus adjusted the color temp, tint and ISO values to get a neutral white balance and match the A and B camera angles. The intent was to bring the image “within the goalposts” of the histogram. Occasionally I would make minor exposure and contrast adjustments, but for the most part, I didn’t touch any of the other color controls.

My objective was to end up with a timeline that looked consistent but preserved dynamic range. Essentially that’s the same thing I would do as the first step using the primary tab within Color. The nice part about this is that once I matched the settings of the shots, the A and B cameras looked very consistent.

Step 3. Each timeline was exported from Redcine-X as a single ProResHQ file with these new settings baked in. We had moved the Red Rocket card into the primary workstation, so these 1920 x 1080 clips were rendered with full resolution debayering. As with the dailies, rendering time was largely real-time or somewhat slower. In this case, approximately 10-20 minutes per reel.

Step 4. I imported each rendered clip back into FCP and placed it onto video track two, over the corresponding clips for that reel, to check the conforming accuracy and sync. Using the “next edit” keystroke, I quickly stepped through the timeline and “razored” each edit point on the clip from Redcine-X. This may sound cumbersome, but it only took a couple of minutes for each reel. Now I had an FCP sequence from a single media clip, but with each cut split as an edit point. Doing this creates “notches” that are used by the color correction software for cuts between corrections (there’s a short sketch of this bookkeeping after these steps). That’s been the basis for all “tape-to-tape” color correction since DaVinci started doing it and the new Resolve software still includes a similar automatic scene detection function today.

Step 5. I sent my newly “notched” timeline to Color and graded as I normally would. By using the Redcine-X step as a “pre-grade”, I had done the same thing to the image as I would have done using the RED tab within Color, thus keeping with the plan to grade from the native camera raw files. I do believe the approach I took was faster and better than trying to do it all inside Color, because of the inefficiency of bouncing in and out of the RED tab in Color for each clip. Not to mention that Color really bogs down when working with 4K files, even with a Red Rocket card in place.

Step 6. The exception to this process was any shot that required a blow-up or repositioning. For these, I sent the ProRes file from dailies in place of the rendered shot from Redcine-X. In Color, I would then manually reconnect to the .r3d file and resize the shot in Color’s geometry room, thus using the file’s full 4K size to preserve resolution at 1080 for the blow-up.

Step 7. The last step was to render in Color and then “Send to FCP” to complete the roundtrip. In FCP, the reels were assembled into the full movie and then married to the mixed soundtrack for a finished film.
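
Returning to the “notching” idea from step 4, the underlying bookkeeping is just a running total of clip durations. Here’s a small illustrative sketch of that arithmetic – the actual razoring on My Fair Lidy was done interactively in FCP 7, not with a script.

    # Illustrative only: the razor ("notch") points for a baked-in reel fall at the
    # running totals of the clip durations from the conformed sequence.
    def notch_points(clip_durations_frames):
        points, running = [], 0
        for duration in clip_durations_frames[:-1]:  # no cut after the last clip
            running += duration
            points.append(running)
        return points

    # Example: four shots of 48, 120, 36 and 200 frames
    print(notch_points([48, 120, 36, 200]))  # -> [48, 168, 204]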

© 2011 Oliver Peters

Improving FCP X

A short while ago I started a thread at Creative COW entitled, “What would it take?” My premise is that Final Cut Pro X has enough tantalizing advantages that many “pro users” (whatever that means) would adopt it, if only it had a few extra features. I’m not talking about turning it into FCP 8. I think that’s pretty unrealistic and I believe Apple is going in a different direction. The point is that there are a number of elements that could be added and stay within the FCP X paradigm, which would quell some of the complaints. The thread sparked some interesting suggestions, but here are a few of mine in no particular order of priority.

1. Make audio trimming and transitions as easy as and comparable to video trimming. Currently audio seems to take a back seat to video editing when it comes to trims and transitions.

2. Add “open in Motion” or “send to Motion” functions for clips. Motion 5 is quite powerful and it fills in many gaps that exist in FCP X. For example, drawing mattes. A “send to” roundtrip function would help.

3. Either add track-based mixing or add a “send to Logic” function. I feel audio without tracks is a pretty tough way to mix. Assuming the next version of Logic isn’t as drastic a change as FCP 7 to FCP X was, it would be nice to offer the option of sending your FCP X project audio to Logic for mixing.

4. Add modifiers to give you some user-defined control over the magnetic timeline. More than just the position tool. Time to tame the magnetic timeline.

5. Add user-defined controls for more track-like behavior. Such as expanded use/behavior of additional storylines. I’m not sure what form this would take, but the desire is to get the best of both worlds.

6. Add a “save as” function.

7. Add event/project management to open/hide projects and media. This exists in Assisted Editing’s Event Manager X, but it should be a direct function within FCP X.

8. Add a choice to not see the event thumbnail/filmstrip when you click on it. Even in list view, when you click on an event clip it is refreshed in the single visible filmstrip at the top. This slows down the response of the system. I’d like to see a true list-only view for faster response when I’m entering data.

9. Remember clip in/out points.

10. Add some user control over window layouts. FCP 7’s workspace customization was great and it’s a shame we lost it.

11. Add some way to see a second window as a source/record (2-up) view.

12. Bring back copy/paste/remove attributes.

13. Bring back the equivalent to the Track Tool.

14. Import legacy FCP sequences. I realize some third-party developer will likely create an XML to FCP XML translator, but it sure would make sense if Apple solved this issue. Even if it means only a simple sequence without effects, speed ramps or audio levels.

©2011 Oliver Peters