NLE as Post Production Hub


As 2009 closed, I wrote a post about Final Cut Studio as the center of a boutique post production workflow. A lot has changed since then, but that approach is still valid and a number of companies can fill those shoes. In each case, rather than being a complete, self-contained tool, the editing application becomes the hub of the operation. Other applications surround it and the workflow tends to go from NLE to support tool and back for delivery. Here are a few solutions.

Adobe Premiere Pro CC

No current editing package comes as close to the role of the old Final Cut Studio as does Adobe’s Creative Cloud. You get nearly all of the creative tools under a single subscription and facilities with a team account can equip every room with the full complement of applications. When designed correctly, workflows in any room can shift from edit to effects to sound to color correction – according to the load. In a shared storage operation, projects can stay in a single bay for everything or shift from bay to bay based on operator speciality and talent.

While there are many tools in the Creative Cloud kit, the primary editor-specific applications are Premiere Pro CC, After Effects CC and Audition CC. It goes without saying that for most, Photoshop CC and Adobe Media Encoder are also givens. On the other hand, I don’t know too many folks using Prelude CC, so I can’t say what the future for this tool will be – especially since the next version of Premiere Pro includes built-in proxy transcoding. Also, as more of SpeedGrade CC’s color correction tools make it into Premiere Pro, it’s clear that SpeedGrade itself is getting very little love. The low-cost market for outboard color correction software has largely been lost to DaVinci Resolve (free). For now, SpeedGrade is really “dead man walking”. I’d be surprised if it’s still around by mid-2017. That might also be the case for Prelude.

Many editors I know who are heavy into graphics and visual effects do most of that work in After Effects. With Creative Cloud and Dynamic Link, there’s a natural connection between the Premiere Pro timeline and After Effects. A similar tie can exist between Premiere Pro and Audition. I find the latter to be a superb audio post application and, in my experience, it provides the best transfer of a Premiere Pro timeline into any audio application. This connection is being further enhanced by the updates coming from Adobe this year.

Rounding out the package is Photoshop CC, of course. While most editors are not big Photoshop artists, it’s worth noting that this application also enables animated motion graphics. For example, if you want to create an animated lower third banner, it can be done completely inside of Photoshop without ever needing to step into After Effects. Drop the file onto a Premiere Pro timeline and it’s complete with animation and proper transparency values. Update the text in Photoshop and hit “save” – voilà, the graphic is instantly updated within Premiere Pro.

Given the breadth and quality of tools in the Creative Cloud kit, it’s possible to stay entirely within these options for all of a facility’s post needs. Of course, roundtrips to Resolve, Baselight, ProTools, etc. are still possible, but not required. Nevertheless, in this scenario I typically see everything starting and ending in Premiere Pro (with exports via AME), making the Adobe solution my first vote for the modern hub concept.

Apple Final Cut Pro X

Apple walked away from the market for an all-inclusive studio package. Instead, it opted to offer more self-contained solutions that don’t have the same interoperability as before, nor that of the comparable Adobe solutions. To build up a similar toolkit, you would need Final Cut Pro X, Motion, Compressor and Logic Pro X. An individual editor/owner would purchase these once and install them on as many machines as he or she owns. A business would have to buy each application for each separate machine. So a boutique facility would need a full set for each room, or it would have to build rooms by specialty – edit, audio, graphics, etc.

Even with this combination, there are missing links when going from one application to another. These gaps have to be plugged by the various third-party productivity solutions, such as Clip Exporter, XtoCC, 7toX, Xsend Motion, X2Pro, EDL-X and others. These provide better conduits between Apple applications than Apple itself provides. For example, only through Automatic Duck Xsend Motion can you get an FCPX project (timeline) into Motion. Marquis Broadcast’s X2Pro Audio Convert provides a better path into Logic than the native route.

If you want the sort of color correction power available in Premiere Pro’s Lumetri Color panel, you’ll need more advanced color correction plug-ins, like Hawaiki Color or Color Finale. Since Apple doesn’t produce an equivalent to Photoshop, look to Pixelmator or Affinity Photo for a viable substitute. Although powerful, you still won’t get quite the same level of interoperability as between Photoshop and Premiere Pro.

Naturally, if your desire is to use non-Apple solutions for graphics and color correction, then similar rules apply as with Premiere Pro. For instance, roundtripping to Resolve for color correction is pretty solid using the FCPXML import/export function within Resolve. Prefer to use After Effects for your motion graphics instead of Motion? Then Automatic Duck Ximport AE on the After Effects side has your back.

Most of the tools are there for those users wishing to stay in an Apple-centric world, provided you add a lot of glue to patch over the missing elements. Since many of the plug-ins for FCPX (Motion templates) are superior to a lot of what’s out there, I do think that an FCPX-centric shop will likely choose to start and end in X (possibly with a Compressor export). Even when Resolve is used for color correction, I suspect the final touches will happen inside of Final Cut. It’s more of the Lego approach to the toolkit than the Adobe solution, yet I still see it functioning in much the same way.

Blackmagic Design DaVinci Resolve

It’s hard to say what Blackmagic’s end goal is with Resolve. Clearly the world of color correction is changing. Every NLE developer is integrating quality color correction modules right inside of their editing application. So it seems only natural that Blackmagic is making Resolve into an all-in-one tool, if for no other reason than self-preservation. And by golly, they are doing a darn good job of it! Each version is better than the last. If you want a highly functional editor with world-class color correction tools for free, look no further than Resolve. Ingest, editing with transcoded and/or native media, color correction, mastering and delivery – it’s all there in Resolve.

There are two weak links – graphics and audio. On the latter front, the internal audio tools are good enough for many editors. However, Blackmagic realizes that specialty audio post is still the domain of the sound engineering world, which is made up predominantly of Avid Pro Tools shops. To make this easy, Resolve has built-in audio export functions to send the timeline to Pro Tools via AAF. There’s no roundtrip back, but you’d typically get composite mixed tracks back from the engineer to lay into the timeline.

To build on the momentum it started, Blackmagic Design acquired the assets of EyeOn’s Fusion software, which gives them a node-based compositor, suitable for visual effects and some motion graphics. This requires a different mindset than After Effects with Premiere Pro or Motion with Final Cut Pro X (when using Xsend Motion). You aren’t going to send a full sequence from Resolve to Fusion. Instead, the Connect plug-in links a single shot to Fusion, where it can be effected through a series of nodes. The Connect plug-in provides a similar “conduit” function to that of Adobe’s Dynamic Link between Premiere Pro and After Effects, except that the return is a rendered clip instead of a live project file. To take advantage of this interoperability between Resolve and Fusion, you need the paid versions.

Just as in Apple’s case, there really is no Blackmagic-owned substitute for Photoshop or an equivalent application. You’ll just have to buy what matches your need. While it’s quite possible to build a shop around Resolve and Fusion (plus maybe Pro Tools and Photoshop), it’s more likely that Resolve’s integrated approach will appeal mainly to those folks looking for free tools. I don’t see too many advanced pros doing their creative cutting on Resolve (at least not yet). That being said, it’s pretty close, so I don’t want to slight its capabilities.

Where I see it shine is as a finishing or “online” NLE. Let’s say you perform the creative or “offline” edit in Premiere Pro, FCPX or Media Composer. This could even be three editors working on separate segments of a larger show – each on a different NLE. Each editor’s sequence goes to Resolve, where the timelines are imported, combined and relinked to the high-res media. The audio has gone via a parallel path to a Pro Tools mixer and graphics come in as individual clips, shots or files. Then all is combined inside Resolve, color corrected and delivered straight from Resolve. For many shops, that scenario is starting to look like the best of all worlds.

I tend to see Resolve as less of a hub than either Premiere Pro or Final Cut Pro X. Instead, I think it may take several possible positions: a) color correction and transcoding at the front end, b) color correction in the middle – i.e. the standard roundtrip, and/or c) the new “online editor” for final assembly, color correction, mastering and delivery.

Avid Media Composer

This brings me to Avid Media Composer, the least integrated of the bunch. You can certainly build an operation based on Media Composer as the hub – as so many shops have. But there simply isn’t the silky smooth interoperability among tools like there is with Adobe or the dearly departed Final Cut Pro “classic”. However, that doesn’t mean it’s not possible. You can add advanced color correction through the Symphony option, plus Avid Pro Tools in your mixing rooms. In an Avid-centric facility, rooms will definitely be task-oriented, rather than offering the flexibility to switch functions in the same suite based on load, as you can with Creative Cloud.

The best path right now is Media Composer to Pro Tools. Unfortunately it ends there. Like Blackmagic, Avid only offers two hero applications in the post space – Media Composer/Symphony and Pro Tools. They have graphics products, but those are designed and configured for news on-air operations. This means that effects and graphics are typically handled through After Effects, Boris RED or Fusion.

Boris RED runs as an integrated tool, which augments the Media Composer timeline. However, RED uses its own user interface. That operation is relatively seamless, since any “roundtrip” happens invisibly within Media Composer. Fusion can be integrated using the Connect plug-in, just like between Fusion and Resolve. Automatic Duck’s AAF import functions have been integrated directly into After Effects by Adobe. It’s easy to send a Media Composer timeline into After Effects as a one-way trip. In fact, that’s where this all started in the first place. Finally, there’s also a direct connection with Baselight Editions for Avid, if you add that as a “plug-in” within Media Composer. As with Boris RED, clips open up in the Baselight interface, which has now been enhanced with a smoother shot-to-shot workflow inside of Media Composer.

While a lot of shops still use Media Composer as the hub, this seems like a very old-school approach. Many editors still love this NLE for its creative editing prowess, but in today’s mixed-format, mixed-codec, file-based post world, Avid has struggled to keep Media Composer competitive with the other options. There’s certainly no reason Media Composer can’t be the center – with audio in Pro Tools, color correction in Resolve, and effects in After Effects. However, most newer editors simply don’t view it the same way as they do with Adobe or even Apple. Generally, it seems the best Avid path is to “offline” edit in Media Composer and then move to other tools for everything else.

So that’s post in 2016. Four good options with pros and cons to each. Sorry to slight the Lightworks, Vegas Pro, Smoke/Flame and Edius crowds, but I just don’t encounter them too often in my neck of the woods. In any case, there are plenty of options, even starting at free, which makes the editing world pretty exciting right now.

©2016 Oliver Peters

Final Cut “Studio 2014”


A few years ago I wrote some posts about Final Cut Pro as a platform and designing an FCP-centric facility. Those options have largely been replaced by an Adobe approach built around Creative Cloud. Not everyone has warmed up to Creative Cloud. Either they don’t like the software or they dislike the software rental model or they just don’t need much of the power offered by the various Adobe applications.

If you are looking for alternatives to a Creative Cloud-based production toolkit, then it’s easy to build your own combination with some very inexpensive solutions. Most of these are either Apple software or others that are sold through the Mac App Store. As with all App Store purchases, you buy the product once and get updates for free, so long as it is still sold as the same product. Individual users may install the apps onto as many Mac computers as they personally own and control, all for the one purchase price. With this in mind, it’s very easy for most editors to create a powerful bundle that’s equal to or better than the old Final Cut Studio bundle – at less than its full retail price back in the day.

The one caveat to all of this is how entrenched you may or may not be with Adobe products. If you need to open and alter complex Illustrator, Photoshop, After Effects or Premiere Pro project files, then you will absolutely need Adobe software to do it. In that case, maybe you can get by with an old version (CS6 or earlier) or maybe trial software will work. Lastly you could outsource to a colleague with Adobe software or simply pick up a Creative Cloud subscription on a month-by-month rental. On the other hand, if you don’t absolutely need to interact with Adobe project files, then these solutions may be all you need. I’m not trying to advocate for one over the other, but rather to add some ideas to think about.

Final Cut Pro X / Motion / Compressor

The last Final Cut Studio bundle included FCP 7, Motion, Compressor, Cinema Tools, DVD Studio Pro, Soundtrack Pro and Color. The current Apple video tools of Final Cut Pro X, Motion and Compressor cover all of the video bases, including editing, compositing, encoding, transcoding and disc burning. The latter may be a bone of contention for many – since Apple has largely walked away from the optical disc world. Nevertheless, simple one-off DVDs and Blu-ray discs can still be created straight from FCP X or Compressor. Of course, FCP X has been a mixed bag for editors, with many evangelists and haters on all sides. If you square off Premiere Pro against Final Cut Pro X, then it really boils down to tracks versus trackless. Both tools get the job done. Which one do you prefer?

Motion versus After Effects is a tougher call. If you are a power user of After Effects, then Motion may seem foreign and hard to use. If the focus is primarily on motion graphics, then you can certainly get the results you want in either. There is no direct “send to” from FCP X to Motion, but on the plus side, you can create effects and graphics templates using Motion that will appear and function within FCP X. Just like with After Effects, you can also buy stock Motion templates for graphics, show opens and other types of design themes and animations.

Logic Pro X

Logic Pro X is the DAW in our package. It becomes the replacement for Soundtrack Pro and the alternative to Adobe Audition or Avid Pro Tools. It’s a powerful music creation tool, but more importantly for editors, it’s a strong single file and multitrack audio production and post production application. You can get FCP X files to it via FCPXML or AAF (converted using X2Pro). There are a ton of plug-ins and mixing features that make Logic a solid DAW. I won’t dive deeply into this, but suffice it to say, that if your main interest in using Logic is to produce a better mix, then you can learn the essentials quickly and get up and running in short order.

DaVinci Resolve

Every decent studio bundle needs a powerful color correction tool. Apple Color is gone, but Blackmagic Design’s DaVinci Resolve is a best-of-breed replacement. You can get the free Resolve Lite version through the App Store, as well as from Blackmagic’s website. It does nearly everything you need, so for most editors who do some color correction, there’s little reason to buy the paid version.

Resolve 11 (due out soon) adds improved editing. There is a solid synergy with FCP X, making it not only a good companion color corrector, but also a finishing editorial tool. OFX plug-ins are supported, which adds a choice of industry standard creative effects if you need more than FCP X or Motion offer.

Pixelmator / Aperture

This one’s tough. Of all the Adobe applications, Photoshop and Illustrator are the hardest to replace. There are no perfect alternatives. On the other hand, most editors don’t need all that power. If direct feature compatibility isn’t a need, then you’ve got some choices. One of these is Pixelmator, a very lightweight image manipulation tool. It’s a little like Photoshop in the version 4-7 stages, with a mix of Illustrator tossed in. There are vector drawing and design tools and it’s optimized for Core Image, complete with a nice set of image filters. However, it does not include some of Photoshop CC’s power user features, like smart objects, smart filters, 3D, layer groups and video manipulation. But, if you just need to doctor some images, extract or modify logos or translate various image formats, Pixelmator might be the perfect fit. For more sophistication, another choice (not in the App Store) is Corel’s Painter, as well as Adobe Photoshop Elements (also available at the App Store).

Although Final Cut Studio never included a photo application, the Creative Cloud does include Lightroom. Since the beginning, Apple’s Aperture and Adobe’s Lightroom have been leapfrogging each other with features. Aperture hasn’t changed much in a few years and is likely the next pro app to get the “X” treatment from Apple’s engineers. Photographers have the same type of “Chevy vs. Ford” arguments about Aperture and Lightroom as editors do about NLEs. Nevertheless, editors deal a lot with supplied images and Aperture is a great tool to use for organization, clean up and image manipulation.

Other

The list I’ve outlined creates a nice set of tools, but if you need to interchange with other pros using a variety of different software, then you’ll need to invest in some “glue”. There are a number of utilities designed to go to and from FCP X. Many are available through the App Store. Examples include Xto7, 7toX, EDL-X, X2Pro, Shot Notes X, Lumberjack and many others.

For a freewheeling discussion about this topic and other matters, check out my conversation with Chris Fenwick at FCPX Grille.

©2014 Oliver Peters

Inside Llewyn Davis


Fans of Joel and Ethan Coen’s eclectic brand of filmmaking should be thrilled with their latest effort, Inside Llewyn Davis. The story follows Llewyn Davis as a struggling folk singer in the Greenwich Village folk scene around 1960 – just before Bob Dylan’s early career there. Davis is played by Oscar Isaac, who most recently appeared in The Bourne Legacy. The story was inspired by the life of musician Dave Van Ronk, as chronicled in the book “The Mayor of MacDougal Street”. Although this is the Coen brothers’ most recent release, the film was actually produced in 2012 in true indie filmmaking fashion – without any firm commitment for distribution. It was picked up by CBS Films earlier this year.

The Coen brothers tackle post with a workflow that is specific to them. I had a chance to dig into that world with Katie McQuerrey, who is credited as an additional editor on Inside Llewyn Davis. McQuerrey started with the Coen brothers as they transitioned into digital post, helping to adapt their editorial style to Apple Final Cut Pro. For many of their films, she’s worn a number of hats – helping to coordinate the assistant editors, acting as a conduit to other departments and, in general, serving as another set of eyes, ears and brain while Ethan and Joel are cutting their films.

McQuerrey explained, “Ethan and Joel adapted their approach from how they used to cut on film. Ethan would pull selects from film workprint on a Moviola and then Joel would assemble scenes from these selects using a KEM. With Final Cut Pro, they each have a workstation and these are networked together. No fancy SAN management. Just Apple file sharing and a Promise storage array for media. Ethan will go through a project, review all the takes, make marks, add markers or written notes and pass it over to Joel. Ethan doesn’t actually assemble anything to a timeline. He’s only working within the bins of the broader project. All of the timeline editing of these scenes is then done by Joel.” (Although there’s been press about the Coen brothers planning to use Adobe Premiere Pro in the future, this film was still edited using Apple Final Cut Pro 7.)

Inside Llewyn Davis was filmed on 35mm over the course of a 45-day production in 2012. It wrapped on April 4th and was followed by a 20 to 24-week post schedule, ending in a final mix by the end of September. Technicolor in New York provided lab and transfer services for the production. They scanned in all of the raw 35mm negative one time to DPX files with a 2K resolution and performed a “best light” color correction pass of the DPX files for dailies. In addition, Technicolor also synced the sound from the mono mix of production mixer Peter Kurland’s location recordings. These were delivered to the editorial team as synced ProRes files.

McQuerrey said, “Ethan and Joel don’t cut during the shooting. That doesn’t start until the production wraps. Inside Llewyn Davis has a look for many of the scenes reminiscent of the era. [Director of photography] Bruno Delbonnel worked closely with [colorist] Peter Doyle to establish a suggested look during the dailies. These would be reviewed on location in a production trailer equipped with a 50” Panasonic plasma that Technicolor had calibrated. Once the film was locked, then Technicolor conformed the DPX files and Bruno, Ethan and Joel supervised the DI mastering of the film. Peter graded both the dailies and the final version using a [Filmlight] Baselight system. Naturally, the suggested look was honed and perfected in the final DI.”

Inside Llewyn Davis is about a musician and music is a major component of the film. The intent was to be as authentic as possible. There was no lip-syncing to the playback of a recorded music track. McQuerrey continued, “Peter [Kurland] recorded all of these live on set and that’s what ended up in the final mix. For editing, if we ever needed to separate tracks, then we’d go back to Peter’s broadcast wave file multi-track recordings, bring those into Final Cut and create ‘merged clips’ that were synced. Since Ethan and Joel’s offices are in a small building, the assistants had a separate cutting room at Post Factory in New York. We mirrored the media at both locations and I handled the communication between the two offices. Often this was done using Mac screen sharing between the computers.”

The Coen brothers approach their films in a very methodical fashion, so editing doesn’t present the kinds of challenges that might be the case with other directors. McQuerrey explained, “Ethan and Joel have a very good sense of script time to film time. They also understand how the script will translate on screen. They’ll storyboard the entire film, so there’s no improvisation for the editor to deal with. Most scenes are filmed with a traditional, single-camera set-up. This film was within minutes of the right length at the first assembly, so most of the editorial changes were minor trims and honing the cut. No significant scene lifts were made. Joel’s process is usually to do a rough cut and then a first cut. Skip Lievsay, our supervising sound editor, will do a temp mix in [Avid] Pro Tools. This cut with the temp mix will be internally screened for ‘friends and family’, plus the sound team and visual effects department. We then go back through the film top to bottom, creating a second cut with another temp mix.”

“At this stage, some of the visual effects shots have been completed and dropped into the cut. Then there’s more honing, more effects in place and finally another temp mix in 5.1 surround. This will be output to D5 for more formal screenings. Skip builds temp mixes that get pretty involved, so each time we send OMF files and change lists. Sound effects and ADR are addressed at each temp mix. The final mix was done in five days at Sony in Los Angeles with Skip and Greg Orloff working as the re-recording mixers.”

Even the most organized production includes some elements that are tough to cut. For Inside Llewyn Davis, this was the cross-country driving sequence that covers about one-and-a-half reels of the film. It includes another Coen favorite, John Goodman. McQuerrey described, “The driving scenes were all shot as green-screen composites. There are constantly three actors in the car, plus a cat. It’s always a challenge to cut this type of scene, because you are dealing with the continuity from take to take of all three actors in a confined space. The cat, of course, is less under anyone’s control. We ‘cheated’ that a bit using seamless split-screens to composite the shots in a way that the cat was in the right place. All of the windows had to be composited with the appropriate background scenery.”

“The most interesting part of the cut was how the first and last scenes were built. The beginning of the movie and the ending are the same event, but the audience may not realize at first that they are back at the beginning of the story. This was filmed only one time, but each scene was edited in a slightly different way, so initially you aren’t quite sure if you’ve seen this before or not. Actions in the first scene are abbreviated, but are then resolved with more exposition at the end.”

Originally written for Digital Video magazine

©2013 Oliver Peters

The NLE that wouldn’t die II


With echoes of Monty Python in the background, two years on, Final Cut Pro 7 and Final Cut Studio are still widely in use. As I noted in my post from last November, I still see facilities with firmly entrenched and mature FCP “legacy” workflows that haven’t moved to another NLE yet. Some were ready to move to Adobe until they learned subscription was the only choice going forward. Others maintain a fanboy’s faith in Apple that the next version will somehow fix all the things they dislike about Final Cut Pro X. Others simply haven’t found the alternative solutions compelling enough to shift.

I’ve been cutting all manner of projects in FCP X since the beginning and am currently using it on a feature film. I augment it in lots of ways with plug-ins and utilities, so I’m about as deep into FCP X workflows as anyone out there. Yet, there are very few projects in which I don’t touch some aspect of Final Cut Studio to help get the job done. Some fueled by need, some by personal preference. Here are some ways that Studio can still work for you as a suite of applications to fill in the gaps.

DVD creation

There are no more version updates to Apple’s (or Adobe’s) DVD creation tools. FCP X and Compressor can author simple “one-off” discs using their export/share/batch functions. However, if you need a more advanced, authored DVD with branched menus and assets, DVD Studio Pro (as well as Adobe Encore CS6) is still a very viable tool, assuming you already own Final Cut Studio. For me, the need to do this has been reduced, but not completely gone.

Batch export

Final Cut Pro X has no batch export function for source clips. This is something I find immensely helpful. For example, many editorial houses specify that their production company clients supply edit-friendly “dailies” – especially when final color correction and finishing will be done by another facility or artist/editor/colorist. This is a throwback to film workflows and is most often the case with RED and ALEXA productions. Certainly a lot of the same processes can be done with DaVinci Resolve, but it’s simply faster and easier with FCP 7.

In the case of ALEXA, a lot of editors prefer to do their offline edit with LUT-corrected, Rec 709 images, instead of the flat, Log-C ProRes 4444 files that come straight from the camera. With FCP 7, simply import the camera files, add a LUT filter like the one from Nick Shaw (Antler Post), enable TC burn-in if you like and run a batch export in the codec of your choice. When I do this, I usually end up with a set of Rec 709 color, ProRes LT files with burn-in that I can use to edit with. Since the file name, reel ID and timecode are identical to the camera masters, I can easily edit with the “dailies” and then relink to the camera masters for color correction and finishing. This works well in Adobe Premiere Pro CC, Apple FCP 7 and even FCP X.
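For what it’s worth, the same dailies pass can be approximated outside of FCP 7 with a script. The sketch below only assembles ffmpeg command lines rather than running them; the LUT file name, burn-in settings and folder layout are my own hypothetical stand-ins, not part of this workflow.

```python
import pathlib

# Hypothetical stand-ins: a folder of camera originals and a
# Log-C-to-Rec 709 LUT file (.cube) supplied by the colorist.
CAMERA_DIR = pathlib.Path("camera_masters")
LUT = "alexa_logc_to_rec709.cube"

def dailies_command(src: pathlib.Path) -> str:
    """Build (but don't run) an ffmpeg call that applies the LUT, burns in
    timecode and writes a ProRes LT 'daily' named after the camera master."""
    out = pathlib.Path("dailies") / src.name
    filters = (
        f"lut3d={LUT},"                     # flat Log-C image -> Rec 709
        "drawtext=timecode='01\\:00\\:00\\:00':rate=23.976:"
        "x=(w-tw)/2:y=h-(2*lh):fontcolor=white:box=1:boxcolor=black@0.5"
    )
    return (
        f'ffmpeg -i {src} -vf "{filters}" '
        "-c:v prores -profile:v 1 "         # profile 1 = ProRes 422 LT
        f"-timecode 01:00:00:00 {out}"
    )

cmd = dailies_command(CAMERA_DIR / "A001_C001.mov")
print(cmd)
```

The point is the pattern – LUT, burn-in and a matching file name – so the dailies relink cleanly to the camera masters later.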

Timecode and reel IDs

When I work with files from the various HDSLRs, I prefer to convert them to ProRes (or DNxHD) and add timecode and reel ID info. In my eyes, this makes the file professional video media that’s much more easily dealt with throughout the rest of the post pipeline. I have a specific routine for doing this, but when some of these steps fail, due to some file error, I find that FCP 7 is a good back-up utility. From inside FCP 7, you can easily add reel IDs and also modify or add timecode. This metadata is embedded into the actual media file and readable by other applications.

Log and Transfer

Yes, I know that you can import and optimize (transcode) camera files in FCP X. I just don’t like the way it does it. The FCP 7 Log and Transfer module allows the editor to set several naming preferences upon ingest. This includes custom names and reel IDs. That metadata is then embedded directly into the QuickTime movie created by the Log and Transfer module. FCP X doesn’t embed name and ID changes into the media file, but rather into its own database. Consequently, this information is not transportable by simply reading the media file within another application. As a result, when I work with media from a C300, for example, my first step is still Log and Transfer in FCP 7, before I start editing in FCP X.

Conform and reverse telecine

A lot of cameras offer the ability to shoot at higher frame rates with the intent of playing this at a slower frame rate for a slow motion effect – “overcranking” in film terms. Advanced cameras like the ALEXA, RED One, EPIC and Canon C300 write a timebase reference into the file that tells the NLE that a file recorded at 60fps is to be played at 23.98fps. This is not true of HDSLRs, like a Canon 5D, 7D or a GoPro. You have to tell the NLE what to do. FCP X only does this through its Retime effect, which means you are telling the file to be played as slomo, thus requiring a render.

I prefer to use Cinema Tools to “conform” the file. This alters the file header information of the QuickTime file, so that any application will play it at the conformed, rather than recorded frame rate. The process is nearly instant and when imported into FCP X, the application simply plays it at the slower speed – no rendering required. Just like with an ALEXA or RED.
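That header change is conceptually simple: QuickTime stores a timescale (units per second) in the movie and media headers, and conforming just rewrites those numbers. Purely as an illustration of what Cinema Tools is doing – assuming simple version-0 boxes, and not intended as a production tool – here’s a sketch that patches the `mvhd`/`mdhd` timescales in a MOV/MP4 byte stream:

```python
import struct

CONTAINERS = {b"moov", b"trak", b"mdia"}  # boxes whose payload is more boxes

def conform(data: bytes, factor: float) -> bytes:
    """Scale every version-0 mvhd/mdhd timescale by 'factor'.
    factor = target_fps / recorded_fps, e.g. (24000/1001)/60 for 60 -> 23.976."""
    buf = bytearray(data)

    def walk(start: int, end: int) -> None:
        pos = start
        while pos + 8 <= end:
            size, btype = struct.unpack_from(">I4s", buf, pos)
            if size < 8:        # malformed or 64-bit size; stop (sketch only)
                break
            if btype in CONTAINERS:
                walk(pos + 8, pos + size)
            elif btype in (b"mvhd", b"mdhd") and buf[pos + 8] == 0:
                # version-0 layout: ver/flags(4) + created(4) + modified(4) + timescale(4)
                off = pos + 8 + 12
                (ts,) = struct.unpack_from(">I", buf, off)
                struct.pack_into(">I", buf, off, max(1, round(ts * factor)))
            pos += size

    walk(0, len(buf))
    return bytes(buf)

# Tiny synthetic 'moov' to show the effect (real files come from a camera):
def box(btype: bytes, payload: bytes) -> bytes:
    return struct.pack(">I4s", 8 + len(payload), btype) + payload

mdhd = box(b"mdhd", b"\x00" * 12 + struct.pack(">II", 60, 600) + b"\x00" * 4)
mvhd = box(b"mvhd", b"\x00" * 12 + struct.pack(">II", 600, 6000) + b"\x00" * 80)
moov = box(b"moov", mvhd + box(b"trak", box(b"mdia", mdhd)))

patched = conform(moov, (24000 / 1001) / 60)  # conform 60 fps -> 23.976 fps
movie_ts = struct.unpack_from(">I", patched, 8 + 8 + 12)[0]                       # was 600
media_ts = struct.unpack_from(">I", patched, 8 + len(mvhd) + 8 + 8 + 8 + 12)[0]   # was 60
```

Because only a few header bytes change, the media itself is untouched – which is why the conform is nearly instant and render-free.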

Another function of Cinema Tools is reverse telecine. If a camera file was recorded with built-in “pulldown” – sometimes called 24-over-60 – additional redundant video fields are added to the file. You want to remove these if you are editing in a native 24p project. Cinema Tools will let you do this and in the process render a new, 24p-native file.
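The cadence being undone here can be sketched in a few lines. This is a simplified illustration of standard 2:3 pulldown (the labels and logic are mine, not Cinema Tools internals): four film frames are spread across ten video fields, and reverse telecine discards the redundant fields to recover the original four progressive frames.

```python
# Simplified sketch of 2:3 ("24-over-60") pulldown and its reversal.
# Four film frames (A, B, C, D) become five interlaced video frames,
# listed here as (field 1, field 2) pairs:
PULLDOWN = [("A", "A"), ("B", "B"), ("B", "C"), ("C", "D"), ("D", "D")]

def reverse_telecine(video_frames):
    # Keep the first appearance of each film frame's fields, in order,
    # dropping the duplicates that pulldown inserted.
    seen, film = set(), []
    for f1, f2 in video_frames:
        for field in (f1, f2):
            if field not in seen:
                seen.add(field)
                film.append(field)
    return film

print(reverse_telecine(PULLDOWN))  # ['A', 'B', 'C', 'D']
```

Real reverse telecine also has to detect where the cadence starts (and cope with cadence breaks at edits), which is why rendering a new file is required.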

Color correction

I really like the built-in and third-party color correction tools for Final Cut Pro X. I also like Blackmagic Design’s DaVinci Resolve, but there are times when Apple Color is still the best tool for the job. I prefer its user interface to Resolve’s, especially when working with dual displays; and if you use an AJA capture/monitoring product, Resolve is a non-starter. For me, Color is the best choice when I get a color correction project from outside, where the editor used FCP 7 to cut. I’ve also done some jobs in X and then gone to Color via Xto7 and then FCP 7. It may sound a little convoluted, but it’s pretty painless and the results speak for themselves.

Audio mixing

I do minimal mixing in X. It’s fine for simple mixes, but for me, a track-based application is the only way to go. I do have X2Pro Audio Convert, but many of the out-of-house Pro Tools mixers I work with prefer to receive OMFs rather than AAFs. This means going to FCP 7 first and then generating an OMF from within FCP 7. This has the added advantage that I can proof the timeline for errors first – something you can’t do if you are generating an AAF, since there’s no way to open and inspect it. FCP X timelines also tend to include many muted clips that are normally hidden out of your way inside X. By going to FCP 7 first, you have a chance to clean up the timeline before the mixer gets it.

Any complex projects that I mix myself are done in Adobe Audition or Soundtrack Pro. I can get to Audition via the XML route – or to Soundtrack Pro through XML and FCP 7 with its “send to” function. Either application works for me and most of my third-party plug-ins show up in each. Plus they both have a healthy set of their own built-in filters. When I’m done, I simply export the mix (and/or stems) and import the track back into FCP X to marry it to the picture.

Project trimming

Final Cut Pro X has no media management function. You can copy/move/aggregate all of the media from a single Project (timeline) into a new Event, but these files are the source clips at full length. There is no ability to create a new project with trimmed or consolidated media – that is, with source files from a timeline shortened to include only the portion that was cut into the sequence, plus user-defined “handles” (an extra few frames or seconds at the beginning and end of the clip). Trimmed, media-managed projects are often required when sending your edited sequence to an outside color correction facility. It’s also a great way to archive the “unflattened” final sequence of your production, while still leaving some wiggle room for future trimming adjustments. The sequence remains editable and you still have the ability to slip, slide or change cuts by a few frames.
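The per-clip bookkeeping a media manager performs is straightforward. Here is a hypothetical sketch of the trim-with-handles calculation (the function name and frame numbers are mine, purely for illustration): copy only the used portion plus a safety margin, clamped to the real bounds of the source file.

```python
# Hypothetical sketch of "trim with handles": for each clip in a sequence,
# compute the range of source frames to copy into the trimmed project.
def trimmed_range(used_in, used_out, handle, src_start, src_end):
    # Pad the used portion by the handle length, but never reach past
    # the actual start or end of the source media.
    return (max(src_start, used_in - handle),
            min(src_end, used_out + handle))

# A clip used from frame 1000-1100, with one-second (24-frame) handles,
# inside a source file spanning frames 0-9000:
print(trimmed_range(1000, 1100, 24, 0, 9000))  # (976, 1124)
```

Summed across a whole sequence, copying 100-frame selects plus handles instead of full-length takes is exactly how a few hundred GB of source media shrinks to a few GB.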

I ran into this problem the other day, where I needed to take a production home for further work. It was a series of commercials cut in FCP X, from which I had recut four spots as director’s cuts. The edit was locked, but I wanted to finish the mix and grade at home. No problem, I thought. Simply duplicate the project with “used media”, create the new Event and “organize” (copies media into the new Event folder). I could live with the fact that the media was full length, but there was one rub. Since I had originally edited the series of commercials using Compound Clips for selected takes, the duping process brought over all of these Compounds – even though none was actually used in the edit of the four director’s cuts. This would have resulted in copying nearly two-thirds of the total source media. I could not remove the Compounds from the copied Event without also removing them from the original, which I didn’t want to do.

The solution was to send the sequence of four spots to FCP 7 and then media manage that timeline into a trimmed project. The difference was 12GB of trimmed source clips instead of HUNDREDS of GB. At home, I then sent the audio to Soundtrack Pro for a mix and the picture back to FCP X for color correction. Connect the mix back to the primary storyline in FCP X and call it done!

I realize that some of this may sound a bit complex to some readers, but professional workflows are all about having a good toolkit and knowing how to use it. FCP X is a great tool for productions that can work within its walls, but if you still own Final Cut Studio, there are a lot more options at your disposal. Why not continue to use them?

©2013 Oliver Peters

Particle Fever

Filmmaking isn’t rocket science, but sometimes the two are kissing cousins. Such is the case of the documentary Particle Fever, where the credentials of both producer David Kaplan and director Mark Levinson include a Doctorate in particle physics. Levinson has been involved in filmmaking for 28 years, starting after his graduation from Berkeley, when he found job prospects in physics in a slump. Instead he turned to his second passion – films. Levinson worked as an ADR specialist on such films as The English Patient, The Talented Mr. Ripley, Cold Mountain, and The Rainmaker. While working on those films, he built up a friendship with noted film editor Walter Murch (The Conversation, Julia, Apocalypse Now, K-19: The Widowmaker). In addition, Levinson was writing screenplays and directing some of his own independent films (Prisoner of Time). This ultimately led him to combine his two interests and pursue Particle Fever, a documentary about the research, construction and goals of building the Large Hadron Collider.

When it came time to put the polish on his documentary, Mark Levinson tapped Walter Murch as the editor. Murch explained, “I was originally only going to be on the film for three months, because I was scheduled to work on another production after that. I started in March 2012, but the story kept changing with each breaking news item from the collider. And my other project went away, so in the end, I worked on the film for 15 months and just finished the mix a few weeks ago [June 2013].” At the start of the documentary project, the outcome of the research from the Large Hadron Collider was unknown. In fact, it wasn’t until later, during the edit, that the scientists achieved a major success with the confirmation of the discovery of the Higgs boson as an elementary particle in July 2012. This impacted not only science, but also the documentary in a major way.

Finding the story arc

Particle Fever is the first feature-length documentary that Walter Murch has edited, although archival and documentary footage has been part of a number of his films. He’d cut some films for the USIA early in his career and has advised and mixed a number of documentaries, including Crumb, about the controversial cartoonist Robert Crumb. Murch is fond of discussing the role of the editor as a participatory writer of the film in how he crafts the story through pictures and sound. Nowhere is this more true than in documentaries. According to Murch, “Particle Fever had a natural story arc by the nature of the events themselves. The machine [the Large Hadron Collider] provided the spine. It was turned on in 2008 and nine days later partly exploded, because a helium relief valve wasn’t strong enough. It was shut down for a year of repairs. When it was turned on again, it was only at half power and many of the scientists feared this was inadequate for any major discoveries. Nevertheless, even at half power, the precision was good enough to see the evidence that they needed. The film covers this journey from hope to disaster to recovery and triumph.”

Due to the cost of constructing large particle accelerators, a project like the Large Hadron Collider is a once-in-a-generation event. It is a seminal moment in science akin to the Manhattan Project or the moon launch. In this case, 10,000 scientists from 100 countries were involved in the goal of recreating the conditions just after the Big Bang and finding the Higgs boson, often nicknamed “the God particle”. Murch explained the production process, “Mark and David picked a number of scientists to follow and we told the story through their eyes without a narrator. They were equipped with small consumer cameras to self-record intermittent video blogs, which augmented the formal interviews. Initially Mark was following about a dozen scientists, but this was eventually narrowed down to the six that are featured in the film. The central creative challenge was to balance the events while getting to know the people and their roles. We also had to present enough science to understand what is at stake without overwhelming the audience. These six turned out to be the best at that and could convey their passion in a very charismatic and understandable way with a minimum of jargon.”

Murch continued, “Our initial cut was two-and-a-half hours, which was ultimately reduced to 99 minutes. We got there by cutting some people, but also some of the ‘side shoots’ or alternate research options that were explored. For example, there was a flurry of excitement related to what was thought to be discoveries of particles of ‘dark matter’ at a Minnesota facility. This covered about 20 minutes of the film, but in the final version there’s only a small trace of that material.”

Sifting to find the nuggets

As in most documentaries, the post team faced a multitude of formats and a wealth of material, including standard definition video recorded in 2007, the HDV files from the scientists’ “webcams” and Panasonic HD media from the interviews. In addition, there was a lot of PAL footage from the media libraries at CERN, the European particle accelerator. During the production, news coverage focused on the theoretical, though statistically unlikely, possibility that the Large Hadron Collider might have been capable of producing a black hole. This yielded even more source material to sift through. In total, the production team generated 300 hours of content and an additional 700 hours were available from CERN and the various news pieces produced about the collider.

Murch is known for his detailed editor’s codebook for scenes and dailies that he maintains for every film in a Filemaker Pro database. Particle Fever required a more streamlined approach. Murch came in at what initially appeared to be the end of the process after Mona Davis (Fresh, Advise & Consent) had worked on the film. Murch said, “I started the process later into the production, so I didn’t initially use my Filemaker database. Mark was both the director and my assistant editor, so for the first few months I was guided by his knowledge of the material. We maintained two mirrored workstations with Final Cut Pro 7 and Mark would ingest any new material and add his markers for clips to investigate. When these bins were copied to my station, I could use them as a guide of where to start looking for possible material.”

Mapping the sound

The post team operated out of Gigantic Studios in New York, which enabled an interactive workflow between Murch and sound designer Tom Paul (on staff at Gigantic) and with composer Robert Miller. Walter Murch’s editorial style involves building up a lot of temporary sound effects and score elements during the rough cut phase and then, piece-by-piece, replacing those with finished elements as he receives them. His FCP sequence on Particle Fever had 42 audio tracks of dialogue, temp sound effects and music elements. This sort of interaction among the editor, sound designer and composer worked well with a small post team all located in New York City. By the time the cut was locked in May, Miller had delivered about an hour of original score for the film and supplied Murch with seven stereo instrumentation stems for that score to give him the most versatility in mixing.

Murch and Paul mixed the film on Gigantic’s Pro Tools ICON system. Murch offered this post trick, “When I received the final score elements from Robert, I would load them into Final Cut and then was able to copy-and-paste volume keyframes I had added to Robert’s temp music onto the final stems, ducking under dialogue or emphasizing certain dynamics of the music. This information was then automatically transferred to the Pro Tools system as part of the OMF output. Although we’d still adjust levels in the mix, embedding these volume shifts gave us a better starting point. We didn’t have to reinvent the wheel, so to speak. In the end, the final mix took four days. Long days!”

Gigantic Post offered the advantage of an on-site screening room, which enabled the producers to have numerous in-progress screenings for both scientific and industry professionals, as well as normal interested viewers. Murch explained, “It was important to get the science right, but also to make it understandable to the layman. I have more than a passing interest in the subject, but both Mark and David have Ph.D.s in particle physics, so if I ever had a question about something, all I had to do was turn around and ask. We held about 20 screenings over the course of a year and the scientists who attended our test screenings felt that the physics was accurate. But, what they also particularly liked was that the film really conveys the passion and experience of what it’s like to work in this field.” Final Frame Post, also in New York, handled the film’s grading and digital intermediate mastering.

Graphic enhancements

To help illustrate the science, the producers tapped MK12, a design and animation studio, which had worked on such films as The Kite Runner and Quantum of Solace. Some of the ways in which they expressed ideas graphically throughout the film could loosely be described as a cross between A Beautiful Mind and Carl Sagan’s PBS Cosmos series. Murch described one example, “For instance, we see Nima (one of our theorists) walking across the campus of the Institute for Advanced Study while we hear his voice-over. As he talks, formulas start to swirl all around him. Then the grass transforms into a carpet of number-particles, which then transform into an expanding universe into which Nima disappears. Eventually, this scene resolves and Nima emerges, back on campus and walking into a building, the problematic formulas falling to the ground as he goes through the door.”

Although this was Walter Murch’s first feature documentary, his approach wasn’t fundamentally different from how he works on a dramatic film. He said, “Even on a scripted film, I try to look at the material without investing it with intention. I like to view dailies with the fresh-eyed sense of ‘Oh, where did this come from? Let’s see where this will take the story’.  That’s also from working so many years with Francis [Ford Coppola], who often shoots in a documentary style. The wedding scene in The Godfather, for instance; or the Union Square conversation in The Conversation; or any of the action scenes in Apocalypse Now all exemplify that. They are ongoing events, with their own internal momentum, which are captured by multiple cameras. I really enjoyed working on this film, because there were developments and announcements during the post which significantly affected the direction of the story and ultimately the ending. This made for a real roller coaster ride!”

Particle Fever premiered at Doc/Fest Sheffield on June 14th, and won the Audience Award (split with The Act of Killing). It is currently in negotiations for distribution.

NOTE: The film will open in New York on March 5, 2014. In October 2013, Peter W. Higgs – who theorized about the boson particle named after him – was awarded the Nobel Prize in Physics, together with François Englert. For more on Walter Murch’s thoughts about editing, click here.

And finally, an interesting look at Murch’s involvement in the Rolex Mentor Protege program, as well as an interview from 2016.

Originally written for Digital Video magazine

©2013 Oliver Peters