Voice from the Stone

As someone who’s worked on a number of independent films, I find it exciting when an ambitious feature film project with tremendous potential comes from outside the mainstream Hollywood studio environment. One such film is Voice from the Stone, which features Emilia Clarke and Marton Csokas. Clarke has been a fan favorite in her roles as Daenerys Targaryen in Game of Thrones and the younger Sarah Connor in Terminator Genisys. Csokas has appeared in numerous films and TV series, including Sons of Liberty and Into the Badlands.

In Voice from the Stone, Clarke plays a nurse in 1950s Tuscany who is helping a young boy, Jakob (played by Edward Ding), recover from the death of his mother. He hasn’t spoken since his mother, a renowned pianist, died. According to Eric Howell, the film’s director, “Voice from the Stone was a script that screamed to be read under a blanket with a flashlight. It plays as a Hitchcock fairy tale set in 1950s Tuscany with mysterious characters and a ghostly antagonist.” While not a horror film or thriller, it centers on the emotional relationship between Clarke’s character and the boy, with a supernatural undercurrent.

Voice from the Stone is Howell’s feature directorial debut. He has worked on numerous films as a director, assistant director, stuntman, stunt coordinator, and in special effects. Dean Zanuck (Road to Perdition, Get Low, The Zero Theorem) produced the film through his Zanuck Independent company. From there, the production takes an interesting turn towards the American heartland, as primary post-production was handled by Splice in Minneapolis. This is a market known for its high-end commercial work, but Splice has landed a solid position as the primary online facility for numerous film and TV series, such as History Channel’s America Unearthed and ABC-TV’s In An Instant.

Tuscany, Minneapolis, and more

Clayton Condit, who co-owns and co-manages Splice with his wife Barb, edited Voice from the Stone. We chatted about how this connection came about. He says, “I had edited two short films with Eric. One of these, Anna’s Playground, made the short list for the 2011 Oscars in the short films category. Eric met with Dean about getting involved with this film and while we were waiting for the financing to be secured, we finished another short, called Strangers. Eric sent the script to Emilia and she loved it. After that everything sort of fell into place. It’s a beautiful script that, along with Eric’s style of directing, fueled amazing performances from the entire cast.”

The actual production covered about 35 days in the Tuscany region of Italy. The exteriors were filmed at one castle and the interiors at another. This was a two-camera shoot, using ARRI Alexas recording to ARRIRAW. Anamorphic lenses were used to record in ARRI’s 3.5K 4:3 format, which was desqueezed for the final 2.39:1 “scope” 2K master. The DIT on set created editorial and viewing dailies in the ProRes LT file format, complete with synced production audio and timecode burn-in. The assistant editor back at Splice was also loading and organizing the same dailies, so that everything was available there, as well.

Condit explains the timeline of the project, “The production was filmed on location in Italy during November and December of 2014. I was there for the first half of it, cutting on my MacBook Pro on set and in my hotel room. Once I travelled back to Minneapolis, I continued to build a first cut. The director arrived back in the states by the end of January to see early rough assemblies, but it was around mid-February when I really started working on a full cut of the film with Eric. By April of 2015 we had a cut ready to present to the producers. Then it took a few more weeks working with them to refine the cut. Splice is a full-service post facility, so we kicked off visual effects in May and color starting mid-June. The composer, Michael Wandmacher, created an absolutely gorgeous score that we were able to record during the first week of July at Air Studios in London. We partnered with Skywalker Sound for audio post-production and mix, which took us through the middle of August.”

As with any film, getting to the final result takes time and experimentation. He continues, “We screened for various small groups, listened to feedback, debated, and tweaked. The film has a lot of beautiful subtleties to it. We did not want to cheapen it with cliché tricks that would diminish the relationships between characters. It really is first a love story between a mother and her child. The director and producers and I worked very closely together taking scenes out, working on pacing, putting scenes back in, and really making sure we had an effective story.”

Splice handled visual effects ranging from sky replacements to entire green screen composited sequences. Condit explains, “Our team uses a variety of tools including Nuke, Houdini, Maya, and Cinema 4D. Since this film takes place in the 1950s, there were a lot of modern elements that needed to be removed, like TV antennas and distant power lines, for example. There’s a rock quarry scene with a pool of water. When it came time to shoot there, the water was really murky, so that had to be replaced. In addition, Splice also handled a number of straight effects shots. In a couple of scenes the boy is on the edge of the roof of the castle, which was a green screen composite, of course. We also shot a day in a pool for underwater shots.”

Pioneering the cut with Final Cut Pro X

Clayton Condit is a definite convert to Apple’s Final Cut Pro X and Voice from the Stone was no exception. Condit says, “Splice originated as an Avid-based shop and then moved over to Final Cut Pro as our market shifted. We also do a lot of online finishing, so we have to be compatible with whatever the offline editor cuts in. As FCP 7 fades away we are seeing more jobs being done in [Adobe] Premiere Pro and we also are finishing with [Blackmagic Design] DaVinci Resolve. Today we are sort of an ‘all of the above’ shop; but for my offline projects I really think FCP X is the best tool. Eric also appreciated his experience with FCP X as the technology never got in the way. As storytellers, we are creatively free to try things very quickly [with Final Cut Pro X].”

“Of course, like every FCP X editor, I have my list of features that I’d like to see; but as a creative editorial tool, hands down it’s the real deal. I really love audio roles, for example. This made it very easy to manage my temp mixes and to hand over scenes to the composer so that he could control what audio he worked with. It also streamlined turnovers. My assistant, Cody Brown, used X2Pro Audio Convert to prepare AAFs for Skywalker. Sound work in your offline is so critical when trying to ‘sell’ your edit and to make sure a scene is really working. FCP X makes that pretty easy and fun. We have an extensive sound library here at Splice. Along with early music cues from Wandmacher, I was able to do fairly decent temp mixes in surround for early screenings inside Final Cut.”

On location, Condit kept his media on a small G-RAID Thunderbolt drive for portability; but back in Minneapolis, Splice has a 600TB Xsan shared storage system for collaboration among departments. Condit’s FCP X library and cache files were kept on small dual-SSD Thunderbolt drives for performance, and with mirrored media he could easily transition between working at home and at Splice.

Condit explains his FCP X workflow, “We broke the film into separate libraries for each of the five reels. Each scene was its own event. Shots were renamed by scene and take numbers using different keyword assignments to help sort and search. The film was shot with two cameras, which Cody grouped as multicam clips in FCP X. He used Sync-N-Link X to bring in the production sound metadata. This enabled me to easily identify channel names. I tend to edit in timelines rather than a traditional source and record approach. I start with ‘stringouts’ of all the footage by scene and will use various techniques to sort and track best takes. A couple of the items I’d love to see return to FCP X are tabs for open timelines and dupe detection.”

Final Cut Pro X also has other features to help truly refine the edit. Condit says, “I used FCP X’s retiming function extensively for pace and emotion of shots. With the optical flow technology, it delivers great results. For example, in the opening shot you see two hands – the boy and his mother – playing piano. The on-set piano rehearsal was recorded and used for playback for all takes. Unfortunately it was half the speed of the final cue used in the film. I had to retime that performance to match the final cue, which required putting a keyframe in for every finger push. Optical flow looks so good in FCP X that many of the final online retimes were actually done in FCP X.”

Singer Amy Lee of the band Evanescence recorded the closing title song for the film during the sound sessions at Skywalker. Condit says, “Amy completely ‘got’ the film and articulated it back in this beautiful song. She and Wandmacher collaborated to create something pretty special to close the film with. Our team is fortunate enough now to be creating a music video for the song that was shot at the same castle.”

Zanuck Independent is currently arranging a domestic distribution schedule for Voice from the Stone, so look for it in theaters later this year.

If you want more details, click here for Steve Hullfish’s excellent Art of the Cut interview with Clayton Condit.

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2016 Oliver Peters

Film Editor Techniques

Editing is a craft that each editor approaches with similarities and differences in style and technique. If you follow my editor interviews or those at Steve Hullfish’s Art of the Cut series, then you know that most of the top editors are more than willing to share how they do things. This post will go through a “baker’s dozen” set of tips and techniques that will hopefully help your next large project go just a bit more smoothly.

Transcoding media. While editing with native media straight from the camera is all the rage in the NLE world, it’s the worst way to work on long-term projects. Camera formats vary in how files are named, how heavy the playback load is on the computer, and so on. It’s best to create a common master format for all of the media in your project. If you have really large files, like 4K camera media, you might also transcode editing proxies. Cut with these and then flip to the master-quality files when it comes time to finish.
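Tools like EditReady or Compressor can handle this step, and so can ffmpeg. As a minimal sketch – assuming ffmpeg is installed, with placeholder folder names and illustrative settings, not a recommendation – a batch transcode might look like this:

```python
# Batch-transcode camera originals to a common ProRes format with ffmpeg.
# A sketch only: assumes ffmpeg is on the PATH; folder names are placeholders.
import subprocess
from pathlib import Path

SOURCE = Path("camera_originals")
DEST = Path("transcodes")
DEST.mkdir(exist_ok=True)

for clip in sorted(SOURCE.glob("*.mov")):
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-c:v", "prores_ks",   # FFmpeg's ProRes encoder
        "-profile:v", "0",     # 0 = Proxy; use 3 (HQ) or 4 (4444) for masters
        "-c:a", "copy",        # leave the production audio untouched
        str(DEST / f"{clip.stem}_proxy.mov"),
    ], check=True)
```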

Transcode audio. In addition to working with common media formats, it’s a good practice to get all of your audio into a proper format. Most NLEs can deal with a mix of audio formats, bit depths, and sample rates, but that doesn’t mean you should. It’s quite common to get VO and temp music as MP3 files with 44.1kHz sampling. Even though your NLE may work with this just fine, it can cause problems with sync and during audio post later. Before you start working with audio in your project, transcode it to .wav or .aif format with 48kHz sampling and 16-bit or 24-bit depth. Higher sampling rates and bit depths are OK if your NLE can handle them, but they should be multiples of these values.
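The same hedged ffmpeg approach covers the audio conversion (file names are placeholders; swap in pcm_s16le for 16-bit output):

```python
# Convert an MP3 (or any source) to a 48kHz / 24-bit WAV with ffmpeg.
# Assumes ffmpeg is installed; file names are placeholders.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "temp_music.mp3",
    "-ar", "48000",        # resample to 48kHz
    "-c:a", "pcm_s24le",   # 24-bit linear PCM (use pcm_s16le for 16-bit)
    "temp_music_48k.wav",
], check=True)
```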

Break up your project files by reel. Most films are broken down into 20-minute “reels”. Typically a feature will have five or six reels that make up the entire film. This is an old-school approach that goes back to the film days, yet it’s still a good way to work in the modern digital era. How this is done differs by NLE brand.

With Media Composer, the root data file is the bin. Therefore, each film reel would be a separate timeline, quite possibly placed into a separate bin. This facilitates collaboration among editors and assistants using different systems, but still accessing the same project file. Final Cut Pro X and Premiere Pro CC don’t work this way. You cannot share the exact same FCPX library or Premiere Pro project file between two editors at one time.

In Final Cut Pro X, the library file is the basic data file/container, so each reel would be in its own library with a separate master library that contains only the final edited sequence for each of the reels. Since FCPX editors can open multiple libraries, it’s possible to work across reels this way or to have different editors open and work on different libraries independent of each other.

With Premiere you can only have a single project file open at one time. When a film is broken into one reel per project, it becomes easy for editors and assistants to work collaboratively. Then a master project can be created to import the final version of each reel’s timeline to create the combined film timeline. Media Browser within Premiere Pro should be used to access sequences from within other project files and import them into a new project.

Show/hide, sifting and sorting. Each NLE has its own way of displaying or hiding clips and subclips. Learning how to use these controls will help you speed up the organization of the media. Final Cut Pro X has a sophisticated method of assigning “favorites” and “rejects” to clips and ranges within clips. You can also assign keywords. By selecting what to see and to hide, it’s easy to cull a mass of footage into the few, best options. Likewise with Media Composer and Premiere Pro, you can show and hide clips and also sort by custom column criteria. Media Composer includes a custom sift feature, which is a filtering solution within the bin. It is easy to sift a bin by specific data in certain columns. Doing so hides everything else and reveals only the matching set of media on a per-bin basis.

Stringouts. A stringout is a sequence of selected footage. Many editors use stringouts as the starting point and then whittle down the scene from there. For example, Kirk Baxter likes his assistants to create a stringout for a dialogue scene that is broken down by line and camera. For each line of dialogue, you would see every take and camera angle covering that line of dialogue from wide to tight. Then the next line of dialogue and so on. The result is a very long sequence for the scene, but he can quickly assess the performance and best angle for each portion of the scene. Then he goes through and picks his favorites by pushing the video clip up one track for quick identification. The assistant then cleans up the stringout by creating a second version containing only these selected clips. Now the real cutting can begin.

Julian Clarke has his assistants create a similar stringout for action scenes. All takes and angles are organized back-to-back matching the choreography of the action. So – every angle/take for each crash or blast or punch within the scene. From these he has a clear idea of coverage and how to proceed cutting the scene, which otherwise might have an overwhelming amount of footage at first glance.

I use stringouts a lot for interview-driven documentaries. One sequence per person with everything. The second and third stringouts are successive cutdowns from that initial all-inclusive stringout. At this stage I start combining portions of sequences based on topics for a second round of stringouts. These will get duplicated and then culled, trimmed and rearranged as I refine the story.

Pancakes and using sequences as sources. When you use stringouts, it’s common to have one sequence become the source for another sequence. There are ways to handle this depending on your NLE. Many will nest the source sequence as a single clip on the new timeline. I contend that nesting should be avoided. Media Composer only allows one sequence in the “record” window to be active at any one time (no tabbed timeline). However, you can also drag a sequence to the source window, and its tracks and clips can be viewed by toggling the timeline display between source and record. At least this way you can mark ins and outs for sections. Both Final Cut Pro “legacy” and Premiere Pro enable several sequences to be loaded into the timeline window, where they are accessible through tabs. Final Cut Pro X dropped this feature, replacing it with a timeline history button to step forward or backward through several loaded sequences. To go between these sequences in all three apps, copy-and-paste functions are typically the best way to bring clips from one sequence into another.

One innovative approach is the so-called “pancake” timeline, popularized by editor/blogger Vashi Nedomansky. Premiere Pro permits you to stack two or more timelines into separate panels. The selected sequence becomes active in the viewer at any given time. By dragging between timeline panels, it is possible to edit from one sequence to another. This is a very quick and efficient way to edit from a longer stringout of selects to a shorter one with culled choices.

Scene wall. Walter Murch has become synonymous with the scene wall, but in fact, many editors use this technique. In a scene wall, a series of index cards for each scene is placed in story order on a wall or bulletin board. This provides a quick schematic of the story at any given time during the edit. As you remove or rearrange scenes, it’s easy to see what impact that will have. Simply move the cards first and review the wall before you ever commit to doing the actual edit. In addition, with the eliminated cards (representing scenes) moved off to the side, you never lose sight of what material has been cut out of the film. This is helpful to know, in case you want to go back and revisit those.

Skinning, i.e. self-contained files. Another technique Murch likes to use is what he calls adding a skin to the topmost track. The concept is simple. When you have a lot of mixed media and temp effects, system performance can be poor until rendered. Instead of rendering, the timeline is exported as a self-contained file. In turn, that is re-imported into the project and placed onto the topmost track, hiding everything below it. Now playback is smooth, because the system only has to play this self-contained file. It’s like a “skin” covering the “viscera” of the timeline clips below it.

As changes are made to add, remove, trim or replace shots and scenes, an edit is made in this self-contained clip and the ends are trimmed back to expose the area in which changes are being made. Only the part where “edit surgery” happens isn’t covered by the “skin”, i.e. self-contained file. Next a new export is done and the process is repeated. By seeing the several tracks where successive revisions have been made to the timeline, it’s possible to track the history of the changes that have been made to the story. Effectively this functions as a type of visual change list.

Visual organization of the bin. Most NLEs feature list and frame views of a bin’s contents. FCPX also features a filmstrip view in the event (bin), as well as a full strip for the selected clip at the top of the screen when in the list view. Unfortunately, the standard approach is for these to be arranged based on sorting criteria or computer defaults, not by manual methods. Typically the view is a tiled view for nice visual organization. But, of course, the decision-making process can be messy.

Premiere Pro at least lets you manually rearrange the order of the tiles, but none of the NLEs is as freeform as Media Composer. The bin’s frame view can be a completely messy affair, which editors use to their advantage. A common practice is to move all of the selected takes up to the top row of the bin and then have everything else pulled lower in the bin display, often with some empty space in between.

Multi-camera. It is common practice, even on smaller films, to shoot with two or more cameras for every scene. Assuming these are used for two angles of the same subject, like a tight and a wide shot on the person speaking, then it’s best to group these as multi-camera clips. This gives you the best way to pick among several options. Every NLE has good multi-camera workflow routines. However, there are times when you might not want to do that, such as in this blog post of mine.

Multi-channel source audio. Generally sound on a film shoot is recorded externally with several microphones being tracked separately. A multi-channel .wav file is recorded with eight or more tracks of material. The location sound mixer will often mix a composite track of the microphones for reference onto channel one and/or two of the file. When bringing this into the edit, how you handle it will vary with each NLE.

Both Media Composer and Premiere Pro will enable you to merge audio and picture into synchronized clips and select which channels to include in the combined file. Since it’s cumbersome to drag along eight or more source channels for every edit in these track-based timelines, most editors will opt to only merge the clips using channel one (the mixed track) of the multi-channel .wav file. There will be times when you need to go to one of the isolated mics, in which case a match-frame will get you back to the source .wav, from which you can pull the clean channel containing the isolated microphone. If your project goes to a post-production mixer using Pro Tools, then the mixer normally imports and replaces all of the source audio with the multi-channel .wav files. This is common practice when the audio work done by the picture editor is only intended to be used as a temp mix.

With Final Cut Pro X, source clips always show up as combined a/v clips, with multi-channel audio hidden within this “container”. This is just as true with synchronized clips. To see all of the channels, expand the clip or select it and view the details in the inspector. This way the complexity doesn’t clog the timeline and you can still selectively turn on or off any given mic channel, as well as edit within each audio channel. No need to sync only one track or to match-frame back to the audio source for more involved audio clean-up.

Multi-channel mixing. Most films are completed as 5.1 surround mixes – left, center, right, left rear surround, right rear surround, and low-frequency effects (subwoofer). Films are mixed so that the primary dialogue is mono and largely in the center channel. Music and effects are spread to the left and right channels with a little bit also in the surrounds. Only loud, low frequencies activate the subwoofer channel. Usually this means explosions or a loud music score with a lot of bottom. In order to better approximate the final mix, many editors advocate setting up their mixing rooms for 5.1 surround or at least an LCR speaker arrangement. If you’ve done that, then you need to mix the timeline accordingly. Typically this would mean mono dialogue into the center channel, with effects and music to the left and right speakers. Each of these NLEs supports sequence presets for 5.1, which would accommodate this edit configuration, assuming that your hardware is set up accordingly.

Audio – organizing temp sound. It’s key that you organize the sounds you use in the edit in such a way that it is logical for other editors with whom you may be collaborating. It should also make sense to the post-production mixer who might do the final mix. If you are using a track-based NLE, then structure your track organization on the timeline. For example, tracks 1-8 for dialogue, tracks 9-16 for sound effects, and tracks 17-24 for music.

If you are using Final Cut Pro X, then it’s important to spend time with the roles feature. If you correctly assign roles to all of your source audio, it doesn’t matter what your timeline looks like. Once properly assigned, the selection of roles on output – including when using X2Pro to send to Pro Tools – determines where these elements show up on an exported file or inside of a Pro Tools track sheet. The most basic roles assignment would be dialogue, effects and music. With multi-channel location recordings, you could even assign a role or subrole for each channel, mic or actor. Spending a little of this time on the front end will greatly improve efficiency at the back end.

For more ideas, click on the “tips and tricks” category or start at 12 Tips for Better Film Editing and follow the bread crumbs forward.

©2016 Oliver Peters

NLE as Post Production Hub

As 2009 closed, I wrote a post about Final Cut Studio as the center of a boutique post production workflow. A lot has changed since then, but that approach is still valid and a number of companies can fill those shoes. In each case, rather than be the complete, self-contained tool, the editing application becomes the hub of the operation. Other applications surround it and the workflow tends to go from NLE to support tool and back for delivery. Here are a few solutions.

Adobe Premiere Pro CC

No current editing package comes as close to the role of the old Final Cut Studio as does Adobe’s Creative Cloud. You get nearly all of the creative tools under a single subscription, and facilities with a team account can equip every room with the full complement of applications. When designed correctly, workflows in any room can shift from edit to effects to sound to color correction – according to the load. In a shared storage operation, projects can stay in a single bay for everything or shift from bay to bay based on operator specialty and talent.

While there are many tools in the Creative Cloud kit, the primary editor-specific applications are Premiere Pro CC, After Effects CC and Audition CC. It goes without saying that for most, Photoshop CC and Adobe Media Encoder are also givens. On the other hand, I don’t know too many folks using Prelude CC, so I can’t say what the future for this tool will be – especially since the next version of Premiere Pro includes built-in proxy transcoding. Also, as more of SpeedGrade CC’s color correction tools make it into Premiere Pro, it’s clear that SpeedGrade itself is getting very little love. The low-cost market for outboard color correction software has largely been lost to DaVinci Resolve (free). For now, SpeedGrade is really “dead man walking”. I’d be surprised if it’s still around by mid-2017. That might also be the case for Prelude.

Many editors I know who are heavy into graphics and visual effects do most of that work in After Effects. With CC and Dynamic Link, there’s a natural connection between the Premiere Pro timeline and After Effects. A similar tie can exist between Premiere Pro and Audition. I find the latter to be a superb audio post application and, from my experience, it provides the best transfer of a Premiere Pro timeline into any audio application. This connection is being further enhanced by the updates coming from Adobe this year.

Rounding out the package is Photoshop CC, of course. While most editors are not big Photoshop artists, it’s worth noting that this application also enables animated motion graphics. For example, if you want to create an animated lower third banner, it can be done completely inside of Photoshop without ever needing to step into After Effects. Drop the file onto a Premiere Pro timeline and it’s complete with animation and proper transparency values. Update the text in Photoshop and hit “save” – voila, the graphic is instantly updated within Premiere Pro.

Given the breadth and quality of tools in the Creative Cloud kit, it’s possible to stay entirely within these options for all of a facility’s post needs. Of course, roundtrips to Resolve, Baselight, Pro Tools, etc. are still possible, but not required. Nevertheless, in this scenario I typically see everything starting and ending in Premiere Pro (with exports via AME), making the Adobe solution my first vote for the modern hub concept.

Apple Final Cut Pro X

Apple walked away from the market for an all-inclusive studio package. Instead, it opted to offer more self-contained solutions that don’t have the same interoperability as before, nor that of the comparable Adobe solutions. To build up a similar toolkit, you would need Final Cut Pro X, Motion, Compressor and Logic Pro X. An individual editor/owner would purchase these once and install them on as many machines as he or she owned. A business would have to buy each application for each separate machine. So a boutique facility would need a full set for each room, or it would have to build rooms by specialty – edit, audio, graphics, etc.

Even with this combination, there are missing links when going from one application to another. These gaps have to be plugged by the various third-party productivity solutions, such as Clip Exporter, XtoCC, 7toX, Xsend Motion, X2Pro, EDL-X and others. These provide better conduits between Apple applications than Apple itself provides. For example, only through Automatic Duck Xsend Motion can you get an FCPX project (timeline) into Motion. Marquis Broadcast’s X2Pro Audio Convert provides a better path into Logic than the native route.

If you want the sort of color correction power available in Premiere Pro’s Lumetri Color panel, you’ll need more advanced color correction plug-ins, like Hawaiki Color or Color Finale. Since Apple doesn’t produce an equivalent to Photoshop, look to Pixelmator or Affinity Photo for a viable substitute. Although powerful, you still won’t get quite the same level of interoperability as between Photoshop and Premiere Pro.

Naturally, if your desire is to use non-Apple solutions for graphics and color correction, then similar rules apply as with Premiere Pro. For instance, roundtripping to Resolve for color correction is pretty solid using the FCPXML import/export function within Resolve. Prefer to use After Effects for your motion graphics instead of Motion? Then Automatic Duck Ximport AE on the After Effects side has your back.

Most of the tools are there for those users wishing to stay in an Apple-centric world, provided you add a lot of glue to patch over the missing elements. Since many of the plug-ins for FCPX (Motion templates) are superior to a lot of what’s out there, I do think that an FCPX-centric shop will likely choose to start and end in X (possibly with a Compressor export). Even when Resolve is used for color correction, I suspect the final touches will happen inside of Final Cut. It’s more of the Lego approach to the toolkit than the Adobe solution, yet I still see it functioning in much the same way.

Blackmagic Design DaVinci Resolve

It’s hard to say what Blackmagic’s end goal is with Resolve. Clearly the world of color correction is changing. Every NLE developer is integrating quality color correction modules right inside of their editing application. So it seems only natural that Blackmagic is making Resolve into an all-in-one tool for no other reason than self-preservation. And by golly, they are doing a darn good job of it! Each version is better than the last. If you want a highly functional editor with world-class color correction tools for free, look no further than Resolve. Ingest, transcoded and/or native media editing, color correction, mastering and delivery – all there in Resolve.

There are two weak links – graphics and audio. On the latter front, the internal audio tools are good enough for many editors. However, Blackmagic realizes that specialty audio post is still the domain of the sound engineering world, which is made up predominantly of Avid Pro Tools shops. To make this easy, Resolve has built-in audio export functions to send the timeline to Pro Tools via AAF. There’s no roundtrip back, but you’d typically get composite mixed tracks back from the engineer to lay into the timeline.

To build on the momentum it started, Blackmagic Design acquired the assets of EyeOn’s Fusion software, which gives them a node-based compositor, suitable for visual effects and some motion graphics. This requires a different mindset than After Effects with Premiere Pro or Motion with Final Cut Pro X (when using Xsend Motion). You aren’t going to send a full sequence from Resolve to Fusion. Instead, the Connect plug-in links a single shot to Fusion, where it can be processed through a series of nodes. The Connect plug-in provides a similar “conduit” function to that of Adobe’s Dynamic Link between Premiere Pro and After Effects, except that the return is a rendered clip instead of a live project file. To take advantage of this interoperability between Resolve and Fusion, you need the paid versions.

Just as in Apple’s case, there really is no Blackmagic-owned substitute for Photoshop or an equivalent application. You’ll just have to buy what matches your need. While it’s quite possible to build a shop around Resolve and Fusion (plus maybe Pro Tools and Photoshop), it’s more likely that Resolve’s integrated approach will appeal mainly to those folks looking for free tools. I don’t see too many advanced pros doing their creative cutting on Resolve (at least not yet). However, that being said, it’s pretty close, so I don’t want to slight the capabilities.

Where I see it shine is as a finishing or “online” NLE. Let’s say you perform the creative or “offline” edit in Premiere Pro, FCPX or Media Composer. This could even be three editors working on separate segments of a larger show – each on a different NLE. Each editor’s sequence goes to Resolve, where the timelines are imported, combined and relinked to the high-res media. The audio has gone via a parallel path to a Pro Tools mixer, and graphics come in as individual clips, shots or files. Then all is combined inside Resolve, color corrected and delivered straight from Resolve. For many shops, that scenario is starting to look like the best of all worlds.

I tend to see Resolve as less of a hub than either Premiere Pro or Final Cut Pro X. Instead, I think it may take several possible positions: a) color correction and transcoding at the front end, b) color correction in the middle – i.e. the standard roundtrip, and/or c) the new “online editor” for final assembly, color correction, mastering and delivery.

Avid Media Composer

This brings me to Avid Media Composer, the least integrated of the bunch. You can certainly build an operation based on Media Composer as the hub – as so many shops have. But there simply isn’t the silky smooth interoperability among tools like there is with Adobe or the dearly departed Final Cut Pro “classic”. However, that doesn’t mean it’s not possible. You can add advanced color correction through the Symphony option, plus Avid Pro Tools in your mixing rooms. In an Avid-centric facility, rooms will definitely be task-oriented, rather than provide the ease of switching functions in the same suite based on load, as you can with Creative Cloud.

The best path right now is Media Composer to Pro Tools. Unfortunately it ends there. Like Blackmagic, Avid only offers two hero applications in the post space – Media Composer/Symphony and Pro Tools. They have graphics products, but those are designed and configured for news on-air operations. This means that effects and graphics are typically handled through After Effects, Boris RED or Fusion.

Boris RED runs as an integrated tool, which augments the Media Composer timeline. However, RED uses its own user interface. That operation is relatively seamless, since any “roundtrip” happens invisibly within Media Composer. Fusion can be integrated using the Connect plug-in, just like between Fusion and Resolve. Automatic Duck’s AAF import functions have been integrated directly into After Effects by Adobe. It’s easy to send a Media Composer timeline into After Effects as a one-way trip. In fact, that’s where this all started in the first place. Finally, there’s also a direct connection with Baselight Editions for Avid, if you add that as a “plug-in” within Media Composer. As with Boris RED, clips open up in the Baselight interface, which has now been enhanced with a smoother shot-to-shot workflow inside of Media Composer.

While a lot of shops still use Media Composer as the hub, this seems like a very old-school approach. Many editors still love this NLE for its creative editing prowess, but in today’s mixed-format, mixed-codec, file-based post world, Avid has struggled to keep Media Composer competitive with the other options. There’s certainly no reason Media Composer can’t be the center – with audio in Pro Tools, color correction in Resolve, and effects in After Effects. However, most newer editors simply don’t view it the same way as they do with Adobe or even Apple. Generally, it seems the best Avid path is to “offline” edit in Media Composer and then move to other tools for everything else.

So that’s post in 2016. Four good options with pros and cons to each. Sorry to slight the Lightworks, Vegas Pro, Smoke/Flame and Edius crowds, but I just don’t encounter them too often in my neck of the woods. In any case, there are plenty of options, even starting at free, which makes the editing world pretty exciting right now.

©2016 Oliver Peters

Adobe Premiere Pro CC Learning Tips

Adobe Premiere Pro CC is the heir apparent editing application for many editors. In order to make your transition easier, I’ve compiled a series of links to various official and unofficial resources, including Adobe sites, forums, YouTube videos, training resources, and various blog posts. This list is by no means all that’s out there, but it should provide a great starting point to become more comfortable with Premiere Pro CC.

Adobe and Adobe-related training resources

Adobe tutorials

Adobe Premiere Pro tips

Maxim Jago’s tips at Lynda

Maxim Jago’s tips at Peachpit

Lynda’s Adobe Premiere Pro training

Ripple Training – Premiere Pro CC 2015

Adobe-related blogs and YouTube channels

Dave Helmly’s DAV Tech Table

Jason Levine’s YouTube channel

Colin Smith’s YouTube channel

Dave Helmly’s presentation at Orlando Post Pros meeting – 2015 (webcast)

Forums

Adobe Filmmaker stories

Adobe Premiere Pro Forum

Creative COW Premiere Pro forum

Premiere-centric sites and YouTube channels

Jarle Leirpoll’s Premiere Pro blog

Retooled

Premiere Bro

Best Premiere Pro Quick Tips YouTube Channel

Premiere Pro Tips YouTube channel

Blog posts

VashiVisuals – Deadpool Premiere Pro Presets

VashiVisuals – Keyboard Layouts

VashiVisuals – Music Video Editing Tips

VashiVisuals – Pancake Timeline

Jonny Elwyn – Premiere posts

Jonny Elwyn – Premiere Pro Tools and Tutorials

Jonny Elwyn – Tips and Tricks

Jonny Elwyn – Tips for Better Editing in Premiere Pro

Jonny Elwyn – Tutorials for Better Editing in Premiere Pro

Premium Beat – Beginner’s Guide to Premiere Pro Shortcuts

Premium Beat – Match Frame and Replace Edit

Premium Beat – How to Clean up Audio in Premiere Pro

Premium Beat – Creating a Storyboard Edit in Premiere Pro

Premium Beat – AVCHD Editing Workflow

Premium Beat – How to Organize a Feature Film Edit

Premium Beat – 3 Quick Tips for Editing in Premiere Pro

Premium Beat – Time-Saving Premiere Pro CC Tips

Derek Lieu – Simple Tricks for Faster Editing in Premiere Pro

No Film School – Premiere Pro Keyboard Shortcuts

Wipster – 4 Reasons Why Premiere Pro is a Great Choice

Wipster – Master the Media Browser

Plug-ins, add-ons, other

Kinetic type and layer effects by TypeMonkey for After Effects

Post Notes Premiere Pro control panel

PDFviewer Premiere Pro control panel

Frame.io integration

Wipster.io integration

Axle Video integration

LookLabs SpeedLooks

FxFactory plug-ins

RedGiant Software plug-ins

Boris FX plug-ins

DigitalFilms – SpeedGrade Looks

Jarle’s Premiere Pro Presets

Note: this information is also included on the Editing Resources page accessible from the header of this blog. Future updates will be made there.

©2016 Oliver Peters

Easy 4K Workflow

In the last post I questioned the visual value of 4K. However, it’s inevitable that more and more distributors will be asking for 4K deliverables, so you might as well start planning how you are going to achieve that. There are certainly plenty of demos showing how easy it is to edit 4K content, and they use iPhone video for the demo material. The reality is that such footage is crap and should only be used when it’s the only camera available. At the low end, there are plenty of cameras to choose from that work with highly-compressed 4K images and yet yield great results. The Blackmagic Design URSA Mini, Sony FS7 and Canon C300 Mark II come to mind. Bump up to something in a more cinema-style package and you are looking at a Sony F55, RED, ARRI or even the AJA CION.

While many cameras record to various proprietary compressed codecs, having a common media codec is the most ideal. Typically this means Apple ProRes or Avid DNxHD/HR. Some cameras and standalone monitor/recorders can natively generate media in these formats. In other circumstances, it requires an interim transcode before editing. This is where system throughput becomes a big issue. For example, if you want to work with native 4K material as ProRes 4444, you are going to need fast drives. On my home Mac Pro tower, I have two internal 7200RPM spinning drives for media striped as RAID-0. In addition to these and the boot drive, I also have another internal SSD media drive. When I checked their relative performance with the AJA System Test utility, these clocked at 161 write / 168 read for the RAID-0 stripe and 257 / 266 for the single SSD (MB/s). That’s good enough for approximately 27fps and 43fps respectively, if the media were large 3840 x 2160 (2160p) ProRes 4444 files. In other words, both drive units are adequate for a single stream of 2160p/23.98 as ProRes 4444, but would have a tougher time with two streams or more.
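The estimate is simple arithmetic: sustained drive throughput divided by per-frame data size. A quick sketch, assuming roughly 6MB per 2160p ProRes 4444 frame (the figure implied by the numbers above – actual ProRes data rates vary with image content):

```python
# Rough frames-per-second estimate from sustained drive throughput.
# ~6 MB per 3840x2160 ProRes 4444 frame is an approximation implied by
# the figures above; real ProRes data rates vary with image complexity.
MB_PER_FRAME = 6.0

for name, mb_per_sec in [("RAID-0 stripe", 161), ("single SSD", 257)]:
    fps = mb_per_sec / MB_PER_FRAME
    streams = fps / 23.98
    print(f"{name}: ~{fps:.0f} fps, ~{streams:.1f} streams of 2160p/23.98")
```

Run it and you get roughly 27fps (1.1 streams) for the stripe and 43fps (1.8 streams) for the SSD – the same conclusion as above.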

Unfortunately the story doesn’t end with drive performance alone, because some NLEs handle real-time playback of 4K media better than do others. I’ve performed a number of tests with 4K files in Apple Final Cut Pro X, Adobe Premiere Pro CC, Avid Media Composer and Blackmagic Design DaVinci Resolve. This has been on a number of different units, including a couple of Mac Pro towers, as well as a newer “trash can” Mac Pro. Plus, I’ve run tests with local drives, attached media RAIDs, and network-attached storage systems. What I’ve found is that as long as you have fast drive performance, then the bottleneck is the NLE.

Pretty much all of these choices can handle a single stream of 4K media without too much of an issue. However, when you stack up a second layer or track for a simple 2D PIP composite, generally the system struggles. In some cases, FCPX has performed better than the others, but not consistently.  The others all choked to varying degrees. When you limit it to a single stream of 4K video with associated audio, then FCPX performs more fluidly at a higher quality level than Media Composer or Premiere Pro, although Media Composer also performed well in some of the tests. My conclusion, for now, is that if you want to work with native 4K media in a client-involved session, and with the least amount of rendering, then FCPX is the clear winner – at least on the Mac platform. For many editors it will be the most viable choice.

Native workflow

The first big plus for Final Cut Pro X is how easily it works with native media that it’s compatible with. That’s one thing I don’t generally advocate on a large project like a show or feature film – opting instead to create “optimized” media first, either externally or within FCPX. Nevertheless, a lot of native codecs can be quite easy on the system. For example, one client cut an indie feature using all native camera files from his Sony FS7. His Final Cut system was a tricked-out iMac that was a couple of years old, plus a Promise Pegasus RAID array. Initially he cut the film from native 4K FS7 files to an FCPX 1080p timeline. I was doing the grading in Resolve, so I had him export a single, flattened movie file from the timeline as 1080p ProRes 4444. I brought this into Resolve, “bladed” the cuts to create edit points and applied my color correction. I exported a single ProRes 4444 master file, which he could import back into FCPX and marry with the post-production mix.

Fast forward a year and the film distributor was inquiring whether they could easily produce a 4K master instead of a 1080 master. This turned out to be relatively simple. All my client had to do was change his FCPX project (timeline) settings to 4K, double-check the scaling for his clips and export a new 4K ProRes 4444 file of the timeline. In Resolve, I also changed the timeline setting to 4K and then relinked to the new 4K file. Voila! – all the cuts lined up and the previous grades all looked fine. Then I simply exported the graded 4K file to send back to the client.

In this example, even with a roundtrip to Resolve and a change from 1080p to 2160p, FCPX performed perfectly without much fuss. However, for many, you wouldn’t even need to go this far. Depending on how much you like to play and tweak during the color grade, there are plenty of ways to do this and stay totally inside FCPX. You could use tools like the Color Board, Hawaiki Color, Color Finale, or even some home-brew Motion effects, and achieve excellent results without ever leaving Final Cut Pro X.

As a reminder, Media Composer, Premiere Pro CC and Resolve are all capable of working with native media, including 4K.

Proxy workflow

In addition to native 4K post, Apple engineers built an ingenious internal proxy workflow into Final Cut. Transcode the camera files in the background, flip a toggle, and work with the proxy files until you are ready to export a master. When you opt to transcode proxies, FCPX generates half-resolution, ProRes Proxy media corresponding to your original files. As an example, if your media consists of 2160p XAVC camera files, FCPX creates corresponding 1080p ProRes Proxy files. Even though the proxy media’s frame is 1/4th the size of the 4K original, FCPX takes care of handling the scaling math in the timeline between original and proxy media. The viewer display will also appear very close in quality, regardless of whether you have switched to original/optimized or proxy media. The majority of legacy A/V output cards, like a Blackmagic Design Decklink, are only capable of displaying SD and HD content to an external monitor. FCPX can send the card the proper data, so that a 4K timeline is displayed as a scaled 1080 output on your external video monitor.

Although proxies are small for a 4K project, these are still rather large files to be moving around among multiple editors. It’s not an official part of the Final Cut operation, but you can replace these generated proxies with your own versions, with some caveats. Let’s say you have 3840 x 2160, log-gamma-encoded, 4K camera files. You would first need to have FCPX generate proxies. However, using an external application such as EditReady or Compressor, you could transcode these camera files into small 960×540 ProRes Proxy media, complete with a LUT applied and timecode/clip name burnt in. Then find your Proxy Media folder, trash the FCPX-generated files and replace them with your own files. FCPX should properly relink to these and understand the correct relationship between the original and the proxy files. (This post explains the process in more detail.) The caveats: clip name, frame rate, clip length, aspect ratio, and audio channel configuration must match. Otherwise you are good to go.
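If you would rather roll these replacement proxies yourself, an ffmpeg-based sketch might look like the one below. Everything in it is illustrative – the .cube LUT, font path, clip name, and starting timecode are stand-ins – and the matching caveats above still apply:

```python
# Generate a 960x540 ProRes Proxy file with a LUT applied and clip name /
# timecode burned in, for manual replacement of an FCPX-generated proxy.
# Hypothetical paths throughout: the .cube LUT, font, clip name, and
# starting timecode are all stand-ins for your own material.
import subprocess

clip = "A001_C004.mov"
vf = (
    "lut3d=file=log_to_rec709.cube,"                 # apply the viewing LUT
    "scale=960:540,"                                 # shrink to proxy size
    "drawtext=fontfile=/Library/Fonts/Arial.ttf:"    # clip name, top left
    "text='A001_C004':fontcolor=white:fontsize=24:x=20:y=20,"
    "drawtext=fontfile=/Library/Fonts/Arial.ttf:"    # timecode, bottom left
    "timecode='01\\:00\\:00\\:00':rate=24000/1001:"
    "fontcolor=white:fontsize=24:x=20:y=h-40"
)
subprocess.run([
    "ffmpeg", "-i", clip,
    "-vf", vf,
    "-c:v", "prores_ks", "-profile:v", "0",          # 0 = ProRes Proxy
    "-c:a", "copy",
    "A001_C004_proxy.mov",
], check=True)
```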

The benefit to this solution is that you can freely edit with the proxies on a lightweight system, such as a MacBook Pro with a portable drive. When ready, move back to a beefier unit and storage, flip to original/optimized media, double-check all effects and color-correction on a good monitor, and then export the master files. It’s worth noting that this workflow is also potentially possible with Premiere Pro CC, because the new version to be introduced later this year will include a proxy editing workflow.

Naturally there is no single solution, but Final Cut Pro X makes this process far easier than any other tool that I use. If 4K is increasingly looming on the horizon for you, then FCPX is certainly worth a test run.

©2016 Oliver Peters

Deadpool

Adobe has been on a roll getting filmmakers to adopt its Premiere Pro CC editing software for feature film post. Hot on the heels of its success at Sundance, where a significant number of the indie films were edited using Premiere Pro, February saw the release of two major Hollywood films cut with the software – the Coen Brothers’ Hail, Caesar! and Tim Miller’s Deadpool.

Deadpool is one of Marvel Comics’ more unconventional superheroes. Deadpool, the film, is the origin story of how Wade Wilson (Ryan Reynolds) becomes Deadpool. He’s a mercenary soldier who gains accelerated healing powers through a rogue experiment. Left disfigured, but with new powers, he sets off to rescue his girlfriend (Morena Baccarin) and find the person responsible. Throughout all of this, the film is peppered with Deadpool’s wise-cracking, as he breaks the fourth wall to address the audience.

This is the first feature film for director Tim Miller, but he’s certainly not new to the process. Miller and his company Blur Studios are known for their visual effects work on commercials, shorts, and features, including Scott Pilgrim vs. the World and Thor: The Dark World. Setting out to bring as much of the post in-house as possible, Miller consulted with his friend, director David Fincher, who recommended the Adobe Creative Cloud solution, based on Fincher’s experience during Gone Girl. Several editing bays were established within Blur’s facility – using new, tricked-out Mac Pros connected to an Open Drives Velocity SSD 180TB shared storage solution.

Plugging new software into a large VFX film pipeline

Julian Clarke (Chappie, Elysium, District 9) came on board to edit the film. He explains, “I talked with Tim and was interested in the whole pioneering aspect of it. The set-up costs to make these permanent edit suites for his studio are attractive. I learned editing using [Apple] Final Cut Pro at version one and then I switched to Avid about four years later and have cut with it since. If you can learn [Avid] Media Composer, then [Adobe] Premiere Pro is fine. I was up to about 80% of my normal speed after just two days.”

To ease any growing pains of using a new editing tool on such a complex film, Miller and Adobe also brought in feature film editor Vashi Nedomansky (That Which I Love Destroys Me, Sharknado 2: The Second One, An American Carol) as a workflow consultant. Nedomansky’s job was to help establish a workflow pipeline and to get the editorial team up to speed with Premiere Pro. He had performed a similar role on Gone Girl. He says, “I’ve cut nine features and the last four have been using Premiere Pro. Adobe has called on me for that editor-to-editor interface and to help Blur set up five edit bays. I translated what we figured out with Gone Girl, but adapted it to Blur’s needs, as well as taking into consideration the updates made to the software since then. During the first few weeks of shooting, I worked with Julian and the assistant editors to customize their window layouts and keyboard shortcuts, since prior to this, the whole crew had primarily been using Avid.”

Deadpool was shot mostly with ARRI ALEXA cameras recording open gate 2.8K ARRIRAW. Additional footage also came from Phantom and RED cameras. Most scenes were recorded with two cameras. The original camera files were transcoded to 2K ProRes dailies in Vancouver. Back at Blur, first assistant editor Matt Carson would sync audio and group the clips into Premiere Pro multicam sequences.

Staying up with production

As with most features, Clarke was cutting while the production was going on. However, unlike many films, he was ready to show Miller edited scenes to review within 24 hours after the shoot had wrapped for the day. Not only a cut scene, but one already fleshed out with temporary sound effects and music. This is quite a feat, considering that Miller shot more than 500 hours of footage. Seeing a quick turnaround of edited scenes was very beneficial for Miller as a first-time feature director. Clarke adds, “My normal approach is to start cutting and see what works as a first draft. The assistant will add sound effects and temp music and if we hit a stumbling block, we move on to another scene. Blur had also created a lot of pre-vis shots for the effects scenes prior to the start of principal photography. I was able to cut these in as temp VFX. This way the scenes could play through without a lot of holes.”

To make their collaborative workflow function, Nedomansky, Clarke, and the assistants worked out a structure for organizing files and Premiere Pro projects. Deadpool was broken into six reels, based on the approximate page count in the script where a reel break should occur. Every editor had their own folder on the Open Drives SAN containing only the most recent version of whatever project that they were working on. If Julian Clarke was done working on Reel 1, then that project file could be closed and moved from Clarke’s folder into the folder of one of the assistants. They would then open the project to add temporary sound effects or create some temporary visual effects. Meanwhile, Clarke would continue on Reel 2, which was located in his folder. By keeping only the active project file in the various folders and moving projects among editors’ folders, it would mimic the bin-locking method used in shared Avid workflows.
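Conceptually, the folder scheme works like a checkout system: whichever folder holds the one live copy of a reel’s project file holds the “lock”. Here is a hypothetical sketch of that handoff step – the storage path and folder names are invented for illustration, not taken from the actual production:

```python
# Move a reel's Premiere Pro project file from one editor's folder to
# another's, mimicking Avid-style bin locking with plain file moves.
# Purely illustrative; the SAN path and folder names are invented.
import shutil
from pathlib import Path

SAN = Path("/Volumes/OpenDrives/Deadpool")

def hand_off(reel: str, from_editor: str, to_editor: str) -> Path:
    src = SAN / from_editor / f"{reel}.prproj"
    if not src.exists():
        raise FileNotFoundError(f"{src} is not checked out to {from_editor}")
    # Only one live copy of the project ever exists, so whoever holds
    # the file in their folder effectively holds the "lock" on the reel.
    return Path(shutil.move(str(src), str(SAN / to_editor / src.name)))

# e.g. hand_off("Reel_1", "editor", "assistant")
```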

In addition, Premiere Pro’s Media Browser module would also enable the editors to access and import sequences found within other project files. This is a non-destructive process. Older versions of the project files would be stored in a separate folder on the SAN in order to keep the active folders and projects uncluttered. Premiere Pro’s ability to work with folders as they were created in the Finder, let the editors do more of the organization at the Finder level than they normally would, had they been cutting with Avid systems.

Cutting an action film

Regardless of the software you use, each film presents a unique set of creative challenges. Clarke explains, “One scene that took a while was a long dialogue scene with Deadpool and Colossus on the highway. It’s quintessential Deadpool with a lot of banter and improv from Ryan. There’s not much story going on in the background at that time. We didn’t want to cut too much out, but at the same time we didn’t want to have the audience get lost in what’s supposed to be the bigger story. It took some time to strike the right balance. Overall the film was just about right. The director’s cut was about two hours, which was cut into the final length of one hour and 45 minutes. That’s just about the right amount to cut out, because you don’t end up losing so much of the heart of the film.”

Many editors have a particular way they like their assistants to organize bins and projects. Clarke offers, “I tend to work in the frame view and organize my set-ups by masters, close-ups, and so on. Where I may be a little different than other editors is how I have my assistants organize action scenes. I’ll have them break down the choreography move-by-move and build a sequence of selected shots in the order of these moves. So for example, all the angles of the first punch, followed by all the angles of the next move – a punch, or block, or kick. Action scenes are often shot with so much coverage, that this lets me quickly zero in on the best stuff. It eliminates the scavenger hunt to find just the right angle on a move.”

The script was written to work in a nonlinear order. Clarke explains how that played out through the edit, “We stood by this intention in the editing. We found, in fact, that the film just didn’t work linearly at all. The tone of the two [scripted] timelines are quite different, with the more serious undertones of the origin story and the broad humor of the Deadpool timeline. When played sequentially, it was like oil and water – two totally different movies. By interweaving the timelines, the tone of the movie felt more coherent with the added bonus of being able to front load action into the movie to excite the audience, before getting into the heavier cancer part of the story.”

One editing option that might come to mind is that a character in a mask offers an interesting opportunity to change dialogue without difficult sync issues. However it wasn’t the sort of crutch some might assume. Clarke says, “Yes, the mask provided a lot of opportunity for ADR. Though this was used more for tweaking dialogue for plot clarity or to try out alternate jokes, than a wholesale replacement of the production track. If we liked the production performance we generally kept it, and embraced the fact that the mask Ryan was wearing would dull the audio a bit. I try to use as little ADR as possible, when it comes to it being used for technical reasons, rather than creative ones. I feel like there’s a magic that happens on set that is often hard to replicate in the ADR booth.”

Pushing the envelope

The editing systems offered the performance needed to complete a film of this size and complexity. Vashi Nedomansky says, “There were 1,400 effects shots handled by ten vendors. Because Blur tricked out the bays, the editors could push 10 to 15 layers of 2K media at a time for temp effects – in real time, without rendering. When the film was locked, audio was exported as an AAF for the sound facility, along with an H.264 picture reference. Blur did many of the visual effects in-house. For final picture deliverables, we exported an XML from Premiere Pro, but also used the Change List tool from Intelligent Assistance. This was mainly to supply the list in a column format that would match Avid’s output and meet the studio’s requirements.”

I asked Clarke and Nedomansky what the team liked best about working with the Adobe solution. Nedomansky says, “I found that the editors really liked the tilde key [on the keyboard], which in Premiere Pro brings any window to full screen. When you have a timeline with 24 to 36 tracks of temp sound effects, it’s really nice to be able to make that full screen so that you can fine-tune them. They also liked what I call the ‘pancake timeline’, where you can stack two timelines over each other to compare them or pull clips from one into the other. When you can work faster like this, there’s more time for creativity.” Clarke adds, “I used a lot of the time-remapping in After Effects. Premiere Pro’s sub-frame audio editing is really good for dialogue. When Avid and Apple were competing with Media Composer and Final Cut Pro, it was very productive for both companies. So competition between Avid and Adobe is good, because Premiere Pro is very forward-thinking.”

Many NLE users may question how feature films apply to the work they do. Nedomansky explains, “When Kirk Baxter used Premiere Pro for Fincher’s Gone Girl, the team requested many features that they were used to from Final Cut Pro 7. About 200 of those suggestions have found their way into the current release that all Creative Cloud customers receive. Film editors will stress a system in ways that others won’t, and that information benefits all users. The important takeaway from the Deadpool experience is that after some initial adjustment, there were no showstoppers and no chaos. Deadpool is a monster film, but these are just tools. It’s the human in the chair making the decision. We all just want to work and not deal with technical issues. Whatever makes the computer invisible – that’s the power.”

Deadpool is certainly a fun ride, with a lot of inside jokes for veteran Marvel fans. Look for the Stan Lee cameo and be sure to stay all the way through the end credits!

Watch director Tim Miller discuss the choice to go with Adobe.

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2016 Oliver Peters

Whiskey Tango Foxtrot

As most readers know, “whiskey tango foxtrot” is the military way to communicate the letters WTF. Your imagination can fill in the rest. Whiskey Tango Foxtrot, the movie, is a dark comedy about the experiences of a female journalist in Afghanistan, based on Kim Barker’s memoir, The Taliban Shuffle: Strange Days in Afghanistan and Pakistan. Paramount Pictures tapped the writing/directing team of John Requa and Glenn Ficarra (Focus, Crazy, Stupid, Love., I Love You Phillip Morris) to tackle the film adaptation, starring Tina Fey, Margot Robbie, Martin Freeman, Billy Bob Thornton, and Alfred Molina.

Glenn Ficarra explains the backstory, “When the military focus shifted from Afghanistan to Iraq there was a void in coverage. Barker was looking for a change in her life and volunteered to embed as a correspondent in Kabul. When she got there, she wasn’t quite ready for the high-adrenaline, partying lifestyle of many of the journalists. Most lived in dorms away from the general Afghan population. Since there weren’t that many females there, she found that there was a lot of interest in her.” This is the basis of both the book and the film – an Afghanistan story with a touch of Animal House and M*A*S*H.

Filming in Afghanistan would have been too dangerous, so production shifted to New Mexico, with Xavier Grobet (Focus, Enough Said, I Love You Phillip Morris) as the director of photography. The filmmakers also hired a female Muslim journalist, Galereh Kiazand, as the second unit photographer to pick up B-roll in Kabul, which added to the authenticity. They also licensed stock shots originally filmed for The Kite Runner, but not used in that film. Ficarra adds, “We built two huge sets for Kabul and Kandahar, which were quite convincing, even to vets and Afghans who saw them.”

With efficiencies realized during Focus, the team followed a similar course on this film. Ficarra explains, “We previously pulled the editing in-house. For Whiskey Tango Foxtrot we decided to do all the visual effects in-house, too. There are about 1,000 VFX shots in the film. It’s so great to simply bring on more artists as you need them, and you only have to pay the crew. At its peak, we had about 20 Nuke artists working on shots. Doing it internally opens you up to more possibilities for minor effects that enhance shots – shots you would otherwise skip if you were working with an outside effects house. We carried this approach into the filming as well. While traveling, it was great to quickly pick up a shot that you could use as B-roll. So our whole mentality has been very much like the way you work in film school.”

Adjusting the workflow for a new film

The duo started production of Whiskey Tango Foxtrot on the heels of completing Focus. They brought along editor Jan Kovac, as well as Apple Final Cut Pro X for editing. This was the off-the-shelf version of Final Cut Pro X available to all customers at the time of production – no special version or side build. Kovac explains what differed on this new film, “The biggest change was in camera formats. Instead of shooting [Apple] ProRes 4444, we switched to the new ProRes 4444 XQ codec, which ARRI had deployed on the ALEXAs. On Focus, we recorded ARRIRAW for the green screen shots. We did extensive testing with the XQ codec prior to production and it was perfect even for the green screen work. Most of the production was shot with two ALEXAs recording in a 2K theatrical format using the ProRes 4444 XQ codec.”

Light Iron provided a DIT on set who took the camera files, added a basic color LUT, synced production sound, and then generated viewing dailies, which were distributed to department heads on Apple iPads. The DIT also generated editorial files at the full 2K ProRes 4444 XQ resolution. Both the camera original files and the color-corrected editorial files were stored on a 160TB Accusys ExaSAN system back at the film’s post headquarters, with two Mac Minis serving as metadata controllers. Kovac explains, “By always having the highest quality image to edit with, we could have the highest quality screenings at any given time. You always see the film in a state that is very close to the final product. Since visual effects were being handled in-house, it made sense to have the camera original files on the SAN. This way, shots could quickly be pulled for VFX work without the usual intermediate step of coordinating with the lab or post house that might otherwise store these files.”

Another change was that audio was re-synced by the editing team. First assistant editor Kevin Bailey says, “The DIT would sync the production mix, but when it got here, I would sync up all the audio tracks using Sync-N-Link X. This syncs by timecode, making the process fast. I would group the cameras into multicam clips, but as many as 12 isolated audio tracks were also set up as separate angles. This way, Jan could easily switch between the production mix and individual mics. The only part that wasn’t as automatic was that the crew also used a Blackmagic Pocket Camera and a Sony A7 for some of the shots. The production was running at a true 24.0fps frame rate, while these smaller cameras shot 24 frames at a video rate of 23.98fps. Those shots required adjustment and manual syncing. The reason for a true 24.0 frame rate was to make it easy to work with 48fps material. Sometimes the A-camera would run at 24fps while the B-camera ran at 48fps. Speeding up the B-camera by a 2X factor gets it into sync, without worrying about more complicated speed offsets.” In addition to these formats, the Afghanistan second unit footage was shot on a RED camera.
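
The arithmetic behind those choices is simple enough to sketch. The following Python snippet is mine, not a production tool; only the frame rates come from Bailey’s description:

    # Playback speed multiplier that makes source footage run in real
    # time when its frames are conformed to a timeline at timeline_fps.
    def sync_speed(source_fps: float, timeline_fps: float) -> float:
        return source_fps / timeline_fps

    print(sync_speed(48.0, 24.0))    # 2.0 - the clean 2X B-camera speedup
    print(sync_speed(23.976, 24.0))  # 0.999 - a fractional 0.1% offset

    # Left unadjusted, 23.976fps material drifts against a true 24.0
    # timeline by (24.0 - 23.976) * 60 = 1.44 frames per minute, which
    # is why those camera files needed manual re-syncing.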

Bailey is an experienced programmer who created the program Shot Notes X, which was used on this film. He continues, “Our script supervisor used FileMaker Pro, which exports a .csv file. Using Shot Notes X, I could combine the FCPXML from Final Cut with the .csv file and then generate a new FCPXML file. When imported back into Final Cut, the event would be updated to display scenes and takes, along with the script notes in the browser’s notes column. Common script codes would be used for close-ups, dolly shots, and so on. Filtering the list view by one of these codes in Final Cut would then display only the close-ups or only the dolly shots for easy access.” Bailey helped set up this pipeline during the first few weeks of production, at which point apprentice editor Esther Sokolow took over the dailies processing. Bailey shifted over to assist with sound, and Sokolow later moved into a VFX editor role as one of several people doing temp VFX.
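
The core of that merge can be sketched in a few lines of Python. This is a simplification of mine, not Shot Notes X itself: the CSV column names are invented, and the FCPXML handling is reduced to attaching a note to each clip by name:

    # Rough sketch of an FCPXML/CSV merge - not the actual Shot Notes X code.
    import csv
    import xml.etree.ElementTree as ET

    def merge_script_notes(fcpxml_in, csv_in, fcpxml_out):
        # Index the script supervisor's log by clip name (columns assumed).
        with open(csv_in, newline="") as f:
            rows = {r["clip_name"]: r for r in csv.DictReader(f)}

        tree = ET.parse(fcpxml_in)
        # Attach a <note> element to each clip that has a matching log row.
        for clip in tree.iter("asset-clip"):
            row = rows.get(clip.get("name", ""))
            if row:
                ET.SubElement(clip, "note").text = (
                    "Sc {scene} Tk {take}: {notes}".format(**row))
        tree.write(fcpxml_out, encoding="utf-8", xml_declaration=True)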

From trailer to home base

During production in New Mexico, Kovac worked out of an editorial trailer equipped with a single Mac Pro and an 8TB G-RAID drive. There he was cutting with the proxy files that Final Cut Pro X can generate internally. During that 47-day period, Kovac was doing 90% of the editing, with the incoming footage averaging about three hours and 40 minutes per day. In April, the unit moved back to home base in Los Angeles, where the team had two Mac Pro edit suites set up for the editors, as well as iMacs for the assistants.

John Requa and Glenn Ficarra are “hands-on” participants in the editing process. Kovac would cut in one room, while Ficarra and Requa would cut in the other. After the first preview, their collaboration slowly changed into a more traditional editor-director format. Even towards the end, Ficarra would still edit when he found time to do so. Post ended just before Christmas after a 35-week post schedule. Glenn Ficarra explains, “John and I have worked together for 30 years, so we are generally of one mind when we write, direct, or edit. Sometimes John would cut with me and I’d be the ‘fingers’ and other times he’d work with Jan. Or maybe I’d work with Jan and John would review and pick takes. So our process is very fluid.”

The Whiskey Tango Foxtrot team worked deeper into temp sound and visual effects than before. Kovac explains, “Kevin is very comfortable with sound design during the edit. And he’s a good Nuke artist, too. While I was working on one reel, Kevin could work on a different reel, adding in sound effects and creating monitor comps and screen replacements. A lot of this work was done inside of Final Cut using the SliceX and TrackX plug-ins from CoreMelt. We were able to work in a 5.1 surround project and did all of our temp mixes in 5.1.” The power of the plug-ins let more of the temp effects be done inside Final Cut Pro X, resulting in a more efficient workflow with fewer roundtrips to other applications.

All media and render files were kept on the ExaSAN storage, but outside of the Final Cut Pro X library files, thus keeping those small. The library files were stored on a separate NFS server (a Mac Mini using NFS Manager), with a separate FCPX library file for each reel of the film. This enabled the editors and assistants to access any FCPX library file, as long as someone else wasn’t using it at that time. A shared iTunes library of temporary sound effects and music selections was stored on the SAN, with all machines pointing to that location. From within Final Cut, any editor could browse the iTunes library for music and sound effects.
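
As a rough picture of that split between heavy media and lightweight libraries (the volume and file names below are hypothetical):

    ExaSAN volume (all media, proxies, and render files)
    └── WTF_Media/
        ├── Camera_Originals/
        ├── Render_Files/
        └── iTunes_Library/      shared temp music and sound effects

    NFS server on a Mac Mini (small library files only)
    └── WTF_Libraries/
        ├── Reel_1.fcpbundle
        ├── Reel_2.fcpbundle
        └── …                    one library per reel of the film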

When it came time for sound and picture turnovers, X2Pro Audio Convert was used to pass audio to the sound design team as an AAF file. Light Iron’s Ian Vertovec handled final color correction on their Quantel Pablo Rio system. He was working off of camera original media, which Light Iron also stored at their facility after the production. Effects shots were sent over as DPX image sequences.

Thoughts on the cut

The director’s cut of Whiskey Tango Foxtrot ran about three hours, although the final length clocked in at 1:52:00 with credits. Kovac explains, “There were 167 scripted scenes in the original script, requiring a fair amount of trimming. Once you removed something, it had consequences that rippled throughout. It took time to get it right. While it was a tougher film from that standpoint, it was easier in another way, because no studio approval process was needed for the use of Final Cut Pro X. It built upon the shoulders of Focus. Final Cut has proven itself as a valuable member of the NLE community. Naturally, anything can be improved. For example, optical flow and auditions don’t work with multicam clips. Neither do the CoreMelt plug-ins.” Bailey adds, “For me, the biggest selling point is the magnetic timeline. In areas where I would build up temp sound design, these would be the equivalent of ten tracks deep. It’s far easier to trim sections and have the audio follow along than in any other NLE.”

Glenn Ficarra wrapped up with these thoughts, “A big step forward on this film was how we dealt with audio. We devised a method to keep as much as possible inside FCPX for as long as possible – especially for screenings. This gave us more cutting time, which was nice. There was no need for any of the in-between turnovers I’ve gone through on other systems, just to prepare the movie for screenings. I like the robust third-party approach with Final Cut. It’s a small, tight-knit community. You can actually get in touch with a developer without going through a large corporation. I’d like to see Apple improve some features, like better match-back. I feel they’ve only scratched the surface with roles, so I’d like to see them develop that more.”

He concludes, “A lot of directors would like to cut for themselves, but find a tool like Avid impenetrable. It doesn’t have to be that way. My 12-year-old daughter is perfectly comfortable with Final Cut Pro X. Many of the current workflows stem from what was built up around film, and we no longer work that way. Why adhere to the old film methods and rules? Filmmakers who are using new methods are those who aren’t satisfied with the status quo. They are willing to push the boundaries.”

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2016 Oliver Peters