Apple Logic Pro X


Most nonlinear editing software includes tools for editing and mixing audio. Nevertheless, if you really need to focus on audio, you need a dedicated DAW (digital audio workstation) application. There are quite a few challengers to the dominance of Pro Tools, including Reaper, Nuendo and Audition, but the strongest of these is Apple’s Logic Pro X. It holds a unique position as a tool that covers many divergent needs, including audio production and post, studio music recording and mixing, live performance and music notation.

The revamp of Logic Pro X and its integration with Final Cut Pro X make this an ideal time to see how well it works as a companion tool for video editors. Logic Pro X is, without a doubt, a wonderful music production tool, but my focus is video editing. I’m going to skip the music side of Logic in this review, in order to focus on its strengths and weaknesses for my needs – film/video editorial.

Tech specs and installation

Logic Pro X supports a wide range of control surfaces and I/O devices. Audio resolution is up to 24-bit/192kHz. The mixer supports up to 255 audio channel strips, 255 software instrument channel strips, 255 aux channel strips, 64 busses and 99 MIDI tracks. Logic comes with 67 effects plug-ins, including Pedalboard (with 35 stompboxes), and 18 software instruments. The sound library includes 1548 patches, 3647 Apple Loops, 848 sampler instruments, 30 drum kits and over 2,000 presets for instruments, patches and plug-ins. Needless to say, it’s a whopper!

Logic Pro X may be purchased and downloaded from Apple’s Mac App Store. It also works with a separate companion program, MainStage, which will appeal to musicians who play live. Both are independent applications; you can purchase and install one, the other or both. MainStage taps into Logic’s resources, effects and content, but it’s not essential for post users.

The download sizes for Logic Pro X and MainStage are under 900MB and under 700MB, respectively. There’s also the free Logic Remote application for the iPad (iOS 7 or later), to control Logic Pro X, MainStage and GarageBand on the Mac. Logic’s plug-in controls and libraries may be accessed from Logic Remote. It’s ideal when you want to control a recording from a quiet room separated from the gear.

Once you install Logic Pro X, you gain access to optional content that may be downloaded and installed from within the application. How much of this you’ll want depends on how much musical content you need. This includes stereo and surround plug-ins, samples, loops and compatibility content to work with older versions of Logic and GarageBand. All told, this extra content adds up to 38GB of media. The largest batches are the Drum Kit samples and the Legacy and Compatibility group, which includes the JamPack loop libraries. If all you want is the basics, like plug-ins and some loops, you’ll only need to download a few extra gigabytes of content to get started. You can download the rest at any time.

Templates, settings and preferences

When you first launch Logic Pro X, you can start with a blank slate or choose from several project templates. It is important to understand the distinctions among these, because different templates use different settings. For example, it took me a while to figure out how to enable scrubbing, because a particular function needed to be enabled in the project settings’ MIDI tab, even though none of my tracks were MIDI tracks. Project settings are tied only to a project; they differ from the application preferences, which apply global changes.

In preferences you can enable Advanced Tools, which unlock project alternatives, media browsers, and expanded mixing and automation capabilities. Advanced Tools are a way to make both beginning and advanced users comfortable with Logic Pro X. When Advanced Tools are not shown, the interface is a bit closer to GarageBand, with a number of controls and panels hidden or disabled. Once they are shown in Logic’s Preferences, you gain the additional options required by experienced users.

Since a project can be based on time or musical values (beats, measures, key signatures), it’s important to set your project up correctly from the beginning. The opening template page lets you make overall changes to that project – like disabling the musical grid or setting sample rates and frame rates – but, even after you start, some values can still be altered. Once you become familiar with the project variations, it’s easy to create your own custom templates.

Tracks can be standard audio, MIDI, software instrument or drum kits. The difference between a standard track and an instrument track is that the instrument track is a patch configured with a number of in-line plug-ins. Each patch contains one or more channel strips (such as the drummer track) in its configuration. If you add a software instrument track, it will default to piano with a set of plug-ins. Highlight the track and open the Library pane to change the type of sound or the type of instrument – for instance, from a piano to a guitar with a British lead sound. This changes the configuration of the patch. Likewise, a standard audio track, without any plug-ins added at the start, can also be changed in the Library pane to a different option, like a vocal track configured as a fuzz vocal. Each of these patches is simply a set of plug-ins with presets. All can be changed, removed or added to, depending on the sound you are looking for.

Interface

Logic Pro X’s clean interface is optimized for a single screen, but also takes advantage of dual-screen layouts. Like FCP X, Logic uses a design of various panels that can be opened or closed, depending on your focus. Track adjustments, like volume or panning, can be made in the separate mixer window or in the main window. Effects adjustments can be made by opening each plug-in’s controls or by using the Smart Controls panel in the main window. The latter presents a streamlined set of macro controls that manipulate select parameters belonging to the multiple plug-ins used on the track. By default, each track also has a built-in EQ that becomes active once you make the first adjustment.

The wealth of Logic plug-ins will be familiar to most Soundtrack Pro and Final Cut Pro X users. If you want more, compatible third-party Audio Units plug-ins also work, including those from iZotope, Focusrite and Waves. Even the numerous musical plug-ins, such as a guitar pedal, can be added to standard audio tracks, including vocals. I’ve focused a lot on mixing tools, but, of course, Logic Pro X includes all of the file-based editing tools with sample-level accuracy that no professional DAW could be without.

A big new feature for most will be Track Stacks. To create a Track Stack, select a set of individual tracks and create a Stack from these. This can be a Folder Stack – where the component tracks are simply treated as a group. The other option is a Summing Stack, where the individual tracks are routed through a bus. It’s a lot like a Compound Clip in FCP X or a traditional submix bus in other audio mixing software. Let’s say you have a composite voice-over built out of several clips and spread across several tracks. Select and combine these into one Summing Track Stack and now the entire voice-over can be treated as a single track. The sub-tracks within it can be hidden by twirling the reveal triangle on the Track Stack. If you need to tweak one of the clips within the Stack, simply twirl it open and make the adjustments to that clip.

Logic Pro X also offers a comping feature designed for recording sessions, called Quick Swipe Comping. This function lets you quickly assemble the best sections of various takes. These can then be highlighted within a group, somewhat like FCP X’s Audition function. The composite of these recorded tracks is contained within a Folder Stack and can be manipulated as a group, without ever losing access to the alternate takes.

Other useful tools are Flex Time and Flex Pitch, which are great for video production, where broadcast length is critical. Turn on Flex Time and use the Flex Tool to expand or contract part or all of a clip to fit the necessary length. Finally, there’s Groove Track, which lets you realign the timing of tracks to a selected Groove Master track. Thanks to Flex Time, this feature works with both MIDI and audio tracks.
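Fitting a clip to broadcast length comes down to a simple ratio. As a back-of-the-envelope illustration (my own hypothetical helper, not Logic’s actual math), the speed factor a Flex Time stretch must apply is the current duration divided by the target duration:

```python
def stretch_ratio(current_secs: float, target_secs: float) -> float:
    """Speed factor needed so a clip of current_secs plays in target_secs.
    Values above 1.0 speed the clip up; below 1.0 slow it down."""
    if target_secs <= 0:
        raise ValueError("target duration must be positive")
    return current_secs / target_secs

# A :30 spot running 31.2 seconds needs a 4% speed-up - small enough
# that the change usually goes unnoticed by the viewer.
ratio = stretch_ratio(31.2, 30.0)
```

The same arithmetic works in reverse when a piece runs short and needs to be slowed down to fill the slot.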

Working with NLEs

Logic Pro X supports a number of interchange formats, including AAF, Final Cut Pro XML, OMF and FCPXML. These come with some caveats. FCPXML (from FCP X) came across fine, even after the changes to the FCPXML format made in the 10.1.2 update. Compound Clips were automatically broken apart into individual tracks inside the Logic project. I was also able to bring in audio from FCP X as an AAF file by using the X2Pro Audio Convert utility. Unfortunately, an AAF from Avid Media Composer didn’t work, because Logic Pro X cannot read audio files that are formatted as .MXF.

For Avid users, OMF is fine, but you have to use the following workaround in Media Composer: change the project format to NTSC; enable OMF media in the Media Creations menu; export an OMF with embedded AIFF-C sound files. In Logic Pro X, use the “Import Other” menu option. Finally, with Premiere Pro CC, the older (Final Cut Pro 7) XML format works fine. Translation completeness will vary with these different solutions. Generally, fade handles or crossfades were completely lost, even with FCP X. Levels set within the NLE may or may not transfer. I got the best translation from Premiere Pro CC2014 using XML, where automation levels and crossfades were interpreted correctly.

Editors sending a project from an NLE into Logic Pro X should understand how to properly prep the audio files. Channels that are muted or disabled in the NLE will still be imported, but muted. This includes any unwanted audio from an FCP X project that’s part of a Connected Clip (B-roll). If you don’t want it, detach and remove it from the sequence. Camera clips using two microphones are often interpreted as stereo audio by some NLEs. These should be edited to the timeline as dual mono and not stereo. DAWs process interleaved stereo pairs differently than most NLEs do. Once inside Logic, selecting either input 1 or 2 of a stereo track will sound different than if this same audio comes in as two separate mono tracks and one or the other is used.
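To see why dual mono matters, here is a minimal sketch (my own illustration using only Python’s standard-library wave module, not part of any NLE or of Logic) that splits an interleaved 16-bit stereo WAV into two independent mono files before handing the audio to a DAW:

```python
import wave

def split_stereo_to_dual_mono(src, left_path, right_path):
    """Split an interleaved 16-bit stereo WAV into two mono WAV files."""
    with wave.open(src, "rb") as w:
        assert w.getnchannels() == 2 and w.getsampwidth() == 2
        framerate = w.getframerate()
        data = w.readframes(w.getnframes())
    # Interleaved frames: [L0, R0, L1, R1, ...], 2 bytes per sample.
    left, right = bytearray(), bytearray()
    for i in range(0, len(data), 4):
        left += data[i:i + 2]
        right += data[i + 2:i + 4]
    for path, chan in ((left_path, left), (right_path, right)):
        with wave.open(path, "wb") as out:
            out.setnchannels(1)
            out.setsampwidth(2)
            out.setframerate(framerate)
            out.writeframes(bytes(chan))
```

Each mic ends up on its own track, so the DAW can treat the lav and the boom independently instead of panning them as a stereo pair.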

The weakest part of this interchange is video support. With FCPXML, Logic Pro X would attempt to use video from the project as a picture reference for the mix. Unfortunately, the clip was completely wrong. If you want a proper picture reference, export a self-contained clip and open that as a movie file in Logic Pro X. The application will sync it to the start of the track and provide good offset control for accurate sync. Movie files may be viewed in a separate viewer window, as a movie track or as a small thumbnail.

Going in the other direction, you can export a full mix, as well as all or just selected tracks. Levels and plug-in effects will be baked in. Likewise you can also export an FCPXML or AAF. In the roundtrip back into FCP X, you have the option to include video and combine the tracks into a Compound Clip.

Conclusion

Logic Pro X is a refined audio tool, but it’s still missing some items offered by competitors and even by past Apple software. Ironically, there’s still an “open in Soundtrack Pro” feature, even though that application has been discontinued. It still works quite well. Logic lacks a spectral analysis view. There’s no ability to use noise prints and ambient prints for noise reduction and filling in gaps. In an era when all audio post going to broadcast has to be CALM Act-compliant, there are no built-in loudness controls or metering features specific to this need. You can do each of these with third-party plug-ins, but it would be nice to have them be part of the native toolkit.

In spite of a few deficiencies, Logic Pro X is a wonderful mixer for stereo and surround projects and a great tool for composers. Video editors who use it can also benefit from Logic’s musical side. Simple scores are easy to create even if you aren’t a musician, thanks to the extensive media and loop libraries. Maybe you just need a temporary underscore to play under a voice-over so the client can get the feel of the piece. Or maybe you have strong composition skills and want to build the final music for your piece. Logic Pro X covers either scenario. The term “Swiss army knife” gets bandied about for many applications, but here it is warranted. A few third-party plug-ins might be required to augment the package for some needs, but at the price – and given the wealth of additional content – Logic Pro X is a tremendous value for any video editor who wants their mixes to stand out above the rest.

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2014 Oliver Peters

Boris Continuum Complete 9

Boris Continuum Complete has been the flagship effects package for Boris FX. With each new version, the company makes it better, faster and even more useful. They have recently released BCC 9 for Avid Media Composer, Sony Vegas Pro, Adobe After Effects/Premiere Pro, FCP X/Motion and DaVinci Resolve. This has been a two-year effort to build upon BCC 8, which was initially released in 2012.

Boris Continuum Complete 9 sports over 30 new effects, including 23 new transitions. In total, BCC 9 versions for Avid Media Composer, Adobe After Effects and Premiere Pro each offer over 230 filters, including 3D extruded text, particle effects, image restoration tools, lens flares, keying and compositing and much more. Specific new filters include a lens vignette, 2-strip color process, sharpening, lens correction, grunge, edge grunge, a laser beam effect, an improved pan & zoom filter with perspective and a one-stop chromakey “studio”.

While new filters are always fun, the two big changes in BCC 9 over previous versions are OpenCL and CUDA acceleration, plus the addition of an FX Browser. Boris FX claims as much as a 2X improvement with some filters when comparing the rendering of BCC 8 versus BCC 9 (tested on Avid Media Composer running under Windows). From just casual use, you’ll definitely see an improvement when applying and manipulating such processor-intensive effects as lens flares and glows.

FX Browser

The FX Browser is new for Continuum Complete. While it’s not unique to have a filter browser within an effects package, this one has an edge over those of competitors, because the clip is displayed in a viewer. You can play or scrub through the clip and sample the various filter presets. You aren’t tied to a default thumbnail image or a single still frame of the clip that you’ve sent to the browser. Nearly all of the BCC filters and transitions come with a set of presets.

Many users simply apply a filter, use its default and tweak the sliders a little. In doing so, they miss the wide range of looks that can actually be achieved. With this new FX Browser, you can apply a filter, open the FX Browser and see what each of the Boris presets will look like on your image. In the left pane you see thumbnails for each preset. As you click on the options, the viewer displays your clip with that preset applied to it. At the bottom of the browser is a history panel that shows the choices you’ve already reviewed. Once you find the desired look, click the “apply” button, which returns you to the host’s effects controls. Now adjust the sliders to fine-tune the effect to taste.

Another useful way to work with the FX Browser is to apply the standalone FX Browser filter to a clip. When you open the FX Browser, you’ll now see all of the effects categories and individual filters available within BCC 9. This is a great way to preview different filters when you just need some creative inspiration.

Transitions

In the past, filters within the Boris Continuum Complete for After Effects version would also show up and work in Premiere Pro; however, the transitions functioned like they did in After Effects and not like a built-in Premiere Pro transition. You would have to stack adjacent clips onto overlapping vertical tracks and transition between them, just like in After Effects. Now with the changes in Premiere Pro CC and CC2014, transitions in packages like BCC 9 behave as native effects – the same as in other NLEs. This means you can now apply a transition to the cut between two clips on the same track, just like a standard cross-dissolve.

As with any of the filters, transitions may also be previewed in the FX Browser. Boris Continuum filters feature on-screen controls (heads-up display or HUD) to vary filter adjustments, in addition to the sliders in the control panel. With BCC 9, some of the transition effects also gain appropriate HUD controls. For example, overlay curves let you make nonlinear adjustments to wipes and dissolves without the need to add keyframes.

Focus on quality and efficiency

Over the last few versions, Boris FX has focused on improving the overall quality of all of the filters. The new BCC Vignette effect doesn’t just darken the edges of the frame, it also adds a slight defocus to more closely approximate real-world lens issues. Their BCC 2-strip Color filter is designed to accurately mimic the look of classic Technicolor films. The new BCC Lens Correction filter is mathematically designed to “unwarp” the characteristic wide angle lens distortion of cameras like the GoPro.

Some of BCC 9’s strength is in its packaging of old favorites, such as its chromakeying tools. Many editors swear by the Boris keyers. Tools in this group include a matte choker and light wrap, which are typically used to finesse a key. Most editors end up applying a stack of these various filters to a clip, in order to get the cleanest result. New with BCC 9 is a chromakey studio filter, which combines all of the keying tools into one panel. This first appeared years ago in BCC for Final Cut Pro (“legacy”), but it’s now a part of all the BCC 9 versions. An improvement in this version is the addition of a built-in set of garbage masks to isolate the area to be keyed.

Extras

While most of the filters and effects are consistent across hosts, there are some host-specific variations. For instance, in the AVX package for Avid Media Composer, each filter is more complex, because Boris FX adds a geometry function for image transforms (scale, position, etc.) to most AVX filters. These are not standard clip attributes built into the native Media Composer architecture, like they are in After Effects, FCP 7/X and Premiere Pro. Unfortunately, since these are extra functions required to augment Media Composer, it makes the AVX set more expensive than the AE version. For FCP X, filters must be built for Motion and then packaged as Motion templates to be used inside FCP X.

BCC filters, in general, offer more controls within each filter than many competitors do. Nearly all BCC filters include Pixel Chooser, an internal masking control. If you only want a filter to alter the bright areas of an image, you can use Pixel Chooser to mask the image according to its luminance values; the effect is then limited to that portion of the image.
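Conceptually, a luminance-based mask works like the toy Python sketch below (the function names and the Rec. 709 luma weighting are my own assumptions, not Boris FX code): pixels whose luma exceeds a threshold receive the effect at full strength, while the rest are left untouched.

```python
def luminance_mask(pixels, threshold):
    """Per-pixel mask weight: 1.0 where Rec. 709 luma exceeds threshold,
    else 0.0. `pixels` is a list of (r, g, b) tuples with values 0.0-1.0."""
    def luma(rgb):
        r, g, b = rgb
        return 0.2126 * r + 0.7152 * g + 0.0722 * b
    return [1.0 if luma(p) > threshold else 0.0 for p in pixels]

def apply_masked_gain(pixels, mask, gain):
    """Apply a brightness-style gain only where the mask is 1.0,
    clipping results to the 0.0-1.0 range."""
    return [
        tuple(min(1.0, c * (1.0 + (gain - 1.0) * m)) for c in p)
        for p, m in zip(pixels, mask)
    ]
```

A real implementation would also feather the mask edge; this sketch only shows the hard-threshold idea.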

Another unique BCC feature is Beat Reactor. Let’s say you want a glow filter to pulse the intensity of the glow in sync with the beat of a music track. Simply enable Beat Reactor in that filter’s control panel, select and link the track that you want as a reference and adjust accordingly. Beat Reactor displays an overlay graphic of color bars for the music. These bounce up and down like a digital VU meter in response to the track. The bars may be used for reference and hidden – or they can also be rendered into the final output as an extra visual element. The Beat Reactor controls let you select the portion of this graph that is to affect the filter and the specific parameters to be controlled. Select a range that includes just the musical peaks and you’ll get a smaller variation to the changes in the filter than if you had selected the entire graph.
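The underlying idea of Beat Reactor – driving an effect parameter from the audio’s amplitude envelope – can be sketched in a few lines of Python (a hypothetical illustration of the concept, not the plug-in’s actual algorithm). The `floor` argument mimics selecting only the peaks of the graph:

```python
import math

def amplitude_envelope(samples, window):
    """Per-window RMS level of a mono sample list, normalized to 0..1."""
    rms = [
        math.sqrt(sum(s * s for s in samples[i:i + window]) / window)
        for i in range(0, len(samples) - window + 1, window)
    ]
    if not rms:
        return []
    peak = max(rms) or 1.0
    return [r / peak for r in rms]

def drive_parameter(levels, lo, hi, floor=0.0):
    """Map normalized levels to a parameter range (e.g. glow intensity).
    Energy below `floor` is ignored, akin to selecting just the musical
    peaks of the Beat Reactor graph for a subtler response."""
    span = 1.0 - floor
    out = []
    for lvl in levels:
        t = max(0.0, lvl - floor) / span if span > 0 else 0.0
        out.append(lo + (hi - lo) * t)
    return out
```

Raising the floor narrows the range that drives the effect, which matches the article’s point: selecting only the peaks yields smaller variations than using the entire graph.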

Last, but not least, is a new online help system. Having trouble figuring out a filter? Click on the “help” button in the control panel to access a web page (stored on your computer) for tips and tutorials about using that effect. Boris FX continues to improve and advance what has to be the best all-around editor toolkit on the market. Boris Continuum Complete 9 is cross-platform and covers the most popular NLEs. Now that Avid Media Composer no longer comes bundled with a full set of BCC filters, new Avid editors will certainly want to invest in this package. For Adobe Premiere Pro CC customers, it adds capabilities that far surpass what comes with Premiere Pro. Best of all, the single purchase for the Adobe version covers you for After Effects, as well.

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2014 Oliver Peters

Gone Girl

David Fincher is back with another dark tale of modern life, Gone Girl – the film adaptation of Gillian Flynn’s 2012 novel. Flynn also penned the screenplay. It is the story of Nick and Amy Dunne (Ben Affleck and Rosamund Pike) – writers who have been hit by the latest downturn in the economy and are living in America’s heartland. Except that Amy is now mysteriously missing under suspicious circumstances. The story is told from each of their subjective points of view. Nick’s angle is revealed through present events, while Amy’s story is told through her diary in a series of flashbacks. Through these we learn that theirs is less than the ideal marriage we see from the outside. But whose story tells the truth?

To pull the film together, Fincher turned to his trusted team of professionals including director of photography Jeff Cronenweth, editor Kirk Baxter and post production supervisor Peter Mavromates. Like Fincher’s previous films, Gone Girl has blazed new digital workflows and pushed new boundaries. It is the first major feature to use the RED EPIC Dragon camera, racking up 500 hours of raw footage. That’s the equivalent of 2,000,000 feet of 35mm film. Much of the post, including many of the visual effects, was handled in-house.

Kirk Baxter co-edited David Fincher’s The Curious Case of Benjamin Button, The Social Network and The Girl with the Dragon Tattoo with Angus Wall – films that earned the duo two best editing Oscars. Gone Girl was a solo effort for Baxter, who had also cut the first two episodes of House of Cards for Fincher. This film now becomes the first major feature to have been edited using Adobe Premiere Pro CC. Industry insiders consider this Adobe’s Cold Mountain moment. That refers to when Walter Murch used an early version of Apple Final Cut Pro to edit the film Cold Mountain, instantly raising the application’s awareness among the editing community as a viable tool for long-form post production. Now it’s Adobe’s turn.

In my conversation with Kirk Baxter, he revealed, “In between features, I edit commercials, like many other film editors. I had been cutting with Premiere Pro for about ten months before David invited me to edit Gone Girl. The production company made the decision to use Premiere Pro, because of its integration with After Effects, which was used extensively on the previous films. The Adobe suite works well for their goal to bring as much of the post in-house as possible. So, I was very comfortable with Premiere Pro when we started this film.”

It all starts with dailies

Tyler Nelson, assistant editor, explained the workflow, “The RED EPIC Dragon cameras shot 6K frames (6144 x 3072), but the shots were all framed for a 5K center extraction (5120 x 2133). This overshoot allowed reframing and stabilization. The .r3d files from the camera cards were ingested into a FotoKem nextLAB unit, which was used to transcode editorial media, view dailies, archive the media to LTO data tape and transfer files to shuttle drives. For offline editing, we created down-sampled ProRes 422 (LT) QuickTime media, sized at 2304 x 1152, which corresponded to the full 6K frame. The Premiere Pro sequences were set to 1920 x 800 for a 2.40:1 aspect. This size corresponded to the same 5K center extraction within the 6K camera files. By editing with the larger ProRes files inside of this timeline space, Kirk was only viewing the center extraction, but had the same relative overshoot area to enable easy repositioning in all four directions. In addition, we also uploaded dailies to the PIX system for everyone to review footage while on location. PIX also lets you include metadata for each shot, including lens choice and camera settings, such as color temperature and exposure index.”
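The numbers in that workflow line up neatly. A small Python sketch (my own arithmetic check, with hypothetical helper names) shows that the 2304 x 1152 proxies relate to the full 6K frame by the same scale factor as the 1920 x 800 timeline does to the 5K extraction, and how much overshoot was available for repositioning:

```python
FULL_6K = (6144, 3072)     # camera frame (width, height)
EXTRACT_5K = (5120, 2133)  # framed center extraction
PROXY = (2304, 1152)       # offline ProRes proxy of the full 6K frame
TIMELINE = (1920, 800)     # Premiere Pro sequence (2.40:1)

def scale_factor(src, dst):
    """Horizontal downscale factor from src to dst."""
    return src[0] / dst[0]

def reposition_headroom(full, extract):
    """Pixels of overshoot on each side when extracting from the center."""
    return ((full[0] - extract[0]) / 2, (full[1] - extract[1]) / 2)

# 6K -> proxy and 5K extraction -> timeline use the same 8:3 downscale,
# so the 1920 x 800 sequence shows exactly the center extraction while
# the proxy retains the full overshoot area for repositioning.
assert scale_factor(FULL_6K, PROXY) == scale_factor(EXTRACT_5K, TIMELINE)
```

With matching scale factors, a reposition made against the proxy maps one-to-one back to the 6K original at conform time.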

Kirk Baxter has a very specific way that he likes to tackle dailies. He said, “I typically start in reverse order. David tends to hone in on the performance with each successive take until he feels he’s got it. He’s not like other directors that may ask for completely different deliveries from the actors with each take. With David, the last take might not be the best, but it’s the best starting point from which to judge the other takes. Once I go through a master shot, I’ll cut it up at the points where I feel the edits will be made. Then I’ll have the assistants repeat these edit points on all takes and string out the line readings back-to-back, so that the auditioning process is more accurate. David is very gifted at blocking and staging, so it’s rare that you don’t use an angle that was shot for a scene. I’ll then go through this sequence and lift my selected takes for each line reading up to a higher track on the timeline. My assistants take the selects and assemble a sequence of all the angles in scene order. Once it’s hyper-organized, I’ll send it to David via PIX and get his feedback. After that, I’ll cut the scene. David stays in close contact with me as he’s shooting. He wants to see a scene cut together before he strikes a set or releases an actor.”

Telling the story

The director’s cut is often where the story gets changed from what works on paper to what makes a better film. Baxter elaborated, “When David starts a film, the script has been thoroughly vetted, so typically there isn’t a lot of radical story re-arrangement in the cutting room. As editors, we got a lot of credit for the style of intercutting used in The Social Network, but truthfully that was largely in the script. The dialogue was tight and very integral to the flow, so we really couldn’t deviate a lot. I’ve always found the assembly the toughest part, due to the volume and the pressure of the ticking clock. Trying to stay on pace with the shoot involves some long days. The shooting schedule was 106 days and I had my first cut ready about two weeks after the production wrapped. A director gets around ten weeks for a director’s cut and with some directors, you are almost starting from scratch once the director arrives. With David, most of that ten-week period involves adding finesse and polish, because we have done so much of the workload during the shoot.”

He continued, “The first act of Gone Girl uses a lot of flashbacks to tell Amy’s side of the story and with these, we deviated a touch from the script. We dropped a couple of scenes to help speed things along and reduced the back and forth of the two timelines by grouping flashbacks together, so that we didn’t keep interrupting the present day; but, it’s mostly executed as scripted. There was one scene towards the end that I didn’t feel was in the right place. I kept trying to move it, without success. I ended up taking another pass at the cut of the scene. Once we had the emotion right in the cut, the scene felt like it was in the right place, which is where it was written to be.”

“The hardest scenes to cut are the emotional scenes, because David simplifies the shooting. You can’t hide in dynamic motion. More complex scenes are actually easier to cut and certainly quite fun. About an hour into the film is the ‘cool girls’ scene, which rapidly answers lots of question marks that come before it. The scene runs about eight minutes long and is made up of about 200 set-ups. It’s a visual feast that should be hard to put together, but was actually dessert from start to finish, because David thought it through and supplied all the exact pieces to the puzzle.”

Music that builds tension

Composers Trent Reznor and Atticus Ross of Nine Inch Nails fame are another set of Fincher regulars. Reznor and Ross have typically supplied Baxter with an album of preliminary themes scored with key scenes in mind. These are used in the edit and then later enhanced by the composers with the final score at the time of the mix. Baxter explained, “On Gone Girl we received their music a bit later than usual, because they were touring at the time. When it did arrive, though, it was fabulous. Trent and Atticus are very good at nailing the feeling of a film like this. You start with a piece of music that has a vibe of ‘this is a safe, loving neighborhood’ and throughout three minutes it sours to something darker, which really works.”

“The final mix is usually the first time I can relax. We mixed at Skywalker Sound and that was the first chance I really had to enjoy the film, because now I was seeing it with all the right sound design and music added. This allows me to get swallowed up in the story and see beyond my role.”

Visual effects

The key factor to using Premiere Pro CC was its integration with After Effects CC via Adobe’s Dynamic Link feature. Kirk Baxter explained how he uses this feature, “Gone Girl doesn’t seem like a heavy visual effects film, but there are quite a lot of invisible effects. First of all, I tend to do a lot of invisible split screens. In a two-shot, I’ll often use a different performance for each actor. Roughly one-third of the timeline contains such shots. About two-thirds of the timeline has been stabilized or reframed. Normally, this type of in-house effects work is handled by the assistants who are using After Effects. Those shots are replaced in my sequence with an After Effects composition. As they make changes, my timeline is updated.”

“There are other types of visual effects, as well. David will take exteriors and do sky replacements, add flares, signage, trees, snow, breath, etc. The shot of Amy sinking in the water, which has been used in the trailers, is an effects composite. That’s better than trying to do multiple takes with the real actress by drowning her in cold water. Her hair and the water elements were created by Digital Domain. This is also a story about the media frenzy that grows around the mystery, which meant a lot of TV and computer screen comps. That content is as critical in the timing of a scene as the actors who are interacting with it.”

Tyler Nelson added his take on this, “A total of four assistants worked with Kirk on these in-house effects. We were using the same ProRes editing files to create the composites. In order to keep the system performance high, we would render these composites for Kirk’s timeline, instead of using unrendered After Effects composites. Once a shot was finalized, then we would go back to the 6K .r3d files and create the final composite at full resolution. The beauty of doing this all internally is that you have a team of people who really care about the quality of the project as much as everyone else. Plus the entire process becomes that much more interactive. We pushed each other to make everything as good as it could possibly be.”

Optimization and finishing

A custom pipeline was established to make the process efficient. This was spearheaded by post production consultant Jeff Brue, CTO of Open Drives. The front-end storage for all active editorial files was a 36TB RAID-protected storage network built with SSDs. A second RAID built with standard HDDs was used for the .r3d camera files and visual effects elements. The hardware included a mix of HP and Apple workstations running with NVIDIA K6000 or K5200 GPU cards. Use of the NVIDIA cards was critical to permit as much real-time performance as possible during the edit. GPU performance was also a key factor in the de-Bayering of .r3d files, since the team didn’t use any of the RED Rocket accelerator cards in their pipeline. The Macs were primarily used for the offline edit, while the PCs tackled the visual effects and media processing tasks.

In order to keep the Premiere Pro projects manageable, the team broke down the film into eight reels with a separate project file per reel. Each project contained roughly 1,500 to 2,000 files. In addition to Dynamic Linking of After Effects compositions, most of the clips were multi-camera clips, as Fincher typically shoots scenes with two or more cameras for simultaneous coverage. This massive amount of media could have potentially been a huge stumbling block, but Brue worked closely with Adobe to optimize system performance over the life of the project. For example, project load times dropped from about six to eight minutes at the start down to 90 seconds at best towards the end.

The final conform and color grading was handled by Light Iron on their Quantel Pablo Rio system run by colorist Ian Vertovec. The Rio was also configured with NVIDIA Tesla cards to facilitate this 6K pipeline. Nelson explained, “In order to track everything I used a custom Filemaker Pro database as the codebook for the film. This contained all the attributes for each and every shot. By using an EDL in conjunction with the codebook, it was possible to access any shot from the server. Since we were doing a lot of the effects in-house, we essentially ‘pre-conformed’ the reels and then turned those elements over to Light Iron for the final conform. All shots were sent over as 6K DPX frames, which were cropped to 5K during the DI in the Pablo. We also handled the color management of the RED files. Production shot these with the camera color metadata set to RedColor3, RedGamma3 and an exposure index of 800. That’s what we offlined with. These were then switched to RedLogFilm gamma when the DPX files were rendered for Light Iron. If, during the grade, it was decided that one of the raw settings needed to be adjusted for a few shots, then we would change the color settings and re-render a new version for them.” The final mastering was in 4K for theatrical distribution.
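For those curious how a shot codebook like this functions, here’s a toy sketch in Python. The reel names, timecodes and fields are all hypothetical stand-ins for the actual Filemaker Pro database; the point is simply that an EDL event (reel plus source timecode) can pull up any shot’s record.

```python
# Toy sketch of the "codebook" idea: shot attributes keyed so that an EDL
# event (reel + source timecode) can pull up any clip's record. All field
# names and values here are hypothetical stand-ins for the Filemaker data.
codebook = {
    ("A001", "01:00:10:00"): {"file": "A001_C002.R3D", "gamma": "RedLogFilm"},
}

def lookup(reel, src_tc):
    """Return the shot record for an EDL event, or None if unknown."""
    return codebook.get((reel, src_tc))

print(lookup("A001", "01:00:10:00")["file"])   # A001_C002.R3D
```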

As with his previous films, director David Fincher has not only told a great story in Gone Girl, but has also set new standards in digital post production workflows. Seeking to retain creative control without breaking the bank, Fincher has pushed to handle as many services in-house as possible. His team has made effective use of After Effects for some time now, but the new Creative Cloud tools, with Premiere Pro CC as the hub, bring the power of this suite to the forefront. Fortunately, team Fincher has been very eager to work with Adobe on product advances, many of which are evident in the new application versions previewed by Adobe at IBC in Amsterdam. With a film as complex as Gone Girl, it’s clear that Adobe Premiere Pro CC is ready for the big leagues.

Kirk Baxter closed our conversation with these final thoughts about the experience. He said, “It was a joy from start to finish making this film with David. Both he and Cean [Chaffin, producer and David Fincher’s wife] create such a tight knit post production team that you fall into an illusion that you’re making the film for yourselves. It’s almost a sad day when it’s released and belongs to everyone else.”

Originally written for Digital Video magazine / CreativePlanetNetwork.

_________________________________

Needless to say, Gone Girl has received quite a lot of press. Here are just a few additional discussions of the workflow:

Adobe panel discussion with the post team

PostPerspective

FxGuide

HDVideoPro

IndieWire

IndieWire blog

ICG Magazine

RedUser

Tony Zhou’s Vimeo take on Fincher 

©2014 Oliver Peters

Sitting in the Mix Revisited

df_nlemix2_1

Video editors are being called on to do more, and mixing audio is one of those tasks. While advanced audio editing and mixing are still best done in a DAW, and by a professional who uses those tools every day, it’s long been the case that most local TV commercials and a lot of corporate videos are mixed by the editor within the NLE. It’s time for a second look at the subject.

Although most modern NLEs have very strong audio tools, I find that Adobe Premiere Pro CC is one of the better NLEs when it comes to basic audio mixing. There is a wide range of built-in plug-ins and it accepts most third-party VST and AU (Mac) filters. Audio can be mixed at both the clip and the track level using faders, rubber-banding in the timeline or by writing automation mix passes with the track mixer. The following are some simple tips for getting good mixes for TV using Premiere Pro CC.

Repair – If you have problem audio tracks, don’t forget that you can send your audio clip to Audition. When you select a clip to edit in Audition, a copy of the file is extracted and sent to Audition. This extracted copy replaces the original clip on the Premiere timeline so the original stays untouched. Audition is good for surgery, such as removing background noise. There are both waveform and spectral views where it’s possible to isolate and “heal” noise elements visible in the spectral view. I recently used this to reduce the noise from a lawn mower heard in the background of an on-location interview.

Third-party filters – In addition to the built-in tools, Premiere Pro supports any compliant audio filters on your system. By scanning the system, Premiere Pro (as well as Audition) can access plug-ins that you might have installed as part of other applications. Several good filter sets are available from Focusrite, Waves and iZotope. When it comes to audio mixing for simple projects, I’m a fan of the Vocal Rider and One Knob plug-ins from Waves. Vocal Rider works best with voice-overs, automatically “riding” the level between a minimum and maximum setting. It works a bit like a human operator in evening out volume variations and is not as blunt a tool as a compressor. The One Knob filters are a series of complex filters for EQ or reverb, each controlled by a single adjustment knob. For example, you can use the “brighter” filter to adjust a multi-band, parametric-style EQ that increases the treble of the sound.

Mixing formula – This is my standard formula for mixing TV spots in Premiere Pro. My intention is to end up with voices that sit well against a music track without the music volume being too low. A handy Premiere tool is the vocal enhancer. It’s a simple filter with an adjustment dial that balances the setting for male or female voices as well as for music. Dial in the setting by ear to the point that the voice “cuts” through the mix without sounding overly processed. For music, I’ll typically apply an EQ filter to the track and bring down the broader mid-range by -2dB. Across the master bus (or a submix bus for each stem) I’ll apply a dynamic compressor/limiter. This is used to “soft clip” the bus volume at -10dB. Overall, I’ll adjust clip and track volumes to stay under this range, so the mix is not harshly compressed or clipped.
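To make the “soft clip” step concrete, here’s a simplified numeric sketch in Python. It’s illustrative only, with a tanh curve standing in for the limiter; a real dynamics plug-in adds attack and release behavior that this ignores.

```python
import math

def db_to_linear(db):
    """Convert a dBFS value to a linear amplitude."""
    return 10 ** (db / 20.0)

def soft_clip(sample, ceiling_db=-10.0):
    """Tanh-style soft clip with the knee set at the ceiling level."""
    ceiling = db_to_linear(ceiling_db)   # -10 dBFS is roughly 0.316
    return ceiling * math.tanh(sample / ceiling)

# A test tone peaking at -4 dBFS gets rounded off near the -10 dBFS ceiling.
tone = [db_to_linear(-4.0) * math.sin(2 * math.pi * n / 1000) for n in range(1000)]
peak_db = 20 * math.log10(max(abs(soft_clip(s)) for s in tone))
print(round(peak_db, 1))   # about -10.3
```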

CALM – Most audio delivered for US broadcast has to be compliant with the loudness specs of the CALM Act. There are similar European standards. Adobe aids us in this by including the TC Electronic Radar metering plug-in. If you use this, place it on the master bus and make sure audio is routed first through a submix bus. I’ll place a compressor/limiter on the submix bus. This way, all volume adjustments and limiting happen upstream of the meter. By adjusting your mix with the Radar meter running, it’s possible to end up with a compliant mix that still sounds quite natural.
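As a rough illustration of what a loudness meter reports, here’s a crude RMS measurement in Python. Note that this is NOT the K-weighted, gated measurement defined by ITU-R BS.1770 that the Radar meter actually performs; it only shows the kind of averaged level, as opposed to peak level, that loudness compliance is based on.

```python
import math

def rms_dbfs(samples):
    """Mean-square level in dBFS: a crude stand-in for a loudness meter."""
    ms = sum(s * s for s in samples) / len(samples)
    return 10 * math.log10(ms)

# A sine wave peaking at -27 dBFS measures about 3 dB lower in RMS terms.
tone = [10 ** (-27 / 20) * math.sin(2 * math.pi * n / 480) for n in range(48000)]
print(round(rms_dbfs(tone)))   # -30
```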

©2014 Oliver Peters

Custom transitions using adjustment layers

df_trans_main

Sometimes you just want to use a unique transition effect; however, you might not own a package of third-party plug-ins with custom transitions. If you are an FCP X / Motion user, then you can create a custom transition as a Motion template. But maybe that’s too much trouble when you are in the thick of things. There is actually a very simple technique that After Effects artists have used for years: using an adjustment layer above a cut or dissolve between two shots and applying filters within the adjustment layer.

This works in FCP X, Premiere Pro CC and Media Composer. The first two actually have adjustment layer effects, though in FCP X, it’s based on a blank title generator. In Media Composer, you can add edits into empty video tracks and apply effects to any section of a blank track, which effectively makes this process the same as using an adjustment layer. The Media Composer approach was described nicely by Shane Ross in his Vimeo tutorial, which got me thinking about this technique more broadly. Generally, it works the same in all three of these NLEs.

The examples and description are based on Premiere Pro CC, but don’t let that stop you from trying it out on your particular software of choice. To start, create a new adjustment layer and add a marker to the middle of it. This helps to center the layer over the cut between two shots. Place the adjustment layer effect over a cut between shots, making sure that the marker lines up with the edit point. If the transition is to be a one-second effect, then trim the front and back of the adjustment layer so that one-half second is before the marker and one-half second is after the marker. Depending on the effect, you may or may not also want a short dissolve between the two shots on the base video track. For example, an effect that flashes the screen full frame at the midpoint will work with a cut. A blur effect will work best in conjunction with a dissolve, otherwise you’ll see the cut inside the blur.
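The placement math is simple enough to sketch in code. This hypothetical helper just computes where the layer should start and end around the cut (24 fps is assumed for the example):

```python
# Placement math for a one-second transition layer centered on an edit
# point, with the marker at the midpoint (24 fps assumed for illustration).
def transition_layer(cut_frame, duration_secs=1.0, fps=24):
    half = int(duration_secs * fps / 2)
    return {"start": cut_frame - half,
            "marker": cut_frame,      # lines up with the edit point
            "end": cut_frame + half}

print(transition_layer(1000))   # {'start': 988, 'marker': 1000, 'end': 1012}
```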

The beauty of this technique is that you can apply numerous filters to an adjustment layer and get a unique combination of effects that isn’t otherwise available. For example, a blur+glow+flare transition. At this point, it’s important to realize that not all effects plug-ins work the same way and you will have varying results. Boris filters tend not to work when you stack them in the same adjustment layer and start to change keyframes. In Avid’s architecture, the BCC filters have a specific pipeline and you have to define which filter is the first and which is the last effect. I didn’t find any such controls in the Premiere version. A similar thing happened with the Red Giant Universe filters. On the other hand, most of the native Premiere Pro filters operated correctly in this fashion.

The basic principle is that you want the filters to start and end at a neutral value so that the transition starts and ends without a visible effect. The midpoint of the transition (over the cut) should be at full value of whatever it is you are trying to achieve. If it’s a lens flare, then the middle of the transition should be the midpoint of the lens flare’s travel and also its brightest moment. If you are using a glow, then the intensity is at its maximum in the middle. Typically this means three keyframe points – beginning, middle and end. The values you adjust will differ with the plug-in. It could be opacity, strength, intensity or anything else. Sometimes you will adjust multiple parameters at these three points. This will be true of a lens flare that travels across the screen during the transition.
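The three-keyframe idea can be sketched as a simple envelope. Linear interpolation is assumed here for clarity; an actual plug-in parameter might ease in and out instead:

```python
# The three-keyframe envelope: neutral at the start and end, full value at
# the midpoint over the cut.
def effect_value(t, duration, max_value):
    """Effect strength at frame t of a transition lasting `duration` frames."""
    mid = duration / 2.0
    if t <= mid:
        return max_value * (t / mid)
    return max_value * ((duration - t) / mid)

# A blur amount ramping 0 -> 50 -> 0 across a 24-frame transition.
print([round(effect_value(t, 24, 50)) for t in (0, 6, 12, 18, 24)])
# [0, 25, 50, 25, 0]
```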

The point is that you will have to experiment a bit to get the right feel. The benefit is that once you’ve done this, the adjustment layer clip – complete with filters and keyframes – can be copied-and-pasted to other sections of the timeline for a consistent effect.

Here are some examples of custom transition effects in Premiere Pro CC, using this adjustment layer technique. (Click the image for an enlarged view.)

df_trans_1

This is a combination of a Basic 3D horizontal spin and Ripples. The trick is to get the B-side image to not be horizontally flipped, since it’s the backside of the rotating image. To do this, I added an extra Transform filter with a middle keyframe that reverses the scale width to -100.

df_trans_2

This transition combines a Directional Blur with a Chromatic Glow and requires a dissolve on the base video track.

df_trans_3

This is a lens flare transition where the flare travels and changes intensity. The brightest part is the midpoint over the shot change. This could work as a cut or dissolve, since the flare’s brightness “wipes” the screen. In addition, I have the flare center traveling from the upper left to the lower right of the frame.

df_trans_4

Here, I’ve applied the BCC Pencil Sketch filter, bringing it in and out during the length of the transition, with a dissolve on the base layer. This gives us a momentary cartoon look as part of the shot transition.

df_trans_5

Custom UI filters like Magic Bullet Looks also work. This effect combines Looks using the “blockbuster” preset with a Glow Highlights filter. First set the appearance in Looks and then use the strength slider for your three keyframes.

df_trans_6

This transition is based on the Dust & Scratches filter in Premiere Pro. I’m not sure why it produced this blotchy artistic look other than the large radius value. Quite possibly this is a function of its behavior in an adjustment layer. Nevertheless, it’s a cool, impressionistic style.

df_trans_7

This transition takes advantage of the BCC Water Color filter. Like my Pencil Sketch example, the transition briefly turns into a watercolor during the length of the transition.

df_trans_8

Like the previous two BCC examples, this is a similar approach using the Universe ToonIt Paint filter.

df_trans_9

This transition combines several of the built-in Premiere Pro effects, including Transform and Radial Blur. The image scales up and back down through the midpoint, along with the blur values ramping and an added skew value change.

df_trans_10

In this last example, I’ve used Magic Bullet Looks. The Looks style uses a Tilt-Shift preset to which I’ve added lens distortion within Looks. The three keyframe points are set by adjusting the strength slider.

©2014 Oliver Peters

Color Grading Strategies

df_gradestrtgy_1_sm

A common mistake made by editors new to color correction is to try to nail a “look” all in a single application of a filter or color correction layer. Subjective grading is an art. Just like a photographer who dodges and burns areas of a photo in the lab or in Photoshop to “relight” a scene, so it is with the art of digital color correction. This requires several steps, so a single solution will never give you the best result. I follow this concept, regardless of the NLE or grading application I’m using at the time. Whether stacked filters in Premiere Pro, several color corrections in FCP X, rooms in Color, nodes in Resolve or layers in SpeedGrade – the process is the same. The standard grade for me is often a “stack” of four or more grading levels, layers or nodes to achieve the desired results. (Please click on any of the images for an expanded view.)
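The stacked-layers idea can be sketched in code. Here each grading level is just a function applied in order to a single luminance value, and any level can be toggled off independently; the functions and numbers are purely illustrative, not real grades:

```python
# The layered-grade concept as a sketch, operating on one luminance value
# for simplicity. The point is the ordered, individually togglable stack.
def balance(y):  return min(1.0, y * 1.10)   # level 1: shot-to-shot balance
def look(y):     return y ** 0.95            # level 2: subjective "look"
def vignette(y): return y * 0.92             # level 4: edge darkening

def grade(y, levels):
    """Apply the enabled levels in order, like stacked corrections."""
    for fn, enabled in levels:
        if enabled:
            y = fn(y)
    return y

# Preview the look with and without the vignette toggled on.
with_vig = grade(0.5, [(balance, True), (look, True), (vignette, True)])
without  = grade(0.5, [(balance, True), (look, True), (vignette, False)])
```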

The first step for me is always to balance the image and to make that balance consistent from shot to shot. Achieving this varies with the type of media and application. For example, RED camera raw footage is compatible with most up-to-date software, allowing you to have control over the raw decoding settings. In FCP X or Premiere Pro, you get there through separate controls to modify the raw source metadata settings. In Resolve, I would usually make this the first node. Typically I will adjust ISO, temperature and tint here and then set the gamma to REDlogFilm for easy grading downstream. In a tool like FCP X, you are changing the settings for the media file itself, so any change to the RED settings for a clip will alter those settings for all instances of that clip throughout all of your projects. In other words, you are not changing the raw settings for only the timeline clips. Depending on the application, this type of change is made in the first step of color correction or it is made before you enter color correction.

I’ll continue this discussion based on FCP X for the sake of simplicity, but just remember that the concepts apply generally to all grading tools. In FCP X, all effects are applied to clips before the color board stage. If you are using a LUT filter or some other type of grading plug-in like Nattress Curves, Hawaiki Color or AutoGrade, remember that this is applied first and then that result is affected by the color board controls, which are downstream in the signal flow. If you want to apply an effect after the color board correction, then you must add an adjustment layer title generator above your clip and apply that effect within the adjustment layer.

In the example of RED footage, I set the gamma to REDlogFilm for a flatter profile to preserve dynamic range. In FCP X color board correction 1, I’ll make the necessary adjustments to saturation and contrast to restore this to a neutral, but pleasing image. I will do this for all clips in the timeline, being careful to make the shots consistent. I am not applying a “look” at this level.

The next step, color board correction 2, is for establishing the “look”. Here’s where I add a subjective grade on top of color board correction 1. This could be new from scratch or from a preset. FCP X supplies a number of default color presets that you access from the pull-down menu. Others are available to be installed, including a free set of presets that I created for FCP X. If you have a client that likes to experiment with different looks, you might add several color board correction layers here. For instance, if I’m previewing a “cool look” versus a “warm look”, I might do one in color correction 2 and another in color correction 3. Each correction level can be toggled on and off, so it’s easy to preview the warm versus cool looks for the client.

Assuming that color board correction 2 is for the subjective look, then usually in my hierarchy, correction 3 tends to be reserved for a mask to key faces. Sometimes I’ll do this as a key mask and other times as a shape mask. FCP X is pretty good here, but if you really need finesse, then Resolve would be the tool of choice. The objective is to isolate faces – usually in a close shot of your principal talent – and bring skin tones out against the background. The mask needs to be very soft so as not to draw attention to itself. Like most tools, FCP X allows you to make changes inside and outside of the mask. If I isolate a face, then I could brighten the face slightly (inside mask), as well as slightly darken everything else (outside mask).

Depending on the shot, I might have additional correction levels above this, but all placed before the next step. For instance, if I want to darken specific bright areas, like the sun reflecting off of a car hood, I will add separate layers with key or shape masks for each of these adjustments. This goes back to the photographic dodging and burning analogy.

I like adding vignettes to subtly darken the outer edge of the frame. This goes on correction level 4 in our simplest set-up. The bottom line is that it should be the top correction level. The shape mask should be feathered to be subtle and then you would darken the outside of the mask by lowering brightness levels and possibly reducing saturation a little. You have to adjust this by feel and one vignette style will not work for all shots. In fact, some shots don’t look right with a vignette, so you have to use this to taste on a shot-by-shot basis. At this stage it may be necessary to go back to color correction level 2 and adjust the settings in order to get the optimal look, after you’ve done facial correction and vignetting in the higher levels.

If I want any global changes applied after the color correction, then I need to do this using an adjustment layer. One example is a film emulation filter like LUT Utility or FilmConvert. Technically, if the effect should look like film negative, it should be a filter that’s applied before the color board. If the look should be like it’s part of a release print (positive film stock), then it should go after. For the most part, I stick to after (using an adjustment layer), because it’s easier to control, as well as remove, if the client decides against it. Remember that most film emulation LUTs are based on print stock and therefore should go on the higher layer by definition. Of course, other global changes, like additional color correction filters, grain, or a combination of the two, can be added. These should all be done as adjustment layers or track-based effects, for consistent application across your entire timeline.

©2014 Oliver Peters

The FCP X – RED – Resolve Dance

df_fcpx-red-resolve_5

I recently worked on a 10-minute teaser video for a potential longer film project. It was shot with a RED One camera, so it was a great test for the RED workflow and roundtrips using Apple Final Cut Pro 10.1.2/10.1.3 and DaVinci Resolve 11.

Starting the edit

As with any production, the first step is to properly back up and verify the data from the camera and sound cards. These files should go to redundant drives that are parked on the shelf for safe keeping. After this has been done, now you can copy the media to the editorial drives. In this case, I was using a LaCie RAID-5 array. Each day’s media was placed in a folder and divided into subfolders for RED, audio and other cameras, like a few 5D shots.

Since I was using FCP X and its RED and proxy workflows, I opted not to use REDCINE-X Pro as part of this process. In fact, the Mac Pro didn’t have a RED Rocket accelerator card installed either, as I’ve seen conflicts with FCP X and RED transcodes when the RED Rocket card was installed. After the files were copied to the editorial drives, they were imported into an FCP X event, with media left in its original location. In the import setting, the option to transcode proxy media was enabled, which continues in the background while you start to work with the RED files directly. The camera files are 4K 16×9 .r3d files, so FCP X transcodes these to half-sized ProRes Proxy media.

Audio was recorded as double-system sound using a Sound Devices recorder. The audio files were 2-channel broadcast WAV files using slates for syncing. There was no in-camera audio and no common timecode. I was working with a couple of assistant editors, so I had them sync each clip manually. Instead of using FCP X’s synchronized clips, I had them alter each master clip using the “open in timeline” command. This lets you edit the audio directly to the video as a connected clip within the master clip. Once done, your master clip contains synced audio and video. It functions just like a master clip with in-camera audio – almost (more on that later).

All synced clips were relabeled with a camera, scene and take designation, as well as adding this info to the camera, scene and take columns. Lastly, script notes were added to the notes column based on the script supervisor’s reports.

Transcodes

Since the post schedule wasn’t super-tight, I was able to let the transcodes finish overnight, as needed. Once this is done, you can switch FCP X to working with proxies and all the media will be there. The toggle between proxy and/or optimized-original media is seamless and FCP X takes care of properly changing all sizing information. For example, the project is 4K media in a 1080p timeline. FCP X’s spatial conform downscales the 4K media, but then when you toggle to proxy, it has to make the corresponding adjustments to media that is now half-sized. Likewise any blow-ups or reframing that you do also have to match in both modes.
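Here’s a sketch of the sizing bookkeeping involved, using UHD-style numbers for simplicity (actual RED 4K frames are slightly wider):

```python
# Sketch of the sizing bookkeeping FCP X performs when toggling proxies:
# a 4K clip conformed to fit a 1080p timeline versus its half-sized proxy.
def conform_scale(source_w, timeline_w=1920):
    """Scale factor needed to fit the source width into the timeline."""
    return timeline_w / source_w

full_res = conform_scale(3840)       # 0.5: the 4K original is halved
proxy    = conform_scale(3840 / 2)   # 1.0: the half-size proxy fits as-is
print(full_res, proxy)
```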

The built-in proxy/optimized-original workflow provides you with offline/online editing phases right within the same system. Proxies for fast and efficient editing. Original or high-resolution transcodes for finishing. To keep the process fast and initially true to color decisions made on set, no adjustments were made to the RED files. FCP X does let you alter the camera raw color metadata from inside the application, but there’s no real reason to do this for offline editing files. That can be deferred until it’s time to do color correction. So during the edit, you see what the DoP shot as you view the RED files or the transcoded proxies.

We did hit one bad camera load. This might have been due to either a bad RED drive or possibly excessive humidity at that location. No matter what the reason, the result was a set of corrupt RED clips. We didn’t initially realize this in FCP X, and so we hit clips that caused frequent crashes. Once I narrowed it down to the load from that one location, I decided to delete these clips. For that group of shots, I used REDCINE-X Pro to transcode the files. I adjusted the color for a flatter, neutral profile (for later color correction) and transcoded full-resolution debayered 1080p ProRes 4444 files. We considered these as the new camera masters for those clips. Even there, REDCINE-X Pro crashed on a few of the clips, but I still had enough to cut the scene.

Editing

The first editing step is culling down the footage in FCP X. I do a first pass rejecting all bogus shots, like short clips of the floor, a bad slate, etc. Set the event browser to “hide rejected”. Next I review the footage based on script notes, looking at the “circle takes” first, plus picking a few alternates if I have a different opinion. I will mark these as Favorites. As I do this, I’ll select the whole take and not just a portion, since I want to see the whole take.

Once I start editing, I switch the event browser to “show favorites”. In the list view, I’ll sort the event by the scene column, which now gives me a quick roadmap of all possible good clips in the order of the script. During editing, I cut mainly using the primary storyline to build up the piece. This includes all overlapping audio, composites, titles and so on. Cutting proceeds until the picture is locked. Once I’m ready to move on to color correction, I export a project XML in the FCPXML format.

Resolve

I used the first release version (not beta) of DaVinci Resolve 11 Lite to do this grade. My intention was to roundtrip it back to FCP X and not to use Resolve as a finishing tool, since I had a number of keys and composites that were easier done in FCP X than Resolve. Furthermore, when I brought the project into Resolve, the picture was right, but all of the audio was bogus – wrong takes, wrong syncing, etc. I traced this down to my initial “open in timeline” syncing, which I’ll explain in a bit. Anyway, my focus in Resolve was only grading, so audio wasn’t important for what I was doing. I simply disabled it.

Importing the FCPXML file into a fresh Resolve 11 project couldn’t have been easier. It instantly linked the RED, 5D and transcoded ProRes 4444 files and established an accurate timeline for my picture cut. All resizing was accurately translated. This means that in my FCP X timeline, when I blew up a shot to 120% (which is a blow-up of the 1080p image that was downscaled from the 4K source), Resolve knew to take the corresponding crop from the full 4K image to equal this framing of the shot without losing resolution.
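The rough math behind that conform looks like this (again with hypothetical UHD-style dimensions): a 120% blow-up in a 1080p timeline corresponds to a 1/1.2 crop of the full 4K frame, which is still larger than 1920×1080, so no resolution is lost.

```python
# Rough math behind the conform: for a 120% blow-up in a 1080p timeline,
# Resolve can crop the 4K original instead of scaling up a 1080p frame.
def crop_for_blowup(src_w, src_h, scale):
    """Source region that equals a `scale` blow-up of the full frame."""
    return round(src_w / scale), round(src_h / scale)

crop = crop_for_blowup(3840, 2160, 1.2)
print(crop)   # (3200, 1800), still larger than 1920x1080
```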

The one video gotcha I hit was with the FCP X timeline layout. FCP X is one of the only NLEs that lets you place video BELOW what any other software would consider to be the V1 track – that’s the primary storyline. Some of my green screen composite shots were of a simulated newscast inserted on a TV set hanging on a wall in the primary scene. I decided to place the 5 or 6 layers that made up this composite underneath the primary storyline. All fine inside FCP X; however, Resolve has to interpret the lowest video element as V1, thus shifting everything else up accordingly. As a result, the bulk of the video was on V6 or V7 and audio was equally shifted in the other direction. This results in a lot of vertical timeline scrolling, since Resolve’s smallest track height is still larger than most.

Resolve, of course, is a killer grading tool that handles RED media well. My grading approach is to balance out the RED shots in the first node. Resolve lets you adjust the camera raw metadata settings for each individual clip, if you need to. Then in node 2, I’ll do most of my primary grading. After that, I’ll add nodes for selective color adjustments, masks, vignettes and so on. Resolve’s playback settings can be adjusted to throttle back the debayer resolution on playback for closer-to-real-time performance with RED media. This is especially important when you aren’t running the fastest drives or GPU cards, or using a RED Rocket card.

To output the result, I switched over to Resolve’s Deliver tab and selected the FCP X easy set-up. Select handle length, browse for a target folder and run. Resolve is a very fast renderer, even with GPU-based RED debayering, so output didn’t take long for the 130 clips that made up this short. The resulting media was 1080p ProRes HQ with an additional 3 seconds per clip on either side of the timeline cut – all with baked-in color correction. The target folder also contains a new FCPXML that corresponds to the Resolve timeline with proper links to the new media files.
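The handle math is trivial but worth seeing: each rendered clip carries extra media on both sides of the timeline cut (3-second handles at an assumed 24 fps):

```python
# Handle math for the Resolve renders: each rendered clip carries 3 extra
# seconds of media on both sides of the timeline cut (24 fps assumed).
def render_range(clip_in, clip_out, handle_secs=3, fps=24):
    """Frame range to render, padded with handles on each end."""
    handle = handle_secs * fps
    return clip_in - handle, clip_out + handle

print(render_range(240, 360))   # (168, 432)
```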

Roundtrip back into FCP X

Back in FCP X, I make sure I’ve turned off the import preference to transcode proxy media and that my toggle is set back to original/optimized media. Find the new FCPXML file from Resolve and import it. This will create a new event containing a new FCP X project (edited sequence), but with media linked to the Resolve render files. Audio is still an issue, for now.

There is one interesting picture glitch, which I believe is a bug in the FCPXML metadata. In the offline edit, using RED or proxy media, spatial conform is enabled and set to “fit”. That scales the 4K file to a 1080p timeline. In the sequence back from Resolve, I noticed the timeline still had yellow render bars. When I switched the spatial conform setting on a clip to “none”, the render bar over it went away, but the clip blew up much larger, as if it was trying to show a native 4K image at 1:1. Except that this was now 1080p media and NOT 4K. Apparently this resizing metadata is incorrectly held in the FCPXML file and there doesn’t appear to be any way to correct this. The workaround is to simply let it render, which didn’t seem to hurt the image quality as far as I could tell.

Audio

Now to an explanation of the audio issue. FCP X master clips are NOT like the master clips in other NLEs, including FCP 7. X’s master clips are simply containers for audio and video essence and, in that way, are not unlike compound clips. Therefore, you can edit, add and/or alter – even destructively – any material inside a master clip when you use the “open in timeline” function. You have to be careful. That appears to be the root of the XML translation issue with the audio. Of course, it all works fine WITHIN the closed FCP X environment!

Here’s the workaround. Start in FCP X. In the offline edited sequence (locked rough cut) and the sequence from Resolve, detach all audio. Delete audio from the Resolve sequence. Copy and paste the audio from the rough cut to the Resolve sequence. If you’ve done this correctly it will all be properly synced. Next, you have to get around the container issue in order to access the correct WAV files. This is done simply by highlighting the connected audio clip(s) and using the “break apart clip items” command. That’s the same command used to break apart compound clips into their component source clips. Now you’ll have the original WAV file audio and not the master clip from the camera.

At this stage I still encountered export issues. If your audio mixing engineer wants an OMF for an older Pro Tools system, then you have to go through FCP 7 (via an Xto7 translation) to create the OMF file. I’ve done this tons of times before, but for whatever reason on this project, the result was not usable. An alternative approach is to use Resolve to convert the FCPXML into XML, which can then be imported into FCP 7. This produced an accurate translation, except that the Resolve export collapsed all stereo and multi-channel audio tracks into a single mono track. Therefore, a Resolve translation was also a fail. At this point in time, I have to say that a proper OMF export from FCP X-edited material is either no longer an option or unreliable at best.

This leaves you with two options. If your mixing engineer uses Apple Logic Pro X, then that application appears to correctly import and convert the native FCPXML file. If your mixer uses Pro Tools (a more likely scenario), then newer versions will read AAF files. That’s the approach I took. To create an AAF, first export an FCPXML from the project file. Then, using the X2Pro Audio Convert application, generate an AAF file with embedded and trimmed audio content. This goes to the mixer, who in turn can ingest the file into Pro Tools.

Once the mix has been completed, the exported AIF or WAV file of the mix is imported into FCP X. Strip off all audio from the final version of the FCP X project and connect the clip of the final mix to the beginning of the timeline. Now you are done and ready to export deliverables.

For more on RED and FCP X workflows, check out this series of posts by Sam Mestman at MovieMaker.

Part 1   Part 2   Part 3

©2014 Oliver Peters