Apple Final Cut Pro X 10.4

December finally delivered the much-anticipated simultaneous release of new versions of Apple Final Cut Pro X, Motion, and Compressor – all on the same day as the iMac Pro officially went on sale. In the broader ecosystem, we also saw updates for macOS High Sierra, Logic Pro X, Pixelmator Pro, and Blackmagic Design DaVinci Resolve.

Final Cut Pro X (“ten”) version 10.4 is the fifth major release of Apple’s professional NLE in a little over six years. There are changes under the hood tied to technologies in High Sierra (macOS 10.13) that won’t get much press, but are very important to the development and operation of the application. This version will still run on a wide range of recent and older Macs. The minimum OS requirement is 10.12.4, but 10.13 or later is recommended. There are four new, marquee features in this release: advanced color correction tools, 360° editing, HDR (wide gamut) color space support, and HEVC/H.265 codec support for editing and encoding.

New advanced color tools

Final Cut Pro X was first launched with a color correction tool called the color board. It substituted sliders on a color swatch for the standard curves and color wheel controls that editors had been used to. While the color board was and is effective (and more capable than it looks), it was an instant turn-off for many. The lack of a more advanced color correction interface opened the field for third-party color correction plug-in developers, who came up with some great tools. With the release of FCPX 10.4, it’s hard for me to see why FCPX diehards would still buy a color correction plug-in. Yet, I have heard from at least one plug-in developer that sales of its color correction plug-in have stayed stable. Clearly users want choice and that’s a good thing.

With this update, you gain three new, native color tools: color wheels, curves, and hue vs. saturation curves. All are elegantly designed, operate quite fluidly, and generally mimic what you can do in DaVinci Resolve. However, the color board didn’t go away. There’s a preference setting for which of these four color tools is the default effect when first applying color correction (CMD+6).

Once you start color correcting, you can add more instances of any of these four tools in any combination. Final Cut Pro X sports robust performance, so you can apply several layers of correction to a clip and still have real-time playback without rendering. There are also additional keyboard commands to quickly step through effects or clips on your timeline. While not quite as fluid a grading workflow as you’d have in a true color correction application, like Resolve, you can get pretty close with some experience. My biggest beef is that the controls are locked within the inspector pane. You can’t move them around and there is no dedicated color correction workspace. So for me, the ergonomics are poor. In my testing, I’ve also hit some flaws in how the processing is done (more on that in a future post). Ironically, the color board actually seems to achieve more accurate correction than the color wheels.

There are a few quirks. Previously created presets for the color board will be converted into color preset effects, which now appear in the effects browser. This enables you to preview a color preset applied to a clip by skimming over the effect thumbnail. Unfortunately, I found this conversion didn’t always work. On a Sierra machine (10.12), the older presets were automatically converted after waiting a few minutes; however, nothing happened on a High Sierra machine (10.13). I eventually resorted to copying my converted effects presets from the Sierra Mac over to the High Sierra Mac. I suspect that because the High Sierra update automatically reformats the internal SSD to the new Apple File System (APFS), this conversion process is somehow impeded. Of course, if you don’t have any existing custom presets, then it’s not an issue.

(You can check out my previously-created color presets for instructions and downloads here.)

There is no control surface support yet, although future support for third party color correction controllers has been alluded to. It would be nice to see support for Tangent or Avid panels at the very least. There’s a new FCPXML version (1.7) that includes this new color metadata; however, it doesn’t seem to be imported into the newest version of Resolve. It’s possible that color metadata in the FCPXML file is only intended for FCPX-to-FCPX transfers and not round tripping to other applications.

360° editing

Let me say up front that this doesn’t hit my hot button. It’s an area where Apple is playing catch-up to Adobe and, quite frankly, for both companies it only appeals to a small percentage of users. Not all 360° formats are supported. Your footage must be equirectangular (a stitched panorama) so that FCPX can properly correct its display. Nevertheless, if you do work on 360° productions, then FCPX provides you a nice tool kit.
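
For those new to the format, equirectangular simply means the full 360° sphere has been unwrapped onto a 2:1 frame: horizontal position maps to longitude and vertical position maps to latitude. Here is a minimal sketch of that general mapping in Python – purely illustrative geometry, not a description of how FCPX implements its 360° correction.

```python
import math

def equirect_to_direction(x, y, width, height):
    """Map a pixel in an equirectangular frame to a viewing direction.

    Assumes a full 360 x 180 degree panorama: x spans the full longitude,
    y spans pole to pole. Returns (longitude, latitude) in degrees.
    """
    lon = (x / width - 0.5) * 360.0   # -180 (left edge) .. +180 (right edge)
    lat = (0.5 - y / height) * 180.0  # +90 (zenith) .. -90 (nadir)
    return lon, lat

# Example: the center of a 4096x2048 frame looks straight ahead (0, 0)
print(equirect_to_direction(2048, 1024, 4096, 2048))
```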

You can set up your timeline sequence for monoscopic or stereoscopic 360° editing. Once set up, simply open a separate 360° viewer, side-by-side with the normal viewer. When you do this, you’ll see the uncorrected image on the right and the adjusted point-of-view image on the left. What’s really cool is that you can play the timeline and actively navigate your view of the content within the 360° viewer, without ever stopping playback. Plus, I’m talking about 4K material here! Clearly the engineers have tweaked the performance and not just integrated a plug-in.

There is also a set of custom effects designed for seamless use on 360° images. For example, if you apply a standard blur, there will be a visible seam where the left and right edges meet. If you apply a 360° blur effect, the image and effect are properly blended. To get the full effect, attach an HTC Vive VR headset and view clips in full 360°. Want to test this, but don’t have any footage? A quick web search will turn up plenty of downloadable, equirectangular clips to play with.

Wide gamut / high dynamic range (HDR)

Apple is trying to establish leadership with the integration of workflows to support HDR editing. I suspect that their ultimate goal is proper HDR support for Apple TV 4K and the iPhone X. The state of HDR today is very confusing, without any firm standards. There are DolbyVision and HDR10, an open standard. The latter leaves the actual implementation up to manufacturers, while Dolby licenses its technology with tight specs. The theoretical DolbyVision brightness standard is 10,000 nits (cd/m²), but Dolby’s current target is only 4,000 nits. HDR10 caps at 1,000 nits. Current consumer TV sets run in the 300 to 500 nit range, with none exceeding 1,000 nits. Finally, projected brightness in movie theaters is even lower.

To work in HDR within Final Cut Pro X, first set up the FCPX Library as wide instead of standard gamut. Then set the Project (sequence) to one of four standards: Rec 709 (standard dynamic range), Rec 2020, Rec 2020 PQ, or Rec 2020 HLG. The first Rec 2020 mode simply preserves the full dynamic range of log-encoded camera files when FCPX applies its LUTs. The PQ and HLG options are designed for DolbyVision and/or HDR10 mastering. HDR tools are provided to go between color spaces, such as mastering in Rec 2020 PQ and delivering in Rec 709 (consult Apple’s workflow document). However, it is only in the Rec 2020 PQ color space that the FCPX scope will display in nits, rather than IRE. When set to nits, the scale is 0 to 10,000 nits instead of 0 to 120 IRE.
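
To make that nits scale more concrete, the PQ curve (SMPTE ST 2084) maps absolute luminance up to 10,000 nits onto signal values non-linearly, devoting roughly half of the code range to the first 100 nits of brightness. Below is a small sketch of the standard PQ encoding formula in Python – purely illustrative of the math behind the scale, not how FCPX computes its scope.

```python
def nits_to_pq(nits):
    """Encode absolute luminance (cd/m^2, i.e. nits) to a normalized
    PQ signal value per SMPTE ST 2084. 0.0 = black, 1.0 = 10,000 nits."""
    m1 = 2610 / 16384        # 0.1593017578125
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875

    y = max(nits, 0.0) / 10000.0          # normalize to the 10,000-nit ceiling
    ym = y ** m1
    return ((c1 + c2 * ym) / (1 + c3 * ym)) ** m2

# Reference points: SDR diffuse white (~100 nits) and common HDR display peaks
for level in (100, 1000, 4000, 10000):
    print(level, round(nits_to_pq(level), 3))
```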

To edit in one of these wide gamut color spaces, set your preferences to display HDR as raw values. Final Cut then interacts with the color profile of the monitor through macOS to effectively dim the viewer image for the new color space. However, this technique is not applied to the filmstrips and thumbnail images in the browser, which will appear with blown-out levels unless you manually override the color space setting for each clip. If your footage was shot camera raw or log-encoded, using a RED, ARRI, or similar camera, then you are ready to work in HDR today.

It’s critical to note that no current computer display or consumer flat panel will give you an accurate HDR image to grade by. This includes the new iMac Pro screens. You will need the proper AJA I/O hardware and a calibrated HDR display to see a true HDR image. Even then, it’s still a question of which HDR level you are trying to master to. For example, if you are using the FCPX scope with a brightness scale up to 10,000 nits, but your target display can only achieve 1,000 nits, then what good is the reading on the scope? We are still early in the HDR process, but I’m concerned that FCPX 10.4 will give users a false impression of what it really takes to do HDR properly.

HEVC / H.265

Support for the H.265 (HEVC) codec has been added with this release, but you’ll need to be on High Sierra. You can also now import iMovie for iOS projects into FCPX 10.4. If you shot video with an iPhone X and started organizing it in iMovie on the phone, that video may have used the H.265 codec. Now you can bring it into FCPX to continue the job.

Going the other way requires Compressor encoding. HEVC is also the required format for sending HDR material to the web. Apple is late to the game with H.265 support, as Sorenson and Adobe users have been able to encode it for a while. I tested H.265 encoding of short clips in Compressor on my mid-2014 Retina MacBook Pro and it was extremely slow; there was no issue with H.264 encoding. The same H.265 test in Adobe Media Encoder – even when uprezzing a 1080p file to 4K – was significantly faster than Compressor.
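
If you want to experiment with HEVC encoding outside of Compressor or Adobe Media Encoder, a command-line encode is another option. The sketch below is a rough illustration using ffmpeg from Python – it assumes an ffmpeg build with the libx265 software encoder installed and uses hypothetical filenames; it is not part of the Apple workflow described above.

```python
import subprocess

# Hypothetical filenames; assumes an ffmpeg build that includes libx265.
source = "master_prores.mov"
output = "delivery_hevc.mp4"

subprocess.run([
    "ffmpeg",
    "-i", source,
    "-c:v", "libx265",        # software HEVC encoder (CPU-bound, can be slow)
    "-preset", "medium",      # speed vs. efficiency trade-off
    "-crf", "22",             # quality-based rate control
    "-tag:v", "hvc1",         # tag the stream so QuickTime/Apple players recognize it
    "-c:a", "aac", "-b:a", "192k",
    output,
], check=True)
```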

Final thoughts

For current users: when you update to Final Cut Pro X 10.4, remember that it will update each FCPX library file that you open afterwards. Although this has generally been harmless for most users, it’s best to follow some precautions. Zip your 10.3 (or earlier) version of the application and move that .zip file out of the Applications folder before you update. Archive all of your existing Final Cut libraries. This way you can find your way back in case of some type of failure.

Final Cut Pro X 10.4 is a solid upgrade that will have loyal FCPX users applauding. Overall, these new tools are useful and, as before, FCPX is a very fluid, enjoyable editing application. It slices through 4K content better than any other NLE on the Mac platform. If you like its editing paradigm, then nothing else comes close.

Unfortunately, Apple didn’t squash some long-standing bugs. For example, numerous users online are still complaining about the issue where browser text intermittently disappears. I do feel that there were missed opportunities. The functionality of audio lanes – a feature introduced in 10.3 as a way to get closer to track-style audio mixing – hasn’t been expanded. The hope for an enhanced, roles-based audio mixer has once again gone unanswered. On the other hand, the built-in audio plug-ins have been updated to those used by Logic Pro X and there’s a clean path to send your audio to Logic if you want to mix there.

I definitely welcome these updates. The new color tools make it a more powerful application to use for color grading, so I’m happy to see that Apple has been listening. Now, I hope that we’ll see some of the other needs addressed before another year passes us by.

Originally written for Digital Video magazine / Creative Planet Network

©2017, 2018 Oliver Peters


Downsizing

The bond between a film director and the editor is often a long-lasting one. The industry is full of pairings that continue film after film. One such duo is director Alexander Payne (Nebraska, The Descendants, Sideways) and editor Kevin Tent (Welcome to Me, Girl Interrupted, Election). Tent has edited every film that Payne directed, with the exception of Payne’s short film Paris, je t’aime. In fact, Payne also served as producer for Crash Pad, a film directed by Tent.

The latest Alexander Payne film to hit the cinemas is Downsizing, a sci-fi satire starring Matt Damon, Christoph Waltz, and Kristen Wiig. In the film, scientists discover human miniaturization as a way to combat overpopulation. Paul (Matt Damon) and Audrey (Kristen Wiig) decide to give it a try, exchanging their average life in Omaha for Leisure Land, one of the ‘micro-communities’ sprouting up. Their modest $150,000 in personal assets will make them multimillionaires, so they take the plunge.

Sci-fi and satire

The sci-fi genre is a new approach for Payne, which is where I started my conversation with Kevin Tent. He explains, “The sci-fi theme is a departure for Alexander, but this is still very much an ‘Alexander Payne movie’. It’s still about the human experience. In the plot, shrinking is seen as a way to save the human race, but people get greedy. They can make themselves instantly rich, save money on food, medicine, and move into big ‘McMansions’. Human nature takes over, which makes the film funny and also thought-provoking. It covers a lot of ground and politics.”

“It’s easy to ask, why sci-fi,” Tent continues. “Alexander Payne is an artist who is always looking for ways to challenge himself. He co-wrote the script ten years ago, but it took this long to get it made. For one thing, Downsizing is more expensive than his past films. As an editor, I first looked at the cutting differently, because of working with the visual effects; but, I quickly realized that this film, like Alexander’s others, was about the characters and the story.  [Those are] still the most important elements of the movie. I had recently worked on The Audition, which was shot mostly with green screen – and a while back, The Golden Compass, which was a serious visual effects movie.  I had enough knowledge about the process to know one thing. These people can do anything! We had a terrific VFX team, headed by our creative guru, Jamie Price. ILM and Framestore did most of the visual effects.”

Digital production to aid the process

Alexander Payne shifted to digital acquisition with Nebraska and has followed suit with his latest, Downsizing. According to Tent, “Alexander shoots a lot of coverage, so he likes digital for that. It’s also easier to deal with when compositing visual effects. We had over 130 hours of total footage. Of course, a fairly good chunk was plates for VFX and 2nd unit footage. Most of the scenes were shot with single camera, but sometimes with multi-cam. Especially for some of the big speeches, which were covered with two and sometimes three cameras. We synced up the takes in the Avid, which makes it so easy to switch from camera to camera. Mindy Elliot is our amazing first assistant. She’s a total pro and a total joy to work with. She’s been running our cutting rooms since The Descendants. Angela Latimer was our second. She did 99% of the scripting [for Avid’s ScriptSync feature] and also helped cut early versions of Paul’s drug montage [scene in Downsizing]. Joe Carson was our VFX editor. I met him while working on Sponge Bob The Movie. I was one of the live action CGI editors on that film. Joe is awesome. He not only kept all of our visual effects organized, but he was also kept busy with the countless comps, morphs, and speed-ups that we tossed at him on a daily basis.”

Production wrapped in mid-August 2016 and then Tent started cutting with Payne right after Labor Day. Tent continues, “When I cut with Alexander, we basically start from scratch. I do create an editor’s cut during production, which we go back to for reference during our time together cutting, but it isn’t the starting point when I begin with Alexander. He’s a good editor, so when we work together, it’s really like having two editors in the room. We start watching dailies and start building scenes. We often look back at my editor’s cut and realize the scene or a part of it was better in that earlier version. Or maybe not. If there is something we like, we’ll put it back into the current cut.  We completed our first pass (kind of a director’s assembly) in January to show the studio. By early to mid-July we had a locked cut with about 80% of the completed VFX shots. The remainder trickled in afterwards. All together, that’s about ten or eleven months of cutting and finishing. Our DI/color grading was handled by the amazing Skip Kimball at Technicolor.”

Tools and tips

As a fellow editor, it’s always fun to talk about the tools and how to use them on a feature film project. Kevin Tent is a committed Avid Media Composer user. (Pacific Post provided the Avid systems used by the editing team.) According to Tent, “This was a huge project and Media Composer never had a problem with it.” One unique hallmark of Media Composer is Avid’s Script Integration. Notable within it is ScriptSync, Media Composer’s ability to automatically analyze waveforms and synchronize them – and, therefore, the associated clip – against text that has been input, like a film script. When correctly indexed, simply clicking on a line of dialogue in the on-screen script brings up all of the corresponding coverage. An ongoing licensing dispute limited its use to older versions of Media Composer, until the issue was finally resolved this year. That is great news for devotees of Avid’s powerful ScriptSync capability.

Many film editors swear by Avid’s Script Integration tools, yet some never use them at all. Was Tent a ScriptSync user? “Hell, ya!,” is his instant reply. “We stayed on Media Composer 7.0.6, because of the ScriptSync licensing issue, just so we could use it. I had Angela mark a lot of extra material and ad libs in addition to the scripted dialog. For example, an action like Paul opening a door or something like that. That would help, especially if they shot a lot of takes or resets within one bigger take, which tends to happen a lot when the shooting is on digital. There’s a massive party scene midway through the movie with people dancing, smoking pot, that kind of thing, and I asked Angela to add a ton of detail describing the scene. It made finding specific actions so quick. It’s also an especially great aid at re-cutting scenes when you are looking for alternate coverage.”

Another aid that editors like is to place scene cards on the wall. Typically these are 3”x5” note cards with written scene descriptions – one for each scene – that can be pinned to the wall in the order of the ongoing edit. Although Tent is also a proponent of these – a remnant practice from the old film days – his Downsizing cutting room didn’t have enough wall space to accommodate cards.

The Downsizing script clocked in a tad long and the first assembly that Payne and Tent cut was 2:45 (final length was 2:08). Obviously the team needed to do a bit of “downsizing” themselves. Tent explains, “The biggest lost scenes were bookending storyteller elements to open and close the film. There was an old caveman from far in the future telling a group of children about the events within the film and how once giants roamed the world. This story element was painful to lose, because it was very funny and effective emotionally. But it took an added three or four minutes to get to Matt Damon’s character and that hurt us.  The audience wants you to get to your main characters and understand what they’re seeing within a reasonable amount of time. Fortunately, Alexander hadn’t shot it yet as part of the main production. We previewed with storyboards, temp music, and voice over. While it was tough to lose it from the point of view of the script, we weren’t leaving produced material ‘on the cutting room floor’. Ultimately if you don’t know it was there, you won’t miss not having it.”

Downsizing opened in cinemas on December 21. Whether you are in it for the thought-provoking concepts or simply a lot of laughs and a wild ride, it’s a film to enjoy. Alexander Payne is bound to have another success on his hands.

Originally written for Digital Video magazine / Creative Planet Network

© 2017, 2018 Oliver Peters

Stocking Stuffers 2017

It’s holiday time once again. For many editors that means it’s time to gift themselves with some new tools and toys to speed their workflows or just make the coming year more fun! Here are some products to consider.

Just like the tiny house craze, many editors are opting for their laptops as their main editing tool. I’ve done it for work that I cut when I’m not freelancing in other shops, simply because my MacBook Pro is a better machine than my old (but still reliable) 2009 Mac Pro tower. One less machine to deal with, which simplifies life. But to really make it feel like a desktop tool, you need some accessories along with an external display. For me, that boils down to a dock, a stand, and an audio interface. There are several stands for laptops. I bought both the Twelve South BookArc and the Rain Design mStand: the BookArc for when I just want to tuck the closed MacBook Pro out of the way in the clamshell mode and the mStand for when I need to use the laptop’s screen as a second display. Another option some editors like is the Vertical Dock from Henge Docks, which not only holds the MacBook Pro, but also offers some cable management.

The next hardware add-on for me is a USB audio interface. This is useful for any type of computer and may be used with or without other interfaces from Blackmagic Design or AJA. The simplest of these is the Mackie Onyx Blackjack, which combines interface and output monitor mixing into one package. This means no extra small mixer is required. USB input and analog audio output direct to a pair of powered speakers. But if you prefer a separate small mixer and only want a USB interface for input/output, then the PreSonus Audiobox USB or the Focusrite Scarlett series is the way to go.

Another ‘must have’ with any modern system is a Thunderbolt dock to expand the native port connectivity of your computer. There are several on the market, but it’s hard to go wrong with either the CalDigit Thunderbolt Station 2 or the OWC Thunderbolt 2 Dock. Make sure you double-check which version fits your needs, depending on whether you have a Thunderbolt 2 or 3 connection and/or USB-C ports. I routinely use both the CalDigit and OWC products. The choice simply depends on which one has the right combination of ports for you.

Drives are another issue. With a small system, you want small portable drives. While LaCie Rugged and G-Technology portable drives are popular choices, SSDs are the way to go when you need true, fast performance. A number of editors I’ve spoken to are partial to the Samsung Portable SSD T5 drives. These USB 3.0-compatible drives aren’t the cheapest, but they are ultraportable and offer amazing read/write speeds. Another popular solution is to use raw (uncased) drives in a drive caddy/dock for archiving purposes. Since they are raw, you don’t pay for the extra packaging, power supply, and interface electronics with each drive, just to have it sit on the shelf. My favorite of these is the HGST Deskstar NAS series.

For many editors the software world is changing with free applications, subscription models, and online services. The most common use of the latter is for review-and-approval, along with posting demo clips and short films. Kollaborate.tv, Frame.io, Wipster.io, and Vimeo are the best known. There are plenty of options and even Vimeo Pro and Business plans offer a Frame/Wipster-style review-and-approval and collaboration service. Plus, there’s some transfer ability between these. For example, you can publish to a Vimeo account from your Frame account. Another expansion of the online world is in team workgroups. A popular solution is Slack, which is a workgroup-based messaging/communication service.

As more resources move online, the benefits of large-scale computing horsepower become available to even a single editor. One of the first of these new resources is cloud-based, speech-to-text transcription. A number of online services provide this functionality to any NLE. Products to check out include Scribeomatic (Coremelt), Transcriptive (Digital Anarchy), and Speedscriber (Digital Heaven). They each offer different pricing models and speech analysis engines. Some are still in beta, but one that’s already out is Speedscriber, which I’ve used and am quite happy with. Processing is fast and reasonably accurate, given a solid audio recording.

Naturally, free tools make every user happy, and the king of the hill is Blackmagic Design with DaVinci Resolve and Fusion. How can you go wrong with something this powerful, free, and under ongoing development? Even the paid versions with more advanced features are low cost. At the very least, the free version of Resolve should be in every editor’s toolkit, because it’s such a Swiss Army Knife of an application.

On the other hand, editors who need to learn Avid Media Composer need look no further than the free Media Composer | First. Avid has tried ‘dumbed-down’ free editing apps before, but First is actually built off of the same code base as the full Media Composer software. Thus, skills translate and most of the core functions are available for you to use.

Many users are quite happy with the advantages of Adobe’s Creative Cloud software subscription model. Others prefer to own their software. If you work in video, then it’s easy to put together alternative software kits for editing, effects, audio, and encoding that don’t touch an Adobe product. Yet for most, the stumbling block is Photoshop – until now. Both Affinity Photo (Serif) and Pixelmator Pro are full-fledged graphic design and creation tools that rival Photoshop in features and quality. Each of these has its own strong points. Affinity Photo offers Mac and Windows versions, while Pixelmator Pro is Mac only, but taps more tightly into macOS functions.

If you work in the Final Cut Pro X world, several utilities are essential. These include SendToX and XtoCC from Intelligent Assistance, along with X2Pro Audio Convert from Marquis Broadcast. Marquis’ newest is Worx4 X – a media management tool. It takes your final sequence and creates a new FCPX library with consolidated (trimmed) media. No transcoding is involved, so the process is lightning fast, although in some cases media is copied without being trimmed. This can reduce the media to be archived from TBs down to GBs. They also offer Worx4 Pro, which is designed for Premiere Pro CC users. This tool serves as a media tracking application, letting editors find all of the media used in a Premiere Pro project across multiple volumes.

Most editors love to indulge in plug-in packages. If you can only invest in a single, large plug-in package, then BorisFX’s Boris Continuum Complete 11 and/or their Sapphire 11 bundles are the way to go. These are industry-leading tools with wide host and platform support. Both feature mocha tracking integration and Continuum also includes the Primatte Studio chromakey technology.

If you want to go for a build-it-up-as-you-need-it approach – and you are strictly on the Mac – then FxFactory will be more to your liking. You can start with the free, basic platform or buy the Pro version, which includes FxFactory’s own plug-ins. Either way, FxFactory functions as a plug-in management tool. FxFactory’s numerous partner/developers provide their products through the FxFactory platform, which functions like an app store for plug-ins. You can pick and choose the plug-ins that you need when the time is right to purchase them. There are plenty of plug-ins to recommend, but I would start with any of the Crumplepop group, because they work well and provide specific useful functions. They also include the few audio plug-ins available via FxFactory. Another plug-in to check out is the Hawaiki Keyer 4. It installs into both the Apple and Adobe applications and far surpasses the built-in keying tools within these applications.

The Crumplepop FxFactory plug-ins now include Koji Advance, which is a powerful film look tool. I like Koji a lot, but prefer FilmConvert from Rubber Monkey Software. To my eyes, it creates one of the more pleasing and accurate film emulations around and even adds a very good three-way color corrector. It opens as a floating window inside FCPX, which is less obtrusive than some of the other color correction plug-ins for FCPX. It’s not just for film emulation – you can actually use it as the primary color corrector for an entire project.

I don’t want to forget audio plug-ins in this end-of-the-year roundup. Most editors don’t feel too comfortable with a ton of surgical audio filters, so let me stick to suggestions that are easy to use and very affordable. iZotope is a well-known audio developer and several of its products are perfect for video editors. These fall into repair, mixing, and mastering needs, covered by the Nectar, Ozone, and RX bundles, along with RX Loudness Control. The first three are designed to cover a wide range of needs and, like the BCC video plug-ins, amount to an all-encompassing product offering. But if that’s a bit rich for the blood, then check out iZotope’s various Elements versions.

The iZotope RX Loudness Control is great for accurate loudness compliance, and best used with Avid or Adobe products. However, it is not real-time, because it uses analysis and adaptive processing. If you want something more straightforward and real-time, then check out the LUFS Meter from Klangfreund. It can be used for loudness control on individual tracks or the master output. It works with most of the NLEs and DAWs. A similar tool to this is Loudness Change from Videotoolshed.
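
As a point of reference for what these loudness tools are doing, loudness targets are expressed in LUFS, and the simplest correction is a static gain equal to the difference between the measured and target values. Here’s a toy sketch of that arithmetic in Python – illustrative only, since real loudness normalization (like RX Loudness Control performs) must also respect true-peak limits.

```python
def static_loudness_gain(measured_lufs, target_lufs=-23.0):
    """Return the gain (in dB and as a linear factor) needed to move a mix
    from its measured integrated loudness to a target, e.g. EBU R128's -23 LUFS.
    Ignores true-peak limiting, which a real loudness tool must also apply."""
    gain_db = target_lufs - measured_lufs
    linear = 10 ** (gain_db / 20.0)
    return gain_db, linear

# Example: a mix measuring -18 LUFS needs -5 dB of gain (about 0.56x) to hit -23 LUFS.
print(static_loudness_gain(-18.0))
```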

Finally, let’s not forget the iOS world, which is increasingly becoming a viable production platform. For example, I’ve used my iPad in the last year to do location interview recordings. This is a market that audio powerhouse Apogee has also recognized. If you need a studio-quality hardware interface for an iPhone or iPad, then check out the Apogee ONE. In my case, I tapped the Apogee MetaRecorder iOS application for my iPad, which works with both Apogee products and the iPad’s built-in mic. It can be used in conjunction with FCPX workflows through the integration of metadata tagging for Keywords, Favorites, and Markers.

Have a great holiday season and happy editing in the coming year!

©2017 Oliver Peters

Audio Splits and Stems in Premiere Pro Revisited

Creating multichannel, “split-track” master exports of your final sequences is something that should be a standard step in all of your productions. It’s often a deliverable requirement and having such a file makes later revisions or derivative projects much easier to produce. If you are a Final Cut Pro X user, the “audio lanes” feature makes it easy to organize and export sequences with isolated channels for dialogue, music, and effects. FCPX pros like to tweak the noses of other NLE users about how much easier it is in FCPX. While that’s more or less true – and, in fact, can be a lot deeper than simply a few aggregate channels – that doesn’t mean it’s particularly hard or less versatile in Premiere Pro.

Last year I wrote about how to set this up using Premiere submix tracks, which is a standard audio post workflow, common to most DAW and mix applications. Go back and read the article for more detail. But, what about sequences that are already edited, which didn’t start with a track configuration already set up with submix tracks and proper output routing? In fact, that’s quite easy, too, which brings me to today’s post.

Step 1 – Edit

Start out by editing as you always have, using your standard sequence presets. I’ve created a few custom presets that I normally use, based on the several standard formats I work in, like 1080p/23.976 and 1080p/29.97. These typically require stereo mixes, so my presets start with a minimum configuration of one picture track, two standard audio tracks, and stereo output. This is the starting point, but more video and audio tracks get added, as needed, during the course of editing.

Get into a habit of organizing your audio tracks. Typically this means dialogue and VO tracks towards the top (A1-A4), then sound effects (A5-A8), and finally music (A9-A12). Keep like audio types on their intended tracks. What you don’t want to do is mix different audio types onto the same track. For instance, don’t put sound effects onto tracks that you’ve designated for dialogue clips. Of course, the number of actual tracks needed for these audio types will vary with your projects. A simple VO+music sequence will only have two to four tracks, while dramatic entertainment pieces will have a lot more. Delete all empty audio tracks when you are ready to mix.

Mix for stereo output as you normally would. This means balancing components using keyframes and clip mixing. Then perform overall adjustments and “riding faders” in the track mixer. This is also where I add global effects, like compression for dialogue and limiting for the master mix.

Output your final mixed master file for delivery.

Step 2 – Multichannel DME sequences

The next step is to create or open a new multichannel DME (dialogue/music/effects) sequence. I’ve already created a custom preset, which you may download and install. It’s set up as 1080p/23.976, with two standard audio channels and three pre-labeled stereo submix channels, but you can customize yours as needed. The master output is multichannel (8-channel), which is sufficient to cover stereo pairs for the final mix, plus isolated pairs for each of the three submixes – dialogue, music, and effects.

Next, copy-and-paste all clips from your final stereo sequence to the new multichannel sequence. If you have more than one track of picture and two tracks of audio, the new blank sequence will simply auto-populate more tracks once you paste the clips into it. The result should look the same, except with the additional three submix tracks at the bottom of your timeline. At this stage, the output of all tracks is still routed to the stereo master output and the submix tracks are bypassed.

Now open the track mixer panel and, from the pulldown output selector, switch each channel from master to its appropriate submix channel. Dialogue tracks to DIA, music tracks to MUS, and effects tracks to SFX. The sequence preset is already set up with proper output routing. All submixes go to output 1 and 2 (composite stereo mix), along with their isolated output – dialogue to 3 and 4, effects to 5 and 6, music to 7 and 8. As with your stereo mix, level adjustments and plug-in processing (compression, EQ, limiting, etc.) can be added to each of the submix channels.

Note: while not essential, multichannel, split-track master files are most useful when they are also textless. So, before outputting, I would recommend disabling all titles and lower third graphics in this sequence. The result is clean video – great for quick fixes later in the event of spelling errors or a title change.

Step 3 – Multichannel export

Now that the sequence is properly organized, you’ve got to export the multichannel sequence. I have created a mastering export preset, which you may also download. It works in the various Adobe CC apps, but is designed for Adobe Media Encoder workflows. This preset matches its output to the video size and frame rate of your sequence and masters to a file using the ProRes 4444 codec. The audio is set for eight output channels, configured as four stereo pairs – composite mix, plus three DME channels.

To test your exported file, simply reimport the multichannel file back into Premiere Pro and drop it onto a timeline. There you should see four independent stereo channels with audio organized according to the description above.
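
If you prefer to verify or pull stems outside of Premiere, command-line tools work too. The sketch below uses ffprobe/ffmpeg from Python with a hypothetical filename; it assumes the preset wrote the audio as four discrete stereo streams (mix, DIA, SFX, MUS) – check your own file first, since a preset could also write a single eight-channel stream.

```python
import subprocess

master = "dme_master.mov"   # hypothetical filename for the multichannel export

# First, inspect how the audio channels were written (one 8-channel stream
# vs. four stereo streams) -- export presets differ.
subprocess.run([
    "ffprobe", "-v", "error",
    "-show_entries", "stream=index,codec_type,channels",
    "-of", "csv=p=0", master,
])

# If the file contains four separate stereo streams (mix, DIA, SFX, MUS),
# each one can be pulled out to its own WAV with a stream map:
stems = {"mix": "0:a:0", "dialogue": "0:a:1", "effects": "0:a:2", "music": "0:a:3"}
for name, stream in stems.items():
    subprocess.run([
        "ffmpeg", "-y", "-i", master,
        "-map", stream, "-c:a", "pcm_s24le",
        f"{name}_stem.wav",
    ], check=True)
```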

Presets

I have created a sequence and an export preset, which you may download here. I have only tested these on Mac systems, where they are installed into the Adobe folder contained within the user’s Documents folder. The sequence preset is placed into the Premiere Pro folder and the export preset into the Adobe Media Encoder folder. If you’ve updated the Adobe apps along the way, you will have a number of version subfolders. As of December 2017, the 12.0 subfolder is the correct location. Happy mixing!

©2017 Oliver Peters

Avid Media Composer | First

They’ve teased us for two years, but now it’s finally out. Avid Technology has released its free nonlinear editing application, Media Composer | First. This is not dumbed-down, teaser software, but rather a partially-restricted version of the full-fledged Media Composer software and built upon the same code. With that comes an inherent level of complexity, which Avid has sought to minimize for new users; however, you really do want to go through the tutorials before diving in.

It’s important to understand who the target user is. Avid didn’t set out to simply add another free, professional editing tool to an increasingly crowded market. Media Composer | First is intended as a functional starter tool for users who want to get their feet wet in the Avid ecosystem, but then eventually convert to the full-fledged, paid software. That’s been successful for Avid with Pro Tools | First. To sweeten the pot, you’ll also get 350 sound effects from Pro Sound Effects and 50 royalty-free music tracks from Sound Ideas (both sets are also free).

Diving in

To get Media Composer | First, you must set up an Avid master account, which is free. Existing customers can also get First, but the software cannot be co-installed on a computer with the full version. For example, I installed Media Composer | First on my laptop, because I have the full Media Composer application on my desktop. You must sign into the account and stay signed in for Media Composer | First to launch and run. I did get it to work when I signed in and then disconnected the internet. There was a disconnection prompt, but nevertheless, the application worked, saved, and exported properly. It doesn’t seem mandatory to be constantly connected to Avid over the internet. All project data is stored locally, so this is not a cloud application.

Managing the account and future updates is handled through Application Manager, an Avid desktop utility. It’s not my favorite, as it’s unreliable at times, but it does work most of the time. Opening the installer .dmg file will take a long time to verify; this seems to be a general Avid quirk, so be patient. When you first open the application, you may get a disk drive write permissions error message. On macOS you normally set drive permissions for “system”, “wheel”, and “everyone”. Typically I have the last two set to “read only”, which works for every other application except Avid’s. Therefore, if you want to store Avid media on your internal system drive, “everyone” must be changed to “read & write”.

The guided tour

The Avid designers have tried to make the Media Composer | First interface easy to navigate for new users – especially those coming from other NLEs, where media and projects are managed differently than in Media Composer. Right at the launch screen you have the option to learn through online tutorials. These will be helpful even for experienced users who might try to “out-think” the software. The interface includes a number of text overlays to help you get started. For example, there is no place to set project settings. The first clip added to the first sequence sets the project settings from there on. So, don’t drop a 25fps clip onto the timeline as your first clip, if you intend to work in a 23.98fps project. These prompts are right in front of you, so if you follow their guidance, you’ll be OK.

The same holds true for importing media through the Source Browser. With Media Composer you either transcode a file, which turns it into Avid-managed media placed into the Avid MediaFiles folder, or simply link to the file. If you select link, then the file stays in place and it’s up to the user not to move or delete that file on the hard drive. Although the original Avid paradigm was to only manage media in its MediaFiles hard drive folders, the current versions handle linking just fine and act largely the same as other NLEs.

Options, restrictions, and limitations

Since this is a free application, a number of features have been restricted. There are three biggies. Tracks are limited to four video tracks and eight audio tracks. This is actually quite workable; however, I think a higher audio track count would have been advisable, because of how Avid handles stereo, mono, and multichannel files. On a side note, if you use the “collapse” function to nest video clips, it’s possible to vertically stack more than just four clips on the timeline.

The application is locked to a maximum project size of 1920×1080 (Rec. 709 color space only) and up to 59.94fps. Source files can be larger (such as 4K) and you can still use them on the timeline, but you’ll have to pan-and-scan, crop, or scale them. I hope future versions will permit at least UltraHD (4K) project sizes.

Finally, Media Composer | First projects cannot be interchanged with full-fledged Media Composer projects. This means that you cannot start in Media Composer | First and then migrate your project to the paid version. Hopefully this gets fixed in a future update. If not, it will negatively impact students and indie producers using the application for any real work.

As expected, there are no 3D stereoscopic tools, ScriptSync (automatic speech-to-text/sync-to-script), PhraseFind (phonetic search engine), or Symphony (advanced color correction) options. One that surprised me, though, was the removal of the superior Spectramatte keyer. You are left with the truly terrible RGB keyer for blue/green-screen work.

Nevertheless, there’s plenty of horsepower left. For example, FrameFlex to handle resizing and Timewarps for retiming clips, which is how 4K and off-speed frame rates are handled. Color correction (including scopes), multicam, IllusionFX, source setting color LUTs, Audiosuite, and Pro Tools-style audio track effects are also there. Transcoding allows for the use of a wide range of codecs, including ProRes on a Mac. 4K camera clips will be transcoded to 1080. However, exports are limited to Avid DNxHD and H.264 QuickTime files at up to 1920×1080. The only DNxHD export flavor is the 100Mbps variant (at 29.97, 80Mbps for 23.98), which is comparable to ProResLT. It’s good quality, but not at the highest mastering levels.

Conclusion

This is a really good first effort, no pun intended. As you might expect, it’s a little buggy for a first version. For example, I experienced a number of crashes while testing source LUTs. However, it was well-behaved during standard editing tasks. If Media Composer | First files become compatible with the paid systems and the 1080 limit is increased to UHD/4K, then Avid has a winner on its hands. Think of the film student who starts on First at home, but then finishes on the full version in the college’s computer lab. Or the indie producer/director who starts his or her own rough cut on First, but then takes it to an editor or facility to complete the process. These are ideal scenarios for First. I’ve cut tons of short and long-form projects, including a few feature films, using a variety of NLEs. Nearly all of those could have been done using Media Composer | First. Yes, it’s free, but there’s enough power to get the job done and done well.

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters

Suburbicon

George Clooney’s latest film, Suburbicon, originated over a decade ago as a screenplay by Joel and Ethan Coen. Clooney picked it up when the Coens decided not to produce the film themselves. Clooney and writing partner Grant Heslov (The Monuments Men, The Ides of March, Good Night, and Good Luck) rewrote it to take place in the 1950s and added another story element. In the summer of 1957, the Myers, an African-American couple, moved into a largely white suburb in Levittown, Pennsylvania, setting off months of violent protests. The rewritten script interweaves the tale of the black family with that of their next-door neighbors, Gardner (Matt Damon) and Margaret (Julianne Moore). In fact, a documentary was produced about the historical events, and shots from that documentary were used in Suburbicon.

Calibrating the tone

During the production and editing of the film, the overall tone was adjusted as a result of the actual, contemporary events occurring in the country. I spoke with the film’s editor, Stephen Mirrione (The Revenant, Birdman or (The Unexpected Virtue of Ignorance), The Monuments Men) about this. Mirrione explains, “The movie is presented as over-the-top to exaggerate events as satire. In feeling that out, George started to tone down the silliness, based on surrounding events. The production was being filmed during the time of the US election last year, so the mood on the set changed. The real world was more over-the-top than imagined, so the film didn’t feel quite right. George started gravitating towards a more realistic style and we locked into that tone by the time the film moved into post.”

The production took place on the Warner Brothers lot in September 2016 with Mirrione and first assistant editor Patrick Smith cutting in parallel with the production. Mirrione continues, “I was cutting during this production period. George would come in on Saturdays to work with me and ‘recalibrate’ the cut. Naturally some scenes were lost in this process. They were funny scenes, but just didn’t fit the direction any longer. In January we moved to England for the rest of the post. Amal [Clooney, George’s wife] was pregnant at the time, so George and Amal wanted to be close to her family near London. We had done post there before and had a good relationship with vendors for sound post. The final sound mix was in the April/May time frame. We had an editing room set up close to George outside of London, but also others in Twickenham and at Pinewood Studios. This way I could move around to work with George on the cut, wherever he needed to be.”

Traveling light

Mirrione is used to working with a light footprint, so the need for mobility was no burden. He explains, “I’m accustomed to being very mobile. All the media was in the Avid DNxHD36 format on mobile drives. We had an Avid ISIS shared storage system in Twickenham, which was the hub for all of the media. Patrick would make sure all the drives were updated during production, so I was able to work completely with standalone drives. The Avid is a bit faster that way, although there’s a slight trade-off waiting for updated bins to be sent. I was using a ‘trash can’ [2013] Mac Pro plus AJA hardware, but I also used a laptop – mainly for reference – when we were in LA during the final steps of the process.” The intercontinental workflow also extended to color correction. According to Mirrione, “Stefan Sonnenfeld was our digital intermediate colorist and Company 3 [Co3] stored a back-up of all the original media. Through an arrangement with Deluxe, he was able to stream material to England for review, as well as from England to LA to show the DP [Robert Elswit].”

Music was critical to Suburbicon and scoring fell to Alexandre Desplat (The Secret Life of Pets, Florence Foster Jenkins, The Danish Girl). Mirrione explains their scoring process. “It was very important, as we built the temp score in the edit, to understand the tone and suspense of the film. George wanted a classic 1950s-style score. We tapped some Elmer Bernstein, Grifters, The Good Son, and other music for our initial style and direction. Peter Clarke was brought on as music editor to help round out the emotional beats. Once we finished the cut, Alexandre and George worked together to create a beautiful score. I love watching the scenes with that score, because his music makes the editing seem much more exciting and elegant.”

Suiting the edit tool to your needs

Stephen Mirrione typically uses Avid Media Composer to cut his films and Suburbicon is no exception. Unlike many film editors who rely on unique Avid features, like ScriptSync, Mirrione takes a more straightforward approach. He says, “We were using Media Composer 8. The way George shoots, there’s not a lot of improv or tons of takes. I prefer to just rely on PDFs of the script notes and placing descriptions into the bins. The infrastructure required for ScriptSync, like extra assistants, is not something I need. My usual method of organization is a bin for each day of dailies, organized in shooting order. If the director remembers something, it’s easy to find in a day bin. During the edit, I alternate my bin set-ups between the script view and the frame view.”

With a number of noted editors dabbling with other software, I wondered whether Mirrione has been tempted. He responds, “I view my approach as system-agnostic and have cut on Lightworks and the old Laser Pacific unit, among others. I don’t want to be dependent on one piece of software to define how I do my craft. But I keep coming back to Avid. For me it’s the trim mode. It takes me back to the way I cut film. I looked at Resolve, because it would be great to skip the roundtrip between applications. I had tested it, but felt it would be too steep a learning curve, and that would have impacted George’s experience as the director.”

In wrapping our conversation, Mirrione concluded with this take away from his Suburbicon experience. He explains, “In our first preview screening, it was inspiring to see how seriously the audience took to the film and the attachment they had to the characters. The audiences were surprised at how biting and relevant it is to today. The theme of the film is really talking about what can happen when people don’t speak out against racism and bullying. I’m so proud and lucky to have the opportunity to work with someone like George, who wants to do something meaningful.”

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters

Mindhunter

The investigation of crime is a film topic with which David Fincher is very familiar. He returns to this genre in the new Netflix series, Mindhunter, which is executive produced by Fincher and Charlize Theron. The series is the story of the FBI’s Behavioral Science Unit and how it became an elite profiling team, known for investigating serial criminals. The TV series is based on the nonfiction book Mind Hunter: Inside the FBI’s Elite Serial Crime Unit, co-written by Mark Olshaker and John Douglas, a former agent in the unit who spent 25 years with the FBI. Agent Douglas interviewed scores of serial killers, including Charles Manson, Ted Bundy, and Ed Gein, who dressed himself in his victims’ skin. The lead character in the series, Holden Ford (played by Jonathan Groff) is based on Douglas. The series takes place in 1979 and centers on two FBI agents, who were among the first to interview imprisoned serial killers in order to learn how they think and apply that to other crimes. Mindhunter is about the origins of modern day criminal profiling.

As with other Fincher projects, he brought in much of the team that’s been with him through the various feature films, like Gone Girl, The Girl with the Dragon Tattoo, and Zodiac. It has also given a number of the team the opportunity to move up in their careers. I recently spoke with Tyler Nelson, one of the four series editors, who was given the chance to move from the assistant chair to that of a primary editor. Nelson explains, “I’ve been working with David Fincher for nearly 11 years, starting with The Curious Case of Benjamin Button. I started on that as an apprentice, but was bumped up to an assistant editor midway through. There was actually another series in the works for HBO called Videosyncrasy, which I was going to edit on. But that didn’t make it to air. So I’m glad that everyone had the faith in me to let me edit on this series. I cut the four episodes directed by Andrew Douglas and Asif Kapadia, while Kirk Baxter [editor on Gone Girl, The Girl with the Dragon Tattoo, The Social Network] cut the four shows that David directed.”

Pushing the technology envelope

The Fincher post operation has a long history of trying new and innovative techniques, including their selection of editing tools. The editors cut this series using Adobe Premiere Pro CC. Nelson and the other editors are no strangers to Premiere Pro, since Baxter had cut Gone Girl with it. Nelson says, “Of course, Kirk and I have been using it for years. One of the editors, Byron Smith, came over from House of Cards, which was being cut on [Apple] Final Cut Pro 7. So that was an easy transition for him. We are all fans of Adobe’s approach to the entertainment industry and were onboard with using it. In fact, we were running on beta software, which gave us the ability to offer feedback to Adobe on features that will hopefully make it into released products and benefit all Premiere users.”

Pushing the envelope is also a factor on the production side. The series was shot with custom versions of the RED Weapon camera. Shots were recorded at 6K resolution, but framed for a 5K extraction, leaving a lot of “padding” around the edges. This allowed room for repositioning and stabilization, which is done a lot on Fincher’s projects. In fact, nearly all of the moving footage is stabilized. All camera footage is processed into EXR image sequences in addition to ProRes files for “offline” editing. These ProRes files also get an added camera LUT so everyone sees a good representation of the color correction during the editing process. One change from past projects was to bring color correction in-house. The final grade was handled by Eric Weidt on a FilmLight Baselight X unit, working from the EXR files. The final Netflix deliverables are 4K/HDR masters. Pushing a lot of data through a facility requires robust hardware systems. The editors used 2013 (“trash can”) Mac Pros connected to an Open Drives shared storage system. This high-end storage system was initially developed as part of the Gone Girl workflow and uses storage modules populated with all SSD drives.

The feature film approach

Unlike most TV series, where there’s a definite schedule to deliver a new episode each week, Netflix releases all of their shows at once, which changes the dynamic of how episodes are handled in post. Nelson continues, “We were able to treat this like one long feature film. In essence, each episode is like a reel of a film. There are 10 episodes and each is 45 minutes to an hour long. We worked it as if it was an eight-and-a-half to nine hour long movie.” Skywalker Sound did all the sound post after a cut was locked. Nelson adds, “Most of the time we handed off locked cuts, but sometimes when you hear the cleaned up sound, it can highlight issues with the edit that you didn’t notice before. In some cases, we were able to go back into the edit and make some minor tweaks to make it flow better.”

As Adobe moves more into the world of dialogue-driven entertainment, a number of developers are coming up with speech-to-text solutions that are compatible with Premiere Pro. This potentially provides editors a function similar to Avid’s ScriptSync. Would something like this have been beneficial on Mindhunter, a series based on extended interviews? Nelson replies, “I like to work with the application the way it is. I try not to get too dependent on any feature that’s very specific or unique to only one piece of software. I don’t even customize my keyboard settings too much, just so it’s easier to move from one workstation to another that way. I like to work from sequences, so I don’t need a special layout for the bins or anything like that.”

“On Mindhunter we used the same ‘KEM roll’ system as on the films, which is a process that Kirk Baxter and Angus Wall [editor on Zodiac, The Curious Case of Benjamin Button, The Social Network] prefer to work in,” Nelson continues. “All of the coverage for each scene set-up is broken up into ‘story beats’. In a 10 minute take for an interview, there might be 40 ‘beats’. These are all edited in the order of last take to first take, with any ‘starred’ takes at the head of the sequence. This way you will see all of the coverage, takes, and angles for a ‘beat’ before moving on to the group for the next ‘beat’. As you review the sequence, the really good sections of clips are moved up to video track two on the sequence. Then create a new sequence organized in story order from these selected clips and start building the scene. At any given time you can go back to the earlier sequences if the director asks to see something different than what’s in your scene cut. This method works with any NLE, so you don’t become locked into one and only one software tool.”

“Where Adobe’s approach is very helpful to us is with linked After Effects compositions,” explains Nelson. “We do a lot of invisible split screen effects and shot stabilization. Those clips are all put into After Effects comps using Dynamic Link, so that an assistant can go into After Effects and do the work. When it’s done, the completed comp just pops back into the timeline. Then ‘render and replace’ for smooth playback.”

The challenge

Certainly a series like this can be challenging for any editor, but how did Nelson take to it? He answers, “I found every interview scene to be challenging. You have an eight to 10 minute interview that needs to be interesting and compelling. Sometimes it takes two days to just get through looking at the footage for a scene like that. You start with ‘How am I going to do this?’ Somewhere along the line you get to the point where ‘This is totally working.’ And you don’t always know how you got to that point. It takes a long time approaching the footage in different ways until you can flesh it out. I really hope people enjoy the series. These are dramatizations, but real people actually did these terrible things. Certainly that creeps me out, but I really love this show and I hope people will see the craftsmanship that’s gone into Mindhunter and enjoy the series.”

In closing, Nelson offered these additional thoughts. “I’d gotten an education each and every day. Lots of editors haven’t figured it out until well into a long career. I’ve learned a lot being closer to the creative process. I’ve worked with David Fincher for almost 11 years. You think you are ready to edit, but it’s still a challenge. Many folks don’t get an opportunity like this and I don’t take that lightly. Everything that I’ve learned working with David has given me the tools and I feel fortunate that the producers had the confidence in me to let me cut on this amazing show.”

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters