Final Cut Pro X – Reflecting on Six Years

Some personal musings…

Apple’s Final Cut Pro X has passed its five-year mark – and is now well into its sixth. Although it’s getting increasing respect from many corners of the professional editing community, there are still many who dismiss it because of its deviation from standard editing software conventions. Like so many other things that are Apple, FCPX tends to be polarizing, with a large cohort of both fanboys and haters.

For me, software is a tool. I’ve been editing since the 70s and have used about 15 different linear and nonlinear systems on billable work during that time. More like 20 if you toss in color correction applications. Even more if you count tools I’ve had only cursory exposure to (such as in product reviews), but haven’t used on real jobs. My relationship with all of these tools is love-hate. I have to laugh when folks talk about FCPX bringing back fun to their editing experience. I hope that the projects I work on bring me fun. I don’t really care about the software itself. Software should just get out of the way and let me do my job.

These six years have been a bit of a personal journey with Final Cut Pro X after a number of years with the “classic” version. I’ve been using FCPX since it first came out on commercials, corporate videos, shorts and even an independent feature film. It’s not my primary NLE most of the time, because my clients have largely moved to Adobe Premiere Pro CC and ask me to be compatible with them. My FCPX work tends to be mixed in and around my Premiere Pro editing gigs. For instance, right now I’m simultaneously involved in two large corporate video jobs – one of which I’m cutting in Premiere Pro and the other in Final Cut Pro X. As these things go, it can be frustrating, because you always want some function, tool or effect that’s available in Application A while you’re working in Application B. However, it also provides a perspective on what’s good and bad about each and where real speed advantages exist.

I have to say that even after six years, Final Cut Pro X is still more of a crapshoot than any other editing tool that I’ve used. I love its organizing power and often start a job really liking it. However, the deeper I get into the job – the larger the library grows and the more complex the sequences become – the more FCPX bogs down. It’s also the most inconsistent across various Mac models. I’ve run it on older towers, new MacBook Pros, iMacs and 2013 Mac Pros. Of these experiences, the laptops seem to be the most optimized for FCPX.

Quite frankly, working with the “trash can” Mac Pros, at times I wonder if Apple has lost its mojo. Don’t get me wrong – it’s a sweet machine, but its horsepower leaves me underwhelmed. Given the right upgrades, a 2010 Mac Pro tower is still quite competitive against it. Couple that with intermittent corrupt renders and exports in Adobe applications – caused by the D-series AMD GPUs – and one really has to question Apple’s design compromises. On the other hand, working with recent and new MacBook Pros, it seems pretty obvious that this is where Apple’s focus has been. And in fact, that’s where Final Cut really shines. Run a complex project on a MacBook Pro versus an older tower and it’s truly a night-and-day experience. By comparison, the performance with Adobe and Avid on the same range of machines results in a much more graduated performance curve. Best might not be quite as good, but worst isn’t nearly as awful.

A lot is made of new versus old code in these competing applications. The running argument is that FCPX uses a sleek, new codebase, whereas Premiere Pro and Media Composer run on creaky old software. Yet Final Cut has been out publicly for six years, which means development started a few years before that. Hmmm, no longer quite so new. Yet, if you look at the recent changes from 10.2 to 10.3, it seems pretty clear that a lot more was changed than just cosmetics. The truth of the matter is that all three of these major applications are written in a way that modules of software can be added, removed or changed, without the need to start from scratch. Therefore, from a coding standpoint, Final Cut doesn’t have nearly the type of advantages that many think it has.

The big advantage that FCPX does have is that Apple can optimize its performance for the holistic hardware and macOS software architecture of their own machines. As such, performance, render speeds, etc. aren’t strictly tied to only the CPU or the GPU. It’s what enables the new MacBook Pro to offer top-end performance, while still staying locked to 16GB of RAM. It seems to me that this is also why the Core-series processors appear to outperform the Xeon-series chips when it comes to Final Cut, Motion and Compressor.

If you compare this to Premiere Pro, Adobe hits the GPUs much harder than does Apple, which is the reason behind the occasional corruptions on the “trash can” Macs with Adobe renders. If you were running the Adobe suite on a top-level PC with high-end Nvidia cards, performance would definitely shine over that of the Macs. This is largely due to leveraging the CUDA architecture of these Nvidia GPUs. With Apple’s shift to using only AMD and Intel GPUs, CUDA acceleration isn’t available on newer Macs. Under the current software versions of Adobe CC (at the time of this writing) and Sierra, you are tied to OpenCL or software-only rendering and cannot even use Apple’s Metal acceleration. This is a driver issue still being sorted out between Apple and Adobe. Metal is something that Apple tools take advantage of and is a way that they leverage the combined hardware power, without focusing solely on CPU or GPU acceleration.

All of this leads me back to a position of love-hate with any of these tools. I suspect that my attitude is more common than most folks who frequent Internet forum debates want to admit. The fanboy backlash is generally large. When I look at how I work and what gets the results, I usually prefer track-based systems to the FCPX approach. I tend to like Final Cut as a good rough-cut editing application, but less as a fine-cut tool. Maybe that’s just me. That being said, I’ve had plenty of experiences where FCPX quite simply is the better tool under the circumstances. On a recent on-site edit gig at CES, I had to cut some 4K ARRI ALEXA material on my two-year-old Retina MacBook Pro. Premiere Pro couldn’t hack it without stuttering playback, while FCPX was buttery smooth. Thus FCPX was the axe for me throughout this gig.

Likewise, in the PC vs. Mac hardware debates, I may criticize some of Apple’s moves and long to work on a fire-breathing platform. But if push came to shove and I had to buy a new machine today, it would be either a Mac Pro “trash can” or a tricked-out iMac. I don’t do heavy 3D renders or elaborate visual effects – I edit and color correct. Therefore, the overall workflow, performance and “feel” of the Apple ecosystem is a better fit for me, even though at times performance might be middling.

Wrapping up this rambling post – it’s all about personal preference. I applaud Apple for making the changes in Final Cut Pro X that they did; however, a lot of things are still in need of improvement. Hopefully these will get addressed soon. If you are looking to use FCPX professionally, then my suggestion is to stick with only the newest machines and keep your productions small and light. Keep effects and filters to a minimum and you’ll be happiest with the results and the performance. Given the journey thus far, let’s see what the next six years will bring.

©2017 Oliver Peters

Nocturnal Animals

Some feature films are entertaining popcorn flicks, while others challenge the audience to go deeper. Writer/director Tom Ford’s (A Single Man) second film, Nocturnal Animals, definitely fits into the latter group. Right from the start, the audience is confronted with a startling and memorable main title sequence, which we soon learn is actually part of an avant-garde art gallery opening. From there the audience never quite knows what’s around the next corner.

Susan Morrow (Amy Adams) is a privileged Los Angeles art gallery owner who seems to have it all, but whose life is completely unfulfilled. One night she receives an unsolicited manuscript from Edward Sheffield (Jake Gyllenhaal), her ex-husband with whom she’s been out of touch for years. With her current husband (Armie Hammer) away on business, she settles in for the night to read the novel. She is surprised to discover it is dedicated to her. The story being told by Edward is devastating and violent, and it triggers something in Susan that arouses memories of her past love with the author.

Nocturnal Animals keeps the audience on edge and is told through three parallel storylines – Susan’s current reality, flashbacks of her past with Edward, and the events that are unfolding in the novel. Managing this delicate balancing act fell to Joan Sobel, ACE, the film’s editor. In her film career, Sobel has worked with such illustrious directors as Quentin Tarantino, Billy Bob Thornton, Paul Thomas Anderson and Paul Weitz.  She was Sally Menke’s First Assistant Editor for six-and-a-half years on four films, including Kill Bill, vol. 1 and Kill Bill, vol. 2.  Sobel also edited the Oscar-winning short dark comedy, The Accountant.  This is her second feature with Tom Ford at the helm.

Theme and structure

In our recent conversation, Joan Sobel discussed Nocturnal Animals. She says, “At its core, this film is about love and revenge and regret, with art right in the middle of it all. It’s about people we have loved and then carelessly discarded, about the cruelties that we inflict upon each other, often out of fear or ambition or our own selfishness.  It is also about art and the stuff of dreams.  Susan has criticized Edward’s ambition as a writer. Edward gets Susan to feel again through his art – through that very same writing that Susan has criticized in the past. But art is also Edward’s vehicle for revenge – revenge for the hurt that Susan has caused him during their past relationship. The film uses a three-pronged story structure, which was largely as Tom scripted. The key was to find a fluid and creative way to transition from one storyline to the other, to link those moments emotionally or visually or both. Sometimes that transition was triggered by a movement, but other times just a look, a sound, a color or an actor’s nuanced glance.”

Nocturnal Animals was filmed (yes, on film not digital) over 31 days in California, with the Mojave Desert standing in for west Texas. Sobel was cutting while the film was being shot and turned in her editor’s cut about a week after the production wrapped. She explains, “Tom likes to work without a large editorial infrastructure, so it was just the two of us working towards a locked cut. I finished my cut in December and then we relocated to London for the rest of post. I always put together a very polished first cut, so that there is already an established rhythm and a flow. That way the director has a solid place to begin the journey. Though the movie was complex with its three-pronged structure – along with the challenge of bringing to life the inner monologue that is playing in Susan’s head – the movie came together rather quickly. Tom’s script was so well written and the performances so wonderful that by the end of March we pretty much had a locked cut.”

The actors provided fruitful ground for the editor.  Sobel continues, “It was a joy to edit Amy Adams’ performance. She’s a great actress, but when you actually edit her dailies, you get to see what she brings to the movie. Her performance is reliant less on dialogue (she actually doesn’t have many lines), instead emphasizing Amy’s brilliance as a film actor in conveying emotion through her mind and through her face and her eyes.”

“Tom is a staggering talent, and working with him is a total joy.  He’s fearless and his creativity is boundless.  He is also incredibly generous and very, very funny (we laugh a lot!), and we share an endless passion for movies.  Though the movie is always his vision, his writing, he gravitates towards collaboration. So we would get quite experimental in the cut. The trust and charm and sharp, clear intelligence that he brings into the cutting room resulted in a movie that literally blossoms with creativity. Editing Nocturnal Animals was a totally thrilling experience.”

Tools of the trade

Sobel edited Nocturnal Animals with Avid Media Composer. Although she’s used other editing applications, Media Composer is her tool of choice. I asked about how she approaches each new film project. She explains, “The first thing I do is read the script. Then I’ll read it again, but this time out loud. The rhythms of the script become more lucid that way and I can conceptualize the visuals. When I get dailies for a scene, I start by watching everything and taking copious notes about every nuance in an actor’s performance that triggers an emotion in me, that excites me, that moves me, that shows me exactly where this scene is going. Those moments can be the slightest look, a gesture, a line reading.”

“I like to edit very organically based on the footage. I know some editors use scene cards on a wall or they rely on Avid’s Script Integration tools, but none of those approaches are for me. Editing is like painting – it’s intuitive. My assistants organize bins for themselves in dailies order. Then they organize my bins in scene/script order. I do not make selects sequences or ‘KEM rolls’. I simply set up the bins in frame view and then rearrange the order of clips according to the flow – wide to tight and so on. As I edit, I’m concentrating on performance and balance. One little trick I use is to turn off the sound and watch the edit to see what is rhythmically and emotionally working. Often, as I’m cutting the scene, I find myself actually laughing with the actor or crying or gasping! Though this is pretty embarrassing if someone happens to walk into my cutting room, I know that if I’m not feeling it, then the audience won’t either.”

Music and sound are integral for many editors, especially Sobel. She comments, “I love to put temp music into my editor’s cuts. That’s a double-edged sword, though, because the music may or may not be to the taste of the director. Though Tom and I are usually in sync when it comes to music, Tom doesn’t like to start off with temp music in the initial cut, so I didn’t add it on this film. Once Tom and I started working together, we played with music to see what worked. This movie is one that we actually used very little music in and when we did, added it quite sparingly. Mostly the temp music we used was music from some of Abel’s [Korzeniowski, composer] other film scores. I also always add layers of sound effects to my tracks to take the movie and the storytelling to a further level. I use sound to pull your attention, to define a character, or a mood, or elevate a mystery.”

Unlike many films, Nocturnal Animals flew through the post process without any official test screenings. Its first real screening was at the Venice Film Festival where it won the Silver Lion Grand Jury Prize. “Tom has the unique ability to both excite those working with him and to effortlessly convey his vision, and he had total confidence in the film. The film is rich with many layers and is the rare film that can reveal itself through subsequent viewings, hopefully providing the audience with that unique experience of being completely immersed in a novel, as our heroine becomes immersed in Nocturnal Animals,” Sobel says. The film opened in the US during November and is a Focus Features release.

Check out more with Joan Sobel at “Art of the Cut”.

Originally written for Digital Video magazine / Creative Planet Network.

©2017 Oliver Peters

AJA T-Tap

 

The Thunderbolt protocol has ushered in a new era for easy connectivity of hardware peripherals. It allows users to deploy a single connection type to tie in networking, external storage, monitoring and broadcast audio and video input and output. Along with easy connections, it has also enabled peripheral devices to become smaller, lighter and more powerful. This is due in part to advances in both hardware and software. AJA Video Systems is one of the popular video manufacturers that has taken advantage of these benefits.

In many modern editing environments, the actual editing system has become extremely streamlined. All it really takes is a Thunderbolt-enabled laptop, all-in-one (like an iMac) or desktop computer, fast external storage, and professional monitoring – and you are good to go. For many editors, live video output is strictly for monitoring, as deliverables are more often-than-not files and not tape. Professional monitoring is easy to achieve using SDI or HDMI connections. Any concern for analog is gone, unless you need to maintain analog audio monitoring. AJA makes a series of i/o products to address these various needs, ranging from full options down to simple monitoring devices. Blackmagic Design and AJA currently produce the lion’s share of these types of products, including PCIe cards for legacy installations and Thunderbolt devices for newer systems.

I recently tested the AJA T-Tap, which is a palm-sized video output device that connects to the computer using the Thunderbolt 2 protocol. It is bus-powered – meaning that no external power supply or “wall-wart” is needed to run it. I tested this on both a 2013 Mac Pro and a 2015 MacBook Pro. In each case, my main need was SDI and/or HDMI out of the unit to external monitors. Installation couldn’t be easier. Simply download the current control panel software and drivers from AJA’s website, install, and then connect the T-Tap. Hook up your monitors and you are ready. There’s very little else to do, except set your control panel configuration for the correct video/frame rate standard. Everything else is automatic in both Adobe Premiere Pro CC and Apple Final Cut Pro X, although you’ll want to check your preference settings to make sure the device is detected and enabled.

One of the main reasons I wanted to test the T-Tap was as a direct comparison with the Blackmagic products on these same computers. For example, the current output device being used on the 2013 Mac Pro that I tested is a Blackmagic Design UltraStudio Express. This contains a bit more processing and is comparable to AJA’s Io XT. I also tested the BMD MiniMonitor, which is a direct competitor to the T-Tap. The UltraStudio provides both input and output and offers an analog break-out cable harness, whereas the two smaller units are output-only, using SDI and HDMI. All three are bus-powered. In general, all performed well with Premiere Pro, except that the BMD MiniMonitor couldn’t provide output via HDMI. For unexplained reasons, that screen was blank. No such problem with either the T-Tap or the UltraStudio Express.

The real differences are with Final Cut Pro X on the Mac Pro. That computer has six Thunderbolt ports, which are shared across three buses – i.e. two connectors per bus. On the test machine, one bus feeds the two external displays, the second bus connects to external storage (not shared for maximum throughput), and the remaining bus connects to both the output device and a CalDigit dock. If the BMD UltraStudio Express is plugged into any connection shared with another peripheral, JKL high-speed playback and scrubbing in FCPX is useless. Not only does the video output stutter and freeze, but so does the image in the application’s viewer. So you end up wasting an available Thunderbolt port on the machine, if you want to use that device with FCPX. Therefore, using the UltraStudio with FCPX on this machine isn’t really functional, except for screening with a client. This means I end up disabling the device most of the time I use FCPX. In that respect, both the AJA T-Tap and the BMD MiniMonitor performed well. However, my subjective evaluation is that the T-Tap gave better performance in my critical JKL scrubbing test.

One difference that might not be a factor for most is that the UltraStudio Express (which costs a bit more) has advanced processing. This yields a smooth image in pause when working with progressive and PsF media. When my sequence was stopped in either FCPX or Premiere, both the T-Tap and the UltraStudio yielded a full-resolution, whole-frame image on the HDMI output. (HDMI didn’t appear to function on the MiniMonitor.) On the TV Logic broadcast display that was being fed via SDI, the T-Tap and MiniMonitor only displayed a field in pause, so you get an image with “jaggies”. The UltraStudio Express generates a whole frame for a smooth image in pause. I didn’t test a unit like AJA’s Io XT, so I’m not sure if the more expensive AJA model offers similar processing. However, it should be noted that the Io XT is triple the cost of the UltraStudio Express.

The elephant in the room, of course, is Blackmagic Design DaVinci Resolve. That application is restricted to work only with Blackmagic’s own hardware devices. If you want to run Resolve – and you want professional monitoring out of it – then you can’t use any AJA product with it. However, these units are so inexpensive to begin with – compared with what they used to cost – that it’s realistic to own both. In fact, some FCPX editors use a T-Tap while editing in FCPX and then switch over to a MiniMonitor or UltraStudio for Resolve work. The reason is the better performance of the AJA products with Final Cut.

Ultimately these are all wonderful devices. I like the robustness of AJA’s manufacturing and software tools. I’ve used their products over the years and never been disappointed with performance or service if needed. If you don’t need video output from Resolve, then the AJA T-Tap is a great choice for an inexpensive, simple, Thunderbolt video output solution. Laptop users who need to hook up to monitors while working at home or away will find it a great choice. Toss it into your laptop bag and you are ready to rock.

©2017 Oliver Peters

BorisFX BCC 10

Boris Continuum Complete (BCC) by BorisFX is the epitome of the term “Swiss Army knife” when it comes to talking about plug-ins. Most editors will pick this package over others, if they can only have one toolkit to cover a diverse range of picture enhancements. In the past year, BorisFX has upgraded this toolkit with new effects, expanded to add more NLE hosts, and integrated mocha’s Academy Award-winning planar tracking technology after the acquisition of Imagineer Systems. This set of plug-ins is now up to version BCC10. BorisFX has not only added new effects to BCC10, but also expanded its licensing options to include multi-host and subscription options.

Since many users now work with several NLEs, multi-host licensing makes a lot of sense. One purchase with a single serial number covers the installation for each of the various applications. There are two multi-host license versions: one for Avid/Adobe/Apple/OFX and the second that doesn’t include Avid. OFX licensing covers the installation for Blackmagic Design DaVinci Resolve, as well as Sony Vegas Pro for PC users.

What’s new in BCC10

Boris Continuum Complete version 10 includes over 230 effects within 16 different categories, like 3D Objects, Art Looks, Particles, Perspective and more. Each effect comes with numerous presets for a total of over 2,500 presets in all. There are plenty of new tools in BCC10, but the biggest news is that each effect filter integrates mocha planar tracking. BorisFX has always included Pixel Chooser as a way of masking objects. Now each filter also lets you launch the mocha interface right from inside the plug-in’s effect control panel. For example, if you are applying skin smoothing to only your talent’s forehead using the new BCC Beauty Studio, simply launch mocha, create a mask for the forehead and track the talent’s movement within the shot. The mask and track are saved within the plug-in, so you can instantly see the results.

A second big change is the addition and integration of the FX Browser. Each plug-in effect lets you launch the FX Browser interface to display how each of the various presets for that effect would look when applied to the selected clip. You can preview the whole clip, not just a thumbnail. FX Browser is also a standalone effect that can be applied to the clip. When you use it that way, then all presets for all filters can be previewed. While FX Browser has been implemented in past versions in some of the hosts, this is the first time that it’s become an integrated part of the BCC package across all NLEs.

BCC10 includes two new “studio” tools, as well as a number of new individual effects. BCC Beauty Studio is a set of tools in a single filter targeted at image retouching, especially the skin texture of talent. Photographers retouch “glamor” shots to reduce or remove blemishes, so Photoshop-style retouching is almost expected these days. This is the digital video equivalent. As with most skin smoothing filters, BCC Beauty Studio uses skin keying algorithms to isolate skin colors. It then blurs skin texture, but also lets the editor adjust contrast, color correction, and even add a subtle glow to image highlights. Of course, as I mentioned above, mocha masking and tracking is integrated for the ultimate control in where and how the effect is applied.

The second new, complex filter is BCC Title Studio. This is an integrated 3D titling tool that can be used based on templates within the effects browser or by launching the separate Title Studio interface. Editors familiar with BorisFX products will recognize this titling interface as essentially Boris RED right inside of their NLE. Not only can you create titles, but also more advanced motion graphics. You can even import objects, EPS and image files for 3D effects, including the addition of materials and shading. As with other BorisFX titling tools, you can animate text on and off the screen.

In addition to these two large plug-ins, BCC10 also gained nine new filters and transitions. These include BCC Remover (fills in missing pixels or removes objects using cloning) and BCC Drop-out Fixer (restores damaged footage). For the folks who have to deal with a lot of 4×3 content and vertical cell phone footage, there’s BCC Reframer. Unlike the usual approach where the same image is stretched and blurred behind the vertical shot, this filter includes options to stylize the foreground and background.

The trend these days is to embrace image “defects” as a creative effect, so two of the new filters are BCC Light Leaks and BCC Video Glitch. Each adds organic, distressed effects, like in-camera light contamination and corrupted digital video artifacts. To go along with this, there are also four new transitions, including a BCC Light Leaks Dissolve, Cross Glitch, Cross Zoom and Cross Melt. Of these, the light leaks, glitch and zoom transitions are about what you’d expect from the name; however, the melt transition seems rather unique. In addition to the underlying dissolve between two images, there are a variety of effects options that can be applied as part of this transition. Many of these are glass, plastic, prism or streak effects, which add an interesting twist to this style of transition.

In use

The new BCC10 package works within the established hosts much like it always has, so no surprises there. The Boris Continuum Complete package used to come bundled with Avid Media Composer, but unfortunately that’s no longer the case. Avid editors who want the full BCC set have to purchase it. As with most plug-ins, After Effects is generally the best host when adjustment and manipulation of effects are required.

A new NLE to consider is DaVinci Resolve. Many are testing the waters to see if Resolve could become their NLE of choice. Blackmagic Design introduced Resolve 12.5 with even more focus on its editing toolset, including new, built-in effect filters and transitions. In my testing, BCC10 works reasonably well with Resolve 12.5 once you get used to where the effects are. Resolve uses a modal design with editing and color correction split into separate modes or pages. BCC10 transition effects only show up in the OFX library of the edit page. For filter effects, which are applied to the whole clip, you have to go to the color page. During the color correction process you may add any filter effect, but it has to be applied to a node. If you apply more than one filter, you have to add a new node for each filter. With the initial release of BCC10, mocha did not work within Resolve. If you tried to launch it, a message came up that this functionality would be added at a later time. In May, BorisFX released BCC10.2, which included mocha for both Resolve 12.5 and Vegas Pro. To use the BCC10 effects with Resolve 12.5 you need the paid Studio version and not the free version of Resolve.

BorisFX BCC10 is definitely a solid update, with new features, mocha integration and better GPU-based performance. It runs best in After Effects CC, Premiere Pro CC and Avid Media Composer. The built-in effects tools are pretty good in After Effects, Final Cut Pro X and Resolve 12.5 – meaning you might get by without needing what BCC10 has to offer. On the other hand, they are unfortunately very mediocre in Premiere Pro or Media Composer. If one of those is your editing axe and you want to improve the capabilities of your editing application, then BCC10 becomes an essential purchase. Regardless of which tool you use, BCC10 will give you more options to stretch your creativity.

On a related note, at IBC 2016 in Amsterdam, BorisFX announced the acquisition of GenArts. This means that the Sapphire effects are now housed under the BorisFX umbrella, which could make for some interesting bundling options in the future. As with their integration of mocha tracking into the BCC effects, future versions of BCC and/or Sapphire might also see a sharing of compatible technologies across these two effects families. Stay tuned.

Originally written for Digital Video magazine / Creative Planet Network

©2016 Oliver Peters

The wait is over – FCP X 10.3

Amidst the hoopla on Oct. 27th, when Apple introduced the new MacBook Pro with Touch Bar, the ProApps team also released updates to Final Cut Pro X, Motion and Compressor. This was great news for fans, since Final Cut got a prime showcase slot in the event’s main stage presentation. Despite the point numbering, the bump from 10.2 to 10.3 is a full version change, just like in macOS, where 10.11 (El Capitan) to 10.12 (Sierra) is also a new version. This makes FCP X 10.3 the fourth iteration in the FCP X line and the eleventh under the Final Cut Pro brand. I’m a bit surprised that Apple didn’t drop the “X” from the name, though, seeing as it’s done that with macOS itself. And speaking of operating systems, this release requires 10.11.4 (El Capitan) or higher (Sierra).

If you already purchased the application in the past, then this update will be a free upgrade for you. There are numerous enhancements, but three features stand out among the changes: the new interface, the expanded use of roles for mixing, and support for a wider color gamut.

A new look for the user interface

The new user interface is darker and flatter. For my taste, it’s a bit too dark, without any brightness sliders to customize the appearance. The dimensional style is gone, putting Final Cut Pro X in line with the aesthetics of iMovie and other Apple applications. Final Cut Pro X’s original look had fallen out of step with design trends in the years since it was first released. Reskinning the application with this new appearance brings it in line with the rest of the design industry.

The engineers have added workspaces and rearranged where certain controls are, though generally, panels are in the same places as before. Workspaces can be customized, but not nearly to the level of Adobe’s Premiere Pro CC. The most welcome of these changes is that the inspector pane can be toggled to full height when needed. In reality, the inspector height isn’t changed. It’s the width of the timeline that changes and toggles between covering and revealing the full inspector panel.

There are other minor changes throughout 10.3, which make it a much better application. For example, if you like to work with a source/record, 2-up viewer display, then 10.3 now allows you to play a source clip from inside the event viewer.

Magnetic Timeline 2 and the expansion of roles

Apple did a lot of work to rejigger the way the timeline works and to expand the functionality of roles. It’s even being marketed as Magnetic Timeline 2. Up until now, the use of roles in Final Cut has been optional. With 10.3, it’s become the primary way to mix and organize connected clips within the timeline. Apple has resisted adding a true mixing panel, instead substituting the concept of audio lanes.

Let’s say that you assign the roles of dialogue, music or effects to your timeline audio clips. The timeline index panel lets you organize these clips into groups according to their assigned roles, which Apple calls audio lanes. If you click “show audio lanes”, the various connected clips rearrange vertical position in the timeline window to be grouped into their corresponding lanes, based on roles. Now you have three lanes of grouped clips: dialogue, effects, music. You can change timeline focus to individual roles – such as only dialogue – which will minimize the size of all the other roles (clips) in the window. These groups or lanes can also be soloed, so you just hear dialogue without the rest, for example.

There is no submix bus to globally control or filter groups of clips, like you have in Premiere Pro or most digital audio applications. The solution in FCP X 10.3 is to select all clips of the same role and create a compound clip. (Other NLEs refer to this as “nesting”.) By doing so, all of the dialogue, effects and music clips appear on the timeline as only three compound clips – one for each role. You can then apply audio filters or adjust the overall level of that role by applying them to the compound clip.

Unfortunately, if you have to go back and make adjustments to an individual clip, you’ll have to open up the compound clip in its own timeline. When you do that, you lose the context of the other clips. For example, tweaking a sound effect clip inside its compound clip, means that you would only hear the other surrounding effect clips, without dialogue and music or seeing the video. In addition, you won’t hear the result of filters or volume changes made at the top level of that compound clip. Nevertheless, it’s not as complex as it sounds and this is a viable solution, given the design approach Apple engineers have taken.

It does surprise me that they ended up with this solution, because it’s a very modal way of operating. This would seem to be anathema to the intent of much of the rest of FCP X’s design. One has to wonder whether or not they’ve become boxed in by their own architecture. Naturally, others will counter that this process is simplified due to the lack of track patching and submix matrices.

Wide color

The industry at large is embracing color standards that enable displays to reproduce more of the color spectrum that the human eye can see. An under-the-hood change with FCP X is the embrace of wide gamut color. I think that calling it “wide color” dumbs down the actual standards, but I guess Apple wants to keep things in plain language. In any case, the interface is pretty clear on the actual specs.

Libraries can be set up for “standard color” (Rec. 601 for SD and Rec. 709 for HD) or “wide color” (Rec. 2020). The Projects (sequences) that you create within a Library can be either, as long as the Library was initially set up for wide gamut. You can also change the setting for a Project after the fact. Newer cameras that record in raw or log color space, like RED or ARRI models, are perfectly compatible with wide color (Rec. 2020) delivery, thanks to post-production color grading techniques. That is where this change comes into play.

For the most part you won’t see much difference in normal work, unless you really crank up the saturation. If you do this in the wide color gamut mode, you can get pretty extreme and the scopes will display an acceptable signal. However, if you then switch the Project setting to standard color, the high chroma areas will change to a somewhat duller appearance in the viewer and the scopes will show signal clipping. Most current television display systems don’t display wide gamut color yet, so it’s not something most users need to worry about today. This is Apple’s way of future-proofing Final Cut and passing the cleanest possible signal through the system.
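To make that clipping behavior concrete, here is a small numerical sketch – my own illustration, not anything from Apple – using the commonly published linear-light conversion matrix from Rec. 2020 to Rec. 709. A color that is perfectly legal in the wide gamut lands outside the Rec. 709 gamut after conversion and has to be clipped, which is the dulling in the viewer and the clipping on the scopes described above.

```python
import numpy as np

# Commonly published linear-light RGB conversion matrix, Rec. 2020 -> Rec. 709.
# (Approximate values for illustration; FCP X's internal color management is Apple's own.)
M_2020_TO_709 = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])

# A heavily saturated red that is legal in Rec. 2020...
saturated_2020 = np.array([1.0, 0.05, 0.05])

# ...converts to values outside the 0-1 range in Rec. 709 and must be clipped.
converted = M_2020_TO_709 @ saturated_2020
clipped = np.clip(converted, 0.0, 1.0)

print("Rec. 709 values before clipping:", converted)   # components > 1 or < 0 are out of gamut
print("Rec. 709 values after clipping: ", clipped)
```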

A few more things

Numerous other useful tools were added in this version. For example, Flow – a morphing dissolve – for use in bridging jump cuts. Unlike Avid’s or Adobe’s variations, this transition works in real-time without analysis or rendering. This is because it morphs between two still frames. Each company’s approach has a slightly different appearance, but Flow definitely looks like an effect that will get a lot of use – especially with interview-driven productions. Other timeline enhancements include the ability to easily add and toggle audio fades. There’s simplified top and tail trimming. Now you can remove attributes and you can roll (trim) between adjacent, connected clips. Finally – a biggie for shared storage users – FCP X can now work with NAS systems that use the SMB protocol.

Having worked with it for over a week at the time I post this, I’ve found the application to be quite stable, even on a production with over 2,000 4K clips. I wouldn’t recommend upgrading if you are in the middle of a production. The upgraded Libraries I tested did exhibit some flakiness that wasn’t there in freshly created Libraries. There’s also a technique to keep both 10.2 and 10.3 active on the same computer. Definitely trash your preferences before diving in.

So far, the plug-ins and Motion templates still work, but you’ll definitely need to check whether these vendors have issued updates designed for this release. This also goes for the third-party apps, like those from Intelligent Assistance, because 10.3 adds a new version of FCPXML. Both Intelligent Assistance and Blackmagic Design issued updates (for Resolve and Desktop Video) by the next day.

There are a few user interface bugs, but no show-stoppers. For instance, the application doesn’t appear to hold its last state upon close, especially when more than one Library is open. When you open it again the next time, the wrong Library may be selected or the wrong Project loaded in the timeline. It occasionally loses focus on the pane selected. This is an old bug that was there in previous versions. You are working in the timeline and all of a sudden nothing happens, because the application “forgot” which pane it’s supposed to have focus on. Clicking command-1 seems to fix this. Lastly, the audio meters window doesn’t work properly. If you resize it to be slimmer, the next time you launch FCP X, the meters panel is large again. That’s even if you updated the workspace with this smaller width. And then sometimes they don’t display audio until you close and reopen the audio meters window.

In this round of testing, I’ve had to move around Libraries with external media to different storage volumes. This requires media relinking. While it was ultimately successful, the time needed to relink was considerably longer than doing this same task in other NLEs.

My test units are all connected to Blackmagic Design i/o hardware, which seems to retard performance a bit. With a/v output turned off within the FCP X interface, clips play right away without stuttering when I hit the spacebar. With the a/v output on, I randomly get stuttering on clips when they start to play. It’s only a minor nuisance, so I just turn it off until I need to see the image on an external monitor. I’ve been told that AJA hardware performs better with FCP X, but I haven’t had a chance to test this myself. In any case, I don’t see this issue when running the same media through Premiere Pro on the exact same computer, storage and i/o hardware.

Final Cut Pro X 10.3 will definitely please most of its fans. There’s a lot of substance and improvement to be appreciated. It also feels like it’s performing better, but I haven’t had enough time with a real project yet to fully test that. Of course, the users who probe a bit deeper will point to plenty of items that are still missing (and available in products like Premiere Pro), such as better media relinking, more versatile replace edit functions and batch exporting.

For editors who’ve only given it a cursory look in the past or were swayed by the negative social media and press over the past five years, this would be the version to re-evaluate. Every new or improved item is targeted at the professional editor. Maybe it’s changed enough to dive in. On the other hand, if you’re an editor who’s given FCP X a fair and educated assessment and just not found it to your liking or suitable for your needs, then I doubt 10.3 will tempt you. Regardless, this gives fans some reassurance about Apple’s commitment to professional users of their software – at least for another five years.

If you have the time, there are plenty of great tips here at the virtual Final Cut User Group.

The new Final Cut Pro X 10.3 user manual can be found here.

Click here for additional links highlighting features in this update.

Originally written for Digital Video magazine / Creative Planet Network

©2016 Oliver Peters

Tools for Dealing with Media

Although most editing application manufacturers like to tout how you can just go from camera to edit with native media, most editors know that’s a pretty frustrating way to work. The norm these days is for the production team to use a whole potpourri of professional and prosumer cameras, so it’s really up to the editor to straighten this out before the edit begins. Granted, a DIT could do all of this, but in my experience, the person being called a DIT is generally just someone who copies and backs up the camera cards onto hard drives to bring back from the shoot. As an editor you are most likely to receive a drive with organized copies of the camera media cards, but still with the media in its native form.

Native media is fine when you are talking about ARRI ALEXA, Canon C300 or even RED files. It is not fine when coming from a Canon 5D, DJI, iPhone, Sony A7S, etc. The reason is that these systems record long-GOP media without valid timecode. Most do not generate unique file names. In some cases, there is no proper timebase within the files, so time itself is “rubbery” – meaning, a frame of time varies slightly in true duration from one frame to the next.

If you remove the A7S .mp4 files from within the clutter of media card folders and take these files straight into an NLE, you will get varying results. There is a signal interpreted as timecode by some tools, but not by others. Final Cut Pro X starts all of these clips at 00:00:00:00, while Premiere Pro and Resolve read something that is interpreted as timecode, which ascends sequentially on successive clips. Finally, these cameras have no way to deal with off-speed recordings – for example, when a higher frame rate is recorded with the intent to play it back in slow motion. You can do that with a high-end camera, but not with these prosumer products. So I’ve come to rely on several software products heavily in these types of productions.
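Before deciding what needs to be transcoded, it helps to see exactly what a camera file contains. The sketch below is only an illustration using ffprobe – a tool choice and clip name of my own, not something mentioned above. A mismatch between the nominal and average frame rates is a telltale sign of that “rubbery” timebase, and a missing timecode tag explains why different NLEs interpret these clips differently.

```python
import json
import subprocess

def inspect_clip(path):
    """Report frame rate and timecode metadata for a camera file using ffprobe."""
    cmd = [
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-show_entries", "stream=r_frame_rate,avg_frame_rate:stream_tags=timecode",
        "-of", "json", path,
    ]
    stream = json.loads(subprocess.check_output(cmd))["streams"][0]
    print(path)
    print("  nominal rate :", stream.get("r_frame_rate"))
    print("  average rate :", stream.get("avg_frame_rate"))   # differs from nominal on "rubbery" clips
    print("  timecode     :", stream.get("tags", {}).get("timecode", "none"))

inspect_clip("C0001.MP4")   # hypothetical A7S clip name
```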

Step 1 : Hedge for Mac

The first step in any editing is to get the media from the field drives onto the edit system drives. Hopefully your company’s SOP is to archive this media from the field in addition to any that comes out of the edit. However, you don’t want to edit directly from these drives. When you do a Finder copy from one drive to the next there is no checksum verification. In other words, the software doesn’t actually check to make sure the copy is exact without errors. This is the biggest plus for an application like Hedge – copy AND verification.

Hedge comes in a free and a paid version. The free version is useful, but copy and verify is slower than the paid version. The premium (paid) version uses a software component that they call Fast Lane to speed up the verification process so that it takes roughly the same amount of time as a Finder copy, which has no verification. To give you an idea, I copied a 62GB folder from a USB2.0 thumb drive to an external media drive connected to my Mac via eSATA (through an internal card). The process took under 30 minutes for a copy through Hedge (paid version) – about the same as it took for a Finder copy. Using the free version takes about twice as long, so there’s a real advantage to buying the premium version of the application. In addition, the premium version works with NAS and RAID systems.

The interface is super simple. Sources and targets are drag-and-drop. You can specify folders within the drives, so it’s not just a root-level, drive-to-drive copy. Multiple targets and even multiple sources can be specified within the same batch. This is great for creating a master as well as several back-up copies. Finally, Hedge generates a transfer log for written evidence of the copies and verification performed.
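Hedge’s internals aren’t public, so the sketch below is only a minimal illustration of the principle behind checksum-verified copying (with hypothetical folder paths; real tools use faster hashes and smarter scheduling). The point is the read-back comparison at the end – the step a plain Finder copy never performs.

```python
import hashlib
import shutil
from pathlib import Path

def verified_copy(src, dst_dir, chunk=8 * 1024 * 1024):
    """Copy one file and confirm the destination's checksum matches the source."""
    def checksum(path):
        h = hashlib.md5()
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(chunk), b""):
                h.update(block)
        return h.hexdigest()

    dst = Path(dst_dir) / Path(src).name
    shutil.copy2(src, dst)                      # copy, preserving timestamps
    ok = checksum(src) == checksum(dst)         # read both files back and compare
    print(f"{Path(src).name}: {'verified' if ok else 'MISMATCH'}")
    return ok

# Hypothetical card and target folders; a tool like Hedge also logs every transfer.
for clip in sorted(Path("/Volumes/CARD_A01").rglob("*.MP4")):
    verified_copy(clip, "/Volumes/EditMedia/Incoming")
```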

Step 2 : EditReady

Now that you have your media copies, it’s time to process the prosumer camera media into something more edit-friendly. Since the camera-original files are being archived, I don’t generally save both the original and converted files on my edit system. For all intents and purposes, the new, processed files become my camera media. I’ve used tools like MPEG Streamclip in the past. That still works well, but EditReady from Divergent Media is better. It reads many media formats that other players don’t and it does a great job writing ProRes media. It will do other formats, too, but ProRes is usually the best format for projects that I work with.

One nice benefit of EditReady is that it offers additional processing functions. For example, if you want to bake a LUT into the transcoded files, there’s a function for that. If you shot at 29.97 but want the files to play at 23.976 inside your NLE, EditReady enables you to retime the files accordingly. Since Divergent Media also makes ScopeBox, you can get a bundle with both EditReady and ScopeBox. Through a software conduit called ScopeLink, clips from the EditReady player show up in the ScopeBox viewer and its scopes, so you can make technical evaluations right within the EditReady environment.

EditReady uses a drag-and-drop interface that allows you to set up a batch for processing. If you have more than one target location or process chain, simply open up additional windows for each batch that you’d like to set up. Once these are fired off, all processes will run simultaneously. The best part is that these conversions are fast, resulting in reliable transcoded media in an edit-friendly format.
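If you’d rather script this kind of batch conversion, something similar can be approximated with ffmpeg – an assumption on my part, since EditReady is the tool described above and its processing is its own. The folder names, the ProRes flavor and the optional LUT below are all hypothetical choices.

```python
import subprocess
from pathlib import Path

SOURCE = Path("/Volumes/EditMedia/Incoming")   # hypothetical folders
TARGET = Path("/Volumes/EditMedia/ProRes")

def transcode_to_prores(clip, lut=None):
    """Transcode one clip to ProRes 422 with ffmpeg, optionally baking in a 3D LUT."""
    out = TARGET / (clip.stem + ".mov")
    filters = ["-vf", f"lut3d={lut}"] if lut else []
    cmd = [
        "ffmpeg", "-i", str(clip),
        *filters,
        "-c:v", "prores_ks", "-profile:v", "2",   # profile 2 = ProRes 422
        "-c:a", "pcm_s16le",                      # uncompressed PCM audio
        str(out),
    ]
    subprocess.run(cmd, check=True)

for clip in sorted(SOURCE.glob("*.MP4")):
    transcode_to_prores(clip)   # e.g. transcode_to_prores(clip, lut="log_to_rec709.cube")
```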

Step 3: Better Rename

The last step for me is usually to rename the files. I won’t do this with formats like ALEXA ProRes or RED, but it’s essential for 5D, DJI and other similar cameras. That’s because these cameras normally don’t generate unique file names. After all, you don’t want a bunch of clips that are named C0001 with a starting timecode of 00:00:00:00 – do you?

While there are a number of batch renaming applications and even Automator scripts that you can create, my preferred application is Better Rename, which is available in the Mac App Store. It has a host of functions to change names, add numbered sequences and append a text prefix or suffix to a name. The latter option is usually the best choice. Typically I’ll drag my camera files from each group into the interface and append a prefix that adds a camera card identifier and a date to the clip name. So C0001 becomes A01_102916_C0001. A clip from the second card would change from C0001 to A02_102916_C0001. It’s doubtful that the A camera would shoot more than 99 cards in a day, but if so, you can adjust your naming scheme accordingly.
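Better Rename is a GUI utility, but the same prefixing step is simple to script if you prefer. This is only a sketch with hypothetical folder, card and date values, mirroring the A01_102916_C0001 pattern described above.

```python
from pathlib import Path

def prefix_clips(folder, camera_card, shoot_date):
    """Prepend a card identifier and date, so C0001.MP4 becomes A01_102916_C0001.MP4."""
    for clip in sorted(Path(folder).glob("*.MP4")):
        new_name = f"{camera_card}_{shoot_date}_{clip.name}"
        clip.rename(clip.with_name(new_name))
        print(clip.name, "->", new_name)

prefix_clips("/Volumes/EditMedia/ProRes/CardA01", "A01", "102916")   # hypothetical values
```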

There you go. Three simple steps to bulletproof how you work with media.

©2016 Oliver Peters

Audio Splits and Stems in Premiere Pro

When TV shows and feature films are being mixed, the final deliverables usually include audio stems as separate audio files or married to a multi-channel video master file or tape. Stems are the isolated submix channels for dialogue, sound effects and music. These elements are typically called DME (dialogue, music, effects) stems or splits and a multi-channel master file that includes these is usually called a split-track submaster. These isolated tracks are normally at mix level, meaning that you can combine them and the sum should equal the same level and mix as the final composite mixed track.
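That “stems sum back to the mix” property can be checked with a simple null test. The sketch below is my own illustration – hypothetical file names, and it assumes the soundfile Python package – not part of any delivery spec: subtract the summed stems from the composite mix and the residual should sit at or near digital silence.

```python
import numpy as np
import soundfile as sf   # assumes the soundfile package is installed

# Hypothetical exports: the composite mix plus the three isolated stems.
mix, sr = sf.read("full_mix.wav")
dia, _ = sf.read("stem_dialogue.wav")
sfx, _ = sf.read("stem_effects.wav")
mus, _ = sf.read("stem_music.wav")

# If the stems are truly at mix level, their sum nulls against the mix.
residual = mix - (dia + sfx + mus)
peak_dbfs = 20 * np.log10(np.max(np.abs(residual)) + 1e-12)
print("peak residual: {:.1f} dBFS".format(peak_dbfs))   # should be at or near digital silence
```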

The benefit of having such stems is that you can easily replace elements, like re-recording dialogue in a different language, without having to dive back into the original audio project. The simplest form is to have 3 stereo stem tracks (6 mono tracks) for left and right dialogue, sound effects and music. Obviously, if you have a 5.1 surround mix, you’ll end up with a lot more tracks. There are also other variations for sports or comedy shows. For example, sports shows often isolate the voice-over announcer material from an on-camera dialogue. Comedy shows may isolate the laugh track as a stem. In these cases, rather than 3 stereo DME stems, you might have 4 or more. In other cases, the music and effects stems are combined to end up with a single stereo M&E track (music and effects minus dialogue).

Although this is common practice for entertainment programming, it should also be common practice if you work in short films, corporate videos or commercials. Creating such split-track submasters at the time you finish your project can often save your bacon at some point down the line. I ran into this during the past week. A large corporate client needed to replace the music tracks on 11 training videos. These videos were originally edited in 2010 using Final Cut Pro 7 and mixed in Pro Tools. Although it may have been possible to resurrect the old project files, doing so would have been problematic. However, in 2010, I had exported split-track submasters with the final picture and isolated stereo tracks for dialogue, sound effects and music. These have become the new source for our edit – now 6 years later. Since I am editing these in Premiere Pro CC, it is important to also create new split-track submasters, with the revised music tracks, should we ever need to do this again in the future.

Setting up a new Premiere Pro sequence 

I’m usually editing in either Final Cut Pro X or Premiere Pro CC these days. It’s easy to generate a multi-channel master file with isolated DME stems in FCP X, by using the Roles function. However, to do this, you need to make sure you properly assign the correct Roles from the get-go. Assuming that you’ve done this for dialogue, sound effects and music Roles on the source clips, then the stems become self-sorting upon export – based on how you route a Role to its corresponding export channel. When it comes to audio editing and mixing, I find Premiere Pro CC’s approach more to my liking. This process is relatively easy in Premiere, too; however, you have to set up a proper sequence designed for this type of audio work. That’s better than trying to sort it out at the end of the line.

The first thing you’ll need to do is create a custom preset. By default, sequence presets are configured with a certain number of tracks routed to a stereo master output. This creates a 2-channel file on export. Start by changing the track configuration to multi-channel and set the number of output channels. My requirement is to end up with an 8-channel file that includes a stereo mix, plus stereo stems for isolated dialogue, sound effects and music. Next, add the number of tracks you need and assign them as “standard” for the regular tracks or “stereo submix” for the submix tracks.

This is a simple example with 3 regular tracks and 3 submix tracks, because this was a simple project. A more complete project would have more regular tracks, depending on how much overlapping dialogue or sound effects or music you are working with on the timeline. For instance, some editors like to set up “zones” for types of audio. You might decide to have 24 timeline tracks, with 1-8 used for dialogue, 9-16 for sound effects and 17-24 for music. In this case, you would still only need 3 submix tracks for the aggregate of the dialogue, sound effects and music.

Rename the submix tracks in the timeline. I’ve renamed Submix 1-3 as DIA, SFX and MUS for easy recognition. With Premiere Pro, you can mix audio in several different places, such as the clip mixer or the audio track mixer. Go to the audio track mixer and assign the channel output and routing. (Channel output can also be assigned in the sequence preset panel.) For each of the regular tracks, I’ve set the pulldown for routing to the corresponding submix track. Audio 1 to DIA, Audio 2 to SFX and Audio 3 to MUS. The 3 submix tracks are all routed to the Master output.

The last step is to properly assign channel routing. With this sequence preset, master channels 1 and 2 will contain the full mix. First, when you export a 2-channel file as a master file or a review copy, by default only the first 2 output channels are used. So these will always get the mix without you having to change anything. Second, most of us tend to edit with stereo monitoring systems. Again, output channels 1 and 2 are the default, which means you’ll always be monitoring the full mix, unless you make changes or solo a track. Output channels 3-8 correspond to the stereo stems. Therefore, to enable this to happen automatically, you must assign the channel output in the following configuration: DIA (Submix 1) to 1-2 and 3-4, SFX (Submix 2) to 1-2 and 5-6, and MUS (Submix 3) to 1-2 and 7-8. The result is that everything goes to both the full mix, as well as the isolated stereo channel for each audio component – dialogue, sound effects and music.
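All of that routing is set in Premiere’s track mixer UI, so there’s nothing to script, but the mapping is easier to sanity-check written out. This is just the assignments described above summarized in code form, not anything Premiere itself reads.

```python
# Output-channel routing for the 8-channel split-track submaster.
# Channels 1-2 always carry the composite mix; 3-8 carry the isolated stems.
routing = {
    "DIA (Submix 1)": [(1, 2), (3, 4)],
    "SFX (Submix 2)": [(1, 2), (5, 6)],
    "MUS (Submix 3)": [(1, 2), (7, 8)],
}

for submix, (mix_pair, stem_pair) in routing.items():
    print(f"{submix} -> full mix {mix_pair}, isolated stem {stem_pair}")
```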

Editing in the custom timeline

Once you’ve set up the timeline, the rest is easy. Edit any dialogue clips to track 1, sound effects to track 2 and music to track 3. In a more complex example, like the 24-track timeline I referred to earlier, you’d work in the “zones” that you had organized. If 1-8 are routed to the dialogue submix track, then you would edit dialogue clips only to tracks 1-8. Same for the corresponding sound effects and music tracks. Clip levels can still be adjusted as you normally would. But by having submix tracks, you can adjust the level of all dialogue by moving the single DIA submix fader in the audio track mixer. This can also be automated. If you want a common filter added to all of one stem – like a compressor across all sound effects – simply assign it from the pulldown within that submix channel strip.

Exporting the file

The last step is exporting your split-track submaster file. If this isn’t correct, the rest was all for naught. The best formats to use are either a QuickTime ProRes file or one of the MXF OP1a choices. In the audio tab of the export settings panel, change the pulldown channel selection from Stereo to 8 channels. Now each of your timeline output channels will be exported as a separate mono track in the file. These correspond to your 4 stereo mix groups – the full mix plus stems. Now in one single, neat file, you have the final image and mix, along with the isolated stems that can facilitate easy changes down the road. Depending on the nature of the project, you might also want to export versions with and without titles for an extra level of future-proofing.

Reusing the file

If you decide to use this exported submaster file at a later date as a source clip for a new edit, simply import it into Premiere Pro like any other form of media. However, because its channel structure will be read as 8 mono channels, you will need to modify the file using the Modify-Audio Channels contextual menu (right-click the clip). Change the clip channel format from Mono to Stereo, which turns your 8 mono channels back into the left and right sides of 4 stereo channels. You may then ignore the remaining “unassigned” clip channels. Do not change any of the check boxes.

Hopefully, by following this guide, you’ll find that creating timelines with stem tracks becomes second nature. It can sure help you years later, as I found out yet again this past week!

©2016 Oliver Peters