Apple 2019 16″ MacBook Pro

Creatives in all fields are the target market for Apple’s MacBook Pro product line. Apple introduced the 16″ model to replace its 15″ predecessor in late 2019. This new model is not only a serious tool for location work but is powerful enough to form the hub of your edit suite, whether on-site or at a fixed facility.

Nuts and bolts

The 2019 MacBook Pro comes in 6-core (Intel Core i7) and 8-core (Intel Core i9) configurations. It boasts up to 64GB RAM, an AMD Radeon Pro 5500M series GPU with up to 8GB of GDDR6 VRAM, and can be equipped with up to 8TB of internal SSD storage. Prices start at under $2,800 USD*, but the full monty rings up at about $6,500 USD*. The high-capacity SSD options account for the most expensive configurations, but those sizes may be overkill for most users. A more realistic price for a typical editor’s 8-core configuration would be about $3,900 USD* – just stick to a 1TB internal SSD and back the RAM down to 32GB. Quite frankly, the 6-core is likely to be sufficient for many editing and design tasks. (*With AppleCare warranty, but no local VAT or sales taxes added.)

While there are great PC laptop choices, it’s very hard to make direct comparisons to laptops with all of these same components. Few PC laptops offer this much RAM or SSDs that large, for example. When you can make a direct comparison, name brand PC laptops are often more expensive. And remember, great gaming PCs are not necessarily the best editing machines and vice versa.

What’s improved?

Apple loaned me a space gray, 8-core 16″ MacBook Pro configured with 64GB RAM, the AMD GPU with 8GB VRAM, and a 4TB internal SSD. I own a mid-2014 MacBook Pro as the center of my edit system at home. The two MacBook Pros are physically very similar. They are nearly identical in size, with the 2019 MacBook Pro a bit thinner and lighter. The keyboard footprint is the same as my five-year-old model, though the keys are slightly larger on the new laptop, with less space between them. Apple tweaked the keyboard mechanism on the 16″ model. Typing feels about the same between these two, though the keys on the new machine are quieter. Plus, it has a much bigger trackpad and the Touch Bar.

The Retina screen is housed in a lid about the same size as the 15″ model’s. Its 16″ diagonal spec is achieved by using smaller bezels. Of course, the newer display is also higher density, resulting in 3072 x 1920 pixels at 226 ppi. This 500-nit, P3 display offers True Tone, thanks to macOS Catalina. True Tone alters the color temperature based on ambient light. It’s easy on the eyes, because it warms up the image in standard room lighting. However, I would discourage enabling it while working on projects requiring critical color accuracy.

The MacBook Pro features four Thunderbolt 3/USB-C ports, like its 15″ predecessor. These connect peripherals and/or power. I don’t know whether Apple had room to add standard USB-A ports or if there was a trade-off for space needed for cooling or the improved speaker system. Maybe it was just a decision to move forward regardless of some short-term inconvenience. In any case, if you purchase a MacBook Pro, plan on also buying a few adapters and cables, a USB hub, and/or a Thunderbolt 3 dock.

Performance testing

How does the 2019 MacBook Pro stack up against a desktop Mac, like the 10-core 3.0 GHz 2017 iMac Pro used at my daily editing gig? Both have 64GB RAM, but the iMac Pro has 16GB of VRAM. I tested with both the internal SSDs and an external USB 3.0 GDRIVE SSD. Both internal drives clocked well over 2500 MB/sec, while the GDRIVE was in the 400-500 MB/sec range.
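
Those throughput figures came from the usual drive benchmarking utilities. If you want a rough sanity check of a drive without one, a minimal Python sketch like the following will do. This is just an illustration – a dedicated tool like Blackmagic’s Disk Speed Test is more rigorous:

```python
# Quick-and-dirty sequential write test: writes ~1GB, times it,
# and reports MB/sec. Not a substitute for a proper benchmark tool.
import os
import time

def write_speed(path, total_mb=1024, chunk_mb=64):
    chunk = os.urandom(chunk_mb * 1024 * 1024)  # random data defeats compression
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(total_mb // chunk_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())  # make sure the data actually reaches the drive
    elapsed = time.perf_counter() - start
    os.remove(path)
    return total_mb / elapsed

# Point the path at the drive you want to test.
print(f"{write_speed('/Volumes/GDRIVE/speedtest.bin'):.0f} MB/sec sequential write")
```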

My “benchmark” tests included BruceX for Final Cut Pro X, Puget Systems’ After Effects benchmark, and two of Simon Ubsdell’s Motion tutorial projects. For the “real world” tests, I used a travelogue series sizzle edit with 4K (and larger) media, various codecs, scaling, speed changes, and color correction. The timeline was exported to ProRes and H.264 using various NLEs. My final test sequence was a taxing 6K FCPX timeline composed of nine layers of 6K RED raw files.

Until the release of the new Mac Pro tower, the iMac Pro had been Apple’s most powerful desktop computer. Yet, in nearly all of these tests, the 16″ MacBook Pro equaled or slightly bettered the times of the iMac Pro. The laptop had faster export times and a higher Puget score with After Effects. One exception was my nine-layer 6K RED project, where the iMac Pro shined – exporting twice as fast as the MacBook Pro. Both Macs played back and scrubbed through this variety of files with ease, regardless of application or internal versus external drive. Overall, the biggest difference I noticed was that exports on the iMac Pro stayed quiet throughout, while the MacBook Pro frequently had to rev up the fans.

Apple claims up to 11 hours of battery life, but that’s really just during light-duty computing – checking e-mail, writing, surfing the web, and so on – and only if you’ve optimized your energy settings for battery life. The MacBook Pro includes an integrated Intel GPU and employs automatic graphics switching. During tasks that don’t generate a heavy GPU load, the machine runs on the integrated graphics. It switches to the AMD GPU for apps like Final Cut or Premiere. I deliberately set up a looping 4K sequence in FCPX and found that the battery drained from 100% down to 10% in about two hours. While this is more stress than normal editing, it’s typical behavior for creative applications. The bottom line is that you shouldn’t expect to edit for 11 hours straight while running only on the internal battery.
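
The arithmetic on that drain test spells out the gap between the spec and heavy-duty reality:

```python
# Battery-drain arithmetic from the looping 4K playback test.
drained_points = 100 - 10        # percentage points consumed
hours = 2.0                      # elapsed time of the test
rate = drained_points / hours    # = 45 points per hour under heavy GPU load
full_runtime = 100 / rate        # ~2.2 hours, versus the 11-hour light-duty claim
print(f"{rate:.0f} points/hour, about {full_runtime:.1f} hours of heavy-duty runtime")
```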

Final thoughts

This machine comes with macOS Catalina pre-installed – don’t plan on downgrading it. Catalina is the first version of macOS that is purely 64-bit, with significant under-the-hood security changes. All 32-bit dependencies have been dropped, which means that some software and hardware products might not be fully compatible. So, check first. I would recommend a clean install of apps and plug-ins rather than migrating from another machine’s drive. I did a clean installation of apps through my Apple and Adobe CC IDs and was ready to go in half a day. Conversely, I’ve heard stories from others who pulled their hair out for a week trying to migrate between machines and OS versions. There are ways to do this and most of the time they work, but why not just start out fresh when you get a new machine?

Apple’s ProApps, Adobe Creative Cloud software, and DaVinci Resolve are all ready to go. Media Composer editors have a slight wait. Avid is beta testing a Catalina-compatible version of Media Composer (at the time of this writing). Some functionality won’t be there at the start, but updates will quickly add in many of those missing features.

Mac owners who bought a recent 15″ MacBook Pro will see a definite boost, but probably not enough to justify refreshing their systems yet. But if, like me, you are running a five-year-old model, then this unit becomes very tempting, especially when working with anything more taxing than ProRes 1080p files. For work on the go, like on-site editing, DIT tasks, photo processing, or other creative tasks, the 2019 MacBook Pro easily fits the bill. Many editors may also need a machine that can easily shift from the field to the office/home/studio in place of an iMac or iMac Pro. If that’s you, then the MacBook Pro has the horsepower. Plug it into a Thunderbolt dock, an external display, or other peripherals, and you are ready to go – no desktop computer needed.

Originally written for RedShark News.

©2020 Oliver Peters

A First Look at Postlab Cloud

Apple developed Final Cut Pro X around single-editor workflows. As such, professional editing teams who wanted to use this tool for collaborative editing have been challenged to develop their own solutions. One approach was Postlab, which was developed in-house at Dutch broadcaster Evangelische Omroep (EO). In order to expand the product into a commercial application, lead developer Jasper Siegers decided to move it under the Hedge umbrella. This required the app to be rebuilt with new code before it could be offered to the FCPX market. That time has come and Postlab is now available as Postlab Cloud.

As the name implies, Postlab Cloud hosts your FCPX libraries “in the cloud,” i.e. on Postlab’s servers. Some production companies or broadcasters are reluctant to have their editing computers connected online, but it’s important to note that only libraries – no media or cache files – are hosted by Postlab. This keeps the transfer times fast and the file sizes light. Cache and media files stay local, whether on your machine or on connected shared storage. Postlab sets up accounts based on site licenses and numbers of users. Each user is assigned a log-in based on an e-mail address and a password. This means that a production hosted by Postlab can be accessed by authorized users anywhere in the world, provided there’s a viable internet connection.

The owner of the account can set up productions and organize them within folders. Each production is a collection or bundle of one or more Final Cut Pro X libraries. If you have ever worked with Final Cut Server in the FCP7 days, then the Postlab workflow is very similar. Once a production has been created, an editor can log in, download the library (a check-out step), edit in it, and then upload the changed version (a check-in step). As part of this upload, the Postlab interface prompts you to add comments describing the work you’ve done. Only one editor at a time can download a library and have write access; however, other users can still download it with read-only access. If you have two editors ping-ponging work on the same library file, then one has to upload it (check in) before the other editor can download it (check out) for their edits.
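
To make the check-out/check-in mechanics concrete, here’s a minimal sketch of the single-writer, many-readers model described above. Postlab’s actual implementation isn’t public, so treat every name here as hypothetical:

```python
# Hypothetical model of Postlab-style library check-out/check-in:
# one editor holds write access at a time, everyone else reads,
# and each check-in records a new version with a comment.
class Production:
    def __init__(self, name):
        self.name = name
        self.versions = []          # history of (version, editor, comment)
        self.checked_out_by = None  # editor currently holding write access

    def check_out(self, editor):
        """Grant write access, but only if nobody else holds it."""
        if self.checked_out_by is not None:
            raise RuntimeError(f"Locked by {self.checked_out_by}; read-only copy available")
        self.checked_out_by = editor

    def download_read_only(self):
        """Anyone may grab the latest version, without write access."""
        return self.versions[-1] if self.versions else None

    def check_in(self, editor, comment):
        """Upload the changed library, log a comment, release the lock."""
        if self.checked_out_by != editor:
            raise RuntimeError(f"{editor} does not hold the lock")
        self.versions.append((len(self.versions) + 1, editor, comment))
        self.checked_out_by = None

prod = Production("Travelogue Sizzle")
prod.check_out("Editor A")
prod.check_in("Editor A", "Tightened act one, new music")
prod.check_out("Editor B")   # only possible after Editor A checked in
```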

Getting started

I decided to test Postlab Cloud in two scenarios: a) multiple workstations connected to a shared storage network, and b) two disconnected editors collaborating over long distances. To start, once an account has been established, any editor using Postlab Cloud must install the small Postlab application. Since the app controls some of Final Cut’s functions, you will be prompted to enable GUI Scripting in your privacy preferences. In order for Postlab to work properly, media and cache files need to be outside of the library bundle. When you first download a library, you may be prompted to change your settings. In a networked environment with media on shared storage, the path to the media should be the same on each workstation. This means when Editor A finishes and checks in the production and then Editor B checks it back out, you generally will not need to relink the media files on Editor B’s system. Therefore, this edit collaboration can proceed fluidly.

Once a production has been downloaded, the library file exists as a temporary file on the local machine and not the network. This means that Postlab can still work in tandem with storage solutions that don’t normally perform well with FCPX libraries. In addition to this temporary library file, the Final Cut backup library is also stored in the location you have designated. If you are working in a networked, collaborative environment, then the advantage Postlab offers is version tracking and the ability for multiple users to open a library file (only one with write access).

Long distance

The second scenario is working with other editors outside of your facility. The first step is to get the media to the outside editor. You could certainly send a drive, but that isn’t efficient in time or cost, especially across continents. If you only need creative editing and not finishing services, then low-res proxy files are fine. So I converted my 4K UHD ProRes HQ files to 960 x 540 H.264 (3Mbps) files and used Frame.io to transfer them over the internet. The key to proper relinking when you are done is to set audio to pass-through when converting these files. This was a double-system sound shoot, so I uploaded both the H.264 video files and the sound recordist’s WAV files to Frame and then downloaded them again at the other end (my home). Now I had media in both locations. The process would be the same even if it were two editors in two different countries.
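
That conversion step can be scripted with any transcoder. As one illustration – assuming ffmpeg as the transcoder, though any tool that can scale to 960 x 540, cap the bitrate around 3Mbps, and pass the audio through will do – the proxy pass might look like this:

```python
# Hypothetical proxy generation: 4K UHD ProRes HQ masters down to
# 960x540 H.264 at ~3Mbps, with audio copied (not re-encoded) so
# that relinking back to the camera originals stays accurate.
import subprocess
from pathlib import Path

def make_proxy(src: Path, dst_dir: Path) -> Path:
    dst = dst_dir / (src.stem + "_proxy.mov")
    subprocess.run([
        "ffmpeg", "-i", str(src),
        "-vf", "scale=960:540",        # proxy frame size
        "-c:v", "libx264", "-b:v", "3M",
        "-c:a", "copy",                # audio pass-through (the key to relinking)
        str(dst),
    ], check=True)
    return dst

for clip in Path("camera_masters").glob("*.mov"):
    make_proxy(clip, Path("proxies"))
```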

The first Postlab step is to create and upload this FCPX library. Once that has been established, any authorized user with a Postlab log-in can access the production. I decided to go back and forth on this production between my home and the facility, using different user log-ins – thus simulating a team of remote editors. Each time I did this, version changes were tracked by Postlab. If I were working with multiple editors, I would have been able to see what tasks each had performed.

It’s important to note that when you collaborate in this way, each editor should be using the same effects, LUTs, and Motion templates, otherwise some things will appear offline. Since the path to the media was different at home versus at the facility, each time I went between the two, checking in and then checking out the production, media files would appear offline. A simple relink fixed this, but it’s something to be aware of. Once totally done, I could relink to the high-res camera files and “finish” the project back at the office.

Wrap-up

When you upload a library back to Postlab, that open FCPX library is closed within Final Cut Pro X on your system, because you have checked it back in. Once you log out of Postlab, the temporary library file is moved to the trash. If you need a local version of the library, then export it from the Postlab app.

Once you get the hang of it, collaboration is simple using Postlab Cloud. Library files stay light, without the sort of corruption caused by syncing libraries through services like Dropbox. My test project included synchronized multi-cam clips and multi-channel audio. Each time during this exchange, clips, projects, and edits showed up as expected when moving between the various users. Whether or not Apple ever tackles collaboration within Final Cut Pro X is an unknown. But why wait? If you need that today, then Postlab Cloud offers a solid answer.

The relaunched Postlab Cloud includes three plans, which are priced per user/per year: Postlab, Postlab Pro, and Postlab Server. The first tier only allows for library version tracking and sharing. Pro allows for a lot more libraries to be shared and comes with more features. Server is a dedicated Postlab Cloud server for larger teams or those that require IT-specific features like Active Directory. Finally, Hedge/Postlab plans to ship a local version of Postlab – designed for use within local networks – soon after launch.

Postlab has now expanded to include Premiere Pro users.

Check out the Postlab tutorials for more information.

The article was originally written for FCP.co.

©2020 Oliver Peters

Video Technology 2020 – Shared Storage

Shared storage used to be the domain of “heavy iron” facilities, with Avid, Facilis, and earlier Apple Xserve systems providing the horsepower. Thanks to advances in networking and Ethernet technology, shared storage is now accessible to any user. Whether built-in or via adapters, modern computers can tap into 1Gbps, 10Gbps, and even higher networking speeds. Most computers can natively access Gigabit Ethernet networks (1Gbps) – adequate for SD and HD workflows. Computers designed for the pro video market increasingly sport built-in 10GbE ports, enabling comfortable collaboration with 4K media and up. Some of today’s most popular shared storage vendors include QNAP, Synology, and LumaForge.
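
A little arithmetic shows why Gigabit is fine for HD while 4K pushes you toward 10GbE. The codec figures below are approximate, based on Apple’s published ProRes data rates:

```python
# How many simultaneous ProRes streams fit on a network link?
PRORES_HQ_1080 = 220   # Mbps, 1920x1080 at 29.97fps (approximate)
PRORES_HQ_UHD  = 707   # Mbps, 3840x2160 at 29.97fps (approximate)

def streams(link_gbps, stream_mbps, efficiency=0.8):
    """Usable simultaneous streams, allowing ~20% protocol overhead."""
    return int(link_gbps * 1000 * efficiency // stream_mbps)

print(streams(1, PRORES_HQ_1080))   # Gigabit: 3 HD streams
print(streams(1, PRORES_HQ_UHD))    # Gigabit: 1 UHD stream, with no headroom
print(streams(10, PRORES_HQ_UHD))   # 10GbE: ~11 UHD streams
```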

This technology will become more widespread in 2020, with systems easier to connect and administer, making shared storage as plug-and-play as any local drive. Network Attached Storage (NAS) systems can service a single workstation or multiple users. In fact, companies like QNAP even offer consumer versions of these products designed to operate as home media servers. Even LumaForge sells a version of its popular Jellyfish through the online Apple Store. A simple online connection guide will get you up and running – no IT department required. This is ideal for the individual editor or small post shop.

Expect 2020 to bring higher connection speeds, such as 40GbE, and even wider NAS proliferation. It’s not just a matter of growth. These vendors are also interested in extending the functionality of their products beyond being a simple bucket for media. NAS systems will become full-featured media hubs. For example, if you are an Avid user, you are familiar with their Media Central concept. In essence, this means the shared storage solution is a platform for various other applications, including the editing software. Additional media applications include management apps for user permission control, media queries, and more. Like Avid, the other vendors are exploring similar extensibility through third-party apps, such as Axle Video, Kyno, Hedge, Frame.io, and others. As such, a shared network becomes a whole that is greater than the sum of its parts.

Along with increased functionality, expect changes in the hardware, too. Modern NAS hardware is largely based on RAID arrays with spinning mechanical drives. As solid state (SSD) storage devices become more affordable, many NAS vendors will offer products featuring RAID arrays configured with SSDs or even NVMe systems – or a mixture of the two, with the SSD-based units used for short-term projects or cache files. Eventually the cost will come down enough that large storage volumes can be cost-effectively populated with only SSDs. Don’t expect to be purchasing 100TB of SSD storage at a reasonable price in 2020; however, that is the direction in which we are headed. At least in this coming year, mechanical drives will still rule. Nevertheless, expect some percentage of your storage inventory to be SSD-based soon.

Click here for more on shared storage solutions.

Originally written for Creative Planet Network.

©2020 Oliver Peters

Video Technology 2020 – The Cloud

The “cloud” is merely a collection of physical data centers in multiple locations around the world – not much different from a small storage center you might have. Of course, they employ more advanced systems for power, redundancy, and security than you do. When you work with one of the companies marketing cloud-based editing or a review-and-approval service, like Frame.io or Wipster, they provide the user-facing interface, but are actually renting storage space from one of the big three cloud providers – Google, Amazon, or Microsoft.

There are three reasons that I’m skeptical about ubiquitous, cloud-based editing (with media at native resolutions) in the short term: upload speeds, cost, and security.

Speed

5G (fifth generation wireless) is the technology predicted to offer adequate speeds and low latency for native 4K (and higher) media. While 5G will be a great advancement for many things, it’s a short-range signal requiring more transmission sites than current wireless technology. Full coverage in most metro areas, let alone widespread geographical coverage worldwide, will take many years to fully deploy. Other than potential camera-to-cloud uploads of proxy media in the field, 5G won’t soon be the killer solution. Current technology still dictates that if you want the fastest possible upload speeds for large amounts of data, then you have to tap as close as possible to the internet’s backbone.

Cost

Cloud storage is cheap, but moving large amounts of data in and out isn’t. Unfortunately, modern video resolutions result in huge amounts of data generated on every shoot. Uploading native 4K media for a week-long production is considerably more expensive than FedEx and overnight charges to ship drives. What about long-term storage? Let’s say that all of your native media is in the cloud and you pay according to a monthly or annual subscription plan. But what if you want to stop? That media will have to be downloaded and stored locally, which will incur data transfer charges, as well as your time to download everything.
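
Putting rough numbers on it makes the point. The per-GB rates below are illustrative assumptions, not any provider’s actual price list:

```python
# Ballpark cost of parking a production's camera originals in the cloud.
STORAGE_PER_GB_MONTH = 0.023   # USD, typical standard object storage (assumed)
EGRESS_PER_GB        = 0.09    # USD, typical download/egress charge (assumed)

shoot_tb = 20                  # a week of 4K acquisition, in TB
months   = 12

storage_cost = shoot_tb * 1000 * STORAGE_PER_GB_MONTH * months
egress_cost  = shoot_tb * 1000 * EGRESS_PER_GB   # the bill to pull it all back down

print(f"Storage for a year: ${storage_cost:,.0f}")   # ~$5,520
print(f"One full download:  ${egress_cost:,.0f}")    # ~$1,800
```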

Security

Think these sites are unequivocally secure? Look at any data hack at a major company. Security is such a concern in our business that most major movie studios won’t let their editors connect their computers to the internet. Many make these editors check their cell phones at the door. No matter how secure the service, it’s going to be a hard sell, except for limited slices of the production, such as cloud-based VFX rendering.

I do believe 2020 will be a year in which many will take advantage of some form of long-distance, cloud-based edit services using low-res proxy media. Increasingly, services will be used to move dailies and deliverables around the globe via the cloud. But that’s a far cry from cloud-based editing becoming the norm. One edit scenario many will experiment with is to store the edit project files in the cloud, but with the media mirrored locally at each edit site. This way only the lightweight files used for edit collaboration need be moved over the internet. Think of this as Google Docs for editing. Adobe already offers a version of this, but I suspect you’ll see others, including solutions for Final Cut Pro X. So while true cloud-based editing is not a near-term solution, bits and pieces will become increasingly commonplace.

Originally written for Creative Planet Network.

©2020 Oliver Peters

Video Technology 2020 – Editing Software

Four editing applications dominate the professional market: Adobe Premiere Pro, Apple Final Cut Pro X, Avid Media Composer, and Blackmagic Design DaVinci Resolve. Established facilities are still heavy Avid users, with Adobe being the up-and-coming choice. This doesn’t mean that Final Cut Pro X lost out. Going into 2020, Apple can tout FCPX as the best-selling version of its professional editing tool. It most likely has three million users after nearly nine years on the market. While pro editors in the US are often reluctant to adopt FCPX, this innovative application has earned wider acceptance in the broader international market.

The three “A”s have been battling for editing market share, but the wild card is Blackmagic Design’s DaVinci Resolve. It started as a high-end color correction application, but through Blackmagic’s acquisitions and fast development pace, Resolve is becoming an all-in-one application rivaling Autodesk Smoke or Avid DS. Recent versions bring enhanced creative editing tools, making it possible to edit, mix, composite, grade, and deliver entirely from Resolve. No need to roundtrip with other applications. Blackmagic is so dedicated to Resolve as an editor that they introduced a special editor keyboard.

Is Resolve attractive enough to sway editors to shift away from other tools? The answer for most in 2020 will still be “no.” Experienced editors have made their choice and all of the current options are quite good. However, Resolve does make the most sense for new users with no prior allegiances. The caveat is advanced finishing. Users may edit in an editing application, but then roundtrip to Resolve and back for grading. Unfortunately these roundtrips can be problematic. So I do think that many will opt to cut creatively in their NLE of choice, but then send to Resolve for the final grade, mix, and VFX work. Expect to see Resolve’s finishing footprint expand in 2020.

Two challenges confront these companies in 2020: multi-user collaboration and high dynamic range (HDR) delivery. Collaboration is an Avid strength, but not so for the other three. Blackmagic and Adobe have an approach to project sharing, but still not what Avid users have come to expect. Apple offers nothing directly, but there are some third-party workarounds. Expect 2020 to yield collaboration improvements for Final Cut Pro X and Premiere Pro.

HDR is a more complex situation requiring specialized hardware for proper monitoring. There simply is no way to accurately view HDR on any computer display. All of these companies are developing software pipelines to deal with HDR, but in 2020, HDR delivery will still require specific hardware that will remain the domain of dedicated color correction facilities.

Finally, as with cameras, AI will become an increasing aspect of post-production software. You already see that in Apple’s shape recognition within FCPX (automatic sorting of wides and close-ups) or Adobe Sensei for content replacement and automatic music editing. Expect to see more of these features introduced in coming software versions.

Originally written for Creative Planet Network.

©2020 Oliver Peters

Every NLE is a Database

Apple’s Final Cut Pro X has spawned many tribal arguments since its launch eight years ago. There have been plenty of debates about the pros and cons of its innovative design and editing model. One claim that I’ve heard a number of times is that FCPX is a relational database, while traditional editing applications are more like an Excel spreadsheet. I can see how the presentation of a bin in the list view format might convey that impression, but that doesn’t make it accurate. A spreadsheet is a grid of cells tied together by mathematical formulas, regardless of whether the information is text or numbers. All nonlinear editing applications (NLEs) use a relational database to track media, although the type and format of this database differs among brands. In all cases, these databases function altogether differently from a spreadsheet.

It started with film

When all editing was done on film, the editors cut work print, which was a positive copy printed from the camera negative. Edits made on the work print were eventually duplicated on the pristine negative by a negative cutter, based on a cut list. Determining where to cut and join the film segments together was based on a list of edit points corresponding to the source rolls of the film, plus a foot+frame count for specific edit points. The work print, which the editors could physically cut and splice as needed, was effectively an abstraction of – and stand-in for – the negative.

In order to enable the process, assistant editors (or in some cases, the editor) created a handwritten log, known as a codebook. This started with the dailies and included all the pertinent information, such as source roll, shoot days/dates, scenes/takes, director’s notes, editor’s notes, and so on. The codebook was a physical database that allowed an editor to know what the options were and where to find them.

During the videotape-editing era prior to NLEs, any sort of database for tracking source information was still manual. Only the cut list portion, known as the edit decision list, could be generated by the edit computer, based on the timecode values recorded on the tape. Timecode became the electronic equivalent of the foot+frame count of physical film.

Fast forward to the modern era with file-based camera acquisition and ubiquitous, inexpensive editing software. The file recorded by the camera is a container of sorts that holds essence (audio and video) and metadata (information about the essence). Some cameras generate a lot of metadata and others don’t. One example of this type of metadata that we all encounter is the information embedded into digital still photos, which can include location, lens data, and a ton more.

When clips are ingested/imported into your NLE – whether into a project, bin, folder, or an event – the NLE links to the essence of the media clips on the hard drive or camera card and brings in whatever clip metadata is understood by that application. In addition, the user can add and merge a lot more metadata derived from other sources, like the sound recorder, script supervisor notes, electronic script, and manually-added data.

The clip that you see in the bin/event/folder is an abstraction for the actual audio and video media, just like work print was for film editors. The bin/folder/event data entries are like the film editor’s codebook and are tracked in the internal database used by that application to cross-reference the clip with the actual stored media. Since a clip in the app’s browser is simply an abstraction, it can appear in multiple places at the same time – in various bins and sequences. The internal database makes sure that each of these instances of the clip all reference the same piece of media accurate down to the video frame or audio sample.
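
A minimal sketch captures the relationship. No NLE’s internal schema literally looks like this, but the idea of clips as lightweight records pointing at a single media entry is the same everywhere:

```python
# Clips as abstractions over media: many clip records (in different
# bins or sequences) can reference the same media file, and the
# database resolves each one down to exact frames.
media_table = {
    "media_001": {"path": "/volumes/raid/A001_C002.mov", "fps": 23.976},
}

clip_table = [
    # Each clip is just a media reference plus in/out points in frames.
    {"name": "Scene12_Take3", "media_id": "media_001", "in": 120, "out": 480, "bin": "Dailies"},
    {"name": "Scene12_Take3", "media_id": "media_001", "in": 120, "out": 480, "bin": "Selects"},
]

# Both bin entries resolve to the same frames of the same file:
for clip in clip_table:
    media = media_table[clip["media_id"]]
    print(clip["bin"], "->", media["path"], f"frames {clip['in']}-{clip['out']}")
```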

It doesn’t matter how the bin looks

The spreadsheet comparison is based on how bins have appeared in most NLEs, including Final Cut Pro “legacy,” Avid Media Composer, and others. Unfortunately, that opinion is usually based on a narrow exposure to other NLEs. As I said, at the core, every NLE is a relational database. And so, there are other things that can be tracked and other ways the data can be displayed.

For instance, older Quantel edit systems displayed source information based on what we would consider a smart search view today. The entirety of the source material was not displayed in front of the editor, since it was a single-screen layout. Entering data into a search field would sift through and present clips matching the requested data.

Avid Media Composer systems also track media through Script Integration (sometimes incorrectly referred to as ScriptSync, which is a separate Avid option). This is a graphical bin layout with the script text displayed on screen and clips linked to the coverage of each scene. Media Composer and now Premiere Pro both permit a freeform clip view for a bin, in which the editor can freely rearrange the position of the clip thumbnails within the bin window. This visual juxtaposition of clips conveys important information to the editor.

All NLEs have multiple ways to present the data and aren’t limited to a grid-style list view that resembles a spreadsheet or a grid of clip thumbnails. Enabling these alternate views takes a lot more than simply cross-referencing your bin and timelines against a set of edit points. That’s where databases come in and why every NLE is built around one.

How can you be in two places at once when you’re not anywhere at all?

My apologies to Firesign Theatre. A huge aspect of the Final Cut Pro X edit workflow is the use of keyword collections. Thanks to them, a clip isn’t limited to living in just a single bin. While this is a selling point for FCPX, it is also well within the capabilities of most NLEs.

Organizing your event (bin) media in FCPX can start by assigning keywords to each clip. Each new keyword used creates a keyword collection – sort of a “smart sub-bin.” As you assign one or more keywords to a clip, FCPX automatically sorts the clip into those corresponding keyword collections.  For example, let’s say you have a series of wide and close-up shots featuring both male and female actors. Clip 1 might be sorted into WIDE and MAN; Clip 2 into WIDE and WOMAN; Clip 3 into WOMAN and CLOSE-UP. So then the keyword collection for WIDE displays Clip 1 and Clip 2; MAN displays Clip 1; WOMAN displays Clip 2 and Clip 3; CLOSE-UP displays Clip 3.
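
In programming terms, that example is simply an inverted index from keywords to clips. A quick sketch – hypothetical, not FCPX’s actual storage format:

```python
# The WIDE/MAN/WOMAN/CLOSE-UP example as an inverted index:
# assigning keywords automatically populates the collections.
from collections import defaultdict

assignments = {
    "Clip 1": ["WIDE", "MAN"],
    "Clip 2": ["WIDE", "WOMAN"],
    "Clip 3": ["WOMAN", "CLOSE-UP"],
}

collections = defaultdict(list)
for clip, keywords in assignments.items():
    for kw in keywords:
        collections[kw].append(clip)   # one clip lands in several collections

print(dict(collections))
# {'WIDE': ['Clip 1', 'Clip 2'], 'MAN': ['Clip 1'],
#  'WOMAN': ['Clip 2', 'Clip 3'], 'CLOSE-UP': ['Clip 3']}
```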

Once this initial step is completed, the editor can view source clips in a more focused manner. Instead of wading through 100 clips in the event (bin) each time, the editor may only have to deal with the 10 clips in the CLOSE-UP keyword collection – or in any other collection. The beauty of FCPX’s interface design is the speed and fluidity with which this can be accomplished. This feature is one of the hallmarks of the application and no other NLE does it nearly as elegantly. In fact, FCPX tackles the challenge of narrowing down the browser options through three methods – ratings, keyword collections, and smart collections (described in this linked tutorial by Simon Ubsdell).

As elegantly as Final Cut tackles this task, that doesn’t mean that other NLEs can’t function in a similar manner. Within Premiere Pro, those exact same keywords can be assigned to the clips. Then simply create a set of search bins using those same keywords as the search criteria. The result is the exact same type of distribution of clips into collections, where multiple clips can appear in multiple bins at the same time. Likewise, the editor doesn’t need to go through the full set of clips in a bin, but can concentrate on the small handful in any given search bin. Media Composer also offers search functions, as well as custom sift routines, which enable you to display only the clips matching specific column details, like a custom keyword.

Most NLEs can only store one set of in/out edit marks per clip within a bin at any given time. On the other hand, Final Cut Pro X offers range-based selection: clips can retain multiple in/out selections at once. Nevertheless, other NLEs aren’t far behind here either. The obvious solution that most editors use when this is needed is to create a subclip, which can be a duplicate of the entire clip or a portion from within a single clip. Need to pull multiple sections of the clip? Simply create multiple subclips. In effect, these are the same as range-based selections in Final Cut Pro X. Admittedly, the FCPX method is more fluid and straightforward. Nevertheless, range-based selections are virtual subclips that are dynamically created by the editor; but unlike subclips, they can’t be moved separately to other events (bins). Two ways to tackle a very similar need.
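
The difference between the two approaches comes down to where the in/out pairs live. A sketch using hypothetical data shapes, not either application’s actual project format:

```python
# Range-based selections vs. subclips, reduced to data shapes.

# FCPX-style: one clip record carries several in/out ranges at once.
fcpx_clip = {
    "name": "Interview_A",
    "ranges": [(100, 250), (400, 520), (900, 1010)],  # frames
}

# Traditional-NLE-style: each pulled section becomes its own subclip,
# a separate bin object that can be moved or copied independently.
subclips = [
    {"name": "Interview_A.sub1", "source": "Interview_A", "in": 100, "out": 250},
    {"name": "Interview_A.sub2", "source": "Interview_A", "in": 400, "out": 520},
    {"name": "Interview_A.sub3", "source": "Interview_A", "in": 900, "out": 1010},
]
```

Same information either way: FCPX keeps it attached to the clip, while subclips externalize it as new bin objects.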

The bottom line is that under the hood, all NLEs are still very much the same. Let me emphasize that I’m not arguing the superiority, speed, or elegance of one approach or tool over another. Every company has its own set of unique features that appeal to different types of editors. They are simply different methods to place information at your fingertips, get roadblocks out of the way, and thus make editing more creative and enjoyable.

©2019 Oliver Peters

Ford v Ferrari

Outraged by a failed attempt to acquire European carmaker Ferrari, Henry Ford II sets out to trounce Enzo Ferrari on his own playing field – automobile endurance racing. Unfortunately, the effort falls short, leading Ford to turn to independent car designer, Carroll Shelby. But Shelby’s outspoken lead test driver, Ken Miles, complicates the situation by making an enemy out of Ford Senior VP Leo Beebe. Nevertheless, Shelby and his team are able to build one of the greatest race cars ever – the GT40 MkII – setting the showdown between the two auto legends at the 1966 24 Hours of Le Mans. Matt Damon and Christian Bale star as Shelby and Miles.

The challenge of bringing this clash of personalities to the screen was taken on by director James Mangold (Logan, Wolverine, 3:10 to Yuma) and his team of longtime collaborators. I recently spoke with film editors Michael McCusker, ACE (Walk the Line, 3:10 to Yuma, Logan) and Andrew Buckland (The Girl on the Train) about what it took to bring Ford v Ferrari together.

_____________________________________________

[OP] The post team for this film has worked with James Mangold on quite a few films. Tell me a bit about the relationship.

[MM] I cut my very first movie, Walk The Line, for Jim 15 years ago and have since cut his last six movies. I was the first assistant editor on Kate & Leopold, which was shot in New York in 2001. That’s where I met Andrew, who was hired as one of the local New York film assistants. We became fast friends. Andrew moved out to LA in 2009 and I hired him to assist me on Knight & Day. We’ve been working together for 10 years now.

I always want to keep myself available for Jim, because he chooses good material, attracts great talent, and is a filmmaker with a strong vision who works across multiple genres. Since I’ve worked with him, I’ve cut a musical movie, a western, a rom-com, an action movie, a straight-up superhero movie, a dystopian superhero movie, and now a car racing film.

[OP] As a film editor, it must be great not to get type-cast for any particular cutting style.

[MM] Exactly. I worked for David Brenner for years as his first. He was able to cross genres and that’s what I wanted to do. I knew even then that the most important decisions I would make would be choosing projects. I couldn’t have foreseen that Jim was going to work across all these genres – I simply knew that we worked well together and that the end product was good.  

[OP] In preparing for Ford v Ferrari, did you study any other recent racing films, like Ron Howard’s Rush?

[MM] I saw that movie and liked it. Jim was aware of it, too, but I think he wanted to do something a little more organic. We watched a lot of older racing films, like Steve McQueen’s Le Mans and Frankenheimer’s Grand Prix. Jim’s original intention was to play the racing in long takes and bring the audience along for the ride. As he was developing the script and we were in preproduction, it became clear that there was so much more drama that was available for him to portray during the racing sequences than he anticipated. And so, the races took on more of an energized pace.

[OP] Energized in what way? Do you mean in how you cut it or in a change of production technique, like more stunt cameras and angles?

[MM] I was fortunate to get involved about two-and-a-half months prior to the start of production. We were developing the Le Mans race in pre-vis, which required a lot of editing and discussions about shot design and figuring out what the intercutting was going to be during that sequence, which is like the fourth act of the movie. You’re dealing with Mollie and Peter [Ken Miles’ wife and son] at home watching the race, the pit drama, what’s going on with Shelby and his crew, with Ford and Leo Beebe, and also, of course, what’s going on in the car with Ken. It’s a three act movie unto itself, so Jim was trying to figure out how it was all going to work, before he had to shoot it. That’s where I came in. The frenetic pace of Le Mans was more a part of the writing process – and part of the writing process was the pre-vis. The trick was how to make sure we weren’t just following cars around a track. That’s where redundancy can tend to beleaguer an audience in racing movies. 

[OP] What was the timeline for production and post?

[MM] I started at the end of May 2018. Production began at the beginning of August and went all the way through to the end of November. We started post in earnest at the beginning of November of last year, took some time off for the holidays, and then showed the film to the studios around February or March.

The challenge was that there was going to be a lot of racing footage, which meant there was going to be a LOT of footage. I knew I was going to need a strong co-editor, so Andrew was the natural choice. He had been cutting on his own and cutting with me over the years. We share a common approach to editing and have a similar aesthetic. There was a point when things got really intense and we needed another pair of hands, so I brought in Dirk Westervelt to help out for a couple of months. That kept our noses above water, but the process was really enjoyable. We were never in a crisis mode. We got a great response from preview audiences and, of course, that calms everybody down. At that point it was just about quality control and making sure we weren’t resting on our laurels. 

[OP] How long was your initial cut and what was your process for trimming the film down to the present run time?

[MM] We’re at 2:30:00 right now and I think the first cut was 3:10:00 or 3:12:00. The Le Mans section was longer. The front end of the movie had more scenes in it. We ended up lifting some scenes and rearranging others. Plus, the basic trimming of scenes brought the length down. But nothing was the result of a panic, like, “Oh my God, we’ve got to get to 2:30:00!” There were no demands by the studio or any pressures we placed upon ourselves to hit a particular running time. I like to say that there’s real time and there’s cinematic time. You can watch Once Upon a Time in America, which is 3:45:00, and feel like it’s an hour. Or you can watch an 89-minute movie and feel like it’s drudgery. We just wanted to make sure we weren’t overstaying our welcome.

[OP] How extensively did you re-arrange scenes during the edit? Or did the structure of the film stay pretty much as scripted?

[MM] To a great degree it stayed as scripted. We had some scenes in the beginning that we felt were a little bit tangential and weren’t serving the narrative directly and those were cut. The real endeavor of this movie starts the moment that these two guys [Shelby and Miles] decide to tackle the challenge of developing this car. There’s a scene where Miles sees the car for the first time at LAX. We understood that we had to get to that point in a very efficient way, but also set up all the other characters – their motives and their desires.

It’s an interesting movie, because it starts off with a lot of characters. But then it develops into a movie about two guys and their friendship. So it goes from an ensemble piece to being about Ken and Carroll, while at the same time the scope of the movie is opening up and becoming larger as the racing is going on. For us, the trickiest part was the front end – to make sure we spent enough time with each character so that we understood them, but not so much time that the audience would go, “Enough already! Get on with it!”

[OP] Were you both racing fans before you signed onto this film?

[AB] I was not.

[MM] When I was a kid, I watched a lot of racing. I liked CART racing – open wheel racing – not so much stock car racing. As I grew older, I lost interest, particularly when CART disbanded and NASCAR took over. So, I had an appreciation for it. I went to races, like the old Ontario 500 here in California.

[OP] Did that help inform your cutting style for this film?

[MM] I don’t think so. Where it helped was knowing the sound of the broadcasters and race announcers. I liked Chris Economaki and Jim McKay – guys who were broadcasting the races when I was a kid. I was intrigued about how they gave us the narrative of the race. It came in handy while we were making this movie, because we were able to get our hands on some of Jim McKay’s actual coverage of Le Mans and used it in the movie. That brings so much authenticity.

[OP] Let’s dive deeper into the sound for this film. I would imagine that sound design was integral to your rough cuts. How did you tackle that?

[AB] We were fortunate to have the sound team on very early during preproduction. We were cutting in a 5.1 environment, so we wanted to create sound design early in the process. The sounds may not have been the exact engine sounds that would end up in the final mix, but they were adequate to allow you to experience the scenes as intended and to give the right feel. Because we needed to get Jim’s response early, some of the races were cut with the production sound – from the live mics during filming. This allowed us and Jim to quickly see how the scenes would flow. Other scenes were cut strictly MOS, because the sound design would have been way too complicated for the initial cut of the scene. Once a scene was cut visually, we’d hand it over to Don [Sylvester, sound supervisor], who was able to provide us with a set of 5.1 stems. That was great, because we could recut and repurpose those stems for other races.

[MM] We had developed a strategy with Don to split the sound design into four or five stems to give us enough discrete channels to recut these sequences. The stems were a palette of interior perspectives, exterior perspectives, crowds, car-bys, and so on. By employing this strategy, we didn’t need to continually turn over the cut to sound for patch-up work. Then, as Don went out and recorded the real cars and was developing the actual sounds for what was going to be used in the mix, he’d generate new stems and we would put them into the Avid. This was extremely informative to Jim, because he could experience our Avid temp mix in 5.1 and give notes, which ultimately informed the final sound design and the mix. 

[OP] What about temp music? Did you also weave that into your rough cuts?

[MM] Ted Caplan, our music editor, has also worked with Jim for 15 years. He’s a bit of a renaissance man – a screenwriter, a novelist, a one-time musician, and a sound designer in his own right. When he sits down to work with music, he’s coming at it from a story point-of-view. He has a very instinctual knowledge of where music should start and it happens to dovetail into the aesthetic that Jim, Andrew, and I are working towards. None of us like music to lead scenes in a way that anticipates what the scene is going to be about before you experience it.

Specifically, for this movie, it was challenging to develop what the musical tone of the movie would be. Ted was developing the temp track along with us from a very early stage. We found over time that not one particular musical style was going to work. Which is to say that this is a very complex score. It includes a kind of surf rock sound with Carroll Shelby in LA; an almost jaunty, lounge jazz sound for Detroit and the Ford executives; and then the hard-driving rhythmic sound for the racing.

(The final score was composed by Marco Beltrami and Buck Sanders.)

[OP] I presume you were housed in multiple cutting rooms at a central facility. Right?

[MM] We cut at 20th Century Fox, where Jim has a large office space. We cut Logan and Wolverine there before this movie. It has several cutting spaces. I was situated between Andrew and Don. Ted was next to Don, and John Berri, our additional editor, and the assistants were right around the corner. It makes for a very efficient working environment.

[OP] Since the team was cutting with Avid Media Composer, did any of its features stand out to you for this film?

[Both] FluidMorph! (laughs)

[MM] FluidMorph, speed-ramping – we often had to manipulate the shot speeds to communicate the speed of the cars. A lot of these cars were kit cars that could drive safely at a certain speed for photography, but not at race speed. So we had to manipulate the speed a lot to get the sense of action that these cars have.

[OP] What about Avid’s Script Integration feature, often referred to as ScriptSync? I know a lot of narrative editors love it.

[MM] I used ScriptSync once a few years ago and I never cut a scene faster. I was so excited. Then I watched it and it was terrible. To me there’s so much more to editing than hitting the next line of dialogue. I’m more interested in the lines between the lines – subtext. I found that with ScriptSync I could put the scene together quickly, but it was flat as a pancake. I do understand the value of it in certain applications. For instance, I think it’s great on straight comedy. It’s helpful to get around and find things when you are shooting tons of coverage for a particular joke. But for me, it’s not something I lean on. I mark up my own dailies and find stuff that way.

[OP] Tell me a bit more about your organizational process. Do you start with a KEM roll or stringouts of selected takes?

[MM] I don’t watch dailies, which sounds weird. By that I mean, I don’t watch them in a traditional sense. I don’t start in the morning, watch the dailies, and then start cutting. And I don’t ask my assistants to organize any of my dailies in bins. I come in and grab the scene that I have in front of me. I’ll look at the last take of every set-up really quickly and then I spend an enormous amount of time – particularly on complex scenes – creating a bin structure that I can work with. Sometimes it’s the beats in a scene, sometimes I organize by shot size, sometimes by character – it depends on what’s driving the scene. That’s the way I learn my footage – by organizing it. I remember shot sizes. I remember what was shot from set-up to set-up. I have a strong visual memory of where things are in a bin. So, if I ask an assistant to do that, then I’m not going to remember it. If I do it myself, then I’ll remember it. If there are a lot of resets or restarts in a take, I’ll have the assistant mark those up. But, I’ll go through and mark up beats or pivotal points in a scene, or particularly beautiful moments. And then I’ll start cutting.

[AB] I’ve adopted a lot of Mike’s methodology, mainly because I assisted Mike on a few films. But it actually works for me, as well. I have a similar aesthetic to Mike. I’ve used ScriptSync before and I tend to agree that it discourages you from seeing – as Mike described – the moments between lines. Those moments are valuable to remember.  

[OP] I presume this film was shot digitally. Right?

[MM] It was primarily shot with [ARRI] Alexa 65 LF cameras, plus some other small format cameras. A lot of it was shot with old anamorphic lenses on the Alexa that allowed them to give it a bit of a vintage feeling. It’s interesting that as you watch it, you see the effect of the old lenses. There’s a fall-off on the edges, which is kind of cool. There were a couple of places where the subject matter was framed into the curve of the lens, which affects the focus. But we stuck with it, because it feels ‘of the time.’

[OP] Since the film takes place in the 1960s and with racing action sequences, I presume there were quite a few visual effects to properly place the film in time. Right?

[MM] There’s a ton of that. The whole movie is a period film. We could temp certain things in the Avid for the rough cuts. John Berri was wrangling visual effects. He’s a master in the Avid, but also Adobe After Effects. He has some clever ways of filling in backgrounds or green screens with temp elements to give the director an idea of what’s going to go there. We try to do as much temp work in the Avid as we are capable of doing, but there’s so much 3D visual effects work in this movie that we weren’t able to do that all of the time.

The caveat, though, is that the racing is real. The cars are real. The visual effects work was for a lot of the backgrounds. The movie was shot almost entirely in Los Angeles with some second unit footage shot in Georgia. The current, modern day Le Mans track isn’t at all representative of what Le Mans was in 1966, so there was no way to shoot Le Mans. Everything had to be doubled and then augmented with visual effects. In addition to Georgia, where they shot most of the actual racing for Le Mans, they went for a week to France to get some shots of the actual town of Le Mans. Of those, I think only about four shots are left. (laughs)

[OP] Any final thoughts about how this film turned out? 

[MM] I’m psyched that people seem to like the film. Our concern was that we had a lot of story to tell. Would we wear audiences out? We continually have people tell us, “That was two and a half hours? We had no idea.” That’s humbling for us and it’s a great feeling. It’s a movie about these really great characters with great scope and great racing. That goes back to the very advent of movies. You can put all the big visual effects in a film that you want to, but it’s really about people.

[AB] I would absolutely agree. It’s more of a character movie with racing.  Also, because I am not a ‘racing fan’ per se, the character drama really pulled me into the film while working on it.

[MM] It’s classic Hollywood cinema. I feel proud to be part of a movie that does what Hollywood does best.

The article is also available at postPerspective.

For more, check out this interview with Steve Hullfish.

©2019 Oliver Peters