Trusting Apple Displays?

In the “good old days” of post, directors, cinematographers, and clients would all judge final image quality in an online edit or color correction suite using a single, calibrated reference monitor. We’ve moved away from rooms that look like the bridge of the Enterprise into more minimalist set-ups, a shift compounded by the current and possibly permanent work-from-home and remote post experience. Without everyone looking at the same reference display, it becomes increasingly difficult to be sure that what everyone sees is actually the proper appearance of the image. For some, clients rarely come into the suite anymore. Instead, they often make critical judgements based on what they see on their home or work computers and/or devices.

The lowest common denominator

Historically, a common item in most recording studios was a set of Auratone sound cubes. These small, single speaker monitors, which some mixers dubbed “awful-tones,” were intended to provide a representation of the mix as it would sound on radios and cheaper hi-fi audio set-ups. TV show re-recording mixers would also use these to check a mix in order to hear how it would translate to home TV sets.

Today, smart phones and tablets have become the video equivalent of that cheap hi-fi set-up. Generally that means Apple iPhones or iPads. In fact, thanks to Apple’s color management, videos played back on iPads and iPhones do approximate the correct look of your master file. As editors or colorists, we often ask clients to evaluate the image on an Apple device, not because they are perfect (they aren’t), but rather because they are the best of the many options out in the consumer space. In effect, checking against an iPhone has become the modern video analog of the Auratone sound cubes.

Apple color management

Apple’s color management includes several techniques that are helpful, but can also trip you up. If you are going to recommend that your clients use an iPhone, iPad, or even an iMac to judge the material, then you also want to make sure they have correctly set up their device. This also applies to you, the editor, if you are creating videos and only making judgements on an iMac (or XDR) display, without any actual external video (not computer) display.

Apple computers enable the use of different color profiles and the ability to make adjustments according to calibration. If you have a new iMac, then you are generally better off leaving the color profile set to the default iMac setting instead of fiddling with other profiles. New Apple device displays are set to P3 D65 color with a higher brightness capacity (up to 500 nits – more with XDR). You cannot expect them to perfectly reproduce an image that looks 100% like a Rec 709 TV set at 100 nits. But, they do get close.

I routinely edit/grade with Media Composer, Premiere Pro, DaVinci Resolve, and Final Cut Pro on iMacs and iMac Pros. Of these four, only Final Cut Pro shows an image in the edit viewer window that is relatively close to the way that image appears on the video output to a monitor. This is thanks to Apple’s color management and the broader Apple hardware/software ecosystem. The viewer image for the other three may look darker, be more saturated, have richer reds, and/or show more contrast.

User control

Once you get past the color profile (Mac only), then most Apple devices offer two or three additional user controls (depending on OS version). Obviously there’s brightness, which can be manual or automatic. When set to automatic, the display will adjust brightness based on the ambient light. Generally auto will be fine, unless you really need to see crucial shadow detail. For example, the PLUGE portion of a test pattern (the darkest gray patches) may not be discernible unless you crank up the brightness or are in a dark room.

The next two are gotchas. Along with the user interface dark mode, Apple introduced Night Shift and True Tone in an effort to reduce eye fatigue after long computer use. These are based on the theory that blue light from computer and device screens is fatiguing, harmful, and/or can impact sleep patterns. Such health concerns, as they relate to computer use, are not universally supported by the medical community.

Nevertheless, they do have a pleasing effect, because these features make the display warmer or cooler based on the time of day or the color temperature of the ambient light in the room. Typically the display will appear warmer at night or in a dimmer room. If you are working with a lot of white on the screen, such as working with documents, then these modes do feel more comfortable on your eyes (at least for me). However, your brain adjusts to the color temperature shift of the display when using something like True Tone. The screen doesn’t register in your mind as being obviously warm.

If you are doing anything that involves judging color, the LAST thing you want to use is True Tone or Night Shift. This applies to editing, color correction, art, photography, etc. It’s important to note that these settings only affect the way the image is displayed on the screen. They don’t actually change the image itself. Therefore, if you take a screen grab with True Tone or Night Shift set very cool or warm, the screen grab itself will still be neutral.

In my case, I leave these off for all of the computers I use, but I’m OK with leaving them on for my iPhone and iPad. However, this does mean I need to remember to turn the setting off whenever I use the iPhone or iPad to remotely judge videos. And there’s the rub. If you are telling your client to remotely judge a video using an Apple device – and color is part of that evaluation – then it’s imperative that you ask them (and maybe even teach them how) to turn off those settings. Unless they are familiar with the phenomena, the odds are that True Tone and/or Night Shift has been enabled on their device(s) and they’ve never thought twice about it simply because the mind adjusts.

QuickTime

QuickTime Player is the default media player for many professionals and end users, especially those using Macs. The way QuickTime displays a compatible file to the screen is determined by the color profile embedded into the file metadata. If I do a color correction session in Resolve, with the color management set to Rec 709 2.4 gamma (standard TV), then when I render a ProRes file, it will be encoded with a color profile of 1-2-1 (the 2 indicates 2.4 gamma).

If I export that same clip from Final Cut Pro or Premiere Pro (or re-encode the Resolve export through one of those apps) the resulting ProRes now has a profile of 1-1-1. The difference through QuickTime Player is that the Resolve clip will look darker in the shadows than the clip exported from FCP or Premiere Pro. Yet both files are exactly the same. It’s merely how QuickTime Player displays each one to the screen based on the metadata. If I open both clips in different players, like Switch or VLC, which don’t use this same metadata, then they will both appear the same, without any gamma shift.
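To see why the same pixel values can read differently, it helps to quantify the gamma shift. The sketch below is purely illustrative (not anything QuickTime actually computes): 2.4 matches the Rec 709 grading display described above, while 1.96 is an assumed, commonly cited approximation of how QuickTime Player treats 1-1-1 files.

```python
# Illustration only: the SAME encoded pixel value, interpreted through
# two different display gammas, produces different on-screen brightness.
# The 1.96 value is an assumption for QuickTime's 1-1-1 handling.

def displayed_light(code_value: float, display_gamma: float) -> float:
    """Convert a normalized (0.0-1.0) encoded value to relative light output."""
    return code_value ** display_gamma

shadow = 0.25  # a dark gray, identical in both exported files

print(f"gamma 2.4 : {displayed_light(shadow, 2.4):.3f}")   # darker shadows
print(f"gamma 1.96: {displayed_light(shadow, 1.96):.3f}")  # lifted shadows
```

The point is simply that the difference you see is a display interpretation, not a difference in the files themselves.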

Client recommendations

How should one deal with such uncertainties? Obviously, it’s a lot easier to tackle when everyone is in the same room. Unfortunately, that’s a luxury that may disappear entirely. It already has for many. Fortunately most people aren’t as sensitive to color issues as the typical editor, colorist, or DP. In my experience, people tend to have greater issues with the mix than they do with color purity. But that doesn’t preclude you from politely educating your client and making sure certain best practices are followed.

First, make sure that features like True Tone and Night Shift are disabled, so that a neutral image is being viewed. Second, if you use a review-and-approval service, like frame.io or Vimeo, then you can upload test chart image files (color bars, grayscale, etc). These may be used whenever you need to check the image with your client. Is the grayscale a neutral gray in appearance or is it warmer or cooler? Can you see separation in the darkest and brightest patches of these charts? Or are they all uniformly black or white? Knowing the answers will give you a better idea about what the client is seeing and how to guide them to change or improve their settings for more consistent results.
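If you’d rather generate a grayscale chart yourself than source one, a strip of evenly spaced patches is sufficient for this kind of check. Here’s a minimal sketch using only the Python standard library, writing a binary PGM file; the 11-step layout and dimensions are illustrative choices, not a broadcast-standard test pattern.

```python
def write_grayscale_chart(path: str, steps: int = 11,
                          width: int = 704, height: int = 128) -> None:
    """Write a horizontal strip of evenly spaced gray patches as a binary PGM."""
    patch_w = width // steps
    row = bytearray()
    for x in range(width):
        level = min(x // patch_w, steps - 1)           # which patch this pixel is in
        row.append(round(level * 255 / (steps - 1)))   # 0..255 in even steps
    with open(path, "wb") as f:
        f.write(f"P5 {width} {height} 255\n".encode("ascii"))
        f.write(bytes(row) * height)                   # repeat the row down the frame

write_grayscale_chart("grayscale_chart.pgm")
```

Convert the PGM to PNG or JPEG before uploading, since most review services won’t accept PGM directly.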

Finally, if their comments seem to relate to a QuickTime issue, then suggest using a different player, such as Switch (free with watermarks will suffice) or VLC.

The brain, eyes, and glasses

Some final considerations… No two people see colors in exactly the same way. Many people suffer from mild color blindness, i.e. color vision deficiencies. This means they may be more or less sensitive to shades of some colors. Eyeglasses also affect what you see. For example, many lenses, depending on the coatings and material, will yellow over time. I cannot use polycarbonate lenses, because I see chromatic aberration on highlights when wearing this material, even though most opticians and other users don’t see that at all. CR-39 (optical plastic) or glass (no longer sold) are the only eyeglass materials that work for me.

If I’m on a flight in a window seat, then the eye closest to the window is being bombarded with a different color temperature of light than the eye towards the plane’s interior. This can be exacerbated with sunglasses. After extended exposure to such a differential, I can look at something neutral and, when I close one eye or the other, I will see the image with a drastically different color temperature for one eye versus the other. This eventually normalizes itself, but it’s an interesting situation.

The bottom line of such anecdotes is that people view things differently. The internet dress color question is an example of this. So when a client gives you color feedback that just doesn’t make sense to you, it might just be them and not their display!

Check out my follow-up article at PVC about dealing with color management in Adobe Premiere Pro.

©2021 Oliver Peters

W.A.S.P.

A regrettable aspect of history and the march of time is that many interesting stories are buried or forgotten. We learn the bullet points of the past, but not the nuances that bring history alive. It’s a challenge that many documentarians seek to meet. While the WWII era is rife with heroic tales, one unit was almost forgotten.

Women Airforce Service Pilots (aka WASP)

As WWII ramped up, qualified male pilots were sent to European and Pacific combat, leaving a shortage of stateside pilots. The WASP unit was created as a civilian auxiliary attached to the U.S. Army Air Forces. It was organized and managed by Jackie Cochran, an accomplished female aviator and entrepreneur. More than 25,000 women applied for the WASP, but only 1,830 were accepted into the program.

The WASP members engaged in military-style training at Avenger Field in Sweetwater, Texas. They wore uniforms and were given flight assignments by the military, yet they weren’t actually in the military. Their role was to handle all non-combat, military flight tasks within the states, including ferrying aircraft cross-country from factories to deployment bases, serving as test pilots, and handling training tasks like towing targets and mock strafing runs over combat trainees. During her service, the typical WASP would fly more types of aircraft than most male military pilots. Sadly, 38 WASP died during training or active duty assignments.

Although WASP members joined with the promise of their unit becoming integrated into the regular military, that never happened. As the war wound down and male pilots returned home needing jobs, the WASP units were disbanded, due in part to Congressional and media resistance. Records were sealed and classified, and the WASP were almost forgotten by history. Finally, in the late 1970s, President Carter signed legislation that recognized WASP members as veterans and authorized veterans benefits. In 2009 President Obama and the Congress awarded WASP members the Congressional Gold Medal.

The documentary

Documentary filmmaker Jon Anderson set out over a decade ago to tell a complete story of the WASP in a feature-length film. Anderson, a history buff, had already produced and directed one documentary about the Tuskegee Airmen, so the WASP story was the next logical subject. The task was to interview as many living WASP members as possible to tell their story. The goal was not just the historical facts, but also what it was like to be a WASP, along with some of the backstory details about Cochran and the unit’s formation. The result was W.A.S.P. – A Wartime Experiment in WoManpower.

Anderson accumulated a wealth of interviews, but with limited resources. This meant that interviews were recorded mostly on DV cameras in standard definition. However, as an instructor of documentary filmmaking at Valencia College, Anderson also utilized some of the film program’s resources in the production. This included a number of re-enactments – filmed with student crews, talent, and RED cameras. The initial capture and organization of footage was handled by a previous student of his using Final Cut Pro 7.

Technical issues

Jon asked me to join the project as co-editor after the bulk of interviews and re-enactments had been compiled. Several dilemmas faced me at the front end. The project was started in FCP7, which was now a zombie application. Should I move the project to Final Cut Pro X, Premiere Pro, or Media Composer? After a bit of experimentation, the best translation of the work that had already been done was into Premiere Pro. Since we had a mix of SD and HD/4K content, what would be the best path forward – upconvert to HD or stay in standard def? HD seemed to be the best option for distribution possibilities, but that posed additional challenges.

Only portions of tapes were originally captured – not complete tapes. These were also captured with separated audio and video going to different capture folders (a feature of FCP “classic”). Timecode accuracy was questionable, so it would be nearly impossible to conform the current organized clips from the tapes at a higher resolution. But since it was captured as DV from DV tapes, there was no extra quality loss due to interim transcoding into a lower resolution file format.

Ultimately I opted to stick with what was on the drives as my starting point. Jon and I organized sequences and I was able to borrow a Blackmagic Teranex unit. I exported the various sequences between two computers through the Teranex, which handled the SD to HD conversion and de-interlacing of any interlaced footage. This left us with upscaled ProRes interviews that were 4×3 within a 16×9 HD sequence. Nearly all interviews were filmed against a black limbo background, so I then masked around each woman on camera. In addition, each was reframed to the left or right side, depending on where they faced. Now we could place them against another background – either true black, a graphic, or B-roll. Finally, all clips were graded using Lumetri within Premiere Pro. My home base for video post was TinMen – an Orlando creative production company.
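The 4×3-in-16×9 framing mentioned above works out with simple arithmetic. A quick sketch, assuming square pixels after the upscale:

```python
def pillarbox(frame_w: int, frame_h: int, aspect_w: int = 4, aspect_h: int = 3):
    """Return (content width, side pillar width) for centering 4x3 in a wider frame."""
    content_w = frame_h * aspect_w // aspect_h   # width of the 4x3 image at full height
    return content_w, (frame_w - content_w) // 2

print(pillarbox(1920, 1080))  # (1440, 240): a 1440-wide image with 240-pixel pillars
```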

Refining the story

With the technical details sorted out, it was time to refine the story. Like many docs, you end up with more possible storylines than will fit. It’s always a whittling process to reveal a story’s essence and to decide which items are best left out so that the rest remains clear. Interviews were bridged with voice-overs plus archival footage, photos, or re-enactments to fill in historical details. This went through numerous rounds of refinement with input from Jon and Rachel Becker Wright, the producer and co-editor on the film. Along the way Rachel was researching, locating, and licensing archival footage for B-roll. 

Once the bulk of the main storyline was assembled with proper voice-overs, re-enactments, and some B-roll, I turned the cut over to Rachel. She continued with Jon to refine the edit with graphics, music, and final B-roll. Sound post was handled by the audio production department at Valencia College. A nearly-final version of the 90-minute documentary was presented at a “friends and family” screening at the college.

Emmy®

Many readers know about the national Emmy® Awards handed out annually by the National Academy of Television Arts and Sciences (NATAS). It may be less known that NATAS includes 19 regional chapters, which also award Emmys within their chapters. Awards are handed out for projects presented in that region, usually via local broadcast or streaming. Typically the project itself wins the award, without separate craft categories. Anderson was able to submit a shortened version of the documentary for judging by the Suncoast regional chapter, which includes Florida, Puerto Rico, and parts of Louisiana, Alabama, and Georgia. I’m happy to say that W.A.S.P. – A Wartime Experiment in WoManpower won a 2020 regional Emmy, an award shared by Jon Anderson, Rachel Becker Wright, Joe Stone (production designer), and myself.

Awards are nice, of course, but getting the story out about the courageous ladies of the WASP is far more important and I was happy to play a small part in that.

©2021 Oliver Peters

Five Adobe Workflow Tips

Subscribers to Adobe Creative Cloud have a whole suite of creative tools at their fingertips. I believe most users often overlook some of the less promoted features. Here are five quick tips for your workflow.

Camera Raw. Photographers know that the Adobe Camera Raw module is used to process camera raw images, such as .cr2 files. It’s a “develop” module that opens first when you import a camera raw file into Photoshop. It’s also used in Bridge and Lightroom. Many people use Photoshop for photo enhancement – working with the various filters and adjustment layer tools available. What may be overlooked is that you can use the Camera Raw Filter in Photoshop on any photo, even if the file is not raw, such as a JPEG or TIFF.

Select the layer containing the image and choose the Camera Raw Filter. This opens that image into this separate “develop” module. There you have all the photo and color enhancement tools in a single, comprehensive toolkit – the same as in Lightroom. Once you’re done and close the Camera Raw Filter, those adjustments are now “baked” into the image on that layer.

Remix. Audition is a powerful digital audio workstation application that many use in conjunction with Premiere Pro or separately for audio productions. One feature it has over Premiere Pro is the ability to use AI to automatically edit the length of music tracks. Let’s say you have a music track that’s 2:47 in length, but you want a :60 version to underscore a TV commercial. Yes, you could manually edit it, but Audition Remix turns this into an “automagic” task. This is especially useful for projects where you don’t need to have certain parts of the song time to specific visuals.

Open Audition, create a multitrack session, and place the music selection on any track in the timeline. Right-click the selection and enable Remix. Within the Remix dialogue box, set the target duration and parameters – for example, short versus long edits. Audition will calculate the number and location of edit points to seamlessly shorten the track to the approximate desired length.

Audition attempts to create edits at points that are musically logical. You won’t necessarily get an exact duration, since the value you entered is only a target. This is even more true with tracks that have a long musical fade-out. A little experimentation may be needed. For example, a target value of :59 will often yield significantly different results than a target of 1:02, thanks to the recalculation. Audition’s remix isn’t perfect, but will get you close enough that only minimal additional work is required. Once you are happy, bounce out the edited track for the shortened version to bring into Premiere Pro.

Photoshop Batch Processing. If you want to add interesting stylistic looks to a clip, then effects filters in Premiere Pro and/or After Effects usually fit the bill. Or you can go with expensive third party options like Continuum Complete or Sapphire from Boris FX. However, don’t forget Photoshop, which includes many stylized looks not offered in either of Adobe’s video applications, such as specific paint and brush filters. But, how do you apply those to a video clip?

The first step is to turn your clip into an image sequence using Adobe Media Encoder. Then open a representative frame in Photoshop to define the look. Create a Photoshop action using the filters and settings you desire. Save the action, but not the image. Then create a batch function to apply that stored action to the clean frames within the image sequence folder. The batch operation will automatically open each image, apply the effects, and save the stylized results to a new destination folder.

Open that new image sequence using any app that supports image sequences (including QuickTime Player) and save it as a ProRes (or other) movie file. Stylized effects, like oil paint, are applied to individual frames and will vary with the texture and lighting of each frame; therefore, the stitched movie will give the effect an animated appearance.

After Effects for broadcast deliverables. After Effects is the proverbial Swiss Army knife for editors and designers. It’s my preferred conversion tool when I have 24p masters that need to be delivered as 60i broadcast files.

Import a 23.98 master and place it into a new composition. Scale, if needed (UHD to HD, for instance). Send to the Render Queue. Set the frame rate to 29.97, field render to Upper (for HD), and enable pulldown (any whole/split frame cadence is usually OK). Turn off Motion Blur and Frame Blending. Render for a proper interlaced broadcast deliverable file.
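The arithmetic behind that setting: 29.97 is exactly 5/4 of 23.98, so 3:2 pulldown spreads every 4 progressive frames across 5 interlaced frames (10 fields). A quick check with exact fractional rates:

```python
from fractions import Fraction

# The ".98"/".97" rates are exact 1000/1001 fractions, not rounded decimals.
film = Fraction(24000, 1001)   # 23.976... fps progressive master
video = Fraction(30000, 1001)  # 29.97 fps interlaced deliverable

ratio = video / film
print(ratio)  # 5/4 -> 4 film frames map to 5 video frames (10 fields)
```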

Photoshop motion graphics. One oft-ignored (or forgotten) feature of Photoshop is that you can do layer-based video animation and editing within it. Essentially there’s a very rudimentary version of After Effects inside Photoshop. While you probably wouldn’t want to use it for video editing instead of After Effects or Premiere Pro, Photoshop does have value in creating animated lower thirds and other titles.

Photoshop provides much better text and graphic style options than Premiere Pro. The files are more lightweight than an After Effects comp on your Premiere timeline – or rendering animated ProRes 4444 movies. Since it’s still a Photoshop file (albeit a special version), the “edit in original” command opens the file in Photoshop for easy revisions. Let’s say you are working on a show that has 100 lower thirds that slide in and fade out. These can easily be prepped for the editor by the graphics department in Photoshop – no After Effects skills required.

Create a new file in Photoshop, turn on the timeline window, and add a new blank video layer. Add a still onto a layer for positioning reference, delete the video layer, and extend the layers and timeline to the desired length. Now build your text and graphic layers. Keyframe changes to opacity, position, and other settings for animation. Delete the reference image and save the file. This is now a keyable Photoshop file with embedded animation properties.

Import the Photoshop file into Premiere with Merged Layers. Add to your timeline. The style in Premiere should match the look created in Photoshop. It will animate based on the keyframe settings created in Photoshop.

©2021 Oliver Peters

Kirk Baxter, ACE on editing Mank

Mank, David Fincher’s eleventh film, chronicles Herman Mankiewicz (portrayed by Gary Oldman) during the writing of the film classic, Citizen Kane. Mankiewicz, known as Mank, was a witty New York journalist and playwright who moved to Los Angeles in the 1930s to become a screenwriter. He wrote or co-wrote about 40 films, often uncredited, including the first draft of The Wizard of Oz. Together with Orson Welles, he won an Academy Award for the screenplay of Citizen Kane. It’s long been disputed whether or not he, rather than Welles, actually did the bulk of the work on the screenplay.

The script for Mank was penned decades ago by David Fincher’s father, Jack Fincher, and was finally brought to the screen thanks to Netflix this past year. Fincher deftly blends two parallel storylines: Mankiewicz’ writing of Kane during his convalescence from an accident – and his earlier Hollywood experiences with the studios, as told through flashbacks. These experiences, including his acquaintance with William Randolph Hearst – the media mogul of his time and the basis for Charles Foster Kane in Citizen Kane – inspired Mankiewicz’ script. This earlier period is infused with the political undercurrent of the Great Depression and the California gubernatorial race between Upton Sinclair and Frank Merriam.

David Fincher and director of photography Erik Messerschmidt, ASC (Mindhunter) used many techniques to pay homage to the look of Citizen Kane and other classic films of the era, including shooting in true black-and-white with RED Monstro 8K Monochrome cameras and Leica Summilux lenses. Fincher also tapped other frequent collaborators, including Trent Reznor and Atticus Ross for a moving, vintage score, and Oscar-winning editor, Kirk Baxter, ACE. I recently caught up with Baxter to discuss Mank, the fourth film he’s edited for David Fincher.

***

Citizen Kane is the 800 pound gorilla. Had you seen that film before this or was it research for the project?

I get so nervous about this topic, because with cinephiles, it’s almost like talking about religion. I had seen Citizen Kane when I was younger, but I was too young to appreciate it. I was growing up on Star Wars, Indiana Jones, and Conan the Barbarian. Then advancing my tastes to the Godfather films and French Connection. Citizen Kane is still just such a departure from all of that. I was kind of like, “What?” That was probably in my late teens.

I went back and watched it again before the shoot after reading the screenplay. There were certain technical aspects to the film that I thought were incredible. I loved the way Orson Welles chose to leave his scenes by turning off lights like it was in the theater. There was this sort of slow decay and I enjoy how David picked up on that and took it into Mank. Each time one of those shots came up in the bungalow scenes, I thought it was fantastic.

Overall, I don’t consider myself any sort of expert on 1930s and 1940s movie-making and I didn’t make a conscious effort to try to replicate any styles. I approached the work in the same way I do with all of David’s work – by being reactionary to the material and the coverage that he shot. In regard to how close David took the stylings, well, that was more his tightrope walk. So, I felt no shackling to slow down an edit pace or stay in masters or stay in 50-50s as might have been common in the genre. I used all the tools at my disposal to exploit every scene the best I could.

Since you are cutting while the shooting goes on, do you have the ability to ask for coverage that you might feel is missing? 

I think a little bit of that goes on, but it’s not me telling Fincher what’s required. It’s me building assemblies and giving them to David as he’s going and he will assess where he’s short and where he’s not. I’ve read many editor interviews over the years and I’ve always kind of gone, “huh,” when someone’s projecting they’re in the control seat. When you’re with someone with the ability that Fincher has, then I’m in a support position of helping him make his movie as best he can. Any other way of looking at it is delusional. But, I take a lot of pride in where I do get to contribute. 

Mank is a different style of film than Fincher’s previous projects. Did that change the workflow or add any extra pressure? 

I don’t think it did for me. I think it was harder for David. The film was in his head for so many decades and there were a couple of attempts to make it happen. Obviously a lot changes in that time frame. So, I think he had a lot of internal pressure about what he was making. For me, I found the entire process to be really buoyant and bubbly and just downright fun. 

As with all films, there were moments when it was hard to keep up during the shoot. And definitely moments coming down to that final crunch. That’s when I really put a lot of pressure on myself to deliver cut scenes to David to help him. I felt the pressure of that, but my main memory of it really was one of joy. Not that the other movies aren’t, but I think sometimes the subject matter can control the mood of the day. For instance, in other movies, like Dragon Tattoo, the feeling was a bit like your head in a vise when I look back at it.

Sure. Dragon Tattoo is dark subject matter. On the other hand, Gary Oldman’s portrayal of Mankiewicz really lights up the screen. It certainly looks like he’s having fun with the character.

Right. I loved all the bungalow scenes. I thought there was so much warmth in those. And I had so much compassion for the lead character, Mank. Those scenes really made me adore him. But also when the flashback scenes came, they’re just a hoot and great fun to put together. There was this warmth and playfulness of the two different opposing storylines. No matter which one turned up, I was happy to see it. 

Was the inter-cutting of those parallel storylines the way it was scripted? Or was that a construction in post? 

Yes, it was scripted that way. There was a little bit of pulling at the thread later. Can we improve on this? There was a bit of reshuffling later on and then working out that ‘as written’ was the best path. We certainly kicked the tires a few times. After we put the blueprint together, mostly the job became tightening and shortening. 

Obviously one of the technical differences was that this film was a true black-and-white film shot with modified, monochrome RED cameras. So not color and then changed to black-and-white in the grade. Did that impact your thinking in how to tackle the edit?

For the first ten minutes. At first you sit down and you go, “Oh, we work in black and white.” And then you get used to it very quickly. I forwarded the trailer when it was released to my mother in Australia. She texted back, “It’s black and white????” [laugh] You’ve got to love family!

Black-and-white has a unique look, but I know that other films, like Roma, were shot in color to satisfy some international distribution requirements. 

That’s never going to happen with someone like David. I can’t picture who that person would be that would tell him with any authority that his movie requires color. 

Of course, it matches films of the era and more importantly Citizen Kane. It does bring an intentional, stylistic treatment to the content. 

Black-and-white has got a great way of focusing your attention and focusing your eye. There’s a discipline that’s required with how shots are framed and how you’re using the images for eye travel. But I think all of David’s work comes with that discipline anyway. So to me, it didn’t alter it. He’s already in that ballpark.

In terms of recreating the era, I’ve seen a few articles and comments about creating the backgrounds and sets using visual effects, but also classic techniques, like rear projection. What about the effects in Mank?

As in most of David’s movies, it’s everywhere and a lot of the time it looks invisible, but things are being replaced. I don’t have a ratio for it, but I’d say almost half the movie. We’ve got a team that’s stabilizing shots as we’re going. We’ve got an in-house visual effects team that is building effects, just to let us know that certain choices can be made. The split screen thing is constant, but I’ll do a lot of that myself. I’ll do a fairly haphazard job of it and then pass it on for our assistant editors to follow up on. Even the montage kaleidoscope effect was all done in-house down the hall by Christopher Doulgeris, one of our VFX artists. A lot of it’s farmed out, but a fair slice is done under the roof. 

Please tell me a bit about working with Adobe Premiere Pro again to cut this film.

It’s best for me not even to attempt to answer technical questions. I don’t mind exposing myself as a luddite. My first assistant editor, Ben Insler, set it up so that I’m able to move the way I want to move. For me, it’s all muscle memory. I’m hitting the same keystrokes that I was hitting back when we were using Avid. Then I crossed those keys over to Final Cut and then over to Premiere Pro. 

In previous versions, Premiere Pro required projects to contain copies of all the media used in that project. As you would hand a scene off to other people to work on in parallel, all the media would travel into that new project, and the same was true when combining projects back together to merge your work. You had monstrously huge projects with every piece of media, and frequently duplicate copies of that media, packed into them. They often took 15 minutes to open. Now Adobe has streamlined that process. They knew it was a massive overhaul, and I think it’s been completely solved. Because it’s functioning, I can now purely concentrate on the thought process of where I’m going in the edit. I’m spoiled with having very technical people around me so that I can exist as a child. [laugh]

How was the color grade handled?

We had Eric Weidt working downstairs at Fincher’s place on Baselight. David is really fortunate that he’s not working in this world of “Here’s three weeks for color. Go into this room each day and where you come out is where you are at.” There’s an ongoing grade that’s occurring in increments and traveling with the job that we’re doing. It’s updated and brought into the cut. We experience editing with it and then it’s updated again and brought back into the cut. So it’s this constant progression.

Let’s talk about project organization. You’ve told me in the past that your method of organizing a selects reel was to string out shots in the order of wide shots, mediums, close ups, and so on. And then bump up the ones you like. Finally, you’d reduce the choices before those were presented to David as possible selects. Did you handle it the same way on Mank?

Over time, I’ve streamlined that further. I’ve found that if I send something that’s too long while he’s in the middle of shooting that he might watch the first two minutes of it, give me a couple of notes of what he likes and what he doesn’t like, and move on. So, I’ve started to really reduce what I send. It’s more cut scenes with some choices. That way I get the most relevant information and can move forward.

With scenes that are extremely dense, like Louis B. Mayer’s birthday party at Hearst’s, it really is an endless multiple choice of how to tackle it. I’ll often present a few paths. Here’s what it is if I really hold out these wides at the front and I hang back for a bit longer. Here’s what it is if I stay more with Gary [Oldman] listening. It’s not that one take is better than another, but more options featuring different avenues and ways to tell the story.

I like working that way, even if it wasn’t for the sake of presenting it to David. I can’t watch a scene that’s that dense and go, “Oh, I know what to do.” I wouldn’t have a clue. I like to explore it. I’ve got to turn the soil and snuffle out the truffles and try it all out. And then the answers present themselves. It all just becomes clear. Unfortunately, the world of the editor, regardless of past experiences, is always destined to be filled with labor. There is no shortcut to doing it properly.

With large-scale theatrical distribution out of the question – and the shift to Netflix streaming as the prime focus – did the nature of studio notes change at all? 

David’s generous about thought and opinion, if it’s constructive and helpful. He’s got a long history of forwarding those notes to me and exploring them. I’m not positive if I get all of them. Anything that’s got merit will reach me, which is wise. Having spent so many years in the commercial world, there’s a part of me that’s always a little eager to solve a puzzle. If I’m delivered a pile of notes, good or bad, I’m going to try my best to execute them. So, David is wise to just not let me see the bad ones.

Were you able to finish Mank before the virus-related lockdowns started? Did you have to move to a remote workflow? 

The shooting had finished and we already had the film assembled. I work at a furious rate whilst David’s shooting, so that we can interface during the shoot. That way he knows what he’s captured, what he needs, and he can move on and strike sets, release actors, etc. There’s this constant back and forth.

At the point when he stops shooting, we’re pretty far along in terms of replicating the original plan, the blueprint. Then it’s what I call the sweeps, where you go back to the top and you just start sweeping through the movie, improving it. I think we’d already done one of those when we went remote. So, it was very fortunate timing.

We’re quite used to it. During shooting, we work in a remote way anyway. It’s a language and situation that we’re completely used to. I think from David’s perspective, it didn’t change anything. 

If the timing had been different and you would have had to handle all of the edit under remote conditions, would anything change? Or would you approach it the same way? 

Exactly the same. It wouldn’t have changed the amount of time that I get directly with David. I don’t want to give the impression that I cut this movie and David was on the sidelines. He’s absolutely involved, but pops in and out and looks at things that are made. He’s not a director that sits there the whole time. A lot of it is, “I’ve made this cut, let’s watch it together. I’ve done these selects, let’s watch them together.” It’s really possible to do that remotely. 

I prefer to be with David when he’s shooting and especially in this one that he shot in Los Angeles. I really tried to have one day a week where we got to be together on the weekends and his world quieted down. David loves that. I would sort of construct my week’s thinking towards that goal. If on a Wednesday I had six scenes that were backed up, I’d sort of think to myself, “What can I achieve in the time frame before David’s with me on Saturday? Should I just select all these scenes and then we’ll go through the selects together? Or should I tackle this hardest one and get a good cut of that going?”

A lot of the time I would choose – if he was coming in and had the time to watch things – to do selects. Sometimes we could bounce through them just from having a conversation of what his intent was and the things that he was excited about when he was capturing them. With that, I’m good to go. Then I don’t need David for another week or so. We were down to the shorthand of one sentence, one email, one text. That can inform me with all the fuel I need to drive cross-country.

The film’s back story clearly has political overtones that have an eerie similarity to 2020. I realize the script was written a while back at a different time, but was some of that context added in light of recent events? 

That was already there. But, it really felt like we were reliving it. In the beginning of the shutdown, you didn’t quite know where it was going to go. The parallels to the Great Depression were extreme. There were a lot of lessons for me.

The character of Louis B. Mayer slashes all of his studio employees’ salaries to 50 percent. He promises to give every penny back and then doesn’t do it. I was crafting that villain’s performance, but at the same time I run a company [Exile Edit] that has a lot of employees in Los Angeles and New York. We had no clue if we would be able to get through the pandemic at the time when it hit. We also asked staff to take a pay cut, so that we could keep everyone employed and keep everybody on health insurance. But the moment we realized we could get through it six months later, there was no way I could ever be that villain. We returned every cent. 

I think most companies are set up to be able to exist for four months. If everything stops dead, that’s a 12-month brake pull no one’s anticipating. It was really, really frightening. I would hope that I would think this way anyway, but having crafted that villain’s performance, there was no way I was going to replicate it.

***

Mank was released in select theaters in November and launched on Netflix December 4, 2020.

Be sure to check out Steve Hullfish’s podcast interview with Kirk Baxter.

This article was originally written for postPerspective.

©2021 Oliver Peters

January 2021 Links

Occasionally I write articles for other sites, which I don’t repost here or which I may repost much later at a future date. In case you missed them, here are links to some of the more recent ones.

My thoughts on how the Apple silicon transition affects post production professionals.
Is Apple Silicon Your Fork in the Road? by Oliver Peters – ProVideo Coalition

A review of the first M1-powered desktop Mac.
Apple M1 Mac mini Review by Oliver Peters (fcp.co)

The experience of putting together a virtual holiday show.
Holiday Cabaret de Noël – Virtual Show Production in Final Cut Pro (fcp.co)

Working with the Simon Says Transcription service.
Oliver Peters Reviews Simon Says Transcription Service (fcp.co)

Check out my conversation with Kirk Baxter about cutting the Netflix film, Mank.
Kirk Baxter Talks Editing Workflow on David Fincher’s Mank – postPerspective

Pointers about mobile filmmaking with iPhone and Final Cut Pro.
Mobile Filmmaking with iPhone, FiLMiC Pro, and Final Cut Pro (fcp.co)