YouTube influencers are a big part of the content creation landscape today. Their videos cover many different niches and often have surprisingly large followings. I take a look at YouTube aviation channels and their use of Final Cut Pro to post-produce these videos (FCP.co).
Unless you’ve delivered master files for broadcast, you might not be as tuned into meeting delivery specs, especially when it comes to the perceived loudness levels of your mix. I discuss working with audio in Final Cut Pro to mix and master at optimal levels (FCP.co).
The folks at Color Trix have come up with an ingenious solution to augment Final Cut Pro’s camera raw support. The new Color Finale Transcoder adds additional camera raw formats, notably Blackmagic RAW. Check out my review of the software (FCP.co).
Chilean filmmaker Maite Alberdi originally set out to document the work being done by private investigator Romulo Aitkin. The narrative became quite different, thanks to Romulo’s mole, Sergio Chamy. The charming, 83-year-old widower was hired to be the inside man to follow a case at a retirement home. Once on the inside, we see life from Sergio’s perspective.
The Mole Agent is a touching film about humanity, deftly told without the benefit of an all-knowing narrator or on-camera interviews. The thread that binds the film is often Sergio’s phoned reports to Romulo, but the film’s approach is largely cinema verite. Building that structure fell to Carolina Siraqyan, a Chile-based editor, whose main experience has been cutting short-form projects and commercials. I recently connected with Carolina over Zoom to discuss the post behind this Oscar contender.
* * *
Please tell me how you got the chance to edit this film.
I met Maite years ago while giving a presentation about editing trailers for documentaries, which is a speciality of mine. She was finishing The Grown-Ups and I’m Not From Here, a short documentary film. I ended up doing the trailers for both and we connected. She shared that she was developing The Mole Agent. I loved the mixture of film noir and observational documentary, so I asked to work on the film and ended up cutting it.
Did her original idea start with the current premise of the film or was the concept broader at that point?
Maite wanted to do a documentary about the workings of a private detective agency, since detectives are often only represented in fiction. She worked with Romulo for a few months and realized that investigations into retirement homes are quite common. She loved the idea for the film and started focusing on that aspect.
Romulo already had a mole that he used inside the homes on these cases, but the mole broke his hip. So Romulo placed a newspaper want ad for someone in his 80s who could work as his new mole on this case. A number of credible older men applied. Out of those applicants, Sergio was hired and turned out to be perfect for the film. He entered the retirement home after some initial training, including how to discreetly communicate with Romulo and how to use the spy cameras.
How was the director able to convince the home and the residents to be in the film?
The film crew had arrived a couple of weeks before Sergio. It was explained that they were doing a film on old age and would be focusing on any new residents in the home. So, the existing residents were already comfortable with the presence of the cameras before he arrived. Maite was very empathetic about where to place cameras so that they wouldn’t bother residents or interfere with what the staff was doing, even if that might not be the best location aesthetically.
Maite is very popular here. She’s written and directed a number of films about social issues and her point-of-view is very humble and very respectful. This is a good retirement home with nothing to hide, so both the staff and the residents were OK with filming. But to be clear, only people who consented appear in the film.
I understand that there were 300 hours of raw footage filmed for this documentary. How did you approach that?
The crew filmed for over three months. It’s actually more than 300 hours of footage, because of the spy cameras. Probably as much as 50 hours more. I couldn’t use a lot of that spy camera material, because Sergio would accidentally press record instead of pressing stop. The camera was in his pocket all the time, so I might have black for 20 minutes. [laugh]
I started on the project in January, after it had been shot and the camera footage merged with the sound files. The native footage was shot with Sony cameras in their MXF format. The spy cameras generated H.264 files. To keep everything smooth, I was working with proxy files.
Essentially I started from zero on the edit. It took me two months to categorize the footage. I have an assistant, but I wanted to watch all of the material first. I like to add markers while I’m watching and then add text to those markers as I react to that footage. The first impression is very important for me.
We had a big magnetic blackboard and I placed magnetic cards on the wall for each of the different situations that I had edited. Then Maite came in the middle of March and we worked together like playing Tetris to structure the film. After that we shifted to Amsterdam for two months to work in a very focused way in order to refine the film’s structure. The picture edit was completed in November and the final mix and color correction were done in December.
Did you have a particular method to create the structure of this documentary?
I feel that every film is different and you have to think a lot about how you are going to face each movie. In this film I had two certainties: the beginning – Romulo training Sergio – and the ending – what Sergio’s thoughts were. The rest is all emotion. That’s the spine. I have to analyze the emotion and how it converges toward the conflict. First, there’s the humor and then the evolution to the sadness and loneliness. That’s how I approached the material – by the emotion.
I color-coded the magnetic cards for different emotions. For example, pink was for the funny scenes. When Maite was there, the cards provided the big picture showing all the situations. We could look and decide if a certain order worked or not.
What sort of changes to the film came out of the review stage?
This is a very international film with co-producers in the United States, Germany, the Netherlands, Spain, and Chile. We would share cuts with them to get helpful feedback. It let us make the movie more universal, because we had the input of many professionals from different parts of the world.
When we arrived in Amsterdam the first cut of the film was about three hours long. Originally the first part was 30 minutes long and that was cut down to 10 minutes. When we watched the longer cut, we felt that we were losing interest in the investigation; however, the relationship that Sergio was establishing with the women was wonderful. All the women are in love with him. It starts like film noir, but with humor. So we focused on the relationships and edited the investigation parts into shorter humorous segments that were interspersed throughout the film.
The reality was incredible and definitely nothing was scripted. But some of the co-producers commented that various scenes in the film didn’t feel real to them. So, we considered those opinions as we were tightening the film.
You edited this film with Adobe Premiere Pro. How do you like using it and why was it the right tool for this film?
I started on film with Moviola and then edited on U-matic, which I hated. I moved to Avid, because it was the first application we had. Then I moved to Final Cut Pro; but after FCP7 died, I switched to Premiere Pro. I love it and am very comfortable with how the timeline works. The program leaves you a lot of freedom as to how and where you put your material. You have control – none of that magnetic stuff that forces you to do something by default.
Premiere Pro was great for this documentary. If a program shuts down unexpectedly, it’s very frustrating, because the creative process stops. I didn’t have any problems even though everything was in one large project. I did occasionally clean up the project to get rid of stuff I wasn’t using, so it wasn’t too heavy. But Premiere allowed me to work very fluidly, which is crucial.
You completed The Mole Agent at the end of 2019. That’s prior to the “work from home” remote editing reality that most of the world has lived through during this past year. What would be different if you had worked on the film a year later?
The Mole Agent was completed in time for Sundance in January of 2020. Fortunately we were able to work without lockdowns. I’ve worked a lot remotely during this past year and it’s difficult. You get accustomed to it, but there is something missing. You don’t get the same feeling looking through a [web] camera as being together in the room. Something in the creative communication is lost in the technology. If the movie had been edited like this [communicating through Zoom] – and considering the mood during the lockdowns and how that affects your perception of the material – then it really would be a different film.
Any final thoughts about your experience editing this film?
I had previously worked sporadically on films, but have spent most of my career in the advertising industry. A few years ago I decided that I wanted to work full-time on long-form films. Then this project came to me. So I was very open during the process to all of the notes and comments. I understood the process, of course, but because I had worked so much in advertising, I now had to put this new information into practice. I learned a lot!
The Mole Agent is a very touching film. It’s different – very innovative. For people who have seen it, the film affects the conscience and moves them to take action. I feel very glad to have worked on this film.
We now have the latest machine in the transition to Apple’s Arm-based M1 system-on-a-chip – the new 24” M1 iMac. This is the same integrated chip used in the other M1 machines – the laptops and the Mac mini. Apple launched it in a bouquet of seven colors, harkening back to the colorful original iMac G3s. Of course, the iMac itself is the natural descendant of the first Mac. Along with the color options, there are color-coordinated accessories, including the mouse and a new round-edge keyboard with Touch ID.
This is the first in what is presumably a line of several new iMacs. It’s targeted at consumers; however, the M1 Macs have proven to be more than capable enough for editing, especially with Apple Final Cut Pro. This model has a 24” screen and is 11.5mm thick. That’s only slightly thicker than the 12.9” iPad Pro! It replaces the former 21.5” model. With slimmer bezels you get 4.5K Retina resolution in similar screen real estate to the older model. There’s an upgraded audio system (mics, speakers, and Dolby Atmos support) and a 1080p camera. According to Apple, the M1 iMac will also drive an external 6K Pro Display XDR.
An interesting design choice is the return of the puck-like power adapter. This enables a magnetic power plug, à la the older MagSafe plugs. Ethernet connects to the adapter instead of the back of the iMac. Personally, I feel this is a poor design choice. I had to deal with those adapters on Apple displays over the years and have been more than happy not to have them on the Intel iMacs. I’d rather see a slightly thicker iMac without the adapter. Although many do like this approach, because the ultra-slim profile has a cool factor. I can also appreciate that Apple designers wanted to get rid of the bump at the back of the iMac. It just seems to me that there might have been a middle ground that didn’t require the puck and would be equally as stunning. Either way, it’s certainly not a showstopper.
These new iMacs become available in the second half of May, but special configurations have not yet been listed on the pricing page. If predictions are correct, then later this year Apple will likely release a more powerful iMac, featuring the next iteration of the M-series chip. M1X, M2, another name – who knows? Presumably this will include a larger screen (27”, 30”, 32”?), but given the addition of XDR technology to the iPad Pro, one now has to wonder whether such an iMac would also include an XDR screen. Since the iMac color scheme no longer includes space gray, will the more advanced iMac be offered in space gray? And would such a model be called iMac Pro again or maybe iMac XDR? All just speculation at this point.
The iPad Pro’s A-series processor has now been upgraded to an M1 chip. Since these were already cousins, it’s not really clear to me what the performance difference will be. Apple speaks of huge gains in performance, but it’s not clear what those comparisons are based on. There are a number of enhancements, like 5G and the 12MP ultra wide camera, but I’ll focus on two production-related upgrades.
The iPad Pro now supports Thunderbolt connectivity, enabled by the M1. However, it may or may not work in the same way as connecting a drive to a Mac. But you will be able to connect it to an external display like the XDR. You can run some iPadOS apps on a Mac with Big Sur, but I doubt that you’ll be able to run a macOS app on an iPad Pro. That may be a limitation of the more stripped-down iPadOS. It could also be because of the different design choices that have to be made for touch versus keyboard/mouse interaction.
The big improvement is in the 12.9” iPad Pro, which gains a new Liquid Retina XDR display. It’s based on the same technology as the Pro Display XDR, which should result in stunning images on a tablet. It will offer 1,000 nits of full-screen brightness and 1,600 nits of peak brightness. This will be interesting for location productions if DPs adopt the iPad Pro as a device to proof shots and create looks.
A final point to note is that Apple has successfully introduced a processor architecture that scales from tablet to desktop with the same chip. Gone are the Intel Core i3 through i9 configurations. Obviously, more powerful Macs will require more powerful chip versions; but, we’ll have to see whether these future configuration options become as varied as with Intel or AMD. I’m sure that will become clearer by the end of 2021. All in all, the spring event had some nice new products along with some incremental updates. Let’s see how this Apple Silicon transition continues to shape up.
In the “good old days” of post, directors, cinematographers, and clients would all judge final image quality in an online edit or color correction suite using a single, calibrated reference monitor. We’ve moved away from rooms that look like the bridge of the Enterprise into more minimalist set-ups. This is coupled with the current and possibly future work-from-home and general remote post experiences. Without everyone looking at the same reference display, it becomes increasingly difficult to be sure that what everyone sees is actually the proper appearance of the image. For some, clients rarely come into the suite anymore. Instead, they are often making critical judgements based on what they see on their home or work computers and/or devices.
The lowest common denominator
Historically, a common item in most recording studios was a set of Auratone sound cubes. These small, single speaker monitors, which some mixers dubbed “awful-tones,” were intended to provide a representation of the mix as it would sound on radios and cheaper hi-fi audio set-ups. TV show re-recording mixers would also use these to check a mix in order to hear how it would translate to home TV sets.
Today, smart phones and tablets have become the video equivalent of that cheap hi-fi set-up. Generally that means Apple iPhones or iPads. In fact, thanks to Apple’s color management, videos played back on iPads and iPhones do approximate the correct look of your master file. As editors or colorists, we often ask clients to evaluate the image on an Apple device, not because they are perfect (they aren’t), but rather because they are the best of the many options out in the consumer space. In effect, checking against an iPhone has become the modern video analog of the Auratone sound cubes.
Apple color management
Apple’s color management includes several techniques that are helpful, but can also trip you up. If you are going to recommend that your clients use an iPhone, iPad, or even an iMac to judge the material, then you also want to make sure they have correctly set up their device. This also applies to you, the editor, if you are creating videos and only making judgements on an iMac (or XDR) display, without any actual external video display (as opposed to the computer display).
Apple computers enable the use of different color profiles and the ability to make adjustments according to calibration. If you have a new iMac, then you are generally better off leaving the color profile set to the default iMac setting instead of fiddling with other profiles. New Apple device displays are set to P3 D65 color with a higher brightness capacity (up to 500 nits – more with XDR). You cannot expect them to perfectly reproduce an image that looks 100% like a 100-nit Rec 709 TV set. But they do get close.
I routinely edit/grade with Media Composer, Premiere Pro, DaVinci Resolve, and Final Cut Pro on iMacs and iMac Pros. Of these four, only Final Cut Pro shows an image in the edit viewer window that is relatively close to the way that image appears on the video output to a monitor. This is thanks to Apple’s color management and the broader Apple hardware/software ecosystem. The viewer image for the other three may look darker, be more saturated, have richer reds, and/or show more contrast.
Once you get past the color profile (Mac only), then most Apple devices offer two or three additional user controls (depending on OS version). Obviously there’s brightness, which can be manual or automatic. When set to automatic, the display will adjust brightness based on the ambient light. Generally auto will be fine, unless you really need to see crucial shadow detail. For example, the PLUGE portion of a test pattern (the darkest gray patches) may not be discernible unless you crank up the brightness or are in a dark room.
The other controls are True Tone and Night Shift. They do have a pleasing effect, because these features make the display warmer or cooler based on the time of day or the color temperature of the ambient light in the room. Typically the display will appear warmer at night or in a dimmer room. If you are working with a lot of white on the screen, such as documents, then these modes do feel more comfortable on your eyes (at least for me). However, your brain adjusts to the color temperature shift of the display when using something like True Tone. The screen doesn’t register in your mind as being obviously warm.
If you are doing anything that involves judging color, the LAST thing you want to use is True Tone or Night Shift. This applies to editing, color correction, art, photography, etc. It’s important to note that these settings only affect the way the image is displayed on the screen. They don’t actually change the image itself. Therefore, if you take a screen grab with True Tone or Night Shift set very cool or warm, the screen grab itself will still be neutral.
In my case, I leave these off for all of the computers I use, but I’m OK with leaving them on for my iPhone and iPad. However, this does mean I need to remember to turn the setting off whenever I use the iPhone or iPad to remotely judge videos. And there’s the rub. If you are telling your client to remotely judge a video using an Apple device – and color is part of that evaluation – then it’s imperative that you ask them (and maybe even teach them how) to turn off those settings. Unless they are familiar with the phenomenon, the odds are that True Tone and/or Night Shift has been enabled on their device(s) and they’ve never thought twice about it, simply because the mind adjusts.
QuickTime Player is the default media player for many professionals and end users, especially those using Macs. The way QuickTime displays a compatible file to the screen is determined by the color profile embedded into the file metadata. If I do a color correction session in Resolve, with the color management set to Rec 709 2.4 gamma (standard TV), then when I render a ProRes file, it will be encoded with a color profile of 1-2-1 (the 2 indicates 2.4 gamma).
If I export that same clip from Final Cut Pro or Premiere Pro (or re-encode the Resolve export through one of those apps), the resulting ProRes now has a profile of 1-1-1. The difference through QuickTime Player is that the Resolve clip will look darker in the shadows than the clip exported from FCP or Premiere Pro. Yet the image data in both files is exactly the same. It’s merely how QuickTime Player displays it to the screen based on the metadata. If I open both clips in different players, like Switch or VLC, which don’t use this same metadata, then they will both appear the same, without any gamma shift.
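Under the hood, those profile numbers are stored as three 16-bit indices (color primaries, transfer function, matrix) in a ‘colr’ atom inside the movie file. The following Python sketch is purely illustrative – the function name is mine, and a real file requires walking the atom tree to locate ‘colr’ first (tools like ffprobe or MP4Box report the same fields for you) – but it shows how those three numbers are actually encoded:

```python
import struct

def parse_colr_payload(payload: bytes):
    """Parse the payload of a QuickTime/MP4 'colr' atom.

    For an 'nclc' (QuickTime) or 'nclx' (MP4) color parameter atom,
    the payload is a 4-byte type tag followed by three big-endian
    16-bit indices: primaries, transfer function, and matrix.
    """
    tag = payload[:4].decode("ascii")
    if tag not in ("nclc", "nclx"):
        raise ValueError(f"unsupported colr type: {tag}")
    return struct.unpack(">HHH", payload[4:10])

# The two profiles discussed above, built as synthetic payloads:
print(parse_colr_payload(b"nclc" + struct.pack(">HHH", 1, 2, 1)))  # → (1, 2, 1)
print(parse_colr_payload(b"nclc" + struct.pack(">HHH", 1, 1, 1)))  # → (1, 1, 1)
```

In other words, the 1-2-1 versus 1-1-1 difference is nothing more than these few bytes of metadata, which is why re-wrapping or re-encoding the same pixels can change how QuickTime Player renders them.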
How should one deal with such uncertainties? Obviously, it’s a lot easier to tackle when everyone is in the same room. Unfortunately, that’s a luxury that may disappear entirely. It already has for many. Fortunately most people aren’t as sensitive to color issues as the typical editor, colorist, or DP. In my experience, people tend to have greater issues with the mix than they do with color purity. But that doesn’t preclude you from politely educating your client and making sure certain best practices are followed.
First, make sure that features like True Tone and Night Shift are disabled, so that a neutral image is being viewed. Second, if you use a review-and-approval service, like frame.io or Vimeo, then you can upload test chart image files (color bars, grayscale, etc). These may be used whenever you need to check the image with your client. Is the grayscale a neutral gray in appearance or is it warmer or cooler? Can you see separation in the darkest and brightest patches of these charts? Or are they all uniformly black or white? Knowing the answers will give you a better idea about what the client is seeing and how to guide them to change or improve their settings for more consistent results.
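If you don’t have broadcast chart files on hand, even a simple grayscale step chart can serve for this check. Here’s a small, hypothetical Python sketch (the function names are my own) that writes one as a binary PGM image – a bare-bones grayscale format; convert it to PNG or JPEG before uploading if your review service doesn’t accept PGM:

```python
def chart_values(steps=11):
    """Evenly spaced 8-bit gray levels from black (0) to white (255)."""
    return [round(i * 255 / (steps - 1)) for i in range(steps)]

def write_chart(path, steps=11, patch_w=64, height=64):
    """Write the patches side by side as a binary (P5) PGM image."""
    values = chart_values(steps)
    width = patch_w * len(values)
    # One scanline: each patch value repeated patch_w times.
    row = bytes(v for v in values for _ in range(patch_w))
    with open(path, "wb") as f:
        f.write(f"P5 {width} {height} 255\n".encode("ascii"))
        f.write(row * height)

print(chart_values(5))  # → [0, 64, 128, 191, 255]
```

On a well-adjusted display every neighboring patch should remain clearly distinguishable and neutral gray; use more steps if you want a finer check of the near-black and near-white range.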
Finally, if their comments seem to relate to a QuickTime issue, then suggest using a different player, such as Switch (free with watermarks will suffice) or VLC.
The brain, eyes, and glasses
Some final considerations… No two people see colors in exactly the same way. Many people suffer from mild color blindness, i.e. color vision deficiencies. This means they may be more or less sensitive to shades of some colors. Eyeglasses also affect what you see. For example, many lenses, depending on the coatings and material, will yellow over time. I cannot use polycarbonate lenses, because I see chromatic aberration on highlights when wearing this material, even though most opticians and other users don’t see that at all. CR-39 (optical plastic) or glass (no longer sold) are the only eyeglass materials that work for me.
If I’m on a flight in a window seat, then the eye closest to the window is being bombarded with a different color temperature of light than the eye towards the plane’s interior. This can be exacerbated with sunglasses. After extended exposure to such a differential, I can look at something neutral and, closing one eye or the other, see the image with a drastically different color temperature for each eye. This eventually normalizes itself, but it’s an interesting phenomenon.
A regrettable aspect of history and the march of time is that many interesting stories are buried or forgotten. We learn the bullet points of the past, but not the nuances that bring history alive. It’s a challenge that many documentarians seek to meet. While the WWII era is rife with heroic tales, one unit was almost forgotten – the WASP, the Women Airforce Service Pilots.
The WASP members engaged in military-style training at Avenger Field in Sweetwater, Texas. They wore uniforms and were given flight assignments by the military, yet they weren’t actually in the military. Their role was to handle all non-combat military flight tasks within the states, including ferrying aircraft cross-country from factories to deployment bases, serving as test pilots, and handling training tasks like towing targets and mock strafing runs over combat trainees. During her service, the typical WASP would fly more types of aircraft than most male military pilots. Sadly, 38 WASP died during training or active duty assignments.
Although WASP members joined with the promise of their unit becoming integrated into the regular military, that never happened. As the war wound down and male pilots returned home needing jobs, the WASP units were disbanded, due in part to Congressional and media resistance. Records were sealed and classified and the WASP were almost forgotten by history. Finally, in the late 1970s, President Carter signed legislation that recognized WASP members as veterans and authorized veterans benefits. In 2009 President Obama and the Congress awarded WASP members the Congressional Gold Medal.
Documentary filmmaker Jon Anderson set out over a decade ago to tell a complete story of the WASP in a feature-length film. Anderson, a history buff, had already produced and directed one documentary about the Tuskegee Airmen. So the WASP story was the next logical subject. The task was to interview as many living WASP as possible to tell their story. The goal was not just the historical facts, but also what it was like to be a WASP, along with some of the backstory details about Jacqueline Cochran and the unit’s formation. The result was W.A.S.P. – A Wartime Experiment in WoManpower.
Anderson accumulated a wealth of interviews, but with limited resources. This meant that interviews were recorded mostly on DV cameras in standard definition. However, as an instructor of documentary filmmaking at Valencia College, Anderson also utilized some of the film program’s resources in the production. This included a number of re-enactments – filmed with student crews, talent, and RED cameras. The initial capture and organization of footage was handled by a previous student of his using Final Cut Pro 7.
Jon asked me to join the project as co-editor after the bulk of interviews and re-enactments had been compiled. Several dilemmas faced me at the front end. The project was started in FCP7, which was now a zombie application. Should I move the project to Final Cut Pro X, Premiere Pro, or Media Composer? After a bit of experimentation, the best translation of the work that had already been done was into Premiere Pro. Since we had a mix of SD and HD/4K content, what would be the best path forward – upconvert to HD or stay in standard def? HD seemed to be the best option for distribution possibilities, but that posed additional challenges.
Only portions of tapes were originally captured – not complete tapes. These were also captured with separated audio and video going to different capture folders (a feature of FCP “classic”). Timecode accuracy was questionable, so it would be nearly impossible to conform the current organized clips from the tapes at a higher resolution. But since it was captured as DV from DV tapes, there was no extra quality loss due to interim transcoding into a lower resolution file format.
Ultimately I opted to stick with what was on the drives as my starting point. Jon and I organized sequences and I was able to borrow a Blackmagic Teranex unit. I exported the various sequences between two computers through the Teranex, which handled the SD to HD conversion and de-interlacing of any interlaced footage. This left us with upscaled ProRes interviews that were 4×3 within a 16×9 HD sequence. Nearly all interviews were filmed against a black limbo background, so I then masked around each woman on camera. In addition, each was reframed to the left or right side, depending on where they faced. Now we could place them against another background – either true black, a graphic, or B-roll. Finally, all clips were graded using Lumetri within Premiere Pro. My home base for video post was TinMen – an Orlando creative production company.
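For what it’s worth, the geometry of that 4×3-inside-16×9 framing is simple arithmetic. Here is a small illustrative Python sketch (the function name is mine) for a centered placement; in the actual film each masked interview was then pushed to the left or right rather than centered, but the scaled width is the same:

```python
def pillarbox(frame_w, frame_h, source_aspect):
    """Scale a source to full frame height and return its scaled
    width plus the width of each black pillar beside a centered fit."""
    scaled_w = round(frame_h * source_aspect)
    pillar = (frame_w - scaled_w) // 2
    return scaled_w, pillar

# A 4x3 SD interview upscaled into a 1920x1080 HD frame:
print(pillarbox(1920, 1080, 4 / 3))  # → (1440, 240)
```

So each upscaled interview occupied a 1440-pixel-wide region of the 1920-wide frame, leaving 480 pixels of slack to place the subject left or right against the replacement background.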
Refining the story
With the technical details sorted out, it was time to refine the story. As with many docs, you end up with more possible storylines than will fit. It’s always a whittling process to reveal a story’s essence and to decide which items are best left out so that the rest remains clear. Interviews were bridged with voice-overs plus archival footage, photos, or re-enactments to fill in historical details. This went through numerous rounds of refinement with input from Jon and Rachel Becker Wright, the producer and co-editor on the film. Along the way Rachel was researching, locating, and licensing archival footage for B-roll.
Once the bulk of the main storyline was assembled with proper voice-overs, re-enactments, and some B-roll, I turned the cut over to Rachel. She continued with Jon to refine the edit with graphics, music, and final B-roll. Sound post was handled by the audio production department at Valencia College. A nearly-final version of the 90-minute documentary was presented at a “friends and family” screening at the college.
Many readers know about the national Emmy® Awards handed out annually by the National Academy of Television Arts and Sciences (NATAS). It may be less known that NATAS includes 19 regional chapters, which also award Emmys within their chapters. Awards are handed out for projects presented in that region, usually via local broadcast or streaming. Typically the project as a whole wins the award, without separate craft categories. Anderson was able to submit a shortened version of the documentary for judging by the Suncoast regional chapter, which includes Florida, Puerto Rico, and parts of Louisiana, Alabama, and Georgia. I’m happy to say that W.A.S.P. – A Wartime Experiment in WoManpower won a 2020 regional Emmy, which included Jon Anderson, Rachel Becker Wright, Joe Stone (production designer), and myself.
Awards are nice, of course, but getting the story out about the courageous ladies of the WASP is far more important and I was happy to play a small part in that.