Last Night in Soho

Edgar Wright has written and directed a string of successful comedies and cult classics. His latest, Last Night in Soho, is a change from this pattern. It’s a suspense thriller that would make Hitchcock proud. Eloise (Thomasin McKenzie) is a young English country girl who’s moved to the Soho area of London to study fashion design. Her nightly dreams transport her into the past of the swinging ’60s Soho nightlife, where she observes and becomes strangely intertwined with the life of Sandie (Anya Taylor-Joy), an aspiring singer. Those dreams quickly evolve into nightmares as the story turns more sinister.

Last Night in Soho was edited by Paul Machliss, ACE, a frequent collaborator with Wright. Machliss edited Scott Pilgrim vs. The World, The World’s End, and Baby Driver. The latter picked up a BAFTA win and an Oscar nomination for best editing. I recently had a chance to interview Machliss during a break from cutting his next feature, The Flash.

______________________________

Music was a key plot device and impetus for much of the editing in Baby Driver. In Last Night in Soho, music of the 1960s also plays a key role, but in a less blatant manner.

Baby Driver was about capital E editing. Edgar and I took a lot of what we learned and applied it to Soho. Can we do it in a more subtle way? It’s not immediately obvious, but it’s probably just as intense. It just doesn’t show itself to the same level.

Many shots that look like sophisticated visual effects were done practically, such as the dance sequence involving both lead actresses and Matt Smith. What were some of the other practical effects?

One clever shot was the bit in the phone box, where right at the end of the dance, they go in for a snog. Initially the reflection is that of Matt [Jack] and Anya [Sandie]. Then as Matt pulls back, you realize that it becomes Thomasin [Eloise] in the mirror. When they start kissing, it really is a mirror. In the middle of the kiss there’s an effects team yanking the mirror back to reveal Thomasin and a stand-in behind it. The only visual effect is to get rid of the moment when the mirror is yanked.

I’d love to say it was full of subtle things that couldn’t have been done without editing. However, it almost goes back to the days of Vaudeville or the earliest days of cinema. Less is more – those simple things are still incredibly effective.

As with Baby Driver, I presume you were editing primarily on the set initially?

I was on set every day. We did about three weeks straight of night shoots at the height of summer in Soho. I wouldn’t say we owned Soho, but we could just literally run anywhere. We could park in Soho Square and wheel the trolleys wherever we needed for the purposes of filming. However, Soho didn’t close down – life carried on in Soho. It was fascinating to see the lifestyle of Soho change when you are filming from 6:00 PM till 6:00 AM. It’s in the small hours that elements of the ‘darker’ side of Soho start to appear and haunt the place. So that slightly sinister level of it hasn’t gone away.

Being on set had a lot to do with things like music playback, motion control cameras, and certainly lighting, which played an even bigger part in this film than it did in Baby Driver. I found myself somewhat responsible for the timing – making sure that Edgar’s idea of how he wanted the lighting to work in sync with the music was 100% successful on the night, because there was no fixing it afterwards.

Some editors avoid being on set to maintain objectivity. Your working style seems to be different.

My MO really is not to do the perfect assembly edit in the midst of all the madness of filming. What I’m trying to do for Edgar is a proof of concept. We know as we’re shooting that issues can arise from continuity, camera angles, and various other things. So part of what I’m doing on set is keeping an eye on all the disparate elements from a technical perspective to make sure they’re all in harmony to support Edgar’s vision. But as the editor, I still need to make time for an assembly. Sometimes that meant getting to set an hour or two early. Then I just sit at the Avid with headphones and quickly catch up on the previous day’s work, where I do try and make more of a concerted effort to cut a good scene. Then that gets passed on to the editorial department. By the time Edgar and I get back into the cutting room, we have a fully assembled film to start working on.

Lighting design – especially the neon sign outside of Ellie’s window – drives the scene transitions.

Edgar’s original brief was, “There is a point where the lighting goes from blue, white, red, blue, white, red, and then whenever she transitions into the past, it goes into a constant flashing red.” That’s something that any good lighting operator can cue quite simply. What made it more interesting is that Edgar said, “What I’d love is for the lighting to flash subtly, but in time to every different bit of music that Eloise puts on her little record player.” Then it was like, “Oh, right, how do we do that?”

Bradley Farmer on our music team was able to break the songs down into a kind of beat sheet with all the lyrics and the chorus. Edgar would go, “According to the storyboards this is the line in the song that I’d like it to start going red, red, red.” Armed with that knowledge, I had one track of audio with the music in mono and another track with longitudinal timecode – a different hour code for every song. I would edit full screen color images of red, white, and blue as a reference on the Avid to match what the timing of the color changes should be against the track.
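
To make the timing math concrete, here is a minimal sketch of how a cue list like that could be derived from a beat sheet, assuming nothing more than rows of timecode and color at 24fps. The data, field layout, and helper function are hypothetical illustrations, not Machliss’s actual tooling.

```python
# Illustrative only: build a frame-accurate colour cue list from a simple
# beat sheet of (timecode, colour) rows at 24 fps. The beat-sheet data and
# layout are hypothetical, not the production's actual documents.

FPS = 24

def tc_to_frames(tc, fps=FPS):
    """Convert an HH:MM:SS:FF timecode string to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

beat_sheet = [
    ("01:00:00:00", "blue"),   # song intro
    ("01:00:02:12", "white"),
    ("01:00:05:00", "red"),
    ("01:00:07:12", "red"),    # chorus: constant red from here on
]

# Each cue runs until the next one starts; pad the final cue by one second.
cues = []
for i, (tc, colour) in enumerate(beat_sheet):
    start = tc_to_frames(tc)
    end = tc_to_frames(beat_sheet[i + 1][0]) if i + 1 < len(beat_sheet) else start + FPS
    cues.append((start, end - start, colour))

for start, duration, colour in cues:
    print(f"frame {start}: {colour} for {duration} frames")
```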

Next, I was able to export an XML file out of the Avid, which could be read by the lighting panel computer. The lighting operator would load these sequences in so the panel would know when to have the lights in their ‘on’ or ‘off’ state. He also had a QuickTime reference from the Avid so he could see the color changes against the burnt-in timecode and know, “Next one’s red, program the SkyPanel to go red.”
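
The interview doesn’t describe the actual file format exchanged between the Avid and the lighting desk, so as a rough illustration only, a cue list like the one above might be serialized into XML along these lines. The element and attribute names are invented for the example.

```python
# Purely hypothetical schema: the real Avid-to-lighting-panel exchange format
# isn't described in the interview, so the element and attribute names below
# are invented for illustration only.
import xml.etree.ElementTree as ET

cues = [(0, 60, "blue"), (60, 60, "white"), (120, 96, "red")]  # (start, duration, colour) in frames

root = ET.Element("cuelist", fps="24")
for start, duration, colour in cues:
    ET.SubElement(root, "cue", start=str(start), duration=str(duration), colour=colour)

ET.ElementTree(root).write("lighting_cues.xml", encoding="utf-8", xml_declaration=True)
```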

Our music playback guy, Pete Blaxill, had copies of the tracks in stereo and was able to use my timecode as his base timecode. He then sent that timecode to the lighting panel. So if Edgar goes, “I now want to pick up the song from the first chorus,” then the lighting panel would chase the playback timecode. Once the sequence was set at the lighting panel, wherever Edgar wanted to go, the lighting desk knew which part of the song we were at and what the next color in the sequence was.

To make things a tad more complex, Edgar wanted to shoot some of the action to the playback at 32fps so there could be a dreamlike quality to movement at certain points in the song. This meant creating a lighting map that would work at 32fps, as well as the regular 24fps version. Bradley Farmer gave me versions of the songs that were exactly 33.3% faster and pitch-corrected. I reformatted my Avid sequence so everything just went on and off a third faster for the 32fps tracks. And once again, gave the XML file to the lighting guy and the sped-up tracks to Pete for his Pro Tools system.
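
As a worked example of that retime, using the same hypothetical cue list as above: the 32fps music runs 32/24 = 1.333x faster, so scaling the 24fps cue times by 24/32 = 0.75 makes every color change land a third sooner in real time.

```python
# Sketch of the retime for the 32 fps passes: scale the hypothetical 24 fps
# cue times by 24/32 = 0.75 so every lighting change fires a third sooner.
SPEED = 24 / 32  # 0.75

def retime(cues, factor=SPEED):
    """Scale cue starts and durations; round to stay frame-accurate."""
    return [(round(start * factor), round(duration * factor), colour)
            for start, duration, colour in cues]

cues_24 = [(0, 60, "blue"), (60, 60, "white"), (120, 96, "red")]
print(retime(cues_24))  # [(0, 45, 'blue'), (45, 45, 'white'), (90, 72, 'red')]
```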

I realized that the motion control camera could also be triggered by external timecode from Pete’s Pro Tools system. We utilized that at the climax of ‘You’re My World’ where Anya descends the staircase in the Cafe de Paris ballroom. This was a two-layer composite shot filmed with a motion control camera in multiple passes. Thomasin does one pass descending the staircase and then we did Anya’s pass. We also had singer Beth Singh [Cilla Black] in the foreground of the shot with her backing band behind her. Pete Blaxill would hit play on his Pro Tools. The music would be coming through the foldback for Beth to mime to, the lighting switched in at the right musical point, and then at exactly the right moment in the chorus the Pro Tools timecode would trigger the MoCo. I remember sitting cross-legged on the floor out of the way in the corner and watching all this happen. It’s incredibly ‘nerdy,’ but it gave one a wonderful feeling of satisfaction to be part of the team that could make moments like this happen seamlessly.

What formats were used to capture Last Night in Soho?

This was all shot 35mm, except that we used an ARRI Alexa for the night shoots. We tested both 16mm and 35mm for these exteriors. The 16mm looked good, but was too grainy and the highlights flared out tremendously. The 35mm was okay, but required a lot more lighting to achieve the desired look. Of course with the Alexa being digital, you put it there without any extra lighting at all, look at the result, and go, “Gosh, look at the amount of detail – that’s almost usable as it is.” Just because of the variety of shots we needed, Edgar and DP Chung-hoon Chung decided to use the Alexa for exterior night locations and any interior location where we would be looking out into the street at some point – for example, the library that Eloise visits later in the film.

Does the choice of film rather than digital acquisition add some technical challenges?

Fortunately, Kodak is still a company that is churning out 35mm stock full-time. The infrastructure around it, though, keeps shrinking. Just in the time between doing The World’s End and Last Night in Soho, the number of facilities in the UK that can process 35mm has dropped to one. And it has to process the rushes for every production that’s shooting on 35mm in the UK.

Even so, it’s great to think that a medium like 35mm can still support something as complicated as one of Edgar’s films without having to do everything in the digital domain. You’ve got the [Panavision] Millennium XL cameras – even though they’re nearing 20 years old, they’re pretty solid. 

Edgar is totally aware that he’s not shooting on an ‘archaic’ format for the sake of it, but that it’s his preferred medium of acquisition – in the same way some painters choose water-based paints as opposed to oil-based. He knows what medium he’s on and he respects that. You might say, “Yes, but the punter won’t necessarily know or appreciate what format this was shot on.” However, it helps to contribute to the feeling of our production. It’s part of the look, part of the patina – that slightly organic feel of the media brings so much to the look of Soho.

Because you are cutting on set, you’re working from a video tap off of the film camera.

Right. Once again, I had a little network set up with a large spool of Cat 5 cable connected to the Qtake system. And I would put myself out of the way in a corner of the soundstage and get on with it. I would just be a crew member quietly doing my job editing, not bothering Edgar. Sometimes he might not see me for a couple of hours into the day until he needed my input on a shot.

So that means at some point the low-resolution video tap clips have to be replaced by the actual footage after the film has been transferred.

That’s right. I used my editorial team from Baby Driver – Jerry Ramsbottom and Jessica Medlycott. Both of them were well-versed in the technique of over-cutting the low-res video tap footage once the hi-res Avid media came back from the lab. I never had to worry about that. The cutting room was also in Soho and when we were shooting there, if Edgar ever had a question about the footage or the previous day’s edit he would say, “Could we meet an hour early (before call-time) and pop into the cutting room to look at some stuff?” It also meant that we could tweak the edit before that day’s shoot and give Edgar a better idea of his goals for that day.

Please tell me about the production and post timeline, considering that this was all impacted by the pandemic.

Prep started at the beginning of 2019. I came on board in April. The shoot itself went from May till early September. We then had the director’s cut period, which due to the complexity took slightly longer than your average 10 weeks.

We worked up until the new year and had a preview screening early in January, which got some good numbers back. Edgar was able to start talking to the studio about some additional photography. We planned that shoot for the last week of March 2020. But at the last minute it was cancelled as we entered the first lockdown. However, visual effects work continued, because all of that could be done remotely. We had a weekly VFX review using Zoom and Cinesync with Double Negative for about a four-month period during the first lockdown. That was a bit of a lifesaver for us to know that the film still had a heartbeat during that time.

Things calmed down at the end of July – enough for the studio to consider allowing us to remount the additional photography. And of course, it was a new world we were living in by that stage. Suddenly we were working in ‘zones’ on set. I was assigned the zone with the video department and couldn’t go on to set and work with Edgar. We had an area – divided by plastic fencing – where we could be. We would have to maintain a distance with him on one side and me on the other. Fortunately, I had my edit-trolley modified so that the A-Grade monitor was on a swivel-mount and that’s how he was able to keep an eye on the progress of the work.

We were only the second production in England to resume at that time. I think other productions were watching us and thinking, “Would we all just collapse like flies? Would COVID just go ‘BAM!’ and knock us all out?” Overall, the various PCR testing and new health and safety procedures added about an extra 20% to the budget, but it was the only way we were going to be allowed to shoot.

The reshoots went very well and we had another preview screening in October and the numbers were even better. But then we were approaching our second lockdown in the UK. However, this time Edgar and I were able to see post-production all the way through. All of our dates at Twickenham Studios for the sound mix and at Warner Bros. De Lane Lea for the grade could be honored, even though strict safety precautions were in place.

We delivered the film on December 18th of last year, having made all the various HDR and SDR versions for the UHD Blu-ray, as well as general release. We did a wonderful Dolby Vision/Dolby Atmos version, which is actually the ultimate way to see the film. Because of the pandemic and the lack of open theaters at the time, there was a school of thought that this film should be released directly onto a streaming platform. Fortunately, producers Eric Fellner and Nira Park viewed the final grade and mix at Twickenham and said, “No, no, this is a cinematic experience. It was designed to be seen on the big screen, so let’s wait. Let’s bide our time.”

Edgar Wright was also working on the documentary, The Sparks Brothers. Was the post on these two simultaneous? If so, how did this impact his availability to you during the Soho post?

The timelines to the two projects were kind of parallel. Making a documentary is about gathering tons of archival footage and obtaining new interviews. Then you can leave it with the editor to put a version of it together. I remember that Edgar had done a ton of interviews before we started shooting Soho. He’d done all of the black-and-white interviews in London or in LA pre-pandemic. The assembly of all of that footage happened during our shoot. Then when the lockdown occurred, it was very easy for Paul Trewartha, the editor of the Sparks documentary, to carry on working from home.

When we came back, the Soho and Sparks cutting rooms were both on different floors of our edit facility on Wardour Street in Soho. They were on the first floor and we were on the top floor. Edgar would literally bounce between the two. It got a little bit fraught for Edgar, because the grading and dubbing for both films happened at the same time at different facilities. I remember Edgar had to sign off on both films on December 18th. So he had to go to one facility to watch Soho with me and then he went off to watch Sparks with Paul, which I imagined gave him quite a degree of satisfaction to complete both huge projects on the same day.

To wrap it up, let’s talk edit systems. Avid Media Composer is still your go-to NLE. Right?

Avid, yeah. I’ve been running it for God knows how long now. It’s a little like touch-typing for me – it all just happens very quickly. When I watch all the dailies of a particular scene, I’m in almost a trance-like state, putting shots together very quickly on the timeline before taking a more meticulous approach. I know the ballistics of the Avid, how it behaves, how long it takes between commands. It’s still the best way for me to connect emotionally to the material. Plus on a technical level – in terms of media management, having multiple users, VFX editors, two assistant editors – It’s still the best.

In a past life you were a Smoke editor. Any closing thoughts about some of the other up-and-coming editing applications?

You certainly can’t call them the poor cousins of post-production. Especially Resolve. Our colorist, Asa Shoul, suggested I look at Resolve. He said, “Paul, you really should have a look, because not only is the cutting intuitive, but you could send me a sequence, to which I could apply a grade, and that would be instantly updated on your timeline.” Temp mixes from the sound department would work in a very similar way. I think that sort of cross-pollination of ideas from various departments, all contributing elements to a sequence, which itself is being continually updated, is a very exciting concept.

I wouldn’t be surprised if one day someone said, “Paul, we are doing this film, but we’re going to do it on Resolve, because we have this workflow in place and it’s really good.” At my advanced age of 49 [laugh], I don’t want to think, “Well, no, it’s Avid or nothing.” I think part of keeping your career fresh is the ability to integrate your skills with new workflow methods that offer up results, which would have seemed impossible ten years earlier. 

Images courtesy of Focus Features.

This article is also available at postPerspective.

For more, check out Steve Hullfish’s Art of the Cut interview with Paul Machliss.

©2021 Oliver Peters

Blood Red Sky

Halloween is just around the corner. It’s the traditional time of the year for ghosts, ghouls, and vampires. Sounds like a great time to enjoy the Netflix original, Blood Red Sky. This highly-viewed, German-language film is a genre mix that blends horror with heart. It was edited by veteran Berlin-based film editor Knut Hake. Just in time for the latest Final Cut Pro update, Blood Red Sky has also been highlighted by Apple for its challenging feature film workflow.

I recently had a chance to interview Hake about cutting the film, working through the pandemic, and his use of Final Cut Pro on such high-end projects. Read the interview here at FCP.co.

©2021 Oliver Peters

Final Cut Pro vs DaVinci Resolve

Apple’s innovative Final Cut Pro editing software has passed its tenth year, and for many the development pace has become far too slow. As a yardstick, users point to the intensity with which Blackmagic Design has advanced its flagship DaVinci Resolve application. Since acquiring DaVinci, Blackmagic has expanded the editing capabilities and melded in other acquisitions, such as eyeon Fusion and Fairlight audio. They’ve even integrated a second, FCP-like editing model called the Cut page. This has some long-time Final Cut editors threatening to jump ship and switch to Resolve.

Let’s dig a bit deeper into some of the comparisons. While Resolve has a strong presence as a premier color correction tool, its actual adoption as the main editor within the post facility world hasn’t been very strong. On the other hand, if you look outside of the US to Europe and the rest of the world, you’ll find quite a few installations of Final Cut Pro within larger media operations and production companies. Clearly both products have found a home servicing professional workflows.

Editing versus finishing

When all production and post was done with film, the picture editor would make all of the creative editing decisions by cutting workprint and sound using a flatbed or upright editing machine. The edited workprint became the template for the optical house, negative cutter, film timer, and lab to produce the final film prints. There was a clear delineation between creative editing and the finishing stages of filmmaking.

Once post moved to videotape, the film workflow was translated into its offline (creative editing) and online (finishing) video counterparts. Offline editing rooms used low-res formats and were less expensive to equip and operate. Online rooms used high-res formats and often looked like the bridge of a starship. But it could also be the other way around, because the offline and online processes were defined by the outcome and not the technology. Offline = creative decisions. Online = finished masters. Of course, given proper preparation or a big budget, the offline edit stage could be skipped. Everything – creative edit and finishing – was all performed in the same online edit bay.

Early nonlinear editing supplemented videotape offline edit bays for a hybrid workflow. As computer technology advanced and NLE quality and capabilities improved, all post production shifted to workstation-based operations. But the offline/online – editing/finishing – workflows have persisted, in spite of the fact that most computers and editing applications are capable of meeting both needs. Why? It comes down to three things: personality, kit, and skillset.

Kit first. Although your software might do everything well, you may or may not have a capable computer, which is why proxy workflows exist today. Beyond that comes monitoring. Accurate color correction and sound mixing require proper high-quality video and audio monitoring. A properly equipped finishing room should also have the right lighting environment for grading and/or acoustic treatment for sound mixing. None of this is essential for basic editing tasks, even at the highest level. While having a tool like Resolve makes it possible to cover all of the technical aspects of editing and finishing, if you don’t have the proper room, high-quality finishing may still be a challenge.

Each of the finishing tasks requires its own specialized skillset. A topnotch re-recording mixer isn’t going to be a great colorist or an award-winning visual effects compositor. It’s not that they couldn’t be, but for most of us that’s not how the mind works, nor how the opportunities come our way. The more time we spend on a specialized skill – the “10,000 hour” rule – the better we get at it.

Finally, the issue of personality. Many creative editors don’t have a strong technical background and some aren’t all that precise in how they handle the software. As someone who works on both sides, I’ve encountered some of the most awful timelines on projects where I’ve handled the finishing tasks. The cut was great and very creative, but the timeline was a mess.

On the flipside, finishing editors (or online editors before them) tend to be very detail-oriented. They are often very creative in their own right, but they do tend to fit the “left-brained” description. Many prefer finishing tasks over the messy world of clients, directors, and so on. In short, a topnotch creative editor might not be a good finisher and vice versa.

The all-in-one application versus the product ecosystem

Blackmagic Design’s DaVinci Resolve is an all-in-one solution, combining editing, color, visual effects, and sound mixing. As such, it follows in the footsteps of other all-in-ones, like Avid|DS (discontinued) and Autodesk Flame (integrated with Smoke and Lustre). Historically, neither of these nor any other all-in-one has been very successful in the wider editing market. Cost, coupled with complex user interfaces, has kept them in more rarified areas of post.

Apple took the opposite approach with the introduction of Final Cut Pro X. They opted for a simpler, more approachable interface without many of the features editors had grown used to in the previous FCP 7/FCP Studio versions. This stripped-down application was augmented by other Apple and third-party applications, extensions, and plug-ins to fill the void.

If you want the closest equivalent to Resolve’s toolkit in the Final Cut ecosystem, you’ll have to add Motion, Logic Pro, Xsend Motion, X2Pro Audio Convert, XtoCC, and SendToX at a very minimum. If you want to get close to the breadth of Adobe Creative Cloud offerings, also add Compressor, Pixelmator Pro (or Affinity Photo, Publisher, and Designer), and a photo application. Resolve is built upon a world-class color correction engine, but Final Cut Pro does include high-quality grading tools, too. Want more? Then add Color Finale 2, Coremelt Chromatic, FilmConvert Nitrate, or one of several other color correction plug-ins.

Yes, the building block approach does seem messy, but it allows a user to tailor the software toolkit according to their own particular use case. The all-in-one approach might appear better, but that gets to personality and skillset. It’s highly unlikely that the vast majority of Resolve users will fully master its four core capabilities: edit, color, VFX (Fusion), and mixing (Fairlight). A good, full-time editor probably isn’t going to be as good at color correction as a full-time colorist. A great colorist won’t also be a good mixer.

In theory, if you have a team of specialists who have all centralized around Resolve, then the same tool and project files could bounce from edit to VFX, to color, and to the mix, without any need to roundtrip between disparate applications. In reality it’s likely that your go-to mograph/VFX artist/compositor is going to prefer After Effects or maybe Nuke. Your favorite audio post shop probably won’t abandon Pro Tools for Fairlight.

Even for the single editor who does it all, Resolve presents some issues with its predefined left-to-right, tabbed workflow. For example, grading performed in the Color tab can’t be tweaked in the Edit tab. The UI is based on modal tabs instead of fly-out panels within a single workspace.

If you boil it all down, Resolve is the very definition of a finishing application and appeals most to editors of that mindset who have the skills to effectively use the majority of its power. Final Cut Pro is geared to the creative approach with its innovative feature set, like metadata-based organization, skimming, and the magnetic timeline. It’s more approachable for less-experienced editors, hiding the available technical complexity deeper down. However, just like offline and online editing suites, you can flip it around and do creative editing with Resolve and finishing with Final Cut Pro (plus the rest of the ecosystem).

The intangibles of editing

It’s easy to compare applications on paper and say that one product appears better and more feature-rich than another. That doesn’t account for how an application feels when you use it, which is something Apple has spent a lot of time thinking about. Sometimes small features can make all the difference in an editor’s preference. The average diner might opine that chef’s knives are the same, but don’t tell that to a real chef!

Avid Media Composer editors rave about the trim tool. Many Adobe Premiere Pro editors swear by Dynamic Link. Some Apple Final Cut Pro editors get frustrated when they have to return to a track-based, non-magnetic NLE. It’s puzzling to me that some FCP stalwarts are vocal about shifting to Resolve (a traditional track-based NLE) if Apple doesn’t add ‘xyz’ feature. That simply doesn’t make sense to me, unless a) you are equally comfortable in track-based versus trackless architectures, and/or b) you truly have the aptitude to make effective use of an all-in-one application like Resolve. Of course, you can certainly use both side-by-side depending on the task at hand. Cost is no longer an impediment these days. Organize and cut in FCP, and then send an FCPXML of the final sequence to Resolve for the grade, visual effects, and the mix.
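
If you do hand off that way, a quick sanity check of the exported sequence is easy to script. This is only a minimal sketch, not part of any workflow described above: the filename is hypothetical, and the element names reflect recent FCPXML versions, which may differ between releases.

```python
# A minimal sketch: list the clips in an exported FCPXML before sending it to
# Resolve. The filename is hypothetical; "asset-clip" and its attributes follow
# recent FCPXML versions and may vary by release.
import xml.etree.ElementTree as ET

tree = ET.parse("final_cut_sequence.fcpxml")
for clip in tree.iter("asset-clip"):
    print(clip.get("name"), clip.get("offset"), clip.get("duration"))
```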

It’s horses for courses. I recently read that NFL Films edits in Media Composer, grades in DaVinci Resolve, and conforms/finishes projects in Premiere Pro. That might seem perplexing to some, but it makes all the sense in the world to me, because of the different skillsets of the users at those three stages of post. In my day gig, Premiere Pro is also the best choice for our team of editors. Yet, when I have projects that are totally under my control, I’ll often use FCP.

Ultimately there is no single application that is great at each and every element in post production. While the majority of features might fit all of my needs, that may not be true for you or anyone else. The divide between creative editing and finishing is likely to continue – at least at the higher end of production. In that context, Final Cut Pro still makes more sense for a frictionless editing experience, but Resolve is hard to beat for finishing.

There is one final caveat to consider. The post world is changing and much is driven by the independent content creator, as well as the work-from-home transformation. That market segment is cost conscious and subscription business models are less appealing. So Resolve’s entry point at free is attractive. Coupling Resolve with Blackmagic’s low cost, high quality cameras is also a winning strategy for new users. While Resolve can be daunting in its breadth, a new user can start with just the tools needed to complete the project and then learn new aspects of the software over time. As I look down the road, it’s a toss up as to who will be dominant in another ten years.

For another look at this topic, click here.

©2021 Oliver Peters

The Mole Agent

At times you have to remind yourself that you are watching a documentary and not actors in a fictional drama. I’m talking about The Mole Agent, one of the nominees for Best Documentary Feature in this year’s Academy Awards competition. What starts as film noir with a humorous slant evolves into a film essay on aging and loneliness.

Chilean filmmaker Maite Alberdi originally set out to document the work being done by private investigator Romulo Aitkin. The narrative became quite different, thanks to Romulo’s mole, Sergio Chamy. The charming, 83-year-old widower was hired to be the inside man to follow a case at a retirement home. Once on the inside, we see life from Sergio’s perspective.

The Mole Agent is a touching film about humanity, deftly told without the benefit of an all-knowing narrator or on-camera interviews. The thread that binds the film is often Sergio’s phoned reports to Romulo, but the film’s approach is largely cinéma vérité. Building that structure fell to Carolina Siraqyan, a Chile-based editor, whose main experience has been cutting short-form projects and commercials. I recently connected with Carolina over Zoom to discuss the post behind this Oscar contender.

* * *

Please tell me how you got the chance to edit this film.

I met Maite years ago while giving a presentation about editing trailers for documentaries, which is a speciality of mine. She was finishing The Grown-Ups and I’m Not From Here, a short documentary film. I ended up doing the trailers for both and we connected. She shared that she was developing The Mole Agent. I loved the mixture of film noir and observational documentary, so I asked to work on the film and ended up cutting it.

Did her original idea start with the current premise of the film or was the concept broader at that point?

Maite wanted to do a documentary about the workings of a private detective agency, since detectives are often only represented in fiction. She worked with Romulo for a few months and realized that investigations into retirement homes are quite common. She loved the idea for the film and started focusing on that aspect.

Romulo already had a mole that he used inside the homes on these cases, but the mole broke his hip. So Romulo placed a newspaper want ad for someone in his 80s who could work as his new mole on this case. A number of credible older men applied. Out of those applicants, Sergio was hired and turned out to be perfect for the film. He entered the retirement home after some initial training, including how to discreetly communicate with Romulo and how to use the spy cameras.

How was the director able to convince the home and the residents to be in the film?

The film crew had arrived a couple of weeks before Sergio. It was explained that they were doing a film on old age and would be focusing on any new residents in the home. So, the existing residents were already comfortable with the presence of the cameras before he arrived. Maite was very empathetic about where to place cameras so that they wouldn’t bother residents or interfere with what the staff was doing, even if that might not be the best location aesthetically.

Maite is very popular here. She’s written and directed a number of films about social issues and her point-of-view is very humble and very respectful. This is a good retirement home with nothing to hide, so both the staff and the residents were OK with filming. But to be clear, only people who consented appear in the film.

I understand that there were 300 hours of raw footage filmed for this documentary. How did you approach that?

The crew filmed for over three months. It’s actually more than 300 hours of footage, because of the spy cameras. Probably as much as 50 hours more. I couldn’t use a lot of that spy camera material, because Sergio would accidentally press record instead of pressing stop. The camera was in his pocket all the time, so I might have black for 20 minutes. [laugh]

I started on the project in January [2019] after it had been shot and the camera footage had been merged with the sound files. The native footage was shot with Sony cameras in their MXF format. The spy cameras generated H.264 files. To keep everything smooth, I was working with proxy files.

Essentially I started from zero on the edit. It took me two months to categorize the footage. I have an assistant, but I wanted to watch all of the material first. I like to add markers while I’m watching and then add text to those markers as I react to that footage. The first impression is very important for me.

We had a big magnetic blackboard and I placed magnetic cards on the wall for each of the different situations that I had edited. Then Maite came during the middle of March and we worked together like playing Tetris to structure the film. After that we shifted to Amsterdam for two months to work in a very focused way in order to refine the film’s structure. The first cut was completed in November and the final mix and color correction were done in December.

Did you have a particular method to create the structure of this documentary?

I feel that every film is different and you have to think a lot about how you are going to face each movie. In this film I had two certainties, the beginning – Romulo training Sergio – and the ending – what Sergio’s thoughts were. The rest is all emotion. That’s the spine. I have to analyze the emotion to converge to the conflict. First, there’s the humor and then the evolution to the sadness and loneliness. That’s how I approached the material – by the emotion.

I color-coded the magnetic cards for different emotions. For example, pink was for the funny scenes. When Maite was there, the cards provided the big picture showing all the situations. We could look and decide if a certain order worked or not.

What sort of changes to the film came out of the review stage?

This is a very international film with co-producers in the United States, Germany, the Netherlands, Spain, and Chile. We would share cuts with them to get helpful feedback. It let us make the movie more universal, because we had the input of many professionals from different parts of the world. 

When we arrived in Amsterdam the first cut of the film was about three hours long. Originally the first part was 30 minutes long and that was cut down to 10 minutes. When we watched the longer cut, we felt that we were losing interest in the investigation; however, the relationship that Sergio was establishing with the women was wonderful. All the women are in love with him. It starts like film noir, but with humor.  So we focused on the relationships and edited the investigation parts into shorter humorous segments that were interspersed throughout the film.

The reality was incredible and definitely nothing was scripted. But some of the co-producers commented that various scenes in the film didn’t feel real to them. So, we considered those opinions as we were tightening the film.

You edited this film with Adobe Premiere Pro. How do you like using it and why was it the right tool for this film?

I started on film with Moviola and then edited on U-matic, which I hated. I moved to Avid, because it was the first application we had. Then I moved to Final Cut Pro; but after FCP7 died, I switched to Premiere Pro. I love it and am very comfortable with how the timeline works. The program leaves you a lot of freedom as to how and where you put your material. You have control – none of that magnetic stuff that forces you to do something by default.

Premiere Pro was great for this documentary. If a program shuts down unexpectedly, it’s very frustrating, because the creative process stops. I didn’t have any problems even though everything was in one, large project. I did occasionally clean up the project to get rid of stuff I wasn’t using, so it wasn’t too heavy. But Premiere allowed me to work very fluidly, which is crucial.

You completed The Mole Agent at the end of 2019. That’s prior to the “work from home” remote editing reality that most of the world has lived through during this past year. What would be different if you had worked on the film a year later?

The Mole Agent was completed in time for Sundance in January of 2020. Fortunately we were able to work without lockdowns. I’ve worked a lot remotely during this past year and it’s difficult. You get accustomed to it, but there is something missing. You don’t get the same feeling looking through a [web] camera as being together in the room. Something in the creative communication is lost in the technology. If the movie had been edited like this [communicating through Zoom] – and considering the mood during the lockdowns and how that affects your perception of the material – then it really would be a different film.

Any final thoughts about your experience editing this film?

I had previously worked sporadically on films, but have spent most of my career in the advertising industry. A few years ago I decided that I wanted to work full-time on long-form films. Then this project came to me. So I was very open during the process to all of the notes and comments. I understood the process, of course, but because I had worked so much in advertising, I now had to put this new information into practice. I learned a lot!

The Mole Agent is a very touching film. It’s different – very innovative. For people who have seen it, it’s an incredible movie – it touches the conscience and moves them to take action. I feel very glad to have worked on this film.

This article also appears at postPerspective.

©2021 Oliver Peters

W.A.S.P.

A regrettable aspect of history and the march of time is that many interesting stories are buried or forgotten. We learn the bullet points of the past, but not the nuances that bring history alive. It’s a challenge that many documentarians seek to meet. While the WWII era is ripe with heroic tales, one unit was almost forgotten.

Women Airforce Service Pilots (aka WASP)

As WWII ramped up, qualified male pilots were sent to European and Pacific combat, leaving a shortage of stateside pilots. The WASP unit was created as a civilian auxiliary attached to the U.S. Army Air Forces. It was organized and managed by Jackie Cochran, an accomplished female aviator and entrepreneur. More than 25,000 women applied for the WASP, but only 1,830 were accepted into the program.

The WASP members engaged in military-style training at Avenger Field in Sweetwater, Texas. They wore uniforms and were given flight assignments by the military, yet they weren’t actually in the military. Their role was to handle all non-combat military flight tasks within the states, including ferrying aircraft cross-country from factories to deployment bases, serving as test pilots, and handling training tasks like towing targets and flying mock strafing runs over combat trainees. During her service, the typical WASP would fly more types of aircraft than most male military pilots. Sadly, 38 WASP died during training or active duty assignments.

Although WASP members joined with the promise of their unit becoming integrated into the regular military, that never happened. As the war wound down and male pilots returned home needing jobs, the WASP units were disbanded, due in part to Congressional and media resistance. Records were sealed and classified, and the WASP were almost forgotten by history. Finally, in the late 1970s, President Carter signed legislation that recognized WASP members as veterans and authorized veterans benefits. In 2009 President Obama and Congress awarded WASP members the Congressional Gold Medal.

The documentary

Documentary filmmaker Jon Anderson set out over a decade ago to tell a complete story of the WASP in a feature-length film. Anderson, a history buff, had already produced and directed one documentary about the Tuskegee Airmen. So the WASP story was the next logical subject. The task was to interview as many living WASP as possible to tell their story. The goal was not just the historical facts, but also what it was like to be a WASP, along with some of the backstory details about Cochran and the unit’s formation. The result was W.A.S.P. – A Wartime Experiment in WoManpower.

Anderson accumulated a wealth of interviews, but with limited resources. This meant that interviews were recorded mostly on DV cameras in standard definition. However, as an instructor of documentary filmmaking at Valencia College, Anderson also utilized some of the film program’s resources in the production. This included a number of re-enactments – filmed with student crews, talent, and RED cameras. The initial capture and organization of footage was handled by a previous student of his using Final Cut Pro 7.

Technical issues

Jon asked me to join the project as co-editor after the bulk of interviews and re-enactments had been compiled. Several dilemmas faced me at the front end. The project was started in FCP7, which was now a zombie application. Should I move the project to Final Cut Pro X, Premiere Pro, or Media Composer? After a bit of experimentation, the best translation of the work that had already been done was into Premiere Pro. Since we had a mix of SD and HD/4K content, what would be the best path forward – upconvert to HD or stay in standard def? HD seemed to be the best option for distribution possibilities, but that posed additional challenges.

Only portions of tapes were originally captured – not complete tapes. These were also captured with separated audio and video going to different capture folders (a feature of FCP “classic”). Timecode accuracy was questionable, so it would be nearly impossible to conform the current organized clips from the tapes at a higher resolution. But since it was captured as DV from DV tapes, there was no extra quality loss due to interim transcoding into a lower resolution file format.

Ultimately I opted to stick with what was on the drives as my starting point. Jon and I organized sequences and I was able to borrow a Blackmagic Teranex unit. I exported the various sequences between two computers through the Teranex, which handled the SD to HD conversion and de-interlacing of any interlaced footage. This left us with upscaled ProRes interviews that were 4×3 within a 16×9 HD sequence. Nearly all interviews were filmed against a black limbo background, so I then masked around each woman on camera. In addition, each was reframed to the left or right side, depending on where they faced. Now we could place them against another background – either true black, a graphic, or B-roll. Finally, all clips were graded using Lumetri within Premiere Pro. My home base for video post was TinMen – an Orlando creative production company.
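
For anyone curious about the framing math behind that reframing, here is a back-of-the-envelope sketch – an illustration only, not the actual Premiere Pro motion settings used: a 4:3 frame scaled to the full 1080-pixel height fills 1440 of the 1920-pixel width, leaving 480 pixels of room to push each interview toward one side of the frame.

```python
# A rough sketch of the 4:3-in-16:9 framing math (illustrative, not the actual
# Premiere Pro settings): a 4:3 SD frame scaled to 1080 px tall occupies 1440 of
# the 1920 px width, leaving 480 px for pushing an interview left or right.
SEQ_W, SEQ_H = 1920, 1080

clip_w = round(SEQ_H * 4 / 3)     # 1440 px wide once scaled to 1080 px tall
slack = SEQ_W - clip_w            # 480 px of horizontal headroom

def clip_center_x(side):
    """Horizontal centre of the clip when pushed fully left, right, or centred."""
    if side == "left":
        return clip_w // 2            # 720
    if side == "right":
        return SEQ_W - clip_w // 2    # 1200
    return SEQ_W // 2                 # 960

print(slack, clip_center_x("left"), clip_center_x("right"))
```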

Refining the story

With the technical details sorted out, it was time to refine the story. As with many docs, you end up with more possible storylines than will fit. It’s always a whittling process to reveal a story’s essence and to decide which items are best left out so that the rest remains clear. Interviews were bridged with voice-overs plus archival footage, photos, or re-enactments to fill in historical details. This went through numerous rounds of refinement with input from Jon and Rachel Becker Wright, the producer and co-editor on the film. Along the way Rachel was researching, locating, and licensing archival footage for B-roll.

Once the bulk of the main storyline was assembled with proper voice-overs, re-enactments, and some B-roll, I turned the cut over to Rachel. She continued with Jon to refine the edit with graphics, music, and final B-roll. Sound post was handled by the audio production department at Valencia College. A nearly-final version of the 90-minute documentary was presented at a “friends and family” screening at the college.

Emmy®

Many readers know about the national Emmy® Awards handed out annually by the National Academy of Television Arts and Sciences (NATAS). It may be less known that NATAS includes 19 regional chapters, which also award Emmys within their chapters. Awards are handed out for projects presented in that region, usually via local broadcast or streaming. Typically the project wins the award without additional craft categories. Anderson was able to submit a shortened version of the documentary for judging by the Suncoast regional chapter, which includes Florida, Puerto Rico, and parts of Louisiana, Alabama, and Georgia. I’m happy to say that W.A.S.P. – A Wartime Experiment in WoManpower won a 2020 regional Emmy, which included Jon Anderson, Rachel Becker Wright, Joe Stone (production designer), and myself.

Awards are nice, of course, but getting the story out about the courageous ladies of the WASP is far more important and I was happy to play a small part in that.

©2021 Oliver Peters