Suburbicon

George Clooney’s latest film, Suburbicon, originated over a decade ago as a screenplay by Joel and Ethan Coen. Clooney picked it up when the Coens decided not to produce the film themselves. Clooney and writing partner Grant Heslov (The Monuments Men, The Ides of March, Good Night, and Good Luck) rewrote it, setting the story in the 1950s and adding another story element. In the summer of 1957, the Myers, an African-American couple, moved into a largely white suburb in Levittown, Pennsylvania, setting off months of violent protests. The rewritten script interweaves the tale of the black family with that of their next-door neighbors, Gardner (Matt Damon) and Margaret (Julianne Moore). In fact, a documentary was produced about the historical events, and shots from it were used in Suburbicon.

Calibrating the tone

During the production and editing of the film, the overall tone was adjusted as a result of the actual, contemporary events occurring in the country. I spoke with the film’s editor, Stephen Mirrione (The Revenant, Birdman or (The Unexpected Virtue of Ignorance), The Monuments Men), about this. Mirrione explains, “The movie is presented as over-the-top to exaggerate events as satire. In feeling that out, George started to tone down the silliness, based on surrounding events. The production was being filmed during the time of the US election last year, so the mood on the set changed. The real world was more over-the-top than imagined, so the film didn’t feel quite right. George started gravitating towards a more realistic style and we locked into that tone by the time the film moved into post.”

The production took place on the Warner Brothers lot in September 2016 with Mirrione and first assistant editor Patrick Smith cutting in parallel with the production. Mirrione continues, “I was cutting during this production period. George would come in on Saturdays to work with me and ‘recalibrate’ the cut. Naturally some scenes were lost in this process. They were funny scenes, but just didn’t fit the direction any longer. In January we moved to England for the rest of the post. Amal [Clooney, George’s wife] was pregnant at the time, so George and Amal wanted to be close to her family near London. We had done post there before and had a good relationship with vendors for sound post. The final sound mix was in the April/May time frame. We had an editing room set up close to George outside of London, but also others in Twickenham and at Pinewood Studios. This way I could move around to work with George on the cut, wherever he needed to be.”

Traveling light

Mirrione is used to working with a light footprint, so the need for mobility was no burden. He explains, “I’m accustomed to being very mobile. All the media was in the Avid DNxHD36 format on mobile drives. We had an Avid ISIS shared storage system in Twickenham, which was the hub for all of the media. Patrick would make sure all the drives were updated during production, so I was able to work completely with standalone drives. The Avid is a bit faster that way, although there’s a slight trade-off waiting for updated bins to be sent. I was using a ‘trash can’ [2013] Mac Pro plus AJA hardware, but I also used a laptop – mainly for reference – when we were in LA during the final steps of the process.” The intercontinental workflow also extended to color correction. According to Mirrione, “Stefan Sonnenfeld was our digital intermediate colorist and Company 3 [Co3] stored a back-up of all the original media. Through an arrangement with Deluxe, he was able to stream material to England for review, as well as from England to LA to show the DP [Robert Elswit].”

Music was critical to Suburbicon and scoring fell to Alexandre Desplat (The Secret Life of Pets, Florence Foster Jenkins, The Danish Girl). Mirrione explains their scoring process. “It was very important, as we built the temp score in the edit, to understand the tone and suspense of the film. George wanted a classic 1950s-style score. We tapped some Elmer Bernstein, The Grifters, The Good Son, and other music for our initial style and direction. Peter Clarke was brought on as music editor to help round out the emotional beats. Once we finished the cut, Alexandre and George worked together to create a beautiful score. I love watching the scenes with that score, because his music makes the editing seem much more exciting and elegant.”

Suiting the edit tool to your needs

Stephen Mirrione typically uses Avid Media Composer to cut his films and Suburbicon is no exception. Unlike many film editors who rely on unique Avid features, like ScriptSync, Mirrione takes a more straightforward approach. He says, “We were using Media Composer 8. The way George shoots, there’s not a lot of improv or tons of takes. I prefer to just rely on PDFs of the script notes and placing descriptions into the bins. The infrastructure required for ScriptSync, like extra assistants, is not something I need. My usual method of organization is a bin for each day of dailies, organized in shooting order. If the director remembers something, it’s easy to find in a day bin. During the edit, I alternate my bin set-ups between the script view and the frame view.”

With a number of noted editors dabbling with other software, I wondered whether Mirrione has been tempted. He responds, “I view my approach as system-agnostic and have cut on Lightworks and the old Laser Pacific unit, among others. I don’t want to be dependent on one piece of software to define how I do my craft. But I keep coming back to Avid. For me it’s the trim mode. It takes me back to the way I cut film. I looked at Resolve, because it would be great to skip the roundtrip between applications. I had tested it, but felt it would be too steep a learning curve, and that would have impacted George’s experience as the director.”

In wrapping up our conversation, Mirrione concluded with this takeaway from his Suburbicon experience. He explains, “In our first preview screening, it was inspiring to see how seriously the audience took the film and the attachment they had to the characters. The audiences were surprised at how biting and relevant it is to today. The theme of the film is really about what can happen when people don’t speak out against racism and bullying. I’m so proud and lucky to have the opportunity to work with someone like George, who wants to do something meaningful.”

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters

Mindhunter

The investigation of crime is a film topic with which David Fincher is very familiar. He returns to this genre in the new Netflix series, Mindhunter, which is executive produced by Fincher and Charlize Theron. The series is the story of the FBI’s Behavioral Science Unit and how it became an elite profiling team, known for investigating serial criminals. The TV series is based on the nonfiction book Mind Hunter: Inside the FBI’s Elite Serial Crime Unit, co-written by Mark Olshaker and John Douglas, a former agent in the unit who spent 25 years with the FBI. Agent Douglas interviewed scores of serial killers, including Charles Manson, Ted Bundy, and Ed Gein, who dressed himself in his victims’ skin. The lead character in the series, Holden Ford (played by Jonathan Groff), is based on Douglas. The series takes place in 1979 and centers on two FBI agents, who were among the first to interview imprisoned serial killers in order to learn how they think and apply that to other crimes. Mindhunter is about the origins of modern-day criminal profiling.

As with other Fincher projects, he brought in much of the team that’s been with him through the various feature films, like Gone Girl, The Girl with the Dragon Tattoo, and Zodiac. It has also given a number of the team the opportunity to move up in their careers. I recently spoke with Tyler Nelson, one of the four series editors, who was given the opportunity to move from the assistant chair to that of a primary editor. Nelson explains, “I’ve been working with David Fincher for nearly 11 years, starting with The Curious Case of Benjamin Button. I started on that as an apprentice, but was bumped up to an assistant editor midway through. There was actually another series in the works for HBO called Videosyncrasy, which I was going to edit on. But that didn’t make it to air. So I’m glad that everyone had the faith in me to let me edit on this series. I cut the four episodes directed by Andrew Douglas and Asif Kapadia, while Kirk Baxter [editor on Gone Girl, The Girl with the Dragon Tattoo, The Social Network] cut the four shows that David directed.”

Pushing the technology envelope

The Fincher post operation has a long history of trying new and innovative techniques, including their selection of editing tools. The editors cut this series using Adobe Premiere Pro CC. Nelson and the other editors are no strangers to Premiere Pro, since Baxter had cut Gone Girl with it. Nelson says, “Of course, Kirk and I have been using it for years. One of the editors, Byron Smith, came over from House of Cards, which was being cut on [Apple] Final Cut Pro 7. So that was an easy transition for him. We are all fans of Adobe’s approach to the entertainment industry and were onboard with using it. In fact, we were running on beta software, which gave us the ability to offer feedback to Adobe on features that will hopefully make it into released products and benefit all Premiere users.”

Pushing the envelope is also a factor on the production side. The series was shot with custom versions of the RED Weapon camera. Shots were recorded at 6K resolution, but framed for a 5K extraction, leaving a lot of “padding” around the edges. This allowed room for repositioning and stabilization, which are done a lot on Fincher’s projects. In fact, nearly all of the moving footage is stabilized. All camera footage is processed into EXR image sequences in addition to ProRes editing files for “offline” editing. These ProRes files also get an added camera LUT, so everyone sees a good representation of the color correction during the editing process. One change from past projects was to bring color correction in-house. The final grade was handled by Eric Weidt on a FilmLight Baselight X unit, which was sourcing from the EXR files. The final Netflix deliverables are 4K/HDR masters. Pushing a lot of data through a facility requires robust hardware systems. The editors used 2013 (“trash can”) Mac Pros connected to an Open Drives shared storage system. This high-end storage system was initially developed as part of the Gone Girl workflow and uses storage modules populated with all SSD drives.
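Recording at a larger resolution than the delivery extraction is what creates the margin the stabilizer can use. As a rough illustration of that headroom, here is a minimal sketch; the exact frame dimensions used on the show aren’t given here, so the pixel values below are illustrative assumptions, not the production’s actual numbers.

```python
# Sketch: how much room a smaller extraction window leaves inside a
# larger recorded frame for repositioning/stabilization.
# The dimensions are assumptions for illustration only.
REC_W, REC_H = 6144, 3240   # assumed recorded "6K" frame
EXT_W, EXT_H = 5120, 2700   # assumed "5K" extraction framed for delivery

def reposition_headroom(rec, ext):
    """Return (x, y) slack in pixels: how far the extraction window can
    shift in each direction from center before leaving the recording."""
    slack_x = (rec[0] - ext[0]) // 2
    slack_y = (rec[1] - ext[1]) // 2
    return slack_x, slack_y

x, y = reposition_headroom((REC_W, REC_H), (EXT_W, EXT_H))
print(f"Window can travel ±{x} px horizontally, ±{y} px vertically")
# With these assumed sizes: ±512 px horizontally, ±270 px vertically
```

The same arithmetic explains why nearly all moving footage can be stabilized: as long as the corrective translation stays within the slack, no scaling or edge padding is needed.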

The feature film approach

Unlike most TV series, where there’s a definite schedule to deliver a new episode each week, Netflix releases all of their shows at once, which changes the dynamic of how episodes are handled in post. Nelson continues, “We were able to treat this like one long feature film. In essence, each episode is like a reel of a film. There are 10 episodes and each is 45 minutes to an hour long. We worked it as if it was an eight-and-a-half to nine hour long movie.” Skywalker Sound did all the sound post after a cut was locked. Nelson adds, “Most of the time we handed off locked cuts, but sometimes when you hear the cleaned up sound, it can highlight issues with the edit that you didn’t notice before. In some cases, we were able to go back into the edit and make some minor tweaks to make it flow better.”

As Adobe moves more into the world of dialogue-driven entertainment, a number of developers are coming up with speech-to-text solutions that are compatible with Premiere Pro. This potentially provides editors with a function similar to Avid’s ScriptSync. Would something like this have been beneficial on Mindhunter, a series based on extended interviews? Nelson replies, “I like to work with the application the way it is. I try not to get too dependent on any feature that’s very specific or unique to only one piece of software. I don’t even customize my keyboard settings too much, just so it’s easier to move from one workstation to another. I like to work from sequences, so I don’t need a special layout for the bins or anything like that.”

“On Mindhunter we used the same ‘KEM roll’ system as on the films, which is a process that Kirk Baxter and Angus Wall [editor on Zodiac, The Curious Case of Benjamin Button, The Social Network] prefer to work in,” Nelson continues. “All of the coverage for each scene set-up is broken up into ‘story beats’. In a 10-minute take for an interview, there might be 40 ‘beats’. These are all edited in the order of last take to first take, with any ‘starred’ takes at the head of the sequence. This way you will see all of the coverage, takes, and angles for a ‘beat’ before moving on to the group for the next ‘beat’. As you review the sequence, the really good sections of clips are moved up to video track two on the sequence. Then I create a new sequence organized in story order from these selected clips and start building the scene. At any given time you can go back to the earlier sequences if the director asks to see something different than what’s in your scene cut. This method works with any NLE, so you don’t become locked into one and only one software tool.”
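The ordering rule Nelson describes can be sketched as a simple sort, treating each take for one story beat as a (take number, starred) pair. This is only an illustration of the review order, not any real NLE API:

```python
# Sketch of the 'KEM roll' review order for one story beat:
# starred (circled) takes go to the head, then the remaining
# takes run from last shot to first.
def kem_roll_order(takes):
    """takes: list of (take_number, starred) tuples for one beat.
    Returns take numbers in the order they'd be reviewed."""
    starred = [t for t in takes if t[1]]
    rest = [t for t in takes if not t[1]]
    ordered = (sorted(starred, key=lambda t: -t[0]) +
               sorted(rest, key=lambda t: -t[0]))
    return [t[0] for t in ordered]

print(kem_roll_order([(1, False), (2, True), (3, False), (4, False)]))
# Starred take 2 leads, then 4, 3, 1 -> [2, 4, 3, 1]
```

Since the output is just a sequence order, the same scheme can be built by hand in any NLE, which is exactly Nelson’s point about not being locked into one tool.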

“Where Adobe’s approach is very helpful to us is with linked After Effects compositions,” explains Nelson. “We do a lot of invisible split screen effects and shot stabilization. Those clips are all put into After Effects comps using Dynamic Link, so that an assistant can go into After Effects and do the work. When it’s done, the completed comp just pops back into the timeline. Then ‘render and replace’ for smooth playback.”

The challenge

Certainly a series like this can be challenging for any editor, but how did Nelson take to it? He answers, “I found every interview scene to be challenging. You have an eight to 10 minute interview that needs to be interesting and compelling. Sometimes it takes two days to just get through looking at the footage for a scene like that. You start with ‘How am I going to do this?’ Somewhere along the line you get to the point where ‘This is totally working.’ And you don’t always know how you got to that point. It takes a long time approaching the footage in different ways until you can flesh it out. I really hope people enjoy the series. These are dramatizations, but real people actually did these terrible things. Certainly that creeps me out, but I really love this show and I hope people will see the craftsmanship that’s gone into Mindhunter and enjoy the series.”

In closing, Nelson offered these additional thoughts. “I’d gotten an education each and every day. Lots of editors haven’t figured it out until well into a long career. I’ve learned a lot being closer to the creative process. I’ve worked with David Fincher for almost 11 years. You think you are ready to edit, but it’s still a challenge. Many folks don’t get an opportunity like this and I don’t take that lightly. Everything that I’ve learned working with David has given me the tools and I feel fortunate that the producers had the confidence in me to let me cut on this amazing show.”

6 Below

From IMAX to stereo 3D, theaters have invested in various technologies to entice viewers and increase ticket sales. With a tip of the hat to the past, Barco has developed a new ultrawide, 3-screen digital projection system, similar in concept to the Cinerama film theaters of the 1950s. But modern 6K-capable digital cinema cameras make the new approach possible with stunning clarity. There are currently 40 Barco Escape theaters worldwide, and the company is looking for opportunities to run films designed for this format.

Enter Scott Waugh, director (Act of Valor, Need for Speed) and co-founder of the LA production company Bandito Brothers. Waugh, who is always on the lookout for new technologies, was interested in developing the first full-length feature film to take advantage of this 3-screen, 7:1 aspect ratio for its entire running time. But Waugh didn’t want to change how he intended to shoot the film strictly for these theaters, since the film would also be distributed to conventional theaters. This effectively meant that two films needed to come out of the post-production process – one formatted for the Barco Escape format and one for standard 4K theaters.

6 Below (written by Madison Turner) became the right vehicle. This is the true-life survival story of Eric LeMarque (played by Josh Hartnett), an ex-pro hockey player turned snowboarder with an addiction problem, who finds himself lost in the ice and snow of California’s Sierra Nevada for a week. To best tell this story, Waugh and company trekked an hour or more into the mountains above Sundance, Utah for the production.

To handle the post workflow and co-edit the film with Waugh, editor Vashi Nedomansky (That Which I Love Destroys Me, Sharknado 2, An American Carol) joined the team. Nedomansky, another veteran of Bandito Brothers who uses Adobe Premiere Pro as his axe of choice, has also helped set up Adobe-based editorial workflows for Deadpool and Gone Girl. Coincidentally, in earlier years Nedomansky had been a pro hockey player himself, before shifting to a career in film and video. In fact, he played against the real Eric LeMarque on the circuit.

Pushing the boundaries

The Barco Escape format projects three 2K DCPs to cover the total 6K width. To accommodate this, RED 6K cameras were used and post was done with native media at 6K in Adobe Premiere Pro CC. My first question to Nedomansky was this: why stay native? Nedomansky says, “We had always been pushing the boundaries at Bandito Brothers. What can we get away with? It’s always a question of time, storage, money, and working with a small team. We had a small 4-person post team for 6 Below, located near Sundance. So there was interest in not losing time to transcoding.

“After some testing, we settled on decked-out Dell workstations, because these could tackle the 6K RED raw files natively.” Two Dell Precision 7910 towers (20-core, 128GB RAM) with Nvidia Quadro M6000 GPUs were set up for editing, along with a third, less beefy HP quad-core computer for the assistant editor and visual effects. All three were connected to shared storage using a 10GigE network. Mike McCarthy, post production supervisor for 6 Below, set up the system. To keep things stable, they were running Windows 7 and stayed on the same Adobe Creative Cloud version throughout the life of the production. Nedomansky continues, “We kept waiting for the 6K to not play, but it never stopped in the six weeks of time that we were up there. My first assembly was almost three hours long – all in a single timeline – and I was able to play it straight through without any skips or stuttering.”

There were other challenges along the way. Nedomansky explains, “Almost all of the film was done as single-camera and Josh has to carry it with his performance as the sole person on screen for much of the film. He has to go through a range of emotions and you can’t just turn that on and off between takes. So there were lots of long 10-minute takes to convey his deterioration within the hostile environmental conditions. The story is about a man lost in the wild, without much dialogue. The challenge is how to cut down these long takes without taking away from his performance. One solution was to go against the grain – using jump cuts to shorten long takes. But I wanted to look for the emotional changes or a physical act to motivate a jump cut in a way that would make it more organic. In one case, I took a 10-minute take down to 45 seconds.”

When you have a film where weather is a character, you hope that the weather will cooperate. Nedomansky adds, “One of our biggest concerns going in was the weather. Production started in March – a time when there isn’t a lot of snow in Utah. Fortunately for us, a day before we were supposed to start shooting, they had the biggest ‘blizzard’ of the winter for four days. This saved us a lot of VFX time, because we didn’t have to create atmospherics, like snow in front of the lens. It was there naturally.”

Using the Creative Cloud tools to their fullest

6 Below features an extensive percentage of visual effects shots. Nedomansky says, “The film has 1500 shots with 205 of them as VFX shots. John Carr was the assistant editor and visual effects artist on the film and he did all of the work in After Effects and at 6K resolution, which is unusual for films. Some of the shots included ‘day for night’ where John had to add star plates for the sky. This meant rotoscoping behind Josh and the trees to add the plates. He also had to paint out crew footprints in the snow, along with the occasional dolly track or crew member in a shot. There were also some split screens done at 6K right in Premiere Pro.”

The post schedule involved six weeks on-set and then fourteen more weeks back in LA, for a 20-week total. After that came sound post and grading (done at Technicolor). The process to correctly format the film for both Barco and regular theaters almost constituted posting two films. The RED camera image is 6144 x 2592 pixels, Barco Escape 6144 x 864, and a 4K extraction 4096 x 2160. Nedomansky explains, “The Barco frame is thin and wide. It could use the full width, but not height, of the full 6K RED image. So, I had to do a lot of ‘animation’ to reposition the frame within the Barco format. For the 4K version, the framing would be adjusted accordingly. The film has about 1500 shots, but we didn’t use different takes for the two versions. I was able to do this all through reframing.”
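The reframing arithmetic follows directly from the numbers above: both delivery formats are extraction windows inside the 6144 x 2592 source, and the available travel of each window is just the difference in dimensions. A minimal sketch:

```python
# Sketch of the two extraction windows inside the 6144 x 2592 RED frame.
# The travel values show how much room there is to 'animate' each window,
# per Nedomansky's description of reframing rather than using alternate takes.
SRC = (6144, 2592)     # RED camera image
BARCO = (6144, 864)    # Barco Escape band (full width, partial height)
FOUR_K = (4096, 2160)  # 4K theatrical extraction

def vertical_travel(src_h, win_h):
    # Total vertical distance the window can slide within the source.
    return src_h - win_h

def horizontal_travel(src_w, win_w):
    return src_w - win_w

print(vertical_travel(SRC[1], BARCO[1]))     # 1728 px for the Barco band
print(horizontal_travel(SRC[0], BARCO[0]))   # 0 px -- Barco uses full width
print(vertical_travel(SRC[1], FOUR_K[1]))    # 432 px for the 4K frame
print(horizontal_travel(SRC[0], FOUR_K[0]))  # 2048 px for the 4K frame
```

Note the Barco band has no horizontal slack at all, which is why the repositioning work Nedomansky describes is vertical “animation” of the frame.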

In wrapping up our conversation, Nedomansky adds, “I played hockey against Eric and this added an extra layer of responsibility. He’s very much still alive today. Like any film of this type, it’s ‘based on’ the true story, but liberties are taken. I wanted to make sure that Eric would respect the result. Scott and I have done films that were heavy on action, but this film shows another directorial style – more personal and emotional with beautiful visuals. That’s also a departure for me and it’s very important for editors to have that option.”

6 Below was released in cinemas on October 13.

Read Vashi’s own write-up of his post production workflow.

Baby Driver

You don’t have to be a rabid fan of Edgar Wright’s work to know of his films. His comedy trilogy (Shaun of the Dead, Hot Fuzz, The World’s End) and cult classics like Scott Pilgrim vs. the World loom large in pop culture. His films have enjoyed a life well beyond most films’ brief release window and have earned Wright a loyal following. The latest film from Wright is Baby Driver, a musically-fueled action film written and directed by Wright, which just made a big splash at SXSW. It stars Ansel Elgort, Kevin Spacey, Jon Hamm, Jamie Foxx, and Eiza Gonzalez.

At NAB, Avid brought in a number of featured speakers for its main stage presentations, as well as its Avid Connect event. One of these speakers was Paul Machliss (Scott Pilgrim vs. the World, The World’s End, Baby Driver), who spoke to packed audiences about the art of editing these films. I had a chance to go in-depth with Machliss about the complex process of working on Baby Driver.

From Smoke to baptism by fire

We started our conversation with a bit of the backstory of the connection between Wright and Machliss. He says, “I started editing as an online editor and progressed from tape-based systems to being one of the early London-based Smoke editors. My boss at the time passed along a project that he thought would be perfect for Smoke. That was onlining the sitcom Spaced, directed by Edgar Wright. Edgar and I got on well. Concurrent to that, I had started learning Avid. I started doing offline editing jobs for other directors and had a ball. A chance came along to do a David Beckham documentary, so I took the plunge from being a full-time online editor to taking my chances in the freelance world. On the tail end of the documentary, I got a call from Edgar, offering me the gig to be the offline editor for the second season of Spaced, because Chris Dickens (Hot Fuzz, Berberian Sound Studio, Slumdog Millionaire) wasn’t available to complete the edit. And that was really jumping into the deep end. It was fantastic to be able to work with Edgar at that level.”

Machliss continues, “Chris came back to work with Edgar on Shaun of the Dead and Hot Fuzz, so over the following years I honed my skills working on a number of British comedies and dramas. After Slumdog Millionaire came out, which Chris cut and for which he won a number of awards, including an Oscar, Chris suddenly found himself very busy, so the rest of us working with Edgar all moved up one in the queue, so to speak. The opportunity to edit Scott Pilgrim came up, so we all threw ourselves into the world of feature films, which was definitely a baptism by fire. We were very lucky to be able to work on a project of that nature during a time where the industry was in a bit of a slump due to the recession. And it’s fantastic that people still remember it and talk about it seven years on. Which brings us to Baby Driver. It’s great when a studio is willing to invest in a film that isn’t a franchise, a sequel, or a reboot.”

Music drives the film

In Baby Driver, Ansel Elgort plays “Baby”, a young kid who is the getaway driver for a gang. At a young age, he was in a car accident that left him with tinnitus, so he listens to music 24/7 to drown it out. Machliss explains, “His whole life becomes regimented to whatever music he is listening to – different music for different moods or occasions. Somehow everything falls magically into sync with whatever he is listening to – when he’s driving, swerving to avoid a car, making a turn – it all seems to happen on the beat. Music drives every single scene. Edgar deliberately chose commercial top-20 tracks from the 1960s up to today. Each song Baby listens to also slyly comments on whatever is happening at the time in the story. Everything is seemingly choreographed to musical rhythms. You’re not looking at a musical, but everything is musically driven.”

Naturally, building a film to popular music brings up a whole host of production issues. Machliss tells how this film had been in the planning for years, “Edgar had chosen these tracks years ago. I believe it was in 2011 that Edgar and I tried to sequence the tracks and intersperse them with sound effects. A couple of months later, he did a table read in LA and sent me the sound files. In the Avid, I combined the sound files, songs, and some sound effects to create effectively a 100-minute radio play, which was, in fact, the film in audio form. The big thing is that we had to clear every song before we could start filming. Eventually we cleared 30-odd songs for the film. In addition, Edgar worked with his stunt team and editor Evan Schiff in LA to create storyboards and animatics for all of the action scenes.”

Editor on the front lines

Unlike most films, a significant amount of the editing took place on-set with Machliss working from a portable set-up. He says, “Based on our experiences with Scott Pilgrim and World’s End, Edgar decided it would be best to have me on-set during most of the Atlanta shoot for Baby Driver. Even though a cutting room was available, I was in there maybe ten percent of the time. The rest of the time I was on set. I had a trolley with a laptop, monitor, an Avid Mojo, and some hard drives and I would connect myself via ethernet to the video assist’s hard drive. Effectively I was crew in the front lines with everyone else. Making sure the edit worked was as important as getting a good take in the can. If I assured Edgar that a take would work, then he knew it wasn’t going to come back and cause problems for us six months later. We wanted things to work naturally in camera without a lot of fiddling in post. We didn’t want to have to fall back on frame-cutting and vari-speeding if we didn’t have to. There was a lot of prep work in making sure actions correctly coincided with certain lyrics without the action seeming mechanical.”

The nature of the production added to the complexity of the production audio configuration, too. Machliss explains, “Sound-wise, it was very complicated. We had playback going to earwigs in the actors’ ears, Edgar wanted to hear music plus the dialogue in his cans, and then I needed to get a split feed of the audio, since I already had the clean music on my timeline. We shot this mostly on 35mm film. Some days were A-camera only, but usually two cameras running. It was a combination of Panavision, Arricams, and occasionally Arri Alexas. Sometimes there were some stunt shots, which required nine or ten cameras running. Since the action all happened against playback of a track, this allowed me to use Avid’s multicam tools to quickly group shots together. Avid’s AMA tools have really come of age, so I was able to work without needing to ingest anything. I could treat the video assist’s hard drive as my source media, as long as I had the ethernet connection to it. If we were between set-ups, I could get Avid to background-transcode the media, so I’d have my own copy.”

Did all of this on-set editing speed up the rest of the post process? He continues, “All of the on-set editing helped a great deal, because we went into the real post-production phase knowing that all the sequences basically worked. During that time, as I’d fill up a LaCie Rugged drive, I would send that back to the suites. My assistant, Jerry Ramsbottom, would then patiently overcut my edits from the video assist with the actual scanned telecine footage as it came in. We shot from mid-February until mid-May and then returned to England. Jonathan Amos came on board a few weeks into the director’s cut edit and worked on the film with Edgar and myself up until the director’s cut picture lock. He did a pass on some of the action scenes while Edgar and myself concentrated on dialogue and the overall shape of the film. He stayed on board up until the final picture lock and made an incredible contribution to the action and the tension of the film. By the end of the year we’d locked and then we finished the final mix mid-February of this year. But the great thing was to be able to come into the edit and have those sequences ready to go.”

Editing from set is something many editors try to avoid. They feel they can be more objective that way. Machliss sees it a bit differently, “Some editors don’t like being on set, but I like the openness of it – taking it all in. Because when you are in the edit, you can recall the events of the day a particular scene was shot – ‘I can remember when Kevin Spacey did this thing on the third take, which could be useful’. It’s not vital to work like this, but it does lend itself to a kind of shorthand, which is something Edgar and I have developed over these years anyway. The beauty of it is that Edgar and I will take the time to try every option. You can never hit on the perfect cut the first time. Often you’ll get feedback from screenings, such as ‘we’d like to see more emotion between these characters’. You know what’s available and sometimes four extra shots can make all the difference in how a scene reads without having to re-imagine anything. We did drop some scenes from the final version of the film. Of course, you go ‘that’s a shame’, but at least these scenes were given a chance. However, there are always bits where upon the 200th viewing you can decide, ‘well, that’s completely redundant’ – and it’s easy to drop. You always skate as close to the edge of making a film shorter without doing any damage to it.”

The challenge of sound

During sound post, Baby Driver also presented some unique challenges. Machliss says, “For the sound mix – and even for the shoot – we had to make sure we were working with the final masters of the song recordings to make sure the pitch and duration remained constant throughout. Typically these came in as mono or stereo WAVs. Because music is such an important element to the film, the concept of perceived direction becomes important. Is the music emanating from Baby’s earbuds? What happens to it when the camera moves or he turns his head? We had to work out a language for the perception of sound. This was Edgar’s first film mixed in Dolby Atmos and we were the second film in Goldcrest London’s new Atmos-certified dubbing theater. Then we did a reduction to 7.1 and 5.1. Initially we were thinking this film would have no score other than the songs. Invariably you need something to get from A to B. We called on the services of Steven Price (Gravity, Fury, Suicide Squad), who provided us with some original cues and some musical textures. He did a very clever thing where he would match the end pitch or notes of a commercial song and then by the time he came to the end of his cue, it would match the incoming note or key of the next song. And you never notice the change.”

Working with Avid in a new way

To wrap up the conversation, we talked a bit about using Avid Media Composer for his work. Machliss has used numerous other systems, but Media Composer still fits the bill today. He says, “For me, the speed of working with AMA in the latest Avid software was a real benefit. I could actually keep up with the speed of the shoot. You don’t want to be the one holding up a crew of 70. I also made good use of background transcoding. On a different project (Fleabag), I was able to work with native 2K Alexa ProRes camera files at full resolution. It was fantastic to be able to use FrameFlex and apply LUTs – doing the cutting, but then bringing back my old skills as an online editor to paint out booms and fix things up. Once we locked, I could remove the LUTs and export DPX files, which went straight to the grading facility. It was exciting to work in a new way.”

Baby Driver opened at the start of July in the US and is a fun ride. You can certainly enjoy a film like this without knowing the nitty-gritty of the production that goes into it. However, after you’ve read this article, you just might need to see it at least twice – once to simply enjoy it and once again to study the “invisible art” that’s gone into bringing it to the screen.

(For more with Paul Machliss, check out these interviews at Studio Daily, ProVideoCoalition, and FrameIO.)

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters

The Handmaid’s Tale

With tons of broadcast, web, and set-top outlets for dramatic television, there’s a greater opportunity than ever for American audiences to be exposed to excellent productions made outside of Hollywood or New York. Some of the most interesting series come out of Canada from a handful of production vendors. One such company is Take 5 Productions, which has worked on such co-productions as Vikings, American Gothic, Penny Dreadful, and others. One of their newest offerings is The Handmaid’s Tale, currently airing in ten hour-long episodes on Hulu, as well as being distributed internationally through MGM.

The Handmaid’s Tale is based on a dystopian novel written in 1985 by Margaret Atwood. It’s set in New England in the near future, when an authoritarian theocracy has overthrown the United States government and replaced it with the Republic of Gilead. The population has had declining births due to pollution and disease, so a class of fertile women (the handmaids) is kept by the ruling class (the Commanders) as concubines for the purpose of bearing their children. This disturbing tale and series, with its nods to Nazi Germany and life behind the Iron Curtain, not to mention Orwell and Kubrick, stars Elisabeth Moss (Mad Men, The One I Love, Girl, Interrupted) as Offred, one of the handmaids, as she tries to survive her new reality.

The visual style and tone of The Handmaid’s Tale were set by cinematographer-turned-director Reed Morano (Frozen River, Meadowland, The Skeleton Twins). She helmed three of the episodes, including the pilot. As with many television series, a couple of editors traded off the cutting duties. For this series, Julian Clarke (Deadpool, Chappie, Elysium) started the pilot, but it was wrapped up by Wendy Hallam Martin (Queer As Folk, The Tudors, The Borgias). Hallam Martin and Christopher Donaldson (Penny Dreadful, Vikings, The Right Kind of Wrong) alternated episodes in the series, with one episode cut by Aaron Marshall (Vikings, Penny Dreadful, Warrior).

Cutting a dystopian future

I recently spoke with Wendy Hallam Martin about this series and working in the Toronto television scene. She says, “As a Canadian editor, I’ve been lucky to work on some of the bigger shows. I’ve done a lot of Showtime projects, but Queer As Folk was really the first big show for me. With the interest of outlets like Netflix and Hulu, budgets have increased and Canadian TV has had a chance to produce better shows, especially the co-productions. I started on The Handmaid’s Tale with the pilot, which was the first episode. Julian [Clarke] started out cutting the pilot, but had to leave due to his schedule, so I took over. After the pilot was shot (with more scenes to come), the crew took a short break. Reed [Morano] was able to start her director’s cut before she shot episodes two and three to set the tone. The pilot didn’t lock until halfway through the season.”

One might think a mini-series that doesn’t run on a broadcast network would have a more relaxed production and post schedule, akin to a feature film. Not so with The Handmaid’s Tale, which was produced and delivered on a schedule much like that of other dramatic television series. Episodes were shot in blocks of two at a time, with eight days allotted per episode. The editor’s assembly was due five days later, followed by two weeks of working with the director on a director’s cut. Subsequent changes from Hulu and MGM notes resulted in a locked cut three months after the first day of production for those two episodes. Finally, it was three days to color grade and about a month for the sound edit and mix.

Take 5 has its own in-house visual effects department, which handles simple VFX, like wire removals, changing closed eyes to open, and so on. A few of the more complex VFX shots are sent to outside vendors. The episodes average about 40 VFX shots each; however, the season finale had 70 effects shots in one scene alone.

Tackling the workload

Hallam Martin explained how they dealt with the post schedule. She continues, “We had two editors handling the shows, so there was always some overlap. You might be cutting one show while the next one was being assembled. This season we had a first and second assistant editor. The second would deal with the dailies and the first would be handling visual effects hand-offs, building up sound effects, and so on. For the next season we’ll have two firsts and one second assistant, due to the load. Reed was very hands-on and wanted full, finished tracks of audio. There were always 24 tracks of sound on my timelines. I usually handle my own temp sound design, but because of the schedule, I handed that off to my first assistant. I would finish a scene and then turn it over to her while I moved on to the next scene.”

The Handmaid’s Tale has a very distinctive look for its visual style. Much of the footage carries a strong orange-and-teal grade. The series is shot with an ARRI ALEXA Mini in 4K (UHD). The DIT on set applies a basic look to the dailies, which are then turned into Avid DNxHD36 media files by Deluxe in Toronto to be delivered to the editors at Take 5. Final color correction is handled from the 4K originals by Deluxe under the supervision of the series director of photography, Colin Watkinson (Wonder Woman, Entourage, The Fall). A 4K (UHD) high dynamic range master is delivered to Hulu, although currently only standard dynamic range is streamed through the service. Hallam Martin adds, “Reed had created an extensive ‘look book’ for the show. It nailed what [series creator] Bruce Miller was looking for. That, combined with her interview, is why the executive producers hired her. It set the style for the series.”

Another departure from network television is that episodes do not have a specific duration that they must meet. Hallam Martin explains, “Hulu doesn’t dictate exact lengths like 58:30, but they did want the episodes to be under an hour long. Our episodes range from about 50 to 59 minutes. 98% of the scenes make it into an episode, but sometimes you do have to cut for time. I had one episode that was 72 minutes, which we left that long for the director’s cut. For the final version, the producers told me to ‘go to town’ in order to pace it up and get it under an hour. This show had a lot of traveling, so through the usual trimming, but also a lot of jump cuts for the passage of time, I was able to get it down. Ironically the longest show ended up being the shortest.”

Adam Taylor (Before I Fall, Meadowland, Never a Neverland) was the series composer, but during the pilot edit, Morano and Hallam Martin had to set the style. Hallam Martin says, “For the first three episodes, we pulled a lot of sources from other film scores to set the style. Also a lot of Trent Reznor stuff. This gave Adam an idea of what direction to take. Of course, after he scored the initial episodes, we could use those tracks as temp for the next episodes, and as more episodes were completed, the temp library we had to work with grew.”

Post feelings

Story points in The Handmaid’s Tale are often exposed through flashbacks and Moss’s voice-over. Naturally, voice-over pieces affect the timing of both the acting and the edit. I asked Hallam Martin how this was addressed. She says, “The voice-over was recorded after the fact. Lizzie Moss would memorize the VO and act with that in mind. I would have my assistant do a guide track for cutting and when we finally received Lizzie’s, we would just drop it in. These usually took very little adjustment, thanks to her preparation while shooting. She’s a total pro.” The story focuses on many ideas that are tough to accept and watch at times. Hallam Martin comments, “Some of the subject matter is hard and some of the scenes stick with you. It can be emotionally hard to watch and cut, because it feels so real!”

Wendy Hallam Martin uses Avid Media Composer for these shows and I asked her about her editing style. She comments, “I watch all the dailies from top to bottom, but I don’t use ScriptSync. I will arrange my bins in the frame view with a representative thumbnail for each take. This way I can quickly see what my coverage is. I like to go from the gut, based on my reaction to the take. Usually I’ll cut a scene first and then compare it against the script notes and paperwork to make sure I haven’t overlooked anything that was noted on set.” In wrapping up, we talked about films versus TV projects. Hallam Martin says, “I have done some smaller features and movies-of-the-week, but I like the faster pace of TV shows. Of course, if I were asked to cut a film in LA, I’d definitely consider it, but the lifestyle and work here in Toronto is great.”

The Handmaid’s Tale continues with season one on Hulu and a second season has been announced.

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters

A Conversation with Thomas Grove Carter

The NAB Show is a great place to see the next level of media hardware and software. Even better, it’s also a great place to meet old friends, make new ones, and pick up the tips and tricks of your craft through the numerous tutorials, seminars, and off-site events that accompany the show.

This year I had the chance to interview Thomas Grove Carter, an editor at Trim Editing, a London-based creative editorial shop. He appeared at several sessions to present his techniques for maximizing the power of Final Cut Pro X. These sessions were moderated by Apple and FCPWORKS.

Thomas Grove Carter has a number of high-profile projects on his reel, including work for Honda, Game of Thrones, Audi, and numerous music artists. Carter is a familiar name in the Final Cut Pro X editing community. He first came to prominence with Honda’s “The Other Side” long-form web commercial. In it, Carter juxtaposes parallel day and night driving scenarios covering the main actor – dad by day, undercover police officer by night. On the interactive website, you can toggle in-sync between the two versions. Thanks to FCPX’s way of connecting clips and the nature of its magnetic timeline, Carter could use this then-young application to build the commercial, as well as preview the interactivity for the client – all on a very tight deadline.

I had the pleasure of sitting down with Carter in a semi-quiet corner of the NAB Press Room shortly after his Post Production World keynote session on Sunday evening.

____________________________________

[Oliver Peters]: We first started hearing your name when Honda’s “The Other Side” long-form commercial hit the web. That fit ideally with Final Cut Pro X’s unique ability to connect clips above and below the primary storyline on the timeline. Was that something you came up with intuitively?

[Thomas Grove Carter]: I knew that Final Cut Pro X was going to be good for this interactive piece. As you’re playing back in FCPX you can enable and disable layers. This meant I could actually do a rough preview of what it’s going to look like. I knew that I was going to have these two layers of video, but I didn’t exactly know what it was going to be until the edit, so I started to assemble each story separately. Then at some point, once I had each narrative roughly built, I put them both together on the same timeline and started adding the sound. From then on I was able to play it ‘interactively’ right inside FCPX. Back then, I split the day and night audio above and below the primary storyline. Today though, I’d probably assign a role for the day and a role for the night, because you can’t add audio-only clips above the primary storyline anymore. So that’s what I’d do to divide it out. All the audio and video still connects in exactly the same way – it just looks slightly different. Another great advantage of doing this in X was clip connections. For any given shot, there was the day and night version, and then, all the audio for the day and all the audio for the night. Just by grabbing the one clip in the primary and moving or trimming it – everything for day and night, picture and audio – would move together.

[OP]: Tell me a bit about your relationship with Trim Editing.

[TGC]: There are three partners, who are the most senior three editors. Then there are four or five other main editors and two or three junior editors, plus a number of assistants and runners.

It’s been running over 12 years and I joined the team just over 4 years ago.

[OP]: Are all of you using Final Cut Pro X?

[TGC]: Originally, before anyone started using Final Cut Pro X, we had a mix of Avid and Final Cut Pro 7. Then we began to move to Avid as we saw that Final Cut Pro 7 was not going to be improved. So I started to move to Avid, too. But, I was using Final Cut Pro X on my own personal projects. I began to use it on smaller jobs and one of the other editors said, “That’s cool, that thing you’re doing there.” And he started to try it out. Now we’re kind of at a point where most of the editors are on Final Cut Pro X. One is using Avid, so our assistants need to be able to work with both.

[OP]: Have you been able to convert the last hold-out?

[TGC]: He’s always been Avid. That’s what he uses. The company doesn’t dictate what we use to edit with. It’s all about making the best work. If I decided tomorrow that I wanted to cut in Avid or Premiere – it wouldn’t be an issue. Anyone can cut with anything they like.

[OP]: Any thoughts of going to Premiere?

[TGC]: We’ve fallen in love with the way FCPX works – the browser and the timeline. I think Premiere is good, because it feels very much like a continuation of where Final Cut Pro 7 was, which is why loads of people have moved to it. I understand that. It’s an easy move. But it’s the core way that X functions that I love. That stuff just isn’t in any other NLE. What I’ve found with everyone who has moved to it, including myself – there were always a few little hooks that keep people coming back, even if you don’t like the whole app initially. For me, the first thing I liked is how you can pull out the audio clips and things move out of the way automatically. And I always just thought ‘I can’t make this thing work, but that feature is cool’. And then I kept coming back to it and slowly fell in love with the rest of it. One of the other editors loved the way of making dynamic selects in the browser and said, “I’m going to do this job in X.” He’d select in the browser using favorites and rejects and he absolutely loved it. Loved the way it was so fluid with the thumbnails and he felt immersed in his rushes. Then he gets to the timeline. “Oh, I can’t make this work.” He took it back to Final Cut Pro 7 and finished up there. He did that on two or three jobs, because it takes time to get comfortable with the timeline. It’s strange when you come from track-based. But once it clicks, it’s amazing.

[OP]: How do your assistant editors fit into the workflow?

[TGC]: Generally I go from one job to the next. It might be two weeks or a month and a quick turnaround. Occasionally there might be an overlap – like, the next job has already started shooting and I haven’t finished the last one off yet. So it might be that I need an assistant editor to load my stuff. Or maybe I have to move on to the next job and I’ve got an assistant doing final tweaks on the last one. It’s much simpler to load projects in X than it is in Avid and one thing I’ve heard in the industry is, “Oh, does that mean you’re going to fire a lot of assistants, because you don’t need them?” No! Of course, we’re going to employ them, but we’ll actually give them editing work to do whenever we can – not just grunt work. Let them do the cut-downs, versions, first assemblies. There’s more time now for them to be doing creative work.

We also try to promote from within. I was the first person who was hired from outside of the company. Almost all the other editors, apart from the partners, have been people who’ve moved up from within. Yes, we could be paying this assistant to be loading all our stuff and making QuickTimes. But if you can be paying the assistant and they can be doing another job, why wouldn’t you do that? It’s another revenue stream for the company. So it’s great to be able to get them up to a level where they can pick up work and build up their own reels and creative chops.

[OP]: Are you primarily working with proxy media?

[TGC]: Not ‘Final Cut Pro X proxy media’, but we use ProRes Proxy or LT files, which are often transcoded by a DIT on set. They look great, but the post house always goes back to the camera originals for the grade. Sometimes if it’s a smaller job – a low-budget music video, for example – I’ll get the ARRI files if they’re shooting ProRes and just take them into Final Cut straight away – just to get working quicker.

[OP]: Since you work in the area of high-end commercials, do you typically send out audio, color and effects to outside post facilities?

[TGC]: Sound and post work are finished off elsewhere. We work with all the big post facilities – The Mill, Framestore, and MPC, for example. The directors we work with have their favorite colorists. They’re hiring them because they have the right eye, the right creative skills – not just because they can push the buttons. But we’re doing more and more in the offline now. Clients aren’t used to seeing things as ‘offline’ these days. They’re used to things looking slick. I do a lot of sound design, because it goes so hand in hand with the picture edit. Sometimes the picture doesn’t work without the sound, so I do quite a lot of it – get it sounding really great, but it will ultimately be remixed later. I might be working on a project for a month and the sound becomes a very integral creative element. And then the sound mixer only gets a day to pull it all together. They do a great job, but it’s really important to give them as much as we can to work with – to really set the creative direction of the audio.

[OP]: In your presentations, you’ve mentioned Trim’s light hardware footprint. How is the facility configured?

[TGC]: Well, we’ve got ‘cylinder’ Mac Pros, Retina iMacs, and more recently we’ve been trying out a few of the new MacBook Pros, alongside the LG 5K displays. I’ve actually been cutting with that setup a lot recently. I really like it, because I turn up at the suite with my laptop, plug two cables in and that’s it! One cable handles the 5K display, power, and audio. The second goes out to HDMI, running the client monitor (HD/4K TV) and a USB hub. It’s a really slick and flexible setup.

For storage, we’re currently using Samsung T3 SSD drives, which are so fast and light that they can handle most things we throw at them. But with a few potential feature films in the near future, we are looking again at shared storage. I think that’s an interesting area of the market these days. There are some really amazing new products, which don’t come from the same old vendors.

[OP]: How do clients react to this modular suite approach?

[TGC]: If we’re doing our jobs, clients shouldn’t really notice the tech we’re using to drive the edit. And people love the space we’ve created. We’ve got really nice rooms – none of our suites are small. Clients are looking at a 50″ to 60″ TV, which is 4K in some of our suites. And we’ve got really great sound systems. So, in terms of what clients are seeing and hearing, it doesn’t get much better in an edit suite.

Sometimes directors will come by even when they’re not editing with us. They’ll come by and write their treatments and just hang out, which is really nice. There’s a lot of common space with areas to work and meet.

There’s a lot of art all over the place and when anyone sees a sign that has the word ‘trim’ in it – they buy it. It might be a street sign or a ‘trim something’ logo. So, you see these signs all over the building. It adds a really nice character to the place. When I joined the company, I wanted to bring something to it – and I love LEGO – so I built our logo using it. That’s mounted at our entrance now.

[OP]: There’s a certain mentality in working with agencies. How does Trim approach that?

[TGC]: We tend to focus on the directors. That’s where you develop the greatest relationships, which is where the best work comes from. Not that I dislike working with an agency, but you build a much closer creative bond with your directors.

One small way we help build a good working environment for directors and agencies is to all have lunch together, every single day – rather than editing and eating at our desks. One of the great things about this is that directors get to meet other agencies and editors get to meet other directors. It’s really good to be able to socialize like that. It also helps build different relationships than would ever happen if we were all locked away in a suite all day.

[OP]: At what point do you typically get involved with a job?

[TGC]: I’ll usually get pencilled in on a job while the director is still pitching it. And then I’ll start work straight after the shoot. Occasionally we’ll be on set, but only if it’s a really tight deadline. On that Honda job, it was a six-day shoot to make two 2 1/2-minute films, and then they needed to see it really soon after the shoot. So, I had to be on set. But typically I like not being on set, because when you’re on set you’re suddenly part of the, “Oh, this shot was amazing. It took us four hours to get in the pouring rain.” You’re invested in that baggage. Whereas, when you just view it coldly in the edit, you don’t know what happened on set. You can go, “This shot doesn’t work – let’s lose it.” That fresh vision is a great reason for the editor to be as far from a shoot as possible.

[OP]: One of the projects on your reel is a Game of Thrones promo. How did that job come your way?

[TGC]: That was actually a director I hadn’t worked with before – just a director who wanted to work with me. He’d been trying to get me on a few jobs that I hadn’t been able to do. It wasn’t a trailer made of footage from the show – HBO brought in an outside commercials and music director to shoot the piece, and he wanted to work with me. So, it came down like that and then I worked with him and HBO to bring it all together.

[OP]: Do you have any preferences for the types of projects you work on?

[TGC]: Things like the Audi commercial are really fun, because there’s a lot of sound design. A lot of commercials are heavily storyboarded, but it can often be more satisfying if the director has been a bit more loose in the filming. It might be a montage of different people doing activities, for example. And those can be quite fun, because the final thing – you’ve come up with it and you’ve created the narrative and the flow of it. I say that with hindsight, because they turn out to be the most creatively satisfying. But, the process can be much harder when you’re in the thick of it – because it’s on your shoulders and you haven’t got a really locked storyboard to fall back on. I’ll happily do really long hours and work really hard, if it’s a good bit of work – and, at the end of the day, I’ve worked with nice people.

[OP]: With Final Cut Pro X – anything that you’d like to see different?

[TGC]: Maybe collaboration is one thing that would be interesting to see if there’s a new and interesting take on it. Avid bin-locking is great, but actually when you boil it down, it’s quite a simple thing. It locks this bin, you can’t go in there. You can make a copy of it. That’s all it’s doing, but it’s simple and it works really well. All the cloud-based things I’ve seen so far – they’ve not really gotten me excited. I don’t feel like anyone has really nailed what that is yet. Everyone is just doing it because they can, not because it works really well, or is actually useful. I’d be interested to see if there’s something that can be done there.

In the timeline, I’d like to be able to look inside compound clips without stepping into them. I often use compound clips to combine sound effects or music stems. I’d like to be able to open them in context in the timeline and edit the contents inline with the master timeline. And I’d love some kind of dupe detection in the timeline. But otherwise, I’m really enjoying the new version.

Click this link to watch Thomas Grove Carter in action with FCPX at this year’s Las Vegas SuperMeet at NAB.

____________________________________

I certainly appreciated the time Thomas Grove Carter spent with me on this interview. Along with a few other interviews, it made for a better-than-average Vegas trip. As a side note, I recorded my interviews (for transcription only) on my iPad, with the aid of the Apogee MetaRecorder app. It works with iPhones and iPads and starts free; however, you should spend the $4.99 on the in-app upgrade to be able to do anything useful with it. It can use the built-in mic and records full-quality audio WAV files – and it features a connection to FCPX via FCPXML. Finally, to aid in generating a text transcript, I used Digital Heaven’s SpeedScriber. Although still in beta, it worked well for what I needed. As with all audio-to-text transcription applications, there’s no such thing as perfect. I did need to do a fair amount of clean-up; however, that’s not uncommon.

©2017 Oliver Peters

Five Came Back

We know them today as the iconic Hollywood directors who brought us such classic films as Mr. Smith Goes To Washington, It’s a Wonderful Life, The African Queen, and The Man Who Shot Liberty Valance – just to name a few. John Ford, William Wyler, John Huston, Frank Capra and George Stevens also served their country on the ground in World War II, bringing its horrors and truth to the American people through film. In Netflix’s new three-part documentary series, based on Mark Harris’ best-selling book, Five Came Back: A Story of Hollywood and the Second World War, contemporary filmmakers explore the extraordinary story of how Hollywood changed World War II – and how World War II changed Hollywood, through the interwoven experiences of these five legendary filmmakers.

This documentary series features interviews with Steven Spielberg, Francis Ford Coppola, Guillermo del Toro, Paul Greengrass and Lawrence Kasdan, who add their own perspectives on these efforts. “Film was an intoxicant from the early days of the silent movies,” says Spielberg in the opening moments of Five Came Back. “And early on, Hollywood realized that it had a tremendous tool or weapon for change, through cinema.” Adds Coppola, “Cinema in its purest form could be put in the service of propaganda. Hitler and his minister of propaganda Joseph Goebbels understood the power of the cinema to move large populations toward your way of thinking.”

Five Came Back is directed by Laurent Bouzereau, written by Mark Harris and narrated by Meryl Streep. Bouzereau and his team gathered over 100 hours of archival and newsreel footage; watched over 40 documentaries and training films directed and produced by the five directors during the war; and studied 50 studio films and over 30 hours of outtakes and raw footage from their war films to bring this story to Netflix audiences. Says Bouzereau, “These filmmakers, at that time, had a responsibility in that what they were putting into the world would be taken as truth. You can see a lot of echoes in what is happening today. It became clear as we were doing this series that the past was re-emerging in some ways, including the line we see that separates cinema that exists for entertainment and cinema that carries a message. And politics is more than ever a part of entertainment. I find it courageous of filmmakers then, as with artists today, to speak up for those who don’t have a platform.”

An editor’s medium

As every filmmaker knows, documentaries are truly an editor’s medium. Key to telling this story was Will Znidaric, the series editor. Znidaric spent the first sixteen years of his career as a commercial editor in New York City before heading to Los Angeles to become more involved in narrative projects and hone his craft. That move led to a chance to cut the documentary Winter on Fire: Ukraine’s Fight for Freedom. Production and post for that film was handled by LA’s Rock Paper Scissors Entertainment, a division of the Rock Paper Scissors post facility. RPS is co-owned by Oscar-winning editor Angus Wall (The Social Network, The Girl with the Dragon Tattoo). Wall, along with Jason Sterman and Linda Carlson, was an executive producer on Winter on Fire for RPS. The connection was a positive experience, so when RPS got involved with Five Came Back, Wall tapped Znidaric as its editor. Much of the same post team worked on both of these documentaries.

I recently interviewed Will Znidaric about his experience editing Five Came Back. “I enjoyed working with Angus,” he explains. “We edited and finished at Rock Paper Scissors over a fifteen-month period. They are structured to encourage creativity, which was great for me as a documentary editor. Narratively, this story has five main characters who are on five individual journeys. The canvas is civilization’s greatest conflict. You have to be clear about the war in order to explain their context. You have to be able to find the connections to weave a tapestry between all of these elements. This came together thanks to the flow and trust that was there with Laurent [Bouzereau, director]. The unsung hero is Adele Sparks, our archival producer, who had to find the footage and clear the rights. We were generally able to secure rights to the great majority of the footage on our wish list.”

Editing is paleontology

Znidaric continues, “In a documentary like this, editing is a lot like paleontology – you have to find the old bones and reconstruct something that’s alive. There was a lot of searching through newsreels of the day, which was interesting thematically. We all look at the past through the lens of history, but how was the average American processing the events of that world during that time? Of course, those events were unfolding in real time for them. It really makes you think about today’s films and how world events have an impact on them. We had about 100 hours of archival footage, plus studio films and interviews. For eight to nine months we had our storyboard wall with note cards for each of the films. As more footage came in, you could chart the growth through the cards.”

Five Came Back was constructed using three organizing principles: 1) the directors’ films before the war, 2) their documentaries during the war, and 3) their films after the war. According to Znidaric, “We wanted to see how the war affected their later work. The book was our guide for causality and order, so I was able to build the structure of the documentary before the contemporary directors were interviewed, working from an initial interview with the author, Mark Harris. This way we were able to script an outline to follow. Interview footage of our actual subjects from a few decades ago was also a key element used to tell the story. In recording the modern directors, we wanted to give them space – they are masters – we just needed to make sure we got certain story beats. Their point of view is unique in the sense that they are providing their perspective on their heroes. At the beginning, we have one modern director talking about one of our subject directors. Then that opens up over the three hours, as each talks a little bit about all of these filmmakers.”

From Moviola to Premiere Pro

This was the first film that Znidaric had edited using Adobe Premiere Pro. He says, “During film school, I got to cut 16mm on the Moviola, but throughout my time in New York, I worked on [Avid] Media Composer and then later [Apple] Final Cut Pro 7. When Final Cut Pro X came out, I just couldn’t wrap my head around it, so it was time to shift over to Premiere Pro. I’m completely sold on it. It was a dream to work with on this project. At Rock Paper Scissors, my associate editor James Long and I were set up in two suites. We had duplicate drives of media – not a SAN – simply because of how the suites were wired. It worked out well for us, but forced us to be extremely diligent about how our media was organized and to maintain that organization throughout.” The suites were configured with 6-core 2013 Mac Pros, AJA Io XT boxes and Mackie Big Knob mixers for playback.

“All of the media was first transcoded to ProRes, which I believe is one of the reasons that the systems were rock solid during that whole time. There’s an exemplary engineering department at RPS, and they have a direct line to Adobe, so if there were any issues, they became the go-betweens. That way I could stay focused on the creative and not get bogged down with technical issues. Plus, James [Long] would generally handle issues of a technical nature. All told, it was very minimal. The project ran quite smoothly.” To stay on the safe side, the team did not update their versions of Premiere Pro during this time frame, opting to stick with Premiere Pro CC2015 for the duration. Because of the high percentage of archival footage, Five Came Back was finished in HD rather than in 4K, as a number of other Netflix shows are.
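The transcode-everything-to-a-mezzanine-codec approach Znidaric describes is a common way to keep an NLE stable with mixed archival sources. As a hedged illustration only – the article does not say which tool RPS used for the conversions – here is a minimal Python sketch that builds an ffmpeg command for a ProRes 422 transcode. The file names are hypothetical.

```python
def prores_transcode_cmd(src: str, dst: str) -> list[str]:
    """Build an ffmpeg command that converts a source clip to ProRes 422,
    a common edit-friendly mezzanine codec. This is an illustrative sketch,
    not the actual workflow used on Five Came Back."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "prores_ks",   # ffmpeg's ProRes encoder
        "-profile:v", "2",     # profile 2 = ProRes 422 (standard)
        "-c:a", "pcm_s16le",   # uncompressed 16-bit audio, NLE-friendly
        dst,
    ]

# hypothetical archival clip being normalized for the edit
cmd = prores_transcode_cmd("newsreel_raw.mov", "newsreel_prores.mov")
```

Batch-converting everything up front this way means the editing systems only ever see one uniform, I-frame-only codec, which is likely why the suites stayed “rock solid.”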

To handle Premiere Pro projects over the course of fifteen months, Znidaric and Long would transfer copies of the project files between the rooms on a daily basis. Znidaric continues, “There were sequences for individual ‘mini-stories’ inside the film. I would build these and then combine the stories. As the post progressed, we would delete some of the older sequences from the project files in order to keep them lean. Essentially we had a separate Premiere Pro project file for each day; therefore, at any time we could go back to an earlier project file to access an older sequence, if needed. We didn’t do much with the other Creative Cloud tools, since we had Elastic handling the graphics work. I would slug in raw stills or placeholder cards for maps and title cards. That way, again, I could stay focused on weaving the complex narrative tapestry.”
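The one-project-file-per-day discipline Znidaric describes amounts to a simple, manual version-control scheme: trimmed sequences are never truly lost, because any day’s project can be reopened. A minimal Python sketch of the idea – purely illustrative, with hypothetical file names, not a tool the team actually used – might look like this:

```python
import shutil
import tempfile
from datetime import date
from pathlib import Path

def archive_daily_project(project: Path, archive_dir: Path, day: date) -> Path:
    """Copy the working project file into a date-stamped archive copy,
    so sequences deleted later can still be retrieved from an older version."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    dst = archive_dir / f"{project.stem}_{day.isoformat()}{project.suffix}"
    shutil.copy2(project, dst)  # copy2 preserves timestamps as well
    return dst

# demo: create a throwaway "project" file and archive it for one day
work = Path(tempfile.mkdtemp())
proj = work / "FiveCameBack.prproj"
proj.write_text("placeholder project data")
copy = archive_daily_project(proj, work / "archive", date(2016, 9, 12))
```

The trade-off is disk space versus safety: each day’s file is small because old sequences are pruned from the working copy, while the dated archives retain the full history.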

Elastic developed the main title and a stylistic look for the series, while a52 handled color correction and finishing. Elastic and a52 are part of the Rock Paper Scissors group. Znidaric explains, “We had a lot of discussions about how to handle photos, stills, flyers, maps, dates and documents. The reality of filming under the stress of wartime and combat creates artifacts like scratches, film burn-outs and so on. These became part of our visual language. The objective was to create new graphics that would be true to the look and style of the archival footage.” The audio mix went out-of-house to Monkeyland, a Los Angeles audio post and mixing shop.

Five Came Back appealed to the film student side of the editor. Znidaric wrapped up our conversation with these thoughts. “The thrill is that you are learning as you go through the details. It’s mind-blowing and the series could easily have been ten hours long. We are trying to replicate a sense of discovery without the hindsight of today’s perspective. This was fun because it was like graduate-level film school. Most folks have seen some of the better-known films, but many of these films aren’t as recognized these days. Going through them is a form of ‘cinematic forensics’. You find connections tied to the wartime experience that might not otherwise be as obvious. This is great for a film geek like me. Hopefully many viewers will rediscover some of these films by seeing this documentary series.”

The first episode of Five Came Back aired on Netflix on March 31. In conjunction with the launch of Five Came Back, Netflix will also present thirteen documentaries discussed in the series, including Ford’s The Battle of Midway, Wyler’s The Memphis Belle: A Story of a Flying Fortress, Huston’s Report from the Aleutians, Capra’s The Battle of Russia, Stevens’ Nazi Concentration Camps, and Stuart Heisler’s The Negro Soldier.

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters