Sicario


Sicario is an emotional and suspenseful look into the dark side of the war on drugs, directed by Canadian filmmaker Denis Villeneuve (Enemy, Prisoners, Incendies). It teams by-the-book FBI agent Kate (Emily Blunt) with an interagency task force led by CIA agent Matt (Josh Brolin). The shadowy mix of characters includes Alejandro (Benicio Del Toro) – an enigmatic contractor working with Matt. As the special operation proceeds by increasingly extra-legal means, we learn that there’s more to Alejandro than meets the eye – part former crusading prosecutor and part hitman. As the story grows ever more tense, Kate and the audience are forced to question whether the ends justify the means.

From Wagner to Hollywood

The key to driving such a thriller is often the editor, in this case Joe Walker (12 Years a Slave, Hunger, Harry Brown). I had a chance to discuss Sicario with Walker as he took a break from cutting the next Villeneuve film, Story of Your Life. Walker’s road to Hollywood is different from that of many other top-level feature film editors. While editors often play musical instruments as a hobby, Walker actually studied to be a classical composer in his native England.

Walker explains, “It’s always been a hard choice between films and writing music. I remember when I was ten years old, I’d run 8mm films of the Keystone Cops at slow speed with Richard Wagner playing against it and kind of get depressed! So, these were twin interests of mine. I studied classical composing and balanced two careers of editing and composing up until the mid-2000s. I used my music degree to get a job with the BBC, where I moved into assistant editor roles. The BBC is very cautious and it took me eleven years before I was finally allowed to cut drama as an editor. This was all on 16mm film and then I moved into digital editing, first with Lightworks and later Avid. I always wanted to work on bigger films, but I felt there was a glass ceiling in England. Big studio films that came in would always bring their own editors. The big break for me was 12 Years a Slave, which provided the opportunity to move to Los Angeles.”

Controlling the story, characters and rhythm

Sicario has a definite rhythm designed to build suspense. There are scenes that are slow but tense and others that are action-packed. Walker explains his philosophy on setting the pace: “Since working with Steve McQueen (director, 12 Years a Slave) I’ve been known for holding shots a long time to build tension. This is contrary to the usual approach, which says you build tension with an increasingly faster cutting pace. Sometimes if you hold a shot, there’s even more tension if the story supports it. I’ll even use the trick of invisible split screens in order to hold a take longer than the way it was originally shot. For example, the left side of one take might hold long enough, but something breaks on the right. I’ll pull the right side from a different take in order to extend the end of the complete shot.”
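Walker’s invisible split-screen trick can be pictured as a simple per-frame composite. The following is a hypothetical sketch, not Avid’s actual implementation: frames are modeled as 2D lists of pixel values, and a real version would also feather the seam and possibly move it over time so the join stays invisible.

```python
# Hypothetical sketch of an "invisible split screen" composite.
# The left side of each output frame comes from take A (which holds
# long enough) and the right side from take B (a different take of
# the same setup). A real tool would soften the seam so it can't be seen.

def split_screen(frame_a, frame_b, seam=None):
    """Join the left portion of frame_a to the right portion of frame_b."""
    width = len(frame_a[0])
    seam = width // 2 if seam is None else seam
    return [row_a[:seam] + row_b[seam:]
            for row_a, row_b in zip(frame_a, frame_b)]

# Two tiny 2 x 4 "frames": take A is all 1s, take B is all 2s.
take_a = [[1, 1, 1, 1], [1, 1, 1, 1]]
take_b = [[2, 2, 2, 2], [2, 2, 2, 2]]
combined = split_screen(take_a, take_b)
# combined -> [[1, 1, 2, 2], [1, 1, 2, 2]]
```

Applied per frame over the length of the extended shot, this lets one half of the image keep playing from the preferred take while the other half is quietly swapped out.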

Another interesting aspect of Sicario is the sparseness of the musical score, in favor of sound design. Walker comments, “Music is in an abusive relationship with film. Putting on my composer hat, I don’t want to tell the audience what to think only by the music. It’s part of the composite. I try to cut without a temp score, because you have to know when it’s only the music that drives the emotion. I’ll even turn the sound down and cut it as if it was a silent movie, so that I can feel the rhythm visually. Then sound effects add another layer and finally music. In Sicario, I made use of a lot of walkie-talkie dialogue to fill in spaces – using it almost like a sound effect. Jóhann Jóhannsson (composer, Prisoners, The Theory of Everything, Foxcatcher) was thrilled to get a clean output without someone else’s preconceived temp score, because it allowed him to start with a clean palette.”

Editing shapes the characters. Walker says, “Taylor Sheridan’s script was fantastic, so I don’t want to do a disservice to him, but there was a continual process of paring down the dialogue and simplifying the story, which continued long into the edit. Benicio Del Toro’s character says very little and that helps keep him very mysterious. One of the biggest cuts we made in the edit was to eliminate the original opening scene, shot on the coast at Veracruz. In it, Alejandro (Del Toro) is interrogating a cop by holding his head underwater. He goes too far and kills him. So he drags the lifeless body to the shore, only to resuscitate him and begin the interrogation again. A strong and brutal scene, but one that told too much about Alejandro at the outset, rather than letting us – and Kate (Emily Blunt) – figure him out piece by piece. We needed to tell the story through Kate’s eyes. The film now starts with the hostage rescue raid, which better anchors the film on Kate. And it’s not short of its own brutality. At the end of the scene we smash cut from a mutilated hand on the ground to Kate washing the blood out of her hair in the shower. This very violent beginning lets the audience know that anything could happen in this film.”

A carefully considered production

Sicario was produced for an estimated $31 million. While not exactly low budget, it was certainly modest for a film of this ambition. The majority of the film was shot in New Mexico over a 49-day period, starting in July of 2014. Final post was completed in March of this year. Roger Deakins (Unbroken, Prisoners, Skyfall), the film’s director of photography, relied on his digital camera of choice these days, the ARRI Alexa XT recording to ARRIRAW. The editorial team cut with transcoded Avid DNxHD media using two Avid Media Composer systems.

Joe Walker continues, “This was a very carefully considered shoot. They spent a lot of effort working out shots to avoid overshooting. Most of the set-ups were in the final cut. They were also lucky with the weather. I cut the initial assembly in LA while they were shooting in New Mexico. The fine cut was done in Montreal with Denis for ten weeks and then back to LA for the final post. The edit really came together easily because of all the prep. Roger has to be one of our generation’s greatest cinematographers. Not only are his shots fantastic, but he has a mastery of sequence building, which is matched by Denis.”

“Ninety percent of the time the editorial team consisted of just my long-time first assistant Javier [Marcheselli] and me. The main focus of the edit was to streamline the storytelling and to be as muscular and rhythmic with the cutting as possible. We spent a lot of time focused on the delicate balance between how much we see the story through our central character’s eyes and how much we should let the story progress by itself.  One of the constructs that came out of the edit was to beef up the idea of surveillance by taking helicopter aerials of the desert and creating drone footage from it.  Javier is great with temp visual effects and I’m good with sound, so we’d split up duties that way.”

“I’m happy that this was largely a single-camera production. Only a few shots were two-camera shots. Single-camera has the advantage that the editor can better review the footage. With multi-cam you might get four hours of dailies, which takes about seven hours to review. When are you left with time to cut? This makes it hard to build a relationship with the dailies. With a single-camera film, you have more time to really investigate the coverage. I like to mind-read what the direction was by charting the different nuances between takes.”

It shouldn’t matter what the knives are

Walker is a long-time Media Composer user. We wrapped up with a discussion about the tools of the trade. Walker says, “This was a small film compared to some, so we used two Avid workstations connected to Avid’s ISIS shared storage while in LA. It’s rock solid. In Montreal, there was a different brand of shared storage, which wasn’t nearly as solid as ISIS. On Michael Mann’s Blackhat, we sometimes had sixteen Avids connected to ISIS, so that’s pretty hard to beat. I really haven’t used other NLEs, like Final Cut, but Premiere is tempting. If anything, going back to Lightworks is even more intriguing to me. I really loved how intuitive the ‘paddles’ (the Lightworks flatbed-style Controller) were. But edit systems are like knives. You shouldn’t care what knives the chef used if the meal tastes good. Given the right story, I’d be happy to cut it on wet string.”

The editing application isn’t Walker’s only go-to tool. He continues, “I wish Avid would include more improvements on the audio side of Media Composer. I often go to outside applications. One of my favorites is [UISoftware’s] MetaSynth, which lets me extend music. For instance, if a chord is held for one second, I can use MetaSynth to extend that hold for as much as ten or twenty seconds. This makes it easy to tailor music under a scene and it sounds completely natural. I also used it on Sicario to elongate some great screaming sounds in the scene where Alejandro is having a nightmare on the plane – they are nicely embedded into the sounds of the jet engines – we wanted the message to be subliminal.”
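MetaSynth does this kind of stretching in the spectral domain, but the basic idea of extending a hold can be sketched crudely in the time domain by looping a sustain region until the desired length is reached. This is an illustrative approximation only – the function and its parameters are invented for the example, and a naive loop like this would produce audible seams that spectral tools avoid.

```python
def extend_hold(samples, loop_start, loop_end, target_len):
    """Naively extend a held sound by repeating its sustain region.

    samples: list of audio sample values
    loop_start/loop_end: the region of the hold to repeat
    target_len: desired total length in samples
    """
    out = list(samples[:loop_end])          # attack plus first pass of the hold
    sustain = samples[loop_start:loop_end]  # region to repeat
    while len(out) < target_len:
        out.extend(sustain)
    return out[:target_len]

# A short hold (here just 4 samples) stretched to 10 samples.
chord = [0.9, 0.5, 0.5, 0.5]
stretched = extend_hold(chord, loop_start=1, loop_end=4, target_len=10)
# stretched -> [0.9, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5]
```

The appeal of a dedicated tool is that it performs the equivalent operation on the sound’s frequency content, so a one-second chord can be held for ten or twenty seconds without the repetition being audible.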

Joe Walker is a fan of visual organization. He explains, “When I’m working with dailies, I usually don’t pre-edit select sequences for a scene unless it’s a humongous amount of coverage. Instead, I prefer to visually arrange the ‘tiles’ (thumbnail frames in the bin) in a way that makes it easier to tuck in. But I am a big fan of the scene wall. I write out 3” x 5” note cards for each scene with a short description of the essence of that scene on it. This is a great way to quickly see what that scene is all about and remind you of a character’s journey up to that point. When it comes time to re-order scenes, it’s often better to do that by shifting the cards on the wall first. If you try to do it in the software, you get bogged down in the logistics of making those edit changes. I’ll put the cards for deleted scenes off to the side, so a quick glance reminds me of what we’ve removed. It’s just something that works for me. Denis has just spent the best part of a year turning words into pictures, so he laughs at my wall and my reliance on it!”

(It’s also worth checking out Steve Hullfish’s excellent interview with Walker at his Art of the Cut column.)

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2016 Oliver Peters


Steve Jobs

It’s challenging to condense the life of a complex individual into a two-hour-long film. So it’s no wonder that the filmmakers of Steve Jobs have earned both praise and criticism for their portrayal of the Apple co-founder. The real Steve Jobs generated differing emotions from those who knew him or those who viewed his life from the outside. To tackle that dilemma, screenwriter Aaron Sorkin (Moneyball, The Social Network, Charlie Wilson’s War) and director Danny Boyle (127 Hours, Slumdog Millionaire, 28 Days Later) set out to create a “painting instead of a photograph”.

Steve Jobs, with Michael Fassbender in the central role, uses a classic Shakespearean three-act structure, focusing on three key product launches. Act 1 depicts the unveiling of the first Macintosh computer (1984); Act 2 is the introduction of the NeXT computer (1988); Act 3 is the reveal of the original iMac (1998). These three acts cover the narrative arc of Jobs’ rise, humiliation/revenge, and his ultimate return to prominence at Apple. All of the action takes place backstage at these launch events, but is intercut with flashbacks. The emotional thread that ties the three acts together is Jobs’ relationship with his daughter, Lisa Brennan-Jobs.

An action film of words

Aaron Sorkin’s scripts are known for their rapid-fire dialogue and Steve Jobs is no exception. With a script running close to 190 pages, the task of whittling it down to a two-hour movie fell to editor Elliot Graham (Milk, 21, Superman Returns). I recently spoke with Graham about how he connected with this project and some of the challenges the team faced. He explains, “I’ve been a fan of Danny’s and his regular editor wasn’t available to cut this film. So I reached out and met with them and I joined the team.”

“When I read the script, I characterized it as an ‘action film of words.’ Early on we talked about the dialogue and the need to get to two hours. I’ve never talked about the film’s final length with a director at the start of the project, but we knew the information would come fast and we didn’t want the audience to feel pummeled. We needed to create a tide of energy from beginning to end that takes the viewer through this dialogue as these characters travel from room to room. It’s our responsibility to keep each entrance into a different room or hallway revelatory in some fashion – so that the viewer stays with the ideas and the language. Thank goodness we had sound recordist Lisa Pinero on hand – she really helped the cast stay true to the musicality of the writing. The script is full of intentional overlaps, and Danny didn’t want to stop them from happening. Lisa captured it so that I could edit it. We knew we wanted very little ADR in this film, so we let the actors play out the scene. That was pivotal in capturing Aaron’s language.”

“Each act is a little different, both in production design and in the format. [Director of photography] Alwin Küchler (Divergent, R.I.P.D., Hanna) filmed Act 1 on 16mm, Act 2 on 35mm, and Act 3 digitally with the ARRI Alexa. We also added visuals in the form of flashbacks and other intercutting to make it more cinematic. Danny would keep rolling past the normal end of a take and would get some great emotions from the actors that I could use elsewhere. Also when the audience arrives to take their seats at these launch events, Danny would record that, which gave us additional material to work with. In one scene with Jobs and Joanna Hoffman (Kate Winslet), Danny kept rolling on Kate after Michael left the room. In that moment we got an exquisite emotional performance from her that was never in the script. In another example, he got this great abstract close-up of Michael that we were able to use to intercut with the boardroom scene later. This really puts the audience into Steve’s head and is a pay-off for the revenge concept.”

Building structure

Elliot Graham likes to make his initial cut tight and have a first presentation that’s reasonably finished. His first cut was approximately 147 minutes long compared with a final length of 117 minutes plus credits. He continues, “In the case of this film, cutting tight was beneficial, because we needed to know whether or not the pace would work. The good news is that this leaves you more time to experiment, because less time is spent in cutting it down for time. We needed to make sure the viewer would stay engaged, because the film is really three separate stories. To avoid the ‘stage play’ feeling and move from one act into the next, we added some interstitial visual elements to move between acts. In our experimenting and trimming, we opted to cut out part of the start of Act 2 and Act 3 and join the walking-talking dialogue ‘in progress.’ This becomes a bit of a montage, but it serves the purpose of quickly bringing the viewer along even though they might have to mentally fill in some of the gaps. That way it didn’t feel like Act 2 and Act 3 were the start of new films and kept the single narrative intact.”

“At the start, the only way to really ascertain the success of our efforts was to see Act 1, as close to screen-ready as we could come. So I put together an assemblage and Danny, the producers, and I viewed it. Not only did we want to see how it all worked together before moving on, we wanted to see that we had achieved the tone and quality we were after, because each act needed to feel completely different. And since Danny was shooting each piece a bit differently, I was cutting each one differently. For example, there’s a lot of energy, almost frenetic, to the camera movements in Act 1, plus it was shot on 16mm, so it gives it this cinema verité feel and harkens back to a less technically-savvy time. Act 2 has a more classical technique to it, so the cutting becomes a little slower in pacing. By getting a sense of what was working and maybe what wasn’t, it helped define how we were going to shoot the subsequent two acts and ensure we were creating an evolution for the character and the story. We would not have been able to do this if we had shot this film out of chronological order, the way most features are.”

It’s common for a film’s scene structure to be re-arranged during the edit, but that’s harder to do with a film like Steve Jobs. There’s walking-talking dialogue that moves from one room to the next, which means the written script forces a certain linear progression. It’s a bit like the challenge faced in Birdman or (The Unexpected Virtue of Ignorance), except without the need to present the story as a continuous, single take. Graham says, “We did drop some scenes, but it was tricky, because you have to bridge the gap without people noticing. One of the scenes that was altered a lot from how it was written was the fight between John Sculley (Jeff Daniels) and Steve Jobs (Michael Fassbender). This scene runs about eleven minutes and Danny and I felt it lost momentum. So we spent about 48 hours recutting the scene. Instead of following the script literally, we followed the change in emotion of the actors’ performances. This led to a better emotional climax, which made the scene work.”

From San Francisco to London

Steve Jobs was shot in San Francisco from January to April of this year and then post shifted to London from April until October. The editorial team worked with two Avid Media Composers connected to Avid ISIS shared storage. The film elements were scanned and then all media transcoded to Avid DNxHD for the editing team. Graham explains, “From the standpoint of the edit, it didn’t matter whether it was shot on film or digitally – the different formats didn’t change our workflow. But it was still exciting to have part of this on film, because that’s so rare these days. Danny likes a very collaborative process, so Aaron and the producers were all involved in reviewing the cuts and providing their creative input. As a director, Danny is very involved with the edit. He’d go home and review all the dailies again on DVD just to make sure we weren’t missing anything. This wasn’t an effects-heavy film like a superhero film, yet there were still several hundred visual effects. These were mostly clean-ups, like make-up fixes and boom removals, but also composites, like wall projections.”

Various film editors have differing attitudes about how much sound they include in their cut. For Elliot Graham it’s an essential part of the process. He says, “I love working with sound and temp music, because it changes your perception and affects how you approach the cut. For Steve Jobs, music was a huge part of the process from the beginning. Unlike other films, we received a lot of pieces of music from Daniel Pemberton (composer, The Man from U.N.C.L.E., Cuban Fury, The Counselor) right at the start. He had composed a number of options based on his reading of the script. We tried different test pieces even before the shoot. Once some selections were made, Daniel gave us stems so that I could really tailor the music to the scene. This helped to define the flashbacks musically. The process was much more collaborative between the director and composer than on other films and it was a really unique way to work.”

Getting the emotion right

Elliot Graham joined the project after Michael Fassbender was signed to play Steve Jobs. Graham comments, “I’ve always thought Michael was a brilliant actor and I’d much rather have that to work with than someone who just looks like Jobs. Steve Wozniak (who is played by actor Seth Rogen in the film) watched the film several times and he commented that although the actual events were slightly different, the feeling behind what’s in the film was right. He’s said that to him, it was like seeing the real Steve. So Michael was in some way capturing the essence of this guy. I’m biased, of course, but Danny’s aim was to get the emotional approach right and I think he succeeded.”

“I’m a big Apple fan, so the whole process felt a bit strange – like I was in some sort of wonderful Charlie Kaufman wormhole. Here I was working on a Mac and using an iPhone to communicate while cutting a film about the first Mac and the person who so impacted the world through these innovations. I felt that by working on this film, I could understand Jobs just a little bit better. You get a sense of Jobs through his coming into contact with all of these people and his playing out whatever conflicts that existed. I think it’s more of a ‘why’ and ‘who’ story – rather than a point for point biography – why this person, whose impact on our lives is immeasurable, was the way he was. It’s my feeling that we were trying to look at his soul much more than track his life story.”

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2016 Oliver Peters

Fear the Walking Dead


When the AMC cable network decided to amp up the zombie genre with The Walking Dead series, it resulted in a huge hit. Building upon that success, they’ve created a new series that could be viewed as a companion story, albeit without any overlapping characters. Fear the Walking Dead is a six-episode series that starts season one on August 23. The story takes place across the country in Los Angeles and chronologically just before the outbreak in the original series. The Walking Dead was based on Robert Kirkman’s graphic novels of the same name and he has been involved in both versions as executive producer.

Unlike the original series, which was shot on 16mm film, Fear the Walking Dead is being shot digitally with ARRI ALEXA cameras and anamorphic lenses. That’s in an effort to separate the two visual styles, while maintaining a cinematic quality to the new series. I recently spoke with Tad Dennis, the editor of two of the six episodes in season one, about the production.

Tad Dennis started his editing career as an assistant editor on reality TV shows. He says, “I started in reality TV and then got the bump-up to full-time editing (Extreme Makeover: Home Edition, America’s Next Top Model, The Voice). However, I realized my passion was elsewhere and made the shift to scripted television. I started there again as an assistant and then was bumped back up to editing (Fairly Legal, Manhattan, Parenthood). Both types of shows really do have a different workflow, so when I shifted to scripted TV, it was good to start back as an assistant. That let me be very grounded in the process.”

Creating a new show with a shared concept

Dennis started with these thoughts on the new show, “We think of this series as more of a companion show to the other and not necessarily a spin-off or prequel. The producers went with different cameras and lenses for a singular visual aesthetic, which affects the style. In trying to make it more ‘cinematic’, I tend to linger on wider shots and make more selective use of tight facial close-ups. However, the material really has to dictate the cut.”

Three editors and three assistant editors work on the Fear the Walking Dead series, with each editor/assistant team cutting two of the six shows of season one. They are all working on Avid Media Composer systems connected to an Avid ISIS shared storage solution. Scenes were shot in both Vancouver and in Los Angeles, but the editing teams were based in Los Angeles. ALEXA camera media was sent to Encore Vancouver and Encore Hollywood, depending on the shooting location. Encore staff synced sound and provided the editors with Avid DNxHD editorial media. The final color correction, conform, and finishing were also handled at Encore Hollywood.

Dennis described how post on this show differed from other network shows he’s worked on in the past. He says, “With this series, everything was shot and locked for the whole season by the first airdate. On other series, the first few shows will be locked, but then for the rest of the season, it’s a regular schedule of locking a new show each week until the end of the season. This first season was shot in two chunks for all six episodes – the Vancouver settings and then the Los Angeles scenes. We posted everything for the Vancouver scenes and left holes for the LA parts. The shows went all the way through director cuts, producer cuts, and network notes with these missing sections. Then when the LA portions came in, those scenes were edited and incorporated. This process was driven by the schedule. Although we didn’t have the pressure of a weekly airdate, the schedule was definitely tight.” Each of the editors had approximately three to four days to complete their cut of an episode after receiving the last footage. Then the directors got another four days for a director’s cut.

Often films and television shows go through adjustments as they move from script to actual production and ultimately the edit. Dennis feels this is more true of the first few shows in a new series than of an established series. He explains, “With a new series, you are still trying to establish the style. Often you’ll rethink things in the edit. As I went through the scenes, performances that were coming across as too ‘light’ had to be given more ‘weight’. In our story, the world is falling apart and we wanted every character to feel that all the way throughout the show. If a performance didn’t convey a sense of that, then I’d make changes in the takes used or mix takes, where picture might be better on one and audio better on the other.”

Structure and polish in post

In spite of the tight schedule, the editors still had to deal with a wealth of footage. Typical of most hour-long dramas, Fear the Walking Dead is shot with two or three cameras. For very specific moments, the director would have some of the footage shot at 48fps. In those cases, where cameras ran at different speeds, Dennis would treat these as separate clips. When cameras ran at the same speed (for example, at 24fps for sync sound), such as in dialogue scenes, Susan Vinci (assistant editor) would group the clips as multicam clips. He explains, “The director really determines the quality of the coverage. I’d often get really necessary options on both cameras that weren’t duplicated otherwise. So for these shows, it helped. Typically this meant three to four hours of raw footage each day. My routine is to first review the multicam clips in a split view. This gives me a sense of the coverage I have for the scene. Then I’ll go back and review each take separately to judge performance.”

Dennis feels that sound is critical to his creative editing process. He continues, “Sound is very important to the world of Fear the Walking Dead. Certain characters have a soundscape that’s always associated with them and these decisions are all driven by editorial. The producers want to hear a rough cut that’s as close to airable as possible, so I spend a lot of time with sound design. Given the tight schedule on this show, I would hand off a lot of this to my long-time assistant, Susan. The sound design that we do in the edit becomes a template for our sound designer. He takes that, plus our spotting notes, and replaces, improves, and enhances the work we’ve done. The show’s music composer also supplied us with a temp library of past music he’d composed for other productions. We were able to use these as part of our template. Of course, he would provide the final score customized to the episode. This score would be based on our template, the feelings of the director, and of course the composer’s own input for what best suited each show.”

Dennis is an unabashed Avid Media Composer proponent. He says, “Over the past few years, the manufacturers have pushed to consolidate many tools from different applications. Avid has added a number of Pro Tools features into Media Composer and that’s been really good for editors. There are many tools I rely on, such as those audio tools. I use the AudioSuite and RTAS filters in all of my editing. I like dialogue to sound as it would in a live environment, so I’ll use the reverb filters. In some cases, I’ll pitch-shift audio a bit lower. Other tools I’ll use include speed-ramping and invisible split-screens, but the trim tool is what defines the system for me. When I’m refining a cut, the trim tool is like playing a precise instrument, not just using a piece of software.”

Dennis offered these parting suggestions for young editors starting out. “If you want to work in film and television editing, learn Media Composer inside and out. The dominant tool might be Final Cut or Premiere Pro in some markets, but here in Hollywood, it’s largely Avid. Spend as much time as possible learning the system, because it’s the most in-demand tool for our craft.”

Originally written for Digital Video magazine / CreativePlanetNetwork

©2015 Oliver Peters

Avid Media Composer Goes 4K


Avid Technology entered 2015 with a bang. The company closed out 2014 with the release of its Media Composer version 8.3 software, the first to enable higher resolution editing, including 2K, UHD and 4K projects. On January 16th of this year, Avid celebrated its relisting on the NASDAQ exchange by ringing the opening bell. Finally – as in most years – the Academy Awards nominee field is dominated by films that used Media Composer and/or Pro Tools during the post-production process.

In a software landscape quickly shifting to rental (subscription) business models, Avid now offers one of the most flexible pricing models. Media Composer | Software may be purchased, rented, or managed through floating licenses. If you purchase a perpetual license (you own the software), then an annually-renewed support contract gives you phone support and continued software updates. Opt out of the contract and you’ll still own the software you bought – you just lose any updates to newer software.

You can purchase other optional add-ons, like Symphony for advanced color correction. Unfortunately there’s still no resolution to the impasse between Avid and Nexidia. If you purchased ScriptSync or PhraseFind in the past, which rely on IP from Nexidia, then you can’t upgrade to version 8 or higher software and use those options. On the other hand, if you own an older version, such as Media Composer 7, and need to edit a project that requires a higher version, you can simply pick up a software subscription for a few months. This would let you run the latest software version for as long as it takes to complete that project.

The jump from Media Composer | Software 8.2 to 8.3 might seem minor, but in fact this was a huge update for Avid editors. It ushered in new, high-resolution project settings and capabilities, but also added a resolution-independent Avid codec – DNxHR. More than just adding the ability to edit in 4K, Media Composer now addresses most of the different 4K options that cover the TV and cinema variations, as well as new color spaces and frame rates. Need to edit 4K DCI Flat (3996×2160) at 48fps in DCI-P3 color space? Version 8.3 makes it possible. Although Avid introduced high-resolution editing in its flagship software much later than its competitors, it comes to the table with a well-designed upgrade that attempts to address the nuances of modern post.

Another new feature is LUT support. Media Composer has allowed users to add LUTs to source media for a while now, but 8.3 adds a new LUT filter. Apply this to a top video track on your timeline and you can then add a user-supplied film-emulation look (or any other type) to all of your footage. There’s a new Proxy setting designed for work with high-resolution media. For example, switch your project settings to 1/4 or 1/16 resolution for better performance while editing with large files. Switch Proxy off and you are ready to render and output at full quality. As Media Composer becomes more capable of functioning as a finishing system, it has gained DPX image sequence file export via the Avid Image Sequencer, as well as export to Apple ProRes 4444 (Mac only).

This new high-resolution architecture requires that the software shed its remaining 32-bit components in order to stay compatible with modern versions of the Mac and Windows operating systems. Avid’s Title Tool still exists for legacy SD and HD projects, but higher resolutions will use NewBlue Titler Pro, which is included with Media Composer. It can, of course, also be used for all other titling.

There are plenty of new, but smaller features for the editor, such as a “quick filter” in the bin. Use it to filter the bin view so that only items matching your text entry are displayed. The Avid “helper” applications of EDL Manager and FilmScribe have now been integrated inside Media Composer as the List Tool. This may be used to generate EDLs, Cut Lists and Change Lists.

Avid is also a maker of video I/O hardware – Mojo DX and Nitris DX. While these will work to monitor higher-resolution projects as downscaled HD, they won’t be updated to display native 4K output, for instance. Avid has qualified AJA and Blackmagic Design hardware for use as 4K I/O and is currently also qualifying Bluefish444. If you work with a 4K computer display connected to your workstation, then the Full Screen mode enables 4K preview monitoring.

Avid Media Composer | Software version 8.3 is just the beginning of Avid’s entry into the high-resolution post-production niche. Throughout 2015, updates will further refine and enhance these new capabilities and expand high-resolution to other Avid products and solutions. Initial user feedback is that 8.3 is reasonably stable and performs well, which is good news for the high-end film and television world that continues to rely on Avid for post-production tools and solutions.

(Full disclosure: I have participated in the Avid Customer Association and chaired the Video Subcommittee of the Products and Solutions Council. This council provides user feedback to Avid product management to aid in future product development.)

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2015 Oliver Peters

The Black Panthers: Vanguard of the Revolution

Documentaries covering events that happened within a generation usually divide the audience between those who personally lived through the time period and those who’ve only read about it in history books. The Black Panthers: Vanguard of the Revolution is one such film. If you are over 50, you are aware of the media coverage of the Black Panther Party and certainly have opinions and possibly misconceptions about who they were. If you are under 50, then you may have learned about them in history class, in which case you may only know them by myth and legend. Filmmaker Stanley Nelson (The American Experience, Freedom Summer, Wounded Knee, Jonestown: The Life and Death of Peoples Temple) seeks to go beyond what you think you know with this new Sundance Film Festival documentary entry.

I spoke with the film’s editor, Aljernon Tunsil, as he was putting the finishing touches on the film to get it ready for its Sundance presentation. Tunsil has worked his way up from assistant editor to editor and discussed the evolution in roles. “I started in a production company office, initially helping the assistant editor,” he says. “Over a period of seven or eight years, I worked my way up from assistant to a full-time editor. Along the way, I’ve had a number of mentors and learned to cut on both [Apple] Final Cut Pro and [Avid] Media Composer. These mentors were instrumental in my learning how to tell a story. I worked on a short with Stanley [Nelson] and that started our relationship of working together on films. I view my role as the ‘first audience’ for the film. The producer or director knows the story they want to make, but the editor helps to make sense of it for someone who doesn’t intimately know the material. My key job is to make sure that the narrative makes sense and that no one gets lost.”

The Black Panthers is told through a series of interviews (about 40 total subjects). Although a few notables, like Kathleen Cleaver, are featured, the chronicle of the rise and fall of the Panthers is largely told by lesser known party members, as well as FBI informants and police officers active in the events. The total post-production period took about 40 to 50 weeks. Tunsil explains, “Firelight Films (the production company) is very good at researching characters and finding old subjects for the interviews. They supplied me with a couple of hundred hours of footage. That’s a challenge to organize so that you know what you have. My process is to first watch all of that with the filmmakers and then to assemble the best of the interviews and best of the archival footage. Typically it takes six to ten weeks to get there and then another four to six weeks to get to a rough cut.”

Tunsil continues, “The typical working arrangement with Stanley is that he’ll take a day to review any changes I’ve made and then give me notes for any adjustments. As we were putting the film together, Stanley was still recording more interviews to fill in the gaps – trying to tie the story together without the need for a narrator. After that, it’s the usual process of streamlining the film. We could have made a ten-hour film, but, of course, not all of the stories would fit into the final two-hour version.”

Like many documentary film editors, Tunsil prefers having interview transcripts, but acknowledged they don’t tell the whole story. He says, “One example is in the interview with former Panther member Wayne Pharr. He describes the police raid on the LA headquarters of the party and the ensuing shootout. When asked how he felt, he talks about his feeling of freedom, even though the event surrounding him was horrific. That feeling clearly comes across in the emotion on his face, which transcends the mere words in the transcript. You get to hear the story from the heart – not just the facts. Stories are what make a documentary like this.”

As with many films about the 1960s and 1970s, The Black Panthers weaves into its fabric the music of the era. Tunsil says, “About 60% of the film was composed by Tom Phillips, but we also had about seven or eight period songs, like ‘Express Yourself’, which we used under [former Panther member] Bobby Seale’s run for mayor of Oakland. I used other pieces from Tom’s library as temp music, which we then gave to him for the feel. He’d compose something similar – or different, but in a better direction.”

Tunsil is a fervent Avid Media Composer editor, which he used for The Black Panthers. He explains, “I worked with Rebecca Sherwood as my associate editor and we were both using Media Composer version 7. We used a Facilis Terrablock for shared storage, but this was primarily used to transfer media between us, as we both had our own external drives with a mirrored set of media files. All the media was at the DNxHD 175 resolution. I like Avid’s special features such as PhraseFind, but overall, I feel that Media Composer is just better at letting me organize material than is Final Cut. I love Avid as an editing system, because it’s the most stable and makes the work easy. Editing is best when there’s a rhythm to the workflow and Media Composer is good for that. As for the stills, I did temporary moves with the Avid pan-and-zoom plug-in, but did the final moves in [Adobe] After Effects.”

For a documentary editor, part of the experience is what you personally learn. Tunsil reflects, “I like the way Stanley and Firelight handle these stories. They don’t just tell it from the standpoint of the giants of history, but more from the point-of-view of the rank-and-file people. He’s trying to show the full dimension of the Panthers instead of the myth and iconography. It’s telling the history of the real people, which humanizes them. That’s a more down-to-earth, honest experience. For instance, I never knew that they had a communal living arrangement. By having the average members tell their stories, it makes it so much richer. Another example is the Fred Hampton story. He was the leader of the Chicago chapter of the party who was killed in a police shootout; but, there was no evidence of gunfire from inside the building that he was in. That’s a powerful scene, which resonates. One part of the film that I think is particularly well done is the explanation of how the party declined due to a split between Eldridge Cleaver and Huey Newton. This was in part as a result of an internal misinformation campaign instigated by the FBI within the Panthers.”

Throughout the process, the filmmakers ran a number of test screenings with diverse audiences, including industry professionals and non-professionals, people who knew the history and people who didn’t. Results from these screenings enabled Nelson and Tunsil to refine the film. To complete the film’s finishing, Firelight used New York editorial facility Framerunner. Tunsil continues, “Framerunner is doing the online using an Avid Symphony. To get ready, we simply consolidated the media to a single drive and then brought it there. They are handling all color correction, improving moves on stills and up-converting the standard definition archival footage.”

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2015 Oliver Peters

24p HD Restoration

There’s a lot of good film content that only lives on 4×3 SD 29.97 interlaced videotape masters. Certainly in many cases you can go back and retransfer the film to give it new life, but for many small filmmakers, the associated costs put that out of reach. In general, I’m referring to projects with $0 budgets. Is there a way to get an acceptable HD product from an old Digibeta master without breaking the bank? A recent project of mine would say, yes.

How we got here

I had a rather storied history with this film. It was originally shot on 35mm negative, framed for 1.85:1, with the intent to end up with a cut negative and release prints for theatrical distribution. It was being posted around 2001 at a facility where I worked and I was involved with some of the post production, although not the original edit. At the time, synced dailies were transferred to Beta-SP with burn-in data on the top and bottom of the frame for offline editing purposes. As was common practice back then, the 24fps film negative was transferred to the interlaced video standard of 29.97fps with added 2:3 pulldown – a process that duplicates additional fields from the film frames, such that 24 film frames evenly add up to 60 video fields in the NTSC world. This is loaded into an Avid, where – depending on the system – the redundant fields are removed, or the list that goes to the negative cutter compensates for the adjustments back to a frame-accurate 24fps film cut.
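The field arithmetic behind that cadence is easy to sketch. Purely as an illustration (this is my own toy model, not any telecine's actual code), this Python snippet shows how alternating 2 and 3 fields per film frame turns every 4 film frames into 5 interlaced video frames – and why two of those five end up as split-field frames mixing two different film frames:

```python
def pulldown_23(film_frames):
    """Expand film frames into interlaced video frames using a 2:3 cadence.

    Each film frame contributes 2 or 3 fields alternately; consecutive
    fields are then paired into video frames. 4 film frames -> 10 fields
    -> 5 video frames, so 24 film frames -> 60 fields (NTSC).
    """
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    # pair consecutive fields into (top, bottom) video frames
    return [(fields[i], fields[i + 1]) for i in range(0, len(fields) - 1, 2)]

print(pulldown_23(["A", "B", "C", "D"]))
# -> [('A', 'A'), ('B', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'D')]
```

The third and fourth video frames ("B/C" and "C/D") are the split-field frames that reverse telecine has to detect and discard.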

For the purpose of festival screenings, the project file was loaded into our Avid Symphony and I conformed the film at uncompressed SD resolution from the Beta-SP dailies and handled color correction. I applied a mask to hide the burn-in and ended up with a letter-boxed sequence, which was then output to Digibeta for previews and sales pitches to potential distributors. The negative went off to the negative cutter, but for a variety of reasons, that cut was never fully completed. In the two years before a distribution deal was secured, additional minor video changes were made throughout the film to end up with a revised cut, which no longer matched the negative cut.

Ultimately the distribution deal that was struck was only for international video release and nothing theatrical, which meant that rather than finishing/revising the negative cut, the most cost-effective process was to deliver a clean video master. Except, that all video source material had burn-in and the distributor required a full-height 4×3 master. Therefore, letter-boxing was out. To meet the delivery requirements, the filmmaker would have to go back to the original negative and retransfer it in a 4×3 SD format and master that to Digital Betacam. Since the negative was only partially cut and additional shots were added or changed, I went through a process of supervising the color-corrected transfer of all required 35mm film footage. Then I rebuilt the new edit timeline largely by eye-matching the new, clean footage to the old sequence. Once done and synced with the mix, a Digibeta master was created and off it went for distribution.

What goes around comes around

After a few years in distribution, the filmmaker retrieved his master and rights to the film, with the hope of breathing a little life into it through self-distribution – DVDs, Blu-rays, Internet, etc. With the masters back in-hand, it was now a question of how best to create a new product. One thought was simply to letter-box the film (to be in the director’s desired aspect) and call it a day. Of course, that still wouldn’t be in HD, which is where I stepped back in to create a restored master that would work for HD distribution.

Obviously, if there was any budget to retransfer the film negative to HD and repeat the same conforming operation that I’d done a few years ago – except now in HD – that would have been preferable. Naturally, if you have some budget, that path will give you better results, so shop around. Unfortunately, while desktop tools for editors and color correction have become dirt-cheap in the intervening years, film-to-tape transfer and film scanning services have not – and these retain a high price tag. So if I was to create a new HD master, it had to be from the existing 4×3 NTSC interlaced Digibeta master as the starting point.

In my experience, if you are going to blow up SD to HD frame sizes, it’s best to start with a progressive, not interlaced, source. That’s even more true when working with software rather than hardware up-converters, like a Teranex. Step one was to reconstruct a correct 23.98p SD master from the 29.97i source. To do this, I captured the Digibeta master as a ProResHQ file.

Avid Media Composer to the rescue

When you talk about software tools that are commonly available to most producers, there are a number of applications that can correctly apply a “reverse telecine” process. There are, of course, hardware solutions from Snell and Teranex (Blackmagic Design) that do an excellent job, but I’m focusing on a DIY solution in this post. That involves deconstructing the 2:3 pulldown (also called “3:2 pulldown”) cadence of whole and split-field frames back into only whole frames, without any interlaced tearing. After Effects and Cinema Tools offer this feature, but they really only work well when the entire source clip has a consistent and unbroken cadence. This film had been completed in NTSC 29.97 TV-land, so the cadence would frequently change at cuts. In addition, some digital noise reduction had been applied to the final master after the Avid output to tape, which further altered the cadence at some cuts. Therefore, to reconstruct the proper cadence, changes had to be made every few cuts and, in some scenes, at every shot change. This meant slicing the master file at every required point and applying a different setting to each clip. The only software I know of that can do this effectively is Avid Media Composer.

Start in Media Composer by creating a 29.97 NTSC 4×3 project for the original source. Import the film file there. Next, create a second 23.98 NTSC 4×3 project. Open the bin from the 29.97 project into the 23.98 project and edit the 29.97 film clip to a new 23.98 sequence. Media Composer will apply a default motion adapter to the clip (which is the entire film) in order to reconcile the 29.97 interlaced frame rate into a 23.98 progressive timeline.

Now comes the hard part. Open the Motion Effect Editor window and “promote” the effect to gain access to the advanced controls. Set the Type to “Both Fields”, Source to “Film with 2:3 Pulldown” and Output to “Progressive”. Although you can hit “Detect” and let Media Composer try to decide the right cadence, it will likely guess incorrectly on a complex file like this. Instead, under the 2:3 Pulldown tab, toggle through the cadence options until you only see whole frames when you step through the shot frame-by-frame. Move forward to the next shot(s) until you see the cadence change and you see split-field frames again. Split the video track (place an “add edit”) at that cut and step through the cadence choices again to find the right combination. Rinse and repeat for the whole film.
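For the curious, the split-field frames you hunt for in this step can also be spotted numerically: a frame whose two fields came from different film frames shows comb artifacts, meaning each scan line disagrees with its immediate neighbors far more than normal vertical detail would explain. The NumPy sketch below is only a simplified illustration of that idea – the 1.5 threshold is an arbitrary assumption of mine, not what Media Composer's Detect function actually uses:

```python
import numpy as np

def is_split_field(frame, threshold=1.5):
    """Heuristic comb-artifact detector for a grayscale frame (rows x cols).

    A split-field frame interleaves lines from two different film frames,
    so each line sits far from the average of its two neighbors."""
    f = frame.astype(np.float64)
    # inter-field detail: distance of each line from its neighbors' average
    comb = np.abs(f[1:-1] - (f[:-2] + f[2:]) / 2).mean()
    # intra-field detail: ordinary vertical change within one field
    intra = np.abs(f[2:] - f[:-2]).mean() / 2
    return comb > intra * threshold

# Synthetic example: a smooth gradient vs. one whose odd lines are offset,
# as if they came from a different (moved) film frame.
h, w = 32, 16
whole = np.arange(h, dtype=float)[:, None] * np.ones(w)
split = whole.copy()
split[1::2] += 10.0
print(is_split_field(whole), is_split_field(split))  # -> False True
```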

Due to the nature of the process, you might have a cut that itself occurs within a split-field frame. That’s usually because this was a cut in the negative and was transferred as a split-field video frame. In that situation, you will have to remove the entire frame across both audio and video. These tiny 1-frame adjustments throughout the film will slightly shorten the duration, but usually it’s not a big deal. However, the audio edit may or may not be noticeable. If it can’t simply be fixed by a short 2-frame dissolve, then usually it’s possible to shift the audio edit a little into a pause between words, where it will sound fine.

Once the entire film is done, export a new self-contained master file. Depending on codecs and options, this might require a mixdown within Avid, especially if AMA linking was used. That was the case for this project, because I started out in ProResHQ. After export, you’ll have a clean, reconstructed 23.98p 4×3 NTSC-sized (720×486) master file. Now for the blow-up to HD.

DaVinci Resolve

There are many applications and filters that can blow up SD footage to HD, but often the results end up soft. I’ve found DaVinci Resolve to offer some of the cleanest resizing, along with very fast rendering for the final output. Resolve offers three scaling algorithms, with “Sharper” providing the crispest blow-up. The second issue is that since I wanted to restore the wider aspect, which is inherent in going from 4×3 to 16×9, this meant blowing up more than normal – enough to fit the image width and crop the top and bottom of the frame. Since Resolve has the editing tools to split clips at cuts, you have the option to change the vertical position of a frame using the tilt control. Plus, you can do this creatively on a shot-by-shot basis if you want to. This way you can optimize each shot to best fit the 16×9 frame, rather than arbitrarily lopping off a preset amount from the top and bottom.
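The reframing math itself is simple. As a hypothetical illustration (assuming square pixels for simplicity – real NTSC frames have non-square pixels, which shifts the numbers slightly), a centered crop to a target aspect ratio can be computed like this; the shot-by-shot tilt adjustment described above simply replaces the centered vertical offset with a per-shot value:

```python
def crop_to_aspect(src_w, src_h, target_aspect):
    """Return (x, y, w, h) of a centered crop matching target_aspect.

    Assumes square pixels; a per-shot vertical offset can replace
    the centered y to protect important picture content."""
    if src_w / src_h > target_aspect:
        # source is wider than the target: trim the sides
        w, h = round(src_h * target_aspect), src_h
    else:
        # source is taller than the target: trim top and bottom
        w, h = src_w, round(src_w / target_aspect)
    return ((src_w - w) // 2, (src_h - h) // 2, w, h)

# A 4x3 720x540 frame cropped for a 16x9 blow-up keeps the full width
# and loses 135 lines, split above and below:
print(crop_to_aspect(720, 540, 16 / 9))  # -> (0, 67, 720, 405)
```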

You actually have two options. The first is to blow up the film to a large 4×3 frame out of Resolve and then do the slicing and vertical reframing in yet another application, like FCP 7. That’s what I did originally with this project, because back then, the available version of Resolve did not offer what I felt were solid editing tools. Today, I would use the second option, which would be to do all of the reframing strictly within Resolve 11.

As always, there are some uncontrollable issues in this process. The original transfer of the film to Digibeta was done on a Rank Cintel Mark III, which is a telecine unit that used a CRT (literally an oscilloscope tube) as a light source. The images from these tubes get softer as they age and, therefore, they require periodic scheduled replacement. During the course of the transfer of the film, the lab replaced the tube, which resulted in a noticeable difference in crispness between shots done before and after the replacement. In the SD world, this didn’t appear to be a huge deal. Once I started blowing up that footage, however, it really made a difference. The crisper footage (after the tube replacement) held up to more of a blow-up than the earlier footage. In the end, I opted to only take the film to 720p (1280×720) rather than a full 1080p (1920×1080), just because I didn’t feel that the majority of the film held up well enough at 1080. Not just for the softness, but also in the level of film grain. Not ideal, but the best that can be expected under the circumstances. At 720p, it’s still quite good on Blu-ray, standard DVD or for HD over the web.

To finish the process, I dust-busted the film to fix places with obvious negative dirt (white specks in the frame) caused by the initial handling of the film negative. I used FCP X and CoreMelt’s SliceX to hide and cover negative dirt, but other options to do this include built-in functions within Avid Media Composer. While 35mm film still holds a certain intangible visual charm – even in such a “manipulated” state – the process certainly makes you appreciate modern digital cameras like the ARRI ALEXA!

As an aside, I’ve done two other complete films this way, but in those cases, I was fortunate to work from 1080i masters, so no blow-up was required. One was a film transferred in its entirety from a low-contrast print, broken into reels. The second was assembled digitally and output to intermediate HDCAM-SR 23.98 masters for each reel. These were then assembled to a 1080i composite master. Aside from being in HD to start with, cadence changes only occurred at the edits between reels. This meant that it only required 5 or 6 cadence corrections to fix the entire film.

©2014 Oliver Peters

The Zero Theorem

Few filmmakers are as gifted as Terry Gilliam when it comes to setting a story inside a dystopian future. The Monty Python alum, who brought us Brazil and Twelve Monkeys, to name just a few, is back with his newest, The Zero Theorem. It’s the story of Qohen Leth – played by Christoph Waltz (Django Unchained, Water for Elephants, Inglourious Basterds) – an eccentric computer programmer who has been tasked by his corporate employer with solving the Zero Theorem. This is a calculation that, if solved, might prove that the meaning of life is nothingness.

The story is set in a futuristic London, but carries many of Gilliam’s hallmarks, like a retro approach to the design of technology. Qohen works out of his home, which is much like a rundown church. Part of the story takes Qohen into worlds of virtual reality, where he frequently interacts with Bainsley (Melanie Thierry), a webcam stripper that he met at a party, but who may have been sent by his employer, Mancom, to distract him. The Zero Theorem is very reminiscent of Brazil, but in concept, also of The Prisoner, a 1960s-era television series. Gilliam explores themes of isolation versus loneliness, the pointlessness of mathematical modeling to derive meaning and privacy issues.

I recently had a Skype chat with Mick Audsley, who edited the film last year. Audsley is London-based, but is currently nearing completion of a director’s cut of the feature film Everest in Iceland. This was his third Gilliam film, having previously edited Twelve Monkeys and The Imaginarium of Doctor Parnassus. Audsley explained, “I knew Terry before Twelve Monkeys and have always had a lot of admiration for him. This is my third film with Terry, as well as a short, and he’s an extraordinarily interesting director to work with. He still thinks in a graphic way, since he is both literally and figuratively an artist. He can do all of our jobs better than we can, but really values the input from other collaborators. It’s a bit like playing in a band, where everyone feeds off of the input of the other band members.”

The long path to production

The film’s screenplay writer Pat Rushin teaches creative writing at the University of Central Florida in Orlando, Florida. He originally submitted the script for The Zero Theorem to the television series Project Greenlight, where it made the top 250. The script ended up with the Zanuck Company. It was offered to Gilliam in 2008, but initially other projects got in the way. It was revived in June 2012 with Gilliam at the helm. The script was very ambitious for a limited budget of under $10 million, so production took place in Romania over a 37-day period. In spite of the cost challenges, it was shot on 35mm film and includes 250 visual effects.

Audsley continued, “Nicola [Pecorini, director of photography] shot a number of tests with film, RED and ARRI ALEXA cameras. The decision was made to use film. It allowed him the latitude to place lights outside of the chapel set – Qohen’s home – and have light coming in through the windows to light up the interior. Kodak’s lab in Bucharest handled the processing and transfer and then sent Avid MXF files to London, where I was editing. Terry and the crew were able to view dailies in Romania and then we discussed these over the phone. Viewing dailies is a rarity these days with digitally-shot films and something I really miss. Seeing the dailies with the full company provides clarity, but I’m afraid it’s dying out as part of the filmmaking process.”

While editing in parallel to the production, Audsley didn’t upload any in-progress cuts for Gilliam to review. He said, “It’s hard for the director to concentrate on the edit, while he’s still in production. As long as the coverage is there, it’s fine. Certainly Terry and Nicola have a supreme understanding of film grammar, so that’s not a problem. Terry knows to get those extra little shots that will make the edit better. So, I was editing largely on my own and had a first cut within about ten days of the time that the production wrapped. When Terry arrived in London, we first went over the film in twenty-minute reels. That took us about two to three weeks. Then we went through the whole film as one piece to get a sense for how it worked as a film.”

Making a cinematic story

As with most films, the “final draft” of the script occurs in the cutting room. Audsley continued, “The film as a written screenplay was very fluid, but when we viewed it as a completed film, it felt too linear and needed to be more cinematic – more out of order. We thought that it might be best to move the sentences around in a more interesting way. We did that quite easily and quickly. Thus, we took the strength of the writing and realized it in cinematic language. That’s one of the big benefits of the modern digital editing tools. The real film is about the relationship between Bainsley and Qohen and less about the world they inhabit. The challenge as filmmakers in the cutting room is to find that truth.”

Working with visual effects presents its own editorial challenge. “As an editor, you have to evaluate the weight and importance of the plate – the base element for a visual effect – before committing to the effect. From the point-of-view of cost, you can’t keep undoing shots that have teams of artists working on them. You have to ensure that the timing is exactly right before turning over the elements for visual effects development. The biggest, single visual challenge is making Terry’s world, which is visually very rich. In the first reel, we see a futuristic London, with moving billboards. These shots were very complex and required a lot of temp effects that I layered up in the timeline. It’s one of the more complex sequences I’ve built in the Avid, with both visual and audio elements interacting. You have to decide how much you can digest and that’s an open conversation with the director and effects artists.”

The post schedule lasted about twenty weeks, ending with a mix in June 2013. Part of that time was tied up in waiting for the completion of visual effects. Since there was no budget for official audience screenings, the editorial team was not tasked with creating temp mixes and preview versions before finishing the film. Audsley said, “The first cut was not overly long. Terry is good in his planning. One big change that we made during the edit was to the film’s ending. As written, Qohen ends up in the real world for a nice, tidy ending. We opted to end the film earlier, on a more ambiguous note. In the final cut the film ends while he’s still in a virtual reality world. It provides a more cerebral, rather than practical, ending for the viewer.”

Cutting style 

Audsley characterizes his cutting style as “old school”. He explained, “I come from a Moviola background, so I like to leave my cut as bare as possible, with few temp sound effects or music cues. I’ll only add what’s needed to help you understand the story. Since we weren’t obliged on this film to do temp mixes for screenings, I was able to keep the cut sparse. This lets you really focus on the cut and know if the film is working or not. If it does, then sound effects and music will only make it better. Often a rough cut will have temp music and people have trouble figuring out why a film isn’t working. The music may mask an issue or, in fact, it might simply be that the wrong temp music was used. On The Zero Theorem, George Fenton, our composer, gave us representative pieces late in the process that he’d written for scenes.” Andre Jacquemin was the sound designer who worked in parallel to Audsley’s cut and the two developed an interactive process. Audsley explained, “Sometimes sound would need to breathe more, so I’d open a scene up a bit. We had a nice back-and-forth in how we worked.”

Audsley edited the film using Avid Media Composer version 5 connected to an Avid Unity shared storage system. This linked him to another Avid workstation run by his first assistant editor, Pani Ahmadi-Moore. He’s since upgraded to version 7 software and Avid ISIS shared storage. Audsley said, “I work the Avid pretty much like I worked when I used the Moviola and cut on film. Footage is grouped into bins for each scene. As I edit, I cut the film into reels and then use version numbers as I duplicate sequences to make changes. I keep a daily handwritten log about what’s done each day. The trick is to be fastidious and organized. Pani handles the preparation and asset management so that I can concentrate on the edit.”

Audsley continued, “Terry’s films are very much a family type of business. It’s a family of people who know each other. Terry is supremely in control of his films, but he’s also secure in sharing with his filmmaking family. We are open to discuss all aspects of the film. The cutting room has to be a safe place for a director, but it’s the hub of all the post activity, so everyone has to feel free about voicing their opinions.”

Much of what the editor does proceeds in isolation. The Zero Theorem provided a certain ironic resonance for Audsley, who commented, “At the start, we see a guy sitting naked in front of a computer. His life is harnessed in manipulating something on screen, and that is something I can relate to as a film editor! I think it’s very much a document of our time, about the notion that in this world of communication, there’s a strong aspect of isolation. All the communication in the world does not necessarily connect you spiritually.” The Zero Theorem is scheduled to open for limited US distribution in September.

For more thoughts from Mick Audsley, read this post at Avid Blogs.

Originally written for DV magazine / CreativePlanetNetwork.

©2014 Oliver Peters