A Film’s Manageable Length

There are three versions of every feature film: as written, as shot, and what comes out of post. The written and filmed story often runs long compared to some arbitrary target length. It is the editor’s job to get the film down to a “manageable length.” Since a film isn’t broadcast television and doesn’t have to fit a time format, deciding on the right length is a vague exercise. It’s like asking how long a book should be.

This idea grew in part out of both the audience’s attention span and how long their bladders hold out. Couple that with a theater’s schedule – longer movies mean fewer screenings and, therefore, lower box office revenue. In past decades, the accepted length was in the 90 to 100 minute range. Modern blockbusters can easily clock in at 120 to 150 minutes. However, if you are an indie filmmaker and didn’t produce a film starring Tom Cruise, then you’d better stick close to that 100 minute mark.

The script

The rule of thumb for a script is around one minute per page: 100 pages = 100 minutes. For the most part that works, until you hit a script line such as “and the battle ensued,” which can easily consume several minutes of screen time. And so, it takes careful reading and interpretation of a script to get a valid ballpark length. That impacts not only the final length, but also the shooting schedule, production budget, and more.

Walter Murch has a technique to get a good idea of a film’s true length. His method is to take a day or two and act out each scene of the script by himself – reading the dialogue and going through the characters’ actions. As he does this, he times each scene. He’ll do this two or three times until he has a good average timing for each scene and a total estimate for the film. Then, as the film is being shot, he’ll compare his estimates with the timings coming from the script supervisor. If they are radically off, he knows that something deviated significantly from the written script – and that will need some explanation.
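To make the bookkeeping concrete, here’s a minimal Python sketch of that estimating process – an illustration of the arithmetic, not Murch’s actual tool. The scene names, timings, and the 25% deviation tolerance are all hypothetical.

```python
# Average several acted-out stopwatch passes per scene (Murch does two
# or three), total them for a runtime estimate, then flag scenes whose
# on-set timings deviate sharply from the estimate.

def estimate_scenes(passes: dict[str, list[float]]) -> dict[str, float]:
    """Average multiple timed read-throughs (in seconds) per scene."""
    return {scene: sum(t) / len(t) for scene, t in passes.items()}

def flag_deviations(estimates: dict[str, float],
                    supervisor: dict[str, float],
                    tolerance: float = 0.25) -> list[str]:
    """Scenes whose script supervisor timing differs from the estimate
    by more than the tolerance -- these will need some explanation."""
    return [s for s, est in estimates.items()
            if s in supervisor and abs(supervisor[s] - est) / est > tolerance]

passes = {"Sc. 1": [72.0, 75.0, 70.0], "Sc. 2": [140.0, 150.0]}
estimates = estimate_scenes(passes)
print(f"Estimated total: {sum(estimates.values()) / 60:.1f} min")
print("Discuss:", flag_deviations(estimates, {"Sc. 1": 71.0, "Sc. 2": 240.0}))
```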

Trimming the first assembly

The starting point for any editor is to assemble everything according to the script. At this point, the editor does not have discretion to drop lines, drop scenes, or re-arrange anything. The point is to present the director with an initial cut that is faithful to the director’s intention during filming. Now you know how long the combined material really is. It’s quite common for the film to run long. In fact, that’s better than being too short or even very close to the target length.

If a film runs 10-30% over, then according to Murch, you can get there through “diet and exercise.” If it’s 50-100% or more over-length, then it’s time for true “surgery” to figuratively lose some body parts or organs.
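As a quick illustration of those thresholds (the percentages are Murch’s; the function and labels are mine), here is a hedged Python sketch:

```python
def overage_strategy(assembly_min: float, target_min: float) -> str:
    """Classify a first assembly against Murch's rule of thumb."""
    over = (assembly_min - target_min) / target_min
    if over <= 0:
        return "at or under length -- rarely where you want to start"
    if over <= 0.30:
        return f"{over:.0%} over: 'diet and exercise' trims"
    return f"{over:.0%} over: 'surgery' -- lose scenes or characters"

print(overage_strategy(125, 100))  # 25% over -> diet and exercise
print(overage_strategy(180, 100))  # 80% over -> surgery
```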

A film that’s 10-30% over can usually be trimmed in various ways without losing any key scenes. One way is to cut lines more tightly together, which can also help the pacing. A film often has “shoe leather” – getting a character from point A to point B. For example, a character arrives home in his car, walks up to the front door, opens it, and enters the home. Here, the editor can cut from the car arriving directly to the interior as the actor enters. Another technique is to enter scenes a bit later and exit them a bit earlier. And finally, as you watch the assembled film, you may realize that there are redundant dialogue lines or early plot reveals that can be cut. All of these make up the “diet and exercise” solution.

Surgery

If the film is long and you can’t get to the desired length through “diet and exercise,” then more drastic cuts are needed. You might have to lose entire scenes or even characters. Sometimes this can focus the film by homing in on the real story. You often realize that some of these scenes weren’t needed after all and the film plays better without them. It’s at this stage that the director and editor may re-arrange some of the scene order. In doing so, you may also discover that certain plot elements become obvious and that scenes which might have foreshadowed or explained them aren’t needed after all. This process can take days, weeks, or months.

It can also be painful for many directors. Some are happy to jump in and make severe cuts right away. Others have to go through an iterative process of whittling the film down in numerous passes over the course of weeks.

One of the earliest films I cut was “The First of May.” It was a family film with a child lead actor coupled with an ensemble of older acting legends. Toss in a literal circus and you can see the complexity. The final cut ran long compared to what was assumed to be the “ideal” length for an indie family film.

As we were getting down to the wire for the initial pitches to potential distributors, the producing partners – who split the roles of writer and director – were at odds over the length. One argument was that “ET” was a family film and it was long. The counter-argument was that this wasn’t “ET” and if it was too long, they’d never get in the door in the first place.

We were at an impasse, so the co-producer/director and I did what we called the “slash and burn” edit. What could we cut out of the film to get to 90 minutes, if we were told it absolutely had to be that length? Unfortunately, this exercise didn’t sit well with the co-producer/writer. In the end, after some tense conversations, they were able to agree on an edit that held together well and met the objectives.

This is a dilemma that every editor/director team faces and it will always be painful for some. After all, when the editor cuts out the scene with that great crane shot that took all day to pull off, the director can’t help but wince. However, it’s all in service of the story. Remember, the audience only sees the film that they are presented with and will usually never know what was cut out. If the pacing and emotion are right and the story holds up and entertains, then you’ve done your job as an editor – no matter what the film’s final length is.

©2023 Oliver Peters

The Oscar. Now what?

Everything Everywhere All at Once dominated the Academy Awards night, including winning the Best Film Editing award for Paul Rogers. The team used Adobe Premiere Pro as their NLE of choice. By extension this becomes the first editing Oscar win for Premiere. Of course, it’s the team and editor that won the award, not the software that they used. Top editors could cut with any application and get the same result.

The Academy Awards started as a small celebratory dinner for insiders to recognize each other’s achievements in film. Over the decades this has become a major cultural event. Winning or even being nominated is a huge feather in the cap for any film. This can be heavily leveraged by the marketing teams of not only the film distributors and talent agents, but also the various products used in the process – be that cameras or software.

Avid’s dominance

When it comes to editing, Avid has been the 800-pound gorilla in the modern digital era. Ever since Walter Murch won for editing The English Patient using Media Composer, the specific NLE on an Oscar-winning film has become a hot topic among editors. This was never the case when the only options were Moviola, KEM, or Steenbeck.

Even this year, nine out of the ten nominees for the Oscar for Best Picture and four out of the five nominees for Best Film Editing used Media Composer. Yet Avid’s dominance in the winner’s circle has seen occasional cracks from competitors, like Apple’s Final Cut Pro (legacy version) and Lightworks. Nevertheless, Media Composer is still a safe bet. And let’s not forget sound, where Pro Tools has even less competition from other DAWs among film and TV sound editors and mixers. All of the nominees for the Oscar for Best Sound at this year’s Academy Awards used Pro Tools.

There are, of course, many awards competitions around the world, including the ACE Eddie Awards, BAFTA, the Golden Globes, and various film festivals. Many of these don’t give out specific craft awards for editors or editing; however, a lot of the winning films have been edited with other tools. For example, many award-worthy indie films, especially documentaries, have been edited with Premiere Pro. Even Final Cut Pro (the current “X” version) has had wins in such categories, including the short films The Silent Child and Skin at the 2018 and 2019 Academy Awards.

Stacking up the NLE competitors

The truth of the matter is that today there are seven viable applications that might be used to cut a professional feature film or documentary: Media Composer, Final Cut Pro, Premiere Pro, DaVinci Resolve, Lightworks, Edius X, and Vegas Pro. You could probably also factor in others, such as Final Cut Pro 7 (now zombie-ware) and Media 100 (yes, still alive), not to mention consumer-oriented NLEs like iMovie or Movie Maker. Realistically, most experienced film editors are likely to use only one of the first five on the list.

Of those five, Blackmagic Design’s DaVinci Resolve is the app that most editors have their eyes on. Aside from its widespread use in color correction, Resolve is also a perfectly capable editing application. Although it has yet to pull off an Oscar win for editing, Resolve has been widely used in many aspects of the production and post workflow of top films. Owing to its nature as a “Swiss Army Knife” application, Resolve fits into various on-set, editing, and visual effects niches. It’s only a matter of time before Resolve gets an Oscar win for editing. But other Blackmagic Design products also shouldn’t be overlooked. At the 2023 Academy Awards, more than 20 films across the technical, documentary, short film, international feature film, and animated categories used some Blackmagic Design product.

Marketing

When an application is used on an award-winning film, I’d bet that the manufacturer’s marketing department is doing high-fives. But does this really move the sales needle? Maybe. It’s all aspirational marketing. They want you to feel that if you use the same software as an Oscar-winning film editor used, then you, too, could be in that league. Talent is always the key factor, but we can all dream. Right? That’s what marketing plays upon, but it also impacts the development of the application itself.

Both Avid and Adobe have been fine-tuning their tools with professional users in mind for years. They’ve added features based on the needs of a small, but influential (or at least vocal) market sector. This results in applications that tick most of the professional boxes, but which are also harder to learn and eventually master.

That’s a route Apple also chose to pursue with Final Cut Pro 1 through 7. Despite a heralded introduction with Cold Mountain in 2003, it took until 2010 before Angus Wall and Kirk Baxter nailed down an Oscar with The Social Network. They then reprised that in 2011 with a win for The Girl with the Dragon Tattoo. Even as late as 2020, the discontinued FCP 7 was represented by Parasite, winning Best Picture and nominated for Best Film Editing.

Apple and Final Cut Pro’s trajectory unexpectedly changed course with the introduction of Final Cut Pro X. This shift coincided with the growth of social media and a new market of non-traditional video editors. Final Cut Pro in its current iteration is the ideal application for this market and has experienced huge growth in users. But it still gets labelled as not being ready for professional users, even though a ton of professional content is posted using the app. Apple took the platform approach – opting to leave out many advanced features and letting third-party developers fill in the gaps where needed. That choice is the core of much of the criticism.

How advanced/complex does a professional NLE really need to be?

In the case of FCP, it’s certainly capable of Hollywood-level films, along with a range of high-end, international dramas. Witness the many examples I’ve written about, like Focus, Whiskey Tango Foxtrot, Voice from the Stone, The Banker, Jezebel, and Blood Red Sky. However, a wide range of professional editors would like to see more.

The internal corporate discussion goes like this. Marketing asks, “What do we have to do to get broader adoption among professional film editors?” Engineering answers, “It will take X dollars and X amount of time.” Top management asks, “What’s the return if we do that?” And that’s usually where the cycle stops, until the next year or awards season.

The truth is that the traditional high-end post market is extremely small for a company like Apple. The company is already selling hardware, which is their bread and butter. Will a more advanced version of FCP sell more hardware? Probably not. Avid, Adobe, and Blackmagic Design are already doing that for them. On the other hand, what is more influential for sales in today’s market – Oscar-winning professional editors or a bevy of YouTube influencers touting your product?

I’m not privy to sales numbers, so I have no idea whether or not going after the very small professional post market makes financial sense for either Blackmagic Design or Adobe. In the case of Avid, their dominance pays off through their ecosystem. Avid-based facilities are also likely to have Avid storage and Pro Tools audio suites. Hardware most likely covers the development costs. Plus, both Avid and Adobe have shifted to subscription models (Adobe fully, Avid as an option), which seems to be good for both companies.

Blackmagic Design is also a hardware developer and manufacturer. Selling cameras and a wide range of other products enables them to offer DaVinci Resolve for as little as free. You’d be hard-pressed to find a production company that isn’t using one or more Blackmagic products. Only time will tell which company has taken the approach that a) ensures their long-term survival, and b) best benefits professional film editors. In the case of Apple, it’s pretty clear that adding new features to Final Cut Pro would generate revenue in an amount that many competitors would envy. Yet it would be small by Apple’s measurement.

In the end, awards are good for a developer’s marketing buzz, but don’t forget the real team that won the award itself. It’s wonderful for Paul Rogers and Adobe that Everything Everywhere All at Once was tapped for the Oscar for Best Film Editing. It’s an interesting milestone, but when it comes to software, it’s little more than bragging rights. Great to have, but remember, it’s Rogers who earned it, regardless of the tools he used.

©2023 Oliver Peters

What is a Finishing Editor?

To answer that, let’s step back to film. Up until the 1970s, dramatic television shows, feature films, and documentaries were shot and post-produced on film. The film lab would print positive copies (work print) of the raw negative footage. Then a team of film editors and assistants would handle the creative edit of the story by physically cutting and recutting this work print until the edit was approved. This process was often messy, with many film splices, grease pencil marks on the work print to indicate dissolves, and so on.

Once a cut was “locked” (approved by the director and the execs) the edited work print and accompanying notes and logs were turned over to the negative cutter. It was this person’s job to match the edits on the work print by physically cutting and splicing the original camera negative, which up until then was intact. The negative cutter would also insert any optical effects created by an optical house, including titles, transitions, and visual effects.

Measure twice, cut once

Any mistakes made during negative cutting were – and are – irreparable, so it is important that a negative cutter be detail-oriented and precise, and work cleanly. You don’t want excess glue at the splices and you don’t want to pick up any extra dirt and dust on the negative if it can be avoided. If a mistaken cut is made and you have to repair that splice, then at least one frame is lost from that first splice.

A single frame – 1/24th of a second – is the difference in a fight scene between a punch just about to enter the frame and the arm passing all the way through the frame. So you don’t want a negative cutter who is prone to making mistakes. Paul Hirsch, ACE, points out in his book A long time ago in a cutting room far, far away… that there’s an unintentional jump cut in the Death Star explosion scene in the first Star Wars film, thanks to a negative cutting error.

In the last phase of the film post workflow, the cut negative goes to the lab’s color timer (the precursor to today’s colorist), who sets the “timing” information (color, brightness, and densities) used by the film printer. The printer generates an interpositive version of the complete film from the assembled negative. From this interpositive, the lab will generally create an internegative from which release prints are created.

From the lab to the linear edit bay

This short synopsis of the film post-production process brings us back to where we started. By the mid-1970s, video post-production technology came onto the scene for anything destined for television broadcast. Material was still shot on film and, in some cases, creatively edited on film as well. But the finishing shifted to video. For example, telecine systems were used to transfer and color correct film negative to videotape. The lab’s color timing function shifted to this stage (before the edit) and was now handled by the telecine operator, who later became known as a colorist.

If work print was generated and edited by a film editor, then it was the video editor’s job to match those edits from the videotapes of the transferred film. Matching was a manual process. A number of enterprising film editors worked out methods to properly compute the offsets, but no computerized edit list was involved. Sometimes a video offline edit session was first performed with low-res copies of the film transfer. Other times producers simply worked from handwritten timecode notes for selected takes. This video editing – often called online editing and operated by an online editor – was the equivalent to the negative cutting stage described earlier. Simpler projects, such as TV commercials, might be edited directly in an online edit session without any prior film or offline edit.
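For a sense of the arithmetic those editors were doing by hand, here’s a hypothetical Python sketch. It assumes 35mm film at sound speed (16 frames per foot, 24 fps) and ignores the 3:2 pulldown that a real film-to-NTSC transfer would introduce; the footage numbers are made up.

```python
FRAMES_PER_FOOT = 16   # 35mm film
FPS = 24               # sound speed

def feet_frames_to_frames(feet: int, frames: int) -> int:
    """Convert a work print feet+frames count to a total frame count."""
    return feet * FRAMES_PER_FOOT + frames

def frames_to_timecode(total: int, fps: int = FPS) -> str:
    """Express a frame count as HH:MM:SS:FF timecode."""
    ss, ff = divmod(total, fps)
    mm, ss = divmod(ss, 60)
    hh, mm = divmod(mm, 60)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# A cut marked 90 feet + 8 frames into a reel is this far past the
# start of the corresponding videotape transfer:
print(frames_to_timecode(feet_frames_to_frames(90, 8)))  # 00:01:00:08
```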

Into the digital era

Over time, any creative editing previously done on film for television projects shifted to videotape edit systems and later to digital nonlinear edit systems (NLEs), such as Avid and Lightworks. These editors were referred to as offline editors, and post now followed a bifurcated process known as offline and online editing. This was analogous to film’s work print and negative cutting stages. Likewise, telecine technology evolved to not only perform color correction during the film transfer process, but also afterwards, working from the assembled master videotape as a source. This process, known as tape-to-tape color correction, gave the telecine operator – now colorist – the tools to perform better shot matching, as well as to create special looks in post. With this step the process had come full circle, making the video colorist the true equivalent of the lab’s color timer.

As technology marched on, videotape and linear online edit bays gave way to all-digital, NLE-based facilities. Nevertheless, the separation of roles and processes continued. Around 2000, Avid came in with its Symphony model – originally a separate product and not just a software option. Avid Symphony systems offered a full set of color-correction tools and the ability to work in uncompressed resolutions.

It became quite common for a facility to have multiple offline edit bays using Avid Media Composer units staffed by creative offline editors working with low-res media. These would be networked to an Avid shared storage solution. In addition, these facilities would also have one or more Avid Symphony units staffed by online editors.

A project would be edited on Media Composer until the cut was locked. Then assistants would ingest high-res media from files or videotape, and an online editor would “conform” the edit with this high-res media to match the approved timeline. The online editor would also handle Symphony color correction, insert visual effects, titles, etc. Finally, all tape or file deliverables would be exported out of the Avid Symphony. This system configuration and workflow is still in effect at many facilities around the world today, especially those that specialize in unscripted (“reality”) TV series.

The rise of the desktop systems

Naturally, there are more software options today. Over time, Avid’s dominance has been challenged by Apple Final Cut Pro (FCP 1-7 and FCPX), Adobe Premiere Pro, and more recently Blackmagic Design DaVinci Resolve. Systems are no longer limited by resolution constraints. General purpose computers can handle the work with little or no bespoke hardware requirements.

Fewer projects are even shot on film anymore. An old-school, film lab post workflow is largely impossible to mount any longer. And so, video and digital workflows that were once used only for television shows and commercials are now used in nearly all aspects of post, including feature films. There are still some legacy terms in use, such as DI (digital intermediate), which for a feature film is essentially an online edit and color correction session.

Given that modern software – even running on a laptop – is capable of performing nearly every creative and technical post-production task, why do we still have separate dedicated processes and different individuals assigned to each? The technical part of the answer is that some tasks do need extra tools. Proper color correction requires precision monitoring and becomes more efficient with specialized control panels. You may well be able to cut with a laptop, but if your source media is made up of 8K RED files, a proxy (offline-to-online) workflow makes more sense.
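As a concrete, hedged example of that proxy idea – a sketch, not any particular facility’s pipeline – the following Python snippet batch-creates 1080p ProRes Proxy copies with ffmpeg. Note that ffmpeg cannot decode RED .R3D files natively, so this assumes mezzanine .mov sources; the folder names are made up.

```python
import subprocess
from pathlib import Path

SOURCE = Path("camera_originals")   # hypothetical high-res masters
PROXIES = Path("proxies")
PROXIES.mkdir(exist_ok=True)

for clip in sorted(SOURCE.glob("*.mov")):
    subprocess.run([
        "ffmpeg", "-y", "-i", str(clip),
        "-vf", "scale=-2:1080",                   # downscale, keep aspect
        "-c:v", "prores_ks", "-profile:v", "0",   # ProRes 422 Proxy
        "-c:a", "copy",                           # keep original audio
        str(PROXIES / clip.name),
    ], check=True)
```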

The human side of the equation is more complex

Post-production tasks often involve a left-brain/right-brain divide. Not every great editor is good when it comes to the completion phase. In spite of being very creative, many have sloppy edits, messy timelines, and project organization that leaves a lot to be desired. For example, all footage and sequences bunched together in one large project without bins. Timelines might have clips spread vertically in no particular order, with some clips disabled – based on changes made in each revision pass. As I’ve said before: You will be judged by your timelines!

The bottom line is that the kind of personality that makes a good creative editor is different from the one that makes a good online editor. The latter is often called a finishing editor today within larger facilities. While not a perfect analogy, there’s a direct evolutionary path from film negative cutter to linear online editor to today’s finishing editor.

If you compare this to the music world, songs are often handled by a mixing engineer followed by a mastering engineer. The mix engineer creates the best studio mix possible and the mastering engineer makes sure that mix adheres to a range of guidelines. The mastering engineer – working with a completely different set of audio tools – often adds their own polish to the piece, so there is creativity employed at this stage, as well. The mastering engineer is the music world’s equivalent to a finishing editor in the video world.

Remember that on larger projects, like a feature film, the film editor is contracted for a period of time to deliver a finished cut of the film. They are not permanent staff. Once that job is done, the project is handed off to the finishing team to accurately generate the final product working with the high-res media. Other than reviewing the work, there’s no value in having a highly paid film editor also handle the basic assembly of the master. This is also true at many high-end commercial editorial companies. It’s more productive to have the creative editors working with the next client, while the staff finishing team finalizes the master files.

The right kit for the job

It also comes down to tools. Avid Symphony is still very much in play, especially with reality television shows. But there’s also no reason finishing and final delivery can’t be done using Apple Final Cut Pro or Adobe Premiere Pro. Often more specialized edit tools are assigned to these finishing duties, including systems such as Autodesk Smoke/Flame, Quantel Rio, and SGO Mistika. The reason, aside from quality, is that these tools also include comprehensive color and visual effects functions.

Finishing work today involves more than simply conforming a creative edit from a decision list. The finishing editor may be called upon to create minor visual effects and titles, along with finessing those that came out of the edit. Increasingly, Blackmagic Design DaVinci Resolve is becoming a strong contender for finishing – especially if Resolve was used for color correction. It’s a powerful all-in-one post-production application, capable of handling all of the effects and delivery chores. If you finish out of Resolve, that cuts out half of the roundtrip process.

Attention to detail is the hallmark of a good finishing editor. Having good color and VFX skills is a big plus. It is, however, a career path in its own right and not necessarily a stepping stone to becoming a top-level feature film editor or even an A-list colorist. While that might be a turn-off to some, it will also appeal to many others and provide a great place to let your skills shine.

©2023 Oliver Peters

A Conversation with Walter Murch – Part 4

Roads not travelled.

No matter how long the career or the number of awards, any editor might consider the films that passed them by and wonder what they might have done with them. That’s where we conclude this discussion.

______________________

Walter, in your career, are there any films that you didn’t edit, but wish you had?

I worked on K-19: The Widowmaker for Kathryn Bigelow. And in 2007 she sent me the script for The Hurt Locker, about the Iraq war. It made me realize that the last four films I had edited had been films about war, the latest one being Jarhead. I told her that I just wanted to take a break from editing war films. Of course, Hurt Locker went on to win six Oscars – Best Picture, Best Original Screenplay, Best Director, Best Film Editing, Best Sound Editing, and Best Sound Mixing.

What would have happened if I had said yes to that? But you also get into practical things. At the time of making that decision, I’d been away from home for a year editing Jarhead in Los Angeles and New York. This would have meant going to the Middle East or at least going back to Los Angeles. But the main thing was just that I’d been thinking about war since 2000: Apocalypse Now Redux – war, K-19 – war, Cold Mountain – war, Jarhead – war. Even The English Patient is kind of a war film. So turning down The Hurt Locker is the big What If? that comes to mind.

I know you are an Orson Welles buff, so I was actually thinking it might have been The Other Side of the Wind, which was finally completed in 2018.

I did the recut of Touch of Evil. At that time, 1998, I was taken to the vaults in Los Angeles, where the material for The Other Side of the Wind was in storage. Gary Graver showed me some of the rough assemblies that had been put together by Welles himself, but I just didn’t want to work on that one. It looked very self-indulgent to me. The situation with Touch of Evil was very, very different.

The finished version of Wind seems like it’s an art film within the film and appears to be somewhat autobiographical. Although, I believe Welles denied that the director character (John Huston) was modeled after his own career.

Right. Touch of Evil was obviously a scripted film that was produced by Universal Studios in Hollywood – albeit in a Wellesian manner – but it was very buttoned down, relatively speaking. And then Welles left us the 58-page memo, which made very specific suggestions for how the film should be recut. It was clear what he wanted done. I mean, he didn’t talk about frames – he would just say: this section needs to be shorter. Or: restore the script structure for the first four reels, cross-cutting between Janet Leigh’s story and Charlton Heston’s story. The studio had put all the Heston story together and then all the Leigh story together. He wanted his original structure back. I don’t believe there was a guiding memo like that for The Other Side of the Wind.

Welles’ Touch of Evil memo is a wonderful document. It’s 58 pages by a genius filmmaker under duress, writing about his ideas and addressing his enemies at the studio. It’s a masterclass in political diplomacy – trying to get his ideas across without accusation. It’s sad that he had to write it, but I’m happy we have it.

Thank you.

______________________

Walter Murch has led and continues to lead an interesting and eclectic filmmaking career. If you’ve enjoyed this 4-part series, there’s plenty more to be found in the books written by and about him. There are also many of his interviews and presentations available on the web.

SIGHT & SOUND: The Cinema of Walter Murch is a documentary created by Jon Lefkovitz. The video is assembled from various interviews and presentations in which Murch discusses his take on filmmaking and editing. It’s illustrated with many film examples to highlight the concepts.

Web of Stories – Life Stories of Remarkable People includes an 18-hour series of interviews with Walter Murch recorded in London in 2016. These are broken down into 320 short clips for easier viewing.

A Conversation with Walter Murch – Part 1

A Conversation with Walter Murch – Part 2

A Conversation with Walter Murch – Part 3

©2023 Oliver Peters

A Conversation with Walter Murch – Part 3

Sound design and the film mixing process

Walter Murch is not only known for his work and awards in the realm of picture editing, but he’s also made significant contributions to the art of film sound. That’s where we pick up in Part 3.

______________________

Walter, let’s switch gears and talk about sound and your work as a sound designer and re-recording mixer. There’s an origin story about the term ‘sound designer’ credited to union issues. That might be a good place to start.

Francis [Ford Coppola] tells the story, but he gets it wrong. [laugh] There were union problems, because I was in the San Francisco union. Many of the films were financially based in LA, so what am I doing working on that? On The Rain People, for instance, my credit is sound montage. I wasn’t called sound mixer, re-recording mixer, or sound editor. We were just trying to avoid blatantly stepping on toes. I had the same sound montage credit on The Conversation. Then on Apocalypse Now I was credited with sound re-recording, because it was an independent film. Francis basically was the financier of it. So, it was an independent film, partially supported by United Artists.

The sound design idea came up because of this new format, which we now call 5.1. Apocalypse Now was the first time I had ever worked in that format. It was the first big film that really used it in a creative way. The Superman film in 1978 had it technically, but I don’t think they used it much in a creative fashion.

As I recall, at that time there was the four-channel surround format: left, center, right, and a mono rear channel.

Star Wars, which came out in 1977, had that format with the ‘baby boom’ thing. 70mm prior to that would have five speakers behind the screen – left, left-center, center, right-center, and right. What they decided to do on Star Wars was to not have the left-center/right-center speakers. Those were only used for super low frequency enhancement. Then the surround was many speakers, but all of them wired to only one channel of information.

Walter Murch mixing Apocalypse Now

We didn’t use those intermediate speakers at all and brought in Meyer ‘super boom’ speakers. These went down to 20Hz, much lower than the Altec speakers in those days, which I think bottomed out at around 50Hz. And then we split the mono surround speakers into two channels – left and right. Basically what we call 5.1 today. We didn’t call it 5.1, we just called it six-track or a split-surround format.

This was a new format, but how do you use it creatively? I couldn’t copy other films that had done it, because there weren’t any. So I designed an entire workflow system for the whole film. Where would we really use 5.1 and where would we not? That was a key thing – not to fall into the trap that usually happens with new technology, which is to overuse it. Where do we really want to have 5.1? In between that, let’s just use stereo and even some long sections where it’s just mono – when Willard is looking at the Kurtz dossier, for instance.

In Willard’s narration sections, when he’s in the fo’c’sle of the boat at night reading the dossier, it’s just mono – just his voice and that’s it. As things open up, approaching Hau Phat (the Playboy Bunny concert), it becomes stereo, and then as it really opens up, with the show beginning, it becomes full six-track, 5.1. Then it collapses back down to mono again the next morning. That’s the design element. So I thought, that’s the unique job that I did on the film – sound design – designing where we would fully use this new format and how we would use it.

Mark Berger, Francis Ford Coppola, and Walter Murch mixing The Godfather Part II

In addition, of course, I’d cut some of the sound effects, but not anywhere near most of them, because we had a small army of sound effects editors working under Sound Effects Supervisor Richard Cirincione. However, I was the main person responsible. Francis at one point had a meeting and he said, “Any questions about sound? Walter’s the guy. Don’t ask me, ask Walter.” So I effectively became the director of sound. And then, of course, I was the lead re-recording mixer on the film.

Some readers might not be familiar with how film mixing works and why there are teams. Please go into that more.

Previously, in Hollywood, there would usually be three people – D, M, and E – sitting at the board. That is how The Godfather was mixed. If you are facing the screen from behind the console, then the D [dialogue mixer] is on the left, M [music mixer] in the middle, and E [sound effects mixer] on the right. On the other hand, in San Francisco I had been the solo re-recording mixer on The Rain People, THX-1138, and The Conversation.

Walter Murch handling the music mix, Particle Fever

As soon as you have automation you don’t need as many people, because the automation provides extra fingers. We had very basic automation on Apocalypse Now. Only the faders were automated, but none of the equalization, sends, echo, reverb, or anything else. So we had to keep lots of notes about settings. The automation did at least control the levels of each of the faders.

Of course, these days a single person can mix large projects completely ‘in the box’ using mainly a DAW. I would imagine mixing for music and mixing for film and television is going to use many of the same tools.

The big difference is that in the old days – and I’m thinking of The Godfather – we had very limited ability to hear the edited soundtracks together before we got to the mix. You had no way to set their levels relative to each other until you got to the mix. So the mix was really the creation of the soundtrack from the ground up.

Supervising the mix, Coup 53

Thinking of the way I work now, or the way Skip Lievsay works with the Coen brothers – he will create the sound for a section in Pro Tools and build it up. Then he’ll send the brothers a five-track or a three-track and they just bring it into the audio tracks of the Premiere timeline. So they’re editing the film with his full soundtrack. There are no surprises in the final mix. You don’t have to create anything. The final mix is when you hear it all together and put ‘holy water’ on it and say, that’s it – or not. Now that you’ve slept on it overnight, let’s reduce the bells of the cows by 3dB. You make little changes, but it’s not this full-on assault of everything. It’s not, as I said earlier, ‘bareback’ – taking the raw elements and putting them together for the first time in the mix. The final mix now is largely a certification of things that you have already been very familiar with for some time.

______________________

Click here for the conclusion of this conversation in Part 4.

A Conversation with Walter Murch – Part 1

A Conversation with Walter Murch – Part 2

©2023 Oliver Peters