Life in Six Strings

Whether you’re a guitar nerd or just into rock ‘n roll history, learning what makes our music heroes tick is always entertaining. Music journalist and TV presenter Kylie Olsson started a YouTube channel during the pandemic lockdown, teaching herself how to play guitar and reaching out to famous guitarists she knew. This became the concept for a TV series called “Life in Six Strings with Kylie Olsson,” which airs on AXS TV. The show is in the style of “Comedians in Cars Getting Coffee.” Olsson explores the passions that drive these guitarists and picks up a few guitar pointers along the way.

I spoke with James Tonkin and Leigh Brooks about the post workflow for these episodes. Tonkin is founder of Hangman in London, which handled the post on the eight-part series. He was also the director of photography for the first two episodes and has handled the online edit and color grading for all of the episodes. Leigh Brooks (Firebelly Films) was the offline (i.e. creative) editor on the series, starting with episode three. Together they have pioneered an offline-to-online post workflow based entirely around Blackmagic Design DaVinci Resolve.

James, how did you get started on this project?

James Tonkin: Kylie approached us about shooting a pilot for the series. We filmed that in Nashville with Joe Bonamassa and it formed the creative style for the show. We didn’t want to just fixate on the technical side of the guitar and the tone of these players, but also their geographical base – explore the city a little bit. We had to shoot it very documentary style, but wrap it up into a 20-25 minute episode. No pre-lighting, just a tiny team following her around, interacting with these people.

Then we did a second one with Nuno Bettencourt, and that solidified the look of the show during those two initial episodes. She eventually got distribution through AXS TV in the States for the eight-part series. I shot the first two episodes and the rest were shot by a US-based crew, which followed the production workflow that we had set up – not only retaining the look and the documentary format, but also maintaining the highest production value we could within the time and budget we were working with.

We chose to shoot anamorphic with a cinematic aspect ratio, because it’s slightly different from the usual off-the-cuff reality TV look. We also record in a raw codec whenever possible, because we (Hangman) were doing all of the post on it, with me specifically being the colorist. I always advocate for a raw workflow, especially on something shot documentary style. People are walking from daylight into somebody’s house and then down to a basement, basically following them around. And Kylie wants to keep interacting with whomever she’s interviewing without needing to wait for cameras to stop and re-balance. She wants to keep it flowing. So when it comes to posting that, you’ve got a much more robust digital negative to work with [if it was shot as camera raw].

What was the workflow for the shows and were there any challenges?

Leigh Brooks: The series was shot mainly with Red and Canon cameras as 6K anamorphic files. Usually the drive came to me and I would transcode the rushes or create proxy files and then send the drive to James. The program is quite straightforward and narrative-based, without much scope for doing crazy things with it. It’s about the nuts and bolts of guitars and the players that use them. But that being said, each episode definitely had its own little flavor and style. Once we locked the show, James took the sequence, got hold of the rushes, and then got to work on the grade and the sound.

What Kylie’s pulled off on her own is no small feat. She’s a great producer, knows her stuff, and really does the research. She’s so passionate about the music and the people that she’s interviewing and that really comes across. The Steve Vai episode was awesome. He’s very holistic. These people dictate the narrative and tell you where the edit is going to go. Mick Mars was also really good fun. That was the trickiest show to do, because the A and B-side camera set-up wasn’t quite working for us. We had to really get clever in the edit.

DaVinci Resolve gets a lot of press when it’s used for finishing and color grading. But on this project it was used for the offline cut as well. TV series post is usually dominated by Avid Media Composer. Why do you feel that Resolve is a good choice for offline editing?

Tonkin: I’ve been a longtime advocate of working inside of Resolve, not just from a grading perspective, but editorial. As soon as the Edit page started to offer me the feature set that we needed, it became a no-brainer that we should do all of our offline in Resolve, whenever possible. On a show like this, I’ve got about six hours of online time and I want to spend the majority being as creative as I can. So, focusing on color correction, looking at anything I need to stabilize, resize, any tracking, any kind of corrective work – rather than spending two or three hours conforming from one timeline into another.

The offline on this series was done in DaVinci Resolve, with the exception of the first episode, which was cut in Final Cut Pro X. I’m trying to leave editors open to the choice of the application they like to use. My gentleman’s agreement with Matt [Cronin], who cut the first pilot, was that he could cut it in whatever he liked, as long as he gave me back a .drp (DaVinci Resolve project) file. He loves Final Cut Pro X, because that’s what he’s quickest at. But he also knows the pain that conforms can be. So he handled that on his side and just gave me back a .drp file. It was quick and easy.

From episode three onwards, I was delighted to learn that Leigh also uses Resolve as his primary workflow. Everything just transfers and translates really quickly. Knowing that we had six more episodes to work through together, I suggested things that would help us a lot, both for my picture work and for audio, which was also being done here in our studio. We’re generating the 5.1 mix.

Brooks: I come from an Avid background. I was an engineer initially before ever starting to edit. When I started editing, I moved from Avid to Final Cut Pro 7 and then back to Avid, after which I made the push to go to Resolve. It’s a joy to edit on and does so many things really well. It’s become my absolute workhorse. Avid is fine in a multi-user operation, but now that doesn’t really matter. Resolve does it so well with the cloud management. I even bought the two editor keyboards. The jog wheel is fantastic! The scrolling on that is amazing.

You mentioned cloud. Was any of that a factor in the post on “Life in Six Strings?”

Tonkin: Initially, when Leigh was reversioning the first two episodes for AXS TV, we were using his Blackmagic Cloud account. But for the rest of the episodes we were just exchanging files. Rushes either came to me or would go straight to Leigh. He makes his offline cut and then the files come to me for finishing, so it was a linear progression.

However, I worked on a pilot for another project where every version was effectively a finished online version. And so we used [the Blackmagic] Cloud for that all the way through. The editor worked offline with proxies in Resolve. We worked from the same cloud project and every time he had finished, I would log in and switch the files from proxy to camera originals with a single click. That was literally all we had to do in terms of an offline-to-online workflow.

Brooks: I’m working on delivering a feature-length documentary for [the band] Nickelback that’s coming out in cinemas later in March. I directed it, cut it in Avid, and then finished in Resolve. My grader is in Portsmouth and I can sit here and watch that grade being done live, thanks to the cloud management. It definitely still has a few snags, but they’re on it. I can phone up Blackmagic and get a voice – an actual person to talk to who really wants to fix my problem.

Were there any particular tools in Resolve that benefitted these shows?

Tonkin: The Blackmagic team are really good at introducing new tools to Resolve all the time. I’ve used trackers for years and that’s one of my favorite things about Resolve. Their AI-based subtitling is invaluable. These are around 20-minute episodes and 99% of it is talking. Without that tool, we would have to do a lot of extra work.

Resolve is also good for the more complex things. For example, a driving sequence in the bright California sun that wasn’t shot as camera raw. The only way I could get around the blown-out sky was with corrections applied specifically to the sky portion of the shot. Obviously, you want to track the subject so that the sky correction doesn’t spill over his head. All of those types of tools are just built in. When I’ve effectively got six hours to work on an episode and I might have about 20 shots like that, then having these tools inside the application is invaluable from a finishing perspective.

You’ve both worked with a variety of other nonlinear editing applications. How do you see the industry changing?

Tonkin: Being in post for a couple of decades now and using Final Cut Studio, Final Cut Pro X, and a bit of Premiere Pro throughout the years, I find that the transition from offline to online starts to blur more and more these days. Clients watching their first pass want to get a good sense of what it should look like with a lot of finishing elements in place already. So, you’re effectively doing these finishing things right at the beginning.

It’s really advantageous when you’re doing both in Resolve. When you offline in a different NLE, not all of that data is transferred or correctly converted between applications. By both of us working in Resolve, even simple things you wouldn’t think of, like timeline markers, come through. Maybe he’s had some clips that need extra work. He can leave a marker for me and that will translate through. You can fudge your way through one episode using different systems, but if you’re going to do at least six or eight of them – and we’re hopefully looking at a season two this year – then you want to really establish your workflow upfront just to make things more straightforward.

Brooks: Editing has changed so much over the years. When I became an engineer, it was linear and nonlinear, right? I was working on “The World Is Not Enough” – the James Bond film – around 1998. One side of the room was conventional – Steenbecks, bins, numbering machines. The other side was Avid. We were viewing 2K rushes on film, because that’s what you can see on the screen. On Avid it was AVR 77. It’s really interesting to see it come full circle. Now with Resolve, you’re seeing what you need to see rather than something that’s subpar.

I’d say there are a lot of editors who are ‘Resolve curious.’ If you’re in Premiere Pro you’re not moving, because you’re too tied into the way Adobe’s apps work. If you know Premiere, you know After Effects and are not going to move to Resolve and relearn Fusion. I think more people would move from Avid to Resolve, because simple things in Resolve are very complicated in Avid – the effects tab, the 3D warp, and so on.

Editors often have quite strange egos. I find the incessant arguing between platforms is just insane. It’s this playground kind of argument about bloody software! [laugh] After all, these tools are all there to tell stories. There are a lot of enthusiastic people on YouTube that make really good videos about Resolve, as well. It’s nice to be in that ecosystem. I’d implore any Avid editor to look at Resolve. If you’re frustrated and you want to try something else, then this might open your eyes to a new way of working. And the Title Tool works! [smile]

This article was originally posted at postPerspective.

©2024 Oliver Peters

Editors’ Thoughts on Documentaries

The growth of modern streaming platforms and alternative media has ushered in a golden age for documentary films, series, and short-form projects. Editors working on scripted versus unscripted projects find similarities, but also major differences between these two film genres.

I surveyed a small, diverse cross-section of documentary film editors for their thoughts on the genre. These editors cover a range of experience working with award-winning directors. This editors’ roundtable includes Steven Hathaway (The Pigeon Tunnel, American Dharma), Neil Meiklejohn (Wild Wild Country, Biggie: I Got a Story to Tell), Walter Murch (Coup 53, Particle Fever), Kayla Sklar (Rams, Rise Again: Tulsa and the Red Summer), and Will Znidaric (Five Came Back, Freedom on Fire: Ukraine’s Fight for Freedom).

What is the difference between editing a documentary film or series and dramatic/fictional films?

Walter Murch: The real question: Is there a script before the start of production? If there is a script, it doesn’t matter if it is fiction or doc, the editor’s job is fairly clear: to interpret the script in cinematic terms given the material. If unscripted, then the editor’s job is different: to help build/discover the storyline and structure.

In unscripted, there is an abundance of events, but a paucity of interpretation. In scripted films it is the opposite: an abundance of interpretation, but a paucity of events. What do I mean by that?

In a scripted film, the editor may have 44 different readings of one line of dialogue. Four printed takes from eleven different camera angles and lenses. The decision becomes: which line reading and camera angle is best for this moment in the film. As for the events: there are only the events (the scenes) that are in the script. This choice is restricted relative to the profusion of interpretation of line readings.

In an unscripted film, the editor has only one line reading and one camera angle for each moment in the film. But, there will be many events (potential scenes) shot for the film, which may or may not be included in the final structure. It is the editor’s job to help decide which are the best scenes and in what order to put them, and then to make the best use of and framing for each single moment. Particle Fever and Coup 53 were both unscripted documentaries, and had, respectively, 400 and 530 hours of material to choose from.

You can compare scripted and unscripted to two types of science: Copernican and Darwinian. Copernicus had an idea (a sun-centered universe) and mined existing data to prove it. He did not make more than a few token astronomical observations himself. He had a pre-conceived idea to begin with and put that idea to the test. This is like having a script (a pre-conceived idea) and then putting it to the test of actually shooting it. Will the thing hang together? Will the weather co-operate? Will the actors deliver the goods? Etc. Etc.

Darwin on the other hand spent five years on an around-the-world voyage of discovery, making observations and collecting specimens, and then returned to home base and sifted through his “dailies” (so to speak) to find the underlying and animating story (evolution through natural selection). The idea emerged out of the evidence. This is like an unscripted documentary, where you shoot a lot of material and hope that a story will emerge from a careful selection of everything you have gathered.

Will Znidaric: When you start the process of editing a documentary, there is no pre-written script. It’s much more free form, and the job of a doc editor is one of discovery. The process is unique, a wholly different type of editing. Whether it is a cinéma vérité style of production, or one that leans on interview, or archive, or some combination of the above – you are going through all of it with an open mind. You are pulling material that not only helps to achieve the pre-stated concept or thesis, but also material that expands upon, and in some cases, even changes the thesis along the way. In that sense, the thing you are working to help create is almost alive. It breathes and evolves and grows and begins to even communicate with you on its own frequency more and more as you get closer and closer to it.

Kayla Sklar: Based on my experience (and this is definitely a generalization), a narrative editor shapes how the story is told, but a doc editor shapes what story is told. Most documentaries aren’t scripted, unless you’re working on something like a Ken Burns/PBS biography project. The director may have a thesis, but what story is the footage actually telling? It’s the doc editor’s job to help figure that out and to problem-solve when those don’t align.

Neil Meiklejohn: In documentary films and series, the editor is constantly writing and rewriting “the script” of the film. Most documentaries don’t have a clear story going in, so you are constantly trying to sort out the best story from the material you have. The words people say tend to be the driving force of that narrative. Careful arrangement of these sound bites gives you the strongest story. So, as an editor you’re essentially coming up with what the story is, along with the visuals being seen and sounds being heard, to establish the tone of the film as a whole and throughout individual scenes.

Narrative is fun, because the script is written for you and most of the storytelling is sorted out in pre-production. You can really just focus on the tone of the piece sonically and visually.  In documentaries, especially “talking head” docs, the writing is not done until post-production.  Each day in the edit is different for an editor. Sometimes you are more creative with sound design or working on the visuals and some days are hard in terms of building out the film and “writing” new scenes. There are many moments where you think the story can take different directions and sometimes you have to try a few different options.

Many times on a documentary I suffer from “writer’s block,” where I need to jump into a new scene or rework an old one to push on. So, docs require a lot of “research and development.” In narrative, you may scrap a scene or two from the finished film, but for the most part you will see most of the script in the final film. In documentaries, you will spend weeks on scenes and multiple chunks of the film that will never see the light of day. Therefore, documentaries can take significantly more time, because of this trial-and-error period.

Even after the first rough assembly of, say, a true crime doc, you may realize that you need to work much more on the intrigue and mystery of an investigation, or clarification of things that are too vague. You are constantly adjusting how much and how little information (words or sound bites) the audience receives.

Different types of documentaries require different “writing tools.” For example, I have worked on a few documentaries that were very comedic. Even if your “talking head” subject is very funny, as an editor you spend a lot of time working on the timing and the wording of a gag or joke. In documentaries, it is significantly more difficult to edit something funny than in a narrative piece. In narrative, the writer, the script, and the actor have sorted it out for you before the camera even rolls.

On many bio-documentaries, people’s lives are not necessarily lived in a three-act structure. Many times you have to sort out how to create a script of this person’s life, or a moment in time, that will be entertaining and understandable to an audience. It is no easy task. However, this is also what makes documentary editing fun. You really have an opportunity to tell a story how you as the editor see it and have the freedom to try lots of different things. The world is your oyster.

Steven Hathaway: The idea is the same: how to craft the best narrative out of the source material. For me a documentary starts with an interview. That interview is cut down to an essential state. You leave what is needed to tell the story. And maybe a few punctuating details. The same can be said for a scripted scene. You want to enter late and leave early. Build suspense and mystery, but also leave enough to follow the story. It is the basic editorial question: when is enough not too much?

In drama, there is more coverage, but less overall material. It is more about finding the right takes. In documentary, one is usually cutting a 90-minute movie from maybe 100 hours of footage. So it’s less about finding the best version and more about cutting it down while keeping the best moments.

Many documentary editors view what they do as a form of writing. Should they be credited as co-writers?

Walter Murch: If the doc is an unscripted one, then they should be credited as co-authors of the film. I think the editor should request that credit. This is what I did on Coup 53, and was granted that co-writing credit. I did not request that for Particle Fever, but perhaps I should have. I did proportionately the same amount of “writing” on Particle Fever as I did on Coup 53. In the end, no one received a writing credit for Particle Fever.

Will Znidaric: I used to believe very strongly that a doc editor should also be credited as a writer; but now I take a more nuanced perspective. The art of documentary editing is truly its own art form – very much like writing in so many ways. But, it’s also akin to sculpture, design, architecture, and even music composition. As anyone in the editing community knows – doc or narrative – what we do is considered an “invisible art,” which makes it challenging for people to appreciate its own unique subtleties.

Yet, if a young doc editor asks me for points of inspiration to better hone their craft, I point to the same types of material that any screenwriter would go to: “The Power Of Myth” series about Joseph Campbell, books about storytelling, books about screenwriting. Ultimately, the material you are sculpting as a doc editor is all in service of telling a compelling and engaging story.

Kayla Sklar: If no one else (director, story producer, etc) is getting credit for writing on a doc, then I don’t think it’s a necessary additional line. It’s within the scope of our duties. But I do think doc editing is definitely a form of writing. If someone can get a writing credit for punching up a script (as they should), then that’s similar to what a doc editor does to shape the director’s vision. A co-writing credit would be fair if there are other people on the doc getting a writing credit already.

Neil Meiklejohn: Most documentaries I have worked on have not had a writer, although a few have. Usually the writer credit is given to someone that is either writing the voice-over or sitting in with the editor and honing the story. I think each film or series should be taken on a case-by-case basis, as there are different ways of creating a documentary. But, I do believe if a writer credit is given to someone, filmmakers should also not ignore the contributions of the editor.

Steven Hathaway: I do not think of what I do in Avid as writing. I think it is closer to Tetris. If editors are writing the voice-over for a documentary and not getting credited, that is crazy. But I think story structure is part of the editor’s role whether it is scripted or documentary.

How have modern media platforms (Netflix, Apple TV+, Amazon Prime, YouTube, etc) impacted documentary films and series?

Walter Murch: As Chairman Mao said when asked about the impact of the French Revolution: “It is too soon to tell.”

Will Znidaric: On the whole, the growth of streaming services and platforms has fueled a complete boom time for the doc medium. It would not be hyperbole to say the last decade has been nothing short of transformative, as far as the doc medium is concerned.

There are many factors that come into play, including the accessibility and quality of production and post-production equipment. But it’s also because there are now more entities that are hungry for doc projects. That means more documentaries get funding and find distribution than would have been imagined even 20 years ago. With that naturally comes a wider spectrum of material that gets made – from more commercially viable, higher budget projects to smaller, more idiosyncratic films/series. Ideally, there is room for an ecosystem that flourishes with a diversity of artistic voices. I am optimistic that can only grow in the future.

Neil Meiklejohn: For the most part, modern media has been wonderful for documentaries.  Docs are no longer this boring genre of film. I think now we are seeing some of the most entertaining documentaries ever. People talk about docs around the water cooler. Documentary films and series are actually cool and popular, which is great, because there are so many great projects to work on as an editor.

On the other hand, I do think their popularity has also impacted streamers and studios, because now there is a sense of urgency to get the doc films out the door and into people’s homes. The schedules are now crazy fast. In the past, filmmakers would spend years making a documentary. These days films are supposed to be produced in months. These expedited schedules can hurt the creative and not allow you to make the best film possible.

Steven Hathaway: For me, it’s been very good. They enable me to be a part of telling more stories and reaching wider audiences.

Kayla Sklar: I think that there’s never been a better time to work in docs! So many doc series have entered the zeitgeist in the past ten years and especially since the early days of COVID-19. Who would have thought that a series about a wild cat rescuer or a cult leader would be weekly appointment television? And yes, maybe those aren’t as “serious” as the doc genre usually tries to be. But, I think it’s good for everyone when the mainstream public is open to the idea of non-fiction programming as a worthwhile way to spend an evening.

This roundtable discussion was originally published by postPerspective.

©2024 Oliver Peters

Dos and Don’ts of Low-Budget Filmmaking

Over the course of my career, I’ve worked on dozens of theatrical feature films, documentaries, TV series and pilots, made-for-TV movies, and shows made only for streaming. I’ve worked as the editor, colorist, online/finishing editor, or post supervisor. As it’s become easier to make films on limited budgets, it’s also easier than ever for the clueless to jump into the game. It doesn’t have to be that way. If you take a little care, solid and successful workflows can be devised to produce respectable results. Here are a few suggestions.

Develop a manageable game plan

Time is literally money in filmmaking. That starts with the script. The longer the script, the higher the production costs. On average, scripts run around a minute per page, but there are variables.

A script line like “a fight ensued” could take a day or more to shoot, require a lot of extras, and consume several minutes of screen time. A sci-fi script could involve complex sets, locations, props, and both practical and digital effects. Compare that to a grounded drama set in a small town, which is a simpler and cheaper script to film. A 100-page script for a basic story will take between 15 and 30 days to shoot. An experienced producer will budget two to five additional days in reserve to cover for reshoots, pick-up shots, and weather days.

To know whether those 100 pages translate into a 100-minute movie or something much longer, it’s important to time the script. Read it through multiple times and mime the action. Time each scene as you go through it. Do the same for table reads. This will give you a realistic idea of how long the finished film will actually run. Ultimately, you want something that’s 10-30% longer than your target. This gives you room to tighten, trim, and rearrange the film during the editorial phase.
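If it helps to sanity-check the arithmetic, the calculation is simple enough to script. The sketch below is a minimal, hypothetical example (the scene names, timings, and 95-minute target are all made up) that totals per-scene timings from a timed read-through and checks the result against a target runtime using the 10-30% headroom described above.

```python
# Hypothetical scene timings (in minutes) from a timed read-through or table read.
scene_timings = {
    "Sc. 1 - Diner, day": 2.5,
    "Sc. 2 - Parking lot, night": 1.0,
    "Sc. 3 - Motel room": 4.0,
    # ...one entry per scene in the script
}

target_runtime = 95.0  # example target length of the finished film, in minutes

timed_length = sum(scene_timings.values())
low, high = target_runtime * 1.10, target_runtime * 1.30

print(f"Timed script: {timed_length:.1f} min (target: {target_runtime:.0f} min)")
if timed_length < low:
    print("Under 110% of the target - little room to tighten in the edit.")
elif timed_length > high:
    print("Over 130% of the target - expect heavy cuts or extra shooting days.")
else:
    print("Within the 10-30% headroom over the target runtime.")
```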

Your Team

Gather the full team of stakeholders early. This should include all keys, such as the director of photography, editor(s), colorist, lead audio mixer, and VFX supervisor. Do not wait until everything has been shot before you decide on the players that will finish the film.

Technical specs

Make sure that all of the stakeholders are on the same page – technically speaking. This means deciding on the common frame rate (24.0, 23.98, 25.0, 30.0, 29.97, other?), final delivery size/aspect ratio (4K UHD, 4K DCI, 2K DCI, HD, 2.39:1, 1.85:1, 16×9, other?), audio delivery (stereo, 5.1, ATMOS, other?), and so on. Everyone should be well aware of these requirements before filming begins.
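One low-tech way to do that is to circulate the agreed specs as a single shared reference that every department checks against. Here is a minimal sketch of what such a spec sheet might capture; every value is an assumed example, not a recommendation.

```python
# Hypothetical project spec sheet agreed on before principal photography.
# Every value below is an example, not a recommendation.
PROJECT_SPECS = {
    "frame_rate": "23.976",                  # common project frame rate
    "delivery_resolution": "3840x2160",      # 4K UHD master
    "delivery_aspect_ratio": "2.39:1",
    "audio_deliverables": ["5.1", "stereo"],
    "acquisition": "camera raw or LOG, per the DP",
}
```

The format matters far less than having one agreed document that editorial, color, and audio can all refer back to.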

Pipeline for nonstandard media

If you have material that deviates from the above specs, like archival footage, decide in advance how that footage is to be integrated into the final product. Determine a workflow, such as whether or not to convert it prior to being introduced into the editorial pipeline.

Exposure philosophy

Principal photography should be limited as much as possible to one camera type. If this is a two-camera shoot, cameras should match and a matched set of lenses should be used. The DP should be conscientious about making sure all settings are correct and consistent. Expose “to the right” – meaning, shots should NOT be underexposed. It’s easier to make shots darker than to bring them out of the mud. RAW and LOG recordings are not a replacement for correct exposure, lighting, and color temperatures at the time of filming a scene. Certain RAW formats perform better than others, so don’t expect a RAW recording to save your under or overexposed shot. The colorist should be able to take a LOG or RAW recording as is, apply a LUT or mild grade, and be 90% of the way towards a finished look.

Camera selection

When selecting cameras, make sure you are using professional gear. This means cameras that record in professional formats (like ProRes, not H.264/265) and that record with proper timecode and unique file names. If you have no other option than to use a camera that generates generic clips without proper timecode (I’m looking at you iPhone, GoPro, DJI), then the footage must be transcoded to intermediate files prior to the edit. This may also include frame-rate correction to adjust the footage to your common project frame rate. Once done, the transcoded files become the “original” media to be used for edits and final color correction.
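As a rough illustration of that transcode step, here is a hedged sketch that drives ffmpeg from Python. The folder names, naming scheme, frame rate, and starting timecode are all assumptions made for the example; the point is simply to conform consumer-camera clips to the project frame rate, give each one a unique file name, and stamp a timecode into a ProRes intermediate.

```python
import subprocess
from pathlib import Path

SOURCE_DIR = Path("card_offload/iphone_day01")   # hypothetical folder of consumer-camera clips
DEST_DIR = Path("transcodes/day01")
DEST_DIR.mkdir(parents=True, exist_ok=True)

for i, clip in enumerate(sorted(SOURCE_DIR.glob("*.mp4")), start=1):
    out = DEST_DIR / f"D01_IPH_{i:04d}.mov"      # unique, shoot-day-based file name
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-r", "24000/1001",                      # conform to an assumed 23.976 project frame rate
        "-c:v", "prores_ks", "-profile:v", "3",  # ProRes 422 HQ intermediate
        "-c:a", "pcm_s16le",                     # uncompressed audio
        "-timecode", "01:00:00:00",              # example start timecode; in practice vary it per clip or card
        str(out),
    ], check=True)
```

Once transcoded, these ProRes files – not the phone or GoPro originals – are what you treat as the camera masters for editing and final color correction.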

Edit system selection

You can edit with any NLE application you prefer. In my opinion, Apple Final Cut Pro is the best tool for many indie filmmakers who are not experienced editors. It’s easy to use, discoverable (deeper features are revealed as you dig farther into the application), and it’s easy to review media in a manner that will look close to the final product. More experienced users may also opt for Avid Media Composer, Adobe Premiere Pro, or Blackmagic Design DaVinci Resolve. Avoid reframing and changing speeds as much as possible. This shouldn’t be necessary if principal photography was done properly. You aren’t David Fincher, so don’t plan on this level of post. Instead, take the Clint Eastwood approach to filmmaking.

Understanding the finishing steps

For color correction, your edit in Media Composer, Premiere Pro, or Final Cut Pro will be sent to an application like DaVinci Resolve for color correction/grading. In many cases it will be “round-tripped” back into your NLE for final assembly. Resizing and speed changes may or may not be handled properly in this process. That’s because every application computes frame rate and speed, spatial conform, and scale and position information somewhat differently. The less of this that you’ve done, the better – especially on a tight budget.

Audio

Be sure to handle audio properly, especially if it was recorded as double-system sound. Every NLE application has a method to sync location audio with picture. The editor and lead mixer should consult with each other as to how to properly handle the audio workflow, since it will vary with the tools being used.

Deliverables

Determine who will handle your deliverables and how. Different distributors have differing requirements. Meeting those will add to your budget unless you’ve planned for them in advance. For example, the most recent films I’ve handled required 4K and HD master files with both 5.1 and stereo sound tracks. In addition, various textless versions and separate audio stems were also required.

A film is never finished – merely abandoned

Many creatives are never completely happy with the results of their efforts. A director will always look for ways to make the film better. Given the time, they will often tweak, edit, and rearrange elements of the film until the very last moment. Resist the urge to do that. Be disciplined with your time. Evaluate, adjust, commit, and move on. Remember that perfection is the enemy of good.

©2023 Oliver Peters

A Conversation with Walter Murch – Part 4

Roads not travelled.

No matter how long the career or the number of awards, any editor might consider the films that passed them by and wonder what they might have done with them. That’s where we conclude this discussion.

______________________

Walter, in your career, are there any films that you didn’t edit, but wish you had?

I worked on K-19: The Widowmaker for Kathryn Bigelow. And in 2007 she sent me the script for The Hurt Locker, about the Iraq war. It made me realize that the last four films I had edited had been films about war, the latest one being Jarhead. I told her that I just wanted to take a break from editing war films. Of course, The Hurt Locker went on to win six Oscars – Best Picture, Best Original Screenplay, Best Director, Best Film Editing, Best Sound Editing, and Best Sound Mixing.

What would have happened if I had said yes to that? But, you also get into practical things. At the time of making that decision, I’d been away from home for a year editing Jarhead in Los Angeles and New York. This would have meant going into the Middle East or at least going to Los Angeles. But, the main thing was just that I’d been thinking about war since 2000: Apocalypse Now Redux, war – K-19, war – Cold Mountain, war – Jarhead, war. Even The English Patient is kind of a war film. So turning down The Hurt Locker is the big What If? that comes to mind.

I know you are an Orson Welles buff, so I was actually thinking it might have been The Other Side of the Wind, which was finally completed in 2018.

I did the recut of Touch of Evil. At that time, 1998, I was taken to the vaults in Los Angeles, where the material for The Other Side of the Wind was in storage. Gary Graver showed me some of the rough assemblies that had been put together by Welles himself, but I just didn’t want to work on that one. This looked to me very self-indulgent. The situation with Touch of Evil was very, very different. 

The finished version of Wind seems like it’s an art film within the film and appears to be somewhat autobiographical. Although, I believe Welles denied that the director character (John Huston) was modeled after his own career.

Right. Touch of Evil was obviously a scripted film that was produced by Universal Studios in Hollywood – albeit, in a Wellesian manner – but it was very buttoned down, relatively speaking. And then Welles left us the 58-page memo, which made very specific suggestions for how the film should be recut. It was clear what he wanted done. I mean, he didn’t talk about frames – he would just say: this section needs to be shorter. Or: restore the script structure for the first four reels, cross-cutting between Janet Leigh’s story and Charlton Heston’s story. The studio had put all the Heston story together and then all the Leigh story together. He wanted his original structure back. I don’t believe there was a guiding memo like that for The Other Side of the Wind.

Welles’ Touch of Evil memo is a wonderful document. It’s 58 pages by a genius filmmaker under duress, writing about his ideas and addressed to his enemies at the studio. It’s a masterclass in political diplomacy of trying to get his ideas across without accusation. It’s sad that he had to write it, but I’m happy we have it.

Thank you.

______________________

Walter Murch has led and continues to lead an interesting and eclectic filmmaking career. If you’ve enjoyed this 4-part series, there’s plenty more to be found in the books written by and about him. There are also many of his interviews and presentations available on the web.

SIGHT & SOUND: The Cinema of Walter Murch is a documentary created by Jon Lefkovitz. This video is assembled from various interviews and presentations by Murch discussing his take on filmmaking and editing. It’s illustrated with many film examples to highlight the concepts.

Web of Stories – Life Stories of Remarkable People includes an 18-hour series of interviews with Walter Murch recorded in London in 2016. These are broken down into 320 short clips for easier viewing.

A Conversation with Walter Murch – Part 1

A Conversation with Walter Murch – Part 2

A Conversation with Walter Murch – Part 3

©2023 Oliver Peters

A Conversation with Walter Murch – Part 3

Sound design and the film mixing process

Walter Murch is not only known for his work and awards in the realm of picture editing, but he’s also made significant contributions to the art of film sound. That’s where we pick up in Part 3.

______________________

Walter, let’s switch gears and talk about sound and your work as a sound designer and re-recording mixer. There’s an origin story about the term ‘sound designer’ credited to union issues. That might be a good place to start.

Francis [Ford Coppola] tells the story, but he gets it wrong. [laugh] There were union problems, because I was in the San Francisco union. Many of the films were financially based in LA, so what am I doing working on that? On The Rain People, for instance, my credit is sound montage. I wasn’t called sound mixer, re-recording mixer, or sound editor. We were just trying to avoid blatantly stepping on toes. I had the same sound montage credit on The Conversation. Then on Apocalypse Now I was credited with sound re-recording, because it was an independent film. Francis basically was the financier of it. So, it was an independent film, partially supported by United Artists.

The sound design idea came up because of this new format, which we now call 5.1. Apocalypse Now was the first time I had ever worked in that format. It was the first big film that really used it in a creative way. The Superman film in 1978 had it technically, but I don’t think they used it much in a creative fashion.

As I recall, at that time there was the four channel surround format that was left, center, right, and a mono rear channel.

Star Wars, which came out in 1977, had that format with the ‘baby boom’ thing. 70mm prior to that would have five speakers behind the screen – left, left-center, center, right-center, and right. What they decided to do on Star Wars was to not have the left-center/right-center speakers. Those were only used for super low frequency enhancement. Then the surround was many speakers, but all of them wired to only one channel of information.

Walter Murch mixing Apocalypse Now

We didn’t use those intermediate speakers at all and brought in Meyer ‘super boom’ speakers. These went down to 20Hz, much lower than the Altec speakers in those days, which I think bottomed out at around 50Hz. And then we split the mono surround speakers into two channels – left and right. Basically what we call 5.1 today. We didn’t call it 5.1, we just called it six-track or a split-surround format.

This was a new format, but how do you use it creatively? I couldn’t copy other films that had done it, because there weren’t any. So I designed an entire workflow system for the whole film. Where would we really use 5.1 and where would we not? That was a key thing – not to fall into the trap that usually happens with new technology, which is to overuse it. Where do we really want to have 5.1? In between that, let’s just use stereo and even some long sections where it’s just mono – when Willard is looking at the Kurtz dossier, for instance.

In Willard’s narration sections, when he’s in the fo’c’sle of the boat at night reading the dossier, it’s just mono – just his voice and that’s it. As things open up, approaching Hau Phat (the Playboy Bunny concert) it becomes stereo, and then as it really opens up, with the show beginning, it becomes full six-track, 5.1. Then it collapses back down to mono again the next morning. That’s the design element. So I thought, that’s the unique job that I did on the film – sound design – designing where we would fully use this new format and how we would use it.

Mark Berger, Francis Ford Coppola, and Walter Murch mixing The Godfather Part II

In addition, of course, I’d cut some of the sound effects, but not anywhere near most of them, because we had a small army of sound effects editors working under Sound Effects Supervisor Richard Cirincione. However, I was the main person responsible. Francis at one point had a meeting and he said, “Any questions about sound? Walter’s the guy. Don’t ask me, ask Walter.” So I effectively became the director of sound. And then, of course, I was the lead re-recording mixer on the film.

Some readers might not be familiar with how film mixing works and why there are teams. Please go into that more.

Previously, in Hollywood, there would usually be three people – D, M, and E – sitting at the board. That is how The Godfather was mixed. If you are facing the screen behind the console, then D [dialogue mixer] is on the left, M [music mixer] in the middle, and E [sound effects mixer] on the right. On the other hand, in San Francisco I had been the solo re-recording mixer on Rain People, THX-1138, and The Conversation.

Walter Murch handling the music mix, Particle Fever

As soon as you have automation you don’t need as many people, because the automation provides extra fingers. We had very basic automation on Apocalypse Now. Only the faders were automated, but none of the equalization, sends, echo, reverb, or anything else. So we had to keep lots of notes about settings. The automation did at least control the levels of each of the faders.

Of course, these days a single person can mix large projects completely ‘in the box’ using mainly a DAW. I would imagine mixing for music and mixing for film and television is going to use many of the same tools.

The big difference is that in the old days – and I’m thinking of The Godfather – we had very limited ability with the edited soundtracks to hear them together before we got to the mix. You had no way to set their levels relative to each other until you got to the mix. So the mix was really the creation of this from the ground up.

Supervising the mix, Coup 53

Thinking of the way I work now, or the way Skip Lievsay works with the Coen brothers, he will create the sound for a section in Pro Tools and build it up. Then he’ll send the brothers a five-track or a three-track and they just bring it into the audio tracks of the Premiere timeline. So they’re editing the film with his full soundtrack. There are no surprises in the final mix. You don’t have to create anything. The final mix is when you hear it all together and put ‘holy water’ on it and say, that’s it – or not. Now that you’ve slept on it overnight, let’s reduce the bells of the cows by 3dB. You make little changes, but it’s not this full-on assault of everything. As I said earlier, bareback – where it’s just taking the raw elements and putting them together for the first time in the mix. The final mix now is largely a certification of things that you have already been very familiar with for some time.

______________________

Click here for the conclusion of this conversation in Part 4.

A Conversation with Walter Murch – Part 1

A Conversation with Walter Murch – Part 2

©2023 Oliver Peters