The Oscar. Now what?

Everything Everywhere All at Once dominated the Academy Awards night, including a win for Paul Rogers in the Best Film Editing category. The team used Adobe Premiere Pro as their NLE of choice, which by extension makes this the first editing Oscar win for Premiere. Of course, it’s the team and editor that won the award, not the software they used. Top editors could cut with any application and get the same result.

The Academy Awards started as a small celebratory dinner for insiders to recognize each other’s achievements in film. Over the decades this has become a major cultural event. Winning or even being nominated is a huge feather in the cap for any film. This can be heavily leveraged by the marketing teams of not only the film distributors and talent agents, but also the various products used in the process – be that cameras or software.

Avid’s dominance

When it comes to editing, Avid has been the 800-pound gorilla in the modern digital era. Ever since Walter Murch won for editing The English Patient using Media Composer, the specific NLE on an Oscar-winning film has become a hot topic among editors. This was never the case when the only options were Moviola, KEM, or Steenbeck.

Even this year nine out of the ten nominees for the Oscar for Best Picture and four out of the five nominees for Best Film Editing used Media Composer. Yet, Avid’s dominance in the winner’s circle has seen some occasional cracks from competitors, like Apple’s Final Cut Pro (legacy version) and Lightworks. Nevertheless, Media Composer is still a safe bet. And let’s not forget sound, where Pro Tools has even less competition from other DAWs among film and TV sound editors and mixers. All of the nominees for the Oscar for Best Sound at this year’s Academy Awards used Pro Tools.

There are, of course, many awards competitions around the world, including the ACE Eddie Awards, BAFTA, the Golden Globes, and various film festivals. Many of these don’t give out specific craft awards for editors or editing; however, a lot of these winning films have been edited with other tools. For example, many award-worthy indie films, especially documentaries, have been edited with Premiere Pro. Even Final Cut Pro (the current “X” version) has had wins in such categories, including the short films The Silent Child and Skin at the 2018 and 2019 Academy Awards.

Stacking up the NLE competitors

The truth of the matter is that today, there are seven viable applications that might be used to cut a professional feature film or documentary: Media Composer, Final Cut Pro, Premiere Pro, DaVinci Resolve, Lightworks, Edius X, and Vegas Pro. You could probably also factor in others, such as Final Cut Pro 7 (now zombie-ware) and Media 100 (yes, still alive), not to mention consumer-oriented NLEs like iMovie or Movie Maker. Realistically, most experienced film editors are likely to use only one of the first five on the list.

Of those five, Blackmagic Design’s DaVinci Resolve is the app that most editors have their eyes on. Aside from its widespread use in color correction, Resolve is also a perfectly capable editing application. Although it has yet to pull off an Oscar win for editing, Resolve has been widely used in many aspects of the production and post workflow of top films. Owing to its nature as a “Swiss Army Knife” application, Resolve fits into various on-set, editing, and visual effects niches. It’s only a matter of time before Resolve gets an Oscar win for editing. But other Blackmagic Design products also shouldn’t be overlooked. In the 2023 Academy Awards, more than 20 films across the technical, documentary, short film, international feature film, and animated categories used some Blackmagic Design product.


When an application is used on an award-winning film, I’d bet that the manufacturer’s marketing department is doing high-fives. But does this really move the sales needle? Maybe. It’s all aspirational marketing. They want you to feel that if you use the same software as an Oscar-winning film editor used, then you, too, could be in that league. Talent is always the key factor, but we can all dream. Right? That’s what marketing plays upon, but it also impacts the development of the application itself.

Both Avid and Adobe have been fine-tuning their tools with professional users in mind for years. They’ve added features based on the needs of a small, but influential (or at least vocal) market sector. This results in applications that tick most of the professional boxes, but which are also harder to learn and eventually master.

That’s a route Apple also chose to pursue with Final Cut Pro 1 through 7. Despite a heralded introduction with Cold Mountain in 2003, it took until 2010 before Angus Wall and Kirk Baxter nailed down an Oscar with The Social Network. They then reprised that in 2011 with a win for The Girl with the Dragon Tattoo. Even as late as 2020, the discontinued FCP 7 was represented by Parasite, winning Best Picture and nominated for Best Film Editing.

Apple and Final Cut Pro’s trajectory unexpectedly changed course with the introduction of Final Cut Pro X. This shift coincided with the growth of social media and a new market of many non-traditional video editors. Final Cut Pro in its current iteration is the ideal application for this market and has experienced a huge growth in users. But, it still gets labelled as being not ready for professional users, even though a ton of professional content is posted using the app. Apple took the platform approach – opting to leave out many advanced features and letting third party developers fill in the gaps where needed. This is the core of much of the criticism.

How advanced/complex does a professional NLE really need to be?

In the case of FCP, it’s certainly capable of handling Hollywood-level films, along with a range of high-end, international dramas. Witness the many examples I’ve written about, like Focus, Whiskey Tango Foxtrot, Voice from the Stone, The Banker, Jezebel, and Blood Red Sky. However, a wide range of professional editors would like to see more.

The internal corporate discussion goes like this. Marketing asks, “What do we have to do to get broader adoption among professional film editors?” Engineering answers, “It will take X dollars and X amount of time.” Top management asks, “What’s the return if we do that?” And that’s usually where the cycle stops, until the next year or awards season.

The truth is that the traditional high-end post market is extremely small for a company like Apple. The company is already selling hardware, which is their bread and butter. Will a more advanced version of FCP sell more hardware? Probably not. Avid, Adobe, and Blackmagic Design are already doing that for them. On the other hand, what is more influential for sales in today’s market – Oscar-winning professional editors or a bevy of YouTube influencers touting your product?

I’m not privy to sales numbers, so I have no idea whether or not going after the very small professional post market makes financial sense for either Blackmagic Design or Adobe. In the case of Avid, their dominance pays off through their ecosystem. Avid-based facilities are also likely to run Avid shared storage and Pro Tools audio suites. Hardware sales most likely cover the development costs. Plus, both Avid and Adobe have shifted to subscription models (Adobe fully, Avid as an option), which seems to be working well for both companies.

Blackmagic Design is also a hardware developer and manufacturer. Selling cameras and a wide range of other products enables them to offer DaVinci Resolve for as little as free. You’d be hard-pressed to find a production company that isn’t using one or more Blackmagic products. Only time will tell which company has taken the approach that a) ensures their long-term survival, and b) best benefits professional film editors. In the case of Apple, adding new features to Final Cut Pro would likely generate revenue in an amount that many competitors would envy. Yet, it would be small by Apple’s measurement.

In the end, awards are good for a developer’s marketing buzz, but don’t forget the real team that won the award itself. It’s wonderful for Paul Rogers and Adobe that Everything Everywhere All at Once was tapped for the Oscar for Best Film Editing. It’s an interesting milestone, but when it comes to software, it’s little more than bragging rights. Great to have, but remember, it’s Rogers that earned it, regardless of the tools he used.

©2023 Oliver Peters

What is a Finishing Editor?

To answer that, let’s step back to film. Up until the 1970s, dramatic television shows, feature films, and documentaries were shot and post-produced on film. The film lab would print positive copies (work print) of the raw negative footage. Then a team of film editors and assistants would handle the creative edit of the story by physically cutting and recutting this work print until the edit was approved. The process was often messy, with many film splices, grease pencil marks on the work print to indicate dissolves, and so on.

Once a cut was “locked” (approved by the director and the execs) the edited work print and accompanying notes and logs were turned over to the negative cutter. It was this person’s job to match the edits on the work print by physically cutting and splicing the original camera negative, which up until then was intact. The negative cutter would also insert any optical effects created by an optical house, including titles, transitions, and visual effects.

Measure twice, cut once

Any mistakes made during negative cutting were – and are – irreparable, so it is important that a negative cutter be detail-oriented, precise, and work cleanly. You don’t want excess glue at the splices and you don’t want to pick up any extra dirt and dust on the negative if it can be avoided. If a mistaken cut is made and that splice has to be repaired, then at least one frame is lost at the splice point.

A single frame – 1/24th of a second – is the difference in a fight scene between a punch just about to enter the frame and the arm passing all the way through the frame. So you don’t want a negative cutter who is prone to making mistakes. Paul Hirsch, ACE, points out in his book A Long Time Ago in a Cutting Room Far, Far Away… that there’s an unintentional jump cut in the Death Star explosion scene in the first Star Wars film, thanks to a negative cutting error.

In the last phase of the film post workflow, the cut negative goes to the lab’s color timer (the precursor to today’s colorist), who sets the “timing” information (color, brightness, and densities) used by the film printer. The printer generates an interpositive version of the complete film from the assembled negative. From this interpositive, the lab will generally create an internegative from which release prints are created.
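As a rough illustration of what those “timing” numbers meant, printer lights were commonly expressed in points per color channel, with one point equal to roughly 0.025 log exposure – about 12 points per stop, though conventions varied from lab to lab. A minimal sketch, treating those figures as illustrative rather than definitive:

```python
# Back-of-the-envelope film "timing" arithmetic (illustrative values only).
POINT_LOG_E = 0.025      # log10 exposure change per printer point
STOP_LOG_E = 0.30103     # log10(2): one full stop of exposure

def points_to_stops(points):
    """Convert a printer-light change in points to camera stops."""
    return points * POINT_LOG_E / STOP_LOG_E

# A color timer brightening a shot by 6 printer points on all three lights:
print(round(points_to_stops(6), 2))   # 0.5 (roughly half a stop)
```

The same arithmetic, applied per channel (R, G, B), is how a timer shifted color balance rather than overall brightness.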

From the lab to the linear edit bay

This short synopsis of the film post-production process points to where we started. By the mid-1970s, video post-production technology came onto the scene for anything destined for television broadcast. Material was still shot on film and in some cases creatively edited on film, as well. But the finishing aspect shifted to video. For example, telecine systems were used to transfer and color correct film negative to videotape. The lab’s color timing function was shifted to this stage (before the edit) and was now handled by the telecine operator, who later became known as a colorist.

If work print was generated and edited by a film editor, then it was the video editor’s job to match those edits from the videotapes of the transferred film. Matching was a manual process. A number of enterprising film editors worked out methods to properly compute the offsets, but no computerized edit list was involved. Sometimes a video offline edit session was first performed with low-res copies of the film transfer. Other times producers simply worked from handwritten timecode notes for selected takes. This video editing – often called online editing and performed by an online editor – was the equivalent of the negative cutting stage described earlier. Simpler projects, such as TV commercials, might be edited directly in an online session without any prior film or offline edit.
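To make that offset arithmetic concrete, here’s a minimal sketch in Python. The feet+frames counts and the sync point are hypothetical, and the real-world complication of 3:2 pulldown (transferring 24 fps film to 30 fps video) is ignored for clarity:

```python
# Toy illustration of film-to-tape offset math: map a 35mm edit point
# (feet + frames, 16 frames per foot) to a videotape timecode, given one
# known sync point. Pulldown and drop-frame issues are ignored.

FRAMES_PER_FOOT = 16   # 35mm film
FILM_FPS = 24

def film_to_frames(feet, frames):
    """Total film frames from a feet+frames footage count."""
    return feet * FRAMES_PER_FOOT + frames

def frames_to_timecode(total, fps=FILM_FPS):
    """Render a frame count as HH:MM:SS:FF."""
    ff = total % fps
    ss = (total // fps) % 60
    mm = (total // (fps * 60)) % 60
    hh = total // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# Hypothetical sync point: film 0+00 lands at tape hour one (01:00:00:00).
tape_offset = 1 * 3600 * FILM_FPS   # one hour, expressed in frames

edit_point = film_to_frames(feet=12, frames=8)        # 12 ft 8 fr = 200 frames
print(frames_to_timecode(edit_point + tape_offset))   # 01:00:08:08
```

Editors did this by hand (or with a calculator) for every matched edit, which is why the arrival of computerized edit lists was such a leap.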

Into the digital era

Over time, any creative editing previously done on film for television projects shifted to videotape edit systems and later to digital nonlinear edit systems (NLEs), such as Avid and Lightworks. These editors were referred to as offline editors, and post now followed a bifurcated process known as offline and online editing. This was analogous to film’s work print and negative cutting stages. Likewise, telecine technology evolved to not only perform color correction during the film transfer process, but also afterwards, working from the assembled master videotape as a source. This process, known as tape-to-tape color correction, gave the telecine operator – now colorist – the tools to perform better shot matching, as well as to create special looks in post. With this step the process had come full circle, making the video colorist the true equivalent of the lab’s color timer.

As technology marched on, videotape and linear online edit bays gave way to all-digital, NLE-based facilities. Nevertheless, the separation of roles and processes continued. Around 2000, Avid came in with its Symphony model – originally a separate product and not just a software option. Avid Symphony systems offered a full set of color-correction tools and the ability to work in uncompressed resolutions.

It became quite common for a facility to have multiple offline edit bays using Avid Media Composer units staffed by creative, offline editors working with low-res media. These would be networked to an Avid shared storage solution. In addition, these facilities would also have one or more Avid Symphony units staffed by online editors.

A project would be edited on Media Composer until the cut was locked. Then assistants would ingest high-res media from files or videotape, and an online editor would “conform” the edit with this high-res media to match the approved timeline. The online editor would also handle Symphony color correction, insert visual effects, titles, etc. Finally, all tape or file deliverables would be exported out of the Avid Symphony. This system configuration and workflow is still in effect at many facilities around the world today, especially those that specialize in unscripted (“reality”) TV series.

The rise of the desktop systems

Naturally, there are more software options today. Over time, Avid’s dominance has been challenged by Apple Final Cut Pro (FCP 1-7 and FCPX), Adobe Premiere Pro, and more recently Blackmagic Design DaVinci Resolve. Systems are no longer limited by resolution constraints. General purpose computers can handle the work with little or no bespoke hardware requirements.

Fewer projects are even shot on film anymore, and an old-school film lab post workflow is largely impossible to mount any longer. And so, video and digital workflows that were once only used for television shows and commercials are now used in nearly all aspects of post, including feature films. There are still some legacy terms in use, such as DI (digital intermediate), which for a feature film is essentially an online edit and color correction session.

Given that modern software – even running on a laptop – is capable of performing nearly every creative and technical post-production task, why do we still have separate dedicated processes and different individuals assigned to each? The technical part of the answer is that some tasks do need extra tools. Proper color correction requires precision monitoring and becomes more efficient with specialized control panels. You may well be able to cut with a laptop, but if your source media is made up of 8K RED files, a proxy (offline-to-online) workflow makes more sense.

The human side of the equation is more complex

Post-production tasks often involve a left-brain/right-brain divide. Not every great editor is good when it comes to the completion phase. In spite of being very creative, many leave sloppy edits and messy timelines, and their project organization leaves a lot to be desired – for example, all footage and sequences bunched together in one large project without bins. Timelines might have clips spread vertically in no particular order, with some disabled clips left over from changes made in each revision pass. As I’ve said before: You will be judged by your timelines!

The bottom line is that the kind of personality that makes a good creative editor is different than one that makes a good online editor. The latter is often called a finishing editor today within larger facilities. While not a perfect analogy, there’s a direct evolutionary path from film negative cutter to linear online editor to today’s finishing editor.

If you compare this to the music world, songs are often handled by a mixing engineer followed by a mastering engineer. The mix engineer creates the best studio mix possible and the mastering engineer makes sure that mix adheres to a range of guidelines. The mastering engineer – working with a completely different set of audio tools – often adds their own polish to the piece, so there is creativity employed at this stage, as well. The mastering engineer is the music world’s equivalent to a finishing editor in the video world.

Remember that on larger projects, like a feature film, the film editor is contracted for a period of time to deliver a finished cut of the film. They are not permanent staff. Once that job is done, the project is handed off to the finishing team to accurately generate the final product from the high-res media. Other than reviewing the work, there’s no value in having a highly paid film editor also handle the basic assembly of the master. This is also true at many high-end commercial editorial companies. It’s more productive to have the creative editors working with the next client, while the staff finishing team finalizes the master files.

The right kit for the job

It also comes down to tools. Avid Symphony is still very much in play, especially with reality television shows. But there’s also no reason finishing and final delivery can’t be done using Apple Final Cut Pro or Adobe Premiere Pro. Often more specialized edit tools are assigned to these finishing duties, including systems such as Autodesk Smoke/Flame, Quantel Rio, and SGO Mistika. The reason, aside from quality, is that these tools also include comprehensive color and visual effects functions.

Finishing work today includes more than simply conforming a creative edit from a decision list. The finishing editor may be called upon to create minor visual effects and titles, along with finessing those that came out of the edit. Increasingly, Blackmagic Design DaVinci Resolve is becoming a strong contender for finishing – especially if Resolve was used for color correction. It’s a powerful all-in-one post-production application, capable of handling all of the effects and delivery chores. Finishing in Resolve cuts out half of the roundtrip process.

Attention to detail is the hallmark of a good finishing editor. Having good color and VFX skills is a big plus. It is, however, a career path in its own right and not necessarily a stepping stone to becoming a top-level feature film editor or even an A-list colorist. While that might be a turn-off to some, it will also appeal to many others and provide a great place to let your skills shine.

©2023 Oliver Peters

A Conversation with Walter Murch – Part 4

Roads not travelled.

No matter how long the career or number of awards, any editor might consider those films that passed by and wonder what they might have done with the film. That’s where we conclude this discussion.


Walter, in your career, are there any films that you didn’t edit, but wish you had?

I worked on K-19: The Widowmaker for Kathryn Bigelow. And in 2007 she sent me the script for The Hurt Locker, about the Iraq war. It made me realize that the last four films I had edited had been films about war, the latest one being Jarhead. I told her that I just wanted to take a break from editing war films. Of course, The Hurt Locker went on to win six Oscars – Best Picture, Best Screenplay, Best Director, Best Film Editing, Best Sound Editing, and Best Sound Mixing.

What would have happened if I had said yes to that? But, you also get into practical things. At the time of making that decision, I’d been away from home for a year editing Jarhead in Los Angeles and New York. This would have meant going into the Middle East or at least going to Los Angeles. But, the main thing was just I’d been thinking about war since 2000: Apocalypse Redux, war – K-19, war – Cold Mountain, war – Jarhead, war. Even The English Patient is kind of a war film. So turning down The Hurt Locker is the big What If?  that comes to mind.

I know you are an Orson Welles buff, so I was actually thinking it might have been The Other Side of the Wind, which was finally completed in 2018.

I did the recut of Touch of Evil. At that time, 1998, I was taken to the vaults in Los Angeles, where the material for The Other Side of the Wind was in storage. Gary Graver showed me some of the rough assemblies that had been put together by Welles himself, but I just didn’t want to work on that one. This looked to me very self-indulgent. The situation with Touch of Evil was very, very different. 

The finished version of Wind seems like it’s an art film within the film and appears to be somewhat autobiographical. Although, I believe Welles denied that the director character (John Huston) was modeled after his own career.

Right. Touch of Evil was obviously a scripted film that was produced by Universal Studios in Hollywood – albeit, in a Wellesian manner – but it was very buttoned down, relatively speaking. And then Welles left us the 58-page memo, which made very specific suggestions for how the film should be recut. It was clear what he wanted done. I mean, he didn’t talk about frames – he would just say: this section needs to be shorter. Or: restore the script structure for the first four reels, cross-cutting between Janet Leigh’s story and Charlton Heston’s story. The studio had put all the Heston story together and then all the Leigh story together. He wanted his original structure back. I don’t believe there was a guiding memo like that for The Other Side of the Wind.

Welles’ Touch of Evil memo is a wonderful document. It’s 58 pages by a genius filmmaker under duress, writing about his ideas and addressed to his enemies at the studio. It’s a masterclass in political diplomacy of trying to get his ideas across without accusation. It’s sad that he had to write it, but I’m happy we have it.

Thank you.


Walter Murch has led and continues to lead an interesting and eclectic filmmaking career. If you’ve enjoyed this 4-part series, there’s plenty more to be found in the books written by and about him. There are also many of his interviews and presentations available on the web.

SIGHT & SOUND: The Cinema of Walter Murch is a documentary created by Jon Lefkovitz. The video is assembled from various interviews and presentations in which Murch discusses his take on filmmaking and editing, illustrated with many film examples to highlight the concepts.

Web of Stories – Life Stories of Remarkable People includes an 18-hour series of interviews with Walter Murch recorded in London in 2016. These are broken down into 320 short clips for easier viewing.

A Conversation with Walter Murch – Part 1

A Conversation with Walter Murch – Part 2

A Conversation with Walter Murch – Part 3

©2023 Oliver Peters

A Conversation with Walter Murch – Part 3

Sound design and the film mixing process

Walter Murch is not only known for his work and awards in the realm of picture editing, but he’s also made significant contributions to the art of film sound. That’s where we pick up in Part 3.


Walter, let’s switch gears and talk about sound and your work as a sound designer and re-recording mixer. There’s an origin story about the term ‘sound designer’ credited to union issues. That might be a good place to start.

Francis [Ford Coppola] tells the story, but he gets it wrong. [laugh] There were union problems, because I was in the San Francisco union. Many of the films were financially based in LA, so what am I doing working on that? On The Rain People, for instance, my credit is sound montage. I wasn’t called sound mixer, re-recording mixer, or sound editor. We were just trying to avoid blatantly stepping on toes. I had the same sound montage credit on The Conversation. Then on Apocalypse Now I was credited with sound re-recording, because it was an independent film. Francis basically was the financier of it. So, it was an independent film, partially supported by United Artists.

The sound design idea came up because of this new format, which we now call 5.1. Apocalypse Now was the first time I had ever worked in that format. It was the first big film that really used it in a creative way. The Superman film in 1978 had it technically, but I don’t think they used it much in a creative fashion.

As I recall, at that time there was the four channel surround format that was left, center, right, and a mono rear channel.

Star Wars, which came out in 1977, had that format with the ‘baby boom’ thing. 70mm prior to that would have five speakers behind the screen – left, left-center, center, right-center, and right. What they decided to do on Star Wars was to not have the left-center/right-center speakers. Those were only used for super low frequency enhancement. Then the surround was many speakers, but all of them wired to only one channel of information.

Walter Murch mixing Apocalypse Now

We didn’t use those intermediate speakers at all and brought in Meyer ‘super boom’ speakers. These went down to 20Hz, much lower than the Altec speakers in those days, which I think bottomed out at around 50Hz. And then we split the mono surround speakers into two channels – left and right. Basically what we call 5.1 today. We didn’t call it 5.1, we just called it six-track or a split-surround format.

This was a new format, but how do you use it creatively? I couldn’t copy other films that had done it, because there weren’t any. So I designed an entire workflow system for the whole film. Where would we really use 5.1 and where would we not? That was a key thing – not to fall into the trap that usually happens with new technology, which is to overuse it. Where do we really want to have 5.1? In between that, let’s just use stereo and even some long sections where it’s just mono – when Willard is looking at the Kurtz dossier, for instance.

In Willard’s narration sections – when he’s in the fo’c’sle of the boat at night reading the dossier – it’s just mono, just his voice and that’s it. As things open up, approaching Hau Phat (the Playboy Bunny concert), it becomes stereo, and then as it really opens up, with the show beginning, it becomes full six-track, 5.1. Then it collapses back down to mono again the next morning. That’s the design element. So I thought, that’s the unique job that I did on the film – sound design – designing where we would fully use this new format and how we would use it.

Mark Berger, Francis Ford Coppola, and Walter Murch mixing The Godfather Part II

In addition, of course, I’d cut some of the sound effects, but not anywhere near most of them, because we had a small army of sound effects editors working under Sound Effects Supervisor Richard Cirincione. However, I was the main person responsible. Francis at one point had a meeting and he said, “Any questions about sound? Walter’s the guy. Don’t ask me, ask Walter.” So I effectively became the director of sound. And then, of course, I was the lead re-recording mixer on the film.

Some readers might not be familiar with how film mixing works and why there are teams. Please go into that more.

Previously, in Hollywood, there would usually be three people – DME sitting at the board. That is how The Godfather was mixed. If you are facing the screen behind the console, then D [dialogue mixer] on the left, M [music mixer] in the middle and E [sound effects mixer] on the right. On the other hand, in San Francisco I had been the solo re-recording mixer on Rain People, THX-1138, and The Conversation. 

Walter Murch handling the music mix, Particle Fever

As soon as you have automation you don’t need as many people, because the automation provides extra fingers. We had very basic automation on Apocalypse Now. Only the faders were automated, but none of the equalization, sends, echo, reverb, or anything else. So we had to keep lots of notes about settings. The automation did at least control the levels of each of the faders.

Of course, these days a single person can mix large projects completely ‘in the box’ using mainly a DAW. I would imagine mixing for music and mixing for film and television is going to use many of the same tools.

The big difference is that in the old days – and I’m thinking of The Godfather – we had very limited ability with the edited soundtracks to hear them together before we got to the mix. You had no way to set their levels relative to each other until you got to the mix. So the mix was really the creation of this from the ground up.

Supervising the mix, Coup 53

Thinking of the way I work now, or the way Skip Lievsay works with the Coen brothers, he will create the sound for a section in Pro Tools and build it up. Then he’ll send the brothers a five-track or a three-track and they just bring it into the audio tracks of the Premiere timeline. So they’re editing the film with his full soundtrack. There are no surprises in the final mix. You don’t have to create anything. The final mix is when you hear it all together and put ‘holy water’ on it and say, that’s it – or not. Now that you’ve slept on it overnight, let’s reduce the bells of the cows by 3dB. You make little changes, but it’s not this full-on assault of everything. As I said earlier, it’s no longer bareback – just taking the raw elements and putting them together for the first time in the mix. The final mix now is largely a certification of things with which you have already been very familiar for some time.


Click here for the conclusion of this conversation in Part 4.

A Conversation with Walter Murch – Part 1

A Conversation with Walter Murch – Part 2

©2023 Oliver Peters

A Conversation with Walter Murch – Part 2

Her Name was Moviola, continued

In Part 1 of my recent discussion with Oscar-winning editor Walter Murch, he talked about a documentary film with which he’s currently involved, one that takes a look at some of the processes traditional film editors went through. We continue that conversation here in Part 2.


What was the experience like to go back in time, so to speak – working with a Moviola again?

I hadn’t cut any dailies on a Moviola since 1977, 45 years ago. Dan had not done anything on a Moviola since 1994. But it all came back instantly. There was not the slightest hesitation about what any of this stuff was or how we made it work. Interestingly, that’s very different from my experience with digital platforms.

In what way?

Let’s say I’m cutting a film using Avid and finish the work. Then three months later I get another job using the same Avid. In those three months the muscle memory of my fingers has somewhat evaporated. I have to ask my assistant questions similar to, “How do I tie my shoelaces?” [laugh] Of course, it comes back – it takes about three or four days to get rid of the rust. Then in about a week I’m fully back.

Murch editing Coup 53 with Premiere Pro

So that’s an interesting neurological question. Why does editing on the Moviola not have the slightest evaporation in 45 years, whereas editing on a digital platform that you are very familiar with start to evaporate if you are away from it for a few months? I think it’s because every skill in Moviola editing is a completely different set of physical muscular moves: splicing is different from rewinding is different from braking is different from threading up the Moviola, etc. etc. And each of them makes a different sound. Whereas the difference between ‘splicing’ and ‘rewinding’ in digital editing is simply a different keystroke. 

In our emails we had talked a little bit about the differences between an upright Moviola and flatbeds like KEM and Steenbeck. Would you expand upon that a bit?

Ironically, the outliers in this are the flatbeds. In a sense, both the Moviola and nonlinear digital are random access machines. With the Moviola, everything is broken down into individual shots, which are rolled up and put into boxes. There might be two or three or sometimes six or seven shots in a box. When you want to see a shot, you ask your assistant, “Can you give me 357, take two?” That’s kind of what happens digitally, too, except you are making the selection by typing or mouse-clicking. Digital is much more random access: you can select internally within the shot.

A KEM or a Steenbeck, on the other hand, is linear. Everything is kept in the dailies rolls as they came from the lab. If you want to see a particular shot, you have to find it in its ten-minute dailies roll. What I would do is thread it through the sprocketed prism, without going through the drive sprockets. Then I’d just spool down at very high speed with my hands on both the take-up and the feed and watch for 30 seconds or so while it was winding until I got to the shot.

Next, I would put it on the screening-side screen and lock it into the sprocketed motor drives to figure out a place to edit it into the cut. On the KEM the picture module would be on my left and the sound for that on my right. The center would show what was going into the film. That’s my way of working, but everyone has a different way of working.

Murch and Taghi Amirani, director, Coup 53

When George [Lucas] cut on the Steenbeck, he was using a one-screen Steenbeck, so that option of having two screens was not available to him. And so he would make select rolls. He would go through the dailies roll, cut out the good bits and then either hang them on hooks or build them into a separate selects roll. For philosophical reasons, I don’t like working that way, but it’s certainly a valid way of working.

I liked the ‘dailies roll’ method, because as I would be high-speed scrolling for the shot I wanted, I often would find something else I needed instead. As I was spooling, I would glimpse alternate takes, things that I had initially rejected, which proved to be valuable, because as the film evolved, maybe they would now be helpful. Even when editing digitally, I still construct what I call ‘KEM rolls’ of everything shot for a scene strung together.

We humans have this fascination with vintage analog gear, whether it’s film or audio. Is it just that touch of nostalgia or something different?

The basic idea behind the Moviola film is that if we don’t do it now it will disappear from history. It was difficult enough now in 2022, which coincidentally is the 100th birthday of the Moviola. The first one was built in 1922. If we don’t do it now, it will become exponentially more difficult.

Not the machine itself. I think they’ll always hang around, because they’re iconic, like ancient sculptures. But ancillary equipment like mag film was really hard to get. Also hard to find was the pressure-sensitive thermal tape for the Acmade printer. We eventually sourced it from Pixar in California. But if we hadn’t found those tapes, we couldn’t have made the film. It was as simple as that. Without that specific tape, this whole inverted pyramid would have just collapsed.

Every film in Hollywood from about 1925 until 1968, let’s say, was cut on a Moviola. Before 1925 cutting was done without any machine. The editors were just cutting by hand, assembling shots together and then screening the assembly. So the screening room was in essence their Moviola. They would take notes during the screening and then go back to the bench and trim and transpose shots or whatever. Ultimately all of the classic films from Hollywood after about 1925 were forged on the Moviola. That was my experience standing all day for 12 hours a day at the Moviola, with all this [winding noise] stuff.

It felt like blacksmithing in comparison to what we do digitally. And obviously you are physically cutting the film. I used the Inviso film splicer on this, which was invented in the mid-1970s. So it’s a little bit of a cheat to say in 1972, because I invented the Inviso in 1976. We also used my other invention, which was to make the hooks on the trim bin out of 1.75mm crochet hooks. These have a barb at the end of them, so that prevented several pieces of film from falling off when you hung them on a hook.

I mean, this is inside baseball, but one of the fascinating things about analog editing is that it is open to physical tinkering. For instance: the Acmade numbers were not printing boldly enough for some reason. What to do? Howard’s solution was to wrap a piece of adhesive tape around the sprocket wheel, forcing the film closer to the print head. You could fix something, just like working on a car from 1956. If you had a problem with the carburetor, take a screwdriver and bang away at it. Right? Today with digital fuel injection or now with electric motors, it’s hopeless for an ordinary person to have any access to the engine. To a certain extent there’s a similarity with digital film editing. Of course, if you know how to code and you know what a database does, you can do very sophisticated things in the digital realm, but that’s the requirement.

When Dan was syncing the film up, there was a sound recordist on the shoot, who was a young film student – maybe 24 years old. At the end of one session, he asked Dan confidentially, “Did you do this on every film?” [laugh] It was just incomprehensible to him that we had to do all this very physical work.

John Gregory Dunne wrote an article around 2004 about film craft. He said a director friend of his called the old way, which is what we were doing on the Moviola, “surgery without anesthetic.” That’s interesting, because how did you do it? Well, first of all, we had to do it. There was no other way. How did Michelangelo carve David? Did he sharpen his chisels after every ten bangs? How many assistants did he have? Just those really ephemeral things that were necessary to do well, which I don’t think we have any record of. So that was another reason to do the film. It was just to say, this is what you had to do.

I read Michael Rubin’s book Droidmaker about the start of Lucasfilm. You are heavily featured in it. He referred to a picnic event you hosted called the Droid Olympics, which was directly related to the film editing techniques we’ve been talking about. Please tell me a bit about that.

Droid Olympics, 1978 – Worst Hand Anyway

The first event was held in the summer of 1978. I was working on Apocalypse Now and had been for a year. I was editing on a KEM. Richie Marks and Jerry Greenberg were working on Moviolas. There was an army of assistants re-constituting the dailies after we had cut a scene. I would cut stuff out and hang it on a bin. Then at the end of the day everything would have to be put back together again on the dailies rolls. This was re-constituting the dailies, which was very tedious work.

Marcia Lucas speed splicing. (L-R) Duwayne Dunham, Richard Hymns, and Dale Strumpell look on.

Steve, my assistant, and I were working in the same room. He had an action figure called Stretch [Armstrong]. It was a rubbery creature who could reach across the room with arms that would stretch. He’d taken Stretch and manacled him with wire to the rewinds and put Stretch’s body inside the sync machine. Stretch was being stretched on the rack of the sync machine. I said, “Steve, what are you doing?” And he said, “Stretch has to suffer!” [laugh] It was a way of blowing off steam from all of the semi-mindless, but crucially exacting work. And so I thought, they need a break.

My wife, Aggie, had put on a horse show earlier that summer for the local kids, where they could demonstrate various horseback riding skills and earn blue ribbons. So I thought, well, we’ll have one of those: a decathlon of film skills. How fast can you splice? How quickly can you rewind a thousand feet? How accurately can you guess how many feet are in an arbitrary-sized reel of film? Those kinds of things.

Murch winning rack stacking

All of the Apocalypse Now editors and Lucasfilm editors were invited. I think The Black Stallion was in editing at the time, so they were invited. I think anyone in the Bay Area working in film was invited and it was just a wonderful afternoon. We staged it two more times – 1983, I think, and then again in 1987. By the time we thought about doing it another time, everything had become digital.

There’s probably a digital equivalent of that. But, I guess it wouldn’t be as much fun physically.

No, it wouldn’t be as much fun to look at. There were all kinds of ridiculous things that happened. Carroll Ballard was rewinding and it got out of control and the loops went way up, probably six feet on either side of the sync machine. He didn’t know what to do and panicked. And then, suddenly the loops collapsed and the sync machine flew up into the air and the film got torn to shreds.

Those are the things that you wanted to see! Much more exciting than watching the beach-ball spin around.


This conversation continues in Part 3.

A Conversation with Walter Murch – Part 1

©2023 Oliver Peters