The Oscar. Now what?

Everything Everywhere All at Once dominated the Academy Awards night, including a Best Film Editing win for Paul Rogers. The team used Adobe Premiere Pro as their NLE of choice, which by extension makes this the first editing Oscar win for Premiere. Of course, it’s the team and the editor that won the award, not the software they used. Top editors could cut with any application and get the same result.

The Academy Awards started as a small celebratory dinner for insiders to recognize each other’s achievements in film. Over the decades this has become a major cultural event. Winning or even being nominated is a huge feather in the cap for any film. This can be heavily leveraged by the marketing teams of not only the film distributors and talent agents, but also the various products used in the process – be that cameras or software.

Avid’s dominance

When it comes to editing, Avid has been the 800-pound gorilla in the modern digital era. Ever since Walter Murch won for editing The English Patient using Media Composer, the specific NLE on an Oscar-winning film has become a hot topic among editors. This was never the case when the only options were Moviola, KEM, or Steenbeck.

Even this year, nine of the ten nominees for the Oscar for Best Picture and four of the five nominees for Best Film Editing used Media Composer. Yet, Avid’s dominance in the winner’s circle has seen occasional cracks from competitors, like Apple’s Final Cut Pro (legacy version) and Lightworks. Nevertheless, Media Composer is still a safe bet. And let’s not forget sound, where Pro Tools has even less competition from other DAWs among film and TV sound editors and mixers. All of the nominees for the Oscar for Best Sound at this year’s Academy Awards used Pro Tools.

There are, of course, many awards competitions around the world, including the ACE Eddie Awards, BAFTA, the Golden Globes, and various film festivals. Many of these don’t give out specific craft awards for editors or editing; even so, plenty of the winning films have been edited with other tools. For example, many award-worthy indie films, especially documentaries, have been edited with Premiere Pro. Even Final Cut Pro (the current “X” version) has had wins in such categories, including the short films The Silent Child and Skin at the 2018 and 2019 Academy Awards, respectively.

Stacking up the NLE competitors

The truth of the matter is that today there are seven viable applications that might be used to cut a professional feature film or documentary: Media Composer, Final Cut Pro, Premiere Pro, DaVinci Resolve, Lightworks, Edius X, and Vegas Pro. You could probably also factor in others, such as Final Cut Pro 7 (now zombie-ware) and Media 100 (yes, still alive), not to mention consumer-oriented NLEs like iMovie or Movie Maker. Realistically, most experienced film editors are likely to use only one of the first five on the list.

Of those five, Blackmagic Design’s DaVinci Resolve is the app that most editors have their eyes on. Aside from its widespread use in color correction, Resolve is also a perfectly capable editing application. Although it has yet to pull off an Oscar win for editing, Resolve has been widely used in many aspects of the production and post workflow of top films. Owing to its nature as a “Swiss Army Knife” application, Resolve fits into various on-set, editing, and visual effects niches. It’s only a matter of time before Resolve gets an Oscar win for editing. But other Blackmagic Design products also shouldn’t be overlooked. In the 2023 Academy Awards, more than 20 films across the technical, documentary, short film, international feature film, and animated categories used some Blackmagic Design product.

Marketing

When an application is used on an award-winning film, I’d bet that the manufacturer’s marketing department is doing high-fives. But does this really move the sales needle? Maybe. It’s all aspirational marketing. They want you to feel that if you use the same software as an Oscar-winning film editor used, then you, too, could be in that league. Talent is always the key factor, but we can all dream. Right? That’s what marketing plays upon, but it also impacts the development of the application itself.

Both Avid and Adobe have been fine-tuning their tools with professional users in mind for years. They’ve added features based on the needs of a small, but influential (or at least vocal) market sector. This results in applications that tick most of the professional boxes, but which are also harder to learn and eventually master.

That’s a route Apple also chose to pursue with Final Cut Pro 1 through 7. Despite a heralded introduction with Cold Mountain in 2003, it took until 2010 before Angus Wall and Kirk Baxter nailed down an Oscar with The Social Network. They then reprised that in 2011 with a win for The Girl with the Dragon Tattoo. Even as late as 2020, the discontinued FCP 7 was represented by Parasite, winning Best Picture and nominated for Best Film Editing.

Apple and Final Cut Pro’s trajectory unexpectedly changed course with the introduction of Final Cut Pro X. This shift coincided with the growth of social media and a new market of many non-traditional video editors. Final Cut Pro in its current iteration is the ideal application for this market and has experienced huge growth in users. But, it still gets labeled as not ready for professional users, even though a ton of professional content is posted using the app. Apple took the platform approach – opting to leave out many advanced features and letting third-party developers fill in the gaps where needed. This is the core of much of the criticism.

How advanced/complex does a professional NLE really need to be?

In the case of FCP, it’s certainly capable of Hollywood-level films along with a range of high-end, international dramas. Witness the many examples I’ve written about, like Focus, Whiskey Tango Foxtrot, Voice from the Stone, The Banker, Jezebel, and Blood Red Sky. However, a wide range of professional editors would like to see more.

The internal corporate discussion goes like this. Marketing asks, “What do we have to do to get broader adoption among professional film editors?” Engineering answers, “It will take X dollars and X amount of time.” Top management asks, “What’s the return if we do that?” And that’s usually where the cycle stops, until the next year or awards season.

The truth is that the traditional high-end post market is extremely small for a company like Apple. The company is already selling hardware, which is their bread and butter. Will a more advanced version of FCP sell more hardware? Probably not. Avid, Adobe, and Blackmagic Design are already doing that for them. On the other hand, what is more influential for sales in today’s market – Oscar-winning professional editors or a bevy of YouTube influencers touting your product?

I’m not privy to sales numbers, so I have no idea whether or not going after the very small professional post market makes financial sense for either Blackmagic Design or Adobe. In the case of Avid, their dominance pays off through their ecosystem. Avid-based facilities are also likely to have Avid storage and Pro Tools audio suites. Hardware most likely covers the development costs. Plus, both Avid and Adobe have shifted to subscription models (Adobe fully, Avid as an option). This seems to be good for both companies.

Blackmagic Design is also a hardware developer and manufacturer. Selling cameras and a wide range of other products enables them to offer DaVinci Resolve for as little as free. You’d be hard-pressed to find a production company that wasn’t using one or more Blackmagic products. Only time will tell which company has taken the approach that a) ensures their long-term survival, and b) benefits professional film editors in the best way. In the case of Apple, it’s pretty clear that adding new features to Final Cut Pro would generate revenue in an amount that many competitors would envy. Yet, it would be small by Apple’s measurement.

In the end, awards are good for a developer’s marketing buzz, but don’t forget the real team that won the award itself. It’s wonderful for Paul Rogers and Adobe that Everything Everywhere All at Once was tapped for the Oscar for Best Film Editing. It’s an interesting milestone, but when it comes to software, it’s little more than bragging rights. Great to have, but remember, it’s Rogers who earned it, regardless of the tools he used.

©2023 Oliver Peters

A Conversation with Walter Murch – Part 4

Roads not travelled.

No matter how long the career or the number of awards, any editor might consider the films that passed them by and wonder what they might have done with them. That’s where we conclude this discussion.

______________________

Walter, in your career, are there any films that you didn’t edit, but wish you had?

I worked on K-19: The Widowmaker for Kathryn Bigelow. And in 2007 she sent me the script for The Hurt Locker, about the Iraq war. It made me realize that the last four films I had edited had been films about war, the latest one being Jarhead. I told her that I just wanted to take a break from editing war films. Of course, Hurt Locker went on to win six Oscars – Best Picture, Best Original Screenplay, Best Director, Best Film Editing, Best Sound Editing, and Best Sound Mixing.

What would have happened if I had said yes to that? But, you also get into practical things. At the time of making that decision, I’d been away from home for a year editing Jarhead in Los Angeles and New York. This would have meant going into the Middle East or at least going to Los Angeles. But, the main thing was just that I’d been thinking about war since 2000: Apocalypse Redux, war – K-19, war – Cold Mountain, war – Jarhead, war. Even The English Patient is kind of a war film. So turning down The Hurt Locker is the big What If? that comes to mind.

I know you are an Orson Welles buff, so I was actually thinking it might have been The Other Side of the Wind, which was finally completed in 2018.

I did the recut of Touch of Evil. At that time, 1998, I was taken to the vaults in Los Angeles, where the material for The Other Side of the Wind was in storage. Gary Graver showed me some of the rough assemblies that had been put together by Welles himself, but I just didn’t want to work on that one. This looked to me very self-indulgent. The situation with Touch of Evil was very, very different. 

The finished version of Wind seems like it’s an art film within the film and appears to be somewhat autobiographical. Although, I believe Welles denied that the director character (John Huston) was modeled after his own career.

Right. Touch of Evil was obviously a scripted film that was produced by Universal Studios in Hollywood – albeit in a Wellesian manner – but it was very buttoned down, relatively speaking. And then Welles left us the 58-page memo, which made very specific suggestions for how the film should be recut. It was clear what he wanted done. I mean, he didn’t talk about frames – he would just say: this section needs to be shorter. Or: restore the script structure for the first four reels, cross-cutting between Janet Leigh’s story and Charlton Heston’s story. The studio had put all the Heston story together and then all the Leigh story together. He wanted his original structure back. I don’t believe there was a guiding memo like that for The Other Side of the Wind.

Welles’ Touch of Evil memo is a wonderful document. It’s 58 pages by a genius filmmaker under duress, writing about his ideas, addressed to his enemies at the studio. It’s a masterclass in political diplomacy, trying to get his ideas across without accusation. It’s sad that he had to write it, but I’m happy we have it.

Thank you.

______________________

Walter Murch has led and continues to lead an interesting and eclectic filmmaking career. If you’ve enjoyed this 4-part series, there’s plenty more to be found in the books written by and about him. There are also many of his interviews and presentations available on the web.

SIGHT & SOUND: The Cinema of Walter Murch is a documentary created by Jon Lefkovitz. This video is assembled from various interviews and presentations by Murch discussing his take on filmmaking and editing. It’s illustrated with many film examples to highlight the concepts.

Web of Stories – Life Stories of Remarkable People includes an 18-hour series of interviews with Walter Murch recorded in London in 2016. These are broken down into 320 short clips for easier viewing.

A Conversation with Walter Murch – Part 1

A Conversation with Walter Murch – Part 2

A Conversation with Walter Murch – Part 3

©2023 Oliver Peters

A Conversation with Walter Murch – Part 3

Sound design and the film mixing process

Walter Murch is not only known for his work and awards in the realm of picture editing, but he’s also made significant contributions to the art of film sound. That’s where we pick up in Part 3.

______________________

Walter, let’s switch gears and talk about sound and your work as a sound designer and re-recording mixer. There’s an origin story about the term ‘sound designer’ credited to union issues. That might be a good place to start.

Francis [Ford Coppola] tells the story, but he gets it wrong. [laugh] There were union problems, because I was in the San Francisco union. Many of the films were financially based in LA, so what am I doing working on that? On The Rain People, for instance, my credit is sound montage. I wasn’t called sound mixer, re-recording mixer, or sound editor. We were just trying to avoid blatantly stepping on toes. I had the same sound montage credit on The Conversation. Then on Apocalypse Now I was credited with sound re-recording, because it was an independent film. Francis basically was the financier of it. So, it was an independent film, partially supported by United Artists.

The sound design idea came up because of this new format, which we now call 5.1. Apocalypse Now was the first time I had ever worked in that format. It was the first big film that really used it in a creative way. The Superman film in 1978 had it technically, but I don’t think they used it much in a creative fashion.

As I recall, at that time there was the four-channel surround format that was left, center, right, and a mono rear channel.

Star Wars, which came out in 1977, had that format with the ‘baby boom’ thing. 70mm prior to that would have five speakers behind the screen – left, left-center, center, right-center, and right. What they decided to do on Star Wars was to not have the left-center/right-center speakers. Those were only used for super low frequency enhancement. Then the surround was many speakers, but all of them wired to only one channel of information.

Walter Murch mixing Apocalypse Now

We didn’t use those intermediate speakers at all and brought in Meyer ‘super boom’ speakers. These went down to 20Hz, much lower than the Altec speakers in those days, which I think bottomed out at around 50Hz. And then we split the mono surround speakers into two channels – left and right. Basically what we call 5.1 today. We didn’t call it 5.1, we just called it six-track or a split-surround format.

This was a new format, but how do you use it creatively? I couldn’t copy other films that had done it, because there weren’t any. So I designed an entire workflow system for the whole film. Where would we really use 5.1 and where would we not? That was a key thing – not to fall into the trap that usually happens with new technology, which is to overuse it. Where do we really want to have 5.1? In between that, let’s just use stereo and even some long sections where it’s just mono – when Willard is looking at the Kurtz dossier, for instance.

In Willard’s narration sections – when he’s in the fo’c’sle of the boat at night reading the memo – it’s just mono; it’s just his voice and that’s it. As things open up, approaching Hau Phat (the Playboy Bunny concert), it becomes stereo, and then as it really opens up, with the show beginning, it becomes full six-track, 5.1. Then it collapses back down to mono again the next morning. That’s the design element. So I thought, that’s the unique job that I did on the film – sound design – designing where we would fully use this new format and how we would use it.

Mark Berger, Francis Ford Coppola, and Walter Murch mixing The Godfather Part II

In addition, of course, I’d cut some of the sound effects, but not anywhere near most of them, because we had a small army of sound effects editors working under Sound Effects Supervisor Richard Cirincione. However, I was the main person responsible. Francis at one point had a meeting and he said, “Any questions about sound? Walter’s the guy. Don’t ask me, ask Walter.” So I effectively became the director of sound. And then, of course, I was the lead re-recording mixer on the film.

Some readers might not be familiar with how film mixing works and why there are teams. Please go into that more.

Previously, in Hollywood, there would usually be three people – DME sitting at the board. That is how The Godfather was mixed. If you are facing the screen behind the console, then D [dialogue mixer] on the left, M [music mixer] in the middle and E [sound effects mixer] on the right. On the other hand, in San Francisco I had been the solo re-recording mixer on Rain People, THX-1138, and The Conversation. 

Walter Murch handling the music mix, Particle Fever

As soon as you have automation you don’t need as many people, because the automation provides extra fingers. We had very basic automation on Apocalypse Now. Only the faders were automated, but none of the equalization, sends, echo, reverb, or anything else. So we had to keep lots of notes about settings. The automation did at least control the levels of each of the faders.

Of course, these days a single person can mix large projects completely ‘in the box’ using mainly a DAW. I would imagine mixing for music and mixing for film and television are going to use many of the same tools.

The big difference is that in the old days – and I’m thinking of The Godfather – we had very limited ability with the edited soundtracks to hear them together before we got to the mix. You had no way to set their levels relative to each other until you got to the mix. So the mix was really the creation of this from the ground up.

Supervising the mix, Coup 53

Thinking of the way I work now or the way Skip Lievsay works with the Coen brothers, he will create the sound for a section in Pro Tools and build it up. Then he’ll send the brothers a five-track or a three-track and they just bring it into the audio tracks of the Premiere timeline. So they’re editing the film with his full soundtrack. There are no surprises in the final mix. You don’t have to create anything. The final mix is when you hear it all together and put ‘holy water’ on it and say, that’s it – or not. Now that you’ve slept on it overnight, let’s reduce the bells of the cows by 3dB. You make little changes, but it’s not this full-on assault of everything. As I said earlier, bareback – where it’s just taking the raw elements and putting them together for the first time in the mix. The final mix now is largely a certification of things that you have already been very familiar with for some time.

______________________

Click here for the conclusion of this conversation in Part 4.

A Conversation with Walter Murch – Part 1

A Conversation with Walter Murch – Part 2

©2023 Oliver Peters

A Conversation with Walter Murch – Part 2

Her Name Was Moviola, continued

In Part 1 of my recent discussion with Oscar-winning editor Walter Murch, he discussed a documentary film with which he’s currently involved. In that film he takes a look at some of the processes traditional film editors went through. We continue that conversation here in Part 2.

______________________

What was the experience like to go back in time, so to speak – working with a Moviola again?

I hadn’t cut any dailies on a Moviola since 1977, 45 years ago. Dan had not done anything on a Moviola since 1994. But it all came back instantly. There was not the slightest hesitation about what any of this stuff was or how we made it work. Interestingly, that’s very different from my experience with digital platforms.

In what way?

Let’s say I’m cutting a film using Avid and finish the work. Then three months later I get another job using the same Avid. In those three months the muscle memory of my fingers has somewhat evaporated. I have to ask my assistant questions similar to, “How do I tie my shoelaces?” [laugh] Of course, it comes back – it takes about three or four days to get rid of the rust. Then in about a week I’m fully back.

Murch editing Coup 53 with Premiere Pro

So that’s an interesting neurological question. Why does editing on the Moviola not have the slightest evaporation in 45 years, whereas editing on a digital platform that you are very familiar with starts to evaporate if you are away from it for a few months? I think it’s because every skill in Moviola editing is a completely different set of physical muscular moves: splicing is different from rewinding is different from braking is different from threading up the Moviola, etc. etc. And each of them makes a different sound. Whereas the difference between ‘splicing’ and ‘rewinding’ in digital editing is simply a different keystroke.

In our emails we had talked a little bit about the differences between an upright Moviola and flatbeds like KEM and Steenbeck. Would you expand upon that a bit?

Ironically, the outliers in this are the flatbeds. In a sense, both the Moviola and nonlinear digital are random access machines. With the Moviola, everything is broken down into individual shots, which are rolled up and put into boxes. There might be two or three or sometimes six or seven shots in a box. When you want to see a shot, you ask your assistant, “Can you give me 357, take two?” That’s kind of what happens digitally, too, except you are making the selection by typing or mouse-clicking. Digital is much more random access: you can select internally within the shot.

A KEM or a Steenbeck on the other hand is linear. Everything is kept in the dailies rolls as they came from the lab. If you want to see a particular shot, you have to find it in its ten minute dailies roll. What I would do is thread it through the sprocketed prism, without going through the drive sprockets. Then I’d just spool down at very high speed with my hands on both the take up and the feed and watch for 30 seconds or so while it was winding until I got to the shot.

Next, I would put it on the screening side screen and lock it into the sprocketed motor drives, to figure out a place to edit it into the cut. On the KEM the picture module would be on my left and the sound for that on my right. The center would be what’s coming into the film. That’s my way of working, but everyone has a different way of working.

Murch and Taghi Amirani, director, Coup 53

When George [Lucas] cut on the Steenbeck, he was using a one-screen Steenbeck, so that option of having two screens was not available to him. And so he would make select rolls. He would go through the dailies roll, cut out the good bits and then either hang them on hooks or build them into a separate selects roll. For philosophical reasons, I don’t like working that way, but it’s certainly a valid way of working.

I liked the ‘dailies roll’ method, because as I would be hi-speed scrolling for the shot I wanted, I often would find what I needed instead. As I would be spooling, I would glimpse alternate takes, things that I had initially rejected, which proved to be valuable, because as the film evolves, maybe they would now be helpful. Even when editing digitally, I still also construct what I call ‘KEM rolls’ of everything shot for a scene strung together.

We humans have this fascination with vintage analog gear, whether it’s film or audio. Is it just that touch of nostalgia or something different?

The basic idea behind the Moviola film is that if we don’t do it now, it will disappear from history. It was difficult enough in 2022, which coincidentally is the 100th birthday of the Moviola. The first one was built in 1922. If we don’t do it now, it will become exponentially more difficult.

Not the machine itself. I think they’ll always hang around, because they’re iconic, like ancient sculptures. But ancillary equipment like mag film was really hard to get. Also hard was the pressure-sensitive thermal tape for the Acmade printer. We eventually found it from Pixar in California. But if we hadn’t found those tapes, we couldn’t have made the film. It was as simple as that. Without that specific tape, all of this inverted pyramid would have just collapsed. 

Every film in Hollywood from about 1925 until 1968, let’s say, was cut on a Moviola. Before 1925, cutting was done without any machine. The editors were just cutting by hand, assembling shots together and then screening the assembly. So the screening room was in essence their Moviola. They would take notes during the screening and then go back to the bench and trim and transpose shots or whatever. Ultimately all of the classic films from Hollywood after about 1925 were forged on the Moviola. That was my experience standing all day for 12 hours a day at the Moviola, with all this [winding noise] stuff.

It felt like blacksmithing in comparison to what we do digitally. And obviously you are physically cutting the film. I used the Inviso film splicer on this, which was invented in the mid-1970s. So it’s a little bit of a cheat to say it’s 1972, because I invented the Inviso in 1976. We also used my other invention, which was to make the hooks on the trim bin out of 1.75mm crochet hooks. These have a barb at the end, so that prevented several pieces of film from falling off when you hung them on a hook.

I mean, this is inside baseball, but one of the fascinating things about analog editing is that it is open to physical tinkering. For instance: the Acmade numbers were not printing boldly enough for some reason. What to do? Howard’s solution was to wrap a piece of adhesive tape around the sprocket wheel, forcing the film closer to the print head. You could fix something, just like working on a car from 1956. If you had a problem with the carburetor, take a screwdriver and bang away at it. Right? Today with digital fuel injection or now with electric motors, it’s hopeless for an ordinary person to have any access to the engine. To a certain extent there’s a similarity with digital film editing. Of course, if you know how to code and you know what a database does, you can do very sophisticated things in the digital realm, but that’s the requirement.

When Dan was syncing the film up, there was a sound recordist on the shoot, who was a young film student – maybe 24 years old. At the end of one session, he asked Dan confidentially, “Did you do this on every film?” [laugh] It was just incomprehensible to him that we had to do all this very physical work.

John Gregory Dunne wrote an article around 2004 about film craft. He said a director friend of his called the old way – which is what we were doing on the Moviola – “surgery without anesthetic.” That’s interesting, because how did you do it? Well, first of all, we had to do it. There was no other way. How did Michelangelo carve David? Did he sharpen his chisels after every ten bangs? How many assistants did he have? Just those really ephemeral things that were necessary to do well, which I don’t think we have any record of. So that was another reason to do the film. It was just to say, this is what you had to do.

I read Michael Rubin’s book Droidmaker about the start of Lucasfilm. You are heavily featured in it. He referred to a picnic event you hosted called the Droid Olympics, which was directly related to the film editing techniques we’ve been talking about. Please tell me a bit about that.

Droid Olympics, 1978 – Worst Hand Anyway

The first event was held in the summer of 1978. I was working on Apocalypse Now and had been for a year. I was editing on a KEM. Richie Marks and Jerry Greenberg were working on Moviolas. There was an army of assistants re-constituting the dailies after we had cut a scene. I would cut stuff out and hang it on a bin. Then at the end of the day everything would have to be put back together again on the dailies rolls. This was re-constituting the dailies, which was very tedious work.

Marcia Lucas speed splicing. (L-R) Duwayne Dunham, Richard Hymns, and Dale Strumpell look on.

Steve, my assistant, and I were working in the same room. He had an action figure called Stretch [Armstrong]. It was a rubbery creature who could reach across the room with arms that would stretch. He’d taken Stretch and manacled him with wire to the rewinds and put Stretch’s body inside the sync machine. Stretch was being stretched on the rack of the sync machine. I said, “Steve, what are you doing?” And he said, “Stretch has to suffer!” [laugh] It was a way of blowing off steam from all of the semi-mindless, but crucially exacting work. And so I thought, they need a break.

My wife, Aggie, had put on a horse show earlier that summer for the local kids, where they could do various horseback riding skills and get blue ribbons. So I thought, well, we’ll have one of those for a decathlon of skills in film. How fast can you splice? How quickly can you rewind a thousand feet? How accurately can you guess how many feet are in an arbitrary-sized reel of film? Those kinds of things.

Murch winning rack stacking

All of the Apocalypse Now editors and Lucasfilm editors were invited. I think The Black Stallion was editing at the time, so they were invited. I think anyone in the Bay Area working in film was invited and it was just a wonderful afternoon. We staged it again two more times – 1983, I think, and then also in 1987. By the time we thought about doing it again, everything had become digital.

There’s probably a digital equivalent of that. But, I guess it wouldn’t be as much fun physically.

No, it wouldn’t be as much fun to look at. There were all kinds of ridiculous things that happened. Carroll Ballard was rewinding and it got out of control and the loops went way up, probably six feet on either side of the sync machine. He didn’t know what to do and panicked. And then, suddenly the loops collapsed and the sync machine flew up into the air and the film got torn to shreds.

Those are the things that you wanted to see! Much more exciting than watching the beach-ball spin around.

______________________

This conversation continues in Part 3.

A Conversation with Walter Murch – Part 1

©2023 Oliver Peters

A Conversation with Walter Murch – Part 1

Her Name Was Moviola

Walter Murch is not one to sit around quietly in retirement. In recent years, he’s spent a lot of time living in London, working on a book and giving lectures, along with numerous film projects. I caught up with him via Zoom after his return from the 2022 Rome Film Festival last October. He was there for the screening of a multi-part series about the South African artist, William Kentridge. Murch is serving as the consulting editor for the South African production team with whom he is able to exchange Premiere Pro project files, complete with his markers, comments, and suggested edits.

Murch describes another production that’s been taking up his time as “a love letter to the Moviola.” Called Her Name Was Moviola, the documentary is about the history of the machine that defined film editing technology for three-fourths of the last century. Invented in the early 1920s, the Moviola has now reached its centennial. That’s where we’ll start this four-part discussion.

______________________

Walter, please tell me about the Moviola project.

Moviola was an idea that I cooked up fifteen years ago, I think. I pitched it to Norm Hollyn, who was head of post-production at USC. He was very interested, but then he died suddenly while in Japan a few years ago. It never really had a home at USC, because it didn’t get beyond just talking about it. The simple idea is to recreate an editing room from half a century ago, 1972, with what you would have – a Moviola, a rewind bench, a synchronizer, trim bins, an Acmade coding machine and pressure sensitive tape. Plus, something to cut.

We did find a home for it at the University of Hertfordshire, which has a cinema program. Howard Berry, the head of post-production there, put together crowdfunding through IndieGoGo [the IndieGoGo funding round has been closed]. We raised £30,000 and then also some money was kicked in from people like Dolby. So I think the money for it is somewhere on the order of £50,000.

Cutting room at the BBC Studios

We were able to recreate a cutting room at the BBC studios in Elstree. We convinced Mike Leigh, the British director, to give us the digital files from one of his films, which turned out to be Mr. Turner (2014). We took those files and reverse engineered them to print up 35mm film in 1.85:1 aspect ratio and the sound onto magnetic stripe film. That was probably the hardest thing to get, because nobody deals with mag stripe anymore. I don’t know where Howard got it all, but he was checking with post-production people here in London and combing through the catalog on eBay and eventually put together everything we needed.

I worked with my long-time British assistant, Dan Farrell, whom I met on Return to Oz some 40 years ago. We had about 70 minutes of dailies for two scenes from Mr. Turner and treated them just as you would have back then… The lab has delivered the dailies from yesterday. OK, start syncing them up. Dan and I were wired up with radio mics and it was being covered by two tripod cameras, plus an iPhone with Filmic Pro for extreme close-ups. Dan and I were giving what you might call ‘golf commentaries’ describing the process, and occasional reminiscences about disasters that had happened in the past.

Walter Murch and Dan Farrell

Next, these were synced up and screened. I took notes the way I used to on 3 x 5 cards. We printed Acmade numbers on the film and the sound to keep them in sync, using the British system. That’s a little more work than the American system. The American system starts at the beginning of the roll and runs sequential numbers all the way through. Here in England, you stop at the end of every take and reset the footage number to zero, corresponding to the clapstick. For example, if you are 45 feet from the clapstick, then there is a number that tells you that you are 45 feet into this take. You also have at the head of it what setup it is and what take it is.

Dan did all of that and broke it down into little roll-ups, put tags on them, filled out the logbook, and then turned it over to me. Then I started cutting the two scenes. I didn’t have a script. I just cut it using my intuition.

Did you have any script supervisor notes?

No. I was riding bareback. [laugh] To tell the truth, that’s what I usually do. I only refer to the script supervisor’s notes on a film if there’s really some head scratcher. Like, what were they thinking here? Usually – and it was certainly the case here – it’s very obvious how it might go together. 

I had seen Mr. Turner five months earlier. So, just based on the coverage, how would you cut this together? That took about two days. And as I was cutting it together, I gave a running commentary about what I was doing technically and also creatively – why was I making the cuts where I did and why did I choose the takes that I did. It wound up being about 750 feet long, three quarters of a reel.

We wanted to screen it in a theater with double-system sound, but that was impossible. There is no place in London where people have maintained a mag dubber that can sync up with a projector.

I would imagine that sort of post-production gear – if it still exists – probably isn’t in working condition.

Right. You know, it was amazing enough that we found somewhere to transfer the sound from digital files to mag. Howard found a place owned by a guy who’s into this stuff, maintains it, but mostly specializes in projectors. He had a mag recorder, but didn’t really know how it worked, so the two of them worked it out together and managed to get the transfer done. He’s probably the only person in London who still has that working machinery.

Walter Murch, Mike Leigh, Howard Berry

Anyway, we needed to screen it, because we were going to show it to Mike Leigh. Howard works frequently with the Stanley Kubrick estate. So we drove out there and they kindly let us use Stanley’s Steenbeck, which he bought for The Shining. It still had the original cardboard-core power transformers in the feet, which were dangerous and kept tripping the fuse. So as part of the production we also repaired the Kubrick Steenbeck, and it then worked perfectly!

Mike arrived and we screened the scenes for him. At the end his only comment was, “You used too much of the sailboat.” [laugh] In the dailies there was a shot of a sailboat anchored off the coast. They also had shots of William Turner – the actor Timothy Spall – looking out a window at the ocean. So I put those two things together. I asked, “Why did you shoot it then?” And he said, “The cameraman shot it just because it was there.” It was one of those shots.

I guess he figured it would be useful B-roll.

Exactly. It’s a seagull type of shot. Use, if necessary. I had used it three times at transition points – seagull points. Interestingly, he said, “If you use something three times, it means it’s important,” which is true. It means that at some later point, somebody from this boat is going to come ashore and murder people or find treasure or whatever.

After that, I recut the scene and used only one shot of the sailboat. Then we pulled out a laptop and looked at Mr. Turner streaming and discovered how those scenes had actually been cut in the film. Essentially they were very similar. It’s that old question – give the dailies to two different editors, what do you get? In Mike’s version, the dialogue scene starts on the master and then goes into closeup. And once he was in closeups, he just stayed in closeups. I didn’t do that. In the middle of the scene the conversation changed completely and then got into more serious stuff in the second half. I used that as a carriage return, so to speak, to the change of topic. I cut back out to the master and then went in again. It’s a minor thing and that was essentially the difference there.

The full documentary is now being cut together by Howard, who was the director of it. Since that part was shot digitally, it’s also being cut digitally. At some point I’ll get a call from him to look at the first assembly. That’s where it is at the moment.

______________________

This conversation continues in Part 2.

An abridged version of this interview also appears at Pro Video Coalition.

©2023 Oliver Peters