No matter the length of a career or the number of awards, any editor might look back at the films that passed them by and wonder what they might have done with them. That’s where we conclude this discussion.
Walter, in your career, are there any films that you didn’t edit, but wish you had?
I worked on K-19: The Widowmaker for Kathryn Bigelow. And in 2007 she sent me the script for The Hurt Locker, about the Iraq war. It made me realize that the last four films I had edited had been films about war, the latest one being Jarhead. I told her that I just wanted to take a break from editing war films. Of course, The Hurt Locker went on to win six Oscars – Best Picture, Best Original Screenplay, Best Director, Best Film Editing, Best Sound Editing, and Best Sound Mixing.
What would have happened if I had said yes to that? But, you also get into practical things. At the time of making that decision, I’d been away from home for a year editing Jarhead in Los Angeles and New York. This would have meant going into the Middle East or at least going to Los Angeles. But, the main thing was just I’d been thinking about war since 2000: Apocalypse Now Redux, war – K-19, war – Cold Mountain, war – Jarhead, war. Even The English Patient is kind of a war film. So turning down The Hurt Locker is the big What If? that comes to mind.
I did the recut of Touch of Evil. At that time, 1998, I was taken to the vaults in Los Angeles, where the material for The Other Side of the Wind was in storage. Gary Graver showed me some of the rough assemblies that had been put together by Welles himself, but I just didn’t want to work on that one. It looked very self-indulgent to me. The situation with Touch of Evil was very, very different.
The finished version of Wind seems like it’s an art film within the film and appears to be somewhat autobiographical. Although, I believe Welles denied that the director character (John Huston) was modeled after his own career.
Right. Touch of Evil was obviously a scripted film that was produced by Universal Studios in Hollywood – albeit, in a Wellesian manner – but it was very buttoned down, relatively speaking. And then Welles left us the 58-page memo, which made very specific suggestions for how the film should be recut. It was clear what he wanted done. I mean, he didn’t talk about frames – he would just say: this section needs to be shorter. Or: restore the script structure for the first four reels, cross-cutting between Janet Leigh’s story and Charlton Heston’s story. The studio put all the Heston story together and then all the Leigh story together. He wanted his original structure back. I don’t believe there was a guiding memo like that for The Other Side of the Wind.
Welles’ Touch of Evil memo is a wonderful document. It’s 58 pages by a genius filmmaker under duress, writing about his ideas and addressed to his enemies at the studio. It’s a masterclass in political diplomacy – getting his ideas across without accusation. It’s sad that he had to write it, but I’m happy we have it.
Walter Murch has led and continues to lead an interesting and eclectic filmmaking career. If you’ve enjoyed this 4-part series, there’s plenty more to be found in the books written by and about him. There are also many of his interviews and presentations available on the web.
SIGHT & SOUND: The Cinema of Walter Murch is a documentary created by Jon Lefkovitz. This video is assembled from various interviews and presentations by Murch discussing his take on filmmaking and editing. It’s illustrated with many film examples to highlight the concepts.
Walter Murch is not only known for his work and awards in the realm of picture editing, but he’s also made significant contributions to the art of film sound. That’s where we pick up in Part 3.
Walter, let’s switch gears and talk about sound and your work as a sound designer and re-recording mixer. There’s an origin story about the term ‘sound designer’ credited to union issues. That might be a good place to start.
Francis [Ford Coppola] tells the story, but he gets it wrong. [laugh] There were union problems, because I was in the San Francisco union. Many of the films were financially based in LA, so what am I doing working on that? On The Rain People, for instance, my credit is sound montage. I wasn’t called sound mixer, re-recording mixer, or sound editor. We were just trying to avoid blatantly stepping on toes. I had the same sound montage credit on The Conversation. Then on Apocalypse Now I was credited with sound re-recording, because it was an independent film. Francis basically was the financier of it. So, it was an independent film, partially supported by United Artists.
The sound design idea came up because of this new format, which we now call 5.1. Apocalypse Now was the first time I had ever worked in that format. It was the first big film that really used it in a creative way. The Superman film in 1978 had it technically, but I don’t think they used it much in a creative fashion.
As I recall, at that time there was the four channel surround format that was left, center, right, and a mono rear channel.
Star Wars, which came out in 1977, had that format with the ‘baby boom’ thing. 70mm prior to that would have five speakers behind the screen – left, left-center, center, right-center, and right. What they decided to do on Star Wars was to not have the left-center/right-center speakers. Those were only used for super low frequency enhancement. Then the surround was many speakers, but all of them wired to only one channel of information.
We didn’t use those intermediate speakers at all and brought in Meyer ‘super boom’ speakers. These went down to 20Hz, much lower than the Altec speakers in those days, which I think bottomed out at around 50Hz. And then we split the mono surround speakers into two channels – left and right. Basically what we call 5.1 today. We didn’t call it 5.1, we just called it six-track or a split-surround format.
This was a new format, but how do you use it creatively? I couldn’t copy other films that had done it, because there weren’t any. So I designed an entire workflow system for the whole film. Where would we really use 5.1 and where would we not? That was a key thing – not to fall into the trap that usually happens with new technology, which is to overuse it. Where do we really want to have 5.1? In between that, let’s just use stereo and even some long sections where it’s just mono – when Willard is looking at the Kurtz dossier, for instance.
Willard’s narration section, when he’s in the fo’c’sle of the boat at night reading the dossier, it’s just mono – it’s just his voice and that’s it. As things open up, approaching Hau Phat (the Playboy Bunny concert) it becomes stereo, and then as it really opens up, with the show beginning, it becomes full six-track, 5.1. Then it collapses back down to mono again the next morning. That’s the design element. So I thought, that’s the unique job that I did on the film – sound design – designing where we would fully use this new format and how we would use it.
In addition, of course, I’d cut some of the sound effects, but not anywhere near most of them, because we had a small army of sound effects editors working under Sound Effects Supervisor Richard Cirincione. However, I was the main person responsible. Francis at one point had a meeting and he said, “Any questions about sound? Walter’s the guy. Don’t ask me, ask Walter.” So I effectively became the director of sound. And then, of course, I was the lead re-recording mixer on the film.
Some readers might not be familiar with how film mixing works and why there are teams. Please go into that more.
Previously, in Hollywood, there would usually be three people – D, M, and E – sitting at the board. That is how The Godfather was mixed. If you are facing the screen from behind the console, then D [dialogue mixer] is on the left, M [music mixer] in the middle, and E [sound effects mixer] on the right. On the other hand, in San Francisco I had been the solo re-recording mixer on Rain People, THX-1138, and The Conversation.
As soon as you have automation you don’t need as many people, because the automation provides extra fingers. We had very basic automation on Apocalypse Now. Only the faders were automated, but none of the equalization, sends, echo, reverb, or anything else. So we had to keep lots of notes about settings. The automation did at least control the levels of each of the faders.
Of course, these days a single person can mix large projects completely ‘in the box’ using mainly a DAW. I would imagine mixing for music and mixing for film and television are going to use many of the same tools.
The big difference is that in the old days – and I’m thinking of The Godfather – we had very limited ability to hear the edited soundtracks together before we got to the mix. You had no way to set their levels relative to each other until you got to the mix. So the mix was really the creation of the soundtrack from the ground up.
Thinking of the way I work now or the way Skip Lievsay works with the Coen brothers, he will create the sound for a section in Pro Tools and build it up. Then he’ll send the brothers a five-track or a three-track and they just bring it into the audio tracks of the Premiere timeline. So they’re editing the film with his full soundtrack. There are no surprises in the final mix. You don’t have to create anything. The final mix is when you hear it all together and put ‘holy water’ on it and say, that’s it – or not. Now that you’ve slept on it overnight, let’s reduce the bells of the cows by 3dB. You make little changes, but it’s not this full-on assault of everything. As I said earlier, bareback – where it’s just taking the raw elements and putting them together for the first time in the mix. The final mix now is largely a certification of things that you have already been very familiar with for some time.
What was the experience like to go back in time, so to speak – working with a Moviola again?
I hadn’t cut any dailies on a Moviola since 1977, 45 years ago. Dan had not done anything on a Moviola since 1994. But it all came back instantly. There was not the slightest hesitation about what any of this stuff was or how we made it work. Interestingly, that’s very different from my experience with digital platforms.
In what way?
Let’s say I’m cutting a film using Avid and finish the work. Then three months later I get another job using the same Avid. In those three months the muscle memory of my fingers has somewhat evaporated. I have to ask my assistant questions similar to, “How do I tie my shoelaces?” [laugh] Of course, it comes back – it takes about three or four days to get rid of the rust. Then in about a week I’m fully back.
So that’s an interesting neurological question. Why does editing on the Moviola not have the slightest evaporation in 45 years, whereas editing on a digital platform that you are very familiar with starts to evaporate if you are away from it for a few months? I think it’s because every skill in Moviola editing is a completely different set of physical muscular moves: splicing is different from rewinding is different from braking is different from threading up the Moviola, etc. etc. And each of them makes a different sound. Whereas the difference between ‘splicing’ and ‘rewinding’ in digital editing is simply a different keystroke.
In our emails we had talked a little bit about the differences between an upright Moviola and flatbeds like KEM and Steenbeck. Would you expand upon that a bit?
Ironically, the outliers in this are the flatbeds. In a sense, both the Moviola and nonlinear digital are random access machines. With the Moviola, everything is broken down into individual shots, which are rolled up and put into boxes. There might be two or three or sometimes six or seven shots in a box. When you want to see a shot, you ask your assistant, “Can you give me 357, take two?” That’s kind of what happens digitally, too, except you are making the selection by typing or mouse-clicking. Digital is much more random access: you can select internally within the shot.
A KEM or a Steenbeck on the other hand is linear. Everything is kept in the dailies rolls as they came from the lab. If you want to see a particular shot, you have to find it in its ten minute dailies roll. What I would do is thread it through the sprocketed prism, without going through the drive sprockets. Then I’d just spool down at very high speed with my hands on both the take up and the feed and watch for 30 seconds or so while it was winding until I got to the shot.
Next, I would put it on the screening-side screen and lock it into the sprocketed motor drives to figure out a place to edit it into the cut. On the KEM the picture module would be on my left and the sound for that on my right. The center would be what’s coming into the film. That’s my way of working, but everyone has a different way of working.
When George [Lucas] cut on the Steenbeck, he was using a one-screen Steenbeck, so that option of having two screens was not available to him. And so he would make select rolls. He would go through the dailies roll, cut out the good bits and then either hang them on hooks or build them into a separate selects roll. For philosophical reasons, I don’t like working that way, but it’s certainly a valid way of working.
I liked the ‘dailies roll’ method, because as I would be hi-speed scrolling for the shot I wanted, I often would find what I needed instead. As I would be spooling, I would glimpse alternate takes, things that I had initially rejected, which proved to be valuable, because as the film evolves, maybe they would now be helpful. Even when editing digitally, I still also construct what I call ‘KEM rolls’ of everything shot for a scene strung together.
We humans have this fascination with vintage analog gear, whether it’s film or audio. Is it just that touch of nostalgia or something different?
The basic idea behind the Moviola film is that if we don’t do it now it will disappear from history. It was difficult enough now in 2022, which coincidentally is the 100th birthday of the Moviola. The first one was built in 1922. If we don’t do it now, it will become exponentially more difficult.
Not the machine itself. I think they’ll always hang around, because they’re iconic, like ancient sculptures. But ancillary equipment like mag film was really hard to get. Also hard was the pressure-sensitive thermal tape for the Acmade printer. We eventually found it from Pixar in California. But if we hadn’t found those tapes, we couldn’t have made the film. It was as simple as that. Without that specific tape, all of this inverted pyramid would have just collapsed.
Every film in Hollywood from about 1925 until 1968, let’s say, was cut on a Moviola. Before 1925 cutting was done without any machine. The editors were just cutting by hand, assembling shots together and then screening the assembly. So the screening room was in essence their Moviola. They would take notes during the screening and then go back to the bench to trim and transpose shots or whatever. Ultimately all of the classic films from Hollywood after about 1925 were forged on the Moviola. That was my experience standing all day for 12 hours a day at the Moviola, with all this [winding noise] stuff.
It felt like blacksmithing in comparison to what we do digitally. And obviously you are physically cutting the film. I used the Inviso film splicer on this, which was invented in the mid-1970s. So it’s a little bit of a cheat to call the room 1972, because I invented the Inviso in 1976. We also used my other invention, which was to make the hooks on the trim bin out of 1.75mm crochet hooks. These have a barb at the end, which prevented several pieces of film from falling off when you hung them on a hook.
I mean, this is inside baseball, but one of the fascinating things about analog editing is that it is open to physical tinkering. For instance: the Acmade numbers were not printing boldly enough for some reason. What to do? Howard’s solution was to wrap a piece of adhesive tape around the sprocket wheel, forcing the film closer to the print head. You could fix something, just like working on a car from 1956. If you had a problem with the carburetor, take a screwdriver and bang away at it. Right? Today with digital fuel injection or now with electric motors, it’s hopeless for an ordinary person to have any access to the engine. To a certain extent there’s a similarity with digital film editing. Of course, if you know how to code and you know what a database does, you can do very sophisticated things in the digital realm, but that’s the requirement.
When Dan was syncing the film up, there was a sound recordist on the shoot, who was a young film student – maybe 24 years old. At the end of one session, he asked Dan confidentially, “Did you do this on every film?” [laugh] It was just incomprehensible to him that we had to do all this very physical work.
John Gregory Dunne wrote an article around 2004 about film craft. He said a director friend of his called the old way, which is what we were doing on the Moviola, “surgery without anesthetic.” That’s interesting, because how did you do it? Well, first of all, we had to do it. There was no other way. How did Michelangelo carve David? Did he sharpen his chisels after every ten bangs? How many assistants did he have? Just those really ephemeral things that were necessary to do well, which I don’t think we have any record of. So that was another reason to do the film. It was just to say, this is what you had to do.
I read Michael Rubin’s book Droidmaker about the start of Lucasfilm. You are heavily featured in it. He referred to a picnic event you hosted called the Droid Olympics, which was directly related to the film editing techniques we’ve been talking about. Please tell me a bit about that.
The first event was held in the summer of 1978. I was working on Apocalypse Now and had been for a year. I was editing on a KEM. Richie Marks and Jerry Greenberg were working on Moviolas. There was an army of assistants re-constituting the dailies after we had cut a scene. I would cut stuff out and hang it in the trim bin. Then at the end of the day everything would have to be put back together again on the dailies rolls. This was re-constituting the dailies, which was very tedious work.
Steve, my assistant, and I were working in the same room. He had an action figure called Stretch [Armstrong]. It was a rubbery creature who could reach across the room with arms that would stretch. He’d taken Stretch and manacled him with wire to the rewinds and put Stretch’s body inside the sync machine. Stretch was being stretched on the rack of the sync machine. I said, “Steve, what are you doing?” And he said, “Stretch has to suffer!” [laugh] It was a way of blowing off steam from all of the semi-mindless, but crucially exacting work. And so I thought, they need a break.
My wife, Aggie, had put on a horse show earlier that summer for the local kids where they could do various horseback riding skills and get blue ribbons. So I thought, well, we’ll have one of those for a decathlon of skills in film. How fast can you splice? How quickly can you rewind a thousand feet? How accurately can you guess how many feet are in an arbitrary-sized reel of film? Those kinds of things.
All of the Apocalypse Now editors and Lucasfilm editors were invited. I think The Black Stallion was editing at the time, so they were invited. I think anyone in the Bay Area working in film was invited and it was just a wonderful afternoon. We staged it again two more times – 1983, I think, and then also in 1987. By the time we thought about doing it again, everything had become digital.
There’s probably a digital equivalent of that. But, I guess it wouldn’t be as much fun physically.
No, it wouldn’t be as much fun to look at. There were all kinds of ridiculous things that happened. Carroll Ballard was rewinding and it got out of control and the loops went way up, probably six feet on either side of the sync machine. He didn’t know what to do and panicked. And then, suddenly the loops collapsed and the sync machine flew up into the air and the film got torn to shreds.
Those are the things that you wanted to see! Much more exciting than watching the beach-ball spin around.
Walter Murch is not one to sit around quietly in retirement. In recent years, he’s spent a lot of time living in London, working on a book and giving lectures, along with numerous film projects. I caught up with him via Zoom after his return from the 2022 Rome Film Festival last October. He was there for the screening of a multi-part series about the South African artist, William Kentridge. Murch is serving as the consulting editor for the South African production team with whom he is able to exchange Premiere Pro project files, complete with his markers, comments, and suggested edits.
Murch describes another production that’s been taking up his time as “a love letter to the Moviola.” Called Her Name Was Moviola, the documentary is about the history of the machine that defined film editing technology for three-fourths of the last century. Invented in the early 1920s, the Moviola has just reached its centennial. That’s where we’ll start this four-part discussion.
Walter, please tell me about the Moviola project.
Moviola was an idea that I cooked up fifteen years ago, I think. I pitched it to Norm Hollyn, who was head of post-production at USC. He was very interested, but then he died suddenly while in Japan a few years ago. It never really had a home at USC, because it didn’t get beyond just talking about it. The simple idea is to recreate an editing room from half a century ago, 1972, with what you would have – a Moviola, a rewind bench, a synchronizer, trim bins, an Acmade coding machine and pressure sensitive tape. Plus, something to cut.
We were able to recreate a cutting room at the BBC studios in Elstree. We convinced Mike Leigh, the British director, to give us the digital files from one of his films, which turned out to be Mr. Turner (2014). We took those files and reverse engineered them to print up 35mm film in 1.85:1 aspect ratio and the sound onto magnetic stripe film. That was probably the hardest thing to get, because nobody deals with mag stripe anymore. I don’t know where Howard got it all, but he was checking with post-production people here in London and combing through the catalog on eBay and eventually put together everything we needed.
I worked with my long-time British assistant, Dan Farrell, whom I met on Return to Oz 40 years ago. We had about 70 minutes of dailies for two scenes from Mr. Turner and treated them just as you would have back then… The lab has delivered the dailies from yesterday. OK, start syncing them up. Dan and I were wired up with radio mics and it was being covered by two tripod cameras, plus an iPhone with Filmic Pro for extreme close-ups. Dan and I were giving what you might call ‘golf commentaries’ describing the process. And occasional reminiscences about disasters that had happened in the past.
Next, these were synced up and screened. I took notes the way I used to on 3 x 5 cards. We printed Acmade numbers on the film and the sound to keep it in sync using the British system. That’s a little more work than the American system. The American system starts at the beginning of the roll and runs sequential numbers all the way through. Here in England, you stop at the end of every take and reset the footage number to zero, corresponding to the clapstick. For example, if you are 45 feet from the clapstick, then there is a number that tells you that you are 45 feet into this take. You also have at the head of it what set up it is and what take it is.
Dan did all of that and broke it down into little roll-ups, put tags on them, filled out the logbook, and then turned it over to me. Then I started cutting the two scenes. I didn’t have a script. I just cut it using my intuition.
Did you have any script supervisor notes?
No. I was riding bareback. [laugh] To tell the truth, that’s what I usually do. I only refer to the script supervisor’s notes on a film if there’s really some head scratcher. Like, what were they thinking here? Usually – and it was certainly the case here – it’s very obvious how it might go together.
I had seen Mr. Turner five months earlier. So, just based on the coverage, how would you cut this together? That took about two days. And as I was cutting it together, I gave a running commentary about what I was doing technically and also creatively – why was I making the cuts where I did and why did I choose the takes that I did. It wound up being about 750 feet long, three quarters of a reel.
We wanted to screen it in a theater with double-system sound, but that was impossible. There is no place in London where people have maintained a mag dubber that can sync up with a projector.
I would imagine that sort of post-production gear – if it still exists – probably isn’t in working condition.
Right. You know, it was amazing enough that we found somewhere to transfer the sound from digital files to mag. Howard found a place owned by a guy who’s into this stuff, maintains it, but mostly specializes in projectors. He had a mag recorder, but didn’t really know how it worked, so the two of them worked it out together and managed to get the transfer done. He’s probably the only person in London who still has that working machinery.
Anyway, we needed to screen it, because we were going to show it to Mike Leigh. Howard works frequently with the Stanley Kubrick estate. So we drove out there and they kindly let us use Stanley’s Steenbeck, which he bought for The Shining. It still had the original cardboard-core power transformers in the feet, which had become dangerous and were tripping the fuse. So as part of the production we also repaired the Kubrick Steenbeck and it then worked perfectly!
Mike arrived and we screened the scenes for him. At the end his only comment was, “You used too much of the sailboat.” [laugh] In the dailies there was a shot of a sailboat anchored off the coast. They also had shots of William Turner – the actor Timothy Spall – looking out a window at the ocean. So I put those two things together. I asked, “Why did you shoot it then?” And he said, “The cameraman shot it just because it was there.” It was one of those shots.
I guess he figured it would be useful B-roll.
Exactly. It’s a seagull type of shot. Use, if necessary. I had used it three times at transition points – seagull points. Interestingly, he said, “If you use something three times, it means it’s important,” which is true. It means that at some later point, somebody from this boat is going to come ashore and murder people or find treasure or whatever.
After that, I recut the scene and used only one shot of the sailboat. Then we pulled out a laptop and looked at Mr. Turner streaming and discovered how those scenes had actually been cut in the film. Essentially they were very similar. It’s that old question – give the dailies to two different editors, what do you get? In Mike’s version, the dialogue scene starts on the master and then goes into closeup. And once he was in closeups, he just stayed in closeups. I didn’t do that. In the middle of the scene the conversation changed completely and got into more serious stuff in the second half. I used that change of topic as a carriage return, so to speak: I cut back out to the master and then went in again. It’s a minor thing and that was essentially the difference there.
The full documentary is now being cut together by Howard, who was the director of it. Since that part was shot digitally, it’s also being cut digitally. At some point I’ll get a call from him to look at the first assembly. That’s where it is at the moment.
Editors often think of the clip within the edit application’s browser as the media file. But that clip is only a facsimile of the actual media. It links to potentially three different assets on the hard drive – the original camera (or sound) file, optimized media, and/or proxy media.
Optimized media. You may decide to create optimized media when the original media’s codec or file format is too taxing on your system. For example, you might convert a media file made up of an image sequence into an optimized movie file using one of the ProRes or DNx codecs. When you create optimized media, that is often the media used for finishing instead of the original camera media. For the sake of simplicity I’ll refer to original media from here on, but understand that it could be optimized media or original camera files.
Proxy media. There are many reasons for creating proxy media – portability, system performance, remote editing, etc. Proxy media is usually lightweight, more highly compressed, and of a lower resolution than the original media. Nearly all editing applications enable users to edit with lightweight proxy media in lieu of heavier, native camera files. When proxy media has been created, then the media clip in the NLE’s browser can actually link to both the original camera file, as well as the proxy media file. Software “toggles” in the application can seamlessly swap the link from one type of media file to the other.
The NLEs that offer proxy editing workflows integrate routines to transcode and automatically switch the links between proxy and original camera files on the hard drive. DaVinci Resolve 18 is the newest in this group with the addition of the Blackmagic Proxy Generator application. However, that tool only works with Resolve Studio 18 downloaded from Blackmagic Design’s website. The Generator is an addition to Resolve 18 and augments the built-in transcoding tools. In either case, you don’t have to use the built-in routines or the Blackmagic Proxy Generator. You can encode proxies using different software and even different computers. Then you can attach those proxies to the clips in the editing application at a later time.
Creating external proxy media
Proxies can be created with any encoding software. I like Apple Compressor, which includes a category of presets specifically designed for proxy media generation. The presets can be modified according to your needs. For instance, you can add a LUT and effects, like a timecode overlay. This makes it easy to know when you are toggled to the original or the proxy media within the NLE.
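Compressor is one option; as an illustrative alternative, the same encode can be scripted with a command-line tool like ffmpeg. The sketch below is an assumption, not a recipe from this article – it only builds the argument list for a half-size ProRes Proxy encode (the `proxy_ffmpeg_cmd` helper is hypothetical; the ffmpeg flags shown are standard, but verify them against your ffmpeg build):

```python
from pathlib import Path

def proxy_ffmpeg_cmd(original: Path, proxy_dir: Path) -> list[str]:
    """Build an ffmpeg argument list for a half-size ProRes Proxy encode.

    The file name stem is kept identical to the original, since name
    matching is one of the things the NLE relies on when relinking later.
    """
    out = proxy_dir / (original.stem + ".mov")
    return [
        "ffmpeg",
        "-i", str(original),
        "-vf", "scale=iw/2:ih/2",   # half-size, e.g. 4K/UHD down to HD
        "-c:v", "prores_ks",        # ffmpeg's ProRes encoder
        "-profile:v", "0",          # profile 0 = ProRes Proxy
        "-c:a", "copy",             # pass the audio through untouched
        str(out),
    ]

cmd = proxy_ffmpeg_cmd(Path("A001/Clip_0001.mxf"), Path("A001/Proxy"))
```

Wrapping the command construction in a function like this makes it easy to loop over an entire folder of camera originals.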
Before creating any proxy files, make sure that your original files all have unique file names. Rename any duplicates or those with generic file names, like Clip001, Clip002, etc. There are several key parameters needed for successful relinking between original and proxy media. These include matching names, frame rates, timecode, lengths, and audio channel configurations. Some applications let you force a relink when some of these items don’t match, but it will usually be one file at a time.
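Because relinking matches on file name, a quick pre-flight check for duplicate names can save trouble later. A minimal sketch in Python (the `find_duplicate_names` helper is hypothetical, not part of any NLE):

```python
from collections import Counter
from pathlib import Path

def find_duplicate_names(files):
    """Return file-name stems that occur more than once across all rolls.

    Proxy relinking matches on name, so every original clip should have
    a unique stem before any proxies are generated.
    """
    stems = Counter(Path(f).stem for f in files)
    return {stem for stem, count in stems.items() if count > 1}

clips = ["A001/Clip001.mxf", "A002/Clip001.mxf", "A002/Clip002.mxf"]
print(find_duplicate_names(clips))  # {'Clip001'} – rename these first
```

Run a check like this over every camera roll folder before transcoding; renaming one clip now is far cheaper than forcing relinks one file at a time later.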
Frame sizes can be smaller, since that’s an aspect of any proxy workflow. For example, you might start with 4K/UHD original media, but create half-size HD proxies. The embedded metadata in the proxy file informs the NLE so that the correct size is maintained when switching between the two. Likewise, the codecs do not need to match. You can have 4K/UHD ProRes HQ originals and HD H.264 proxy media (I prefer ProRes Proxy). The point is to have proxy media with smaller file sizes, which play back more efficiently on your computer.
When you transcode proxy media files in Compressor or any other encoding application, it’s best to render them into a folder specifically called Proxy. This can be anywhere you like, but it’s best to have it near your original camera files. If you have multiple camera file folders – organized by camera roll, day, camera model, etc. – then there are two options. You can either have one single Proxy folder for all renders or have a separate subfolder called Proxy within each camera roll folder.
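Those two folder layouts can be expressed as a simple path rule. A sketch, assuming the per-roll option places a Proxy subfolder next to the clips in each roll folder (the `proxy_destination` helper is hypothetical):

```python
from pathlib import Path
from typing import Optional

def proxy_destination(original: Path, unified: Optional[Path] = None) -> Path:
    """Compute where a proxy render should land for a given original clip.

    With `unified` set, every proxy goes into that one folder; otherwise
    each proxy goes into a Proxy subfolder beside its own camera roll.
    """
    name = original.stem + ".mov"
    if unified is not None:
        return unified / name
    return original.parent / "Proxy" / name

print(proxy_destination(Path("Day01/A001/clip_07.mxf")))
# Day01/A001/Proxy/clip_07.mov
print(proxy_destination(Path("Day01/A001/clip_07.mxf"), Path("Proxies")))
# Proxies/clip_07.mov
```

Either rule works; the per-roll layout is worth knowing because, as described below for Resolve, some applications auto-link proxies found in exactly that location.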
Dealing with externally-created proxies in different editing applications
Final Cut Pro – There is a setting to switch between Proxy Preferred and Original/Optimized. When you create external proxies, highlight the original camera clips and relink to the proxy media in the Proxy folder(s). Once proxies have been linked, then you can seamlessly switch between the two types of media.
Premiere Pro – There is a similar toggle button accessible in the timeline tools panel. The linking steps are similar to Final Cut Pro. Highlight the originals and then Attach Proxies. Navigate to the Proxy folder(s) and attach that media. The toggle button lets you switch back and forth between media types.
DaVinci Resolve Studio 18 – This update changed the proxy workflow and added the Generator application. You can still use the older proxy generation method. If so, then set the encoding parameters and location in your project settings. If you encode using the Blackmagic Proxy Generator app or an external application, then it’s a different process. The advantage to using Blackmagic Proxy Generator is that you can set up watch folders for automatic encoding.
The default location when using the Blackmagic Proxy Generator app or Resolve’s internal routine places a Proxy subfolder inside the folder of each roll of original media. When that condition exists, original clips added into the Media page automatically include links to both the original and the proxy media. In fact, the Proxy subfolders don’t even show up in Resolve’s browser when searching for media. When both types of media are present, the clip icons in Resolve reflect that duality.
When you transcode externally with Compressor or another app, then media placed into individual Proxy subfolders will also automatically link inside Resolve. However, if you render to a single, unified Proxy folder, then you’ll need to manually relink the proxy files to the originals in the Media page. Like the other two NLEs, you can do this as a batch function by navigating to the Proxy folder.
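Conceptually, that batch relink is a name-based match between the two sets of files. A sketch of the pairing logic (a hypothetical illustration, not Resolve’s actual code):

```python
from pathlib import Path

def match_proxies(originals, proxy_files):
    """Pair each original clip with the proxy that shares its name stem.

    A value of None means no proxy was found for that original, so that
    clip would remain linked to its camera media only.
    """
    by_stem = {Path(p).stem: p for p in proxy_files}
    return {o: by_stem.get(Path(o).stem) for o in originals}

pairs = match_proxies(
    ["A001/shot01.mxf", "A001/shot02.mxf"],
    ["Proxy/shot01.mov"],
)
print(pairs)
# {'A001/shot01.mxf': 'Proxy/shot01.mov', 'A001/shot02.mxf': None}
```

This is also why the unique-file-name rule matters: if two originals share a stem, a name-based match like this becomes ambiguous.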
I hope these pointers will be a useful guide the next time you decide to use a proxy media workflow.