DaVinci Resolve Editor Keyboard

Blackmagic Design doubled down on advanced editing features in 2019 by introducing a new editing mode to DaVinci Resolve 16 called the cut page. They also added a dedicated editor’s keyboard – something that warms the heart of any editor who started their career in a linear edit suite. After some post-NAB feedback and adjustment, the keyboard is finally ready for prime time, running with DaVinci Resolve 16.1 (currently in public beta) or later.

Blackmagic Design’s Grant Petty comes from a broadcast engineering background and knows how fast tape editing was with the right controller. Speed is lost using a mouse-centric, drag-and-drop approach, so the DaVinci Resolve keyboard is designed to put speed back into modern edit workflows. Blackmagic Design was kind enough to loan me a keyboard for a couple of weeks of testing for this review.

Hardware design

The keyboard is very reminiscent of Sony’s BVE keyboards of the past. That’s not simply cosmetic – there are a number of plastic editing keyboards with a shuttle knob – it’s about precision engineering. The DaVinci Resolve search dial (jog/shuttle/scroll wheel) truly feels like it has the same type of ballistics and tactile feedback that a Sony dial gave you. The DaVinci Resolve keyboard is built into a sturdy metal case with keycaps that are designed to take some pounding. They intend for the keyboard to last and will offer replacement parts as needed. In short, don’t think of this as a product you’ll have to toss out in a few years.

The keyboard connects via USB-C, but it also worked on the USB 3.0 connection of a two-year-old iMac and MacBook Pro by using a USB-A to USB-C cable. The back of the keyboard includes two additional USB-A ports for a thumb drive, mouse, or a DaVinci Resolve license key (“dongle”). The keyboard is wider than a standard extended keyboard due to dedicated edit keys on the left and the search dial on the right. It has a replaceable wrist rest on the front edge and adjustable feet to elevate the keyboard angle.

The Cut Page

The Editor Keyboard is optimized for the cut and edit pages. It does work as a standard keyboard in the color, Fairlight, and Fusion pages. However, I found the dial operation in those modes to be rather finicky. Outside of DaVinci Resolve, it’s a generic QWERTY keyboard, but the special edit keys and dial will not work with other editing software.

It’s hard to talk about the keyboard without delving into the cut page. While the keyboard works effectively and correctly in the edit page, you’ll still find yourself needing the mouse, which defeats the purpose. In short, the design motivation is fast editing where your hands never leave the keyboard. That ideal plays out best in the cut page and the two have been developed in tandem.

While the DaVinci Resolve cut page shares many similarities with Apple’s Final Cut Pro X, Blackmagic Design software engineers added a number of unique functions that improve editing speed. The best of these is the source tape view. The bin can be sorted by timecode, camera, duration, or name order using dedicated keys and then viewed as if it were a single source – essentially a virtual string-out. Quickly scroll through the footage using the search dial as effortlessly as using the FCPX skimming function. Large, dedicated buttons for source and timeline, in and out, and sort methods make for easy navigation and quick assembly. Smart edit and special function buttons, such as the unique “close-up” button (which automatically does a basic punch-in of high-res footage), round out the picture.

The cut page itself has a number of other unique features that are beyond the scope of this article. Nevertheless, one unique tool that is worth mentioning is the dual timeline view. The timeline pane is divided into a top mini-display of the full timeline, while the lower area always shows the zoomed-in section of the timeline at the current time indicator (cursor). You never have to zoom in and zoom out to navigate your timeline. The search dial makes it a breeze to quickly scroll through the full timeline (top) and then hit the jog key to zero in on the frame you want (bottom).

Trimming is where the dial shines. Dedicated keys quickly select in-point, out-point, roll, slip, or slide trimming. Simply hit the key and DaVinci Resolve automatically jumps to the nearest cut point. Then use the search dial for the rest. As you adjust the head or tail of a cut the rest of the timeline ripples accordingly. It’s one of the best trim models of any NLE.

Some additional thoughts

I do have a few quibbles. Trim functions in the cut and edit pages are inconsistent with each other. The cut page uses a similar model to FCPX, where audio and video from the clip are combined into a single timeline clip rather than on separate tracks. Unfortunately, Blackmagic Design has yet to implement a way to expand a/v clips and perform L-cut or J-cut trimming on the cut page. You’ll have to shift to the edit page to perform those.

This is a right-handed device, so left-handed editors will have the same dilemma that left-handed guitar players encounter. In addition, these are imprinted keycaps based on DaVinci Resolve’s default keyboard map. If you use a custom layout or one of the other keyboard maps that DaVinci Resolve offers, then the QWERTY command portion of the keyboard becomes less useful.

The search dial will not override the J-K-L or the space bar play commands. In order to jog once the sequence is playing, you must first hit the K key or the space bar to stop playback before you can properly jog through frames. Otherwise, playback continues the minute you let go of the dial.

Conclusion

This keyboard is addictive. But, is its $995 (USD) price tag justified? That’s steep, but many plastic gaming keyboards can run up to $200 and some even $500. That’s without any extra pointers, dials, or keys. I’ve also found precision metal keyboards with force-sensitive pointers as high as $3,000. Given that, Blackmagic Design may be in the right ballpark. Just like control surfaces for grading or mixing, this keyboard isn’t for everyone. If you are already a fast, keyboard-oriented editor, then the DaVinci Resolve Editor Keyboard may not make you faster. Likewise, a Final Cut Pro X editor who flies by skimming with a mouse is also going to have a hard time justifying the expense, not to mention a shift to a different application.

This keyboard is designed for DaVinci Resolve editors and not colorists. It’s for facilities that intend to deploy DaVinci Resolve as their full-time editing application. I could easily see DaVinci Resolve and this keyboard used in a fast turnaround edit environment, like broadcast news. Under that scenario, it will certainly enhance speed and workflow, especially for editors who want to make the most out of the new cut page.

Originally written for RedShark News.

Be sure to also check out Scott Simmons’ review at ProVideoCoalition.

©2019 Oliver Peters

Foolproof Relinking Strategy

Prior to file-based camera capture, film and then videotape were the dominant visual acquisition technologies. To accommodate them, post-production adopted a two-stage solution: work print editing plus negative conform for film, and offline/online editing for video. During the linear editing era, high-res media on tape was transferred to a low-res tape format, like 3/4″, for creative editing (offline). The locked cut was assembled and enhanced with effects and graphics in a high-end online suite using an edit decision list and the high-res media. The inherent constraints of tape formats forced consistency in media standards and frame rates.

In the early nonlinear days, storage capacities were low and hard drives expensive, so this offline/online methodology persisted. Eventually storage could cost-effectively handle high-res media, but this didn’t eliminate these workflows. File-based camera acquisition has brought down operating cost, but the proliferation of formats and ever-increasing resolutions have meant that there is still a need for such a two-stage approach. This is now generally referred to as proxy versus full-resolution editing. The reasons vary, but typically it’s a matter of storage size, system performance, or the capabilities of the systems and operator/artist running the finishing/full-res (aka “online”) system.

All of this requires moving media around among drives, systems, locations, and facilities, thus making correct list management essential. Whether or not it works well depends on the ability to accurately relink media with each of these moves. Despite the ability of most modern NLEs to freely mix and match formats, sizes, frame rates, etc., ignoring certain criteria will break media relinking. You must be able to relink the same media between systems or between low and high-res media on the same or different systems.

Criteria for successful relinking

– Unique file names that match between low and high-res media (extensions are usually not important).

– Proper timecode that does not repeat within a single clip.

– A single, standard frame rate that matches the project’s base frame rate. Using conform or interpret functions within an NLE to alter a clip’s frame rate will mess up relinking on another system. Constant speed changes (such as slomo at 50%) are generally OK, but speed ramp effects tend to be proprietary with every NLE and typically do not translate correctly between different edit or grading applications.

– Match audio configurations between low and high-res media. If your camera source has eight channels of audio, then so must the low-res proxy media.

– Match clip duration. High-res media and proxies must be of the exact same length.

– Note that what is not important is matching frame size or codec or movie wrapper type (extension).
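Because it’s easy to break one of these criteria without noticing, it can be worth scripting a quick audit before handing media off. Below is a minimal Python sketch of that idea – my own illustration, not part of any NLE – which compares matching master and proxy files using ffprobe (part of FFmpeg, which must be installed separately). The folder names and the duration tolerance are assumptions for the example; timecode checking is omitted, since where timecode is stored varies by format.

```python
# relink_audit.py -- sanity-check master/proxy pairs against the criteria above.
# A sketch only: assumes ffprobe (part of FFmpeg) is on the PATH, and that the
# "masters" and "proxies" folder names stand in for your own structure.
import json
import subprocess
from pathlib import Path

def probe(path):
    """Return ffprobe's JSON description of a media file."""
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-show_format", "-show_streams",
         "-of", "json", str(path)],
        capture_output=True, text=True, check=True)
    return json.loads(result.stdout)

def check_pair(master, proxy, tolerance=0.1):
    """Compare the attributes that must match for relinking to succeed."""
    m, p = probe(master), probe(proxy)
    problems = []
    # Clip durations must match (tolerance allows for container rounding).
    m_dur, p_dur = (float(x["format"]["duration"]) for x in (m, p))
    if abs(m_dur - p_dur) > tolerance:
        problems.append(f"duration mismatch: {m_dur:.2f}s vs {p_dur:.2f}s")
    # Frame rates must match -- no conform or interpret tricks.
    m_vid = next(s for s in m["streams"] if s["codec_type"] == "video")
    p_vid = next(s for s in p["streams"] if s["codec_type"] == "video")
    if m_vid["r_frame_rate"] != p_vid["r_frame_rate"]:
        problems.append(f"frame rate mismatch: "
                        f"{m_vid['r_frame_rate']} vs {p_vid['r_frame_rate']}")
    # Audio configuration must match, track for track.
    m_aud = [s["channels"] for s in m["streams"] if s["codec_type"] == "audio"]
    p_aud = [s["channels"] for s in p["streams"] if s["codec_type"] == "audio"]
    if m_aud != p_aud:
        problems.append(f"audio layout mismatch: {m_aud} vs {p_aud}")
    return problems

if __name__ == "__main__":
    masters, proxies = Path("masters"), Path("proxies")
    for master in sorted(masters.iterdir()):
        if master.name.startswith("."):
            continue  # skip hidden files like .DS_Store
        # File names must match between the two sets; extensions may differ.
        candidates = list(proxies.glob(master.stem + ".*"))
        if not candidates:
            print(f"{master.name}: no proxy with a matching name")
        else:
            for problem in check_pair(master, candidates[0]):
                print(f"{master.name}: {problem}")
```

Frame size and codec are deliberately not compared, since (per the last point above) they don’t need to match.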

Proxy workflows

Several NLE applications – particularly Final Cut Pro X and Premiere Pro – offer built-in proxy workflows, which automatically generate proxy media and let the editor seamlessly toggle between full-res and proxy files. These are nice as long as you don’t move files around between hard drives.

In the case of Premiere Pro, you can delete proxy files once you no longer need them. From that point on you are only working with full-res media. However, the Premiere project continues to expect the proxy files to be available and wants to locate them when you launch the project. You can, of course, ignore this prompt, but it’s hard to get rid of completely.

With FCPX, any time you move media and the Library file to another drive with a different volume name, FCPX presents a relink dialogue. It seems to relink master clips just fine, but not the proxy media that it generated IF stored outside of the Library package. The solution is to set your proxy location to be inside the Library. However, this will cause the Library file to bloat in size, making transfers of Library files between drives and editors that much more cumbersome. So for these and other reasons (like not adhering strictly to the criteria listed above), relinking can often range from problematic to impossible (Avid, I’m looking at you).

Instead of using the built-in proxy workflows for projects with extended timetables or huge amounts of media, I prefer an old-school method. Simply transcode everything, work with low-res media, and then relink to the master clips for finishing. Final Cut Pro X, Premiere Pro, and Resolve all allow the relinking of master clips to different media if the criteria match.

Here are five simple steps to make that foolproof, followed by a sample transcode sketch after the list.

1. Transcode all non-professional camera originals to a high-quality mastering codec for optimized performance on your systems. I’m talking about footage from DSLRs, GoPros, drones, smartphones, etc. On Macs this will tend to be the ProRes codec family. On PCs, I would recommend DNxHD/HR. Make sure file names are unique (rename if needed) and that there is proper timecode. Adjust frame rates in the transcode if needed. For example, 29.97fps recordings for a playback base rate of 23.98fps should be transcoded to play natively at 23.98fps. This new media will become your master files, so park the camera originals on the shelf with the intent of never needing them (but for safety, DO NOT erase).

2. Transcode all master clips (both pro formats like RED or ARRI, as well as those transcoded in step 1) to your proxy format. Typically this might be ProRes Proxy at a lower frame size, like 1280 x 720. (This is obviously an optional step. If your system has sufficient performance and you have enough available drive space, then you may be able to simply edit with your master source files.)

3. Edit with your proxy media.

4. When you are ready to finish, relink the locked cut to your master files – pro formats like RED and ARRI – and/or the high-res transcodes from step 1.

5. Color correct/grade and add any final effects for finish and delivery.
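To illustrate steps 1 and 2, here is a minimal batch-transcode sketch driving FFmpeg from Python. It is a starting point under stated assumptions, not a vendor-sanctioned recipe: the folder names are placeholders, prores_ks is FFmpeg’s ProRes encoder (profile 0 is Proxy, profile 3 is HQ, per FFmpeg’s documentation), and real jobs may need format-specific flags for timecode, color handling, or frame rate conforms.

```python
# make_proxies.py -- step 2: batch-transcode master clips to ProRes Proxy.
# A sketch only: assumes ffmpeg is on the PATH; folder names are placeholders.
import subprocess
from pathlib import Path

MASTERS = Path("masters")   # your high-res master clips (the step 1 output)
PROXIES = Path("proxies")   # destination for low-res editing media
PROXIES.mkdir(exist_ok=True)

for clip in sorted(MASTERS.glob("*.mov")):
    out = PROXIES / (clip.stem + ".mov")    # same unique file name, new folder
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-map", "0:v:0",              # the video stream
        "-map", "0:a?",               # every audio track -- layouts must match
        "-map_metadata", "0",         # carry over source metadata
        "-c:v", "prores_ks",          # FFmpeg's ProRes encoder
        "-profile:v", "0",            # profile 0 = ProRes Proxy
        "-vf", "scale=1280:720",      # a smaller frame size is fine to change
        "-c:a", "copy",               # pass audio through untouched
        str(out)], check=True)
```

Note that duration and frame rate are left untouched here, which keeps the proxies relink-safe. If a camera original needs its base rate changed (the 29.97fps to 23.98fps example in step 1), do that once when creating the master in step 1, so masters and proxies stay in lockstep.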

©2019 Oliver Peters

Rocketman

The last two years have been rich for film audiences interested in the lives of rock legends. Rocketman was this year’s stylized biography of Elton John. Helmed by British actor/director Dexter Fletcher and starring Taron Egerton of the Kingsman film series, Rocketman tells John’s life story through his songs. Astute film buffs also know that Fletcher was the uncredited, additional director who completed Bohemian Rhapsody through the end of principal photography and post, which will invite obvious comparisons between the two rock biopics.

Shepherding Rocketman through the cut was seasoned film editor Chris Dickens. With experience cutting comedies, dramas, and musicals, Dickens is impossible to pin down to any particular film genre. I had recently interviewed him for Mary Queen of Scots, which was a good place to pick up this conversation about editing Rocketman.

__________________________________________

[OP] Our last conversation was about Mary Queen of Scots. I presume you were in the middle of cutting Rocketman at that time. Those are two very different films, so what brought you to edit Rocketman?

[CD] I made a quick shift onto Rocketman after Mary Queen of Scots. It was a fast production with eight or nine months filming and editing. The project had been in the cards a year before and I had met with Dexter to discuss doing the film. But, it didn’t happen, so I had forgotten about it until it got greenlit. I like musicals and have done one before – Les Miserables. This one was more ambitious creatively. Right from the beginning I liked the treatment of it. Rocketman was a classic kind of musical, but it was different in that the themes were adult and had a strong visual sense. Also the treatment using Elton John’s songs and illustrating his life with those was interesting.

[OP] The director had a connection with both Rocketman and Bohemian Rhapsody. Both films are about rock legends, so audiences may draw an obvious comparison. What’s your feeling about the contrast between these two films?

[CD] Obviously, there are a lot of similarities. Both films are essentially rock biopics about a musical figure. Both Freddie and Elton were gay. So that theme is similar, but that’s where it ends. Bohemian Rhapsody was aimed at a wider audience, i.e. less adult material – sex and drug-taking – things like that. And secondly, it’s about music, but it’s not a musical. It’s always grounded in reality. Characters don’t get up and sing to the camera. It’s about Freddie Mercury and Queen and their music. So the treatment of it is very different. Another fundamental difference is that Elton John is still alive and Freddie Mercury is not, so that was right at the film’s core. From the start you know that, so it has a different kind of power.

[OP] Whenever a film deals with popular music – especially when the rights-owners are still alive and active – the treatment and use of that music can be a sticking point. Were Elton John or Bernie Taupin actively involved in the production of Rocketman?

[CD] Yes, they were. Bernie less so – mainly Elton. He didn’t come in the edit room that much, but his husband, David Furnish, was a bit more involved. Elton is not someone who goes out in public that much, except to perform. He’s such a massive star. But, he did watch cuts of the film and had notes – not at every stage – but, David Furnish was the conduit between us and him. Naturally, Elton sanctioned all of the music tracks that were used. But the film was not made by them, i.e. we were making the film and they were giving us notes.

[OP] How were the tracks handled? Was the music remixed from the original studio masters with Taron lip-syncing to Elton’s voice – or was it different?

[CD] The music was radically changed in some cases from the original – the arrangements, the scoring. The music was completely re-recorded and sung by Taron, the actor playing Elton. We evolved the choices made at the beginning during the edit. So alongside of the picture edit was a music edit and a music mix going on constantly. In some cases Taron was singing on-set and we used that for about a quarter of the tracks. These were going in and out of scenes that had natural dialogue. Taron would start singing and we would play the track underneath. Then at that point perhaps, he would start lip-syncing, so it was a combination. On some tracks he was completely lip-syncing to what he had recorded before. This set the tempo for those scenes, but the arrangements evolved during the edit.

Even when he was lip-syncing, it was to his own voice. The whole idea was that the singing would not be Elton, except at the end where we have a track with both singing in the credits roll. So it’s a key thing that these were new recordings. Giles Martin, son of the legendary George Martin, was the music producer who took care of everything and put up with our constant changes. We had a team of two music editors who worked alongside us and a score as well, written by Matt Margeson, which we were rolling into the film in places. It was a real team process of building the film slowly.

[OP] Please expand on the structure of a film musical and what it takes to edit one.

[CD] The editing process was challenging, because of the complex structure. It was fundamentally a musical, with fifteen or sixteen tracks – meaning songs or music numbers – that were initially planned to be shot. Some of these were choreographed song-and-dance sequences. Combined with that was a sort of kitchen sink drama about Elton’s life, his childhood, his teenage years, and then into manhood. And then becoming a superstar. The script has the songs and then long sequences of more classic storytelling. What I found – slowly, as we were putting the film together, even during the shoot – was that we needed to unify those two things within the edit.

For instance, the first song number in the movie is “The Bitch is Back.” It’s a dance sequence with Elton as a boy walking down the street while people are singing and dancing around him. Then his adult self is chasing him around. It’s a very stylized sequence, which then went into about an eight minute sequence of storytelling about his childhood. We needed to give the film the same tone all the way through, i.e. that slightly fantastical feel of a musical. We screened it a few times for some of the core people and it became clear that we wanted to go with the fantastical elements of the film, not the more down-to-earth, realistic elements. Obviously, you could have made the choice to cut back on the music, but that seemed counter-intuitive. So we had to make some deep cuts in the sections between the musical pieces to get the story to flow and have that same kind of tone.

There was also a flashback structure. The film starts with Elton later on as an adult in rehab, after having fallen into drug and alcohol addiction. We framed the film with this device, so it was another element that we had to make work in the edit to get it to feel as an organic part of the story. We found that we didn’t have enough of these rehab sequences and had to shoot a few more of them during the edit to knit the film together in this way in order to remind you that he was telling this story – looking back on his past.

Cutting back sections between the musical numbers wasn’t our only solution to get the right tone. We had to work out how to get in and out of the musical sequences and that’s where the score comes in. I played with this quite a lot with the composer and Giles to have themes from Elton’s songs coming throughout the film. For example, “Goodbye Yellow Brick Road” had some musical themes in it that we started using as the theme that went with his rehab. The theme of the film is that Elton lost any sense of where he came from as a person, because of his stardom, and “Goodbye Yellow Brick Road” – the song – is about that. It’s actually about going back to the farm and your roots. The song isn’t actually in the film until the very end when he performs it. So we found that using this musical theme as a motif throughout the film is very powerful and helped to combine the classical storytelling scenes with the musical scenes.

[OP] Was this process of figuring out the right balance something that happened at the beginning and then became a type of template for the rest of the film? Or was it a constant adjustment process throughout the cutting of Rocketman?

[CD] It was a constant thing trying to make the film work as a whole so people wouldn’t be confused about the tone. At one point we had far too much music and had to take some out. It became very minimal in some areas. In others, it led you more. It was about getting that balance right all the way through. I’m primarily a picture editor, but on this film you couldn’t just concentrate on the picture and then leave the music to the music editors and composer, because it was absolutely a fundamental part of the film. It was about music and so how you were using music was very key within the edit. Sometimes we had to cut longer songs down. Very few are at their original length. Some are half their recorded length.

[OP] This process sounds intriguing, since the scenes use a song as the underlying building block. Elton John’s songs tend to be pop songs – or at least they received a lot of radio airplay – so did those recorded lengths tend to drive the film?

[CD] No. At first I thought we’d have to be very faithful, but as we started cutting, the producers – and particularly Elton John’s side of it – didn’t care whether we cut things down or made them longer or added bits. They weren’t precious about it. In fact, they wanted us to be creative. The producers would say, “Don’t worry about cutting that down, Giles will deal with it.” Of course he would. Although sometimes he’d come back to me and say, “Look, this doesn’t quite work musically. You need to add a bit more time to this, or another couple of bars of music.” So we had a whole back-and-forth process like that.

For instance, in the track “Rocketman,” which is the film’s centerpiece, Elton tries to commit suicide. He’s at a party, gets drunk, and jumps into the swimming pool. While he’s underneath he starts seeing visions of himself as a child under there. He starts singing and gets fished out of the pool and then put on stage in a stadium. It’s a whole sequence that’s been planned to play like that. Of course, I couldn’t fit what they’d shot into the song – there wasn’t enough time. It was all good stuff, so I added a few bars. I’d give it to music and they’d say, “Oh, you can’t add that in that way.” So I’d go back and try different ways of doing it.

At the end, when he’s put back on stage at Dodger Stadium, he’s in a baseball uniform and then fires into the air like a rocket. They shot it in a studio without a big crowd and it looked okay. As soon as we started getting the visual effects, we thought, “Wow. This looks great.” So we doubled the length of that – added on, repeated the chorus, and all of that – because we thought people were going to love this. It looked and sounded great. But, when we then tested the film, it was way too long. It had just outstayed its welcome. We then had to cut it down again, although it was still longer than they’d originally planned it.

[OP] With a regular theatrical musical, the songs are written to tell the story. Here, you are using existing songs that weren’t written with that story in mind. I presume you have to be careful that you don’t end up with just a bunch of music videos strung back-to-back.

[CD] Exactly. I don’t think we ever strayed into that. It was always about – does it make its point? These songs were written at all times in his career, but we didn’t use them in their original chronological order. “Honky Cat” was written later than when we used it. He’s just getting successful and at the end of “Honky Cat” they are buying Rolls Royces and clothes and football teams. At the end of that there was a great song-and-dance routine with them dancing on a record – Elton and John Reid, his manager and also a kind of boyfriend. That part went on for two minutes and we ended up cutting it out. Partly because people and the producers who saw it thought it wasn’t the right style. It had a kind of 1920s or 1930s style with lots of dancers. It was a big number and took a long time to edit, but we took it out. I thought it was quite a nice sequence, but most people thought the film was better without it, because it wasn’t moving the story on.

[OP] Other than adjusting scenes and length, did friends-and-family and test audience screenings change your edit significantly?

[CD] We did three big screenings in Los Angeles, San Francisco, and Kansas City, plus a number of smaller ones in England. The audiences were a mix of people who were Elton John fans, as well as those that weren’t. Essentially people liked the film right from the start, but the audiences weren’t getting some parts, like the flashback structure with the rehab scenes – particularly at the beginning. They didn’t really understand what he was singing about.

That first song [“The Bitch Is Back”] caused a lot of difficulty, because it starts the film and says this is a musical. You have to handle that the right way. I think the initial problems were partly in how I had cut the sequence originally. I tried to show too much of the crowd around him and the dancers and I thought that was the way to go with it. Actually, what turned out to be the way to go was the relationship between the two of them – Elton and Elton as the little boy – because that’s what the song was about. I then readjusted the edits, taking out a lot of the wide shots.

Also Taron had done some improvised dialogue to the little boy rather than just singing all the way through – dialogue lines like, “Stop doing that.” That was in the film a long time, but people didn’t like it and didn’t understand why he was angry with the boy. So we cut that out completely. Another issue was that right at the start, the little boy starts singing to Taron as Elton first, but audiences did not feel comfortable with it. We discussed it a lot and decided that the lead actor should be the one we hear singing first. We did a reshoot of that beginning portion of the scene. You have to let the audience into it more slowly than we had originally done. That’s a prime example of how editing decisions can lead to additional filming to really make it work.

[OP] You mentioned visual effects to complete the “Rocketman” scene. Were there a lot of effects used to make the film period-accurate or just for visual style?

[CD] Quite a lot, though not excessively, like a comic book movie. I imagine it was similar to Bohemian Rhapsody, which had to shoot gigs and concerts and places where you couldn’t go now and film. But our visual effects weren’t as fundamental in that I didn’t need them to cut with. The boy underwater was all created, of course. Taron in the pool was actually him underwater, because he had breathing apparatus. But the little boy couldn’t, so he was singing ‘dry for wet’ – shot in the studio and put into the scene later. There were different evolutions of that scene. In one version we took the boy out completely and just had Taron singing.

The end of the film as written was going to be a re-imagined version of Elton John’s “I’m Still Standing” music video, which is on the beach in Cannes, shot in the 80s. The idea was to go there and shoot it with a lot more dancers. By the time the film was being shot, the weather changed and we couldn’t shoot that sequence. That whole ending was shot later, partly in a studio. Because we couldn’t afford to go to Cannes and reshoot the whole thing, someone was able to get the original rushes from that music video, which had been shot on 16mm film, but edited on videotape. We had to get permission from the original director of that music video and he was very happy for us to do it. We had the 16mm film rescanned and also removed the grain. Instead of Elton, we put Taron into it.  In every shot with Elton, we replaced his head with Taron’s and that became the ending sequence of the film. As a visual effect, that took quite a leap of faith, but it did work in the end. That wasn’t the original plan, but I think it’s better.

[OP] In Bohemian Rhapsody there was a conscious consideration of matching the Live Aid concert angles and actions. Was there anything like that in Rocketman?

[CD] There was no point in trying to do that on Rocketman. It was always going to be stylized and different from reality. We staged Dodger Stadium the way it looked, but we didn’t try to match it. The original concert was late afternoon and ours is more towards night, which was visually better. The visual inspiration came from the stills taken by a famous rock photographer and they look a little more like night. At one point we talked about having a concert at the end and we tried shooting something, but it just didn’t feel right. We were going to get compared to BoRhap anyway, so we didn’t want to even try and do something the same way.

[OP] Any final thoughts or advice on how to approach a film like Rocketman?

[CD] Every movie is different. Every single time you come to a story, you nearly have to start again. The director wants to do it a certain way and you have to adapt to that. With some of the dramas or comedies that I’ve cut, it’s a less immediate process. You don’t really know how the whole thing is coming together until you get a sense of it quite late. With this, they shot a few of the song sequences early and as soon as I saw that, I thought right away, “Oh, this is great.” You can build a quick three-minute sequence to show people and you get a feel for the whole film. You can get excited about it. On a drama or even worse, on a thriller, you’re guessing how it’s coming together and you’re using all of your skills to do that.

The director and the story are the differences and I try to adapt. Dexter wanted the film to be popular, but also distinctive. He wanted to see very quickly how it was coming together. As soon as he was done filming he wanted to go to the edit and see how it was coming along. In that scenario you try to get some things done more quickly. So I would try to get some sequences put together knowing that, and then come back to them later if you’ve rushed them.

Since it’s a musical you could string together the songs and get a feel, but that would be misleading. When you start off you can produce a sequence very quickly that looks good, because you’ve got the music that makes it feel almost finished and that it’s working. But that can lead you into a dead end if you’re not careful – if you are too precious about the music – the length of it and such. You still have to be hard about the storytelling element. Ultimately all of the decisions come from the story – how long the scene is, whether you start on a close-up or a wide – I always try to approach everything like that. If you keep that in your head, you’ll make the right decisions.

©2019 Oliver Peters

Why editors prefer Adobe Premiere Pro CC

Over my career I’ve cut client jobs with well over a dozen different linear and nonlinear editing systems and/or brands. I’ve been involved with Adobe Premiere/Premiere Pro as a user on and off since Premiere 5.5 (yes kids – before Pro, CS, and CC). But I seriously jumped into regular use at the start of the Creative Cloud era, thanks to many of my clients’ shift away from Final Cut Pro. Some seriously gave FCPX a go, yet could never warm up to it. Others bailed right away. In any case, the market I work in and the nature of my clients dictate a fluency in Premiere Pro. While I routinely bounce between Final Cut Pro X, Media Composer, DaVinci Resolve, and Premiere Pro, the latter is my main axe at the day job.

Before I proceed, let me stop and acknowledge those readers who are now screaming, “But Premiere always crashes!” I certainly don’t want to belittle anyone’s bad experiences with an app; however, in my experience, Premiere Pro has been just as stable as the others. All software crashes on occasion and usually at the most inopportune time. Nevertheless, I currently manage about a dozen Mac workstations between home and work, which are exposed to our regular pool of freelance editors. Over the course of the past three to four years, Premiere Pro (as well as the other Creative Cloud applications) has performed solidly for us across a wide range of commercial, corporate, and entertainment projects. Realistically, if our experiences were as bad as many others proclaim, we would certainly have shifted to some other editing software!

Stability questions aside, why do so many professional editors prefer Adobe Premiere Pro given the choices available? The Final Cut Pro X fans will point to Premiere’s similarities with Final Cut Pro 7, thus providing a comfort zone. The less benevolent FCPX fanboys like to think these editors are set in their ways and resistant to change. Yet many Premiere Pro users have gone through several software or system changes in their careers and are no strangers to a learning curve. Some have even worked with Final Cut Pro X, but find Premiere Pro to be a better fit. Whatever the reason, the following is a short list (in no order of importance) of why Premiere Pro is such a good option for many editors, given the available alternatives.

Responsive interface – I find the Premiere Pro user interface to be the most responsive of any NLE. I’m not talking about media handling, but rather the time between clicking on something or commanding a function and having that action occur. For example, Final Cut Pro X – an otherwise fast application – feels slower in this type of response time. When I click to select a clip in the timeline, it takes a fraction of a second to respond. The same action is nearly instant in Premiere Pro. The reason seems to be that FCPX is constantly writing each action to the Library in a “constant save” mode. I have seen such differences across multiple Macs and hard drive types over the eight years since its introduction with very little improvement. Not a deal-breaker, but meanwhile, Premiere Pro has continued to become more responsive in the same period.

Customizable user interface – Users first exposed to Premiere Pro’s interface may feel it’s very complex. The truth is that you can completely customize the look, style, and complexity of the interface by re-arranging the stacked, tabbed, or floating panels. Make it as minimalistic or complex as you need and save these as workspaces. It’s not just the ability to show/hide panels, but unlike other NLEs, it’s the complete control over their size and location.

Media Browser – Premiere Pro includes a built-in Media Browser panel that enables the immediate review and import of clips external to your project. It’s not just a view of folders with clip names or thumbnails to be imported. Media Browser offers the same scrubbing capabilities as for clips in a bin. Furthermore, the editor can edit clips directly to the timeline from the Media Browser, which also automatically imports that clip into the project in a one-step process. You could start with a completely blank project (no imported media clips) and work directly between the Media Browser and the timeline if you wanted to.

Bins – Editors rely on bins for the organization of raw media. It’s the first level of project organization. FCPX went deep down this hole with Events and Keywords. Premiere Pro uses a more traditional approach and features three primary modes – list, thumbnail, and freeform. List and thumbnail are obvious, but what needs to be reiterated is that the thumbnail view enables Adobe’s hover scrubbing. While not as fluid as FCPX’s skimming, it’s a quick way to see what a clip contains. But more importantly, the thumbnails are completely resizable. If you want to see a few very large thumbnails in the bin, simply crank up the slider. The newest is a freeform view – something Avid editors know well. This removes the grid arrangement of the bin view and allows the editor to rearrange the position of clips within the panel for that bin. This is how many editors like to work, because it gives them visual cues about how material is organized, much like a storyboard.

Versatile media and project locations – Since Premiere Pro treats all of your external storage as available media locations (without the need for a structured MediaFiles folder or Library file), this gives the editor a better handle on controlling where media should be located. Of course, this puts the responsibility for proper media management on the user, without the application playing nanny. The big plus is that projects can be organized within a siloed folder structure on your hard drive. One main folder for each job, with subfolders for associated video clips, graphics, audio, and Premiere Pro project files. Once you are done, simply archive the job folder and everything is there. Or… If a completely different organizational structure better fits your needs – no sweat. Premiere Pro makes it just as easy.

Multiple open sequences/timelines – One big feature that brings editors to Premiere Pro instead of Media Composer or Final Cut Pro X is the ability to work with multiple, open sequences in the timeline panel and easily edit between them. Thanks to the UI structure of Premiere Pro, editors can also have multiple stacked timeline panels open in their workspace – the so-called “pancake timeline” mode. Open a “KEM roll” (a selects sequence) in one panel and your working sequence in another. Then edit between the two timeline panels without ever needing to go back-and-forth between bins and the timeline.

Multiple open projects/collaboration – Premiere Pro’s collaboration capabilities (working with multiple editors on one job) are not as robust as with Avid Media Composer. That being said, Premiere’s structure does enable a level of versatility not possible in the Avid environment – so it’s a trade-off. With Premiere project locking, the first editor to open a project has read/write control, while additional editors who open that same project can access the files in a read-only mode. Clips and sequences can be pulled (copied/imported) from a read-only project into your own active project. The two will then be independent of each other. This is further enhanced by the fact that Premiere offers standard “save as” computer functions. If Editor #1 wants to offload part of the work to Editor #2, simply saving the project as a new file permits Editor #2 to work in their own active version of the project with complete read/write control.

Mixed frame rates and sizes – Premiere Pro projects can freely mix media and timelines with different sizes, aspect ratios, and frame rates. It’s not the only NLE to do that, but some applications still start by having the project file based on a specific sequence format. Everything in the project must conform or be modified to those settings. Both solutions are viable, but Premiere’s open approach is more versatile for editors working in the hodgepodge that is today’s media landscape.

Audio mixing – While all NLEs offer decent audio mixing capabilities, Premiere Pro offers more refined mixing functions, including track automation, submaster tracks, proper loudness measurement, and AU, VST, and VST3 plug-in support. FCPX attempts to offer a trackless mixing model using audio roles, but the mixing routine breaks down pretty quickly when you get to a complex scenario, often requiring multiple levels of compound clips (nested sequences). None of that is needed in Premiere Pro. In addition, Creative Cloud subscribers also have access to Adobe Audition, a full-fledged DAW application. Premiere Pro sequences can be sent directly to Audition for more advanced mixing, plus additional Audition-specific tools, like Loudness Match and Music Remix. Adobe markets these as powered by Adobe Sensei (Adobe’s branded artificial intelligence). Loudness Match analyzes an audio clip and intelligently raises the gain of the quieter sections. Traditional loudness controls raise or lower the entire clip by a fixed amount. Music Remix doesn’t actually remix a track. Instead, it automatically edits a track based on a target length. Set a desired duration and Audition will determine the correct music edit points to get close to that target. You can use the default or set it to favor shorter sections, which will result in more edit points.

Interoperability – Most professional editors do not work within a single software ecosystem. You often have to work with After Effects and Photoshop files. Needless to say, Premiere Pro features excellent interoperability with the other Adobe applications, whether or not you use the Dynamic Link function. In addition, there’s the outside world. You may send out to a Pro Tools mixer for a final mix. Or a Resolve colorist for grading. Built-in list/file export formats make this easy without the requirement for third-party applications to facilitate such roundtrips.

Built-in tools that enhance editing – This could be a rather long list, but I’ll limit myself to a few functions. The first one I use a lot is the Replace command. This appears to be the best and easiest to use of all the apps. I can easily replace clips on the timeline from the source clips loaded into the viewer or directly from any clip in a bin. No drag-and-drop required. The second very useful operation is built-in masking and tracking for nearly every video filter and color correction layer. This is right at your fingertips in the Effects Control panel without requiring any extra steps or added plug-ins. Need more? Bounce out to After Effects with its more advanced tools, including the bundled Mocha tracker.

Proxy workflow – Premiere Pro includes a built-in proxy workflow, which permits low-res edit proxies to be created externally and attached, or created within the application itself. In addition, working with proxies is not an all-or-nothing feature. You can toggle between proxies and high-res master clips, but you can also work with a mixture of proxies and high-res files. In other words, not all of your clips have to be transcoded into proxies to gain the benefit of a proxy workflow. Premiere takes care of tracking the various clip sizes and making sure that the correct size is displayed. It also calculates the size shift between proxy frame sizes and larger high-res frame sizes to keep the toggle between these two seamless.
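That size compensation is easy to visualize. Here’s a toy Python sketch – my illustration of the general concept, not Adobe’s actual implementation – showing why resolution-independent (normalized) values survive a proxy/full-res toggle while raw pixel values would not. The frame sizes are hypothetical.

```python
# A toy model of proxy/full-res size compensation (not Adobe's code).
FULL = (3840, 2160)    # hypothetical UHD master frame size
PROXY = (1280, 720)    # hypothetical proxy frame size

def to_normalized(x_px, y_px, size):
    """Store a position as fractions of the frame, independent of resolution."""
    return (x_px / size[0], y_px / size[1])

def to_pixels(x_n, y_n, size):
    """Re-derive pixel coordinates for whichever media is currently displayed."""
    return (x_n * size[0], y_n * size[1])

# A mask point placed at pixel (320, 180) while viewing the proxy...
norm = to_normalized(320, 180, PROXY)
# ...lands at the same relative spot on the full-res frame: (960.0, 540.0).
print(to_pixels(*norm, FULL))
```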

Relinking – Lastly, Premiere Pro can work with media on any of the available attached drives; therefore, it’s got to be able to quickly relink these files if you move locations. I tend to work in a siloed folder structure, where everything I need for a project is contained within a job folder and its subfolders. These folders are often moved to other drives (for instance, if I need to travel with a project) or archived to an external drive and later restored. It’s critical that a project easily find and relink to the correct media files. Generally, as long as files stay in the same relative folder paths – in relation to the location of the project files on the drive – Premiere can easily find all the necessary offline media files once a project is moved from its original location. This is true whether you move to a different drive with a different volume name or whether you move the entire job folder up or down a level within the drive’s folder hierarchy. Media relinking is either automatic or, worst case, requires one dialogue box for the editor to point Premiere to the new path for the first file. From there, Premiere Pro will locate all of the other files. I find this process to be the fastest and least onerous relink operation of all the NLEs.
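To make the relative-path idea concrete, here is a minimal Python sketch of the underlying logic – again my own illustration, not Premiere’s actual code: if an application stores media paths relative to the project file, those references survive moving the whole job folder to another drive or folder level, because the relative path never changes.

```python
# relative_relink.py -- the relative-path relinking concept, illustrated.
# Not Premiere Pro's implementation; the paths shown are hypothetical.
import os
from pathlib import Path

def store_relative(project_file, media_file):
    """At save time: record the media path relative to the project's folder."""
    return os.path.relpath(media_file, start=Path(project_file).parent)

def resolve_media(project_file, stored):
    """At load time: rebuild an absolute path from wherever the project now lives."""
    candidate = (Path(project_file).parent / stored).resolve()
    return candidate if candidate.exists() else None

# A job folder moved intact from one drive to another:
#   before: /Volumes/WorkSSD/JobA/project/cut.prproj -> ../media/clip01.mov
#   after:  /Volumes/Archive/JobA/project/cut.prproj -> ../media/clip01.mov
# The stored relative path ("../media/clip01.mov") is identical in both cases,
# so the clip still resolves without any relink dialogue.
```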

©2019 Oliver Peters

Free Solo

Every now and then a documentary comes along that simply blows away the fictional super-hero feats of action films. Free Solo is a testament to the breathtaking challenges real life can offer. This documentary chronicles Alex Honnold’s free solo climb (no ropes) of El Capitan’s 3,000-foot-high sheer rock face – the first and, so far, only successful free solo climb of the mountain.

Free Solo was produced by the filmmaking team of Elizabeth Chai Vasarhelyi and Jimmy Chin, who is renowned as both an action-adventure cinematographer/photographer and mountaineer. Free Solo was produced in partnership with National Geographic Documentary Films and has garnered numerous awards, including Oscar and BAFTA awards for best documentary, as well as an ACE award for its editor Bob Eisenhardt, ACE. Free Solo enjoyed IMAX and regular theatrical distribution and can now be seen on the National Geographic Television streaming service.

Bob Eisenhardt is a well-known documentary film editor with over 60 films to his credit. Along with his ACE award for Free Solo, Eisenhardt is currently an editing nominee in this year’s Emmy Awards for his work in cutting the documentary. I recently had a chance to speak with Bob Eisenhardt and what follows is that conversation.

_________________________________________

[OP] You have a long history in the New York documentary film scene. Please tell me a bit about your background.

[BE] I’ve done a lot of different kinds of films. The majority is cinema vérité work, but some films use a lot of archival footage and some are interview-driven. I’ve worked on numerous films with the Maysles, Barbara Kopple, Matt Tyrnauer, a couple of Alex Gibney’s films – and I often did more than one film with people. I also teach in the documentary program at the New York Film Academy, which is interesting and challenging. It’s really critiquing their thesis projects and discussing some general editing principles. I went to architecture school. Architectural design is taught by critique, so I understand that way of teaching.

[OP] It’s interesting that you studied architecture. I know that a lot of editors come from a musical background or are amateur musicians and that influences their approach to cutting. How do you think architecture affects your editing style?

[BE] They say architecture is frozen music, so that’s how I was taught to design. I’m very much into structure – thinking about the structure of the film and solving problems. Architecture is basically problem solving and that’s what editing is, too. How do I best tell this story with these materials that I have or a little bit of other material that I can get? What is the essence of that and how do I go about it?

[OP] What led to you working on Free Solo?

[BE] This is the second film I’ve made with Chai and Jimmy. The first was Meru. So we had some experience together and it’s the second film about climbing. I did learn about the challenges of climbing the first time and was familiar with the process – what the climbing involved and how you use the ropes. 

Meru was very successful, so we immediately began discussing Free Solo. But the filming took about a year-and-a-half. That was partly due to accidents and injuries Alex had. It went into a second season and then a third season of climbing and you just have to follow along. That’s what documentaries are all about. You hitch your wagon to this person and you have to go where they take you. And so, it became a much longer project than initially thought. I began editing six months before Alex made the final climb. At that point they had been filming for about a year. So I came on in January and he made the climb in June – at which point I was well into the process of editing.

[OP] There’s a point in Free Solo, where Alex had started the ascent once and then stopped, because he wasn’t feeling good about it. Then it was unclear whether or not he would even attempt it again. Was that the six-month point when you joined the production?

[BE] Yes, that’s it. It’s very much the climbers’ philosophy that you have to feel it, or you don’t do it. That’s very true of free soloing. We wanted him to signal the action, “This is what I plan to do.” And he wouldn’t do it – ever – because that’s against the mentality of climbing. “If I feel it, I may do it. Otherwise, not.” It’s great for climbing, but not so good for film production.

[OP] Unlike any other film project, failure in this case would have meant Alex’s death. In that event you would have had a completely different film. That was touched on in the film, but what was the behind-the-scenes thinking about the possibility of such a catastrophe? Any Plan B?

[BE] In these vérité documentaries you never know what’s going to happen, but this is an extreme example. He was either going to do it and succeed, decide he wasn’t going to do it, or die trying, and that’s quite a range. So we didn’t know what film we were making when I started editing. We were going to go with the idea of him succeeding and then we’d reconsider if something else happened. That was our mentality, although in the back of our minds we knew this could be quite different.

When they started, it wasn’t with the intention of making this film. Jimmy knew Alex for 10 years. They were old friends and had done a lot of filming together. He thought Alex would be a great subject for a documentary. That’s what they proposed to Nat Geo – just a portrait of Alex – and Alex said, “If you are going to do that, then I’ve got to do something worthwhile. I’m going to try to free solo El Cap.” He told that to Chai while Jimmy wasn’t there. Chai is not a climber and she thought, “Great, that sounds like it will be a good film.” Jimmy completely freaked out when he found out, because he knew what it meant.

It’s an outrageous concept even to climbers. They actually backed off and had to reconsider whether this was something they wanted to get involved in. Do you really want to see your friend jeopardize his life for this? Would the filming add additional pressure on Alex? They had to deal with this even before they started shooting, which is why that was part of the film. I felt it was a very important idea to get across. Alex is taciturn, so you needed ways to understand him and what he was doing. The crew as a character really helped us do that. They were people Alex could interact with and the audience could identify with.

The other element that I felt was very important was Sanni [McCandless, Alex Honnold’s girlfriend], who suddenly came onto the scene after the filming began. This felt like a very important way to get to know Alex. It also became another challenge for Alex – whether he would be able not only to climb this mountain, but whether he would be able to have a relationship with this woman. And aren’t those two diametrically opposed? Being able to open yourself up emotionally to someone, but also control your emotions enough to be able to hang by your fingertips 2,000 feet in the air on the side of a cliff.

[OP] Sanni definitely added a lot of humanity to him. Before the climb they discuss the possibility of his falling to his death and Alex’s point of view is that’s OK. “If I die, I die.” I’m not sure he really believed that deep inside. Or did he?

[BE] Alex is very purposeful and lives every day with intention. That’s what’s so intriguing. He knows any minute on the wall could be his last and he’s comfortable with that. He felt like he was going to succeed. He didn’t think he was going to fall. And if he didn’t feel that way he wasn’t going to do it. Seeing the whole thing through Sanni’s eyes allowed us as the audience to get closer to and identify with Alex. We call that moment the ‘Take me into consideration’ scene, which I felt was vitally important.

[OP] Did you have any audience screenings of the rough cuts? If so, how did that inform your editing choices?

[BE] We did do some screenings and it’s a tricky thing. Nat Geo was a great partner throughout. Most companies wouldn’t be able to deal with this going on for a year-and-a-half. It’s in Nat Geo’s DNA to fund exploration and make exploratory films. They were completely supportive, but they did decide they wanted to get into Sundance and we were a month from the deadline. We brought in three other editors (Keiko Deguchi, Jay Freund, and Brad Fuller) to jump in and try to make it. Even though we got an extension and we did a great job, we didn’t get in. The others left and I had another six months to work on the film and make it better. Because of all of this, the screenings were probably too early. The audience had trouble understanding Alex, understanding what he’s trying to do – so the first couple screenings were difficult.

We knew when we saw the initial climbing footage that the climb itself was going to be amazing. By the time we showed it to an audience, we were completely immune to any tension from the climb – I mean, we’d seen it 200 times. It was no longer as scary to us as it had been the first time we saw it. In editing you have to remember the initial reaction you had to the footage so that you can bring it to bear later on. It was a real struggle to make the rest of the story as strong as possible to keep you engaged, until we got to the climb. So we were pleasantly surprised to see that people were so involved and very tense during the climb. We had underestimated that.

We also figured that everyone would already know how this thing ends. It was well-publicized that he successfully climbed El Cap. The film had to be strong enough that people could forget they knew what happened. Although I’ve had people tell me they could not have watched the climb if they hadn’t known the outcome.

[OP] Did you end up emphasizing some aspects over others as a result of the screenings?

[BE] The main question to the audience is, “Do you understand what we are trying to say?” And then, “What do you think of him or her as a character?” That’s interesting information that you get from an audience. We really had to clarify what his goal was. He never says at the beginning, “I’m going to do this thing.” In fact, I couldn’t get him to say it after he did it. So it was difficult to set up his intention. And then it was also difficult to make clear what the steps were. Obviously we couldn’t cover the whole 3,000 feet of El Capitan, so they had to concentrate on certain areas.

We decided to cover five or six of the most critical pitches – sections of the climb – to concentrate on those and really cover them properly during the filming. These were challenging to explain and it took a lot of effort to make that clear. People ask, “How did you manage to cut the final climb – it was amazing.” Well, it worked because of the second act that explains what he is trying to do. We didn’t have to say anything in the third act. You just watch because you understand. 

When we started, people didn’t understand what free soloing is. At first we were calling the film Solo. The nomenclature of climbing is confusing. Soloing is actually climbing with a rope, but only for protection. Then we’d have to explain what free soloing was as opposed to soloing. However, Han Solo came along and stole our title, so it was much easier to call it Free Solo. Explaining the mentality of climbing, the history of climbing, the history of El Capitan, and then what exactly the steps were for him to accomplish what he was trying to do – all that took a long time to get right and a lot of that came out of good feedback from the audience.

Then, “Do you understand the character?” At one point we didn’t have enough of Sanni and then we had too much of Sanni. It became this love story and you forgot that he was going to climb. So the balancing was tricky.

[OP] Since you were editing before the final outcome and production was still in progress, did you have an opportunity to request more footage or that something in particular be filmed that you were missing in the edit?

[BE] That was the big advantage to starting the edit before the filming was done. I often come into projects that are already 80-90% shot. So they have the ability to get pick-ups if people are alive or if the event can still be filmed in some way. This one was more 'in progress.' For instance, he practiced a specific move a lot for the most difficult pitch and I kept asking for more of that. We wanted to show how many times he practiced it in order to get the feel of it.

[OP] Let’s switch gears and talk about the technical side. Which edit system did you use to cut Free Solo?

[BE] We were using Avid Media Composer 8.8.5 with Nexis shared storage. Avid is my first choice for editing. I've done about four films on the old Final Cut – Meru being one of them – but I much prefer Avid. I've often inherited projects that were started on something else, so you are stuck. On this one we knew going in that we would do it on Avid. Their ScriptSync feature is terrific. Any long discussions or sit-down interviews were transcribed. We could then word-search them, which was invaluable. My associate editor, Simona Ferrari, set up everything and was also there for the output.

[OP] Did you handle the finishing – color correction and sound post – in-house or go outside to another facility?

[BE] We up-rezzed in the office on [Blackmagic Design DaVinci] Resolve and then took that to Company 3 for finishing and color correction. Deborah Wallach did a great job sound editing and we mixed with Tommy Fleischman [Hugo, The Wolf of Wall Street, BlacKkKlansman]. They shot this on about every camera, aspect ratio, and frame rate imaginable. But if they're hanging 2,000 feet in the air and didn't happen to hit the right button for the frame rate – you really can't complain too much! So there was an incredibly wide range and Simona managed to handle all that in the finishing. There wasn't a lot of archival footage, but there were photos for the backstory of the family.

The other big graphic element was the mountain itself. We needed to be able to trace his route up the mountain and that took forever. It wasn’t just to show his climb, but also to connect the pitches that we had concentrated on, since there wasn’t much coverage between them. Making this graphic became very complicated. We tried one house and they couldn’t do it. Finally, Big Star, who was doing the other graphics – photomontages and titles – took this on. It was the very last thing done and was dropped in during the color correction session.

For the longest time in the screenings, the audience was watching a drawing that I had shot off of the cutting room wall and traced in red. It was pretty lame. In those versions, it was a shot of the mountain and then I would dissolve through to get the line moving. After a while we had some decent in and out shots, but nothing in-between, except this temporary graphic that I created.

[OP] I caught Free Solo on the plane to Las Vegas for NAB and it had me on the edge of my seat. I know the film was also released in IMAX, so I can only imagine what that experience was like.

[BE] The film wasn’t made for IMAX – that opportunity came up later. It’s a different film on IMAX. Although there is incredible high-angle photography, it’s an intimate story. So it worked well on a moderately big screen. But in IMAX it becomes a spectacle, because you can really see all those details in the high-angle shots. I have cut an IMAX film before and you do pace them different, because of the ability to look around. However, there wasn’t a different version of Free Solo made for IMAX – we didn’t have the freedom to do that. Of course, the whole film is largely handheld, so we did stabilize a few shots. IMAX merely used their algorithm to bump it up to their format. I was shocked – it was beautiful.

[OP] Let’s talk a bit about your process as an editor. For instance, music. Different editors approach music differently. Do you cut with temp music or wait until the very end to introduce the score?

[BE] Marco Beltrami [Fantastic Four, Logan, Velvet Buzzsaw] was our composer, but I use temp music from very early on. I assemble a huge library of scratch music – from other films or from the potential composers’ past films. I use that until we get the right feel for the music and that’s what we show to the composer. It gives us something to talk about. It’s much easier to say, “We like what the music is doing here, but it’s the wrong instrumentation.” Or, “This is the right instrument, but the wrong tempo.” It’s a baseline.

[OP] How do you tackle the footage at the very beginning? Do you create selects or KEM rolls or some other approach?

[BE] I create a road map to know where I'm going. I go through all the dailies and pull the stuff that I think might be useful – everything from the good-looking shots to a taste of something that I may never use, but want to remember. Then I screen selects reels. I try to do that with the director. Sometimes we can schedule that and sometimes not. On Free Solo there were over 700 hours of footage, so it's hard to get your arms around that. By the time you get through looking at the 700th hour, you've forgotten the first one. That's why the selecting process is so important to me. The selects amount to maybe a third of the dailies footage. After screening the selects, I can start to see the story and how to tell it.

I make index cards for every scene and storyboard the whole thing. By that I mean arrange the cards on a wall. They are color-coded for places, years, or characters. It allows me to stand back and see the flow of the film, to think about the structure, and the points that I have to hit. I basically cut to that. Of course, if it doesn’t work, I re-arrange the index cards (laugh).

A few years ago, I did a film about the Dixie Chicks [Shut Up & Sing] at the time they got into trouble for comments they had made about President Bush. We inherited half of the footage and shot half. The Dixie Chicks went on to produce a concert and an album based upon their feelings about the whole experience. It was kind of present and past, so there were basically two different colors to the cards. It was not cut in chronological order, so you could see very quickly whether you were in the past or the present just by looking at the wall. There were four editors working on Shut Up & Sing and we could look at the wall, discuss, and decide if the story was working or not. If we moved this block of cards, what would be the consequences of telling the story in a different order?

[OP] Were Jimmy and Chai very hands-on as directors during the edit – in the room with you every day toward the end?

[BE] Chai and Jimmy are co-directors and so Jimmy tended to be more in the field and Chai more in the edit room. Since we had worked together before, we had built a common language and a trust. I would propose ideas to Chai and try them and she would take a look. My feeling is that the director is very close to it and not able to see the dailies with fresh eyes. I have the fresh perspective. I like to take advantage of that and let them step back a little. By the end, I’m the one that’s too close to it and they have a little distance if they pace themselves properly.

[OP] To wrap it up, what advice would you have for young editors tackling a documentary project like this?

[BE] Well, don’t climb El Cap – you probably won’t make it (laugh)! I always preach this to my students: I encourage them to make an outline and work towards it. You can make index cards like I do, you can make a Word document, a spreadsheet; but try to figure out what your intentions are and how you are going to use the material. Otherwise, you are just going to get lost. You may be cutting things that are lovely, but then don’t fit into the overall structure. That’s my big encouragement.

Sometimes with vérité projects there’s a written synopsis, but for Free Solo there was nothing on paper at the beginning. They went in with one idea and came out with a different film. You have to figure out what the story is and that’s all part of the editing process. This goes back to the Maysles’ approach. Go out and capture what happened and then figure out the story. The meaning is found in the cutting room.

Images courtesy of National Geographic and Bob Eisenhardt.

©2019 Oliver Peters

Black Mirror: Bandersnatch

Bandersnatch was initially conceived as an interactive episode within the popular Black Mirror anthology series on Netflix. Instead, Netflix decided to release it as a standalone, spin-off film in December 2018. It's the story of programmer Stefan Butler (Fionn Whitehead) as he adapts a choose-your-own-adventure novel into a video game. Set in 1984, the story lets viewers make decisions for Butler's actions, which then determine the next branch of the story they are shown. Viewers can go back through Bandersnatch and opt for different decisions, in order to experience other versions of the story.

Bandersnatch was written by show creator Charlie Brooker (Black Mirror, Cunk on Britain, Cunk on Shakespeare), directed by David Slade (American Gods, Hannibal, The Twilight Saga: Eclipse), and edited by Tony Kearns (The Lodgers, Cardboard Gangsters, Moon Dogs). I recently had a chance to interview Kearns about the experience of working on such a unique production.

__________________________________________________

[OP] Please tell me a little about your editing background leading up to cutting Bandersnatch.

[TK] I started out almost 30 years ago editing music videos in London. I did that full-time for about 15 years working for record companies and directors. At the tail end of that a lot of the directors I was working with moved into doing commercials, so I started editing commercials more and more in Dublin and London. In Dublin I started working on long form, feature film projects and cut about 10 projects that were UK or European co-productions with the Irish Film Board.

In 2017 I got a call from Black Mirror to edit the Metalhead episode, which was directed by David Slade. He was someone I had worked with on music videos and commercials 15 years previously, before he moved to the United States. That was a nice circularity. We were working together again, but on a completely different type of project – drama, on a really cool series like Black Mirror. It went very well, so David and I were asked to get involved with Bandersnatch, which we jumped at, because it was such an amazing, different kind of project. It was unlike anything either of us – or anyone else, for that matter – had ever done, at that level of complexity.

[OP] Other attempts at interactive storytelling – with the exception of the video game genre – have been hit-or-miss. What were your initial thoughts when you read the script for the first time?

[TK] I really enjoyed the script. It was written like a conventional script, but with software called Twine, so you could click on it and go down different paths. Initially I was overwhelmed by the complexity of the story and the structure. It wasn't that I was like a deer in the headlights, but it gave me a sense of the scale of the project and [writer/show runner] Charlie Brooker's ambition to take the interactive story to so many layers.

On my own time, I broke down the script, created spreadsheets for each of its eight sections, and wrote descriptions of every possible permutation, just to give me a sense of what was involved and to get it in my head what was going on. There are so many different narrative paths – it was helpful to have that in my brain. When we started editing, that also helped me keep a clear view at any point.
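To picture the bookkeeping Kearns describes, here is a minimal sketch of enumerating every viewer path through a branching story. The segment names and graph below are invented for illustration – this is not the film's actual script data or Netflix's Branch Manager format – and it assumes an acyclic structure, which simplifies away the loops the finished film contains.

```python
from typing import Dict, List, Optional

# Hypothetical branching structure: each segment lists the segments a
# viewer's choice can lead to next. All names are invented.
STORY: Dict[str, List[str]] = {
    "opening":          ["take_the_job", "refuse_the_job"],
    "take_the_job":     ["game_ships_early"],
    "refuse_the_job":   ["work_at_home"],
    "work_at_home":     ["accept_help", "go_it_alone"],
    "game_ships_early": [],   # terminal segments offer no choices
    "accept_help":      [],
    "go_it_alone":      [],
}

def all_paths(segment: str, trail: Optional[List[str]] = None) -> List[List[str]]:
    """Depth-first enumeration of every possible viewer path."""
    trail = (trail or []) + [segment]
    choices = STORY[segment]
    if not choices:           # reached an ending
        return [trail]
    paths: List[List[str]] = []
    for nxt in choices:
        paths.extend(all_paths(nxt, trail))
    return paths

for path in all_paths("opening"):
    print(" -> ".join(path))
```

Even this toy graph yields three distinct endings; scale it to eight scripted sections with multiple choice points each and the need for per-section spreadsheets becomes obvious.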

[OP] How long of a schedule did you have to post Bandersnatch?

[TK] 17 weeks was the official edit time, which isn’t much longer than on a low-budget feature. When I mentioned that to people, they felt that was a really short amount of time; but, we did a couple of weekends, we were really efficient, and we knew what we were doing.

[OP] Were you under any running-length constraints, of the sort that a TV show or feature film editor often wrestles with on a conventional linear program?

[TK] Not at all. This is the difference – linear doesn't exist. The length depends on the choices that are made. The only direction was for it not to be a sprawling 15-hour epic – that there would be some sort of ballpark time. We weren't constrained, just that each segment had to feel right – tight, but not rushed.

[OP] With that in mind, what sort of process did you go through to get it to feel right?

[TK] Part of each edit review was to make it as tight or as lean as it needed to be. Netflix developed their own software, called Branch Manager, which allowed people to review the cut interactively by selecting the choice points. My amazing assistant editor, John Weeks, is also a coder, so he acquired an extra job, which was to take the exports and do the coding in order to have everything work in Branch Manager. He’s a very robust person, but I think we almost broke him (laughs), because there were up to 100 Branch Manager versions by the end. The coding was hanging on by a thread. He was a bit like Scotty in Star Trek, “The engines can’t hold it anymore, Captain!”

By using Branch Manager, people could choose a path and view it and give notes. So I would take the notes, make the changes, and it would be re-exported. Some segments might have five cuts while others would be up to 13 or 14. Some scenes were very straightforward, but others were more difficult to repurpose.

Originally there were more segments in the script, but after the first viewings it was felt that there were too many in there. It was on the borderline of being off-putting for viewers. So we combined a few, but I made sure to keep track of that so it was in the system. There was a lot of reviewing, making notes, updating spreadsheets, and then making sure John had the right version for the next Branch Manager creation. It was quite an involved process.

[OP] How were you able to keep all of this straight? Did you use the common technique of scene cards on the wall or something different?

[TK] If you looked at flowcharts your head would explode, because it would be like looking at the wiring diagram of an old-fashioned telephone exchange. There wouldn’t have been enough room on the wall. For us, it would just be on paper – notebooks and spreadsheets. It was more in our heads – our own sense of what was happening – that made it less confusing. If you had the whole thing as a picture, you just wouldn’t know where to look.

[OP] In a conventional production an editor always has to be mindful that removing something may have ramifications for the story later on. In this case, I would imagine that those revisions affected the story in either direction. How were you able to deal with that?

[TK] I have been asked how we knew that each path would have a sense of a narrative arc. We couldn't think of it as one total narrative arc. That's impossible. You'd have to be a genius to know that it's all going to work. We felt the performances were great and the story was strong, but it doesn't have a conventional flow. The choice points act as a propellant into the next part of the film, creating an experience quite unlike the straight story arc of conventional films or episodes. Although there wasn't a traditional arc, it still had to feel like a well-told story – one with empathy and a sense of engagement, not a gimmick.

[OP] How did the crew and actors manage to keep the story straight in their minds as scenes were filmed?

[TK] As with any production, the first few days are finding out what you've let yourself in for. This was a steep learning curve in that respect. Only three weeks of the seven-week shoot were in the same studio complex where I was working, so for the most part I wasn't present. But there was a sense that they needed to make it easier for the actors and the crew. The script supervisor, Marilyn Kirby, was amazing. She was the oracle for the whole shoot. She kept the whole show on the road, even when it was quite complicated. The actors got into the swing of it quickly, because I had no issues with the rushes. They were fantastic.

[OP] What camera formats were used and what is your preparation process for this footage prior to editing?

[TK] It’s the most variety of camera formats I’ve ever worked on. ARRI Alexa 65 and RED, but also 1980s Ikegami TV cameras, Super 8mm, 35mm, 16mm, and VHS. Plus, all of the print stills were shot on black-and-white film. The data lab handled the huge job to keep this all organized and provide us with the rushes. So, when I got them, they were ready to go. The look was obviously different between the sources, but otherwise it was the same as a regular film. Each morning there was a set of ProRes Proxy rushes ready for us. John synced and organized them and handed them over. And then I started cutting. Considering all the prep the DIT and the data lab had to go through, I think I was in a privileged position!

[OP] What is your method when first starting to edit a scene?

[TK] I watch all of the rushes and can quickly see which take might be the bedrock framing for a scene – which is best for a given line. At that point I don’t just slap things together on a timeline. I try to get a first assembly to be as good as possible, because it just helps anyone who sees it. If you show a director or a show runner a sloppy cut, they’ll get anxious and I don’t want that to happen. I don’t want to give the wrong impression.

When I start a scene, I usually put the wide down end-to-end, so I know I have the whole scene. Then I'll play it and see what I have in the different framings for each line – and then the next line and the next and so on. Finally, I go back and take out angles where I think I may be repeating a shot too much, extend others, and so on. It's a build-it-up process in an effort to get to a semi-fine cut as quickly as possible.

[OP] Were you able to work with circle takes and director’s notes on Bandersnatch?

[TK] I did get circle takes, but no director's notes. David and I have an intuitive understanding, which I hope to fulfill each time – that when I watch the footage he shoots, I'll get what he's looking for in the scene. With circle takes, I have to find out very quickly whether the script supervisor is any good or not. Marilyn is brilliant, so whenever she circles a take, I know it's the one. David is a very efficient director, so there weren't a massive number of takes – usually two or three takes for each set-up. Everything was shot with two cameras, so I had plenty of coverage. I understand what David is looking for and he trusts me to get close to that.

[OP] With all of the various formats, what sort of shooting ratio did you encounter? Plus, you had mentioned two-camera scenes. What is your approach to that in your edit application?

[TK] I believe the various story paths totaled about four-and-a-half hours of finished material. There was a 3:1 shooting ratio, times two cameras – so maybe 6:1 or even 9:1. I never really got a final total of what was shot, but it wasn’t as big as you’d expect. 

When I have two-camera coverage I deal with it as two individual cameras. I can just type in the same timecode for the other matching angle. I just get more confused with what’s there when I use multi-cam. I prefer to think of it as that’s the clip from the clip. I hope I’m not displaying an anti-technology thing, but I’m used to it this way from doing music videos. I used to use group clips in Avid and found that I could think about each camera angle more clearly by dealing with them separately.
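As an aside, matching a second camera by typing in the same timecode works because synchronized timecode reduces to simple frame arithmetic. Here is a minimal sketch, assuming non-drop-frame timecode at 25 fps and cameras sharing timecode – the function names and frame rate are assumptions for illustration, not details from the production.

```python
def tc_to_frames(tc: str, fps: int = 25) -> int:
    """Convert HH:MM:SS:FF non-drop-frame timecode to a frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_tc(frames: int, fps: int = 25) -> str:
    """Convert a frame count back to HH:MM:SS:FF timecode."""
    ff = frames % fps
    ss = (frames // fps) % 60
    mm = (frames // (fps * 60)) % 60
    hh = frames // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# If the A-camera angle reads 01:02:10:12, the matching B-camera frame
# carries the same timecode, so finding it is a straight lookup.
print(frames_to_tc(tc_to_frames("01:02:10:12")))  # -> 01:02:10:12
```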

[OP] I understand that you edited Bandersnatch on Adobe Premiere Pro. Is that your preferred editing software?

[TK] I’ve used Premiere Pro on two feature films, which I cut in Dublin, and a number of shorts and TV commercials. If I am working where I can set up my own cutting room, then I’m working with Premiere. I use both Avid and Adobe, but I find I’m faster on Premiere Pro than on Media Composer. The tools are tuned to help me work faster.

The big thing on this job was that you can have multiple sequences open at the same time in Premiere. That was going to be the crunch thing for me. I didn't know about Branch Manager when I specified Premiere Pro, so I figured that would be the way we'd need to review the segments – simply click on a sequence tab and play it as a rudimentary way to review a story path. The company that supplied the gear wasn't as familiar with Premiere [as they were with Avid], so there were some issues, but it was definitely the right choice.

[OP] Media Composer’s strength is in multi-editor workflows. How did you handle edit collaboration in Premiere Pro?

[TK] We used Adobe’s shared projects feature, which worked, but wasn’t as efficient as working with Avid in that version of Premiere. It also wasn’t ideal that we were working from Avid Nexis as the shared storage platform. In the last couple of months I’ve been in contact with the people at Adobe and I believe they are sorting out some of the issues we were having in order to make it more efficient. I’m keen for that to happen.

In the UK and London in particular, the big player is Avid and that's what people know, so anything different, like Premiere Pro, is seen with a degree of suspicion. When someone like me comes in and requests something different, I guess I'm viewed as a bit of a pain in the ass. But there shouldn't just be one behemoth. If you had worked on the old Final Cut Pro, then Premiere Pro is a natural fit – only more advanced and supported by a company that didn't want to make smartphones and tablets.

[OP] Since Adobe Creative Cloud offers a suite of compatible software tools, did you tap into After Effects or other tools for your edit?

[TK] That was another real advantage – the interaction with the graphics for the user interface and with After Effects. When we mocked up the first choice points, it was so easy to create, import, and adjust. Our VFX editor was able to build temp VFX in After Effects and we could integrate that really easily. He wasn't just using an edit system's effects tool, but actual VFX software, which seamlessly integrated with Premiere. Although these weren't final effects at full 4K resolution, he was able to do some very complex things, so that everyone could go, "Yes, that's it."

[OP] In closing, what take-away would you offer an editor interested in tackling an interactive story as compared to a conventional linear film?

[TK] I learned to love spreadsheets (laugh). I realized I had to be really, really organized. When I saw the script, I knew I had to go through it with a fine-tooth comb and get a sense of it. I also realized you had to unlearn some things you knew about conventional episodic TV. You can't think of some things in the same way. A practical point for the team: you have to have someone who knows coding if you are using a tool similar to Branch Manager. It's the only way you will be able to see it properly.

It’s a different kind of storytelling pressure that you have to deal with, mostly because you have to trust your instincts even more that it will work as a coherent story across all the narrative paths. You also have to be prepared to unlearn some of the normal methods you might use. One example is that you have to cut the opening of different segments differently to work with the last shot of the previous choice point, so you can’t just go for one option, you have to think more carefully what the options are. The thing is not to walk in thinking it’s going to be the same as any other production, because it ain’t.

For more on Bandersnatch, check out these links: postPerspective, an Art of the Guillotine interview with Tony Kearns, and a scene analysis at This Guy Edits.

Images courtesy of Netflix and Tony Kearns.

©2019 Oliver Peters

Is it time to reconsider Final Cut Pro X?

While Final Cut Pro X may have ultimately landed in the market sector that Apple envisioned, the industry widely acknowledged that the original launch could have been better managed. Many staunch Final Cut Pro (“legacy”) users were irrevocably alienated. That’s a shame, because FCPX wasn’t a bad design when released – merely incomplete. In the eight years that have followed, the user base has grown to more than 2.5 million (April 2018) and the application sports the widest third-party support of any editing software.

I have certainly gone back and forth in my own use of FCPX, depending on whether it was the right tool for a given job. I cut a feature film with it back in the pre-10.1 days when it was a bifurcated application with separate Event and Project files. Since then, I have also used it on plenty of spots and corporate videos. Although my daily workflow is largely Premiere Pro-based now, I regularly use Final Cut Pro X when appropriate, as well as Blackmagic Design DaVinci Resolve and Avid Media Composer. Modern editors need to be NLE-multilingual.

I realize that winning Oscars and cutting large-scale productions isn't what the majority of editors do. Nevertheless, these types of productions give any product street cred. You are probably aware of Focus and Whiskey Tango Foxtrot, but there are certainly others that have used FCPX. Hollywood studio films are dominated by Avid Media Composer; however, short films cut with FCPX have won the short film Oscar category two years in a row. While largely invisible to many US viewers, major international productions, on par with Game of Thrones, have been edited using Final Cut Pro X.

If you were one of those FCP7 users who jumped ship to another tool, then maybe it's time to revisit Final Cut Pro X. There are many reasons I say that. In the past eight years, Apple has added wide codec support, LUTs, HDR capabilities, vastly improved color correction tools, and an easy method of working with captioning. Final Cut is clearly the better tool in many situations, and here's a quick overview of why I feel that way.

What productions are best with FCPX?

Final Cut Pro X is capable of handling all types of editing, but it's better suited to some than to others. The biggest differentiator is turnaround time. If you have to get done quickly – from ingest to delivery – then FCPX is hard to beat. It handles media better than any other NLE without the need for the beefiest hardware. Want to cut 4K ProRes HQ on a two-year-old MacBook Pro? Then FCPX shines. That makes it a natural in broadcast news, promos, and sports. It's also perfect for non-broadcast event coverage. Frankly, I'm surprised that US broadcasters haven't gravitated to it like various other broadcasters around the world – especially for cutting news stories. The workflow, interface, and low hardware requirements make it well-suited to the task.

Station promo production might be questionable for some, but stop and think about the use of Motion Templates and how that technology can be applied to broadcast design. Final Cut features the unique ability to use templates that any user can create and publish as an effect out of Apple Motion. Therefore, custom effects, animation, and graphics can easily be created specifically for a station’s bespoke look.

For example, a broadcast group or network that owns multiple stations in different cities could have one creative team develop a custom station graphics package for each outlet, simply by using Motion. Those templates could be deployed to each promo department and installed into the individual FCPX edit systems. This would allow each editor to modify or customize time and event information based on the published parameters without mistakenly deviating from the prescribed graphic look. That’s a broadcast creative director’s dream.
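As a sketch of how such a rollout could be automated: templates published from Motion are ordinary folders that Final Cut Pro X scans under the user's Movies directory. The package name and source volume below are invented, and the destination shown is only the typical default location – verify the path on your own systems before deploying anything.

```python
import shutil
from pathlib import Path

# Hypothetical deployment of a station graphics package to one edit
# system. The source volume and package name are assumptions.
SRC = Path("/Volumes/Deploy/Station_Promo_Pack")
DEST = (Path.home() / "Movies" / "Motion Templates.localized"
        / "Titles.localized" / "Station Promo Pack")

DEST.parent.mkdir(parents=True, exist_ok=True)
shutil.copytree(SRC, DEST, dirs_exist_ok=True)  # requires Python 3.8+
print(f"Installed {SRC.name} -> {DEST}")
```

Run against each promo department's workstation (or wrapped into an MDM script), something like this keeps every outlet on the same published look, while editors adjust only the parameters exposed in the template.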

A simple hardware footprint

Obviously Final Cut requires Apple computers, but there's easy connectivity to media on external Thunderbolt, USB, and ethernet-based storage. Some facilities certainly need elaborate shared storage systems for collaborative workflows, but others don't. If you are a creative editorial boutique, all of a given project's proxy editing files can be stored on a single SSD, allowing the editor to easily move from room to room, or home to work, simply by carrying the SSD with them. They can even be cutting on a laptop, then bring it in to work, connect an external display for better monitoring, and keep rocking. With the advent of external GPU systems (eGPU), you can easily augment the horsepower of mid-level Macs when the need arises.

No external I/O hardware is required for monitoring. While I recommend a simple audio I/O interface and external speakers as a minimum, there are plenty of fixed-location systems where the editors only use headphones. AJA or Blackmagic interfaces to play video out to an external display are optional. Simply connect a high-quality display to the Mac via HDMI or Thunderbolt and FCPX will feed real video to it full screen. Premiere Pro can also do this, but Media Composer and Resolve do not.

Third-party ecosystem

Some of Final Cut’s deficits have developed into a huge asset. It enjoys one of the best ecosystems of third-party tools that enhance the application. These range from translation tools from vendors like Intelligent Assistance and Marquis Broadcast, to a myriad of plug-ins, such as those from FxFactory and Coremelt. Final Cut already comes with a very solid set of built-in effects filters – probably the most useful variety of the various NLE options. Even better, if you also purchase Motion, you can easily create more effects by building your own as Motion Templates. This has resulted in a ton of small developers who create and sell their own variations using this core technology.

You certainly don’t have to purchase any additional effects to be productive with FCPX, but if you do, then one of the better options is FxFactory by Noise Industries. FxFactory is both a set of effects and a delivery platform for other developers. You can use the FxFactory interface to purchase, install, and manage plug-ins and even applications from a diverse catalogue of tools. Pick and choose what you need and grow the repertoire as you see fit. One of the first options to start with is idustrial revolution’s newly revamped XEffects Toolkit. This includes numerous effects and title templates to augment your daily work. Some of these employ built-in tracking technology that allows you to pin items to objects within a shot.

Apple’s latest feature addition is workflow extensions. Adobe introduced this technology first in its products. But Apple has built upon it through macOS integration with apps like Photos and now in Final Cut Pro X. In short, an extension allows direct FCPX integration with another application. Various extensions can be downloaded from the Mac App Store and installed into FCPX. An extension then adds a panel into Final Cut, which allows you to interact with that application from inside the FCPX interface. Initially some of the companies offering extensions include frame.io, Shutterstock, Simon Says, and others.

Subscription

A sore point for many Adobe customers was the shift to the subscription business model. While the monthly rates are reasonable if you are an ongoing business, they have caused some to stick with software as old as CS6 (yikes!). As more companies adopt subscriptions, you have to start wondering when enough is enough. I don't think we are there yet and Creative Cloud is still a solid value. But if you are an individual who doesn't make a living with these tools, then it's a concern. Adobe recently raised eyebrows by appearing to double the monthly cost of its Photography plan. As it turns out, this is an additional plan with more storage rather than a replacement – but that only became evident after the web page was apparently fixed in a hurry. Predictably, this gives competitors like ON1 an avenue for counter-marketing.

Concerned with subscriptions? Then the Apple professional applications are an alternative. Final Cut Pro X, Compressor, Motion, and Logic Pro X – coupled with photo and graphics tools from Affinity and/or Pixelmator – provide a viable competing package to Adobe Creative Cloud. Heck, augment that with Fusion and/or DaVinci Resolve – even the free versions – and the collection becomes a formidable toolkit.

The interface

Naturally, the elephant in the room is the FCPX interface. It’s what simultaneously excited and turned off so many FCP7 users. In the end, how you edit with Final Cut Pro X does not have to be all that different than your editing style with other NLEs. Certainly there are differences, but once you get used to the basics, there’s more that’s similar than is different.

Isn’t imitation the highest form of flattery? You only have to look at Adobe Premiere Rush or the new Cut Page in Resolve 16 to realize that just maybe, others are starting to see the value in Apple’s approach. On top of that, there are features touted in Resolve 16, like facial (actually shape) recognition or adjustment layers, that were there even in FCPX 10.0. Whether this all is blatant copying or simply a tip-of-the-hat doesn’t matter. Each company has come to the conclusion that some workflows and some newer editors need a faster and more direct user interface that is easily scalable to small and large screens and to single and dual-display systems.

I realize that many out there will read this post and scream Apple apologist. Whatever. If you've shifted to PC, then very little of what I've said applies to you. I make my daily living with Apple hardware. While I recognize you can often get superior performance from a PC, I don't find the need to make a change yet. This means that Final Cut Pro X remains a great option for my workflows. It's a tool I can use for nearly any job and one that is often better than most. If you rejected it eight years ago, maybe it's time to take a second look.

©2019 Oliver Peters