Free Solo

Every now and then a documentary comes along that simply blows away the fictional superhero feats of action films. Free Solo is a testament to the breathtaking challenges real life can offer. This documentary chronicles Alex Honnold’s free solo climb (no ropes) of El Capitan’s 3,000-foot-high sheer rock face – the first and, so far, only successful free solo ascent of the formation.

Free Solo was produced by the filmmaking team of Elizabeth Chai Vasarhelyi and Jimmy Chin – the latter renowned as both an action-adventure cinematographer/photographer and a mountaineer. The film was made in partnership with National Geographic Documentary Films and has garnered numerous awards, including Oscar and BAFTA awards for best documentary, as well as an ACE Eddie award for its editor, Bob Eisenhardt, ACE. Free Solo enjoyed IMAX and regular theatrical distribution and can now be seen on the National Geographic Television streaming service.

Bob Eisenhardt is a well-known documentary film editor with over 60 films to his credit. Along with his ACE award for Free Solo, Eisenhardt is currently an editing nominee at this year’s Emmy Awards for his work in cutting the documentary. I recently had a chance to speak with Bob Eisenhardt and what follows is that conversation.

_________________________________________

[OP] You have a long history in the New York documentary film scene. Please tell me a bit about your background.

[BE] I’ve done a lot of different kinds of films. The majority is cinema vérité work, but some films use a lot of archival footage and some are interview-driven. I’ve worked on numerous films with the Maysles, Barbara Kopple, Matt Tyrnauer, a couple of Alex Gibney’s films – and I often did more than one film with people. I also teach in the documentary program at the New York Film Academy, which is interesting and challenging. It’s really critiquing their thesis projects and discussing some general editing principles. I went to architecture school. Architectural design is taught by critique, so I understand that way of teaching.

[OP] It’s interesting that you studied architecture. I know that a lot of editors come from a musical background or are amateur musicians and that influences their approach to cutting. How do you think architecture affects your editing style?

[BE] They say architecture is frozen music, so that’s how I was taught to design. I’m very much into structure – thinking about the structure of the film and solving problems. Architecture is basically problem solving and that’s what editing is, too. How do I best tell this story with these materials that I have or a little bit of other material that I can get? What is the essence of that and how do I go about it?

[OP] What led to you working on Free Solo?

[BE] This is the second film I’ve made with Chai and Jimmy. The first was Meru. So we had some experience together and it’s the second film about climbing. I did learn about the challenges of climbing the first time and was familiar with the process – what the climbing involved and how you use the ropes. 

Meru was very successful, so we immediately began discussing Free Solo. But the filming took about a year-and-a-half. That was partly due to accidents and injuries Alex had. It went into a second season and then a third season of climbing and you just have to follow along. That’s what documentaries are all about. You hitch your wagon to this person and you have to go where they take you. And so, it became a much longer project than initially thought. I began editing six months before Alex made the final climb. At that point they had been filming for about a year. So I came on in January and he made the climb in June – at which point I was well into the process of editing.

[OP] There’s a point in Free Solo, where Alex had started the ascent once and then stopped, because he wasn’t feeling good about it. Then it was unclear whether or not he would even attempt it again. Was that the six-month point when you joined the production?

[BE] Yes, that’s it. It’s very much the climbers’ philosophy that you have to feel it, or you don’t do it. That’s very true of free soloing. We wanted him to signal the action, “This is what I plan to do.” And he wouldn’t do it – ever – because that’s against the mentality of climbing. “If I feel it, I may do it. Otherwise, not.” It’s great for climbing, but not so good for film production.

[OP] Unlike any other film project, failure in this case would have meant Alex’s death. In that event you would have had a completely different film. That was touched on in the film, but what was the behind-the-scenes thinking about the possibility of such a catastrophe? Any Plan B?

[BE] In these vérité documentaries you never know what’s going to happen, but this is an extreme example. He was either going to do it and succeed, decide he wasn’t going to do it, or die trying, and that’s quite a range. So we didn’t know what film we were making when I started editing. We were going to go with the idea of him succeeding and then we’d reconsider if something else happened. That was our mentality, although in the back of our minds we knew this could be quite different.

When they started, it wasn’t with the intention of making this film. Jimmy knew Alex for 10 years. They were old friends and had done a lot of filming together. He thought Alex would be a great subject for a documentary. That’s what they proposed to Nat Geo – just a portrait of Alex – and Alex said, “If you are going to do that, then I’ve got to do something worthwhile. I’m going to try to free solo El Cap.” He told that to Chai while Jimmy wasn’t there. Chai is not a climber and she thought, “Great, that sounds like it will be a good film.” Jimmy completely freaked out when he found out, because he knew what it meant.

It’s an outrageous concept even to climbers. They actually backed off and had to reconsider whether this was something they wanted to get involved in. Do you really want to see your friend jeopardize his life for this? Would the filming add additional pressure on Alex? They had to deal with this even before they started shooting, which is why that was part of the film. I felt it was a very important idea to get across. Alex is taciturn, so you needed ways to understand him and what he was doing. The crew as a character really helped us do that. They were people Alex could interact with and the audience could identify with.

The other element that I felt was very important, was Sanni [McCandless, Alex Honnold’s girlfriend], who suddenly came onto the scene after the filming began. This felt like a very important way to get to know Alex. It also became another challenge for Alex – whether he would be able not only to climb this mountain, but whether he would be able to have a relationship with this woman. And aren’t those two diametrically opposed? Being able to open yourself up emotionally to someone, but also control your emotions enough to be able to hang by your fingertips 2,000 feet in the air on the side of a cliff.

[OP] Sanni definitely added a lot of humanity to him. Before the climb they discuss the possibility of his falling to his death and Alex’s point of view is that’s OK. “If I die, I die.” I’m not sure he really believed that deep inside. Or did he?

[BE] Alex is very purposeful and lives every day with intention. That’s what’s so intriguing. He knows any minute on the wall could be his last and he’s comfortable with that. He felt like he was going to succeed. He didn’t think he was going to fall. And if he didn’t feel that way he wasn’t going to do it. Seeing the whole thing through Sanni’s eyes allowed us as the audience to get closer to and identify with Alex. We call that moment the ‘Take me into consideration’ scene, which I felt was vitally important.

[OP] Did you have any audience screenings of the rough cuts? If so, how did that inform your editing choices?

[BE] We did do some screenings and it’s a tricky thing. Nat Geo was a great partner throughout. Most companies wouldn’t be able to deal with this going on for a year-and-a-half. It’s in Nat Geo’s DNA to fund exploration and make exploratory films. They were completely supportive, but they did decide they wanted to get into Sundance and we were a month from the deadline. We brought in three other editors (Keiko Deguchi, Jay Freund, and Brad Fuller) to jump in and try to make it. Even though we got an extension and we did a great job, we didn’t get in. The others left and I had another six months to work on the film and make it better. Because of all of this, the screenings were probably too early. The audience had trouble understanding Alex, understanding what he’s trying to do – so the first couple screenings were difficult.

We knew when we saw the initial climbing footage that the climb itself was going to be amazing. By the time we showed it to an audience, we were completely immune to any tension from the climb – I mean, we’d seen it 200 times. It was no longer as scary to us as it had been the first time we saw it. In editing you have to remember the initial reaction you had to the footage so that you can bring it to bear later on. It was a real struggle to make the rest of the story as strong as possible to keep you engaged, until we got to the climb. So we were pleasantly surprised to see that people were so involved and very tense during the climb. We had underestimated that.

We also figured that everyone would already know how this thing ends. It was well-publicized that he successfully climbed El Cap. The film had to be strong enough that people could forget they knew what happened. Although I’ve had people tell me they could not have watched the climb if they hadn’t known the outcome.

[OP] Did you end up emphasizing some aspects over others as a result of the screenings?

[BE] The main question to the audience is, “Do you understand what we are trying to say?” And then, “What do you think of him or her as a character?” That’s interesting information that you get from an audience. We really had to clarify what his goal was. He never says at the beginning, “I’m going to do this thing.” In fact, I couldn’t get him to say it after he did it. So it was difficult to set up his intention. And then it was also difficult to make clear what the steps were. Obviously we couldn’t cover the whole 3,000 feet of El Capitan, so they had to concentrate on certain areas.

We decided to cover five or six of the most critical pitches – sections of the climb – to concentrate on those and really cover them properly during the filming. These were challenging to explain and it took a lot of effort to make that clear. People ask, “How did you manage to cut the final climb – it was amazing.” Well, it worked because of the second act that explains what he is trying to do. We didn’t have to say anything in the third act. You just watch because you understand. 

When we started people didn’t understand what free soloing is. At first we were calling the film Solo. The nomenclature of climbing is confusing. Soloing is actually climbing with a rope, but only for protection. Then we’d have to explain what free soloing was as opposed to soloing. However, Han Solo came along and stole our title, so it was much easier to call it Free Solo. Explaining the mentality of climbing, the history of climbing, the history of El Capitan, and then what exactly the steps were for him to accomplish what he was trying to do – all that took a long time to get right and a lot of that came out of good feedback from the audience.

Then, “Do you understand the character?” At one point we didn’t have enough of Sanni and then we had too much of Sanni. It became this love story and you forgot that he was going to climb. So the balancing was tricky.

[OP] Since you were editing before the final outcome and production was still in progress, did you have an opportunity to request more footage or that something in particular be filmed that you were missing in the edit?

[BE] That was the big advantage to starting the edit before the filming was done. I often end up coming into projects that are about 80-90% shot on average. So they have the ability to get pick-ups if people are alive or if the event can still be filmed in some way. This one was more ‘in progress.’ For instance, he practiced a specific move a lot for the most difficult pitch and I kept asking for more of that. We wanted to show how many times he practiced it in order to get the feel of it.

[OP] Let’s switch gears and talk about the technical side. Which edit system did you use to cut Free Solo?

[BE] We were using Avid Media Composer 8.8.5 with Nexis shared storage. Avid is my first choice for editing. I’ve done about four films on the old Final Cut – Meru being one of them – but, I much prefer Avid. I’ve often inherited projects that were started on something else, so you are stuck. On this one we knew going in that we would do it on Avid. Their ScriptSync feature is terrific. Any long discussions or sit-down interviews were transcribed. We could then word-search them, which was invaluable. My associate editor, Simona Ferrari, set up everything and was also there for the output.

[OP] Did you handle the finishing – color correction and sound post – in-house or go outside to another facility?

[BE] We up-rezzed in the office on [Blackmagic Design DaVinci] Resolve and then took that to Company 3 for finishing and color correction. Deborah Wallach did a great job sound editing and we mixed with Tommy Fleischman [Hugo, The Wolf of Wall Street, BlacKkKlansman]. They shot this on about every camera, aspect ratio, and frame rate imaginable. But if they’re hanging 2,000 feet in the air and didn’t happen to hit the right button for the frame rate – you really can’t complain too much! So there was an incredibly wide range and Simona managed to handle all that in the finishing. There wasn’t a lot of archival footage, but there were photos for the backstory of the family.

The other big graphic element was the mountain itself. We needed to be able to trace his route up the mountain and that took forever. It wasn’t just to show his climb, but also to connect the pitches that we had concentrated on, since there wasn’t much coverage between them. Making this graphic became very complicated. We tried one house and they couldn’t do it. Finally, Big Star, who was doing the other graphics – photomontages and titles – took this on. It was the very last thing done and was dropped in during the color correction session.

For the longest time in the screenings, the audience was watching a drawing that I had shot off of the cutting room wall and traced in red. It was pretty lame. For the screenings, it was a shot of the mountain and then I would dissolve through to get the line moving. After a while we had some decent in and out shots, but nothing in-between, except this temporary graphic that I created. 

[OP] I caught Free Solo on the plane to Las Vegas for NAB and it had me on the edge of my seat. I know the film was also released in IMAX, so I can only imagine what that experience was like.

[BE] The film wasn’t made for IMAX – that opportunity came up later. It’s a different film on IMAX. Although there is incredible high-angle photography, it’s an intimate story. So it worked well on a moderately big screen. But in IMAX it becomes a spectacle, because you can really see all those details in the high-angle shots. I have cut an IMAX film before and you do pace them differently, because of the ability to look around. However, there wasn’t a different version of Free Solo made for IMAX – we didn’t have the freedom to do that. Of course, the whole film is largely handheld, so we did stabilize a few shots. IMAX merely used their algorithm to bump it up to their format. I was shocked – it was beautiful.

[OP] Let’s talk a bit about your process as an editor. For instance, music. Different editors approach music differently. Do you cut with temp music or wait until the very end to introduce the score?

[BE] Marco Beltrami [Fantastic Four, Logan, Velvet Buzzsaw] was our composer, but I use temp music from very early on. I assemble a huge library of scratch music – from other films or from the potential composers’ past films. I use that until we get the right feel for the music and that’s what we show to the composer. It gives us something to talk about. It’s much easier to say, “We like what the music is doing here, but it’s the wrong instrumentation.” Or, “This is the right instrument, but the wrong tempo.” It’s a baseline.

[OP] How do you tackle the footage at the very beginning? Do you create selects or Kem rolls or some other approach?

[BE] I create a road map to know where I’m going. I go through all the dailies and pull the stuff that I think might be useful. Everything from the good-looking shots to a taste of something that I may never use, but I want to remember. Then I screen selects reels. I try to do that with the director. Sometimes we can schedule that and sometimes not. On Free Solo there was over 700 hours of footage, so it’s hard to get your arms around that. By the time you get through looking at the 700th hour you’ve forgotten the first one. That’s why the selecting process is so important to me. The selects amount to maybe a third of the dailies footage. After screening the selects, I can start to see the story and how to tell it. 

I make index cards for every scene and storyboard the whole thing. By that I mean arrange the cards on a wall. They are color-coded for places, years, or characters. It allows me to stand back and see the flow of the film, to think about the structure, and the points that I have to hit. I basically cut to that. Of course, if it doesn’t work, I re-arrange the index cards (laugh).

A few years ago, I did a film about the Dixie Chicks [Shut Up & Sing] at the time they got into trouble for comments they had made about President Bush. We inherited half of the footage and shot half. The Dixie Chicks went on to produce a concert and an album based upon their feelings about the whole experience. It was kind of present and past, so there were basically two different colors to the cards. It was not cut in chronological order, so you could see very quickly whether you were in the past or the present just by looking at the wall. There were four editors working on Shut Up & Sing and we could look at the wall, discuss, and decide if the story was working or not. If we moved this block of cards, what would be the consequences of telling the story in a different order?

[OP] Were Jimmy or Chai very hands-on as directors during the edit – in the room with you every day at the end?

[BE] Chai and Jimmy are co-directors and so Jimmy tended to be more in the field and Chai more in the edit room. Since we had worked together before, we had built a common language and a trust. I would propose ideas to Chai and try them and she would take a look. My feeling is that the director is very close to it and not able to see the dailies with fresh eyes. I have the fresh perspective. I like to take advantage of that and let them step back a little. By the end, I’m the one that’s too close to it and they have a little distance if they pace themselves properly.

[OP] To wrap it up, what advice would you have for young editors tackling a documentary project like this?

[BE] Well, don’t climb El Cap – you probably won’t make it (laugh)! I always preach this to my students: I encourage them to make an outline and work towards it. You can make index cards like I do, you can make a Word document, a spreadsheet; but try to figure out what your intentions are and how you are going to use the material. Otherwise, you are just going to get lost. You may be cutting things that are lovely, but then don’t fit into the overall structure. That’s my big encouragement.

Sometimes with vérité projects there’s a written synopsis, but for Free Solo there was nothing on paper at the beginning. They went in with one idea and came out with a different film. You have to figure out what the story is and that’s all part of the editing process. This goes back to the Maysles’ approach. Go out and capture what happened and then figure out the story. The meaning is found in the cutting room.

Images courtesy of National Geographic and Bob Eisenhardt.

©2019 Oliver Peters

Black Mirror: Bandersnatch

Bandersnatch was initially conceived as an interactive episode within the popular Black Mirror anthology series on Netflix. Instead, Netflix decided to release it as a standalone, spin-off film in December 2018. It’s the story of programmer Stefan Butler (Fionn Whitehead) as he adapts a choose-your-own-adventure novel into a video game. Set in 1984, the story lets viewers make decisions for Butler’s actions, which then determine the next branch that is shown. They can go back through Bandersnatch and opt for different decisions, in order to experience other versions of the story.

Bandersnatch was written by show creator Charlie Brooker (Black Mirror, Cunk on Britain, Cunk on Shakespeare), directed by David Slade (American Gods, Hannibal, The Twilight Saga: Eclipse), and edited by Tony Kearns (The Lodgers, Cardboard Gangsters, Moon Dogs). I recently had a chance to interview Kearns about the experience of working on such a unique production.

__________________________________________________

[OP] Please tell me a little about your editing background leading up to cutting Bandersnatch.

[TK] I started out almost 30 years ago editing music videos in London. I did that full-time for about 15 years working for record companies and directors. At the tail end of that a lot of the directors I was working with moved into doing commercials, so I started editing commercials more and more in Dublin and London. In Dublin I started working on long form, feature film projects and cut about 10 projects that were UK or European co-productions with the Irish Film Board.

In 2017 I got a call from Black Mirror to edit the Metalhead episode, which was directed by David Slade. He was someone I had worked with on music videos and commercials 15 years previously, before he had moved to the United States. That was a nice circularity. We were together working again, but on a completely different type of project – drama, on a really cool series, like Black Mirror. It went very well, so David and I were asked to get involved with Bandersnatch, which we jumped at, because it was such an amazing, different kind of project. It was unlike anything either of us – or anyone else, for that matter – has ever done to that level of complexity.

[OP] Other attempts at interactive storytelling – with the exception of the video game genre – have been hit-or-miss. What were your initial thoughts when you read the script for the first time?

[TK] I really enjoyed the script. It was written like a conventional script, but with software called Twine, so you could click on it and go down different paths. Initially I was overwhelmed at the complexity of the story and the structure. It wasn’t that I was like a deer in the headlights, but it gave me a sense of scale of the project and [writer/show runner] Charlie Brooker’s ambition to take the interactive story to so many layers.

On my own time I broke down the script and created spreadsheets for each of the eight sections in the script and wrote descriptions of every possible permutation, just to give me a sense of what was involved and to get it in my head what was going on. There are so many different narrative paths – it was helpful to have that in my brain. When we started editing, that would also help me to keep a clear eye at any point.
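
To give a sense of what those spreadsheets were tracking, here is a minimal sketch of a branching script modeled as a graph, with every narrative path enumerated. The segment names and choices below are invented for illustration only – the real structure was written in Twine and tracked in Kearns’ spreadsheets.

```python
# Minimal sketch: a branching script as a graph, with every narrative
# path enumerated. Segment names and choices are invented placeholders,
# not the actual Bandersnatch structure.

branches = {
    "breakfast":    ["sugar_puffs", "frosties"],
    "sugar_puffs":  ["accept_job", "refuse_job"],
    "frosties":     ["accept_job", "refuse_job"],
    "accept_job":   [],                 # terminal ending
    "refuse_job":   ["therapy", "follow_colin"],
    "therapy":      [],                 # terminal ending
    "follow_colin": [],                 # terminal ending
}

def narrative_paths(segment, path=()):
    """Yield every start-to-ending path through the branch graph."""
    path = path + (segment,)
    if not branches[segment]:           # no outgoing choices: an ending
        yield path
        return
    for choice in branches[segment]:
        yield from narrative_paths(choice, path)

for p in narrative_paths("breakfast"):
    print(" -> ".join(p))               # six permutations from this toy graph
```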

[OP] How long of a schedule did you have to post Bandersnatch?

[TK] 17 weeks was the official edit time, which isn’t much longer than on a low-budget feature. When I mentioned that to people, they felt that was a really short amount of time; but, we did a couple of weekends, we were really efficient, and we knew what we were doing.

[OP] Were you under any running length constraints, in the same way that a TV show or a feature film editor often wrestles with on a conventional linear program?

[TK] Not at all. This is the difference – linear doesn’t exist. The length depends on the choices that are made. The only direction was for it not to be a sprawling 15-hour epic – that there would be some sort of ball park time. We weren’t constrained, just that each segment had to feel right – tight, but not rushed.

[OP] With that in mind, what sort of process did you go through to get it to feel right?

[TK] Part of each edit review was to make it as tight or as lean as it needed to be. Netflix developed their own software, called Branch Manager, which allowed people to review the cut interactively by selecting the choice points. My amazing assistant editor, John Weeks, is also a coder, so he acquired an extra job, which was to take the exports and do the coding in order to have everything work in Branch Manager. He’s a very robust person, but I think we almost broke him (laughs), because there were up to 100 Branch Manager versions by the end. The coding was hanging on by a thread. He was a bit like Scotty in Star Trek, “The engines can’t hold it anymore, Captain!”

By using Branch Manager, people could choose a path and view it and give notes. So I would take the notes, make the changes, and it would be re-exported. Some segments might have five cuts while others would be up to 13 or 14. Some scenes were very straightforward, but others were more difficult to repurpose.

Originally there were more segments in the script, but after the first viewings it was felt that there were too many in there. It was on the borderline of being off-putting for viewers. So we combined a few, but I made sure to keep track of that so it was in the system. There was a lot of reviewing, making notes, updating spreadsheets, and then making sure John had the right version for the next Branch Manager creation. It was quite an involved process.
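
Branch Manager is Netflix’s internal tool and its format isn’t described in the interview, so purely as an illustration of the bookkeeping involved, the review builds can be pictured as being driven by a manifest like the one below – each segment pointing to its current cut export and the choice points that lead out of it. Every name and filename here is hypothetical.

```python
# Hypothetical bookkeeping for interactive review builds (illustration
# only - not Netflix's Branch Manager format). Each segment records the
# latest exported cut and the choices that lead out of it.

manifest = {
    "segment_breakfast": {
        "export": "breakfast_v07.mov",
        "choices": {"Sugar Puffs": "segment_sugar_puffs",
                    "Frosties": "segment_frosties"},
    },
    "segment_sugar_puffs": {"export": "sugar_puffs_v03.mov", "choices": {}},
    "segment_frosties":    {"export": "frosties_v05.mov", "choices": {}},
}

def stale_segments(manifest, latest_exports):
    """Segments whose newest export differs from what the manifest references,
    i.e. cuts that must be re-linked before the next review build."""
    return [seg for seg, info in manifest.items()
            if seg in latest_exports and latest_exports[seg] != info["export"]]

# After a recut of the opening, only that segment needs updating:
print(stale_segments(manifest, {"segment_breakfast": "breakfast_v08.mov"}))
```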

[OP] How were you able to keep all of this straight? Did you use the common technique of scene cards on the wall or something different?

[TK] If you looked at flowcharts your head would explode, because it would be like looking at the wiring diagram of an old-fashioned telephone exchange. There wouldn’t have been enough room on the wall. For us, it would just be on paper – notebooks and spreadsheets. It was more in our heads – our own sense of what was happening – that made it less confusing. If you had the whole thing as a picture, you just wouldn’t know where to look.

[OP] In a conventional production an editor always has to be mindful that when something is removed, it may have ramifications to the story later on. In this case, I would imagine that those revisions affected the story in either direction. How were you able to deal with that?

[TK] I have been asked how we knew that each path would have a sense of a narrative arc. We couldn’t think of it as one, total narrative arc. That’s impossible. You’d have to be a genius to know that it’s all going to work. We felt the performances were great, the story was strong, but it doesn’t have a conventional flow. There are choice points, which act as a propellant into the next part of the film, creating an experience quite unlike the straight story arc of conventional films or episodes. Although there wasn’t a traditional arc, it still had to feel like a well-told story – one where you have empathy and a sense of engagement, so that it wasn’t a gimmick.

[OP] How did the crew and actors manage to keep the story straight in their minds as scenes were filmed?

[TK] As with any production, the first few days are finding out what you’ve let yourself in for. This was a steep learning curve in that respect. Only three weeks of the seven-week shoot was in the same studio complex where I was working, so I wasn’t present. But there was a sense that they needed to make it easier for the actors and the crew. The script supervisor, Marilyn Kirby, was amazing. She was the oracle for the whole shoot. She kept the whole show on the road, even when it was quite complicated. The actors got into the swing of it quickly, because I had no issues with the rushes. They were fantastic.

[OP] What camera formats were used and what is your preparation process for this footage prior to editing?

[TK] It’s the most variety of camera formats I’ve ever worked on. ARRI Alexa 65 and RED, but also 1980s Ikegami TV cameras, Super 8mm, 35mm, 16mm, and VHS. Plus, all of the print stills were shot on black-and-white film. The data lab handled the huge job to keep this all organized and provide us with the rushes. So, when I got them, they were ready to go. The look was obviously different between the sources, but otherwise it was the same as a regular film. Each morning there was a set of ProRes Proxy rushes ready for us. John synced and organized them and handed them over. And then I started cutting. Considering all the prep the DIT and the data lab had to go through, I think I was in a privileged position!

[OP] What is your method when first starting to edit a scene?

[TK] I watch all of the rushes and can quickly see which take might be the bedrock framing for a scene – which is best for a given line. At that point I don’t just slap things together on a timeline. I try to get a first assembly to be as good as possible, because it just helps anyone who sees it. If you show a director or a show runner a sloppy cut, they’ll get anxious and I don’t want that to happen. I don’t want to give the wrong impression.

When I start a scene, I usually put the wide down end-to-end, so I know I have the whole scene. Then I’ll play it and see what I have in the different framings for each line – and then the next line and the next and so on. Finally, I go back and take out angles where I think I may be repeating a shot too much, extend others, and so on. It’s a build-it-up process in an effort to get to a semi-fine cut as quickly as possible.

[OP] Were you able to work with circle takes and director’s notes on Bandersnatch?

[TK] I did get circle takes, but no director’s notes. David and I have an intuitive understanding, which I hope to fulfill each time – that when I watch the footage he shoots, I’ll get what he’s looking for in the scene. With circle takes, I have to find out very quickly whether the script supervisor is any good or not. Marilyn is brilliant so whenever she’s doing that, I know that take is the one. David is a very efficient director, so there weren’t a massive number of takes – usually two or three takes for each set-up. Everything was shot with two cameras, so I had plenty of coverage. I understand what David is looking for and he trusts me to get close to that.

[OP] With all of the various formats, what sort of shooting ratio did you encounter? Plus, you had mentioned two-camera scenes. What is your approach to that in your edit application?

[TK] I believe the various story paths totaled about four-and-a-half hours of finished material. There was a 3:1 shooting ratio, times two cameras – so maybe 6:1 or even 9:1. I never really got a final total of what was shot, but it wasn’t as big as you’d expect. 

When I have two-camera coverage I deal with it as two individual cameras. I can just type in the same timecode for the other matching angle. I just get more confused with what’s there when I use multi-cam. I prefer to think of it as that’s the clip from the clip. I hope I’m not displaying an anti-technology thing, but I’m used to it this way from doing music videos. I used to use group clips in Avid and found that I could think about each camera angle more clearly by dealing with them separately.
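
As a rough illustration of that matched-timecode habit – not a description of how any particular NLE implements it – finding the companion angle is just a lookup of the same source timecode within the other camera’s clips. The clip names, frame rate, and timecodes below are invented.

```python
# Sketch of "type the same timecode on the other camera" as a lookup.
# Clip names, frame rate, and timecodes are invented for illustration.

FPS = 25

def tc_to_frames(tc, fps=FPS):
    """Convert 'HH:MM:SS:FF' into an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

# The B camera's clips: (clip name, start timecode, duration in frames)
b_camera = [
    ("B001_C003", "10:04:12:00", 1500),
    ("B001_C004", "10:06:30:00", 2000),
]

def matching_angle(tc, clips, fps=FPS):
    """Return (clip name, frame offset) on the other camera at this timecode."""
    frame = tc_to_frames(tc, fps)
    for name, start, duration in clips:
        start_frame = tc_to_frames(start, fps)
        if start_frame <= frame < start_frame + duration:
            return name, frame - start_frame
    return None

print(matching_angle("10:05:00:00", b_camera))   # ('B001_C003', 1200)
```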

[OP] I understand that you edited Bandersnatch on Adobe Premiere Pro. Is that your preferred editing software?

[TK] I’ve used Premiere Pro on two feature films, which I cut in Dublin, and a number of shorts and TV commercials. If I am working where I can set up my own cutting room, then I’m working with Premiere. I use both Avid and Adobe, but I find I’m faster on Premiere Pro than on Media Composer. The tools are tuned to help me work faster.

The big thing on this job was that you can have multiple sequences open at the same time in Premiere. That was going to be the crunch thing for me. I didn’t know about Branch Manager when I specified Premiere Pro, so I figured that would be the way we would need to review the segments – simply click on a sequence tab and play it as a rudimentary way to review a story path. The company that supplied the gear wasn’t as familiar with Premiere [as they were with Avid], so there were some issues, but it was definitely the right choice.

[OP] Media Composer’s strength is in multi-editor workflows. How did you handle edit collaboration in Premiere Pro?

[TK] We used Adobe’s shared projects feature, which worked, but wasn’t as efficient as working with Avid in that version of Premiere. It also wasn’t ideal that we were working from Avid Nexis as the shared storage platform. In the last couple of months I’ve been in contact with the people at Adobe and I believe they are sorting out some of the issues we were having in order to make it more efficient. I’m keen for that to happen.

In the UK and London in particular, the big player is Avid and that’s what people know, so anything different, like Premiere Pro, is seen with a degree of suspicion. When someone like me comes in and requests something different, I guess I’m viewed as a bit of a pain in the ass. But, there shouldn’t just be one behemoth. If you had worked on the old Final Cut Pro, then Premiere Pro is a natural fit – only more advanced and supported by a company that didn’t want to make smart phones and tablets.

[OP] Since Adobe Creative Cloud offers a suite of compatible software tools, did you tap into After Effects or other tools for your edit?

[TK] That was another real advantage – the interaction with the graphics user interface and with After Effects. When we mocked up the first choice points, it was so easy to create, import, and adjust. That was a huge advantage. Our VFX editor was able to build temp VFX in After Effects and we could integrate that really easily. He wasn’t just using an edit system’s effects tool, but actual VFX software, which seamlessly integrated with Premiere. Although these weren’t final effects at full 4K resolution, he was able to do some very complex things, so that everyone could go, “Yes, that’s it.”

[OP] In closing, what take-away would you offer an editor interested in tackling an interactive story as compared to a conventional linear film?

[TK] I learned to love spreadsheets (laugh). I realized I had to be really, really organized. When I saw the script I knew I had to go through it with a fine-tooth comb and get a sense of it. I also realized you had to unlearn some things you knew about conventional episodic TV. You can’t think of some things in the same way. A practical thing for the team is that you have to have someone who knows coding, if you are using a similar tool to Branch Manager. It’s the only way you will be able to see it properly.

It’s a different kind of storytelling pressure that you have to deal with, mostly because you have to trust your instincts even more that it will work as a coherent story across all the narrative paths. You also have to be prepared to unlearn some of the normal methods you might use. One example is that you have to cut the opening of different segments differently to work with the last shot of the previous choice point, so you can’t just go for one option, you have to think more carefully what the options are. The thing is not to walk in thinking it’s going to be the same as any other production, because it ain’t.
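
As a concrete way to picture that last point, the head of a segment can be thought of as keyed by the choice point the viewer arrives from. This is a purely hypothetical sketch – the segment and version names are invented.

```python
# Hypothetical sketch of alternate openings for one segment, keyed by the
# choice point the viewer arrives from. All names are invented.

segment_openings = {
    "record_shop": {
        "from_accept_job": "record_shop_head_A_v04",   # matches the office exit
        "from_refuse_job": "record_shop_head_B_v02",   # matches the bedroom exit
    },
}

def opening_for(segment, incoming_choice):
    """Pick the head version of a segment that matches the previous choice point."""
    return segment_openings[segment][incoming_choice]

print(opening_for("record_shop", "from_refuse_job"))   # record_shop_head_B_v02
```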

For more on Bandersnatch, check out these links: postPerspective, an Art of the Guillotine interview with Tony Kearns, and a scene analysis at This Guy Edits.

Images courtesy of Netflix and Tony Kearns.

©2019 Oliver Peters

Is it time to reconsider Final Cut Pro X?

While Final Cut Pro X may have ultimately landed in the market sector that Apple envisioned, the industry widely acknowledged that the original launch could have been better managed. Many staunch Final Cut Pro (“legacy”) users were irrevocably alienated. That’s a shame, because FCPX wasn’t a bad design when released – merely incomplete. In the eight years that have followed, the user base has grown to more than 2.5 million (April 2018) and the application sports the widest third-party support of any editing software.

I have certainly gone back and forth in my own use of FCPX, depending on whether it was the right tool for a given job. I cut a feature film with it back in the pre-10.1 days when it was a bifurcated application with separate Event and Project files. Since then, I have also used it on plenty of spots and corporate videos. Although my daily workflow is largely Premiere Pro-based now, I regularly use Final Cut Pro X when appropriate, as well as Blackmagic Design DaVinci Resolve and Avid Media Composer. Modern editors need to be NLE-multilingual.

I realize that winning Oscars and cutting large-scale productions isn’t what the majority of editors do. Nevertheless, these types of productions give any product street cred. You are probably aware of Focus and Whiskey Tango Foxtrot, but there are certainly others that have used FCPX. Hollywood studio films are still dominated by Avid Media Composer; however, short films cut with FCPX have won the short film Oscar category two years in a row. While largely invisible to many US viewers, major international productions, on par with Game of Thrones, have been edited using Final Cut Pro X.

If you were one of those FCP7 users who jumped ship to another tool, then maybe it’s time to revisit Final Cut Pro X. There are many reasons I say that. In the past eight years, Apple has added wide codec support, LUTs, HDR capabilities, vastly improved color correction tools, and an easy method of working with captioning. Final Cut is clearly the better tool in many situations and here’s a quick overview why I feel that way.

What productions are best with FCPX?

Final Cut Pro X is capable of handling all types of editing, but it’s more ideal for some than others. The biggest differentiator is turnaround time. If you have to get done quickly – from ingest to delivery – then FCPX is hard to beat. It handles media better than any other NLE without the need for the beefiest hardware. Want to cut 4K ProResHQ on a two-year-old MacBook Pro? Then FCPX shines. That makes it a natural in broadcast news, promos, and sports. It’s also perfect for non-broadcast event coverage. Frankly, I’m surprised that US broadcasters haven’t gravitated to it like various other broadcasters around the world – especially for cutting news stories. The workflow, interface, and low hardware requirements make it well-suited to the task.

Station promo production might be questionable for some, but stop and think about the use of Motion Templates and how that technology can be applied to broadcast design. Final Cut features the unique ability to use templates that any user can create and publish as an effect out of Apple Motion. Therefore, custom effects, animation, and graphics can easily be created specifically for a station’s bespoke look.

For example, a broadcast group or network that owns multiple stations in different cities could have one creative team develop a custom station graphics package for each outlet, simply by using Motion. Those templates could be deployed to each promo department and installed into the individual FCPX edit systems. This would allow each editor to modify or customize time and event information based on the published parameters without mistakenly deviating from the prescribed graphic look. That’s a broadcast creative director’s dream.

A simple hardware footprint

Obviously Final Cut requires Apple computers, but there’s easy connectivity to media from external Thunderbolt, USB, and ethernet-based storage. Some facilities certainly need elaborate shared storage systems for collaborative workflows, but others don’t. If you are a creative editorial boutique, all of a given project’s proxy editing files can be stored on a single SSD drive, allowing the editor to easily move from room to room, or home to work, simply by carrying the SSD with them. They can even be cutting on a laptop and then bring that in to work, connect to an external display for better monitoring, and keep rocking. With the advent of external GPU systems (eGPU), you can easily augment the horsepower of middle-level Macs when the need arises. 

No external i/o hardware is required for monitoring. While I recommend a simple audio i/o interface and external speakers as a minimum, there are plenty of fixed-location systems where the editors only use headphones. AJA or Blackmagic interfaces to play video out to an external display are optional. Simply connect a high-quality display to the Mac via HDMI or Thunderbolt and FCPX will feed real video to it full screen. Premiere Pro can also do this, but Media Composer and Resolve do not.

Third-party ecosystem

Some of Final Cut’s deficits have developed into a huge asset. It enjoys one of the best ecosystems of third-party tools that enhance the application. These range from translation tools from vendors like Intelligent Assistance and Marquis Broadcast, to a myriad of plug-ins, such as those from FxFactory and Coremelt. Final Cut already comes with a very solid set of built-in effects filters – probably the most useful variety of the various NLE options. Even better, if you also purchase Motion, you can easily create more effects by building your own as Motion Templates. This has resulted in a ton of small developers who create and sell their own variations using this core technology.

You certainly don’t have to purchase any additional effects to be productive with FCPX, but if you do, then one of the better options is FxFactory by Noise Industries. FxFactory is both a set of effects and a delivery platform for other developers. You can use the FxFactory interface to purchase, install, and manage plug-ins and even applications from a diverse catalogue of tools. Pick and choose what you need and grow the repertoire as you see fit. One of the first options to start with is idustrial revolution’s newly revamped XEffects Toolkit. This includes numerous effects and title templates to augment your daily work. Some of these employ built-in tracking technology that allows you to pin items to objects within a shot.

Apple’s latest feature addition is workflow extensions. Adobe introduced this technology first in its products. But Apple has built upon it through macOS integration with apps like Photos and now in Final Cut Pro X. In short, an extension allows direct FCPX integration with another application. Various extensions can be downloaded from the Mac App Store and installed into FCPX. An extension then adds a panel into Final Cut, which allows you to interact with that application from inside the FCPX interface. Some of the first companies offering extensions include frame.io, Shutterstock, and Simon Says.

Subscription

A sore point for many Adobe customers was the shift to the subscription business model. While the monthly rates are reasonable if you are an ongoing business, they have caused some to stick with software as old as CS6 (yikes!). As more companies adopt subscriptions, you have to start wondering when enough is enough. I don’t think we are there yet and Creative Cloud is still a solid value. But if you are an individual who doesn’t make a living with these tools, then it’s a concern. Adobe recently raised eyebrows with the doubling of the monthly cost for its Photography plan. As it turns out, this is an additional pricing plan with more storage rather than a replacement, but that only became evident after what appears to have been a quick fix to the website page. Predictably this gives competitors like ON1 an avenue for counter-marketing.

Concerned with subscriptions? Then the Apple professional applications are an alternative. Final Cut Pro X, Compressor, Motion, and Logic Pro X – coupled with photo and graphics tools from Affinity and/or Pixelmator – provide a viable competing package to Adobe Creative Cloud. Heck, augment that with Fusion and/or DaVinci Resolve – even the free versions – and the collection becomes a formidable toolkit.

The interface

Naturally, the elephant in the room is the FCPX interface. It’s what simultaneously excited and turned off so many FCP7 users. In the end, how you edit with Final Cut Pro X does not have to be all that different than your editing style with other NLEs. Certainly there are differences, but once you get used to the basics, there’s more that’s similar than is different.

Isn’t imitation the highest form of flattery? You only have to look at Adobe Premiere Rush or the new Cut Page in Resolve 16 to realize that just maybe, others are starting to see the value in Apple’s approach. On top of that, there are features touted in Resolve 16, like facial (actually shape) recognition or adjustment layers, that were there even in FCPX 10.0. Whether this all is blatant copying or simply a tip-of-the-hat doesn’t matter. Each company has come to the conclusion that some workflows and some newer editors need a faster and more direct user interface that is easily scalable to small and large screens and to single and dual-display systems.

I realize that many out there will read this post and scream Apple apologist. Whatever. If you’ve shifted to PC, then very little of what I’ve said applies to you. I make my daily living with Apple hardware. While I recognize you can often get superior performance with a PC, I don’t find the need to make a change yet. This means that Final Cut Pro X remains a great option for my workflows. It’s a tool I can use for nearly any job and one that is often better than most. If you rejected it eight years ago, maybe it’s time to take a second look.

©2019 Oliver Peters

NAB Show 2019

This year the NAB Show seemed to emphasize its roots – the “B” in National Association of Broadcasters. Gone or barely visible were the fads of past years, such as stereoscopic 3D, 360-degree video, virtual/augmented reality, drones, etc. Not that these are gone – merely that they have refocused on the smaller segment of market share that reflects reality. There’s not much point in promoting stereo 3D at NAB if most of the industry goes ‘meh’.

Big exhibitors of the past, like Quantel, RED, Apple, and Autodesk, are gone from the floor. Quantel products remain as part of Grass Valley (now owned by Belden), which is the consolidation of Grass Valley Group, Quantel, Snell & Wilcox, and Philips. RED decided last year that small, camera-centric shows were better venues. Apple – well, they haven’t been on the main floor for years, but even this year, there was no off-site, Final Cut Pro X stealth presence in a hotel suite somewhere. Autodesk, which shifted to a subscription model a couple of years ago, had a demo suite in the nearby Renaissance Hotel, focusing on its hero product, Flame 2020. Smoke for Mac users – tough luck. It’s been over for years.

This was a nuts-and-bolts year, with many exhibits showing new infrastructure products. These appeal to larger customers, such as broadcasters and network facilities. Specifically, the world is shifting to an IP-based infrastructure for signal routing, control, and transmission. This replaces the copper and fiber wiring of the past, along with the devices (routers, video switchers, etc.) at either end of the wire. Companies that might have appeared less relevant, like Grass Valley, are back in a strong sales position. Other companies, like Blackmagic Design, are being encouraged by their larger clients to fulfill those needs. And as ever, consolidation continues – this year VizRT acquired NewTek, which has been an early player in video-over-IP with its proprietary NDI protocol.

Adobe

The NAB season unofficially started with Adobe’s pre-NAB release of the CC2019 update. For editors and designers, the hallmarks of this update include a new freeform bin view and adjustable guides in Premiere Pro, and content-aware video fill in After Effects. These are solid additions in response to customer requests, which is something Adobe has focused on. A smaller, but no less important feature is Adobe’s ongoing effort to improve media performance on the Mac platform.

As in past years, their NAB booth was an opportunity to present these new features in-depth, as well as showcase speakers who use Adobe products for editing, sound, and design. Part of the editing team from the series Atlanta was on hand to discuss the team’s use of Premiere Pro and After Effects in their ‘editing crash pad’.

Avid

For many attendees, NAB actually kicked off on the weekend with Avid Connect, a gathering of Avid users (through the Avid Customer Association), featuring meet-and-greets, workshops, presentations, and ACA leadership committee meetings. While past product announcements at Connect have been subdued from the vantage of Media Composer editors, this year was a major surprise. Avid revealed its Media Composer 2019.5 update (scheduled for release at the end of May). This came as part of a host of other updates. Most of these apply to companies that have invested in the full Avid ecosystem, including Nexis storage and Media Central asset management. While those are superb, they only apply to a small percentage of the market. Let’s not forget Avid’s huge presence in the audio world, thanks to the dominance of Pro Tools – now with Dolby Atmos support. With the acquisition of Euphonix years back, Avid has become a significant player in the live and studio sound arena. Various examples of its S-series consoles in action were presented.

Since I focus on editing, let me discuss Media Composer a bit more. The 2019.5 refresh is the first major Media Composer overhaul in years. It started in secret last year. 2019.5 is the first iteration of the new UI, with more to be updated in coming releases. In short, the interface has been modernized and streamlined in ways to attract newer, younger users, without alienating established editors. Its panel design is similar to Adobe’s approach – i.e. interface panels can be docked, floated, stacked, or tabbed. Panels that you don’t want to see may be closed or simply slid to the side and hidden. Need to see a hidden panel again? Simply slide it back open from the edge of the screen.

This isn’t just a new skin. Avid has overhauled the internal video pipeline, with 32-bit floating color and an uncompressed DNx codec. Project formats now support up to 16K. Avid is also compliant with the specs of the Netflix Post Alliance and the ACES logo program.

I found the new version very easy to use and a welcome change; however, it will require some adaptation if you’ve been using Media Composer for a long time. In a nod to the Media Composer heritage, the weightlifter (aka ‘liftman’) and scissors icons (for lift and extract edits) are back. Even though Media Composer 2019.5 is just in early beta testing, Avid felt good enough about it to use this version in its workshops, presentations, and stage demos.

One of the reasons to go to NAB is for the in-person presentations by top editors about their real-world experiences. No one can top Avid at this game – the company can easily tap a host of Oscar, Emmy, BAFTA, and Eddie award winners. The hallmark for many this year was the presentation at Avid Connect and/or at the show by the Oscar-winning picture and sound editing/mixing team for Bohemian Rhapsody. It’s hard not to gather a standing-room-only crowd when you close your talk with the Live Aid finale sequence played in kick-ass surround!

Blackmagic Design

Attendees and worldwide observers have come to expect a surprise NAB product announcement out of Grant Petty each year and he certainly didn’t disappoint this time. Before I get into that, there were quite a few products released, including for IP infrastructures, 8K production and post, and more. Blackmagic is a full spectrum video and audio manufacturer that long ago moved into the ‘big leagues’. This means that just like Avid or Grass Valley, they have to respond to pressure from large users to develop products designed around their specific workflow needs. In the BMD booth, many of those development fruits were on display, like the new Hyperdeck Extreme 8K HDR recorder and the ATEM Constellation 8K switcher.

The big reveal for editors was DaVinci Resolve 16. Blackmagic has steadily been moving into the editorial space with this all-in-one, edit/color/mix/effects/finishing application. If you have no business requirement for – or emotional attachment to – one of the other NLE brands, then Resolve (free) or Resolve Studio (paid) is an absolute no-brainer. Nothing can touch the combined power of Resolve’s feature set.

New for Resolve 16 is an additional editorial module called the Cut Page. At first blush, the design, layout, and operation are amazingly similar to Apple’s Final Cut Pro X. Blackmagic’s intent is to make a fast editor where you can start and end your project for a time-sensitive turnaround without the complexities of the Edit Page. However, it’s just another tool, so you could work entirely in the Cut Page, or start in the Cut Page and refine your timeline in the Edit Page, or skip the Cut Page altogether. Resolve offers a buffet of post tools that are at your disposal.

While Resolve 16’s Cut Page does elicit a chuckle from experienced FCPX users, it offers some new twists. For example, there’s a two-level timeline view – the top section is the full-length timeline and the bottom section is the zoomed-in detail view. The intent is quick navigation without the need to constantly zoom in and out of long timelines. There’s also an automatic sync detection function. Let’s say you are cutting a two-camera show. Drop the A-camera clips onto the timeline and then go through your B-camera footage. Find a cut-away shot, mark in/out on the source, and edit. It will ‘automagically’ edit to the in-sync location on the timeline. I presume this is matched by either common sound or timecode. I’ll have to see how this works in practice, but it demos nicely. Changes to other aspects of Resolve were minor and evolutionary, except for one other notable feature. The Color Page added its own version of content-aware video fill.
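
How the sync match is actually done is speculation above, so purely as a conceptual sketch of the ‘common sound’ approach – and in no way Blackmagic’s implementation – matching by audio boils down to cross-correlating the two clips’ tracks and taking the best-aligned lag:

```python
import numpy as np

def audio_sync_offset(a_track, b_track, sample_rate=48000):
    """Estimate how many seconds later than clip A clip B starts, by
    brute-force cross-correlation of their audio. Conceptual only: real
    tools would use FFT-based correlation, normalization, and windowing."""
    corr = np.correlate(a_track, b_track, mode="full")
    lag_samples = int(corr.argmax()) - (len(b_track) - 1)
    return lag_samples / sample_rate

# Toy demo: the same 'clap' sits at sample 5 of A and sample 2 of B,
# so B starts 3 samples after A (3 / 48000 s).
a = np.zeros(64); a[5] = 1.0
b = np.zeros(32); b[2] = 1.0
print(audio_sync_offset(a, b))   # 6.25e-05
```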

Another editorial product addition – tied to the theme of faster, more-efficient editing – was a new edit keyboard. Anyone who’s ever cut in the linear days – especially those who ran Sony BVE9000/9100 controllers – will feel very nostalgic. It’s a robust keyboard with a high-quality, integrated jog/shuttle knob. The feel is very much like controlling a tape deck in a linear system, with fast shuttle response and precise jogging. The precision is far better than any of the USB controllers, like a Contour Shuttle. Whether or not enough people will have interest in shelling out $1,025 for it remains to be seen. It’s a great tool, but are you really faster with one than with FCPX’s skimming and a standard keyboard and mouse?

Ironically, if you look around the Blackmagic Design booth there does seem to be a nostalgic homage to Sony hardware of the past. As I said, the edit keyboard is very close to a BVE9100 keyboard. Even the style of the control panel on the Hyperdecks – and the look of the name badges on those panels – is very much Sony’s style. As humans, this appeals to our desire for something other than the glass interfaces we’ve been dealing with for the past few years. Michael Cioni (Panavision, Light Iron) coined this as ‘tactile attraction’ in his excellent Faster Together Stage talk. It manifests itself not only in these type of control surfaces, but also in skeuomorphic designs applied to audio filter interfaces. Or in the emotion created in the viewer when a colorist adds film grain to digital footage.

Maybe Grant is right and these methods are really faster in a pressure-filled production environment. Or maybe this is simply an effort to appeal to emotion and nostalgia by Blackmagic’s designers. (Check out Grant Petty’s two-hour 2019 Product Overview for more in-depth information on Blackmagic Design’s new products.)

8K

I won’t spill a lot of words on 8K. Seems kind of silly when most delivery is HD and even SD in some places. A lot of today’s production is in 4K, but really only for future-proofing. But the industry has to sell newer and flashier items, so they’ve moved on to 8K pixel resolution (7680 x 4320). Much of this is driven by Japanese broadcast and manufacturer efforts, who are pushing into 8K. You can laugh or roll your eyes, but NAB had many examples of 8K production tools (cameras and recorders) and display systems. Of course, it’s NAB, making it hard to tell how many of these are only prototypes and not yet ready for actual production and delivery.

For now, it’s still a 4K game, with plenty of mainstream product. Not only cameras and NLEs, but items like AJA’s KiPro family. The KiPro Ultra Plus records up to four channels of HD or one channel of 4K in ProRes or DNx. The newest member of the family is the KiPro GO, which records up to four channels of HD (25Mbps H.264) onto removable USB media.

Of course, the industry never stops, so while we are working with HD and 4K, and looking at 8K, the developers are planning ahead for 16K. As I mentioned, Avid already has project presets built-in for 16K projects. Yikes!

HDR

HDR – or high dynamic range – is about where it was last year. There are basically four formats vying to become the final standard used in all production, post, and display systems. While there are several frontrunners and edicts from distributors to deliver HDR-compatible masters, there still is no clear path. If you shoot in log or camera raw with nearly any professional camera produced within the past decade, you have originated footage that is HDR-compatible. But none of the low-cost post solutions make this easy, and without the right monitoring environment, you are wasting your time. If anything, those waters are muddier this year. There were a number of HDR displays throughout the show, but there were also a few labelled as using HDR simulation. I saw a couple of those at TV Logic. Yes, they looked gorgeous and yes, they were receiving an HDR signal. I found out that the ‘simulation’ part of the description meant that the display was bright (up to 350 nits), but not bright enough to qualify as ‘true’ HDR (1,000 nits or higher).

As in past transitions, we are certainly going to have to rely on some ‘glue’ products. For me, that’s AJA again. Through their relationship with Colorfront, AJA offers two HDR products: the HDR Image Analyzer and the FS-HDR converter. The latter was introduced last year as a real-time frame synchronizer and color converter to go between SDR and HDR display standards. The new Analyzer is designed to evaluate color space and gamut compliance. Just remember, no computer display can properly show you HDR, so if you need to post and deliver HDR, proper monitoring and analysis tools are essential.

Cameras

I’m not a cinematographer, but I do keep up with cameras. Nearly all of this year’s camera developments were evolutionary: new LF (large format sensor) cameras (ARRI), 4K camcorders (Sharp, JVC), and a full-frame mirrorless camera from Nikon (with ProRes RAW recording coming in a future firmware update). Most of the developments were targeted towards live broadcast production, like sports and megachurches. Ikegami had an 8K camera to show, but their real focus was on 4K and IP camera control.

RED, a big player in the cinema space, was only there in a smaller demo room, so you couldn’t easily compare their 8K imagery against others on the floor. But let’s not forget Sony and Panasonic. While ARRI has been a favorite due to the ‘look’ of the Alexa, Sony (Venice) and Panasonic (Varicam and now EVA-1) are also well-respected digital cinema tools that create outstanding images. For example, Sony’s booth featured an amazing, theater-sized, LED 8K micro-pixel display system. Some of the sample material shown was of the Rio Carnival, shot with anamorphic lenses on a 6K full-frame Sony Venice camera. Simply stunning.

Finally, let’s not forget Canon’s line-up of cinema cameras, from the C100 to the C700FF. To complement these, Canon introduced their new line of Sumire Prime lenses at the show. The C300 has been a staple of documentary films, including the Oscar-winning film, Free Solo, which I had the pleasure of watching on the flight to Las Vegas. Sweaty palms the whole way. It must have looked awesome in IMAX!

(For more on RED, cameras, and lenses at NAB, check out this thread from DP Phil Holland.)

It’s a wrap

In short, NAB 2019 had plenty for everyone, including smaller markets, like products for education seminars. One of these that I ran across was Cinamaker. They were demonstrating a complete multi-camera set-up using four iPhones and an iPad. The iPhones are the cameras (additional iPhones can be used as isolated sound recorders) and the iPad is the ‘switcher/control room’. The set-up can be wired or wireless, but camera control, video switching, and recording are all done at the iPad. This can generate the final product, or the media can be transferred to a Mac (with the line cut and camera iso media, plus edit list) for re-editing/refinement in Final Cut Pro X. Not too shabby, given the market that Cinamaker is striving to address.

For those of us who like to use the NAB Show exhibit floor as a miniature yardstick for the industry, one of the trends to watch is what type of gear is used in the booths and press areas. Specifically, one NLE over another, or one hardware platform versus the other. On that front, I saw plenty of Premiere Pro, along with some Final Cut Pro X. Hardware-wise, it looked like Apple versus HP. Granted, PC vendors, like HP, often supply gear to use in the booths as a form of sponsorship, so take this with a grain of salt. Nevertheless, I would guess that I saw more iMac Pros than any other single computer. For PCs, it was a mix of HP Z4, Z6, and Z8 workstations. HP and AMD were partner-sponsors of Avid Connect and they demoed very compelling set-ups with these Z-series units configured with AMD Radeon cards. These are very powerful workstations for editing, grading, mixing, and graphics.

©2019 Oliver Peters

Are you ready for a custom PC?

Why would an editor, colorist, or animator purchase a workstation from a custom PC builder, instead of one of the brand name manufacturers? Puget Systems, a PC supplier in Washington state, loaned me a workstation to delve into this question. They pride themselves on assembling systems tailor-made for creative users. Not all component choices are equal, so Puget tests the same creative applications we use every day in order to optimize their systems. For instance, Premiere Pro benefits from more CPU cores, whereas with After Effects, faster core speeds are more important than the core count.

Puget Systems also offers a unique warranty. It’s one year on parts, but lifetime free labor. This means free tech and repair support for as long as you own the unit. Even better, it also includes free labor to install hardware upgrades at their facility at any point in the future – you only pay for parts and shipping.

Built for editing

The experience starts with a consultation, followed by progress reports, test results, and photos of your system during and after assembly. These include thermal scans showing your system under load. Puget’s phone advisers can recommend a system designed specifically for your needs, whether that’s CAD, gaming, After Effects, or editing. My target was Premiere Pro and Resolve with a bit of After Effects. I needed it to be capable of dealing with 4K media using native codecs (no transcodes or proxies). 

Puget’s configuration included an eight-core Intel i9 3.6GHz CPU, 64GB RAM, and an MSI GeForce RTX 2080 Ti Ventus GPU (11GB). We put in two Samsung SSDs (a Samsung 860 Pro for OS/applications, plus a faster Samsung 970 Pro M.2 NVMe for cache) and a Western Digital Ultrastar 6TB SATA3 spinning drive for media. This PC has tons of connectivity with ports for video displays, Thunderbolt 3, USB-C, and USB 3. The rest was typical for any PC: sound card, ethernet, wifi, DVD-RW, etc. This unit without a display costs slightly over $5K USD, including shipping and a Windows 10 license. That price is in line with (or cheaper than) any other robust, high-performance workstation.

The three drives in this system deliver different speeds and are intended for different purposes. The fastest of these is the “D” drive, which is a blazingly fast NVMe drive that is mounted directly onto the motherboard. This one is intended for use with material requiring frequent and fast read/write cycles. So it’s ideal for Adobe’s cache files and previews. While you wouldn’t store the media for a large Premiere Pro project on it, it would be well-suited for complex After Effects jobs, which typically only deal with a smaller amount of media. While the 6TB HGST “E” drive dealt well with the 4K media for my test projects, in actual practice you would likely add more drives and build up an internal RAID, or connect to a fast external array or NAS.

If we follow Steve Jobs’ analogy that PCs are like trucks, then this is the Ford F-350 of workstations. The unit is a tad bigger and heavier than an older Mac Pro tower. It’s built into an all-metal Fractal Design case with sound dampening and efficient cooling, resulting in the quietest workstation I’ve ever used – even the few times when the fans revved up. There’s plenty of internal space for future expansion, such as additional hard drives, GPUs, i/o card, etc.

For anyone fretting about a shift from macOS to Windows, setting up this system couldn’t have been simpler. Puget installs a professional build of Windows 10 without all of the junk software most PC makers put there. After connecting my devices, I was up and running in less than an hour, including software installation for Adobe CC, Resolve, Chrome, MacDrive, etc. That’s a very ‘Apple-like’ experience and something you can’t touch if you built your own PC.

The proof is in the pudding

Professional users want hardware and software to fade away so they can fluidly concentrate on the creative process. I was working with 4K media and mixed codecs in Premiere Pro, After Effects, and Resolve. The Puget PC more than lived up to its reputation. It was quiet, media handling was smooth, and Premiere and Resolve timelines could play without hiccups. In short, you can stay in the zone without the system creating distractions.

I don’t work as often with RED camera raw files; however, I did load up original footage from an indie film onto the fastest SSD. This was 4K REDCODE media in a 4K timeline in Premiere Pro. Adobe gives you access to the raw settings, in addition to Premiere’s Lumetri color correction controls. The playback was smooth as silk at full timeline resolution. Even adding Lumetri creative LUTs, dissolves, and slow motion with optical flow processing did not impede real-time playback at full resolution. No dropped frames! Nvidia and RED Digital Camera have been working closely together lately, so if your future includes work with 6K/8K RED media, then a system like this requires serious consideration.

The second concern is rendering and exporting. The RTX 2080 Ti is an Nvidia card that offers CUDA processing, a proprietary Nvidia technology.  So, how fast is the system? There are many variables, of course, such as scaling, filters, color correction, and codecs. When I tested the export of a single 4K Alexa clip from a 1080p Premiere Pro timeline, the export times were nearly the same between this PC and an eight-core 2013 Mac Pro. But you can’t tell much from such a simple test.

To push Premiere Pro, I used a nine-minute 1080p travelogue episode containing mostly 4K camera files. I compared export times for ProRes (new on Windows with Adobe CC apps) and Avid DNx between this PC and the Mac Pro (through Adobe Media Encoder). ProRes exports were faster than DNxHD and the PC exports were faster than on the Mac, although comparative times tended to be within a minute of each other. The picture was different when comparing H.264 exports using the Vimeo Full HD preset. In that test, the PC export was approximately 75% faster.

The biggest performance improvements were demonstrated in After Effects and Resolve. I used Puget Systems’ After Effects Benchmark, which includes a series of compositions that test effects, tracking, keys, caustics, 3D text, and more (based on Video Copilot’s tutorials). The Puget PC trounced the Mac Pro in this test. The PC scored a total of 969.5 points versus the Mac’s 535 out of a possible maximum score of 1,000. Resolve was even more dramatic with the graded nine-minute-long sequence sent from Premiere Pro. Export times bested the Mac Pro by more than 2.5x for DNxHD and 6x for H.264.

Aside from these benchmark tests, I also created a “witches’ brew” After Effects composition of my own. It contains ten layers of 4K media in a one-minute-long 6K composition. The background layer was blown up and defocused, while all other layers were scaled down and enhanced with a lot of color and Cycore stylized effects. A 3D camera was added to create a group move for the layers. In addition, I was working from the slower drives and not the fast SSDs on either machine. Needless to say, this one totally bogs any system down. The Mac Pro rendered a 1080 ProRes file in about 54 minutes, whereas the PC took 42 minutes. That’s not the same 2-to-1 advantage as in the benchmarks; however, that’s likely because I heavily weighted the composition with the Cycore effects. These are not particularly efficient and probably introduce some bottlenecks in After Effects’ processing. Nevertheless, the Puget Systems PC still maintained a decided advantage.
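To put those results side by side, here is a quick bit of illustrative arithmetic using only the times and scores quoted above (the helper function is mine, not part of Puget’s benchmark):

```python
# Quick, illustrative comparison of the render results quoted above.
# The helper is illustrative only; the numbers come from the article's tests.

def speedup(mac_time, pc_time):
    """How many times faster the Puget PC was than the 2013 Mac Pro."""
    return mac_time / pc_time

# "Witches' brew" After Effects comp: Mac Pro ~54 min, Puget PC ~42 min
print(f"AE comp: {speedup(54, 42):.2f}x faster")      # ~1.29x

# Puget After Effects benchmark scores (higher is better): 969.5 vs 535
print(f"AE benchmark: {969.5 / 535:.2f}x the score")  # ~1.81x, roughly 2-to-1
```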

Conclusion

Mac vs. PC comparisons are inevitable when discussing creative workstations. Ultimately it gets down to preference – the OS, the ecosystem, and hardware options. But if you want the ultimate selection of performance hardware and the ability to expand in the future, then a custom-built PC is currently the best solution. For straightforward editing, both platforms will generally serve you well, but there are times when a top-of-the-line PC simply leaves any Mac in the dust. If you need to push performance in After Effects or Resolve, then Windows-based solutions offer the edge today. Custom systems, like those from Puget Systems, are designed with our needs in mind. That’s something you don’t necessarily get from a mainline PC maker. This workstation is a future-proof, no-compromise system that makes the switch from Mac to PC an easy and graceful transition – and with power to spare.

Originally written for RedShark News.

©2019 Oliver Peters

Blackmagic Design eGPU Pro

Last year Apple embraced external graphics processing units. Blackmagic Design responded with the release of its AMD-powered eGPU model. Many questioned their choice of the Radeon Pro 580 chip instead of something more powerful. That challenge has been answered with the new Blackmagic eGPU Pro. It sports the Radeon RX Vega 56 – a similar model to the one inside the base iMac Pro configuration. The two eGPU models are nearly identical in design, but in addition to more processing power, the eGPU Pro adds a DisplayPort connection that can support 5K monitors.

The eGPU Pro includes two Thunderbolt 3/USB-C ports with 85W charging capability, HDMI, DisplayPort, and four USB-A connectors for standard USB 3.1 devices. This means you can connect multiple peripherals and displays, plus power your laptop. You’ll need a Thunderbolt 3 connection from the computer and then either eGPU model becomes plug-and-play with Mojave (macOS 10.14) or later.

Setting up the eGPU Pro

With Mojave, most current creative apps, like Final Cut Pro X, Premiere Pro, Resolve, etc., can be set to prefer the eGPU (when connected) via the application’s Get Info panel. This is an “either/or” choice. The application does not combine the power of both GPUs for maximum performance. When you pull up Activity Monitor, you can easily see that the internal GPU is loafing while the eGPU Pro does the heavy lifting during tasks such as rendering. External GPUs benefit Macs with low-end, built-in GPUs, like the 13″ MacBook Pro or the Mac mini. A Blackmagic eGPU or eGPU Pro wouldn’t provide an edge to the render times of an iMac Pro, for example. It wouldn’t be worth the investment, unless you need one to connect additional high-resolution displays.

Users who are unfamiliar with external GPUs assume that the advantage is in faster export and render times, but that’s only part of the story. Not every function of an application uses the GPU, so many factors determine rendering. External GPU technology is very much about real-time image output. An eGPU will allow more connected displays of higher resolutions than an underpowered Mac would normally support on its own. The eGPU will also improve real-time playback of effects-heavy timelines. So yes, editors will get faster exports, but they will also enjoy a more fluid editing experience.

Extending the power of the Mac mini

In my Mac mini review, I concluded that a fully-loaded configuration made for a very capable editing computer. However, if you tend to use a number of effects that lean on GPU power, you will see an impact on real-time playback. For example, with the standard Intel GPU, I could add color correction, gaussian blur, and a title, and playback was generally fine with a fast drive. But when I added a mask to the blur, it quickly dropped frames during playback. Once I connected the eGPU Pro to this same Mac mini, such timelines played fluidly and, in fact, more effects could be layered onto clips. As in my other tests, Final Cut Pro X performed the best, but Premiere Pro and Resolve also performed solidly.

For basic rendering, I tested the same sequence that I used in the Mac mini review. This is a 9:15-long 1080p timeline made up of 4K source clips in a variety of codecs, plus scaling and color correction. I exported ProRes and H.264 master files from FCPX, Premiere Pro, and Resolve. With the eGPU Pro, times were cut in the range of 12% (FCPX) to 54% (Premiere). An inherently fast renderer, like Final Cut, gained the least by percentage, as it already exhibited the fastest times overall. Premiere Pro saw the greatest gain from the addition of the eGPU Pro. This is a major improvement over last year when Premiere didn’t seem to take much advantage of the eGPU. Presumably both Apple and Adobe have optimized performance when an eGPU is present.

Most taxing tests

A timeline export test is real-world but may or may not tax a GPU. So, I set up a specific render test for that purpose. I created a :60 6K timeline (5760×3240) composed of a nine-screen composite of 4K clips scaled into nine 1920×1080 sections. Premiere Pro would barely play this at even 1/16th resolution using only the internal Intel GPU. With the eGPU Pro, it generally played at 1/2 resolution. This was exported to a final 1080 ProRes file. During my base test (without the eGPU connected) Premiere Pro took over 31 minutes with “maximum quality” selected. A standard quality export was about eight minutes, while Final Cut Pro X took five minutes. Once I re-connected the eGPU Pro, the same timelines exported in 3:20 under all three test scenarios. That’s a whopping 90% reduction in time for the most taxing condition! One last GPU-centric test was the BruceX test, which was devised for Final Cut Pro X. The result without the eGPU was :58, but an impressive :16 when the eGPU Pro was used.
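As a quick sanity check on those percentages, here is the same sort of back-of-the-envelope math based on the times reported above (again, just an illustrative sketch):

```python
# Back-of-the-envelope check of the eGPU Pro export-time improvements above.
# All times come from the tests described in this article.

def time_saved(before_sec, after_sec):
    """Percentage reduction in export time once the eGPU Pro was connected."""
    return 100 * (before_sec - after_sec) / before_sec

# Premiere Pro 6K nine-screen test, "maximum quality": ~31 min vs 3:20
print(f"Premiere (max quality): {time_saved(31 * 60, 3 * 60 + 20):.0f}% less time")  # ~89%

# BruceX test in Final Cut Pro X: :58 vs :16
print(f"BruceX: {time_saved(58, 16):.0f}% less time")  # ~72%
```

That roughly 89% figure for the most taxing Premiere condition is where the ‘whopping 90% reduction’ comes from.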

As you can see, effects-heavy work will benefit from the eGPU Pro, not only in faster renders and exports, but also in improved real-time editing. This is also true of Resolve timelines with many nodes and of other graphics applications, like Pixelmator Pro. The 2018 Mac mini is a capable mid-range system when you purchase it with the advanced options. Nevertheless, users who need that extra grunt will definitely see a boost from the addition of a Blackmagic eGPU Pro.

Originally written for RedShark News.

©2019 Oliver Peters

Glass – Editing an Unconventional Trilogy

Writer/director M. Night Shyamalan has become synonymous with films about the supernatural that end with a twist. He first gained broad attention with The Sixth Sense and in the two decades since, has written, produced, and directed a range of large and small films. In recent years, he has taken a more independent route to filmmaking, working with lower budgets and keeping close control of production and post.

His latest endeavor, Glass, also becomes the third film in what is now an unconventional trilogy, starting first with Unbreakable, released 19 years ago. 2017’s Split was the second in this series. Glass combines the three principal characters from the previous two films – David Dunn/The Overseer (Bruce Willis), Elijah Price/Mr. Glass (Samuel L. Jackson), and Kevin Wendell Crumb (James McAvoy), who has 23 distinct personalities.

Shyamalan likes to stay close to his northeastern home base for production and post, which has afforded an interesting opportunity to young talent. One of those is Luke Ciarrocchi, who edited the final two installments of the trilogy, Split and Glass. This is only his third film in the editor’s chair. 2015’s The Visit was his first. Working with Shyamalan has provided him with a unique opportunity, but also a master class in filmmaking. I recently spoke with Luke Ciarrocchi about his experience editing Glass.

_________________________________________________

[OP] You’ve had the enviable opportunity to start your editing career at a pretty high level. Please tell me a bit about the road to this point.

[LC] I live in a suburb of Philadelphia and studied film at Temple University. My first job after college was as a production assistant to the editing team on The Happening with editor Conrad Buff (The Huntsman: Winter’s War, Rise of the Planet of the Apes, The Last Airbender) and his first assistant Carole Kenneally. When the production ended, I got a job cutting local market commercials. It wasn’t glamorous stuff, but it is where I got my first experience working on Avid [Media Composer] and really started to develop my technical knowledge. I was doing that for about seven months when The Last Airbender came to town.

I was hired as an apprentice editor by the same editing crew that I had worked with on The Happening. It was on that film that I started to get onto Night’s radar. I was probably the first Philly local to break into his editing team. There’s a very solid and talented group of local production crew in Philly, but I think I was the first local to join the Editors Guild and work in post on one of his films. Before that, all of the editing crew would come from LA or New York. So that was a big ‘foot in the door’ moment, getting that opportunity from Conrad and Carole.  I learned a lot on Airbender. It was a big studio visual effects film, so it was a great experience to see that up close – just a really exciting time for me.

During development of After Earth, even before preproduction began, Night asked me to build a type of pre-vis animatic from the storyboards for all the action sequences. I would take these drawings into After Effects and cut them up into moveable pieces, animate them, then cut them together into a scene in Avid. I was putting in music and sound effects, subtitles for the dialogue, and really taking them to a pretty serious and informative level. I remember animating the pupils on one of the drawings at one point to convey fear (laughs). We did this for a few months. I would do a cut, Night would give me notes, maybe the storyboard artist would create a new shot, and I would do a recut. That was my first back-and-forth creative experience with him.

Once the film began to shoot, I joined the editing team as an assistant editor. At the end of post – during crunch time – I got the opportunity to jump in and cut some actual scenes with Night. It was surreal. I remember sitting in the editing room auditioning cuts for him and him giving notes and all the while I’m just repeating in my head, ‘Don’t mess this up, don’t mess this up.’ I feel like we had a very natural rapport though, besides the obvious nervousness that would come from a situation like that. We really worked well together from the start. We both had a strong desire to dig deep and really analyze things, to not leave anything on the table. But at the same time we also had the ability to laugh at things and break the seriousness when we needed to. We have a similar sense of humor that to this day I think helps us navigate the more stressful days in the editing room. Personality plays a big role in the editing room. Maybe more so than experience. I may owe my career to my immature sense of humor. I’m not sure.

After that, I assisted on some other films passing through Philly and just kept myself busy. Then I got a call from Night’s assistant to come by to talk about his next film, The Visit. I got there and he handed me a script and told me he wanted me to be the sole editor on it. Looking back it seems crazy, because he was self-financing the film. He had a lot on the line and he could have gotten any editor, but he saw something. So that was the first of the three films I would cut for him. The odds have to be one-in-a-million for that to pan out the way that it did in the suburbs of Philly. Right place, right time, right people. It’s a lot of luck, but when you find yourself in that situation, you just have to keep telling yourself, ‘Don’t mess this up.’

[OP] These three films, including Glass, are being considered a trilogy, even though they span about two decades. How do they tie together, not just in story, but also style?

[LC] I think it’s fair to call Glass the final installment of a trilogy – but definitely an untraditional one. First Unbreakable, then 19 years later Split, and now Glass. They’re all in the same universe and hopefully it feels like a satisfying philosophical arc through the three. The tone of the films is ingrained in the scripts and footage. Glass is sort of a mash-up of what Unbreakable was and what Split was. Unbreakable was a drama that then revealed itself as a comic book origin story. Split was more of a thriller – even horror at times – that then revealed itself as part of this Unbreakable comic book universe. Glass is definitely a hybrid of tone and genre representing the first two films. 

[OP] Did you do research into Unbreakable to study its style?

[LC] I didn’t have to, because Unbreakable has been one of my favorite films since I was 18. It’s just a beautiful film. I loved that in the end it wasn’t just about David Dunn accepting who he was, but also Elijah finding his place in the world only by committing these terrible crimes to discover his opposite. He had to become a villain to find the hero. It’s such a cool idea and for me, very rewatchable. The end never gets old to me. So I knew that film very, very well. 

[OP] Please walk me through your schedule for post-production.

[LC] We started shooting in October of 2017 and shot for about two months. I was doing my assembly during that time and the first week of December. Then Night joined me and we started the director’s cut. The way that Night has set up these last three films is with a very light post crew. It’s just my first assistant, Kathryn Cates, and me set up at Night’s offices here in the suburbs of Philadelphia with two Avids. We had a schedule that we were aiming for, but the release date was over a year out, so there was wiggle room if it was needed.

Night’s doing this in a very unconventional way. He’s self-financing, so we didn’t need to go into a phase of a studio cut. After his director’s cut, we would go into a screening phase – first just for close crew, then more of a friends-and-family situation. Eventually we get to a general audience screening. We’re working and addressing notes from these screenings, and there isn’t an unbearable amount of pressure to lock it up before we’re happy. 

[OP] I understand that your first cut was about 3 1/2 hours long. It must take a lot of trimming and tweaking to get down to the release length of 129 minutes. What sort of things did you do to cut down the running time from that initial cut?

[LC] One of our obstacles throughout post was that initial length. You’re trying to get to the length that the film wants to be without gutting it in the process. You don’t want to overcut as much as you don’t want to undercut. We had a similar situation on Split, which was a long assembly as well. The good news is that there’s a lot of great stuff to work with and choose from.

We approach it very delicately. After each screening we trimmed a little and carefully pulled things out, so each screening was incrementally shorter, but never dramatically so. Sometimes you will learn from a screening that you pulled the wrong thing out and it needed to go back in. Ultimately no major storyline was cut out of Glass. It was really just finding where we are saying the same thing twice, but differently – diagnosing which one of those versions is the more impactful one – then cutting the others. And so, we just go like that. Pass after pass. Reel by reel.

An interesting thing I’ve found is that when you are repeating things, you will often feel that the second time is the offensive moment of that information and the one to remove, because you’ve heard it once before. But the truth is that the first telling of that information is more often what you want to get rid of. By taking away the first one, you are saving something for later. Once you remove something earlier, it becomes an elevated scene, because you aren’t giving away so much up front.

[OP] What is your approach to getting started when you are first confronted with the production footage? What is your editing workflow like?

[LC] I’m pretty much paper-based. I have all of the script supervisor’s notes. Night is very vocal on set about what he likes and doesn’t like, and Charlie Rowe, our script supervisor, is very good at catching those thoughts. On top of that, Night still does dailies each day – either at lunch or the end of the day. As a crew, we get together wherever we are and screen all of the previous day’s footage, including B-roll. I will sit next to Night with a sheet that has all of the takes and set-ups with descriptions and I’ll take notes both on Night’s reactions, as well as my own feelings towards the footage. 

With that information, I’ll start an assembly to construct the scene in a very rough fashion without getting caught up in the small details of every edit. It starts to bring the shape of the scene out for me. I can see where the peaks and valleys are. Once I have a clearer picture of the scene and its intention, I’ll go back through my detailed notes – there’s a great look for this, there’s a great reading for that – and I find where those can fit in and whether they serve the edit. You might have a great reaction to something, but the scene might not want that to be on-camera. So first I find the bones of the scene and then I dress it up. 

Night gets a lot of range from the actors from the first take to the last take. It is sometimes so vast that if you built a film out of only the last takes, it would be a dramatically different movie than if you only used take one. With each take he just pushes the performances further. So he provides you with a lot of control over how animated the scene is going to be. In Glass, Elijah is an eccentric driven by a strong ideology, so in the first take you get the subdued, calculated villain version of him, but by the last take it’s the carnival barker version. The madman.

[OP] Do you get a sense when screening the dailies of which way Night wants to go with a scene?

[LC] Yes, he’ll definitely indicate a leaning and we can boil it down to a couple of selects. I’ll initially cut a scene with the takes that spoke to him the most during the dailies and never cut anything out ahead of time. He’ll see the first cuts as they were scripted, storyboarded, and shot. I’ll also experiment with a different take or approach if it seems valid and have that in my back pocket. He’s pretty quick to acknowledge that he might have liked a raw take on set and in dailies, but it doesn’t work as well when cut together into a scene. So then we’ll address that. 

[OP] As an Avid editor, have you used Media Composer’s script integration features, like ScriptSync?

[LC] I just had my first experience with it on a Netflix show. I came on later in their post, so the show had already been set up for ScriptSync. It was very cool and helpful to be able to jump in and quickly compare the different takes for the reading of a line. It’s a great ‘late in the game’ tool. Maybe you have a great take, but just one word is bobbled and you’d like to find a replacement for just that word. Or the emotion of a key word isn’t exactly what you want. It could be a time-saver for a lot of that kind of polishing work.

[OP] What takeaways can you share from your experiences working with M. Night Shyamalan?

[LC] Night works in the room with you every day. He doesn’t just check in once a week or something like that. It’s really nice to have that other person there. I feel like oftentimes the best stuff comes from discussing it and talking it through. He loves to deconstruct things and figure out the ‘why’. Why does this work and this doesn’t? I enjoy that as well. After three films of doing that, you learn a lot. You’re not aware of it, but you’re building a toolkit. These tools and choices start to become second nature.

On the Netflix show that I just did, there were times when I didn’t have anyone else in the room for long stretches, and I started to hear those things that have become inherent in my process more clearly. I started to take notice of what had become my second nature – what the last decade had produced. Editing is something you just have to do to learn. You can’t just read about it or study a great film. You have to do it, do it again, and struggle with it. You need to mess it up to get it right.

________________________________________________

This interview is going online after Glass has scored its third consecutive weekend in the number one box office slot. Split was also number one for three weeks in a row. That’s a pretty impressive feat and fitting for the final installment of a trilogy.

Be sure to also check out Steve Hullfish’s AOTC interview with Luke Ciarrocchi here.

©2019 Oliver Peters