Rams

If you are a fan of the elegant, minimalist design of Apple products, then you have seen the influence of Dieter Rams. The renowned German industrial designer, associated with functional and unobtrusive design, is known for the iconic consumer products he developed for Braun, as well as his Ten Principles for Good Design. Dieter Rams is the subject of Rams, a new documentary film by Gary Hustwit (Helvetica, Objectified, Urbanized).

The film has been a labor of love for Hustwit and was partially funded through a Kickstarter campaign. In a statement to the website Designboom, Hustwit says, “This film is an opportunity to celebrate a designer whose work continues to impact us and preserve an important piece of design history. I’m also interested in exploring the role that manufactured objects play in our lives and, by extension, the relationship we have with the people who design them. We hope to dig deeper into Rams’ untold story – to try and understand a man of contradictions by design. I want the film to get past the legend of Dieter. I want it to get into his philosophy, process, inspirations, and even his regrets.”

Hustwit has worked on the documentary for the past three years and premiered it in New York at the end of September. The film is currently on the road for a series of international premiere screenings until the end of the year. I recently had a conversation with Kayla Sklar, the young editor who had the opportunity to tackle this as her first feature film.

______________________________________________________

[OP] Please give me a little background about how you got into editing and then became connected with this project.

[KS] I moved to New York in 2014 after college to pursue a career in theater administration with non-profit, Off-Broadway theater companies. But at 25, I had sort of a quarter-life crisis and realized that wasn’t what I wanted to do at all. I knew I had to make a career change. I had done some video editing in high school with [Apple] iMovie and in college with [Apple] Final Cut Pro 7 and had enjoyed that. So I enrolled at The Edit Center in Brooklyn. They have an immersive, six-week-long program where you learn the art of editing by working with actual footage from real projects. Indie filmmakers working in documentaries and narrative films, who don’t have a lot of money, can submit their film to The Edit Center. Two are chosen per semester. Twelve to 16 students are given scenes and get to work with the director. They give us feedback and at the end, we present a finished rough cut. This process gives us a sense of how to edit.

I knew I could definitely teach myself [Adobe] Premiere Pro, and probably figure out Avid [Media Composer], but I wanted to know if I would even enjoy the process of working with a director. I took the course in 2016 thinking I would pursue narrative films, because it felt the most similar to the world I had come from. But I left the course with an interest in documentary editing. I liked the puzzle-solving aspect of it. It’s where my skillset best aligned.

Afterwards, I took a few assistant editing jobs and eventually started as an assistant editor with Film First, which is owned by Jessica Edwards and Gary Hustwit. That’s how I got connected with Gary. I was assisting on a number of his projects, including working with some of the Rams footage and doing a few rough assemblies for him. Then last year he asked me to be the editor of the film. So I started shifting my focus exclusively to Rams at the beginning of this year. Gary has been working on it since 2015 – shooting on and off for three years. It just premiered in late September, but we were shooting pick-ups in Germany as recently as late August / early September.

[OP] So you were working solidly on the film for about nine months. At what point did you lock the cut?

[KS] (laugh) Even now we’re still tinkering. We get more feedback from the screenings and are learning what is working and what isn’t. The story was locked four days before the New York premiere, but we’re still making small changes to things.

[OP] Documentary editing can encompass a variety of structures – narrator-driven, a single subject, a collection of interviewees, etc. What approach did you take with Rams?

[KS] Most of the film is in Dieter Rams’ own words. Gary’s other films have a huge cast of characters. But Gary wanted to make this film different from that and more streamlined. His original concept was that it was going to be Dieter as the only interview footage and you might meet other characters in the verité. But Gary realized that wasn’t going to work, simply because Dieter is a very humble man and he wasn’t really talking about his impact on design. We knew that we needed to give the film a larger context. We needed to bring in other people to tell how influential he has been.

[OP] Obviously a documentary like this has no narrative script to follow. Understanding the interview subject’s answers is critical for the editor in order to build the story arc. I understand that much of the film is in a foreign language. So what was your workflow to edit the film?

[KS] Right. So, the vast majority of the film is in German and a little bit in Japanese, both with subtitles. Maybe 25% is in English, but we’re creating it primarily with an English-speaking audience in mind. I know pretty much no German, except words from Sound of Music and Cabaret. We had a great team of translators on this project, with German transcripts broken down by paragraph and translated into English. I had a two-column set-up with German on one side and English on the other. Before I joined the project, there was an assistant who input titles directly into Premiere – putting subtitles over the dailies with the legacy titler. That was the only way I would be able to even get a rough assembly or ‘radio edit’ of what we wanted.

When you edit an English-speaking documentary, you often splice together two parts of a longer sentence to form a complete and concise thought. But German grammar is really complicated. I don’t think I really grasped how much I was taking on when I first started tackling the project. So I would build a sentence that was pretty close from the transcripts. Thank God for Google Translate, because I would put in my constructed sentence and hope that it spit out something pretty close to what we were going for. And that’s how we did the first rough cut.

Then we had an incredible woman, Katharina Kruse-Ramey, come in. She is a native German speaker living here in New York. She came in for a full eight or nine hours and picked through the edit with a fine-tooth comb. For instance, “You can’t use this verb tense with this noun.” That sort of thing. She was hugely helpful and this film wouldn’t have been able to happen without Katharina. We knew then that a German speaker could watch this film and it would make sense! We also had another native German speaker, Eugen Braeunig, who was our archival researcher. He was great for the last-minute pick-ups that were shot, when we couldn’t go through the longer workflow.

[OP] I presume you received notes and comments back from Dieter Rams on the cut. What has his response been?

[KS] The film premiered at the Milano Design Film Festival a few weeks ago and Dieter came to that. It was his first time seeing the finished product. From what I’ve heard, he really liked it! As much as one can like seeing themselves on a large screen, I suppose. We had sent him a rough cut a few months ago and in true analytical fashion, the notes that we got back from him were just very specific technical details about dates and products and not about overall storytelling. He really was quite willing to give Gary complete control over the filmmaking process. There was a lot of trust between the two of them.

[OP] Did you cut the film to temp music from the beginning or add music later? I understand that the prolific electronic musician and composer, Brian Eno (The Lego Batman Movie, T2 Trainspotting, The Simpsons), created the soundtrack. What was that like?

[KS] The structure of this film has more breathing room than a lot of docs might have. We really thought about the fact that we needed to give viewers a break from reading subtitles. We didn’t want to go more than ten minutes of reading at a time. So we purposely built in moments for the audience to digest and reflect on all that information. And that’s where Brian’s music was hugely important for us.

We actually didn’t start really editing the film until we had gotten the music back from Brian. I’ve been told that he doesn’t ever score to picture. We sent him some raw footage and he came back with about 16 songs that were inspired by the footage. When you have that gorgeous Brian Eno music, you know that you’re going to have moments where you can just sit back and enjoy the sheer beauty of the moment. Once we had the music in, everything just clicked into place.

[OP] The editor is integral to creating the story structure of a documentary, more so than narrative films – almost as if they are another writer. Tell me a bit about the structure for Rams.

[KS] This film is really not structured the way you would probably structure a normal doc. As I said earlier, we very purposefully put reading breaks in, either through English scenes or with Eno’s music. We had no interest in telling this story linearly. We jump back and forth. One plot line is the chronology of Dieter’s career. Then there’s this other, perhaps more important story, which is Dieter today: his thoughts on the current state of design and the world. He’s still very active in giving talks and lectures. There’s a company called Vitsoe that makes a lot of his products and he travels to London to give input on their designs. That was the second half of the story and those are interspersed.

[OP] I presume you went outside for finishing services – sound, color correction, and so on. But did the subtitles take on any extra complexity, since they were such an important visual element?

[KS] There are three components to the post. We did an audio mix at one post house; there was a color correction pass at another; and we also had an animation studio – Trollbäck – working with us. There is a section in the film that we knew had to be visually very different and had to convey information in a different way than we had done in any other part of the film. So we gave Trollbäck that five-minute-long sequence. And they also did our opening titles.

We had thought about a stylistic treatment to the subtitles. There were two fonts that Trollbäck had used in their animation. Our initial intent was to use that in our subtitles. We did use one of those treatments in our titles and product credits. For the subtitles, we spent days trying out different looks. Are we going to shadow it or are we using outlines? What point font? What’s the kerning on it? There was going to be so much reading that we knew we had to do the titles thoughtfully. At the end of the day, we knew Helvetica was going to be the easiest (laugh)! We had tried the outline, but some of the internal space in the letters, like an ‘o’ or an ‘e’, looked closed off. We ended up going with a drop shadow. Dieter’s home is almost completely white, so there’s a lot of white space in the film. We used shadows, which looked a little softer, but still quite readable. Those were all built in Premiere’s legacy title tool.

[OP] You are in New York, which is a big Avid Media Composer town. So what was the thought process in deciding to cut this film in Adobe Premiere Pro?

[KS] When I came on board, the project was already in Premiere. At that point I had been using Avid quite a lot since leaving The Edit Center, which teaches their editing course in Avid. I had taught myself Premiere and I might have tried to transfer the project to Avid, but there was already so much done in terms of the dailies with the subtitles. The thought of redoing maybe 50 hours’ worth of manual subtitling that wouldn’t migrate over correctly just seemed like a total nightmare. And I was happy to use Premiere. Had I started the project from scratch, I might have used Avid, because it’s the tool that I felt fastest on. Premiere was perfectly fine for the film that we were doing. Plus, if there were days when Gary wanted to tinker around in the project and look at things, he’s much more familiar with Premiere than he is with Avid. He also knows the other Adobe tools, so it made more sense to continue with the same family of creative products that he already knew and used.

Maybe it’s this way with the tool you learn first, but I really like Avid and I feel that I’m faster with it than with Premiere. It’s just the way my brain likes to edit things. But I would be totally happy to edit in Premiere again, if that’s what worked best for a project and what the director wanted. It was great that we didn’t have to transcode our archival footage, because of how Premiere can handle media. Definitely that was helpful, because we had some mixed frame rates and resolutions.

[OP] A closing question. This is your first feature film, and with such an influential subject. What impact did it have on you?

[KS] Dieter has Ten Principles for Good Design. He built them to talk about product design and as a way for him to judge how a product ideally should be made. I had these principles taped to my wall by my desk. His products are very streamlined, elegant, and clean. The framework should be neutral enough to convey the intention without bells and whistles. He wasn’t interested in adding a feature that was unnecessary. I really wanted to evoke those principles with the editing. Had the film been cluttered with extraneous information, or been self-aggrandizing, I think when we revealed the principles to the audience, they would have thought, “Wait a minute, this film isn’t doing that!” We felt that the structure of the film had to serve his principles well, wherever appropriate.

His final principle is ‘Good Design is as Little Design as Possible.’ We joked that ‘Good Filmmaking is as Little Filmmaking as Possible.’ We wanted the audience to be able to draw their own conclusions about Dieter’s work and how that translates into their daily lives. A viewer could walk away knowing what we were trying to accomplish without someone having to tell them what we were trying to accomplish.

There were times when I really didn’t know if I could do it. Being 26 and editing a feature film was daunting. Looking at those principles kept me focused on what the meat of the film’s structure should be. That made me realize how lucky we are to have had a designer who really took the time to think about principles that can be applied to a million different subjects. At one of these screenings, someone came up to us who had become a UI designer for software, in part, because of Dieter. He told us, “I read Dieter’s principles in a book and I realized these can be applied to how people interact with software.” They can be applied to a million different things and we certainly applied them to the edit.

______________________________________________________

Gary Hustwit will tour Rams internationally and in various US cities through December. After that time it will be available in digital form through Film First.

Click here to learn more about Dieter Rams’ Ten Principles for Good Design.

©2018 Oliver Peters


Mary Queen of Scots

Few feature film editors have worked on such a diverse mix of films as Chris Dickens. His work ranges from Shaun of the Dead to Les Misérables, picking up an Oscar along the way for editing Slumdog Millionaire. His latest film is Mary Queen of Scots, starring Gemma Chan, Margot Robbie, and Saoirse Ronan. This historical drama is helmed by Josie Rourke (Much Ado About Nothing), an experienced theatre director who has also worked on film and TV projects. Readers will be familiar with Dickens from my Hot Fuzz interview. I recently had the pleasure to chat with him again about Mary Queen of Scots.

______________________________________________________

[OP] I know that there’s a big mindset difference between directing for the stage and directing for film. How was it working with Josie Rourke for this film?

[CD] She was very solid with the actors’ performances and how to rehearse them. There are great performances and that was the major thing she was concentrating on. She knew about the creative side of filmmaking, but not about the technical. We were essentially helping her with that to get what she wanted on screen. It’s a dialogue-driven movie and so she was very at home with that. But we had to work with her to adapt her normal approach for the screen, such as when to use images instead of dialogue.

Filmmaking is all about seeing something more than you can just see with the naked eye. Plus seeing emotionally what an actor is delivering. The way they’re doing it is different than on stage. It’s smaller. Film acting is much subtler. I don’t think we ever had a difference of opinion about that. It was more that in the theatre you are trying to communicate things through an actor’s movement and language and not so much through their eyes and the subtleties of their face. With film, one close-up of an actor can do more than a whole page of dialogue. Nevertheless, she certainly gave the cameramen freedom, while she concentrated on performance. And she shot all of that stuff so we had enough to use to make it work on the screen.

[OP] Did that dynamic affect how and where you edited?

[CD] I was mostly at the studio, but I did go on location with them. We shot at Pinewood and then on location in Scotland and around England. I went up to Scotland, where we had some action scenes, to help with that. Josie needed feedback about what she was shooting and needed to see it quickly. I also did some second unit shooting and things like that.

[OP] Typically, period dramas require extensive visual effects to disguise modern locations and make them appear historically appropriate. They also are still frequently shot on film. What was the case with this film?

[CD] It was shot digitally. The DoP [John Mathieson, Logan, The Man from U.N.C.L.E., X-Men: First Class] would have preferred film, because of the genre, but that would have been too expensive. There were always two and sometimes three cameras for most set-ups. But, there are very few visual effects. Just a few clean-ups. There is an epic feel, but that’s not the main direction. The film is a more psychological story about these two women, the Queen of England and the Queen of Scotland. They are both opposed to each other, but also like each other. It’s about their relationship and the sort of psychological connection between them. The story is more intimate in that way. So it’s about the performance and the subtleties to that story.

[OP] Walk me through the production and post timeline.

[CD] We shot it a year ago last August for about three months. I assembled the film during that time and then we started the director’s cut in October of last year. We actually had a long edit and didn’t finish until July of this year. I think we spent about thirteen weeks doing a director’s cut. Then the producer’s cut, and then, a director’s cut again. I think we did about two or three test screenings and we had sound editors on board quite early. In fact, we never stopped cutting almost right until the end. If you have a lot of screenings, everyone involved with the film wants to do a lot of changes and it keeps happening right down to the wire. So we basically carried on cutting almost right through till the middle of June.

[OP] It sounds like you had more changes than usual for most film edits – especially after your test screenings. Tell me more.

[CD] The core of the film is about Mary, who was a Catholic Queen, and Elizabeth, who was a Protestant Queen. Mary had the claim to not just be Queen of Scotland, but the Queen of England, as well. She’s a threat to Elizabeth, so the film is about that threat. These women essentially had an agreement between them. Elizabeth agreed that Mary’s child would succeed her if she died. This was a private agreement between the two women. The men around them who are in their government are trying to stop them from interacting with each other and having any kind of agreement. So it’s about women in a very archaic world. They are leaders, but they are not men, and the system around them is not happy for them to be leaders. This was the first time there was a queen in either country ever – and at the same time.

The theme is kind of modern, so the script – written by Beau Willimon, who writes House of Cards – was a bit like a political drama. In his writing, he intercuts scenes to give it a modern, more interesting feel. I followed that pattern – crosscutting scenes and stuff like that. When we started screening, a lot of people found that difficult to understand, so we went the other way around. We put things together and made the structure more classic. But when we then started screening it again, we realized that the film had ceased to be unique. It started becoming more like other dramas from this genre. So we put it all the way back to how it originally was. We went back to the spirit of what Beau had written and did more intercutting, but in different places. That is why it took so long to cut the film, because the balance was difficult to arrive at. Often a script is written in a very linear fashion and you cut it up later. But in this case it was the opposite way around.

If you listen too much to the audience or even producers of the film you can lose what makes it unique. The hands of the director are very important. Particularly here, because this is a women’s story, directed by a woman, and it was very important to preserve that point of view, which could very easily be eroded. She wrote it with Beau and he doesn’t explain everything. He doesn’t have characters telling you how they got to a certain place or why. We needed to preserve that, but we also needed to let people into the story a little more. So we had to make adjustments to allow an audience to understand it.

[OP] I’m sure that such changes, as with every film, affected its final length. How was Mary Queen of Scots altered through these various cuts and recuts?

[CD] The original cut was about two hours and 45 minutes, but we ended up at an hour and 55. To get there, we started to cut back on the more epic scenes within the film. For instance, we had a battle scene early on in the film and there was a battle at the end of the film where Mary is beaten and expelled from Scotland. They didn’t really have the budget for a classic battle like in Braveheart. It was a slightly more impressionistic battle – more abstract and about how it feels. It was a beautiful sequence, but we found that the film didn’t need that. It just didn’t need to be that complete. We had to make a lot of choices like that – cutting things down.

We cut nearly an hour of material, which obviously I’m used to doing. However, what we found is that, because it was a performance piece, by cutting it down so far, we also lost a little bit of the air between scenes. It became quite brutal – just story without any kind of feeling. So once we got the story working well, we then had to breathe life back into it. I literally went all the way back to the first edit of the film and looked at what was good about it in terms of the life and the subtleties. Then we very carefully started putting that back into the film. When you screen the film for audiences, you get very tunneled into making the story tighter and understandable, which is often at the expense of quite a lot. It’s an interesting part of the process – going back to the core of the story. You always have to do that. Sometimes you lose a little through the editing process and then you have to try and get it back.

We also had quite a lot of work on music. We had a composer on board [Max Richter, White Boy Rick, Hostiles, Morgan] quite early and he gave us a lot of ideas. But, as we changed the edit, we had to change the direction of the music somewhat. Of course, this also contributed to the length of the editing schedule. 

[OP] Music can certainly make or break a film. Some editors start with it right away and others wait until the end to play with options. It sounds like music was a bit of a challenge.

[CD] I normally go with it dry at the beginning. When I start putting the scenes together I tend to start using temp music. But I try to avoid it for as long as possible – even into the director’s cut. I think sometimes you can just use it as a bandage if you’re not careful. But on this film, we had a very specific tone that we needed to sell. It was a slightly more modern, suspenseful take on the music. We did end up using music a little earlier than I would have hoped.

We had a cut of the film and we had a soundtrack, but we were constantly changing it – trying new things – as the edit changed. The music was more avant garde to start with and that was our intention, but the studio wanted it to be a little more melodic. The composer is very respected in the classical world, so he took that on board and wrote some themes for us that took it into a slightly different direction. He would write something – maybe not even to picture – and then give us the stems. The music editor and I would edit the music and try it out in different places. Then the composer would see what we had done with it to picture. We would then give it back to him. He would do a bit more work and give it back to us. It was actually a very unusual process.

[OP] With such a diverse set of films under your belt, what are some of your tips in tackling a scene?

[CD] I go through the rushes and try to watch everything that they shot. If there are A and B cameras, then I try to watch the B camera, as well. You get different emotional things from that, since it is a different angle. In the ideal situation when there’s time, I watch everything, mark what I like, and then make a roll with all my selected takes. Then I watch it again. I prune it down even more and then start a cut. Ideally, I try to find one take that works all the way through a scene as my first port of call. Then I go through the roll of my selects and look at what I marked and what I liked and try to work those things into the cut. I look at each one to see if that’s the best performance for that line and I literally craft it like that.

When you’ve watched half of a roll of rushes, you don’t know how to cut the scene. But once you’ve watched it all – everything they’ve shot – you then can organize the scene in your head. The actual cutting is quite quick then. I tend to watch it and think, ‘Okay I know what I’m going to do for the first cut. I’m going to use that shot for the beginning, that bit for the end, and so on.’ I map it in my head and quickly put that together with largely the selected takes that I like. Then I watch it and start refining it, honing it, and going through the roll again – adding things. Of course that depends on time. If I don’t have much time, I have to work fast, so I can’t do that all the time.

[OP] Any closing thoughts to wrap this up?

[CD] The experience of editing Mary Queen of Scots really reminded me how important it is to stick to the original intention and ambition of the film and make editorial decisions based on that. This doesn’t mean sticking to the letter of the script, but looking at how to communicate its intent overall. Film editing, of course, always means lots of changes and so it’s easy to get lost. Therefore, going back to the original thought always helps in making the right choices in the end.

© 2018 Oliver Peters

The Old Man & the Gun

Stories of criminal exploits have long captivated the American public. But no story is quirkier than that of Forrest Silva “Woody” Tucker. He was a lifelong bank robber and prison escape artist who was in and out of prison. His most famous escape came in 1979 from San Quentin State Prison. His last crimes were a series of bank robberies around the Florida retirement community where he lived. He was captured in 2000 and died in prison in 2004 at the age of 83. Apparently good at his job – he stole an estimated four million dollars over his lifetime – Tucker was aided by a set of older partners, dubbed the “Over the Hill Gang”. His success, in part, was because he tended to rob lower-profile, local banks and credit unions. While he did carry a gun, it seems he never actually used it in any of the robberies.

The Old Man & the Gun is a semi-fictionalized version of Tucker’s story brought to the screen by filmmaker David Lowery (A Ghost Story, Pete’s Dragon, Ain’t Them Bodies Saints). It stars Robert Redford as Tucker, along with Danny Glover and Tom Waits as his gang. Casey Affleck plays John Hunt, a detective who is on his trail. Sissy Spacek is Jewel, a woman who takes an interest in Tucker. Lowery wrote the script in a romanticized style that is reminiscent of how outlaws of the old west are portrayed. The screenplay is based on a 2003 article in The New Yorker magazine by David Grann, which chronicled Tucker’s real-life exploits.

David Lowery is a multi-talented filmmaker with a string of editing credits. (He was his own editor on A Ghost Story.) But for this film, he decided to leave the editing to Lisa Zeno Churgin, A.C.E. (Dead Man Walking, Pitch Perfect, Cider House Rules, House of Sand and Fog), with whom he had previously collaborated on Pete’s Dragon. I recently had the opportunity to chat with Churgin about working on The Old Man & the Gun.

___________________________________

[OP] Please tell me a bit about your take on the story and how the screenplay’s sequence ultimately translated into the finished film.

[LZC] The basis of Redford’s character is a boy who started out stealing a bicycle, went to reform school when he was 13, and it continued along that way for the rest of his life. Casey Affleck is a cop in the robbery division who takes it as a personal affront when the bank where he was trying to make a deposit was robbed. He makes it his mission to discover who did it, which he does. But because it’s a case that crosses state lines, the case gets taken over by the FBI. Casey’s character then continues the search on his own.  It’s a wonderful cat and mouse game. 

There are three storylines in the film. The story begins when Tucker is leaving the scene of a robbery and pulls over to the side of the road to help Jewel [Sissy Spacek] while evading the police on his trail. Their story provides a bit of a love interest.  The second storyline is that of the “Over the Hill Gang”. And the third storyline is the one between Tucker and Hunt. It’s not a particularly linear story, so we were always balancing these three storylines. Whenever it started to feel like we’d been away too long from a particular storyline and set of characters, it was time to switch gears.

Although David wrote the script, he wasn’t particularly overprotective of it. As in most films, we experimented a lot, moving scenes around to make those three main stories find their proper place. David dressed Redford in the same blue suit for the entire movie with occasional shirt or tie changes. This made it easier to shift things than when you have costume constraints. Often scenes ended up back where they started, but a lot of times they didn’t – just trying to find the right balance of those three stories. We had absolute freedom to experiment, and because David is a writer, director, and an editor in his own right, he really understands and appreciates the process.

The nature of this film was so unique, because it is of another time and place [the 1980s], but still modern in its own way. I also see it partly as an homage to Bob [Redford], because this is possibly his last starring role. Shooting on 16mm film certainly lends itself to another time and place. The score is a jazz score. That jazz motif places it in time, but also keeps it contemporary. As an aside, a nice touch is when Casey visits Redford in the hospital and he does a little ‘nose salute’ from The Sting, which was Casey’s idea.

[OP] On some films the editor is on location, keeping pace with camera on the cut. On others, the editing team stays at a home base. For The Old Man & the Gun, you two were separated during the initial production phase. Tell me how that was handled.

[LZC] David was filming in Cincinnati and I was simultaneously cutting in LA. Because it was being shot on film, they sent it to Fotokem to be developed and then to Technicolor to be digitized. Then it was brought over to us on a drive. When you don’t get to watch dailies together, which is pretty much the norm these days, I try to ask the director to communicate with the script supervisor as much as possible while they are shooting: circled takes, particular line readings, any idea that the director might want to communicate to the editor. That sort of input always helps. Their distant location and the need to process film meant it would be a few days before I got the film and before David could see a scene that he’d shot, cut together. Getting material to him as quickly as possible is the best thing that I can do. That’s always my goal.

When I begin cutting a scene, I start by loading a sequence of all of the set-ups and then scroll through this sequence (what most editors who worked on film call a KEM roll) so that I can see what has been shot. Occasionally, I’ll put together selects, but generally I just start at the beginning and go cut to cut. The hardest part is always figuring out what’s going to be the first cut. Are we going to start tight? Are we going to start wide where we show everything? What is that first cut going to be? I seem to spend more time on that than anything else and once I get into it – and I’m not the first person to say this – the film tells you what to do. My goal is to get it into form as quickly as possible, so I can get a cut back to the director.

I finished the editor’s cut in LA and then we moved the cutting room to Dallas. Then David and I worked on the director’s cut – traditionally ten weeks – and after that, we showed it to the producers. Our time was extended a bit, because we had to wait for Bob’s availability to shoot some of the robbery sequences. They always knew that they were going to have to do some additional filming.

[OP] I know David is an experienced editor. How did you divide up the editorial tasks? Or was David able to step back from diving in and cutting, too?

[LZC] David is an excellent editor in his own right, but he is very happy to have someone else do the first pass. On this film I think he was more interested in playing around with some of the montage sequences. Then he’d hand them back to me so that I could incorporate them into the film, sometimes making changes that kept them within the style of the film as a whole.

[OP] The scenes used in a film and the final length are always malleable until the final version of the cut. I’m sure this one was no different. Please tell me a bit about that.

[LZC] We definitely lost a fair number of scenes. My assistant makes scene cards that we put up on the wall and then, when we lift a scene, it goes on the back of the door. That way, you can just open the door and see what has been taken out. In this particular film, because of the three separate storylines, scenes went in, came out, and were rearranged – and then again. Often, scenes that we dropped at the very beginning ended up back in the movie, because it’s like a house of cards. You really have to weigh everything, try to juxtapose and balance the storylines, and keep it moving. The movie is quite short now, but my first cut wasn’t that long, either. The final cut is 94 minutes and I think the first cut wasn’t much more than two hours.

[OP] Let me shift gears a bit. As I understand it, David is a fan of Adobe Creative Cloud and, in particular, Premiere Pro. On The Old Man & the Gun, you shifted to Premiere Pro, as well. As someone who comes from a film and Avid editorial background, how was it to work with Premiere Pro?

[LZC] Over the course of my career, I’ve done what we call ‘doctor jobs’, where an editor comes in and does a recut of a film. On some of these jobs, I had the opportunity to work on Lightworks and on Final Cut. When we began Pete’s Dragon, David asked if I would consider doing it on Premiere Pro. David Fincher’s team had just done Gone Girl using it and David was excited about the possibility of doing Pete’s using Premiere. But for a big visual effects film, Premiere at that stage really wasn’t ready. I said if we do another film together, I’d be happy to learn Premiere. So, when we knew we would be doing Old Man, David spoke to the people at Adobe. They arranged to have Christine Steele tutor me. I worked with her before we began shooting. It was perfect, because we live close to each other and we were able to work in short, three- and four-hour blocks of time. (Note: Steele is an LA-based editor, who is frequently a featured presenter for Adobe.)

I also hired my first assistant, Mike Melendi, who was experienced with Premiere Pro. It was definitely a little intimidating at first, but within a week, I was fine. I actually ended up doing another film on Avid afterwards and I was a little nervous to go back to Avid. But that was like riding a bike. And after that, I took over another film that was on Premiere. Now I know I can go back and forth and that it’s perfectly fine.

[OP] Many feature film editors with an extensive background on Media Composer often rely on Avid’s script integration tools (ScriptSync). That’s something Premiere doesn’t have. Any concerns there?

[LZC] I think ScriptSync is the most wonderful thing in the world, but I grew up without it. When my assistants prepare dailies for me, they’ll put in a bunch of locators, so I know where there are multiple takes within a single clip. I think ScriptSync is great if you can get the labor of somebody to do it. I know there are a lot of editors who do it themselves while they’re watching dailies. I worked on a half-hour comedy where there was just a massive amount of footage and a tremendous amount of ‘keep rollings’. After working for one week I said to them, ‘We have to get ScriptSync’. And they did! We had a dedicated person to do it and that’s all they did. It’s a wonderful luxury, which I would always love to have, but because I learned without it, I’ve created other ways to work.

My biggest issue with Premiere was the fact that, because I always work in the icon view and not list view, I had to contend with their grid arrangement within the bins. With Media Composer, you can arrange your clips however you want. Adobe knew that it was a really big issue for me and for other editors, so they are working on a version where you can move and arrange the clips within a bin. I’ve had the opportunity to give input on that and I know we’ll see that changed in a future version.

I would love to keep working on Premiere. Coming back to it again recently, I felt really confident about being able to go back and forth between the two systems. But some directors and studios have specific preferences. Still, I think it would be a lot of fun to continue working in Premiere.

[OP] Any final thoughts on the experience?

[LZC] I enjoyed the opportunity to work on such a wonderful project with such great actors. For me as an editor, that’s always the goal – to work with great performances. To have a hand in shaping and creating wonderful moments like the ones we have in our film. I hope others feel that we achieved that.

For more, check out Adobe’s customer stories and blog. Also Steve Hullfish’s Art of the Cut interview.

This interview transcribed with the assistance of SpeedScriber.

©2018 Oliver Peters

CoreMelt PaintX

When Apple launched Final Cut Pro X, it was with a decidedly simplified set of video effects. This was offset by the ease with which users could create their own custom effects, using Apple Motion as a development platform. The result has been an entirely new ecosystem of low-cost, high-quality video effects. As attractive as that is, truly advanced visual effects still require knowledgeable plug-in developers who can work within the FCPX and macOS architecture to produce more powerful tools. For example, built-in visual effects tools such as Avid Media Composer’s Intraframe Paint or the Fusion page in DaVinci Resolve simply aren’t within the scope of FCPX, nor of what users can create on their own through Motion templates.

To fill that need, developers like CoreMelt have been designing a range of advanced visual effects tools for the Final Cut Pro market, including effects for tracking, color correction, stabilization, and more. Their newest release is PaintX, which adds a set of Photoshop-style tools to Final Cut Pro X. As with many of CoreMelt’s other offerings, PaintX includes planar tracking, thanks to the licensing of Mocha tracking technology.

To start, drop the PaintX effect onto a clip and then launch the custom interface. PaintX requires a more elaborate control layout than the standard FCPX user interface was designed for, hence its own window. Once inside the PaintX interface window, you have a choice of ten brush functions: paint color, change color, blur, smear, sharpen, warp, clone, add noise, heal, and erase. These functions cover a range of needs, from simple wire removal to beauty enhancements and even pseudo horror makeup effects. You have control over brush size, softness, aspect ratio, angle, and opacity. The various brushes also have specific controls for their related functions, such as the blur range for the blur brush. Effects are applied in layers and actions. Each stroke is an action and both remain editable. If you aren’t the most precise artist, then the erase brush comes in handy. Did you color a bit too far outside the lines? Simply use the erase brush on that layer and trim back your excess.

Multiple brush effects can be applied to the same or different areas within the image, simply by adding a new layer for each effect. Once you’ve applied the first paint stroke, an additional brush control panel opens – allowing you to edit the brush parameters, after the fact. So, if your brush size was too large or not soft enough, simply alter those settings without the need to redo the effect. Each effect can be individually tracked in either direction. The Mocha tracker offers additional features, such as transform (scale/position) versus perspective tracking, along with the ability to copy and paste tracking data between brush layers.

As a Final Cut Pro X effect, PaintX works within the standard video pipeline. If you applied color correction upstream of your PaintX filter, then that grade is visible within the PaintX interface. But if the color correction is applied downstream of the PaintX effect, you won’t see it when you open the PaintX interface. However, that correction will still be uniformly applied to the clip, including the areas altered within the PaintX effect. If you’ve “punched into” a 4K clip on an HD timeline, when you open PaintX, you’ll still see the full 4K frame. Finally, you have additional FCPX control over the opacity and mix of the applied PaintX filter.

I found PaintX to be well-behaved, even on a modest Mac like my three-year-old laptop. However, if you don’t have a beefy Mac, keep the effect simple. The more brush effects that you apply and track in a single clip, the slower the real-time response will become, especially on under-powered machines. These effects are GPU-intensive and paint strokes are really a particle system; therefore, simple, single-layer effects are the easiest on the machine. But if you intend to do more complex effects, like blurs and sharpens in multiple layers, then you will really want one of the more powerful Macs. Playback response is generally better once you’ve saved the effect and exited back to Final Cut. I did run into one minor issue with the clone brush on a single isolated clip, while using a 2013 Mac Pro. CoreMelt told me there have been a few early bugs with certain GPUs and said it is looking into the anomaly I discovered. That model in particular has been notorious for GPU issues with video effects. (Update: CoreMelt sent me a new build, which has corrected this problem.)

Originally written for RedShark News

©2018 Oliver Peters

Art Doesn’t Pay

In the short time since its 2015 launch, Frame.io has become a leading video collaboration site. Going beyond its early roots as a video review-and-approval site, Frame.io now supports numerous, long distance workflows that empower creative video professionals all around the world. Most companies feature blogs and customer profiles as just another form of marketing. For Frame.io, these are a way to give back to the post production community. Their blog features tips and tutorials that benefit editors and other creatives, whether or not they use the company’s services in their daily workflows.

The newest outreach is Frame.io Masters, a short film series, featuring renowned filmmakers who also happen to be customers of the site. Emery Wells, CEO and co-founder, explains, “We wanted Frame.io Masters to be delivered in the voice of the creator. These are personal stories, brought to life by having each filmmaker make their own film, with no direction from us. It’s a manifestation of what Frame.io is all about – a collaborative effort that will serve as inspiration for aspiring creatives in every facet of the word.”

The inaugural video showcases sought-after Australian commercial filmmaker, Mark Toia. His inspiring short film, entitled Art Doesn’t Pay, is a showreel that features a wide range of his impressive work. The title stems from what Toia was told in school. As the film demonstrates, art did indeed come to pay for Toia. Heeding his teacher’s admonition, Toia started his working career as a steelworker. But an amateur interest in photography brought professional attention that resulted in a new path following his natural artistic talent. This eventually brought him to commercial filmmaking.

Words that motivate you to use your talent

When I asked about the apparent contradiction of the title, Mark Toia replied, “My teacher said to me, ‘Art doesn’t pay’. This was a motivation for me. These words were the drive behind my push to prove him wrong.  At the end of the video, I state that ‘My teacher was wrong’. What I should have said at the end was that art does pay. Because with commerce, marketing, and general good business practices, art does pay.”

Photography, painting, and cinematography are very “hands on”, in a similar fashion to some blue collar jobs. Do these follow the so-called “10,000-hour rule”? That’s the premise that you get good at something only after having invested a lot of time doing it – thus, the 10,000 hours. But Toia wasn’t completely convinced. “Not sure about that. I’m a firm believer in natural ability. I could pick up a brush and a pencil and draw or paint real life almost instantly. This was an obvious natural gift. I had no formal training in photography, but quite quickly obtained the right eye for it and was better than most of my peers in very little time and made money from it very early.  So the 10,000-hour rule may be the case for the people with a natural ability. I do not doubt that at all. For example, I wanted to be a professional motorcycle racer. I put thousands of hours into trying to be the fastest I could be. I remember this person started racing against us. He was new to the sport and quickly beat us all – and ended up winning five world titles. His name was Mick Doohan. Another natural.”

Art meets technology

Toia is obviously a natural artist, but he is also no stranger to the technology. I see his frequent posts on the RedUser forum about RED cameras and using Final Cut Pro X for editing. Toia explained, “I’m very open about not being a fan of a logo or a brand, but more a fan of time savings, speed, and performance. Not caring at all who made the app or camera. I’m not loyal to any brand. I have told Jim and Jarred of RED many times that if someone else brought out a camera that was faster, lighter, smaller, with more dynamic range, and with faster frame rates, then I would jump ship in an instant. And they know this very well, as that’s the reason why I jumped to RED in the first place. RED produces a camera that ticks most, but not all, of my boxes at the moment, hence, why I use RED. Overall I’m only loyal to a tool that gets me to the results I want quicker, at a high quality, and what gets me to bed earlier, makes me work fewer hours, and helps me make money more easily. Simple as that.”

“When it comes to editing and compositing programs, I made a point of learning them all to a high level, if only to make sure I was making the right judgment and choice of using the correct tool for the job. Flame, for instance, has a great keyer – Color Warper. But it’s a slow machine, so I only use it for that toolset. It’s actually a fantastic program, but just slow to use overall. If it could use the GPU like After Effects and Apple Motion – giving near real-time feedback – it would be a winner. Nuke is slow, too, but great for multiple layers of 3D.  Cheap, but again, very slow. After Effects is very fast for 90% of everything I do, so I use that for my 3D and 2D compositing work.”

“From an editing perspective, Avid was my go-to for many years, but FCP7 was quicker, so I jumped to that. Premiere Pro came out being able to use multiple codecs, so I used that for a couple of years, but… again it wasn’t as quick to use as FCP7. Then FCPX came onto the scene and I hated everything about it. I tried to love it, but could not get my head around the magnetic timeline. After my third attempt of trying to learn and understand it, the penny finally dropped. I got it! Now I quite literally work 30-40% faster in FCPX than any other edit program. I still work in Premiere from time to time for older projects I have. I know both programs intimately, so I can 100% say that FCPX is far more stable and quicker to use than Premiere, giving myself more time to be creative. And that’s where it wins. Speed and performance is always my drawcard. Not the logo on the box. I couldn’t care less if Google or Coca-Cola invented it. Time makes for better creative, a better end product, and better profit margins.”

Mark Toia shows us that natural talent combined with a drive for the best results is a winning combination.

Originally written for RedShark News.

©2018 Oliver Peters

Apple 2018 MacBook Pro

July was a good month for Apple power users, with the simultaneous release of Blackmagic Design’s eGPU and a refresh of Apple’s popular MacBook Pro line, including both 13″ and 15″ models. Although these new laptops retain the previous model’s form factor, they gained a bump-up in processors, RAM, and storage capacity.

Apple loaned me one of the Touch Bar space gray 15” models for this review. It came maxed out with the 8th generation 2.9 GHz 6-core Intel Core i9 CPU, 32GB DDR4 (faster) RAM, Radeon Pro 560X GPU, and a 2TB SSD. The price range on the 15″ model is pretty wide, due in part to the available SSD choices – from 256GB up to 4TB. Touch Bar 15” configurations start at $2,399 and can go all the way up to $6,699, once you spec the top upgrade for everything. My configuration was only $4,699 with the 2TB SSD. Of course, that’s before you add Apple Care (which I highly recommend for laptops) and any accessories.

Apple also released premium leather sleeves for both the 13″ and 15″ models in three colors ($199 for the 15″ size). They are pricey, of course, but not out of line with other branded, luxury products, like bags and watch bands. They fit the unit snugly and protect it when you are out and about. In addition, they serve as a good pad on rough desk surfaces or when you have the MacBook Pro on your lap. Depending on the task you are performing, the bottom surface of the MacBook Pro can get warm, but nothing to be concerned about.

Before you point me to the nearest Windows gaming machine instead, let me mention that this review really isn’t a comparison against Windows laptops, but rather about the advances Apple has made within the MacBook Pro line. But for context, I have owned six laptops to date – three PCs and three Macs. I shifted to Mac in order to have access to Final Cut Pro and have been happy with that move. The first two PCs developed stress fractures at the lid hinges before they were even a year old. The third, an HP, was solid, but after I gave it to my daughter, the power supply shorted. In addition, the hard drive became so corrupt (thank you, Windows) that it wasn’t worth trying to recover. In short, my Mac laptop experience, like that of others, has been one of good value. MacBook Pros generally last years and if you use them for actual billable work (editing, DIT, sound design, etc.), then the investment will pay for itself.

This is the fastest and best laptop Apple has made. Apple engineering has nicely balanced power, size, weight, and battery life in a way that’s hard to counter. It is expensive, but if you try to find an equivalent PC, it’s hard to find one with these exact specs or components until you get into gaming PCs. Those a) look pretty ugly, b) tend to be larger and heavier, with lower battery life, and c) cost about the same. There’s also the sales experience. Try to navigate nearly any PC-centric laptop supplier in an effort to customize the options and it tends to become an exercise in frustration. On the other hand, Apple makes it quite easy to buy and configure its machines with the options that you want.

I do have to mention that when these MacBook Pros first came out, there was an issue of performance throttling, which was quickly addressed by Apple and fixed with a supplemental macOS release. That release had already been installed on my unit, so throttling didn’t affect any of my performance tests.

Likewise, there have been debris complaints with the first run of the “butterfly” keys used in this and the previous version of these laptops. As other reviewers’ teardowns have shown, Apple has added a membrane under the keys to help with sound dampening. Some reviewers have speculated that this also helps mitigate or even eliminate the debris issues. Whatever the reason, I liked typing on this keyboard and it did sound quieter to me. I tend to bang on keys, since I’m not a touch typist. The feel of a keyboard can be very subjective and, in the course of a day, I tend to type on several vintages of Apple keyboards. In general, the keyboard on this newest MacBook Pro felt comfortable to me for standard typing.

What did Apple bring new to the mix?

When Apple introduced the Touch Bar in 2016, I thought ‘meh’. But after a couple of weeks with it, I’ve really enjoyed it, especially when an application like Final Cut Pro X extends its controls to the Touch Bar. You can switch the Touch Bar preferences to show only function keys if you like. But having the control strip options makes it quick to adjust screen brightness, volume, and so on. In the case of FCPX, you also get a mini-timeline view in some modes. Even QuickTime Player calls up a small movie strip in the Touch Bar for the file being played.

These units also include Apple’s T2 security chip, which powers the fingerprint Touch ID and the newly added “Hey Siri” commands. The Retina screen on this laptop is gorgeous with up to 500 nits brightness and a wide color gamut. Another new addition is True Tone, which adjusts the display’s color temperature for the surrounding ambient light. That may become a more important selling point in the coming years. There is growing concern within the industry that blue light emitted from computer displays causes long-term eyesight damage. Generally, True Tone warms up the screen when under interior lighting, which reduces eye fatigue when you are working with a lot of white documents. But my recommendation is that editors, colorists, photographers, and designers turn this feature off when working on tasks that require color accuracy. Otherwise, the color balance of media will appear too warm (yellowish).

The 2018 15” MacBook Pro has four Thunderbolt 3/USB-C ports and a headphone jack. The four ports (two per side) are driven by two internal Thunderbolt 3 (40Gb/s) buses. It appears that’s one bus per side, which means that plugging in two devices on the same side will split the available Thunderbolt 3 bandwidth on that bus in half. That said, this doesn’t seem to be much of a factor during actual use. The internal bus routing does appear to be different from the previous model, in spite of what otherwise is more or less the same hardware configuration.

Gone are all other connections, so plan on purchasing an assortment of adapters to connect peripherals, such as those ubiquitous USB thumb drives or hardware dongles (license keys). I do wish that Apple had retained at least one standard USB port. Thunderbolt 3 supports power, so no separate MagSafe port is required either. (Power supply and cable are included.) One minor downside of this is that there is no indicator LED when a full battery charge is achieved, like we used to have on the MagSafe plug.

If connected to a Thunderbolt 3 device with an adequate power supply (e.g. the LG displays or the Blackmagic eGPU sold through Apple), then a single cable can both transfer data and power the laptop. One caveat is that Thunderbolt 3 doesn’t pass a video signal in the same way as Thunderbolt 2. You cannot simply add a Thunderbolt 3-to-Thunderbolt 2 adapter and connect a typical monitor’s MiniDisplayPort plug, as was possible with Thunderbolt 2 ports. External monitors without the correct connection will need to go through a dock or monitor adapter in order to pass a video signal. (This is also true for the iMac Pros.)

Many users have taken to relying on their MacBook Pros as the primary machine for their home or office, as well as on the road. The upside of Thunderbolt connectivity is that when you get back to the office, connecting a single Thunderbolt 3 cable to the rest of your suite peripherals (dock, display, eGPU, whatever) is all you need to get up and running. Simple and clean. Stick the laptop in a cradle in clamshell mode or on a laptop stand, connect the cable, and you now have a powerful desktop machine. MacBook Pros have gained enough power in recent years that – unless your demands are heavy – they can easily service your editing, photography, and graphics needs.

Is it time to upgrade?

I own a mid-2014 15” MacBook Pro (the last series with an NVIDIA GPU), which I purchased in early 2015. Three years is often a good interval for most professional users to plan on a computer refresh, so I decided to compare the two. To start with, the new 2018 machine boots faster and apps also open faster. It’s even slightly smaller and thinner than the mid-2014 model. Both have fast SSDs, but the 2018 model is significantly faster (2645 MB/s write, 2722 MB/s read – Blackmagic Speed Test).

As with other reviews, I pulled an existing edit project for my test sequence. This timeline could be the same in Final Cut Pro X, Premiere Pro, and Resolve – without effects unique to one specific software application. My timeline consisted of 4K Alexa ProResHQ files that had a LUT and were scaled into a 1080p sequence. A few 1080p B-roll shots were also part of this sequence. The only taxing effect was a reverse slomo 4K clip, using optical flow interpolation. Both machines handled 4K ProRes footage just fine at full resolution using various NLEs. Exports to ProRes and H.264 were approximately twice as fast from Final Cut Pro X on the newer MacBook Pro. The same exports from Premiere Pro were longer overall than from FCPX, but faster on the 2018 machine, as well (see the section at the end for performance by the numbers).

If you are a fan of Final Cut Pro X, this machine is one of the best to run it on, especially if you can store your media on the internal drive. However, as an equalizer of sorts, I also ran these same test projects from an external SSD connected via USB3. While fast (200+ MB/s read/write), it wasn’t nearly as fast as the internal SSD. Nevertheless, performance didn’t really lag behind with either FCPX or Premiere Pro. However, the optical flow clip did pose some issues. It played smoothly at “best quality” in FCPX, but oddly stuttered in the “best performance” setting. It did not play well in Premiere Pro at either full or half resolution. I also believe it contributed to the slower export times evident with Premiere Pro.

I tested a second project made up of all 4K REDCODE raw footage, which was placed into a 4K timeline. The 2018 MacBook Pro played the individual files and edited sequences smoothly when set to “best performance” in FCPX or half resolution in Premiere Pro. However, bumping the settings up to full quality caused stuttering with either NLE.

My last test was the same DaVinci Resolve project that I’ve used for my eGPU “stress” tests. These are anamorphic 4K Alexa files in a 2K DCI timeline. I stripped off all of the added filters that I had applied for the test of the eGPU, leaving a typical editing timeline with only a LUT and basic correction. This sequence played smoothly without dropping frames, which bodes well for editors who are considering a shift to Resolve as their main NLE.

Speaking of the Blackmagic eGPU tests, I had one day of overlap between the loans of the MacBook Pro and the Blackmagic eGPU. DaVinci Resolve’s real-time playback performance and exports improved by about a 2x factor with the eGPU connected to the 15” model. Naturally, the 15” machine by itself was quite a bit faster than the 13” MacBook Pro, so the improvement with an eGPU attached wasn’t as dramatic a margin as the 13” test demonstrated. Even with this powerhouse MacBook Pro, the Blackmagic eGPU still adds value as a general appliance, as well as providing Resolve acceleration.

A note on battery life. The spec claims about 10 hours, but that’s largely for simple use, like watching web movies or listening to iTunes. Most of these activities do not cause the graphics to switch over from the integrated Intel to the Radeon Pro GPU, which consumes more power. In my editing tests with the Radeon GPU constantly on – and most of the energy saving settings disabled – I got five to six hours of battery life. That’s even when an application like FCPX was open, but minimized, without any real activity being done on the laptop.

I also ran a “heavy load” test, which involved continually looping my sample 1080 timeline (with 4K source media) full screen at “best quality” in FCPX. This is obviously a worst case scenario, but the charge only lasted about two hours. In short, the battery capacity is very good for a laptop, but one can only expect so much. If you plan on a heavy workload for an extended period of time, stay plugged in.

The 2018 MacBook Pro is a solid update that creative professionals will certainly enjoy, both in the field and even as a desktop replacement. If you bought last year’s model, there’s little reason to refresh your computer, yet. But three years or more? Get out the credit card!

_________________________________________________

Performance by the numbers

Blackmagic Design eGPU test

DaVinci Resolve renders/exports
(using the same test sequence as used for my eGPU review)

13” 2018 MacBook Pro – internal Intel graphics only
Render at source resolution – 1fps
Render at timeline resolution – 4fps

13” 2018 MacBook Pro – with Blackmagic eGPU
Render at source resolution – 5.5fps
Render at timeline resolution – 17.5fps

15” 2018 MacBook Pro – internal Radeon graphics only
Render at source resolution – 2.5fps
Render at timeline resolution – 8fps

15” 2018 MacBook Pro – with Blackmagic eGPU
Render at source resolution – 5.5fps
Render at timeline resolution – 16fps

Standard performance tests – 2018 15” MacBook Pro vs. Mid-2014
(using my editing test sequence – 4K ProRes HQ media)

2018 export from FCPX to ProRes  :30
2018 export from FCPX to H.264 at 10Mbps  :57
2014 export from FCPX to ProRes  :57
2014 export from FCPX to H.264 at 10Mbps  1:42

2018 export from Premiere Pro to ProRes  2:59
2018 export from Premiere Pro to H.264 at 10Mbps  2:32
2014 export from Premiere Pro to ProRes  3:35
2014 export from Premiere Pro to H.264 at 10Mbps  3:25

2018 export from Resolve to ProRes :35
2018 export from Resolve to H.264 at 10Mbps  :35
(Mid-2014 MBP was not used in this test)
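As a quick sanity check on the Resolve render numbers above, the eGPU speedup factors work out like this (a minimal Python sketch; the fps values are simply the averages reported in this review):

```python
# Speedup implied by the DaVinci Resolve render tests above.
# The fps values are the averages measured for this review.

def speedup(with_egpu_fps, internal_fps):
    """Ratio of the eGPU-assisted render rate to the internal-GPU rate."""
    return with_egpu_fps / internal_fps

# 13" MacBook Pro: integrated Intel graphics vs. Blackmagic eGPU
print(speedup(5.5, 1.0))    # source resolution: 5.5x
print(speedup(17.5, 4.0))   # timeline resolution: ~4.4x

# 15" MacBook Pro: internal Radeon Pro vs. Blackmagic eGPU
print(speedup(5.5, 2.5))    # source resolution: 2.2x
print(speedup(16.0, 8.0))   # timeline resolution: 2.0x
```

In other words, the less capable the internal GPU, the larger the gain from the eGPU.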

Originally written for RedShark News.

©2018 Oliver Peters

Blackmagic Design eGPU

Power users have grown to rely on graphics processing units from AMD, Intel and Nvidia to accelerate a wide range of computational functions – from visual effects filters to gaming and 360VR, and even to bitcoin mining. Apple finally supports external GPUs, which can easily be added as plug-and-play devices without any hacks. Blackmagic Design just released its own eGPU product for the Mac, which is sold exclusively through Apple ($699 USD). It requires macOS 10.13.6 or later, and a Thunderbolt 3 connection. (Thunderbolt 2, even with adapters, will not work.)

The Blackmagic eGPU features a sleek, aluminum enclosure that makes a fine piece of desk art. It’s of similar size and weight to a 2013 Mac Pro and is optimized for both cooling and low noise. The unit is built around the AMD Radeon Pro 580 GPU with 8GB of video memory. It delivers 5.5 teraflops of processing power and is the same GPU used in Apple’s top-end, 27” Retina 5K iMac.

Leveraging Thunderbolt 3

Thunderbolt 3 technology supports 40Gb/s of bandwidth, as well as power. The Blackmagic eGPU includes a beefy power supply that can also power and/or charge a connected MacBook Pro. There are two Thunderbolt 3 ports, four USB3.1 ports, and HDMI. Therefore, you can connect a Mac, two displays, plus various USB peripherals. It’s easy to think of it as an accelerator, but it is also an appliance that can be useful in other ways to extend the connectivity and performance of MacBook Pros. Competing products with the same Radeon 580 GPU may be a bit less expensive, but they don’t offer this level of connectivity.

Apple and Blackmagic both promote eGPUs as an add-on for laptops, but any Thunderbolt 3 Mac qualifies. I tested the Blackmagic eGPU with both a high-end iMac Pro and the base model 13” 2018 MacBook Pro with Touch Bar. This iMac Pro is configured with the more advanced Vega Pro 64 GPU (16GB VRAM). My main interest in including the iMac Pro was simply to see whether there would be enough of a performance boost to justify adding an eGPU to a Mac that is already Apple’s most powerful. Installation of the eGPU was simply a matter of plugging it in. A menu bar icon appears on the Mac to let you know it’s there and to let you safely disconnect the unit while the Mac is powered up.

Pushing the boundaries through testing

My focus is editing and color correction and not gaming or VR. Therefore, I ran tests with and without the eGPU, using Final Cut Pro X, Premiere Pro, and DaVinci Resolve (Resolve Studio 15 beta). Anamorphic ARRI Alexa ProRes 4444 camera files (2880×2160, native / 5760×2160 pixels, unsqueezed) were cut into 2K DCI (Resolve) and/or 4K DCI (FCPX, Premiere Pro) sequences. This meant that every clip got a Log-C LUT and color correction, as well as aspect ratio correction and scaling. In order to really stress the system, I added several GPU-accelerated effect filters, like glow, film grain, and so on. Finally, timed exports went back to ProRes 4444 – using the internal SSD for media and render files to avoid storage bottlenecks.

Not many applications take advantage of this newfound power yet. Neither FCPX nor Premiere Pro utilizes the eGPU effectively, if at all. Premiere exports were actually slower with the eGPU. In my tests, only DaVinci Resolve gained measurable acceleration from the eGPU, which also held true for a competing eGPU that I compared.

If editing, grading or possibly location DIT work is your main interest, then consider the Blackmagic eGPU a good accessory for DaVinci Resolve running on a MacBook Pro. As a general rule, lesser-powered machines benefit more from eGPU acceleration than powerful ones, like the iMac Pro, with its already-powerful, built-in Vega Pro 64 GPU.

Performance by the numbers (iMac Pro only)

To provide some context, here are the results I got with the iMac Pro:

Resolve on iMac Pro (internal V64 chip) – NO eGPU – Auto GPU config

Playback of timeline at real-time 23.976 without frames dropping

Render at source resolution – average 11fps (slower than real-time)

Render at timeline resolution – average 33fps (faster than real-time)

Resolve on iMac Pro – with BMD eGPU (580 chip) – OpenCL

Playback of timeline at real-time 23.976 without frames dropping

Render at source resolution – average 11fps (slower than real-time)

Render at timeline resolution – average 37fps (faster than real-time)
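These numbers bear out the earlier point about already-powerful machines. A quick back-of-the-envelope calculation (a Python sketch using only the averages reported above) shows how modest the gain is on the iMac Pro:

```python
# Percentage gain implied by the iMac Pro render numbers above
# (averages reported in this review; timeline-resolution renders).
internal_fps = 33.0   # Vega 64 only
with_egpu_fps = 37.0  # with the Blackmagic eGPU attached

gain_pct = (with_egpu_fps / internal_fps - 1) * 100
print(f"timeline-resolution gain: {gain_pct:.0f}%")
```

Roughly a 12% improvement, versus the 4X-plus gains the 13” MacBook Pro sees at timeline resolution.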

Metal

Apple’s ability to work with eGPUs is enabled by Metal, Apple’s framework for addressing hardware components, like graphics and central processors. The industry has relied on other frameworks, including OpenGL, OpenCL and CUDA. The first two are open standards written for a wide range of hardware platforms, while CUDA is specific to Nvidia GPUs. Apple is deprecating all of these in favor of Metal (now Metal 2). With each coming OS update, these frameworks will become more and more “legacy” until, presumably, at some point in the future macOS supports only Metal.

Apple’s intention is to gain performance improvements by optimizing the code at a lower level, “closer to the metal”. That is possible when you only address a limited number of hardware options, which may explain why Apple has focused on using only AMD and Intel GPUs. The downside is that developers must write code that is proprietary to Apple computers. Metal is in part what gives Final Cut Pro X its smooth media handling and real-time performance. Both Premiere Pro and Resolve give you the option to select Metal when installed on a Mac.

In the tests that I ran, I presume FCPX used only Metal, since there is no option to select anything else. I did, however, test Premiere Pro/Adobe Media Encoder and Resolve with Metal and again with OpenCL specifically selected. I didn’t see much difference in render times with either setting in Premiere/AME. Resolve showed definite differences, with OpenCL the clear winner. For now, Resolve is still optimized for OpenCL over Metal.

Power for the on-the-go editor and colorist

The MacBook Pro is where the Blackmagic eGPU makes the most sense. It gives you better performance with faster exports, and it adds badly needed connectivity. My test Resolve sequence is a lot more stressful than one I would normally create. It’s the sort of sequence I would never work with in the real world on a lower-end machine, like this 13” model. But, of course, I’m purposefully pushing it through a demanding task.

When I ran the test on the laptop without the eGPU connected, the sequence would barely play at all. Exports at source resolution rendered at around 1fps. Once I added the Blackmagic eGPU, the sequence played in real-time, although the viewer would start to drop frames towards the end of each shot. Exports at the source resolution averaged 5.5fps. At timeline resolution (2K DCI), it rendered at up to 17fps, as opposed to 4fps without it. That’s over a 4X improvement.

Everyone’s mix of formats, color correction, and filters is different. Nevertheless, once you add the Blackmagic eGPU to this MacBook Pro model, functionality in Resolve goes from insanely slow to definitely usable. If you intend to do reliable color correction with Resolve, then a Thunderbolt 3 UltraStudio HD Mini or 4K Extreme 3 is also required for proper video monitoring, since Resolve doesn’t send video signals over HDMI the way Premiere Pro and Final Cut Pro X can.

It will be interesting to see whether Blackmagic also offers a second eGPU model with a higher-end chip in the future. That would likely double the price of the unit, and in the testing I’ve done with other eGPUs that used a version of the Vega 64 GPU, I’m not convinced that such a product would consistently deliver 2X more performance to justify the cost. This Blackmagic eGPU adds a healthy dose of power and connectivity for current MacBook Pro users, and that will only get better in the future.

I think it’s clear that Apple is looking towards eGPUs as a way to enhance the performance of its MacBook Pro line without compromising design, battery life, and cooling. Cable up to an external device and you’ve gained back horsepower that wouldn’t be there in the standard machine. After all, you mainly need this power when you are in a fixed, rather than mobile, location. The Blackmagic eGPU is portable enough that, as long as you have electrical power, you are good to go.

In his review of the 2018 MacBook Pro, Ars Technica writer Samuel Axon stated, “Apple is trying to push its own envelope with the CPU options it has included in the 2018 MacBook Pro, but it’s business as usual in terms of GPU performance. I believe that’s because Apple wants to wean pro users with serious graphics needs onto external GPUs. Those users need more power than a laptop can ever reasonably provide – especially one with a commitment to portability.”

I think that neatly sums it up, so it’s nice to see Blackmagic Design fill in the gaps.

UPDATE: The September 2018 release of Mojave has changed the behavior of Final Cut Pro X when an eGPU is connected. It is now possible to set a preference for whether the internal or external GPU is to be used with Final Cut Pro X.

Originally written for RedShark News.

©2018 Oliver Peters