Glass – Editing an Unconventional Trilogy

Writer/director M. Night Shyamalan has become synonymous with films about the supernatural that end with a twist. He first gained broad attention with The Sixth Sense and, in the two decades since, has written, produced, and directed a range of large and small films. In recent years, he has taken a more independent route to filmmaking, working with lower budgets and keeping close control of production and post.

His latest endeavor, Glass, is also the third film in an unconventional trilogy that began with Unbreakable, released 19 years ago, and continued with 2017’s Split. Glass combines the three principal characters from the previous two films – David Dunn/The Overseer (Bruce Willis), Elijah Price/Mr. Glass (Samuel L. Jackson), and Kevin Wendell Crumb (James McAvoy), who harbors 23 distinct personalities.

Shyamalan likes to stay close to his northeastern home base for production and post, which has afforded an interesting opportunity to young talent. One of those is Luke Ciarrocchi, who edited the final two installments of the trilogy, Split and Glass. This is only his third film in the editor’s chair. 2015’s The Visit was his first. Working with Shyamalan has provided him with a unique opportunity, but also a master class in filmmaking. I recently spoke with Luke Ciarrocchi about his experience editing Glass.

_________________________________________________

[OP] You’ve had the enviable opportunity to start your editing career at a pretty high level. Please tell me a bit about the road to this point.

[LC] I live in a suburb of Philadelphia and studied film at Temple University. My first job after college was as a production assistant to the editing team on The Happening with editor Conrad Buff (The Huntsman: Winter’s War, Rise of the Planet of the Apes, The Last Airbender) and his first assistant Carole Kenneally. When the production ended, I got a job cutting local market commercials. It wasn’t glamorous stuff, but it is where I got my first experience working on Avid [Media Composer] and really started to develop my technical knowledge. I was doing that for about seven months when The Last Airbender came to town.

I was hired as an apprentice editor by the same editing crew that I had worked with on The Happening. It was on that film that I started to get onto Night’s radar. I was probably the first Philly local to break into his editing team. There’s a very solid and talented group of local production crew in Philly, but I think I was the first local to join the Editors Guild and work in post on one of his films. Before that, all of the editing crew would come from LA or New York. So that was a big ‘foot in the door’ moment, getting that opportunity from Conrad and Carole.  I learned a lot on Airbender. It was a big studio visual effects film, so it was a great experience to see that up close – just a really exciting time for me.

During development of After Earth, even before preproduction began, Night asked me to build a type of pre-vis animatic from the storyboards for all the action sequences. I would take these drawings into After Effects and cut them up into moveable pieces, animate them, then cut them together into a scene in Avid. I was putting in music and sound effects, subtitles for the dialogue, and really taking them to a pretty serious and informative level. I remember animating the pupils on one of the drawings at one point to convey fear (laughs). We did this for a few months. I would do a cut, Night would give me notes, maybe the storyboard artist would create a new shot, and I would do a recut. That was my first back-and-forth creative experience with him.

Once the film began to shoot, I joined the editing team as an assistant editor. At the end of post – during crunch time – I got the opportunity to jump in and cut some actual scenes with Night. It was surreal. I remember sitting in the editing room auditioning cuts for him and him giving notes and all the while I’m just repeating in my head, ‘Don’t mess this up, don’t mess this up.’ I feel like we had a very natural rapport though, besides the obvious nervousness that would come from a situation like that. We really worked well together from the start. We both had a strong desire to dig deep and really analyze things, to not leave anything on the table. But at the same time we also had the ability to laugh at things and break the seriousness when we needed to. We have a similar sense of humor that to this day I think helps us navigate the more stressful days in the editing room. Personality plays a big role in the editing room. Maybe more so than experience. I may owe my career to my immature sense of humor. I’m not sure.

After that, I assisted on some other films passing through Philly and just kept myself busy. Then I got a call from Night’s assistant to come by to talk about his next film, The Visit. I got there and he handed me a script and told me he wanted me to be the sole editor on it. Looking back it seems crazy, because he was self-financing the film. He had a lot on the line and he could have gotten any editor, but he saw something. So that was the first of the three films I would cut for him. The odds have to be one-in-a-million for that to pan out the way that it did in the suburbs of Philly. Right place, right time, right people. It’s a lot of luck, but when you find yourself in that situation, you just have to keep telling yourself, ‘Don’t mess this up.’

[OP] These three films, including Glass, are being considered a trilogy, even though they span about two decades. How do they tie together, not just in story, but also style?

[LC] I think it’s fair to call Glass the final installment of a trilogy – but definitely an untraditional one. First Unbreakable, then 19 years later Split, and now Glass. They’re all in the same universe and hopefully it feels like a satisfying philosophical arc through the three. The tone of the films is ingrained in the scripts and footage. Glass is sort of a mash-up of what Unbreakable was and what Split was. Unbreakable was a drama that then revealed itself as a comic book origin story. Split was more of a thriller – even horror at times – that then revealed itself as part of this Unbreakable comic book universe. Glass is definitely a hybrid of tone and genre representing the first two films. 

[OP] Did you do research into Unbreakable to study its style?

[LC] I didn’t have to, because Unbreakable has been one of my favorite films since I was 18. It’s just a beautiful film. I loved that in the end it wasn’t just about David Dunn accepting who he was, but also Elijah finding his place in the world only by committing these terrible crimes to discover his opposite. He had to become a villain to find the hero. It’s such a cool idea and for me, very rewatchable. The end never gets old to me. So I knew that film very, very well. 

[OP] Please walk me through your schedule for post-production.

[LC] We started shooting in October of 2017 and shot for about two months. I was doing my assembly during that time and the first week of December. Then Night joined me and we started the director’s cut. The way that Night has set up these last three films is with a very light post crew. It’s just my first assistant, Kathryn Cates, and me set up at Night’s offices here in the suburbs of Philadelphia with two Avids. We had a schedule that we were aiming for, but the release date was over a year out, so there was wiggle room if it was needed.

Night’s doing this in a very unconventional way. He’s self-financing, so we didn’t need to go into a phase of a studio cut. After his director’s cut, we would go into a screening phase – first just for close crew, then more of a friends-and-family situation. Eventually we get to a general audience screening. We’re working and addressing notes from these screenings, and there isn’t an unbearable amount of pressure to lock it up before we’re happy. 

[OP] I understand that your first cut was about 3 1/2 hours long. It must take a lot of trimming and tweaking to get down to the release length of 129 minutes. What sort of things did you do to cut down the running time from that initial cut?

[LC] One of our obstacles throughout post was that initial length. You’re trying to get to the length that the film wants to be without gutting it in the process. You don’t want to overcut as much as you don’t want to undercut. We had a similar situation on Split, which was a long assembly as well. The good news is that there’s a lot of great stuff to work with and choose from.

We approach it very delicately. After each screening we trimmed a little and carefully pulled things out, so each screening was incrementally shorter, but never dramatically so. Sometimes you will learn from a screening that you pulled the wrong thing out and it needed to go back in. Ultimately no major storyline was cut out of Glass. It was really just finding where we are saying the same thing twice, but differently – diagnosing which one of those versions is the more impactful one – then cutting the others. And so, we just go like that. Pass after pass. Reel by reel.

An interesting thing I’ve found is that when you are repeating things, you will often feel that the second time is the offensive moment of that information and the one to remove, because you’ve heard it once before. But the truth is that the first telling of that information is more often what you want to get rid of. By taking away the first one, you are saving something for later. Once you remove something earlier, it becomes an elevated scene, because you aren’t giving away so much up front.

[OP] What is your approach to getting started when you are first confronted with the production footage? What is your editing workflow like?

[LC] I’m pretty much paper-based. I have all of the script supervisor’s notes. Night is very vocal on set about what he likes and doesn’t like, and Charlie Rowe, our script supervisor, is very good at catching those thoughts. On top of that, Night still does dailies each day – either at lunch or the end of the day. As a crew, we get together wherever we are and screen all of the previous day’s footage, including B-roll. I will sit next to Night with a sheet that has all of the takes and set-ups with descriptions and I’ll take notes both on Night’s reactions, as well as my own feelings towards the footage. 

With that information, I’ll start an assembly to construct the scene in a very rough fashion without getting caught up in the small details of every edit. It starts to bring the shape of the scene out for me. I can see where the peaks and valleys are. Once I have a clearer picture of the scene and its intention, I’ll go back through my detailed notes – there’s a great look for this, there’s a great reading for that – and I find where those can fit in and whether they serve the edit. You might have a great reaction to something, but the scene might not want that to be on-camera. So first I find the bones of the scene and then I dress it up. 

Night gets a lot of range from the actors from the first take to the last take. It is sometimes so vast that if you built a film out of only the last takes, it would be a dramatically different movie than if you only used take one. With each take he just pushes the performances further. So he provides you with a lot of control over how animated the scene is going to be. In Glass, Elijah is an eccentric driven by a strong ideology, so in the first take you get the subdued, calculated villain version of him, but by the last take it’s the carnival barker version. The madman.

[OP] Do you get a sense when screening the dailies of which way Night wants to go with a scene?

[LC] Yes, he’ll definitely indicate a leaning and we can boil it down to a couple of selects. I’ll initially cut a scene with the takes that spoke to him the most during the dailies and never cut anything out ahead of time. He’ll see the first cuts as they were scripted, storyboarded, and shot. I’ll also experiment with a different take or approach if it seems valid and have that in my back pocket. He’s pretty quick to acknowledge that he might have liked a raw take on set and in dailies, but it doesn’t work as well when cut together into a scene. So then we’ll address that. 

[OP] As an Avid editor, have you used Media Composer’s script integration features, like ScriptSync?

[LC] I just had my first experience with it on a Netflix show. I came on later in their post, so the show had already been set up for ScriptSync. It was very cool and helpful to be able to jump in and quickly compare the different takes for the reading of a line. It’s a great ‘late in the game’ tool. Maybe you have a great take, but just one word is bobbled and you’d like to find a replacement for just that word. Or the emotion of a key word isn’t exactly what you want. It could be a time-saver for a lot of that kind of polishing work.

[OP] What takeaways can you share from your experiences working with M. Night Shyamalan?

[LC] Night works in the room with you every day. He doesn’t just check in once a week or something like that. It’s really nice to have that other person there. I feel like oftentimes the best stuff comes from discussing it and talking it through. He loves to deconstruct things and figure out the ‘why’. Why does this work and that doesn’t? I enjoy that as well. After three films of doing that, you learn a lot. You’re not aware of it, but you’re building a toolkit. These tools and choices start to become second nature.

On the Netflix show that I just did, there were times where I didn’t have anyone else in the room for long stretches and I started to hear more clearly those things that have become inherent in my process. I started to take notice of what had become my second nature – what the last decade had produced. Editing is something you just have to do to learn. You can’t just read about it or study a great film. You have to do it, do it again, and struggle with it. You need to mess it up to get it right.

________________________________________________

This interview is going online after Glass has scored its third consecutive weekend in the number one box office slot. Split was also number one for three weeks in a row. That’s a pretty impressive feat and fitting for the final installment of a trilogy.

Be sure to also check out Steve Hullfish’s AOTC interview with Luke Ciarrocchi here.

©2019 Oliver Peters

Molly’s Game

Molly Bloom’s future looked extremely bright: a shot at Olympic skiing glory, followed by entry into a top law school. But an accident during qualifying trials for the U.S. ski team knocked her out of the running for the Salt Lake City games. (Bloom notes in her own memoir that it was her decision to retire and change the course of her life, rather than the minor accident.) She moved to Los Angeles and ended up running high-stakes, private poker games with her boss at the time. These games included A-list celebrities, hedge fund managers, and eventually, members of the Russian mob. Bloom quickly earned the nickname “poker princess”. This all came crashing down when Bloom was busted by the FBI and sentenced for her role in the gambling ring.

Bloom’s memoir came to the attention of screenwriter Aaron Sorkin (The Social Network, Moneyball, Steve Jobs), who not only made this his next film script, but also his debut as a film director. Sorkin stayed close to the facts that Bloom described in her own memoir and consulted her during the writing of the screenplay. The biggest departure is that Bloom named some celebrities at these games, who had previously been revealed in released court documents. Sorkin opted to fictionalize them, explaining that he would rather focus the story on Bloom’s experiences and not on Hollywood gossip. Jessica Chastain (The Zookeeper’s Wife, A Most Violent Year, Zero Dark Thirty) stars as Molly Bloom.

Although three editors are credited for Molly’s Game, the back story is that a staggered schedule had to be worked out. The post production of Steve Jobs connected feature film editor Elliot Graham (Milk, 21, Superman Returns) with that film’s writer and director – Sorkin and Danny Boyle (T2 Trainspotting, 127 Hours, Slumdog Millionaire). Graham was tapped to cut Molly’s Game later into the process, replacing its original editor. He brought Josh Schaeffer (The Last Man on Earth, Detroiters, You’re the Worst) on as associate editor to join him. Graham started the recut with Schaeffer, but a prior scheduling commitment to work on Trust for Boyle saw him exit the film early. (Trust is the BBC’s adaptation of the Getty kidnapping story.) Graham was able to bring the film about 50% of the way through post. Alan Baumgarten (Trumbo, American Hustle, Gangster Squad) picked up for Graham and edited with Schaeffer to the finish, thus earning all three an editing credit.

Working with a writer on his directorial debut

It can always be a challenge when a writer is close to the editing process. Scenes that may be near and dear to the writer are often cut, leading to tension. I asked the three about this situation. Graham says, “Aaron has always been on set with his other films and worked very closely with the director. So, he understands the process, having learned from some of the best directors in the business. I had a great time with Aaron on Steve Jobs. He’s an incredibly lovely and generous collaborator who brings out the best in his team.”

Baumgarten expands, “Working with Aaron was fun, because he appreciates being challenged. He’s open to seeing what an editor brings to the film. Aaron wrote a tight script that didn’t need to be re-arranged. Only about 20 minutes came out. We cut one small scene, but it was mostly trimming here and there. You want to be careful not to ruin the rhythm of his writing.”

Graham continues, “Aaron also found his own visual vocabulary. A lot of the story is told in time jumps, from present day to the past in flashbacks. Aaron is always looking for rapid-fire, overlapping dialogue. It’s part of his uniqueness and it’s a joy to cut. What was new for Aaron was using voice over to drive things.”

Another new challenge was the use of stock footage. About 150 stock shots were used for cutaways and mini-montages throughout the film. Most of these were never originally scripted. Graham says, “Stock footage was something I chose to start injecting into the film with Aaron’s collaboration when I came on. We felt it was useful to have visual references for some of the voice overs – to connect visuals with words, which helps to land Aaron’s linguistic ideas for viewers. This began with the opening ski sequence – the first thing I cut when I came on board.”

The editors would pull down shots from a variety of internet sources and then the actual footage had to be found and cleared. The editors ultimately partnered with STALKR to find and clear all of the stock shots that were used. Visual effects were handled by Mr. X in Toronto. Originally, only 90 shots were budgeted (for example, snow falling in the ski sequences), but in the end, there were almost 600 visual effects shots in the final film.

Musicality of the performance

Baumgarten explains the musicality of Sorkin’s style. He says, “Aaron knew the film he wanted and had that in his head. Part of his writing process is to read his dialogue out loud and listen for the cadence of the performance. As you go through takes, the film is always moving in the right direction. As a writer/director, he doesn’t need variations or ad libs in an actor’s performance from one take to another, because he knows what the intention of the line is. As editors, we didn’t need to experiment with different calibrations of the performance. The experimentation came in with how we wove in the voice-over and played with the general rhythm.”

Graham adds, “Daniel Pemberton is the composer I worked with on Steve Jobs. I brought on Carl Kaller, a great music editor, when I came on. I knew that the music and dialogue had to dance a beautiful rhythm together for the film to be its best. With a compressed schedule to finish the film, we needed someone like Carl to help choreograph that dance.”

Baumgarten continues, “Daniel was involved early and provided us with temp tracks, which was a great gift. We didn’t have to use scores from other composers as temp music. Carl was just down the hall, so it was easy to weave Daniel’s temp elements in and around the dialogue and voice-over during the editing stage. There is interplay between the voice-over and the music, and the VO is like another musical element.”

Avid for the post

The post operation followed a standard feature film set-up: Avid Media Composer editing workstations tied to Avid ISIS shared storage. The film was shot digitally using ARRI Alexas.

Production covered 48 days ending in February [2017]. It took 10 weeks to get to a director’s cut and then editing on Molly’s Game continued for about six months, which included visual effects, final sound mix and color correction. Schaeffer explains, “The dialogue scenes were scripted using [Avid] ScriptSync. Aaron was familiar with ScriptSync from The Newsroom, and it was a great help for us on this film. It’s the best way to have everything readily available and it allows us to be extremely thorough. If Aaron wanted to change a single word in a take, we were always able to find all of the alternates and make the change quite easily.”

Schaeffer continues, “Aaron methodically worked in a reel-by-reel order. We would divide up sequences between us at breaks that made sense. But when it came time to review the cut on a sequence, we would all review together. A lot of people think that you have three editors on a film because the project is so difficult. The truth is that it lets you be more creative. Productions shoot so much footage these days, that it’s great to be able to experiment. Having multiple editors on a film enables you to take the time to be creative. We were all glad that Aaron set up an environment that made that possible.”

Originally written for Digital Video magazine / Creative Planet Network

©2018 Oliver Peters

Avid Media Composer | First

They’ve teased us for two years, but now it’s finally out. Avid Technology has released its free nonlinear editing application, Media Composer | First. This is not dumbed-down, teaser software, but rather a partially-restricted version of the full-fledged Media Composer software and built upon the same code. With that comes an inherent level of complexity, which Avid has sought to minimize for new users; however, you really do want to go through the tutorials before diving in.

It’s important to understand who the target user is. Avid didn’t set out to simply add another free, professional editing tool to an increasingly crowded market. Media Composer | First is intended as a functional starter tool for users who want to get their feet wet in the Avid ecosystem, but then eventually convert to the full-fledged, paid software. That’s been successful for Avid with Pro Tools | First. To sweeten the pot, you’ll also get 350 sound effects from Pro Sound Effects and 50 royalty-free music tracks from Sound Ideas (both sets are also free).

Diving in

To get Media Composer | First, you must set up an Avid master account, which is free. Existing customers can also get First, but the software cannot be co-installed on a computer with the full version. For example, I installed Media Composer | First on my laptop, because I have the full Media Composer application on my desktop. You must sign into the account and stay signed in for Media Composer | First to launch and run. I did get it to work if I signed in, but then disconnected the internet. There was a disconnection prompt, but nevertheless, the application worked, saved, and exported properly. It doesn’t seem mandatory to be constantly connected to Avid over the internet. All project data is stored locally, so this is not a cloud application.

Account management and future updates are handled through Application Manager, an Avid desktop utility. It’s not my favorite, as at times it’s unreliable, but it does work most of the time. Opening the installer .dmg file will take a long time to verify. This seems to be a general Avid quirk, so be patient. When you first open the application, you may get a disk drive write permissions error message. On macOS you normally set drive permissions for “system”, “wheel”, and “everyone”. Typically I have the last two set to “read only”, which works for every other application except Avid’s. Therefore, if you want to store Avid media on your internal system hard drive, then “everyone” must be changed to “read & write”.
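For readers who prefer Terminal over the Finder’s Get Info panel, the same permissions change can be sketched with `chmod`. This is only an illustration – the folder path is a hypothetical example, and loosening permissions on an entire system drive should be done with care:

```shell
# Hypothetical media folder; Avid normally creates an "Avid MediaFiles"
# folder at the root of whichever drive holds the managed media.
MEDIA_DIR="$HOME/Avid MediaFiles"
mkdir -p "$MEDIA_DIR"

# Give "everyone" (the POSIX "other" class) read & write access,
# mirroring the Finder change described above.
chmod o+rw "$MEDIA_DIR"

# Verify: the last permission bits in the listing should include rw.
ls -ld "$MEDIA_DIR"
```

Scoping the change to a dedicated media folder, rather than the whole internal drive, keeps the rest of the disk at its default permissions.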

The guided tour

The Avid designers have tried to make the Media Composer | First interface easy to navigate for new users – especially those coming from other NLEs, where media and projects are managed differently than in Media Composer. Right at the launch screen you have the option to learn through online tutorials. These will be helpful even for experienced users who might try to “out-think” the software. The interface includes a number of text overlays to help you get started. For example, there is no place to set project settings. The first clip added to the first sequence sets the project settings from there on. So don’t drop a 25fps clip onto the timeline as your first clip if you intend to work in a 23.98fps project. These prompts are right in front of you, so if you follow their guidance, you’ll be OK.

The same holds true for importing media through the Source Browser. With Media Composer you either transcode a file, which turns it into Avid-managed media placed into the Avid MediaFiles folder, or simply link to the file. If you select link, then the file stays in place and it’s up to the user not to move or delete that file on the hard drive. Although the original Avid paradigm was to only manage media in its MediaFiles hard drive folders, the current versions handle linking just fine and act largely the same as other NLEs.

Options, restrictions, and limitations

Since this is a free application, a number of features have been restricted. There are three biggies. Tracks are limited to four video tracks and eight audio tracks. This is actually quite workable; however, I think a higher audio track count would have been advisable, because of how Avid handles stereo, mono, and multichannel files. On a side note, if you use the “collapse” function to nest video clips, it’s possible to vertically stack more than just four clips on the timeline.

The application is locked to a maximum project size of 1920×1080 (Rec. 709 color space only) and up to 59.94fps. Source files can be larger (such as 4K) and you can still use them on the timeline, but you’ll have to pan-and-scan, crop, or scale them. I hope future versions will permit at least UltraHD (4K) project sizes.

Finally, Media Composer | First projects cannot be interchanged with full-fledged Media Composer projects. This means that you cannot start in Media Composer | First and then migrate your project to the paid version. Hopefully this gets fixed in a future update. If not, it will negatively impact students and indie producers using the application for any real work.

As expected, there are no 3D stereoscopic tools, ScriptSync (automatic speech-to-text/sync-to-script), PhraseFind (phonetic search engine), or Symphony (advanced color correction) options. One that surprised me, though, was the removal of the superior Spectramatte keyer. You are left with the truly terrible RGB keyer for blue/green-screen work.

Nevertheless, there’s plenty of horsepower left. For example, FrameFlex handles resizing and Timewarps handle retiming, which is how 4K sources and off-speed frame rates are accommodated. Color correction (including scopes), multicam, IllusionFX, source setting color LUTs, Audiosuite, and Pro Tools-style audio track effects are also there. Transcoding allows for the use of a wide range of codecs, including ProRes on a Mac. 4K camera clips will be transcoded to 1080. However, exports are limited to Avid DNxHD and H.264 QuickTime files at up to 1920×1080. The only DNxHD export flavor is the 100Mbps variant (at 29.97; 80Mbps for 23.98), which is comparable to ProRes LT. It’s good quality, but not at the highest mastering levels.

Conclusion

This is a really good first effort, no pun intended. As you might expect, it’s a little buggy for a first version. For example, I experienced a number of crashes while testing source LUTs. However, it was well-behaved during standard editing tasks. If Media Composer | First files can become compatible with the paid systems and the 1080 limit can be increased to UHD/4K, then Avid has a winner on its hands. Think of the film student who starts on First at home, but then finishes on the full version in the college’s computer lab. Or the indie producer/director who starts his or her own rough cut on First, but then takes it to an editor or facility to complete the process. These are ideal scenarios for First. I’ve cut tons of short and long form projects, including a few feature films, using a variety of NLEs. Nearly all of those could have been done using Media Composer | First. Yes, it’s free, but there’s enough power to get the job done and done well.

©2017 Oliver Peters

Suburbicon

George Clooney’s latest film, Suburbicon, originated over a decade ago as a screenplay by Joel and Ethan Coen. Clooney picked it up when the Coens decided not to produce the film themselves. Clooney and writing partner Grant Heslov (The Monuments Men, The Ides of March, Good Night, and Good Luck) rewrote it as taking place in the 1950s and added another story element. In the summer of 1957, the Myers, an African-American couple, moved into a largely white suburb in Levittown, Pennsylvania, setting off months of violent protests. The rewritten script interweaves the tale of the black family with that of their next-door neighbors, Gardner (Matt Damon) and Margaret (Julianne Moore). In fact, a documentary was produced about the historical events, and shots from that documentary were used in Suburbicon.

Calibrating the tone

During the production and editing of the film, the overall tone was adjusted as a result of the actual, contemporary events occurring in the country. I spoke with the film’s editor, Stephen Mirrione (The Revenant, Birdman or (The Unexpected Virtue of Ignorance), The Monuments Men) about this. Mirrione explains, “The movie is presented as over-the-top to exaggerate events as satire. In feeling that out, George started to tone down the silliness, based on surrounding events. The production was being filmed during the time of the US election last year, so the mood on the set changed. The real world was more over-the-top than imagined, so the film didn’t feel quite right. George started gravitating towards a more realistic style and we locked into that tone by the time the film moved into post.”

The production took place on the Warner Brothers lot in September 2016 with Mirrione and first assistant editor Patrick Smith cutting in parallel with the production. Mirrione continues, “I was cutting during this production period. George would come in on Saturdays to work with me and ‘recalibrate’ the cut. Naturally some scenes were lost in this process. They were funny scenes, but just didn’t fit the direction any longer. In January we moved to England for the rest of the post. Amal [Clooney, George’s wife] was pregnant at the time, so George and Amal wanted to be close to her family near London. We had done post there before and had a good relationship with vendors for sound post. The final sound mix was in the April/May time frame. We had an editing room set up close to George outside of London, but also others in Twickenham and at Pinewood Studios. This way I could move around to work with George on the cut, wherever he needed to be.”

Traveling light

Mirrione is used to working with a light footprint, so the need for mobility was no burden. He explains, “I’m accustomed to being very mobile. All the media was in the Avid DNxHD36 format on mobile drives. We had an Avid ISIS shared storage system in Twickenham, which was the hub for all of the media. Patrick would make sure all the drives were updated during production, so I was able to work completely with standalone drives. The Avid is a bit faster that way, although there’s a slight trade-off waiting for updated bins to be sent. I was using a ‘trash can’ [2013] Mac Pro plus AJA hardware, but I also used a laptop – mainly for reference – when we were in LA during the final steps of the process.” The intercontinental workflow also extended to color correction. According to Mirrione, “Stefan Sonnenfeld was our digital intermediate colorist and Company 3 [Co3] stored a back-up of all the original media. Through an arrangement with Deluxe, he was able to stream material to England for review, as well as from England to LA to show the DP [Robert Elswit].”

Music was critical to Suburbicon and scoring fell to Alexandre Desplat (The Secret Life of Pets, Florence Foster Jenkins, The Danish Girl). Mirrione explains their scoring process. “It was very important, as we built the temp score in the edit, to understand the tone and suspense of the film. George wanted a classic 1950s-style score. We tapped some Elmer Bernstein, Grifters, The Good Son, and other music for our initial style and direction. Peter Clarke was brought on as music editor to help round out the emotional beats. Once we finished the cut, Alexandre and George worked together to create a beautiful score. I love watching the scenes with that score, because his music makes the editing seem much more exciting and elegant.”

Suiting the edit tool to your needs

Stephen Mirrione typically uses Avid Media Composer to cut his films and Suburbicon is no exception. Unlike many film editors who rely on unique Avid features, like ScriptSync, Mirrione takes a more straightforward approach. He says, “We were using Media Composer 8. The way George shoots, there’s not a lot of improv or tons of takes. I prefer to just rely on PDFs of the script notes and placing descriptions into the bins. The infrastructure required for ScriptSync, like extra assistants, is not something I need. My usual method of organization is a bin for each day of dailies, organized in shooting order. If the director remembers something, it’s easy to find in a day bin. During the edit, I alternate my bin set-ups between the script view and the frame view.”

With a number of noted editors dabbling with other software, I wondered whether Mirrione has been tempted. He responds, “I view my approach as system-agnostic and have cut on Lightworks and the old Laser Pacific unit, among others. I don’t want to be dependent on one piece of software to define how I do my craft. But I keep coming back to Avid. For me it’s the trim mode. It takes me back to the way I cut film. I looked at Resolve, because it would be great to skip the roundtrip between applications. I had tested it, but felt it would be too steep a learning curve, and that would have impacted George’s experience as the director.”

In wrapping up our conversation, Mirrione concluded with this takeaway from his Suburbicon experience. He explains, “In our first preview screening, it was inspiring to see how seriously the audience took the film and the attachment they had to the characters. The audiences were surprised at how biting and relevant it is to today. The theme of the film is really about what can happen when people don’t speak out against racism and bullying. I’m so proud and lucky to have the opportunity to work with someone like George, who wants to do something meaningful.”

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters

Baby Driver

You don’t have to be a rabid fan of Edgar Wright’s work to know of his films. His comedy trilogy (Shaun of the Dead, Hot Fuzz, The World’s End) and cult classics like Scott Pilgrim vs. the World loom large in pop culture. His films have earned a life beyond most films’ brief release period and won Wright a loyal following. His latest, Baby Driver, is a musically-fueled action film that Wright wrote and directed, and it just made a big splash at SXSW. It stars Ansel Elgort, Kevin Spacey, Jon Hamm, Jamie Foxx, and Eiza Gonzalez.

At NAB, Avid brought in a number of featured speakers for its main stage presentations, as well as its Avid Connect event. One of these speakers was Paul Machliss (Scott Pilgrim vs. the World, The World’s End, Baby Driver), who spoke to packed audiences about the art of editing these films. I had a chance to go in-depth with Machliss about the complex process of working on Baby Driver.

From Smoke to baptism by fire

We started our conversation with a bit of the backstory of the connection between Wright and Machliss. He says, “I started editing as an online editor and progressed from tape-based systems to being one of the early London-based Smoke editors. My boss at the time passed along a project that he thought would be perfect for Smoke. That was onlining the sitcom Spaced, directed by Edgar Wright. Edgar and I got on well. Concurrent to that, I had started learning Avid. I started doing offline editing jobs for other directors and had a ball. A chance came along to do a David Beckham documentary, so I took the plunge from being a full-time online editor to taking my chances in the freelance world. On the tail end of the documentary, I got a call from Edgar, offering me the gig to be the offline editor for the second season of Spaced, because Chris Dickens (Hot Fuzz, Berberian Sound Studio, Slumdog Millionaire) wasn’t available to complete the edit. And that was really jumping into the deep end. It was fantastic to be able to work with Edgar at that level.”

Machliss continues, “Chris came back to work with Edgar on Shaun of the Dead and Hot Fuzz, so over the following years I honed my skills working on a number of British comedies and dramas. After Slumdog Millionaire came out, which Chris cut and for which he won a number of awards, including an Oscar, Chris suddenly found himself very busy, so the rest of us working with Edgar all moved up one in the queue, so to speak. The opportunity to edit Scott Pilgrim came up, so we all threw ourselves into the world of feature films, which was definitely a baptism by fire. We were very lucky to be able to work on a project of that nature during a time where the industry was in a bit of a slump due to the recession. And it’s fantastic that people still remember it and talk about it seven years on. Which brings us to Baby Driver. It’s great when a studio is willing to invest in a film that isn’t a franchise, a sequel, or a reboot.”

Music drives the film

In Baby Driver, Ansel Elgort plays “Baby”, a young kid who is the getaway driver for a gang. A childhood car accident left him with tinnitus, so he listens to music 24/7 to drown out the ringing. Machliss explains, “His whole life becomes regimented to whatever music he is listening to – different music for different moods or occasions. Somehow everything falls magically into sync with whatever he is listening to – when he’s driving, swerving to avoid a car, making a turn – it all seems to happen on the beat. Music drives every single scene. Edgar deliberately chose commercial top-20 tracks from the 1960s up to today. Each song Baby listens to also slyly comments on whatever is happening at the time in the story. Everything is seemingly choreographed to musical rhythms. You’re not looking at a musical, but everything is musically driven.”

Naturally, building a film to popular music brings up a whole host of production issues. Machliss tells how this film had been in the planning for years, “Edgar had chosen these tracks years ago. I believe it was in 2011 that Edgar and I tried to sequence the tracks and intersperse them with sound effects. A couple of months later, he did a table read in LA and sent me the sound files. In the Avid, I combined the sound files, songs, and some sound effects to create effectively a 100-minute radio play, which was, in fact, the film in audio form. The big thing is that we had to clear every song before we could start filming. Eventually we cleared 30-odd songs for the film. In addition, Edgar worked with his stunt team and editor Evan Schiff in LA to create storyboards and animatics for all of the action scenes.”

Editor on the front lines

Unlike most films, a significant amount of the editing took place on-set with Machliss working from a portable set-up. He says, “Based on our experiences with Scott Pilgrim and World’s End, Edgar decided it would be best to have me on-set during most of the Atlanta shoot for Baby Driver. Even though a cutting room was available, I was in there maybe ten percent of the time. The rest of the time I was on set. I had a trolley with a laptop, monitor, an Avid Mojo, and some hard drives and I would connect myself via ethernet to the video assist’s hard drive. Effectively I was crew in the front lines with everyone else. Making sure the edit worked was as important as getting a good take in the can. If I assured Edgar that a take would work, then he knew it wasn’t going to come back and cause problems for us six months later. We wanted things to work naturally in camera without a lot of fiddling in post. We didn’t want to have to fall back on frame-cutting and vari-speeding if we didn’t have to. There was a lot of prep work in making sure actions correctly coincided with certain lyrics without the action seeming mechanical.”

The nature of the production added to the complexity of the production audio configuration, too. Machliss explains, “Sound-wise, it was very complicated. We had playback going to earwigs in the actors’ ears, Edgar wanted to hear music plus the dialogue in his cans, and then I needed to get a split feed of the audio, since I already had the clean music on my timeline. We shot this mostly on 35mm film. Some days were A-camera only, but usually two cameras running. It was a combination of Panavision, Arricams, and occasionally Arri Alexas. Sometimes there were some stunt shots, which required nine or ten cameras running. Since the action all happened against playback of a track, this allowed me to use Avid’s multicam tools to quickly group shots together. Avid’s AMA tools have really come of age, so I was able to work without needing to ingest anything. I could treat the video assist’s hard drive as my source media, as long as I had the ethernet connection to it. If we were between set-ups, I could get Avid to background-transcode the media, so I’d have my own copy.”

Did all of this on-set editing speed up the rest of the post process? He continues, “All of the on-set editing helped a great deal, because we went into the real post-production phase knowing that all the sequences basically worked. During that time, as I’d fill up a LaCie Rugged drive, I would send that back to the suites. My assistant, Jerry Ramsbottom, would then patiently overcut my edits from the video assist with the actual scanned telecine footage as it came in. We shot from mid-February until mid-May and then returned to England. Jonathan Amos came on board a few weeks into the director’s cut edit and worked on the film with Edgar and myself up until the director’s cut picture lock. He did a pass on some of the action scenes while Edgar and myself concentrated on dialogue and the overall shape of the film. He stayed on board up until the final picture lock and made an incredible contribution to the action and the tension of the film. By the end of the year we’d locked and then we finished the final mix mid-February of this year. But the great thing was to be able to come into the edit and have those sequences ready to go.”

Editing from set is something many editors try to avoid, feeling they can be more objective at a distance. Machliss sees it a bit differently, “Some editors don’t like being on set, but I like the openness of it – taking it all in. Because when you are in the edit, you can recall the events of the day a particular scene was shot – ‘I can remember when Kevin Spacey did this thing on the third take, which could be useful’. It’s not vital to work like this, but it does lend itself to a kind of shorthand, which is something Edgar and I have developed over these years anyway. The beauty of it is that Edgar and I will take the time to try every option. You can never hit on the perfect cut the first time. Often you’ll get feedback from screenings, such as ‘we’d like to see more emotion between these characters’. You know what’s available and sometimes four extra shots can make all the difference in how a scene reads without having to re-imagine anything. We did drop some scenes from the final version of the film. Of course, you go ‘that’s a shame’, but at least these scenes were given a chance. However, there are always bits where upon the 200th viewing you can decide, ‘well, that’s completely redundant’ – and it’s easy to drop. You always skate as close to the edge of making a film shorter without doing any damage to it.”

The challenge of sound

During sound post, Baby Driver also presented some unique challenges. Machliss says, “For the sound mix – and even for the shoot – we had to make sure we were working with the final masters of the song recordings to make sure the pitch and duration remained constant throughout. Typically these came in as mono or stereo WAVs. Because music is such an important element to the film, the concept of perceived direction becomes important. Is the music emanating from Baby’s earbuds? What happens to it when the camera moves or he turns his head? We had to work out a language for the perception of sound. This was Edgar’s first film mixed in Dolby Atmos and we were the second film in Goldcrest London’s new Atmos-certified dubbing theater. Then we did a reduction to 7.1 and 5.1. Initially we were thinking this film would have no score other than the songs. Invariably you need something to get from A to B. We called on the services of Steven Price (Gravity, Fury, Suicide Squad), who provided us with some original cues and some musical textures. He did a very clever thing where he would match the end pitch or notes of a commercial song and then by the time he came to the end of his cue, it would match to the incoming note or key of the next song. And you never notice the change.”

Working with Avid in a new way

To wrap up the conversation, we talked a bit about using Avid Media Composer on his work. Machliss has used numerous other systems, but Media Composer still fits the bill for his work today. He says, “For me, the speed of working with AMA in Avid in the latest software was a real benefit. I could actually keep up with the speed of the shoot. You don’t want to be the one holding up a crew of 70. I also made good use of background transcoding. On a different project (Fleabag), I was able to work with native 2K Alexa ProRes camera files at full resolution. It was fantastic to be able to use Frameflex and apply LUTs – doing the cutting, but then bringing back my old skills as an online editor to paint out booms and fix things up. Once we locked, I could remove the LUTs and export DPX files, which went straight to the grading facility. That was exciting to work in a new way.”

Baby Driver opened at the start of July in the US and is a fun ride. You can certainly enjoy a film like this without knowing the nitty gritty of the production that goes into it. However, after you’ve read this article, you just might need to see it at least twice – once to just enjoy and once again to study the “invisible art” that’s gone into bringing it to screen.

(For more with Paul Machliss, check out these interviews at Studio Daily, ProVideoCoalition, and FrameIO.)

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters

Voyage of Time

Fans of director Terrence Malick adore his unique approach to filmmaking, which is often defined by timeless and painterly cinematic compositions. The good news for moviegoers is that Malick has been in the most prolific period of his directing career. Perhaps the pinnacle of cinema as poetry is Malick’s recent documentary, Voyage of Time. This is no less than a chronicle of the history of the universe as seen through Malick’s eyes. Even more intriguing is the fact that the film is being released in two versions – a 90-minute feature (Voyage of Time: Life’s Journey), narrated by Cate Blanchett, as well as a 45-minute IMAX version (Voyage of Time: The IMAX Experience), narrated by Brad Pitt.

This period of Malick’s increased output has not only been good for fans, but also for Keith Fraase, co-editor of Voyage of Time. Fraase joined Malick’s filmmaking team during post on The Tree of Life. Although he was an experienced editor of commercials and shorts, working with Malick was his first full-length feature. Keith Fraase and I recently discussed what it took to bring Voyage of Time to the screen.

Eight years in the making

“I began working with Terry back in 2008 on The Tree of Life,” Fraase says. “Originally, Voyage of Time had been conceived as a companion piece to The Tree of Life, to be released simultaneously. But plans changed and the release of Voyage was delayed. Some of the ideas and thematic elements that were discussed for Voyage ended up as the ‘creation sequence’ in Tree, but reworked to fit the tone and style of that film. Over the years, Voyage became something that Terry and I would edit in between post on his other narrative films. It was our passion project.”

Malick’s cutting rooms are equipped with Avid Media Composer systems connected to Avid shared storage. Typically his films are edited by multiple editors. (Voyage of Time was co-edited by Fraase and Rehman Nizar Ali.) Not only editors, but also researchers, needed access to the footage, so at times, there were as many as eight Media Composer systems used in post. Fraase explains, “There is almost always more than one editor on Terry’s films. At the start of post, we’d divvy up the film by section and work on it until achieving a rough assembly. Then, once the film was assembled in full, each editor would usually trade-off sections or scenes, in the hope of achieving some new perspective on the cut. It was always about focusing on experimentation or discovering different approaches to the edit. With Voyage, there was so much footage to work with, some of which Terry had filmed back in the 70s. This was a project he’d had in his mind for decades. In preparation, he traveled all over the world and had amassed years of research on natural phenomena and the locations where he could film them. During filming, the crew would go to locations with particular goals in mind, like capturing mud pots in Iceland or cuttlefish in Palau. But Terry was always on the lookout for the unexpected. Due to this, much of the footage that ended up in the final films was unplanned.”

Cutting Voyage of Time presented an interesting way to tackle narration. Fraase continues, “For Voyage, there were hours and hours of footage to cut with, but we also did a lot of experiments with sound. Originally, there was a 45-page script written for the IMAX version, which was expanded for the full feature. However, this script was more about feelings and tone than outlining specific beats or scenes. It was more poetry than prose, much of which was later repurposed and recorded as voiceover. Terry has a very specific way of working with voiceover. The actors record pages and pages of it. All beautifully written. But we never know what is going to work until it’s recorded, brought into the Avid, and put up against picture. Typically, we’ll edit together sequences of voiceover independent of any footage. Then we move these sequences up and down the timeline until we find a combination of image and voiceover that produces meaning greater than the sum of the parts. Terry’s most interested in the unexpected, the unplanned.”

The art of picture and sound composition

Naturally, when moviegoers think of a Terrence Malick film, imagery comes to mind. Multiple visual effects houses worked on Voyage of Time, under the supervision of Dan Glass (Jupiter Ascending, Cloud Atlas, The Master). Different artists worked on different sections of the film. Fraase explains, “Throughout post production, we sought guidance from scientific specialists whenever we could. They would help us refine certain thematic elements that we knew we wanted into specific, illustratable moments. We’d then bring these ideas to the different VFX shops to expand on them. They mocked up the various ‘previz’ shots that we’d test in our edit – many of which were abandoned along the way. We had to drop so many wonderful images and moments after they’d been painstakingly created, because it was impossible to know what would work best until placed in the edit.”

“For VFX, Terry wanted to rely on practical film elements as much as possible. Even the shots that were largely CGI had to have some foundation in the real. We had an ongoing series of what we called ’skunkworks shoots’ during the weekends, where the crew would film experiments with elements like smoke, flares, dyes in water and so on. These were all layered into more complex visual effects shots.” Although principal photography was on film, the finished product went through a DI (digital intermediate) finishing process. IMAX visual effects elements were scanned at 11K resolution and the regular live action footage at 8K resolution.

The music score for Voyage of Time was also a subject of much experimentation. Fraase continues, “Terry has an extensive classical music library, which was all loaded into the Avid, so that we could test a variety of pieces against the edit. This started with some obvious choices like [Gustav] Holst’s ‘The Planets’ and [Joseph] Haydn’s ‘The Creation’ for a temp score. But we tried others, like a Keith Jarrett piano piece. Then one of our composers [Hanan Townshend, To The Wonder, Knight of Cups] experimented further by taking some of the classical pieces we’d been using and slowing them way, way down. The sound of stringed instruments being slowed results in an almost drone-like texture. For some of the original compositions, Terry was most interested in melodies and chords that never resolve completely. The idea being that, by never resolving, the music was mimicking creation – constantly struggling and striving for completion. Ultimately a collection of all these techniques was used in the final mix. The idea was that this eclectic approach would provide for a soundtrack that was always changing.”

Voyage of Time is a visual symphony, which is best enjoyed if you sit back and just take it in. Keith Fraase offers this, “Terry has a deep knowledge of art and science and he wanted everyone involved in the project to be fascinated and love it as much as he. This is Terry’s ode to the earth.”

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters

NLE as Post Production Hub

As 2009 closed, I wrote a post about Final Cut Studio as the center of a boutique post production workflow. A lot has changed since then, but that approach is still valid and a number of companies can fill those shoes. In each case, rather than be the complete, self-contained tool, the editing application becomes the hub of the operation. Other applications surround it and the workflow tends to go from NLE to support tool and back for delivery. Here are a few solutions.

Adobe Premiere Pro CC

No current editing package comes as close to the role of the old Final Cut Studio as does Adobe’s Creative Cloud. You get nearly all of the creative tools under a single subscription and facilities with a team account can equip every room with the full complement of applications. When designed correctly, workflows in any room can shift from edit to effects to sound to color correction – according to the load. In a shared storage operation, projects can stay in a single bay for everything or shift from bay to bay based on operator speciality and talent.

While there are many tools in the Creative Cloud kit, the primary editor-specific applications are Premiere Pro CC, After Effects CC, and Audition CC. It goes without saying that for most, Photoshop CC and Adobe Media Encoder are also givens. On the other hand, I don’t know too many folks using Prelude CC, so I can’t say what the future of this tool will be – especially since the next version of Premiere Pro includes built-in proxy transcoding. Also, as more of SpeedGrade CC’s color correction tools make it into Premiere Pro, it’s clear to see that SpeedGrade itself is getting very little love. The low-cost market for outboard color correction software has largely been lost to DaVinci Resolve (free). For now, SpeedGrade is really “dead man walking”. I’d be surprised if it’s still around by mid-2017. That might also be the case for Prelude.

Many editors I know that are heavy into graphics and visual effects do most of that work in After Effects. With CC and Dynamic Link, there’s a natural connection between the Premiere Pro timeline and After Effects. A similar tie can exist between Premiere Pro and Audition. I find the latter to be a superb audio post application and, from my experience, provides the best transfer of a Premiere Pro timeline into any audio application. This connection is being further enhanced by the updates coming from Adobe this year.

Rounding out the package is Photoshop CC, of course. While most editors are not big Photoshop artists, it’s worth noting that this application also enables animated motion graphics. For example, if you want to create an animated lower third banner, it can be done completely inside of Photoshop without ever needing to step into After Effects. Drop the file onto a Premiere Pro timeline and it’s complete with animation and proper transparency values. Update the text in Photoshop and hit “save” – voila, the graphic is instantly updated within Premiere Pro.

Given the breadth and quality of tools in the Creative Cloud kit, it’s possible to stay entirely within these options for all of a facility’s post needs. Of course, roundtrips to Resolve, Baselight, ProTools, etc. are still possible, but not required. Nevertheless, in this scenario I typically see everything starting and ending in Premiere Pro (with exports via AME), making the Adobe solution my first vote for the modern hub concept.

Apple Final Cut Pro X

Apple walked away from the market for an all-inclusive studio package. Instead, it opted to offer more self-contained solutions that don’t have the same interoperability as before, nor that of the comparable Adobe solutions. To build up a similar toolkit, you would need Final Cut Pro X, Motion, Compressor and Logic Pro X. An individual editor/owner would purchase these once and install them on as many machines as he or she owned. A business would have to buy each application for each separate machine. So a boutique facility would need a full set for each room or they would have to build rooms by specialty – edit, audio, graphics, etc.

Even with this combination, there are missing links when going from one application to another. These gaps have to be plugged by the various third-party productivity solutions, such as Clip Exporter, XtoCC, 7toX, Xsend Motion, X2Pro, EDL-X and others. These provide better conduits between Apple applications than Apple itself provides. For example, only through Automatic Duck Xsend Motion can you get an FCPX project (timeline) into Motion. Marquis Broadcast’s X2Pro Audio Convert provides a better path into Logic than the native route.

If you want the sort of color correction power available in Premiere Pro’s Lumetri Color panel, you’ll need more advanced color correction plug-ins, like Hawaiki Color or Color Finale. Since Apple doesn’t produce an equivalent to Photoshop, look to Pixelmator or Affinity Photo for a viable substitute. Although powerful, you still won’t get quite the same level of interoperability as between Photoshop and Premiere Pro.

Naturally, if your desire is to use non-Apple solutions for graphics and color correction, then similar rules apply as with Premiere Pro. For instance, roundtripping to Resolve for color correction is pretty solid using the FCPXML import/export function within Resolve. Prefer to use After Effects for your motion graphics instead of Motion? Then Automatic Duck Ximport AE on the After Effects side has your back.

Most of the tools are there for those users wishing to stay in an Apple-centric world, provided you add a lot of glue to patch over the missing elements. Since many of the plug-ins for FCPX (Motion templates) are superior to a lot of what’s out there, I do think that an FCPX-centric shop will likely choose to start and end in X (possibly with a Compressor export). Even when Resolve is used for color correction, I suspect the final touches will happen inside of Final Cut. It’s more of the Lego approach to the toolkit than the Adobe solution, yet I still see it functioning in much the same way.

Blackmagic Design DaVinci Resolve

It’s hard to say what Blackmagic’s end goal is with Resolve. Clearly the world of color correction is changing. Every NLE developer is integrating quality color correction modules right inside of their editing application. So it seems only natural that Blackmagic is making Resolve into an all-in-one tool for no other reason than self-preservation. And by golly, they are doing a darn good job of it! Each version is better than the last. If you want a highly functional editor with world-class color correction tools for free, look no further than Resolve. Ingest, transcoded and/or native media editing, color correction, mastering and delivery – all there in Resolve.

There are two weak links – graphics and audio. On the latter front, the internal audio tools are good enough for many editors. However, Blackmagic realizes that specialty audio post is still the domain of the sound engineering world, which is made up predominantly of Avid Pro Tools shops. To make this easy, Resolve has built-in audio export functions to send the timeline to Pro Tools via AAF. There’s no roundtrip back, but you’d typically get composite mixed tracks back from the engineer to lay into the timeline.

To build on the momentum it started, Blackmagic Design acquired the assets of EyeOn’s Fusion software, which gives them a node-based compositor, suitable for visual effects and some motion graphics. This requires a different mindset than After Effects with Premiere Pro or Motion with Final Cut Pro X (when using Xsend Motion). You aren’t going to send a full sequence from Resolve to Fusion. Instead, the Connect plug-in links a single shot to Fusion, where it can be processed through a series of nodes. The Connect plug-in provides a similar “conduit” function to that of Adobe’s Dynamic Link between Premiere Pro and After Effects, except that the return is a rendered clip instead of a live project file. To take advantage of this interoperability between Resolve and Fusion, you need the paid versions.

Just as in Apple’s case, there really is no Blackmagic-owned substitute for Photoshop or an equivalent application. You’ll just have to buy what matches your need. While it’s quite possible to build a shop around Resolve and Fusion (plus maybe Pro Tools and Photoshop), it’s more likely that Resolve’s integrated approach will appeal mainly to those folks looking for free tools. I don’t see too many advanced pros doing their creative cutting on Resolve (at least not yet). However, that being said, it’s pretty close, so I don’t want to slight the capabilities.

Where I see it shine is as a finishing or “online” NLE. Let’s say you perform the creative or “offline” edit in Premiere Pro, FCPX or Media Composer. This could even be three editors working on separate segments of a larger show – each on a different NLE. Each editor’s sequence goes to Resolve, where the timelines are imported, combined and relinked to the high-res media. The audio has gone via a parallel path to a Pro Tools mixer, and graphics come in as individual clips, shots or files. Everything is then combined, color corrected and delivered straight from Resolve. For many shops, that scenario is starting to look like the best of all worlds.

I tend to see Resolve as less of a hub than either Premiere Pro or Final Cut Pro X. Instead, I think it may take several possible positions: a) color correction and transcoding at the front end, b) color correction in the middle – i.e. the standard roundtrip, and/or c) the new “online editor” for final assembly, color correction, mastering and delivery.

Avid Media Composer

This brings me to Avid Media Composer, the least integrated of the bunch. You can certainly build an operation based on Media Composer as the hub – as so many shops have. But there simply isn’t the silky smooth interoperability among tools like there is with Adobe or the dearly departed Final Cut Pro “classic”. However, that doesn’t mean it’s not possible. You can add advanced color correction through the Symphony option, plus Avid Pro Tools in your mixing rooms. In an Avid-centric facility, rooms will definitely be task-oriented, rather than offering the ease of switching functions within the same suite based on workload, as you can with Creative Cloud.

The best path right now is Media Composer to Pro Tools. Unfortunately, it ends there. Like Blackmagic, Avid only offers two hero applications in the post space – Media Composer/Symphony and Pro Tools. They have graphics products, but those are designed and configured for on-air news operations. This means that effects and graphics are typically handled through After Effects, Boris RED or Fusion.

Boris RED runs as an integrated tool, which augments the Media Composer timeline. However, RED uses its own user interface. That operation is relatively seamless, since any “roundtrip” happens invisibly within Media Composer. Fusion can be integrated using the Connect plug-in, just like between Fusion and Resolve. Automatic Duck’s AAF import functions have been integrated directly into After Effects by Adobe. It’s easy to send a Media Composer timeline into After Effects as a one-way trip. In fact, that’s where this all started in the first place. Finally, there’s also a direct connection with Baselight Editions for Avid, if you add that as a “plug-in” within Media Composer. As with Boris RED, clips open up in the Baselight interface, which has now been enhanced with a smoother shot-to-shot workflow inside of Media Composer.

While a lot of shops still use Media Composer as the hub, this seems like a very old-school approach. Many editors still love this NLE for its creative editing prowess, but in today’s mixed-format, mixed-codec, file-based post world, Avid has struggled to keep Media Composer competitive with the other options. There’s certainly no reason Media Composer can’t be the center – with audio in Pro Tools, color correction in Resolve, and effects in After Effects. However, most newer editors simply don’t view it the same way they do Adobe’s or even Apple’s tools. Generally, it seems the best Avid path is to “offline” edit in Media Composer and then move to other tools for everything else.

So that’s post in 2016. Four good options with pros and cons to each. Sorry to slight the Lightworks, Vegas Pro, Smoke/Flame and Edius crowds, but I just don’t encounter them too often in my neck of the woods. In any case, there are plenty of options, even starting at free, which makes the editing world pretty exciting right now.

©2016 Oliver Peters