Good Omens

Fans of British television comedies have a new treat in Amazon Prime’s Good Omens. The six-part mini-series is a co-production of BBC Studios and Amazon Studios. It is the screen adaptation of the 1990 hit novel by the late Terry Pratchett and Neil Gaiman, entitled Good Omens: The Nice and Accurate Prophecies of Agnes Nutter, Witch. Just imagine if the Book of Revelation had been written by Edgar Wright or the Coen brothers. Toss in a bit of The Witches of Eastwick and I think you’ll get the picture.

The series stars Michael Sheen (Masters of Sex, The Good Fight) as Aziraphale (an angel) and David Tennant (Mary Queen of Scots, Doctor Who) as Crowley (a demon). Although on opposing sides, the two have developed a close friendship going back to the beginning of humanity. Now it’s time for the Antichrist to arrive and bring about Armageddon. Except that the two have grown fond of humans and their life on Earth, so Crowley and Aziraphale aren’t quite ready to see it all end. They form an unlikely alliance to thwart the End Times. Naturally this gets off to a bad start when the Antichrist child is mixed up at birth and placed with the wrong family. The series also stars an eclectic supporting cast, including Jon Hamm (Baby Driver, Mad Men), Michael McKean (Veep, Better Call Saul), and Frances McDormand (Hail, Caesar!, Fargo) as the voice of God.

Neil Gaiman (Lucifer, American Gods) was able to shepherd the production from novel to screen by writing the screenplay adaptation and serving as showrunner. Douglas Mackinnon (Doctor Who, Sherlock) directed all six episodes. I recently had a chance to speak with Will Oswald (Doctor Who, Torchwood: Children of Earth, Sherlock) and Emma Oxley (Liar, Happy Valley), the two editors who brought the production over the finish line.


_____________________________________________________

[OP] Please tell me a bit about your editing backgrounds and how you landed this project.

[Will] I was the lead editor for Doctor Who for a while and got along well with the people. This led to Sherlock. Douglas had worked on both and gave me a call when this came up.

[Emma] I’ve been mainly editing thrillers and procedurals and was looking for a completely different script, when out of the blue I received a call from Douglas. I had worked with him as an assistant editor in 2007 on an adaptation of the Jekyll and Hyde story, and I was fortunate that a couple of Douglas’s main editors were not available for Good Omens. When I read the script, I thought, ‘This is a dream come true.’

[OP] Had either of you read the book before?

[Will] I hadn’t, but when I got the gig, I immediately read the book. It was great, because this is a drama-comedy. How good a job is that? You are doing everything you like. It’s a bit tricky, but it’s a great atmosphere to work in.

[Emma] I was the same, but within a week I had read it. Then the scripts came through and they were pretty much word for word – you don’t expect that. But since it was six hours instead of feature length the book could remain intact.

[OP] I know that episodic series often divide up the editorial workload in many different ways. Who worked on which episode and how was that decided?

[Will] Douglas decided that I would do the first three episodes and Emma would edit the last three. The series happened to split very neatly in the middle. The first three episodes really set up the back story and the relationship between the characters and then the story shifts tone in the last three episodes.

[Emma] Normally in TV the editors would leapfrog each other. In this case, as Will said, the story split nicely into two three-hour sections. It was a nice experience not to have to jump backwards and forwards.

[Will] The difficult thing for me in the first half is that the timeline is so complicated. In the first three episodes you have to develop the back story, which in this case goes back and forth through the centuries – literally back to the beginning of time. You also have to establish the characters’ relationship to each other. By the end of episode three, they really start falling apart, even though they do really like each other. It’s a bit like Butch Cassidy and the Sundance Kid. Of course, Emma then had to resolve all the conflicts in her episodes. But it was nice to go rocking along from one episode to the next.

[OP] What was the post-production schedule like?

[Emma] Well, we didn’t really have a schedule. That’s why it worked! (laugh) Will and I were on it from the very start and once we decided to split up the edit as two blocks of three episodes, there were days when I wouldn’t get any rushes, so I could focus on getting a cut done – and vice versa with Will. When Douglas came in, we had six pretty good episodes that were cut according to the script. Douglas said he wanted to treat it like a six-hour film, so we did a full pass on all six episodes before Neil came in and then finally the execs. They allowed us the creative freedom to do that.

[Will] When Douglas came back, we basically had a seven-and-a-half-hour movie, which we ran in a cinema on a big screen. Then we went through and made adjustments in order. It was the first time I’ve had both the showrunner and the director in with me every day. Neil had promised Terry that he would make sure it happened. Terry passed away before the production, but he had told Neil – and I’m paraphrasing here – don’t mess it up! So this was a very personal project for him. That weighed heavily on me, because when I reread the book, I wanted to make sure ‘this’ was in and ‘that’ was in as I did my cut.

[OP] What sort of changes were made as you were refining the episodes?

[Will] There were a lot of structural changes in episodes one and two, which ended up differing quite a bit from the script. It was a matter of working out how best to tell the story. Episode one was initially 80 minutes long, and it took quite a lot of work to get down to the hour-long final version. Episode three was much easier.

[Emma] By the time it got to episode four, the pattern had been established, so we had to deal more with visual effects challenges in the second half. We had a number of large set pieces and a limited visual effects budget, so we had to be clever about placing visual effects moments without losing their impact, while maximizing the effects we did have and keeping it as good as we could. For example, there’s a flying saucer scene, but the plate shot didn’t match the saucer shot and it was going to take a ton of work to match everything. So we combined it with a shot intended for another part of the scene. Instead of a full-screen effects shot, it’s seen through a car window. Not only did it save a lot of money, but more importantly, it ended up being a better way for the ship to land and more in the realm of Good Omens storytelling. I love that shot.

[Will] Visual effects are just storytelling points. You want to be careful not to lose the plot. For example, the Hellhound changes into a puppy dog and that transformation was originally intended to be a big visual effect. But instead, we went with a more classic approach. Just a simple cut and the camera tilts down to reveal the smaller dog. It turned out to be a much better way of doing it and makes me laugh every time I see it.

[OP] I noticed a lot of music from Queen used throughout. Any special arrangement to secure that for the series?

[Will] Queen is in the book. Every time Crowley hears music, even if it’s Mozart, it turns into Queen. Fortunately Neil knows everybody!

[Emma] And it’s one of Douglas’s favorite bands of all time, so it was a treat for him to put in as much Queen music as possible. At one point we had it over many more moments.

[Will] Also working with David Arnold [series composer] was great. There’s a lot of his music as well and he really understands what we do in editing.

[OP] Since this was a large effort with a lot of complex work involved, did you have a large team of assistant editors on the job with you?

[Emma] This is the UK. We don’t have a huge team! (laugh)

[Will] We had one assistant, Cat Gregory, and then much later on, a couple more for visual effects.

[Emma] They were great. Cat, our first assistant, had an adjoining room to us and she was our ‘take barometer.’ If you put in an alt line and she didn’t laugh, you knew it wasn’t as good. But if there was a chuckle coming out of her room, it would more often stay.

[OP] How do you work with your assistants? For example, do you let assistants assemble selects, or cut in sound effects or music?

[Will] It was such a heavy schedule with a huge amount of material, so there was a lot of work just to get that in and organized. Just giving us an honest opinion was invaluable. But music and sound effects – you really have to do that yourself.

[Emma] Me, too. I cut my own music and assemble my own rushes.

[OP] Please tell me a bit about your editorial set-up and editing styles.

[Will] We were spread over two or four upstairs/downstairs rooms at the production company’s office in Soho. These were Avid Media Composer systems with shared storage. We didn’t have the ScriptSync option. We didn’t even have Sapphire plug-ins until late in the day, although those might have been nice for some of the bigger scenes with a lot of explosions. I don’t really have an editing style – I think it’s important not to have one as an editor. Style comes out of the content. I think the biggest challenge on this show was how to get the English humor across to an American audience.

[Emma] I wouldn’t say I have an editing style either. I come in, read the notes, and then watch the rushes with that information in my head. There wasn’t a lot of wild variation in the takes and David’s and Michael’s performances were just dreamy. So the material kind of cut itself.

[Will] The most important thing is to familiarize yourself with the material and review the selected takes. Those are the ones the director wanted. That also gives you a fixed point to start from. The great thing about software these days is that you can have multiple versions.

[OP] I know some directors like to calibrate their actors’ performances, with each take getting more extreme in emotion. Others like to have each take be very different from the one before it. What was Mackinnon’s style on this show as a director?

[Emma] In the beginning you always want to figure out what they are thinking. With Douglas it’s easy to see from the material he gives you. He’s got it all planned. He really gets the performance down to a tee in the rehearsal.

[Will] Douglas doesn’t push for a wide range of emotion from one take to the next. As Emma mentioned, Douglas works through that in rehearsal. Actors like David and Michael work that out, too, and they’re bouncing off each other. Douglas has a fantastic visual sense. You can look at the six episodes and go, “Wow, how did you get all of that in?” It’s a lot of material and he found a way to tell that story. There’s a very natural flow to the structure.

[OP] Since both Douglas Mackinnon and Will worked on Doctor Who, and David Tennant was one of the Doctors during the series, was there a conscious effort to stay away from anything that smacked of Doctor Who in Good Omens?

[Will] It never crossed my mind. I always try to do something different, but as I said, the style comes out of the material. It has jeopardy and humor like Doctor Who, but it’s really quite different. I did 32 episodes of Doctor Who and each of those was very different from the others. David Tennant is in it, of course, but he is not even remotely playing the Doctor. Crowley is a fantastic new character for him.

[OP] Are there any final thoughts you’d like to share about working on Good Omens?

[Will] It was a pleasure to work on a world-famous book, and it is very funny. Doing it justice was really all we were trying to do. I was going back every night, rereading the book and marking things up. Hopefully the fans like it. I know Neil does and I hope Terry is watching it.

[Emma] I’m just proud that the fans of the book are saying that it’s one of the best adaptations they’ve ever watched on the screen. That’s a success story and it gives me a warm feeling when I think about Good Omens. I’d go back and cut it again, which I rarely say about any other job.

©2019 Oliver Peters


NAB Show 2019

This year the NAB Show seemed to emphasize its roots – the “B” in National Association of Broadcasters. Gone or barely visible were the fads of past years, such as stereoscopic 3D, 360-degree video, virtual/augmented reality, drones, etc. Not that these are gone – merely that they have refocused on the smaller segment of market share that reflects reality. There’s not much point in promoting stereo 3D at NAB if most of the industry goes ‘meh’.

Big exhibitors of the past, like Quantel, RED, Apple, and Autodesk, are gone from the floor. Quantel products remain as part of Grass Valley (now owned by Belden), which is the consolidation of Grass Valley Group, Quantel, Snell & Wilcox, and Philips. RED decided last year that small, camera-centric shows were better venues. Apple – well, they haven’t been on the main floor for years, but even this year, there was no off-site, Final Cut Pro X stealth presence in a hotel suite somewhere. Autodesk, which shifted to a subscription model a couple of years ago, had a demo suite in the nearby Renaissance Hotel, focusing on its hero product, Flame 2020. Smoke for Mac users – tough luck. It’s been over for years.

This was a nuts-and-bolts year, with many exhibits showing new infrastructure products. These appeal to larger customers, such as broadcasters and network facilities. Specifically, the world is shifting to an IP-based infrastructure for signal routing, control, and transmission. This replaces the copper and fiber wiring of the past, along with the devices (routers, video switchers, etc.) at either end of the wire. Companies that might have appeared less relevant, like Grass Valley, are back in a strong sales position. Other companies, like Blackmagic Design, are being encouraged by their larger clients to fulfill those needs. And as ever, consolidation continues – this year VizRT acquired NewTek, which has been an early player in video-over-IP with its proprietary NDI protocol.

Adobe

The NAB season unofficially started with Adobe’s pre-NAB release of the CC2019 update. For editors and designers, the hallmarks of this update include a new freeform bin window view and adjustable guides in Premiere Pro, and content-aware video fill in After Effects. These are solid additions in response to customer requests, which is something Adobe has focused on. A smaller, but no less important, feature is Adobe’s ongoing effort to improve media performance on the Mac platform.

As in past years, their NAB booth was an opportunity to present these new features in-depth, as well as showcase speakers who use Adobe products for editing, sound, and design. Part of the editing team from the series Atlanta was on hand to discuss the team’s use of Premiere Pro and After Effects in their ‘editing crash pad’.

Avid

For many attendees, NAB actually kicked off on the weekend with Avid Connect, a gathering of Avid users (through the Avid Customer Association), featuring meet-and-greets, workshops, presentations, and ACA leadership committee meetings. While past product announcements at Connect have been subdued from the vantage of Media Composer editors, this year was a major surprise. Avid revealed its Media Composer 2019.5 update (scheduled for release at the end of May). This came as part of a host of updates. Most of these apply to companies that have invested in the full Avid ecosystem, including Nexis storage and Media Central asset management. While those are superb, they only apply to a small percentage of the market. Let’s not forget Avid’s huge presence in the audio world, thanks to the dominance of Pro Tools – now with Dolby Atmos support. With the acquisition of Euphonix years back, Avid has become a significant player in the live and studio sound arena. Various examples of its S-series consoles in action were presented.

Since I focus on editing, let me discuss Media Composer a bit more. The 2019.5 refresh is the first major Media Composer overhaul in years. It started in secret last year. 2019.5 is the first iteration of the new UI, with more to be updated in coming releases. In short, the interface has been modernized and streamlined in ways that attract newer, younger users without alienating established editors. Its panel design is similar to Adobe’s approach – i.e. interface panels can be docked, floated, stacked, or tabbed. Panels that you don’t want to see may be closed or simply slid to the side and hidden. Need to see a hidden panel again? Simply slide it back open from the edge of the screen.

This isn’t just a new skin. Avid has overhauled the internal video pipeline, with 32-bit floating point color and an uncompressed DNx codec. Project formats now support up to 16K. Avid is also compliant with the specs of the Netflix Post Alliance and the ACES logo program.
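
To see why a floating point pipeline matters, consider what happens when a grading operation pushes pixel values past the top of an integer range: the detail clips and is gone for good, while float values preserve the ‘overbright’ information through successive operations. Here’s a generic Python illustration of the principle (not Avid code, just the underlying arithmetic):

```python
# Integer pixels clip at the top of their range; float pixels keep
# 'overbright' detail through multiple operations.

def brighten_int8(v: int, gain: float) -> int:
    return min(int(v * gain), 255)      # 8-bit: hard clip at 255

def brighten_float(v: float, gain: float) -> float:
    return v * gain                     # float: values above 1.0 survive

pixel = 240                             # a near-white 8-bit value
doubled = brighten_int8(pixel, 2.0)     # 255 -- clipped
print(brighten_int8(doubled, 0.5))      # 127 -- original detail is lost

fpixel = 240 / 255                      # same pixel, normalized float
fdoubled = brighten_float(fpixel, 2.0)  # 1.882 -- retained beyond 1.0
print(brighten_float(fdoubled, 0.5))    # 0.941 -- restored exactly
```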

I found the new version very easy to use and a welcome change; however, it will require some adaptation if you’ve been using Media Composer for a long time. In a nod to the Media Composer heritage, the weightlifter (aka ‘liftman’) and scissors icons (for lift and extract edits) are back. Even though Media Composer 2019.5 is just in early beta testing, Avid felt good enough about it to use this version in its workshops, presentations, and stage demos.

One of the reasons to go to NAB is for the in-person presentations by top editors about their real-world experiences. No one can top Avid at this game, since it can easily tap a host of Oscar, Emmy, BAFTA, and Eddie award winners. The hallmark for many this year was the presentation at Avid Connect and/or at the show by the Oscar-winning picture and sound editing/mixing team for Bohemian Rhapsody. It’s hard not to gather a standing-room-only crowd when you close your talk with the Live Aid finale sequence played in kick-ass surround!

Blackmagic Design

Attendees and worldwide observers have come to expect a surprise NAB product announcement out of Grant Petty each year, and he certainly didn’t disappoint this time. Before I get into that, there were quite a few products released, including products for IP infrastructures, 8K production and post, and more. Blackmagic is a full-spectrum video and audio manufacturer that long ago moved into the ‘big leagues’. This means that just like Avid or Grass Valley, they have to respond to pressure from large users to develop products designed around their specific workflow needs. In the BMD booth, many of those development fruits were on display, like the new Hyperdeck Extreme 8K HDR recorder and the ATEM Constellation 8K switcher.

The big reveal for editors was DaVinci Resolve 16. Blackmagic has steadily been moving into the editorial space with this all-in-one, edit/color/mix/effects/finishing application. If you have no business requirement for – or emotional attachment to – one of the other NLE brands, then Resolve (free) or Resolve Studio (paid) is an absolute no-brainer. Nothing can touch the combined power of Resolve’s feature set.

New for Resolve 16 is an additional editorial module called the Cut Page. At first blush, the design, layout, and operation are amazingly similar to Apple’s Final Cut Pro X. Blackmagic’s intent is to make a fast editor where you can start and end your project for a time-sensitive turnaround without the complexities of the Edit Page. However, it’s just another tool, so you could work entirely in the Cut Page, or start in the Cut Page and refine your timeline in the Edit Page, or skip the Cut Page altogether. Resolve offers a buffet of post tools that are at your disposal.

While Resolve 16’s Cut Page does elicit a chuckle from experienced FCPX users, it offers some new twists. For example, there’s a two-level timeline view – the top section is the full-length timeline and the bottom section is the zoomed-in detail view. The intent is quick navigation without the need to constantly zoom in and out of long timelines. There’s also an automatic sync detection function. Let’s say you are cutting a two-camera show. Drop the A-camera clips onto the timeline and then go through your B-camera footage. Find a cut-away shot, mark in/out on the source, and edit. It will ‘automagically’ edit to the in-sync location on the timeline. I presume this is matched by either common sound or timecode. I’ll have to see how this works in practice, but it demos nicely. Changes to other aspects of Resolve were minor and evolutionary, except for one other notable feature: the Color Page added its own version of content-aware video fill.
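
Purely as a thought experiment on the timecode half of that guess, here’s a minimal sketch of how an in-sync edit position could be computed when both cameras share timecode. This is an assumption about the general technique, not Blackmagic’s actual implementation, and the clip data is hypothetical:

```python
# Hypothetical timecode-based sync placement: find where a marked B-camera
# in-point lines up against A-camera clips laid end to end on a timeline.

FPS = 24

def tc_to_frames(tc: str, fps: int = FPS) -> int:
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def sync_edit_position(a_cam_clips, b_cam_in_tc):
    """a_cam_clips: list of (source_start_tc, duration_frames) in timeline
    order. Returns the timeline frame offset where the B-cam in-point is
    in sync, or None if no A-cam clip covers that timecode."""
    target = tc_to_frames(b_cam_in_tc)
    offset = 0
    for start_tc, dur in a_cam_clips:
        start = tc_to_frames(start_tc)
        if start <= target < start + dur:
            return offset + (target - start)
        offset += dur
    return None

# A B-cam cutaway marked at 10:01:30:00 lands 720 frames into the second
# A-cam clip, i.e. frame 2160 of the timeline.
a_cam = [("10:00:00:00", 1440), ("10:01:00:00", 1440)]
print(sync_edit_position(a_cam, "10:01:30:00"))   # -> 2160
```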

Another editorial product addition – tied to the theme of faster, more efficient editing – was a new edit keyboard. Anyone who ever cut in the linear days – especially those who ran Sony BVE9000/9100 controllers – will feel very nostalgic. It’s a robust keyboard with a high-quality, integrated jog/shuttle knob. The feel is very much like controlling a tape deck in a linear system, with fast shuttle response and precise jogging. The precision is far better than any of the USB controllers, like a Contour Shuttle. Whether or not enough people will have interest in shelling out $1,025 for it remains to be seen. It’s a great tool, but are you really faster with one than with FCPX’s skimming and a standard keyboard and mouse?

Ironically, if you look around the Blackmagic Design booth, there does seem to be a nostalgic homage to Sony hardware of the past. As I said, the edit keyboard is very close to a BVE9100 keyboard. Even the style of the control panel on the Hyperdecks – and the look of the name badges on those panels – is very much Sony’s style. As humans, this appeals to our desire for something other than the glass interfaces we’ve been dealing with for the past few years. Michael Cioni (Panavision, Light Iron) coined the term ‘tactile attraction’ for this in his excellent Faster Together Stage talk. It manifests itself not only in these types of control surfaces, but also in skeuomorphic designs applied to audio filter interfaces, or in the emotion created in the viewer when a colorist adds film grain to digital footage.

Maybe Grant is right and these methods are really faster in a pressure-filled production environment. Or maybe this is simply an effort to appeal to emotion and nostalgia by Blackmagic’s designers. (Check out Grant Petty’s two-hour 2019 Product Overview for more in-depth information on Blackmagic Design’s new products.)

8K

I won’t spill a lot of words on 8K. It seems kind of silly when most delivery is HD – and even SD in some places. A lot of today’s production is in 4K, but really only for future-proofing. But the industry has to sell newer and flashier items, so it has moved on to 8K pixel resolution (7680 x 4320). Much of this is driven by Japanese broadcasters and manufacturers, who are pushing into 8K. You can laugh or roll your eyes, but NAB had many examples of 8K production tools (cameras and recorders) and display systems. Of course, it’s NAB, making it hard to tell how many of these are only prototypes and not yet ready for actual production and delivery.
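
Some quick math shows why the eye-rolling is understandable. The sketch below compares raw pixel counts and uncompressed data rates; the 10-bit 4:2:2 sampling and 30fps frame rate are illustrative assumptions, not the specs of any particular product:

```python
# Back-of-the-envelope pixel counts and uncompressed data rates.
# Assumes 10-bit 4:2:2 (20 bits/pixel) at 30 fps -- illustrative only.

FORMATS = {
    "HD  (1920 x 1080)": (1920, 1080),
    "UHD (3840 x 2160)": (3840, 2160),
    "8K  (7680 x 4320)": (7680, 4320),
}

BITS_PER_PIXEL = 20
FPS = 30

for name, (w, h) in FORMATS.items():
    megapixels = w * h / 1e6
    gbps = w * h * BITS_PER_PIXEL * FPS / 1e9
    print(f"{name}: {megapixels:5.1f} MP, ~{gbps:5.1f} Gbit/s uncompressed")

# 8K carries 16x the pixels of HD, so storage, routing, and processing
# all scale by the same factor at every step of the pipeline.
```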

For now, it’s still a 4K game, with plenty of mainstream product. Not only cameras and NLEs, but items like AJA’s KiPro family. The KiPro Ultra Plus records up to four channels of HD or one channel of 4K in ProRes or DNx. The newest member of the family is the KiPro GO, which records up to four channels of HD (25Mbps H.264) onto removable USB media.

Of course, the industry never stops, so while we are working with HD and 4K, and looking at 8K, the developers are planning ahead for 16K. As I mentioned, Avid already has project presets built-in for 16K projects. Yikes!

HDR

HDR – or high dynamic range – is about where it was last year. There are basically four formats vying to become the final standard used in all production, post, and display systems. While there are several frontrunners and edicts from distributors to deliver HDR-compatible masters, there still is no clear path. If you shoot in log or camera raw with nearly any professional camera produced within the past decade, you have originated footage that is HDR-compatible. But none of the low-cost post solutions make this easy, and without the right monitoring environment, you are wasting your time. If anything, those waters are muddier this year. There were a number of HDR displays throughout the show, but there were also a few labeled as using HDR simulation. I saw a couple of those at TV Logic. Yes, they looked gorgeous, and yes, they were receiving an HDR signal. I found out that the ‘simulation’ part of the description meant that the display was bright (up to 350 nits), but not bright enough to qualify as ‘true’ HDR (1,000 nits or higher).

As in past transitions, we are certainly going to have to rely on some ‘glue’ products. For me, that’s AJA again. Through their relationship with Colorfront, AJA offers two HDR products: the HDR Image Analyzer and the FS-HDR converter. The latter was introduced last year as a real-time frame synchronizer and color converter to go between SDR and HDR display standards. The new Analyzer is designed to evaluate color space and gamut compliance. Just remember, no computer display can properly show you HDR, so if you need to post and deliver HDR, proper monitoring and analysis tools are essential.
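
For the curious, those nits figures map directly onto signal range. Below is a minimal sketch of the SMPTE ST 2084 (PQ) inverse EOTF – the transfer function behind most HDR deliverables – showing how much of the PQ scale a 350-nit ‘simulation’ display can actually reproduce compared to a 1,000-nit grading monitor:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: map absolute luminance in nits to a
# normalized 0-1 signal value. Constants come from the ST 2084 spec.

M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(nits: float) -> float:
    y = min(max(nits / 10000.0, 0.0), 1.0)   # PQ is referenced to 10,000 nits
    y_m1 = y ** M1
    return ((C1 + C2 * y_m1) / (1 + C3 * y_m1)) ** M2

for level in (100, 350, 1000, 4000):
    print(f"{level:>5} nits -> PQ signal {pq_encode(level):.3f}")

# A 350-nit panel tops out near 0.64 on the PQ scale, while a 1,000-nit
# grade uses signal values up to ~0.75 -- the top of the range simply
# clips on the dimmer display.
```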

Cameras

I’m not a cinematographer, but I do keep up with cameras. Nearly all of this year’s camera developments were evolutionary: new LF (large format sensor) cameras (ARRI), 4K camcorders (Sharp, JVC), and a full-frame mirrorless DSLR from Nikon (with ProRes RAW recording coming in a future firmware update). Most of the developments were targeted towards live broadcast production, like sports and megachurches. Ikegami had an 8K camera to show, but their real focus was on 4K and IP camera control.

RED, a big player in the cinema space, was only there in a smaller demo room, so you couldn’t easily compare their 8K imagery against others on the floor. But let’s not forget Sony and Panasonic. While ARRI has been a favorite due to the ‘look’ of the Alexa, Sony (Venice) and Panasonic (Varicam and now EVA-1) also make well-respected digital cinema tools that create outstanding images. For example, Sony’s booth featured an amazing, theater-sized 8K micro-pixel LED display system. Some of the sample material shown was of the Rio Carnival, shot with anamorphic lenses on a 6K full-frame Sony Venice camera. Simply stunning.

Finally, let’s not forget Canon’s line-up of cinema cameras, from the C100 to the C700FF. To complement these, Canon introduced their new line of Sumire Prime lenses at the show. The C300 has been a staple of documentary films, including the Oscar-winning film, Free Solo, which I had the pleasure of watching on the flight to Las Vegas. Sweaty palms the whole way. It must have looked awesome in IMAX!

(For more on RED, cameras, and lenses at NAB, check out this thread from DP Phil Holland.)

It’s a wrap

In short, NAB 2019 had plenty for everyone. This also included smaller markets, like products for education seminars. One of these that I ran across was Cinamaker. They were demonstrating a complete multi-camera set-up using four iPhones and an iPad. The iPhones are the cameras (additional iPhones can be used as isolated sound recorders) and the iPad is the ‘switcher/control room’. The set-up can be wired or wireless, but camera control, video switching, and recording are all handled at the iPad. This can generate the final product, or be transferred to a Mac (with the line cut and camera iso media, plus an edit list) for re-editing/refinement in Final Cut Pro X. Not too shabby, given the market that Cinamaker is striving to address.

For those of us who like to use the NAB Show exhibit floor as a miniature yardstick for the industry, one of the trends to watch is what type of gear is used in the booths and press areas. Specifically, one NLE over another, or one hardware platform versus the other. On that front, I saw plenty of Premiere Pro, along with some Final Cut Pro X. Hardware-wise, it looked like Apple versus HP. Granted, PC vendors, like HP, often supply gear to use in the booths as a form of sponsorship, so take this with a grain of salt. Nevertheless, I would guess that I saw more iMac Pros than any other single computer. For PCs, it was a mix of HP Z4, Z6, and Z8 workstations. HP and AMD were partner-sponsors of Avid Connect and they demoed very compelling set-ups with these Z-series units configured with AMD Radeon cards. These are very powerful workstations for editing, grading, mixing, and graphics.

©2019 Oliver Peters

Glass – Editing an Unconventional Trilogy

Writer/director M. Night Shyamalan has become synonymous with films about the supernatural that end with a twist. He first gained broad attention with The Sixth Sense and in the two decades since, has written, produced, and directed a range of large and small films. In recent years, he has taken a more independent route to filmmaking, working with lower budgets and keeping close control of production and post.

His latest endeavor, Glass, also becomes the third film in what is now an unconventional trilogy, starting first with Unbreakable, released 19 years ago. 2017’s Split was the second in this series. Glass combines the three principal characters from the previous two films – David Dunn/The Overseer (Bruce Willis), Elijah Price/Mr. Glass (Samuel L. Jackson), and Kevin Wendell Crumb (James McAvoy), who has 23 multiple personalities.

Shyamalan likes to stay close to his northeastern home base for production and post, which has afforded an interesting opportunity to young talent. One of those is Luke Ciarrocchi, who edited the final two installments of the trilogy, Split and Glass. This is only his third film in the editor’s chair. 2015’s The Visit was his first. Working with Shyamalan has provided him with a unique opportunity, but also a master class in filmmaking. I recently spoke with Luke Ciarrocchi about his experience editing Glass.

_________________________________________________

[OP] You’ve had the enviable opportunity to start your editing career at a pretty high level. Please tell me a bit about the road to this point.

[LC] I live in a suburb of Philadelphia and studied film at Temple University. My first job after college was as a production assistant to the editing team on The Happening with editor Conrad Buff (The Huntsman: Winter’s War, Rise of the Planet of the Apes, The Last Airbender) and his first assistant Carole Kenneally. When the production ended, I got a job cutting local market commercials. It wasn’t glamorous stuff, but it is where I got my first experience working on Avid [Media Composer] and really started to develop my technical knowledge. I was doing that for about seven months when The Last Airbender came to town.

I was hired as an apprentice editor by the same editing crew that I had worked with on The Happening. It was on that film that I started to get onto Night’s radar. I was probably the first Philly local to break into his editing team. There’s a very solid and talented group of local production crew in Philly, but I think I was the first local to join the Editors Guild and work in post on one of his films. Before that, all of the editing crew would come from LA or New York. So that was a big ‘foot in the door’ moment, getting that opportunity from Conrad and Carole.  I learned a lot on Airbender. It was a big studio visual effects film, so it was a great experience to see that up close – just a really exciting time for me.

During development of After Earth, even before preproduction began, Night asked me to build a type of pre-vis animatic from the storyboards for all the action sequences. I would take these drawings into After Effects and cut them up into moveable pieces, animate them, then cut them together into a scene in Avid. I was putting in music and sound effects, subtitles for the dialogue, and really taking them to a pretty serious and informative level. I remember animating the pupils on one of the drawings at one point to convey fear (laughs). We did this for a few months. I would do a cut, Night would give me notes, maybe the storyboard artist would create a new shot, and I would do a recut. That was my first back-and-forth creative experience with him.

Once the film began to shoot, I joined the editing team as an assistant editor. At the end of post – during crunch time – I got the opportunity to jump in and cut some actual scenes with Night. It was surreal. I remember sitting in the editing room auditioning cuts for him and him giving notes, and all the while I’m just repeating in my head, ‘Don’t mess this up, don’t mess this up.’ I feel like we had a very natural rapport though, beyond the obvious nervousness that would come from a situation like that. We really worked well together from the start. We both had a strong desire to dig deep and really analyze things, to not leave anything on the table. But at the same time we also had the ability to laugh at things and break the seriousness when we needed to. We have a similar sense of humor that to this day I think helps us navigate the more stressful days in the editing room. Personality plays a big role in the editing room – maybe more so than experience. I may owe my career to my immature sense of humor. I’m not sure.

After that, I assisted on some other films passing through Philly and just kept myself busy. Then I got a call from Night’s assistant to come by to talk about his next film, The Visit. I got there and he handed me a script and told me he wanted me to be the sole editor on it. Looking back it seems crazy, because he was self-financing the film. He had a lot on the line and he could have gotten any editor, but he saw something. So that was the first of the three films I would cut for him. The odds have to be one-in-a-million for that to pan out the way that it did in the suburbs of Philly. Right place, right time, right people. It’s a lot of luck, but when you find yourself in that situation, you just have to keep telling yourself, ‘Don’t mess this up.’

[OP] These three films, including Glass, are being considered a trilogy, even though they span about two decades. How do they tie together, not just in story, but also style?

[LC] I think it’s fair to call Glass the final installment of a trilogy – but definitely an untraditional one. First Unbreakable, then 19 years later Split, and now Glass. They’re all in the same universe and hopefully it feels like a satisfying philosophical arc through the three. The tone of the films is ingrained in the scripts and footage. Glass is sort of a mash-up of what Unbreakable was and what Split was. Unbreakable was a drama that then revealed itself as a comic book origin story. Split was more of a thriller – even horror at times – that then revealed itself as part of this Unbreakable comic book universe. Glass is definitely a hybrid of tone and genre representing the first two films. 

[OP] Did you do research into Unbreakable to study its style?

[LC] I didn’t have to, because Unbreakable has been one of my favorite films since I was 18. It’s just a beautiful film. I loved that in the end it wasn’t just about David Dunn accepting who he was, but also Elijah finding his place in the world only by committing these terrible crimes to discover his opposite. He had to become a villain to find the hero. It’s such a cool idea and for me, very rewatchable. The end never gets old to me. So I knew that film very, very well. 

[OP] Please walk me through your schedule for post-production.

[LC] We started shooting in October of 2017 and shot for about two months. I was doing my assembly during that time and the first week of December. Then Night joined me and we started the director’s cut. The way that Night has set up these last three films is with a very light post crew. It’s just my first assistant, Kathryn Cates, and me, set up at Night’s offices here in the suburbs of Philadelphia with two Avids. We had a schedule that we were aiming for, but the release date was over a year out, so there was wiggle room if it was needed.

Night’s doing this in a very unconventional way. He’s self-financing, so we didn’t need to go into a phase of a studio cut. After his director’s cut, we would go into a screening phase – first just for close crew, then more of a friends-and-family situation. Eventually we get to a general audience screening. We’re working and addressing notes from these screenings, and there isn’t an unbearable amount of pressure to lock it up before we’re happy. 

[OP] I understand that your first cut was about 3 1/2 hours long. It must take a lot of trimming and tweaking to get down to the release length of 129 minutes. What sort of things did you do to cut down the running time from that initial cut?

[LC] One of our obstacles throughout post was that initial length. You’re trying to get to the length that the film wants to be without gutting it in the process. You don’t want to overcut as much as you don’t want to undercut. We had a similar situation on Split, which was a long assembly as well. The good news is that there’s a lot of great stuff to work with and choose from.

We approach it very delicately. After each screening we trimmed a little and carefully pulled things out, so each screening was incrementally shorter, but never dramatically so. Sometimes you will learn from a screening that you pulled the wrong thing out and it needed to go back in. Ultimately no major storyline was cut out of Glass. It was really just finding where we were saying the same thing twice, but differently – diagnosing which one of those versions is the more impactful one – then cutting the others. And so we just go like that. Pass after pass. Reel by reel.

An interesting thing I’ve found is that when you are repeating things, you will often feel that the second occurrence is the offensive moment of that information and the one to remove, because you’ve heard it once before. But the truth is that the first telling of that information is more often what you want to get rid of. By taking away the first one, you are saving something for later. Once you remove something earlier, the later scene becomes elevated, because you aren’t giving away so much up front.

[OP] What is your approach to getting started when you are first confronted with the production footage? What is your editing workflow like?

[LC] I’m pretty much paper-based. I have all of the script supervisor’s notes. Night is very vocal on set about what he likes and doesn’t like, and Charlie Rowe, our script supervisor, is very good at catching those thoughts. On top of that, Night still does dailies each day – either at lunch or the end of the day. As a crew, we get together wherever we are and screen all of the previous day’s footage, including B-roll. I will sit next to Night with a sheet that has all of the takes and set-ups with descriptions and I’ll take notes both on Night’s reactions, as well as my own feelings towards the footage. 

With that information, I’ll start an assembly to construct the scene in a very rough fashion without getting caught up in the small details of every edit. It starts to bring the shape of the scene out for me. I can see where the peaks and valleys are. Once I have a clearer picture of the scene and its intention, I’ll go back through my detailed notes – there’s a great look for this, there’s a great reading for that – and I find where those can fit in and whether they serve the edit. You might have a great reaction to something, but the scene might not want that to be on-camera. So first I find the bones of the scene and then I dress it up. 

Night gets a lot of range from the actors from the first take to the last take. It is sometimes so vast that if you built a film out of only the last takes, it would be a dramatically different movie than if you only used take one. With each take he just pushes the performances further. So he provides you with a lot of control over how animated the scene is going to be. In Glass, Elijah is an eccentric driven by a strong ideology, so in the first take you get the subdued, calculated villain version of him, but by the last take it’s the carnival barker version. The madman.

[OP] Do you get a sense when screening the dailies of which way Night wants to go with a scene?

[LC] Yes, he’ll definitely indicate a leaning and we can boil it down to a couple of selects. I’ll initially cut a scene with the takes that spoke to him the most during the dailies and never cut anything out ahead of time. He’ll see the first cuts as they were scripted, storyboarded, and shot. I’ll also experiment with a different take or approach if it seems valid and have that in my back pocket. He’s pretty quick to acknowledge that he might have liked a raw take on set and in dailies, but it doesn’t work as well when cut together into a scene. So then we’ll address that. 

[OP] As an Avid editor, have you used Media Composer’s script integration features, like ScriptSync?

[LC] I just had my first experience with it on a Netflix show. I came on later in their post, so the show had already been set up for ScriptSync. It was very cool and helpful to be able to jump in and quickly compare the different takes for the reading of a line. It’s a great ‘late in the game’ tool. Maybe you have a great take, but just one word is bobbled and you’d like to find a replacement for just that word. Or the emotion of a key word isn’t exactly what you want. It could be a time-saver for a lot of that kind of polishing work.

[OP] What takeaways can you share from your experiences working with M. Night Shyamalan?

[LC] Night works in the room with you everyday. He doesn’t just check in once a week or something like that. It’s really nice to have that other person there. I feel like often times the best stuff comes from discussing it and talking it through. He loves to deconstruct things and figure out the ‘why’. Why does this work and this doesn’t? I enjoy that as well. After three films of doing that, you learn a lot. You’re not aware of it, but you’re building a toolkit. These tools and choices start to become second nature. 

On the Netflix show that I just did, there were times when I didn’t have anyone else in the room for long stretches, and I started to hear more clearly those things that have become inherent in my process. I started to take notice of what had become my second nature – what the last decade had produced. Editing is something you just have to do to learn. You can’t just read about it or study a great film. You have to do it, do it again, and struggle with it. You need to mess it up to get it right.

________________________________________________

This interview is going online after Glass has scored its third consecutive weekend in the number one box office slot. Split was also number one for three weeks in a row. That’s a pretty impressive feat and fitting for the final installment of a trilogy.

Be sure to also check out Steve Hullfish’s AOTC interview with Luke Ciarrocchi here.

©2019 Oliver Peters

The State of the NLE 2019

It’s a new year, but that doesn’t mean the editing software landscape will change drastically in the coming months. For all intents and purposes, professional editing options boil down to four choices: Avid Media Composer, Adobe Premiere Pro, Apple Final Cut Pro X, and Blackmagic Design DaVinci Resolve. Yes, I know Vegas, Lightworks, Edius, and others are still out there, but those are far off on the radar by comparison (no offense meant to any happy practitioners of these tools). Naturally, since blogs are mainly about opinions, everything I say from here on is purely conjecture – although it’s informed by my own experiences with these tools and my knowing many of the players involved on the respective product design and management teams, past and present.

Avid continues to be the go-to NLE in the feature film and episodic television world. That’s certainly a niche, but it’s a niche that determines the tools developed by designers for the broader scope of video editing. Apple officially noted two million users for Final Cut Pro X last year and I’m sure it’s likely to be at least 2.5M by now. Adobe claims Premiere Pro to be the most widely used NLE by a large margin. I have no reason to doubt that statement, but I have also never seen any actual stats. I’m sure through the Creative Cloud subscription mechanism Adobe not only knows how many Premiere Pro installations have been downloaded, but probably has a good idea as to actual usage (as opposed to simply downloading the software). Bringing up the rear in this quartet is Resolve. While certainly a dominant color correction application, I don’t yet see it as a key player in the creative editing (as opposed to finishing) space. With the stage set, let’s take a closer look.

Avid Media Composer

Editors who have moved away from Media Composer or who have never used it, like to throw shade on Avid and its marquee product. But loyal users – who include some of the biggest names in film editing – stick by it due in part to familiarity, but also its collaborative features and overall stability. As a result, the development pace and rate of change is somewhat slow compared with the other three. In spite of that, Avid is currently on a schedule of a solid, incremental update nearly every month – each of which chips away at a long feature request list. The most recent one dropped on December 31st. Making significant changes without destroying the things that people love is a difficult task. Development pace is also hindered by the fact that each one of these developers is also chasing changes in the operating system, particularly Apple and macOS. Sometimes you get the feeling that it’s two steps forward, one step back.

As editors, we focus on Media Composer, but Avid is a much bigger company than just that, with its fingers in sound, broadcast, storage, cloud, and media management. If you are a Pro Tools user, you are just as concerned about Avid’s commitment to you as editors are about theirs. Like any large company, Avid must advance not just a single core product, but its whole ecosystem of products. Yet it still must advance the features in these products, because that’s what gets users’ attention. In an effort to improve its attraction to new users, Avid has introduced subscription plans and free versions to make it easier to get started. They now cover editing and sound needs with a lower cost of entry than ever before.

I started nonlinear editing with Avid and it will always hold a spot in my heart. Truth be told, I use it much less these days. However, I still maintain current versions for the occasional project need plus compatibility with incoming projects. I often find that Media Composer is the single best NLE for certain tasks, mainly because of Avid’s legacy with broadcast. This includes issues like proper treatment of interlaced media and closed captioning. So for many reasons, I don’t see Avid going away any time soon, but whether or not they can grow their base remains an unknown. Fortunately many film and media schools emphasize Avid when they teach editing. If you know Media Composer, it’s an easy jump to any other editing tool.

Adobe Premiere Pro CC

The most widely used NLE? At least from what I can see around me, it’s the most used NLE in my market, including individual editors, corporate media departments, and broadcasters. Its attraction comes from a) the versatility in editing with a wide range of native media formats, and b) the similarity to – and viable replacement for – Final Cut Pro “legacy”. It picked up steam partly as a reaction to the Final Cut Pro X roll-out and users have generally been happy with that choice. While the shift by Adobe to a pure subscription model has been a roadblock for some (who stopped at CS6), it’s also been an advantage for others. I handle the software updates at a production company with nine edit systems and between the Adobe Creative Cloud and Apple Mac App Store applications, upgrades have never been easier.

A big criticism of Adobe has been Premiere’s stability. Of course, that’s based on forum reads, where people who have had problems will pipe up. Rarely does anyone ever post how uneventful their experience has been. I personally don’t find Premiere Pro to be any less stable than any other NLE or application. Nonetheless, working with a mix of oddball native media will certainly tax your system. Avid and Apple get around this by pushing optimized and proxy media. As such, editors reap the benefits of stability. And the same is true with Premiere. Working with consistent, optimized media formats (transcoded in advance) – or working with Adobe’s own proxies – results in a more stable project and a better editing experience.
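
As one concrete version of that advice, here’s a minimal sketch that batch-transcodes camera originals to ProRes Proxy with ffmpeg before the edit begins. The ffmpeg flags shown are standard options; it assumes ffmpeg is installed, and the folder names are hypothetical:

```python
# Batch-transcode camera originals to ProRes Proxy ahead of the edit --
# the 'consistent, optimized media' workflow described above.
import subprocess
from pathlib import Path

SOURCE = Path("camera_originals")   # hypothetical input folder
PROXIES = Path("proxies")           # hypothetical output folder
PROXIES.mkdir(exist_ok=True)

for clip in sorted(SOURCE.glob("*.mov")):
    out = PROXIES / f"{clip.stem}_proxy.mov"
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-c:v", "prores_ks", "-profile:v", "0",  # profile 0 = ProRes Proxy
        "-vf", "scale=1920:-2",                  # downscale to HD width
        "-c:a", "pcm_s16le",                     # uncompressed audio
        str(out),
    ], check=True)
```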

Avid Media Composer is the dominant editing tool in major markets, but mainly in the long-form entertainment media space. Many of the top trailer and commercial edit shops in those same markets use Premiere Pro. Again, that goes back to the FCP7-to-Premiere Pro shift. Many of these companies had been using the old Final Cut rather than Media Composer. Since some of these top editors also cut features and documentaries, you’ll often see them use Premiere on the features that they cut, too. Once you get below the top tier of studio films and larger broadcast network TV shows, Premiere Pro has a much wider representation. That certainly is good news for Adobe and something for Avid to worry about.

Another criticism is that of Adobe’s development pace. Some users believed that moving to a subscription model would speed the development pace of new versions – independent of annual or semi-annual cycles. Yet cycles still persist – much to the disappointment of those users. This gets down to how software is actually developed, keeping up with OS changes, and to some degree, marketing cycles. For example, if there’s a big Photoshop update, then it’s possible that the marketing “wow” value of a large Premiere Pro update might be overshadowed and needs to wait. Not ideal, but that’s the way it is.

Just because it’s possible doesn’t mean that users really want to constantly deal with automatic software updates that they have to keep track of. This is especially true with After Effects and Premiere Pro, where old project files often have to be updated once you update the application – and those updates are not backwards compatible. Personally, I’m happy to restrict that need to a couple of times a year.

Users have the fear that a manufacturer is going to end-of-life their favorite application at some point. For video users, this was made all too apparent by Apple and FCPX. Neither Apple nor Adobe has been exempt from killing off products that no longer fit their plans. Markets and user demands shift. Photography is an obvious example here. In recent years, smart phones have become the dominant photographic device, which has enabled cloud-syncing and storage of photos. Adobe and Apple have both shifted the focus for their photo products accordingly. If you follow any of the photo blogs, you’ll know there’s some concern that Adobe Lightroom Classic (the desktop version) will eventually give way completely to Lightroom CC (the cloud version). When a company names something as “classic”, you have to wonder how long it will be supported.

If we apply that logic to Premiere Pro, then the new Adobe Rush comes to mind. Rush is a simpler, nimbler, cross-platform/cross-device NLE targeted at users who produce video starting with their smart phone or tablet. Since there’s also a desktop version, one could certainly surmise that in the future Rush might replace Premiere Pro in the same way that FCPX replaced FCP7. Personally, I don’t think that will happen any time soon. Adobe treats certain software as core products. Photoshop, Illustrator, and After Effects are such products. Premiere Pro may or may not be viewed that way internally, but certainly more so now than ever in the past. Premiere Pro is being positioned as a “hub” application with connections to companion products, like Prelude and Audition. For now, Rush is simply an interesting offshoot to address a burgeoning market. It’s Adobe’s second NLE, not a replacement. But time will tell.

Apple Final Cut Pro X

Apple released Final Cut Pro X in the summer of 2011 – going on eight years now. It’s a versatile, professional tool that has improved greatly since that 2011 launch and gained a large and loyal fan base. Many FCPX users are also Premiere Pro users and the other way around. It can be used to cut nearly any type of project, but the interface design is different from the others, making it an acquired taste. Being a Mac-only product and developed within the same company that makes the hardware and OS, FCPX is optimized to run on Macs more so than any cross-platform product can be. For example, the fluidity of dealing with 4K ProRes media on even older Macs surpasses that of any other NLE.

Prognosticating Apple’s future plans is a fool’s errand. Some guesses have put the estimated lifespan of FCPX at 10 years, based in part on the lifespan of FCP “legacy”. I have no idea whether that’s true or not. Often when I read interviews with key Apple management (as well as in off-the-record, casual discussions I’ve had with people I know on the inside), it seems like a company that actually has less of a concrete plan when it comes to “pro” users. Instead, it often appears to approach them with an attitude of “let’s throw something against the wall and see what sticks”. The 2013 Mac Pro is a striking example of this. It was clearly innovative and a stellar exhibit for Apple’s “think different” mantra. Yet it was a product that obviously was not designed by actually speaking with that product’s target user. Apple’s current “shunning” of Nvidia hardware seems like another example.

One has to ask whether a company so dominated by the iPhone is still agile enough to respond to the niche market of professional video editors. While Apple products (hardware and software) still appeal to creatives and video professionals, it seems like the focus with FCPX is towards the much broader sphere of pro video. Not TV shows and feature films (although that’s great when it comes) – or even high-end commercials and trailers – but rather the world of streaming channels, social media influencers, and traditional publishers who have shifted to an online media presence from a print legacy. These segments of the market have a broad range of needs. After all, so-called “YouTube stars” shoot with everything from low-end cameras and smart phones all the way up to Alexas and REDs. Such users are equally professional in their need to deliver a quality product on a timetable, and I believe that’s the part of the market that Apple seeks to address with FCPX.

If you are in the world of the more traditional post facility or production company, then those users listed above may be market segments that you don’t see or possibly even look down upon. I would theorize that among the more traditional sectors, FCPX may have largely made the inroads that it’s going to. Its use in films and TV shows (with the exception of certain high-profile, international examples) doesn’t seem to be growing, but I could be wrong. Maybe the marketing is just behind or it no longer has PR value. Regardless, I do see FCPX as continuing strong as a product. Even if it’s not your primary tool, it should be something in your toolkit. Apple’s moves to open up ProRes encoding and offering LumaForge and Blackmagic eGPU products in their online store are further examples that the pro customer (in whatever way you define “pro”) continues to have value to them. That’s a good thing for our industry.

Blackmagic Design DaVinci Resolve

No one seems to match the development pace of Blackmagic Design. DaVinci Resolve underwent a wholesale transformation from a tool that was mainly a high-end color corrector into an all-purpose editing application. Add to this the fact that Blackmagic has acquired a number of companies whose tools have been modernized and integrated into Resolve. Blackmagic now offers a post-production solution with some similarities to FCPX, while retaining a traditional, track-based interface. It includes modes for advanced audio post (Fairlight) and visual effects (Fusion) that have been adapted from those acquisitions. Unlike past all-in-one applications, Resolve’s modal pages retain the design and workflow specific to the task at hand, rather than making them fit into the editing application’s interface design. All of this happened in very short order and across three operating systems, making Blackmagic’s pace the envy of the industry.

But a fast development pace doesn’t always translate into a winning product. In my experience, each version update has been relatively solid. There are four ways to get Resolve (free and paid, Mac App Store and reseller), which makes it a no-brainer for anyone starting out in video editing who doesn’t have a specific requirement for one application over another. I have to wonder, though, how many new users go deep into the product. If you only edit, there’s no real need to tap into the Fusion, Fairlight, or color correction pages. Do Resolve editors want to finish audio in Fairlight, or would they rather hand off the audio post and mix to a specialist who will probably be using Pro Tools? The nice thing about Resolve is that you can go as deep as you like – or not – depending on your mindset, capabilities, and needs.

On the other hand, is the all-in-one approach better than the alternatives: Media Composer/Pro Tools, Premiere Pro/After Effects/Audition, or Final Cut Pro X/Motion/Logic Pro X? I don’t mean for the user, but rather the developer. Does the all-in-one solution give you the best product? The standalone version of Fusion is more full-featured than the Fusion page in Resolve. Fusion users are rightly concerned that the standalone will go away, leaving them with a smaller subset of those tools. I would argue that there are already unnecessary overlaps in effects and features between the pages. So are you really getting the best editor or is it being compromised by the all-in-one approach? I don’t know the answer to these questions. Resolve for me is a good color correction/grading application that can also work for my finishing needs (although I still prefer to edit in something else and roundtrip to/from Resolve). It’s also a great option for the casual editor who wants a free tool. Yet in spite of all its benefits, I believe Resolve will still be a distant fourth in the NLE world, at least for the next year.

The good news is that there are four great editing options in the lead and even more coming from behind. There are no bad choices and with a lower cost than ever, there’s no reason to limit your knowledge to only one. After all, the products that are on top now may be gone in a decade. So broaden your knowledge and define your skills by your craft – not your tools!

©2019 Oliver Peters

Viva Las Vegas – NAB 2018

As more and more folks get all of their information through internet sources, the running question is whether or not trade shows still have value. A show like the annual NAB (National Association of Broadcasters) Show in Las Vegas is both fun and grueling, typified by sensory overload and folks in business attire with sneakers. Although some announcements are made before the exhibits officially open – and nearly all are pretty widely known before the week ends – there still is nothing quite like being there in person.

For some, other shows have taken the place of NAB. The annual HPA Tech Retreat in the Palm Springs area is a gathering of technical specialists, researchers, and creatives that many consider the TED Talks for our industry. For others, the Cine Gear Expo in LA is the prime showcase for grip, lighting, and camera offerings. RED Camera has focused on Cine Gear instead of NAB for the last couple of years. And then, of course, there’s IBC in Amsterdam – the more humane version of NAB in a more pleasant setting. But for me, NAB is still the main event.

First of all, the NAB Show isn’t merely about the exhibit floor at the sprawling Las Vegas Convention Center. Actual NAB members can attend various sessions and workshops related to broadcasting and regulations. There are countless sidebar events specific to various parts of the industry. For editors, that includes Avid Connect – a two-day series of Avid presentations on the weekend leading into NAB; Post Production World – a series of workshops, training sessions, and presentations managed by Future Media Concepts; as well as a number of keynote presentations and artist gatherings, including SuperMeet, FCPexchange, and the FCPX Guru Gathering. These are places where you’ll rub shoulders with some well-known editors, colorists, artists, and mixers, learn about new technologies like HDR (high dynamic range imagery), and occasionally see new product features from vendors who might not officially be on the show floor with a booth, like Apple.

One of the biggest benefits I find in going to NAB is simply walking the floor, checking out the companies and products that might not get a lot of attention. These newcomers often have the most innovative technologies, and they’re the discoveries that were never on your radar before the show.

The second benefit is connection. I meet up again in person with friends that I’ve made over the years – both other users, as well as vendors. Often it’s a chance to meet people that you might only know through the internet (forums, blogs, etc.) and to get to know them just a bit better. A bit more of that might make the internet more friendly, too!

Here are some of my random thoughts and observations from Las Vegas.

__________________________________

Editing hardware and software – four As and a B

Apple uncharacteristically pre-announced their new features just prior to the show, culminating with App Store availability on Monday when the NAB exhibits opened. This includes new Final Cut Pro X/Motion/Compressor updates and an official count of 2.5 million FCPX users. That’s a growth of 500,000 users in 2017, the biggest year to date for Final Cut. The key new feature in FCPX is a captioning function to author, edit, and export both closed and embedded (open) captions. There aren’t many great solutions for captioning, and the best to date have been expensive. I found the Apple approach to be the best and easiest to use that I’ve seen. It’s well-designed and should save time and money for those who need to create captions for their productions – even if you are using another brand of NLE. Best of all, if you own FCPX, you already have that feature. When you don’t have a script to start from, manual or automatic transcription is required as a starting point. There is now a tie-in between Speedscriber (also updated this week) and FCPX that will expedite the speech-to-text function.

The second part of Apple’s announcement was the introduction of a new camera raw codec family – ProResRAW and ProResRAW HQ. These are acquisition codecs designed to record the raw sensor data from Bayer-pattern sensors (prior to debayering the signal into RGB information) and make that data available in post, just like RED’s REDCODE RAW or CinemaDNG. Since this is an acquisition codec and NOT a post or intermediate codec, it requires a partnership on the production side of the equation. Initially this includes Atomos and DJI. Atomos supplies external recorders – currently the Shogun Inferno and Sumo 19 models – which can capture the raw output from cameras that are able to send raw sensor data out. As this is camera-specific, Atomos must create the correct profile per camera to remap that sensor data into ProResRAW. At the show, this included several Canon, Sony, and Panasonic cameras. DJI does this in-camera on the Inspire 2.
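To make “raw sensor data prior to debayering” a little more concrete, here’s a toy Python sketch of the idea – my own illustration, not Apple’s or any camera vendor’s actual debayer, which uses far more sophisticated interpolation. A Bayer sensor stores one color sample per photosite, and debayering reconstructs full RGB; a raw codec simply defers that work to post:

```python
# Toy illustration of why raw capture defers work to post: a Bayer
# sensor stores one color sample per photosite (RGGB pattern here),
# and "debayering" reconstructs full RGB. This naive 2x2 binning
# demosaic is for illustration only -- real debayers interpolate.
import numpy as np

def demosaic_rggb(raw: np.ndarray) -> np.ndarray:
    """Collapse each 2x2 RGGB block into one RGB pixel."""
    r = raw[0::2, 0::2]                             # red photosites
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2.0   # two greens, averaged
    b = raw[1::2, 1::2]                             # blue photosites
    return np.dstack([r, g, b])

raw = np.random.randint(0, 4096, (8, 8)).astype(np.float64)  # fake 12-bit frame
print(demosaic_rggb(raw).shape)  # (4, 4, 3)
```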

The advantage with FCPX is that ProResRAW is optimized for post, allowing more streams in real-time. ProResRAW data rates (variable) fall between those of ProRes and ProRes HQ, while the less-compressed ProResRAW HQ rates fall between ProRes HQ and ProRes 4444. It’s very early for this new codec, so additional camera and post vendors will likely add ProResRAW support over the coming year. It is currently unknown whether any other NLEs will be able to decode and play ProResRAW.

As always, the Avid booth was quite crowded and, from what I heard, Avid Connect was well attended by enthused Avid users. The Avid offerings are quite broad and hard to encapsulate in any single blog post. Most, these days, are very enterprise-centric. But this year, with a new CEO at the helm, Avid’s creative tools have been reorganized into three strata – First, “standard”, and Ultimate. This applies to Sibelius, Pro Tools, and Media Composer. In the case of Media Composer, there’s Media Composer | First – a fully functioning free version with minimal restrictions; Media Composer; and Media Composer | Ultimate – which includes all options, such as PhraseFind, ScriptSync, NewsCutter, and Symphony. The big difference is that project sharing has been decoupled from Media Composer. This means that if you get the “standard” version (just named Media Composer), it will not be enabled for collaboration on a shared storage network. That will require Media Composer | Ultimate. So Media Composer (standard) is designed for the individual editor. There is also a new subscription pricing structure, which places Media Composer at about the same annual cost as Adobe Premiere Pro CC (single-app license). The push is clearly towards subscription; however, you can still purchase and/or maintain support for perpetual licenses, though that info is a little harder to find on Avid’s store website.

Though not as big news, Avid is also launching the Avid DNxID capture/export unit. It is custom-designed for Avid by Blackmagic Design and uses a small form factor. It was created for file-based acquisition, supports 4K, and includes embedded DNx codecs for onboard encoding. Connections are via component analog and HDMI, along with an SD card slot.

The traffic around Adobe’s booth was thick the entire week. The booth featured interesting demos that were front and center in the middle of one of the South Hall’s main thoroughfares, generally creating a bit of a bottleneck. The newest Creative Cloud updates had preceded the show, but were certainly new to anyone not already using the Adobe apps. Big news for Premiere Pro users was the addition of automatic ducking that was brought over from Audition, and a new shot matching function within the Lumetri color panel. Both are examples of Adobe’s use of their Sensei AI technology. Not to be left out, Audition can now also directly open sequences from Premiere Pro. Character Animator had been in beta form, but is now a full-fledged CC product. And for puppet control Adobe also introduced the Advanced Puppet Engine for After Effects. This is a deformation tool to better bend, twist, and control elements.

Of course when it comes to NLEs, the biggest buzz has been over Blackmagic Design’s DaVinci Resolve 15. The company has an extensive track record of buying up older products whose companies weren’t doing so well, reinvigorating the design, reducing the cost, and breathing new life into them – often for a new, wider customer base. Nowhere is this more evident than with Resolve, which has now grown from a leading color correction system into a powerful, all-in-one edit/mix/effects/color solution. We had previously seen the integration of the Fairlight audio mixing engine. This year Fusion visual effects were added. As before, each one of these disparate tools appears on its own page with a specific UI optimized for that task.

A number of folks have quipped that someone had finally resurrected Avid DS. Although all-in-ones like DS and Smoke haven’t been hugely successful in the past, Resolve’s price point is considerably more attractive. The Fusion integration means that you now have a subset of Fusion running inside of Resolve. This is a node-based compositor, which makes it easy for a Resolve user to understand, since it, too, already uses nodes in the color page. At least for now, Blackmagic Design intends to also maintain a standalone version of Fusion, which will offer more functions for visual effects compositing. Resolve also gained new editorial features, including tabbed sequences, a pancake timeline view, captioning, and improvements in the Fairlight audio page.

Other Blackmagic Design news includes updates to their various mini-converters, updates to the Cintel Scanner, and the announcement of a 4K Pocket Cinema Camera (due in September). They have also redesigned and modularized the Fairlight console mixing panels. These are now more cost-effective to manufacture and can be combined in various configurations.

This was the year for a number of milestone anniversaries, such as the 100th for Panasonic and the 25th for AJA. There were a lot of new product announcements at the AJA booth, but a big one was the push for more OpenGear-compatible cards. OpenGear is an open source hardware rack standard that was developed by Ross and embraced by many manufacturers. You can purchase any OpenGear version of a manufacturer’s product and then mix and match a variety of OpenGear cards into any OpenGear rack enclosure. AJA’s cards also offer Dashboard support, which is a software tool to configure and control the cards. There are new KONA SDI and HDMI cards, HDR support in the IO 4K Plus, and HDR capture and playback with the KiPro Ultra Plus.

HDR

It’s fair to say that we are all learning about HDR, but from what I observed on the floor, AJA is one of the only companies with a number of hardware products that will allow you to handle HDR. This is thanks to their partnership with ColorFront, who is handling the color science in these products. This includes the FS | HDR – an up/down/cross, SDR/HDR synchronizer/converter. It also includes support for the Tangent Element Kb panel. The FS | HDR was a tech preview last year, but is a shipping product now. This year the tech preview product is the HDR Image Analyzer, which offers waveform and histogram monitoring at up to 4K/60fps.

Speaking of HDR (high dynamic range) and SDR (standard dynamic range), I had a chance to sit in on Robbie Carman’s (colorist at DC Color, Mixing Light) Post Production World HDR overview. Carman has graded numerous HDR projects, and from his presentation – coupled with exhibits on the floor – it’s quite clear that HDR is the wild, wild west right now. There is much confusion about color space and dynamic range, not to mention what current hardware is capable of versus the maximums expressed in the tech standards. For example, the BT.2020 spec doesn’t inherently mean that the image is HDR. Or the fact that, to get HDR on a consumer set, you must also be working in 4K and the set must accept the HDMI 2.0 standard.
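To put numbers to that gap between the standards and real hardware, here’s a minimal Python sketch of the SMPTE ST 2084 (PQ) transfer function used by HDR10, mapping a normalized code value to absolute luminance. The constants come straight from the spec; the 10,000-nit ceiling is far beyond anything on the show floor:

```python
# Minimal sketch: the SMPTE ST 2084 (PQ) EOTF, mapping a normalized
# HDR code value (0.0-1.0) to absolute luminance in nits (cd/m2).
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_to_nits(code_value: float) -> float:
    """Convert a normalized PQ signal value to luminance in nits."""
    e = code_value ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

# Note how the curve allocates code values: half the signal range
# sits below 100 nits, while full signal means 10,000 nits.
for cv in (0.0, 0.5, 0.75, 1.0):
    print(f"PQ code {cv:.2f} -> {pq_to_nits(cv):8.1f} nits")
```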

High dynamic range grading absolutely requires HDR-compatible hardware, such as the proper I/O device and a display that can receive the metadata that turns on and sets its target HDR values. This means investing in a device like AJA’s IO 4K Plus or Blackmagic’s UltraStudio 4K Extreme 3. It also means purchasing a true grading monitor costing tens of thousands of dollars, like one from Sony, Canon, or Flanders. You CANNOT properly grade HDR based on the image of ANY computer display. So while the latest version of FCPX can handle HDR, and an iMac Pro screen features a high nits rating, you cannot rely on that screen to see proper HDR.

LG was a sponsor of the show and LG displays were visible in many of the exhibits. Many of their newest products meet the minimum HDR spec, but for the most part, the images shown on the floor were simply bright and not HDR – no matter what the sales reps in the booths were saying.

One interesting fact that Carman pointed out is that HDR displays cannot be driven across the full screen at their highest value. You cannot display a full screen of white at 1,000 nits on a 1,000-nit display without causing damage. Therefore, automatic gain adjustments in the set’s electronics dim the screen. Only a smaller percentage of the image (20%, maybe?) can be driven at full value before dimming occurs. Another point Carman made was that standard lift/gamma/gain controls may be too coarse to grade HDR images with finesse. His preference is to use Resolve’s log grading controls, because you can make more precise adjustments to highlight and shadow values.
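A toy numeric example makes that last point clearer. This isn’t Resolve’s actual grading math – just a sketch of why a global gain control is coarse (it scales the shadows too), while a pivot-based control in the style of log grading confines the change to the highlights:

```python
# Toy illustration (not Resolve's actual math): global gain scales
# every pixel value, while a pivot-based "log" style highlight
# control leaves shadows and mids untouched.

def gain(pixel: float, amount: float) -> float:
    """Classic gain: scales every value, shadows included."""
    return pixel * amount

def highlight_log(pixel: float, amount: float, pivot: float = 0.7) -> float:
    """Adjust only values above a pivot, like a log-region control."""
    if pixel <= pivot:
        return pixel
    return pivot + (pixel - pivot) * amount

for p in (0.1, 0.5, 0.9):
    print(f"in {p:.2f} | gain x1.2 -> {gain(p, 1.2):.3f} "
          f"| highlight x1.2 -> {highlight_log(p, 1.2):.3f}")
```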

Cameras

I’m not a camera guy, but there was notable camera news at the show. Many folks really like the Panasonic colorimetry for which the Varicam products are known. For people who want a full-featured camera in a small form factor, look no further than Panasonic’s AU-EVA1. It’s a 4K, Super35, handheld cinema camera featuring dual native ISOs. Panasonic claims 14 stops of latitude. It takes EF lenses and can output camera raw data. When paired with an Atomos recorder, it will be able to record ProResRAW.

Another new camera is Canon’s EOS C700 FF. This is a new full-frame model in both EF and PL lens mount versions. As with the standard (Super35) C700, it records ProRes or XF-AVC at up to 4K resolution onboard to CFast cards. The full-frame sensor offers higher resolution and a shallower depth of field.

Storage

Storage is of interest to many. As costs come down, collaboration is easier than ever. The direct-attached vendors, like G-Tech, LaCie, OWC, Promise, and others were all there with new products. So were the traditional shared storage vendors like Avid, Facilis, Tiger, 1 Beyond, and EditShare. But three of the newer companies had my interest.

In my editing day job, I work extensively with QNAP, which currently offers the best price/performance ratio of any system. It’s reliable, cost-effective, and provides reasonable JKL response cutting HD media with Premiere Pro in a shared editing installation. But it’s not the most responsive, and it struggles with 4K media in spite of plenty of bandwidth – especially when the editors are all banging away. This has me looking at both LumaForge and OpenDrives.

LumaForge is known to many Final Cut Pro X editors, because the developers have optimized the system for FCPX and have had early successes with many key installations. Since then, they have also pushed into more Premiere-based installations. Because these units are engineered for video-centric facilities, as opposed to data-centric ones, they promise a better shared storage experience for video editing.

Likewise, OpenDrives made its name as the provider for high-profile film and TV projects cut on Premiere Pro. Last year they came to the show with their highest performance, all-SSD systems. These units are pricey and, therefore, don’t have a broad appeal. This year they brought a few of the systems that are more applicable to a broader user base. These include spinning disk and hybrid products. All are truly optimized for Premiere Pro.

The cloud

In other storage news, “the cloud” garners a ton of interest. The biggest vendors are Microsoft, Google, IBM, and Amazon. While each of these offers relatively easy ways to use cloud-based services for back-up and archiving, if you want a full cloud-based installation for all of your media needs, then actual off-the-shelf solutions are not readily available. The truth of the matter is that each of these companies offers APIs, which are then handed off to other vendors – often for totally custom solutions.

Avid and Sony seem to have the most complete offerings, with Sony Ci being the best one-size-fits-all answer for customer-facing services. Of course, if review-and-approval is your only need, then Frame.io leads and will have new features rolled out during the year. IBM/Aspera is a great option for standard archiving, because fast Aspera up and down transfers are included. You get your choice of IBM or other (Google, Amazon, etc.) cloud storage. They even offer a trial period using IBM storage for 30 days at up to 100GB free. Backblaze is a competing archive solution with many partnering applications. For example, you can tie it in with Archiware’s P5 Suite of tools for back-up, archiving, and server synchronization to the cloud.
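Since little of this is off-the-shelf, a “custom solution” usually boils down to scripting against a vendor’s object storage API. Here’s a minimal sketch using Amazon S3 and the boto3 Python library as one example – the bucket and file names are hypothetical, and IBM, Google, and Backblaze expose broadly similar APIs:

```python
# Minimal sketch of cloud back-up via a vendor API -- here Amazon S3
# with the boto3 library. Bucket and file names are hypothetical.
import boto3

s3 = boto3.client("s3")  # credentials come from the environment/config

def archive_master(local_path: str, bucket: str, key: str) -> None:
    """Upload a finished master file to an archive bucket."""
    s3.upload_file(
        local_path,
        bucket,
        key,
        ExtraArgs={"StorageClass": "STANDARD_IA"},  # infrequent-access tier
    )

archive_master("show_101_master.mov", "my-archive-bucket",
               "2018/show_101_master.mov")
```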

Naturally, when you talk of the “cloud”, many people interpret that to mean software that runs in the cloud – SaaS (software as a service). In most cases, that is nowhere close to happening. However, the exception is The Foundry, which was showing Athera, a suite of its virtualized applications, like Nuke, running on the Google Cloud Platform. They demoed it running inside the Chrome browser, thanks to this partnership with Google. The Foundry had a pod in the Google partners pavilion.

In short, you can connect to the internet with a laptop, activate a license of the tool or tools that you need, and then all media, processing, and rendering is handled in the cloud, using Google’s services and hardware. Since all of this happens on Google’s servers, only an updated UI image needs to be pushed back to the connected computer’s display. This concept is ideal for the visual effects world, where the work is generally done on an individual shot basis without a lot of media being moved in real-time. The target is the Nuke-centric shop that may need to add on a few freelancers quickly, and who may or may not be able to work on-premises.

Interesting newcomers

As I mentioned at the beginning, part of the joy of NAB is discovering the small vendors who use the show to make their mark. One example this year is Lumberjack Systems, a venture by Philip Hodgetts and Greg Clarke of Intelligent Assistance. They were in the LumaForge suite demonstrating Lumberjack Builder, which is a text-based NLE. In the simplest of explanations, your transcribed or scripted text is connected to the media. As you re-arrange or trim the text, the associated picture is edited accordingly. Newly-written text for voiceovers is turned into spoken-word media using the computer’s built-in system voice. Once your text-based rough cut is complete, an FCPXML is sent to Final Cut Pro X for further finesse and final editing.
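Conceptually, text-based editing is just a mapping from words to source timecode. Here’s a heavily simplified Python sketch of the idea – my own illustration, not Lumberjack’s code:

```python
# Simplified illustration of text-driven editing (not Lumberjack's
# actual code): each transcript word knows its source time range,
# so reordering or trimming text yields a new cut list.
from dataclasses import dataclass

@dataclass
class Word:
    text: str
    start: float  # seconds in the source clip
    end: float

transcript = [
    Word("Welcome", 0.0, 0.4), Word("to", 0.4, 0.6),
    Word("the", 0.6, 0.8), Word("show", 0.8, 1.3),
    Word("tonight", 1.3, 1.9),
]

def cut_list(words: list[Word]) -> list[tuple[float, float]]:
    """Merge consecutive words into contiguous source ranges."""
    ranges: list[tuple[float, float]] = []
    for w in words:
        if ranges and abs(ranges[-1][1] - w.start) < 0.001:
            ranges[-1] = (ranges[-1][0], w.end)  # extend the last range
        else:
            ranges.append((w.start, w.end))
    return ranges

# "Trim" the text by dropping a word -- the edit follows the words.
edited = [w for w in transcript if w.text != "tonight"]
print(cut_list(edited))  # [(0.0, 1.3)]
```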

Another new vendor I encountered was Quine, co-founded by Norwegian DoP Grunleik Groven. Their QuineBox IoT device attaches to the back of a camera, where it can record and upload “conformable” dailies (ProRes, DNxHD) to your SAN, as well as proxies to the cloud via its internal wi-fi system. Script notes can also be incorporated. The unit has already been battle-tested on the Netflix/NRK production of “Norsemen”.

Closing thoughts

It’s always interesting to see, year over year, which companies are not at the show. This isn’t necessarily indicative of a company’s health, but can signal a change in their direction or that of the industry. Sometimes companies opt for smaller suites at an area hotel in lieu of the show floor (Autodesk). Or they are a smaller part of a reseller or partner’s booth (RED). But often, they are simply gone. For instance, in past years drones were all the rage, with a lot of different manufacturers exhibiting. DJI has largely captured that market for both vehicles and camera systems. While there were a few other drone vendors besides DJI, GoPro and Freefly weren’t at the show at all.

Another surprise change for me was the absence of SAM (Snell Advanced Media) – the hybrid company formed out of Snell & Wilcox and Quantel. SAM products are now part of Grass Valley, which, in turn, is owned by Belden (the cable manufacturer). Separate Snell products appear to have been absorbed into the broader Grass Valley product line. Quantel’s Go and Rio editors continue in Grass Valley’s editing line alongside Edius – as the entry-level, mid-range, and advanced NLE products, respectively. A bit sad, actually. And very ironic. Here we are in the world of software and file-based video, yet the company that still has money to make acquisitions is the one with a heavy investment in copper (I know, not just copper, but you get the point).

Speaking of “putting a fork in it”, I would have to say that stereo 3D and 360 VR are pretty much dead in the film and video space. I understand that there is a market – potentially quite large – in gaming, education, simulation, engineering, training, etc. But for more traditional entertainment projects, it’s just not there. Vendors were down to a few, and even though the leading NLEs have ways of working with 360 VR projects, the image quality still looks awful. When you view a 4K image within even the best goggles, the qualitative experience is like watching a 1970s-era TV set from a few inches away. For now, it continues to be a novelty looking for a reason to exist.

A few final points… It’s always fun to see which computers are being used in the booths. Apple is again a clear winner – plenty of MacBook Pros and iMac Pros all over the LVCC wherever creative products or demos were being shown. eGPUs are of interest, with Sonnet being the main vendor. However, an eGPU is not a solution for every problem. For example, you will see more benefit by adding an eGPU to a lesser-powered machine, like a 13” MacBook Pro, than to one with more horsepower, like an iMac Pro. Each eGPU takes up one Thunderbolt 3 bus, so realistically you are likely to add only one eGPU to a computer. None of the NLE vendors could really tell me how much of a boost their application would get from an eGPU. Finally, if you are looking for great-looking, large OLED displays that are pretty darned accurate and won’t break the bank, LG is the place to look.

©2018 Oliver Peters

Molly’s Game

Molly Bloom’s future looked extremely bright: a shot at Olympic skiing glory, followed by entry into a leading law school. But an accident during qualifying trials for the U.S. ski team knocked her out of the running for the Salt Lake City games. (Bloom notes in her own memoir that it was her decision to retire and change the course of her life, rather than the minor accident.) She moved to Los Angeles and ended up running high-stakes, private poker games with her boss at the time. These games included A-list celebrities, hedge fund managers, and eventually, members of the Russian mob. Bloom quickly earned the nickname of “poker princess”. It all came crashing down when Bloom was busted by the FBI and sentenced for her role in the gambling ring.

Bloom’s memoir came to the attention of screenwriter Aaron Sorkin (The Social Network, Moneyball, Steve Jobs), who not only made this his next film script, but also his debut as a film director. Sorkin stayed close to the facts that Bloom described in her own memoir and consulted her during the writing of the screenplay. The biggest departure is that Bloom named some celebrities at these games, who had previously been revealed in released court documents. Sorkin opted to fictionalize them, explaining that he would rather focus the story on Bloom’s experiences and not on Hollywood gossip. Jessica Chastain (The Zookeeper’s Wife, A Most Violent Year, Zero Dark Thirty) stars as Molly Bloom.

Although three editors are credited on Molly’s Game, the back story is that a staggered schedule had to be worked out. The post production of Steve Jobs connected feature film editor Elliot Graham (Milk, 21, Superman Returns) with that film’s writer and director – Sorkin and Danny Boyle (T2 Trainspotting, 127 Hours, Slumdog Millionaire). Graham was tapped to cut Molly’s Game later into the process, replacing its original editor. He brought on Josh Schaeffer (The Last Man on Earth, Detroiters, You’re the Worst) as associate editor to join him. Graham started the recut with Schaeffer, but a prior commitment to work on Trust for Boyle saw him exiting the film early. (Trust is the BBC’s adaptation of the Getty kidnapping story.) Graham was able to bring the film about 50% of the way through post. Alan Baumgarten (Trumbo, American Hustle, Gangster Squad) picked up for Graham and edited with Schaeffer to the finish, thus earning all three an editing credit.

Working with a writer on his directorial debut

It can always be a challenge when a writer is close to the editing process. Scenes that may be near and dear to the writer are often cut, leading to tension. I asked the three about this situation. Graham says, “Aaron has always been on set with his other films and worked very closely with the director. So, he understands the process, having learned from some of the best directors in the business. I had a great time with Aaron on Steve Jobs. He’s an incredibly lovely and generous collaborator who brings out the best in his team.”

Baumgarten expands, “Working with Aaron was fun, because he appreciates being challenged. He’s open to seeing what an editor brings to the film. Aaron wrote a tight script that didn’t need to be re-arranged. Only about 20 minutes came out. We cut one small scene, but it was mostly trimming here and there. You want to be careful not to ruin the rhythm of his writing.”

Graham continues, “Aaron also found his own visual vocabulary. A lot of the story is told in time jumps, from present day to the past in flashbacks. Aaron is always looking for rapid-fire, overlapping dialogue. It’s part of his uniqueness and it’s a joy to cut. What was new for Aaron was using voice-over to drive things.”

Another new challenge was the use of stock footage. About 150 stock shots were used for cutaways and mini-montages throughout the film. Most of these were never in the original script. Graham says, “Stock footage was something I chose to start injecting into the film with Aaron’s collaboration when I came on. We felt it was useful to have visual references for some of the voice-overs – to connect visuals with words, which helps to land Aaron’s linguistic ideas for viewers. This began with the opening ski sequence – the first thing I cut when I came on board.”

The editors would pull down shots from a variety of internet sources and then the actual footage had to be found and cleared. The editors ultimately partnered with STALKR to find and clear all of the stock shots that were used. Visual effects were handled by Mr. X in Toronto. Originally, only 90 shots were budgeted (for example, snow falling in the ski sequences), but in the end, there were almost 600 visual effects shots in the final film.

Musicality of the performance

Baumgarten explains the musicality of Sorkin’s style. He says, “Aaron knew the film he wanted and had that in his head. Part of his writing process is to read his dialogue out loud and listen for the cadence of the performance. As you go through takes, the film is always moving in the right direction. As a writer/director, he doesn’t need variations or ad libs in an actor’s performance from one take to another, because he knows what the intention of the line is. As editors, we didn’t need to experiment with different calibrations of the performance. The experimentation came in with how we wove in the voice-over and played with the general rhythm.”

Graham adds, “Daniel Pemberton is the composer I worked with on Steve Jobs. I brought on Carl Kaller, a great music editor, when I came on. I knew that the music and dialogue had to dance a beautiful rhythm together for the film to be its best. With a compressed schedule to finish the film, we needed someone like Carl to help choreograph that dance.”

Baumgarten continues, “Daniel was involved early and provided us with temp tracks, which was a great gift. We didn’t have to use scores from other composers as temp music. Carl was just down the hall, so it was easy to weave Daniel’s temp elements in and around the dialogue and voice-over during the editing stage. There is interplay between the voice-over and the music, and the VO is like another musical element.”

Avid for the post

The post operation followed a standard feature film set-up: Avid Media Composer for the editing workstations, tied to Avid ISIS shared storage. The film was shot digitally using ARRI Alexas.

Production covered 48 days ending in February [2017]. It took 10 weeks to get to a director’s cut and then editing on Molly’s Game continued for about six months, which included visual effects, final sound mix and color correction. Schaeffer explains, “The dialogue scenes were scripted using [Avid] ScriptSync. Aaron was familiar with ScriptSync from The Newsroom, and it was a great help for us on this film. It’s the best way to have everything readily available and it allows us to be extremely thorough. If Aaron wanted to change a single word in a take, we were always able to find all of the alternates and make the change quite easily.”

Schaeffer continues, “Aaron methodically worked in a reel-by-reel order. We would divide up sequences between us at breaks that made sense. But when it came time to review the cut of a sequence, we would all review it together. A lot of people think that you have three editors on a film because the project is so difficult. The truth is that it lets you be more creative. Productions shoot so much footage these days that it’s great to be able to experiment. Having multiple editors on a film enables you to take the time to be creative. We were all glad that Aaron set up an environment which made that possible.”

Originally written for Digital Video magazine / Creative Planet Network

©2018 Oliver Peters

Downsizing

The bond between a film director and the editor is often a long-lasting one. The industry is full of pairings that continue film after film. One such duo is director Alexander Payne (Nebraska, The Descendants, Sideways) and editor Kevin Tent (Welcome to Me, Girl Interrupted, Election). Tent has edited every film that Payne directed, with the exception of Payne’s short film Paris, je t’aime. In fact, Payne also served as producer for Crash Pad, a film directed by Tent.

The latest Alexander Payne film to hit the cinemas is Downsizing, a sci-fi satire starring Matt Damon, Christoph Waltz, and Kristen Wiig. In the film, scientists discover human miniaturization as a way to combat overpopulation. Paul (Matt Damon) and Audrey (Kristen Wiig) decide to give it a try, exchanging their average life in Omaha for Leisure Land, one of the ‘micro-communities’ sprouting up. Their modest $150,000 in personal assets will make them multimillionaires, so they take the plunge.

Sci-fi and satire

The sci-fi genre is a new approach for Payne, which is where I started my conversation with Kevin Tent. He explains, “The sci-fi theme is a departure for Alexander, but this is still very much an ‘Alexander Payne movie’. It’s still about the human experience. In the plot, shrinking is seen as a way to save the human race, but people get greedy. They can make themselves instantly rich, save money on food, medicine, and move into big ‘McMansions’. Human nature takes over, which makes the film funny and also thought-provoking. It covers a lot of ground and politics.”

“It’s easy to ask, why sci-fi,” Tent continues. “Alexander Payne is an artist who is always looking for ways to challenge himself. He co-wrote the script ten years ago, but it took this long to get it made. For one thing, Downsizing is more expensive than his past films. As an editor, I first looked at the cutting differently, because of working with the visual effects; but, I quickly realized that this film, like Alexander’s others, was about the characters and the story.  [Those are] still the most important elements of the movie. I had recently worked on The Audition, which was shot mostly with green screen – and a while back, The Golden Compass, which was a serious visual effects movie.  I had enough knowledge about the process to know one thing. These people can do anything! We had a terrific VFX team, headed by our creative guru, Jamie Price. ILM and Framestore did most of the visual effects.”

Digital production to aid the process

Alexander Payne shifted to digital acquisition with Nebraska and has followed suit with his latest, Downsizing. According to Tent, “Alexander shoots a lot of coverage, so he likes digital for that. It’s also easier to deal with when compositing visual effects. We had over 130 hours of total footage. Of course, a fairly good chunk was plates for VFX and 2nd unit footage. Most of the scenes were shot single camera, but sometimes with multi-cam – especially for some of the big speeches, which were covered with two and sometimes three cameras. We synced up the takes in the Avid, which makes it so easy to switch from camera to camera. Mindy Elliot is our amazing first assistant. She’s a total pro and a total joy to work with. She’s been running our cutting rooms since The Descendants. Angela Latimer was our second. She did 99% of the scripting [for Avid’s ScriptSync feature] and also helped cut early versions of Paul’s drug montage [scene in Downsizing]. Joe Carson was our VFX editor. I met him while working on The SpongeBob SquarePants Movie. I was one of the live-action CGI editors on that film. Joe is awesome. He not only kept all of our visual effects organized, but he was also kept busy with the countless comps, morphs, and speed-ups that we tossed at him on a daily basis.”

Production wrapped in mid-August 2016 and then Tent started cutting with Payne right after Labor Day. Tent continues, “When I cut with Alexander, we basically start from scratch. I do create an editor’s cut during production, which we go back to for reference during our time together cutting, but it isn’t the starting point when I begin with Alexander. He’s a good editor, so when we work together, it’s really like having two editors in the room. We start watching dailies and start building scenes. We often look back at my editor’s cut and realize the scene or a part of it was better in that earlier version. Or maybe not. If there is something we like, we’ll put it back into the current cut.  We completed our first pass (kind of a director’s assembly) in January to show the studio. By early to mid-July we had a locked cut with about 80% of the completed VFX shots. The remainder trickled in afterwards. All together, that’s about ten or eleven months of cutting and finishing. Our DI/color grading was handled by the amazing Skip Kimball at Technicolor.”

Tools and tips

As a fellow editor, it’s always fun to talk about the tools and how to use them on a feature film project. Kevin Tent is a committed Avid Media Composer user. (Pacific Post provided the Avid systems used by the editing team.) According to Tent, “This was a huge project and Media Composer never had a problem with it.” One unique hallmark of Media Composer is Avid’s Script Integration. Notable within it is ScriptSync, Media Composer’s ability to automatically analyze waveforms and synchronize them – and, therefore, the associated clip – against text that has been input, like a film script. When correctly indexed, simply clicking on a line of dialogue in the on-screen script brings up all of the corresponding coverage. An ongoing licensing dispute limited its use to older versions of Media Composer, until the issue was finally resolved this year. That is great news for devotees of Avid’s powerful ScriptSync capability.
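Conceptually – and this is just a simplified sketch of the idea, not Avid’s implementation – script integration amounts to an index from each line of the script to every take whose audio aligns with it, so that clicking a line is a simple lookup:

```python
# Simplified sketch of script-based indexing (not Avid's actual
# implementation): map each script line to the takes that cover it,
# so "clicking" a line of dialogue pulls up all of its coverage.
from collections import defaultdict

# Hypothetical alignment data (script line, take name, source timecode),
# as produced by waveform/text alignment or manual marking.
alignments = [
    ("Line 12: PAUL opens the door.", "Sc42_TkA", "01:02:10:05"),
    ("Line 12: PAUL opens the door.", "Sc42_TkB", "01:04:22:18"),
    ("Line 12: PAUL opens the door.", "Sc42_TkC", "01:06:15:02"),
]

coverage = defaultdict(list)
for line, take, tc in alignments:
    coverage[line].append((take, tc))

# Clicking a script line returns every matching take and where it starts.
for take, tc in coverage["Line 12: PAUL opens the door."]:
    print(take, tc)
```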

Many film editors swear by Avid’s Script Integration tools, yet some never use them at all. Was Tent a ScriptSync user? “Hell, ya!” is his instant reply. “We stayed on Media Composer 7.0.6, because of the ScriptSync licensing issue, just so we could use it. I had Angela mark a lot of extra material and ad libs in addition to the scripted dialog. For example, an action like Paul opening a door or something like that. That would help, especially if they shot a lot of takes or resets within one bigger take, which tends to happen a lot when shooting digitally. There’s a massive party scene midway through the movie with people dancing, smoking pot, that kind of thing, and I asked Angela to add a ton of detail describing the scene. It made finding specific actions so quick. It’s also an especially great aid in re-cutting scenes, when you are looking for alternate coverage.”

Another aid that editors like is to place scene cards on the wall. Typically these are 3”x5” note cards with written scene descriptions – one for each scene – that can be pinned to the wall in the order of the ongoing edit. Although Tent is also a proponent of these – a remnant practice from the old film days – his Downsizing cutting room didn’t have enough wall space to accommodate cards.

The Downsizing script clocked in a tad long and the first assembly that Payne and Tent cut was 2:45 (final length was 2:08). Obviously the team needed to do a bit of “downsizing” themselves. Tent explains, “The biggest lost scenes were bookending storyteller elements to open and close the film. There was an old caveman from far in the future telling a group of children about the events within the film and how once giants roamed the world. This story element was painful to lose, because it was very funny and effective emotionally. But it took an added three or four minutes to get to Matt Damon’s character and that hurt us.  The audience wants you to get to your main characters and understand what they’re seeing within a reasonable amount of time. Fortunately, Alexander hadn’t shot it yet as part of the main production. We previewed with storyboards, temp music, and voice over. While it was tough to lose it from the point of view of the script, we weren’t leaving produced material ‘on the cutting room floor’. Ultimately if you don’t know it was there, you won’t miss not having it.”

Downsizing opened in cinemas on December 21. Whether you are in it for the thought-provoking concepts or simply a lot of laughs and a wild ride, it’s a film to enjoy. Alexander Payne is bound to have another success on his hands.

Originally written for Digital Video magazine / Creative Planet Network

© 2017, 2018 Oliver Peters