Viva Las Vegas – NAB 2018

As more and more folks get all of their information through internet sources, the running question is whether or not trade shows still have value. A show like the annual NAB (National Association of Broadcasters) Show in Las Vegas is both fun and grueling, typified by sensory overload and folks in business attire with sneakers. Although some announcements are made before the exhibits officially open – and nearly all are pretty widely known before the week ends – there still is nothing quite like being there in person.

For some, other shows have taken the place of NAB. The annual HPA Tech Retreat in the Palm Springs area is a gathering of technical specialists, researchers, and creatives that many consider the TED Talks for our industry. For others, the Cine Gear Expo in LA is the prime showcase for grip, lighting, and camera offerings. RED Camera has focused on Cine Gear instead of NAB for the last couple of years. And then, of course, there’s IBC in Amsterdam – the more humane version of NAB in a more pleasant setting. But for me, NAB is still the main event.

First of all, the NAB Show isn’t merely about the exhibit floor at the sprawling Las Vegas Convention Center. Actual NAB members can attend various sessions and workshops related to broadcasting and regulations. There are countless sidebar events specific to various parts of the industry. For editors that includes Avid Connect – a two-day series of Avid presentations over the weekend leading into NAB; Post Production World – a series of workshops, training sessions, and presentations managed by Future Media Concepts; as well as a number of keynote presentations and artist gatherings, including SuperMeet, FCPexchange, and the FCPX Guru Gathering. These are places where you’ll rub shoulders with well-known editors, colorists, artists, and mixers, learn about new technologies like HDR (high dynamic range imagery), and occasionally see new product features from vendors who might not officially be on the show floor with a booth, like Apple.

One of the biggest benefits I find in going to NAB is simply walking the floor, checking out the companies and products that might not get a lot of attention. These newcomers often have the most innovative technologies – the kind of discoveries that were never on your radar before that week.

The second benefit is connection. I meet up in person with friends I’ve made over the years – fellow users as well as vendors. Often it’s a chance to meet people you might only know through the internet (forums, blogs, etc.) and to get to know them just a bit better. A bit more of that might make the internet more friendly, too!

Here are some of my random thoughts and observations from Las Vegas.

__________________________________

Editing hardware and software – four As and a B

Apple uncharacteristically pre-announced its new features just prior to the show, culminating with App Store availability on Monday when the NAB exhibits opened. This includes new Final Cut Pro X/Motion/Compressor updates and the official number of 2.5 million FCPX users. That’s a growth of 500,000 users in 2017, the biggest year to date for Final Cut. The key new feature in FCPX is a captioning function to author, edit, and export both closed and embedded (open) captions. There aren’t many great solutions for captioning and the best to date have been expensive. I found the Apple approach to be the best and easiest to use that I’ve seen. It’s well-designed and should save time and money for those who need to create captions for their productions – even if you are using another brand of NLE. Best of all, if you own FCPX, you already have the feature. If you don’t have a script to start from, manual or automatic transcription is required as a starting point. There is now a tie-in between Speedscriber (also updated this week) and FCPX that will expedite the speech-to-text function.
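
Caption formats are a zoo unto themselves. Purely as a point of reference – and not a claim about how FCPX stores captions internally – here’s a minimal Python sketch that writes a SubRip (.srt) file, one of the common sidecar caption formats. The timestamp math is the part every caption format shares in some form.

```python
# A minimal sketch that writes a SubRip (.srt) caption file -- shown as a
# generic example of sidecar caption data, not as FCPX's internal format.

def srt_timestamp(seconds: float) -> str:
    """Format seconds in the HH:MM:SS,mmm style that SRT requires."""
    ms = round(seconds * 1000)
    h, ms = divmod(ms, 3_600_000)
    m, ms = divmod(ms, 60_000)
    s, ms = divmod(ms, 1_000)
    return f"{h:02}:{m:02}:{s:02},{ms:03}"

# (start seconds, end seconds, caption text) -- sample data
captions = [
    (1.0, 3.5, "Welcome back to the show."),
    (4.0, 6.2, "Tonight: highlights from NAB."),
]

with open("captions.srt", "w", encoding="utf-8") as f:
    for i, (start, end, text) in enumerate(captions, 1):
        f.write(f"{i}\n{srt_timestamp(start)} --> {srt_timestamp(end)}\n{text}\n\n")
```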

The second part of Apple’s announcement was the introduction of a new camera raw codec family – ProResRAW and ProResRAW HQ. These are acquisition codecs designed to record raw sensor data from Bayer-pattern sensors (prior to debayering the signal into RGB information) and make it available in post, much like RED’s REDCODE RAW or CinemaDNG. Since this is an acquisition codec and NOT a post or intermediate codec, it requires a partnership on the production side of the equation. Initially this includes Atomos and DJI. Atomos supplies external recorders that can capture the raw output from cameras able to send raw sensor data out – currently the Shogun Inferno and Sumo 19 models. Because this is camera-specific, Atomos must create a profile for each camera to remap that sensor data into ProResRAW. At the show, this included several Canon, Sony, and Panasonic cameras. DJI does this in-camera on the Inspire 2.

The advantage within FCPX is that ProResRAW is optimized for post, allowing more streams in real time. ProResRAW data rates (variable) fall between those of ProRes and ProRes HQ, while the less compressed ProResRAW HQ sits between ProRes HQ and ProRes 4444. It’s very early days for this codec, so additional camera and post vendors will likely add ProResRAW support over the coming year. It’s not yet known whether any other NLEs will support ProResRAW decode and playback.
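
To put those data rates into storage terms, a little arithmetic helps. The sketch below uses Apple’s published 1920×1080/29.97 targets for the existing ProRes family (roughly 147, 220, and 330 Mb/s) and simply splits the difference for the two RAW flavors – placeholder estimates for illustration only, not measured figures.

```python
# Back-of-the-envelope ProRes storage math. The 1080p29.97 target bitrates
# come from Apple's ProRes white paper; the ProResRAW numbers are midpoint
# guesses based only on the "falls between" claim above.

TARGET_MBPS = {
    "ProRes 422": 147,
    "ProRes 422 HQ": 220,
    "ProRes 4444": 330,
}

def gb_per_hour(mbps: float) -> float:
    """Convert megabits per second into gigabytes per hour."""
    return mbps / 8 * 3600 / 1000

for codec, mbps in TARGET_MBPS.items():
    print(f"{codec:>14}: {gb_per_hour(mbps):6.1f} GB/hr")

print(f"ProResRAW (est.):    {gb_per_hour((147 + 220) / 2):.1f} GB/hr")
print(f"ProResRAW HQ (est.): {gb_per_hour((220 + 330) / 2):.1f} GB/hr")
```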

As always, the Avid booth was quite crowded and, from what I heard, Avid Connect was well attended by enthusiastic Avid users. The Avid offerings are quite broad and hard to encapsulate in any single blog post; most, these days, are very enterprise-centric. But this year, with a new CEO at the helm, Avid’s creative tools have been reorganized into three strata – First, standard, and Ultimate. This applies to Sibelius, Pro Tools, and Media Composer. In the case of Media Composer, there’s Media Composer | First – a fully functioning free version with minimal restrictions; Media Composer; and Media Composer | Ultimate – which includes all options, such as PhraseFind, ScriptSync, NewsCutter, and Symphony. The big difference is that project sharing has been decoupled from Media Composer. This means that if you get the “standard” version (just named Media Composer), it will not be enabled for collaboration on a shared storage network; that requires Media Composer | Ultimate. So Media Composer (standard) is designed for the individual editor. There is also a new subscription pricing structure, which places Media Composer at about the same annual cost as Adobe Premiere Pro CC (single-app license). The push is clearly towards subscription; however, you can still purchase and/or maintain support for perpetual licenses, though that info is a little harder to find on Avid’s store website.

Though not as big a news item, Avid is also launching the Avid DNxID capture/export unit. It is custom-designed for Avid by Blackmagic Design and comes in a small form factor. It was created for file-based acquisition, supports 4K, and includes embedded DNx codecs for onboard encoding. Connections are via component analog and HDMI, plus an SD card slot.

The traffic around Adobe’s booth was thick the entire week. The booth featured interesting demos that were front and center in the middle of one of the South Hall’s main thoroughfares, generally creating a bit of a bottleneck. The newest Creative Cloud updates had preceded the show, but were certainly new to anyone not already using the Adobe apps. Big news for Premiere Pro users was the addition of automatic audio ducking brought over from Audition, and a new shot-matching function within the Lumetri color panel. Both are examples of Adobe’s use of its Sensei AI technology. Not to be left out, Audition can now also directly open sequences from Premiere Pro. Character Animator had been in beta form, but is now a full-fledged CC product. Adobe also introduced the Advanced Puppet Engine for After Effects – a deformation tool to better bend, twist, and control elements.

Of course, when it comes to NLEs, the biggest buzz has been over Blackmagic Design’s DaVinci Resolve 15. The company has an extensive track record of buying up older products from struggling companies, reinvigorating the design, reducing the cost, and breathing new life into them – often for a new, wider customer base. Nowhere is this more evident than with Resolve, which has grown from a leading color correction system into a powerful, all-in-one edit/mix/effects/color solution. We had previously seen the integration of the Fairlight audio mixing engine. This year, Fusion visual effects were added. As before, each of these disparate tools appears on its own page with a UI optimized for that task.

A number of folks have quipped that someone has finally resurrected Avid DS. Although all-in-ones like DS and Smoke haven’t been hugely successful in the past, Resolve’s price point is considerably more attractive. The Fusion integration means that you now have a subset of Fusion running inside of Resolve. This is a node-based compositor, which makes it easy for a Resolve user to understand, since the color page already works with nodes. At least for now, Blackmagic Design intends to maintain a standalone version of Fusion as well, which will offer more functions for visual effects compositing. Resolve also gained new editorial features, including tabbed sequences, a pancake timeline view, captioning, and improvements in the Fairlight audio page.

Other Blackmagic Design news includes updates to their various mini-converters, updates to the Cintel Scanner, and the announcement of a 4K Pocket Cinema Camera (due in September). They have also redesigned and modularized the Fairlight console mixing panels. These are now more cost-effective to manufacture and can be combined in various configurations.

This was the year for a number of milestone anniversaries, such as the 100th for Panasonic and the 25th for AJA. There were a lot of new product announcements at the AJA booth, but a big one was the push for more OpenGear-compatible cards. OpenGear is an open hardware rack standard that was developed by Ross Video and embraced by many manufacturers. You can purchase the OpenGear version of a manufacturer’s product and then mix and match a variety of OpenGear cards in any OpenGear rack enclosure. AJA’s cards also offer DashBoard support – a software tool to configure and control the cards. There are new KONA SDI and HDMI cards, HDR support in the Io 4K Plus, and HDR capture and playback with the Ki Pro Ultra Plus.

HDR

It’s fair to say that we are all still learning about HDR, but from what I observed on the floor, AJA is one of the only companies with a range of hardware products to handle HDR today. This is thanks to their partnership with ColorFront, which is handling the color science in these products. The line-up includes the FS | HDR – an up/down/cross, SDR/HDR synchronizer/converter – which also supports the Tangent Element Kb panel. The FS | HDR was a tech preview last year, but is a shipping product now. This year’s tech preview is the HDR Image Analyzer, which offers waveform and histogram monitoring at up to 4K/60fps.

Speaking of HDR (high dynamic range) and SDR (standard dynamic range), I had a chance to sit in on Robbie Carman’s (colorist at DC Color, Mixing Light) Post Production World HDR overview. Carman has graded numerous HDR projects, and from his presentation – coupled with exhibits on the floor – it’s quite clear that HDR is the wild, wild west right now. There is much confusion about color space and dynamic range, not to mention what current hardware is capable of versus the maximums expressed in the tech standards. For example, working in the BT.2020 color space doesn’t inherently mean that the image is HDR. And as a practical matter, getting HDR into a consumer set also means working in 4K, with a set that accepts the HDMI 2.0 standard.

High dynamic range grading absolutely requires HDR-compatible hardware, such as the proper I/O device and a display able to receive the metadata that switches it into HDR mode and sets its target values. This means investing in a device like AJA’s Io 4K Plus or Blackmagic’s UltraStudio 4K Extreme 3. It also means purchasing a true grading monitor costing tens of thousands of dollars, like one from Sony, Canon, or Flanders. You CANNOT properly grade HDR based on the image of ANY computer display. So while the latest version of FCPX can handle HDR, and an iMac Pro screen features a high nit rating, you cannot rely on that screen to judge proper HDR.

LG was a sponsor of the show and LG displays were visible in many of the exhibits. Many of their newest products qualify for the minimum HDR spec, but for the most part, the images shown on the floor were simply bright and not HDR – no matter what the sales reps in the booths were saying.

One interesting fact that Carman pointed out is that HDR displays cannot be driven across the full screen at their highest value. You cannot display a full screen of white at 1,000 nits on a 1,000-nit display without causing damage, so automatic gain adjustments in the set’s electronics dim the screen. Only a smaller percentage of the image (20%, maybe?) can be driven at full value before dimming occurs. Another point Carman made was that standard lift/gamma/gain controls may be too coarse to grade HDR images with finesse. His preference is Resolve’s log grading controls, because they allow more precise adjustments to highlight and shadow values.

Cameras

I’m not a camera guy, but there was notable camera news at the show. Many folks really like the Panasonic colorimetry for which the Varicam products are known. For people who want a full-featured camera in a small form factor, look no further than the Panasonic AU-EVA1. It’s a 4K, Super35, handheld cinema camera featuring dual native ISOs, and Panasonic claims 14 stops of latitude. It takes EF lenses and can output camera raw data. When paired with an Atomos recorder, it will be able to record ProResRAW.

Another new camera is Canon’s EOS C700 FF. This is a new full-frame model offered in both EF and PL lens mount versions. As with the standard Super35 C700, this is a 4K cinema camera that records ProRes or XF-AVC onboard to CFast cards at up to 4K resolution. The full-frame sensor offers higher resolution and a shallower depth of field.

Storage

Storage is of interest to many. As costs come down, collaboration is easier than ever. The direct-attached vendors, like G-Tech, LaCie, OWC, Promise, and others were all there with new products. So were the traditional shared storage vendors like Avid, Facilis, Tiger, 1 Beyond, and EditShare. But three of the newer companies had my interest.

In my editing day job, I work extensively with QNAP, which currently offers the best price/performance ratio of any system I’ve used. It’s reliable, cost-effective, and provides reasonable JKL response when cutting HD media with Premiere Pro in a shared editing installation. But it’s not the most responsive, and it struggles with 4K media in spite of plenty of bandwidth – especially when the editors are all banging away. This has me looking at both Lumaforge and OpenDrives.

Lumaforge is known to many Final Cut Pro X editors, because the developers have optimized the system for FCPX and have had early successes with many key installations. Since then, they have also pushed into more Premiere-based installations. Because these units are engineered for video-centric facilities, as opposed to data-centric ones, they promise a better shared storage experience for video editing.

Likewise, OpenDrives made its name as the provider for high-profile film and TV projects cut on Premiere Pro. Last year they came to the show with their highest-performance, all-SSD systems. Those units are pricey and, therefore, don’t have broad appeal. This year they brought a few systems that are applicable to a broader user base, including spinning-disk and hybrid products. All are truly optimized for Premiere Pro.

The cloud

In other storage news, “the cloud” garners a ton of interest. The biggest vendors are Microsoft, Google, IBM, and Amazon. While each of these offers relatively easy ways to use cloud-based services for back-up and archiving, if you want a full cloud-based installation for all of your media needs, then actual off-the-shelf solutions are not readily available. The truth of the matter is that each of these companies offers APIs, which are then handed off to other vendors – often for totally custom solutions.

Avid and Sony seem to have the most complete offerings, with Sony Ci being the best one-size-fits-all answer for customer-facing services. Of course, if review-and-approval is your only need, then Frame.io leads and will have new features rolled out during the year. IBM/Aspera is a great option for standard archiving, because fast Aspera up and down transfers are included. You get your choice of IBM or other (Google, Amazon, etc.) cloud storage. They even offer a trial period using IBM storage for 30 days at up to 100GB free. Backblaze is a competing archive solution with many partnering applications. For example, you can tie it in with Archiware’s P5 Suite of tools for back-up, archiving, and server synchronization to the cloud.

Naturally, when you talk of the “cloud”, many people interpret that to mean software that runs in the cloud – SaaS (software as a service). In most cases, that is nowhere close to happening. One exception, however, is The Foundry, which was showing Athera, a suite of its virtualized applications, like Nuke, running on the Google Cloud Platform. They demoed it running inside the Chrome browser, thanks to this partnership with Google. The Foundry had a pod in the Google partners pavilion.

In short, you can connect to the internet with a laptop, activate a license of the tool or tools that you need, and then all media, processing, and rendering is handled in the cloud, using Google’s services and hardware. Since all of this happens on Google’s servers, only an updated UI image needs to be pushed back to the connected computer’s display. This concept is ideal for the visual effects world, where the work is generally done on an individual shot basis without a lot of media being moved in real-time. The target is the Nuke-centric shop that may need to add on a few freelancers quickly, and who may or may not be able to work on-premises.

Interesting newcomers

As I mentioned at the beginning, part of the joy of NAB is discovering the small vendors who seek out NAB to make their mark. One example this year is Lumberjack Systems, a venture by Philip Hodgetts and Greg Clarke of Intelligent Assistance. They were in the Lumaforge suite demonstrating Lumberjack Builder, which is a text-based NLE. In the simplest of explanations, your transcription or scripted text is connected to media, and as you re-arrange or trim the text, the associated picture is edited accordingly. Newly written text for voiceovers is turned into spoken-word media courtesy of the computer’s system voice and audio output. Once your text-based rough cut is complete, an FCPXML is sent to Final Cut Pro X for further finesse and final editing.
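
Conceptually, every span of text carries a pointer back to its source clip and time range, so an edit of the words is an edit of the picture. A toy sketch of that mapping (my own illustration, not Lumberjack’s code) might look like this:

```python
# A toy model of text-based editing: each transcript segment knows which
# clip and time range it came from, so re-ordering the text implies a cut.

from dataclasses import dataclass

@dataclass
class Segment:
    text: str
    clip: str      # source media file
    start: float   # seconds into the source clip
    end: float

# Transcript in original (shoot) order -- sample data.
transcript = [
    Segment("We founded the company in 2004.", "interview_A.mov", 12.0, 15.5),
    Segment("Our customers come first.",       "interview_A.mov", 48.2, 50.9),
    Segment("That drives everything we do.",   "interview_B.mov",  7.4, 10.1),
]

# The editor re-arranges the *text*; the cut list follows automatically.
story_order = [transcript[1], transcript[2], transcript[0]]

for seg in story_order:
    print(f"{seg.clip}: {seg.start:.1f}s-{seg.end:.1f}s  \"{seg.text}\"")
```

A real implementation would snap to frame boundaries and export FCPXML rather than printing a bare list, but the text-to-timecode mapping is the heart of the idea.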

Another new vendor I encountered was Quine, co-founded by Norwegian DoP Grunleik Groven. Their QuineBox IoT device attaches to the back of a camera, where it can record and upload “conformable” dailies (ProRes, DNxHD) to your SAN, as well as proxies to the cloud via its internal wi-fi system. Script notes can also be incorporated. The unit has already been battle-tested on the Netflix/NRK production of “Norsemen”.

Closing thoughts

It’s always interesting to see, year over year, which companies are not at the show. This isn’t necessarily indicative of a company’s health, but can signal a change in their direction or that of the industry. Sometimes companies opt for smaller suites at an area hotel in lieu of the show floor (Autodesk). Or they are a smaller part of a reseller or partner’s booth (RED). But often, they are simply gone. For instance, in past years drones were all the rage, with a lot of different manufacturers exhibiting. DJI has largely captured that market for both vehicles and camera systems. While there were a few other drone vendors besides DJI, GoPro and Freefly weren’t at the show at all.

Another surprising change for me was the absence of SAM (Snell Advanced Media) – the hybrid company formed out of Snell & Wilcox and Quantel. SAM products are now part of Grass Valley, which, in turn, is owned by Belden (the cable manufacturer). Former Snell products appear to have been absorbed into the broader Grass Valley product line. Quantel’s Go and Rio editors continue in Grass Valley’s editing line alongside Edius – as the entry-level, midrange, and advanced NLE products, respectively. A bit sad, actually, and very ironic: here we are in the world of software and file-based video, but the company that still has money to make acquisitions is the one with a heavy investment in copper (I know, not just copper, but you get the point).

Speaking of “putting a fork in it”, I would have to say that stereo 3D and 360 VR are pretty much dead in the film and video space. I understand that there is a market – potentially quite large – in gaming, education, simulation, engineering, training, etc. But for more traditional entertainment projects, it’s just not there. Vendors were down to a few, and even though the leading NLEs have ways of working with 360 VR projects, the image quality still looks awful. When you view a 4K image within even the best goggles, the qualitative experience is like watching a 1970s-era TV set from a few inches away. For now, it continues to be a novelty looking for a reason to exist.

A few final points… It’s always fun to see what computers were being used in the booths. Apple is again a clear winner – plenty of MacBook Pros and iMac Pros were all over the LVCC wherever any sort of creative product or demo was shown. eGPUs are of interest, with Sonnet being the main vendor. However, an eGPU is not a solution for every problem. For example, you will see more benefit by adding an eGPU to a lesser-powered machine, like a 13” MacBook Pro, than to one with more horsepower, like an iMac Pro. Each eGPU takes up a Thunderbolt 3 bus, so realistically you are likely to add only one eGPU to a computer. None of the NLE vendors could really tell me how much of a boost their application would get from an eGPU. Finally, if you are looking for great-looking, large OLED displays that are pretty darned accurate and won’t break the bank, then LG is the place to look.

©2018 Oliver Peters

Molly’s Game

Molly Bloom’s future looked extremely bright: a shot at Olympic skiing glory, followed by entry into a top law school. But an accident during qualifying trials for the U.S. ski team knocked her out of the running for the Salt Lake City games. (Bloom notes in her own memoir that it was her decision to retire and change the course of her life, rather than the minor accident.) She moved to Los Angeles and ended up running high-stakes, private poker games with her boss at the time. These games included A-list celebrities, hedge fund managers, and eventually, members of the Russian mob. Bloom quickly earned the nickname of the “poker princess”. This all came crashing down when Bloom was busted by the FBI and sentenced for her role in the gambling ring.

Bloom’s memoir came to the attention of screenwriter Aaron Sorkin (The Social Network, Moneyball, Steve Jobs), who not only made it his next film script, but also his debut as a film director. Sorkin stayed close to the facts that Bloom described in her memoir and consulted her during the writing of the screenplay. The biggest departure is that Bloom named some celebrities at these games, who had previously been revealed in released court documents. Sorkin opted to fictionalize them, explaining that he would rather focus the story on Bloom’s experiences and not on Hollywood gossip. Jessica Chastain (The Zookeeper’s Wife, A Most Violent Year, Zero Dark Thirty) stars as Molly Bloom.

Although three editors are credited on Molly’s Game, the back story is that a staggered schedule had to be worked out. The post production of Steve Jobs connected feature film editor Elliot Graham (Milk, 21, Superman Returns) with that film’s writer and director – Sorkin and Danny Boyle (T2 Trainspotting, 127 Hours, Slumdog Millionaire). Graham was tapped to cut Molly’s Game later into the process, replacing its original editor. He brought on Josh Schaeffer (The Last Man on Earth, Detroiters, You’re the Worst) as associate editor to join him. Graham started the recut with Schaeffer, but a prior schedule commitment to work on Trust for Boyle saw him exit the film early. (Trust is the BBC’s adaptation of the Getty kidnapping story.) Graham was able to bring the film about 50% of the way through post. Alan Baumgarten (Trumbo, American Hustle, Gangster Squad) picked up for Graham and edited with Schaeffer to the finish, thus earning all three an editing credit.

Working with a writer on his directorial debut

It can always be a challenge when a writer is close to the editing process. Scenes that may be near and dear to the writer are often cut, leading to tension. I asked the three about this situation. Graham says, “Aaron has always been on set with his other films and worked very closely with the director. So, he understands the process, having learned from some of the best directors in the business. I had a great time with Aaron on Steve Jobs. He’s an incredibly lovely and generous collaborator who brings out the best in his team.”

Baumgarten expands, “Working with Aaron was fun, because he appreciates being challenged. He’s open to seeing what an editor brings to the film. Aaron wrote a tight script that didn’t need to be re-arranged. Only about 20 minutes came out. We cut one small scene, but it was mostly trimming here and there. You want to be careful not to ruin the rhythm of his writing.”

Graham continues, “Aaron also found his own visual vocabulary. A lot of the story is told in time jumps, from present day to the past in flashbacks. Aaron is always looking for rapid-fire, overlapping dialogue. It’s part of his uniqueness and it’s a joy to cut. What was new for Aaron was using voice-over to drive things.”

Another new challenge was the use of stock footage. About 150 stock shots were used for cutaways and mini-montages throughout the film. Most of these were never in the original script. Graham says, “Stock footage was something I chose to start injecting into the film with Aaron’s collaboration when I came on. We felt it was useful to have visual references for some of the voice-overs – to connect visuals with words, which helps to land Aaron’s linguistic ideas for viewers. This began with the opening ski sequence – the first thing I cut when I came on board.”

The editors would pull down shots from a variety of internet sources, and then the actual footage had to be found and cleared; they ultimately partnered with STALKR to find and clear all of the stock shots that were used. Visual effects were handled by Mr. X in Toronto. Originally, only 90 shots were budgeted (for example, snow falling in the ski sequences), but in the end there were almost 600 visual effects shots in the final film.

Musicality of the performance

Baumgarten explains the musicality of Sorkin’s style. He says, “Aaron knew the film he wanted and had that in his head. Part of his writing process is to read his dialogue out loud and listen for the cadence of the performance. As you go through takes, the film is always moving in the right direction. As a writer/director, he doesn’t need variations or ad libs in an actor’s performance from one take to another, because he knows what the intention of the line is. As editors, we didn’t need to experiment with different calibrations of the performance. The experimentation came in with how we wove in the voice-over and played with the general rhythm.”

Graham adds, “Daniel Pemberton is the composer I worked with on Steve Jobs. I brought on Carl Kaller, a great music editor, when I came on. I knew that the music and dialogue had to dance a beautiful rhythm together for the film to be its best. With a compressed schedule to finish the film, we needed someone like Carl to help choreograph that dance.”

Baumgarten continues, “Daniel was involved early and provided us with temp tracks, which was a great gift. We didn’t have to use scores from other composers as temp music. Carl was just down the hall, so it was easy to weave Daniel’s temp elements in and around the dialogue and voice-over during the editing stage. There is interplay between the voice-over and the music, and the VO is like another musical element.”

Avid for the post

The post operation followed a standard feature film set-up: Avid Media Composer editing workstations tied to Avid ISIS shared storage. The film was shot digitally using ARRI Alexas.

Production covered 48 days, ending in February [2017]. It took 10 weeks to get to a director’s cut, and then editing on Molly’s Game continued for about six months, which included visual effects, the final sound mix, and color correction. Schaeffer explains, “The dialogue scenes were scripted using [Avid] ScriptSync. Aaron was familiar with ScriptSync from The Newsroom, and it was a great help for us on this film. It’s the best way to have everything readily available and it allows us to be extremely thorough. If Aaron wanted to change a single word in a take, we were always able to find all of the alternates and make the change quite easily.”

Schaeffer continues, “Aaron methodically worked in a reel-by-reel order. We would divide up sequences between us at breaks that made sense. But when it came time to review the cut of a sequence, we would all review it together. A lot of people think that you have three editors on a film because the project is so difficult. The truth is that it lets you be more creative. Productions shoot so much footage these days that it’s great to be able to experiment. Having multiple editors on a film enables you to take the time to be creative. We were all glad that Aaron set up an environment which made that possible.”

Originally written for Digital Video magazine / Creative Planet Network

©2018 Oliver Peters

Downsizing

The bond between a film director and the editor is often a long-lasting one. The industry is full of pairings that continue film after film. One such duo is director Alexander Payne (Nebraska, The Descendants, Sideways) and editor Kevin Tent (Welcome to Me, Girl, Interrupted, Election). Tent has edited every film that Payne has directed, with the exception of Payne’s short film Paris, je t’aime. In fact, Payne also served as producer for Crash Pad, a film directed by Tent.

The latest Alexander Payne film to hit the cinemas is Downsizing, a sci-fi satire starring Matt Damon, Christoph Waltz, and Kristen Wiig. In the film, scientists discover human miniaturization as a way to combat overpopulation. Paul (Matt Damon) and Audrey (Kristen Wiig) decide to give it a try, exchanging their average life in Omaha for Leisure Land, one of the ‘micro-communities’ sprouting up. Their modest $150,000 in personal assets will make them multimillionaires, so they take the plunge.

Sci-fi and satire

The sci-fi genre is a new approach for Payne, which is where I started my conversation with Kevin Tent. He explains, “The sci-fi theme is a departure for Alexander, but this is still very much an ‘Alexander Payne movie’. It’s still about the human experience. In the plot, shrinking is seen as a way to save the human race, but people get greedy. They can make themselves instantly rich, save money on food and medicine, and move into big ‘McMansions’. Human nature takes over, which makes the film funny and also thought-provoking. It covers a lot of ground, including politics.”

“It’s easy to ask, why sci-fi,” Tent continues. “Alexander Payne is an artist who is always looking for ways to challenge himself. He co-wrote the script ten years ago, but it took this long to get it made. For one thing, Downsizing is more expensive than his past films. As an editor, I first looked at the cutting differently, because of working with the visual effects; but, I quickly realized that this film, like Alexander’s others, was about the characters and the story. [Those are] still the most important elements of the movie. I had recently worked on The Audition, which was shot mostly with green screen – and a while back, The Golden Compass, which was a serious visual effects movie. I had enough knowledge about the process to know one thing. These people can do anything! We had a terrific VFX team, headed by our creative guru, Jamie Price. ILM and Framestore did most of the visual effects.”

Digital production to aid the process

Alexander Payne shifted to digital acquisition with Nebraska and has followed suit with his latest, Downsizing. According to Tent, “Alexander shoots a lot of coverage, so he likes digital for that. It’s also easier to deal with when compositing visual effects. We had over 130 hours of total footage. Of course, a fairly good chunk was plates for VFX and 2nd unit footage. Most of the scenes were shot with single camera, but sometimes with multi-cam. Especially for some of the big speeches, which were covered with two and sometimes three cameras. We synced up the takes in the Avid, which makes it so easy to switch from camera to camera. Mindy Elliot is our amazing first assistant. She’s a total pro and a total joy to work with. She’s been running our cutting rooms since The Descendants. Angela Latimer was our second. She did 99% of the scripting [for Avid’s ScriptSync feature] and also helped cut early versions of Paul’s drug montage [scene in Downsizing]. Joe Carson was our VFX editor. I met him while working on Sponge Bob The Movie. I was one of the live action CGI editors on that film. Joe is awesome. He not only kept all of our visual effects organized, but he was also kept busy with the countless comps, morphs, and speed-ups that we tossed at him on a daily basis.”

Production wrapped in mid-August 2016 and then Tent started cutting with Payne right after Labor Day. Tent continues, “When I cut with Alexander, we basically start from scratch. I do create an editor’s cut during production, which we go back to for reference during our time together cutting, but it isn’t the starting point when I begin with Alexander. He’s a good editor, so when we work together, it’s really like having two editors in the room. We start watching dailies and start building scenes. We often look back at my editor’s cut and realize the scene or a part of it was better in that earlier version. Or maybe not. If there is something we like, we’ll put it back into the current cut. We completed our first pass (kind of a director’s assembly) in January to show the studio. By early to mid-July we had a locked cut with about 80% of the completed VFX shots. The remainder trickled in afterwards. All together, that’s about ten or eleven months of cutting and finishing. Our DI/color grading was handled by the amazing Skip Kimball at Technicolor.”

Tools and tips

As a fellow editor, it’s always fun to talk about the tools and how to use them on a feature film project. Kevin Tent is a committed Avid Media Composer user. (Pacific Post provided the Avid systems used by the editing team.) According to Tent, “This was a huge project and Media Composer never had a problem with it.” One unique hallmark of Media Composer is Avid’s Script Integration. Notable within it is ScriptSync, Media Composer’s ability to automatically analyze waveforms and synchronize them – and, therefore, the associated clip – against text that has been input, like a film script. When correctly indexed, simply clicking on a line of dialogue in the on-screen script brings up all of the corresponding coverage. An ongoing licensing dispute limited its use to older versions of Media Composer, until the issue was finally resolved this year. That is great news for devotees of Avid’s powerful ScriptSync capability.
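
The underlying data structure is easy to picture: once takes have been aligned to the script, the system effectively maintains an index from each script line to every take that covers it. Here is a hypothetical sketch of that lookup (a concept model only, not Avid’s implementation):

```python
# A concept model of script-based coverage lookup: map each take to the
# script lines its alignment spans, then query by line number.

coverage = {
    "Scene12_TakeA_wide":   set(range(1, 9)),   # covers lines 1-8
    "Scene12_TakeB_CU_Moe": {1, 2, 3, 4},
    "Scene12_TakeC_CU_Sam": {5, 6, 7, 8},
    "Scene12_TakeD_insert": {4, 5},
}

def takes_for_line(line_no: int) -> list[str]:
    """Return every take whose aligned range includes the chosen line."""
    return sorted(t for t, lines in coverage.items() if line_no in lines)

# "Clicking" line 4 of the script surfaces all corresponding coverage.
print(takes_for_line(4))
# ['Scene12_TakeA_wide', 'Scene12_TakeB_CU_Moe', 'Scene12_TakeD_insert']
```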

Many film editors swear by Avid’s Script Integration tools, yet some never use them at all. Was Tent a ScriptSync user? “Hell, ya!” is his instant reply. “We stayed on Media Composer 7.0.6 because of the ScriptSync licensing issue, just so we could use it. I had Angela mark a lot of extra material and ad libs in addition to the scripted dialogue. For example, an action like Paul opening a door or something like that. That would help, especially if they shot a lot of takes or resets within one bigger take, which tends to happen a lot when shooting digitally. There’s a massive party scene midway through the movie with people dancing, smoking pot, that kind of thing, and I asked Angela to add a ton of detail describing the scene. It made finding specific actions so quick. It’s also an especially great aid when re-cutting scenes and looking for alternate coverage.”

Another aid that editors like is to place scene cards on the wall. Typically these are 3”x5” note cards with written scene descriptions – one for each scene – that can be pinned to the wall in the order of the ongoing edit. Although Tent is also a proponent of these – a remnant practice from the old film days – his Downsizing cutting room didn’t have enough wall space to accommodate cards.

The Downsizing script clocked in a tad long, and the first assembly that Payne and Tent cut ran 2:45 (the final length was 2:08). Obviously the team needed to do a bit of “downsizing” themselves. Tent explains, “The biggest lost scenes were bookending storyteller elements to open and close the film. There was an old caveman from far in the future telling a group of children about the events within the film and how once giants roamed the world. This story element was painful to lose, because it was very funny and effective emotionally. But it took an added three or four minutes to get to Matt Damon’s character and that hurt us. The audience wants you to get to your main characters and understand what they’re seeing within a reasonable amount of time. Fortunately, Alexander hadn’t shot it yet as part of the main production. We previewed with storyboards, temp music, and voice-over. While it was tough to lose it from the point of view of the script, we weren’t leaving produced material ‘on the cutting room floor’. Ultimately, if you don’t know it was there, you won’t miss not having it.”

Downsizing opened in cinemas on December 21. Whether you are in it for the thought-provoking concepts or simply a lot of laughs and a wild ride, it’s a film to enjoy. Alexander Payne is bound to have another success on his hands.

Originally written for Digital Video magazine / Creative Planet Network

© 2017, 2018 Oliver Peters

Stocking Stuffers 2017

It’s holiday time once again. For many editors that means it’s time to gift themselves with some new tools and toys to speed their workflows or just make the coming year more fun! Here are some products to consider.

Just like the tiny house craze, many editors are opting for their laptops as their main editing tool. I’ve done it for work that I cut when I’m not freelancing in other shops, simply because my MacBook Pro is a better machine than my old (but still reliable) 2009 Mac Pro tower. One less machine to deal with, which simplifies life. But to really make it feel like a desktop tool, you need some accessories along with an external display. For me, that boils down to a dock, a stand, and an audio interface. There are several stands for laptops. I bought both the Twelve South BookArc and the Rain Design mStand: the BookArc for when I just want to tuck the closed MacBook Pro out of the way in the clamshell mode and the mStand for when I need to use the laptop’s screen as a second display. Another option some editors like is the Vertical Dock from Henge Docks, which not only holds the MacBook Pro, but also offers some cable management.

The next hardware add-on for me is a USB audio interface. This is useful for any type of computer and may be used with or without other interfaces from Blackmagic Design or AJA. The simplest of these is the Mackie Onyx Blackjack, which combines the interface and output monitor mixing into one package, meaning no extra small mixer is required – USB input and analog audio output go direct to a pair of powered speakers. But if you prefer a separate small mixer and only want a USB interface for input/output, then the PreSonus AudioBox USB or the Focusrite Scarlett series is the way to go.

Another ‘must have’ with any modern system is a Thunderbolt dock to expand the native port connectivity of your computer. There are several on the market, but it’s hard to go wrong with either the CalDigit Thunderbolt Station 2 or the OWC Thunderbolt 2 Dock. Make sure you double-check which version fits your needs, depending on whether you have Thunderbolt 2 or 3 connections and/or USB-C ports. I routinely use both the CalDigit and OWC products; the choice simply depends on which one has the right combination of ports for you.

Drives are another issue. With a small system, you want small portable drives. While LaCie Rugged and G-Technology portable drives are popular choices, SSDs are the way to go when you need true, fast performance. A number of editors I’ve spoken to are partial to the Samsung Portable SSD T5 drives. These USB 3.0-compatible drives aren’t the cheapest, but they are ultraportable and offer amazing read/write speeds. Another popular solution is to use raw (uncased) drives in a drive caddy/dock for archiving purposes. Since they are raw, you don’t pay for the extra packaging, power supply, and interface electronics with each drive, just to have it sit on the shelf. My favorite of these is the HGST Deskstar NAS series.

For many editors the software world is changing with free applications, subscription models, and online services. The most common use of the latter is for review-and-approval, along with posting demo clips and short films. Kollaborate.tv, Frame.io, Wipster.io, and Vimeo are the best known. There are plenty of options and even Vimeo Pro and Business plans offer a Frame/Wipster-style review-and-approval and collaboration service. Plus, there’s some transfer ability between these. For example, you can publish to a Vimeo account from your Frame account. Another expansion of the online world is in team workgroups. A popular solution is Slack, which is a workgroup-based messaging/communication service.

As more resources become available online, the benefits of large-scale computing horsepower are available to even single editors. One of the first of these new resources is cloud-based, speech-to-text transcription. A number of online services provide this functionality to any NLE. Products to check out include Scribeomatic (Coremelt), Transcriptive (Digital Anarchy), and Speedscriber (Digital Heaven). They each offer different pricing models and speech analysis engines. Some are still in beta, but one that’s already out is Speedscriber, which I’ve used and am quite happy with. Processing is fast and reasonably accurate, given a solid audio recording.

Naturally free tools make every user happy and the king of the hill is Blackmagic Design with DaVinci Resolve and Fusion. How can you go wrong with something this powerful and free with ongoing company product development? Even the paid versions with some more advanced features are low cost. However, at the very least the free version of Resolve should be in every editor’s toolkit, because it’s such a Swiss Army Knife application.

On the other hand, editors who need to learn Avid Media Composer need look no further than the free Media Composer | First. Avid has tried ‘dumbed-down’ free editing apps before, but First is actually built from the same code base as the full Media Composer software. Thus, skills translate and most of the core functions are available for you to use.

Many users are quite happy with the advantages of Adobe’s Creative Cloud software subscription model. Others prefer to own their software. If you work in video, then it’s easy to put together alternative software kits for editing, effects, audio, and encoding that don’t touch an Adobe product. Yet for most, the stumbling block is Photoshop – until now. Both Affinity Photo (Serif) and Pixelmator Pro are full-fledged graphic design and creation tools that rival Photoshop in features and quality. Each of these has its own strong points. Affinity Photo offers Mac and Windows versions, while Pixelmator Pro is Mac only, but taps more tightly into macOS functions.

If you work in the Final Cut Pro X world, several utilities are essential. These include SendToX and XtoCC from Intelligent Assistance, along with X2Pro Audio Convert from Marquis Broadcast. Marquis’ newest is Worx4 X – a media management tool. It takes your final sequence and creates a new FCPX library with consolidated (trimmed) media. No transcoding is involved, so the process is lightning fast, although in some cases media is copied without being trimmed. This can reduce the media to be archived from TBs down to GBs. They also offer Worx4 Pro, which is designed for Premiere Pro CC users. That tool serves as a media tracking application, letting editors find all of the media used in a Premiere Pro project across multiple volumes.
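
The consolidation concept itself is straightforward range arithmetic: gather the source ranges actually used in the locked cut, pad them with handles, merge any overlaps, and copy only those spans. Here’s a hedged sketch of that logic (my own illustration; Worx4 X’s actual behavior will differ in its details):

```python
# A conceptual sketch of trimmed-media consolidation: pad each used source
# range with handles, merge overlapping spans, and report what to copy.

from collections import defaultdict

def consolidate(usages, handle=2.0):
    """usages: list of (source_file, in_sec, out_sec) from the final cut.
    Returns merged, handle-padded spans to copy, keyed by source file."""
    by_source = defaultdict(list)
    for src, t_in, t_out in usages:
        by_source[src].append((max(0.0, t_in - handle), t_out + handle))

    copies = {}
    for src, spans in by_source.items():
        spans.sort()
        merged = [spans[0]]
        for start, end in spans[1:]:
            if start <= merged[-1][1]:   # overlapping or touching spans
                merged[-1] = (merged[-1][0], max(merged[-1][1], end))
            else:
                merged.append((start, end))
        copies[src] = merged
    return copies

cut = [("A001.mov", 10.0, 14.0), ("A001.mov", 13.0, 20.0), ("B002.mov", 95.0, 97.5)]
print(consolidate(cut))
# Only 14s of A001 and 6.5s of B002 get copied -- not the full camera files.
```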

Most editors love to indulge in plug-in packages. If you can only invest in a single, large plug-in package, then BorisFX’s Boris Continuum Complete 11 and/or their Sapphire 11 bundles are the way to go. These are industry-leading tools with wide host and platform support. Both feature mocha tracking integration and Continuum also includes the Primatte Studio chromakey technology.

If you want to go for a build-it-up-as-you-need-it approach – and you are strictly on the Mac – then FxFactory will be more to your liking. You can start with the free, basic platform or buy the Pro version, which includes FxFactory’s own plug-ins. Either way, FxFactory functions as a plug-in management tool. FxFactory’s numerous partner/developers provide their products through the FxFactory platform, which functions like an app store for plug-ins. You can pick and choose the plug-ins that you need when the time is right to purchase them. There are plenty of plug-ins to recommend, but I would start with any of the Crumplepop group, because they work well and provide specific useful functions. They also include the few audio plug-ins available via FxFactory. Another plug-in to check out is the Hawaiki Keyer 4. It installs into both the Apple and Adobe applications and far surpasses the built-in keying tools within these applications.

The Crumplepop FxFactory plug-ins now include Koji Advance, which is a powerful film look tool. I like Koji a lot, but prefer FilmConvert from Rubber Monkey Software. To my eyes, it creates one of the more pleasing and accurate film emulations around and even adds a very good three-way color corrector. It opens as a floating window inside of FCPX, which is less obtrusive than some of the other color correction plug-ins for FCPX. It’s not just for film emulation – you can actually use it as the primary color corrector for an entire project.

I don’t want to forget audio plug-ins in this end-of-the-year roundup. Most editors don’t feel too comfortable with a ton of surgical audio filters, so let me stick to suggestions that are easy to use and very affordable. iZotope is a well-known audio developer and several of its products are perfect for video editors. These fall into repair, mixing, and mastering needs, and include the Nectar, Ozone, and RX bundles, along with RX Loudness Control. The first three groups are designed to cover a wide range of needs and, like the BCC video plug-ins, are somewhat of an all-encompassing product offering. But if that’s a bit rich for the blood, then check out iZotope’s various Elements versions.

The iZotope RX Loudness Control is great for accurate loudness compliance and is best used with Avid or Adobe products. However, it is not real-time, because it uses analysis and adaptive processing. If you want something more straightforward and real-time, then check out the LUFS Meter from Klangfreund. It can be used for loudness control on individual tracks or the master output, and it works with most NLEs and DAWs. A similar tool is Loudness Change from Videotoolshed.
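
For the curious, loudness measurement is well-defined math. True LUFS metering per ITU-R BS.1770 applies a K-weighting filter and gated 400 ms measurement blocks, but at its core is a mean-square level expressed in decibels, which this deliberately simplified sketch shows:

```python
# A deliberately simplified loudness sketch: mean-square level in dBFS.
# Real LUFS metering (ITU-R BS.1770) adds K-weighting and gating on top.

import math

def mean_square_dbfs(samples):
    """Mean-square level of float samples (-1.0..1.0) in dB full scale."""
    ms = sum(s * s for s in samples) / len(samples)
    return float("-inf") if ms == 0 else 10 * math.log10(ms)

# A 1 kHz test tone at half amplitude: the mean square of a 0.5-amplitude
# sine is 0.125, and 10 * log10(0.125) is about -9.03 dB.
tone = [0.5 * math.sin(2 * math.pi * 1000 * n / 48000) for n in range(48000)]
print(f"{mean_square_dbfs(tone):.2f} dBFS")  # ~ -9.03
```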

Finally, let’s not forget the iOS world, which is increasingly becoming a viable production platform. For example, I’ve used my iPad in the last year to do location interview recordings. This is a market that audio powerhouse Apogee has also recognized. If you need a studio-quality hardware interface for an iPhone or iPad, then check out the Apogee ONE. In my case, I tapped the Apogee MetaRecorder iOS application for my iPad, which works with both Apogee products and the iPad’s built-in mic. It can be used in conjunction with FCPX workflows through the integration of metadata tagging for Keywords, Favorites, and Markers.

Have a great holiday season and happy editing in the coming year!

©2017 Oliver Peters

Avid Media Composer | First

They’ve teased us for two years, but now it’s finally out. Avid Technology has released its free nonlinear editing application, Media Composer | First. This is not dumbed-down, teaser software, but rather a partially-restricted version of the full-fledged Media Composer software, built upon the same code. With that comes an inherent level of complexity, which Avid has sought to minimize for new users; however, you really do want to go through the tutorials before diving in.

It’s important to understand who the target user is. Avid didn’t set out to simply add another free, professional editing tool to an increasingly crowded market. Media Composer | First is intended as a functional starter tool for users who want to get their feet wet in the Avid ecosystem, but then eventually convert to the full-fledged, paid software. That’s been successful for Avid with Pro Tools | First. To sweeten the pot, you’ll also get 350 sound effects from Pro Sound Effects and 50 royalty-free music tracks from Sound Ideas (both sets are also free).

Diving in

To get Media Composer | First, you must set up a free Avid master account. Existing customers can also get First, but the software cannot be co-installed on a computer with the full version. For example, I installed Media Composer | First on my laptop, because I have the full Media Composer application on my desktop. You must sign into the account and stay signed in for Media Composer | First to launch and run. I did get it to work by signing in and then disconnecting from the internet. There was a disconnection prompt, but nevertheless the application worked, saved, and exported properly, so it doesn’t seem mandatory to be constantly connected to Avid over the internet. All project data is stored locally – this is not a cloud application.

Managing the account and future updates is handled through Application Manager, an Avid desktop utility. It’s not my favorite, as at times it’s unreliable, but it does work most of the time. The installer .dmg file will take a long time to verify when opened. This seems to be a general Avid quirk, so be patient. When you first open the application, you may get a disk drive write-permissions error message. On macOS you normally set drive permissions for “system”, “wheel”, and “everyone”. Typically I have the last two set to “read only”, which works for every other application except Avid’s. Therefore, if you want to store Avid media on your internal system hard drive, then “everyone” must be changed to “read & write”.

The guided tour

The Avid designers have tried to make the Media Composer | First interface easy to navigate for new users – especially those coming from other NLEs, where media and projects are managed differently than in Media Composer. Right at the launch screen you have the option to learn through online tutorials. These will be helpful even for experienced users who might try to “out-think” the software. The interface includes a number of text overlays to help you get started. For example, there is no place to set project settings; the first clip added to the first sequence sets the project settings from there on. So don’t drop a 25fps clip onto the timeline as your first clip if you intend to work in a 23.98fps project. These prompts are right in front of you, so if you follow their guidance, you’ll be OK.

The same holds true for importing media through the Source Browser. With Media Composer you either transcode a file, which turns it into Avid-managed media placed into the Avid MediaFiles folder, or simply link to the file. If you select link, then the file stays in place and it’s up to the user not to move or delete that file on the hard drive. Although the original Avid paradigm was to only manage media in its MediaFiles hard drive folders, the current versions handle linking just fine and act largely the same as other NLEs.

Options, restrictions, and limitations

Since this is a free application, a number of features have been restricted. There are three biggies. First, tracks are limited to four video tracks and eight audio tracks. This is actually quite workable; however, I think a higher audio track count would have been advisable, because of how Avid handles stereo, mono, and multichannel files. On a side note, if you use the “collapse” function to nest video clips, it’s possible to vertically stack more than four clips on the timeline.

Second, the application is locked to a maximum project size of 1920×1080 (Rec. 709 color space only) and up to 59.94fps. Source files can be larger (such as 4K) and you can still use them on the timeline, but you’ll have to pan-and-scan, crop, or scale them. I hope future versions will permit at least UltraHD (4K) project sizes.

Finally, Media Composer | First projects cannot be interchanged with full-fledged Media Composer projects. This means that you cannot start in Media Composer | First and then migrate your project to the paid version. Hopefully this gets fixed in a future update; if not, it will negatively impact students and indie producers using the application for any real work.

As expected, there are no 3D stereoscopic tools, ScriptSync (automatic speech-to-text/sync-to-script), PhraseFind (phonetic search engine), or Symphony (advanced color correction) options. One that surprised me, though, was the removal of the superior SpectraMatte keyer. You are left with the truly terrible RGB keyer for blue/green-screen work.

Nevertheless, there’s plenty of horsepower left – for example, FrameFlex to handle resizing and Timewarp effects for retiming clips, which is how 4K footage and off-speed frame rates are handled. Color correction (including scopes), multicam, IllusionFX, source-setting color LUTs, AudioSuite, and Pro Tools-style audio track effects are also there. Transcoding allows for the use of a wide range of codecs, including ProRes on a Mac; 4K camera clips will be transcoded to 1080. However, exports are limited to Avid DNxHD and H.264 QuickTime files at up to 1920×1080. The only DNxHD export flavor is the 100Mbps variant at 29.97fps (80Mbps at 23.98fps), which is comparable to ProRes LT. It’s good quality, but not at the highest mastering levels.

Conclusion

This is a really good first effort, no pun intended. As you might expect, it’s a little buggy for a first version. For example, I experienced a number of crashes while testing source LUTs; however, it was well-behaved during standard editing tasks. If Media Composer | First files become compatible with the paid systems and the 1080 limit is increased to UHD/4K, then Avid has a winner on its hands. Think of the film student who starts on First at home, but then finishes on the full version in the college’s computer lab. Or the indie producer/director who starts his or her own rough cut on First, but then takes it to an editor or facility to complete the process. These are ideal scenarios for First. I’ve cut tons of short and long-form projects, including a few feature films, using a variety of NLEs. Nearly all of those could have been done using Media Composer | First. Yes, it’s free, but there’s enough power to get the job done and done well.

©2017 Oliver Peters

Suburbicon

George Clooney’s latest film, Suburbicon, originated over a decade ago as a screenplay by Joel and Ethan Coen. Clooney picked it up when the Coens decided not to produce the film themselves. Clooney and writing partner Grant Heslov (The Monuments Men, The Ides of March, Good Night, and Good Luck) rewrote it to take place in the 1950s and added another story element. In the summer of 1957, the Myers, an African-American couple, moved into a largely white suburb in Levittown, Pennsylvania, setting off months of violent protests. The rewritten script interweaves the tale of the black family with that of their next-door neighbors, Gardner (Matt Damon) and Margaret (Julianne Moore). In fact, a documentary was produced about the historical events, and shots from it were used in Suburbicon.

Calibrating the tone

During the production and editing of the film, the overall tone was adjusted as a result of the actual, contemporary events occurring in the country. I spoke with the film’s editor, Stephen Mirrione (The Revenant, Birdman or (The Unexpected Virtue of Ignorance), The Monuments Men) about this. Mirrione explains, “The movie is presented as over-the-top to exaggerate events as satire. In feeling that out, George started to tone down the silliness, based on surrounding events. The production was being filmed during the time of the US election last year, so the mood on the set changed. The real world was more over-the-top than imagined, so the film didn’t feel quite right. George started gravitating towards a more realistic style and we locked into that tone by the time the film moved into post.”

The production took place on the Warner Brothers lot in September 2016 with Mirrione and first assistant editor Patrick Smith cutting in parallel with the production. Mirrione continues, “I was cutting during this production period. George would come in on Saturdays to work with me and ‘recalibrate’ the cut. Naturally some scenes were lost in this process. They were funny scenes, but just didn’t fit the direction any longer. In January we moved to England for the rest of the post. Amal [Clooney, George’s wife] was pregnant at the time, so George and Amal wanted to be close to her family near London. We had done post there before and had a good relationship with vendors for sound post. The final sound mix was in the April/May time frame. We had an editing room set up close to George outside of London, but also others in Twickenham and at Pinewood Studios. This way I could move around to work with George on the cut, wherever he needed to be.”

Traveling light

Mirrione is used to working with a light footprint, so the need for mobility was no burden. He explains, “I’m accustomed to being very mobile. All the media was in the Avid DNxHD36 format on mobile drives. We had an Avid ISIS shared storage system in Twickenham, which was the hub for all of the media. Patrick would make sure all the drives were updated during production, so I was able to work completely with standalone drives. The Avid is a bit faster that way, although there’s a slight trade-off waiting for updated bins to be sent. I was using a ‘trash can’ [2013] Mac Pro plus AJA hardware, but I also used a laptop – mainly for reference – when we were in LA during the final steps of the process.” The intercontinental workflow also extended to color correction. According to Mirrione, “Stefan Sonnenfeld was our digital intermediate colorist and Company 3 [Co3] stored a back-up of all the original media. Through an arrangement with Deluxe, he was able to stream material to England for review, as well as from England to LA to show the DP [Robert Elswit].”

Music was critical to Suburbicon and scoring fell to Alexandre Desplat (The Secret Life of Pets, Florence Foster Jenkins, The Danish Girl). Mirrione explains their scoring process. “It was very important, as we built the temp score in the edit, to understand the tone and suspense of the film. George wanted a classic 1950s-style score. We tapped some Elmer Bernstein, Grifters, The Good Son, and other music for our initial style and direction. Peter Clarke was brought on as music editor to help round out the emotional beats. Once we finished the cut, Alexandre and George worked together to create a beautiful score. I love watching the scenes with that score, because his music makes the editing seem much more exciting and elegant.”

Suiting the edit tool to your needs

Stephen Mirrione typically uses Avid Media Composer to cut his films and Suburbicon is no exception. Unlike many film editors who rely on unique Avid features, like ScriptSync, Mirrione takes a more straightforward approach. He says, “We were using Media Composer 8. The way George shoots, there’s not a lot of improv or tons of takes. I prefer to just rely on PDFs of the script notes and placing descriptions into the bins. The infrastructure required for ScriptSync, like extra assistants, is not something I need. My usual method of organization is a bin for each day of dailies, organized in shooting order. If the director remembers something, it’s easy to find in a day bin. During the edit, I alternate my bin set-ups between the script view and the frame view.”

With a number of noted editors dabbling with other software, I wondered whether Mirrione has been tempted. He responds, “I view my approach as system-agnostic and have cut on Lightworks and the old Laser Pacific unit, among others. I don’t want to be dependent on one piece of software to define how I do my craft. But I keep coming back to Avid. For me it’s the trim mode. It takes me back to the way I cut film. I looked at Resolve, because it would be great to skip the roundtrip between applications. I had tested it, but felt it would be too steep a learning curve, and that would have impacted George’s experience as the director.”

In wrapping up our conversation, Mirrione concluded with this takeaway from his Suburbicon experience. He explains, “In our first preview screening, it was inspiring to see how seriously the audience took to the film and the attachment they had to the characters. The audiences were surprised at how biting and relevant it is to today. The theme of the film is really talking about what can happen when people don’t speak out against racism and bullying. I’m so proud and lucky to have the opportunity to work with someone like George, who wants to do something meaningful.”

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters

Baby Driver

You don’t have to be a rabid fan of Edgar Wright’s work to know of his films. His comedy trilogy (Shaun of the Dead, Hot Fuzz, The World’s End) and cult classics like Scott Pilgrim vs. the World loom large in pop culture. His films have enjoyed a life well beyond most films’ brief release window, earning Wright a loyal following. His latest is Baby Driver, a musically fueled action film that Wright wrote and directed, which just made a big splash at SXSW. It stars Ansel Elgort, Kevin Spacey, Jon Hamm, Jamie Foxx, and Eiza Gonzalez.

At NAB, Avid brought in a number of featured speakers for its main stage presentations, as well as its Avid Connect event. One of these speakers was Paul Machliss (Scott Pilgrim vs. the World, The World’s End, Baby Driver), who spoke to packed audiences about the art of editing these films. I had a chance to go in-depth with Machliss about the complex process of working on Baby Driver.

From Smoke to baptism by fire

We started our conversation with a bit of the backstory of the connection between Wright and Machliss. He says, “I started editing as an online editor and progressed from tape-based systems to being one of the early London-based Smoke editors. My boss at the time passed along a project that he thought would be perfect for Smoke. That was onlining the sitcom Spaced, directed by Edgar Wright. Edgar and I got on well. Concurrent to that, I had started learning Avid. I started doing offline editing jobs for other directors and had a ball. A chance came along to do a David Beckham documentary, so I took the plunge from being a full-time online editor to taking my chances in the freelance world. On the tail end of the documentary, I got a call from Edgar, offering me the gig to be the offline editor for the second season of Spaced, because Chris Dickens (Hot Fuzz, Berberian Sound Studio, Slumdog Millionaire) wasn’t available to complete the edit. And that was really jumping into the deep end. It was fantastic to be able to work with Edgar at that level.”

Machliss continues, “Chris came back to work with Edgar on Shaun of the Dead and Hot Fuzz, so over the following years I honed my skills working on a number of British comedies and dramas. After Slumdog Millionaire came out, which Chris cut and for which he won a number of awards, including an Oscar, Chris suddenly found himself very busy, so the rest of us working with Edgar all moved up one in the queue, so to speak. The opportunity to edit Scott Pilgrim came up, so we all threw ourselves into the world of feature films, which was definitely a baptism by fire. We were very lucky to be able to work on a project of that nature during a time where the industry was in a bit of a slump due to the recession. And it’s fantastic that people still remember it and talk about it seven years on. Which brings us to Baby Driver. It’s great when a studio is willing to invest in a film that isn’t a franchise, a sequel, or a reboot.”

Music drives the film

In Baby Driver, Ansel Elgort plays “Baby”, a young kid who is the getaway driver for a gang. A childhood car accident left him with tinnitus, so he listens to music 24/7 to drown out the ringing. Machliss explains, “His whole life becomes regimented to whatever music he is listening to – different music for different moods or occasions. Somehow everything falls magically into sync with whatever he is listening to – when he’s driving, swerving to avoid a car, making a turn – it all seems to happen on the beat. Music drives every single scene. Edgar deliberately chose commercial top-20 tracks from the 1960s up to today. Each song Baby listens to also slyly comments on whatever is happening at the time in the story. Everything is seemingly choreographed to musical rhythms. You’re not looking at a musical, but everything is musically driven.”

Naturally, building a film around popular music brings up a whole host of production issues. Machliss explains how this film had been in the planning stages for years, “Edgar had chosen these tracks years ago. I believe it was in 2011 that Edgar and I tried to sequence the tracks and intersperse them with sound effects. A couple of months later, he did a table read in LA and sent me the sound files. In the Avid, I combined the sound files, songs, and some sound effects to create, effectively, a 100-minute radio play, which was, in fact, the film in audio form. The big thing is that we had to clear every song before we could start filming. Eventually we cleared 30-odd songs for the film. In addition, Edgar worked with his stunt team and editor Evan Schiff in LA to create storyboards and animatics for all of the action scenes.”
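Machliss built this inside the Avid, but the underlying idea – butting songs, temp dialogue, and effects end-to-end into one continuous audio timeline – is easy to picture. Here’s a purely hypothetical sketch of the concept using Python’s pydub library (the filenames are invented; this illustrates the idea, not his actual workflow):

```python
# Hypothetical sketch of a "radio play" pre-visualization:
# concatenate songs, table-read dialogue, and sound effects
# into one continuous audio file. Filenames are invented.

from pydub import AudioSegment  # assumes pydub (and ffmpeg) are installed

playlist = [
    "song_01.wav",
    "table_read_scene_01.wav",
    "sfx_car_chase.wav",
    "song_02.wav",
]

radio_play = AudioSegment.empty()
for clip in playlist:
    radio_play += AudioSegment.from_file(clip)  # butt-cut each piece onto the end

radio_play.export("radio_play_v1.wav", format="wav")
print(f"Total running time: {len(radio_play) / 60000:.1f} minutes")
```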

Editor on the front lines

Unlike on most films, a significant amount of the editing took place on set, with Machliss working from a portable set-up. He says, “Based on our experiences with Scott Pilgrim and World’s End, Edgar decided it would be best to have me on-set during most of the Atlanta shoot for Baby Driver. Even though a cutting room was available, I was in there maybe ten percent of the time. The rest of the time I was on set. I had a trolley with a laptop, monitor, an Avid Mojo, and some hard drives, and I would connect myself via ethernet to the video assist’s hard drive. Effectively I was crew in the front lines with everyone else. Making sure the edit worked was as important as getting a good take in the can. If I assured Edgar that a take would work, then he knew it wasn’t going to come back and cause problems for us six months later. We wanted things to work naturally in camera without a lot of fiddling in post. We didn’t want to have to fall back on frame-cutting and vari-speeding if we didn’t have to. There was a lot of prep work in making sure actions correctly coincided with certain lyrics without the action seeming mechanical.”

The nature of the shoot added to the complexity of the production audio configuration, too. Machliss explains, “Sound-wise, it was very complicated. We had playback going to earwigs in the actors’ ears, Edgar wanted to hear music plus the dialogue in his cans, and then I needed to get a split feed of the audio, since I already had the clean music on my timeline. We shot this mostly on 35mm film. Some days were A-camera only, but usually two cameras running. It was a combination of Panavision, Arricams, and occasionally Arri Alexas. Sometimes there were some stunt shots, which required nine or ten cameras running. Since the action all happened against playback of a track, this allowed me to use Avid’s multicam tools to quickly group shots together. Avid’s AMA tools have really come of age, so I was able to work without needing to ingest anything. I could treat the video assist’s hard drive as my source media, as long as I had the ethernet connection to it. If we were between set-ups, I could get Avid to background-transcode the media, so I’d have my own copy.”

Did all of this on-set editing speed up the rest of the post process? He continues, “All of the on-set editing helped a great deal, because we went into the real post-production phase knowing that all the sequences basically worked. During that time, as I’d fill up a LaCie Rugged drive, I would send that back to the suites. My assistant, Jerry Ramsbottom, would then patiently overcut my edits from the video assist with the actual scanned telecine footage as it came in. We shot from mid-February until mid-May and then returned to England. Jonathan Amos came on board a few weeks into the director’s cut edit and worked on the film with Edgar and myself up until the director’s cut picture lock. He did a pass on some of the action scenes while Edgar and myself concentrated on dialogue and the overall shape of the film. He stayed on board up until the final picture lock and made an incredible contribution to the action and the tension of the film. By the end of the year we’d locked and then we finished the final mix mid-February of this year. But the great thing was to be able to come into the edit and have those sequences ready to go.”

Editing from the set is something many editors try to avoid, feeling that some distance keeps them more objective. Machliss sees it a bit differently, “Some editors don’t like being on set, but I like the openness of it – taking it all in. Because when you are in the edit, you can recall the events of the day a particular scene was shot – ‘I can remember when Kevin Spacey did this thing on the third take, which could be useful’. It’s not vital to work like this, but it does lend itself to a kind of shorthand, which is something Edgar and I have developed over these years anyway. The beauty of it is that Edgar and I will take the time to try every option. You can never hit on the perfect cut the first time. Often you’ll get feedback from screenings, such as ‘we’d like to see more emotion between these characters’. You know what’s available and sometimes four extra shots can make all the difference in how a scene reads without having to re-imagine anything. We did drop some scenes from the final version of the film. Of course, you go ‘that’s a shame’, but at least these scenes were given a chance. However, there are always bits where upon the 200th viewing you can decide, ‘well, that’s completely redundant’ – and it’s easy to drop. You always skate as close to the edge of making a film shorter without doing any damage to it.”

The challenge of sound

During sound post, Baby Driver also presented some unique challenges. Machliss says, “For the sound mix – and even for the shoot – we had to make sure we were working with the final masters of the song recordings to make sure the pitch and duration remained constant throughout. Typically these came in as mono or stereo WAVs. Because music is such an important element to the film, the concept of perceived direction becomes important. Is the music emanating from Baby’s earbuds? What happens to it when the camera moves or he turns his head? We had to work out a language for the perception of sound. This was Edgar’s first film mixed in Dolby Atmos and we were the second film in Goldcrest London’s new Atmos-certified dubbing theater. Then we did a reduction to 7.1 and 5.1. Initially we were thinking this film would have no score other than the songs. Invariably you need something to get from A to B. We called on the services of Steven Price (Gravity, Fury, Suicide Squad), who provided us with some original cues and some musical textures. He did a very clever thing where he would match the end pitch or notes of a commercial song and then by the time he came to the end of his cue, it would match the incoming note or key of the next song. And you never notice the change.”

Working with Avid in a new way

To wrap up the conversation, we talked a bit about how Avid Media Composer figures into his work. Machliss has used numerous other systems, but Media Composer still fits the bill for him today. He says, “For me, the speed of working with AMA in the latest Avid software was a real benefit. I could actually keep up with the speed of the shoot. You don’t want to be the one holding up a crew of 70. I also made good use of background transcoding. On a different project (Fleabag), I was able to work with native 2K Alexa ProRes camera files at full resolution. It was fantastic to be able to use FrameFlex and apply LUTs – doing the cutting, but then bringing back my old skills as an online editor to paint out booms and fix things up. Once we locked, I could remove the LUTs and export DPX files, which went straight to the grading facility. It was exciting to work in a new way.”
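Media Composer handles the LUT step internally, but for readers curious about the underlying concept – monitor through a LUT while cutting, then hand off the untouched flat media for the grade – here’s a minimal, hypothetical sketch using ffmpeg’s lut3d filter driven from Python. The filenames and the .cube LUT are invented; only the filter itself is real:

```python
# A minimal sketch of the "viewing LUT" idea Machliss describes:
# bake a LUT into a monitoring copy while the flat camera-original
# media stays pristine for the grade. This is NOT Avid's internal
# mechanism -- it just illustrates the concept with ffmpeg's
# lut3d filter. Filenames and the .cube LUT are hypothetical.

import subprocess

def make_viewing_copy(flat_src: str, lut_path: str, out_path: str) -> None:
    """Render a LUT-applied viewing copy, leaving the source untouched."""
    subprocess.run([
        "ffmpeg", "-i", flat_src,
        "-vf", f"lut3d={lut_path}",   # apply a 3D LUT (.cube) to the video
        "-c:a", "copy",               # pass the audio through unchanged
        out_path,
    ], check=True)

# The log/flat original stays as-is for the grading facility:
make_viewing_copy("A001_flat.mov", "alexa_logc_to_rec709.cube", "A001_view.mov")
```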

Baby Driver opened at the start of July in the US and is a fun ride. You can certainly enjoy a film like this without knowing the nitty-gritty of the production that went into it. However, after you’ve read this article, you just might need to see it at least twice – once just to enjoy it and once again to study the “invisible art” that’s gone into bringing it to the screen.

(For more with Paul Machliss, check out these interviews at Studio Daily, ProVideoCoalition, and FrameIO.)

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters