NAB Show 2019

This year the NAB Show seemed to emphasize its roots – the “B” in National Association of Broadcasters. Gone or barely visible were the fads of past years, such as stereoscopic 3D, 360-degree video, virtual/augmented reality, drones, etc. Not that these technologies are gone – merely that vendors have refocused on the smaller segment of market share that reflects reality. There’s not much point in promoting stereo 3D at NAB if most of the industry goes ‘meh’.

Big exhibitors of the past, like Quantel, RED, Apple, and Autodesk, are gone from the floor. Quantel products remain as part of Grass Valley (now owned by Belden), which is the consolidation of Grass Valley Group, Quantel, Snell & Wilcox, and Philips. RED decided last year that small, camera-centric shows were better venues. Apple – well, they haven’t been on the main floor for years, but even this year, there was no off-site Final Cut Pro X stealth presence in a hotel suite somewhere. Autodesk, which shifted to a subscription model a couple of years ago, had a demo suite in the nearby Renaissance Hotel, focusing on its hero product, Flame 2020. Smoke for Mac users – tough luck. It’s been over for years.

This was a nuts-and-bolts year, with many exhibits showing new infrastructure products. These appeal to larger customers, such as broadcasters and network facilities. Specifically, the world is shifting to an IP-based infrastructure for signal routing, control, and transmission. This replaces the copper and fiber wiring of the past, along with the devices (routers, video switchers, etc.) at either end of the wire. Companies that might have appeared less relevant, like Grass Valley, are back in a strong sales position. Other companies, like Blackmagic Design, are being encouraged by their larger clients to fulfill those needs. And as ever, consolidation continues – this year Vizrt acquired NewTek, which has been an early player in video-over-IP with its proprietary NDI protocol.

Adobe

The NAB season unofficially started with Adobe’s pre-NAB release of the CC2019 update. For editors and designers, the hallmarks of this update include a new freeform bin view and adjustable guides in Premiere Pro, plus content-aware video fill in After Effects. These are solid additions in response to customer requests, which is something Adobe has focused on. A smaller, but no less important, feature is Adobe’s ongoing effort to improve media performance on the Mac platform.

As in past years, their NAB booth was an opportunity to present these new features in-depth, as well as showcase speakers who use Adobe products for editing, sound, and design. Part of the editing team from the series Atlanta was on hand to discuss the team’s use of Premiere Pro and After Effects in their ‘editing crash pad’.

Avid

For many attendees, NAB actually kicked off on the weekend with Avid Connect, a gathering of Avid users (through the Avid Customer Association), featuring meet-and-greets, workshops, presentations, and ACA leadership committee meetings. While past product announcements at Connect have been subdued from the vantage of Media Composer editors, this year was a major surprise. Avid revealed its Media Composer 2019.5 update (scheduled for release at the end of May). This came as part of a host of other updates. Most of these apply to companies that have invested in the full Avid ecosystem, including Nexis storage and Media Central asset management. While those are superb, they only apply to a small percentage of the market. Let’s not forget Avid’s huge presence in the audio world, thanks to the dominance of Pro Tools – now with Dolby Atmos support. With the acquisition of Euphonix years back, Avid has become a significant player in the live and studio sound arena. Various examples of its S-series consoles in action were presented.

Since I focus on editing, let me discuss Media Composer a bit more. The 2019.5 refresh is the first major Media Composer overhaul in years. It started in secret last year. 2019.5 is the first iteration of the new UI, with more to be updated in coming releases. In short, the interface has been modernized and streamlined in ways to attract newer, younger users, without alienating established editors. Its panel design is similar to Adobe’s approach – i.e. interface panels can be docked, floated, stacked, or tabbed. Panels that you don’t want to see may be closed or simply slid to the side and hidden. Need to see a hidden panel again? Simply slide it back open from the edge of the screen.

This isn’t just a new skin. Avid has overhauled the internal video pipeline, with 32-bit floating point color and an uncompressed DNx codec. Project formats now support up to 16K. Avid is also compliant with the specs of the Netflix Post Alliance and the ACES logo program.

I found the new version very easy to use and a welcome change; however, it will require some adaptation if you’ve been using Media Composer for a long time. In a nod to the Media Composer heritage, the weightlifter (aka ‘liftman’) and scissors icons (for lift and extract edits) are back. Even though Media Composer 2019.5 is just in early beta testing, Avid felt good enough about it to use this version in its workshops, presentations, and stage demos.

One of the reasons to go to NAB is for the in-person presentations by top editors about their real-world experiences. No one can top Avid at this game, since it can easily tap a host of Oscar, Emmy, BAFTA, and Eddie award winners. The hallmark for many this year was the presentation at Avid Connect and/or at the show by the Oscar-winning picture and sound editing/mixing team for Bohemian Rhapsody. It’s hard not to gather a standing-room-only crowd when you close your talk with the Live Aid finale sequence played in kick-ass surround!

Blackmagic Design

Attendees and worldwide observers have come to expect a surprise NAB product announcement out of Grant Petty each year and he certainly didn’t disappoint this time. Before I get into that, there were quite a few products released, including for IP infrastructures, 8K production and post, and more. Blackmagic is a full-spectrum video and audio manufacturer that long ago moved into the ‘big leagues’. This means that just like Avid or Grass Valley, they have to respond to pressure from large users to develop products designed around their specific workflow needs. In the BMD booth, many of those development fruits were on display, like the new HyperDeck Extreme 8K HDR recorder and the ATEM Constellation 8K switcher.

The big reveal for editors was DaVinci Resolve 16. Blackmagic has steadily been moving into the editorial space with this all-in-one, edit/color/mix/effects/finishing application. If you have no business requirement for – or emotional attachment to – one of the other NLE brands, then Resolve (free) or Resolve Studio (paid) is an absolute no-brainer. Nothing can touch the combined power of Resolve’s feature set.

New for Resolve 16 is an additional editorial module called the Cut Page. At first blush, the design, layout, and operation are amazingly similar to Apple’s Final Cut Pro X. Blackmagic’s intent is to make a fast editor where you can start and end your project for a time-sensitive turnaround without the complexities of the Edit Page. However, it’s just another tool, so you could work entirely in the Cut Page, or start in the Cut Page and refine your timeline in the Edit Page, or skip the Cut Page altogether. Resolve offers a buffet of post tools that are at your disposal.

While Resolve 16’s Cut Page does elicit a chuckle from experienced FCPX users, it offers some new twists. For example, there’s a two-level timeline view – the top section is the full-length timeline and the bottom section is the zoomed-in detail view. The intent is quick navigation without the need to constantly zoom in and out of long timelines. There’s also an automatic sync detection function. Let’s say you are cutting a two-camera show. Drop the A-camera clips onto the timeline and then go through your B-camera footage. Find a cut-away shot, mark in/out on the source, and edit. It will ‘automagically’ edit to the in-sync location on the timeline. I presume this is matched by either common sound or timecode. I’ll have to see how this works in practice, but it demos nicely. Changes to other aspects of Resolve were minor and evolutionary, except for one other notable feature: the Color Page added its own version of content-aware video fill.
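
On the sync-detection point, here’s a toy Python sketch of how timecode-based matching might work. This is purely my own illustration of the guess above, not Blackmagic’s documented method, and it assumes jam-synced, non-drop-frame timecode at 24fps:

```python
# Toy sketch of timecode-based auto-sync. This is speculation about the
# feature's behavior, not Blackmagic's documented implementation.
def tc_to_frames(tc: str, fps: int = 24) -> int:
    """Convert HH:MM:SS:FF non-drop-frame timecode to a frame count."""
    h, m, s, f = (int(part) for part in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def sync_position(timeline_start_tc: str, source_in_tc: str, fps: int = 24) -> int:
    """Timeline frame where a B-camera source mark lands in sync,
    assuming both cameras were jam-synced to the same timecode."""
    return tc_to_frames(source_in_tc, fps) - tc_to_frames(timeline_start_tc, fps)

# A B-cam cutaway marked at 01:02:10:12 against a timeline starting at 01:00:00:00
print(sync_position("01:00:00:00", "01:02:10:12"))  # 3132 frames into the timeline
```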

Another editorial product addition – tied to the theme of faster, more-efficient editing – was a new edit keyboard. Anyone who ever cut in the linear days – especially those who ran Sony BVE9000/9100 controllers – will feel very nostalgic. It’s a robust keyboard with a high-quality, integrated jog/shuttle knob. The feel is very much like controlling a tape deck in a linear system, with fast shuttle response and precise jogging. The precision is far better than that of any of the USB controllers, like a Contour Shuttle. Whether enough people will be interested in shelling out $1,025 for it remains to be seen. It’s a great tool, but are you really faster with one than with FCPX’s skimming and a standard keyboard and mouse?

Ironically, if you look around the Blackmagic Design booth, there does seem to be a nostalgic homage to Sony hardware of the past. As I said, the edit keyboard is very close to a BVE9100 keyboard. Even the style of the control panel on the HyperDecks – and the look of the name badges on those panels – is very much Sony’s style. As humans, this appeals to our desire for something other than the glass interfaces we’ve been dealing with for the past few years. Michael Cioni (Panavision, Light Iron) coined the term ‘tactile attraction’ for this in his excellent Faster Together Stage talk. It manifests itself not only in these types of control surfaces, but also in skeuomorphic designs applied to audio filter interfaces. Or in the emotion created in the viewer when a colorist adds film grain to digital footage.

Maybe Grant is right and these methods are really faster in a pressure-filled production environment. Or maybe this is simply an effort to appeal to emotion and nostalgia by Blackmagic’s designers. (Check out Grant Petty’s two-hour 2019 Product Overview for more in-depth information on Blackmagic Design’s new products.)

8K

I won’t spill a lot of words on 8K. It seems kind of silly when most delivery is HD and even SD in some places. A lot of today’s production is in 4K, but really only for future-proofing. But the industry has to sell newer and flashier items, so it has moved on to 8K pixel resolution (7680 x 4320). Much of this is driven by Japanese broadcasters and manufacturers, who are pushing into 8K. You can laugh or roll your eyes, but NAB had many examples of 8K production tools (cameras and recorders) and display systems. Of course, it’s NAB, making it hard to tell how many of these are only prototypes and not yet ready for actual production and delivery.

For now, it’s still a 4K game, with plenty of mainstream product. Not only cameras and NLEs, but items like AJA’s Ki Pro family. The Ki Pro Ultra Plus records up to four channels of HD or one channel of 4K in ProRes or DNx. The newest member of the family is the Ki Pro GO, which records up to four channels of HD (25Mbps H.264) onto removable USB media.

Of course, the industry never stops, so while we are working with HD and 4K, and looking at 8K, the developers are planning ahead for 16K. As I mentioned, Avid already has project presets built-in for 16K projects. Yikes!

HDR

HDR – or high dynamic range – is about where it was last year. There are basically four formats vying to become the final standard used in all production, post, and display systems. While there are several frontrunners and edicts from distributors to deliver HDR-compatible masters, there still is no clear path. If you shoot in log or camera raw with nearly any professional camera produced within the past decade, you have originated footage that is HDR-compatible. But none of the low-cost post solutions make this easy. Without the right monitoring environment, you are wasting your time. If anything, those waters are muddier this year. There were a number of HDR displays throughout the show, but there were also a few labeled as using HDR simulation. I saw a couple of those at TV Logic. Yes, they looked gorgeous and yes, they were receiving an HDR signal. I found out that the ‘simulation’ part of the description meant that the display was bright (up to 350 nits), but not bright enough to qualify as ‘true’ HDR (1,000 nits or higher).

As in past transitions, we are certainly going to have to rely on some ‘glue’ products. For me, that’s AJA again. Through their relationship with Colorfront, AJA offers two HDR products: the HDR Image Analyzer and the FS-HDR converter. The latter was introduced last year as a real-time frame synchronizer and color converter to go between SDR and HDR display standards. The new Analyzer is designed to evaluate color space and gamut compliance. Just remember, no computer display can properly show you HDR, so if you need to post and deliver HDR, proper monitoring and analysis tools are essential.

Cameras

I’m not a cinematographer, but I do keep up with cameras. Nearly all of this year’s camera developments were evolutionary: new LF (large format sensor) cameras (ARRI), 4K camcorders (Sharp, JVC), and a full-frame mirrorless camera from Nikon (with ProRes RAW recording coming in a future firmware update). Most of the developments were targeted towards live broadcast production, like sports and megachurches. Ikegami had an 8K camera to show, but their real focus was on 4K and IP camera control.

RED, a big player in the cinema space, was only there in a smaller demo room, so you couldn’t easily compare their 8K imagery against others on the floor, but let’s not forget Sony and Panasonic. While ARRI has been a favorite, due to the ‘look’ of the Alexa, Sony (Venice) and Panasonic (Varicam and now EVA-1) are also well-respected digital cinema tools that create outstanding images. For example, Sony’s booth featured an amazing, theater-sized, LED 8K micro-pixel display system. Some of the sample material shown was of the Rio Carnival, shot with anamorphic lenses on a 6K full-frame Sony Venice camera. Simply stunning.

Finally, let’s not forget Canon’s line-up of cinema cameras, from the C100 to the C700FF. To complement these, Canon introduced their new line of Sumire Prime lenses at the show. The C300 has been a staple of documentary films, including the Oscar-winning film, Free Solo, which I had the pleasure of watching on the flight to Las Vegas. Sweaty palms the whole way. It must have looked awesome in IMAX!

(For more on RED, cameras, and lenses at NAB, check out this thread from DP Phil Holland.)

It’s a wrap

In short, NAB 2019 had plenty for everyone. This also included smaller markets, like products for education seminars. One of these that I ran across was Cinamaker. They were demonstrating a complete multi-camera set-up using four iPhones and an iPad. The iPhones are the cameras (additional iPhones can be used as isolated sound recorders) and the iPad is the ‘switcher/control room’. The set-up can be wired or wireless, but camera control, video switching, and recording are all done on the iPad. This can generate the final product, or the project can be transferred to a Mac (with the line cut and camera iso media, plus an edit list) for re-editing/refinement in Final Cut Pro X. Not too shabby, given the market that Cinamaker is striving to address.

For those of us who like to use the NAB Show exhibit floor as a miniature yardstick for the industry, one of the trends to watch is what type of gear is used in the booths and press areas – specifically, one NLE over another, or one hardware platform versus another. On that front, I saw plenty of Premiere Pro, along with some Final Cut Pro X. Hardware-wise, it looked like Apple versus HP. Granted, PC vendors, like HP, often supply gear to use in the booths as a form of sponsorship, so take this with a grain of salt. Nevertheless, I would guess that I saw more iMac Pros than any other single computer. For PCs, it was a mix of HP Z4, Z6, and Z8 workstations. HP and AMD were partner-sponsors of Avid Connect and they demoed very compelling set-ups with these Z-series units configured with AMD Radeon cards. These are very powerful workstations for editing, grading, mixing, and graphics.

©2019 Oliver Peters

Are you ready for a custom PC?

Why would an editor, colorist, or animator purchase a workstation from a custom PC builder, instead of one of the brand name manufacturers? Puget Systems, a PC supplier in Washington state, loaned me a workstation to delve into this question. They pride themselves on assembling systems tailor-made for creative users. Not all component choices are equal, so Puget tests the same creative applications we use every day in order to optimize their systems. For instance, Premiere Pro benefits from more CPU cores, whereas with After Effects, faster core speeds are more important than the core count.

Puget Systems also offers a unique warranty. It’s one year on parts, but lifetime free labor. This means free tech and repair support for as long as you own the unit. Even better, it also includes free labor to install hardware upgrades at their facility at any point in the future – you only pay for parts and shipping.

Built for editing

The experience starts with a consultation, followed by progress reports, test results, and photos of your system during and after assembly. These include thermal scans showing your system under load. Puget’s phone advisers can recommend a system designed specifically for your needs, whether that’s CAD, gaming, After Effects, or editing. My target was Premiere Pro and Resolve with a bit of After Effects. I needed it to be capable of dealing with 4K media using native codecs (no transcodes or proxies). 

Puget’s configuration included an eight-core Intel i9 3.6GHz CPU, 64GB RAM, and an MSI GeForce RTX 2080 Ti Ventus GPU (11GB). We put in two Samsung SSDs (a Samsung 860 Pro for OS/applications, plus a faster Samsung 970 Pro M.2 NVMe for cache) and a Western Digital Ultrastar 6TB SATA3 spinning drive for media. This PC has tons of connectivity, with ports for video displays, Thunderbolt 3, USB-C, and USB 3. The rest was typical for any PC: sound card, ethernet, wifi, DVD-RW, etc. This unit without a display costs slightly over $5K USD, including shipping and a Windows 10 license. That price is in line with (or cheaper than) any other robust, high-performance workstation.

The three drives in this system deliver different speeds and are intended for different purposes. The fastest of these is the “D” drive, which is a blazingly fast NVMe drive that is mounted directly onto the motherboard. This one is intended for use with material requiring frequent and fast read/write cycles. So it’s ideal for Adobe’s cache files and previews. While you wouldn’t store the media for a large Premiere Pro project on it, it would be well-suited for complex After Effects jobs, which typically only deal with a smaller amount of media. While the 6TB HGST “E” drive dealt well with the 4K media for my test projects, in actual practice you would likely add more drives and build up an internal RAID, or connect to a fast external array or NAS.

If we follow Steve Jobs’ analogy that PCs are like trucks, then this is the Ford F-350 of workstations. The unit is a tad bigger and heavier than an older Mac Pro tower. It’s built into an all-metal Fractal Design case with sound dampening and efficient cooling, resulting in the quietest workstation I’ve ever used – even the few times when the fans revved up. There’s plenty of internal space for future expansion, such as additional hard drives, GPUs, I/O cards, etc.

For anyone fretting about a shift from macOS to Windows, setting up this system couldn’t have been simpler. Puget installs a professional build of Windows 10 without all of the junk software most PC makers put there. After connecting my devices, I was up and running in less than an hour, including software installation for Adobe CC, Resolve, Chrome, MacDrive, etc. That’s a very ‘Apple-like’ experience and something you won’t get if you build your own PC.

The proof is in the pudding

Professional users want hardware and software to fade away so they can fluidly concentrate on the creative process. I was working with 4K media and mixed codecs in Premiere Pro, After Effects, and Resolve. The Puget PC more than lived up to its reputation. It was quiet, media handling was smooth, and Premiere and Resolve timelines could play without hiccups. In short, you can stay in the zone without the system creating distractions.

I don’t work as often with RED camera raw files; however, I did load up original footage from an indie film onto the fastest SSD. This was 4K REDCODE media in a 4K timeline in Premiere Pro. Adobe gives you access to the raw settings, in addition to Premiere’s Lumetri color correction controls. The playback was smooth as silk at full timeline resolution. Even adding Lumetri creative LUTs, dissolves, and slow motion with optical flow processing did not impede real-time playback at full resolution. No dropped frames! Nvidia and RED Digital Camera have been working closely together lately, so if your future includes work with 6K/8K RED media, then a system like this requires serious consideration.

The second concern is rendering and exporting. The RTX 2080 Ti offers CUDA processing, Nvidia’s proprietary GPU acceleration technology. So, how fast is the system? There are many variables, of course, such as scaling, filters, color correction, and codecs. When I tested the export of a single 4K Alexa clip from a 1080p Premiere Pro timeline, the export times were nearly the same between this PC and an eight-core 2013 Mac Pro. But you can’t tell much from such a simple test.

To push Premiere Pro, I used a nine-minute 1080p travelogue episode containing mostly 4K camera files. I compared export times for ProRes (new on Windows with Adobe CC apps) and Avid DNx between this PC and the Mac Pro (through Adobe Media Encoder). ProRes exports were faster than DNxHD, and the PC exports were faster than those on the Mac, although comparative times tended to be within a minute of each other. The picture was different when comparing H.264 exports using the Vimeo Full HD preset. In that test, the PC export was approximately 75% faster.

The biggest performance improvements were demonstrated in After Effects and Resolve. I used Puget Systems’ After Effects Benchmark, which includes a series of compositions that test effects, tracking, keys, caustics, 3D text, and more (based on Video Copilot’s tutorials). The Puget PC trounced the Mac Pro in this test. The PC scored a total of 969.5 points versus the Mac’s 535 out of a possible maximum score of 1,000. Resolve was even more dramatic with the graded nine-minute-long sequence sent from Premiere Pro. Export times bested the Mac Pro by more than 2.5x for DNxHD and 6x for H.264.

Aside from these benchmark tests, I also created a “witches brew” After Effects composition of my own. This one contains ten layers of 4K media in a one-minute-long 6K composition. The background layer was blown up and defocused, while all other layers were scaled down and enhanced with a lot of color and Cycore stylized effects. A 3D camera was added to create a group move for the layers. In addition, I was working from the slower drives and not the fast SSDs on either machine. Needless to say this one totally bogs any system down. The Mac Pro rendered a 1080 ProRes file in about 54 minutes, whereas the PC took 42 minutes. Not the same 2-to-1 advantage as in the benchmarks; however, that’s likely due to the fact that I heavily weighted the composition with the Cycore effects. These are not particularly efficient and probably introduce some bottlenecks in After Effects’ processing. Nevertheless, the Puget Systems PC still maintained a decided advantage.

Conclusion

Mac vs. PC comparisons are inevitable when discussing creative workstations. Ultimately it comes down to preference – the OS, the ecosystem, and hardware options. But if you want the widest selection of performance hardware and want to preserve future expandability, then a custom-built PC is currently the best solution. For straightforward editing, both platforms will generally serve you well, but there are times when a top-of-the-line PC simply leaves any Mac in the dust. If you need to push performance in After Effects or Resolve, then Windows-based solutions offer the edge today. Custom systems, like those from Puget Systems, are designed with our needs in mind. That’s something you don’t necessarily get from a mainline PC maker. This workstation is a future-proof, no-compromise system that makes the switch from Mac to PC an easy and graceful transition – and with power to spare.

Originally written for RedShark News.

©2019 Oliver Peters

Edit Collaboration and Best Practices

There are many workflows that involve collaboration, with multiple editors and designers working on the same large project or group of projects. Let me say up front that if you want the best possible collaborative experience with multiple editors, then work with Avid Media Composer. Full stop. I have worked both sides of the equation and without a doubt, Media Composer connected to Avid Unity/Isis/Nexis shared storage is simply not matched by Final Cut Pro, Final Cut Pro X, Premiere Pro, or any other editing software/storage/cloud combination. Everything else is a compromise, which is why feature film and TV series editorial teams continue to select Avid solutions as their first choice.

In spite of that, there are many reasons to use other editing tools. I work most of the time in Adobe Premiere Pro CC and freelance at a shop with nine edit workstations connected to shared storage. We work mainly in Adobe Creative Cloud applications and our projects involve a lot of collaboration. Some of these are corporate videos that are frequently edited and revised by different editors. Some are entertainment shows, cut by a small editorial team focused on those shows. For some projects, Premiere Pro is the perfect tool. For others, we have to develop strategies to adapt Premiere to our workflow.

With that in mind, the following tips and best practices are what have worked best for us over the past three years, while working on large projects with a team of editors. Although they come from our work with Premiere Pro, the same would generally be true if we were working with Apple Final Cut Pro X instead.

Organization. We organize all projects into a specific folder structure, using a Post Haste template. All media files, like camera footage, audio, graphic elements, etc., go into common folders. Editors know where to look to find things. When new camera footage comes in, files are organized as “dailies” into specific folders by date, camera, and camera card. Non-pro formats, like GoPro and DSLR footage, will be batch-renamed to reflect the project, date, and camera card. The objective is to have unique file names for each and every media file.
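
To illustrate that ingest step, here’s a minimal Python sketch of batch-renaming dailies into unique, sortable file names. The naming pattern, folder layout, and .MP4 extension are placeholder assumptions of mine, not our actual template:

```python
import shutil
from pathlib import Path

def ingest_card(card_dir: str, project: str, date: str, cam: str, card: str) -> None:
    """Copy clips from one camera card into the dailies tree, renaming
    each file to a unique PROJECT_DATE_CAM_CARD_#### pattern."""
    dest = Path("dailies") / date / cam / card
    dest.mkdir(parents=True, exist_ok=True)
    for i, clip in enumerate(sorted(Path(card_dir).glob("*.MP4")), start=1):
        new_name = f"{project}_{date}_{cam}_{card}_{i:04d}{clip.suffix}"
        shutil.copy2(clip, dest / new_name)  # copy - never move camera originals

# e.g. a GoPro B-camera card from an April 8th shoot day
ingest_card("/Volumes/CARD01/DCIM/100GOPRO", "ACME", "20190408", "B", "C01")
```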

Optimized, transcoded, or proxy media. Depending on the performance and amount of media, you may need to do some prep work before even starting the edit process. Premiere and FCPX work well with some media formats and not with others. NAS/SAN storage is particularly taxing, especially once you get to resolutions greater than HD. If you want the most fluid experience in a shared workflow, then you will likely need to transcode proxy files from within the application. The reason to stay inside of FCPX or Premiere Pro is so that frame size offsets are properly tracked. Once proxies have been transcoded, it’s a simple matter of toggling between the proxy media (best playback performance) and full-resolution media (best image quality).

On the other hand, if you’d rather stick to full-resolution, native media, then some formats will have to be transcoded into “optimized” media. For instance, GoPro 4K footage is terrible to edit with natively. It should always be transcoded to ProRes or DNxHD before editing, if you don’t want to go the proxy route. This can be done inside or outside of the application and is an easy task with DaVinci Resolve, EditReady, Adobe Media Encoder, or Apple Compressor.
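
That transcode can also be scripted for a whole folder at a time. Here’s a rough sketch that drives ffmpeg from Python; ffmpeg is a command-line stand-in for the GUI tools named above, and the paths and ProRes 422 profile are illustrative assumptions:

```python
import subprocess
from pathlib import Path

SOURCE = Path("dailies/20190408/B")  # hypothetical folder of GoPro clips
DEST = Path("optimized")
DEST.mkdir(exist_ok=True)

for clip in sorted(SOURCE.glob("*.MP4")):
    out = DEST / (clip.stem + ".mov")
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-c:v", "prores_ks", "-profile:v", "2",  # ProRes 422
        "-c:a", "pcm_s16le",                     # uncompressed PCM audio
        str(out),
    ], check=True)  # stop if any transcode fails
```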

Finally, if you have image sequences from a drone or other source, forget trying to edit from these off of a network. Transcode them right away into some format of master movie file. I find Resolve to be the best tool for this. It’s fast and since these are often camera raw files, you can apply a base grade to them as a starting point for future color correction.

Break up your projects. Depending on the type and size of the job and the number of editors working on it, you may choose to work in multiple Premiere projects. There may be a master project where all media is imported and initially organized. Then there may be multiple projects that are offshoots from this for component parts. In a corporate environment, it could be several different videos cut from a single, larger set of media. In a feature film, there could be a different Premiere project for each reel of the film.

Since Premiere Pro employs project locking, any project opened by one editor can also be opened in a read-only mode by other editors. Editors can have multiple Premiere projects open at one time. Thus, it’s simple to bring in elements from one project into another, even while they are all open. This workflow mimics Avid’s bin-locking strategy.

It helps to keep project files streamlined as progress on the production extends over time. You want to keep the number of sequences in any given project small. Periodically duplicate your project(s), strip out old sequences from the current project, and archive the older project files.

As a general note, while working to build the creative story edits – i.e. “offline editing” – you will want to keep plug-in filter effects to a minimum. In fact, it’s generally a good idea to keep the plug-in selection on each system small, so that all workstations in this shared environment are able to have the same set of installed plug-ins. The same is true of fonts.

Finishing stages of post. There are generally two paths in the finishing, aka “online editing” stage. Either all final color correction and assembly of effects is completed within Premiere Pro, or there is a roundtrip through a color correction application, like Blackmagic Design DaVinci Resolve. The same holds true for audio, where a separate sound editor/designer/mixer may handle the finishing touches in Avid Pro Tools.

To accomplish an easy roundtrip with Resolve, create a sequence with all color correction and effects removed. Flatten the video to a single track (if possible), and remove the audio or do a simple stereo mixdown for reference. Ideally, media with mixed frame rates should be addressed as slow motion in the edited sequence. Avoid modifying the frame rate through any sort of “interpret” function within the application. Export an XML or AAF and send that and the associated media to Resolve. When color correction is complete, you can render the entire timeline at the sequence resolution as a single master file.

Conversely, if you want to send it back to Premiere Pro for final assembly and to complete the roundtrip, then render individual clips at their source resolution with handles of one to two seconds. Back in Premiere, re-apply titles, insert completed visual effects, and add any missing plug-in effects.

With audio post, there will be no roundtrip of elements, since the mixer will deliver a completed mixed stereo or surround track. This should be imported into Premiere (or Resolve if the final master is created in Resolve) and married back to the final video sequence. The mixer should also supply “stems” – the individual dialogue, music, and sound effects (D/M/E) submix tracks.

Mastering. Final sequences should be exported in a master file format (ProRes, DNxHD/HR, uncompressed) in at least two forms: 1) master with final mix and titles, and 2) textless submaster with split-track audio (multiple channels containing the D/M/E stems). All of these files are stored within the same job-based folder structure outlined at the top. It is quite common that future revisions will be made using the textless submaster rather than re-opening the full project, or that it may be used as source material in another edit.

Another aspect of finishing the project is media consolidation. This means taking the final sequence and generating a new project file from it. That file contains only the elements used in the sequence, along with a copy of the media, where each file has been trimmed to the portion within the sequence (plus handles). This is the Project Manager function in Premiere Pro. Unfortunately, Premiere is not consistently good at this task. Some formats will be properly trimmed, while others will be copied in their entirety. That’s OK for a :10 take, but a bummer when it’s a 30-minute interview.
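
Conceptually, the trimming logic is simple, which is what makes Premiere’s inconsistency frustrating. A sketch of the idea, with illustrative frame numbers and a one-second handle:

```python
HANDLE = 24  # one second of handle at 24fps, on each side

def consolidated_range(used_in: int, used_out: int, src_len: int) -> tuple[int, int]:
    """Frames to copy from a source file: the used range plus handles,
    clamped to the bounds of the original clip."""
    return max(0, used_in - HANDLE), min(src_len, used_out + HANDLE)

# Ten seconds used from a 30-minute interview (43,200 frames at 24fps):
print(consolidated_range(10_000, 10_240, 43_200))  # (9976, 10264)
```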

The good news is that if you went through the Resolve roundtrip workflow and rendered individual clips, then effectively Resolve has already done media consolidation as a byproduct. In addition, if your source media is 4K, but you only finished in HD, the Resolve renders will be 4K. If in the future, you need to deliver the same master in 4K, everything is already set. Of course, that assumes that you didn’t do a lot of “punching in” and reframing in your edit sequence.

Cloud-based services. Often collaboration requires a distributed team, when not everyone is under one roof. While Adobe does offer cloud-based team editing methods, this doesn’t really work when editors are on different Creative Cloud accounts or when the collaboration is between an editor and a graphic designer/animator/VFX artist working in non-Adobe tools. In that case, the old standbys have been Dropbox, Box, or Google Drive. Syncing is easy and relatively reliable; however, these are really just designed for sharing assets. When only a couple of editors are involved and each has a local, mirrored set of media, then simple sharing/syncing of the small project files alone makes for a working collaborative method.

Frame.io is the newbie here, with updated extension tools designed for in-application workspace panels within Final Cut Pro X, After Effects, and Premiere Pro. While they tout the ease of moving full-resolution media into their cloud, including camera files, I really wouldn’t recommend doing that. It’s simply not very practical on most projects. But for sharing cuts using a standard review-and-approve workflow, Frame.io definitely hits most of the buttons.

©2018 Oliver Peters

Rams

If you are a fan of the elegant, minimalist design of Apple products, then you have seen the influence of Dieter Rams. The renowned German industrial designer, associated with functional and unobtrusive design, is known for the iconic consumer products he developed for Braun, as well as his Ten Principles for Good Design. Dieter Rams is the subject of Rams, a new documentary film by Gary Hustwit (Helvetica, Objectified, Urbanized).

This has been a labor of love for Hustwit, partially funded through a Kickstarter campaign. In a statement to the website Designboom, Hustwit says, “This film is an opportunity to celebrate a designer whose work continues to impact us and preserve an important piece of design history. I’m also interested in exploring the role that manufactured objects play in our lives and, by extension, the relationship we have with the people who design them. We hope to dig deeper into Rams’ untold story – to try and understand a man of contradictions by design. I want the film to get past the legend of Dieter. I want it to get into his philosophy, process, inspirations, and even his regrets.”

Hustwit has worked on the documentary for the past three years and premiered it in New York at the end of September. The film is currently on the road for a series of international premiere screenings until the end of the year. I recently had a conversation with Kayla Sklar, the young editor who had the opportunity to tackle this as her first feature film.

______________________________________________________

[OP] Please give me a little background about how you got into editing and then became connected with this project.

[KS] I moved to New York in 2014 after college to pursue working in theater administration for non-profit, Off-Broadway theater companies. But at 25, I had sort of a quarter-life crisis and realized that wasn’t what I wanted to do at all. I knew I had to make a career change. I had done some video editing in high school with [Apple] iMovie and in college with [Apple] Final Cut Pro 7 and had enjoyed that. So I enrolled at The Edit Center in Brooklyn. They have an immersive, six-week-long program where you learn the art of editing by working with actual footage from real projects. Indie filmmakers working in documentaries and narrative films, who don’t have a lot of money, can submit their film to The Edit Center. Two are chosen per semester. Twelve to 16 students are given scenes and get to work with the director. The directors give us feedback and at the end, we present a finished rough cut. This process gives us a sense of how to edit.

I knew I could definitely teach myself [Adobe] Premiere Pro, and probably figure out Avid [Media Composer], but I wanted to know if I would even enjoy the process of working with a director. I took the course in 2016 thinking I would pursue narrative films, because it felt the most similar to the world I had come from. But I left the course with an interest in documentary editing. I liked the puzzle-solving aspect of it. It’s where my skillset best aligned.

Afterwards, I took a few assistant editing jobs and eventually started as an assistant editor with Film First, which is owned by Jessica Edwards and Gary Hustwit. That’s how I got connected with Gary. I was assisting on a number of his projects, including working with some of the Rams footage and doing a few rough assemblies for him. Then last year he asked me to be the editor of the film. So I started shifting my focus exclusively to Rams at the beginning of this year. Gary has been working on it since 2015 – shooting on and off for three years. It just premiered in late September, but we even shot some pick-ups in Germany as recently as late August / early September.

[OP] So you were working solidly on the film for about nine months. At what point did you lock the cut?

[KS] (laugh) Even now we’re still tinkering. We get more feedback from the screenings and are learning what things are working and what aren’t. The story was locked four days before the New York premiere, but we’re still making small changes.

[OP] Documentary editing can encompass a variety of structures – narrator-driven, a single subject, a collection of interviewees, etc. What approach did you take with Rams?

[KS] Most of the film is in Dieter Rams’ own words. Gary’s other films have a huge cast of characters. But Gary wanted to make this film different from that and more streamlined. His original concept was that it was going to be Dieter as the only interview footage and you might meet other characters in the verité. But Gary realized that wasn’t going to work, simply because Dieter is a very humble man and he wasn’t really talking about his impact on design. We knew that we needed to give the film a larger context. We needed to bring in other people to tell how influential he has been.

[OP] Obviously a documentary like this has no narrative script to follow. Understanding the interview subject’s answers is critical for the editor in order to build the story arc. I understand that much of the film is in a foreign language. So what was your workflow to edit the film?

[KS] Right. So, the vast majority of the film is in German and a little bit in Japanese, both with subtitles. Maybe 25% is in English, but we’re creating it primarily with an English-speaking audience in mind. I know pretty much no German, except words from Sound of Music and Cabaret. We had a great team of translators on this project, with German transcripts broken down by paragraph and translated into English. I had a two-column set-up with German on one side and English on the other. Before I joined the project, there was an assistant who input titles directly into Premiere – putting subtitles over the dailies with the legacy titler. That was the only way I would be able to even get a rough assembly or ‘radio edit’ of what we wanted.

When you edit an English-speaking documentary, you often splice together two parts of a longer sentence to form a complete and concise thought. But German grammar is really complicated. I don’t think I really grasped how much I was taking on when I first started tackling the project. So I would build a sentence that was pretty close from the transcripts. Thank God for Google Translate, because I would put in my constructed sentence and hope that it spit out something pretty close to what we were going for. And that’s how we did the first rough cut.

Then we had an incredible woman, Katharina Kruse-Ramey, come in. She is a native German speaker living here in New York. She came in for a full eight or nine hours and picked through the edit with a fine-tooth comb. For instance, “You can’t use this verb tense with this noun.” That sort of thing. She was hugely helpful and this film couldn’t have happened without Katharina. We knew then that a German speaker could watch this film and it would make sense! We also had another native German speaker, Eugen Braeunig, who was our archival researcher. He was great for the last-minute pick-ups that were shot, when we couldn’t go through the longer workflow.

[OP] I presume you received notes and comments back from Dieter Rams on the cut. What has his response been?

[KS] The film premiered at the Milano Design Film Festival a few weeks ago and Dieter came to that. It was his first time seeing the finished product. From what I’ve heard, he really liked it! As much as one can like seeing themselves on a large screen, I suppose. We had sent him a rough cut a few months ago and in true analytical fashion, the notes that we got back from him were just very specific technical details about dates and products and not about overall storytelling. He really was quite willing to give Gary complete control over the filmmaking process. There was a lot of trust between the two of them.

[OP] Did you cut the film to temp music from the beginning or add music later? I understand that the prolific electronic musician and composer, Brian Eno (The Lego Batman Movie, T2 Trainspotting, The Simpsons), created the soundtrack. What was that like?

[KS] The structure of this film has more breathing room than a lot of docs might have. We really thought about the fact that we needed to give viewers a break from reading subtitles. We didn’t want to go more than ten minutes of reading at a time. So we purposely built in moments for the audience to digest and reflect on all that information. And that’s where Brian’s music was hugely important for us.

We actually didn’t start really editing the film until we had gotten the music back from Brian. I’ve been told that he doesn’t ever score to picture. We sent him some raw footage and he came back with about 16 songs that were inspired by the footage. When you have that gorgeous Brian Eno music, you know that you’re going to have moments where you can just sit back and enjoy the sheer beauty of the moment. Once we had the music in, everything just clicked into place.

[OP] The editor is integral to creating the story structure of a documentary, more so than narrative films – almost as if they are another writer. Tell me a bit about the structure for Rams.

[KS] This film is really not structured the way you would probably structure a normal doc. As I said earlier, we very purposefully put reading breaks in, either through English scenes or with Eno’s music. We had no interest in telling this story linearly. We jump back and forth. One plot line is the chronology of Dieter’s career. Then there’s this other, perhaps more important story, which is Dieter today.  His thoughts on the current state of design and the world. He’s still very active in giving talks and lectures. There’s a company called Vitsoe that makes a lot of his products and he travels to London to give input on their designs. That was the second half of the story and those are interspersed.

[OP] I presume you went outside for finishing services – sound, color correction, and so on. But did the subtitles take on any extra complexity, since they were such an important visual element?

[KS] There are three components to the post. We did an audio mix at one post house; there was a color correction pass at another; and we also had an animation studio – Trollbäck – working with us. There is a section in the film that we knew had to be visually very different and had to convey information in a different way than we had done in any other part of the film. So we gave Trollbäck that five-minute-long sequence. And they also did our opening titles.

We had thought about a stylistic treatment for the subtitles. There were two fonts that Trollbäck had used in their animation. Our initial intent was to use those in our subtitles. We did use one of those treatments in our titles and product credits. For the subtitles, we spent days trying out different looks. Are we going to shadow it or are we using outlines? What point size? What’s the kerning on it? There was going to be so much reading that we knew we had to do the titles thoughtfully. At the end of the day, we knew Helvetica was going to be the easiest (laugh)! We had tried the outline, but some of the internal space in the letters, like an ‘o’ or an ‘e’, looked closed off. We ended up going with a drop shadow. Dieter’s home is almost completely white, so there’s a lot of white space in the film. We used shadows, which looked a little softer, but still quite readable. Those were all built in Premiere’s legacy title tool.

[OP] You are in New York, which is a big Avid Media Composer town. So what was the thought process in deciding to cut this film in Adobe Premiere Pro?

[KS] When I came on board, the project was already in Premiere. At that point I had been using Avid quite a lot since leaving The Edit Center, which teaches their editing course on Avid. I had taught myself Premiere and I might have tried to transfer the project to Avid, but there was already so much done in terms of the dailies with the subtitles. The thought of redoing maybe 50 hours’ worth of manual subtitling that wouldn’t migrate over correctly just seemed like a total nightmare. And I was happy to use Premiere. Had I started the project from scratch, I might have used Avid, because it’s the tool that I felt fastest on. Premiere was perfectly fine for the film that we were doing. Plus, if there were days when Gary wanted to tinker around in the project and look at things, he’s much more familiar with Premiere than he is with Avid. He also knows the other Adobe tools, so it made more sense to continue with the same family of creative products that he already knew and used.

Maybe it’s this way with the tool you learn first, but I really like Avid and I feel that I’m faster with it than with Premiere. It’s just the way my brain likes to edit things. But I would be totally happy to edit in Premiere again, if that’s what worked best for a project and what the director wanted. It was great that we didn’t have to transcode our archival footage, because of how Premiere can handle media. Definitely that was helpful, because we had some mixed frame rates and resolutions.

[OP] A closing question. This is your first feature film, and about such an influential subject. What impact did it have on you?

[KS] Dieter has Ten Principles for Good Design. He built them to talk about product design and as a way for him to judge how a product ideally should be made. I had these principles taped to the wall by my desk. His products are very streamlined, elegant, and clean. The framework should be neutral enough that the products convey their intention without bells and whistles. He wasn’t interested in adding a feature that was unnecessary. I really wanted to evoke those principles with the editing. Had the film been cluttered with extraneous information, or been self-aggrandizing, I think when we revealed the principles to the audience, they would have thought, “Wait a minute, this film isn’t doing that!” We felt that the structure of the film had to serve his principles well, wherever appropriate.

His final principle is ‘Good Design is as Little Design as Possible.’ We joked that ‘Good Filmmaking is as Little Filmmaking as Possible.’ We wanted the audience to be able to draw their own conclusions about Dieter’s work and how that translates into their daily lives. A viewer could walk away knowing what we were trying to accomplish without someone having to tell them what we were trying to accomplish.

There were times when I really didn’t know if I could do it. Being 26 and editing a feature film was daunting. Looking at those principles kept me focused on what the meat of the film’s structure should be. That made me realize how lucky we are to have had a designer who really took the time to think about principles that can be applied to a million different subjects. At one of these screenings, someone came up to us who had become a UI designer for software, in part, because of Dieter. He told us, “I read Dieter’s principles in a book and I realized these can be applied to how people interact with software.” They can be applied to a million different things and we certainly applied them to the edit.

______________________________________________________

Gary Hustwit will tour Rams internationally and in various US cities through December. After that time it will be available in digital form through Film First.

Click here to learn more about Dieter Rams’ Ten Principles for Good Design.

©2018 Oliver Peters

The Old Man & the Gun

Stories of criminal exploits have long captivated the American public. But no story is quirkier than that of Forrest Silva “Woody” Tucker. He was a lifelong bank robber and escape artist who was in and out of prison. His most famous escape came in 1979 from San Quentin State Prison. His last crimes were a series of bank robberies around the Florida retirement community where he lived. He was captured in 2000 and died in prison in 2004 at the age of 83. Apparently good at his job – he stole an estimated four million dollars over his lifetime – Tucker was aided by a set of older partners, dubbed the “Over the Hill Gang”. His success, in part, was because he tended to rob lower-profile local banks and credit unions. While he did carry a gun, it seems he never actually used it in any of the robberies.

The Old Man & the Gun is a semi-fictionalized version of Tucker’s story brought to the screen by filmmaker David Lowery (A Ghost Story, Pete’s Dragon, Ain’t Them Bodies Saints). It stars Robert Redford as Tucker, along with Danny Glover and Tom Waits as his gang. Casey Affleck plays John Hunt, a detective who is on his trail. Sissy Spacek is Jewel, a woman who takes an interest in Tucker. Lowery wrote the script in a romanticized style that is reminiscent of how outlaws of the old west are portrayed. The screenplay is based on a 2003 article in The New Yorker magazine by David Grann, which chronicled Tucker’s real-life exploits.

David Lowery is a multi-talented filmmaker with a string of editing credits. (He was his own editor on A Ghost Story.) But for this film, he decided to leave the editing to Lisa Zeno Churgin, A.C.E. (Dead Man Walking, Pitch Perfect, Cider House Rules, House of Sand and Fog), with whom he had previously collaborated on Pete’s Dragon. I recently had the opportunity to chat with Churgin about working on The Old Man & the Gun.

___________________________________

[OP] Please tell me a bit about your take on the story and how the screenplay’s sequence ultimately translated into the finished film.

[LZC] The basis of Redford’s character is a boy who started out stealing a bicycle, went to reform school when he was 13, and it continued along that way for the rest of his life. Casey Affleck is a cop in the robbery division who takes it as a personal affront when the bank where he was trying to make a deposit was robbed. He makes it his mission to discover who did it, which he does. But because it’s a case that crosses state lines, the case gets taken over by the FBI. Casey’s character then continues the search on his own.  It’s a wonderful cat and mouse game. 

There are three storylines in the film. The story begins when Tucker is leaving the scene of a robbery and pulls over to the side of the road to help Jewel [Sissy Spacek] while evading the police on his trail. Their story provides a bit of a love interest.  The second storyline is that of the “Over the Hill Gang”. And the third storyline is the one between Tucker and Hunt. It’s not a particularly linear story, so we were always balancing these three storylines. Whenever it started to feel like we’d been away too long from a particular storyline and set of characters, it was time to switch gears.

Although David wrote the script, he wasn’t particularly overprotective of it. As in most films, we experimented a lot, moving scenes around to make those three main stories find their proper place. David dressed Redford in the same blue suit for the entire movie with occasional shirt or tie changes. This made it easier to shift things than when you have costume constraints. Often scenes ended up back where they started, but a lot of times they didn’t – just trying to find the right balance of those three stories. We had absolute freedom to experiment, and because David is a writer, director, and an editor in his own right, he really understands and appreciates the process.

The nature of this film was so unique, because it is of another time and place [the 1980s], but still modern in its own way. I also see it partly as an homage to Bob [Redford], because this is possibly his last starring role. Shooting on 16mm film certainly lends itself to another time and place. The score is a jazz score. That jazz motif places it in time, but also keeps it contemporary. As an aside, a nice touch is when Casey visits Redford in the hospital and he does a little ‘nose salute’ from The Sting, which was Casey’s idea.

[OP] On some films the editor is on location, keeping up to camera with the cut. On others, the editing team stays at a home base. For The Old Man & the Gun, you two were separated during the initial production phase. Tell me how that was handled.

[LZC] David was filming in Cincinnati and I was simultaneously cutting in LA. Because it was being shot on film, they sent it to Fotokem to be developed and then to Technicolor to be digitized. Then it was brought over to us on a drive. When you don’t get to watch dailies together, which is pretty much the norm these days, I try to ask the director to communicate with the script supervisor as much as possible while they are shooting: circled takes, particular line readings, any idea that the director might want to communicate to the editor. That sort of input always helps. Their distant location and the need to process film meant it would be a few days before I got the film and before David could see a scene that he’d shot, cut together. Getting material to him as quickly as possible is the best thing that I can do. That’s always my goal.

When I begin cutting a scene, I start by loading a sequence of all of the set-ups and then scroll through this sequence (what most editors who worked on film call a KEM roll) so that I can see what has been shot. Occasionally, I’ll put together selects, but generally I just start at the beginning and go cut to cut. The hardest part is always figuring out what’s going to be the first cut. Are we going to start tight? Are we going to start wide where we show everything? What is that first cut going to be? I seem to spend more time on that than anything else and once I get into it – and I’m not the first person to say this – the film tells you what to do. My goal is to get it into form as quickly as possible, so I can get a cut back to the director.

I finished the editor’s cut in LA and then we moved the cutting room to Dallas. Then David and I worked on the director’s cut – traditionally ten weeks – and after that, we showed it to the producers. Our time was extended a bit, because we had to wait for Bob’s availability to shoot some of the robbery sequences. They always knew that they were going to have to do some additional filming.

[OP] I know David is an experienced editor. How did you divide up the editorial tasks? Or was David able to step back from diving in and cutting, too?

[LZC] David is an excellent editor in his own right, but he is very happy to have someone else do the first pass. On this film I think he was more interested in playing around with some of the montage sequences. Then he’d hand them back to me so that I could incorporate them back into the film, sometimes making changes that kept them within the style of the film as a whole.

[OP] The scenes used in a film and the final length are always malleable until the final version of the cut. I’m sure this one was no different. Please tell me a bit about that.

[LZC] We definitely lost a fair number of scenes. My assistant makes scene cards that we put up on the wall and then when we lift a scene it goes on the back of the door. That way, you can just open the door and look on the back and see what has been taken out. In this particular film, because of the three separate storylines, scenes went in, came out, and were rearranged – and then in, out, and rearranged again. Often, scenes that we dropped at the very beginning ended up back in the movie, because it’s like a house of cards. You really have to weigh everything and try to juxtapose and balance the storylines and keep it moving. The movie is quite short now, but even my first cut wasn’t that long. The final cut is 94 minutes and I think the first cut wasn’t much more than two hours.

[OP] Let me shift gears a bit. As I understand it, David is a fan of Adobe Creative Cloud and, in particular, Premiere Pro. On The Old Man & the Gun, you shifted to Premiere Pro, as well. As someone who comes from a film and Avid editorial background, how was it to work with Premiere Pro?

[LZC] Over the course of my career, I’ve done what we call ‘doctor jobs’, where an editor comes in and does a recut of a film. On some of these jobs, I had the opportunity to work on Lightworks and on Final Cut. When we began Pete’s Dragon, David asked if I would consider doing it on Premiere Pro. David Fincher’s team had just done Gone Girl using it and David was excited about the possibility of doing Pete’s using Premiere. But for a big visual effects film, Premiere at that stage really wasn’t ready. I said if we do another film together, I’d be happy to learn Premiere. So, when we knew we would be doing Old Man, David spoke to the people at Adobe. They arranged to have Christine Steele tutor me. I worked with her before we began shooting. It was perfect, because we live close to each other and we were able to work in short, three- and four-hour blocks of time. (Note: Steele is an LA-based editor, who is frequently a featured presenter for Adobe.)

I also hired my first assistant, Mike Melendi, who was experienced with Premiere Pro. It was definitely a little intimidating at first, but within a week, I was fine. I actually ended up doing another film on Avid afterwards and I was a little nervous to go back to Avid. But that was like riding a bike. And after that, I took over another film that was on Premiere. Now I know I can go back and forth and that it’s perfectly fine.

[OP] Many feature film editors with an extensive background on Media Composer often rely on Avid’s script integration tools (ScriptSync). That’s something Premiere doesn’t have. Any concerns there?

[LZC] I think ScriptSync is the most wonderful thing in the world, but I grew up without it. When my assistants prepare dailies for me, they’ll put in a bunch of locators, so I know where there are multiple takes within a take. I think ScriptSync is great if you can get the labor of somebody to do it. I know there are a lot of editors who do it themselves while they’re watching dailies. I worked on a half-hour comedy where there was just a massive amount of footage and a tremendous amount of ‘keep rollings’. After working for one week I said to them, ‘We have to get ScriptSync’. And they did! We had a dedicated person to do it and that’s all they did. It’s a wonderful luxury, which I would always love to have, but because I learned without it, I’ve created other ways to work without it.

My biggest issue with Premiere was the fact that, because I always work in the icon view and not list view, I had to contend with their grid arrangement within the bins. With Media Composer, you can arrange your clips however you want. Adobe knew that it was a really big issue for me and for other editors, so they are working on a version where you can move and arrange the clips within a bin. I’ve had the opportunity to give input on that and I know we’ll see that changed in a future version.

I would love to keep working on Premiere. Coming back to it again recently, I felt really confident about being able to go back and forth between the two systems. But some directors and studios have specific preferences. Still, I think it would be a lot of fun to continue working in Premiere.

[OP] Any final thoughts on the experience?

[LZC] I enjoyed the opportunity to work on such a wonderful project with such great actors. For me as an editor, that’s always my goal – to work with great performances. To have a hand in shaping and creating wonderful moments like the ones we have in our film. I hope others feel that we achieved that.

For more, check out Adobe’s customer stories and blog. Also Steve Hullfish’s Art of the Cut interview.

This interview was transcribed with the assistance of SpeedScriber.

©2018 Oliver Peters

Beyond the Supernova

No one typifies hard-driving, instrumental guitar rock better than Joe Satriani. The guitar virtuoso – known to his fans as Satch – has sixteen studio albums under his belt, along with several EPs, live concert recordings, and compilations. In addition to his solo tours, Satriani founded the “G3”, a series of short tours that feature him alongside a changing cast of two other all-star solo guitarists, such as Steve Vai, Yngwie Malmsteen, Guthrie Govan, and others. In another side project, Satriani is the guitarist for the supergroup Chickenfoot, which is fronted by former Van Halen lead singer Sammy Hagar.

The energy behind Satriani’s performances was captured in the new documentary film, Beyond the Supernova, which is currently available on the Stingray Qello streaming channel. This documentary grew out of the general behind-the-scenes coverage of Satriani’s 2016 and 2017 tours of Asia and Europe promoting his 15th studio album, Shockwave Supernova. Tour filming was handled by Satriani’s son, ZZ (Zachariah Zane) – an up-and-coming young filmmaker. The tour coincided with Joe Satriani’s 60th birthday and the 30th anniversary of his multi-platinum-selling album Surfing with the Alien. These elements, as well as capturing Satriani’s introspective nature, provided the ingredients for a more in-depth project, which ZZ Satriani produced, directed, and edited.

According to Joe Satriani in an interview on Stingray’s PausePlay, “ZZ was able to capture the real me in a way that only a son would understand how to do; because I was struggling with how I was going to record a new record and go in a new direction. So, as I’m on the tour bus and backstage – I guess it’s on my face. He’s filming it and he’s going ‘there’s a movie in here about that. It’s not just a bunch of guys on tour.’”

From music to filmmaking

ZZ Satriani graduated from Occidental College in 2015 with a BA in Art History and Visual Arts, with a focus on film production. He moved to Los Angeles to start a career as a freelance editor. I spoke with ZZ Satriani about how he came to make this film. He explained, “For me it started with skateboarding in high school. Filmmaking and skateboarding go hand-in-hand. You are always trying to capture your buddies doing cool tricks. I gravitated more to filmmaking in college. For the 2012 G3 Tour, I produced a couple of web videos that used mainly jump cuts and were very disjointed, but fun. They decided to bring me on for the 2016 tour in order to produce something similar. But this time, it had to have more of a story. So I recorded the interviews afterwards.”

Although ZZ thinks of himself as primarily an editor, he handled all of the backstage, behind-the-scenes, and interview filming himself, using a Sony PXW-FS5 camera. He comments, “I was learning how to use the camera as I was shooting, so I got some weird results – but in a good way. I wanted the footage to have more of a filmic look – to have more the feeling of a memory than simply real-time events.”

The structure of Beyond the Supernova intersperses concert performances with events on the tour and introspective interviews with Joe Satriani. The multi-camera concert footage was supplied by the touring support company and is often mixed with historical footage provided by Joe Satriani’s management team. This enabled ZZ to intercut performances of the same song, not only from different locations, but even different years, going back to Joe Satriani’s early career.

The style of cutting the concert performances is relatively straightforward, but the travel and interview bridges that join them together have more of a stream-of-consciousness feel to them and are often quite psychedelic. ZZ says, “I’m not a big [Adobe] After Effects guy, so all of the ‘effects’ are practical and built up in layers within [Adobe] Premiere Pro. The majority of ‘effects’ dealt with layering, blending and cropping different clips together. It makes you think about the space within the frame – different shapes, movement, direction, etc. I like playing around that way – you end up discovering things you wouldn’t have normally thought of. Let your curiosity guide you, keep messing with things and you will look at everything in a new way. It keeps editing exciting!”
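For readers curious what building up “effects” in layers means at the pixel level, here is a minimal Python/numpy sketch of a screen blend plus a crop – the generic compositing math behind this kind of layering, not ZZ Satriani’s actual Premiere Pro timeline (the clip arrays are placeholders):

```python
import numpy as np

def screen(base, top):
    """Screen blend: brightens the base wherever the top layer is bright."""
    return 1.0 - (1.0 - base) * (1.0 - top)

# Two normalized RGB frames (height x width x 3, values 0.0-1.0).
rng = np.random.default_rng(0)
clip_a = rng.random((1080, 1920, 3))  # placeholder for one video layer
clip_b = rng.random((1080, 1920, 3))  # placeholder for another

# Crop the top layer to the left half of the frame, then blend.
cropped = np.zeros_like(clip_b)
cropped[:, :960, :] = clip_b[:, :960, :]

composite = screen(clip_a, cropped)
```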

Premiere Pro makes the cut

Beyond the Supernova was completely cut and finished in Premiere Pro. ZZ explains why: “Around 2011-12, I made the switch from [Apple] Final Cut Pro to Premiere Pro while I was in a film production class. They informed us that it was the new standard, so we rolled with it and the transition was very smooth. I use other apps in the Adobe suite and I like the layout of everything in each one, so I’ve never felt the need to switch to another NLE.”

ZZ Satriani continues, “We had a mix of formats to deal with, including the need to upscale some of the standard definition footage to HD, which I did in software. Premiere handled the PXW-FS5’s XAVC-L codec pretty well in my opinion. I didn’t transcode to ProRes, since I had so much footage and not a lot of external hard drive space. I knew this might make things go more slowly – but honestly, I didn’t notice any significant drawbacks. I also handled all of the color correction, using Premiere’s Lumetri color controls and the FilmConvert plug-in.” Satriani created the sound design for the interview segments, but John Cuniberti (who has also mixed Joe Satriani’s albums) re-mixed the live concert segments in his studio in London. The final 5.1 surround mix of the whole film was handled at Skywalker Sound.
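As a rough illustration of what a software SD-to-HD upscale involves – hedged, since Premiere Pro’s internal scaler isn’t documented here and the file names below are hypothetical – a Lanczos resize in Python/OpenCV looks like this:

```python
import cv2

# Hypothetical 4:3 SD frame (e.g. 720x480); scale to 1440x1080 so it
# pillarboxes cleanly inside a 1920x1080 HD raster.
frame = cv2.imread("sd_frame.png")
hd = cv2.resize(frame, (1440, 1080), interpolation=cv2.INTER_LANCZOS4)
cv2.imwrite("hd_frame.png", hd)
```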

The impetus pushing completion was entry into the October 2017 Mill Valley Film Festival. ZZ says, “I worked for a month putting together the trailer for Mill Valley. Because I had already organized the footage for this and an earlier teaser, the actual edit of the film came easily. It took me about two months to cut – working by myself in the basement on a [2013] Mac Pro. Coffee and burritos from across the street kept me going.” 

Introspection brings surprises

Fathers and sons working together can often be an interesting dynamic and even ZZ learned new things during the production. He comments, “The title of the film evolved out of the interviews. I learned that Joe’s songs on an album tend to have a theme tied to the theme of the album, which often has a sci-fi basis to it. But it was a real surprise to me when Joe explained that Shockwave Supernova was really his character or persona on stage. I went, ‘Wait! After all these years, how did I not know that?’”

As with any film, you have to decide what gets cut and what stays. In concert projects, the decision often comes down to which songs to include. ZZ says, “One song that I initially thought shouldn’t be included was Surfing with the Alien. It’s a huge fan favorite and such an iconic song for Joe. Including it almost seemed like giving in. But, in a way it created a ‘conflict point’ for the film. Once we added Joe’s interview comments, it worked for me. He explained that each time he plays it live that it’s not like repeating the past. He feels like he’s growing with the song – discovering new ways to approach it.”

The original plan for Beyond the Supernova after Mill Valley was to showcase it at other film festivals. But Joe Satriani’s management team thought that it coincided beautifully with the release of his 16th studio album, What Happens Next, which came out in January of this year. Instead of other film festivals, Beyond the Supernova made its video premiere on AXS TV in March and then started its streaming run on Stingray Qello this July. Qello is known as a home for classic and new live concerts, so this exposes the documentary to a wider audience. Whether you are a fan of Joe Satriani or just rock documentaries, ZZ Satriani’s Beyond the Supernova is a great peek behind the curtain into life on the road and some of the thoughts that keep this veteran solo performer fresh.

Images courtesy of ZZ Satriani.

©2018 Oliver Peters

Hawaiki AutoGrade

The color correction tools in Final Cut Pro X are nice. Adobe’s Lumetri controls make grading intuitive. But sometimes you just want to click a few buttons and be happy with the results. That’s where AutoGrade from Hawaiki comes in. AutoGrade is a full-featured color correction plug-in that runs within Final Cut Pro X, Motion, Premiere Pro and After Effects. It is available from FxFactory and installs through the FxFactory plug-in manager.

As the name implies, AutoGrade is an automatic color correction tool designed to simplify and speed up color correction. When you install AutoGrade, you get two plug-ins: AutoGrade and AutoGrade One. The latter is a simple, one-button version based on global white balance. Simply use the color picker (eye dropper) to sample an area that should be white. Select enable and the overall color balance is corrected. You can then tweak further by boosting the correction, adjusting the RGB balance sliders, and/or fine-tuning luma level and saturation. Nearly all parameters are keyframeable, and looks can be saved as presets.
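Hawaiki hasn’t published AutoGrade’s algorithm, but the one-button behavior described above is consistent with the textbook ‘white-patch’ approach: average the sampled area, then scale each channel so that the sample becomes neutral. A minimal numpy sketch, offered only as an assumption about the underlying idea:

```python
import numpy as np

def white_balance(img, y0, y1, x0, x1):
    """img: float RGB array (H x W x 3, values 0.0-1.0).
    (y0:y1, x0:x1) is the eye-dropper region that should read as white."""
    patch = img[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)
    gains = patch.max() / patch  # lift each channel to match the strongest
    return np.clip(img * gains, 0.0, 1.0)
```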

AutoGrade One is just a starter, though, for simple fixes. The real fun is with the full version of AutoGrade, which is a more comprehensive color correction tool. Its interface is divided into three main sections: Auto Balance, Quick Fix, and Fine-Tune. Instead of a single global balance tool, the Auto Balance section permits global correction, as well as any combination of white, black, and/or skin corrections. Simply turn on one or more desired parameters, sample the appropriate color(s), and enable Auto Balance. This tool will also raise or lower luma levels for the selected tonal range.

Sometimes you might have to repeat the process if you don’t like the first results. For example, when you sample the skin on someone’s face, sampling rosy cheeks will yield different results than if you sample the yellowish highlights on a forehead. To try again, just uncheck Auto Balance, sample a different area, and then enable Auto Balance again. In addition to an amount slider for each correction range, you can also adjust the RGB balance for each. Skin tones may be balanced towards warm or neutral, and the entire image can be legalized, which clamps video levels to 0-100.
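Extending the same assumption to the black and white samples together, a two-point balance remaps each channel so the sampled black lands at 0 and the sampled white at 1, with the final clamp playing the role of the legalize option. Again, a sketch of the standard math, not Hawaiki’s code:

```python
import numpy as np

def two_point_balance(img, black_sample, white_sample):
    """black_sample / white_sample: length-3 RGB means of the sampled areas."""
    black = np.asarray(black_sample, dtype=float)
    white = np.asarray(white_sample, dtype=float)
    out = (img - black) / (white - black)  # per-channel linear remap
    return np.clip(out, 0.0, 1.0)          # "legalize": clamp to legal range
```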

Quick Fix is a set of supplied presets that work independently of the color balance controls. These include some standards, like cooling down or warming up the image, the orange-and-teal look, adding an s-curve, and so on. They are applied at 100%, which to my eye feels a bit harsh as a default. To tone down the effect, simply lower the amount slider.
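An amount slider on a preset is typically just an opacity mix between the untouched frame and the fully processed one – presumably what is happening here, though that’s an inference rather than anything Hawaiki documents. Pulling the slider down to 40%, for example, would blend like this:

```python
def apply_with_amount(original, graded, amount):
    """amount: 0.0 = untouched frame, 1.0 = the full 100% preset."""
    return original * (1.0 - amount) + graded * amount

# e.g. soften a harsh preset to 40% intensity:
# result = apply_with_amount(original, graded, 0.4)
```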

Fine-Tune rounds it out when you need to take a deeper dive. This section is built as a full-blown 3-way color corrector. Each range includes a luma control and three color offset controls. Instead of wheels, these controls are sliders, but the results are the same as with wheels. In addition, you can adjust exposure, saturation, vibrance, temperature/tint, and even two different contrast controls. One innovation is a log expander, designed to make it easy to correct log-encoded camera footage in the absence of a specific log-to-Rec709 camera LUT.
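The slider-based controls map naturally onto the ASC CDL model (slope/offset/power), which is the industry-standard way to express this kind of per-range correction, and the log expander can be thought of as a simple contrast expansion. Both are sketched below as assumptions about the concept, not AutoGrade’s actual internals:

```python
import numpy as np

def cdl(img, slope=1.0, offset=0.0, power=1.0):
    """ASC CDL: out = clamp(in * slope + offset) ** power."""
    return np.clip(img * slope + offset, 0.0, 1.0) ** power

def log_expand(img, gamma=2.6):
    """Crude stand-in for a log expander: a gamma > 1 deepens the lifted
    shadows of flat, log-encoded footage. Real log-to-Rec709 transforms
    are camera-specific, which is why a dedicated LUT is normally used."""
    return np.clip(img, 0.0, 1.0) ** gamma
```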

Naturally, any plug-in could always offer more, so I have a minor wish list. I would love to see five additional features: film grain, vignette, sharpening, blurring/soft focus, and a highlights-only expander. There are certainly other individual filters that cover these needs, but having it all within a single plug-in would make sense. This would round out AutoGrade as a complete, creative grading module, serving user needs beyond just color correction looks.

AutoGrade is a deceptively powerful color corrector hidden under a simple interface. User-created looks can be saved as presets, so you can quickly apply complex settings to similar shots and set-ups. There are already many color correction tools on the market, including Hawaiki’s own Hawaiki Color, but AutoGrade’s price is very attractive, making it a superb tool to have in your kit. It’s a fast way to grade that’s ideal for newcomers and experienced colorists alike.


©2018 Oliver Peters