4K is kinda meh


Lately I’ve spent a lot of time looking at 4K content. Not only was 4K all over the place at NAB in Las Vegas, but I’ve also had to provide some 4K deliverables on client projects. This has meant a much closer examination of the 4K image than in the past.

First, let’s define 4K. Typically the term 4K applies to either a “cinema” width of 4096 pixels or a broadcast width of 3840 pixels. The latter is also referred to as QuadHD, UltraHD or UHD and is a 2x multiple of the 1920-wide HD standard. For simplicity’s sake, in this article I’m going to refer to 4K, but will generally mean the UHD version, i.e. 3840 x 2160 pixels, aka 2160p. While 4K (and greater) acquisition for an HD finish has been used for a while in post, there are already demands for true 4K content. This vanguard is notably led by Netflix and Amazon; however, international distributors are also starting to request 4K masters when they are available.

In my analysis of the images from various 4K (and higher) cameras, it starts to become quite obvious that the 1:1 image in 4K really isn’t all that good. In fact, if you compare an HD-to-4K blow-up of the same shot with the true 4K original, it becomes very hard to tell the two apart. Why is that?

When you analyze a native 4K image, you become aware of the deficiencies in the image. These weren’t as obvious when that 4K original was down-sampled to an HD timeline and master. That’s because in the HD timeline you are seeing the benefit of oversampling, which results in a superb HD image. Here are some factors that become more obvious when you view the footage in its original size.

1. Most formats use a high-compression algorithm to squeeze the data into a smaller file size. In some cases compression artifacts start to become visible at the native size.

2. Many DPs like to shoot with vintage or otherwise “lower quality” lenses. This gives the image “character” and, in the words of one cinematographer that I worked with, “takes the curse off of the digital image.” That’s all fine, but again, viewed natively, you start to see the defects in the optics, like chromatic aberration in the corners, coloration of the image, and general softness.

3. Due to the nature of video viewfinders, run-and-gun production methods, and smaller crews, many operators do not nail the critical focus on a shot. That’s not too obvious when you down-convert the image; however, at 100% you notice that focus was on your talent’s ear and not their nose.

The interesting thing to me is that when you take a 4K (or greater) image, down-convert that to HD, and then up-convert it back to 4K, much of the image detail is retained. I’ve especially noticed this when high quality scalers are used for the conversion. There are a couple of reasons for that. First, even the free version of DaVinci Resolve offers one of the best up-scalers on the market. Second, scaling from 1920 x 1080 to 3840 x 2160 is an even 2x multiple, so a) the amount you are zooming in isn’t all that much, and b) even-numbered multiples give you better results than fractional values. In addition, Resolve also offers several scaling methods for sharper versus smoother results.
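If you want to test this round trip on your own material, here is a minimal Python sketch using OpenCV. It isn’t the Resolve pipeline described above; it simply runs a still frame from UHD down to HD and back up by that even 2x multiple, then prints a crude difference metric. The file names, interpolation kernels and the metric are my own assumptions, not anything prescribed by the tools mentioned here.

```python
# A minimal sketch, not the Resolve workflow described above: downscale a UHD
# still to HD, blow it back up by an even 2x multiple, and measure the change.
# "frame_uhd.png" is a placeholder for a 3840x2160 frame exported from your footage.
import cv2
import numpy as np

native = cv2.imread("frame_uhd.png")  # 3840 x 2160 source frame

# Downscale to HD (area averaging is a reasonable choice for reduction) ...
hd = cv2.resize(native, (1920, 1080), interpolation=cv2.INTER_AREA)

# ... then enlarge back to UHD with a sharper kernel (Lanczos).
round_trip = cv2.resize(hd, (3840, 2160), interpolation=cv2.INTER_LANCZOS4)

# Crude similarity check: mean absolute difference per channel on a 0-255 scale.
diff = np.abs(native.astype(np.float32) - round_trip.astype(np.float32))
print(f"mean absolute difference: {diff.mean():.2f}")

cv2.imwrite("frame_round_trip.png", round_trip)  # compare the two at 100%
```

Viewed at 100%, the round-tripped frame and the native frame are usually hard to tell apart, which is the point made above.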

In general, I feel that the most quality is retained when you start with 4K footage rather than HD, but that’s not a given. I’ve blown up ARRI ALEXA clips – that only ever existed as HD – to 4K and the result was excellent. That has a lot to do with what ARRI is doing in their sensor and the general detail of the ALEXA image. Clearly that’s been proven time and time again in theaters, where films recorded on ALEXAs in 2K, HD or 2.8K ARRIRAW have been blown up for 4K projection on the large screen with excellent results.

Don’t get me wrong. I’m not saying you shouldn’t post in 4K if you have an easy workflow (see my post next week) to get there. What I am saying is that staying in 4K versus a 4K-HD-4K workflow won’t result in a dramatic difference in image quality, when you compare the two side-by-side at 100% pixel-for-pixel resolution. The samples below come from a variety of sources, including the blogs of John Brawley, Philip Bloom and OffHollywood Productions. In some cases the source images originated from pre-production cameras, so there may be image anomalies not found in actual shipping models of these cameras. Grades applied are mine.

View some of the examples below. Click on any of these images for the slide show. From there you can access the full size version of any of these comparisons.

©2016 Oliver Peters

Australian Design Shines with Blackmagic


One of the things to do in the week after NAB is to scour the internet to pick up those gems I might have missed at the show. In doing so, I ran across a blurb at RedShark News about a prestigious design award picked up by Blackmagic Design.

Anyone in this industry who’s been exposed to any Blackmagic product knows that the company has a sense of taste when it comes to industrial design, packaging, and even their website. Products, like their rack-mounted gear and cameras, have a certain finesse even down to the screws that hold them together. One look at DaVinci Resolve and you know they’ve aimed at the best-looking and easiest-to-navigate user interface of any NLE. The redesign of the Cintel Scanner is like an art piece to hang on the wall.

This year they’ve been honored as the Design Team of the Year by the Red Dot Awards. This is a design competition founded by German industrial designer Professor Dr. Peter Zec, former president of Icsid (International Council of Societies of Industrial Design) and current head of the German design center, Design Zentrum Nordrhein Westfalen. The Design Team of the Year Award (which is awarded and not competed for) goes to one company each year. Blackmagic Design is in good company, as past winners include Apple, Porsche, and frog design (which has been closely involved with Apple over the years) – among many others.

Blackmagic’s design team is headed by Simon Kidd, Director of Industrial Design, who’s been with the company for ten years. This is the first time the honor has gone to an Australian firm and highlights the outstanding work being done down under. That design aesthetic can be seen not only at Blackmagic, but at other Australian firms, too, including Atomos and Rode Microphones. It’s nice to see this recognition go to any company in the film and video space, but even better when it goes to someone who really values design along with solid functionality.

©2016 Oliver Peters

Automatic Duck Xsend Motion


When Apple transitioned its Final Cut Pro product family from Final Cut Studio to Final Cut Pro X, Motion 5, and Compressor 4, it lost a number of features that editors really liked. Some of these “missing” features show up as consistent and recurring requests on various wish lists. One of the most popular is the roundtrip function that sent Final Cut Pro “classic” timelines over to Motion for further compositing. To many, it seemed like Motion had become relegated to being a fancy development tool for FCPX plug-ins, rather than what it is – a powerful, GPU-enabled compositor.

At last, that workflow hole has been plugged, thanks to Automatic Duck. Last year the father/son development team brought us a way to go from Final Cut Pro X to Adobe’s After Effects by way of the Automatic Duck Ximport AE bridge. This week at the FCP Exchange Workshop in Las Vegas, Wes Plate reveals the new Automatic Duck Xsend Motion. This tool leverages the power of FCPX’s version of XML to move data from one application to the other. Thanks to FCPXML, it provides a bridge to send FCPX timelines, clips, or sections of timelines over to Motion 5.

Xsend Motion reads FCPXML exports or can process projects directly from the Final Cut Pro X Share menu. The Xsend menu offers a number of settings, including whether to bring clips into Motion as individual clips or as what Automatic Duck has dubbed “lanes”. When clips are left individual, each clip is assigned its own layer in Motion, resulting in a composition made up of a series of cascading layers. If you opt for lanes, then the Motion layers stay grouped in a representation similar to the FCPX project timeline. This way primary and secondary storylines and connected clips are properly configured. Xsend also interprets compound clips.

Automatic Duck is striving to correctly interpret all of the FCPX characteristics, including frame sizes, rates, cropping, and more. Since Final Cut Pro X and Motion 5 are essentially built upon the same engine, the translation will correctly interpret most built-in effects. However, it may or may not interpret custom Motion templates that individual users have created. In addition, they plan on being able to properly translate many of the effects in the FxFactory portfolio, which typically install into both FCPX and Motion.

While Xsend Motion and Ximport AE are primarily one-way trips, there is a mechanism to send the finished result back to Final Cut Pro X from Motion 5. The first and most obvious is simply to render the Motion composition as a flattened QuickTime movie and import that back into FCPX as new media. However, you can also publish the Motion composition as an FCPX Generator. This would then show up in the Generators portion of the Effects Palette as a custom generator effect.

Automatic Duck Xsend Motion will be officially released later this year. The price hasn’t been announced yet. Current Automatic Duck products (Automatic Duck Ximport AE and Automatic Duck Media Copy) are available through Red Giant.

©2016 Oliver Peters

Spring Tools


It’s often the little things that improve your editing workflow. Here are a few quick items that can expand your editing arsenal.

Hawaiki Super Dissolve

The classical approach to editing transitions suggests that all you need is a cut and a dissolve. Given how often most editors use a dissolve transition, it’s amazing that few NLE developers spend any time creating more than a basic video dissolve, fade or dip. After all, even the original Media Composer came with both a video and a film-style dissolve. Audio mixers are used to several different types of crossfades.

Since this is such a neglected area, the development team behind the Hawaiki plug-ins decided to create Super Dissolve – a dissolve transition plug-in for Final Cut Pro X with many more options. This installs through the FxFactory application. It shows up in the FCPX transitions palette as a dissolve effect, plus a set of presets for fades, dips and custom curves. A dissolve is nothing more than a blend between two images, so Super Dissolve exposes the same types of under-the-hood controls as After Effects and Photoshop artists are used to with compositing modes.

Drop Super Dissolve in as a transition and you have control over blending modes, layer order, easing controls with timing, and the blurring of the outgoing and/or incoming image. Since you have control over the outgoing and incoming clips separately, different values can be applied to either side, enabling an asymmetrical effect, such as a quick fade with a blur on the outgoing clip, while bringing the incoming side up more slowly. As with the default FCPX dissolve, there’s also an audio crossfade adjustment, since FCPX transitions can affect both audio and video when these elements are combined. If you really like the ability to finesse your transitions, then Super Dissolve hits the spot.
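Under the hood, a dissolve is just a weighted blend of the outgoing and incoming frames, and the easing and asymmetry controls described above are ways of shaping that weight over time. The short Python/NumPy sketch below illustrates the idea only; it is not Super Dissolve’s code, and the smoothstep easing curve is simply one assumed example.

```python
# A bare-bones illustration of a dissolve as a weighted blend plus an easing
# curve. Not Super Dissolve's implementation; frames are assumed to be
# same-size 8-bit RGB NumPy arrays.
import numpy as np

def ease_in_out(t: float) -> float:
    """Smoothstep easing: starts and ends slowly, moves quickly in the middle."""
    return t * t * (3 - 2 * t)

def dissolve_frame(outgoing: np.ndarray, incoming: np.ndarray, t: float) -> np.ndarray:
    """Blend two frames at transition progress t, where t runs from 0.0 to 1.0."""
    w = ease_in_out(t)
    mix = (1 - w) * outgoing.astype(np.float32) + w * incoming.astype(np.float32)
    return mix.astype(np.uint8)
```

Applying a different curve, or a blur, to each side independently is what produces the asymmetrical transitions mentioned above.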

XEffects Audio Fades

Free is good, so check out Idustrial Revolution’s free effects. Although they are primarily a video effects developer for Motion and Final Cut Pro X, they recently added a set of audio fade presets for FCPX. Download and install the free pack and you’ll find the XEffects Fades group in the audio plug-ins section of your effects palette.

XEffects Fades includes a set of preset fade handles, which are applied to the audio on your timeline clips. Drag-and-drop the preset with the fade length closest to what you want and it automatically adjusts the fade handle length at both ends of that audio clip. If you want to tweak the length, apply the effect first and then adjust the length puck on the clip as needed. Existing lengths will be overwritten when you drop the effect onto the clip, so make sure you make these adjustments last.

AudioDenoise and EchoRemover

CrumplePop is another developer known for its video effects, but they, too, have decided to add audio effects to their repertoire. AudioDenoise and EchoRemover are two Final Cut Pro X plug-ins sold through the FxFactory application. These two effects are easy-to-use Apple Audio Units filters designed to improve poorly recorded location audio. As with Apple’s own built-in controls, each filter includes a few sliders to adjust strength and how the effect is applied. When applying any audio “clean up” filter, a little goes a long way. If you push it to its extreme range, the result sounds like you are underwater. Nevertheless, these two filters do a very nice job with poor audio, without the cost and complexity of other well-known audio products.

Alex4D Animated Transitions

For a little bit of spice in your Final Cut Pro X timelines, it’s worth checking out the Alex4D Animated Transitions from FxFactory. Alex Gollner has been a prolific developer of free Final Cut Pro plug-ins, but this is his first commercial effort. Animated Transitions are a set of 120 customizable transition effects to slide, grow, split and peel incoming or outgoing clips and lower third titles. Traditionally you’d have to build these effects yourself using DVE moves. But by dropping one of these effects onto a cut point between two clips, you quickly apply a dynamic effect with all the work already done. Simply pick the transition you like, tweak the parameters and it’s done.

Post Notes

One of the best features of Adobe applications is Extensions. This is a development “hook” within Premiere Pro or After Effects that allows developers to create task-oriented panels, tools and controls that effectively “bolt” right into the Adobe interface. One example for After Effects would be TypeMonkey (and the other “Monkeys”), which are kinetic effect macros. For Premiere there’s PDFviewer, which enables you to view your script (or any other document) in PDF format right inside the Premiere user interface.

A new extension for Premiere Pro CC is Post Notes. Once installed, it’s an interface panel within Premiere Pro that functions as a combined notepad and to-do list. These are tied to a specific sequence, so you can have a set of notes and to-dos for each sequence in your project. When a to-do item is completed, check it off to indicate that it’s been addressed. This tool is so straightforward and simple, you’ll wonder why every NLE doesn’t already have something like this built in.

Hedge for Mac

With digital media as a way of life for most editors, we have to deal with more and more camera media. Quickly copying camera cards is a necessary evil and making sure you do this without corruption is essential. The Mac Finder really is NOT the tool you should be using, yet everyone does it. There are a number of products on the market that copy to multiple locations with checksum verification. These are popular with DITs and “data wranglers” and include Pomfort Silverstack, Red Giant Offload, and even Adobe Prelude.

A newcomer is Hedge for Mac. This is a simple, single-purpose utility designed to quickly copy files and verify the copies. There’s a free and a paid version. If you just want to copy to one or two destinations at a time, the free version will do. If you need to copy to even more destinations simultaneously, then go for the paid version. Hedge will also launch your custom AppleScripts to sort, transcode, rename or perform other functions. Transfers were fast in the testing I’ve done, so this is a must-have tool for any editor.
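For anyone wondering what “copy with checksum verification” actually involves, the sketch below shows the basic idea in Python: hash the source, copy it, then re-read and hash each copy. It is not Hedge’s implementation, and the clip name and destination volumes are placeholders.

```python
# A minimal sketch of checksum-verified offloading -- not Hedge's code.
# Copies one file to several destinations and confirms each copy hashes
# identically to the source. Paths below are placeholders.
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 8 * 1024 * 1024) -> str:
    """Hash a file in chunks so large camera files never load fully into RAM."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verified_copy(source: Path, destinations: list[Path]) -> None:
    source_hash = sha256_of(source)
    for dest_dir in destinations:
        dest_dir.mkdir(parents=True, exist_ok=True)
        target = dest_dir / source.name
        shutil.copy2(source, target)              # copy with metadata
        if sha256_of(target) != source_hash:      # re-read and verify the copy
            raise IOError(f"checksum mismatch on {target}")
        print(f"verified: {target}")

verified_copy(Path("A001C003.mov"),
              [Path("/Volumes/RAID/dailies"), Path("/Volumes/Shuttle/dailies")])
```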

©2016 Oliver Peters

Deadpool


Adobe has been on a roll getting filmmakers to adopt its Premiere Pro CC editing software for feature film post. Hot on the heels of its success at Sundance, where a significant number of the indie films were edited using Premiere Pro, February saw the release of two major Hollywood films that were cut using Premiere Pro – the Coen Brothers’ Hail, Caesar! and Tim Miller’s Deadpool.

Deadpool is one of Marvel Comics’ more unconventional superheroes. Deadpool, the film, is the origin story of how Wade Wilson (Ryan Reynolds) becomes Deadpool. He’s a mercenary soldier who gains accelerated healing powers through a rogue experiment. Left disfigured, but with new powers, he sets off to rescue his girlfriend (Morena Baccarin) and find the person responsible. Throughout all of this, the film is peppered with Deadpool’s wise-cracking and fourth-wall-breaking asides to the audience.

This is the first feature film for director Tim Miller, but he’s certainly not new to the process. Miller and his company Blur Studios are known for their visual effects work on commercials, shorts, and features, including Scott Pilgrim vs. the World and Thor: The Dark World. Setting out to bring as much of the post in-house as possible, Miller consulted with his friend, director David Fincher, who recommended the Adobe Creative Cloud solution based on Fincher’s experience during Gone Girl. Several editing bays were established within Blur’s facility – using new, tricked-out Mac Pros connected to an Open Drives Velocity SSD 180TB shared storage solution.

Plugging new software into a large VFX film pipeline

Julian Clarke (Chappie, Elysium, District 9) came on board to edit the film. He explains, “I talked with Tim and was interested in the whole pioneering aspect of it. The set-up costs to make these permanent edit suites for his studio are attractive. I learned editing using [Apple] Final Cut Pro at version one and then I switched to Avid about four years later and have cut with it since. If you can learn [Avid] Media Composer, then [Adobe] Premiere Pro is fine. I was up to about 80% of my normal speed after just two days.”

To ease any growing pains of using a new editing tool on such a complex film, Miller and Adobe also brought in feature film editor Vashi Nedomansky (That Which I Love Destroys Me, Sharknado 2: The Second One, An American Carol) as a workflow consultant. Nedomansky’s job was to help establish a workflow pipeline and to get the editorial team up to speed with Premiere Pro. He had performed a similar role on Gone Girl. He says, “I’ve cut nine features and the last four have been using Premiere Pro. Adobe has called on me for that editor-to-editor interface and to help Blur set up five edit bays. I translated what we figured out with Gone Girl, but adapted it to Blur’s needs, as well as taking into consideration the updates made to the software since then. During the first few weeks of shooting, I worked with Julian and the assistant editors to customize their window layouts and keyboard shortcuts, since prior to this, the whole crew had primarily been using Avid.”

Deadpool was shot mostly with ARRI ALEXA cameras recording open gate 2.8K ARRIRAW. Additional footage also came from Phantom and RED cameras. Most scenes were recorded with two cameras. The original camera files were transcoded to 2K ProRes dailies in Vancouver. Back at Blur, first assistant editor Matt Carson would sync audio and group the clips into Premiere Pro multicam sequences.

Staying up with production

As with most features, Clarke was cutting while the production was going on. However, unlike on many films, he was ready to show Miller edited scenes for review within 24 hours after the shoot had wrapped for the day. Not only a cut scene, but one already fleshed out with temporary sound effects and music. This is quite a feat, considering that Miller shot more than 500 hours of footage. Seeing a quick turnaround of edited scenes was very beneficial for Miller as a first-time feature director. Clarke adds, “My normal approach is to start cutting and see what works as a first draft. The assistant will add sound effects and temp music and if we hit a stumbling block, we move on to another scene. Blur had also created a lot of pre-vis shots for the effects scenes prior to the start of principal photography. I was able to cut these in as temp VFX. This way the scenes could play through without a lot of holes.”

To make their collaborative workflow function, Nedomansky, Clarke, and the assistants worked out a structure for organizing files and Premiere Pro projects. Deadpool was broken into six reels, based on the approximate page count in the script where a reel break should occur. Every editor had their own folder on the Open Drives SAN containing only the most recent version of whatever project they were working on. When Julian Clarke was done working on Reel 1, that project file could be closed and moved from Clarke’s folder into the folder of one of the assistants. They would then open the project to add temporary sound effects or create some temporary visual effects. Meanwhile, Clarke would continue on Reel 2, which was located in his folder. Keeping only the active project file in the various folders and moving projects among editors’ folders mimicked the bin-locking method used in shared Avid workflows.

In addition, Premiere Pro’s Media Browser module enabled the editors to access and import sequences found within other project files. This is a non-destructive process. Older versions of the project files were stored in a separate folder on the SAN in order to keep the active folders and projects uncluttered. Premiere Pro’s ability to work with folders as they were created in the Finder let the editors do more of the organization at the Finder level than they normally would have, had they been cutting on Avid systems.

Cutting an action film

Regardless of the software you use, each film presents a unique set of creative challenges. Clarke explains, “One scene that took a while was a long dialogue scene with Deadpool and Colossus on the highway. It’s quintessential Deadpool with a lot of banter and improv from Ryan. There’s not much story going on in the background at that time. We didn’t want to cut too much out, but at the same time we didn’t want to have the audience get lost in what’s supposed to be the bigger story. It took some time to strike the right balance. Overall the film was just about right. The director’s cut was about two hours, which was cut down to the final length of one hour and 45 minutes. That’s just about the right amount to cut out, because you don’t end up losing so much of the heart of the film.”

Many editors have a particular way they like their assistants to organize bins and projects. Clarke offers, “I tend to work in the frame view and organize my set-ups by masters, close-ups, and so on. Where I may be a little different than other editors is how I have my assistants organize action scenes. I’ll have them break down the choreography move-by-move and build a sequence of selected shots in the order of these moves. So for example, all the angles of the first punch, followed by all the angles of the next move – a punch, or block, or kick. Action scenes are often shot with so much coverage, that this lets me quickly zero in on the best stuff. It eliminates the scavenger hunt to find just the right angle on a move.”

The script was written to work in a nonlinear order. Clarke explains how that played out through the edit, “We stood by this intention in the editing. We found, in fact, that the film just didn’t work linearly at all. The tone of the two [scripted] timelines are quite different, with the more serious undertones of the origin story and the broad humor of the Deadpool timeline. When played sequentially, it was like oil and water – two totally different movies. By interweaving the timelines, the tone of the movie felt more coherent with the added bonus of being able to front load action into the movie to excite the audience, before getting into the heavier cancer part of the story.”

One editing option that might come to mind is that a character in a mask offers an interesting opportunity to change dialogue without difficult sync issues. However, it wasn’t the sort of crutch some might assume. Clarke says, “Yes, the mask provided a lot of opportunity for ADR. Though this was used more for tweaking dialogue for plot clarity or to try out alternate jokes, than a wholesale replacement of the production track. If we liked the production performance we generally kept it, and embraced the fact that the mask Ryan was wearing would dull the audio a bit. I try to use as little ADR as possible, when it comes to it being used for technical reasons, rather than creative ones. I feel like there’s a magic that happens on set that is often hard to replicate in the ADR booth.”

Pushing the envelope

The editing systems proved to offer the performance needed to complete a film of this size and complexity. Vashi Nedomansky says, “There were 1400 effects shots handled by ten vendors. Thanks to the fact that Blur tricked out the bays, the editors could push 10 to 15 layers of 2K media at a time for temp effects – in real-time without rendering. When the film was locked, audio was exported as AAF for the sound facility along with an H.264 picture reference. Blur did many of the visual effects in-house. For final picture deliverables, we exported an XML from Premiere Pro, but also used the Change List tool from Intelligent Assistance. This was mainly to supply the list in a column format that would match Avid’s output to meet the studio’s requirements.”

I asked Clarke and Nedomansky what the team liked best about working with the Adobe solution. Nedomansky says, “I found that the editors really liked the tilde key [on the keyboard], which in Premiere Pro brings any window to fullscreen. When you have a timeline with 24 to 36 tracks of temp sound effects, it’s really nice to be able to make that fullscreen so that you can fine-tune them. They also liked what I call the ‘pancake timeline’. This is where you can stack two timelines over each other to compare or pull clips from one into the other. When you can work faster like this, there’s more time for creativity.” Clarke adds, “I used a lot of the time-remapping in After Effects. Premiere Pro’s sub-frame audio editing is really good for dialogue. When Avid and Apple were competing with Media Composer and Final Cut Pro it was very productive for both companies. So competition between Avid and Adobe is good, because Premiere Pro is very forward-thinking.”

Many NLE users may question how feature films apply to the work they do. Nedomansky explains, “When Kirk Baxter used Premiere Pro for Fincher’s Gone Girl, the team requested many features that they were used to from Final Cut Pro 7. About 200 of those suggestions have found their way as features into the current release that all Creative Cloud customers receive. Film editors will stress a system in ways that others won’t, and that information benefits all users. The important takeaway from the Deadpool experience is that after some initial adjustment, there were no showstoppers and no chaos. Deadpool is a monster film, but these are just tools. It’s the human in the chair making the decision. We all just want to work and not deal with technical issues. Whatever makes the computer invisible – that’s the power.”

Deadpool is certainly a fun ride, with a lot of inside jokes for veteran Marvel fans. Look for the Stan Lee cameo and be sure to stay all the way through the end credits!

Watch director Tim Miller discuss the choice to go with Adobe.

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2016 Oliver Peters

Whiskey Tango Foxtrot


As most readers know, “whiskey tango foxtrot” is the military way to communicate the letters WTF. Your imagination can fill in the rest. Whiskey Tango Foxtrot, the movie, is a dark comedy about the experiences of a female journalist in Afghanistan, based on Kim Barker’s memoir, The Taliban Shuffle: Strange Days in Afghanistan and Pakistan. Paramount Pictures tapped the writing/directing team of John Requa and Glenn Ficarra (Focus, Crazy, Stupid, Love., I Love You Phillip Morris) to tackle the film adaptation, starring Tina Fey, Margot Robbie, Martin Freeman, Billy Bob Thornton, and Alfred Molina.

Glenn Ficarra explains the backstory, “When the military focus shifted from Afghanistan to Iraq there was a void in coverage. Barker was looking for a change in her life and volunteered to embed as a correspondent in Kabul. When she got there, she wasn’t quite ready for the high-adrenaline, partying lifestyle of many of the journalists. Most lived in dorms away from the general Afghan population. Since there weren’t that many females there, she found that there was a lot of interest in her.” This is the basis of both the book and the film – an Afghanistan story with a touch of Animal House and M*A*S*H.

Filming in Afghanistan would have been too dangerous, so production shifted to New Mexico, with Xavier Grobet (Focus, Enough Said, I Love You Phillip Morris) as the director of photography. The filmmakers also hired a female, Muslim journalist, Galereh Kiazand, as the second unit photographer to pick up B-roll in Kabul, which added to the authenticity. In addition, they also licensed stock shots originally filmed for The Kite Runner, but not used in that film. Ficarra adds, “We built two huge sets for Kabul and Kandahar, which were quite convincing, even to vets and Afghans who saw them.”

With efficiencies realized during Focus, the team followed a similar course on this film. Ficarra explains, “We previously pulled the editing in-house. For Whiskey Tango Foxtrot we decided to do all the visual effects in-house, too. There are about 1,000 VFX shots in the film. It’s so great to simply bring on more artists as you need them and you only have to pay the crew. At its peak, we had about 20 Nuke artists working on shots. Doing it internally opens you up to more possibilities for minor effects that enhance shots. You would otherwise skip these if you were working with an outside effects house. We carried this approach into the filming as well. While traveling, it was great to quickly pick up a shot that you could use as B-roll. So our whole mentality has been very much like you work in film school.”

Adjusting the workflow for a new film

The duo started production of Whiskey Tango Foxtrot on the heels of completing Focus. They brought along editor Jan Kovac, as well as the use of Apple Final Cut Pro X for editing. This was the off-the-shelf version of Final Cut Pro X available to all customers at the time of the production – no special version or side build. Kovac explains what differed on this new film, “The biggest change was in camera formats. Instead of shooting [Apple] ProRes 4444, we switched to using the new ProRes 4444 XQ codec, which was deployed by ARRI on the ALEXAs. On Focus, we recorded ARRIRAW for the green screen shots. We did extensive testing with this XQ codec prior to production and it was perfect for even the green screen work. Most of the production was shot with two ALEXAs recording in a 2K theatrical format using the ProRes 4444 XQ codec.”

Light Iron provided a DIT on set who took the camera files, added a basic color LUT, synced production sound, and then generated viewing dailies, which were distributed to department heads on Apple iPads. The DIT also generated editorial files that were in the full 2K ProRes 4444 XQ resolution. Both the camera original files and the color-corrected editorial files were stored on a 160TB Accusys ExaSAN system back at the film’s post headquarters. Two Mac Minis served as metadata controllers. Kovac explains, “By always having the highest quality image to edit with, it meant that we could have the highest quality screenings at any given time. You always see the film in a state that is very close to the final product. Since visual effects were being handled in-house, it made sense to have the camera original files on the SAN. This way shots could quickly be pulled for VFX work, without the usual intermediate step of coordinating with the lab or post house that might otherwise store these files.”

Another change was that audio was re-synced by the editing team. First assistant editor Kevin Bailey says, “The DIT would sync the production mix, but when it got here, I would sync up all the audio tracks using Sync-N-Link X. This syncs by timecode, making the process fast. I would group the cameras into multicam clips, but as many as 12 isolated audio tracks were also set up as separate angles. This way, Jan could easily switch between the production mix and individual mics. The only part that wasn’t as automatic was that the crew also used a Blackmagic Pocket Camera and a Sony A7 for some of the shots. The production was running at a true 24.0 fps frame rate, while these smaller cameras only shot 24 frames at a video rate of 23.98. These shots required adjustment and manual syncing. The reason for a true 24.0 frame rate was to make it easy to work with 48fps material. Sometimes the A-camera would run at 24fps while the B-camera ran at 48fps. Speeding up the B-camera by a 2X factor gets it into sync, without worrying about more complicated speed offsets.” In addition to these formats, the Afghanistan second unit footage was shot on a RED camera.
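The speed math Bailey describes is simple enough to spell out. Here is a back-of-the-envelope Python sketch; the helper function and sample values are mine, not the production’s actual conform settings.

```python
# Rough arithmetic behind the frame rate choices described above; illustrative only.
def retime_factor(recorded_fps: float, timeline_fps: float) -> float:
    """Speed change needed so recorded material plays back in real time on the timeline."""
    return recorded_fps / timeline_fps

print(retime_factor(48.0, 24.0))    # 2.0    -> B-camera at 48fps: a clean 2x speed-up
print(retime_factor(24.0, 24.0))    # 1.0    -> A-camera at 24fps: no change needed
print(retime_factor(23.976, 24.0))  # ~0.999 -> 23.98 cameras: the fractional offset that forces manual syncing
```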

Bailey is an experienced programmer who created the program Shot Notes X, which was used on this film. He continues, “Our script supervisor used Filemaker Pro, which exports a .csv file. Using Shot Notes X, I could combine the FCPXML from Final Cut with the .csv file and then generate a new FCPXML file. When imported back into Final Cut, the event would be updated to display scenes and takes, along with the script notes in the browser’s notes column. Common script codes would be used for close-ups, dolly shots, and so on. Filtering the list view by one of these codes in Final Cut would then display only the close-ups or only the dolly shots for easy access.” Bailey helped set up this pipeline during the first few weeks of production, at which point apprentice editor Esther Sokolow took over the dailies processing. Bailey shifted over to assist with sound and Sokolow later moved into a VFX editor role as one of several people doing temp VFX.
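Conceptually, what Shot Notes X is doing is a merge of two data sets keyed on scene and take. The toy Python sketch below shows that general idea only; the XML layout and column names are invented for illustration, and this is not Shot Notes X’s code or the real FCPXML schema.

```python
# A toy illustration of merging script supervisor notes (.csv) into clip
# metadata keyed on scene/take. Invented, simplified XML; not Shot Notes X
# and not real FCPXML.
import csv
import xml.etree.ElementTree as ET

# Build a lookup of (scene, take) -> note from the script supervisor's export.
notes = {}
with open("script_notes.csv", newline="") as f:
    for row in csv.DictReader(f):
        notes[(row["scene"], row["take"])] = row["note"]

# Walk a simplified clip list and attach the matching note to each clip.
tree = ET.parse("clips.xml")
for clip in tree.getroot().iter("clip"):
    key = (clip.get("scene"), clip.get("take"))
    if key in notes:
        clip.set("note", notes[key])

tree.write("clips_with_notes.xml")
```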

From trailer to home base

During production in New Mexico, Kovac worked out of an editorial trailer equipped with a single Mac Pro and an 8TB G-Raid drive. There he was cutting using the proxy files that Final Cut Pro X can generate internally. During that 47-day period, Kovac was doing 90% of the editing. The amount of footage averaged about three hours and 40 minutes per day. In April, the unit moved back to home base in Los Angeles, where the team had two Mac Pro edit suites set up for the editors, as well as iMacs for the assistants.

John Requa and Glenn Ficarra are “hands-on” participants in the editing process. Kovac would cut in one room, while Ficarra and Requa would cut in the other. After the first preview, their collaboration slowly changed into a more traditional editor-director format. Even towards the end, Ficarra would still edit when he found time to do so. Post ended just before Christmas after a 35-week post schedule. Glenn Ficarra explains, “John and I have worked together for 30 years, so we are generally of one mind when we write, direct, or edit. Sometimes John would cut with me and I’d be the ‘fingers’ and other times he’d work with Jan. Or maybe I’d work with Jan and John would review and pick takes. So our process is very fluid.”

The Whiskey Tango Foxtrot team worked deeper into temp sound and visual effects than before. Kovac explains, “Kevin is very comfortable with sound design during the edit. And he’s a good Nuke artist, too. While I was working on one reel, Kevin could work on a different reel adding in sound effects and creating monitor comps and screen replacements. A lot of this work was done inside of Final Cut using the SliceX and TrackX plug-ins from CoreMelt. We were able to work in a 5.1 surround project and did all of our temp mixes in 5.1.” The power of the plug-ins let more of the temp effects be done inside Final Cut Pro X, resulting in a more efficient workflow with less need for roundtrips to other applications.

All media and render files were kept on the ExaSAN storage, but external to the Final Cut Pro X library files, thus keeping those small. The library files were stored on a separate NFS server (a Mac Mini using NFS Manager) with a separate FCPX library file for each reel of the film. This enabled the editors and assistants to access any FCPX library file, as long as someone else wasn’t using it at that time. A shared iTunes library for temporary sound effects and music selections was stored on the SAN with all machines pointing to that location. From within Final Cut, any editor could browse the iTunes library for music and sound effects.

When it came time for sound and picture turnovers, X2Pro Audio Convert was used to pass audio to the sound design team as an AAF file. Light Iron’s Ian Vertovec handled final color correction on their Quantel Pablo Rio system. He was working off of camera original media, which Light Iron also stored at their facility after the production. Effects shots were sent over as DPX image sequences.

Thoughts on the cut

The director’s cut for Whiskey Tango Foxtrot ran about three hours, although the final length clocked in at 1:52:00 with credits. Kovac explains, “There were 167 scripted scenes in the original script, requiring a fair amount of trimming. Once you removed something, it had consequences that rippled throughout. It took time to get it right. While it was a tougher film from that standpoint, it was easier, because no studio approval process was needed for the use of Final Cut Pro X. So it built upon the shoulders of Focus. Final Cut has proven itself as a valuable member of the NLE community. Naturally anything can be improved. For example, optical flow and auditions don’t work with multicam clips. Neither do the CoreMelt plug-ins.” Bailey adds, “For me the biggest selling point is the magnetic timeline. In areas where I would build up temp sound design, these would be the equivalent of ten tracks deep. It’s far easier to trim sections and have the audio follow along than in any other NLE.”

Glenn Ficarra wrapped up with these thoughts. He says, “A big step forward on this film was how we dealt with audio. We devised a method to keep as much as possible inside FCPX, for as long as possible – especially for screenings. This gave us more cutting time, which was nice. There was no need for any of the in-between turnovers I’ve gone through on other systems, just to prepare the movie for screenings. I like the robust third-party approach with Final Cut. It’s a small, tight-knit community. You can actually get in touch with a developer without going through a large corporation. I’d like to see Apple improve some features, like better match-back. I feel they’ve only scratched the surface with roles, so I’d like to see them develop that more.”

He concludes, “A lot of directors would like to cut for themselves, but find a tool like Avid impenetrable. It doesn’t have to be that way. My 12-year-old daughter is perfectly comfortable with Final Cut Pro X. Many of the current workflows stem from what was built up around film and we no longer work that way. Why adhere to the old film methods and rules? Filmmakers who are using new methods are those that aren’t satisfied with the status quo. They are willing to push the boundaries.”

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2016 Oliver Peters

More LUTs from IWLTBAP


With more cameras shooting in some form of a log or flat color profile and more editing software being able to integrate color look-up tables (LUTs), numerous developers have designed their own LUT packages. Some, like Koji, strive to duplicate the colorimetry of certain film stocks, while others, such as SpeedLooks from LookLabs, create stylized “look” files that give you a range of creative color correction choices.

One new developer offering a package of easy-to-use LUTs is French filmmaker IWLTBAP. Through the website, you can pick up a comprehensive package of LUTs in the 32x32x32 .cube format, which is compatible with most modern editing and compositing software applications. If you edit in Adobe Premiere Pro CC, the Lumetri Color panel lets you browse and add any .cube LUTs you’ve saved on your hard drives. If you cut in Apple Final Cut Pro X, then the addition of a LUT plug-in, like Color Grading Central’s LUT Utility, enables you to add third-party LUTs to any clip on the timeline.

I took these LUTs for a spin and, like most LUT packages, they come in groups. First you have Utility LUTs, which are designed to convert color spaces from log to Rec709 (the standard video color space) or in the opposite direction. These are organized by camera type, since not all manufacturers use the same logarithmic values. Then the color correction or “look” LUTs are grouped into Standard and Log versions.

The Standard LUTs are to be applied to images that are already in Rec709 color space, while the Log versions can be used as a one-step LUT to be applied to generic log images. For example, you could apply both a Log-to-Rec709 Utility LUT and a second LUT from the Standard group to achieve your result. Or simply apply the single Log version to that same clip and end up with similar results. The dual-LUT approach gives you more incremental control over the Log conversion based on camera models, whereas the single-step solution is designed for generic log images. However, both can yield the desired grade, depending on the clip. In addition to the paid LUT package, IWLTBAP offers two Bonus LUTs, which are available as a free download from the website.
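For the curious, here is what a 3D LUT boils down to when an application applies it: each pixel’s RGB value is used as an index into the 32x32x32 table, and the table entry becomes the new color. The Python sketch below parses a standard .cube file and applies it with a simple nearest-neighbor lookup; real tools such as Lumetri or LUT Utility interpolate between entries (typically trilinearly) for smoother results, so treat this purely as an illustration. File names are placeholders.

```python
# A stripped-down look at what applying a .cube LUT means: nearest-neighbor
# table lookup for clarity (real grading tools interpolate). Assumes an 8-bit
# RGB image and a standard LUT_3D_SIZE .cube file; file names are placeholders.
import numpy as np
from PIL import Image

def load_cube(path):
    size, rows = None, []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if not parts or parts[0].startswith("#"):
                continue
            if parts[0] == "LUT_3D_SIZE":
                size = int(parts[1])
            elif len(parts) == 3 and size is not None:
                try:
                    rows.append([float(v) for v in parts])
                except ValueError:
                    continue  # skip non-data keyword lines
    # .cube data is ordered with red varying fastest and blue slowest.
    return np.array(rows).reshape(size, size, size, 3), size

def apply_lut(image_path, cube_path, out_path):
    lut, size = load_cube(cube_path)
    rgb = np.asarray(Image.open(image_path).convert("RGB"), dtype=np.float32) / 255.0
    idx = np.clip((rgb * (size - 1)).round().astype(int), 0, size - 1)
    # Index as [blue, green, red] because red varies fastest in the file order.
    graded = lut[idx[..., 2], idx[..., 1], idx[..., 0]]
    out = (np.clip(graded, 0.0, 1.0) * 255).astype(np.uint8)
    Image.fromarray(out).save(out_path)

apply_lut("frame.jpg", "F-9490-STD.cube", "frame_graded.jpg")
```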

There are over 80 LUTs in each group and these are organized by color style and number. The numbers don’t really mean anything. In other words, they aren’t an attempt to mimic a film stock number. As you ascend in numbers, the next step is a more aggressive or somewhat different version of the previous one. The key is the prefix and suffix for each. These LUT files carry a STD or LOG suffix so you know whether they come from the Standard or Log group. Then there’s a prefix: C for cold, H for hot, W for warm, F for film, and X for creative. Each style has several variations within that general look. For example, the LUT file labelled “F-9490-STD.cube” is a LUT with a filmic curve designed for a Rec709 image.

When working with LUTs, it’s often hard to know what result you’ll get until you try it. Then, if you don’t like the look, you have to continue to slowly browse through your LUT files – applying each, one at a time – until you get the right look. Often that can lead to a lot of trial and error. The IWLTBAP package ships with lightweight Windows and Mac preview applications; however, the developer warns of occasional instability on some machines. The easiest solution is to use their web-based LUT previewer. Simply upload a reference JPEG from your clip and then toggle through the LUTs to preview how those will affect the shot.

I ran some tests on Blackmagic Design camera footage in both FCPX and Premiere Pro CC and got some really pleasing results. In the case of FCPX, if you use LUT Utility, you have to copy the .cube files into LUT Utility’s Motion Templates folder. This is found under Effects/CGC. Files stored there become visible in the LUT Utility pulldown menu. Note that only the first 50 or so files in that folder can be accessed, so be selective. If you apply two instances of LUT Utility to a clip, then you can apply a Log-to-Rec709 conversion in the first and the creative look LUT in the second. This plug-in has a mix slider, so you can adjust the intensity of the LUT to taste. Since it’s an effects plug-in, you can also place other effects, such as color correction, in between the two LUT Utility instances as part of that stack of effects. Doing this gives you nice control over color within FCPX with very little overhead on the application’s performance.

If you are an FCPX user who has adopted Color Grading Central’s ColorFinale grading tool as your go-to color correction plug-in, then all of this LUT management within the application can simply be handled from the ColorFinale interface itself. Stack layers of LUTs and other color tools all inside the ColorFinale panel. LUT choices can be added or removed using the integrated LUT Manager; relaunch FCPX to activate them as part of ColorFinale.

If you are a Premiere Pro CC editor, then the latest version has been enhanced with the Lumetri Color panel. This control is organized as a stack of color modules, which include two entry points to add a LUT – in the Basic and the Creative tabs. In my testing of the new URSA footage, I applied a Log-to-Rec709 LUT for the URSA in Basic and then one of the “look” LUTs, like the free Aspen standard version, in Creative. You still have all the other color controls in the Lumetri panel to fine-tune these, including the intensity level of the LUT.

LUTs are a creative tool that should be thought of as a stylistic choice. They aren’t an instant fix and shouldn’t be the only tool you use to color correct a clip. However, the LUTs from IWLTBAP provide a good selection of looks and moods that work well with a wide range of shots. Plus the package is very affordable and even more so if you get it after reading this blog! Readers who are interested can get 25% off the retail price using the discount code DIGITALFILMS. Or by using this direct link.

Last but not least, check out the free, downloadable 4K film grain clip. It’s a ten-second ProRes file that can be overlaid or blended to add grain to your shot.

©2016 Oliver Peters