4K is kinda meh


Lately I’ve done a lot of looking at 4K content. Not only was 4K all over the place at NAB in Las Vegas, but I’ve also had to provide some 4K deliverables on client projects. This has meant a much closer examination of the 4K image than in the past.

First, let’s define 4K. Typically the term 4K applies to either a “cinema” width of 4096 pixels or a broadcast width of 3840 pixels. The latter is also referred to as QuadHD, UltraHD or UHD and is a 2x multiple of the 1920-wide HD standard. For simplicity’s sake, in this article I’m going to be referring to 4K, but will generally mean the UHD version, i.e. 3840 x 2160 pixels, aka 2160p. While 4K (and greater) acquisition for an HD finish has been used for a while in post, there are already demands for true 4K content. This vanguard is notably led by Netflix and Amazon; however, international distributors are also starting to request 4K masters, if they are available.

In my analysis of the images from various 4K (and higher) cameras, it starts to become quite obvious that the 1:1 image in 4K really isn’t all that good. In fact, if you compare a true 4K image with a blow-up from HD to 4K of that same image, it becomes very hard to distinguish the blow-up from the true 4K original. Why is that?

When you analyze a native 4K image, you become aware of the deficiencies in the image. These weren’t as obvious when that 4K original was down-sampled to an HD timeline and master. That’s because in the HD timeline you are seeing the benefit of oversampling, which results in a superb HD image. Here are some factors that become more obvious when you view the footage in its original size.

1. Most formats use a high-compression algorithm to squeeze the data into a smaller file size. In some cases compression artifacts start to become visible at the native size.

2. Many DPs like to shoot with vintage or otherwise “lower quality” lenses. This gives the image “character” and, in the words of one cinematographer that I worked with, “takes the curse off of the digital image.” That’s all fine, but again, viewed natively, you start to see the defects in the optics, like chromatic aberration in the corners, coloration of the image, and general softness.

3. Due to the nature of video viewfinders, run-and-gun production methods, and smaller crews, many operators do not nail the critical focus on a shot. That’s not too obvious when you down-convert the image; however, at 100% you notice that focus was on your talent’s ear and not their nose.

The interesting thing to me is that when you take a 4K (or greater) image, down-convert it to HD, and then up-convert it back to 4K, much of the image detail is retained. I’ve especially noticed this when high-quality scalers are used for the conversion. For example, even the free version of DaVinci Resolve offers one of the best up-scalers on the market. Furthermore, scaling from 1920 x 1080 to 3840 x 2160 is an even 2x multiple, so a) the amount you are zooming in isn’t all that much, and b) even-numbered multiples give you better results than fractional values. In addition, Resolve also offers several scaling methods for sharper versus smoother results.
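The arithmetic behind that even 2x multiple can be sketched in a few lines of Python. This is strictly a toy illustration, not how Resolve’s scaler actually works: with a 2x factor, every “HD” pixel maps to an exact 2x2 block of “4K” pixels, so areas of flat detail survive the down/up round trip unchanged, while a fractional factor would force every output pixel to be interpolated from partial source pixels.

```python
# Toy 2x down- and up-sampling on a tiny grayscale "image" (a list of
# rows). Real scalers use far more sophisticated filters; this only
# shows why an even multiple is the clean case: each output pixel maps
# to an exact 2x2 block of input pixels.

def downsample_2x(img):
    """Average each 2x2 block into one pixel (simple box filter)."""
    h, w = len(img), len(img[0])
    return [
        [
            (img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

def upsample_2x(img):
    """Duplicate each pixel into a 2x2 block (nearest neighbor)."""
    out = []
    for row in img:
        doubled = [p for p in row for _ in (0, 1)]
        out.append(doubled)
        out.append(list(doubled))
    return out

# A 4x4 "4K" frame made of flat 2x2 blocks of detail.
frame = [
    [10, 10, 20, 20],
    [10, 10, 20, 20],
    [30, 30, 40, 40],
    [30, 30, 40, 40],
]
small = downsample_2x(frame)    # the 2x2 "HD" version
restored = upsample_2x(small)   # back up to 4x4
print(small)                    # [[10.0, 20.0], [30.0, 40.0]]
print(restored == frame)        # True -- flat blocks survive the round trip
```

Fine detail smaller than a 2x2 block is of course lost in the down-convert, which is why the round trip only looks good when the original 4K image didn’t have much true pixel-level detail to begin with.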

In general, I feel that the most quality is retained when you start with 4K footage rather than HD, but that’s not a given. I’ve blown up ARRI ALEXA clips – that only ever existed as HD – to 4K and the result was excellent. That has a lot to do with what ARRI is doing in their sensor and the general detail of the ALEXA image. Clearly that’s been proven time and time again in the theaters, where footage recorded on ALEXAs in 2K, HD or 2.8K ARRIRAW has been blown up for 4K projection onto the large screen and the image is excellent.

Don’t get me wrong. I’m not saying you shouldn’t post in 4K if you have an easy workflow (see my post next week) to get there. What I am saying is that staying in 4K versus a 4K-HD-4K workflow won’t result in a dramatic difference in image quality, when you compare the two side-by-side at 100% pixel-for-pixel resolution. The samples below come from a variety of sources, including the blogs of John Brawley, Philip Bloom and OffHollywood Productions. In some cases the source images originated from pre-production cameras, so there may be image anomalies not found in actual shipping models of these cameras. Grades applied are mine.


©2016 Oliver Peters


Australian Design Shines with Blackmagic


One of the things to do in the week after NAB is to scour the internet to pick up those gems I might have missed at the show. I was curious to run across a blurb at RedShark News about a prestigious design award picked up by Blackmagic Design.

Anyone in this industry who’s been exposed to any Blackmagic product knows that the company has a sense of taste when it comes to industrial design, packaging, and even their website. Products, like their rack-mounted gear and cameras, have a certain finesse even down to the screws that hold them together. One look at DaVinci Resolve and you know they’ve aimed at the best-looking and easiest-to-navigate user interface of any NLE. The redesign of the Cintel Scanner is like an art piece to hang on the wall.

This year they’ve been honored as the Design Team of the Year by the Red Dot Awards. This is a design competition founded by German industrial designer Professor Dr. Peter Zec, former president of Icsid (International Council of Societies of Industrial Design) and current head of the German design center, Design Zentrum Nordrhein Westfalen. The Design Team of the Year Award (which is awarded and not competed for) goes to one company each year. Blackmagic Design is in good company, as past winners include Apple, Porsche, and frog design (who has been closely involved with Apple over the years) – among many others.

Blackmagic’s design team is headed by Simon Kidd, Director of Industrial Design, who’s been with the company for ten years. This is the first time the honor has gone to an Australian firm and highlights the outstanding work being done down under. That design aesthetic can be seen not only at Blackmagic, but at other Australian firms, too, including Atomos and Rode Microphones. It’s nice to see this recognition go to any company in the film and video space, but even better when it goes to someone who really values design along with solid functionality.

©2016 Oliver Peters

Automatic Duck Xsend Motion


When Apple transitioned its Final Cut Pro product family from Final Cut Studio to Final Cut Pro X, Motion 5, and Compressor 4, it lost a number of features that editors really liked. Some of these “missing” features show up as consistent and recurring requests on various wish lists. One of the most popular is the roundtrip function that sent Final Cut Pro “classic” timelines over to Motion for further compositing. To many, it seemed like Motion had become relegated to being a fancy development tool for FCPX plug-ins, rather than what it is – a powerful, GPU-enabled compositor.

At last, that workflow hole has been plugged, thanks to Automatic Duck. Last year the father/son development team brought us a way to go from Final Cut Pro X to Adobe’s After Effects by way of the Automatic Duck Ximport AE bridge. This week at the FCP Exchange Workshop in Las Vegas, Wes Plate reveals the new Automatic Duck Xsend Motion. This tool leverages the power of FCPX’s version of XML to move data from one application to the other. Thanks to FCPXML, it provides a bridge to send FCPX timelines, clips, or sections of timelines over to Motion 5.

Xsend Motion reads FCPXML exports or is able to process projects directly from the Final Cut Pro X Share menu. The Xsend menu enables a number of settings options, including whether to bring clips into Motion as individual clips or as what Automatic Duck has dubbed “lanes”. When clips are left individual, each clip is assigned a layer in Motion for a composition made up of a series of cascading layers. If you opt for lanes, then the Motion layers stay grouped in a similar representation to the FCPX project timeline. This way primary and secondary storylines and connected clips are properly configured. Xsend also interprets compound clips.
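To get a feel for what a tool like Xsend Motion is parsing, here is a heavily simplified, hypothetical fragment of the kind of structure FCPXML uses. The names and ids are made up, and a real export carries many more resources and attributes; the point is that a connected clip’s lane attribute is what lets a translator keep clips grouped in storyline-style “lanes” rather than flattening everything into plain layers:

```xml
<!-- Illustrative fragment only; ids, names, and durations are hypothetical. -->
<project name="Promo">
  <sequence format="r1">
    <spine>
      <!-- a clip in the primary storyline -->
      <asset-clip ref="asset1" name="Interview" offset="0s" duration="10s">
        <!-- a connected clip: lane="1" places it above the primary
             storyline, which a translation can preserve as a "lane" -->
        <asset-clip ref="asset2" name="B-roll" lane="1" offset="2s" duration="4s"/>
      </asset-clip>
    </spine>
  </sequence>
</project>
```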

Automatic Duck is striving to correctly interpret all of the FCPX characteristics, including frame sizes, rates, cropping, and more. Since Final Cut Pro X and Motion 5 are essentially built upon the same engine, the translation will correctly interpret most built-in effects. However, it may or may not interpret custom Motion templates that individual users have created. In addition, they plan on being able to properly translate many of the effects in the FxFactory portfolio, which typically install into both FCPX and Motion.

While Xsend Motion and Ximport AE are primarily one-way trips, there are ways to send the finished result back to Final Cut Pro X from Motion 5. The first and most obvious is simply to render the Motion composition as a flattened QuickTime movie and import that back into FCPX as new media. However, you can also publish the Motion composition as an FCPX Generator. This would then show up in the Generators portion of the Effects Palette as a custom generator effect.

Automatic Duck Xsend Motion will be officially released later this year. The price hasn’t been announced yet. Current Automatic Duck products (Automatic Duck Ximport AE and Automatic Duck Media Copy) are available through Red Giant.

©2016 Oliver Peters

Spring Tools


It’s often the little things that improve your editing workflow. Here are a few quick items that can expand your editing arsenal.

Hawaiki Super Dissolve

The classical approach to editing transitions suggests that all you need is a cut and a dissolve. Given how often most editors use a dissolve transition, it’s amazing that few NLE developers spend any time creating more than a basic video dissolve, fade or dip. After all, even the original Media Composer came with both a video and a film-style dissolve. Audio mixers are used to several different types of crossfades.

Since this is such a neglected area, the development team behind the Hawaiki plug-ins decided to create Super Dissolve – a dissolve transition plug-in for Final Cut Pro X with many more options. It installs through the FxFactory application and shows up in the FCPX transitions palette as a dissolve effect, plus a set of presets for fades, dips and custom curves. A dissolve is nothing more than a blend between two images, so Super Dissolve exposes the same types of under-the-hood controls that After Effects and Photoshop artists are used to from compositing modes.

Drop the Super Dissolve in as a transition and you have control over blending modes, layer order, easing controls with timing, and the blurring of the outgoing and/or incoming image. Since you have control over the outgoing and incoming clips separately, different values can be applied to either side, thus enabling an asymmetrical effect – for example, a quick fade with a blur off the outgoing clip, while bringing the incoming side up more slowly. As with the default FCPX dissolve, there’s also an audio crossfade adjustment, since FCPX transitions can affect both audio and video when these elements are combined. If you really like the ability to finesse your transitions, then Super Dissolve hits the spot.
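Since a dissolve really is just a blend, the idea of an asymmetrical transition is easy to picture in code. The easing curves below are made up for illustration (they’re not Hawaiki’s actual parameters); the point is that nothing forces the outgoing and incoming sides to fade on the same ramp:

```python
# Sketch of an asymmetrical dissolve: the outgoing and incoming clips
# each follow their own easing curve. Curve names are illustrative.

def ease_out_fast(t):
    """Opacity of the outgoing clip: drops away quickly (1 -> 0)."""
    return (1 - t) ** 2

def ease_in_slow(t):
    """Opacity of the incoming clip: rises slowly at first (0 -> 1)."""
    return t ** 2

def blend_pixel(out_px, in_px, t):
    """Composite one grayscale pixel at transition progress t (0..1)."""
    return out_px * ease_out_fast(t) + in_px * ease_in_slow(t)

# Midway through, both sides are at only 25% opacity, so the frame
# dips dark -- the classic film-style dissolve look.
print(blend_pixel(200, 100, 0.0))   # 200.0 (pure outgoing)
print(blend_pixel(200, 100, 0.5))   # 75.0  (both sides eased down)
print(blend_pixel(200, 100, 1.0))   # 100.0 (pure incoming)
```

Swap in linear curves on both sides and you get the plain video dissolve every NLE ships with; the extra control is entirely in the choice of curves.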

XEffects Audio Fades

Free is good, so check out Idustrial Revolution’s free effects. Although they are primarily a video effects developer for Motion and Final Cut Pro X, they recently added a set of audio fade presets for FCPX. Download and install the free pack and you’ll find the XEffects Fades group in the audio plug-ins section of your effects palette.

XEffects Fades includes a set of preset fade handles, which are applied to the audio on your timeline clips. Drag and drop the preset with the fade length closest to what you want and it automatically adjusts the fade handle length at both ends of that audio clip. If you want to tweak the length, apply the effect first and then adjust the length puck on the clip as needed. Existing lengths will be overwritten when you drop the effect onto the clip, so make sure you make these adjustments last.

AudioDenoise and EchoRemover

CrumplePop is another developer known for its video effects; but they, too, have decided to add audio effects to their repertoire. AudioDenoise and EchoRemover are two Final Cut Pro X plug-ins sold through the FxFactory application. These two effects are easy-to-use Apple Audio Units filters designed to improve poorly recorded location audio. As with Apple’s own built-in controls, each filter includes a few sliders to adjust strength and how the effect is applied. When applying any audio “clean up” filter, a little goes a long way. If you push it to its extreme range, the result sounds like you are underwater. Nevertheless, these two filters do a very nice job with poor audio, without the cost and complexity of other well-known audio products.

Alex4D Animated Transitions

For a little bit of spice in your Final Cut Pro X timelines, it’s worth checking out the Alex4D Animated Transitions from FxFactory. Alex Gollner has been a prolific developer of free Final Cut Pro plug-ins, but this is his first commercial effort. Animated Transitions are a set of 120 customizable transition effects to slide, grow, split and peel incoming or outgoing clips and lower third titles. Traditionally you’d have to build these effects yourself using DVE moves. But by dropping one of these effects onto a cut point between two clips, you quickly apply a dynamic effect with all the work already done. Simply pick the transition you like, tweak the parameters and it’s done.

Post Notes

One of the best features of Adobe applications is Extensions. This is a development “hook” within Premiere Pro or After Effects that allows developers to create task-oriented panels, tools and controls that effectively “bolt” right into the Adobe interface. One example for After Effects would be TypeMonkey (and the other “Monkeys”), which are kinetic effect macros. For Premiere there’s PDFviewer, which enables you to view your script (or any other document) in PDF format right inside the Premiere user interface.

A new extension for Premiere Pro CC is Post Notes. Once installed, it’s an interface panel within Premiere Pro that functions as a combined notepad and to-do list. These are tied to a specific sequence, so you can have a set of notes and to-dos for each sequence in your project. When a to-do item is completed, check it off to indicate that it’s been addressed. This tool is so straightforward and simple, you’ll wonder why something like this isn’t already built into every NLE.

Hedge for Mac

With digital media as a way of life for most editors, we have to deal with more and more camera media. Quickly copying camera cards is a necessary evil and making sure you do this without corruption is essential. The Mac Finder really is NOT the tool you should be using, yet everyone does it. There are a number of products on the market that copy to multiple locations with checksum verification. These are popular with DITs and “data wranglers” and include Pomfort Silverstack, Red Giant Offload, and even Adobe Prelude.

A newcomer is Hedge for Mac. This is a simple, single-purpose utility designed to quickly copy files and verify the copies. There’s a free and a paid version. If you just want to copy to one or two destinations at a time, the free version will do. If you need to copy to more destinations simultaneously, then go for the paid version. Hedge will also launch your custom AppleScripts to sort, transcode, rename or perform other functions. Transfers are fast in the testing I’ve done, so this is a must-have tool for any editor.
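Conceptually, what these offload utilities do can be sketched in a few lines of Python: copy each file to every destination, then re-read and hash both sides to confirm each copy is bit-for-bit intact. This is only an illustration of checksum verification, not Hedge’s actual implementation:

```python
# Minimal sketch of checksum-verified offload: copy every file to each
# destination, then hash source and copy to confirm they match.
import hashlib
import shutil
from pathlib import Path

def file_hash(path, chunk_size=1 << 20):
    """Stream the file through SHA-256 so large camera files fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verified_copy(source_dir, dest_dirs):
    """Copy every file under source_dir to each destination, verifying each copy."""
    for src in Path(source_dir).rglob("*"):
        if not src.is_file():
            continue
        src_digest = file_hash(src)
        for dest_dir in dest_dirs:
            dst = Path(dest_dir) / src.relative_to(source_dir)
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)  # copy2 preserves timestamps
            # Re-read the destination from disk; a Finder-style copy
            # skips exactly this step.
            if file_hash(dst) != src_digest:
                raise IOError(f"checksum mismatch copying {src} -> {dst}")
    print("all copies verified")
```

The re-read of the destination is the whole point: a plain Finder copy reports success as soon as the write is queued, while a verified copy only passes once the bytes on the destination volume hash identically to the source.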

©2016 Oliver Peters

Deadpool


Adobe has been on a roll getting filmmakers to adopt its Premiere Pro CC editing software for feature film post. Hot on the heels of its success at Sundance, where a significant number of the indie films were edited using Premiere Pro, February saw the release of two major Hollywood films that were cut using Premiere Pro – the Coen Brothers’ Hail, Caesar! and Tim Miller’s Deadpool.

Deadpool is one of Marvel Comics’ more unconventional superheroes. Deadpool, the film, is the origin story of how Wade Wilson (Ryan Reynolds) becomes Deadpool. He’s a mercenary soldier who gains accelerated healing powers through a rogue experiment. Left disfigured, but with new powers, he sets off to rescue his girlfriend (Morena Baccarin) and find the person responsible. Throughout all of this, the film is peppered with Deadpool’s wise-cracking and breaking the fourth wall by addressing the audience.

This is the first feature film for director Tim Miller, but he’s certainly not new to the process. Miller and his company Blur Studios are known for their visual effects work on commercials, shorts, and features, including Scott Pilgrim vs. the World and Thor: The Dark World. Setting out to bring as much of the post process in-house as possible, Miller consulted with his friend, director David Fincher, who recommended the Adobe Creative Cloud solution, based on Fincher’s experience during Gone Girl. Several editing bays were established within Blur’s facility – using new, tricked-out Mac Pros connected to an Open Drives Velocity SSD 180TB shared storage solution.

Plugging new software into a large VFX film pipeline

Julian Clarke (Chappie, Elysium, District 9) came on board to edit the film. He explains, “I talked with Tim and was interested in the whole pioneering aspect of it. The set-up costs to make these permanent edit suites for his studio are attractive. I learned editing using [Apple] Final Cut Pro at version one and then I switched to Avid about four years later and have cut with it since. If you can learn [Avid] Media Composer, then [Adobe] Premiere Pro is fine. I was up to about 80% of my normal speed after just two days.”

To ease any growing pains of using a new editing tool on such a complex film, Miller and Adobe also brought in feature film editor Vashi Nedomansky (That Which I Love Destroys Me, Sharknado 2: The Second One, An American Carol) as a workflow consultant. Nedomansky’s job was to help establish a workflow pipeline and to get the editorial team up to speed with Premiere Pro. He had performed a similar role on Gone Girl. He says, “I’ve cut nine features and the last four have been using Premiere Pro. Adobe has called on me for that editor-to-editor interface and to help Blur set up five edit bays. I translated what we figured out with Gone Girl, but adapted it to Blur’s needs, as well as taking into consideration the updates made to the software since then. During the first few weeks of shooting, I worked with Julian and the assistant editors to customize their window layouts and keyboard shortcuts, since prior to this, the whole crew had primarily been using Avid.”

Deadpool was shot mostly with ARRI ALEXA cameras recording open gate 2.8K ARRIRAW. Additional footage also came from Phantom and RED cameras. Most scenes were recorded with two cameras. The original camera files were transcoded to 2K ProRes dailies in Vancouver. Back at Blur, first assistant editor Matt Carson would sync audio and group the clips into Premiere Pro multicam sequences.

Staying up with production

As with most features, Clarke was cutting while the production was going on. However, unlike many films, he was ready to show Miller edited scenes for review within 24 hours after the shoot had wrapped for the day. Not only a cut scene, but one already fleshed out with temporary sound effects and music. This is quite a feat, considering that Miller shot more than 500 hours of footage. Seeing a quick turnaround of edited scenes was very beneficial for Miller as a first-time feature director. Clarke adds, “My normal approach is to start cutting and see what works as a first draft. The assistant will add sound effects and temp music and if we hit a stumbling block, we move on to another scene. Blur had also created a lot of pre-vis shots for the effects scenes prior to the start of principal photography. I was able to cut these in as temp VFX. This way the scenes could play through without a lot of holes.”

To make their collaborative workflow function, Nedomansky, Clarke, and the assistants worked out a structure for organizing files and Premiere Pro projects. Deadpool was broken into six reels, based on the approximate page count in the script where a reel break should occur. Every editor had their own folder on the Open Drives SAN containing only the most recent version of whatever project that they were working on. If Julian Clarke was done working on Reel 1, then that project file could be closed and moved from Clarke’s folder into the folder of one of the assistants. They would then open the project to add temporary sound effects or create some temporary visual effects. Meanwhile, Clarke would continue on Reel 2, which was located in his folder. By keeping only the active project file in the various folders and moving projects among editors’ folders, it would mimic the bin-locking method used in shared Avid workflows.

In addition, Premiere Pro’s Media Browser module would also enable the editors to access and import sequences found within other project files. This is a non-destructive process. Older versions of the project files would be stored in a separate folder on the SAN in order to keep the active folders and projects uncluttered. Premiere Pro’s ability to work with folders as they were created in the Finder let the editors do more of the organization at the Finder level than they normally would, had they been cutting with Avid systems.

Cutting an action film

Regardless of the software you use, each film presents a unique set of creative challenges. Clarke explains, “One scene that took a while was a long dialogue scene with Deadpool and Colossus on the highway. It’s quintessential Deadpool with a lot of banter and improv from Ryan. There’s not much story going on in the background at that time. We didn’t want to cut too much out, but at the same time we didn’t want to have the audience get lost in what’s supposed to be the bigger story. It took some time to strike the right balance. Overall the film was just about right. The director’s cut was about two hours, which was cut to the final length of one hour and 45 minutes. That’s just about the right amount to cut out, because you don’t end up losing so much of the heart of the film.”

Many editors have a particular way they like their assistants to organize bins and projects. Clarke offers, “I tend to work in the frame view and organize my set-ups by masters, close-ups, and so on. Where I may be a little different than other editors is how I have my assistants organize action scenes. I’ll have them break down the choreography move-by-move and build a sequence of selected shots in the order of these moves. So for example, all the angles of the first punch, followed by all the angles of the next move – a punch, or block, or kick. Action scenes are often shot with so much coverage, that this lets me quickly zero in on the best stuff. It eliminates the scavenger hunt to find just the right angle on a move.”

The script was written to work in a nonlinear order. Clarke explains how that played out through the edit, “We stood by this intention in the editing. We found, in fact, that the film just didn’t work linearly at all. The tones of the two [scripted] timelines are quite different, with the more serious undertones of the origin story and the broad humor of the Deadpool timeline. When played sequentially, it was like oil and water – two totally different movies. By interweaving the timelines, the tone of the movie felt more coherent, with the added bonus of being able to front-load action into the movie to excite the audience before getting into the heavier cancer part of the story.”

One editing option that might come to mind is that a character in a mask offers an interesting opportunity to change dialogue without difficult sync issues. However, it wasn’t the sort of crutch some might assume. Clarke says, “Yes, the mask provided a lot of opportunity for ADR. Though this was used more for tweaking dialogue for plot clarity or to try out alternate jokes than for wholesale replacement of the production track. If we liked the production performance we generally kept it, and embraced the fact that the mask Ryan was wearing would dull the audio a bit. I try to use as little ADR as possible, when it comes to it being used for technical reasons, rather than creative ones. I feel like there’s a magic that happens on set that is often hard to replicate in the ADR booth.”

Pushing the envelope

The editing systems proved to offer the performance needed to complete a film of this size and complexity. Vashi Nedomansky says, “There were 1400 effects shots handled by ten vendors. Thanks to the fact that Blur tricked out the bays, the editors could push 10 to 15 layers of 2K media at a time for temp effects – in real-time without rendering. When the film was locked, audio was exported as AAF for the sound facility along with an H.264 picture reference. Blur did many of the visual effects in-house. For final picture deliverables, we exported an XML from Premiere Pro, but also used the Change List tool from Intelligent Assistance. This was mainly to supply the list in a column format that would match Avid’s output to meet the studio’s requirements.”

I asked Clarke and Nedomansky what the team liked best about working with the Adobe solution. Nedomansky says, “I found that the editors really liked the tilde key [on the keyboard], which in Premiere Pro brings any window to fullscreen. When you have a timeline with 24 to 36 tracks of temp sound effects, it’s really nice to be able to make that fullscreen so that you can fine-tune them. They also liked what I call the ‘pancake timeline’. This is where you can stack two timelines over each other to compare or pull clips from one into the other. When you can work faster like this, there’s more time for creativity.” Clarke adds, “I used a lot of the time-remapping in After Effects. Premiere Pro’s sub-frame audio editing is really good for dialogue. When Avid and Apple were competing with Media Composer and Final Cut Pro it was very productive for both companies. So competition between Avid and Adobe is good, because Premiere Pro is very forward-thinking.”

Many NLE users may question how feature films apply to the work they do. Nedomansky explains, “When Kirk Baxter used Premiere Pro for Fincher’s Gone Girl, the team requested many features that they were used to from Final Cut Pro 7. About 200 of those suggestions have found their way as features into the current release that all Creative Cloud customers receive. Film editors will stress a system in ways that others won’t, and that information benefits all users. The important takeaway from the Deadpool experience is that after some initial adjustment, there were no showstoppers and no chaos. Deadpool is a monster film, but these are just tools. It’s the human in the chair making the decision. We all just want to work and not deal with technical issues. Whatever makes the computer invisible – that’s the power.”

Deadpool is certainly a fun ride, with a lot of inside jokes for veteran Marvel fans. Look for the Stan Lee cameo and be sure to stay all the way through the end credits!

Watch director Tim Miller discuss the choice to go with Adobe.

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2016 Oliver Peters