Molly’s Game

Molly Bloom’s future looked extremely bright: a shot at Olympic skiing glory, followed by entry into a top law school. But an accident during qualifying trials for the U.S. ski team knocked her out of the running for the Salt Lake City Games. (Bloom notes in her own memoir that it was her decision to retire and change the course of her life, rather than the minor accident.) She moved to Los Angeles and ended up running high-stakes, private poker games with her boss at the time. These games included A-list celebrities, hedge fund managers, and eventually, members of the Russian mob. Bloom quickly earned the nickname of the “poker princess”. It all came crashing down when Bloom was busted by the FBI and sentenced for her role in the gambling ring.

Bloom’s memoir came to the attention of screenwriter Aaron Sorkin (The Social Network, Moneyball, Steve Jobs), who not only made this his next film script, but also his debut as a film director. Sorkin stayed close to the facts that Bloom described in her own memoir and consulted her during the writing of the screenplay. The biggest departure is that Bloom had named some of the celebrities at these games, whose identities had previously been revealed in released court documents. Sorkin opted to fictionalize them, explaining that he would rather focus the story on Bloom’s experiences and not on Hollywood gossip. Jessica Chastain (The Zookeeper’s Wife, A Most Violent Year, Zero Dark Thirty) stars as Molly Bloom.

Although three editors are credited for Molly’s Game, the back story is that a staggered schedule had to be worked out. The post-production of Steve Jobs connected feature film editor Elliot Graham (Milk, 21, Superman Returns) with that film’s writer and director – Sorkin and Danny Boyle (T2 Trainspotting, 127 Hours, Slumdog Millionaire). Graham was tapped to cut Molly’s Game partway into the process, replacing its original editor, and brought on Josh Schaeffer (The Last Man on Earth, Detroiters, You’re the Worst) as associate editor. Graham started the recut with Schaeffer, but a prior schedule commitment to Boyle’s Trust saw him exiting the film early. (Trust is FX’s adaptation of the Getty kidnapping story.) Graham was able to bring the film about 50% of the way through post. Alan Baumgarten (Trumbo, American Hustle, Gangster Squad) picked up where Graham left off and edited with Schaeffer to the finish, thus earning all three an editing credit.

Working with a writer on his directorial debut

It can always be a challenge when a writer is close to the editing process. Scenes that may be near and dear to the writer are often cut, leading to tension. I asked the three about this situation. Graham says, “Aaron has always been on set with his other films and worked very closely with the director. So, he understands the process, having learned from some of the best directors in the business. I had a great time with Aaron on Steve Jobs. He’s an incredibly lovely and generous collaborator who brings out the best in his team.”

Baumgarten expands, “Working with Aaron was fun, because he appreciates being challenged. He’s open to seeing what an editor brings to the film. Aaron wrote a tight script that didn’t need to be re-arranged. Only about 20 minutes came out. We cut one small scene, but it was mostly trimming here and there. You want to be careful not to ruin the rhythm of his writing.”

Graham continues, “Aaron also found his own visual vocabulary. A lot of the story is told in time jumps, from present day to the past in flashbacks. Aaron is always looking for rapid-fire, overlapping dialogue. It’s part of his uniqueness and it’s a joy to cut. What was new for Aaron was using voice-over to drive things.”

Another new challenge was the use of stock footage. About 150 stock shots were used for cutaways and mini-montages throughout the film, most of which were never in the original script. Graham says, “Stock footage was something I chose to start injecting into the film with Aaron’s collaboration when I came on. We felt it was useful to have visual references for some of the voice-overs – to connect visuals with words, which helps to land Aaron’s linguistic ideas for viewers. This began with the opening ski sequence – the first thing I cut when I came on board.”

The editors would pull temp shots from a variety of internet sources; then the actual footage had to be sourced and cleared. They ultimately partnered with STALKR to find and clear all of the stock shots that were used. Visual effects were handled by Mr. X in Toronto. Originally, only 90 shots were budgeted (for example, snow falling in the ski sequences), but in the end, there were almost 600 visual effects shots in the final film.

Musicality of the performance

Baumgarten explains the musicality of Sorkin’s style. He says, “Aaron knew the film he wanted and had that in his head. Part of his writing process is to read his dialogue out loud and listen for the cadence of the performance. As you go through takes, the film is always moving in the right direction. As a writer/director, he doesn’t need variations or ad libs in an actor’s performance from one take to another, because he knows what the intention of the line is. As editors, we didn’t need to experiment with different calibrations of the performance. The experimentation came in with how we wove in the voice-over and played with the general rhythm.”

Graham adds, “Daniel Pemberton is the composer I worked with on Steve Jobs. I brought on Carl Kaller, a great music editor, when I came on. I knew that the music and dialogue had to dance a beautiful rhythm together for the film to be its best. With a compressed schedule to finish the film, we needed someone like Carl to help choreograph that dance.”

Baumgarten continues, “Daniel was involved early and provided us with temp tracks, which was a great gift. We didn’t have to use scores from other composers as temp music. Carl was just down the hall, so it was easy to weave Daniel’s temp elements in and around the dialogue and voice-over during the editing stage. There is interplay between the voice-over and the music, and the VO is like another musical element.”

Avid for the post

The post operation followed a standard feature-film set-up: Avid Media Composer editing workstations tied to Avid ISIS shared storage. The film was shot digitally using ARRI Alexas.

Production covered 48 days ending in February [2017]. It took 10 weeks to get to a director’s cut and then editing on Molly’s Game continued for about six months, which included visual effects, final sound mix and color correction. Schaeffer explains, “The dialogue scenes were scripted using [Avid] ScriptSync. Aaron was familiar with ScriptSync from The Newsroom, and it was a great help for us on this film. It’s the best way to have everything readily available and it allows us to be extremely thorough. If Aaron wanted to change a single word in a take, we were always able to find all of the alternates and make the change quite easily.”

Schaeffer continues, “Aaron methodically worked in a reel-by-reel order. We would divide up sequences between us at breaks that made sense. But when it came time to review the cut on a sequence, we would all review together. A lot of people think that you have three editors on a film because the project is so difficult. The truth is that it lets you be more creative. Productions shoot so much footage these days that it’s great to be able to experiment. Having multiple editors on a film enables you to take the time to be creative. We were all glad that Aaron set up an environment that made that possible.”

Originally written for Digital Video magazine / Creative Planet Network

©2018 Oliver Peters


Blackmagic Design DaVinci Resolve 14

DaVinci Resolve has made its mark as one of the premier color correction applications for the film and video industries. With the introduction of Resolve 14*, it’s clear that Blackmagic Design has set its sights higher. Advanced editing functions and the inclusion of the Fairlight audio engine put Resolve on track to be the industry’s latest all-in-one post-production powerhouse. I’ve reviewed Resolve in the past as a grading application, but my focus here is editing. Right at the start, let me paraphrase the judges on History Channel’s Forged in Fire series – ‘This NLE can cut!’ If you have no prior allegiances to other editing platforms, then using Resolve as your NLE of choice is a no-brainer.

(*This review was originally written right after the release of Resolve 14 in late 2017.)

DaVinci Resolve 14 comes in two flavors, DaVinci Resolve 14 (free) and DaVinci Resolve Studio ($299). Upgrades have been free to date. It’s the only NLE to support three operating systems: macOS, Windows, and Linux. Mac users also have the option to download Resolve (free) or purchase Resolve Studio through the Apple Mac App Store. These versions are basically the same as those on Blackmagic Design’s website, but with some differences, due to the requirement that App Store software be sandboxed.

The free version of Resolve offers the majority of the same features as Resolve Studio. The primary limitations are that exports are capped at UltraHD (3840×2160), and that features such as stereo 3D, lens distortion correction, noise reduction, and collaboration require Resolve Studio. Regardless of the version, Resolve is a very deep application that’s been battle-tested through years of high-pressure, enterprise-grade deployment. But is that enough to sway loyal Final Cut Pro X, Premiere Pro, or Media Composer editors to switch? There’s certainly interest, as Stephen Mirrione pointed out in my recent Suburbicon interview, so I wouldn’t be surprised to hear news of a TV show or small feature film being edited with Resolve in the coming year.

The all-in-one concept

Creating a single application that’s good at many different tasks can be daunting and more often than not has been unsuccessful. In the case of Resolve, Blackmagic Design has taken a modal approach by splitting the interface into five pages: Media (ingest/import), Edit, Color, Fairlight (audio mixing), and Deliver (export/output).

The workflow follows a logical, left-to-right path through these five stages of post-production. With each page/mode change, the user interface is reconfigured to best suit the task at hand. The Edit page sports a standard source/record/bin/track layout similar to Media Composer, Premiere Pro, or Final Cut Pro 7. Color switches to the familiar tools and nodes of DaVinci color correction. The Fairlight mixing page isn’t just a mimic of the Fairlight interface. The engineers completely swapped out the audio guts of Resolve and replaced them with the Fairlight audio engine.

Not only is the interface that of a respected DAW, but it is also possible to expand your system with Fairlight’s audio acceleration card, as well as add a Fairlight mixing desk. This means that in a multi-suite facility, you can have task-specific rooms optimized for editing, color grading, or audio mixing – all using the exact same software application without the need for roundtrips or other list translations.

But does it work?

I put both versions of Resolve 14 through their paces and the application is reasonably solid, given how much has changed from version 12 (there was no version 13). General media management, editing, and audio processing are top-notch. If you want audio/video output, Blackmagic Design Decklink or UltraStudio hardware is required. There is also a Cinema viewer function for fullscreen viewing on your computer display. With dual displays, the edit interface can occupy one screen, with fullscreen video on the other.

The Fairlight mode will likely require a bit of rethinking by editors used to mixing audio in other NLEs, since it uses a DAW-style interface. Many well-known physical mixing consoles, like those from Solid State Logic, feature channel strips with built-in EQs, compressors, etc. That’s how Fairlight treats these software channels or tracks. Each track can have its own combination of Fairlight audio processing functions. Stick with those and you’ll be happy, although other audio filters on your computer, like Apple AU plug-ins, are accessible. Mixing and audio editing are good, with subframe accuracy, and the 14.1 update added linked groups to lock faders together. The pace of Fairlight integration has been quite fast, but it’s still a bit rough. I encountered a number of application crashes while scrubbing audio – only in the Fairlight page.

Whether or not you like the editing is more a function of personal style and preference. The user interface design is a lot like Final Cut Pro X, except with bins and tracks. Interface windows, tabs, and panels can be opened or pulled down into various screen configurations, but you don’t have freeform control over size and position. Clearly Premiere Pro is king in that department. Some design choices aren’t consistent. For example, you can’t enable a single-viewer layout when using two displays.

Multicam editing is solid, but I experienced a small bit of latency in the viewer when cutting camera angles on-the-fly. It’s minor and may or may not bother you. You can sync clips by various methods, such as timecode or waveform, but oddly, the sync detection seemed to be too lax. In my tests, it would frequently sync clips that it shouldn’t have, when no sync relationship actually existed.

There are a number of things in Resolve’s design that take getting used to. For example, a Resolve project is locked to the frame rate you picked when that new project was created – same as with Avid. This means you can’t mix sequences with different frame rates within the same project. There are no adjustment layers, although you can fake it in the Color page by using clip and program-based corrections. Color management via LUTs (look-up tables) is much deeper than in any other NLE. You can set color management with LUTs to be global, which is best when the project uses only one camera type. Conversely, input LUTs may be applied singly or in a batch to specific cameras in a bin. But when you do that, the LUT process doesn’t show up in the color correction node when you switch to the Color page – only its result does.

On the plus side, real-time performance has been improved from previous versions and the built-in effects include filters that you don’t often find in the basic build of other NLEs, like glow and watercolor effects. In addition to great built-in effects, third-party OpenFX packages, like Boris Continuum Complete and Sapphire, are also available.

Collaboration

Resolve uses bin-locking, like Avid Media Composer. The first editor to open a bin has read/write permission to it. Any other editor can open that same bin in a read-only mode. For example, in a long-form project, separate bins might be organized for Act 1, Act 2, and so on. Different editors can separately work on parts of the film at the same time. Since this all happens in a single shared project database, it always reflects the most current state of the project.

To set up shared projects, a different PostgreSQL database is required, which is installed through the custom options of the installer. Make sure you are using the most recent PostgreSQL version when upgrading Resolve, since older versions of PostgreSQL are no longer compatible with the newest OS releases. One machine on the network hosts this database and then other workstations connect to that database to access the Resolve projects. Only that host machine needs to have the PostgreSQL software installed on it. The process of adding and connecting shared databases has been improved and simplified with the release of 14.1.1 (and later), which now includes an additional server set-up utility application.
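As a general reference – these are standard PostgreSQL settings rather than anything Resolve-specific, and the file locations vary by PostgreSQL version and platform – the two files that govern remote access on the host machine look something like this (the subnet shown is only an example):

```
# postgresql.conf – listen on the network, not just on localhost
listen_addresses = '*'

# pg_hba.conf – allow clients on the local subnet (example range) to authenticate
host    all    all    192.168.1.0/24    md5
```

PostgreSQL also listens on TCP port 5432 by default, so the host’s firewall must allow incoming connections on that port – exactly the sort of issue described next.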

In testing collaboration features, I initially ran into set-up problems. These were eventually fixed when I disabled the macOS firewall on the host machine, which was blocking access from the other connected Macs to its shared database. This took some back and forth with Blackmagic Design’s helpful support engineers until we figured out why I was getting the connection errors. Since I had to return the additional “dongle” (USB license key) before this was fixed, I wasn’t able to test two editors simultaneously editing within the same open project. However, the ability to open any shared project from any qualified computer on the network was just fine.

DaVinci Resolve Micro Panel

I also tested the smaller, bus-powered DaVinci Resolve Micro panel. The Micro panel is just the right size for an editor or a DIT on set. It’s smaller than the Mini (tested previously in another review), because it doesn’t have the upward slanting portion in the back; therefore, it’s a better physical fit between your computer keyboard and display. You don’t have to shuffle desk real estate between tools, as you do with the Mini. In spite of not having the extra controls and LCD displays of the Mini, the Micro panel combines most of the control functions you need for fast grading. If you are an editor who is heavy into color correction, then this is a must-have for Resolve.

I took an instant liking to the Micro. You can use both hands to quickly and intuitively work the trackballs and knob controls, making for faster and better correction. It’s tactile, with next and previous clip buttons to quickly advance through the timeline, so you can keep your eyes on the screen. I grade in Resolve, Avid, Premiere Pro, and Final Cut Pro X, and all of that is with a mouse. Using the panel easily resulted in faster grading by a factor of at least 3X or 4X. I also achieved better-looking corrections with fewer steps or processes than grading in any of these other applications.

Conclusion

Overall, there’s a lot to love about Resolve, in spite of a few rough edges. In general, it seems more stable under macOS Sierra than with High Sierra. If you use Resolve on a Mac, then you are stuck dealing with Apple’s platform changes. For example, recent Macs that use an Nvidia GPU are at a disadvantage under High Sierra, because Nvidia is only now developing CUDA drivers for this OS. I experienced a number of crashes running Resolve 14 on my 2014 MacBook Pro until I manually changed the hardware configuration in Resolve’s preferences from CUDA to Metal. When I installed what was supposed to be the newest CUDA driver, I still received a prompt that no CUDA-compliant card was present. But it’s working fine using Metal. Macs with AMD GPUs should be fine.

Resolve 14 is a dense tool, with a lot of depth in various menus, which some may find daunting. This review would be a lot longer if I went even deeper into the many specific features of this application. Yet, it is easy for new users to hit the ground running and then learn as they go. For many, this is their mythical “Final Cut Pro 8”. In any case, DaVinci Resolve 14 is the best incarnation of the all-in-one concept to date. If you add Blackmagic Design’s Fusion visual effects software into the mix (also available in free and paid versions), the result is a combination that’s tough to beat at any price.

Blackmagic Design’s engineers have shown impressive development over a very short period of time, so I fully expect Blackmagic to give the three “A” companies a run for their money. Even if you use another tool as your main editing application, Resolve is a great addition to the toolbox. Using it becomes addictive. Give it a try and you might just find it becomes your first choice.

Originally written for Digital Video magazine / Creative Planet Network

©2017, 2018 Oliver Peters

Telestream Switch 4

Once Apple pulled the plug on QuickTime Player Pro 7, the industry started to look elsewhere for an all-purpose media tool that could facilitate the proper playback, inspection, and encoding of media files. For many, that new multipurpose application has become Telestream’s Switch, now in version 4. Telestream offers a range of desktop and enterprise media solutions, including Vantage, ScreenFlow, Flip4Mac, Episode, and others. Switch fills the role of a media player with added post-production capabilities, going far beyond other players, such as QuickTime Player or VLC.

Switch is offered in three versions: the basic Switch Player ($9.99), Switch Plus ($199) and Switch Pro ($499). Pricing for Plus and Pro covers the first year of support, which includes upgrades and assistance. There is also a free demo version with watermarking. All versions are available for both macOS (10.11-13) and Windows (7-10).

Playback support

The first attraction of Switch is its wide support of “consumer”, broadcast, and professional media formats and codecs. For Mac users, some of these are supported in QuickTime Player, too, but require a conversion step before you can play them. Not so with Switch. Of particular importance to editors will be the MPEG-2 and MXF variations. Some formats do require an upgrade to at least the Plus version, so check Telestream’s tech specs for specifics.

One area where Switch shines is file inspection. This has made it the go-to quality assurance tool at many facilities. File metadata is exposed, along with proper display and reporting of interlaced video. It supports JKL transport control and frame advance using the arrow keys. Since closed captioning is important for all terrestrial and set-top channel broadcasters, you must have a way to check embedded captions. QuickTime Player will only display a single track of embedded captions – and then, only the lower track. So, for example, if you have a file with both English and Spanish captions on CC1 and CC3, QuickTime Player will only display the English captions and won’t even let you verify that more captions are present. With Switch Plus and Pro, the full range of embedded channels is presented and you can check any of the caption tracks.

Switch Plus likely covers the needs of most users, but Pro adds additional functionality, such as metering for multi-channel audio and loudness compliance. Pro also lets you open up to sixteen different files for comparison. It is the only version that supports external monitoring through Blackmagic Design or AJA i/o hardware. Finally, Pro lets you QC DPP (Digital Production Partnership) files from the desktop and display AS-11 MXF metadata.

Content encoding

Beyond these powerful player and inspection functions, Switch Plus and Pro are also full-fledged media encoders. You can change metadata, reorder audio channels, and export a new media file in various formats. Files can be trimmed, cropped, and/or resized in the export. Do you have a ProRes master file and need to generate an MPEG-2 Transport Stream file for broadcast? No problem.

I had a situation where I received a closed caption master file of a commercial from the captioning facility. It needed to have the ends of the file (slate and black) trimmed to meet the delivery specs. Normally when you edit or convert a file with embedded captioning, it will break the captions on the new file. Not so with Switch. I simply set the in and out points, set my encode specs to video pass-through, and generated the new file. The encode (essentially a file copy in this case) was lightning fast and the captions stayed intact.

Switch Plus and Pro include publishing presets for Vimeo, YouTube, and Facebook. In addition, the Pro version also lets you create an iTunes Store package, necessary for compliance when distributing via the iTunes Store. Switch is a cross-platform application, but ProRes encoding support is limited to the Mac version – with the iTunes Store package feature as the one exception. ProRes asset creation is available to Windows users when creating the .itms files used by the iTunes Store.

Switch Plus or Pro might seem pricey to some when compared to Apple Compressor or Adobe Media Encoder; however, those encoders can’t do the precision media functions that Switch offers. Telestream has built Switch to be an industrial-grade media tool that covers a host of needs in a package that’s easy for anyone to understand. If you liked QuickTime Player Pro 7, then Switch has become its 21st century successor.

Originally written for RedShark News.

©2018 Oliver Peters

What’s up with Final Cut’s Color Wheels?

Apple Final Cut Pro X 10.4 introduced new, advanced color correction tools to this editing application, including color wheels, curves, and hue vs. saturation curves. These are tools that users of other NLEs have enjoyed for some time – and which were part of Final Cut Studio (FCP 7, Color). Like others, my first reaction was, “Super! They’ve added some nice advanced tools, which will improve the use of FCPX for higher-end users.” But, as I started to use the Color Wheels in real correction work, I quickly realized that something wasn’t quite right in how they operated. Or at least, they didn’t work in a way that we’ve come to understand.

In trying to figure it out, I reached out to other industry pros and developers for their thoughts. Naturally this led to some spirited discussions at forums like those at Creative COW. However, other editors have noticed the same problems, so you can also find threads in the Facebook FCPX group and at FCP.co. It is certainly easy to characterize this as just another internet kerfuffle, surrounding Apple’s “think different” approaches to FCPX. But those arguments fall flat when you actually try to use the tools as intended.

The FCPX Color Wheels panel includes four wheels – Master, Shadows, Midtones, and Highlights. The puck in the center of each wheel is a hue offset control to push hues in the direction that you move the puck. The slider to the right of the wheel controls the brightness of that range. The left slider controls the saturation. One of the main issues is that when you adjust luminance using one of these controls, the affected range is too broad. Specifically, in the case of the Midtones control, as you adjust the luminance slider up or down, you are affecting most of the image and not just the midrange levels. This is not the way this type of control normally works in other tools, and in fact, it’s not how FCPX’s Color Board controls work either.

“What’s the big deal?” you might ask. Fair enough. I see two operational issues. The first is that to properly grade the image using the Color Wheels, you end up having to go back-and-forth a lot between wheels, to counteract the changes made by one control with another. The second is that using the Midtones slider tends to drive highlights above 100 IRE, where they will be clipped if any broadcast limiting is used. This doesn’t happen with other color tools, notably Apple’s own Color Board.

A lot of the discussion focuses on luma levels and specifically the Midtones slider, since it’s easy to see the issue there. However, other controls are also affected, but that’s too much to dissect in a single post. Throughout this post, be sure to click on the images to see the full view. I have presented various samples against each other and you will only get the full understanding if you open the thumbnail (which is small but also cropped) to the full image. I have compared the effect using five different tools – the Color Board, the Color Wheels, a color corrector plug-in that I built as a Motion template using Motion effects, Rubber Monkey Software FilmConvert (the wheels portion only), and finally, the Adobe Lumetri controls in Premiere Pro.

I am using three different test images – a black-to-white ramp, a test pattern, and a demo video image. The ramp without correction will appear as a diagonal line (0-100 IRE) on the scope, which makes it easy to analyze what’s happening. The video image has definite shadow and highlight areas, which lets us see how these controls work in the real world. For example, if you want to brighten the area of the shot where the man is in the shadows, but don’t want to make the highlights any brighter, this would normally be done using a Midtones control. Be aware that these various tools certainly aren’t calibrated the same way and some have a greater range of control than others. The weakest of these is FilmConvert’s wheels, since this plug-in has additional level controls in other parts of its interface.

Color science models

In the various forum threads, the argument is made that Apple is simply using a different color science method or a different weighting of some existing models. That’s certainly possible, since not all color correctors are built the same way. The most common approaches are Lift/Gamma/Gain and Shadows/Mids/Highlights. Be careful with naming. Just because something uses the terminology of Shadows, Midtones, and Highlights does not mean that it also uses the SMH color science model. Many tools use the Lift/Gamma/Gain model, but in fact call the controls shadows (Lift), mids (Gamma), and highlights (Gain). Another term you may run across in some correction tools is Set-up. This is typically used for control of shadows (equal to Lift), but can also function as an offset control that raises the level of the entire image. Avid Symphony employs this solution. Finally, both Symphony and Adobe SpeedGrade use what has been dubbed a 12-way color corrector. Each range is further subdivided into its own subset of shadows, mids, and highlights controls.

An LGG model provides broad control of shadows and highlights, with the midtones control working like a curve that covers the whole range, but with the largest effect in the middle. An SMH model normally divides the levels into three distinct but overlapping ranges, much like a three-band audio equalizing filter. A number of color correctors add a luma range control, which gives the user the ability to change how much of the image a specific range will affect. In other words, how broad is the control of the shadows, mids, or highlights control? This is like a Q control in an audio equalizer, where you change the shape of the envelope at a certain frequency.

Red Giant’s Magic Bullet Looks offers both color correction models with two different tools – the 4-way color corrector (SMH) and the Colorista color corrector (LGG). When you adjust the midrange control of their 4-way, the result is a graceful S-shaped curve to the levels on the waveform.

To study the effect of an LGG-based corrector, test the ramp. The shadows control (Lift) will raise or lower the dark areas of the image without changing the absolute highlights. The diagonal line of the ramp on the waveform essentially pivots, hinged at the 100 IRE point. Conversely, changing the highlights control (Gain) pivots the line, pinned at 0 IRE (black). When you adjust the midtones control (Gamma), you create a curve in the line, which stays pinned at 0 and 100 IRE at either end. In this way you are effectively “expanding” or “compressing” the levels in the middle portion of your image without changing the position of your black or white points.
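To make this behavior concrete, here is a minimal Python sketch of one common lift/gamma/gain formulation, applied to a normalized 0-1 ramp. This is purely my own illustration – real correctors differ in their exact math – but it shows why Lift and Gain pivot the ramp at opposite ends, while Gamma bends it without moving either end point.

```python
import numpy as np

def lgg(x, lift=0.0, gamma=1.0, gain=1.0):
    """One common lift/gamma/gain model: Lift pivots at white (1.0),
    Gain pivots at black (0.0), Gamma bends the curve in between."""
    y = x * gain + lift * (1.0 - x)   # Lift raises shadows; 1.0 stays put
    y = np.clip(y, 0.0, 1.0)
    return y ** (1.0 / gamma)         # Gamma: 0.0 and 1.0 remain pinned

ramp = np.linspace(0.0, 1.0, 11)      # the black-to-white test ramp

print(lgg(ramp, lift=0.2))   # shadows raised; white still ends at 1.0
print(lgg(ramp, gain=0.8))   # highlights lowered; black still 0.0
print(lgg(ramp, gamma=1.5))  # mids brightened; both ends stay pinned
```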

How the various color correction tools react

Looking at the luma control for the Midtones, two things are clear. First, all of the other tools are using the LGG color science model. It’s not clear what the Color Wheels are using, but it isn’t SMH, as there is no bulge or S-curve visible in the scope. Second, the Color Wheels quickly drive the image levels into clipping, while the other tools generally keep black and white levels in place. In essence, the Midtones control affects the image more like a master or offset control would, than a typical mids or Gamma control. Yet, clearly Apple’s Color Board controls adhere to the standard LGG model. The concern, of course, is clipping. In the test image of the man walking on the village street, the sunlit building walls on the opposite side of the street will become overexposed and risk being clipped when the Color Wheels are used.

What about color? As a simple test, I next shifted the Midtones puck toward yellow. Bear in mind that the range of each of these controls is different, so you will see varying degrees of yellow intensity. Nevertheless, the way the control should work is that some pure black and white should be preserved at the top and bottom of the video levels. All of these tools maintain that, except for the Color Wheels. There, the entire image is yellow, effectively making the hue offset puck function more like a tint control.

One other issue to note is that the Color Wheels offer an extraordinarily wide control range. The hue offset control’s RGB intensity values go from 0 (center of the wheel) to 1023. However, the puck icon can only go to the rim of the wheel, which it hits at about 200. With a mouse (or numerical entry), you can keep going well past the stop of the wheel icon – five times farther, in fact. The image not only becomes very yellow in this case, but you can easily lose the location of your control, since the GUI position is no longer relevant.

The working theory

The big question is why don’t the Color Wheels conform to established principles, when in fact the Color Board controls do? Until there is some further clarification from Apple, one possible explanation involves HDR. FCPX 10.4 introduced High Dynamic Range (HDR) features. One of the various HDR standards is Rec. 2020 PQ. In that color space, the 0-100 IRE limitations of Rec. 709 are expanded to 0-10,000 nits. 0-100 nits is roughly the same brightness as we are used to with Rec. 709.

Looking at this image of the man walking along the street – where I’ve attempted to get a pleasing look with all of the tools – you’ll see that the Color Wheels in Rec. 709 don’t react correctly and will drive the highlights into a range to be clipped. However, in the bottom pane, which is the same image in Rec. 2020 PQ color space, the grade looks pretty normal. And, in practice, the Color Wheels controls work more or less the way I would have expected them to work. Yes, the same controls work differently in the different color spaces – properly in 2020 PQ and not in 709.

But why is that the case? I have no answer, but I do have a wild guess. Maybe, just maybe, the Color Wheels were designed for – or intended to only be used for – HDR work. Or maybe there’s a conversion or recalibration of the controls that hasn’t taken place yet in this version. If the tool is only calibrated for HDR, then its range and weighting will be completely wrong for Rec. 709 video. If you increase the Midtones luma of the ramp in both Rec. 709 and Rec. 2020 PQ, you’ll see a similar curve. In fact, if you overlay a screen shot of each waveform, placing the full Rec. 709 scope image over the bottom portion of the Rec. 2020 PQ scale, you’ll notice that these roughly align up to about 100 IRE/nits. It’s as if one is simply a slice out of the other.

Regardless of why, this is something where I would hope Apple will provide a white paper or other demonstration of what the best practices will be for using this tool effectively. If it isn’t intentional, and actually is a mistake, then I presume a fix will be forthcoming. In either case, put in your feedback comments to Apple.

A word about HDR

Over the course of testing this tool and this theory, I’ve done a bit of testing with the HDR color spaces in FCPX. If you want to know more about HDR, I would encourage you to check out these contrary blog posts by Stu Maschwitz and Alexis Van Hurkman. I tend to side with Stu’s point-of-view and am not a big fan of HDR.

The way Apple has implemented these features in Final Cut Pro X 10.4 is to allow the user to set and override color spaces. If you set up your project to be Rec. 2020 PQ (and set preferences to “show HDR as raw values”), then the viewer and a/v output (direct from the Mac, not through a hardware i/o device) are effectively dimmed through the Mac’s color profile system. When you grade the image based on the 0-10,000 nits scale, you’ll end up seeing an image that looks pleasing and essentially the same as if you were working in Rec. 709. However – and I cannot over-emphasize this – you are not going to be able to produce an image that’s truly compatible with Dolby Vision and actually looks correct as HDR, unless you have the correct AJA i/o hardware and a proper display. And by display, I mean a top-end Dolby, Canon, or Sony unit, costing tens of thousands of dollars.

As I understand the PQ specs, the bulk of the higher range is for the highlights that are normally constrained or clipped in our current video systems. However, that 10,000-nit scale is weighted, so that about 50% of the signal value covers the first 100 nits, making it of comparable brightness to the current 100 IRE. The rest of the range is for brighter information, like specular highlights. You don’t necessarily get more brightness in the shadow detail. Therefore, if you are grading a shot in FCPX in a 2020 PQ color space and you only have the computer display to go by, you’ll grade by eye as much as by scope. This means that to get a pleasing image, you will end up making the average appearance of the image brighter than it really should be. When this is viewed on a real HDR monitor, it will be painfully bright. Having a higher-nits computer display, like on the iMac Pro (up to 500 nits), won’t make much difference, unless maybe you crank the display brightness to its maximum (ouch!). “Mine goes to 11!”
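This weighting is easy to verify from the SMPTE ST 2084 (PQ) inverse EOTF itself, using its published constants. A short Python sketch shows that 100 nits encodes to roughly half of the PQ signal range:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance -> signal level (0-1)
m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(nits):
    y = nits / 10000.0    # normalize to the 10,000-nit peak
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

print(pq_encode(100))     # ~0.508 – about half the range covers 0-100 nits
print(pq_encode(1000))    # ~0.75
print(pq_encode(10000))   # 1.0
```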

Right now, HDR is the wild, wild west. If you are smart, you’ll realize that you don’t know what you don’t know. While it’s nice to have these new features in FCPX, they can be very dangerous in the wrong hands.

But that’s another matter. Right now, I just hope Apple (or one of the usual suspects, like Ripple Training, LumaForge, or Larry Jordan) will come out with more elaboration on the Color Wheels.

©2018 Oliver Peters

Putting Apple’s iMac Pro Through the Paces

At the end of December, Apple made good on the release of the new iMac Pro and started selling and shipping the new workstations. While this could be characterized as a stop-gap effort until the next generation of Mac Pro is produced, that doesn’t detract from the usefulness and power of this design in its own right. After all, the iMac line is the direct descendant in spirit and design of the original Macintosh. Underneath the sexy, all-in-one, space grey enclosure, the iMac Pro offers serious workstation performance.

I work mostly these days with a production company that produces and posts commercials, corporate videos, and entertainment programming. Our editing set-up consists of seven workstations, plus an auxiliary machine, connected to a common QNAP shared storage network. These edit stations were a mix of old and new Mac Pros and iMacs (connected via 10GigE), with a Mac Mini as the auxiliary (1GigE). It was time to upgrade the oldest machines, which led us to consider the iMac Pros. The company picked up three of them – replacing two Mac Pro towers and an older iMac. The new configuration is a mix of three one-year-old Retina 5K iMacs (late 2015 model), a 2013 “trash can” Mac Pro, and three 2017 iMac Pros.

There are plenty of videos and articles on the web about how these machines perform; but, the testers often use artificial benchmarks or only Final Cut Pro X. This shop has a mix of NLEs (Adobe, Apple, Avid, Blackmagic Design), but our primary tool is Adobe Premiere Pro CC 2018. This gave me a chance to compare how these machines stacked up against each other in the kind of work we actually do. This comparison isn’t truly apples-to-apples, since the specs of the three different products are somewhat different from each other. Nevertheless, I feel that it’s a valid real-world assessment of the iMac Pros in a typical, modern post environment.

Why buy iMac Pros at all?

The question to address is why should someone purchase these machines? Let me say right off the bat that if your main focus is 3D animation or heavy compositing using After Effects or other applications – and speed and performance are the most important factors – then don’t buy an Apple computer. Period. There are plenty of examples of Dell and HP workstations, along with high-end gaming PCs, that outperform any of the Macs. This is largely due to the availability of advanced Nvidia GPUs for the PC, which simply aren’t an option for current Macs.

On the other hand, if you need a machine that’s solid and robust across a wide range of post-production tasks – and you prefer the Mac operating ecosystem – then the iMac Pros are a good choice. Yes, the machine is pricey and you can buy cheaper gaming PCs and DIY workstations, but if you stick to the name brands, like Dell and HP, then the iMac Pros are competitively priced. In our case, a shift to PC would have also meant changing out all of the machines and not just three – therefore, even more expensive.

Naturally, the next thing is to compare price against the current 5K iMacs and 2013 Mac Pros. Apple’s base configuration of the iMac Pro uses an 8-core 3.2GHz Xeon W CPU, 32GB RAM, 1TB SSD, and the Radeon Pro Vega 56 GPU (8GB memory) for $4,999. A comparably configured 2013 Mac Pro is $5,207 (with mouse and keyboard), but no display. Of course, it also has the dual D-700 GPUs. The 5K iMac in a similar configuration is $3,729. Note that we require 10GigE connectivity, which is built into the iMac Pros. Therefore, in a direct comparison, you would need to bump up the iMac and Mac Pro prices by about $500 for a Thunderbolt 2-to-10GigE converter.

Comparing these numbers for similar machines, you’d spend more for the Mac Pro and less for the iMac. Yet, the iMac Pro uses newer processors and faster RAM, so it could be argued that it’s already better out of the gate in the base configuration than Apple’s former top-of-the-line product. It has more horsepower than the tricked-out iMac, so then it becomes a question of whether the cost difference is important to you for what you are getting.

Build quality

Needless to say, Apple has a focus on the quality and fit-and-finish of its products. The iMac Pro is no exception. Except for the space grey color, it looks like the regular 27” iMacs and is just as nicely built. However, let me quibble a bit with a few things. First, the edges of the case and foot tend to be a bit sharp. It’s not a huge issue, but compared with an iPhone, iPad, or 2013 Mac Pro, the edges are just not as smooth and rounded. Secondly, you get a wireless mouse and extended keyboard. Both have to be plugged in to charge. In the case of the mouse, the cable plugs in at the bottom, rendering it useless during charging. Truly a bad design. The wireless keyboard is the newer, flatter style, so you lose the two USB ports that were on the previous plug-in extended keyboard. Personally, I prefer the features and feel of the previous keyboard, not to mention any scroll wheel mouse over the Magic Mouse. Of course, those are strictly items of personal taste.

With the iMac Pro, Apple is transitioning its workstations to Thunderbolt 3, using USB-C connectors. Previous Thunderbolt 2 ports have been problematic, because the cables easily disconnect. In fact, on our existing iMacs, it’s very easy to disconnect the Thunderbolt 2 cable that connects us to the shared storage network, simply by moving the iMac around to get to the ports on the back. The USB-C connectors feel more snug, so hopefully we will find that to be an improvement. If you need to get to the back of the iMac or iMac Pro frequently, in order to plug in drives, dongles, etc., then I would highly recommend one of the docks from CalDigit or OWC as a valuable accessory.

5K screen

Apple spends a lot of marketing hype on promoting their 5K Retina screens. The 27” screens have a raw pixel resolution of 5120×2880 pixels, but that’s not what you see in terms of image and user interface dimensions. To start with, the 5K iMacs and iMac Pros use the same screen resolution, and the default display setting (middle scaled option) is 2560×1440 pixels. The top choice is 3200×1800. Of course, if you use that setting, everything becomes extremely small on screen. By comparison, our 2013 Mac Pro is connected to a 27” Apple LED Cinema Display (non-Retina). Its top scaled resolution is also 2560×1440 pixels. Therefore, at the most useable settings, all of our workstations are set to the same resolution. Even if you scale the resolution up (images and UI get smaller), you are going to end up adjusting the size of the application interface and viewer window. While you might see different viewer size percentage numbers between the machines, the effective size on screen will be the same.

Retina is Apple’s marketing name for high pixel density. This is the equivalent of DPI (dots per inch) in print resolutions. According to a Macworld article, iPhones from 4 to 5s had a pixel density of 326ppi (pixels per inch), while iMacs have 218ppi. Apple converts a device’s display to Retina by doubling the horizontal and vertical pixel count. More pixels are applied to any given area on the screen, resulting in smoother text, smoother diagonal lines, and so on. That’s assuming an application’s interface is optimized for it. At the distance that the editors sit from a 27” display, there is simply little or no difference between the look of the 27” LED display and the 27” iMac Retina screens.
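The pixel density arithmetic is easy to check – it’s just the diagonal pixel count divided by the diagonal screen size. A couple of lines of Python reproduce the figures cited above:

```python
import math

def ppi(h_px, v_px, diagonal_inches):
    """Pixel density: diagonal resolution divided by diagonal size."""
    return math.hypot(h_px, v_px) / diagonal_inches

print(ppi(5120, 2880, 27))  # ~217.6 – the ~218ppi of a 27-inch 5K iMac
print(ppi(1136, 640, 4))    # ~326 – e.g., the iPhone 5s display
```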

Upgradeability

Future-proofing and upgrades are the biggest negatives thrown at all-in-ones, particularly the iMac Pros. While the user can upgrade RAM in the standard iMacs, that’s not the case with iMac Pros. You can upgrade RAM in the future, but that must be done at a service facility, such as the Apple Store’s Genius service. This means that in three years, when you want the latest, greatest CPU, GPU, storage, etc., you won’t be able to swap out components. But is this really an issue? I’m sure Apple has user research numbers to justify their decisions. Plus, the thermal design of the iMac would make user upgrades difficult, unlike the older Mac Pro towers.

In my own experience on personal machines, as well as clients’ machines that I’ve helped maintain, I have upgraded storage, GPU cards, and RAM, but never the CPU – although I do know others who have upgraded the Xeon models in their Mac Pro towers. Part of the dichotomy is buying what you can afford now and upgrading later, versus stretching a bit up front and then not needing to upgrade later. My gut feeling is that Apple is pushing the latter approach.

If I tally up the cost of the upgrades that I’ve made after about three years, I would already be part of the way towards a newer, better machine anyway. Plus, if you are cutting HD and even 4K today, then just about any advanced machine will do the trick, making it less likely that you’ll need to do that upgrade within the foreseeable life of the machine. An argument can be made for either approach, but I really think that the vast majority of users – even professional users – never actually upgrade any of the internal hardware from that of the configuration as originally purchased.

Performance testing

We ultimately purchased machines that were the 10-core bump-up from the base configuration, feeling that this is the sweet spot (and is currently available) within the iMac Pro product line.

The new machine specs within the facility now look like this:

2013 Mac Pro – 3GHz 8-core Xeon/64GB RAM/dual D-500 GPUs/1TB SSD (Sierra)

2015 iMac – 4GHz 4-core Core i7/32GB RAM/AMD R9/3TB Fusion drive (Sierra)

2017 iMac Pro – 3GHz 10-core Xeon W/64GB RAM/Radeon Vega 64/1TB SSD (High Sierra)

As you can see, the tech specs of the new iMac Pros more closely match the 2013 Mac Pro than the year-old 5K iMacs. Of course, it’s not a perfect match for optimal benchmark testing, but close enough for a good read on how well the iMac Pro delivers in a real working environment.

Test 1 – BruceX

The BruceX test uses a 5K Final Cut Pro X timeline made up only of built-in titles and generators. The timeline is then rendered out to a ProRes file. This tests the pure application without any media and codec variables. It’s a bit of an artificial test and only applicable to FCPX performance, but still useful. The faster the export time, the better. (I have bolded the best results.)

2013 Mac Pro – 26.8 sec.

2015 iMac – 28.3 sec.

2017 iMac Pro – 14.4 sec.

Test 2 – media encoding

In my next test, I took a 4½-minute-long 1080p ProRes file and rendered it to a 4K/UHD (3840×2160) H.264 (1-pass CBR 20Mbps) file. Not only was it being encoded, but also scaled up to 4K in this process. I rendered from and to the desktop, to eliminate any variables from the QNAP system. Finally, I conducted the test using both Adobe Media Encoder (using OpenCL processing) and Apple Compressor.

Two issues were noteworthy. The Compressor test was surprisingly slow on the Mac Pro. (I actually ran the Compressor test twice, just to be certain about the slowness of the Mac Pro.) And the AME version kicked in the fans on the iMac.

Adobe Media Encoder

2013 Mac Pro – 6:13 min.

2015 iMac – 7:14 min.

2017 iMac Pro – 4:48 min.

Compressor

2013 Mac Pro – 11:02 min.

2015 iMac – 2:20 min.

2017 iMac Pro – 2:19 min.

Test 3 – editing timeline playback – multi-layered sequence

This was a difficult test designed to break during unrendered playback. The 40-second 1080p/23.98 sequence included six layers of resized 4K source media.

Layer 1 – DJI clips with dissolves between the clips

Layers 2-5 – 2D PIP ARRI Alexa clips (no LUTs); layer 5 had a Gaussian blur effect added

Layer 6 – native REDCODE RAW with minor color correction

The sequence was created in both Final Cut Pro X and Premiere Pro. Playback was tested with the media located on the QNAP volumes, as well as from the desktop (this should provide the best possible playback).

Playing back this sequence in Final Cut Pro X from the QNAP resulted in the video output largely choking on all of the machines. Playing it back in Premiere Pro from the QNAP was slightly better than in FCPX, with the 2017 iMac Pro performing best of all. It played, but was still choppy.

When I tested playback from the desktop, all three machines performed reasonably well using both Final Cut Pro X (“best performance”) and Premiere Pro (“1/2 resolution”). There were some frames dropped, although the iMac Pro played back more smoothly than the other two. In fact, in Premiere Pro, I was able to set the sequence to “full resolution” and get visually smooth playback, although the indicator light still noted dropped frames. Typically, as each staggered layer kicked in, performance tended to hiccup.

Test 4 – editing timeline playback – single-layer sequence

This was a simpler test using a standard workflow. The 30-second 1080p/23.98 sequence included three Alexa clips (no LUTs) with dissolves between the clips. Each source file was 4K/UHD and had a “punch-in” and reposition within the HD frame. Each also included a slight, basic color correction. Playback was tested in Final Cut Pro X and Premiere Pro, as well as from the QNAP system and the desktop. Quality settings were increased to “best quality” in FCPX and “full resolution” in Premiere Pro.

My complex timeline in Test 3 appeared to perform better in Premiere Pro. In Test 4, the edge was with Final Cut Pro X. No frames were dropped with any of the three machines playing back either from the QNAP or the desktop, when testing in FCPX. In Premiere Pro, the 2017 iMac Pro was solid in both situations. The 2015 iMac was mostly smooth at “full” and completely smooth at “1/2”. Unfortunately, the 2013 Mac Pro seemed to be the worst of the three, dropping frames even at “1/2 resolution” at each dissolve within the timeline.

Test 5 – timeline renders (multi-layered sequence)

In this test, I took the complex sequence from Test 3 and exported it to a ProRes master file. I used the QNAP-connected versions of the Premiere Pro and Final Cut Pro X timelines and rendered the exports to the desktop. In FCPX, I used its default Share function. In Premiere Pro, I queued the export to Adobe Media Encoder set to process in OpenCL. This was one of the few tests in which the 2013 Mac Pro put in a faster time, although the iMac Pro was very close.

Rendering to ProRes – Premiere Pro (via Adobe Media Encoder)

2013 Mac Pro – 1:29 min.

2015 iMac – 2:29 min.

2017 iMac Pro – 1:45 min.

Rendering to ProRes – Final Cut Pro X

2013 Mac Pro – 1:21 min.

2015 iMac – 2:29 min.

2017 iMac Pro – 1:22 min.

Test 6 – Adobe After Effects – rendering composition

My final test was to see how well the iMac Pro performed in rendering out compositions from After Effects. This was a 1080p/23.98 15-second composition. The bottom layer was a JPEG still with a Color Finesse correction. On top of that were five 1080p ProResLT video clips that had been slomo’ed to fill the composition length. Each was scaled, cropped, and repositioned. Each was beveled with a layer style and had a stylized effect added to it. The topmost layer was a camera layer with all other layers set to 3D, so the clips could be repositioned in z-space. Using the camera, I added a slight rotation/perspective change over the life of the composition.

Rendering to ProRes – After Effects

2013 Mac Pro – 2:37 min.

2015 iMac – 2:15 min.

2017 iMac Pro – 2:03 min.

Conclusion

After all of this testing, one is left with the answer “it depends”. The 2013 Mac Pro has two GPUs, but not every application takes advantage of that. Some apps tax all the available cores, so more, but slower, cores are better. Others go for maximum speed on fewer cores. All things considered, the iMac Pro performed at the top of these three machines. It was either the best or close/equal to the best. But this is an incremental difference in the 10% to 30% range. Of course, some of these numbers will be meaningful and others won’t, depending on the apps used and a user’s storage situation.

I will say that installing these three machines was the easiest I’ve ever done, including connecting them to the 10GigE storage network. The majority of our apps come from Adobe Creative Cloud, the Mac App Store, or FxFactory (for plug-ins). Except for a few outliers, there was largely no need to track down installers, activation information, etc. for a zillion small apps and plug-ins. This made it a breeze and is certainly part of the attraction of the Mac ecosystem. The iMac Pro’s all-in-one design limits the required peripherals, which also contributes to a faster installation. Naturally, I can’t tell anyone if this is the right machine for them, but so far, the investment does look like the correct choice for this shop’s needs.

Here are two additional impressions by working editors: Thomas Grove Carter and Ben Balser. Also a very comprehensive review from AppleInsider.

©2018 Oliver Peters

Apple Final Cut Pro X 10.4

December finally delivered the much-anticipated simultaneous release of new versions of Apple Final Cut Pro X, Motion, and Compressor – all on the same day as the iMac Pro officially went on sale. In the broader ecosystem, we also saw updates for macOS High Sierra, Logic Pro X, Pixelmator Pro, and Blackmagic Design DaVinci Resolve.

Final Cut Pro X (“ten”), version 10.4 is the fifth major release of Apple’s professional NLE in a little over six years. There are changes under the hood tied to technologies in High Sierra (macOS 10.13), which won’t get much press, but are very important in the development and operation of an application. This version will still run on a wide range of recent and older Macs. The minimum OS requirement is 10.12.4, but 10.13 or later is recommended. There are four new, marquee features in this release: advanced color correction tools, 360° editing, HDR (wide gamut) color space support, and HEVC/H.265 codec support for editing and encoding.

New advanced color tools

Final Cut Pro X was first launched with a color correction tool called the color board. It substituted sliders on a color swatch for the standard curves and color wheel controls that editors had been used to. While the color board was and is effective – and deceptively capable in what you can accomplish – it was an instant turn-off for many. The lack of a more advanced color correction interface opened the field for third-party color correction plug-in developers, who came up with some great tools. With the release of FCPX 10.4, it’s hard for me to see why FCPX diehards would still buy a color correction plug-in. Yet, I have heard from at least one plug-in developer that their color corrector plug-in sales are staying stable. Clearly users want choice and that’s a good thing.

With this update, you’ve gained three new, native color tools: color wheels, curves, and hue vs. saturation curves. All are elegantly designed, operate quite fluidly, and generally mimic what you can do in DaVinci Resolve. However, the color board didn’t go away. There’s a preference setting for which of these four color tools is the default effect when first applying color correction (CMD+6).

Once you start color correcting, you can add more instances of any of these four tools in any combination. Final Cut Pro X sports robust performance, so you can apply several layers of correction to a clip and still have real-time playback without rendering. There are also additional keyboard commands to quickly step through effects or clips on your timeline. While the grading workflow isn’t quite as fluid as in a true color correction application, like Resolve, you can get pretty close with some experience. My biggest beef is that the controls are locked within the inspector pane. You can’t move them around and there is no dedicated color correction workspace, so for me, the ergonomics are poor. In my testing, I’ve also hit some flaws in how the processing is done (more on that in a future post). Ironically, the color board actually seems to achieve more accurate correction than the color wheels.

There are a few quirks. Previously created presets for the color board will be converted into color preset effects, which now appear in the effects browser. This enables you to preview a color preset applied to a clip by skimming over the effect thumbnail. Unfortunately, I found this conversion didn’t always work. On a Sierra machine (10.12), the older presets were automatically converted after a few minutes; however, nothing happened on a High Sierra machine (10.13). I eventually resorted to copying my converted effects presets from the Sierra Mac over to the High Sierra Mac. I suspect that, because the High Sierra update automatically reformats the internal SSD to the new Apple File System (APFS), this conversion process is somehow impeded. Of course, if you don’t already have any existing custom presets, then it’s not an issue.

(You can check out my previously-created color presets for instructions and downloads here.)

There is no control surface support yet, although future support for third party color correction controllers has been alluded to. It would be nice to see support for Tangent or Avid panels at the very least. There’s a new FCPXML version (1.7) that includes this new color metadata; however, it doesn’t seem to be imported into the newest version of Resolve. It’s possible that color metadata in the FCPXML file is only intended for FCPX-to-FCPX transfers and not round tripping to other applications.
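If you want to sanity-check which schema version an FCPXML export actually carries before attempting a round trip, a few lines of Swift will do it. This is only a hedged sketch using Foundation’s XML classes – the fcpxml root element and its version attribute are part of the published format, but the file path and the version logic here are illustrative.

```swift
import Foundation

// Sketch: read the version attribute from an FCPXML export.
// The <fcpxml version="..."> root is standard; the path is a placeholder.
let url = URL(fileURLWithPath: "/path/to/project.fcpxml")
if let doc = try? XMLDocument(contentsOf: url, options: []),
   let root = doc.rootElement(),
   let version = root.attribute(forName: "version")?.stringValue {
    print("FCPXML schema version:", version)   // e.g. "1.7"
    if version.compare("1.7", options: .numeric) == .orderedAscending {
        print("Pre-1.7 export – the new color metadata won't be present")
    }
}
```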

360° editing

Let me say up front that this doesn’t hit my hot button. It’s an area where Apple is playing catch-up to Adobe and, quite frankly, for both of these companies it only appeals to a small percentage of users. Not all 360° formats are supported: your footage must be equirectangular (a stitched panorama) so that FCPX can properly correct its display. Nevertheless, if you do work on 360° productions, then FCPX provides you with a nice tool kit.
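For the curious, the reason equirectangular footage is required comes down to simple math: the longitude and latitude of any view direction map linearly to x/y positions in the stitched frame. Here is a minimal Swift sketch of that mapping – purely illustrative, not Apple’s implementation.

```swift
import Foundation

// Longitude (yaw, -π...π) and latitude (pitch, -π/2...π/2) map linearly
// to pixel coordinates in an equirectangular panorama.
func equirectangularPixel(longitude: Double, latitude: Double,
                          width: Int, height: Int) -> (x: Int, y: Int) {
    let u = (longitude + Double.pi) / (2.0 * Double.pi)  // 0...1, left to right
    let v = (Double.pi / 2.0 - latitude) / Double.pi     // 0 at the top (zenith)
    let x = min(Int(u * Double(width)), width - 1)
    let y = min(Int(v * Double(height)), height - 1)
    return (x, y)
}

// Looking straight ahead (0, 0) lands at the center of the frame.
let center = equirectangularPixel(longitude: 0, latitude: 0,
                                  width: 3840, height: 1920)
// center == (1920, 960)
```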

You can set up your timeline sequence for monoscopic or stereoscopic 360° editing. Once set up, simply open a separate 360° viewer, side-by-side with the normal viewer. When you do this, you’ll see the uncorrected image on the right and the adjusted point-of-view image on the left. What’s really cool is that you can play the timeline and actively navigate your view of the content within the 360° viewer, without ever stopping playback. Plus, I’m talking about 4K material here! Clearly the engineers have tweaked the performance and not just integrated a plug-in.

There is also a set of custom effects designed for seamless use on 360° images. For example, if you apply a standard blur, there will be a visible seam where the left and right edges meet. If you apply a 360° blur effect instead, the image and effect are properly blended, as the sketch below illustrates. If you want the full effect, just attach an HTC Vive VR headset to view clips in full 360°. Want to test this, but don’t have any footage? A quick web search will turn up a ton of downloadable, equirectangular clips to play with.
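The seam problem is easy to see in code. In an equirectangular frame, the left and right edges are the same meridian, so any horizontal filter has to wrap around with modulo arithmetic rather than clamp at the edge. This Swift sketch of a one-dimensional box blur shows the idea – an illustration of the concept, not the actual effect code.

```swift
// Wrap a horizontal index around the frame; handles negative offsets too.
func wrappedColumn(_ x: Int, width: Int) -> Int {
    ((x % width) + width) % width
}

// A 1D horizontal box blur on one row of pixels, sampling with wrap-around
// so the left and right edges blend seamlessly.
func seamlessRowBlur(_ row: [Float], radius: Int) -> [Float] {
    let w = row.count
    return (0..<w).map { x in
        var sum: Float = 0
        for dx in -radius...radius {
            sum += row[wrappedColumn(x + dx, width: w)]
        }
        return sum / Float(2 * radius + 1)
    }
}
```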

Wide gamut / high dynamic range (HDR)

Apple is trying to establish leadership with the integration of workflows to support HDR editing. I suspect that their ultimate goal is proper HDR support for Apple TV 4K and the iPhone X. The state of HDR today is very confusing, without any real standards. There are two main approaches: DolbyVision and HDR10, an open standard. The latter leaves the actual implementation up to manufacturers, while Dolby licenses its technology with tight specs. The theoretical DolbyVision brightness standard is 10,000 nits (cd/m2), but their current target is only 4,000 nits. HDR10 caps at 1,000 nits. Current consumer TV sets run in the 300 to 500 nit range, with none exceeding 1,000 nits. Finally, projected brightness in movie theaters is even lower.

To work in HDR within Final Cut Pro X, first set up the FCPX Library as wide instead of standard gamut. Then set the Project (sequence) to one of four standards: Rec 709 (standard dynamic range), Rec 2020, Rec 2020 PQ, or Rec 2020 HLG. The first Rec 2020 mode simply preserves the full dynamic range of log-encoded camera files when FCPX applies its LUTs. The PQ and HLG options are designed for DolbyVision and/or HDR10 mastering. HDR tools are provided to go between color spaces, such as mastering in Rec 2020 PQ and delivering in Rec 709 (consult Apple’s workflow document). However, it is only in the Rec 2020 PQ color space that the FCPX scope will display in nits, rather than IRE. When set to nits, the scale is 0 to 10,000 nits instead of 0 to 120 IRE.
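That nits scale comes from the PQ transfer function itself (SMPTE ST 2084), which maps absolute luminance up to 10,000 nits into a 0–1 signal. Here is a Swift sketch of the PQ encode side, using the constants from the published standard – illustrative only, and certainly not FCPX’s internal code.

```swift
import Foundation

// SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> 0...1 signal.
// Constants are taken from the published standard.
func pqEncode(nits: Double) -> Double {
    let m1 = 2610.0 / 16384.0          // ≈ 0.1593
    let m2 = 2523.0 / 4096.0 * 128.0   // ≈ 78.8438
    let c1 = 3424.0 / 4096.0           // ≈ 0.8359
    let c2 = 2413.0 / 4096.0 * 32.0    // ≈ 18.8516
    let c3 = 2392.0 / 4096.0 * 32.0    // ≈ 18.6875
    let y = max(nits, 0) / 10_000.0    // normalize to the 10,000-nit peak
    let ym1 = pow(y, m1)
    return pow((c1 + c2 * ym1) / (1.0 + c3 * ym1), m2)
}

// 100 nits (SDR white) encodes near 0.51 and 1,000 nits near 0.75 – which is
// why so much of the PQ signal range describes highlights no consumer set shows.
```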

To edit in one of these wide gamut color spaces, set your preferences to display HDR in raw values. Then Final Cut interacts with the color profile of the monitor through macOS to effectively dim the viewer image for this new color space. However, this technique is not applied to the filmstrips and thumbnail images in the browser, which will appear with blown-out levels unless you manually override the color space setting for each clip. If your footage was shot camera raw or log-encoded, using a RED, ARRI, or similar camera, then you are ready to work in HDR today.

It’s critical to note that no current computer display or consumer flat panel will give you an accurate HDR image to grade by. This includes the new iMac Pro screens. You will need the proper AJA I/O hardware and a calibrated HDR display to see a proper HDR image. Even then, it’s still a question of which HDR levels you are trying to master to. For example, if you are using the scope in FCPX with a brightness level up to 10,000 nits, but your target display can only achieve 1,000 nits, then what good is the reading on the scope? We are still early in the HDR process, and I’m concerned that FCPX 10.4 will give users a false impression of what it really takes to do HDR properly.

HEVC / H.265

Support for the H.265 (HEVC) codec has been added with this release, but you’ll need to be on High Sierra. One immediate benefit: you can now import iMovie for iOS projects into FCPX 10.4. If you shot video with an iPhone X and started organizing it in iMovie on the phone, then that video may have used the H.265 codec. Now you can bring it into FCPX to continue the job.

Going the other way will require Compressor encoding. HEVC is also the required format to send HDR material to the web. Apple is late to the game in H.265 support, as Sorenson and Adobe users have been able to do that for a while. I tested H.265 encoding of short clips in Compressor on my mid-2014 Retina MacBook Pro and it was extremely slow. There was no issue with H.264 encoding. The same H.265 test in Adobe Media Encoder – even when it was uprezzing a 1080p file to 4K – was significantly faster than Compressor.
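For those who would rather script the High Sierra encoder directly, AVFoundation exposes an HEVC export preset on macOS 10.13 and later. Below is a minimal Swift sketch; the preset name is real API, but whether Compressor takes this same path internally is my assumption, and the file paths are placeholders.

```swift
import AVFoundation

// Minimal HEVC export using the macOS 10.13+ AVFoundation preset.
let asset = AVAsset(url: URL(fileURLWithPath: "/path/to/source.mov"))

guard let export = AVAssetExportSession(
        asset: asset,
        presetName: AVAssetExportPresetHEVCHighestQuality) else {
    fatalError("HEVC preset unavailable – requires macOS 10.13 or later")
}
export.outputURL = URL(fileURLWithPath: "/path/to/output.mov")
export.outputFileType = .mov

let done = DispatchSemaphore(value: 0)
export.exportAsynchronously {
    if export.status == .completed {
        print("HEVC encode finished")
    } else {
        print("Encode failed:", export.error?.localizedDescription ?? "unknown")
    }
    done.signal()
}
done.wait()   // keep a command-line run alive until the export finishes
```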

Final thoughts

For current users: when you update to Final Cut Pro X 10.4, please remember that it will update each FCPX library file that you open afterwards. Although this has generally been harmless for most users, it’s best to take some precautions. Zip your 10.3 (or earlier) version of the application and move that .zip file out of the Applications folder before you update. Archive all of your existing Final Cut libraries. This way you can find your way back in case of some type of failure.

Final Cut Pro X 10.4 is a solid upgrade that will have loyal FCPX users applauding. Overall, these new tools are useful and, as before, FCPX is a very fluid, enjoyable editing application. It slices through 4K content better than any other NLE on the Mac platform. If you like its editing paradigm, then nothing else comes close.

Unfortunately, Apple didn’t squash some long-standing bugs. For example, numerous users online are still complaining about the issue where browser text intermittently disappears. I do feel that there were missed opportunities. The functionality of audio lanes – a feature introduced in 10.3 as a way to get closer to track-style audio mixing – hasn’t been expanded. The hope for an enhanced, roles-based audio mixer has once again gone unanswered. On the other hand, the built-in audio plug-ins have been updated to those used by Logic Pro X and there’s a clean path to send your audio to Logic if you want to mix there.

I definitely welcome these updates. The new color tools make it a more powerful application to use for color grading, so I’m happy to see that Apple has been listening. Now, I hope that we’ll see some of the other needs addressed before another year passes us by.

Originally written for Digital Video magazine / Creative Planet Network

©2017, 2018 Oliver Peters

Downsizing

The bond between a film director and the editor is often a long-lasting one. The industry is full of pairings that continue film after film. One such duo is director Alexander Payne (Nebraska, The Descendants, Sideways) and editor Kevin Tent (Welcome to Me, Girl Interrupted, Election). Tent has edited every film that Payne directed, with the exception of Payne’s short film Paris, je t’aime. In fact, Payne also served as producer for Crash Pad, a film directed by Tent.

The latest Alexander Payne film to hit the cinemas is Downsizing, a sci-fi satire starring Matt Damon, Christoph Waltz, and Kristen Wiig. In the film, scientists discover human miniaturization as a way to combat overpopulation. Paul (Matt Damon) and Audrey (Kristen Wiig) decide to give it a try, exchanging their average life in Omaha for Leisure Land, one of the ‘micro-communities’ sprouting up. Their modest $150,000 in personal assets will make them multimillionaires, so they take the plunge.

Sci-fi and satire

The sci-fi genre is a new approach for Payne, which is where I started my conversation with Kevin Tent. He explains, “The sci-fi theme is a departure for Alexander, but this is still very much an ‘Alexander Payne movie’. It’s still about the human experience. In the plot, shrinking is seen as a way to save the human race, but people get greedy. They can make themselves instantly rich, save money on food, medicine, and move into big ‘McMansions’. Human nature takes over, which makes the film funny and also thought-provoking. It covers a lot of ground and politics.”

“It’s easy to ask, why sci-fi,” Tent continues. “Alexander Payne is an artist who is always looking for ways to challenge himself. He co-wrote the script ten years ago, but it took this long to get it made. For one thing, Downsizing is more expensive than his past films. As an editor, I first looked at the cutting differently, because of working with the visual effects; but, I quickly realized that this film, like Alexander’s others, was about the characters and the story.  [Those are] still the most important elements of the movie. I had recently worked on The Audition, which was shot mostly with green screen – and a while back, The Golden Compass, which was a serious visual effects movie.  I had enough knowledge about the process to know one thing. These people can do anything! We had a terrific VFX team, headed by our creative guru, Jamie Price. ILM and Framestore did most of the visual effects.”

Digital production to aid the process

Alexander Payne shifted to digital acquisition with Nebraska and has continued with it on his latest, Downsizing. According to Tent, “Alexander shoots a lot of coverage, so he likes digital for that. It’s also easier to deal with when compositing visual effects. We had over 130 hours of total footage. Of course, a fairly good chunk was plates for VFX and 2nd unit footage. Most of the scenes were shot with a single camera, but sometimes with multi-cam – especially for some of the big speeches, which were covered with two and sometimes three cameras. We synced up the takes in the Avid, which makes it so easy to switch from camera to camera. Mindy Elliot is our amazing first assistant. She’s a total pro and a total joy to work with. She’s been running our cutting rooms since The Descendants. Angela Latimer was our second. She did 99% of the scripting [for Avid’s ScriptSync feature] and also helped cut early versions of Paul’s drug montage [scene in Downsizing]. Joe Carson was our VFX editor. I met him while working on The SpongeBob Movie. I was one of the live action CGI editors on that film. Joe is awesome. He not only kept all of our visual effects organized, but he was also kept busy with the countless comps, morphs, and speed-ups that we tossed at him on a daily basis.”

Production wrapped in mid-August 2016 and then Tent started cutting with Payne right after Labor Day. Tent continues, “When I cut with Alexander, we basically start from scratch. I do create an editor’s cut during production, which we go back to for reference during our time together cutting, but it isn’t the starting point when I begin with Alexander. He’s a good editor, so when we work together, it’s really like having two editors in the room. We start watching dailies and start building scenes. We often look back at my editor’s cut and realize the scene or a part of it was better in that earlier version. Or maybe not. If there is something we like, we’ll put it back into the current cut. We completed our first pass (kind of a director’s assembly) in January to show the studio. By early to mid-July we had a locked cut with about 80% of the completed VFX shots. The remainder trickled in afterwards. Altogether, that’s about ten or eleven months of cutting and finishing. Our DI/color grading was handled by the amazing Skip Kimball at Technicolor.”

Tools and tips

As a fellow editor, it’s always fun to talk about the tools and how to use them on a feature film project. Kevin Tent is a committed Avid Media Composer user. (Pacific Post provided the Avid systems used by the editing team.) According to Tent, “This was a huge project and Media Composer never had a problem with it.” One unique hallmark of Media Composer is Avid’s Script Integration. Notable within it is ScriptSync, Media Composer’s ability to automatically analyze waveforms and synchronize them – and, therefore, the associated clip – against text that has been input, like a film script. When correctly indexed, simply clicking on a line of dialogue in the on-screen script brings up all of the corresponding coverage. An ongoing licensing dispute limited its use to older versions of Media Composer, until the issue was finally resolved this year. That is great news for devotees of Avid’s powerful ScriptSync capability.

Many film editors swear by Avid’s Script Integration tools, yet some never use them at all. Was Tent a ScriptSync user? “Hell, ya!” is his instant reply. “We stayed on Media Composer 7.0.6, because of the ScriptSync licensing issue, just so we could use it. I had Angela mark a lot of extra material and ad libs in addition to the scripted dialogue. For example, an action like Paul opening a door or something like that. That would help, especially if they shot a lot of takes or resets within one bigger take, which tends to happen a lot when the shooting is digital. There’s a massive party scene midway through the movie with people dancing, smoking pot, that kind of thing, and I asked Angela to add a ton of detail describing the scene. It made finding specific actions so quick. It’s also an especially great aid when re-cutting scenes and looking for alternate coverage.”

Another aid that editors like is to place scene cards on the wall. Typically these are 3”x5” note cards with written scene descriptions – one for each scene – that can be pinned to the wall in the order of the ongoing edit. Although Tent is also a proponent of these – a remnant practice from the old film days – his Downsizing cutting room didn’t have enough wall space to accommodate cards.

The Downsizing script clocked in a tad long and the first assembly that Payne and Tent cut was 2:45 (final length was 2:08). Obviously the team needed to do a bit of “downsizing” themselves. Tent explains, “The biggest lost scenes were bookending storyteller elements to open and close the film. There was an old caveman from far in the future telling a group of children about the events within the film and how once giants roamed the world. This story element was painful to lose, because it was very funny and effective emotionally. But it took an added three or four minutes to get to Matt Damon’s character and that hurt us.  The audience wants you to get to your main characters and understand what they’re seeing within a reasonable amount of time. Fortunately, Alexander hadn’t shot it yet as part of the main production. We previewed with storyboards, temp music, and voice over. While it was tough to lose it from the point of view of the script, we weren’t leaving produced material ‘on the cutting room floor’. Ultimately if you don’t know it was there, you won’t miss not having it.”

Downsizing opened in cinemas on December 21. Whether you are in it for the thought-provoking concepts or simply a lot of laughs and a wild ride, it’s a film to enjoy. Alexander Payne is bound to have another success on his hands.

Originally written for Digital Video magazine / Creative Planet Network

© 2017, 2018 Oliver Peters