Easy 4K Workflow


In the last post I questioned the visual value of 4K. However, it’s inevitable that more and more distributors will be asking for 4K deliverables, so you might as well start planning how you are going to achieve that. There are certainly plenty of demos showing how easy it is to edit 4K content, but they tend to use iPhone video as the demo material. The reality is that such footage is crap and should only be used when the iPhone is the only camera available. At the low end, there are plenty of cameras to choose from that work with highly-compressed 4K images and yet yield great results. The Blackmagic Design URSA Mini, Sony FS7 and Canon C300 Mark II come to mind. Bump up to something in a more cinema-style package and you are looking at a Sony F55, RED, ARRI or even the AJA CION.

While many cameras record to various proprietary compressed codecs, having a common media codec is ideal. Typically this means Apple ProRes or Avid DNxHD/HR. Some cameras and standalone monitor/recorders can natively generate media in these formats. In other circumstances, an interim transcode is required before editing. This is where system throughput becomes a big issue. For example, if you want to work with native 4K material as ProRes 4444, you are going to need fast drives. On my home Mac Pro tower, I have two internal 7200RPM spinning drives for media, striped as RAID-0. In addition to these and the boot drive, I also have another internal SSD media drive. When I checked their relative performance with the AJA System Test utility, they clocked at 161 MB/s write / 168 MB/s read for the RAID-0 stripe and 257/266 for the single SSD. That’s good enough for approximately 27fps and 43fps respectively, if the media were large 3840 x 2160 (2160p) ProRes 4444 files. In other words, both drive units are adequate for a single stream of 2160p/23.98 as ProRes 4444, but would have a tougher time with two streams or more.
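To put those AJA System Test numbers in context, here’s a quick back-of-envelope calculation in Python. The roughly 6 MB-per-frame figure is an approximation derived from the throughput and frame-rate numbers above, not an official ProRes specification, so treat the results as ballpark estimates.

```python
# Back-of-envelope: how many 3840 x 2160 ProRes 4444 frames per second a drive
# can sustain. The per-frame size is an approximation derived from the AJA
# System Test figures quoted above, not an official Apple spec.

MB_PER_FRAME = 6.0  # approximate size of one UHD ProRes 4444 frame

def sustainable_fps(drive_speed_mb_s: float, mb_per_frame: float = MB_PER_FRAME) -> float:
    """Frames per second a drive can sustain at the given throughput."""
    return drive_speed_mb_s / mb_per_frame

def stream_count(drive_speed_mb_s: float, fps: float = 23.98) -> float:
    """How many simultaneous 2160p streams the drive can feed at a given frame rate."""
    return sustainable_fps(drive_speed_mb_s) / fps

if __name__ == "__main__":
    for label, speed in [("RAID-0 stripe", 161), ("single SSD", 257)]:
        print(f"{label}: ~{sustainable_fps(speed):.0f} fps, "
              f"{stream_count(speed):.1f} streams of 2160p/23.98")
```

Running this reproduces the numbers in the text: roughly 27fps for the stripe and 43fps for the SSD, i.e. a bit more than one stream each, but not two.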

Unfortunately the story doesn’t end with drive performance alone, because some NLEs handle real-time playback of 4K media better than others do. I’ve performed a number of tests with 4K files in Apple Final Cut Pro X, Adobe Premiere Pro CC, Avid Media Composer and Blackmagic Design DaVinci Resolve. This has been on a number of different units, including a couple of Mac Pro towers, as well as a newer “trash can” Mac Pro. Plus, I’ve run tests with local drives, attached media RAIDs, and network-attached storage systems. What I’ve found is that as long as you have fast drive performance, the bottleneck is the NLE.

Pretty much all of these choices can handle a single stream of 4K media without too much of an issue. However, when you stack up a second layer or track for a simple 2D PIP composite, the system generally struggles. In some cases, FCPX has performed better than the others, but not consistently; all of them choked to varying degrees. When you limit it to a single stream of 4K video with associated audio, FCPX performs more fluidly at a higher quality level than Media Composer or Premiere Pro, although Media Composer also performed well in some of the tests. My conclusion, for now, is that if you want to work with native 4K media in a client-involved session, and with the least amount of rendering, then FCPX is the clear winner – at least on the Mac platform. For many editors it will be the most viable choice.

Native workflow

The first big plus for Final Cut Pro X is how easily it works with compatible native media. Editing from native files is something I don’t generally advocate on a large project like a show or feature film – opting instead to create “optimized” media first, either externally or within FCPX. Nevertheless, a lot of native codecs can be quite easy on the system. For example, one client cut an indie feature using all native camera files from his Sony FS7. His Final Cut system was a tricked-out iMac that was a couple of years old, paired with a Promise Pegasus RAID array. Initially he cut the film from native 4K FS7 files in an FCPX 1080p timeline. I was doing the grading in Resolve, so I had him export a single, flattened movie file from the timeline as 1080p ProRes 4444. I brought this into Resolve, “bladed” the cuts to create edit points and applied my color correction. I exported a single ProRes 4444 master file, which he could import back into FCPX and marry to the post-production mix.

Fast forward a year and the film distributor was inquiring whether they could easily produce a 4K master instead of a 1080 master. This turned out to be relatively simple. All my client had to do was change his FCPX project (timeline) settings to 4K, double-check the scaling for his clips and export a new 4K ProRes 4444 file of the timeline. In Resolve, I also changed the timeline setting to 4K and then relinked to the new 4K file. Voila! – all the cuts lined up and the previous grades all looked fine. Then I simply exported the graded 4K file to send back to the client.

In this example, even with a roundtrip to Resolve and a change from 1080p to 2160p, FCPX performed perfectly without much fuss. However, for many, you wouldn’t even need to go this far. Depending on how much you like to play and tweak during the color grade, there are plenty of ways to do this and stay totally inside FCPX. You could use tools like the Color Board, Hawaiki Color, Color Finale, or even some home-brew Motion effects, and achieve excellent results without ever leaving Final Cut Pro X.

As a reminder, Media Composer, Premiere Pro CC and Resolve are all capable of working with native media, including 4K.

Proxy workflow

In addition to native 4K post, Apple engineers built an ingenious internal proxy workflow into Final Cut. Transcode the camera files in the background, flip a toggle, and work with the proxy files until you are ready to export a master. When you opt to transcode proxies, FCPX generates half-resolution ProRes Proxy media corresponding to your original files. As an example, if your media consists of 2160p XAVC camera files, FCPX creates corresponding 1080p ProRes Proxy files. Even though the proxy media’s frame is one quarter the area of the 4K original, FCPX takes care of the scaling math in the timeline between original and proxy media. The viewer display will also appear very close in quality, regardless of whether you have switched to original/optimized or proxy media. The majority of legacy A/V output cards, like a Blackmagic Design DeckLink, are only capable of sending SD and HD content to an external monitor. FCPX can feed such a card the proper data, so that a 4K timeline is displayed as scaled 1080 output on your external video monitor.

Although these proxies are small relative to the 4K originals, they are still rather large files to be moving around among multiple editors. It’s not an official part of the Final Cut operation, but you can replace the generated proxies with your own versions, with some caveats. Let’s say you have 3840 x 2160, log-gamma-encoded 4K camera files. You would first need to have FCPX generate proxies. Then, using an external application such as EditReady, Compressor, etc., you could transcode the camera files into small 960 x 540 ProRes Proxy media, complete with a LUT applied and timecode/clip name burnt in. Find your Proxy Media folder, trash the FCPX-generated files and replace them with your own. FCPX should properly relink to these and understand the correct relationship between the original and the proxy files. (This post explains the process in more detail.) The caveats: clip name, frame rate, clip length, aspect ratio, and audio channel configuration must all match. Otherwise you are good to go.
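As a sanity check before swapping files, the matching rules can be expressed in a few lines of Python. This is only an illustrative sketch – the `ClipInfo` fields and the sample values are hypothetical, and in practice you would fill them in from a media-probing tool rather than by hand.

```python
# Hypothetical sketch: check whether a hand-made proxy is safe to swap into
# FCPX's Proxy Media folder. The fields mirror the caveats in the text; the
# example values are invented for illustration.
from dataclasses import dataclass

@dataclass
class ClipInfo:
    name: str
    frame_rate: float
    duration_frames: int
    aspect_ratio: float   # width / height
    audio_channels: int

def proxy_is_compatible(original: ClipInfo, proxy: ClipInfo) -> bool:
    """Every caveat attribute must match for FCPX to relink cleanly."""
    return (original.name == proxy.name
            and abs(original.frame_rate - proxy.frame_rate) < 0.001
            and original.duration_frames == proxy.duration_frames
            and abs(original.aspect_ratio - proxy.aspect_ratio) < 0.001
            and original.audio_channels == proxy.audio_channels)

camera = ClipInfo("A001_C003", 23.976, 1440, 3840 / 2160, 2)
proxy = ClipInfo("A001_C003", 23.976, 1440, 960 / 540, 2)  # 960 x 540 keeps 16:9
print(proxy_is_compatible(camera, proxy))  # prints True
```

Note that the frame size itself is allowed to differ – only the aspect ratio has to match, which is why a 960 x 540 proxy can stand in for a 3840 x 2160 original.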

The benefit to this solution is that you can freely edit with the proxies on a lightweight system, such as a MacBook Pro with a portable drive. When ready, move back to a beefier unit and storage, flip to original/optimized media, double-check all effects and color-correction on a good monitor, and then export the master files. It’s worth noting that this workflow is also potentially possible with Premiere Pro CC, because the new version to be introduced later this year will include a proxy editing workflow.

Naturally there is no single solution, but Final Cut Pro X makes this process far easier than any other tool that I use. If 4K is increasingly looming on the horizon for you, then FCPX is certainly worth a test run.

©2016 Oliver Peters

Deadpool


Adobe has been on a roll getting filmmakers to adopt its Premiere Pro CC editing software for feature film post. Hot on the heels of its success at Sundance, where a significant number of the indie films were edited using Premiere Pro, February saw the release of two major Hollywood films that were cut using Premiere Pro – the Coen Brothers’ Hail, Caesar! and Tim Miller’s Deadpool.

Deadpool is one of Marvel Comics’ more unconventional superheroes. Deadpool, the film, is the origin story of how Wade Wilson (Ryan Reynolds) becomes Deadpool. He’s a mercenary soldier that gains accelerated healing powers through a rogue experiment. Left disfigured, but with new powers, he sets off to rescue his girlfriend (Morena Baccarin) and find the person responsible. Throughout all of this, the film is peppered with Deadpool’s wise-cracking and breaking the fourth wall by addressing the audience.

This is the first feature film for director Tim Miller, but he’s certainly not new to the process. Miller and his company Blur Studios are known for their visual effects work on commercials, shorts, and features, including Scott Pilgrim vs. the World and Thor: The Dark World. Setting out to bring as much of the post process in-house as possible, Miller consulted with his friend, director David Fincher, who recommended the Adobe Creative Cloud solution, based on Fincher’s experience during Gone Girl. Several editing bays were established within Blur’s facility – using new, tricked-out Mac Pros connected to an Open Drives Velocity SSD 180TB shared storage solution.

Plugging new software into a large VFX film pipeline

Julian Clarke (Chappie, Elysium, District 9) came on board to edit the film. He explains, “I talked with Tim and was interested in the whole pioneering aspect of it. The set-up costs to make these permanent edit suites for his studio are attractive. I learned editing using [Apple] Final Cut Pro at version one and then I switched to Avid about four years later and have cut with it since. If you can learn [Avid] Media Composer, then [Adobe] Premiere Pro is fine. I was up to about 80% of my normal speed after just two days.”

To ease any growing pains of using a new editing tool on such a complex film, Miller and Adobe also brought in feature film editor Vashi Nedomansky (That Which I Love Destroys Me, Sharknado 2: The Second One, An American Carol) as a workflow consultant. Nedomansky’s job was to help establish a workflow pipeline and to get the editorial team up to speed with Premiere Pro. He had performed a similar role on Gone Girl. He says, “I’ve cut nine features and the last four have been using Premiere Pro. Adobe has called on me for that editor-to-editor interface and to help Blur set up five edit bays. I translated what we figured out with Gone Girl, but adapted it to Blur’s needs, as well as taking into consideration the updates made to the software since then. During the first few weeks of shooting, I worked with Julian and the assistant editors to customize their window layouts and keyboard shortcuts, since prior to this, the whole crew had primarily been using Avid.”

Deadpool was shot mostly with ARRI ALEXA cameras recording open gate 2.8K ARRIRAW. Additional footage also came from Phantom and RED cameras. Most scenes were recorded with two cameras. The original camera files were transcoded to 2K ProRes dailies in Vancouver. Back at Blur, first assistant editor Matt Carson would sync audio and group the clips into Premiere Pro multicam sequences.

Staying up with production

As with most features, Clarke was cutting while the production was going on. However, unlike many films, he was ready to show Miller edited scenes for review within 24 hours after the shoot had wrapped for the day. Not only a cut scene, but one already fleshed out with temporary sound effects and music. This is quite a feat, considering that Miller shot more than 500 hours of footage. Seeing a quick turnaround of edited scenes was very beneficial for Miller as a first-time feature director. Clarke adds, “My normal approach is to start cutting and see what works as a first draft. The assistant will add sound effects and temp music and if we hit a stumbling block, we move on to another scene. Blur had also created a lot of pre-vis shots for the effects scenes prior to the start of principal photography. I was able to cut these in as temp VFX. This way the scenes could play through without a lot of holes.”

To make their collaborative workflow function, Nedomansky, Clarke, and the assistants worked out a structure for organizing files and Premiere Pro projects. Deadpool was broken into six reels, based on the approximate page count in the script where a reel break should occur. Every editor had their own folder on the Open Drives SAN containing only the most recent version of whatever project that they were working on. If Julian Clarke was done working on Reel 1, then that project file could be closed and moved from Clarke’s folder into the folder of one of the assistants. They would then open the project to add temporary sound effects or create some temporary visual effects. Meanwhile, Clarke would continue on Reel 2, which was located in his folder. By keeping only the active project file in the various folders and moving projects among editors’ folders, it would mimic the bin-locking method used in shared Avid workflows.
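The “one active copy” rule at the heart of this scheme is simple enough to automate. Here is a hypothetical Python sketch of such a hand-off – the folder layout, file names, and the script itself are invented for illustration, not something the Deadpool team necessarily used.

```python
# Hypothetical sketch of the "one active copy" hand-off described above:
# a reel's project file lives in exactly one editor's folder at a time,
# and a hand-off is a move, never a copy. Names are invented.
from pathlib import Path
import shutil

def hand_off(project: str, from_editor: Path, to_editor: Path) -> Path:
    """Move a project file from one editor's folder to another's."""
    src = from_editor / project
    dst = to_editor / project
    if not src.exists():
        raise FileNotFoundError(f"{project} is not in {from_editor.name}'s folder")
    if dst.exists():
        raise RuntimeError(f"{to_editor.name} already has {project}")
    to_editor.mkdir(parents=True, exist_ok=True)
    shutil.move(str(src), str(dst))  # move, never copy: one active version
    return dst
```

Because the file is moved rather than copied, only one editor can hold the live version of a reel at any moment – the same guarantee Avid’s bin locking provides, enforced here purely by folder convention.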

In addition, Premiere Pro’s Media Browser module also enabled the editors to access and import sequences found within other project files. This is a non-destructive process. Older versions of the project files were stored in a separate folder on the SAN in order to keep the active folders and projects uncluttered. Premiere Pro’s ability to work with folders as they were created in the Finder let the editors do more of the organization at the Finder level than they normally would, had they been cutting with Avid systems.

Cutting an action film

Regardless of the software you use, each film presents a unique set of creative challenges. Clarke explains, “One scene that took a while was a long dialogue scene with Deadpool and Colossus on the highway. It’s quintessential Deadpool with a lot of banter and improv from Ryan. There’s not much story going on in the background at that time. We didn’t want to cut too much out, but at the same time we didn’t want to have the audience get lost in what’s supposed to be the bigger story. It took some time to strike the right balance. Overall the film was just about right. The director’s cut was about two hours, which was cut into the final length of one hour and 45 minutes. That’s just about the right amount to cut out, because you don’t end up losing so much of the heart of the film.”

Many editors have a particular way they like their assistants to organize bins and projects. Clarke offers, “I tend to work in the frame view and organize my set-ups by masters, close-ups, and so on. Where I may be a little different than other editors is how I have my assistants organize action scenes. I’ll have them break down the choreography move-by-move and build a sequence of selected shots in the order of these moves. So for example, all the angles of the first punch, followed by all the angles of the next move – a punch, or block, or kick. Action scenes are often shot with so much coverage, that this lets me quickly zero in on the best stuff. It eliminates the scavenger hunt to find just the right angle on a move.”

The script was written to work in a nonlinear order. Clarke explains how that played out through the edit, “We stood by this intention in the editing. We found, in fact, that the film just didn’t work linearly at all. The tones of the two [scripted] timelines are quite different, with the more serious undertones of the origin story and the broad humor of the Deadpool timeline. When played sequentially, it was like oil and water – two totally different movies. By interweaving the timelines, the tone of the movie felt more coherent with the added bonus of being able to front load action into the movie to excite the audience, before getting into the heavier cancer part of the story.”

One editing option that might come to mind is that a character in a mask offers an interesting opportunity to change dialogue without difficult sync issues. However, it wasn’t the sort of crutch some might assume. Clarke says, “Yes, the mask provided a lot of opportunity for ADR. Though this was used more for tweaking dialogue for plot clarity or to try out alternate jokes, than a wholesale replacement of the production track. If we liked the production performance we generally kept it, and embraced the fact that the mask Ryan was wearing would dull the audio a bit. I try to use as little ADR as possible, when it comes to it being used for technical reasons, rather than creative ones. I feel like there’s a magic that happens on set that is often hard to replicate in the ADR booth.”

Pushing the envelope

The editing systems proved to offer the performance needed to complete a film of this size and complexity. Vashi Nedomansky says, “There were 1400 effects shots handled by ten vendors. Thanks to the fact that Blur tricked out the bays, the editors could push 10 to 15 layers of 2K media at a time for temp effects – in real-time without rendering. When the film was locked, audio was exported as AAF for the sound facility along with an H.264 picture reference. Blur did many of the visual effects in-house. For final picture deliverables, we exported an XML from Premiere Pro, but also used the Change List tool from Intelligent Assistance. This was mainly to supply the list in a column format that would match Avid’s output to meet the studio’s requirements.”

I asked Clarke and Nedomansky what the team liked best about working with the Adobe solution. Nedomansky says, “I found that the editors really liked the tilde key [on the keyboard], which in Premiere Pro brings any window to fullscreen. When you have a timeline with 24 to 36 tracks of temp sound effects, it’s really nice to be able to make that fullscreen so that you can fine-tune them. They also liked what I call the ‘pancake timeline’. This is where you can stack two timelines over each other to compare or pull clips from one into the other. When you can work faster like this, there’s more time for creativity.” Clarke adds, “I used a lot of the time-remapping in After Effects. Premiere Pro’s sub-frame audio editing is really good for dialogue. When Avid and Apple were competing with Media Composer and Final Cut Pro it was very productive for both companies. So competition between Avid and Adobe is good, because Premiere Pro is very forward-thinking.”

Many NLE users may question how feature films apply to the work they do. Nedomansky explains, “When Kirk Baxter used Premiere Pro for Fincher’s Gone Girl, the team requested many features that they were used to from Final Cut Pro 7. About 200 of those suggestions have found their way as features into the current release that all Creative Cloud customers receive. Film editors will stress a system in ways that others won’t, and that information benefits all users. The important takeaway from the Deadpool experience is that after some initial adjustment, there were no showstoppers and no chaos. Deadpool is a monster film, but these are just tools. It’s the human in the chair making the decision. We all just want to work and not deal with technical issues. Whatever makes the computer invisible – that’s the power.”

Deadpool is certainly a fun ride, with a lot of inside jokes for veteran Marvel fans. Look for the Stan Lee cameo and be sure to stay all the way through the end credits!

Watch director Tim Miller discuss the choice to go with Adobe.

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2016 Oliver Peters

Whiskey Tango Foxtrot


As most readers know, “whiskey tango foxtrot” is the military way to communicate the letters WTF. Your imagination can fill in the rest. Whiskey Tango Foxtrot, the movie, is a dark comedy about the experiences of a female journalist in Afghanistan, based on Kim Barker’s memoir, The Taliban Shuffle: Strange Days in Afghanistan and Pakistan. Paramount Pictures tapped the writing/directing team of John Requa and Glenn Ficarra (Focus, Crazy, Stupid, Love., I Love You Phillip Morris) to tackle the film adaptation, starring Tina Fey, Margot Robbie, Martin Freeman, Billy Bob Thornton, and Alfred Molina.

Glenn Ficarra explains the backstory, “When the military focus shifted from Afghanistan to Iraq there was a void in coverage. Barker was looking for a change in her life and volunteered to embed as a correspondent in Kabul. When she got there, she wasn’t quite ready for the high-adrenaline, partying lifestyle of many of the journalists. Most lived in dorms away from the general Afghan population. Since there weren’t that many females there, she found that there was a lot of interest in her.” This is the basis of both the book and the film – an Afghanistan story with a touch of Animal House and M*A*S*H.

Filming in Afghanistan would have been too dangerous, so production shifted to New Mexico, with Xavier Grobet (Focus, Enough Said, I Love You Phillip Morris) as the director of photography. The filmmakers also hired a female Muslim journalist, Galereh Kiazand, as the second unit photographer to pick up B-roll in Kabul, which added to the authenticity. In addition, they also licensed stock shots originally filmed for The Kite Runner, but not used in that film. Ficarra adds, “We built two huge sets for Kabul and Kandahar, which were quite convincing, even to vets and Afghans who saw them.”

With efficiencies realized during Focus, the team followed a similar course on this film. Ficarra explains, “We previously pulled the editing in-house. For Whiskey Tango Foxtrot we decided to do all the visual effects in-house, too. There are about 1,000 VFX shots in the film. It’s so great to simply bring on more artists as you need them and you only have to pay the crew. At its peak, we had about 20 Nuke artists working on shots. Doing it internally opens you up to more possibilities for minor effects that enhance shots. You would otherwise skip these if you were working with an outside effects house. We carried this approach into the filming as well. While traveling, it was great to quickly pick up a shot that you could use as B-roll. So our whole mentality has been very much like the way you work in film school.”

Adjusting the workflow for a new film

The duo started production of Whiskey Tango Foxtrot on the heels of completing Focus. They brought along editor Jan Kovac, as well as the use of Apple Final Cut Pro X for editing. This was the off-the-shelf version of Final Cut Pro X available to all customers at the time of the production – no special version or side build. Kovac explains what differed on this new film, “The biggest change was in camera formats. Instead of shooting [Apple] ProRes 4444, we switched to using the new ProRes 4444 XQ codec, which was deployed by ARRI on the ALEXAs. On Focus, we recorded ARRIRAW for the green screen shots. We did extensive testing with this XQ codec prior to production and it was perfect for even the green screen work. Most of the production was shot with two ALEXAs recording in a 2K theatrical format using the ProRes 4444 XQ codec.”

Light Iron provided a DIT on set who took the camera files, added a basic color LUT, synced production sound, and then generated viewing dailies, which were distributed to department heads on Apple iPads. The DIT also generated editorial files that were in the full 2K ProRes 4444 XQ resolution. Both the camera original files and the color-corrected editorial files were stored on a 160TB Accusys ExaSAN system back at the film’s post headquarters. Two Mac Minis served as metadata controllers. Kovac explains, “By always having the highest quality image to edit with, it meant that we could have the highest quality screenings at any given time. You always see the film in a state that is very close to the final product. Since visual effects were being handled in-house, it made sense to have the camera original files on the SAN. This way shots could quickly be pulled for VFX work, without the usual intermediate step of coordinating with the lab or post house that might otherwise store these files.”

Another change was that audio was re-synced by the editing team. First assistant editor Kevin Bailey says, “The DIT would sync the production mix, but when it got here, I would sync up all the audio tracks using Sync-N-Link X. This syncs by timecode, making the process fast. I would group the cameras into multicam clips, but as many as 12 isolated audio tracks were also set up as separate angles. This way, Jan could easily switch between the production mix and individual mics. The only part that wasn’t as automatic was that the crew also used a Blackmagic Pocket Camera and a Sony A7 for some of the shots. The production was running at a true 24.0 fps frame rate, while these smaller cameras only shot 24 frames at a video rate of 23.98. These shots required adjustment and manual syncing. The reason for a true 24.0 frame rate was to make it easy to work with 48fps material. Sometimes the A-camera would run at 24fps while the B-camera ran at 48fps. Speeding up the B-camera by a 2X factor gets it into sync, without worrying about more complicated speed offsets.” In addition to these formats, the Afghanistan second unit footage was shot on a RED camera.

Bailey is an experienced programmer who created the program Shot Notes X, which was used on this film. He continues, “Our script supervisor used Filemaker Pro, which exports a .csv file. Using Shot Notes X, I could combine the FCPXML from Final Cut with the .csv file and then generate a new FCPXML file. When imported back into Final Cut, the event would be updated to display scenes and takes, along with the script notes in the browser’s notes column. Common script codes would be used for close-ups, dolly shots, and so on. Filtering the list view by one of these codes in Final Cut would then display only the close-ups or only the dolly shots for easy access.” Bailey helped set up this pipeline during the first few weeks of production, at which point apprentice editor Esther Sokolow took over the dailies processing. Bailey shifted over to assist with sound and Sokolow later moved into a VFX editor role as one of several people doing temp VFX.
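The idea behind that merge is straightforward to sketch in Python: join the script supervisor’s notes, keyed by clip name, onto the clip list, then filter by a shot code. To be clear, this is only an illustration of the concept – it is not Shot Notes X, and the clip names, codes, and CSV columns below are invented.

```python
# Illustration of the concept behind this pipeline (not Shot Notes X itself):
# join script-supervisor notes, keyed by clip name, onto a clip list, then
# filter by a shot code. All names, codes, and columns are invented.
import csv, io

notes_csv = """clip,scene,take,codes,note
A001_C001,12,1,CU,Best read of the line
A001_C002,12,2,WS DOLLY,Camera bump at end
A001_C003,12,3,CU,Alt joke
"""

def load_notes(text: str) -> dict:
    """Index the script supervisor's rows by clip name."""
    return {row["clip"]: row for row in csv.DictReader(io.StringIO(text))}

def clips_with_code(notes: dict, code: str) -> list:
    """Mimic filtering the browser's list view by a script code."""
    return [clip for clip, row in notes.items() if code in row["codes"].split()]

notes = load_notes(notes_csv)
print(clips_with_code(notes, "CU"))  # -> ['A001_C001', 'A001_C003']
```

In the real workflow the merged result is written back out as FCPXML so Final Cut displays the notes natively, but the filtering logic is essentially this lookup.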

From trailer to home base

During production in New Mexico, Kovac worked out of an editorial trailer equipped with a single Mac Pro and an 8TB G-Raid drive. There he was cutting using the proxy files that Final Cut Pro X can generate internally. During that 47-day period, Kovac was doing 90% of the editing. The amount of footage averaged about three hours and 40 minutes per day. In April, the unit moved back to home base in Los Angeles, where the team had two Mac Pro edit suites set up for the editors, as well as iMacs for the assistants.

John Requa and Glenn Ficarra are “hands-on” participants in the editing process. Kovac would cut in one room, while Ficarra and Requa would cut in the other. After the first preview, their collaboration slowly changed into a more traditional editor-director format. Even towards the end, Ficarra would still edit when he found time to do so. Post ended just before Christmas after a 35-week post schedule. Glenn Ficarra explains, “John and I have worked together for 30 years, so we are generally of one mind when we write, direct, or edit. Sometimes John would cut with me and I’d be the ‘fingers’ and other times he’d work with Jan. Or maybe I’d work with Jan and John would review and pick takes. So our process is very fluid.”

The Whiskey Tango Foxtrot team worked deeper into temp sound and visual effects than before. Kovac explains, “Kevin is very comfortable with sound design during the edit. And he’s a good Nuke artist, too. While I was working on one reel, Kevin could work on a different reel adding in sound effects and creating monitor comps and screen replacements. A lot of this work was done inside of Final Cut using the SliceX and TrackX plug-ins from CoreMelt. We were able to work in a 5.1 surround project and did all of our temp mixes in 5.1.” The power of the plug-ins let more of the temp effects be done inside Final Cut Pro X, resulting in a more efficient workflow with less need for roundtrips to other applications.

All media and render files were kept on the ExaSAN storage, but external to the Final Cut Pro X library files, thus keeping those small. The library files were stored on a separate NFS server (a Mac Mini using NFS Manager) with a separate FCPX library file for each reel of the film. This enabled the editors and assistants to all access any FCPX library file, as long as someone else wasn’t using it at that time. A shared iTunes library for temporary sound effects and music selections was stored on the SAN with all machines pointing to that location. From within Final Cut, any editor could browse the iTunes library for music and sound effects.

When it came time for sound and picture turnovers, X2Pro Audio Convert was used to pass audio to the sound design team as an AAF file. Light Iron’s Ian Vertovec handled final color correction on their Quantel Pablo Rio system. He was working off of camera original media, which Light Iron also stored at their facility after the production. Effects shots were sent over as DPX image sequences.

Thoughts on the cut

The director’s cut for Whiskey Tango Foxtrot ran about three hours, although the final length clocked in at 1:52:00 with credits. Kovac explains, “There were 167 scripted scenes in the original script, requiring a fair amount of trimming. Once you removed something it had consequences that rippled throughout. It took time to get it right. While it was a tougher film from that standpoint, it was easier, because no studio approval process was needed for the use of Final Cut Pro X. So it built upon the shoulders of Focus. Final Cut has proven itself as a valuable member of the NLE community. Naturally anything can be improved. For example, optical flow and auditions don’t work with multicam clips. Neither do the CoreMelt plug-ins.” Bailey adds, “For me the biggest selling point is the magnetic timeline. In areas where I would build up temp sound design, these would be the equivalent of ten tracks deep. It’s far easier to trim sections and have the audio follow along than in any other NLE.”

Glenn Ficarra wrapped up with these thoughts. He says, “A big step forward on this film was how we dealt with audio. We devised a method to keep as much as possible inside FCPX, for as long as possible – especially for screenings. This gave us more cutting time, which was nice. There was no need for any of the in-between turnovers I’ve gone through on other systems, just to prepare the movie for screenings. I like the robust third-party approach with Final Cut. It’s a small, tight-knit community. You can actually get in touch with a developer without going through a large corporation. I’d like to see Apple improve some features, like better match-back. I feel they’ve only scratched the surface with roles, so I’d like to see them develop that more.”

He concludes, “A lot of directors would like to cut for themselves, but find a tool like Avid impenetrable. It doesn’t have to be that way. My 12-year-old daughter is perfectly comfortable with Final Cut Pro X. Many of the current workflows stem from what was built up around film and we no longer work that way. Why adhere to the old film methods and rules? Filmmakers who are using new methods are those that aren’t satisfied with the status quo. They are willing to push the boundaries.”

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2016 Oliver Peters

Hail, Caesar!

df0916_hailcaesar_1_sm

Combine kidnapping, mystery, farce, and a good measure of quirkiness, and you’ve defined the quintessential Coen Brothers script. Complete with a cast of Coen alums, Hail, Caesar! is just such a film. Joel and Ethan Coen’s latest is set in the motion picture factory town of Hollywood in the 1950s. Eddie Mannix (Josh Brolin) is a studio fixer tasked with finding Baird Whitlock (George Clooney), one of the studio’s biggest money-makers. Whitlock has been kidnapped in the middle of production of a Bible epic by a group called “The Future”. Of course, that’s not Mannix’s only dilemma, as he has other studio problems to deal with, such as disgruntled director Laurence Laurentz (Ralph Fiennes) and personal issues involving starlet DeeAnna Moran (Scarlett Johansson).

The Hail, Caesar! story idea had been kicking around for over a decade before the Coens finally brought it into production. Along with being a concept that fits right into their wheelhouse, it’s also a complex production. In this story about the Golden Age of Hollywood, much of the film involves movies within the movie. The tale weaves in and out of multiple productions being filmed on the fictional Capitol Pictures lot.

In keeping with the texture of films of that era, Hail, Caesar! was shot on film by long-time Coen director of photography Roger Deakins (True Grit, No Country for Old Men, The Ladykillers). Deakins’ first choice might have been the ARRI ALEXA, but he agreed that film was the appropriate solution and so shot with an ARRI 535-B to Kodak Vision3 negative stock. Fotokem handled development, with EFILM covering telecine transfer, finishing, and digital intermediate color correction.

Time for a fresh change

Although they are lovers of the film image, Joel and Ethan Coen were also among the first to embrace Apple Final Cut Pro in their transition to digital editing for the film Intolerable Cruelty. They had been using Final Cut Pro up until Inside Llewyn Davis; however, it had become sufficiently “long in the tooth” that it was time for a change. This brought them to Adobe Premiere Pro CC. I recently interviewed Katie McQuerrey about this shift. She is credited as an additional or associate editor of numerous Coen films (Inside Llewyn Davis, True Grit, Burn After Reading) – a role which she describes as being Joel and Ethan’s right-hand person in the cutting room. For Hail, Caesar!, this included interfacing with Adobe and handling the general workflow so that Premiere Pro was a functional editing tool for the filmmakers.

McQuerrey explains, “After Apple stopped supporting Final Cut Pro 7 we knew it was time to change. We looked at Final Cut Pro X, but because of its lack of audio editing functions, we knew that it wasn’t right for us. So, we decided to give Premiere Pro a try. David Fincher had a successful experience with Gone Girl and we knew that Walter Murch, who is a friend of Joel and Ethan’s, was using it on his next film. I’ve edited on Avid, Final Cut, and now Premiere Pro and they all make you adjust your editing style to adapt to the software. Joel and Ethan had only ever edited digitally on Final Cut Pro, so Premiere Pro provided the easiest transition. [Avid] Media Composer is very robust for the assistant editor, but a bit restrictive for the editor. I’m on an Avid job right now after a year away from it and miss some of the flexibility that Premiere Pro offers. You really come to appreciate how fluid it is to edit with. I think both Final Cut Pro 7 and Premiere Pro are better for the editor, but they do add a bit more stress on the assistants. Of course, Joel and Ethan were generally shielded from that.”

One of the unknowns with Premiere Pro was the fact that Hail, Caesar! was being shot on film. Avid has tried-and-true methods for tracking film keycode, but that was never part of Premiere Pro’s architecture. Assistant editor David Smith explains, “EFILM scanned all of the negative at 2K resolution to ProRes for our cutting purposes. On an Avid job, they would have provided a corresponding ALE (Avid Log Exchange list) for the footage and you would be able to track keycode and timecode for the dailies. For this film, EFILM sync’ed the dailies and provided us with the media, as well as a Premiere Pro project file for each day. We were concerned about tracking keycode to turn over a cut list at the end of the job. Adobe even wrote us a build that included a metadata column for keycode. EFILM tracks their transfers internally, so their software would reference timecode back to the keycode in order to pull selects for the final scan and conform. At their suggestion, we used Change List software from Intelligent Assistance to provide a cut list, plus a standard EDL generated from Premiere Pro. In the end, the process wasn’t that much different after all.” EFILM scanned the selected negative clips at 4K resolution and the digital intermediate color correction was handled by Mitch Paulson under Roger Deakins’ supervision.

Adapting Premiere Pro to the Coen Brothers workflow

It was Katie McQuerrey’s job to test drive Premiere Pro ahead of the Coens and provide assistance as needed to get them up to speed. She says, “Joel was actually up to speed after a day or so. Initially we all wanted to make Premiere Pro work just like Final Cut, because it appears similar. Of course, many functions are quite different, but the longer we worked with it, the more we got used to some of the Premiere Pro ways of doing things. As functionality issues came up, Adobe would make adjustments and send new software builds. I would test these out first. When I thought they would be ready for Joel and Ethan to use, we’d install it on their machines. I needed to let them concentrate on the edit and not worry about software.”

Joel and Ethan Coen developed a style of working that stems from their film editing days and that carried over into their use of Final Cut Pro. This was adjusted for Premiere Pro. McQuerrey continues, “Ethan and Joel work on different computers. Ethan will pick selected takes and mark ins and outs. Then he saves the project and dings a bell. Joel opens that project up to use as he assembles scenes. With FCP you could have multiple projects open at once, but not so with Premiere. We found out from Adobe that the way to handle this was through the Media Browser module inside of Premiere. Joel could browse the drive for Ethan’s project and then access it for specific sequences or selected shots. Joel could import these through Media Browser into his project as a non-destructive copy, letting Ethan continue on. Media Browser is the key to working collaboratively among several editors on the same project.” Their edit system consisted of several Mac Pro “tube” models connected to Open Drives shared storage. This solution was developed by workflow engineer Jeff Brue for Gone Girl and is based on using solid state drives, which enable fast media access.

As with all films, Hail, Caesar! posed creative challenges that any application must be able to deal with. McQuerrey explains, “Unlike other directors, Joel and Ethan wait until all the shooting is done before anything is cut. I wasn’t cutting along with dailies as is the case with most other directors. This gave me time to get comfortable with Premiere and to organize the footage. Because the story includes movies within the movie, there are different aspect ratios, different film looks, and both color and black-and-white material. Editorially it was an exciting project because of this. For example, if a scene in the film was being ‘filmed’ by the on-camera crew, it was in color and should appear to play out in real-time as you see the take being filmed. This same sequence might also appear later in a Moviola viewer, as black-and-white, edited film. This affected how sequences were cut. Some shots that were supposed to be real-time needed to look like one continuous take. Or someone in the film may be watching a rough cut, therefore that part had to be cut like a rough cut. This is a film that I think editors will like, because there are a lot of inside jokes they’ll appreciate.”

Fine tuning for the feature film world

One criticism of Adobe Premiere Pro CC has been how it handles large project files, particularly when it comes to load times. McQuerrey answers, “The Open Drives system definitely helped with that. We had to split the film up into separate projects, for cuts, sound, visual effects, music, etc. in order to work efficiently. However, as we got later into post we found that even the smaller projects had grown to the size that load times got much slower. The remedy was to cull out old versions of sequences, so that these didn’t require indexing each time the project was opened. Periodically I would create archive projects to keep the oldest sequences and then delete most of the oldest sequences from the active project. This improved performance.”

The filmmaking team finished Hail, Caesar! with a lot of things they liked about their new software choice. McQuerrey says, “Joel likes some of the effects features in Premiere Pro to build transitions and temp comps. This film has more visual effects than a usual Coen Brothers film, including green screens, split screens, and time remaps. Many of the comps were done in Premiere, rather than After Effects. Ethan and Joel work differently. Ethan would leave his bins in list view and do his mark-ups. Joel, on the other hand, really liked the icon view and hover scrubbing. Temp sound editing while you are picture editing is very critical to their process. They’ll often use different takes or readings for the audio than for the picture, so how an application edits sound is as important – if not more so – than how it edits picture. We had a couple of bumps in the road getting the audio track interface working to our liking, but with Adobe’s help in building new versions of the software for us, we got to the place where we really appreciated Premiere’s sound tools.”

Katie McQuerrey and I wrapped up the interview with an anecdote about the Coens’ unique approach to their new editing tool. McQuerrey explains, “With any application, there are a number of repetitive keystrokes. At one point Joel joked about using a foot pedal, like on an old upright Moviola. At first we laughed it off, but then I checked around and found that you could buy custom control devices for video game play, including special mice and even foot controls. So we ordered a foot pedal and hooked it up to the computer. It came with its own software that let us map command functions to the pedal. We did this with Premiere’s snapping control, because Joel constantly toggles it on and off!” It’s ironic, given the context of the Hail, Caesar! story, but here you have something straight out of the Golden Age of film that’s found itself useful in the digital age.

Click here for Adobe’s behind-the-scenes look.

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2016 Oliver Peters

Vinyl

df0816_VINYL_1_sm

The decade of the 1970s was the heyday of the rock music business when a hit record nearly made you a king. It was the time right after Woodstock. Top bands like Led Zeppelin and the Rolling Stones commanded huge stadium shows. The legendary excesses of the music industry are most often encapsulated as “sex, drugs, and rock-n-roll”. Now the New York music and record company scene of that era has been brought to the screen in the new HBO series, Vinyl. The series was created by Mick Jagger & Martin Scorsese & Rich Cohen and Terence Winter.

Vinyl is told largely through the eyes of Richie Finestra (played by Bobby Cannavale, Daddy’s Home, Ant-Man), the founder and president of the fictional American Century Records. He’s a rags-to-riches guy with a gift for discovering music acts. In the pilot episode, the company is about to be sold to Polygram, but a series of events changes the course of Finestra’s future, which sets up the basis for the series. It’s New York in the 70s at the birth of hip-hop, disco, and punk rock with a lot of cultural changes going on as the backdrop. The series features an eclectic cast, including James Jagger (Mick’s son) as the lead singer of a raw, New York punk band, The Nasty Bits.

The pilot teleplay was written by Terence Winter and George Mastras, based on a story developed by Cohen, Scorsese, Jagger, and Winter. This feature-length series kick-off was directed by Scorsese with Rodrigo Prieto (A Midsummer Night’s Dream, The Wolf of Wall Street) as director of photography. Martin Scorsese is certainly no stranger to the music industry with projects like Woodstock, The Last Waltz, The Blues, and Shine A Light to his credit. Coupled with his innate ability to tell entertaining stories about the underbelly of life in New York, Vinyl makes for an interesting stew. The pilot was a year in production and post and sets the tone for the rest of the series, which will be directed by seven other directors. This is the same model as with Boardwalk Empire. Scorsese and Jagger are part of the team of executive producers, with Winter as the show runner.

Producing a pilot like a feature film

I recently interviewed David Tedeschi (George Harrison: Living in the Material World, Public Speaking), editor for the pilot episode of Vinyl. Kate Sanford and Tim Streeto are the editors for the series. Tedeschi has edited both documentary and narrative films prior to working with Martin Scorsese, for whom he’s edited a number of documentaries, such as No Direction Home and Shine A Light. But the Vinyl pilot is his first narrative project with Scorsese. Tedeschi explains, “The concept started out as an idea for a feature film. It landed at HBO, who was willing to green-light it as a full series. We were able to treat the pilot like a feature and had the luxury of being able to spend nearly a year in post, with some breaks in between.”

Even though Scorsese’s Sikelia Productions approached it like a feature, the editorial staff was small, consisting mainly of Tedeschi and one associate editor, Alan Lowe. Tedeschi talks about the post workflow, “The film was shot digitally with a Sony F55, but Scorsese and Prieto wanted to evoke a 16mm film look to be in keeping with the era. Deluxe handled the dailies – adding a film look emulation that included grain. They provided us with Avid DNxHD 115 media. Since most scenes were shot with two or three cameras, Alan would sync the audio by slate and then create multicam clips for me, before I’d start to edit. We were working on two Avid Media Composers connected to Avid ISIS shared storage. For viewing, we installed 50” Panasonic plasma displays that were calibrated by Deluxe. The final conform and color correction was handled by Deluxe with Steve Bodner as colorist.”

He continues, “Scorsese had really choreographed the scenes precisely, with extensive notes. In the dailies process we would review every scene, and he would map out selects and then we’d work through it. In spite of being very specific about how he’d planned out a scene, he would often revisit a scene and look at other options in order to improve it. He was very open minded to new ways of looking at the material. Overall, it was a pretty tight script and edit. The first director’s cut was a little over two hours and the final came in at one hour and fifty-two minutes plus end credits.”

Story and structure

The pilot episode of Vinyl moves back and forth through a timeline of Finestra’s life and punctuates moments with interstitial elements, such as a guitar cameo by a fictionalized Bo Diddley. It’s easy to think these are constructs devised during editing, but Tedeschi says no. He explains, “I would love to take credit for that, but moving back and forth through eras was how the script was written. The interstitial elements weren’t in the script, but were Marty’s idea. He found extra time in the shooting schedule to film those and they worked beautifully in the edit.”

Many film editors have very specific ways they like to set up their bins in order to best sort and organize selected footage. Tedeschi’s approach is more streamlined. He explains, “My method is usually pretty simple. I don’t do special things in the bins. I will usually assemble a sequence of selected dailies for each scene. Then I’ll mark it up with markers and sometimes may color-code a few clips. On Vinyl, Alan would do the initial pass to composite some of the visual effects, like green screen window composites. He also handled a lot of the sound design for me.”

Vinyl is very detailed in how actual events, bands, people, and elements of the culture are represented and integrated into the story – although, in a fictionalized way. It’s a historical snapshot of New York in the 70s and the culture of that time. Little elements like The King Biscuit Flower Hour (a popular radio show on progressive rock radio stations back then) playing on a radio or a movie marquee for Deep Throat easily pinpoint the time and place. Anyone who’s seen the Led Zeppelin concert documentary, The Song Remains the Same, will remember one of the Madison Square Garden backstage scenes with an angry and colorful Peter Grant (Led Zeppelin’s manager). His persona and a similar event also made it into the story, but modified to be integral to the plot.

Accuracy is very important to Scorsese. Tedeschi says, “We have done documentaries about music and some of these people are part of our lives. We would all hear stories about some pretty over-the-top things, so a lot of this comes directly from their memories. The biggest challenge was to be faithful to New York in 1973. It’s become this mythical place, but in Vinyl that’s the New York of Scorsese’s memory. We’ve certainly altered many actual facts, but even the most outrageous events that happen in the pilot and the series are rooted in true, historical events. We even reviewed historical footage. There was a very methodical approach.” Aside from the entertaining elements, it’s also a pretty solid story about how record companies actually operate. He adds, “We had a screening towards the end of the editing process for the consultants, who had all worked in the record business. I knew we had done well, because they immediately launched into a lively discussion about contracts and industry standards and what names had been changed.”

This is a story about music and the music itself is a driving influence. Tedeschi concludes, “There is almost constant source music in the background. Scorsese went through each scene and we painstakingly auditioned many songs. One thing folks might not realize is that we sourced all of the recorded music that was used in their original formats. If a hit song was originally released as a 45 RPM record or an LP, then we’d track down a copy and try to use that. A few songs even came from 78 RPM records. We found a place that could handle high-quality transfers from such media and provide us with a digital file, which we used in the final mix. Often, a song may have been remastered, but we would compare our transfer with the remaster. The objective was to be faithful to the original sound – the way people heard it when it was released. After all, the series is called Vinyl for a reason. This was the director’s vision and how he remembered it.”

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2016 Oliver Peters

Producing a Short Mini-Doc with the AJA CION

AJA surprised the industry in 2014 when it rolled out its CION digital cinema 4K camera. Although not known as a camera manufacturer, it had been working on this product for over four years. Last year the company offered its Try CION promotion (ended in October), which loaned camera systems to qualified filmmakers. Even though this promotion is over, potential customers with a serious interest can still get extended demos of the camera through their regional AJA sales personnel. It was in this vein that I arranged a two-week loan of a camera unit for this review.

I’m a post guy and don’t typically write camera reviews; however, I’m no stranger to cameras either. I’ve spent a lot of time “shading” cameras (before that position was called a DIT) and have taken my turn as a studio and field camera operator. My interest in doing this review was to test the process. How easy was it to use the camera in actual production and how easy was the post workflow associated with it?

CION details

The AJA CION is a 4K digital camera that employs an APS-C CMOS sensor with a global shutter and both infrared-cut and optical low-pass filters. It can shoot in various frame sizes (from 1920×1080 up to 4096×2160) and frame rates (from 23.98 up to 120fps). Sensor scaling rather than windowing/cropping is used, which means a given lens produces the same framing in 4K as in 2K or HD. In other words, a 50mm lens yields the same optical framing at all recording sizes.

The CION records in Apple ProRes (up to ProRes 4444) using a built-in Pak media recorder. Think of this as essentially an AJA KiPro built right into the camera. Since Pak media cards aren’t FAT32-formatted like the CF or SD cards used by other cameras, you don’t run into a 4GB file-size limit that would cause clip-spanning. You can also record AJA Raw externally (such as to an AJA KiPro Quad) over 3G-SDI or Thunderbolt. Video is linear without any log encoding schemes, but there are a number of gamma profiles and color correction presets.
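To put that FAT32 limit in perspective, here's a rough sketch of how quickly a conventionally formatted card would force a clip to span into multiple files. The bitrates are approximate 1080p/23.98 ProRes figures assumed for illustration, not values from any spec sheet.

```python
# Back-of-the-envelope look at why FAT32's 4 GB file cap forces clip-spanning.
# The bitrates below are approximate 1080p/23.98 ProRes figures, assumed
# for illustration only.

FAT32_LIMIT_GB = 4

approx_bitrates_mbps = {
    "ProRes 422": 117,
    "ProRes 422 HQ": 176,
    "ProRes 4444": 264,
}

def minutes_until_span(bitrate_mbps, limit_gb=FAT32_LIMIT_GB):
    """Minutes of recording that fit in a single file before it must span."""
    limit_megabits = limit_gb * 8 * 1000  # decimal GB -> megabits
    return limit_megabits / bitrate_mbps / 60

for codec, mbps in approx_bitrates_mbps.items():
    print(f"{codec}: one file spans after ~{minutes_until_span(mbps):.1f} min")
```

At these rates a spanning-limited card would break a take into a new file every two to five minutes, which is why avoiding the limit matters for long interviews or performances.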

It is designed as an open camera system, using standard connectors for HDMI, BNC, XLR, batteries, lens mounts, and accessories. The CION uses a PL lens mount, because it’s the most open standard and the one for which the best glass is available. When the AJA rep sent me the camera, it came ready to shoot and included a basic camera configuration, plus accessories, including some rods, an Ikan D5w monitor, a Zeiss Compact Prime 28mm lens, 512GB and 256GB solid-state Pak media cards, and a Pak media dock/reader. The only items not included – other than tripod, quick-release base plate, and head, of course – were camera batteries. The camera comes with a standard battery plate, as well as an AC power supply.

Learning the CION

The subject of this mini-doc was a friend of mine, Peter Taylor. He’s a talented luthier who builds and repairs electric and acoustic guitars and basses under his Chellee brand. He also designs and produces a custom line of electric guitar pedals. To pull this off, I partnered with the Valencia College Film Production Technology Program, with whom I’ve edited a number of professional feature films and where I teach an annual editing workshop. I worked with Ray Bracero, a budding DP and graduate of that program who helps there as an instructional assistant. This gave me the rest of the package I needed for the production, including more lenses, a B-camera for the interview, lighting, and sound gear.

Our production schedule was limited with only one day for the interview and B-roll shots in the shop. To augment this material, I added a second day of production with my son, Chris Peters, playing an original track that he composed as an underscore for the interview. Chris is an accomplished session musician and instructor who plays Chellee guitars.

With the stage set, this provided about half a day for Ray and me to get familiar with the CION, plus two days of actual production, all within the same week. If AJA was correct in designing an easy-to-use cinematic camera, then this would be a pretty good test of that concept. Ray had never run a CION before, but was familiar with REDs, Canons, and other camera brands. Picking up the basic CION operation was simple. The menu is easier to navigate than on other cameras. It uses the same structure as a KiPro, but there’s also an optional remote set-up, if you want a wireless connection to the CION from a laptop.

4K wasn’t warranted for this project, so everything was recorded in 2K (2048×1080) to be used in an HD 2.35:1 sequence (1920×817). This would give me some room to reframe in post. All sync sound shots would be 23.98fps and all B-roll would be in slow motion. The camera permits “overcranking”, meaning we shot at 59.94fps for playback at 23.98fps. The camera can go up to 120fps, but only when recording externally in AJA Raw. To keep it simple on this job, all recording was internal to the Pak media card – ProRes HQ for the sync footage and ProRes 422 for the slow motion shots.
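The overcranking arithmetic is simple enough to sketch: capture at a higher rate, conform to the project rate, and motion slows by the ratio of the two. The function names here are purely illustrative.

```python
# Overcranking math: capture at a high frame rate, conform to the project
# rate, and motion slows by the ratio of the two rates.

def slowmo_factor(capture_fps: float, playback_fps: float) -> float:
    """How many times slower the action plays back."""
    return capture_fps / playback_fps

def conformed_duration(recorded_seconds: float, capture_fps: float,
                       playback_fps: float) -> float:
    """Screen time of a take once conformed to the playback rate."""
    return recorded_seconds * slowmo_factor(capture_fps, playback_fps)

print(f"59.94 over 23.98 -> {slowmo_factor(59.94, 23.98):.1f}x slow motion")
print(f"A 10-second take fills {conformed_duration(10, 59.94, 23.98):.0f} s of screen time")
```

So shooting 59.94-over-23.98 yields roughly 2.5x slow motion, and the camera's 120fps external mode would push that to about 5x.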

Production day(s)

The CION is largely a “what you see is what you get” camera. Don’t plan on extensive correction in post. What you see on the monitor is typically what you’ll get, so light and control your production set-up accordingly. It doesn’t have as wide a dynamic range as an ARRI ALEXA, for example. The bottom EI (exposure index) is 320 and that’s pretty much the sweet spot where you want to operate. This is similar to the original RED One. This means that in bright exteriors, you’ll need filtering to knock down the light. There’s also not much benefit in running with a high EI. The ALEXA, for instance, looks great at 800, but that setting didn’t seem to help the CION.

Gamma profiles and color temperature settings didn’t behave like I would have expected from other cameras. With our lighting, I would have expected a white balance of 3200K, however 4500K looked right to the eye and was, in fact, correct in post. The various gamma profiles didn’t help with clipping in the same way as Log-C does, so we ultimately stayed with Normal/Expanded. This shifts the midrange down to give you some protection for highlights. Unfortunately with the CION, when highlights are clipped or blacks are crushed, that is how the signal is actually recorded and those areas are not recoverable. The camera’s low end is very clean and there’s a meaty midrange. We discovered that you cannot monitor the video over SDI while recording 59.94-over-23.98 (slow motion). Fortunately HDMI does maintain a signal. All was good again once we switched to the HDMI connection.

CION features a number of color correction presets. For Day 1 in the luthier shop, I used the Skin Tones preset. This is a normal color balance, which slightly desaturates the red-orange range, thus yielding more natural flesh tones. On Day 2 for the guitar performance, I switched to the Normal color correction preset. The guitar being played has a red sunburst paint finish and the Skin Tones preset pulled too much of the vibrance out of the guitar. Normal more closely represented what it actually looked like.

During the actual production, Ray used three Zeiss Super Speed Primes (35mm, 50mm, and 85mm) on the CION, plus a zoom on the Canon 5D B-camera. Since the locations were tight, he used an ARRI 650w light with diffusion for a key and bounced a second ARRI 150w light as the back light. The CION permits two channels of high-quality audio input (selectable line, mic, or +48v). I opted to wire straight into the camera, instead of using an external sound recorder. Lav and shotgun mics were directly connected to each channel for the interview. For the guitar performance, the amp was live-mic’ed into an Apogee audio interface (part of Chris’ recording system) and the output of that was patched into the CION at line level.

The real-time interview and performance material was recorded with the CION mounted on a tripod, but all slow motion B-roll shots were handheld. Since the camera had been rigged with a baseplate and rods, Ray opted to use the camera in that configuration instead of taking advantage of the nice shoulder pad on the CION. This gave him an easy grasp of the camera for “Dutch angles” and close working proximity to the subject. Although the rig was a bit cumbersome, the light weight of the CION made such quick changes possible.

Post production

As an editor, I want a camera to make life easy in post, which brought me to Apple Final Cut Pro X for the finished video. Native ProRes, easy syncing of two-camera interviews, and simple-yet-powerful color correction make FCPX a no-brainer. We recorded a little over three hours of material – 146 minutes on the CION, 37 minutes on the 5D, and 11 minutes on a C500 (for two pick-up shots). All of the CION footage consumed only about 50% of the single 512GB Pak media card. Using the Pak media dock, transfer times were fast. While Pak media isn’t cheap, the cards are very robust and unless you are chewing through tons of 4K, you actually get a decent amount of recording time on them.
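As a sanity check on that 50% figure, here's a back-of-the-envelope estimate. The 60/86-minute split between sync and slow-motion material and the bitrates are my assumptions for illustration, not measured figures; ProRes data rates scale with frame rate, so the 59.94fps 422 material runs roughly 2.5x the 23.98fps rate.

```python
# Rough card-budget estimate for the shoot described above. Durations and
# bitrates are assumptions for illustration, not measured values.

def gigabytes_used(minutes: float, bitrate_mbps: float) -> float:
    """Approximate decimal gigabytes consumed at a given average bitrate."""
    return minutes * 60 * bitrate_mbps / 8 / 1000

card_gb = 512
footage = [
    (60, 176),  # assumed ~60 min of sync material, 2K ProRes HQ (~176 Mbps)
    (86, 294),  # assumed ~86 min of 59.94fps B-roll, 2K ProRes 422 (~294 Mbps)
]

total_gb = sum(gigabytes_used(mins, mbps) for mins, mbps in footage)
print(f"~{total_gb:.0f} GB used, about {100 * total_gb / card_gb:.0f}% of a {card_gb} GB card")
```

Under those assumptions the total lands in the neighborhood of half the card, consistent with what we saw in practice.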

I only applied a minor amount of color correction on the CION footage. This was primarily to bring up the midrange due to the Normal/Expanded gamma profile, which naturally makes the recorded shot darker. The footage is very malleable without introducing the type of grain-like sensor noise artifacts that I see with other cameras using a similar amount of correction. Blacks stay true black and clean. Although my intention was not to match the 5D to the CION – I had planned on some stylized correction instead – in the end I matched it anyway, since I only used two shots. Surprisingly, I was able to get a successful match.

Final thoughts

The CION achieved the design goals AJA set for it. It is easy to use, ergonomic, and delivers a good image with a minimum of fuss. As with any camera, there are a few items I’d change. For example, the front monitoring connectors are too close to the handle. Occasionally you have to press record twice to make sure you are really recording. There’s venting on the top, which would seem to be an issue if you were suddenly caught in the rain. Overall, I was very happy with the results, but I think AJA still needs to tweak the color science a bit more.

In conjunction with FCPX for post, this camera/NLE combo rivals ARRI’s ALEXA and AMIRA for post production ease and efficiency. No transcoding. No performance hits due to taxing, native, long-GOP media. Proper file names and timecode. A truly professional set-up. At a starting point of $4,995, the AJA CION is a dynamite camera for the serious producer or filmmaker. The image is good and the workflow outstanding.

Click this link to see the final video on Vimeo.

Originally written for Digital Video magazine / Creative Planet Network

©2016 Oliver Peters

Carol

Films tend to push social boundaries and one such film this season is Carol, starring Cate Blanchett, Rooney Mara, and Kyle Chandler. It’s a love story between two women, but more importantly it’s a love story between two people. The story is based on the novel The Price of Salt by Patricia Highsmith, who also penned The Talented Mr. Ripley and Strangers on a Train. Todd Haynes (Six by Sondheim, Mildred Pierce) directed the film adaptation. Carol was originally produced in 2014 and finished in early 2015, but The Weinstein Company opted to time the release around the start of the 2015 awards season.

Affonso Gonçalves (Beasts of the Southern Wild, Winter’s Bone), the editor on Carol, explains, “Carol is a love story about two women coming to terms with the dissatisfaction of their lives. The Carol character (Cate Blanchett) is unhappily married, but loves her child. Carol has had other lesbian affairs before, but is intrigued by this new person, Therese (Rooney Mara), whom she encounters in a department store. Therese doesn’t know what she wants, but through the course of the film, learns who she is.”

Gonçalves and Haynes worked together on the HBO mini-series Mildred Pierce. Gonçalves says, “We got along well and when he got involved with the production, he passed along the script to me and I loved it.” Carol was shot entirely on Super 16mm film negative, primarily as a single-camera production. Only about five percent of the production included A and B cameras. Ed Lachman (Dark Blood, Stryker, Selena) served as the cinematographer. The film negative was scanned in log color space and a simple log-to-linear LUT (color look-up table) was applied when the Avid DNxHD36 editorial files were generated, giving the cutting room nice-looking working media.
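The log-to-linear step described here amounts to running every pixel through a 1D transfer curve per channel when the proxies are rendered. A minimal sketch of the idea in Python — the curve values and function name are illustrative, not the actual LUT Goldcrest supplied:

```python
import numpy as np

def apply_1d_lut(image, lut):
    """Map each channel value through a 1D LUT using linear interpolation.

    image: float array in [0, 1] (log-encoded scan values)
    lut:   1D float array of output values, sampled at evenly spaced inputs
    """
    positions = np.linspace(0.0, 1.0, len(lut))
    return np.interp(image, positions, lut)

# Illustrative log-to-linear curve: darkens mids, preserves black and white.
# A real lab LUT would be supplied by the facility, not hand-built like this.
lut = np.linspace(0.0, 1.0, 1024) ** 2.2

log_frame = np.random.rand(4, 4, 3)       # stand-in for a log-encoded frame
lin_frame = apply_1d_lut(log_frame, lut)  # "nice-looking" editorial proxy
```

Because the LUT is baked into the editorial files once, every downstream system — Media Composer, screeners, temp mixes — sees the same normalized image without any per-clip color management.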

Creating a timeless New York story

Cincinnati served as the principal location, doubling for New York City in the early 1950s. The surrounding area also doubled for Iowa and Pennsylvania during a traveling portion of the film. Gonçalves discussed how Haynes and he worked during this period. “The production shot in Cincinnati, but I was based at Goldcrest Films in New York. The negative was shipped to New York each day, where it was processed and scanned. Then I would get Avid editorial files. The cutting room was set up with Avid Media Composer and ISIS systems and my first assistant Perri [Pivovar] had the added responsibility on this project of checking for film defects. Ed would also review footage each day; however, Todd doesn’t like to watch dailies during a production. He would rely on me instead to be his eyes and ears to make sure that the coverage that he needed was there.”

He continues, “After the production wrapped, I completed my editor’s cut, while Todd took a break. Then he spent two weeks reviewing all the dailies and making his own detailed notes. Then, when he was ready, he joined me in the cutting room and we built the film according to his cut. Once we had these two versions – his and mine – we compared the two. They were actually very similar, because we both have a similar taste. I had started in May and by September the cut was largely locked. Most of the experimenting came with structure and music.”

The main editorial challenges were getting the right structure for the story and tone for the performances. According to Gonçalves, “Cate’s and Rooney’s performances are very detailed and I felt the need to slow the cutting pace down to let you appreciate that performance. Rooney’s is so delicate. Plus, it’s a love story and we needed to keep the audience engaged. We weren’t as concerned with trimming, but rather, to get the story right. The first cut was two-and-a-half hours and the finished length ended up at 118 minutes. Some scenes were cut out that involved additional characters in the story. Todd isn’t too precious about losing scenes and this allowed us to keep the story focused on our central characters.”

“The main challenge was the party scene at the end. The story structure is similar to Brief Encounter (the David Lean classic with the beginning and ending set in the same location). Initially we had two levels of flashbacks, but there was too much of a shift back and forth. We had a number of ‘friends and family’ screenings and it was during these that we discovered the issues with the flashbacks. Ultimately we decided to rework the ending and simplify the temporal order of the last scene. The film was largely locked by the sixth or seventh cut.”

As a period piece, music is very integral to Carol. Gonçalves explains, “We started with about 300 to 400 songs that Todd liked, plus old soundtracks. These included a lot of singers of the time, like Billie Holiday. I also added ambiences for restaurants and bars. Carter [Burwell, the composer] saw our cut at around the second or third screening with our temp score. After that he started sending preliminary themes for us to work into the cut. These really elevated the tone of the film. He’d come in every couple of weeks to see how his score was working out with the cut, so it became a very collaborative process.”

The editing application that an editor uses is an extension of how he works. Some have very elaborate routines for preparing bins and sequences and others take a simpler approach. Gonçalves fits into the latter group. He says, “Avid is like sitting down and driving a car for me. It’s all so smooth and so fast. It’s easy to find things and I like the color correction and audio tools. I started working more with sound in the Avid on True Detective and its tools really help me to dress things up. I don’t use any special organizing routines in the bins. I simply highlight the director’s preferred takes; however, I do use locators and take a lot of handwritten notes.”

Film sensibilities in the modern digital era

Carol was literally the last film to be processed at Deluxe New York before the lab was shut down. In addition to a digital release, Technicolor also did a laser “film-out” to 35mm for a few release prints. All digital post-production was handled by Goldcrest Films, who scanned the Super 16mm negative on an ARRI laser scanner at 3K resolution for a 2K digital master. Goldcrest’s Boon Shin Ng handled the scanning and conforming of the film. Creating the evocative look of Carol fell to New York colorist John J. Dowdell III. Trained in photography before becoming a colorist in 1980, Dowdell has credits on over 200 theatrical and television films.

Unlike on many films, Dowdell was involved early in the overall process. He explains, “Early on, I had a long meeting with Todd and Ed about the look of the film. Todd had put together a book of photographs and tear sheets that helped with the colors and fashions from the 1950s. While doing the color grading job, we’d often refer back to that book to establish the color palette for the film.” Carol has approximately 100 visual effects shots to help make Cincinnati look like New York, circa 1952-53. Dowdell continues, “Boon coordinated effects with Chris Haney, the visual effects producer. The ARRI scanner is pin-registered, which is essential for the work of the visual effects artists. We’d send them both log and color corrected files. They’d use the color corrected files to create a reference preview LUT for their own use, but then send us back finished effects in log color space. These were integrated back into the film.”

Dowdell’s tool of choice is the Quantel Pablo Rio system, which incorporates color grading tools that match his photographic sensibilities. He says, “I tend not to rely as much on the standard lift/gamma/gain color wheels. That’s a video approach. Quantel includes a film curve, which I use a lot. It’s like an s-curve tool, but with a pivot point. I also use master density and RGB printer light controls. These are numeric and let you control the color very precisely, but also repeatably. That’s important as I was going through options with Todd and Ed. You could get back to an earlier setting. That’s much harder to do precisely with color wheels and trackball controls.”
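Printer lights come from optical film printing: exposure is stepped in fixed per-channel increments, so an entire grade reduces to three integers that can be written down and reapplied exactly. A rough sketch of that idea, assuming the common convention of one printer point equaling roughly 0.025 log exposure — the constants and function name are illustrative, not Quantel's actual implementation:

```python
import numpy as np

POINT_LOG_E = 0.025  # common convention: one printer point ~ 0.025 log exposure

def apply_printer_lights(log_image, r_pts, g_pts, b_pts, neutral=25):
    """Offset a log-encoded image by whole printer points per channel.

    Because the grade is just three integers relative to a neutral
    setting, it can be noted down and reapplied with no drift -- the
    repeatability Dowdell describes, unlike freehand trackball moves.
    """
    offsets = (np.array([r_pts, g_pts, b_pts]) - neutral) * POINT_LOG_E
    return log_image + offsets  # broadcast across the last (RGB) axis

frame = np.full((2, 2, 3), 0.5)                   # stand-in log-encoded frame
warmer = apply_printer_lights(frame, 27, 25, 24)  # +2 red, -1 blue
```

The same three numbers always produce the same grade, which is why such numeric controls make it easy to return to an earlier setting during review sessions.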

The Quantel Pablo Rio is a complete editing and effects system as well, integrating the full power of Quantel’s legendary Paintbox. This permitted John Dowdell and Boon Shin Ng to handle some effects work within the grading suite. Dowdell continues, “With the paint and tracking functions, I could do a lot of retouching. For example, some modern elements, like newer style parking meters, were tracked, darkened and blurred, so that they didn’t draw attention. We removed some modern signs and also did digital clean-up, like painting out negative dirt that made it through the scan. Quantel does beautiful blow-ups, which was perfect for the minor reframing that we did on this film.”

The color grading toolset is often a Swiss Army Knife for the filmmaker, but in the end, it’s about the color. Dowdell concludes, “Todd and Ed worked a lot to evoke moods. In the opening department store scene, there’s a definite green cast that was added to let the audience feel that this is an unhappy time. As the story progresses, colors become more intense and alive toward the end of the film. We worked very intuitively to achieve the result and care was applied to each and every shot. We are all very proud of it. Of all the films I’ve color corrected, I feel that this is really my masterpiece.”

Originally written for Digital Video magazine / Creative Planet Network

©2016 Oliver Peters