Carol

Films tend to push social boundaries and one such film this season is Carol, starring Cate Blanchett, Rooney Mara and Kyle Chandler. It’s a love story between two women, but more importantly it’s a love story between two people. The story is based on the novel The Price of Salt by Patricia Highsmith, who also penned The Talented Mr. Ripley and Strangers on a Train. Todd Haynes (Six by Sondheim, Mildred Pierce) directed the film adaptation. Carol was originally produced in 2014 and finished in early 2015, but The Weinstein Company opted to time the release around the start of the 2015 awards season.

Affonso Gonçalves (Beasts of the Southern Wild, Winter’s Bone), the editor on Carol, explains, “Carol is a love story about two women coming to terms with the dissatisfaction of their lives. The Carol character (Cate Blanchett) is unhappily married, but loves her child. Carol has had other lesbian affairs before, but is intrigued by this new person, Therese (Rooney Mara), whom she encounters in a department store. Therese doesn’t know what she wants, but through the course of the film, learns who she is.”

Gonçalves and Haynes worked together on the HBO mini-series Mildred Pierce. Gonçalves says, “We got along well and when he got involved with the production, he passed along the script to me and I loved it.” Carol was shot entirely on Super 16mm film negative, primarily as a single-camera production. Only about five percent of the production included A and B cameras. Ed Lachman (Dark Blood, Stryker, Selena) served as the cinematographer. The film negative was scanned in log color space and a simple log-to-linear LUT (color look-up table) was applied to the Avid DNxHD36 editorial media, so that the working files looked good in the cutting room.
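
To make that last step a bit more concrete – purely as an illustration, since the actual LUT used on Carol isn’t published – here is a minimal Python sketch of a generic Cineon-style log-to-linear conversion of the kind that gets baked into offline dailies:

```python
# Illustrative sketch only (assumed generic Cineon-style curve, not the
# actual Carol LUT): build a 1D log-to-linear LUT and apply it per frame
# so that log-scanned material looks reasonable in the offline edit.
import numpy as np

def cineon_log_to_linear(code_values: np.ndarray) -> np.ndarray:
    """Map 10-bit Cineon-style log code values (0-1023) to linear light (0-1)."""
    black, white, gamma = 95.0, 685.0, 0.6
    offset = 10.0 ** ((black - white) * 0.002 / gamma)
    lin = (10.0 ** ((code_values - white) * 0.002 / gamma) - offset) / (1.0 - offset)
    return np.clip(lin, 0.0, 1.0)  # super-whites above code 685 are clipped

# Precompute the LUT once, then apply it to each frame by table lookup.
LUT = cineon_log_to_linear(np.arange(1024, dtype=np.float64))

def apply_lut(log_frame_10bit: np.ndarray) -> np.ndarray:
    """log_frame_10bit: integer array of 10-bit log code values per pixel/channel."""
    return LUT[np.clip(log_frame_10bit, 0, 1023)]

if __name__ == "__main__":
    fake_frame = np.random.randint(95, 1024, size=(486, 720, 3))  # stand-in log frame
    print(apply_lut(fake_frame).min(), apply_lut(fake_frame).max())
```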

Creating a timeless New York story

Cincinnati served as the principal location, doubling for New York City in the early 1950s. The surrounding area also doubled for Iowa and Pennsylvania during a traveling portion of the film. Gonçalves discussed how Haynes and he worked during this period. “The production shot in Cincinnati, but I was based at Goldcrest Films in New York. The negative was shipped to New York each day, where it was processed and scanned. Then I would get Avid editorial files. The cutting room was set up with Avid Media Composer and ISIS systems and my first assistant Perri [Pivovar] had the added responsibility on this project of checking for film defects. Ed would also review footage each day; however, Todd doesn’t like to watch dailies during a production. He would rely on me instead to be his eyes and ears to make sure that the coverage that he needed was there.”

He continues, “After the production wrapped, I completed my editor’s cut, while Todd took a break. He then spent two weeks reviewing all the dailies and making his own detailed notes. When he was ready, he joined me in the cutting room and we built the film according to his cut. Once we had these two versions – his and mine – we compared the two. They were actually very similar, because we both have similar tastes. I had started in May and by September the cut was largely locked. Most of the experimenting came with structure and music.”

The main editorial challenges were getting the right structure for the story and the right tone for the performances. According to Gonçalves, “Cate’s and Rooney’s performances are very detailed and I felt the need to slow the cutting pace down to let you appreciate that performance. Rooney’s is so delicate. Plus, it’s a love story and we needed to keep the audience engaged. We weren’t as concerned with trimming as with getting the story right. The first cut was two-and-a-half hours and the finished length ended up at 118 minutes. Some scenes were cut out that involved additional characters in the story. Todd isn’t too precious about losing scenes and this allowed us to keep the story focused on our central characters.”

“The main challenge was the party scene at the end. The story structure is similar to Brief Encounter (the 1945 David Lean classic with the beginning and ending set in the same location). Initially we had two levels of flashbacks, but there was too much of a shift back and forth. We had a number of ‘friends and family’ screenings and it was during these that we discovered the issues with the flashbacks. Ultimately we decided to rework the ending and simplify the temporal order of the last scene. The film was largely locked by the sixth or seventh cut.”

As a period piece, music is very integral to Carol. Gonçalves explains, “We started with about 300 to 400 songs that Todd liked, plus old soundtracks. These included a lot of singers of the time, like Billie Holiday. I also added ambiences for restaurants and bars. Carter (Burwell, composer) saw our cut at around the second or third screening with our temp score. After that he started sending preliminary themes for us to work into the cut. These really elevated the tone of the film. He’d come in every couple of weeks to see how his score was working out with the cut, so it became a very collaborative process.”

The editing application that an editor uses is an extension of how he works. Some have very elaborate routines for preparing bins and sequences and others take a simpler approach. Gonçalves fits into the latter group. He says, “Avid is like sitting down and driving a car for me. It’s all so smooth and so fast. It’s easy to find things and I like the color correction and audio tools. I started working more sound in the Avid on True Detective and its tools really help me to dress things up. I don’t use any special organizing routines in the bins. I simply highlight the director’s preferred takes; however, I do use locators and take a lot of handwritten notes.”

Film sensibilities in the modern digital era

Carol was literally the last film to be processed at Deluxe New York before the lab was shut down. In addition to a digital release, Technicolor also did a laser “film-out” to 35mm for a few release prints. All digital post-production was handled by Goldcrest Films, who scanned the Super 16mm negative on an ARRI laser scanner at 3K resolution for a 2K digital master. Goldcrest’s Boon Shin Ng handled the scanning and conforming of the film. Creating the evocative look of Carol fell to New York colorist John J. Dowdell III. Trained in photography before becoming a colorist in 1980, Dowdell has credits on over 200 theatrical and television films.

Unlike on many films, Dowdell was involved early in the overall process. He explains, “Early on, I had a long meeting with Todd and Ed about the look of the film. Todd had put together a book of photographs and tear sheets that helped with the colors and fashions from the 1950s. While doing the color grading job, we’d often refer back to that book to establish the color palette for the film.” Carol has approximately 100 visual effects shots to help make Cincinnati look like New York, circa 1952-53. Dowdell continues, “Boon coordinated effects with Chris Haney, the visual effects producer. The ARRI scanner is pin-registered, which is essential for the work of the visual effects artists. We’d send them both log and color corrected files. They’d use the color corrected files to create a reference preview LUT for their own use, but then send us back finished effects in log color space. These were integrated back into the film.”

Dowdell’s tool of choice is the Quantel Pablo Rio system, which incorporates color grading tools that match his photographic sensibilities. He says, “I tend not to rely as much on the standard lift/gamma/gain color wheels. That’s a video approach. Quantel includes a film curve, which I use a lot. It’s like an s-curve tool, but with a pivot point. I also use master density and RGB printer light controls. These are numeric and let you control the color very precisely, but also repeatably. That’s important as I was going through options with Todd and Ed. You could get back to an earlier setting. That’s much harder to do precisely with color wheels and trackball controls.”

The Quantel Pablo Rio is a complete editing and effects system as well, integrating the full power of Quantel’s legendary Paintbox. This permitted John Dowdell and Boon Shin Ng to handle some effects work within the grading suite. Dowdell continues, “With the paint and tracking functions, I could do a lot of retouching. For example, some modern elements, like newer style parking meters, were tracked, darkened and blurred, so that they didn’t draw attention. We removed some modern signs and also did digital clean-up, like painting out negative dirt that made it through the scan. Quantel does beautiful blow-ups, which was perfect for the minor reframing that we did on this film.”

The color grading toolset is often a Swiss Army Knife for the filmmaker, but in the end, it’s about the color. Dowdell concludes, “Todd and Ed worked a lot to evoke moods. In the opening department store scene, there’s a definite green cast that was added to let the audience feel that this is an unhappy time. As the story progresses, colors become more intense and alive toward the end of the film. We worked very intuitively to achieve the result and care was applied to each and every shot. We are all very proud of it. Of all the films I’ve color corrected, I feel that this is really my masterpiece.”

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2016 Oliver Peters

Sicario


Sicario is an emotional and suspenseful look into the dark side of the war on drugs as told by Canadian director Denis Villeneuve (Enemy, Prisoners, Incendies). It teams by-the-book FBI agent Kate (Emily Blunt) with an interagency task force led by CIA agent Matt (Josh Brolin). The shadowy mix of characters includes Alejandro (Benicio Del Toro) – an enigmatic contractor working with Matt. As the special operation proceeds with increasingly extra-legal means, we learn that there’s more to Alejandro than meets the eye – part former crusading prosecutor and part hitman. Kate and the audience are forced to question whether the ends justify the means, as the story grows increasingly tense.

From Wagner to Hollywood

The key to driving such a thriller is often the editor, who in this case was Joe Walker (12 Years a Slave, Hunger, Harry Brown). I had a chance to discuss Sicario with Walker as he took a break from cutting the next Villeneuve film, Story of Your Life. Walker’s road to Hollywood is different from that of many other top-level feature film editors. While editors often play musical instruments as a hobby, Walker actually studied to be a classical composer in his native England.

Walker explains, “It’s always been a hard choice between films and writing music. I remember when I was ten years old, I’d run 8mm films of the Keystone Cops at slow speed with Richard Wagner playing against it and kind of get depressed! So, these were twin interests of mine. I studied classical composing and balanced two careers of editing and composing up until the mid-2000s. I used my music degree to get a job with the BBC where I moved into assistant editor roles. The BBC is very cautious and it took me eleven years before finally being allowed to cut drama as an editor. This was all on 16mm film and then I moved into digital editing, first with Lightworks and later Avid. I always wanted to work on bigger films, but I felt there was a glass ceiling in England. Big studio films that came in would always bring their own editors. The big break for me was 12 Years a Slave, which provided the opportunity to move to Los Angeles.”

Controlling the story, characters and rhythm

Sicario has a definite rhythm designed to build suspense. There are scenes that are slow but tense and others that are action-packed. Walker explains his philosophy on setting the pace, “Since working with Steve McQueen (director, 12 Years a Slave) I’ve been known for holding shots a long time to build tension. This is contrary to the usual approach, which says you build tension by an increasingly faster cutting pace. Sometimes if you hold a shot, there’s even more tension if the story supports it. I’ll even use the trick of invisible split screens in order to hold a take longer than the way it was originally shot. For example, the left side of one take might hold long enough, but something breaks on the right. I’ll pull the right side from a different take in order to extend the end of the complete shot.”

Another interesting aspect to Sicario is the sparseness of the musical score, in favor of sound design. Walker comments, “Music is in an abusive relationship with film. Putting on my composer hat, I don’t want to tell the audience what to think only by the music. It’s part of the composite. I try to cut without a temp score, because you have to know when it’s only the music that drives the emotion. I’ll even turn the sound down and cut it as if it was a silent movie, so that I can feel the rhythm visually. Then sound effects add another layer and finally music. In Sicario, I made use of a lot of walkie-talkie dialogue to fill in spaces – using them almost like a sound effect.  Jóhann Jóhannsson (composer, Prisoners, The Theory of Everything, Foxcatcher) was thrilled to get a clean output without someone else’s preconceived temp score, because it allowed him to start with a clean palette.”

Editing shapes the characters. Walker says, “Taylor Sheridan’s script was fantastic, so I don’t want to do a disservice to him, but there was a continual process of paring down the dialogue and simplifying the story, which continued long into the edit. Benicio Del Toro’s character says very little and that helps keep him very mysterious. One of the biggest cuts we made in the edit was to eliminate the original opening scene, shot on the coast at Veracruz. In it, Alejandro (Del Toro) is interrogating a cop by holding his head underwater. He goes too far and kills him. So he drags the lifeless body to the shore only to resuscitate him and begin the interrogation again. A strong and brutal scene, but one that told too much about Alejandro at the outset, rather than letting us – and Kate (Emily Blunt) – figure him out piece by piece. We needed to tell the story through Kate’s eyes. The film now starts with the hostage rescue raid, which better anchors the film on Kate. And it’s not short of its own brutality. At the end of the scene we smash cut from a mutilated hand on the ground to Kate washing the blood out of her hair in the shower. This very violent beginning lets the audience know that anything could happen in this film.”

A carefully considered production

Sicario was produced for an estimated $31 million. While not exactly low budget, it was certainly modest for a film of this ambition. The majority of the film was shot in New Mexico over a 49-day period, starting in July of 2014. Final post was completed in March of this year. Roger Deakins (Unbroken, Prisoners, Skyfall), the film’s director of photography, relied on his digital camera of choice these days, the ARRI Alexa XT recording to ARRIRAW. The editorial team cut with transcoded Avid DNxHD media using two Avid Media Composer systems.

Joe Walker continues, “This was a very carefully considered shoot. They spent a lot of effort working out shots to avoid overshooting. Most of the set-ups were in the final cut. They were also lucky with the weather. I cut the initial assembly in LA while they were shooting in New Mexico. The fine cut was done in Montreal with Denis for ten weeks and then back to LA for the final post. The edit really came together easily because of all the prep. Roger has to be one of our generation’s greatest cinematographers. Not only are his shots fantastic, but he has a mastery of sequence building, which is matched by Denis.”

“Ninety percent of the time the editorial team consisted of just my long-time first assistant Javier [Marcheselli] and me. The main focus of the edit was to streamline the storytelling and to be as muscular and rhythmic with the cutting as possible. We spent a lot of time focused on the delicate balance between how much we see the story through our central character’s eyes and how much we should let the story progress by itself.  One of the constructs that came out of the edit was to beef up the idea of surveillance by taking helicopter aerials of the desert and creating drone footage from it.  Javier is great with temp visual effects and I’m good with sound, so we’d split up duties that way.”

“I’m happy that this was largely a single-camera production. Only a few shots were two-camera shots. Single-camera has the advantage that the editor can better review the footage. With multi-cam you might get four hours of dailies, which takes about seven hours to review. When are you left with time to cut? This makes it hard to build a relationship with the dailies. With a single-camera film, you have more time to really investigate the coverage. I like to mind-read what the direction was by charting the different nuances between takes.”

It shouldn’t matter what the knives are

Walker is a long-time Media Composer user. We wrapped up with a discussion about the tools of the trade. Walker says, “This was a small film compared to some, so we used two Avid workstations connected to Avid’s ISIS shared storage while in LA. It’s rock solid. In Montreal, there was a different brand of shared storage, which wasn’t nearly as solid as ISIS. On Michael Mann’s Blackhat, we sometimes had sixteen Avids connected to ISIS, so that’s pretty hard to beat. I really haven’t used other NLEs, like Final Cut, but Premiere is tempting. If anything, going back to Lightworks is even more intriguing to me. I really loved how intuitive the ‘paddles’ (the Lightworks flatbed-style Controller) were. But edit systems are like knives. You shouldn’t care what knives the chef used if the meal tastes good. Given the right story, I’d be happy to cut it on wet string.”

The editing application isn’t Walker’s only go-to tool. He continues, “I wish Avid would include more improvements on the audio side of Media Composer. I often go to outside applications. One of my favorites is [UISoftware’s] MetaSynth, which lets me extend music. For instance, if a chord is held for one second, I can use MetaSynth to extend that hold for as much as ten, twenty seconds. This makes it easy to tailor music under a scene and it sounds completely natural. I also used it on Sicario to elongate some great screaming sounds in the scene where Alejandro is having a nightmare on the plane – they are nicely embedded into the sounds of the jet engines – we wanted the message to be subliminal.”

Joe Walker is a fan of visual organization. He explains, “When I’m working with dailies, I usually don’t pre-edit select sequences for a scene unless it’s a humongous amount of coverage. Instead, I prefer to visually arrange the ‘tiles’ (thumbnail frames in the bin) in a way that makes it easier to tuck in. But I am a big fan of the scene wall. I write out 3” x 5” note cards for each scene with a short description of the essence of that scene on it. This is a great way to quickly see what that scene is all about and remind you of a character’s journey up to that point. When it comes time to re-order scenes, it’s often better to do that by shifting the cards on the wall first. If you try to do it in the software, you get bogged down in the logistics of making those edit changes. I’ll put the cards for deleted scenes off to the side, so a quick glance reminds me of what we’ve removed. It’s just something that works for me. Denis has just spent the best part of a year turning words into pictures so he laughs at my wall and my reliance on it!”

(It’s also worth checking out Steve Hullfish’s excellent interview with Walker at his Art of the Cut column.)

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2016 Oliver Peters

Steve Jobs

It’s challenging to condense the life of a complex individual into a two-hour-long film. So it’s no wonder that the filmmakers of Steve Jobs have earned both praise and criticism for their portrayal of the Apple co-founder. The real Steve Jobs generated differing emotions from those who knew him or those who viewed his life from the outside. To tackle that dilemma screenwriter Aaron Sorkin (Moneyball, The Social Network, Charlie Wilson’s War) and director Danny Boyle (127 Hours, Slumdog Millionaire, 28 Days Later) set out to create a “painting instead of a photograph”.

Steve Jobs, with Michael Fassbender in the central role, uses a classic Shakespearean three-act structure, focusing on three key product launches. Act 1 depicts the unveiling of the first Macintosh computer (1984); Act 2 is the introduction of the NeXT computer (1988); Act 3 is the reveal of the original iMac (1998). These three acts cover the narrative arc of Jobs’ rise, humiliation/revenge, and his ultimate return to prominence at Apple. All of the action takes place backstage at these launch events, but is intercut with flashbacks. The emotional thread that ties the three acts together is Jobs’ relationship with his daughter, Lisa Brennan-Jobs.

An action film of words

Aaron Sorkin’s scripts are known for their rapid-fire dialogue and Steve Jobs is no exception. With the script clocking in at close to 190 pages, the task of whittling it down to a two-hour movie fell to editor Elliot Graham (Milk, 21, Superman Returns). I recently spoke with Graham about how he connected with this project and some of the challenges the team faced. He explains, “I’ve been a fan of Danny’s and his regular editor wasn’t available to cut this film. So I reached out and met with them and I joined the team.”

“When I read the script, I characterized it as an ‘action film of words.’ Early on we talked about the dialogue and the need to get to two hours. I’ve never talked about the film’s final length with a director at the start of the project, but we knew the information would come fast and we didn’t want the audience to feel pummeled. We needed to create a tide of energy from beginning to end that takes the viewer through this dialogue as these characters travel from room to room. It’s our responsibility to keep each entrance into a different room or hallway revelatory in some fashion – so that the viewer stays with the ideas and the language. Thank goodness we had sound recordist Lisa Pinero on hand – she really helped the cast stay true to the musicality of the writing. The script is full of intentional overlaps, and Danny didn’t want to stop them from happening. Lisa captured it so that I could edit it. We knew we wanted very little ADR in this film, so we let the actors play out the scene. That was pivotal in capturing Aaron’s language.”

“Each act is a little different, both in production design and in the format. [Director of photography] Alwin Küchler (Divergent, R.I.P.D., Hanna) filmed Act 1 on 16mm, Act 2 on 35mm, and Act 3 digitally with the ARRI Alexa. We also added visuals in the form of flashbacks and other intercutting to make it more cinematic. Danny would keep rolling past the normal end of a take and would get some great emotions from the actors that I could use elsewhere. Also when the audience arrives to take their seats at these launch events, Danny would record that, which gave us additional material to work with. In one scene with Jobs and Joanna Hoffman (Kate Winslet), Danny kept rolling on Kate after Michael left the room. In that moment we got an exquisite emotional performance from her that was never in the script. In another example, he got this great abstract close-up of Michael that we were able to use to intercut with the boardroom scene later. This really puts the audience into Steve’s head and is a pay-off for the revenge concept.”

Building structure

Elliot Graham likes to make his initial cut tight and have a first presentation that’s reasonably finished. His first cut was approximately 147 minutes long compared with a final length of 117 minutes plus credits. He continues, “In the case of this film, cutting tight was beneficial, because we needed to know whether or not the pace would work. The good news is that this leaves you more time to experiment, because less time is spent cutting it down for time. We needed to make sure the viewer would stay engaged, because the film is really three separate stories. To avoid the ‘stage play’ feeling and move from one act into the next, we added some interstitial visual elements to move between acts. In our experimenting and trimming, we opted to cut out part of the start of Act 2 and Act 3 and join the walking-talking dialogue ‘in progress.’ This becomes a bit of a montage, but it serves the purpose of quickly bringing the viewer along even though they might have to mentally fill in some of the gaps. That way it didn’t feel like Act 2 and Act 3 were the start of new films and kept the single narrative intact.”

“At the start, the only way to really ascertain the success of our efforts was to see Act 1, as close to screen-ready as we could come. So I put together an assemblage and Danny, the producers, and I viewed it. Not only did we want to see how it all worked together before moving on, we wanted to see that we had achieved the tone and quality we were after, because each act needed to feel completely different. And since Danny was shooting each piece a bit differently, I was cutting each one differently. For example, there’s a lot of energy, almost frenetic, to the camera movements in Act 1, plus it was shot on 16mm, so it gives it this cinema verité feel and harkens back to a less technically-savvy time. Act 2 has a more classical technique to it, so the cutting becomes a little slower in pacing. By getting a sense of what was working and maybe what wasn’t, it helped define how we were going to shoot the subsequent two acts and ensure we were creating an evolution for the character and the story. We would not have been able to do this if we had shot this film out of chronological order, the way most features are.”

It’s common for a film’s scene structure to be re-arranged during the edit, but that’s harder to do with a film like Steve Jobs. There’s walking-talking dialogue that moves from one room to the next, which means the written script forces a certain linear progression. It’s a bit like the challenge faced in Birdman or (The Unexpected Virtue of Ignorance), except without the need to present the story as a continuous, single take. Graham says, “We did drop some scenes, but it was tricky, because you have to bridge the gap without people noticing. One of the scenes that was altered a lot from how it was written was the fight between John Sculley (Jeff Daniels) and Steve Jobs (Michael Fassbender). This scene runs about eleven minutes and Danny and I felt it lost momentum. So we spent about 48 hours recutting the scene. Instead of following the script literally, we followed the change in emotion of the actors’ performances. This led to a better emotional climax, which made the scene work.”

From San Francisco to London

Steve Jobs was shot in San Francisco from January to April of this year and then post shifted to London from April until October. The editorial team worked with two Avid Media Composers connected to Avid ISIS shared storage. The film elements were scanned and then all media transcoded to Avid DNxHD for the editing team. Graham explains, “From the standpoint of the edit, it didn’t matter whether it was shot on film or digitally – the different formats didn’t change our workflow. But it was still exciting to have part of this on film, because that’s so rare these days. Danny likes a very collaborative process, so Aaron and the producers were all involved in reviewing the cuts and providing their creative input. As a director, Danny is very involved with the edit. He’d go home and review all the dailies again on DVD just to make sure we weren’t missing anything. This wasn’t an effects-heavy film like a superhero film, yet there were still several hundred visual effects. These were mostly clean-ups, like make-up fixes, boom removals, but also composites, like wall projections.”

Various film editors have differing attitudes about how much sound they include in their cut. For Elliot Graham it’s an essential part of the process. He says, “I love working with sound and temp music, because it changes your perception and affects how you approach the cut. For Steve Jobs, music was a huge part of the process from the beginning. Unlike other films, we received a lot of pieces of music from Daniel Pemberton (composer, The Man from U.N.C.L.E., Cuban Fury, The Counselor) right at the start. He had composed a number of options based on his reading of the script. We tried different test pieces even before the shoot. Once some selections were made, Daniel gave us stems so that I could really tailor the music to the scene. This helped to define the flashbacks musically. The process was much more collaborative between the director and composer than on other films and it was a really unique way to work.”

Getting the emotion right

Elliot Graham joined the project after Michael Fassbender was signed to play Steve Jobs. Graham comments, “I’ve always thought Michael was a brilliant actor and I’d much rather have that to work with than someone who just looks like Jobs. Steve Wozniak (who is played by actor Seth Rogen in the film) watched the film several times and he commented that although the actual events were slightly different, the feeling behind what’s in the film was right. He’s said that to him, it was like seeing the real Steve. So Michael was in some way capturing the essence of this guy. I’m biased, of course, but Danny’s aim was to get the emotional approach right and I think he succeeded.”

“I’m a big Apple fan, so the whole process felt a bit strange – like I was in some sort of wonderful Charlie Kaufman wormhole. Here I was working on a Mac and using an iPhone to communicate while cutting a film about the first Mac and the person who so impacted the world through these innovations. I felt that by working on this film, I could understand Jobs just a little bit better. You get a sense of Jobs through his coming into contact with all of these people and his playing out whatever conflicts that existed. I think it’s more of a ‘why’ and ‘who’ story – rather than a point for point biography – why this person, whose impact on our lives is immeasurable, was the way he was. It’s my feeling that we were trying to look at his soul much more than track his life story.”

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2016 Oliver Peters

Avid Media Composer Goes 4K


Avid Technology entered 2015 with a bang. The company closed out 2014 with the release of its Media Composer version 8.3 software, the first to enable higher resolution editing, including 2K, UHD and 4K projects. On January 16th of this year, Avid celebrated its relisting on the NASDAQ exchange by ringing the opening bell. Finally – as in most years – the Academy Awards nominee field is dominated by films that used Media Composer and/or Pro Tools during the post-production process.

In a software landscape quickly shifting to rental (subscription) business models, Avid now offers the most flexible pricing model. Media Composer | Software may be purchased, rented, or managed through a floating license. If you purchase a perpetual license (you own the software), then an annually-renewed support contract gives you phone support and continued software updates. Opt out of the contract and you’ll still own the software you bought – you just lose any updates to newer software.

You can purchase other optional add-ons, like Symphony for advanced color correction. Unfortunately there’s still no resolution to the impasse between Avid and Nexidia. If you purchased ScriptSync or PhraseFind in the past, which rely on IP from Nexidia, then you can’t upgrade to version 8 or higher software and use those options. On the other hand, if you own an older version, such as Media Composer 7, and need to edit a project that requires a higher version, you can simply pick up a software subscription for a few months. This would let you run the latest software version for the time that it takes to complete that project.

The jump from Media Composer | Software 8.2 to 8.3 might seem minor, but in fact this was a huge update for Avid editors. It ushered in new, high-resolution project settings and capabilities, and also added a resolution-independent Avid codec – DNxHR. More than just the ability to edit in 4K, Media Composer now addresses most of the different 4K options that cover the TV and cinema variations, as well as new color spaces and frame rates. Need to edit 4K DCI Flat (3996×2160) at 48fps in DCI-P3 color space? Version 8.3 makes it possible. Although Avid introduced high-resolution editing in its flagship software much later than its competitors, it comes to the table with a well-designed upgrade that attempts to address the nuances of modern post.

Another new feature is LUT support. Media Composer has allowed users to add LUTs to source media for a while now, but 8.3 adds a new LUT filter. Apply this to a top video track on your timeline and you can then add a user-supplied film emulation (or any other type of) look to all of your footage. There’s a new Proxy setting designed for work with high-resolution media. For example, switch your project settings to 1/4 or 1/16 resolution for better performance while editing with large files. Switch Proxy off and you are ready to render and output at full quality. As Media Composer becomes more capable of functioning as a finishing system, it has gained DPX image sequence file export via the Avid Image Sequencer, as well as export to Apple ProRes 4444 (Mac only).

This new high-resolution architecture requires that the software increasingly shed any remaining 32-bit parts in order to be compatible with modern versions of the Mac and Windows operating systems. Avid’s Title Tool still exists for legacy SD and HD projects, but higher resolutions will use NewBlue Titler Pro, which is included with Media Composer. It can, of course, also be used for all other titling.

There are plenty of new, but smaller features for the editor, such as a “quick filter” in the bin. Use it to quickly filter the bin view so it only shows items that match your filter text entry. The Avid “helper” applications of EDL Manager and FilmScribe have now been integrated inside Media Composer as the List Tool. This may be used to generate EDLs, Cut Lists and Change Lists.

Avid is also a maker of video i/o hardware – Mojo DX and Nitris DX. While these will work to monitor higher resolution projects as downscaled HD, they won’t be updated to display native 4K output, for instance. Avid has qualified AJA and Blackmagic Design hardware for use as 4K i/o and is currently also qualifying Bluefish444. If you work with a 4K computer display connected to your workstation, then the Full Screen mode enables 4K preview monitoring.

Avid Media Composer | Software version 8.3 is just the beginning of Avid’s entry into the high-resolution post-production niche. Throughout 2015, updates will further refine and enhance these new capabilities and expand high-resolution to other Avid products and solutions. Initial user feedback is that 8.3 is reasonably stable and performs well, which is good news for the high-end film and television world that continues to rely on Avid for post-production tools and solutions.

(Full disclosure: I have participated in the Avid Customer Association and chaired the Video Subcommittee of the Products and Solutions Council. This council provides user feedback to Avid product management to aid in future product development.)

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2015 Oliver Peters

The Black Panthers: Vanguard of the Revolution

Documentaries covering subject matter that happened within a generation usually divide the audience between those who personally lived through the time period and those who’ve only read about it in history books. The Black Panthers: Vanguard of the Revolution is one such film. If you are over 50, you are aware of the media coverage of the Black Panther Party and certainly have opinions and possibly misconceptions of who they were. If you are under 50, then you may have learned about them in history class, in which case you may only know them by myth and legend. Filmmaker Stanley Nelson (The American Experience, Freedom Summer, Wounded Knee, Jonestown: The Life and Death of Peoples Temple) seeks to go beyond what you think you know with this new Sundance Film Festival documentary entry.

I spoke with the film’s editor, Aljernon Tunsil, as he was putting the finishing touches on the film to get it ready for Sundance presentation. Tunsil has worked his way up from assistant editor to editor and discussed the evolution in roles. “I started in a production company office, initially helping the assistant editor,” he says. “Over a period of seven or eight years, I worked my way up from assistant to a full-time editor. Along the way, I’ve had a number of mentors and learned to cut on both [Apple] Final Cut Pro and [Avid] Media Composer. These mentors were instrumental in my learning how to tell a story. I worked on a short with Stanley [Nelson] and that started our relationship of working together on films. I view my role as the ‘first audience’ for the film. The producer or director knows the story they want to make, but the editor helps to make sense of it for someone who doesn’t intimately know the material. My key job is to make sure that the narrative makes sense and that no one gets lost.”

The Black Panthers is told through a series of interviews (about 40 total subjects). Although a few notables, like Kathleen Cleaver, are featured, the chronicle of the rise and fall of the Panthers is largely told by lesser known party members, as well as FBI informants and police officers active in the events. The total post-production period took about 40 to 50 weeks. Tunsil explains, “Firelight Films (the production company) is very good at researching characters and finding old subjects for the interviews. They supplied me with a couple of hundred hours of footage. That’s a challenge to organize so that you know what you have. My process is to first watch all of that with the filmmakers and then to assemble the best of the interviews and best of the archival footage. Typically it takes six to ten weeks to get there and then another four to six weeks to get to a rough cut.”

Tunsil continues, “The typical working arrangement with Stanley is that he’ll take a day to review any changes I’ve made and then give me notes for any adjustments. As we were putting the film together, Stanley was still recording more interviews to fill in the gaps – trying to tie the story together without the need for a narrator. After that, it’s the usual process of streamlining the film. We could have made a ten-hour film, but, of course, not all of the stories would fit into the final two-hour version.”

Like many documentary film editors, Tunsil prefers having interview transcripts, but acknowledged they don’t tell the whole story. He says, “One example is in the interview with former Panther member Wayne Pharr. He describes the police raid on the LA headquarters of the party and the ensuing shootout. When asked how he felt, he talks about his feeling of freedom, even though the event surrounding him was horrific. That feeling clearly comes across in the emotion on his face, which transcends the mere words in the transcript. You get to hear the story from the heart – not just the facts. Stories are what make a documentary like this.”

As with many films about the 1960s and 1970s, The Black Panthers weaves into its fabric the music of the era. Tunsil says, “About 60% of the film was composed by Tom Phillips, but we also had about seven or eight period songs, like ‘Express Yourself’, which we used under [former Panther member] Bobby Seale’s run for mayor of Oakland. I used other pieces from Tom’s library as temp music, which we then gave to him for the feel. He’d compose something similar – or different, but in a better direction.”

Tunsil is a fervent Avid Media Composer editor, which he used for The Black Panthers. He explains, “I worked with Rebecca Sherwood as my associate editor and we were both using Media Composer version 7. We used a Facilis Terrablock for shared storage, but this was primarily used to transfer media between us, as we both had our own external drives with a mirrored set of media files. All the media was at the DNxHD 175 resolution. I like Avid’s special features such as PhraseFind, but overall, I feel that Media Composer is just better at letting me organize material than is Final Cut. I love Avid as an editing system, because it’s the most stable and makes the work easy. Editing is best when there’s a rhythm to the workflow and Media Composer is good for that. As for the stills, I did temporary moves with the Avid pan-and-zoom plug-in, but did the final moves in [Adobe] After Effects.”

For a documentary editor, part of the experience is what you personally learn. Tunsil reflects, “I like the way Stanley and Firelight handle these stories. They don’t just tell it from the standpoint of the giants of history, but more from the point-of-view of the rank-and-file people. He’s trying to show the full dimension of the Panthers instead of the myth and iconography. It’s telling the history of the real people, which humanizes them. That’s a more down-to-earth, honest experience. For instance, I never knew that they had a communal living arrangement. By having the average members tell their stories, it makes it so much richer. Another example is the Fred Hampton story. He was the leader of the Chicago chapter of the party who was killed in a police shootout; yet there was no evidence of gunfire from inside the building that he was in. That’s a powerful scene, which resonates. One part of the film that I think is particularly well done is the explanation of how the party declined due to a split between Eldridge Cleaver and Huey Newton. This was in part the result of an internal misinformation campaign instigated by the FBI within the Panthers.”

Throughout the process, the filmmakers ran a number of test screenings with diverse audiences, including industry professionals and non-professionals, people who knew the history and people who didn’t. Results from these screenings enabled Nelson and Tunsil to refine the film. To complete the film’s finishing, Firelight used New York editorial facility Framerunner. Tunsil continues, “Framerunner is doing the online using an Avid Symphony. To get ready, we simply consolidated the media to a single drive and then brought it there. They are handling all color correction, improving moves on stills and up-converting the standard definition archival footage.”

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2015 Oliver Peters

Stocking Stuffers 2014

As we head toward the end of the year, it’s time to look again at a few items you can use to spruce up your edit bay.

Let’s start at the computer. The “tube” Mac Pro has been out for nearly a year, but many will still be trying to get the most life out of their existing Mac Pro “tower”. I wrote about this awhile back, so this is a bit of a recap. More RAM, an internal SSD and an upgraded GPU card are the best starting points. OWC and Crucial are your best choices for RAM and solid state drives. If you want to bump up your GPU, then the Sapphire 7950 (Note: I have run into issues with some of these cards, where the spacer screws are too tall, requiring you to install the card in slot 2) and/or Nvidia GTX 680 Mac Edition cards are popular choices. However, these will only give you an incremental boost if you’ve already been running an ATI 5870 or Nvidia Quadro 4000 display card. If you have the dough and want some solid horsepower, then go for the Nvidia Quadro K5000 card for the Mac. To expand your audio monitoring, look at Mackie mixers, KRK speakers and the PreSonus Audiobox USB interface. Naturally there are many video monitor options, but assuming you have an AJA or Blackmagic Design interface, FSI would be my choice. HP Dreamcolor is also a good option when connecting directly to the computer.

The video plug-in market is prolific, with plenty of packages and/or individual filters from FxFactory, Boris, GenArts, FCP Effects, Crumplepop, Red Giant and others. I like the Universe package from Red Giant, because it supports FCP X, Motion, Premiere Pro and After Effects. Red Giant continues to expand the package, including some very nice new premium effects. If you are a Media Composer user, then you might want to look into the upgrade from Avid FX to Boris Red. Naturally, you can’t go wrong with FxFactory, especially if you use FCP X. There’s a wide range of options with the ability to purchase single filters – all centrally managed through the FxFactory application.

For audio, the go-to filter companies are iZotope, Waves and Focusrite to name a few. iZotope released some nice tools in its RX4 package – a state-of-the-art repair and restoration suite. If you just want a suite of EQ and compression tools, then Nectar Elements or Nectar 2 are the best all-in-one collections of audio filters. While most editors do their audio editing/mastering within their NLE, some need a bit more. Along with a 2.0 bump for Sound Forge Pro Mac, Sony Creative Software also released a standard version of Sound Forge through the Mac App Store.

In the color correction world, there’s been a lot of development in film emulation look-up tables (LUTs). These can be used in most NLEs and grading applications. If that’s for you, check out ImpulZ and Osiris from Color Grading Central (LUT Utility required with FCP X), Koji Color or the new SpeedLooks 4 (from LookLabs). Each package offers a selection of Fuji and Kodak emulations, as well as other stylized looks. These packages feature LUT files in the .cube and/or .look (Adobe) LUT file formats and, thus, are compatible with most applications. If you want film emulation that also includes 3-way grading tools and adjustable film grain, your best choice is FilmConvert 2.0.

Another category that is expanding covers the range of tools used to prep media from the camera prior to the edit. This had been something only for DITs and on-set “data wranglers”, but many videographers are increasingly using such tools on everyday productions. These now offer on-set features that benefit all file-based recordings. Pomfort Silverstack, ShotPut Pro, Redcine-X Pro and Adobe Prelude have been joined by new tools. To start, there’s Offload and EditReady, which are two very specific tools. Offload simply copies and verifies camera-card media to two target drives. EditReady is a simple drag-and-drop batch convertor to transcode media files. These join QtChange (a utility to batch-add timecode and reel IDs to media files) and Better Rename (a Finder renaming utility) in my book, as the best single-purpose production applications.

If you want more in one tool, then there’s Bulletproof, which has now been joined in the market by Sony Creative Software’s Catalyst Browse and Prepare. Bulletproof features media offload, organization, color correction and transcoding. I like it, but my only beef is that it doesn’t properly handle timecode data, when present. Catalyst Browse is free and similar to Canon’s camera utility. It’s designed to read and work with media from any Sony camera. Catalyst Prepare is the paid version with an expanded feature set. It supports media from other camera manufacturers, including Canon and GoPro.

Finally, many folks are looking for alternatives to Adobe Photoshop. I’m a fan of Pixelmator, but this has been joined by Pixlr and Mischief. All three are available from the Mac App Store. Pixlr is free, but can be expanded through subscription. In its basic form, Pixlr is a stylizing application that is like a very, very “lite” version of Photoshop; however, it includes some very nice image processing filters. Mischief is a drawing application designed to work with drawing tablets, although a mouse will work, too.

©2014 Oliver Peters

24p HD Restoration


There’s a lot of good film content that only lives on 4×3 SD 29.97 interlaced videotape masters. Certainly in many cases you can go back and retransfer the film to give it new life, but for many small filmmakers, the associated costs put that out of reach. In general, I’m referring to projects with $0 budgets. Is there a way to get an acceptable HD product from an old Digibeta master without breaking the bank? A recent project of mine would say, yes.

How we got here

I had a rather storied history with this film. It was originally shot on 35mm negative, framed for 1.85:1, with the intent to end up with a cut negative and release prints for theatrical distribution. It was being posted around 2001 at a facility where I worked and I was involved with some of the post production, although not the original edit. At the time, synced dailies were transferred to Beta-SP with burn-in data on the top and bottom of the frame for offline editing purposes. As was common practice back then, the 24fps film negative was transferred to the interlaced video standard of 29.97fps with added 2:3 pulldown – a process that duplicates additional fields from the film frames, such that 24 film frames evenly add up to 60 video fields in the NTSC world. This is loaded into an Avid, where – depending on the system – the redundant fields are removed, or the list that goes to the negative cutter compensates for the adjustments back to a frame-accurate 24fps film cut.
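
To make that 2:3 cadence concrete, here is a tiny sketch (an illustration only, not part of the original workflow) showing how four film frames become ten fields – five interlaced video frames, two of which mix fields from different film frames:

```python
# Illustrative 2:3 pulldown sketch: film frames A, B, C, D contribute
# 2, 3, 2 and 3 fields respectively, yielding ten fields (five 29.97i frames).
FILM_FRAMES = ["A", "B", "C", "D"]
CADENCE = [2, 3, 2, 3]  # fields contributed by each film frame

fields = [f for frame, count in zip(FILM_FRAMES, CADENCE) for f in [frame] * count]
video_frames = [tuple(fields[i:i + 2]) for i in range(0, len(fields), 2)]

for n, (f1, f2) in enumerate(video_frames):
    kind = "whole" if f1 == f2 else "split-field"
    print(f"video frame {n}: fields {f1}/{f2} -> {kind}")
# Prints AA, BB, BC, CD, DD - frames 2 and 3 are the split-field frames that
# a reverse telecine has to rebuild into whole film frames.
```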

For the purpose of festival screenings, the project file was loaded into our Avid Symphony and I conformed the film at uncompressed SD resolution from the Beta-SP dailies and handled color correction. I applied a mask to hide the burn-in and ended up with a letter-boxed sequence, which was then output to Digibeta for previews and sales pitches to potential distributors. The negative went off to the negative cutter, but for a variety of reasons, that cut was never fully completed. In the two years before a distribution deal was secured, additional minor video changes were made throughout the film to end up with a revised cut, which no longer matched the negative cut.

Ultimately the distribution deal that was struck was only for international video release and nothing theatrical, which meant that rather than finishing/revising the negative cut, the most cost-effective process was to deliver a clean video master. Except that all video source material had burn-in and the distributor required a full-height 4×3 master. Therefore, letter-boxing was out. To meet the delivery requirements, the filmmaker would have to go back to the original negative and retransfer it in a 4×3 SD format and master that to Digital Betacam. Since the negative was only partially cut and additional shots were added or changed, I went through a process of supervising the color-corrected transfer of all required 35mm film footage. Then I rebuilt the new edit timeline largely by eye-matching the new, clean footage to the old sequence. Once done and synced with the mix, a Digibeta master was created and off it went for distribution.

What goes around comes around

After a few years in distribution, the filmmaker retrieved his master and rights to the film, with the hope of breathing a little life into it through self-distribution – DVDs, Blu-rays, Internet, etc. With the masters back in-hand, it was now a question of how best to create a new product. One thought was simply to letter-box the film (to be in the director’s desired aspect) and call it a day. Of course, that still wouldn’t be in HD, which is where I stepped back in to create a restored master that would work for HD distribution.

Obviously, if there was any budget to retransfer the film negative to HD and repeat the same conforming operation that I’d done a few years ago – except now in HD – that would have been preferable. Naturally, if you have some budget, that path will give you better results, so shop around. Unfortunately, while desktop tools for editors and color correction have become dirt-cheap in the intervening years, film-to-tape transfer and film scanning services have not – and these retain a high price tag. So if I was to create a new HD master, it had to be from the existing 4×3 NTSC interlaced Digibeta master as the starting point.

In my experience, if you are going to blow up SD to HD frame sizes, it’s best to start with a progressive rather than an interlaced source. That’s even more true when working with software, rather than hardware up-convertors, like Teranex. Step one was to reconstruct a correct 23.98p SD master from the 29.97i source. To do this, I captured the Digibeta master as a ProResHQ file.

Avid Media Composer to the rescue


When you talk about software tools that are commonly available to most producers, there are a number of applications that can correctly apply a “reverse telecine” process. There are, of course, hardware solutions from Snell and Teranex (Blackmagic Design) that do an excellent job, but I’m focusing on a DIY solution in this post. That involves deconstructing the 2:3 pulldown (also called “3:2 pulldown”) cadence of whole and split-field frames back into only whole frames, without any interlaced tearing (split-field frames). After Effects and Cinema Tools offer this feature, but they really only work well when the entire source clip is of a consistent and unbroken cadence. This film had been completed in NTSC 29.97 TV-land, so frequently at cuts, the cadence would change. In addition, there had been some digital noise reduction applied to the final master after the Avid output to tape, which further altered the cadence at some cuts. Therefore, to reconstruct the proper cadence, changes had to be made every few cuts and, in some scenes, at every shot change. This meant slicing the master file at every required point and applying a different setting to each clip. The only software that I know of that can effectively do this is Avid Media Composer.

Start in Media Composer by creating a 29.97 NTSC 4×3 project for the original source. Import the film file there. Next, create a second 23.98 NTSC 4×3 project. Open the bin from the 29.97 project into the 23.98 project and edit the 29.97 film clip to a new 23.98 sequence. Media Composer will apply a default motion adapter to the clip (which is the entire film) in order to reconcile the 29.97 interlaced frame rate into a 23.98 progressive timeline.

Now comes the hard part. Open the Motion Effect Editor window and “promote” the effect to gain access to the advanced controls. Set the Type to “Both Fields”, Source to “Film with 2:3 Pulldown” and Output to “Progressive”. Although you can hit “Detect” and let Media Composer try to decide the right cadence, it will likely guess incorrectly on a complex file like this. Instead, under the 2:3 Pulldown tab, toggle through the cadence options until you only see whole frames when you step through the shot frame-by-frame. Move forward to the next shot(s) until you see the cadence change and you see split-field frames again. Split the video track (place an “add edit”) at that cut and step through the cadence choices again to find the right combination. Rinse and repeat for the whole film.
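
Media Composer doesn’t expose any of this programmatically, but as a rough sketch of what your eye is doing when you toggle cadence options, a split-field (combed) frame can be spotted by comparing adjacent lines against lines two apart; the threshold below is an assumption, not a published value:

```python
# Rough sketch of split-field ("comb") detection, assuming a single luma
# frame as a 2D numpy array. Combed frames show much larger line-to-line
# (field vs. field) differences than same-field differences.
import numpy as np

def comb_score(luma: np.ndarray) -> float:
    gray = luma.astype(np.float64)
    inter_field = np.abs(gray[:-2] - gray[1:-1]).mean()  # neighboring lines (opposite fields)
    intra_field = np.abs(gray[:-2] - gray[2:]).mean()    # lines two apart (same field)
    return inter_field / (intra_field + 1e-6)

def is_split_field(luma: np.ndarray, threshold: float = 1.5) -> bool:
    # Threshold is a guess; a real detector would adapt it per shot.
    return comb_score(luma) > threshold
```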

Due to the nature of the process, you might have a cut that itself occurs within a split-field frame. That’s usually because this was a cut in the negative and was transferred as a split-field video frame. In that situation, you will have to remove the entire frame across both audio and video. These tiny 1-frame adjustments throughout the film will slightly shorten the duration, but usually it’s not a big deal. However, the audio edit may or may not be noticeable. If it can’t simply be fixed by a short 2-frame dissolve, then usually it’s possible to shift the audio edit a little into a pause between words, where it will sound fine.

Once the entire film is done, export a new self-contained master file. Depending on codecs and options, this might require a mixdown within Avid, especially if AMA linking was used. That was the case for this project, because I started out in ProResHQ. After export, you’ll have a clean, reconstructed 23.98p 4×3 NTSC-sized (720×486) master file. Now for the blow-up to HD.

DaVinci Resolve

There are many applications and filters that can blow up SD to HD footage, but often the results end up soft. I’ve found DaVinci Resolve to offer some of the cleanest resizing, along with very fast rendering for the final output. Resolve offers three scaling algorithms, with “Sharper” providing the crispest blow-up. The second issue is that since I wanted to restore the wider aspect, which is inherent in going from 4×3 to 16×9, this meant blowing up more than normal – enough to fit the image width and crop the top and bottom of the frame. Since Resolve has the editing tools to split clips at cuts, you have the option to change the vertical position of a frame using the tilt control. Plus, you can do this creatively on a shot-by-shot basis if you want to. This way you can optimize the shot to best fit into the 16×9 frame, rather than arbitrarily lopping off a preset amount from the top and bottom.
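
For a sense of the numbers involved (an illustration only – Resolve handles the pixel aspect and filtering itself), filling a 16×9 HD frame’s width with a 4×3 picture leaves 240 scaled lines to crop, and the per-shot tilt decides where that crop falls:

```python
# Back-of-the-envelope math for the 4x3-to-16x9 blow-up described above.
def blowup_crop(src_dar=4/3, out_w=1280, out_h=720, tilt=0):
    """Fill the output width with the 4:3 picture, then report the scaled
    height and how the overshoot is cropped after a vertical offset ('tilt')."""
    scaled_h = round(out_w / src_dar)   # 1280 / (4/3) = 960 lines after the blow-up
    overshoot = scaled_h - out_h        # 240 lines must be cropped away
    top_crop = overshoot // 2 + tilt    # default is a centered crop
    bottom_crop = overshoot - top_crop
    return scaled_h, top_crop, bottom_crop

print(blowup_crop())          # (960, 120, 120) -> centered crop
print(blowup_crop(tilt=-40))  # (960, 80, 160)  -> keeps more of the top of the frame
```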

You actually have two options. The first is to blow up the film to a large 4×3 frame out of Resolve and then do the slicing and vertical reframing in yet another application, like FCP 7. That’s what I did originally with this project, because back then, the available version of Resolve did not offer what I felt were solid editing tools. Today, I would use the second option, which would be to do all of the reframing strictly within Resolve 11.

As always, there are some uncontrollable issues in this process. The original transfer of the film to Digibeta was done on a Rank Cintel Mark III, which is a telecine unit that used a CRT (literally an oscilloscope tube) as a light source. The images from these tubes get softer as they age and, therefore, they require periodic scheduled replacement. During the course of the transfer of the film, the lab replaced the tube, which resulted in a noticeable difference in crispness between shots done before and after the replacement. In the SD world, this didn’t appear to be a huge deal. Once I started blowing up that footage, however, it really made a difference. The crisper footage (after the tube replacement) held up to more of a blow-up than the earlier footage. In the end, I opted to only take the film to 720p (1280×720) rather than a full 1080p (1920×1080), just because I didn’t feel that the majority of the film held up well enough at 1080. Not just for the softness, but also in the level of film grain. Not ideal, but the best that can be expected under the circumstances. At 720p, it’s still quite good on Blu-ray, standard DVD or for HD over the web.

To finish the process, I dust-busted the film to fix places with obvious negative dirt (white specks in the frame) caused by the initial handling of the film negative. I used FCP X and CoreMelt’s SliceX to hide and cover negative dirt, but other options to do this include built-in functions within Avid Media Composer. While 35mm film still holds a certain intangible visual charm – even in such a “manipulated” state – the process certainly makes you appreciate modern digital cameras like the ARRI ALEXA!

As an aside, I’ve done two other complete films this way, but in those cases, I was fortunate to work from 1080i masters, so no blow-up was required. One was a film transferred in its entirety from a low-contrast print, broken into reels. The second was assembled digitally and output to intermediate HDCAM-SR 23.98 masters for each reel. These were then assembled to a 1080i composite master. Aside from being in HD to start with, cadence changes only occurred at the edits between reels. This meant that it only required 5 or 6 cadence corrections to fix the entire film.

©2014 Oliver Peters