Five Came Back

We know them today as the iconic Hollywood directors who brought us such classic films as Mr. Smith Goes to Washington, It’s a Wonderful Life, The African Queen, and The Man Who Shot Liberty Valance – just to name a few. John Ford, William Wyler, John Huston, Frank Capra and George Stevens also served their country on the ground in World War II, bringing its horrors and truth to the American people through film. In Netflix’s new three-part documentary series, based on Mark Harris’ best-selling book, Five Came Back: A Story of Hollywood and the Second World War, contemporary filmmakers explore the extraordinary story of how Hollywood changed World War II – and how World War II changed Hollywood – through the interwoven experiences of these five legendary filmmakers.

This documentary series features interviews with Steven Spielberg, Francis Ford Coppola, Guillermo del Toro, Paul Greengrass and Lawrence Kasdan, who add their own perspectives on these efforts. “Film was an intoxicant from the early days of the silent movies,” says Spielberg in the opening moments of Five Came Back. “And early on, Hollywood realized that it had a tremendous tool or weapon for change, through cinema.” Adds Coppola, “Cinema in its purest form could be put in the service of propaganda. Hitler and his minister of propaganda Joseph Goebbels understood the power of the cinema to move large populations toward your way of thinking.”

Five Came Back is directed by Laurent Bouzereau, written by Mark Harris and narrated by Meryl Streep. Bouzereau and his team gathered over 100 hours of archival and newsreel footage; watched over 40 documentaries and training films directed and produced by the five directors during the war; and studied 50 studio films and over 30 hours of outtakes and raw footage from their war films to bring this story to Netflix audiences. Says director Laurent Bouzereau, “These filmmakers, at that time, had a responsibility in that what they were putting into the world would be taken as truth. You can see a lot of echoes in what is happening today. It became clear as we were doing this series that the past was re-emerging in some ways, including the line we see that separates cinema that exists for entertainment and cinema that carries a message. And politics is more than ever a part of entertainment. I find it courageous of filmmakers then, as with artists today, to speak up for those who don’t have a platform.”

An editor’s medium

As every filmmaker knows, documentaries are truly an editor’s medium. Key to telling this story was Will Znidaric, the series editor. Znidaric spent the first sixteen years of his career as a commercial editor in New York City before heading to Los Angeles to become more involved in narrative projects and hone his craft. This move led to a chance to cut the documentary Winter on Fire: Ukraine’s Fight for Freedom. Production and post for that film were handled by LA’s Rock Paper Scissors Entertainment, a division of the Rock Paper Scissors post facility. RPS is co-owned by Oscar-winning editor Angus Wall (The Social Network, The Girl with the Dragon Tattoo). Wall, along with Jason Sterman and Linda Carlson, was an executive producer on Winter on Fire for RPS. The connection was a positive experience, so when RPS got involved with Five Came Back, Wall tapped Znidaric as its editor. Much of the same post team worked on both documentaries.

I recently interviewed Will Znidaric about his experience editing Five Came Back. “I enjoyed working with Angus,” he explains. “We edited and finished at Rock Paper Scissors over a fifteen-month period. They are structured to encourage creativity, which was great for me as a documentary editor. Narratively, this story has five main characters who are on five individual journeys. The canvas is civilization’s greatest conflict. You have to be clear about the war in order to explain their context. You have to be able to find the connections to weave a tapestry between all of these elements. This came together thanks to the flow and trust that was there with Laurent [Bouzereau, director]. The unsung hero is Adele Sparks, our archival producer, who had to find the footage and clear the rights. We were generally able to clear rights to the great majority of the footage on our wish list.”

Editing is paleontology

Znidaric continues, “In a documentary like this, editing is a lot like paleontology – you have to find the old bones and reconstruct something that’s alive. There was a lot of searching through newsreels of the day, which was interesting thematically. We all look at the past through the lens of history, but how was the average American processing the events of that world during that time? Of course, those events were unfolding in real time for them. It really makes you think about today’s films and how world events have an impact on them. We had about 100 hours of archival footage, plus studio films and interviews. For eight to nine months we had our storyboard wall with note cards for each of the films. As more footage came in, you could chart the growth through the cards.”

Five Came Back was constructed using three organizing principles: 1) the directors’ films before the war, 2) their documentaries during the war, and 3) their films after the war. According to Znidaric, “We wanted to see how the war affected their work after the war. The book was our guide for causality and order, so I was able to build the structure of the documentary before the contemporary directors were interviewed. I was able to do so using an initial interview with the author, Mark Harris. This way we were able to script an outline to follow. Interview footage of our actual subjects from a few decades ago was also a key element used to tell the story. In recording the modern directors, we wanted to give them space – they are masters – we just needed to make sure we got certain story beats. Their point of view is unique in the sense that they are providing their perspective on their heroes. At the beginning, we have one modern director talking about one of our subject directors. Then that opens up over the three hours, as each talks a little bit about all of these filmmakers.”

From Moviola to Premiere Pro

This was the first film that Znidaric had edited using Adobe Premiere Pro. He says, “During film school, I got to cut 16mm on the Moviola, but throughout my time in New York, I worked on [Avid] Media Composer and then later [Apple] Final Cut Pro 7. When Final Cut Pro X came out, I just couldn’t wrap my head around it, so it was time to shift over to Premiere Pro. I’m completely sold on it. It was a dream to work with on this project. At Rock Paper Scissors, my associate editor James Long and I were set up in two suites. We had duplicate drives of media – not a SAN – simply due to how the suites were wired. It worked out well for us, but forced us to be extremely diligent about how our media was organized and about maintaining that organization throughout.” The suites were configured with 6-core 2013 Mac Pros, AJA IoXT boxes and Mackie Big Knob mixers for playback.

“All of the media was first transcoded to ProRes, which I believe is one of the reasons that the systems were rock solid during that whole time. There’s an exemplary engineering department at RPS, and they have a direct line to Adobe, so if there were any issues, they became the go-betweens. That way I could stay focused on the creative and not get bogged down with technical issues. Plus, James [Long] would generally handle issues of a technical nature. All told, it was very minimal. The project ran quite smoothly.” To stay on the safe side, the team did not update their versions of Premiere Pro during this time frame, opting to stick with Premiere Pro CC2015 for the duration. Because of the percentage of archival footage, Five Came Back was finished in HD rather than 4K, as a number of other Netflix shows are.
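The transcode-everything-to-ProRes approach can be approximated with ffmpeg, which includes a ProRes encoder. As a rough sketch only – the actual RPS pipeline isn’t documented here, and the paths, naming, and wrapper function are hypothetical – a small Python helper might build the command like this:

```python
import subprocess
from pathlib import Path

# Hypothetical sketch of a ProRes transcode step. "prores_ks" and these
# flags are standard ffmpeg options; the file layout and naming scheme
# are illustrative, not the production team's actual setup.
def prores_command(src: str, dst_dir: str) -> list:
    src_path = Path(src)
    dst = Path(dst_dir) / (src_path.stem + "_prores.mov")
    return [
        "ffmpeg", "-i", str(src_path),
        "-c:v", "prores_ks",     # ffmpeg's ProRes encoder
        "-profile:v", "2",       # 2 = ProRes 422 (standard)
        "-c:a", "pcm_s16le",     # uncompressed 16-bit audio
        str(dst),
    ]

def transcode(src: str, dst_dir: str) -> None:
    # Executes the transcode; requires ffmpeg on the PATH.
    subprocess.run(prores_command(src, dst_dir), check=True)
```

Transcoding camera originals and archival scans to a single edit-friendly codec like this is a common way to keep an NLE responsive, since every clip then decodes with the same predictable load.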

To handle Premiere Pro projects over the course of fifteen months, Znidaric and Long would transfer copies of the project files on a daily basis between the rooms. Znidaric continues, “There were sequences for individual ‘mini-stories’ inside the film. I would build these and then combine the stories. As the post progressed, we would delete some of the older sequences from the project files in order to keep them lean. Essentially we had a separate Premiere Pro project file for each day; therefore, at any time we could go back to an earlier project file to access an older sequence, if needed. We didn’t do much with the other Creative Cloud tools, since we had Elastic handling the graphics work. I would slug in raw stills or placeholder cards for maps and title cards. That way, again, I could stay focused on weaving the complex narrative tapestry.”
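The daily project-file versioning Znidaric describes – a fresh copy of the project each day, so that older sequences stay retrievable even after they’re deleted from the working file – can be sketched in a few lines of Python. The directory layout and naming convention below are assumptions for illustration, not the team’s actual scheme:

```python
import shutil
from datetime import date
from pathlib import Path
from typing import Optional

# Sketch of a daily "archive the project" routine: copy the working
# .prproj file into a dated archive folder so any prior day's sequences
# can be reopened later. Names and layout are hypothetical.
def archive_daily(project: str, archive_dir: str,
                  day: Optional[date] = None) -> Path:
    day = day or date.today()
    src = Path(project)
    dest_dir = Path(archive_dir)
    dest_dir.mkdir(parents=True, exist_ok=True)
    # e.g. FiveCameBack.prproj -> FiveCameBack_2016-05-01.prproj
    dest = dest_dir / "{}_{}{}".format(src.stem, day.isoformat(), src.suffix)
    shutil.copy2(src, dest)  # copy2 preserves file timestamps
    return dest
```

The design point is the same one the editors relied on: because project files are small relative to media, a full dated copy per day is cheap insurance, and trimming old sequences from the working file keeps it lean without losing anything.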

Elastic developed the main title and a stylistic look for the series while a52 handled color correction and finishing. Elastic and a52 are part of the Rock Paper Scissors group. Znidaric explains, “We had a lot of discussions about how to handle photos, stills, flyers, maps, dates and documents. The reality of filming under the stress of wartime and combat creates artifacts like scratches, film burn-outs and so on. These became part of our visual language. The objective was to create new graphics that would be true to the look and style of the archival footage.” The audio mix went out-of-house to Monkeyland, a Los Angeles audio post and mixing shop.

Five Came Back appealed to the film student side of the editor. Znidaric wrapped up our conversation with these thoughts. “The thrill is that you are learning as you go through the details. It’s mind-blowing and the series could easily have been ten hours long. We are trying to replicate a sense of discovery without the hindsight of today’s perspective. This was fun because it was like a graduate level film school. Most folks have seen some of the better known films, but many of these films aren’t as recognized these days. Going through them is a form of ‘cinematic forensics’. You find connections tied to the wartime experience that might not otherwise be as obvious. This is great for a film geek like me. Hopefully many viewers will rediscover some of these films by seeing this documentary series.”

The first episode of Five Came Back aired on Netflix on March 31. In conjunction with the launch of Five Came Back, Netflix will also present thirteen documentaries discussed in the series, including Ford’s The Battle of Midway, Wyler’s The Memphis Belle: A Story of a Flying Fortress, Huston’s Report from the Aleutians, Capra’s The Battle of Russia, Stevens’ Nazi Concentration Camps, and Stuart Heisler’s The Negro Soldier.

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters

Red Giant Magic Bullet Suite 13

The hallmark of Red Giant’s Magic Bullet software products is that they are designed to enhance or stylize images. As their banner states, they focus on “color correction, finishing and film looks for filmmakers.” You can purchase individual software products or a comprehensive suite of tools. I reviewed Magic Bullet Suite 12 a couple of years ago. A few months ago Red Giant released its Magic Bullet Suite 13 update. As in the past, you can purchase it outright or as an upgrade from a previous version. With each iteration of the suite, Red Giant shuffles the mix of products in the toolkit, and this version is no different.

Magic Bullet Suite 13 is comprised of seven plug-in products, which include Looks 4.0, Colorista IV, Denoiser III, Cosmo II, Mojo II, Film, and the newly added Renoiser. The tools are cross-platform compatible (macOS or Windows), but depending on the editing or compositing software you use, not all of these plug-ins work in every possible host. All of the tools will work in Adobe Premiere Pro or After Effects, as well as Apple Final Cut Pro X. Magic Bullet Looks 4.0 provides the broadest host support, including some less common choices. Looks supports After Effects, Premiere Pro, Final Cut Pro X, Motion, Magix Vegas Pro, Avid Media Composer, DaVinci Resolve, EDIUS, and HitFilm Pro. Colorista only supports the Adobe and Apple hosts, while the other tools support the bulk of possible choices, with the exception of Media Composer. Therefore, what you cut or composite with will determine what your best purchase will be – full suite or individual plug-ins.

New bells and whistles

The big selling point of this release is GPU acceleration across the board using OpenGL/OpenCL. This provides real-time color correction. There are plenty of refinements throughout, but if you are an Adobe user, you’ll note that Colorista IV has embraced Adobe’s panel technology. If you are comfortable with Premiere Pro’s Lumetri Color panel, you can now work with Colorista in exactly the same manner. I’ve dabbled a bit with all of these tools in various Avid, Apple, and Adobe hosts. While performance is good and certainly improved, you’ll have the best experience in Adobe After Effects and Premiere Pro. Another advantage there is Adobe’s built-in masking and tracking tools. Want to isolate someone’s face and track a Colorista correction to it during a moving shot? No problem, since Adobe’s features augment any installed plug-in. As an editor, I like to do most of my work within the NLE, but honestly, if you want the best total experience, use these tools in After Effects. That’s where everything shines.

Looks

I won’t dive into each specific feature, since you can download a free trial version and see for yourself. Plus, you can reread my Magic Bullet Suite 12 review, as many of the main features are similar. But let me note a few items, starting with Looks. This is the granddaddy plug-in of the group, which actually runs as a mini sidebar application. Apply the plug-in, click the “edit” button, and your reference frame opens in the standalone Looks interface. It includes a wealth of tools that can be applied, reordered, and adjusted in near-infinite variations to get just the specific look you desire. There are three helpful features – grading head starts, the ability to save custom presets, and a looks browser. The browser offers a ton of custom presets with a small thumbnail for each. These thumbnails use your reference frame, and as you hover over each, the main viewer window updates to display that look, thanks to GPU acceleration. If you want to start from scratch, but aren’t sure which tools are best to use, that’s where the head starts come in. This section includes six starting points, each a series of correction tools in a preset order, but without any tweaks yet applied.

Colorista IV

Colorista IV is another tool that’s received a lot of attention in this build. I’ve already mentioned the panel, but something really unique is the built-in Guided Color Correction routine. This is designed to guide novice and even experienced editors and compositors through a series of color correction steps in the right order. Colorista also gained temperature and tint controls, RGB point curves, log support, and LUTs. The addition of integrated LUTs fills a gap, because Red Giant’s separate LUT Buddy tool has been dropped from Suite 13.

Renoiser

The other tools have also gained added features, but let’s not forget the new Magic Bullet Renoiser 1.0. This is designed to give cinematic texture and grain to pristine video and CGI footage. It includes 16 stock presets ranging from 8mm to 35mm. These are labeled based on certain fanciful styles, like “Kung Fu Fighting” or “Classic 35mm”. Renoiser’s settings are completely customizable.

There’s a lot to like in this upgrade, but first and foremost for me was the overall zippier operation, thanks to GPU acceleration. If you use these tools a lot in your daily editing and compositing, then Magic Bullet Suite 13 will definitely be worth the update.

©2017 Oliver Peters

Voyage of Time

Fans of director Terrence Malick adore his unique approach to filmmaking, which is often defined by timeless and painterly cinematic compositions. The good news for moviegoers is that Malick has been in the most prolific period of his directing career. What may be the pinnacle of cinema as poetry is Malick’s recent documentary, Voyage of Time. This is no less than a chronicle of the history of the universe as seen through Malick’s eyes. Even more intriguing is the fact that the film is being released in two versions – a 90 minute feature (Voyage of Time: Life’s Journey), narrated by Cate Blanchett, as well as a 45 minute IMAX version (Voyage of Time: The IMAX Experience), narrated by Brad Pitt.

This period of Malick’s increased output has not only been good for fans, but also for Keith Fraase, co-editor of Voyage of Time. Fraase joined Malick’s filmmaking team during the post of The Tree of Life. Although he had been an experienced editor cutting commercials and shorts, working with Malick was his first time on a full-length feature. Keith Fraase and I recently discussed what it took to bring Voyage of Time to the screen.

Eight years in the making

“I began working with Terry back in 2008 on The Tree of Life,” Fraase says. “Originally, Voyage of Time had been conceived as a companion piece to The Tree of Life, to be released simultaneously. But plans changed and the release of Voyage was delayed. Some of the ideas and thematic elements that were discussed for Voyage ended up as the ‘creation sequence’ in Tree, but reworked to fit the tone and style of that film. Over the years, Voyage became something that Terry and I would edit in between post on his other narrative films. It was our passion project.”

Malick’s cutting rooms are equipped with Avid Media Composer systems connected to Avid shared storage. Typically his films are edited by multiple editors. (Voyage of Time was co-edited by Fraase and Rehman Nizar Ali.) Not only editors, but also researchers, needed access to the footage, so at times, there were as many as eight Media Composer systems used in post. Fraase explains, “There is almost always more than one editor on Terry’s films. At the start of post, we’d divvy up the film by section and work on it until achieving a rough assembly. Then, once the film was assembled in full, each editor would usually trade off sections or scenes, in the hope of achieving some new perspective on the cut. It was always about focusing on experimentation or discovering different approaches to the edit. With Voyage, there was so much footage to work with, some of which Terry had filmed back in the 70s. This was a project he’d had in his mind for decades. In preparation, he traveled all over the world and had amassed years of research on natural phenomena and the locations where he could film them. During filming, the crew would go to locations with particular goals in mind, like capturing mud pots in Iceland or cuttlefish in Palau. But Terry was always on the lookout for the unexpected. Due to this, much of the footage that ended up in the final films was unplanned.”

Cutting Voyage of Time presented an interesting way to tackle narration. Fraase continues, “For Voyage, there were hours and hours of footage to cut with, but we also did a lot of experiments with sound. Originally, there was a 45 page script written for the IMAX version, which was expanded for the full feature. However, this script was more about feelings and tone than outlining specific beats or scenes. It was more poetry than prose, much of which was later repurposed and recorded as voiceover. Terry has a very specific way of working with voiceover. The actors record pages and pages of it. All beautifully written. But we never know what is going to work until it’s recorded, brought into the Avid, and put up against picture. Typically, we’ll edit together sequences of voiceover independent of any footage. Then we move these sequences up and down the timeline until we find a combination of image and voiceover that produces meaning greater than the sum of the parts. Terry’s most interested in the unexpected, the unplanned.”

The art of picture and sound composition

Naturally, when moviegoers think of a Terrence Malick film, imagery comes to mind. Multiple visual effects houses worked on Voyage of Time, under the supervision of Dan Glass (Jupiter Ascending, Cloud Atlas, The Master). Different artists worked on different sections of the film. Fraase explains, “Throughout post production, we sought guidance from scientific specialists whenever we could. They would help us refine certain thematic elements that we knew we wanted into specific, illustratable moments. We’d then bring these ideas to the different VFX shops to expand on them. They mocked up the various ‘previz’ shots that we’d test in our edit – many of which were abandoned along the way. We had to drop so many wonderful images and moments after they’d been painstakingly created, because it was impossible to know what would work best until placed in the edit.”

“For VFX, Terry wanted to rely on practical film elements as much as possible. Even the shots that were largely CGI had to have some foundation in the real. We had an ongoing series of what we called ‘skunkworks shoots’ during the weekends, where the crew would film experiments with elements like smoke, flares, dyes in water and so on. These were all layered into more complex visual effects shots.” Although principal photography was on film, the finished product went through a DI (digital intermediate) finishing process. IMAX visual effects elements were scanned at 11K resolution and the regular live action footage at 8K resolution.

The music score for Voyage of Time was also a subject of much experimentation. Fraase continues, “Terry has an extensive classical music library, which was all loaded into the Avid, so that we could test a variety of pieces against the edit. This started with some obvious choices like [Gustav] Holst’s ‘The Planets’ and [Joseph] Haydn’s ‘The Creation’ for a temp score. But we tried others, like a Keith Jarrett piano piece. Then one of our composers [Hanan Townshend, To The Wonder, Knight of Cups] experimented further by taking some of the classical pieces we’d been using and slowing them way, way down. The sound of stringed instruments being slowed results in an almost drone-like texture. For some of the original compositions, Terry was most interested in melodies and chords that never resolve completely. The idea being that, by never resolving, the music was mimicking creation – constantly struggling and striving for completion. Ultimately a collection of all these techniques was used in the final mix. The idea was that this eclectic approach would provide for a soundtrack that was always changing.”

Voyage of Time is a visual symphony, which is best enjoyed if you sit back and just take it in. Keith Fraase offers this, “Terry has a deep knowledge of art and science and he wanted everyone involved in the project to be fascinated and love it as much as he did. This is Terry’s ode to the earth.”

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters

Final Cut Pro X – Reflecting on Six Years

Some personal musings…

Apple’s Final Cut Pro X has passed its five-year mark – and is now well into its sixth year. Although it’s getting increasing respect from many corners of the professional editing community, there are still many who dismiss it due to its deviation from standard editing software conventions. Like so many other things that are Apple, FCPX tends to be polarizing, with a large cohort of both fanboys and haters.

For me, software is a tool. I’ve been editing since the 70s and have used about 15 different linear and nonlinear systems on billable work during that time. More like 20 if you toss in color correction applications. Even more if you count tools to which I’ve had only cursory exposure (such as in product reviews), but haven’t used on real jobs. All of these tools are a love-hate relationship for me. I have to laugh when folks talk about FCPX bringing back fun to their editing experience. I hope that the projects I work on bring me fun. I don’t really care about the software itself. Software should just get out of the way and let me do my job.

These six years have been a bit of a personal journey with Final Cut Pro X after a number of years with the “classic” version. I’ve been using FCPX since it first came out on commercials, corporate videos, shorts and even an independent feature film. It’s not my primary NLE most of the time, because my clients have largely moved to Adobe Premiere Pro CC and ask me to be compatible with them. My FCPX work tends to be mixed in and around my Premiere Pro editing gigs. For instance, right now I’m simultaneously involved in two large corporate video jobs – one of which I’m cutting in Premiere Pro and the other in Final Cut Pro X. As these things go, it can be frustrating, because you always want some function, tool or effect that’s available in Application A while you’re working in Application B. However, it also provides a perspective on what’s good and bad about each and where real speed advantages exist.

I have to say that even after six years, Final Cut Pro X is still more of a crapshoot than any other editing tool that I’ve used. I love its organizing power and often start a job really liking it. However, the deeper I get into the job – and the larger the library becomes – and the more complex the sequences become – the more bogged down FCPX becomes. It’s also the most inconsistent across various Mac models. I’ve run it on older towers, new MacBook Pros, iMacs and 2013 Mac Pros. Of these experiences, the laptops seem to be the most optimized for FCPX.

Quite frankly, working with the “trash can” Mac Pros, at times I wonder if Apple has lost its mojo. Don’t get me wrong – it’s a sweet machine, but its horsepower leaves me underwhelmed. Given the right upgrades, a 2010 Mac Pro tower is still quite competitive against it. Couple that with intermittent corrupt renders and exports in Adobe applications – due to the D-series AMD GPUs – and one really has to question Apple’s design compromises. On the other hand, working with recent and new MacBook Pros, it seems pretty obvious that this is where Apple’s focus has been. And in fact, that’s where Final Cut really shines. Run a complex project on a MacBook Pro versus an older tower and it’s truly a night-and-day experience. By comparison, the performance with Adobe and Avid on the same range of machines results in a much more graduated performance curve. Best might not be quite as good, but worst isn’t nearly as awful.

A lot is made of new versus old code in these competing applications. The running argument is that FCPX uses a sleek, new codebase, whereas Premiere Pro and Media Composer run on creaky old software. Yet Final Cut has been out publicly for six years, which means development started a few years before that. Hmmm, no longer quite so new. Yet, if you look at the recent changes from 10.2 to 10.3, it seems pretty clear that a lot more was changed than just cosmetics. The truth of the matter is that all three of these major applications are written in a way that modules of software can be added, removed or changed, without the need to start from scratch. Therefore, from a coding standpoint, Final Cut doesn’t have nearly the type of advantages that many think it has.

The big advantage that FCPX does have is that Apple can optimize its performance for the holistic hardware and macOS software architecture of their own machines. As such, performance, render speeds, etc. aren’t strictly tied to only the CPU or the GPU. It’s what enables the new MacBook Pro to offer top-end performance, while still staying locked to 16GB of RAM. It seems to me that this is also why the Core-series processors appear to be better performers than the Xeon-series chips, when it comes to Final Cut, Motion and Compressor.

If you compare this to Premiere Pro, Adobe hits the GPUs much harder than does Apple, which is the reason behind the occasional corruptions on the “trash can” Macs with Adobe renders. If you were running the Adobe suite on a top-level PC with high-end Nvidia cards, performance would definitely shine over that of the Macs. This is largely due to leveraging the CUDA architecture of these Nvidia GPUs. With Apple’s shift to using only AMD and Intel GPUs, CUDA acceleration isn’t available on newer Macs. Under the current software versions of Adobe CC (at the time of this writing) and Sierra, you are tied to OpenCL or software-only rendering and cannot even use Apple’s Metal acceleration. This is a driver issue still being sorted out between Apple and Adobe. Metal is something that Apple tools take advantage of and is a way that they leverage the combined hardware power, without focusing solely on CPU or GPU acceleration.

All of this leads me back to a position of love-hate with any of these tools. I suspect that my attitude is more common than most folks who frequent Internet forum debates want to admit. The fanboy backlash is generally large. When I look at how I work and what gets the results, I usually prefer track-based systems to the FCPX approach. I tend to like Final Cut as a good rough-cut editing application, but less as a fine-cut tool. Maybe that’s just me. That being said, I’ve had plenty of experiences where FCPX quite simply is the better tool under the circumstance. On a recent on-site edit gig at CES, I had to cut some 4K ARRI ALEXA material on my two-year-old Retina MacBook Pro. Premiere Pro couldn’t hack it without stuttering playback, while FCPX was buttery smooth. Thus FCPX was the axe for me throughout this gig.

Likewise, in the PC vs. Mac hardware debates, I may criticize some of Apple’s moves and long to work on a fire-breathing platform. But if push came to shove and I had to buy a new machine today, it would be either a Mac Pro “trash can” or a tricked-out iMac. I don’t do heavy 3D renders or elaborate visual effects – I edit and color correct. Therefore, the overall workflow, performance and “feel” of the Apple ecosystem is a better fit for me, even though at times performance might be middling.

Wrapping up this rambling post – it’s all about personal preference. I applaud Apple for making the changes in Final Cut Pro X that they did; however, a lot of things are still in need of improvement. Hopefully these will get addressed soon. If you are looking to use FCPX professionally, then my suggestion is to stick with only the newest machines and keep your productions small and light. Keep effects and filters to a minimum and you’ll be happiest with the results and the performance. Given the journey thus far, let’s see what the next six years will bring.

©2017 Oliver Peters

Nocturnal Animals

Some feature films are entertaining popcorn flicks, while others challenge the audience to go deeper. Writer/director Tom Ford’s (A Single Man) second film, Nocturnal Animals, definitely fits into the latter group. Right from the start, the audience is confronted with a startling and memorable main title sequence, which we soon learn is actually part of an avant garde art gallery opening. From there the audience never quite knows what’s around the next corner.

Susan Morrow (Amy Adams) is a privileged Los Angeles art gallery owner who seems to have it all, but whose life is completely unfulfilled. One night she receives an unsolicited manuscript from Edward Sheffield (Jake Gyllenhaal), her ex-husband with whom she’s been out of touch for years. With her current husband (Armie Hammer) away on business, she settles in for the night to read the novel. She is surprised to discover it is dedicated to her. The story being told by Edward is devastating and violent, and it triggers something in Susan that arouses memories of her past love with the author.

Nocturnal Animals keeps the audience on edge and is told through three parallel storylines – Susan’s current reality, flashbacks of her past with Edward, and the events that are unfolding in the novel. Managing this delicate balancing act fell to Joan Sobel, ACE, the film’s editor. In her film career, Sobel has worked with such illustrious directors as Quentin Tarantino, Billy Bob Thornton, Paul Thomas Anderson and Paul Weitz. She was Sally Menke’s first assistant editor for six-and-a-half years on four films, including Kill Bill: Vol. 1 and Kill Bill: Vol. 2. Sobel also edited the Oscar-winning short dark comedy, The Accountant. This is her second feature with Tom Ford at the helm.

Theme and structure

In our recent conversation, Joan Sobel discussed Nocturnal Animals. She says, “At its core, this film is about love and revenge and regret, with art right in the middle of it all. It’s about people we have loved and then carelessly discarded, about the cruelties that we inflict upon each other, often out of fear or ambition or our own selfishness. It is also about art and the stuff of dreams. Susan has criticized Edward’s ambition as a writer. Edward gets Susan to feel again through his art – through that very same writing that Susan has criticized in the past. But art is also Edward’s vehicle for revenge – revenge for the hurt that Susan has caused him during their past relationship. The film uses a three-pronged story structure, which was largely as Tom scripted it. The key was to find a fluid and creative way to transition from one storyline to the other, to link those moments emotionally or visually or both. Sometimes that transition was triggered by a movement, but other times just a look, a sound, a color or an actor’s nuanced glance.”

Nocturnal Animals was filmed (yes, on film not digital) over 31 days in California, with the Mojave Desert standing in for west Texas. Sobel was cutting while the film was being shot and turned in her editor’s cut about a week after the production wrapped. She explains, “Tom likes to work without a large editorial infrastructure, so it was just the two of us working towards a locked cut. I finished my cut in December and then we relocated to London for the rest of post. I always put together a very polished first cut, so that there is already an established rhythm and a flow. That way the director has a solid place to begin the journey. Though the movie was complex with its three-pronged structure – along with the challenge of bringing to life the inner monologue that is playing in Susan’s head – the movie came together rather quickly. Tom’s script was so well written and the performances so wonderful that by the end of March we pretty much had a locked cut.”

The actors provided fruitful ground for the editor. Sobel continues, “It was a joy to edit Amy Adams’ performance. She’s a great actress, but when you actually edit her dailies, you get to see what she brings to the movie. Her performance relies less on dialogue – she actually doesn’t have many lines – and instead showcases her brilliance as a film actor, conveying emotion through her face and her eyes.”

“Tom is a staggering talent, and working with him is a total joy. He’s fearless and his creativity is boundless. He is also incredibly generous and very, very funny (we laugh a lot!), and we share an endless passion for movies. Though the movie is always his vision, his writing, he gravitates towards collaboration. So we would get quite experimental in the cut. The trust and charm and sharp, clear intelligence that he brings into the cutting room resulted in a movie that literally blossoms with creativity. Editing Nocturnal Animals was a totally thrilling experience.”

Tools of the trade

Sobel edited Nocturnal Animals with Avid Media Composer. Although she’s used other editing applications, Media Composer is her tool of choice. I asked about how she approaches each new film project. She explains, “The first thing I do is read the script. Then I’ll read it again, but this time out loud. The rhythms of the script become more lucid that way and I can conceptualize the visuals. When I get dailies for a scene, I start by watching everything and taking copious notes about every nuance in an actor’s performance that triggers an emotion in me, that excites me, that moves me, that shows me exactly where this scene is going. Those moments can be the slightest look, a gesture, a line reading.”

“I like to edit very organically based on the footage. I know some editors use scene cards on a wall or they rely on Avid’s Script Integration tools, but none of those approaches are for me. Editing is like painting – it’s intuitive. My assistants organize bins for themselves in dailies order. Then they organize my bins in scene/script order. I do not make selects sequences or ‘KEM rolls’. I simply set up the bins in frame view and then rearrange the order of clips according to the flow – wide to tight and so on. As I edit, I’m concentrating on performance and balance. One little trick I use is to turn off the sound and watch the edit to see what is rhythmically and emotionally working. Often, as I’m cutting the scene, I find myself actually laughing with the actor or crying or gasping! Though this is pretty embarrassing if someone happens to walk into my cutting room, I know that if I’m not feeling it, then the audience won’t either.”

Music and sound are integral for many editors, especially Sobel. She comments, “I love to put temp music into my editor’s cuts. That’s a double-edged sword, though, because the music may or may not be to the taste of the director. Though Tom and I are usually in sync when it comes to music, Tom doesn’t like to start off with temp music in the initial cut, so I didn’t add it on this film. Once Tom and I started working together, we played with music to see what worked. This is a movie in which we actually used very little music, and when we did, we added it quite sparingly. Mostly the temp music we used came from some of Abel’s [Korzeniowski, composer] other film scores. I also always add layers of sound effects to my tracks to take the movie and the storytelling to a further level. I use sound to pull your attention, to define a character, or a mood, or elevate a mystery.”

Unlike many films, Nocturnal Animals flew through the post process without any official test screenings. Its first real screening was at the Venice Film Festival where it won the Silver Lion Grand Jury Prize. “Tom has the unique ability to both excite those working with him and to effortlessly convey his vision, and he had total confidence in the film. The film is rich with many layers and is the rare film that can reveal itself through subsequent viewings, hopefully providing the audience with that unique experience of being completely immersed in a novel, as our heroine becomes immersed in Nocturnal Animals,” Sobel says. The film opened in the US during November and is a Focus Features release.

Check out more with Joan Sobel at “Art of the Cut”.

Originally written for Digital Video magazine / Creative Planet Network.

©2017 Oliver Peters

AJA T-Tap

The Thunderbolt protocol has ushered in a new era of easy connectivity for hardware peripherals. It allows users to deploy a single connection type to tie in networking, external storage, monitoring and broadcast audio and video input and output. Along with easy connections, it has also enabled peripheral devices to become smaller, lighter and more powerful, thanks in part to advances in both hardware and software. AJA Video Systems is one of the popular video manufacturers that has taken advantage of these benefits.

In many modern editing environments, the actual editing system has become extremely streamlined. All it really takes is a Thunderbolt-enabled laptop, all-in-one (like an iMac) or desktop computer, fast external storage, and professional monitoring – and you are good to go. For many editors, live video output is strictly for monitoring, as deliverables are more often than not files and not tape. Professional monitoring is easy to achieve using SDI or HDMI connections. Any concern for analog is gone, unless you need to maintain analog audio monitoring. AJA makes a series of I/O products to address these various needs, ranging from full options down to simple monitoring devices. Blackmagic Design and AJA currently produce the lion’s share of these types of products, including PCIe cards for legacy installations and Thunderbolt devices for newer systems.

I recently tested the AJA T-Tap, which is a palm-sized video output device that connects to the computer using the Thunderbolt 2 protocol. It is bus-powered – meaning that no external power supply or “wall-wart” is needed to run it. I tested this on both a 2013 Mac Pro and a 2015 MacBook Pro. In each case, my main need was SDI and/or HDMI out of the unit to external monitors. Installation couldn’t be easier. Simply download the current control panel software and drivers from AJA’s website, install, and then connect the T-Tap. Hook up your monitors and you are ready. There’s very little else to do, except set your control panel configuration for the correct video/frame rate standard. Everything else is automatic in both Adobe Premiere Pro CC and Apple Final Cut Pro X, although you’ll want to check your preference settings to make sure the device is detected and enabled.

One of the main reasons I wanted to test the T-Tap was as a direct comparison with the Blackmagic products on these same computers. For example, the current output device being used on the 2013 Mac Pro that I tested is a Blackmagic Design UltraStudio Express. This contains a bit more processing and is comparable to AJA’s Io XT. I also tested the BMD MiniMonitor, which is a direct competitor to the T-Tap. The UltraStudio provides both input and output and offers an analog break-out cable harness, whereas the two smaller units are output-only via SDI and HDMI. All three are bus-powered. In general, all performed well with Premiere Pro, except that the BMD MiniMonitor couldn’t provide output via HDMI. For unexplained reasons, that screen was blank. No such problem with either the T-Tap or the UltraStudio Express.

The real differences are with Final Cut Pro X on the Mac Pro. That computer has six Thunderbolt ports, which are shared across three buses – i.e. two connectors per bus. On the test machine, one bus feeds the two external displays, the second bus connects to external storage (not shared for maximum throughput), and the remaining bus connects to both the output device and a CalDigit dock. If the BMD UltraStudio Express is plugged into any connection shared with another peripheral, JKL high-speed playback and scrubbing in FCPX is useless. Not only does the video output stutter and freeze, but so does the image in the application’s viewer. So you end up wasting an available Thunderbolt port on the machine, if you want to use that device with FCPX. Therefore, using the UltraStudio with FCPX on this machine isn’t really functional, except for screening with a client. This means I end up disabling the device most of the time I use FCPX. In that respect, both the AJA T-Tap and the BMD MiniMonitor performed well. However, my subjective evaluation is that the T-Tap gave better performance in my critical JKL scrubbing test.

One difference that might not be a factor for most is that the UltraStudio Express (which costs a bit more) has advanced processing. This yields a smooth image in pause when working with progressive and PsF media. When my sequence was stopped in either FCPX or Premiere, both the T-Tap and the UltraStudio yielded a full-resolution, whole-frame image on the HDMI output. (HDMI didn’t appear to function on the MiniMonitor.) On the TV Logic broadcast display that was being fed via SDI, the T-Tap and MiniMonitor only displayed a field in pause, so you get an image with “jaggies”. The UltraStudio Express generates a whole frame for a smooth image in pause. I didn’t test a unit like AJA’s Io XT, so I’m not sure if the more expensive AJA model offers similar processing. However, it should be noted that the Io XT is triple the cost of the UltraStudio Express.

The elephant in the room, of course, is Blackmagic Design DaVinci Resolve. That application is restricted to work only with Blackmagic’s own hardware devices. If you want to run Resolve – and you want professional monitoring out of it – then you can’t use any AJA product with it. However, these units are so inexpensive to begin with – compared with what they used to cost – that it’s realistic to own both. In fact, some FCPX editors use a T-Tap while editing in FCPX and then switch over to a MiniMonitor or UltraStudio for Resolve work. The reason is the better performance between Final Cut and the AJA products.

Ultimately these are all wonderful devices. I like the robustness of AJA’s manufacturing and software tools. I’ve used their products over the years and never been disappointed with performance or service if needed. If you don’t need video output from Resolve, then the AJA T-Tap is a great choice for an inexpensive, simple, Thunderbolt video output solution. Laptop users who need to hook up to monitors while working at home or away will find it a great choice. Toss it into your laptop bag and you are ready to rock.

©2017 Oliver Peters

BorisFX BCC 10

Boris Continuum Complete (BCC) by BorisFX is the epitome of the term “Swiss Army knife” when it comes to talking about plug-ins. Most editors will pick this package over others, if they can only have one toolkit to cover a diverse range of picture enhancements. In the past year, BorisFX has upgraded this toolkit with new effects, expanded to add more NLE hosts, and integrated mocha’s Academy Award-winning planar tracking technology after the acquisition of Imagineer Systems. This set of plug-ins is now up to version BCC10. BorisFX has not only added new effects to BCC10, but also expanded its licensing options to include multi-host and subscription options.

Since many users now work with several NLEs, multi-host licensing makes a lot of sense. One purchase with a single serial number covers the installation for each of the various applications. There are two multi-host license versions: one for Avid/Adobe/Apple/OFX and the second that doesn’t include Avid. OFX licensing covers the installation for Blackmagic Design DaVinci Resolve, as well as Sony Vegas Pro for PC users.

What’s new in BCC10

Boris Continuum Complete version 10 includes over 230 effects within 16 different categories, like 3D Objects, Art Looks, Particles, Perspective and more. Each effect comes with numerous presets for a total of over 2,500 presets in all. There are plenty of new tools in BCC10, but the biggest news is that each effect filter integrates mocha planar tracking. BorisFX has always included Pixel Chooser as a way of masking objects. Now each filter also lets you launch the mocha interface right from inside the plug-in’s effect control panel. For example, if you are applying skin smoothing to only your talent’s forehead using the new BCC Beauty Studio, simply launch mocha, create a mask for the forehead and track the talent’s movement within the shot. The mask and track are saved within the plug-in, so you can instantly see the results.

A second big change is the addition and integration of the FX Browser. Each plug-in effect lets you launch the FX Browser interface to display how each of the various presets for that effect would look when applied to the selected clip. You can preview the whole clip, not just a thumbnail. FX Browser is also a standalone effect that can be applied to the clip. When you use it that way, then all presets for all filters can be previewed. While FX Browser has been implemented in past versions in some of the hosts, this is the first time that it’s become an integrated part of the BCC package across all NLEs.

BCC10 includes two new “studio” tools, as well as a number of new individual effects. BCC Beauty Studio is a set of tools in a single filter targeted at image retouching, especially the skin texture of talent. Photographers retouch “glamor” shots to reduce or remove blemishes, so Photoshop-style retouching is almost expected these days. This is the digital video equivalent. As with most skin smoothing filters, BCC Beauty Studio uses skin keying algorithms to isolate skin colors. It then blurs skin texture, but also lets the editor adjust contrast, color correction, and even add a subtle glow to image highlights. Of course, as I mentioned above, mocha masking and tracking is integrated for the ultimate control in where and how the effect is applied.

The second new, complex filter is BCC Title Studio. This is an integrated 3D titling tool that can be used based on templates within the effects browser or by launching the separate Title Studio interface. Editors familiar with BorisFX products will recognize this titling interface as essentially Boris RED right inside of their NLE. Not only can you create titles, but also more advanced motion graphics. You can even import objects, EPS and image files for 3D effects, including the addition of materials and shading. As with other BorisFX titling tools, you can animate text on and off the screen.

In addition to these two large plug-ins, BCC10 also gained nine new filters and transitions. These include BCC Remover (fills in missing pixels or removes objects using cloning) and BCC Drop-out Fixer (restores damaged footage). For the folks who have to deal with a lot of 4×3 content and vertical cell phone footage, there’s BCC Reframer. Unlike the usual approach where the same image is stretched and blurred behind the vertical shot, this filter includes options to stylize the foreground and background.

The trend these days is to embrace image “defects” as a creative effect, so two of the new filters are BCC Light Leaks and BCC Video Glitch. Each adds organic, distressed effects, like in-camera light contamination and corrupted digital video artifacts. To go along with this, there are also four new transitions, including a BCC Light Leaks Dissolve, Cross Glitch, Cross Zoom and Cross Melt. Of these, the light leaks, glitch and zoom transitions are about what you’d expect from the name; however, the melt transition seems rather unique. In addition to the underlying dissolve between two images, there are a variety of effects options that can be applied as part of this transition. Many of these are glass, plastic, prism or streak effects, which add an interesting twist to this style of transition.

In use

The new BCC10 package works within the established hosts much like it always has, so no surprises there. The Boris Continuum Complete package used to come bundled with Avid Media Composer, but unfortunately that’s no longer the case. Avid editors who want the full BCC set have to purchase it. As with most plug-ins, After Effects is generally the best host when adjustment and manipulation of effects are required.

A new NLE to consider is DaVinci Resolve. Many are testing the waters to see if Resolve could become their NLE of choice. Blackmagic Design introduced Resolve 12.5 with even more focus on its editing toolset, including new, built-in effect filters and transitions. In my testing, BCC10 works reasonably well with Resolve 12.5 once you get used to where the effects are. Resolve uses a modal design with editing and color correction split into separate modes or pages. BCC10 transition effects only show up in the OFX library of the edit page. For filter effects, which are applied to the whole clip, you have to go to the color page. During the color correction process you may add any filter effect, but it has to be applied to a node. If you apply more than one filter, you have to add a new node for each filter. With the initial release of BCC10, mocha did not work within Resolve. If you tried to launch it, a message came up that this functionality would be added at a later time. In May, BorisFX released BCC10.2, which included mocha for both Resolve 12.5 and Vegas Pro. To use the BCC10 effects with Resolve 12.5 you need the paid Studio version and not the free version of Resolve.

BorisFX BCC10 is definitely a solid update, with new features, mocha integration and better GPU-based performance. It runs best in After Effects CC, Premiere Pro CC and Avid Media Composer. The built-in effects tools are pretty good in After Effects, Final Cut Pro X and Resolve 12.5 – meaning you might get by without needing what BCC10 has to offer. On the other hand, they are unfortunately very mediocre in Premiere Pro and Media Composer. If one of those is your editing axe, then BCC10 becomes an essential purchase if you want to improve the capabilities of your editing application. Regardless of which tool you use, BCC10 will give you more options to stretch your creativity.

On a related note, at IBC 2016 in Amsterdam, BorisFX announced the acquisition of GenArts. This means that the Sapphire effects are now housed under the BorisFX umbrella, which could make for some interesting bundling options in the future. As with their integration of mocha tracking into the BCC effects, future versions of BCC and/or Sapphire might also see a sharing of compatible technologies across these two effects families. Stay tuned.

Originally written for Digital Video magazine / Creative Planet Network

©2016 Oliver Peters