Digital Anarchy Samurai Sharpen


Editors often face the dilemma of dealing with less-than-perfect footage. Focus is a common culprit – you have the ideal shot, but the operator missed optimal focus, leaving a usable, albeit soft, image. Editing and compositing apps offer a number of built-in and third-party sharpen and unsharp mask filters that can be employed as a fix. While you can’t truly fix a focus problem, you can sharpen the image so that the viewer perceives it as better focused. All of these filters work on the concept of localized contrast: any dark-to-light edge transition within the image is enhanced and contrast in that area is increased. The dark side of the edge is darkened and the bright side is brightened. This creates a halo effect, which becomes quite visible as you increase the amount of sharpening – and downright obnoxious when you push the amount to its full range. A little bit improves the image – a lot creates an electric, stylized effect.
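The localized-contrast idea described here is essentially an unsharp mask: subtract a blurred copy of the image from the original to isolate edge detail, then add that detail back, scaled by the amount. A minimal NumPy sketch (using a simple box blur as a stand-in for whatever blur kernel a given plug-in actually uses):

```python
import numpy as np

def box_blur(img, radius):
    # simple separable box blur, standing in for the Gaussian most filters use
    k = 2 * int(radius) + 1
    kernel = np.ones(k) / k
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, out)

def unsharp_mask(img, amount=1.0, radius=2.0):
    """Localized-contrast sharpening: the difference between the image
    and a blurred copy isolates edge transitions; adding it back,
    scaled by 'amount', darkens the dark side of each edge and
    brightens the light side."""
    detail = img - box_blur(img, radius)
    return np.clip(img + amount * detail, 0.0, 1.0)

# A soft edge: values overshoot on both sides, creating the halo
step = np.full((32, 32), 0.3)
step[:, 16:] = 0.7
out = unsharp_mask(step, amount=1.0, radius=2.0)
```

With a modest amount the overshoot is subtle; push the amount toward its full range and the halo becomes the electric, stylized look described above.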

One of the better sharpening filters on the market is Digital Anarchy’s Samurai Sharpen, which is available for Apple Final Cut Pro X, Adobe Premiere Pro CC and After Effects CC. (According to the company’s website, Avid and OpenFX plug-ins are in development and coming soon.) What makes Samurai Sharpen different is that it includes sophisticated masking to restrict which parts of the image are sharpened. For example, on a facial close-up, you can enhance the sharpness of the eyes without also exaggerating skin texture to an unflattering degree. Yet, you still have plenty of control to push the image into a “look”. For example, the photographic trend these days seems to be an obviously over-sharpened look for dramatic appeal. Whether you want subtle or stylized, both are achievable with Samurai Sharpen.

Click any of the example images to see an enlarged view. In these comparisons, pay attention to not only the eyes, but also lips and strands of hair, as these are also affected by sharpening. (Image courtesy of Blackmagic Design.)

The effect controls are divided into three groups – Sharpen, Mask and Blend. The top three sharpen controls are similar to most other filters. Amount is self-explanatory, radius adjusts the size of the localized contrast halo, and edge mask strength controls the mask that determines what is or isn’t sharpened. The edge mask strength range markings might seem counter-intuitive, though. All the way to the left (0) means that you haven’t increased the mask strength; therefore, more of the image is being sharpened. In our facial close-up example, more texture (like the skin) and noise (background) would be sharpened. If you crank the slider all the way to the right (50), you have increased the mask strength, so less of the image is being sharpened. For the face, this means the eyes and eyelashes are sharpened, but the skin stays smooth. The handy “show sharpening” toggle renders a quick high-contrast image (mask) of the area being sharpened.
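As a rough illustration of what an edge mask strength control might be doing under the hood (a hypothetical sketch, not Digital Anarchy’s actual algorithm), a gradient-magnitude mask can be confined to strong edges by raising it to a power, so that faint texture and noise fall out of the mask as strength increases:

```python
import numpy as np

def edge_mask(img, strength):
    """Hypothetical edge mask: gradient magnitude, normalized to 0-1,
    then raised to a power so that a higher 'strength' confines the
    mask to strong edges and drops faint texture and noise out of it."""
    gy, gx = np.gradient(img)
    mag = np.hypot(gx, gy)
    if mag.max() > 0:
        mag = mag / mag.max()
    return mag ** (1.0 + strength)

# A frame with one hard edge plus faint background texture:
rng = np.random.default_rng(0)
img = np.full((8, 16), 0.4)
img[:, 8:] = 0.9                              # hard edge at column 8
img += 0.01 * rng.standard_normal(img.shape)  # faint texture/noise

weak = edge_mask(img, strength=0.0)    # slider at 0: texture included
tight = edge_mask(img, strength=6.0)   # slider up: only the hard edge
```

At strength 0 the whole frame contributes to the mask; at a high strength only the hard edge survives, mirroring the slider behavior described above.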

The real power of Samurai Sharpen is in the Mask Group. You have two controls each for shadows and highlights, as well as an on/off toggle to enable shadow and/or highlight masking. These four sliders function like a curves control, enabling you to broaden or restrict the range of dark or light portions of the image that will be affected by the sharpening. Enabling and adjusting the shadow mask controls lets you exclude darker background portions of the image from being sharpened. You don’t want these areas sharpened, because that would result in a noisier appearance. The mask can also be blurred in order to feather the fall-off between sharpened and unprocessed portions of the image. Finally, there’s a layer mask control in this group, which shows up a bit differently between the Adobe apps and FCPX. Essentially it allows you to use another source to define your sharpening mask.
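A hedged sketch of how a curves-like shadow/highlight range mask with a feathered fall-off could work (the cutoff and feather names here are invented for illustration and are not Samurai Sharpen’s actual parameters):

```python
import numpy as np

def luma_range_mask(luma, shadow_cutoff=0.15, highlight_cutoff=0.85, feather=0.05):
    """Hypothetical shadow/highlight mask: 1.0 where pixels sit between
    the cutoffs, rolling smoothly to 0.0 below the shadow cutoff and
    above the highlight cutoff. 'feather' widens the transition, like
    the mask blur control described above."""
    ramp_up = np.clip((luma - (shadow_cutoff - feather)) / (2 * feather), 0, 1)
    ramp_down = np.clip(((highlight_cutoff + feather) - luma) / (2 * feather), 0, 1)
    return ramp_up * ramp_down

luma = np.linspace(0, 1, 11)  # 0.0, 0.1, ..., 1.0
mask = luma_range_mask(luma)
# dark background pixels (luma near 0) and blown highlights (near 1)
# are excluded; midtones pass through at full strength
```

Multiplying this mask against the sharpening detail layer is what keeps noisy shadows and clipped highlights untouched while the midtones are enhanced.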

The last section is the Blend Group. This offers slider adjustments for the opacity of the shadow and highlight masks created in the Mask Group section. GPU acceleration results in an effect that is quick to apply and adjust, along with good playback performance.

While there are many free sharpening tools on the market, Digital Anarchy’s Samurai Sharpen is worth the extra cost for the quality and control it offers. Along with Beauty Box and Flicker Free, Digital Anarchy offers a nice repertoire of image enhancement tools.

©2017 Oliver Peters

Voyage of Time

Fans of director Terrence Malick adore his unique approach to filmmaking, which is often defined by timeless and painterly cinematic compositions. The good news for moviegoers is that Malick has been in the most prolific period of his directing career. What may be the pinnacle of cinema as poetry is Malick’s recent documentary, Voyage of Time. This is no less than a chronicle of the history of the universe as seen through Malick’s eyes. Even more intriguing is the fact that the film is being released in two versions – a 90-minute feature (Voyage of Time: Life’s Journey), narrated by Cate Blanchett, as well as a 45-minute IMAX version (Voyage of Time: The IMAX Experience), narrated by Brad Pitt.

This period of Malick’s increased output has not only been good for fans, but also for Keith Fraase, co-editor of Voyage of Time. Fraase joined Malick’s filmmaking team during post on The Tree of Life. Although he was an experienced editor who had cut commercials and shorts, this was his first time working on a full-length feature. Keith Fraase and I recently discussed what it took to bring Voyage of Time to the screen.

Eight years in the making

“I began working with Terry back in 2008 on The Tree of Life,” Fraase says. “Originally, Voyage of Time had been conceived as a companion piece to The Tree of Life, to be released simultaneously. But plans changed and the release of Voyage was delayed. Some of the ideas and thematic elements that were discussed for Voyage ended up as the ‘creation sequence’ in Tree, but reworked to fit the tone and style of that film. Over the years, Voyage became something that Terry and I would edit in between post on his other narrative films. It was our passion project.”

Malick’s cutting rooms are equipped with Avid Media Composer systems connected to Avid shared storage. Typically his films are edited by multiple editors. (Voyage of Time was co-edited by Fraase and Rehman Nizar Ali.) Not only editors, but also researchers, needed access to the footage, so at times there were as many as eight Media Composer systems in use during post. Fraase explains, “There is almost always more than one editor on Terry’s films. At the start of post, we’d divvy up the film by section and work on it until achieving a rough assembly. Then, once the film was assembled in full, each editor would usually trade off sections or scenes, in the hope of gaining a new perspective on the cut. It was always about focusing on experimentation or discovering different approaches to the edit. With Voyage, there was so much footage to work with, some of which Terry had filmed back in the 70s. This was a project he’d had in his mind for decades. In preparation, he traveled all over the world and had amassed years of research on natural phenomena and the locations where he could film them. During filming, the crew would go to locations with particular goals in mind, like capturing mud pots in Iceland or cuttlefish in Palau. But Terry was always on the lookout for the unexpected. Because of this, much of the footage that ended up in the final films was unplanned.”

Cutting Voyage of Time presented an interesting way to tackle narration. Fraase continues, “For Voyage, there were hours and hours of footage to cut with, but we also did a lot of experiments with sound. Originally, there was a 45 page script written for the IMAX version, which was expanded for the full feature. However, this script was more about feelings and tone than outlining specific beats or scenes. It was more poetry than prose, much of which was later repurposed and recorded as voiceover. Terry has a very specific way of working with voiceover. The actors record pages and pages of it. All beautifully written. But we never know what is going to work until it’s recorded, brought into the Avid, and put up against picture. Typically, we’ll edit together sequences of voiceover independent of any footage. Then we move these sequences up and down the timeline until we find a combination of image and voiceover that produces meaning greater than the sum of the parts. Terry’s most interested in the unexpected, the unplanned.”

The art of picture and sound composition

Naturally, when moviegoers think of a Terrence Malick film, imagery comes to mind. Multiple visual effects houses worked on Voyage of Time, under the supervision of Dan Glass (Jupiter Ascending, Cloud Atlas, The Master). Different artists worked on different sections of the film. Fraase explains, “Throughout post production, we sought guidance from scientific specialists whenever we could. They would help us translate certain thematic elements that we knew we wanted into specific, illustratable moments. We’d then bring these ideas to the different VFX shops to expand on them. They mocked up the various ‘previz’ shots that we’d test in our edit – many of which were abandoned along the way. We had to drop so many wonderful images and moments after they’d been painstakingly created, because it was impossible to know what would work best until placed in the edit.”

“For VFX, Terry wanted to rely on practical film elements as much as possible. Even the shots that were largely CGI had to have some foundation in the real. We had an ongoing series of what we called ‘skunkworks shoots’ during the weekends, where the crew would film experiments with elements like smoke, flares, dyes in water and so on. These were all layered into more complex visual effects shots.” Although principal photography was on film, the finished product went through a DI (digital intermediate) finishing process. IMAX visual effects elements were scanned at 11K resolution and the regular live action footage at 8K resolution.

The music score for Voyage of Time was also a subject of much experimentation. Fraase continues, “Terry has an extensive classical music library, which was all loaded into the Avid, so that we could test a variety of pieces against the edit. This started with some obvious choices like [Gustav] Holst’s ‘The Planets’ and [Joseph] Haydn’s ‘The Creation’ for a temp score. But we tried others, like a Keith Jarrett piano piece. Then one of our composers [Hanan Townshend, To The Wonder, Knight of Cups] experimented further by taking some of the classical pieces we’d been using and slowing them way, way down. The sound of stringed instruments being slowed results in an almost drone-like texture. For some of the original compositions, Terry was most interested in melodies and chords that never resolve completely. The idea being that, by never resolving, the music was mimicking creation – constantly struggling and striving for completion. Ultimately a collection of all these techniques was used in the final mix. The idea was that this eclectic approach would provide for a soundtrack that was always changing.”

Voyage of Time is a visual symphony, which is best enjoyed if you sit back and just take it in. Keith Fraase offers this: “Terry has a deep knowledge of art and science and he wanted everyone involved in the project to be fascinated and to love it as much as he does. This is Terry’s ode to the earth.”

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters

Sonicfire Pro 6

Most editors have a pretty innate sense of rhythm, yet finding and tailoring the right music for your video often poses a challenge for even the most talented cutter. SmartSound has provided a solution to this dilemma for many years. Last year they updated their Sonicfire Pro scoring software to version 6. This update adds interesting new features and support for today’s crop of NLEs.

The starting point is SmartSound’s library of original music. You buy the tracks you like once, which includes easy licensing, and then tailor the song for the length needed, for as many productions as required. SmartSound’s offerings cover a wide range of genres, all of which have been quantized into beat blocks that the Sonicfire Pro application automatically uses for timing adjustments. While this might sound like all the music would need to be synthetically generated – it isn’t. These tracks are played by humans with real instruments, so if you want rock, electronic, symphonic, etc. – you’ve got it. Many selections have been mood-mapped – SmartSound’s term to identify music cues that are multi-layered with up to nine instrument layers. If you like the track, but want to lose the drums or lower the lead instrument’s volume within the mix, simply turn off that layer or adjust its volume envelope. Both multi-layer and single-layer tracks can all be adjusted for time within Sonicfire Pro.

Sonicfire Pro 6 brings with it a modern interface

The new Sonicfire Pro 6 application is a welcome update. It’s more streamlined than version 5, with a clean, modern interface. This excellent mini-tutorial by Larry Jordan will give you a quick overview of how it works.

From within the application, you have immediate access to all of your owned titles, as well as any other SmartSound selection (when you are online). If you don’t already own it, find something from SmartSound that you like, then buy and download it right from within Sonicfire Pro 6. In the upper browser pane, search for specific tracks, albums and styles, or sort by tempo or intensity.

Naturally, Sonicfire Pro 6 supports video, since it’s intended to make scoring music to picture user-friendly. To add a video clip, show the video window and select “Add Video” from its pulldown. You can also resize the Sonicfire Pro 6 interface (it references your monitor size automatically); at a large enough size, it allows you to have both the Video window and either the Inspector or Markers window open simultaneously, so you can reference your video while making adjustments in these panels. The video will then run in sync with your timeline. You can also import audio from a video file, if you want to do the whole mix in Sonicfire – much like a traditional DAW. In addition, you can export tracks, full mixes and/or complete audio/video files with completed mixes. However, all of this is optional, as you can run SFP6 as an audio-only tool without ever involving video, should you decide to work that way.

When you initially pick a track, three settings will get you started. The first is duration. Enter the desired duration and Sonicfire Pro will change the song structure to fit the length. It does this without simply repeating the same loop. Next, pick your variation. Each track has a set of variations, which are different arrangements of the same song. Finally, for mood-mapped (multi-layer) tracks, make a mood selection. Moods are different instrument arrangements within the song, ranging from a full mix to various combinations of dominant instruments used for that song. In addition, an Advanced tab offers further options, including adjusting the mix of multi-layer tracks and shifting the tempo. A really cool search function is “Tap”. Simply tap out the beats by clicking the Tap button a few times and Sonicfire Pro will sort the library selections based on the tempo you tapped out.
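Conceptually, fitting a song to an arbitrary duration from quantized blocks is a small search problem: pick a sequence of interchangeable sections whose lengths sum closest to the target, rather than looping one section. The section names and lengths below are invented for illustration; SmartSound’s actual beat block logic is proprietary:

```python
import itertools

# Hypothetical song sections, each a whole number of beats long (seconds)
SECTIONS = {"intro": 8.0, "verse": 12.0, "bridge": 6.0, "ending": 10.0}

def best_arrangement(target_seconds, max_sections=6):
    """Brute-force search over short arrangements that start with an
    intro and finish with an ending, minimizing duration error."""
    middles = [n for n in SECTIONS if n not in ("intro", "ending")]
    best, best_err = None, float("inf")
    for n in range(max_sections - 1):
        for combo in itertools.product(middles, repeat=n):
            arr = ("intro",) + combo + ("ending",)
            total = sum(SECTIONS[s] for s in arr)
            err = abs(total - target_seconds)
            if err < best_err:
                best, best_err = arr, err
    return best, sum(SECTIONS[s] for s in best)

arrangement, length = best_arrangement(48.0)
```

Dragging the clip edge in the timeline effectively re-runs this kind of search with a new target, which is why the arrangement changes rather than simply truncating.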

Working in the timeline

Once you’ve auditioned and (optionally) adjusted the duration, variation and mood, drag and drop the selection to the timeline at the bottom of the application window. If you need the track to be longer or shorter, just drag the edge of the clip to the desired length and Sonicfire Pro will automatically change the arrangement as needed, based on SmartSound’s proprietary beat block structure. Additional selections can be dragged to the timeline, so it’s easy to score an entire video using multiple track selections. Each song dragged to the timeline creates its own new track, which enables you to make volume, length and mood adjustments to a song without affecting surrounding selections on the timeline.

Within the inspector you have additional controls, including the fade in and fade out handles for a clip, along with a new timing control feature. This was introduced in SFP5, but improved in version 6. As of this writing, SmartSound has updated 110 albums – over 1,100 tracks – for this feature and adds new albums regularly. For the tracks that have been updated, when you enable timing control, several markers appear on the clip in the timeline. These markers can be dragged to better align song changes with key points in your video. When you drag a marker, SFP6 automatically shuffles the arrangement of that song. For example, if you want a big ending to land at a better point for your video cut, sliding the marker will make this happen. In actual practice during my testing, this was a bit of trial and error. In one case, a change made too close to the end left me with an incomplete ending; I needed to slide the track length a tad longer for SFP6 to come up with a good-sounding ending. But this feature is designed to enable experimentation to produce a custom score, so don’t be afraid to play with it.

Finally, as part of its integration with NLEs, Sonicfire offers a new feature called “Cut-Video-to-Music”. Final Cut Pro X, Premiere Pro CC, Avid Media Composer and Vegas Pro are all supported. This new feature lets you export a track along with a corresponding XML file, which in turn is imported into the designated project. Inside the NLE, the track shows up with markers identifying your choice of either beats, strong beats only, SmartSound blocks or music sections, making it easy to edit picture cuts accordingly.
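The exported marker list ultimately reduces to beat positions derived from the track’s tempo. A simplified sketch of computing beat and strong-beat timestamps (the real export also carries SmartSound block and section boundaries, which aren’t modeled here):

```python
def beat_markers(bpm, duration_seconds, strong_every=4):
    """Return (time, is_strong) pairs for each beat in the track;
    every 'strong_every'-th beat is flagged as a strong beat (downbeat)."""
    beat_len = 60.0 / bpm
    markers = []
    t, i = 0.0, 0
    while t < duration_seconds:
        markers.append((round(t, 3), i % strong_every == 0))
        i += 1
        t = i * beat_len
    return markers

marks = beat_markers(bpm=120, duration_seconds=4.0)
# 120 bpm -> a beat every 0.5 s; beats 0 and 4 are strong downbeats
```

Cutting picture to the strong-beat markers is what makes edits feel locked to the music.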

Conclusion

Make sure you are running the most recent version after you initially install the software. I did run into some minor issues with the initial 6.0.0 version, which were fixed with the 6.0.3 update. Updates may be downloaded from the SmartSound website. Overall, SmartSound’s Sonicfire Pro 6 is a welcome refresh to a wonderful tool. To my knowledge, no other software developer offers anything to match it. Adobe briefly tried with its custom music features inside Soundbooth, but then dropped this function after a couple of years. Magix and Apple offer applications where you can create your own loop-based tunes; however, neither starts with finished compositions that can be modified both in length and arrangement with such ease.

While music choices are very subjective, I’ve personally built up a SmartSound library over the years, which lets me offer clients quality music alternatives without much fuss or cost – just another service I can provide. It allows you as an editor to be the hero to your client and accomplish the task expediently and on budget.

©2017 Oliver Peters

Final Cut Pro X – Reflecting on Six Years

df0417_fcpx5yrs_01_sm

Some personal musings…

Apple’s Final Cut Pro X has passed its five-year mark and is now closing in on six. Although it’s getting increasing respect from many corners of the professional editing community, there are still many who dismiss it, due to its deviation from standard editing software conventions. Like so many other things that are Apple, FCPX tends to be polarizing, with a large cohort of both fanboys and haters.

For me, software is a tool. I’ve been editing since the 70s and have used about 15 different linear and nonlinear systems on billable work during that time. More like 20 if you toss in color correction applications. Even more if you count tools I’ve had only cursory exposure to (such as in product reviews), but haven’t used on real jobs. All of these tools are a love-hate relationship for me. I have to laugh when folks talk about FCPX bringing back fun to their editing experience. I hope that the projects I work on bring me fun. I don’t really care about the software itself. Software should just get out of the way and let me do my job.

These six years have been a bit of a personal journey with Final Cut Pro X after a number of years with the “classic” version. I’ve been using FCPX since it first came out on commercials, corporate videos, shorts and even an independent feature film. It’s not my primary NLE most of the time, because my clients have largely moved to Adobe Premiere Pro CC and ask me to be compatible with them. My FCPX work tends to be mixed in and around my Premiere Pro editing gigs. For instance, right now I’m simultaneously involved in two large corporate video jobs – one of which I’m cutting in Premiere Pro and the other in Final Cut Pro X. As these things go, it can be frustrating, because you always want some function, tool or effect that’s available in Application A while you’re working in Application B. However, it also provides a perspective on what’s good and bad about each and where real speed advantages exist.

I have to say that even after six years, Final Cut Pro X is still more of a crapshoot than any other editing tool that I’ve used. I love its organizing power and often start a job really liking it. However, the deeper I get into the job – and the larger the library becomes – and the more complex the sequences become – the more bogged down FCPX becomes. It’s also the most inconsistent across various Mac models. I’ve run it on older towers, new MacBook Pros, iMacs and 2013 Mac Pros. Of these experiences, the laptops seem to be the most optimized for FCPX.

Quite frankly, working with the “trash can” Mac Pros, at times I wonder if Apple has lost its mojo. Don’t get me wrong – it’s a sweet machine, but its horsepower leaves me underwhelmed. Given the right upgrades, a 2010 Mac Pro tower is still quite competitive against it. Couple that with intermittent corrupt renders and exports in Adobe applications – due to the D-series AMD GPUs – and one really has to question Apple’s design compromises. On the other hand, working with recent and new MacBook Pros, it seems pretty obvious that this is where Apple’s focus has been. And in fact, that’s where Final Cut really shines. Run a complex project on a MacBook Pro versus an older tower and it’s truly a night-and-day experience. By comparison, the performance of Adobe and Avid applications on the same range of machines follows a much more graduated curve. Best might not be quite as good, but worst isn’t nearly as awful.

A lot is made of new versus old code in these competing applications. The running argument is that FCPX uses a sleek, new codebase, whereas Premiere Pro and Media Composer run on creaky old software. Yet Final Cut has been out publicly for six years, which means development started a few years before that. Hmmm, no longer quite so new. Yet, if you look at the recent changes from 10.2 to 10.3, it seems pretty clear that a lot more was changed than just cosmetics. The truth of the matter is that all three of these major applications are written in a way that modules of software can be added, removed or changed, without the need to start from scratch. Therefore, from a coding standpoint, Final Cut doesn’t have nearly the type of advantages that many think it has.

The big advantage that FCPX does have is that Apple can optimize its performance for the holistic hardware and macOS software architecture of its own machines. As such, performance, render speeds, etc. aren’t strictly tied to only the CPU or the GPU. It’s what enables the new MacBook Pro to offer top-end performance, while still staying locked to 16GB of RAM. It seems to me that this is also why the Core-series processors appear to be better performers than the Xeon-series chips, when it comes to Final Cut, Motion and Compressor.

If you compare this to Premiere Pro, Adobe hits the GPUs much harder than does Apple, which is the reason behind the occasional corruptions on the “trash can” Macs with Adobe renders. If you were running the Adobe suite on a top-level PC with high-end Nvidia cards, performance would definitely shine over that of the Macs. This is largely due to leveraging the CUDA architecture of these Nvidia GPUs. With Apple’s shift to using only AMD and Intel GPUs, CUDA acceleration isn’t available on newer Macs. Under the current software versions of Adobe CC (at the time of this writing) and Sierra, you are tied to OpenCL or software-only rendering and cannot even use Apple’s Metal acceleration. This is a driver issue still being sorted out between Apple and Adobe. Metal is something that Apple tools take advantage of and is a way that they leverage the combined hardware power, without focusing solely on CPU or GPU acceleration.

All of this leads me back to a position of love-hate with any of these tools. I suspect that my attitude is more common than most folks who frequent Internet forum debates want to admit. The fanboy backlash is generally large. When I look at how I work and what gets the results, I usually prefer track-based systems to the FCPX approach. I tend to like Final Cut as a good rough-cut editing application, but less as a fine-cut tool. Maybe that’s just me. That being said, I’ve had plenty of experiences where FCPX quite simply is the better tool under the circumstance. On a recent on-site edit gig at CES, I had to cut some 4K ARRI ALEXA material on my two-year-old Retina MacBook Pro. Premiere Pro couldn’t hack it without stuttering playback, while FCPX was buttery smooth. Thus FCPX was the axe for me throughout this gig.

Likewise, in the PC vs. Mac hardware debates, I may criticize some of Apple’s moves and long to work on a fire-breathing platform. But if push came to shove and I had to buy a new machine today, it would be either a Mac Pro “trash can” or a tricked-out iMac. I don’t do heavy 3D renders or elaborate visual effects – I edit and color correct. Therefore, the overall workflow, performance and “feel” of the Apple ecosystem is a better fit for me, even though at times performance might be middling.

Wrapping up this rambling post – it’s all about personal preference. I applaud Apple for making the changes in Final Cut Pro X that they did; however, a lot of things are still in need of improvement. Hopefully these will get addressed soon. If you are looking to use FCPX professionally, then my suggestion is to stick with only the newest machines and keep your productions small and light. Keep effects and filters to a minimum and you’ll be happiest with the results and the performance. Given the journey thus far, let’s see what the next six years will bring.

©2017 Oliver Peters

Nocturnal Animals

Some feature films are entertaining popcorn flicks, while others challenge the audience to go deeper. Writer/director Tom Ford’s (A Single Man) second film, Nocturnal Animals, definitely fits into the latter group. Right from the start, the audience is confronted with a startling and memorable main title sequence, which we soon learn is actually part of an avant garde art gallery opening. From there the audience never quite knows what’s around the next corner.

Susan Morrow (Amy Adams) is a privileged Los Angeles art gallery owner who seems to have it all, but whose life is completely unfulfilled. One night she receives an unsolicited manuscript from Edward Sheffield (Jake Gyllenhaal), her ex-husband with whom she’s been out of touch for years. With her current husband (Armie Hammer) away on business, she settles in for the night to read the novel. She is surprised to discover it is dedicated to her. The story being told by Edward is devastating and violent, and it triggers something in Susan that arouses memories of her past love with the author.

Nocturnal Animals keeps the audience on edge and is told through three parallel storylines – Susan’s current reality, flashbacks of her past with Edward, and the events that are unfolding in the novel. Managing this delicate balancing act fell to Joan Sobel, ACE, the film’s editor. In her film career, Sobel has worked with such illustrious directors as Quentin Tarantino, Billy Bob Thornton, Paul Thomas Anderson and Paul Weitz.  She was Sally Menke’s First Assistant Editor for six-and-a-half years on four films, including Kill Bill, vol. 1 and Kill Bill, vol. 2.  Sobel also edited the Oscar-winning short dark comedy, The Accountant.  This is her second feature with Tom Ford at the helm.

Theme and structure

In our recent conversation, Joan Sobel discussed Nocturnal Animals. She says, “At its core, this film is about love and revenge and regret, with art right in the middle of it all. It’s about people we have loved and then carelessly discarded, about the cruelties that we inflict upon each other, often out of fear or ambition or our own selfishness. It is also about art and the stuff of dreams. Susan has criticized Edward’s ambition as a writer. Edward gets Susan to feel again through his art – through that very same writing that Susan has criticized in the past. But art is also Edward’s vehicle for revenge – revenge for the hurt that Susan has caused him during their past relationship. The film uses a three-pronged story structure, which was largely as Tom scripted it. The key was to find a fluid and creative way to transition from one storyline to the other, to link those moments emotionally or visually or both. Sometimes that transition was triggered by a movement, but other times just a look, a sound, a color or an actor’s nuanced glance.”

Nocturnal Animals was filmed (yes, on film not digital) over 31 days in California, with the Mojave Desert standing in for west Texas. Sobel was cutting while the film was being shot and turned in her editor’s cut about a week after the production wrapped. She explains, “Tom likes to work without a large editorial infrastructure, so it was just the two of us working towards a locked cut. I finished my cut in December and then we relocated to London for the rest of post. I always put together a very polished first cut, so that there is already an established rhythm and a flow.  That way the director has a solid place to begin the journey. Though the movie was complex with its three-pronged structure – along with the challenge of bringing to life the inner monologue that is playing in Susan’s head – the movie came together rather quickly. Tom’s script was so well written and the performances so wonderful that by the end of March we pretty much had a locked cut.”

The actors provided fruitful ground for the editor. Sobel continues, “It was a joy to edit Amy Adams’ performance. She’s a great actress, but when you actually edit her dailies, you get to see what she brings to the movie. Her performance relies less on dialogue (she actually doesn’t have many lines) and instead showcases her brilliance as a film actor in conveying emotion through her mind and through her face and her eyes.”

“Tom is a staggering talent, and working with him is a total joy.  He’s fearless and his creativity is boundless.  He is also incredibly generous and very, very funny (we laugh a lot!), and we share an endless passion for movies.  Though the movie is always his vision, his writing, he gravitates towards collaboration. So we would get quite experimental in the cut. The trust and charm and sharp, clear intelligence that he brings into the cutting room resulted in a movie that literally blossoms with creativity. Editing Nocturnal Animals was a totally thrilling experience.”

Tools of the trade

Sobel edited Nocturnal Animals with Avid Media Composer. Although she’s used other editing applications, Media Composer is her tool of choice. I asked about how she approaches each new film project. She explains, “The first thing I do is read the script. Then I’ll read it again, but this time out loud. The rhythms of the script become more lucid that way and I can conceptualize the visuals. When I get dailies for a scene, I start by watching everything and taking copious notes about every nuance in an actor’s performance that triggers an emotion in me, that excites me, that moves me, that shows me exactly where this scene is going. Those moments can be the slightest look, a gesture, a line reading.”

“I like to edit very organically based on the footage. I know some editors use scene cards on a wall or they rely on Avid’s Script Integration tools, but none of those approaches are for me. Editing is like painting – it’s intuitive. My assistants organize bins for themselves in dailies order. Then they organize my bins in scene/script order. I do not make selects sequences or ‘KEM rolls’. I simply set up the bins in frame view and then rearrange the order of clips according to the flow – wide to tight and so on. As I edit, I’m concentrating on performance and balance. One little trick I use is to turn off the sound and watch the edit to see what is rhythmically and emotionally working. Often, as I’m cutting the scene, I find myself actually laughing with the actor or crying or gasping! Though this is pretty embarrassing if someone happens to walk into my cutting room, I know that if I’m not feeling it, then the audience won’t either.”

Music and sound are integral for many editors, especially Sobel. She comments, “I love to put temp music into my editor’s cuts. That’s a double-edged sword, though, because the music may or may not be to the taste of the director. Though Tom and I are usually in sync when it comes to music, Tom doesn’t like to start off with temp music in the initial cut, so I didn’t add it on this film. Once Tom and I started working together, we played with music to see what worked. This is a movie in which we actually used very little music, and when we did, we added it quite sparingly. Mostly the temp music we used came from some of Abel’s [Korzeniowski, composer] other film scores. I also always add layers of sound effects to my tracks to take the movie and the storytelling to a further level. I use sound to pull your attention, to define a character, or a mood, or elevate a mystery.”

Unlike many films, Nocturnal Animals flew through the post process without any official test screenings. Its first real screening was at the Venice Film Festival where it won the Silver Lion Grand Jury Prize. “Tom has the unique ability to both excite those working with him and to effortlessly convey his vision, and he had total confidence in the film. The film is rich with many layers and is the rare film that can reveal itself through subsequent viewings, hopefully providing the audience with that unique experience of being completely immersed in a novel, as our heroine becomes immersed in Nocturnal Animals,” Sobel says. The film opened in the US during November and is a Focus Features release.

Check out more with Joan Sobel at “Art of the Cut”.

Originally written for Digital Video magazine / Creative Planet Network.

©2017 Oliver Peters

AJA T-Tap

The Thunderbolt protocol has ushered in a new era of easy connectivity for hardware peripherals. It allows users to deploy a single connection type to tie in networking, external storage, monitoring and broadcast audio and video input and output. Along with easy connections, it has also enabled peripheral devices to become smaller, lighter and more powerful, thanks in part to advances in both hardware and software. AJA Video Systems is one of the popular video manufacturers that has taken advantage of these benefits.

In many modern editing environments, the actual editing system has become extremely streamlined. All it really takes is a Thunderbolt-enabled laptop, all-in-one (like an iMac) or desktop computer, fast external storage, and professional monitoring – and you are good to go. For many editors, live video output is strictly for monitoring, as deliverables are more often than not files rather than tape. Professional monitoring is easy to achieve using SDI or HDMI connections. Any concern for analog is gone, unless you need to maintain analog audio monitoring. AJA makes a series of I/O products to address these various needs, ranging from full-featured units down to simple monitoring devices. Blackmagic Design and AJA currently produce the lion’s share of these types of products, including PCIe cards for legacy installations and Thunderbolt devices for newer systems.

I recently tested the AJA T-Tap, which is a palm-sized video output device that connects to the computer using the Thunderbolt 2 protocol. It is bus-powered – meaning that no external power supply or “wall-wart” is needed to run it. I tested this on both a 2013 Mac Pro and a 2015 MacBook Pro. In each case, my main need was SDI and/or HDMI out of the unit to external monitors. Installation couldn’t be easier. Simply download the current control panel software and drivers from AJA’s website, install, and then connect the T-Tap. Hook up your monitors and you are ready. There’s very little else to do, except set your control panel configuration for the correct video/frame rate standard. Everything else is automatic in both Adobe Premiere Pro CC and Apple Final Cut Pro X, although you’ll want to check your preference settings to make sure the device is detected and enabled.

One of the main reasons I wanted to test the T-Tap was as a direct comparison with the Blackmagic products on these same computers. For example, the current output device being used on the 2013 Mac Pro that I tested is a Blackmagic Design UltraStudio Express. This contains a bit more processing and is comparable to AJA’s Io XT. I also tested the BMD MiniMonitor, which is a direct competitor to the T-Tap. The UltraStudio provides both input and output and offers an analog break-out cable harness, whereas the two smaller units are output-only, via SDI and HDMI. All three are bus-powered. In general, all performed well with Premiere Pro, except that the BMD MiniMonitor couldn’t provide output via HDMI. For unexplained reasons, that screen was blank. There was no such problem with either the T-Tap or the UltraStudio Express.

The real differences are with Final Cut Pro X on the Mac Pro. That computer has six Thunderbolt ports, which are shared across three buses – i.e. two connectors per bus. On the test machine, one bus feeds the two external displays, the second bus connects to external storage (not shared for maximum throughput), and the remaining bus connects to both the output device and a CalDigit dock. If the BMD UltraStudio Express is plugged into any connection shared with another peripheral, JKL high-speed playback and scrubbing in FCPX are useless. Not only does the video output stutter and freeze, but so does the image in the application’s viewer. So you end up wasting an available Thunderbolt port on the machine if you want to use that device with FCPX. Therefore, using the UltraStudio with FCPX on this machine isn’t really functional, except for screening with a client. This means I end up disabling the device most of the time I use FCPX. In that respect, both the AJA T-Tap and the BMD MiniMonitor performed well. However, my subjective evaluation is that the T-Tap gave better performance in my critical JKL scrubbing test.

One difference that might not be a factor for most is that the UltraStudio Express (which costs a bit more) has advanced processing. This yields a smooth image in pause when working with progressive and PsF media. When my sequence was stopped in either FCPX or Premiere, both the T-Tap and the UltraStudio yielded a full-resolution, whole-frame image on the HDMI output. (HDMI didn’t appear to function on the MiniMonitor.) On the TV Logic broadcast display that was being fed via SDI, the T-Tap and MiniMonitor only displayed a single field in pause, so you get an image with “jaggies”. The UltraStudio Express generates a whole frame for a smooth image in pause. I didn’t test a unit like AJA’s Io XT, so I’m not sure if the more expensive AJA model offers similar processing. However, it should be noted that the Io XT is triple the cost of the UltraStudio Express.

The elephant in the room, of course, is Blackmagic Design DaVinci Resolve. That application is restricted to working only with Blackmagic’s own hardware devices. If you want to run Resolve – and you want professional monitoring out of it – then you can’t use any AJA product with it. However, these units are so inexpensive to begin with – compared with what they used to cost – that it’s realistic to own both. In fact, some FCPX editors use a T-Tap while editing in FCPX and then switch over to a MiniMonitor or UltraStudio for Resolve work, because of the better performance of the AJA hardware with Final Cut.

Ultimately these are all wonderful devices. I like the robustness of AJA’s manufacturing and software tools. I’ve used their products over the years and never been disappointed with performance or service when needed. If you don’t need video output from Resolve, then the AJA T-Tap is a great choice for an inexpensive, simple, Thunderbolt video output solution. Laptop users who need to hook up to monitors while working at home or away will find it especially handy. Toss it into your laptop bag and you are ready to rock.

©2017 Oliver Peters

La La Land

La La Land is a common nickname for Los Angeles and Hollywood, but it’s also the title of writer/director Damien Chazelle’s newest film. Chazelle originally shopped La La Land around without much success and so moved on to another film project, Whiplash. Five Oscar nominations and three wins went a long way toward breaking the ice and securing funding for La La Land. The new film tells the story of two struggling artists – Mia (Emma Stone), an aspiring actress, and Sebastian (Ryan Gosling), a jazz musician. La La Land was conceived as a musical set in modern day Los Angeles. It harkens back to the MGM musicals of the 50s and 60s, as well as French musicals, including Jacques Demy’s The Umbrellas of Cherbourg.

One of the Whiplash Oscars went to Tom Cross for Best Achievement in Film Editing. After working as one of David O. Russell’s four editors on Joy, Cross returned to cut La La Land with Chazelle. Tom Cross and I discussed how this film came together. “As we were completing Whiplash, Damien was talking about his next film,” he says. “He sent me a script and a list of reference movies and I was all in. La La Land is Damien’s love letter to old Hollywood. He knew that doing a musical was risky, because it would require large scale collaboration of all the film crafts. He loves the editing process and felt that the cutting would be the technical bridge that would hold it all together. He wanted to tell his story with the language of dreams, which to Damien is the film language of old Hollywood cinema. That meant that he wanted to use iris transitions, layered montages and other old optical techniques. The challenge was to use these retro styles, but still have a story that feels contemporary and grounded in realism.”

Playing with tone and time

La La Land was shot in approximately forty days, but editing the film took nearly a year. Cross explains, “Damien is great at planning and is very clear in what he shoots and his intentions. In the cutting room, we spent a lot of time calibrating the movie – playing with tone and time. Damien wanted to start our story with both characters together on the freeway, then branch off and show Mia going through her day. We take her to a specific plot intersection and then flashback in time to Sebastian on the freeway. Then we move through his day, until we are back at the intersection where our two stories collide. Much like the seasons that our movie cycles through – Winter, Spring, Summer, Fall – we end up returning to this specific intersection later in the film, but with a different outcome. Damien wanted to set up certain timelines and patterns that the audience would follow, so that we could ricochet off of them later.”

As a musical, Tom Cross had to vary his editorial style for different scenes. He continues, “For Sebastian and Mia’s musical courtship, Damien wanted the scenes to be slower and romantic with a lot of camera moves. In Griffith Park, it’s a long unbroken take with rounded edges. On the other hand, the big concert with John Legend is cutty, with sharp edges. It’s fragmented and the opposite of romantic. Likewise, when they are full-on in love and running around LA, the cutting is at a fever pitch. It’s lively and sweeps you off your feet. Damien wanted to be careful to match the editing style to the emotion of each scene. He knew that one style would accentuate the other.”

La La Land was shot in the unusual, extra-wide aspect ratio of 2.55:1 to replicate CinemaScope from the 1950s. “This makes ordinary locations look extraordinary,” Cross says. “Damien would vary the composition from classic wides to very fragmented framing and big close-ups. When Sebastian and Mia are dancing, there’s head-to-toe framing like you would have in a Fred Astaire/Ginger Rogers film. During their dinner break up scene, the shots of their faces get tighter – almost claustrophobic – to be purposefully uncomfortable and unflinching. Damien wanted the cutting to be stripped down and austere – the opposite of what’s come before. He told me to play the scene in their medium shots until I punched into their close angles. And once we’re close, we have to stay there.”

Tricks and tools

The Avid Media Composer-equipped cutting rooms were hosted by Electric Picture Solutions in North Hollywood. Tom Cross used plenty of Media Composer features to cut La La Land. He explains, “For the standard dialogue scenes we used Avid’s Script Sync feature. This was very handy for Damien, because he likes to go over every line with a fine-tooth comb. The musical numbers were cut using multi-cam groups. For scenes with prerecorded music, multiple takes could be synced and grouped as if they were different camera angles. I had my assistant set up what I call ‘supergroups’. For instance, all the singers might be grouped into one clip. The instruments might be in another group. Then I could stack the different groups onto multiple video tracks, making it easy to cut between groups, as well as angles within the groups.”

In addition to modern cutting techniques, Cross also relies on lo-fi tools, like scene cards on a wall. Cross says, “Damien was there the whole time and he loves to see every part of the process. He has a great editor’s mind – very open to editorial cheats to solve problems, such as invisible split screen effects and speed adjustments. Damien wanted us to be very meticulous about lip sync during the musical scenes because he felt that anything less than perfect would take you out of the moment. His feeling was that the audience scrutinizes the sync of the singing in a musical more than spoken dialogue in a normal film. So we spent a lot of time cutting and manipulating the vocal track – in order to get it right. Sometimes, I would speed-ramp the picture to match the singing. Damien was also very particular about downbeats and how they hit the picture. I learned that while working with him on Whiplash. It has to be precise. Justin Hurwitz, our composer, provided a mockup score to cut with, and that was eventually replaced by the final music recorded with a 95-piece orchestra. Of course, when you have living, breathing musicians, notes line up differently from the mockup track. Therefore, we had many cuts that needed to be shifted in order to maintain the sync that Damien wanted. During our final days of the sound mix, we were rolling cuts one or two frames in either direction on the dub stage at Fox.”

Editors and directors each have different ways to approach the start of the cutting process. Cross explains, “I edited while they were shooting and had a cut by the time the production wrapped. It’s a great way for the editor to learn the footage and make sure the production is protected. However, Damien didn’t want to see the first cut, but preferred to have it on hand if we needed it. I think first cuts are overwhelming for most directors. Damien had the idea of starting at the end first. There’s a big end scene and he decided that we should do that heavy lifting first. He said, ‘at least we’ll have an ending.’ We worked on it until we got it to a good place and then went back and started from the beginning. It re-invigorated us.”

Tom Cross wrapped with these parting thoughts. “This was a dream project for Damien and it was my dream to be able to work with him on it. It’s romantic, magical and awe-inspiring. I was very lucky to go from a film where you get beaten down on the drums – to another, where you get swept off your feet!”

For more conversations with Tom Cross, check out Art of the Cut.

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters