New NLE Color Features


As someone who does color correction as often within an NLE as in a dedicated grading application, it’s nice to see that Apple and Adobe are not treating their color tools as an afterthought. (No snide Apple Color comments, please.) Both the Final Cut Pro 10.1.2 and Creative Cloud 2014 updates include new tools specifically designed to improve color correction.

Apple Final Cut Pro 10.1.2


This FCP X update includes a new, built-in LUT (look-up table) feature designed to correct log-encoded camera files into Rec 709 color space. This type of LUT is camera-specific and FCP X now comes with preset LUTs for ARRI, Sony, Canon and Blackmagic Design cameras. The correction is applied as part of the media file’s color profile and, as such, takes effect before any filters or color correction are applied.

These LUTs can be enabled for master clips in the event, or after a clip has been edited to a sequence (FCP X project). The log processing can be applied to a single clip or a batch of clips in the event browser. Simply highlight one or more clips, open the inspector and choose the “settings” pane. There, access the “log processing” pulldown menu and choose one of the camera options. This will apply that camera LUT to all selected clips and it will stay with a clip when it’s edited to the sequence. The LUT can later be enabled or disabled for individual clips in the sequence as needed. Note that this LUT information does not pass through as part of an FCPXML roundtrip, such as sending a sequence to Resolve for color grading.
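For those curious about what’s happening under the hood, a 3D LUT is just a lattice of sample points: a clip’s RGB values index into the table and the output color is interpolated from the nearest entries. Here is a minimal NumPy sketch of that idea (illustrative only, not FCP X’s actual implementation):

```python
import numpy as np

def apply_3d_lut(pixel, lut):
    """Look up one normalized RGB pixel (values 0.0-1.0) in a 3D LUT
    (an N x N x N x 3 table), blending the 8 nearest lattice entries
    with trilinear interpolation."""
    n = lut.shape[0]
    idx = np.clip(np.asarray(pixel, dtype=float) * (n - 1), 0, n - 1)
    lo = np.floor(idx).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    frac = idx - lo
    out = np.zeros(3)
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                # Weight of this corner of the surrounding lattice cube
                w = ((frac[0] if dr else 1 - frac[0]) *
                     (frac[1] if dg else 1 - frac[1]) *
                     (frac[2] if db else 1 - frac[2]))
                out += w * lut[hi[0] if dr else lo[0],
                               hi[1] if dg else lo[1],
                               hi[2] if db else lo[2]]
    return out

# A 2-point identity LUT leaves colors unchanged:
grid = np.linspace(0, 1, 2)
identity = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
print(apply_3d_lut([0.25, 0.5, 0.75], identity))  # -> [0.25 0.5 0.75]
```

A camera LUT simply populates the same lattice with the log-to-Rec 709 mapping instead of the identity.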

Although camera LUTs are specific to the color science used for each camera model’s type of log encoding, this doesn’t mean you can’t use a different LUT. Naturally some will be too extreme and not desirable. Some, however, are close and using a different LUT might give you a desirable creative result, somewhat like cross-processing in a film lab.

Adobe CC 2014 – Premiere Pro CC and SpeedGrade CC


In this CC 2014 release, Adobe added master clip effects that travel back and forth between Premiere Pro CC and SpeedGrade CC via Direct Link. Master clip effects are relational, meaning that the color correction is applied to the master clip; therefore, every instance of that clip edited to the sequence automatically has the same correction applied. When you send the Premiere Pro CC sequence to SpeedGrade CC, you’ll see that the 2014 version now has two correction tabs: master clip and clip. If you want to apply a master clip effect, choose that tab and do your grade. Any other sections of the same clip that appear on the timeline are automatically graded as well.

Of course, with a lot of run-and-gun footage, iris levels and lighting change, so one setting might not work for the entire clip. In that case, you can add a second level of grading by tweaking the shot in the clip tab. Effectively you now have two levels of grading. Depending on the show, you can grade in the master clip tab, the clip tab or both. When the sequence goes back to Premiere Pro CC, SpeedGrade CC corrections are applied as Lumetri effects added to each sequence clip. Any master clip effects also “ripple back” to the master clip in the bin. This way, if you cut a new section from an already-graded master clip into that or any other sequence, color correction has already been applied to it.
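A toy model makes the relationship clear. The class and method names below are hypothetical, not Adobe’s API; the point is that a master grade lives on the source clip while every sequence instance can layer its own tweak on top:

```python
# Toy model of relational grading (hypothetical names, not Adobe's API).
class MasterClip:
    def __init__(self, name):
        self.name = name
        self.master_grade = None      # shared by every sequence instance

class SequenceClip:
    def __init__(self, master):
        self.master = master
        self.clip_grade = None        # private to this one instance

    def render(self, frame):
        # The master grade is applied first, then the per-instance tweak.
        if self.master.master_grade:
            frame = self.master.master_grade(frame)
        if self.clip_grade:
            frame = self.clip_grade(frame)
        return frame

interview = MasterClip("interview.mov")
cut1, cut2 = SequenceClip(interview), SequenceClip(interview)
interview.master_grade = lambda f: f + " +base grade"   # both cuts update
cut1.clip_grade = lambda f: f + " +cartoon look"        # only cut1 changes

print(cut1.render("frame"))  # frame +base grade +cartoon look
print(cut2.render("frame"))  # frame +base grade
```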

In an example I created, the shot was graded as a master clip effect. Then, using the clip tab on the first instance of the clip in the sequence, I added more primary correction and a filter effect to create a cartoon look for that segment of the timeline. Comparing the two versions of these shots, one with only a master clip effect (shots match) and the other with a separate clip effect added to the first (shots are different), shows the difference.

Since master clip effects apply globally to source clips within a project, editors should be careful about changing them or copy-and-pasting them, as you may inadvertently alter another sequence within the same project.

©2014 Oliver Peters

Adobe Anywhere


Adobe Anywhere for video is Adobe’s first foray into collaborative editing. Anywhere functions a lot like other shared storage environments, except that editors and producers are not bound to working within the facility and its hard-wired network. The key difference between Adobe Anywhere and other NLE/SAN combinations is that all media is stored at the central location and the system’s servers handle the actual editing and compositing functions of the editing software. This means that no media is stored on the editor’s local computer and lightweight client stations can be used, since the required horsepower exists at the central location. Anywhere works within a facility using the existing LAN or externally over the internet when client systems connect remotely over VPN. Currently Adobe Anywhere is integrated directly into Adobe Premiere Pro CC and Prelude CC (Windows and OS X). Early access to After Effects integration is part of Adobe Anywhere 1.6, with improved integration available in the next release.

The Adobe Anywhere cluster

Adobe Anywhere software is installed on a set of Windows servers, which are general purpose server computers that you would buy from a vendor like Dell or HP. The software creates two types of nodes: a single Adobe Anywhere Collaboration Hub node and three or more Adobe Mercury Streaming Engine nodes. Each node is installed on a separate server, so a minimum configuration requires four computers. This is separate from the shared storage. If you use a SAN, such as a Facilis Technology or an EditShare system, the SAN will be mounted at the OS level by the computing cluster of Anywhere servers. Local and remote editors can upload source media to the SAN for shared access via Anywhere.

The Collaboration Hub computer stores all of the Anywhere project metadata, manages user access and coordinates the other nodes in the system. The Mercury Streaming Engine computers provide real-time, dynamic viewing streams of Premiere Pro and Prelude sequences with GPU-accelerated effects. Media stays in its native file format on the storage servers. There are no proxy files created by the system. In order to handle real-time effects, each of the Streaming Engine servers must be equipped with a high-end NVIDIA graphics card.

As a rule of thumb, this minimum cluster size supports 10-15 active users, according to Adobe. However, the actual number depends on media type, resolution, number of simultaneous source clips needed per editor, as well as activities that may be automated like import and export. Adobe prices the Anywhere software based on the number of named users. This is a subscription model of $1,000/year/user. That’s in addition to installed seats of Creative Cloud and the cost of the hardware to make the system work, which is supplied by other vendors and not Adobe. Since this is not sold as a turnkey installation by Adobe, certain approved vendors, like TekServe and Keycode Media, have been qualified as Adobe Anywhere system integrators.

How it works

While connected to Adobe Anywhere and working with an Anywhere project, the Premiere Pro or Prelude application on the local computer really just functions as the software front end driving the application running back at the server. The results of the edit decisions are streamed back to the local machine in real time as a single stream of video. The live stream of media from the Mercury Streaming Engine is handled in a similar fashion to the playback resolution throttle that’s already part of Premiere Pro. As native media is played, the system adjusts the stream’s playback compression based on bandwidth. Whenever playback is paused, the parked frame is updated to full resolution, thus enabling an editor to tweak an effect or composite and always see the full-resolution image while making the adjustments.
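The throttling behavior amounts to a simple control loop: compress harder as measured bandwidth drops, and send a full-quality frame whenever playback pauses. A conceptual sketch of that idea (my own illustration, not Adobe’s protocol):

```python
# Conceptual sketch of bandwidth-adaptive frame streaming.
# The thresholds and quality scale are invented for illustration.
def choose_quality(bandwidth_mbps, playing):
    """Pick a compression quality (0-100) for the next streamed frame."""
    if not playing:
        return 100          # parked frame: always send full quality
    if bandwidth_mbps > 50:
        return 90           # healthy link: light compression
    if bandwidth_mbps > 10:
        return 70
    return 40               # constrained link: compress aggressively
```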

To understand this better, let’s use the example of a quad split. If this were done locally, the drives would be playing back four streams of video and the software and GPU of that local computer would composite the quad split and present a single stream of video to the viewer display. In the case of Adobe Anywhere, the playback of these four streams and the compositing of the quad split take place on the Mercury Streaming Engine computer. In turn, it streams this live composite as a single feed of video back to the remotely connected computer. Since all the “heavy lifting” is done at “home base,” the system requirements for the client machine can be less beefy. In theory, you could be working on a MacBook Air while editing RED Epic 5K footage.
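Reduced to code, the server-side quad split looks something like this NumPy sketch (again, an illustration rather than Adobe’s implementation): four decoded frames go in, one composited frame comes out, and only that single frame ever travels over the wire.

```python
import numpy as np

def quad_split(a, b, c, d):
    """Composite four same-sized frames (H x W x 3 arrays) into one
    frame of the same size: each source is decimated to half size,
    then tiled into a 2x2 grid."""
    half = lambda f: f[::2, ::2]            # crude 2x downscale
    top = np.hstack((half(a), half(b)))
    bottom = np.hstack((half(c), half(d)))
    return np.vstack((top, bottom))         # single frame sent to the client
```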

Productions

Another difference with Adobe Anywhere is that instead of having Premiere Pro or Prelude project files, users create shared productions, designed for multi-user and multi-application access. This way a collaborating team is set up like a workgroup with assigned permission levels. Media is stored centrally and shared, which avoids duplication. Any media that is added on-site is uploaded to the production in its native resolution and becomes part of the shared assets of the production. The Collaboration Hub computer manages the database for all productions.

When a user remotely logs into an Adobe Anywhere production, media to which he or she has been granted access is available for browsing using Premiere Pro’s standard Media Browser panel. When an editor starts working, Anywhere automatically makes a virtual “clone” of his or her production items and opens them in a private session. Because multiple people can be working in the same production at the same time, Adobe Anywhere provides protection against conflicts or overwrites. In order to share your private changes, you must first get any updates from the shared production. This pulls all shared changes into your private view. If another person has changed the same asset you are working on, you are given information about the conflict and the opportunity to keep the other person’s changes, your changes or both. Once you make your choices, you can then transfer your changes back to the shared production. Anywhere also maintains a version history, so if unwanted changes are made, you can revert to an earlier or alternate version.
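The cycle resembles pull, merge and push in a version control system. A runnable toy model, with assets reduced to name/version pairs (my own simplification, not Adobe’s API):

```python
# Toy model of the Anywhere share cycle: pull shared updates first,
# resolve any conflicting assets, then publish the merged state.
def share_changes(private, shared, resolve=lambda mine, theirs: mine):
    merged = dict(shared)                    # start from the shared view
    for name, version in private.items():
        if name in shared and shared[name] != version:
            # Someone else changed the same asset: caller decides.
            merged[name] = resolve(version, shared[name])
        else:
            merged[name] = version
    return merged

shared  = {"scene1": "v2", "scene2": "v1"}
private = {"scene1": "v3-mine", "scene3": "v1"}
print(share_changes(private, shared))  # keeps "mine" on the scene1 conflict
```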

Adobe Anywhere in the wild

Although large installations like CNN are great for publicity headlines, Adobe Anywhere is proving to be useful at smaller facilities, too. G-Men Media is a production company based in Venice, California. They are focused primarily on feature film and commercial broadcast work. According to G-Men COO, Jeff Way, “G-Men was originally founded with the goal of utilizing the latest digital technologies available to reduce costs, accelerate workflow and minimize turnaround time for our clients. Adobe Anywhere allowed us to provide our clients a more efficient workflow on post productions without having to grow infrastructure on a per project basis.”

“A significant factor of Adobe Anywhere, which increased the growth of our client base, was the system’s ability to organize production teams based on talent instead of location. If we can minimize or eliminate time required for coordinating actual production work (i.e. shipping hard drives, scheduling meetings with editors, awaiting review/approval), we can save clients money that they can then invest into more creative aspects of the project – or simply undercut their budget. Furthermore, we have the ability to scale up or down without added expenses in infrastructure. All that’s required on our end is simply granting the Creative Cloud seat access to the system assets for their production.”

The G-Men installation was handled by Keycode Media, based on the recommended Adobe configuration described at the beginning of this article. This includes four SuperMicro 1U rack-mounted SuperServers. Three of these operate as the Adobe Anywhere Mercury Streaming Engines and the fourth acts as the Adobe Anywhere Collaboration Hub. Each of the Mercury Streaming Engines has its own individual NVIDIA Tesla K10 GPU card. The servers are connected to a Facilis TerraBlock shared storage array via a 10 Gigabit Ethernet switch. Their internet feed is a fiber optic connection, typically operating at 500Mbps down / 150Mbps up. G-Men has used the system on every project since it went live in August 2013. Noteworthy was its use for post on Savageland – the first feature film to run through an Adobe Anywhere system.

Way continued, “Savageland ended up being a unique situation and the ultimate test of the system’s capabilities. Savageland was filmed over three years with various forms of media from iPhone and GoPro footage to R3D raw and Canon 5D. It was really a matter of what the directors/producers could get their hands on from day-to-day. After ingesting the assets into our system, we were able to see a fluid transition straight into editing without having to transcode media assets. One of the selling factors of gaining Savageland as a client was the flexibility and feasibility of allowing all of the directors and editors (who lived large distances from each other in Los Angeles) to work at their convenience. The workflow for them changed from setting aside their weekends and nights for review meetings at a single location to a readily available review via their MacBooks and iPads.”

“For most of our clients, the system has allowed them to bring on the editorial talent they want without having to worry about the location of the editor. At the same time, the editors enjoyed the flexibility of working from wherever they wanted – many times out of their own homes. The benefit for editors and directors is the capability to remotely collaborate and provide feedback immediately. We’ve had a few productions where there are more than one editor working on the same assets – both creating different versions of the same edit. At the same time we had a director viewing the changes immediately after they were shared, with notes on each version. Then they had the ability to immediately make a decision on one or the other or provide creative feedback, so the editors could immediately apply the changes in real time.”

G-Men is in production on Divine Access, a feature film being shot in Austin, Texas. Way explained, “We’re currently in Austin beginning principal photography. Knowing the cloud-based editing workflows available to us, we wanted to expand the benefits we are gaining in post to the entirety of a feature film production from first location scout to principal photography and all the way through to delivery. We’re using our infrastructure to ingest and begin edits as we shoot, which is really new and exciting to all of the producers working on the film.  With the upload speeds we have available to us, we are able to provide review/approvals to our director the same day.”

Originally written for Digital Video magazine/CreativePlanetNetwork.

©2014 Oliver Peters

Cold In July

Jim Mickle started his career as a freelance editor in New York, working on commercials and corporate videos, like so many others. Bitten by the filmmaking bug, Mickle has gone on to successfully direct four indie feature films, including his latest, Cold in July. Like his previous film, We Are What We Are, it premiered at the Sundance Film Festival.

Cold in July, which is based on a novel by Joe R. Lansdale, is a noir crime drama set in 1980s East Texas. It stars Michael C. Hall (Dexter), Sam Shepard (Out of the Furnace, Killing Them Softly) and Don Johnson (Django Unchained, Miami Vice). Awakened in the middle of the night, small town family man Richard Dane (Hall) kills a burglar in his house. Dane soon fears for his family’s safety when the burglar’s ex-con father, Ben (Shepard), comes to town, bent on revenge. However, the story takes a twist into a world of corruption and violence. Add Jim Bob (Johnson), a pig-farming private eye, to this mix and you have an interesting trio of characters.

According to Jim Mickle, Cold in July was on a fast-track schedule. The script was optioned in 2007, but production didn’t start until 2013. This included eight weeks of pre-production beginning in May and principal photography starting in July (for five weeks) with a wrap in September. The picture was “locked” shortly after Thanksgiving. Along with Mickle, John Paul Horstmann (Killing Them Softly) shared editing duties.

I asked Mickle how it was to work with another editor. He explained, “I edited my last three films by myself, but with this schedule, post was wedged between promoting We Are What We Are and the Sundance deadline. I really didn’t have time to walk away from it and view it with fresh eyes. I decided to bring John Paul on board to help. This was the first time I’ve worked with another editor. John Paul was cutting while I was shooting and edited the initial assembly, which was finished about a week before the Sundance submission deadline. I got involved in the edit about mid-October. At that point, we went back to tighten and smooth out the film. We would each work on scenes and then switch and take a pass at each other’s work.”

Mickle continued, “The version that we submitted to Sundance was two-and-a-half hours long. John Paul and I spent about three weeks polishing and were ready to get feedback from the outside. We held a screening for 20 to 25 people and afterwards asked questions about whether the plot points were coherent to them. It’s always good for me, as the director, to see the film with an audience. You get to see it fresh – with new eyes – and that helps you to trim and condense sections of the film. For example, in the early versions of the script, it generally felt like the middle section of the film lost tension. So, we had added a sub-plot element into the script to build up the mystery. This was a car of agents tailing our hero that we could always reuse, as needed. When we held the screening, it felt like that stuff was completely unnecessary and simply put on top of the rest of the film. The next day we sliced it all out, which cut 10 minutes out of the film. Then it finally felt like everything clicked.”

The director-editor relationship always presents an interesting dynamic, since the editor can be objective in cutting out material that may have cost the director a lot of time and effort on set to capture. Normally, the editor has no emotional investment in production of the footage. So, how did Jim Mickle, as the editor, treat his own work as the director? Mickle answered, “As an editor, I’m more ruthless on myself as the director. John Paul was less quick to give up on scenes than I. There are things I didn’t think twice about losing if they didn’t work, but he’d stay late to fix things and often have a solution the next day. I shoot with plenty of coverage these days, so I’ll build a scene and then rework it. I love the edit. It’s the first time you really feel comfortable and can craft the story. On the set, things happen so quickly that you always have to be reactive – working and thinking on your feet.”

Although Mickle had edited We Are What We Are with Adobe Premiere Pro, the decision was made to shift back to Apple Final Cut Pro 7 for the edit of Cold in July. Mickle explained, “As a freelance editor in New York, I was very comfortable with Final Cut, but I’m also an After Effects user. When doing a lot of visual effects, it really feels tedious to go back and forth between Final Cut and After Effects. The previous film was shot with RED cameras and I used a raw workflow in post, cutting natively with Premiere Pro. I really loved the experience – working with raw files and Dynamic Link between Premiere and After Effects. When we hired John Paul as the primary editor on the film, we opted to go back to Final Cut, because that is what he is most comfortable with. That would get the job done in the most expedient fashion, since he was handling the bulk of the editing.”

“We shot with RED cameras again, but the footage was transcoded to ProRes for the edit. I did find the process to be frustrating, though, because I really like the fluidness of using the raw files in Premiere. I like the editing process to live and breathe and not be delineated. Having access to the raw files lets me tweak the color correction, which helps me to get an idea of how a scene is shaping up. I get the composer involved early, so we have a lot of the real music in place as a guide while we edit. This way, your cutting style – and the post process in general – are more interactive. In any case, the ProRes files were only used to get us to the locked cut. Our final DI was handled by Light Iron in New York and they conformed the film from the original RED files for a 2K finish.”

The final screening with mix, color correction and all visual effects occurred just before Sundance. There the producers struck a distribution deal with IFC Films. Cold In July started its domestic release in May of this year.

Originally written for Digital Video magazine/CreativePlanetNetwork.

©2014 Oliver Peters

Filmmaking Pointers

If you want to be a good indie filmmaker, you have to understand some of the basic principles of telling interesting visual stories and driving the audience’s emotions. These six ideas transcend individual components of filmmaking, like cinematography or editing. Rather, they are concepts that every budding director should understand and weave into the entire structure of how a film is approached.

1. Get into the story quickly. Films are not books and don’t always need a lengthy backstory to establish characters and plot. Films are a journey and it’s best to get the characters on that road as soon as possible. Most scripts are structured as three-act plays, so with a typical 90-100 minute running time, you should be through act one roughly one-third of the way into the film. If not, you’ll lose the interest of the audience. If you are 20 minutes into the film and still establishing the history of the characters without having advanced the story, then look for places to start cutting.

Sometimes this isn’t easy to tell and an extended start may indeed work well, because it does advance the story. One example is There Will Be Blood. The first reel is a tour de force of editing, in which editor Dylan Tichenor builds a largely dialogue-free montage that quickly takes the audience through the first part of Daniel Plainview’s (Daniel Day-Lewis) history in order to bring the audience up to the film’s present day. It’s absolutely instrumental to the rest of the film.

2. Parallel story lines. A parallel story structure is a great device to show the audience what’s happening to different characters at different locations, but at more or less the same time. With most scripts, parallel actions are designed to eventually converge as related or often unrelated characters ultimately end up in the same place for a shared plot. An interesting take on this is Cloud Atlas, in which an ensemble cast plays different characters spread across six different eras and locations – past, present and future.

The editing style pulled off by Alexander Berner is quite a bit different from traditional parallel story editing. A set of characters might start a scene in one era. Halfway through the scene – through some type of abrupt cut, such as walking through a door – the characters, location and era shift to somewhere else. However, the story and the editing are such that you clearly understand how the story continues for the first half of that scene, as well as how it led into the second half. This is all without explicitly shooting those parts of each scene. Scene A/era A informs your understanding of scene B/era B and vice versa.

3. Understand camera movement. When a camera zooms, moves or is used in a shaky, handheld manner, this elicits certain emotions from the audience. As a director or DP, you need to understand when each style is appropriate and when it can be overdone. Zooming into a close-up while an actor delivers a line should be done intentionally. It tells the audience, “Listen up. This is important.” If you shoot handheld footage, like most of the Bourne series, it drives a level of documentary-style, frenetic action that should be in keeping with the concept.

The TV series NYPD Blue is credited with introducing TV audiences to the “shaky-cam” style of camera work. Many pros thought it was overdone, with movement often being introduced in an unmotivated fashion. Yet, the original Law & Order series also made extensive use of handheld photography. As this was more in keeping with a subtle documentary style, few complained about its use on that show.

4. Color palettes and art direction. Many new filmmakers feel that they can get any look they want through color grading. The reality is that it all starts with art direction. Grading should enhance what’s there, not manufacture something that isn’t. To get that “orange & teal” look, you need a set and wardrobe with some greens and blues in them. To get a warm, earthy look, you need a set and wardrobe with browns and reds.

This even extends to black & white films. To get the right contrast and tonal values in black & white, you often have to use set/wardrobe color choices that are not ideal in a color world. That’s because different colors carry differing luminance and midrange values, which becomes very obvious once you eliminate the color information from the picture (the short sketch after this list puts numbers on it). Make sure you take that into account if you plan to produce a black & white film.

5. Score versus sound design. Music should enhance and underscore a film, but it does not have to be wall-to-wall. Some films, like American Hustle and The Wolf of Wall Street, are driven by a score of popular tunes. Others are composed with an original score. However, often the “score” consists of sound design elements and simple musical drones designed to heighten tension and otherwise manipulate emotion. The absence of score in a scene can achieve the same effect. Stark, simple sound effects elements may have more impact on the audience than music. Learn when to use one or the other or both. Often less is more.

6. Don’t tell too much story. Not every film requires extensive exposition. As I said at the top, a film is not a book. Visual cues are as important as the spoken word and will often tell the audience a lot more in shorthand than pages and pages of script. The audience is interested in the journey your film’s characters are on and frequently needs very little backstory to understand them. Don’t shy away from shooting enough of that sort of detail, but also don’t be afraid to cut it out when it becomes superfluous.
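As promised in pointer four, here is the unequal luminance of the primaries in numbers. The Rec. 709 luma formula weights green roughly ten times more heavily than blue, which is why a saturated blue costume reads nearly black once the color is stripped away:

```python
# Rec. 709 luma weights: green dominates, blue barely registers.
def luma709(r, g, b):
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

print(luma709(0, 1, 0))  # pure green -> 0.7152 (bright gray)
print(luma709(0, 0, 1))  # pure blue  -> 0.0722 (nearly black)
```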

©2014 Oliver Peters

Amira Color Tool and your NLE

I was recently alerted to the new Amira Color Tool by Michael Phillips’ 24p blog. This is a lightweight ARRI software application designed to create custom in-camera looks for the Amira camera. You do this by creating custom color look-up tables (LUTs). The Amira Color Tool is available as a free download from the ARRI website (free registration required). Although the application is designed for the camera, you can also export looks in a variety of LUT file formats, which, in turn, may be installed and applied to footage in a number of different editing and color correction applications. I tested this in both Apple Final Cut Pro X and Avid Media Composer | Software (v8) with good results.

The Amira Color Tool is designed to correct log-C encoded footage to straight Rec709, either as a plain conversion or with a custom look applied. ARRI offers some very good instructions, white papers, sample looks and tutorials that cover the operation of this software. The signal flow is from the log-C image, to the Rec709 correction, and then to the CDL-based color correction. To my eye, the math appears to be floating point, because a Rec709 conversion that throws a shot into clipping can be pulled back out of clipping in the look tab, using the CDL color correction tools. Therefore it is possible to use this tool for shots other than ARRI Amira or Alexa log-C footage, as long as the footage is sufficiently flat.

The CDL correction tools are based on slope, offset and power. In that model, slope is equivalent to gain, offset to lift and power to gamma. In addition to color wheels, there’s a second video look parameters tab with hue intensities for the six main vectors (red, yellow, green, cyan, blue and magenta). The Amira Color Tool is Mac-only and opens both QuickTime and DPX files. It worked successfully with the clips I tested, shot on an Alexa (log-C), Blackmagic Cinema Camera (BMD Film profile), Sony F-3 (S-log) and Canon 1DC (4K Canon-log). Remember that the software is designed to correct flat, log-C images, so you probably don’t want to use this with images that were already encoded with vibrant Rec709 colors.
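The CDL transfer function itself is compact enough to write out. Per channel, the output is the input times slope, plus offset, raised to the power value. A short Python sketch of that math (the clamp placement follows common practice; consult the ASC CDL spec for the full details):

```python
# ASC CDL per-channel transfer: out = (in * slope + offset) ** power.
# Slope behaves like gain, offset like lift, power like gamma.
def cdl(value, slope=1.0, offset=0.0, power=1.0):
    v = value * slope + offset
    v = max(v, 0.0)           # clamp before the power function
    return v ** power

print(cdl(0.5, slope=1.2, offset=-0.05, power=0.9))  # ~0.58, a mild lift
```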

FCP X

To use the Amira Color Tool, import your clip from the application’s file browser, set the look and export a 3D LUT in the appropriate format. I used the DaVinci Resolve setting, which creates a 3D LUT in a .cube format file. To get this into FCP X, you need to buy and install a LUT filter, like Color Grading Central’s LUT Utility. To install a new LUT there, open the LUT Utility pane in System Preferences, click the “+” symbol and navigate to where the file was saved. In FCP X, apply the LUT Utility to the clip as a filter. From the filter’s pulldown selection in the inspector, choose the new LUT that you’ve created and installed. One caveat is to be careful with ARRI files. Any files recorded with newer ARRI firmware are flagged for log-C and FCP X automatically corrects these to Rec709. Since you don’t want to double up on LUTs, make sure “log processing” is unchecked for those clips in the info tab of the inspector pane.
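For reference, a Resolve-style .cube file is plain text: a header declaring the lattice size, followed by one RGB triple per lattice point, with red varying fastest. A tiny hand-written identity cube looks like this:

```
TITLE "Identity"
LUT_3D_SIZE 2
0.0 0.0 0.0
1.0 0.0 0.0
0.0 1.0 0.0
1.0 1.0 0.0
0.0 0.0 1.0
1.0 0.0 1.0
0.0 1.0 1.0
1.0 1.0 1.0
```

A real camera LUT simply uses a denser lattice (typically 17, 33 or 65 points per side) and non-identity values.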

Media Composer

To use the custom LUTs in Media Composer, select “source settings” for the clip. Go to the color management tab and install the LUT. Now it will be available in the pull-down menu for color conversions. This color management change can be applied to a single clip or to a batch of clips within a bin.

In both cases, the source clips in FCP X and/or Media Composer will play in real-time with the custom look already applied.



©2014 Oliver Peters