Lumetri plus SpeedGrade Looks


Last year I created a series of Looks presets that are designed to work with SpeedGrade CC. These use Adobe’s .look format, which is a self-contained container format that includes SpeedGrade color correction layers and built-in effects. Although I specifically designed these for use with SpeedGrade, I received numerous inquiries as to how they could be used directly within Premiere Pro. There have been solutions, but finally with the release of Premiere Pro CC 2015, this has become very easy. (Look for a full review of Premiere Pro CC 2015 in a future post.) Click any image for an expanded view.

One of the top features of the CC 2015 release is the new Lumetri Color panel for Premiere Pro. When you select the Color workspace, the Premiere Pro interface will automatically display the Lumetri Color panel along with new, real-time videoscopes. This new panel provides extensive color correction features in a single panel (controls are also available in the Effect Controls panel). It is based on a layer design that is similar to the Lightroom adjustment controls.

The top control of the panel lets you select either the source clip (left name) or that one instance on the timeline (right name). If you select the source clip, then any correction is applied as a master clip effect. This correction will ripple to any other instances of that source on the timeline. If you select the timeline clip, then corrections only affect that one spot on the timeline. Key, for the purposes of this article, is the fact that the Lumetri Color panel includes two entry points for LUTs, using either the .cube or .look format. Adobe supplies a set of Adobe and LookLabs (SpeedLooks) LUTs. You can access built-in or third-party files from either the Basic or the Creative tab of the Lumetri Color panel.

If you want to use any custom Look file – such as the free ones that I built or a purchased set, like SpeedLooks – simply choose Browse from the pulldown menu and navigate to the location on your hard drive containing the file that you want to use. Sometimes this will require two LUTs. For example, SpeedLooks are based on corrections to a default log format optimized for LookLabs products. This means you’ll need to apply one of their camera patches to move the camera color into their unified log format. On the other hand, my Looks are based on a standard image, so you may or may not need an additional LUT. If you have ARRI Alexa footage recorded with a Log-C gamma profile, then you’ll want to add Adobe’s default Log-C-to-Rec709 LUT, along with the Look file. In both examples, you would add the camera LUT in the Basic tab, since this is where the correction pipeline starts. Camera LUTs should be applied as source effects, so that they are applied as master clip effects.

The next step is to apply your creative “look”, which might be a film emulation LUT or some other type of subjective look. This is applied through the pulldown in the Creative tab. Usually it’s best to apply this as a timeline effect. Simply select a built-in option or browse to other choices on your hard drive. In the case of my SpeedGrade Looks, pick the one you like based on the style you are after. Since the .look format can contain SpeedGrade’s built-in effect filters and vignettes, these will be included when applied in the Lumetri panel as part of a single LUT file.
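For the curious, a .cube file is just a plain text table of RGB values. As a rough illustration – this is only a sketch with an arbitrary file name and LUT size, not an Adobe-supplied tool – the following Python snippet writes a small identity LUT that the Browse option in Lumetri can load:

    # write_identity_cube.py – writes a tiny identity 3D LUT in the .cube text format.
    # The file name and 17-point size are arbitrary examples.
    SIZE = 17  # common sizes are 17, 33 or 65 points per axis

    with open("identity_17.cube", "w") as f:
        f.write('TITLE "Identity"\n')
        f.write(f"LUT_3D_SIZE {SIZE}\n")
        for b in range(SIZE):
            for g in range(SIZE):
                for r in range(SIZE):  # red varies fastest in the .cube format
                    f.write(f"{r/(SIZE-1):.6f} {g/(SIZE-1):.6f} {b/(SIZE-1):.6f}\n")

Loading an identity LUT changes nothing, of course, but swapping in measured color values is how film emulation packages like the ones mentioned above are built.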

As with any LUT, not all settings work ideally with your own footage. This means you MUST adjust the other settings in the Lumetri Color panel to get the results you want. A creative LUT is only a starting point and never the final look. As you look through the various controls on the tabs, you’ll see a plethora of grading tools for exposure, contrast, color balance, curves, vignettes, and more. Tweak to your heart’s content and you’ll get some outstanding results without ever leaving the Premiere Pro environment.

Click here to download a .zip archive of the free SpeedGrade Looks file.

©2015 Oliver Peters

Fresh Dressed

The Sundance Film Festival is always a great event to showcase not just innovative dramas and comedies, but also new documentaries. This year brought good news for Adobe, because 21 of the documentaries to be shown were edited on Premiere Pro, which is more than double last year’s count. One such film is Fresh Dressed, which chronicles the history of hip-hop fashion from its birth in the Bronx during the 1970s to its evolution into a mainstream industry. It digs underneath the surface to look into other factors, like race and the societal context. Fresh Dressed was the first film written and directed by veteran producer Sacha Jenkins (Being Terry Kennedy, 50 Cent: The Power and the Money). The film features interviews with Pharrell Williams, Nas, Daymond John, Damon Dash, and Karl Kani, among others. It includes archival footage and some animation.

I recently spoke with Andrea B. Scott (Florence Arizona, A Place at the Table), who was brought in to complete the editing of the film to get it ready in time for Sundance submission. Scott explains, “Sacha and the team started shooting interviews in September of 2013. Initially there was another editor on board, who handled the first pass of cutting and organization of the project. I came to the film in May of 2014 after a basic assembly had been completed. This film was being produced by CNN and they recommended me. I definitely agree with the sentiment that editing is a lot like ‘writing with pictures’. It was my job to streamline the film and help craft the narrative, and bring Sacha’s vision to life as a moving story.”

Scott has worked on several documentaries before and has her own routine for learning the material. She says, “I usually start by watching the interviews through a couple of times, making notes with markers, and also by reading interview transcripts and highlighting certain passages. Then, I’ll pull selects to whittle down the interview to the parts that are most likely to be used in any given section. On Fresh Dressed, because I started with an assembly and needed to work quickly to get to a rough cut, I relied heavily on interview transcripts – going through the film section-by-section and interview-by-interview, and pulling selects – going back and forth from reading the transcript to watching the interview. Fresh Dressed involved about 30 interviews and totaled approximately 200 hours of raw footage. A lot of the archival search had already been done by the time I came on board, so I also had to watch through that footage and had a lot of good material to pull from.”

All film editing involves a working relationship between the editor and the director and Fresh Dressed was no exception. Scott continues, “It’s always a process of gaining the trust of the director. I come from the suburbs and I’m a bit younger than some of the crew, so it was a steep learning curve for me to understand the history of the hip-hop culture and fashion. It basically evolved from the urban gang culture of the 1970s, moved out from New York City, and went global from there. Inevitably, as the editor, you bring fresh eyes to the project and part of the editing process is to refine. The goal was to tell the story without voice-over, so we used the interviews to create that narrative thread. I put in a lot more archival material than was there before, which served to enliven the film with moments of nostalgia and infuse it with a fun energy. In a written script or book there can be a lot of side stories, which make sense on paper and are easy for the reader to follow and digest. But, the film we were making had to be more direct, with a linear timeline. Part of what I did was to strip away tangents that take you away from the main story.”

Scott’s touch also extended to the music. “The film was originally delivered to me with wall-to-wall music,” she explains. “I stripped out the music at first, so I could really think about story. Then I added temp score back in places to help steer the audience and underscore certain moments with another level of meaning.  In the end, we hired a talented composer, Tyler Strickland, to write the bulk of the score, and we also used some popular tracks from critical moments in the history of hip-hop.”

This was Scott’s first experience with Adobe Premiere Pro CC. Her prior experience had been with Apple Final Cut Pro (the “legacy” version). She found it to be a relatively easy transition. “The production company had already started the edit on Premiere Pro and so I continued with it. I welcomed being pushed to a new editing platform. It took about a week for me to get the hang of it. Since we were on a short deadline by that time, I simply ran it like I was used to running Final Cut. I really didn’t have the time to learn all of its nuances. I used the FCP keyboard settings, so everything felt natural to me. There’s a lot about Premiere Pro that I really like now. For example, the way it works with native media and using Adobe Media Encoder to export files.” The workstations were connected to shared storage, allowing Scott to access material from any computer in the production office.

Editors considering a shift to Premiere Pro CC sometimes question its performance with long-form projects. Scott responds, “I was editing on an iMac and performance was fine. One tip I found that helps to speed up the loading of a large project is to discard old sequences. When I edit, I generally duplicate sequences and continue on those as I make changes. So on a large project you tend to build up a lot of sequences that way. While it’s good to save the past few versions in case you need to go back, you still have a lot of the oldest ones that simply aren’t ever needed again. These tend to slow down the speed of loading the project as all the media is relinked each time you launch it. By simply getting rid of a lot of these, you can improve performance.”

To handle the final stages of post, Scott exported an OMF file from Premiere Pro CC to be used by the audio mixer and an XML file for the colorist. The final color correction of Fresh Dressed is being handled by Light of Day in New York. They will also complete the conform and recreate all moves on archival stills.

Scott concludes, “The film was, for the most part, made in New York, which makes sense, because Fresh Dressed really is a New York story at its heart.  Working on this film, I gained another level of love for New York, a deeper appreciation for all the many stories that start in this city, and for the deeper context that surrounds those individual stories.  Plus I had a lot of fun along the way.”

Read more about Fresh Dressed at Adobe’s Premiere Pro blog.

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2015 Oliver Peters

Understanding SpeedGrade

How you handle color correction depends on your temperament and level of expertise. Some editors want to stay within the NLE, so that editorial adjustments are easily made after grading. Others prefer the roundtrip to a powerful external application. When Adobe added the Direct Link conduit between Premiere Pro CC and SpeedGrade CC, they gave Premiere Pro editors the best of both worlds.

Displays

SpeedGrade is a standalone grading application that was initially designed around an SDI feed from the GPU to a second monitor for your external video. After the Adobe acquisition, Mercury Transmit was eventually added, so you can run SpeedGrade with one display, two computer displays, or a computer display plus a broadcast monitor. With a single display, the video viewer is integrated into the interface. At home, I use two computer displays, so by enabling a dual display layout, I get the SpeedGrade interface on one screen and the full-screen video viewer on the other. To do this you have to correctly offset the pixel dimensions and position for the secondary display in order to see it. Otherwise the image is hidden behind the interface.

Using Mercury Transmit, the viewer image is sent to an external monitor, but you’ll need an appropriate capture/monitoring card or device. AJA products seem to work fine. Some Blackmagic devices work and others don’t. When this works, you will lose the viewer from the interface, so it’s best to have the external display close – as in next to your interface monitor.

Timeline

When you use Direct Link, you are actually sending the Premiere Pro timeline to SpeedGrade. This means that edits and timeline video layers are determined by Premiere Pro and those editing functions are disabled in SpeedGrade. It IS the Premiere Pro timeline. This means certain formats that might not be natively supported by a standalone SpeedGrade project will be supported via the Direct Link path – as long as Premiere Pro natively supports them.

There is a symbiotic relationship between Premiere Pro and SpeedGrade. For example, I worked on a music video that was edited natively using RED camera media. The editor had done a lot of reframing from the native 4K media in the 1080 timeline. All of this geometry was correctly interpreted by SpeedGrade. When I compared the same sequence in Resolve (using an XML roundtrip), the geometry was all wrong. SpeedGrade doesn’t give you access to the camera raw settings for the .r3d media, but Premiere Pro does. So in this case, I adjusted the camera raw values by using the source settings control in Premiere Pro, which then carried those adjustments over to SpeedGrade.

Since the Premiere Pro timeline is the SpeedGrade timeline when you use Direct Link, you can add elements into the sequence from Premiere, in order to make them available in SpeedGrade. Let’s say you want to add a common edge vignette across all the clips of your sequence. Simply add an adjustment layer to a top track while in Premiere. This appears in your SpeedGrade timeline, enabling you to add a mask and correction within the adjustment layer clip. In addition, any video effects filters that you’ve applied in Premiere will show up in SpeedGrade. You don’t have access to the controls, but you will see the results interactively as you make color correction adjustments.

All SpeedGrade color correction values are applied to the clip as a single Lumetri effect when you send the timeline back to Premiere Pro. All grading layers are collapsed into a single composite effect per clip, which appears in the clip’s effect stack (in Premiere Pro) along with all other filters. In this way you can easily trim edit points without regard to the color correction. Traditional roundtrips render new media with baked-in color correction values. There, you can only work within the boundaries of the handles that you’ve added to the file upon rendering. Not so with Direct Link, since color correction is like any other effect applied to the original media. Any editorial changes you’ve made in Premiere Pro are reflected in SpeedGrade should you go back for tweaks, as long as you continue to use Direct Link.

12-way and more

Most editors are familiar with 3-way color correctors that have level and balance controls for shadows, midrange and highlights. Many refer to SpeedGrade’s color correction model as a 12-way color corrector. The grading interface features a 3-way (lift/gamma/gain) control for four ranges of correction: overall, shadows, midrange, and highlights. Each tab also adds control of contrast, pivot, color temperature, magenta (tint), and saturation. Since shadow, midrange, and highlight ranges overlap, you also have sliders that adjust the overlap thresholds between shadow and midrange and between the midrange and highlight areas.
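For those who like to look under the hood, the lift/gamma/gain controls map to fairly simple math. Here’s one common simplified formulation in Python – a sketch of the general idea, not SpeedGrade’s exact internal processing:

    import numpy as np

    def lift_gamma_gain(pixel, lift=0.0, gamma=1.0, gain=1.0):
        # pixel: value(s) in the 0..1 range; returns the corrected value(s)
        out = pixel * gain + lift          # gain scales the signal, lift offsets the blacks
        out = np.clip(out, 0.0, 1.0)
        return out ** (1.0 / gamma)        # gamma bends the midtones

    # Example: a slight black lift, a midtone bend and a touch of extra gain on a grayscale ramp
    ramp = np.linspace(0.0, 1.0, 5)
    print(lift_gamma_gain(ramp, lift=0.02, gamma=1.1, gain=1.05))

In the 12-way model, a version of this adjustment is applied four times – overall, shadows, midrange, and highlights – with the overlap sliders determining how the three tonal ranges blend.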

Color correction is layer based – similar to Photoshop or After Effects. SpeedGrade features primary (“P”), secondary (“S”) and filter layers (the “+” symbol). When you add layers, they are stacked from bottom to top and each layer includes an opacity control. As such, layers work much the same as rooms in Apple Color or nodes in DaVinci Resolve. You can create a multi-layered adjustment by using a series of stacked primary layers. Shape masks, like that for a vignette, should be applied to a primary layer. The mask may be normal or inverted so that the correction is applied either to the inside or the outside of the mask. Secondaries should be reserved for HSL keys. For instance, highlighting the skin tones of a face to adjust its color separately from the rest of the image. The filter layer (“+”) is where you’ll find a number of useful tools, including Photoshop-style creative effect filters, LUTs, and curves.

Working with grades

Color correction can be applied to a clip as either a master clip correction or just a clip correction (or both). When you grade using the default clip tab, then that color correction is only being applied to that single clip. If you grade in the master clip tab, then any color correction that you apply to that clip will also be applied to every other instance of that same media file elsewhere on the timeline. Theoretically, in a multicam edit – made up of four cameras with a single media file per camera – you could grade the entire timeline by simply color correcting the first clip for each of the four cameras as a master clip correction. All other clips would automatically inherit the same settings. Of course, that almost never works out quite so perfectly; therefore, you can grade a clip using both the master clip and the regular clip tabs. Use the master for a general setting and still use the regular clip tab to tweak each shot as needed.

Grades can be saved and recalled as Lumetri Looks, but typically these aren’t as useful in actual grading as standard copy-and-paste functions – a recent addition to SpeedGrade CC. Simply highlight one or more layers of a graded clip and press copy (cmd+c on a Mac). Then paste (cmd+v on a Mac) those to the target clip. These will be pasted in a stack on top of the default, blank primary correction that’s there on every clip. You can choose to use, ignore, or delete this extra primary layer.

SpeedGrade features a cool trick to facilitate shot matching. The timeline playhead can be broken out into multiple playheads, which will enable you to compare two or more shots in real-time on the viewer. This quick comparison lets you make adjustments to each to get a closer match in context with the surrounding shots.

A grading workflow

Everyone has their own approach to grading and these days there’s a lot of focus on camera and creative LUTs. My suggestions for prepping a Premiere Pro CC sequence for SpeedGrade CC go something like this.

Once you are largely done with the editing, collapse all multicam clips and flatten the timeline as much as possible down to the bottom video layer. Add one or two video tracks with adjustment layers, depending on what you want to do in the grade. These should be above the last video layer. All graphics – like lower thirds – should be on tracks above the adjustment layer tracks. This is assuming that you don’t want to include these in the color correction. Now duplicate the sequence and delete the tracks with the graphics from the dupe. Send the dupe to SpeedGrade CC via Direct Link.

In SpeedGrade, ignore the first primary layer and add a filter layer (“+”) above it. Select a camera patch LUT. For example, an ARRI Log-C-to-Rec-709 LUT for Log-C gamma-encoded Alexa footage. Repeat this for every clip from the same camera type. If you intend to use a creative LUT, like one of the SpeedLooks from LookLabs, you’ll need one of their camera patches. This shifts the camera video into a unified gamma profile optimized for their creative LUTs. If all of the footage used in the timeline came from the same camera and used the same gamma profile, then in the case of SpeedLooks, you could apply the creative LUT to one of the adjustment layer clips. This will apply that LUT to everything in the sequence.

Once you’ve applied input and output LUTs you can grade each clip as you’d like, using primary and secondary layers. Use filter layers for curves. Any order and any number of layers per clip is fine. Using this methodology all grading is happening between the camera patch LUT and the creative LUT added to the adjustment layer track. Finally, if you want a soft edge vignette on all clips, apply an edge mask to the default primary layer of the topmost adjustment layer clip. Adjust the size, shape, and softness of the mask. Darken the outside of the mask area. Done.

(Note that not every camera uses logarithmic gamma encoding, nor do you want to use LUTs on every project. These are the “icing on the cake”, NOT the “meat and potatoes” of grading. If your sequence is a standard correction without any stylized creative looks, then ignore the LUT procedures I described above.)

Now simply send your timeline back to Premiere Pro (the “Pr” button). Back in Premiere Pro CC, duplicate that sequence. Copy-and-paste the graphics tracks from the original sequence to the available blank tracks of the copy. When done, you’ll have three sequences: 1) non-color corrected with graphics, 2) color corrected without graphics, and 3) final with color correction and graphics. The beauty of the Direct Link path between Premiere Pro CC and SpeedGrade CC is that you can easily go back and forth for changes without ever being locked in at any point in the process.

©2015 Oliver Peters

Adobe Anywhere and Divine Access


Editors like the integration of Adobe’s software, especially Dynamic Link and Direct Link between creative applications. This sort of approach is applied to collaborative workflows with Adobe Anywhere, which permits multiple stakeholders, including editors, producers and directors, to access common media and productions from multiple, remote locations. One company that has invested in the Adobe Anywhere environment is G-Men Media of Venice, California, who installed it as their post production hub. By using Adobe Anywhere, Jeff Way (COO) and Clay Glendenning (CEO) sought to improve the efficiency of the filmmaking process for their productions. No science project – they have now tested the concept in the real world on several indie feature films.

Their latest film, Divine Access, produced by The Traveling Picture Show Company in association with G-Men Media, is a religious satire centering on reluctant prophet Jack Harriman. Forces both natural and supernatural lead Harriman down a road to redemption culminating in a final showdown with his long time foe, Reverend Guy Roy Davis. Steven Chester Prince (Boyhood, The Ringer, A Scanner Darkly) moves behind the camera as the film’s director. The entire film was shot in Austin, Texas during May of 2014, but the processing of dailies and all post production was handled back at the Venice facility. Way explains, “During principal photography we were able to utilize our Anywhere system to turn around dailies and rough cuts within hours after shooting. This reduced our turnaround time for review and approval, thus reducing budget line items. Using Anywhere enabled us to identify cuts and mark them as viable the same day, reducing the need for expensive pickup shoots later down the line.”

The production workflow

Director of Photography Julie Kirkwood (Hello I Must Be Going, Collaborator, Trek Nation) picked the ARRI ALEXA for this film and scenes were recorded as ProRes 4444 in 2K. An on-set data wrangler would back up the media to local hard drives and then a runner would take the media to a downtown upload site. The production company found an Austin location with 1Gbps upload speeds. This enabled them to upload 200GB of data in about 45 minutes. Most days only 50-80GB were uploaded at one time, since uploads happened several times throughout each day.

Way says, “We implemented a technical pipeline for the film that allowed us to remain flexible.  Adobe’s open API platform made this possible. During production we used an Amazon S3 instance in conjunction with Aspera to get the footage securely to our system and also act as a cloud back-up.” By uploading to Amazon and then downloading the media into their Anywhere system in Venice, G-Men now had secure, full-resolution media in redundant locations. Camera LUTs were also sent with the camera files, which could be added to the media for editorial purposes in Venice. Amazon will also provide a long-term archive of the 8TB of raw media for additional protection and redundancy. This Anywhere/Amazon/Aspera pipeline was supervised by software developer Matt Smith.

Back in Venice, the download and ingest into the Anywhere server and storage was an automated process that Smith programmed. Glendenning explains, “It would automatically populate a bin named for that day with the incoming assets. Wells [Phinny, G-Men editorial assistant] would be able to grab from subfolders named ‘video’ and ‘audio’ to quickly organize clips into scene subfolders within the Anywhere production that he would create from that day’s callsheet. Wells did most of this work remotely from his home office a few miles away from the G-Men headquarters.” Footage was synced and logged for on-set review of dailies and on-set cuts the next day. Phinny effectively functioned as a remote DIT in a unique way.
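Smith’s actual automation was built around the Anywhere system, but the sorting step Glendenning describes boils down to something like this hypothetical Python sketch (the paths and file extensions are examples only):

    import shutil
    from datetime import date
    from pathlib import Path

    DOWNLOADS = Path("/Volumes/anywhere/incoming")                 # hypothetical download folder
    DAILIES = Path("/Volumes/anywhere/productions/divine_access")  # hypothetical production folder

    VIDEO_EXT = {".mov", ".mxf"}
    AUDIO_EXT = {".wav", ".bwf"}

    day_bin = DAILIES / date.today().isoformat()   # a bin named for that day
    for clip in DOWNLOADS.iterdir():
        if clip.suffix.lower() in VIDEO_EXT:
            dest = day_bin / "video"
        elif clip.suffix.lower() in AUDIO_EXT:
            dest = day_bin / "audio"
        else:
            continue
        dest.mkdir(parents=True, exist_ok=True)
        shutil.move(str(clip), str(dest / clip.name))

From there, organizing clips into scene subfolders according to the callsheet remained a human step handled by the assistant editor.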

Remote access in Austin to the Adobe Anywhere production for review was made possible through an iPad application. Way explains, “We had close contact with Wells via text message, phone and e-mail. The iPad access to Anywhere used a secure VPN connection over the Internet. We found that a 4G wireless data connection was sufficient to play the clips and cuts. On scenes where the director had concerns that there might not be enough coverage, the process enabled us to quickly see something. No time was lost to transcoding media or to exporting a viewable copy, which would be typical of the more traditional way of working.”

Creative editorial mixing Adobe Anywhere and Avid Media Composer

Once principal photography was completed, editing moved into the G-Men mothership. Instead of editing with Premiere Pro, however, Avid Media Composer was used. According to Way, “Our goal was to utilize the Anywhere system throughout as much of the production as possible. Although it would have been nice to use Premiere Pro for the creative edit, we believed going with an editor that shared our director’s creative vision was the best for the film. Kindra Marra [Scenic Route, Sassy Pants, Hick] preferred to cut in Media Composer. This gave us the opportunity to test how the system could adapt already existing Adobe productions.” G-Men has handled post on other productions where the editor worked remotely with an Anywhere production. In this case, since Marra lived close-by in Santa Monica, it was simpler just to set up the cutting room at their Venice facility. At the start of this phase, assistant editor Justin (J.T.) Billings joined the team.

Avid has added subscription pricing, so G-Men installed the Divine Access cutting room using a Mac Pro and “renting” the Media Composer 8 software for a few months. The Anywhere servers are integrated with a Facilis Technology TerraBlock shared storage network, which is compatible with most editing applications, including both Premiere Pro and Media Composer. The Mac Pro tower was wired into the TerraBlock SAN and was able to see the same ALEXA ProRes media as Anywhere. According to Billings, “Once all the media was on the TerraBlock drives, Marra was able to access these in the Media Composer project using Avid’s AMA-linking. This worked well and meant that no media had to be duplicated. The film was cut solely with AMA-linked media. External drives were also connected to the workstations for nightly back-ups as another layer of protection.”

Adobe Anywhere at the finish line

Once the cut was locked, an AAF composition for the edited sequence was sent from Media Composer to DaVinci Resolve 11, which was installed on an HP workstation at G-Men. This unit was also connected to the TerraBlock storage, so media instantly linked when the AAF file was imported. Freelance colorist Mark Todd Osborne graded the film on Resolve 11 and then exported a new AAF file corresponding to the rendered media, which now also existed on the SAN drives. This AAF composition was then re-imported into Media Composer.

Billings continues, “All of the original audio elements existed in the Media Composer project and there was no reason to bring them into Premiere Pro. By importing Resolve’s AAF back into Media Composer, we could then double-check the final timeline with audio and color corrected picture. From here, the audio and OMF files were exported for Pro Tools [sound editorial and the mix is being done out-of-house]. Reference video of the film for the mix could now use the graded images. A new AAF file for the graded timeline was also exported from Media Composer, which then went back into Premiere Pro and the Anywhere production. Once we get the mixed tracks back, these will be added to the Premiere Pro timeline. Final visual effects shots can also be loaded into Anywhere and then inserted into the Premiere Pro sequence. From here on, all further versions of Divine Access will be exported from Premiere Pro and Anywhere.”

Glendenning points out that, “To make sure the process went smoothly, we did have a veteran post production supervisor – Hank Braxtan – double-check our workflow. He and I have done a lot of work together over the years and he has more than a decade of experience overseeing an Avid house. We made sure he was available whenever there were Avid-related technical questions from the editors.”

Way says, “Previously, on post production of [the indie film] Savageland, we were able to utilize Anywhere for full post production through to delivery. Divine Access has allowed us to take advantage of our system on both sides of the creative edit including principal photography and post finishing through to delivery. This gives us capabilities through entire productions. We have a strong mix of Apple and PC hardware and now we’ve proven that our Anywhere implementation is adaptable to a variety of different hardware and software configurations. Now it becomes a non-issue whether it’s Adobe, Avid or Resolve. It’s whatever the creative needs dictate; plus, we are happy to be able to use the fastest machines.”

Glendenning concludes, “Tight budget projects have tight deadlines and some producers have missed their deadlines because of post. We installed Adobe Anywhere and set up the ecosystem surrounding it because we feel this is a better way that can save time and money. I believe the strategy employed for Divine Access has been a great improvement over the usual methods. Using Adobe Anywhere really let us hit it out of the park.”

Originally written for DV magazine / CreativePlanetNetwork.

©2015 Oliver Peters

Stocking Stuffers 2014

As we head toward the end of the year, it’s time to look again at a few items you can use to spruce up your edit bay.

Let’s start at the computer. The “tube” Mac Pro has been out for nearly a year, but many will still be trying to get the most life out of their existing Mac Pro “tower”. I wrote about this a while back, so this is a bit of a recap. More RAM, an internal SSD and an upgraded GPU card are the best starting points. OWC and Crucial are your best choices for RAM and solid state drives. If you want to bump up your GPU, then the Sapphire 7950 (Note: I have run into issues with some of these cards, where the spacer screws are too tall, requiring you to install the card in slot 2) and/or Nvidia GTX 680 Mac Edition cards are popular choices. However, these will only give you an incremental boost if you’ve already been running an ATI 5870 or Nvidia Quadro 4000 display card. If you have the dough and want some solid horsepower, then go for the Nvidia Quadro K5000 card for the Mac. To expand your audio monitoring, look at Mackie mixers, KRK speakers and the PreSonus Audiobox USB interface. Naturally there are many video monitor options, but assuming you have an AJA or Blackmagic Design interface, FSI would be my choice. HP Dreamcolor is also a good option when connecting directly to the computer.

The video plug-in market is prolific, with plenty of packages and/or individual filters from FxFactory, Boris, GenArts, FCP Effects, Crumplepop, Red Giant and others. I like the Universe package from Red Giant, because it supports FCP X, Motion, Premiere Pro and After Effects. Red Giant continues to expand the package, including some very nice new premium effects. If you are a Media Composer user, then you might want to look into the upgrade from Avid FX to Boris Red. Naturally, you can’t go wrong with FxFactory, especially if you use FCP X. There’s a wide range of options with the ability to purchase single filters – all centrally managed through the FxFactory application.

For audio, the go-to filter companies are iZotope, Waves and Focusrite to name a few. iZotope released some nice tools in its RX4 package – a state-of-the-art repair and restoration suite. If you just want a suite of EQ and compression tools, then Nectar Elements or Nectar 2 are the best all-in-one collections of audio filters. While most editors do their audio editing/mastering within their NLE, some need a bit more. Along with a 2.0 bump for Sound Forge Pro Mac, Sony Creative Software also released a standard version of Sound Forge through the Mac App Store.

In the color correction world, there’s been a lot of development in film emulation look-up tables (LUTs). These can be used in most NLEs and grading applications. If that’s for you, check out ImpulZ and Osiris from Color Grading Central (LUT Utility required with FCP X), Koji Color or the new SpeedLooks 4 (from LookLabs). Each package offers a selection of Fuji and Kodak emulations, as well as other stylized looks. These packages feature LUT files in the .cube and/or .look (Adobe) LUT file formats and, thus, are compatible with most applications. If you want film emulation that also includes 3-way grading tools and adjustable film grain, your best choice is FilmConvert 2.0.

Another category that is expanding covers the range of tools used to prep media from the camera prior to the edit. This had been something only for DITs and on-set “data wranglers”, but many videographers are increasingly using such tools on everyday productions. These now offer on-set features that benefit all file-based recordings. Pomfort Silverstack, ShotPut Pro, Redcine-X Pro and Adobe Prelude have been joined by new tools. To start, there’s Offload and EditReady, which are two very specific tools. Offload simply copies and verifies camera-card media to two target drives. EditReady is a simple drag-and-drop batch convertor to transcode media files. These join QtChange (a utility to batch-add timecode and reel IDs to media files) and Better Rename (a Finder renaming utility) in my book, as the best single-purpose production applications.
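Under the hood, an offload tool’s job is a copy-and-verify pass to multiple destinations. Here’s a rough Python sketch of that idea – hypothetical volume paths, with MD5 used purely as an example checksum:

    import hashlib
    import shutil
    from pathlib import Path

    def md5sum(path, chunk=1 << 20):
        h = hashlib.md5()
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(chunk), b""):
                h.update(block)
        return h.hexdigest()

    def offload(card, targets):
        for src in Path(card).rglob("*"):
            if not src.is_file():
                continue
            source_hash = md5sum(src)
            for target in targets:
                dest = Path(target) / src.relative_to(card)
                dest.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(src, dest)                       # copy to each target drive
                assert md5sum(dest) == source_hash, f"verification failed: {dest}"

    offload("/Volumes/CAMERA_CARD", ["/Volumes/RAID_A", "/Volumes/RAID_B"])

The dedicated applications add proper reporting, checksum manifests and camera-format awareness, which is what you’re paying for.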

If you want more in one tool, then there’s Bulletproof, which has now been joined in the market by Sony Creative Software’s Catalyst Browse and Prepare. Bulletproof features media offload, organization, color correction and transcoding. I like it, but my only beef is that it doesn’t properly handle timecode data, when present. Catalyst Browse is free and similar to Canon’s camera utility. It’s designed to read and work with media from any Sony camera. Catalyst Prepare is the paid version with an expanded feature set. It supports media from other camera manufacturers, including Canon and GoPro.

Finally, many folks are looking for an alternative to Adobe Photoshop. I’m a fan of Pixelmator, but this has been joined by Pixlr and Mischief. All three are available from the Mac App Store. Pixlr is free, but can be expanded through subscription. In its basic form, Pixlr is a stylizing application that is like a very, very “lite” version of Photoshop; however, it includes some very nice image processing filters. Mischief is a drawing application designed to work with drawing tablets, although a mouse will work, too.

©2014 Oliver Peters

Gone Girl

David Fincher is back with another dark tale of modern life, Gone Girl – the film adaptation of Gillian Flynn’s 2012 novel. Flynn also penned the screenplay. It is the story of Nick and Amy Dunne (Ben Affleck and Rosamund Pike) – writers who have been hit by the latest downturn in the economy and are living in America’s heartland. Except that Amy is now mysteriously missing under suspicious circumstances. The story is told from each of their subjective points of view. Nick’s angle is revealed through present events, while Amy’s story is told through her diary in a series of flashbacks. Through these we learn that theirs is less than the ideal marriage we see from the outside. But whose story tells the truth?

To pull the film together, Fincher turned to his trusted team of professionals including director of photography Jeff Cronenweth, editor Kirk Baxter and post production supervisor Peter Mavromates. Like Fincher’s previous films, Gone Girl has blazed new digital workflows and pushed new boundaries. It is the first major feature to use the RED EPIC Dragon camera, racking up 500 hours of raw footage. That’s the equivalent of 2,000,000 feet of 35mm film. Much of the post, including many of the visual effects, was handled in-house.

Kirk Baxter co-edited David Fincher’s The Curious Case of Benjamin Button, The Social Network and The Girl with the Dragon Tattoo with Angus Wall – films that earned the duo two best editing Oscars. Gone Girl was a solo effort for Baxter, who had also cut the first two episodes of House of Cards for Fincher. This film now becomes the first major feature to have been edited using Adobe Premiere Pro CC. Industry insiders consider this Adobe’s Cold Mountain moment. That refers to when Walter Murch used an early version of Apple Final Cut Pro to edit the film Cold Mountain, instantly raising the application’s awareness among the editing community as a viable tool for long-form post production. Now it’s Adobe’s turn.

In my conversation with Kirk Baxter, he revealed, “In between features, I edit commercials, like many other film editors. I had been cutting with Premiere Pro for about ten months before David invited me to edit Gone Girl. The production company made the decision to use Premiere Pro, because of its integration with After Effects, which was used extensively on the previous films. The Adobe suite works well for their goal to bring as much of the post in-house as possible. So, I was very comfortable with Premiere Pro when we started this film.”

It all starts with dailies

Tyler Nelson, assistant editor, explained the workflow, “The RED EPIC Dragon cameras shot 6K frames (6144 x 3072), but the shots were all framed for a 5K center extraction (5120 x 2133). This overshoot allowed reframing and stabilization. The .r3d files from the camera cards were ingested into a FotoKem nextLAB unit, which was used to transcode edit media, view dailies, archive the media to LTO data tape and transfer to shuttle drives. For offline editing, we created down-sampled ProRes 422 (LT) QuickTime media, sized at 2304 x 1152, which corresponded to the full 6K frame. The Premiere Pro sequences were set to 1920 x 800 for a 2.40:1 aspect. This size corresponded to the same 5K center extraction within the 6K camera files. By editing with the larger ProRes files inside of this timeline space, Kirk was only viewing the center extraction, but had the same relative overshoot area to enable easy repositioning in all four directions. In addition, we also uploaded dailies to the PIX system for everyone to review footage while on location. PIX also lets you include metadata for each shot, including lens choice and camera settings, such as color temperature and exposure index.”
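The numbers line up neatly: the proxies are the 6K frame scaled by a factor of 0.375, which is why a 1920 x 800 timeline frames exactly the 5K center extraction while preserving the same relative overshoot. A quick sanity check of that math (illustrative only):

    full_6k = (6144, 3072)
    extract_5k = (5120, 2133)

    scale = 2304 / full_6k[0]                              # 0.375
    print(tuple(round(d * scale) for d in full_6k))        # (2304, 1152) – proxy size
    print(tuple(round(d * scale) for d in extract_5k))     # (1920, 800) – timeline size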

Kirk Baxter has a very specific way that he likes to tackle dailies. He said, “I typically start in reverse order. David tends to hone in on the performance with each successive take until he feels he’s got it. He’s not like other directors that may ask for completely different deliveries from the actors with each take. With David, the last take might not be the best, but it’s the best starting point from which to judge the other takes. Once I go through a master shot, I’ll cut it up at the points where I feel the edits will be made. Then I’ll have the assistants repeat these edit points on all takes and string out the line readings back-to-back, so that the auditioning process is more accurate. David is very gifted at blocking and staging, so it’s rare that you don’t use an angle that was shot for a scene. I’ll then go through this sequence and lift my selected takes for each line reading up to a higher track on the timeline. My assistants take the selects and assemble a sequence of all the angles in scene order. Once it’s hyper-organized, I’ll send it to David via PIX and get his feedback. After that, I’ll cut the scene. David stays in close contact with me as he’s shooting. He wants to see a scene cut together before he strikes a set or releases an actor.”

Telling the story

The director’s cut is often where the story gets changed from what works on paper to what makes a better film. Baxter elaborated, “When David starts a film, the script has been thoroughly vetted, so typically there isn’t a lot of radical story re-arrangement in the cutting room. As editors, we got a lot of credit for the style of intercutting used in The Social Network, but truthfully that was largely in the script. The dialogue was tight and very integral to the flow, so we really couldn’t deviate a lot. I’ve always found the assembly the toughest part, due to the volume and the pressure of the ticking clock. Trying to stay on pace with the shoot involves some long days. The shooting schedule was 106 days and I had my first cut ready about two weeks after the production wrapped. A director gets around ten weeks for a director’s cut and with some directors, you are almost starting from scratch once the director arrives. With David, most of that ten week period involves adding finesse and polish, because we have done so much of the workload during the shoot.”

He continued, “The first act of Gone Girl uses a lot of flashbacks to tell Amy’s side of the story and with these, we deviated a touch from the script. We dropped a couple of scenes to help speed things along and reduced the back and forth of the two timelines by grouping flashbacks together, so that we didn’t keep interrupting the present day; but, it’s mostly executed as scripted. There was one scene towards the end that I didn’t feel was in the right place. I kept trying to move it, without success. I ended up taking another pass at the cut of the scene. Once we had the emotion right in the cut, the scene felt like it was in the right place, which is where it was written to be.”

“The hardest scenes to cut are the emotional scenes, because David simplifies the shooting. You can’t hide in dynamic motion. More complex scenes are actually easier to cut and certainly quite fun. About an hour into the film is the ‘cool girls’ scene, which rapidly answers lots of question marks that come before it. The scene runs about eight minutes long and is made up of about 200 set-ups. It’s a visual feast that should be hard to put together, but was actually dessert from start to finish, because David thought it through and supplied all the exact pieces to the puzzle.”

Music that builds tension

Composers Trent Reznor and Atticus Ross of Nine Inch Nails fame are another set of Fincher regulars. Reznor and Ross have typically supplied Baxter with an album of preliminary themes scored with key scenes in mind. These are used in the edit and then later enhanced by the composers with the final score at the time of the mix. Baxter explained, “On Gone Girl we received their music a bit later than usual, because they were touring at the time. When it did arrive, though, it was fabulous. Trent and Atticus are very good at nailing the feeling of a film like this. You start with a piece of music that has a vibe of ‘this is a safe, loving neighborhood’ and throughout three minutes it sours to something darker, which really works.”

“The final mix is usually the first time I can relax. We mixed at Skywalker Sound and that was the first chance I really had to enjoy the film, because now I was seeing it with all the right sound design and music added. This allows me to get swallowed up in the story and see beyond my role.”

Visual effects

The key factor to using Premiere Pro CC was its integration with After Effects CC via Adobe’s Dynamic Link feature. Kirk Baxter explained how he uses this feature, “Gone Girl doesn’t seem like a heavy visual effects film, but there are quite a lot of invisible effects. First of all, I tend to do a lot of invisible split screens. In a two-shot, I’ll often use a different performance for each actor. Roughly one-third of the timeline contains such shots. About two-thirds of the timeline has been stabilized or reframed. Normally, this type of in-house effects work is handled by the assistants who are using After Effects. Those shots are replaced in my sequence with an After Effects composition. As they make changes, my timeline is updated.”

“There are other types of visual effects, as well. David will take exteriors and do sky replacements, add flares, signage, trees, snow, breath, etc. The shot of Amy sinking in the water, which has been used in the trailers, is an effects composite. That’s better than trying to do multiple takes with the real actress by drowning her in cold water. Her hair and the water elements were created by Digital Domain. This is also a story about the media frenzy that grows around the mystery, which meant a lot of TV and computer screen comps. That content is as critical in the timing of a scene as the actors who are interacting with it.”

Tyler Nelson added his take on this, “A total of four assistants worked with Kirk on these in-house effects. We were using the same ProRes editing files to create the composites. In order to keep the system performance high, we would render these composites for Kirk’s timeline, instead of using unrendered After Effects composites. Once a shot was finalized, then we would go back to the 6K .r3d files and create the final composite at full resolution. The beauty of doing this all internally is that you have a team of people who really care about the quality of the project as much as everyone else. Plus the entire process becomes that much more interactive. We pushed each other to make everything as good as it could possibly be.”

Optimization and finishing

A custom pipeline was established to make the process efficient. This was spearheaded by post production consultant Jeff Brue, CTO of Open Drives. The front end storage for all active editorial files was a 36TB RAID-protected storage network built with SSDs. A second RAID built with standard HDDs was used for the .r3d camera files and visual effects elements. The hardware included a mix of HP and Apple workstations running with NVIDIA K6000 or K5200 GPU cards. Use of the NVIDIA cards was critical to permit as much real-time performance as possible during the edit. GPU performance was also a key factor in the de-Bayering of .r3d files, since the team didn’t use any of the RED Rocket accelerator cards in their pipeline. The Macs were primarily used for the offline edit, while the PCs tackled the visual effects and media processing tasks.

In order to keep the Premiere Pro projects manageable, the team broke down the film into eight reels with a separate project file per reel. Each project contained roughly 1,500 to 2,000 files. In addition to Dynamic Linking of After Effects compositions, most of the clips were multi-camera clips, as Fincher typically shoots scenes with two or more cameras for simultaneous coverage. This massive amount of media could have potentially been a huge stumbling block, but Brue worked closely with Adobe to optimize system performance over the life of the project. For example, project load times dropped from about six to eight minutes at the start down to 90 seconds at best towards the end.

The final conform and color grading was handled by Light Iron on their Quantel Pablo Rio system run by colorist Ian Vertovec. The Rio was also configured with NVIDIA Tesla cards to facilitate this 6K pipeline. Nelson explained, “In order to track everything I used a custom Filemaker Pro database as the codebook for the film. This contained all the attributes for each and every shot. By using an EDL in conjunction with the codebook, it was possible to access any shot from the server. Since we were doing a lot of the effects in-house, we essentially ‘pre-conformed’ the reels and then turned those elements over to Light Iron for the final conform. All shots were sent over as 6K DPX frames, which were cropped to 5K during the DI in the Pablo. We also handled the color management of the RED files. Production shot these with the camera color metadata set to RedColor3, RedGamma3 and an exposure index of 800. That’s what we offlined with. These were then switched to RedLogFilm gamma when the DPX files were rendered for Light Iron. If, during the grade, it was decided that one of the raw settings needed to be adjusted for a few shots, then we would change the color settings and re-render a new version for them.” The final mastering was in 4K for theatrical distribution.
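The FileMaker codebook itself was custom to this production, but the general idea of pairing an EDL with a shot database is straightforward. The following Python fragment is purely illustrative – hypothetical file names and CSV columns, and it only handles simple cut events:

    import csv

    # codebook.csv is assumed to map a source reel name to its .r3d path and notes
    codebook = {}
    with open("codebook.csv", newline="") as f:
        for row in csv.DictReader(f):          # assumed columns: reel, path, notes
            codebook[row["reel"]] = row

    with open("reel1.edl") as f:
        for line in f:
            parts = line.split()
            # CMX3600-style cut events: event number, reel, track, "C", then four timecodes
            if len(parts) >= 8 and parts[0].isdigit() and parts[3] == "C":
                event, reel = parts[0], parts[1]
                src_in, src_out = parts[4], parts[5]
                shot = codebook.get(reel)
                if shot:
                    print(f"event {event}: {reel} {src_in}-{src_out} -> {shot['path']}")

The real workflow tied this kind of lookup to pre-conformed 6K DPX deliveries for Light Iron, but the principle – EDL events resolved against a per-shot database – is the same.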

As with his previous films, director David Fincher has not only told a great story in Gone Girl, but set new standards in digital post production workflows. Seeking to retain creative control without breaking the bank, Fincher has pushed to handle as many services in-house as possible. His team has made effective use of After Effects for some time now, but the new Creative Cloud tools with Premiere Pro CC as the hub, bring the power of this suite to the forefront. Fortunately, team Fincher has been very eager to work with Adobe on product advances, many of which are evident in the new application versions previewed by Adobe at IBC in Amsterdam. With a film as complex as Gone Girl, it’s clear that Adobe Premiere Pro CC is ready for the big leagues.

Kirk Baxter closed our conversation with these final thoughts about the experience. He said, “It was a joy from start to finish making this film with David. Both he and Cean [Chaffin, producer and David Fincher’s wife] create such a tight knit post production team that you fall into an illusion that you’re making the film for yourselves. It’s almost a sad day when it’s released and belongs to everyone else.”

Originally written for Digital Video magazine / CreativePlanetNetwork.

_________________________________

Needless to say, Gone Girl has received quite a lot of press. Here are just a few additional discussions of the workflow:

Adobe panel discussion with the post team

PostPerspective

FxGuide

HDVideoPro

IndieWire

IndieWire blog

ICG Magazine

RedUser

Tony Zhou’s Vimeo take on Fincher 

©2014 Oliver Peters

Sitting in the Mix Revisited


Video editors are being called on to do more and mixing audio is one of those tasks. While advanced audio editing and mixing is still best done in a DAW and by a professional who uses those tools everyday, it’s long been the case that most local TV commercials and a lot of corporate videos are mixed by the editor within the NLE. Time for a second look at the subject.

Although most modern NLEs have very strong audio tools, I find that Adobe Premiere Pro CC is one of the better NLEs when it comes to basic audio mixing. There is a wide range of built-in plug-ins and it accepts most third party VST and AU (Mac) filters. Audio can be mixed at both the clip and the track level using faders, rubber-banding in the timeline or by writing automation mix passes with the track mixer. The following are some simple tips for getting good mixes for TV using Premiere Pro CC.

Repair – If you have problem audio tracks, don’t forget that you can send your audio clip to Audition. When you select a clip to edit in Audition, a copy of the file is extracted and sent to Audition. This extracted copy replaces the original clip on the Premiere timeline so the original stays untouched. Audition is good for surgery, such as removing background noise. There are both waveform and spectral views where it’s possible to isolate and “heal” noise elements visible in the spectral view. I recently used this to reduce the noise from a lawn mower heard in the background of an on-location interview.

Third-party filters – In addition to the built-in tools, Premiere Pro supports any compliant audio filters on your system. By scanning the system, Premiere Pro (as well as Audition) can access plug-ins that you might have installed as part of other applications. Several good filter sets are available from Focusrite, Waves and iZotope. When it comes to audio mixing for simple projects, I’m a fan of the Vocal Rider and One Knob plug-ins from Waves. Vocal Rider works best with voice-overs, automatically “riding” the level between a minimum and maximum setting. It works a bit like a human operator in evening out volume variations and is not as blunt a tool as a compressor. The One Knob filters are a series of comprehensive filters for EQ or reverb controlled by a single adjustment knob. For example, you can use the “brighter” filter to adjust a multi-band, parametric-style EQ that increases the treble of the sound.

Mixing formula – This is my standard formula for mixing TV spots in Premiere Pro. My intention is to end up with voices that sit well against a music track without the music volume being too low. A handy Premiere tool is the vocal enhancer. It’s a simple filter with an adjustment dial that balances the setting for male or female voices as well as for music. Dial in the setting by ear to the point that the voice “cuts” through the mix without sounding overly processed. For music, I’ll typically apply an EQ filter to the track and bring down the broader mid-range by -2dB. Across the master bus (or a submix bus for each stem) I’ll apply a dynamic compressor/limiter. This is just used to “soft clip” the bus volume at -10dB. Overall, I’ll adjust clip and track volumes to run under this range, so as not to be harshly compressed or clipped.

CALM – Most audio delivered for US broadcast has to be compliant with the loudness specs of the CALM Act. There are similar European standards. Adobe aids us in this by including the TC Electronic Radar metering plug-in. If you use this, place it on the master bus and make sure audio is routed first through a submix bus. I’ll place a compressor/limiter on the submix bus. This way, all volume adjustments and limiting happen upstream of the meter. By adjusting your mix with the Radar meter running, it’s possible to end up with a compliant mix that still sounds quite natural.
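If you want to double-check a finished mix outside of the NLE, the open source pyloudnorm Python library implements the same ITU-R BS.1770 loudness measurement that CALM compliance is based on. A minimal sketch – the file name is an example and your actual delivery spec always governs the target:

    import soundfile as sf       # pip install soundfile pyloudnorm
    import pyloudnorm as pyln

    data, rate = sf.read("final_mix.wav")    # hypothetical exported mix
    meter = pyln.Meter(rate)                 # ITU-R BS.1770 meter
    loudness = meter.integrated_loudness(data)

    print(f"Integrated loudness: {loudness:.1f} LUFS")
    print("within spec" if abs(loudness + 24) <= 2 else "out of spec")  # ATSC A/85 target is -24 LKFS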

©2014 Oliver Peters