Avid Media Composer Goes 4K


Avid Technology entered 2015 with a bang. The company closed out 2014 with the release of its Media Composer version 8.3 software, the first to enable higher resolution editing, including 2K, UHD and 4K projects. On January 16th of this year, Avid celebrated its relisting on the NASDAQ exchange by ringing the opening bell. Finally – as in most years – the Academy Awards nominee field is dominated by films that used Media Composer, Pro Tools, or both during the post-production process.

In a software landscape quickly shifting to rental (subscription) business models, Avid now offers one of the most flexible pricing models. Media Composer | Software may be purchased, rented, or managed through floating licenses. If you purchase a perpetual license (you own the software), then an annually-renewed support contract gives you phone support and continued software updates. Opt out of the contract and you’ll still own the software you bought – you just lose access to newer software updates.

You can purchase other optional add-ons, like Symphony for advanced color correction. Unfortunately, there’s still no resolution to the impasse between Avid and Nexidia. If you purchased ScriptSync or PhraseFind in the past, which rely on IP from Nexidia, then you can’t upgrade to version 8 or higher and continue to use those options. On the other hand, if you own an older version, such as Media Composer 7, and need to edit a project that requires a newer version, you can simply pick up a software subscription for a few months. That lets you run the latest software for the time it takes to complete the project.

The jump from Media Composer | Software 8.2 to 8.3 might seem minor, but in fact this was a huge update for Avid editors. It not only ushered in new, high-resolution project settings and capabilities, but also added a resolution-independent Avid codec – DNxHR. More than merely the ability to edit in 4K, Media Composer now addresses most of the different 4K options that cover the TV and cinema variations, as well as new color spaces and frame rates. Need to edit 4K DCI Flat (3996×2160) at 48fps in DCI-P3 color space? Version 8.3 makes it possible. Although Avid introduced high-resolution editing in its flagship software much later than its competitors, it comes to the table with a well-designed upgrade that attempts to address the nuances of modern post.

Another new feature is LUT support. Media Composer has allowed users to add LUTs to source media for a while now, but 8.3 adds a new LUT filter. Apply this to a top video track on your timeline and you can then add a user-supplied film emulation (or any other type of) look to all of your footage. There’s a new Proxy setting designed for work with high-resolution media. For example, switch your project settings to 1/4 or 1/16 resolution for better performance while editing with large files. Switch Proxy off and you are ready to render and output at full quality. As Media Composer becomes more capable of functioning as a finishing system, it has gained DPX image sequence file export via the Avid Image Sequencer, as well as export to Apple ProRes 4444 (Mac only).

This new high-resolution architecture requires that the software shed its remaining 32-bit components in order to stay compatible with modern versions of the Mac and Windows operating systems. Avid’s Title Tool still exists for legacy SD and HD projects, but higher resolutions will use NewBlue Titler Pro, which is included with Media Composer. It can, of course, also be used for all other titling.

There are plenty of new, but smaller, features for the editor, such as a “quick filter” in the bin. Use it to quickly narrow the bin view to only those items that match your filter text. The Avid “helper” applications EDL Manager and FilmScribe have now been integrated inside Media Composer as the List Tool, which may be used to generate EDLs, Cut Lists and Change Lists.

Avid is also a maker of video I/O hardware – Mojo DX and Nitris DX. While these will work to monitor higher-resolution projects as downscaled HD, they won’t be updated to display native 4K output. Avid has qualified AJA and Blackmagic Design hardware for use as 4K I/O and is currently also qualifying Bluefish444. If you work with a 4K computer display connected to your workstation, then the Full Screen mode enables 4K preview monitoring.

Avid Media Composer | Software version 8.3 is just the beginning of Avid’s entry into the high-resolution post-production niche. Throughout 2015, updates will further refine and enhance these new capabilities and expand high-resolution to other Avid products and solutions. Initial user feedback is that 8.3 is reasonably stable and performs well, which is good news for the high-end film and television world that continues to rely on Avid for post-production tools and solutions.

(Full disclosure: I have participated in the Avid Customer Association and chaired the Video Subcommittee of the Products and Solutions Council. This council provides user feedback to Avid product management to aid in future product development.)

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2015 Oliver Peters

Focus


Every once in a while a movie comes along that has the potential to change how we in the film and video world work. Focus is one such movie. It’s a romantic caper film in the vein of To Catch a Thief or the Ocean’s franchise. It stars Will Smith and Margot Robbie as master and novice con artists who become romantically involved. Focus was written and directed by the veteran team of Glenn Ficarra and John Requa (Crazy, Stupid, Love), who decided to use some innovative, new approaches in this production.

Focus is a high-budget studio picture shot in several cities, including New Orleans and Buenos Aires. It also happens to be the first studio feature film cut using Apple’s Final Cut Pro X. The production chose to shoot with ARRI Alexa cameras, but record to ProRes 4444 instead of ARRIRAW (except for some VFX shots). At its launch in 2011, Final Cut Pro X received a negative reaction from many veteran editors, so this was a tough sell to Warner Bros execs. To get over that hurdle, the editorial team went through extensive testing of all of the typical processes involved in feature film post production. They had to prove that the tool was more than up to the task.

Proving the concept

Mike Matzdorff, first assistant editor, explains, “Warner Bros wanted to make sure that there was a fallback if everything blew up. That ‘Plan B’ was to step back and cut in either [Apple] FCP 7 or [Avid] Media Composer. Getting projects from Final Cut Pro X into Media Composer was very clunky, because of all the steps – getting to FCP 7 and then Automatic Duck to Avid. Getting to FCP 7 was relatively solid, so we ran a ‘chase project’ in FCP 7 with dailies for the entire show. Every couple days we would import an XML and relink the media. Fortunately FCP X worked well and ‘Plan B’ was never needed.”

Glenn Ficarra adds, “The industry changes so quickly that it’s hard to follow the progress. The studio was going off of old information and once they saw that our approach would work and also save time and money, then they were completely onboard with our choice.” The editorial team also consulted with Sam Mestman of FCPworks to determine what software, other than FCP X, was required to satisfy all of the elements associated with post on a feature film.

This was a new experience for editor Jan Kovac (Curb Your Enthusiasm), as Focus is his first Hollywood feature film. Kovac studied film in the Czech Republic and then editing at UCLA. He’s been in the LA post world for 20 years, which is where he met Ficarra and Requa. Kovac was ready to be part of the team and accept the challenge of using Final Cut Pro X on a studio feature. He explains, “I was eager to work with John and Glenn and prove that FCP X is a viable option. In fact, I had been using FCP X for small file-based projects since the fall of 2012.”

Production and post on the go

Focus was shot in 61 days across two continents, during a three-month period. Kovac and three assistants (Mike Matzdorff, Andrew Wallace, Kimaree Long) worked from before principal photography started until the sound mix and final delivery of the feature – roughly from September 2013 until August 2014. The production shot 145 hours of footage, much of it multicam. Focus was shot in an anamorphic format as 2048 x 1536 ProRes 4444 files recorded directly to the Alexa’s onboard cards. On set the DIT used the Light Iron Outpost mobile system to process the files, by de-squeezing them and baking in CDL color information. The editors then received 2048 x 1152 color-corrected ProRes 4444 “dailies”, which were still encoded with a Log-C gamma profile. FCP X has the ability to internally add a Log-C LUT on-the-fly to correct the displayed image. Therefore, during the edit, the clips always looked close to the final appearance. Ficarra says, “This was great, because when we went through the DI for the final grading, the look was very close to what was decided on set. You didn’t see something radically different in the edit, so you didn’t develop ‘temp love’ for a certain look”.

A number of third-party developers have created utilities that fill in gaps and one of these is Intelligent Assistance, which makes various workflow tools based on XML. The editors used a number of these, including Sync-N-Link X, which enabled them to sync double-system sound with common timecode in a matter of minutes instead of hours. (Only a little use of Sync-N-Link X was made on Focus, because the DIT was using the Light Iron system to sync dailies.) Script data can also be added to Final Cut Pro X clips as notes. On Focus, that had to be done manually by the assistants. This need to automate the process spurred Kevin Bailey (Kovac’s assistant on his current film) to develop Shot Notes X, an application that takes the script supervisor’s information and merges it with FCP X Events to add this metadata into the notes field.

During the months of post, Apple released several updates to Final Cut Pro X and the team was not shy about upgrading mid-project. Matzdorff explains, “The transition to 10.1 integrated Events and Projects into Libraries. To make sure there weren’t any hiccups, I maintained an additional FCP X ‘chase project’.  I ran an alternate world between 10.0.9 and 10.1. We had 52 days of dailies in one Library and I would bring cuts across to see how they linked up and what happened. The transition was a rough one, but we learned a lot, which really helped down the line.”

Managing the media

Final Cut Pro X has the unique ability to internally transcode quarter-sized editorial proxy files in the ProRes Proxy format. The editor can easily toggle between original footage and editorial proxies, and FCP X takes care of the math to make sure color, effects and sizing information tracks correctly between modes. Throughout the editing period, Kovac, Ficarra, and the assistants used both proxies and the de-squeezed camera files as their source. According to Kovac, “In Buenos Aires I was working from a MacBook Pro laptop using the proxies. For security reasons, I would lock up the footage in a safe. By using proxies, which take up less drive space, a much smaller hard drive was required and that easily fit into the safe.”

Back at their home base in LA, four rooms were set up connected to XSAN shared storage. These systems included iMacs and a Mac Pro (“tube” version). All camera media and common source clips, like sound effects libraries, lived on the XSAN, while each workstation had a small SSD RAID for proxies and local FCP X Libraries. The XSAN included a single transfer Library so that edits could be moved among the rooms. Kovac and Ficarra shared roles as co-editors at this stage, collaborating on each other’s scenes. Kovac says, “This was very fluid going back and forth between Glenn and me. The process was a lot like sharing sequences with FCP 7. It’s always good to keep perspective, so each of us would review the other’s edited scenes and offer notes.” The other two systems were used by the assistants. Kovac continues, “The Libraries were broken down by reel and all iterations of sharing were used, including the XSAN or sneaker net.”

Setting up a film edit in FCP X

As with any film, the key is organization and translating the script into a final product. Kovac explains his process with FCP X, “The assistants would group the multicam clips and ‘reject’ the clip ranges before ‘action’ and after ‘cut’. This hides any extraneous material so you only have to sort through useable clips. We used a separate Event for each scene. With Sam and Mike, we worked out a process to review clips based on line readings. The dialogue lines in the script were numbered and the assistants would place a marker and a range for every three lines of dialogue. These were assigned keywords, so that each triplet of dialogue lines would end up in a Keyword Collection. Within a scene Event, I would have Keyword Collections for L1-3, L4-6, and so on. I would also create Smart Collections for certain criteria – for instance, a certain type of shot or anything else I might designate.”

Everyone involved felt that FCP X made the edit go faster, but it still takes time to be creative. Ficarra continues, “The first assembly of the film according to the script was about three hours long. I call this the ‘kitchen sink’ cut. The first screening cut was about two-and-a-half hours. We had removed some scenes and lengthened others and showed it to a ‘friends and family’ audience. It actually didn’t play as well as we’d hoped. Then we added these scenes back in and shortened everything, which went over much better. We had intentionally shot alternate versions of scenes just to play around with them in the edit. FCP X is a great tool for that, because you can easily edit a number of iterations.”

Engineered for speed

While many veteran editors experienced in other systems might scoff at the claims that FCP X is a faster editor, Mike Matzdorff was willing to put a finer point on that for me. He says, “I find that because of the magnetic timeline, trimming is a lot faster. If you label roles extensively, it’s easier to sort out temporary from final elements or organize sound sources when you hand off audio for sound post. With multi-channel audio in an Avid, for example, you generally sync the clips using only the composite mix. That way you aren’t tying up a lot of tracks on the timeline for all of the source channels. If you have to replace a line with a clean isolated mic, you have to dig it out and make the edit. With FCP X, all of the audio channels are there and neatly tucked away until you need them. It’s a simple matter of expanding a clip and picking a different channel. That alone is a major improvement.”

Ficarra and Kovac are in complete agreement. Ficarra points out, “As an editor, I’m twice as fast on FCP X as on Avid. There’s less clicking. This is the only NLE that’s not trying to emulate some other model, like cutting on a flatbed. You are able to move faster on your impulses.” Kovac adds, “It keeps you in the zone.”

The final DI was handled by Light Iron, who conformed and graded Focus. The handoff was made using an EDL and an FCPXML, along with a QuickTime picture reference. Light Iron relinked to the original anamorphic camera masters and graded using a Quantel Rio unit.

Filling in the workflow gaps

A number of developers contributed to the success of FCP X on Focus. Having a tight relationship with the editing team let them tailor their solutions to the needs of the production. One of these developers, Philip Hodgetts (President, Intelligent Assistance) says, “One of the nice things about being a small software developer is that we can react to customer needs very quickly. During the production of Focus we received feature requests for all the tools we were providing – Sync-N-Link X, Change List X and Producer’s Best Friend. For example, Sync-N-Link X gained the ability to create multicam clips, in addition to synchronizing audio and video, as a result of a feature request from first assistant Mike Matzdorff.” This extended to Apple’s ProApps team, who also kept a close and helpful watch on the progress of Focus.

For every film that challenges convention, a lot of curiosity is raised about the process. Industry insiders refer to the “Cold Mountain moment” – alluding to the use of FCP 3 by editor Walter Murch on the film, Cold Mountain. That milestone added high-end legitimacy for the earlier Final Cut among professional users. Gone Girl did that for Adobe Premiere Pro and now Focus has done that for a new Final Cut. But times are different and it’s hard to say what the true impact will be. Nevertheless, Focus provided the confidence for the team to continue on their next film in the same manner, tapping Final Cut Pro X once again. Change can be both scary and exciting, but as Glenn Ficarra says, “We like to shake things up. It’s fun to see the bemused comments wondering how we could ever pull it off with something like FCP X!”

For those that want to know more about the nuts and bolts of the post production workflow, Mike Matzdorff released “Final Cut Pro X: Pro Workflow”, an e-book that’s a step-by-step advanced guide based on the lessons learned on Focus. It’s available through iTunes and Kindle.

For some additional reading on the post production workflow of Focus, check out this Apple “in action” story, as well as Part 1 and Part 2 of FCP.co’s very in-depth coverage of how the team got it done. For an even deeper dive, make sure you watch the videos at PostPerspective.

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2015 Oliver Peters

The Black Panthers: Vanguard of the Revolution

Documentaries covering subject matter that happened within a generation usually divide the audience between those who personally lived through the time period and those who’ve only read about it in history books. The Black Panthers: Vanguard of the Revolution is one such film. If you are over 50, you are aware of the media coverage of the Black Panther Party and certainly have opinions and possibly misconceptions of who they were. If you are under 50, then you may have learned about them in history class, in which case you may only know them by myth and legend. Filmmaker Stanley Nelson (The American Experience, Freedom Summer, Wounded Knee, Jonestown: The Life and Death of Peoples Temple) seeks to go beyond what you think you know with this new Sundance Film Festival documentary entry.

I spoke with the film’s editor, Aljernon Tunsil, as he was putting the finishing touches on the film to get it ready for its Sundance presentation. Tunsil has worked his way up from assistant editor to editor and discussed the evolution in roles. “I started in a production company office, initially helping the assistant editor,” he says. “Over a period of seven or eight years, I worked my way up from assistant to a full-time editor. Along the way, I’ve had a number of mentors and learned to cut on both [Apple] Final Cut Pro and [Avid] Media Composer. These mentors were instrumental in my learning how to tell a story. I worked on a short with Stanley [Nelson] and that started our relationship of working together on films. I view my role as the ‘first audience’ for the film. The producer or director knows the story they want to make, but the editor helps to make sense of it for someone who doesn’t intimately know the material. My key job is to make sure that the narrative makes sense and that no one gets lost.”

The Black Panthers is told through a series of interviews (about 40 total subjects). Although a few notables, like Kathleen Cleaver, are featured, the chronicle of the rise and fall of the Panthers is largely told by lesser known party members, as well as FBI informants and police officers active in the events. The total post-production period took about 40 to 50 weeks. Tunsil explains, “Firelight Films (the production company) is very good at researching characters and finding old subjects for the interviews. They supplied me with a couple of hundred hours of footage. That’s a challenge to organize so that you know what you have. My process is to first watch all of that with the filmmakers and then to assemble the best of the interviews and best of the archival footage. Typically it takes six to ten weeks to get there and then another four to six weeks to get to a rough cut.”

Tunsil continues, “The typical working arrangement with Stanley is that he’ll take a day to review any changes I’ve made and then give me notes for any adjustments. As we were putting the film together, Stanley was still recording more interviews to fill in the gaps – trying to tie the story together without the need for a narrator. After that, it’s the usual process of streamlining the film. We could have made a ten-hour film, but, of course, not all of the stories would fit into the final two-hour version.”

Like many documentary film editors, Tunsil prefers having interview transcripts, but acknowledges they don’t tell the whole story. He says, “One example is in the interview with former Panther member Wayne Pharr. He describes the police raid on the LA headquarters of the party and the ensuing shootout. When asked how he felt, he talks about his feeling of freedom, even though the event surrounding him was horrific. That feeling clearly comes across in the emotion on his face, which transcends the mere words in the transcript. You get to hear the story from the heart – not just the facts. Stories are what makes a documentary like this.”

As with many films about the 1960s and 1970s, The Black Panthers weaves into its fabric the music of the era. Tunsil says, “About 60% of the film was composed by Tom Phillips, but we also had about seven or eight period songs, like ‘Express Yourself’, which we used under [former Panther member] Bobby Seale’s run for mayor of Oakland. I used other pieces from Tom’s library as temp music, which we then gave to him for the feel. He’d compose something similar – or different, but in a better direction.”

Tunsil is a fervent Avid Media Composer editor, which he used for The Black Panthers. He explains, “I worked with Rebecca Sherwood as my associate editor and we were both using Media Composer version 7. We used a Facilis Terrablock for shared storage, but this was primarily used to transfer media between us, as we both had our own external drives with a mirrored set of media files. All the media was at the DNxHD 175 resolution. I like Avid’s special features such as PhraseFind, but overall, I feel that Media Composer is just better at letting me organize material than is Final Cut. I love Avid as an editing system, because it’s the most stable and makes the work easy. Editing is best when there’s a rhythm to the workflow and Media Composer is good for that. As for the stills, I did temporary moves with the Avid pan-and-zoom plug-in, but did the final moves in [Adobe] After Effects.”

For a documentary editor, part of the experience is what you personally learn. Tunsil reflects, “I like the way Stanley and Firelight handle these stories. They don’t just tell it from the standpoint of the giants of history, but more from the point-of-view of the rank-and-file people. He’s trying to show the full dimension of the Panthers instead of the myth and iconography. It’s telling the history of the real people, which humanizes them. That’s a more down-to-earth, honest experience. For instance, I never knew that they had a communal living arrangement. By having the average members tell their stories, it makes it so much richer. Another example is the Fred Hampton story. He was the leader of the Chicago chapter of the party who was killed in a police shootout; but, there was no evidence of gunfire from inside the building that he was in. That’s a powerful scene, which resonates. One part of the film that I think is particularly well done is the explanation of how the party declined due to a split between Eldridge Cleaver and Huey Newton. This was in part as a result of an internal misinformation campaign instigated by the FBI within the Panthers.”

Throughout the process, the filmmakers ran a number of test screenings with diverse audiences, including industry professionals and non-professionals, people who knew the history and people who didn’t. Results from these screenings enabled Nelson and Tunsil to refine the film. To complete the film’s finishing, Firelight used New York editorial facility Framerunner. Tunsil continues, “Framerunner is doing the online using an Avid Symphony. To get ready, we simply consolidated the media to a single drive and then brought it there. They are handling all color correction, improving moves on stills and up-converting the standard definition archival footage.”

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2015 Oliver Peters

Preparing Digital Camera Files


The modern direction in file-based post production workflows is to keep your camera files native throughout the entire pipeline. While this might work within a closed loop, like a self-contained Avid, Adobe or Apple workflow, it breaks down when you have to move your project across multiple applications. It’s common for an editor to send files to a Pro Tools studio for the final mix and to a colorist running Resolve, Baselight, etc. for the final grade. In doing so, you have to ensure that editorial decisions aren’t incorrectly translated in the process, because the NLE might handle a native camera format differently than the mixer’s or colorist’s tool. To keep the process solid, I’ve developed some disciplines in how I like to handle media. The applications I mention are for Mac OS, but most of these companies offer Windows versions, too. If not, you can easily find equivalents.

Copying media

The first step is to get the media from the camera cards to a reliable hard drive. It’s preferable to have at least two copies (from the location) and to make the copies using software that verifies the back-up. This is a process often done on location by the lowly “data wrangler” under less than ideal conditions. A number of applications, such as Imagine Products’ ShotPut Pro and Adobe Prelude, let you do this task, but my current favorite is Red Giant’s Offload. It uses a dirt-simple interface permitting one source and two target locations. It has the sole purpose of safely transferring media with no other frills.
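None of these offload applications publish their internals, but the core idea they share (copy each file to two destinations, then re-read every copy and compare it against a checksum of the source) is simple to sketch. The Python below is only an illustration of that verify-on-copy concept, not how any of the named products actually work; the card and drive paths are placeholders.

```
import hashlib
import shutil
from pathlib import Path

def checksum(path, algo="sha256", chunk=1024 * 1024):
    """Hash a file in chunks so large camera files don't have to fit in RAM."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def offload(card_root, targets):
    """Copy every file on the card to each target drive and verify by checksum."""
    card_root = Path(card_root)
    for src in sorted(card_root.rglob("*")):
        if not src.is_file():
            continue
        src_hash = checksum(src)
        rel = src.relative_to(card_root)
        for target in targets:
            dst = Path(target) / rel
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)          # copy2 preserves timestamps
            if checksum(dst) != src_hash:   # re-read the copy to confirm integrity
                raise IOError(f"Verification failed for {dst}")
            print(f"verified {rel} -> {target}")

# Hypothetical example: one camera card, two destination drives
# offload("/Volumes/A001", ["/Volumes/RAID_A/A001", "/Volumes/Shuttle_B/A001"])
```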

Processing media on location

With the practice of shooting footage with a flat-looking log gamma profile, many productions also like to see the final, adjusted look on location. This often involves some on-site color grading to create either a temporary look or even the final look. Usually this task falls to a DIT (digital imaging technician). Several applications are available, including DaVinci Resolve, Pomfort Silverstack and Redcine-X Pro. Some new applications, specifically designed for field use, include Red Giant’s BulletProof and Catalyst Browse/Prepare from Sony Creative Software. Catalyst Browse is free and designed for all Sony cameras, whereas Catalyst Prepare is a paid application that covers Sony cameras, as well as other brands, including Canon and GoPro. Depending on the application, these tools may be used to add color correction, organize the media, transcode file formats, and even prepare simple rough assemblies of selected footage.

All of these tools add a lot of power, but frankly, I’d prefer that the production company leave these tasks up to the editorial team and allow more time in post. In my testing, most of the aforementioned apps work as advertised; however, BulletProof continues to have issues with the proper handling of timecode.

Transcoding media

I’m not a big believer in always using native media for the edit, unless you are in a fast turnaround situation. To get the maximum advantage for interchanging files between applications, it is ideal to end up in one of several common media formats, if that isn’t how the original footage was recorded. You also want every file to have unique and consistent metadata, including file names, reel IDs and timecode. The easiest common media format is QuickTime using the .mov wrapper and encoded using either Apple ProRes, Panasonic AVC-Intra, Sony XDCAM, or Avid DNxHD codecs. These are generally readable in most applications running on Mac or PC. My preference is to first convert all files into QuickTime using one of these codecs, if they originated as something else. That’s because the file is relatively malleable at that point and doesn’t require a rigid external folder structure.

Applications like BulletProof and Catalyst can transcode camera files into another format. Of course, there are dedicated batch encoders like Sorenson Squeeze, Apple Compressor, Adobe Media Encoder and Telestream Episode. My personal choice for a tool to transcode camera media is either MPEG Streamclip (free) or Divergent Media’s EditReady. Both feature easy-to-use batch processing interfaces, but EditReady adds the ability to apply LUTs, change file names and export to multiple targets. It also reads formats that MPEG Streamclip doesn’t, such as C300 files (Canon XF codec wrapped as .mxf). If you want to generate a clean master copy preserving the log gamma profile, as well as a second lower resolution editorial file with a LUT applied, then EditReady is the right application.
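For editors who prefer a scriptable route, the same master-plus-editorial transcode can be done with the free ffmpeg command-line tool driven from a short Python script. This is a hedged sketch of that approach, not the internal behavior of EditReady or any other application mentioned above; the folder paths and the LUT file name are placeholder assumptions.

```
import subprocess
from pathlib import Path

SOURCE = Path("/Volumes/RAID_A/A001")      # placeholder: original camera files (.mxf)
MASTERS = Path("/Volumes/RAID_A/ProRes")   # full-quality masters, log gamma preserved
EDIT = Path("/Volumes/RAID_A/Editorial")   # smaller editorial copies with a LUT baked in
LUT = Path("/Volumes/RAID_A/luts/log_to_rec709.cube")  # hypothetical viewing LUT

def transcode(src, dst, vf=None, prores_profile="3"):
    """Rewrap one clip as QuickTime/ProRes; vf is an optional ffmpeg video filter chain."""
    cmd = ["ffmpeg", "-y", "-i", str(src),
           "-c:v", "prores_ks", "-profile:v", prores_profile,  # 3 = ProRes 422 HQ
           "-c:a", "pcm_s16le"]
    if vf:
        cmd += ["-vf", vf]
    cmd.append(str(dst))
    subprocess.run(cmd, check=True)

MASTERS.mkdir(parents=True, exist_ok=True)
EDIT.mkdir(parents=True, exist_ok=True)

for clip in sorted(SOURCE.glob("*.mxf")):
    # Master: ProRes 422 HQ, untouched log gamma
    transcode(clip, MASTERS / (clip.stem + ".mov"))
    # Editorial copy: ProRes LT (profile 1), half size, viewing LUT applied
    transcode(clip, EDIT / (clip.stem + ".mov"),
              vf=f"scale=iw/2:ih/2,lut3d={LUT}", prores_profile="1")
```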

Altering your media

I will go to extra lengths to make sure that files have proper names, timecode and source/tape/reel ID metadata. Most professional video cameras will correctly embed that information. Others, like the Canon 5D Mark III, might encode a non-standard timecode format, allow duplicated file names, and not add reel IDs.

Once the media has been transcoded, I will use two applications to adjust the file metadata. For timecode, I rely on VideoToolShed’s QtChange. This application lets you alter QuickTime files in a number of ways, but I primarily use it to strip off unnecessary audio tracks and bad timecode. Then I use it to embed proper reel IDs and timecode. Because it does this by altering header information, processing a lot of files happens quickly. The second tool in this mix is Better Rename, which is a batch renaming utility. I use it frequently for adding, deleting or changing all or part of the file name for a batch of files. For instance, I might append a production job number to the front of a set of Canon 5D files. The point in doing all of this is so that you can easily locate the exact same point within any file using any application, even several years apart.
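That job-number example is simple enough to script directly if you would rather not use a GUI utility. A minimal sketch, assuming a hypothetical job number and folder path:

```
from pathlib import Path

JOB = "1234"                                  # hypothetical production job number
FOLDER = Path("/Volumes/RAID_A/5D_Card_01")   # placeholder folder of Canon 5D clips

# Prepend the job number, so "MVI_0001.MOV" becomes "1234_MVI_0001.MOV".
for clip in sorted(FOLDER.glob("MVI_*.MOV")):
    new_name = f"{JOB}_{clip.name}"
    print(f"{clip.name} -> {new_name}")
    clip.rename(clip.with_name(new_name))
```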

Speed is a special condition. Most NLEs handle files with mixed frame rates within the same project and sequences, but such timelines often do not translate correctly from one piece of software to the next. Edit lists are interchanged using EDL, XML, FCPXML and AAF formats, and each company has its own variation of the format it uses. Some formats, like FCPXML, require third-party utilities to translate the list, adding another variable. Round-tripping, such as going from NLE “A” (for offline) to color correction system “B” (for grading) and then to NLE “C” (for finishing), often involves several translations. Apart from effects, speed differences in native camera files can be a huge problem.

A common mixed frame rate situation in the edit is combining 23.98fps and 29.97fps footage. If both of these were intended to run in real-time, then it’s usually OK. However, if the footage was recorded with the intent to overcrank for slomo (59.94 or 29.97 native for a timebase of 23.98), then you start to run into issues. As long as the camera properly flags the file, so that every application plays it at the proper timebase (slowed), then things are fine. This isn’t true of DSLRs, where you might shoot 720p/59.94 for use as slomo in a 1080p/29.97 or 23.98 sequence. With these files, my recommendation is to alter the speed of the file first, before using it inside the NLE. One way to do this is to use Apple Cinema Tools (part of the defunct Final Cut Studio package, but it can still be found). You can batch-conform a set of 59.94fps files to play natively at 23.98fps in very short order. This should be done BEFORE adding any timecode with QtChange. Remember that any audio will have its sample rate shifted, which I’ve found to be a problem with FCP X. Therefore, when you do this, also strip off the audio tracks using QtChange. They play slow anyway and so are useless in most cases where you want overcranked, slow motion files.
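The arithmetic behind a conform helps explain both the slowdown and why the audio becomes unusable: the recorded frames are untouched and only the playback timebase changes. The figures below are illustrative, not taken from any specific clip.

```
# A conform keeps every recorded frame and simply rewrites the playback timebase,
# so the clip stretches by the ratio of the two rates. Illustrative numbers only.
shot_rate = 59.94        # frame rate the camera recorded
timebase = 23.976        # frame rate after conforming (a "23.98" sequence)
frames = 600             # roughly 10 seconds of recording at 59.94fps

original_duration = frames / shot_rate    # about 10 seconds as shot
conformed_duration = frames / timebase    # about 25 seconds when played at 23.976
slowdown = shot_rate / timebase           # 2.5x slower on screen

# Audio recorded at 48 kHz is slowed by the same factor, so it effectively
# plays back at 19.2 kHz, which is one reason to strip the audio before conforming.
effective_sample_rate = 48000 * timebase / shot_rate

print(original_duration, conformed_duration, slowdown, effective_sample_rate)
```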

Audio in your NLE

The last point to understand is that not all NLEs deal with audio tracks in the same fashion. Often camera files are recorded with multiple mono audio sources, such as a boom and a lav mic on channels 1 and 2. These may be interpreted either as stereo or as dual mono, depending on the NLE. Premiere Pro CC in particular sees these as stereo when imported. If you edit them to the timeline as a single stereo track, you will not be able to correct this in the sequence afterwards by panning. Therefore, it’s important to remember to first set up your camera files with a dual mono channel assignment before making the first edit. This same issue crops up when round-tripping files through Resolve. It may not properly handle audio, depending on how it interprets these files, so be careful.

These steps add a bit more time at the front end of any given edit, but are guaranteed to give you a better editing experience on complex projects. The results will be easier interchange between applications and more reliable relinking. Finally, when you revisit a project a year or more down the road, everything should pop back up, right where you left it.

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2015 Oliver Peters

Adobe Anywhere and Divine Access


Editors like the integration of Adobe’s software, especially Dynamic Link and Direct Link between creative applications. This sort of approach is applied to collaborative workflows with Adobe Anywhere, which permits multiple stakeholders, including editors, producers and directors, to access common media and productions from multiple, remote locations. One company that has invested in the Adobe Anywhere environment is G-Men Media of Venice, California, who installed it as their post production hub. By using Adobe Anywhere, Jeff Way (COO) and Clay Glendenning (CEO) sought to improve the efficiency of the filmmaking process for their productions. No science project – they have now tested the concept in the real world on several indie feature films.

Their latest film, Divine Access, produced by The Traveling Picture Show Company in association with G-Men Media, is a religious satire centering on reluctant prophet Jack Harriman. Forces both natural and supernatural lead Harriman down a road to redemption culminating in a final showdown with his longtime foe, Reverend Guy Roy Davis. Steven Chester Prince (Boyhood, The Ringer, A Scanner Darkly) moves behind the camera as the film’s director. The entire film was shot in Austin, Texas during May of 2014, but the processing of dailies and all post production was handled back at the Venice facility. Way explains, “During principal photography we were able to utilize our Anywhere system to turn around dailies and rough cuts within hours after shooting. This reduced our turnaround time for review and approval, thus reducing budget line items. Using Anywhere enabled us to identify cuts and mark them as viable the same day, reducing the need for expensive pickup shoots later down the line.”

The production workflow

Director of Photography Julie Kirkwood (Hello I Must Be Going, Collaborator, Trek Nation) picked the ARRI ALEXA for this film and scenes were recorded as ProRes 4444 in 2K. An on-set data wrangler would back up the media to local hard drives and then a runner would take the media to a downtown upload site. The production company found an Austin location with gigabit upload speeds, which enabled them to upload 200GB of data in about 45 minutes. Most days only 50-80GB were uploaded at one time, since uploads happened several times throughout each day.

Way says, “We implemented a technical pipeline for the film that allowed us to remain flexible.  Adobe’s open API platform made this possible. During production we used an Amazon S3 instance in conjunction with Aspera to get the footage securely to our system and also act as a cloud back-up.” By uploading to Amazon and then downloading the media into their Anywhere system in Venice, G-Men now had secure, full-resolution media in redundant locations. Camera LUTs were also sent with the camera files, which could be added to the media for editorial purposes in Venice. Amazon will also provide a long-term archive of the 8TB of raw media for additional protection and redundancy. This Anywhere/Amazon/Aspera pipeline was supervised by software developer Matt Smith.

Back in Venice, the download and ingest into the Anywhere server and storage was an automated process that Smith programmed. Glendenning explains, “It would automatically populate a bin named for that day with the incoming assets. Wells [Phinny, G-Men editorial assistant] would be able to grab from subfolders named ‘video’ and ‘audio’ to quickly organize clips into scene subfolders within the Anywhere production that he would create from that day’s callsheet. Wells did most of this work remotely from his home office a few miles away from the G-Men headquarters.” Footage was synced and logged for on-set review of dailies and on-set cuts the next day. Phinny effectively functioned as a remote DIT in a unique way.
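The specifics of Smith’s automation aren’t spelled out, but the pattern described (each day’s downloads land in a dated folder with ‘video’ and ‘audio’ subfolders for the assistant to pull from) is easy to sketch. The Python below is purely illustrative; the paths, file extensions and folder names are assumptions rather than details of the actual Divine Access pipeline.

```
import shutil
from datetime import date
from pathlib import Path

INCOMING = Path("/SAN/incoming")          # placeholder: where the downloaded media lands
PRODUCTION = Path("/SAN/production")      # placeholder: media root the edit systems see

VIDEO_EXT = {".mov", ".mxf"}
AUDIO_EXT = {".wav", ".bwf"}

def ingest_day(shoot_day=None):
    """Sort newly downloaded files into a per-day folder with video/audio subfolders."""
    day = shoot_day or date.today().isoformat()
    for f in sorted(INCOMING.iterdir()):
        if not f.is_file():
            continue
        ext = f.suffix.lower()
        if ext in VIDEO_EXT:
            sub = "video"
        elif ext in AUDIO_EXT:
            sub = "audio"
        else:
            continue  # this sketch ignores sidecar and report files
        dest = PRODUCTION / day / sub
        dest.mkdir(parents=True, exist_ok=True)
        shutil.move(str(f), str(dest / f.name))
        print(f"{f.name} -> {day}/{sub}")

ingest_day()
```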

Remote access in Austin to the Adobe Anywhere production for review was made possible through an iPad application. Way explains, “We had close contact with Wells via text message, phone and e-mail. The iPad access to Anywhere used a secure VPN connection over the Internet. We found that a 4G wireless data connection was sufficient to play the clips and cuts. On scenes where the director had concerns that there might not be enough coverage, the process enabled us to quickly see something. No time was lost to transcoding media or to exporting a viewable copy, which would be typical of the more traditional way of working.”

Creative editorial mixing Adobe Anywhere and Avid Media Composer

Once principal photography was completed, editing moved into the G-Men mothership. Instead of editing with Premiere Pro, however, Avid Media Composer was used. According to Way, “Our goal was to utilize the Anywhere system throughout as much of the production as possible. Although it would have been nice to use Premiere Pro for the creative edit, we believed going with an editor that shared our director’s creative vision was the best for the film. Kindra Marra [Scenic Route, Sassy Pants, Hick] preferred to cut in Media Composer. This gave us the opportunity to test how the system could adapt already existing Adobe productions.” G-Men has handled post on other productions where the editor worked remotely with an Anywhere production. In this case, since Marra lived close-by in Santa Monica, it was simpler just to set up the cutting room at their Venice facility. At the start of this phase, assistant editor Justin (J.T.) Billings joined the team.

Avid has added subscription pricing, so G-Men set up the Divine Access cutting room with a Mac Pro, “renting” the Media Composer 8 software for a few months. The Anywhere servers are integrated with a Facilis Technology TerraBlock shared storage network, which is compatible with most editing applications, including both Premiere Pro and Media Composer. The Mac Pro tower was wired into the TerraBlock SAN and was able to see the same ALEXA ProRes media as Anywhere. According to Billings, “Once all the media was on the TerraBlock drives, Marra was able to access these in the Media Composer project using Avid’s AMA-linking. This worked well and meant that no media had to be duplicated. The film was cut solely with AMA-linked media. External drives were also connected to the workstations for nightly back-ups as another layer of protection.”

Adobe Anywhere at the finish line

Once the cut was locked, an AAF composition for the edited sequence was sent from Media Composer to DaVinci Resolve 11, which was installed on an HP workstation at G-Men. This unit was also connected to the TerraBlock storage, so media instantly linked when the AAF file was imported. Freelance colorist Mark Todd Osborne graded the film on Resolve 11 and then exported a new AAF file corresponding to the rendered media, which now also existed on the SAN drives. This AAF composition was then re-imported into Media Composer.

Billings continues, “All of the original audio elements existed in the Media Composer project and there was no reason to bring them into Premiere Pro. By importing Resolve’s AAF back into Media Composer, we could then double-check the final timeline with audio and color corrected picture. From here, the audio and OMF files were exported for Pro Tools [sound editorial and the mix is being done out-of-house]. Reference video of the film for the mix could now use the graded images. A new AAF file for the graded timeline was also exported from Media Composer, which then went back into Premiere Pro and the Anywhere production. Once we get the mixed tracks back, these will be added to the Premiere Pro timeline. Final visual effects shots can also be loaded into Anywhere and then inserted into the Premiere Pro sequence. From here on, all further versions of Divine Access will be exported from Premiere Pro and Anywhere.”

Glendenning points out that, “To make sure the process went smoothly, we did have a veteran post production supervisor – Hank Braxtan – double check our workflow. He and I have done a lot of work together over the years and he has more than a decade of experience overseeing an Avid house. We made sure he was available whenever there were Avid-related technical questions from the editors.”

Way says, “Previously, on post production of [the indie film] Savageland, we were able to utilize Anywhere for full post production through to delivery. Divine Access has allowed us to take advantage of our system on both sides of the creative edit including principal photography and post finishing through to delivery. This gives us capabilities through entire productions. We have a strong mix of Apple and PC hardware and now we’ve proven that our Anywhere implementation is adaptable to a variety of different hardware and software configurations. Now it becomes a non-issue whether it’s Adobe, Avid or Resolve. It’s whatever the creative needs dictate; plus, we are happy to be able to use the fastest machines.”

Glendenning concludes, “Tight budget projects have tight deadlines and some producers have missed their deadlines because of post. We installed Adobe Anywhere and set up the ecosystem surrounding it because we feel this is a better way that can save time and money. I believe the strategy employed for Divine Access has been a great improvement over the usual methods. Using Adobe Anywhere really let us hit it out of the park.”

Originally written for DV magazine / CreativePlanetNetwork.

©2015 Oliver Peters

Birdman

It’s rare, but exhilarating, when you watch a movie with a unique take on film’s visual language, without the crutch of extensive computer generated imagery. That’s precisely the beauty of Birdman or (The Unexpected Virtue of Ignorance). The film is directed and co-written by Alejandro González Iñárritu (Biutiful, Babel, 21 Grams) and features a dynamic ensemble cast dominated by Michael Keaton’s lead performance as Riggan Thomson. While most films are constructed of intercutting master shots, two-shots and singles, Birdman is designed to look like a continuous, single take. While this has been done before in films, approximately 100 minutes out of the two-hour movie appear as a completely seamless composite of lengthy Steadicam and hand-held takes.

Riggan Thomson (Keaton) is a movie star who rode to fame as the comic book super hero Birdman; but, it’s a role that he walked away from. Searching for contemporary relevance, Riggan has decided to mount a Broadway play, based on the real-life Raymond Carver short story, What We Talk About When We Talk About Love. The film takes place entirely at the historic St. James Theater near Times Square and the surrounding area in New York. Principal photography occurred over a 30-day period, both at the real theater and Times Square, as well as at Kaufman Astoria Studios. The soundstage sets were for the backstage and dressing room portions of the theater. Throughout the film, Riggan struggles with the challenges of getting the play to opening day and dealing with his fellow cast members, but more notably confronting his super ego Birdman, seen in person and heard in voice-over. This is, of course, playing out in Riggan’s imagination. The film, like the play within the film, wrestles with the overall theme of the confusion between love and affection.

Bringing this ambitious vision to life fell heavily upon the skills of the director of photography and the editors. Emmanuel Lubezki, known as Chivo, served as DoP. He won the 2014 Cinematography Oscar for Gravity, a film that was also heralded for its long, seemingly continuous shots. Stephen Mirrione (The Monuments Men, Ocean’s Thirteen, Traffic) and Douglas Crise (Arbitrage, Deception, Babel) teamed up for the edit. Both editors had worked together before, as well as with the director. Mirrione started during the rehearsal process. At the time of production, Crise handled the editing in New York, while Mirrione, who was back in LA at this time, was getting dailies and checking in on the cut, as well as sending scenes back and forth with changes every day.

It starts with preparation

Stephen Mirrione explains, “When I first saw what they wanted to do, I was a bit skeptical that it could be pulled off successfully. Just one scene that didn’t work would ruin the whole film. Everything really had to align. One of the things that was considered was to tape and edit all of the rehearsals. This was done about two months before the principal photography was set to start. The rehearsals were edited together, which allowed Alejandro to adjust the script, pacing and performances. We could see what would work and what wouldn’t. Before cameras even rolled, we had an assembly made up of the rehearsal footage and some of the table read. So, together with Alejandro, we could begin to gauge what the film would look and sound like, where a conversation was redundant, where the moves would be. It was like a pre-vis that you might create for a large-scale CGI or animated feature.”

Once production started in New York, Douglas Crise picked up the edit. Typically, the cast and crew would rehearse the first half of the day and then tape during the second half. ARRI ALEXA cameras were used. The media was sent to Technicolor, who would turn around color corrected Avid DNxHD36 dailies for the next day. The team of editors and assistants used Avid Media Composer systems. According to Crise, “I would check the previous day’s scenes and experiment to see how the edit points would ‘join’ together. You are having to make choices based on performance, but also how the camera work would edit together. Alejandro would have to commit to certain editorial decisions, because those choices would dictate where the shot would pick up on the next day. Stephen would check in on the progress during this period and then he picked up once the cut shifted to visual effects.”

Naturally the editing challenge was to make the scenes flow seamlessly in both a figurative and literal sense. “The big difference with this film was that we didn’t have the conventional places where one scene started and another ended. Every scene walks into the next one. Alejandro described it as going down a hill and not stopping. There wasn’t really a transition. The characters just keep moving on,” Crise says.

“I think we really anticipated a lot of the potential pitfalls and really prepared, but what we didn’t plan on were all the speed changes,” Mirrione adds. “At certain points, when the scene was not popping for us, if the tempo was a little off, we could actually dial up the pace or slow it down as need be without it being perceptible to the audience and that made a big difference.”

Score and syncopation

To help drive pace, much of the track uses a drum score composed and performed by Mexican drummer Antonio Sanchez. In some scenes within the film, the camera also pans over to a drummer with a kit who just happens to be playing in an alley or even in a backstage hallway. Sanchez and Iñárritu went into a studio and recorded sixty improvised tracks based on the emotions that the film needed. Mirrione says, “Alejandro would explain the scene to the drummer in the studio and then he’d create it.” Crise continues, “Alejandro had all these drum recordings and he told me to pick six of my favorites. We cut those together so that he could have a track that the drummer could mimic when they shot that scene. He had the idea for the soundtrack from the very beginning and we had those samples cut in from the start, too.”

“And then Martín [Hernández, supervising sound editor] took it to another level. Once there was a first pass at the movie, with a lot of those drum tracks laid in as an outline, he spent a lot of time working with Alejandro, to strip layers away, add some in, trying a lot of different beats. Obviously, in every movie, music will have an impact on point of view and mood and tone. But with this, I think it was especially important, because the rhythm is so tied to the camera and you can’t make those kinds of cadence adjustments with as much flexibility as you can with cuts. We had to lean on the music a little more than normal at times, to push back or pull forward,” Mirrione says.

The invisible art

The technique of this seamless sequence of scenes really allows you to get into the head of Riggan more so than other films, but the editors are reserved in discussing the actual editing technique. Mirrione explains, “Editing is often called the ‘invisible art’. We shape scenes and performances on every film. There has been a lot of speculation over the internet about the exact number and length of shots. I can tell you it’s more than most people would guess. But we don’t want that to be the focus of the discussion. The process is still the same of affecting performance and pace. It’s just that the dynamic has been shifted, because much of the effort was front-loaded in the early days. Unlike other films – where the editing phase, after production is completed, focuses on shaping the story – on Birdman it was about fine-tuning.”

Crise continues, “Working on this film was a different process and a different way to come up with new ideas. It’s also important to know that most of the film was shot practically. Michael [Keaton] really is running through Times Square in his underwear. The shots are not comped together with green screen actors against CGI buildings.” There are quite a lot of visual effects used to enhance and augment the transitions from one shot to the next to make these seamless. On the other hand, when Riggan’s Birdman delusions come to life on screen, we also see more recognizable visual effects, such as a brief helicopter and creature battle playing out over the streets of New York.

Winking at the audience

The film is written as a black comedy with quite a few insider references. Clearly, the casting of Michael Keaton provides allusion to his real experiences in starting the Batman film franchise and in many ways the whole super hero film genre. However, there was also a conscious effort during rehearsals and tapings to adjust the dialogue in ways that kept these references as current as possible. Crise adds, “Ironically, in the scenes on the rooftop there was a billboard in the background behind Emma Stone and Edward Norton, with a reference to Tom Hanks. We felt that audiences would believe that we created it on purpose, when in fact it was a real billboard. It was changed in post, just to keep from appearing to be an insider reference that was too obvious.”

The considerations mandated during the edit by a seamless film presented other challenges, too. For example, simple concerns, like where to structure reel breaks and how to hand off shots for visual effects. Mirrione points out, “Simple tasks such as sending out shots for VFX, color correction, or even making changes for international distribution requirements were complicated by the fact that once we finished, there weren’t individual ‘shots’ to work with – just one long never ending strand.  It meant inventing new techniques along the way.  Steven Scott, the colorist, worked with Chivo at Technicolor LA to perfect all the color and lighting and had to track all of these changes across the entire span of the movie.  The same way we found places to hide stitches between shots, they had to hide these color transitions which are normally changed at the point of a cut from one shot to the next.”

In total, the film was in production and post for about a year, starting with rehearsals in early 2013. Birdman was mixed and completed by mid-February 2014. While it posed a technical and artistic challenge from the start, everything amazingly fell into place, highlighted by perfect choreography of cast, crew and the post production team. It will be interesting to see how Birdman fares during awards season, because it succeeds on so many levels.

Originally written for Digital Video magazine / CreativePlanetNetwork

©2014, 2015 Oliver Peters

Stocking Stuffers 2014

As we head toward the end of the year, it’s time to look again at a few items you can use to spruce up your edit bay.

Let’s start at the computer. The “tube” Mac Pro has been out for nearly a year, but many will still be trying to get the most life out of their existing Mac Pro “tower”. I wrote about this a while back, so this is a bit of a recap. More RAM, an internal SSD and an upgraded GPU card are the best starting points. OWC and Crucial are your best choices for RAM and solid state drives. If you want to bump up your GPU, then the Sapphire 7950 (Note: I have run into issues with some of these cards, where the spacer screws are too tall, requiring you to install the card in slot 2) and/or Nvidia GTX 680 Mac Edition cards are popular choices. However, these will only give you an incremental boost if you’ve already been running an ATI 5870 or Nvidia Quadro 4000 display card. If you have the dough and want some solid horsepower, then go for the Nvidia Quadro K5000 card for the Mac. To expand your audio monitoring, look at Mackie mixers, KRK speakers and the PreSonus Audiobox USB interface. Naturally there are many video monitor options, but assuming you have an AJA or Blackmagic Design interface, FSI would be my choice. HP Dreamcolor is also a good option when connecting directly to the computer.

The video plug-in market is prolific, with plenty of packages and/or individual filters from FxFactory, Boris, GenArts, FCP Effects, Crumplepop, Red Giant and others. I like the Universe package from Red Giant, because it supports FCP X, Motion, Premiere Pro and After Effects. Red Giant continues to expand the package, including some very nice new premium effects. If you are a Media Composer user, then you might want to look into the upgrade from Avid FX to Boris Red. Naturally, you can’t go wrong with FxFactory, especially if you use FCP X. There’s a wide range of options with the ability to purchase single filters – all centrally managed through the FxFactory application.

For audio, the go-to filter companies are iZotope, Waves and Focusrite to name a few. iZotope released some nice tools in its RX4 package – a state-of-the-art repair and restoration suite. If you just want a suite of EQ and compression tools, then Nectar Elements or Nectar 2 are the best all-in-one collections of audio filters. While most editors do their audio editing/mastering within their NLE, some need a bit more. Along with a 2.0 bump for Sound Forge Pro Mac, Sony Creative Software also released a standard version of Sound Forge through the Mac App Store.

In the color correction world, there’s been a lot of development in film emulation look-up tables (LUTs). These can be used in most NLEs and grading applications. If that’s for you, check out ImpulZ and Osiris from Color Grading Central (LUT Utility required with FCP X), Koji Color or the new SpeedLooks 4 (from LookLabs). Each package offers a selection of Fuji and Kodak emulations, as well as other stylized looks. These packages feature LUT files in the .cube and/or .look (Adobe) LUT file formats and, thus, are compatible with most applications. If you want film emulation that also includes 3-way grading tools and adjustable film grain, your best choice is FilmConvert 2.0.

Another category that is expanding covers the range of tools used to prep media from the camera prior to the edit. This had been something only for DITs and on-set “data wranglers”, but many videographers are increasingly using such tools on everyday productions. These now offer on-set features that benefit all file-based recordings. Pomfort Silverstack, ShotPut Pro, Redcine-X Pro and Adobe Prelude have been joined by new tools. To start, there’s Offload and EditReady, which are two very specific tools. Offload simply copies and verifies camera-card media to two target drives. EditReady is a simple drag-and-drop batch convertor to transcode media files. These join QtChange (a utility to batch-add timecode and reel IDs to media files) and Better Rename (a Finder renaming utility) in my book, as the best single-purpose production applications.

If you want more in one tool, then there’s Bulletproof, which has now been joined in the market by Sony Creative Software’s Catalyst Browse and Prepare. Bulletproof features media offload, organization, color correction and transcoding. I like it, but my only beef is that it doesn’t properly handle timecode data, when present. Catalyst Browse is free and similar to Canon’s camera utility. It’s designed to read and work with media from any Sony camera. Catalyst Prepare is the paid version with an expanded feature set. It supports media from other camera manufacturers, including Canon and GoPro.

Finally, many folks are looking for an alternative to Adobe Photoshop. I’m a fan of Pixelmator, but this has been joined by Pixlr and Mischief. All three are available from the Mac App Store. Pixlr is free, but can be expanded through subscription. In its basic form, Pixlr is a stylizing application that is like a very, very “lite” version of Photoshop; however, it includes some very nice image processing filters. Mischief is a drawing application designed to work with drawing tablets, although a mouse will work, too.

©2014 Oliver Peters