Avid Media Composer Goes 4K


Avid Technology entered 2015 with a bang. The company closed out 2014 with the release of its Media Composer version 8.3 software, the first to enable higher-resolution editing, including 2K, UHD and 4K projects. On January 16th of this year, Avid celebrated its relisting on the NASDAQ exchange by ringing the opening bell. Finally – as in most years – the Academy Awards nominee field is dominated by films that used Media Composer, Pro Tools or both during the post-production process.

In a software landscape quickly shifting to rental (subscription) business models, Avid now offers the most flexible pricing model. Media Composer | Software may be purchased, rented, or managed through floating licenses. If you purchase a perpetual license (you own the software), then an annually-renewed support contract gives you phone support and continued software updates. Opt out of the contract and you’ll still own the software you bought – you just lose access to any future updates.

You can purchase other optional add-ons, like Symphony for advanced color correction. Unfortunately there’s still no resolution to the impasse between Avid and Nexidia. If you purchased ScriptSync or PhraseFind in the past – options that rely on IP from Nexidia – then you can’t upgrade to version 8 or higher software and continue to use them. On the other hand, if you own an older version, such as Media Composer 7, and need to edit a project that requires a higher version, you can simply pick up a software subscription for a few months – enough to run the latest version for as long as it takes to complete that project.

The jump from Media Composer | Software 8.2 to 8.3 might seem minor, but in fact this was a huge update for Avid editors. It not only ushered in new high-resolution project settings and capabilities, but also added a resolution-independent Avid codec – DNxHR. Beyond merely the ability to edit in 4K, Media Composer now addresses most of the different 4K options that cover the TV and cinema variations, as well as new color spaces and frame rates. Need to edit 4K DCI Flat (3996×2160) at 48fps in DCI-P3 color space? Version 8.3 makes it possible. Although Avid introduced high-resolution editing in its flagship software much later than its competitors, it comes to the table with a well-designed upgrade that attempts to address the nuances of modern post.

Another new feature is LUT support. Media Composer has allowed users to add LUTs to source media for a while now, but 8.3 adds a new LUT filter. Apply it to a top video track on your timeline and you can add a user-supplied look – film emulation or any other type – to all of your footage. There’s also a new Proxy setting designed for work with high-resolution media. For example, switch your project settings to 1/4 or 1/16 resolution for better performance while editing with large files. Switch Proxy off and you are ready to render and output at full quality. As Media Composer becomes more capable of functioning as a finishing system, it has gained DPX image sequence file export via the Avid Image Sequencer, as well as export to Apple ProRes 4444 (Mac only).

This new high-resolution architecture requires that the software increasingly shed its remaining 32-bit components in order to stay compatible with modern versions of the Mac and Windows operating systems. Avid’s Title Tool still exists for legacy SD and HD projects, but higher resolutions use NewBlue Titler Pro, which is included with Media Composer. It can, of course, also be used for all other titling.

There are plenty of new, but smaller, features for the editor, such as a “quick filter” in the bin. Use it to quickly narrow the bin view to only those items matching your filter text entry. The Avid “helper” applications EDL Manager and FilmScribe have now been integrated inside Media Composer as the List Tool, which may be used to generate EDLs, Cut Lists and Change Lists.

Avid is also a maker of video i/o hardware – Mojo DX and Nitris DX. While these will work to monitor higher-resolution projects as downscaled HD, they won’t be updated to display native 4K output. Avid has qualified AJA and Blackmagic Design hardware for use as 4K i/o and is currently qualifying Bluefish444 as well. If you work with a 4K computer display connected to your workstation, then the Full Screen mode enables 4K preview monitoring.

Avid Media Composer | Software version 8.3 is just the beginning of Avid’s entry into the high-resolution post-production niche. Throughout 2015, updates will further refine and enhance these new capabilities and expand high-resolution support to other Avid products and solutions. Initial user feedback is that 8.3 is reasonably stable and performs well, which is good news for the high-end film and television world that continues to rely on Avid for post-production tools and solutions.

(Full disclosure: I have participated in the Avid Customer Association and chaired the Video Subcommittee of the Products and Solutions Council. This council provides user feedback to Avid product management to aid in future product development.)

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2015 Oliver Peters

The Black Panthers: Vanguard of the Revolution

Documentaries covering subject matter from within the past generation usually divide the audience between those who personally lived through the time period and those who’ve only read about it in history books. The Black Panthers: Vanguard of the Revolution is one such film. If you are over 50, you are aware of the media coverage of the Black Panther Party and certainly have opinions – and possibly misconceptions – about who they were. If you are under 50, then you may have learned about them in history class, in which case you may only know them by myth and legend. Filmmaker Stanley Nelson (The American Experience, Freedom Summer, Wounded Knee, Jonestown: The Life and Death of Peoples Temple) seeks to go beyond what you think you know with this new Sundance Film Festival documentary entry.

I spoke with the film’s editor, Aljernon Tunsil, as he was putting the finishing touches on the film to get it ready for Sundance presentation. Tunsil has worked his way up from assistant editor to editor and discussed the evolution in roles. “I started in a production company office, initially helping the assistant editor,” he says. “Over a period of seven or eight years, I worked my way up from assistant to a full-time editor. Along the way, I’ve had a number of mentors and learned to cut on both [Apple] Final Cut Pro and [Avid] Media Composer. These mentors were instrumental in my learning how to tell a story. I worked on a short with Stanley [Nelson] and that started our relationship of working together on films. I view my role as the ‘first audience’ for the film. The producer or director knows the story they want to make, but the editor helps to make sense of it for someone who doesn’t intimately know the material. My key job is to make sure that the narrative makes sense and that no one gets lost.”

The Black Panthers is told through a series of interviews (about 40 total subjects). Although a few notables, like Kathleen Cleaver, are featured, the chronicle of the rise and fall of the Panthers is largely told by lesser known party members, as well as FBI informants and police officers active in the events. The total post-production period took about 40 to 50 weeks. Tunsil explains, “Firelight Films (the production company) is very good at researching characters and finding old subjects for the interviews. They supplied me with a couple of hundred hours of footage. That’s a challenge to organize so that you know what you have. My process is to first watch all of that with the filmmakers and then to assemble the best of the interviews and best of the archival footage. Typically it takes six to ten weeks to get there and then another four to six weeks to get to a rough cut.”

Tunsil continues, “The typical working arrangement with Stanley is that he’ll take a day to review any changes I’ve made and then give me notes for any adjustments. As we were putting the film together, Stanley was still recording more interviews to fill in the gaps – trying to tie the story together without the need for a narrator. After that, it’s the usual process of streamlining the film. We could have made a ten-hour film, but, of course, not all of the stories would fit into the final two-hour version.”

Like many documentary film editors, Tunsil prefers having interview transcripts, but acknowledges they don’t tell the whole story. He says, “One example is in the interview with former Panther member Wayne Pharr. He describes the police raid on the LA headquarters of the party and the ensuing shootout. When asked how he felt, he talks about his feeling of freedom, even though the event surrounding him was horrific. That feeling clearly comes across in the emotion on his face, which transcends the mere words in the transcript. You get to hear the story from the heart – not just the facts. Stories are what make a documentary like this.”

As with many films about the 1960s and 1970s, The Black Panthers weaves into its fabric the music of the era. Tunsil says, “About 60% of the film was composed by Tom Phillips, but we also had about seven or eight period songs, like ‘Express Yourself’, which we used under [former Panther member] Bobby Seale’s run for mayor of Oakland. I used other pieces from Tom’s library as temp music, which we then gave to him for the feel. He’d compose something similar – or different, but in a better direction.”

Tunsil is a fervent Avid Media Composer editor and used it to cut The Black Panthers. He explains, “I worked with Rebecca Sherwood as my associate editor and we were both using Media Composer version 7. We used a Facilis TerraBlock for shared storage, but this was primarily used to transfer media between us, as we both had our own external drives with a mirrored set of media files. All the media was at the DNxHD 175 resolution. I like Avid’s special features, such as PhraseFind, but overall, I feel that Media Composer is just better at letting me organize material than Final Cut. I love Avid as an editing system, because it’s the most stable and makes the work easy. Editing is best when there’s a rhythm to the workflow and Media Composer is good for that. As for the stills, I did temporary moves with the Avid pan-and-zoom plug-in, but did the final moves in [Adobe] After Effects.”

For a documentary editor, part of the experience is what you personally learn. Tunsil reflects, “I like the way Stanley and Firelight handle these stories. They don’t just tell it from the standpoint of the giants of history, but more from the point-of-view of the rank-and-file people. He’s trying to show the full dimension of the Panthers instead of the myth and iconography. It’s telling the history of the real people, which humanizes them. That’s a more down-to-earth, honest experience. For instance, I never knew that they had a communal living arrangement. By having the average members tell their stories, it makes it so much richer. Another example is the Fred Hampton story. He was the leader of the Chicago chapter of the party who was killed in a police shootout, but there was no evidence of gunfire from inside the building that he was in. That’s a powerful scene, which resonates. One part of the film that I think is particularly well done is the explanation of how the party declined due to a split between Eldridge Cleaver and Huey Newton. This was in part a result of an internal misinformation campaign instigated by the FBI within the Panthers.”

Throughout the process, the filmmakers ran a number of test screenings with diverse audiences, including industry professionals and non-professionals, people who knew the history and people who didn’t. Results from these screenings enabled Nelson and Tunsil to refine the film. To complete the film’s finishing, Firelight used New York editorial facility Framerunner. Tunsil continues, “Framerunner is doing the online using an Avid Symphony. To get ready, we simply consolidated the media to a single drive and then brought it there. They are handling all color correction, improving moves on stills and up-converting the standard definition archival footage.”

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2015 Oliver Peters

Stocking Stuffers 2014

As we head toward the end of the year, it’s time to look again at a few items you can use to spruce up your edit bay.

Let’s start at the computer. The “tube” Mac Pro has been out for nearly a year, but many will still be trying to get the most life out of their existing Mac Pro “tower”. I wrote about this a while back, so this is a bit of a recap. More RAM, an internal SSD and an upgraded GPU card are the best starting points. OWC and Crucial are your best choices for RAM and solid state drives. If you want to bump up your GPU, then the Sapphire 7950 (Note: I have run into issues with some of these cards, where the spacer screws are too tall, requiring you to install the card in slot 2) and/or Nvidia GTX 680 Mac Edition cards are popular choices. However, these will only give you an incremental boost if you’ve already been running an ATI 5870 or Nvidia Quadro 4000 display card. If you have the dough and want some solid horsepower, then go for the Nvidia Quadro K5000 card for the Mac. To expand your audio monitoring, look at Mackie mixers, KRK speakers and the PreSonus AudioBox USB interface. Naturally there are many video monitor options, but assuming you have an AJA or Blackmagic Design interface, FSI would be my choice. HP DreamColor is also a good option when connecting directly to the computer.

The video plug-in market is prolific, with plenty of packages and/or individual filters from FxFactory, Boris, GenArts, FCP Effects, Crumplepop, Red Giant and others. I like the Universe package from Red Giant, because it supports FCP X, Motion, Premiere Pro and After Effects. Red Giant continues to expand the package, including some very nice new premium effects. If you are a Media Composer user, then you might want to look into the upgrade from Avid FX to Boris Red. Naturally, you can’t go wrong with FxFactory, especially if you use FCP X. There’s a wide range of options with the ability to purchase single filters – all centrally managed through the FxFactory application.

For audio, the go-to filter companies are iZotope, Waves and Focusrite to name a few. iZotope released some nice tools in its RX4 package – a state-of-the-art repair and restoration suite. If you just want a suite of EQ and compression tools, then Nectar Elements or Nectar 2 are the best all-in-one collections of audio filters. While most editors do their audio editing/mastering within their NLE, some need a bit more. Along with a 2.0 bump for Sound Forge Pro Mac, Sony Creative Software also released a standard version of Sound Forge through the Mac App Store.

In the color correction world, there’s been a lot of development in film emulation look-up tables (LUTs). These can be used in most NLEs and grading applications. If that’s for you, check out ImpulZ and Osiris from Color Grading Central (LUT Utility required with FCP X), Koji Color or the new SpeedLooks 4 (from LookLabs). Each package offers a selection of Fuji and Kodak emulations, as well as other stylized looks. These packages feature LUT files in the .cube and/or .look (Adobe) LUT file formats and, thus, are compatible with most applications. If you want film emulation that also includes 3-way grading tools and adjustable film grain, your best choice is FilmConvert 2.0.

Another category that is expanding covers the range of tools used to prep media from the camera prior to the edit. This had been something only for DITs and on-set “data wranglers”, but many videographers are increasingly using such tools on everyday productions. These now offer on-set features that benefit all file-based recordings. Pomfort Silverstack, ShotPut Pro, Redcine-X Pro and Adobe Prelude have been joined by new tools. To start, there’s Offload and EditReady, which are two very specific tools. Offload simply copies and verifies camera-card media to two target drives. EditReady is a simple drag-and-drop batch convertor to transcode media files. These join QtChange (a utility to batch-add timecode and reel IDs to media files) and Better Rename (a Finder renaming utility) in my book as the best single-purpose production applications.

If you want more in one tool, then there’s Bulletproof, which has now been joined in the market by Sony Creative Software’s Catalyst Browse and Prepare. Bulletproof features media offload, organization, color correction and transcoding. I like it, but my only beef is that it doesn’t properly handle timecode data when present. Catalyst Browse is free and similar to Canon’s camera utility. It’s designed to read and work with media from any Sony camera. Catalyst Prepare is the paid version with an expanded feature set. It supports media from other camera manufacturers, including Canon and GoPro.

Finally, many folks are looking for alternatives to Adobe Photoshop. I’m a fan of Pixelmator, but this has been joined by Pixlr and Mischief. All three are available from the Mac App Store. Pixlr is free, but can be expanded through subscription. In its basic form, Pixlr is a stylizing application that is like a very, very “lite” version of Photoshop; however, it includes some very nice image processing filters. Mischief is a drawing application designed to work with drawing tablets, although a mouse will work, too.

©2014 Oliver Peters

24p HD Restoration


There’s a lot of good film content that only lives on 4×3 SD 29.97 interlaced videotape masters. Certainly in many cases you can go back and retransfer the film to give it new life, but for many small filmmakers, the associated costs put that out of reach. In general, I’m referring to projects with $0 budgets. Is there a way to get an acceptable HD product from an old Digibeta master without breaking the bank? A recent project of mine would say yes.

How we got here

I had a rather storied history with this film. It was originally shot on 35mm negative, framed for 1.85:1, with the intent to end up with a cut negative and release prints for theatrical distribution. It was being posted around 2001 at a facility where I worked and I was involved with some of the post production, although not the original edit. At the time, synced dailies were transferred to Beta-SP with burn-in data on the top and bottom of the frame for offline editing purposes. As was common practice back then, the 24fps film negative was transferred to the interlaced video standard of 29.97fps with added 2:3 pulldown – a process that duplicates fields from the film frames so that 24 film frames add up evenly to 60 video fields in the NTSC world. This was loaded into an Avid, where – depending on the system – the redundant fields are removed, or the list that goes to the negative cutter compensates for the adjustments back to a frame-accurate 24fps film cut.
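
To make that cadence concrete, here is a minimal Python sketch (my own illustration, not part of any Avid workflow) that generates the classic 2:3 field pattern for one group of four film frames:

```python
# 2:3 pulldown: four film frames (A, B, C, D) become five video frames
# (ten fields). The middle frames are "split-field" frames built from
# two different film frames.
film_frames = ["A", "B", "C", "D"]
fields = []
for frame, count in zip(film_frames, [2, 3, 2, 3]):  # the 2:3 field-repeat pattern
    fields.extend([frame] * count)

# Interlaced video pairs consecutive fields into frames.
video_frames = [fields[i] + fields[i + 1] for i in range(0, len(fields), 2)]
print(video_frames)  # ['AA', 'BB', 'BC', 'CD', 'DD']

# 24 film frames x 2.5 fields on average = 60 fields = 30 video frames per second.
```

The “BC” and “CD” frames are the split-field frames that a reverse-telecine process has to identify and recombine in order to get back to whole, progressive film frames.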

For the purpose of festival screenings, the project file was loaded into our Avid Symphony and I conformed the film at uncompressed SD resolution from the Beta-SP dailies and handled color correction. I applied a mask to hide the burn-in and ended up with a letter-boxed sequence, which was then output to Digibeta for previews and sales pitches to potential distributors. The negative went off to the negative cutter, but for a variety of reasons, that cut was never fully completed. In the two years before a distribution deal was secured, additional minor video changes were made throughout the film to end up with a revised cut, which no longer matched the negative cut.

Ultimately the distribution deal that was struck was only for international video release and nothing theatrical, which meant that rather than finishing/revising the negative cut, the most cost-effective process was to deliver a clean video master. Except that all video source material had burn-in and the distributor required a full-height 4×3 master. Therefore, letter-boxing was out. To meet the delivery requirements, the filmmaker would have to go back to the original negative and retransfer it in a 4×3 SD format and master that to Digital Betacam. Since the negative was only partially cut and additional shots were added or changed, I went through a process of supervising the color-corrected transfer of all required 35mm film footage. Then I rebuilt the new edit timeline largely by eye-matching the new, clean footage to the old sequence. Once done and synced with the mix, a Digibeta master was created and off it went for distribution.

What goes around comes around

After a few years in distribution, the filmmaker retrieved his master and rights to the film, with the hope of breathing a little life into it through self-distribution – DVDs, Blu-rays, Internet, etc. With the masters back in-hand, it was now a question of how best to create a new product. One thought was simply to letter-box the film (to be in the director’s desired aspect) and call it a day. Of course, that still wouldn’t be in HD, which is where I stepped back in to create a restored master that would work for HD distribution.

Obviously, if there was any budget to retransfer the film negative to HD and repeat the same conforming operation that I’d done a few years ago – except now in HD – that would have been preferable. Naturally, if you have some budget, that path will give you better results, so shop around. Unfortunately, while desktop tools for editors and color correction have become dirt-cheap in the intervening years, film-to-tape transfer and film scanning services have not – and these retain a high price tag. So if I was to create a new HD master, it had to be from the existing 4×3 NTSC interlaced Digibeta master as the starting point.

In my experience, I know that if you are going to blow up SD to HD frame sizes, it’s best to start with a progressive, not interlaced, source. That’s even more true when working with software up-convertors, rather than hardware units like Teranex. Step one was to reconstruct a correct 23.98p SD master from the 29.97i source. To do this, I captured the Digibeta master as a ProResHQ file.

Avid Media Composer to the rescue


When you talk about software tools that are commonly available to most producers, there are a number of applications that can correctly apply a “reverse telecine” process. There are, of course, hardware solutions from Snell and Teranex (Blackmagic Design) that do an excellent job, but I’m focusing on a DIY solution in this post. That involves deconstructing the 2:3 pulldown (also called “3:2 pulldown”) cadence of whole and split-field frames back into only whole frames, without any interlaced tearing. After Effects and Cinema Tools offer this feature, but they really only work well when the entire source clip is of a consistent and unbroken cadence. This film had been completed in NTSC 29.97 TV-land, so the cadence would frequently change at cuts. In addition, some digital noise reduction had been applied to the final master after the Avid output to tape, which further altered the cadence at some cuts. Therefore, to reconstruct the proper cadence, changes had to be made every few cuts and, in some scenes, at every shot change. This meant slicing the master file at every required point and applying a different setting to each clip. The only software I know of that can do this effectively is Avid Media Composer.

Start in Media Composer by creating a 29.97 NTSC 4×3 project for the original source. Import the film file there. Next, create a second 23.98 NTSC 4×3 project. Open the bin from the 29.97 project into the 23.98 project and edit the 29.97 film clip to a new 23.98 sequence. Media Composer will apply a default motion adapter to the clip (which is the entire film) in order to reconcile the 29.97 interlaced frame rate into a 23.98 progressive timeline.

Now comes the hard part. Open the Motion Effect Editor window and “promote” the effect to gain access to the advanced controls. Set the Type to “Both Fields”, Source to “Film with 2:3 Pulldown” and Output to “Progressive”. Although you can hit “Detect” and let Media Composer try to decide the right cadence, it will likely guess incorrectly on a complex file like this. Instead, under the 2:3 Pulldown tab, toggle through the cadence options until you only see whole frames when you step through the shot frame-by-frame. Move forward to the next shot(s) until you see the cadence change and you see split-field frames again. Split the video track (place an “add edit”) at that cut and step through the cadence choices again to find the right combination. Rinse and repeat for the whole film.
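
If you want to map out where the cadence breaks fall before making that manual pass, a crude split-field detector can help. The sketch below is my own illustration – it assumes you have extracted the frames as grayscale NumPy arrays (for example, via an ffmpeg pipe) and uses the common “comb” heuristic: a scan line that differs from both of its vertical neighbors in the same direction suggests interlace tearing.

```python
import numpy as np

def comb_metric(frame):
    """Score interlace combing in a grayscale frame (2D array).
    Whole (progressive) frames score low; split-field frames spike."""
    above = frame[:-2].astype(np.float32)   # line above
    line = frame[1:-1].astype(np.float32)   # current line
    below = frame[2:].astype(np.float32)    # line below
    d1 = line - above
    d2 = line - below
    # A line that differs from both neighbors in the same direction
    # indicates combing; opposite-direction differences are just detail.
    comb = np.where(d1 * d2 > 0, d1 * d2, 0.0)
    return float(comb.mean())

# scores = [comb_metric(f) for f in frames]
# Split-field frames score well above the running average, so clusters
# of spikes reveal the pulldown pattern and the points where it breaks.
```

Plotting the scores makes the five-frame cadence – and the places where it changes – easy to spot, so you know in advance roughly where the add edits will be needed.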

Due to the nature of the process, you might have a cut that itself occurs within a split-field frame. That’s usually because this was a cut in the negative and was transferred as a split-field video frame. In that situation, you will have to remove the entire frame across both audio and video. These tiny 1-frame adjustments throughout the film will slightly shorten the duration, but usually it’s not a big deal. However, the audio edit may or may not be noticeable. If it can’t simply be fixed by a short 2-frame dissolve, then usually it’s possible to shift the audio edit a little into a pause between words, where it will sound fine.

Once the entire film is done, export a new self-contained master file. Depending on codecs and options, this might require a mixdown within Avid, especially if AMA linking was used. That was the case for this project, because I started out in ProResHQ. After export, you’ll have a clean, reconstructed 23.98p 4×3 NTSC-sized (720×486) master file. Now for the blow-up to HD.

DaVinci Resolve

There are many applications and filters that can blow up SD footage to HD, but often the results end up soft. I’ve found DaVinci Resolve to offer some of the cleanest resizing, along with very fast rendering for the final output. Resolve offers three scaling algorithms, with “Sharper” providing the crispest blow-up. The second issue is that since I wanted to restore the wider aspect, which is inherent in going from 4×3 to 16×9, this meant blowing up more than normal – enough to fit the image width and crop the top and bottom of the frame. Since Resolve has the editing tools to split clips at cuts, you have the option to change the vertical position of a frame using the tilt control. Plus, you can do this creatively on a shot-by-shot basis if you want to. This way you can optimize each shot to best fit the 16×9 frame, rather than arbitrarily lopping off a preset amount from the top and bottom.
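
The numbers show why this blow-up is more aggressive than a normal 4×3-to-HD conversion. Here’s a quick back-of-the-envelope calculation (my own, using the usual square-pixel approximation of the NTSC raster):

```python
# NTSC's 720x486 raster uses non-square pixels; as a 4x3 image in
# square-pixel terms it is roughly 648x486. Filling a 16x9 raster by
# width means the excess height gets cropped top and bottom.
sd_w, sd_h = 648, 486    # square-pixel equivalent of the 4x3 SD image
hd_w, hd_h = 1280, 720   # 720p target

scale = hd_w / sd_w      # ~1.98x, versus ~1.48x for a simple pillarboxed fit
scaled_h = sd_h * scale  # ~960 lines after the blow-up
crop = scaled_h - hd_h   # ~240 lines to trim, split between top and bottom
print(round(scale, 2), round(scaled_h), round(crop))  # 1.98 960 240
```

Those roughly 240 lines of cropping headroom are exactly what makes the per-shot vertical repositioning worthwhile.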

You actually have two options. The first is to blow up the film to a large 4×3 frame out of Resolve and then do the slicing and vertical reframing in yet another application, like FCP 7. That’s what I did originally with this project, because back then, the available version of Resolve did not offer what I felt were solid editing tools. Today, I would use the second option, which would be to do all of the reframing strictly within Resolve 11.

As always, there are some uncontrollable issues in this process. The original transfer of the film to Digibeta was done on a Rank Cintel Mark III, which is a telecine unit that used a CRT (literally an oscilloscope tube) as a light source. The images from these tubes get softer as they age and, therefore, they require periodic scheduled replacement. During the course of the transfer of the film, the lab replaced the tube, which resulted in a noticeable difference in crispness between shots done before and after the replacement. In the SD world, this didn’t appear to be a huge deal. Once I started blowing up that footage, however, it really made a difference. The crisper footage (after the tube replacement) held up to more of a blow-up than the earlier footage. In the end, I opted to only take the film to 720p (1280×720) rather than a full 1080p (1920×1080), just because I didn’t feel that the majority of the film held up well enough at 1080 – not just because of the softness, but also because of the level of film grain. Not ideal, but the best that can be expected under the circumstances. At 720p, it’s still quite good on Blu-ray, standard DVD or for HD over the web.

To finish the process, I dust-busted the film to fix places with obvious negative dirt (white specks in the frame) caused by the initial handling of the film negative. I used FCP X and CoreMelt’s SliceX to hide and cover negative dirt, but other options include built-in functions within Avid Media Composer. While 35mm film still holds a certain intangible visual charm – even in such a “manipulated” state – the process certainly makes you appreciate modern digital cameras like the ARRI ALEXA!

As an aside, I’ve done two other complete films this way, but in those cases, I was fortunate to work from 1080i masters, so no blow-up was required. One was a film transferred in its entirety from a low-contrast print, broken into reels. The second was assembled digitally and output to intermediate HDCAM-SR 23.98 masters for each reel. These were then assembled to a 1080i composite master. Aside from being in HD to start with, cadence changes only occurred at the edits between reels. This meant that it only required 5 or 6 cadence corrections to fix the entire film.

©2014 Oliver Peters

The Zero Theorem

Few filmmakers are as gifted as Terry Gilliam when it comes to setting a story inside a dystopian future. The Monty Python alum, who brought us Brazil and Twelve Monkeys, to name just a few, is back with his newest, The Zero Theorem. It’s the story of Qohen Leth – played by Christoph Waltz (Django Unchained, Water for Elephants, Inglourious Basterds) – an eccentric computer programmer who has been tasked by his corporate employer to solve the Zero Theorem, a calculation that, if solved, might prove that the meaning of life is nothingness.

The story is set in a futuristic London, but carries many of Gilliam’s hallmarks, like a retro approach to the design of technology. Qohen works out of his home, which is much like a rundown church. Part of the story takes Qohen into worlds of virtual reality, where he frequently interacts with Bainsley (Melanie Thierry), a webcam stripper that he met at a party, but who may have been sent by his employer, Mancom, to distract him. The Zero Theorem is very reminiscent of Brazil, but in concept, also of The Prisoner, a 1960s-era television series. Gilliam explores themes of isolation versus loneliness, the pointlessness of mathematical modeling to derive meaning and privacy issues.

I recently had a Skype chat with Mick Audsley, who edited the film last year. Audsley is London-based, but is currently nearing completion of a director’s cut of the feature film Everest in Iceland. This was his third Gilliam film, having previously edited Twelve Monkeys and The Imaginarium of Doctor Parnassus. Audsley explained, “I knew Terry before Twelve Monkeys and have always had a lot of admiration for him. This is my third film with Terry, as well as a short, and he’s an extraordinarily interesting director to work with. He still thinks in a graphic way, since he is both literally and figuratively an artist. He can do all of our jobs better than we can, but really values the input from other collaborators. It’s a bit like playing in a band, where everyone feeds off of the input of the other band members.”

The long path to production

The film’s screenplay writer Pat Rushin teaches creative writing at the University of Central Florida in Orlando, Florida. He originally submitted the script for The Zero Theorem to the television series Project Greenlight, where it made the top 250. The script ended up with the Zanuck Company. It was offered to Gilliam in 2008, but initially other projects got in the way. It was revived in June 2012 with Gilliam at the helm. The script was very ambitious for a limited budget of under $10 million, so production took place in Romania over a 37-day period. In spite of the cost challenges, it was shot on 35mm film and includes 250 visual effects.

Audsley continued, “Nicola [Pecorini, director of photography] shot a number of tests with film, RED and ARRI ALEXA cameras. The decision was made to use film. It allowed him the latitude to place lights outside of the chapel set – Qohen’s home – and have light coming in through the windows to light up the interior. Kodak’s lab in Bucharest handled the processing and transfer and then sent Avid MXF files to London, where I was editing. Terry and the crew were able to view dailies in Romania and then we discussed these over the phone. Viewing dailies is a rarity these days with digitally-shot films and something I really miss. Seeing the dailies with the full company provides clarity, but I’m afraid it’s dying out as part of the filmmaking process.”

While editing in parallel to the production, Audsley didn’t upload any in-progress cuts for Gilliam to review. He said, “It’s hard for the director to concentrate on the edit, while he’s still in production. As long as the coverage is there, it’s fine. Certainly Terry and Nicola have a supreme understanding of film grammar, so that’s not a problem. Terry knows to get those extra little shots that will make the edit better. So, I was editing largely on my own and had a first cut within about ten days of the time that the production wrapped. When Terry arrived in London, we first went over the film in twenty-minute reels. That took us about two to three weeks. Then we went through the whole film as one piece to get a sense for how it worked as a film.”

Making a cinematic story

As with most films, the “final draft” of the script occurs in the cutting room. Audsley continued, “The film as a written screenplay was very fluid, but when we viewed it as a completed film, it felt too linear and needed to be more cinematic – more out of order. We thought that it might be best to move the sentences around in a more interesting way. We did that quite easily and quickly. Thus, we took the strength of the writing and realized it in cinematic language. That’s one of the big benefits of the modern digital editing tools. The real film is about the relationship between Bainsley and Qohen and less about the world they inhabit. The challenge as filmmakers in the cutting room is to find that truth.”

Working with visual effects presents its own editorial challenge. “As an editor, you have to evaluate the weight and importance of the plate – the base element for a visual effect – before committing to the effect. From the point-of-view of cost, you can’t keep undoing shots that have teams of artists working on them. You have to ensure that the timing is exactly right before turning over the elements for visual effects development. The biggest, single visual challenge is making Terry’s world, which is visually very rich. In the first reel, we see a futuristic London, with moving billboards. These shots were very complex and required a lot of temp effects that I layered up in the timeline. It’s one of the more complex sequences I’ve built in the Avid, with both visual and audio elements interacting. You have to decide how much can you digest and that’s an open conversation with the director and effects artists.”

The post schedule lasted about twenty weeks ending with a mix in June 2013. Part of that time was tied up in waiting for the completion of visual effects. Since there was no budget for official audience screenings, the editorial team was not tasked with creating temp mixes and preview versions before finishing the film. Audsley said, “The first cut was not overly long. Terry is good in his planning. One big change that we made during the edit was to the film’s ending. As written, Qohen ends up in the real world for a nice, tidy ending. We opted to end the film earlier for a more ambiguous ending that would be better. In the final cut the film ends while he’s still in a virtual reality world. It provides a more cerebral versus practical ending for the viewer.”

Cutting style 

Audsley characterizes his cutting style as “old school”. He explained, “I come from a Moviola background, so I like to leave my cut as bare as possible, with few temp sound effects or music cues. I’ll only add what’s needed to help you understand the story. Since we weren’t obliged on this film to do temp mixes for screenings, I was able to keep the cut sparse. This lets you really focus on the cut and know if the film is working or not. If it does, then sound effects and music will only make it better. Often a rough cut will have temp music and people have trouble figuring out why a film isn’t working. The music may mask an issue or, in fact, it might simply be that the wrong temp music was used. On The Zero Theorem, George Fenton, our composer, gave us representative pieces late in the process that he’d written for scenes.” Andre Jacquemin was the sound designer who worked in parallel to Audsley’s cut and the two developed an interactive process. Audsley explained, “Sometimes sound would need to breathe more, so I’d open a scene up a bit. We had a nice back-and-forth in how we worked.”

Audsley edited the film using Avid Media Composer version 5 connected to an Avid Unity shared storage system. This linked him to another Avid workstation run by his first assistant editor, Pani Ahmadi-Moore. He’s since upgraded to version 7 software and Avid ISIS shared storage. Audsley said, “I work the Avid pretty much like I worked when I used the Moviola and cut on film. Footage is grouped into bins for each scene. As I edit, I cut the film into reels and then use version numbers as I duplicate sequences to make changes. I keep a daily handwritten log about what’s done each day. The trick is to be fastidious and organized. Pani handles the preparation and asset management so that I can concentrate on the edit.”

Audsley continued, “Terry’s films are very much a family type of business. It’s a family of people who know each other. Terry is supremely in control of his films, but he’s also secure in sharing with his filmmaking family. We are open to discuss all aspects of the film. The cutting room has to be a safe place for a director, but it’s the hub of all the post activity, so everyone has to feel free about voicing their opinions.”

Much of what the editor does proceeds in isolation. The Zero Theorem provided a certain ironic resonance for Audsley, who commented, “At the start, we see a guy sitting naked in front of a computer. His life is harnessed in manipulating something on screen, and that is something I can relate to as a film editor! I think it’s very much a document of our time, about the notion that in this world of communication, there’s a strong aspect of isolation. All the communication in the world does not necessarily connect you spiritually.” The Zero Theorem is scheduled to open for limited US distribution in September.

For more thoughts from Mick Audsley, read this post at Avid Blogs.

Originally written for DV magazine / CreativePlanetNetwork.

©2014 Oliver Peters

Avid Everywhere

It’s interesting to see that in spite of a lot of press, the Avid Everywhere concept still results in confusion. Avid has certainly been enunciating it since last year, with a full roll-out at NAB this past April. For whatever reason, Avid Everywhere seems to be lumped together with Adobe Anywhere in the minds of many. Maybe it’s the similarity of names, or that they both have a cloud component, but they aren’t the same thing. Avid Everywhere is a corporate vision, while Adobe Anywhere is a specific product (more on that later).

Vision and strategy

Avid Technology is a company with a diverse range of hardware and software products, covering content creation (video, audio, graphics, news), asset management, audio/video hardware i/o, consoles and control surfaces, storage and servers. In an effort to consolidate and rebrand a wide-ranging set of offerings, Avid has repackaged these existing (and future) products under the banner of Avid Everywhere. This is a marketing strategy designed to convey the message that whatever your media needs might be, Avid has a product or service to satisfy that need. This is coupled to a community of users that can benefit from their common use of Avid products.

This vision positions Avid’s products as a “platform”, in the same way that Windows, Mac OS X, iOS, Android, Apple hardware and PC hardware are all platforms. Within this platform concept, the products become stratified into product tiers or “suites”. Bear in mind that “suite” really refers to a group of products and not specifically a collection of hardware or software that you purchase as a single unit. The base layer of this platform contains the various software hooks that tie the products together – for example, APIs required to use Media Composer software with Interplay asset management or in an ISIS SAN environment. This is called the Avid MediaCentral Platform.

On top of this sits the Storage Suite, which consists of the various Avid storage solutions, such as ISIS, along with news play-out servers. The next tier is the Media Suite, which encompasses the Interplay asset management and iNews newsroom products. In the transition to the Avid Everywhere strategy, you’ll see a lot of references on Avid’s website and in their marketing literature to “formerly Interplay ___”. That’s because Avid is in the process of rebranding these products into something with a “Media ___” name.

Most users who are editing and audio professionals will mainly associate Avid with the Artist Suite tier. This is the layer of content creation tools, including Media Composer, Pro Tools, Sibelius and the control surfaces that came out of Digidesign and Euphonix, including the Artist panels. If you are a single user of Media Composer, Pro Tools or Sibelius and own no other Avid infrastructure, like ISIS or Interplay, then the entire Avid Everywhere media platform doesn’t touch you very much for now.

The top layer of the platform chart is MediaCentral | UX, which was formerly known as Interplay Central. This is a web front-end that allows you to browse, log and notate Interplay assets from a desktop computer, laptop or mobile device. Although the current iteration is targeted at news production, the concept is story-centric and could provide functionality in other arenas, such as drama and reality series production.

Surrounding the entire structure are support services (tech support and professional integration services) plus a private and public marketplace. Media Composer software has included a Marketplace menu item for a few versions. Until now, this has been a web portal to buy plug-ins and stock footage. The updated vision for this is more along the lines of services like SoundCloud, Adobe’s Behance service or the files section of Creative Cloud. For example, let’s say you are a composer that uses Pro Tools. You create licensable music tracks and post them to the Marketplace. Other users can browse the Marketplace and find your tracks, complete with licensing and payment arrangements. To make this work, the Avid MediaCentral Platform includes things like proper security to enable such transactions.

All clouds are not the same

I started this post with the comment that I feel many editors confuse Adobe Anywhere and Avid Everywhere. I believe that’s because they mistakenly interpret Avid Everywhere as the specific version of the Media Composer product that enables remote-access editing. As I’ve explained above, Everywhere is a concept and vision, not a product. That specific Media Composer product (formerly Interplay Sphere) is now branded as Media Composer | Cloud. As a product, it most closely approximates Adobe Anywhere, but there are key differences.

Adobe Anywhere is a system that requires a centralized server and storage. Any computer with Premiere Pro CC or CC 2014 can remotely access the assets on this system, which streams proxy media back to that computer. All the “heavy lifting” is done at the central site and the editor’s Premiere Pro is effectively working only as a local front-end. The operation does not allow hybrid editing with a combination of local and remote assets. All local assets have to be uploaded to the server and then streamed back to the editor. That’s because Anywhere manages the assets for multiple editors during collaborative workflows and handles project versioning. If you are working on an Anywhere production, you always have to be connected to the network.

In contrast, Media Composer | Cloud is primarily a plug-in that works with an otherwise standard version of the Media Composer software. In order for it to function, the “home base” facility must have an appropriate Interplay/ISIS infrastructure so that Media Composer | Cloud can talk to it. In Avid marketing parlance “you’ve got to get on the platform” for some of these things to work.

Media Composer | Cloud permits hybrid editing. For example, a news videographer in the field can be editing at the proverbial Starbucks using local assets. Maybe part of the story requires access to past b-roll footage that lives back at the station on its newsroom storage. Through Media Composer | Cloud and Interplay, the videographer can access those files as proxies and integrate them into the piece. Meanwhile, local assets can be uploaded back to the station. When the piece is cut, a “publish” command (an AAF of the sequence) goes back to the station for quick turnaround to air. Media Composer | Cloud, by its nature, doesn’t require continuous connection, so editing can continue during transit, such as in a vehicle.

While not everything about Avid Everywhere has been fully implemented yet, it certainly is an aggressive strategy. It is an attempt to move the company as a whole into areas beyond just editing software, while still allowing users and owners to leverage their Avid assets into other opportunities.

©2014 Oliver Peters

Amira Color Tool and your NLE

I was recently alerted to the new Amira Color Tool by Michael Phillips’ 24p blog. This is a lightweight ARRI software application designed to create custom in-camera looks for the Amira camera, which you do by creating custom color look-up tables (LUTs). The Amira Color Tool is available as a free download from the ARRI website (free registration required). Although the application is designed for the camera, you can also export looks in a variety of LUT file formats, which in turn may be installed and applied to footage in a number of different editing and color correction applications. I tested this in both Apple Final Cut Pro X and Avid Media Composer | Software (v8) with good results.

The Amira Color Tool is designed to convert log-C encoded footage to a straight Rec709 offset or to Rec709 with a custom look. ARRI offers some very good instructions, white papers, sample looks and tutorials that cover the operation of this software. The signal flow goes from the log-C image, to the Rec709 correction, and then to the CDL-based color correction. To my eye, the math appears to be floating point, because a Rec709 conversion that throws a shot into clipping can be pulled back out of clipping in the look tab, using the CDL color correction tools. Therefore it is possible to use this tool for shots other than ARRI Amira or Alexa log-C footage, as long as the footage is sufficiently flat.

The CDL correction tools are based on slope, offset and power. In that model, slope is equivalent to gain, offset to lift and power to gamma. In addition to color wheels, there’s a second video look parameters tab with hue intensities for the six main vectors (red, yellow, green, cyan, blue and magenta). The Amira Color Tool is Mac-only and opened both the QuickTime and DPX files that I tested. It worked successfully with clips shot on an Alexa (log-C), Blackmagic Cinema Camera (BMD Film profile), Sony F3 (S-log) and Canon 1DC (4K Canon-log). Remember that the software is designed to correct flat, log-C images, so you probably don’t want to use this with images that were already encoded with vibrant Rec709 colors.
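
For reference, the ASC CDL that these controls are modeled on is a simple per-channel transfer function. Here is a minimal Python sketch of the core math (clamping behavior varies between implementations, so treat the details as illustrative):

```python
import numpy as np

def apply_cdl(rgb, slope, offset, power):
    """ASC CDL per-channel transfer: out = (in * slope + offset) ** power.
    Slope behaves like gain, offset like lift and power like gamma."""
    rgb = np.asarray(rgb, dtype=np.float32)
    out = rgb * slope + offset
    return np.clip(out, 0.0, None) ** power  # clamp negatives before the power op

# Example (values made up): a mild warm-up on an 18% gray pixel.
print(apply_cdl([0.18, 0.18, 0.18],
                slope=[1.10, 1.00, 0.95],
                offset=[0.02, 0.00, -0.01],
                power=[0.95, 1.00, 1.00]))
```

Because the math here stays in floating point, values pushed past 1.0 by slope or offset aren’t lost – consistent with the clipping recovery described above.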


To use the Amira Color Tool, import your clip from the application’s file browser, set the look and export a 3D LUT in the appropriate format. I used the DaVinci Resolve setting, which creates a 3D LUT in a .cube format file. To get this into FCP X, you need to buy and install a LUT filter, like Color Grading Central’s LUT Utility. To install a new LUT there, open the LUT Utility pane in System Preferences, click the “+” symbol and navigate to where the file was saved. In FCP X, apply the LUT Utility to the clip as a filter. From the filter’s pulldown selection in the inspector, choose the new LUT that you’ve created and installed. One caveat is to be careful with ARRI files. Any files recorded with newer ARRI firmware are flagged for log-C and FCP X automatically corrects these to Rec709. Since you don’t want to double up on LUTs, make sure “log processing” is unchecked for those clips in the info tab of the inspector pane.
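
If you’re curious what’s inside the exported file, the core of the .cube format is just a plain-text header plus a list of RGB output triples, with the red index varying fastest. Here is a minimal Python sketch that writes an identity 3D LUT (a simplified illustration – real exports may carry additional optional header lines):

```python
# Write a minimal identity 3D LUT in the .cube text format.
N = 17  # lattice points per axis; 33 is also common

with open("identity.cube", "w") as f:
    f.write(f"LUT_3D_SIZE {N}\n")
    # One "R G B" output triple per lattice point, red index fastest.
    for b in range(N):
        for g in range(N):
            for r in range(N):
                f.write(f"{r/(N-1):.6f} {g/(N-1):.6f} {b/(N-1):.6f}\n")
```

A graded look simply replaces those identity triples with the corrected output color for each lattice point; the host application interpolates between them during playback.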

Media Composer

To use the custom LUTs in Media Composer, select “source settings” for the clip. Go to the color management tab and install the LUT. Now it will be available in the pull-down menu for color conversions. This color management change can be applied to a single clip or to a batch of clips within a bin.

In both cases, the source clips in FCP X and/or Media Composer will play in real-time with the custom look already applied.



©2014 Oliver Peters