Avid Media Composer | First

They’ve teased us for two years, but now it’s finally out. Avid Technology has released its free nonlinear editing application, Media Composer | First. This is not dumbed-down, teaser software, but rather a partially-restricted version of the full-fledged Media Composer software and built upon the same code. With that comes an inherent level of complexity, which Avid has sought to minimize for new users; however, you really do want to go through the tutorials before diving in.

It’s important to understand who the target user is. Avid didn’t set out to simply add another free, professional editing tool to an increasingly crowded market. Media Composer | First is intended as a functional starter tool for users who want to get their feet wet in the Avid ecosystem, but then eventually convert to the full-fledged, paid software. That’s been successful for Avid with Pro Tools | First. To sweeten the pot, you’ll also get 350 sound effects from Pro Sound Effects and 50 royalty-free music tracks from Sound Ideas (both sets are also free).

Diving in

To get Media Composer | First, you must set up an Avid master account, which is free. Existing customers can also get First, but the software cannot be co-installed on a computer with the full version. For example, I installed Media Composer | First on my laptop, because I have the full Media Composer application on my desktop. You must sign into the account and stay signed in for Media Composer | First to launch and run. I did get it to work after signing in and then disconnecting from the internet. There was a disconnection prompt, but the application still worked, saved, and exported properly, so a constant connection to Avid doesn’t appear to be mandatory. All project data is stored locally, so this is not a cloud application.

Account management and future updates are handled through Application Manager, an Avid desktop utility. It’s not my favorite, as it can be unreliable at times, but it works most of the time. The installer .dmg file takes a long time to verify when you open it. This seems to be a general Avid quirk, so be patient. When you first open the application, you may get a disk drive write-permissions error message. On macOS, drive permissions are normally set for “system”, “wheel”, and “everyone”. Typically I have the last two set to “read only”, which works for every other application except Avid’s. Therefore, if you want to store Avid media on your internal system hard drive, “everyone” must be changed to “read & write”.
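If you’d rather check this from a script than from the Finder’s Get Info panel, the “everyone” setting corresponds to the POSIX “others” permission bits. Here’s a minimal sketch in Python – the path is purely hypothetical, so point it at whichever drive or folder will hold your Avid MediaFiles:

```python
import os
import stat

# Hypothetical location for Avid media -- substitute your own drive or folder.
media_path = "/Volumes/Media/Avid MediaFiles"

perms = stat.S_IMODE(os.stat(media_path).st_mode)
if perms & stat.S_IWOTH:
    print("'everyone' already has read & write access.")
else:
    # Add read & write for "others" -- the equivalent of setting
    # "everyone" to "Read & Write" in the Finder's Get Info panel.
    os.chmod(media_path, perms | stat.S_IROTH | stat.S_IWOTH)
    print("Updated permissions on", media_path)
```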

The guided tour

The Avid designers have tried to make the Media Composer | First interface easy to navigate for new users – especially those coming from other NLEs, where media and projects are managed differently than in Media Composer. Right at the launch screen you have the option to learn through online tutorials. These will be helpful even for experienced users who might try to “out-think” the software. The interface includes a number of text overlays to help you get started. For example, there is no separate dialogue for project settings – the first clip added to the first sequence determines the project settings from then on. So, don’t drop a 25fps clip onto the timeline as your first clip if you intend to work in a 23.98fps project. These prompts are right in front of you, so if you follow their guidance, you’ll be OK.

The same holds true for importing media through the Source Browser. With Media Composer you either transcode a file, which turns it into Avid-managed media placed into the Avid MediaFiles folder, or simply link to the file. If you select link, then the file stays in place and it’s up to the user not to move or delete that file on the hard drive. Although the original Avid paradigm was to only manage media in its MediaFiles hard drive folders, the current versions handle linking just fine and act largely the same as other NLEs.

Options, restrictions, and limitations

Since this is a free application, a number of features have been restricted. There are three biggies. Tracks are limited to four video and eight audio. This is actually quite workable; however, a higher audio track count would have been advisable, given how Avid handles stereo, mono, and multichannel files. On a side note, if you use the “collapse” function to nest video clips, it’s possible to vertically stack more than just four clips on the timeline.

The application is locked to a maximum project size of 1920×1080 (Rec. 709 color space only) and up to 59.94fps. Source files can be larger (such as 4K) and you can still use them on the timeline, but you’ll have to pan-and-scan, crop, or scale them. I hope future versions will permit at least UltraHD (4K) project sizes.

Finally, Media Composer | First projects cannot be interchanged with full-fledged Media Composer projects. This means that you cannot start in Media Composer | First and then migrate your project to the paid version. Hopefully this gets fixed in a future update. If not, it will negatively impact students and indie producers using the application for any real work.

As expected, there are no 3D stereoscopic tools, ScriptSync (automatic speech-to-text/sync-to-script), PhraseFind (phonetic search engine), or Symphony (advanced color correction) options. One that surprised me, though, was the removal of the superior SpectraMatte keyer. You are left with the truly terrible RGB keyer for blue/green-screen work.

Nevertheless, there’s plenty of horsepower left. For example, FrameFlex handles resizing and Timewarps handle retiming, which is how 4K sources and off-speed frame rates are accommodated. Color correction (including scopes), multicam, IllusionFX, source-settings color LUTs, AudioSuite, and Pro Tools-style audio track effects are also there. Transcoding allows for a wide range of codecs, including ProRes on a Mac; 4K camera clips will be transcoded to 1080. However, exports are limited to Avid DNxHD and H.264 QuickTime files at up to 1920×1080. The only DNxHD export flavor is the 100Mbps variant at 29.97fps (80Mbps at 23.98fps), which is comparable to ProRes LT. It’s good quality, but not at the highest mastering levels.
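To put those export limits in perspective, here’s a quick back-of-the-envelope storage calculation for the two DNxHD flavors mentioned above. It only accounts for the video essence – audio tracks and QuickTime container overhead are ignored:

```python
def export_size_gb(bitrate_mbps: float, minutes: float) -> float:
    """Approximate file size in GB for a constant-bitrate video export."""
    megabytes_per_second = bitrate_mbps / 8.0            # megabits -> megabytes
    return megabytes_per_second * 60.0 * minutes / 1000  # per-minute total -> GB

# DNxHD 100 (29.97fps) and DNxHD 80 (23.98fps), one hour of program each:
print(round(export_size_gb(100, 60)))  # ~45 GB per hour
print(round(export_size_gb(80, 60)))   # ~36 GB per hour
```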

Conclusion

This is a really good first effort, no pun intended. As you might expect, it’s a little buggy for a first version. For example, I experienced a number of crashes while testing source LUTs. However, it was well-behaved during standard editing tasks. If Media Composer | First files can become compatible with the paid systems and the 1080 limit can be increased to UHD/4K, then Avid has a winner on its hands. Think of the film student who starts on First at home, but then finishes on the full version in the college’s computer lab. Or the indie producer/director who starts his or her own rough cut on First, but then takes it to an editor or facility to complete the process. These are ideal scenarios for First. I’ve cut tons of short and long-form projects, including a few feature films, using a variety of NLEs. Nearly all of those could have been done using Media Composer | First. Yes, it’s free, but there’s enough power to get the job done and done well.

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters

Suburbicon

George Clooney’s latest film, Suburbicon, originated over a decade ago as a screenplay by Joel and Ethan Coen. Clooney picked it up when the Coens decided not to produce the film themselves. Clooney and writing partner Grant Heslov (The Monuments Men, The Ides of March, Good Night, and Good Luck) rewrote it, setting the story in the 1950s and adding another story element. In the summer of 1957, the Myers, an African-American couple, moved into a largely white suburb in Levittown, Pennsylvania, setting off months of violent protests. The rewritten script interweaves the tale of the black family with that of their next-door neighbors, Gardner (Matt Damon) and Margaret (Julianne Moore). In fact, a documentary was produced about the historical events, and shots from that documentary were used in Suburbicon.

Calibrating the tone

During the production and editing of the film, the overall tone was adjusted as a result of the actual, contemporary events occurring in the country. I spoke with the film’s editor, Stephen Mirrione (The Revenant, Birdman or (The Unexpected Virtue of Ignorance), The Monuments Men) about this. Mirrione explains, “The movie is presented as over-the-top to exaggerate events as satire. In feeling that out, George started to tone down the silliness, based on surrounding events. The production was being filmed during the time of the US election last year, so the mood on the set changed. The real world was more over-the-top than imagined, so the film didn’t feel quite right. George started gravitating towards a more realistic style and we locked into that tone by the time the film moved into post.”

The production took place on the Warner Brothers lot in September 2016 with Mirrione and first assistant editor Patrick Smith cutting in parallel with the production. Mirrione continues, “I was cutting during this production period. George would come in on Saturdays to work with me and ‘recalibrate’ the cut. Naturally some scenes were lost in this process. They were funny scenes, but just didn’t fit the direction any longer. In January we moved to England for the rest of the post. Amal [Clooney, George’s wife] was pregnant at the time, so George and Amal wanted to be close to her family near London. We had done post there before and had a good relationship with vendors for sound post. The final sound mix was in the April/May time frame. We had an editing room set up close to George outside of London, but also others in Twickenham and at Pinewood Studios. This way I could move around to work with George on the cut, wherever he needed to be.”

Traveling light

Mirrione is used to working with a light footprint, so the need for mobility was no burden. He explains, “I’m accustomed to being very mobile. All the media was in the Avid DNxHD36 format on mobile drives. We had an Avid ISIS shared storage system in Twickenham, which was the hub for all of the media. Patrick would make sure all the drives were updated during production, so I was able to work completely with standalone drives. The Avid is a bit faster that way, although there’s a slight trade-off waiting for updated bins to be sent. I was using a ‘trash can’ [2013] Mac Pro plus AJA hardware, but I also used a laptop – mainly for reference – when we were in LA during the final steps of the process.” The intercontinental workflow also extended to color correction. According to Mirrione, “Stefan Sonnenfeld was our digital intermediate colorist and Company 3 [Co3] stored a back-up of all the original media. Through an arrangement with Deluxe, he was able to stream material to England for review, as well as from England to LA to show the DP [Robert Elswit].”

Music was critical to Suburbicon and scoring fell to Alexandre Desplat (The Secret Life of Pets, Florence Foster Jenkins, The Danish Girl). Mirrione explains their scoring process. “It was very important, as we built the temp score in the edit, to understand the tone and suspense of the film. George wanted a classic 1950s-style score. We tapped some Elmer Bernstein, Grifters, The Good Son, and other music for our initial style and direction. Peter Clarke was brought on as music editor to help round out the emotional beats. Once we finished the cut, Alexandre and George worked together to create a beautiful score. I love watching the scenes with that score, because his music makes the editing seem much more exciting and elegant.”

Suiting the edit tool to your needs

Stephen Mirrione typically uses Avid Media Composer to cut his films and Suburbicon is no exception. Unlike many film editors who rely on unique Avid features, like ScriptSync, Mirrione takes a more straightforward approach. He says, “We were using Media Composer 8. The way George shoots, there’s not a lot of improv or tons of takes. I prefer to just rely on PDFs of the script notes and placing descriptions into the bins. The infrastructure required for ScriptSync, like extra assistants, is not something I need. My usual method of organization is a bin for each day of dailies, organized in shooting order. If the director remembers something, it’s easy to find in a day bin. During the edit, I alternate my bin set-ups between the script view and the frame view.”

With a number of noted editors dabbling with other software, I wondered whether Mirrione has been tempted. He responds, “I view my approach as system-agnostic and have cut on Lightworks and the old Laser Pacific unit, among others. I don’t want to be dependent on one piece of software to define how I do my craft. But I keep coming back to Avid. For me it’s the trim mode. It takes me back to the way I cut film. I looked at Resolve, because it would be great to skip the roundtrip between applications. I had tested it, but felt it would be too steep a learning curve, and that would have impacted George’s experience as the director.”

In wrapping up our conversation, Mirrione concluded with this takeaway from his Suburbicon experience. He explains, “In our first preview screening, it was inspiring to see how seriously the audience took the film and the attachment they had to the characters. The audiences were surprised at how biting and relevant it is to today. The theme of the film is really talking about what can happen when people don’t speak out against racism and bullying. I’m so proud and lucky to have the opportunity to work with someone like George, who wants to do something meaningful.”

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters

Baby Driver

You don’t have to be a rabid fan of Edgar Wright’s work to know his films. His comedy trilogy (Shaun of the Dead, Hot Fuzz, The World’s End) and cult classics like Scott Pilgrim vs. the World loom large in pop culture. His films have enjoyed a life well beyond the typical brief release window and have earned Wright a loyal following. His latest is Baby Driver, a musically fueled action film that Wright wrote and directed, which just made a big splash at SXSW. It stars Ansel Elgort, Kevin Spacey, Jon Hamm, Jamie Foxx, and Eiza Gonzalez.

At NAB, Avid brought in a number of featured speakers for its main stage presentations, as well as its Avid Connect event. One of these speakers was Paul Machliss (Scott Pilgrim vs. the World, The World’s End, Baby Driver), who spoke to packed audiences about the art of editing these films. I had a chance to go in-depth with Machliss about the complex process of working on Baby Driver.

From Smoke to baptism by fire

We started our conversation with a bit of the backstory of the connection between Wright and Machliss. He says, “I started editing as an online editor and progressed from tape-based systems to being one of the early London-based Smoke editors. My boss at the time passed along a project that he thought would be perfect for Smoke. That was onlining the sitcom Spaced, directed by Edgar Wright. Edgar and I got on well. Concurrent to that, I had started learning Avid. I started doing offline editing jobs for other directors and had a ball. A chance came along to do a David Beckham documentary, so I took the plunge from being a full-time online editor to taking my chances in the freelance world. On the tail end of the documentary, I got a call from Edgar, offering me the gig to be the offline editor for the second season of Spaced, because Chris Dickens (Hot Fuzz, Berberian Sound Studio, Slumdog Millionaire) wasn’t available to complete the edit. And that was really jumping into the deep end. It was fantastic to be able to work with Edgar at that level.”

Machliss continues, “Chris came back to work with Edgar on Shaun of the Dead and Hot Fuzz, so over the following years I honed my skills working on a number of British comedies and dramas. After Slumdog Millionaire came out, which Chris cut and for which he won a number of awards, including an Oscar, Chris suddenly found himself very busy, so the rest of us working with Edgar all moved up one in the queue, so to speak. The opportunity to edit Scott Pilgrim came up, so we all threw ourselves into the world of feature films, which was definitely a baptism by fire. We were very lucky to be able to work on a project of that nature during a time where the industry was in a bit of a slump due to the recession. And it’s fantastic that people still remember it and talk about it seven years on. Which brings us to Baby Driver. It’s great when a studio is willing to invest in a film that isn’t a franchise, a sequel, or a reboot.”

Music drives the film

In Baby Driver, Ansel Elgort plays “Baby”, a young kid who is the getaway driver for a gang. A childhood car accident left him with tinnitus, so he listens to music 24/7 to drown out the ringing. Machliss explains, “His whole life becomes regimented to whatever music he is listening to – different music for different moods or occasions. Somehow everything falls magically into sync with whatever he is listening to – when he’s driving, swerving to avoid a car, making a turn – it all seems to happen on the beat. Music drives every single scene. Edgar deliberately chose commercial top-20 tracks from the 1960s up to today. Each song Baby listens to also slyly comments on whatever is happening at the time in the story. Everything is seemingly choreographed to musical rhythms. You’re not looking at a musical, but everything is musically driven.”

Naturally, building a film to popular music brings up a whole host of production issues. Machliss tells how this film had been in the planning for years, “Edgar had chosen these tracks years ago. I believe it was in 2011 that Edgar and I tried to sequence the tracks and intersperse them with sound effects. A couple of months later, he did a table read in LA and sent me the sound files. In the Avid, I combined the sound files, songs, and some sound effects to create effectively a 100-minute radio play, which was, in fact, the film in audio form. The big thing is that we had to clear every song before we could start filming. Eventually we cleared 30-odd songs for the film. In addition, Edgar worked with his stunt team and editor Evan Schiff in LA to create storyboards and animatics for all of the action scenes.”

Editor on the front lines

Unlike most films, a significant amount of the editing took place on-set with Machliss working from a portable set-up. He says, “Based on our experiences with Scott Pilgrim and World’s End, Edgar decided it would be best to have me on-set during most of the Atlanta shoot for Baby Driver. Even though a cutting room was available, I was in there maybe ten percent of the time. The rest of the time I was on set. I had a trolley with a laptop, monitor, an Avid Mojo, and some hard drives and I would connect myself via ethernet to the video assist’s hard drive. Effectively I was crew in the front lines with everyone else. Making sure the edit worked was as important as getting a good take in the can. If I assured Edgar that a take would work, then he knew it wasn’t going to come back and cause problems for us six months later. We wanted things to work naturally in camera without a lot of fiddling in post. We didn’t want to have to fall back on frame-cutting and vari-speeding if we didn’t have to. There was a lot of prep work in making sure actions correctly coincided with certain lyrics without the action seeming mechanical.”

The nature of the production added to the complexity of the production audio configuration, too. Machliss explains, “Sound-wise, it was very complicated. We had playback going to earwigs in the actors’ ears, Edgar wanted to hear music plus the dialogue in his cans, and then I needed to get a split feed of the audio, since I already had the clean music on my timeline. We shot this mostly on 35mm film. Some days were A-camera only, but usually two cameras running. It was a combination of Panavision, Arricams, and occasionally Arri Alexas. Sometimes there were some stunt shots, which required nine or ten cameras running. Since the action all happened against playback of a track, this allowed me to use Avid’s multicam tools to quickly group shots together. Avid’s AMA tools have really come of age, so I was able to work without needing to ingest anything. I could treat the video assist’s hard drive as my source media, as long as I had the ethernet connection to it. If we were between set-ups, I could get Avid to background-transcode the media, so I’d have my own copy.”

Did all of this on-set editing speed up the rest of the post process? He continues, “All of the on-set editing helped a great deal, because we went into the real post-production phase knowing that all the sequences basically worked. During that time, as I’d fill up a LaCie Rugged drive, I would send that back to the suites. My assistant, Jerry Ramsbottom, would then patiently overcut my edits from the video assist with the actual scanned telecine footage as it came in. We shot from mid-February until mid-May and then returned to England. Jonathan Amos came on board a few weeks into the director’s cut edit and worked on the film with Edgar and myself up until the director’s cut picture lock. He did a pass on some of the action scenes while Edgar and myself concentrated on dialogue and the overall shape of the film. He stayed on board up until the final picture lock and made an incredible contribution to the action and the tension of the film. By the end of the year we’d locked and then we finished the final mix mid-February of this year. But the great thing was to be able to come into the edit and have those sequences ready to go.”

Editing from set is something many editors try to avoid. They feel they can be more objective that way. Machliss sees it a bit differently, “Some editors don’t like being on set, but I like the openness of it – taking it all in. Because when you are in the edit, you can recall the events of the day a particular scene was shot – ‘I can remember when Kevin Spacey did this thing on the third take, which could be useful’. It’s not vital to work like this, but it does lend itself to a kind of shorthand, which is something Edgar and I have developed over these years anyway. The beauty of it is that Edgar and I will take the time to try every option. You can never hit on the perfect cut the first time. Often you’ll get feedback from screenings, such as ‘we’d like to see more emotion between these characters’. You know what’s available and sometimes four extra shots can make all the difference in how a scene reads without having to re-imagine anything. We did drop some scenes from the final version of the film. Of course, you go ‘that’s a shame’, but at least these scenes were given a chance. However, there are always bits where upon the 200th viewing you can decide, ‘well, that’s completely redundant’ – and it’s easy to drop. You always skate as close as you can to the edge of making the film shorter without doing any damage to it.”

The challenge of sound

During sound post, Baby Driver also presented some unique challenges. Machliss says, “For the sound mix – and even for the shoot – we had to make sure we were working with the final masters of the song recordings to make sure the pitch and duration remained constant throughout. Typically these came in as mono or stereo WAVs. Because music is such an important element to the film, the concept of perceived direction becomes important. Is the music emanating from Baby’s earbuds? What happens to it when the camera moves or he turns his head? We had to work out a language for the perception of sound. This was Edgar’s first film mixed in Dolby Atmos and we were the second film in Goldcrest London’s new Atmos-certified dubbing theater. Then we did a reduction to 7.1 and 5.1. Initially we were thinking this film would have no score other than the songs. Invariably you need something to get from A to B. We called on the services of Steven Price (Gravity, Fury, Suicide Squad), who provided us with some original cues and some musical textures. He did a very clever thing where he would match the end pitch or notes of a commercial song and then by the time he came to the end of his cue, it would match to the incoming note or key of the next song. And you never notice the change.”

Working with Avid in a new way

To wrap up the conversation, we talked a bit about using Avid Media Composer for his work. Machliss has used numerous other systems, but Media Composer still fits the bill for his work today. He says, “For me, the speed of working with AMA in Avid in the latest software was a real benefit. I could actually keep up with the speed of the shoot. You don’t want to be the one holding up a crew of 70. I also made good use of background transcoding. On a different project (Fleabag), I was able to work with native 2K Alexa ProRes camera files at full resolution. It was fantastic to be able to use FrameFlex and apply LUTs – doing the cutting, but then bringing back my old skills as an online editor to paint out booms and fix things up. Once we locked, I could remove the LUTs and export DPX files, which went straight to the grading facility. That was exciting to work in a new way.”

Baby Driver opened at the start of July in the US and is a fun ride. You can certainly enjoy a film like this without knowing the nitty-gritty of the production that goes into it. However, after you’ve read this article, you just might need to see it at least twice – once to just enjoy it and once again to study the “invisible art” that’s gone into bringing it to the screen.

(For more with Paul Machliss, check out these interviews at Studio Daily, ProVideoCoalition, and FrameIO.)

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters

Voyage of Time

Fans of director Terrence Malick adore his unique approach to filmmaking, which is often defined by timeless and painterly cinematic compositions. The good news for moviegoers is that Malick has been in the most prolific period of his directing career. Perhaps the pinnacle of cinema as poetry is Malick’s recent documentary, Voyage of Time. This is no less than a chronicle of the history of the universe as seen through Malick’s eyes. Even more intriguing is the fact that the film is being released in two versions – a 90-minute feature (Voyage of Time: Life’s Journey), narrated by Cate Blanchett, as well as a 45-minute IMAX version (Voyage of Time: The IMAX Experience), narrated by Brad Pitt.

This period of Malick’s increased output has not only been good for fans, but also for Keith Fraase, co-editor of Voyage of Time. Fraase joined Malick’s filmmaking team during the post of The Tree of Life. Although he was an experienced editor of commercials and shorts, working with Malick was his first time on a full-length feature. Keith Fraase and I recently discussed what it took to bring Voyage of Time to the screen.

Eight years in the making

“I began working with Terry back in 2008 on The Tree of Life,” Fraase says. “Originally, Voyage of Time had been conceived as a companion piece to The Tree of Life, to be released simultaneously. But plans changed and the release of Voyage was delayed. Some of the ideas and thematic elements that were discussed for Voyage ended up as the ‘creation sequence’ in Tree, but reworked to fit the tone and style of that film. Over the years, Voyage became something that Terry and I would edit in between post on his other narrative films. It was our passion project.”

Malick’s cutting rooms are equipped with Avid Media Composer systems connected to Avid shared storage. Typically his films are edited by multiple editors. (Voyage of Time was co-edited by Fraase and Rehman Nizar Ali.) Not only editors, but also researchers, needed access to the footage, so at times, there were as many as eight Media Composer systems used in post. Fraase explains, “There is almost always more than one editor on Terry’s films. At the start of post, we’d divvy up the film by section and work on it until achieving a rough assembly. Then, once the film was assembled in full, each editor would usually trade-off sections or scenes, in the hope to achieve some new perspective on the cut. It was always about focusing on experimentation or discovering different approaches to the edit. With Voyage, there was so much footage to work with, some of which Terry had filmed back in the 70s. This was a project he’d had in his mind for decades. In preparation, he traveled all over the world and had amassed years of research on natural phenomena and the locations where he could film them. During filming, the crew would go to locations with particular goals in mind, like capturing mud pots in Iceland or cuttlefish in Palau. But Terry was always on the lookout for the unexpected. Due to this, much of the footage that ended up in the final films was unplanned.”

Cutting Voyage of Time presented an interesting challenge in how to tackle narration. Fraase continues, “For Voyage, there were hours and hours of footage to cut with, but we also did a lot of experiments with sound. Originally, there was a 45-page script written for the IMAX version, which was expanded for the full feature. However, this script was more about feelings and tone than outlining specific beats or scenes. It was more poetry than prose, much of which was later repurposed and recorded as voiceover. Terry has a very specific way of working with voiceover. The actors record pages and pages of it. All beautifully written. But we never know what is going to work until it’s recorded, brought into the Avid, and put up against picture. Typically, we’ll edit together sequences of voiceover independent of any footage. Then we move these sequences up and down the timeline until we find a combination of image and voiceover that produces meaning greater than the sum of the parts. Terry’s most interested in the unexpected, the unplanned.”

The art of picture and sound composition

Naturally, when moviegoers think of a Terrence Malick film, imagery comes to mind. Multiple visual effects houses worked on Voyage of Time, under the supervision of Dan Glass (Jupiter Ascending, Cloud Atlas, The Master). Different artists worked on different sections of the film. Fraase explains, “Throughout post production, we sought guidance from scientific specialists whenever we could. They would help us distill certain thematic elements that we knew we wanted into specific, illustratable moments. We’d then bring these ideas to the different VFX shops to expand on them. They mocked up the various ‘previz’ shots that we’d test in our edit – many of which were abandoned along the way. We had to drop so many wonderful images and moments after they’d been painstakingly created, because it was impossible to know what would work best until placed in the edit.”

“For VFX, Terry wanted to rely on practical film elements as much as possible. Even the shots that were largely CGI had to have some foundation in the real. We had an ongoing series of what we called ‘skunkworks shoots’ during the weekends, where the crew would film experiments with elements like smoke, flares, dyes in water and so on. These were all layered into more complex visual effects shots.” Although principal photography was on film, the finished product went through a DI (digital intermediate) finishing process. IMAX visual effects elements were scanned at 11K resolution and the regular live action footage at 8K resolution.

The music score for Voyage of Time was also a subject of much experimentation. Fraase continues, “Terry has an extensive classical music library, which was all loaded into the Avid, so that we could test a variety of pieces against the edit. This started with some obvious choices like [Gustav] Holst’s ‘The Planets’ and [Joseph] Haydn’s ‘The Creation’ for a temp score. But we tried others, like a Keith Jarrett piano piece. Then one of our composers [Hanan Townshend, To The Wonder, Knight of Cups] experimented further by taking some of the classical pieces we’d been using and slowing them way, way down. The sound of stringed instruments being slowed results in an almost drone-like texture. For some of the original compositions, Terry was most interested in melodies and chords that never resolve completely. The idea being that, by never resolving, the music was mimicking creation – constantly struggling and striving for completion. Ultimately a collection of all these techniques was used in the final mix. The idea was that this eclectic approach would provide for a soundtrack that was always changing.”

Voyage of Time is a visual symphony, which is best enjoyed if you sit back and just take it in. Keith Fraase offers this, “Terry has a deep knowledge of art and science and he wanted everyone involved in the project to be fascinated and love it as much as he does. This is Terry’s ode to the earth.”

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters

NLE as Post Production Hub

As 2009 closed, I wrote a post about Final Cut Studio as the center of a boutique post production workflow. A lot has changed since then, but that approach is still valid and a number of companies can fill those shoes. In each case, rather than be the complete, self-contained tool, the editing application becomes the hub of the operation. Other applications surround it and the workflow tends to go from NLE to support tool and back for delivery. Here are a few solutions.

Adobe Premiere Pro CC

No current editing package comes as close to the role of the old Final Cut Studio as does Adobe’s Creative Cloud. You get nearly all of the creative tools under a single subscription and facilities with a team account can equip every room with the full complement of applications. When designed correctly, workflows in any room can shift from edit to effects to sound to color correction – according to the load. In a shared storage operation, projects can stay in a single bay for everything or shift from bay to bay based on operator speciality and talent.

While there are many tools in the Creative Cloud kit, the primary editor-specific applications are Premiere Pro CC, After Effects CC and Audition CC. It goes without saying that for most, Photoshop CC and Adobe Media Encoder are also givens. On the other hand, I don’t know too many folks using Prelude CC, so I can’t say what the future of this tool will be – especially since the next version of Premiere Pro includes built-in proxy transcoding. Also, as more of SpeedGrade CC’s color correction tools make it into Premiere Pro, it’s clear that SpeedGrade itself is getting very little love. The low-cost market for outboard color correction software has largely been lost to DaVinci Resolve (free). For now, SpeedGrade is really “dead man walking”. I’d be surprised if it’s still around by mid-2017. That might also be the case for Prelude.

Many editors I know that are heavy into graphics and visual effects do most of that work in After Effects. With CC and Dynamic Link, there’s a natural connection between the Premiere Pro timeline and After Effects. A similar tie can exist between Premiere Pro and Audition. I find the latter to be a superb audio post application and, from my experience, provides the best transfer of a Premiere Pro timeline into any audio application. This connection is being further enhanced by the updates coming from Adobe this year.

Rounding out the package is Photoshop CC, of course. While most editors are not big Photoshop artists, it’s worth noting that this application also enables animated motion graphics. For example, if you want to create an animated lower third banner, it can be done completely inside of Photoshop without ever needing to step into After Effects. Drop the file onto a Premiere Pro timeline and it’s complete with animation and proper transparency values. Update the text in Photoshop and hit “save” – voilà, the graphic is instantly updated within Premiere Pro.

Given the breadth and quality of tools in the Creative Cloud kit, it’s possible to stay entirely within these options for all of a facility’s post needs. Of course, roundtrips to Resolve, Baselight, ProTools, etc. are still possible, but not required. Nevertheless, in this scenario I typically see everything starting and ending in Premiere Pro (with exports via AME), making the Adobe solution my first vote for the modern hub concept.

Apple Final Cut Pro X

Apple walked away from the market for an all-inclusive studio package. Instead, it opted to offer more self-contained solutions that don’t have the same interoperability as before, nor that of the comparable Adobe solutions. To build up a similar toolkit, you would need Final Cut Pro X, Motion, Compressor and Logic Pro X. An individual editor/owner would purchase these once and install them on as many machines as he or she owned. A business would have to buy each application for each separate machine. So a boutique facility would need a full set for each room or they would have to build rooms by specialty – edit, audio, graphics, etc.

Even with this combination, there are missing links when going from one application to another. These gaps have to be plugged by the various third-party productivity solutions, such as Clip Exporter, XtoCC, 7toX, Xsend Motion, X2Pro, EDL-X and others. These provide better conduits between Apple applications than Apple itself provides. For example, only through Automatic Duck Xsend Motion can you get an FCPX project (timeline) into Motion. Marquis Broadcast’s X2Pro Audio Convert provides a better path into Logic than the native route.

If you want the sort of color correction power available in Premiere Pro’s Lumetri Color panel, you’ll need more advanced color correction plug-ins, like Hawaiki Color or Color Finale. Since Apple doesn’t produce an equivalent to Photoshop, look to Pixelmator or Affinity Photo for a viable substitute. Although powerful, you still won’t get quite the same level of interoperability as between Photoshop and Premiere Pro.

Naturally, if your desire is to use non-Apple solutions for graphics and color correction, then similar rules apply as with Premiere Pro. For instance, roundtripping to Resolve for color correction is pretty solid using the FCPXML import/export function within Resolve. Prefer to use After Effects for your motion graphics instead of Motion? Then Automatic Duck Ximport AE on the After Effects side has your back.

Most of the tools are there for those users wishing to stay in an Apple-centric world, provided you add a lot of glue to patch over the missing elements. Since many of the plug-ins for FCPX (Motion templates) are superior to a lot of what’s out there, I do think that an FCPX-centric shop will likely choose to start and end in X (possibly with a Compressor export). Even when Resolve is used for color correction, I suspect the final touches will happen inside of Final Cut. It’s more of the Lego approach to the toolkit than the Adobe solution, yet I still see it functioning in much the same way.

Blackmagic Design DaVinci Resolve

It’s hard to say what Blackmagic’s end goal is with Resolve. Clearly the world of color correction is changing. Every NLE developer is integrating quality color correction modules right inside of their editing application. So it seems only natural that Blackmagic is making Resolve into an all-in-one tool for no other reason than self-preservation. And by golly, they are doing a darn good job of it! Each version is better than the last. If you want a highly functional editor with world-class color correction tools for free, look no further than Resolve. Ingest, transcoded and/or native media editing, color correction, mastering and delivery – all there in Resolve.

There are two weak links – graphics and audio. On the latter front, the internal audio tools are good enough for many editors. However, Blackmagic realizes that specialty audio post is still the domain of the sound engineering world, which is made up predominantly of Avid Pro Tools shops. To make this easy, Resolve has built-in audio export functions to send the timeline to Pro Tools via AAF. There’s no roundtrip back, but you’d typically get composite mixed tracks back from the engineer to lay into the timeline.

To build on the momentum it started, Blackmagic Design acquired the assets of EyeOn’s Fusion software, which gives them a node-based compositor, suitable for visual effects and some motion graphics. This requires a different mindset than After Effects with Premiere Pro or Motion with Final Cut Pro X (when using Xsend Motion). You aren’t going to send a full sequence from Resolve to Fusion. Instead, the Connect plug-in links a single shot to Fusion, where it can be processed through a series of nodes. The Connect plug-in provides a similar “conduit” function to that of Adobe’s Dynamic Link between Premiere Pro and After Effects, except that the return is a rendered clip instead of a live project file. To take advantage of this interoperability between Resolve and Fusion, you need the paid versions.

Just as in Apple’s case, there really is no Blackmagic-owned substitute for Photoshop or an equivalent application. You’ll just have to buy what matches your need. While it’s quite possible to build a shop around Resolve and Fusion (plus maybe Pro Tools and Photoshop), it’s more likely that Resolve’s integrated approach will appeal mainly to those folks looking for free tools. I don’t see too many advanced pros doing their creative cutting on Resolve (at least not yet). That being said, it’s pretty close, so I don’t want to slight its capabilities.

Where I see it shine is as a finishing or “online” NLE. Let’s say you perform the creative or “offline” edit in Premiere Pro, FCPX or Media Composer. This could even be three editors working on separate segments of a larger show – each on a different NLE. Each editor’s sequence goes to Resolve, where the timelines are imported, combined and relinked to the high-res media. The audio has gone via a parallel path to a Pro Tools mixer and graphics come in as individual clips, shots or files. Then all is combined inside Resolve, color corrected and delivered straight from Resolve. For many shops, that scenario is starting to look like the best of all worlds.

I tend to see Resolve as less of a hub than either Premiere Pro or Final Cut Pro X. Instead, I think it may take several possible positions: a) color correction and transcoding at the front end, b) color correction in the middle – i.e. the standard roundtrip, and/or c) the new “online editor” for final assembly, color correction, mastering and delivery.

Avid Media Composer

This brings me to Avid Media Composer, the least integrated of the bunch. You can certainly build an operation based on Media Composer as the hub – as so many shops have. But there simply isn’t the silky smooth interoperability among tools like there is with Adobe or the dearly departed Final Cut Pro “classic”. However, that doesn’t mean it’s not possible. You can add advanced color correction through the Symphony option, plus Avid Pro Tools in your mixing rooms. In an Avid-centric facility, rooms will definitely be task-oriented, rather than provide the ease of switching functions in the same suite based on load, as you can with Creative Cloud.

The best path right now is Media Composer to Pro Tools. Unfortunately it ends there. Like Blackmagic, Avid only offers two hero applications in the post space – Media Composer/Symphony and Pro Tools. They have graphics products, but those are designed and configured for news on-air operations. This means that effects and graphics are typically handled through After Effects, Boris RED or Fusion.

Boris RED runs as an integrated tool, which augments the Media Composer timeline. However, RED uses its own user interface. That operation is relatively seamless, since any “roundtrip” happens invisibly within Media Composer. Fusion can be integrated using the Connect plug-in, just like between Fusion and Resolve. Automatic Duck’s AAF import functions have been integrated directly into After Effects by Adobe. It’s easy to send a Media Composer timeline into After Effects as a one-way trip. In fact, that’s where this all started in the first place. Finally, there’s also a direct connection with Baselight Editions for Avid, if you add that as a “plug-in” within Media Composer. As with Boris RED, clips open up in the Baselight interface, which has now been enhanced with a smoother shot-to-shot workflow inside of Media Composer.

While a lot of shops still use Media Composer as the hub, this seems like a very old-school approach. Many editors still love this NLE for its creative editing prowess, but in today’s mixed-format, mixed-codec, file-based post world, Avid has struggled to keep Media Composer competitive with the other options. There’s certainly no reason Media Composer can’t be the center – with audio in Pro Tools, color correction in Resolve, and effects in After Effects. However, most newer editors simply don’t view it the same way as they do with Adobe or even Apple. Generally, it seems the best Avid path is to “offline” edit in Media Composer and then move to other tools for everything else.

So that’s post in 2016. Four good options with pros and cons to each. Sorry to slight the Lightworks, Vegas Pro, Smoke/Flame and Edius crowds, but I just don’t encounter them too often in my neck of the woods. In any case, there are plenty of options, even starting at free, which makes the editing world pretty exciting right now.

©2016 Oliver Peters

Vinyl

The decade of the 1970s was the heyday of the rock music business when a hit record nearly made you a king. It was the time right after Woodstock. Top bands like Led Zeppelin and the Rolling Stones commanded huge stadium shows. The legendary excesses of the music industry are most often encapsulated as “sex, drugs, and rock-n-roll”. Now the New York music and record company scene of that era has been brought to the screen in the new HBO series, Vinyl. The series was created by Mick Jagger & Martin Scorsese & Rich Cohen and Terence Winter.

Vinyl is told largely through the eyes of Richie Finestra (played by Bobby Cannavale, Daddy’s Home, Ant-Man), the founder and president of the fictional American Century Records. He’s a rags-to-riches guy with a gift for discovering music acts. In the pilot episode, the company is about to be sold to Polygram, but a series of events changes the course of Finestra’s future, which sets up the basis for the series. It’s New York in the 70s at the birth of hip-hop, disco, and punk rock with a lot of cultural changes going on as the backdrop. The series features an eclectic cast, including James Jagger (Mick’s son) as the lead singer of a raw, New York punk band, The Nasty Bits.

The pilot teleplay was written by Terence Winter and George Mastras, based on a story developed by Cohen, Scorsese, Jagger, and Winter. This feature-length series kick-off was directed by Scorsese with Rodrigo Prieto (A Midsummer Night’s Dream, The Wolf of Wall Street) as director of photography. Martin Scorsese is certainly no stranger to the music industry with projects like Woodstock, The Last Waltz, The Blues, and Shine A Light to his credit. Coupled with his innate ability to tell entertaining stories about the underbelly of life in New York, Vinyl makes for an interesting stew. The pilot was a year in production and post and sets the tone for the rest of the series, which will be directed by seven other directors. This is the same model as with Boardwalk Empire. Scorsese and Jagger are part of the team of executive producers, with Winter as the show runner.

Producing a pilot like a feature film

I recently interviewed David Tedeschi (George Harrison: Living in the Material World, Public Speaking), editor for the pilot episode of Vinyl. Kate Sanford and Tim Streeto are the editors for the series. Tedeschi has edited both documentary and narrative films prior to working with Martin Scorsese, for whom he’s edited a number of documentaries, such as No Direction Home and Shine A Light. But the Vinyl pilot is his first narrative project with Scorsese. Tedeschi explains, “The concept started out as an idea for a feature film. It landed at HBO, who was willing to green-light it as a full series. We were able to treat the pilot like a feature and had the luxury of being able to spend nearly a year in post, with some breaks in between.”

Even though Scorsese’s Sikelia Productions approached it like a feature, the editorial staff was small, consisting mainly of Tedeschi and one associate editor, Alan Lowe. Tedeschi talks about the post workflow, “The film was shot digitally with a Sony F55, but Scorsese and Prieto wanted to evoke a 16mm film look to be in keeping with the era. Deluxe handled the dailies – adding a film look emulation that included grain. They provided us with Avid DNxHD 115 media. Since most scenes were shot with two or three cameras, Alan would sync the audio by slate and then create multicam clips for me, before I’d start to edit. We were working on two Avid Media Composers connected to Avid ISIS shared storage. For viewing, we installed 50” Panasonic plasma displays that were calibrated by Deluxe. The final conform and color correction was handled by Deluxe with Steve Bodner as colorist.”

He continues, “Scorsese had really choreographed the scenes precisely, with extensive notes. In the dailies process we would review every scene, and he would map out selects and then we’d work through it. In spite of being very specific about how he’d planned out a scene, he would often revisit a scene and look at other options in order to improve it. He was very open minded to new ways of looking at the material. Overall, it was a pretty tight script and edit. The first director’s cut was a little over two hours and the final came in at one hour and fifty-two minutes plus end credits.”

Story and structure

The pilot episode of Vinyl moves back and forth through a timeline of Finestra’s life and punctuates moments with interstitial elements, such as a guitar cameo by a fictionalized Bo Diddley. It’s easy to think these are constructs devised during editing, but Tedeschi says no. He explains, “I would love to take credit for that, but moving back and forth through eras was how the script was written. The interstitial elements weren’t in the script, but were Marty’s idea. He found extra time in the shooting schedule to film those and they worked beautifully in the edit.”

Many film editors have very specific ways they like to set up their bins in order to best sort and organize selected footage. Tedeschi’s approach is more streamlined. He explains, “My method is usually pretty simple. I don’t do special things in the bins. I will usually assemble a sequence of selected dailies for each scene. Then I’ll mark it up with markers and sometimes may color-code a few clips. On Vinyl, Alan would do the initial pass to composite some of the visual effects, like green screen window composites. He also handled a lot of the sound design for me.”

Vinyl is very detailed in how actual events, bands, people, and elements of the culture are represented and integrated into the story – although in a fictionalized way. It’s a historical snapshot of New York in the 70s and the culture of that time. Little elements like The King Biscuit Flower Hour (a popular radio show on progressive rock radio stations back then) playing on a radio or a movie marquee for Deep Throat easily pinpoint the time and place. Anyone who’s seen the Led Zeppelin concert documentary, The Song Remains the Same, will remember one of the Madison Square Garden backstage scenes with an angry and colorful Peter Grant (Led Zeppelin’s manager). His persona and a similar event also made it into the story, but modified to be integral to the plot.

Accuracy is very important to Scorsese. Tedeschi says, “We have done documentaries about music and some of these people are part of our lives. We would all hear stories about some pretty over-the-top things, so a lot of this comes directly from their memories. The biggest challenge was to be faithful to New York in 1973. It’s become this mythical place, but in Vinyl that’s the New York of Scorsese’s memory. We’ve certainly altered many actual facts, but even the most outrageous events that happen in the pilot and the series are rooted in true, historical events. We even reviewed historical footage. There was a very methodical approach.” Aside from the entertaining elements, it’s also a pretty solid story about how record companies actually operate. He adds, “We had a screening towards the end of the editing process for the consultants, who had all worked in the record business. I knew we had done well, because they immediately launched into a lively discussion about contracts and industry standards and what names had been changed.”

This is a story about music and the music itself is a driving influence. Tedeschi concludes, “There is almost constant source music in the background. Scorsese went through each scene and we painstakingly auditioned many songs. One thing folks might not realize is that we sourced all of the recorded music that was used in their original formats. If a hit song was originally released as a 45 RPM record or an LP, then we’d track down a copy and try to use that. A few songs even came from 78 RPM records. We found a place that could handle high-quality transfers from such media and provide us with a digital file, which we used in the final mix. Often, a song may have been remastered, but we would compare our transfer with the remaster. The objective was to be faithful to the original sound – the way people heard it when it was released. After all, the series is called Vinyl for a reason. This was the director’s vision and how he remembered it.”

Originally written for Digital Video magazine / Creative Planet Network

©2016 Oliver Peters

Carol

Films tend to push social boundaries and one such film this season is Carol, starring Cate Blanchett, Rooney Mara and Kyle Chandler. It’s a love story between two women, but more importantly it’s a love story between two people. The story is based on the novel The Price of Salt by Patricia Highsmith, who also penned The Talented Mr. Ripley and Strangers on a Train. Todd Haynes (Six by Sondheim, Mildred Pierce) directed the film adaptation. Carol was originally produced in 2014 and finished in early 2015, but The Weinstein Company opted to time the release around the start of the 2015 awards season.

Affonso Gonçalves (Beasts of the Southern Wild, Winter’s Bone), the editor on Carol, explains, “Carol is a love story about two women coming to terms with the dissatisfaction of their lives. The Carol character (Cate Blanchett) is unhappily married, but loves her child. Carol has had other lesbian affairs before, but is intrigued by this new person, Therese (Rooney Mara), whom she encounters in a department store. Therese doesn’t know what she wants, but through the course of the film, learns who she is.”

Gonçalves and Haynes worked together on the HBO mini-series Mildred Pierce. Gonçalves says, “We got along well and when he got involved with the production, he passed along the script to me and I loved it.” Carol was shot entirely on Super 16mm film negative, primarily as a single-camera production. Only about five percent of the production included A and B cameras. Ed Lachman (Dark Blood, Stryker, Selena) served as the cinematographer. The film negative was scanned in log color space and then a simple log-to-linear LUT (color look-up table) was applied to the Avid DNxHD36 editorial media to create nice-looking working files.
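For readers unfamiliar with the term, applying a 1D LUT simply means mapping each pixel’s code value through a lookup table that converts the flat, log-encoded scan into normal-looking, display-friendly values. The sketch below is a generic illustration with a made-up curve – not the actual LUT Goldcrest supplied – just to show the mechanics:

```python
import numpy as np

# A toy 1D log-to-linear curve sampled at 1024 points. Real LUTs come from
# the lab or camera vendor; this one is invented purely for illustration.
lut_in = np.linspace(0.0, 1.0, 1024)                    # normalized log code values
lut_out = 0.18 * np.power(10.0, 5.0 * (lut_in - 0.6))   # toy scene-referred output

def apply_1d_lut(image):
    """Map every normalized pixel value through the LUT with linear interpolation."""
    return np.interp(image, lut_in, lut_out)

# Example: a small random "frame" with values between 0 and 1
frame = np.random.rand(4, 4, 3)
converted_frame = apply_1d_lut(frame)
```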

Creating a timeless New York story

Cincinnati served as the principal location designed to double for New York City in the early 1950s. The surrounding area also doubled for Iowa and Pennsylvania during a traveling portion of the film. Gonçalves discussed how Haynes and he worked during this period. “The production shot in Cincinnati, but I was based at Goldcrest Films in New York. The negative was shipped to New York each day, where it was processed and scanned. Then I would get Avid editorial files. The cutting room was set up with Avid Media Composer and ISIS systems and my first assistant Perri [Pivovar] had the added responsibility on this project of checking for film defects. Ed would also review footage each day; however, Todd doesn’t like to watch dailies during a production. He would rely on me instead to be his eyes and ears to make sure that the coverage that he needed was there.”

He continues, “After the production wrapped, I completed my editor’s cut while Todd took a break. He then spent two weeks reviewing all the dailies and making his own detailed notes. When he was ready, he joined me in the cutting room and we built the film according to his cut. Once we had these two versions – his and mine – we compared them. They were actually very similar, because we have similar tastes. I had started in May and by September the cut was largely locked. Most of the experimenting came with structure and music.”

The main editorial challenges were getting the right structure for the story and the right tone for the performances. According to Gonçalves, “Cate’s and Rooney’s performances are very detailed and I felt the need to slow the cutting pace down to let you appreciate them. Rooney’s is so delicate. Plus, it’s a love story and we needed to keep the audience engaged. We weren’t as concerned with trimming as with getting the story right. The first cut was two-and-a-half hours and the finished length ended up at 118 minutes. Some scenes involving additional characters were cut out. Todd isn’t too precious about losing scenes, and this allowed us to keep the story focused on our central characters.”

“The main challenge was the party scene at the end. The story structure is similar to Brief Encounter (the 1945 David Lean classic with the beginning and ending set in the same location). Initially we had two levels of flashbacks, but there was too much of a shift back and forth. We had a number of ‘friends and family’ screenings and it was during these that we discovered the issues with the flashbacks. Ultimately we decided to rework the ending and simplify the temporal order of the last scene. The film was largely locked by the sixth or seventh cut.”

As a period piece, music is integral to Carol. Gonçalves explains, “We started with about 300 to 400 songs that Todd liked, plus old soundtracks. These included a lot of singers of the time, like Billie Holiday. I also added ambiences for restaurants and bars. Carter [Burwell, the film’s composer] saw our cut at around the second or third screening with our temp score. After that he started sending preliminary themes for us to work into the cut. These really elevated the tone of the film. He’d come in every couple of weeks to see how his score was working with the cut, so it became a very collaborative process.”

The editing application an editor uses is an extension of how he works. Some editors have very elaborate routines for preparing bins and sequences, while others take a simpler approach. Gonçalves fits into the latter group. He says, “Avid is like sitting down and driving a car for me. It’s all so smooth and so fast. It’s easy to find things and I like the color correction and audio tools. I started doing more sound work in the Avid on True Detective and its tools really help me to dress things up. I don’t use any special organizing routines in the bins. I simply highlight the director’s preferred takes; however, I do use locators and take a lot of handwritten notes.”

Film sensibilities in the modern digital era

Carol was literally the last film to be processed at Deluxe New York before the lab was shut down. In addition to a digital release, Technicolor also did a laser “film-out” to 35mm for a few release prints. All digital post-production was handled by Goldcrest Films, which scanned the Super 16mm negative on an ARRI laser scanner at 3K resolution for a 2K digital master. Goldcrest’s Boon Shin Ng handled the scanning and conforming of the film. Creating the evocative look of Carol fell to New York colorist John J. Dowdell III. Trained in photography before becoming a colorist in 1980, Dowdell has credits on over 200 theatrical and television films.

Unlike on many films, Dowdell was involved early in the overall process. He explains, “Early on, I had a long meeting with Todd and Ed about the look of the film. Todd had put together a book of photographs and tear sheets that helped with the colors and fashions from the 1950s. Throughout the grade, we’d often refer back to that book to establish the color palette for the film.” Carol has approximately 100 visual effects shots to help make Cincinnati look like New York, circa 1952-53. Dowdell continues, “Boon coordinated effects with Chris Haney, the visual effects producer. The ARRI scanner is pin-registered, which is essential for the work of the visual effects artists. We’d send them both log and color corrected files. They’d use the color corrected files to create a preview LUT as a reference for their own use, but then send us back finished effects in log color space. These were integrated back into the film.”

Dowdell’s tool of choice is the Quantel Pablo Rio system, which incorporates color grading tools that match his photographic sensibilities. He says, “I tend not to rely as much on the standard lift/gamma/gain color wheels. That’s a video approach. Quantel includes a film curve, which I use a lot. It’s like an s-curve tool, but with a pivot point. I also use master density and RGB printer light controls. These are numeric and let you control the color very precisely, but also repeatably. That was important as I went through options with Todd and Ed, because you could always get back to an earlier setting. That’s much harder to do precisely with color wheels and trackball controls.”
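To illustrate why numeric controls like printer lights repeat so exactly, here is a rough Python sketch. It is not Quantel’s implementation; the 0.025 log-exposure-per-point figure is a commonly quoted film-lab convention that varies by system, and every other number and function name is invented for the example.

```python
# Illustration only: numeric printer-light offsets plus a pivoted s-curve
# applied to log-encoded RGB values. All constants are assumptions.
import numpy as np

POINT = 0.025  # assumed log-exposure change per printer-light point

def printer_offset(log_rgb, points_rgb):
    """Shift log-encoded RGB by a whole number of printer points per channel."""
    return np.asarray(log_rgb, dtype=np.float64) + \
        np.asarray(points_rgb, dtype=np.float64) * POINT

def film_curve(x, pivot=0.435, contrast=1.5):
    """S-shaped contrast curve around a chosen pivot: slope of `contrast`
    at the pivot, flattening toward the extremes, clipped to [0, 1]."""
    x = np.asarray(x, dtype=np.float64)
    return np.clip(pivot + np.tanh(contrast * (x - pivot)), 0.0, 1.0)

# Because the grade is just these numbers, a setting like
# "+2R +1G 0B, contrast 1.5 at pivot 0.435" can be written down
# and recalled exactly in a later session.
log_pixel = np.array([0.45, 0.42, 0.40])
print(film_curve(printer_offset(log_pixel, [2, 1, 0])))
```

The point of the sketch is the repeatability Dowdell describes: a handful of integers and curve parameters fully define the correction, whereas a trackball move is much harder to restore to exactly the same position.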

The Quantel Pablo Rio is a complete editing and effects system as well, integrating the full power of Quantel’s legendary Paintbox. This permitted John Dowdell and Boon Shin Ng to handle some effects work within the grading suite. Dowdell continues, “With the paint and tracking functions, I could do a lot of retouching. For example, some modern elements, like newer style parking meters, were tracked, darkened and blurred, so that they didn’t draw attention. We removed some modern signs and also did digital clean-up, like painting out negative dirt that made it through the scan. Quantel does beautiful blow-ups, which was perfect for the minor reframing that we did on this film.”

The color grading toolset is often a Swiss Army Knife for the filmmaker, but in the end, it’s about the color. Dowdell concludes, “Todd and Ed worked a lot to evoke moods. In the opening department store scene, there’s a definite green cast that was added to let the audience feel that this is an unhappy time. As the story progresses, colors become more intense and alive toward the end of the film. We worked very intuitively to achieve the result and care was applied to each and every shot. We are all very proud of it. Of all the films I’ve color corrected, I feel that this is really my masterpiece.”

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2016 Oliver Peters