Generalists versus Specialists

“Jack of all trades, master of none” is a saying most are familiar with. But the complete version – “Jack of all trades, master of none, but oftentimes better than master of one” – carries quite the opposite meaning. In the world of post production you have Jacks and Jills of all trades (generalists) and masters of one (specialists). While editors are certainly specialized in storytelling, I would consider them generalists when comparing their skill set to those of other specialists, such as visual effects artists, colorists, and audio engineers. Editors often touch on sound, effects, and color in a more general (often temp) way to get client approval. The others have to deliver the best, final results within a single discipline. Editors have to know the tools of editing, but not the nitty gritty of color correction or visual effects.

This is closely tied to the Pareto Principle, which most know as the 80/20 Rule. This principle states that 80% of the consequences come from 20% of the causes, and it’s been applied in various ways. In software development, the 80/20 Rule predicts that 80% of the users are going to use 20% of the features, while only 20% of users will need the remaining 80%. The software developer has to decide whether the target customer is the generalist (the 80% user) or the specialist (the 20% user). If the generalist is the target, then the challenge is to add enough specialized features to serve the advanced user without creating a bloated application that no one will use.

Applying these concepts to editing software development

When looking at NLEs, the first question to ask is, “Who is defined as a video editor today?” I would separate editors into three groups. One group would be the “I have to do it all” group, which generates most of what we see on local TV, corporate videos, YouTube, etc. These are multi-discipline generalists who have neither the time nor the interest to deal with highly specialized software. In the case of true one-man bands, the skill set also includes videography, plus location lighting and sound.

The “top end” – national and international commercials, TV series, and feature films – could be split into two groups: craft (aka film or offline) editors and finishing (aka online) editors. Craft editors are specialists in molding the story, but generalists when it comes to the software. Their technical skills don’t have to be the best, but they need a solid understanding of visual effects, sound, and color, so that they can create a presentable rough cut with temp elements. The finishing editor’s role is to take the final elements from the sound, color, and visual effects houses, and assemble the final deliverables. Their key talents are quality control and attention to detail; they have no need to understand dedicated color, sound, or effects applications, unless they are also filling one of those roles.

My motivation for writing this post stemmed from an open letter to Tim Cook, which many editors have signed – myself included. Editors have long been fans of Apple products and many gravitated from Avid Media Composer to Apple Final Cut Pro 1-7. However, when Apple reimagined Final Cut and dropped Final Cut Studio in order to launch Final Cut Pro X, many FCP fans were in shock. FCPX lacked a number of important features at first. A lot of these elements have since been added back, but that development pace hasn’t been fast enough for some, hence the letter. My wishlist for new features is quite small. I recognize Final Cut for what it is in the Apple ecosystem. But I would like to see Apple work to raise the visibility of Final Cut Pro within the broader editing community. That’s especially important when the decision of which editing application to use is often not made by editors.

Blackmagic Design DaVinci Resolve – the über-app for specialists

This brings me to Resolve. Editors point to Blackmagic’s aggressive development pace and the rich feature set. Resolve is often viewed as the greener pasture over the hill. I’m going to take a contrarian’s point of view. I’ve been using Resolve since it was introduced as Mac software and recently graded a feature film that was cut on Resolve by another editor.

Unfortunately, the experience was more problematic than I’ve had with grades roundtripped to Resolve from other NLEs. Its performance as an editor was quite slow when trying to move around in the timeline, replace shots, or trim clips. Resolve wouldn’t be my first NLE choice when compared to Premiere Pro, Media Composer, or Final Cut Pro. It’s a complex program by necessity. The color management alone is enough to trip up even experienced editors who aren’t intimately familiar with what the various settings do with the image.

DaVinci Resolve is an all-in-one application that integrates editing (two different editing models), color correction (aka grading), Fusion visual effects, and the Fairlight DAW. Historically, all-in-ones have not had a great track record in the market. Other such über-apps would include Avid|DS and Autodesk Smoke. Avid pulled the plug on DS, and Autodesk moved the Flame/Smoke/Lustre product family to a subscription model. Neither DS nor Smoke as a standalone application moved the needle on market share.

At its core, Resolve is a grading application with Fusion and Fairlight added in later. Color, effects, and audio mixing are all specialized skills, and the software is designed so that each specialist is comfortable with the toolset presented on those pages/modes. I believe Blackmagic has been attempting to capitalize on Final Cut editor discontent and create the mythical “FCP8” or “FC Extreme” that many wanted. However, adding completely new and disparate functions to an application that at its core is designed around color correction can make it quite unwieldy. Beginning editors are never going to touch most of what Resolve has to offer, and the specialists would rather have a dedicated specialized tool, like Nuke, After Effects, or Pro Tools.

Apple Final Cut Pro – reimagining modern workflows for generalists

Apple makes software for generalists. Pages, Numbers, Keynote, Photos, GarageBand, and iMovie are designed for that 80%. Apple also creates advanced software for the more demanding user under the ProApps banner (professional applications). This is still “generalist” software, but designed for more complex workflows. That’s where Final Cut Pro, Motion, Compressor, and Logic Pro fit.

Apple famously likes to “skate to where the puck will be,” and having control over hardware, operating system, and software gives its teams special insight to develop software that is optimized for the hardware/OS combo. As a broad-based consumer goods company, Apple also understands market trends. In the case of iPhones and digital photography, it also plays a huge role in driving those trends.

When Apple launched Final Cut Pro X the goal was an application designed for simplified, modernized workflows – even if “Hollywood” wasn’t quite ready. This meant walking away from the comprehensive “suite of tools” concept (Final Cut Studio). They chose to focus on a few applications that were better equipped for where the wider market of content creators was headed – yet, one that could still address more sophisticated needs, albeit in a different way.

This reimagining of Final Cut Pro had several aspects to it. One was to design an application that could easily be used on laptops and desktop systems and was adaptable to single and dual screen set-ups. It also introduced workflows based on metadata to improve edit efficiency. It was intended as a platform with third parties filling in the gaps. This means you need to augment FCP to cover a few common industry workflows. In short, FCP is designed to appeal to a broad spectrum of today’s “professionals” and not how one might have defined that term in the early 1990s, when nonlinear editing first took hold.

For a developer, it comes down to who the product is marketed towards and which new features to prioritize. Generalists are going to grow the market faster, hence a better return on development resources. The more complex an application becomes, the more likely it is to have bugs or break when the hardware or OS is updated. Quality assurance (QA) testing expands exponentially with complexity.

Final thoughts

Do my criticisms of Resolve mean that it’s a bad application? No, definitely not! It’s powerful in the right hands, especially if you work within its left-to-right workflow (edit -> Fusion -> color -> Fairlight). But, I don’t think it’s the ideal NLE for craft editing. The tools are designed for a collection of specialists. Blackmagic has been on this path for a rather long time now and seems to be at a fork in the road. Maybe they should step back, start from a clean slate, and develop a fresh, streamlined version of Resolve. Or, split it up into a set of individual, focused applications.

So, is Final Cut Pro the ideal editing platform? It’s definitely a great NLE for the true generalist. I’m a fan and use it when it’s the appropriate tool for the job. I like that it’s a fluid NLE with a responsive UI design. Nevertheless, it isn’t the best fit for many circumstances. I work in a market and with clients that are invested in Adobe Creative Cloud workflows. I have to exchange project files and make sure plug-ins are all compatible. I collaborate with other editors and more than one of us often touches these projects.

Premiere Pro is the dominant NLE for me in this environment. It also clicks with how my mind works and feels natural to me. Although you hear complaints from some, Premiere has been quite stable for me in all my years of use. Premiere Pro hits the sweet spot for advanced editors working on complex productions without becoming overly complex. Product updates over the past year have provided new features that I use every day. However, if I were in New York or Los Angeles, that answer would likely be Avid Media Composer, given Avid’s continued dominance in broadcast operations and feature film post.

In the end, there is no right or wrong answer. If you have the freedom to choose, then assess your skills. Where do you fall on the generalist/specialist spectrum? Pick the application that best meets your needs and fits your mindset.

For another direct comparison check out this previous post.

©2022 Oliver Peters

Analogue Wayback, Ep. 8

Nonlinear editing in the early days.

At the dawn of the “Hollywood East” days in central Florida, our post house, Century III, took up residency at Universal Studios Florida. As we ramped up the ability to support episodic television series production, a key member joined our team. John Elias was an A-list Hollywood TV series editor who’d moved to Florida in semi-retirement. Instead of retiring, he joined the company as our senior film editor.

Based on John’s experience in LA, our NLE of choice at that time was the Cinedco Ediflex. Like many of the early NLEs, the Ediflex was a Rube Goldberg contraption. The edit computer controlled 12 VHS decks designed to mimic random-access playback. The edit interface was controlled using a light pen. The on-screen display emulated numbered dialogue lines and vertical take lines somewhat like a script supervisor’s notation – only without any text for the dialogue.

Shooting ratios were reasonable in those days; therefore, a day of filming was generally no more than an hour of footage. The negative would come back from the lab. The colorist would transfer and sync dailies in the telecine room, recording color-corrected footage to the camera master reels (1″ Type C). Dailies would then be copied to 3/4″ Umatic to be loaded into the Ediflex by the assistant editor. This included adding all the script information in a process called Script Mimic. The 3/4″ footage was copied to 12 duplicate VHS videocassettes (with timecode) for the Ediflex to use as its media.

The decks were industrial-grade JVC players. Shuttling and cueing performance was relatively fast with 60-minute cassettes. To edit and/or play a scene, the Ediflex would access the VHS deck closest to the desired timecode, cueing and switching between different decks to maintain real-time playback of a sequence. On most scripted shows, the editor could get through a scene or often even a complete act without the need to wait on a machine to cue or change videocassette loads.
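The deck-juggling logic can be pictured with a toy sketch – my own illustration in Python, not Cinedco’s actual code. Among the decks not currently playing, the system cues the one whose tape is parked nearest the next clip’s position, so playback can hand off without waiting on a long shuttle:

```python
# Toy sketch of the Ediflex-style deck selection described above:
# twelve identical VHS copies of the dailies, and the system cues
# whichever idle deck is already parked closest to the next clip,
# while playback continues on the current deck.
# (Illustrative only; the real Ediflex logic is not documented here.)

def pick_deck(deck_positions, busy, target_tc):
    """Return the index of the idle deck whose head is nearest target_tc (in seconds)."""
    candidates = [i for i in range(len(deck_positions)) if i not in busy]
    return min(candidates, key=lambda i: abs(deck_positions[i] - target_tc))

# Twelve decks parked at various points in a 60-minute (3600 s) load:
positions = [0, 300, 600, 900, 1200, 1500, 1800, 2100, 2400, 2700, 3000, 3300]
deck = pick_deck(positions, busy={4}, target_tc=1250)  # deck 4 is playing
# Deck 4 (parked at 1200 s) would be nearest, but it's busy,
# so deck 5 (parked at 1500 s) is cued instead.
```

With twelve copies of the same hour of footage, some idle deck is usually parked close enough that cueing finishes before the current shot ends – which is why an editor could often get through a whole act without waiting.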

Rough cuts were recorded back to 3/4″ for review by the director and producers. When the edit was locked, an EDL was transferred via floppy disk to the online edit bay where the film transfer masters were conformed at full quality. Since the Ediflex systems used an internal timebase corrector, image quality was reasonable and you could easily check for issues like proper lip-sync. So, while the system was a bit clunky in operation, it was head and shoulders better for film and TV offline editing than the digital upstarts – mainly thanks to relatively better image quality.

We leased four full Ediflex systems plus assistant stations by 1990. John and our team of offline editors cut numerous series, films, and even some custom, themed attraction programs. It was in this climate that Avid Technology came onto the scene and had to prove itself. We had seen one of the earliest Avid demos at NAB when they were still back in the “pipe-and-drape” section of the show floor. Most editors who were there will likely remember the Top Gun demo project and just how awful the image quality really was. There was no common computer media architecture, so Avid had to invent their own. As I remember, these were 4-bit images – very low-res and very posterized. But, most of the core editing functions were in place. 

Avid was making headway among commercial editors. The company was also eager to gain traction in the long-form market. As the resident post house on a major studio lot in a promising new market, making a successful sale to us was definitely of interest to them. To see if this was the right fit, Avid brought in a system for John to test. Avid legend Tom Ohanian also came down to train and work with John and run through a test project. They took an episode of one of the shows to see how the process compared with our experiences using Ediflex.

Unfortunately, this first test wasn’t great. Well into the week, the Mac’s internal drive crashed, thus corrupting the project file and bins. Your media becomes useless when the project is gone. This meant starting over. Needless to say, John was not a happy camper. When the cut was done, we decided to have the series producer review the rough cut to see if the quality was acceptable. This was the Swamp Thing series – a Universal property that you can still find in distribution. The show is dark and scenes generally happen at night. As the producer reviewed the edit, it was immediately clear that the poor image quality was a no-go at that time. It was impossible to see sync or proper eyeline on anything other than close-up shots. That temporarily tanked our use of Avid for series work.

Fast forward a couple of years. Avid’s image quality had improved and there was at least one in town. As a test run, we booked time on that unit and I cut a statewide citrus campaign. This was a more successful experience, so we soon added an Avid to our facility. This eventually grew to include several Media Composers, shared storage, and even a hero room rebuilt around Avid Symphony (a separate unit from Media Composer back then). Those systems were ultimately used to cut many shows, commercials, feature films, and IllumiNations: Reflections of Earth – a show that enjoyed a 20-year run at EPCOT.

©2022 Oliver Peters

Last Night in Soho

Edgar Wright has written and directed a string of successful comedies and cult classics. His latest, Last Night in Soho, is a change from this pattern. It’s a suspense thriller that would make Hitchcock proud. Eloise (Thomasin McKenzie) is a young English country girl who’s moved to the Soho area of London to study fashion design. Her nightly dreams transport her into the past of the swinging 60s and the Soho nightlife, observing and becoming strangely intertwined with the life of Sandie (Anya Taylor-Joy), an aspiring singer. Those dreams quickly evolve into nightmares as the story turns more sinister.

Last Night in Soho was edited by Paul Machliss, ACE, a frequent collaborator with Wright. Machliss edited Scott Pilgrim vs. The World, The World’s End, and Baby Driver. The latter picked up a BAFTA win and an Oscar nomination for best editing. I recently had a chance to interview Machliss during a break from cutting his next feature, The Flash.

______________________________

Music was a key plot device and impetus for much of the editing in Baby Driver. In Last Night in Soho, music of the 1960s also plays a key role, but in a less blatant manner.

Baby Driver was about capital E editing. Edgar and I took a lot of what we learned and applied it to Soho. Can we do it in a more subtle way? It’s not immediately obvious, but it’s probably as intense. It just doesn’t show itself to the same level.

Many shots that look like sophisticated visual effects were done practically, such as the dance sequence involving both lead actresses and Matt Smith. What were some of the other practical effects?

One clever shot was the bit in the phone box, where right at the end of the dance, they go in for a snog. Initially the reflection is that of Matt [Jack] and Anya [Sandie]. Then as Matt pulls back, you realize that it becomes Thomasin [Eloise] in the mirror. It really was a mirror as they start kissing. In the middle of the kiss, an effects team yanks the mirror back to reveal Thomasin and a stand-in behind it. The visual effect is to get rid of the moment when the mirror is yanked.

I’d love to say it was full of subtle things that couldn’t have been done without editing. However, it almost goes back to the days of Vaudeville or the earliest days of cinema. Less is more – those simple things are still incredibly effective.

As with Baby Driver, I presume you were editing primarily on the set initially?

I was on set every day. We did about three weeks straight of night shoots at the height of summer in Soho. I wouldn’t say we owned Soho, but we could just literally run anywhere. We could park in Soho Square and wheel the trolleys wherever we needed for the purposes of filming. However, Soho didn’t close down – life carried on in Soho. It was fascinating to see the lifestyle of Soho change when you are filming from 6:00 PM till 6:00 AM. It’s in the smaller hours that elements of the ‘darker’ side of Soho start to appear and haunt the place. So that slightly sinister level of it hasn’t gone away.

Being on set had a lot to do with things like music playback, motion control cameras, and certainly lighting, which probably more than Baby Driver played a huge part in this film. I found myself somewhat responsible for the timing to make sure that Edgar’s idea of how he wanted the lighting to work in sync with the music was 100% successful on the night, because there was no fixing it afterwards. 

Some editors avoid being on set to maintain objectivity. Your working style seems to be different.

My MO really is not to do the perfect assemble edit in the midst of all the madness of filming. What I’m trying to do for Edgar is a proof of concept. We know as we’re shooting that issues can arise from continuity, camera angles, and various other things. So part of what I’m doing on set is keeping an eye on all the disparate elements from a technical perspective to make sure they’re all in harmony to support Edgar’s vision. But as the editor, I still need to make time for an assembly. Sometimes that meant getting there an hour or two earlier on set. Then I just sit at the Avid with headphones and quickly catch up on the previous day’s work, where I do try and make more of a concerted effort to cut a good scene. Then that gets passed on to the editorial department. By the time Edgar and I get back into the cutting room, we have a fully assembled film to start working on.

Lighting design – especially the neon sign outside of Ellie’s window – drives the scene transitions.

Edgar’s original brief was, “There is a point where the lighting goes from blue, white, red, blue, white, red, and then whenever she transitions into the past, it goes into a constant flashing red.” That’s something that any good lighting operator can cue quite simply. What made it more interesting is that Edgar said, “What I’d love is for the lighting to flash subtly, but in time to every different bit of music that Eloise puts on her little record player.” Then it was like, “Oh, right, how do we do that?”

Bradley Farmer on our music team was able to break the songs down into a kind of beat sheet with all the lyrics and the chorus. Edgar would go, “According to the storyboards this is the line in the song that I’d like it to start going red, red, red.” Armed with that knowledge, I had one track of audio with the music in mono and another track with longitudinal timecode – a different hour code for every song. I would edit full screen color images of red, white, and blue as a reference on the Avid to match what the timing of the color changes should be against the track.

Next, I was able to export an XML file out of the Avid, which could be read by the lighting panel computer. The lighting operator would load these sequences in so the panel would know when to have the lights in their ‘on’ or ‘off’ state. He also had a QuickTime reference from the Avid so he could see the color changes against the burnt-in timecode and know, “Next one’s red, program the SkyPanel to go red.”

Our music playback guy, Pete Blaxill, had copies of the tracks in stereo and was able to use my timecode as his base timecode. He then sent that timecode to the lighting panel. So if Edgar goes, “I now want to pick up the song from the first chorus,” then the lighting panel would chase the playback timecode. Once the sequence was set at the lighting panel, wherever Edgar wanted to go, the lighting desk knew which part of the song we were at and what the next color in the sequence was.

To make things a tad more complex, Edgar wanted to shoot some of the action to the playback at 32fps so there could be a dreamlike quality to movement at certain points in the song. This meant creating a lighting map that would work at 32fps, as well as the regular 24fps version. Bradley Farmer gave me a version of the songs that was exactly 33.3% faster and pitch-corrected. I reformatted my Avid sequence so everything just went on and off a third faster for the 32fps tracks. And once again, I gave the XML file to the lighting guy and the sped-up tracks to Pete for his Pro Tools system.
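As an aside, the 33.3% figure falls straight out of the frame rates: 32/24 = 4/3, so on-set playback runs a third faster, and every cue fires at three-quarters of its 24fps time. A quick sketch of the rescaling (the cue values are hypothetical, not production data):

```python
# Retiming a lighting cue list for overcranked (32 fps) shooting.
# Footage shot at 32 fps and played back at 24 fps runs in slight
# slow motion, so on-set playback (music and light cues) must run
# 32/24 = 1.333... times faster for the result to sync on screen.
# Cue times below are made up for illustration.

SHOOT_FPS = 32
PLAYBACK_FPS = 24
SPEEDUP = SHOOT_FPS / PLAYBACK_FPS  # ~1.333, i.e. "33.3% faster"

def retime_cues(cues):
    """Scale (seconds, color) cues so they fire at the sped-up on-set tempo."""
    return [(t / SPEEDUP, color) for t, color in cues]

cues_24 = [(0.0, "red"), (2.0, "white"), (4.0, "blue")]
cues_32 = retime_cues(cues_24)
# Each cue now lands at 3/4 of its original time (e.g. 2.0 s -> 1.5 s).
```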

I realized that the motion control camera could also be triggered by external timecode from Pete’s Pro Tools system. We utilized that at the climax of ‘You’re My World’ where Anya descends the staircase in the Cafe de Paris ballroom. This was a two-layer composite shot filmed with a motion control camera in multiple passes. Thomasin does one pass descending the staircase and then we did Anya’s pass. We also had singer Beth Singh [Cilla Black] in the foreground of the shot with her backing band behind her. Pete Blaxill would hit play on his Pro Tools. The music would be coming through the foldback for Beth to mime to, the lighting switched in at the right musical point, and then at exactly the right moment in the chorus the Pro Tools timecode would trigger the MoCo. I remember sitting cross-legged on the floor out of the way in the corner and watching all this happen. It’s incredibly ‘nerdy,’ but it gave one a wonderful feeling of satisfaction to be part of the team that could make moments like this happen seamlessly.

What formats were used to capture Last Night in Soho?

This was all shot 35mm, except that we used an ARRI Alexa for the night shoots. We tested both 16mm and 35mm for these exteriors. The 16mm looked good, but was too grainy and the highlights flared out tremendously. The 35mm was okay, but required a lot more lighting to achieve the desired look. Of course with the Alexa being digital, you put it there without any extra lighting at all, look at the result, and go, “Gosh, look at the amount of detail – that’s almost usable as it is.” Just because of the variety of shots we needed, Edgar and DP Chung-hoon Chung decided to use the Alexa for exterior night locations and any interior location where we would be looking out into the street at some point – for example, the library that Eloise visits later in the film.

Does the choice of film rather than digital acquisition add some technical challenges?

Fortunately, Kodak is still a company that is churning out 35mm stock full-time. The infrastructure for it though is getting less and less. Even in the time between doing The World’s End and Last Night in Soho, there is now only one facility in the UK that can process 35mm. And it has to process the rushes for every production that’s shooting on 35mm in the UK.

Even so, it’s great to think that a medium like 35mm can still support something as complicated as one of Edgar’s films without having to do everything in the digital domain. You’ve got the [Panavision] Millennium XL cameras – even though they’re nearing 20 years old, they’re pretty solid. 

Edgar is totally aware that he’s not shooting on an ‘archaic’ format for the sake of it, but that it’s his preferred medium of acquisition – in the same way some painters choose water-based paints as opposed to oil-based. He knows what medium he’s on and he respects that. You might say, “Yes, but the punter won’t necessarily know or appreciate what format this was shot on.” However, it helps to contribute to the feeling of our production. It’s part of the look, part of the patina – that slightly organic feel of the media brings so much to the look of Soho.

Because you are cutting on set, you’re working from a video tap off of the film camera.

Right. Once again, I had a little network set up with a large spool of Cat 5 cable connected to the Qtake system. And I would put myself out of the way in a corner of the soundstage and get on with it. I would just be a crew member quietly doing my job editing, not bothering Edgar. Sometimes he might not see me for a couple of hours into the day until he needed my input on a shot.

So that means at some point the low-resolution video tap clips have to be replaced by the actual footage after the film has been transferred.

That’s right. I used my editorial team from Baby Driver – Jerry Ramsbottom and Jessica Medlycott. Both of them were well-versed in the technique of over-cutting the low-res video tap footage once the hi-res Avid media came back from the lab. I never had to worry about that. The cutting room was also in Soho and when we were shooting there, if Edgar ever had a question about the footage or the previous day’s edit he would say, “Could we meet an hour early (before call-time) and pop into the cutting room to look at some stuff?” It also meant that we could tweak the edit before that day’s shoot and give Edgar a better idea of his goals for that day.

Please tell me about the production and post timeline, considering that this was all impacted by the pandemic.

Prep started at the beginning of 2019. I came on board in April. The shoot itself went from May till early September. We then had the director’s cut period, which due to the complexity took slightly longer than your average 10 weeks.

We worked up until the new year and had a preview screening early in January, which got some good numbers back. Edgar was able to start talking to the studio about some additional photography. We planned that shoot for the last week of March 2020. But at the last minute it was cancelled as we entered the first lockdown. However, visual effects work continued, because all of that could be done remotely. We had a weekly VFX review using Zoom and Cinesync with Double Negative for about a four-month period during the first lockdown. That was a bit of a lifesaver for us to know that the film still had a heartbeat during that time.

Things calmed down at the end of July – enough for the studio to consider allowing us to remount the additional photography. And of course, it was a new world we were living in by that stage. Suddenly we were working in ‘zones’ on set. I was assigned the zone with the video department and couldn’t go on to set and work with Edgar. We had an area – divided by plastic fencing – where we could be. We would have to maintain a distance with him on one side and me on the other. Fortunately, I had my edit-trolley modified so that the A-Grade monitor was on a swivel-mount and that’s how he was able to keep an eye on the progress of the work.

We were only the second production in England to resume at that time. I think other productions were watching us and thinking, “Would we all just collapse like flies? Would COVID just go ‘BAM!’ and knock us all out?” Overall, the various PCR testing and new health and safety procedures added about an extra 20% to the budget, but it was the only way we were going to be allowed to shoot.

The reshoots went very well and we had another preview screening in October and the numbers were even better. But then we were approaching our second lockdown in the UK. However, this time Edgar and I were able to see post-production all the way through. All of our dates at Twickenham Studios for the sound mix and at Warner Bros. De Lane Lea for the grade could be honored, even though strict safety precautions were in place.

We delivered the film on December 18th of last year, having made all the various HDR and SDR versions for the UHD Blu-ray, as well as general release. We did a wonderful Dolby Vision/Dolby Atmos version, which is actually the ultimate way to see the film. Because of the pandemic and the lack of open theaters at the time, there was a school of thought that this film should be released directly onto a streaming platform. Fortunately, producers Eric Fellner and Nira Park viewed the final grade and mix at Twickenham and said, “No, no, this is a cinematic experience. It was designed to be seen on the big screen, so let’s wait. Let’s bide our time.”

Edgar Wright was also working on the documentary, The Sparks Brothers. Was the post on these two simultaneous? If so, how did this impact his availability to you during the Soho post?

The timelines to the two projects were kind of parallel. Making a documentary is about gathering tons of archival footage and obtaining new interviews. Then you can leave it with the editor to put a version of it together. I remember that Edgar had done a ton of interviews before we started shooting Soho. He’d done all of the black-and-white interviews in London or in LA pre-pandemic. The assembly of all of that footage happened during our shoot. Then when the lockdown occurred, it was very easy for Paul Trewartha, the editor of the Sparks documentary, to carry on working from home.

When we came back, the Soho and Sparks cutting rooms were both on different floors of our edit facility on Wardour Street in Soho. They were on the first floor and we were on the top floor. Edgar would literally bounce between the two. It got a little bit fraught for Edgar, because the grading and dubbing for both films happened at the same time at different facilities. I remember Edgar had to sign off on both films on December 18th. So he had to go to one facility to watch Soho with me and then he went off to watch Sparks with Paul, which I imagined gave him quite a degree of satisfaction to complete both huge projects on the same day.

To wrap it up, let’s talk edit systems. Avid Media Composer is still your go-to NLE. Right?

Avid, yeah. I’ve been running it for God knows how long now. It’s a little like touch-typing for me – it all just happens very quickly. When I watch all the dailies of a particular scene, I’m in an almost trance-like state, putting shots together very quickly on the timeline before taking a more meticulous approach. I know the ballistics of the Avid – how it behaves, how long it takes between commands. It’s still the best way for me to connect emotionally to the material. Plus, on a technical level – in terms of media management, having multiple users, VFX editors, two assistant editors – it’s still the best.

In a past life you were a Smoke editor. Any closing thoughts about some of the other up-and-coming editing applications?

You certainly can’t call them the poor cousins of post-production. Especially Resolve. Our colorist, Asa Shoul, suggested I look at Resolve. He said, “Paul, you really should have a look, because not only is the cutting intuitive, but you could send me a sequence, to which I could apply a grade, and that would be instantly updated on your timeline.” Temp mixes from the sound department would work in a very similar way. I think that sort of cross-pollination of ideas from various departments, all contributing elements to a sequence, which itself is being continually updated, is a very exciting concept.

I wouldn’t be surprised if one day someone said, “Paul, we are doing this film, but we’re going to do it on Resolve, because we have this workflow in place and it’s really good.” At my advanced age of 49 [laugh], I don’t want to think, “Well, no, it’s Avid or nothing.” I think part of keeping your career fresh is the ability to integrate your skills with new workflow methods that offer up results that would have seemed impossible ten years earlier.

Images courtesy of Focus Features.

This article is also available at postPerspective.

For more, check out Steve Hullfish’s Art of the Cut interview with Paul Machliss.

©2021 Oliver Peters

Avid’s Hidden Gems

Avid Media Composer offers a few add-on options, but two are considered gems by the editors that rely on them. ScriptSync and PhraseFind are essential for many drama and documentary editors who wield Media Composer keyboards every day. I’ve written about these tools in the past, including how you can get similar functionality in other NLEs. New transcription services, like Simon Says, make them more viable than ever for the average editor.

Driven by the script

Avid’s script-based editing, also called script integration, builds a representation of the script supervisor’s lined script directly into the Avid Media Composer workflow and interface. Though often referred to as ScriptSync, Avid’s script integration is not actually the same thing. Script-based editing and script bins are part of the core Media Composer system and do not cost extra.

The concept originated with the Cinedco Ediflex NLE and migrated to Avid. In the regular Media Composer system, preparing a script bin and aligning takes to that script is a manual process, often performed by assistant editors who are part of a larger editorial team. Because it is labor-intensive, most individual editors working on anything other than major feature films or TV series avoid this workflow.

Avid ScriptSync (a paid option) automates this script bin preparation by aligning the spoken words in a take to the text lines of the written script. It does this using speech recognition technology licensed from Nexidia, which is based on phonemes – the sounds that are combined to create spoken words. Clips can be imported (transcoded into Avid MediaFiles) or linked.

Through automatic analysis of the audio within a take, ScriptSync can correlate a line in the script to its relative position within that take or within multiple takes. Once clips have been properly aligned to the written dialogue, ScriptSync is largely out of the picture. And so, in Avid’s script-based editing, the editor can then click on a line of dialogue within the script bin and see all of the coverage for that line.
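
Avid’s actual alignment engine is proprietary Nexidia technology, but the basic idea – mapping script lines to their relative positions within a take – can be sketched with a toy example. Everything below (the function name, the data shapes, the naive first-word matching rule) is a hypothetical illustration for clarity, not Avid’s implementation:

```python
def align_script_lines(script_lines, take_words):
    """Map each script line to the time its first word occurs in a take.

    take_words: list of (word, seconds) pairs, e.g. from a transcript.
    Returns {line: seconds or None}. Purely illustrative.
    """
    # Normalize the take into a searchable sequence of lowercase words.
    words = [(w.lower().strip(".,!?"), t) for w, t in take_words]
    alignment = {}
    cursor = 0  # dialogue occurs in order, so only search forward
    for line in script_lines:
        first = line.split()[0].lower().strip(".,!?")
        hit = None
        for i in range(cursor, len(words)):
            if words[i][0] == first:
                hit = words[i][1]
                cursor = i + 1
                break
        alignment[line] = hit
    return alignment

# Demo: two script lines aligned against one take's transcript.
take = [("sorry", 1.2), ("I", 1.8), ("was", 2.0), ("late", 2.3),
        ("traffic", 4.0), ("was", 4.4), ("terrible", 4.7)]
lines = ["Sorry I was late.", "Traffic was terrible."]
print(align_script_lines(lines, take))
# → {'Sorry I was late.': 1.2, 'Traffic was terrible.': 4.0}
```

In the real product, of course, the matching is phonetic rather than text-based, which is what allows it to work from the audio itself rather than a word-for-word transcript.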

Script integration with non-scripted content

You might think, “Great, but I’m not cutting TV shows and films with a script.” If you work in documentaries or corporate videos built around lengthy interviews, then script integration may have little meaning – unless you have transcripts. Getting long interviews transcribed can be costly and/or time-consuming. That’s where an automated transcription service like Simon Says comes in. There are certainly other, equally good services; however, Simon Says offers export options tailored for each NLE, including Avid Media Composer.

With a transcription available on a fast turnaround, it becomes easy to import an interview transcript into a Media Composer script bin and align clips to it. ScriptSync takes care of the automatic alignment, making script-based editing quick, easy, and painless – even for an individual editor without any assistants.

Finding that needle in the haystack

The second gem is PhraseFind, which builds upon the same Nexidia speech recognition technology. It’s a tool that’s even more essential for the documentary editor than script integration. PhraseFind (a paid option) is a phonetic search tool that analyzes the audio for clips within an Avid MediaFiles folder. Type in a word or phrase and PhraseFind will return a number of “hits” with varying degrees of accuracy.

The search is based on phonemes, so the results are based on words that “sound like” the search term. On one hand, this means that low-accuracy results may include unrelated finds that merely sound similar. On the other hand, you can enter a search word that is spelled differently or inaccurately, and as long as it still sounds the same, useful results will be returned.
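
Nexidia’s phoneme engine is proprietary, but the “sounds like” principle can be illustrated with the classic Soundex algorithm, which reduces a word to a rough phonetic key so that spelling variants hash to the same code. This is only an analogy (and a simplified Soundex at that – it treats h and w like vowels), not how PhraseFind actually works:

```python
# Build the classic Soundex consonant groups: similar-sounding
# consonants share a digit; vowels, h, w, and y carry no code.
_CODES = {}
for letters, digit in [("bfpv", "1"), ("cgjkqsxz", "2"), ("dt", "3"),
                       ("l", "4"), ("mn", "5"), ("r", "6")]:
    for ch in letters:
        _CODES[ch] = digit

def soundex(word: str) -> str:
    """Reduce a word to a 4-character phonetic key (simplified Soundex)."""
    word = "".join(ch for ch in word.lower() if ch.isalpha())
    if not word:
        return ""
    key = word[0].upper()
    prev = _CODES.get(word[0], "")
    for ch in word[1:]:
        digit = _CODES.get(ch, "")
        if digit and digit != prev:  # skip repeats of the same sound
            key += digit
        prev = digit
    return (key + "000")[:4]  # pad or truncate to 4 characters

# Different spellings, same sound, same key:
print(soundex("Smith"), soundex("Smyth"))    # → S530 S530
print(soundex("Robert"), soundex("Rupert"))  # → R163 R163
```

The trade-off described above falls out naturally: a loose phonetic key finds misspelled search terms, but it also sweeps in unrelated words that merely share the same consonant skeleton.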

PhraseFind is very helpful when editing “Frankenbites.” Those are edits where sentences are ended in the middle, because a speaker went off on a tangent, or where different phrases are combined to complete a thought. Often you need to find a word that matches your edit point with the correct inflection, such as the end of a sentence. PhraseFind is great for these types of searches, since your only alternative is scouring through multiple clips in search of a single word.

Working with the options

Script-based editing, ScriptSync, and PhraseFind are unique features that are only available in Avid Media Composer. No other NLE offers similar built-in features. Boris FX does offer Soundbite, a standalone equivalent to the PhraseFind technology licensed to them by Nexidia. It’s still available, but no longer actively promoted or developed. Adobe had offered Story as a way to integrate script-based editing into Premiere Pro, but that feature is no longer available. So today, if you want the accepted standard for script and phonetic editing features, Media Composer is where it’s at.

These are separate add-on options. You can pick one or the other or both (or neither), depending on your needs and style of work. They are activated through Avid Link. If you own multiple seats of Media Composer, then you can purchase one license of ScriptSync and/or PhraseFind and float them between Media Composers via Avid Link activation. While these tools aren’t for everyone, they do add a new dimension to how you work as an editor. Many who’ve adopted them have never looked back.

©2020, 2021 Oliver Peters

Avid Media Composer 2020

Avid Media Composer has been at the forefront of nonlinear, digital video editing for three decades. While most editors and audio mixers know Avid for Media Composer and Pro Tools, the company has grown considerably in that time. Whether by acquisition or internal development, Avid Technology encompasses such products as storage, live and post mixing consoles, newsroom software, broadcast graphics, asset management, and much more.

In spite of this diverse product line, Media Composer, as well as Pro Tools, continue to be the marquee products that define the brand. Use the term “Avid” and generally people understand that you are talking about Media Composer editing software. If you are an active Media Composer editor, then most of this article will be old news. But if you are new to Media Composer, read on.

The Media Composer heritage

Despite challenges from other NLEs, such as Final Cut Pro, Final Cut Pro X, Premiere Pro, and DaVinci Resolve, Media Composer continues to be the dominant NLE for television and feature film post around the world. Even in smaller broadcast markets and social media work, it’s not a given that the other options are used exclusively. If you are new to the industry and intend to work in one of the major international media hubs, then knowing Media Composer is helpful and often required.

Media Composer software comes in four versions, ranging from Media Composer | First (free) up to Media Composer Enterprise. Most freelance editors will opt for one of the two middle options: Media Composer or Media Composer | Ultimate. Licenses may be “rented” via a subscription or bought as a perpetual license. The latter includes a year of support with a renewal at the end of that year. If you opt not to renew support, then your Media Composer software will be frozen at the last valid version issued within that year; but it will continue to work. No active internet connection or periodic sign-in is required to use Media Composer, so you could be off the grid for months and the software works just fine.

A Media Composer installation is full-featured, including effects, audio plug-ins, and background rendering software. Depending on the version, you may also receive loyalty offers (free) for additional software from third-party vendors, like Boris FX, NewBlueFX, iZotope, and Accusonus.

Avid only offers three add-on options for Media Composer itself: ScriptSync, PhraseFind, and Symphony. Media Composer already incorporates manual script-based editing: plain text script documents can be imported into a special bin and clips aligned to sentences and paragraphs in that script, though the synchronization must be done by hand. The ScriptSync option saves time by automating the process – phonetically analyzing and syncing clips to the script text. Click on a script line and any corresponding take can be played starting from that point within the scene.

The PhraseFind option is a phonetic search engine based on the same technology as ScriptSync. It’s ideal for documentary and reality editors. PhraseFind automatically indexes the phonetics of your clips’ audio. Search for a word or phrase and all matching instances will appear, regardless of actual spelling. You can dial in the sensitivity to find only the most accurate hits, or cast a wider net in cases where dialogue is hard to hear or heavily accented.

Media Composer includes good color correction, featuring wheels and curves. In fact, Avid had this long before other NLEs. The Symphony option expands the internal color correction with more capabilities, as well as a full color correction workflow. Grade clips by source, timeline, or both. Add vector-based secondary color correction and more. Symphony is not as powerful as Baselight or Resolve, but you avoid any issues associated with roundtrips to other applications. That’s why it dominates markets where turnaround time is critical, like finishing for non-scripted (“reality”) TV shows. A sequence from a Symphony-equipped Media Composer system can still be opened on another Media Composer workstation that does not have the Symphony option. Clips play fine (no “media offline” or “missing plug-in” screen); however, the editor cannot access or alter any of the color correction settings specific to Symphony.

Overhauling Media Composer

When Jeff Rosica took over as CEO of Avid Technology in 2018, the company embraced an effort to modernize Media Composer. Needless to say, that’s a challenge. Any workflow or user interface change affects familiarity and muscle memory, which is made tougher in an application with a loyal, influential, and vocal customer base. An additional complication for every software developer is keeping up with changes to the underlying operating system. Changes from Windows 7 to Windows 10, or from macOS High Sierra to Mojave to Catalina, all add their own peculiar speed bumps to the development roadmap.

For example, macOS Catalina is Apple’s first fully 64-bit operating system. Apple dropped the 32-bit QuickTime library components that developers had used to support certain codecs. Of course, this change impacted Media Composer. Without Apple rewriting 64-bit versions of these legacy components, the alternative is for a developer to add that support back into the application on their own, which Avid has had to do. Unfortunately, this introduces some inevitable media compatibility issues between older and newer versions of Media Composer. Avid is not alone in this.

Nevertheless, Media Composer changes aren’t just cosmetic, but also involve many “under the hood” improvements. These include a 32-bit float color pipeline, support for ACES projects, HDR support, dealing with new camera raw codecs, and the ability to read and write ProRes media on both macOS and Windows systems.

Avid Media Composer 2020.10

Avid bases its product version numbers on the year and month of release. Media Composer 2020.10 – the most recent version as of this writing – was just released. The versions prior to that were Media Composer 2020.9 and 2020.8, released in September and August respectively. Before that it was 2020.6 from June, skipping .7. (Some of the features that I will describe were introduced in earlier versions and are not necessarily new in 2020.10.)

Media Composer 2020.10 is fully compatible with macOS Catalina. Due to the need to shift to a 64-bit architecture, the AMA framework – used to access media using non-Avid codecs – has been revamped as UME (Universal Media Engine). Also the legacy Title Tool has been replaced with the 64-bit Titler+.

If you are a new Media Composer user or moving to a new computer, then several applications will be installed. In addition to the Media Composer application and its built-in plug-ins and codecs, the installer will add Avid Link to your computer. This is a software management tool to access your Avid account, update software, activate/deactivate licenses, search a marketplace, and interact with other users via a built-in social component.

The biggest difference for Premiere Pro, Resolve, or Final Cut Pro X users who are new to Media Composer is understanding the Avid approach to media. Yes, you can link to any compatible codec, add it to a bin, and edit directly with it – just like the others. But Avid is designed for and works best with optimized media.

This means transcoding the linked media to MXF-wrapped Avid DNxHD or DNxHR media. This media can be OPatom (audio and video as separate files) or OP1a (interleaved audio/video files). It’s stored in an Avid MediaFiles folder located at the root level of the designated media volume. That’s essentially the same process adopted by Final Cut Pro X when media is transcoded and placed inside an FCPX Library file. In each case, the approach enables a bullet-proof way to move project files and media around without breaking links to that media.
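
The Avid MediaFiles convention is simple enough to inspect from outside the application. The sketch below assumes the standard layout – an Avid MediaFiles/MXF folder at the volume root, containing numbered subfolders of .mxf media alongside Avid’s media database files – and the function names are my own, for illustration:

```python
from pathlib import Path

def avid_mxf_root(volume):
    """Standard location of Avid-managed MXF media on a volume."""
    return Path(volume) / "Avid MediaFiles" / "MXF"

def list_avid_media(volume):
    """Return all managed .mxf files on a volume, sorted by path.

    Avid's database files (.mdb, etc.) in the same folders are
    ignored, since the glob only matches the .mxf extension.
    """
    root = avid_mxf_root(volume)
    if not root.is_dir():
        return []
    return sorted(root.rglob("*.mxf"))
```

This fixed, volume-relative location is what makes the media portable: attach the drive to another Media Composer workstation and the application rescans Avid MediaFiles and relinks automatically, with no manual reconnecting.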

The second difference is that each Avid bin within the application is also a dedicated data file stored within the project folder on your hard drive. Bins can be individually locked (under application control). This facilitates multiple editors working in a collaborative environment. Adobe adopted an analog of this method in their new Adobe Productions feature.

The new user interface

Avid has always offered a highly customizable user interface. The new design, introduced in 2019, features bins, windows, and panels that can be docked, tabbed, or floated. Default workspaces have been streamlined, but you can also create your own. A unique feature compared to the competing NLEs is that open panes can be slid left or right to move them off of the active screen. They aren’t actually closed, but compacted into the side of the screen. Simply slide the edge inward again to reveal that pane.

One key to Avid’s success is that the keyboard layout, default workspaces, and timeline interactions tend to be better focused on the task of editing. You can get more done with fewer keystrokes. In all fairness, Final Cut Pro X also shares some of this, if you can get comfortable with their very different approach. My point is that the new Media Composer workspaces cover most of what I need and I don’t feel the need for a bunch of custom layouts. I also don’t feel the need to remap more levels of custom keyboard commands than what’s already there.

Media Composer for Premiere and Final Cut editors

My first recommendation is to invest in a custom Media Composer keyboard from LogicKeyboard or Editors Keys. Media Composer mapping is a bit different than the Final Cut “legacy” mapping that many NLEs offer. It’s worth learning the standard Media Composer layout. A keyboard with custom keycaps will be a big help.

My second recommendation is to learn all about Media Composer’s settings (found under Preferences and Settings). There are a LOT of them, which may seem daunting at first. Once you understand these settings, you can really customize the software just for you.

Getting started

Start by establishing a new project from the projects panel. Projects can be saved to any available drive and do not have to be in a folder at the root level. When you create a new project, you are setting the format for frame size, rate, and color space. All sequences created inside of this project will adhere to these settings. However, other sequences using different formats can be imported into any project.

Once you open a project, Media Composer follows a familiar layout of bins, timeline, and source/record windows. There are three normal bin views – frame, column, and storyboard – plus script-based editing (if you use it). In column view, you may create custom columns as needed, and clips can be sorted and filtered based on the criteria you pick. In frame view, clips can be arranged in a freeform manner, which many film editors really like.

The layout works on single and dual-monitor set-ups. If you have two screens, it’s easy to spread out your bins on one screen in any manner you like. But if you only have one screen, you may want to switch to a single viewer mode, which then displays only the record side. Click a source clip in a bin and it opens in its own floating window. Mark in/out, make the edit, and close. I wish the viewer would toggle between source and record, but that’s not the case yet.

Sequences

Media Composer does not use stacked or tabbed sequences, but there is a history pulldown for quick access to recent sequences and/or source clips. Drag and load any sequence into the source window and toggle the timeline view between the source and the record side. This enables easy editing of portions from one sequence into another.

Mono and stereo audio tracks are treated separately on the timeline. If you have a clip with left and right stereo audio on two separate channels (not interleaved), these will cut to the timeline as two mono tracks, each panned to the center by default. You’ll need to pan these tracks back to left and right in the timeline. If you have a clip with interleaved stereo audio, like a music cue, it will be edited to a new interleaved stereo track with default stereo panning. You can’t mix interleaved stereo and mono content on the same timeline track.

Effects

Unlike in other NLEs, timeline clips are only modified when a specific effect is applied. When clips of a different format than the sequence format are cut to the timeline, a FrameFlex effect is automatically applied for transform and color space changes. There is no persistent Inspector or Effects Control panel; instead, you select a clip with an effect applied and open the effect mode editor. While this may seem more cumbersome, the advantage is that you won’t inadvertently change the settings of one clip thinking that another has been selected.

Media Composer installs a fair amount of video and audio plug-ins, but for more advanced effects, I recommend augmenting with Boris FX’s Continuum Complete or Sapphire. What is often overlooked is that Media Composer does include paint, masking, and tracking tools. And, if you work on stereo 3D projects, Avid was one of the first companies to integrate a stereoscopic toolkit into Media Composer.

The audio plug-ins provide a useful collection of filters for video editors. These plug-ins come from the Pro Tools side of the company. Media Composer and Pro Tools use the AAX plug-in format; therefore, no AU or VST audio plug-ins will show up inside Media Composer.

Due to the 64-bit transition, Avid dropped the legacy Title Tool and Marquee titler and replaced them with the new Titler+. Honestly, it’s not as intuitive as it should be and took some time for me to warm up to. Once you play with it, though, the controls are straightforward. It includes roll and crawl options, along with keyframed moves and tracking. Unfortunately, there are no built-in graphics templates.

Trimming

When feature film editors are asked why they like Media Composer, the trim mode is frequently at the top of the list. The other NLEs offer advanced trimming modes, but none seems as intuitive to use as Avid’s. Granted, you don’t have to stick with the mouse to use them, but I definitely find it easier to trim by mouse in Premiere or Final Cut.

Trimming in Media Composer is geared towards fluid keyboard operation. I find that when I’m building up a sequence, my flow is completely different in Media Composer. Some will obviously prefer the others’ tools and, in fact, Media Composer’s smart keys enable mouse-based trimming, too. It’s certainly preference, but once you get comfortable with the flow and speed of Media Composer’s trim mode, it’s hard to go to something else.

Avid’s journey to modernize Media Composer has gone surprisingly well. If anything, the pace of feature enhancements might be too incremental for users wishing to see more radical changes. For now, there hasn’t been too much resistance from the old guard and new editors are indeed taking a fresh look. Whether you are cutting spots, social media, or indie features, you owe it to yourself to take an objective look at Media Composer as a viable editing option.

To get more familiar with Media Composer, check out Kevin P. McAuliffe’s Let’s Edit with Media Composer tutorial series on YouTube.

Originally written for Pro Video Coalition.

©2020 Oliver Peters