Subscribers to Adobe Creative Cloud have a whole suite of creative tools at their fingertips, yet many users overlook some of the less promoted features. Here are five quick tips for your workflow.
Camera Raw. Photographers know that the Adobe Camera Raw module is used to process camera raw images, such as .cr2 files. It’s a “develop” module that opens first when you import a camera raw file into Photoshop, and it’s also used in Bridge and Lightroom. Many people use Photoshop for photo enhancement – working with the various filters and adjustment layer tools available. What’s often overlooked is that the Camera Raw Filter in Photoshop works on any photo, even if the file is not raw – a JPEG or TIFF, for example.
Select the layer containing the image and choose the Camera Raw Filter. This opens the image in that separate “develop” module, where you have all the photo and color enhancement tools in a single, comprehensive toolkit – the same as in Lightroom. Once you’re done and close the Camera Raw Filter, those adjustments are “baked” into the image on that layer.
Remix. Audition is a powerful digital audio workstation application that many use in conjunction with Premiere Pro or on its own for audio production. One feature it has over Premiere Pro is the ability to use AI to automatically edit the length of music tracks. Let’s say you have a music track that’s 2:47 in length, but you want a :60 version to underscore a TV commercial. Yes, you could edit it manually, but Audition’s Remix turns this into an “automagic” task. It’s especially useful for projects where specific parts of the song don’t need to hit specific visuals.
Open Audition, create a multitrack session, and place the music selection on any track in the timeline. Right-click the selection and enable Remix. Within the Remix dialog box, set the target duration and parameters – for example, short versus long edits. Audition will calculate the number and location of edit points needed to seamlessly shorten the track to approximately the desired length.
Audition attempts to create edits at points that are musically logical. You won’t necessarily get an exact duration, since the value you entered is only a target – even more so with tracks that have a long musical fade-out. A little experimentation may be needed. For example, a target value of :59 will often yield significantly different results than a target of 1:02, thanks to the recalculation. Remix isn’t perfect, but it will get you close enough that only minimal additional work is required. Once you are happy, bounce out the edited track and bring the shortened version into Premiere Pro.
Photoshop Batch Processing. If you want to add interesting stylistic looks to a clip, then effects filters in Premiere Pro and/or After Effects usually fit the bill. Or you can go with expensive third-party options like Continuum Complete or Sapphire from Boris FX. However, don’t forget Photoshop, which includes many stylized looks not offered in either of Adobe’s video applications, such as specific paint and brush filters. But how do you apply those to a video clip?
The first step is to turn your clip into an image sequence using Adobe Media Encoder. Then open a representative frame in Photoshop to define the look. Create a Photoshop action using the filters and settings you desire. Save the action, but not the image. Then create a batch function to apply that stored action to the clean frames within the image sequence folder. The batch operation will automatically open each image, apply the effects, and save the stylized results to a new destination folder.
Open that new image sequence in any app that supports image sequences (including QuickTime) and save it as a ProRes (or other) movie file. Stylized effects, like oil paint, are applied to individual frames and will vary with the texture and lighting of each frame; therefore, the effect takes on an animated quality in the stitched movie.
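If you’d rather script the round trip than use Media Encoder and QuickTime, ffmpeg can handle both conversions. Here’s a rough Python sketch – the file names, frame-number pattern, and frame rate are illustrative assumptions, and `prores_ks` with profile 3 is ffmpeg’s ProRes HQ encoder:

```python
import subprocess

def clip_to_frames_cmd(clip, pattern="frames/frame_%05d.png"):
    """Clip -> numbered image sequence (a stand-in for Media Encoder)."""
    return ["ffmpeg", "-i", clip, pattern]

def frames_to_prores_cmd(pattern, out, fps="23.976"):
    """Stylized frames -> ProRes movie (a stand-in for a QuickTime export)."""
    return ["ffmpeg", "-framerate", fps, "-i", pattern,
            "-c:v", "prores_ks", "-profile:v", "3", out]  # profile 3 = HQ

if __name__ == "__main__":
    subprocess.run(clip_to_frames_cmd("clip.mov"), check=True)
    # ...run the Photoshop batch action over the clean frames here...
    subprocess.run(frames_to_prores_cmd("frames_styled/frame_%05d.png",
                                        "clip_styled.mov"), check=True)
```

The Photoshop batch action still happens in the middle; the script only automates the bookends.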
After Effects for broadcast deliverables. After Effects is the proverbial Swiss Army knife for editors and designers. It’s my preferred conversion tool when I have 24p masters that need to be delivered as 60i broadcast files.
Import a 23.98 master and place it into a new composition. Scale, if needed (UHD to HD, for instance). Send to the Render Queue. Set the frame rate to 29.97, field render to Upper (for HD), and enable pulldown (any whole/split frame cadence is usually OK). Turn off Motion Blur and Frame Blending. Render for a proper interlaced broadcast deliverable file.
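The pulldown step is just a repeating field cadence. As a toy illustration of why 23.976p stretches cleanly to 29.97i, here’s a simplified Python model of the standard 2:3 whole/split frame cadence (labels and pairing are simplified – a real renderer works on actual interlaced fields):

```python
def pulldown_32(frames):
    """Map progressive frames to interlaced frames with a 2:3 cadence.
    Every 4 film frames become 10 fields = 5 video frames, which is the
    5/4 stretch that turns 23.976p into 29.97i."""
    cadence = [2, 3]  # alternate: 2 fields, then 3 fields per source frame
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * cadence[i % 2])
    # pair consecutive fields into interlaced video frames
    return [(fields[i], fields[i + 1]) for i in range(0, len(fields) - 1, 2)]

print(pulldown_32(["A", "B", "C", "D"]))
# 5 video frames: A/A, B/B, B/C, C/D, D/D - the middle two are "split" frames
```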
Photoshop motion graphics. One oft-ignored (or forgotten) feature of Photoshop is that you can do layer-based video animation and editing within it. Essentially, there’s a very rudimentary version of After Effects inside Photoshop. While you probably wouldn’t want to use it for video instead of After Effects or Premiere Pro, Photoshop does have value for creating animated lower thirds and other titles.
Photoshop provides much better text and graphic style options than Premiere Pro. The files are more lightweight than an After Effects comp on your Premiere Pro timeline – or than rendering animated ProRes 4444 movies. Since it’s still a Photoshop file (albeit a special version), the “edit in original” command opens the file in Photoshop for easy revisions. Let’s say you are working on a show that has 100 lower thirds that slide in and fade out. These can easily be prepped for the editor by the graphics department in Photoshop – no After Effects skills required.
Create a new file in Photoshop, turn on the timeline window, and add a new blank video layer. Add a still onto a layer for positioning reference, delete the video layer, and extend the layers and timeline to the desired length. Now build your text and graphic layers. Keyframe changes to opacity, position, and other settings for animation. Delete the reference image and save the file. This is now a keyable Photoshop file with embedded animation properties.
Import the Photoshop file into Premiere with Merged Layers. Add to your timeline. The style in Premiere should match the look created in Photoshop. It will animate based on the keyframe settings created in Photoshop.
Mank, David Fincher’s eleventh film, chronicles Herman Mankiewicz (portrayed by Gary Oldman) during the writing of the film classic, Citizen Kane. Mankiewicz, known as Mank, was a witty New York journalist and playwright who moved to Los Angeles in the 1930s to become a screenwriter. He wrote or co-wrote about 40 films, often uncredited, including the first draft of The Wizard of Oz. Together with Orson Welles, he won an Academy Award for the screenplay of Citizen Kane. It’s long been disputed whether he, rather than Welles, actually did the bulk of the work on the screenplay.
The script for Mank was penned decades ago by David Fincher’s father, Jack Fincher, and was finally brought to the screen thanks to Netflix this past year. Fincher deftly blends two parallel storylines: Mankiewicz’ writing of Kane during his convalescence from an accident – and his earlier Hollywood experiences with the studios, as told through flashbacks. These experiences, including his acquaintance with William Randolph Hearst – the media mogul of his time and the basis for Charles Foster Kane in Citizen Kane – inspired Mankiewicz’ script. This earlier period is infused with the political undercurrent of the Great Depression and the California gubernatorial race between Upton Sinclair and Frank Merriam.
David Fincher and director of photography Erik Messerschmidt, ASC (Mindhunter) used many techniques to pay homage to the look of Citizen Kane and other classic films of the era, including shooting in true black-and-white with RED Monstro 8K Monochrome cameras and Leica Summilux lenses. Fincher also tapped other frequent collaborators, including Trent Reznor and Atticus Ross for a moving, vintage score, and Oscar-winning editor, Kirk Baxter, ACE. I recently caught up with Baxter to discuss Mank, the fourth film he’s edited for David Fincher.
Citizen Kane is the 800-pound gorilla. Had you seen that film before this or was it research for the project?
I get so nervous about this topic, because with cinephiles, it’s almost like talking about religion. I had seen Citizen Kane when I was younger, but I was too young to appreciate it. I was growing up on Star Wars, Indiana Jones, and Conan the Barbarian. Then advancing my tastes to the Godfather films and French Connection. Citizen Kane is still just such a departure from all of that. I was kind of like, “What?” That was probably in my late teens.
I went back and watched it again before the shoot after reading the screenplay. There were certain technical aspects to the film that I thought were incredible. I loved the way Orson Welles chose to leave his scenes by turning off lights like it was in the theater. There was this sort of slow decay and I enjoy how David picked up on that and took it into Mank. Each time one of those shots came up in the bungalow scenes, I thought it was fantastic.
Overall, I don’t consider myself any sort of expert on 1930s and 1940s movie-making and I didn’t make a conscious effort to try to replicate any styles. I approached the work in the same way I do with all of David’s work – by being reactionary to the material and the coverage that he shot. In regard to how close David took the stylings, well, that was more his tightrope walk. So, I felt no shackling to slow down an edit pace or stay in masters or stay in 50-50s as might have been common in the genre. I used all the tools at my disposal to exploit every scene the best I could.
Since you are cutting while the shooting goes on, do you have the ability to ask for coverage that you might feel is missing?
I think a little bit of that goes on, but it’s not me telling Fincher what’s required. It’s me building assemblies and giving them to David as he’s going and he will assess where he’s short and where he’s not. I’ve read many editor interviews over the years and I’ve always kind of gone, “huh,” when someone’s projecting they’re in the control seat. When you’re with someone with the ability that Fincher has, then I’m in a support position of helping him make his movie as best he can. Any other way of looking at it is delusional. But, I take a lot of pride in where I do get to contribute.
Mank is a different style of film than Fincher’s previous projects. Did that change the workflow or add any extra pressure?
I don’t think it did for me. I think it was harder for David. The film was in his head for so many decades and there were a couple of attempts to make it happen. Obviously a lot changes in that time frame. So, I think he had a lot of internal pressure about what he was making. For me, I found the entire process to be really buoyant and bubbly and just downright fun.
As with all films, there were moments when it was hard to keep up during the shoot. And definitely moments coming down to that final crunch. That’s when I really put a lot of pressure on myself to deliver cut scenes to David to help him. I felt the pressure of that, but my main memory of it really was one of joy. Not that the other movies aren’t, but I think sometimes the subject matter can control the mood of the day. For instance, in other movies, like Dragon Tattoo, the feeling was a bit like your head in a vise when I look back at it.
Sure. Dragon Tattoo is dark subject matter. On the other hand, Gary Oldman’s portrayal of Mankiewicz really lights up the screen. It certainly looks like he’s having fun with the character.
Right. I loved all the bungalow scenes. I thought there was so much warmth in those. And I had so much compassion for the lead character, Mank. Those scenes really made me adore him. But also when the flashback scenes came, they’re just a hoot and great fun to put together. There was this warmth and playfulness of the two different opposing storylines. No matter which one turned up, I was happy to see it.
Was the inter-cutting of those parallel storylines the way it was scripted? Or was that a construction in post?
Yes, it was scripted that way. There was a little bit of pulling at the thread later. Can we improve on this? There was a bit of reshuffling later on and then working out that ‘as written’ was the best path. We certainly kicked the tires a few times. After we put the blueprint together, mostly the job became tightening and shortening.
Obviously one of the technical differences was that this film was a true black-and-white film shot with modified, monochrome RED cameras. So not color and then changed to black-and-white in the grade. Did that impact your thinking in how to tackle the edit?
For the first ten minutes. At first you sit down and you go, “Oh, we work in black and white.” And then you get used to it very quickly. I forwarded the trailer when it was released to my mother in Australia. She texted back, “It’s black and white????” [laugh] You’ve got to love family!
Black-and-white has a unique look, but I know that other films, like Roma, were shot in color to satisfy some international distribution requirements.
That’s never going to happen with someone like David. I can’t picture who that person would be that would tell him with any authority that his movie requires color.
Of course, it matches films of the era and more importantly Citizen Kane. It does bring an intentional, stylistic treatment to the content.
Black-and-white has got a great way of focusing your attention and focusing your eye. There’s a discipline that’s required with how shots are framed and how you’re using the images for eye travel. But I think all of David’s work comes with that discipline anyway. So to me, it didn’t alter it. He’s already in that ballpark.
How much visual effects work went into the film?

As in most of David’s movies, it’s everywhere and a lot of the time it looks invisible, but things are being replaced. I don’t have a ratio for it, but I’d say almost half the movie. We’ve got a team that’s stabilizing shots as we’re going. We’ve got an in-house visual effects team that is building effects, just to let us know that certain choices can be made. The split screen thing is constant, but I’ll do a lot of that myself. I’ll do a fairly haphazard job of it and then pass it on for our assistant editors to follow up on. Even the montage kaleidoscope effect was all done in-house down the hall by Christopher Doulgeris, one of our VFX artists. A lot of it’s farmed out, but a fair slice is done under the roof.
Please tell me a bit about working with Adobe Premiere Pro again to cut this film.
In previous versions, Premiere Pro required projects to contain copies of all the media used in that project. As you would hand the scene off to other people to work on in parallel, all the media would travel into that new project, and the same was true when combining projects back together to merge your work. You had monstrously huge projects with every piece of media, and frequently duplicate copies of that media, packed into them. They often took 15 minutes to open. Now Adobe has solved that and streamlined the process. They knew it was a massive overhaul, but I think that’s been completely solved. Because it’s functioning, I can now purely concentrate on the thought process of where I’m going in the edit. I’m spoiled with having very technical people around me so that I can exist as a child. [laugh]
How was the color grade handled?
We had Eric Weidt working downstairs at Fincher’s place on Baselight. David is really fortunate that he’s not working in this world of “Here’s three weeks for color. Go into this room each day and where you come out is where you are at.” There’s an ongoing grade that’s occurring in increments and traveling with the job that we’re doing. It’s updated and brought into the cut. We experience editing with it and then it’s updated again and brought back into the cut. So it’s this constant progression.
Let’s talk about project organization. You’ve told me in the past that your method of organizing a selects reel was to string out shots in the order of wide shots, mediums, close ups, and so on. And then bump up the ones you like. Finally, you’d reduce the choices before those were presented to David as possible selects. Did you handle it the same way on Mank?
Over time, I’ve streamlined that further. I’ve found that if I send something that’s too long while he’s in the middle of shooting that he might watch the first two minutes of it, give me a couple of notes of what he likes and what he doesn’t like, and move on. So, I’ve started to really reduce what I send. It’s more cut scenes with some choices. That way I get the most relevant information and can move forward.
With scenes that are extremely dense, like Louis B. Mayer’s birthday party at Hearst’s, it really is an endless multiple choice of how to tackle it. I’ll often present a few paths. Here’s what it is if I really hold out these wides at the front and I hang back for a bit longer. Here’s what it is if I stay more with Gary [Oldman] listening. It’s not that this take is better than the other take, but more options featuring different avenues and ways to tell the story.
I like working that way, even if it wasn’t for the sake of presenting it to David. I can’t watch a scene that’s that dense and go, “Oh, I know what to do.” I wouldn’t have a clue. I like to explore it. I’ve got to turn the soil and snuff the truffles and try it all out. And then the answers present themselves. It all just becomes clear. Unfortunately, the world of the editor, regardless of past experiences, is always destined to be filled with labor. There is no shortcut to doing it properly.
With large-scale theatrical distribution out of the question – and the shift to Netflix streaming as the prime focus – did the nature of studio notes change at all?
David’s generous about thought and opinion, if it’s constructive and helpful. He’s got a long history of forwarding those notes to me and exploring them. I’m not positive if I get all of them. Anything that’s got merit will reach me, which is wise. Having spent so many years in the commercial world, there’s a part of me that’s always a little eager to solve a puzzle. If I’m delivered a pile of notes, good or bad, I’m going to try my best to execute them. So, David is wise to just not let me see the bad ones.
Were you able to finish Mank before the virus-related lockdowns started? Did you have to move to a remote workflow?
The shooting had finished and we already had the film assembled. I work at a furious rate whilst David’s shooting, so that we can interface during the shoot. That way he knows what he’s captured, what he needs, and he can move on and strike sets, release actors, etc. There’s this constant back and forth.
At the point when he stops shooting, we’re pretty far along in terms of replicating the original plan, the blueprint. Then it’s what I call the sweeps, where you go back to the top and you just start sweeping through the movie, improving it. I think we’d already done one of those when we went remote. So, it was very fortunate timing.
We’re quite used to it. During shooting, we work in a remote way anyway. It’s a language and situation that we’re completely used to. I think from David’s perspective, it didn’t change anything.
If the timing had been different and you would have had to handle all of the edit under remote conditions, would anything change? Or would you approach it the same way?
Exactly the same. It wouldn’t have changed the amount of time that I get directly with David. I don’t want to give the impression that I cut this movie and David was on the sidelines. He’s absolutely involved, but pops in and out and looks at things that are made. He’s not a director that sits there the whole time. A lot of it is, “I’ve made this cut, let’s watch it together. I’ve done these selects, let’s watch them together.” It’s really possible to do that remotely.
I prefer to be with David when he’s shooting and especially in this one that he shot in Los Angeles. I really tried to have one day a week where we got to be together on the weekends and his world quieted down. David loves that. I would sort of construct my week’s thinking towards that goal. If on a Wednesday I had six scenes that were backed up, I’d sort of think to myself, “What can I achieve in the time frame before David’s with me on Saturday? Should I just select all these scenes and then we’ll go through the selects together? Or should I tackle this hardest one and get a good cut of that going?”
A lot of the time I would choose – if he was coming in and had the time to watch things – to do selects. Sometimes we could bounce through them just from having a conversation of what his intent was and the things that he was excited about when he was capturing them. With that, I’m good to go. Then I don’t need David for another week or so. We were down to the short hand of one sentence, one email, one text. That can inform me with all the fuel I need to drive cross-country.
The film’s back story clearly has political overtones that have an eerie similarity to 2020. I realize the script was written a while back at a different time, but was some of that context added in light of recent events?
That was already there. But, it really felt like we are reliving this now. In the beginning of the shutdown, you didn’t quite know where it was going to go. The parallels to the Great Depression were extreme. There were a lot of lessons for me.
The character of Louis B. Mayer slashes all of his studio employees’ salaries to 50 percent. He promises to give every penny back and then doesn’t do it. I was crafting that villain’s performance, but at the same time I run a company [Exile Edit] that has a lot of employees in Los Angeles and New York. We had no clue if we would be able to get through the pandemic at the time when it hit. We also asked staff to take a pay cut, so that we could keep everyone employed and keep everybody on health insurance. But the moment we realized we could get through it six months later, there was no way I could ever be that villain. We returned every cent.
I think most companies are set up to be able to exist for four months. No one’s anticipating that everything stops dead – the 12-month brake pull. It was really, really frightening. I would hope that I would think this way anyway, but after crafting that villain’s performance, there was no way I was going to replicate it.
Leading into the new year, it’s time to take a fresh look at a perennial subject. Whether you work as a solo editor or part of a team, having a plan for organizing your projects – along with a workflow for moving media through your system – will lead to success in being able to find and restore material when needed at a future date. For a day-to-day workflow, I rely on five standard applications: Post Haste, Hedge, Better Rename, DiskCatalogMaker, and Kyno. I work on Macs, but there are Windows versions or alternatives for each.
Proper project organization. Regardless of your NLE, it’s a good idea to create a project “silo” for each job on your hard drive, RAID, or networked storage (NAS). That’s a main folder for the job, with subfolders for the edit project files, footage, audio, graphics, documents, exports, etc. I use Post Haste to create a new set of project folders for each new project.
Post Haste uses default or custom templates that can include Adobe project files. This provides a common starting point for each new project based on a template that I’ve created. Using this template, Post Haste generates a new project folder with common subfolders. A template Premiere Pro project file with my custom bin structure is contained within the Post Haste template. When each new set of folders is created, this Premiere file is also copied.
In order to track productions, each job is assigned a number, which becomes part of the name structure assigned within Post Haste. The same name is applied to the Premiere Pro project file. Typically, the master folder (and Premiere project) for a new job created through Post Haste will be labelled according to this schema: 9999_CLIENT_PROJECT_DATE.
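Post Haste builds all of this from a template; purely to show the idea, here’s a hypothetical Python version of the same silo-and-naming scheme. The SUBFOLDERS list is my illustration, not Post Haste’s actual template, and the name follows the 9999_CLIENT_PROJECT_DATE schema above:

```python
from datetime import date
from pathlib import Path

# Illustrative subfolder set - a Post Haste template would define its own
SUBFOLDERS = ["PROJECT FILES", "FOOTAGE", "AUDIO", "GRAPHICS",
              "DOCUMENTS", "EXPORTS"]

def new_job(root, number, client, project, when=None):
    """Create a 9999_CLIENT_PROJECT_DATE job silo with common subfolders."""
    when = when or date.today()
    name = f"{number:04d}_{client}_{project}_{when:%m%d%y}"
    job = Path(root) / name
    for sub in SUBFOLDERS:
        (job / sub).mkdir(parents=True, exist_ok=True)
    return job

# e.g. new_job("/Volumes/RAID", 1234, "ACME", "SPRING_PROMO")
```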
Dealing with source footage, aka rushes or dailies. The first thing you have to deal with on a new project is the source media. Most of the location shoots for my projects come back to me with around 1TB of media for a day’s worth of filming. That’s often from two or three cameras, recorded in a variety of codecs at 4K/UHD resolution and 23.98fps. Someone on location (DIT, producer, DP, other) has copied the camera cards to working SSDs, which will be reused on later productions. Hedge is used to copy the cards, in order to provide checksum copy verification.
I receive those SSDs and not the camera cards. The first step is to copy that media “as is” into the source footage subfolder for that project on the editing RAID or NAS. Once my copy is complete, those same SSDs are separately copied “as is” via Hedge to one or more Western Digital or Seagate portable drives. Theoretically, this is for a deep archive, which hopefully will never be needed. Once we have at least two copies of the media, these working SSDs can be reformatted for the next production. The back-up drives should be stored in a safe location on-premises or better yet, offsite.
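Hedge does the checksum verification for you. Purely to show the concept, here’s a minimal Python sketch of a verified copy – MD5 is used here for brevity, and Hedge’s actual verification method may differ:

```python
import hashlib
import shutil
from pathlib import Path

def md5(path, chunk=1 << 20):
    """Hash a file in 1 MB chunks so large media files don't fill RAM."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def verified_copy(src, dst):
    """Copy a file, then confirm the destination reads back identically."""
    dst = Path(dst)
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src, dst)  # copy2 preserves timestamps
    if md5(src) != md5(dst):
        raise IOError(f"checksum mismatch copying {src}")
    return dst
```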
Since video cameras don’t use a standard folder structure on the cards, the next step is to reorganize the copied media in the footage folder according to date, camera, and roll. This means ripping media files out of their various camera subfolders. Within the footage folder, my subfolder hierarchy becomes shoot date (MMDDYY), then camera (A-CAM, B-CAM, etc), and then camera roll (A001, A002, etc). Media is located within the roll subfolder. Double-system audio recordings go into a SOUND folder for that date and follow this same hierarchy for sound rolls. When this reorganization is complete, I delete the leftover camera subfolders, such as Private, DCIM, etc.
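That reorganization is easy to script. Here’s a sketch of pulling media out of a card’s nested folders into the date/camera/roll hierarchy described above – the media-extension list is an assumption, so adjust it for your cameras and recorders:

```python
import shutil
from pathlib import Path

# Assumed media extensions - extend for your particular cameras
MEDIA_EXT = {".mov", ".mp4", ".mxf", ".wav", ".braw", ".crm"}

def flatten_card(card_dir, footage_root, shoot_date, camera, roll):
    """Pull media files out of a card's nested folders (PRIVATE, DCIM, ...)
    into FOOTAGE/MMDDYY/CAMERA/ROLL, discarding the camera's structure."""
    dest = Path(footage_root) / shoot_date / camera / roll
    dest.mkdir(parents=True, exist_ok=True)
    for f in Path(card_dir).rglob("*"):
        if f.is_file() and f.suffix.lower() in MEDIA_EXT:
            shutil.move(str(f), str(dest / f.name))
    return dest
```

After a visual check, the now-empty camera subfolders can be deleted, as described above.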
It may be necessary to rename or append prefixes to file names in order to end up with completely unique file names within this project. That’s where Better Rename comes in. This is a Finder-level batch renaming tool. If a camera generates default names on a card, such as IMG_001, IMG_002 and so on, then renaming becomes essential. I try to preserve the original name in order to be able to trace the file back to back-up drives if I absolutely have to. Therefore, it’s best to append a prefix. I base this on project, date, camera, and roll. As an example, if IMG_001 was shot as part of the Bahamas project on December 20th, recorded by E-camera on roll seven, then the appended file would be named BAH1220E07_IMG_001.
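Better Rename handles this interactively; as a sketch of the same prefix logic using the Bahamas example above (the two helper functions and the dry-run behavior are my own illustration):

```python
from pathlib import Path

def roll_prefix(project, mmdd, camera, roll):
    """Build a prefix like BAH1220E07 from project, date, camera, and roll."""
    return f"{project}{mmdd}{camera}{roll:02d}"

def prepend_prefix(folder, prefix, dry_run=True):
    """Prepend the roll prefix to every file in a roll folder, keeping the
    original name so clips can still be traced back to the backup drives."""
    renames = []
    for f in sorted(Path(folder).iterdir()):
        if f.is_file():
            target = f.with_name(f"{prefix}_{f.name}")
            renames.append((f.name, target.name))
            if not dry_run:
                f.rename(target)
    return renames

# roll_prefix("BAH", "1220", "E", 7) -> "BAH1220E07"
# IMG_001.mov then becomes BAH1220E07_IMG_001.mov
```

The dry-run default lets you review the proposed renames before committing, much like a rename utility’s preview pane.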
Some camera codecs, like those used by drones and GoPros, are a beast for many NLEs to deal with. Proxy media is one answer; alternatively, you can transcode only the offending files. If you choose to transcode, then Compressor, Adobe Media Encoder, or Resolve are the go-to applications. Transcode at the native file size and resolution into an optimized codec, like ProRes. Maintain log color spaces, because these optimized files become the new “camera” files in your edit. I will add separate folders for ORIG (camera original media) and PRORES (my transcoded, optimized files) within each camera roll folder. Only the ProRes media is then imported into the NLE for editing.
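If you’d rather batch this with ffmpeg instead of Compressor, Media Encoder, or Resolve, a sketch might look like the following. The “offending” extension list is an assumption, and `prores_ks` with profile 3 is ffmpeg’s ProRes HQ encoder:

```python
import subprocess
from pathlib import Path

# Assumed list of formats worth optimizing - adjust per camera
TRANSCODE_EXT = {".mp4", ".insv", ".360"}

def prores_cmds(roll_dir):
    """Build one ffmpeg command per hard-to-edit clip in a roll folder,
    writing ProRes HQ copies into a PRORES subfolder at native size."""
    roll = Path(roll_dir)
    (roll / "PRORES").mkdir(exist_ok=True)
    cmds = []
    for clip in sorted(roll.iterdir()):
        if clip.suffix.lower() in TRANSCODE_EXT:
            out = roll / "PRORES" / (clip.stem + ".mov")
            cmds.append(["ffmpeg", "-i", str(clip),
                         "-c:v", "prores_ks", "-profile:v", "3",  # HQ
                         "-c:a", "pcm_s16le", str(out)])
    return cmds

if __name__ == "__main__":
    for cmd in prores_cmds("FOOTAGE/122020/E-CAM/E007"):
        subprocess.run(cmd, check=True)
```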
Back-up! Do not proceed to GO! Now that you’ve spent all of this effort reorganizing, renaming, and transcoding media, you first want to back up the files before starting to edit. I like to back up media to raw, removable, enterprise-grade HGST or Seagate hard drives. Over the years, I’ve accumulated a variety of drive sizes ranging from 2TB to now 8TB. Larger capacities are available, but 8TB is a cost-effective and manageable capacity. When placed into a Thunderbolt or USB drive dock, these function like any other local hard drive.
When you’ve completed dealing with the media from the shoot, simply copy the whole job folder to a drive. You can store multiple projects on the same drive, depending on their capacity. This is an easy overnight process with most jobs, so it won’t impact your edit time. The point is to back up the newly organized version of your raw media. Once completed, you will have three copies of the source footage – the “as is” copy, the version on your RAID or NAS, and this back-up on the raw drive. After the project has been completed and delivered, load up the back-up drive and copy everything else from this job to that drive. This provides a “clone” of the complete job on both your RAID/NAS and the back-up drive.
In order to keep these back-up drives straight, you’ll need a catalog. At home, I’ve accumulated 12 drives thus far. At work we’ve accumulated over 200. I’ve found the easiest way to deal with this is an application called DiskCatalogMaker. It scans the drive and stores the file information in a catalog document. Each drive entry mimics what you see in the Finder, including folders, files, sizes, dates, and so on. The catalog document is searchable, which is why job numbers become important. It’s a good idea to periodically mount and spin up these drives to maintain reliability. Once a year is a minimum.
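DiskCatalogMaker stores much richer records, but the core idea – a searchable index of what lives on each offline drive – can be sketched in a few lines of Python (the CSV layout is my own illustration):

```python
import csv
from pathlib import Path

def catalog_drive(drive_root, catalog_csv, drive_name):
    """Append one row per file: drive, relative path, size, modified time.
    Searching the CSV for a job number tells you which drive to mount."""
    root = Path(drive_root)
    with open(catalog_csv, "a", newline="") as f:
        writer = csv.writer(f)
        for p in root.rglob("*"):
            if p.is_file():
                st = p.stat()
                writer.writerow([drive_name, str(p.relative_to(root)),
                                 st.st_size, int(st.st_mtime)])

# e.g. catalog_drive("/Volumes/BACKUP_07", "catalog.csv", "BACKUP_07")
```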
If you have sufficient capacity on your RAID or NAS, then you don’t want to immediately delete jobs and media when the work is done. In our case, once a job has been fully backed up, the job folder is moved into a BACKED UP folder on the NAS. This way we know when a job has been backed up, yet it is still easily retrieved should the client come back with revisions. Plus, you still have three total copies of the source media.
Other back-ups. I’ve talked a lot about backing up camera media, but what about other files? Generally files like graphics are supplied, so these are also backed up elsewhere. Plus they will get backed up on the raw drive when the job is done.
I also use Dropbox for interim back-ups of project files. Since a Premiere Pro project file is light and doesn’t carry media, it’s easy to back up in the cloud. At work, at the end of each day, each editor copies in-progress Premiere files to a company Dropbox folder. The idea is that in the event of some catastrophe, you could get your project back from Dropbox and then use the backed up camera drives to rebuild an edit. In addition, we also export and copy Resolve projects to Dropbox, as well as the DiskCatalogMaker catalog documents.
Whenever possible, audio stems and textless masters are exported for each completed job. These are stored with the final masters. Often it’s easier to make revisions using these elements than to dive back into a complex job after it’s been deeply archived. Our NAS contains a separate top-level folder for all finished masters, in addition to the master subfolder within each project. When a production is done, the master file is copied into this other folder, resulting in two sets of the master files on the NAS. And by “master” I generally mean a final ProRes file along with a high-quality MP4 file. The MP4 is most often what the client will use as their “master,” since so much of our work these days is for the web. Therefore, both NAS locations hold a ProRes and an MP4. That’s in addition to the masters stored on the raw, back-up drive.
Final, Final revised, no really, this one is Final. Let’s address file naming conventions. Every editor knows the “danger” of calling something Final. Clients love to make changes until they no longer can. I work on projects that have running changes as adjustments are made for use in new presentations. Calling any of these “Final” never works. Broadcast commercials are usually assigned definitive ISCI codes, but that’s rarely the case with non-broadcast projects. The process that works for us is simply to use version numbers and dates. This makes sense and is what software developers use.
We use this convention: CLIENT_PROJECTNAME_VERSION_DATE_MODIFIER. As an example, if you are editing a McDonald’s Big Mac :60 commercial, then a final version might be labelled “MCD_Big Mac 60_v13_122620.” A slight change on that same day would become “MCD_Big Mac 60_v14_122620.” We use the “modifier” to designate variations from the norm. Our default master files are formatted as 1080p at 23.98 with stereo audio. So a variation exported as 4K/UHD or 720p or with a 5.1 surround mix would have the added suffix of “_4K” or “_720p” or “_51MIX.”
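The convention is mechanical enough to generate programmatically, which also keeps typos out of file names. A small sketch, where the function name and defaults are my own invention rather than part of any shipping tool:

```python
# Sketch of the CLIENT_PROJECTNAME_VERSION_DATE_MODIFIER convention.
# master_name() is a hypothetical helper, not an existing utility.
from datetime import date

def master_name(client: str, project: str, version: int,
                when: date, modifier: str = "") -> str:
    """Build a master file name; modifier flags variations
    from the default 1080p/23.98/stereo format."""
    name = f"{client}_{project}_v{version}_{when.strftime('%m%d%y')}"
    return f"{name}_{modifier}" if modifier else name

# The Big Mac example from above:
# master_name("MCD", "Big Mac 60", 13, date(2020, 12, 26))
#   → "MCD_Big Mac 60_v13_122620"
# master_name("MCD", "Big Mac 60", 14, date(2020, 12, 26), "4K")
#   → "MCD_Big Mac 60_v14_122620_4K"
```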
Some projects go through many updates and it’s often hard to know when a client (working remotely) considers a version truly done. They are supposed to tell you that, but they often just don’t. You sort of know, because the changes stop coming and a presentation deadline has been met. Whenever that happens, we export a ProRes master file plus high-quality MP4 files. The client may come back a week later with some revisions. Then, new ProRes and MP4 files are generated. Since version numbers are maintained, the ProRes master files will also have different version numbers and dates and, therefore, you can differentiate one from the other. Both variations may be valid and in use by the client.
Asset management. The last piece of software that comes in handy for us is Kyno. This is a lightweight asset management tool that we use to scan and find media on our NAS. Our method of organization makes it relatively easy to find things just by working in the Finder. However, if you are looking for that one piece of footage and need to be able to identify it visually, then that’s where Kyno is helpful. It’s like Adobe Bridge on steroids. One can organize and sort using the usual database tools, but it also has a very cool “drill down” feature. If you want to browse media within a folder without stepping through a series of subfolders, simply enable “drill down” and you can directly browse all media contained therein. Kyno also features robust transcode and “send to” functions designed with NLEs in mind. Need to prep media for an edit or create proxies? Kyno serves as a handy alternative to other options.
Hopefully this recap has provided some new workflow pointers for 2021. Good luck!
Avid Media Composer offers a few add-on options, but two are considered gems by the editors that rely on them. ScriptSync and PhraseFind are essential for many drama and documentary editors who wield Media Composer keyboards every day. I’ve written about these tools in the past, including how you can get similar functionality in other NLEs. New transcription services, like Simon Says, make them more viable than ever for the average editor.
Driven by the script
Avid’s script-based editing, also called script integration, builds a representation of the script supervisor’s lined script directly into the Avid Media Composer workflow and interface. While often referred to as ScriptSync, Avid’s script integration is actually not the same thing. Script-based editing and script bins are part of the core Media Composer system and do not cost extra.
The concept originated with the Cinedco Ediflex NLE and migrated to Avid. In the regular Media Composer system, preparing a script bin and aligning takes to that script is a manual process, often performed by assistant editors who are part of a larger editorial team. Because it is labor-intensive, most individual editors working on projects that aren’t major feature films or TV series avoid this workflow.
Avid ScriptSync (a paid option) automates the script bin preparation process by aligning the spoken words in a take with the text lines of the written script. It does this using speech recognition technology licensed from Nexidia. This technology is based on phonemes, the sounds that are combined to create spoken words. Clips can be imported (transcoded into Avid MediaFiles) or linked.
Through automatic analysis of the audio within a take, ScriptSync can correlate a line in the script to its relative position within that take or within multiple takes. Once clips have been properly aligned to the written dialogue, ScriptSync is largely out of the picture. And so, in Avid’s script-based editing, the editor can then click on a line of dialogue within the script bin and see all of the coverage for that line.
Script integration with non-scripted content
You might think, “Great, but I’m not cutting TV shows and films with a script.” If you work in documentaries or corporate videos built around lengthy interviews, then script integration may have little meaning – unless you have transcripts. Getting long interviews transcribed can be costly and/or time-consuming. That’s where an automated transcription service like Simon Says comes in. There are certainly other, equally good services. However, Simon Says offers export options tailored for each NLE, including Avid Media Composer.
With a transcription available on a fast turnaround, it becomes easy to import an interview transcript into a Media Composer script bin and align clips to it. ScriptSync takes care of the automatic alignment, making script-based editing quick, easy, and painless – even for an individual editor without any assistants.
Finding that needle in the haystack
The second gem is PhraseFind, which builds upon the same Nexidia speech recognition technology. It’s a tool that’s even more essential for the documentary editor than script integration. PhraseFind (a paid option) is a phonetic search tool that analyzes the audio for clips within an Avid MediaFiles folder. Type in a word or phrase and PhraseFind will return a number of “hits” with varying degrees of accuracy.
The search is based on phonemes, so the results are words that “sound like” the search term. On one hand, this means that low-accuracy results may include unrelated finds that merely sound similar. On the other hand, you can enter a search word that is spelled differently or inaccurately, but as long as it still sounds the same, useful results will be returned.
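Nexidia’s phoneme engine is proprietary and far more sophisticated than anything shown here, but the general idea of sound-alike matching can be illustrated with the classic Soundex algorithm, a crude toy analogue that reduces a word to its first letter plus a code for the consonant sounds that follow:

```python
# Toy illustration of phonetic ("sounds like") matching using
# classic Soundex. This is NOT how Nexidia works; it just shows
# why differently spelled words can match the same search.
def soundex(word: str) -> str:
    """First letter plus three digits coding the consonant
    sounds; vowels separate sounds, 'h' and 'w' are ignored."""
    codes = {c: str(d) for d, letters in enumerate(
        ["bfpv", "cgjkqsxz", "dt", "l", "mn", "r"], start=1)
        for c in letters}
    word = word.lower()
    digits = []
    prev = codes.get(word[0])
    for ch in word[1:]:
        code = codes.get(ch)
        if code and code != prev:
            digits.append(code)
        if ch not in "hw":  # h/w don't separate repeated sounds
            prev = code
    return (word[0].upper() + "".join(digits) + "000")[:4]

# "Smith" and "Smyth" collapse to the same code, "S530",
# so a search for one would also surface the other.
```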
PhraseFind is very helpful in editing “Frankenbites.” Those are edits where sentences are ended in the middle, because a speaker went off on a tangent, or where different phrases are combined to complete a thought. Often you need to find a word that matches your edit point with the correct inflection, such as the end of a sentence. PhraseFind is great for these types of searches, since your only alternative is scouring multiple clips in search of a single word.
Working with the options
Script-based editing, ScriptSync, and PhraseFind are unique features that are only available in Avid Media Composer. No other NLE offers similar built-in features. Boris FX does offer Soundbite, a standalone equivalent to the PhraseFind technology licensed to them by Nexidia. It’s still available, but not actively promoted or developed. Adobe had offered Story as a way to integrate script-based editing into Premiere Pro. That feature is no longer available. So today, if you want the accepted standard for script and phonetic editing features, then Media Composer is where it’s at.
These are separate add-on options. You can pick one or the other or both (or neither) depending on your needs and style of work. They are activated through Avid Link. If you own multiple seats of Media Composer, then you can purchase one license of ScriptSync and/or PhraseFind and float them between Media Composers via Avid Link activation. While these tools aren’t for everyone, they add a new dimension to how you work as an editor. Many who’ve adopted them have never looked back.