Premiere Pro Workflow Tips

When you are editing on projects that only you touch, your working practices can be as messy as you want them to be. However, if you work on projects that need to be interchanged with others down the line, or you’re in a collaborative editing environment, good operating practices are essential. This starts at the moment you first receive the media and carries through until the project has been completed, delivered, and archived.

Any editor who’s worked with Avid Media Composer in a shared storage situation knows that it’s pretty rock solid and takes measures to ensure proper media relinking and management. Adobe Premiere Pro is very powerful, but much more freeform. Therefore, the responsibility for proper media management and editorial discipline falls to the user. I’ve covered some of these points in other posts, but it’s good to revisit workflow habits.

Folder templates. I like to keep things neat, and one way to ensure that is with project folder templates. You can use a tool like Post Haste to automatically generate a new set of folders for each new production – or you can simply design your own set of folders as a template layout and copy those for each new job. Since I’m working mainly in Premiere Pro these days, my folder template includes a Premiere Pro template project, too. This gives me an easy starting point that has been tailored to the kinds of narrative/interview projects I work on. Simply rename the root folder and the project for the new production (or let Post Haste do that for you). My layout includes folders for projects, graphics, audio, documents, exports, and raw media. I spend most of my time working at a multi-suite facility connected to a NAS shared storage system. There, the folders end up on the NAS volume and are accessible to all editors.
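
If you’d rather script the template than copy folders by hand, a few lines of Python can stamp out the same tree every time. Here’s a minimal sketch – the folder names mirror the layout described above, but they’re only an example, not a standard:

```python
#!/usr/bin/env python3
"""Stamp out a fresh folder tree for a new production."""
import sys
from pathlib import Path

# Example layout, mirroring the template described in the text.
TEMPLATE = [
    "Projects",    # Premiere Pro project files and auto-saves
    "Graphics",
    "Audio",
    "Documents",
    "Exports",
    "Raw_Media",   # dailies land here, organized Date/Camera/Card
]

def make_production(root: Path, name: str) -> None:
    for folder in TEMPLATE:
        (root / name / folder).mkdir(parents=True, exist_ok=True)

if __name__ == "__main__":
    # e.g. python new_production.py /Volumes/NAS Spring_Campaign_2017
    make_production(Path(sys.argv[1]), sys.argv[2])
```

Run it with the NAS volume and the new production’s name, then drop your Premiere Pro template project into the projects folder and rename it.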

Media preparation. When the crew comes back from the shoot, the first priority is to back up their files to an archive drive and then copy the files again to the storage used for editing – in my case a NAS volume. If we follow the folder layout described above, then those files get copied to the production dailies or raw media folder (whatever you’ve called it). Because Premiere Pro is very fluid and forgiving with all types of codecs, formats, and naming conventions, it’s easy to get sloppy and skip the next steps. DON’T. The most important thing for proper media linking is to have consistent locations and unique file names. If you don’t, then relinking in the future, moving the project into an application like Resolve for color correction/finishing, or some other process may fail to link to the correct files.
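
The copy step itself deserves verification, not just trust. Dedicated offload tools handle this with checksums, but the idea is simple enough to sketch in Python – the paths below are placeholders, and on a real shoot a purpose-built offload tool is the safer choice:

```python
#!/usr/bin/env python3
"""Copy a card to the NAS and verify every file by checksum."""
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def copy_and_verify(src_dir: Path, dst_dir: Path) -> None:
    for src in src_dir.rglob("*"):
        if not src.is_file():
            continue
        dst = dst_dir / src.relative_to(src_dir)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)  # copy2 preserves timestamps
        if sha256(src) != sha256(dst):
            raise IOError(f"Checksum mismatch: {dst}")

if __name__ == "__main__":
    copy_and_verify(Path("/Volumes/CARD_A001"),
                    Path("/Volumes/NAS/Production/Raw_Media/0722/A/01"))
```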

Premiere Pro works better when ALL of the media is in a single common format, like DNxHD/HR or ProRes. However, for most productions, the transcoding time involved would be unacceptable. A large production will often shoot with multiple camera formats (Alexa, RED, DSLRs, GoPros, drones, etc.) and generate several cards’ worth of media each day. My recommendation is to leave the professional format files alone (like RED or Alexa), but transcode the oddball clips, like those from DJI cameras. Many of these prosumer formats place the media into various folder structures or hide them inside a package container format. I will generally move these clips outside of that structure so they are easily accessible at the Finder level. Media from the cameras should be arranged in a folder hierarchy of Date, Camera, and Card. Coordinate with the DIT and you’ll often get the media already organized in this manner. Transcode files as needed and delete the originals if you like (as long as they’ve been backed up first).

Unfortunately, these prosumer cameras often use repeated, rather than unique, file names. Every card starts over with clip number 0001. That’s why we need to rename these files. Renaming professional format files is optional and can usually be skipped: Alexa files are fine to rename, but avoid renaming RED or P2 files. However, definitely rename DSLR, GoPro, and DJI clips. When renaming clips I use an app called Better Rename on the Mac, but any batch renaming utility will do. Follow a consistent naming convention. Mine is a descriptive abbreviation, month/day, camera, and card. So a shoot in Palermo on July 22, using the B camera, recorded on card 4, becomes PAL0722B04_. This is appended in front of the camera-generated clip name, so clip number 0057 becomes PAL0722B04_0057. You don’t need the year, because the folder location, general project info, or the embedded file info will tell you that.
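
The same convention is easy to script if you’d rather not click through a renaming utility each time. A minimal sketch, assuming one card’s clips are gathered in a single folder – the folder path and prefix parts are hypothetical and come from your own convention:

```python
#!/usr/bin/env python3
"""Prepend a location/date/camera/card prefix to every clip on a card."""
from pathlib import Path

def rename_card(card_dir: Path, location: str, monthday: str,
                camera: str, card: str) -> None:
    prefix = f"{location}{monthday}{camera}{card}_"
    for clip in sorted(card_dir.iterdir()):
        if not clip.is_file() or clip.name.startswith("."):
            continue  # skip folders and hidden files like .DS_Store
        if not clip.name.startswith(prefix):  # safe to re-run
            clip.rename(clip.with_name(prefix + clip.name))

if __name__ == "__main__":
    # Palermo, July 22, B camera, card 4: 0057.MOV -> PAL0722B04_0057.MOV
    rename_card(Path("/Volumes/NAS/Production/Raw_Media/0722/B/04"),
                "PAL", "0722", "B", "04")
```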

A quick word on renaming. Stick with universal alphanumeric conventions in both the file and folder names. Avoid symbols, emojis, etc.; otherwise, some systems will not be able to read the files. Don’t get overly lengthy with your names. Stick with upper and lower case letters, numbers, dashes, underscores, and spaces, and you’ll be fine.
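
That safe character set is also easy to police automatically before media gets imported. A small sketch that flags anything outside it – note the dot is allowed here, purely so file extensions pass:

```python
import re
from pathlib import Path

# Letters, numbers, spaces, dots, underscores and dashes only --
# the safe character set described above.
SAFE_NAME = re.compile(r"[A-Za-z0-9 ._-]+")

def flag_unsafe_names(root: str) -> None:
    for path in Path(root).rglob("*"):
        if not SAFE_NAME.fullmatch(path.name):
            print("Unsafe name:", path)

flag_unsafe_names("/Volumes/NAS/Production/Raw_Media")
```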

Project location. Premiere Pro has several basic file types that it generates with each project. These include the project file itself, Auto-saved project files, renders, media cache files and audio peak (.pek) files. Some of these are created in the background as new media is imported into the project. You can choose to store these anywhere you like on the system, although there are optimal locations.

Working on a NAS, there is no problem letting the project file, auto-saves, and renders stay in the same section of the NAS as all of your other media. I do this because it’s easy to back up the whole job at the end of the line and have everything in one place. However, you don’t want all the small, application-generated cache files to be there. While the location is a preference setting, it is highly recommended to have these media cache files go to the internal hard drive of the workstation or a separate, external local drive. The reason is that there are a lot of these small files, and that traffic on the NAS will tend to bog down overall performance. So set them to be local (the default).

The downside of doing this is that when another editor opens the Premiere Pro project on a different computer, these files have to be regenerated on that new system. The project will react sluggishly until this background process is complete. While this is a bit of a drag, it’s what Adobe recommends to keep the system operating well.

One other cache setting to be mindful of is the automatic delete option. A recent Premiere Pro problem cropped up when users noticed that original media was disappearing from their drives. Although this was a definite bug, the situation mainly affected users who had set the media cache to live alongside their original media files and had enabled automatic deletion. You are better off keeping the default location, but changing the deletion setting to manual. You’ll have to occasionally clean your caches by hand, but this is preferable to losing your original content.
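
Manual cleaning is easier when you can see how large the caches have grown. A quick sketch that totals them up – the path is the usual macOS default for Premiere Pro’s shared caches, but treat it as an assumption and confirm the actual location in your Media Cache preferences:

```python
from pathlib import Path

# Premiere Pro's usual shared cache location on macOS. This path is an
# assumption -- confirm yours under Preferences > Media Cache.
CACHE_ROOT = Path.home() / "Library/Application Support/Adobe/Common"

def size_gb(folder: Path) -> float:
    return sum(f.stat().st_size for f in folder.rglob("*")
               if f.is_file()) / 1e9

for name in ("Media Cache Files", "Media Cache"):
    folder = CACHE_ROOT / name
    if folder.exists():
        print(f"{name}: {size_gb(folder):.1f} GB")
```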

Premiere Pro project locking. A recent addition to Premiere Pro is project locking. This came about because of Team Projects, which are cloud-only shared project files. However, many facilities do not want their projects in the cloud, yet they can still take advantage of this feature. When project locking is enabled in Premiere Pro (every user on the system must do this), the application creates a temporary .prlock file next to the project file. This is intended to prevent other users from opening the same project and overwriting the original editor’s work and/or revisions.

Unfortunately, this only works correctly when you open a project from the launch window. Do not open the project by double-clicking the project file itself in the Finder to launch Premiere Pro and that project. If you open through the launch window, then Premiere Pro will prevent you from opening a locked project file. However, if you open through the Finder, then the locking system is circumvented, which can cause crashes and potentially lost work.
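
Because the lock is just a file sitting next to the project, it’s also trivial to survey a shared volume for locks before anyone opens anything. A minimal sketch – the .prlock extension comes from the behavior described above, though how much can be read from the lock file itself varies, so this only reports presence:

```python
from pathlib import Path

# Report any Premiere Pro lock files under a shared project folder.
# A .prlock file next to a project means someone has it open.
def report_locks(project_root: str) -> None:
    locks = list(Path(project_root).rglob("*.prlock"))
    if not locks:
        print("No locked projects found.")
    for lock in locks:
        print("Lock file present:", lock)

report_locks("/Volumes/NAS/Production/Projects")
```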

Project layout templates. As with folder layouts, I’m fond of using a template for my Premiere Pro projects, too. This way all projects have a consistent starting point, which is good when several editors are working collaboratively. You can certainly create multiple templates depending on the nature and specs of the job, e.g. commercials, narrative, 23.98, 29.97, etc. As with the folder layout, I’ll often use a leading underscore in a name to sort an item to the top of a list, or start the name with a “z” to sort it to the bottom. A lot of my work is interview-driven with supportive B-roll footage. Most of the time I’m cutting in 23.98fps, so that’s the example shown here.

My normal routine is to import the camera files (using Premiere Pro’s internal Media Browser) according to the date/camera/card organization described earlier. Then I’ll review the footage and rearrange the clips. Interview files go into an interview sources bin. I will add sub-bins in the B-roll section for general categories. As I review footage, I’ll move clips into their appropriate area, until the date/camera/card bins are empty and can be deleted from the project. Interviews will be grouped as multi-cam clips and edited to a single sequence for each person. This sequence gets moved into the Interview Edits sub-bin and becomes the source for any clips from this interview. I do a few other things before starting to edit, but that’s for another time and another post.

Working as a team. There are lots of ways to work collaboratively, so the concept doesn’t mean the same thing on every type of job. Sometimes it means different people working on the same job. Other times several editors access a common pool of media, but each works in their own discrete project. In any case, Premiere does not allow the same sort of flexibility that Media Composer or Final Cut Pro editors enjoy. You cannot have two or more editors working inside the same project file. You cannot open more than one project at a time. This means Premiere Pro editors need to think through their workflows in order to effectively share projects.

There are different strategies to employ. The easiest is to use the standard “save as” function to create alternate versions of a project. This is also useful for keeping project bloat low. As you edit on a project over a long period, you build up a lot of old “in progress” sequences. After a while, it’s best to save a copy and delete the older sequences. But the best approach is to organize a structure to follow.

As an example, let’s say a travel-style show covers several locations in an episode. Several editors and an assistant are working on it. The assistant would create a master project with all the footage imported and organized, interviews grouped/synced, and so on. At this point each editor takes a different location to cut that segment. There are two options. The first is to duplicate the project file for each location, open each one up, and delete the content that’s not for that location. The second option is to create a new project for each location and then import media from the master project using Media Browser. This is Adobe’s built-in module that enables the editor to access files, bins, and sequences from inside other Premiere Pro projects. When these are imported, there is no dynamic link between the two projects. The two sets of files/sequences are independent of each other.

Next, each editor cuts their own piece, resulting in a final sequence for each segment. Back in the master project, each edited sequence can be imported – again, using Media Browser – for the purposes of the final show build and tweaks. Since all of the media is common, no additional media files will be imported. Another option is to create a new final project and then import each sequence into it (using Media Browser). This will import the sequences and any associated media files. Then use the segment sequences to build the final show sequence and tweak as needed.

There are plenty of ways to use Premiere Pro and maintain editing versatility within a shared storage situation. You just have to follow a few rules for “best practices” so that everyone will “play nice” and have a successful experience.

Click here to download a folder template and enclosed Premiere Pro template project.

©2017 Oliver Peters

Bricklayers and Sculptors

One of the livelier hangouts on the internet for editors to kick around their thoughts is the Creative COW’s Apple Final Cut Pro X Debates forum. Part forum, part bar room brawl, it started as a place to discuss the relative merits (or not) of Apple’s FCP X. As such, the COW’s bosses allow a bit more latitude than in other forums. However, often threads derail into really thoughtful discussions about editing concepts.

Recently one of its frequent contributors, Simon Ubsdell, posted a thread called Bricklayers and Sculptors. In his words, “There are two different types of editors: Those who lay one shot after another like a bricklayer builds a wall. And those who discover the shape of their film by sculpting the raw material like a sculptor works with clay. These processes are not the same. There is no continuum that links these two approaches. They are diametrically opposed.”

Simon Ubsdell is the creative director, partner, and editor/mixer for London-based trailer shop Tokyo Productions. Ubsdell is also an experienced plug-in developer, having developed and/or co-developed the TKY, Tokyo, and Hawaiki effects plug-ins. But beyond that, Simon is one of the folks with whom I often have e-mail discussions regarding the state of editing today. We were both early adopters of FCP X who have since shifted almost completely to Adobe Premiere Pro. In keeping with the theme of his forum post, I asked him to share his ideas about how to organize an edit.

With Simon’s permission, the following are his thoughts on how best to organize editing projects in a way that keeps you immersed in the material and results in editing with greater assurance that you’ve made the best possible edit decisions.

________________________________________________

Simon Ubsdell – Bricklayers and Sculptors in practical terms

To avoid getting too general about this, let me describe a job I did this week. The producer came to us with a documentary that’s still shooting and only roughly “edited” into a very loose assembly – it’s the stories of five different women that will eventually be interwoven, but that hasn’t happened yet. As I say, extremely rough and unformed.

I grabbed all the source material and put it on a timeline. That showed me at a glance that there was about four hours of it in total. I put in markers to show where each woman’s material started and ended, which allowed me to see how much material I had for each of them. If I ever needed to go back to “everything”, it would make searching easier. (Not an essential step by any means.)

I duplicated that sequence five times to make sequences of all the material for each woman. Then I made duplicates of those duplicates and began removing everything I didn’t want. (At this point I am only looking for dialogue and “key sound”, not pictures which I will pick up in a separate set of passes.)

Working subtractively

From this point on I am working almost exclusively subtractively. A lot of people approach string-outs by adding clips from the browser – but here all my clips are already on the timeline and I am taking away anything I don’t want. This is for me the key part of the process because each edit is not a rough approximation, but a very precise “topping and tailing” of what I want to use. If you’re “editing in the Browser” (or in Bins), you’re simply not going to be making the kind of frame accurate edits that I am making every single time with this method.

The point to grasp here is that instead of “making bricks” for use later on, I am already editing in the strictest sense – making cuts that will stand up later on. I don’t have to select and then trim – I am doing both operations at the same time. I have my editing hat on, not an organizing hat. I am focused on a timeline that is going to form the basis of the final edit. I am already thinking editorially (in the sense of creative timeline-based editing) and not wasting any time merely thinking organizationally.

I should mention here that this is an iterative process – not just one pass through the material, but several. At certain points I will keep duplicates as I start to work on shorter versions. I won’t generally keep that many duplicates – usually just an intermediate “long version”, which has lost all the material I definitely don’t want. And by “definitely don’t want” I’m not talking about heads and tails that everybody throws away where the camera is being turned on or off or the crew are in shot – I am already making deep, fine-grained editorial and editing decisions that will be of immense value later on. I’m going straight to the edit point that I know I’ll want for my finished show. It’s not a provisional edit point – it’s a genuine editorial choice. From this point of view, the process of rejecting slates and tails is entirely irrelevant and pointless – a whole process that I sidestep entirely. I am cutting from one bit that I want to keep directly to the next bit I want to keep and I am doing so with fine-tuned precision. And because I am working subtractively I am actually incorporating several edit decisions in one – in other words, with one delete step I am both removing the tail from the outgoing clip and setting the start of the next clip.

Feeling the pacing and flow

Another key element here is that I can see how one clip flows into another – even if I am not going to be using those two clips side-by-side. I can already get a feel for the pacing. I can also start to see what might go where, so as part of this phase, I am moving things around as options start suggesting themselves. Because I am working in the timeline with actual edited material, those options present themselves very naturally – I’m getting offered creative choices for free. I can’t stress too strongly how relevant this part is. If I were simply sorting through material in a Browser/Bin, this process would not be happening or at least not happening in anything like the same way. The ability to reorder clips as the thought occurs to me and for this to be an actual editorial decision on a timeline is an incredibly useful thing and again a great timesaver. I don’t have to think about editorial decisions twice.

And another major benefit that is simply not available to Browser/Bin-based methods, is that I am constructing editorial chunks as I go. I’m taking this section from Clip A and putting it side-by-side with this other section from Clip A, which may come from earlier in the actual source, and perhaps adding a section from Clip B to the end and something from Clip C to the front. I am forming editorial units as I work through the material. And these are units that I can later use wholesale.

Another interesting spin-off is that I can very quickly spot “duplicate material”, by which I mean instances where the same information or sentiment is conveyed in more or less the same terms at different places in the source material. Because I am reviewing all of this on the timeline and because I am doing so iteratively, I can very quickly form an opinion as to which of the “duplicates” I want to use in my final edit.

Working towards the delivery target

Let’s step back and look at a further benefit of this method. Whatever your final film is, it will have the length that it needs to be – unless you’re Andy Warhol. You’re delivering a documentary for broadcast or theatrical distribution, or a short form promo or a trailer or TV spot. In each case you have a rough idea of what final length you need to arrive at. In my case, I knew that the piece needed to be around three minutes long. And that, of course, throws up a very obvious piece of arithmetic that it helps to know. I had five stories to fit into those three minutes, which meant that the absolute maximum of dialogue I would need would be just over 30 seconds from each story – 180 seconds split five ways is 36! The best way of getting to those 30 seconds is obviously subtractively.

I know I need to get my timeline of each story down to something approaching this length. Because I’m not simply topping and tailing clips in the Browser, but actually sculpting them on the timeline (and forming them into editorial units, as described above), I can keep a very close eye on how this is coming along for each story strand. I have a continuous read-out of how well I am getting on with reducing the material down to the target length. By contrast, if I approach my final edit with 30 minutes of loosely selected source material to juggle, I’m going to spend a lot more time on editorial decisions that I could have successfully made earlier.

So the final stage of the process in this case was simply to combine and rearrange the pre-edited timelines into a final timeline – a process that is now incredibly fast and a lot of fun. I’ve narrowed the range of choices right down to the necessary minimum. A great deal of the editing has literally already been done, because I’ve been editing from the very first moment that I laid all the material on the original timeline containing all the source material for the project.

As you can see, the process has been essentially entirely subtractive throughout – a gradual whittling down of the four hours to something closer to three minutes. This is not to say there won’t be additive parts to the overall edit. Of course, I added music, SFX, and graphics, but from the perspective of the process as a whole, this is addition at the most trivial level.

Learning to tell the story in pictures

There is another layer of addition that I have left out and that’s what happens with the pictures. So far I’ve only mentioned what is happening with what is sometimes called the “radio edit”. For the pictures, I will perform the exact same (sometimes iterative) subtractive process, whittling the entirety of the source material down to the shots I want to keep – again, this is obviously happening on a timeline or timelines. The real delight of this method is to review all the “pictures” without reference to the sound, because in doing so you can get a real insight into how the story can be told pictorially. I will often review the pictures having very, very roughly laid up some of the music tracks that I have planned on using. It’s amazing how this lets you gauge both whether your music suits the material and conversely whether the pictures are the right ones for the way you are planning to tell the story.

This brings me to a key point about how I personally work with this method: I plunge in and experiment even at the early stages of the project. For me, the key thing is to start to get a feel for how it’s all going to come together. This loose experimentation is a great way of approaching that. At some point in the experimentation something clicks and you can see the whole shape, or at the very least get a feeling for what it’s all going to look like. The sooner that click happens, the better you can work, because now you are not simply randomly sorting material, you are working towards a picture you have in your head. For me, that’s the biggest benefit of working in the timeline from the very beginning. You’re getting immersed in the shape of the material rather than just its content, and the immersion is what sparks the ideas. I’m not invoking some magical thinking here – I’m just talking about a method that’s proven itself time and time again to be the best and fastest way to unlock the doors of the edit.

Another benefit is that although one would expect this method to make it harder to collaborate, in fact the reverse is the case if each editor is conversant with the technique. You’re handing over vastly more useful creative edit information with this process than you could by any other means. What you’re effectively doing is “showing your workings” and not just handing over some versions. It means that the editor taking over from you can easily backtrack through your work and find new stuff and see the ideas that you didn’t end up including in the version(s) that you handed over. It’s an incredibly fast way for the new editor to get up to speed with the project without having to start from scratch by acquainting him or herself with where the useful material can be found.

Even on a more conventional level, I personally would far rather receive string-outs of selects than all the most carefully organized Browser/Bin info you care to throw at me. Obviously if I’m cutting a feature, I want to be able to find 323T14 instantly, but beyond that most basic level, I have no interest in digging through bins or keyword collections or whatever else you might be using, as that’s just going to slow me down.

Freeing yourself of the Browser/Bins

Another observation about this method is how it relates to the NLE interface. When I’m working with my string-outs, which is essentially 90% of the time, I am not ever looking at the Browser/Bins. Accordingly, in Premiere Pro or Final Cut Pro X, I can fully close down the Project/Browser windows/panes and avail myself of the extra screen real estate that gives me, which is not inconsiderable. The consequence of that is to make the timeline experience even more immersive and that’s exactly what I want. I want to be immersed in the details of what I’m doing in the timeline and I have no interest in any other distractions. Conversely, having to keep going back to Bins/Browser means shifting the focus of attention away from my work and breaking the all-important “flow” factor. I just don’t want any distractions from the fundamentally crucial process of moving from one clip to another in a timeline context. As soon as I am dragged away from that, there is a discontinuity in what I am doing.

The edit comes to shape organically

I find that there comes a point, if you work this way, when the subsequence you are working on organically starts to take on the shape of the finished edit and it’s something that happens without you having to consciously make it happen. It’s the method doing the work for you. This means that I never find myself starting a fresh sequence and adding to it from the subsequences and I think that has huge advantages. It reinforces my point that you are editing from the very first moment when you lay all your source material onto one timeline. That process leads without pause or interruption to the final edit through the gradual iterative subtraction.

I talked about how the iterative sifting process lets you see “duplicates”, that’s to say instances where the same idea is repeated in an alternative form – and that it helps you make the choice between the different options. Another aspect of this is that it helps you to identify what is strong and what is not so strong. If I were cutting corporates or skate videos this might be different, but for what I do, I need to be able to isolate the key “moments” in my material and find ways to promote those and make them work as powerfully as possible.

In a completely literal sense, when you’re cutting promos and trailers, you want to create an emotional, visceral connection to the material in the audience. You want to make them laugh or cry, you want to make them hold their breath in anticipation, or gasp in astonishment. You need to know how to craft the moments that will elicit the response you are looking for. I find that this method really helps me identify where those moments are going to come from and how to structure everything around them so as to build them as strongly as possible. The iterative sifting method means you can be very sure of what to go for and in what context it’s going to work the best. In other words, I keep coming back to the realization that this method is doing a lot of the creative work for you in a way that simply won’t happen with the alternatives. Even setting aside the manifest efficiency, it would be worth it for this alone.

There’s a huge amount more that I could say about this process, but I’ll leave it there for now. I’m not saying this method works equally well for all types of projects. It’s perhaps less suited to scripted drama, for instance, but even there it can work effectively with certain modifications. Like every method, every editor wants to tweak it to their own taste and inclinations. The one thing I have found to its advantage above all others is that it almost entirely circumvents the problem of “what shot do I lay down next?” Time and again I’ve seen Browser/Bin-focused editors get stuck in exactly this way and it can be a very real block.

– Simon Ubsdell

For an expanded version of this concept, check out Simon’s in-depth article at Creative COW. Click here to link.

For more creative editing tips, click on this link for Film Editor Techniques.

©2017 Oliver Peters

Five Came Back

We know them today as the iconic Hollywood directors who brought us such classic films as Mr. Smith Goes To Washington, It’s a Wonderful Life, The African Queen, and The Man Who Shot Liberty Valance – just to name a few. John Ford, William Wyler, John Huston, Frank Capra and George Stevens also served their country on the ground in World War II, bringing its horrors and truth to the American people through film. In Netflix’s new three-part documentary series, based on Mark Harris’ best-selling book, Five Came Back: A Story of Hollywood and the Second World War, contemporary filmmakers explore the extraordinary story of how Hollywood changed World War II – and how World War II changed Hollywood – through the interwoven experiences of these five legendary filmmakers.

This documentary series features interviews with Steven Spielberg, Francis Ford Coppola, Guillermo Del Toro, Paul Greengrass and Lawrence Kasdan, who add their own perspectives on these efforts. “Film was an intoxicant from the early days of the silent movies,” says Spielberg in the opening moments of Five Came Back. “And early on, Hollywood realized that it had a tremendous tool or weapon for change, through cinema.” Adds Coppola, “Cinema in its purest form could be put in the service of propaganda. Hitler and his minister of propaganda Joseph Goebbels understood the power of the cinema to move large populations toward your way of thinking.”

Five Came Back is directed by Laurent Bouzereau, written by Mark Harris and narrated by Meryl Streep. Bouzereau and his team gathered over 100 hours of archival and newsreel footage; watched over 40 documentaries and training films directed and produced by the five directors during the war; and studied 50 studio films and over 30 hours of outtakes and raw footage from their war films to bring this story to Netflix audiences. Says director Laurent Bouzereau, “These filmmakers, at that time, had a responsibility in that what they were putting into the world would be taken as truth. You can see a lot of echoes in what is happening today. It became clear as we were doing this series that the past was re-emerging in some ways, including the line we see that separates cinema that exists for entertainment and cinema that carries a message. And politics is more than ever a part of entertainment. I find it courageous of filmmakers then, as with artists today, to speak up for those who don’t have a platform.”

An editor’s medium

As every filmmaker knows, documentaries are truly an editor’s medium. Key to telling this story was Will Znidaric, the series editor. Znidaric spent the first sixteen years of his career as a commercial editor in New York City before heading to Los Angeles, in a move to become more involved in narrative projects and hone his craft. This move led to a chance to cut the documentary Winter on Fire: Ukraine’s Fight for Freedom. Production and post for that film was handled by LA’s Rock Paper Scissors Entertainment, a division of the Rock Paper Scissors post facility. RPS is co-owned by Oscar-winning editor Angus Wall (The Social Network, The Girl with the Dragon Tattoo). Wall, along with Jason Sterman and Linda Carlson, was an executive producer on Winter on Fire for RPS. The connection was a positive experience, so when RPS got involved with Five Came Back, Wall tapped Znidaric as its editor. Much of the same post team worked on both of these documentaries.

I recently interviewed Will Znidaric about his experience editing Five Came Back. “I enjoyed working with Angus,” he explains. “We edited and finished at Rock Paper Scissors over a fifteen month period. They are structured to encourage creativity, which was great for me as a documentary editor. Narratively, this story has five main characters who are on five individual journeys. The canvas is civilization’s greatest conflict. You have to be clear about the war in order to explain their context. You have to be able to find the connections to weave a tapestry between all of these elements. This came together thanks to the flow and trust that was there with Laurent [Bouzereau, director]. The unsung hero is Adele Sparks, our archival producer, who had to find the footage and clear the rights. We were able to generally get rights to acquire the great majority of the footage on our wish list.”

Editing is paleontology

Znidaric continues, “In a documentary like this, editing is a lot like paleontology – you have to find the old bones and reconstruct something that’s alive. There was a lot of searching through newsreels of the day, which was interesting thematically. We all look at the past through the lens of history, but how was the average American processing the events of that world during that time? Of course, those events were unfolding in real time for them. It really makes you think about today’s films and how world events have an impact on them. We had about 100 hours of archival footage, plus studio films and interviews. For eight to nine months we had our storyboard wall with note cards for each of the films. As more footage came in, you could chart the growth through the cards.”

Five Came Back was constructed using three organizing principles: 1) the directors’ films before the war, 2) their documentaries during the war, and 3) their films after the war. According to Znidaric, “We wanted to see how the war affected their work after the war. The book was our guide for causality and order, so I was able to build the structure of the documentary before the contemporary directors were interviewed. I was able to do so with the initial interview with the author, Mark Harris. This way we were able to script an outline to follow. Interview footage of our actual subjects from a few decades ago was also a key element used to tell the story. In recording the modern directors, we wanted to give them space – they are masters – we just needed to make sure we got certain story beats. Their point of view is unique in the sense that they are providing their perspective on their heroes. At the beginning, we have one modern director talking about one of our subject directors. Then that opens up over the three hours, as each talks a little bit about all of these filmmakers.”

From Moviola to Premiere Pro

This was the first film that Znidaric had edited using Adobe Premiere Pro. He says, “During film school, I got to cut 16mm on the Moviola, but throughout my time in New York, I worked on [Avid] Media Composer and then later [Apple] Final Cut Pro 7. When Final Cut Pro X came out, I just couldn’t wrap my head around it, so it was time to shift over to Premiere Pro. I’m completely sold on it. It was a dream to work with on this project. At Rock Paper Scissors, my associate editor James Long and I were set up in two suites. We had duplicate drives of media – not a SAN – which was simply down to how the suites were wired. It worked out well for us, but forced us to be extremely diligent with how our media was organized and with maintaining that throughout.” The suites were configured with 6-core 2013 Mac Pros, AJA IoXT boxes and Mackie Big Knob mixers for playback.

“All of the media was first transcoded to ProRes, which I believe is one of the reasons that the systems were rock solid during that whole time. There’s an exemplary engineering department at RPS, and they have a direct line to Adobe, so if there were any issues, they became the go-betweens. That way I could stay focused on the creative and not get bogged down with technical issues. Plus, James [Long] would generally handle issues of a technical nature. All told, it was very minimal. The project ran quite smoothly.” To stay on the safe side, the team did not update their versions of Premiere Pro during this time frame, opting to stick with Premiere Pro CC2015 for the duration. Because of the percentage of archival footage, Five Came Back was finished as HD and not in 4K, as are a number of other Netflix shows.

To handle Premiere Pro projects over the course of fifteen months, Znidaric and Long would transfer copies of the project files on a daily basis between the rooms. Znidaric continues, “There were sequences for individual ‘mini-stories’ inside the film. I would build these and then combine the stories. As the post progressed, we would delete some of the older sequences from the project files in order to keep them lean. Essentially we had a separate Premiere Pro project file for each day, therefore, at any time we could go back to an earlier project file to access an older sequence, if needed. We didn’t do much with the other Creative Cloud tools, since we had Elastic handling the graphics work. I would slug in raw stills or placeholder cards for maps and title cards. That way, again, I could stay focused on weaving the complex narrative tapestry.”

Elastic developed the main title and a stylistic look for the series, while a52 handled color correction and finishing. Elastic and a52 are part of the Rock Paper Scissors group. Znidaric explains, “We had a lot of discussions about how to handle photos, stills, flyers, maps, dates and documents. The reality of filming under the stress of wartime and combat creates artifacts like scratches, film burn-outs and so on. These became part of our visual language. The objective was to create new graphics that would be true to the look and style of the archival footage.” The audio mix went out-of-house to Monkeyland, a Los Angeles audio post and mixing shop.

Five Came Back appealed to the film student side of the editor. Znidaric wrapped up our conversation with these thoughts. “The thrill is that you are learning as you go through the details. It’s mind-blowing and the series could easily have been ten hours long. We are trying to replicate a sense of discovery without the hindsight of today’s perspective. This was fun because it was like a graduate level film school. Most folks have seen some of the better known films, but many of these films aren’t as recognized these days. Going through them is a form of ‘cinematic forensics’. You find connections tied to the wartime experience that might not otherwise be as obvious. This is great for a film geek like me. Hopefully many viewers will rediscover some of these films by seeing this documentary series.”

The first episode of Five Came Back aired on Netflix on March 31. In conjunction with the launch of Five Came Back, Netflix will also present thirteen documentaries discussed in the series, including Ford’s The Battle of Midway, Wyler’s The Memphis Belle: A Story of a Flying Fortress, Huston’s Report from the Aleutians, Capra’s The Battle of Russia, Stevens’ Nazi Concentration Camps, and Stuart Heisler’s The Negro Soldier.

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters

A quarter-century for Premiere Pro

I don’t normally plug manufacturers’ promotional marketing events, but this one seems especially noteworthy. At the end of last year, Adobe Premiere Pro hit its 25th anniversary. It launched in November 1991 as simply Premiere and has gone through numerous iterations – from Premiere to Premiere Pro, CS and now CC. Premiere Pro in all of its versions has always been popular software, judging by the number of units in the field. However, it’s only been in recent years that this NLE has attracted the attention and respect of top tier editors – and along with that, a legion of editors who now consider it their “go to” editing application. So, this event seems too good not to pass along.

To commemorate this quarter-century milestone, Adobe is kicking off Premiere Pro’s 25th Anniversary today, celebrating through a special contest with the help of Imagine Dragons. The Grammy-winning band has teamed up with Adobe to give fans and aspiring producers the chance to co-create a music video. In an industry first, Imagine Dragons is offering total access to the raw footage shot for their music video of Believer, which was posted on YouTube March 7. At this writing, it’s already garnered over seven million views.

Using these video clips, fans can cut their own version in Premiere Pro (and Creative Cloud) to enter Adobe’s Make the Cut contest. Entries will be judged by a panel of industry pros, including Angus Wall (Zodiac, The Curious Case of Benjamin Button, The Social Network, The Girl with the Dragon Tattoo); Kirk Baxter (The Curious Case of Benjamin Button, The Social Network, The Girl with the Dragon Tattoo, Gone Girl); Bill Fox (Straight Outta Compton, Hustle & Flow, Band of Brothers); Matt Eastin (director for Believer); Vinnie Hobbs, an award-winning music video editor who has worked with Kendrick Lamar and Britney Spears; Imagine Dragons; and Ann Lewnes (Adobe CMO).

The winner of the contest will claim a Grand Prize of $25,000. Adobe will also award bonus prizes of $1,000 each and a year-long Creative Cloud subscription in four special categories:

Fan Favorite: The most liked video by fans on the Adobe Creative Cloud Channel on YouTube.

Most Unexpected: No specific criteria, but knock their socks off.

Best Young Creator: The best up and coming editor under 25 years old.

Best Short Form: The most impressive video that’s 30-60 seconds long.

Finally, one special bonus prize of $2,500, a year-long subscription to Creative Cloud, and 25 Adobe Stock credits, will go to the cut with the best use of supplied Adobe Stock clips.

If you’re up for the challenge, head over to Adobe’s Make the Cut contest website for more details and to enter.

From the site: “Download exclusive, uncut music video footage and work with Adobe Premiere Pro CC to create your own edit of the video for their new hit song Believer. You’ll have 25 days to make your cut and show the world your editing chops—deadline is April 8th.” Good luck!

EDIT: The contest has closed and you can vote for a fan favorite among the Top 25 Finalists here.

©2017 Oliver Peters

Audio Splits and Stems in Premiere Pro

When TV shows and feature films are being mixed, the final deliverables usually include audio stems, either as separate audio files or married to a multi-channel video master file or tape. Stems are the isolated submix channels for dialogue, sound effects and music. These elements are typically called DME (dialogue, music, effects) stems or splits, and a multi-channel master file that includes them is usually called a split-track submaster. These isolated tracks are normally at mix level, meaning that if you combine them, the sum should equal the final composite mixed track in both level and mix.

The benefit of having such stems is that you can easily replace elements, like re-recording dialogue in a different language, without having to dive back into the original audio project. The simplest form is to have 3 stereo stem tracks (6 mono tracks) for left and right dialogue, sound effects and music. Obviously, if you have a 5.1 surround mix, you’ll end up with a lot more tracks. There are also other variations for sports or comedy shows. For example, sports shows often isolate the voice-over announcer material from on-camera dialogue. Comedy shows may isolate the laugh track as a stem. In these cases, rather than 3 stereo DME stems, you might have 4 or more. In other cases, the music and effects stems are combined to end up with a single stereo M&E track (music and effects, minus dialogue).

Although this is common practice for entertainment programming, it should also be common practice if you work in short films, corporate videos or commercials. Creating such split-track submasters at the time you finish your project can often save your bacon at some point down the line. I ran into this during the past week. A large corporate client needed to replace the music tracks on 11 training videos. These videos were originally edited in 2010 using Final Cut Pro 7 and mixed in Pro Tools. Although it may have been possible to resurrect the old project files, doing so would have been problematic. However, in 2010, I had exported split-track submasters with the final picture and isolated stereo tracks for dialogue, sound effects and music. These became the new source for our edit – now 6 years later. Since I am editing these in Premiere Pro CC, it is important to also create new split-track submasters, with the revised music tracks, should we ever need to do this again in the future.

Setting up a new Premiere Pro sequence 

I’m usually editing in either Final Cut Pro X or Premiere Pro CC these days. It’s easy to generate a multi-channel master file with isolated DME stems in FCP X by using the Roles function. However, to do this, you need to make sure you properly assign the correct Roles from the get-go. Assuming that you’ve done this for dialogue, sound effects and music Roles on the source clips, the stems become self-sorting upon export – based on how you route a Role to its corresponding export channel. When it comes to audio editing and mixing, I find Premiere Pro CC’s approach more to my liking. This process is relatively easy in Premiere, too; however, you have to set up a proper sequence designed for this type of audio work. That’s better than trying to sort it out at the end of the line.

The first thing you’ll need to do is create a custom preset. By default, sequence presets are configured with a certain number of tracks routed to a stereo master output. This creates a 2-channel file on export. Start by changing the track configuration to multi-channel and set the number of output channels. My requirement is to end up with an 8-channel file that includes a stereo mix, plus stereo stems for isolated dialogue, sound effects and music. Next, add the number of tracks you need and assign them as “standard” for the regular tracks or “stereo submix” for the submix tracks.

This is a simple example with 3 regular tracks and 3 submix tracks, because this was a simple project. A more complex project would have more regular tracks, depending on how much overlapping dialogue, sound effects or music you are working with on the timeline. For instance, some editors like to set up “zones” for types of audio. You might decide to have 24 timeline tracks, with 1-8 used for dialogue, 9-16 for sound effects and 17-24 for music. In this case, you would still only need 3 submix tracks for the aggregate of the dialogue, sound effects and music.

Rename the submix tracks in the timeline. I’ve renamed Submix 1-3 as DIA, SFX and MUS for easy recognition. With Premiere Pro, you can mix audio in several different places, such as the clip mixer or the audio track mixer. Go to the audio track mixer and assign the channel output and routing. (Channel output can also be assigned in the sequence preset panel.) For each of the regular tracks, I’ve set the pulldown for routing to the corresponding submix track: Audio 1 to DIA, Audio 2 to SFX and Audio 3 to MUS. The 3 submix tracks are all routed to the Master output.

The last step is to properly assign channel routing. With this sequence preset, master channels 1 and 2 will contain the full mix. First, when you export a 2-channel file as a master file or a review copy, by default only the first 2 output channels are used, so these will always get the mix without you having to change anything. Second, most of us tend to edit with stereo monitoring systems. Again, output channels 1 and 2 are the default, which means you’ll always be monitoring the full mix, unless you make changes or solo a track. Output channels 3-8 correspond to the stereo stems. Therefore, to enable this to happen automatically, you must assign the channel output in the following configuration: DIA (Submix 1) to 1-2 and 3-4, SFX (Submix 2) to 1-2 and 5-6, and MUS (Submix 3) to 1-2 and 7-8. The result is that everything goes to both the full mix, as well as the isolated stereo channels for each audio component – dialogue, sound effects and music.

Editing in the custom timeline

Once you’ve set up the timeline, the rest is easy. Edit any dialogue clips to track 1, sound effects to track 2 and music to track 3. In a more complex example, like the 24-track timeline I referred to earlier, you’d work in the “zones” that you had organized. If 1-8 are routed to the dialogue submix track, then you would edit dialogue clips only to tracks 1-8, and the same for the corresponding sound effects and music tracks. Clip levels can still be adjusted as you normally would. But, by having submix tracks, you can adjust the level of all dialogue by moving the single DIA submix fader in the audio track mixer. This can also be automated. If you want a common filter added to all of one stem – like a compressor across all sound effects – simply assign it from the pulldown within that submix channel strip.

Exporting the file

The last step is exporting your split-track submaster file. If this isn’t correct, the rest was all for naught. The best formats to use are either a QuickTime ProRes file or one of the MXF OP1a choices. In the audio tab of the export settings panel, change the pulldown channel selection from Stereo to 8 channels. Now each of your timeline output channels will be exported as a separate mono track in the file. These correspond to your 4 stereo mix groups – the full mix plus stems. Now, in one single, neat file, you have the final image and mix, along with the isolated stems that can facilitate easy changes down the road. Depending on the nature of the project, you might also want to export versions with and without titles for an extra level of future-proofing.
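
If you later need the stems as separate files and only have the submaster, the channel layout above makes them easy to pull apart with a script. A minimal sketch, assuming the 8 channels have first been extracted from the ProRes or MXF master into a 16-bit PCM WAV (the file name is a placeholder). Since the stems are at mix level, the script also null-tests their sum against the mix as a sanity check:

```python
import wave
import numpy as np

# Output channel layout from the sequence preset described above:
# 1-2 full mix, 3-4 dialogue, 5-6 effects, 7-8 music (0-based here).
PAIRS = {"mix": (0, 1), "dialogue": (2, 3), "sfx": (4, 5), "music": (6, 7)}

with wave.open("submaster_8ch.wav", "rb") as w:
    assert w.getnchannels() == 8 and w.getsampwidth() == 2  # 16-bit PCM
    rate = w.getframerate()
    samples = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16)
    samples = samples.reshape(-1, 8)  # one row per frame, 8 channels

# Write each stereo pair out as its own 2-channel WAV file.
for name, (left, right) in PAIRS.items():
    with wave.open(f"{name}.wav", "wb") as out:
        out.setnchannels(2)
        out.setsampwidth(2)
        out.setframerate(rate)
        out.writeframes(samples[:, [left, right]].tobytes())

# DIA + SFX + MUS should null against the full mix (within rounding).
# A large number here means the routing or export went wrong.
stem_sum = (samples[:, 2:4].astype(np.int32)
            + samples[:, 4:6] + samples[:, 6:8])
print("Max deviation from mix:", np.abs(stem_sum - samples[:, 0:2]).max())
```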

Reusing the file

If you decide to use this exported submaster file at a later date as a source clip for a new edit, simply import it into Premiere Pro like any other form of media. However, because its channel structure will be read as 8 mono channels, you will need to modify the file using the Modify > Audio Channels contextual menu (right-click the clip). Change the clip channel format from Mono to Stereo, which turns your 8 mono channels back into the left and right sides of 4 stereo channels. You may then ignore the remaining “unassigned” clip channels. Do not change any of the check boxes.

Hopefully, by following this guide, you’ll find that creating timelines with stem tracks becomes second nature. It can sure help you years later, as I found out yet again this past week!

©2016 Oliver Peters

Swiss Army Man

When it comes to quirky movies, Swiss Army Man stands alone. Hank (Paul Dano) is a castaway on a deserted island at his wit’s end. In an act of final desperation, he’s about to hang himself when he discovers Manny (Daniel Radcliffe), a corpse that’s just washed up on shore. At this point the film diverges from the typical castaway/survival story into an absurdist comedy. Manny can talk and has “magical powers” that Hank uses to find his way back to civilization.

Swiss Army Man was conceived and directed by the writing and directing duo of Dan Kwan and Daniel Sheinert, who work under the moniker Daniels. This is their feature-length film debut and was produced with Sundance in mind. The production company brought on Matthew Hannam to edit the film. Hannam (The OA, Enemy, James White) is a Canadian film and TV editor with numerous features and TV series under his belt. I recently spoke with Hannam about the post process on Swiss Army Man.

Hannam discussed the nature of the film. “It’s a very handmade film. We didn’t have a lot of time to edit and had to make quick decisions. I think that really helped us. This was the dozenth or so feature for me, so in a way I was the veteran. It was fun to work with these guys and experience their creative process. Swiss Army Man is a very cinematically-aware film, full of references to other famous films. You’re making a survival movie, but it’s very aware that other survival movies exist. This is also a very self-reflexive film and, in fact, the model is more like a romantic comedy than anything else. So I was a bit disappointed to see a number of the reviews focus solely on the gags in the film, particularly around Manny, the corpse. There’s more to it than that. It’s about a guy who wonders what it might be like had things been different. It’s a very special little film, because the story puts us inside of Hank’s head.”

Unlike the norm for most features, Hannam joined the team after the shooting had been completed. He says, “I came on board during the last few days of filming. They shot for something like 25 days. This was all single-camera work with Larkin Seiple (Cop Car, Bleed For This) as director of photography. They shot ARRI ALEXA XT with Cooke anamorphic lenses. It was shot ARRIRAW, but for the edit we had a special LUT applied to the dailies, so the footage was already beautiful. I got a drive in August and the film premiered at Sundance. That’s a very short post schedule, but our goal was always Sundance.”

Shifting to Adobe tools

Like many of this year’s Sundance films, Adobe Premiere Pro was the editing tool of choice. Hannam continues, “I’m primarily an Avid [Media Composer] editor and the Dans [Kwan and Sheinert] had been using [Apple] Final Cut Pro in the past for the shorts that they’ve edited themselves. They opted to go with Premiere on this film, as they thought it would be easiest to go back and forth with After Effects. We set up a ‘poor man’s’ shared storage with multiple systems that each had duplicate media on local drives. Then we’d use Dropbox to pass around project files and shared elements, like sound effects and temp VFX. While the operation wasn’t flawless – we did experience a few crashes – it got the job done.”

Swiss Army Man features quite a few visual effects shots and Hannam credits the co-directors’ music video background with making this a relatively easy task. He says, “The Dans are used to short turnarounds in their music video projects, so they knew how to integrate visual effects into the production in a way that made it easier for post. That’s also the beauty of working with Premiere Pro. There’s a seamless integration with After Effects. What’s amazing about Premiere is the quality of the built-in effects. You get effects that are actually useful in telling the story. I used the warp stabilizer and timewarp a lot. In some cases those effects made it possible to use shots in a way that was never possible before. The production company partnered with Method for visual effects and Company 3 [Co3] for color grading. However, about half of the effects were done in-house using After Effects. On a few shots, we actually ended up using After Effects’ stabilization after final assembly, because it was that much better than what was possible during the online assembly of the film.”

Another unique aspect of Swiss Army Man is its musical score. Hannam explains, “Due to the tight schedule, music scoring proceeded in parallel with the editing. The initial temp music pulled was quirky, but didn’t really match the nature of the story. Once we got the tone right with the temp tracks, scenes were passed on to the composers – Andy Hull and Robert McDowell – who Daniels met while making a video for their band Manchester Orchestra. The concept for the score was that it was all coming from inside of Hank’s head. Andy sang all the music as if Hank was humming his own score. They created new tracks for us and by the end we had almost no temp music in the edit. Once the edit was finalized, they worked with Paul [Dano] and Daniel [Radcliffe] to sing and record the parts themselves. Fortunately both are great singers, so the final a cappella score is actually the lead actors themselves.”

Structuring the edit

Matthew Hannam and I discussed his approach to editing scenes, especially with this foray into Premiere Pro. He responds, “When I’m on Media Composer, I’m a fan of ScriptSync. It’s a great way to know what coverage you have. There’s nothing like that in Premiere, although I did use the integrated Story app. This enables you to load the script into a tab for quick access. Usually my initial approach is to sit down and watch all the footage for the particular scene while I plan how I’m going to assemble it. The best way to know the footage is to work with it. You have to watch how the shoot progresses in the dailies. Listen to what the director says at the end of a take – or if he interrupts in the middle – and that will give you a good idea of the intention. Then I just start building the scene – often first from the middle. I’m looking for what is the central point of that scene and it often helps to build from the middle out.”

Although Hannam doesn’t use any tricks to organize his footage or create selects, he does use “KEM rolls”. This term stems from the KEM flatbed film editing table. In modern parlance, it means that the editor has strung out all the footage for a scene into a single timeline, making it easy to scrub through all the available footage quickly. He continues, “I’ll build a dailies reel and tuck it away in the bottom of the bin. It’s a great way to quickly see what footage you have available. When it’s time to revise a scene, it’s good to go back to the raw footage and see what options you have. It is a quick way to jog your memory about what was shot.”

A hybrid post workflow

Another integral member of the post team was assistant editor Kyle Gilbertson. He had worked with the co-directors previously and was the architect of the hybrid post workflow followed on this film. Gilbertson pulled all of the shots for VFX that were being handled in-house. Many of the more complicated montages were handled as effects sequences and the edit was rebuilt in DaVinci Resolve before re-assembly in After Effects. Hannam explains, “We had two stages of grading with [colorist] Sofie Borup at Co3. The first was to set looks and get an idea of what the material was going to look like once finished. Then, once everything was complete, we combined all of the material for final grading and digital intermediate mastering. There was a real moment of truth when the 100 or so shots that Daniels did themselves were integrated into the final cut. Luckily it all came together fairly seamlessly.”

“Having finished the movie, I look back at it and I’m full of warm feelings. We just dove into it as a big team. The two Dans, Kyle, and I were in that room operating as a single unit. We shifted roles and kept everything very open. I believe the end product reflects that. It’s a film that took inspiration from everywhere and everyone. We were not setting out to be weird or gross. The idea was to break down an audience and make something that everyone could enjoy and be won over by. In the end, it feels like we really took a step forward with what was possible at home. We used the tools we had available to us and we made them work. It makes me excited that Adobe’s Creative Cloud software tools were enough to get a movie into 700 cinemas and win those boys the Sundance Directing prize. We’re at a point in post where you don’t need a lot of hardware. If you can figure out how to do it, you can probably make it yourself. That was our philosophy from start to finish on the movie.”

Originally written for Digital Video magazine / Creative Planet Network.

©2016 Oliver Peters

NLE as Post Production Hub

As 2009 closed, I wrote a post about Final Cut Studio as the center of a boutique post production workflow. A lot has changed since then, but that approach is still valid and a number of companies can fill those shoes. In each case, rather than being the complete, self-contained tool, the editing application becomes the hub of the operation. Other applications surround it, and the workflow tends to go from NLE to support tool and back for delivery. Here are a few solutions.

Adobe Premiere Pro CC

No current editing package comes as close to the role of the old Final Cut Studio as does Adobe’s Creative Cloud. You get nearly all of the creative tools under a single subscription, and facilities with a team account can equip every room with the full complement of applications. When designed correctly, workflows in any room can shift from edit to effects to sound to color correction – according to the load. In a shared storage operation, projects can stay in a single bay for everything or shift from bay to bay based on operator specialty and talent.

While there are many tools in the Creative Cloud kit, the primary editor-specific applications are Premiere Pro CC, After Effects CC and Audition CC. For most editors, Photoshop CC and Adobe Media Encoder are also givens. On the other hand, I don’t know too many folks using Prelude CC, so I can’t say what the future holds for that tool – especially since the next version of Premiere Pro includes built-in proxy transcoding. Likewise, as more of SpeedGrade CC’s color correction tools make it into Premiere Pro, it’s clear that SpeedGrade itself is getting very little love. The low-cost market for outboard color correction software has largely been lost to DaVinci Resolve (free). For now, SpeedGrade is really “dead man walking”. I’d be surprised if it’s still around by mid-2017. That might also be the case for Prelude.
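
For readers unfamiliar with what a proxy pass involves, the gist is simple: walk the raw media and write a small, edit-friendly copy of each clip, mirroring the folder structure. Premiere Pro’s upcoming feature does this internally; the sketch below shows the same idea done externally with Python and ffmpeg. It assumes ffmpeg is on the path, and the folder names, extensions and codec settings are arbitrary choices of mine, not Adobe’s.

```python
import subprocess
from pathlib import Path

# Hypothetical locations; adjust to your own folder template.
SOURCE = Path("/Volumes/NAS/Production/Media")
PROXIES = Path("/Volumes/NAS/Production/Proxies")
EXTENSIONS = {".mov", ".mp4", ".mxf"}

for clip in sorted(SOURCE.rglob("*")):
    if clip.suffix.lower() not in EXTENSIONS:
        continue
    out = (PROXIES / clip.relative_to(SOURCE)).with_suffix(".mp4")
    if out.exists():
        continue  # already transcoded on an earlier pass
    out.parent.mkdir(parents=True, exist_ok=True)
    # 720p H.264 proxy; scale=-2:720 keeps the width divisible by 2.
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-vf", "scale=-2:720",
        "-c:v", "libx264", "-crf", "23",
        "-c:a", "aac", "-b:a", "128k",
        str(out),
    ], check=True)
```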

Many editors I know who are heavy into graphics and visual effects do most of that work in After Effects. With Creative Cloud and Dynamic Link, there’s a natural connection between the Premiere Pro timeline and After Effects. A similar tie exists between Premiere Pro and Audition. I find the latter to be a superb audio post application that, in my experience, provides the best transfer of a Premiere Pro timeline into any audio application. This connection is being further enhanced by the updates coming from Adobe this year.

Rounding out the package is Photoshop CC, of course. While most editors are not big Photoshop artists, it’s worth noting that the application can also create animated motion graphics. For example, if you want to build an animated lower third banner, it can be done completely inside of Photoshop without ever needing to step into After Effects. Drop the file onto a Premiere Pro timeline and it’s complete with animation and proper transparency values. Update the text in Photoshop and hit “save” – voila, the graphic is instantly updated within Premiere Pro.

Given the breadth and quality of tools in the Creative Cloud kit, it’s possible to stay entirely within these options for all of a facility’s post needs. Of course, roundtrips to Resolve, Baselight, ProTools, etc. are still possible, but not required. Nevertheless, in this scenario I typically see everything starting and ending in Premiere Pro (with exports via AME), making the Adobe solution my first vote for the modern hub concept.

Apple Final Cut Pro X

Apple walked away from the market for an all-inclusive studio package. Instead, it opted to offer more self-contained solutions that don’t have the same interoperability as before, nor that of the comparable Adobe solutions. To build up a similar toolkit, you would need Final Cut Pro X, Motion, Compressor and Logic Pro X. An individual editor/owner can purchase these once and install them on as many machines as he or she owns. A business, however, would have to buy each application for each separate machine. So a boutique facility would need a full set for each room, or it would have to build rooms by specialty – edit, audio, graphics, etc.

Even with this combination, there are missing links when going from one application to another. These gaps have to be plugged by various third-party productivity solutions, such as Clip Exporter, XtoCC, 7toX, Xsend Motion, X2Pro, EDL-X and others. These offer better conduits between Apple applications than Apple itself provides. For example, only through Automatic Duck Xsend Motion can you get an FCPX project (timeline) into Motion. Marquis Broadcast’s X2Pro Audio Convert provides a better path into Logic than the native route.
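
To get a feel for the kind of data these conduit tools traverse, here’s a short Python sketch that lists the media assets referenced in an FCPX XML export, using only the standard library. It assumes assets carry id/name/src attributes, which holds for the FCPXML versions I’ve seen, though the schema does change between versions; the file name is hypothetical.

```python
import xml.etree.ElementTree as ET

def list_fcpxml_assets(path):
    """Print the id, name, and source URL of each asset in an FCPXML file."""
    root = ET.parse(path).getroot()      # the <fcpxml> root element
    for asset in root.iter("asset"):     # assets live under <resources>
        print(asset.get("id"), asset.get("name"), asset.get("src"))

list_fcpxml_assets("my_project_cut.fcpxml")  # hypothetical export
```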

If you want the sort of color correction power available in Premiere Pro’s Lumetri Color panel, you’ll need more advanced color correction plug-ins, like Hawaiki Color or Color Finale. Since Apple doesn’t produce an equivalent to Photoshop, look to Pixelmator or Affinity Photo for a viable substitute. Powerful as these are, you still won’t get quite the same level of interoperability as exists between Photoshop and Premiere Pro.

Naturally, if your desire is to use non-Apple solutions for graphics and color correction, then similar rules apply as with Premiere Pro. For instance, roundtripping to Resolve for color correction is pretty solid using the FCPXML import/export function within Resolve. Prefer to use After Effects for your motion graphics instead of Motion? Then Automatic Duck Ximport AE on the After Effects side has your back.

Most of the tools are there for those users wishing to stay in an Apple-centric world, provided you add a lot of glue to patch over the missing elements. Since many of the plug-ins for FCPX (Motion templates) are superior to a lot of what’s out there, I do think that an FCPX-centric shop will likely choose to start and end in X (possibly with a Compressor export). Even when Resolve is used for color correction, I suspect the final touches will happen inside of Final Cut. It’s more of the Lego approach to the toolkit than the Adobe solution, yet I still see it functioning in much the same way.

Blackmagic Design DaVinci Resolve

It’s hard to say what Blackmagic’s end goal is with Resolve. Clearly the world of color correction is changing. Every NLE developer is integrating quality color correction modules right inside of their editing application. So it seems only natural that Blackmagic is making Resolve into an all-in-one tool for no other reason than self-preservation. And by golly, they are doing a darn good job of it! Each version is better than the last. If you want a highly functional editor with world-class color correction tools for free, look no further than Resolve. Ingest, transcoded and/or native media editing, color correction, mastering and delivery – all there in Resolve.

There are two weak links – graphics and audio. On the latter front, the internal audio tools are good enough for many editors. However, Blackmagic realizes that specialty audio post is still the domain of the sound engineering world, which is made up predominantly of Avid Pro Tools shops. To make this easy, Resolve has built-in audio export functions to send the timeline to Pro Tools via AAF. There’s no roundtrip back, but you’d typically get composite mixed tracks back from the engineer to lay into the timeline.

To build on the momentum it started, Blackmagic Design acquired the assets of EyeOn’s Fusion software, which gives them a node-based compositor suitable for visual effects and some motion graphics. This requires a different mindset than After Effects with Premiere Pro or Motion with Final Cut Pro X (when using Xsend Motion). You aren’t going to send a full sequence from Resolve to Fusion. Instead, the Connect plug-in links a single shot to Fusion, where it can be processed through a series of nodes. The Connect plug-in provides a similar “conduit” function to that of Adobe’s Dynamic Link between Premiere Pro and After Effects, except that the return is a rendered clip instead of a live project file. To take advantage of this interoperability between Resolve and Fusion, you need the paid versions.

Just as in Apple’s case, there really is no Blackmagic-owned substitute for Photoshop or an equivalent application. You’ll just have to buy what matches your need. While it’s quite possible to build a shop around Resolve and Fusion (plus maybe Pro Tools and Photoshop), it’s more likely that Resolve’s integrated approach will appeal mainly to those folks looking for free tools. I don’t see too many advanced pros doing their creative cutting on Resolve (at least not yet). That being said, it’s pretty close, so I don’t want to slight its capabilities.

Where I see it shine is as a finishing or “online” NLE. Let’s say you perform the creative or “offline” edit in Premiere Pro, FCPX or Media Composer. This could even be three editors working on separate segments of a larger show – each on a different NLE. Each editor’s sequence goes to Resolve, where the timelines are imported, combined and relinked to the high-res media. The audio goes via a parallel path to a Pro Tools mixer, and graphics come in as individual clips, shots or files. Then everything is combined inside Resolve, color corrected and delivered straight from Resolve. For many shops, that scenario is starting to look like the best of all worlds.
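
Resolve handles that relinking internally, but the underlying logic is worth seeing. The following Python sketch – an illustration of the concept, not Resolve’s actual code – matches offline clip names to high-res masters by base name and flags anything missing or ambiguous, since unique file names are what make a conform trustworthy. The names and paths are hypothetical.

```python
from pathlib import Path

def build_relink_map(offline_names, highres_root):
    """Match offline clip names to high-res files by base name.

    Unique file names are assumed; missing or duplicate matches are
    flagged rather than guessed, since a wrong match ruins the conform.
    """
    masters = {}
    for f in Path(highres_root).rglob("*"):
        if f.is_file():
            masters.setdefault(f.stem, []).append(f)

    relink, problems = {}, []
    for name in offline_names:
        candidates = masters.get(Path(name).stem, [])
        if len(candidates) == 1:
            relink[name] = candidates[0]
        else:
            problems.append((name, candidates))
    return relink, problems

# Hypothetical: clip names from an EDL/XML, camera masters on the NAS.
relink, problems = build_relink_map(
    ["A001_C002_0715.mov", "B003_C010_0716.mov"],
    "/Volumes/NAS/Production/Camera_Original",
)
```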

I tend to see Resolve as less of a hub than either Premiere Pro or Final Cut Pro X. Instead, I think it may take several possible positions: a) color correction and transcoding at the front end, b) color correction in the middle – i.e. the standard roundtrip, and/or c) the new “online editor” for final assembly, color correction, mastering and delivery.

Avid Media Composer

This brings me to Avid Media Composer, the least integrated of the bunch. You can certainly build an operation based on Media Composer as the hub – as so many shops have. But there simply isn’t the silky smooth interoperability among tools like there is with Adobe or the dearly departed Final Cut Pro “classic”. However, that doesn’t mean it’s not possible. You can add advanced color correction through the Symphony option, plus Avid Pro Tools in your mixing rooms. In an Avid-centric facility, rooms will definitely be task-oriented, rather than providing the ease of switching functions within the same suite based on load, as you can with Creative Cloud.

The best path right now is Media Composer to Pro Tools. Unfortunately it ends there. Like Blackmagic, Avid only offers two hero applications in the post space – Media Composer/Symphony and Pro Tools. They have graphics products, but those are designed and configured for news on-air operations. This means that effects and graphics are typically handled through After Effects, Boris RED or Fusion.

Boris RED runs as an integrated tool, which augments the Media Composer timeline. However, RED uses its own user interface. That operation is relatively seamless, since any “roundtrip” happens invisibly within Media Composer. Fusion can be integrated using the Connect plug-in, just like between Fusion and Resolve. Automatic Duck’s AAF import functions have been integrated directly into After Effects by Adobe. It’s easy to send a Media Composer timeline into After Effects as a one-way trip. In fact, that’s where this all started in the first place. Finally, there’s also a direct connection with Baselight Editions for Avid, if you add that as a “plug-in” within Media Composer. As with Boris RED, clips open up in the Baselight interface, which has now been enhanced with a smoother shot-to-shot workflow inside of Media Composer.

While a lot of shops still use Media Composer as the hub, this seems like a very old-school approach. Many editors still love this NLE for its creative editing prowess, but in today’s mixed-format, mixed-codec, file-based post world, Avid has struggled to keep Media Composer competitive with the other options. There’s certainly no reason Media Composer can’t be the center – with audio in Pro Tools, color correction in Resolve, and effects in After Effects. However, most newer editors simply don’t view it the same way as they do with Adobe or even Apple. Generally, it seems the best Avid path is to “offline” edit in Media Composer and then move to other tools for everything else.

So that’s post in 2016. Four good options with pros and cons to each. Sorry to slight the Lightworks, Vegas Pro, Smoke/Flame and Edius crowds, but I just don’t encounter them too often in my neck of the woods. In any case, there are plenty of options, even starting at free, which makes the editing world pretty exciting right now.

©2016 Oliver Peters