Audio Splits and Stems in Premiere Pro


When TV shows and feature films are being mixed, the final deliverables usually include audio stems as separate audio files or married to a multi-channel video master file or tape. Stems are the isolated submix channels for dialogue, sound effects and music. These elements are typically called DME (dialogue, music, effects) stems or splits and a multi-channel master file that includes these is usually called a split-track submaster. These isolated tracks are normally at mix level, meaning that you can combine them and the sum should equal the same level and mix as the final composite mixed track.

The benefit of having such stems is that you can easily replace elements, like re-recording dialogue in a different language, without having to dive back into the original audio project. The simplest form is to have 3 stereo stem tracks (6 mono tracks) for left and right dialogue, sound effects and music. Obviously, if you have a 5.1 surround mix, you’ll end up with a lot more tracks. There are also other variations for sports or comedy shows. For example, sports shows often isolate the voice-over announcer material from on-camera dialogue. Comedy shows may isolate the laugh track as a stem. In these cases, rather than 3 stereo DME stems, you might have 4 or more. In other cases, the music and effects stems are combined into a single stereo M&E track (music and effects minus dialogue).

Although this is common practice for entertainment programming, it should also be common practice if you work in short films, corporate videos or commercials. Creating such split-track submasters at the time you finish your project can often save your bacon at some point down the line. I ran into this during the past week. A large corporate client needed to replace the music tracks on 11 training videos. These videos were originally edited in 2010 using Final Cut Pro 7 and mixed in Pro Tools. Although it may have been possible to resurrect the old project files, doing so would have been problematic. However, in 2010, I had exported split-track submasters with the final picture and isolated stereo tracks for dialogue, sound effects and music. These have become the new source for our edit – now 6 years later. Since I am editing these in Premiere Pro CC, it is important to also create new split-track submasters, with the revised music tracks, should we ever need to do this again in the future.

Setting up a new Premiere Pro sequence 

I’m usually editing in either Final Cut Pro X or Premiere Pro CC these days. It’s easy to generate a multi-channel master file with isolated DME stems in FCP X, by using the Roles function. However, to do this, you need to make sure you properly assign the correct Roles from the get-go. Assuming that you’ve done this for dialogue, sound effects and music Roles on the source clips, then the stems become self-sorting upon export – based on how you route a Role to its corresponding export channel. When it comes to audio editing and mixing, I find Premiere Pro CC’s approach more to my liking. This process is relatively easy in Premiere, too; however, you have to set up a proper sequence designed for this type of audio work. That’s better than trying to sort it out at the end of the line.

The first thing you’ll need to do is create a custom preset. By default, sequence presets are configured with a certain number of tracks routed to a stereo master output. This creates a 2-channel file on export. Start by changing the track configuration to multi-channel and set the number of output channels. My requirement is to end up with an 8-channel file that includes a stereo mix, plus stereo stems for isolated dialogue, sound effects and music. Next, add the number of tracks you need and assign them as “standard” for the regular tracks or “stereo submix” for the submix tracks.

This is a simple example with 3 regular tracks and 3 submix tracks, because this was a simple project. A more complete project would have more regular tracks, depending on how much overlapping dialogue, sound effects or music you are working with on the timeline. For instance, some editors like to set up “zones” for types of audio. You might decide to have 24 timeline tracks, with 1-8 used for dialogue, 9-16 for sound effects and 17-24 for music. In this case, you would still only need 3 submix tracks for the aggregate of the dialogue, sound effects and music.

Rename the submix tracks in the timeline. I’ve renamed Submix 1-3 as DIA, SFX and MUS for easy recognition. With Premiere Pro, you can mix audio in several different places, such as the clip mixer or the audio track mixer. Go to the audio track mixer and assign the channel output and routing. (Channel output can also be assigned in the sequence preset panel.) For each of the regular tracks, I’ve set the routing pulldown to the corresponding submix track: Audio 1 to DIA, Audio 2 to SFX and Audio 3 to MUS. The 3 submix tracks are all routed to the Master output.

The final setup step is to assign the channel routing properly. With this sequence preset, master channels 1 and 2 will contain the full mix. First, when you export a 2-channel file as a master file or a review copy, by default only the first 2 output channels are used, so these will always get the mix without you having to change anything. Second, most of us tend to edit with stereo monitoring systems. Again, output channels 1 and 2 are the default, which means you’ll always be monitoring the full mix, unless you make changes or solo a track. Output channels 3-8 correspond to the stereo stems. Therefore, to make this happen automatically, assign the channel output in the following configuration: DIA (Submix 1) to 1-2 and 3-4, SFX (Submix 2) to 1-2 and 5-6, and MUS (Submix 3) to 1-2 and 7-8. The result is that everything goes to the full mix, as well as to the isolated stereo channels for each audio component – dialogue, sound effects and music.
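If it helps to see that routing spelled out, here is a minimal Python sketch of the summing logic. The buffer names and lengths are hypothetical – this only illustrates the signal flow, it isn’t anything Premiere Pro actually runs.

```python
import numpy as np

samples = 48000  # one second of 48 kHz audio, purely for illustration
dia = np.zeros((samples, 2))  # stereo dialogue submix
sfx = np.zeros((samples, 2))  # stereo sound effects submix
mus = np.zeros((samples, 2))  # stereo music submix

master = np.zeros((samples, 8))
master[:, 0:2] = dia + sfx + mus  # channels 1-2: composite stereo mix
master[:, 2:4] = dia              # channels 3-4: isolated dialogue stem
master[:, 4:6] = sfx              # channels 5-6: isolated sound effects stem
master[:, 6:8] = mus              # channels 7-8: isolated music stem

# Because the stems stay at mix level, summing them reproduces the mix.
assert np.allclose(master[:, 0:2],
                   master[:, 2:4] + master[:, 4:6] + master[:, 6:8])
```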

Editing in the custom timeline

Once you’ve set up the timeline, the rest is easy. Edit any dialogue clips to track 1, sound effects to track 2 and music to track 3. In a more complex example, like the 24-track timeline I referred to earlier, you’d work in the “zones” that you had organized. If tracks 1-8 are routed to the dialogue submix, then you would edit dialogue clips only to tracks 1-8, and the same goes for the corresponding sound effects and music tracks. Clip levels can still be adjusted as you normally would. But, by having submix tracks, you can adjust the level of all dialogue by moving the single DIA submix fader in the audio track mixer. This can also be automated. If you want a common filter applied to an entire stem – like a compressor across all sound effects – simply assign it from the pulldown within that submix channel strip.

Exporting the file

The last step is exporting your split-track submaster file. If this isn’t correct, the rest was all for naught. The best formats to use are either a QuickTime ProRes file or one of the MXF OP1a choices. In the audio tab of the export settings panel, change the channel selection pulldown from Stereo to 8 channels. Now each of your timeline output channels will be exported as a separate mono track in the file. These correspond to your 4 stereo mix groups – the full mix plus stems. Now, in one single, neat file, you have the final image and mix, along with the isolated stems that can facilitate easy changes down the road. Depending on the nature of the project, you might also want to export versions with and without titles for an extra level of future-proofing.
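If you want to sanity-check the export, here is a small hedged Python sketch. It assumes you also bounce or extract the submaster’s audio as an 8-channel WAV (the file name is hypothetical, and soundfile/numpy are third-party installs). It confirms the channel count and verifies that the stems still sum to the mix.

```python
import numpy as np
import soundfile as sf  # pip install soundfile

data, rate = sf.read("split_track_submaster.wav")  # shape: (samples, channels)
assert data.shape[1] == 8, "expected a stereo mix plus three stereo stems"

mix = data[:, 0:2]
stem_sum = data[:, 2:4] + data[:, 4:6] + data[:, 6:8]

# If the stems are truly at mix level, they should null against the mix,
# allowing for small rounding differences introduced by the encode.
residual = np.max(np.abs(mix - stem_sum))
print(f"peak residual: {20 * np.log10(residual + 1e-12):.1f} dBFS")
```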

Reusing the file

If you decide to use this exported submaster file at a later date as a source clip for a new edit, simply import it into Premiere Pro like any other form of media. However, because its channel structure will be read as 8 mono channels, you will need to modify the file using the Modify > Audio Channels contextual menu (right-click the clip). Change the clip channel format from Mono to Stereo, which turns your 8 mono channels back into the left and right sides of 4 stereo channels. You may then ignore the remaining “unassigned” clip channels. Do not change any of the check boxes.
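Outside of Premiere Pro, the same re-pairing can be done in a few lines if, for example, you need to hand the stems to an audio app as discrete stereo files. This is only a sketch mirroring the Modify > Audio Channels logic; the file names are hypothetical.

```python
import soundfile as sf  # pip install soundfile

data, rate = sf.read("split_track_submaster.wav")  # 8 mono channels, as exported

# Re-pair the mono channels into the four stereo groups from the sequence.
groups = {"mix": (0, 2), "dialogue": (2, 4), "sfx": (4, 6), "music": (6, 8)}
for name, (start, end) in groups.items():
    sf.write(f"{name}_stereo.wav", data[:, start:end], rate)
```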

Hopefully, by following this guide, you’ll find that creating timelines with stem tracks becomes second nature. It can sure help you years later, as I found out yet again this past week!

©2016 Oliver Peters

Swiss Army Man

When it comes to quirky movies, Swiss Army Man stands alone. Hank (Paul Dano) is a castaway on a deserted island at his wit’s end. In an act of final desperation, he’s about to hang himself, when he discovers Manny (Daniel Radcliffe), a corpse that’s just washed up on shore. At this point the film diverges from the typical castaway/survival story into an absurdist comedy. Manny can talk and has “magical powers” that Hank uses to find his way back to civilization.

Swiss Army Man was conceived and directed by Dan Kwan and Daniel Sheinert, a writing and directing duo who work under the moniker Daniels. This is their feature-length film debut and was produced with Sundance in mind. The production company brought on Matthew Hannam to edit the film. Hannam (The OA, Enemy, James White) is a Canadian film and TV editor with numerous features and TV series under his belt. I recently spoke with Hannam about the post process on Swiss Army Man.

Hannam discussed the nature of the film. “It’s a very handmade film. We didn’t have a lot of time to edit and had to make quick decisions. I think that really helped us. This was the dozenth or so feature for me, so in a way I was the veteran. It was fun to work with these guys and experience their creative process. Swiss Army Man is a very cinematically-aware film, full of references to other famous films. You’re making a survival movie, but it’s very aware that other survival movies exist. This is also a very self-reflexive film and, in fact, the model is more like a romantic comedy than anything else. So I was a bit disappointed to see a number of the reviews focus solely on the gags in the film, particularly around Manny, the corpse. There’s more to it than that. It’s about a guy who wonders what it might be like had things been different. It’s a very special little film, because the story puts us inside of Hank’s head.”

Unlike the norm for most features, Hannam joined the team after the shooting had been completed. He says, “I came on board during the last few days of filming. They shot for something like 25 days. This was all single-camera work with Larkin Seiple (Cop Car, Bleed For This) as director of photography. They shot ARRI ALEXA XT with Cooke anamorphic lenses. It was shot ARRIRAW, but for the edit we had a special LUT applied to the dailies, so the footage was already beautiful. I got a drive in August and the film premiered at Sundance. That’s a very short post schedule, but our goal was always Sundance.”

Shifting to Adobe tools

Like many of this year’s Sundance films, Adobe Premiere Pro was the editing tool of choice. Hannam continues, “I’m primarily an Avid [Media Composer] editor and the Dans [Kwan and Sheinert] had been using [Apple] Final Cut Pro in the past for the shorts that they’ve edited themselves. They opted to go with Premiere on this film, as they thought it would be easiest to go back and forth with After Effects. We set up a ‘poor man’s’ shared storage with multiple systems that each had duplicate media on local drives. Then we’d use Dropbox to pass around project files and shared elements, like sound effects and temp VFX. While the operation wasn’t flawless – we did experience a few crashes – it got the job done.”

Swiss Army Man features quite a few visual effects shots and Hannam credits the co-directors’ music video background with making this a relatively easy task. He says, “The Dans are used to short turnarounds in their music video projects, so they knew how to integrate visual effects into the production in a way that made it easier for post. That’s also the beauty of working with Premiere Pro. There’s a seamless integration with After Effects. What’s amazing about Premiere is the quality of the built-in effects. You get effects that are actually useful in telling the story. I used the warp stabilizer and timewarp a lot. In some cases those effects made it possible to use shots in a way that was never possible before. The production company partnered with Method for visual effects and Company 3 [Co3] for color grading. However, about half of the effects were done in-house using After Effects. On a few shots, we actually ended up using After Effects’ stabilization after final assembly, because it was that much better than what was possible during the online assembly of the film.”

Another unique aspect of Swiss Army Man is its musical score. Hannam explains, “Due to the tight schedule, music scoring proceeded in parallel with the editing. The initial temp music pulled was quirky, but didn’t really match the nature of the story. Once we got the tone right with the temp tracks, scenes were passed on to the composers – Andy Hull and Robert McDowell – who Daniels met while making a video for their band Manchester Orchestra. The concept for the score was that it was all coming from inside of Hank’s head. Andy sang all the music as if Hank was humming his own score. They created new tracks for us and by the end we had almost no temp music in the edit. Once the edit was finalized, they worked with Paul [Dano] and Daniel [Radcliffe] to sing and record the parts themselves. Fortunately both are great singers, so the final a cappella score is actually the lead actors themselves.”

Structuring the edit

Matthew Hannam and I discussed his approach to editing scenes, especially with this foray into Premiere Pro. He responds, “When I’m on Media Composer, I’m a fan of ScriptSync. It’s a great way to know what coverage you have. There’s nothing like that in Premiere, although I did use the integrated Story app. This enables you to load the script into a tab for quick access. Usually my initial approach is to sit down and watch all the footage for the particular scene while I plan how I’m going to assemble it. The best way to know the footage is to work with it. You have to watch how the shoot progresses in the dailies. Listen to what the director says at the end of a take – or if he interrupts in the middle – and that will give you a good idea of the intention. Then I just start building the scene – often first from the middle. I’m looking for what is the central point of that scene and it often helps to build from the middle out.”

Although Hannam doesn’t use any tricks to organize his footage or create selects, he does use “KEM rolls”. This term stems from the KEM flatbed film editing table. In modern parlance, it means that the editor has strung out all the footage for a scene into a single timeline, making it easy to scrub through all the available footage quickly. He continues, “I’ll build a dailies reel and tuck it away in the bottom of the bin. It’s a great way to quickly see what footage you have available. When it’s time to revise a scene, it’s good to go back to the raw footage and see what options you have. It is a quick way to jog your memory about what was shot.”

A hybrid post workflow

Another integral member of the post team was assistant editor Kyle Gilbertson. He had worked with the co-directors previously and was the architect of the hybrid post workflow followed on this film. Gilbertson pulled all of the shots for VFX that were being handled in-house. Many of the more complicated montages were handled as effects sequences and the edit was rebuilt in DaVinci Resolve before re-assembly in After Effects. Hannam explains, “We had two stages of grading with [colorist] Sofie Borup at Co3. The first was to set looks and get an idea what the material was going to look like once finished. Then, once everything was complete, we combined all of the material for final grading and digital intermediate mastering. There was a real moment of truth when the 100 or so shots that Daniels did themselves were integrated into the final cut. Luckily it all came together fairly seamlessly.”

“Having finished the movie, I look back at it and I’m full of warm feelings. We kind of just dove into it as a big team. The two Dans, Kyle and I were in that room kind of just operating as a single unit. We shifted roles and kept everything very open. I believe the end product reflects that. It’s a film that took inspiration from everywhere and everyone. We were not setting out to be weird or gross. The idea was to break down an audience and make something that everyone could enjoy and be won over by. In the end, it feels like we really took a step forward with what was possible at home. We used the tools we had available to us and we made them work. It makes me excited that Adobe’s Creative Cloud software tools were enough to get a movie into 700 cinemas and win those boys the Sundance Directing prize. We’re at a point in post where you don’t need a lot of hardware. If you can figure out how to do it, you can probably make it yourself. That was our philosophy from start to finish on the movie.”

Originally written for Digital Video magazine / Creative Planet Network.

©2016 Oliver Peters

NLE as Post Production Hub


As 2009 closed, I wrote a post about Final Cut Studio as the center of a boutique post production workflow. A lot has changed since then, but that approach is still valid and a number of companies can fill those shoes. In each case, rather than be the complete, self-contained tool, the editing application becomes the hub of the operation. Other applications surround it and the workflow tends to go from NLE to support tool and back for delivery. Here are a few solutions.

Adobe Premiere Pro CC

No current editing package comes as close to the role of the old Final Cut Studio as does Adobe’s Creative Cloud. You get nearly all of the creative tools under a single subscription and facilities with a team account can equip every room with the full complement of applications. When designed correctly, workflows in any room can shift from edit to effects to sound to color correction – according to the load. In a shared storage operation, projects can stay in a single bay for everything or shift from bay to bay based on operator specialty and talent.

While there are many tools in the Creative Cloud kit, the primary editor-specific applications are Premiere Pro CC, After Effects CC and Audition CC. It goes without saying that for most, Photoshop CC and Adobe Media Encoder are also givens. On the other hand, I don’t know too many folks using Prelude CC, so I can’t say what the future for this tool will be – especially since the next version of Premiere Pro includes built-in proxy transcoding. Also, as more of SpeedGrade CC’s color correction tools make it into Premiere Pro, it’s clear that SpeedGrade itself is getting very little love. The low-cost market for outboard color correction software has largely been lost to DaVinci Resolve (free). For now, SpeedGrade is really “dead man walking”. I’d be surprised if it’s still around by mid-2017. That might also be the case for Prelude.

Many editors I know that are heavy into graphics and visual effects do most of that work in After Effects. With CC and Dynamic Link, there’s a natural connection between the Premiere Pro timeline and After Effects. A similar tie can exist between Premiere Pro and Audition. I find the latter to be a superb audio post application and, from my experience, provides the best transfer of a Premiere Pro timeline into any audio application. This connection is being further enhanced by the updates coming from Adobe this year.

Rounding out the package is Photoshop CC, of course. While most editors are not big Photoshop artists, it’s worth noting that this application also enables animated motion graphics. For example, if you want to create an animated lower third banner, it can be done completely inside of Photoshop without ever needing to step into After Effects. Drop the file onto a Premiere Pro timeline and it’s complete with animation and proper transparency values. Update the text in Photoshop and hit “save” – voila, the graphic is instantly updated within Premiere Pro.

Given the breadth and quality of tools in the Creative Cloud kit, it’s possible to stay entirely within these options for all of a facility’s post needs. Of course, roundtrips to Resolve, Baselight, ProTools, etc. are still possible, but not required. Nevertheless, in this scenario I typically see everything starting and ending in Premiere Pro (with exports via AME), making the Adobe solution my first vote for the modern hub concept.

Apple Final Cut Pro X

Apple walked away from the market for an all-inclusive studio package. Instead, it opted to offer more self-contained solutions that don’t have the same interoperability as before, nor that of the comparable Adobe solutions. To build up a similar toolkit, you would need Final Cut Pro X, Motion, Compressor and Logic Pro X. An individual editor/owner would purchase these once and install them on as many machines as he or she owned. A business would have to buy each application for each separate machine. So a boutique facility would need a full set for each room or they would have to build rooms by specialty – edit, audio, graphics, etc.

Even with this combination, there are missing links when going from one application to another. These gaps have to be plugged by the various third-party productivity solutions, such as Clip Exporter, XtoCC, 7toX, Xsend Motion, X2Pro, EDL-X and others. These provide better conduits between Apple applications than Apple itself provides. For example, only through Automatic Duck Xsend Motion can you get an FCPX project (timeline) into Motion. Marquis Broadcast’s X2Pro Audio Convert provides a better path into Logic than the native route.

If you want the sort of color correction power available in Premiere Pro’s Lumetri Color panel, you’ll need more advanced color correction plug-ins, like Hawaiki Color or Color Finale. Since Apple doesn’t produce an equivalent to Photoshop, look to Pixelmator or Affinity Photo for a viable substitute. Although powerful, you still won’t get quite the same level of interoperability as between Photoshop and Premiere Pro.

Naturally, if your desire is to use non-Apple solutions for graphics and color correction, then similar rules apply as with Premiere Pro. For instance, roundtripping to Resolve for color correction is pretty solid using the FCPXML import/export function within Resolve. Prefer to use After Effects for your motion graphics instead of Motion? Then Automatic Duck Ximport AE on the After Effects side has your back.

Most of the tools are there for those users wishing to stay in an Apple-centric world, provided you add a lot of glue to patch over the missing elements. Since many of the plug-ins for FCPX (Motion templates) are superior to a lot of what’s out there, I do think that an FCPX-centric shop will likely choose to start and end in X (possibly with a Compressor export). Even when Resolve is used for color correction, I suspect the final touches will happen inside of Final Cut. It’s more of the Lego approach to the toolkit than the Adobe solution, yet I still see it functioning in much the same way.

Blackmagic Design DaVinci Resolve

It’s hard to say what Blackmagic’s end goal is with Resolve. Clearly the world of color correction is changing. Every NLE developer is integrating quality color correction modules right inside of their editing application. So it seems only natural that Blackmagic is making Resolve into an all-in-one tool, if for no other reason than self-preservation. And by golly, they are doing a darn good job of it! Each version is better than the last. If you want a highly functional editor with world-class color correction tools for free, look no further than Resolve. Ingest, transcoded and/or native media editing, color correction, mastering and delivery – it’s all there in Resolve.

There are two weak links – graphics and audio. On the latter front, the internal audio tools are good enough for many editors. However, Blackmagic realizes that specialty audio post is still the domain of the sound engineering world, which is made up predominantly of Avid Pro Tools shops. To make this easy, Resolve has built-in audio export functions to send the timeline to Pro Tools via AAF. There’s no roundtrip back, but you’d typically get composite mixed tracks back from the engineer to lay into the timeline.

To build on the momentum it started, Blackmagic Design acquired the assets of EyeOn’s Fusion software, which gives them a node-based compositor suitable for visual effects and some motion graphics. This requires a different mindset than After Effects with Premiere Pro or Motion with Final Cut Pro X (when using Xsend Motion). You aren’t going to send a full sequence from Resolve to Fusion. Instead, the Connect plug-in links a single shot to Fusion, where it can be processed through a series of nodes. The Connect plug-in provides a similar “conduit” function to that of Adobe’s Dynamic Link between Premiere Pro and After Effects, except that the return is a rendered clip instead of a live project file. To take advantage of this interoperability between Resolve and Fusion, you need the paid versions.

Just as in Apple’s case, there really is no Blackmagic-owned substitute for Photoshop or an equivalent application. You’ll just have to buy what matches your need. While it’s quite possible to build a shop around Resolve and Fusion (plus maybe Pro Tools and Photoshop), it’s more likely that Resolve’s integrated approach will appeal mainly to those folks looking for free tools. I don’t see too many advanced pros doing their creative cutting on Resolve (at least not yet). However, that being said, it’s pretty close, so I don’t want to slight the capabilities.

Where I see it shine is as a finishing or “online” NLE. Let’s say you perform the creative or “offline” edit in Premiere Pro, FCPX or Media Composer. This could even be three editors working on separate segments of a larger show – each on a different NLE. Each editor’s sequence goes to Resolve, where the timelines are imported, combined and relinked to the high-res media. The audio has gone via a parallel path to a Pro Tools mixer, and graphics come in as individual clips, shots or files. Then all is combined inside Resolve, color corrected and delivered straight from Resolve. For many shops, that scenario is starting to look like the best of all worlds.

I tend to see Resolve as less of a hub than either Premiere Pro or Final Cut Pro X. Instead, I think it may take several possible positions: a) color correction and transcoding at the front end, b) color correction in the middle – i.e. the standard roundtrip, and/or c) the new “online editor” for final assembly, color correction, mastering and delivery.

Avid Media Composer

This brings me to Avid Media Composer, the least integrated of the bunch. You can certainly build an operation based on Media Composer as the hub – as so many shops have. But there simply isn’t the silky smooth interoperability among tools like there is with Adobe or the dearly departed Final Cut Pro “classic”. However, that doesn’t mean it’s not possible. You can add advanced color correction through the Symphony option, plus Avid Pro Tools in your mixing rooms. In an Avid-centric facility, rooms will definitely be task-oriented, rather than provide the ease of switching functions in the same suite based on load, as you can with Creative Cloud.

The best path right now is Media Composer to Pro Tools. Unfortunately it ends there. Like Blackmagic, Avid only offers two hero applications in the post space – Media Composer/Symphony and Pro Tools. They have graphics products, but those are designed and configured for news on-air operations. This means that effects and graphics are typically handled through After Effects, Boris RED or Fusion.

Boris RED runs as an integrated tool, which augments the Media Composer timeline. However, RED uses its own user interface. That operation is relatively seamless, since any “roundtrip” happens invisibly within Media Composer. Fusion can be integrated using the Connect plug-in, just like between Fusion and Resolve. Automatic Duck’s AAF import functions have been integrated directly into After Effects by Adobe. It’s easy to send a Media Composer timeline into After Effects as a one-way trip. In fact, that’s where this all started in the first place. Finally, there’s also a direct connection with Baselight Editions for Avid, if you add that as a “plug-in” within Media Composer. As with Boris RED, clips open up in the Baselight interface, which has now been enhanced with a smoother shot-to-shot workflow inside of Media Composer.

While a lot of shops still use Media Composer as the hub, this seems like a very old-school approach. Many editors still love this NLE for its creative editing prowess, but in today’s mixed-format, mixed-codec, file-based post world, Avid has struggled to keep Media Composer competitive with the other options. There’s certainly no reason Media Composer can’t be the center – with audio in Pro Tools, color correction in Resolve, and effects in After Effects. However, most newer editors simply don’t view it the same way as they do with Adobe or even Apple. Generally, it seems the best Avid path is to “offline” edit in Media Composer and then move to other tools for everything else.

So that’s post in 2016. Four good options with pros and cons to each. Sorry to slight the Lightworks, Vegas Pro, Smoke/Flame and Edius crowds, but I just don’t encounter them too often in my neck of the woods. In any case, there are plenty of options, even starting at free, which makes the editing world pretty exciting right now.

©2016 Oliver Peters

Adobe Premiere Pro CC Learning Tips


Adobe Premiere Pro CC is the heir apparent editing application for many editors. In order to make your transition easier, I’ve compiled a series of links to various official and unofficial resources, including Adobe sites, forums, YouTube videos, training resources, and various blog posts. This list is by no means all that’s out there, but it should provide a great starting point to become more comfortable with Premiere Pro CC.

Adobe and Adobe-related training resources

Adobe tutorials

Adobe Premiere Pro tips

Maxim Jago’s tips at Lynda

Maxim Jago’s tips at Peachpit

Lynda’s Adobe Premiere Pro training

Ripple Training – Premiere Pro CC 2015

Adobe-related blogs and YouTube channels

Dave Helmly’s DAV Tech Table

Jason Levine’s YouTube channel

Colin Smith’s YouTube channel

Dave Helmly’s presentation at Orlando Post Pros meeting – 2015 (webcast)

Forums

Adobe Filmmaker stories

Adobe Premiere Pro Forum

Creative COW Premiere Pro forum

Premiere-centric sites and YouTube channels

Jarle Leirpoll’s Premiere Pro blog

Retooled

Premiere Bro

Best Premiere Pro Quick Tips YouTube Channel

Premiere Pro Tips YouTube channel

Blog posts

VashiVisuals – Deadpool Premiere Pro Presets

VashiVisuals – Keyboard Layouts

VashiVisuals – Music Video Editing Tips

VashiVisuals – Pancake Timeline

Jonny Elwyn – Premiere posts

Jonny Elwyn – Premiere Pro Tools and Tutorials

Jonny Elwyn – Tips and Tricks

Jonny Elwyn – Tips for Better Editing in Premiere Pro

Jonny Elwyn – Tutorials for Better Editing in Premiere Pro

Premium Beat – Beginner’s Guide to Premiere Pro Shortcuts

Premium Beat – Match Frame and Replace Edit

Premium Beat – How to Clean up Audio in Premiere Pro

Premium Beat – Creating a Storyboard Edit in Premiere Pro

Premium Beat – AVCHD Editing Workflow

Premium Beat – How to Organize a Feature Film Edit

Premium Beat – 3 Quick Tips for Editing in Premiere Pro

Premium Beat – Time-Saving Premiere Pro CC Tips

Derek Lieu – Simple Tricks for Faster Editing in Premiere Pro

No Film School – Premiere Pro Keyboard Shortcuts

Wipster – 4 Reasons Why Premiere Pro is a Great Choice

Wipster – Master the Media Browser

Plug-ins, add-ons, other

Kinetic type and layer effects by TypeMonkey for After Effects

Post Notes Premiere Pro control panel

PDFviewer Premiere Pro control panel

Frame.io integration

Wipster.io integration

Axle Video integration

LookLabs SpeedLooks

FxFactory plug-ins

RedGiant Software plug-ins

Boris FX plug-ins

DigitalFilms – SpeedGrade Looks

Jarle’s Premiere Pro Presets

Note: this information is also included on the Editing Resources page accessible from the header of this blog. Future updates will be made there.

©2016 Oliver Peters

Deadpool


Adobe has been on a roll getting filmmakers to adopt its Premiere Pro CC editing software for feature film post. Hot on the heels of its success at Sundance, where a significant number of the indie films were edited using Premiere Pro, February saw the release of two major Hollywood films that were cut using Premiere Pro – the Coen Brothers’ Hail, Caesar! and Tim Miller’s Deadpool.

Deadpool is one of Marvel Comics’ more unconventional superheroes. Deadpool, the film, is the origin story of how Wade Wilson (Ryan Reynolds) becomes Deadpool. He’s a mercenary soldier who gains accelerated healing powers through a rogue experiment. Left disfigured, but with new powers, he sets off to rescue his girlfriend (Morena Baccarin) and find the person responsible. Throughout all of this, the film is peppered with Deadpool’s wisecracks and his habit of breaking the fourth wall to address the audience.

This is the first feature film for director Tim Miller, but he’s certainly not new to the process. Miller and his company Blur Studios are known for their visual effects work on commercials, shorts, and features, including Scott Pilgrim vs. the World and Thor: The Dark World. Setting out to bring as much of the post as possible in-house, Miller consulted with his friend, director David Fincher, who recommended the Adobe Creative Cloud solution, based on Fincher’s experience during Gone Girl. Several editing bays were established within Blur’s facility – using new, tricked-out Mac Pros connected to an Open Drives Velocity SSD 180TB shared storage solution.

Plugging new software into a large VFX film pipeline

Julian Clarke (Chappie, Elysium, District 9) came on board to edit the film. He explains, “I talked with Tim and was interested in the whole pioneering aspect of it. The set-up costs to make these permanent edit suites for his studio are attractive. I learned editing using [Apple] Final Cut Pro at version one and then I switched to Avid about four years later and have cut with it since. If you can learn [Avid] Media Composer, then [Adobe] Premiere Pro is fine. I was up to about 80% of my normal speed after just two days.”

To ease any growing pains of using a new editing tool on such a complex film, Miller and Adobe also brought in feature film editor Vashi Nedomansky (That Which I Love Destroys Me, Sharknado 2: The Second One, An American Carol) as a workflow consultant. Nedomansky’s job was to help establish a workflow pipeline and to get the editorial team up to speed with Premiere Pro. He had performed a similar role on Gone Girl. He says, “I’ve cut nine features and the last four have been using Premiere Pro. Adobe has called on me for that editor-to-editor interface and to help Blur set up five edit bays. I translated what we figured out with Gone Girl, but adapted it to Blur’s needs, as well as taking into consideration the updates made to the software since then. During the first few weeks of shooting, I worked with Julian and the assistant editors to customize their window layouts and keyboard shortcuts, since prior to this, the whole crew had primarily been using Avid.”

Deadpool was shot mostly with ARRI ALEXA cameras recording open gate 2.8K ARRIRAW. Additional footage also came from Phantom and RED cameras. Most scenes were recorded with two cameras. The original camera files were transcoded to 2K ProRes dailies in Vancouver. Back at Blur, first assistant editor Matt Carson would sync audio and group the clips into Premiere Pro multicam sequences.

Staying up with production

As with most features, Clarke was cutting while the production was going on. However, unlike many films, he was ready to show Miller edited scenes to review within 24 hours after the shoot had wrapped for the day. Not only a cut scene, but one already fleshed out with temporary sound effects and music. This is quite a feat, considering that Miller shot more than 500 hours of footage. Seeing a quick turnaround of edited scenes was very beneficial for Miller as a first-time feature director. Clarke adds, “My normal approach is to start cutting and see what works as a first draft. The assistant will add sound effects and temp music and if we hit a stumbling block, we move on to another scene. Blur had also created a lot of pre-vis shots for the effects scenes prior to the start of principal photography. I was able to cut these in as temp VFX. This way the scenes could play through without a lot of holes.”

To make their collaborative workflow function, Nedomansky, Clarke, and the assistants worked out a structure for organizing files and Premiere Pro projects. Deadpool was broken into six reels, based on the approximate page count in the script where a reel break should occur. Every editor had their own folder on the Open Drives SAN containing only the most recent version of whatever project they were working on. If Julian Clarke was done working on Reel 1, then that project file could be closed and moved from Clarke’s folder into the folder of one of the assistants. They would then open the project to add temporary sound effects or create some temporary visual effects. Meanwhile, Clarke would continue on Reel 2, which was located in his folder. By keeping only the active project file in the various folders and moving projects among editors’ folders, the team mimicked the bin-locking method used in shared Avid workflows.
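For what it’s worth, a hand-off convention like this is simple enough to script or at least double-check. The following Python sketch only illustrates the idea described above – the folder names and paths are hypothetical, and nothing like this was necessarily used on the film.

```python
from pathlib import Path
import shutil

SAN = Path("/Volumes/OpenDrives/Deadpool")  # hypothetical shared storage mount

def hand_off(reel: str, from_editor: str, to_editor: str) -> Path:
    """Move a reel's project file from one editor's folder to another's."""
    src = SAN / from_editor / f"{reel}.prproj"
    dst_dir = SAN / to_editor
    dst_dir.mkdir(parents=True, exist_ok=True)
    if not src.exists():
        raise FileNotFoundError(f"{src} is not in {from_editor}'s folder - who has it?")
    return Path(shutil.move(str(src), str(dst_dir / src.name)))

# Example: the editor finishes Reel 1 and passes it to an assistant for temp sound.
# hand_off("Reel_01", "editor", "assistant")
```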

In addition, Premiere Pro’s Media Browser module enabled the editors to access and import sequences found within other project files. This is a non-destructive process. Older versions of the project files were stored in a separate folder on the SAN in order to keep the active folders and projects uncluttered. Premiere Pro’s ability to work with folders as they were created in the Finder let the editors do more of the organization at the Finder level than they normally would have, had they been cutting with Avid systems.

Cutting an action film

Regardless of the software you use, each film presents a unique set of creative challenges. Clarke explains, “One scene that took a while was a long dialogue scene with Deadpool and Colossus on the highway. It’s quintessential Deadpool with a lot of banter and improv from Ryan. There’s not much story going on in the background at that time. We didn’t want to cut too much out, but at the same time we didn’t want to have the audience get lost in what’s supposed to be the bigger story. It took some time to strike the right balance. Overall the film was just about right. The director’s cut was about two hours, which was cut into the final length of one hour and 45 minutes. That’s just about the right amount to cut out, because you don’t end up losing so much of the heart of the film.”

Many editors have a particular way they like their assistants to organize bins and projects. Clarke offers, “I tend to work in the frame view and organize my set-ups by masters, close-ups, and so on. Where I may be a little different than other editors is how I have my assistants organize action scenes. I’ll have them break down the choreography move-by-move and build a sequence of selected shots in the order of these moves. So for example, all the angles of the first punch, followed by all the angles of the next move – a punch, or block, or kick. Action scenes are often shot with so much coverage, that this lets me quickly zero in on the best stuff. It eliminates the scavenger hunt to find just the right angle on a move.”

The script was written to work in a nonlinear order. Clarke explains how that played out through the edit, “We stood by this intention in the editing. We found, in fact, that the film just didn’t work linearly at all. The tone of the two [scripted] timelines is quite different, with the more serious undertones of the origin story and the broad humor of the Deadpool timeline. When played sequentially, it was like oil and water – two totally different movies. By interweaving the timelines, the tone of the movie felt more coherent, with the added bonus of being able to front load action into the movie to excite the audience before getting into the heavier cancer part of the story.”

One editing option that might come to mind is that a character in a mask offers an interesting opportunity to change dialogue without difficult sync issues. However it wasn’t the sort of crutch some might assume. Clarke says, “Yes, the mask provided a lot of opportunity for ADR. Though this was used more for tweaking dialogue for plot clarity or to try out alternate jokes, than a wholesale replacement of the production track. If we liked the production performance we generally kept it, and embraced the fact that the mask Ryan was wearing would dull the audio a bit. I try to use as little ADR as possible, when it comes to it being used for technical reasons, rather than creative ones. I feel like there’s a magic that happens on set that is often hard to replicate in the ADR booth.”

Pushing the envelope

The editing systems proved to offer the performance needed to complete a film of this size and complexity. Vashi Nedomansky says, “There were 1400 effects shots handled by ten vendors. Thanks to the fact that Blur tricked out the bays, the editors could push 10 to 15 layers of 2K media at a time for temp effects – in real-time without rendering. When the film was locked, audio was exported as AAF for the sound facility along with an H.264 picture reference. Blur did many of the visual effects in-house. For final picture deliverables, we exported an XML from Premiere Pro, but also used the Change List tool from Intelligent Assistance. This was mainly to supply the list in a column format that would match Avid’s output to meet the studio’s requirements.”

I asked Clarke and Nedomansky what the team liked best about working with the Adobe solution. Nedomansky says, “I found that the editors really liked the tilde key [on the keyboard], which in Premiere Pro brings any window to fullscreen. When you have a timeline with 24 to 36 tracks of temp sound effects, it’s really nice to be able to make that fullscreen so that you can fine-tune them. They also liked what I call the ‘pancake timeline’. This is where you can stack two timelines over each other to compare or pull clips from one into the other. When you can work faster like this, there’s more time for creativity.” Clarke adds, “I used a lot of the time-remapping in After Effects. Premiere Pro’s sub-frame audio editing is really good for dialogue. When Avid and Apple were competing with Media Composer and Final Cut Pro it was very productive for both companies. So competition between Avid and Adobe is good, because Premiere Pro is very forward-thinking.”

Many NLE users may question how feature films apply to the work they do. Nedomansky explains, “When Kirk Baxter used Premiere Pro for Fincher’s Gone Girl, the team requested many features that they were used to from Final Cut Pro 7. About 200 of those suggestions have found their way as features into the current release that all Creative Cloud customers receive. Film editors will stress a system in ways that others won’t, and that information benefits all users. The important takeaway from the Deadpool experience is that after some initial adjustment, there were no showstoppers and no chaos. Deadpool is a monster film, but these are just tools. It’s the human in the chair making the decision. We all just want to work and not deal with technical issues. Whatever makes the computer invisible – that’s the power.”

Deadpool is certainly a fun ride, with a lot of inside jokes for veteran Marvel fans. Look for the Stan Lee cameo and be sure to stay all the way through the end credits!

Watch director Tim Miller discuss the choice to go with Adobe.

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2016 Oliver Peters

Hail, Caesar!


Combine kidnapping, mystery, farce, and a good measure of quirkiness, and you’ve defined the quintessential Coen Brothers script. Complete with a cast of Coen alums, Hail, Caesar! is just such a film. Joel and Ethan Coen’s latest is set in the motion picture factory town of Hollywood in the 1950s. Eddie Mannix (Josh Brolin) is a studio fixer tasked with finding Baird Whitlock (George Clooney), one of the studio’s biggest money-makers. Whitlock has been kidnapped in the middle of production of a Bible epic by a group called “The Future”. Of course, that’s not Mannix’s only dilemma, as he has other studio problems to deal with, such as disgruntled director Laurence Laurentz (Ralph Fiennes) and the personal issues of starlet DeeAnna Moran (Scarlett Johansson).

The Hail, Caesar! story idea had been kicking around for over a decade before the Coens finally brought it into production. Along with being a concept that fits right into their wheelhouse, it’s also a complex production. In this story about the Golden Age of Hollywood, much of the film involves movies within the movie. The tale weaves in and out of multiple productions being filmed on the fictional Capitol Pictures lot.

In keeping with the texture of films of that era, Hail, Caesar! was shot on film by long-time Coen director of photography Roger Deakins (True Grit, No Country for Old Men, The Ladykillers). Deakins’ first choice might have been the ARRI ALEXA, but he agreed that film was the appropriate solution, and so he shot with an ARRI 535-B to Kodak Vision3 negative stock. Fotokem handled development, with EFILM covering telecine transfer, finishing, and digital intermediate color correction.

Time for a fresh change

Although they are lovers of the film image, Joel and Ethan Coen were also among the first to embrace Apple Final Cut Pro in their transition to digital editing for the film Intolerable Cruelty. They had been using Final Cut Pro up until Inside Llewyn Davis; however, it had become sufficiently “long in the tooth” that it was time for a change. This brought them to Adobe Premiere Pro CC. I recently interviewed Katie McQuerrey about this shift. She is credited as an additional or associate editor of numerous Coen films (Inside Llewyn Davis, True Grit, Burn After Reading) – a role which she describes as being Joel and Ethan’s right-hand person in the cutting room. For Hail, Caesar!, this included interfacing with Adobe and handling the general workflow so that Premiere Pro was a functional editing tool for the filmmakers.

McQuerrey explains, “After Apple stopped supporting Final Cut Pro 7 we knew it was time to change. We looked at Final Cut Pro X, but because of its lack of audio editing functions, we knew that it wasn’t right for us. So, we decided to give Premiere Pro a try. David Fincher had a successful experience with Gone Girl and we knew that Walter Murch, who is a friend of Joel and Ethan’s, was using it on his next film. I’ve edited on Avid, Final Cut, and now Premiere Pro and they all make you adjust your editing style to adapt to the software. Joel and Ethan had only ever edited digitally on Final Cut Pro, so Premiere Pro provided the easiest transition. [Avid] Media Composer is very robust for the assistant editor, but a bit restrictive for the editor. I’m on an Avid job right now after a year away from it and miss some of the flexibility that Premiere Pro offers. You really come to appreciate how fluid it is to edit with. I think both Final Cut Pro 7 and Premiere Pro are better for the editor, but they do add a bit more stress on the assistants. Of course, Joel and Ethan were generally shielded from that.”

One of the unknowns with Premiere Pro was the fact that Hail, Caesar! was being shot on film. Avid has tried-and-true methods for tracking film keycode, but that was never part of Premiere Pro’s architecture. Assistant editor David Smith explains, “EFILM scanned all of the negative at 2K resolution to ProRes for our cutting purposes. On an Avid job, they would have provided a corresponding ALE (Avid Log Exchange list) for the footage and you would be able to track keycode and timecode for the dailies. For this film, EFILM sync’ed the dailies and provided us with the media, as well as a Premiere Pro project file for each day. We were concerned about tracking keycode to turn over a cut list at the end of the job. Adobe even wrote us a build that included a metadata column for keycode. EFILM tracks their transfers internally, so their software would reference timecode back to the keycode in order to pull selects for the final scan and conform. At their suggestion, we used Change List software from Intelligent Assistance to provide a cut list, plus a standard EDL generated from Premiere Pro. In the end, the process wasn’t that much different after all.” EFILM scanned the selected negative clips at 4K resolution and the digital intermediate color correction was handled by Mitch Paulson under Roger Deakins’ supervision.

Adapting Premiere Pro to the Coen Brothers workflow

It was Katie McQuerrey’s job to test drive Premiere Pro ahead of the Coens and provide assistance as needed to get them up to speed. She says, “Joel was actually up to speed after a day or so. Initially we all wanted to make Premiere Pro work just like Final Cut, because it appears similar. Of course, many functions are quite different, but the longer we worked with it, the more we got used to some of the Premiere Pro ways of doing things. As functionality issues came up, Adobe would make adjustments and send new software builds. I would test these out first. When I thought they would be ready for Joel and Ethan to use, we’d install it on their machines. I needed to let them concentrate on the edit and not worry about software.”

Joel and Ethan Coen developed a style of working that stems from their film editing days and that carried over into their use of Final Cut Pro. This was adjusted for Premiere Pro. McQuerrey continues, “Ethan and Joel work on different computers. Ethan will pick selected takes and mark ins and outs. Then he saves the project and dings a bell. Joel opens that project up to use as he assembles scenes. With FCP you could have multiple projects open at once, but not so with Premiere. We found out from Adobe that the way to handle this was through the Media Browser module inside of Premiere. Joel could browse the drive for Ethan’s project and then access it for specific sequences or selected shots. Joel could import these through Media Browser into his project as a non-destructive copy, letting Ethan continue on. Media Browser is the key to working collaboratively among several editors on the same project.” Their edit system consisted of several Mac Pro “tube” models connected to Open Drives shared storage. This solution was developed by workflow engineer Jeff Brue for Gone Girl and is based on using solid state drives, which enable fast media access.

As with all films, Hail, Caesar! posed creative challenges that any application must be able to deal with. McQuerrey explains, “Unlike other directors, Joel and Ethan wait until all the shooting is done before anything is cut. I wasn’t cutting along with dailies as is the case with most other directors. This gave me time to get comfortable with Premiere and to organize the footage. Because the story includes movies within the movie, there are different aspect ratios, different film looks and color and black-and-white film material. Editorially it was an exciting project because of this. For example, if a scene in the film was being ‘filmed’ by the on-camera crew, it was in color and should appear to play out in real-time as you see the take being filmed. This same sequence might also appear later in a Moviola viewer, as black-and-white, edited film. This affected how sequences were cut. Some shots that were supposed to be real-time needed to look like one continuous take. Or someone in the film may be watching a rough cut, therefore that part had to be cut like a rough cut. This is a film that I think editors will like, because there are a lot of inside jokes they’ll appreciate.”

Fine tuning for the feature film world

One criticism of Adobe Premiere Pro CC has been how it handles large project files, particularly when it comes to load times. McQuerrey answers, “The Open Drives system definitely helped with that. We had to split the film up into separate projects for cuts, sound, visual effects, music, etc. in order to work efficiently. However, as we got later into the post, we found that even the smaller projects had grown to the size that load times got much slower. The remedy was to cull out old versions of sequences, so that these didn’t require indexing each time the project was opened. Periodically I would create archive projects to keep the oldest sequences and then delete most of the oldest sequences from the active project. This improved performance.”

The filmmaking team finished Hail, Caesar! with a lot of things they liked about their new software choice. McQuerrey says, “Joel likes some of the effects features in Premiere Pro to build transitions and temp comps. This film has more visual effects than a usual Coen Brothers film, including green screens, split screens, and time remaps. Many of the comps were done in Premiere, rather than After Effects. Ethan and Joel both work differently. Ethan would leave his bins in list view and do his mark-ups. On the other hand, Joel also really liked the icon view and hover scrubbing a lot. Temp sound editing while you are picture editing is very critical to their process. They’ll often use different takes or readings for the audio than for the picture, so how an application edits sound is as important – if not more so – than how it edits picture. We had a couple of bumps in the road getting the sound tracks interface working to our liking, but with Adobe’s help in building new versions of software for us, we got to the place where we really appreciated Premiere’s sound tools.”

Katie McQuerrey and I wrapped up the interview with an anecdote about the Coens’ unique approach to their new editing tool. McQuerrey explains, “With any application, there are a number of repetitive keystrokes. At one point Joel joked about using a foot pedal, like on an old upright Moviola. At first we laughed it off, but then I checked around and found that you could buy custom control devices for video game play, including special mice and even foot controls. So we ordered a foot pedal and hooked it up to the computer. It came with its own software that let us map command functions to the pedal. We did this with Premiere’s snapping control, because Joel constantly toggles it on and off!” It’s ironic, given the context of the Hail, Caesar! story, but here you have something straight out of the Golden Age of film that’s found itself useful in the digital age.

Click here for Adobe’s behind-the-scenes look.

Originally written for Digital Video magazine / CreativePlanetNetwork.

©2016 Oliver Peters

Adobe Premiere Pro CC Tips


Adobe Premiere Pro CC is the dominant NLE that I encounter amongst my clients. Editors who’ve shifted over from Final Cut Pro “classic” may have simply transferred existing skills and working methods to Premiere Pro. This is great, but it’s easy to miss some of the finer points in Premiere Pro that will make you more productive. Here are seven tips that can benefit nearly any project.

LUTs/Looks – With the addition of the Lumetri Color panel, it’s easy to add LUTs into your color correction workflow. You get there through the Color workspace preset or by applying a Lumetri Color effect to a timeline clip. Import a LUT from the Basic Correction or Creative section of the controls. From here, browse to any stored LUT on your hard drive(s) and it will be applied to the clip. There are plenty of free .cube LUTs floating around the web. However, you may not know that Look files, created through Adobe SpeedGrade CC in the .look format, may also be applied within the Creative section. You can also find a number of free ones on the web, including a set I created for SpeedGrade. Unlike LUTs, these also support effects used in SpeedGrade.
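If you’re curious what’s inside a .cube file, it’s just a plain-text grid of RGB values. The Python sketch below writes a small identity 3D LUT – one that changes nothing, which makes it handy for confirming your LUT pipeline is wired up before loading a creative look. The file name and grid size are arbitrary choices, and this is only a minimal sketch of the format.

```python
# Write a minimal identity 3D LUT in the plain-text .cube format.
size = 17  # common grid sizes are 17, 33 or 65
with open("identity.cube", "w") as f:
    f.write(f"LUT_3D_SIZE {size}\n")
    for b in range(size):
        for g in range(size):
            for r in range(size):  # red varies fastest in the .cube layout
                f.write(f"{r/(size-1):.6f} {g/(size-1):.6f} {b/(size-1):.6f}\n")
```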

Audio Mixing – Premiere Pro features very nice audio tools and internal audio mixing is a breeze. I typically use three filters on nearly every mix I create. First, I will add a basic dynamic compressor to all of my dialogue tracks. To keep it simple, I normally use the default preset. Second, I will add an EQ filter to my music tracks. Here, I will set it to notch out the midrange slightly, which lets the dialogue sit a bit better in the mix. Finally, I’ll add limiting to the master track. Normally this is set to soft clip at -10 dB. If I have specific loudness specs as part of my delivery requirements, then I’ll route my mix through a submaster bus first and apply the limiting to the submaster. I will apply the RADAR loudness meter to the master bus and adjust accordingly to be compliant.
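As a quick offline check of a bounced mix against those targets, here is a hedged Python sketch using the third-party soundfile and pyloudnorm libraries. The file name and the -24 LUFS figure are just example values; always follow the numbers in your delivery spec.

```python
import numpy as np
import soundfile as sf       # pip install soundfile
import pyloudnorm as pyln    # pip install pyloudnorm

data, rate = sf.read("final_mix.wav")  # hypothetical bounced stereo mix

peak_dbfs = 20 * np.log10(np.max(np.abs(data)) + 1e-12)
loudness = pyln.Meter(rate).integrated_loudness(data)  # BS.1770 integrated loudness

print(f"sample peak: {peak_dbfs:.1f} dBFS (example limiter ceiling: -10 dB)")
print(f"integrated loudness: {loudness:.1f} LUFS (e.g. -24 for many broadcast specs)")
```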

Power windows – This is a term that came from DaVinci Resolve, but is often used generically to talk about building up a grade on a shot by isolating areas within the image. For example, brightening someone’s face more so than the overall image. You can do this in Premiere Pro by stacking up more than one Lumetri Color effect onto a clip. Start by applying a Lumetri Color effect and grade the overall shot. Next, apply a second instance of the effect and use the built-in Adobe mask tool to isolate only the selection that you want to add the second correction to, such as an oval around someone’s head. Tweak color as needed. If the shot moves around, you can even use the internal tracker to have your mask follow the object. Do you have another area to adjust? Simply add a third effect and repeat the process.

Export/import titles – Premiere Pro titles are created in the Title Designer module and these titles can be exported as a separate metadata file (.prtl format). Let’s say that you have a bunch of titles that you plan to use repeatedly on new projects, but you don’t want to bring these in from one project to the next. You can do this more simply by exporting and re-importing the title’s data file. Simply select the title in the bin and then File/Export/Title. The hitch is that Adobe’s Media Browser will not recognize the .prtl format, so the easiest way to import it into a new project is to drag it from the Finder location straight into the new Premiere Pro project. This will create a new title inside of the new project. Both instances of this title are unique, so editing the title in any project won’t affect how it appears elsewhere.

Replace with clip – I work on a number of productions where there’s a base version of a commercial and then a lot of versions with small changes to each. A typical example is a spot that uses many different lower third phone numbers, which are market-specific. The Replace function shaves hours off of this workflow. I first duplicate a completed sequence and rename it. Then I select the correct phone number in the bin, followed by selecting the clip in the timeline to be changed. Right-click and choose Replace with Clip/From Bin. This will update the content of my timeline clip with the new phone number. Any effects or keyframes that have been applied in the timeline remain.

Optical flow speed changes – In a recent update, optical flow interpolation was added as one of the speed change choices. Other than the obvious uses of speed changes, I found this to be a great way of creating slower camera moves that look nearly perfect. Optical flow can be tricky – sometimes creating odd motion artifacts – and at other times it’s perfect. Say I have a camera slider move or pan along a mantle containing family photos. The move is too fast. So, yes, I can slow it down, but the horizontal motion will leave it stuttering or blurred. However, if I slow it to exactly 50% and select optical flow, in most cases I get very good results. That’s because this speed and optical flow have created perfect “in-between” frames. A :05 move is now :10 and works better in the edit. If I’m going to use this same clip a lot, I simply render/export it as a new piece of media, which I’ll bring back into the project as if it were a VFX clip.
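The 50% trick is easy to see with a little arithmetic. Here is a purely illustrative Python sketch of the frame math (not how Premiere Pro computes it internally):

```python
def source_positions(speed_percent, n_out=8):
    """Source-frame position sampled by each retimed output frame."""
    return [round(i * speed_percent / 100.0, 3) for i in range(n_out)]

# At exactly 50%, every other output frame is a real source frame and the
# rest fall exactly halfway between two frames - an easy interpolation.
print(source_positions(50))  # [0.0, 0.5, 1.0, 1.5, 2.0, ...]

# At an odd ratio, nearly every output frame must be synthesized from an
# uneven blend of neighbors, which is where artifacts tend to creep in.
print(source_positions(43))  # [0.0, 0.43, 0.86, 1.29, 1.72, ...]
```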

Render and replace – Premiere Pro CC is great when you have a ton of different camera formats and want to work with native media. While that generally works, a large project will really impede performance, especially in the editing sequence. The alternative is to transcode the clips to an optimized or so-called mezzanine format. Adobe does this in the sequence rather than in the bin, and it can be done for individual clips or every clip within the sequence. You might have a bunch of native 4K .mp4 camera clips in a 1080p timeline. Simply select the clips within the timeline that you would like to transcode and right-click for the Render and Replace dialogue. At this point you have several options, including whether to use clip or sequence settings, handle length, codec, and file location. If you choose “clip”, then what you get is a new, trimmed clip in an optimized codec, which will be stored in a separate folder. This becomes a great way to consolidate your media. The clip is imported into your bin, so you have access to both the original and the optimized clip at the original settings. Therefore, your consolidated clips are still 4K if that’s how they started.

This also works for Dynamic Link After Effects compositions. Render and Replace those for better timeline performance. But if you need to go back to the composition in order to update it in After Effects, that’s just a few clicks away.

©2016 Oliver Peters