Handling and Protecting Media

Once the industry entered the file-based era, we realized that dealing with and properly archiving audio and video files could make or break a production company. No more videotapes on the shelf to pull footage from. Unfortunately, many companies, producers, clients, and editors simply solved this with a hodgepodge of small, portable drives – FireWire, USB, Thunderbolt, whatever. That’s no longer practical. A typical 10-day, 4K shoot with a handful of formats can easily generate 8-10TB of original footage. That’s if the production is structured. Make that a two- to three-week documentary or reality-style production and you’ll have closer to 20-30TB. Not exactly something you want to deal with in post using a bunch of orange LaCie drives!

The road to safeguarding your files

At the day job, we were able to invest in a LumaForge Jellyfish shared storage system (NAS). It’s 480TB, which sounds like a lot, but after RAID protection the available net capacity is 316TB. And you only want to use 80%-90% of that for the most efficient operation. While that still sounds like a lot of storage, it is a finite amount. This means that you need to develop a strategy for archiving older projects and their associated media, yet still be able to easily find and restore them later for revisions.

Cloud storage remains a pipe dream at these quantities. LTO data tape back-up is also impractical, because of its linear read/write nature. It is only intended for deep storage archiving. Facilities that have attempted to use LTO as a type of near-line storage – with frequent restores, updates, and subsequent re-archiving – have worn out their LTO tapes long before their rated life.

Efficient media handling starts when a project or production is first originated. In our case, every new project gets a folder on the Jellyfish, and inside that folder is a standard group of subfolders for the corresponding project files, graphics, exports, and source footage. We assign every project a job number for billing, and that number is part of the top-level folder name, as well as in any project file name. This default template is generated for each new production using the Post Haste application.
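Post Haste handles this for us, but conceptually it’s nothing more than stamping out the same folder tree for every new job. Just to illustrate the idea, here’s a minimal Python sketch (the subfolder names and job-number format are hypothetical placeholders, not our actual template):

```python
from pathlib import Path

# Hypothetical subfolder template - substitute your own house structure.
SUBFOLDERS = ["01_Project_Files", "02_Graphics", "03_Audio", "04_Exports", "05_Dailies"]

def create_project(root: str, job_number: str, title: str) -> Path:
    """Create a top-level project folder (named with the job number) plus the standard subfolders."""
    project = Path(root) / f"{job_number}_{title}"
    for sub in SUBFOLDERS:
        (project / sub).mkdir(parents=True, exist_ok=True)
    return project

# Example (hypothetical paths): create_project("/Volumes/Jellyfish/PROJECTS", "1234", "BrandSpot")
```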

The location crew

On location all media is copied daily (with verification, using the Hedge application) to both master and back-up drives. Depending on the size of the crew, this is the responsibility of the DIT, assistant cameraman, or the director of photography. On large productions, the cost of these drives is built into the budget and they later end up on the shelf for safekeeping. On smaller or fast-turnaround jobs, fast temporary SSDs are used, which are later reused on other projects.
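Hedge takes care of the verified copies for us on location. Purely to illustrate the copy-then-verify idea (and not as a representation of how Hedge actually works internally), a stripped-down checksum copy could be sketched like this:

```python
import hashlib
import shutil
from pathlib import Path

def _sha256(path: Path, chunk: int = 1024 * 1024) -> str:
    """Hash a file in chunks so large camera files never have to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            digest.update(block)
    return digest.hexdigest()

def verified_copy(card: str, *destinations: str) -> None:
    """Copy every file from a camera card to each destination drive and confirm the checksums match."""
    for src in Path(card).rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(card)
        src_hash = _sha256(src)
        for dest_root in destinations:
            dest = Path(dest_root) / rel
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dest)
            if _sha256(dest) != src_hash:
                raise IOError(f"Checksum mismatch after copy: {dest}")

# Example (hypothetical volumes): verified_copy("/Volumes/A001", "/Volumes/MASTER", "/Volumes/BACKUP")
```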

Post starts here

The next step back at the shop is to copy all of this material from the location drives onto the Jellyfish into that project’s Source Media or Dailies subfolder. Once copied, I will proceed to clean up and reorganize all media into subfolders according to this hierarchy:

DATE / CAMERA / REEL

For example: 092819/A-CAMERA_ALEXA/A001

Or outside of the US, maybe: 28SEPT19/A-CAMERA_ALEXA/A001

If a camera file is buried several folders deep – due to the camera card structure or an error made by the crew member on location – I will move those files to the top level within the REEL subfolder without any other levels in between. Camera folders like DCIM, CLIP, etc. are thus left empty and are deleted from the Jellyfish. Remember that I still have the original master drive from the location, which will sit on the shelf. If I ever need to get back to a file in its original container, I have that option.
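If you wanted to script that flattening step rather than drag files around in the Finder, a rough Python sketch might look like this. The extension list is an assumption, and the code presumes clip names are already unique within a reel:

```python
from pathlib import Path

# Assumed clip extensions worth keeping at the top of the REEL folder.
CLIP_EXTENSIONS = {".mov", ".mp4", ".mxf", ".braw", ".r3d", ".wav"}

def flatten_reel(reel_folder: str) -> None:
    """Move clips buried in camera-card subfolders up to the top of the REEL folder, then delete the empty folders."""
    reel = Path(reel_folder)
    for clip in list(reel.rglob("*")):
        if clip.is_file() and clip.suffix.lower() in CLIP_EXTENSIONS and clip.parent != reel:
            clip.rename(reel / clip.name)  # assumes no duplicate clip names within the reel
    # Remove the now-empty camera folders (DCIM, CLIP, etc.), deepest paths first.
    for folder in sorted((p for p in reel.rglob("*") if p.is_dir()), reverse=True):
        if not any(folder.iterdir()):
            folder.rmdir()

# Example (hypothetical path): flatten_reel("/Volumes/Jellyfish/PROJECTS/1234/05_Dailies/092819/A-CAMERA_ALEXA/A001")
```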

I discussed relinking strategies in the previous post, and that comes into play here. Files from semi-pro and non-pro cameras, like DSLRs, GoPros, iPhones, etc., will have a prefix prepended to the file name using the Better Rename application. The prefix is typically a short 8-10 character alphanumeric string that indicates a job name reference, date, camera letter, and reel.

For instance, a file from the B-camera’s reel 7 for a production done for project ABC on September 28th would get the prefix “ABC0928B07_”. The camera-generated clip name would follow the underscore in that name. The point of doing this is to guarantee unique file names, especially when multiple cameras and filming days are involved. I also apply this process to sound files, even if the clip name reflects the scene and take number.
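Better Rename handles this interactively, but the batch logic is simple enough to sketch. The prefix format below follows the “ABC0928B07_” example above; treat the function itself as illustrative, not a copy of my exact process:

```python
from pathlib import Path

def prefix_clips(folder: str, job: str, date: str, camera: str, reel: int) -> None:
    """Prepend a unique job/date/camera/reel prefix (e.g. 'ABC0928B07_') to every clip in a folder."""
    prefix = f"{job}{date}{camera}{reel:02d}_"
    for clip in Path(folder).iterdir():
        if clip.is_file() and not clip.name.startswith(prefix):
            clip.rename(clip.with_name(prefix + clip.name))

# Example (hypothetical path): prefix_clips("/Volumes/Jellyfish/PROJECTS/1234/05_Dailies/092819/B-CAMERA/B007",
#                                           "ABC", "0928", "B", 7)
```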

The last step is to transcode and rate-convert all non-pro media. If my base rate is 23.98fps (23.976), then files like GoPro 59.94fps media get turned into ProRes at 23.98 (slomo). In that case, I will have a subfolder with the original media and a second subfolder with the transcoded media, both with proper file names. I usually apply the “_PR2398” suffix to these transcoded files. I have found that DaVinci Resolve is the best and fastest tool for this transcoding process and large batches can be run overnight as needed.
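Resolve is what we actually use for these batches, but just to show what the conform itself involves, here is a hedged sketch that shells out to ffmpeg instead (assuming ffmpeg is installed). It stretches 59.94fps material to play at 23.976, writes ProRes 422 HQ with the “_PR2398” suffix, and drops audio, since slowed-down clips won’t have usable sync sound anyway:

```python
import subprocess
from pathlib import Path

def conform_to_2398(src: str, dest_folder: str, speed_factor: float = 2.5) -> Path:
    """Transcode a clip to ProRes HQ and re-time it to play at 23.976 fps (59.94 / 23.976 = 2.5)."""
    src_path = Path(src)
    out = Path(dest_folder) / f"{src_path.stem}_PR2398.mov"
    subprocess.run([
        "ffmpeg", "-i", str(src_path),
        "-filter:v", f"setpts={speed_factor}*PTS",  # stretch timestamps so 59.94fps plays as slomo
        "-r", "24000/1001",                         # force the 23.976 output frame rate
        "-c:v", "prores_ks", "-profile:v", "3",     # ProRes 422 HQ
        "-an",                                      # drop audio on rate-converted clips
        str(out),
    ], check=True)
    return out
```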

Archiving your files

If the crew used temporary drives on location, then before these are reformatted and recycled, they are copied to inexpensive portables, like Seagate or Western Digital USB drives. These are then parked on the shelf for safekeeping. The objective is to end up with at least two copies of the source media – the unaltered camera original files and the new master files on the Jellyfish.

Once editing has been completed and approved and the client files have been delivered, we move into the archiving stage. For nearly every project, we try to make sure that a ProRes master and a textless ProRes master have been generated by the editor. In addition, the mixer or the editor will generate a mixed audio file and audio stems for dialogue, SFX, and music (as separate files). Many times, you end up making future changes or versions using these files without going back to the original project file.

The entire project folder with all of the associated media is now copied to a raw, removable hard drive. These are enterprise-grade drives. All of our workstations are equipped with docking stations for such drives. To date, we are up to 200 drives, ranging in size from 2TB to 8TB. They are indexed using the simple DiskCatalogMaker application, which generates a searchable index file of all of these archive drives. (Note – I would recommend spinning up these archive drives every few months.)
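DiskCatalogMaker does the indexing for us. A bare-bones, do-it-yourself equivalent would simply walk each archive drive and append the results to a searchable catalog file, something like this sketch:

```python
import csv
from datetime import datetime
from pathlib import Path

def index_drive(mount_point: str, catalog_csv: str) -> None:
    """Append every file on an archive drive to a searchable CSV catalog (drive label, path, size, date)."""
    drive = Path(mount_point)
    with open(catalog_csv, "a", newline="") as f:
        writer = csv.writer(f)
        for item in drive.rglob("*"):
            if item.is_file():
                stat = item.stat()
                writer.writerow([
                    drive.name,                    # volume / drive label
                    str(item.relative_to(drive)),  # path on that drive
                    stat.st_size,                  # size in bytes
                    datetime.fromtimestamp(stat.st_mtime).isoformat(timespec="seconds"),
                ])

# Example (hypothetical paths): index_drive("/Volumes/ARCHIVE_042", "/Users/editor/archive_catalog.csv")
```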

Let me mention that while this can be done at the end, I will often split this archival step into two phases. I will first copy only the Dailies media right after I have organized it on Jellyfish (before any editing), leaving the other project subfolders blank. The reason is that once location production is done, there won’t be anything else added to Dailies. In addition, it gives me three copies of the camera files – the location drive (or its back-up), Jellyfish, and the archive drive. Once the project is finished, I only need to copy the rest of the material from the other subfolders.

The last step is to move the project folder from the PROJECTS master folder on Jellyfish to the BACKED UP master folder. As long as we have space on Jellyfish, the project is never deleted. Often changes are required. When that happens, the affected project folder is moved from BACKED UP to PROJECTS again. The changes are made and client files delivered. Then the archive drive for that project is updated and re-indexed to the DiskCatalogMaker catalog file. The project file is finally returned to the BACKED UP folder. As we need space on Jellyfish, the oldest projects that haven’t been touched in a long while are deleted.

Redundancy is the key

There are two additional protection steps taken. All active project files (usually Premiere Pro) are copied to the company’s Dropbox by every editor at the end of each day. In the event of a catastrophic NAS failure – before the completion of that project – we can at least get to the project file in the cloud (Dropbox) and the media that is stored on the archive hard drive in order to restore the edit. (Note that if you do this with FCPX Libraries, they must first be “zipped,” because Dropbox and FCPX Libraries do not play well together.)
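Since an FCPX Library is really just a package (a folder) on disk, the zip step is easy to script. This is only a sketch of that one step, not an Apple-sanctioned workflow:

```python
import shutil
from pathlib import Path

def zip_fcpx_library(library_path: str, dest_folder: str) -> Path:
    """Zip an FCPX Library package so it can be copied to Dropbox without the package being picked apart."""
    lib = Path(library_path)               # e.g. a .fcpbundle package
    dest = Path(dest_folder).expanduser()
    dest.mkdir(parents=True, exist_ok=True)
    zip_path = shutil.make_archive(str(dest / lib.stem), "zip",
                                   root_dir=str(lib.parent), base_dir=lib.name)
    return Path(zip_path)

# Example (hypothetical paths): zip_fcpx_library("/Volumes/Jellyfish/PROJECTS/1234/Edit.fcpbundle",
#                                                "~/Dropbox/Projects/1234")
```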

The second item is that we have an additional folder on the Jellyfish for all completed masters. When an editor generates ProRes master and/or textless files, those files are also copied to this masters folder. That gives us quick access to all final versions, should the client require an extra web file or some other type of deliverable. It’s easy to simply encode new files from these ProRes masters without needing to search out the original project folder.

These steps may sound complex and daunting if you aren’t currently doing them. I have covered some of this in past posts, but I do update my processes over time. Once you get into a routine of doing these steps, the benefits pay off immensely. Your media is better protected, it’s easier to find in the future, and relinking is a no-brainer.

©2019 Oliver Peters

Foolproof Relinking Strategy

Prior to file-based camera capture, film and then videotape were the dominant visual acquisition technologies. To accommodate, post-production adopted a two-stage solution: work print editing + negative conform for film, offline/online editing for video. During the linear editing era high-res media on tape was transferred to a low-res tape format, like 3/4″, for creative editing (offline). The locked cut was assembled and enhanced with effects and graphics in a high-end online suite using an edit decision list and the high-res media. The inherent constraints of tape formats forced consistency in media standards and frame rates.

In the early nonlinear days, storage capacities were low and hard drives expensive, so this offline/online methodology persisted. Eventually storage could cost-effectively handle high-res media, but this didn’t eliminate these workflows. File-based camera acquisition has brought down operating cost, but the proliferation of formats and ever-increasing resolutions have meant that there is still a need for such a two-stage approach. This is now generally referred to as proxy versus full-resolution editing. The reasons vary, but typically it’s a matter of storage size, system performance, or the capabilities of the systems and operator/artist running the finishing/full-res (aka “online”) system.

All of this requires moving media around among drives, systems, locations, and facilities, thus making correct list management essential. Whether or not it works well depends on the ability to accurately relink media with each of these moves. Despite the ability of most modern NLEs to freely mix and match formats, sizes, frame rates, etc., ignoring certain criteria will break media relinking. You must be able to relink the same media between systems or between low and high-res media on the same or different systems.

Criteria for successful relinking

– Unique file names that match between low and high-res media (extensions are usually not important).

– Proper timecode that does not repeat within a single clip.

– A single, standard frame rate that matches the project’s base frame rate. Using conform or interpret functions within an NLE to alter a clip’s frame rate will mess up relinking on another system. Constant speed changes (such as slomo at 50%) are generally OK, but speed ramp effects tend to be proprietary to each NLE and typically do not translate correctly between different edit or grading applications.

– Match audio configurations between low and high-res media. If your camera source has eight channels of audio, then so must the low-res proxy media.

– Match clip duration. High-res media and proxies must be of the exact same length.

– Note that frame size, codec, and movie wrapper type (extension) do not need to match. (A quick preflight check, sketched below, can confirm these criteria before you attempt a relink.)
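As a sanity check before attempting a relink, you could run a quick preflight with ffprobe that compares each master/proxy pair against these criteria. This is a sketch that assumes ffprobe is installed and that the pairs share base file names; it is not a feature of any NLE:

```python
import json
import subprocess
from pathlib import Path

def probe(path: Path) -> dict:
    """Return stream and container info for a clip via ffprobe."""
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-show_streams", "-show_format", "-of", "json", str(path)],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)

def check_pair(master: Path, proxy: Path) -> list:
    """Compare the relink-critical properties of a master clip and its proxy; return a list of problems."""
    problems = []
    m, p = probe(master), probe(proxy)
    m_video = next(s for s in m["streams"] if s["codec_type"] == "video")
    p_video = next(s for s in p["streams"] if s["codec_type"] == "video")
    if m_video["r_frame_rate"] != p_video["r_frame_rate"]:
        problems.append("frame rate mismatch")
    if abs(float(m["format"]["duration"]) - float(p["format"]["duration"])) > 0.05:
        problems.append("duration mismatch")
    m_channels = sum(int(s.get("channels", 0)) for s in m["streams"] if s["codec_type"] == "audio")
    p_channels = sum(int(s.get("channels", 0)) for s in p["streams"] if s["codec_type"] == "audio")
    if m_channels != p_channels:
        problems.append("audio channel count mismatch")
    return problems

# Example: check_pair(Path("masters/ABC0928B07_C0012.mov"), Path("proxies/ABC0928B07_C0012.mov"))
```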

Proxy workflows

Several NLE applications – particularly Final Cut Pro X and Premiere Pro – offer built-in proxy workflows, which automatically generate proxy media and let the editor seamlessly toggle between full-res and proxy files. These are nice as long as you don’t move files around between hard drives.

In the case of Premiere Pro, you can delete proxy files once you no longer need them. From that point on you are only working with full-res media. However, the Premiere project continues to expect the proxy files to be available and wants to locate them when you launch the project. You can, of course, ignore this prompt, but it’s still hard to get rid of completely.

With FCPX, any time you move media and the Library file to another drive with a different volume name, FCPX prompts a relink dialogue. It seems to relink master clips just fine, but not the proxy media that it generated IF stored outside of the Library package. The solution is to set your proxy location to be inside the Library. However, this will cause the Library file to bloat in size, making transfers of Library files between drives and editors that much more cumbersome. So for these and other reasons (like not adhering strictly to the criteria listed above) relinking can often be problematic to impossible (Avid, I’m looking at you).

Instead of using the built-in proxy workflows for projects with extended timetables or huge amounts of media, I prefer an old-school method. Simply transcode everything, work with low-res media, and then relink to the master clips for finishing. Final Cut Pro X, Premiere Pro, and Resolve all allow the relinking of master clips to different media if the criteria match.

Here are five simple steps to make that foolproof.

1. Transcode all non-professional camera originals to a high-quality mastering codec for optimized performance on your systems. I’m talking about footage from DSLRs, GoPros, drones, smartphones, etc. On Macs this will tend to be the ProRes codec family. On PCs, I would recommend DNxHD/HR. Make sure file names are unique (rename if needed) and that there is proper timecode. Adjust frame rates in the transcode if needed. For example, 29.97fps recordings destined for a playback base rate of 23.98fps should be transcoded to play natively at 23.98fps. This new media will become your master files, so park the camera originals on the shelf with the intent of never needing them (but for safety, DO NOT erase them).

2. Transcode all master clips (both pro formats like RED or ARRI, as well as those transcoded in step 1) to your proxy format. Typically this might be ProRes Proxy at a lower frame size, like 1280 x 720. (This is obviously an optional step. If your system has sufficient performance and you have enough available drive space, then you may be able to simply edit with your master source files.)

3. Edit with your proxy media.

4. When you are ready to finish, relink the locked cut to your master files – pro formats like RED and ARRI – and/or the high-res transcodes from step 1.

5. Color correct/grade and add any final effects for finish and delivery.

©2019 Oliver Peters

Rocketman

The last two years have been rich for film audiences interested in the lives of rock legends. Rocketman was this year’s stylized biography of Elton John. Helmed by British actor/director Dexter Fletcher and starring Taron Egerton of the Kingsman film series, Rocketman tells John’s life through his songs. Astute film buffs also know that Fletcher was the uncredited, additional director who completed Bohemian Rhapsody through the end of principal photography and post, which invites obvious comparisons between the two rock biopics.

Shepherding Rocketman through the cut was seasoned film editor, Chris Dickens. With experience cutting comedies, dramas, and musicals, it’s impossible to pin Dickens down to any particular film genre. I had recently interviewed him for Mary Queen of Scots, which was a good place to pick up this conversation about editing Rocketman.

__________________________________________

[OP] Our last conversation was about Mary Queen of Scots. I presume you were in the middle of cutting Rocketman at that time. Those are two very different films, so what brought you to edit Rocketman?

[CD] I made a quick shift onto Rocketman after Mary Queen of Scots. It was a fast production with eight or nine months filming and editing. The project had been in the cards a year before and I had met with Dexter to discuss doing the film. But, it didn’t happen, so I had forgotten about it until it got greenlit. I like musicals and have done one before – Les Miserables. This one was more ambitious creatively. Right from the beginning I liked the treatment of it. Rocketman was a classic kind of musical, but it was different in that the themes were adult and had a strong visual sense. Also the treatment using Elton John’s songs and illustrating his life with those was interesting.

[OP] The director had a connection with both Rocketman and Bohemian Rhapsody. Both films are about rock legends, so audiences may draw an obvious comparison. What’s your feeling about the contrast between these two films?

[CD] Obviously, there are a lot of similarities. Both films are essentially rock biopics about a musical figure. Both Freddie and Elton were gay. So that theme is similar, but that’s where it ends. Bohemian Rhapsody was aimed at a wider audience, i.e. less adult material – sex and drug-taking – things like that. And secondly, it’s about music, but it’s not a musical. It’s always grounded in reality. Characters don’t get up and sing to the camera. It’s about Freddie Mercury and Queen and their music. So the treatment of it is very different. Another fundamental difference is that Elton John is still alive and Freddie Mercury is not, so that was right at the film’s core. From the start you know that, so it has a different kind of power.

[OP] Whenever a film deals with popular music – especially when the rights-owners are still alive and active – the treatment and use of that music can be a sticking point. Were Elton John or Bernie Taupin actively involved in the production of Rocketman?

[CD] Yes, they were. Bernie less so – mainly Elton. He didn’t come in the edit room that much, but his husband, David Furnish, was a bit more involved. Elton is not someone who goes out in public that much, except to perform. He’s such a massive star. But, he did watch cuts of the film and had notes – not at every stage – but, David Furnish was the conduit between us and him. Naturally, Elton sanctioned all of the music tracks that were used. But the film was not made by them, i.e. we were making the film and they were giving us notes.

[OP] How were the tracks handled? Was the music remixed from the original studio masters with Taron lip-syncing to Elton’s voice – or was it different?

[CD] The music was radically changed in some cases from the original – the arrangements, the scoring. The music was completely re-recorded and sung by Taron, the actor playing Elton. We evolved the choices made at the beginning during the edit. So alongside of the picture edit was a music edit and a music mix going on constantly. In some cases Taron was singing on-set and we used that for about a quarter of the tracks. These were going in and out of scenes that had natural dialogue. Taron would start singing and we would play the track underneath. Then at that point perhaps, he would start lip-syncing, so it was a combination. On some tracks he was completely lip-syncing to what he had recorded before. This set the tempo for those scenes, but the arrangements evolved during the edit.

Even when he was lip-syncing, it was to his own voice. The whole idea was that the singing would not be Elton, except at the end where we have a track with both singing in the credits roll. So it’s a key thing that these were new recordings. Giles Martin, son of the legendary George Martin, was the music producer who took care of everything and put up with our constant changes. We had a team of two music editors who worked alongside us and a score as well, written by Matt Margeson, which we were rolling into the film in places. It was a real team process of building the film slowly.

[OP] Please expand on the structure of a film musical and what it takes to edit one.

[CD] The editing process was challenging, because of the complex structure. It was fundamentally a musical, with fifteen or sixteen tracks – meaning songs or music numbers – that were initially planned to be shot. Some of these were choreographed song-and-dance sequences. Combined with that was a sort of kitchen sink drama about Elton’s life, his childhood, his teenage years, and then into manhood. And then becoming a superstar. The script has the songs and then long sequences of more classic storytelling. What I found – slowly, as we were putting the film together, even during the shoot – was that we needed to unify those two things within the edit.

For instance, the first song number in the movie is “The Bitch is Back.” It’s a dance sequence with Elton as a boy walking down the street while people are singing and dancing around him. Then his adult self is chasing him around. It’s a very stylized sequence, which then went into about an eight minute sequence of storytelling about his childhood. We needed to give the film the same tone all the way through, i.e. that slightly fantastical feel of a musical. We screened it a few times for some of the core people and it became clear that we wanted to go with the fantastical elements of the film, not the more down-to-earth, realistic elements. Obviously, you could have made the choice to cut back on the music, but that seemed counter-intuitive. So we had to make some deep cuts in the sections between the musical pieces to get the story to flow and have that same kind of tone.

There was also a flashback structure. The film starts with Elton later on as an adult in rehab, after having fallen into drug and alcohol addiction. We framed the film with this device, so it was another element that we had to make work in the edit to get it to feel as an organic part of the story. We found that we didn’t have enough of these rehab sequences and had to shoot a few more of them during the edit to knit the film together in this way in order to remind you that he was telling this story – looking back on his past.

Cutting back sections between the musical numbers wasn’t our only solution to get the right tone. We had to work out how to get in and out of the musical sequences and that’s where the score comes in. I played with this quite a lot with the composer and Giles to have themes from Elton’s songs coming throughout the film. For example, “Goodbye Yellow Brick Road” had some musical themes in it that we started using as the theme that went with his rehab. The theme of the film is that Elton lost any sense of where he came from as a person, because of his stardom, and “Goodbye Yellow Brick Road” – the song – is about that. It’s actually about going back to the farm and your roots. The song isn’t actually in the film until the very end when he performs it. So we found that using this musical theme as a motif throughout the film was very powerful and helped to combine the classical storytelling scenes with the musical scenes.

[OP] Was this process of figuring out the right balance something that happened at the beginning and then became a type of template for the rest of the film? Or was it a constant adjustment process throughout the cutting of Rocketman?

[CD] It was a constant thing trying to make the film work as a whole so people wouldn’t be confused about the tone. At one point we had far too much music and had to take some out. It became very minimal in some areas. In others, it led you more. It was about getting that balance right all the way through. I’m primarily a picture editor, but on this film you couldn’t just concentrate on the picture and then leave the music to the music editors and composer, because it was absolutely a fundamental part of the film. It was about music and so how you were using music was very key within the edit. Sometimes we had to cut longer songs down. Very few are at their original length. Some are half their recorded length.

[OP] This process sounds intriguing, since the scenes use a song as the underlying building block. Elton John’s songs tend to be pop songs – or at least they received a lot of radio airplay – so did those recorded lengths tend to drive the film?

[CD] No. At first I thought we’d have to be very faithful, but as we started cutting, the producers – and particularly Elton John’s side of it – didn’t care whether we cut things down or made them longer or added bits. They weren’t precious about it. In fact, they wanted us to be creative. The producers would say, “Don’t worry about cutting that down, Giles will deal with it.” Of course he would. Although sometimes he’d come back to me and say, “Look, this doesn’t quite work musically. You need to add a bit more time to this, or another couple of bars of music.” So we had a whole back-and-forth process like that.

For instance, in the track “Rocketman,” which is the film’s centerpiece, Elton tries to commit suicide. He’s at a party, gets drunk, and jumps into the swimming pool. While he’s underneath he starts seeing visions of himself as a child under there. He starts singing and gets fished out of the pool and then put on stage in a stadium. It’s a whole sequence that’s been planned to play like that. Of course, I couldn’t fit what they’d shot into the song – there wasn’t enough time. It was all good stuff, so I added a few bars. I’d give it to music and they’d say, “Oh, you can’t add that in that way.” So I’d go back and try different ways of doing it.

At the end, when he’s put back on stage at Dodger Stadium, he’s in a baseball uniform and then fires into the air like a rocket. They shot it in a studio without a big crowd and it looked okay. As soon as we started getting the visual effects, we thought, “Wow. This looks great.” So we doubled the length of that – added on, repeated the chorus, and all of that – because we thought people were going to love this. It looked and sounded great. But, when we then tested the film, it was way too long. It had just outstayed its welcome. We then had to cut it down again, although it was still longer than they’d originally planned it.

[OP] With a regular theatrical musical, the songs are written to tell the story. Here, you are using existing songs that weren’t written with that story in mind. I presume you have to be careful that you don’t end up with just a bunch of music videos strung back-to-back.

[CD] Exactly. I don’t think we ever strayed into that. It was always about – does it make its point? These songs were written at all times in his career, but we didn’t use them in their original chronological order. “Honky Cat” was written later than when we used it. He’s just getting successful and at the end of “Honky Cat” they are buying Rolls Royces and clothes and football teams. At the end of that there was a great song-and-dance routine with them dancing on a record – Elton and John Reid, his manager and also a kind of boyfriend. That part went on for two minutes and we ended up cutting it out. Partly because people and the producers who saw it thought it wasn’t the right style. It had a kind of 1920s or 1930s style with lots of dancers. It was a big number and took a long time to edit, but we took it out. I thought it was quite a nice sequence, but most people thought the film was better without it, because it wasn’t moving the story on.

[OP] Other than adjusting scenes and length, did friends-and-family and test audience screenings change your edit significantly?

[CD] We did three big screenings in Los Angeles, San Francisco, and Kansas City, plus a number of smaller ones in England. The audiences were a mix of people who were Elton John fans, as well as those who weren’t. Essentially people liked the film right from the start, but the audiences weren’t getting some parts, like the flashback structure with the rehab scenes – particularly at the beginning. They didn’t really understand what he was singing about.

That first song [“The Bitch Is Back”] caused a lot of difficulty, because it starts the film and says this is a musical. You have to handle that the right way. I think the initial problems were partly in how I had cut the sequence originally. I tried to show too much of the crowd around him and the dancers, and I thought that was the way to go with it. Actually, what turned out to be the way to go was the relationship between the two of them – Elton and Elton as the little boy – because that’s what the song was about. I then readjusted the edits, taking out a lot of the wide shots.

Also Taron had done some improvised dialogue to the little boy rather than just singing all the way through – dialogue lines like, “Stop doing that.” That was in the film a long time, but people didn’t like it and didn’t understand why he was angry with the boy. So we cut that out completely. Another issue was that right at the start, the little boy starts singing to Taron as Elton first, but audiences did not feel comfortable with it. We discussed it a lot and decided that the lead actor should be the one we hear singing first. We did a reshoot of that beginning portion of the scene. You have to let the audience into it more slowly than we had originally done. That’s a prime example of how editing decisions can lead to additional filming to really make it work.

[OP] You mentioned visual effects to complete the “Rocketman” scene. Were there a lot of effects used to make the film period-accurate or just for visual style?

[CD] Quite a lot, though not excessively, like a comic book movie. I imagine it was similar to Bohemian Rhapsody, which had to shoot gigs and concerts and places where you couldn’t go now and film. But our visual effects weren’t as fundamental, in that I didn’t need them to cut with. The boy underwater was all created, of course. Taron in the pool was actually him underwater, because he had breathing apparatus. But the little boy couldn’t, so he was singing ‘dry for wet’ – shot in the studio and put into the scene later. There were different evolutions of that scene. In one version we took the boy out completely and just had Taron singing.

The end of the film as written was going to be a re-imagined version of Elton John’s “I’m Still Standing” music video, which is on the beach in Cannes, shot in the 80s. The idea was to go there and shoot it with a lot more dancers. By the time the film was being shot, the weather changed and we couldn’t shoot that sequence. That whole ending was shot later, partly in a studio. Because we couldn’t afford to go to Cannes and reshoot the whole thing, someone was able to get the original rushes from that music video, which had been shot on 16mm film, but edited on videotape. We had to get permission from the original director of that music video and he was very happy for us to do it. We had the 16mm film rescanned and also removed the grain. Instead of Elton, we put Taron into it.  In every shot with Elton, we replaced his head with Taron’s and that became the ending sequence of the film. As a visual effect, that took quite a leap of faith, but it did work in the end. That wasn’t the original plan, but I think it’s better.

[OP] In Bohemian Rhapsody there was a conscious consideration of matching the Live Aid concert angles and actions. Was there anything like that in Rocketman?

[CD] There was no point in trying to do that on Rocketman. It was always going to be stylized and different from reality. We staged Dodger Stadium the way it looked, but we didn’t try to match it. The original concert was late afternoon and ours is more towards night, which was visually better. The visual inspiration came from the stills taken by a famous rock photographer and they look a little more night. At one point we talked about having a concert at the end and we tried shooting something, but it just didn’t feel right. We were going to get compared to BoRhap anyway, so we didn’t want to even try and do something the same way.

[OP] Any final thoughts or advice on how to approach a film like Rocketman?

[CD] Every movie is different. Every single time you come to a story, you nearly have to start again. The director wants to do it a certain way and you have to adapt to that. With some of the dramas or comedies that I’ve cut, it’s a less immediate process. You don’t really know how the whole thing is coming together until you get a sense of it quite late. With this, they shot a few of the song sequences early and as soon as I saw that, I thought right away, “Oh, this is great.” You can build a quick three-minute sequence to show people and you get a feel for the whole film. You can get excited about it. On a drama or even worse, on a thriller, you’re guessing how it’s coming together and you’re using all of your skills to do that.

The director and the story are the differences and I try to adapt. Dexter wanted the film to be popular, but also distinctive. He wanted to see very quickly how it was coming together. As soon as he was done filming he wanted to go to the edit and see how it was coming along. In that scenario you try to get some things done more quickly. So I would try to get some sequences put together knowing that, and then come back to them later if you’ve rushed them.

Since it’s a musical you could string together the songs and get a feel, but that would be misleading. When you start off you can produce a sequence very quickly that looks good, because you’ve got the music that makes it feel almost finished and that it’s working. But that can lead you into a dead end if you’re not careful – if you are too precious about the music – the length of it and such. You still have to be hard about the storytelling element. Ultimately all of the decisions come from the story – how long the scene is, whether you start on a close-up or a wide – I always try to approach everything like that. If you keep that in your head, you’ll make the right decisions.

©2019 Oliver Peters

Why editors prefer Adobe Premiere Pro CC

Over my career I’ve cut client jobs with well over a dozen different linear and nonlinear editing systems and/or brands. I’ve been involved with Adobe Premiere/Premiere Pro as a user on and off since Premiere 5.5 (yes kids – before Pro, CS, and CC). But I seriously jumped into regular use at the start of the Creative Cloud era, thanks to many of my clients’ shift away from Final Cut Pro. Some seriously gave FCPX a go, yet could never warm up to it. Others bailed right away. In any case, the market I work in and the nature of my clients dictate a fluency in Premiere Pro. While I routinely bounce between Final Cut Pro X, Media Composer, DaVinci Resolve, and Premiere Pro, the latter is my main axe at the day job.

Before I proceed, let me stop and acknowledge those readers who are now screaming, “But Premiere always crashes!” I certainly don’t want to belittle anyone’s bad experiences with an app; however, in my experience, Premiere Pro has been just as stable as the others. All software crashes on occasion and usually at the most inopportune time. Nevertheless, I currently manage about a dozen Mac workstations between home and work, which are exposed to our regular pool of freelance editors. Over the course of the past three to four years, Premiere Pro (as well as the other Creative Cloud applications) has performed solidly for us across a wide range of commercial, corporate, and entertainment projects. Realistically, if our experiences were as bad as many others proclaim, we would certainly have shifted to some other editing software!

Stability questions aside, why do so many professional editors prefer Adobe Premiere Pro given the choices available? The Final Cut Pro X fans will point to Premiere’s similarities with Final Cut Pro 7, thus providing a comfort zone. The less benevolent FCPX fanboys like to think these editors are set in their ways and resistant to change. Yet many Premiere Pro users have gone through several software or system changes in their careers and are no strangers to a learning curve. Some have even worked with Final Cut Pro X, but find Premiere Pro to be a better fit. Whatever the reason, the following is a short list (in no order of importance) of why Premiere Pro becomes such a good option for many editors, given the available alternatives.

Responsive interface – I find the Premiere Pro user interface to be the most responsive of any of the NLEs. I’m not talking about media handling, but rather the time between clicking on something or commanding a function and having that action occur. For example, Final Cut Pro X – which is an otherwise fast application – feels slower in this type of response time. When I click to select a clip in the timeline, it takes a fraction of a second to respond. The same action is nearly instant in Premiere Pro. The reason seems to be that FCPX is constantly writing each action to the Library in a “constant save” mode. I have seen such differences across multiple Macs and hard drive types over the eight years since its introduction, with very little improvement. Not a deal-breaker, but meanwhile, Premiere Pro has continued to become more responsive over the same period.

Customizable user interface – Users first exposed to Premiere Pro’s interface may feel it’s very complex. The truth is that you can completely customize the look, style, and complexity of the interface by re-arranging the stacked, tabbed, or floating panels. Make it as minimalistic or complex as you need and save these as workspaces. It’s not just the ability to show/hide panels, but unlike other NLEs, it’s the complete control over their size and location.

Media Browser – Premiere Pro includes a built-in Media Browser panel that enables the immediate review and import of clips external to your project. It’s not just a view of folders with clip names or thumbnails to be imported. The Media Browser offers the same scrubbing capabilities as clips in a bin. Furthermore, the editor can edit clips directly to the timeline from the Media Browser, which then automatically imports that clip into the project in a one-step process. You could start with a completely blank project (no imported media clips) and work directly between the Media Browser and the timeline if you wanted to.

Bins – Editors rely on bins for the organization of raw media. It’s the first level of project organization. FCPX went deep down this hole with Events and Keywords. Premiere Pro uses a more traditional approach and features three primary modes – list, thumbnail, and freeform. List and thumbnail are obvious, but what needs to be reiterated is that the thumbnail view enables Adobe’s hover scrubbing. While not as fluid as FCPX’s skimming, it’s a quick way to see what a clip contains. But more importantly, the thumbnails are completely resizable. If you want to see a few very large thumbnails in the bin, simply crank up the slider. The newest is a freeform view – something Avid editors know well. This removes the grid arrangement of the bin view and allows the editor to rearrange the position of clips within the panel for that bin. This is how many editors like to work, because it gives them visual cues about how material is organized, much like a storyboard.

Versatile media and project locations – Since Premiere Pro treats all of your external storage as available media locations (without the need for a structured MediaFiles folder or Library file), this gives the editor a better handle on controlling where media should be located. Of course, this puts the responsibility for proper media management on the user, without the application playing nanny. The big plus is that projects can be organized within a siloed folder structure on your hard drive. One main folder for each job, with subfolders for associated video clips, graphics, audio, and Premiere Pro project files. Once you are done, simply archive the job folder and everything is there. Or… If a completely different organizational structure better fits your needs – no sweat. Premiere Pro makes it just as easy.

Multiple open sequences/timelines – One big feature that brings editors to Premiere Pro instead of Media Composer or Final Cut Pro X is the ability to work with multiple open sequences in the timeline panel and easily edit between them. Thanks to the UI structure of Premiere Pro, editors can also have multiple stacked timeline panels open in their workspace – the so-called “pancake timeline” mode. Open a “KEM roll” (a selects sequence) in one panel and your working sequence in another. Then edit between the two timeline panels without ever needing to go back and forth between bins and the timeline.

Multiple open projects/collaboration – Premiere Pro’s collaboration capabilities (working with multiple editors on one job) are not as robust as those of Avid Media Composer. That being said, Premiere’s structure does enable a level of versatility not possible in the Avid environment – so it’s a trade-off. With Premiere project locking, the first editor to open a project has read/write control, while additional editors who open that same project can access the files in a read-only mode. Clips and sequences can be pulled (copied/imported) from a read-only project into your own active project. The two will then be independent of each other. This is further enhanced by the fact that Premiere offers standard “save as” computer functions. If Editor #1 wants to offload part of the work to Editor #2, simply saving the project as a new file permits Editor #2 to work in their own active version of the project with complete read/write control.

Mixed frame rates and sizes – Premiere Pro projects can freely mix media and timelines with different sizes, aspect ratios and frame rates. It’s not the only NLE to do that, but some applications still start by having the project file based on a specific sequence format. Everything in the project must conform or be modified to those settings. Both solutions are viable, but Premiere’s open approach is more versatile for editors working in the hodgepodge that is today’s media landscape.

Audio mixing – While all NLEs offer decent audio mixing capabilities, Premiere Pro offers more refined mixing functions, including track automation, submaster tracks, proper loudness measurement, and AU, VST, and VST3 plug-in support. FCPX attempts to offer a trackless mixing model using audio roles, but the mixing routine breaks down pretty quickly when you get to a complex scenario, often requiring multiple levels of compound clips (nested sequences). None of that is needed in Premiere Pro. In addition, Creative Cloud subscribers also have access to Adobe Audition, a full-fledged DAW application. Premiere Pro sequences can be sent directly to Audition for more advanced mixing, plus additional Audition-specific tools, like Loudness Match and Music Remix. Adobe markets these as powered by Adobe Sensei (Adobe’s branded artificial intelligence). Loudness Match analyzes an audio clip and intelligently raises the gain of the quieter sections, whereas traditional loudness controls raise or lower the entire clip by a fixed amount. Music Remix doesn’t actually remix a track. Instead, it automatically edits a track based on a target length. Set a desired duration and Audition will determine the correct music edit points to get close to that target. You can use the default or set it to favor shorter sections, which will result in more edit points.

Interoperability – Most professional editors do not work within a single software ecosystem. You often have to work with After Effects and Photoshop files. Needless to say, Premiere Pro features excellent interoperability with the other Adobe applications, whether or not you use the Dynamic Link function. In addition, there’s the outside world. You may send out to a Pro Tools mixer for a final mix. Or a Resolve colorist for grading. Built-in list/file export formats make this easy without the requirement for third-party applications to facilitate such roundtrips.

Built-in tools that enhance editing – This could be a rather long list, but I’ll limit myself to a few functions. The first one I use a lot is the Replace command. This appears to be the best and easiest to use of all the apps. I can easily replace clips on the timeline from the source clips loaded into the viewer or directly from any clip in a bin. No drag-and-drop required. The second very useful operation is built-in masking and tracking for nearly every video filter and color correction layer. This is right at your fingertips in the Effects Control panel without requiring any extra steps or added plug-ins. Need more? Bounce out to After Effects with its more advanced tools, including the bundled Mocha tracker.

Proxy workflow – Premiere Pro includes a built-in proxy workflow, which permits low-res edit proxies to be created externally and attached, or created within the application itself. In addition, working with proxies is not an all-or-nothing feature. You can toggle between proxies and high-res master clips, but you can also work with a mixture of proxies and high-res files. In other words, not all of your clips have to be transcoded into proxies to gain the benefit of a proxy workflow. Premiere takes care of tracking the various clip sizes and making sure that the correct size is displayed. It also calculates the size shift between proxy frame sizes and larger high-res frame sizes to keep the toggle between the two seamless.

Relinking – Lastly, Premiere Pro can work with media on any of the available attached drives; therefore, it’s got to be able to quickly relink these files if you move locations. I tend to work in a siloed folder structure, where everything I need for a project is contained within a job folder and its subfolders. These folders are often moved to other drives (for instance, if I need to travel with a project) or archived to an external drive and later restored. It’s critical that a project easily find and relink to the correct media files. Generally, as long as files stay in the same relative folder paths – in relation to the location of the project files on the drive – then Premiere can easily find all the necessary offline media files once a project is moved from its original location. This is true whether you move to a different drive with a different volume name or whether you move the entire job folder up or down a level within the drive’s folder hierarchy. Media relinking is either automatic or, at worst, requires one dialogue box for the editor to point Premiere to the new path for the first file. From there, Premiere Pro will locate all of the other files. I find this to be the fastest and least onerous relink operation of all the NLEs.
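To illustrate the relative-path idea in the simplest terms (this is only a conceptual sketch, not Premiere’s actual relink logic), the resolution amounts to re-anchoring each clip’s old path, relative to the project file, at the project file’s new location:

```python
import os
from pathlib import Path

def resolve_media(new_project_file: str, old_project_file: str, old_media_path: str) -> Path:
    """Find a media file after a job folder moves, by preserving its path relative to the project file."""
    # How the media sat relative to the project file before the move (may include '..').
    relative = os.path.relpath(old_media_path, start=os.path.dirname(old_project_file))
    # Re-anchor that relative path at the project file's new location.
    return (Path(new_project_file).parent / relative).resolve()

# Example: a clip at JOB_1234/Footage/clip.mov next to JOB_1234/Edit/job.prproj still resolves
# after JOB_1234 moves to a different drive, as long as the internal folder layout is unchanged.
```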

©2019 Oliver Peters