VEGAS POST

If you are focused on Media Composer, Premiere Pro, or Final Cut Pro X, then it’s easy to forget that there are a number of other capable NLEs in the wild – especially for the Windows/PC platform. One of those is VEGAS Pro, which was originally developed by Sonic Foundry and came to market in 1999. At first VEGAS was a DAW, but video was quickly added, thus turning it into a competitive NLE. Over the years, the software development team moved to Sony (Sony Creative Software) and then to its new home, VEGAS Creative Software, owned by Magix, a German software company. Along with the VEGAS product line, Magix also handles the other creative apps from the Sonic Foundry/Sony family, including Acid and Sound Forge.

The VEGAS edit applications may be purchased in a number of versions and bundles, depending on your need and budget. Magix sent me VEGAS POST for this review, which bundles the latest VEGAS Pro 17 editing application with VEGAS Effects and VEGAS Image. The latter two apps were developed in conjunction with FXhome. VEGAS Effects shares technology with HitFilm Pro, while VEGAS Image shares a foundation with Imerge Pro. As a bundle, it’s similar to Adobe Premiere Pro grouped with After Effects and Photoshop.

VEGAS POST is available as a perpetual license or through subscription. Magix also offers other apps and add-ons, which are included with some of the other bundles or can be purchased separately. These include VEGAS DVD Architect, as well as plug-ins from Boris FX and NewBlue. Since I mainly work on Macs, Puget Systems loaned me one of their custom PCs for this review. More about Puget at the end.

First impressions

Technically this isn’t a first impression, because I’ve reviewed VEGAS a number of times in the past and my son had used VEGAS extensively for music recording and mixing years ago. But this is my first chance to look over the updated VEGAS Pro 17. To start with, this is an easy-to-use editing application that’s optimized for a Windows 10 workstation. It’s a 64-bit application that thoroughly takes advantage of the Windows media framework. If you are using NVIDIA or AMD graphics cards, then you also get the benefit of GPU acceleration for effects and some codecs, as well as for speeding up rendering.

If you’ve never worked with VEGAS and are used to NLEs like Avid Media Composer or Adobe Premiere Pro, then it will take a while to get comfortable with the user interface and workflow of VEGAS. In a very broad sense, the general editing concepts are much like Apple’s approach to iMovie and Final Cut Pro X, although VEGAS was there long before Apple developed their own take on things. VEGAS Pro 17 sports a modernized UI with a darker appearance and tabbed panels. The interface can be customized by undocking the tabs and moving panels. In addition to a fresher look, there are plenty of contextual “hamburger” menus throughout parts of the interface. I wish they had gone farther with this new interface design, but then I know how hard it is to change interfaces when you have a loyal fan base.

VEGAS claims to be faster than any other NLE. I would take that claim with a grain of salt, although it is fast and the interface is very responsive. It handles media well and thanks to the tight Windows integration, nearly any media file can be previewed from any location on your system and edited directly to the timeline. That’s without first importing these files into your project. Clips are treated as combined audio/video clips until you trim or ungroup them. VEGAS uses a very fluid form of hover scrubbing to skim through clips.

The editing workflow is more like working with an audio application than most video editing tools. So it’s oriented around a click-and-drag approach with direct timeline interaction. For example, you can trim clips while maintaining playback. If you drag a clip past the edge of another on the timeline, it will automatically create an audio and video crossfade in the overlap region between those two clips. This can be controlled by a preference setting. All timeline clips include audio and video fade handles. If you double-click a source clip in your project panel, it will automatically edit to the place in the timeline where your cursor is parked. Once you get used to the VEGAS way of editing, you can work quite quickly.

What’s new in VEGAS Pro 17

Thanks to Magix’s renewed support of the VEGAS Creative Software team, VEGAS Pro 17 sports over 30 new features. The biggest of these is nested timelines. VEGAS has always approached its project structure as being based on a single timeline. Working with multiple timelines required a workaround that involved opening multiple instances of the application. While nested or compound clips are pretty common in most other NLEs, this new feature in VEGAS takes a different approach.

To work with nested timelines, start by editing some clips together on the timeline, select them, and create and save a new nested timeline consisting of those selected clips. This nest is actually saved as a separate VEGAS project file. Multiple editors can work separately on each nest as its own VEGAS project. Changes made are then reflected back in the parent timeline. For instance, you might be editing a show and create three separate nested timelines for the show open, the body of the show, and the finale. Each segment could be edited as a self-contained project by a different editor. The three are then automatically combined and updated within the parent timeline. New tools on the bottom toolbar make it easy to navigate between the nested and parent timelines.

The next big feature is a unified color grading workspace. This combines all of the common grading tools like color wheels and curves into a single panel. Although VEGAS does not honor embedded camera LUTs, such as from an ARRI Amira, you can import and apply technical and creative LUTs from this grading workspace. VEGAS now also supports HLG HDR and the ACES 1.1 color standard. The tools worked well, but the controls have a rather coarse range when using a mouse. Holding down the Ctrl (control) key will give you a finer adjustment range.

Other features include planar tracking, optical flow slow motion, a warp flow (morph) transition, support for 8K files, integrated screen capture, a mesh warp effect, clip stabilization, and more. An offshoot of the warp transition is a new Smart Split edit function, which would commonly be used to shorten interview soundbites without a jump cut. Mark the in and out points within a lengthy timeline clip and perform the Smart Split command. In a single step, that section is trimmed out, the ends joined together, and a short warp flow transition is applied. You can see the effect in real time; however, the morph is between two still frames, rather than with continuous motion through the duration of the transition. Avid uses a similar effect in Media Composer.

Putting VEGAS Pro 17 through the paces

I used a mixture of media formats and projects to compare VEGAS Pro’s performance against Premiere Pro and DaVinci Resolve installed on the same PC. Most media was fine, although there’s no support for Blackmagic BRAW or Apple ProRes RAW. Standard ProRes media playback was good, except that VEGAS had issues with some 4K ProRes 4444 files from an ARRI Alexa LF camera. The other files that exhibited stuttering playback were the heavier 320Mb/s H.264 files from a Panasonic EVA1. The ProRes 4444 files played smoothly in Premiere Pro and Resolve, but only Resolve was fluid with the EVA1 files.

I also tested some older 4K REDCODE (.R3D) raw files. These open and play well in VEGAS, but the application does immediately start to generate some form of peak or proxy files upon import. However, you can cancel out of this file generation process and it doesn’t appear to affect the application or prevent you from playing the R3D. These were older RED One camera files, but judging by various forum comments I’ve seen, newer RED camera files are not yet supported.

VEGAS supports several project exchange formats, including AAF and FCPXML. The latter is a specific interchange format exported by Final Cut Pro X. My AAF import from Premiere translated well, but the FCPXML import wasn’t successful. This doesn’t surprise me, because only Resolve – among the other editing apps – correctly supports FCPXML.

To claim to be fast, you also need speedy timeline exports. I was surprised to find that ProRes and DNxHD/HR formats were not available, even though other NLEs do offer these format options on Windows. It also wasn’t clear what the exact H.264 MP4 settings were. For consistency across the competing applications, I exported the same :90 test timeline from each using the Sony XDCAM HD422 50Mb/s MXF format. Export times were 1:20 for VEGAS Pro and 1:08 for Premiere Pro. But Resolve smoked them both with a zippy 28 seconds.
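To put those export times into perspective, here’s a small illustrative calculation (using only the times measured above) that expresses each NLE’s export speed as a multiple of realtime for the same :90 timeline. A ratio above 1.0 means faster than realtime.

```python
# Export-speed comparison for the same :90 (90-second) XDCAM HD422 test
# timeline, based on the times measured in this review.
timeline_seconds = 90

export_times = {
    "VEGAS Pro": 80,        # 1:20
    "Premiere Pro": 68,     # 1:08
    "DaVinci Resolve": 28,  # 0:28
}

for nle, seconds in export_times.items():
    # How many seconds of timeline each NLE renders per second of export time.
    speed = timeline_seconds / seconds
    print(f"{nle}: {seconds}s export ({speed:.2f}x realtime)")
```

By this measure, Resolve rendered at better than 3x realtime, while VEGAS Pro and Premiere Pro both came in just faster than realtime.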

This PC was configured with an AMD rather than Intel CPU. It’s quite possible that VEGAS Pro or Premiere Pro may have fared better if I had been running the same test using an Intel Xeon or Core i9 processor. Nevertheless, media performance with VEGAS Pro is good, assuming you are working with formats for which VEGAS Pro is optimized.

I didn’t do much testing with either VEGAS Effects or VEGAS Image, but the combination does offer similarities to Adobe’s integration. You can send a clip on the timeline to VEGAS Effects for compositing, further image editing, or adding one of the many plug-in effects. When you save that change in VEGAS Effects, the clip is replaced in the VEGAS Pro timeline by a nest, which is a self-contained VEGAS project file. This method effectively gives you the same sort of dynamic link between the applications as you have between Premiere Pro and After Effects.

Conclusion

VEGAS Pro 17 continues to be a solid editor for the user who prefers Windows. It’s got a lot going for it, but it’s different than most other NLEs. The typical user is going to be the independent video professional who doesn’t regularly need to exchange project files or work with a mix of other freelance editors. That’s very similar to the situation with Final Cut Pro X among Apple users. Nevertheless, variety is good and that’s what VEGAS fans like about the application. So it’s a good thing that Magix sees value in supporting ongoing development.

A word about the Puget Systems PC

This review was made possible thanks in part to the help of Puget Systems. While Puget has benchmarked plenty of creative applications and has a good handle on how to optimize a PC configuration specific to your needs, they hadn’t ever tested the VEGAS products. In my consultation with them it was decided to configure a custom PC workstation optimized for Adobe applications. At retail, this would be a $5,000 system.

The workstation is a very quiet and cool tower built into a Fractal Design case. It was configured with an AMD Ryzen 9 16-core CPU, 64GB RAM, and an RTX 2080 Ti XC (11GB) GPU. Hard drives are the key to good media performance. We configured this with two Samsung 1TB M.2 SSDs – one of which was my main media drive for these tests. The SSD drives clocked in at about 2600 MB/s (write)/2900-3100 MB/s (read). Of course, the case and power supply support plenty of internal hardware and media expansion, along with numerous USB ports. While not cheap, this workstation is certainly more bang-for-the-buck than an equivalent HP or Apple workstation.

Part of the pleasure of dealing with the company is the personalized touch that Puget Systems’ customers enjoy when buying a new system. There’s a consultation call up front to nail down the specs and configuration that’s best for each customer. They are now extending that with Puget Systems Lab Services. This is an independent hardware consultation service, intended for people who need hardware advice, regardless of whether or not they are purchasing one of the custom Puget Systems PCs.

Originally written for Pro Video Coalition.

©2020 Oliver Peters

The Missing Mac

High-end Mac users waited six years for Apple to release its successor to the cylindrical 2013 Mac Pro. That’s a unit that was derided by some and was ideal for others. Its unique shape earned the nickname of the “trash can.” Love it or hate it, that Mac Pro lacked the ability to expand and grow with the times. Nevertheless, many are still in daily service and being used to crank out great work.

The 2019 Mac Pro revitalized Apple’s tower configuration – dubbed the “cheese grater” design. If you want expandability, this one has it in spades. But at a premium price that puts it way above the cost of a decked out 2013 Mac Pro. Unfortunately for many users, this leaves a gap in the product line – both in features and in price range.

If you want a powerful Mac in the $3,000 – $5,000 range without a built-in display, then there is none. I really like the top-spec versions of the iMac and iMac Pro, but if I already own a display, would like to only use an Apple XDR, or want an LG, Dell, Asus, etc., then I’m stuck. Naturally one approach would be to buy a 16″ MacBook Pro and dock it to an external display, using the MacBook Pro as a second display or in the clamshell configuration. I’ve discussed that in various posts and it’s one way nimble editing shops like Trim in London tend to work.

Another option would be the Mac Mini, which is closest to the unit that best fits this void. It recently got a slight bump up in specs, but it’s missing 8-core CPU options and an advanced graphics card. The best 6-core configuration might actually be a serviceable computer, but I would imagine effects requiring GPU acceleration will be hampered by the Intel UHD 630 built-in graphics. The Mini does tick a lot of the boxes, including wi-fi, Bluetooth, four Thunderbolt 3/USB-C ports, HDMI 2.0, two USB 3.0 ports, plus Ethernet and headphone jacks.

I’ve tested both the previous Mac Mini iteration (with and without an eGPU) and the latest 16″ MacBook Pro. Both were capable Macs, but the 16″ truly shines. I find it hard to believe that Apple couldn’t have created a Mac Mini with the same electronics as the loaded 16″ MacBook Pro. After all, once you remove the better speaker system, keyboard, and battery from the lower case of the laptop, you have about the same amount of “guts” as that of the Mac Mini. I think you could make the same calculation with the iMac electronics. Even if the Mini case needed to be a bit taller, I don’t see why this wouldn’t be technically possible.

Here’s a hypothetical Mac Mini spec (similar to the MacBook Pro) that could be a true sweet spot:

  • 2.4GHz 8-core, 9th-generation Intel Core i9 processor (or faster)
  • 64GB 2666MHz DDR4 memory
  • AMD Radeon Pro 5600M GPU with 8GB HBM2 memory
  • 1TB SSD storage (or higher – up to 8TB)
  • 10 Gigabit Ethernet

Such a configuration would likely be in the range of $3,000 – $5,000 based on the BTO options of the current Mini, iMac, and MacBook Pro. Of course, if you bump the internal SSD from 1TB to the higher capacities, the total price will go up. In my opinion, it should be easy for Apple to supply such a version without significant re-engineering. I recognize that if you went with a Xeon-based configuration, like the iMac Pros, then the task would be a bit more challenging, in part due to power demands and airflow. Naturally, an even higher-spec’ed Mac like this in the $5,000 – $10,000 range would also be appealing, but that would likely be a bridge too far for Apple.

What I ultimately want is a reasonably powerful Mac without being forced to also purchase an Apple display as this isn’t always the best option. But I want that without spending as much as on a car to get there. I understand that such a unit wouldn’t have the ability to add more cards, but then neither do the iMacs and MacBook Pros. So I really don’t see this as a huge issue. I feel that this configuration would be an instant success with many editors. Plug in the display and storage of your choice and Bob’s your uncle.

I’m not optimistic. Maybe Apple has run the calculation that such a version would rob sales from the 2019 Mac Pro or iMac Pros. Or maybe they simply view the Mini as fitting into a narrow range of server and home computing use cases. Whatever the reason, it seems clear to me that there is a huge gap in the product line that could be served by a Mac Mini with specs such as these.

On the other hand, Apple’s virtual WWDC is just around the corner, so we can always hope!

©2020 Oliver Peters

Paul McCartney’s “Who Cares”

Paul McCartney hasn’t been the type of rock star to rest on his past. Many McCartney-related projects have embraced new technologies, such as 360VR. The music video for Who Cares – McCartney’s musical answer to bullying – was filmed in both 16mm and 65mm film. And it was edited using Final Cut Pro X.

Who Cares features Paul McCartney and actress Emma Stone in a stylized, surreal song and dance number filmed in 65mm, which is bookended by a reality-based 16mm segment. The video was directed by Brantley Gutierrez, choreographed by Ryan Heffington, and produced through LA production company Subtractive.

Gutierrez has collaborated for over 14 years with Santa Monica-based editor Ryan P. Adams on a range of projects, including commercials, concerts, and music videos. Adams also did a stint with Nitro Circus, cutting action sports documentaries for NBC and NBCSN. In that time he’s used the various NLEs, including Premiere Pro, Media Composer, and Final Cut Pro 7. But it was the demands of concert videos that really brought about his shift to Final Cut Pro X.

___________________________________

[OP] Please tell me a bit about what style you were aiming for in Who Cares. Why the choice to shoot in both 16mm and 65mm film?

[Brantley Gutierrez] In this video, I was going for an homage to vaudevillian theater acts and old Beatles-style psychedelia. My background is working with a lot of photography. I was working in film labs when I was pretty young. So my DP and friend, Linus Sandgren, suggested film and had the idea, “What if we shot 65mm?” I was open to it, but it came down to asking the folks at Kodak. They’re the ones that made that happen for us, because they saw it as an opportunity to try out their new Ektachrome 16mm motion film stock.

They facilitated us getting the 65mm at a very reasonable price and getting the unreleased Ektachrome 16mm film. The reason for the two stocks was the separation of the reality of the opening scene – kind of grainy and hand-held – with the song portion. It was almost dreamlike in its own way. This was in contrast to the 65mm psychedelic part, which was all on crane, starkly lit, and with very controlled choreography. The Ektachrome had this hazy effect with its grain. We wanted something that would jump as you went between these worlds and 16 to 65 was about as big of a jump as we could get in film formats.

[OP] What challenges did you face with this combination of film stocks? Was it just a digital transfer and then you were only dealing with video files? Or was the process different than that?

[BG] The film went to London where they could process and scan the 65mm film. It actually went in with Star Wars. Lucasfilm had all of the services tied up, but they were kind enough to put our film in with The Rise of Skywalker and help us get it processed and scanned. But we had to wait a couple of extra days, so it was a bit of a nervous time. I have full faith in Linus, so I knew we had it. However, it’s a little strange these days to wait eight or nine days to see what you had shot.

We were a guinea pig for Kodak for the 16mm stock. When we got it back, it looked crazy! We were like, “Oh crap.” It looked like it had been cross-processed – super grainy and super contrasty. It did have a cool look, but more like a Tony Scott style of craziness. When we showed it to Kodak they agreed that it didn’t look right. Then we had Tom Poole, our colorist at Company 3 in New York, rescan the 16mm and it looked beautiful.

[Ryan P. Adams] Ektachrome is a positive stock, which hasn’t been used in a while. So the person in London scanning it just wasn’t familiar with it.

[BG] They just didn’t have the right color profile built for that stock yet, since it hadn’t been released yet. Of course, someone with a more experienced eye would know that wasn’t correct.

[OP] How did this delay impact your editing?

[BG] It was originally scanned and we started cutting with the incorrect version. In the meantime, the film was being rescanned by Poole. He didn’t really have to do any additional color correction to it once he had rescanned it. This was probably our quickest color correction session for any music video – probably 15 minutes total.

[RPA] One of the amazing things I learned is that all you have to do is give it some minor contrast and then it’s done. What it does give you is perfect skin tones. Once we got the proper scan and sat in the color session, that’s what really jumped out.

[OP] So then, what was the workflow like with Final Cut Pro X?

[RPA] The scans came in as DPX files. Here at Subtractive, we took those into DaVinci Resolve and spit out ProRes 422 HQ QuickTime files to edit with. To make things easy for Company 3, we did the final conform in-house using Resolve. An FCPXML file was imported into Resolve, we linked back to the DPX files, and then sent a Resolve project file to Company 3 for the final grade. This way we could make sure everything was working. There were a few effects shots that came in and we set all of that up so Tom could just jump on it and grade. Since he’s in New York, the LA and New York locations for Company 3 worked through a remote, supervised grading session.

[OP] The video features a number of effects, especially speed effects. Were those shot in-camera or added in post?

[RPA] The speed effects were done in post. The surreal world was very well choreographed, which just plays out. We had a lot of fun with the opening sequence in figuring out the timing. Especially in the transitional moment where Emma is staring into the hypnotic wheel. We were able to mock up a lot of the effects that we wanted to do in Final Cut. We would freeze-frame these little characters called “the idiots” that would jump into Emma’s head. I would do a loose rotoscope in Final Cut and then get the motion down to figure out the timing. Our effects people then remade that in After Effects.

[OP] How involved was Paul McCartney in the edit and in review-and-approval?

[BG] I’ve known Paul for about 13 years and we have a good relationship. I feel lucky that he’s very trusting of me and goes along with ideas like this. The record label didn’t even know this video was happening until the day of production. It was clandestine in a lot of ways, but you can get away with that when it’s Paul McCartney. If I had tried that with some other artist, I would have been in trouble. But Paul just said, “We’re going to do it ourselves.”

We showed him the cut once we had picture lock, before final color. He called on the phone, “Great. I don’t have any notes. It’s cool. I love it and will sign off.” That was literally it for Paul. It’s one of the few music videos where there was no going back and forth between the management, the artist, and the record label. Once Paul signed off on it, the record label was fine with it.

[OP] How did you manage to get Emma Stone to be a part of this video?

[BG] Emma is a really close friend of mine. Independently of each other, we both know Paul. Their paths have crossed over the years. We’ve all hung out together and talked about wanting to do something. When Paul’s album came out, I hit them both up with the idea for the music video and they both said yes.

The hardest part of the whole process was getting schedules to align. We finally had an open date in October with only a week and a half to get ready. That’s not a lot of time when you have to build sets and arrange the choreography. It was a bit of a mad dash. The total time was about six weeks from prep through to color.

Because of the nature of this music video, we only filmed two takes for Paul’s performance to the song. I had timed out each set-up so that we knew how long each scene would be. The car sequence was going to be “x” amount of seconds, the camera sequence would be “x” amount, and so on. As a result, we were able to tackle the edit pretty quickly. Since we were shooting 65mm film, we only had two or three takes max of everything. We didn’t have to spend a lot of time looking through hours of footage – just pick the best take for each. It was very old school in that way, which was fun.

[OP] Ryan, what’s your approach to organizing a project like this in Final Cut Pro X?

[RPA] I labelled every set-up and then just picked the best take. The first pass was just a rough to see what was the best version of this video. Then there were a few moments that we could just put in later, like when the group of idiots sings, “Who cares.”

My usual approach is to lay in the sections of synced song segments to the timeline first. We’ll go through that first to find the best performance moments and cut those into the video, which is our baseline. Then I’ll build on top of that. I like to organize that in the timeline rather than the browser so that I can watch it play against the music. But I will keyword each individual set-up or scene.

I also work that way when I cut commercials. I can manage this for a :30 commercial. When it’s a much bigger project, that’s where the organization needs to be a little more detailed. I will always break things down to the individual set-ups so I can reference them quickly. If we are doing something like a concert film, that organization may be broken up by the multiple days of the event. A great feature of Final Cut Pro X is the skim tool and that you can look at clips like a filmstrip. It’s very easy to keyword the angles for a scene and quickly go through it.

[OP] Brantley, I’m sure you’ve sat over the shoulder of the editor in many sessions. From a director’s point of view, what do you think about working with Final Cut Pro X?

[BG] This particular project was pretty well laid out in my head and it didn’t have a lot of footage, so it was already streamlined. On more complex projects, like a multi-cam edit, FCPX is great for me, because I get to look at it like a moving contact sheet from photography. I get to see my choices and I really respond to that. That feels very intuitive and it blows me away that every system isn’t like that.

[OP] Ryan, what attracted you about Final Cut Pro X in order to use it whenever possible?

[RPA] I started with Final Cut Pro X when they added multi-cam. At that time we were doing more concert productions. We had a lot of photographers who would fill in on camera and Canon 5Ds were prevalent. I like to call them “trigger-happy filmers,” because they wouldn’t let it roll all the way through.

FCPX came up with the solution to sync cameras with the audio on the back end. So I could label each photographer’s clips. Each clip might only be a few seconds long. I could then build the concert by letting FCPX sync the clips to audio even without proper timecode. That’s when I jumped on, because FCPX solved a problem that was very painful in Final Cut Pro 7 and a lot of other editing systems. That was an interesting moment in time when photographic cameras could shoot video and we hired a lot of those shooters. Final Cut Pro X solved the problem in a very cool way and it helped me tremendously.

We did this Tom Petty music video, which really illustrates why Final Cut Pro X is a go-to tool. After Tom had passed, we had to take a lot of archival footage as part of a music video, called Gainesville, that we did for his boxed set. Brantley shot a lot of video around Tom’s hometown of Gainesville [Florida], but they also brought us a box with a massive amount of footage that we put into the system. A mix of old films and tapes, some of Tom’s personal footage, all this archival stuff. It gave the video a wonderful feeling.

[BG] It’s very nostalgic from the point of view of Tom and the band. A lot of it was stuff they had shot in their 20s and had a real home movie feel. I shot Super 8mm footage around Tom’s original home and places where they grew up to match that tone. I was trying to capture the love his hometown has for him.

[RPA] That’s a situation where FCPX blows the competition out of the water. It’s easy to use the strip view to hunt for those emotional moments. So the skimmer and the strip view were ways for us to cull all of this hodge-podge of footage for those moments and to hit beats and moments in the music for a song that had been unreleased at that time. We had one week to turn that around. It’s a complicated situation to look through a box of footage on a very tight deadline and put a story to it and make it feel correct for the song. That’s where all of those tools in Final Cut shine. When I have to build a montage, that’s when I love Final Cut Pro X the most.

[OP] You’ve worked with the various NLEs. You know DaVinci Resolve and Blackmagic is working hard to make it the best all-in-one tool on the market. When you look at this type of application, what features would you love to see added to Final Cut Pro X?

[RPA] If I had a wishlist, I would love to see if FCPX could be scaled up for multiple seats and multiple editors. I wish some focus was being put on that. I still go to Resolve for color. I look at compositing as just mocking something up so we can figure out timing and what it is generally going to look like. However, I don’t see a situation currently where I do everything in the editor. To me, DaVinci Resolve is kind of like a Smoke system and I tip my hat to them.

I find that Final Cut still edits faster than a lot of other systems, but speed is not the most important thing. If you can do things quickly, then you can try more things out. That helps creatively. But I think that typically things take about as long from one system to the next. If an edit takes me a week in Adobe it still takes me a week in FCPX. But if I can try more things out creatively, then that’s beneficial to any project.

Originally written for FCP.co.

©2020 Oliver Peters

Jezebel

If you’ve spent any time in Final Cut Pro X discussion forums, then you’ve probably run across posts by Tangier Clarke, a film editor based in Los Angeles. Clarke was an early convert to FCPX and recently handled the post-production finishing for the film, Jezebel. I was intrigued by the fact that Jezebel was picked up by Netflix, a streaming platform that has been driving many modern technical standards. This was a good springboard to chat with Clarke and see how FCPX fared as the editing tool of choice.

_______________________________________________________________

[OP] Please tell me a little about your background in becoming a film editor.

[TC] I’m a very technical person and have always had a love for computers. I went to college for computer science, but along the way I discovered Avid Videoshop and started to explore editing more, since it married my technical side with creative storytelling. So, at UC Berkeley I switched from computer science to film.

My first job was at motion graphics company Montgomery/Cobb, which was in Los Angeles. They later became Montgomery & Co. Creative. I was a production assistant for main titles and branding packages for The Weather Channel, Fox, NBC, CBS, and a whole host of cable shows. Then, I worked for 12 years with Loyola Productions (no affiliation with Loyola Marymount University).

I moved on to a company called Black & Sexy TV, which was started by Dennis Dortch as a company to have more control over black images in media. He created a movie called A Good Day to be Black and Sexy in 2008, which was picked up and distributed by Magnolia Pictures and became a cult hit. It ended up in Blockbuster Video stores, Target, and Netflix. The success of that film was leveraged to launch Black & Sexy TV and its online streaming platform.

[OP] You’ve worked on several different editing applications, but tell me a bit about your transition to Final Cut Pro X.

[TC] I started my career on Avid, which was also at the time when Final Cut Pro “legacy” was taking off. During 2011 at Loyola Productions, I had an opportunity to create a commercial for a contest put out by American Airlines. We thought this was an opportunity for us as a company to try Final Cut Pro X.

I knew that it was for us once we installed it. Of course, there were a lot of things missing coming from Final Cut Pro 7, and a couple of bugs here and there. The one thing that was astonishing for me, despite the initial learning curve, was that within one week of use my productivity compared to Final Cut Pro 7 went through the roof. There was no correlation between anything I had used before and what I was experiencing with Final Cut X in that first week. I also noticed that our interns – whose only experience was iMovie – just picked up Final Cut Pro X with no problems whatsoever.

Final Cut Pro X was very liberating, which I expressed to my boss, Eddie Siebert, the president and founder of Loyola Productions. We decided to keep using it to the extent that we could on certain projects and worked with Final Cut Pro 7 and Final Cut Pro X side-by-side until we eventually just switched over.

[OP] You recently were the post supervisor and finishing editor for the film Jezebel, which was picked up by Netflix. What is this film about?

[TC] Jezebel is a semi-autobiographical film written and directed by Numa Perrier, who is a co-founder of Black & Sexy TV. The plot follows a 19-year-old girl who, after the death of her mother, begins to do sex work as an online chat room cam girl to support herself financially. Numa stars in the film, playing her older sister, and an actress named Tiffany Tenille plays Numa. This is also Numa Perrier’s feature film directorial debut. It’s a side of her that people didn’t know – about how she survived as a young adult in Las Vegas. So, she is really putting herself out there.

The film made its debut at South by Southwest last year, where it was selected as a “Best of SXSW” film by The Hollywood Reporter. After that it went to other domestic and international festivals. At some point it was seen by Ava DuVernay, who decided to pick up Numa’s film through her company, Array. That’s how it got to Netflix.

[OP] Please walk me through the editorial workflow for Jezebel. How did FCPX play a unique role in the post?

[TC] I was working on a documentary at the time, so I couldn’t fully edit Jezebel, but I was definitely instrumental in the process. A former coworker of mine, Brittany Lyles, was given the task of actually editing the project in Final Cut Pro X, which I had introduced her to a couple of years ago and trained her on how to use it. The crew shot with a Canon C300 camera and we used the Final Cut proxy workflow. Brittany wouldn’t have been able to work on it if we weren’t using proxies, because of her hardware. I was using a late 2013 Mac Pro, as well as a 2016 MacBook Pro.

At the front end, I assisted the production team with storage and media management. Frances Ampah (a co-producer on the film) and I worked to sync all the footage for Brittany, who was working with a copy of the footage on a dedicated drive. We provided Brittany with XMLs during the syncing process as she was getting familiar with the footage.

While Brittany was working on the cut, Numa and I were trying to figure out how best to come up with a look and a style for the text during the chat room scenes in the movie. It hadn’t been determined yet if I was going to get the entire film and put the graphics in myself or if I was going to hand it off to Brittany for her to do it. I pitched Numa on the idea of creating a Motion template so that I could have more control over the look, feel, and animation of the graphics. That way either Brittany or I could do it and it would look the same.

Brittany and Numa refined the edit to a point where it made more sense for me to put in a lot of the graphics and do any updating per Numa’s notes, because some of the text had changed as well. And we wanted to really situate the motion of the chat so that it was characteristic of what it looked like back then – in the 90s. We needed specific colors for each user who was logged into the screen. I had some odd color issues with Final Cut and ended up actually just going into the FCPXML file to modify color values. I’m used to going into files like that and I’m not afraid of it. I also used the FCP X feature in the text inspector to save format and appearance attributes. This was tremendously helpful to quickly assign the color and formatting for the different users in the chat room – saving a lot of time.
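Clarke’s trick of opening the FCPXML file to fix color values directly is easy to reproduce. As a rough illustration only – the element layout below is a simplified stand-in, not an actual FCPX export, and the param name and RGBA value are assumptions – a short Python script can rewrite a color parameter in the XML:

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for a title clip in an FCPXML file. Real FCPXML
# nests titles under <library>/<event>/<project>; text styling params
# generally follow the <param name="..." value="r g b a"> pattern.
snippet = """<title name="ChatLine">
  <param name="Color" value="1 1 1 1"/>
</title>"""

root = ET.fromstring(snippet)

# Assign a per-user chat color by rewriting the param's RGBA value.
for param in root.iter("param"):
    if param.get("name") == "Color":
        param.set("value", "0.2 0.6 1 1")  # light blue for this user

print(ET.tostring(root, encoding="unicode"))
```

For a real project you would parse the exported .fcpxml, make the edits, and re-import the file into Final Cut Pro X, which is essentially the round trip Clarke describes.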

Our secondary editor, Bobby Field, worked closely with Numa to do the majority of color grading on the film. He was more familiar with Premiere Pro than FCP X, but really enjoyed the color tools in Final Cut Pro X. Through experimentation, Bobby learned how to use adjustment layers to apply color correction. I was fascinated by this and it was a learning experience for me as well. I’m used to working directly with the clip itself, and in my many years of using FCP X, this wasn’t a method I had used or seen anyone else use firsthand.

[OP] What about sound post and music?

[TC] I knew that there’s only so much that I had the technical skill set to do, and I would not dare pretend that I know how to do certain things. I called on the help of Jim Schaefer – a skilled and trusted friend I worked with at Loyola Productions. I knew he wanted an opportunity to work on a big project, particularly a feature. The film needed a tremendous amount of sound work, so he took it on along with Travis Prater, a coworker of his at Source Sound in Woodland Hills. Together they really transformed the film.

Jim and Travis worked in Pro Tools, so I used X2Pro to get files to them. Jim gave me a list of how he wanted the film broken down. Because of the length of Jezebel, he preferred that the film be broken up into reels. In addition to reels, I also gave him the entire storyline with all of the roles. Everything was broken down very nicely using AAFs and he didn’t really have any problems. In his words, “It’s awesome that all the tracks are sorted by character and microphone – that’ll cut down significantly on the sorting/organizing pass for me.” The only hiccup we experienced was that metadata was missing from the AAF up to a certain point in Pro Tools, even though that metadata did exist in the original WAV files. Some clip names were inconsistent as well, but that may have happened during production.

[OP] Jezebel is streaming on Netflix, which has a reputation for having tough technical specs. Were there any special things you had to do to make it ready for the platform?

[TC] We supplied Array with a DCI 2K (full frame) QuickTime master in ProRes 422 HQ per their delivery schedule, along with other elements such as stereo and 5.1 mixes from Jim, Blu-ray, DVD, and DCP masters. I expected to do special things to make it ready for Netflix. Numa and I discussed this, but to my knowledge, the QuickTime that I provided to Array is what Netflix received. There were no special conversions made just for Netflix on the part of Array.

[OP] Now that you have this Final Cut Pro X experience under your belt, what would you change if you could? Any special challenges or shortcomings?

[TC] I had to do some composite shots for the film, so the only shortcoming for me was Final Cut’s compositing tool set. I’d love to have better tools built right into FCP X, like in DaVinci Resolve. I love Apple Motion and it’s fine for what it is, but it could go a little further for me. I’d love to see an update with improved compositing and better tracking. Better reporting for missing files, plugins, and other elements would also be tremendously helpful in troubleshooting vague alerts.

In spite of this, there was no doubt at any point in the process that Final Cut was fully capable of being at the center of everything that needed to be done – whether it was leveraging Motion for template graphics between Brittany and me, using a third-party tool to make sure that the post sound team had precisely what they needed, or exchanging XMLs or backup libraries with Bobby to make sure that his work got to me intact. I was totally happy with the performance of FCP X. It was just rock solid and for the most part did everything I needed it to do without slowing me down.

Originally written for FCP.co.

A special thanks to Lumberjack System for their assistance in transcribing this interview.

©2020 Oliver Peters

The Banker

Apple has launched its new TV+ service and this provides another opportunity for filmmakers to bring untold stories to the world. That’s the case for The Banker, an independent film picked up by Apple. It tells the story of two African American entrepreneurs attempting to earn their piece of the American dream during the repressive 1960s through real estate and banking. It stars Samuel L. Jackson, Anthony Mackie, Nia Long, and Nicholas Hoult.

The film was directed by George Nolfi (The Adjustment Bureau) and produced by Joel Viertel, who also signed on to edit the film. Viertel’s background hasn’t followed the usual path for a feature film editor. Interested in editing since high school, he moved to LA after college and landed a job at Paramount, where he eventually became a creative executive. During that time he kept up his editing chops and eventually left Paramount to pursue independent filmmaking as a writer, producer, and editor. His editing experience included Apple Final Cut Pro 1.0 through 7.0 and Avid Media Composer, but cutting The Banker was his first time using Apple’s Final Cut Pro X.

I recently chatted with Joel Viertel about the experience of making this film and working with Apple’s innovative editing application.

____________________________________________

[OP] How did you get involved with co-producing and cutting The Banker?

[JV] This film originally started while I was at Paramount. Through a connection from a friend, I met with David Smith and he pitched me the film. I fell in love with it right away, but as is the case with these films, it took a long while to put all the pieces together. While I was doing The Adjustment Bureau with George Nolfi and Anthony Mackie, I pitched it to them, and they agreed it would be a great project for us all to collaborate on. From there it took a few years to get to a script we were all happy with, cast the roles, get the movie financed, and off the ground.

[OP] I imagine that it’s exciting to be one of the first films picked up by Apple for their TV+ service. Was that deal arranged before you started filming or after everything was in the can, so to speak?

[JV] Apple partnered with us after it was finished. It was made and financed completely independently through Romulus Entertainment. While we were in the finishing stages, Endeavor Content repped the film and got us into discussions with Apple. It’s one of their first major theatrical releases and then goes on the platform after that. Apple is a great company and brand, so it’s exciting to get in on the ground floor of what they’re doing.

[OP] When I screened the film, one of the things I enjoyed was the use of montages to quickly cover a series of events. Was that how it was written or were those developed during the edit as a way to cut running time?

[JV] Nope, it was all scripted. Those segments can bedevil a production, because getting all of those little pieces is a lot of effort for very little yield. But it was very important to George and myself and the collaborators on the film to get them. It’s a film about banking and real estate, so you have to figure out how to make that a fun and interesting story. Montages were one way to keep the film propulsive and moving forward – to give it motion and excitement. We just had to get through production finding places to pick off those pieces, because none of those were developed in post.

[OP] What was your overall time frame to shoot and post this film?

[JV] We started in late September 2018 and finished production in early November. It was about 30 days in Atlanta and then a few days of pick-ups in LA. We started post right after Thanksgiving and locked in May, I think. Once Apple got involved, there were a few minor changes. However, Apple’s delivery specs were completely different from our original delivery specs, so we had to circle back on a bunch of our finishing.

[OP] Different in what way?

[JV] We had planned to finish in 2K with a 5.1 mix. Their deliverables are 4K with a Dolby Atmos mix. Because we had shot on 35mm film, we had the capacity, but it meant that we had to rescan and redo the visual effects at 4K. We had to lay the groundwork to do an Atmos mix and Dolby Vision finish for theatrical and home video, which required the 35mm film negative to be rescanned and dust-busted.

Our DP, Charlotte Bruus Christensen, has shot mostly on 35mm – films like A Quiet Place and The Girl on the Train – and those movies are beautiful. And so we wanted to accommodate that, but it presents challenges if you aren’t shooting in LA. Between Kodak in Atlanta and Technicolor in LA we were able to make it work.

Kodak would process the negative and Technicolor made a one-light transfer for 2K dailies. Those were archived and then I edited with ProRes LT copies in HD. Once we were done, Technicolor onlined the movie from their 2K scans. After the change in deliverable specs, Technicolor rescanned the clips used for the online finish at 4K and conformed the cut at 4K.

[OP] I felt that the eclectic score fit this movie well and really places it in time. As an editor, how did you work to build up your temp tracks? Or did you simply leave it up to the composer?

[JV] George and I have worked with our composer, Scott Salinas, for a very long time on a bunch of things. Typically, I give him a script and then he pulls samples that he thinks are in the ballpark. He gave me a grab bag of stuff for The Banker – some of which was score, some of which was jazz. I start laying that against the picture myself as I go and find these little things that feel right and set the tone of the movie. I’m finding my way for the right marriage of music and picture. If it works, it sticks. If it doesn’t, we replace it. Then at the end, he’s got to score over that stuff.

Most of the jazz in The Banker is original, but there are a couple tracks where we just licensed them. There’s a track called “Cash and Carry” that I used over the montage when they get rich. They’ve just bought the Banker’s Building and popped the champagne. This wacky, French 1970s bit of music comes in with a dude scatting over it while they are buying buildings or looking at the map of LA. That was a track Scott gave me before we shot a frame of film, so when we got to that section of the movie, I chose it out of the bin and put that sequence to it and it just stuck.

There are some cases where it’s almost impossible to temp, so I just cut it dry and give it to him. Sometimes he’ll temp it and sometimes he’ll do a scratch score. For example, the very beginning of the movie never had temp in any way. I just cut it dry. I gave it to Scott. He scored it and then we revised his scoring a bunch of times to get to the final version.

[OP] Did you do any official or “friends and family” screenings of The Banker while editing it? If so, did that impact the way the film turned out?

[JV] The post process is largely dictated by how good your first cut is. If the movie works, but needs improvement – that’s one thing. If it fundamentally doesn’t – that’s another. It’s a question of where you landed from the get-go and what needs to be fixed to get to the end of the road.

We’re big fans of doing mini-testing – bringing in people we know and people whose opinions we want to hear. At some point you have to get outside of the process and aggregate what you hear over and over again. You need to address the common things that people pick up on. The only way to keep improving your movie is to get outside feedback so they tell you what to focus on.

Over time that significantly impacted the film. It’s not like any one person said that one thing that caused us to re-edit the film. People see the problem that sticks out to them in the cut and you work on that. The next time there’s something else and then you work on that. You keep trying to make all the improvements you can make. So it’s an iterative process.

[OP] This film marked a shift for you from using earlier versions of Final Cut Pro to now cutting on Final Cut Pro X for the first time. Why did you make that choice and what was the experience like?

[JV] George has a relationship with Apple and they had suggested using Final Cut Pro X on his next project. I had always used Final Cut Pro 7 as my preference. We had used it on an NBC show called Allegiance in 2014 and then on Birth of the Dragon in 2015 and 2016 – long after it had been discontinued. We all could see the writing on the wall – operating systems would quit running it and it’s not harnessing what the computers can do.

I got involved in the conversation and was invited to come to a seminar at the Editors Guild about Final Cut Pro X that was taught by Kevin Bailey, who was the assistant editor for Whiskey Tango Foxtrot. I had looked at Final Cut Pro X when it first came out and then again several years later. I felt like it had been vastly improved and was in a place where I could give it a shot. So I committed at that point to cutting this film on Final Cut Pro X and teaching myself how to use it. I also hired Kevin to help as my assistant for the start of the film. He became unavailable later in the production, so we found Steven Moyer to be my assistant and he was fantastic. I would have never made it through without the both of them.

[OP] How did you feel about Final Cut Pro X once you got your sea legs?

[JV] It’s always hard to learn to walk again. That’s what a lot of editors bump into with Final Cut Pro X, because it is a very different approach than any other NLE. I found that once you get to know it and rewire your brain that you can be very fast on it. A lot of the things that it does are revolutionary and pretty incredible. And there are still other areas that are being worked on. Those guys are constantly trying to make it better. We’ve had multiple conversations with them about the possibilities and they are very open to feedback.

[OP] Every editor has their own way of tackling dailies and wading through an avalanche of footage coming in from production. And of course, Final Cut Pro X features some interesting ways to organize media. What was the process like for The Banker?

[JV] The sound and picture were both running at 24fps. I would upload the sound files from my hotel room in Atlanta to Technicolor in LA, who would sync the sound. They would send back the dailies and sound, which Kevin – who was assisting at that time – would load into Final Cut. He would multi-clip the sound files and the two camera angles. Everything is in a multi-clip, except for purely MOS B-roll shots. Each scene had its own event. Kevin used the same system he had devised with Jan [Kovac, editor on Whiskey Tango Foxtrot and Focus]. He would keyword each dialogue line, so that when you select a keyword collection in the browser, every take for that line comes up. That’s labor-intensive for the assistant, but it makes life that much faster for me once it’s set up.

[OP] I suppose that method also makes it much faster when you are working with the director and need to quickly get to alternate takes.

[JV] It speeds things along for George, but also for me. I don’t have to hunt around to find the lines when I have to edit a very long dialogue scene. You could assemble selects reels first, but I like to look at everything. I fundamentally believe there’s something good in every bad take. It doesn’t take very long to watch every take of a line. Plus I do a fair amount of ‘Franken-biting’ with dialogue where needed.

[OP] Obviously the final mix and color correction were done at specialty facilities. Since The Banker was shot on film, I would imagine that complicated the hand-off slightly. Please walk me through the process you followed.

[JV] Marti Humphrey did the sound at The Dub Stage in Burbank. We have a good relationship with him and can call him very early in the process to work out the timeline of how we are going to do things. He had to soup up his system a bit to handle the Atmos near-field stuff, but it was a good opportunity for him to get into that space. So he was able to do all the various versions of our mix.

Technicolor was the new guy for us. Mike Hatzer did the color grade. It was a fairly complex process for them and they were a good partner. For the conform, we handed them an XML and EDL. They had their Flex files to get back to the film edge code. Steven had to break up the sequence to generate separate tracks for the 35mm original, stock, and VFX shots, because Technicolor needed separate EDLs for those. But it wasn’t like we invented anything that hasn’t been done before.

We did use third-party apps for some of this. The great thing about that is you can just contact the developer directly. There was one EDL issue and Steven could just call up the app developer to explain the issue and they’d fix it in a couple of days.

[OP] What sort of visual effects were required? The film is set more or less 70 years ago, so were the majority of effects just to make the locations look right? Like cars, signs, and so on?

[JV] It was mostly period clean-up. You have to paint out all sorts of boring stuff, like road paint. For a story set in the 50s and 60s, those white lines have to come out. Wires, of course. On a couple of shots we wanted to ‘LA-ify’ Georgia. We shot some stuff in LA, but when you put Griffith Park right next to a shot of Newnan, Georgia, the way to blend that over is to put palm trees in the Newnan shot.

We also did a pick-up with Anthony while he was on another show that required a beard for that role. So we had to paint out his beard. Good luck figuring out which was the shot where we had to paint out his beard!

[OP] Now that you have a feature film under your belt with Final Cut Pro X, what are your thoughts about it? Anything you feel that it’s missing?

[JV] All the NLEs have their particular strengths. Final Cut has several that are amazing, like background exports and rendering. It has Roles, where you can differentiate dialogue, sound effects, and music sources. You can bus things to different places. This is the first time I’ve ever edited in 5.1, because Final Cut supports that. That was a fun challenge.

We used Final Cut Pro X to edit a movie shot on film, which is kind of a first at this level, but it’s not like we crashed into some huge problem with that. We gamed it out and it all worked like it was supposed to. Obviously it doesn’t do some stuff the same way. Fortunately through our relationship with Apple we can make some suggestions about that. But there really isn’t anything it doesn’t do. If that were the case, we would have just said that we can’t cut with this.

Final Cut Pro X is an evolving NLE – as they all are. What I realized at the seminar is that it has changed a lot from when it first appeared. It was a good experience cutting a movie on it. Some editors are hesitant, because that first hour is difficult and I totally get that. But if you push through that and get to know it, there are many things that are very good – addictively good, even. I would certainly cut another movie on it.

____________________________________________

The Banker started a limited theatrical release on March 6 and will be available on the Apple TV+ streaming service on March 20.

For even more details on the post process for The Banker, check out Pro Video Coalition.

Originally written for FCP.co.

©2020 Oliver Peters