The Banker

Apple has launched its new TV+ service, which provides another opportunity for filmmakers to bring untold stories to the world. That’s the case for The Banker, an independent film picked up by Apple. It tells the story of two African American entrepreneurs attempting to earn their piece of the American dream through real estate and banking during the repressive 1960s. It stars Samuel L. Jackson, Anthony Mackie, Nia Long, and Nicholas Hoult.

The film was directed by George Nolfi (The Adjustment Bureau) and produced by Joel Viertel, who also signed on to edit the film. Viertel’s background hasn’t followed the usual path for a feature film editor. He became interested in editing while still in high school, and a move to LA after college landed him a job at Paramount, where he eventually became a creative executive. During that time he kept up his editing chops and eventually left Paramount to pursue independent filmmaking as a writer, producer, and editor. His editing experience included Apple Final Cut Pro 1.0 through 7.0 and Avid Media Composer, but cutting The Banker was his first time using Apple’s Final Cut Pro X.

I recently chatted with Joel Viertel about the experience of making this film and working with Apple’s innovative editing application.

____________________________________________

[OP] How did you get involved with co-producing and cutting The Banker?

[JV] This film originally started while I was at Paramount. Through a connection from a friend, I met with David Smith and he pitched me the film. I fell in love with it right away, but as is the case with these films, it took a long while to put all the pieces together. While I was doing The Adjustment Bureau with George Nolfi and Anthony Mackie, I pitched it to them, and they agreed it would be a great project for us all to collaborate on. From there it took a few years to get to a script we were all happy with, cast the roles, get the movie financed, and off the ground.

[OP] I imagine that it’s exciting to be one of the first films picked up by Apple for their TV+ service. Was that deal arranged before you started filming or after everything was in the can, so to speak?

[JV] Apple partnered with us after it was finished. It was made and financed completely independently through Romulus Entertainment. While we were in the finishing stages, Endeavor Content repped the film and got us into discussions with Apple. It’s one of their first major theatrical releases and then goes on the platform after that. Apple is a great company and brand, so it’s exciting to get in on the ground floor of what they’re doing.

[OP] When I screened the film, one of the things I enjoyed was the use of montages to quickly cover a series of events. Was that how it was written or were those developed during the edit as a way to cut running time?

[JV] Nope, it was all scripted. Those segments can bedevil a production, because getting all of those little pieces is a lot of effort for very little yield. But it was very important to George and myself and the collaborators on the film to get them. It’s a film about banking and real estate, so you have to figure out how to make that a fun and interesting story. Montages were one way to keep the film propulsive and moving forward – to give it motion and excitement. We just had to get through production finding places to pick off those pieces, because none of those were developed in post.

[OP] What was your overall time frame to shoot and post this film?

[JV] We started in late September 2018 and finished production in early November. It was about 30 days in Atlanta and then a few days of pick-ups in LA. We started post right after Thanksgiving and locked in May, I think. Once Apple got involved, there were a few minor changes. However, Apple’s delivery specs were completely different from our original delivery specs, so we had to circle back on a bunch of our finishing.

[OP] Different in what way?

[JV] We had planned to finish in 2K with a 5.1 mix. Their deliverables are 4K with a Dolby Atmos mix. Because we had shot on 35mm film, we had the capacity, but it meant that we had to rescan and redo the visual effects at 4K. We had to lay the groundwork to do an Atmos mix and Dolby Vision finish for theatrical and home video, which required the 35mm film negative to be rescanned and dust-busted.

Our DP, Charlotte Bruus Christensen, has shot mostly on 35mm – films like A Quiet Place and The Girl on the Train – and those movies are beautiful. And so we wanted to accommodate that, but it presents challenges if you aren’t shooting in LA. Between Kodak in Atlanta and Technicolor in LA we were able to make it work.

Kodak would process the negative and Technicolor made a one-light transfer for 2K dailies. Those were archived and then I edited with ProRes LT copies in HD. Once we were done, Technicolor onlined the movie from their 2K scans. After the change in deliverable specs, Technicolor rescanned the clips used for the online finish at 4K and conformed the cut at 4K.
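To make the conform step concrete, here is a minimal Python sketch of the matching logic an offline/online workflow like this relies on: each event in the cut carries a reel name and source timecodes in the common CMX3600 EDL layout, which is what lets a finishing house pull the same frames from a 4K rescan instead of the 2K dailies. The reel names and file paths below are hypothetical, not the production’s actual conventions.

```python
# Minimal sketch of the relink logic behind an offline/online conform.
# Each event line follows the common CMX3600 layout:
#   event#  reel  track  transition  src-in  src-out  rec-in  rec-out
# The reel name plus source timecode is enough to pull the same frames
# from a 4K rescan. Reel names and paths here are hypothetical.

from dataclasses import dataclass

@dataclass
class Event:
    reel: str     # lab/camera roll identifier
    src_in: str   # source timecode in
    src_out: str  # source timecode out

def parse_cmx_event(line: str) -> Event:
    """Pull the reel and source timecodes out of one CMX3600 event line."""
    f = line.split()
    return Event(reel=f[1], src_in=f[4], src_out=f[5])

def relink(event: Event, scans: dict) -> str:
    """Map an offline event back to its 4K scan by reel name."""
    return scans[event.reel]

scans_4k = {"A012R3": "/san/4k_scans/A012R3.mov"}  # hypothetical path
ev = parse_cmx_event(
    "001  A012R3 V  C  01:02:03:04 01:02:08:10 01:00:00:00 01:00:05:06")
print(relink(ev, scans_4k))  # -> /san/4k_scans/A012R3.mov
```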

[OP] I felt that the eclectic score fit this movie well and really places it in time. As an editor, how did you work to build up your temp tracks? Or did you simply leave it up to the composer?

[JV] George and I have worked with our composer, Scott Salinas, for a very long time on a bunch of things. Typically, I give him a script and then he pulls samples that he thinks are in the ballpark. He gave me a grab bag of stuff for The Banker – some of which was score, some of which was jazz. I start laying that against the picture myself as I go and find these little things that feel right and set the tone of the movie. I’m finding my way for the right marriage of music and picture. If it works, it sticks. If it doesn’t, we replace it. Then at the end, he’s got to score over that stuff.

Most of the jazz in The Banker is original, but there are a couple of tracks that we just licensed. There’s a track called “Cash and Carry” that I used over the montage when they get rich. They’ve just bought the Banker’s Building and popped the champagne. This wacky, French 1970s bit of music comes in with a dude scatting over it while they are buying buildings or looking at the map of LA. That was a track Scott gave me before we shot a frame of film, so when we got to that section of the movie, I chose it out of the bin and put that sequence to it and it just stuck.

There are some cases where it’s almost impossible to temp, so I just cut it dry and give it to him. Sometimes he’ll temp it and sometimes he’ll do a scratch score. For example, the very beginning of the movie never had temp in any way. I just cut it dry. I gave it to Scott. He scored it and then we revised his scoring a bunch of times to get to the final version.

[OP] Did you do any official or “friends and family” screenings of The Banker while editing it? If so, did that impact the way the film turned out?

[JV] The post process is largely dictated by how good your first cut is. If the movie works, but needs improvement – that’s one thing. If it fundamentally doesn’t – that’s another. It’s a question of where you landed from the get-go and what needs to be fixed to get to the end of the road.

We’re big fans of doing mini-testing – bringing in people we know and people whose opinions we want to hear. At some point you have to get outside of the process and aggregate what you hear over and over again. You need to address the common things that people pick up on. The only way to keep improving your movie is to get outside feedback so they tell you what to focus on.

Over time that significantly impacted the film. It’s not like any one person said that one thing that caused us to re-edit the film. People see the problem that sticks out to them in the cut and you work on that. The next time there’s something else and then you work on that. You keep trying to make all the improvements you can make. So it’s an iterative process.

[OP] This film marked a shift for you from using earlier versions of Final Cut Pro to now cutting on Final Cut Pro X for the first time. Why did you make that choice and what was the experience like?

[JV] George has a relationship with Apple and they had suggested using Final Cut Pro X on his next project. I had always used Final Cut Pro 7 as my preference. We had used it on an NBC show called Allegiance in 2014 and then on Birth of the Dragon in 2015 and 2016 – long after it had been discontinued. We could all see the writing on the wall – operating systems would eventually quit running it, and it wasn’t harnessing what modern computers can do.

I got involved in the conversation and was invited to come to a seminar at the Editors Guild about Final Cut Pro X that was taught by Kevin Bailey, who was the assistant editor for Whiskey Tango Foxtrot. I had looked at Final Cut Pro X when it first came out and then again several years later. I felt like it had been vastly improved and was in a place where I could give it a shot. So I committed at that point to cutting this film on Final Cut Pro X and teaching myself how to use it. I also hired Kevin to help as my assistant for the start of the film. He became unavailable later in the production, so we found Steven Moyer to be my assistant and he was fantastic. I would have never made it through without the both of them.

[OP] How did you feel about Final Cut Pro X once you got your sea legs?

[JV] It’s always hard to learn to walk again. That’s what a lot of editors bump into with Final Cut Pro X, because it is a very different approach than any other NLE. I found that once you get to know it and rewire your brain that you can be very fast on it. A lot of the things that it does are revolutionary and pretty incredible. And there are still other areas that are being worked on. Those guys are constantly trying to make it better. We’ve had multiple conversations with them about the possibilities and they are very open to feedback.

[OP] Every editor has their own way of tackling dailies and wading through an avalanche of footage coming in from production. And of course, Final Cut Pro X features some interesting ways to organize media. What was the process like for The Banker?

[JV] The sound and picture were both running at 24fps. I would upload the sound files from my hotel room in Atlanta to Technicolor in LA, who would sync the sound. They would send back the dailies and sound, which Kevin – who was assisting at that time – would load into Final Cut. He would multi-clip the sound files and the two camera angles. Everything is in a multi-clip, except for purely MOS B-roll shots. Each scene had its own event. Kevin used the same system he had devised with Jan [Kovac, editor on Whiskey Tango Foxtrot and Focus]. He would keyword each dialogue line, so that when you select a keyword collection in the browser, every take for that line comes up. That’s labor-intensive for the assistant, but it makes life that much faster for me once it’s set up.
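For readers curious what that per-line keywording looks like under the hood, here is a rough, hypothetical sketch of the idea expressed as FCPXML, which does support keyword elements on clips. Everything else – the clip name, line IDs, 24fps timebase, and simplified structure – is an assumption for illustration, not the actual system Bailey devised.

```python
# Rough sketch: attach one keyword per scripted dialogue line so every
# take of that line lands in its own keyword collection. FCPXML does
# support <keyword> elements on clips; the clip name, line ID, and
# 24fps timebase below are illustrative assumptions.

import xml.etree.ElementTree as ET

FPS = 24  # assumed timebase; FCPXML expresses times as rational seconds

def add_line_keyword(clip: ET.Element, line_id: str,
                     in_frame: int, out_frame: int) -> None:
    """Mark the span of one dialogue line on a clip with a keyword."""
    ET.SubElement(clip, "keyword", {
        "value": line_id,                              # e.g. "Sc42_Line07"
        "start": f"{in_frame}/{FPS}s",                 # rational seconds
        "duration": f"{out_frame - in_frame}/{FPS}s",
    })

clip = ET.Element("asset-clip", {"name": "Sc42_TK03"})  # hypothetical take
add_line_keyword(clip, "Sc42_Line07", in_frame=300, out_frame=444)
print(ET.tostring(clip, encoding="unicode"))
```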

[OP] I suppose that method also makes it much faster when you are working with the director and need to quickly get to alternate takes.

[JV] It speeds things along for George, but also for me. I don’t have to hunt around to find the lines when I have to edit a very long dialogue scene. You could assemble selects reels first, but I like to look at everything. I fundamentally believe there’s something good in every bad take. It doesn’t take very long to watch every take of a line. Plus I do a fair amount of ‘Franken-biting’ with dialogue where needed.

[OP] Obviously the final mix and color correction were done at specialty facilities. Since The Banker was shot on film, I would imagine that complicated the hand-off slightly. Please walk me through the process you followed.

[JV] Marti Humphrey did the sound at The Dub Stage in Burbank. We have a good relationship with him and can call him very early in the process to work out the timeline of how we are going to do things. He had to soup up his system a bit to handle the Atmos near-field stuff, but it was a good opportunity for him to get into that space. So he was able to do all the various versions of our mix.

Technicolor was the new guy for us. Mike Hatzer did the color grade. It was a fairly complex process for them and they were a good partner. For the conform, we handed them an XML and EDL. They had their Flex files to get back to the film edge code. Steven had to break up the sequence to generate separate tracks for the 35mm original, stock, and VFX shots, because Technicolor needed separate EDLs for those. But it wasn’t like we invented anything that hasn’t been done before.

We did use third-party apps for some of this. The great thing about that is you can just contact the developer directly. There was one EDL issue, and Steven could just call up the app developer, explain it, and they’d fix it in a couple of days.
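As an illustration of the EDL breakout Steven did, here is a hedged sketch of routing events into per-source-type lists based on a reel naming convention. The “VFX” and “ST” (stock) prefixes are invented for the example; the film’s real naming scheme wasn’t discussed.

```python
# Hedged sketch of splitting a conform EDL by source type. Events are
# routed on a reel-name convention; the "VFX" and "ST" (stock) prefixes
# are invented for this example.

def split_edl(events):
    """Route CMX3600 event lines into per-source-type buckets."""
    buckets = {"film": [], "stock": [], "vfx": []}
    for line in events:
        reel = line.split()[1]
        if reel.startswith("VFX"):
            buckets["vfx"].append(line)
        elif reel.startswith("ST"):
            buckets["stock"].append(line)
        else:
            buckets["film"].append(line)  # 35mm original negative
    return buckets

edl = [
    "001  A012R3 V  C  01:02:03:04 01:02:08:10 01:00:00:00 01:00:05:06",
    "002  VFX010 V  C  00:00:00:00 00:00:03:12 01:00:05:06 01:00:08:18",
]
for kind, lines in split_edl(edl).items():
    print(kind, len(lines))  # each bucket would be written out as its own EDL
```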

[OP] What sort of visual effects were required? The film is set more or less 70 years ago, so were the majority of effects just to make the locations look right? Like cars, signs, and so on?

[JV] It was mostly period clean-up. You have to paint out all sorts of boring stuff, like road paint. In the 50s and 60s, those white lines have to come out. Wires, of course. A couple of shots we wanted to ‘LA-ify’ Georgia. We shot some stuff in LA, but when you put Griffith Park right next to a shot of Newnan, Georgia, the way to blend that over is to put palm trees in the Newnan shot.

We also did a pick-up with Anthony while he was on another show that required a beard for that role. So we had to paint out his beard. Good luck figuring out which shot that was!

[OP] Now that you have a feature film under your belt with Final Cut Pro X, what are your thoughts about it? Anything you feel that it’s missing?

[JV] All the NLEs have their particular strengths. Final Cut has several that are amazing, like background exports and rendering. It has Roles, where you can differentiate dialogue, sound effects, and music sources. You can bus things to different places. This is the first time I’ve ever edited in 5.1, because Final Cut supports that. That was a fun challenge.

We used Final Cut Pro X to edit a movie shot on film, which is kind of a first at this level, but it’s not like we crashed into some huge problem with that. We gamed it out and it all worked like it was supposed to. Obviously it doesn’t do some stuff the same way. Fortunately through our relationship with Apple we can make some suggestions about that. But there really isn’t anything it doesn’t do. If that were the case, we would have just said that we can’t cut with this.

Final Cut Pro X is an evolving NLE – as they all are. What I realized at the seminar is that it changed a lot from when it first appeared. It was a good experience cutting a movie on it. Some editors are hesitant, because that first hour is difficult and I totally get that. But if you push through that and get to know it, there are many things that are very good – addictively good, even. I would certainly cut another movie on it.

____________________________________________

The Banker started a limited theatrical release on March 6 and will be available on the Apple TV+ streaming service on March 20.

For even more details on the post process for The Banker, check out Pro Video Coalition.

Originally written for FCPco.

©2020 Oliver Peters

Apple 2019 16″ MacBook Pro

Creatives in all fields are the target market for Apple’s MacBook Pro product line. Apple introduced the 16″ model to replace its 15″ predecessor in late 2019. This new model is not only a serious tool for location work but is powerful enough to form the hub of your edit suite, whether on-site or at a fixed facility.

Nuts and bolts

The 2019 MacBook Pro comes in 6-core (Intel Core i7) and 8-core (Intel Core i9) configurations. It boasts up to 64GB RAM, an AMD Radeon Pro 5500M series GPU with up to 8GB of GDDR6 VRAM, and can be equipped with up to 8TB of internal SSD storage. Prices start at under $2,800 USD*, but the full monty rings up at about $6,500 USD*. The high-capacity SSD options contribute to the most expensive configurations, but those sizes may be overkill for most users. A more realistic price for a typical editor’s 8-core configuration would be about $3,900 USD*. Just stick to a 1TB internal SSD and back the RAM down to 32GB. Quite frankly the 6-core is likely to be sufficient for many editing and design tasks. (*With AppleCare warranty, but no local VAT or sales taxes added.)

While there are great PC laptop choices, it’s very hard to make direct comparisons to laptops with all of these same components. Few PC laptops offer this much RAM or SSDs that large, for example. When you can make a direct comparison, name brand PC laptops are often more expensive. And remember, great gaming PCs are not necessarily the best editing machines and vice versa.

What’s improved?

Apple loaned me a space gray, 8-core 16″ MacBook Pro configured with 64GB RAM, the AMD GPU with 8GB VRAM, and a 4TB internal SSD. I own a mid-2014 MacBook Pro as the center of my edit system at home. The two MacBook Pros are physically very similar. They are nearly identical in size, with the 2019 MacBook Pro a bit thinner and lighter. The keyboard footprint is the same as my five-year-old model, though the keys are slightly larger on the new laptop with less space between them. Apple tweaked the keyboard mechanism on the 16″ model. Typing feels about the same between these two, though the keys on the new machine are quieter. Plus, it has a much bigger trackpad and the touch bar.

The Retina screen is housed in a lid about the same size as the 15″ model. Its 16″ diagonal spec is achieved by using smaller bezels. Of course, the newer display is also higher density, resulting in 3072 x 1920 pixels at 226 ppi. This 500-nit, P3 display offers True Tone, thanks to macOS Catalina. True Tone alters the color temperature based on ambient light. It’s easy on the eyes, because it warms up the image in standard room lighting. However, I would discourage enabling it while working on projects requiring critical color accuracy.

The MacBook Pro features four Thunderbolt 3/USB-C ports, like its 15″ predecessor. These connect peripherals and/or power. I don’t know whether Apple had room to add standard USB-A ports or if there was a trade-off for space needed for cooling or the improved speaker system. Maybe it was just a decision to move forward regardless of some short-term inconvenience. In any case, if you purchase a MacBook Pro, plan on also buying a few adapters and cables, a USB hub, and/or a Thunderbolt 3 dock.

Performance testing

How does the 2019 MacBook Pro stack up against a desktop Mac, like the 10-core 3.0 GHz 2017 iMac Pro used at my daily editing gig? Both have 64GB RAM, but the iMac Pro has 16GB of VRAM. I tested with both the internal SSDs and an external USB 3.0 GDRIVE SSD. Both internal drives clocked well over 2500 MB/sec, while the GDRIVE was in the 400-500 MB/sec range.
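For those who want to sanity-check throughput numbers like these, here is a minimal sequential read/write timer. It is only an illustration, not the tool used for this review, and the test path is hypothetical. Note that a freshly written file may be served from the OS cache on the read pass, which will overstate the drive.

```python
# Illustrative sequential-throughput timer (not the tool used for the
# review). Writes then reads a 1 GB file and reports MB/sec. The mount
# point is hypothetical. Caveat: the read pass may be served from the
# OS cache and overstate the drive.

import os, time

PATH = "/Volumes/GDRIVE/speedtest.bin"  # hypothetical mount point
CHUNK = 8 * 1024 * 1024                 # 8 MB per write
COUNT = 128                             # 128 x 8 MB = 1 GB
SIZE = CHUNK * COUNT

t0 = time.time()
with open(PATH, "wb") as f:
    for _ in range(COUNT):
        f.write(os.urandom(CHUNK))
    f.flush()
    os.fsync(f.fileno())                # make sure data hits the disk
write_s = time.time() - t0

t0 = time.time()
with open(PATH, "rb") as f:
    while f.read(CHUNK):
        pass
read_s = time.time() - t0

mb = SIZE / (1024 * 1024)
print(f"write: {mb / write_s:.0f} MB/sec, read: {mb / read_s:.0f} MB/sec")
os.remove(PATH)
```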

My “benchmark” tests included BruceX for Final Cut Pro X, Puget Systems’ After Effects benchmark, and two of Simon Ubsdell’s Motion tutorial projects. For the “real world” tests, I used a travelogue series sizzle edit with 4K (and larger) media, various codecs, scaling, speed changes, and color correction. The timeline was exported to ProRes and H.264 using various NLEs. My final test sequence was a taxing 6K FCPX timeline composed of nine layers of 6K RED raw files.

Until the release of the new Mac Pro tower, the iMac Pro had been Apple’s most powerful desktop computer. Yet, in nearly all of these tests, the 16″ MacBook Pro equaled or slightly bettered the times of the iMac Pro. The laptop had faster export times and a higher Puget score with After Effects. One exception was my nine-layer 6K RED project, where the iMac Pro shined – exporting twice as fast as the MacBook Pro. Both Macs played back and scrubbed through this variety of files with ease, regardless of application or internal versus external drive. Overall, the biggest difference I noticed was that exports on the iMac Pro stayed quiet throughout, while the MacBook Pro frequently had to rev up the fans.

Apple claims up to 11 hours of battery life, but that’s really just during light-duty computing – checking e-mails, writing, surfing the web, and so on – and only if you’ve optimized your energy settings for battery life. The MacBook Pro includes an integrated Intel GPU and employs automatic graphics switching. During tasks that don’t generate a heavy GPU load, the machine runs on the integrated graphics. It switches to the AMD GPU for apps like Final Cut or Premiere. I purposefully set up a looping 4K sequence in FCPX and found that the battery drained from 100% down to 10% in about two hours. While this is more stress than normal editing, it’s typical behavior for creative applications. The bottom line is that you shouldn’t expect to be editing for 11 hours straight running only on the internal battery.
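If you want to replicate a rundown test like this, here is a small sketch that logs battery percentage over time using macOS’s real pmset command while playback loops in the foreground. The one-minute polling interval is arbitrary.

```python
# Small sketch: log battery percentage over time during a playback
# rundown, using macOS's real `pmset -g batt` command. The one-minute
# polling interval is arbitrary.

import re, subprocess, time

def battery_percent() -> int:
    out = subprocess.run(["pmset", "-g", "batt"],
                         capture_output=True, text=True).stdout
    return int(re.search(r"(\d+)%", out).group(1))

start = time.time()
while (pct := battery_percent()) > 10:   # stop near the 10% mark
    print(f"{(time.time() - start) / 60:5.1f} min  {pct}%")
    time.sleep(60)
```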

Final thoughts

This machine comes with macOS Catalina pre-installed. Don’t plan on downgrading that. Catalina is the first version of macOS that is purely 64-bit, with significant under-the-hood security changes. All 32-bit dependencies have been dropped, which means that some software and hardware products might not be fully compatible. So, check first. I would recommend a clean install of apps and plug-ins rather than migrating from another machine’s drive. I did a clean installation of apps through my Apple and Adobe CC IDs and was ready to go in half a day. Conversely, I’ve heard stories from others who pulled their hair out for a week trying to migrate between machines and OS versions. There are ways to do this and most of the time they work, but why not just start out fresh when you get a new machine?

Apple’s ProApps, Adobe Creative Cloud software, and DaVinci Resolve are all ready to go. Media Composer editors have a slight wait. Avid is beta testing a Catalina-compatible version of Media Composer (at the time of this writing). Some functionality won’t be there at the start, but updates will quickly add in many of those missing features.

Mac owners who bought a recent 15″ MacBook Pro will see a definite boost, but probably not enough to refresh their systems yet. But if, like me, you are running a five-year-old model, then this unit becomes very tempting, especially when working with anything more taxing than ProRes 1080p files. For work on-the-go, like on-site editing, DIT tasks, photo processing, or other creative tasks, the 2019 MacBook Pro easily fits the bill. Many editors may also need a machine that can easily shift from the field to the office/home/studio in place of an iMac or iMac Pro. If that’s you, then the MacBook Pro has the horsepower. Plug it into a Thunderbolt dock, an external display, or other peripherals, and you are ready to go – no desktop computer needed.

Originally written for RedShark News.

©2020 Oliver Peters

A Conversation with Steve Bayes

As an early adopter of Avid systems at a highly visible facility, I first got to know Steve Bayes through his on-site visits. He was the one taking notes about how a customer used the product and what workflow improvements they’d like to see. Over the years, while I worked as an editor and tech writer, we kept in touch through his travels from Avid to Media 100 and on to Apple. It was always good to get together and decompress at the end of a long NAB week.

With a career spent using, designing, and shepherding a wide range of post-production products, Steve probably knows more about a broader field of editing systems than most managers at editing system manufacturers. Naturally many readers will know him as Apple’s Senior Product Manager for Final Cut Pro X, a position he held until last year. But most users have little understanding of what a product manager actually does or how the products they love and use every day get from the drawing board into their hands. So I decided to sit down with Steve over Skype and pull back the curtain just a little on this very complex process.

______________________________________________________

[OP]  Let’s start this off with a deep dive into how a software product gets to the user. What part does a product manager play in developing new features and where does engineering fit into that process?

[SB]  I’m a little unconventional. I like to work closely with the engineers during their design and development, because I have a strong technical and industry background. More traditional product managers are product marketing managers who take a more hands-off, marketing-oriented approach. That’s important, but I never worked like that.

My rule of thumb is that I will tell the engineers what the problem is, but I won’t tell them how to solve it. In many cases the engineers will come back and say, “You’ve told us that customers need to do this ‘thing.’ What do they really want to achieve? Are you telling us that they need to achieve it exactly like this?” And so you talk that out a bit. Maybe this is exactly what the customers really want to do, because that’s what they’ve always done or the way everyone else does it. Maybe the best way to do it is based on three other things in emerging technology that I don’t know about.

In some cases the engineers come back and say, “Because of these other three things you don’t know about, we have some new ideas about how to do that. What do you think?” If their solution doesn’t work, then you have to be very clear about why and be consistent throughout the discussion, while still staying open to new ways of doing things. If there is a legitimate opportunity to innovate, then that is always worth exploring.

Traveling around the world talking to post-production people for almost 30 years allowed me to act as the central hub for that information and an advocate for the user. I look at it as working closely in partnership with engineering to represent the customer and to represent the company in the bigger picture. For instance, what is interesting for Apple? Maybe those awesome cameras that happen to be attached to a phone. Apple has this great hardware and wonderful tactile devices. How would you solve these issues and incorporate all that? Apple has an advantage with all these products that are already out in the world and they can think about cool ways to combine those with professional editing.

In all the companies I’ve worked for, we work through a list of prioritized customer requests, bug fixes, and things that we saw on the horizon within the timeframe of the release date or shortly thereafter. You never want to be surprised by something coming down the road, so we were always looking farther out than most people. All of this is put together in a product requirements document (PRD), which lays out everything you’d like to achieve for the next release. It lists features and how they all fit together well, plus a little bit about how you would market that. The PRD creates the starting point for development and will be updated based on engineering feedback.

You can’t do anything without getting sign-off by quality assurance (QA). For example, you might want to support all 10,000 of the formats coming out, but QA says, “Excuse me? I don’t think so!” [laughs] So it has to be achievable in that sense – the art of the possible. Some of that has to do with their resources and schedule. Once the engineers “put their pencils down,” then QA starts seriously. Can you hit your dates? You also have to think about the QA of third parties, Apple hardware, or potentially a new operating system (OS). You never, ever want to release a new version of Final Cut and two weeks later a new OS comes out and breaks everything. I find it useful to think about the three points of the development triangle as: the number of features, the time that you have, and the level of stability. You can’t say, “I’m going to make a really unstable release, but it’s going to have more features than you’ve ever seen!” [laughs] That’s probably a bad decision.

Then I start working with the software in alpha. How does it really work? Are there any required changes? For the demo, I go off and shoot something cool that is designed specifically to show the features. In many ways you are shooting things with typical problems that are then solved by whatever is in the new software. And there’s got to be a little something in there for the power users, as well as the new users.

As you get closer to the release, you have to make decisions about whether things are stable enough. If some feature is not going to be ready, then you could delay it to a future release — never ideal, but better than a terrible user experience. Then you have to re-evaluate the messaging. I think FCP X has been remarkably stable for all the releases of the last eight years.

You also have to bring in the third parties, like developers, trainers, or authors, who provide feedback so we can make sure we haven’t broken anything for them. If there was a particularly important feature that required third parties to help out, I would reach out to them individually and give them a little more attention, making sure that their product worked as it should. Then I would potentially use it in my own presentation. I worked closely with SpeedScriber transcription software when Apple introduced subtitling and I talked every day with Atomos while they were shooting the demo in Australia on ProRes RAW. 

[OP]  What’s the typical time frame for a new feature or release – from the germ of an idea until it gets to the user?

[SB]  Industry-wide, companies tend to have a big release and then a series of smaller releases afterwards that come relatively quickly. Smaller releases might be to fix minor, but annoying bugs that weren’t bad enough to stop the larger release. You never ship with “priority one” (P1) bugs, so if there are some P2s or P3s, then you want to get to them in a follow-up. Or maybe there was a new device, codec, camera, or piece of hardware that you couldn’t test in time, because it wasn’t ready. Of course, the OS is changing while you are developing your application, as well. One of my metaphors is that “you are building the plane while you are flying it.” [laughs]

I can’t talk about the future or Apple specifically, but historically, you can see a big release might take most of a year. By the time it’s agreed upon, designed, developed, “pencils down – let’s test it” – the actual development time is not as long as you might think. Remember, you have to back-time for quality assurance. But, there are deeper functions that you can’t develop in that relatively short period of time. Features that go beyond a single release are being worked on in the background and might be out in two or three releases. You don’t want to restrict very important features just to hit a release date, but instead, work on them a bit longer.

Final Cut is an excellent application to demonstrate the capabilities of Apple hardware, ease of use, and third party ecosystem. So you want to tie all these things together as much as you can. And every now and then you get to time things so they hit a big trade show! [laughs]

[OP]  Obviously this is the work of a larger team. Are the romanticized tales of a couple of engineers coming out of the back room with a fully-cooked product more myth than reality?

[SB]  Software development is definitely a team effort. There are certain individuals that stand out, because they are good at what they do and have areas of specialty. They’ll come back and always give you more than you asked for and surprise you with amazing results. But, it’s much more of a coordinated effort – the customer feedback, the design, a team of managers who sign off on all that, and then initial development.

If it doesn’t work the way it’s supposed to, you may call in extra engineers to deal with the issues or to help solve those problems. Maybe you had a feature that turned out more complicated than first thought. It’s load balancing – taking your resources and moving them to where they do the most good for the product. Plus, you are still getting excellent feedback from the QA team. “Hey, this didn’t work the way we expected it to work. Why does it work like that?” It’s very much an effort with those three parts: design, engineering, and QA. There are project managers, as well, who coordinate those teams and manage the physical release of the software. Are people hitting their dates for turning things in? They are the people banging on your door saying, “Where’s the ‘thing with the stuff?'” [laughs]

There are shining stars in each of these areas or groups. They have a world of experience, but can also channel the customer – especially during the testing phase. And once you go to beta, you get feedback from customers. At that point, though, you are late in the process, so it’s meant to fix bugs, not add features. It’s good to get that feature feedback, but it won’t be in the release at that point.

[OP]  Throughout your time at various companies, color correction seems to be dear to you. Avid Symphony, Apple Color when it was in the package, not to mention the color tools in Final Cut Pro X. Now nearly every NLE can do color grading and the advanced tools like DaVinci Resolve are affordable to any user. Yet, there’s still that very high-end market for systems like Filmlight’s Baselight. Where do you see the process of color correction and grading headed?

[SB]  Color has always meant the difference for me between an OK project and a stellar project. Good color grading can turn your straw into gold. I think it’s an incredibly valuable talent to have. It’s an aesthetic sense first, but it’s also the ability to look at an image and say, “I know what will fix that image and it will look great.” It’s a specialized skill that shouldn’t be underrated. But, you just don’t need complex gear anymore to make your project better through color grading.

Will you make it look as good as a feature film or a high-end Netflix series? Now you’re talking about personnel decisions as much as technology. Colorists have the aesthetic and the ability to problem-solve, but are also very fast and consistent. They work well with customers in that realm. There’s always going to be a need for people like that, but the question is what chunk of the market requires that level of skill once the tools get easier to use?

I just think there’s a part of the market that’s growing quickly – potentially much more quickly – that could use the skills of a colorist, but won’t go through a separate grading step. Now you have look-up tables, presets, and plug-ins. And the color grading tools in Final Cut Pro X are pretty powerful for getting awesome results even if you’re not a colorist. The business model is that the more you can do in the app, the easier it is to “sell the cut.” The client has to see it in as close to the finished form as possible. Sometimes a bad color mismatch can make a cut feel rough and color correction can help smooth that out and get the cut signed off. As you get better using the color grading tools in FCP X, you can improve your aesthetic and learn how to be consistent across hundreds of shots. You can even add a Tangent Wave controller if you want to go faster. We find ourselves doing more in less time and the full range of color grading tools in FCP X and the FX Plug plug-ins can play a very strong role in improving any production.

[OP]  During your time at Apple, the ProRes codec was also developed. Since Apple was supplying post-production hardware and software and no professional production cameras, what was the point in developing your own codec?

[SB]  At the time there were all of these camera codecs coming out, which were going to be a very bad user experience for editing – even on the fastest Mac Pros at the time. The camera manufacturers were using compression algorithms that were high quality, but highly compressed, because camera cards weren’t that fast or that big. That compression was difficult to decode and play back. It took more processing power than you could get from any PC at that time to get the same number of video streams compared with digitizing from tape. In some cases you couldn’t even play the camera original video files at all, so you needed to transcode before you could start editing. All of the available transcoding codecs weren’t that high in quality or they had similar playback problems.

Apple wanted to make a better user experience, so ProRes was originally designed as an intermediate codec. It worked so well that the camera manufacturers wanted to put it into their cameras, which was fine with Apple, as long as you met the quality standards. Everyone has to submit samples and work with the Apple engineers to get it to the standard that Apple expects. ProRes doesn’t encode into as small file sizes as some of the other camera codecs; but given the choice between file size, quality, and performance, then quality and performance were more important. As camera cards and hard drives get bigger, faster, and cheaper, it’s less of an issue and so it was the right decision.

[OP]  The launch of Final Cut Pro X turned out to be controversial. Was the ProApps team prepared for the industry backlash that happened?

[SB] We knew that it would be disruptive, of course. It was a whole new interface and approach. It integrated a bunch of cutting edge technology that people weren’t familiar with. A complete rewrite of the codebase was a huge step forward, as you can see in the speed and fluidity that is so crucial during the creative process. Metadata driven workflows, background processing, magnetic timeline — in many ways people are still trying to catch up eight years later. And now FCP X is the best selling version of Final Cut Pro ever.

[OP]  When Walter Murch used Final Cut Pro to edit the film, Cold Mountain, it gained a lot of attention. Is there going to be another “Cold Mountain moment” for anyone or is that even important anymore?

[SB]  Post Cold Mountain? [chuckle] You have to be careful — the production you are trying to emulate might have nothing to do with your needs on an everyday basis. It may be aspirational, but by adopting Hollywood techniques, you aren’t doing yourself any favors. Those are designed with budgets, timeframes, and a huge crew that you don’t have. Adopt a workflow that is designed for the kind of work you actually do.

When we came up in the industry, you couldn’t make a good-looking video without going to a post house. Then NLEs came along and you could do a bunch of work in your attic, or on a boat, or in a hotel room. That creative, rough-cut market fractured, but you still had to go to an online edit house. That was a limited world that took capital to build and it was an expense by the hour. Imagine how many videos didn’t get made, because a good post house cost hundreds of dollars an hour.

Now the video market has fractured into all these different outlets – streaming platforms, social media, corporate messaging, fast-turnaround events, and mobile apps. And these guys have a ton of powerful equipment, like drones, gimbals, and Atomos ProRes RAW recorders – and it looks great! But, they’re not going to a post house. They’re going to pick up whatever works for them and at the end of the day impress their clients or customers. Each one is figuring out new ways to take advantage of this new technology.

One of the things Sam Mestman teaches in his mobile filmmaking class is that you can make really high-quality stuff for a fraction of the cost and time, as long as you are going to be flexible enough to work in a non-traditional way. That is the driving force that’s going to create more videos for all of these different outlets. When I started out, the only way you could distribute directly to the consumer was by mailing someone a VHS tape. That’s just long gone, so why are we using the same editing techniques and workflows?

I can’t remember the last time I watched something on broadcast TV. The traditional ways of doing things are a sort of assembly line — every step is very compartmentalized. This doesn’t stand to benefit from new efficiencies and technological advances, because it requires merging traditional roles, eliminating steps, and challenging the way things are charged for. The rules are a little less strict when you are working for these new distribution platforms. You still have to meet the deliverable requirements, of course. But if you do it the way you’ve always done it, then you won’t be able to bring it in on time or on budget in this emerging world. If you want to stay competitive, then you are forced to make these changes — your competition may already have. How can you tell when your phone doesn’t ring? And that’s why I would say there are Cold Mountain moments all the time, when something gets made in a way that didn’t exist a few years ago. But it happens across this new, much wider range of markets and doesn’t get so much attention.

[OP]  Final Cut Pro X seems to have gained more professional users internationally than in the US. In your writings, you’ve mentioned that efficiency is the way local producers can compete for viewers and maintain quality within budget. Would you expand upon that?

[SB]  There are a range of reasons why FCP X and new metadata-driven workflows are expanding in Europe faster than the US. One reason is that European crews tend to be smaller and there are fewer steps between the creatives and decision-making execs. The editor has more say in picking their editing system. I see over and over that editors are forced to use systems they don’t like in larger projects and they love to use FCP X on their own projects. When the facilities listen to and trust the editors, then they see the benefits pretty quickly. If you have government funded TV (like in many countries in Europe), then they are always under public pressure to justify the costs. Although they are inherently conservative, they are incentivized to always be looking for new ways to improve and that involves risks. With smaller crews, Europeans can be more flexible as to what being “an editor” really means and don’t have such strict rules that keep them from creating motion graphics – or the photographer from doing the rough cut. This means there is less pressure to operate like an assembly line and the entire production can benefit from efficiencies.

I think there’s a huge amount of money sloshing around in Europe and they have to figure out how to do these local-language productions for the high quality that will compete with the existing broadcasters, major features, and the American and British big-budget shows. So how are you going to do that? If you follow the rules, you lose. You have to look at different methods of production. 

Subscription is a different business model of continuing revenue. How many productions will the subscription model pay for? Netflix is taking out $2 billion in bonds on top of the $1 billion they already raised to fund production and develop for local languages. I’ve been watching the series Criminal on Netflix. It’s a crime drama based on police interrogations, with separate versions done in four different countries: English, French, German, and Spanish. Each one has its own cultural biases in getting to a confession (and that’s why I watched them all!). I’ve never seen anything like it before.

The guys at Metronome in Denmark used this moment as an opportunity to take some big chances, creating new workflows with FCP X and shared storage. They are running 1.5 petabytes of storage on six Synology servers, with 30 shows being edited in FCP X right now. They use the LumaForge Jellyfish for on-location post-production. If someone says it can’t be done, you need to talk to these guys and I’m happy to make the introduction.

I’m working with another company in France that shot a series on the firefighters of Marseilles. They shot most of it with iPhones, but they also used other cameras with longer lenses to get farther away from the fires. They’re looking at a series of these types of productions with a unique mobile look. If you put a bunch of iPhones on gimbals, you’ve got a high-quality, multi-cam shoot, with angles and performances that you could never get any other way. Or a bunch of DSLRs with Atomos devices and the Atomos sync modules for perfect timecode sync. And then how quickly can you turn out a full series? Producers need to generate a huge amount of material in a wide range of languages for a wide range of markets and they need to keep the quality up. They have to use new post-production talent and methods and, to me, that’s exciting.

[OP]  Looking forward, where do you see production and post technology headed?

[SB]  The tools that we’ve developed over the last 30 years have made such a huge difference in our industry that there’s a part of me that wants to go back and be a film student again. [laughs] The ability for people to turn out compelling material that expresses a point of view, that helps raise money for a worthy cause, that helps to explain a difficult subject, that raises consciousness, that creates an emotional engagement – those things are so much easier these days. It’s encouraging to me to see it being used like this.

The quality of the iPhone 11 is stunning. With awesome applications, like Mavis and FiLMiC Pro, these are great filmmaking tools. I’ve been playing around with the DJI Osmo Pocket, too, which I like a lot, because it’s a 4K sensor on a gimbal. So it’s not like putting an iPhone on a gimbal – it’s all-in-one. Although you can connect an iPhone to it for the bigger screen. 

Camera technology is going in the direction of more pixels and bigger sensors, more RAW and HDR, but I’d really like to see the next big change come in audio. It’s the one place where small productions still have problems. They don’t hire the full-time sound guy or they think they can shoot just with the mic attached to the hot shoe of the camera. That may be OK when using only a DSLR, but the minute you want to take that into a higher-end production, you’re going to need to think about it more.

Again, it’s a personnel issue. I can point a camera at a subject and get a pretty good recording, but getting a good sound recording – that’s much harder for me at this point. In that area, Apogee has done a great job with MetaRecorder for iOS. It’s not just generating iXML to automatically name the audio channels when you import into FCP X — you can actually label the FCP X roles in the app. It uses Timecode Systems (now Atomos) technology to sync multiple iOS recording devices with rock-solid timecode, and you can control those multiple recorders from a single iOS device. I would like to see more people adopt multiple microphones synced together wirelessly and controlled by an iPad.

One of the things I love about being “semi-retired” is that if something’s interesting to me, I just dig into it. It’s exciting that you can edit from an iPad Pro, you can back up to a Gnarbox, you can shoot high-quality video with your iPhone or a DJI Osmo Pocket, and that opens the world up to new voices. If you were to graph it, the cost of videos is going down and to the right, the number of videos being created is going up and to the right, and at some point they cross over. That promises a huge increase in the potential work for those who can benefit from these new tools. We are close to that point.

It used to be that if your client went to another post house, you lost that client. It was a zero sum game — I win — you lose. Now there are so many potential needs for video we would never have imagined. Those clients are coming out of the woodwork and saying, “Now I can do a video. I’ll do some of it myself, but at some point I’ll hand it off to you, because you are the expert.” Or they feel they can afford your talent, because the rest of the production is so much more efficient. That’s a growing demand that you might not see until your market hits that crossover point.

This article also appears at FCPco.

©2019 Oliver Peters

The 2019 Mac Pro Truck

In 2010 Steve Jobs famously offered the analogy that traditional computers are like trucks in the modern era. Not that trucks were going away – they simply were no longer a necessity for most of us, now that the majority of the populace was no longer engaged in farming. While trucks would continue to be purchased and used, far fewer people actually needed them, because the car covered their needs. The same was true, he felt, of traditional computers.

Jobs is often characterized as being a consumer market-driven guy, but I believe the story is more nuanced. After all, he founded NeXT Computer, which clearly made high-end workstations. Jobs also became the major shareholder in Pixar Animation Studios – a company that not only needed advanced, niche computing power, but also developed some of its own specialized graphics hardware and software. So a mix of consumer and advanced computing DNA runs throughout Apple.

By the numbers

Unless you’ve been under a rock, you know that Apple revealed its new 2019 Mac Pro at WWDC earlier this month. This year’s WWDC was an example of a stable, mature Apple firing on all cylinders. iPhone unit sales have not been growing. The revenue has, but that’s because the prices have been going up. Now it’s time to push all of the company’s businesses, including iPad, services, software, and the Mac. Numbers are hard to come by, although Apple has acknowledged that the Mac unit by itself is nearly a $25 billion business and that it would be close to being in the Fortune 100 on its own. Mac sales run about 80/20 laptops to desktops. For comparison with the rest of the PC world, Apple’s marketshare is around 7%, ranking fourth behind Lenovo, HP, and Dell, but just ahead of Acer. There are 100 million active macOS users (Oct 2018), although Windows 10 adoption alone runs eight times larger (Mar 2019).

We can surmise from this information that there are 20 million active Mac Pro, iMac, iMac Pro, and Mac mini users. It’s fair to assume that a percentage of those are in the market for a new Mac Pro. I would project that maybe 1% of all Mac users would be interested in upgrading to this machine – i.e. around 1 million prospective purchasers. I’m just spit-balling here, but at a starting price of $6,000, that’s a potential market of $6 billion in sales before factoring in any upgrade options or new macOS users!

A funny thing happened on the way to the WWDC

Apple went through a computing platform progression from the old Quadra 950 and Power Mac 9600 towers to the first Intel Mac Pro towers over the course of the mid-1990s to 2006. The second generation of the older Mac Pro was released during 2009. So in a dozen-plus years, Apple customers saw seven major processor/platform changes and had come to expect constant churn – in essence, plan on replacing your system every few years. However, from 2009 onward, customers who bought those Mac Pros had a machine that could easily last, be productive, and still be somewhat competitive ten years later. The byproduct was the ability to plan a longer life expectancy for the hardware you buy – no longer an automatic two-to-three-year replacement cycle.

Even the 2013 Mac Pro has lasted until now (six years later) and remains competitive with most machines. The miscalculation that Apple made with the 2013 Mac Pro was that pro customers would prefer external expandability versus internal hardware upgrades. Form over function. That turned out to be wrong. I’m probably one of the few who actually likes the 2013 Mac Pro under the right conditions. It’s an innovative design, but unfortunately one that can’t be readily upgraded.

The second major change in computing hardware is that now “lesser” machines are more than capable of doing the work required in media and entertainment. During those earlier days of the G3/G4/G5 PowerMacs and the early Intel Mac Pros, Apple didn’t make laptops and all-in-ones that had enough horsepower to handle video editing and the like. Remember the colorful, plastic iMacs and white eMacs? Or what about the toilet-seat-like iBook laptop? Good enough for e-mail, but not what you would want for editing.

Now, we have a wide range of both Mac and PC desktop computers and laptops that are up to the task. In the past, if you needed a performance machine, then you needed a workstation class computer. Nothing else would do. Today, a general purpose desktop PC that isn’t necessarily classed as a workstation is more than sufficient for designers, editors, and colorists. In the case of Apple, there’s a range of laptops and all-in-ones that cover those needs at many different price points.

The 2019 Mac Pro Reveal

Let me first say that I didn’t attend WWDC and I haven’t seen the new Mac Pro in person. I hope to be able to do a review at some point in the future. The bottom line is that this is purely an opinion piece for now.

There have certainly been a ton of internet comments about this machine – both positive and negative. Price is the biggest pain point. Clearly Apple intends this to be a premium product for the customer with demanding computing requirements. You can spin the numbers any way you like and people have. Various sites have speculated that a fully-loaded machine could push the price from the $6,000 starting point to as high as $35K to $50K. The components that Apple defines in the early tech information do not perfectly match equivalent model numbers available on the suppliers’ websites. No one knows for sure how the specific Intel Xeon being used by Apple equates to other Xeons listed on Intel’s site. Therefore, exact price extrapolations are simply guesses for now.

In late 2009 I purchased an entry model 8-core Mac Pro. With some storage and memory upgrades, AppleCare, sales tax, and a small business discount, I paid around $4,000. The inflation difference over the decade is about 17%, so that same hardware should cost me $4,680 today. In fairness, Apple has a different design in this new machine and there are technologies not in my base 2009 machine, such as 10GigE, Thunderbolt 3, a better GPU, etc. Even though this new machine may be out of my particular budget right now, it’s still an acceptable value when compared with the older Mac Pros.

Likewise, if you compare the 2019 Mac Pro to comparable name brand workstations, like an HP Z8, you’ll quickly find that the HP will cost more. One clear difference, though, is that HP also offers smaller, less costly workstation models, such as the Z2, Z4, and Z6. The PC world also offers many high-quality custom solutions, such as Puget Systems, which I have reviewed.

One design decision that could have mitigated the cost a bit is the choice of CPU chips. Apple has opted to install Xeon chips in all of its Mac Pro designs. Same with the iMac Pro. However, Intel also offers very capable Core i9 CPUs. The i9 chips offer faster core clock speeds and high core counts. The Xeons are designed to be run flat out 24/7. However, in the case of video editing, After Effects, and so on, the Core i9 chip may well be the better solution. These apps really thrive on fast single-core speeds, so having a 12-core or 28-core CPU, where each core has a slower clock speed, may not give you the best results. Regardless of benefit, Xeons do add to Apple’s hard costs in building the machine. Xeons are more expensive than Core chips. In some direct comparisons, a Xeon can command a $1,000 premium over Intel’s retail price of the equivalent Core CPU.

The ultimate justification for buying a Mac Pro tower isn’t necessarily performance alone, but rather longevity and expandability. As I outlined above, customers have now been conditioned to expect the system to last and be productive for at least a decade. That isn’t necessarily true of an all-in-one or a laptop. This means that if you amortize the investment in a 2019 Mac Pro over a ten-year period, it’s actually quite reasonable.

The shame – and this is where much of the internet ire is coming from – is that Apple didn’t offer any intermediate models, like HP’s Z4 or Z6. I presume that Apple is banking on those customers buying iMacs, iMac Pros, Mac minis, or MacBook Pros instead. Couple one of these models with an external GPU and fast external storage and you will have plenty of power for your needs today. It goes without saying that comparing this Mac Pro to a custom PC build (which may be cheaper) is a non-starter. A customer for this Mac Pro will buy one, pure and simple. There is built-in price elasticity to this niche of the market. Apple knows that and the customers know it.

Nuts and bolts

The small details haven’t been fully revealed, so we probably won’t know everything about these new Mac Pros until September (the rumored release). Apple once again adopted a signature case design, which, like the earlier tower case, has been dubbed a “cheese grater.” Unlike the previous model, where the holes were simply holes for ventilation, the updated model (or would that be the retro model?) uses a lattice system in the case to direct the airflow. The 2019 model is about the same size as its “cheese grater” predecessor, but 20 pounds lighter.

There is very little rocket science in how you build a workstation, so items like Xeon CPUs, GPU cards, RAM, and SSD system drives are well understood and relatively standard for a modern PC system.

The short hardware overview consists of:

- 8, 12, 16, 24, or 28-core Xeon CPU options
- Memory from 32GB to 1.5TB of DDR4 ECC RAM
- Up to four AMD GPU cards
- 1.4kW power supply
- Eight PCIe expansion slots (one used by the Apple I/O card)
- System storage options from 256GB to 4TB
- Four Thunderbolt 3 ports (two top, two back) plus two USB 3 ports (back); more ports are available with the upgraded GPU options
- Two 10Gb Ethernet ports
- WiFi, Bluetooth, built-in speakers, and a headphone jack

So far, so good. Any modern workstation would have similar choices. But there are several key unknowns, and that’s where the questions come in. First, the GPU cards appear to be custom-designed AMD cards installed into a new MPX (Mac Pro expansion) module – a mounting/connecting cage used to install and connect the hardware. However, if you wanted to add your own GPU card, would it fit into such a module? Would you have to buy a blank module from Apple for your card? Or would your card simply fit into the PCIe slot and screw in like on any other tower? That last option does appear to be possible, but will there be proper Nvidia driver support?

The second big question relates to internal storage. The old “cheese grater” had sleds to install four internal drives – up to six if you used the optical drive bays. The 2019 Mac Pro appears to allow up to four drives within an MPX chassis. Promise has already announced two products specifically for the Mac Pro, one of which includes four 8TB drives in a RAID configuration for 32TB of capacity. 14TB HDDs are already available, so presumably this internal capacity will go up.

The unknown is whether or not you can add drives without purchasing an MPX module. The maximum internal GPU option seems to be four cards, which are mounted inside two MPX modules. This is also the space required for internal drives. Therefore, if you have both MPX modules populated with GPU cards, then I would imagine you can’t add internal storage. But I may be wrong. As with most things tech, I predict that if blank MPX modules are required, a number of vendors will quickly offer cheaper aftermarket MPX modules for GPUs, storage, etc.

One side issue that a few blogs have commented on is the power draw. Because of the size of the power supply, the general feeling is that the Mac Pro should be plugged into a standard electrical circuit by itself, plus maybe a monitor – in other words, not a circuit shared with a bunch of other electrical devices. Otherwise you might start tripping breakers.
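To put a rough number on that concern, here’s a quick sanity check, assuming a standard US 120V/15A circuit and the common 80% rule of thumb for continuous loads. Keep in mind that 1.4kW is the supply’s rating, not a measured draw – the machine will rarely pull that much:

# Rough circuit-loading check, assuming a US 120V/15A circuit
# and an 80% continuous-load guideline. 1.4kW is the PSU rating,
# not a measured draw, so this is a worst case.
supply_watts = 1400
volts = 120
breaker_amps = 15
continuous_limit = breaker_amps * 0.8   # 12A
draw_amps = supply_watts / volts        # ~11.7A at full load
print(f"Potential draw: {draw_amps:.1f}A vs a {continuous_limit:.0f}A continuous-load limit")

At full tilt, the supply alone could eat nearly the whole budget of the circuit, which is why sharing it with other gear is a bad idea.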

Afterburner

A new hardware item from Apple is the optional Afterburner ProRes and ProRes RAW accelerator card. This uses an FPGA (field programmable gate array), a chip that can be programmed for various specific functions and potentially updated in the future. Anyone who has worked with the RED Rocket or RED Rocket-X card in the past will be quite familiar with what the Afterburner is.

The Afterburner will decode ProRes and ProRes RAW codecs on-the-fly when this media is played in Final Cut Pro X, QuickTime Player X, and any other application updated to support the card. This would be especially beneficial with camera raw codecs, because the card debayers the raw sensor data at full resolution via hardware acceleration, instead of using the CPU. Other camera raw manufacturers, like RED, ARRI, Canon, and Blackmagic Design, might add support for this card to accelerate their codecs, as well. What is not known is whether the Afterburner card can also be used to offload true background functions, like background exports and transcoding within Final Cut Pro X.

An FPGA card offers the promise of being future-proof, because you can always update its function later. In actual practice, however, the hardware capabilities of any card are eventually outstripped as the technology changes. This happened with the RED Rocket card and others. We’ll see if Apple has any better luck over time.

Performance

Having lots of cores is great, but with most media and entertainment software the GPU can be key. Apple has been at a significant disadvantage with many applications, like After Effects, because of its stance on Nvidia and CUDA acceleration. Apple prefers that a manufacturer support Metal, its technology for leveraging the combined power of all CPUs and GPUs in the system. This all sounds great, but the reality is that it’s one proprietary technology versus another. In the benchmark tests I ran with the Puget PC workstation, the CUDA performance in After Effects easily trounced any Mac that I scored it against.

Look at Apple’s website for a chart representing the relative GPU performance of a 2013 Mac Pro, an iMac Pro, and the new 2019 Mac Pro, each tested with its respective top-of-the-line GPU option. The iMac Pro is 1.5x faster than the 2013 Mac Pro. The 2019 Mac Pro is twice as fast as the iMac Pro and 3x faster than the 2013 Mac Pro. While that certainly looks impressive, that 2x improvement over the iMac Pro comes thanks to two upgraded GPU cards instead of one. Well, duh! And at this time we have no idea what these cards and MPX units will cost. (Note – I am not totally sure whether this testing used two GPUs in one MPX module or a total of four GPUs in two modules.)
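For what it’s worth, the three multipliers in Apple’s chart are at least internally consistent, as a quick check shows:

# Relative GPU scores from Apple's chart, normalized to the 2013 Mac Pro.
mac_pro_2013 = 1.0
imac_pro = 1.5 * mac_pro_2013    # 1.5x the 2013 Mac Pro
mac_pro_2019 = 2.0 * imac_pro    # 2x the iMac Pro
print(mac_pro_2019)              # 3.0 - i.e. 3x the 2013 Mac Pro, as claimed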

We won’t know how well these machines really perform until the first units get out into the wild – especially how they compare against comparable PCs with high-powered Nvidia cards. I may be going out on a limb, but I would be willing to bet that many people who buy the base configuration for $6K – thinking that they will get a huge boost in performance – are going to be very disappointed. I don’t mean to trash the entry-level machine. It’s got solid specs, but in that configuration it isn’t the best performer. At $6K, you are buying a machine that will have longevity and that can be upgraded in the future. In short, the system can grow with you as workload demands increase. That’s something which has not been available to Mac owners since the end of 2012.

Software

To take full advantage of this new machine’s capabilities, software developers (of both applications and plug-ins) will have to update their code. All of the major brands, like Adobe, Avid, and Blackmagic Design, seem to be on board with this. Obviously, so are the in-house developers at Apple who create the Pro Applications – Final Cut Pro X and Logic Pro X being obvious examples. Logic is increasing the track count and the number of software instruments you can run, and updates have already been released.

Final Cut Pro X has a number of areas that appear to need change. Up until now, in spite of being based around Metal, Final Cut has not taken advantage of multiple GPUs when present. If you add an eGPU to a Mac today (under Mojave), you must toggle a preference setting to pick which GPU serves as the primary. Judging by Activity Monitor, it appears to be an either-or thing, which means the other GPU is loafing. Clearly when you have four GPUs present, you will want to tap into the combined power of all four.

With the addition of the Afterburner option, FCPX (or any other NLE) has to know that the card is present and how to offload media to it during playback (and render?). Finally, the color pipeline in Final Cut Pro X is being updated to work in 16-bit float math, as well as being optimized for fast 8K workflows.

All of this requires new code and development work. And with the industry now talking about 16K video, is 8K enough? Today, 4K delivery is still years away for many editors, so 8K is that much further out. I suspect that if and when 16K gets serious traction, Apple will be ready with the appropriate hardware and software technology. In the case of the new Mac Pro, that could simply mean a new Afterburner card instead of an entirely new computer.

The Apple Pro Display XDR

In tandem with the 2019 Mac Pro, Apple has also revealed the new Pro Display XDR – a 6K, 32″ Retina display. It uses a similar design aesthetic to the Mac Pro, complete with a matching ventilation lattice. This display comes calibrated and is designed for HDR, with 1,000 nits of sustained fullscreen brightness and a 1,600-nit peak. It will be interesting to see how this actually looks. Recent Final Cut Pro X updates have added HDR capabilities, but you can never get an accurate view of HDR on a UI display. Furthermore, the 500-nit, P3 displays used in the iMac Pros are some of the least color-accurate UI displays of any Mac that I work with. I really hope Apple gets this one right.

To sell the industry on this display, Apple is making the cost and feature comparison between it and actual HDR color reference displays costing in the $30K-$40K range – think Flanders Scientific or Sony. The dirty little HDR secret is that when you display an image at the maximum nit level across the entire screen, the display will dim in order to prevent damage. Only the most expensive displays are more tolerant of this. I would presume that the Pro Display XDR will also dim when presented with a fullscreen image at 1,600 nits, which is why the spec lists 1,000 nits fullscreen – the minimum level for the HDR spec. Of course, if you are grading real-world images properly, then in my opinion, you should rarely have important picture elements at such high levels. Most of the image should sit in a range very similar to SDR, with the extended range used to preserve highlight information, like a bright sky.

Some colorists are challenging the physics behind some of Apple’s claims. The concern is whether the display will suffer from bloomed highlights. Apple’s own marketing video points out that the design reduces blooming, but it doesn’t say that it completely eliminates it. We’ll see. I also don’t quite see how this display fits as a reference display. It only has Thunderbolt connections – no SDI or HDMI – so it won’t connect in most standard color correction facilities without additional hardware. And if, like all computer displays, the user can adjust the brightness, then that goes against the concept of an HDR reference display. At 32″, it’s also much too small to be used as a client display to stick on the wall.

So why did Apple choose to introduce this as a user interface display? Making a great HDR reference display would make some sense. So would a great specialty display, like you often find in photography or fine print work. I understand that it will likely display accurate, fullscreen video directly from Final Cut Pro X, or maybe even Premiere Pro, without the need and added cost of an AJA or BMD I/O device or card. But as a general-purpose computer display? That simply misses the mark, no matter how good it is. Not to mention that at a brightness level of 1,000 to 1,600 nits, it’s way too bright for most edit suites. I even find that to be the case with the iMac Pro’s 500-nit displays when you crank them up.

This display is listed at $5K without a stand. Add another $1K if you want a matte finish. Oh, and if you want the stand, add another $1K! I don’t care how seductively Jony Ive pronounces “all-u-minium,” that’s taxing the goodwill of your customers. Heck, make it $5,500 and toss in the stand at cost. Remember, the stand has an articulating arm, which will probably lose its tension in a few years. I hope that a number of companies will make high-quality knock-offs for a couple of hundred bucks.

If you compare the Apple Pro Display XDR to another UI display with a similar mission, then the HP DreamColor Z31x Studio Display is worth a look. This is a 32″, 4K, calibrated display with an MSRP right at $3,200 – but it doesn’t offer HDR specs, Retina density, or 6K resolution. Factor in those features and Apple’s brand premium, and the entry price isn’t that far out of line – except for that stand.

I imagine that Apple’s thinking is that if you don’t want to buy this display, then there are plenty of cheaper choices from LG, HP, Asus, or Dell. And speaking of LG, where’s Apple’s innovative spirit to try something different with a UI display? Maybe something like an ultra-wide. LG now has a high-resolution 49″ display for about $1,400. That size enables one large canvas across the width, or two views side-by-side, like having two displays. However, maybe a high-density (Retina) panel isn’t possible with such a design, which could be Apple’s hang-up.

Final thoughts

The new 2019 Mac Pro clearly demonstrates that Apple has not left the high-end user behind. I view relevant technology through the lens of my needs with video; however, this model will appeal to a wide range of design, scientific, and engineering users. It’s a big world out there. While it may not be the most cost-effective choice for the individual owner/editor, there are still plenty of editors, production companies, and facilities that will buy one.

There is a large gap between the Mac mini and this new Mac Pro. I still believe there’s a market for a machine similar to some of the concept designs floated for a Mac Pro – or maybe a smaller version of this machine that starts at $3,000. But there is no such model from Apple. If you like the 2013 “trash can” Mac Pro, you can still get one – at least until the 2019 model is officially released. Naturally, iMacs and iMac Pros have been a superb option for that in-between user and will continue to be so.

If you are in the market for the 2019 Mac Pro, then don’t cut yourself short. Think of it as an investment for at least ten years. Unless money is tight and you can only afford the base model, I would recommend budgeting in the $10K range. I don’t have an exact configuration in mind, but that will likely be a sweet spot for demanding work. Once I get a chance to properly review the 2019 Mac Pro, I’ll be more than happy to come back with a real evaluation.

©2019 Oliver Peters

NAB Show 2019

This year the NAB Show seemed to emphasize its roots – the “B” in National Association of Broadcasters. Gone or barely visible were the fads of past years, such as stereoscopic 3D, 360-degree video, virtual/augmented reality, and drones. Not that these are gone – merely that they have refocused on the smaller segment of market share that reflects reality. There’s not much point in promoting stereo 3D at NAB if most of the industry goes ‘meh’.

Big exhibitors of the past, like Quantel, RED, Apple, and Autodesk, are gone from the floor. Quantel products remain as part of Grass Valley (now owned by Belden), which is the consolidation of Grass Valley Group, Quantel, Snell & Wilcox, and Philips. RED decided last year that small, camera-centric shows were better venues. Apple – well, they haven’t been on the main floor for years, but even this year there was no off-site, stealth Final Cut Pro X presence in a hotel suite somewhere. Autodesk, which shifted to a subscription model a couple of years ago, had a demo suite in the nearby Renaissance Hotel, focusing on its hero product, Flame 2020. Smoke for Mac users – tough luck. It’s been over for years.

This was a nuts-and-bolts year, with many exhibits showing new infrastructure products. These appeal to larger customers, such as broadcasters and network facilities. Specifically, the world is shifting to an IP-based infrastructure for signal routing, control, and transmission. This replaces the copper and fiber wiring of the past, along with the devices (routers, video switchers, etc.) at either end of the wire. Companies that might have appeared less relevant, like Grass Valley, are back in a strong sales position. Other companies, like Blackmagic Design, are being encouraged by their larger clients to fulfill those needs. And as ever, consolidation continues – this year VizRT acquired NewTek, which has been an early player in video-over-IP with its proprietary NDI protocol.

Adobe

The NAB season unofficially started with Adobe’s pre-NAB release of the CC2019 update. For editors and designers, the hallmarks of this update include a new freeform bin view and adjustable guides in Premiere Pro, and content-aware video fill in After Effects. These are solid additions in response to customer requests, which is something Adobe has focused on. A smaller, but no less important, feature is Adobe’s ongoing effort to improve media performance on the Mac platform.

As in past years, their NAB booth was an opportunity to present these new features in depth, as well as showcase speakers who use Adobe products for editing, sound, and design. Part of the editing team from the series Atlanta was on hand to discuss their use of Premiere Pro and After Effects in their ‘editing crash pad’.

Avid

For many attendees, NAB actually kicked off on the weekend with Avid Connect, a gathering of Avid users (through the Avid Customer Association) featuring meet-and-greets, workshops, presentations, and ACA leadership committee meetings. While past product announcements at Connect have been subdued from the vantage of Media Composer editors, this year brought a major surprise: Avid revealed its Media Composer 2019.5 update (scheduled for release at the end of May). This came as part of a host of updates, most of which apply to companies that have invested in the full Avid ecosystem, including Nexis storage and Media Central asset management. While those are superb, they only apply to a small percentage of the market. Let’s not forget Avid’s huge presence in the audio world, thanks to the dominance of Pro Tools – now with Dolby Atmos support. With the acquisition of Euphonix years back, Avid has also become a significant player in the live and studio sound arena. Various examples of its S-series consoles in action were presented.

Since I focus on editing, let me discuss Media Composer a bit more. The 2019.5 refresh is the first major Media Composer overhaul in years, and work on it started in secret last year. 2019.5 is the first iteration of the new UI, with more to be updated in coming releases. In short, the interface has been modernized and streamlined in ways meant to attract newer, younger users without alienating established editors. The panel design is similar to Adobe’s approach – i.e. interface panels can be docked, floated, stacked, or tabbed. Panels that you don’t want to see may be closed or simply slid to the side and hidden. Need to see a hidden panel again? Simply slide it back open from the edge of the screen.

This isn’t just a new skin. Avid has overhauled the internal video pipeline, with 32-bit floating point color and an uncompressed DNx codec. Project formats now support up to 16K. Avid is also compliant with the specs of the Netflix Post Technology Alliance and the ACES logo program.

I found the new version very easy to use and a welcome change; however, it will require some adaptation if you’ve been using Media Composer for a long time. In a nod to the Media Composer heritage, the weightlifter (aka ‘liftman’) and scissors icons (for lift and extract edits) are back. Even though Media Composer 2019.5 is still in early beta testing, Avid felt good enough about it to use this version in its workshops, presentations, and stage demos.

One of the reasons to go to NAB is for the in-person presentations by top editors about their real-world experiences, and no one can top Avid at this game – the company can easily tap a host of Oscar, Emmy, BAFTA, and Eddie award winners. The hallmark for many this year was the presentation at Avid Connect and/or at the show by the Oscar-winning picture and sound editing/mixing team for Bohemian Rhapsody. It’s hard not to gather a standing-room-only crowd when you close your talk with the Live Aid finale sequence played in kick-ass surround!

Blackmagic Design

Attendees and worldwide observers have come to expect a surprise NAB product announcement out of Grant Petty each year, and he certainly didn’t disappoint this time. Before I get into that, there were quite a few products released, including tools for IP infrastructures, 8K production and post, and more. Blackmagic is a full-spectrum video and audio manufacturer that long ago moved into the ‘big leagues’. This means that, just like Avid or Grass Valley, they have to respond to pressure from large users to develop products designed around specific workflow needs. In the BMD booth, many of those development fruits were on display, like the new Hyperdeck Extreme 8K HDR recorder and the ATEM Constellation 8K switcher.

The big reveal for editors was DaVinci Resolve 16. Blackmagic has steadily been moving into the editorial space with this all-in-one, edit/color/mix/effects/finishing application. If you have no business requirement for – or emotional attachment to – one of the other NLE brands, then Resolve (free) or Resolve Studio (paid) is an absolute no-brainer. Nothing can touch the combined power of Resolve’s feature set.

New for Resolve 16 is an additional editorial module called the Cut Page. At first blush, the design, layout, and operation are amazingly similar to Apple’s Final Cut Pro X. Blackmagic’s intent is to provide a fast editor where you can start and finish a time-sensitive project without the complexities of the Edit Page. However, it’s just another tool, so you could work entirely in the Cut Page, start in the Cut Page and refine your timeline in the Edit Page, or skip the Cut Page altogether. Resolve offers a buffet of post tools that are at your disposal.

While Resolve 16’s Cut Page does elicit a chuckle from experienced FCPX users, it offers some new twists. For example, there’s a two-level timeline view – the top section is the full-length timeline and the bottom section is a zoomed-in detail view. The intent is quick navigation without the need to constantly zoom in and out of long timelines. There’s also an automatic sync detection function. Let’s say you are cutting a two-camera show. Drop the A-camera clips onto the timeline and then go through your B-camera footage. Find a cut-away shot, mark in/out on the source, and edit. It will ‘automagically’ edit to the in-sync location on the timeline. I presume this is matched by either common sound or timecode. I’ll have to see how this works in practice, but it demos nicely. Changes to other aspects of Resolve were minor and evolutionary, except for one other notable feature: the Color Page added its own version of content-aware video fill.

Another editorial product addition – tied to the theme of faster, more efficient editing – was a new edit keyboard. Anyone who ever cut in the linear days – especially those who ran Sony BVE9000/9100 controllers – will feel very nostalgic. It’s a robust keyboard with a high-quality, integrated jog/shuttle knob. The feel is very much like controlling a tape deck in a linear suite, with fast shuttle response and precise jogging, and the precision is far better than any of the USB controllers, like a Contour Shuttle. Whether enough people will have an interest in shelling out $1,025 for it remains to be seen. It’s a great tool, but are you really faster with one than with FCPX’s skimming and a standard keyboard and mouse?

Ironically, if you look around the Blackmagic Design booth, there does seem to be a nostalgic homage to the Sony hardware of the past. As I said, the edit keyboard is very close to a BVE9100 keyboard. Even the styling of the control panels on the Hyperdecks – and the look of the name badges on those panels – is very much Sony. As humans, this appeals to our desire for something other than the glass interfaces we’ve been dealing with for the past few years. Michael Cioni (Panavision, Light Iron) dubbed this ‘tactile attraction’ in his excellent Faster Together Stage talk. It manifests itself not only in these types of control surfaces, but also in the skeuomorphic designs applied to audio filter interfaces, or in the emotion created in the viewer when a colorist adds film grain to digital footage.

Maybe Grant is right and these methods are really faster in a pressure-filled production environment. Or maybe this is simply an effort to appeal to emotion and nostalgia by Blackmagic’s designers. (Check out Grant Petty’s two-hour 2019 Product Overview for more in-depth information on Blackmagic Design’s new products.)

8K

I won’t spill a lot of words on 8K. It seems kind of silly when most delivery is HD – and even SD in some places. A lot of today’s production is in 4K, but really only for future-proofing. But the industry has to sell newer and flashier items, so it has moved on to 8K pixel resolution (7680 x 4320). Much of this is driven by Japanese broadcasters and manufacturers, who are pushing into 8K. You can laugh or roll your eyes, but NAB had many examples of 8K production tools (cameras and recorders) and display systems. Of course, it’s NAB, making it hard to tell how many of these are only prototypes and not yet ready for actual production and delivery.
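To give a sense of the scale jump, here’s a quick calculation using the standard raster sizes named above – each step up quadruples the pixel count, with matching increases in storage and processing demands:

# Pixel counts for the common raster sizes. Each step up
# quadruples the pixels: 8K is 4x UHD 4K and 16x HD.
rasters = {"HD": (1920, 1080), "UHD 4K": (3840, 2160), "8K": (7680, 4320)}
for name, (w, h) in rasters.items():
    print(f"{name}: {w * h / 1e6:.1f} megapixels")
# HD: 2.1, UHD 4K: 8.3, 8K: 33.2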

For now, it’s still a 4K game, with plenty of mainstream product – not only cameras and NLEs, but also items like AJA’s KiPro family. The KiPro Ultra Plus records up to four channels of HD or one channel of 4K in ProRes or DNx. The newest member of the family is the KiPro GO, which records up to four channels of HD (25Mbps H.264) onto removable USB media.

Of course, the industry never stops, so while we are working with HD and 4K, and looking at 8K, the developers are planning ahead for 16K. As I mentioned, Avid already has 16K project presets built in. Yikes!

HDR

HDR – or high dynamic range – is about where it was last year. There are basically four formats vying to become the final standard used in all production, post, and display systems. While there are several frontrunners and edicts from distributors to deliver HDR-compatible masters, there still is no clear path. If you shoot in log or camera raw with nearly any professional camera produced within the past decade, you have originated footage that is HDR-compatible. But none of the low-cost post solutions make this easy, and without the right monitoring environment, you are wasting your time. If anything, those waters are muddier this year. There were a number of HDR displays throughout the show, but there were also a few labeled as using HDR simulation. I saw a couple of those at TV Logic. Yes, they looked gorgeous, and yes, they were receiving an HDR signal. It turns out the ‘simulation’ part of the description meant that the display was bright (up to 350 nits), but not bright enough to qualify as ‘true’ HDR (1,000 nits or higher).

As in past transitions, we are certainly going to have to rely on some ‘glue’ products. For me, that’s AJA again. Through its relationship with Colorfront, AJA offers two HDR products: the HDR Image Analyzer and the FS-HDR converter. The latter was introduced last year as a real-time frame synchronizer and color converter to go between SDR and HDR display standards. The new Analyzer is designed to evaluate color space and gamut compliance. Just remember, no computer display can properly show you HDR, so if you need to post and deliver HDR, proper monitoring and analysis tools are essential.

Cameras

I’m not a cinematographer, but I do keep up with cameras. Nearly all of this year’s camera developments were evolutionary: new LF (large format sensor) cameras from ARRI, 4K camcorders from Sharp and JVC, and a full-frame mirrorless camera from Nikon (with ProRes RAW recording coming in a future firmware update). Most of the developments were targeted towards live broadcast production, like sports and megachurches. Ikegami had an 8K camera to show, but their real focus was on 4K and IP camera control.

RED, a big player in the cinema space, was only present in a smaller demo room, so you couldn’t easily compare its 8K imagery against others on the floor. And let’s not forget Sony and Panasonic. While ARRI has been a favorite, due to the ‘look’ of the Alexa, Sony (Venice) and Panasonic (Varicam and now EVA-1) are also well-respected digital cinema tools that create outstanding images. For example, Sony’s booth featured an amazing, theater-sized, 8K micro-pixel LED display system. Some of the sample material shown was of the Rio Carnival, shot with anamorphic lenses on a 6K full-frame Sony Venice camera. Simply stunning.

Finally, let’s not forget Canon’s line-up of cinema cameras, from the C100 to the C700 FF. To complement these, Canon introduced its new line of Sumire Prime lenses at the show. The C300 has been a staple of documentary films, including the Oscar-winning Free Solo, which I had the pleasure of watching on the flight to Las Vegas. Sweaty palms the whole way. It must have looked awesome in IMAX!

(For more on RED, cameras, and lenses at NAB, check out this thread from DP Phil Holland.)

It’s a wrap

In short, NAB 2019 had plenty for everyone, including smaller markets, like products for education seminars. One that I ran across was Cinamaker, which was demonstrating a complete multi-camera set-up using four iPhones and an iPad. The iPhones are the cameras (additional iPhones can be used as isolated sound recorders) and the iPad is the ‘switcher/control room’. The set-up can be wired or wireless, but camera control, video switching, and recording are all done on the iPad. This can generate the final product, or the project can be transferred to a Mac (with the line cut and camera iso media, plus an edit list) for re-editing/refinement in Final Cut Pro X. Not too shabby, given the market that Cinamaker is striving to address.

For those of us who like to use the NAB Show exhibit floor as a miniature yardstick for the industry, one of the trends to watch is what type of gear is used in the booths and press areas – specifically, one NLE over another, or one hardware platform versus another. On that front, I saw plenty of Premiere Pro, along with some Final Cut Pro X. Hardware-wise, it looked like Apple versus HP. Granted, PC vendors like HP often supply gear for the booths as a form of sponsorship, so take this with a grain of salt. Nevertheless, I would guess that I saw more iMac Pros than any other single computer. For PCs, it was a mix of HP Z4, Z6, and Z8 workstations. HP and AMD were partner-sponsors of Avid Connect and demoed very compelling set-ups with these Z-series units configured with AMD Radeon cards. These are very powerful workstations for editing, grading, mixing, and graphics.

©2019 Oliver Peters