CineMatch for FCP

Last year FilmConvert, developers of the Nitrate film emulation plug-in, released CineMatch. It’s a camera-matching plug-in designed for multiple platforms – different operating systems as well as different editing/grading applications. The initial 2020 release worked with DaVinci Resolve and Premiere Pro. Recently FilmConvert added Final Cut Pro support. You can purchase the plug-in for individual hosts or as a bundle for multiple hosts. If you bought the bundled version last year, then that license key also covers the new Final Cut Pro plug-in. So, nothing extra to purchase for bundle owners.

CineMatch is designed to work with log and raw formats and a wide range of camera packs is included within the installer. To date, 70 combinations of brands and models are supported, including iPhones. FilmConvert has created these profiles based on the color science of the sensor used in each of the specific cameras.

CineMatch for FCP works the same way as the Resolve and Premiere Pro versions. First, select the source profile for the camera used. Next, apply the desired target camera profile. Finally, make additional color adjustments as needed.

If you shoot with one predominant A camera that is augmented by B and C cameras of different makes/models, then you can apply CineMatch to the B and C camera clips in order to better match them to the A camera’s look.

You can also use it to shift the look of a camera to that of a different camera. Let’s say that you want a Canon C300 to look more like an ARRI Alexa or even an iPhone. Simply use CineMatch to do that. In my example images, I’ve adjusted Blackmagic and Alexa clips so that they both emulate the color science of a Sony Venice camera.

When working in Final Cut Pro, remember that it will automatically apply Rec 709 LUTs to some log formats, like ARRI Alexa Log C. When you plan to use CineMatch, be sure to also set the Camera LUT pulldown selector in the inspector pane to “none.” Otherwise, you will be stacking two LUT conversions, resulting in a very ugly look.

Once camera settings have been established, you can further adjust exposure, color balance, lift/gamma/gain color wheels, saturation, and the luma curve. There is also an HSL curves panel to further refine hue, saturation, and luma for individual color ranges. This is helpful when trying to match two cameras or shots to each other with greater accuracy. FCP’s comparison viewer is a great aid in making these tweaks.

As a side note, it’s also possible to use CineMatch in conjunction with FilmConvert Nitrate (if you have it) to not only adjust color science, but then to subsequently emulate different film stocks and grain characteristics.

CineMatch is a useful tool when you work with different camera types and want to achieve a cohesive look. It’s easy and quick to use with little performance impact. CineMatch now also supports M1 Macs.

©2021 Oliver Peters

Storage for Editors 

Storage is the heart of a modern post-production facility. The size and type of storage you pick can greatly impact a facility’s efficiency. Surprisingly, the concerns and requirements around a storage network aren’t all that different, regardless of whether you’re a large or smaller post facility.

I recently spoke with industry veterans at Molinare in London and Republic Editorial in Dallas about how they’ve addressed storage needs. 

Molinare

Molinare is a world-class, London-based operation, known for its work on leading television series for the BBC and Netflix among many others. I spoke with Darren Woolfson, Molinare’s Director of Technology and Visual Services. Woolfson is a thirty-year veteran of the London post scene, including a stint at Pinewood Studios. He joined Molinare in March 2020 in a new role that touches broadly on technology company-wide.

Please tell me about the set-up at Molinare.

[Woolfson] Molinare has 46 offline cutting rooms with Avid Media Composer software. Then we have eight Flames for finishing and four Symphonies for online. All are capable of working in full resolution at 4K. We’ve got five Baselights for color correction, a bit of Resolve, and handfuls of other tools. If you wanted to come to Molinare and cut on Premiere Pro, we would make it possible, but we’re mainly an Avid and Flame house.

What storage systems do you have to support all of that?

[Woolfson] The core of our facility is based on two petabytes of Dell EMC Isilon, which is high-speed storage in clusters. Most of the systems can attach to it. The Media Composers are predominantly hooked up to ISIS or NEXIS storage. We own quite a lot of that. In addition to our 46 cutting rooms in the building, I reckon we have at least another 20 running remotely from the editors’ homes. And so we’ve also filled our racks up with more Media Composers. Teradici happens to be the PC-over-IP tool of choice.

The FilmLight Baselight systems use dedicated storage for grading. It’s made by FilmLight and runs at about 6 or 7GB/s. It wouldn’t be unusual for us to be grading 4K 16-bit images. This architecture allows our five Baselight suites to all pull media from the shared storage. The Flames have a bit of local storage on them, as well as connection to the Isilon.

I presume the Avid suites are using low-resolution media for offline editing.

[Woolfson] Yes. Typically most of the shows that we cut in offline would be at DNxHD 36. A Netflix show or any of the big streamers will often be finished in 4K 16-bit, uncompressed. A lot of the material that we finish on the Symphonies, which is generally for broadcasters, will use a lightly-compressed mastering codec.

Overall, what sort of connection speeds to the storage are we talking about?

[Woolfson] Our core is 100Gbps, using really good quality Mellanox switches. The Baselights connect into that at 25Gbps. The Flames connect at 10 or 25Gbps. And then the Avids, depending on whether it’s online or offline, connect at either 1 or 10Gbps.
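
Some rough arithmetic puts those link speeds in context. The frame size and frame rate below are illustrative assumptions, not Molinare’s actual specs:

```python
def uncompressed_rate_gbytes(width, height, bits_per_channel, channels, fps):
    """Sustained data rate in GB/s for one uncompressed video stream."""
    bytes_per_frame = width * height * channels * bits_per_channel / 8
    return bytes_per_frame * fps / 1e9

# One 4K (4096x2160) 16-bit RGB stream at 24 fps works out to ~1.27 GB/s.
rate = uncompressed_rate_gbytes(4096, 2160, 16, 3, 24)
print(f"{rate:.2f} GB/s per stream")

# A 25 Gbps link carries roughly 3.1 GB/s of raw capacity, so a single link
# comfortably feeds one such stream to a Baselight suite, while ~6-7 GB/s of
# shared storage bandwidth can serve several suites at once. DNxHD 36 offline
# media, by contrast, runs at only about 36 Mbit/s, which is why 1 Gbps links
# are plenty for the offline Avids.
```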

You also have quite a few audio rooms using Pro Tools. Do those tie into the Avid NEXIS units?

[Woolfson] No. We use DDP [Dynamic Drive Pool] for our audio rooms. The storage is by Ardis out of the Netherlands. NEXIS isn’t optimized for the thousands of relatively small audio files. We often do foreign language versioning, so the mixes go very wide. They might be 500 tracks, particularly if it was a Dolby Atmos mix. Ardis has optimized their solutions to work really well in that environment.

You have a lot of different shows and series that get posted at Molinare. How do you control the amount of media so that your storage isn’t completely filled up?

[Woolfson] One of the things I identified quite quickly when I arrived was the lack of a long-term storage strategy. I’ve implemented buying a lot of lower-speed, lower-cost, and expandable options for nearline and archival storage. With good media management, we hopefully won’t have to buy too much more of the really high-speed stuff, unless we decide to add on a lot more finishing rooms.

For us, nearline includes lots of NAS [network attached storage] and big tape robots behind it. Plus some software that sits on top, which automatically manages moving data from disk to tape and then back again. We are using a Quantum tape solution – LTO-8 at the moment. Let’s say the grading team has access to a share of storage. Anything that they copy into there will be managed by a policy. We can either make one or two tape copies. We can leave the original media on disk. We can remove it from disk. We can remove it from disk after 60 days. There’s full flexibility in how it works.
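
To make the policy idea concrete, here’s a minimal sketch of how such disk/tape rules might be evaluated. The class and field names are hypothetical illustrations, not Quantum’s actual interface:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RetentionPolicy:
    tape_copies: int                 # one or two LTO copies
    keep_on_disk: bool               # leave the original on nearline disk?
    purge_after_days: Optional[int]  # remove from disk after N days (None = never)

def disk_action(policy: RetentionPolicy, age_days: int) -> str:
    """Decide what happens to the on-disk copy once it has been written to tape."""
    if not policy.keep_on_disk:
        return "purge"
    if policy.purge_after_days is not None and age_days >= policy.purge_after_days:
        return "purge"
    return "keep"

# The grading-share example from above: two tape copies, purge disk after 60 days.
grading_share = RetentionPolicy(tape_copies=2, keep_on_disk=True, purge_after_days=60)
print(disk_action(grading_share, 10))   # keep
print(disk_action(grading_share, 90))   # purge
```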

The pandemic has ushered in remote workflows and now companies are looking at various hybrid models. How does that impact your storage needs and affect other issues, like security?

[Woolfson] With the solutions that we offer our clients, media remains securely in our building as if you are editing locally. Effectively, Teradici remotes the monitors, keyboard, and mouse data, and it’s encrypted. In general terms that does satisfy most of our clients.

It didn’t change our storage systems, other than we needed more of it. Effectively we had increased the number of cutting rooms. We suddenly had 60 or 70 systems hung off of it rather than the 45 we usually do.

If you were to rebuild the facility today, what would you do differently when it comes to storage?

[Woolfson] I would still use Avid for basic editing. NEXIS works really well in the Media Composer world. Even if we had 200 edit suites, they’re unlikely all to be working on one project. So it’s still fairly straightforward to have multiple NEXIS’ each with a couple of projects on them, and then you only attach the relevant Avids to those.

I wouldn’t buy any more Isilon. I think at the time it was a good product. It was the right one for the company to buy, but it’s a very expensive product. I’m actually quite a fan of StorNext, Quantum’s file system. It has its challenges, but when it’s going well, it’s very reliable and fast.

__________

Republic Editorial

Republic Editorial is a commercial editorial shop located in Dallas. It started out as a branch of the Santa Monica-based Red Car. I spoke with Keith James, Republic partner and senior editor, along with Jason Vigue, who juggles a dual role as a designer and IT specialist for the company.

Please tell me a bit about Republic Editorial’s background.

[James] Red Car brought my partners and me together in 2004. When Red Car’s owner retired, we were able to transform into our own company – Republic Editorial. We also have two sister companies, Infinite Fiction, which focuses on 3D motion design work, and Threaded Pictures, which focuses on production. Our core business is 90% long- and short-format commercials, working with ad agencies, as well as direct to clients.

What is the design of your facility?

[James] We occupy the ground floor of an eight-story building in uptown Dallas. We have two Flame finishing rooms, five creative offline suites and two audio post studios – all very comfortable and designed around clients being in the room with us. Infinite Fiction, our design team, is housed in a small wing on this floor. Our Threaded Pictures production division has a small area for doing pre-pro meetings and casting calls.

What software are you using for editing and color correction?

[James] Two of the editors work on Avid Media Composer and four on Adobe Premiere Pro. Our Flame guys do the majority of color at this point just because they’re also compositors and that’s just been easier than round-tripping to other systems for grading.

Shared storage is an important component of many facilities. What did you settle on for your shop?

[Vigue] We have a 96 TB Facilis as our primary storage server. We started off with fiber channel when we did the initial build-out in 2013. But, with advancements in the technology and the iMac workstations, we’ve shifted to 10GbE [10 Gigabit Ethernet] over Cat6 cable. Now we’re down to only the two Flame systems that are still on fiber. We also have a smaller 32 TB Facilis system and everything is linked together on the same network through two 10GbE switches. So, even though they’re separate systems, they’re shared throughout the office.

When you made the decision to go with the Facilis system, were you primarily an Avid-based editorial shop?

[Vigue] At the time we were half and half. Adobe came in at the right time with a stable build just as Apple was shifting to Final Cut Pro X. Some of the editors didn’t want to shift from Final Cut Pro 7 back to Media Composer, so we were early adopters of Premiere.

When the pandemic hit, many places locked down and companies were forced to work remotely as much as possible. How did you address that?

[James] We adapted quickly to set up our home studios for our artists at the start, but also made a push before Delta hit to start getting people back into the office. We found that some disciplines, like audio, were really hard to do remotely – everyone was listening on different speakers. The offline editors were fine throughout the pandemic working from home. It was a lot more work with local drives, accessing our VPN and stuff like that, but we’re transitioning right now. We’re making the move to a ProMAX system, which gives you a hub at your house, plus a hub that sits off the Facilis and allows you to sync project and media files seamlessly between the two.

[Vigue] ProMAX has been around for a while with storage. We just came across this solution, which is a peer-to-peer sharing setup. They’ve got their own software, but it’s done through an Intel NUC mini PC. A NUC goes home with a local RAID and then we’re just selectively sharing job folders complete with project, graphics, and media files to editors based on their ongoing project needs.

[James] To be clear – we’re not using raw media with ProMAX – only lower-resolution, transcoded media for offline editing. We’re working out a different setup for our two Flame guys. For them, we’re looking at using Amulet Hotkey with Teradici, which will allow them to just remote into their systems at the office. The Flames need their local Stone storage, so duplicating all that stuff would not really work the way that it does for the offline editors.

Do you have a strategy for archiving projects and productions?

[Vigue] Currently we’re doing everything to LTO-6. Ours is a manual operation with two bays for LTO using StorageDNA’s DNAevolution. We keep everything online and organize our material quarterly. After about six to nine months, we start backing up quarters. We’ve also been discussing incorporating a cloud backup solution. We’ve talked to Wasabi, because they have no egress charges and have flat fees for their storage. I don’t think we would ever put an entire shoot in the cloud – only our project files. Original camera media is being backed up to LTO anyway.
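
For scale, an LTO-6 tape holds 2.5 TB natively (uncompressed), so even one full copy of the 96 TB primary array would span a fair stack of tapes – a rough calculation that ignores compression and how full the array actually is:

```python
import math

LTO6_NATIVE_TB = 2.5  # native (uncompressed) capacity of one LTO-6 tape

def tapes_needed(data_tb, tape_tb=LTO6_NATIVE_TB):
    """Minimum number of tapes to hold a given amount of data, one copy."""
    return math.ceil(data_tb / tape_tb)

print(tapes_needed(96))  # 39 tapes for a completely full 96 TB array
```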

Tell me a little bit about the considerations in re-assessing your future facility needs.

[James] We’re working with advertising clients in a traditional setup where the client is normally in the room. With the pandemic, we’ve done editing over Zoom and we’ve also used Evercast with specific jobs when latency was critical. I’d say right now, maybe 30% of our work is supervised with clients coming into the facility.

As an editor, when somebody sends out email comments, you often feel like you’re trying to fit a square peg into a round hole to make their cut work. However, when they’re with you, you can quickly take a stab at it and go, “Hey guys, I don’t think this is working,” present an alternate solution, and keep things moving forward. There are shops that are going completely virtual and we’re hearing more and more clients expressing frustration with that being their only option for working. Our goal, when this whole thing is hopefully over, is to just have the flexibility to be in studio or work from home, depending on what the client wants.

Originally written for postPerspective.

©2021 Oliver Peters

My Kingdom for Some Color Bars

In a former life, video deliverables were on videotape and no one seriously used the internet for any mission-critical media projects. TVs and high-quality video monitors used essentially the same display technology and standards. Every videotape started with SMPTE color bars used as a reference to set up the playback of the tape deck. Monitors were calibrated to bars and gray scale charts to assure proper balance, contrast, saturation, and hue. If the hardware was adjusted to this recognized standard, then what you saw in an edit suite would also be what the network or broadcaster would see going out over the air.

Fast forward to the present when nearly all deliverables are sent as files. Aesthetic judgements – especially by clients and off-site producers – are commonly made viewing MOV or MP4 files on some type of computer or device screen. As an editor who also does color correction, making sure that I’m sending the client a file that matches what I saw when it was created is very important.

Color management and your editing software

In researching and writing several articles and posts about trusting displays and color management, I’ve come to realize the following. If you expect the NLE viewer to be a perfect match with the output to a video display or an exported file playing in every media player, then good luck! The chances are slim.

There are several reasons for this. First, Macs and PCs use different gamma standards when displaying media files. Second, not all computer screens work in the same color space. For instance, some use P3-D65 while others use sRGB. Third, these color space and gamma standards differ from the standards used by televisions and also projection systems.

I’ll stick to standard dynamic range (SDR) in this discussion. HDR is yet another minefield best left for another day. The television display standard for SDR video is Rec. 709 with a 2.4 gamma value. Computers do not use this standard; however, NLEs use it as the working color space for the timeline. Some NLEs will also emulate this appearance within the source and record viewers in order to match the Rec. 709, 2.4 gamma feed going out through the i/o hardware to a video monitor.

As with still photos, a color profile is assigned when you export a video file, regardless of file wrapper or codec. This color profile is metadata that any media player software can use to interpret how a file should be displayed to the screen. For example, if you edit in Premiere Pro, Adobe uses a working SDR color space of Rec. 709 with 2.4 gamma. Exported files are assigned a color profile of 1-1-1. They will appear slightly lighter and less saturated in QuickTime Player as compared with the Premiere Pro viewer. That’s because computer screens default to a different gamma value – usually 1.96 on Macs. However, if you re-import that file back into Premiere, it will be properly interpreted and will match the original within Premiere. There’s nothing wrong with the exported file. It’s merely a difference based on differing display targets.
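
The lighter appearance can be reproduced with simple math. Treating both displays as pure power-function gammas is a simplification of the real transfer functions, but it shows the direction and rough size of the shift:

```python
def displayed_light(code_value, display_gamma):
    """Relative light output for a normalized (0-1) code value on a given display."""
    return code_value ** display_gamma

mid_gray = 0.5
in_suite = displayed_light(mid_gray, 2.4)   # ~0.19 on a 2.4-gamma grading display
on_mac = displayed_light(mid_gray, 1.96)    # ~0.26 at the ~1.96 Mac default

# The same pixel comes out brighter on the Mac screen, lifting mids and
# shadows - which reads as "lighter and less saturated."
assert on_mac > in_suite
```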

The developer’s conundrum

A developer of editing software has several options when designing their color management system. The first is to assume that the viewer should match Rec. 709, 2.4 gamma, since that’s the television standard and is consistent with legacy workflows. This is the approach taken by Adobe, Avid, and Blackmagic, but with some variations. Premiere Pro offers no alternate SDR timeline options, but After Effects does. Media Composer editors can set the viewer based on several standards and different video levels for Rec. 709: legal range (8-bit levels of 16-235) versus full range (8-bit levels of 0-255). Blackmagic enables different gamma options even when the Rec. 709 color space is selected.
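
The legal-versus-full distinction is just a linear scaling of the 8-bit code values. Here is a quick sketch of the conversion (luma only, ignoring chroma and whatever clipping policy a real app applies):

```python
def legal_to_full(y):
    """Expand a legal-range (16-235) 8-bit luma value to full range (0-255)."""
    return max(0, min(255, round((y - 16) * 255 / 219)))

def full_to_legal(y):
    """Compress a full-range (0-255) 8-bit luma value into legal range (16-235)."""
    return round(y * 219 / 255 + 16)

# Reference black and reference white line up in both directions:
assert legal_to_full(16) == 0 and legal_to_full(235) == 255
assert full_to_legal(0) == 16 and full_to_legal(255) == 235
```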

Apple has taken a different route with Final Cut Pro by utilizing ColorSync. The same image in an FCP viewer will appear somewhat brighter than in the viewer of other NLEs; however, it will match the playback of an exported file in QuickTime Player. In addition, the output through AJA or Blackmagic i/o hardware to a video display will also match. Not only does the image look great on Apple screens, but it looks consistent across all apps on any Apple device that uses the ColorSync technology.

You have to look at it this way. A ton of content is being delivered only over the internet via sites like Instagram, Facebook, and YouTube rather than through traditional broadcast. A file submitted to a large streamer like Netflix will be properly interpreted within their pipeline, so no real concerns there. This raises the question: should the app’s viewer really be designed to emulate Rec. 709, 2.4 gamma or should it look correct for the computer’s display technology?

The rubber meets the road

Here’s what happens in actual practice. When you export from Premiere Pro, Final Cut Pro, or Media Composer, the result is a media file tagged with the 1-1-1 color profile. For Premiere and Media Composer, exports will appear with somewhat less contrast and saturation than the image in the viewer.

In Resolve, you can opt to work in Rec. 709 with different gamma settings, including 2.4 or 709-A (“A” for Apple, I presume). These two different output settings would look the same until you start to apply a color grade (so don’t switch midstream). If you are set to 2.4 (or automatic), then the exported file has a color profile of 1-2-1. But with 709-A the exported file has a color profile of 1-1-1. These Resolve files will match the viewer and each other, but will also look darker than the comparable Premiere Pro and FCP exports.

All of the major browsers use the color profile. So do most media players, except VLC. These differences are also apparent on a PC, so it’s not an Apple issue per se. More importantly the profile determines how a file is interpreted. For instance, the two Resolve ProRes exports (one at 1-1-1, the other at 1-2-1) look the same in this first generation export. But let’s say you use Adobe Media Encoder to generate H.264 MP4 viewing copies from those ProRes files. The transcoded MP4 of the 709-A export (1-1-1 color profile) will match its ProRes original. However, the transcoded MP4 of the 2.4 export (1-2-1 color profile) will now look a bit brighter than its ProRes original. That’s because the color profile of the MP4 has been changed to 1-1-1.
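
For reference, those profile numbers are the QuickTime “NCLC” triplet – color primaries, transfer function, and matrix – using the code values defined in ITU-T H.273. A small lookup table (my summary of a few common codes, not an exhaustive list) shows why 1-2-1 is ambiguous:

```python
# H.273 code numbers for each field of the NCLC (primaries-transfer-matrix) tag.
PRIMARIES = {1: "BT.709"}
TRANSFER = {1: "BT.709", 2: "unspecified", 13: "sRGB", 16: "PQ", 18: "HLG"}
MATRIX = {1: "BT.709"}

def describe(profile):
    """Translate a tag like '1-2-1' into human-readable terms."""
    p, t, m = (int(x) for x in profile.split("-"))
    return (PRIMARIES.get(p, "other"), TRANSFER.get(t, "other"), MATRIX.get(m, "other"))

print(describe("1-1-1"))  # ('BT.709', 'BT.709', 'BT.709')
print(describe("1-2-1"))  # ('BT.709', 'unspecified', 'BT.709')
```

With the transfer function tagged “unspecified,” a downstream transcoder has to guess the gamma – which is exactly why the re-encoded MP4 shifted.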

Gamma changes mostly affect the midrange and shadow portion of a video signal. Therefore, differences are also more or less apparent depending on content. The more extreme your grading, the more apparent (and to some, obnoxious) these differences become. If these really bother you, then there are several ways to create files that are “enhanced” for computer viewing. This will make them a bit darker and more saturated.

  1. You can tweak the color correction by using an adjustment layer to export a file with a bit more contrast and saturation. In Premiere Pro, you can use a Lumetri effect in the adjustment layer to add a slight s-curve along with a 10% bump in saturation.
  2. You can use a QT Gamma Correction LUT (such as from Adobe) as part of the export. However, in my experience, it’s a bit too dark in the shadows for my taste.
  3. You can pass the exported file through After Effects and create a separate sRGB version.
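
The first option can be sketched in a few lines. The curve shape and luma weights here are my own illustration – Lumetri’s internal math is not public – but the 10% saturation figure follows the text:

```python
def s_curve(x, strength=0.3):
    """Blend a normalized (0-1) value with a smoothstep curve for midtone contrast."""
    smooth = x * x * (3 - 2 * x)
    return (1 - strength) * x + strength * smooth

def boost_saturation(r, g, b, amount=1.10):
    """Push each channel away from luma (Rec. 709 weights) by 10%."""
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b
    return tuple(min(1.0, max(0.0, luma + amount * (c - luma))) for c in (r, g, b))

# Shadows get darker and highlights brighter, adding "bite" for computer viewing:
assert s_curve(0.25) < 0.25 and s_curve(0.75) > 0.75
```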

These approaches are not transparent. In other words, you cannot re-import these files and expect them to match the original. Be very careful about your intentions when using any of these hacks, because you are creating misadjusted files simply for viewing purposes. 

In the end, is it really right to use Rec. 709 2.4 gamma as the standard for an NLE viewer? Personally, I think Apple used the better and more modern approach. Should you do any of these hacks? Well, that’s up to you. More and more people are reviewing content on smart phones and tablets – especially iPhones and iPads – all of which show good-looking images. So maybe these concerns are simply much ado about nothing.

Or paraphrasing Dr. Strangelove – How I Learned to Stop Worrying and Love Color Profiles.

©2021 Oliver Peters