Color Finale Connect – Remote Grading for FCPX

Remote workflows didn’t start with COVID, but that certainly drove the need home for many. While editing collaboration at a distance can be a challenge, it’s a far simpler prospect than remote color grading. That’s often a very interactive process that happens on premises between a colorist and a client, director, or cinematographer. Established high-end post facilities, like Company3 with locations in the US, Canada, and England, have pioneered remote color grading sessions using advanced systems like Resolve and Baselight. This allows a director in Los Angeles and a colorist in London to conduct remote, real-time, interactive grading sessions. But the investment in workflow development, hardware, and grading environments to make this happen is not inconsequential.

High-end remote grading comes to Final Cut Pro X

The Color Finale team has been on a quest to bring advanced grading tools to the Final Cut Pro X ecosystem, starting with last December’s release of Color Finale 2. Many editors are working from home these days, so the team decided to leverage the macOS and FCPX frameworks to enable remote grading in a far simpler way than other grading solutions offer.

The result is Color Finale Connect, which is a Final Cut Pro X workflow extension currently in free public beta. Connect enables two or more Final Cut Pro X users to collaborate in near-real-time in a color grading session, regardless of their location. This review is in the context of long distance sessions, but Connect can also be used within a single facility where the participants might be in other parts of the building or in different buildings.

Color Finale Connect requires each user in a session to be on macOS Catalina, running licensed copies of Final Cut Pro X (not trial) and Color Finale 2.2 Pro (or higher). Download and install Color Finale Connect, which shows up as a Final Cut workflow extension. You can work in a Connect session with or without local media on every participant’s system. In order to operate smoothly and keep the infrastructure lightweight, person-to-person communication is handled outside of Connect. For example, interact with your director via Skype or Zoom on an iPad while you separately control Final Cut on your iMac.

Getting started

To start a session, each participant launches the Color Finale Connect extension within Final Cut. Whoever starts a session is the “broadcaster” and others that join this session are “followers.” The session leader (who has the local media) drags the Project icon to the Connect panel and “publishes” it. This generates a session code, which can be sent to the other participants to join the session from within their Connect extension panels.

Once a session is joined, the participants drag the Project icon from the Connect panel into an open FCPX Event. This generates a timeline of clips. If they have the matching local media, the timeline will be populated with the initial graded clips. If they don’t have media, then the timeline is populated with placeholder clips. Everyone needs to keep their Connect panel open to stay in the session (it can be minimized).

Data transfer is very small, since it consists mainly of Color Finale instructions; therefore, crazy-fast internet speeds aren’t required. It is peer-to-peer and doesn’t live anywhere “in the cloud.” If a participant doesn’t have local media installed, then as the session leader makes a color correction change in Color Finale 2 Pro, an “in-place” full-resolution frame is sent for that clip on the timeline. As more changes are made, the frames are updated in near-real-time.

The data communication is between Color Finale on one system and Color Finale on the others. All grading must happen within the Color Finale 2 Pro plug-in, not FCPX’s native color wheels or other plug-ins. The “in-place” frames support all native Final Cut media formats, such as H.264, ProRes, and ProRes RAW; however, formats that require a plug-in, like RED camera raw files, will not transmit “in-place” frames. In that case, the data applied to the placeholder frame is updated, but you won’t see a reference image.

This isn’t a one-way street. The session leader can enable any participant to also have control. Let’s say the session leader is the colorist and the director of photography is a participant. The colorist can enable remote control for the DP, which would permit them to make tweaks on their own system. This in turn would update back on the colorist’s system, as well as for all the other participants.

Color Finale Connect workflows

I’ve been testing a late-stage beta version of Connect and Color Finale 2.2 Pro, and the system works well. The “in-place” concept is ingenious, but the workflow is best when each session member has local media. This has become easier thanks to the enhanced proxy workflow added in Final Cut Pro X 10.4.9. Let’s say the editor has the full-resolution, original media and generates smaller proxies – for example, 50% size H.264 files. These are small enough that you can easily send the Library and proxy media to all participants using services like WeTransfer, MASV, FileMail, or Frame.io.

One of the session members could be a favored colorist on the other side of the world. In this case, he or she would be working with the proxy media. If the editor and colorist are both able to control the session, then it becomes highly interactive. Formats like RED don’t pose a problem thanks to the proxy transcodes, as long as no local changes are made outside of the Color Finale plug-in. In other words, don’t change the RED raw source settings within this session. Once the colorist has completed the grade using proxy media, those grading settings would be updated through a Connect session on the editor’s system where the original media resides.

Color management

How do you know that your client sees the color in the same way as you do on a reference display? Remote color grading has always been hampered by color management and monitor calibration. It would, of course, be ideal for each participant in the session to have Blackmagic or AJA output hardware connected to a calibrated display. If there is an a/v output for FCPX, then the Connect session changes will also be seen on that screen. But that’s a luxury most clients don’t have.

This is where Apple hardware, macOS, and Final Cut Pro X’s color management come to the rescue and make Color Finale Connect a far simpler solution than other methods. If both you and your client are using Apple hardware (iMac, iMac Pro, Pro Display XDR) then color management is tightly controlled and accurate. First make sure that macOS display settings like True Tone and Night Shift are turned off on all systems. Then you are generally going to see the same image within the Final Cut viewer on your iMac screen as your client will see on theirs.

The one caveat is that users still have manual control of the screen brightness, which can affect the perception of the color correction. One tip is to include a grayscale or color chart that can be used to roughly calibrate the display’s brightness setting. Can everyone just barely see the darkest blocks on the chart? If not, brighten the display setting slightly. It’s not a perfect calibration, but it will definitely get you in the ballpark.
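If you don’t have a chart handy, one is easy to generate. Here is a minimal sketch in Python (assuming NumPy and Pillow are installed; the step count and resolution are arbitrary illustrative choices) that renders a simple grayscale step wedge you could import as a still and place on the timeline:

```python
# grayscale_wedge.py -- generate a simple step chart for rough brightness matching
# Assumes NumPy and Pillow; 16 steps at 1920x1080 are arbitrary illustrative values.
import numpy as np
from PIL import Image

WIDTH, HEIGHT, STEPS = 1920, 1080, 16

# Evenly spaced gray levels from black to white, one vertical bar per step.
levels = np.linspace(0, 255, STEPS).astype(np.uint8)
bar_width = WIDTH // STEPS
chart = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)
for i, level in enumerate(levels):
    chart[:, i * bar_width:(i + 1) * bar_width] = level

Image.fromarray(chart, mode="L").save("grayscale_wedge.png")
```

The idea is simply that every participant looks at the same still. If someone cannot distinguish the two or three darkest bars from each other, their display is set too dark; if the brightest bars merge, it’s set too bright.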

Color Finale 2 Pro turns Final Cut Pro X into an advanced finishing solution. Thanks to the ecosystem and extensions framework, Final Cut opens interesting approaches to collaboration, especially in the time of COVID. Tools like Frame.io and Postlab enable better long-distance collaboration in easier-to-use ways than previous technologies. Color Finale Connect brings that same ease-of-use and efficient remote collaboration to FCPX grading. Remember this is still a beta, albeit a stable one, so make sure you provide feedback should any issues crop up.

Originally written for FCP.co.

©2020 Oliver Peters

Working with ACES in DaVinci Resolve

In the film days, a cinematographer had a good handle on what the final printed image would look like. The film stocks, development methods, and printing processes were regimented with specific guidelines and limited variations. In color television production, up through the early adoption of HD, video cameras likewise adhered to the standards of Rec. 601 (SD) and Rec. 709 (HD). The advent of the video colorist allowed for more creative looks derived in post. Nevertheless, video directors of photography could also rely on knowing that the image they were creating would translate faithfully throughout post-production.

As video moved deeper into “cinematic” images, raw recording and log encoding became the norm. Many cinematographers felt their control of the image slipping away, thanks to the preponderance of color science approaches and LUTs (color look-up tables) generated from a variety of sources and applied in post. As a result, the Academy Color Encoding System (ACES) was developed as a global standard for managing color workflows. It’s an open color standard and method of best practices created by filmmakers and color scientists under the auspices of the Science and Technology Council of the Academy of Motion Picture Arts and Sciences (AMPAS, aka “The Academy”). To dive into the nuances of ACES – complete with user guides – check out the information at ACEScentral.com.

The basics of how ACES works

Traditionally, Rec. 709 is the color space and gamma encoding standard that dictates your input, timeline, and exports for most television projects. Raw and log recordings are converted into Rec. 709 through color correction or LUTs. The color gamut is then limited to the Rec. 709 color space. Therefore, if you later try to convert a Rec. 709 ProResHQ 4:2:2 master file into full RGB, Rec. 2020, HDR, etc., then you are starting from an already-restricted range of color data. The bottom line is that this color space has been defined by the display technology – the television set.

ACES is its own color space designed to be independent of the display hardware. It features an ultra-wide color gamut that encompasses everything the human eye can see. It is larger than Rec. 709, Rec. 2020, P3, sRGB, and others. When you work in an ACES pipeline, ACES is an intermediate color space not intended for direct viewing. In other words, ACES is not dictated by current display technology. Files being brought into ACES and being exported for delivery from ACES pass through input and output device transforms. These are mathematical color space conversions.

For example, say you film with an ARRI Alexa, record in Log-C, and grade in a Rec. 709 pipeline. A Log-C-to-Rec. 709 LUT is applied to the clip to convert it to the project’s Rec. 709 color space. The ACES process is similar. When working in an ACES pipeline, instead of applying a LUT, I would apply an Input Device Transform (IDT) specific to the Alexa camera. This is equivalent to a camera profile for each camera manufacturer’s specific color science.
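To make the idea of a transform a little more concrete, here is a minimal Python sketch of the kind of math a Log-C IDT performs as its first step: converting the camera’s Log-C encoding back to scene-linear light. The constants are ARRI’s published LogC (v3, EI 800) parameters as I recall them, so verify against ARRI’s documentation; treat this as an illustration of the concept rather than the actual transform code inside Resolve or Color Finale, since a full IDT also converts the camera’s wide-gamut primaries into ACES primaries.

```python
# logc_decode.py -- scene-linear recovery from ARRI LogC (v3, EI 800)
# Illustrative only: a real IDT also maps ARRI Wide Gamut primaries into ACES AP0.
import numpy as np

# Published ARRI LogC3 EI 800 constants (quoted from memory; verify against ARRI's white paper).
CUT, A, B, C, D, E, F = 0.010591, 5.555556, 0.052272, 0.247190, 0.385537, 5.367655, 0.092809

def logc_to_linear(t):
    """Invert the LogC curve: code values (0..1) back to relative scene linear."""
    t = np.asarray(t, dtype=np.float64)
    return np.where(t > E * CUT + F,
                    (10.0 ** ((t - D) / C) - B) / A,
                    (t - F) / E)

# 18% gray encodes to roughly 0.391 in LogC EI 800; decoding should return ~0.18.
print(logc_to_linear(0.391))
```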

ACES requires one extra step, which is to define the target device on which this image will be displayed. If your output is intended to be viewed on television screens with a standard dynamic range, then an Output Device Transform (ODT) for Rec. 709 would be applied as the project’s color output setting. In short, the camera file is converted by the IDT into the ACES working color space, but is viewed on your calibrated display based on the ODT used. Under the hood, ACES preserves all of the color data available from the original image. In addition to IDTs and ODTs, ACES also provides for Look Modification Transforms (LMT). These are custom “look” files akin to various creative LUTs built for traditional Rec. 709 workflows.

ACES holds a lot of promise, but it is still a work-in-progress. If your daily post assignments don’t include major network or studio deliverables, then you might wonder what benefit ACES has for you. In that case, yes, continuing to stick with a Rec. 709 color pipeline will likely be fine for a while. But companies like Netflix are behind the ACES initiative and other media outlets are bound to follow. You may well find yourself grading a project that requires ACES deliverables at some point in the future.

There is no downside to adopting an ACES pipeline now for all of your Resolve Rec. 709 projects. Working in ACES does not mean you can instantly go from a grade using a Rec. 709 ODT to one with a Rec. 2020 ODT without an extra trim pass. However, ACES claims to make that trim pass easier than other methods do.

The DaVinci Resolve ACES color pipeline

Resolve has earned a position of stature within the industry. With its low price point, it also offers the most complete ACES implementation available to any editor and/or colorist. Compared with Media Composer, Premiere Pro, or Final Cut Pro X, I would only trust Resolve for an accurate ACES workflow at this point in time. However, you can start your edit in Resolve as Rec. 709 – or roundtrip from another editor into Resolve – and then switch the settings to ACES for the grade and delivery. Or you can start with ACES color management from the beginning. If you start a Resolve project using a Rec. 709 workflow for editing and then switch to ACES for the grade, be sure to remove any LUTs applied to clips and reset grading nodes. Those adjustments will all change once you shift the settings into ACES color management.

To start with an ACES workflow, select the Color Management tab in the Master Settings (lower right gear icon). Change Color Science to ACEScct and ACES version 1.1. (The difference between ACEScc and ACEScct is that the latter has a slight roll-off at the bottom, thus allowing a bit more shadow detail.) Set the rest as follows: ACES Input Device Transform to No Input Transform. ACES Output Device Transform to Rec. 709 (when working with a calibrated grading display). Process Node LUTs in ACEScc AP1 Timeline Space. Finally, if this is for broadcast, enable Broadcast Safe and set the level restrictions based on the specs that you’ve been supplied by the media outlet.

With these settings, the next step is to select the IDT for each camera type in the Media page. Sort the list to change all cameras of a certain model at once. Some media clips will automatically apply an IDT based on metadata embedded into the clip by the camera. I found this to be the case with the raw formats I tested, such as RED and BRAW. While an IDT may appear to be doing the same thing as a technical LUT, the math is inherently different. As a result, you’ll get a slightly different starting look with Rec. 709 and a LUT, versus ACES and an IDT.

Nearly all LUTs are built for the Rec. 709 color space and should not be used in an ACES workflow. Yes, you can apply color space transforms within your node structure, but the results are highly unpredictable and should be avoided. Technical camera LUTs in Resolve were engineered by Blackmagic Design based on each camera manufacturer’s specs; they are not supplied to Blackmagic by the manufacturers themselves. The same is true for Apple, Avid, and Adobe, which means that in all cases a bit of secret sauce may have been employed. Apple’s S-Log conversion may not match Avid’s, for instance. ACES IDTs and ODTs within Resolve are also developed by Blackmagic, but based on ACES open standards. In theory, the results of an IDT in Resolve should match the same IDT used by another software developer.

Working with ACES on the Color page

After you’ve set up color management and the transforms for your media clips, you’ll have no further interaction with ACES during editing. Likewise, when you move to the Color page, your grading workflow will change very little. Of course, if you are accustomed to applying LUTs in a Rec. 709 workflow, that step will no longer be necessary. You might find a reason to change the IDT for a clip, but typically it should be whatever is the correct camera profile for the associated clip. Under the hood, the timeline is actually working in a log color space (ACEScc AP1); therefore, I would suggest grading with Log rather than Primary color wheels. The results will be more predictable. Otherwise, grade any way you like to get the look that you are after.

Currently Resolve offers few custom look presets specific to the ACES workflow. There are three LMTs found under the LUTs option / CLF (common LUT format) tab (right-click any node): LMT Day for Night, LMT Kodak 2383 Print Emulation, and LMT Neon Suppression. I’m not a fan of either of the first two looks. Quite frankly, I feel Resolve’s film stock emulations are awful and certainly nowhere near as pleasing as those available through Koji Advance or FilmConvert Nitrate. But the third is essential. The ACES color space has one current issue: extremely saturated colors with a high brightness level, like neon lights, can induce image artifacts. The Neon Suppression LMT can be applied to tone down extreme colors in some clips. For example, a shot with a highly saturated red item will benefit from this LMT, so that the red looks normal.

If you have used LUTs and filters for certain creative looks, like film stock emulation or the orange-and-teal look, then use PowerGrades instead. Unlike LUTs, which are intended for Rec. 709 and are typically a “black box,” a PowerGrade is simply a string of nodes. Every time you grab a still in the Color page, you have stored that series of correction nodes as a PowerGrade. A few enterprising colorists have developed their own packs of custom Resolve PowerGrades available for free or sale on the internet.

The advantages are twofold. First, a PowerGrade can be applied to your clip without any transform or conversion to make it work. Second, because these are a series of nodes, you can tweak or disable nodes to your liking. As a practical matter, because PowerGrades were developed against a particular base image, you should insert a node in front of the added PowerGrade nodes. This lets you balance your own image first, giving the PowerGrade nodes an optimal starting point.

Deliverables

The project’s ODT is still set to Rec. 709, so nothing changes in the Resolve Deliver page. If you need to export a ProResHQ master, simply set the export parameters as you normally would. As an extra step of caution, set Data Levels (under Advanced Settings) to Video, and set the Color Space and Gamma Tags to Rec. 709 and Gamma 2.4. The result should be a proper video file with correct broadcast levels. So far so good.

One of the main reasons for an ACES workflow is future proofing, which is why you’ve been working in this extended color space. No common video file format preserves this data. Furthermore, formats like DNxHR and ProRes are governed by companies and aren’t guaranteed to be future-proofed.

An ACES archival master file needs to be exported in the Open EXR file format, which is an image sequence of EXR files. This will be a separate deliverable from your broadcast master file. First, change the ACES Output Device Transform (Color Management setting) to No Output Device and disable Broadcast Safe limiting. At this point all of your video clips will look terrible, because you are seeing the image in the ACES log color space. That’s fine. On the Deliver page, change the format to EXR, RGB float (no compression), set Data Levels to Auto and Color Space and Gamma Tags to Same As Project, and then export.

In order to test the transparency of this process, I reset my settings to an ODT of Rec. 709 and imported the EXR image sequence – my ACES master file. After import, the clip was set to No Input Transform. I placed it back-to-back on the timeline against the original. The two clips were a perfect match: EXR without added grading and the original with correction nodes. The one downside of such an Open EXR ACES master is a huge size increase. My 4K ProRes 4444 test clip ballooned from an original size of 3.19GB to 43.21GB in the EXR format.
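That increase is easy to sanity check with some back-of-the-envelope math. A minimal sketch follows; the UHD resolution, 24 fps frame rate, and 18-second duration are assumptions for illustration (a DCI 4K frame is slightly larger), but the numbers line up closely with the 43.21GB figure above.

```python
# exr_size_estimate.py -- rough storage math for an uncompressed RGB float EXR sequence
# Assumptions: UHD resolution, 3 channels, 32-bit float, 24 fps; headers/metadata ignored.
width, height, channels, bytes_per_sample = 3840, 2160, 3, 4

frame_bytes = width * height * channels * bytes_per_sample
print(f"per frame: ~{frame_bytes / 1e6:.0f} MB")          # ~100 MB per frame

seconds, fps = 18, 24
total_gb = frame_bytes * seconds * fps / 1e9
print(f"{seconds} s at {fps} fps: ~{total_gb:.0f} GB")    # ~43 GB
```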

Conclusion

Working with ACES inside of DaVinci Resolve involves some different terminology, but the workflow isn’t too different once you get the hang of it. In some cases, camera matching and grading is easier than before, especially when multiple camera formats are involved. ACES is still evolving, but as an open standard supported globally by many companies and noted cinematographers, the direction can only be positive. Any serious colorist working with Resolve should spend a bit of time learning and getting comfortable with ACES. When the time comes that you are called upon to deliver an ACES project, the workflow will be second nature.

UPDATE 2/23/21

Since I wrote this post, I’ve completed a number of grading jobs using the ACES workflow in DaVinci Resolve and have encountered several issues, primarily banding and artifacts with certain colors.

In a recent B-roll shoot, the crew was recording a casino set with an ARRI Alexa Mini in Log-C. The set involved a lot of extreme lights and colors. The standard Resolve ACES workflow would be to set the IDT to Alexa, which automatically corrects the Log-C image into the default working color space. In addition, it’s also recommended to apply neon suppression in order to tone down the bright colors, like vibrant reds.

I soon discovered that the color of certain LED lights in the set became wildly distorted. The purple trim lighting on the frames of signs or the edges of slot machines became very garish and artificial. When I set the IDT to Rec. 709 instead of Alexa and graded the shot manually without any IDT or LUT, I was able to get back to a proper look. It’s worth noting that I tested these same shots in Final Cut Pro using the Color Finale 2 Pro grading plug-in, which also incorporates ACES and log corrections. No problems there.

After scrutinizing a number of other shots within this batch of B-roll footage, I noticed quite a bit more banding in mid-range portions of these Alexa shots. For example, the slight lighting variations on a neutral wall in the background displayed banding, as if it were an 8-bit shot. In general, natural gradients within an image didn’t look as smooth as they should have. This is something I don’t normally see in a Rec 709 workflow with Log-C Alexa footage.

Overall, after this experience, I am now less enthusiastic about using ACES in Resolve than I was when I started out. I’m not sure if the issue is with Blackmagic Design’s implementation of these camera IDTs or if it’s an inherent problem with ACES. I’m not yet willing to completely drop ACES as a possible workflow, but for now I have to advise proceeding with caution if you intend to use it.

Originally written for Pro Video Coalition.

©2020, 2021 Oliver Peters

FilmConvert Nitrate

When it comes to film emulation software and plug-ins, FilmConvert is the popular choice for many editors. It was one of the earliest tools for film stock emulation in digital editing workflows. It not only provides excellent film looks, but also functions as a primary color correction tool in its own right. FilmConvert has now been updated into FilmConvert Nitrate – a name that’s a tip of the hat to the chemical composition of early film stocks.

The basics of film emulation with Nitrate

FilmConvert Nitrate uses built-in looks based on 19 film stocks. These include a variety of motion and still photo negative and positive stocks, ranging from Kodak and Fuji to Polaroid and Ilford. Each stock preset includes built-in film grain based on 6K film scans. Unlike other plug-ins that simply add a grain overlay, FilmConvert calculates and integrates grain based on the underlying color of the image. Whenever you apply a film stock style, a matching grain preset, which changes with each stock choice, is automatically added. The grain amount and texture can be changed or you can dial the settings back to zero if you simply want a clean image.

These film stock emulations are not simply LUTs applied to the image. In order to work its magic, FilmConvert Nitrate starts with a camera profile. Custom profiles have been built for different camera makes and models and these work inside the plug-in. This allows the software to tailor the film stock to the color science of the selected camera for more accurate picture styles. When you select a specific camera from the pulldown menu instead of the FilmConvert default, you’ll be prompted to download any camera pack that hasn’t already been installed. Free camera profile packs are available from the FilmConvert website and currently cover most of the major brands, including ARRI, Sony, Blackmagic, Canon, Panasonic, and more. You don’t have to download all of the packs at first and can add new camera packs at any time as your productions require it.

New features in FilmConvert Nitrate include Cineon log emulation, curves, and more advanced grain controls. The Cineon-to-print option appears whenever you apply FilmConvert Nitrate to a log clip, such as from an ARRI Alexa recorded in Log-C. This option enables greater control over image contrast and saturation. Remember to first remove any automatic or manually-applied LUTs, otherwise the log conversion will be doubled.

Taking FilmConvert Nitrate for a spin

As with my other color reviews, I’ve tested a variety of stock media from various cameras. This time I added a clip from Philip Bloom’s Sony FX9 test. The clip was recorded with that camera’s S-Cinetone profile, which is based on Sony’s Venice color. It looks quite nice to begin with, but of course, that doesn’t mean you shouldn’t tweak it! Other clips included ARRI Alexa log and Blackmagic BRAW files.

In Final Cut Pro X, apply the FilmConvert Nitrate plug-in to a clip and launch the floating control panel from the inspector. In Premiere, all of the controls are normally exposed in the effects controls panel. The plug-in starts with a default preset applied, so next select the camera manufacturer, model, and profile. If you haven’t already installed that specific camera pack, you’ll be prompted to download and install it. Once that’s done, simply select the film stock and adjust the settings to taste. Non-log profiles present you with film chroma and luma sliders. Log profiles change those sliders into film color and Cineon-to-print film emulation.

Multiple panes in the panel expand to reveal the grain response and primary color controls. Grading adjustments include exposure/temperature/tint, low/mid/high color wheels, and saturation. As you move the temperature and tint sliders left or right, the slider bar shows the color for the direction in which you are moving that control. That’s a nice UI touch. In addition, there are RGB curves (which can be split by color) and a levels control. Overall, this plug-in plays nice with Final Cut Pro X and Premiere Pro. It’s responsive and real-time playback performance is typically not impacted.

It is common in other film emulation filters to include grain as an overlay effect. Adjusting the filter with and without grain often results in a large difference in level. Since Nitrate’s grain is a built-in part of the preset, you won’t get an unexpected level change as you apply more grain. In addition to grain presets for film stocks from 8mm to 35mm Full Frame, you can adjust grain luminance, saturation, and size. You can also soften the picture under the grain, which might be something you’d want to do for a more convincing 8mm emulation. One unique feature is a separate response curve for grain, allowing you to adjust the grain brightness levels for lows, mids, and highs. In order to properly judge the amount of grain you apply, set Final Cut Pro X’s playback setting to Better Quality.
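For readers curious about what “built-in” grain means compared with a simple overlay, here is a minimal Python sketch of the general idea of luminance-weighted grain mixing. This is emphatically not FilmConvert’s actual algorithm; it just illustrates why grain that is modulated by the image itself, and kept zero-mean, doesn’t shift overall levels the way a straight overlay can.

```python
# grain_mix.py -- illustrative luminance-weighted grain, not FilmConvert's algorithm
import numpy as np

def add_grain(img, amount=0.05, response=(1.0, 0.6, 0.3), seed=0):
    """img: float RGB array in 0..1. response: grain weight for (lows, mids, highs)."""
    rng = np.random.default_rng(seed)
    luma = img @ np.array([0.2126, 0.7152, 0.0722])           # Rec. 709 luma weights

    # Blend the three response weights across the tonal range of each pixel.
    lows, mids, highs = response
    weight = np.where(luma < 0.5,
                      lows + (mids - lows) * (luma / 0.5),
                      mids + (highs - mids) * ((luma - 0.5) / 0.5))

    noise = rng.normal(0.0, 1.0, luma.shape)
    grained = img + (amount * weight * noise)[..., None]      # zero-mean noise, so levels hold
    return np.clip(grained, 0.0, 1.0)

frame = np.full((1080, 1920, 3), 0.4)                          # flat mid-gray test frame
print(add_grain(frame).mean())                                 # stays close to 0.4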

For a nice trick, apply two instances of Nitrate to a clip. On the first one, select a motion picture negative stock, like Kodak 5207 Vision 3. Then apply a second instance with the default preset, but select a still photo positive stock, like Fuji Astia 100. Finally, tweak the color settings to get the most pleasing look. At this point, however, you will need to render for smooth playback. The result is designed to mimic a true film process, in which you would shoot a negative stock and then print it to a photograph or release print.

FilmConvert Nitrate supports the ability to export your settings as a 3D LUT (.cube) file, which will carry the color information, although not the grain. To test the transparency of this workflow, I exported my custom Nitrate setting as a LUT. Next, I removed the plug-in effect from the clip and added the Custom LUT effect back to it. This was linked to the new LUT that I had just exported. When I compared the clip with the Nitrate setting versus just the LUT, they were very close, with only a minor level difference between them. This is a great way to move a look between systems or into other applications without having FilmConvert Nitrate installed in all of them.
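A .cube file is simply a plain-text 3D lattice of output RGB triplets. As a rough illustration of what an application does when you attach a Custom LUT effect, here is a minimal sketch that parses a .cube file and applies it with nearest-neighbor lookup. Real implementations use trilinear or tetrahedral interpolation, and this sketch ignores optional DOMAIN_MIN/MAX lines, so treat it as conceptual only.

```python
# apply_cube_lut.py -- minimal .cube reader with nearest-neighbor lookup (illustration only)
import numpy as np

def load_cube(path):
    size, rows = None, []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            if line.startswith("LUT_3D_SIZE"):
                size = int(line.split()[1])
            elif line[0].isdigit() or line[0] in "+-.":
                rows.append([float(v) for v in line.split()])
    # .cube data lists red varying fastest, then green, then blue -> index as [b, g, r].
    return np.array(rows).reshape(size, size, size, 3), size

def apply_lut(img, lut, size):
    """img: float RGB in 0..1. Nearest-neighbor sample of the LUT lattice."""
    idx = np.clip(np.rint(img * (size - 1)).astype(int), 0, size - 1)
    return lut[idx[..., 2], idx[..., 1], idx[..., 0]]

# Example usage (paths and names are placeholders):
# lut, n = load_cube("my_nitrate_look.cube")
# graded = apply_lut(frame, lut, n)
```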

Wrap-up

Any color correction effect – especially a film emulation style – is highly subjective, so no single filter is going to be a perfect match for everyone’s taste. FilmConvert Nitrate advances the original FilmConvert plug-in with an updated interface, built around a venerable set of film stock choices. This makes it a good choice if you want to nail the look of film. There’s plenty you can tweak to fine-tune the look, not to mention a wide variety of specific camera profiles. Even Apple iPhones are covered.

FilmConvert Nitrate is available for Final Cut Pro X 10.4.8 and Motion running under macOS 10.13.6 or later. It is also available for Premiere Pro/After Effects, DaVinci Resolve, and Media Composer on both macOS and Windows 10. The plug-in can be purchased for individual applications or as a bundle that covers all of the NLEs. If you already own FilmConvert, then the company has upgrade offers to switch to FilmConvert Nitrate.

Originally written for FCP.co.

©2020 Oliver Peters

Color Finale 2.1 Update

Color grading roundtrips are messy and prone to errors. Most editors want high-quality solutions that keep them within their favorite editing application. Color Trix launched the revamped Color Finale 2 this past December with the goal of building Final Cut Pro X into a competitive, professional grading environment. In keeping with that goal, Color Trix just released Color Finale 2.1 – the first major update since the December launch. Color Finale 2.1 is a free upgrade for Color Finale 2 owners and adds several new features, including inside/outside mask grading, an image mask, a new smoothness function, and the ability to copy and paste masks between layers.

Grading with inside/outside masks

Color Finale 2 launched with trackable, spline masks that could be added to any group or layer. In version 2.0, however, grading occurred either inside or outside of the mask, but not both. Version 2.1 allows a mask to be applied to a group, which then becomes the parent mask. Grading is then done within that mask. If you also want to grade the area outside of that mask, simply apply a new group inside the first group. Then add a new mask that is an invert of the parent mask. Now you can add new layers to grade the area outside of the same mask.

In the example image, I first applied a mask around the model at the beach and color corrected her. Then I applied a new group with an inverted mask to adjust the sky. In that group I could add additional masking, such as an edge mask to create a gradient. The parent mask around the model ensures that the sky gradient is applied behind her rather than in the foreground. Once you get used to this grouping strategy with inside and outside masks, you can achieve some very complex results.

Image masks

The second major addition is that of image masks. This is a monochrome version of the image in which the dark-to-light contrast range acts as a qualifier or matte source to restrict the correction being applied to the image. The mask controls include black and white level sliders, blurring, and the ability to invert the mask. Wherever you see a light area in the mask is where that color correction will be applied. This enables a number of grading tricks that are also popular in photography, including split-toning and localized contrast control.
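Conceptually, an image mask of this kind is just a processed luminance channel. Here is a minimal Python sketch of the idea (not Color Trix’s implementation, and it assumes NumPy and SciPy are available): build a mask from luma, apply black/white level clipping, optionally blur and invert it, and use it to blend a corrected version of the frame back into the original.

```python
# image_mask.py -- luminance mask used as a correction qualifier (illustrative only)
import numpy as np
from scipy.ndimage import gaussian_filter

def image_mask(img, black=0.1, white=0.9, blur=2.0, invert=False):
    """img: float RGB in 0..1. Returns a 0..1 mask; light areas receive the grade."""
    luma = img @ np.array([0.2126, 0.7152, 0.0722])
    mask = np.clip((luma - black) / max(white - black, 1e-6), 0.0, 1.0)
    if blur > 0:
        mask = gaussian_filter(mask, sigma=blur)
    return 1.0 - mask if invert else mask

def apply_masked(img, corrected, mask):
    """Blend a corrected version of the frame through the mask."""
    return img * (1.0 - mask[..., None]) + corrected * mask[..., None]

# Example: tint only the highlights toward warm (names and values are illustrative).
# warm = np.clip(frame * np.array([1.05, 1.0, 0.92]), 0, 1)
# result = apply_masked(frame, warm, image_mask(frame, black=0.6, white=0.95))
```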

Simply put, split-toning divides the image according to darks and lights (based on the image mask) and enables you to apply a different correction to each. This can be as extreme as a duotone look or something a bit more normal, yet still stylized.

In the duotone example, I first removed saturation from the original clip to create a black-and-white image. Then, the boxer’s image mask divides the range so that I could apply red and blue tinting for the duotone look.

In the second example, the image mask enabled me to create glowing highlights on the model’s face, while pushing the mids and shadows back for a stylistic appearance.

Another use for an image mask can be for localized contrast control. This technique allows me to isolate regions of the image and grade them separately. For example, if I want to only correct the shadow areas of the image, I can apply an image mask, invert it (so that dark areas are light in the mask), and then apply grading within just the dark areas of the image – as determined by the mask.

Smoothness

Color Finale 2 included a sharpness slider. New in version 2.1 is the ability to go in the opposite direction to soften the image, simply by moving the slider left into negative values. This slider controls the high frequency detail of the overall image – positive values increase that detail, while negative values decrease it.
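In signal-processing terms, a bidirectional slider like this behaves much like an unsharp mask with a signed amount: the difference between the image and a blurred copy is the high-frequency detail, and adding or subtracting a scaled portion of it sharpens or softens the picture. Color Trix hasn’t published the actual math, so the following Python sketch is only a plausible approximation of the behavior.

```python
# smoothness.py -- signed unsharp mask as a sharpen/soften control (illustrative only)
import numpy as np
from scipy.ndimage import gaussian_filter

def smoothness(img, amount, radius=1.5):
    """amount > 0 sharpens, amount < 0 softens; img is float RGB in 0..1."""
    low = gaussian_filter(img, sigma=(radius, radius, 0))  # blur spatially, not across channels
    detail = img - low                                     # high-frequency component
    return np.clip(img + amount * detail, 0.0, 1.0)
```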

Since this is an overall effect, it can’t be masked within the layers panel. If you want to apply it just to a person’s face, as other “beauty” filters do, that can be achieved by using Final Cut Pro X’s built-in effects masks. This way a similar result can be reached while staying within the Color Finale workflow.

One last addition to version 2.1 is that Final Cut Pro X’s hotkeys now stay active while the Color Finale layers panel is open. Color Trix has stated that it plans more upgrades and options over the next nine months, so look for more ahead. Color Finale 2.1 is already a powerful grading tool for nearly any level of user. Nevertheless, more features will certainly be music to the ears of advanced users who prefer to stay within Final Cut Pro X to finish and deliver their projects. Stay tuned.

Click this link for the Color Finale 2 Easy Reference Guide.

Originally written for FCP.co.

©2020 Oliver Peters

Chasing the Elusive Film Look

Ever since we started shooting dramatic content on video, directors have pushed to achieve the cinematic qualities of film. Sometimes that’s through lens selection, lighting, or frame rate, but more often it falls on the shoulders of the editor or colorist to make that video look like film. Yet, many things contribute to how we perceive the “look of film.” It’s not a single effect, but rather the combination of careful set design, costuming, lighting, lenses, camera color science, and color correction in post.

As editors, we have control over the last ingredient, which brings me to LUTs and plug-ins. A number of these claim to offer looks based on certain film emulsions. I’m not talking about stylized color presets, but the subtle characteristics of film’s color and texture. But what does that really mean? A projected theatrical film is the product of four different stocks within that chain – original camera negative, interpositive print, internegative, and the release print. Conversely, a digital project shot on film and then scanned to a file only involves one film stock. So it doesn’t really mean much to say you are copying the look of film emulsion, without really understanding the desired effect.

My favorite film plug-in is Koji Advance, which is distributed through the FxFactory platform. Koji was developed by CrumplePop together with noted film timer Dale Grahn. A film timer is the film lab’s equivalent of a digital colorist. Grahn selected several color and black-and-white film stocks as the basis for the Koji film looks and film grain emulation. Then CrumplePop’s developers expanded those options with neutral, saturated, and low contrast versions of each film stock and included camera-based conversions from log or Rec 709 color spaces. This is all wrapped into a versatile color correction plug-in with controls for temperature/tint, lift/gamma/gain/density (low, mid, high, master), saturation, and color correction sliders.

This post isn’t a review of the Koji Advance plug-in, but rather how to use such a filter effectively within an NLE like Final Cut Pro X (or Premiere Pro and After Effects, as well). In fact, these tips can also be used with other similar film look plug-ins. Koji can be used as your primary color correction tool, applying and adjusting it on each clip. But I really see it as icing on the cake and so will take a different approach.

1. Base grade/shot matching. The first thing you want to do in any color correction session is to match your shots within the sequence. It’s best to establish a base grade before you dive into certain stylized looks. Set the correct brightness and contrast and then adjust for proper balance and color tone. For these examples, I’ve edited a timeline consisting of a series of random FilmSupply stock footage clips. These clips cover a mix of cameras and color spaces. Before I do anything, I have to grade these to look consistent.

Since these are not all from the same set-up, there will naturally be some variances. A magic hour shot can never be corrected to be identical to a sunny exterior or an office shot. Variations are OK, as long as general levels are good and the tone feels right. Final Cut Pro X features a solid color correction tool set that is aided by the comparison view. That makes it easy to match a shot to the clip before and after it in the timeline.

2. Adding the film look. Once you have an evenly graded sequence of shots, add an adjustment layer. I will typically apply the Koji filter, an instance of Hue/Sat Curves, and a broadcast-safe limiter to that layer.

Within the Koji filter, select generic Rec 709 as the camera format and then the desired film stock. Each selection will have different effects on the color, brightness, and contrast of the clips. Pick the one closest to your intended effect. If you also want film grain, then select a stock choice for grain and adjust the saturation, contrast, and mix percentage for that grain. It’s best to view grain playing back at close to your target screen size with Final Cut set to Better Quality. Making grain judgements in a small viewer or in Better Performance mode can be deceiving. Grain should be subtle, unless you are going for a grunge look.

The addition of any of these film emulsion effects will impact the look of your base grade; therefore, you may need to tweak the color settings with the Koji controls. Remember, you are going for an overall look. In many cases, your primary grade might look nice and punchy – perfect for TV commercials. But that style may feel too saturated for a convincing film look in a drama. That’s where the Hue/Sat Curves tool comes in. Select LUMA vs SAT and bring down the low end to taste. You want to end up with pure blacks (at the darkest point) and a slight decrease in shadow-area saturation.
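The LUMA vs SAT move is easy to picture as code: compute each pixel’s luma, derive a saturation multiplier from a curve that dips toward zero in the shadows, and scale the chroma accordingly. Here is a minimal Python sketch of that idea (not Apple’s implementation), using a simple linear ramp in place of a user-drawn curve:

```python
# luma_vs_sat.py -- reduce saturation in the shadows (illustrative only)
import numpy as np

def shadow_desaturate(img, knee=0.25, floor=0.0):
    """Below 'knee' luma, saturation ramps down toward 'floor'; above it, untouched."""
    luma = img @ np.array([0.2126, 0.7152, 0.0722])
    sat_scale = np.clip(floor + (1.0 - floor) * (luma / knee), floor, 1.0)
    # Scale chroma (distance from luma) per pixel; luma itself is preserved.
    return luma[..., None] + (img - luma[..., None]) * sat_scale[..., None]

# Example usage (parameter values are illustrative):
# graded = shadow_desaturate(frame, knee=0.3)
```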

3. Readjust shots for your final grade. The application of a film effect is not transparent and the Koji filter will tend to affect the look of some clips more than others. This means that you’ll need to go back and make slight adjustments to some of the clips in your sequence. Tweak the clip color correction settings applied in the first step so that you optimize each clip’s final appearance through the Koji plug-in.

4. Other options. Remember that Koji or similar plug-ins offer different options – so don’t be afraid to experiment. Want film noir? Try a black-and-white film stock, but remember to also turn down the grain saturation.

You aren’t going for a stylized color correction treatment with these tips. What you are trying to achieve is a look that is more akin to that of a film print. The point of adding a film filter on top is to create a blend across all of your clips – a type of visual “glue.” Since filters like this and the adjustment layer as a whole have opacity settings, it is easy to go full bore with the look or simply add a hint to taste. Subtlety is the key.

Originally written for FCP.co.

©2020 Oliver Peters