YouTube influencers are a big part of the content creation landscape today. Their videos cover many different niches and often have a surprisingly large base of followers. I take a look at YouTube, aviation, and the use of Final Cut Pro to post-produce these videos (FCP.co).
Unless you’ve delivered master files for broadcast, you might not be as tuned into meeting delivery specs, especially when it comes to the perceived loudness levels of your mix. I discuss working with audio in Final Cut Pro to mix and master at optimal levels (FCP.co).
The folks at Color Trix have come up with an ingenious solution to augment Final Cut Pro’s camera raw support. The new Color Finale Transcoder adds additional camera raw formats, notably Blackmagic RAW. Check out my review of the software (FCP.co).
In the “good old days” of post, directors, cinematographers, and clients would all judge final image quality in an online edit or color correction suite using a single, calibrated reference monitor. We’ve moved away from rooms that look like the bridge of the Enterprise into more minimalist set-ups. This is coupled with the current and possibly future work-from-home and general remote post experiences. Without everyone looking at the same reference display, it becomes increasingly difficult to be sure that what everyone sees is actually the proper appearance of the image. For some, clients rarely come into the suite anymore. Instead, they are often making critical judgements based on what they see on their home or work computers and/or devices.
The lowest common denominator
Historically, a common item in most recording studios was a set of Auratone sound cubes. These small, single-speaker monitors, which some mixers dubbed “awful-tones,” were intended to provide a representation of the mix as it would sound on radios and cheaper hi-fi audio set-ups. TV show re-recording mixers would also use these to check a mix in order to hear how it would translate to home TV sets.
Today, smartphones and tablets have become the video equivalent of that cheap hi-fi set-up. Generally that means Apple iPhones or iPads. In fact, thanks to Apple’s color management, videos played back on iPads and iPhones do approximate the correct look of your master file. As editors or colorists, we often ask clients to evaluate the image on an Apple device, not because they are perfect (they aren’t), but rather because they are the best of the many options out in the consumer space. In effect, checking against an iPhone has become the modern video analog of the Auratone sound cubes.
Apple color management
Apple’s color management includes several techniques that are helpful, but can also trip you up. If you are going to recommend that your clients use an iPhone, iPad, or even an iMac to judge the material, then you also want to make sure they have correctly set up their device. This also applies to you, the editor, if you are creating videos and only making judgements on an iMac (or XDR) display, without any actual external video (not computer) display.
Apple computers enable the use of different color profiles and the ability to make adjustments according to calibration. If you have a new iMac, then you are generally better off leaving the color profile set to the default iMac setting instead of fiddling with other profiles. New Apple device displays are set to P3 D65 color with higher peak brightness (up to 500 nits – more with XDR). You cannot expect them to perfectly reproduce an image that looks 100% like a Rec 709 TV set at 100 nits. But, they do get close.
I routinely edit/grade with Media Composer, Premiere Pro, DaVinci Resolve, and Final Cut Pro on iMacs and iMac Pros. Of these four, only Final Cut Pro shows an image in the edit viewer window that is relatively close to the way that image appears on the video output to a monitor. This is thanks to Apple’s color management and the broader Apple hardware/software ecosystem. The viewer image for the other three may look darker, be more saturated, have richer reds, and/or show more contrast.
Once you get past the color profile (Mac only), then most Apple devices offer two or three additional user controls (depending on OS version). Obviously there’s brightness, which can be manual or automatic. When set to automatic, the display will adjust brightness based on the ambient light. Generally auto will be fine, unless you really need to see crucial shadow detail. For example, the PLUGE portion of a test pattern (darkest gray patches) may not be discernible unless you crank up the brightness or are in a dark room.
The other controls are True Tone and Night Shift. These do have a pleasing effect, because they make the display warmer or cooler based on the time of day or the color temperature of the ambient light in the room. Typically the display will appear warmer at night or in a dimmer room. If you are working with a lot of white on the screen, such as working with documents, then these modes do feel more comfortable on your eyes (at least for me). However, your brain adjusts to the color temperature shift of the display when using something like True Tone. The screen doesn’t register in your mind as being obviously warm.
If you are doing anything that involves judging color, the LAST thing you want to use is True Tone or Night Shift. This applies to editing, color correction, art, photography, etc. It’s important to note that these settings only affect the way the image is displayed on the screen. They don’t actually change the image itself. Therefore, if you take a screen grab with True Tone or Night Shift set very cool or warm, the screen grab itself will still be neutral.
In my case, I leave these off for all of the computers I use, but I’m OK with leaving them on for my iPhone and iPad. However, this does mean I need to remember to turn the setting off whenever I use the iPhone or iPad to remotely judge videos. And there’s the rub. If you are telling your client to remotely judge a video using an Apple device – and color is part of that evaluation – then it’s imperative that you ask them (and maybe even teach them how) to turn off those settings. Unless they are familiar with the phenomenon, the odds are that True Tone and/or Night Shift have been enabled on their device(s) and they’ve never thought twice about it simply because the mind adjusts.
QuickTime Player is the default media player for many professionals and end users, especially those using Macs. The way QuickTime displays a compatible file to the screen is determined by the color profile embedded into the file metadata. If I do a color correction session in Resolve, with the color management set to Rec 709 2.4 gamma (standard TV), then when I render a ProRes file, it will be encoded with a color profile of 1-2-1 (primaries-transfer-matrix; the middle number is how the 2.4 gamma export gets tagged).
If I export that same clip from Final Cut Pro or Premiere Pro (or re-encode the Resolve export through one of those apps) the resulting ProRes now has a profile of 1-1-1. The difference through QuickTime Player is that the Resolve clip will look darker in the shadows than the clip exported from FCP or Premiere Pro. Yet both files contain exactly the same pixels. It’s merely how QuickTime Player displays it to the screen based on the metadata. If I open both clips in different players, like Switch or VLC, which don’t use this same metadata, then they will both appear the same, without any gamma shift.
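For the curious, that profile triple lives in the file’s ‘colr’ atom, per the QuickTime file format spec. Here’s a rough Python sketch that digs the primaries-transfer-matrix triple out of a .mov – the function name and the simple byte scan are my own illustration, not any Apple API:

```python
import struct

def read_nclc_tags(data: bytes):
    """Scan QuickTime/MP4 bytes for a 'colr' atom carrying an
    'nclc' (or 'nclx') color parameter box and return the
    (primaries, transfer, matrix) triple, e.g. (1, 1, 1) or (1, 2, 1)."""
    i = data.find(b"colr")
    while i != -1:
        # the 4-byte colour parameter type follows the atom name
        if data[i + 4:i + 8] in (b"nclc", b"nclx"):
            # then three big-endian 16-bit indices
            return struct.unpack(">HHH", data[i + 8:i + 14])
        i = data.find(b"colr", i + 4)
    return None

# usage: read_nclc_tags(open("master.mov", "rb").read())
```

Run against a Resolve export and an FCP export of the same clip, this is a quick way to confirm which tags each app actually wrote.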
How should one deal with such uncertainties? Obviously, it’s a lot easier to tackle when everyone is in the same room. Unfortunately, that’s a luxury that may become a thing of the past. It already has for many. Fortunately most people aren’t as sensitive to color issues as the typical editor, colorist, or DP. In my experience, people tend to have greater issues with the mix than they do with color purity. But that shouldn’t stop you from politely educating your client and making sure certain best practices are followed.
First, make sure that features like True Tone and Night Shift are disabled, so that a neutral image is being viewed. Second, if you use a review-and-approval service, like frame.io or Vimeo, then you can upload test chart image files (color bars, grayscale, etc). These may be used whenever you need to check the image with your client. Is the grayscale a neutral gray in appearance or is it warmer or cooler? Can you see separation in the darkest and brightest patches of these charts? Or are they all uniformly black or white? Knowing the answers will give you a better idea about what the client is seeing and how to guide them to change or improve their settings for more consistent results.
Finally, if their comments seem to relate to a QuickTime issue, then suggest using a different player, such as Switch (free with watermarks will suffice) or VLC.
The brain, eyes, and glasses
Some final considerations… No two people see colors in exactly the same way. Many people suffer from mild color blindness, i.e., color vision deficiencies. This means they may be more or less sensitive to shades of some colors. Eyeglasses affect your eyesight. For example, many glasses, depending on the coatings and material, will yellow over time. I cannot use polycarbonate lenses, because I see chromatic aberration on highlights wearing this material, even though most opticians and other users don’t see that at all. CR-39 (optical plastic) or glass (no longer sold) are the only eyeglass materials that work for me.
If I’m on a flight in a window seat, then the eye closest to the window is being bombarded with a different color temperature of light than the eye towards the plane’s interior. This can be exacerbated with sunglasses. After extended exposure to such a differential, I can look at something neutral and when I close one eye or the other, I will see the image with a drastically different color temperature for one eye versus the other. This eventually normalizes itself, but it’s an interesting situation.
One of the many color correction challenges is matching dissimilar cameras used within the same production. This tends to be the case in many web, streaming, and non-scripted projects, where budgets and availability often dictate the mix of cameras to be used. I frequently end up with RED, ARRI, Panasonic, Sony, DJI, and GoPro cameras all in the same show. Most NLEs do include basic, albeit imperfect, shot-matching features. However, now several software developers are taking that challenge head-on.
One such developer is New Zealand’s FilmConvert, developers of the FilmConvert Nitrate film emulation plug-in. Their newest product is CineMatch, a camera-matching plug-in that’s currently available for DaVinci Resolve and Premiere Pro – and coming to Final Cut Pro X in the future. As with Nitrate, CineMatch is a cross-platform plug-in that may be purchased for a specific NLE host or as a bundle license to cover all products.
The CineMatch concept is very straightforward. Most productions have a main or “hero” camera – typically designated as the A-camera. Then there are other cameras for cutaways and alternate angles – B-camera, C-camera, etc. The principle is to match the look of the B- and C-cameras to that of the A-camera.
Dealing with color science
Each camera manufacturer uses different color science for their products. Sony will have a distinctly different look from Canon or Panasonic. FilmConvert builds its plug-ins based on camera packs, which are each customized for a specific manufacturer and model in order to properly match that camera’s color science.
If you have a production that mixes a Sony FX9, an ARRI Amira, and a Panasonic GH5, then each uses a different camera pack. CineMatch is designed to work with Log/RAW/BRAW formats, so there are fewer packs available on the CineMatch site than on the Nitrate site. That’s because many of the prosumer cameras supported by Nitrate do not record in log and, therefore, wouldn’t be appropriate for CineMatch. Since CineMatch uses fewer camera packs, all currently supported packs are included in the installer.
The basics of matching
To start, disable any embedded LUT or remove any that you may have added. Next, apply the CineMatch effect to the clips on the timeline in Premiere Pro or as nodes in Resolve. On A-camera clips, set the appropriate source camera profile, but no target profile. For B-cams, C-cams, and other clips, set their source camera profile; however, set their target profile to match the A-camera source.
In a situation with an ALEXA A-cam and a Panasonic EVA1 as the B-cam, the ALEXA would only use the ALEXA source profile. The EVA1 would be set to the EVA1 source, but ALEXA as the target profile. Essentially you are moving all cameras into a color space matching the ARRI ALEXA Log-C color science.
To properly view the CineMatch output, apply the Rec 709 transform. However, since CineMatch has converted all of these clips into a common log space, such as ARRI’s Log-C, you can also opt to leave this transform off within the clip filter and apply a conversion LUT at a different point, such as in an adjustment layer in Premiere Pro or as a timeline grading node in Resolve. This way, CineMatch is not limited to Rec 709/SDR projects.
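FilmConvert hasn’t published CineMatch’s internal math, but the core idea – re-encoding every camera into a common log space – can be illustrated with ARRI’s published LogC3 (EI 800) transfer function. A hedged Python sketch; the constants come from ARRI’s white paper and the function names are mine:

```python
import math

# ARRI LogC3 constants for EI 800 (other EIs use different values --
# see ARRI's published formula if you need them).
CUT, A, B = 0.010591, 5.555556, 0.052272
C, D, E, F = 0.247190, 0.385537, 5.367655, 0.092809

def logc_encode(x):
    """Scene-linear reflectance -> LogC3 code value (0..1)."""
    return C * math.log10(A * x + B) + D if x > CUT else E * x + F

def logc_decode(t):
    """LogC3 code value -> scene-linear reflectance."""
    return (10 ** ((t - D) / C) - B) / A if t > E * CUT + F else (t - F) / E
```

Conceptually, a B-camera clip would be decoded with its own camera’s transfer function to scene-linear light, then re-encoded with the A-camera’s (here, Log-C) – at which point one Rec 709 transform works for everything.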
Additional color correction tools
Ideally the camera crew should have maintained proper and consistent exposure and white balance among cameras used on a common set-up. Even better if color charts have also been recorded at the start. In a perfect world, you’d now be done. Unfortunately, that’s never the case. You’ve unified the color space, but this doesn’t automatically match one clip to the next. CineMatch includes a comprehensive color correction toolkit to further match and adjust clips. There are white balance and exposure controls for quick adjustments.
If you use the split screen comparison view in Premiere Pro or Resolve, CineMatch HSL curves can be used to refine the match between source and target clips. As with Nitrate, there’s a full set of secondary color controls, including wheels, curves, and levels. Not only can you better match cameras to each other, but you can also use CineMatch to cover most basic grading needs without ever touching Resolve’s grading controls or Premiere’s Lumetri panel.
Working with CineMatch
Although this plug-in is marketed for camera matching, you can use it completely apart from that task. That’s primarily because of the camera packs. For example, when you film with a Panasonic GH5 in a log profile, no NLE offers a stock LUT that is correct for that camera. Generally you end up just correcting it without a LUT or applying a generic Panasonic V-Log LUT. That LUT was designed for the VariCam’s color science and is not a perfect match for every other Panasonic camera. Close, but not spot-on. CineMatch lets you apply a correction that is tailor-made for each individual camera profile, thanks to FilmConvert’s development of a wide range of professional and prosumer camera packs.
The second advantage is that you can impart the look of other cameras. For example, I’m a fan of ARRI’s color science and really prefer the look of an ALEXA over most other cameras. I can apply CineMatch to a GH5 clip, set the source profile to GH5 and the target to ALEXA and impart a bit of that ARRI color to the GH5 clip. While it’s not a replacement for shooting with an ALEXA and the color conversion might not be absolutely perfect, it’s a nice adjustment that gives me a better image than working with that clip on its own.
Finally, if you own both CineMatch and FilmConvert Nitrate, it is possible to use the two in conjunction with each other. Just be very careful of the processes and their order. In the GH5/ALEXA example, make the profile conversion in CineMatch. Make no color adjustments there and don’t apply the Rec 709 transform. Then add FilmConvert Nitrate, set its profile to the ARRI settings and make your film emulation and color adjustments to taste.
Following on the heels of the previous post, I’d like to cover five simple steps to follow when performing basic color correction, aka “grading,” in Final Cut Pro X. Not every clip or project will use all of these, but apply the applicable steps when appropriate.
Step 1. LUTs (color look-up tables)
There are technical and creative LUTs. Here we are talking only about technical camera LUTs that are useful when your footage was recorded in a log color space. These LUTs convert the clip from log to a display color space (Rec 709 or other) and turn the clip’s appearance from flat to colorful. Each manufacturer offers specific LUTs for the profile of their camera models.
Some technical LUTs are already included with the default FCPX installation and can be accessed through the settings menu in the inspector. Others must be downloaded from the manufacturer or other sources and stored elsewhere on your system. If you don’t see an appropriate option in the inspector, then apply the Custom LUT effect and navigate to a matching LUT stored on your system.
Step 2. Balance Color
Next, apply the Balance Color effect to each clip. This will slightly expand the contrast of the clip and create an averaged color balance. This is useful for many, but not all clips. For instance, a clip shot during “golden hour” will have a warm, yellow-ish glow. You don’t want that to be balanced neutral. You have no control over the settings of the Balance Color process, other than to pick between Automatic and White Balance. Test and see when and where this works to your advantage.
Note that this works best for standard footage without a LUT or when the LUT was applied through the inspector menu. If the LUT was applied as a Custom LUT effect, then Balance Color will be applied ahead of the Custom LUT and may yield undesirable results.
Step 3. Color correction – color board, curves, or color wheels
This is where you do most of the correction to alter the appearance of the clip. Any or all of FCPX’s color correction tools are fine and the tool choice often depends on your own preference. For most clips it’s mainly a matter of brightening, expanding contrast, increasing or decreasing saturation, and shifting the hue offsets of lows (shadow area), midrange, and highlights. What you do here is entirely subjective, unless you are aiming for shot-matching, like two cameras in an interview. For most projects, subtlety is the key.
Step 4. Luma vs Sat
It’s easy to get carried away in Step 3. This is your chance to rein it back in. Apply the Hue/Sat Curves tool and select the Luma vs Sat curve. I described this process in the previous post. The objective is to roll off the saturation of the shadows and highlights, so that you retain pure blacks and whites at the extreme ends of the luminance range.
Step 5. Broadcast Safe
If you deliver for broadcast TV or a streaming channel, your video must be legal. Different outlets have differing standards – some looser or stricter than others. To be safe, limit your luminance and chrominance levels by applying a Broadcast Safe effect. This is best applied to an adjustment layer added as a connected clip at the topmost level above the entire sequence. Final Cut Pro X does not come with an adjustment layer Motion template title, but there are plenty available for download.
Apply the Broadcast Safe effect to that adjustment layer clip. Make sure it’s set to the color space that matches your project (sequence) setting (typically Rec 709 for HD and 4K SDR videos). At its default, video will be clipped at 0 and 100 on the scopes. Move the amount slider to the right for more clipping when you need to meet more restrictive specs.
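Conceptually, the default clip is just a hard clamp of out-of-range values. In 8-bit video-range terms, 0 and 100 IRE correspond to code values 16 and 235, so the operation amounts to something like this Python sketch (the function name is mine, not FCP’s):

```python
def broadcast_safe(levels, lo=16, hi=235):
    """Clamp 8-bit video-range luma samples to the legal range:
    16 = 0 IRE (black), 235 = 100 IRE (white)."""
    return [min(max(v, lo), hi) for v in levels]
```

Dragging the amount slider tighter would simply narrow `lo` and `hi`, which is why aggressive settings visibly flatten your deepest shadows and brightest highlights.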
These five steps are not the end-all/be-all of color correction/grading. They are merely a beginning guide to achieve quick and attractive grading using Final Cut Pro X. Test them out on your footage and see how to use them with your own workflows.
Color correction plug-ins are certainly fun to use, but Final Cut Pro X has plenty of horsepower on its own. There are also a number of features and techniques that often get overlooked. Here are five simple tricks you can use to enhance the look of your videos.
Balance Color / Match Color
Final Cut has offered the ability to auto balance the color of a shot and to match two shots to each other since the early days. Balance color analyzes a shot and corrects it to a neutral tonality. This is typically a “best guess,” based on either an automatic overall adjustment or one focusing on white balance.
You can select all of the clips in your timeline and balance them in a single command. Most of the time, automatic will result in a pleasing enhancement of the image. The exception is when there’s no clear white reference in the shot, like a close-up on fire. In that case, you’d want to retain the orange/yellow qualities of the shot. The balance color results are not adjustable (other than to select between automatic or white balance), so consider it a first step to be further enhanced by the other color correction tools.
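Apple doesn’t document what Balance Color actually computes, but classic auto-balance algorithms it resembles are easy to sketch. The gray-world method, for instance, scales each channel so the frame averages to neutral – a hypothetical Python illustration, not FCP’s actual math:

```python
def gray_world_balance(pixels):
    """Gray-world auto white balance: scale each channel so the
    image's average becomes neutral gray. `pixels` is a list of
    (r, g, b) floats in 0..1."""
    n = len(pixels)
    avg = [sum(p[c] for p in pixels) / n for c in range(3)]
    gray = sum(avg) / 3
    gains = [gray / a if a else 1.0 for a in avg]
    return [tuple(min(p[c] * gains[c], 1.0) for c in range(3))
            for p in pixels]
```

You can see why fire close-ups defeat this approach: an honestly orange frame averages warm, so forcing the average to gray drains exactly the color you wanted to keep.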
Match color is different in that it corrects the tonality of a selected clip to match another clip in the timeline. Park on the clip, select match color, skim to the clip used for the match, click on that frame for a preview, and then click Apply Match if you like the preview results. A common application might be to match A and B cameras to each other.
Match color can be used creatively, as well. For example, let’s say you want your clips to match the tonality of shots from a particular movie. A great resource for film reference frames is the ShotDeck website (free if you register as a beta user). Find and download a reference frame. Import the image file and edit it to the end of your timeline. Now use match color for a selected clip and skim to that reference film image on your timeline.
Obviously it won’t make a sunny daytime shot look exactly like a moody night shot from Blade Runner, but it will adopt the overall tonality as closely as possible. Once you’ve completed the match, delete the reference image from your timeline.
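Again, Apple’s algorithm is a black box, but tools like this commonly work along the lines of histogram matching: remap each channel of the source so its distribution of values lines up with the reference. A toy per-channel sketch in Python (the function name is mine):

```python
def match_channel(source, reference):
    """1D histogram match: map each source value to the reference
    value of equal rank. Done per R/G/B channel, this crudely
    transfers the reference clip's overall tonality."""
    order = sorted(range(len(source)), key=lambda i: source[i])
    ref_sorted = sorted(reference)
    out = [0.0] * len(source)
    for rank, i in enumerate(order):
        # spread reference ranks across the source's length
        j = round(rank * (len(ref_sorted) - 1) / max(len(source) - 1, 1))
        out[i] = ref_sorted[j]
    return out
```

This is also why a daytime shot matched to a night frame only adopts the *tonality*: rank order within the source image is preserved, so content and composition stay intact.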
Blend modes
Timeline clips include blend mode attributes. The default is the normal mode, which can easily be changed to screen, overlay, soft light, and so on. Blend modes (also called composite and transfer modes) are commonly used by graphic artists and designers, but they are also useful in creating special looks for video.
To start, option-drag a clip above itself to create a duplicate as a connected clip. You now have two versions of that clip in perfect sync with each other. Adjust the blend mode of the connected clip. Let’s say you have a low-contrast clip. When you drag it above itself and change the top clip’s mode to soft light, it will instantly result in more contrast and saturation.
This same trick can be used to create stylish effects. For example, when you add a gaussian blur to the lower clip, the results are a dreamy image. Add a sharpening filter to the top clip and now you’ve added some localized contrast. Push it far enough and the results are almost cartoon-like.
You can use this trick on single clips or the entire sequence. Simply compound the sequence and then option-drag the compound up to create a duplicate, connected clip. Select the desired blend mode of the connected clip and tweak for your look.
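If you’re wondering why soft light over itself adds contrast, the standard (W3C/PDF) soft-light formula darkens values below mid-gray and brightens values above it. FCP’s exact blend math is undocumented, so treat this Python sketch as illustrative:

```python
import math

def soft_light(a, b):
    """W3C soft-light blend of backdrop value a with blend value b,
    both in 0..1."""
    if b <= 0.5:
        return a - (1 - 2 * b) * a * (1 - a)
    d = ((16 * a - 12) * a + 4) * a if a <= 0.25 else math.sqrt(a)
    return a + (2 * b - 1) * (d - a)

def self_blend(pixel):
    """Blend a pixel over itself in soft light -- the contrast trick."""
    return tuple(soft_light(c, c) for c in pixel)
```

With a = b (the duplicated clip), a 25% gray comes out darker and a 75% gray comes out brighter, while mid-gray stays put – an S-curve, which is exactly a contrast boost.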
Hue/Saturation Curves – Luma vs Sat
When you push color correction to an extreme, the black and white detail in the image becomes contaminated by the color shift introduced by your correction. You no longer have anything that’s true black or white in the frame. When a colorist is creatively adjusting a shot, they still want to end up with nearly pure black at the darkest part of the image and pure white at the brightest.
One way to achieve this is with the Luma vs Sat curve in the Hue/Saturation Curves tool. From left to right, the base line represents the brightness range from black to white. Pulling the curve up or down increases or decreases saturation at the brightness value corresponding to that point along the line. To reduce color saturation for black and white, add a control point inwards from each end of the line. Now drag the outer points down to decrease saturation. Drag the inner points more or less towards the center to adjust the roll-off from full saturation to zero saturation. The more gentle the slope of the curve, the smoother the roll-off with fewer potential artifacts. However, the trade-off is that you may lose too much saturation overall. So adjust to taste.
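The same roll-off can be expressed as a simple function of luma: full saturation through the midrange, easing smoothly to zero at both ends. A Python sketch of the idea – the breakpoints 0.1 and 0.9 are arbitrary placeholders, not FCP values:

```python
def luma_sat_rolloff(luma, low=0.1, high=0.9):
    """Saturation multiplier for a given luma (0..1): full saturation
    in the midrange, smooth roll-off to zero at black and white."""
    def smoothstep(e0, e1, x):
        t = min(max((x - e0) / (e1 - e0), 0.0), 1.0)
        return t * t * (3 - 2 * t)
    # ramp up out of black, ramp down into white
    return smoothstep(0.0, low, luma) * (1 - smoothstep(high, 1.0, luma))
```

Moving the inner control points toward the center is equivalent to widening `low`/`high` here: a gentler slope, smoother roll-off, but less saturation overall.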
Orange and Teal
Filmmakers are enamored with the “orange-and-teal” look – skin tones tend to be warm (orange), while middle-range portions of the image take on a teal color cast. This is aided by proper lighting, costuming, and set design that is conducive to such a grade. For example, walls and furnishings that are neutral, black, gray, or dark in some way tend to be easier to swing towards the teal than a background of bright red walls.
Final Cut Pro X does include a color preset (Spring Sun) that mimics this look. You can also achieve it with any of Final Cut’s grading tools, such as the color wheels. First, use any color tool to establish a normal grade. Then apply color wheels for the look. The objective is to isolate skin tones from the rest of the image. Add an image mask for the wheels and use the HSL keyer. Use the color picker to select a skin tone. View the mask in black-and-white and adjust the HSL settings for a smooth key. It’s OK if the mask includes more than only the skin tones. The smoothness of the key is important.
Once you are happy with the key, go back to the image and grade inside of the mask. Push the midrange color wheel to the orange as needed. Or keep the skin tones neutral if you want to preserve a natural appearance. Change the mask toggle to outside and shift the midrange, shadows, and master towards teal. Finally, add the Luma vs Sat adjustment as described above to restore natural blacks and whites to the shadow and highlight areas.
Custom LUTs
Color look-up tables – LUTs – are used to apply the proper color profile for cameras. Custom LUTs can also be used creatively to create and preserve preset looks such as stylized grades, film stock emulation, and more. Final Cut cannot export LUTs; however, it can import any LUT in the standard .cube format. The internet offers plenty of options to purchase custom, creative LUTs.
Let’s take a more DIY approach. Several third-party color correction plug-ins, including FilmConvert Nitrate and Color Finale 2 Pro, allow the user to export the grade done within that plug-in as a self-contained LUT. Maybe you want to preserve the grade for future use that doesn’t depend on having the plug-in. Or maybe you don’t own that plug-in, but can collaborate with an editor/colorist who does. In that example, send your selected shots or sequence to them for a grade. Then, instead of returning the files to you with the baked-in look, simply export the grades for these clips as LUTs.
Back on your system, apply the Custom LUT effect to each clip and import the corresponding LUT. Make sure the settings match your color space (Rec 709 across the board for Rec 709 libraries and projects). In the case of Color Finale 2 Pro and FilmConvert Nitrate, most aspects of the grade done within their panels will be reproduced. Certain non-grading features, such as film grain emulation, will not be included in the LUT. Overall, if I’m looking for a perfect match (same shot to same shot), then I’ve had more accurate results using Color Finale 2 Pro. This method is a great way of creating and transferring custom looks in a non-destructive manner.
I hope these simple tips will give you some ideas on how you can get more out of Final Cut Pro X to create and apply your own special touches.