Is it time to reconsider Final Cut Pro X?

While Final Cut Pro X may have ultimately landed in the market sector that Apple envisioned, the industry widely acknowledged that the original launch could have been better managed. Many staunch Final Cut Pro (“legacy”) users were irrevocably alienated. That’s a shame, because FCPX wasn’t a bad design when released – merely incomplete. In the eight years that have followed, the user base has grown to more than 2.5 million (April 2018) and the application sports the widest third-party support of any editing software.

I have certainly gone back and forth in my own use of FCPX, depending on whether it was the right tool for a given job. I cut a feature film with it back in the pre-10.1 days when it was a bifurcated application with separate Event and Project files. Since then, I have also used it on plenty of spots and corporate videos. Although my daily workflow is largely Premiere Pro-based now, I regularly use Final Cut Pro X when appropriate, as well as Blackmagic Design DaVinci Resolve and Avid Media Composer. Modern editors need to be NLE-multilingual.

I realize that winning Oscars and cutting large-scale productions isn’t what the majority of editors do. Nevertheless, these types of productions give any product street cred. You are probably aware of Focus and Whiskey Tango Foxtrot, but there are certainly others that have used FCPX. Hollywood studio films are dominated by Avid Media Composer; however, short films cut using FCPX have won the short film Oscar category two years in a row. While largely invisible to many US viewers, major international productions, on par with Game of Thrones, have been edited using Final Cut Pro X.

If you were one of those FCP7 users who jumped ship to another tool, then maybe it’s time to revisit Final Cut Pro X. There are many reasons I say that. In the past eight years, Apple has added wide codec support, LUTs, HDR capabilities, vastly improved color correction tools, and an easy method of working with captioning. Final Cut is clearly the better tool in many situations and here’s a quick overview of why I feel that way.

What productions are best with FCPX?

Final Cut Pro X is capable of handling all types of editing, but it’s better suited to some than others. The biggest differentiator is turnaround time. If you have to get a job done quickly – from ingest to delivery – then FCPX is hard to beat. It handles media better than any other NLE without the need for the beefiest hardware. Want to cut 4K ProRes HQ on a two-year-old MacBook Pro? Then FCPX shines. That makes it a natural in broadcast news, promos, and sports. It’s also perfect for non-broadcast event coverage. Frankly, I’m surprised that US broadcasters haven’t gravitated to it like various other broadcasters around the world – especially for cutting news stories. The workflow, interface, and low hardware requirements make it well-suited to the task.

Station promo production might be questionable for some, but stop and think about the use of Motion Templates and how that technology can be applied to broadcast design. Final Cut features the unique ability to use templates that any user can create and publish as an effect out of Apple Motion. Therefore, custom effects, animation, and graphics can easily be created specifically for a station’s bespoke look.

For example, a broadcast group or network that owns multiple stations in different cities could have one creative team develop a custom station graphics package for each outlet, simply by using Motion. Those templates could be deployed to each promo department and installed into the individual FCPX edit systems. This would allow each editor to modify or customize time and event information based on the published parameters without mistakenly deviating from the prescribed graphic look. That’s a broadcast creative director’s dream.

A simple hardware footprint

Obviously Final Cut requires Apple computers, but there’s easy connectivity to media from external Thunderbolt, USB, and ethernet-based storage. Some facilities certainly need elaborate shared storage systems for collaborative workflows, but others don’t. If you are a creative editorial boutique, all of a given project’s proxy editing files can be stored on a single SSD, allowing the editor to easily move from room to room, or home to work, simply by carrying the SSD with them. An editor can even cut on a laptop, bring it into work, connect an external display for better monitoring, and keep rocking. With the advent of external GPU systems (eGPU), you can easily augment the horsepower of mid-level Macs when the need arises.

No external I/O hardware is required for monitoring. While I recommend a simple audio I/O interface and external speakers as a minimum, there are plenty of fixed-location systems where the editors only use headphones. AJA or Blackmagic interfaces to play video out to an external display are optional. Simply connect a high-quality display to the Mac via HDMI or Thunderbolt and FCPX will feed real video to it full screen. Premiere Pro can also do this, but Media Composer and Resolve do not.

Third-party ecosystem

Some of Final Cut’s early deficits have developed into a huge asset. It enjoys one of the best ecosystems of third-party tools that enhance the application. These range from translation tools from vendors like Intelligent Assistance and Marquis Broadcast, to a myriad of plug-ins, such as those from FxFactory and CoreMelt. Final Cut already comes with a very solid set of built-in effects filters – probably the most useful assortment among the various NLE options. Even better, if you also purchase Motion, you can easily create more effects by building your own as Motion Templates. This has spawned a host of small developers who create and sell their own variations using this core technology.

You certainly don’t have to purchase any additional effects to be productive with FCPX, but if you do, then one of the better options is FxFactory by Noise Industries. FxFactory is both a set of effects and a delivery platform for other developers. You can use the FxFactory interface to purchase, install, and manage plug-ins and even applications from a diverse catalogue of tools. Pick and choose what you need and grow the repertoire as you see fit. One of the first options to start with is idustrial revolution’s newly revamped XEffects Toolkit. This includes numerous effects and title templates to augment your daily work. Some of these employ built-in tracking technology that allows you to pin items to objects within a shot.

Apple’s latest feature addition is workflow extensions. Adobe introduced this technology first in its products, but Apple has built upon it through macOS integration with apps like Photos and now in Final Cut Pro X. In short, an extension allows direct FCPX integration with another application. Various extensions can be downloaded from the Mac App Store and installed into FCPX. An extension then adds a panel into Final Cut, which allows you to interact with that application from inside the FCPX interface. Companies initially offering extensions include frame.io, Shutterstock, and Simon Says, among others.

Subscription

A sore point for many Adobe customers was the shift to the subscription business model. While the monthly rates are reasonable if you are an ongoing business, they have caused some to stick with software as old as CS6 (yikes!). As more companies adopt subscriptions, you have to start wondering when enough is enough. I don’t think we are there yet and Creative Cloud is still a solid value. But if you are an individual who doesn’t make a living with these tools, then it’s a concern. Adobe recently raised eyebrows by doubling the monthly cost of its Photography plan. As it turns out, this is an additional pricing plan with more storage, not a replacement, but that only became evident after the web page was quickly fixed. Predictably, this gives competitors like ON1 an avenue for counter-marketing.

Concerned with subscriptions? Then the Apple professional applications are an alternative. Final Cut Pro X, Compressor, Motion, and Logic Pro X – coupled with photo and graphics tools from Affinity and/or Pixelmator – provide a viable competing package to Adobe Creative Cloud. Heck, augment that with Fusion and/or DaVinci Resolve – even the free versions – and the collection becomes a formidable toolkit.

The interface

Naturally, the elephant in the room is the FCPX interface. It’s what simultaneously excited and turned off so many FCP7 users. In the end, how you edit with Final Cut Pro X does not have to be all that different than your editing style with other NLEs. Certainly there are differences, but once you get used to the basics, there’s more that’s similar than is different.

Isn’t imitation the highest form of flattery? You only have to look at Adobe Premiere Rush or the new Cut Page in Resolve 16 to realize that just maybe, others are starting to see the value in Apple’s approach. On top of that, there are features touted in Resolve 16, like facial (actually shape) recognition or adjustment layers, that were there even in FCPX 10.0. Whether this all is blatant copying or simply a tip-of-the-hat doesn’t matter. Each company has come to the conclusion that some workflows and some newer editors need a faster and more direct user interface that is easily scalable to small and large screens and to single and dual-display systems.

I realize that many out there will read this post and scream Apple apologist. Whatever. If you’ve shifted to PC, then very little of what I’ve said applies to you. I make my daily living with Apple hardware. While I recognize you can often get superior performance with a PC, I don’t find the need to make a change yet. This means that Final Cut Pro X remains a great option for my workflows. It’s a tool I can use for nearly any job and one that is oftentimes better than most. If you rejected it eight years ago, maybe it’s time to take a second look.

©2019 Oliver Peters

CoreMelt PaintX

When Apple launched Final Cut Pro X, it was with a decidedly simplified set of video effects. This was offset by the ease with which users could create their own custom effects, using Apple Motion as a development platform. The result has been an entirely new ecosystem of low-cost, high-quality video effects. As attractive as that is, truly advanced visual effects still require knowledgeable plug-in developers who are able to work within the FCPX and macOS architecture in order to produce more powerful tools. For example, built-in tools found in other applications, such as Avid Media Composer’s Intraframe Paint or the Fusion page in DaVinci Resolve, simply aren’t within the scope of FCPX, nor of what users can create on their own through Motion templates.

To fill that need, developers like CoreMelt have been designing a range of advanced visual effects tools for the Final Cut Pro market, including effects for tracking, color correction, stabilization, and more. Their newest release is PaintX, which adds a set of Photoshop-style tools to Final Cut Pro X. As with many of CoreMelt’s other offerings, PaintX includes planar tracking, thanks to the licensing of Mocha tracking technology.

To start, drop the PaintX effect onto a clip and then launch the custom interface. PaintX requires a more elaborate control layout than the standard FCPX user interface was designed to provide. Once inside the PaintX interface window, you have a choice of ten brush functions, including paint color, change color, blur, smear, sharpen, warp, clone, add noise, heal, and erase. These functions cover a range of needs, from simple wire removal to beauty enhancements and even pseudo horror makeup effects. You have control over brush size, softness, aspect ratio, angle, and opacity. The various brushes also have specific controls for their related functions, such as the blur range for the blur brush. Effects are applied in layers and actions. Each stroke is an action and both remain editable. If you aren’t the most precise artist, then the erase brush comes in handy. Did you color a bit too far outside of the lines? Simply use the erase brush on that layer and trim back your excess.
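For those who like to think in data structures, this layer-and-action model behaves something like the sketch below. It is purely illustrative Python, my own guess at the shape of such a system, not CoreMelt’s actual implementation.

    # Hypothetical sketch of an editable layer/stroke stack, similar in
    # spirit to how PaintX behaves. Not CoreMelt's actual implementation.
    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class Stroke:                      # one "action"
        brush: str                     # e.g. "blur", "clone", "paint color"
        size: float
        softness: float
        opacity: float
        path: List[Tuple[int, int]]    # sampled cursor positions

    @dataclass
    class Layer:
        strokes: List[Stroke] = field(default_factory=list)

    effect: List[Layer] = [Layer()]    # an effect holds multiple layers
    effect[0].strokes.append(
        Stroke("paint color", 40.0, 0.5, 1.0, [(10, 20), (30, 40)]))

    # Because strokes remain data rather than baked pixels, any parameter
    # can be edited after the fact, and an erase brush can trim a layer.
    effect[0].strokes[0].size = 25.0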

Multiple brush effects can be applied to the same or different areas within the image, simply by adding a new layer for each effect. Once you’ve applied the first paint stroke, an additional brush control panel opens, allowing you to edit the brush parameters after the fact. So, if your brush size was too large or not soft enough, simply alter those settings without the need to redo the effect. Each effect can be individually tracked in either direction. The Mocha tracker offers additional features, such as transform (scale/position) versus perspective tracking, along with the ability to copy and paste tracking data between brush layers.

As a Final Cut Pro X effect, PaintX works within the standard video pipeline. If you applied color correction upstream of your PaintX filter, then that grade is visible within the PaintX interface. But if the color correction is applied downstream of the PaintX effect, you won’t see it when you open the PaintX interface. However, that correction will still be uniformly applied to the clip, including the areas altered within the PaintX effect. If you’ve “punched into” a 4K clip on an HD timeline, when you open PaintX, you’ll still see the full 4K frame. Finally, you have additional FCPX control over the opacity and mix of the applied PaintX filter.
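If the upstream/downstream distinction sounds abstract, here is a toy sketch of the idea in Python (a generic illustration, not Apple’s actual render pipeline):

    # Toy model of effect ordering: a filter only "sees" what is
    # upstream of it, but the delivered frame passes through everything.
    def grade(frame):                        # stand-in color correction
        return [min(1.0, v * 1.2) for v in frame]

    def paintx(frame):                       # stand-in paint effect
        return frame                         # no-op, for illustration

    frame = [0.2, 0.5, 0.8]

    # Grade upstream of PaintX: the paint interface previews graded pixels.
    final_a = paintx(grade(frame))

    # Grade downstream: PaintX previews ungraded pixels, yet the final
    # render is still graded uniformly, painted areas included.
    final_b = grade(paintx(frame))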

I found PaintX to be well-behaved even on a modest Mac, like my 3-year-old laptop. However, if you don’t have a beefy Mac, keep the effect simple. The more brush effects that you apply and track in a single clip, the slower the real-time response will become, especially on under-powered machines. These effects are GPU-intensive and paint strokes are really a particle system; therefore, simple, single-layer effects are the easiest on the machine. But, if you intend to do more complex effects like blurs and sharpens in multiple layers, then you will really want one of the more powerful Macs. Playback response is generally better once you’ve saved the effect and exited back to Final Cut. I did run into one minor issue with the clone brush on a single isolated clip, while using a 2013 Mac Pro. CoreMelt told me there have been a few early bugs with certain GPUs and is looking into the anomaly I discovered. That model in particular has been notorious for GPU issues with video effects. (Update: CoreMelt sent me a new build, which has corrected this problem.)

Originally written for RedShark News

©2018 Oliver Peters

Hawaiki AutoGrade

The color correction tools in Final Cut Pro X are nice. Adobe’s Lumetri controls make grading intuitive. But sometimes you just want to click a few buttons and be happy with the results. That’s where AutoGrade from Hawaiki comes in. AutoGrade is a full-featured color correction plug-in that runs within Final Cut Pro X, Motion, Premiere Pro and After Effects. It is available from FxFactory and installs through the FxFactory plug-in manager.

As the name implies, AutoGrade is an automatic color correction tool designed to simplify and speed up color correction. When you install AutoGrade, you get two plug-ins: AutoGrade and AutoGrade One. The latter is a simple, one-button version, based on global white balance. Simply use the color-picker (eye dropper) and sample an area that should be white. Select enable and the overall color balance is corrected. You can then tweak further, by boosting the correction, adjusting the RGB balance sliders, and/or fine-tuning luma level and saturation. Nearly all parameters are keyframeable, and looks can be saved as presets.
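The math behind this kind of one-click balance is conceptually simple. Here is a sketch of the general technique (my own simplification, not Hawaiki’s actual algorithm):

    # Generic white balance from a sampled patch: compute per-channel
    # gains that push the sampled "should-be-white" area back to neutral.
    def white_balance_gains(sampled_rgb, target=0.9):
        r, g, b = sampled_rgb
        return (target / r, target / g, target / b)

    def apply_gains(pixel, gains):
        return tuple(min(1.0, c * k) for c, k in zip(pixel, gains))

    # A warm color cast sampled with the eye dropper:
    gains = white_balance_gains((0.85, 0.80, 0.65))
    print(apply_gains((0.85, 0.80, 0.65), gains))   # (0.9, 0.9, 0.9): neutralized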

AutoGrade One is just a starter, though, for simple fixes. The real fun is with the full version of AutoGrade, which is a more comprehensive color correction tool. Its interface is divided into three main sections: Auto Balance, Quick Fix, and Fine-Tune. Instead of a single global balance tool, the Auto Balance section permits global correction, as well as any combination of white, black, and/or skin correction. Simply turn on one or more desired parameters, sample the appropriate color(s) and enable Auto Balance. This tool will also raise or lower luma levels for the selected tonal range.

Sometimes you might have to repeat the process if you don’t like the first results. For example, when you sample the skin on someone’s face, sampling rosy cheeks will yield different results than if you sample the yellowish highlights on a forehead. To try again, just uncheck Auto Balance, sample a different area, and then enable Auto Balance again. In addition to an amount slider for each correction range, you can also adjust the RGB balance for each. Skin tones may be balanced towards warm or neutral, and the entire image can be legalized, which clamps video levels to 0-100.

Quick Fix is a set of supplied presets that work independently of the color balance controls. These include some standards, like cooling down or warming up the image, the orange and teal look, adding an s-curve, and so on. They are applied at 100%, which to my eye felt a bit harsh as a default. To tone down the effect, simply lower the amount slider.

Fine-Tune rounds it out when you need to take a deeper dive. This section is built as a full-blown, 3-way color corrector. Each range includes a luma and three color offset controls. Instead of wheels, these controls are sliders, but the results are the same as with wheels. In addition, you can adjust exposure, saturation, vibrance, temperature/tint, and even two different contrast controls. One innovation is a log expander, designed to make it easy to correct log-encoded camera footage, in the absence of a specific log-to-Rec709 camera LUT.

Naturally, any plug-in could always offer more, so I have a minor wish list. I would love to see five additional features: film grain, vignette, sharpening, blurring/soft focus, and a highlights-only expander. There are certainly other individual filters that cover these needs, but having it all within a single plug-in would make sense. This would round out AutoGrade as a complete, creative grading module, servicing user needs beyond just color correction looks.

AutoGrade is a deceptively powerful color corrector, hidden under a simple interface. User-created looks can be saved as presets, so you can quickly apply complex settings to similar shots and set-ups. There are already many color correction tools on the market, including Hawaiki’s own Hawaiki Color. The price is very attractive, so AutoGrade is a superb tool to have in your kit. It’s a fast way to color grade, ideal for newcomers and experienced colorists alike.

©2018 Oliver Peters

More about ProRes RAW

A few weeks ago I wrote a two-part post – HDR and RAW Demystified. In the second part, I covered Apple’s new ProRes RAW codec. I still see a lot of misinformation on the web about what exactly this is, so I felt it was worth an additional post. Think of this post as an addendum to Part 2. My apologies up front if there is some overlap between this and the previous post.

_____________________________

Camera raw codecs have been around since before RED Digital Camera brought out their REDCODE RAW codec. At NAB, Apple decided to step into the game. RED brought the innovation of recording the raw signal as a compressed movie file, making on-board recording and simplified post-production possible. Apple has now upped the game with a codec that is optimized for multi-stream playback within Final Cut Pro X, thus taking advantage of how FCPX leverages Apple hardware. At present, ProRes RAW is incompatible with all other applications. The exception is Motion, which will read and play the files, but with incorrect default – albeit correctable – video levels.

ProRes RAW is only an acquisition codec and, for now, can only be recorded externally using an Atomos Shogun Inferno or Sumo 19 monitor/recorder, or in-camera with DJI’s Inspire 2 or Zenmuse X7. Like all things Apple, the complexity is hidden under the surface. You don’t get the type of specific raw controls made available for image tweaking, as you do with RED. But, ProRes RAW will cover the needs of most camera raw users, making this the raw codec “for the rest of us”. At least that’s what Apple is banking on.

Capturing in ProRes RAW

The current implementation requires a camera that exports a camera raw signal over SDI, which in turn is connected to the Atomos, where the conversion to ProRes RAW occurs. Although no one is very specific about the exact process, I would presume that Atomos’ firmware is taking in the camera’s form of raw signal and rewrapping or transforming the data into ProRes RAW. This means that the Atomos firmware would require a conversion table for each camera, which would explain why only a few Sony, Panasonic, and Canon models qualify right now. Others, like ARRI Alexa or RED cameras, cannot yet be recorded as ProRes RAW. The ProRes RAW codec supports up to 12-bit color depth, but what you actually get depends on the camera. If the SDI output to the Atomos recorder is only 10-bit, then that’s the bit-depth recorded.

Until more users buy or update these specific Atomos products – or more manufacturers become licensed to record ProRes RAW onboard the camera – any real-world comparisons and conclusions come from a handful of ProRes RAW source files floating around the internet. That, along with the Apple and Atomos documentation, provides a pretty solid picture of the quality and performance of this codec group.

Understanding camera raw

All current raw methods depend on single-sensor cameras that capture a Bayer-pattern image. The sensor uses a monochrome mosaic of photosites, which are filtered to register the data for light in the red, green, or blue wavelengths. Nearly all of these sensors have twice as many green receptors as red or blue. At this point, the sensor is capturing linear light at the maximum dynamic range that the camera and that sensor are capable of. It’s just an electrical signal being turned into data, without compression (within the sensor). The signal can be recorded as a camera raw file, with or without compression. Alternatively, it can also be converted directly into a full-color video signal and then recorded – again, with or without compression.
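To make the Bayer concept concrete, here is a minimal sketch of a bilinear de-Bayer in Python. It is deliberately simplified; real demosaic algorithms in cameras and NLEs are far more sophisticated.

    # Each photosite records one color; de-Bayering rebuilds full RGB by
    # averaging the nearest same-color neighbors of every pixel.
    import numpy as np
    from scipy.signal import convolve2d

    def bayer_pattern(h, w):
        """Which color each photosite records in an RGGB mosaic."""
        colors = np.empty((h, w), dtype="U1")
        colors[0::2, 0::2] = "R"
        colors[0::2, 1::2] = "G"   # twice as many green sites
        colors[1::2, 0::2] = "G"   # as red or blue
        colors[1::2, 1::2] = "B"
        return colors

    def demosaic_bilinear(raw):
        """raw: (h, w) mosaic of linear sensor values -> (h, w, 3) RGB."""
        h, w = raw.shape
        colors = bayer_pattern(h, w)
        rgb = np.zeros((h, w, 3))
        kernel = np.ones((3, 3))
        for ch, name in enumerate("RGB"):
            mask = (colors == name).astype(float)
            total = convolve2d(raw * mask, kernel, mode="same")
            count = convolve2d(mask, kernel, mode="same")
            rgb[..., ch] = total / np.maximum(count, 1)
        return rgb

    frame = demosaic_bilinear(np.random.rand(8, 8))   # tiny test mosaic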

If the RGGB photosite data (camera raw) is converted into RGB pixels, then sensor color information is said to be “baked” into the file. However, if the raw conversion is stored in that form and then later converted to RGB in post, sensor data is preserved intact until much later into the post process. Basically, the choice boils down to whether that conversion is best performed within the camera’s electronics or later via post-production software.

The effect of compression may also be less destructive (fewer visible artifacts) with a raw image, because data, rather than video, is being compressed. However, converting the file to RGB does not mean that a wider dynamic range is being lost. That’s because most camera manufacturers have adopted logarithmic encoding schemes, which allow a wide color space and a high dynamic range (big exposure latitude) to be carried through into post. HDR standards are still in development and have been in testing for several years, completely independent of whether or not the source files are raw.
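A worked example helps here. Log encoding squeezes a wide range of linear light into a 0-1 signal, so highlight detail survives the trip through post. The curve below is a generic illustration, not any manufacturer’s actual transfer function:

    # Generic log curve: 0 to 15x of a reference level maps into 0-1,
    # so several stops of highlight headroom survive a 10-bit recording.
    import math

    def lin_to_log(x):
        return 0.25 * math.log2(x + 1.0)      # 15x linear -> code value 1.0

    def log_to_lin(y):
        return 2.0 ** (y / 0.25) - 1.0        # exact inverse, applied in post

    for linear in (1, 2, 4, 8, 15):           # each step is roughly one stop
        print(f"{linear:2}x linear -> log code {lin_to_log(linear):.3f}")
    # The round trip is lossless in principle: log_to_lin(lin_to_log(8)) == 8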

ProRes RAW compression

ProRes RAW and ProRes RAW HQ are both compressed codecs with roughly the same data footprint as ProRes and ProRes HQ. Both raw and standard versions use a variable bitrate form of compression, but in different ways. Apple explains it this way in their white paper: 

“As is the case with existing ProRes codecs, the data rates of ProRes RAW are proportional to frame rate and resolution. ProRes RAW data rates also vary according to image content, but to a greater degree than ProRes data rates. 

With most video codecs, including the existing ProRes family, a technique known as rate control is used to dynamically adjust compression to meet a target data rate. This means that, in practice, the amount of compression – hence quality – varies from frame to frame depending on the image content. In contrast, ProRes RAW is designed to maintain constant quality and pristine image fidelity for all frames. As a result, images with greater detail or sensor noise are encoded at higher data rates and produce larger file sizes.”
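In pseudocode terms, the difference Apple describes looks roughly like this (a toy illustration of the two strategies, not the codec’s actual internals):

    # Rate control targets a frame size and lets quality float;
    # constant quality fixes the quantizer and lets frame size float.
    def encode(detail, quantizer):
        """Stand-in encoder: more detail, finer quantization -> more bits."""
        return int(detail / quantizer * 1000)

    def rate_controlled(frames, target_bits):
        q = 1.0
        for detail in frames:
            bits = encode(detail, q)
            q *= bits / target_bits        # nudge quantizer toward the target
            yield bits                     # sizes chase the target, quality varies

    def constant_quality(frames, q=1.0):
        for detail in frames:
            yield encode(detail, q)        # quality holds, sizes vary with detail

    busy_scene = [5.0, 9.0, 2.0, 7.0]
    print(list(rate_controlled(busy_scene, 4000)))
    print(list(constant_quality(busy_scene)))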

ProRes RAW and HDR do not depend on each other

One of my gripes, when watching some of the ProRes RAW demos on the web and related comments on forums, is that ProRes RAW is being conflated with HDR. This is simply inaccurate. Raw applies to both SDR and HDR workflows. HDR workflows do not depend on raw source material. One of the online demos I saw recently immediately started with an HDR FCPX Library. The demo ProRes RAW clips were imported and looked blown out. This made for a dramatic example of recovering highlight information. But, it was wrong!

If you start with an SDR FCPX Library and import these same files, the default image looks great. The hitch here is that these ProRes RAW files were shot with a Sony camera and a default LUT is applied in post. That’s part of the file’s metadata. To my knowledge, all current, common camera LUTs are based on conversion to the Rec709 color space, not HDR or wide gamut. If you set the inspector’s LUT tab to “none” in either SDR or HDR, you get a relatively flat, log image that’s easily graded in whatever direction you want.

What about raw-specific settings?

Are there any advantages to camera raw in the first place? Most people will point to the ability to change ISO values and color temperature. But these aren’t actually something inherently “baked” into the raw file. Instead, this is metadata, dialed in by the DP on the camera, which optimizes the images for the sensor. ISO is a sensitivity concept based on the older ASA film standard for exposing film. In modern digital cameras, it is actually an exposure index (EI), which is how some refer to it. (RedShark’s Phil Rhodes goes into depth in this linked article.)

The bottom line is that EI is a cross-reference to that camera sensor’s “sweet spot”. 800 on one camera might be ideal, while 320 is best on another. Changing ISO/EI has the same effect as changing gain in audio. Raising or lowering ISO/EI values means that you can either see better into the darker areas (with a trade-off of added noise) – or you see better highlight detail, but with denser dark areas. By changing the ISO/EI value in post, you are simply changing that reference point.
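Expressed as code, re-rating is nothing more exotic than gain. This is my own illustration of the concept:

    # Re-rating EI in post is just linear gain, expressed in stops.
    import math

    def ei_change_in_stops(old_ei, new_ei):
        return math.log2(new_ei / old_ei)

    def apply_ei(linear_value, old_ei, new_ei):
        return linear_value * 2.0 ** ei_change_in_stops(old_ei, new_ei)

    print(ei_change_in_stops(800, 1600))   # +1 stop: brighter shadows, more noise
    print(ei_change_in_stops(800, 400))    # -1 stop: denser darks, safer highlights
    print(apply_ei(0.18, 800, 1600))       # mid-gray 0.18 doubles to 0.36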

In the case of ProRes RAW and FCPX, there are no specific raw controls for any of this. So it’s anyone’s guess whether changing the master level wheel or the color temp/tint sliders within the color wheels panel is doing anything different for a ProRes RAW file than doing the same adjustment for any other RGB-encoded video file. My guess is that it’s not.

In the case of RED camera files, you have to install a camera raw plug-in module in order to work with the REDCODE raw codec inside of Final Cut Pro X. There is a lot of control of the image, prior to tweaking with FCPX’s controls. However, the amount of image control for the raw file is significantly more for a REDCODE file in Premiere Pro, than inside of FCPX. Again, my suspicion is that most of these controls take effect after the conversion to RGB, regardless of whether or not the slider lives in a specific camera raw module or in the app’s own color correction controls. For instance, changing color temperature within the camera raw module has no correlation to the color temperature control within the app’s color correction tools. It is my belief that few of these actually adjust file data at the raw level, regardless of whether this is REDCODE or ProRes RAW. The conversion from raw to RGB is proprietary with every manufacturer.

What is missing in the ProRes RAW implementation is any control over the color science used to process the image, along with de-Bayering options. Over the years, RED has reworked/improved its color science, which theoretically means that a file recorded a few years ago can look better today (using newer color science math) than it originally did. You can select among several color science models, when you work with the REDCODE format. 

You can also opt to lower the de-Bayering resolution to 1/2, 1/4, 1/8, etc. for a RED file.  When working in a 1080p timeline, this speeds up playback performance with minimal impact on the visible resolution displayed in the viewer. For full-quality conversion, software de-Bayering also yields different results than hardware acceleration, as with the RED Rocket-X card. While this level of control is nice to have, I suspect that’s the sort of professional complication that Apple seeks to avoid.

The main benefit of ProRes RAW may be a somewhat better-quality image carried into post at a lower file size. To get the comparable RGB image quality you’d need to go up to uncompressed, ProRes 4444, or ProRes 4444 XQ – all of which become very taxing in post. Yet, for many standard productions, I doubt you’ll see that great of a difference. Nevertheless, more quality with a lower footprint will definitely be welcomed.

People will want to know whether this is a game-changer or not. On that count, probably not. At least not until there are a number of in-camera options. If you don’t edit – and finish – with FCPX, then it’s a non-starter. If you shoot with a camera that records in a high-quality log format, like an ARRI Alexa, then you won’t see much difference in quality or workflow. If you shoot with a RED camera, ProRes RAW gives you less control over your image than REDCODE does. On the other hand, it’s a definite improvement over all raw workflows that capture in image sequences. And it breathes some life into an older camera, like the Sony FS700. So, on balance, ProRes RAW is an advancement, but just not one that will affect as large a part of the industry as the rest of the ProRes family has.

©2018 Oliver Peters

Luca Visual FX builds Mystery & Suspense

For most editors, creating custom music scores tends to fall into the “above my pay grade” category. If you are a whizz with GarageBand or Logic Pro X, then you might dip into Apple’s loop resources. But most commercials and corporate videos are easily serviced by the myriad of stock music sites, like Premium Beat and Music Bed. Some music customization is also possible with tracks from companies like SmartSound.

Yet, none of the go-to music library sites offer curated, genre-based, packages of tracks and elements that make it easy to build up a functional score for longer dramatic productions. Such projects are usually the work of composers or a specific music supervisor, sound designer, or music editor doing a lot of searching and piecing together from a wide range of resources.

Enter Luca Visual FX – a developer best known for visual effects plug-ins, such as Light Kit 2.0. It turns out that Luca Bonomo is also a composer. His first offering on that front is the Mystery & Suspense Music and Sound Library, a collection of 500 clips, comprising music themes, atmospheres, drones, loops, and sound effects. This is a complete toolkit designed to make it easy to combine elements, in order to create a custom score for dramatic productions in the mystery or suspense genre.

These tracks are available for purchase as a single library through the LucaVFX website. They are downloaded as uncompressed, stereo AIF files at 24-bit/96kHz resolution. This means they are of top quality and compatible with any Mac or PC NLE or DAW application. Best yet is the awesome price of $79. The package is licensed for a single user and may be used for any audio or video production, including for commercial purposes.

Thanks to LucaVFX, I was able to download and test out the Library on a recent short film. The story is a suspense drama in the style of a Twilight Zone episode, so creating a non-specific, ethereal score fits perfectly. Drones, dissonance, and other suspenseful sounds are completely in line, which is where this collection shines.

Although I could have used any application to build this, I opted for Apple’s Final Cut Pro X. Because of its unique keyword structure, it made sense to first set up a separate FCPX library for only the Mystery & Suspense package. During import, I let FCPX create keyword collections based on the Finder folders. This keeps the Mystery & Suspense FCPX library organized in the same way as the clips were originally grouped. Doing so facilitates fast and easy sorting and previewing of any of the 500 clips within the music library. Then I created a separate FCPX library for the production itself. With both FCPX libraries open, I could quickly preview and place clips from my music library into the edit sequence for the film, located within the other FCPX library.

Final Cut uses Connected Clips instead of tracks. This means that you can quickly build up and align overlapping atmospheres, transitions, loops, and themes for a densely layered music score in a very freeform manner. I was able to build up a convincing score for a half-hour-long piece in less than an afternoon. Granted, this isn’t mixed yet, but at least I now have the musical elements that I want and where I want them. I feel this style of working is definitely faster in Final Cut Pro X – and more conducive to creative experimentation – but it would certainly work just as well in other applications.

The Mystery & Suspense Library is definitely a winner, although I do have a few minor quibbles. First, the music and effects are in keeping with the genre, but don’t go beyond it. When creating a score for this kind of production, you also need some “normal” or “lighter” moods for certain scenes or transitions. I felt that was missing and I would still have to step outside of this package to complete the score. Secondly, many of the clips have a synthesized or electronic tone to them, thanks to the instruments used to create the music. That’s not out of character with the genre, but I still would have liked some of these to include more natural instruments than they do. In fairness to LucaVFX, if the Mystery & Suspense Library is successful, then the company will create more libraries in other genres, including lighter fare.

In conclusion, this is a high quality library perfectly in keeping with its intended genre. Using it is fast and flexible, making it possible for even the most musically-challenged editor to develop a convincing, custom score without breaking the bank.

©2018 Oliver Peters

FCPX Color Wheels Take 2

Prior to version 10.4, the color correction tools within Final Cut Pro X were very basic. You could get a lot of work done with the color board, but it just didn’t offer tools competitive with other NLEs – not to mention color plug-ins or a dedicated grading app like DaVinci Resolve. With the release of 10.4, Apple upped the game by adding color wheels and a very nice curves implementation. However, for those of us who have been doing color correction for some time, it quickly became apparent that something wasn’t quite right in the math or color science behind these new FCPX color wheels. I described those anomalies in this January post.

To summarize that post, the color wheels tool seems to have been designed according to the lift/gamma/gain (LGG) correction model. The standard behavior for LGG is evident with a black-to-white gradient image. On a waveform display, this appears as a diagonal line from 0 to 100. If you adjust the highlight control (gain), the line appears to be pinned at the bottom with the higher end pivoting up or down as you shift the slider. Likewise, the shadow control (lift) leaves the line pinned at the top with the bottom half pivoting. The midrange control (gamma) bends the middle section of the line inward or outward, with no effect on the two ends, which stay pinned at 0 and 100, respectively. In addition to luminance value, when you shift the hue offset to an extreme edge – like moving the midrange puck completely to yellow – you should still see some remaining black and white at the two ends of the gradient.

That’s how LGG is supposed to work. In FCPX version 10.4, each color wheel control also altered the levels of everything else. When you adjusted midrange, it also elevated the shadow and highlight ranges. In the hue offset example, shifting the midrange control to full-on yellow tinted the entire image to yellow, leaving no hint of black or white. As a result, the color wheels correction tool was unpredictable and difficult to use, unless you were doing only very minor adjustments. You ended up chasing your tail, because when one correction was made, you’d have to go back and re-adjust one of the other wheels to compensate for the unwanted changes made by the first adjustment.

With the release of FCPX 10.4.1 this April, Apple engineers have changed the way the color wheels tool behaves. Corrections now correspond to the behavior that everyone accepts as standard LGG functionality. In other words, the controls mostly only affect their part of the image without also adjusting all other levels. This means that the shadows (lift) control adjusts the bottom, highlights (gain) will adjust the top end, and midrange (gamma) will lighten or darken the middle portion of the image. Likewise, hue offsets don’t completely contaminate the entire image.

One important thing to note is that existing FCPX Libraries created or promoted under 10.4 will be promoted again when opened in 10.4.1. So that your color wheel corrections don’t change to something unexpected when promoted, Projects in these Libraries will continue to behave according to the previous FCPX 10.4 color model. This means that the look of clips where color wheels were used – and their color wheel values – hasn’t changed. More importantly, the behavior of the wheels inside those Libraries will also follow the “old” way, should you make any further corrections. The new color wheels behavior only begins within new Libraries created under 10.4.1.

©2018 Oliver Peters

What’s up with Final Cut’s Color Wheels?

NOTE: The information presented here has been superseded by the release of FCPX 10.4.1 in April 2018. With that release the color wheels model has been changed. Please read the linked blog post for updated information.

Apple Final Cut Pro X 10.4 introduced new, advanced color correction tools to this editing application, including color wheels, curves, and hue vs. saturation curves. These are tools that users of other NLEs have enjoyed for some time – and which were part of Final Cut Studio (FCP 7, Color). Like others, my first reaction was, “Super! They’ve added some nice advanced tools, which will improve the use of FCPX for higher-end users.” But, as I started to use the Color Wheels for real correction work, I quickly realized that something wasn’t quite right in how they operated. Or at least, they didn’t work in a way that we’ve come to understand.

In trying to figure it out, I reached out to other industry pros and developers for their thoughts. Naturally this led to some spirited discussions at forums like those at Creative COW. However, other editors have noticed the same problems, so you can also find threads in the Facebook FCPX group and at FCP.co. It is certainly easy to characterize this as just another internet kerfuffle surrounding Apple’s “think different” approaches to FCPX. But those arguments fall flat when you actually try to use the tools as intended.

The FCPX Color Wheels panel includes four wheels – Master, Shadows, Midtones, and Highlights. The puck in the center of each wheel is a hue offset control to push hues in the direction that you move the puck. The slider to the right of the wheel controls the brightness of that range. The left slider controls the saturation. One of the main issues is that when you adjust luminance using one of these controls, the affected range is too broad. Specifically, in the case of the Midtones control, as you adjust the luminance slider up or down, you are affecting most of the image and not just the midrange levels. This is not the way this type of control normally works in other tools, and in fact, it’s not how FCPX’s Color Board controls work either.

“What’s the big deal?” you might ask. Fair enough. I see two operational issues. The first is that to properly grade the image using the Color Wheels, you end up having to go back-and-forth a lot between wheels, to counteract the changes made by one control with another. The second is that using the Midtones slider tends to drive highlights above 100 IRE, where they will be clipped if any broadcast limiting is used. This doesn’t happen with other color tools, notably Apple’s own Color Board.

A lot of the discussion focuses on luma levels and specifically the Midtones slider, since it’s easy to see the issue there. However, other controls are also affected – that’s simply too much to dissect in a single post. I have compared the effect using five different tools: the Color Board, the Color Wheels, a color corrector plug-in that I built as a Motion template using Motion effects, Rubber Monkey Software FilmConvert (the wheels portion only), and finally, the Adobe Lumetri controls in Premiere Pro.

I am using three different test images – a black-to-white ramp, a test pattern, and a demo video image. The ramp without correction will appear as a diagonal line (0-100 IRE) on the scope, which makes it easy to analyze what’s happening. The video image has definite shadow and highlight areas, which lets us see how these controls work in the real world. For example, if you want to brighten the area of the shot where the man is in the shadows, but don’t want to make the highlights any brighter, this would normally be done using a Midtones control. Be aware that these various tools certainly aren’t calibrated the same way and some have a greater range of control than others. The weakest of these is FilmConvert’s wheels, since this plug-in has additional level controls in other parts of its interface.

Color science models

In the various forum threads, the argument is made that Apple is simply using a different color science method or a different weighting of some existing models. That’s certainly possible, since not all color correctors are built the same way. The most common approaches are Lift/Gamma/Gain and Shadows/Mids/Highlights. Be careful with naming. Just because something uses the terminology of Shadows, Midtones, and Highlights does not mean that it also uses the SMH color science model. Many tools use the Lift/Gamma/Gain model, but in fact, call the controls shadows (Lift), mids (Gamma), and highlights (Gain). Another term you may run across is Set-up in some correction tools. This is typically used for control of shadows (equal to Lift), but can also function as an offset control that raises the level of the entire image. Avid Symphony employs this solution. Finally, both Symphony and Adobe SpeedGrade use what has been dubbed a 12-way color corrector. Each range is further subdivided into its own subset of shadows, mids, and highlights controls.

An LGG model provides broad control of shadows and highlights, with the midtones control working like a curve that covers the whole range, but with the largest effect in the middle. An SMH model normally divides the levels into three distinct, precisely overlapping ranges. This is much like a three-band audio equalizing filter. A number of the color correctors add a luma range control, which gives the user the ability to change how much of the image a specific range will affect. In other words, how broad is the control of the shadows, mids, or highlights control? This is like a Q control in an audio equalizer, where you change the shape of the envelope at a certain frequency.

Red Giant’s Magic Bullet Looks offers both color correction models with two different tools – the 4-way color corrector (SMH) and the Colorista color corrector (LGG). When you adjust the midrange control of their 4-way, the result is a graceful S-shaped curve to the levels on the waveform.
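For clarity, here is a sketch of the SMH idea in code, with three overlapping luma ranges crossfading like the bands of an equalizer. This illustrates the general model, not any specific product’s math:

    # Three overlapping luma weights; "rng" acts like an EQ's Q,
    # widening or narrowing how much of the image each control grabs.
    def smh_weights(luma, rng=0.5):
        shadow = max(0.0, 1.0 - luma / rng)
        high = max(0.0, (luma - (1.0 - rng)) / rng)
        mid = max(0.0, 1.0 - shadow - high)
        return shadow, mid, high

    def smh_adjust(luma, shadows=0.0, mids=0.0, highs=0.0, rng=0.5):
        ws, wm, wh = smh_weights(luma, rng)
        return luma + ws * shadows + wm * mids + wh * highs

    # Raising only the mids bulges the center of a ramp while the ends hold:
    print([round(smh_adjust(v, mids=0.2), 2) for v in (0.0, 0.25, 0.5, 0.75, 1.0)])
    # -> [0.0, 0.35, 0.7, 0.85, 1.0]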

To study the effect of an LGG-based corrector, test the ramp. The shadows control (Lift) will raise or lower the dark areas of the image without changing the absolute highlights. The diagonal line of the ramp on the waveform essentially pivots, hinged at the 100 IRE point. Conversely, changing the highlights control (Gain) pivots the line, which stays pinned at 0 IRE (black). When you adjust the midtones control (Gamma), you create a curve in the line, which stays pinned at 0 and 100 IRE at either end. In this way you are effectively “expanding” or “compressing” the levels in the middle portion of your image without changing the position of your black or white points.
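And here is the LGG counterpart on the same 0-1 ramp, using one common formulation (implementations vary in exact math and order of operations):

    # Lift pivots around white, gain pivots around black, and gamma
    # bends the middle while both end points stay pinned.
    def lgg(x, lift=0.0, gamma=1.0, gain=1.0):
        x = x * gain                           # 0 stays 0
        x = x + lift * (1.0 - x)               # 1 stays 1
        return max(0.0, x) ** (1.0 / gamma)    # 0 and 1 both stay pinned

    ramp = [i / 10 for i in range(11)]
    print([round(lgg(v, gamma=1.5), 2) for v in ramp])  # mids rise, ends hold
    print([round(lgg(v, lift=0.2), 2) for v in ramp])   # blacks lift, white holds
    print([round(lgg(v, gain=0.8), 2) for v in ramp])   # whites drop, black holds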

How the various color correction tools react

Looking at the luma control for the Midtones, two things are clear. First, all of these tools except the Color Wheels are using the LGG color science model. It’s not clear what the Color Wheels are using, but it isn’t SMH, as there is no bulge or S-curve visible in the scope. Second, the Color Wheels quickly drive the image levels into clipping, while the other tools generally keep black and white levels in place. In essence, the Midtones control affects the image more like a master or offset control would, than a typical mids or Gamma control. Yet, clearly Apple’s Color Board controls adhere to the standard LGG model. The concern, of course, is clipping. In the test image of the man walking on the village street, the sunlit building walls on the opposite side of the street will become overexposed and risk being clipped when the Color Wheels are used.

What about color? As a simple test, I next shifted the Midtones puck to the yellow. Bear in mind that the range of each of these controls is different, so you will see varying degrees of yellow intensity. Nevertheless, the way the control should work is that some pure black and white should be preserved at the top and bottom of the video levels. All of these tools maintain that, except for the Color Wheels. There, the entire image is yellow, effectively making the hue offset puck function more like a tint control.

One other issue to note is that the Color Wheels offer an extraordinarily wide control range. The hue offset control RGB intensity values go from 0 (center of the wheel) to 1023. However, the puck icon can only go to the rim of the wheel, which it hits at about 200. With a mouse (or numerical entry), you can keep going well past the stop of the wheel icon – five times farther, in fact. The image not only becomes very yellow in this case, but you can easily lose the location of your control, since the GUI position is no longer relevant.

The working theory

The big question is why don’t the Color Wheels conform to established principles, when in fact, the Color Board controls do? Until there is some further clarification from Apple, one possible explanation is with HDR. FCPX 10.4 introduced High Dynamic Range (HDR) features. One of the various HDR standards is Rec. 2020 PQ. In that color space, the 0-100 IRE limitations of Rec. 709 are expanded to 0-10,000 nits. 0-100 nits is roughly the same brightness as we are used to with Rec. 709.

Looking at this image of the man walking along the street – where I’ve attempted to get a pleasing look with all of the tools – you’ll see that the Color Wheels in Rec. 709 don’t react correctly and will drive the highlights into a range to be clipped. However, in the bottom pane, which is the same image in Rec. 2020 PQ color space, the grade looks pretty normal. And, in practice, the Color Wheels controls work more or less the way I would have expected them to work. Yes, the same controls work differently in the different color spaces – properly in 2020 PQ and not in 709.

But why is that the case? I have no answer, but I do have a wild guess. Maybe, just maybe, the Color Wheels were designed for – or intended to only be used for – HDR work. Or maybe there’s conversion or recalibration of the controls that hasn’t taken place yet in this version. If the tool is only calibrated for HDR, then its range and weighing will be completely wrong for Rec. 709 video. If you increase the Midtones luma of the ramp in both Rec. 709 and Rec. 2020 PQ, you’ll see a similar curve. In fact, if you overlay a screen shot of each waveform, placing the full Rec. 709 scope image over the bottom portion of the Rec. 2020 PQ scale, you’ll notice that these sort of align up to about 100 IRE and nits. It’s as if one is simply a slice out of the other.

Regardless of why, this is something where I would hope Apple will provide a white paper or other demonstration of what the best practices will be for using this tool effectively. If it isn’t intentional, and actually is a mistake, then I presume a fix will be forthcoming. In either case, put in your feedback comments to Apple.

A word about HDR

Over the course of testing this tool and this theory, I’ve done a bit of testing with the HDR color spaces in FCPX. If you want to know more about HDR, I would encourage you to check out these contrary blog posts by Stu Maschwitz and Alexis Van Hurkman. I tend to side with Stu’s point-of-view and am not a big fan of HDR.

The way Apple has implemented these features in Final Cut Pro X 10.4 is to allow the user to set and override color spaces. If you set up your project to be Rec. 2020 PQ (and set preferences to “show HDR as raw values”), then the viewer and a/v output (direct from the Mac, not through a hardware I/O device) are effectively dimmed through the Mac’s color profile system. When you grade the image based on the 0-10,000 nits scale, you’ll end up seeing an image that looks pleasing and essentially the same as if you were working in Rec. 709. However – and I cannot over-emphasize this – you are not going to be able to produce an image that’s truly compatible with Dolby Vision and actually looks correct as HDR, unless you have the correct AJA I/O hardware and a proper display. And by display, I mean a top-end Dolby, Canon, or Sony unit, costing tens of thousands of dollars.

As I understand the PQ specs, the bulk of the higher range is for the highlights that are normally constrained or clipped in our current video systems. However, that 10,000 nits scale is weighted, so that about 50% of the signal value is in the first 100 nits, making it of comparable brightness to the current 100 IRE. The rest of that range is for brighter information, like specular highlights. You don’t necessarily get more brightness in the shadow detail. Therefore, if you are grading a shot in FCPX in a 2020 PQ color space and you only have the computer display to go by, you’ll grade by eye as much as by scope. This means that to get a pleasing image, you will end up making the average appearance of the image brighter than it really should be. When this is viewed on a real HDR monitor, it will be painfully bright. Having a higher-nits computer display, like on the iMac Pro (up to 500 nits), won’t make much difference, unless maybe, you crank the display brightness to its maximum (ouch!). “Mine goes to 11!”
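The PQ curve itself (SMPTE ST 2084) is published math, so this weighting can be checked directly:

    # SMPTE ST 2084 (PQ) encoding: absolute nits -> 0-1 signal value.
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32

    def pq_encode(nits):
        y = (nits / 10000.0) ** m1
        return ((c1 + c2 * y) / (1.0 + c3 * y)) ** m2

    for nits in (100, 1000, 10000):
        print(f"{nits:>6} nits -> PQ code value {pq_encode(nits):.3f}")
    # 100 nits lands near 0.5: half the signal range covers SDR brightness,
    # leaving the upper half for highlights all the way up to 10,000 nits.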

Right now, HDR is the wild, wild west. If you are smart, you’ll realize that you don’t know what you don’t know. While it’s nice to have these new features in FCPX, they can be very dangerous in the wrong hands.

But that’s another matter. Right now, I just hope Apple (or one of the usual suspects, like Ripple Training, LumaForge, or Larry Jordan) will come out with more elaboration on the Color Wheels.

©2018 Oliver Peters