Viva Las Vegas – NAB 2018

As more and more folks get all of their information through internet sources, the running question is whether or not trade shows still have value. A show like the annual NAB (National Association of Broadcasters) Show in Las Vegas is both fun and grueling, typified by sensory overload and folks in business attire with sneakers. Although some announcements are made before the exhibits officially open – and nearly all are pretty widely known before the week ends – there still is nothing quite like being there in person.

For some, other shows have taken the place of NAB. The annual HPA Tech Retreat in the Palm Springs area is a gathering of technical specialists, researchers, and creatives that many consider the TED Talks for our industry. For others, the Cine Gear Expo in LA is the prime showcase for grip, lighting, and camera offerings. RED Camera has focused on Cine Gear instead of NAB for the last couple of years. And then, of course, there’s IBC in Amsterdam – the more humane version of NAB in a more pleasant setting. But for me, NAB is still the main event.

First of all, the NAB Show isn’t merely about the exhibit floor at the sprawling Las Vegas Convention Center. Actual NAB members can attend various sessions and workshops related to broadcasting and regulations. There are countless sidebar events specific to various parts of the industry. For editors, that includes Avid Connect – a two-day series of Avid presentations over the weekend leading into NAB; Post Production World – a series of workshops, training sessions, and presentations managed by Future Media Concepts; as well as a number of keynote presentations and artist gatherings, including SuperMeet, FCPexchange, and the FCPX Guru Gathering. These are places where you’ll rub shoulders with some well-known editors, colorists, artists, and mixers, learn about new technologies like HDR (high dynamic range imagery), and occasionally see new product features from vendors – like Apple – who might not officially have a booth on the show floor.

One of the biggest benefits I find in going to NAB is simply walking the floor, checking out the companies and products that might not get a lot of attention. These newcomers often have the most innovative technologies – the discoveries that were never on your radar before that week.

The second benefit is connection. I reconnect in person with friends I’ve made over the years – both other users and vendors. Often it’s a chance to meet people that you might only know through the internet (forums, blogs, etc.) and to get to know them just a bit better. A bit more of that might make the internet more friendly, too!

Here are some of my random thoughts and observations from Las Vegas.

__________________________________

Editing hardware and software – four As and a B

Apple uncharacteristically pre-announced its new features just prior to the show, culminating with App Store availability on Monday when the NAB exhibits opened. This includes new Final Cut Pro X/Motion/Compressor updates and the official figure of 2.5 million FCPX users. That’s a growth of 500,000 users in 2017, the biggest year to date for Final Cut. The key new feature in FCPX is a captioning function to author, edit, and export both closed and embedded (open) captions. There aren’t many great solutions for captioning, and the best to date have been expensive. I found Apple’s approach to be the best and easiest to use that I’ve seen. It’s well-designed and should save time and money for those who need to create captions for their productions – even if you are using another brand of NLE. Best of all, if you own FCPX, you already have that feature. When you don’t have a script to start from, manual or automatic transcription is required as a starting point. There is now a tie-in between Speedscriber (also updated this week) and FCPX that will expedite the speech-to-text function.

The second part of Apple’s announcement was the introduction of a new camera raw codec family – ProRes RAW and ProRes RAW HQ. These are acquisition codecs designed to record the raw sensor data from Bayer-pattern sensors (prior to debayering the signal into RGB information) and make that data available in post, just like RED’s REDCODE RAW or CinemaDNG. Since this is an acquisition codec and NOT a post or intermediate codec, it requires a partnership on the production side of the equation. Initially this includes Atomos and DJI. Atomos supplies external recorders – currently the Shogun Inferno and Sumo 19 models – that capture the output from various cameras able to send raw sensor data to an external device. Because that data is camera-specific, Atomos must create the correct profile for each camera to remap the sensor data into ProRes RAW. At the show, this included several Canon, Sony, and Panasonic cameras. DJI does this in-camera on the Inspire 2.

The advantage in FCPX is that ProRes RAW is optimized for post, thus allowing for more streams in real time. ProRes RAW data rates (variable) fall between those of ProRes and ProRes HQ, while the less compressed ProRes RAW HQ rates fall between ProRes HQ and ProRes 4444. It’s very early for this new codec, so additional camera and post vendors will likely add ProRes RAW support over the coming year. It’s not yet known whether any other NLEs will be able to decode and play ProRes RAW.

As always, the Avid booth was quite crowded and, from what I heard, Avid Connect was well attended by enthused Avid users. The Avid offerings are quite broad and hard to encapsulate in any single blog post. Most, these days, are very enterprise-centric. But this year, with a new CEO at the helm, Avid’s creative tools have been reorganized into three strata – First, standard, and Ultimate. This applies to Sibelius, Pro Tools, and Media Composer. In the case of Media Composer, there’s Media Composer | First – a fully functioning free version with minimal restrictions; Media Composer; and Media Composer | Ultimate, which includes all options, such as PhraseFind, ScriptSync, NewsCutter, and Symphony. The big difference is that project sharing has been decoupled from Media Composer. This means that if you get the “standard” version (just named Media Composer), it will not be enabled for collaboration on a shared storage network. That will require Media Composer | Ultimate. So Media Composer (standard) is designed for the individual editor. There is also a new subscription pricing structure, which places Media Composer at about the same annual cost as Adobe Premiere Pro CC (single-app license). The push is clearly towards subscription; however, you can still purchase and/or maintain support for perpetual licenses, though that info is a little harder to find on Avid’s store website.

Though not as big a news item, Avid is also launching the Avid DNxID capture/export unit. It is custom-designed by Blackmagic Design for Avid and uses a small form factor. It was created for file-based acquisition, supports 4K, and includes embedded DNx codecs for onboard encoding. Connections include analog component and HDMI, plus an SD card slot.

The traffic around Adobe’s booth was thick the entire week. The booth featured interesting demos that were front and center in the middle of one of the South Hall’s main thoroughfares, generally creating a bit of a bottleneck. The newest Creative Cloud updates had preceded the show, but were certainly new to anyone not already using the Adobe apps. Big news for Premiere Pro users was the addition of automatic audio ducking, brought over from Audition, and a new shot-matching function within the Lumetri color panel. Both are examples of Adobe’s use of its Sensei AI technology. Not to be left out, Audition can now also directly open sequences from Premiere Pro. Character Animator had been in beta form, but is now a full-fledged CC product. And for puppet control, Adobe also introduced the Advanced Puppet Engine for After Effects – a deformation tool to better bend, twist, and animate elements.

Of course, when it comes to NLEs, the biggest buzz has been over Blackmagic Design’s DaVinci Resolve 15. The company has an extensive track record of buying up older products from companies that weren’t doing so well, reinvigorating the design, reducing the cost, and breathing new life into them – often for a new, wider customer base. Nowhere is this more evident than with Resolve, which has grown from a leading color correction system into a powerful, all-in-one edit/mix/effects/color solution. We had previously seen the integration of the Fairlight audio mixing engine. This year, Fusion visual effects were added. As before, each of these disparate tools appears on its own page with a UI optimized for that task.

A number of folks have quipped that someone had finally resurrected Avid DS. Although all-in-ones like DS and Smoke haven’t been hugely successful in the past, Resolve’s price point is considerably more attractive. The Fusion integration means that you now have a subset of Fusion running inside of Resolve. This is a node-based compositor, which makes it easy for a Resolve user to understand, since it, too, already uses nodes in the color page. At least for now, Blackmagic Design intends to also maintain a standalone version of Fusion, which will offer more functions for visual effects compositing. Resolve also gained new editorial features, including tabbed sequences, a pancake timeline view, captioning, and improvements in the Fairlight audio page.

Other Blackmagic Design news includes updates to their various mini-converters, updates to the Cintel Scanner, and the announcement of a 4K Pocket Cinema Camera (due in September). They have also redesigned and modularized the Fairlight console mixing panels. These are now more cost-effective to manufacture and can be combined in various configurations.

This was the year for a number of milestone anniversaries, such as the 100th for Panasonic and the 25th for AJA. There were a lot of new product announcements at the AJA booth, but a big one was the push for more OpenGear-compatible cards. OpenGear is an open source hardware rack standard that was developed by Ross and embraced by many manufacturers. You can purchase the OpenGear version of a manufacturer’s product and then mix and match a variety of OpenGear cards within any OpenGear rack enclosure. AJA’s cards also offer DashBoard support – a software tool to configure and control the cards. There are new KONA SDI and HDMI cards, HDR support in the Io 4K Plus, and HDR capture and playback with the Ki Pro Ultra Plus.

HDR

It’s fair to say that we are all still learning about HDR, but from what I observed on the floor, AJA is one of the only companies with a number of hardware products that will let you handle HDR. This is thanks to their partnership with ColorFront, which is handling the color science in these products. One example is the FS-HDR – an up/down/cross, SDR/HDR synchronizer/converter – which also includes support for the Tangent Element Kb panel. The FS-HDR was a tech preview last year, but is a shipping product now. This year’s tech preview product is the HDR Image Analyzer, which offers waveform and histogram monitoring at up to 4K/60fps.

Speaking of HDR (high dynamic range) and SDR (standard dynamic range), I had a chance to sit in on Robbie Carman’s (colorist at DC Color, Mixing Light) Post Production World HDR overview. Carman has graded numerous HDR projects, and from his presentation – coupled with exhibits on the floor – it’s quite clear that HDR is the wild, wild west right now. There is much confusion about color space and dynamic range, not to mention what current hardware is capable of versus the maximums expressed in the tech standards. For example, an image in the BT.2020 color space isn’t inherently HDR. Nor does working in 4K automatically get you HDR – and the set must also accept the HDMI 2.0 standard.

High dynamic range grading absolutely requires HDR-compatible hardware, such as the proper i/o device and a display able to receive the metadata that turns on and sets its target HDR values. This means investing in a device like AJA’s Io 4K Plus or Blackmagic’s UltraStudio 4K Extreme 3. It also means purchasing a true grading monitor costing tens of thousands of dollars, like one from Sony, Canon, or Flanders. You CANNOT properly grade HDR based on the image of ANY computer display. So while the latest version of FCPX can handle HDR, and the iMac Pro screen features a high nit rating, you cannot rely on this screen to see proper HDR.

LG was a sponsor of the show and LG displays were visible in many of the exhibits. Many of their newest products qualify at the minimum HDR spec, but for the most part, the images shown on the floor were simply bright and not HDR – no matter what the sales reps in the booths were saying.

One interesting fact that Carman pointed out was that HDR displays cannot be driven across the full screen at their highest value. You cannot display a full screen of white at 1,000 nits on a 1,000-nit display without causing damage. Therefore, automatic gain adjustments in the set’s electronics dim the screen. Only a smaller percentage of the image (20% maybe?) can be driven at full value before dimming occurs. Another point Carman made was that standard lift/gamma/gain controls may be too coarse to grade HDR images with finesse. His preference is to use Resolve’s log grading controls, because they allow more precise adjustments to highlight and shadow values.

Cameras

I’m not a camera guy, but there was notable camera news at the show. Many folks really like the Panasonic colorimetry for which the Varicam products are known. For people who want a full-featured camera in a small form factor, look no further than the Panasonic AU-EVA1. It’s a 4K, Super35, handheld cinema camera featuring dual native ISOs, and Panasonic claims 14 stops of latitude. It takes EF lenses and can output camera raw data. When paired with an Atomos recorder, it will be able to record ProRes RAW.

Another new camera is Canon’s EOS C700 FF. This is a new full-frame model offered in both EF and PL lens mount versions. Like the standard Super35 C700, it records ProRes or XF-AVC at up to 4K resolution onboard to CFast cards. The full-frame sensor offers higher resolution and a shallower depth of field.

Storage

Storage is of interest to many. As costs come down, collaboration is easier than ever. The direct-attached vendors, like G-Tech, LaCie, OWC, Promise, and others were all there with new products. So were the traditional shared storage vendors like Avid, Facilis, Tiger, 1 Beyond, and EditShare. But three of the newer companies had my interest.

In my editing day job, I work extensively with QNAP, which currently offers the best price/performance ratio of any system. It’s reliable, cost-effective, and provides reasonable JKL response cutting HD media with Premiere Pro in a shared editing installation. But it’s not the most responsive, and it struggles with 4K media in spite of plenty of bandwidth – especially when the editors are all banging away. This has me looking at both Lumaforge and OpenDrives.

Lumaforge is known to many Final Cut Pro X editors, because the developers have optimized the system for FCPX and had early successes with many key installations. Since then, they have also pushed into more Premiere-based installations. Because these units are engineered for video-centric facilities, as opposed to data-centric ones, they promise a better shared storage experience for video editing.

Likewise, OpenDrives made its name as the provider for high-profile film and TV projects cut on Premiere Pro. Last year they came to the show with their highest performance, all-SSD systems. These units are pricey and, therefore, don’t have a broad appeal. This year they brought a few of the systems that are more applicable to a broader user base. These include spinning disk and hybrid products. All are truly optimized for Premiere Pro.

The cloud

In other storage news, “the cloud” garners a ton of interest. The biggest vendors are Microsoft, Google, IBM, and Amazon. While each of these offers relatively easy ways to use cloud-based services for back-up and archiving, if you want a full cloud-based installation for all of your media needs, then actual off-the-shelf solutions are not readily available. The truth of the matter is that each of these companies offers APIs, which are then handed off to other vendors – often for totally custom solutions.

Avid and Sony seem to have the most complete offerings, with Sony Ci being the best one-size-fits-all answer for customer-facing services. Of course, if review-and-approval is your only need, then Frame.io leads and will have new features rolled out during the year. IBM/Aspera is a great option for standard archiving, because fast Aspera up and down transfers are included. You get your choice of IBM or other (Google, Amazon, etc.) cloud storage. They even offer a trial period using IBM storage for 30 days at up to 100GB free. Backblaze is a competing archive solution with many partnering applications. For example, you can tie it in with Archiware’s P5 Suite of tools for back-up, archiving, and server synchronization to the cloud.

Naturally, when you talk of the “cloud”, many people interpret that to mean software that runs in the cloud – SaaS (software as a service). In most cases, that is nowhere close to happening. However, the exception is The Foundry, which was showing Athera, a suite of its virtualized applications, like Nuke, running on the Google Cloud Platform. They demoed it running inside the Chrome browser, thanks to this partnership with Google. The Foundry had a pod in the Google partners pavilion.

In short, you can connect to the internet with a laptop, activate a license of the tool or tools that you need, and then all media, processing, and rendering is handled in the cloud, using Google’s services and hardware. Since all of this happens on Google’s servers, only an updated UI image needs to be pushed back to the connected computer’s display. This concept is ideal for the visual effects world, where the work is generally done on an individual shot basis without a lot of media being moved in real-time. The target is the Nuke-centric shop that may need to add on a few freelancers quickly, and who may or may not be able to work on-premises.

Interesting newcomers

As I mentioned at the beginning, part of the joy of NAB is discovering the small vendors who seek out NAB to make their mark. One example this year is Lumberjack Systems, a venture by Philip Hodgetts and Greg Clarke of Intelligent Assistance. They were in the Lumaforge suite demonstrating Lumberjack Builder, which is a text-based NLE. In the simplest of explanations, your transcription or scripted text is connected to media. As you re-arrange or trim the text, the associated picture is edited accordingly. Newly written text for voiceovers turns into spoken word media courtesy of the computer’s internal audio system and system voice. Once your text-based rough cut is complete, an FCPXML is sent to Final Cut Pro X for further finesse and final editing.

Another new vendor I encountered was Quine, co-founded by Norwegian DoP Grunleik Groven. Their QuineBox IoT device attaches to the back of a camera, where it can record and upload “conformable” dailies (ProRes, DNxHD) to your SAN, as well as proxies to the cloud via its internal wi-fi. Script notes can also be incorporated. The unit has already been battle-tested on the Netflix/NRK production of “Norsemen”.

Closing thoughts

It’s always interesting to see, year over year, which companies are not at the show. This isn’t necessarily indicative of a company’s health, but can signal a change in their direction or that of the industry. Sometimes companies opt for smaller suites at an area hotel in lieu of the show floor (Autodesk). Or they are a smaller part of a reseller or partner’s booth (RED). But often, they are simply gone. For instance, in past years drones were all the rage, with a lot of different manufacturers exhibiting. DJI has largely captured that market for both vehicles and camera systems. While there were a few other drone vendors besides DJI, GoPro and Freefly weren’t at the show at all.

Another surprise change for me was the absence of SAM (Snell Advanced Media) – the hybrid company formed out of Snell & Wilcox and Quantel. SAM products are now part of Grass Valley, which, in turn, is owned by Belden (the cable manufacturer). Separate Snell products appear to have been absorbed into the broader Grass Valley product line. Quantel’s Go and Rio editors continue in Grass Valley’s editing line, alongside Edius – as the entry-level, midrange, and advanced NLE products. A bit sad, actually. And very ironic. Here we are in the world of software and file-based video, but the company that still has money to make acquisitions is the one with a heavy investment in copper (I know, not just copper, but you get the point).

Speaking of “putting a fork in it”, I would have to say that stereo 3D and 360 VR are pretty much dead in the film and video space. I understand that there is a market – potentially quite large – in gaming, education, simulation, engineering, training, etc. But for more traditional entertainment projects, it’s just not there. Vendors were down to a few, and even though the leading NLEs have ways of working with 360 VR projects, the image quality still looks awful. When you view a 4K image within even the best goggles, the qualitative experience is like watching a 1970s-era TV set from a few inches away. For now, it continues to be a novelty looking for a reason to exist.

A few final points… It’s always fun to see which computers are being used in the booths. Apple is again a clear winner, with plenty of MacBook Pros and iMac Pros all over the LVCC wherever any sort of creative product or demo was shown. eGPUs are of interest, with Sonnet being the main vendor. However, an eGPU is not a solution to every problem. For example, you will see more benefit by adding an eGPU to a lesser-powered machine, like a 13” MacBook Pro, than to one with more horsepower, like an iMac Pro. Each eGPU takes up one Thunderbolt 3 bus, so realistically, you are likely to add only one eGPU to a computer. None of the NLE vendors could really tell me how much of a boost their application would get from an eGPU. Finally, if you are looking for great-looking, large OLED displays that are pretty darned accurate and won’t break the bank, then LG is the place to look.

©2018 Oliver Peters


What’s up with Final Cut’s Color Wheels?

Apple Final Cut Pro X 10.4 introduced new, advanced color correction tools to the editing application, including color wheels, curves, and hue vs. saturation curves. These are tools that users of other NLEs have enjoyed for some time – and which were part of Final Cut Studio (FCP 7, Color). Like others, my first reaction was, “Super! They’ve added some nice advanced tools, which will improve the use of FCPX for higher-end users.” But as I started using the Color Wheels for real correction work, I quickly realized that something wasn’t quite right in how they operated. Or at least, they didn’t work in the way we’ve come to understand such controls.

In trying to figure it out, I reached out to other industry pros and developers for their thoughts. Naturally this led to some spirited discussions at forums like those at Creative COW. However, other editors have noticed the same problems, so you can also find threads in the Facebook FCPX group and at FCP.co. It is certainly easy to characterize this as just another internet kerfuffle surrounding Apple’s “think different” approach to FCPX. But those arguments fall flat when you actually try to use the tools as intended.

The FCPX Color Wheels panel includes four wheels – Master, Shadows, Midtones, and Highlights. The puck in the center of each wheel is a hue offset control to push hues in the direction that you move the puck. The slider to the right of the wheel controls the brightness of that range. The left slider controls the saturation. One of the main issues is that when you adjust luminance using one of these controls, the affected range is too broad. Specifically, in the case of the Midtones control, as you adjust the luminance slider up or down, you are affecting most of the image and not just the midrange levels. This is not the way this type of control normally works in other tools, and in fact, it’s not how FCPX’s Color Board controls work either.

“What’s the big deal?” you might ask. Fair enough. I see two operational issues. The first is that to properly grade the image using the Color Wheels, you end up having to go back-and-forth a lot between wheels, to counteract the changes made by one control with another. The second is that using the Midtones slider tends to drive highlights above 100 IRE, where they will be clipped if any broadcast limiting is used. This doesn’t happen with other color tools, notably Apple’s own Color Board.

A lot of the discussion focuses on luma levels and specifically the Midtones slider, since it’s easy to see the issue there. However, other controls are also affected, but that’s too much to dissect in a single post. Throughout this post, be sure to click on the images to see the full view. I have presented various samples against each other and you will only get the full understanding if you open the thumbnail (which is small but also cropped) to the full image. I have compared the effect using five different tools – the Color Board, the Color Wheels, a color corrector plug-in that I built as a Motion template using Motion effects, Rubber Monkey Software FilmConvert (the wheels portion only), and finally, the Adobe Lumetri controls in Premiere Pro.

I am using three different test images – a black-to-white ramp, a test pattern, and a demo video image. The ramp without correction will appear as a diagonal line (0-100 IRE) on the scope, which makes it easy to analyze what’s happening. The video image has definite shadow and highlight areas, which lets us see how these controls work in the real world. For example, if you want to brighten the area of the shot where the man is in the shadows, but don’t want to make the highlights any brighter, this would normally be done using a Midtones control. Be aware that these various tools certainly aren’t calibrated the same way and some have a greater range of control than others. The weakest of these is FilmConvert’s wheels, since this plug-in has additional level controls in other parts of its interface.

Color science models

In the various forum threads, the argument is made that Apple is simply using a different color science method or a different weighting of some existing models. That’s certainly possible, since not all color correctors are built the same way. The most common approaches are Lift/Gamma/Gain and Shadows/Mids/Highlights. Be careful with naming, though. Just because something uses the terminology of Shadows, Midtones, and Highlights does not mean that it also uses the SMH color science model. Many tools use the Lift/Gamma/Gain model, but call the controls shadows (Lift), mids (Gamma), and highlights (Gain). Another term you may run across in some correction tools is Set-up. This is typically used for control of the shadows (equal to Lift), but can also function as an offset control that raises the level of the entire image. Avid Symphony employs this solution. Finally, both Symphony and Adobe SpeedGrade use what has been dubbed a 12-way color corrector, where each range is further subdivided into its own subset of shadows, mids, and highlights controls.

An LGG model provides broad control of shadows and highlights, with the midtones control working like a curve that covers the whole range, but has its largest effect in the middle. An SMH model normally divides the levels into three distinct, partially overlapping ranges – much like a three-band audio equalizer. A number of color correctors add a luma range control, which gives the user the ability to change how much of the image a specific range will affect. In other words, how broad is the reach of the shadows, mids, or highlights control? This is like a Q control in an audio equalizer, where you change the shape of the envelope at a certain frequency, as sketched below.
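
To make the equalizer analogy concrete, here is a minimal Python sketch of one hypothetical SMH weighting scheme. The actual curves vary from product to product – this is purely illustrative – but the idea is the same: three overlapping luma ranges whose weights always sum to one, just like overlapping EQ bands.

```python
# A hypothetical SMH weighting scheme: three overlapping luma ranges,
# analogous to the bands of a three-band audio equalizer.
# Real color correctors use their own, proprietary curves.

def smh_weights(luma):
    """Return (shadows, mids, highlights) weights for a luma value in 0.0-1.0."""
    shadows = max(0.0, 1.0 - 2.0 * luma)      # full at black, fades out by mid-gray
    highlights = max(0.0, 2.0 * luma - 1.0)   # fades in from mid-gray, full at white
    mids = 1.0 - shadows - highlights         # a bump peaking at mid-gray
    return shadows, mids, highlights

for v in (0.0, 0.25, 0.5, 0.75, 1.0):
    s, m, h = smh_weights(v)
    print(f"luma {v:.2f} -> shadows {s:.2f}, mids {m:.2f}, highlights {h:.2f}")
```

A luma range control, where offered, would reshape these curves – widening or narrowing each band – which is exactly what a Q control does to an EQ envelope.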

Red Giant’s Magic Bullet Looks offers both color correction models with two different tools – the 4-way color corrector (SMH) and the Colorista color corrector (LGG). When you adjust the midrange control of their 4-way, the result is a graceful S-shaped curve to the levels on the waveform.

To study the effect of an LGG-based corrector, test it on the ramp. The shadows control (Lift) will raise or lower the dark areas of the image without changing the absolute highlights. The diagonal line of the ramp on the waveform essentially pivots, hinged at the 100 IRE point. Conversely, changing the highlights control (Gain) pivots the line while it stays pinned at 0 IRE (black). When you adjust the midtones control (Gamma), you create a curve in the line, which stays pinned at 0 and 100 IRE at either end. In this way you are effectively “expanding” or “compressing” the levels in the middle portion of the image without changing the position of your black or white points.
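
This ramp behavior is easy to simulate. Below is a minimal Python sketch using one common LGG formulation (implementations differ in their exact math, so treat this as illustrative only). Note how the gamma adjustment bends the ramp while leaving 0.0 and 1.0 pinned, whereas a plain offset – which is how the FCPX Midtones slider appears to behave – pushes the top of the ramp past 1.0, right where a broadcast limiter would clip it.

```python
# Compare a gamma (midtones) adjustment with a plain offset on a 0-100 IRE ramp,
# normalized to 0.0-1.0. The LGG math here is one common variant, not FCPX's.

def lgg(value, lift=0.0, gamma=1.0, gain=1.0):
    """One common lift/gamma/gain formulation (illustrative only)."""
    v = value * gain + lift
    return max(0.0, v) ** (1.0 / gamma)

ramp = [i / 10 for i in range(11)]                 # the uncorrected diagonal line

gamma_up = [lgg(v, gamma=1.4) for v in ramp]       # brighten mids; ends stay pinned
offset_up = [v + 0.2 for v in ramp]                # brighten everything; white clips

print("ramp  :", " ".join(f"{v:.2f}" for v in ramp))       # 0.00 ... 1.00
print("gamma :", " ".join(f"{v:.2f}" for v in gamma_up))   # 0.00 ... 1.00, bowed up
print("offset:", " ".join(f"{v:.2f}" for v in offset_up))  # 0.20 ... 1.20, clips
```

With gamma, black and white stay put while the middle of the ramp lifts; with an offset, the whole line shifts and the highlights land above 1.0 – that is, above 100 IRE.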

How the various color correction tools react

Looking at the luma control for the Midtones, two things are clear. First, all of these tools except the Color Wheels are using the LGG color science model. It’s not clear what the Color Wheels use, but it isn’t SMH, as there is no bulge or S-curve visible in the scope. Second, the Color Wheels quickly drive the image levels into clipping, while the other tools generally keep black and white levels in place. In essence, the Midtones control affects the image more like a master or offset control would, rather than a typical mids or Gamma control. Yet, clearly Apple’s Color Board controls adhere to the standard LGG model. The concern, of course, is clipping. In the test image of the man walking on the village street, the sunlit building walls on the opposite side of the street become overexposed and risk being clipped when the Color Wheels are used.

What about color? As a simple test, I next shifted the Midtones puck toward yellow. Bear in mind that the range of each of these controls is different, so you will see varying degrees of yellow intensity. Nevertheless, the way the control should work is that some pure black and white should be preserved at the top and bottom of the video levels. All of these tools maintain that, except for the Color Wheels. There, the entire image turns yellow, effectively making the hue offset puck function more like a tint control.

One other issue to note is that the Color Wheels offer an extraordinarily wide control range. The hue offset control’s RGB intensity values go from 0 (center of the wheel) to 1023. However, the puck icon can only go to the rim of the wheel, which it hits at about 200. With a mouse (or numerical entry), you can keep going well past the stop of the wheel icon – five times farther, in fact. Not only does the image become very yellow in this case, but you can easily lose track of your adjustment, since the GUI position is no longer meaningful.

The working theory

The big question is why the Color Wheels don’t conform to established principles when the Color Board controls do. Until there is some further clarification from Apple, one possible explanation involves HDR. FCPX 10.4 introduced high dynamic range (HDR) features. One of the various HDR standards is Rec. 2020 PQ. In that color space, the 0-100 IRE limits of Rec. 709 are expanded to 0-10,000 nits, with 0-100 nits roughly matching the brightness we are used to in Rec. 709.

Looking at this image of the man walking along the street – where I’ve attempted to get a pleasing look with all of the tools – you’ll see that the Color Wheels in Rec. 709 don’t react correctly and will drive the highlights into a range to be clipped. However, in the bottom pane, which is the same image in Rec. 2020 PQ color space, the grade looks pretty normal. And, in practice, the Color Wheels controls work more or less the way I would have expected them to work. Yes, the same controls work differently in the different color spaces – properly in 2020 PQ and not in 709.

But why is that the case? I have no answer, but I do have a wild guess. Maybe, just maybe, the Color Wheels were designed for – or intended to be used only for – HDR work. Or maybe a conversion or recalibration of the controls hasn’t taken place yet in this version. If the tool is only calibrated for HDR, then its range and weighting will be completely wrong for Rec. 709 video. If you increase the Midtones luma of the ramp in both Rec. 709 and Rec. 2020 PQ, you’ll see a similar curve. In fact, if you overlay a screen shot of each waveform, placing the full Rec. 709 scope image over the bottom portion of the Rec. 2020 PQ scale, you’ll notice that they roughly align up to about 100 IRE/100 nits. It’s as if one is simply a slice out of the other.

Regardless of why, this is something where I would hope Apple will provide a white paper or other demonstration of what the best practices will be for using this tool effectively. If it isn’t intentional, and actually is a mistake, then I presume a fix will be forthcoming. In either case, put in your feedback comments to Apple.

A word about HDR

Over the course of testing this tool and this theory, I’ve done a bit of testing with the HDR color spaces in FCPX. If you want to know more about HDR, I would encourage you to check out the contrasting blog posts by Stu Maschwitz and Alexis Van Hurkman. I tend to side with Stu’s point of view and am not a big fan of HDR.

The way Apple has implemented these features in Final Cut Pro X 10.4 is to allow the user to set and override color spaces. If you set up your project as Rec. 2020 PQ (and set preferences to “show HDR as raw values”), then the viewer and a/v output (direct from the Mac, not through a hardware i/o device) are effectively dimmed through the Mac’s color profile system. When you grade the image based on the 0-10,000 nits scale, you’ll end up seeing an image that looks pleasing and essentially the same as if you were working in Rec. 709. However – and I cannot over-emphasize this – you are not going to produce an image that’s truly compatible with Dolby Vision and actually looks correct as HDR, unless you have the correct AJA i/o hardware and a proper display. And by display, I mean a top-end Dolby, Canon, or Sony unit costing tens of thousands of dollars.

As I understand the PQ specs, the bulk of the higher range is for the highlights that are normally constrained or clipped in our current video systems. However, that 10,000-nit scale is weighted so that about 50% of the signal value covers the first 100 nits, making that portion comparable in brightness to the current 100 IRE. The rest of the range is for brighter information, like specular highlights. You don’t necessarily get more brightness in the shadow detail. Therefore, if you are grading a shot in FCPX in the 2020 PQ color space and you only have the computer display to go by, you’ll grade by eye as much as by scope. This means that to get a pleasing image, you will end up making the average appearance of the image brighter than it really should be. When this is viewed on a real HDR monitor, it will be painfully bright. Having a higher-nit computer display, like the iMac Pro’s (up to 500 nits), won’t make much difference – unless, maybe, you crank the display brightness to its maximum (ouch!). “Mine goes to 11!”
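
That weighting is easy to check against the published SMPTE ST 2084 (PQ) constants. The short Python sketch below computes the normalized PQ signal value for a few luminance levels: 100 nits lands at roughly half the signal range, even though it is only 1% of the absolute luminance scale – which also explains why a Rec. 709 waveform roughly matches the bottom slice of a PQ scope, as noted earlier.

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance (nits) -> normalized signal.
m1 = 2610 / 16384          # ~0.1593
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits):
    """Map 0-10,000 nits to a 0.0-1.0 PQ signal value."""
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1.0 + c3 * y)) ** m2

for nits in (1, 10, 100, 1000, 10000):
    print(f"{nits:>6} nits -> PQ signal {pq_encode(nits):.3f}")
# 100 nits comes out near 0.51 -- about half the code range for
# the brightness we're used to as "100 IRE" in Rec. 709.
```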

Right now, HDR is the wild, wild west. If you are smart, you’ll realize that you don’t know what you don’t know. While it’s nice to have these new features in FCPX, they can be very dangerous in the wrong hands.

But that’s another matter. Right now, I just hope Apple (or one of the usual suspects, like Ripple Training, LumaForge, or Larry Jordan) will come out with more elaboration on the Color Wheels.

©2018 Oliver Peters

Apple Final Cut Pro X 10.4

December finally delivered the much-anticipated simultaneous release of new versions of Apple Final Cut Pro X, Motion, and Compressor – all on the same day as the iMac Pro officially went on sale. In the broader ecosystem, we also saw updates for macOS High Sierra, Logic Pro X, Pixelmator Pro, and Blackmagic Design DaVinci Resolve.

Final Cut Pro X (“ten”), version 10.4 is the fifth major release of Apple’s professional NLE in a little over six years. There are changes under the hood tied to technologies in High Sierra (macOS 10.13), which won’t get much press, but are very important in the development and operation of an application. This version will still run on a wide range of recent and older Macs. The minimum OS requirement is 10.12.4, but 10.13 or later is recommended. There are four new, marquee features in this release: advanced color correction tools, 360° editing, HDR (wide gamut) color space support, and HEVC/H.265 codec support for editing and encoding.

New advanced color tools

Final Cut Pro X was first launched with a color correction tool called the color board. It substituted sliders on a color swatch for the standard curves and color wheel controls that editors had been used to. While the color board was and is effective, as well as a bit deceptive in what you can accomplish, it was an instant turn-off for many. The lack of a more advanced color correction interface opened the field for third party color correction plug-in developers who came up with some great tools. With the release of FCPX 10.4, it’s hard for me to see why FCPX diehards would still buy a color correction plug-in. Yet, I have heard from at least one plug-in developer that their color corrector plug-in sales are staying stable. Clearly users want choice and that’s a good thing.

With this update, you’ve gained three new, native color tools: color wheels, curves, and hue vs. saturation curves. All are elegantly designed, operate quite fluidly, and generally mimic what you can do in DaVinci Resolve. The color board didn’t go away, however. There’s a preference setting for which of these four color tools is the default effect when first applying color correction (CMD+6).

Once you start color correcting, you can add more instances of any of these four tools in any combination. Final Cut Pro X sports robust performance, so you can apply several layers of correction to a clip and still have real-time playback without rendering. There are also additional keyboard commands to quickly step through effects or clips on your timeline. While not quite as fluid a grading workflow as you’d have in a true color correction application like Resolve, you can get pretty close with some experience. My biggest beef is that the controls are locked within the inspector pane. You can’t move them around, and there is no dedicated color correction workspace. So for me, the ergonomics are poor. In my testing, I’ve also hit some flaws in how the processing is done (more on that in a future post). Ironically, the color board actually seems to achieve more accurate correction than the color wheels.

There are a few quirks. Previously created presets for the color board will be converted into color preset effects, which now appear in the effects browser. This enables you to preview a color preset applied to a clip by skimming over the effect thumbnail. Unfortunately, I found this conversion didn’t always work. On a Sierra machine (10.12), the older presets were automatically converted after a few minutes; however, nothing happened on a High Sierra machine (10.13). I eventually resorted to copying my converted effects presets from the Sierra Mac over to the High Sierra Mac. I suspect that because the High Sierra update automatically reformats the internal SSD to the new Apple File System (APFS), this conversion process is somehow impeded. Of course, if you don’t have any existing custom presets, then it’s not an issue.

(You can check out my previously-created color presets for instructions and downloads here.)

There is no control surface support yet, although future support for third party color correction controllers has been alluded to. It would be nice to see support for Tangent or Avid panels at the very least. There’s a new FCPXML version (1.7) that includes this new color metadata; however, it doesn’t seem to be imported into the newest version of Resolve. It’s possible that color metadata in the FCPXML file is only intended for FCPX-to-FCPX transfers and not round tripping to other applications.

360° editing

Let me say up front that this doesn’t hit my hot button. It’s an area where Apple is playing catch-up to Adobe, and quite frankly, for both companies it only appeals to a small percentage of users. Not all 360° formats are supported. Your footage must be equirectangular (a stitched panorama) so that FCPX can properly correct its display. Nevertheless, if you do work on 360° productions, then FCPX provides you a nice tool kit.

You can set up your timeline sequence for monoscopic or stereoscopic 360° editing. Once set up, simply open a separate 360° viewer, side-by-side with the normal viewer. When you do this, you’ll see the uncorrected image on the right and the adjusted point-of-view image on the left. What’s really cool is that you can play the timeline and actively navigate your view of the content within the 360° viewer without ever stopping playback – and I’m talking about 4K material here! Clearly the engineers have tweaked the performance and not just integrated a plug-in.

There are also a set of custom effects designed for seamless use on 360° images. For example, if you apply a standard blur, there will be a visible seam where the left and right edges meet. If you apply a 360° blur effect, then the image and effect are properly blended. If you want to get the full effect, just attach an HTC Vive VR headset to view clips in full 360°. Want to test this, but don’t have any footage? A quick web search will provide a ton of downloadable, equirectangular clips to play with.

Wide gamut / high dynamic range (HDR)

Apple is trying to establish leadership with the integration of workflows to support HDR editing. I suspect that their ultimate goal is proper HDR support for Apple TV 4K and the iPhone X. The state of HDR today is very confusing, without any settled standards. There’s Dolby Vision, and there’s HDR10, an open standard. The latter leaves the actual implementation up to manufacturers, while Dolby licenses its technology with tight specs. The theoretical Dolby Vision brightness standard is 10,000 nits (cd/m2), but their current target is only 4,000 nits. HDR10 caps at 1,000 nits. Current consumer TV sets run in the 300 to 500 nit range, with none exceeding 1,000 nits. Finally, projected brightness in movie theaters is even lower.

To work in HDR within Final Cut Pro X, first set up the FCPX library as wide gamut instead of standard. Then set the project (sequence) to one of four standards: Rec 709 (standard dynamic range), Rec 2020, Rec 2020 PQ, or Rec 2020 HLG. The first Rec 2020 mode simply preserves the full dynamic range of log-encoded camera files when FCPX applies its LUTs. The PQ and HLG options are designed for Dolby Vision and/or HDR10 mastering. HDR tools are provided to move between color spaces, such as mastering in Rec 2020 PQ and delivering in Rec 709 (consult Apple’s workflow document). However, it is only in the Rec 2020 PQ color space that the FCPX scope will display in nits rather than IRE. When set to nits, the scale is 0 to 10,000 nits instead of 0 to 120 IRE.

To edit in one of these wide gamut color spaces, set your preferences to display HDR in raw values. Then Final Cut interacts with the color profile of the monitor through macOS to effectively dim the viewer image for this new color space. However, this technique is not applied to the filmstrips and thumbnail images in the browser, which will appear with blown out levels unless you manually override the colorspace setting for each clip. If your footage was shot with camera raw or log-encoding, using a RED, ARRI or similar camera, then you are ready to work in HDR today.

It’s critical to note that no current computer display or consumer flat panel will give you an accurate HDR image to grade by. This includes the new iMac Pro screens. You will need the proper AJA i/o hardware and a calibrated HDR display to see a proper HDR image. Even then, it’s still a question of which HDR levels you are trying to master to. For example, if you are using the scope in FCPX with a brightness level up to 10,000 nits, but your target display can only achieve 1,000 nits, then what good is the reading on the scope? We are still early in the HDR process, but I’m concerned that FCPX 10.4 will give users a false impression of what it really takes to do HDR properly.

HEVC / H.265

You can now import iMovie for iOS projects into FCPX 10.4.  Support for the H.265 (HEVC) codec has been added with this release, but you’ll need to be on High Sierra. If you shot video with an iPhone X and started organizing it in iMovie on the phone, then that video may have used the H.265 codec. Now you can bring that into FCPX to continue the job.

Going the other way will require Compressor encoding. HEVC is also the required format to send HDR material to the web. Apple is late to the game in H.265 support, as Sorenson and Adobe users have been able to do that for a while. I tested H.265 encoding of short clips in Compressor on my mid-2014 Retina MacBook Pro and it was extremely slow. There was no issue with H.264 encoding. The same H.265 test in Adobe Media Encoder – even when it was uprezzing a 1080p file to 4K – was significantly faster than Compressor.

Final thoughts

For current users: when you update to Final Cut Pro X 10.4, please remember that it will update each FCPX library file that you open afterwards. Although this has generally been harmless for most users, it’s best to follow some precautions. Zip your 10.3 (or earlier) version of the application and move that .zip file out of the Applications folder before you update. Archive all of your existing Final Cut libraries. This way you can find your way back in case of some type of failure.

Final Cut Pro X 10.4 is a solid upgrade that will have loyal FCPX users applauding. Overall, these new tools are useful and, as before, FCPX is a very fluid, enjoyable editing application. It slices through 4K content better than any other NLE on the Mac platform. If you like its editing paradigm, then nothing else comes close.

Unfortunately, Apple didn’t squash some long-standing bugs. For example, numerous users online are still complaining about the issue where browser text intermittently disappears. I do feel that there were missed opportunities. The functionality of audio lanes – a feature introduced in 10.3 as a way to get closer to track-style audio mixing – hasn’t been expanded. The hope for an enhanced, roles-based audio mixer has once again gone unanswered. On the other hand, the built-in audio plug-ins have been updated to those used by Logic Pro X and there’s a clean path to send your audio to Logic if you want to mix there.

I definitely welcome these updates. The new color tools make it a more powerful application to use for color grading, so I’m happy to see that Apple has been listening. Now, I hope that we’ll see some of the other needs addressed before another year passes us by.

©2017, 2018 Oliver Peters

Stocking Stuffers 2017

It’s holiday time once again. For many editors that means it’s time to gift themselves with some new tools and toys to speed their workflows or just make the coming year more fun! Here are some products to consider.

Just like the tiny house craze, many editors are opting for their laptops as their main editing tool. I’ve done it for work that I cut when I’m not freelancing in other shops, simply because my MacBook Pro is a better machine than my old (but still reliable) 2009 Mac Pro tower. One less machine to deal with, which simplifies life. But to really make it feel like a desktop tool, you need some accessories along with an external display. For me, that boils down to a dock, a stand, and an audio interface. There are several stands for laptops. I bought both the Twelve South BookArc and the Rain Design mStand: the BookArc for when I just want to tuck the closed MacBook Pro out of the way in the clamshell mode and the mStand for when I need to use the laptop’s screen as a second display. Another option some editors like is the Vertical Dock from Henge Docks, which not only holds the MacBook Pro, but also offers some cable management.

The next hardware add-on for me is a USB audio interface. This is useful for any type of computer and may be used with or without other interfaces from Blackmagic Design or AJA. The simplest of these is the Mackie Onyx Blackjack, which combines interface and output monitor mixing into one package. This means no extra small mixer is required. USB input and analog audio output direct to a pair of powered speakers. But if you prefer a separate small mixer and only want a USB interface for input/output, then the PreSonus Audiobox USB or the Focusrite Scarlett series is the way to go.

Another ‘must have’ with any modern system is a Thunderbolt dock, in order to expand the native port connectivity of your computer. There are several on the market, but it’s hard to go wrong with either the CalDigit Thunderbolt Station 2 or the OWC Thunderbolt 2 Dock. Make sure you double-check which version fits your needs, depending on whether you have a Thunderbolt 2 or 3 connection and/or USB-C ports. I routinely use both the CalDigit and OWC products. The choice simply depends on which one has the right combination of ports for your setup.

Drives are another issue. With a small system, you want small portable drives. While LaCie Rugged and G-Technology portable drives are popular choices, SSDs are the way to go when you need true, fast performance. A number of editors I’ve spoken to are partial to the Samsung Portable SSD T5 drives. These USB 3.0-compatible drives aren’t the cheapest, but they are ultraportable and offer amazing read/write speeds. Another popular solution is to use raw (uncased) drives in a drive caddy/dock for archiving purposes. Since they are raw, you don’t pay for the extra packaging, power supply, and interface electronics with each drive, just to have it sit on the shelf. My favorite of these is the HGST Deskstar NAS series.

For many editors the software world is changing with free applications, subscription models, and online services. The most common use of the latter is for review-and-approval, along with posting demo clips and short films. Kollaborate.tv, Frame.io, Wipster.io, and Vimeo are the best known. There are plenty of options and even Vimeo Pro and Business plans offer a Frame/Wipster-style review-and-approval and collaboration service. Plus, there’s some transfer ability between these. For example, you can publish to a Vimeo account from your Frame account. Another expansion of the online world is in team workgroups. A popular solution is Slack, which is a workgroup-based messaging/communication service.

As more resources become available online, the benefits of large-scale computing horsepower are available to even single editors. One of the first of these new resources is cloud-based, speech-to-text transcription. A number of online services provide this functionality to any NLE. Products to check out include Scribeomatic (Coremelt), Transcriptive (Digital Anarchy), and Speedscriber (Digital Heaven). They each offer different pricing models and speech analysis engines. Some are still in beta, but one that’s already out is Speedscriber, which I’ve used and am quite happy with. Processing is fast and reasonably accurate, given a solid audio recording.

Naturally free tools make every user happy and the king of the hill is Blackmagic Design with DaVinci Resolve and Fusion. How can you go wrong with something this powerful and free with ongoing company product development? Even the paid versions with some more advanced features are low cost. However, at the very least the free version of Resolve should be in every editor’s toolkit, because it’s such a Swiss Army Knife application.

On the other hand, editors who need to learn Avid Media Composer need look no further than the free Media Composer | First. Avid has tried ‘dumbed-down’ free editing apps before, but First is actually built off of the same code base as the full Media Composer software. Thus, skills translate, and most of the core functions are available for you to use.

Many users are quite happy with the advantages of Adobe’s Creative Cloud software subscription model. Others prefer to own their software. If you work in video, then it’s easy to put together alternative software kits for editing, effects, audio, and encoding that don’t touch an Adobe product. Yet for most, the stumbling block is Photoshop – until now. Both Affinity Photo (Serif) and Pixelmator Pro are full-fledged graphic design and creation tools that rival Photoshop in features and quality. Each of these has its own strong points. Affinity Photo offers Mac and Windows versions, while Pixelmator Pro is Mac only, but taps more tightly into macOS functions.

If you work in the Final Cut Pro X world, several utilities are essential. These include SendToX and XtoCC from Intelligent Assistance, along with X2Pro Audio Convert from Marquis Broadcast. Marquis’ newest is Worx4 X – a media management tool. It takes your final sequence and creates a new FCPX library with consolidated (trimmed) media. No transcoding is involved, so the process is lightning fast, although in some cases media is copied without being trimmed. This can reduce the media to be archived from TBs down to GBs. They also offer Worx4 Pro, which is designed for Premiere Pro CC users. This tool serves as a media tracking application, letting editors find all of the media used in a Premiere Pro project across multiple volumes.

Most editors love to indulge in plug-in packages. If you can only invest in a single, large plug-in package, then BorisFX’s Boris Continuum Complete 11 and/or their Sapphire 11 bundles are the way to go. These are industry-leading tools with wide host and platform support. Both feature mocha tracking integration and Continuum also includes the Primatte Studio chromakey technology.

If you want to go for a build-it-up-as-you-need-it approach – and you are strictly on the Mac – then FxFactory will be more to your liking. You can start with the free, basic platform or buy the Pro version, which includes FxFactory’s own plug-ins. Either way, FxFactory functions as a plug-in management tool. FxFactory’s numerous partner/developers provide their products through the FxFactory platform, which functions like an app store for plug-ins. You can pick and choose the plug-ins that you need when the time is right to purchase them. There are plenty of plug-ins to recommend, but I would start with any of the Crumplepop group, because they work well and provide specific useful functions. They also include the few audio plug-ins available via FxFactory. Another plug-in to check out is the Hawaiki Keyer 4. It installs into both the Apple and Adobe applications and far surpasses the built-in keying tools within these applications.

The Crumplepop FxFactory plug-ins now include Koji Advance, which is a powerful film look tool. I like Koji a lot, but prefer FilmConvert from Rubber Monkey Software. To my eyes, it creates one of the more pleasing and accurate film emulations around and even adds a very good three-way color corrector. It opens as a floating window inside of FCPX, which is less obtrusive than some of the other color correction plug-ins for FCPX. And it’s not just for film emulation – you can actually use it as the primary color corrector for an entire project.

I don’t want to forget audio plug-ins in this end-of-the-year roundup. Most editors don’t feel too comfortable with a ton of surgical audio filters, so let me stick to suggestions that are easy-to-use and very affordable. iZotope is a well-known audio developer and several of its products are perfect for video editors. These fall into repair, mixing, and mastering needs. These include the Nectar, Ozone, and RX bundles, along with the RX Loudness Control. The first three groups are designed to cover a wide range of needs and, like the BCC video plug-ins, are somewhat of an all-encompassing product offering. But if that’s a bit rich for the blood, then check out iZotope’s various Elements versions.

The iZotope RX Loudness Control is great for accurate loudness compliance and is best used with Avid or Adobe products. However, it is not real-time, because it uses analysis and adaptive processing. If you want something more straightforward and real-time, then check out the LUFS Meter from Klangfreund. It can be used for loudness control on individual tracks or the master output. It works with most NLEs and DAWs. A similar tool is Loudness Change from Videotoolshed.
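
For a concrete sense of what loudness compliance involves, here is a minimal Python sketch – not the code of any product mentioned here – that measures integrated loudness per ITU-R BS.1770 using the pyloudnorm library and applies a static gain offset toward the EBU R128 target of -23 LUFS. The file names are hypothetical.

    import soundfile as sf
    import pyloudnorm as pyln

    TARGET = -23.0  # EBU R128 integrated loudness target, in LUFS

    data, rate = sf.read("final_mix.wav")      # hypothetical mixdown
    meter = pyln.Meter(rate)                   # K-weighted BS.1770 meter
    measured = meter.integrated_loudness(data)
    print(f"{measured:.1f} LUFS, offset {TARGET - measured:+.1f} dB")

    # Apply the static gain offset toward the target level.
    sf.write("final_mix_r128.wav",
             pyln.normalize.loudness(data, measured, TARGET), rate)

Like RX Loudness Control, this is an analyze-then-process pass rather than a real-time meter; a delivery-ready version would also need true-peak limiting.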

Finally, let’s not forget the iOS world, which is increasingly becoming a viable production platform. For example, I’ve used my iPad in the last year to do location interview recordings. This is a market that audio powerhouse Apogee has also recognized. If you need a studio-quality hardware interface for an iPhone or iPad, then check out the Apogee ONE. In my case, I tapped the Apogee MetaRecorder iOS application for my iPad, which works with both Apogee products and the iPad’s built-in mic. It can be used in conjunction with FCPX workflows through the integration of metadata tagging for Keywords, Favorites, and Markers.

Have a great holiday season and happy editing in the coming year!

©2017 Oliver Peters

Chromatic

Since its introduction six years ago, Apple Final Cut Pro X has only offered the Color Board as its color correction/grading tool. That’s in addition to some automatic correction features and stylized “look” effects. The Color Board interface is based on color swatches and puck sliders, instead of traditional color wheels, leaving many users pining for something else. To answer this need, several third-party plug-in developers have created color corrector effects modules to fill the void. The newest of these is Chromatic from Coremelt – a veteran Final Cut plug-in developer.

The toolset

Chromatic is the most feature-rich color correction module currently available for FCPX. It offers four levels of color grading: inside a mask, outside a mask, the overall frame, and a final output correction. When you first apply the Chromatic Grade effect to a clip, you’ll see controls appear within the FCPX inspector window. These are the final output adjustments. To access the full toolset, you need to click on the Grade icon, which launches a custom UI. Like other grading tools that require custom interfaces, Chromatic’s grading toolset opens as a floating window. This is necessitated by the FCPX architecture, which doesn’t give developers the ability to integrate custom interface panels, like you’ll find in Adobe applications. To work around this limitation, developers have come up with various ingenious solutions, including floating UI windows, HUDs (heads up displays), and viewer overlays. Chromatic uses all of these approaches.

The Chromatic toolset includes nine correction effects, which can be stacked in any order onto a clip. These include lift/gamma/gain sliders, lows/mids/highs color wheels, auto white balance, replace color, color balance/temperature/exposure/saturation, three types of curves (RGB, HSL, and Lab), and finally, color LUTs. As you use more tools on a clip, these will stack into the floating window like layers. Click on any of these tools within the window to access those specific controls. Drag tools up or down in this window to rearrange the order of operation of Chromatic’s color correction processes. The specific controls work and look a lot like similar functions within DaVinci Resolve. This is especially true of HSL Curves, where you can control Hue vs. Sat or Hue vs. Luma.
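
For the curious, the lift/gamma/gain sliders are doing fairly simple per-channel math. Chromatic’s exact transfer function isn’t documented, so the following Python sketch shows only a common textbook formulation, assuming a 0–1 normalized signal:

    import numpy as np

    def lift_gamma_gain(rgb, lift=0.0, gamma=1.0, gain=1.0):
        # lift raises the shadows, gain scales the highlights,
        # and gamma bends the midtones of a 0-1 normalized image.
        out = rgb * gain + lift
        return np.clip(out, 0.0, 1.0) ** (1.0 / gamma)

    # Hypothetical frame: lift the blacks slightly and soften the mids.
    frame = np.random.rand(1080, 1920, 3).astype(np.float32)
    graded = lift_gamma_gain(frame, lift=0.02, gamma=1.1, gain=0.95)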

Masking with the power of Mocha

Corrections can be masked, in order to affect only specific regions of the image. If you select “overall”, then your correction will affect the entire image. But if you select “inside” or “outside” of the mask, then you can grade regions of the image independently of each other. Take, for example, a common on-camera interview situation with a darkened face in front of a brightly exposed exterior window. Once you mask around the face, you can then apply different correction tools and values to the face, as opposed to the background window. Plus, you can still apply an overall grade to the image, as well as final output adjustment tweaks with the sliders in the inspector window. That’s a total of four processes, with a number of correction tools used in each process.

To provide masking, Coremelt has leveraged its other products, SliceX and TrackX. Chromatic uses the same licensed Mocha planar tracker for fast, excellent mask tracking. In our face example, should the talent move around within the frame, simply use the tracker controls in the masking HUD to track the talent’s movement within the shot. Once tracked, the mask is locked onto the face.

Color look-up tables (LUTs)

When you purchase Chromatic, you’ll also get a LUT (color look-up table) browser and a default collection of looks. (More looks may be purchased from Coremelt.) The LUT browser is accessible within the grading window. I’m not a huge fan of LUTs, as these are most often a very subjective approach to a scene that simply doesn’t work equally well with all footage. Not all “bleach bypass” looks are equal. Chromatic’s LUT browser also enables access to any other LUTs you might have installed on your system, regardless of where they came from, as long as they are in the .cube format.
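
The .cube format itself is refreshingly plain: a few header keywords, then N³ rows of output RGB triples, with red varying fastest. As a rough illustration (this is not Chromatic’s implementation), a minimal reader and lookup in Python could look like the following; real graders use trilinear or tetrahedral interpolation rather than nearest-neighbor:

    import numpy as np

    def load_cube(path):
        # Parse a 3D .cube LUT: keyword lines, then size^3 RGB rows.
        size, rows = 0, []
        with open(path) as f:
            for line in f:
                line = line.strip()
                if line.startswith("LUT_3D_SIZE"):
                    size = int(line.split()[1])
                elif line and (line[0].isdigit() or line[0] == "-"):
                    rows.append([float(v) for v in line.split()])
        # Red varies fastest in the file, so axes come out as [b][g][r].
        return np.asarray(rows).reshape(size, size, size, 3)

    def apply_lut_nearest(rgb, lut):
        # Nearest-neighbor lookup on a 0-1 normalized image.
        n = lut.shape[0]
        idx = np.clip(np.rint(rgb * (n - 1)).astype(int), 0, n - 1)
        return lut[idx[..., 2], idx[..., 1], idx[..., 0]]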

LUTs get even more confusing with camera profiles, which are designed to expand flat-looking, log-encoded camera files into colorful Rec709 video. Under the best of circumstances, these are mathematically correct LUTs developed by the camera manufacturer, working as an inverse of the color transform applied as the image is recorded. But in many cases, commonly available camera profile LUTs don’t come from the manufacturers themselves; they are reverse-engineered to behave much like the manufacturer’s own LUT. They will look good, but might not yield identical results to a true camera LUT.

In the case of FCPX, Apple has built in a number of licensed camera manufacturer LUTs for specific brands. These are usually auto-detected and applied to the footage without appearing as an effect in the inspector. So, for instance, with ARRI Alexa footage that was recorded as Log-C, FCPX automatically adds a LogC-to-Rec709 LUT. However, if you disable that and then subsequently add Chromatic’s LogC-to-Rec709 LUT, you’ll see quite a bit of difference in gamma levels. Apple actually uses two of these LUTs – a 2D and a 3D cube LUT. Current Alexa footage defaults to the 3D LUT, but if you change the inspector pulldown to the regular LogC LUT, you’ll see similar gamma levels to what Chromatic’s LUT shows. I’m not sure if the differences are because the LUT isn’t correct, or whether it’s an issue of where, within the color pipeline, the LUT is being inserted. My recommendation is to stick with the FCPX default camera profile LUTs and then use the Chromatic LUTs for creative looks.
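
It helps to remember that a log curve is just published math, so gamma shifts like this usually come down to where, and how, that math gets applied. As a reference point, here is a Python sketch of ARRI’s LogC (v3) decode for EI 800 – the constants are transcribed from ARRI’s published curve, so treat them as assumptions to verify: a log segment above the cut point and a linear segment below it.

    import numpy as np

    # ARRI LogC v3 decode constants for EI 800 (from ARRI's published curve)
    CUT, A, B, C, D, E, F = (0.010591, 5.555556, 0.052272,
                             0.247190, 0.385537, 5.367655, 0.092809)

    def logc_to_linear(t):
        # Map LogC code values (0-1) back to scene-linear light.
        t = np.asarray(t, dtype=np.float64)
        return np.where(t > E * CUT + F,
                        (10.0 ** ((t - D) / C) - B) / A,
                        (t - F) / E)

    # 18% grey encodes to roughly 0.391 in LogC, so this prints ~0.18.
    print(logc_to_linear(0.391))

Apply that transform at the wrong stage of the color pipeline – before or after another gamma operation – and the midtones will visibly shift, which would be consistent with the differences described above.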

In use

Chromatic is a 1.0 product and it’s not without some birthing issues. One that manifested itself is a clamping issue with 2013 Mac Pros. Apparently this depends on which model of AMD D-series GPU your machine has. On some machines with the D-500 chips, video will clamp at 0 and 100, regardless of whether or not clamping has been enabled in the plug-in. Coremelt is working on a fix, so contact them for support if you have this or other issues.

Overall, Chromatic is well-behaved as custom plug-ins go. Performance is good and rendering is fast. Remember that each tool you use on a clip is like adding an additional effects filter – using all nine tools on a clip is like applying nine effects filters. Performance will depend on a lot of circumstances. For example, if you are working with 4K footage playing back from a fast NAS storage system, then it will take only a few applied tools before you start impacting performance. However, 1080p local media on a fast machine is much more forgiving, with very little performance impact during standard grading using a number of applied tools.

Coremelt has put a lot of work into Chromatic. To date, it’s the most comprehensive grading toolset available within Final Cut Pro X. It is like having a complete grading suite right inside of the Final Cut timeline. If you are serious about grading within the application and avoiding a roundtrip through DaVinci Resolve, then Chromatic is an essential plug-in tool to have.

©2017 Oliver Peters

Spice with Templates

One way in which Apple’s Final Cut Pro X has altered editing styles is through the use of effects built as Motion templates, using the common engine shared with Apple Motion. There are a number of developers marketing effects templates, but the biggest batch can be found at the FxFactory website. A regular development partner is idustrial Revolution, the brainchild of editor (and owner of FCP.co) Peter Wiggins. Wiggins offers a number of different effects packages, but the group marketed under the XEffects brand includes various templates that are designed to take the drudgery out of post, more so than just being eye-catching visual effects plug-ins.

XEffects includes several packages designed to be compatible with the look of certain styles of production, such as news, sports, and social media. These packages are only for FCPX and come with modifiable preset moves, so you don’t have to build complex title and video moves through a lot of keyframing. The latest is XEffects Viral Video, which is a set of moves, text, and banners that fit the style used today for trendy videos. The basic gist of these effects covers sliding or moving banners with titles, and the templates have been created to conform to both 16:9 and square video projects. In addition, there is a set of plug-ins to create simple automatic moves on images, which is helpful in animating still photos. Naturally, several title templates can be used together to create a stacked graphic design.

Another company addressing this market is Rampant Design Tools, with a series of effects templates for both Apple Final Cut Pro X and Adobe Premiere Pro CC. Their Premiere Pro templates include both effects presets and template projects. The effects presets can be imported into Premiere and become part of your arsenal of presets. For example, if you want to have text slide in from the side, blurred, and then resolve itself when it comes to rest – there’s a preset for that. Since these are presets, they are lightweight, as no extra media is involved.

The true templates are actually separate Premiere Pro template projects. Typically these are complex, layered, and nested timelines that let you create elaborate effects without the use of traditional plug-ins. These projects are designed to guide you easily to where your video should be placed, so no real compositing knowledge is needed. Rampant has done the hard part for you. As with any Premiere Pro project, you can import the final effects sequence into your active project, so there’s no need to touch the template project itself. However, these template projects do include media and aren’t as lightweight as the presets, so be mindful of your available hard drive space.

For Final Cut Pro X, Rampant has done much the same, creating both a set of installable Motion template effects, like vignette or grain, as well as more complex FCP X Libraries designed for easy and automatic use. As with the Premiere products, some of these Libraries contain media and are larger than others, so be mindful of your space.

Both of these approaches offer new options in the effects market. These developers give you plug-in style effects without actually coding a specific plug-in. This makes for faster development and less concern that a host application version change will break the plug-in. As with the rest of this new breed of effects, the cost is much lower than in the past and effects can be purchased a la carte, which enables you to tailor your editor’s tool bag to your immediate needs.

©2017 Oliver Peters

A Conversation with Thomas Grove Carter

The NAB Show is a great place to see the next level of media hardware and software. Even better, it’s also a great place to meet old friends, make new ones, and pick up the tips and tricks of your craft through the numerous tutorials, seminars, and off-site events that accompany the show.

This year I had the chance to interview Thomas Grove Carter, an editor at Trim Editing, which is a London-based creative editorial shop. He appeared at several sessions to present his techniques for maximizing the power of Final Cut Pro X. These sessions were moderated by Apple and FCPWORKS.

Thomas Grove Carter has a number of high-profile projects on his reel, including work for Honda, Game of Thrones, Audi, and numerous music artists. Carter is a familiar name in the Final Cut Pro X editing community. He first came to prominence with Honda’s “The Other Side” long-form web commercial. In it, Carter juxtaposes parallel day and night driving scenarios covering the main actor – dad by day, undercover police officer by night. On the interactive website, you can toggle in-sync between the two versions. Thanks to FCPX’s way of connecting clips and the nature of its magnetic timeline, Carter could use this then-young application to build the commercial, as well as preview the interactivity for the client – all on a very tight deadline.

I had the pleasure of sitting down with Carter in a semi-quiet corner of the NAB Press Room shortly after his Post Production World keynote session on Sunday evening.

____________________________________

[Oliver Peters]: We first started hearing your name when Honda’s “The Other Side” long-form commercial hit the web. That fit ideally with Final Cut Pro X’s unique ability to connect clips above and below the primary storyline on the timeline. Was that something you came up with intuitively?

[Thomas Grove Carter]: I knew that Final Cut Pro X was going to be good for this interactive piece. As you’re playing back in FCPX you can enable and disable layers. This meant I could actually do a rough preview of what it’s going to look like. I knew that I was going to have these two layers of video, but I didn’t exactly know what it was going to be until the edit, so I started to assemble each story separately. Then at some point, once I had each narrative roughly built, I put them both together on the same timeline and started adding the sound. From then on I was able to play it ‘interactively’ right inside FCPX. Back then, I split the day and night audio above and below the primary storyline. Today though, I’d probably assign a role for the day and a role for the night, because you can’t add audio-only clips above the primary storyline anymore. So that’s what I’d do to divide it out. All the audio and video still connects in exactly the same way – it just looks slightly different. Another great advantage of doing this in X was clip connections. For any given shot, there was the day and night version, and then, all the audio for the day and all the audio for the night. Just by grabbing the one clip in the primary and moving it or trimming it – everything for day and night – picture and audio – would move together.

[OP]: Tell me a bit about your relationship with Trim Editing.

[TGC]: There are three partners, who are the most senior three editors. Then there are four or five other main editors and two or three junior editors, plus a number of assistants and runners.

It’s been running over 12 years and I joined the team just over 4 years ago.

[OP]: Are all of you using Final Cut Pro X?

[TGC]: Originally, before anyone started using Final Cut Pro X, we had a mix of Avid and Final Cut Pro 7. Then we began to move to Avid as we saw that Final Cut Pro 7 was not going to be improved. So I started to move to Avid, too. But, I was using Final Cut Pro X on my own personal projects. I began to use it on smaller jobs and one of the other editors said, “That’s cool, that thing you’re doing there.” And he started to try it out. Now we’re kind of at a point where most of the editors are on Final Cut Pro X. One is using Avid, so our assistants need to be able to work with both.

[OP]: Have you been able to convert the last hold-out?

[TGC]: He’s always been Avid. That’s what he uses. The company doesn’t dictate what we use to edit with. It’s all about making the best work. If I decided tomorrow that I wanted to cut in Avid or Premiere – it wouldn’t be an issue. Anyone can cut with anything they like.

[OP]: Any thoughts of going to Premiere?

[TGC]: We’ve fallen in love with the way FCPX works – the browser and the timeline. I think Premiere is good, because it feels very much like a continuation of where Final Cut Pro 7 was, which is why loads of people have moved to it. I understand that. It’s an easy move. But it’s the core way that X functions that I love. That stuff just isn’t in any other NLE. What I’ve found with everyone who has moved to it, including myself – there were always a few little hooks that keep people coming back, even if you don’t like the whole app initially. For me, the first thing I liked is how you can pull out the audio clips and things move out of the way automatically. And I always just thought ‘I can’t make this thing work, but that feature is cool’. And then I kept coming back to it and slowly fell in love with the rest of it. One of the other editors loved the way of making dynamic selects in the browser and said, “I’m going to do this job in X.” He’d select in the browser using favorites and rejects and he absolutely loved it. Loved the way it was so fluid with the thumbnails and he felt immersed in his rushes. Then he gets to the timeline. “Oh, I can’t make this work.” He sent it back to Final Cut Pro 7 and finished up there. He did that on two or three jobs, because it takes time to get comfortable with the timeline. It’s strange when you come from track-based. But once it clicks, it’s amazing.

[OP]: How do your assistant editors fit into the workflow?

[TGC]: Generally I go from one job to the next. It might be two weeks or a month and a quick turnaround. Occasionally there might be an overlap – like, the next job has already started shooting and I haven’t finished the last one off yet. So it might be that I need an assistant editor to load my stuff. Or maybe I have to move on to the next job and I’ve got an assistant doing final tweaks on the last one. It’s much simpler to load projects in X than it is in Avid and one thing I’ve heard in the industry is, “Oh, does that mean you’re going to fire a lot of assistants, because you don’t need them?” No! Of course, we’re going to employ them, but we’ll actually give them editing work to do whenever we can – not just grunt work. Let them do the cut-downs, versions, first assemblies. There’s more time now for them to be doing creative work.

We also try to promote from within. I was the first person who was hired from outside of the company. Almost all the other editors, apart from the partners, have been people who’ve moved up from within. Yes, we could be paying this assistant to be loading all our stuff and making QuickTimes. But if you can be paying the assistant and they can be doing another job, why wouldn’t you do that? It’s another revenue stream for the company. So it’s great to be able to get them up to a level where they can pick up work and build up their own reels and creative chops.

[OP]: Are you primarily working with proxy media?

[TGC]: Not ‘Final Cut Pro X proxy media’, but we use ProRes Proxy or LT files, which are often transcoded by a DIT on set. They look great, but the post house always goes back to the camera originals for the grade. Sometimes if it’s a smaller job – a low budget music video, for example – I’ll get the ARRI files if they’re shooting ProRes and just take them into Final Cut straight away – just to get working quicker.

[OP]: Since you work in the area of high-end commercials, do you typically send out audio, color and effects to outside post facilities?

[TGC]: Sound and post work is finished off elsewhere. We work with all the big post facilities – The Mill, Framestore, and MPC, for example. The directors we work with have their favorite colorists. They’re hiring them because they have the right eye, the right creative skills – not just because they can push the buttons. But we’re doing more and more in the offline now. Clients aren’t used to seeing things as ‘offline’ these days. They’re used to things looking slick. I do a lot of sound design, because it goes so hand in hand with the picture edit. Sometimes the picture doesn’t work without any of the sound, so I do quite a lot of it – get it sounding really great, but it will ultimately be remixed later. I might be working on a project for a month and the sound becomes a very integral creative element. And then the sound mixer only gets a day to pull it all together. They do a great job, but it’s really important to give them as much as we can to work with – to really set the creative direction of the audio.

[OP]: In your presentations, you’ve mentioned Trim’s light hardware footprint. How is the facility configured?

[TGC]: Well, we’ve got ‘cylinder’ Mac Pros, Retina iMacs, and more recently we’ve been trying out a few of the new MacBook Pros, alongside the LG 5K displays. I’ve actually been cutting with that setup a lot recently. I really like it, because I turn up at the suite with my laptop, plug two cables in and that’s it! One cable for the 5K display, power, and audio. The second cable goes out to HDMI. It runs the client monitor (HD/4K TV) and a USB hub. It’s a really slick and flexible setup.

For storage, we’re currently using Samsung T3 SSD drives, which are so fast and light, they can handle most things we throw at them. But with a few potential feature films in the near future, we are looking again at shared storage. I think that’s an interesting area of the market these days. There are some really amazing new products, which don’t come from the same old vendors.

[OP]: How do clients react to this modular suite approach?

[TGC]: If we’re doing our jobs, clients shouldn’t really notice the tech we’re using to drive the edit. And people love the space we’ve created. We’ve got really nice rooms – none of our suites are small. Clients are looking at a 50″ to 60″ TV, which is 4K in some of our suites. And we’ve got really great sound systems. So, in terms of what clients are seeing and hearing, it doesn’t get much better in an edit suite.

Sometimes directors will come by even when they’re not editing with us. They’ll come by and write their treatments and just hang out, which is really nice. There’s a lot of common space with areas to work and meet.

There’s a lot of art all over the place and when anyone sees a sign that has the word ‘trim’ in it – they buy it. It might be a street sign or a ‘trim something’ logo. So, you see these signs all over the building. It adds a really nice character to the place. When I joined the company, I wanted to bring something to it – and I love LEGO – so I built our logo using it. That’s mounted at our entrance now.

[OP]: There’s a certain mentality in working with agencies. How does Trim approach that?

[TGC]: We tend to focus on the directors. That’s where you develop the greatest relationships, which is where the best work comes from. Not that I dislike working with an agency, but you build a much closer creative bond with your directors.

One small way we help build a good working environment for directors and agencies is to all have lunch together, every single day. We have lunch together rather than editing and eating at our desks. One of the great things about this is that directors get to meet other agencies and editors get to meet other directors. It’s really good to be able to socialize like that. It also helps build different relationships than would ever happen if we were all locked away in a suite all day.

[OP]: At what point do you typically get involved with a job?

[TGC]: I’ll usually get pencilled in on a job while the director is still pitching it. And then I’ll start work straight after the shoot. Occasionally we’ll be on set, but only if it’s a really tight deadline. On that Honda job, there was a six-day shoot to make two 2 1/2-minute films and they needed to see it really soon after the shoot. So, I had to be on set. But typically I like not being on set, because when you’re on set you’re suddenly part of the, “Oh, this shot was amazing. It took us four hours to get in the pouring rain.” You’re invested in that baggage. Whereas, when you just view it coldly in the edit, you don’t know what happened on set. You can go, “This shot doesn’t work – let’s lose it.” That fresh vision is a great reason for the editor to be as far from a shoot as possible.

[OP]: One of the projects on your reel is a Game of Thrones promo. How did that job come your way?

[TGC]: That was actually a director I hadn’t worked with – but, just a director who wanted to work with me. He’d been trying to get me on a few jobs that I hadn’t been able to do. It was an outside director that HBO brought in to shoot. It wasn’t a trailer made of footage from the show. They brought in a commercials and music director to shoot the piece and he wanted to work with me. So, it came down like that and then I worked with him and HBO to bring it all together.

[OP]: Do you have any preferences for the types of projects you work on?

[TGC]: Things like the Audi commercial are really fun, because there’s a lot of sound design. A lot of commercials are heavily storyboarded, but it can often be more satisfying if the director has been a bit more loose in the filming. It might be a montage of different people doing activities, for example. And those can be quite fun, because the final thing – you’ve come up with it and you’ve created the narrative and the flow of it. I say that with hindsight, because they turn out to be the most creatively satisfying. But, the process can be much harder when you’re in the thick of it – because it’s on your shoulders and you haven’t got a really locked storyboard to fall back on. I’ll happily do really long hours and work really hard, if it’s a good bit of work – and, at the end of the day, I’ve worked with nice people.

[OP]: With Final Cut Pro X – anything that you’d like to see different?

[TGC]: Maybe collaboration is one thing that would be interesting to see if there’s a new and interesting take on it. Avid bin-locking is great, but actually when you boil it down, it’s quite a simple thing. It locks this bin, you can’t go in there. You can make a copy of it. That’s all it’s doing, but it’s simple and it works really well. All the cloud-based things I’ve seen so far – they’ve not really gotten me excited. I don’t feel like anyone has really nailed what that is yet. Everyone is just doing it because they can, not because it works really well, or is actually useful. I’d be interested to see if there’s something that can be done there.

In the timeline, I’d like to be able to look inside compound clips without stepping into them. I often use compound clips to combine sound effects or music stems. I’d like to be able to open them in context in the timeline and edit the contents inline with the master timeline. And I’d love some kind of dupe detection in the timeline. But otherwise, I’m really enjoying the new version.

Click this link to watch Thomas Grove Carter in action with FCPX at this year’s Las Vegas SuperMeet at NAB.

____________________________________

I certainly appreciated the time Thomas Grove Carter spent with me to do this interview. Along with a few other interviews, it made for a better-than-average Vegas trip. As a side note, I recorded my interviews (for transcription only) on my iPad, with the aid of the Apogee MetaRecorder app. It works with iPhones and iPads and is free to start; however, you should spend the $4.99 for the in-app upgrade to be able to do anything useful with it. It can use the built-in mic and records full-quality WAV audio files – and it features a connection to FCPX via fcpxml. Finally, to aid in generating a text transcript, I used Digital Heaven’s SpeedScriber. Although still in beta, it worked well for what I needed. As with all audio-to-text transcription applications, there’s no such thing as perfect. I did need to do a fair amount of clean-up, but that’s not uncommon.

©2017 Oliver Peters