A Light Footprint

When I started video editing, the norm was an edit suite with three large quadruplex (2”) videotape recorders, video switcher, audio mixer, B&W graphics camera(s) for titles, and a computer-assisted, timecode-based edit controller. This was generally considered an “online edit suite”, but in many markets, this was both “offline” (creative cutting) and “online” (finishing). Not too long thereafter, digital effects (ADO, NEC, Quantel) and character generators (Chyron, Aston, 3M) joined the repertoire. 2” quad eventually gave way to 1” VTRs and those, in turn, were replaced by digital – D1, D2, and finally Digital Betacam. A few facilities with money and clientele migrated to HD versions of these million-dollar rooms.

Towards the midpoint of this era, nonlinear editing took hold. After a few different contenders had their day in the sun, the world largely settled on Avid and/or Media 100 rooms. While a lower-cost commitment than the large online bays of the day, these nonlinear editing (NLE) bays still required custom-configured Macs, a fair amount of external storage, and proprietary hardware and monitoring to see a high-quality video image. Though crude at first, NLEs eventually proved capable of handling all video needs, including HD-quality projects and even higher resolutions today.

The trend towards smaller systems

As technology advanced, computers became faster and more powerful, storage capacities increased, and software that required custom hardware evolved to work in a software-only mode. Today, it’s possible to operate with a fraction of the cost, equipment, and hassle of just a few years ago, let alone a room from the mid-70s. As a result, when designing or installing a new room, it’s important to question the assumptions about what makes a good edit bay configuration.

For example, today I frequently work in rooms running newer iMacs, 2013 Mac Pros, and even MacBook Pro laptops. These are all perfectly capable of running Apple Final Cut Pro X, Adobe Premiere Pro, Avid Media Composer, and other applications, without the need for additional hardware. In my interview with Thomas Grove Carter, he mentioned often working off of his laptop with a connected external drive for media. And that’s at Trim, a high-end London commercial editing boutique.

In my own home edit room, I recently set aside my older Mac Pro tower in favor of working entirely with my 2015 MacBook Pro. No more need to keep two machines synced up and the MBP is zippier in all respects. With the exception of some heavy-duty rendering (infrequent), I don’t miss using the tower. I run the laptop with an external Dell display and have configured my editing application workspaces around a single screen. The laptop is closed and parked in a BookArc stand tucked behind the Dell. But I also bought a Rain stand for those times when I need the MBP open and functioning as a second display.

Reduce your editing footprint

I find more and more editors working in similar configurations. For example, one of my clients is a production company with seven networked (NAS storage) workstations. Most of these are iMacs with few other connected peripherals. The main room has a 2013 “trash can” Mac Pro and a bit more gear, since this is the “hero” room for clients. If you are looking to downsize your editing environment, here are some pointers.

While you can work strictly from a laptop, I prefer to build it up for a better experience. Essential for me is a Thunderbolt dock. Check out OWC or CalDigit for two of the best options. The computer connects to the dock and everything else connects to that dock: one Thunderbolt cable to the laptop, plus power for the computer, leaving you with a clean installation and an easy-to-move machine. From the dock, I’m running a PreSonus AudioBox USB audio interface (to a Mackie mixer and speakers), a Time Machine backup drive, a G-Tech media drive, and the Dell display. If I were to buy something different today, I would use the Mackie Onyx Blackjack interface instead of the PreSonus/Mackie mixer combo. The Blackjack is an all-in-one solution.

Expand your peripherals as needed

At the production company’s hero room, we have the extra need to drive video monitors for color correction and client viewing. That room is configured similarly to the setup described above, except with a Mac Pro and a connection to a QNAP shared storage solution. The latter connects over 10Gb/s Ethernet via a Sonnet Thunderbolt/Ethernet adapter.

When we initially installed the room, video to the displays was handled by a Blackmagic Design UltraStudio device. However, we had a lot of playback performance issues with the UltraStudio, especially when using FCPX. After some experimenting, we realized that both Premiere Pro and FCPX can send a fullscreen, generally color-accurate signal to the wall-mounted flat panel using only HDMI and no other video I/O hardware. We ended up connecting the HDMI from the dock to the display and that’s the standard working routine when we are cutting in either Premiere Pro or Final Cut.

The rub for us is DaVinci Resolve. You must use some type of Blackmagic Design hardware product in order to get fullscreen video to a display when in Resolve. Therefore, the UltraStudio’s HDMI port connects to the second HDMI input of the large client display and SDI feeds a separate TV Logic broadcast monitor. This is for more accurate color rendition while grading. With Media Composer, there were no performance issues, but it wants to route the audio and video signal through the same device. So when we edit in Avid, the signal chain also runs through the UltraStudio.

All of this means that in today’s world, you can work as lightly as you like. Laptop-only – no problem. iMac with some peripherals – no problem. A fancy, client-oriented room – still less hassle and cost than just a few short years ago. Load it up with extra control surfaces or stay light with a keyboard, mouse, or tablet. It all works today – pretty much as advertised. Gone are the days when you absolutely needed to drop a small fortune to edit high-quality video. You just have to know what you are doing and understand the trade-offs as they arise.

©2017 Oliver Peters

Adobe’s Late-2017 Creative Cloud Updates

According to Cisco, 82% of internet traffic will be video by 2021. Adobe believes over 50% of that will be produced video and not just simple user content. This means producers will be expected to produce more – working faster and smarter. In the newest Creative Cloud update, Adobe has focused on just such workflow improvements. These were previewed at IBC and will be released later this year.

Adobe Premiere Pro CC

With this release, Adobe has finally enabled the ability to have more than one project file open at the same time. You can move clips and sequences between open projects. In addition, projects can be locked by the user, making Premiere Pro the first NLE to enable multiple open projects and locking within a single application. Adobe has also expanded project types to include both Team Projects (your project is in the cloud) and shared projects (your project is local). The latter is ideal for SAN/NAS environments and adds Avid-style collaboration.

Editors will enjoy specific timeline enhancements, like “close all gaps” and up to 16 label colors. The Essential Graphics panel gets some love with font filtering and a visual font preview window. Graphics templates will now include a minimum duration, so that these clips can be extended on the timeline, while leaving the fade-in and fade-out constant.

Adobe is doubling down on VR using its acquired Skybox technology. New are 19 immersive effects and transitions specific to VR projects. These are needed to properly seam wraparound edges when effects are added to VR clips. They are all GPU-only effects; however, as some VR clips can be 5K wide and larger, performance can be challenging. Nevertheless, Adobe reports decent performance with 6K VR clips at half-resolution on hardware like the HP Z820 or the 2017 15” MacBook Pro. There is also an immersive playback viewer designed for HMDs (head-mounted displays). It will display the image along with the Premiere Pro timeline window.
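
To see why that seam handling matters, here is a minimal Python/SciPy sketch (my own illustration, not Adobe’s implementation): a blur applied to an equirectangular 360 frame must treat the left and right edges as neighbors, or a visible seam appears once the image is wrapped onto a sphere.

```python
import numpy as np
from scipy.ndimage import uniform_filter

frame = np.random.rand(1080, 1920, 3)  # stand-in for an equirectangular 360 frame

# Naive horizontal blur: the left and right borders are padded independently,
# so they no longer match and a seam shows up on the sphere.
seamed = uniform_filter(frame, size=(1, 15, 1), mode='nearest')

# Seam-aware blur: horizontal wraparound treats column 0 and the last column
# as adjacent, just as they are in the 360 projection.
seamless = uniform_filter(frame, size=(1, 15, 1), mode='wrap')
```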

Premiere Pro’s non-VR editing updates, including shared projects, are explained well by the reTooled blog (video here).

Adobe Audition

Audition is the place to finalize your Premiere Pro mix, so a new auto-ducking mix tool has been added. This is based on Sensei, Adobe’s umbrella name for its artificial intelligence technologies. To use auto-ducking, the editor simply has to adjust sensitivity, amount of reduction, and fades, and then let Audition do the rest. Using this AI, it detects pauses in the dialogue and adjusts the music volume accordingly.
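
For a sense of what a ducking tool like this does under the hood, here is a simplified Python sketch. It is not Adobe’s Sensei implementation, just a generic envelope-follower approach; the function and parameter names are hypothetical.

```python
import numpy as np

def duck_music(dialogue, music, sr, threshold_db=-40.0,
               reduction_db=-12.0, fade_s=0.25, win_s=0.05):
    """Duck the music under the dialogue. Both tracks are mono float arrays
    of equal length at sample rate sr."""
    win = max(1, int(sr * win_s))
    # Short-term RMS envelope of the dialogue, expressed in dB.
    padded = np.pad(dialogue.astype(np.float64) ** 2, (win // 2, win - win // 2 - 1))
    rms = np.sqrt(np.convolve(padded, np.ones(win) / win, mode='valid'))
    level_db = 20.0 * np.log10(rms + 1e-12)

    # Full reduction while dialogue is present, unity gain during pauses.
    target = np.where(level_db > threshold_db, 10.0 ** (reduction_db / 20.0), 1.0)

    # Smooth the gain curve so the duck fades in and out rather than jumping.
    fade = max(1, int(sr * fade_s))
    gain = np.convolve(target, np.ones(fade) / fade, mode='same')
    return music * gain
```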

Other Audition enhancements include a timeline timecode overlay for the video viewer, the ability to simultaneously adjust dual-sided fades on clips, and new record and punch-in preferences for ADR work (“looping”).

After Effects

Here’s another example of this focus on time-savings. After Effects gains a new start-up window to set up the first composition. It also gains a keyboard command editor and, in this release, the same font-previewing tools as Premiere Pro. The biggest new feature is an expansion of the expression controls. These will be tied to data files for the quick updating of template graphics. If you create a graphic – such as a map of the US with certain information displayed by colors for each state – and it’s based on a template tied to data, then changing the underlying data will automatically update the graphic. Other enhancements include GPU acceleration for third-party plug-ins that use the Mercury Playback Engine.

Character Animator

This live-capture, cartoon animation tool finally comes out of beta. A new feature lets you adjust the responsiveness of the animation tracking, which permits live animation to look more hand-drawn. Actions can now be triggered by MIDI control panels. Triggers are editable in the timeline with a waveform for better lip-sync matching.

There’s plenty of good user news, too, including the release of 6 Below, an ultra-wide film designed for the Barco three-screen format. It was edited by Vashi Nedomansky using Premiere Pro. Other Premiere Pro news includes the dramatic feature film, Only The Brave, edited by Bill Fox, and Coup 53, a documentary in post being cut by Walter Murch. Both of these noted editors have been using Premiere Pro.

For more in-depth info, check out these links for a solid overview of Adobe’s soon-to-come Creative Cloud application updates:

ProVideo Coalition – Scott Simmons

Premiere Bro blog

Adobe’s own Digital Video & Audio blog

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters

Chromatic

Since its introduction six years ago, Apple Final Cut Pro X has only offered the Color Board as its color correction/grading tool. That’s in addition to some automatic correction features and stylized “look” effects. The Color Board interface is based on color swatches and puck sliders, instead of traditional color wheels, leaving many users pining for something else. To answer this need, several third-party plug-in developers have created color correction effects modules to fill the void. The newest of these is Chromatic from Coremelt – a veteran Final Cut plug-in developer.

The toolset

Chromatic is the most feature-rich color correction module currently available for FCPX. It offers four levels of color grading: inside a mask, outside a mask, the overall frame, and a final output correction. When you first apply the Chromatic Grade effect to a clip, you’ll see controls appear within the FCPX inspector window. These are the final output adjustments. To access the full toolset, you need to click on the Grade icon, which launches a custom UI. Like other grading tools that require custom interfaces, Chromatic’s grading toolset opens as a floating window. This is necessitated by the FCPX architecture, which doesn’t give developers the ability to integrate custom interface panels, like you’ll find in Adobe applications. To work around this limitation, developers have come up with various ingenious solutions, including floating UI windows, HUDs (heads-up displays), and viewer overlays. Chromatic uses all of these approaches.

The Chromatic toolset includes nine correction effects, which can be stacked in any order onto a clip. These include lift/gamma/gain sliders, lows/mids/highs color wheels, auto white balance, replace color, color balance/temperature/exposure/saturation, three types of curves (RGB, HSL, and Lab), and finally, color LUTs. As you use more tools on a clip, these will stack into the floating window like layers. Click on any of these tools within the window to access those specific controls. Drag tools up or down in this window to rearrange the order of operation of Chromatic’s color correction processes. The specific controls work and look a lot like similar functions within DaVinci Resolve. This is especially true of HSL Curves, where you can control Hue vs. Sat or Hue vs. Luma.
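
For those curious what sliders like lift/gamma/gain actually compute, here is one common formulation as a Python sketch. Chromatic’s internal math isn’t published, so treat this purely as a generic illustration of the idea, including how stacked tools are simply applied in order.

```python
import numpy as np

def lift_gamma_gain(x, lift=0.0, gamma=1.0, gain=1.0):
    """One common lift/gamma/gain formula, per channel, on values normalized 0..1."""
    x = np.clip(x, 0.0, 1.0)
    lifted = x + lift * (1.0 - x)               # lift raises the blacks the most
    scaled = np.clip(lifted * gain, 0.0, 1.0)   # gain scales the whites the most
    return scaled ** (1.0 / gamma)              # gamma bends the midtones

# Stacking tools, as in Chromatic's floating window, amounts to applying the
# corrections in sequence; rearranging the stack changes the result.
```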

Masking with the power of Mocha

Corrections can be masked, in order to affect only specific regions of the image. If you select “overall”, then your correction will affect the entire image. But if you select “inside” or “outside” of the mask, then you can grade regions of the image independently of each other. Take, for example, a common, on-camera interview situation with a darkened face in front of a brightly exposed exterior window. Once you mask around the face, you can then apply different correction tools and values to the face, as opposed to the background window. Plus, you can still apply an overall grade to the image, as well as final output adjustment tweaks with the sliders in the inspector window. That’s a total of four processes, with a number of correction tools used in each process.
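
Conceptually, inside/outside grading is a mask-weighted blend of two separately corrected versions of the frame, with an overall trim applied on top. The Python sketch below is my own simplified illustration, not Chromatic’s actual processing; all names are hypothetical.

```python
import numpy as np

def masked_grade(img, mask, inside_fn, outside_fn, overall_fn=lambda x: x):
    """img: HxWx3 floats in 0..1; mask: HxW floats in 0..1 (1 = inside the mask)."""
    m = mask[..., None]                                  # broadcast the mask over RGB
    graded = inside_fn(img) * m + outside_fn(img) * (1.0 - m)
    return np.clip(overall_fn(graded), 0.0, 1.0)

# Example: brighten the masked face while pulling down the bright window behind it.
# out = masked_grade(frame, face_mask,
#                    inside_fn=lambda x: x * 1.4,
#                    outside_fn=lambda x: x * 0.8)
```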

To provide masking, Coremelt has leveraged its other products, SliceX and TrackX. Chromatic uses the same licensed Mocha planar tracker for fast, excellent mask tracking. In our face example, should the talent move around within the frame, simply use the tracker controls in the masking HUD to track the talent’s movement within the shot. Once tracked, the mask is locked onto the face.

Color look-up tables (LUTs)

When you purchase Chromatic, you’ll also get a LUT (color look-up table) browser and a default collection of looks. (More looks may be purchased from Coremelt.) The LUT browser is accessible within the grading window. I’m not a huge fan of LUTs, as these are most often a very subjective approach to a scene that simply doesn’t work with all footage equally well. Not all “bleach bypass” looks are equal. Chromatic’s LUT browser also enables access to any other LUTs you might have installed on your system, regardless of where they came from, as long as they are in the .cube format.
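
For context, a .cube file is essentially a table of output RGB values sampled on a grid of input values. The minimal Python sketch below (my own illustration) reads such a table and applies it with nearest-neighbor lookup; real implementations interpolate between grid points and handle more of the format’s options.

```python
import numpy as np

def load_cube(path):
    """Parse a basic 3D .cube LUT: returns (table, size). Skips unhandled keywords."""
    size, rows = None, []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith('#'):
                continue
            if line.startswith('LUT_3D_SIZE'):
                size = int(line.split()[-1])
            elif line[0].isdigit() or line[0] in '+-.':
                rows.append([float(v) for v in line.split()[:3]])
    # .cube data order: red varies fastest, then green, then blue.
    return np.array(rows).reshape(size, size, size, 3), size

def apply_lut(img, lut, size):
    """img: HxWx3 floats in 0..1. Nearest-neighbor lookup for brevity."""
    idx = np.clip(np.round(img * (size - 1)).astype(int), 0, size - 1)
    return lut[idx[..., 2], idx[..., 1], idx[..., 0]]   # index as [blue, green, red]
```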

LUTs get even more confusing with camera profiles, which are designed to expand flat-looking, log-encoded camera files into colorful Rec709 video. Under the best of circumstances, these are mathematically correct LUTs developed by the camera manufacturer. They work as an inverse of the color transforms applied as the image is recorded. But in many cases, commonly available camera profile LUTs don’t come from the manufacturers themselves, but are actually reverse-engineered to approximate the manufacturer’s own LUT. They will look good, but might not yield identical results to a true camera LUT.
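
As a concrete example of such a transform, ARRI publishes the math behind its LogC (v3) curve. The sketch below uses the commonly cited EI 800 parameters to convert a LogC code value back to scene-linear light; note that a complete LogC-to-Rec709 conversion also applies a color matrix and a display transform on top of this step, which is one place where reverse-engineered LUTs can drift from the manufacturer’s own.

```python
def logc_ei800_to_linear(t):
    """Decode one LogC (v3, EI 800) code value (0..1) to scene-linear light.
    Parameters are the commonly published ARRI values; treat as a reference sketch."""
    cut, a, b = 0.010591, 5.555556, 0.052272
    c, d, e, f = 0.247190, 0.385537, 5.367655, 0.092809
    if t > e * cut + f:
        return (10 ** ((t - d) / c) - b) / a
    return (t - f) / e
```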

In the case of FCPX, Apple has built in a number of licensed camera manufacturer LUTs for specific brands. These are usually auto-detected and applied to the footage without appearing as an effect in the inspector. So, for instance, with ARRI Alexa footage that was recorded as Log-C, FCPX automatically adds a LogC-to-Rec709 LUT. However, if you disable that and then subsequently add Chromatic’s LogC-to-Rec709 LUT, you’ll see quite a bit of difference in gamma levels. Apple actually uses two of these LUTs – a 2D and a 3D cube LUT. Current Alexa footage defaults to the 3D LUT, but if you change the inspector pulldown to the regular LogC LUT, you’ll see similar gamma levels to what Chromatic’s LUT shows. I’m not sure if the differences are because the LUT isn’t correct, or whether it’s an issue of where, within the color pipeline, the LUT is being inserted. My recommendation is to stick with the FCPX default camera profile LUTs and then use the Chromatic LUTs for creative looks.

In use

Chromatic is a 1.0 product and it’s not without some birthing issues. One that manifested itself is a clamping issue with 2013 Mac Pros. Apparently this depends on which model of AMD D-series GPU your machine has. On some machines with the D-500 chips, video will clamp at 0 and 100, regardless of whether or not clamping has been enabled in the plug-in. Coremelt is working on a fix, so contact them for support if you have this or other issues.

Overall, Chromatic is well-behaved as custom plug-ins go. Performance is good and rendering is fast. Remember that each tool you use on a clip is like adding an additional effects filter. Using all nine tools on a clip is like applying nine effects filters. Performance will depend on a number of factors. For example, if you are working with 4K footage playing back from a fast NAS storage system, then it will take only a few applied tools before you start impacting performance. However, 1080p local media on a fast machine is much more forgiving, with very little performance impact during standard grading using a number of applied tools.

Coremelt has put a lot of work into Chromatic. To date, it’s the most comprehensive grading toolset available within Final Cut Pro X. It is like having a complete grading suite right inside of the Final Cut timeline. If you are serious about grading within the application and avoiding a roundtrip through DaVinci Resolve, then Chromatic is an essential plug-in tool to have.

©2017 Oliver Peters