Five Adobe Workflow Tips

Subscribers to Adobe Creative Cloud have a whole suite of creative tools at their fingertips, yet many users overlook some of the less promoted features. Here are five quick tips for your workflow.

Camera Raw. Photographers know that the Adobe Camera Raw module is used to process camera raw images, such as .cr2 files. It’s a “develop” module that opens first when you import a camera raw file into Photoshop. It’s also used in Bridge and Lightroom. Many people use Photoshop for photo enhancement – working with the various filters and adjustment layer tools available. What may be overlooked is that you can use the Camera Raw Filter in Photoshop on any photo, even if the file is not raw, such as a JPEG or TIFF.

Select the layer containing the image and choose the Camera Raw Filter. This opens the image in that separate “develop” module, where you have all the photo and color enhancement tools in a single, comprehensive toolkit – the same as in Lightroom. Once you’re done and close the Camera Raw Filter, those adjustments are “baked” into the image on that layer.

Remix. Audition is a powerful digital audio workstation application that many use in conjunction with Premiere Pro or on its own for audio productions. One feature it has over Premiere Pro is the ability to use AI to automatically edit the length of music tracks. Let’s say you have a music track that’s 2:47 in length, but you want a :60 version to underscore a TV commercial. Yes, you could edit it manually, but Audition’s Remix turns this into an “automagic” task. It’s especially useful for projects where specific parts of the song don’t have to hit specific visuals.

Open Audition, create a multitrack session, and place the music selection on any track in the timeline. Right-click the selection and enable Remix. Within the Remix dialog box, set the target duration and parameters – for example, short versus long edits. Audition will calculate the number and location of edit points needed to seamlessly shorten the track to approximately the desired length.

Audition attempts to create edits at points that are musically logical. You won’t necessarily get an exact duration, since the value you entered is only a target. This is even more true with tracks that have a long musical fade-out, so a little experimentation may be needed. For example, a target value of :59 will often yield significantly different results than a target of 1:02, thanks to the recalculation. Audition’s Remix isn’t perfect, but it will get you close enough that only minimal additional work is required. Once you are happy, bounce out the shortened track to bring into Premiere Pro.

Photoshop Batch Processing. If you want to add interesting stylistic looks to a clip, then effects filters in Premiere Pro and/or After Effects usually fit the bill. Or you can go with expensive third party options like Continuum Complete or Sapphire from Boris FX. However, don’t forget Photoshop, which includes many stylized looks not offered in either of Adobe’s video applications, such as specific paint and brush filters. But, how do you apply those to a video clip?

The first step is to turn your clip into an image sequence using Adobe Media Encoder. Then open a representative frame in Photoshop to define the look. Create a Photoshop action using the filters and settings you desire. Save the action, but not the image. Then create a batch function to apply that stored action to the clean frames within the image sequence folder. The batch operation will automatically open each image, apply the effects, and save the stylized results to a new destination folder.
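
If you’d rather script that batch step outside of Photoshop, the same idea can be roughed out in Python with the Pillow imaging library. This is only a sketch of the concept, not an Adobe workflow: the ModeFilter stands in for whatever Photoshop action you build, and the folder names are placeholders.

```python
# Apply a painterly filter to every frame of an image sequence and save the
# results to a new folder - a rough stand-in for a Photoshop batch action.
import os
from PIL import Image, ImageFilter  # pip install Pillow

SRC = "frames_clean"      # frames exported by Adobe Media Encoder (placeholder)
DST = "frames_stylized"   # destination folder for the processed frames
os.makedirs(DST, exist_ok=True)

for name in sorted(os.listdir(SRC)):
    if not name.lower().endswith((".png", ".tif", ".tiff", ".jpg")):
        continue
    frame = Image.open(os.path.join(SRC, name))
    styled = frame.filter(ImageFilter.ModeFilter(size=9))  # crude "oil paint" look
    styled.save(os.path.join(DST, name))
```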

Open that new image sequence in any app that supports image sequences (including QuickTime) and save it as a ProRes (or other) movie file. Stylized effects, like oil paint, are applied to individual frames and will vary with the texture and lighting of each frame, so the effect takes on an animated appearance in the stitched movie.

After Effects for broadcast deliverables. After Effects is the proverbial Swiss Army knife for editors and designers. It’s my preferred conversion tool when I have 24p masters that need to be delivered as 60i broadcast files.

Import a 23.98 master and place it into a new composition. Scale, if needed (UHD to HD, for instance). Send to the Render Queue. Set the frame rate to 29.97, field render to Upper (for HD), and enable pulldown (any whole/split frame cadence is usually OK). Turn off Motion Blur and Frame Blending. Render for a proper interlaced broadcast deliverable file.
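
If After Effects isn’t handy, a comparable 23.98p-to-29.97i conversion can be approximated with ffmpeg. This is not the After Effects recipe above, just a hedged sketch assuming ffmpeg is installed; check field order and cadence against your delivery spec before trusting it.

```python
# Approximate a 23.98p -> 29.97i (upper field first) pulldown with ffmpeg.
# Assumes ffmpeg is installed; verify the result on a scope before delivery.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "master_2398.mov",
    "-vf", "telecine=first_field=top:pattern=23",  # 2:3 pulldown, top field first
    "-c:v", "prores_ks", "-profile:v", "3",        # ProRes 422 HQ
    "-c:a", "pcm_s16le",
    "broadcast_2997i.mov",
], check=True)
```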

Photoshop motion graphics. One oft-ignored (or forgotten) feature of Photoshop is that you can do layer-based video animation and editing inside it. Essentially there’s a very rudimentary version of After Effects within Photoshop. While you probably wouldn’t choose it over After Effects or Premiere Pro for editing video, Photoshop does have value for creating animated lower thirds and other titles.

Photoshop provides much better text and graphic style options than Premiere Pro. The files are more lightweight than an After Effects comp on your Premiere timeline – or rendering animated ProRes 4444 movies. Since it’s still a Photoshop file (albeit a special version), the “edit in original” command opens the file in Photoshop for easy revisions. Let’s say you are working on a show that has 100 lower thirds that slide in and fade out. These can easily be prepped for the editor by the graphics department in Photoshop – no After Effects skills required.

Create a new file in Photoshop, turn on the timeline window, and add a new blank video layer. Add a still onto a layer for positioning reference, delete the video layer, and extend the layers and timeline to the desired length. Now build your text and graphic layers. Keyframe changes to opacity, position, and other settings for animation. Delete the reference image and save the file. This is now a keyable Photoshop file with embedded animation properties.

Import the Photoshop file into Premiere with Merged Layers. Add to your timeline. The style in Premiere should match the look created in Photoshop. It will animate based on the keyframe settings created in Photoshop.

©2021 Oliver Peters

Project organization

Leading into the new year, it’s time to take a fresh look at a perennial subject. Whether you work as a solo editor or as part of a team, having a plan for organizing your projects – along with a workflow for moving media through your system – makes it far easier to find and restore material at a future date. For my day-to-day workflow, I rely on five standard applications: Post Haste, Hedge, Better Rename, DiskCatalogMaker, and Kyno. I work on Macs, but there are Windows versions or alternatives for each.

Proper project organization. Regardless of your NLE, it’s a good idea to create a project “silo” for each job on your hard drive, RAID, or networked storage (NAS). That’s a main folder for the job, with subfolders for the edit project files, footage, audio, graphics, documents, exports, etc. I use Post Haste to create a new set of project folders for each new project.

Post Haste uses default or custom templates that can include Adobe project files. This provides a common starting point for each new project based on a template that I’ve created. Using this template, Post Haste generates a new project folder with common subfolders. A template Premiere Pro project file with my custom bin structure is contained within the Post Haste template. When each new set of folders is created, this Premiere file is also copied.

In order to track productions, each job is assigned a number, which becomes part of the name structure assigned within Post Haste. The same name is applied to the Premiere Pro project file. Typically, the master folder (and Premiere project) for a new job created through Post Haste will be labelled according to this schema: 9999_CLIENT_PROJECT_DATE.
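
For anyone who prefers scripting over a template app, the same job skeleton can be sketched in a few lines of Python. The subfolder names and the 9999_CLIENT_PROJECT_DATE schema follow the description above; the template Premiere Pro project path is a hypothetical placeholder.

```python
# Create a job "silo" with common subfolders and a renamed copy of a template
# Premiere Pro project - roughly what a Post Haste template generates.
import shutil
from pathlib import Path

SUBFOLDERS = ["PROJECTS", "FOOTAGE", "AUDIO", "GRAPHICS", "DOCUMENTS", "EXPORTS"]
TEMPLATE_PRPROJ = Path("templates/EDIT_TEMPLATE.prproj")  # hypothetical template file

def new_job(root: str, number: str, client: str, project: str, date: str) -> Path:
    job_name = f"{number}_{client}_{project}_{date}"  # 9999_CLIENT_PROJECT_DATE
    job = Path(root) / job_name
    for sub in SUBFOLDERS:
        (job / sub).mkdir(parents=True, exist_ok=True)
    if TEMPLATE_PRPROJ.exists():
        shutil.copy2(TEMPLATE_PRPROJ, job / "PROJECTS" / f"{job_name}.prproj")
    return job

new_job("/Volumes/RAID", "1042", "ACME", "SPRING_PROMO", "011521")
```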

Dealing with source footage, aka rushes or dailies. The first thing you have to deal with on a new project is the source media. Most of the location shoots for my projects come back to me with around 1TB of media for a day’s worth of filming. That’s often from two or three cameras, recorded in a variety of codecs at 4K/UHD resolution and 23.98fps. Someone on location (DIT, producer, DP, other) has copied the camera cards to working SSDs, which will be reused on later productions. Hedge is used to copy the cards, in order to provide checksum copy verification.

I receive those SSDs and not the camera cards. The first step is to copy that media “as is” into the source footage subfolder for that project on the editing RAID or NAS. Once my copy is complete, those same SSDs are separately copied “as is” via Hedge to one or more Western Digital or Seagate portable drives. Theoretically, this is for a deep archive, which hopefully will never be needed. Once we have at least two copies of the media, these working SSDs can be reformatted for the next production. The back-up drives should be stored in a safe location on premises or, better yet, offsite.

Since video cameras don’t use a standard folder structure on the cards, the next step is to reorganize the copied media in the footage folder according to date, camera, and roll. This means ripping media files out of their various camera subfolders. Within the footage folder, my subfolder hierarchy becomes shoot date (MMDDYY), then camera (A-CAM, B-CAM, etc), and then camera roll (A001, A002, etc). Media is located within the roll subfolder. Double-system audio recordings go into a SOUND folder for that date and follow this same hierarchy for sound rolls. When this reorganization is complete, I delete the leftover camera subfolders, such as Private, DCIM, etc.
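
A scripted version of that reorganization could look something like the sketch below. The date, camera, and roll values would come from however you log the shoot; the paths are placeholders.

```python
# Move copied camera files into a FOOTAGE/MMDDYY/CAMERA/ROLL hierarchy.
import shutil
from pathlib import Path

def file_roll(card_folder: str, footage_root: str, date: str, camera: str, roll: str) -> None:
    dest = Path(footage_root) / date / camera / roll  # e.g. FOOTAGE/122020/A-CAM/A001
    dest.mkdir(parents=True, exist_ok=True)
    for clip in list(Path(card_folder).rglob("*")):
        if clip.is_file() and not clip.name.startswith("."):
            shutil.move(str(clip), str(dest / clip.name))

file_roll("/Volumes/RAID/1042_ACME/FOOTAGE/CARD_A001",
          "/Volumes/RAID/1042_ACME/FOOTAGE", "122020", "A-CAM", "A001")
```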

It may be necessary to rename or append prefixes to file names in order to end up with completely unique file names within this project. That’s where Better Rename comes in. This is a Finder-level batch renaming tool. If a camera generates default names on a card, such as IMG_001, IMG_002 and so on, then renaming becomes essential. I try to preserve the original name in order to be able to trace the file back to back-up drives if I absolutely have to. Therefore, it’s best to append a prefix. I base this on project, date, camera, and roll. As an example, if IMG_001 was shot as part of the Bahamas project on December 20th, recorded by E-camera on roll seven, then the appended file would be named BAH1220E07_IMG_001.
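
Better Rename handles this interactively, but the prefixing logic is simple enough to sketch in Python if you ever need to batch it another way. The BAH1220E07 construction follows the example above; the paths are placeholders.

```python
# Prepend a project/date/camera/roll prefix while preserving the original
# file name, e.g. IMG_001.MOV -> BAH1220E07_IMG_001.MOV.
from pathlib import Path

def prefix_clips(folder: str, project: str, date: str, camera: str, roll: str) -> None:
    prefix = f"{project}{date}{camera}{roll}"  # BAH + 1220 + E + 07
    for clip in Path(folder).iterdir():
        if clip.is_file() and not clip.name.startswith((".", prefix)):
            clip.rename(clip.with_name(f"{prefix}_{clip.name}"))

prefix_clips("/Volumes/RAID/BAHAMAS/FOOTAGE/122020/E-CAM/E007", "BAH", "1220", "E", "07")
```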

Some camera codecs, like those used by drones and GoPros, can be a beast for many NLEs to deal with. Proxy media is one answer, or you can transcode only the offending files. If you choose to transcode these files, then Compressor, Adobe Media Encoder, or Resolve are the go-to applications. Transcode at the native file size and resolution into an optimized codec, like ProRes. Maintain log color spaces, because these optimized files become the new “camera” files in your edit. I will add separate folders for ORIG (camera original media) and PRORES (my transcoded, optimized files) within each camera roll folder. Only the ProRes media is imported into the NLE for editing.
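
Compressor, Adobe Media Encoder, and Resolve are the tools named above, but for a quick scripted pass, ffmpeg can also handle a native-size ProRes transcode. A sketch, assuming ffmpeg is installed; no scaling or color transform is applied, so log stays log.

```python
# Transcode problem clips to ProRes 422 HQ at native size, writing them to a
# PRORES folder that sits alongside the ORIG folder within the camera roll.
import subprocess
from pathlib import Path

def to_prores(orig_folder: str) -> None:
    orig = Path(orig_folder)
    prores = orig.parent / "PRORES"
    prores.mkdir(exist_ok=True)
    for clip in orig.glob("*.MP4"):  # adjust the pattern for your camera's extension
        subprocess.run([
            "ffmpeg", "-i", str(clip),
            "-c:v", "prores_ks", "-profile:v", "3",  # ProRes 422 HQ
            "-c:a", "pcm_s16le",
            str(prores / (clip.stem + ".mov")),
        ], check=True)

to_prores("/Volumes/RAID/1042_ACME/FOOTAGE/122020/D-CAM/D001/ORIG")
```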

Back-up! Do not proceed to GO! Now that you’ve spent all of this effort reorganizing, renaming, and transcoding media, you’ll want to back up the files before starting to edit. I like to back up media to raw, removable, enterprise-grade HGST or Seagate hard drives. Over the years, I’ve accumulated a variety of drive sizes, ranging from 2TB up to 8TB. Larger capacities are available, but 8TB is a cost-effective and manageable size. When placed into a Thunderbolt or USB drive dock, these function like any other local hard drive.

When you’ve completed dealing with the media from the shoot, simply copy the whole job folder to a drive. You can store multiple projects on the same drive, depending on its capacity. This is an easy overnight process for most jobs, so it won’t impact your edit time. The point is to back up the newly organized version of your raw media. Once completed, you will have three copies of the source footage – the “as is” copy, the version on your RAID or NAS, and this back-up on the raw drive. After the project has been completed and delivered, load up the back-up drive and copy everything else from this job to that drive. This provides a “clone” of the complete job on both your RAID/NAS and the back-up drive.
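
Hedge or a plain Finder copy covers this step, but if you want a scripted sanity check that the back-up actually matches the source, a checksum comparison along these lines works. A bare-bones sketch, not a replacement for a proper copy utility.

```python
# Copy a job folder to a back-up drive and verify every file with SHA-256.
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

def backup_job(job_folder: str, backup_root: str) -> None:
    src = Path(job_folder)
    dst = Path(backup_root) / src.name
    shutil.copytree(src, dst)
    for f in src.rglob("*"):
        if f.is_file() and sha256(f) != sha256(dst / f.relative_to(src)):
            raise RuntimeError(f"Checksum mismatch: {f}")

backup_job("/Volumes/RAID/1042_ACME_SPRING_PROMO_011521", "/Volumes/BACKUP_08")
```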

In order to keep these back-up drives straight, you’ll need a catalog. At home, I’ve accumulated 12 drives thus far. At work we’ve accumulated over 200. I’ve found the easiest way to deal with this is an application called DiskCatalogMaker. It scans the drive and stores the file information in a catalog document. Each drive entry mimics what you see in the Finder, including folders, files, sizes, dates, and so on. The catalog document is searchable, which is why job numbers become important. It’s a good idea to periodically mount and spin up these drives to maintain reliability. Once a year is a minimum.
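
DiskCatalogMaker does this with a proper interface and its own catalog format, but the underlying idea, walking a mounted back-up drive and recording every file, fits in a short script. The CSV layout here is mine, purely for illustration.

```python
# Catalog a mounted back-up drive: relative path, size, and modification date
# for every file, written to a searchable CSV.
import csv
import datetime
from pathlib import Path

def catalog_drive(mount_point: str, catalog_csv: str) -> None:
    root = Path(mount_point)
    with open(catalog_csv, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["path", "size_bytes", "modified"])
        for f in root.rglob("*"):
            if f.is_file():
                stat = f.stat()
                writer.writerow([
                    str(f.relative_to(root)),
                    stat.st_size,
                    datetime.datetime.fromtimestamp(stat.st_mtime).isoformat(),
                ])

catalog_drive("/Volumes/BACKUP_08", "BACKUP_08_catalog.csv")
```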

If you have sufficient capacity on your RAID or NAS, then you don’t want to immediately delete jobs and media when the work is done. In our case, once a job has been fully backed up, the job folder is moved into a BACKED UP folder on the NAS. This way we know when a job has been backed up, yet it is still easily retrieved should the client come back with revisions. Plus, you still have three total copies of the source media.

Other back-ups. I’ve talked a lot about backing up camera media, but what about other files? Generally files like graphics are supplied, so these are also backed up elsewhere. Plus they will get backed up on the raw drive when the job is done.

I also use Dropbox for interim back-ups of project files. Since a Premiere Pro project file is light and doesn’t carry media, it’s easy to back up in the cloud. At work, at the end of each day, each editor copies in-progress Premiere files to a company Dropbox folder. The idea is that in the event of some catastrophe, you could get your project back from Dropbox and then use the backed up camera drives to rebuild an edit. In addition, we also export and copy Resolve projects to Dropbox, as well as the DiskCatalogMaker catalog documents.

Whenever possible, audio stems and textless masters are exported for each completed job. These are stored with the final masters. Often it’s easier to make revisions using these elements than to dive back into a complex job after it’s been deeply archived. Our NAS contains a separate top-level folder for all finished masters, in addition to the master subfolder within each project. When a production is done, the master file is copied into this other folder, resulting in two sets of the master files on the NAS. And by “master” I generally mean a final ProRes file along with a high-quality MP4 file. The MP4 is most often what the client will use as their “master,” since so much of our work these days is for the web. Therefore, both NAS locations hold a ProRes and an MP4. That’s in addition to the masters stored on the raw, back-up drive.

Final, Final revised, no really, this one is Final. Let’s address file naming conventions. Every editor knows the “danger” of calling something Final. Clients love to make changes until they no longer can. I work on projects that have running changes as adjustments are made for use in new presentations. Calling any of these “Final” never works. Broadcast commercials are usually assigned definitive ISCI codes, but that’s rarely the case with non-broadcast projects. The process that works for us is simply to use version numbers and dates. This makes sense and is what software developers use.

We use this convention: CLIENT_PROJECTNAME_VERSION_DATE_MODIFIER. As an example, if you are editing a McDonald’s Big Mac :60 commercial, then a final version might be labelled “MCD_Big Mac 60_v13_122620.” A slight change on that same day would become “MCD_Big Mac 60_v14_122620.” We use the “modifier” to designate variations from the norm. Our default master files are formatted as 1080p at 23.98 with stereo audio. So a variation exported as 4K/UHD or 720p or with a 5.1 surround mix would have the added suffix of “_4K” or “_720p” or “_51MIX.”
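
Because the convention is strictly positional, it’s easy to generate (or validate) these names programmatically and keep a whole team consistent. A tiny sketch of the idea:

```python
# Build a master file name per the CLIENT_PROJECTNAME_VERSION_DATE_MODIFIER convention.
def master_name(client: str, project: str, version: int, date: str, modifier: str = "") -> str:
    name = f"{client}_{project}_v{version}_{date}"
    return f"{name}_{modifier}" if modifier else name

print(master_name("MCD", "Big Mac 60", 13, "122620"))        # MCD_Big Mac 60_v13_122620
print(master_name("MCD", "Big Mac 60", 14, "122620", "4K"))  # MCD_Big Mac 60_v14_122620_4K
```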

Some projects go through many updates and it’s often hard to know when a client (working remotely) considers a version truly done. They are supposed to tell you, but they often just don’t. You sort of know, because the changes stop coming and a presentation deadline has been met. Whenever that happens, we export a ProRes master file plus high-quality MP4 files. The client may come back a week later with some revisions. Then new ProRes and MP4 files are generated. Since version numbers are maintained, the new ProRes masters carry different version numbers and dates, so you can differentiate one from the other. Both variations may be valid and in use by the client.

Asset management. The last piece of software that comes in handy for us is Kyno. This is a lightweight asset management tool that we use to scan and find media on our NAS. Our method of organization makes it relatively easy to find things just by working in the Finder. However, if you are looking for that one piece of footage and need to identify it visually, that’s where Kyno is helpful. It’s like Adobe Bridge on steroids. You can organize and sort using the usual database tools, but it also has a very cool “drill down” feature. If you want to browse media within a folder without stepping through a series of subfolders, simply enable “drill down” and you can directly browse all media contained therein. Kyno also features robust transcode and “send to” functions designed with NLEs in mind. Need to prep media for an edit or create proxies? Kyno works well as an alternative to other options.

Hopefully this recap has provided some new workflow pointers for 2021. Good luck!

©2021 Oliver Peters

Simple Color Workflow in FCPX

Following on the heels of the previous post, I’d like to cover five simple steps to follow when performing basic color correction, aka “grading,” in Final Cut Pro X. Not every clip or project will use all of them, but apply whichever steps are appropriate.

Step 1. LUTs (color look-up tables)

There are technical and creative LUTs. Here we are talking only about technical camera LUTs that are useful when your footage was recorded in a log color space. These LUTs convert the clip from log to a display color space (Rec 709 or other) and turn the clip’s appearance from flat to colorful. Each manufacturer offers specific LUTs for the profile of their camera models.

Some technical LUTs are already included with the default FCPX installation and can be accessed through the settings menu in the inspector. Others must be downloaded from the manufacturer or other sources and stored elsewhere on your system. If you don’t see an appropriate option in the inspector, then apply the Custom LUT effect and navigate to a matching LUT stored on your system.
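
If it helps to demystify what a technical LUT actually does, the toy sketch below loads a .cube 3D LUT and looks up a single RGB value with a nearest-neighbor lookup. Real implementations (including Final Cut’s) interpolate between lattice points, and the LUT file name here is hypothetical; this is only meant to show the log-to-display mapping idea.

```python
# Toy example: read a .cube 3D LUT and look up one normalized RGB triple.
# Nearest-neighbor only; real apps use trilinear/tetrahedral interpolation.
def load_cube(path):
    size, table = None, []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            if line.startswith("LUT_3D_SIZE"):
                size = int(line.split()[1])
            elif line[0].isdigit() or line[0] in "-.":
                table.append(tuple(float(v) for v in line.split()[:3]))
    return size, table  # table entries are ordered with red varying fastest

def apply_lut(rgb, size, table):
    r, g, b = (min(max(c, 0.0), 1.0) for c in rgb)
    ri, gi, bi = (round(c * (size - 1)) for c in (r, g, b))
    return table[ri + gi * size + bi * size * size]

size, table = load_cube("LogC_to_Rec709.cube")      # hypothetical LUT file
print(apply_lut((0.39, 0.39, 0.39), size, table))   # where log mid-gray lands in Rec 709
```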

Step 2. Balance Color

Next, apply the Balance Color effect for each clip. This will slightly expand the contrast of the clip and create an averaged color balance. This is useful for many, but not all clips. For instance, a clip shot during “golden hour” will have a warm, yellow-ish glow. You don’t want that to be balanced neutral. You have no control over the settings of the Balance Color process, other than to pick between Automatic and White Balance. Test and see when and where this works to your advantage.

Note that this works best for standard footage without a LUT or when the LUT was applied through the inspector menu. If the LUT was applied as a Custom LUT effect, then Balance Color will be applied ahead of the Custom LUT and may yield undesirable results.

Step 3. Color correction – color board, curves, or color wheels

This is where you do most of the correction to alter the appearance of the clip. Any or all of FCPX’s color correction tools are fine and the tool choice often depends on your own preference. For most clips it’s mainly a matter of brightening, expanding contrast, increasing or decreasing saturation, and shifting the hue offsets of lows (shadow area), midrange, and highlights. What you do here is entirely subjective, unless you are aiming for shot-matching, like two cameras in an interview. For most projects, subtlety is the key.

Step 4. Luma vs Sat

It’s easy to get carried away in Step 3. This is your chance to rein it back in. Apply the Hue/Sat Curves tool and select the Luma vs Sat curve. I described this process in the previous post. The objective is to roll off the saturation of the shadows and highlights, so that you retain pure blacks and whites at the extreme ends of the luminance range.
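
Conceptually, the Luma vs Sat curve is just a saturation multiplier that depends on luminance: full saturation through the midrange, tapering toward zero at the extremes. The little sketch below shows the shape of that idea; it is not FCPX’s actual curve math.

```python
# Conceptual Luma vs Sat rolloff: a saturation factor of 1.0 in the midrange
# that ramps down to 0.0 near pure black and pure white.
def sat_rolloff(luma: float, toe: float = 0.1, shoulder: float = 0.9) -> float:
    if luma <= 0.0 or luma >= 1.0:
        return 0.0                                # blacks and whites stay pure
    if luma < toe:
        return luma / toe                         # ramp up out of the shadows
    if luma > shoulder:
        return (1.0 - luma) / (1.0 - shoulder)    # ramp down into the highlights
    return 1.0                                    # full saturation in the midrange

for y in (0.0, 0.05, 0.5, 0.95, 1.0):
    print(y, round(sat_rolloff(y), 2))            # 0.0, 0.5, 1.0, 0.5, 0.0
```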

Step 5. Broadcast Safe

If you deliver for broadcast TV or a streaming channel, your video must be legal. Different outlets have differing standards – some looser or stricter than others. To be safe, limit your luminance and chrominance levels by applying a Broadcast Safe effect. This is best applied to an adjustment layer added as a connected clip at the topmost level above the entire sequence. Final Cut Pro X does not come with an adjustment layer Motion template title, but there are plenty available for download.

Apply the Broadcast Safe effect to that adjustment layer clip. Make sure it’s set to the color space that matches your project (sequence) setting (typically Rec 709 for HD and 4K SDR videos). At its default, video will be clipped at 0 and 100 on the scopes. Move the amount slider to the right for more clipping when you need to meet more restrictive specs.

These five steps are not the end-all/be-all of color correction/grading. They are merely a beginning guide to achieve quick and attractive grading using Final Cut Pro X. Test them out on your footage and see how to use them with your own workflows.

©2020 Oliver Peters

Drive – Postlab’s Virtual Storage Volume

Postlab is the only service designed for multi-editor, remote collaboration with Final Cut Pro X. It works whether you have a team collaborating on-premises within a facility or spread out at various locations around the globe. Since the initial launch, Hedge has also extended Postlab’s collaboration to Premiere Pro.

When using Postlab, projects containing Final Cut Pro X libraries or Premiere Pro project files are hosted on Hedge’s servers. But, the media lives on local drives or shared storage and not “in the cloud.” When editors work remotely, media needs to be transferred to them by way of “sneakernet,” High Tail, WeTransfer, or other methods.

Hedge has now solved that media issue with the introduction of Drive, a virtual storage volume for media, documents, and other files. Postlab users can utilize the original workflow and continue with local media – or they can expand remote capabilities with the addition of Drive storage. Since it functions much like Dropbox, Drive can also be used by team members who aren’t actively engaged in editing. As a media volume, files on Drive are also accessible to Avid Media Composer and DaVinci Resolve editors.

Drive promises significantly better performance than a general business cloud service, because it has been fine-tuned for media. The ability to use Drive is included with each Postlab plan; but, storage costs are based on a flat rate per month for the amount of storage you need. Unlike other cloud services, there are no hidden egress charges for downloads. If you only want to use Drive as a single user, then Hedge’s Postlab Solo or Pro plan would be the place to start.

How Drive works

Once Drive storage has been added to an account, each team member simply needs to connect to Drive from the Postlab interface. This mounts a Drive volume on the desktop just like any local hard drive. In addition, a cache file is stored at a designated location. Hedge recommends using a fast SSD or RAID for this cache file. NAS or SAN network volumes cannot be used.

After the initial setup, the operation is similar to Dropbox’s Smart Sync function. When an editor adds media to the local Drive volume, that media is uploaded to Hedge’s cloud storage. It will then sync to all other editors’ Drive volumes. Initially those copies of the media are only virtual. The first time a file is played by a remote team member, it is streamed from the cloud server. As it streams, it is also being added to the local Drive cache. Every file that has been fully played is then stored locally within the cache for faster access in the future.

Hedge feels that latency is as important as, if not more important than, outright connection speed for a fluid editing experience. They recommend wired, rather than wi-fi, internet connections. However, I tested the system using wi-fi with office speeds of around 575Mbps down / 38Mbps up. This is a business connection and it was fast enough to stream 720p MP4 and 1080p ProRes Proxy files with minimal hiccups on the initial streamed playback. Naturally, after a file was locally cached, access was instantaneous.

From the editor’s point of view, virtual files still appear in the FCPX event browser as if local and the timeline is populated with clips. Files can also be imported or dragged in from Drive as if they are local. As you play the individual clips or the timeline from within FCPX or Premiere, the files become locally cached. All in all, the editing experience is very fluid.

In actual practice

The process works best with lightweight, low-res files and not large camera originals. That is possible, too, of course, but not very efficient. Drive and the Hedge servers support most common media files, but not a format like REDCODE raw. As before, each editor will need to have the same effects, LUTs, Motion templates, and fonts installed for proper collaboration.

I did run into a few issues, which may be related to the recent 10.4.9 Final Cut update. For example, the built-in proxy workflow is not very stable. I did get it to work. Original files were on a NAS volume (not Drive) and the generated proxies (H.264 or ProRes Proxy) were stored on the Drive volume of the main system. The remote editing system would only get the proxies, synced through Drive. In theory that should work, but it was hit or miss. When it worked, some LUTs, like the standard ARRI Log-C LUTs, were not applied on the remote system in proxy mode. Also the “used” range indicator lines for the event browser clips were present on the original system, but not the remote system. Other than these few quirks, everything was largely seamless.

My suggested workflow would be to generate editing proxies outside of the NLE and copy those to Drive. H.264 or ProRes Proxy with matching audio configurations to the original camera files work well. Treat these low-res files as original media and import them into Final Cut Pro X or Premiere Pro for editing. Once the edit is locked, go to the main system and transfer the final sequence to a local FCPX Library or Premiere Pro project for finishing. Relink that sequence to the original camera files for grading and delivery. Alternatively, you could export an FCPXML or XML file for a Resolve roundtrip.
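
As one way to generate those external proxies, a scripted ffmpeg pass could look like the sketch below, assuming ffmpeg is installed: half-resolution H.264 with the original audio copied over so the channel configuration still matches the camera files. The paths are placeholders and this isn’t a Hedge-recommended tool, just an illustration.

```python
# Create half-resolution H.264 proxies with the camera audio passed through,
# writing them to a folder on the Drive volume. Assumes ffmpeg is installed.
import subprocess
from pathlib import Path

def make_proxies(source_folder: str, proxy_folder: str) -> None:
    out_dir = Path(proxy_folder)
    out_dir.mkdir(parents=True, exist_ok=True)
    for clip in Path(source_folder).glob("*.mov"):
        subprocess.run([
            "ffmpeg", "-i", str(clip),
            "-vf", "scale=iw/2:ih/2",                      # 50% frame size
            "-c:v", "libx264", "-crf", "18", "-pix_fmt", "yuv420p",
            "-c:a", "copy",                                # keep the original audio layout
            str(out_dir / clip.name),
        ], check=True)

make_proxies("/Volumes/NAS/JOB_1042/FOOTAGE", "/Volumes/Drive/JOB_1042/PROXIES")
```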

One very important point to know is that the entire Postlab workflow is designed around team members staying logged into the account. This maintains the local caches. It’s OK to quit the Postlab application, plus eject and reconnect the Drive volume. However, if you log out, those local caches for editing files and Drive media will be flushed. The next time you log back in, connection to Drive will need to be re-established, Drive information must be synced again, and clips within FCPX or Premiere Pro will have to be relinked. So stay logged in for the best experience.

Additional features

Thanks to the Postlab interface, Drive offers features not available for regular hard drives. For example, any folder within Drive can be bookmarked in Postlab. Simply click on a Bookmark to directly open that folder. The Drop Off feature lets you generate a URL with an expiration date for any Bookmarked folder. Send that link to any non-team member, such as an outside contributor or client, and they will be able to upload additional media or other files to Drive. Once uploaded to Hedge’s servers, those files show up in Drive within the folder and will be synced to all team members.

Hedge offers even more features, including Mail Drop, designed for projects with too much media to efficiently upload. Ship Hedge a drive and they will copy your dailies straight onto their servers. Pick Up is another feature still in development. Once it’s released, you will be able to select files on Drive, generate a Pick Up link, and send that to your client for download.

Editing with Drive and Postlab makes remote collaboration nearly like working on-site. The Hedge team is dedicated to expanding these capabilities with more services and broader NLE support. Given the state of post this year, these products are at the right time and place.

Check out this Soho Editors masterclass in collaboration using Postlab and Drive.

Originally written for FCP.co.

©2020 Oliver Peters

Color Finale Connect – Remote Grading for FCPX

Remote workflows didn’t start with COVID, but that certainly drove the need home for many. While editing collaboration at a distance can be a challenge, it’s a far simpler prospect than remote color grading. That’s often a very interactive process that happens on premises between a colorist and a client, director, or cinematographer. Established high-end post facilities, like Company3 with locations in the US, Canada, and England, have pioneered remote color grading sessions using advanced systems like Resolve and Baselight. This allows a director in Los Angeles and a colorist in London to conduct remote, real-time, interactive grading sessions. But the investment in workflow development, hardware, and grading environments to make this happen is not inconsequential.

High-end remote grading comes to Final Cut Pro X

The Color Finale team has been on a quest to bring advanced grading tools to the Final Cut Pro X ecosystem, most recently with last December’s release of Color Finale 2. Many editors are working from home these days, so the team decided to leverage the frameworks of macOS and FCPX to enable remote grading in a far simpler way than other grading solutions allow.

The result is Color Finale Connect, which is a Final Cut Pro X workflow extension currently in free public beta. Connect enables two or more Final Cut Pro X users to collaborate in near-real-time in a color grading session, regardless of their location. This review is in the context of long distance sessions, but Connect can also be used within a single facility where the participants might be in other parts of the building or in different buildings.

Color Finale Connect requires each user in a session to be on macOS Catalina, running licensed copies of Final Cut Pro X (not trial) and Color Finale 2.2 Pro (or higher). Download and install Color Finale Connect, which shows up as a Final Cut workflow extension. You can work in a Connect session with or without local media on every participant’s system. In order to operate smoothly and keep the infrastructure lightweight, person-to-person communication is handled outside of Connect. For example, interact with your director via Skype or Zoom on an iPad while you separately control Final Cut on your iMac.

Getting started

To start a session, each participant launches the Color Finale Connect extension within Final Cut. Whoever starts a session is the “broadcaster” and others that join this session are “followers.” The session leader (who has the local media) drags the Project icon to the Connect panel and “publishes” it. This generates a session code, which can be sent to the other participants to join the session from within their Connect extension panels.

Once a session is joined, the participants drag the Project icon from the Connect panel into an open FCPX Event. This generates a timeline of clips. If they have the matching local media, the timeline will be populated with the initial graded clips. If they don’t have media, then the timeline is populated with placeholder clips. Everyone needs to keep their Connect panel open to stay in the session (it can be minimized).

Data transfer is very small, since it consists mainly of Color Finale instructions; therefore, crazy-fast internet speeds aren’t required. It is peer-to-peer and doesn’t live anywhere “in the cloud.” If a participant doesn’t have local media installed, then as the session leader makes a color correction change in Color Finale 2 Pro, an “in-place” full-resolution frame is sent for that clip on the timeline. As more changes are made, the frames are updated in near-real-time.

The data communication is between Color Finale on one system and Color Finale on the others. All grading must happen within the Color Finale 2 Pro plug-in, not FCPX’s native color wheels or other plug-ins. The “in-place” frames support all native Final Cut media formats, such as H.264, ProRes, and ProRes RAW; however, formats that require a plug-in, like RED camera raw files, will not transmit “in-place” frames. In that case, the data applied to the placeholder frame is updated, but you won’t see a reference image.

This isn’t a one-way street. The session leader can enable any participant to also have control. Let’s say the session leader is the colorist and the director of photography is a participant. The colorist can enable remote control for the DP, which would permit them to make tweaks on their own system. This in turn would update back on the colorist’s system, as well as for all the other participants.

Color Finale Connect workflows

I’ve been testing a late-stage beta version of Connect and Color Finale 2.2 Pro and the system works well. The “in-place” concept is ingenious, but the workflow is best when each session member has local media. This has been improved with the enhanced proxy workflow updated in Final Cut Pro X 10.4.9. Let’s say the editor has the full-resolution, original media and generates smaller proxies – for example, 50% size H.264 files. These are small enough that you can easily send the Library and proxy media to all participants using services like WeTransfer, MASV, FileMail, or Frame.io.

One of the session members could be a favored colorist on the other side of the world. In this case, he or she would be working with the proxy media. If the editor and colorist are both able to control the session, then it becomes highly interactive. Formats like RED don’t pose a problem thanks to the proxy transcodes, as long as no local changes are made outside of the Color Finale plug-in. In other words, don’t change the RED raw source settings within this session. Once the colorist has completed the grade using proxy media, those grading settings would be updated through a Connect session on the editor’s system where the original media resides.

Color management

How do you know that your client sees the color in the same way as you do on a reference display? Remote color grading has always been hampered by color management and monitor calibration. It would, of course, be ideal for each participant in the session to have Blackmagic or AJA output hardware connected to a calibrated display. If there is an a/v output for FCPX, then the Connect session changes will also be seen on that screen. But that’s a luxury most clients don’t have.

This is where Apple hardware, macOS, and Final Cut Pro X’s color management come to the rescue and make Color Finale Connect a far simpler solution than other methods. If both you and your client are using Apple hardware (iMac, iMac Pro, Pro Display XDR) then color management is tightly controlled and accurate. First make sure that macOS display settings like True Tone and Night Shift are turned off on all systems. Then you are generally going to see the same image within the Final Cut viewer on your iMac screen as your client will see on theirs.

The one caveat is that users still have manual control of the screen brightness, which can affect the perception of the color correction. One tip is to include a grayscale or color chart that can be used to roughly calibrate the display’s brightness setting. Can everyone just barely see the darkest blocks on the chart? If not, brighten the display setting slightly. It’s not a perfect calibration, but it will definitely get you in the ballpark.

Color Finale 2 Pro turns Final Cut Pro X into an advanced finishing solution. Thanks to the ecosystem and extensions framework, Final Cut opens interesting approaches to collaboration, especially in the time of COVID. Tools like Frame.io and Postlab enable better long-distance collaboration in easier-to-use ways than previous technologies. Color Finale Connect brings that same ease-of-use and efficient remote collaboration to FCPX grading. Remember this is still a beta, albeit a stable one, so make sure you provide feedback should any issues crop up.

Originally written for FCP.co.

©2020 Oliver Peters