Six Premiere Pro Game Changers

When a software developer updates an editing application, users look for big changes, fancy features, and new functionality. Unfortunately, the little updates that can really change your day-to-day workflow are often overlooked.

Ever since the shift to its Creative Cloud subscription model, Adobe has brought a steady string of updates to its core audio and video applications. Although several have made big news, the more meaningful changes often seem less than awe-inspiring to Adobe’s critics. Let me counter that narrative and point out six features that have truly improved the daily workflow on my Premiere Pro projects.

Auto Reframe Sequence. If you deliver projects for social media outlets, you know that various vertical formats are required. This is truly a pain when starting with content designed for 16×9 horizontal distribution. The Auto Reframe feature in Premiere Pro makes it easy to reformat any sequence for 9×16, 4×5, and 1×1 formats. It takes care of keyframing each shot to follow an area of interest within that shot, such as a person walking.

While other NLEs, like Final Cut Pro, also offer reformatting for vertical aspect ratios, none offer the same degree of automatic control to reposition the clip. It’s not perfect, but it works for most shots. If you don’t like the results on a shot, simply override the existing keyframes and manually reposition the clip. Auto Reframe works best if you start with a flattened, textless file, which brings me to the next feature.

Scene Edit Detection. This feature is generally used in color correction to automatically determine cuts between shots in a flattened file. The single clip in the sequence is split at each detected cut point. While you can use it for color correction in Premiere Pro as well, it’s also useful when applying Auto Reframe to a sequence for verticals. If you try to apply Auto Reframe to a flattened file, Premiere will attempt to analyze and apply keyframes across the entire sequence, since it’s one long clip. With the splices added by Scene Edit Detection, Premiere can analyze each shot separately within the flattened file.

Auto Transcribe Sequence / Captioning. Modern deliverables take into account the challenges many viewers face. One accommodation is closed captions, which are vital for hearing-impaired viewers. Captions are also turned on by many viewers with otherwise normal hearing for a variety of reasons. Just a few short years ago, getting interviews transcribed, adding subtitles for foreign languages, or creating closed captions required using an outside service, often at significant cost.

Adobe’s first move was to add caption and subtitle functions to Premiere Pro, which enabled editors to import, create, and/or edit caption and subtitle text. This text can be exported as a separate sidecar file (such as .srt) or embedded into the video file. In a more recent update, Adobe augmented these features with Auto Transcribe. It’s included as part of your Creative Cloud subscription and there is generally no length limitation for reasonable use. If you have an hourlong interview that needs to be transcribed – no problem. 
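For reference, a .srt sidecar is just plain text: numbered cues, a start and end time in hours:minutes:seconds,milliseconds, and the caption text. A minimal example (the timings and dialogue here are made up) looks like this:

    1
    00:00:01,000 --> 00:00:03,500
    Welcome back to the program.

    2
    00:00:03,600 --> 00:00:06,200
    Today we're talking about captions.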

Adobe uses cloud-based AI for part of the transcription process, so an internet connection is required. The turnaround time is quite fast and the accuracy is among the best I’ve encountered. While the language options aren’t as broad as some competitors’, most common Romance and Asian languages are covered. Once the analysis and speech-to-text process has been completed, that text can be used as a transcription or as captions (closed captions and/or subtitles). The transcription can also be exported as a text file with timecode. That’s handy for producers who need to create a paper cut for the editor.

Remix. You’ve just cut a six-minute corporate video and now you have to edit a needle drop music cue as a bed. It’s only 2:43, but needs to be extended to fit the 6:00 length and correctly time out to match the ending. You can either do this yourself or let Adobe tackle it for you. Remix came into Premiere Pro from Audition. This feature lets you use Adobe Sensei (their under-the-hood AI technology) to automatically re-edit a music track to a new target length. 

Open the Essential Sound panel, designate the track containing the cue as Music, enable the Duration tab, and select Remix. Set your target length and see what you get. You can customize the number of segments and variations to make the track sound less repetitive, if needed. Some tracks have long fade-outs, so you may have to overshoot your target length in order to get the fade to coincide properly with the end of the video. I often still make one manual music edit to get it just right. Nevertheless, the Remix feature is a great time-saver that usually gets me 90% of the way there.

Audition. If you pay for a full Creative Cloud subscription, then you benefit from the larger Adobe ecosystem. One of those applications is Audition, Adobe’s digital audio workstation (DAW) software. Audition is often ignored in DAW roundups because it doesn’t include many music-specific features, like software instruments and MIDI. Instead, Audition is targeted at general audio production (VO recordings, podcasts, commercials) and audio-for-video post in conjunction with Premiere Pro. Audition is designed around editing and processing a single audio file or working in a multitrack session. I want to highlight the first method here.

Noise in location recordings is a fact of life for many projects. Record an interview in a working commercial kitchen and there will be a lot of background noise. Premiere Pro includes a capable noise reduction audio filter, which can be augmented by many third-party tools from Accusonus, CrumplePop, and, of course, iZotope RX. But if the Premiere Pro filter isn’t good enough, you need look no further than Audition. Export the track(s) from Premiere and open those (or the original files) in Audition.

Select the Noise Reduction/Restoration category under the Effects pulldown menu. First capture a short noise print in a section of the track with only background noise. This “trains” the filter for what is to be removed. Then select Noise Reduction (process). Follow the instructions and trust your own hearing to remove as much noise as possible with the least impact on the dialogue. If the person speaking sounds like they are underwater, then you’ve gone too far. Apply the effect in order to render the processing and then bounce (export) that processed track. Import the new track into Premiere. While this is a two-step process, you aren’t encumbering your computer with any real-time noise reduction filter when using such a pre-processed audio file.

Link Media. OK, I know relinking isn’t new to Premiere Pro and it’s probably not a marquee feature for editors always working with native media. When moving projects from offline to online – creative to finishing editorial – you know that if you cannot properly relink media files, a disaster will ensue.

Media Composer, Final Cut Pro, and Resolve all have relink functions, and they work well with application-controlled, optimized media. When working with camera-original native files, however, they might not work at all. I find that Premiere Pro handles relinking a wide variety of media files better than any of these NLEs, precisely because the user has a lot of control over the relink criteria. It’s not left entirely up to the application.

Premiere Pro expects the media to be in the same relative path on the drive. Let’s say that you move the entire project to a different folder (like from Active Projects to Archived Projects) on your storage system. Navigate to and locate the first missing file and Premiere will find all the rest.
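As a purely hypothetical illustration, if the project and its media move together like this:

    Before:  /Storage/Active Projects/ClientA/Project.prproj
             /Storage/Active Projects/ClientA/Media/Interview_01.mov
    After:   /Storage/Archived Projects/ClientA/Project.prproj
             /Storage/Archived Projects/ClientA/Media/Interview_01.mov

then pointing Premiere at the new location of Interview_01.mov is enough for it to resolve everything else, because the folder structure below ClientA is unchanged. (The paths and file names are invented for the example.)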

The relinking procedure is also quite forgiving, because the various file criteria used to relink can be checked or unchecked. For example, I frequently edit with watermarked temporary music tracks, which are 44.1kHz MP3 files. When the cut is approved and the music is licensed, I download new, non-watermarked versions of that music as 48kHz WAV or AIF files. Premiere Pro easily relinks to the WAV or AIF files instead of the MP3s once I point it in the right direction. All music edits (including internal edits made by Remix) stay as intended and there is no mismatch due to the sample rate change.

These features might not make it into everyone’s Top 10 list, but they are tools generally not found in other NLEs. I use them quite often to speed up the session and remove drudgery from the editing process.

©2022 Oliver Peters

Generalists versus Specialists

“Jack of all trades, master of none” is a quote most are familiar with. But the complete quote – “Jack of all trades, master of none, but oftentimes better than master of one” – carries nearly the opposite meaning. In the world of post-production you have Jacks and Jills of all trades (generalists) and masters of one (specialists). While editors are certainly specialized in storytelling, I would consider them generalists when comparing their skillset to those of other specialists, such as visual effects artists, colorists, and audio engineers. Editors often touch on sound, effects, and color in a more general (often temp) way to get client approval. The others have to deliver the best, final results within a single discipline. Editors have to know the tools of editing, but not the nitty-gritty of color correction or visual effects.

This is closely tied to the Pareto Principle, which most know as the 80/20 Rule. This principle states that 80% of the consequences come from 20% of the causes, but it’s been applied in various ways. When talking about software development, the 80/20 Rule predicts that 80% of the users are going to use 20% of the features, while only 20% of users will find a need for the other features. The software developer has to decide whether the target customer is the generalist (the 80% user) or the specialist (the 20% user). If the generalist is the target, then the challenge is to add some specialized features to service the advanced user without creating a bloated application that no one will use.

Applying these concepts to editing software development

When looking at NLEs, the first question to ask is, “Who is defined as a video editor today?” I would separate editors into three groups. One is the “I have to do it all” group, which generates most of what we see on local TV, corporate videos, YouTube, etc. These are multi-discipline generalists who have neither the time nor the interest to deal with highly specialized software. In the case of true one-man bands, the skill set also includes videography, plus location lighting and sound.

The “top end” – national and international commercials, TV series, and feature films – could be split into two groups: craft (aka film or offline) editors and finishing (aka online) editors. Craft editors are specialists in molding the story, but generalists when it comes to the software they work with. Their technical skills don’t have to be the best, but they need a solid understanding of visual effects, sound, and color, so that they can create a presentable rough cut with temp elements. The finishing editor’s role is to take the final elements from the sound, color, and visual effects houses and assemble the final deliverables. Key talents are quality control and attention to detail; therefore, finishing editors have no need to understand dedicated color, sound, or effects applications, unless they are also filling one of those roles.

My motivation for writing this post stemmed from an open letter to Tim Cook, which many editors have signed – myself included. Editors have long been fans of Apple products and many gravitated from Avid Media Composer to Apple Final Cut Pro 1-7. However, when Apple reimagined Final Cut and dropped Final Cut Studio in order to launch Final Cut Pro X, many FCP fans were in shock. FCPX lacked a number of important features at first. A lot of these elements have since been added back, but that development pace hasn’t been fast enough for some, hence the letter. My wishlist for new features is quite small. I recognize Final Cut for what it is in the Apple ecosystem. But I would like to see Apple work to raise the visibility of Final Cut Pro within the broader editing community. That’s especially important when the decision of which editing application to use is often not made by editors.

Blackmagic Design DaVinci Resolve – the über-app for specialists

This brings me to Resolve. Editors point to Blackmagic’s aggressive development pace and the rich feature set. Resolve is often viewed as the greener pasture over the hill. I’m going to take a contrarian’s point of view. I’ve been using Resolve since it was introduced as Mac software and recently graded a feature film that was cut on Resolve by another editor.

Unfortunately, the experience was more problematic than I’ve had with grades roundtripped to Resolve from other NLEs. Its performance as an editor was quite slow when trying to move around in the timeline, replace shots, or trim clips. Resolve wouldn’t be my first NLE choice when compared to Premiere Pro, Media Composer, or Final Cut Pro. It’s a complex program by necessity. The color management alone is enough to trip up even experienced editors who aren’t intimately familiar with what the various settings do with the image.

DaVinci Resolve is an all-in-one application that integrates editing (two different editing models), color correction (aka grading), Fusion visual effects, and the Fairlight DAW. Historically, all-in-ones have not had a great track record in the market. Other such über-apps would include Avid|DS and Autodesk Smoke. Avid pulled the plug on DS, and Autodesk moved the Flame/Smoke/Lustre product family to a subscription model. Neither DS nor Smoke as a standalone application moved the needle on market share.

At its core, Resolve is a grading application with Fusion and Fairlight added in later. Color, effects, and audio mixing are all specialized skills, and the software is designed so that each specialist is comfortable with the toolset presented on those pages/modes. I believe Blackmagic has been attempting to capitalize on Final Cut editor discontent and create the mythical “FCP8” or “FC Extreme” that many wanted. However, adding completely new and disparate functions to an application that at its core is designed around color correction can make it quite unwieldy. Beginning editors are never going to touch most of what Resolve has to offer, and the specialists would rather have a dedicated tool, like Nuke, After Effects, or Pro Tools.

Apple Final Cut Pro – reimagining modern workflows for generalists

Apple makes software for generalists. Pages, Numbers, Keynote, Photos, GarageBand, and iMovie are designed for that 80%. Apple also creates advanced software for the more demanding user under the ProApps banner (professional applications). This is still “generalist” software, but designed for more complex workflows. That’s where Final Cut Pro, Motion, Compressor, and Logic Pro fit.

Apple famously likes to “skate to where the puck will be,” and having control over hardware, operating system, and software gives its teams special insight to develop software that is optimized for the hardware/OS combo. As a broad-based consumer goods company, Apple also understands market trends. In the case of iPhones and digital photography, it also plays a huge role in driving those trends.

When Apple launched Final Cut Pro X the goal was an application designed for simplified, modernized workflows – even if “Hollywood” wasn’t quite ready. This meant walking away from the comprehensive “suite of tools” concept (Final Cut Studio). They chose to focus on a few applications that were better equipped for where the wider market of content creators was headed – yet, one that could still address more sophisticated needs, albeit in a different way.

This reimagining of Final Cut Pro had several aspects to it. One was to design an application that could easily be used on laptops and desktop systems and was adaptable to single and dual screen set-ups. It also introduced workflows based on metadata to improve edit efficiency. It was intended as a platform with third parties filling in the gaps. This means you need to augment FCP to cover a few common industry workflows. In short, FCP is designed to appeal to a broad spectrum of today’s “professionals” and not how one might have defined that term in the early 1990s, when nonlinear editing first took hold.

For a developer, it gets down to who the product is marketed towards and which new features to prioritize. Generalists are going to grow the market faster, hence a better return on development resources. The more complex an application becomes, the more likely it is to have bugs or break when the hardware or OS is updated. Quality assurance testing (QA) expands exponentially with complexity.

Final thoughts

Do my criticisms of Resolve mean that it’s a bad application? No, definitely not! It’s powerful in the right hands, especially if you work within its left-to-right workflow (edit -> Fusion -> color -> Fairlight). But, I don’t think it’s the ideal NLE for craft editing. The tools are designed for a collection of specialists. Blackmagic has been on this path for a rather long time now and seems to be at a fork in the road. Maybe they should step back, start from a clean slate, and develop a fresh, streamlined version of Resolve. Or split it up into a set of individual, focused applications.

So, is Final Cut Pro the ideal editing platform? It’s definitely a great NLE for the true generalist. I’m a fan and use it when it’s the appropriate tool for the job. I like that it’s a fluid NLE with a responsive UI design. Nevertheless, it isn’t the best fit for many circumstances. I work in a market and with clients that are invested in Adobe Creative Cloud workflows. I have to exchange project files and make sure plug-ins are all compatible. I collaborate with other editors and more than one of us often touches these projects.

Premiere Pro is the dominant NLE for me in this environment. It also clicks with how my mind works and feels natural to me. Although you hear complaints from some, Premiere has been quite stable for me in all my years of use. Premiere Pro hits the sweet spot for advanced editors working on complex productions without becoming overly complex. Product updates over the past year have provided new features that I use every day. However, if I were in New York or Los Angeles, that answer would likely be Avid Media Composer, given Avid’s continued dominance in broadcast operations and feature film post.

In the end, there is no right or wrong answer. If you have the freedom to choose, then assess your skills. Where do you fall on the generalist/specialist spectrum? Pick the application that best meets your needs and fits your mindset.

For another direct comparison check out this previous post.

©2022 Oliver Peters

Adobe’s Frame Rollout

Adobe acquired Frame.io last October. The latest Adobe Creative Cloud application updates showcase the first formal integration of Frame.io as a product within the Creative Cloud ecosystem. Frame.io had already developed a Premiere Pro integration using Adobe’s extensions architecture; however, the latest versions of Premiere Pro and After Effects add an integrated interface panel called Review with Frame.io.

Now your individual Adobe Creative Cloud subscription includes a Frame.io account at no additional charge. This includes 100GB of cloud storage (separate from existing Creative Cloud storage) for up to five projects, use by two collaborators, and unlimited access for reviewers. If you need more storage or to add more collaborators, then you can upgrade to a larger Frame.io plan, but at additional cost.

Adobe Creative Cloud Team and Enterprise accounts don’t fall under this plan and those admins will need to consult Adobe or Frame.io for a plan that best meets their needs. In other words, if you are a production company paying for an Adobe Team account with multiple users on the account, you don’t get 100GB of “free” Frame.io storage for each user. This offering is primarily designed for individual Adobe Creative Cloud subscribers.

Something to know before you start

There’s a gotcha for some existing Frame.io customers. You activate your new Adobe CC Frame.io service by logging in with the same email and password used for your Adobe ID. Let’s say you work freelance at a facility and are a collaborator on their Frame.io Team account. In that case, you might be using a personal email address to log into Frame.io. However, if that email is the same as the one used for your personal Adobe ID, then Frame.io does not know how to differentiate between the two accounts.

To rectify this you need to use a different email for one of these two log-ins. This is generally a minor issue, since most people have more than one email address that they use. In my own case, I needed to change my Adobe ID email, which was a relatively quick procedure. This allows me to separately access either of the two Frame.io accounts as a collaborator, based on which email I log in with.

One confusing thing I encountered was that the account starts as a 30-day trial of a Frame.io Team account, so it looks like you are going to get billed extra after the trial ends. This is not the case. I think it’s a mistake for Adobe and Frame.io to present it this way as a means of upselling you to the paid account. Fortunately, there’s no need to enter payment information up front. I wish this were clearer in the marketing details. Hopefully Adobe will correct it after the initial rollout. At the end of the 30-day trial, you will be asked whether to pay or end the trial. If you opt to end the trial, then the account reverts to the free plan, which is the one included with your Adobe Creative Cloud subscription.

Getting started

Open the Review with Frame.io panel in Premiere Pro or After Effects and sign in using your Adobe ID. This will open your default browser and send you to the Frame.io website to complete the sign-in. As long as you stay signed in, you can access Frame.io either in your web browser or within the panel. If you sign out, then the next time you’ll need to sign in again using your Adobe ID.

I won’t go into how Frame.io itself works, since there are plenty of tutorials. This integration doesn’t change any of the operation. The Frame.io panel works like the previous extensions panel. A clip with reviewer comments can be synced to your Premiere Pro timeline for easy changes. Or you can simply work from the web portal and ignore the panel entirely. 100GB is plenty if your intent is to use Frame.io for low-resolution review files. However, if your intention is a larger, more complex workflow, then you may need to upgrade your Frame.io account after all.

Enter C2C

The bigger picture is that Frame.io is enthusiastically pushing its camera-to-cloud (C2C) workflow. I’m not really a big believer in this concept, but I know plenty of companies are going to announce more cloud and remote services at NAB. For many reasons, I don’t believe that all of our media will be in the cloud in a decade or two. However, I think Adobe does. In my opinion, it’s not a particularly good goal for users or the planet. But, I digress. In today’s world, what C2C offers in conjunction with the Premiere Pro integration is a Dropbox-style experience.

Let’s say your videographer is recording a corporate CEO interview in Los Angeles. The company’s PR rep is in New York and the editor in Atlanta. And there’s a very short turnaround schedule. In this basic scenario, both the videographer and editor are collaborators on a Frame.io project. While the interview is being recorded, the feed is being uploaded to Frame.io in near real-time. This requires some hardware on the camera side or it could be done by someone on set right after the recording ends. Once it’s in Frame.io, the PR rep in NYC can access and review the takes. The editor in Atlanta also sees the footage appear in the Frame.io panel within Premiere Pro. Files can be downloaded from the panel to the editor’s drives and the edit can start right away.

Given most standard internet speeds today and the 100GB bucket, this workflow makes sense if you are uploading smaller camera proxy files. Some proxies can actually be good enough to master with – especially in fast turnaround situations. In other scenarios, the proxies might be used to start the edit and later replaced with the high-res camera originals, once received from the shoot.
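A rough back-of-the-envelope calculation (my own numbers, assuming roughly 10 Mbps editorial proxies rather than anything Adobe or Frame.io publishes) shows why proxies are the practical fit for that 100GB bucket:

    # Rough estimate: how much ~10 Mbps proxy footage fits in 100 GB.
    bitrate_mbps = 10                              # assumed proxy bitrate
    gb_per_hour = bitrate_mbps / 8 * 3600 / 1000   # bits -> bytes -> GB: about 4.5 GB per hour
    hours_in_100gb = 100 / gb_per_hour             # roughly 22 hours of proxy media
    print(round(gb_per_hour, 1), round(hours_in_100gb, 1))

Camera originals run many times that bitrate and would eat the same allotment with only an hour or two of footage, which is one more reason the high-res media usually still travels on a drive.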

I feel that such situations are far less common than the marketers want you to believe. Moving high-res files over the internet is never fast, and FedEx often still offers the better option. So unless you really do need to get started right away, just wait for the media to arrive a day or so later. However, C2C for the purpose of an out-of-town producer reviewing takes remotely – especially in light of workflow changes caused by COVID over the past couple of years – has gained steam.

Frame.io is clear that becoming an Adobe company doesn’t change its dedication to other workflows and other applications, such as Final Cut Pro. New announcements include native FilmLight Baselight integration, an app for Apple TV, and C2C partnerships with FiLMiC Pro.

If you are a current Frame.io customer without any Adobe subscription – no problem. Nothing changes for you. I’ve been using Frame.io since it launched and have been happy with the service. There are occasional glitches, but no worse than with any other internet service, including your regular email provider. Better yet, clients love the process. It’s not perfect, but it is one of the better review-and-approval sites and services on the market. If you are using Frame.io for the first time by virtue of your Adobe subscription, then you are bound to see your daily workflow enhanced.

©2022 Oliver Peters

Application Color Management

I’ve previously written about the challenge of consistent gamma and saturation across multiple monitoring points. Getting an app’s viewer, QuickTime playback, and the SDI output to all look the same can be a fool’s errand. If you work on a Mac, then there are pros and cons to using Mac displays, like an iMac. In general, Apple’s “secret sauce” works quite well for Final Cut Pro. However, if you edit or grade in Resolve, Premiere Pro, or Media Composer, then you aren’t quite as lucky. I’ve opined that you might actually need to generate separate files for broadcast and web deliverables.

The extra step of optimized file creation isn’t practical for most. In my case, the deliverables I create go to multiple platforms; however, few are actually destined for traditional broadcast or to be played in a theater. In most cases, my clients are creating content for the web or to be streamed in various venues. I predominantly edit in Premiere Pro and grade with Resolve. I’ve been tinkering with the color management settings in each. The goal is a reasonably close match across both app viewers, the output I send to a Rec 709 display, and the look of the exported file when I view it in QuickTime Player on the computer.

Some of this advice might be a bit contrary to what I previously wrote. Both situations are still valid, depending on the projects you edit or grade. Granted, this is based on what I see on iMac and iMac Pro displays, so it may or may not be consistent with other display brands or when using PCs. And this applies to SDR, Rec 709, or sRGB outputs and not HDR grading. As a starting point, leave the Mac display profile alone. Don’t change its default profile. Yes, I know an iMac is P3, but that’s simply something you’ll have to live with.

Adobe Premiere Pro

Premiere Pro’s Rec 709 timelines are based on 2.4 gamma, which is the broadcast standard. However, an exported file is displayed with a QuickTime color profile of 1-1-1 (1.96 gamma). The challenge is to work with the Premiere Pro viewer and see an image that matches the exported file. I have switched to disabling (unchecking) Display Color Management in General Preferences. This might seem counter-intuitive, but it results in a setting where the viewer, a Rec 709 output to a monitor, and the exported image all largely look the same.
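To see why that gamma difference matters, here’s a quick sketch (my own simplification, treating both as pure power-law gammas and ignoring the true Rec 709/BT.1886 curve shapes) of how the same mid-gray pixel value is rendered under each assumption:

    # The same normalized pixel value displayed under two gamma assumptions.
    code_value = 0.5                 # mid-gray pixel in the exported file

    broadcast = code_value ** 2.4    # ~0.19 relative luminance on a 2.4 gamma display
    quicktime = code_value ** 1.96   # ~0.26 relative luminance under the 1-1-1 QuickTime profile

    print(f"2.4 gamma:  {broadcast:.3f}")
    print(f"1.96 gamma: {quicktime:.3f}")

Same file, but the QuickTime interpretation renders midtones and shadows noticeably lighter, which is exactly the mismatch these preference changes are meant to minimize.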

If you enable Display Color Management, you’ll get an image in the viewer with a somewhat closer match for saturation, but gamma will be darker than the QuickTime or the video monitor. If you disable this setting, the gamma will be a better match (shadows aren’t crushed); however, the saturation of reds will be somewhat enhanced in the Premiere Pro viewer. It’s a bit of a trade-off, but I prefer the setting to be off.

Blackmagic Design DaVinci Resolve

Resolve has multiple places that can trip you up. But I’ve found that once you set them up, the viewer image will be a closer match to the exported file and to the Rec 709 image than is the case for Premiere Pro. There are three sections to change. The first is in the Project Settings pane (gear menu). This is the first place to start with every new Resolve project. Under Color Management, set the Timeline color space to Rec. 709 (Scene). I’ve experimented with various options, including ACES. Unfortunately, the ongoing ACES issue with fluorescent colors burned me on a project, so I’ll wait until I really have a need to use ACES again. Hopefully it will be less of a work-in-progress then. I’ve gone back to working in Rec. 709, but new for me is to use the Scene variant. I also turn on Broadcast Safe, but use the gentler restricted range of -10 to 110.

The next adjustment is in Resolve Preferences. Go to the General section and turn on: Use 10-bit precision in viewers, Use Mac display color profiles, and Automatically tag Rec. 709 Scene clips as Rec. 709-A. What this last setting does is make sure the exports are tagged with the 1-1-1 QuickTime color profile. If this is not checked, the file will be exported with a profile of 1-2-1 (2.4 gamma) and look darker when you play it to the desktop using QuickTime Player.

The last setting is on the Deliver page. Data levels can be set to Auto or Video. The important thing is to set the Color Space Tag and Gamma Tag to Same as Project. By doing so, the exported files will adhere to the settings described above.

Making these changes in Premiere Pro and Resolve gives me more faith in what I see in the viewer of each application. My exports are a closer match with fewer surprises. Is it a perfect match? Absolutely not. But it’s enough in the ballpark for most footage to be functional for editing purposes. Obviously you should still make critical image and color adjustments using your scopes and a calibrated reference display, but that’s not always an option. Going with these settings should mean that if you have to go by the computer screen alone, then what you see will be close to what you get!

©2021 Oliver Peters

May 2021 Links

It’s time to check out some more articles and reviews that I’ve written for other publications since January, but which I haven’t reposted here.

Understanding Premiere Pro’s Color Management 

I wrote about trusting Apple Displays. This is a follow-up article about color management and Premiere Pro (Pro Video Coalition).

Aviation and Final Cut Pro – Combining Passions for Compelling Videos 

YouTube influencers are a big part of the content creation landscape today. Their videos cover many different niches and often have a surprisingly large base of followers. I take a look at YouTube, aviation, and the use of Final Cut Pro to post-produce these videos (FCP.co).

Is Your Audio Mix Too Loud?

Unless you’ve delivered master files for broadcast, you might not be as tuned into meeting delivery specs, especially when it comes to the perceived loudness levels of your mix. I discuss working with audio in Final Cut Pro to mix and master at optimal levels (FCP.co).

Color Finale Transcoder – BRAW, ARRIRAW, DNG and CinemaDNG in FCP

The folks at Color Trix have come up with an ingenious solution to augment Final Cut Pro’s camera raw support. The new Color Finale Transcoder adds additional camera raw formats, notably Blackmagic RAW. Check out my review of the software (FCP.co).

©2021 Oliver Peters