Audio Design Desk

The concept of the digital audio workstation stems from a near-century-old combination of a recording system and a mixing desk. Nearly every modern DAW is still built around that methodology. Gabriel Cowan, CEO and co-founder of Audio Design Desk, sought to modernize the approach with a DAW focused on sound design, using the power of metadata for workflow efficiency. The application was launched a couple of years ago and has since won several trade show awards for innovation, including a Product of the Year Award for audio just last week at the 2022 NAB Show.

Every video editor knows that a kicking soundtrack can often elevate an otherwise lackluster video. Audio Design Desk is intended to do just that, regardless of whether you are an editor, musician, or sound designer. The application is currently a Mac-only product that supports both Intel and M1 Macs natively. Its sound library breaks down into sound design (synthetic sounds, like whooshes, drones, hits, and risers), foley (real-world sound effects), ambiences, and music.

Click here to continue this article at Pro Video Coalition.

©2022 Oliver Peters

Generalists versus Specialists

“Jack of all trades, master of none” is a quote most are familiar with. But the complete quote – “Jack of all trades, master of none, but oftentimes better than master of one” – actually conveys nearly the opposite meaning. In the world of post production, you have Jacks and Jills of all trades (generalists) and masters of one (specialists). While editors are certainly specialized in storytelling, I would consider them generalists when comparing their skill sets to those of other specialists, such as visual effects artists, colorists, and audio engineers. Editors often touch on sound, effects, and color in a more general (often temp) way to get client approval. The others have to deliver the best, final results within a single discipline. Editors have to know the tools of editing, but not the nitty gritty of color correction or visual effects.

This is closely tied to the Pareto Principle, which most know as the 80/20 Rule. This principle states that 80% of the consequences come from 20% of the causes, but it’s been applied in various ways. When talking about software development, the 80/20 Rule predicts that 80% of the users are going to use 20% of the features, while only 20% of users will find a need for the other features. The software developer has to decide whether the target customer is the generalist (the 80% user) or the specialist (the 20% user). If the generalist is the target, then the challenge is to add some specialized features to service the advanced user without creating a bloated application that no one will use.

Applying these concepts to editing software development

When looking at NLEs, the first question to ask is, “Who is defined as a video editor today?” I would separate editors into three groups. One group would be the “I have to do it all” group, which generates most of what we see on local TV, corporate videos, YouTube, etc. These are multi-discipline generalists who have neither the time nor the interest to deal with highly specialized software. In the case of true one-man bands, the skill set also includes videography, plus location lighting and sound.

The “top end” – national and international commercials, TV series, and feature films – could be split into two groups: craft (aka film or offline) editors and finishing (aka online) editors. Craft editors are specialists in molding the story, but generalists when it comes to operating the software. Their technical skills don’t have to be the best, but they need to have a solid understanding of visual effects, sound, and color, so that they can create a presentable rough cut with temp elements. The finishing editor’s role is to take the final elements from sound, color, and the visual effects houses, and assemble the final deliverables. A key talent is quality control and attention to detail; therefore, they have no need to understand dedicated color, sound, or effects applications, unless they are also filling one of those roles.

My motivation for writing this post stemmed from an open letter to Tim Cook, which many editors have signed – myself included. Editors have long been fans of Apple products and many gravitated from Avid Media Composer to Apple Final Cut Pro 1-7. However, when Apple reimagined Final Cut and dropped Final Cut Studio in order to launch Final Cut Pro X, many FCP fans were in shock. FCPX lacked a number of important features at first. A lot of these elements have since been added back, but that development pace hasn’t been fast enough for some, hence the letter. My wishlist for new features is quite small. I recognize Final Cut for what it is in the Apple ecosystem. But I would like to see Apple work to raise the visibility of Final Cut Pro within the broader editing community. That’s especially important when the decision of which editing application to use is often not made by editors.

Blackmagic Design DaVinci Resolve – the über-app for specialists

This brings me to Resolve. Editors point to Blackmagic’s aggressive development pace and the rich feature set. Resolve is often viewed as the greener pasture over the hill. I’m going to take a contrarian’s point of view. I’ve been using Resolve since it was introduced as Mac software and recently graded a feature film that was cut on Resolve by another editor.

Unfortunately, the experience was more problematic than I’ve had with grades roundtripped to Resolve from other NLEs. Its performance as an editor was quite slow when I tried to move around in the timeline, replace shots, or trim clips. Resolve wouldn’t be my first NLE choice when compared to Premiere Pro, Media Composer, or Final Cut Pro. It’s a complex program by necessity. The color management alone is enough to trip up even experienced editors who aren’t intimately familiar with what the various settings do to the image.

DaVinci Resolve is an all-in-one application that integrates editing (two different editing models), color correction (aka grading), Fusion visual effects, and the Fairlight DAW. Historically, all-in-ones have not had a great track record in the market. Other such über-apps would include Avid|DS and Autodesk Smoke. Avid pulled the plug on DS and Autodesk moved the Flame/Smoke/Lustre product family to a subscription model. Neither DS nor Smoke as a standalone application moved the needle for market share.

At its core, Resolve is a grading application with Fusion and Fairlight added in later. Color, effects, and audio mixing are all specialized skills, and the software is designed so that each specialist is comfortable with the toolset presented on those pages/modes. I believe Blackmagic has been attempting to capitalize on Final Cut editor discontent and create the mythical “FCP8” or “FC Extreme” that many wanted. However, adding completely new and disparate functions to an application that at its core is designed around color correction can make it quite unwieldy. Beginning editors are never going to touch most of what Resolve has to offer, and the specialists would rather have a dedicated specialized tool, like Nuke, After Effects, or Pro Tools.

Apple Final Cut Pro – reimagining modern workflows for generalists

Apple makes software for generalists. Pages, Numbers, Keynote, Photos, GarageBand, and iMovie are designed for that 80%. Apple also creates advanced software for the more demanding user under the ProApps banner (professional applications). This is still “generalist” software, but designed for more complex workflows. That’s where Final Cut Pro, Motion, Compressor, and Logic Pro fit.

Apple famously likes to “skate to where the puck will be” and having control over hardware, operating system, and software gives its teams special insight to develop software that is optimized for the hardware/OS combo. As a broad-based consumer goods company, Apple also understands market trends. In the case of iPhones and digital photography, it plays a huge role in driving those trends.

When Apple launched Final Cut Pro X the goal was an application designed for simplified, modernized workflows – even if “Hollywood” wasn’t quite ready. This meant walking away from the comprehensive “suite of tools” concept (Final Cut Studio). They chose to focus on a few applications that were better equipped for where the wider market of content creators was headed – yet, one that could still address more sophisticated needs, albeit in a different way.

This reimagining of Final Cut Pro had several aspects to it. One was to design an application that could easily be used on laptops and desktop systems and was adaptable to single and dual screen set-ups. It also introduced workflows based on metadata to improve edit efficiency. It was intended as a platform with third parties filling in the gaps. This means you need to augment FCP to cover a few common industry workflows. In short, FCP is designed to appeal to a broad spectrum of today’s “professionals” and not how one might have defined that term in the early 1990s, when nonlinear editing first took hold.

For a developer, it gets down to who the product is marketed towards and which new features to prioritize. Generalists are going to grow the market faster, hence a better return on development resources. The more complex an application becomes, the more likely it is to have bugs or break when the hardware or OS is updated. Quality assurance testing (QA) expands exponentially with complexity.

Final thoughts

Do my criticisms of Resolve mean that it’s a bad application? No, definitely not! It’s powerful in the right hands, especially if you work within its left-to-right workflow (edit -> Fusion -> color -> Fairlight). But, I don’t think it’s the ideal NLE for craft editing. The tools are designed for a collection of specialists. Blackmagic has been on this path for a rather long time now and seems to be at a fork in the road. Maybe the company should step back, start from a clean slate, and develop a fresh, streamlined version of Resolve. Or, split it up into a set of individual, focused applications.

So, is Final Cut Pro the ideal editing platform? It’s definitely a great NLE for the true generalist. I’m a fan and use it when it’s the appropriate tool for the job. I like that it’s a fluid NLE with a responsive UI design. Nevertheless, it isn’t the best fit for many circumstances. I work in a market and with clients that are invested in Adobe Creative Cloud workflows. I have to exchange project files and make sure plug-ins are all compatible. I collaborate with other editors and more than one of us often touches these projects.

Premiere Pro is the dominant NLE for me in this environment. It also clicks with how my mind works and feels natural to me. Although you hear complaints from some, Premiere has been quite stable for me in all my years of use. Premiere Pro hits the sweet spot for advanced editors working on complex productions without becoming overly complex. Product updates over the past year have provided new features that I use every day. However, if I were in New York or Los Angeles, that answer would likely be Avid Media Composer, given Avid’s continued dominance in broadcast operations and feature film post.

In the end, there is no right or wrong answer. If you have the freedom to choose, then assess your skills. Where do you fall on the generalist/specialist spectrum? Pick the application that best meets your needs and fits your mindset.

For another direct comparison check out this previous post.

©2022 Oliver Peters

Think you can mix… Round 2

A month ago I discussed LEWITT Audio’s music challenge mixing contest. While there’s no current contest, all of the past tracks are still available for download, so you can try your hand at mixing. To date, there are six songs running about 3 1/2 minutes each. LEWITT has selected a cross-section of eclectic European artists, showcasing styles that include R&B, rock, jazz, folk, swing, and punk. I mixed one of the songs for that first post, but decided to take my own suggestion and ended up mixing all six. I have posted that compilation to Vimeo.

As I previously mentioned, I don’t mix songs for a living, so why add this to my workload? Well, partially for fun, but also to learn. Consider it a video editor’s version of the “busman’s holiday.”

Each of the six songs poses different challenges. LEWITT’s marketing objective is to sell microphones and these recordings showcase those products. As such, the bands have been recorded in a live or semi-live studio environment. This means mics placed in front of instrument cabs, as well as mics placed all around the drum kit. Depending on the proximity of the band members to each other and how much acoustic baffling was used, some instrument tracks are more isolated than others. Guitars, bass, and keys might have additional direct input (DI) tracks for some songs, as well as additional overdubs for second and third parts. Track counts ranged from 12 to 28. As is typical, drums had more tracks to deal with than any other instrument.

The performing styles vary widely, which also forces some engineering decisions about how you mix. Move Like A Ghost was deliciously nasty by design. Do you try to clean that up or just embrace it? 25 Reasons is closer to a live stage performance with tons of leakage between tracks.

The video component

LEWITT, in conjunction with the performers, produced videos for five of the six songs. The mixes are specifically for LEWITT, not the official release versions of any of these songs. Therefore, the LEWITT videos were designed to accompany and promote the contest tracks. This makes it easy to sync your mix to each video, which is what I did for my compilation. In the case of Move Like a Ghost, LEWITT did not produce a full video with Saint Agnes. So, I pulled a stylized live music video for the band from YouTube for my version of the mix. I assembled this reel in Final Cut Pro, but any editing was really just limited to titles and the ins/outs for each song. The point is the mix and not the editing.

On the other hand, working with the video did inform my mix. For example, if a lead instrument had a riff in the song that’s shown in the video, then I’d emphasize it just a bit more in the mix. That’s not a bad thing, per se, if it’s not overdone. One quirk I ran into was on The Palace. The tracks included several vocal passes, using different mics. I picked the particular vocal track that I thought sounded best in the mix. When I synced it up to the video, I quickly realized that one vocal line (on camera) was out-of-sync and that LEWITT’s mixers must have blended several vocal performances in their own mix. Fortunately, it was an easy fix to use that one line from a different track and then everything was in sync.

Working the DAW

One of the songs even included a Pro Tools session for Pro Tools users. Audition and Resolve (Fairlight) could have been options as well, but Logic Pro is my DAW of choice. It comes with really good built-in plug-ins, including reverbs, EQs, compressors, and amp modelers. I used all of these, plus a few paid and free third-party plug-ins from Accusonus, Analog Obsession, iZotope, FabFilter, Klevgrand, Sound Theory, TBProAudio, and Tokyo Dawn Labs.

One big selling point for me is Logic’s track stack feature, which is a method of grouping and organizing tracks and their associated channel strips. Use stacks to organize instrument tracks by type, such as drums, guitars, keys, vocals, etc. A track stack can be a folder or a summing stack. When summing is selected, a track stack functions like a submix bus. Channel strips within the stack are summed and additional plug-ins can then be applied to the summing stack. If you think in terms of an NLE, then a track stack is a bit like a compound clip or nest. You can collapse or expand the tracks that have been grouped into the stack with a reveal button. Want to bring your visual organization from a couple of dozen tracks down to only a few? Then track stacks organized by instrument are the way to go.
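If it helps to make the summing stack idea concrete, here’s a minimal sketch in Python (hypothetical classes for illustration only – not any actual Logic Pro API) of how the member tracks are summed into a single submix signal that the stack’s own fader then controls.

```python
# Minimal sketch of a summing track stack acting as a submix bus.
# Hypothetical classes for illustration only -- not the Logic Pro API.

class Track:
    def __init__(self, name, samples):
        self.name = name
        self.samples = samples              # mono audio as a list of floats

class SummingStack:
    """Groups tracks; their outputs are summed, then the stack's own
    fader (and any stack-level plug-ins) act on the combined signal."""
    def __init__(self, name, tracks, fader_gain=1.0):
        self.name = name
        self.tracks = tracks
        self.fader_gain = fader_gain        # the stack's channel strip fader

    def output(self):
        length = max(len(t.samples) for t in self.tracks)
        mixed = [0.0] * length
        for t in self.tracks:
            for i, s in enumerate(t.samples):
                mixed[i] += s               # summing = submixing
        return [s * self.fader_gain for s in mixed]

drums = SummingStack("Drums", [
    Track("Kick",  [0.5, 0.0, 0.5]),
    Track("Snare", [0.0, 0.4, 0.0]),
], fader_gain=0.8)

print(drums.output())                       # one combined drum submix
```

One fader move on the stack rebalances the whole kit against the rest of the mix, which is exactly how you’d use a submix bus.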

For those unfamiliar with Logic Pro basics, here’s a simplified look at Logic’s signal flow. Audio from the individual track flows into the channel strip. That signal first hits any plug-ins, EQ, or filtering, and then flows out through the volume control (fader). If you need to raise or lower the volume of a track going into the plug-in chain, then you either have to adjust the input of the plug-in itself, or add a separate gain plug-in as the first effect in the chain. The volume control/fader affects the level after plug-ins have been applied. This signal is then routed through the track stack (if used). On a summing track stack, the signal flow through its channel strip works the same way – plug-ins first, then volume fader. Of course, it can get more complex with groups, sends, and side-chaining.

All track stack signals, as well as any channel not placed into a track stack, flow through the stereo out bus (on a stereo project) – again, into the plug-ins first and then out through the volume control. In addition to the stereo output bus, there’s also a master output fader, which controls the actual volume of the file written to the drive. If you place a metering plug-in into the chain of the stereo output bus, it indicates the level for peaks or loudness prior to the volume control of the stereo output AND the master output bus. Therefore, I would recommend that you ALWAYS leave both faders at their zero default, in order to get accurate readings.
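To make the metering point concrete, here’s a tiny gain-staging sketch in Python (simplified stand-ins, not Logic Pro’s actual processing) showing why a meter placed in the stereo out’s plug-in chain only matches the file written to disk when the output faders sit at unity.

```python
# Toy gain-staging sketch: plug-ins first, then the fader(s).
# Simplified stand-ins for illustration -- not Logic Pro's actual processing.

def peak(samples):
    return max(abs(s) for s in samples)

track = [0.3, -0.6, 0.5]                      # audio reaching the stereo out
compressor = lambda s: [x * 0.9 for x in s]   # stand-in for a real plug-in

after_plugins = compressor(track)
print("meter in the plug-in chain reads:", peak(after_plugins))

stereo_fader = 0.5                            # pulled down AFTER the meter
master_fader = 1.0
written_to_disk = [x * stereo_fader * master_fader for x in after_plugins]
print("level actually written to disk:", peak(written_to_disk))

# The two readings only agree when both output faders sit at their
# zero (unity) default, which is the reason for the recommendation above.
```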

All mixes are subjective

The approach to the mix varies with different engineers. What worked best for me was to concentrate on groups of instruments. The order isn’t important, but start with drums, for instance. The kit will likely have the highest number of tracks. Work with the soloed drum tracks to get a well-defined drum sound as a complete kit. Same for guitars, vocals, or any other instrument. Then work with the combination to get the right overall balance. Lastly, add and adjust mastering plug-ins to the stereo output channel strip to craft the final sound.

Any mix is totally subjective and technical perfection is merely an aspiration. I personally prefer my mix of Dirty to the others. The song is fun and the starting tracks are nice and clean. But I’m certainly happy with my mixes on the others, in spite of sonic imperfections. To make sure your mix is as good as it can be, check it in different listening environments. Fortunately, Audition can still burn your mix to an audio CD. Assuming you still own a disc burner and CD players, then it’s a great portable medium to check your mix in the car or on your home stereo system. Overall, during the course of mixing and then reviewing, I probably checked these mixes on four different speaker set-ups, plus headphones and earbuds. The latter turned out to be the best way to detect stereo imaging issues, though not necessarily the most accurate sound otherwise. But that is probably the way a large swath of people consume music these days.

I hope you enjoy the compilation if you take the time to listen. The order of the songs is:

The Seeds of your Sorrow – Spitting Ibex

25 Reasons – Louis Berry and band

The Palace – Cosmix Studios session featuring Celina Ann, with Thomas Hechenberger (guitar), Valentin Omar (keys), David Leisser (drums), and Bernhard Osanna (bass)

Home – AVEC

Dirty – Marina & the Kats

Move Like a Ghost – Saint Agnes

©2022 Oliver Peters

Pro Tips for FCP Editors

Every nonlinear editing application has strengths and weaknesses. Each experienced editor has a list of features and enhancements that they’d like to see added to their favorite tool. Final Cut Pro has many fans, but also its share of detractors, largely because of Apple’s pivot when Final Cut Pro changed from FCP7 to FCPX a decade ago. That doesn’t mean it isn’t up to professional-level work. In fact, it’s a powerful tool in its own right. But there are ways to adapt it to the workflows you may miss from competing NLEs. I discuss five of these in my article Making Final Cut More Pro over at FCP.co.

©2022 Oliver Peters

CineMatch for FCP

Last year FilmConvert, developers of the Nitrate film emulation plug-in, released CineMatch. It’s a camera-matching plug-in designed for multiple platforms – including operating systems and different editing/grading applications. The initial 2020 release worked with DaVinci Resolve and Premiere Pro. Recently FilmConvert added Final Cut Pro support. You can purchase the plug-in for individual hosts or as a bundle for multiple hosts. If you bought the bundled version last year, then that license key is also applicable to the new Final Cut Pro plug-in. So, nothing extra to purchase for bundle owners.

CineMatch is designed to work with log and raw formats and a wide range of camera packs is included within the installer. To date, 70 combinations of brands and models are supported, including iPhones. FilmConvert has created these profiles based on the color science of the sensor used in each of the specific cameras.

CineMatch for FCP works the same way as the Resolve and Premiere Pro versions. First, select the source profile for the camera used. Next, apply the desired target camera profile. Finally, make additional color adjustments as needed.

If you shoot with one predominant A camera that is augmented by B and C cameras of different makes/models, then you can apply CineMatch to the B and C camera clips in order to better match them to the A camera’s look.

You can also use it to shift the look of a camera to that of a different camera. Let’s say that you want a Canon C300 to look more like an ARRI Alexa or even an iPhone. Simply use CineMatch to do that. In my example images, I’ve adjusted Blackmagic and Alexa clips so that they both emulate the color science of a Sony Venice camera.

When working in Final Cut Pro, remember that it will automatically apply Rec 709 LUTs to some log formats, like ARRI Alexa Log-C. When you plan to use CineMatch, be sure to also set the Camera LUT pulldown selector in the inspector pane to “none.” Otherwise, you will be stacking two LUT conversions, resulting in a very ugly look.
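To illustrate why stacking matters, here’s a toy example in Python. The curve below is a simplified stand-in for a log-to-display conversion (it is not ARRI’s actual Log-C math); the point is simply that applying any such conversion twice pushes the image well past where it should be.

```python
# Toy illustration of stacking two log-to-Rec 709 style conversions.
# The curve is a simplified stand-in, NOT ARRI's actual Log-C transform.

def log_to_display(value):
    """Rough stand-in for a log-to-display LUT: adds contrast and gain."""
    return min(1.0, value ** 0.6 * 1.1)

log_pixel = 0.40                       # a mid-tone value in a log encoding

once = log_to_display(log_pixel)       # what you want: a single conversion
twice = log_to_display(once)           # built-in Camera LUT + CineMatch stacked

print(f"converted once:  {once:.2f}")
print(f"converted twice: {twice:.2f}  (mid-tones pushed way up -- the ugly look)")
```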

Once camera settings have been established, you can further adjust exposure, color balance, lift/gamma/gain color wheels, saturation, and the luma curve. There is also an HSL curves panel to further refine hue, saturation, and luma for individual color ranges. This is helpful when trying to match two cameras or shots to each other with greater accuracy. FCP’s comparison viewer is a great aid in making these tweaks.

As a side note, it’s also possible to use CineMatch in conjunction with FilmConvert Nitrate (if you have it) to not only adjust color science, but then to subsequently emulate different film stocks and grain characteristics.

CineMatch is a useful tool when you work with different camera types and want to achieve a cohesive look. It’s easy and quick to use with little performance impact. CineMatch now also supports M1 Macs.

©2021 Oliver Peters