Could Fairlight be your next DAW?

When I review audio plug-ins and software, it’s from my perspective as a video editor. I’m not a recording engineer or mixer; however, I do dabble with music mixes as a hobbyist and to improve my audio chops. As such, I occasionally delve into digital audio workstation software, such as Sound Forge, Audio Design Desk, and others. My favorite is Apple Logic Pro, but as a DaVinci Resolve and Adobe user, I also have Fairlight (part of Resolve) and Adobe Audition. I touched on the Fairlight page in some detail as part of my Resolve Studio 18 review, but in this post I want to focus on it purely from the perspective of a DAW user on music projects.

Blackmagic’s reimagining

When Blackmagic Design acquired the assets of Fairlight, the software was refreshed and developed into the Fairlight page within DaVinci Resolve. Even though it’s nested inside of a video editing and grading tool, Fairlight is capable of being a standalone audio application. No need to ever have video enter into the equation.

Fairlight is integrated into both DaVinci Resolve (free) and Resolve Studio ($295). The Studio version can be activated on two computers at the same time. Nearly all Fairlight features and effects are the same in both versions, with the exception of Atmos and spatial audio mixing/monitoring, which require the Studio version. If your only interest is stereo recording and mixing, then Resolve is one of the only truly free DAWs on the market. No significant feature restrictions and no Blackmagic hardware required. Plus, it works in Windows, Linux, and macOS.

Along with this software development, Blackmagic Design has expanded the ecosystem of companion Fairlight products. These include an accelerator card, a modular chassis, control surfaces, controllers, and an audio interface. The Fairlight page also supports Blackmagic’s two editor keyboards. You can run Fairlight without any external hardware, yet it’s scalable up to a complete recording studio rig. On a Mac, any Core Audio device will do, so recording into Fairlight and monitoring the output is compatible with simple USB audio interfaces, like Focusrite, PreSonus and others.

Understanding the interface

The Fairlight interface is compatible with single and dual-display set-ups and uses UI panels that can be turned on and off or slid onto the screen as needed. You can show or hide individual pieces of the mixer, as well. Unfortunately, on a single-display system, like an iMac, you cannot display the mixer panel full-screen. A project with 20 to 30 or more source tracks requires left-to-right scrolling. However, since the 18.1 update, the meter bridge panel allows for two rows of meters.

The mixer uses a channel strip format for each track, which includes input/output/send routing, effects, and a built-in parametric equalizer and compressor. This is much like the channel strip of a traditional analog studio console, like an SSL or Neve. Unlike some other DAWs, you can also change the signal order of effects, EQ, and dynamics (compression) within each channel strip.

Modern plug-ins

Resolve includes Fairlight FX audio plug-ins that cover most common needs. But since this software is targeted towards the film and TV customer, it doesn’t include music-centric plug-ins, like the guitar amp and pedal emulations offered in Logic Pro. That focus is true of the plug-in presets, as well. For example, the factory preset choices in the compressor will be for dialogue and not musical instruments, like a drum kit or guitar. That doesn’t mean you can’t do music with these plug-ins. Presets are just suggestions anyway, so you should tweak based on what sounds right to you.

Fairlight does not color the sound. The sonic character, interface, and plug-in design take a clean, modern approach. There are no vintage options and none of the plug-ins are designed as skeuomorphic emulations of studio gear synonymous with classic recordings from the 70s. After all, film re-recording mixers have never been particularly precious about certain consoles or outboard gear from ages ago. Other than maybe a love for old Nagras, I doubt there’s much fondness for old audio gear like mag dubbers. At least not in the same way that music recording engineers still like to use analog recorders in the signal chain.

If you do want vintage tools, then Fairlight supports third-party AU and VST plug-ins. However, as with other video applications, I’ve found that some of the skeuomorphic effects don’t always work or look right. For example, I often use the free VU meter from TBProAudio. In Fairlight, only the AU version will appear as intended. And if you own an M1 or M2 Mac, then double-check that your favorite third-party plug-in is natively supported.

Fairlight isn’t just for audio post

Avid’s Pro Tools is the 800-pound gorilla. But many Pro Tools users are often frustrated with the cost of staying current and dealing with Avid as a company. While such concerns may or may not be justified, Pro Tools isn’t the only game in town. Unless you need to interchange Pro Tools projects, there are plenty of alternatives. And that’s where Fairlight comes in. First of all, if audio post for film and TV is your primary focus, then Fairlight is up to the task. Resolve will import XML, FCPXML, and AAF files for both color and sound finishing. Fairlight includes an ADR recording routine, a free sound effects library, and a foley sampler plug-in. But let me focus on Fairlight as a music DAW.

I started with multitracks of song covers available from Warren Huart’s “Produce Like A Pro” YouTube channel. I didn’t record my own tracks, other than to test how recording might work. I’m a big believer that a great mix is achieved by doing 90% of the work at the time of the studio recording. It’s not about building the sound through plug-ins and tricks, but getting the right blend of gear, mics, and performance from the players. That was already there in the multitracks, so the mix was more about the right balance of these elements.

Achieving a successful mix

Fairlight works with as many tracks and busses as are created in your timeline. My standard layout for mixing is to use summing busses. You can create as many as you need. The 35 tracks for this song include drums, percussion, bass, piano, and electric and acoustic guitars. I route each set of instrument tracks to a buss dedicated to that group, even if there’s only one instrument track in that group. These six busses are then routed to a submix buss, which in turn is routed to the master buss for output. This allows for gain staging and quickly balancing levels. The default Fairlight layout automatically routes the first buss (drums in my case) as the output to the speakers and as the output on the Deliver page. Be sure to change each of these to your master buss for the proper intended output.
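The routing described above can be sketched as a simple signal-flow model. This is a hypothetical illustration of summing-bus gain staging, not Fairlight's internal engine; all function and variable names here are my own.

```python
# Minimal sketch of a summing-bus topology (hypothetical names):
# tracks -> instrument buss -> submix buss -> master buss.

def db_to_gain(db):
    """Convert a fader setting in dB to a linear gain factor."""
    return 10 ** (db / 20.0)

def sum_buses(sources, fader_db=0.0):
    """Mix equal-length sample lists, then apply the buss fader."""
    g = db_to_gain(fader_db)
    return [g * sum(samples) for samples in zip(*sources)]

# Two "tracks" feeding a drums buss, routed on to submix and master.
kick  = [0.5, 0.4, 0.3]
snare = [0.1, 0.2, 0.1]

drums  = sum_buses([kick, snare], fader_db=-6.0)   # instrument buss
submix = sum_buses([drums], fader_db=0.0)          # submix buss
master = sum_buses([submix], fader_db=0.0)         # master buss
```

Pulling the drums buss down 6 dB here attenuates both source tracks at once, which is exactly why a grouped layout makes balancing faster than riding 35 individual faders.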

My goal was to come out with a result that hit desired loudness targets and sounded good to me, mainly using the stock plug-ins. You’re going to adjust levels, but most of the effects center around EQ, compression, and reverb. Each of these is adequately covered by the complement of Fairlight FX. If you have singers, then there are also vocal processing effects, like de-essing. However, an investment in iZotope RX is certainly a useful add-on. For example, RX includes a specific tool to remove or reduce guitar squeaks and string noise. The Resolve 18.1 update added many audio-centric features, including a new voice isolation feature. It works well for any vocal situation and in my opinion has fewer negative artifacts than most of the competing options.

In my test mix, I adjusted level, panning, EQ, and compression on each channel strip. At the buss level, I added more EQ and compression, plus some reverb. The last stage was a multiband compressor and a brick wall limiter on the submix buss. Only meter plug-ins were added to the master buss. Of course, Fairlight includes its own useful set of meters for level and loudness.

Fairlight is actually quite good for music production, editing, and mixing. Since it’s built into an NLE, the project supports multiple mixes. You can have bins and timelines to organize the tracks and mixes for various songs, as well as different versions of each mix. Resolve 18 added new cloud collaboration tools; however, you can easily collaborate on mixes by exporting a timeline file to send to a colleague. Assuming the other system has access to the same audio files and third-party plug-ins (if used), then it’s simply a matter of importing that timeline file.

Processing for this number of tracks and effects was easily handled by my iMac. It could have handled more, including more intense third-party plug-ins, like Gullfoss, Ozone, FabFilter, or Sonible. If you really need to go BIG, then Blackmagic Design promises up to 2,000 real-time tracks for the full Fairlight hardware installation! So if Pro Tools isn’t in the cards for you, then look over Fairlight and Resolve. It might just be right for your music mixing needs.

Additional thoughts

Some of the comments I received on the PVC version of this article (see link below) pointed out that Fairlight does not include such music-centric tools as MIDI and a piano roll, like some other DAWs do. While this is true, these are tools used by music creators working with synthetic instruments, like software samples for guitar, strings, drums, etc. That’s not a universal requirement, especially if you record and mix live performers using real instruments. Certainly if you need those specialized features, then other DAWs are a better fit for you.

It’s important to remember that digital audio workstation (DAW) software is used for a wide variety of audio production tasks. Such productions are often recorded and edited with tools that do not include some of these music features either. For example, Adobe Audition is widely used in the production of podcasts and radio commercials. So while Fairlight might not fit all needs, there’s little harm in trying a free application and then seeing where that leads.

Want to try mixing in Fairlight for yourself, but don’t have the tracks? Check out these 50 free, downloadable multitrack song sets from Warren Huart. I’ve only scratched the surface, so be sure to check out Blackmagic’s Fairlight training series.

This review also appears at Pro Video Coalition.

©2023 Oliver Peters

Sonible smart:comp 2

Audio software plug-ins (effects and filters) come in two forms. On one hand, you have a wide range of products that emulate vintage analog hardware, often showcasing a skeuomorphic interface design. If you know how the original hardware version worked and sounded, then that will inform your expectations for the software equivalent. The other approach is to eschew the sonic and visual approach of analog emulation and build a plug-in with a modern look and sound. Increasingly, this second group of plug-ins employs intelligent profiles and “assistants” to analyze your track and provide you with automatic settings that form a good starting point.

Austria has a long and proud musical history and heritage of developing leading audio products. There are many high-end Austrian audio manufacturers. One of those companies is Sonible, which develops both hardware and software products. The Sonible software falls into that second camp of plug-ins, with clean sonic qualities and a modern interface design. Of key interest is the “smart:” category, including smart:comp 2, smart:limit, smart:EQ 3, smart:reverb, and smart:EQ live. The first four of these are also available as the smart:bundle.

Taking a spin with Sonible’s spectro-dynamic compressor

I tested out smart:comp 2, which is billed as a spectro-dynamic compressor. It’s compatible with Windows and macOS and installs AU, VST, VST3, and AAX (Avid) versions. Licensing uses an iLok or is registered to your computer (up to two computers at a time). Let’s start with why these are “smart.” In a similar fashion to iZotope’s Ozone and others, smart:comp 2 can automatically analyze your track and assign compressor settings based on different profiles. The settings may be perfect out of the gate or form a starting point for additional adjustments. Of course, you can also just start by making manual adjustments.

Spectro-dynamic is a bit of a marketing term, but in essence, smart:comp 2 works like a highly sophisticated multiband compressor. The compression ranges are based on the sonic spectrum of the track. Instead of the four basic bands of most multiband compressors, smart:comp 2 carves up the signal into 2,000 slices to which compression is dynamically applied. As a compressor, this plug-in is equally useful on individual tracks or on the full mix as a mastering plug-in.
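The per-slice idea can be illustrated with a toy model. This is not Sonible's actual algorithm (which is proprietary); it is a simplified sketch showing how an independent downward-compression gain could be computed for each frequency slice from that slice's level.

```python
# Toy illustration of spectral compression: each frequency slice gets
# its own gain reduction based on its own level (hypothetical values;
# smart:comp 2 uses roughly 2,000 such slices, computed dynamically).

def band_gain_db(level_db, threshold_db=-18.0, ratio=3.0):
    """Downward compression for one slice: above the threshold, the
    signal is reduced so only 1/ratio of the overshoot remains."""
    if level_db <= threshold_db:
        return 0.0                      # below threshold: untouched
    over = level_db - threshold_db
    return (over / ratio) - over        # negative = gain reduction

# Per-slice levels (dB) for a handful of slices across the spectrum.
band_levels = [-30.0, -12.0, -6.0, -24.0]
gains = [band_gain_db(lv) for lv in band_levels]
# Only the loud slices are turned down; quiet ones pass unchanged.
```

A conventional broadband compressor would apply one of these gain values to the entire signal; applying them per slice is what lets a loud region of the spectrum be tamed without ducking everything else.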

In addition, I would characterize the interface design as “discoverable.” When you first open the plug-in, you see a clean user interface with simple adjustments for level and ratio. However, you can click certain disclosure triangles to open other parts of the interface, such as control of attack and release timing, as well as side-chain filtering. There are three unique sound shaping controls at the bottom. Style controls the character of the compressor between “clean” (transparent) and “dirty” (warm and punchy). The Spectral Compression control dials in the amount of spectral (multiband) compression being applied. At zero, smart:comp 2 will act as an ordinary broadband compressor. The Color control lets you emphasize “darker” or “brighter” ranges within the spectral compression.

Simple, yet powerful functions

Start by selecting a profile (or leave on “Universal”). Play a louder section of your mix and let smart:comp 2 “learn” the track. Once learning is done and a profile established, you may be done. Or you may want to make further adjustments to taste. For example, the plug-in features automatic input riding along with automatic output (make-up gain). I found that for my mixes, input riding worked well, but I preferred a fixed output gain, which can be set manually.

There’s a “limit” function, which is always set to 0dBFS. When enabled, the limit option becomes a soft clipper. All peaks exceeding 0dBFS will be tamed to avoid hard clipping. It’s like a smooth limiter set to 0dBFS after the compression stage. However, if your intended use is broadcast production, rather than music mixes, you may still need to add a separate limiter plug-in (such as Sonible’s smart:limit) in the mastering chain after smart:comp 2, especially if your target is lower, such as true peaks at -3dB or -6dB.
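The soft-clip behavior can be sketched with a generic waveshaper. This is a guess at the general technique, not Sonible's actual transfer curve: a tanh shaper rounds peaks off smoothly as they approach the ceiling, instead of slicing them flat the way a hard clipper does.

```python
import math

# Generic soft clipper sketch (hypothetical curve, not Sonible's):
# output approaches the ceiling asymptotically, so nothing ever
# exceeds 0 dBFS and there are no hard-clip edges.

def soft_clip(x, ceiling=1.0):
    """Smoothly limit a sample toward +/- ceiling (1.0 == 0 dBFS)."""
    return ceiling * math.tanh(x / ceiling)

samples = [0.2, 0.9, 1.5, -2.0]
clipped = [soft_clip(s) for s in samples]
# Low-level samples pass nearly unchanged; overshoots are rounded off.
```

Note the trade-off this models: small signals are almost untouched, but the rounding near the ceiling is why a true-peak limiter is still needed when the delivery target sits below 0dBFS.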

smart:comp 2 did a wonderful job as a master buss compressor on my music mixes. I tested it against other built-in and third-party compressors within Logic Pro and DaVinci Resolve Fairlight. First, smart:comp 2 is very clean when you press it hard. There’s always a pleasing sound. However, the biggest characteristic is that the mixes sound more open with better clarity.

smart:comp 2 for mixing video projects

I’m a video editor and most of my mixes are more basic than multitrack music mixes with large track counts. Just a few dialogue, music, and sound effects tracks and that’s it. So the next test was applying smart:comp 2 on Premiere Pro’s mix buss. When I originally mixed this particular project, I used Adobe’s built-in tube-modeled compression on the dialogue tracks and then Adobe’s multiband compressor and limiter on the mix buss. For this test, I stripped all of those out and only added smart:comp 2 to the mix output buss.

I noticed the same openness as in the music mixes, but input riding was even more evident. My sequence started with a 15-second musical lead-in. Then the music ducks under the dialogue as the presenter appears. I had mixed this level change manually for a good-sounding balance. When I applied smart:comp 2, I noticed that the opening music was louder than with the plug-in bypassed. Yet, this automatic loudness level change felt right and the transition to the ducked music was properly handled by smart:comp 2. Although the unprocessed mix initially sounded fine to me, I would have to say that using smart:comp 2 made it a better-sounding mix overall. It was also better than when I used the built-in options.

How you use plug-ins is a matter of taste and talent. Some pros may look at automatic functions as some sort of cheat. I think that’s wrong. Software analysis can give you a good starting point in less time, allowing more time for creativity. You aren’t getting bogged down twirling knobs. That’s a good thing. I realize vintage plug-ins often look cool, but if you don’t know the result you’ll get, they can be a waste of time and money. This is where plug-ins like the smart: series from Sonible will enhance your daily mixing workflow, regardless of whether you are a seasoned recording engineer or a video editor.

©2022 Oliver Peters

Analogue Wayback, Ep. 21

The Jacksonville Jazz Festival

Regular readers probably know by now that I have a soft spot in my heart for music production. I’ve worked on a number of films and TV shows that were tied to musical performances and it’s always been an enjoyable experience for me. One of those ongoing experiences was post for the Jacksonville Jazz Festival PBS specials in the 80s and 90s. Although I was living in Jacksonville at the start of this annual event, I really didn’t get involved with the shows until a few years after I’d left town.

The yearly Jacksonville Jazz Festival is a cultural highlight for the city of Jacksonville, Florida. Launched in 1980, the first two years were hosted in the neighboring fishing town of Mayport, home of a large US Navy base. It quickly shifted to downtown Jacksonville’s Metropolitan Park by the St. Johns River, which cuts through the heart of the city.

Recording jazz in the “backyard”

WJCT, the local PBS and NPR affiliate, had been covering the annual event for PBS since the second year of the festival. By 1983, the festival and the station were tightly intertwined. In that year, the park was renovated with a new WJCT facility adjacent to it. Having the building next to the park provided a unique opportunity to install direct audio and video cable runs between the station facility and the covered pavilion performance stage at the park. To inaugurate both, WJCT covered the festival with an eight-hour live broadcast.

From 1981 until 1994 (with the exception of 1983), WJCT produced each year’s festival as a one-hour TV special for PBS distribution. This was a fall event, which was posted over the subsequent months and aired early the next year. My involvement started with the 1984 show, helping to post eight of the eleven TV specials during those years. I worked closely with the station’s VP of Programming, Richard V. Brown, and Creative Services Director, Bill Weather.

Production and post arrangements varied from year to year. Bill Weather was the show’s producer/director for the live event recordings most of those eleven years. (Other directors included Dan Kossoff, David Atwood, and Patrick Kelly.) Weather and I traded off working as the creative editor, so in some years I was the online editor and in others, both editor and online editor. During that decade of shows, post was either at Century III (where I worked) or at our friendly crosstown rival, The Post Group at The Disney-MGM Studios.

Turning the festival into a TV show

Richard V. Brown was the show’s executive producer and also handled the artist arrangements for the show and the festival. Performers received fees for both the live event appearance and the TV show (if they were featured in it), so budgets often dictated who was presented in the telecast. A legendary, but expensive performer like Ray Charles or Miles Davis might headline the festival, yet not appear in the TV special. However, this wasn’t always dictated by money, since top names already brought with them a level of overexposure in the media. And so, the featured artists each year covered a wide spectrum of traditional and contemporary jazz styles, often introducing lesser known artists to a wider TV audience. New Orleans, fusion, Latin, blues, and even some rock performers were included in this eclectic mix.

The artist line-up for each special was decided before the event. Most shows highlighted four acts of about 10 to 15 minutes each. The songs to be included from each artist were selected from the live set, which tended to run for about an hour. The first editorial step (handled by Brown and Weather) was to select which songs to use from each performer, as well as any internal song edits needed to ensure that the final show length fit PBS guidelines.

Recording the live experience

Production and post grew in sophistication over time. Once the WJCT building was completely ready, multiple cameras could be controlled and switched from the regular production control room. No mobile unit required. This usually included up to seven cameras for the event. A line cut was recorded to 1″ videotape, along with several of the cameras as extra iso recordings to be used in post.

The station’s own production equipment was augmented with other gear, including stage lighting, camera dolly, and camera boom. With such an important local event, the station crew was also expanded thanks to local production professionals from the town, including a few top directors and cinematographers working the stage and running cameras; and volunteers working tirelessly to truly make each year memorable.

When it came to sound, the new WJCT facility also included its own 24-track audio recorder. Stage mic signals could be split in order to simultaneously feed the “front of the house” mixing board, the stage monitors, and run back into the building to the multitrack recorder. These 2″ analog audio masters also recorded “time of day” timecode, and thus could be synced with the video line cut and iso recordings in post.
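The sync method described above boils down to simple timecode arithmetic: with "time of day" timecode on both the audio masters and the video recordings, lining them up is just the offset between start timecodes. A minimal sketch, assuming 30 fps non-drop counting for clarity (the actual NTSC production may well have used drop-frame; all names and values here are hypothetical):

```python
# Sketch of time-of-day timecode sync (assumes 30 fps non-drop).

FPS = 30

def tc_to_frames(tc):
    """Convert an HH:MM:SS:FF timecode string to a frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * FPS + f

def sync_offset_frames(audio_start_tc, video_start_tc):
    """Frames to slip the audio so it lines up with the video."""
    return tc_to_frames(video_start_tc) - tc_to_frames(audio_start_tc)

# Audio reel started rolling 2 seconds and 15 frames before the video.
offset = sync_offset_frames("14:03:10:00", "14:03:12:15")
```

Because both machines were jammed to the same time-of-day source, this one offset holds for the entire reel, which is what made syncing the 24-track masters to the line cut and isos practical in a linear post workflow.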

Editing is more than just editing

Although my role was post, I was able to attend several of the live festivals, even if I was only the online editor. I sat in the control room and functioned a bit like an assistant director, noting potential editorial issues. But I also made sure that I had coverage of all the members of the band. One performer might take a solo, but I also needed options for other camera angles. As with most live jazz and rock performances, the band members might trade off solos, so it was important to keep an eye on where the focus of the performance could switch to next. Since the director had his hands full just focusing on the real-time action, I would often lean over and ask for a little different coverage from one of the other cameras not currently punched up.

None of the crew was intimately familiar with the live performances of these acts, so it was all about having a sixth sense for the music. However, there was one surprising exception. That was the year that Paul Shaffer and the World’s Most Dangerous Band headlined. As you probably know, this was the house band for The David Letterman Show, but they also had a limited live touring schedule.

For their set, Shaffer sent in a coordinator with a printout of their entire set rundown. Shaffer and the band had choreographed the whole set, so he was able to give the director a “heads up” for each part of the performance. In addition, Shaffer is the consummate band leader. His set included a jam with his band and several other jazz artists from earlier in the day. Each had a cameo solo. This sort of ad hoc, live jam can often become a big mess; but this one went off as if they’d rehearsed it. Shaffer literally put this together in quick conversations with the other artists during the course of that day.

3/4″ and a legal pad of notes

Once everything was in the can, post could start – initially with content selection. Then camera cuts could be cleaned up using the iso angles. This “offline edit” was largely done by reviewing the 3/4″ U-matic tapes, which had been recorded for the line cut and three of the iso angles using a quad-split generator with a timecode overlay. This gave the editor a multicam view, but from a single tape source. Unfortunately, listing camera cut changes to specific angles required a lot of meticulous, handwritten timecode notes. (Early days had four monitors and a timecode generator display stacked as closely as possible, with an independent camera recording to 3/4″ tape.)

Based on these notes, the show master could then be edited in a linear, online session using the 1″ originals and mastering to 1″ or D2. If the line cut of the live recording was solid, then any given song might only have new edits for about 10-25% of the song. Edits might mean a cut to a different angle or maybe the same angle, but just a bit sooner. In addition to the live camera angles, we also had extra ENG footage, including audience shots, party boats anchored in the river nearby, and even some helicopter aerials of the wider event grounds, the pavilion stage, and the audience.

In a typical year, I would finish the camera clean-up edits and trims unsupervised, then Brown and Weather would arrive for the supervised part of the online edit. Here we would build the visual style for the show open and transitions between songs and bands. Plus final credits. This was at the dawn of digital post, so most show opens involved a lot of layering.

It’s all about the mix

The Jacksonville Jazz Festival PBS specials were, of course, about the music. Getting the best possible mix was a very important consideration. In the earliest years, the live recording and remix methodology was evolving, but generally run under the auspices of the WJCT audio engineers. This shifted to our Century III staff audio engineer, Jerry Studenka. He handled the mix for the shows for several years in the late 80s.

To the best of my recollection, the 24-track tapes were recorded at 15ips with Dolby SR noise reduction. This enabled an hourlong set to be recorded on a single reel of tape. Audio mixes/remixes were recorded onto two tracks of that same 24-track tape. In later years, working out of the Century III facility on the lot at Universal, we used Sony 24-track digital audio recorders. The staff would first bounce the analog master reels to digital tape ahead of the audio mix session. Then the audio engineer would mix from one digital recorder to the other. Century III and The Post Group were equipped with Solid State Logic consoles in their main audio rooms, which provided a comfort factor for any experienced music mixer.

The performances were recorded live and mixed on-the-fly during each set as the first pass. Then in the post session, they were polished or remixed in part with punch-ins or even fully remixed depending on what everyone felt gave the best result. But the mixes were all based on the actual live recordings – no overdubs added later.

Every year, each performer was afforded the opportunity to bring in their own recording engineer or representative for the show’s mix. Only two artists ever took Brown up on that – Paul Shaffer and Spyro Gyra. Larry Swist came down for Spyro Gyra, who appeared at numerous festivals and was featured in several of the specials. Swist, who later became a well-respected studio designer, was the recording engineer for the band’s albums. Shaffer sent Will Lee (the band’s vocalist/bassist) as his rep to the mixing session. Spyro Gyra and Shaffer’s band happened to be on the same show that year. By the time Lee arrived, Studenka and Swist already had a good mix, so Lee was able to quickly sign off.

Swist had an easy-going, “no drama” personality. Everyone had such a good experience working with him that for each year thereafter, Swist was brought in for all of the sessions. He coordinated both the live recording to multitrack during the event and then remixed all the music for the show during post.

These remixes weren’t as straightforward as they might seem. All sound post was handled on tape, not with any sort of DAW. It was a linear process, just like the picture edits. First of all, there were internal edits within the songs. Therefore, all outboard processing and console and fader settings had to match at the edit point, so that the edit was undetectable. Second, the transitions between songs or from one artist to the next had to be bridged. This was generally done by overlapping additional crowd applause across the change to hide the performance edit, which again required audio matching.

The Jacksonville Jazz Festival of 1994 (aired 1995) was the last of the PBS specials, due in part to the cost of production and TV rights. Eventually WJCT turned over production of the festival itself to the City of Jacksonville. The results for that time speak for themselves. The collective effort produced not only great festival experiences, but also memorable television. Unfortunately, some of the production folks involved, like Richard V. Brown, Larry Swist, and Jerry Studenka are no longer with us. And likewise, neither are some of the featured performers. But together, they left a worthwhile legacy that is still carried on by the City of Jacksonville to this day. 

©2022 Oliver Peters

Analogue Wayback, Ep. 19

Garage bands before the boy bands

As an editor, I’ve enjoyed the many music-oriented video productions I’ve worked on. In fact one of my first feature films was a concert film highlighting many top Reggae artists. Along the way, I’ve cut numerous jazz concerts for PBS, along with various videos for folks like Jimmy Buffett and the Bob Marley Foundation.

We often think about the projects that “got away” or never happened. For me, one of those was a documentary about the “garage band” acts of central Florida during the 1960s. These were popular local and regional acts with an eye towards stardom, but who never became household names, like Elvis or The Beatles. Central Florida was a hotbed for such acts back then, in the same way as San Francisco, Memphis, or Seattle have been during key moments in rock ‘n roll history.

For much of the early rock ‘n roll era, music was a vertically-integrated business. Artist management, booking, recording studios, and marketing/promotion/distribution were all handled by the same company. The money was made in booking performances more so than record sales.

Records, especially 45RPM “singles,” were produced in order to promote the band. Singles were sent for free to radio stations in hopes that they would be placed into regular rotation by the station. That airplay would familiarize listeners/fans with the bands and their music. While purchasing the records was a goal, the bigger aim was name recognition, so that when a band was booked for a local event (dance, concert, youth club appearance, tour date) the local fans would buy tickets and show up to the event. Naturally some artists broke out in a big way, which meant even more money in record sales, as well as touring.

Record labels, recording studios, and talent booking services – whether the same company or separate entities – enjoyed a very symbiotic relationship. Much of this is chronicled in a mini-doc I cut for the Memphis Rock ‘n Soul Museum. It highlighted studios like Sun, Stax, and Hi and their role in the birth of rock ‘n roll and soul music.

In the central Florida scene, one such company was Bee Jay, started by musician/entrepreneur Eric Schabacker. Bee Jay originally encompassed a booking service and eventually a highly regarded recording studio responsible for many local acts. Many artists passed through those studio doors, but one of the biggest acts to record there was probably Molly Hatchet. I got to know Schabacker when the post facility I was with acquired the Bee Jay Studios facility.

Years later Schabacker approached me with an interesting project – a documentary about the local garage bands of the 60s. Together with a series of interviews with living band members, post for the documentary would also involve the restoration of several proto-music videos. Bee Jay had videotaped promotional videos for 13 of the bands back in the day. While Schabacker handled the recording of the interviews, I tackled the music videos.

The original videos were shot with a rudimentary black-and-white production system and recorded onto half-inch open reel videotape. Unfortunately, the video tubes in the cameras of that era didn’t always handle bright outdoor light well, and the video switcher did not feature clean vertical interval switching. The result was a series of recordings in which video levels fluctuated and camera cuts often glitched. There were also sections where the tape machine lost servo lock during recording. The audio was not recorded live. Instead, the bands lip-synced to playback of their song recordings, which was captured in sync with the video. These old videos were transferred to DV25 QuickTime files, which formed my starting point.

Step one was to have clean audio. The bands’ tunes had been recorded and mixed at Bee Jay Studios at the time into a 13-song LP that was used for promotion to book those bands. However, at this point over three decades later, the master recordings were no longer available. But Schabacker did have pristine vinyl LPs from those sessions. These were turned over to local audio legend and renowned mastering engineer Bob Katz, who took those versions and created remastered files for my use.

Now that I had good sound, my task was to take the video – warts and all – and rebuild it in sync with the song tracks, clean up the picture, remove any damage and glitches, and in general end up with a usable final video for each song. Final Cut Pro (legacy) was the tool of choice at the time. Much of the “restoration” involved slightly slowing down or speeding up shots to resync the files – shot by shot. I also had to repeat and slo-mo some shots for fit-and-fill, since frames were lost as glitchy camera cuts and other disturbances were removed. In the end, I rebuilt all 13 into presentable form.

While that was a labor of love, the downside was that the documentary never came to be. All of these bands had recorded great-sounding covers (such as Solitary Man), but no originals. Unfortunately, clearing the music rights for these clips would have been a nightmare and quite costly. A shame, but that’s life in the filmmaking world.

None of these bands made it big, but in subsequent years, bands of another era like *NSYNC and the Backstreet Boys did. They ushered in a new boy band phenomenon, which carries on to this day in the form of K-pop, among other styles.

©2022 Oliver Peters

Will DaVinci Resolve 18 get you to switch?

DaVinci Resolve has long been admired by users of other editing applications because of the pace of Blackmagic Design’s development team, and many have considered a switch. Since its announcement earlier this year, DaVinci Resolve fans and pros alike had been eagerly waiting for Resolve 18 to come out of public beta. It was recently released, and I’ve been using it ever since for a range of color correction jobs.

DaVinci Resolve 18 is available in two versions: Resolve (free) or Resolve Studio (paid). These are free updates to existing customers. They can be downloaded/bought either from the Blackmagic Design website (Windows, Mac, Linux) or through the Apple Mac App Store (macOS only – Intel and M1). The free version of Resolve is missing only a few of the advanced features available in Resolve Studio. Due to App Store policies and sandboxing, there are also some differences between the Blackmagic and App Store installations. The Blackmagic website installations may be activated on up to two computers at the same time using a software activation code. The App Store versions will run on any Mac tied to your Apple ID.

A little DaVinci Resolve history

If you are new to DaVinci Resolve, then here’s a quick recap. The application is an amalgam of the intellectual property and assets acquired by Blackmagic Design over several years from three different companies: DaVinci Systems, eyeon (Fusion), and Fairlight Instruments. Blackmagic Design built upon the core of DaVinci Resolve to develop an all-in-one, post production solution. The intent is to encompass an end-to-end workflow that integrates the specialized tasks of editing, color grading, visual effects, and post production sound all within a single application.

The interface character and toolset tied to each of these tasks is preserved using a page-style, modal user interface. In effect, you have separate tools, tied to a common media engine, which operate under the umbrella of a single application. Some pages are fluidly interoperable (like Edit and Color) and others aren’t. For example, color nodes applied to clips in the Color page do not appear as nodes within the Fusion page. Color adjustments made to clips in a Fusion composition need to be done with Fusion’s separate color tools.

Blackmagic has expanded Resolve’s editing features – so much so that it’s a viable competitor to Avid Media Composer, Apple Final Cut Pro, and/or Adobe Premiere Pro. Resolve sports two editing modes: the Cut page (a Final Cut Pro-style interface for fast assembly editing) and the Edit page (a traditional track-based interface). The best way to work in Resolve is to adhere to its sequential, “left to right” workflow – just like the pages/modes are oriented. Start by ingesting in the Media page and then work your way through the tasks/pages until it’s time to export using the Deliver page.

Blackmagic Design offers a range of optional hardware panels for Resolve, including bespoke editing keyboards, color correction panels, and modular control surface configurations for Fairlight (sound mixing). Of course, there’s also Blackmagic’s UltraStudio, Intensity Pro, and DeckLink i/o hardware.

A new collaboration model through Blackmagic Cloud

The biggest news is that DaVinci Resolve 18 was redesigned for multi-user collaboration. Resolve projects are usually stored in a database on your local computer or a local drive, rather than as separate binary project files. Sharing projects in a multi-user environment requires a separate database server, which isn’t designed for remote editing. To simplify this and address remote work, Blackmagic Design established and hosts the new Blackmagic Cloud service.

As I touched on in my Cloud Store Mini review, anyone may sign up for a free Blackmagic Cloud account. When ready, the user creates a Library (database) on Cloud from within the Resolve UI. That user is the “owner” of the Library, which can contain multiple projects. The owner pays $5/month for each Library hosted on Blackmagic Cloud.

The Library owner can share a project with any other registered Blackmagic Cloud user. This collaboration model is similar to working in Media Composer and is based on bin locking. The first user to open a bin has read/write permission to that bin and any timelines contained in it. Other users opening the same timeline operate with read-only permission. Changes made by the user with write permission can then be updated by the read-only users on their systems.
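The bin-locking model described above can be sketched in a few lines of code. This is purely illustrative – the class and method names are my own invention, not an actual Resolve or Blackmagic Cloud API – but it captures the first-opener-wins behavior.

```python
class BinLockManager:
    """Illustrative sketch of Media Composer-style bin locking
    (hypothetical names, not a real Resolve/Cloud API)."""

    def __init__(self):
        self._locks = {}  # bin name -> user currently holding write permission

    def open_bin(self, bin_name: str, user: str) -> str:
        # The first user to open a bin gets read/write;
        # everyone else who opens it afterward is read-only.
        owner = self._locks.setdefault(bin_name, user)
        return "read-write" if owner == user else "read-only"

    def close_bin(self, bin_name: str, user: str) -> None:
        # Releasing the lock lets the next user acquire write permission.
        if self._locks.get(bin_name) == user:
            del self._locks[bin_name]
```

In use, if one collaborator opens a bin first, a second collaborator opening the same bin sees it read-only until the first closes it.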

Blackmagic Design only hosts the Library/project files and not any media, which stays local for each collaborator. The media syncing workflow is addressed through features of the Cloud Store storage products (see my review). Both collaboration via Blackmagic Cloud and the storage products are independent of each other. You can use either without needing the other. However, since Blackmagic Cloud is hosted “in the cloud” you do need an internet connection. 

There is some latency between the time one user makes a change and when it’s updated on the other users’ machines. In my tests, a collaborator needs to relink to the local media each time a shared project is accessed again. You can also move a project from Cloud back to your local computer as needed.

What else is new in DaVinci Resolve 18?

Aside from the new collaboration tools, DaVinci Resolve 18 also features a range of enhancements. Resolve 17 already introduced quite a few new features, which have been expanded upon in Resolve 18. The first of these is a new, simplified proxy workflow using the “prefer proxies” model. Native media handling has always been a strength of Resolve, especially with ProRes or Blackmagic RAW (BRAW) files. (Sorry, no support for Apple ProRes RAW.) But file sizes, codecs, and your hardware limitations can impede efficient editing. Therefore, working with proxy files may be the better option on some projects. When you are ready to deliver, then switch back to the camera originals for the final output.

The website installer for DaVinci Resolve Studio 18 includes the new Blackmagic Proxy Generator application. This automatically creates H.264, H.265, or ProRes proxy files using a watch folder. However, you can also create proxies internally from Resolve without using this app, or externally using Apple Compressor or Adobe Media Encoder. The trick is that proxy files must have matching names, lengths, timecode values, and audio channel configurations.

Proxy files should be rendered into a subfolder called “Proxy” located within each folder of original camera files. (Resolve and/or the Proxy Generator application do this automatically.) Then Resolve’s intelligent media management automatically detects and attaches the proxies to the original files. This makes linking easy and allows you to toggle between the proxy and the original files.
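The folder convention above can be sketched as a simple matching routine. This is only a rough illustration of the naming/location rule, under the assumption that a proxy shares its original’s file name (the extension may differ) and lives in a “Proxy” subfolder; Resolve’s own matching also checks length, timecode, and audio channel configuration.

```python
from pathlib import Path

def find_proxies(camera_root: str) -> dict:
    """Map each camera-original file to a proxy found in a 'Proxy'
    subfolder beside it (illustrative sketch, not Resolve's code)."""
    matches = {}
    for original in Path(camera_root).rglob("*"):
        # Skip directories and anything already inside a Proxy folder
        if not original.is_file() or "Proxy" in original.parts:
            continue
        proxy_dir = original.parent / "Proxy"
        if not proxy_dir.is_dir():
            continue
        # A proxy must share the original's base name; extension may differ
        for proxy in proxy_dir.glob(original.stem + ".*"):
            matches[str(original)] = str(proxy)
    return matches
```

Given `CardA/clip001.braw` and `CardA/Proxy/clip001.mov`, the routine pairs them by base name, which mirrors why matching names, lengths, and timecode matter for automatic attachment.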

Regarding other enhancements, the Color page didn’t see any huge new features, since tools like the Color Warper and HDR wheels were added in Resolve 17. However, there were some new items, including object replacement and enhanced tracking. But I didn’t find the results to be as good as Adobe’s Content-Aware Fill techniques.

Two additions worth mentioning are the Automatic Depth Map and Resolve FX Beauty effect. The beauty effect is a subtle skin smoothing tool. It’s nice, but quite frankly, too subtle. My preference in this type of tool would be Digital Anarchy’s Beauty Box or Boris FX’s Beauty Studio. However, Resolve does include other similar tools, like Face Refinement where you have more control.

Automatic Depth Map is more of a marquee feature. This is a pretty sophisticated process – analyzing depth separation in a moving image without the benefit of any lens metadata. It shows up as a Resolve FX in the Edit, Fusion, and the Color pages. Don’t use it in the Edit page, because you can’t do anything with it there. In the Color page, rather than apply it to a node, drag the effect into the node tree, where it creates its own node.

After brief clip analysis, the tool generates a mask, which you can use as a qualifier to isolate the foreground and background. Bear in mind this is for mild grading differences. Even though you might think of this for blurring a background, don’t do it! The mask is relatively broad. If you try to tighten the mask and use it to blur a background, you’ll get a result that looks like a Zoom call background. Instead, use it to subtly lighten or darken the foreground versus the background within a shot. Remember, the shot is moving, which can often lead to some chatter on the edges of the mask as the clip moves. So you’ll have to play with it to get the best result. Playback performance at Better Quality was poor on a 2017 iMac Pro. Use Faster while working and then switch to Better when you are ready to export or render.

Fusion

Complex visual effects and compositing are best done in the Fusion page. Fusion is both a component of Resolve, as well as a separate application offered by Blackmagic Design. It uses a node-based interface, but these nodes are separate and unrelated to the nodes in the Color page. Fusion features advanced tracking, particle effects, and a true 3D workspace that can work with 3D models. If you have any stereoscopic projects, then Fusion is the tool to use. The news for Fusion and the standalone Fusion Studio 18 application in this update focuses on GPU acceleration.

Before the acquisition by Blackmagic Design, eyeon offered several integrations of Fusion with NLEs like DPS Velocity and Avid Media Composer. The approach within Resolve is very similar to those – send a clip to Fusion for the effect, work with it inside the Fusion UI, and then it’s updated on the timeline as a Fusion clip. This is not unlike the Dynamic Link connection between Premiere Pro and After Effects, except that it all happens inside the same piece of software.

If you are used to working with a layer-style graphics application, like After Effects, Motion, or maybe HitFilm, then Fusion is going to feel foreign. It is a high-end visual effects tool, but might feel cumbersome to some when doing standard motion graphics. Yet for visual effects, the node-based approach is actually superior. There are plenty of good tutorials for Fusion, for any user ready to learn more about its visual effects power.

There are a few things to be aware of with Fusion. The image in the Fusion viewer and the output through UltraStudio to a monitor will be dark, as compared with that same image on the Edit page. Apparently this has been an ongoing user complaint and I have yet to find a color management setting that definitively solves this issue. There is also no way to “decompose” or “break apart” a Fusion composition on the timeline. You can reset the clip to a Fusion default status, but you cannot revert the timeline clip back to that camera file without it being part of a Fusion composition. Therefore, the best workaround is to copy the clip to a higher track before sending it to Fusion. That way you have both the Fusion composition and the original clip on the timeline.

In addition to visual effects, Fusion templates are also used for animated titles. These can be dropped onto a track in the Edit page and then modified in the inspector or the Fusion page. These Fusion titles function a lot like Apple’s Motion templates being used in Final Cut Pro.

Fairlight

Fairlight Instruments made its name at the dawn of digital audio with the popular Fairlight CMI. After Blackmagic’s acquisition, the software portion of Fairlight was reimagined as an audio post module built into DaVinci Resolve. The Fairlight hardware and control surfaces were modularized. You can definitely run Fairlight in Resolve without any extra hardware. However, you can improve real-time performance on mixes with heavy track counts by adding the Fairlight Audio Core accelerator card. You can also combine one or more Blackmagic control surfaces into a large mixing console.

Taken as a whole, this makes the Fairlight ecosystem a very scalable product line in its own right that can appeal to audio post engineers and other audio production professionals. In other words, you can use the Fairlight portion of Resolve without ever using any of the video-centric pages. In that way, Resolve with Fairlight competes with Adobe Audition, Avid Pro Tools, Apple Logic Pro, and others. In fact, Fairlight is still the only professional DAW that’s actually integrated into an NLE.

Fairlight is designed as a DAW for broadcast and film with meter calibration based on broadcast standards. It comes with a free library of sound effects that can be downloaded from Blackmagic Design. The Fairlight page also includes an ADR workflow. DaVinci Resolve 18 expanded the Fairlight toolset. There’s new compatibility for FlexBus audio busing/routing with legacy projects. A lot of work has been put into Dolby Atmos support, including a binaural renderer, and an audio Space view of objects in relation to the room in 3D space.

On the other hand, if you are into music creation, then Fairlight lacks software instruments and music-specific plug-ins, like amp emulation. The MIDI support is focused on sound design. A musician would likely gravitate towards Logic Pro, Cubase, Luna, or Ableton Live. Nevertheless, Fairlight is still quite capable as a DAW for music mixes. Each track/fader integrates a channel strip for effects, plus built-in EQ and compression. Resolve comes with its own complement of Fairlight FX plug-ins, plus it supports third-party AU/VST plug-ins.

I decided to test that concept using some of the mixes from the myLEWITT music sessions. I stacked LEWITT’s multitrack recordings onto a blank Fairlight timeline, which automatically created new mono or stereo tracks, based on the file. I was able to add new busses (stem or submaster channels) for each instrument group and then route those busses to the output. It was easy to add effects and control levels by clip, by track, or by submaster.
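The track-to-bus-to-output signal flow used in that session can be sketched conceptually. This is not a Fairlight API – the function and its parameters are my own simplification – but it shows the basic routing idea: individual tracks sum into instrument-group busses, each bus applies its own gain, and the busses sum to the output.

```python
def mix(tracks, routing, bus_gains):
    """Conceptual DAW bus routing sketch (hypothetical, not Fairlight code).

    tracks:    track name -> linear level
    routing:   track name -> bus name
    bus_gains: bus name -> linear gain applied at the bus fader
    """
    busses = {}
    # Sum each track's level into its assigned bus
    for track, level in tracks.items():
        bus = routing[track]
        busses[bus] = busses.get(bus, 0.0) + level
    # Apply each bus fader and sum the busses to the main output
    return sum(level * bus_gains.get(bus, 1.0) for bus, level in busses.items())
```

For example, kick and snare tracks routed to a “drums” bus are controlled together by that bus fader, which is exactly the convenience of mixing by submaster rather than by individual track.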

Fairlight might not be my first choice if I were a music mixer, but I could easily produce a good mix with it. The result is a transparent, modern sound. If you prefer vintage, analog-style coloration, then you’ll need to add third-party plug-ins for that. Whether or not Fairlight fits the bill for music will depend on your taste as a mixer.

Conclusion

Once again, Blackmagic Design has added more power in the DaVinci Resolve 18 release. Going back to the start of this post – is this the version that will finally cause a paradigm shift away from the leading editing applications? In my opinion, that’s doubtful. As good as it is, the core editing model is probably not compelling enough to coax the majority of loyal users away from their favorite software. However, that doesn’t mean those same users won’t tap into some of Resolve’s tools for a variety of tasks.

There will undoubtedly be people who shift away from Premiere Pro or Final Cut Pro and over to DaVinci Resolve. Maybe it’s for Resolve’s many features. Maybe they’re done with subscriptions. Maybe they no longer feel that Apple is serious. Whatever the reason, Resolve is a highly capable editing application. In fact, during the first quarter of this year I graded and finished a feature film that had been cut entirely in Resolve 17.

Software choices can be highly personal and intertwined with workflow, muscle memory, and other factors. Making a change often takes a big push. I suspect that many Resolve editors are new to editing, often because they got a copy when they bought one of the Blackmagic Design cameras. Resolve just happens to be the best application for editing BRAW files and that combo can attract new users.

DaVinci Resolve 18 is a versatile, yet very complex application. Even experienced users don’t tap into the bulk of what it offers. My advice to any new user is to start with a simple project. Begin in the Cut or Edit page, get comfortable, and ignore everything else. Then learn more over time as you expand the projects you work on and begin to master more of the toolkit. If you really want to dive into DaVinci Resolve, then check out the many free and paid tutorials from Blackmagic Design, Mixing Light, and Ripple Training. Resolve is one application where any user, regardless of experience, will benefit from training, even if it’s only a refresher.

I’ve embedded a lot of links throughout this post, so I hope you’ll take the time to check them out. They cover some of the enhancements that were introduced in earlier versions, the history of DaVinci Resolve, and links to the new features of DaVinci Resolve 18. Enjoy!

©2022 Oliver Peters