KISS in Post

Of course, I’m talking about the principle, not the band: keep it simple, stupid. Nowhere does this apply more than in mixing and color correction.

Many of the films we admire from the 1970s, as well as the classic hit records of the 1960s and 1970s that are held up as sonic pinnacles, were finished with tools that we might now consider primitive. Film timing (the equivalent of modern color correction) and analog mixing were as much alchemy and art as they were the precise application of technology.

Tools have become cheaper, thanks to low-cost software like DaVinci Resolve and a plethora of audio plug-ins. We often look for a quick fix, rather than think through the process to achieve the right result. Add in the many YouTube “experts” – some legit, others with no real credits to their name – and you have a never-ending stream of tutorials trying to teach you the “secrets of the masters.”

Color correction

I’ve graded tons of video projects, including feature films and documentaries. If you have to do a ton of grading/color correction/color timing, then something was probably deficient on the production end. Maybe it’s low budget and the production was rushed. Maybe it’s a run-and-gun reality or documentary project. Or maybe, just maybe, the DP wasn’t very good or simply relied more heavily on manipulation in post than he or she should have. My point is that the better the planning and execution for what happens in front of the camera, the better the final look.

I recently watched A Gentleman in Moscow, which is a period drama set from roughly the 1920s to the 1950s. Nearly all of the story takes place within the grand hotel, which is mostly a set on a soundstage in Manchester. While I know nothing about the particulars of the color grade on this project, to my eye, much of the look was created through set design, costuming, and lighting. Plus some visual effects. I’m sure the grade wasn’t simple, but I’d guess that most of the look was in place with just some standard color transforms, balancing, and shot matching.

Watch any YouTube tutorial that focuses on Resolve and you often see the most elaborate set of nodes. Often this is unnecessary. Remember that many colorists use a node template, but not all of the nodes are used or enabled. I’ve done color correction with many of the NLEs, as well as DaVinci Resolve. I can often get the right look within a single node or layer; however, using several nodes or layers helps you stay organized. For instance, my typical approach in Resolve uses five to seven nodes when dealing with log footage. This lets me deal with log conversion, exposure, primaries, secondaries, and vignetting – each in separate nodes. The same with layers in a tool like Final Cut Pro.
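If you want to enforce that kind of consistency across a whole timeline, Resolve’s Python scripting can help. Below is a minimal sketch that assigns the same log-conversion LUT to node 1 of every clip on a track, leaving the rest of the node tree free for creative work. The LUT path is a placeholder for whatever camera transform you actually use.

```python
# Minimal sketch using DaVinci Resolve's Python scripting API.
# Run from Resolve's console, or externally with the Studio version.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
timeline = project.GetCurrentTimeline()

LOG_LUT = "ARRI/ARRI_LogC3_to_Rec709.cube"  # placeholder path

# Apply the conversion LUT to node 1 of every clip on video track 1,
# keeping exposure, primaries, secondaries, etc. in later nodes.
for item in timeline.GetItemListInTrack("video", 1):
    item.SetLUT(1, LOG_LUT)
```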

The bottom line is that if the DP has the time and inclination to set a look on location or on set, then not only will the color grading session go faster with fewer adjustments, but the final look will be better and appear to be more organic.

Mixing

Like YouTube tutorials about grading with Resolve, a quick search will turn up tons of tips-and-tricks videos dealing with audio recording and mixing. The tools are cheap and probably more folks fancy themselves as musicians who can self-produce than fancy themselves as colorists.

Many of these are product reviews where the plug-in developer sent a free copy of the software in exchange for a review. Most are independent, made without pressure from the company. I do those, too. But others are officially sponsored. Nothing wrong with that, as long as you know. However, even though a given Waves Abbey Road plug-in promises to make your mixes sound like those of the Beatles, you can achieve the same results with the tools already included with your DAW application.

As noted in some of my other blog posts, I’ve been mixing music as a hobby for a few years now. The source tracks are from various internet downloads. Aside from the enjoyment, this exercise helps to inform the approach I take with the simpler mixes I do on paying video edits. The corollary with color correction is that the better the initial recording, the better the mix, with fewer machinations to get there.

In the heyday of many of the classic rock hits, there was no differentiation between a recording and a mix engineer. The recording engineer handled the studio recording session and the mix was an organic part of that process. Using a different person for the mix is a relatively new phenomenon in music production. A recording engineer had to know a lot about proper mic placement, isolation between players in the studio, and the esoterica of the electronics. Prior to DAWs, software plug-ins, and in-the-box (ITB) mixing, recording engineers had to rely on a relatively small number of outboard units together with the features of the console itself.

Take an orchestra, for example. The reason you have the specific number of brass players versus strings versus winds and so on is that this creates the balance in the sound. Usually that was spelled out by the composer, because the orchestra had to be able to achieve the proper “mix” in a live setting without electronic aids. And so, if you look at how orchestras are recorded, it’s often with relatively few mics arranged in very specific patterns.

I’ve had a chance to sit in on a choral recording session. The engineer recorded a relatively small ensemble in three passes to a 24-track analog recorder – six to eight tracks at a time. With each pass, he re-arranged the pattern and the grouping assignments of the mics. The result was not only the tripling of the sound through overdubs, but a change in the way the group was being picked up in each pass. The mixed result was a whole that was greater than the sum of its parts. My point is that if you have properly recorded tracks, you don’t have to do as much in the mix. In fact, for some projects, the entire session could be recorded with a single stereo microphone, if the mic and the performers are properly positioned.

My own mixing method is to group instruments into summed stems (drums, guitars, vocals, etc.), which are then mixed into a submix bus before going to the stereo output. Most of the time, I will use native Logic Pro plug-ins on some of the individual instrument tracks (if at all). My favorite plug-in to place on each group bus is the KIT BB N73, which is a Neve 1073 channel strip emulation. I may also use the Logic compressor here as needed. Finally, I like to add the Softube VCA Compressor, Kiive NFuse, and FabFilter Pro-L2 plug-ins on the submix bus. The exact choice of plug-ins will vary with genre, but right now, these are my go-tos.

If the recording session was handled properly, then you should be able to set up the initial mix with a basic balance of levels and be at least 75% of the way there. If you are having to place a long chain of plug-ins onto each track/fader, then either the original recordings weren’t great to start with or you are overthinking the process.
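The gain staging behind that initial balance is simple arithmetic. A common rule of thumb is that N uncorrelated tracks sum roughly 10·log10(N) dB hotter than a single track, so a rough starting point for the faders can be estimated like the sketch below. Treat it as a starting point only, since real program material varies.

```python
import math

def suggested_track_trim(num_tracks: int, track_peak_dbfs: float,
                         target_bus_peak_dbfs: float = -6.0) -> float:
    """Rough per-track trim (dB) to keep the mix bus near a target peak.

    Assumes N uncorrelated tracks sum about 10*log10(N) dB hotter than
    one track. Treat the result as a starting fader position only.
    """
    estimated_bus_peak = track_peak_dbfs + 10 * math.log10(num_tracks)
    return target_bus_peak_dbfs - estimated_bus_peak

# Example: 20 tracks each peaking around -12 dBFS, aiming for -6 dBFS
# on the bus, suggests trimming each fader by about -7 dB.
print(round(suggested_track_trim(20, -12.0), 1))  # -7.0
```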

Ultimately all processes in post become more complex when the original production was short-changed. But, start out with the simplest approach and build upon that. It’s likely that you’ll get to your desired result with less fuss and in a shorter amount of time.

©2024 Oliver Peters

Life in Six Strings

Whether you’re a guitar nerd or just into rock ‘n’ roll history, learning what makes our music heroes tick is always entertaining. Music journalist and TV presenter Kylie Olsson started a YouTube channel during the pandemic lockdown, teaching herself how to play guitar and reaching out to famous guitarists that she knew. This became the concept for a TV series called “Life in Six Strings with Kylie Olsson” that airs on AXS TV. The show is in the style of “Comedians in Cars Getting Coffee.” Olsson explores the passions behind these guitarists, plus gets a few guitar pointers along the way.

I spoke with James Tonkin and Leigh Brooks about the post workflow for these episodes. Tonkin is founder of Hangman in London, which handled the post on the eight-part series. He was also the director of photography for the first two episodes and has handled the online edit and color grading for all of the episodes. Leigh Brooks (Firebelly Films) was the offline (i.e. creative) editor on the series, starting with episode three. Together they have pioneered an offline-to-online post workflow based entirely around Blackmagic Design DaVinci Resolve.

James, how did you get started on this project?

James Tonkin: Kylie approached us about shooting a pilot for the series. We filmed that in Nashville with Joe Bonamassa and it formed the creative style for the show. We didn’t want to just fixate on the technical side of the guitar and tone of these players, but their geographical base – explore the city a little bit. We had to shoot it very documentary style, but wrap it up into a 20-25 minute episode. No pre-lighting, just a tiny team following her around, interacting with these people.

Then we did a second one with Nuno Bettencourt and that solidified the look of the show during those two initial episodes. She eventually got distribution through AXS TV in the States for the eight-part series. I shot the first two episodes and the rest were shot by a US-based crew, which followed the production workflow that we had set up. Not only the look and retaining the documentary format, but also maintaining the highest production value we could give it in the time and budget that we’re working with.

We chose to shoot anamorphic with a cinematic aspect ratio, because it’s slightly different from the usual off-the-cuff reality TV look. Also whenever possible, record in a raw codec, because we (Hangman) were doing all of the post on it and me specifically being the colorist. I always advocate for a raw workflow, especially something in a documentary style. People are walking from daylight into somebody’s house and then down to a basement, basically following them around. And Kylie wants to keep interacting with whomever she’s interviewing without needing to wait for cameras to stop and re-balance. She wants to keep it flowing. So when it comes to posting that, you’ve got a much more robust digital negative to work with [if it was shot as camera raw].

What was the workflow for the shows and were there any challenges?

Leigh Brooks: The series was shot mainly with Red and Canon cameras as 6K anamorphic files. Usually the drive came to me and I would transcode the rushes or create proxy files and then send the drive to James. The program is quite straightforward and narrative-based, without much scope for doing crazy things with it. It’s about the nuts and bolts of guitars and the players that use them. But that being said, each episode definitely had its own little flavor and style. Once we locked the show, James took the sequence, got hold of the rushes, and then got to work on the grade and the sound.

What Kylie’s pulled off on her own is no small feat. She’s a great producer, knows her stuff, and really does the research. She’s so passionate about the music and the people that she’s interviewing and that really comes across. The Steve Vai episode was awesome. He’s very holistic. These people dictate the narrative and tell you where the edit is going to go. Mick Mars was also really good fun. That was the trickiest show to do, because the A and B-side camera set-up wasn’t quite working for us. We had to really get clever in the edit.

DaVinci Resolve gets a lot of press when it’s used for finishing and color grading. But on this project it was used for the offline cut as well. TV series post is usually dominated by Avid Media Composer. Why do you feel that Resolve is a good choice for offline editing?

Tonkin: I’ve been a longtime advocate of working inside of Resolve, not just from a grading perspective, but editorial. As soon as the Edit page started to offer me the feature set that we needed, it became a no-brainer that we should do all of our offline in Resolve, whenever possible. On a show like this, I’ve got about six hours of online time and I want to spend the majority being as creative as I can. So, focusing on color correction, looking at anything I need to stabilize, resize, any tracking, any kind of corrective work – rather than spending two or three hours conforming from one timeline into another.

The offline on this series was done in DaVinci Resolve, with the exception of the first episode, which was cut in Final Cut Pro X. I’m trying to leave editors open to the choice of the application they like to use. My gentlemen’s agreement with Matt [Cronin], who cut the first pilot, was that he could cut it in whatever he liked, as long as he gave me back a .drp (DaVinci Resolve project) file. He loves Final Cut Pro X, because that’s what he’s quickest at. But he also knows the pain that conforms can be. So he handled that on his side and just gave me back a .drp file. So it was quick and easy.

From episode three onwards, I was delighted to know that Leigh was based in Resolve, as well, as his primary workflow. Everything just transfers and translates really quickly. Knowing that we had six more episodes to work through together, I suggested things that would help us a lot, both for picture for me and for audio as well, which was also being done here in our studio. We’re generating the 5.1 mix.

Brooks: I come from an Avid background. I was an engineer initially before ever starting to edit. When I started editing, I moved from Avid to Final Cut Pro 7 and then back to Avid, after which I made the push to go to Resolve. It’s a joy to edit on and does so many things really well. It’s become my absolute workhorse. Avid is fine in a multi-user operation, but now that doesn’t really matter. Resolve does it so well with the cloud management. I even bought the two editor keyboards. The jog wheel is fantastic! The scrolling on that is amazing.

You mentioned cloud. Was any of that a factor in the post on “Life in Six Strings?”

Tonkin: Initially, when Leigh was reversioning the first two episodes for AXS TV, we were using his Blackmagic Cloud account. But for the rest of the episodes we were just exchanging files. Rushes either came to me or would go straight to Leigh. He makes his offline cut and then the files come to me for finishing, so it was a linear progression.

However, I worked on a pilot for another project where every version was effectively a finished online version. And so we used [the Blackmagic] Cloud for that all the way through. The editor worked offline with proxies in Resolve. We worked from the same cloud project and every time he had finished, I would log in and switch the files from proxy to camera originals with a single click. That was literally all we had to do in terms of an offline-to-online workflow.

Brooks: I’m working on delivering a feature length documentary for [the band] Nickelback that’s coming out in cinemas later in March. I directed it, cut it in Avid, and then finished in Resolve. My grader is in Portsmouth and I can sit here and watch that grade being done live, thanks to the cloud management. It definitely still has a few snags, but they’re on it. I can phone up Blackmagic and get a voice – an actual person to talk to that really wants to fix my problem.

Were there any particular tools in Resolve that benefitted these shows?

Tonkin: The Blackmagic team are really good at introducing new tools to Resolve all the time. I’ve used trackers for years and that’s one of my favorite things about Resolve. Their AI-based subtitling is invaluable. These are around 20-minute episodes and 99% of it is talking. Without that tool, we would have to do a lot of extra work.

Resolve is also good for the more complex things. For example, a driving sequence in the bright California sun that wasn’t shot as camera raw. The only way I could get around the blown-out sky was with corrections specifically for the sky portion of the shot. Obviously, you want to track the subject so that the sky is not going over his head. All of those types of tools are just built in. When I’ve effectively got six hours to work on an episode and I might have about 20 shots like that, then having these tools inside are invaluable from a finishing perspective.

You’ve both worked with a variety of other nonlinear editing applications. How do you see the industry changing?

Tonkin: Being in post for a couple of decades now and using Final Cut Studio, Final Cut Pro X, and a bit of Premiere Pro throughout the years, I find that the transition from offline to online starts to blur more and more these days. Clients watching their first pass want to get a good sense of what it should look like with a lot of finishing elements in place already. So, you’re effectively doing these finishing things right at the beginning.

It’s really advantageous when you’re doing both in Resolve. When you offline in a different NLE, not all of that data is transferred or correctly converted between applications. By both of us working in Resolve, even simple things you wouldn’t think of, like timeline markers, come through. Maybe he’s had some clips that need extra work. He can leave a marker for me and that will translate through. You can fudge your way through one episode using different systems, but if you’re going to do at least six or eight of them – and we’re hopefully looking at a season two this year – then you want to really establish your workflow upfront just to make things more straightforward.

Brooks: Editing has changed so much over the years. When I became an engineer, it was linear and nonlinear, right? I was working on “The World Is Not Enough” – the James Bond film around 1998. One side of the room was conventional – Steenbecks, bins, numbering machines. The other side was Avid. We were viewing 2K rushes on film, because that’s what you can see on the screen. On Avid it was AVR 77. It’s really interesting to see it come full circle. Now with Resolve, you’re seeing what you need to see rather than something that’s subpar.

I’d say there are a lot of editors who are ‘Resolve curious.’ If you’re in Premiere Pro you’re not moving, because you’re too tied into the way Adobe’s apps work. If you know Premiere, you know After Effects and are not going to move to Resolve and relearn Fusion. I think more people would move from Avid to Resolve, because simple things in Resolve are very complicated in Avid – the effects tab, the 3D warp, and so on.

Editors often have quite strange egos. I find the incessant arguing between platforms is just insane. It’s this playground kind of argument about bloody software! [laugh] After all, these tools are all there to tell stories. There are a lot of enthusiastic people on YouTube that make really good videos about Resolve, as well. It’s nice to be in that ecosystem. I’d implore any Avid editor to look at Resolve. If you’re frustrated and you want to try something else, then this might open your eyes to a new way of working. And the Title Tool works! [smile]

This article was originally posted at postPerspective.

©2024 Oliver Peters

Analog Mojo For Your Fairlight Mixes

Before Blackmagic Design acquired the assets, Fairlight was one of the originators of the digital audio workstation. Thanks to its modern integration within DaVinci Resolve, Fairlight has added pro-level audio performance to this all-in-one application. When it comes to recording and mixing real musicians, as well as all levels of audio-for-film/video post, Fairlight brings needed competition to the market. There are audio restoration tools, a built-in sound effects library, and advanced features including Dolby Atmos.

As a mixing application, Fairlight uses the traditional track/mixer/meter bridge configuration. Each track has a corresponding channel strip complete with fader, EQ, and gate/compressor/limiter, plus inserts for other plugin effects. The user interface is optimized for single and multi-display arrangements, but also accommodates Fairlight control surfaces with their own integrated screens.

Blackmagic Design offers a broad ecosystem of native hardware accessories to expand the system, similar to those offered by Avid for Pro Tools. This includes several console/control surface options, an Audio Editor panel, and PCIe cards for multi-channel I/O and audio effects acceleration. Put all of this in play and, according to Blackmagic Design, the system is capable of realtime playback with effects for up to 2,000 tracks.

The analog vibe

This makes Fairlight a nice digital audio workstation, with emphasis on the word digital. What it lacks is the harmonic color and character typical of native analog-style plug-ins available in Pro Tools, Luna, Logic Pro, and other DAWs. Furthermore, the presets are focused on film and TV mixing, but not music. For example, there is no suggested “kick drum” preset for the compressor. If you mix music in Fairlight and want the sonic benefits of analog emulation, then that’s where third-party plugin effects enter the picture.

Typical “vintage” plugins emulate classic British and American consoles and outboard equipment from the 1960s and 1970s. The goal is to duplicate the character of the hardware, such as their unique EQ curves, which would be hard to derive with most digital tools. But, the actual sonic character of these plugins is highly dependent on what was used to model the software, which is usually something you wouldn’t know as the user.

KIT Plugins Blackbird Bundle

KIT Plugins, which is a relatively new plugin developer based in Nashville, made their Blackbird Bundle available to me for this review. Blackbird Studio, one of the premier music studios in town, owns a huge collection of vintage gear, including iconic Neve and API consoles. Some of this hardware was used by KIT Plugins to model their software products. Since Blackbird Studio has its name on these products, nothing gets out the door until studio founder/mixer John McBride has put his stamp of approval on it. As a result, the software tools in the Blackbird-branded plugins match the sound and character of the actual hardware at Blackbird Studio.

The Blackbird Bundle includes six analog-style EQs as part of four plugin effects. Licensing is via a free iLok account. Through the iLok application, you can activate any of these licenses to the iLok Cloud or to an iLok USB dongle (generation 2 or 3). Dongles are popular with freelance mixers, because they are transportable between gigs ($45 for USB-A and $55 for USB-C at Sweetwater). If you opt for cloud activation (free), then you’ll need a working internet connection during your session. Activation to the local computer isn’t supported.

Applying the American and British sound

KIT Plugins’ BB A5 Channel Strip emulates classic API (Automated Processes, Inc) hardware from the API 500 modular series for that “American” sound. This plugin includes a 3-band (55A), a 4-band (55L), and a 10-band graphic (56L) EQ. The key to the character of the 3-band and 4-band EQs is an API innovation called “Proportional Q.” The filter bandwidth stays wide when the gain change is small. It narrows and becomes more surgical as the setting is increased in either direction. KIT Plugins replicated this feature, which isn’t always the case with other API emulations on the market. Like the hardware, these EQs use stepped frequency values to quickly dial in a sound. However, the gain controls can be switched between stepped or variable.
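The concept is easy to model for the curious. The sketch below pairs a standard RBJ-cookbook peaking filter with a Q value that scales with the gain setting. The mapping curve is a linear stand-in of my own for illustration, not API’s or KIT’s actual math.

```python
import math

def proportional_q(gain_db: float, q_min: float = 0.5,
                   q_max: float = 3.0, full_at_db: float = 12.0) -> float:
    """Wide filter at small boosts/cuts, narrower as gain increases.
    The linear mapping here is illustrative, not the API circuit's."""
    amount = min(abs(gain_db), full_at_db) / full_at_db
    return q_min + (q_max - q_min) * amount

def peaking_eq_coeffs(fs: float, f0: float, gain_db: float):
    """RBJ-cookbook peaking biquad whose Q tracks the gain setting.
    Returns normalized (b, a) coefficient lists."""
    q = proportional_q(gain_db)
    a_lin = 10 ** (gain_db / 40.0)
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    b = [1 + alpha * a_lin, -2 * math.cos(w0), 1 - alpha * a_lin]
    a = [1 + alpha / a_lin, -2 * math.cos(w0), 1 - alpha / a_lin]
    return [bi / a[0] for bi in b], [ai / a[0] for ai in a]

# A gentle 2 dB boost at 3 kHz stays broad (Q ~0.9), while a 10 dB
# boost becomes far more surgical (Q ~2.6).
print(proportional_q(2.0), proportional_q(10.0))
```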

The BB A5 interface sports a modern look. Frequency and level knobs are separate and not concentric like on the hardware. Separation makes more sense with mouse control, while concentric works when you can actually place fingers on a physical knob. There are some faux scratches on these interfaces to evoke “vintage.” Fortunately the designers didn’t go overboard with those.

The two Neve emulations (BB N73 and BB N105) give you the classic “British” sound. N105 is based on Blackbird’s highly modified Neve 8078 console. The Neve emulations look more faithful to the hardware with concentric knobs. The N73 uses the familiar 1073 design, but adds an output section modeled after the output bus of Blackbird’s Neve 8058 console. When you enable this function, it adds some subtle harmonics and compression, depending on how hard you drive it. This master bus feature is also on the BB A5.

The last in this bundle is the MO-Q, which is based on a boutique equalizer built by the Motown engineers in the 1960s. McBride owns one of these in his personal collection, which was the basis for KIT’s modeling. It features seven fixed bands at musical frequencies.

On the tech side, these plugins are built with KIT’s proprietary Full Range Modeling that has been sampled from 10 Hz to 96 kHz. There’s oversampling and a ton of presets. The API and Neve models include auto gain and continuous gain. There’s a function called Analog Hum, which applies 60 Hz (approx.) noise to the signal with three level settings. It remains barely audible even at the full settings with the speaker volume cranked. The term “Hum” has a negative connotation, so don’t let that scare you. I didn’t really hear what would be considered traditional hum, but rather very subtle white noise. I think it helps some mixes, but regardless, its use is optional.
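If you’re curious what a feature like that amounts to, here’s a rough stand-in: a 60 Hz tone plus a couple of harmonics, normalized and then dropped far below the program material. The level and harmonic balance are my own guesses, since KIT doesn’t publish theirs.

```python
import numpy as np

def add_hum(signal: np.ndarray, fs: int = 48000,
            hum_dbfs: float = -60.0, hum_hz: float = 60.0) -> np.ndarray:
    """Blend a low-level hum (fundamental plus two harmonics) into a
    signal. Levels and harmonic mix are illustrative guesses."""
    t = np.arange(len(signal)) / fs
    hum = (np.sin(2 * np.pi * hum_hz * t)
           + 0.3 * np.sin(2 * np.pi * 2 * hum_hz * t)
           + 0.1 * np.sin(2 * np.pi * 3 * hum_hz * t))
    hum /= np.max(np.abs(hum))        # normalize to full scale
    hum *= 10 ** (hum_dbfs / 20.0)    # then drop to the target level
    return signal + hum

# At around -60 dBFS the added tone sits far below the mix, which
# matches how subtle the plugin's effect is in practice.
```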

The Fairlight mix

I imported 20 source tracks from a studio recording session into DaVinci Resolve. The tracks included a female singer, background vocals, and the session band. Each channel got an instance of BB A5, which effectively turned the Fairlight mixer into a virtual API console. Pre-amp and EQ control was handled through the Blackbird effect. The channels were routed to the stereo output bus without any additional instrument grouping/summing or VCA bus control.

These plugins come with an extensive array of instrument presets from KIT, Blackbird, and several contributing mix engineers. While I normally treat presets as merely a suggestion or starting point, I decided to leave these pretty much the way they were set up for this test mix. I pulled up a different preset for each track – the kick drum, snare drum, vocals, etc. After some minor tweaking, plus level and panning adjustments, the mix was quickly in a really good place. Granted, it still needed some automation for punch here and there, but the mix was already presentable without it. The only other effects used in this mix were the native Fairlight compressor on the kick drum track, as well as the BB N73 and the native compressor/limiter on the stereo output.

One cool feature that’s usually not seen in other plugins is the Link function. If you have multiple instances of these plugins in the mix – as I did by placing BB A5 on each channel – then you can link the oversampling status, UI size, and hum setting. With link enabled, the values for one instance will then be matched on all the others. That’s really useful when setting up 20 instances of the same plugin across all channels.

One minor nuisance in this test mix was that the Link settings didn’t “stick” for the BB A5 plugin when saving a project. I had to change them from the default again when re-opening the project. Fortunately, that’s simply a matter of changing it for one and then the other linked effects will update correctly. I’ve reported this issue to KIT, so hopefully there will be an update soon. It did work correctly for the Neve emulations.

Some could argue that the difference between Fairlight’s native digital effects and analog emulation is too subtle to worry about. You may or may not agree. But using plugins like those in the Blackbird Bundle also lets you get to a mix more quickly and reduces the amount of time lost to the indecision often caused by a plethora of digital options.

These benefits also apply to any film/TV/social media project. That’s why editors using DaVinci Resolve owe it to themselves to get comfortable with the Fairlight page. Once your project extends past a few tracks, then you’ll get a better mix using Fairlight than simply staying in the Edit page. Plugins, whether native Fairlight or third-party analog emulations, will make your mix sound more polished. I’ve used a number of brands and those in KIT’s Blackbird Bundle are high-quality and easy to use. Given the Grammy-winning pedigree behind the software, it’s a great way to add analog mojo to your digital mix.

This article was originally written for Pro Video Coalition.

©2024 Oliver Peters

Blackmagic’s Resolve Micro Panel

Six years ago Blackmagic Design expanded its line of color correction control panels by adding the Micro and Mini panels to the full-size Resolve Advanced panel. At the time I reviewed the Mini panel. I typically split my duties between editing and color correction, so much of my grading work is done with a mouse. That includes full sessions in DaVinci Resolve, as well as Lumetri Color when the project stays totally inside Premiere Pro. However, now it was time to finally buy and use the Micro panel on a real project.

The challenge

On a recent grading project, the thought of doing it all with a mouse seemed taxing to me. I was fortunate to get the production company I work with to purchase the Micro panel. We service a major international cruise line with creative content used for marketing, corporate communications, shipboard videos, social media, and travel-related entertainment content.

Over the years, the company has accumulated and curated quite a lot of B-roll footage from destinations all around the globe. This project involved building new presentations for their shipboard sales centers. We created 11 hours of edited content (in Premiere Pro), split into 22 half-hour blocks set to music. These blocks represented ten worldwide travel regions, plus an hour of sunrises and sunsets. The source footage came from a wide range of sources, cameras, and file formats, along with some additional stock footage.

In order to be consistent and efficient, we opted to run this through Resolve for the final look, instead of tackling it with Lumetri tweaks in Premiere Pro. Although the amount of footage was comparable to several feature films, the budget and delivery times weren’t. Grading had to get done as quickly as possible. In total, there were over 5700 clips across the 11 hours. Grading time (excluding scene cut detection, exports, and quality-control checks) took roughly a day for each region (one hour of content).

Why the Micro was the right panel

As I previously wrote, if you are an editor/colorist and grade in Resolve less than 50% of the time, then the Micro is probably the right choice. More than 50% and you should look at the Mini. If you are a full-time, successful colorist, then the investment in the full-size Advanced panel is likely the right decision. We opted for the Micro for two reasons. First, I fit into the first category. Second, I have limited desk space and need to operate the panel, keyboard, and mouse within a comfortable space.

The DaVinci Resolve Micro panel is just a control surface without any integrated displays. It plugs in via USB and comes with a USB-A to USB-C cable. That connection carries minimal power and control data. The packaged version sold through most retail outlets is bundled with a license of Resolve Studio, but the panel doesn’t need to be paired with it. So if you already have DaVinci Resolve Studio installed, simply plug in the panel and you are ready to go. Unlike other panels, such as those from Tangent, these Blackmagic panels will only work with Resolve. However, there is no set-up required. You can tweak the gearing of the knobs and rings within Resolve, but if you like the default feel, then it’s ready right out of the box.

The panel layout

The Micro panel is designed with three trackballs and rings. These trackballs control the lift/gamma/gain values for the standard color wheels. However, there are two modifier controls. Press the “log” button and these controls change to log wheels. Press “offset” and then the left and middle rings control temperature and tint, while the right trackball/ring controls the offset wheel. By toggling the offset button you can control all four values: lift, gamma, gain, and overall. Through this configuration, you can control two of the three sets of Resolve color wheels – just not HDR wheels or color bars (color mix sliders). Across the top of the panel are knobs for Y controls, contrast, pivot, saturation, hue, and more. The right side consists of navigation, recall, and transport control buttons.

Since the Micro panel covers a smaller subset of DaVinci Resolve color controls, you cannot totally do away with the keyboard and mouse (or pen and tablet if that’s your interface preference). For example, camera raw settings, curves, noise reduction settings, and setting up power windows all require use of the keyboard and mouse. The same is true for saving and loading memory positions.

The Micro panel permits you to save a still to the gallery. In Resolve these stills are thumbnails plus an embedded grade for that clip. You can recall a still from the panel, preview the grade onto a new clip, and apply it. But if you’ve grabbed several stills, then you have to go through the gallery and select the correct clip.

What you can’t do from the panel – and I find this a huge design oversight – is to save a grade into one of Resolve’s eight memory buckets and then recall and apply that grade by number. It would be great if even one or two were made available through the panel, or if you could recall and apply the grade from one or two clips back. This should have either been integrated into the panel design or been a function that could be custom-mapped onto one of the other buttons. For now, you simply have to perform these tasks with the keyboard.

Use encourages discovery

The more functions you can perform within a single node, the more you can make use of the panel’s controls. However, if you create additional nodes with the keyboard, you can navigate between them using the panel. The majority of my clips only required one node, but some used more for power windows and tracking. Once I had created qualifiers and/or masks with the mouse, color adjustments could proceed via the panel. And you can step back and forth between the nodes from the panel.

When you work with a panel you tend to use controls that you wouldn’t normally use with just a mouse. Maybe you stick to the color wheels when working with a mouse. But, with the panel you might reach for the knobs more often. Internal Resolve color science is based on YRGB and not just RGB. When you rotate one of the trackball rings to affect brightness, it changes the YRGB values equally. At the top of the panel are Y-Lift, Y-Gamma, and Y-Gain knobs. Adjusting these is the same as changing only the Y value under each wheel in the user interface. Adjusting a Y-Gain knob versus the trackball ring will have a different impact on the saturation of the clip.
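A toy example makes the difference concrete. In the sketch below, a plain RGB gain preserves the channel ratios, and with them the apparent saturation, while a luma-only adjustment lands at a lower saturation. The luma model here (adding the Rec. 709 luma delta equally to all three channels) is an illustration, not necessarily Resolve’s exact internal math.

```python
import colorsys

REC709 = (0.2126, 0.7152, 0.0722)

def luma(rgb):
    """Rec. 709 luma from RGB values."""
    return sum(w * c for w, c in zip(REC709, rgb))

def rgb_gain(rgb, g):
    """Trackball-ring style gain: scales all three channels, so the
    channel ratios (and apparent saturation) are preserved."""
    return tuple(c * g for c in rgb)

def y_gain(rgb, g):
    """Toy Y-only gain: compute the luma delta and add it equally to
    R, G, and B. Illustrative, not Resolve's exact math."""
    delta = luma(rgb) * (g - 1.0)
    return tuple(c + delta for c in rgb)

pixel = (0.6, 0.3, 0.2)  # a warm midtone
for label, adjusted in (("RGB", rgb_gain(pixel, 1.3)),
                        ("Y", y_gain(pixel, 1.3))):
    _, s, _ = colorsys.rgb_to_hsv(*adjusted)
    print(f"{label}-gain saturation: {s:.3f}")
# RGB gain holds saturation at ~0.667; the Y-only version drops to ~0.566.
```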

As with most control surfaces, you build up muscle memory after working with it for a while. Your hands intuitively know where to go without the need to look up at a screen and see what your mouse is doing. The panel is inherently a faster way of working, and it also reduces the potential wrist and hand fatigue caused by constant mouse use.

For us, getting Blackmagic’s DaVinci Resolve Micro panel was definitely a good investment. When I’m mainly editing and not grading, then I simply unplug the panel and set it aside. Need it again? Plug it back in and it’s ready to go. Sometimes a little bit of kit makes you better, more creative, and fresher at the end of the session.

©2023 Oliver Peters

Blackmagic Camera vs FiLMiC Pro

As one would expect, IBC 2023 featured new product announcements from a host of manufacturers including Blackmagic Design. They announced hardware product updates along with the new Blackmagic Cinema Camera 6K. This is a DSLR-style package with a full-frame sensor, OLPF, and the ability to record Blackmagic RAW. But if you want to shoot in RAW, you’ll also need to update Resolve to version 18.6, as well as the Blackmagic RAW Player. Sample clips, shot by director/dp John Brawley, are available at Blackmagic’s website.

Camera announcements are something we’ve come to expect from the Aussies, but equally intriguing was the immediate availability of the free Blackmagic Camera application for the iPhone (iOS 16.0 or higher). If you’ve done any serious iPhone filming, then you’ve likely been using Filmic Pro. Although there are alternative applications, including the built-in Camera app, Filmic Pro has been the go-to for those serious about using the iPhone in a professional environment.

I’ve done a number of write-ups about Filmic Pro and the related workflow to get the media to post, including camera-to-cloud. I was eager to compare and contrast the two applications. Let’s start with some key points. Blackmagic Camera to date is only marketed as an iPhone application. I don’t know if it will work on an iPad, because mine doesn’t support iOS 16. Filmic Pro is available for both iOS (iPhone and iPad) and Android devices.

Both apps cover the basics well and even offer advanced functions like desqueezing the image from anamorphic lenses. Overall though, Filmic Pro has more features, including the ability to record offspeed (such as 60fps capture for 24fps playback) and to apply various film looks in-camera. You can also turn on the flashlight function of the iPhone and use it as a camera light. 

Conversely, Blackmagic Camera includes an in-app chat function between the videographer and the editor using Blackmagic Cloud. Considering that this is Blackmagic Design’s first iPhone camera application, it’s nicely polished right out of the gate. Although there’s no user guide yet, the interface is intuitively designed.

The big difference is cost. Blackmagic Camera is free, whereas Filmic Pro, starting with version 7, is now only available through an in-app subscription. If cost is the main driver, then Blackmagic wins; but if you want the most features, then I presume Filmic Pro would get the nod.

Hardware and operating system limitations

Settings and operation for both applications mirror each other closely. That’s because these are really just software “overlays” for the camera sensor within the phone, the lens configuration, and the operating system. I’m not a big iPhone guy and don’t run out to get the latest and greatest, so I’m running a 2020 iPhone SE (2nd generation). Even though both applications can encode to ProRes, my phone is limited to H264 and H265 (HEVC). It has a single 28mm lens, so I don’t have the other lens options that each application could take advantage of.

Another item under control of the operating system is frame rate. Both apps can only capture in whole number frame rates, such as 24.0 instead of 23.976 fps. Captures use a variable frame rate. Essentially this means that the exact duration of each frame will vary slightly and not be a constant. For instance, a 24fps file might show up as 24.6354 and not 24.0000. Neither of these pose huge issues for most NLEs, but it’s one area where mobile recordings differ from professional hardware.

The only exception to the variable frame rate files was my offspeed Filmic Pro recordings. The reason is that Filmic Pro does this in two passes. Once the file is captured, it is retimed and written to storage. During retiming, the file is properly corrected to a “solid” (non-variable) frame rate.
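If you’d rather normalize such files before the edit, ffprobe can flag a VFR recording and ffmpeg can re-encode it to a constant rate. A quick sketch, assuming both command-line tools are installed:

```python
import subprocess

def report_frame_rate(path: str) -> None:
    """Print the nominal vs. average frame rate via ffprobe; a mismatch
    between the two usually signals a variable frame rate file."""
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=r_frame_rate,avg_frame_rate",
         "-of", "default=noprint_wrappers=1", path],
        capture_output=True, text=True, check=True)
    print(result.stdout)

def to_constant_frame_rate(src: str, dst: str, fps: str = "24") -> None:
    """Re-encode to a constant frame rate before editing. -fps_mode cfr
    requires a recent ffmpeg build; older builds use -vsync cfr."""
    subprocess.run(["ffmpeg", "-i", src, "-r", fps, "-fps_mode", "cfr",
                    "-c:a", "copy", dst], check=True)
```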

The iPhone sensor image is overly sharpened to hide some of the flaws inherent in the design. This is especially noticeable in 1080 versus 4K and at slower frame rates. Fine detail and complex textures, like tree foliage and small blades of grass, will scintillate, particularly during horizontal movement.

Much of this is a result of the lack of an OLPF (optical low pass filter) plus enhanced sharpening in the camera sensor. Some of it can be counteracted in post using various softening effects. Of course, the operator can mitigate this by sticking to close-ups or images that don’t have fine detail, as well as not moving the camera too much. Naturally, such limitations aren’t very realistic. While these present some challenges for the videographer, they are out of the control of third-party application developers.

Similarities

With the caveats that I’ve outlined, let’s talk about how Blackmagic Camera and Filmic Pro offer a similar experience. I recorded my test footage in the best quality available to me on the SE, which was HEVC (H265 on Blackmagic Camera and Filmic Extreme in Filmic Pro). I have both Filmic Pro Legacy (v6.x) and Filmic Pro (v7.x) installed, so I was able to do some test recordings using all three applications. I shot 1080 at 24fps, as well as a few clips with slight slomo. I was able to do this in-camera (30-over-24) with Filmic Pro. For the Blackmagic Camera app, I recorded 1080p/24 plus some 4K clips at 30fps. The latter were conformed (slowed) to 24fps in the edit timeline.

The Blackmagic H265 files are encoded with a very tight bitrate range, so all of those files were encoded at a hair over 12Mbps for 1080p/24 and around 54Mbps for 2160p/60 (4K). The Filmic Extreme H265 setting uses a target rate of 50Mbps for 1080, but with wide minimum and maximum rates. The Filmic Pro files ranged between 20 and 50Mbps for 1080p/24. 2160p/60 (4K) files were around 125Mbps. Regardless of these encoding rates, none of the files displayed any significant compression artifacts.
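To put those bitrates in perspective, file size is just megabits per second times duration, divided by eight:

```python
def file_size_mb(bitrate_mbps: float, seconds: float) -> float:
    """Approximate file size: (megabits/sec * seconds) / 8 = megabytes."""
    return bitrate_mbps * seconds / 8.0

# One minute of 1080p/24 at ~12 Mbps is roughly 90 MB, while the same
# minute at Filmic Extreme's 50 Mbps target is closer to 375 MB.
print(file_size_mb(12, 60), file_size_mb(50, 60))  # 90.0 375.0
```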

Naturally, if you shoot handheld with an iPhone, it’s hard for it not to be shaky. Both apps offer the same three levels of stabilization, which are set by the sensor and operating system. Standard mode uses only optical sensor stabilization. Cinematic mode is sensor stabilization enhanced by the operating system. Cinematic Extreme (labeled Cinematic+ in Filmic Pro and Extreme in Blackmagic Camera) applies even more intense software smoothing. These last two settings increasingly add latency to the preview image on your screen. I recommend using Standard and then supplementing that in the NLE with a stabilization plug-in for better control.

Finally, there’s the issue of motion blur. If you shoot at 24fps, it’s because you want it to look more film-like. That look is due in part to motion blur caused by the length of time that the camera shutter stays open. For 24fps, you’d typically want a 180-degree shutter angle, which is 1/48th of a second. This results in a slight blur within each frame as you pan or as a person or vehicle moves across the screen. A smaller fraction, like 1/120th, gives you crisper frames. This causes moving objects to appear to stutter or strobe as they travel across the scene.

The shutter angle/speed that you can set is balanced against ISO/exposure. If you want to maintain a 180-degree (1/48th) value, then the ISO has to be low. Both applications allow you to run in an Auto mode or dial in specific values and lock those settings. Unfortunately, if you are shooting with an iPhone on a sunny day at 24fps and at 1/48th shutter speed, then it’s impossible to get the ISO low enough. The shot will be completely overexposed. To make those settings useful in bright daylight filming, it’s imperative that you invest in accessories, particularly filters. External ND (neutral density) filters lower the amount of light entering the lens.
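The underlying math is worth keeping at your fingertips: shutter time is (angle ÷ 360) ÷ frame rate, and each stop of ND cuts the light in half. A small sketch:

```python
import math

def shutter_seconds(fps: float, shutter_angle_deg: float = 180.0) -> float:
    """Shutter time from frame rate and shutter angle."""
    return (shutter_angle_deg / 360.0) / fps

def nd_stops_needed(overexposure_ratio: float) -> float:
    """Stops of ND needed to correct an overexposed shot; each stop
    halves the light, so it's log base 2 of the ratio."""
    return math.log2(overexposure_ratio)

# 24 fps at a 180-degree shutter works out to 1/48 sec. If that leaves
# a sunny exterior 16x too bright, you'd need a 4-stop ND (ND 1.2).
print(1 / shutter_seconds(24), nd_stops_needed(16))  # 48.0 4.0
```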

Differences

Both camera applications can record in 16:9 at 720, 1080, and 4K resolutions. But if you want more, that’s where Filmic Pro shines. You get those, but also 2K, 3K, and down to 540. There are also different ratios, such as 3:2, 4:3, 1:1, 2.39:1, and others. For example, in my previous test, I shot with Filmic Pro set to 3K at 2.39:1 for a 2K DCI edit. To do the same with Blackmagic Camera, I would need to shoot in 4K and set my display crop marks for the right framing. And of course, stick within those marks during filming. That’s rather hard to do in bright sunshine when you can barely see the screen.

The Blackmagic Camera application includes a unique vertical orientation mode. You can lock the aspect to 16:9 horizontal, but still hold the iPhone vertically. The display is repositioned into the upper portion of the screen and the sensor is cropped to record the expected 16:9 horizontal image. However, in my tests this yielded some odd files. Even though the file looked correct, the resolution was identified as 1214 x 2160 instead of 1920 x 1080, and 60fps instead of 24fps. Yet the same file came up as 2160 x 1214 in Final Cut Pro’s inspector, which is almost exactly 16:9, so the recorded image is right even if the metadata is confusing. Clearly Blackmagic is manipulating the pixel aspect ratio to achieve this magic. The data rate was also predictably higher at around 36Mbps instead of the 12Mbps rate of the standard 1080 files. While it’s an interesting feature, I wouldn’t recommend it for professional filming.

Both camera apps let you adjust exposure, temp, and tint or run in an Auto mode. Filmic Pro also offers optional Film Looks that are applied during capture. These emulations include options like infrared, black-and-white, and film stocks. My test footage for Filmic Pro v7 was recorded using their Bel Air look, which is inspired by Kodak Vision stock.

Blackmagic Camera only offers the ability to load LUTs to the preview display. I presume this is for two reasons. First, I think the point of this app is to use the iPhone as your B-camera, side-by-side with a Blackmagic Cinema Camera 6K or maybe an Ursa. I’m sure Blackmagic Design would rather have you buy one of their actual cameras instead of only using a free app on your phone. The second reason is that the envisioned workflow has you finishing your edit in DaVinci Resolve. Not only a best-of-class grading tool, but it also includes a few built-in film emulation effects.

Camera-to-cloud

It’s a puzzle to me why manufacturers are jumping on this bandwagon. The camera-to-cloud concept got a boost during covid, but I question its usefulness as a general workflow. Most of the world does not have super-fast internet and wi-fi. When companies I’ve worked with have tried a camera-to-cloud workflow, the results weren’t great. Even the most stalwart advocates see it in general use maybe by 2030. Regardless of my feelings towards the matter, if you do want to utilize a camera-to-cloud solution, then both of these companies have you covered.

Blackmagic Camera lets you upload original and/or proxy media to Blackmagic Cloud. The company sees this as part of an ecosystem focused on Resolve as the hub. Filmic Pro does the same with proxies to Frame.io, if you add the Cinematographer Toolkit option. It requires a Frame.io pro or Adobe Creative Cloud account.

The Blackmagic Cloud has been upgraded to accept media files in addition to collaborating with Resolve project files. It’s still in beta, but the current pricing is free up to 2GB and then ranges from $15/mo. for 500GB to $240/mo. for 8TB and so on. The maximum is 1 Petabyte. However, unlike a Frame.io account, Blackmagic Cloud storage is a closed ecosystem tied to their cameras, Resolve, and their storage devices. You cannot simply log into a Blackmagic Cloud account from a browser and drag-and-drop files.

Conclusion

I shot some test clips with Blackmagic Camera, Filmic Pro Legacy, and Filmic Pro v7 (current version) on my iPhone SE and threw together a short video that I’ve posted on Vimeo. Overall, the best look came out of the 4K files, shot at 30fps, and then scaled and slowed to 1080p/24. I also added some slight softening, stabilization, and minor color correction to many of these clips.

My impression with all of these apps is favorable. Truthfully, I’m not all that wild about smart phone videography. But under the premise that the best camera is the one you have with you, then there’s a compelling argument for using any of the recent smart phones. The new top-of-the-line iPhone 15 models can turn out some stunning imagery and enable such features as ProRes and log recording, along with the ability to use external media storage.

As an editor, I get plenty of poorly shot iPhone footage. Usually shaky, often vertical, and more often than not in DolbyVision (argh). It would make my job easier and the end result better if any of these apps were used instead of the default Camera app. Blackmagic Camera and Filmic Pro are easy to set up with well-designed, simple interfaces. While there are tools in each designed for videographers, even an amateur can get great results.

As I said up top, it comes down to features versus cost. Filmic Pro makes a great application that has been the standard for smart phone videography. But Blackmagic Design has tossed down the gauntlet. Even if you need the bigger feature set of Filmic Pro or have already purchased it, there are still plenty of reasons to also install Blackmagic Camera. The apps don’t conflict with each other and it gives you another option should you need it. If you are using an iPhone as a companion camera alongside another Blackmagic Design camera, then that’s all the more reason to add it to your iPhone.

Addendum: Filmic was acquired by Bending Spoons of Italy in late 2022. Reports trickling out here at the end of 2023 indicate a layoff of the original Filmic team. Bending Spoons is moving towards integrating the Filmic apps into its platform. We’ll have to wait and see how this impacts the future of the Filmic Pro and Firstlight applications.

©2023 Oliver Peters