When it comes to audio plug-ins, video editors have different needs than audio mixers. Sure, you need EQ, compression, and limiting, but the plug-ins you reach for most often revolve around the clean-up and enhancement of dialogue.
There are a number of third-party plug-in solutions that augment the built-in enhancement features of most editing applications. However, solutions like the full version of iZotope RX can be pricey, especially for upgrades. Accusonus, another alternative, has left the plug-in business. The tools in Premiere Pro, Final Cut Pro, and DaVinci Resolve vary in effectiveness, and they generally lack comprehensive user control, often presenting only a single amount slider. So there’s room for innovation.
Version 2 was introduced in 2019 and added 64-bit OS support, improved algorithms, and Mid/Side processing functions. Acon Digital’s software supports Windows and macOS and runs natively on Apple Silicon, as well as Intel processors. The plug-ins install in AU, AAX, VST, and VST3 formats.
Of the four plug-ins that constitute Restoration Suite 2, DeNoise 2 provides the most versatility. It’s designed to remove background noise, like wind or waves hitting the beach, but there are additional factory presets for voice and music. You can run it adaptively or with a noise profile (noise print). Adaptive processing can tackle broadband or combined noise. The difference is that combined processing takes into account noise with a tonal quality, like hum. Using the combined mode will affect the voice to a greater degree.
The noise profile works in a similar fashion to other tools. Run a piece of the audio that only has background noise for a few seconds and click Learn. Then click the “power” icon to apply that noise profile. According to Acon Digital, “Version 2 introduces the novel dynamic noise profiles that help reducing noise that varies randomly over time, such as wind noise or rustle from lavaliere microphones. Where the earlier versions merely captured a static noise print with time-constant noise levels, the dynamic noise profiles capture statistics from the noise to be reduced. The noise suppression algorithm then estimates the most suitable noise threshold curve for the noisy input signal using the measured statistics.”
Unlike most other noise reduction plug-ins, the DeNoise 2 interface includes controls for reduction, knee, attenuation, and reaction time. The Adaptation Time slider sets how long before the plug-in responds to changes in the noise floor. A shorter time means that processing kicks in more quickly, but it can affect the desired signal.
A histogram dynamically displays the audio signal versus the processing curve. Click Emphasis and you now have control at multiple frequency ranges. Let’s say you want to remove background wind noise. That’s usually a higher frequency noise. Simply drag down that control point and adjust the curve. You can then raise the other control points if you like. While sounds like wind noise work well in the adaptive mode, I got the best result setting a noise profile and using Emphasis to tweak the control points. Finally, there’s a Mid/Side mode if you are working with stereo source material.
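Under the hood, the learn-then-apply workflow is conceptually similar to classic spectral subtraction: measure an average noise spectrum from a noise-only stretch, then subtract it frame by frame. Here is a rough, hypothetical Python sketch of that idea using a static profile only – not Acon Digital’s actual dynamic-profile algorithm:

```python
import numpy as np

def learn_noise_profile(noise, frame=512):
    """Average the magnitude spectrum over noise-only frames (a static 'noise print')."""
    frames = noise[:len(noise) // frame * frame].reshape(-1, frame)
    return np.abs(np.fft.rfft(frames, axis=1)).mean(axis=0)

def denoise(signal, profile, frame=512, reduction=1.0):
    """Spectral subtraction: attenuate each frame's magnitudes by the learned noise floor."""
    out = np.zeros_like(signal, dtype=float)
    for start in range(0, len(signal) - frame + 1, frame):
        spec = np.fft.rfft(signal[start:start + frame])
        mag = np.abs(spec)
        # Subtract the noise estimate, clamping at zero (no negative magnitudes).
        clean = np.maximum(mag - reduction * profile, 0.0)
        # Resynthesize with the original phase; any trailing partial frame is skipped.
        out[start:start + frame] = np.fft.irfft(clean * np.exp(1j * np.angle(spec)), n=frame)
    return out

# Toy example: a sine "voice" buried in white noise
rng = np.random.default_rng(0)
t = np.arange(48000) / 48000.0
noise_only = 0.1 * rng.standard_normal(48000)
noisy = np.sin(2 * np.pi * 440 * t) + 0.1 * rng.standard_normal(48000)

profile = learn_noise_profile(noise_only)
cleaned = denoise(noisy, profile)
```

Real tools add overlap-add windowing, smoothing, and the dynamic statistics Acon Digital describes, but the learn/apply split is the same.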
DeClick filters are commonly used to remove clicks, scratches, and pops in recordings. This is typical with any music tracks that come from an older vinyl LP. Another source of clicks can be digital recordings. This filter can be used to minimize them if minor, but if the audio is completely trashed, you are out of luck. However, DeClick 2 has other uses, such as the reduction of plosives – a voice-over announcer popping “p” consonants. There are separate factory preset groups for 78 RPM, Vinyl, and Voice. The latter group includes presets for reducing mouth clicks and for plosives. So DeClick 2 covers more audio artifacts than the name might imply.
DeClip 2 is designed to restore distorted, clipped recordings, such as an over-driven voice recording. It replaces distorted peaks with an estimate of the proper signal level. The histogram displays the signal with positive and negative threshold sliders.
The first step is to adjust the input gain so the signal is loud enough for the filter to make a proper correction. But, make sure some headroom is left. Pick the worst-sounding section and click Detect for an automatic selection. Then manually tweak the two threshold sliders to fine-tune the sound. Adjust the output gain if needed. This filter did a great job for me in recovering the transients in my test clips, thus repairing otherwise distorted voice recordings.
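Conceptually, declipping treats the flat-topped samples as missing data and estimates them from the intact waveform around them. A simplified, hypothetical sketch using cubic-spline interpolation (real declippers like DeClip 2 use more sophisticated signal models):

```python
import numpy as np
from scipy.interpolate import CubicSpline

def declip(x, threshold=0.8):
    """Replace samples at or above the clip ceiling with a spline estimate
    drawn through the surviving (unclipped) samples."""
    x = np.asarray(x, dtype=float)
    good = np.abs(x) < threshold
    spline = CubicSpline(np.flatnonzero(good), x[good])
    out = x.copy()
    bad = np.flatnonzero(~good)
    out[bad] = spline(bad)  # re-estimate the flattened peaks
    return out

# A sine hard-clipped at 0.8, then "repaired"
t = np.arange(1000) / 1000.0
orig = np.sin(2 * np.pi * 5 * t)
hard = np.clip(orig, -0.8, 0.8)
fixed = declip(hard, threshold=0.8)
```

The spline overshoots past the clip ceiling and restores a rounded peak, which is why the manual threshold sliders matter: set them too low and healthy samples get rewritten too.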
I didn’t have a real-world dialogue source with hum to test this last plug-in. Instead, I created my own, taking a VO track, mixing in 60Hz hum, and bouncing that out as my test clip. DeHum 2 completely removed the embedded hum without distorting the voice. There are presets for 50Hz and 60Hz sources, along with a variable frequency control. You can manually dial in the frequency and sensitivity or click the “target” icon to set the profile automatically. The hum in my test clip turned out to be 59.97Hz instead of a true 60Hz.
The number of harmonics can be selected, should the offending hum have those. This is displayed on the histogram. There are two modes selected by toggling the Notch Filter button. The Notch Filter mode reduces CPU load, but can impact the voice more. When it’s disabled, DeHum 2 subtracts a hum signal created through a re-synthesis technique in order to minimize signal distortions.
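The notch-filter mode can be approximated with a standard IIR notch tuned to the measured hum frequency, repeated for each harmonic. A hypothetical sketch using SciPy (the Q value here is illustrative, not Acon Digital’s):

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 48000
t = np.arange(fs) / fs
voice = np.sin(2 * np.pi * 440 * t)         # stand-in for the dialogue signal
hum = 0.5 * np.sin(2 * np.pi * 59.97 * t)   # mains hum, slightly off a true 60 Hz
noisy = voice + hum

# Notch the measured fundamental plus two harmonics; higher Q = narrower notch,
# which removes less of the surrounding voice energy.
cleaned = noisy
for k in (1, 2, 3):
    b, a = iirnotch(k * 59.97, Q=30, fs=fs)
    cleaned = filtfilt(b, a, cleaned)       # zero-phase filtering
```

This also shows why notching can impact the voice: every notch removes a sliver of the program material too, which is the trade-off the re-synthesis mode avoids.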
Acon Digital has developed a very useful audio enhancement/repair toolkit for video editors. It’s also handy for anyone producing podcasts – especially those recording interviews via Zoom. These four plug-ins are easy to set and adjust and give you plenty of control. Three include a solo function to monitor the noise being removed. In addition to the factory presets, you can save your own – tailored to your particular audio needs.
For complex challenges, stack more than a single instance of these filters onto a clip. For instance, you might apply DeClick 2 (remove plosives) plus DeNoise 2 (remove background noise) to the same on-camera presenter audio for the cleanest results. Each filter can tackle a range of similar audio artifacts.
Compare that to competing products: others might require you to buy several plug-ins to tackle the same set of conditions. For instance, you might have to purchase separate filters for plosive removal and mouth clicks, rather than one filter that’s able to perform either task. With only four individual plug-ins contained in the Restoration Suite 2 bundle, you might mistakenly think another bundle with more plug-ins is also more comprehensive. That’s definitely not the case here.
Along with Restoration Suite 2, Acon Digital’s line-up includes Acoustica, a separate Mastering Suite, and individual plug-in products, plus a handy, free reverb filter (Verberate Basic). You may run the software on as many computers as you personally own and have control of. All of the plug-ins that I tested worked well in the Apple and Adobe DAWs and NLEs that I use. Be sure to check out a trial version first if you have any questions about your particular kit.
This week Apple dropped a big one on its users. By the end of May, there will be new, mobile versions of Final Cut Pro and Logic Pro for the iPad. These apps feature interfaces optimized for touch and the Pencil. Both will also include some advanced features not yet found in the desktop versions. Presumably updates of those will also come in short order to maintain compatibility.
The other two Apple professional video applications – Motion and Compressor – have been left out of the loop. At least for now. System requirements for Final Cut Pro are somewhat different than for Logic Pro. FCP will require a newer iPad Pro or iPad Air. Logic Pro will run on any iPad with an A12 Bionic chip or later. Both require iPadOS 16.4 or later.
The timing of this announcement is curious, since it precedes WWDC 2023. The speculation is that it was designed to beat Google to the punch one day before the Google I/O 2023 event, where details of the Pixel Tablet were revealed. It’s an iPad competitor, although not equal to the iPad Pro models. However, by making this early iPad-related announcement, Apple gains the attention of the tech press and might pull some attention away from Google. Hmm…
What does the competition look like?
I reviewed LumaTouch’s LumaFusion when it was launched five years ago. At a casual glance, I would say that Apple picked up design inspiration from LumaTouch. Of course, it could be argued that LumaTouch mimicked FCP in the first place. There’s also Adobe Rush and Blackmagic Design’s port of DaVinci Resolve to the iPad. Rush uses the same UI for mobile and desktop versions, but it’s completely different than Premiere Pro. Unlike the others, Resolve’s iPad application is largely the same as Resolve on the desktop.
Toss in iMovie and GarageBand and you now have at least five NLEs and two DAWs running on the iPad. (Search the App Store and you’ll find more, but most are ones you’ve never heard of and are not marketed for high-end use.) I have no doubt that Final Cut Pro and Logic Pro will be the most fluid performers among these various options. The obvious question is what type of user are these really designed for?
Split the professional user market into two camps. In one camp, you have traditional editors who cut commercials, corporate videos, television shows, and films. In the other, you have content creators who produce much of the media seen on social platforms. Social media editors range from individual influencers to mini-marketing firms. They produce unboxing and review videos for a fee (or ad revenue) and populate multiple YouTube (and other) channels under various channel names. The needs of these two types of pros – traditional and social – can diverge wildly, but there is also some overlap. For instance, many of the larger social media content creators have made investments into traditional high-end gear – lighting, cameras, post infrastructure, etc.
Meanwhile, Apple has been looking at a different type of pro who is working the social media side of the market. After all, the needs of social media content creators are different than those of a feature film editor. Many up-and-coming video enthusiasts have their sights set on being the next big YouTuber rather than the next Walter Murch. An application that works well for this new type of pro – and the aspiring pro – appeals to a wider set of potential buyers. Furthermore, this demographic skews younger – meaning that they are less likely to rely solely on traditional desktop computers and laptops (“What’s a computer?”). They are also more open to not owning gear and software – i.e. open to subscription models.
Current US ownership of tablets is around 66%. Gen Z (those born in the late 1990s or early 2000s) is around one-fifth of the US population. From most studies, these younger users tend to use multiple devices, prefer mobile devices, and are less patient with technology issues, like slow or less-than-fluid operations.
Designing for the needs of the “new” professional
It’s into this market that Apple is now releasing Final Cut Pro and Logic Pro for the iPad. The iPad itself has always been a product that’s hard to pin down. Is it a creation device or a consumption device or both? For me, the iPad Pro models still don’t make me want to use them over a standard laptop or desktop unit. Maybe I’m too old, but I also don’t produce content on location. I’m also not a fan of the less accurate touch environment. I prefer the resources available from a traditional computing device. However, there are many users of all ages who travel and produce media along the way. For them even a small laptop is more than they’d like to carry. This is another group for whom the iPad Pro is ideal.
Whatever the ideal market may be, Apple still has the need to convince buyers that an iPad Pro is a comparable creative tool to any of its other computers. From this angle, shouldn’t a product labelled Pro also have software labelled Pro if it comes from the same company? Whether it really is or not.
I have not tested either of these two applications. However, this iJustine “first look” video is a good place to start if you are curious. Based on the early videos that I’ve seen, I believe the claim that these versions will only differ slightly from their desktop siblings is hyperbole at best. These are 1.0 versions of reimagined applications.
One feature that has received positive response on forums is the Scene Removal Mask for FCP on the iPad. The demo shows a person on a solid white background. This is an easy replacement. The feature is much like Keyper by Sheffield Softworks (available through FxFactory). I have tested that and the use cases are minimal. Once you use a real-world background rather than a white cyc, it’s hard to perfectly refine the edge of a moving person. Keyper – and presumably the Scene Removal Mask as well – is fine when you want to place text behind a person or add some visual effect to the background, but not the person. In such situations, the edge refinement issue is less obvious. Maybe Apple’s in-house approach yields better results, but I’m skeptical.
On the other hand, there are two features that look intriguing to me. The first is Live Drawing, which takes advantage of the Pencil. Draw lines on the screen to highlight something and FCP will then “animate” those lines by stacking a series of connected title clips. The other is an elegant form of music retiming. Adobe Premiere Pro and Audition have had a similar feature for years. The FCP version “automagically” lengthens or shortens a music track to fit your video as you slide the end of the clip. It seems to work more elastically than Adobe’s feature. It’s not clear yet what the actual control variables are nor what audio artifacts occur when you do this.
Things to consider before you leap
There are several things to consider, such as getting media in and out of an iPad and whether or not you can work with an external drive. According to iJustine, media is ingested and stored within the FCP library file on the iPad itself. If you intend to do serious work, you’ll probably want a 2TB model. Right now a fully loaded 2TB iPad Pro with Pencil, Smart Keyboard Folio, and AppleCare (2 years) is about $2,900, plus taxes and cellular service plan.
Then there’s your Apple iCloud plan. This is optional and not required for FCP or Logic to work on the iPad. However, how many iPhone and iPad users actually bother to manage their cloud back-up settings? Mobile devices like the iPhone and iPad are designed to back up to iCloud. So, if you do intend to back up a device with a lot of media on a 2TB internal drive, then you will likely also have to upgrade to a larger Apple iCloud+ premium plan. Apple has historically been stingy with storage, limiting free space to 5GB. While iCloud+ plans aren’t onerous, a 2TB plan in the US is $9.99/month. iCloud is also more of a consumer solution and doesn’t compare well to Dropbox or Google Drive in professional scenarios.
I imagine that Apple sees the ideal workflow this way. Start by shooting with an iPhone, the iPad itself, or a DSLR. Media is then brought into the iPad via AirDrop or an SD card. You do a preliminary edit in FCP on the iPad. Or even finalize it there. If you need to refine it further, transfer the FCP library file to your Mac or MacBook Pro computer and complete the project.
In this first iteration, you cannot move a project/library from the desktop version of FCP back to the iPad. There are also no third-party Motion templates or plug-ins available for the iPad, although that’s listed as “coming soon” by Apple. Watch Larsen’s video for Logic Pro. He also makes a good point about third-party audio plug-in development for the iPad.
The subscription pricing model
This brings me to the biggest change. Both Final Cut Pro and Logic Pro for the iPad are only available through subscriptions ($4.99/month or $49 annually for each) with a one-month free trial. It’s the first time Apple has used subscription pricing for any of its key applications. This indirectly brings into question how pricing will be handled going forward for any of the desktop applications and even macOS, iOS, or iPadOS. WWDC is coming up, so hopefully many of these questions will be answered.
I’m not sure the mechanisms and regulatory guidelines for the two Apple App Stores are the same. Subscription pricing for mobile apps was introduced a while back. If you are a Filmic Pro user, the most recent version is only offered through subscription. If you purchased Filmic Pro before this change, you can still use the legacy version on your device(s) without change. But if you want to update, then you shift to a subscription for the new version (free download with in-app purchases).
Final Cut Pro and Logic Pro users have become quite comfortable with the idea of buying the app once and getting free updates, including content in the case of Logic Pro users. Potentially Apple could shift to paid updates or in-app purchases instead of subscription. However, based on their experiences with the legacy version of Final Cut, this introduces complex tax implications in how a corporation realizes revenue. Ultimately such a move could impede the timely development of new features.
My understanding of the current App Store rules is that a developer (presumably including Apple itself) can only charge for an update if the changes are so significant that the application can be released as a new product. Pixelmator did this in the change from Pixelmator to Pixelmator Pro. In May, Pixelmator released Photomator as a new desktop application. It is a hybrid between Apple Photos and Pixelmator Pro designed specifically for photo editing, a la Adobe’s Lightroom. This is priced in a similar manner to the updated Filmic Pro. Download the application from the App Store for free, but then pay a subscription or a one-time charge via the in-app purchase mechanism.
Hypothetically, if a new whizz-bang version of Final Cut Pro were released – let’s say Final Cut Pro XI – then maybe Apple could legitimately charge you another $300 (or whatever). But this would be a completely separate application and not an update to your existing installed software. At the moment, what I’m discussing is nothing more than pure speculation and we’ll need to wait to see what happens.
Will a shift to subscription – at least for mobile apps – stick for Apple and is it even a good move? As with any professional subscriptions, if you are making money from your work – as opposed to the hobbyist or student user – then application subscriptions are a cost of doing business. That’s typically how most Adobe customers have reconciled this. However, if you are only a casual user, why spend the money, especially when there are other options? Part of the reason Apple creates software is to showcase the potential of their hardware and generate hardware sales. For a mobile app, you may well spend $10 – $50 (once) to become an occasional user. By going subscription, I contend that Apple risks losing this tier.
As I said earlier, I’m viewing this through the lens of an older person. Apple is seeking a demographic that seems to be willing to rack up a large amount of cumulative monthly expenses, from Netflix and Apple TV+ to HelloFresh and software. This isn’t all bad and some of it can even be a trade-off in cost versus time savings. Also, this market segment’s view of owning perpetual software licenses is different than mine. Or as iJustine pointed out, cut out a couple of Starbucks trips a month and you’ve paid for the subscription.
I’ll close my thoughts with the issue of innovation. Will any of these changes – mobile versions, subscriptions, etc – spur new and significant features in their professional desktop applications? I don’t know. All I can say is that for traditional professional editors, Apple is running a distant third behind Blackmagic Design and Adobe. Even Avid is picking up innovation awards for items that Final Cut Pro editors would love to see in their favorite tool. However, as I’ve been pointing out, Apple’s target user is different and Final Cut Pro and Logic Pro are positioned accordingly.
I do think this move is an interesting development that might expand the user base for the desktop applications as well. But, it’s not for users like me – traditional professional editors. I need access to more than just Final Cut Pro. I need my mobile editing applications to match my desktop applications. For me, that’s a powerful laptop and not an iPad. Ultimately the market will decide the path forward for both Apple and its customers.
UPDATE – May 24, 2023 – Yesterday Apple officially released Final Cut Pro and Logic Pro for the iPad. Alongside these, Apple also released companion/compatibility upgrades for the desktop versions of Final Cut Pro, Motion, Compressor, Logic Pro, iMovie, and an accompanying pro video formats pack.
On a side note – if you are running these apps with macOS Ventura, then things have changed with Audio Units and some third-party audio plug-ins will no longer work in Logic Pro and/or Final Cut Pro. Unfortunately it’s different ones in each application. The reason is that AU validation will fail and, therefore, the offenders will be disabled. Of course, they work just fine in the Adobe apps or Resolve. Most of the time you can fix the issue by updating the plug-in to a newer version. On my system, this affected certain plug-ins from iZotope, Acon Digital, and Sonible. Updates will also be required for certain video plug-ins.
In the beginning there was analog and it was good. Then audio engineers developed digital and it was better. Wait, not so fast! The more we heard pure digital production, the more we started to miss the character – and yes, flaws – of analog. After all, grit, saturation, and distortion are synonymous with rock ‘n’ roll. And so, since the dawn of the digital era, many audio software developers have designed products to bring back some of that missing analog sound.
The character or color of analog is generally a function of saturation, which originally came about because of the physical medium of audio tape and electronics used to record and mix songs. Tubes, transformers, and tape introduced internal compression and frequency roll-offs, but most importantly, harmonic overtones. You could push the signal hard and get a certain amount of “warmth.” Push it past the limit and you’d get distortion, but not harsh clipping in the same way as digital signals.
Plug-in developers have sought to recreate these analog characteristics through various types of emulation. Generally these take the form of a simple saturation or overdrive plug-in. The bulk of these feature very simple controls with a range from subtle warmth to complete distortion. Unfortunately, most give you few options to tailor the effect.
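At their core, most of these emulations are nonlinear waveshapers. A minimal sketch, assuming a simple tanh curve – a common textbook choice, not any specific product’s algorithm:

```python
import numpy as np

def saturate(x, drive=2.0):
    """Tape/tube-style soft clipping: tanh rounds off peaks gradually and
    generates odd harmonics, unlike the abrupt flat-topping of digital clipping."""
    return np.tanh(drive * np.asarray(x)) / np.tanh(drive)  # renormalize peaks toward +/-1

fs = 48000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 440 * t)
warm = saturate(tone, drive=1.5)      # gentle warmth, mild harmonics
crushed = saturate(tone, drive=10.0)  # pushed hard, approaching a square wave
```

Turning up the drive morphs the output from near-transparent to heavily distorted, which is essentially what a one-knob saturation plug-in exposes – and what Saturn 2 expands on with styles, bands, and modulation.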
Enter FabFilter Saturn 2
If you really want the ultimate in saturation control, then one of the best tools on the market is FabFilter Saturn 2 from FabFilter Software Instruments. I’ve reviewed their Pro-L2 limiter and it’s since become my go-to limiter for any music mix that I do. It’s often a good idea to stick within a family of products when you find a top-notch developer. There’s a reason that FabFilter plug-ins are the preferred choice by many mixers.
First of all, if you want a plug-in that mimics the appearance and operational controls of some vintage piece of gear, then this isn’t it. All of the FabFilter products feature a similar, modern design aesthetic with easy-to-use controls. At first glance, FabFilter Saturn 2 appears deceptively simple. But, once you dive into the presets, options, and various controls, you’ll find that it can work as a simple saturation plug-in all the way up to a multi-band audio processing device.
The main set of controls are accessed from the bottom control bar. The style selector button covers a range of emulations, including vacuum tube, audio tape, guitar amps, saturation, transformers, and special effects (smudge, breakdown, foldback, rectify, and destroy). Each of these categories includes numerous options within. As you can see from these groups, the options cover not only a wide range of musical production possibilities, but also broaden into sound design. The central drive dial controls the strength of the saturation. Controls to the left and right let you further adjust the sound. At the top right, you can also select from a wide range of presets.
Click into the top area of the interactive display to change the plug-in into a multi-band processor. Each band has its own set of controls. You can also adjust the crossover position and slope between the bands. Each band can make use of different styles of emulation, along with different control adjustments.
In the default mode, Saturn 2 looks pretty simple. But you can open the area below the control bar to further modulate any of the effects. There are numerous ways to modulate these and the interface uses a handy color scheme and visualization to give you a better idea of what’s being done.
There are also drag points. For example, click on the circle above an envelope setting and drag it to one of the controls on the control bar. In FabFilter’s description this now connects a modulation source to a modulation target. These are now linked and the action of one impacts the other.
Saturn 2 installs on both Windows and macOS (Intel and Apple Silicon) systems in AU, VST/VST3, and AAX formats. Therefore, these plug-ins will work with most video editing and audio mixing applications. There is no right or wrong way to use Saturn 2. For example, you could apply it with a default setting to a vocal track, pick a warm tube selection, and dial in a bit of drive for character. Since it has four tone sliders, you can also use it for a bit of equalization.
Or add Saturn 2 to a guitar track and use one of the guitar amp emulations. If you’ve ever used Apple Logic Pro, then the options will be similar. The choices are listed by generic names, since actual brands haven’t been licensed. However, the selections are designed to sound similar to recognizable amp manufacturers, such as Fender, Vox, Marshall, and others.
You can certainly use Saturn 2 on an instrument stem or the final mix bus and dial in a more aggressive setting. This is where the presets help you to learn how the controls function. Apply the plug-in to the mix bus and you can radically add character to the complete mix. Of course, there’s no reason you couldn’t apply multiple instances of Saturn 2 at various stages to get the analog color you are looking for.
Aside from analog character for music, Saturn 2 can also be used for some pretty extreme effects. Need to pitch down a voice-over to sound like a monster? No sweat – Saturn 2 can do this. Special effects settings work for vocal processing, sci-fi sounds, and musical processing, like ping-pong and tremolo filtering.
I’m personally running this on mixes on a 2020 iMac in Logic Pro. Like FabFilter’s other filters, it performs well. Even when I have Saturn 2 applied to numerous tracks, there’s really no drag on the application. The filter has two quality settings – Good and Superb. Even with all instances set to Superb, my iMac had no issues. While you can definitely go crazy with the possibilities, a little goes a long way. When using Saturn 2, it’s best to start out with a subtle setting and dial in the various adjustments by ear.
There are plenty of other saturation filters on the market, but I have yet to find one with such a wide range of filtering options. This is especially helpful when your audio production needs cover more than just music mixes. It’s hard to tell from demos and blog posts whether a plug-in fits your needs, so be sure to check out FabFilter’s 30-day trial for this or any of their other outstanding plug-ins.
Blackmagic Cloud Store comes in three sizes: 20TB ($7,595 USD), 80TB ($22,995 USD), and 320TB ($88,995 USD). Each uses the same canister-style enclosure as the company’s eGPU. It features dual power supplies and fast, quiet M.2 SSD memory cards, which are installed around a central core. You can literally leave it on your desk and hardly hear the fans running. Cloud Store runs Blackmagic OS and applies wear leveling, so each M.2 card won’t see excessive data writes. Every sixth M.2 card is used for RAID-5 parity/data protection. The quoted capacity is a net figure, meaning you actually have the full 20TB, 80TB, or 320TB of usable storage.
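The parity scheme works on the standard RAID-5 principle: one block’s worth of XOR parity protects the data blocks in its group, so a single failed module can be rebuilt from the survivors. A toy illustration of the math – not Blackmagic’s actual on-disk layout:

```python
import numpy as np

rng = np.random.default_rng(1)
# Five data "cards" plus one parity card, mirroring a 5+1 group
data = [rng.integers(0, 256, size=1024, dtype=np.uint8) for _ in range(5)]

# Parity is the byte-wise XOR of all data blocks
parity = np.bitwise_xor.reduce(data)

# Simulate losing card 2, then rebuild it by XORing the survivors with parity
survivors = [d for i, d in enumerate(data) if i != 2]
rebuilt = np.bitwise_xor.reduce(survivors + [parity])
```

This is why one failed M.2 card is recoverable but two simultaneous failures in the same group are not.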
In the unlikely case of hardware issues, such as an M.2 SSD card going down, you would need to contact Blackmagic Design support. Cloud Store is not designed for end-user repair. However, it would be easy for an authorized service engineer to repair, even though it’s not a rack-mount design. Various internal assemblies can be unbolted from the core chassis and replaced.
For editors and colorists working with shared media, there’s a built-in 10G switch with four high-speed 10Gb/s ethernet ports. Connect an external network switch to one of these if you need high-speed access to the array from more than four computers. Next, there are two USB-C and two standard 1G ethernet ports, which can be used to connect additional users at slower speeds.
The intended use for the USB ports is to connect external drives for ingest and back-up. An ethernet cable from your internet modem or switch to the 1G ports is needed for Dropbox and Google Drive syncing (more on that in a moment). There is also an HDMI port for a monitor used to display real-time data, such as storage activity, drive health, and connected users. Functions like port aggregation of the 10G switch and the USB-C media I/O have not yet been enabled.
In theory, all eight data ports could be used to connect users, if you forgo syncing and the media I/O function. Although the M.2 SSD array is fast, the network connections will determine the true speed. For example, the 1G and USB-C ports yield write/read speeds of around 200-300MB/s, whereas the 10G ports perform in the 800-1,100 MB/s range.
Setting up a shared network
Download the Blackmagic Cloud Store set-up application to your computer. The installer delivers the set-up app, a user guide PDF, and the standalone Proxy Generator application. Review the network set-up section of the user guide. You can connect a device without using the application, but you’ll need it to set up media sharing over the internet. You’ll also need to supply your own standard 3-prong AC power cord for the unit.
I created a small workgroup by connecting the Blackmagic Cloud Store to my 2020 27″ iMac and my 2021 14″ M1 Max MacBook Pro. The iMac has a 10G port and was directly connected. The MacBook Pro was connected using a Sonnet Solo 10G Thunderbolt-to-10G ethernet adapter. If you use a bus-powered adapter like the Sonnet with a laptop, make sure you keep the laptop on AC power. Otherwise, the storage volume will tend to unmount. As with most NAS systems, each time you start the computer, you’ll need to manually mount the storage volume again within the OS.
The Cloud Store device is largely plug-and-play using standard network protocols built into the computer’s operating system. The iMac connected right away, but I had to change the IP address for the MacBook Pro within its preferences. Other than that minor hiccup, setting up Blackmagic Cloud Store was the easiest installation that I’ve ever done with any NAS system.
If you are using Blackmagic Cloud Store on-premises in a workgroup, then you are set to go. Blackmagic Design intended this to be an easy system to administer. Therefore, you cannot subdivide it into different virtual volumes nor assign different levels of user permissions. The Blackmagic Cloud Store drive is mounted as a single drive volume on your desktop and shared media is accessible on all systems. This product is not solely built for DaVinci Resolve users. Apple Final Cut Pro or Adobe Premiere Pro library/project files also work fine when stored on the Cloud Store volume.
Syncing remote media
Blackmagic Design has factored in remote workflows, which is where Dropbox and Google Drive come in. Connect an ethernet cable for an internet feed to one of the 1G ports on the Blackmagic Cloud Store. There’s a tab in the Cloud Store set-up application for Dropbox or Google Drive. Now assign the Cloud Store volume as the location for your Dropbox or Google Drive folder. You can then opt to share only proxy media or full-res files and proxies. Proxy media files can be generated by DaVinci Resolve itself or using the Proxy Generator application.
Editors with whom you collaborate remotely will have access to the media thanks to Dropbox or Google Drive syncing. The remote editors don’t need Blackmagic Cloud Store units for this to work and can certainly work with other storage solutions. There are a variety of possible workflows, depending on whether it’s an editor sharing files with a colorist or an editor working with assistants on a feature film.
Dropbox and Google Drive syncing allows for an incremental workflow. For example, many productions are filmed over several days. As new media is added to the primary Blackmagic Cloud Store volume, syncing can happen automatically for all remote collaborators. Remember that the Dropbox and Google Drive options are based on your account and not Blackmagic Design. So you may incur charges based on your plan with these companies.
I personally have reservations about leaving storage directly connected to the internet. As many NAS owners who had systems exposed to the internet can attest, getting hacked and having your media held for ransom is a very real risk. So take precautions – you’ve been warned.
Blackmagic Cloud and DaVinci Resolve
Blackmagic Design has specifically tailored the workflow for DaVinci Resolve, which works with a database (library) containing multiple projects. There are three types of databases: local (stored on your computer), server (stored on a separate networked computer), or cloud, i.e. the Blackmagic Cloud server. Anyone can sign up at the Blackmagic Design website to get their own free Cloud account. If you decide to add a library to Blackmagic Cloud, then the charge is $5 per month, per library. Of course, a single library can contain multiple projects.
In a typical Blackmagic Cloud scenario, the main editor adds a Resolve library to Cloud and creates the active Resolve project there. When it’s time to share the project with other editors/VFX artists/colorists, the editor turns on multi-user collaboration within Resolve. The library owner sends an invitation to the email address tied to the remote user’s account. The second editor has already received the media via a shipped drive or synced over the internet. That editor logs into their Blackmagic Cloud account to gain access to the library and that project. Open the project, relink the media, and it’s off to the races.
The first person to open a sequence has write access to that sequence. Everyone else has read-only access to the open sequence, but write access to any others that they open. If a change is made to a writeable sequence and saved, the library on Blackmagic Cloud is updated. This is relatively fast, but not as instant as if the database were local. Anyone viewing the sequence in a read-only mode is prompted to refresh the sequence. Both Resolve Studio and Resolve (the free App Store version) worked fine.
Who this is for
There are three potential use cases for Blackmagic Cloud Store. You could simply use it as a local drive attached to one computer. This wouldn’t be the best solution, because Thunderbolt arrays are faster and cheaper. The second use case is the small workgroup under one roof. For example, this could be a small post house or a team of editors cutting a film. Simply connect four computers to the Blackmagic Cloud Store unit and now everyone can share media and project files.
The final use case embraces a remote workflow. One or more users are connected to a Blackmagic Cloud Store at one location. They can then share media and Resolve projects using the built-in syncing and Blackmagic Cloud. For example, you might be a great editor, but not the best colorist. Using this workflow, you could share your project remotely with an experienced colorist and work together through a sequence interactively. Or it’s a feature film where several editors, each working remotely, cut different reels of the same film.
There’s plenty of competition in the market for shared, networked storage solutions. Most require a certain level of IT knowledge to set up and administer. Blackmagic Cloud Store is a deceptively simple, yet powerful, storage device that can fit many operational models. It’s a high-performance drive array that can sit quietly on your desktop without the need for rack space or extra cooling. Couple it with a Blackmagic Cloud account and you have one of the simplest ways to collaborate across town or across the country.
2023 marks the 100th year of the NAB Convention, which started out as a radio gathering in New York City. This year you could add ribbons to your badges indicating the number of years that you’d attended – 5, 10, etc. My first NAB was 1979 in Dallas, so I proudly displayed the 25+ ribbon. Although I haven’t attended each one in those intervening years, I have attended many and well over 25.
Some have been ready to sound the death knell for large, in-person conventions, thanks to the pandemic and the proliferation of online teleconferencing services like Zoom. 2019 was the last pre-COVID year, with an attendance of 91,500 – down from previous highs of over 100,000. 2022 was the first post-COVID NAB, and attendance was around 52,400. That was respectable given the climate a year ago. This year’s attendance was over 65,000, so there’s certainly an upward trend. If anything, this represents a pent-up desire to kick the tires in person and reconnect with industry friends from all over the world. My gut feeling is that international attendance is still down, so I would expect attendance to grow in future years.
As with most NAB conventions, the halls were loosely organized by theme. Location and studio production gear could mostly be found in Central. Post was mainly in the North hall, but next year I would expect it to be back in the South hall. The West hall included a mixture of vendors that fit under connectivity topics, such as streaming, captioning, etc. It also included some of the radio services.
Although the booths covered nearly all of the floor space, it felt to me like many of the big companies were holding back. By that I mean, products with large infrastructure needs (big shared storage systems, large video switchers, huge mixing desks, etc.) were absent. Mounting a large booth at the Las Vegas Convention Center – whether that’s for CES or NAB – is quite costly, with many unexpected charges.
Nevertheless, there were still plenty of elaborate camera sets and huge booths, like that of Blackmagic Design. If this was your first year at NAB, the sum of the whole was likely to be overwhelming. However, I’m sure many vendors were still taking a cautious approach. For example, there was no off-site Avid Connect event. There were no large-scale press conferences the day before opening.
A lot of this is a function of the industry tightening up. While there’s a lot more media production these days, there are also many inexpensive solutions to create that media. Therefore, many companies are venturing outside of their traditional lanes. For example, Sennheiser still manufactures great microphone products, but they’ve also developed the AMBEO immersive audio product line. At NAB they demonstrated the AMBEO 2-Channel Spatial Audio renderer. This lets a mixer take surround mixes and/or stems and turn them into 2-channel spatial mixes that are stereo-compatible. The control software allows you to determine the stereo width and the amount of surround and LFE signal put into the binaural mix. In the same booth, Neumann was demoing their new KH 120-II near-field studio monitors.
Overall, I didn’t see any single trend that would point to an overarching theme for the show. AI/ML/Neural Networks were part of many companies’ marketing strategy. Yet, I found nothing that jumped out like the current public fascination with ChatGPT. You have to wonder how much of this is more evolutionary than revolutionary and that the terms themselves are little more than hype.
Stereoscopic production is still around, although I only found one company with product (Stereotec). Virtual sets were aplenty, including a large display by Vu Studios and even a mobile expando trailer by Magicbox for virtual set production on-location. Insta360 was there, but tucked away in the back of Central hall.
Of course, everyone has a big push for “the cloud” in some way, shape, or form. However, if there is any single new trend that seems to be getting manufacturers’ attention, it’s passing video over IP. The usual companies who have dealt in SDI-based video hardware, like AJA, Blackmagic Design, and Matrox, were all showing IP equivalents. Essentially, where you used to send video signals using the uncompressed SDI protocol, you will now use the SMPTE ST 2110 IP protocol to send them through 10GigE (and faster) networks.
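A bit of back-of-the-envelope arithmetic shows why these uncompressed IP streams need fast links. This is a rough active-video estimate only (it ignores blanking, RTP/packet overhead, and audio/ancillary streams), assuming the common 10-bit 4:2:2 sampling:

```python
def st2110_video_bitrate_gbps(width, height, fps, bits_per_pixel=20):
    """Approximate uncompressed active-video bitrate in Gb/s.

    bits_per_pixel=20 corresponds to 10-bit 4:2:2 sampling
    (10 bits of luma plus 10 bits of shared chroma per pixel).
    """
    return width * height * fps * bits_per_pixel / 1e9

# 1080p59.94 comes out to roughly 2.5 Gb/s -- already too much for 1GigE.
hd = st2110_video_bitrate_gbps(1920, 1080, 59.94)

# 2160p59.94 (UHD) is four times that, roughly 10 Gb/s.
uhd = st2110_video_bitrate_gbps(3840, 2160, 59.94)
```

This is why uncompressed HD over ST 2110 is generally planned around 10GigE, and UHD around 25GigE, rather than ordinary gigabit office networks.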
The world of post production
Let me shift to post – specifically Adobe, Avid, and Blackmagic Design. Unlike Blackmagic, neither Avid nor Adobe featured their usual main stage presentations. I didn’t see Apple’s Final Cut Pro anywhere on the floor and only one sighting in the press room. Avid’s booth was a shadow of itself, with only a few smaller demo pods. Their main focus was showing the tighter integration between Media Composer and Pro Tools (finally!). There were no Pro Tools control surfaces to play with. However, in their defense, NAMM 2023 (the large audio and music products exhibition) was held just the week before. Most likely this was a big problem for any audio vendor that exhibits at both shows. NAMM shifts back to January in 2024, which is its historical slot on the calendar.
Uploading media to the cloud for editing has been the mantra at Frame.io, which is now under the Adobe wing. They’ve enhanced those features with direct support by Fujifilm (video) and Capture One (photography). In addition, Frame has improved features specific to the still photography market. New to the camera-to-cloud game is also Atomos, which demoed its own cloud-based editor developed by asset management developer Axle ai.
Adobe’s approach seems better for documentary projects. Text is generated through speech-to-text software within Premiere Pro. That is now processed on your computer instead of in the cloud. When you highlight text in the transcription panel, it automatically marks the in and out points on that source clip. Then, using insert and overwrite commands while the transcription panel is still selected, you can automatically edit that portion of the source clip to the timeline. Once you shift your focus to the timeline, the transcription panel displays the edited text that corresponds to the clips on the timeline. Rearrange the text and Premiere Pro automatically rearranges the clips on the timeline. Or rearrange the clips and the text follows.
I was surprised to see that Blackmagic Design was not promoting Resolve on the iPad. There was only one demo station and no dedicated demo artist. I played with it a bit and it felt to me like it’s not truly optimized for iPadOS yet. It does work well with the Speed Editor keyboard. That’s useful for any user, since the Cut page is probably where anyone would do the bulk of the work in this version of Resolve. When I used the Apple Pencil, the interface lacked any feedback as icons were clicked. So I was never quite sure if an action had happened or not when I used the Pencil. I’m not sure many will do a complete edit with Resolve on the iPad; however, it could evolve into a productive tool for preliminary editing in the field.
Here’s an interesting side note. Nearly all of the Blackmagic Design demo pods for DaVinci Resolve were running on Apple’s 24″ candy-colored iMacs. Occasionally performance was a bit sluggish from what I could tell, especially when the operator demoed the new Relight feature to me. Nevertheless, they seemed to work well throughout the show.