Analogue Wayback, Ep. 20

D2 – recursive editing

Video production and post transitioned from analog to digital starting in the late 1980s. Sony introduced the component digital D1 videotape recorder, but it was too expensive for most post facilities and harder to integrate into existing composite analog infrastructure. In 1988, Ampex and Sony introduced the D2 format – an uncompressed, composite digital VTR with built-in A/D and D/A conversion.

D2 had a successful commercial run of about 10 years. Along the way it competed for market share with Panasonic’s D3 (composite) and D5 (component) digital formats. D2 was eventually supplanted by Sony’s own mildly compressed Digital Betacam format. That format coincided with the widespread availability of serial digital routing, switching, and so on, successfully moving the industry into a digital production and post environment.

During D2’s heyday, these decks provided the ideal replacement for older 1″ VTRs, because they could be connected to existing analog routers, switchers, and patch bays. True digital editing and transfer was possible if you connected the decks using composite digital hardware and cabling (with large parallel connections, akin to old printer cables). Because of this bulk, there weren’t too many composite digital edit suites. Instead, digital i/o was reserved for direct VTR to VTR copies – i.e. a true digital clone. Some post houses touted their “digital” edit suites, but in reality their D2 VTRs were connected to the existing analog infrastructure, such as the popular Grass Valley Group 200 and 300 video switchers.

One unique feature of the D2 VTRs was “read before write”, also called “preread”. This was later adopted in the Digital Betacam decks, too. Preread enabled the deck to play a signal and immediately record that same signal back onto the same tape. If you passed the signal through a video switcher, you could add more elements, such as titles. There was no visual latency in using preread. While you did incur some image degradation by going through D/A and A/D conversions along the way, the generation loss was minor compared with 1″ technology. If you stayed within a reasonable number of generations, then there was no visible signal loss of any consequence.

Up until D2, performing a simple transition like a dissolve required three VTRs – the A and B playback sources, plus the recorder. If the two clips were on the same source tape, then one of the two clips had to be copied (i.e. dubbed) onto a second tape to enable the transition. If a lot of these transitions were likely, an editor might take the time before the session to copy the camera tape, creating a “B-roll dub” before ever starting. One hour-long camera tape would take an hour to copy – longer if the camera originals were longer.

With D2 and preread, the B-roll dub process could be circumvented, thus shaving unproductive time off of the session. Plus, only two VTRs were required to make the same edit – a player and a recorder. The editor would record the A clip long in order to have a “handle” for the length of the dissolve. Then switch on preread and preview the edit. If the preview looked good, then record the dissolve to the incoming B clip, which was playing from the same camera tape. This was all recorded onto the same master videotape.

Beyond this basic edit solution, D2’s preread ushered in what I would call recursive editing techniques. It has a lot of similarities with the sound-on-sound audio recording pioneered by the legendary Les Paul. For example, television show deliverables often require the master plus a “textless” master (no credits or titles). With D2, the editor could assemble the clean, textless master of the show. Next, make a digital clone of that tape. Then go back to one of the two and use the preread function to add titles over the existing video. Another example would be simple graphic composites, like floating video boxes over a background image or a simple quad split. Simply build up all layers with preread, one at a time, in successive edit passes recorded onto the same tape.

The downside was that if you made a mistake, you had to start over again. There was no undo. However, by this time linear edit controllers were pretty sophisticated and often featured complex integrations with video switchers and digital effects devices. This was especially true in an online bay made up of all Sony hardware. If you did make a mistake, you could simply start over using the edit controller’s auto-assembly function to automatically re-edit the events up to the point of the mistake. Not as good as modern software’s undo feature, but usually quite painless.

D2 held an important place in video post. Not only as the mainstream beginning of digital editing, but also for the creative options it inspired in editors.

©2022 Oliver Peters

Will DaVinci Resolve 18 get you to switch?

DaVinci Resolve has long been admired by users of other editing applications because of the pace of Blackmagic Design’s development team, and many have considered a switch to Resolve. Since its announcement earlier this year, DaVinci Resolve fans and pros alike have been eagerly waiting for Resolve 18 to come out of public beta. It was recently released and I’ve been using it ever since for a range of color correction jobs.

DaVinci Resolve 18 is available in two versions: Resolve (free) or Resolve Studio (paid). Both are free updates for existing customers. They can be downloaded/bought either from the Blackmagic Design website (Windows, Mac, Linux) or through the Apple Mac App Store (macOS only – Intel and M1). The free version of Resolve is missing only a few of the advanced features available in Resolve Studio. Due to App Store policies and sandboxing, there are also some differences between the Blackmagic and App Store installations. The Blackmagic website installations may be activated on up to two computers at the same time using a software activation code. The App Store versions will run on any Mac tied to your Apple ID.


A little DaVinci Resolve history

If you are new to DaVinci Resolve, then here’s a quick recap. The application is an amalgam of the intellectual property and assets acquired by Blackmagic Design over several years from three different companies: DaVinci Systems, eyeon (Fusion), and Fairlight Instruments. Blackmagic Design built upon the core of DaVinci Resolve to develop an all-in-one post-production solution. The intent is to encompass an end-to-end workflow that integrates the specialized tasks of editing, color grading, visual effects, and post production sound all within a single application.

The interface character and toolset tied to each of these tasks is preserved using a page-style, modal user interface. In effect, you have separate tools, tied to a common media engine, which operate under the umbrella of a single application. Some pages are fluidly interoperable (like Edit and Color) and others aren’t. For example, color nodes applied to clips in the Color page do not appear as nodes within the Fusion page. Color adjustments made to clips in a Fusion composition need to be done with Fusion’s separate color tools.

Blackmagic has expanded Resolve’s editing features – so much so that it’s a viable competitor to Avid Media Composer, Apple Final Cut Pro, and Adobe Premiere Pro. Resolve sports two editing modes: the Cut page (a Final Cut Pro-style interface for fast assembly editing) and the Edit page (a traditional track-based interface). The best way to work in Resolve is to adhere to its sequential, “left to right” workflow – just like the pages/modes are oriented. Start by ingesting in the Media page and then work your way through the tasks/pages until it’s time to export using the Deliver page.

Blackmagic Design offers a range of optional hardware panels for Resolve, including bespoke editing keyboards, color correction panels, and modular control surface configurations for Fairlight (sound mixing). Of course, there’s also Blackmagic’s UltraStudio, Intensity Pro, and DeckLink i/o hardware.

A new collaboration model through Blackmagic Cloud

The biggest news is that DaVinci Resolve 18 was redesigned for multi-user collaboration. Resolve projects are usually stored in a database on your local computer or a local drive, rather than as separate binary project files. Sharing projects in a multi-user environment requires a separate database server, which isn’t designed for remote editing. To simplify this and address remote work, Blackmagic Design established and hosts the new Blackmagic Cloud service.

As I touched on in my Cloud Store Mini review, anyone may sign up for a free Blackmagic Cloud account. When ready, the user creates a Library (database) on Blackmagic Cloud from within the Resolve UI. That user is the “owner” of the Library, which can contain multiple projects. The owner pays $5/month for each Library hosted on Blackmagic Cloud.

The Library owner can share a project with any other registered Blackmagic Cloud user. This collaboration model is similar to working in Media Composer and is based on bin locking. The first user to open a bin has read/write permission to that bin and any timelines contained in it. Other users opening the same timeline operate with read-only permission. Changes made by the user with write permission can then be updated by the read-only users on their systems.
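As a rough illustration of that first-opener-wins model, here is a hypothetical Python sketch. The class and method names are mine, not Blackmagic’s, and the real mechanism lives inside the hosted Library database – this only mirrors the permission behavior described above.

```python
class Bin:
    """Hypothetical sketch of bin locking: the first user to open a bin
    gets write permission; everyone else is read-only until it's released."""

    def __init__(self, name):
        self.name = name
        self.writer = None  # no one holds the write lock yet

    def open(self, user):
        if self.writer is None:
            self.writer = user  # first opener acquires the lock
            return "read/write"
        return "read/write" if user == self.writer else "read-only"

    def close(self, user):
        # Releasing the lock lets the next opener gain write permission.
        if user == self.writer:
            self.writer = None
```

So if one collaborator opens a bin first, a second collaborator opening the same bin sees its timelines as read-only until the first closes it.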

Blackmagic Design only hosts the Library/project files and not any media, which stays local for each collaborator. The media syncing workflow is addressed through features of the Cloud Store storage products (see my review). Both collaboration via Blackmagic Cloud and the storage products are independent of each other. You can use either without needing the other. However, since Blackmagic Cloud is hosted “in the cloud” you do need an internet connection. 

There is some latency between the time a change is made by one user and when it’s updated on the other users’ machines. In my tests, the collaborator needs to relink to the local media each time a shared project is accessed again. You can also move a project from Blackmagic Cloud back to your local computer as needed.

What else is new in DaVinci Resolve 18?

Aside from the new collaboration tools, DaVinci Resolve 18 also features a range of enhancements. Resolve 17 already introduced quite a few new features, which have been expanded upon in Resolve 18. The first of these is a new, simplified proxy workflow using the “prefer proxies” model. Native media handling has always been a strength of Resolve, especially with ProRes or Blackmagic RAW (BRAW) files. (Sorry, no support for Apple ProRes RAW.) But file sizes, codecs, and your hardware limitations can impede efficient editing. Therefore, working with proxy files may be the better option on some projects. When you are ready to deliver, then switch back to the camera originals for the final output.

The website installer for DaVinci Resolve Studio 18 includes the new Blackmagic Proxy Generator application. This automatically creates H.264, H.265, or ProRes proxy files using a watch folder. However, you can also create proxies internally from Resolve without using this app, or externally using Apple Compressor or Adobe Media Encoder. The trick is that proxy files must have matching names, lengths, timecode values, and audio channel configurations.

Proxy files should be rendered into a subfolder called “Proxy” located within each folder of original camera files. (Resolve and/or the Proxy Generator application do this automatically.) Then Resolve’s intelligent media management automatically detects and attaches the proxies to the original file. This makes linking easy and allows you to automatically toggle between the proxy and the original files.
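To illustrate the matching convention, here is a minimal Python sketch of a name-based proxy lookup. The `find_proxy` function is hypothetical – it is not part of any Resolve API – and it only checks the base file name within a “Proxy” subfolder; Resolve’s actual matching also verifies length, timecode, and audio channel configuration.

```python
from pathlib import Path
from typing import Optional

def find_proxy(original: Path) -> Optional[Path]:
    """Look for a proxy alongside a camera original, following the
    'Proxy' subfolder convention. Illustrative sketch only."""
    proxy_dir = original.parent / "Proxy"
    if not proxy_dir.is_dir():
        return None
    for candidate in sorted(proxy_dir.iterdir()):
        # Match on the base name only; the proxy's extension/codec may differ.
        if candidate.stem == original.stem:
            return candidate
    return None
```

For example, `A001_clip.braw` in a card folder would pair with `Proxy/A001_clip.mov` inside that same folder.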

Regarding other enhancements, the Color page didn’t see any huge new features, since tools like the Color Warper and HDR wheels were added in Resolve 17. However, there were some new items, including object replacement and enhanced tracking. But I didn’t find the results to be as good as Adobe’s Content-Aware Fill techniques.

Two additions worth mentioning are the Automatic Depth Map and the Resolve FX Beauty effect. The beauty effect is a subtle skin smoothing tool. It’s nice, but quite frankly, too subtle. My preference in this type of tool would be Digital Anarchy’s Beauty Box or Boris FX’s Beauty Studio. However, Resolve does include other similar tools, like Face Refinement, which gives you more control.

Automatic Depth Map is more of a marquee feature. This is a pretty sophisticated process – analyzing depth separation in a moving image without the benefit of any lens metadata. It shows up as a Resolve FX in the Edit, Fusion, and Color pages. Don’t use it in the Edit page, because you can’t do anything with it there. In the Color page, rather than apply it to a node, drag the effect into the node tree, where it creates its own node.

After a brief clip analysis, the tool generates a mask, which you can use as a qualifier to isolate the foreground and background. Bear in mind this is for mild grading differences. Even though you might think of this for blurring a background, don’t do it! The mask is relatively broad. If you try to tighten the mask and use it to blur a background, you’ll get a result that looks like a Zoom call background. Instead, use it to subtly lighten or darken the foreground versus the background within a shot. Remember, the shot is moving, which can lead to some chatter on the edges of the mask. So you’ll have to play with it to get the best result. Playback performance at Better Quality was poor on a 2017 iMac Pro. Use Faster while working and then switch to Better when you are ready to export or render.


Complex visual effects and compositing are best done in the Fusion page. Fusion is both a component of Resolve, as well as a separate application offered by Blackmagic Design. It uses a node-based interface, but these nodes are separate and unrelated to the nodes in the Color page. Fusion features advanced tracking, particle effects, and a true 3D workspace that can work with 3D models. If you have any stereoscopic projects, then Fusion is the tool to use. The news for Fusion and the standalone Fusion Studio 18 application in this update focuses on GPU acceleration.

Before the acquisition by Blackmagic Design, eyeon offered several integrations of Fusion with NLEs like DPS Velocity and Avid Media Composer. The approach within Resolve is very similar to those – send a clip to Fusion for the effect, work with it inside the Fusion UI, and then it’s updated on the timeline as a Fusion clip. This is not unlike the Dynamic Link connection between Premiere Pro and After Effects, except that it all happens inside the same piece of software.

If you are used to working with a layer-style graphics application, like After Effects, Motion, or maybe HitFilm, then Fusion is going to feel foreign. It is a high-end visual effects tool, but might feel cumbersome to some when doing standard motion graphics. Yet for visual effects, the node-based approach is actually superior. There are plenty of good tutorials for Fusion, for any user ready to learn more about its visual effects power.

There are a few things to be aware of with Fusion. The image in the Fusion viewer and the output through UltraStudio to a monitor will be dark, as compared with that same image on the Edit page. Apparently this has been an ongoing user complaint and I have yet to find a color management setting that definitively solves this issue. There is also no way to “decompose” or “break apart” a Fusion composition on the timeline. You can reset the clip to a Fusion default status, but you cannot revert the timeline clip back to that camera file without it being part of a Fusion composition. Therefore, the best workaround is to copy the clip to a higher track before sending it to Fusion. That way you have both the Fusion composition and the original clip on the timeline.

In addition to visual effects, Fusion templates are also used for animated titles. These can be dropped onto a track in the Edit page and then modified in the inspector or the Fusion page. These Fusion titles function a lot like Apple’s Motion templates being used in Final Cut Pro.


Fairlight Instruments made its name at the dawn of digital audio with the Fairlight CMI, a pioneering digital sampling workstation. After Blackmagic’s acquisition, the software portion of Fairlight was reimagined as a module for audio post built into DaVinci Resolve. The Fairlight hardware and control surfaces were modularized. You can definitely run Fairlight in Resolve without any extra hardware. However, you can improve real-time performance on mixes with heavy track counts by adding the Fairlight Audio Core accelerator card. You can also configure one or more Blackmagic control surfaces into a large mixing console.

Taken as a whole, this makes the Fairlight ecosystem a very scalable product line in its own right that can appeal to audio post engineers and other audio production professionals. In other words, you can use the Fairlight portion of Resolve without ever using any of the video-centric pages. In that way, Resolve with Fairlight competes with Adobe Audition, Avid Pro Tools, Apple Logic Pro, and others. In fact, Fairlight is still the only professional DAW that’s actually integrated into an NLE.

Fairlight is designed as a DAW for broadcast and film with meter calibration based on broadcast standards. It comes with a free library of sound effects that can be downloaded from Blackmagic Design. The Fairlight page also includes an ADR workflow. DaVinci Resolve 18 expanded the Fairlight toolset. There’s new compatibility for FlexBus audio busing/routing with legacy projects. A lot of work has been put into Dolby Atmos support, including a binaural renderer, and an audio Space view of objects in relation to the room in 3D space.

On the other hand, if you are into music creation, then Fairlight lacks software instruments and music-specific plug-ins, like amp emulation. The MIDI support is focused on sound design. A musician would likely gravitate towards Logic Pro, Cubase, Luna, or Ableton Live. Nevertheless, Fairlight is still quite capable as a DAW for music mixes. Each track/fader integrates a channel strip for effects, plus built-in EQ and compression. Resolve comes with its own complement of Fairlight FX plug-ins, plus it supports third-party AU/VST plug-ins.

I decided to test that concept using some of the mixes from the myLEWITT music sessions. I stacked LEWITT’s multitrack recordings onto a blank Fairlight timeline, which automatically created new mono or stereo tracks, based on the file. I was able to add new busses (stem or submaster channels) for each instrument group and then route those busses to the output. It was easy to add effects and control levels by clip, by track, or by submaster.

Fairlight might not be my first choice if I were a music mixer, but I could easily produce a good mix with it. The result is a transparent, modern sound. If you prefer vintage, analog-style coloration, then you’ll need to add third-party plug-ins for that. Whether or not Fairlight fits the bill for music will depend on your taste as a mixer.


Once again, Blackmagic Design has added more power in the DaVinci Resolve 18 release. Going back to the start of this post – is this the version that will finally cause a paradigm shift away from the leading editing applications? In my opinion, that’s doubtful. As good as it is, the core editing model is probably not compelling enough to coax the majority of loyal users away from their favorite software. However, that doesn’t mean those same users won’t tap into some of Resolve’s tools for a variety of tasks.

There will undoubtedly be people who shift away from Premiere Pro or Final Cut Pro and over to DaVinci Resolve. Maybe it’s for Resolve’s many features. Maybe they’re done with subscriptions. Maybe they no longer feel that Apple is serious. Whatever the reason, Resolve is a highly capable editing application. In fact, during the first quarter of this year I graded and finished a feature film that had been cut entirely in Resolve 17.

Software choices can be highly personal and intertwined with workflow, muscle memory, and other factors. Making a change often takes a big push. I suspect that many Resolve editors are new to editing, often because they got a copy when they bought one of the Blackmagic Design cameras. Resolve just happens to be the best application for editing BRAW files and that combo can attract new users.

DaVinci Resolve 18 is a versatile, yet very complex application. Even experienced users don’t tap into the bulk of what it offers. My advice to any new user is to start with a simple project. Begin in the Cut or Edit page, get comfortable, and ignore everything else. Then learn more over time as you expand the projects you work on and begin to master more of the toolkit. If you really want to dive into DaVinci Resolve, then check out the many free and paid tutorials from Blackmagic Design, Mixing Light, and Ripple Training. Resolve is one application where any user, regardless of experience, will benefit from training, even if it’s only a refresher.

I’ve embedded a lot of links throughout this post, so I hope you’ll take the time to check them out. They cover some of the enhancements that were introduced in earlier versions, the history of DaVinci Resolve, and links to the new features of DaVinci Resolve 18. Enjoy!

©2022 Oliver Peters

Analogue Wayback, Ep. 17

The shape of your stomach.

The 1970s into the early 1990s was an era of significant experimentation and development in analog and digital video effects and animation. This included computer video art projects, broadcast graphics, image manipulation, and more. Denver-based Computer Image Corporation was both a hardware developer and a production company. Hardware included an advanced video switcher and the Scanimate computer animation system. The video switchers were optimized for compositing and an integral part of the system; however, it was the Scanimate analog computer that is most remembered.

Computer Image developed several models of Scanimate, which were also sold to other production companies, including Image West in Los Angeles (an offshoot of CI) and Dolphin Productions in New York. Dave Sieg, Image West’s former chief engineer, has a detailed website dedicated to preserving the history of this technology.

I interviewed for a job at Dolphin in the mid-1980s and had a chance to tour the facility. This was a little past the company’s prime, but they still had a steady stream of high-end ad agency and music video clients. Some of Dolphin’s best-known work included elements for PBS’ Sesame Street and The Electric Company, the show open for Washington Week in Review (PBS), news opens for NBC, CBS, and ABC News, as well as numerous national commercials. One memorable Pepto-Bismol campaign featured actors who stepped forward from a live action scene. As they did, their bodies turned a greenish monochrome and their stomachs expanded and became distorted.

Dolphin was situated in a five-story brownstone near Central Park. It had formerly housed a law practice. Behind reception on the ground floor was the videotape room, cleverly named Image Storage and Retrieval. The second floor consisted of an insert stage plus offices. Editing/Scanimate suites were on the third and fourth floors. What had been the fifth-floor law library now held the master videotape reels instead of books. A stairwell connected the floors and provided the cable runs to connect the electronics between rooms.

Each edit suite housed several racks of Scanimate and switcher electronics, the editor’s console, and client seating. At the time of my interview and tour, Dolphin had no computer-assisted linear edit controllers, such as CMX (these were added later). Cueing and editing was handled via communication between the editor and the VTR operator on the ground floor. They used IVC-9000 VTRs, which were 2″ helical scan decks. These are considered to have provided the cleanest image over multiple generations of any analog VTR ever produced.

Each suite could use up to four decks and animation was created by layering elements over each other from one VTR to the next. The operator would go round-robin from deck to deck. Play decks A/B/C and record onto D. Next pass, play B/C/D and record onto A to add more. Now, play C/D/A and record onto B for more again, and so on – until maybe as many as 20 layers were composited in sophisticated builds. Whichever reel the last pass ended up on was then the final version from that session. Few other companies or broadcasters possessed compatible IVC VTRs. So 2″ quad copies of the finished commercial or video were made from the 2″ helical and that’s the master tape a client left with.
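The deck rotation described above is simple modular arithmetic. This hypothetical Python snippet (my own illustration, not anything the facilities used) just computes which deck ends up holding the finished master after a given number of passes.

```python
def final_record_deck(passes, decks=("A", "B", "C", "D")):
    """Return which deck holds the finished master after round-robin layering.

    Pass 1 records onto the last deck (D); each later pass rotates the
    record deck one position forward while the other three play back.
    """
    return decks[(len(decks) - 1 + passes - 1) % len(decks)]
```

So a single pass ends on deck D, a second pass ends on A, and a 20-layer build would finish on deck C – whichever reel held the last pass became the session master.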

This method of multi-pass layering is a technique that later took hold in other forms, such as the graphic design for TBS and CNN done by J. C. Burns and then more sophisticated motion layering by Charlex using the Abekas A-62. The concept is also the foundation for such recursive recording techniques as the preread edit function that Sony integrated into its D2 and Digital Betacam VTRs.

The path through Scanimate started with a high-resolution oscilloscope and companion camera. The camera signal was run through the electronics, which included analog controls and patching. Any image to be manipulated (transformed, moved, rotated, distorted, colorized) was sourced from tape, an insert stage camera, or a copy stand titling camera and displayed in monochrome on the oscilloscope screen. This image was re-photographed off of the oscilloscope screen by the high-resolution video camera and that signal sent into the rest of the Scanimate system.

Images were manipulated in two ways. First, the operator could use Scanimate to manipulate/distort the sweep of the oscilloscope itself, which would in turn cause the displayed image to distort. Once this distorted oscilloscope display was picked up by the high-resolution camera, the rest of Scanimate could be used to further alter that image through colorization and other techniques. Various keying and masking methods were used to add in each new element as layers were combined for the final composite.

Stability was of some concern since this was an analog computer. If you stopped for lunch, you might not be able to perfectly match what you had before lunch. The later Scanimate systems developed by Computer Image addressed this by using digital computers to control the analog computer hardware, making them more stable and consistent.

The companies evolved or went out of business and the Scanimate technology went by the wayside. Nevertheless, it’s an interesting facet of video history, much like that of the early music synthesizers. Even today, it’s hard to perfectly replicate the look of some of the Scanimate effects, in part, because today’s technology is too good and too clean! While it’s not a perfect analogy, these early forms of video animation offer a similar charm to the analog consoles, multitrack recorders, and vinyl cherished by many audiophiles and mixing engineers.

Check out this video at Vimeo if you want to know more about Scanimate and see it in action.

©2022 Oliver Peters

Blackmagic Cloud Store Workgroup Ideas

In my previous post I reviewed the new Blackmagic Cloud Store Mini. (Click here for the full article.) Blackmagic Design has a track record of being disruptive with cool new ideas and products. Their cloud storage line is no exception. I am less interested in the Blackmagic Cloud, as I’m not a fan of either the Resolve database format for projects or of parking your projects in the cloud. However, the storage products don’t require the Blackmagic Cloud and that opens up a number of possibilities.

I’ve worked with shared storage (SAN and NAS) products going back to Avid’s first Fibre Channel MediaShare. Blackmagic’s Cloud Store and Mini are light years ahead of those early units. Yet the multi-user workflows developed back then are still in play today. This is especially true if you are a facility owner with a handful of workstations all connected over a network to the same storage pool. It’s also true if you are an editorial team working together on a feature film. Both are workgroups that can easily be serviced by one of the Cloud Store or Cloud Store Mini storage products.

Building the workgroup

I’m a fan of simplicity. Today’s edit suites don’t need all the gadgets that they’ve had in the past. Less is often more. Let’s envision four workstations running either Final Cut Pro or Premiere Pro. Resolve may be part of the mix, but not for collaborative editing. I’ll leave that discussion for another day. Mac-based, of course, but that’s a personal preference, since the storage also works with Windows. With those parameters, what would you buy?

For a fixed facility, the Mac Studio is a no-brainer. The Mac Pro makes no sense to me for most editorial work nor for most grading or audio mixing. A loaded M1 Max with a 1TB or 2TB internal drive is plenty. This unit already includes a built-in 10G Ethernet port. If the budget can’t handle four of those, then a loaded M1 Mac mini is another option. Just make sure to add the 10G upgrade.

What if the team needs to be more portable, such as for on-site editing? In that case, go for one of the MacBook Pros instead. Since these do not include Ethernet ports, you’ll need to add a Sonnet Solo 10G (Thunderbolt 3 to 10G Ethernet) adapter for each computer in order to get 10G speeds.

External displays are required for the two desktop Mac models, but optional with the laptops. Equip each unit with one or two of Apple’s 27″ Studio Displays. An alternate option would be the Apple Store’s LG UltraFine 4K display. Don’t forget audio. Unless everyone is working with headphones, I would recommend a simple audio interface (PreSonus, Focusrite, Universal Audio, etc) and desktop speakers (M-Audio, KRK, Adam, etc).

This team is using 10G Ethernet. If you opt for the Cloud Store Mini 8TB drive, then you’ll need to integrate a standard 10G Ethernet switch to connect the four workstations with the Mini over a 10G network. There are a range of options, but at a minimum the switch will need five or more ports that support 10G. There are many small combo switches on the market with both 1G and 10G ports. It’s easy to misread the specs, think you have enough ports, and then find out that most are only 1G.

In this hypothetical scenario, if you only have four workstations and are using the larger Cloud Store, then a switch isn’t needed, since the Cloud Store already has a 4-port 10G switch built in. And finally, you’ll need enough Cat 6 Ethernet cable for the cable runs between rooms. If any run exceeds 100 feet or passes alongside a lot of electrical wiring, then you may want to consider Cat 7 cable.

Storage strategies

Now that we’ve built out the body of the facility or workgroup, what about its heart? Blackmagic Design offers the Cloud Store with 20TB, 80TB, or 320TB capacities, and the Cloud Store Mini with 8TB.

The type of work being done will determine which one is the best fit. For example, if you work with high-resolution media and commonly edit with native camera files, then you’re going to need as big of a unit as you can afford. On the other hand, if most of the media is smaller or you generally edit with proxy files, then the Mini with 8TB might be plenty. Remember that you can always add a large Thunderbolt array (Promise, OWC, etc) with spinning drives at a very reasonable cost as a local drive on one of the stations.

A feature film might have a large Thunderbolt storage tower connected to one of the workstations. It would hold all of the original camera media. Those files would be transcoded to proxy media located on the Mini 8TB. All four workstations would be able to edit with the common proxy files, because the Mini is shared storage. When the cut is locked and it’s time to finish, the unit with the Thunderbolt tower would relink to the original camera media for grading and output.

There are many variations of these scenarios. It’s just a few ways that Blackmagic’s cloud storage products can be used to build powerful workgroups with a very light footprint. However, a key part of this innovation is the ease of putting such a system together.

©2022 Oliver Peters

A look at the Blackmagic Cloud Store Mini

Blackmagic Design is striving to democratize shared storage and edit collaboration with the introduction of the Blackmagic Cloud and the Blackmagic cloud storage product line. Let’s focus on storage first, which in spite of the name is very much earthbound.

Blackmagic Design’s cloud storage product line-up

Blackmagic Cloud Store (starting at $9,595 USD) sits at the top end. This is a desktop network-attached storage system engineered into the same chassis design that was developed for Blackmagic’s eGPUs. It features two redundant power supplies and an M.2 NVMe SSD drive array, which is configured as RAID-5 for data protection in case of drive failure. Cloud Store integrates an internal 10G Ethernet switch for up to four users connected at 10Gbps speeds. It also supports port aggregation for a combined speed of 40Gbps.

Cloud Store will ship soon and be available with 20TB, 80TB, or 320TB capacities. If you are familiar with RAID-5 systems, you know that some of that stated capacity is inaccessible due to the data parity required. Blackmagic Design has factored that in up front, because according to them, the size in the name, like 20TB, correctly reflects the usable amount of storage space.

Cloud Store Mini ($2,995 USD) is an 8TB unit using Blackmagic’s half rack width form factor. There are four internal M.2 flash cards configured as RAID-0. It sports three different Ethernet ports: 10Gbps, 1Gbps, and USB-C, which uses a built-in Ethernet adaptor. Lastly, the Cloud Pod ($395 USD) is a small 10G Ethernet unit designed for customers who supply their own USB-C storage.

All three models are designed for fast 10G Ethernet connectivity and are compatible with both Windows and macOS. Although there are many SAN and NAS products on the market, Blackmagic Design is targeting the customer who wants a high-performance shared storage solution that’s plug-and-play. These storage products are not there to usurp solutions like Avid Nexis. Instead, Blackmagic Design is appealing to customers without that sort of “heavy iron” infrastructure.

Cloud Store Mini as a storage device

The Blackmagic Cloud Store Mini is shipping, so I was able to test drive it for a couple of weeks. I connected my 2020 iMac (which includes the 10G option) via the 10G Ethernet port using a recommended Cat 6 cable. I also connected my M1 MacBook Pro via the USB-C Ethernet port. This gave me a small “workgroup” of two workstations connected to shared storage.

Continue reading the full article at Pro Video Coalition – click here.

©2022 Oliver Peters