A look at the Blackmagic Cloud Store Mini

Blackmagic Design is striving to democratize shared storage and edit collaboration with the introduction of the Blackmagic Cloud and the Blackmagic cloud storage product line. Let’s focus on storage first, which in spite of the name is very much earthbound.

Blackmagic Design’s cloud storage product line-up

Blackmagic Cloud Store (starting at $9,595 USD) sits at the top end. This is a desktop network-attached storage system engineered into the same chassis design that was developed for Blackmagic’s eGPUs. It features two redundant power supplies and an M.2 NVMe SSD drive array, which is configured as RAID-5 for data protection in case of drive failure. Cloud Store integrates an internal 10G Ethernet switch for up to four users connected at 10Gbps speeds. It also supports port aggregation for a combined speed of 40Gbps.

Cloud Store will ship soon and be available in 20TB, 80TB, or 320TB capacities. If you are familiar with RAID-5 systems, you know that some of the raw capacity is inaccessible, because it’s devoted to parity data. Blackmagic Design has factored that in up front. According to the company, the size in the name, like 20TB, correctly reflects the usable amount of storage space.
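As a back-of-napkin illustration, the usual RAID capacity arithmetic can be expressed in a short Python sketch. The drive counts and sizes below are my own hypothetical examples, since Blackmagic doesn’t publish the internal drive layout.

```python
# Back-of-napkin RAID capacity math. Drive counts and sizes are
# hypothetical examples, not Blackmagic's published internals.

def raid5_usable_tb(drives: int, drive_tb: float) -> float:
    """RAID-5 stripes data across all drives, but reserves the
    equivalent of one drive for parity: usable = (n - 1) * size."""
    assert drives >= 3, "RAID-5 needs at least three drives"
    return (drives - 1) * drive_tb

def raid0_usable_tb(drives: int, drive_tb: float) -> float:
    """RAID-0 stripes with no redundancy: every byte is usable,
    but one failed drive takes down the whole array."""
    return drives * drive_tb

# A hypothetical 5 x 5TB RAID-5 array yields 20TB usable from 25TB raw,
# which is how a "20TB" model name can reflect usable (not raw) space.
print(raid5_usable_tb(5, 5.0))  # 20.0
# The Mini's four M.2 cards in RAID-0 would be 4 x 2TB = 8TB, all usable.
print(raid0_usable_tb(4, 2.0))  # 8.0
```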

Cloud Store Mini ($2,995 USD) is an 8TB unit using Blackmagic’s half-rack-width form factor. There are four internal M.2 flash cards configured as RAID-0. It sports three network connections: 10Gbps Ethernet, 1Gbps Ethernet, and USB-C, which relies on a built-in Ethernet adaptor. Lastly, the Cloud Pod ($395 USD) is a small 10G Ethernet unit designed for customers who supply their own USB-C storage.

All three models are designed for fast 10G Ethernet connectivity and are compatible with both Windows and macOS. Although there are many SAN and NAS products on the market, Blackmagic Design is targeting the customer who wants a high-performance shared storage solution that’s plug-and-play. These storage products are not intended to supplant solutions like Avid Nexis. Instead, Blackmagic Design is appealing to customers without that sort of “heavy iron” infrastructure.

Cloud Store Mini as a storage device

The Blackmagic Cloud Store Mini is shipping, so I was able to test drive it for a couple of weeks. I connected my 2020 iMac (which includes the 10G option) to the 10G Ethernet port using a recommended Cat 6 cable. I also connected my M1 MacBook Pro via the USB-C Ethernet port. This gave me a small “workgroup” of two workstations connected to shared storage.

Continue reading the full article at Pro Video Coalition – click here.

©2022 Oliver Peters

Analogue Wayback, Ep. 16

Football. That’s where they play in quarters, right?

One of the fun experiences while in Birmingham was putting together the telecast of three football games. It was a brief moment in time when the original, upstart USFL challenged the NFL’s supremacy in professional football by playing a spring season. Birmingham was one of several cities around the nation with a team, which was a great opportunity for the TV station.

Back then (and maybe still today) one way for a television station to shore up its revenue for the month was to bump a night of primetime network programming and replace it with its own. An affiliate lost that night’s worth of network compensation (the money a network pays the affiliate to run network programming). However, they were then able to fill all of the commercial slots for the night, which more than made up for it.

As long as an affiliate didn’t do this too often, networks wouldn’t challenge it, especially if this was for a strong local event. Furthermore, a broadcaster could promote this as a special event, like coverage of important away games that were normally unavailable. The station could charge premium ad dollars for commercial placement within the game, as well as other ancillary income, like sponsorship of the broadcast.

The station covered three away games played at the Meadowlands in New Jersey, at Soldier Field in Chicago, and at Mile High Stadium in Denver. The first two were split feeds. A split feed is when you tag onto another company or network that is broadcasting the game with a full production truck and crew. All the station had to do was book a smaller truck that would piggyback off the main production truck. The main truck would send our truck a “clean feed” from their video switcher without their graphics or talent inserts. It also included game audio without their announcers. In the split feed truck, we added our own graphics, mixed in our own play-by-play announcer audio, and then cut to our own single camera whenever we wanted to see our sports reporter.

As production manager for the station, I flew in to produce the telecast, along with our reporter and graphics producer. Chyron (character generator) material, like logos, player names, and templates for stats pages, had been produced in advance. We hired local crew members, including a camera operator, technical director, audio engineer, and Chyron operator.

It got off to a fun start in the Meadowlands. Our New York-based Chyron operator was experienced with hockey games. Football – not so much. As we started to finalize the Chyron pages prior to the game, his first response was, “We’re doing football. Right? That uses quarters, right? OK, I get it.” Everything went off without a hitch. The Chicago experience went equally well, except the taxi driver was a bit confused about where the entrance to Soldier Field was! In addition, the director in the main production truck seemed very “high strung” based on what we were hearing through our intercom connection.

Denver, on the other hand, was a completely different experience. We were the main truck doing a full production and not a split feed. This meant hiring a full 40-foot production truck, plus crew. We arranged all of it through a production coordinator who specialized in large sports events. It was fun producing a full-on sports telecast. However, you never know who the locally-hired crew are. The director was highly capable, but his main sports experience was baseball, which led to some interesting camera cutting. For instance, in most football game coverage, when the quarterback passes the ball the camera stays wide and follows the ball to the receiver without cuts. However, this director chose to cut camera angles during the pass. It worked fine, but was a bit different than expected.

I learned to appreciate such live productions, because when they are done they are done. There’s no post-production with infinite client revisions. All of the stress is during the build-up and the live production. No matter how good or bad the broadcast turns out to be, the end is truly the end. That’s a rather cathartic experience. When it’s over, everyone gets a high-five and you go out to a nice, late crew dinner!

©2022 Oliver Peters

Analogue Wayback, Ep. 15

A radio station with pictures.

The mid-80s found me working for a year at a facility that operated two radio stations and owned two satellite transponders. I managed the video production side of the company. Satellite space was hard to get at the time, so the company operated its own network on one transponder and sublet the other to a different company and network.

At that same time, MTV had come to the end of its first contract with cable companies and many of them wanted other options. Creating an alternative music video channel was of interest to us. Unfortunately, our other transponder client was still leasing space during that short window when cable companies could have chosen an alternative rather than renewing with MTV. It was a missed opportunity, because shortly thereafter that client moved on anyway, leaving us with an unfilled satellite transponder. In spite of the unfortunate timing, our company’s owner still decided to launch a new, competing music video network instead of seeking a new client. That new channel was called Odyssey.

As head of production, I was part of the team tasked with figuring out the hardware and general operation of this network. This was the era of the early professional videocassette formats, so we settled on the first generation of M-format decks from Panasonic.

The M-format was a professional videocassette format developed by Panasonic and RCA. It was marketed under the Recam name by Panasonic, RCA, and Ampex. Much like VHS versus Betamax, it was Panasonic’s M-format versus Sony’s Betacam. M-format decks recorded onto standard VHS videocassettes that ran at a faster speed. They used component analog instead of composite recording. This first generation of the M-format was later replaced by the MII series, which had a slightly better professional run, but ultimately still failed in the marketplace.

It was important for us to use a premium brand of VHS tape in these decks, since music videos would play in high rotation, putting wear and tear on the tape. The Odyssey master control featured seven decks, plus a computer-controlled master control system designed to sequence the playlist of videos, commercials, promos, etc. The computer system was developed by Larry Seehorn, a Silicon Valley engineer who was one of the early developers of computer-assisted linear editing systems.

We launched at the end of the year, right at the start of the holiday week between Christmas and New Year. Everything was off and running… until the playlist computer system crashed. We quickly found out that it would only support 1500 events and then stop, something the manufacturer failed to disclose when we purchased the system. You had to load a new list and start over, losing a lot of time in between. That would have been fine in a normal TV station operation, where long program segments run between commercial breaks. For us, it was insufficient, because we only had the length of a music video in which to reload and reboot a new playlist.

Fortunately, as a back-up in case of some sort of system failure, we had prepared a number of hour-long 1″ video tapes with music video blocks in advance. Running these allowed us to temporarily continue operation while we figured out plan B.

Ultimately, the solution we settled on was to chuck the master control computer and replace it with a Grass Valley master control switcher. This was an audio-follows-video device, meaning that switching sources simultaneously switched audio and video. If you used the fader bar to dissolve between sources, it would also mix between the audio sources. This now became a human-controlled operation, with the master control operator loading and cueing tapes, switching sources, and so on. Although manual, it proved superior to a playlist-driven automated system.
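The audio-follows-video concept is simple enough to model in code. Here’s a toy Python sketch, purely illustrative and not Grass Valley’s actual control logic; the class and source names are made up.

```python
# Toy model of an audio-follows-video (AFV) switcher. Illustrative only;
# this is not Grass Valley's control logic, and the names are invented.

class AFVSwitcher:
    def __init__(self):
        self.program = None  # the source currently on air

    def take(self, source):
        """A hard cut: by definition of AFV, video and audio switch together."""
        self.program = source
        return {"video": source, "audio": source}

    def dissolve(self, from_src, to_src, position):
        """A fader-bar dissolve: the bar position (0.0 to 1.0) crossfades
        the video and mixes the audio by the same amount, so they track."""
        mix = max(0.0, min(1.0, position))
        levels = {from_src: 1.0 - mix, to_src: mix}
        return {"video": levels, "audio": dict(levels)}  # audio follows video

sw = AFVSwitcher()
print(sw.take("VTR1"))                   # cut to tape deck 1
print(sw.dissolve("VTR1", "VTR2", 0.5))  # halfway: a 50/50 video and audio mix
```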

The operators effectively became radio station disc jockeys and those same guidelines applied. Our radio station program director selected the music and set up a manual playlist, a “clock” for song genre and commercial rotation, and so on. Music videos sent to us by record labels would be copied to the M-format VHS tapes with a countdown and any added graphics, like music video song credits. Quite frankly, I have to say that our song selection was more diverse than the original MTV’s. In addition, having human operators allowed us to adjust timing on the fly in ways that an automated list couldn’t.

As ambitious as this project was, it had numerous flaws. The company was unable to get any cable provider to commit to a full channel as they had with MTV. Consequently, programming was offered to any broadcast station or cable company in any market on a first-come, first-served basis, but without a minimum time requirement. If a small, independent TV station in a large market decided to contract for only a few hours on the weekend, then it locked up that entire market.

The other factor that worked against Odyssey was that Turner Broadcasting had already tried to launch their music channel with a LOT more money. Turner’s effort crashed and burned in a month. Needless to say, our little operation was viewed with much skepticism. Many would-be customers and advertisers decided to hold off at least a year to see if we’d still be in business at that time. Of course, that didn’t help our bottom line.

In spite of these issues, Odyssey hung on for ten months before the owner finally tossed in the towel. Even though it didn’t work out and I had moved on anyway, it was still a very fun experience that took me back to when I started out in radio.

©2022 Oliver Peters

Virtual Production

Thanks to the advances in video game software and LED display technology, virtual production has become an exciting new tool for the filmmaker. Shows like The Mandalorian have thrust these techniques into the mainstream. To meet the demand, numerous companies around the world are creating virtual production sound stages, often referred to as “the volume.” I recently spoke with Pixomondo and Trilith Studios about their moves into virtual production.

Pixomondo

Pixomondo is an Oscar and Emmy-winning visual effects company with multiple VFX and virtual production stages in North America and Europe. Their virtual production credits include the series Star Trek: Strange New Worlds and the upcoming Netflix series Avatar: The Last Airbender.

The larger of the two virtual production stages at Pixomondo’s Toronto facilities is 300 feet x 90 feet and 24 feet tall. The LED screen system is 72 feet in diameter. Josh Kerekas is Pixomondo’s Head of Virtual Production.

Why did Pixomondo decide to venture into virtual production?

We saw the potential of this new technology and launched a year-long initiative to get our virtual production division off the ground. We’re really trying to embrace real-time technology, not just in the use case of virtual production in special studios, but even in traditional visual effects.

Click here to continue this article at postPerspective.

©2022 Oliver Peters

Analogue Wayback, Ep. 14

What’s old is new again.

When I watch shows like The Mandalorian and learn about using the volume, it becomes apparent that such methods conceptually stem from the earliest days of film. Some of these old school techniques are still in use today.

Rear-screen projection draws the most direct line to the volume. In its simplest form, there’s a translucent screen behind the talent. Imagery is projected onto the screen from behind. The camera sees the actors against this background scene as if it were a real set or landscape. No compositing is required, since this is all in-camera. In old films, this was a common technique for car driving scenes. The same technique was used by David Fincher for Mank, although large high-resolution video screens were used instead of projected images.

Front-screen projection is a similar process. The camera faces a special reflective backdrop coated with tiny glass beads. There’s a two-way mirror block between the camera lens and the talent, who is standing in front of the screen. A projection source sits at 90 degrees to the camera and shines into the mirror, which is at a 45-degree angle inside the block. This casts the image onto the reflective backdrop. The camera shoots through this same mirror and sees both the talent and the projected image behind them, much like rear-screen projection.

The trick is that the projected image is also shining onto the talent, but you don’t actually see it on the talent. The reason is that the projector light level is so low that it’s washed out by the lighting on the talent. The glass beads of the backdrop act as tiny lenses that focus the light of the projected background image back towards the camera lens. The camera sees a proper combination without any contamination on the talent, even if that’s not what you see with the naked eye.

A similar concept is used in certain chromakey techniques. A ring light on the camera lens shines green or blue light onto the talent and onto the grey, reflective backdrop behind them. This backdrop also contains small glass beads that act as tiny lenses. The camera sees color-correct talent, but instead of grey, it’s a perfect green or blue screen behind them.

Aerial image projection is a cool technique that I haven’t personally seen used in modern production, although it’s probably still used in some special effects work. The process was used in multimedia production to add camera moves to still images. In a sense, it led to digital video effects. A projection source shines an image onto a translucent, suspended pane of ground glass. A camera is positioned on the opposite side, so both camera and projector face the glass pane. The projected image is focused onto the glass so that it’s crisp. The camera then records the image, which can be resized as needed. In addition, a camera operator can add camera moves while recording the projected image that is “floating” on the glass pane.
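As a digital footnote to this analog trick: a modern “camera move” on a still image boils down to computing a series of crop windows over the frame. Here’s a minimal sketch with made-up image dimensions and move parameters, not any particular product’s implementation.

```python
# A "camera move" over a still image, computed as a series of crop windows.
# Dimensions and move parameters are made-up examples.

def camera_move(img_w, img_h, start_zoom, end_zoom, frames):
    """Yield centered (x, y, width, height) crops that push in from
    start_zoom to end_zoom. zoom = 1.0 is the full frame; 2.0 is a 2x push-in."""
    assert frames >= 2, "a move needs at least a start and an end frame"
    for f in range(frames):
        t = f / (frames - 1)                     # progress: 0.0 to 1.0
        zoom = start_zoom + t * (end_zoom - start_zoom)
        w, h = img_w / zoom, img_h / zoom        # the crop shrinks as we push in
        x, y = (img_w - w) / 2, (img_h - h) / 2  # keep the crop centered
        yield (round(x), round(y), round(w), round(h))

# A three-frame push-in on a 1920x1080 still, from full frame to a 2x zoom:
for rect in camera_move(1920, 1080, 1.0, 2.0, 3):
    print(rect)  # (0, 0, 1920, 1080) -> (320, 180, 1280, 720) -> (480, 270, 960, 540)
```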

©2022 Oliver Peters