Video Technology 2020 – Shared Storage

Shared storage used to be the domain of “heavy iron” facilities, with Avid, Facilis, and earlier Apple Xserve systems providing the horsepower. Thanks to advances in networking and Ethernet technology, shared storage is now accessible to any user. Whether built-in or via adapters, modern computers can tap into 1Gbps, 10Gbps, and even higher networking speeds. Most computers can natively access Gigabit Ethernet networks (1Gbps) – adequate for SD and HD workflows. Computers designed for the pro video market increasingly sport built-in 10GbE ports, enabling comfortable collaboration with 4K media and up. Some of today’s most popular shared storage vendors include QNAP, Synology, and LumaForge.

This technology will become more prolific in 2020, with systems that are easier to connect and administer, making shared storage as plug-and-play as any local drive. Network Attached Storage (NAS) systems can service a single workstation or multiple users. In fact, companies like QNAP offer consumer versions of these products designed to operate as home media servers. LumaForge even sells a version of its popular Jellyfish through the online Apple Store. A simple online connection guide will get you up and running, no IT department required. This is ideal for the individual editor or small post shop.

Expect 2020 to bring higher connection speeds, such as 40GbE, and even more widespread NAS adoption. It’s not just a matter of growth. These vendors are also interested in extending the functionality of their products beyond being a simple bucket for media. NAS systems will become full-featured media hubs. For example, if you are an Avid user, you are familiar with the Media Central concept. In essence, this means the shared storage solution is a platform for various other applications, including the editing software. There are additional media applications, such as management apps for user permission control, media queries, and more. Like Avid, the other vendors are exploring similar extensibility through third-party apps, such as Axle Video, Kyno, Hedge, Frame.io, and others. As such, a shared network becomes a whole that is greater than the sum of its parts.

Along with increased functionality, expect changes in the hardware, too. Modern NAS hardware is largely based on RAID arrays with spinning mechanical drives. As solid state (SSD) storage becomes more affordable, many NAS vendors will offer products featuring RAID arrays configured with SSDs or even NVMe drives, or a mixture of the two, with the SSD-based tiers used for short-term projects or cache files. Eventually the cost will come down enough that large storage volumes can be cost-effectively populated with only SSDs. Don’t expect to be purchasing 100TB of SSD storage at a reasonable price in 2020; however, that is the direction in which we are headed. At least in this coming year, mechanical drives will still rule. Nevertheless, expect some percentage of your storage inventory to soon be based on SSDs.


Originally written for Creative Planet Network.

©2020 Oliver Peters

Video Technology 2020 – The Cloud

The “cloud” is merely a collection of physical data centers in multiple locations around the world – not much different from a small storage center you might run yourself. Of course, they employ more advanced systems for power, redundancy, and security than you do. When you work with one of the companies marketing cloud-based editing or a review-and-approval service, like Frame.io or Wipster, they provide the user-facing interface, but they are actually renting storage space from one of the big three cloud providers – Google, Amazon, or Microsoft.

There are three reasons that I’m skeptical about ubiquitous, cloud-based editing (with media at native resolutions) in the short term: upload speeds, cost, and security.

Speed

5G (fifth-generation wireless) is the technology predicted to offer adequate speeds and low latency for native 4K (and higher) media. While 5G will be a great advancement for many things, it’s a short-distance signal requiring more transmission sites than current wireless technology. Full coverage in most metro areas, let alone widespread geographical coverage worldwide, will take many years to deploy. Other than potential camera-to-cloud uploads of proxy media in the field, 5G won’t soon be the killer solution. Current technology still dictates that if you want the fastest possible upload speeds for large amounts of data, you have to tap as close as possible to the internet’s backbone.

Cost

Cloud storage is cheap, but moving large amounts of data in and out of it isn’t. Unfortunately, modern video resolutions also generate huge amounts of data on every shoot. Uploading native 4K media for a week-long production is considerably more expensive than FedEx and overnight charges to ship drives. What about long-term storage? Let’s say that all of your native media is in the cloud and you pay according to a monthly or annual subscription plan. But what if you want to stop? That media will have to be downloaded and stored locally, which will incur egress charges, as well as your time to download everything.
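A quick back-of-the-envelope calculation makes the point. The footage size, uplink speed, and egress price below are all assumptions for illustration, not quotes from any provider:

```python
# Rough estimate: pushing a week of native 4K dailies to the cloud,
# then pulling it all back down when you leave the service.
# All figures are illustrative assumptions.

def upload_hours(total_tb, uplink_mbps):
    """Hours needed to move total_tb terabytes over a sustained uplink."""
    total_megabits = total_tb * 8 * 1_000_000  # TB -> megabits (decimal)
    return total_megabits / uplink_mbps / 3600

def egress_cost(total_tb, dollars_per_gb):
    """Download (egress) charges to retrieve the media again."""
    return total_tb * 1000 * dollars_per_gb

shoot_tb = 8        # assumed: one week of native 4K footage
uplink = 100        # assumed: 100 Mbps sustained upload
egress_rate = 0.09  # assumed: roughly $0.09 per GB egress

print(f"Upload time: {upload_hours(shoot_tb, uplink):.0f} hours")
print(f"Egress to download it all: ${egress_cost(shoot_tb, egress_rate):,.0f}")
```

Even with these optimistic numbers, the upload runs for the better part of a week and the exit download carries a real price tag, which is exactly why shipping drives still wins for native media.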

Security

Think these sites are unequivocally secure? Look at any data hack at a major company. Security is such a concern in our business that most major movie studios won’t let their editors connect their computers to the internet. Many make those editors check their cell phones at the door. No matter how secure, it’s going to be a hard sell, except for limited slices of the production, such as cloud-based VFX rendering.

I do believe 2020 will be a year in which many will take advantage of long-distance, cloud-based editing services using low-res proxy media. Services will increasingly be used to move dailies and deliverables around the globe via the cloud. But that’s a big difference from cloud-based editing becoming the norm. One edit scenario many will experiment with is storing the edit project files in the cloud, with the media mirrored locally at each edit site. This way only the lightweight files used for edit collaboration need be moved over the internet. Think of this as Google Docs for editing. Adobe already offers a version of this, but I suspect you’ll see others, including solutions for Final Cut Pro X. So while true cloud-based editing is not a near-term solution, bits and pieces will become increasingly commonplace.


Video Technology 2020 – Apple and the PC Landscape

Apple enjoys a small fraction of the total computer market, yet has an outsized influence on video production and post. Look anywhere in our business and you’ll see a high percentage of Apple Mac computers and laptops in use by producers, DITs, editors, mixers, and colorists. This has influenced the development and deployment of certain technologies, such as optimization for Metal, Thunderbolt I/O, ProRes codecs, and more. This may irritate Windows users, but it’s something companies like Avid, Adobe, and others cannot ignore. When Apple deprecates OpenGL, OpenCL, and CUDA in favor of Metal, developers of software for Apple computers follow suit so that their Mac-based customers enjoy a good experience.

Going into 2020, Apple is offering a better line-up of professional Mac products than it has in years. MacBook Pro laptops, iMacs and iMac Pros, and the new Mac Pro are clearly targeted at the professional customer. Add to this the Pro Display XDR and authorized third-party products available through Apple, like LumaForge Jellyfish storage and Blackmagic and Sonnet eGPUs. Clearly Apple intends to offer an end-to-end hardware and software ecosystem designed to appeal to the pro video customer.

Apple’s prices can be a turn-off for some. Similar investments in a PC – especially custom configurations – may yield better performance in certain applications. Nevertheless, most former and present owners of Mac Pro “cheese grater” towers feel like they got their money’s worth and will at least have interest in the new Mac Pro. The same goes for MacBook Pro owners. So while these new machines may not move the needle for the larger consumer computer market, they will definitely keep current Mac users in the fold and prevent migration to Windows or Linux PCs. This also reinforces Apple’s interest in the professional market – not just video, but also animation, design, audio, science, and engineering.

The unknown will be the impact of Apple’s new Afterburner card for the Mac Pro. While accelerator cards have been offered by various manufacturers in the past, recent computing developments have focused on processor core counts and GPU technology. Afterburner is Apple’s first FPGA-based (field-programmable gate array) hardware accelerator card. Designed for transcoding, it promises to increase stream counts with 4K and 8K raw and standard codecs in the Mac Pro. Once it’s out in the wild, we will have a better idea of who supports it (beyond Apple’s own software) and its real-world performance.

As Apple goes, so goes the rest of the industry. How will the PC world counter this? Will we see similar cards from HP or Dell? Or will NVIDIA respond with similar results using their GPUs? That’s unknown right now, but my guess is that it will take at least this next year for the rest of the world to respond with competing solutions.


Every NLE is a Database

Apple’s Final Cut Pro X has spawned many tribal arguments since its launch eight years ago. There have been plenty of debates about the pros and cons of its innovative design and editing model. One claim that I’ve heard a number of times is that FCPX is a relational database, while traditional editing applications are more like an Excel spreadsheet. I can see how the presentation of a bin in the list view format might convey that impression, but that doesn’t make it accurate. A spreadsheet is a grid of cells whose relationships are defined by formulas, regardless of whether a cell holds text or numbers. All nonlinear editing applications (NLEs) use a relational database to track media, although the type and format of this database differ among brands. In every case, a relational database functions altogether differently from a spreadsheet.

It started with film

When all editing was done on film, editors cut work print – a positive copy printed from the camera negative. Edits made on the work print were eventually duplicated on the pristine negative by a negative cutter, based on a cut list. Determining where to cut and join the film segments was based on a list of edit points corresponding to the source rolls of the film, plus a foot+frame count for each specific edit point. The work print, which the editors could physically cut and splice as needed, was effectively an abstraction of – and stand-in for – the negative.

In order to enable the process, assistant editors (or in some cases, the editor) created a handwritten log, known as a codebook. This started with the dailies and included all the pertinent information, such as source roll, shoot days/dates, scenes/takes, director’s notes, editor’s notes, and so on. The codebook was a physical database that allowed an editor to know what the options were and where to find them.

During the videotape-editing era prior to NLEs, any sort of database for tracking source information was still manual. Only the cut list portion, known as the edit decision list (EDL), could be generated by the edit computer, based on the timecode values recorded on the tape. Timecode became the electronic equivalent of film’s foot+frame count.

Fast forward to the modern era with file-based camera acquisition and ubiquitous, inexpensive editing software. The file recorded by the camera is a container of sorts that holds essence (audio and video) and metadata (information about the essence). Some cameras generate a lot of metadata and others don’t. One example of this type of metadata that we all encounter is the information embedded into digital still photos, which can include location, lens data, and a ton more.

When clips are ingested/imported into your NLE – whether into a project, bin, folder, or an event – the NLE links to the essence of the media clips on the hard drive or camera card and brings in whatever clip metadata is understood by that application. In addition, the user can add and merge a lot more metadata derived from other sources, like the sound recorder, script supervisor notes, electronic script, and manually-added data.

The clip that you see in the bin/event/folder is an abstraction of the actual audio and video media, just like work print was for film editors. The bin/folder/event data entries are like the film editor’s codebook and are tracked in the internal database used by that application to cross-reference the clip with the actual stored media. Since a clip in the app’s browser is simply an abstraction, it can appear in multiple places at the same time – in various bins and sequences. The internal database makes sure that all of these instances reference the same piece of media, accurate down to the video frame or audio sample.
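That clip-to-media relationship can be sketched as a tiny relational database. The table and column names here are invented for illustration; every shipping NLE uses its own internal schema:

```python
# Toy model of the relational bookkeeping an NLE keeps under the hood:
# one row per piece of media on disk, and any number of clip "instances"
# (in bins, events, or sequences) that reference it.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE media (
        media_id    INTEGER PRIMARY KEY,
        file_path   TEXT,                      -- essence on the hard drive
        camera_roll TEXT, scene TEXT, take TEXT -- ingested metadata
    );
    CREATE TABLE clip_instance (
        container TEXT,                         -- bin, event, or sequence
        media_id  INTEGER REFERENCES media(media_id)
    );
""")
db.execute("INSERT INTO media VALUES (1, '/media/A001_C002.mov', 'A001', '12', '3')")

# The same clip can appear in several places at once...
for container in ("Dailies Bin", "Selects Bin", "Sequence v1"):
    db.execute("INSERT INTO clip_instance VALUES (?, 1)", (container,))

# ...yet every instance resolves to the single stored media file.
rows = db.execute("""
    SELECT c.container, m.file_path
    FROM clip_instance c JOIN media m USING (media_id)
""").fetchall()
print(rows)
```

Delete the row in `media` and every instance goes offline at once, which is exactly the behavior you see when an NLE reports unlinked media across all of your bins and timelines.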

It doesn’t matter how the bin looks

The spreadsheet comparison is based on how bins have appeared in most NLEs, including Final Cut Pro “legacy,” Avid Media Composer, and others. Unfortunately that opinion is usually based on a narrow exposure to other NLEs. As I said, at the core, every NLE is a relational database. And so, there are other things that can be tracked and/or ways it can be displayed.

For instance, older Quantel edit systems displayed source information based on what we would consider a smart search view today. The entirety of the source material was not displayed in front of the editor, since it was a single-screen layout. Entering data into a search field would sift through and present clips matching the requested data.

Avid Media Composer systems can also track media via Script Integration (sometimes incorrectly referred to as ScriptSync, which is a separate Avid option). This is a graphical bin layout with the script text displayed on screen and clips linked to the coverage of each scene. Media Composer, and now Premiere Pro, both permit a freeform clip view for a bin, in which the editor can freely rearrange the positions of the clip thumbnails within the bin window. This user-arranged visual juxtaposition of clips conveys important information to the editor.

All NLEs have multiple ways to present the data and aren’t limited to a grid-style list view that resembles a spreadsheet or a grid of clip thumbnails. Enabling these alternate views takes a lot more than simply cross-referencing your bin and timelines against a set of edit points. That’s where databases come in and why every NLE is built around one.

How can you be in two places at once when you’re not anywhere at all?

My apologies to Firesign Theatre. A huge aspect of the Final Cut Pro X edit workflow is the use of keyword collections. Thanks to them, a clip isn’t limited to living in just a single bin. While this is a selling point for FCPX, it is also well within the capabilities of most NLEs.

Organizing your event (bin) media in FCPX can start by assigning keywords to each clip. Each new keyword used creates a keyword collection – sort of a “smart sub-bin.” As you assign one or more keywords to a clip, FCPX automatically sorts the clip into those corresponding keyword collections.  For example, let’s say you have a series of wide and close-up shots featuring both male and female actors. Clip 1 might be sorted into WIDE and MAN; Clip 2 into WIDE and WOMAN; Clip 3 into WOMAN and CLOSE-UP. So then the keyword collection for WIDE displays Clip 1 and Clip 2; MAN displays Clip 1; WOMAN displays Clip 2 and Clip 3; CLOSE-UP displays Clip 3.
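The mechanics of that sorting are simple enough to model in a few lines. This sketch mirrors the example above, with the collection behavior reduced to a keyword-to-clips mapping:

```python
# Minimal model of FCPX-style keyword collections: assigning a keyword
# to a clip automatically files it into that collection, and one clip
# can live in several collections at once.
from collections import defaultdict

collections = defaultdict(list)  # keyword -> clips (the "smart sub-bins")

def assign_keywords(clip, *keywords):
    """Tag a clip; it appears in every matching keyword collection."""
    for kw in keywords:
        collections[kw].append(clip)

# The example from the text:
assign_keywords("Clip 1", "WIDE", "MAN")
assign_keywords("Clip 2", "WIDE", "WOMAN")
assign_keywords("Clip 3", "WOMAN", "CLOSE-UP")

# WIDE holds Clips 1 and 2; MAN holds Clip 1;
# WOMAN holds Clips 2 and 3; CLOSE-UP holds Clip 3.
print(dict(collections))
```

The key point is that nothing is duplicated: each collection holds references to the same clip abstraction, just as the NLE’s internal database does.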

Once this initial step is completed, it enables the editor to view source clips in a more focused manner. Instead of wading through 100 clips in the event (bin) each time, the editor may only have to deal with the 10 clips in the CLOSE-UP keyword collection, or in any other collection. The beauty of FCPX’s interface design is the speed and fluidity with which this can be accomplished. This feature is one of the hallmarks of the application, and no other NLE does it nearly as elegantly. In fact, FCPX tackles the challenge of narrowing down the browser options through three methods – ratings, keyword collections, and smart collections (described in this linked tutorial by Simon Ubsdell).

As elegantly as Final Cut tackles this task, that doesn’t mean that other NLEs can’t function in a similar manner. Within Premiere Pro, those exact same keywords can be assigned to the clips. Then simply create a set of search bins using those same keywords as the search criteria. The result is the exact same distribution of clips into collections, where multiple clips can appear in multiple bins at the same time. Likewise, the editor doesn’t need to go through the full set of clips in a bin, but can concentrate on the small handful in any given search bin. Media Composer also offers search functions, as well as custom sift routines, which enable you to display only the clips matching specific column details, like a custom keyword.

Most NLEs can only store one set of in/out edit marks on a clip within a bin at any given time. Final Cut Pro X, on the other hand, offers range-based selection: clips can retain multiple in/out selections at once. Still, other NLEs aren’t behind here either. The obvious solution most editors use when this is needed is to create a subclip, which can be a duplicate of the entire clip or of a portion within it. Need to pull multiple sections of the clip? Simply create multiple subclips. In effect, these are the same as range-based selections in Final Cut Pro X. Admittedly, the FCPX method is more fluid and straightforward: range-based selections are virtual subclips that are dynamically created by the editor, although unlike subclips, they can’t be moved separately to other events (bins). Two ways to tackle a very similar need.
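Reduced to plain data, the two approaches look like this. The clip name, file path, and frame numbers are hypothetical, chosen just to show the shape of each structure:

```python
# Two ways to mark multiple selections on one source clip.
# FCPX-style: the ranges live on the clip itself.
# Traditional: each selection becomes its own subclip bin item.

source = {
    "name": "Interview_A",
    "media": "/media/INT_A.mov",
    # Range-based selection: several in/out pairs retained at once.
    "ranges": [(100, 250), (400, 520), (900, 1010)],
}

# Traditional NLE: pull each selection out as a separate subclip,
# each a new bin item pointing back at the same media file.
subclips = [
    {"name": f"{source['name']}.sub{i + 1}",
     "media": source["media"],
     "in": tc_in, "out": tc_out}
    for i, (tc_in, tc_out) in enumerate(source["ranges"])
]

for sc in subclips:
    print(sc["name"], sc["in"], sc["out"])
```

Both structures resolve to the same frames of the same media; the difference is simply where the selection data is stored, which is why the subclip, as a standalone bin item, can be moved to other bins while an FCPX range cannot.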

The bottom line is that under the hood, all NLEs are still very much the same. Let me emphasize that I’m not arguing the superiority, speed, or elegance of one approach or tool over another. Every company has their own set of unique features that appeal to different types of editors. They are simply different methods to place information at your fingertips, get roadblocks out of the way, and thus to make editing more creative and enjoyable.

©2019 Oliver Peters

Mind your TCO

TCO = total cost of ownership.

When fans argue PCs versus Macs, the argument tends to focus only on the purchase price of the hardware. But owning a computer is also about the total operating cost – the TCO – over its lifespan. In the corporate world, IBM has already concluded that Mac deployment is cheaper for its IT department. For video editors, a significant part of the equation is the software we run. Here, all things are not equal, since there are options for the Mac that aren’t available to PC users. Yes, I know that Avid, Adobe, and Blackmagic Design offer cross-platform tools, but this post is a thought exercise, so bear with me.

If you are a PC user, odds are that you will be using Adobe Creative Cloud software, which is only available as a subscription. Sure, you could be using Media Composer or Edius, but more likely it will be Premiere Pro and the rest of the Creative Cloud tools, such as Photoshop and After Effects. Avid offers both perpetual and subscription plans, but the perpetual licenses require an annual support renewal to stay current with the software. The operating costs of Avid and Adobe end up in a very similar place over time.

Mac users could use the same tools, of course, but they do have significant alternatives in non-subscription software, like Apple’s own Pro Applications. In addition, macOS includes additional productivity software that PC users would have to purchase at additional cost. The bottom line is that you have to factor in the cost of the subscription over the lifespan of the PC, which adds to its TCO*.

For this exercise, I selected two 15″ laptops – a Dell and a MacBook Pro. I configured each as close to the other as possible, with the exception that Dell only offers a 6-core CPU, whereas new MacBook Pros use 8-core chips. That comes to $2395 for the Dell and $3950 for the Apple – a pretty big difference. But now let’s add software tools.

For the PC’s suite of tools, I have included the full Adobe Creative Cloud bundle, along with a copy of Microsoft Office. Adobe’s current subscription rate for individuals comes to $636/year (when paid annually, up front). You would also have to add Microsoft Office to get Word, Excel, and PowerPoint. Even though Microsoft is moving to subscriptions, you can still buy Office outright; a home/small business license is $250.

You could, of course, make the same choices for the Mac, but that’s not the point of this exercise. I’m also not trying to make the argument that one set of tools is superior to the other. This post is strictly about comparing cost. If you decide to add alternative software to the Mac in order to parallel the Adobe Creative Cloud bundle, you would have to purchase Final Cut Pro X, Motion, Compressor, and Logic Pro X. To cover Photoshop/Illustrator/InDesign tasks, add Affinity Photo, Designer, and Publisher. You can decide for yourself whether or not macOS Photos is a viable substitute for Lightroom; but, for sake of argument, let’s add ON1 Photo RAW to our alternative software package. Some Adobe tools, like Character Animator, won’t have an equal, but that’s an application that most editors have probably never touched anyway. Of course, macOS comes with Pages, Numbers, and Keynote, so no requirement to add Microsoft Office for the MacBook Pro. Total all of this together and the ballpark sum comes to $820. But you have purchased perpetual licenses and do not require annual subscription payments.

In the first year of ownership, PC users clearly have the edge. In fact, up until year three, the TCO is cheaper for PC owners. Odds are you’ll own your laptop longer than three years. I’m typing this on a mid-2014 15″ MacBook Pro, which is also my primary editing machine for any work I do at home. Once you cross into the fourth year and longer, the Mac is cheaper to own and operate, purely because of the differences in software license models.
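The break-even point falls out directly from the figures above. This sketch uses the rounded totals from the text (Dell hardware plus Office and an annual Creative Cloud subscription, versus the MacBook Pro plus one-time perpetual licenses):

```python
# Cumulative TCO per year of ownership, using the rounded figures
# from the text. The PC pays a subscription every year; the Mac's
# software is a one-time purchase.

PC_HARDWARE, PC_OFFICE, CC_PER_YEAR = 2395, 250, 636
MAC_HARDWARE, MAC_SOFTWARE = 3950, 820

def pc_tco(years):
    return PC_HARDWARE + PC_OFFICE + CC_PER_YEAR * years

def mac_tco(years):
    # Perpetual licenses: the total does not grow with years owned.
    return MAC_HARDWARE + MAC_SOFTWARE

for year in range(1, 6):
    cheaper = "PC" if pc_tco(year) < mac_tco(year) else "Mac"
    print(f"Year {year}: PC ${pc_tco(year)}  Mac ${mac_tco(year)}  -> {cheaper}")
```

Run the numbers and the PC stays cheaper through year three, with the crossover arriving in year four, exactly because the subscription keeps accruing while the Mac’s software cost is fixed.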

Remember this is a simple thought exercise and you can mix and match software combinations any way you would like. These are worthwhile considerations when comparing products. It’s just not as simple as saying one hardware product is cheaper than the other, which is why a TCO analysis can be very important.

*Totals shown have been rounded for simplicity.
