Regardless of whether you own or work for a small editorial company or a large studio cranking out blockbusters, media, and how you manage it, is the circulatory system of your operation. No matter the size, many post operations share some of the same concerns, although they may approach them with vastly different solutions from company to company.
Last year I wrote on this topic for postPerspective and interviewed key players at Molinare and Republic. This year I’ve revisited the topic, taking a look at top Midwestern spot shops Drive Thru and Utopic, as well as Marvel Studios. In addition, I’ve broken down the “best practices” that Netflix suggests to its production partners.
Here are links to these articles at postPerspective:
We often think about the projects that “got away” or never happened. For me, one of those was a documentary about the “garage band” acts of central Florida during the 1960s. These were popular local and regional acts with an eye towards stardom, but who never became household names like Elvis or The Beatles. Central Florida was a hotbed for such acts back then, in the same way that San Francisco, Memphis, or Seattle have been during key moments in rock ‘n roll history.
For much of the early rock ‘n roll era, music was a vertically-integrated business. Artist management, booking, recording studios, and marketing/promotion/distribution were all handled by the same company. The money was made in booking performances more so than in record sales.
Records, especially 45 RPM “singles,” were produced in order to promote the band. Singles were sent for free to radio stations in hopes that they would be placed into regular rotation by the station. That airplay would familiarize listeners/fans with the bands and their music. While purchasing the records was a goal, the bigger aim was name recognition, so that when a band was booked for a local event (dance, concert, youth club appearance, tour date), the local fans would buy tickets and show up. Naturally some artists broke out in a big way, which meant even more money in record sales, as well as touring.
Record labels, studios, recording studios, and talent booking services – whether the same company or separate entities – enjoyed a very symbiotic relationship. Much of this is chronicled in a mini-doc I cut for the Memphis Rock ‘n Soul Museum. It highlighted studios like Sun, Stax, and Hi and their role in the birth of rock ‘n roll and soul music.
Years later Schabacker approached me with an interesting project – a documentary about the local garage bands of the 60s. Together with a series of interviews with living band members, post for the documentary would also involve the restoration of several proto-music videos. Bee Jay had videotaped promotional videos for 13 of the bands back in the day. While Schabacker handled the recording of the interviews, I tackled the music videos.
The original videos were recorded using a rudimentary black-and-white production system. These were recorded onto half-inch open reel videotape. Unfortunately, the video tubes in the cameras back then didn’t always handle bright outdoor light well and the video switcher did not feature clean vertical interval switching. The result was a series of recordings in which video levels fluctuated and camera cuts often glitched. There were sections in the recordings where the tape machine lost servo lock during recording. The audio was not recorded live. Instead, the bands lip-synced to playback of their song recordings, which was also recorded in sync with the video. These old videos were transferred to DV25 QuickTime files, which formed my starting point.
Step one was to get clean audio. The bands’ tunes had been recorded and mixed at Bee Jay Studios at the time into a 13-song LP that was used for promotion to book those bands. However, at this point over three decades later, the master recordings were no longer available. But Schabacker did have pristine vinyl LPs from those sessions. These were turned over to local audio legend and renowned mastering engineer, Bob Katz. In turn, he took those versions and created remastered files for my use.
Now that I had good sound, my task was to take the video – warts and all – and rebuild it in sync with the song tracks, clean up the video, get rid of any damage and glitches, and in general end up with a usable final video for each song. Final Cut Pro (legacy) was the tool of choice at that time. Much of the “restoration” involved slightly slowing down or speeding up shots to resync the files – shot by shot. I also had to repeat and slo-mo some shots for fit-and-fill, since frames would be lost as glitchy camera cuts and other disturbances were removed. In the end, I rebuilt all 13 into a presentable form.
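The retiming arithmetic behind that shot-by-shot resync is simple enough to sketch. This is a hypothetical illustration, not the actual project math – the function name and frame counts are my own, and the original work was done by eye in Final Cut Pro rather than by calculation:

```python
# Hypothetical sketch of the retime arithmetic used when resyncing a shot.
# All names and frame counts here are illustrative, not from the project.

def retime_speed(source_frames: int, target_frames: int) -> float:
    """Playback speed (as a percentage) needed so that a shot of
    source_frames fills target_frames on the timeline."""
    return 100.0 * source_frames / target_frames

# Example: a 90-frame shot must fill a 95-frame gap left after a glitchy
# camera cut was trimmed out, so it plays back slightly slower than 100%.
speed = retime_speed(90, 95)
print(f"{speed:.1f}%")  # a bit under 95%, i.e. a subtle slow-down
```

In practice a change of only a few percent is invisible to the viewer, which is why this kind of fit-and-fill retiming works so well for hiding removed frames.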
While that was a labor of love, the downside was that the documentary never came to be. All of these bands had recorded great-sounding covers (such as Solitary Man), but no originals. Unfortunately, it would have been a nightmare and quite costly to clear the music rights for these clips if used in the documentary. A shame, but that’s life in the filmmaking world.
None of these bands made it big, but in subsequent years, bands of another era like *NSYNC and the Backstreet Boys did. And they ushered in a new boy band phenomenon, which carries on to this day in the form of K-pop, among other styles.
Football. That’s where they play in quarters, right?
One of the fun experiences while in Birmingham was putting together the telecast of three football games. It was a brief moment in time when the original, upstart USFL professional football league challenged the NFL’s football supremacy during the spring season. Birmingham was one of several cities around the nation with a team, which was a great opportunity for the TV station.
Back then (and maybe still today) one way for a television station to shore up its revenue for the month was to bump a night of primetime network programming and replace it with its own. An affiliate lost that night’s worth of network compensation (the money a network pays the affiliate to run network programming). However, they were then able to fill all of the commercial slots for the night, which more than made up for it.
As long as an affiliate didn’t do this too often, networks wouldn’t challenge it, especially if this was for a strong local event. Furthermore, a broadcaster could promote this as a special event, like coverage of important away games that were normally unavailable. The station could charge premium ad dollars for commercial placement within the game, as well as other ancillary income, like sponsorship of the broadcast.
The station covered three away games being played in New Jersey at the Meadowlands, in Chicago at Soldier Field, and in Denver’s Mile High Stadium. The first two were split feeds. A split feed is when you are tagging onto another company or network that is broadcasting the game with a full production truck and crew. All the station had to do was book a smaller truck that would piggyback off of the main production truck. The main truck would send our truck a “clean feed” from their video switcher without their graphics or talent inserts. It also included game audio without their announcers. In the split feed truck, we added our own graphics, mixed in our own play-by-play announcer audio, and then cut to our own single camera whenever we wanted to see our sports reporter.
As production manager for the station, I flew in to produce the telecast, along with our reporter and graphics producer. Chyron (character generator) material, like logos, player names, and templates for stats pages, had been produced in advance. We hired local crew members, including a camera operator, technical director, audio engineer, and Chyron operator.
It got off to a fun start in the Meadowlands. Our New York-based Chyron operator was experienced with hockey games. Football – not so much. As we started to finalize the Chyron pages prior to the game, his first response was, “We’re doing football. Right? That uses quarters, right? OK, I get it.” Everything went off without a hitch. The Chicago experience went equally well, except the taxi driver was a bit confused about where the entrance to Soldier Field was! In addition, the director in the main production truck seemed very “high strung” based on what we were hearing through our intercom connection.
Denver, on the other hand, was a completely different experience. We were the main truck doing a full production and not a split feed. This meant hiring a full 40-foot production truck, plus crew. We arranged all of it through a production coordinator who specialized in large sports events. It was fun producing a full-on sports telecast. However, you never know who the locally-hired crew are. The director was highly capable, but his main sports experience was baseball, which led to some interesting camera cutting. For instance, in most football game coverage, when the quarterback passes the ball the camera stays wide and follows the ball to the receiver without cuts. However, this director chose to cut camera angles during the pass. It worked fine, but was a bit different than expected.
I learned to appreciate such live productions, because when they are done they are done. There’s no post-production with infinite client revisions. All of the stress is during the build-up and the live production. No matter how good or bad the broadcast turns out to be, the end is truly the end. That’s a rather cathartic experience. When it’s over, everyone gets a high-five and you go out to a nice, late crew dinner!
The mid 80s found me working for a year at a facility that operated two radio stations and owned two satellite transponders. I managed the video production side of the company. Satellite space was hard to get at the time, so they operated their own network on one of them and sublet the other to a different company and network.
At that same time MTV had come to the end of its first contract with cable companies and many wanted other options. Creating a new music video channel alternative was something of interest to us. Unfortunately, our other transponder client was still leasing space during the short window when cable companies could have chosen an alternative rather than renewing with MTV. Thus, a missed opportunity, because it was shortly thereafter that our client moved on anyway, leaving us with an unfilled satellite transponder. In spite of the unfortunate timing, our company’s owner still decided to launch a new and competing music video network instead of seeking a new client. That new channel was called Odyssey.
As head of production, I was part of the team tasked with figuring out the hardware and general operation of this network. This was the era of the early professional videocassette formats, so we settled on the first generation of M-format decks from Panasonic.
The M-format was a professional videocassette format developed by Panasonic and RCA. It was marketed under the Recam name by Panasonic, RCA, and Ampex. Much like VHS versus Betamax, it was Panasonic’s M-format versus Sony’s Betacam. M-format decks recorded onto standard VHS videocassettes that ran at a faster speed. They used component analog instead of composite recording. This first generation of the M-format was later replaced by the MII series, which had a slightly better professional run, but ultimately still failed in the marketplace.
It was important for us to use a premium brand of VHS tape in these decks, since music videos would play in a high rotation, putting wear and tear on the tape. The Odyssey master control featured seven decks, plus a computer-controlled master control system designed to sequence the playlist of videos, commercials, promos, etc. The computer system was developed by Larry Seehorn, a Silicon Valley engineer who was one of the early developers of computer-assisted, linear editing systems.
We launched at the end of the year, right at the start of the holiday week between Christmas and New Year. Everything was off and running… Until the playlist computer system crashed. We quickly found out that it would only support 1500 events and then stop. This was something that the manufacturer failed to disclose when we purchased the system. You had to reload a new list and start over, losing a lot of time in between. That would have been fine in a normal TV station operation, since there are long program segments between commercial breaks. For us, it was a serious problem, because we only had the length of a music video in which to reload and reboot a new playlist.
Fortunately, as a backup in case of some sort of system failure, we had prepared a number of hour-long 1″ videotapes with music video blocks in advance. Running these allowed us to temporarily continue operation while we figured out plan B.
Ultimately the solution we settled on was to chuck the master control computer and replace it with a Grass Valley master control switcher. This was an audio-follows-video device, meaning that switching sources simultaneously switched audio and video. If you used the fader bar to dissolve between sources, it would also mix between the audio sources. This now became a human-controlled operation with the master control operator loading and cueing tapes, switching sources, and so on. Although manual, it proved to be superior to a playlist-driven automated system.
The operators effectively became radio station disk jockeys and those same guidelines applied. Our radio station program director selected music, set up a manual playlist, a “clock” for song genre and commercial rotation, and so on. Music videos sent to us by record labels would be copied to the M-format VHS tapes with a countdown and any added graphics, like music video song credits. Quite frankly, I have to say that our song selection was more diverse than the original MTV’s. In addition, having human operators allowed us to adjust timing on-the-fly in ways that an automated list couldn’t.
As ambitious as this project was, it had numerous flaws. The company was unable to get any cable provider to commit to a full channel as they had with MTV. Consequently, programming was offered to any broadcast station or cable company in any market on a first-come, first-served basis, but without a time requirement. If a small, independent TV station in a large market decided to contract for only a few hours on the weekend, then it locked up that entire market.
The other factor that worked against Odyssey was that Turner Broadcasting had already tried to launch their music channel with a LOT more money. Turner’s effort crashed and burned in a month. Needless to say, our little operation was viewed with much skepticism. Many would-be customers and advertisers decided to hold off at least a year to see if we’d still be in business at that time. Of course, that didn’t help our bottom line.
In spite of these issues, Odyssey hung on for ten months before the owner finally tossed in the towel. Even though it didn’t work out and I had moved on anyway, it was still a very fun experience that took me back to when I started out in radio.
“Jack of all trades, master of none” is a quote most are familiar with. But the complete quote, “Jack of all trades, master of none, but oftentimes better than master of one,” actually conveys quite the opposite meaning. In the world of post production you have Jacks and Jills of all trades (generalists) and masters of one (specialists). While editors are certainly specialized in storytelling, I would consider them generalists when comparing their skillset to those of other specialists, such as visual effects artists, colorists, and audio engineers. Editors often touch on sound, effects, and color in a more general (often temp) way to get client approval. The others have to deliver the best, final results within a single discipline. Editors have to know the tools of editing, but not the nitty gritty of color correction or visual effects.
This is closely tied to the Pareto Principle, which most know as the 80/20 Rule. This principle states that 80% of the consequences come from 20% of the causes, but it’s been applied in various ways. When talking about software development, the 80/20 Rule predicts that 80% of the users are going to use 20% of the features, while only 20% of users will find a need for the other features. The software developer has to decide whether the target customer is the generalist (the 80% user) or the specialist (the 20% user). If the generalist is the target, then the challenge is to add some specialized features to service the advanced user without creating a bloated application that no one will use.
Applying these concepts to editing software development
When looking at NLEs, the first question to ask is, “Who is defined as a video editor today?” I would separate editors into three groups. One group would be the “I have to do it all” group, which generates most of what we see on local TV, corporate videos, YouTube, etc. These are multi-discipline generalists who have neither the time nor interest in dealing with highly specialized software. In the case of true one-man bands, the skill set also includes videography, plus location lighting and sound.
The “top end” – national and international commercials, TV series, and feature films – could be split into two groups: craft (aka film or offline) editors and finishing (aka online) editors. Craft editors are specialists in molding the story, but generalists when it comes to working software. Their technical skills don’t have to be the best, but they need to have a solid understanding of visual effects, sound, and color, so that they can create a presentable rough cut with temp elements. The finishing editor’s role is to take the final elements from sound, color, and the visual effects houses, and assemble the final deliverables. A key talent is quality control and attention to detail; therefore, they have no need to understand dedicated color, sound, or effects applications, unless they are also filling one of these roles.
My motivation for writing this post stemmed from an open letter to Tim Cook, which many editors have signed – myself included. Editors have long been fans of Apple products and many gravitated from Avid Media Composer to Apple Final Cut Pro 1-7. However, when Apple reimagined Final Cut and dropped Final Cut Studio in order to launch Final Cut Pro X many FCP fans were in shock. FCPX lacked a number of important features at first. A lot of these elements have since been added back, but that development pace hasn’t been fast enough for some, hence the letter. My wishlist for new features is quite small. I recognize Final Cut for what it is in the Apple ecosystem. But I would like to see Apple work to raise the visibility of Final Cut Pro within the broader editing community. That’s especially important when the decision of which editing application to use is often not made by editors.
Blackmagic Design DaVinci Resolve – the über-app for specialists
This brings me to Resolve. Editors point to Blackmagic’s aggressive development pace and the rich feature set. Resolve is often viewed as the greener pasture over the hill. I’m going to take a contrarian’s point of view. I’ve been using Resolve since it was introduced as Mac software and recently graded a feature film that was cut on Resolve by another editor.
Unfortunately, the experience was more problematic than I’ve had with grades roundtripped to Resolve from other NLEs. Its performance as an editor was quite slow when trying to move around in the timeline, replace shots, or trim clips. Resolve wouldn’t be my first NLE choice when compared to Premiere Pro, Media Composer, or Final Cut Pro. It’s a complex program by necessity. The color management alone is enough to trip up even experienced editors who aren’t intimately familiar with what the various settings do with the image.
DaVinci Resolve is an all-in-one application that integrates editing (2 different editing models), color correction (aka grading), Fusion visual effects, and the Fairlight DAW. Historically, all-in-ones have not had a great track record in the market. Other such über-apps would include Avid|DS and Autodesk Smoke. Avid pulled the plug on DS and Autodesk changed their business model for the Flame/Smoke/Lustre product family into subscription. Neither DS nor Smoke as a standalone application moved the needle for market share.
At its core, Resolve is a grading application with Fusion and Fairlight added in later. Color, effects, and audio mixing are all specialized skills and the software is designed so that each specialist is comfortable with the toolset presented on those pages/modes. I believe Blackmagic has been attempting to capitalize on Final Cut editor discontent and create the mythical “FCP8” or “FC Extreme” that many wanted. However, adding completely new and disparate functions to an application that at its core is designed around color correction can make it quite unwieldy. Beginning editors are never going to touch most of what Resolve has to offer and the specialists would rather have a dedicated specialized tool, like Nuke, After Effects, or Pro Tools.
Apple Final Cut Pro – reimagining modern workflows for generalists
Apple makes software for generalists. Pages, Numbers, Keynote, Photos, GarageBand, and iMovie are designed for that 80%. Apple also creates advanced software for the more demanding user under the ProApps banner (professional applications). This is still “generalist” software, but designed for more complex workflows. That’s where Final Cut Pro, Motion, Compressor, and Logic Pro fit.
Apple famously likes to “skate to where the puck will be,” and having control over hardware, operating system, and software gives its teams special insight for developing software that is optimized for the hardware/OS combo. As a broad-based consumer goods company, Apple also understands market trends. In the case of iPhones and digital photography it also plays a huge role in driving trends.
When Apple launched Final Cut Pro X the goal was an application designed for simplified, modernized workflows – even if “Hollywood” wasn’t quite ready. This meant walking away from the comprehensive “suite of tools” concept (Final Cut Studio). They chose to focus on a few applications that were better equipped for where the wider market of content creators was headed – yet, one that could still address more sophisticated needs, albeit in a different way.
This reimagining of Final Cut Pro had several aspects to it. One was to design an application that could easily be used on laptops and desktop systems and was adaptable to single and dual screen set-ups. It also introduced workflows based on metadata to improve edit efficiency. It was intended as a platform with third parties filling in the gaps. This means you need to augment FCP to cover a few common industry workflows. In short, FCP is designed to appeal to a broad spectrum of today’s “professionals” and not how one might have defined that term in the early 1990s, when nonlinear editing first took hold.
For a developer, it gets down to who the product is marketed towards and which new features to prioritize. Generalists are going to grow the market faster, hence a better return on development resources. The more complex an application becomes, the more likely it is to have bugs or break when the hardware or OS is updated. Quality assurance testing (QA) expands exponentially with complexity.
Do my criticisms of Resolve mean that it’s a bad application? No, definitely not! It’s powerful in the right hands, especially if you work within its left-to-right workflow (edit -> Fusion -> color -> Fairlight). But, I don’t think it’s the ideal NLE for craft editing. The tools are designed for a collection of specialists. Blackmagic has been on this path for a rather long time now and seems to be at a fork in the road. Maybe they should step back, start from a clean slate, and develop a fresh, streamlined version of Resolve. Or, split it up into a set of individual, focused applications.
So, is Final Cut Pro the ideal editing platform? It’s definitely a great NLE for the true generalist. I’m a fan and use it when it’s the appropriate tool for the job. I like that it’s a fluid NLE with a responsive UI design. Nevertheless, it isn’t the best fit for many circumstances. I work in a market and with clients that are invested in Adobe Creative Cloud workflows. I have to exchange project files and make sure plug-ins are all compatible. I collaborate with other editors and more than one of us often touches these projects.
Premiere Pro is the dominant NLE for me in this environment. It also clicks with how my mind works and feels natural to me. Although you hear complaints from some, Premiere has been quite stable for me in all my years of use. Premiere Pro hits the sweet spot for advanced editors working on complex productions without becoming overly complex. Product updates over the past year have provided new features that I use every day. However, if I were in New York or Los Angeles, that answer would likely be Avid Media Composer, which is why Avid maintains such dominance in broadcast operations and feature film post.
In the end, there is no right or wrong answer. If you have the freedom to choose, then assess your skills. Where do you fall on the generalist/specialist spectrum? Pick the application that best meets your needs and fits your mindset.