Rethinking NLE design

The launch of Apple Final Cut Pro X stood the editing industry on its collective ear. It also sparked some very interesting – and occasionally heated – discussions about what the design of a modern, nonlinear editing application really should be. The NLEs we all use today are a collection of terminology, tools, designs and workflows borrowed from different technologies.

Film editing was the first so-called “nonlinear” editing method for picture and sound, because new clips could be spliced into the assembled sequence at any point. The clips that followed would simply move down in order, with a corresponding change in overall duration. You could build a movie, reel or scene in any order, since these were all independent pieces that you were free to re-arrange.

Videotape editing was “linear”, because edited sequences were locked to their recorded location on a master videotape reel. Ironically, the earliest tape editing involved physically splicing pieces of magnetic tape and was by definition also nonlinear – even if that wasn’t as practical as it was in film. Once electronic editing came about, clips were locked against time as a function of their recorded position on the tape, so making changes that altered clip order and/or program duration required the re-recording of all the footage that followed the point of the change. You tended to build a program in a linear fashion, from start to finish.
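To make the film-versus-tape contrast concrete, here is a minimal sketch of the two edit models, in Python rather than any real NLE’s code. The clip names and function names are purely illustrative.

```python
# A minimal sketch of the two edit models; clips are just (name, frames) pairs.

def splice_in(sequence, clip, at_index):
    """Film-style insert: clips after the splice point ripple down in order,
    and the overall duration grows by the new clip's length."""
    return sequence[:at_index] + [clip] + sequence[at_index:]

def overwrite(sequence, clip, at_index):
    """Tape-style overwrite: the new clip replaces whatever occupied that
    slot, so nothing downstream has to be shifted or re-recorded."""
    return sequence[:at_index] + [clip] + sequence[at_index + 1:]

reel = [("scene_1", 120), ("scene_2", 90), ("scene_3", 200)]

print(splice_in(reel, ("pickup", 45), 1))  # four clips; program gets longer
print(overwrite(reel, ("pickup", 45), 1))  # three clips; scene_2 is replaced
```

The splice-in ripple is why film was nonlinear in the first place: nothing downstream had to be rebuilt, it simply moved.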

Although electronic, random-access, nonlinear editing had a two-decade-long evolution, most younger editors recognize the original EMC2 and Avid Media Composer of the late 80s as the starting point for today’s software-based NLEs. These two pioneering systems mashed up functions from a variety of disciplines and have become the basis for current NLE design. Insert or splice-in edits and roller-style trimming came from film, as did bins and clips. Overwrite edits, a two-up (source-record) window display and digital video effects came from linear video editing. Copy-and-paste methods came from general computing and word processing. We all understood this paradigm … until June 2011.

Enter Final Cut Pro X. Along with some exciting new features, Apple decided to rethink what an NLE should be, amidst screams that they had thrown out 100 years of film editing grammar. For evidence, the critics point to the unified viewer, the magnetic timeline and the trackless timeline. As Apple is wont to do, the ProApps team designed the NLE that they thought was best for the future and not one built on evolutionary upgrades to the previous version’s feature set.

If you break the concept down, they have in many ways gone back to the core of editing as practiced by a film editor. That is, giving you tools built for editing story-centric timelines. Unlike previous NLE sequences, which place clips along a timeline based on a location in time, the FCP X timeline represents a series of connected clips organized along storylines. Clips have no fixed relationship to a position in time, but rather a parent-child relationship to each other. The single, unified viewer and the behavior of the magnetic timeline are actually quite similar to editing film on a Moviola or flatbed. Having combined A/V clips with locked, sync audio is not unlike the relationship of sprocketed film sound, where the physical sprockets prevent sync errors.
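As a rough illustration of that parent-child idea (my own sketch, built on a made-up Clip class, not Apple’s actual data model), connected clips can be anchored to a parent at an offset, so they ride along when the storyline ripples:

```python
# Hypothetical model of connected clips; not Apple's actual implementation.

class Clip:
    def __init__(self, name, duration):
        self.name = name
        self.duration = duration          # in frames
        self.connected = []               # (offset_into_parent, child) pairs

    def connect(self, child, offset):
        """Anchor a title, cutaway or sound effect to this clip. The child
        keeps its offset relative to the parent, not to timeline time."""
        self.connected.append((offset, child))

storyline = [Clip("interview_a", 300), Clip("interview_b", 240)]
storyline[1].connect(Clip("lower_third", 90), offset=10)

# Splicing a clip ahead of interview_b shifts it later in time, but the
# lower third stays in sync because it is anchored to its parent clip.
storyline.insert(1, Clip("b_roll", 150))
print([c.name for c in storyline])  # ['interview_a', 'b_roll', 'interview_b']
```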

So far the metaphors hold up for me, but the trackless timeline feels odd (at least for now) – especially for audio. The concept of tracks precedes the origins of editing. Think of a musical score used by a conductor. Multiple parallel staves define the music played by each instrument – violins on one, flutes on another and so on. This is no different from tracks in NLE software, and it’s a concept we’ve understood for hundreds of years. The same principles apply to film sound mixing, multi-channel audio production and live concert sound run through a mixing board, as well as to Apple Logic and GarageBand.

In an effort to simplify the track patching and target selection that tracks invariably require, Apple has chosen to make a more fluid timeline. Audio and video clips are not locked into any rigid vertical order. In fact, audio and video clips can live in completely different places from where a track-based system would organize them.

This has reduced the number of keystrokes needed to edit a clip into the timeline, but it has made it much harder to engage in any type of track-based audio mixing. It also makes it all but impossible to use the timeline as a sort of visual scratch pad in the way many film editors use it. To mitigate the impact, Apple has added Roles, with an eye towards managing some of this via metadata. We’ll see if the industry warms to that. I think it will take a few more updates to fully flesh out this feature.
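The idea behind Roles, as I understand it, is that a mixer could operate on groups defined by metadata instead of track positions. Here is a hedged sketch of that concept, with invented role names and clip data; the real feature is richer than this:

```python
# Conceptual sketch only: grouping clips by a metadata "role" tag instead
# of by track position. Role names and levels here are invented.
from collections import defaultdict

clips = [
    {"name": "vo_01", "role": "dialogue", "level_db": 0.0},
    {"name": "score", "role": "music", "level_db": -12.0},
    {"name": "door_slam", "role": "effects", "level_db": -6.0},
]

by_role = defaultdict(list)
for clip in clips:
    by_role[clip["role"]].append(clip)

# Pull all music down 3 dB in one pass, the kind of group move a
# track-based mixer would make with a fader on the music bus.
for clip in by_role["music"]:
    clip["level_db"] -= 3.0

print(by_role["music"])  # [{'name': 'score', 'role': 'music', 'level_db': -15.0}]
```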

Most of the concepts employed in FCP X are hardly new to NLEs. Many have been used in various other products at one time or another. Non-standard editing interfaces have generally not withstood the test of time. Witness Montage, EdiFlex, the original Adobe Premiere, ImMix, EditDroid and Quantel Harry. All of these used layouts vastly different from what we are used to now. Even Lightworks, which started out with a much different layout, eventually came to one based on the source-record-timeline model.

Apple is the only company that could do this and get away with it – and quite possibly have it pay off in the end. Avid would never do it. They’ve had Avid DS for years. It’s a great tool, but different enough from Media Composer that the veteran Avid base of Media Composer, Symphony and NewsCutter editors simply rejected early efforts to make it the flagship editing model. Apple, on the other hand, can afford to temporarily alienate their users without much effect on the bottom line. As such, they have time to hone the model, stomp out the bugs and gradually wait it out while a loyal core of new (and old) users grows to like the design of the product.

For me, FCP X (at version 10.0.2) is still a love-hate relationship. The redesign aids some parts of my workflow and hinders others. For example, on my own system I work with two displays and no external broadcast monitor. The unified viewer gives me a larger video window within the UI. Running the Events browser on the left in the list view lets me quickly and visually identify footage as I arrow down through the list. The clip I’m on appears in the filmstrip at the top. On other NLEs this normally requires double-clicking each clip to load it into the source window and scrubbing through it to see what it is. Or setting the bin to thumbnail view, which I find useless most of the time.

FCP X isn’t the best tool for all projects, yet. It may never work for feature film editors the way “classic” FCP did. Right or wrong, Apple certainly isn’t waiting to hear how a small and notoriously change-resistant niche feels about FCP X. After all, “video literacy” will likely drive tremendous sales for the product. Whatever the future holds, it sure has us thinking about what we need and don’t need in editing software. We’re not just comparing features, but really thinking about which NLE actually lets us cut better and have more fun while we’re doing it.

Click here for Part II.

©2011 Oliver Peters