The State of the NLE 2019

It’s a new year, but that doesn’t mean the editing software landscape will change drastically in the coming months. For all intents and purposes, professional editing options boil down to four choices: Avid Media Composer, Adobe Premiere Pro, Apple Final Cut Pro X, and Blackmagic Design DaVinci Resolve. Yes, I know Vegas, Lightworks, Edius, and others are still out there, but those are barely on the radar by comparison (no offense meant to any happy practitioners of these tools). Naturally, since blogs are mainly about opinions, everything I say from here on is purely conjecture, although it’s informed by my own experiences with these tools and by knowing many of the players involved on the respective product design and management teams – past and present.

Avid continues to be the go-to NLE in the feature film and episodic television world. That’s certainly a niche, but it’s a niche that determines the tools developed by designers for the broader scope of video editing. Apple officially noted two million users for Final Cut Pro X last year and I’m sure it’s likely to be at least 2.5M by now. Adobe claims Premiere Pro to be the most widely used NLE by a large margin. I have no reason to doubt that statement, but I have also never seen any actual stats. I’m sure that through the Creative Cloud subscription mechanism Adobe not only knows how many Premiere Pro installations have been downloaded, but probably has a good idea as to actual usage (as opposed to simply downloading the software). Bringing up the rear in this quartet is Resolve. While it is certainly the dominant color correction application, I don’t yet see it as a key player in the creative editing (as opposed to finishing) space. With the stage set, let’s take a closer look.

Avid Media Composer

Editors who have moved away from Media Composer, or who have never used it, like to throw shade on Avid and its marquee product. But loyal users – who include some of the biggest names in film editing – stick by it due in part to familiarity, but also to its collaborative features and overall stability. As a result, the development pace and rate of change are somewhat slow compared with the other three. In spite of that, Avid is currently on a schedule of a solid, incremental update nearly every month – each of which chips away at a long feature request list. The most recent one dropped on December 31st. Making significant changes without destroying the things that people love is a difficult task. Development pace is also hindered by the fact that each of these developers is also chasing changes in the operating system, particularly Apple and macOS. Sometimes you get the feeling that it’s two steps forward, one step back.

As editors, we focus on Media Composer, but Avid is a much bigger company than just that, with its fingers in sound, broadcast, storage, cloud, and media management. If you are a Pro Tools user, you are just as concerned about Avid’s commitment to you as editors are about its commitment to them. Like any large company, Avid must advance not just a single core product, but its entire ecosystem of products. Yet it still must advance the features in these products, because that’s what gets users’ attention. In an effort to attract new users, Avid has introduced subscription plans and free versions that make it easier to get started. These now cover editing and sound needs with a lower cost-of-entry than ever before.

I started nonlinear editing with Avid and it will always hold a spot in my heart. Truth be told, I use it much less these days. However, I still maintain current versions for the occasional project need, plus compatibility with incoming projects. I often find that Media Composer is the single best NLE for certain tasks, mainly because of Avid’s legacy in broadcast. This includes issues like proper treatment of interlaced media and closed captioning. So for many reasons, I don’t see Avid going away any time soon, but whether or not it can grow its base remains unknown. Fortunately, many film and media schools emphasize Avid when they teach editing. If you know Media Composer, it’s an easy jump to any other editing tool.

Adobe Premiere Pro CC

The most widely used NLE? At least from what I can see around me, it’s the most used NLE in my market, including individual editors, corporate media departments, and broadcasters. Its attraction comes from a) the versatility in editing with a wide range of native media formats, and b) the similarity to – and viable replacement for – Final Cut Pro “legacy”. It picked up steam partly as a reaction to the Final Cut Pro X roll-out and users have generally been happy with that choice. While the shift by Adobe to a pure subscription model has been a roadblock for some (who stopped at CS6), it’s also been an advantage for others. I handle the software updates at a production company with nine edit systems and between the Adobe Creative Cloud and Apple Mac App Store applications, upgrades have never been easier.

A big criticism of Adobe has been Premiere’s stability. Of course, that’s based on forum reads, where people who have had problems will pipe up. Rarely does anyone ever post how uneventful their experience has been. I personally don’t find Premiere Pro to be any less stable than any other NLE or application. Nonetheless, working with a mix of oddball native media will certainly tax your system. Avid and Apple get around this by pushing optimized and proxy media. As such, editors reap the benefits of stability. And the same is true with Premiere. Working with consistent, optimized media formats (transcoded in advance) – or working with Adobe’s own proxies – results in a more stable project and a better editing experience.

Avid Media Composer is the dominant editing tool in major markets, but mainly in the long-form entertainment media space. Many of the top trailer and commercial edit shops in those same markets use Premiere Pro. Again, that goes back to the FCP7-to-Premiere Pro shift. Many of these companies had been using the old Final Cut rather than Media Composer. Since some of these top editors also cut features and documentaries, you’ll often see them use Premiere on the features that they cut, too. Once you get below the top tier of studio films and larger broadcast network TV shows, Premiere Pro has a much wider representation. That certainly is good news for Adobe and something for Avid to worry about.

Another criticism is that of Adobe’s development pace. Some users believed that moving to a subscription model would speed the development of new versions – independent of annual or semi-annual cycles. Yet cycles still persist – much to the disappointment of those users. This gets down to how software is actually developed, keeping up with OS changes, and to some degree, marketing cycles. For example, if there’s a big Photoshop update, then it’s possible that the marketing “wow” value of a large Premiere Pro update might be overshadowed and need to wait. Not ideal, but that’s the way it is.

Just because it’s possible, doesn’t mean that users really want to constantly deal with automatic software updates that they have to keep track of. This is especially true with After Effects and Premiere Pro, where old project files often have to be updated once you update the application. And those updates are not backwards compatible. Personally, I’m happy to restrict that need to a couple of times a year.

Users have the fear that a manufacturer is going to end-of-life their favorite application at some point. For video users, this was made all too apparent by Apple and FCPX. Neither Apple nor Adobe has been exempt from killing off products that no longer fit their plans. Markets and user demands shift. Photography is an obvious example here. In recent years, smart phones have become the dominant photographic device, which has enabled cloud-syncing and storage of photos. Adobe and Apple have both shifted the focus for their photo products accordingly. If you follow any of the photo blogs, you’ll know there’s some concern that Adobe Lightroom Classic (the desktop version) will eventually give way completely to Lightroom CC (the cloud version). When a company names something as “classic”, you have to wonder how long it will be supported.

If we apply that logic to Premiere Pro, then the new Adobe Rush comes to mind. Rush is a simpler, nimbler, cross-platform/cross-device NLE targeted at users who produce video starting with their smart phone or tablet. Since there’s also a desktop version, one could certainly surmise that in the future Rush might replace Premiere Pro in the same way that FCPX replaced FCP7. Personally, I don’t think that will happen any time soon. Adobe treats certain software as core products. Photoshop, Illustrator, and After Effects are such products. Premiere Pro may or may not be viewed that way internally, but it certainly is more so now than ever in the past. Premiere Pro is being positioned as a “hub” application with connections to companion products, like Prelude and Audition. For now, Rush is simply an interesting offshoot to address a burgeoning market. It’s Adobe’s second NLE, not a replacement. But time will tell.

Apple Final Cut Pro X

Apple released Final Cut Pro X in the summer of 2011 – going on eight years now. It’s a versatile, professional tool that has improved greatly since that 2011 launch and gained a large and loyal fan base. Many FCPX users are also Premiere Pro users and the other way around. It can be used to cut nearly any type of project, but the interface design is different from the others, making it an acquired taste. Being a Mac-only product and developed within the same company that makes the hardware and OS, FCPX is optimized to run on Macs more so than any cross-platform product can be. For example, the fluidity of dealing with 4K ProRes media on even older Macs surpasses that of any other NLE.

Prognosticating Apple’s future plans is a fool’s errand. Some guesses have put the estimated lifespan of FCPX at 10 years, based in part on the lifespan of FCP “legacy”. I have no idea whether that’s true or not. Often when I read interviews with key Apple management (as well as in off-the-record, casual discussions I’ve had with people I know on the inside), it seems like a company that actually has less of a concrete plan when it comes to “pro” users. Instead, it often appears to approach them with an attitude of “let’s throw something against the wall and see what sticks”. The 2013 Mac Pro is a striking example of this. It was clearly innovative and a stellar exhibit for Apple’s “think different” mantra. Yet it was a product that obviously was not designed by actually speaking with that product’s target user. Apple’s current “shunning” of Nvidia hardware seems like another example.

One has to ask whether a company so dominated by the iPhone is still agile enough to respond to the niche market of professional video editors. While Apple products (hardware and software) still appeal to creatives and video professionals, it seems like the focus with FCPX is towards the much broader sphere of pro video. Not TV shows and feature films (although that’s great when it comes) – or even high-end commercials and trailers – but rather the world of streaming channels, social media influencers, and traditional publishers who have shifted to an online media presence from a print legacy. These segments of the market have a broad range of needs. After all, so-called “YouTube stars” shoot with everything from low-end cameras and smart phones all the way up to Alexas and REDs. Such users are equally professional in their need to deliver a quality product on a timetable and I believe that’s a part of the market that Apple seeks to address with FCPX.

If you are in the world of the more traditional post facility or production company, then those users listed above may be market segments that you don’t see or possibly even look down upon. I would theorize that among the more traditional sectors, FCPX may have largely made the inroads that it’s going to. Its use in films and TV shows (with the exception of certain high-profile, international examples) doesn’t seem to be growing, but I could be wrong. Maybe the marketing is just behind or it no longer has PR value. Regardless, I do see FCPX as continuing strong as a product. Even if it’s not your primary tool, it should be something in your toolkit. Apple’s moves to open up ProRes encoding and offering LumaForge and Blackmagic eGPU products in their online store are further examples that the pro customer (in whatever way you define “pro”) continues to have value to them. That’s a good thing for our industry.

Blackmagic Design DaVinci Resolve

No one seems to match the development pace of Blackmagic Design. DaVinci Resolve underwent a wholesale transformation from a tool that was mainly a high-end color corrector into an all-purpose editing application. Add to this the fact that Blackmagic has acquired a number of companies whose tools have been modernized and integrated into Resolve. Blackmagic now offers a post-production solution with some similarities to FCPX while retaining a traditional, track-based interface. It includes modes for advanced audio post (Fairlight) and visual effects (Fusion) that have been adapted from those acquisitions. Unlike past all-in-one applications, Resolve’s modal pages retain the design and workflow specific to the task at hand, rather than making them fit into the editing application’s interface design. All of this has happened in very short order and across three operating systems, making Blackmagic’s pace the envy of the industry.

But a fast development pace doesn’t always translate into a winning product. In my experience each version update has been relatively solid. There are four ways to get Resolve (free and paid, Mac App Store and reseller). That makes it a no-brainer for anyone starting out in video editing, but who doesn’t have the specific requirement for one application over another. I have to wonder though, how many new users go deep into the product. If you only edit, there’s no real need to tap into the Fusion, Fairlight, or color correction pages. Do Resolve editors want to finish audio in Fairlight or would they rather hand off the audio post and mix to a specialist who will probably be using Pro Tools? The nice thing about Resolve is that you can go as deep as you like – or not – depending on your mindset, capabilities, and needs.

On the other hand, is the all-in-one approach better than the alternatives: Media Composer/Pro Tools, Premiere Pro/After Effects/Audition, or Final Cut Pro X/Motion/Logic Pro X? I don’t mean for the user, but rather the developer. Does the all-in-one solution give you the best product? The standalone version of Fusion is more full-featured than the Fusion page in Resolve. Fusion users are rightly concerned that the standalone will go away, leaving them with a smaller subset of those tools. I would argue that there are already unnecessary overlaps in effects and features between the pages. So are you really getting the best editor or is it being compromised by the all-in-one approach? I don’t know the answer to these questions. Resolve for me is a good color correction/grading application that can also work for my finishing needs (although I still prefer to edit in something else and roundtrip to/from Resolve). It’s also a great option for the casual editor who wants a free tool. Yet in spite of all its benefits, I believe Resolve will still be a distant fourth in the NLE world, at least for the next year.

The good news is that there are four great editing options in the lead and even more coming from behind. There are no bad choices and with a lower cost than ever, there’s no reason to limit your knowledge to only one. After all, the products that are on top now may be gone in a decade. So broaden your knowledge and define your skills by your craft – not your tools!

©2019 Oliver Peters

Preparing your Film for Distribution

First-time filmmakers are elated when their film finally gets picked up for distribution. But the hardest work may be next. Preparing your film and companion materials can be a very detailed and complex endeavor if you didn’t plan for it properly from the outset. While each distributor and/or network has slightly different specs, the general requirements are the same. Here are the more common ones.

1. Film master. Supplying a master file is self-evident, but the exact details are not consistent across the board. Usually some additional post will be required when you get distribution. You will need to add the distributor’s logo animation up front, make sure the first video starts at a specified timecode, and that you have audio channels in a certain configuration (see Item 2).

In spite of the buzz over 4K, many distributors still want 1920×1080 files at 23.98fps (or possibly 24.0fps) – usually in the Apple ProRes HQ* video codec. The frame rate may differ for broadcast-oriented films, such as documentaries. In that case, 29.97fps might be required. Also, some international distributors will require 25.0fps. If you have any titles over the picture, then “textless” material must also be supplied. Generally, you can add those sections, such as the video under opening titles, at the end of the master, following the end credits of the film.

*Occasionally film festivals and some distributors will also require a DCP package instead of a single QuickTime or MXF master file.

2. Audio mixes and tracks. Stereo and/or 5.1 surround mixes are the most commonly requested audio configurations. You’ll often be asked to supply both the full mixes and the “stems”. The latter are separate submixes of only dialogue, sound effects, and music. Some distributors want these stems as separate files, while others want them attached to the master file. These are easy to supply if the film was originally mixed with that in mind. But if your mixer only produced a final mix, then it’s a lot harder to go back and create new stem tracks. A typical channel assignment on a delivery master is ten tracks: six for the 5.1 surround mix (L, R, C, LFE, Ls, Rs), two for the stereo mix (left, right), and two for a stereo M&E mix (combined music and effects, minus the dialogue).
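Counted track by track, one common delivery layout looks like this (an illustrative example only – channel order and track counts vary between distributors, so always confirm the written spec you are given):

```
Tracks 1–6:   5.1 surround mix (L, R, C, LFE, Ls, Rs)
Tracks 7–8:   Stereo mix (left, right)
Tracks 9–10:  Stereo M&E (music and effects, no dialogue)
```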

3. Subtitles and captions. In order to be compliant with various accessibility regulations, you will likely have to supply closed captioning sidecar files that sync to your master. There are numerous formats and several NLEs allow you to create these. However, it’s far easier and usually more accurate to have a service create your files. There are numerous vendors, with prices starting as low as $1/minute. Closed captions should not be confused with subtitles, also called open captions. These appear on-screen and are common when someone is speaking in another language. Check with your distributor if this applies to you, because they may want the video without titles, in the event of international distribution.
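For reference, caption and subtitle sidecar files are simple timecoded text. Below is a minimal two-cue sketch in the widely supported SubRip (.srt) format – the timings and wording are invented for illustration, and note that broadcasters frequently require other formats, such as SCC or TTML, instead:

```
1
00:00:12,000 --> 00:00:15,500
We never should have come back here.

2
00:00:16,250 --> 00:00:17,900
[door slams]
```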

4. Legal documentation. There’s a wide range of paperwork that you should be prepared to turn over. This includes licensing for any music and stock footage, talent releases, contracts, and deal memos. One important element is “chain-of-title”: you must be able to prove that you own the rights to the story and the film. Music is often a sticking point for indie filmmakers. If you used temp music or had a special deal for film festival showings, now is the time to pay up. You won’t get distribution until all music is clearly licensed. Music info should also include a cue sheet (song names, length, and position within the film).

5. Errors and omissions insurance. This is a catch-all policy you’ll need to buy to satisfy many distributors. It’s designed to cover you in the event that there’s a legal claim (frivolous or otherwise) against the film. For example, if someone comes out of the woodwork saying that you ripped them off and stole their story idea and that you now owe them money.

6. Trailer. Distributors often request a trailer to be used to promote the film. The preference seems to be that the trailer is under two minutes in length. It may or may not need to include the MPAA card at the front and should have a generic end tag (no “coming soon” or date at the end). Often a simple stereo mix will be fine, but don’t take that for granted. If you are going through full sound post anyway in creating a trailer, be sure to generate the full audio package – stereo and surround mixes and splits in various combinations, just like your feature film master.

7. Everything else. Beyond this list, you’ll often be asked for additional “nice to have” items. These include screeners (DVD or web), behind-the-scenes press clips or photos, frame grabs from the film, a final script, biographies of the creative team and lead actors, as well as a poster image.

As you can see, none of this seems terribly difficult if you are aware of these needs going in. But if you have prepared none of this in advance, it will become a mad scramble at the end to keep the distributor happy.

Originally written for RedShark News

©2018 Oliver Peters

Five Decades of Edit Suite Evolution

I spent last Friday setting up two new Apple iMac Pros as editing workstations. When I started as an editor in the 1970s, it was the early days of computer-assisted video editing. Edit suites (or bays) were intended for either “offline” editing with simple hardware, where creative cutting was the goal – or they were “online”, designed for finishing and used the most expensive gear. Sometimes the online bay would do double-duty for both creative and final post.

The minimum investment for such a linear edit suite would include three 2” videotape recorders, a video switcher (vision mixer), edit controller, audio mixer, and a small camera for titles and artwork. Suites were designed with creature comforts, since clients would often spend days at a time supervising the edit session. Before smart phones and the internet, clients welcomed the chance to get out of the office and go to the edit. Outfitting one of these edit suites would start at several hundred thousand dollars.

At my current edit gig, the company runs nine Mac workstations within a footprint that would have only supported three edit suites of the past, including a centralized machine room. Clients rarely come to supervise an edit, so the layout is more akin to the open office plan of a design studio. Editing can be self-contained on a Mac or PC and editors work in a more collegial, collaborative environment. There’s one “hero” room for when clients do decide to drop in.

In these five decades, computer-assisted editing has gone through four phases:

Phase 1 – Offline and online edit suites, primarily based on linear videotape technology.

Phase 2 – Nonlinear editing took hold with the introduction of Avid, EMC, Media 100, and Lightworks. The resolution was too poor for finishing, but the systems were ideal for the creative process. VTR-based linear rooms still handled finishing.

Phase 3 – As the quality improved, nonlinear systems could deliver finished masters. But camera acquisition and delivery were still centered on videotape. Nonlinear systems still had to be able to output to tape, which required specialized I/O hardware.

Phase 4 (current) – Editing is completely based around the computer. Most general-purpose desktop and even laptop computers are capable of the whole gamut of post services without specialized hardware, which has become optional. The full shift to Phase 4 came when file-based acquisition and delivery became the norm.

This transition brought about a sea change in cost, workflow, facility design, and talent needs. It has been driven by technology, but also a number of socioeconomic factors.

1. Technology always advances. Computers get more powerful at a lower cost point. Moore’s Law and all that. Although our demands increase – SD, HD, 4K, 8K, and beyond – computers, so far, have not been outpaced. I can edit 4K today with an investment of under $10K, which was impossible in 1980, even with an investment of $500K or more. This cost reduction also applies to shared storage solutions (NAS and SAN systems). They are cheaper, easier to install, and more reliable than ever. Even the smallest production company can now afford to design editing around the collaboration of several editors and workstations.

2. The death of videotape came with the 2011 Tohoku earthquake and tsunami in Japan that disabled the Fukushima nuclear plant. A byproduct of this natural disaster was that it damaged the Sony videotape manufacturing plant, putting supplies of HDCAM-SR stock on indefinite backorder. This pointed to the vulnerability of videotape and hastened the acceptance of file-based delivery for masters by key networks and distributors.

3. Interactions with clients – and human beings in general – have changed, thanks to smartphones, personal computers, and the internet. While both good and bad, the result is a shift in our communication with clients. Most of the time, edit session review and approval is handled over internet services. Post your cut. Get feedback. Make your changes and post again. Repeat. Along with a smaller hardware footprint than in the past, this is one of the prime reasons that room designs have changed. You don’t need a big, comfortable edit suite designed for clients if they aren’t going to come. A smaller room will do, as long as your editors are happy and productive.

Such a transition isn’t new. It’s been mirrored in the worlds of publishing, graphic design, and recording studios. Nevertheless, it is interesting to look back at how far things have come. Naturally, some will view this evolution as a threat and others as filled with opportunities. And, of course, where it goes from here is anyone’s guess.

All I know is that setting up two edit systems in a day would have been inconceivable in 1975!

Originally written for RedShark News

To hear a bit more about the changes and evolution of facilities, check out the Dec. 13th edition of the Digital Production Buzz.

©2018 Oliver Peters

Viva Las Vegas – NAB 2018

As more and more folks get all of their information through internet sources, the running question is whether or not trade shows still have value. A show like the annual NAB (National Association of Broadcasters) Show in Las Vegas is both fun and grueling, typified by sensory overload and folks in business attire with sneakers. Although some announcements are made before the exhibits officially open – and nearly all are pretty widely known before the week ends – there still is nothing quite like being there in person.

For some, other shows have taken the place of NAB. The annual HPA Tech Retreat in the Palm Springs area is a gathering of technical specialists, researchers, and creatives that many consider the TED Talks for our industry. For others, the Cine Gear Expo in LA is the prime showcase for grip, lighting, and camera offerings. RED Camera has focused on Cine Gear instead of NAB for the last couple of years. And then, of course, there’s IBC in Amsterdam – the more humane version of NAB in a more pleasant setting. But for me, NAB is still the main event.

First of all, the NAB Show isn’t merely about the exhibit floor at the sprawling Las Vegas Convention Center. Actual NAB members can attend various sessions and workshops related to broadcasting and regulations. There are countless sidebar events specific to various parts of the industry. For editors that includes Avid Connect – a two-day series of Avid presentations in the weekend leading into NAB; Post Production World – a series of workshops, training sessions, and presentations managed by Future Media Concepts; as well as a number of keynote presentations and artist gatherings, including SuperMeet, FCPexchange, and the FCPX Guru Gathering. These are places where you’ll rub shoulders with some well-known editors, colorists, artists, and mixers, learn about new technologies like HDR (high dynamic range imagery), and occasionally see some new product features from vendors who might not officially be on the show floor with a booth, like Apple.

One of the biggest benefits I find in going to NAB is simply walking the floor and checking out the companies and products that might not get a lot of attention. These newcomers often have the most innovative technologies, and it’s these discoveries that were never on your radar prior to that week.

The second benefit is connection. I meet up again in person with friends that I’ve made over the years – both other users, as well as vendors. Often it’s a chance to meet people that you might only know through the internet (forums, blogs, etc.) and to get to know them just a bit better. A bit more of that might make the internet more friendly, too!

Here are some of my random thoughts and observations from Las Vegas.

__________________________________

Editing hardware and software – four As and a B

Apple uncharacteristically pre-announced their new features just prior to the show, culminating with App Store availability on Monday when the NAB exhibits opened. This includes new Final Cut Pro X/Motion/Compressor updates and the official number of 2.5 million FCPX users. That’s a growth of 500,000 users in 2017, the biggest year to date for Final Cut. The key new feature in FCPX is a captioning function to author, edit, and export both closed and embedded (open) captions. There aren’t many great solutions for captioning and the best to date have been expensive. I found that the Apple approach was now the best and easiest to use that I’ve seen. It’s well-designed and should save time and money for those who need to create captions for their productions – even if you are using another brand of NLE. Best of all, if you own FCPX, you already have that feature. When you don’t have a script to start out, then manual or automatic transcription is required as a starting point. There is now a tie-in between Speedscriber (also updated this week) and FCPX that will expedite the speech-to-text function.

The second part of Apple’s announcement was the introduction of a new camera raw codec family – ProResRAW and ProResRAW HQ. These are acquisition codecs designed to record the raw sensor data from Bayer-pattern sensors (prior to debayering the signal into RGB information) and make that available in post, just like RED’s REDCODE RAW or CinemaDNG. Since this is an acquisition codec and NOT a post or intermediate codec, it requires a partnership on the production side of the equation. Initially this includes Atomos and DJI. Atomos supplies an external recorder, which can record the raw output from various cameras that offer the ability to record raw data externally. This currently includes their Shogun Inferno and Sumo 19 models. As this is camera-specific, Atomos must then create the correct profile by camera to remap that sensor data into ProResRAW. At the show, this included several Canon, Sony, and Panasonic cameras. DJI does this in-camera on the Inspire 2.

The advantage with FCPX is that ProResRAW is optimized for post, thus allowing for more streams in real-time. ProResRAW data rates (variable) fall between those of ProRes and ProRes HQ, while the less compressed ProResRAW HQ rates are between ProRes HQ and ProRes 4444. It’s very early days for this new codec, so additional camera and post vendors will likely add ProResRAW support over the coming year. It is currently unknown whether or not any other NLEs can support ProResRAW decode and playback.

As always, the Avid booth was quite crowded and, from what I heard, Avid Connect was well attended with enthused Avid users. The Avid offerings are quite broad and hard to encapsulate into any single blog post. Most, these days, are very enterprise-centric. But this year, with a new CEO at the helm, Avid’s creative tools have been reorganized into three strata – First, standard, and Ultimate. This applies to Sibelius, Pro Tools, and Media Composer. In the case of Media Composer, there’s Media Composer | First – a fully functioning free version with minimal restrictions; Media Composer; and Media Composer | Ultimate – which includes all options, such as PhraseFind, ScriptSync, NewsCutter, and Symphony. The big difference is that project sharing has been decoupled from Media Composer. This means that if you get the “standard” version (just named Media Composer), it will not be enabled for collaboration on a shared storage network. That will require Media Composer | Ultimate. So Media Composer (standard) is designed for the individual editor. There is also a new subscription pricing structure, which places Media Composer at about the same annual cost as Adobe Premiere Pro CC (single-app license). The push is clearly towards subscription; however, you can still purchase and/or maintain support for perpetual licenses, though it’s a little harder to find that info on Avid’s store website.

Though not as big a news item, Avid is also launching the Avid DNxID capture/export unit. It is custom-designed by Blackmagic Design for Avid and uses a small form factor. It was created for file-based acquisition, supports 4K, and includes embedded DNx codecs for onboard encoding. Connections include component analog and HDMI, as well as an SD card slot.

The traffic around Adobe’s booth was thick the entire week. The booth featured interesting demos that were front and center in the middle of one of the South Hall’s main thoroughfares, generally creating a bit of a bottleneck. The newest Creative Cloud updates had preceded the show, but were certainly new to anyone not already using the Adobe apps. Big news for Premiere Pro users was the addition of automatic ducking that was brought over from Audition, and a new shot matching function within the Lumetri color panel. Both are examples of Adobe’s use of their Sensei AI technology. Not to be left out, Audition can now also directly open sequences from Premiere Pro. Character Animator had been in beta form, but is now a full-fledged CC product. And for puppet control Adobe also introduced the Advanced Puppet Engine for After Effects. This is a deformation tool to better bend, twist, and control elements.

Of course, when it comes to NLEs, the biggest buzz has been over Blackmagic Design’s DaVinci Resolve 15. The company has an extensive track record of buying up older products whose companies weren’t doing so well, reinvigorating the design, reducing the cost, and breathing new life into them – often for a new, wider customer base. Nowhere is this more evident than with Resolve, which has now grown from a leading color correction system into a powerful, all-in-one edit/mix/effects/color solution. We had previously seen the integration of the Fairlight audio mixing engine. This year, Fusion visual effects were added. As before, each of these disparate tools appears on its own page with a UI optimized for that task.

A number of folks have quipped that someone had finally resurrected Avid DS. Although all-in-ones like DS and Smoke haven’t been hugely successful in the past, Resolve’s price point is considerably more attractive. The Fusion integration means that you now have a subset of Fusion running inside of Resolve. This is a node-based compositor, which makes it easy for a Resolve user to understand, since it, too, already uses nodes in the color page. At least for now, Blackmagic Design intends to also maintain a standalone version of Fusion, which will offer more functions for visual effects compositing. Resolve also gained new editorial features, including tabbed sequences, a pancake timeline view, captioning, and improvements in the Fairlight audio page.
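For editors who haven’t worked node-based, the concept behind Fusion (and Resolve’s color page) is that each operation is a node in a directed graph, and the image flows downstream through it. A toy Python sketch of that idea – purely conceptual, not Fusion’s actual API, with the “image” reduced to a single brightness value for brevity:

```python
# Toy node graph: each node applies its operation to the results of
# its upstream inputs - the same mental model as Fusion's flow view
# or Resolve's color-page node tree. Hypothetical, illustrative only.

class Node:
    def __init__(self, fn, *inputs):
        self.fn = fn          # the operation this node performs
        self.inputs = inputs  # upstream nodes feeding it

    def evaluate(self):
        # Pull-based evaluation: resolve upstream nodes first,
        # then apply this node's operation to their outputs.
        upstream = [n.evaluate() for n in self.inputs]
        return self.fn(*upstream)

# "Image" here is just one brightness value, to keep the sketch tiny.
source = Node(lambda: 0.5)                # media-in node
gain   = Node(lambda v: v * 1.2, source)  # brighten by 20%
lift   = Node(lambda v: v + 0.05, gain)   # raise the blacks
print(lift.evaluate())  # ~0.65
```

The appeal for colorists is that any node can be bypassed, reordered, or branched without rebuilding the chain – which is why the paradigm transfers so naturally from Resolve’s color page to Fusion compositing.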

Other Blackmagic Design news includes updates to their various mini-converters, updates to the Cintel Scanner, and the announcement of a 4K Pocket Cinema Camera (due in September). They have also redesigned and modularized the Fairlight console mixing panels. These are now more cost-effective to manufacture and can be combined in various configurations.

This was the year for a number of milestone anniversaries, such as the 100th for Panasonic and the 25th for AJA. There were a lot of new product announcements at the AJA booth, but a big one was the push for more OpenGear-compatible cards. OpenGear is an open hardware rack standard that was developed by Ross and embraced by many manufacturers. You can purchase the OpenGear version of a manufacturer’s product and then mix and match a variety of OpenGear cards within any OpenGear rack enclosure. AJA’s cards also offer DashBoard support – a software tool to configure and control the cards. There are new KONA SDI and HDMI cards, HDR support in the Io 4K Plus, and HDR capture and playback with the Ki Pro Ultra Plus.

HDR

It’s fair to say that we are all learning about HDR, but from what I observed on the floor, AJA is one of the only companies with a range of hardware products for handling HDR. This is thanks to their partnership with ColorFront, which is handling the color science in these products. The line includes the FS | HDR – an up/down/cross, SDR/HDR synchronizer/converter – which also supports the Tangent Element Kb panel. The FS | HDR was a tech preview last year, but is a shipping product now. This year’s tech preview product is the HDR Image Analyzer, which offers waveform and histogram monitoring at up to 4K/60fps.

Speaking of HDR (high dynamic range) and SDR (standard dynamic range), I had a chance to sit in on Robbie Carman’s (colorist at DC Color, Mixing Light) Post Production World HDR overview. Carman has graded numerous HDR projects, and from his presentation – coupled with exhibits on the floor – it’s quite clear that HDR is the wild, wild west right now. There is much confusion about color space and dynamic range, not to mention what current hardware is capable of versus the maximums expressed in the tech standards. For example, the BT.2020 spec doesn’t inherently mean that an image is HDR, nor do you have to be working in 4K to have HDR – although the set must accept the HDMI 2.0 standard to receive it.

High dynamic range grading absolutely requires HDR-compatible hardware, such as the proper I/O device and a display with the ability to receive the metadata that turns on and sets its target HDR values. This means investing in a device like AJA’s Io 4K Plus or Blackmagic’s UltraStudio 4K Extreme 3. It also means purchasing a true grading monitor costing tens of thousands of dollars, like one from Sony, Canon, or Flanders. You CANNOT properly grade HDR based on the image of ANY computer display. So while the latest version of FCPX can handle HDR, and an iMac Pro screen features a high nits rating, you cannot rely on this screen to see proper HDR.

LG was a sponsor of the show and LG displays were visible in many of the exhibits. Many of their newest products qualify at the minimum HDR spec, but for the most part, the images shown on the floor were simply bright and not HDR – no matter what the sales reps in the booths were saying.

One interesting fact that Carman pointed out was that HDR displays cannot be driven across the full screen at their highest value. You cannot display a full screen of white at 1,000 nits on a 1,000-nit display without causing damage. Therefore, automatic gain adjustments in the set’s electronics dim the screen. Only a smaller percentage of the image (perhaps 20%) can be driven at full value before dimming occurs. Another point Carman made was that standard lift/gamma/gain controls may be too coarse to grade HDR images with finesse. His preference is to use Resolve’s log grading controls, because they allow more precise adjustments to highlight and shadow values.
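For context on what those nit figures mean to a grading pipeline: HDR10 and Dolby Vision encode luminance with the SMPTE ST 2084 (PQ) transfer function, which maps absolute nits to the normalized signal the display receives. A minimal Python sketch, using the constants published in the ST 2084 spec:

```python
# SMPTE ST 2084 (PQ) transfer function - maps absolute luminance in
# nits to a normalized 0-1 signal and back. Constants per the spec.

M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.84
C1 = 3424 / 4096         # ~0.8359
C2 = 2413 / 4096 * 32    # ~18.85
C3 = 2392 / 4096 * 32    # ~18.69

def nits_to_pq(nits: float) -> float:
    """Inverse EOTF: luminance (0-10,000 nits) -> PQ signal (0-1)."""
    y = max(nits, 0.0) / 10000.0
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

def pq_to_nits(signal: float) -> float:
    """EOTF: normalized PQ signal (0-1) -> luminance in nits."""
    p = signal ** (1 / M2)
    y = max(p - C1, 0.0) / (C2 - C3 * p)
    return 10000.0 * y ** (1 / M1)

# SDR reference white (100 nits) lands near the middle of the PQ
# signal range; 1,000 nits uses only about three quarters of it.
print(round(nits_to_pq(100), 2))   # ~0.51
print(round(nits_to_pq(1000), 2))  # ~0.75
```

This is why grading-monitor metadata matters: the same code value means a very different brightness depending on whether the display interprets the signal as PQ or as a conventional gamma curve.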

Cameras

I’m not a camera guy, but there was notable camera news at the show. Many folks really like the Panasonic colorimetry for which the Varicam products are known. For people who want a full-featured camera in a small form factor, look no further than the Panasonic AU-EVA1. It’s a 4K, Super35, handheld cinema camera featuring dual ISOs. Panasonic claims 14 stops of latitude. It will take EF lenses and can output camera raw data. When paired with an Atomos recorder, it will be able to record ProResRAW.

Another new camera is Canon’s EOS C700 FF. This is a new full-frame model offered in both EF and PL lens mount versions. Like the standard Super35 C700, it is a 4K cinema camera that records ProRes or XF-AVC at up to 4K resolution onboard to CFast cards. The full-frame sensor offers higher resolution and a shallower depth of field.

Storage

Storage is of interest to many. As costs come down, collaboration is easier than ever. The direct-attached vendors, like G-Tech, LaCie, OWC, Promise, and others were all there with new products. So were the traditional shared storage vendors like Avid, Facilis, Tiger, 1 Beyond, and EditShare. But three of the newer companies had my interest.

In my editing day job, I work extensively with QNAP, which currently offers the best price/performance ratio of any system. It’s reliable, cost-effective, and provides reasonable JKL response cutting HD media with Premiere Pro in a shared editing installation. But it’s not the most responsive system, and it struggles with 4K media in spite of plenty of bandwidth – especially when the editors are all banging away. This has me looking at both LumaForge and OpenDrives.

LumaForge is known to many Final Cut Pro X editors, because the developers have optimized the system for FCPX and have had early success with many key installations. Since then, they have also pushed into more Premiere-based installations. Because these units are engineered for video-centric facilities, as opposed to data-centric ones, they promise a better shared storage experience for video editing.

Likewise, OpenDrives made its name as the provider for high-profile film and TV projects cut on Premiere Pro. Last year they came to the show with their highest performance, all-SSD systems. These units are pricey and, therefore, don’t have a broad appeal. This year they brought a few of the systems that are more applicable to a broader user base. These include spinning disk and hybrid products. All are truly optimized for Premiere Pro.

The cloud

In other storage news, “the cloud” garners a ton of interest. The biggest vendors are Microsoft, Google, IBM, and Amazon. While each of these offers relatively easy ways to use cloud-based services for back-up and archiving, if you want a full cloud-based installation for all of your media needs, then actual off-the-shelf solutions are not readily available. The truth of the matter is that each of these companies offers APIs, which are then handed off to other vendors – often for totally custom solutions.

Avid and Sony seem to have the most complete offerings, with Sony Ci being the best one-size-fits-all answer for customer-facing services. Of course, if review-and-approval is your only need, then Frame.io leads and will have new features rolled out during the year. IBM/Aspera is a great option for standard archiving, because fast Aspera up and down transfers are included. You get your choice of IBM or other (Google, Amazon, etc.) cloud storage. They even offer a trial period using IBM storage for 30 days at up to 100GB free. Backblaze is a competing archive solution with many partnering applications. For example, you can tie it in with Archiware’s P5 Suite of tools for back-up, archiving, and server synchronization to the cloud.

Naturally, when you talk of the “cloud”, many people interpret that to mean software that runs in the cloud – SaaS (software as a service). In most cases, that is nowhere close to happening. However, the exception is The Foundry, which was showing Athera, a suite of its virtualized applications, like Nuke, running on the Google Cloud Platform. They demoed it running inside the Chrome browser, thanks to this partnership with Google. The Foundry had a pod in the Google partners pavilion.

In short, you can connect to the internet with a laptop, activate a license of the tool or tools that you need, and then all media, processing, and rendering is handled in the cloud, using Google’s services and hardware. Since all of this happens on Google’s servers, only an updated UI image needs to be pushed back to the connected computer’s display. This concept is ideal for the visual effects world, where the work is generally done on an individual shot basis without a lot of media being moved in real-time. The target is the Nuke-centric shop that may need to add on a few freelancers quickly, and who may or may not be able to work on-premises.

Interesting newcomers

As I mentioned at the beginning, part of the joy of NAB is discovering the small vendors who seek out NAB to make their mark. One example this year is Lumberjack Systems, a venture by Philip Hodgetts and Greg Clarke of Intelligent Assistance. They were in the LumaForge suite demonstrating Lumberjack Builder, which is a text-based NLE. In the simplest of explanations, your transcription or scripted text is connected to media. As you rearrange or trim the text, the associated picture is edited accordingly. Newly written text for voiceovers is turned into spoken-word media courtesy of the computer’s text-to-speech system voice. Once your text-based rough cut is complete, an FCPXML is sent to Final Cut Pro X for further finesse and final editing.
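The underlying idea is easy to sketch: each span of transcript text carries the timecodes of the media it was spoken over, so rearranging the prose rearranges the cut. A hypothetical, highly simplified Python illustration – these structures and names are my own invention, not Lumberjack Builder’s actual data model:

```python
# Toy model of text-driven editing: transcript segments carry the
# source timecodes of the media they belong to, so reordering the
# text automatically yields a new edit list.
# Hypothetical structures - not Lumberjack Builder's real format.

from dataclasses import dataclass

@dataclass
class Segment:
    text: str
    clip: str      # source clip name
    start: float   # in-point, seconds
    end: float     # out-point, seconds

transcript = [
    Segment("Our product launched in 2017.", "interview_A", 12.0, 15.5),
    Segment("Customers loved it immediately.", "interview_A", 40.2, 43.0),
    Segment("We grew the team to fifty.", "interview_B", 8.0, 11.1),
]

# The editor rearranges the prose; the cut follows the words.
story_order = [transcript[2], transcript[0], transcript[1]]

def build_edit_list(segments):
    """Turn ordered text segments into (clip, in, out) edit events."""
    return [(s.clip, s.start, s.end) for s in segments]

for event in build_edit_list(story_order):
    print(event)
```

A real implementation would then serialize such an edit list to FCPXML, which is exactly the handoff Builder makes to Final Cut Pro X.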

Another new vendor I encountered was Quine, co-founded by Norwegian DoP Grunleik Groven. Their QuineBox IoT device attaches to the back of a camera, where it can record and upload “conformable” dailies (ProRes, DNxHD) to your SAN, as well as proxies to the cloud via its internal Wi-Fi system. Script notes can also be incorporated. The unit has already been battle-tested on the Netflix/NRK production of “Norsemen”.

Closing thoughts

It’s always interesting to see, year over year, which companies are not at the show. This isn’t necessarily indicative of a company’s health, but can signal a change in their direction or that of the industry. Sometimes companies opt for smaller suites at an area hotel in lieu of the show floor (Autodesk). Or they are a smaller part of a reseller or partner’s booth (RED). But often, they are simply gone. For instance, in past years drones were all the rage, with a lot of different manufacturers exhibiting. DJI has largely captured that market for both vehicles and camera systems. While there were a few other drone vendors besides DJI, GoPro and Freefly weren’t at the show at all.

Another surprise change for me was the absence of SAM (Snell Advanced Media) – the hybrid company formed out of Snell & Wilcox and Quantel. SAM products are now part of Grass Valley, which, in turn, is owned by Belden (the cable manufacturer). Separate Snell products appear to have been absorbed into the broader Grass Valley product line. Quantel’s Go and Rio editors continue in Grass Valley’s editing line, alongside Edius – as simple, middle, and advanced NLE products. A bit sad actually. And very ironic. Here we are in the world of software and file-based video, but the company that still has money to make acquisitions is the one with a heavy investment in copper (I know, not just copper, but you get the point).

Speaking of “putting a fork in it”, I would have to say that stereo 3D and 360 VR are pretty much dead in the film and video space. I understand that there is a market – potentially quite large – in gaming, education, simulation, engineering, training, etc. But for more traditional entertainment projects, it’s just not there. Vendors were down to a few, and even though the leading NLEs have ways of working with 360 VR projects, the image quality still looks awful. When you view a 4K image within even the best goggles, the qualitative experience is like watching a 1970s-era TV set from a few inches away. For now, it continues to be a novelty looking for a reason to exist.

A few final points… It’s always fun to see which computers are being used in the booths. Apple is again a clear winner, with plenty of MacBook Pros and iMac Pros all over the LVCC wherever creative products or demos were involved. eGPUs are of interest, with Sonnet being the main vendor. However, eGPUs are not a solution for every problem. For example, you will see more benefit adding an eGPU to a lesser-powered machine, like a 13” MacBook Pro, than to one with more horsepower, like an iMac Pro. Each eGPU takes up one Thunderbolt 3 bus, so realistically you are likely to add only one eGPU to a computer. None of the NLE vendors could really tell me how much of a boost their application would get from an eGPU. Finally, if you are looking for great-looking, large OLED displays that are pretty darned accurate and won’t break the bank, then LG is the place to look.

©2018 Oliver Peters

A Light Footprint

When I started video editing, the norm was an edit suite with three large quadruplex (2”) videotape recorders, a video switcher, audio mixer, B&W graphics camera(s) for titles, and a computer-assisted, timecode-based edit controller. This was generally considered an “online edit suite”, but in many markets, it served as both “offline” (creative cutting) and “online” (finishing). Not too long thereafter, digital effects (ADO, NEC, Quantel) and character generators (Chyron, Aston, 3M) joined the repertoire. 2” quad eventually gave way to 1” VTRs and those, in turn, were replaced by digital – D1, D2, and finally Digital Betacam. A few facilities with money and clientele migrated to HD versions of these million-dollar rooms.

Towards the midpoint in the lifespan of this way of working, nonlinear editing took hold. After a few different contenders had their day in the sun, the world largely settled on Avid and/or Media 100 rooms. While a lower-cost commitment than the large online bays of the day, these nonlinear edit (NLE) bays still required custom-configured Macs, a fair amount of external storage, along with proprietary hardware and monitoring to see a high-quality video image. Though crude at first, NLEs eventually proved capable of handling all video needs, including HD-quality projects and even higher resolutions today.

The trend towards smaller

As technology advanced, computers became faster and more powerful, storage capacities increased, and software that once required custom hardware evolved to work in a software-only mode. Today, it’s possible to operate with a fraction of the cost, equipment, and hassle of just a few years ago, let alone a room from the mid-70s. As a result, when designing or installing a new room, it’s important to question the assumptions about what makes a good edit bay configuration.

For example, today I frequently work in rooms running newer iMacs, 2013 Mac Pros, and even MacBook Pro laptops. These are all perfectly capable of running Apple Final Cut Pro X, Adobe Premiere Pro, Avid Media Composer, and other applications, without the need for additional hardware. In my interview with Thomas Grove Carter, he mentioned often working off of his laptop with a connected external drive for media. And that’s at Trim, a high-end London commercial editing boutique.

In my own home edit room, I recently set aside my older Mac Pro tower in favor of working entirely with my 2015 MacBook Pro. No more need to keep two machines synced up and the MBP is zippier in all respects. With the exception of some heavy-duty rendering (infrequent), I don’t miss using the tower. I run the laptop with an external Dell display and have configured my editing application workspaces around a single screen. The laptop is closed and parked in a BookArc stand tucked behind the Dell. But I also bought a Rain stand for those times when I need the MBP open and functioning as a second display.

Reduce your editing footprint

I find more and more editors working in similar configurations. For example, one of my clients is a production company with seven networked (NAS storage) workstations. Most of these are iMacs with few other connected peripherals. The main room has a 2013 “trash can” Mac Pro and a bit more gear, since this is the “hero” room for clients. If you are looking to downsize your editing environment, here are some pointers.

While you can work strictly from a laptop, I prefer to build it up for a better experience. Essential for me is a Thunderbolt dock. Check out OWC or CalDigit for two of the best options. This lets you connect the computer to the dock and then everything else connects to that dock. One Thunderbolt cable to the laptop, plus power for the computer, leaving you with a clean installation with an easy-to-move computer. From the dock, I’m running a Presonus Audiobox USB audio interface (to a Mackie mixer and speakers), a TimeMachine drive, a G-Tech media drive, and the Dell display. If I were to buy something different today, I would use the Mackie Onyx Blackjack interface instead of the Presonus/Mackie mixer combo. The Blackjack is an all-in-one solution.

Expand your peripherals as needed

At the production company’s hero room, we have the extra need to drive some video monitors for color correction and client viewing. That room is configured similarly to the others, except with a Mac Pro and a connection to a QNAP shared storage solution. The latter connects over 10Gb/s Ethernet via a Sonnet Thunderbolt/Ethernet adapter.

When we initially installed the room, video to the displays was handled by a Blackmagic Design UltraStudio device. However, we had a lot of playback performance issues with the UltraStudio, especially when using FCPX. After some experimenting, we realized that both Premiere Pro and FCPX can send a fullscreen, [generally] color-accurate signal to the wall-mounted flat panel using only HDMI and no other video i/o hardware. We ended up connecting the HDMI from the dock to the display and that’s the standard working routine when we are cutting in either Premiere Pro or Final Cut.

The rub for us is DaVinci Resolve. You must use some type of Blackmagic Design hardware product in order to get fullscreen video to a display from Resolve. Therefore, the UltraStudio’s HDMI port connects to the second HDMI input of the large client display, and SDI feeds a separate TV Logic broadcast monitor. This provides more accurate color rendition while grading. With Media Composer, there were no performance issues, but the audio and video signals must pass through the same device. So, if we edit in Avid, then the signal chain goes through the UltraStudio as well.

All of this means that in today’s world, you can work as lightly as you like. Laptop-only – no problem. iMac with some peripherals – no problem. A fancy, client-oriented room – still less hassle and cost than just a few short years ago. Load it up with extra control surfaces or stay light with a keyboard, mouse, or tablet. It all works today – pretty much as advertised. Gone are the days when you absolutely needed to drop a small fortune to edit high-quality video. You just have to know what you are doing and understand the trade-offs as they arise.

©2017 Oliver Peters

Faster, Together at NAB

With the NAB trade show just around the corner, it’s time to shore up your last minute plans for things to do and see. In addition to the tons of exhibits in the Las Vegas Convention Center halls, there are numerous outside meetings, conferences, training sessions, and places for production and post professionals to meet and greet.

A new addition this year is LumaForge’s Faster, Together Stage presentations. These are being held Monday through Wednesday across the street at the Courtyard by Marriott Las Vegas Convention Center. I’ll be part of the “State of the NLE” panel discussion Wednesday at 3PM. It should be fun and although some have referred to this as the “NLE cage match”, we are all friends and looking forward to an enlightening discussion. The presentations are free, but you must register in advance. See you there!

©2017 Oliver Peters

Final Cut Pro X – Reflecting on Six Years


Some personal musings…

Apple’s Final Cut Pro X has passed its five-year mark – and is by now nearing the end of its sixth. Although it’s getting increasing respect from many corners of the professional editing community, there are still many who dismiss it, due to its deviation from standard editing software conventions. Like so many other things that are Apple, FCPX tends to be polarizing, with a large cohort of both fanboys and haters.

For me, software is a tool. I’ve been editing since the 70s and have used about 15 different linear and nonlinear systems on billable work during that time. More like 20 if you toss in color correction applications. Even more if you count tools I’ve had only cursory exposure to (such as in product reviews), but haven’t used on real jobs. With all of these tools, it’s a love-hate relationship for me. I have to laugh when folks talk about FCPX bringing the fun back to their editing experience. I hope that the projects I work on bring me fun. I don’t really care about the software itself. Software should just get out of the way and let me do my job.

These six years have been a bit of a personal journey with Final Cut Pro X after a number of years with the “classic” version. I’ve been using FCPX since it first came out on commercials, corporate videos, shorts and even an independent feature film. It’s not my primary NLE most of the time, because my clients have largely moved to Adobe Premiere Pro CC and ask me to be compatible with them. My FCPX work tends to be mixed in and around my Premiere Pro editing gigs. For instance, right now I’m simultaneously involved in two large corporate video jobs – one of which I’m cutting in Premiere Pro and the other in Final Cut Pro X. As these things go, it can be frustrating, because you always want some function, tool or effect that’s available in Application A while you’re working in Application B. However, it also provides a perspective on what’s good and bad about each and where real speed advantages exist.

I have to say that even after six years, Final Cut Pro X is still more of a crapshoot than any other editing tool that I’ve used. I love its organizing power and often start a job really liking it. However, the deeper I get into the job – and the larger the library becomes – and the more complex the sequences become – the more bogged down FCPX becomes. It’s also the most inconsistent across various Mac models. I’ve run it on older towers, new MacBook Pros, iMacs and 2013 Mac Pros. Of these experiences, the laptops seem to be the most optimized for FCPX.

Quite frankly, working with the “trash can” Mac Pros, at times I wonder if Apple has lost its mojo. Don’t get me wrong – it’s a sweet machine, but its horsepower leaves me underwhelmed. Given the right upgrades, a 2010 Mac Pro tower is still quite competitive against it. Couple that with intermittent corrupt renders and exports in Adobe applications – due to the D-series AMD GPUs – and one really has to question Apple’s design compromises. On the other hand, working with recent and new MacBook Pros, it seems pretty obvious that this is where Apple’s focus has been. And in fact, that’s where Final Cut really shines. Run a complex project on a MacBook Pro versus an older tower and it’s truly a night-and-day experience. By comparison, the performance of Adobe and Avid across the same range of machines follows a much more graduated curve. Best might not be quite as good, but worst isn’t nearly as awful.

A lot is made of new versus old code in these competing applications. The running argument is that FCPX uses a sleek, new codebase, whereas Premiere Pro and Media Composer run on creaky old software. Yet Final Cut has been out publicly for six years, which means development started a few years before that. Hmmm, no longer quite so new. Yet, if you look at the recent changes from 10.2 to 10.3, it seems pretty clear that a lot more was changed than just cosmetics. The truth of the matter is that all three of these major applications are written in a way that modules of software can be added, removed or changed, without the need to start from scratch. Therefore, from a coding standpoint, Final Cut doesn’t have nearly the type of advantages that many think it has.

The big advantage that FCPX does have is that Apple can optimize its performance for the holistic hardware and macOS software architecture of its own machines. As such, performance, render speeds, etc. aren’t strictly tied to the CPU or the GPU alone. It’s what enables the new MacBook Pro to offer top-end performance while still staying locked to 16GB of RAM. It seems to me that this is also why the Core-series processors appear to be better performers than the Xeon-series chips when it comes to Final Cut, Motion, and Compressor.

If you compare this to Premiere Pro, Adobe hits the GPUs much harder than does Apple, which is the reason behind the occasional corruptions on the “trash can” Macs with Adobe renders. If you were running the Adobe suite on a top-level PC with high-end Nvidia cards, performance would definitely shine over that of the Macs. This is largely due to leveraging the CUDA architecture of these Nvidia GPUs. With Apple’s shift to using only AMD and Intel GPUs, CUDA acceleration isn’t available on newer Macs. Under the current software versions of Adobe CC (at the time of this writing) and Sierra, you are tied to OpenCL or software-only rendering and cannot even use Apple’s Metal acceleration. This is a driver issue still being sorted out between Apple and Adobe. Metal is something that Apple tools take advantage of and is a way that they leverage the combined hardware power, without focusing solely on CPU or GPU acceleration.

All of this leads me back to a position of love-hate with any of these tools. I suspect that my attitude is more common than most folks who frequent Internet forum debates want to admit. The fanboy backlash is generally large. When I look at how I work and what gets the results, I usually prefer track-based systems to the FCPX approach. I tend to like Final Cut as a good rough-cut editing application, but less as a fine-cut tool. Maybe that’s just me. That being said, I’ve had plenty of experiences where FCPX quite simply is the better tool under the circumstance. On a recent on-site edit gig at CES, I had to cut some 4K ARRI ALEXA material on my two-year-old Retina MacBook Pro. Premiere Pro couldn’t hack it without stuttering playback, while FCPX was buttery smooth. Thus FCPX was the axe for me throughout this gig.

Likewise, in the PC vs. Mac hardware debates, I may criticize some of Apple’s moves and long to work on a fire-breathing platform. But if push came to shove and I had to buy a new machine today, it would be either a Mac Pro “trash can” or a tricked-out iMac. I don’t do heavy 3D renders or elaborate visual effects – I edit and color correct. Therefore, the overall workflow, performance, and “feel” of the Apple ecosystem is a better fit for me, even though at times performance might be middling.

Wrapping up this rambling post – it’s all about personal preference. I applaud Apple for making the changes in Final Cut Pro X that they did; however, a lot of things are still in need of improvement. Hopefully these will get addressed soon. If you are looking to use FCPX professionally, then my suggestion is to stick with only the newest machines and keep your productions small and light. Keep effects and filters to a minimum and you’ll be happiest with the results and the performance. Given the journey thus far, let’s see what the next six years will bring.

©2017 Oliver Peters