Final Cut Pro X – Reflecting on Six Years


Some personal musings…

Apple’s Final Cut Pro X has passed its five-year mark and is now closing in on the end of its sixth year. Although it’s earning increasing respect from many corners of the professional editing community, there are still many who dismiss it, due to its deviation from standard editing software conventions. Like so many other things Apple, FCPX tends to be polarizing, with a large cohort of both fanboys and haters.

For me, software is a tool. I’ve been editing since the 70s and have used about 15 different linear and nonlinear systems on billable work during that time – more like 20 if you toss in color correction applications, and even more if you count tools I’ve only had cursory exposure to (such as in product reviews), but haven’t used on real jobs. Every one of these tools is a love-hate relationship for me. I have to laugh when folks talk about FCPX bringing the fun back to their editing experience. I hope that the projects I work on bring me fun. I don’t really care about the software itself. Software should just get out of the way and let me do my job.

These six years have been a bit of a personal journey with Final Cut Pro X after a number of years with the “classic” version. I’ve been using FCPX since it first came out on commercials, corporate videos, shorts and even an independent feature film. It’s not my primary NLE most of the time, because my clients have largely moved to Adobe Premiere Pro CC and ask me to be compatible with them. My FCPX work tends to be mixed in and around my Premiere Pro editing gigs. For instance, right now I’m simultaneously involved in two large corporate video jobs – one of which I’m cutting in Premiere Pro and the other in Final Cut Pro X. As these things go, it can be frustrating, because you always want some function, tool or effect that’s available in Application A while you’re working in Application B. However, it also provides a perspective on what’s good and bad about each and where real speed advantages exist.

I have to say that even after six years, Final Cut Pro X is still more of a crapshoot than any other editing tool that I’ve used. I love its organizing power and often start a job really liking it. However, the deeper I get into the job – and the larger the library becomes – and the more complex the sequences become – the more bogged down FCPX becomes. It’s also the most inconsistent across various Mac models. I’ve run it on older towers, new MacBook Pros, iMacs and 2013 Mac Pros. Of these experiences, the laptops seem to be the most optimized for FCPX.

Quite frankly, working with the “trash can” Mac Pros, at times I wonder if Apple has lost its mojo. Don’t get me wrong – it’s a sweet machine, but its horsepower leaves me underwhelmed. Given the right upgrades, a 2010 Mac Pro tower is still quite competitive against it. Couple that with intermittent corrupt renders and exports in Adobe applications – due to the D-series AMD GPUs – and one really has to question Apple’s design compromises. On the other hand, working with recent and new MacBook Pros, it seems pretty obvious that this is where Apple’s focus has been. And in fact, that’s where Final Cut really shines. Run a complex project on a MacBook Pro versus an older tower and it’s truly a night-and-day experience. By comparison, Adobe and Avid performance across the same range of machines follows a much more graduated curve. Best might not be quite as good, but worst isn’t nearly as awful.

A lot is made of new versus old code in these competing applications. The running argument is that FCPX uses a sleek, new codebase, whereas Premiere Pro and Media Composer run on creaky old software. Yet Final Cut has been out publicly for six years, which means development started a few years before that. Hmmm, no longer quite so new. Yet, if you look at the recent changes from 10.2 to 10.3, it seems pretty clear that a lot more was changed than just cosmetics. The truth of the matter is that all three of these major applications are written in a way that modules of software can be added, removed or changed, without the need to start from scratch. Therefore, from a coding standpoint, Final Cut doesn’t have nearly the type of advantages that many think it has.

The big advantage that FCPX does have is that Apple can optimize its performance for the complete hardware and macOS software architecture of its own machines. As such, performance, render speeds, etc. aren’t strictly tied to only the CPU or the GPU. It’s what enables the new MacBook Pro to offer top-end performance, while still staying locked to 16GB of RAM. It seems to me that this is also why the Core-series processors appear to be better performers than the Xeon-series chips, when it comes to Final Cut, Motion and Compressor.

If you compare this to Premiere Pro, Adobe hits the GPUs much harder than does Apple, which is the reason behind the occasional corruptions on the “trash can” Macs with Adobe renders. If you were running the Adobe suite on a top-level PC with high-end Nvidia cards, performance would definitely shine over that of the Macs. This is largely due to leveraging the CUDA architecture of these Nvidia GPUs. With Apple’s shift to using only AMD and Intel GPUs, CUDA acceleration isn’t available on newer Macs. Under the current software versions of Adobe CC (at the time of this writing) and Sierra, you are tied to OpenCL or software-only rendering and cannot even use Apple’s Metal acceleration. This is a driver issue still being sorted out between Apple and Adobe. Metal is something that Apple tools take advantage of and is a way that they leverage the combined hardware power, without focusing solely on CPU or GPU acceleration.

All of this leads me back to a position of love-hate with any of these tools. I suspect that my attitude is more common than most folks who frequent Internet forum debates want to admit. The fanboy backlash is generally large. When I look at how I work and what gets the results, I usually prefer track-based systems to the FCPX approach. I tend to like Final Cut as a good rough-cut editing application, but less as a fine-cut tool. Maybe that’s just me. That being said, I’ve had plenty of experiences where FCPX quite simply is the better tool under the circumstances. On a recent on-site edit gig at CES, I had to cut some 4K ARRI ALEXA material on my two-year-old Retina MacBook Pro. Premiere Pro couldn’t hack it without stuttering playback, while FCPX was buttery smooth. Thus, FCPX was the axe for me throughout that gig.

Likewise, in the PC vs. Mac hardware debates, I may criticize some of Apple’s moves and long to work on a fire-breathing platform. But if push came to shove and I had to buy a new machine today, it would be either a Mac Pro “trash can” or a tricked-out iMac. I don’t do heavy 3D renders or elaborate visual effects – I edit and color correct. Therefore, the overall workflow, performance and “feel” of the Apple ecosystem is a better fit for me, even though at times performance might be middling.

Wrapping up this rambling post – it’s all about personal preference. I applaud Apple for making the changes in Final Cut Pro X that they did; however, a lot of things are still in need of improvement. Hopefully these will get addressed soon. If you are looking to use FCPX professionally, then my suggestion is to stick with only the newest machines and keep your productions small and light. Keep effects and filters to a minimum and you’ll be happiest with the results and the performance. Given the journey thus far, let’s see what the next six years will bring.

©2017 Oliver Peters

2017 Technology Predictions


The next year will certainly be an interesting one – not only because of the forces of innovation, but also those of politics. With the new President vowing to use the bully pulpit to entice, encourage or cajole US corporations to bring their offshore manufacturing back to the states, it seems pretty clear that companies in the media industries will be affected. The likely targets will be storage, camera and computer manufacturers. I presume that Apple will become the most visible and possibly most vocal of these, but that remains to be seen.

At present, Apple is more of an engineering design and services company than a manufacturer – the exception being the Mac Pro. Given their volume and the expertise of suppliers like Foxconn, it’s hard to see how moving iPhone production to the US would be possible, or at least cost-effective. However, low-volume products, like the 2013 Mac Pro model, are a better fit, which is why that product is assembled in Austin. But of course, there’s plenty of speculation that the “trash can” Mac isn’t long for this world. It’s sorely in need of a refresh and has been largely overshadowed by the new MacBook Pro models. Although I think from a business perspective Apple would just as soon drop it, the Mac Pro does have the advantage of serving a market segment that Apple likes to be associated with – creative media professionals. If you add in the political climate, it’s a good counterpoint to be able to say that Apple’s highest-end product is made here.

Factoring all that in, I predict that we’ll see at least one more iteration of the Mac Pro. I don’t expect a form factor change, but I would expect newer Xeon chips, when available, and a shift to the Thunderbolt 3 protocol, using USB-C plugs. This way it will be compatible with the same peripherals as the new MacBook Pros. The same will be true of the next iMacs. I also expect to see at least one more version of the Mac Mini, as this provides a small package that many can use as a server machine. It will sport new Xeon or new Core i7 chips and Thunderbolt 3/USB-C ports. However, once these new machines hit the market, there are plenty of signs to suggest that those products will be the last of their kind, leaving Apple to make only iMac and laptop form factors for their macOS products. That’s a couple of years out.

If tariffs and a change in trade agreements become public policy, then imported products will become more expensive than they have been. I see this having the greatest impact with cameras, as so many (nearly all) are produced by foreign companies, such as Sony, Canon and ARRI. This may well be a very positive development for a company like RED. If all of a sudden ALEXAs become a lot more expensive as compared with RED Epics, Weapons, etc., well then you just might see a shift in the sales numbers. Of course, a lot of this is just reading the tea leaves, but if politics were ever a driver, this would be the year that we’ll see it.

Another continuing trend will be mergers and acquisitions, as weaker companies consolidate with stronger competitors. The ripest of these is Avid Technology. Their financial issues have spilled over into business news and it’s hard to see how they can dig themselves out of the current holes with such lackluster sales. The smart money predicts a breakup or sell-off. If this occurs, the predictions (with which I agree) would have Pro Tools going to Dolby and Media Composer – and maybe also storage – going to Blackmagic Design. The rest, including Interplay, the Media Central Platform and the Orad products, would go elsewhere or just be closed down.

The obvious question is why Blackmagic Design would want Media Composer. After all, they are already developing DaVinci Resolve into an NLE in its own right. By picking up Media Composer, they would add a highly respected editing application to the portfolio and thus buy into an existing marketshare, just as they did in color correction. Once acquired, I’m pretty confident that Blackmagic’s software engineers, together with the staff retained from Avid, would quickly clean up and improve Media Composer from its current state. Only Blackmagic seems to have the will to suffer through the complaints that such a move might draw from loyalists. Avid editors are legendary in their reluctance to accept changes to the interface.

When it comes to nonlinear editing applications, I continue to see a rosy future for Adobe. Premiere Pro’s penetration is increasing in the world of entertainment, broadcast and corporate media, which has been Avid’s stronghold. While Avid is still strong in these areas, they seem to be selling to existing customers and not growing their base. Adobe, on the other hand, is pulling from Avid and Apple customers, plus new ones. While there was a lot of grousing about the Adobe subscription model, most users seem OK with it and are happy to be able to keep their software current with each Creative Cloud update. Likewise, Apple is doing well with Final Cut Pro X. Their market seems to be more individual users and “creative enthusiasts” than is the case for Adobe. In addition, FCP X also seems to be doing well internationally. Since Apple has another five years to go on its public commitment to FCP X development, I only see more growth for this application.

Apple has long held an outsized percentage of the creative market, as compared with its overall marketshare of all computers. However, it doesn’t take much sleuthing to see the enthusiasm expressed for the Microsoft Surface Studio. In my own travels, I see a lot of Surface tablets in regular use. So far, the ones I encounter are being used for general computing, but that will change. Since these devices run Windows, any application that can run under Windows will work. As the Surface line becomes more powerful, I fully expect to see creatives routinely running all of the Adobe apps, Media Composer, Resolve, Lightworks and others without any difficulty. Many users would love to cut the Apple cord, and I predict the Surface and Surface Studio are just the tools to enable that move. Add to that the innovative menu control knob introduced with the Surface Studio and you can see that creative design thinking isn’t limited to Cupertino.

For storage products, I see two shifts. The first is the move to the Thunderbolt 3 protocol. If you’ve invested heavily in Thunderbolt 2 or USB 3.0 devices, technology has just leapfrogged you. While these products will continue to be useful and can be connected via legacy ports or docks and adapters, storage manufacturers will embrace Thunderbolt 3 for direct-attached products. The shared storage providers will continue down the 10-Gigabit and 40-Gigabit Ethernet route for a while, until Thunderbolt 3 networking really becomes viable. We aren’t there yet, but I can’t see why it won’t come soon. Right now, if you have two to ten users, a low-cost shared storage environment is pretty easy to set up. The hitch is controlling the application permissions of the software being used. Avid had a lock on that, but there are now ways to enable Avid bin-locking for a few hundred bucks per seat. No need to buy expensive storage and pay annual support contracts any longer.

Along these lines is Adobe’s project sharing through Team Projects (currently in beta testing). Once they get the kinks ironed out, Team and Enterprise accounts will be able to work collaboratively and simultaneously on the same production. I see it as only a matter of time before Apple offers a similar capability with Final Cut Pro X. It certainly seems like all the hooks are there under the hood to make that possible. So maybe 2017 will be the year that project sharing comes to Final Cut users. Once both Adobe and Apple can offer reliable project collaboration in a fashion that rivals Avid, you’ll see an even greater shift to these editing tools and away from Media Composer within the film and broadcast editing communities.

As laptops grow in power, expect an even faster demise of the desktop workstation PC. More and more, people want to be mobile. Having a laptop connected to all the bells and whistles at your base station edit suite, yet being able to unplug and go where you need to be – that’s the future direction for a lot of post professionals. Wrapping this up, remember: these predictions are free and worth just what you paid for them!

Originally written for Digital Video magazine / Creative Planet Network

© 2016 Oliver Peters

NLE as Post Production Hub


As 2009 closed, I wrote a post about Final Cut Studio as the center of a boutique post production workflow. A lot has changed since then, but that approach is still valid and a number of companies can fill those shoes. In each case, rather than be the complete, self-contained tool, the editing application becomes the hub of the operation. Other applications surround it and the workflow tends to go from NLE to support tool and back for delivery. Here are a few solutions.

Adobe Premiere Pro CC

No current editing package comes as close to the role of the old Final Cut Studio as does Adobe’s Creative Cloud. You get nearly all of the creative tools under a single subscription, and facilities with a team account can equip every room with the full complement of applications. When designed correctly, workflows in any room can shift from edit to effects to sound to color correction – according to the load. In a shared storage operation, projects can stay in a single bay for everything or shift from bay to bay based on operator specialty and talent.

While there are many tools in the Creative Cloud kit, the primary editor-specific applications are Premiere Pro CC, After Effects CC and Audition CC. It goes without saying that for most, Photoshop CC and Adobe Media Encoder are also givens. On the other hand, I don’t know too many folks using Prelude CC, so I can’t say what the future for this tool will be – especially since the next version of Premiere Pro includes built-in proxy transcoding. Also, as more of SpeedGrade CC’s color correction tools make it into Premiere Pro, it’s easy to see that SpeedGrade itself is getting very little love. The low-cost market for outboard color correction software has largely been lost to DaVinci Resolve (free). For now, SpeedGrade is really “dead man walking”. I’d be surprised if it’s still around by mid-2017. That might also be the case for Prelude.

Many editors I know who are heavy into graphics and visual effects do most of that work in After Effects. With CC and Dynamic Link, there’s a natural connection between the Premiere Pro timeline and After Effects. A similar tie can exist between Premiere Pro and Audition. I find the latter to be a superb audio post application and, from my experience, it provides the best transfer of a Premiere Pro timeline into any audio application. This connection is being further enhanced by the updates coming from Adobe this year.

Rounding out the package is Photoshop CC, of course. While most editors are not big Photoshop artists, it’s worth noting that this application also enables animated motion graphics. For example, if you want to create an animated lower third banner, it can be done completely inside of Photoshop without ever needing to step into After Effects. Drop the file onto a Premiere Pro timeline and it’s complete with animation and proper transparency values. Update the text in Photoshop and hit “save” – voila, the graphic is instantly updated within Premiere Pro.

Given the breadth and quality of tools in the Creative Cloud kit, it’s possible to stay entirely within these options for all of a facility’s post needs. Of course, roundtrips to Resolve, Baselight, ProTools, etc. are still possible, but not required. Nevertheless, in this scenario I typically see everything starting and ending in Premiere Pro (with exports via AME), making the Adobe solution my first vote for the modern hub concept.

Apple Final Cut Pro X

Apple walked away from the market for an all-inclusive studio package. Instead, it opted to offer more self-contained solutions that don’t have the same interoperability as before, nor that of the comparable Adobe solutions. To build up a similar toolkit, you would need Final Cut Pro X, Motion, Compressor and Logic Pro X. An individual editor/owner would purchase these once and install them on as many machines as he or she owns. A business would have to buy each application for each separate machine. So a boutique facility would need a full set for each room, or it would have to build rooms by specialty – edit, audio, graphics, etc.

Even with this combination, there are missing links when going from one application to another. These gaps have to be plugged by the various third-party productivity solutions, such as Clip Exporter, XtoCC, 7toX, Xsend Motion, X2Pro, EDL-X and others. These provide better conduits between Apple applications than Apple itself provides. For example, only through Automatic Duck Xsend Motion can you get an FCPX project (timeline) into Motion. Marquis Broadcast’s X2Pro Audio Convert provides a better path into Logic than the native route.

If you want the sort of color correction power available in Premiere Pro’s Lumetri Color panel, you’ll need more advanced color correction plug-ins, like Hawaiki Color or Color Finale. Since Apple doesn’t produce an equivalent to Photoshop, look to Pixelmator or Affinity Photo for a viable substitute. Although powerful, you still won’t get quite the same level of interoperability as between Photoshop and Premiere Pro.

Naturally, if your desire is to use non-Apple solutions for graphics and color correction, then similar rules apply as with Premiere Pro. For instance, roundtripping to Resolve for color correction is pretty solid using the FCPXML import/export function within Resolve. Prefer to use After Effects for your motion graphics instead of Motion? Then Automatic Duck Ximport AE on the After Effects side has your back.

Most of the tools are there for those users wishing to stay in an Apple-centric world, provided you add a lot of glue to patch over the missing elements. Since many of the plug-ins for FCPX (Motion templates) are superior to a lot of what’s out there, I do think that an FCPX-centric shop will likely choose to start and end in X (possibly with a Compressor export). Even when Resolve is used for color correction, I suspect the final touches will happen inside of Final Cut. It’s more of the Lego approach to the toolkit than the Adobe solution, yet I still see it functioning in much the same way.

Blackmagic Design DaVinci Resolve

It’s hard to say what Blackmagic’s end goal is with Resolve. Clearly the world of color correction is changing. Every NLE developer is integrating quality color correction modules right inside of their editing application. So it seems only natural that Blackmagic is making Resolve into an all-in-one tool, for no other reason than self-preservation. And by golly, they are doing a darn good job of it! Each version is better than the last. If you want a highly functional editor with world-class color correction tools for free, look no further than Resolve. Ingest, transcoded and/or native media editing, color correction, mastering and delivery – it’s all there in Resolve.

There are two weak links – graphics and audio. On the latter front, the internal audio tools are good enough for many editors. However, Blackmagic realizes that specialty audio post is still the domain of the sound engineering world, which is made up predominantly of Avid Pro Tools shops. To make this easy, Resolve has built-in audio export functions to send the timeline to Pro Tools via AAF. There’s no roundtrip back, but you’d typically get composite mixed tracks back from the engineer to lay into the timeline.

To build on the momentum it started, Blackmagic Design acquired the assets of eyeon’s Fusion software, which gives them a node-based compositor, suitable for visual effects and some motion graphics. This requires a different mindset than After Effects with Premiere Pro or Motion with Final Cut Pro X (when using Xsend Motion). You aren’t going to send a full sequence from Resolve to Fusion. Instead, the Connect plug-in links a single shot to Fusion, where it can be effected through a series of nodes. The Connect plug-in provides a similar “conduit” function to that of Adobe’s Dynamic Link between Premiere Pro and After Effects, except that the return is a rendered clip instead of a live project file. To take advantage of this interoperability between Resolve and Fusion, you need the paid versions.

Just as in Apple’s case, there really is no Blackmagic-owned substitute for Photoshop or an equivalent application. You’ll just have to buy what matches your need. While it’s quite possible to build a shop around Resolve and Fusion (plus maybe Pro Tools and Photoshop), it’s more likely that Resolve’s integrated approach will appeal mainly to those folks looking for free tools. I don’t see too many advanced pros doing their creative cutting on Resolve (at least not yet). However, that being said, it’s pretty close, so I don’t want to slight the capabilities.

Where I see it shine is as a finishing or “online” NLE. Let’s say you perform the creative or “offline” edit in Premiere Pro, FCPX or Media Composer. This could even be three editors working on separate segments of a larger show – each on a different NLE. Each editor’s sequence goes to Resolve, where the timelines are imported, combined and relinked to the high-res media. The audio has gone via a parallel path to a Pro Tools mixer, and graphics come in as individual clips, shots or files. Then everything is combined inside Resolve, color corrected and delivered straight from Resolve. For many shops, that scenario is starting to look like the best of all worlds.

I tend to see Resolve as less of a hub than either Premiere Pro or Final Cut Pro X. Instead, I think it may take several possible positions: a) color correction and transcoding at the front end, b) color correction in the middle – i.e. the standard roundtrip, and/or c) the new “online editor” for final assembly, color correction, mastering and delivery.

Avid Media Composer

This brings me to Avid Media Composer, the least integrated of the bunch. You can certainly build an operation based on Media Composer as the hub – as so many shops have. But there simply isn’t the silky smooth interoperability among tools like there is with Adobe or the dearly departed Final Cut Pro “classic”. However, that doesn’t mean it’s not possible. You can add advanced color correction through the Symphony option, plus Avid Pro Tools in your mixing rooms. In an Avid-centric facility, rooms will definitely be task-oriented, rather than provide the ease of switching functions in the same suite based on load, as you can with Creative Cloud.

The best path right now is Media Composer to Pro Tools. Unfortunately it ends there. Like Blackmagic, Avid only offers two hero applications in the post space – Media Composer/Symphony and Pro Tools. They have graphics products, but those are designed and configured for news on-air operations. This means that effects and graphics are typically handled through After Effects, Boris RED or Fusion.

Boris RED runs as an integrated tool, which augments the Media Composer timeline. However, RED uses its own user interface. That operation is relatively seamless, since any “roundtrip” happens invisibly within Media Composer. Fusion can be integrated using the Connect plug-in, just like between Fusion and Resolve. Automatic Duck’s AAF import functions have been integrated directly into After Effects by Adobe. It’s easy to send a Media Composer timeline into After Effects as a one-way trip. In fact, that’s where this all started in the first place. Finally, there’s also a direct connection with Baselight Editions for Avid, if you add that as a “plug-in” within Media Composer. As with Boris RED, clips open up in the Baselight interface, which has now been enhanced with a smoother shot-to-shot workflow inside of Media Composer.

While a lot of shops still use Media Composer as the hub, this seems like a very old-school approach. Many editors still love this NLE for its creative editing prowess, but in today’s mixed-format, mixed-codec, file-based post world, Avid has struggled to keep Media Composer competitive with the other options. There’s certainly no reason Media Composer can’t be the center – with audio in Pro Tools, color correction in Resolve, and effects in After Effects. However, most newer editors simply don’t view it the same way as they do with Adobe or even Apple. Generally, it seems the best Avid path is to “offline” edit in Media Composer and then move to other tools for everything else.

So that’s post in 2016. Four good options with pros and cons to each. Sorry to slight the Lightworks, Vegas Pro, Smoke/Flame and Edius crowds, but I just don’t encounter them too often in my neck of the woods. In any case, there are plenty of options, even starting at free, which makes the editing world pretty exciting right now.

©2016 Oliver Peters

Tips for Production Success – Part 2

Picking up from my last post (part 1), here are 10 more tips to help you plan for a successful production.

Create a plan and work it. Being a successful filmmaker – that is, making a living at it – is more than just producing a single film. Such projects almost never go beyond the festival circuit, even if you do think it is the “great American film”. An indie producer may work on a project for about four years, from the time they start planning and raising the funds – through production and post – until real distribution starts. Therefore, the better approach is to start small and work your way up. Start with a manageable project or film with a modest budget and then get it done on time and in budget. If that’s a success, then start the next one – a bit bigger and more ambitious. If it works, rinse and repeat. If you can make that work, then you can call yourself a filmmaker.

Budget. I have a whole post on this subject, but in a nutshell, an indie film that doesn’t involve union talent or big special effects will likely cost close to one million dollars, all in. You can certainly get by on less. I’ve cut films that were produced for under $150,000 and one even under $50,000, but that means calling in a lot of favors and having many folks working for free or on deferment. You can pull that off one time, but it’s not a way to build a business, because you can’t go back to those same resources and ask to do it a second time. Learn how to raise the money to do it right and proceed from there.

Contingencies at the end. Intelligent budgeting means leaving a bit for the end. A number of films that I’ve cut had to do reshoots or spend extra days to shoot more inserts, establishing shots, etc. Plan for this to happen and make sure you’ve protected these items in the budget. You’ll need them.

Own vs. rent. Some producers see their film projects as a way to buy gear. That may or may not make sense. If you need a camera and can otherwise make money with it, then buy it. Or if you can buy it, use it, and then resell it to come out ahead – by all means follow that path. But if gear ownership is not your thing and if you have no other production plans for the gear after that one project, then it will most likely be a better deal to work out rentals. After all, you’re still going to need a lot of extras to round out the package.

Shooting ratios. In the early 90s I worked on the post of five half-hour and hourlong episodic TV series that were shot on 35mm film. Back then shooting ratios were pretty tight. A half-hour episode is about 20-22 minutes of content, excluding commercials, bumpers, open, and credits. An hourlong episode is about 44-46 minutes of program content. Depending on the production, these were shot in three to five days and exposed between 36,000 and 50,000 feet of negative. Therefore, a typical day meant 50-60 minutes of transferred “dailies” to edit from – or no more than five hours of source footage, depending on the series. This would put them close to the ideal mark (on average) of approximately a 10:1 shooting ratio.

Today, digital cameras make life easier and with the propensity to shoot two or more cameras on a regular basis, this means the same projects today might have conservatively generated more than 10 hours of source footage for each episode. This impacts post tremendously – especially if deadline is a factor. As a new producer, you should strive to control these ratios and stay within the goal of a 10:1 ratio (or lower).
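
If you want to sanity-check these ratios, the underlying arithmetic is simple enough to script. Here’s a minimal Python sketch – the 90 feet-per-minute figure is the standard rate for 35mm 4-perf film at 24fps, and the example numbers are taken from the low end of the episodic range described above:

    # 35mm 4-perf at 24fps: 16 frames per foot, i.e. 90 feet per minute.
    FEET_PER_MINUTE = 90

    def shooting_ratio(feet_exposed, program_minutes):
        """Convert exposed negative into source minutes and a shooting ratio."""
        source_minutes = feet_exposed / FEET_PER_MINUTE
        return source_minutes, source_minutes / program_minutes

    # An hourlong episode at the low end of the range: 36,000 feet for 44 minutes.
    src, ratio = shooting_ratio(36000, 44)
    print("%d source minutes, about %.0f:1" % (src, ratio))  # 400 min, about 9:1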

Block and rehearse. The more a scene is buttoned down, the fewer takes you’ll need, which leads to a tighter shooting ratio. This means rehearse a scene and make sure the camera work is properly blocked. Don’t wing it! Once everything is ready, shoot it. Odds are you’ll get it in two to three takes instead of the five or more that might otherwise be required.

Control the actors. Unless there’s a valid reason to let your actors improvise, make sure the acting is consistent. That is, lines are read in the same order each take, props are handled at the same point, and actors consistently hit their marks each take. If you stray from that discipline, the editorial time becomes longer. If allowed to engage in too much freewheeling improvisation, actors may inadvertently paint you into a corner. To avoid that outcome, control it from the start.

Visual effects planning. Most films don’t require special effects, but there are often “invisible” fixes that can be created through visual effects. For example, combining elements of two takes or adding items to a set. A recent romantic drama I post-supervised used 76 effects shots of one type or another. If this is something that helps the project, make sure to plan for it from the outset. Adobe After Effects is the ubiquitous tool that makes such effects affordable. The results are great and there are plenty of talented designers who can assist you within almost any budget range.

Multiple cameras vs. single camera vs. 4K. Some producers like the idea of shooting interviews (especially two-shots) in 4K (for a 1080 finish) and then slice out the frame they want. I contend that often 4K presents focus issues, due to the larger sensors used in these cameras. In addition, the optics of slicing a region out of a 4K image are different than using another camera or zooming in to reframe the shot. As a result, the look that you get isn’t “quite right”. Naturally, it also adds one more component that the editor has to deal with – reframing each and every shot.

Conversely, when shooting a locked-off interview with one person on-camera, using two cameras makes the edit ideal. One camera might be placed face-on towards the speaker and the other from a side angle. This makes cutting between the camera angles visually more exciting and makes editing without visible jump cuts easier.

In dramatic productions, many new directors want to emulate the “big boys” and also shoot with two or more cameras for every scene. Unfortunately this isn’t always productive, because the lighting is compromised, one camera is often in an awkward position with poor framing, or even worse, often the main camera blocks the secondary camera. At best, you might get 25% usability out of this second camera. A better plan is to shoot in a traditional single-camera style. Move the camera around for different angles. Tweak the lighting to optimize the look and run the scene again for that view.

The script is too long. An indie film script is generally around 100 pages with 95-120 scenes. The film gets shot in 20-30 days and takes about 10-15 weeks to edit. If your script is inordinately long and takes many more days to shoot, then it will also take many more days to edit. The result will usually be a cut that is too long. The acceptable “standard” for most films is 90-100 minutes. If you clock in at three hours, then obviously a lot of slashing has to occur. You can lose 10-15% (maybe) through trimming the fat, but a reduction of 25-40% (or more) means you are cutting meat and bone. Scenes have to be lost, the story has to be re-arranged, or even more drastic solutions. A careful reading of the script and conceiving that as a finished concept can head off issues before production ever starts. Losing a scene before you shoot it can save time and money on a large scale. So analyze your script carefully.
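
To put rough numbers on that, here’s a quick Python sketch using the common one-page-per-minute rule of thumb. Treat it strictly as an approximation – page-to-screen timing varies widely by genre and director:

    def trim_needed(script_pages, target_minutes=100):
        """Estimate runtime at ~1 minute per page and how much must be cut."""
        est_runtime = script_pages * 1.0
        cut_minutes = est_runtime - target_minutes
        return est_runtime, cut_minutes, cut_minutes / est_runtime

    runtime, cut, pct = trim_needed(180)  # a three-hour first cut in the making
    print("~%.0f min; cut %.0f min (%.0f%%) to hit 100" % (runtime, cut, pct * 100))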

Click here for Part 1.

©2015 Oliver Peters

Tips for Production Success – Part 1

Throughout this blog, I’ve written numerous tips about how to produce projects, notably indie features, with a successful outcome in mind. I’ve tried to educate on issues of budget and schedule. In these next two entries, I’d like to tackle 21 tips that will make your productions go more smoothly, finish on time, and not become a disaster during the post production phase. Although I’ve framed the discussion around indie features, the same tips apply to commercials, music videos, corporate presentations, and videos for the web.

Avoid white. Modern digital cameras handle white elements within a shot much better than in the past, but hitting a white shirt with a lot of light complicates your life when it comes to grading and directing the eye of the viewer. This is largely an issue of art direction and wardrobe. The best way to handle this is simply to replace whites with off-whites, bone or beige colors. The sitcom Barney Miller, which earned DP George Spiro Dibie recognition for getting artful looks out of his video cameras, is said to have had the white shirts washed in coffee to darken them a bit. The whiteness was brought back once the cameras were set up. The objective in all of this is to get the overall brightness into a range that is more controllable during color correction and to avoid clipping.

Expose to the right. When you look at a signal on a histogram, the brightest part is on the righthand side of the scale. By pushing your camera’s exposure towards a brighter, slightly over-exposed image (“to the right”), you’ll end up with a better looking image after grading (color correction). That’s because when you have to brighten an image by bringing up highlights or midtones, you are accentuating the sensor noise from the camera. If the image is already brighter and the correction is to lower the levels, then you end up with a cleaner final image. Since most modern digital cameras use some sort of log or hyper gamma encoding to record a flatter signal, which preserves latitude, opening up the exposure usually won’t run the risk of clipping the highlights. In the end, a look that stretches the shadow and mids to expose more detail to the eye gives you a more pleasing and informative image than one that places emphasis on the highlight portion.
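
A toy numpy model makes the noise argument concrete. It ignores shot noise and log encoding, so treat it purely as an illustration of how grading gain scales the fixed noise from the sensor:

    import numpy as np

    rng = np.random.default_rng(0)
    noise = rng.normal(0.0, 0.01, 100000)  # fixed sensor/read noise

    under = 0.25 + noise  # under-exposed capture of a mid-gray subject
    over = 1.00 + noise   # the same subject exposed "to the right"

    # Grade both back to the same target level of 0.5:
    print((under * 2.0).std())  # ~0.020 - lifting the level doubles the noise
    print((over * 0.5).std())   # ~0.005 - pulling it down halves the noise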

Blue vs. green-screen. Productions almost ubiquitously use green paint, but that’s wrong. Each paint color has a different luminance value. Green is brighter and should be reserved for a composite where the talent should appear to be outside. Blue works best when the composited image is inside. Paint matters. The correct paint to use is still the proper version of Ultimatte blue or green paint, but many people try to cut corners on cost. I’ve even had producers go so far as to rig up a silk with a blue lighting wash and expect me to key it! When you light the subject, move them as far away from the wall as possible to avoid contamination of the color onto their hair and wardrobe. This also means, don’t have your talent stand on a green or blue floor, when you aren’t intending to see the floor or see them from their feet to their head.
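
The brightness difference between the two colors isn’t a matter of taste – it falls straight out of the Rec. 709 luma coefficients. A quick Python check (real chroma paints aren’t pure primaries, so this only shows the scale of the gap):

    # Rec. 709 relative luminance: Y = 0.2126*R + 0.7152*G + 0.0722*B
    def luma709(r, g, b):
        return 0.2126 * r + 0.7152 * g + 0.0722 * b

    print(luma709(0, 1, 0))  # pure green: ~0.72
    print(luma709(0, 0, 1))  # pure blue:  ~0.07 - roughly a tenth as bright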

Rim lighting. Images stand out best when your talent has some rim lighting to separate them from the background. Even in a dark environment, seek to create a lighting scheme that achieves this rimming effect around their head and shoulders.

Tonal art direction. The various “blockbuster” looks are popular – particularly the “orange and teal” look. This style pushes skin tones warm for a slight orange appearance, while many darker background elements pick up green/blue/teal/cyan casts. Although this can be accentuated in grading, it starts with proper art direction in the set design and costuming. Whatever tonal characteristic you want to achieve, start by looking at the art direction and controlling this from step one.

Rec. 709 vs. Log. Digital cameras have nearly all adopted some method of recording an image with a flat gamma profile that is intended to preserve latitude until final grading. This doesn’t mean you have to use this mode. If you have control over your exposure and lighting, there’s nothing wrong with recording Rec. 709 and nailing the final look in-camera. I highly recommend this for “talking head” interviews, especially ones shot on green or blue-screen.

Microphone direction/placement. Every budding recording engineer working in music and film production learns that proper mic placement is critical to good sound. Pay attention to where mics are positioned, relative to where the person is when they speak. For example, if you have two people in an interview situation wearing lavaliere mics on their lapels, the proper placement is on each person’s inner lapel – the side closer to the other person. That’s because each person will turn towards the other to address them as they speak and thus talk over that shoulder. Having the mic on this side means they are speaking into the mic. If it were on their outer lapel, they would be speaking away from the mic and the audio would tend to sound hollow. For the same reasons, when you use a boom or fish pole overhead mic, the operator needs to point the mic in the direction of the person talking. They will need to shift the mic’s direction as the conversation moves from one person to the next in order to follow the sound.

Multiple microphones/iso mics. When recording dialogue for a group of actors, it’s best to record their audio with individual microphones (lavs or overhead booms) and to record each mic on an isolated track. Cameras typically feature on-board recording of two to four audio channels, so if you have more mics than that, use an external multi-channel recorder. When external recording is used, be sure to still record a composite track to your camera for reference.

Microphone types. There are plenty of styles and types of microphones, but the important factors are size, tonal quality, range, and the axis of pick-up. Make sure you select the appropriate mic for the task. For example, if you are recording an actor with a deep bass voice using a lavaliere, you’d be best to use a type that gives you a full spectrum recording, rather than one that favors only the low end.

Sound sync. There are plenty of ways to sync sound to picture in double-system sound situations. Synchronizing by matched timecode is the most ideal, but even there, issues can arise. Assure that the camera’s and sound recorder’s timecode generators don’t drift during the day – or use a single, common, external timecode generator for both. It’s generally best to also include a clapboard and, when possible, also record reference audio to the camera. If you plan to sync by audio waveforms (PluralEyes, FCP X, Premiere Pro CC), then make sure the reference signal on the camera is of sufficient quality to make synchronization possible.
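
To get a feel for how much free-running timecode can slip, the drift is simply clock error multiplied by elapsed time. A small Python sketch – the 10 ppm combined error is an illustrative assumption, not a figure from any spec sheet:

    def drift_frames(ppm_error, hours, fps=23.976):
        """Frames of timecode slip caused by a clock that is off by ppm_error."""
        return ppm_error * 1e-6 * hours * 3600 * fps

    # Camera and recorder clocks diverging by a combined 10 ppm over a 10-hour day:
    print("%.1f frames" % drift_frames(10, 10))  # ~8.6 frames of visible slip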

Record wild lines on set. When location audio is difficult to understand, ADR (automatic dialogue replacement, aka “looping”) is required. This happens because the location recording was not of high quality due to outside factors, like special effects, background noise, etc. Not all actors are good at ADR and it’s not uncommon to watch a scene with ADR dialogue and have it jump out at you as the viewer. Since ADR requires extra recording time with the actor, this drives up cost on small films. One workaround in some of these situations is for the production team to recapture the lines separately – immediately after the scene was shot – if the schedule permits. These lines would be recorded wild and may or may not be in sync. The intent is to get the right sonic environment and emotion while you are still there on site. Since these situations are often fast-paced action scenes, sync might not have to be perfect. If close enough, the sound editors can edit the lines into place with an acceptable level of sync so that viewers won’t notice any issues. When it works, it saves ADR time down the road and sounds more realistic.

Click here for Part 2.

©2015 Oliver Peters

More Life for your Mac Pro

I work a lot with a local college’s film production technology program as an advisor, editing instructor and occasionally as an editor on some of their professional productions. It’s a unique program designed to teach hands-on, below-the-line filmmaking skills. The gear has to be current and competitive, because they frequently partner with outside producers to turn out actual (not student) products with a combination of professional and student crews. The department has five Mac Pros that are used for editing, which I’ve recently upgraded to current standards, as they get ready for a new incoming class. The process has given me some thoughts about how to get more life out of your aging Apple Mac Pro towers, which I’ll share here.

To upgrade or not

Most Apple fans drool at the new Mac Pro “tube” computers, but for many, such a purchase simply isn’t viable. Maybe it’s the cost or the need for existing peripherals or other concerns, but many editors are still opting to get as much life as possible out of their existing Mac Pro towers.

In the case of the department, four of the machines are fast 2010 quad-cores and the fifth is a late 2008 eight-core. As long as your machine is an Intel of late 2008 or newer vintage, then generally it’s upgradeable to the most current software. Early 2008 and older is really pushing it. Anything before 2009 probably shouldn’t be used as a primary workhorse system. At 2009, you are on the cusp of whether it’s worth upgrading or not. 2010 and newer would be definitely solid enough to get a few more productive years out of the machine.

The four 2010 Mac Pros are installed in rooms designated as cutting rooms. The 2008 Mac was actually set aside and largely unused, so it had the oldest configuration and software. I decided it needed an upgrade, too, although mainly as an overflow unit. This incoming class is larger than normal, so I felt that having a fifth machine might be useful, since it still could be upgraded.

Software

All five machines have largely been given the same complement of software, which means Mavericks (10.9.4) and various editing tools. The first trick is getting the OS updated, since the oldest machines were running on versions that cannot be updated via the Mac App Store. Secondly, this kind of update really works best when you do a clean install. To get the Mavericks installer, you have to download it to a machine that can access the App Store. Once you’ve done the download, but BEFORE you actually start the installation, quit out of the installer. This leaves you with the Install Mavericks application in your applications folder. This is a 4GB installer file that you can now copy to other drives.

In doing the updates, I found it best to move drives around in the drive bays, putting a blank drive in bay 1 and moving the existing boot drive to bay 2. Format the bay 1 drive and copy the Mavericks installer to it. Run the installer, but select the correct target drive, which should be your new, empty bay 1 drive and NOT the current boot drive that’s running. Once the installation is complete, set up a new user account and migrate your applications from the old boot drive to the new boot drive. I do this without the rest (no documents or preferences). Since these systems didn’t have purchased third-party plug-ins, there weren’t any authorization issues after the migration. My reason for migrating the existing apps was that some of the software, like volume-licensed versions of Microsoft Office and Apple Final Cut Studio were there and I didn’t want to track down the installers again from IT. Naturally before doing this I had already uninstalled junk, like old trial versions or other software a student might have installed in the past. Any needed documents had already been separately backed up.

Once I’m running 10.9.4 on the new boot drive, I access the App Store, sign in with the proper ID and install all the App Store purchases. Since the school has a new volume license for Adobe Creative Cloud, I also have an installer from IT to cover the Adobe apps. Once the software dance is done, my complement includes:

Apple Final Cut Pro Studio “legacy” (FCP 7, DVD Studio Pro, Cinema Tools, Soundtrack Pro, Compressor, Motion, Color)

Apple Final Cut Pro X “new” applications and utilities (FCP X, Motion, Compressor, Xto7, 7toX, Sync-N-Link X, EDL-X, X2Pro)

Adobe Creative Cloud 2014 (Prelude, Premiere Pro, SpeedGrade, Adobe Media Encoder, Illustrator, Photoshop, After Effects, Audition)

Avid Media Composer and Sorenson Squeeze (2 machines only)

Blackmagic Design DaVinci Resolve 11

Miscellaneous applications (Toast Titanium, Handbrake, MPEG Streamclip, Pages, Numbers, Keynote, Word, Excel, Redcine-X Pro)

Internal hard drives

All Mac Pro towers support four internal drives. Last year I had upgraded two of these machines with 500GB Crucial SSDs as their boot drive. While these are nice and fast, I opted to stick with spinning drives for everything else. The performance demand on these systems is not such that there’s really a major advantage over a good mechanical drive. For the most part, all machines now have four internal 1TB Western Digital Black 7200 RPM drives. The exceptions are the two machines with 500GB SSD boot drives and the 2008 Mac, which has two 500GB drives that it came supplied with.

After rearranging the drives, the configuration is: bay 1 – boot drive, bay 2 – “Media A”, bay 3 – “Media B” and bay 4 – Time Machine back-up. The Media A and B drives are used for project files, short term media storage and stock sound effects and music. When these systems were first purchased, I had configured the three drives in the 2, 3 and 4 slots as a single 3TB volume by RAIDing them as a RAID-0 software stripe. This was used as a common drive for media on each of the computers. However, over this last year, one of the machines appeared to have an underperforming drive within the stripe, which was causing all sorts of media problems on this machine. Since this posed the risk of potentially losing 3TB worth of media in the future on any of the Macs, I decided to rethink the approach and split all the drives back to single volumes. I replaced the underperforming drive and changed all the machines to this four volume configuration, without any internal stripes.
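
The risk math behind splitting the stripe is straightforward: a RAID-0 volume is lost if any member drive fails. A short Python sketch – the 5% annual failure rate is an assumption for illustration, and it treats drive failures as independent:

    def raid0_annual_loss(n_drives, afr=0.05):
        """Chance of losing the whole volume in a year of operation."""
        return 1 - (1 - afr) ** n_drives

    print("%.1f%%" % (raid0_annual_loss(1) * 100))  # 5.0% for a single volume
    print("%.1f%%" % (raid0_annual_loss(3) * 100))  # ~14.3% for the 3-drive stripe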

RAM and video cards

The 2010 machines originally came with ATI 5870 video cards and the 2008 an older NVIDIA card. In the course of the past year, one of the 5870 cards died and was replaced with a Sapphire 7950. In revitalizing the 2008 Mac, I decided to put one of the other 5870s into it and then replace it in the 2010 machine with another Sapphire. While the NVIDIA GTX 680 card is also a highly-regarded option, I decided to stick with the ATI/AMD card family for consistency throughout the units. One unit also includes a RED Rocket card for accelerated transcoding of RED .r3d files.

The 2010 machines have all been bumped up to 32GB of RAM (Crucial or Other World Computing). The 2008 uses an earlier vintage of RAM and originally only had 2GB installed. The App Store won’t even let you download FCP X with 2GB. It’s been bumped up to 16GB, which will be more than enough for an overflow unit.

Of these cutting rooms, only one is designed as “higher end” and that’s where most of the professional projects are cut, when the department is directly involved in post. It includes Panasonic HD plasma and Sony SD CRT monitors that are fed by an AJA KONA LHi card. This room was originally configured as an Avid Xpress Meridien-based room back in the SD days, so there are also Digibeta, DVCAM and DAT decks. These still work fine, but are largely unused, as most of the workflow now is file-based (usually RED or Canon).

In order to run Resolve on any external monitor, you need a Blackmagic Design Decklink card. I had temporarily installed a loaner in place of the KONA, but it died, so the KONA went back in. Unfortunately with the KONA and FCP X, I cannot see video on both the Panasonic and Sony at the same time with 1080p/23.98 projects. That’s because of the limitations of what the Panasonic will accept over HDMI, coupled with the secondary processing options of the KONA. The HDMI signal wants P and not PsF and this results in the conflict. In the future, we’ll probably revisit the Decklink card issue, budget permitting, possibly moving the KONA to another bay.

All four 2010 units are equipped with two 27” Apple Cinema Displays, so the rooms without external monitoring simply use one of the screens to display a large viewer in most of the software. This is more than adequate in a small cutting room. The fifth 2008 Mac has dual 20” ACDs. Although my personal preference is to work with something smaller than dual 27” screens – as the lateral distance is too great – a lot of the modern software feels very crowded on smaller screens, such as the 20” ACDs. This is especially true of Resolve 11, which feels best with two 27” screens. Personally, I would have opted for dual 23” or 24” HPs or Dells, but these systems were all purchased this way and there’s no real reason to change.

External storage

Storage on these units has always been local, so in addition to the internal drives, they are also equipped with external storage. Typically, users are encouraged to supply their own external drives for short edits, but storage is made available for extended projects. The main room is equipped with a large MAXX Digital array connected via an ATTO card. The four 2010 rooms each gained a LaCie 4big 12TB array last year. These were connected on one of the FireWire 800 ports and initially configured as RAID-1 (mirror), so only half the capacity was available.

This year I reconfigured/reformatted them as RAID-5, which nets a bit over 8TB of actual capacity. To increase the data throughput, I also added CalDigit FASTA-6GU3 cards to each. This is a PCIe combo host adapter card that provides two USB 3.0 and two eSATA ports. By connecting the LaCie to each of the Macs via USB 3.0, it improves the read/write speeds compared to FireWire 800. While it’s not as fast as Thunderbolt or even the MAXX array, the LaCies on USB 3.0 easily handle ProRes 1080p files and even limited use of native RED files within projects.
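
The capacity figure works out as expected for RAID-5, which gives up one drive’s worth of space to parity. A quick sketch – the decimal-to-binary conversion shown here is one plausible reason the usable number reads as “a bit over 8TB”, with filesystem overhead taking a bit more:

    def raid5_usable_tb(n_drives, drive_tb):
        """RAID-5 stores data on (n-1) drives; one drive's worth goes to parity."""
        return (n_drives - 1) * drive_tb

    raw_tb = raid5_usable_tb(4, 3.0)  # LaCie 4big 12TB: four 3TB drives
    print("%.1f TB raw, %.2f TiB as reported" % (raw_tb, raw_tb * 1e12 / 2**40))
    # -> 9.0 TB raw, 8.19 TiB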

Other

A few other enhancements were made to round out the rooms as cutting bays. First, audio. The main room uses the KONA’s analog audio outputs routed through a small Mackie mixer to supply volume to the speakers. To provide similar capabilities in the other rooms, I added a PreSonus AudioBox USB audio interface and a small Mackie mixer to each. The speakers are a mix of Behringer Truth, KRK Rokit 5 and Rokit 6 powered speaker pairs, mounted on speaker pedestals behind the Apple Cinema Displays. Signal flow is from the computer to the AudioBox via USB (or the KONA in one room), from the channel 1 and 2 analog outputs of the AudioBox (or KONA) into the Mackie, and then from the main mixer outputs to the left and right speakers. In this way, the master fader on the mixer is essentially the volume control for the system. This is used mainly for monitoring, but the combination does allow the connection of a microphone for input back into the Mac for scratch recordings. Of course, having a small mixer also lets you plug in another device just to preview audio.

The fifth Mac Pro isn’t installed in a room that’s designated as a cutting room, so it simply got the repurposed Roland powered near field speakers from an older Avid system. These were connected directly to the computer output.

Last, but not least, it’s the little things. When I started this upgrade round, one of the machines was considered a basket case, because it froze a lot and, therefore, was generally not used. That turned out to simply be a bad Apple Magic Mouse. The mouse would mess up, leaving the cursor frozen. Users assumed the Mac had frozen up, when in fact, it was fine. To fix this and any other potential future mouse issues, I dumped all the Apple Bluetooth mice and replaced them with Logitech wireless mice. Much better feel and the problem was solved!

©2014 Oliver Peters

Avid Everywhere

It’s interesting to see that in spite of a lot of press, the Avid Everywhere concept still results in confusion. They’ve certainly been enunciating it since last year, with a full roll-out at NAB this past April. For whatever reason, Avid Everywhere seems to be lumped together with Adobe Anywhere in the minds of many. Maybe it’s the similarity of names or it’s that they both have a cloud component, but they aren’t the same thing. Avid Everywhere is a corporate vision, while Adobe Anywhere is a specific product (more on that later).

Vision and strategy

Avid Technology is a company with a diverse range of hardware and software products, covering content creation (video, audio, graphics, news), asset management, audio/video hardware i/o, consoles and control surfaces, storage and servers. In an effort to consolidate and rebrand a wide-ranging set of offerings, Avid has repackaged these existing (and future) products under the banner of Avid Everywhere. This is a marketing strategy designed to convey the message that whatever your media needs might be, Avid has a product or service to satisfy that need. This is coupled to a community of users that can benefit from their common use of Avid products.

This vision positions Avid’s products as a “platform”, in the same way that Windows, Mac OS X, iOS, Android, Apple hardware and PC hardware are all platforms. Within this platform concept, the products become stratified into product tiers or “suites”. Bear in mind that “suite” really refers to a group of products and not specifically a collection of hardware or software that you purchase as a single unit. The base layer of this platform contains the various software hooks that tie the products together – for example, APIs required to use Media Composer software with Interplay asset management or in an ISIS SAN environment. This is called the Avid MediaCentral Platform.

On top of this sits the Storage Suite, which consists of the various Avid storage solutions, such as ISIS, along with news play-out servers. The next tier is the Media Suite, which encompasses the Interplay asset management and iNews newsroom products. In the transition to the Avid Everywhere strategy, you’ll see a lot of references on Avid’s website and in their marketing literature to “formerly Interplay ___”. That’s because Avid is in the process of rebranding these products into something with a “Media ___” name.

Most users who are editing and audio professionals will mainly associate Avid with the Artist Suite tier. This is the layer of content creation tools, including Media Composer, Pro Tools, Sibelius and the control surfaces that came out of Digidesign and Euphonix, including the Artist panels. If you are a single user of Media Composer, Pro Tools or Sibelius and own no other Avid infrastructure, like ISIS or Interplay, then the entire Avid Everywhere media platform doesn’t touch you very much for now.

The top layer of the platform chart is MediaCentral | UX, which was formerly known as Interplay Central. This is a web front-end that allows you to browse, log and notate Interplay assets from a desktop computer, laptop or mobile device. Although the current iteration is targeted at news production, the concept is story-centric and could provide functionality in other arenas, such as drama and reality series production.

Surrounding the entire structure are support services (tech support and professional integration services) plus a private and public marketplace. Media Composer software has included a Marketplace menu item for a few versions. Until now, this has been a web portal to buy plug-ins and stock footage. The updated vision for this is more along the lines of services like SoundCloud, Adobe’s Behance service or the files section of Creative Cloud. For example, let’s say you are a composer that uses Pro Tools. You create licensable music tracks and post them to the Marketplace. Other users can browse the Marketplace and find your tracks, complete with licensing and payment arrangements. To make this work, the Avid MediaCentral Platform includes things like proper security to enable such transactions.

All clouds are not the same

I started this post with the comment that I feel many editors confuse Adobe Anywhere and Avid Everywhere. I believe that’s because they mistakenly interpret Avid Everywhere as the specific version of the Media Composer product that enables remote-access editing. As I’ve explained above, Everywhere is a concept and vision, not a product. That specific Media Composer product (formerly Interplay Sphere) is now branded as Media Composer | Cloud. As a product, it most closely approximates Adobe Anywhere, but there are key differences.

Adobe Anywhere is a system that requires a centralized server and storage. Any computer with Premiere Pro CC or CC 2014 can remotely access the assets on this system, which streams proxy media back to that computer. All the “heavy lifting” is done at the central site and the editor’s Premiere Pro is effectively working only as a local front-end. The operation does not allow hybrid editing with a combination of local and remote assets. All local assets have to be uploaded to the server and then streamed back to the editor. That’s because Anywhere manages the assets for multiple editors during collaborative workflows and handles project versioning. If you are working on an Anywhere production, you always have to be connected to the network.

In contrast, Media Composer | Cloud is primarily a plug-in that works with an otherwise standard version of the Media Composer software. In order for it to function, the “home base” facility must have an appropriate Interplay/ISIS infrastructure so that Media Composer | Cloud can talk to it. In Avid marketing parlance “you’ve got to get on the platform” for some of these things to work.

Media Composer | Cloud permits hybrid editing. For example, a news videographer in the field can be editing at the proverbial Starbucks using local assets. Maybe part of the story requires access to past b-roll footage that lives back at the station on its newsroom storage. Through Media Composer | Cloud and Interplay, the videographer can access those files as proxies and integrate them into the piece. Meanwhile, local assets can be uploaded back to the station. When the piece is cut, a “publish” command (an AAF of the sequence) goes back to the station for quick turnaround to air. Media Composer | Cloud, by its nature, doesn’t require continuous connection, so editing can continue during transit, such as in a vehicle.

While not everything about Avid Everywhere has been fully implemented yet, it certainly is an aggressive strategy. It is an attempt to move the company as a whole into areas beyond just editing software, while still allowing users and owners to leverage their Avid assets into other opportunities.

©2014 Oliver Peters