Avid Everywhere

It’s interesting that, in spite of a lot of press, the Avid Everywhere concept still causes confusion. Avid has certainly been enunciating it since last year, with a full roll-out at NAB this past April. For whatever reason, Avid Everywhere seems to be lumped together with Adobe Anywhere in the minds of many. Maybe it’s the similarity of names, or that they both have a cloud component, but they aren’t the same thing. Avid Everywhere is a corporate vision, while Adobe Anywhere is a specific product (more on that later).

Vision and strategy

Avid Technology is a company with a diverse range of hardware and software products, covering content creation (video, audio, graphics, news), asset management, audio/video hardware i/o, consoles and control surfaces, storage and servers. In an effort to consolidate and rebrand a wide-ranging set of offerings, Avid has repackaged these existing (and future) products under the banner of Avid Everywhere. This is a marketing strategy designed to convey the message that whatever your media needs might be, Avid has a product or service to satisfy that need. This is coupled with a community of users who can benefit from their common use of Avid products.

This vision positions Avid’s products as a “platform”, in the same way that Windows, Mac OS X, iOS, Android, Apple hardware and PC hardware are all platforms. Within this platform concept, the products become stratified into product tiers or “suites”. Bear in mind that “suite” really refers to a group of products and not specifically a collection of hardware or software that you purchase as a single unit. The base layer of this platform contains the various software hooks that tie the products together – for example, APIs required to use Media Composer software with Interplay asset management or in an ISIS SAN environment. This is called the Avid MediaCentral Platform.

On top of this sits the Storage Suite, which consists of the various Avid storage solutions, such as ISIS, along with news play-out servers. The next tier is the Media Suite, which encompasses the Interplay asset management and iNews newsroom products. In the transition to the Avid Everywhere strategy, you’ll see a lot of references on Avid’s website and in their marketing literature to “formerly Interplay ___”. That’s because Avid is in the process of rebranding these products into something with a “Media ___” name.

Most editing and audio professionals will mainly associate Avid with the Artist Suite tier. This is the layer of content creation tools, including Media Composer, Pro Tools, Sibelius and the control surfaces that came out of Digidesign and Euphonix, including the Artist panels. If you are a single user of Media Composer, Pro Tools or Sibelius and own no other Avid infrastructure, like ISIS or Interplay, then the entire Avid Everywhere media platform doesn’t touch you very much for now.

The top layer of the platform chart is MediaCentral | UX, which was formerly known as Interplay Central. This is a web front-end that allows you to browse, log and notate Interplay assets from a desktop computer, laptop or mobile device. Although the current iteration is targeted at news production, the concept is story-centric and could provide functionality in other arenas, such as drama and reality series production.

Surrounding the entire structure are support services (tech support and professional integration services) plus a private and public marketplace. Media Composer software has included a Marketplace menu item for a few versions. Until now, this has been a web portal to buy plug-ins and stock footage. The updated vision for this is more along the lines of services like SoundCloud, Adobe’s Behance service or the files section of Creative Cloud. For example, let’s say you are a composer who uses Pro Tools. You create licensable music tracks and post them to the Marketplace. Other users can browse the Marketplace and find your tracks, complete with licensing and payment arrangements. To make this work, the Avid MediaCentral Platform includes things like proper security to enable such transactions.

All clouds are not the same

I started this post with the comment that I feel many editors confuse Adobe Anywhere and Avid Everywhere. I believe that’s because they mistakenly interpret Avid Everywhere as the specific version of the Media Composer product that enables remote-access editing. As I’ve explained above, Everywhere is a concept and vision, not a product. That specific Media Composer product (formerly Interplay Sphere) is now branded as Media Composer | Cloud. As a product, it most closely approximates Adobe Anywhere, but there are key differences.

Adobe Anywhere is a system that requires a centralized server and storage. Any computer with Premiere Pro CC or CC 2014 can remotely access the assets on this system, which streams proxy media back to that computer. All the “heavy lifting” is done at the central site and the editor’s Premiere Pro is effectively working only as a local front-end. The operation does not allow hybrid editing with a combination of local and remote assets. All local assets have to be uploaded to the server and then streamed back to the editor. That’s because Anywhere manages the assets for multiple editors during collaborative workflows and handles project versioning. If you are working on an Anywhere production, you always have to be connected to the network.

In contrast, Media Composer | Cloud is primarily a plug-in that works with an otherwise standard version of the Media Composer software. In order for it to function, the “home base” facility must have an appropriate Interplay/ISIS infrastructure so that Media Composer | Cloud can talk to it. In Avid marketing parlance, “you’ve got to get on the platform” for some of these things to work.

Media Composer | Cloud permits hybrid editing. For example, a news videographer in the field can be editing at the proverbial Starbucks using local assets. Maybe part of the story requires access to past b-roll footage that lives back at the station on its newsroom storage. Through Media Composer | Cloud and Interplay, the videographer can access those files as proxies and integrate them into the piece. Meanwhile, local assets can be uploaded back to the station. When the piece is cut, a “publish” command (an AAF of the sequence) goes back to the station for quick turnaround to air. Media Composer | Cloud, by its nature, doesn’t require continuous connection, so editing can continue during transit, such as in a vehicle.

While not everything about Avid Everywhere has been fully implemented yet, it certainly is an aggressive strategy. It is an attempt to move the company as a whole into areas beyond just editing software, while still allowing users and owners to leverage their Avid assets into other opportunities.

©2014 Oliver Peters

Red Giant Universe

Red Giant Software, developers of such popular effects and editing tools as Trapcode and Magic Bullet, recently announced Red Giant Universe. Red Giant has adopted a hybrid free/subscription model. Once you sign up for a Red Giant account and log into Universe, you have access to all the free filters and transitions that are part of this package. Initially this includes 31 free plug-ins (22 effects, 9 transitions) and 19 premium plug-ins (12 effects, 7 transitions). Universe users have a 30-day trial period before the premium effects become watermarked. Premium membership pricing will be $10/month, $99/year or $399/lifetime. Lifetime members will receive routine updates without any further cost.

A new approach to a fresh and growing library of effects

The general mood among content creators has been against subscription models; however, when I polled thoughts about the Universe model on one of the Creative COW forums, the comments were very positive. From Red Giant’s early press on Universe, I had gotten the impression that Universe would be an environment in which users could create their own custom effects. In fact, this isn’t the case at all. The Universe concept is built on Supernova, an internal development tool that Red Giant’s designers use to create new effects and transitions. Supernova draws from a library of building-block filters that can be combined to create new plug-in effects. It’s somewhat similar to Apple’s Quartz Composer development tool; however, it is not part of the package that members can access.
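To picture the building-block approach, here’s a minimal sketch – with made-up image primitives, not Red Giant’s actual filter library – of how a new “plug-in” can simply be a composition of simpler filters:

```python
import numpy as np

def brighten(img, amount=0.1):
    # primitive 1: lift all channels, clamped to legal range
    return np.clip(img + amount, 0.0, 1.0)

def box_blur(img, radius=1):
    # primitive 2: crude box blur (edges wrap; fine for a sketch)
    out = img.copy()
    for axis in (0, 1):
        acc = np.zeros_like(out)
        for shift in range(-radius, radius + 1):
            acc += np.roll(out, shift, axis=axis)
        out = acc / (2 * radius + 1)
    return out

def compose(*filters):
    """Chain primitives into a single reusable 'plug-in'."""
    def plugin(img):
        for f in filters:
            img = f(img)
        return img
    return plugin

soft_glow = compose(box_blur, brighten)  # a new effect from two blocks
frame = np.random.rand(8, 8, 3)          # stand-in for a video frame
print(soft_glow(frame).shape)            # (8, 8, 3)
```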

Red Giant plans to build a community around the Universe members, who will have some input into the types of new plug-ins created. These plug-ins will only be generated by Red Giant designers and partner developers. Currently they are working with Crumplepop, with whom they created Retrograde – one of the premium plug-ins. The point of being a paid premium member is to continue receiving routine updates that add to the repertoire of Universe effects that you own. In addition, some of the existing Red Giant products will be ported to Universe in the future as new premium effects.

This model is similar to what GenArts had done with Sapphire Edge, which was based on an upfront purchase, plus a subscription for updated effects “collections” (essentially new preset versions of an Edge plug-in). These were created by approved designers and added to the library each month. (Note: Sapphire Edge – or at least the FX Central subscription – appears to have been discontinued this year.) Unlike the Sapphire Edge “collections”, the Universe updates are not limited to presets, but will include brand new plug-ins. Red Giant tells me they currently have several dozen in the development pipeline already.

Red Giant Universe supports both Mac and Windows and runs in recent versions of Adobe After Effects, Premiere Pro, Apple Final Cut Pro X and Motion. At least for now, Universe doesn’t support Avid, Sony Vegas, DaVinci Resolve, EDIUS or Nuke hosts. Members will be able to install the software on two computers, and a single installation of Universe will install these effects into all applicable hosts, so one purchase covers everything.

Free and premium effects with GPU acceleration

In this initial release, the free effects cover many standards, including blurs, glows, distortion effects, generators and transitions. The premium effects include some that have been ported over from other Red Giant products, including Knoll Light Factory EZ, Holomatrix, Retrograde, ToonIt and others. In case you are concerned about duplication if you’ve already purchased some of these effects, Red Giant answers this in their FAQ: “We’ve retooled the tools. Premium tools are faster, sleeker versions of the Red Giant products that you already know and love. ToonIt is 10x faster. Knoll Light Factory is 5x faster. We’ve streamlined [them] with fewer controls so you can work faster. All of the tools work seamlessly with [all of the] host apps, unlike some tools in the Effects Suite.”

The big selling point is that these are high-quality, GPU-accelerated effects, which use 32-bit float processing for trillions of colors. Red Giant is using OpenGL rather than OpenCL or NVIDIA’s CUDA technology, because it is easier to provide support across various graphics cards and operating systems. The recommendation is to have one of the newer, faster NVIDIA or AMD cards or mobile GPUs. The minimum GPU is an Intel HD 3000 integrated graphics chip. According to Red Giant, “Everything is rendered on the GPU, which makes Universe up to 10 times faster than CPU-based graphics. Many tools use advanced render technology that’s typically used in game development and simulation.”

In actual use

After Universe is installed, the updates are managed through the Red Giant Link utility. This will now keep track of all Red Giant products that you have installed (along with Universe) and lets you update as needed. The effects themselves are nice and the quality is high, but these are largely standard effects, so far. There’s nothing major yet that isn’t already represented with a similar effect among the built-in filters and transitions that come as part of FCP X, Motion or After Effects. Obviously, there are subjective differences in one company’s “bad TV” or “cartoon” look versus that of another, so whether or not you need any additional plug-ins becomes a personal decision.

As far as GPU acceleration is concerned, I do find the effects to be responsive when I adjust them and preview the video. This is especially true in a host like Final Cut Pro X, which is really tuned for the GPU. For example, adding and adjusting a Knoll lens flare from the Universe package performs better on my 2009 Mac Pro (8-core with an NVIDIA Quadro 4000) than do the other third-party flare filters I have available on this unit.

The field is pretty crowded when you stack up Universe against such established competitors as GenArts Sapphire, Boris Continuum Complete, Noise Industries FxFactory Pro and others. As yet, Universe does not offer any tools that fill in workflow gaps, like tracking, masking or even keyers. I’m not sure the monthly subscription makes sense for many customers. It would seem that free will be attractive to many, while an annual or lifetime subscription will be the way most paying users purchase Universe. The lifetime price lines up well when you compare it to the others, in terms of purchasing a filter package.

Red Giant Universe is an ideal package of effects for editors. While Apple has developed a system with Motion where any user can create new FCP X effects based on templates, the reality is that few working editors have the time or interest to do that. They want effects that can be quickly applied with a minimum amount of tweaking and that perform well on a timeline. That is what impresses clients and wins editors over to your product. With that target in mind, Red Giant definitely will do well with Universe if it holds to its promise. Ultimately the success of Universe will hang on how prolific the developers are and how quickly new effects come through the subscription pipeline.

Originally written for Digital Video magazine/Creative Planet Network

©2014 Oliver Peters

Final Cut Pro X Batch Export

One of the “legacy” items that editors miss when switching to Final Cut Pro X is the batch export function. For instance, you might want to encode H.264 versions of numerous ProRes files from your production, in order to upload raw footage for client review. While FCP X can’t do it directly, there is a simple workaround that will give you the same results. It just takes a few steps.

Step one. The first thing to do is to find the clips that you want to batch export. In my example images, I selected all the bread shots from a grocery store commercial. These have been grouped into a keyword collection called “bread”. Next, I have to edit these to a new sequence (FCP X project) in order to export. These can be in a random order and should include the full clips. Once the clips are in the project, export an FCPXML from that project.
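That FCPXML is plain XML under the hood. As a rough illustration – the exact element layout varies with the FCPXML version, so treat this structure as an assumption – a few lines of Python can list the source media the file references:

```python
# Sketch: list media assets referenced by an FCPXML file. Assumes the
# common layout where <resources> holds <asset> elements carrying
# "name" and "src" attributes; adjust for your FCPXML version.
import xml.etree.ElementTree as ET

def list_assets(fcpxml_path):
    root = ET.parse(fcpxml_path).getroot()
    for asset in root.iter("asset"):
        print(asset.get("name", "(unnamed)"), "->", asset.get("src", "(no src)"))

list_assets("bread.fcpxml")  # hypothetical export from the bread project
```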

Step two. I’m going to use the free application ClipExporter to work the magic. Launch it and open the FCPXML for the sequence of bread shots. ClipExporter can be used for a number of different tasks, like creating After Effects scripts, but in this case we are using it to create QuickTime movies. Make sure that all of the other icons are not lit. If you toggle the Q icon (QuickTime) once, you will generate new self-contained files, but these might not be the format you want. If you toggle the Q twice, it will display the icon as QR, which means you are now ready to export QuickTime reference files – also something useful from the past. ClipExporter will generate a new QuickTime file (self-contained or reference) for each clip in the FCP X project. These will be copied into the target folder location that you designate.

Step three. ClipExporter places each new QuickTime clip into its own subfolder, which is a bit cumbersome. Here’s a neat trick that will help. Use the Finder window’s search bar to locate all files that end with the .mov extension. Make sure you limit the search to only your target folder and not the entire hard drive. Once the clips have been selected, copy-and-paste them to a new location or drag them directly into your encoding application. If you created reference files, copying them will go quickly and not take up additional hard drive space.
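If you’d rather script this step, here’s a minimal sketch that does the same thing – the folder paths are hypothetical stand-ins for your own locations:

```python
# Walk the ClipExporter output folder, find every .mov in the per-clip
# subfolders and copy them flat into one folder for encoding.
import shutil
from pathlib import Path

source = Path("/Volumes/Media/ClipExporter_Out")  # ClipExporter target
dest = Path("/Volumes/Media/ForCompressor")       # flat staging folder
dest.mkdir(parents=True, exist_ok=True)

for mov in source.rglob("*.mov"):
    # rglob only searches below `source`, mirroring the advice to limit
    # the search to the target folder rather than the whole drive
    shutil.copy2(mov, dest / mov.name)
    print("copied", mov.name)
```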

Step four. Drop your selected clips into Compressor or whatever other encoding application you choose. (It will need to be able to read QuickTime reference movies.) Apply your settings and target destination and encode.

Step five. Since many encoding presets typically append a suffix to the file name, you may want to alter or remove this on the newly encoded files. I use Better Rename to do this. It’s a batch utility for file name manipulation.
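This, too, can be scripted if you prefer. A minimal sketch – the suffix, extension and folder are assumptions to adapt to your own preset:

```python
# Strip an encoder-appended suffix (e.g. "-h264") from encoded files.
from pathlib import Path

encoded = Path("/Volumes/Media/Encoded")  # hypothetical output folder
suffix = "-h264"                          # whatever your preset appends

for clip in encoded.glob(f"*{suffix}.mp4"):
    clean = clip.with_name(clip.name.replace(suffix, "", 1))
    clip.rename(clean)
    print(clip.name, "->", clean.name)
```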

There you go – five easy steps (fewer if you skip some of the optional tasks) to restore batch exports to FCP X.

©2014 Oliver Peters

New NLE Color Features

As someone who does color correction as often within an NLE as in a dedicated grading application, it’s nice to see that Apple and Adobe are not treating their color tools as an afterthought. (No snide Apple Color comments, please.) Both the Final Cut Pro 10.1.2 and Creative Cloud 2014 updates include new tools specifically designed to improve color correction.

Apple Final Cut Pro 10.1.2

This FCP X update includes a new, built-in LUT (look-up table) feature designed to correct log-encoded camera files into Rec 709 color space. This type of LUT is camera-specific and FCP X now comes with preset LUTs for ARRI, Sony, Canon and Blackmagic Design cameras. This correction is applied as part of the media file’s color profile and, as such, takes effect before any filters or color correction is applied.

These LUTs can be enabled for master clips in the event, or after a clip has been edited to a sequence (FCP X project). The log processing can be applied to a single clip or a batch of clips in the event browser. Simply highlight one or more clips, open the inspector and choose the “settings” selection. In that pane, access the “log processing” pulldown menu and choose one of the camera options. This will now apply that camera LUT to all selected clips and will stay with a clip when it’s edited to the sequence. Individual clips in the sequence can later be enabled or disabled as needed. This LUT information does not pass through as part of an FCPXML roundtrip, such as sending a sequence to Resolve for color grading.

Although camera LUTs are specific to the color science used for each camera model’s type of log encoding, this doesn’t mean you can’t use a different LUT. Naturally some will be too extreme to be usable. Others, however, are close, and using a different LUT might give you an interesting creative result, somewhat like cross-processing in a film lab.
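If it helps to picture what a LUT actually does: each incoming RGB value indexes a table that returns the corrected value. Here’s a toy sketch using nearest-neighbor lookup on an identity cube – real camera LUTs store the manufacturer’s corrected lattice values and use smoother interpolation:

```python
import numpy as np

def apply_lut_nearest(image, cube):
    """image: float RGB in [0,1], shape (h, w, 3); cube: (n, n, n, 3)."""
    n = cube.shape[0]
    # scale each channel to a cube index, then look up the new color
    idx = np.clip((image * (n - 1)).round().astype(int), 0, n - 1)
    return cube[idx[..., 0], idx[..., 1], idx[..., 2]]

# Identity cube: output equals input. A log-to-Rec 709 LUT would store
# the camera maker's corrected color at each lattice point instead.
n = 17
g = np.linspace(0.0, 1.0, n)
identity = np.stack(np.meshgrid(g, g, g, indexing="ij"), axis=-1)

frame = np.random.rand(4, 4, 3)                   # stand-in for a frame
out = apply_lut_nearest(frame, identity)
assert np.allclose(out, frame, atol=1 / (n - 1))  # identity ≈ unchanged
```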

Adobe CC 2014 – Premiere Pro CC and SpeedGrade CC

In this CC 2014 release, Adobe added master clip effects that travel back and forth between Premiere Pro CC and SpeedGrade CC via Direct Link. Master clip effects are relational, meaning that the color correction is applied to the master clip and, therefore, every instance of this clip that is edited to the sequence will have the same correction applied to it automatically. When you send the Premiere Pro CC sequence to SpeedGrade CC, you’ll see that the 2014 version now has two correction tabs: master clip and clip. If you want to apply a master clip effect, choose that tab and do your grade. If other sections of the same clip appear on the timeline, they have automatically been graded.

Of course, with a lot of run-and-gun footage, iris levels and lighting change, so one setting might not work for the entire clip. In that case, you can add a second level of grading by tweaking the shot in the clip tab. Effectively you now have two levels of grading. Depending on the show, you can grade in the master clip tab, the clip tab or both. When the sequence goes back to Premiere Pro CC, SpeedGrade CC corrections are applied as Lumetri effects added to each sequence clip. Any master clip effects also “ripple back” to the master clip in the bin. This way, if you cut a new section from an already-graded master clip to that or any other sequence, color correction has already been applied to it.

In the example I created, the shot was graded as a master clip effect. Then I added more primary correction and a filter effect by using the clip mode for the first time the clip appears in the sequence. This was used to create a cartoon look for that segment on the timeline. Compare the two versions of these shots – one with only a master clip effect (shots match) and the other with a separate clip effect added to the first (shots are different).

Since master clip effects apply globally to source clips within a project, editors should be careful about changing them or copy-and-pasting them, as you may inadvertently alter another sequence within the same project.

©2014 Oliver Peters

Adobe Anywhere

Adobe Anywhere for video is Adobe’s first foray into collaborative editing. Anywhere functions a lot like other shared storage environments, except that editors and producers are not bound to working within the facility and its hard-wired network. The key difference between Adobe Anywhere and other NLE/SAN combinations is that all media is stored at the central location and the system’s servers handle the actual editing and compositing functions of the editing software. This means that no media is stored on the editor’s local computer and lightweight client stations can be used, since the required horsepower exists at the central location. Anywhere works within a facility using the existing LAN or externally over the internet when client systems connect remotely over VPN. Currently Adobe Anywhere is integrated directly into Adobe Premiere Pro CC and Prelude CC (Windows and OS X). Early access to After Effects integration is part of Adobe Anywhere 1.6, with improved integration available in the next release.

The Adobe Anywhere cluster

Adobe Anywhere software is installed on a set of Windows servers, which are general purpose server computers that you would buy from a vendor like Dell or HP. The software creates two types of nodes: a single Adobe Anywhere Collaboration Hub node and three or more Adobe Mercury Streaming Engine nodes. Each node is installed on a separate server, so a minimum configuration requires four computers. This is separate from the shared storage. If you use a SAN, such as a Facilis Technology or an EditShare system, the SAN will be mounted at the OS level by the computing cluster of Anywhere servers. Local and remote editors can upload source media to the SAN for shared access via Anywhere.

The Collaboration Hub computer stores all of the Anywhere project metadata, manages user access and coordinates the other nodes in the system. The Mercury Streaming Engine computers provide real-time, dynamic viewing streams of Premiere Pro and Prelude sequences with GPU-accelerated effects. Media stays in its native file format on the storage servers. There are no proxy files created by the system. In order to handle real-time effects, each of the Streaming Engine servers must be equipped with a high-end NVIDIA graphics card.

As a rule of thumb, this minimum cluster size supports 10-15 active users, according to Adobe. However, the actual number depends on media type, resolution, number of simultaneous source clips needed per editor, as well as activities that may be automated like import and export. Adobe prices the Anywhere software based on the number of named users. This is a subscription model of $1,000/year/user. That’s in addition to installed seats of Creative Cloud and the cost of the hardware to make the system work, which is supplied by other vendors and not Adobe. Since this is not sold as a turnkey installation by Adobe, certain approved vendors, like TekServe and Keycode Media, have been qualified as Adobe Anywhere system integrators.

How it works

While connected to Adobe Anywhere and working with an Anywhere project, the Premiere Pro or Prelude application on the local computer is really just functioning as the software front-end that is driving the application running back at the server. The results of the edit decisions are streamed back to the local machine in real time as a single stream of video. The live stream of media from the Mercury Streaming Engine is handled in a similar fashion to the playback resolution throttle that’s already part of Premiere Pro. As native media is played, the computer adjusts the stream’s playback compression based on bandwidth. Whenever playback is paused, the parked frame is updated to full resolution – thus enabling an editor to tweak an effect or composite and always see the full resolution image while making the adjustments.

To understand this better, let’s use the example of a quad split. If this were done locally, the drives would be playing back four streams of video and the software and GPU of that local computer would composite the quad split and present a single stream of video to the viewer display. In the case of Adobe Anywhere, the playback of these four streams and the compositing of the quad split would take place on the Mercury Streaming Engine computer. In turn, it would stream this live composite as a single feed of video back to the remotely connected computer. Since all the “heavy lifting” is done at “home base”, the system requirements for the client machine can be less beefy. In theory, you could be working with a MacBook Air, while editing RED Epic 5K footage.
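The play/pause behavior is easy to model. This is purely a conceptual sketch of the logic described above – not Adobe’s code, and the bandwidth thresholds are invented for illustration:

```python
# During playback the stream quality tracks available bandwidth; on
# pause, the parked frame is re-sent at full resolution.
def stream_quality(playing: bool, bandwidth_mbps: float) -> str:
    if not playing:
        return "full-resolution frame"  # parked frame updates to full res
    if bandwidth_mbps >= 50:
        return "high-quality stream"
    if bandwidth_mbps >= 10:
        return "medium-quality stream"
    return "low-quality stream"

for playing, mbps in [(True, 80), (True, 12), (True, 4), (False, 4)]:
    print(playing, mbps, "->", stream_quality(playing, mbps))
```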

Productions

Another difference with Adobe Anywhere is that instead of having Premiere Pro or Prelude project files, users create shared productions, designed for multi-user and multi-application access. This way a collaborating team is set up like a workgroup with assigned permission levels. Media is common and central to avoid media duplication. Any media that is added on-site is uploaded to the production in its native resolution and becomes part of the shared assets of the production. The Collaboration Hub computer manages the database for all productions.

When a user remotely logs into an Adobe Anywhere Production, media to which he or she has been granted access is available for browsing using Premiere Pro’s standard Media Browser panel. When an editor starts working, Anywhere automatically makes a virtual “clone” of his or her production items and opens them in a private session. Because multiple people can be working in the same production at the same time, Adobe Anywhere provides protection against conflicts or overwrites. In order to share your private changes, you must first get any updates from the shared production. This pulls all shared changes into your private view. If another person has changed the same asset you are working on, you are provided with information about the conflict and given the opportunity to keep the other person’s changes, your changes or both. Once you make your choices, you can then transfer your changes back to the shared production. Anywhere also maintains a version history, so if unwanted changes are made, you can revert to an earlier or alternate version.
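A toy model of that update/share cycle – an illustration of the workflow just described, not Adobe’s actual API:

```python
# Pull shared changes into a private session, resolve per-asset
# conflicts (keep theirs, yours or both), then push the result back.
def merge(shared: dict, private: dict, keep: str = "both") -> dict:
    merged = dict(shared)                # start from the shared state
    for asset, change in private.items():
        if asset in shared and shared[asset] != change:
            if keep == "theirs":         # drop your conflicting change
                continue
            if keep == "both":           # keep both versions
                merged[asset + " (alt)"] = change
                continue
        merged[asset] = change           # no conflict, or keep "yours"
    return merged

shared = {"scene1": "trimmed head"}
private = {"scene1": "new color grade", "scene2": "rough cut"}
print(merge(shared, private, keep="both"))
```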

Adobe Anywhere in the wild

Although large installations like CNN are great for publicity headlines, Adobe Anywhere is proving to be useful at smaller facilities, too. G-Men Media is a production company based in Venice, California. They are focused primarily on feature film and commercial broadcast work. According to G-Men COO, Jeff Way, “G-Men was originally founded with the goal of utilizing the latest digital technologies available to reduce costs, accelerate workflow and minimize turnaround time for our clients. Adobe Anywhere allowed us to provide our clients a more efficient workflow on post productions without having to grow infrastructure on a per project basis.”

“A significant factor of Adobe Anywhere, which increased the growth of our client base, was the system’s ability to organize production teams based on talent instead of location. If we can minimize or eliminate time required for coordinating actual production work (i.e. shipping hard drives, scheduling meetings with editors, awaiting review/approval), we can save clients money that they can then invest into more creative aspects of the project – or simply undercut their budget. Furthermore, we have the ability to scale up or down without added expenses in infrastructure. All that’s required on our end is simply granting the Creative Cloud seat access to the system assets for their production.”

The G-Men installation was handled by Keycode Media, based on the recommended Adobe configuration described at the beginning of this article. This includes four SuperMicro 1U rack-mounted SuperServers. Three of these operate as the Adobe Anywhere Mercury Streaming Engines and the fourth acts as the Adobe Anywhere Collaboration Hub. Each of the Mercury Streaming Engines has its own individual NVIDIA Tesla K10 GPU card. The servers are connected to a Facilis Terrablock shared storage array via a 10 Gigabit Ethernet switch. Their Internet feed is via a fiber optic connection, typically operating at 500Mbps (down) / 150Mbps (up). G-Men has used the system on every project since it went live in August of 2013. Noteworthy was its use for post on Savageland – the first feature film to run through an Adobe Anywhere system.

Way continued, “Savageland ended up being a unique situation and the ultimate test of the system’s capabilities. Savageland was filmed over three years with various forms of media from iPhone and GoPro footage to R3D raw and Canon 5D. It was really a matter of what the directors/producers could get their hands on from day-to-day. After ingesting the assets into our system, we were able to see a fluid transition straight into editing without having to transcode media assets. One of the selling factors of gaining Savageland as a client was the flexibility and feasibility of allowing all of the directors and editors (who lived large distances from each other in Los Angeles) to work at their convenience. The workflow for them changed from setting aside their weekends and nights for review meetings at a single location to a readily available review via their MacBooks and iPads.”

“For most of our clients, the system has allowed them to bring on the editorial talent they want without having to worry about the location of the editor. At the same time, the editors enjoyed the flexibility of working from wherever they wanted – many times out of their own homes. The benefit for editors and directors is the capability to remotely collaborate and provide feedback immediately. We’ve had a few productions where there are more than one editor working on the same assets – both creating different versions of the same edit. At the same time we had a director viewing the changes immediately after they were shared, with notes on each version. Then they had the ability to immediately make a decision on one or the other or provide creative feedback, so the editors could immediately apply the changes in real time.”

G-Men is in production on Divine Access, a feature film being shot in Austin, Texas. Way explained, “We’re currently in Austin beginning principal photography. Knowing the cloud-based editing workflows available to us, we wanted to expand the benefits we are gaining in post to the entirety of a feature film production from first location scout to principal photography and all the way through to delivery. We’re using our infrastructure to ingest and begin edits as we shoot, which is really new and exciting to all of the producers working on the film.  With the upload speeds we have available to us, we are able to provide review/approvals to our director the same day.”

Originally written for Digital Video magazine/CreativePlanetNetwork.

©2014 Oliver Peters

Cold In July

Jim Mickle started his career as a freelance editor in New York, working on commercials and corporate videos, like so many others. Bitten by the filmmaking bug, Mickle has gone on to successfully direct four indie feature films, including his latest, Cold in July. Like his previous film, We Are What We Are, it had a successful premiere at the Sundance Film Festival.

Cold In July, which is based on a novel by Joe R. Lansdale, is a noir crime drama set in 1980s East Texas. It stars Michael C. Hall (Dexter), Sam Shepard (Out of the Furnace, Killing Them Softly) and Don Johnson (Django Unchained, Miami Vice). Awakened in the middle of the night, small town family man Richard Dane (Hall) kills a burglar in his house. Dane soon fears for his family’s safety when the burglar’s ex-con father, Ben (Shepard), comes to town, bent on revenge. However, the story takes a twist into a world of corruption and violence. Add Jim Bob (Johnson) to this mix, as a pig-farming private eye, and you have an interesting trio of characters.

According to Jim Mickle, Cold In July was on a fast-track schedule. The script was optioned in 2007, but production didn’t start until 2013. This included eight weeks of pre-production beginning in May and principal photography starting in July (for five weeks) with a wrap in September. The picture was “locked” shortly after Thanksgiving. Along with Mickle, John Paul Hortsmann (Killing Them Softly) shared editing duties.

I asked Mickle how it was to work with another editor. He explained, “I edited my last three films by myself, but with this schedule, post was wedged between promoting We Are What We Are and the Sundance deadline. I really didn’t have time to walk away from it and view it with fresh eyes. I decided to bring John Paul on board to help. This was the first time I’ve worked with another editor. John Paul was cutting while I was shooting and edited the initial assembly, which was finished about a week before the Sundance submission deadline. I got involved in the edit about mid-October. At that point, we went back to tighten and smooth out the film. We would each work on scenes and then switch and take a pass at each other’s work.”

Mickle continued, “The version that we submitted to Sundance was two-and-a-half hours long. John Paul and I spent about three weeks polishing and were ready to get feedback from the outside. We held a screening for 20 to 25 people and afterwards asked questions about whether the plot points were coherent to them. It’s always good for me, as the director, to see the film with an audience. You get to see it fresh – with new eyes – and that helps you to trim and condense sections of the film. For example, in the early versions of the script, it generally felt like the middle section of the film lost tension. So, we had added a sub-plot element into the script to build up the mystery. This was a car of agents tailing our hero that we could always reuse, as needed. When we held the screening, it felt like that stuff was completely unnecessary and simply put on top of the rest of the film. The next day we sliced it all out, which cut 10 minutes out of the film. Then it finally felt like everything clicked.”

The director-editor relationship always presents an interesting dynamic, since the editor can be objective in cutting out material that may have cost the director a lot of time and effort on set to capture. Normally, the editor has no emotional investment in the production of the footage. So, how did Jim Mickle, as the editor, treat his own work as the director? Mickle answered, “As an editor, I’m more ruthless on myself as the director. John Paul was less quick to give up on scenes than I. There are things I didn’t think twice about losing if they didn’t work, but he’d stay late to fix things and often have a solution the next day. I shoot with plenty of coverage these days, so I’ll build a scene and then rework it. I love the edit. It’s the first time you really feel comfortable and can craft the story. On the set, things happen so quickly that you always have to be reactive – working and thinking on your feet.”

Although Mickle had edited We Are What We Are with Adobe Premiere Pro, the decision was made to shift back to Apple Final Cut Pro 7 for the edit of Cold In July. Mickle explained, “As a freelance editor in New York, I was very comfortable with Final Cut, but I’m also an After Effects user. When doing a lot of visual effects, it really feels tedious to go back and forth between Final Cut and After Effects. The previous film was shot with RED cameras and I used a raw workflow in post, cutting natively with Premiere Pro. I really loved the experience – working with raw files and Dynamic Link between Premiere and After Effects. When we hired John Paul as the primary editor on the film, we opted to go back to Final Cut, because that is what he is most comfortable with. That would get the job done in the most expedient fashion, since he was handling the bulk of the editing.”

“We shot with RED cameras again, but the footage was transcoded to ProRes for the edit. I did find the process to be frustrating, though, because I really like the fluidness of using the raw files in Premiere. I like the editing process to live and breathe and not be delineated. Having access to the raw files lets me tweak the color correction, which helps me to get an idea of how a scene is shaping up. I get the composer involved early, so we have a lot of the real music in place as a guide while we edit. This way, your cutting style – and the post process in general – are more interactive. In any case, the ProRes files were only used to get us to the locked cut. Our final DI was handled by Light Iron in New York and they conformed the film from the original RED files for a 2K finish.”

The final screening with mix, color correction and all visual effects occurred just before Sundance. There the producers struck a distribution deal with IFC Films. Cold In July started its domestic release in May of this year.

Originally written for Digital Video magazine/CreativePlanetNetwork.

©2014 Oliver Peters

Apple’s New Mac Pro

The run of the brushed aluminum tower design that highlighted Apple’s PowerMac G5 and Intel Mac Pros ended with the introduction of a radical replacement in late 2013. No matter what the nickname – “the cylinder”, “the tube” or whatever – Apple’s new 2013 Mac Pro is a tour de force of industrial design. Few products have had such pent-up demand. The long lead times for custom machines originally ran to months, but with accelerated production they have now been reduced to 24 hours. Nevertheless, if you are happy with a stock configuration, then it’s possible to walk out with a new unit on the same day at some Apple Store or reseller retail locations.

Design

The 2013 Mac Pro features a cylindrical design. It’s about ten inches tall, six-and-a-half inches in diameter and, thanks to a very dense component construction, weighs about eleven pounds. The outer shell – it’s actually a sleeve that can be unlocked and lifted off – uses a dark (not black) reflective coating. Internally, the circuits are mounted onto a triangle-shaped core. There’s a central vent system that draws air in through the bottom and out through the top, much like a chimney. You can still mount the Mac Pro sideways without issue, as long as the vents are not blocked. This design keeps the unit quiet and cool most of the time. During my tests, the fan noise was quieter than my tower (generally a pretty quiet unit) and the fans never kicked into high.

Despite the small size, all components are workstation class and not the mobile or desktop products used in the Apple laptops or iMacs. It employs the fastest memory and storage of any Mac and is designed to pick up where the top-of-the-line iMac leaves off. The processors are Intel Xeon instead of Core i5 or Core i7 CPUs and the graphics cards are AMD FirePro GPUs. This Xeon is a single, multicore CPU chip. Four processor options are offered (4, 6, 8 and 12-core), ranging in speed from 3.7GHz (4-core) to 2.7GHz (12-core). RAM can be maxed out to a full 64GB. It is the only component of the Mac Pro where a user-installed, third-party upgrade is an easy option.

The Mac Pro is optimized for dual graphics processors with three GPU choices: D300 (2GB VRAM each), D500 (3GB VRAM each) or D700 (6GB VRAM each) GPUs. Internal storage is PCIe-based flash memory in 256GB, 512GB or 1TB configurations. These are not solid state drives (SSDs), but rather flash storage like that used in the iPads. Storage is connected directly to the PCIe bus of the Mac Pro for the fastest possible data i/o. The stock models start at $2,999 (4-core) and $3,999 (6-core).

Apple shipped me a reviewer’s unit, configured in a way that they feel is the “sweet spot” for high-end video. My Mac Pro was the 8-core model, with 32GB of RAM, dual D700 GPUs and 512GB of storage. This configuration with a keyboard, mouse and AppleCare extended warranty would retail at $7,166.

Connectivity

All connectors are on the back – four USB 3.0, six Thunderbolt 2, two Gigabit Ethernet and one HDMI 1.4. There is also wireless, Bluetooth, headset and speaker support. The six Thunderbolt 2 ports are split out from three internal Thunderbolt 2 buses, with the bottom bus also taking care of the HDMI port.

You can have multiple Thunderbolt monitors connected, as well as a 4K display via the HDMI spigot; however, you will want to separate these onto the different buses. For example, you wouldn’t be able to support two 27” Apple displays and a 4K HDMI-connected monitor all on one single Thunderbolt bus. However, you can support up to six non-4K displays if you distribute the load across all of the connections. Since the plug for Thunderbolt is the same as Mini DisplayPort, you can connect nearly any standard computer monitor to these ports if you have the proper plug. For example, I used my 20” Apple Cinema Display, which has a DVI plug, by simply adding a DVI-to-MDP adapter.

The change to Thunderbolt 2 enables faster throughput. The first version of Thunderbolt used two channels of 10Gb/s data and video, with the two channels going in opposite directions. Thunderbolt 2 combines these into two channels going in the same direction, for a total of 20Gb/s. You can daisy-chain Thunderbolt devices and it is possible to combine Thunderbolt 1 and Thunderbolt 2 devices in the same chain. First generation Thunderbolt devices (such as monitors) should be at the end of the chain, so as not to create a bottleneck.
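Some quick, back-of-the-envelope numbers show why display loads need to be spread across buses. Uncompressed display bandwidth is roughly width × height × refresh rate × bits per pixel (blanking overhead is ignored here, so real figures run a bit higher):

```python
def display_gbps(width, height, hz, bpp=24):
    # raw pixel bandwidth in gigabits per second
    return width * height * hz * bpp / 1e9

apple_27 = display_gbps(2560, 1440, 60)  # ~5.3 Gb/s per 27" display
uhd_60 = display_gbps(3840, 2160, 60)    # ~11.9 Gb/s for 4K at 60 Hz

total = 2 * apple_27 + uhd_60
print(f'two 27" displays + one 4K/60: {total:.1f} Gb/s')
# -> ~22.6 Gb/s, more than one 20 Gb/s Thunderbolt 2 bus can carry
```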

The USB 3.0 ports will support USB 1.0 and 2.0 devices, but of course, there is no increase in their speed. There is no legacy support for FireWire or eSATA, so if you want to connect older drives, you’ll need to invest in additional docks, adapters and/or expansion units. (Apple sells a $29 Thunderbolt-to-FireWire 800 adapter.) This might also include a USB hub. For example, I have more than four USB-connected devices on my current 2009 Mac Pro. The benefit of standardizing on Thunderbolt is that all of the Thunderbolt peripherals will work with any of Apple’s other computers, including MacBook Pros, Minis and iMacs.

The tougher dilemma comes if you need to accommodate current PCIe cards, such as a RED Rocket accelerator card, a Fibre Channel adapter or a mini-SAS/eSATA card. In that case, a Thunderbolt 2 expansion unit will be required. One such solution is the Sonnet Technologies Echo Express III-D expansion chassis.

Mac Pro as your main edit system

I work in many facilities with various vintages of Mac Pro towers. There’s a wide range of connectivity needs, including drives, shared storage and peripherals. Although it’s very sexy to think about just a 2013 Mac Pro sitting on your desk with nothing else, other than a Thunderbolt monitor, that’s not the real world of post. If you are evaluating one of these as your next investment, consider what you must add. First and foremost is storage. Flash storage and SSDs are great for performance, but you’re never going to put a lot of video media on a 1TB (or smaller) drive. Then you’ll need monitors and most likely adapters or expansion products for any legacy connection.

I priced out the same unit I’m reviewing and then factored in an Apple 27” display, the Sharp 32” UHD monitor, a Promise Pegasus2 R6 12TB RAID, plus a few other peripherals, like speakers, audio i/o, docks and adapters. This bumps the total to over $15K. Granted, I’ve pretty much got a full system that will last me for years. The point is that it’s important to look at all the ramifications when you compare the new Mac Pro to a loaded iMac or a MacBook Pro, or to simply upgrading a recently-purchased Mac Pro tower.

Real world performance

Most of the tests promoting the new Mac Pro have focused on 4K video editing. That’s coming and the system is certainly good for it, but that’s not what most people encounter today. Editors deal with a mix of media, formats, frame rates, frame sizes, etc. I ran a set of identical tests on the 2013 Mac Pro and on my own 2009 Mac Pro tower. That’s an eight-core (dual 4-core Xeons) 2.26GHz model with 28GB of RAM. The current video card is a single NVIDIA Quadro 4000 and my media is on an internal two-drive (7200RPM eSATA) RAID-0 array. Since I had no external drives connected to the 2013 Mac Pro, all media was playing from and writing to the internal flash storage. This means that performance would be about as good as you can get – possibly better than with externally-connected drives.

I tested Apple Final Cut Pro X, Motion, Compressor, Adobe Premiere Pro CC and After Effects CC. Media included RED EPIC 5K camera raw, ARRI ALEXA 1080p ProRes 4444, Blackmagic Cinema Camera 2.5K ProResHQ and more. Most of the sequences included built-in effects and some of the new Red Giant Universe filters.

To summarize the test results, performance – as measured in render or export times – was significantly better on the 2013 Mac Pro. Most of the tests showed a 2X to 3X bump in performance, even with the Adobe products. Naturally FCP X loves the GPU power of this machine. The “BruceX” test, developed as a benchmark by Alex Gollner for FCP X, consists of a 5K timeline with a series of generators. I exported this as a 5K ProRes 4444 file. The older tower accomplished this in 1:47, while the new Mac Pro smoked it in just :19. My After Effects timeline consisted of ProRes 4444 clips with a bunch of intensive Cycore filters. The old versus new renders were 23:26 and 12:53, respectively. I also ran tests with DaVinci Resolve 10, another application that loves more than one GPU. These were RED EPIC 5K files in a 1080p timeline. Debayer resolution was set to full (no RED Rocket card used). The export times ran at 4-12fps (depending on the clip) on the tower versus 15-40fps on the new Mac Pro.
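For perspective, converting those quoted render times into speedup factors – the seconds below are just the article’s times restated:

```python
# old time vs. new time, in seconds
tests = {
    "BruceX 5K export (FCP X)": (107, 19),  # 1:47 vs :19
    "After Effects render": (1406, 773),    # 23:26 vs 12:53
}
for name, (old_s, new_s) in tests.items():
    print(f"{name}: {old_s / new_s:.1f}x faster")
# BruceX ~5.6x (heavily GPU-bound); After Effects ~1.8x
```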

In general, all operations with applications were more responsive. This is, of course, true with any solid state storage. The computer boots faster and applications load and respond more quickly. Plus, more RAM, faster processors and other factors all help to optimize the 2013 Mac Pro for best performance. For example, the interaction between Adobe Premiere Pro CC and SpeedGrade CC using the Direct Link and Lumetri filters was noticeably better with the new machine. Certainly that’s true of Final Cut Pro X and Motion, which are ideally suited for it. I would add that using a single 20” monitor connected to the Mac Pro placed very little drag on one GPU, so the second could be totally devoted to processing power. Performance might vary if I had two 27” displays, plus a 4K monitor hooked to it.

I also tested Avid Media Composer. This software doesn’t particularly use a lot of GPU processing, so performance was about the same as with my 2009 Mac Pro. It also takes a trick to get it to work. The 2013 Mac Pro has no built-in audio device, which Media Composer needs to see in order to launch. If you have an audio device connected, such as an Mbox2 Mini or even just a headset with a microphone, then Media Composer detects a core audio device and will launch. I downloaded and installed the free Soundflower software. This acts as a virtual core audio device and can be set as the computer’s audio input in the System Preferences sound panel. Doing so enabled Media Composer to launch and operate normally.

Whether the new 2013 Mac Pro is the ideal tower replacement for you comes down to budget and many other variables. Rest assured that it’s the best machine Apple has to offer today. Analogies to powerful small packages (like the Mini Cooper or Bruce Lee) are quite apt. The build quality is superb and the performance is outstanding. If you are looking for a machine to service your needs for the next five years, then it’s the ideal choice.

(Note: This unit was tested prior to the release of 10.9.3, so I didn’t encounter any of the render issues that have been plaguing Adobe and DaVinci users.)

Originally written for Digital Video magazine/CreativePlanetNetwork.

©2014 Oliver Peters