Red Giant Universe


Red Giant Software, developers of such popular effects and editing tools as Trapcode and Magic Bullet, recently announced Red Giant Universe. Red Giant has adopted a hybrid free/subscription model. Once you sign up for a Red Giant account and log into Universe, you have access to all the free filters and transitions that are part of this package. Initially this includes 31 free plug-ins (22 effects, 9 transitions) and 19 premium plug-ins (12 effects, 7 transitions). Universe users have a 30-day trial period before the premium effects become watermarked. Premium membership pricing will be $10/month, $99/year or $399/lifetime. Lifetime members will receive routine updates at no further cost.

A new approach to a fresh and growing library of effects

The general mood among content creators has been against subscription models; however, when I polled thoughts about the Universe model on one of the Creative COW forums, the comments were very positive. From Red Giant’s early press on Universe, I had gotten the impression that Universe would be an environment in which users could create their own custom effects. In fact, this isn’t the case at all. The Universe concept is built on Supernova, an internal development tool that Red Giant’s designers use to create new effects and transitions. Supernova draws from a library of building-block filters that can be combined to create new plug-in effects. This is somewhat similar to Apple’s Quartz Composer development tool; however, it is not part of the package that members can access.

Red Giant plans to build a community around the Universe members, who will have some input into the types of new plug-ins created. These plug-ins will only be generated by Red Giant designers and partner developers. Currently they are working with Crumplepop, with whom they created Retrograde – one of the premium plug-ins. The point of being a paid premium member is to continue receiving routine updates that add to the repertoire of Universe effects that you own. In addition, some of the existing Red Giant products will be ported to Universe in the future as new premium effects.

This model is similar to what GenArts did with Sapphire Edge, which was based on an upfront purchase, plus a subscription for updated effects “collections” (essentially new preset versions of an Edge plug-in). These were created by approved designers and added to the library each month. (Note: Sapphire Edge – or at least the FX Central subscription – appears to have been discontinued this year.) Unlike the Sapphire Edge “collections”, the Universe updates are not limited to presets, but will include brand-new plug-ins. Red Giant tells me they already have several dozen in the development pipeline.

Red Giant Universe supports both Mac and Windows and runs in recent versions of Adobe After Effects, Premiere Pro, Apple Final Cut Pro X and Motion. At least for now, Universe doesn’t support Avid, Sony Vegas, DaVinci Resolve, EDIUS or Nuke hosts. Members will be able to install the software on two computers, and a single installation of Universe will install these effects into all applicable hosts, so only one purchase is needed.

Free and premium effects with GPU acceleration

In this initial release, the range of effects includes many standards as free effects, including blurs, glows, distortion effects, generators and transitions. The premium effects include some that have been ported over from other Red Giant products, including Knoll Light Factory EZ, Holomatrix, Retrograde, ToonIt and others. In case you are concerned about duplication if you’ve already purchased some of these effects, Red Giant answers this in their FAQ: “We’ve retooled the tools. Premium tools are faster, sleeker versions of the Red Giant products that you already know and love. ToonIt is 10x faster. Knoll Light Factory is 5x faster. We’ve streamlined [them] with fewer controls so you can work faster. All of the tools work seamlessly with [all of the] host apps, unlike some tools in the Effects Suite.”

The big selling point is that these are high-quality, GPU-accelerated effects, which use 32-bit float processing for trillions of colors. Red Giant is using OpenGL rather than OpenCL or NVIDIA’s CUDA technology, because it is easier to provide support across various graphics cards and operating systems. The recommendation is to have one of the newer, faster NVIDIA or AMD cards or mobile GPUs. The minimum GPU is an Intel HD 3000 integrated graphics chip. According to Red Giant, “Everything is rendered on the GPU, which makes Universe up to 10 times faster than CPU-based graphics. Many tools use advanced render technology that’s typically used in game development and simulation.”

In actual use

After Universe is installed, updates are managed through the Red Giant Link utility. This now keeps track of all the Red Giant products you have installed (along with Universe) and lets you update as needed. The effects themselves are nice and the quality is high, but so far these are largely standard effects. There’s nothing major yet that isn’t already represented by a similar effect among the built-in filters and transitions that come with FCP X, Motion or After Effects. Obviously, there are subjective differences between one company’s “bad TV” or “cartoon” look and another’s, so whether or not you need any additional plug-ins becomes a personal decision.

As far as GPU-acceleration is concerned, I do find the effects to be responsive when I adjust them and preview the video. This is especially true in a host like Final Cut Pro X, which is really tuned for the GPU. For example, adding and adjusting a Knoll lens flare from the Universe package performs better on my 2009 Mac Pro (8-core with an NVIDIA Quadro 4000), than do the other third-party flare filters I have available on this unit.

The field is pretty crowded when you stack up Universe against such established competitors as GenArts Sapphire, Boris Continuum Complete, Noise Industries FxFactory Pro and others. As yet, Universe does not offer any tools that fill in workflow gaps, like tracking, masking or even keyers. I’m not sure the monthly subscription makes sense for many customers. It would seem that free will be attractive to many, while an annual or lifetime subscription will be the way most users purchase Universe. The lifetime price compares well with what the others charge for a filter package.

Red Giant Universe is an ideal package of effects for editors. While Apple has developed a system with Motion where any user can create new FCP X effects based on templates, the reality is that few working editors have the time or interest to do that. They want effects that can be quickly applied with a minimal amount of tweaking and that perform well on a timeline. That’s what impresses clients and wins editors over to a product. With that target in mind, Red Giant will definitely do well with Universe if it holds to its promise. Ultimately, the success of Universe will hinge on how prolific the developers are and how quickly new effects come through the subscription pipeline.

Originally written for Digital Video magazine/Creative Planet Network

©2014 Oliver Peters

Adobe Anywhere


Adobe Anywhere for video is Adobe’s first foray into collaborative editing. Anywhere functions a lot like other shared storage environments, except that editors and producers are not bound to working within the facility and its hard-wired network. The key difference between Adobe Anywhere and other NLE/SAN combinations is that all media is stored at the central location and the system’s servers handle the actual editing and compositing functions of the editing software. This means that no media is stored on the editor’s local computer and lightweight client stations can be used, since the required horsepower exists at the central location. Anywhere works within a facility using the existing LAN or externally over the internet when client systems connect remotely over VPN. Currently Adobe Anywhere is integrated directly into Adobe Premiere Pro CC and Prelude CC (Windows and OS X). Early access to After Effects integration is part of Adobe Anywhere 1.6, with improved integration available in the next release.

The Adobe Anywhere cluster

Adobe Anywhere software is installed on a set of Windows servers, which are general-purpose machines that you would buy from a vendor like Dell or HP. The software creates two types of nodes: a single Adobe Anywhere Collaboration Hub node and three or more Adobe Mercury Streaming Engine nodes. Each node is installed on a separate server, so a minimum configuration requires four computers. This is separate from the shared storage. If you use a SAN, such as a Facilis Technology or an EditShare system, the SAN will be mounted at the OS level by the computing cluster of Anywhere servers. Local and remote editors can upload source media to the SAN for shared access via Anywhere.

The Collaboration Hub computer stores all of the Anywhere project metadata, manages user access and coordinates the other nodes in the system. The Mercury Streaming Engine computers provide real-time, dynamic viewing streams of Premiere Pro and Prelude sequences with GPU-accelerated effects. Media stays in its native file format on the storage servers. There are no proxy files created by the system. In order to handle real-time effects, each of the Streaming Engine servers must be equipped with a high-end NVIDIA graphics card.

As a rule of thumb, this minimum cluster size supports 10-15 active users, according to Adobe. However, the actual number depends on media type, resolution and the number of simultaneous source clips needed per editor, as well as automated activities like import and export. Adobe prices the Anywhere software based on the number of named users, on a subscription model of $1,000/year/user. That’s in addition to installed seats of Creative Cloud and the cost of the hardware to make the system work, which is supplied by other vendors and not by Adobe. Since Adobe does not sell this as a turnkey installation, certain approved vendors, like TekServe and Keycode Media, have been qualified as Adobe Anywhere system integrators.

How it works

While connected to Adobe Anywhere and working with an Anywhere project, the Premiere Pro or Prelude application on the local computer really just functions as a software front end, driving the application running back at the server. The results of the edit decisions are streamed back to the local machine in real-time as a single stream of video. The live stream of media from the Mercury Streaming Engine is handled in a similar fashion to the playback resolution throttle that’s already part of Premiere Pro. As native media is played, the computer adjusts the stream’s playback compression based on bandwidth. Whenever playback is paused, the parked frame is updated to full resolution, thus enabling an editor to tweak an effect or composite and always see the full-resolution image while making the adjustments.

To understand this better, let’s use the example of a quad split. If this were done locally, the drives would be playing back four streams of video and the software and GPU of that local computer would composite the quad split and present a single stream of video to the viewer display. In the case of Adobe Anywhere, the playback of these four streams and the compositing of the quad split would take place on the Mercury Streaming Engine computer. In turn, it would stream this live composite as a single feed of video back to the remotely connected computer. Since all the “heavy lifting” is done at “home base” the system requirements for the client machine can be less beefy. In theory, you could be working with a MacBook Air, while editing RED Epic 5K footage.

Productions

Another difference with Adobe Anywhere is that instead of having Premiere Pro or Prelude project files, users create shared productions, designed for multi-user and multi-application access. This way, a collaborating team is set up like a workgroup with assigned permission levels. Media is common and centralized to avoid duplication. Any media that is added on-site is uploaded to the production in its native resolution and becomes part of the production’s shared assets. The Collaboration Hub computer manages the database for all productions.

When a user remotely logs into an Adobe Anywhere Production, media to which they have been granted access is available for browsing using Premiere Pro’s standard Media Browser panel. When an editor starts working, Anywhere automatically makes a virtual “clone” of their production items and opens them in a private session. Because multiple people can be working in the same production at the same time, Adobe Anywhere provides protection against conflicts or overwrites. In order to share your private changes, you must first get any updates from the shared production. This pulls all shared changes into your private view. If another person has changed the same asset you are working on, you are provided with information about the conflict and given the opportunity to keep the other person’s changes, your changes or both. Once you make your choices, you can then transfer your changes back to the shared production. Anywhere also maintains a version history, so if unwanted changes are made, you can revert to an earlier or alternate version.
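The pull-resolve-push cycle described above can be sketched as a toy model. To be clear, everything here is illustrative: Adobe does not publish Anywhere’s internals, and none of these class or method names come from any Adobe API.

```python
# Toy model of Anywhere's clone / pull / resolve / push workflow.
# Purely illustrative -- not Adobe code.

class Production:
    """Shared production: versioned assets plus a history log."""
    def __init__(self):
        self.assets = {}    # asset name -> (version, data)
        self.history = []   # every committed change, for reverting

    def commit(self, name, data):
        version = self.assets.get(name, (0, None))[0] + 1
        self.assets[name] = (version, data)
        self.history.append((name, version, data))

class PrivateSession:
    """An editor's private 'clone' of the production items."""
    def __init__(self, production):
        self.production = production
        self.base = dict(production.assets)  # snapshot at clone time
        self.changes = {}                    # local, unshared edits

    def edit(self, name, data):
        self.changes[name] = data

    def share(self, resolve=lambda name, theirs, mine: mine):
        """Pull shared changes, resolve any conflicts, then push."""
        for name, data in self.changes.items():
            current = self.production.assets.get(name)
            if current and current != self.base.get(name):
                # Someone else changed this asset since we cloned it:
                # keep theirs, ours, or both, per the resolver.
                data = resolve(name, current[1], data)
            self.production.commit(name, data)
        self.changes.clear()
        self.base = dict(self.production.assets)
```

In the real system, the “resolver” is the dialog that lets you keep the other person’s changes, your changes or both, and the history is what allows reverting to an earlier version.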

Adobe Anywhere in the wild

Although large installations like CNN are great for publicity headlines, Adobe Anywhere is proving to be useful at smaller facilities, too. G-Men Media is a production company based in Venice, California. They are focused primarily on feature film and commercial broadcast work. According to G-Men COO, Jeff Way, “G-Men was originally founded with the goal of utilizing the latest digital technologies available to reduce costs, accelerate workflow and minimize turnaround time for our clients. Adobe Anywhere allowed us to provide our clients a more efficient workflow on post productions without having to grow infrastructure on a per project basis.”

“A significant factor of Adobe Anywhere, which increased the growth of our client base, was the system’s ability to organize production teams based on talent instead of location. If we can minimize or eliminate time required for coordinating actual production work (i.e. shipping hard drives, scheduling meetings with editors, awaiting review/approval), we can save clients money that they can then invest into more creative aspects of the project – or simply undercut their budget. Furthermore, we have the ability to scale up or down without added expenses in infrastructure. All that’s required on our end is simply granting the Creative Cloud seat access to the system assets for their production.”

The G-Men installation was handled by Keycode Media, based on the recommended Adobe configuration described at the beginning of this article. This includes four SuperMicro 1U rack-mounted SuperServers. Three of these operate as Adobe Anywhere Mercury Streaming Engines and the fourth acts as the Adobe Anywhere Collaboration Hub. Each of the Mercury Streaming Engines has its own NVIDIA Tesla K10 GPU card. The servers are connected to a Facilis Terrablock shared storage array via a 10 Gigabit Ethernet switch. Their Internet feed is a fiber optic connection, typically operating at 500Mbps (down) / 150Mbps (up). G-Men has used the system on every project since it went live in August of 2013. Noteworthy was its use for post on Savageland – the first feature film to run through an Adobe Anywhere system.

Way continued, “Savageland ended up being a unique situation and the ultimate test of the system’s capabilities. Savageland was filmed over three years with various forms of media from iPhone and GoPro footage to R3D raw and Canon 5D. It was really a matter of what the directors/producers could get their hands on from day-to-day. After ingesting the assets into our system, we were able to see a fluid transition straight into editing without having to transcode media assets. One of the selling factors of gaining Savageland as a client was the flexibility and feasibility of allowing all of the directors and editors (who lived large distances from each other in Los Angeles) to work at their convenience. The workflow for them changed from setting aside their weekends and nights for review meetings at a single location to a readily available review via their MacBooks and iPads.”

“For most of our clients, the system has allowed them to bring on the editorial talent they want without having to worry about the location of the editor. At the same time, the editors enjoyed the flexibility of working from wherever they wanted – many times out of their own homes. The benefit for editors and directors is the capability to remotely collaborate and provide feedback immediately. We’ve had a few productions where there are more than one editor working on the same assets – both creating different versions of the same edit. At the same time we had a director viewing the changes immediately after they were shared, with notes on each version. Then they had the ability to immediately make a decision on one or the other or provide creative feedback, so the editors could immediately apply the changes in real time.”

G-Men is in production on Divine Access, a feature film being shot in Austin, Texas. Way explained, “We’re currently in Austin beginning principal photography. Knowing the cloud-based editing workflows available to us, we wanted to expand the benefits we are gaining in post to the entirety of a feature film production from first location scout to principal photography and all the way through to delivery. We’re using our infrastructure to ingest and begin edits as we shoot, which is really new and exciting to all of the producers working on the film. With the upload speeds we have available to us, we are able to provide review/approvals to our director the same day.”

Originally written for Digital Video magazine/CreativePlanetNetwork.

©2014 Oliver Peters

Apple’s New Mac Pro

The run of the brushed aluminum tower design that highlighted Apple’s PowerMac G5 and Intel Mac Pros ended with the introduction of a radical replacement in late 2013. No matter what the nickname – “the cylinder”, “the tube” or whatever – Apple’s new 2013 Mac Pro is a tour de force of industrial design. Few products have had such pent-up demand. The long lead times for custom machines originally ran to months, but with accelerated production they have now been reduced to 24 hours. Better still, if you are happy with a stock configuration, it’s possible to walk out with a new unit on the same day at some Apple Store or reseller retail locations.

Design

The 2013 Mac Pro features a cylindrical design. It’s about ten inches tall, six-and-a-half inches in diameter and, thanks to a very dense component construction, weighs about eleven pounds. The outer shell – it’s actually a sleeve that can be unlocked and lifted off – uses a dark (not black) reflective coating. Internally, the circuits are mounted onto a triangle-shaped core. There’s a central vent system that draws air in through the bottom and out through the top, much like a chimney. You can still mount the Mac Pro sideways without issue, as long as the vents are not blocked. This design keeps the unit quiet and cool most of the time. During my tests, the fan noise was quieter than my tower (generally a pretty quiet unit) and the fans never kicked into high.

Despite the small size, all components are workstation class and not mobile or desktop products, as used in the Apple laptops or iMacs. It employs the fastest memory and storage of any Mac and is designed to pick up where the top-of-the-line iMac leaves off. The processors are Intel Xeon instead of Core i5 or Core i7 CPUs and graphics cards are AMD FirePro GPUs. This Xeon model is a multicore, single CPU chip. Four processor options are offered (4, 6, 8 and 12-core), ranging in speed from 3.7GHz (4-core) to 2.7GHz (12-core). RAM can be maxed out to a full 64GB. It is the only component of the Mac Pro where a user-installed, third-party upgrade is an easy option.

The Mac Pro is optimized for dual graphics processors with three GPU choices: D300 (2GB VRAM each), D500 (3GB VRAM each) or D700 (6GB VRAM each) GPUs. Internal storage is PCIe-based flash memory in 256GB, 512GB or 1TB configurations. These are not solid state drives (SSDs), but rather flash storage like that used in the iPads. Storage is connected directly to the PCIe bus of the Mac Pro for the fastest possible data i/o. The stock models start at $2,999 (4-core) and $3,999 (6-core).

Apple shipped me a reviewer’s unit, configured in a way that they feel is the “sweet spot” for high-end video. My Mac Pro was the 8-core model, with 32GB of RAM, dual D700 GPUs and 512GB of storage. This configuration with a keyboard, mouse and AppleCare extended warranty would retail at $7,166.

Connectivity

All connectors are on the back – four USB 3.0, six Thunderbolt 2, two Gigabit Ethernet and one HDMI 1.4. There is also wireless, Bluetooth, headset and speaker support. The six Thunderbolt 2 ports are split out from three internal Thunderbolt 2 buses, with the bottom bus also taking care of the HDMI port.

You can have multiple Thunderbolt monitors connected, as well as a 4K display via the HDMI spigot; however, you will want to separate these onto different buses. For example, you wouldn’t be able to support two 27” Apple displays and a 4K HDMI-connected monitor all on one Thunderbolt bus. However, you can support up to six non-4K displays if you distribute the load across all of the connections. Since the Thunderbolt plug is the same as Mini DisplayPort, you can connect nearly any standard computer monitor to these ports if you have the proper plug. For example, I used my 20” Apple Cinema Display, which has a DVI plug, by simply adding a DVI-to-MDP adapter.

The change to Thunderbolt 2 enables faster throughput. The first version of Thunderbolt used two 10Gb/s channels for data and video, one running in each direction. Thunderbolt 2 combines these into two channels running in the same direction, for a total of 20Gb/s. You can daisy-chain Thunderbolt devices, and it is possible to combine Thunderbolt 1 and Thunderbolt 2 devices in the same chain. First-generation Thunderbolt devices (such as monitors) should go at the end of the chain, so as not to create a bottleneck.

The USB 3.0 ports will support USB 1.0 and 2.0 devices, but of course, there is no increase in their speed. There is no legacy support for FireWire or eSATA, so if you want to connect older drives, you’ll need to invest in additional docks, adapters and/or expansion units. (Apple sells a $29 Thunderbolt-to-FireWire 800 adapter.) This might also include a USB hub. For example, I have more than four USB-connected devices on my current 2009 Mac Pro. The benefit of standardizing on Thunderbolt is that all of the Thunderbolt peripherals will work with any of Apple’s other computers, including MacBook Pros, Minis and iMacs.

The tougher dilemma is if you need to accommodate current PCIe cards, such as a RED Rocket accelerator card, a FibreChannel adapter or a mini-SAS/eSATA card. In that case, a Thunderbolt 2 expansion unit will be required. One such solution is the Sonnet Technologies Echo Express III-D expansion chassis.

Mac Pro as your main edit system

I work in many facilities with various vintages of Mac Pro towers. There’s a wide range of connectivity needs, including drives, shared storage and peripherals. Although it’s very sexy to think about just a 2013 Mac Pro sitting on your desk with nothing other than a Thunderbolt monitor, that’s not the real world of post. If you are evaluating one of these as your next investment, consider what you must add. First and foremost is storage. Flash storage and SSDs are great for performance, but you’re never going to put a lot of video media on a 1TB (or smaller) drive. Then you’ll need monitors and, most likely, adapters or expansion products for any legacy connections.

I priced out the same unit I’m reviewing and then factored in an Apple 27” display, the Sharp 32” UHD monitor, a Promise Pegasus2 R6 12TB RAID, plus a few other peripherals, like speakers, audio i/o, docks and adapters. This bumps the total to over $15K. Granted, I’ve pretty much got a full system that will last me for years. The point is that it’s important to look at all the ramifications when you compare the new Mac Pro with a loaded iMac, a MacBook Pro or simply upgrading a recently-purchased Mac Pro tower.

Real world performance

Most of the tests promoting the new Mac Pro have focused on 4K video editing. That’s coming and the system is certainly good for it, but that’s not what most people encounter today. Editors deal with a mix of media, formats, frame rates, frame sizes, etc. I ran a set of identical tests on the 2013 Mac Pro and on my own 2009 Mac Pro tower. That’s an eight-core (dual 4-core Xeons) 2.26GHz model with 28GB of RAM. The current video card is a single NVIDIA Quadro 4000 and my media is on an internal two-drive (7200RPM eSATA) RAID-0 array. Since I had no external drives connected to the 2013 Mac Pro, all media was playing from and writing to the internal flash storage. This means performance was about as good as you can get, and likely better than it would be with externally-connected drives.

I tested Apple Final Cut Pro X, Motion, Compressor, Adobe Premiere Pro CC and After Effects CC. Media included RED EPIC 5K camera raw, ARRI ALEXA 1080p ProRes 4444, Blackmagic Cinema Camera 2.5K ProResHQ and more. Most of the sequences included built-in effects and some of the new Red Giant Universe filters.

To summarize the test results, performance – as measured in render or export times – was significantly better on the 2013 Mac Pro. Most of the tests showed a 2X to 3X bump in performance, even with the Adobe products. Naturally FCP X loves the GPU power of this machine. The “BruceX” test, developed as a benchmark by Alex Gollner for FCP X, consists of a 5K timeline with a series of generators. I exported this as a 5K ProRes 4444 file. The older tower accomplished this in 1:47, while the new Mac Pro smoked it in just :19. My After Effects timeline consisted of ProRes 4444 clips with a bunch of intensive Cycore filters. The old versus new renders were 23:26 and 12:53, respectively. I also ran tests with DaVinci Resolve 10, another application that loves more than one GPU. These were RED EPIC 5K files in a 1080p timeline. Debayer resolution was set to full (no RED Rocket card used). The export times ran at 4-12fps (depending on the clip) on the tower versus 15-40fps on the new Mac Pro.
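For readers who want the relative numbers, the render times quoted above convert to speedup factors with a few lines of arithmetic (the parsing helper is mine, not from any benchmark tool):

```python
# Convert the quoted old/new render times into speedup factors.

def to_seconds(t):
    """Parse 'mm:ss' or ':ss' into a number of seconds."""
    minutes, _, seconds = t.rpartition(":")
    return int(minutes or 0) * 60 + int(seconds)

tests = {
    "BruceX (FCP X)": ("1:47", ":19"),
    "After Effects CC": ("23:26", "12:53"),
}
for name, (old, new) in tests.items():
    speedup = to_seconds(old) / to_seconds(new)
    print(f"{name}: {speedup:.1f}x faster")
```

That works out to roughly 5.6x for the BruceX export and about 1.8x for the After Effects render.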

In general, all operations with applications were more responsive. This is, of course, true with any solid state storage. The computer boots faster and applications load and respond more quickly. Plus, more RAM, faster processors and other factors all help to optimize the 2013 Mac Pro for best performance. For example, the interaction between Adobe Premiere Pro CC and SpeedGrade CC using the Direct Link and Lumetri filters was noticeably better with the new machine. Certainly that’s true of Final Cut Pro X and Motion, which are ideally suited for it. I would add that using a single 20” monitor connected to the Mac Pro placed very little drag on one GPU, so the second could be totally devoted to processing power. Performance might vary if I had two 27” displays, plus a 4K monitor hooked to it.

I also tested Avid Media Composer. This software doesn’t make much use of GPU processing, so performance was about the same as with my 2009 Mac Pro. It also takes a trick to get it to work. The 2013 Mac Pro has no built-in audio device, which Media Composer needs to see in order to launch. If you have an audio device connected, such as an Mbox2 Mini or even just a headset with a microphone, then Media Composer detects a Core Audio device and will launch. Instead, I downloaded and installed the free Soundflower software. This acts as a virtual Core Audio device and can be set as the computer’s audio input in the System Preferences sound panel. Doing so enabled Media Composer to launch and operate normally.

Whether the new 2013 Mac Pro is the ideal tower replacement for you comes down to budget and many other variables. Rest assured that it’s the best machine Apple has to offer today. Analogies to powerful small packages (like the Mini Cooper or Bruce Lee) are quite apt. The build quality is superb and the performance is outstanding. If you are looking for a machine to service your needs for the next five years, then it’s the ideal choice.

(Note: This unit was tested prior to the release of 10.9.3, so I didn’t encounter any of the render issues that have been plaguing Adobe and DaVinci users.)

Originally written for Digital Video magazine/CreativePlanetNetwork.

©2014 Oliver Peters

Amira Color Tool and your NLE

I was recently alerted to the new Amira Color Tool by Michael Phillips’ 24p blog. This is a lightweight ARRI software application designed to create custom in-camera looks for the Amira camera. You do this by creating custom color look-up tables (LUTs). The Amira Color Tool is available as a free download from the ARRI website (free registration required). Although the application is designed for the camera, you can also export looks in a variety of LUT file formats, which, in turn, may be installed and applied to footage in a number of different editing and color correction applications. I tested this in both Apple Final Cut Pro X and Avid Media Composer | Software (v8) with good results.

The Amira Color Tool is designed to correct log-C encoded footage to Rec709, either as a straight conversion or with a custom look. ARRI offers some very good instructions, white papers, sample looks and tutorials that cover the operation of this software. The signal flow runs from the log-C image, to the Rec709 correction, and then to the CDL-based color correction. To my eye, the math appears to be floating point, because a Rec709 conversion that throws a shot into clipping can be pulled back out of clipping in the look tab using the CDL color correction tools. Therefore it is possible to use this tool on footage other than ARRI Amira or Alexa log-C, as long as it is sufficiently flat.

The CDL correction tools are based on slope, offset and power. In that model, slope is equivalent to gain, offset to lift and power to gamma. In addition to color wheels, there’s a second video look parameters tab with hue intensities for the six main vectors (red, yellow, green, cyan, blue and magenta). The Amira Color Tool is Mac-only and opened both QuickTime and DPX files from the clips I tested. It worked successfully with clips shot on an Alexa (log-C), Blackmagic Cinema Camera (BMD Film profile), Sony F-3 (S-log) and Canon 1DC (4K Canon-log). Remember that the software is designed to correct flat, log-C images, so you probably don’t want to use it on images that were already encoded with vibrant Rec709 colors.
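For reference, the ASC CDL standard that these three controls come from defines a simple per-channel transform: output = (input × slope + offset) ^ power. Here is a minimal sketch of that math; the clamp before the power function is my own simplification, since ARRI doesn’t publish the Amira Color Tool’s exact internals:

```python
# Per-channel ASC CDL transform: out = (in * slope + offset) ** power,
# with values in a normalized 0.0-1.0 range. Illustrative sketch only.

def apply_cdl(rgb, slope, offset, power):
    """Apply slope/offset/power (gain/lift/gamma) to one RGB triple."""
    out = []
    for value, s, o, p in zip(rgb, slope, offset, power):
        v = value * s + o   # slope scales the signal, offset shifts it
        v = max(v, 0.0)     # clamp negatives before the power function
        out.append(v ** p)  # power bends the curve (gamma)
    return tuple(out)

# Neutral settings leave a middle-gray pixel untouched:
identity = apply_cdl((0.18, 0.18, 0.18),
                     slope=(1, 1, 1), offset=(0, 0, 0), power=(1, 1, 1))
```

Because slope and offset are applied before power, a value pushed past 1.0 by the Rec709 conversion can still be scaled back into range, which matches the floating-point behavior described above.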

FCP X

To use the Amira Color Tool, import your clip from the application’s file browser, set the look and export a 3D LUT in the appropriate format. I used the DaVinci Resolve setting, which creates a 3D LUT in a .cube format file. To get this into FCP X, you need to buy and install a LUT filter, like Color Grading Central’s LUT Utility. To install a new LUT there, open the LUT Utility pane in System Preferences, click the “+” symbol and navigate to where the file was saved. In FCP X, apply the LUT Utility to the clip as a filter. From the filter’s pulldown selection in the inspector, choose the new LUT that you’ve created and installed. One caveat is to be careful with ARRI files. Any files recorded with newer ARRI firmware are flagged for log-C and FCP X automatically corrects these to Rec709. Since you don’t want to double up on LUTs, make sure “log processing” is unchecked for those clips in the info tab of the inspector pane.
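As an aside, the .cube file the Resolve setting produces is plain text, so it’s easy to inspect what you’re installing: a LUT_3D_SIZE header followed by size³ rows of output RGB values, with the red index varying fastest. This sketch (written from the common IRIDAS/Resolve .cube conventions, not from ARRI documentation) generates an identity LUT in that format:

```python
# Write an identity 3D LUT in Resolve-style .cube text format.
# Illustrative sketch of the file layout, not vendor code.

def write_identity_cube(path, size=17):
    with open(path, "w") as f:
        f.write('TITLE "identity"\n')
        f.write(f"LUT_3D_SIZE {size}\n")
        for b in range(size):            # blue index varies slowest
            for g in range(size):
                for r in range(size):    # red index varies fastest
                    f.write(f"{r/(size-1):.6f} "
                            f"{g/(size-1):.6f} "
                            f"{b/(size-1):.6f}\n")

write_identity_cube("identity.cube", size=2)
```

An identity LUT like this makes a handy sanity check: applied through LUT Utility, it should leave the image visually unchanged.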

Media Composer

To use the custom LUTs in Media Composer, select “source settings” for the clip. Go to the color management tab and install the LUT. It will now be available in the pulldown menu for color conversions. This color management change can be applied to a single clip or to a batch of clips within a bin.

In both cases, the source clips in FCP X or Media Composer will play in real time with the custom look already applied.

©2014 Oliver Peters

Final Cut “Studio 2014”

A few years ago I wrote some posts about Final Cut Pro as a platform and about designing an FCP-centric facility. Those options have largely been replaced by an Adobe approach built around Creative Cloud. Not everyone has warmed up to Creative Cloud, though. Some don’t like the software, some dislike the rental model, and some simply don’t need much of the power offered by the various Adobe applications.

If you are looking for alternatives to a Creative Cloud-based production toolkit, then it’s easy to build your own combination with some very inexpensive solutions. Most of these are either Apple software or others sold through the Mac App Store. As with all App Store purchases, you buy the product once and get updates for free for as long as the same product continues to be sold. Individual users may install the apps onto as many Mac computers as they personally own and control, all for the one purchase price. With this in mind, it’s very easy for most editors to create a powerful bundle that’s equal to or better than the old Final Cut Studio bundle – at less than its full retail price back in the day.

The one caveat to all of this is how entrenched you may or may not be with Adobe products. If you need to open and alter complex Illustrator, Photoshop, After Effects or Premiere Pro project files, then you will absolutely need Adobe software to do it. In that case, maybe you can get by with an old version (CS6 or earlier), or maybe trial software will work. Lastly, you could outsource to a colleague with Adobe software or simply pick up a Creative Cloud subscription on a month-by-month rental. On the other hand, if you don’t absolutely need to interact with Adobe project files, then these solutions may be all you need. I’m not trying to advocate for one over the other, but rather to add some ideas to think about.

Final Cut Pro X / Motion / Compressor

The last Final Cut Studio bundle included FCP 7, Motion, Compressor, Cinema Tools, DVD Studio Pro, Soundtrack Pro and Color. The current Apple video tools of Final Cut Pro X, Motion and Compressor cover all of the video bases, including editing, compositing, encoding, transcoding and disc burning. The latter may be a bone of contention for many, since Apple has largely walked away from the optical disc world. Nevertheless, simple one-off DVDs and Blu-ray discs can still be created straight from FCP X or Compressor. Of course, FCP X has been a mixed bag for editors, with evangelists and haters on both sides. If you square off Premiere Pro against Final Cut Pro X, then it really boils down to tracks versus trackless. Both tools get the job done. Which one do you prefer?

Motion versus After Effects is a tougher call. If you are a power user of After Effects, then Motion may seem foreign and hard to use. If the focus is primarily on motion graphics, then you can certainly get the results you want in either. There is no direct “send to” from FCP X to Motion, but on the plus side, you can create effects and graphics templates in Motion that will appear and function within FCP X. Just as with After Effects, you can also buy stock Motion templates for graphics, show opens and other types of design themes and animations.

Logic Pro X

Logic Pro X is the DAW in our package. It becomes the replacement for Soundtrack Pro and the alternative to Adobe Audition or Avid Pro Tools. It’s a powerful music creation tool, but more importantly for editors, it’s a strong single-file and multitrack audio production and post production application. You can get FCP X projects to it via FCPXML or AAF (converted using X2Pro). There are a ton of plug-ins and mixing features that make Logic a solid DAW. I won’t dive deeply into this, but suffice it to say that if your main interest in using Logic is to produce a better mix, then you can learn the essentials quickly and get up and running in short order.

DaVinci Resolve

Every decent studio bundle needs a powerful color correction tool. Apple Color is gone, but Blackmagic Design’s DaVinci Resolve is a best-of-breed replacement. You can get the free Resolve Lite version through the App Store, as well as from Blackmagic’s website. It does nearly everything you need, so for most editors who do some color correction, there’s little reason to buy the paid version.

Resolve 11 (due out soon) adds improved editing. There is a solid synergy with FCP X, making it not only a good companion color corrector, but also a finishing editorial tool. OFX plug-ins are supported, which adds a choice of industry standard creative effects if you need more than FCP X or Motion offer.

Pixelmator / Aperture

This one’s tough. Of all the Adobe applications, Photoshop and Illustrator are the hardest to replace. There are no perfect alternatives. On the other hand, most editors don’t need all that power. If direct feature compatibility isn’t a requirement, then you’ve got some choices. One of these is Pixelmator, a very lightweight image manipulation tool. It’s a little like Photoshop back in the version 4-7 days, with a bit of Illustrator tossed in. There are vector drawing and design tools, and it’s optimized for Core Image, complete with a nice set of image filters. However, it does not include some of Photoshop CC’s power user features, like smart objects, smart filters, 3D, layer groups and video manipulation. But if you just need to doctor some images, extract or modify logos or translate various image formats, Pixelmator might be the perfect fit. For more sophistication, another choice is Corel’s Painter (not in the App Store), as well as Adobe Photoshop Elements (which is in the App Store).

Although Final Cut Studio never included a photo application, the Creative Cloud does include Lightroom. Since the beginning, Apple’s Aperture and Adobe’s Lightroom have been leapfrogging each other with features. Aperture hasn’t changed much in a few years and is likely the next pro app to get the “X” treatment from Apple’s engineers. Photographers have the same type of “Chevy vs. Ford” arguments about Aperture and Lightroom as editors do about NLEs. Nevertheless, editors deal a lot with supplied images, and Aperture is a great tool for organization, clean-up and image manipulation.

Other

The list I’ve outlined creates a nice set of tools, but if you need to interchange with other pros using a variety of different software, then you’ll need to invest in some “glue”. There are a number of utilities designed to go to and from FCP X. Many are available through the App Store. Examples include Xto7, 7toX, EDL-X, X2Pro, Shot Notes X, Lumberjack and many others.

For a freewheeling discussion about this topic and other matters, check out my conversation with Chris Fenwick at FCPX Grille.

©2014 Oliver Peters

Comparing Color, Resolve, SpeedGrade and Symphony

It’s time to talk about color correctors. In this post, I’ll compare Color, Resolve, SpeedGrade and Symphony – the popular desktop color correction systems in use today. Certainly there are other options, like Filmlight’s Baselight Editions plug-in, as well as other NLEs with their own powerful color correction tools, including Autodesk Smoke and Quantel Rio. Some of those fall outside the budget range of small shops or don’t really provide a grading workflow. For the sake of simplicity, in this post I’ll stick with the four I see the most.

Avid Technology Media Composer + Symphony

Although it started as a separate NLE product with dedicated hardware, today’s Symphony is really an add-on option for Media Composer. The main feature that differentiates Symphony from Media Composer in file-based workflows is an enhanced color correction toolset. Symphony used to be the “gold standard” for color correction within an NLE, combining controls “borrowed” from many other applications and systems, like Photoshop, hardware proc amps and the hardware versions of the DaVinci correctors. It was the first to use the color wheel control model for balance/hue offsets. A subset of the Symphony tools has been migrated into Media Composer. Basic correction features in Symphony include channel mixing, hue offsets (color balance), levels, curves and more.

Many perceive Symphony correction as a single level or layer of correction, but that’s not exactly true. Color correction occurs on two levels – segment and program track. Most of your correction is on individual clips and Symphony offers a relational grading system. This means you can apply grades based on single clips or all instances of a master clip, tape ID, camera, etc. All clips used from a common source can be automatically graded once the first instance of that clip is graded on the timeline. The program track grade allows the colorist to apply an additional layer of grading to a clip, a section of the timeline or the entire timeline. So, when the client asks for everything to be darker, a global adjustment can be made using the program track.

Symphony also offers secondary grading based on isolating colors via an HSL key and adjusting that range. Although Symphony doesn’t offer nodes or correction layers like other software, you can use Avid’s video track timeline hierarchy to add additional correction to blank tracks above those tracks containing the video clips. In this way you are using the tracks as de facto adjustment layers. The biggest weakness is the lack of built-in masking tools to create what is commonly referred to as “power windows” (a term originated by DaVinci). The workaround is to use Avid’s built-in Intraframe/Animatte effects tools to create masks. Then you can apply additional spot correction within the mask area. It takes a bit more work than other tools, but it’s definitely possible. Finally, many plug-in packages, like GenArts Sapphire, Boris Continuum Complete and Magic Bullet Looks include vignette filters that will work with Symphony.

The bottom line is that Symphony started it all, though by today’s standards it is “long in the tooth”. Nevertheless, the relational grading model – and the fact that you are working within the NLE and can freely move between color correction and editing/trimming – makes Symphony fast to operate, especially on time-sensitive, long-form productions, like TV shows.

Adobe SpeedGrade CC

If you are a current Creative Cloud subscriber, then you have access to the most recent versions of Adobe Premiere Pro CC and SpeedGrade CC. With the updates introduced late last year, Adobe added Direct Link interaction between Premiere Pro and SpeedGrade. When you use Direct Link to send your Premiere Pro timeline to SpeedGrade, the actual Premiere Pro sequence becomes the SpeedGrade sequence. This means codec decoding, transitions and Premiere Pro effects are handled by Premiere Pro’s effects engine, even though you are working inside SpeedGrade. As such, a project created via Direct Link supports features and codecs that would not be possible within a standalone SpeedGrade project.

Another unique aspect is that native and third-party transitions and effects used in Premiere Pro are visible (though not adjustable) when you are working inside SpeedGrade. This is an important distinction, because other correction workflows that rely on roundtrips don’t include NLE-based filters; you can’t see how the correction will be affected by a filter used in the NLE timeline. Naturally, in the case of SpeedGrade, this only works if you are on a machine with the same third-party filters installed. When you return to Premiere Pro from SpeedGrade, the color corrections on clips are collapsed into a Lumetri filter effect that is applied to the clip or adjustment layer within the Premiere Pro sequence. Essentially, this Lumetri effect is similar to a LUT that encapsulates all of the grading layers applied in SpeedGrade into a single effect in Premiere Pro. This is possible because the two applications share the same color science. The result is a render-free workflow with the easy ability to go back and forth between Premiere Pro and SpeedGrade for changes and adjustments. Unlike a standard LUT, a Lumetri filter can carry masks and keyframes and is 100% precise.

As a color corrector, SpeedGrade is designed with a layer-based interface, much like Photoshop. Layers can be primary (fullscreen), secondary (keys and masks) or filters. A healthy selection of effects filters and LUTs are included. The correction model splits the signal into what amounts to a 12-way color wheel arrangement. There are lift/gamma/gain controls for the overall image, as well as for each of the shadow, middle and highlight ranges. Controls can be configured as wheels or sliders, with additional sliders for contrast, pivot, temperature (red vs. blue bias), magenta (red/blue vs. green bias) and saturation. There are no curves controls.
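As a point of comparison with CDL-style slope/offset/power, lift/gamma/gain pivots its shadow control around white rather than shifting the entire signal. Here is one common formulation as a sketch (SpeedGrade’s exact math isn’t published here, and implementations vary, so treat the formula as illustrative):

```python
def lift_gamma_gain(value, lift=0.0, gamma=1.0, gain=1.0):
    """One common lift/gamma/gain formulation (implementations vary).

    Lift raises the shadows while pinning white at 1.0, gain scales
    the whole range and gamma bends the midtones.
    """
    graded = gain * (value + lift * (1.0 - value))  # lift pivots around white
    graded = max(graded, 0.0)                       # clamp before the gamma stage
    return graded ** (1.0 / gamma)                  # gamma as a midtone bend
```

Note the difference from a CDL offset: a lift of 0.2 raises black to 0.2 but leaves white untouched, whereas a CDL offset of 0.2 would push white to 1.2 as well.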

Overall, I like the looks I get with SpeedGrade, but I find it lacking in some ways. There are definite plusses and minuses. I miss the curves. It currently does not work with Blackmagic Design hardware; Matrox, Bluefish and AJA are fine. It’s got a tracker, but I find both the tracking and masking to be mediocre. The biggest workflow shortcoming is the lack of a temporary memory register feature. You can save a whole grade, which stores the entire stack of grading layers applied to a clip as a Lumetri filter. You can apply grades from earlier timeline clips quite simply, and SpeedGrade lets you open multiple playheads for comparison and correction between multiple shots on the timeline. You can access the nine grades before and the nine grades after the current playhead position. You can also copy the grade from the clip under the mouse position to the clip under the playhead by pressing the C key. What you cannot do is store a random set of grades – or just a single layer – in a temporary buffer and then apply it from that buffer somewhere else in the timeline. Adding these two items would greatly speed up the SpeedGrade workflow.

Blackmagic Design DaVinci Resolve

The DaVinci name is legendary among color correction products, but that reputation was earned with its hardware products, like the DaVinci 2K. Resolve was the software-based product, built around a Linux cluster. When Blackmagic bought the assets and technology of DaVinci, all of the legacy hardware products were dropped in favor of concentrating on Resolve as the software with the most life left in it. There are now four versions: Resolve Lite (free), Resolve (paid, software-only), Resolve with a Blackmagic control surface and Resolve for Linux. The first three work on Mac and PC. You may download the free Lite version from the Blackmagic website or Apple’s Mac App Store. The Lite version has nearly all of the power of the paid software, but with these limitations: noise reduction, stereoscopic tools and output at resolutions above UltraHD require a paid version.

I’m writing this based on Resolve 10, which has rudimentary editing features. It is designed as a standalone color corrector that can be used for some editing. Blackmagic Design doubled down on the editing side with Resolve 11 (shown at NAB 2014). When that’s finally released this summer, you’ll have a powerful NLE built into the application. The demos at NAB were certainly impressive. If the release lives up to them, Resolve 11 will function as an Avid Symphony or Quantel Rio type of system, meaning you could freely move between creative editing and color correction simply by changing tabs in the interface. For now, Resolve 10 is mainly a color corrector, with some very good roundtrip and conforming support for other NLEs – specifically, very good support for Avid and FCP X workflows.

As a color corrector, Resolve offers the widest set of correction tools of any of these systems. In the work I’ve done, Resolve allows for more extreme grading and is more precise when trying to correct problem shots. I’ve done corrections with it that would have been impossible with any other tool. The correction controls include curves, wheels, primary sliders, channel mixers and more. Corrections are node-based and can be applied to clips or an entire track. Nodes can be applied in a serial or parallel fashion, with special splitter/combiner and layer mixing nodes. The latter includes Photoshop-style blend modes. Unlike SpeedGrade, you can store the value of a single node in a buffer (using the keyboard copy function) and then paste the value of just that node somewhere else. This makes it pretty fast when working up and down a timeline. Finally, the tracker is amazing.

A few things bother me about Resolve, in spite of its powerful toolset. The interface almost presents too many tools and it becomes very easy to lose track of what was done and where. There is no large viewer or fullscreen mode that doesn’t hide the node tree. This forces a lot of toggling between workspace configurations. If you have two displays, you cannot use the second display for anything other than the scopes and audio mixer. (This will change with Resolve 11.) Finally, you can only use Blackmagic Design hardware to view the video output on a grading monitor.

Apple Color

Some of you are saying, “Why talk about that? It was killed off a few years ago! Who uses that anymore?” Yes, I know. What people so quickly forget is that when the software was FinalTouch (before Apple’s purchase), it was very expensive and considered very innovative. Apple bought it, added some features and cleaned up some of the workflow. As part of Final Cut Studio, it set the standard for round-tripping with an NLE. Unfortunately for many Mac users, it retained its less glossy, “Unixy” interface and thus never really caught on with many editors. However, it still works just fine on the newest machines and OS versions and remains a fast, high-quality color corrector.

Nearly all of the long-form jobs I’ve done – including feature films and TV shows, up until even a few months ago – have been done with Color. There are two reasons that I prefer it. The first is that most of these jobs were cut using FCP 7, so it’s still the most integrated software for those projects. More importantly, there are several key features that make it faster than SpeedGrade and Resolve on projects that fall within a standard range of grading – in other words, where the in-camera look was good, there were no huge problem areas and the desired grade didn’t swing into extreme looks.

Color is designed with 10 levels of grading per clip – a primary in, eight secondaries and a primary out. Since secondaries can be fullscreen or a portion of the image qualified by an HSL key or mask, each secondary layer can actually hold two corrections – inside and outside of the mask. In addition to these, there’s a ColorFX layer for node-based filter effects, which can also include color adjustments. In reality, the maximum number of corrections to a single clip could be up to 19. The primary corrections can include value changes for RGB lift/gamma/gain and saturation levels, as well as printer lights. On top of this are lift/gamma/gain color wheels and luma controls. Lastly, there are curves. The secondaries include custom mask shapes and hue/sat/luma curves. There’s a tracker, too, but it’s not that great.

Where Color still shines for me is in workflow. Each layer is represented by a labelled bar on the timeline under the clip. This makes it easy to apply only a single secondary adjustment to other clips on the timeline simply by sliding the corresponding secondary bar from one timeline clip to one or more of the others. For example, I used Secondary 3 to qualify a person’s face and brighten it. I could then simply drag the bar for S3 that appears under the first clip on the timeline over to every other clip with the same person and similar set-up. All without selecting each of these clips prior to applying the adjustment.

Color works with all cards that work with Final Cut Pro, so there’s no AJA-versus-Blackmagic issue like the one mentioned above. Dual monitors work well. You can have scopes and the viewer (or a fullscreen viewer) on one display and the full control interface on the other. Realistically, Color works best with video up to 2K and one of the standard Apple codecs (uncompressed or ProRes work best). A lot of the footage I’ve graded with it was ProRes HQ or ProRes 4444 that came natively from an ARRI Alexa or was transcoded from a C300, RED or a Canon 5D/7D. But I’ve also done a film that was all native EX rewrapped as .mov from a Sony camera, and Color had no issues. Log-profile footage grades very nicely in Color, so Alexa ProRes 4444 encoded as Log-C forms a real sweet spot for Apple Color.

©2014 Oliver Peters

CoreMelt TrackX

Tracking isn’t something every editor does on a regular basis, but when you need it, very few NLEs have built-in tracking tools. This is definitely true of Apple Final Cut Pro X. CoreMelt makes some nice effects plug-ins, but in addition they’ve produced a number of workflow tools that enhance the capabilities of Final Cut Pro X. These include Lock & Load X (stabilization) and SliceX (masking). The newest tool in the group is TrackX and, like SliceX, it uses Mocha tracking technology licensed from Imagineer Systems. In keeping with the simplified controls common to FCP X effects, the tracking controls in TrackX are very easy to apply and use.

TrackX installs as three generators within FCP X – Simple Tracker, Track Layer and Track Text. All use the same planar-based Mocha tracker. The easiest to use – and where I get the best results – is the Simple Tracker. This lets you attach text or objects to a tracked item, so they travel with its movement.

The example used in their tutorial is a downhill skier. As he races downhill, a timer read-out travels next to him. This works well and displays well, because the tracked objects do not have to adhere perfectly to each other. It uses a two-step process. First, create the item you want to attach and place it into a compound clip; this means it can be a complex graphic and not just text. The second step is to track the object you want to follow. Apply the TrackX generator and trim it to length, use the rectangle tool to select an area to be tracked, drop the compound clip into the filter control pane’s image well and then track forward or backward. If there are hiccups within the track, you can manually delete or insert keyframes. Like other trackers, you can select the mode of analysis to be used, such as whether to follow position, scale or perspective.

The second TrackX generator is Track Layer. This worked well enough, but not nearly as well as the more advanced versions of Mocha that come with After Effects or are sold separately. This tool is designed to replace objects, such as inserting a screen image into a TV, window, iPad or iPhone. To use it, first highlight the area that will be replaced using the polygon drawing tool. Next, add the image to be used as the new surface. Then track. There are controls to adjust the scale and offset of the new surface image within its area.

In actual practice, I found it hard to get a track that wasn’t sloppy. It seems to track best when the camera is panning on an object without zooming or having any handheld rotation around the object. Since Mocha tracking is based on identifying flat planes, any three-dimensional motion around an object that results in a perspective change becomes hard to track. This is tough no matter what, but in my experience the standard Mocha trackers do a somewhat better job than TrackX did. A nice feature is a built-in masking tool, so that if your replacement surface is supposed to travel behind an object, like a telephone pole, you can mask the occluded area for realistic results.

Lastly, there’s Track Text. This generator has a built-in text editor and is intended to track objects in perspective. The example used in their demos is text attached to building rooftops in an aerial shot. The text is adjusted in perspective to sit on the same plane as the roofs.

Overall, I liked the tools, but for serious compositing and effects, I would never turn to FCP X anyway; I would do that sort of work in After Effects. (TrackX does not install into Motion.) Nevertheless, for basic tracking, TrackX fills a real gap in FCP X’s capabilities and is a tool that every FCP X editor will want at their fingertips.

For new features announced at NAB and coming soon, check out this video and post from FCP.co.

©2014 Oliver Peters