Cold In July

Jim Mickle started his career as a freelance editor in New York, working on commercials and corporate videos, like so many others. Bitten by the filmmaking bug, Mickle has gone on to direct four successful indie feature films, including his latest, Cold in July. Like his previous film, We Are What We Are, Cold in July premiered at the Sundance Film Festival.

Cold In July, which is based on a novel by Joe R. Lansdale, is a noir crime drama set in 1980s East Texas. It stars Michael C. Hall (Dexter), Sam Shepard (Out of the Furnace, Killing Them Softly) and Don Johnson (Django Unchained, Miami Vice). Awakened in the middle of the night, small-town family man Richard Dane (Hall) kills a burglar in his house. Dane soon fears for his family’s safety when the burglar’s ex-con father, Ben (Shepard), comes to town bent on revenge. However, the story takes a twist into a world of corruption and violence. Add Jim Bob (Johnson) to this mix as a pig-farming private eye, and you have an interesting trio of characters.

According to Jim Mickle, Cold In July was on a fast-track schedule. The script was optioned in 2007, but production didn’t start until 2013. This included eight weeks of pre-production beginning in May and principal photography starting in July (for five weeks) with a wrap in September. The picture was “locked” shortly after Thanksgiving. Along with Mickle, John Paul Hortsmann (Killing Them Softly) shared editing duties.

I asked Mickle how it was to work with another editor. He explained, “I edited my last three films by myself, but with this schedule, post was wedged between promoting We Are What We Are and the Sundance deadline. I really didn’t have time to walk away from it and view it with fresh eyes. I decided to bring John Paul on board to help. This was the first time I’ve worked with another editor. John Paul was cutting while I was shooting and edited the initial assembly, which was finished about a week before the Sundance submission deadline. I got involved in the edit about mid-October. At that point, we went back to tighten and smooth out the film. We would each work on scenes and then switch and take a pass at each other’s work.”

Mickle continued, “The version that we submitted to Sundance was two-and-a-half hours long. John Paul and I spent about three weeks polishing and were ready to get feedback from the outside. We held a screening for 20 to 25 people and afterwards asked questions about whether the plot points were coherent to them. It’s always good for me, as the director, to see the film with an audience. You get to see it fresh – with new eyes – and that helps you to trim and condense sections of the film. For example, in the early versions of the script, it generally felt like the middle section of the film lost tension. So, we had added a sub-plot element into the script to build up the mystery. This was a car of agents tailing our hero that we could always reuse, as needed. When we held the screening, it felt like that stuff was completely unnecessary and simply put on top of the rest of the film. The next day we sliced it all out, which cut 10 minutes out of the film. Then it finally felt like everything clicked.”

The director-editor relationship always presents an interesting dynamic, since the editor can be objective in cutting out material that may have cost the director a lot of time and effort to capture on set. Normally, the editor has no emotional investment in the production of the footage. So, how did Jim Mickle, as the editor, treat his own work as the director? Mickle answered, “As an editor, I’m more ruthless on myself as the director. John Paul was less quick to give up on scenes than I. There are things I didn’t think twice about losing if they didn’t work, but he’d stay late to fix things and often have a solution the next day. I shoot with plenty of coverage these days, so I’ll build a scene and then rework it. I love the edit. It’s the first time you really feel comfortable and can craft the story. On the set, things happen so quickly that you always have to be reactive – working and thinking on your feet.”

Although Mickle had edited We Are What We Are with Adobe Premiere Pro, the decision was made to shift back to Apple Final Cut Pro 7 for the edit of Cold In July. Mickle explained, “As a freelance editor in New York, I was very comfortable with Final Cut, but I’m also an After Effects user. When doing a lot of visual effects, it really feels tedious to go back and forth between Final Cut and After Effects. The previous film was shot with RED cameras and I used a raw workflow in post, cutting natively with Premiere Pro. I really loved the experience – working with raw files and Dynamic Link between Premiere and After Effects. When we hired John Paul as the primary editor on the film, we opted to go back to Final Cut, because that is what he is most comfortable with. That would get the job done in the most expedient fashion, since he was handling the bulk of the editing.”

“We shot with RED cameras again, but the footage was transcoded to ProRes for the edit. I did find the process to be frustrating, though, because I really like the fluidness of using the raw files in Premiere. I like the editing process to live and breathe and not be delineated. Having access to the raw files lets me tweak the color correction, which helps me to get an idea of how a scene is shaping up. I get the composer involved early, so we have a lot of the real music in place as a guide while we edit. This way, your cutting style – and the post process in general – are more interactive. In any case, the ProRes files were only used to get us to the locked cut. Our final DI was handled by Light Iron in New York and they conformed the film from the original RED files for a 2K finish.”

The final screening with mix, color correction and all visual effects occurred just before Sundance. There the producers struck a distribution deal with IFC Films. Cold In July started its domestic release in May of this year.

Originally written for Digital Video magazine/CreativePlanetNetwork.

©2014 Oliver Peters

Apple’s New Mac Pro

The run of the brushed aluminum tower design that highlighted Apple’s PowerMac G5 and Intel Mac Pros ended with the introduction of a radical replacement in late 2013. No matter what the nickname – “the cylinder”, “the tube” or whatever – Apple’s new 2013 Mac Pro is a tour de force of industrial design. Few products have had such pent-up demand. The long lead times for custom machines originally ran to months, but by now, with accelerated production, have been reduced to 24 hours. Nevertheless, if you are happy with a stock configuration, it’s possible to walk out with a new unit on the same day at some Apple Store or reseller retail locations.

Design

The 2013 Mac Pro features a cylindrical design. It’s about ten inches tall, six-and-a-half inches in diameter and, thanks to a very dense component construction, weighs about eleven pounds. The outer shell – it’s actually a sleeve that can be unlocked and lifted off – uses a dark (not black) reflective coating. Internally, the circuits are mounted onto a triangle-shaped core. There’s a central vent system that draws air in through the bottom and out through the top, much like a chimney. You can still mount the Mac Pro sideways without issue, as long as the vents are not blocked. This design keeps the unit quiet and cool most of the time. During my tests, the fan noise was quieter than my tower (generally a pretty quiet unit) and the fans never kicked into high.

Despite the small size, all components are workstation class – not the mobile or desktop products used in Apple’s laptops or iMacs. It employs the fastest memory and storage of any Mac and is designed to pick up where the top-of-the-line iMac leaves off. The processors are Intel Xeons instead of Core i5 or Core i7 CPUs, and the graphics cards are AMD FirePro GPUs. This Xeon is a single, multicore CPU. Four processor options are offered (4, 6, 8 and 12-core), ranging in speed from 3.7GHz (4-core) to 2.7GHz (12-core). RAM can be maxed out to a full 64GB and is the only component of the Mac Pro where a user-installed, third-party upgrade is an easy option.

The Mac Pro is optimized for dual graphics processors, with three GPU choices: D300 (2GB VRAM each), D500 (3GB VRAM each) or D700 (6GB VRAM each). Internal storage is PCIe-based flash memory in 256GB, 512GB or 1TB configurations. These are not solid state drives (SSDs), but rather flash storage like that used in iPads. Storage is connected directly to the PCIe bus of the Mac Pro for the fastest possible data I/O. The stock models start at $2,999 (4-core) and $3,999 (6-core).

Apple shipped me a reviewer’s unit, configured in a way that they feel is the “sweet spot” for high-end video. My Mac Pro was the 8-core model, with 32GB of RAM, dual D700 GPUs and 512GB of storage. This configuration with a keyboard, mouse and AppleCare extended warranty would retail at $7,166.

Connectivity

All connectors are on the back – four USB 3.0, six Thunderbolt 2, two Gigabit Ethernet and one HDMI 1.4. There is also wireless, Bluetooth, headset and speaker support. The six Thunderbolt 2 ports are split out from three internal Thunderbolt 2 buses, with the bottom bus also taking care of the HDMI port.

You can have multiple Thunderbolt monitors connected, as well as a 4K display via the HDMI spigot; however, you will want to separate these onto different buses. For example, you wouldn’t be able to support two 27” Apple displays and a 4K HDMI-connected monitor all on one single Thunderbolt bus. However, you can support up to six non-4K displays if you distribute the load across all of the connections. Since the Thunderbolt plug is the same as Mini DisplayPort, you can connect nearly any standard computer monitor to these ports if you have the proper adapter. For example, I used my 20” Apple Cinema Display, which has a DVI plug, by simply adding a DVI-to-Mini DisplayPort adapter.

The change to Thunderbolt 2 enables faster throughput. The first version of Thunderbolt used two 10Gb/s channels for data and video, one running in each direction. Thunderbolt 2 bonds these into two channels that can run in the same direction, for a total of 20Gb/s. You can daisy-chain Thunderbolt devices, and it is possible to combine Thunderbolt 1 and Thunderbolt 2 devices in the same chain. First generation Thunderbolt devices (such as monitors) should be at the end of the chain, so as not to create a bottleneck.
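
To put that throughput in rough perspective, here’s a quick back-of-the-envelope sketch. The per-stream data rates are my own approximate assumptions (they vary with codec, frame size and frame rate), not measurements from this review:

```python
# Back-of-the-envelope look at Thunderbolt bandwidth. The per-stream data
# rates below are rough assumptions, not measured values.

TB1_GBPS = 10.0   # Thunderbolt 1: 10Gb/s per channel
TB2_GBPS = 20.0   # Thunderbolt 2: channels bonded for 20Gb/s in one direction

# Assumed, approximate video stream data rates in gigabits per second
streams = {
    "ProRes HQ 1080p30 (approx.)": 0.22,
    "ProRes HQ UHD 24p (approx.)": 0.7,
    "Uncompressed 10-bit 1080p30 (approx.)": 1.3,
}

for name, gbps in streams.items():
    tb1 = int(TB1_GBPS // gbps)
    tb2 = int(TB2_GBPS // gbps)
    print(f"{name}: ~{tb1} streams on Thunderbolt 1, ~{tb2} on Thunderbolt 2")
```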

The USB 3.0 ports will support USB 1.0 and 2.0 devices, but of course, there is no increase in their speed. There is no legacy support for FireWire or eSATA, so if you want to connect older drives, you’ll need to invest in additional docks, adapters and/or expansion units. (Apple sells a $29 Thunderbolt-to-FireWire 800 adapter.) This might also include a USB hub – for example, I have more than four USB-connected devices on my current 2009 Mac Pro. The benefit of standardizing on Thunderbolt is that all of the Thunderbolt peripherals will work with any of Apple’s other computers, including MacBook Pros, Minis and iMacs.

The tougher dilemma arises if you need to accommodate current PCIe cards, such as a RED Rocket accelerator card, a Fibre Channel adapter or a mini-SAS/eSATA card. In that case, a Thunderbolt 2 expansion unit will be required. One such solution is the Sonnet Technologies Echo Express III-D expansion chassis.

Mac Pro as your main edit system

I work in many facilities with various vintages of Mac Pro towers. There’s a wide range of connectivity needs, including drives, shared storage and peripherals. Although it’s very sexy to think about just a 2013 Mac Pro sitting on your desk with nothing other than a Thunderbolt monitor, that’s not the real world of post. If you are evaluating one of these as your next investment, consider what you must add. First and foremost is storage. Flash storage and SSDs are great for performance, but you’re never going to put a lot of video media on a 1TB (or smaller) drive. Then you’ll need monitors and, most likely, adapters or expansion products for any legacy connections.

I priced out the same unit I’m reviewing and then factored in an Apple 27” display, the Sharp 32” UHD monitor, a Promise Pegasus2 R6 12TB RAID, plus a few other peripherals, like speakers, audio I/O, docks and adapters. This bumps the total to over $15K. Granted, I’ve pretty much got a full system that will last me for years. The point is that it’s important to look at all the ramifications when you compare the new Mac Pro against a loaded iMac, a MacBook Pro, or simply upgrading a recently-purchased Mac Pro tower.

Real world performance

Most of the tests promoting the new Mac Pro have focused on 4K video editing. That’s coming, and the system is certainly good for it, but it’s not what most people encounter today. Editors deal with a mix of media, formats, frame rates, frame sizes, etc. I ran a set of identical tests on the 2013 Mac Pro and on my own 2009 Mac Pro tower. That’s an eight-core (dual 4-core Xeons) 2.26GHz model with 28GB of RAM. The current video card is a single NVIDIA Quadro 4000 and my media is on an internal two-drive (7200RPM eSATA) RAID-0 array. Since I had no external drives connected to the 2013 Mac Pro, all media was playing from and writing to the internal flash storage. This means performance was about as good as it gets – likely better than it would be with externally-connected drives.

I tested Apple Final Cut Pro X, Motion, Compressor, Adobe Premiere Pro CC and After Effects CC. Media included RED EPIC 5K camera raw, ARRI ALEXA 1080p ProRes 4444, Blackmagic Cinema Camera 2.5K ProResHQ and more. Most of the sequences included built-in effects and some of the new Red Giant Universe filters.

To summarize the test results, performance – as measured in render or export times – was significantly better on the 2013 Mac Pro. Most of the tests showed a 2X to 3X bump in performance, even with the Adobe products. Naturally FCP X loves the GPU power of this machine. The “BruceX” test, developed as a benchmark by Alex Gollner for FCP X, consists of a 5K timeline with a series of generators. I exported this as a 5K ProRes 4444 file. The older tower accomplished this in 1:47, while the new Mac Pro smoked it in just :19. My After Effects timeline consisted of ProRes 4444 clips with a bunch of intensive Cycore filters. The old versus new renders were 23:26 and 12:53, respectively. I also ran tests with DaVinci Resolve 10, another application that loves more than one GPU. These were RED EPIC 5K files in a 1080p timeline. Debayer resolution was set to full (no RED Rocket card used). The export times ran at 4-12fps (depending on the clip) on the tower versus 15-40fps on the new Mac Pro.

In general, all operations with applications were more responsive. This is, of course, true with any solid state storage. The computer boots faster and applications load and respond more quickly. Plus, more RAM, faster processors and other factors all help to optimize the 2013 Mac Pro for best performance. For example, the interaction between Adobe Premiere Pro CC and SpeedGrade CC using the Direct Link and Lumetri filters was noticeably better with the new machine. Certainly that’s true of Final Cut Pro X and Motion, which are ideally suited for it. I would add that using a single 20” monitor connected to the Mac Pro placed very little drag on one GPU, so the second could be totally devoted to processing power. Performance might vary if I had two 27” displays, plus a 4K monitor hooked to it.

I also tested Avid Media Composer. This software doesn’t use much GPU processing, so performance was about the same as with my 2009 Mac Pro. It also takes a trick to get it to work. The 2013 Mac Pro has no built-in audio device, which Media Composer needs to see in order to launch. If you have an audio device connected, such as an Mbox2 Mini or even just a headset with a microphone, then Media Composer detects a core audio device and will launch. In my case, I downloaded and installed the free Soundflower software. This acts as a virtual core audio device and can be set as the computer’s audio input in the System Preferences sound panel. Doing so enabled Media Composer to launch and operate normally.

Whether the new 2013 Mac Pro is the ideal tower replacement for you comes down to budget and many other variables. Rest assured that it’s the best machine Apple has to offer today. Analogies to powerful small packages (like the Mini Cooper or Bruce Lee) are quite apt. The build quality is superb and the performance is outstanding. If you are looking for a machine to service your needs for the next five years, then it’s the ideal choice.

(Note: This unit was tested prior to the release of 10.9.3, so I didn’t encounter any of the render issues that have been plaguing Adobe and DaVinci users.)

Originally written for Digital Video magazine/CreativePlanetNetwork.

©2014 Oliver Peters

Filmmaking Pointers

If you want to be a good indie filmmaker, you have to understand some of the basic principles of telling interesting visual stories and driving the audience’s emotions. These six ideas transcend individual components of filmmaking, like cinematography or editing. Rather, they are concepts that every budding director should understand and weave into the entire structure of how a film is approached.

1. Get into the story quickly. Films are not books and don’t always need a lengthy backstory to establish characters and plot. Films are a journey and it’s best to get the characters on that road as soon as possible. Most scripts are structured as three-act plays, so with a typical 90-100 minute running time, you should be through act one roughly one third of the way into the film. If not, you’ll lose the interest of the audience. If you are 20 minutes into the film and you are still establishing the history of the characters without having advanced the story, then look for places to start cutting.

Sometimes this isn’t easy to tell and an extended start may indeed work well, because it does advance the story. One example is There Will Be Blood. The first reel is a tour de force of editing, in which editor Dylan Tichenor builds a largely dialogue-free montage that quickly takes the audience through the first part of Daniel Plainview’s (Daniel Day-Lewis) history in order to bring the audience up to the film’s present day. It’s absolutely instrumental to the rest of the film.

2. Parallel story lines. A parallel story structure is a great device to show the audience what’s happening to different characters at different locations, but at more or less the same time. With most scripts, parallel actions are designed to eventually converge as related or often unrelated characters ultimately end up in the same place for a shared plot. An interesting take on this is Cloud Atlas, in which an ensemble cast plays different characters spread across six different eras and locations – past, present and future.

The editing style pulled off by Alexander Berner is quite a bit different from traditional parallel story editing. A set of characters might start a scene in one era. Halfway through the scene – through some type of abrupt cut, such as walking through a door – the characters, location and era shift to somewhere else. However, the story and the editing are such that you clearly understand how the story continues for the first half of that scene, as well as how it led into the second half. This is all without explicitly shooting those parts of each scene. Scene A/era A informs your understanding of scene B/era B and vice versa.

3. Understand camera movement. When a camera zooms, moves or is used in a shaky, handheld manner, this elicits certain emotions from the audience. As a director or DP, you need to understand when each style is appropriate and when it can be overdone. Zooming into a close-up while an actor delivers a line should be done intentionally. It tells the audience, “Listen up. This is important.” If you shoot handheld footage, like most of the Bourne series, it drives a level of documentary-style, frenetic action that should be in keeping with the concept.

The TV series NYPD Blue is credited with introducing TV audiences to the “shaky-cam” style of camera work. Many pros thought it was overdone, with movement often being introduced in an unmotivated fashion. Yet, the original Law & Order series also made extensive use of handheld photography. As this was more in keeping with a subtle documentary style, few complained about its use on that show.

4. Color palettes and art direction. Many new filmmakers feel that you can get any look you want through color grading. The reality is that it all starts with art direction. Grading should enhance what’s there, not manufacture something that isn’t. To get that “orange & teal” look, you need a set and wardrobe with some greens and blues in them. To get a warm, earthy look, you need a set and wardrobe with browns and reds.

This even extends to black & white films. To get the right contrast and tonal values in black & white, you often have to use set and wardrobe color choices that would not be ideal in a color world. That’s because different colors carry differing luminance and midrange values, which becomes very obvious once you eliminate the color information from the picture. Make sure you take that into account if you plan to produce a black & white film.

5. Score versus sound design. Music should enhance and underscore a film, but it does not have to be wall-to-wall. Some films, like American Hustle and The Wolf of Wall Street, are driven by a score of popular tunes. Others are composed with an original score. Often, however, the “score” consists of sound design elements and simple musical drones designed to heighten tension and otherwise manipulate emotion. The absence of score in a scene can achieve the same effect. Stark, simple sound effects elements may have more impact on the audience than music. Learn when to use one or the other or both. Often less is more.

6. Don’t tell too much story. Not every film requires extensive exposition. As I said at the top, a film is not a book. Visual cues are as important as the spoken word and will often tell the audience a lot more in shorthand than pages and pages of script. The audience is interested in the journey your film’s characters are on and frequently needs very little backstory to get an understanding of the characters. Don’t shy away from shooting enough of that sort of detail, but also don’t be afraid to cut it out when it becomes superfluous.

©2014 Oliver Peters

The Ouch of 4K Post

4K is the big buzz. Many in the post community are wondering when the tipping point will be reached – when their clients will demand 4K masters. 4K acquisition has been with us for a while and has generally proven to be useful for its creative options, like reframing during post. This was possible long before the introduction of the RED One camera, if you were shooting on film. But acquiring in 4K and higher is quite a lot different from working in a complete 4K post production pipeline.

There are a lot of half-truths surrounding 4K, so let me tackle a couple. When we talk about 4K, the moniker applies only to frame dimensions in pixels, not resolution, as in sharpness. There are several 4K dimensions, depending on whether you mean cinema specs or television specs. The cinema projection spec is 4096 x 2160 (1.9:1 aspect ratio) and within that, various aspects and frame sizes can be placed. The television or consumer spec is 3840 x 2160 (16:9 or 1.78:1 aspect ratio), which is an even multiple of HD at 1920 x 1080. That’s what most consumer 4K TV sets use. It is referred to by various labels, such as Ultra HD, UHD, UHDTV, Quad HD, 4K HD and so on. If you are delivering a digital cinema master it will be 4096 pixels wide, but if you deliver a television 4K master, it will be 3840 pixels wide. Regardless of which format your deliverable will be, you will most likely want to acquire at 4096 x 2304 (16:9) or larger, because this gives you some reframing space for either format.

This brings us to resolution. Although the area of the 4K frame is 4x that of a 1080p HD frame, the actual resolution is theoretically only 2x better. That’s because resolution is measured along the vertical dimension and is a function of the ability to resolve small detail in the image (typically based on thin lines of a resolution chart). True resolution is affected by many factors, including lens quality, depth of field, accuracy of focus, contrast, etc. When you blow up a 35mm film frame and analyze high-detail areas within the frame, you often find them blurrier than you’d expect.

This brings us to post. The push for 4K post comes from a number of sources, but many voices in the independent owner-operator camp have been the strongest. These include many RED camera owners, who successfully cut their own material straight from the native media of the camera. NLEs like Adobe Premiere Pro CC and Apple Final Cut Pro X make this a fairly painless experience for small, independent projects, like short films and commercials. Unfortunately, it’s an experience that doesn’t extrapolate well to the broader post community, which works on a variety of projects and must interchange media with numerous other vendors.

The reason 4K post seems easy and viable to many is that the current crop of 4K cameras work with highly compressed codecs, and many newer computers have been optimized to deal with these codecs. Therefore, if you shoot with a RED (Redcode), Canon 1DC (Motion-JPEG), AJA Cion (ProRes), BMD URSA (ProRes) or Sony F55 (XAVC), you are going to get a tolerable post experience using post-ready, native media or by quickly transcoding to ProRes. But that’s not how most larger productions work. A typical motion picture or television show will take the camera footage and process it into something that fits into a known pipeline. This usually means uncompressed DPX image sequences, plus proxy movies for the editors. This allows a base level of color management that can be controlled through the VFX pipeline without each unit along the way adding its own color interpretation. It also keeps quality at its highest by avoiding further decompression/recompression cycles, as well as variations among debayering methods.

Uncompressed or even mildly compressed codecs mean a huge storage commitment for an ongoing facility. Here’s a quick example. I took a short RED clip that was a little over 3 minutes long. It was recorded as 4096 x 2304 at 23.976fps. This file was a bit over 7GB in its raw form. Then I converted it to these formats with the following results:

ProRes 4444 – 27GB

ProRes HQ (also scaled to UHD 3840 x 2160) – 16GB

Uncompressed 10-Bit – 116GB

DPX images (10-bits per channel) – 173GB

TIFF images (8-bits per channel) – 130GB

As you can see, storage requirements increase dramatically. This can be mitigated by tossing out some data, as the ProRes 4444 versus down-sampled ProRes HQ comparison shows. It’s worth noting that I used the lower DPX and TIFF color depth options, as well. At these settings, a single 4K DPX frame is 38MB and a single 4K TIFF frame is 28MB.
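
Those uncompressed numbers are easy to sanity-check with a little arithmetic. Here’s a quick sketch, assuming 10-bit DPX packs its three channels into 32 bits per pixel and 8-bit TIFF uses 3 bytes per pixel (file header overhead ignored):

```python
# Sanity check for the uncompressed per-frame sizes quoted above.
# Assumes 10-bit DPX packs its three 10-bit channels into one 32-bit word per
# pixel and 8-bit TIFF stores 3 bytes per pixel; header overhead is ignored.

WIDTH, HEIGHT = 4096, 2304   # the 16:9 4K size used in the example clip
FPS = 23.976

def frame_mb(bytes_per_pixel):
    return WIDTH * HEIGHT * bytes_per_pixel / 1e6   # decimal megabytes

dpx_mb = frame_mb(4)    # 10-bit DPX: 4 bytes per pixel
tiff_mb = frame_mb(3)   # 8-bit TIFF: 3 bytes per pixel

print(f"DPX frame:  ~{dpx_mb:.0f} MB")                            # ~38 MB
print(f"TIFF frame: ~{tiff_mb:.0f} MB")                           # ~28 MB
print(f"One minute of DPX: ~{dpx_mb * FPS * 60 / 1000:.0f} GB")   # ~54 GB
```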

For comparison, a complete 90-100 minute feature film mastered at 1920 x 1080 (23.976fps) as ProRes HQ will consume about 110-120GB of storage. UHD is still 4x the frame area, so if we use the ProRes HQ example above, 30x that 3-minute clip approximates a typical feature. That figure comes out to 480GB.

This clearly has storage ramifications. A typical indie feature shot with two RED cameras over a one-month period will likely generate about 5-10TB of media in camera-original raw form. If this same media were converted to ProRes 4444 – never mind uncompressed – your storage requirement just jumped by an additional 16-38TB. Mind you, this is all as 24p media. As we start talking about 4K in television-centric applications around the world, this also means 4K at 25, 30, 50 and 60fps. 60fps means 2.5x more storage demand than 24p.
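
Scaling the 3-minute sample out to feature length is straightforward arithmetic. The sketch below simply projects the finishing-format numbers quoted above; camera-original totals run far higher because of the shooting ratio:

```python
# Projecting the 3-minute sample out to a 90-minute feature. These are the
# same ballpark figures quoted in the text, not new measurements.

sample_minutes = 3
sample_sizes_gb = {
    "ProRes HQ (UHD)": 16,
    "ProRes 4444": 27,
    "DPX 10-bit": 173,
}

feature_minutes = 90
scale = feature_minutes / sample_minutes   # 30x

for codec, gb in sample_sizes_gb.items():
    print(f"{codec}: ~{gb * scale:,.0f} GB for a {feature_minutes}-minute master")

# Frame rate scales these totals linearly: 60p needs 60/24 = 2.5x the storage of 24p.
# Camera-original raw totals (the 5-10TB mentioned above) are larger still, because
# the shooting ratio multiplies everything well beyond the final runtime.
print(f"60p multiplier vs. 24p: {60 / 24}x")
```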

The other element is system performance. Compressed codecs work when the computer is optimized for them. RED has worked hard to make Redcode easy to work with on modern computers. Apple ProRes enjoys near-ubiquitous playback support. ProRes HQ, even at 4K, will play reasonably well from a two-drive RAID-0 stripe on my Mac Pro. Redcode plays if I lower the debayer quality. Once you start getting into uncompressed files and DPX or TIFF image sequences, it takes a fast drive array and a fast computer to get anything approaching consistent real-time playback. Therefore, the only viable workflow is an offline-online editorial system, since creative editorial generally requires multiple streams of simultaneous media.

This workflow gets even worse with other cameras. One example is the Canon C500, which records 4K camera raw files to an external recorder, such as the Convergent Design Odyssey 7Q. These are proprietary Canon camera raw files, which cannot be played natively by an NLE. They must first be turned into something else using a Canon utility. Since the Odyssey records to internal SSDs, media piles up pretty quickly. With two 512GB SSDs, you get 62 minutes of record time at 24fps if you record Canon 4K raw. In the real world of production, this becomes tough, because it means you either have to rent or buy numerous SSDs for your shoot, or copy and reuse as you go. Typically, transferring 1TB of data on set is not a fast process.
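
Working backwards from those figures gives a rough sense of the data rate involved – this is only an estimate derived from the capacity and record time quoted above:

```python
# Implied data rate for Canon 4K raw on the Odyssey 7Q, derived only from the
# capacity and record time quoted above (approximate, decimal units).

capacity_gb = 2 * 512      # two 512GB SSDs
record_minutes = 62        # quoted record time at 24fps

gb_per_minute = capacity_gb / record_minutes
mb_per_second = gb_per_minute * 1000 / 60

print(f"~{gb_per_minute:.1f} GB per minute, ~{mb_per_second:.0f} MB/s sustained")
# Roughly 16.5 GB/min (~275 MB/s) - which is why offloading 1TB on set takes a while.
```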

Naturally there are ways to make 4K post efficient and not as painful as it might otherwise be. But it requires a commitment to hardware resources. It’s not conducive to easy desktop post running off of a laptop, the way DV and even HD have been. That’s why you still see Autodesk Smoke, Quantel Pablo Rio and other high-end systems dominate at the leading facilities. Think, plan and buy before you jump in.

©2014 Oliver Peters

Amira Color Tool and your NLE

I was recently alerted to the new Amira Color Tool by Michael Phillips’ 24p blog. This is a lightweight ARRI software application designed to create custom in-camera looks for the Amira camera. You do this by creating custom color look-up tables (LUTs). The Amira Color Tool is available as a free download from the ARRI website (free registration required). Although the application is designed for the camera, you can also export looks in a variety of LUT file formats, which, in turn, may be installed and applied to footage in a number of different editing and color correction applications. I tested this in both Apple Final Cut Pro X and Avid Media Composer | Software (v8) with good results.

The Amira Color Tool is designed to convert log-C encoded footage to a straight Rec709 image or to a custom look. ARRI offers some very good instructions, white papers, sample looks and tutorials that cover the operation of this software. The signal flow is from the log-C image, to the Rec709 correction, and then to the CDL-based color correction. To my eye, the math appears to be floating point, because a Rec709 conversion that throws a shot into clipping can be pulled back out of clipping in the look tab, using the CDL color correction tools. Therefore it is possible to use this tool for shots other than ARRI Amira or Alexa log-C footage, as long as the footage is sufficiently flat.

The CDL correction tools are based on slope, offset and power. In that model, slope is equivalent to gain, offset to lift and power to gamma. In addition to color wheels, there’s a second video look parameters tab with hue intensities for the six main vectors (red, yellow, green, cyan, blue and magenta). The Amira Color Tool is Mac-only and opened both the QuickTime and DPX files that I tested. It worked successfully with clips shot on an Alexa (log-C), Blackmagic Cinema Camera (BMD Film profile), Sony F3 (S-log) and Canon 1DC (4K Canon-log). Remember that the software is designed to correct flat, log-C images, so you probably don’t want to use this with images that were already encoded with vibrant Rec709 colors.
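
For reference, the slope/offset/power controls described above map onto the standard ASC CDL transfer function. Here’s a minimal sketch of that math, assuming values normalized to 0-1 – my own illustration, not code from the ARRI tool:

```python
# Minimal sketch of the standard ASC CDL per-channel transfer function,
# which is the model the slope/offset/power controls correspond to.
# Values are assumed normalized to 0.0-1.0. The clamp shown here is the
# textbook behavior; as noted above, the Amira Color Tool itself appears to
# work in floating point, so "clipped" values can still be recovered.

def asc_cdl(value, slope=1.0, offset=0.0, power=1.0):
    """slope ~ gain, offset ~ lift, power ~ gamma."""
    out = value * slope + offset
    out = max(0.0, min(1.0, out))   # clamp to the legal range
    return out ** power

# Example: a gentle contrast tweak on an 18% gray value
print(asc_cdl(0.18, slope=1.1, offset=0.02, power=0.95))
```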

FCP X

To use the Amira Color Tool, import your clip from the application’s file browser, set the look and export a 3D LUT in the appropriate format. I used the DaVinci Resolve setting, which creates a 3D LUT in a .cube format file. To get this into FCP X, you need to buy and install a LUT filter, like Color Grading Central’s LUT Utility. To install a new LUT there, open the LUT Utility pane in System Preferences, click the “+” symbol and navigate to where the file was saved. In FCP X, apply the LUT Utility to the clip as a filter. From the filter’s pulldown selection in the inspector, choose the new LUT that you’ve created and installed. One caveat is to be careful with ARRI files. Any files recorded with newer ARRI firmware are flagged for log-C and FCP X automatically corrects these to Rec709. Since you don’t want to double up on LUTs, make sure “log processing” is unchecked for those clips in the info tab of the inspector pane.
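
For anyone curious what the exported file actually contains, a Resolve-style .cube LUT is just a plain-text table. Below is a rough sketch of loading one and applying it with a simple nearest-neighbor lookup – real tools like LUT Utility use proper interpolation, and the file name here is hypothetical:

```python
# Sketch of reading a Resolve-style .cube 3D LUT (like the one the Amira
# Color Tool exports) and applying it with nearest-neighbor lookup.
# Real implementations use trilinear or tetrahedral interpolation and handle
# DOMAIN_MIN/MAX; this is deliberately minimal.

def load_cube(path):
    size, table = None, []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or line.startswith("TITLE"):
                continue
            if line.startswith("LUT_3D_SIZE"):
                size = int(line.split()[1])
            elif line[0].isdigit() or line[0] == "-":
                table.append(tuple(float(v) for v in line.split()))
    return size, table   # entries are ordered with red varying fastest

def apply_nearest(rgb, size, table):
    # rgb is an (r, g, b) tuple normalized to 0.0-1.0
    r, g, b = (min(size - 1, round(c * (size - 1))) for c in rgb)
    return table[r + g * size + b * size * size]

size, table = load_cube("amira_look.cube")        # hypothetical file name
print(apply_nearest((0.5, 0.5, 0.5), size, table))
```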

Media Composer

To use the custom LUTs in Media Composer, select “source settings” for the clip. Go to the color management tab and install the LUT. Now it will be available in the pull-down menu for color conversions. This color management change can be applied to a single clip or to a batch of clips within a bin.

In both cases, the source clips in FCP X and/or Media Composer will play in real-time with the custom look already applied.

©2014 Oliver Peters