Cold In July

Jim Mickle started his career as a freelance editor in New York, working on commercials and corporate videos, like so many others. Bitten by the filmmaking bug, Mickle has gone on to successfully direct four indie feature films, including his latest, Cold in July. Like his previous film, We Are What We Are, Cold in July had a successful premiere at the Sundance Film Festival.

Cold In July, which is based on a novel by Joe R. Lansdale, is a noir crime drama set in 1980s East Texas. It stars Michael C. Hall (Dexter), Sam Shepard (Out of the Furnace, Killing Them Softly) and Don Johnson (Django Unchained, Miami Vice). Awakened in the middle of the night, small town family man Richard Dane (Hall) kills a burglar in his house. Dane soon fears for his family’s safety when the burglar’s ex-con father, Ben (Shepard), comes to town, bent on revenge. However, the story takes a twist into a world of corruption and violence. Add Jim Bob (Johnson) to this mix, as a pig-farming private eye, and you have an interesting trio of characters.

According to Jim Mickle, Cold In July was on a fast-track schedule. The script was optioned in 2007, but production didn’t start until 2013. This included eight weeks of pre-production beginning in May and principal photography starting in July (for five weeks) with a wrap in September. The picture was “locked” shortly after Thanksgiving. Along with Mickle, John Paul Hortsmann (Killing Them Softly) shared editing duties.

I asked Mickle how it was to work with another editor. He explained, “I edited my last three films by myself, but with this schedule, post was wedged between promoting We Are What We Are and the Sundance deadline. I really didn’t have time to walk away from it and view it with fresh eyes. I decided to bring John Paul on board to help. This was the first time I’ve worked with another editor. John Paul was cutting while I was shooting and edited the initial assembly, which was finished about a week before the Sundance submission deadline. I got involved in the edit about mid-October. At that point, we went back to tighten and smooth out the film. We would each work on scenes and then switch and take a pass at each other’s work.”

Mickle continued, “The version that we submitted to Sundance was two-and-a-half hours long. John Paul and I spent about three weeks polishing and were ready to get feedback from the outside. We held a screening for 20 to 25 people and afterwards asked questions about whether the plot points were coherent to them. It’s always good for me, as the director, to see the film with an audience. You get to see it fresh – with new eyes – and that helps you to trim and condense sections of the film. For example, in the early versions of the script, it generally felt like the middle section of the film lost tension. So, we had added a sub-plot element into the script to build up the mystery. This was a car of agents tailing our hero that we could always reuse, as needed. When we held the screening, it felt like that stuff was completely unnecessary and simply put on top of the rest of the film. The next day we sliced it all out, which cut 10 minutes out of the film. Then it finally felt like everything clicked.”

The director-editor relationship always presents an interesting dynamic, since the editor can be objective in cutting out material that may have cost the director a lot of time and effort on set to capture. Normally, the editor has no emotional investment in the production of the footage. So, how did Jim Mickle, as the editor, treat his own work as the director? Mickle answered, “As an editor, I’m more ruthless on myself as the director. John Paul was less quick to give up on scenes than I was. There are things I didn’t think twice about losing if they didn’t work, but he’d stay late to fix things and often have a solution the next day. I shoot with plenty of coverage these days, so I’ll build a scene and then rework it. I love the edit. It’s the first time you really feel comfortable and can craft the story. On the set, things happen so quickly that you always have to be reactive – working and thinking on your feet.”

Although Mickle had edited We Are What We Are with Adobe Premiere Pro, the decision was made to shift back to Apple Final Cut Pro 7 for the edit of Cold In July. Mickle explained, “As a freelance editor in New York, I was very comfortable with Final Cut, but I’m also an After Effects user. When doing a lot of visual effects, it really feels tedious to go back and forth between Final Cut and After Effects. The previous film was shot with RED cameras and I used a raw workflow in post, cutting natively with Premiere Pro. I really loved the experience – working with raw files and Dynamic Link between Premiere and After Effects. When we hired John Paul as the primary editor on the film, we opted to go back to Final Cut, because that is what he is most comfortable with. That would get the job done in the most expedient fashion, since he was handling the bulk of the editing.”

“We shot with RED cameras again, but the footage was transcoded to ProRes for the edit. I did find the process to be frustrating, though, because I really like the fluidness of using the raw files in Premiere. I like the editing process to live and breathe and not be delineated. Having access to the raw files lets me tweak the color correction, which helps me to get an idea of how a scene is shaping up. I get the composer involved early, so we have a lot of the real music in place as a guide while we edit. This way, your cutting style – and the post process in general – are more interactive. In any case, the ProRes files were only used to get us to the locked cut. Our final DI was handled by Light Iron in New York and they conformed the film from the original RED files for a 2K finish.”

The final screening with mix, color correction and all visual effects occurred just before Sundance. There the producers struck a distribution deal with IFC Films. Cold In July started its domestic release in May of this year.

Originally written for Digital Video magazine/CreativePlanetNetwork.

©2014 Oliver Peters

Particle Fever

Filmmaking isn’t rocket science, but sometimes the two are kissing cousins. Such is the case with the documentary Particle Fever, where the credentials of both producer David Kaplan and director Mark Levinson include a doctorate in particle physics. Levinson has been involved in filmmaking for 28 years, starting after his graduation from Berkeley, when he found the job prospects for physicists in a slump. Instead he turned to his second passion – films. Levinson worked as an ADR specialist on such films as The English Patient, The Talented Mr. Ripley, Cold Mountain, and The Rainmaker. While working on those films, he built up a friendship with noted film editor Walter Murch (The Conversation, Julia, Apocalypse Now, K-19: The Widowmaker). In addition, Levinson was writing screenplays and directing some of his own independent films (Prisoner of Time). This ultimately led him to combine his two interests and pursue Particle Fever, a documentary about the research, construction and goals of building the Large Hadron Collider.

When it came time to put the polish on his documentary, Mark Levinson tapped Walter Murch as the editor. Murch explained, “I was originally only going to be on the film for three months, because I was scheduled to work on another production after that. I started in March 2012, but the story kept changing with each breaking news item from the collider. And my other project went away, so in the end, I worked on the film for 15 months and just finished the mix a few weeks ago [June 2013].” At the start of the documentary project, the outcome of the research from the Large Hadron Collider was unknown. In fact, it wasn’t until later, during the edit, that the scientists achieved a major success with the confirmation of the discovery of the Higgs boson as an elementary particle in July 2012. This impacted science, but also the documentary in a major way.

Finding the story arc

Particle Fever is the first feature-length documentary that Walter Murch has edited, although archival and documentary footage has been part of a number of his films. He’d cut some films for the USIA early in his career and has advised and mixed a number of documentaries, including Crumb, about the controversial cartoonist Robert Crumb. Murch is fond of discussing the role of the editor as a participatory writer of the film in how he crafts the story through pictures and sound. Nowhere is this more true than in documentaries. According to Murch, “Particle Fever had a natural story arc by the nature of the events themselves. The machine [the Large Hadron Collider] provided the spine. It was turned on in 2008 and nine days later partly exploded, because a helium relief valve wasn’t strong enough. It was shut down for a year of repairs. When it was turned on again, it was only at half power and many of the scientists feared this was inadequate for any major discoveries. Nevertheless, even at half power, the precision was good enough to see the evidence that they needed. The film covers this journey from hope to disaster to recovery and triumph.”

Due to the cost of constructing large particle accelerators, a project like the Large Hadron Collider is a once-in-a-generation event. It is a seminal moment in science akin to the Manhattan Project or the moon launch. In this case, 10,000 scientists from 100 countries were involved in the goal of recreating the conditions just after the Big Bang and finding the Higgs boson, often nicknamed “the God particle”. Murch explained the production process, “Mark and David picked a number of scientists to follow and we told the story through their eyes without a narrator. They were equipped with small consumer cameras to self-record intermittent video blogs, which augmented the formal interviews. Initially Mark was following about a dozen scientists, but this was eventually narrowed down to the six that are featured in the film. The central creative challenge was to balance the events while getting to know the people and their roles. We also had to present enough science to understand what is at stake without overwhelming the audience. These six turned out to be the best at that and could convey their passion in a very charismatic and understandable way with a minimum of jargon.”

Murch continued, “Our initial cut was two-and-a-half hours, which was ultimately reduced to 99 minutes. We got there by cutting some people, but also some of the ‘side shoots’ or alternate research options that were explored. For example, there was a flurry of excitement related to what was thought to be discoveries of particles of ‘dark matter’ at a Minnesota facility. This covered about 20 minutes of the film, but in the final version there’s only a small trace of that material.”

Sifting to find the nuggets

As in most documentaries, the post team faced a multitude of formats and a wealth of material, including standard definition video recorded in 2007, the HDV files from the scientists’ “webcams” and Panasonic HD media from the interviews. In addition, there was a lot of PAL footage from the media libraries at CERN, the European particle accelerator. During the production, news coverage focused on the theoretical, though statistically unlikely, possibility that the Large Hadron Collider might have been capable of producing a black hole. This yielded even more source material to sift through. In total, the production team generated 300 hours of content and an additional 700 hours were available from CERN and the various news pieces produced about the collider.

Murch is known for his detailed editor’s codebook for scenes and dailies that he maintains for every film in a FileMaker Pro database. Particle Fever required a more streamlined approach. Murch came in at what initially appeared to be the end of the process, after Mona Davis (Fresh, Advise & Consent) had worked on the film. Murch said, “I started the process later into the production, so I didn’t initially use my FileMaker database. Mark was both the director and my assistant editor, so for the first few months I was guided by his knowledge of the material. We maintained two mirrored workstations with Final Cut Pro 7 and Mark would ingest any new material and add his markers for clips to investigate. When these bins were copied to my station, I could use them as a guide of where to start looking for possible material.”

Mapping the sound

The post team operated out of Gigantic Studios in New York, which enabled an interactive workflow between Murch and sound designer Tom Paul (on staff at Gigantic) and with composer Robert Miller. Walter Murch’s editorial style involves building up a lot of temporary sound effects and score elements during the rough cut phase and then, piece-by-piece, replacing those with finished elements as he receives them. His FCP sequence on Particle Fever had 42 audio tracks of dialogue, temp sound effects and music elements. This sort of interaction among the editor, sound designer and composer worked well with a small post team all located in New York City. By the time the cut was locked in May, Miller had delivered about an hour of original score for the film and supplied Murch with seven stereo instrumentation stems for that score to give him the most versatility in mixing.

Murch and Paul mixed the film on Gigantic’s Pro Tools ICON system. Murch offered this post trick, “When I received the final score elements from Robert, I would load them into Final Cut and then was able to copy-and-paste volume keyframes I had added to Robert’s temp music onto the final stems, ducking under dialogue or emphasizing certain dynamics of the music. This information was then automatically transferred to the Pro Tools system as part of the OMF output. Although we’d still adjust levels in the mix, embedding these volume shifts gave us a better starting point. We didn’t have to reinvent the wheel, so to speak. In the end, the final mix took four days. Long days!”

Gigantic Post offered the advantage of an on-site screening room, which enabled the producers to have numerous in-progress screenings for both scientific and industry professionals, as well as ordinary interested viewers. Murch explained, “It was important to get the science right, but also to make it understandable to the layman. I have more than a passing interest in the subject, but both Mark and David have Ph.D.s in particle physics, so if I ever had a question about something, all I had to do was turn around and ask. We held about 20 screenings over the course of a year and the scientists who attended our test screenings felt that the physics was accurate. But, what they also particularly liked was that the film really conveys the passion and experience of what it’s like to work in this field.” Final Frame Post, also in New York, handled the film’s grading and digital intermediate mastering.

Graphic enhancements

To help illustrate the science, the producers tapped MK12, a design and animation studio, which had worked on such films as The Kite Runner and Quantum of Solace. Some of the ways in which they expressed ideas graphically throughout the film could loosely be described as a cross between A Beautiful Mind and Carl Sagan’s PBS Cosmos series. Murch described one example, “For instance, we see Nima (one of our theorists) walking across the campus of the Institute for Advanced Study while we hear his voice-over. As he talks, formulas start to swirl all around him. Then the grass transforms into a carpet of number-particles, which then transform into an expanding universe into which Nima disappears. Eventually, this scene resolves and Nima emerges, returning to campus and walking into a building, the problematic formulas falling to the ground as he goes through the door.”

Although this was Walter Murch’s first feature documentary, his approach wasn’t fundamentally different from how he works on a dramatic film. He said, “Even on a scripted film, I try to look at the material without investing it with intention. I like to view dailies with the fresh-eyed sense of ‘Oh, where did this come from? Let’s see where this will take the story’.  That’s also from working so many years with Francis [Ford Coppola], who often shoots in a documentary style. The wedding scene in The Godfather, for instance; or the Union Square conversation in The Conversation; or any of the action scenes in Apocalypse Now all exemplify that. They are ongoing events, with their own internal momentum, which are captured by multiple cameras. I really enjoyed working on this film, because there were developments and announcements during the post which significantly affected the direction of the story and ultimately the ending. This made for a real roller coaster ride!”

Particle Fever premiered at Doc/Fest Sheffield on June 14th, and won the Audience Award (split with Act of Killing). It is currently in negotiations for distribution.

NOTE: The film will open in New York on March 5, 2014. In October 2013, Peter W. Higgs – who theorized about the boson particle named after him – was awarded the Nobel Prize in Physics, together with François Englert. For more on Walter Murch’s thoughts about editing, click here.

And finally, an interesting look at Murch’s involvement in the Rolex Mentor and Protégé program.

Originally written for Digital Video magazine

©2013 Oliver Peters

The NLE that wouldn’t die

It’s been 18 months since Apple launched Final Cut Pro X and the debate over it continues to rage without let-up. Apple likely has sales numbers good enough to deem it a success, but if you look around the professional world, with a few exceptions, there has been little or no adoption. Yes, some editors are dabbling with it to see where Apple is headed – and yes, some independent editors are using it for demanding projects, including commercials, corporate videos and TV shows. By comparison, though, look at what facilities and broadcasters are using – or what skills are required for job openings – and you’ll see a general scarcity of FCP X.

Let’s compare this to the launch of the original Final Cut Pro (or “legacy”) over 12 years ago. In a similar fashion, FCP was the stealth tool that attracted individual users. The obvious benefit was price. At that time a fully decked-out Avid Media Composer was a turnkey system costing over $100K, while FCP was available as software for only $999. Of course, what gets lost in that comparison is that the Avid price included the computer, monitors, wiring, broadcast I/O hardware and storage. All of this would have to be added to the FCP side and, in some cases, wasn’t even possible with FCP, which in the beginning was limited to DV and FireWire only. But FCP introduced some key advantages over Avid systems from the start. These included blend modes, easy in-timeline editing, After Effects-style effects and a media architecture built upon the open, extensible and ubiquitous QuickTime foundation. Over the years, a lot was added to make FCP a powerful system, but at its core, all the building blocks were in place from the beginning.

When uncompressed SD, and later HD, became the must-have items, Avid was slow to respond. Apple’s partners were able to take advantage of the hardware abstraction layer to add codecs and drivers, which expanded FCP’s capabilities. Vendors like Digital Voodoo, Aurora Video Systems and Pinnacle made it possible to edit something other than DV. Users have them to thank – more so than Apple – for growing FCP into a professional tool. By the time FCP 5 and 6 rolled around, the Final Cut world was well established, with major markets set to shift to FCP as the dominant NLE. HD, color correction and XML interchange had all been added and the package was expanded with an ecosystem of surrounding applications. By the time of the launch of the last Final Cut Studio (FCP 7) in 2009, Apple’s NLE seemed unstoppable. Unfortunately FCP 7 wasn’t as feature-packed as many had expected. Along with a reluctance to chuck recently purchased PowerMac G5 computers, a number of owners simply stayed with FCP 5 and/or FCP 6.

When Apple discusses the number of licensees, you have to parse how they define the actual purchases. While there are undoubtedly plenty of FCP X owners, the claim is that more seats of FCP X have been sold than of FCP 7. Unfortunately it’s hard to know what that really means. Since it’s a comparison to FCP 7 – and not every FCP 1-6 owner upgraded to 7 – it could very well be that the X number isn’t all that large. Even though Apple declared Final Cut Studio end-of-life (EOL) with the launch of FCP X, it continued to sell new seats of the software through its direct sales and reseller channels. In fact, Apple seems to still have it available if you call the correct 800 line. When Apple says it has sold more of X than of 7, is it counting the total sales (including those made after the launch) or only before? An interesting statistic would be the number of seats of Final Cut Studio (FCP 7) sold since the launch of FCP X as compared to before. We’ll never know, but it might actually be a larger number. All I know is that the system integrators I personally know, who have a long history of selling and servicing FCP-based editing suites, continue to install NEW FCP 7 rooms!

Like most drastic product changes, once you get over the shock of the new version, you quickly realize that your old version didn’t instantly stop working the day the new version launched. In the case of FCP 7, it continues to be a workhorse, though the 32-bit architecture is pretty creaky. Toss a lot of ProRes 4444 at it and you are in for a painful experience. There has been a lot of dissatisfaction with FCP X among facility owners, because it simply changes much of the existing workflows. There are additional apps and utilities to fill the gap, but many of these constitute workarounds compared to what could be done inside FCP 7.

Many owners have looked at alternatives. These include Adobe Premiere Pro, Avid Media Composer/Symphony, Media 100 and Autodesk Smoke 2013. If they are so irritated at Apple as to move over to Windows hardware, then the possibilities expand to include Avid DS, Grass Valley Edius and Sony Vegas. Several of these manufacturers have introduced cross-grade promotional deals to entice FCP “legacy” owners to make the switch. Avid and Adobe have benefited the most in this transition. Editors who were happy with Avid in the past – or work in a market where Avid dominates – have migrated back to Media Composer. Editors who were hoping for the hypothetical FCP 8 are often making Adobe Premiere (and the Production Premium bundle) their next NLE of choice. But ironically, many owners and users are simply doing nothing and continuing with FCP 7 or even upgrading from FCP 6 to FCP 7.

Why is it that FCP 7 isn’t already long gone or on the way out by now? Obviously the fact that change comes slowly is one answer, but I believe it’s more than that. When FCP 1.0 came on the scene, its interface and operational methodology fit into the existing NLE designs. It was like a “baby Avid” with parts of Media 100 and After Effects dropped in. If you cut on a Media Composer, the transition to FCP was pretty simple. Working with QuickTime made it easy to run on most personal machines without extra hardware. Because of its relatively open nature and reliance on industry-standard interchange formats (many of which were added over time), FCP could easily swap data with other applications using EDLs, OMFs, text-based log files and XML. Facilities built workflows around these capabilities.

FCP X, on the other hand, introduced a completely new editing paradigm that not only changed how you work, but even the accepted nomenclature of editing. Furthermore, the UI design even did things like reverse the behavior of some keystrokes from how similar functions had been triggered in FCP 7. In short, forget everything you know about editing or using other editing software if you want to become proficient with FCP X. That’s a viable concept for students who may be the professional editors of the future. Or, for non-fulltime editors who occasionally have to edit and finish professional-level productions as one small part of their job. Unfortunately, it’s not a good approach if you want to make FCP X the ubiquitous NLE in established professional video environments, like post houses, broadcasters and large enterprise users.

After all, if I’m a facility manager and you can’t show me a compelling reason why this is better and why it won’t require a complete internal upheaval, then why should I change? In most shops, overall workflow is far more important than the specific features of any individual application. Gone are the differences in cost, so it’s difficult to make a compelling argument based on ROI. You can no longer make the (false) argument of 1999 that FCP will only cost you 1% of the cost of an Avid. Or use the bogus $50K edit suite ad that followed a few years later.

Which brings us to the present. I started on Avid systems as the first NLE where I was in the driver’s seat. I’ve literally cut on dozens of edit systems, but for me, Final Cut Pro “legacy” fit my style and preferences best. I would have loved a 64-bit version with a cleaned-up user interface, but that’s not what FCP X delivers. It’s also not exactly where Premiere Pro CS6 is today. I deal with projects from the outside – either sent to me or at shops where I freelance. Apple FCP 7 and Avid Media Composer continue to be what I run into and what is requested.

Over the past few months I’ve done quite a few complex jobs on FCP X, when I’ve had the ability to control the decision. Yet, I cannot get through any complex workflow without touching parts of Final Cut Studio (“legacy”) to get the job done. FCP X seems to excel at small projects where speed trumps precision and interoperability. It’s also great for individual owner-operators who intend to do everything inside FCP X. But for complex projects with integrated workflows, FCP 7 is still decidedly better.

As was the case with early FCP, where most of the editing design was there at the start, I now feel that with the FCP X 10.0.6 update, most of its editing design is also in place. It may never become the tool that marches on to dominate the market. FCP “legacy” had that chance and Apple walked away from it. It’s dubious that lightning will strike twice, but 18 months is simply too short of a timeframe in which to say anything that definitive. All I know is that for now, FCP 7 continues as the preferred NLE for many, with Media Composer a close second. Most editors, like old dogs, aren’t too eager to learn new tricks. At least that’s what I conclude, based on my own ear-to-the-ground analysis. Check back this time next year to see if that’s still the case. For now, I see the industry continuing to live in a very fractured, multi-NLE environment.

©2012 Oliver Peters

Hemingway & Gellhorn

Director Philip Kaufman has a talent for telling a good story against the backdrop of history. The Right Stuff (covering the start of the United States’ race into space) and The Unbearable Lightness of Being (the 1968 Soviet invasion of Prague) made their marks, and now his latest, Hemingway & Gellhorn, continues that streak.

Originally intended as a theatrical film, but ultimately completed as a made-for-HBO feature, Hemingway & Gellhorn chronicles the short and tempestuous relationship between Ernest Hemingway (Clive Owen) and his third wife, Martha Gellhorn (Nicole Kidman). The two met in 1936 in Key West, traveled to Spain to cover the Spanish Civil War and were married in 1940. They lived in Havana and after four years of a difficult relationship were divorced in 1945. During her 60-year career as a journalist, Gellhorn was recognized as one of the best war correspondents of the last century. She covered nearly every conflict up to and including the U.S. invasion of Panama in 1989.

The film also reunited another team – Kaufman and film editor Walter Murch – who had last worked together on The Unbearable Lightness of Being. I recently spoke with Murch upon his return from the screening of Hemingway & Gellhorn at the Cannes Film Festival. He commented on the similarities of these projects, “I’ve always been attracted to the intersection of history and drama. I hadn’t worked with Phil since the 1980s, so I enjoyed tackling another film together, but I was also really interested in the subject matter. When we started, I really didn’t know that much about Martha Gellhorn. I had heard the name, but that was about it. Like most folks, I knew the legend and myth of Hemingway, but not really many of the details of him as a person.”

This has been Murch’s first project destined for TV, rather than theaters. He continued, “Although it’s an HBO film, we never treated it as anything other than a feature film, except that our total schedule, including shooting, was about six months long, instead of ten or more months. In fact, seeing the film in Cannes with an audience of 2,500 was very rewarding. It was the first time we had actually screened in front of a theatrical audience that large. During post, we had a few ‘friends and family’ screenings, but never anything with a formal preview audience. That’s, of course, standard procedure with the film studios. I’m not sure what HBO’s plans are for Hemingway & Gellhorn beyond the HBO channels. Often some of their films make it into theatrical distribution in countries where HBO doesn’t have a cable TV presence.”

Hemingway & Gellhorn was produced entirely in the San Francisco Bay area, even though it was a period film and none of the story takes place there. All visual effects were done by Tippett Studio, supervised by Christopher Morley, which included placing the actors into scenes using real archival footage. Murch explained, “We had done something similar in The Unbearable Lightness of Being. The technology has greatly improved since then, and we were able to do things that would have been impossible in 1986. The archival film footage quality was vastly different from the ARRI ALEXA footage used for principal photography. The screenplay was conceived as alternating between grainless color and grainy monochrome scenes to juxtapose the intimate events in the lives of Hemingway and Gellhorn with their presence on the world stage at historical events. So it was always intended for effect, rather than trying to convince the audience that there was a completely continuous reality. As we got into editing, Phil started to play with color, using different tinting for the various locations. One place might be more yellow and another cool or green and so on. We were trying to be true to the reality of these people, but the film also has to be dramatic. Plus, Phil likes to have fun with the characters. There must be balance, so you have to find the right proportion for these elements.”

The task of finding the archival footage fell to Rob Bonz, who started a year before shooting. Murch explained, “An advantage you have today that we didn’t have in the ‘80s is YouTube. A lot of these clips exist on-line, so it’s easier to research what options you might have. Of course, then you have to find the highest quality version of what you’ve seen on-line. In the case of the events in Hemingway & Gellhorn, these took place all over the world, so Rob and his researchers were calling all kinds of sources, including film labs in Cuba, Spain and Russia that might still have some of these original nitrate materials.”

This was Walter Murch’s first experience working on a film recorded with an ARRI ALEXA. The production recorded 3K ARRIRAW files using the Codex recorder and it was then the editorial team’s responsibility to convert these files for various destinations, including ProRes LT (1280 x 720) for the edit, H.264 for HBO review and DPX sequences for the DI. Murch was quite happy with the ALEXA’s image. He said, “Since these were 3K frames we were able to really take advantage of the size for repositioning. I got so used to doing that with digital images, starting with Youth Without Youth, that it’s now just second nature. The ALEXA has great dynamic range and the image held up well to subtle zooms and frame divisions. Most repositionings and enlargements were on the order of 125% to 145%, but there’s one blow-up at 350% of normal.”

In addition to Bonz, the editorial team included Murch’s son Walter (first assistant editor) and David Cerf (apprentice). Walter Murch is a big proponent of using FileMaker Pro for his film editor’s code book and explained some of the changes on this film. “Dave really handled most of the FileMaker jiu-jitsu. It works well with XML, so we were able to go back and forth between FileMaker Pro and Final Cut Pro 7 using XML. This time our script supervisor, Virginia McCarthy, was using ScriptE, which also does a handshake with FileMaker, so that her notes could be instantly integrated into our database. Then we could use this information to drive an action in Final Cut Pro – for instance, the assembly of dailies reels. FileMaker would organize the information about yesterday’s shooting, and then an XML out of that data would trigger an assembly in Final Cut, inserting graphics and text as needed in between shots. In the other direction, we would create visibility-disabled slugs on a dedicated video track, tagged with scene information about the clips in the video tracks below. Outputting XML from Final Cut would create an instantaneous continuity list with time markers in FileMaker.”
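
As a rough sketch of the FileMaker-to-Final Cut direction Murch describes, the snippet below turns tabular dailies records into a skeletal “xmeml” file using only Python’s standard library. The records, field names and structure here are hypothetical and heavily simplified – a real FCP 7 XML import requires many more elements (rate, timecode, file references and so on):

```python
# Sketch: database-style dailies records -> minimal FCP 7 "xmeml" XML.
# All record contents and the pared-down element set are illustrative.
import xml.etree.ElementTree as ET

dailies = [
    {"name": "Scene 12A Take 3", "duration": 240, "in": 0, "out": 240},
    {"name": "Scene 12B Take 1", "duration": 180, "in": 0, "out": 180},
]

xmeml = ET.Element("xmeml", version="5")
sequence = ET.SubElement(xmeml, "sequence")
ET.SubElement(sequence, "name").text = "Dailies Assembly"
media = ET.SubElement(sequence, "media")
video = ET.SubElement(media, "video")
track = ET.SubElement(video, "track")

# One clipitem per database record, laid end to end on the track.
for rec in dailies:
    clip = ET.SubElement(track, "clipitem")
    ET.SubElement(clip, "name").text = rec["name"]
    ET.SubElement(clip, "duration").text = str(rec["duration"])
    ET.SubElement(clip, "in").text = str(rec["in"])
    ET.SubElement(clip, "out").text = str(rec["out"])

print(ET.tostring(xmeml, encoding="unicode"))
```

The same round trip works in reverse: FCP 7’s exported XML parses back into rows that a database like FileMaker can ingest, which is what made the continuity-list trick possible.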

The way Walter Murch organizes his work is a good fit for Final Cut Pro 7, which he used on Hemingway & Gellhorn and continues to use on a current documentary project. In fact, at a Boston FCP user gathering, Murch showed one of the most elaborate screen grabs of an FCP timeline that you can imagine. He takes full advantage of the track structure to incorporate temporary sound effects and music cues, as well as updated final music and effects.

Another trick he mentioned to me was something he referred to as a QuickTime skin. Murch continued, “I edit with the complete movie on the timeline, not in reels, so I always have the full cut in front of me. I started using this simple QuickTime skin technique with Tetro. First, I export the timeline as a self-contained QuickTime file and then re-import the visual. This is placed on the upper-most video track, effectively hiding everything below. As such, it’s like a ‘skin’ that wraps the clips below it, so the computer doesn’t ‘see’ them when you scroll back and forth. The visual information is now all at one location on a hard drive, so the system isn’t bogged down with unrendered files and other clutter. When you make changes, then you ‘razor-blade’ through the QuickTime and pull back the skin, revealing the ‘internal organs’ (the clips that you want to revise) below – thus making the changes like a surgeon. Working this way also gives a quick visual overview of where you’ve made changes. You can instantly see where the skin has been ‘broken’ and how extensive the changes were. It’s the visual equivalent of a change list. After a couple of weeks of cutting, on average, I make a new QuickTime and start the process over.”

Walter Murch is currently working on a feature documentary about the Large Hadron Collider. Murch, in his many presentations and discussions on editing, considers the art part plumbing (knowing the workflow), part performance (instinctively feeling the rhythm and knowing, in a musical sense, when to cut) and part writing (building and then modifying the story through different combinations of picture and sound). Editing a documentary is certainly a great example of the editor as writer. His starting point is 300 hours of material following three theorists and three experimentalists over a four-year period, including the catastrophic failure of the accelerator nine days after it was turned on for the first time. Murch, who has always held a love and fascination for the sciences, is once again at that intersection of history and drama.

Click here to watch the trailer.

(And here’s a nice additional article from the New York Times.)

Originally written for Digital Video magazine (NewBay Media, LLC).

©2012 Oliver Peters

NAB 2012 – Adobe CS6, Smoke 2013, Thunderbolt and more

Get some coffee, sit back and take your time reading this post. I apologize for its length in advance, but there’s a lot of new hardware and software to talk about. I’m going to cover my impressions of NAB along with some “first looks” at Adobe Creative Suite 6, Smoke 2013 and Thunderbolt i/o devices. There’s even some FCP X news!

_________________________________________________

Impressions of NAB 2012

I thought this year was going to be quiet and laid back. Boy, was I wrong! Once again Blackmagic Design stole the spotlight with democratized products. This year the buzz had to be the Blackmagic Cinema Camera. It delivers on the objective of the original RED Scarlet idea. It’s a $3K camera with 2.5K resolution and 13 stops of dynamic range. I’ll leave the camera discussions to the camera guys, but suffice it to say that this camera was thought up with post in mind. That is – no new, proprietary codec. It uses ProRes, DNxHD or CinemaDNG (the Adobe-developed open raw format). It also includes a copy of Resolve and UltraScope with the purchase.

Along with that news was Blackmagic’s re-introduction of the Teranex processors. Prior to that company’s acquisition by Blackmagic Design, the top-of-the-line Teranex image processor loaded with options was around $90K. Now that Grant Petty’s wizards have had a go at it, the newest versions in a nicely re-designed form factor are $2K for 2D and $4K for 3D. Sweet. And if you think free (or close to it) stifles R&D, take a look at the new, cleaned-up DaVinci Resolve 9.0 interface. Great to see that the development continues.

You’ll note that there was a lot of buzz about 4K cameras, but did you notice that the image needs to be recorded to something? Enter AJA – not with a camera – but with the Ki Pro Quad. That’s right – a 4K version of the Ki Pro Mini, already designed with Canon’s C500 4K camera in mind. It records 4K ProRes 4444 files. AJA is also building its Thunderbolt portfolio with T-Tap, a monitoring-only Thunderbolt-to-SDI/HDMI output adapter under $250. More on Thunderbolt devices later in this post.

The NLE news was dominated by Adobe’s reveal of Creative Suite 6 (with Premiere Pro CS6) and Autodesk’s re-designed Smoke 2013. Avid’s news was mainly broadcast and storage-related, since Media Composer version 6 had been launched months before. Although that was old news to the post crowd, it was the first showing for the software at NAB. Nevertheless, to guarantee some buzz, Avid announced a short-term Symphony cross-grade deal that lasts into June. FCP (excluding X), Media Composer and Xpress Pro owners can move into Symphony for $999. If you are an Avid fan, this is a great deal and is probably the best bang-for-the-buck NLE available if you take advantage of the cross-grade.

An interesting sidebar is that both FilmLight and EyeOn are developing plug-in products for Avid software. FilmLight builds the Baselight color correction system, which was shown and recently released in plug-in form for FCP 7. Now they are expanding that to other hosts, including Nuke and Media Composer, under the product name Baselight Editions. EyeOn’s Fusion software is probably the best and fastest feature-film-grade compositor available on Windows. EyeOn is using Connection (a software bridge) to send Media Composer/Symphony or DS timeline clips to Fusion, which permits both applications to stay open. In theory, if you bought Symphony and added Baselight and Fusion, the combination becomes one of the most powerful NLEs on the market. All at under $5K with the current cross-grade!

Autodesk has been quite busy redesigning its Smoke NLE for the Mac platform. Smoke 2013 features a complete Mac-centric overhaul to turn it into an all-in-one “super editor” that still feels comfortable for editors coming from an FCP or Media Composer background. See my “first look” section below.

Quantel, who often gets lost in these desktop NLE discussions, showed the software-only version of Pablo running on a tweaked PC. It uses four high-end NVIDIA cards for performance and there’s also a new, smaller Neo Nano control surface. Although pricing is lower, at $50K for the software alone, it’s still the premium brand.

There’s been plenty of talk about “editing in the cloud”, but in my opinion, there were three companies at the show with viable cloud solutions for post: Avid, Quantel and Aframe. In 2010 Avid presented a main stage technology preview that this year has started to come to fruition as Interplay Sphere. The user in the field is connected to his or her home base storage and servers over various public networks. The edit software is a version of the NewsCutter/Media Composer interface that can mix local full-res media with proxy media linked to full-res media at the remote site. When the edit is done, the sequence list is “published” to the server and the local full-res media (trimmed clips only) is uploaded back to the home base. The piece is conformed and rendered by the server at home. Seems like the branding line should be, “Replace your microwave truck with a Starbucks!”

The company with a year of real experience “in the cloud” at the enterprise level is Quantel with Qtube. It’s a similar concept to Avid’s, but has the advantage of tying in multiple locations remotely. Media at the home base can also be searched and retrieved in formats that work for other NLEs, including Media Composer and Final Cut.

An exciting newcomer is Aframe. They are a British company founded by the former owner of Unit, one of Europe’s largest professional post facilities built around FCP and Xsan. Aframe is geared toward the needs of shows and production companies more so than broadcast infrastructures. The concept uses a “private cloud” (i.e. not Amazon servers) with an interface and user controls that feel a lot like a mash-up between Vimeo and Xprove. Full-res media can be uploaded in several ways, including via regional service centers located around the US. There’s full metadata support and the option to use Aframe’s contracted logging vendor if you don’t want to create metadata yourself. Editors cut with proxy media and then the full-res files are conformed via EDLs and downloaded when ready. Pricing plans are an attractive per-seat, monthly structure that start with a free, single seat account.

Apple doesn’t officially do trade shows anymore, but they were at NAB, flying under the radar. In a series of small, private meetings with professional customers and media, Apple was making their case for Final Cut Pro X. Rome wasn’t built in a day and the same can be said for re-building a dominant editing application from the ground up. Rather than simply put in the same features as the competition, Apple opted to take a fresh look, which has created much “Sturm und Drang” in the industry. Nevertheless, Apple was interested in pointing out the adoption by professional users and the fact that it has held an above-50% market share with new NLE seats sold to professional users during 2011. You can parse those numbers any way you like, but they point to two facts: a) people aren’t changing systems as quickly as many vocal forum posts imply, and b) many users are buying FCP X and seeing if and how it might work in some or all of their operation.

FCP X has already enjoyed several quick updates in less than a year, thanks to the App Store mechanism. There’s a robust third-party developer community building around X. In fact, walking around the NAB floor, I saw at least a dozen booths that displayed FCP X in some fashion to demonstrate their own product or use it as an example of interoperability between their product and X. Off the top of my head, I saw or heard about FCP X at Autodesk, Quantel, AJA, Blackmagic Design, Matrox, MOTU, Tools On Air, Dashwood and SONY – not to mention others, like resellers and storage vendors. SONY has announced new XDCAM plug-ins for X and compatibility of its XDCAM Browser software. Dashwood Cinema Solutions was showing the only stereo3D package that’s ready for Final Cut Pro X. And of course, we can’t live without EDLs, so developer XMiL Workflow Tools (which wasn’t exhibiting at NAB) has also announced EDL-X, an FCP XML-to-EDL translator, expected to be in the App Store by May.
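
To see why a dedicated translator like EDL-X is worth building, remember that a CMX3600-style EDL is a rigid, fixed-column text format that has to be emitted exactly. Here’s a minimal sketch of formatting one event; the frame rate, reel name and column spacing are assumptions for illustration, not the precise CMX specification:

```python
# Sketch: formatting a single CMX3600-style EDL event from clip data.
# Column spacing here is an approximation of the fixed-width layout.
def tc(frames, fps=30):
    """Convert a frame count to HH:MM:SS:FF timecode."""
    ff = frames % fps
    ss = (frames // fps) % 60
    mm = (frames // (fps * 60)) % 60
    hh = frames // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

def edl_event(num, reel, src_in, src_out, rec_in, fps=30):
    """One video cut event: source in/out mapped onto the record timeline."""
    dur = src_out - src_in
    return (f"{num:03d}  {reel:<8} V     C        "
            f"{tc(src_in, fps)} {tc(src_out, fps)} "
            f"{tc(rec_in, fps)} {tc(rec_in + dur, fps)}")

# A five-second cut landing at 01:00:00:00 on the record side.
print(edl_event(1, "AX", 0, 150, 108000))
```

Walking an FCP XML tree and emitting one of these lines per clip is exactly the kind of mechanical translation a small utility handles well.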

On the Apple front, the biggest news was another peek behind the curtain at some of the features to be included in the next FCP X update, coming later this year. These include multichannel audio editing tools, dual viewers, MXF plug-in support and RED camera support. There are no details beyond these bullet points, but you can expect a lot of other minor enhancements as part of this update.

“Dual viewers” may be thought of as “source/record” monitors – added by Apple, thanks to user feedback. Apple was careful to point out to me that they intended to do a bit more than just that with the concept. “RED support” also wasn’t defined, but my guess would be that it’s based on the current Import From Camera routine. I would imagine something like FCP 7’s native support of RED media through Log and Transfer, except better options for bringing in camera raw color metadata. Of course, that’s purely speculation on my part.

Now, sit back and we’ll run through some “first looks”.

 _________________________________________________

Adobe Creative Suite 6 – A First Look

Adobe charged into 2012 with a tailwind of two solid years of growth on the Mac platform and heavy customer anticipation for what it plans to offer in Creative Suite 6. The CS5 and CS5.5 releases were each strong in their own right and introduced such technologies as the Mercury Playback Engine for better real-time performance, but in 2011 Adobe clearly ramped up its focus on video professionals. They acquired the IRIDAS SpeedGrade technology and brought the developers of Automatic Duck on board. There have been a few sneak peeks on the web, including a popular video posted by Conan O’Brien’s Team Coco editors, but the wait for CS6 ended with this year’s NAB.

Production Premium

Adobe’s video content creation tools may be purchased individually, through a Creative Cloud subscription or as part of the Master Collection and Production Premium bundles. Most editors will be interested in CS6 Production Premium, which includes Prelude, Premiere Pro, After Effects, Photoshop Extended, SpeedGrade, Audition, Encore, Adobe Media Encoder, Illustrator, Bridge and Flash Professional. Each of these applications has received an impressive list of new features and it would be impossible to touch on every one here, so look for a more in-depth review at a future date. I’ll quickly cover some of the highlights.

Prelude

As part of CS6, Adobe is introducing Prelude, a brand new product designed for footage acquisition, ingest/transcode, organization, review and metadata tagging. It’s intended to be used by production assistants or producers as an application to prepare the footage for an editor. Both Prelude and Premiere Pro now feature “hover scrubbing” – the ability to scan through footage quickly by moving the mouse over the clip thumbnail, which can be expanded as large as a mini-viewer. Clips can be marked, metadata added and rough cuts assembled, which in turn are sent to Premiere Pro. There is a dynamic reading of metadata between Prelude and Premiere Pro. Clip metadata changes made in one application are updated in the other, since the information is embedded into the clip itself. Although Prelude is included with the software collection for single users, it can be separately purchased in volume by enterprise customers, such as broadcasters and news organizations.

Premiere Pro

A lot of effort was put into the redesign of Premiere Pro. The user interface has been streamlined and commands and icons were adjusted to be more consistent with both Apple Final Cut Pro (“legacy” versions) and Avid Media Composer. Adobe took input from users who have come from both backgrounds and wanted to alter the UI in a way that was reasonably familiar. The new CS6 keyboard shortcuts borrow from each, but there are also full FCP and full MC preset options. Workspaces have been redesigned, but an editor can still call up CS5.5 workspace layouts with existing projects to ease the transition. A dockable timecode window has been added and Adobe has integrated a dynamic trimming function similar to that of Media Composer.

The changes are definitely more than cosmetic, though, as Adobe has set out to design a UI that never forces you to stop. This means you can now do live updates to effects and even open other applications without the timeline playback ever stopping. They added Mercury Playback acceleration support for some OpenCL cards and there’s a new Mercury Transmit feature for better third-party hardware i/o support across all of the video applications. Many new tools have been added, including a new multi-camera editor with an unlimited number of camera angles. Some more features have been brought over from After Effects, including adjustment layers and the Warp Stabilizer that was introduced with CS5.5. This year they’ve broken out the rolling shutter repair function as a separate tool. Use it for quick HDSLR camera correction without the need to engage the full Warp Stabilizer.

SpeedGrade

By adding a highly-regarded and established color grading tool, Adobe has strengthened the position of Production Premium as the primary application suite for video professionals. The current level of integration is a starting point, given the short development time that was possible since last September. Expect this to expand in future versions.

SpeedGrade works as both a standalone grading application, as well as a companion to the other applications. There’s a new “Send to SpeedGrade” timeline export operation in Premiere Pro. When you go into SpeedGrade this way, an intermediate set of uncompressed DPX files is first rendered as the source media to be used by SpeedGrade. Both applications support a wide range of native formats, but they aren’t all the same, so this approach offers the fewest issues for now, when working with mixed formats in a Premiere sequence. In addition, SpeedGrade can also import EDLs and relink media, which offers a second path from Premiere Pro into SpeedGrade. Finished, rendered media returns to Premiere as a single, flattened file with baked-in corrections.

As a color correction tool, SpeedGrade presents an easy workflow – enabling you to stack layers of grading onto a single clip, as well as across the entire timeline. There are dozens of included LUTs and looks presets, which may be used for creative grading or to correct various camera profiles. An added bonus is that both After Effects and Photoshop now support SpeedGrade Look files.

Audition

With CS5.5, Adobe traded out Soundbooth for a cross-platform version of Audition, Adobe’s full-featured DAW software. In CS6, that integration has been greatly improved. Audition now sports an interface more consistent with After Effects and Premiere, newly added Mackie and Avid Eucon control surface protocol support and mixing automation. The biggest feature demoed in the sneak peeks has been the new Automatic Speech Alignment tool. You can take overdubbed ADR lines and automatically align them for near-perfect sync to replace the on-camera dialogue. All of this is thanks to the technology behind Audition’s new real-time, high-quality audio stretching engine.

Audition also gains a number of functions specific to audio professionals. Audio CD mastering has been added back into the program and there’s a new pitch control spectral display. This can be used to alter the pitch of a singer, as well as a new way to create custom sound design. Buying Production Premium gives you access to 20GB of downloadable audio media (sound effects and music scores) formerly available only via the online link to Adobe’s Resource Central.

After Effects

Needless to say, After Effects is the Swiss Army knife of video post. From motion graphics to visual effects to simple format conversion, there’s very little that After Effects isn’t called upon to do. Naturally there’s plenty new in CS6. The buzz feature is a new 3D camera tracker, which uses a point cloud to tightly track an object that exhibits size, position, rotation and perspective changes. These are often very hard for traditional 2D point trackers to follow – for example, the hood of a car moving towards the camera at an angle.

Now for the first time in After Effects, you can build extruded 3D text and vector shapes using built-in tools. This includes surface material options and a full 3D ray tracer. In general, performance has been greatly improved through a better hand-off between RAM cache and disk cache. As with Premiere Pro, rolling shutter repair is now also available as a separate tool in After Effects.

Photoshop

Photoshop has probably had the most online sneak peeks of any of the new Adobe apps. It has been available as a public beta since mid-March. Photoshop, too, sports a new interface, but that’s probably the least noteworthy of the new features. These include impressive new content-aware fill functions, 3D LUT support (including SpeedGrade Look files) and better auto-correction. There’s better use of GPU horsepower, which means common tasks like Liquify are accelerated.

Photoshop has offered the ability to work with video as a single file for several versions. With CS6 it gains expanded video editing capabilities, enabled by a new layer structure akin to that used in After Effects. Although Premiere Pro or After Effects users probably won’t do much with it, Adobe is quite cognizant that many of its photography customers are increasingly asked to deal with video – thanks, of course, to HD-video-enabled DSLRs, like the Canon EOS series. Integrating video editing and layering tools into Photoshop lets these customers deliver a basic video project while working inside the application environment where they are most comfortable. Video editors gain the benefit of having it there if they want to use it. Some may, in fact, develop their own innovative techniques once they investigate what it can do for them.

Adobe Creative Suite 6 offers a wealth of new features, expanded technologies and a set of brand new tools. It’s one of Adobe’s largest releases ever and promises to attract new interest from video professionals.

Click here for updated price and availability information.

Click here for videos that explain CS6 features.

Plus, a nice set of tutorial videos here.

 _________________________________________________

Autodesk Smoke 2013 – A First Look

Thanks to the common Unix underpinnings of Linux and Mac OS X, Autodesk Media & Entertainment was able to bring its advanced Smoke editor to the Mac platform in December of 2009 as an unbundled software product. The $15K price tag was a huge drop from that of their standard, turnkey Linux Smoke workstations, but still hefty for the casual user. Nevertheless, thanks to an aggressive trial and academic policy, Autodesk was very successful in getting plenty of potential new users to download and test the product. In the time since the launch on the Mac, Autodesk has had a chance to learn what Mac-oriented editors want and adjust to the feedback from these early adopters.

Taking that user input to heart, Autodesk introduced the new Smoke 2013 at NAB. This is an improved version that is much more “Mac-like”. Best of all it’s now available for $3,495 plus an optional annual subscription fee for support and software updates. Although this is an even bigger price reduction, it places Smoke in line with Autodesk’s animation product family (Maya, Softimage, etc.) and in keeping with what most Mac users feel is reasonable for a premium post production tool. Smoke 2013 will ship in fall, but the new price took effect at NAB. Any new and existing customers on subscription will receive the update as part of their support. Tutorials and trial versions of Smoke 2013 are expected to be available over the summer.

More Mac-like

Autodesk was successful in attracting a lot of trial downloads, but realized that the biggest hurdle was the steep learning curve even expert Final Cut and Media Composer editors encountered. Previous Mac versions of Smoke featured a user interface and commands inherited from the Linux versions of Smoke and Flame, which were completely different from any Mac editing application. Just getting media into the system baffled many. With Smoke 2013, Autodesk has specifically targeted editors who come from an Apple Final Cut Pro and/or Avid Media Composer background. The interface uses a standard, track-based editing workflow to maintain the NLE environment that editors are comfortable with. There’s a familiar Mac OS X menu bar at the top and the application has adopted most of the common OS commands. In short, it’s been redesigned – but not “re-imagined” – to act like a Mac application is supposed to.

Smoke now features a tab structure to quickly switch between modes, like media access, editing, etc. The biggest new tool is the Media Hub. This is an intelligent media browser that lets you easily access any compatible media on any of your hard drives. It recognizes native media formats, as opposed to simply browsing all files in the Finder. Media support includes RED, ARRIRAW, ProRes, DNxHD, H.264, XDCAM, image sequences, LUTs and more. Media Hub is the place to locate and import files, including the ability to drag-and-drop media directly into your Smoke library, as well as from the Finder into Smoke. Settings for formats like RED (debayer, color, etc.) are maintained even when you drag from the Finder. Since Smoke is designed as a finishing tool, you can also import AAF, XML (FCP 7, FCP X, Premiere Pro) and EDL lists generated by offline editors.

ConnectFX

Beyond familiar commands and the Media Hub, the editing interface has been redesigned to be more visually appealing and for the easier application of effects. ConnectFX is a method to quickly apply and modify effects right in the timeline. Tabbed buttons let you change between modes, such as resizing, time warps, Sparks filter effects and color correction. When you choose to edit effects parameters, the interface opens a ribbon above the timeline where you can alter numerical settings or enter a more advanced effects editing interface. If you need more sophistication, then move to nodes using ConnectFX. Smoke is the only editor with a node-based compositor that works in 3D space. You get many of the tools that have been the hallmark of the premium Autodesk system products, such as effects process nodes, the Colour Warper, relighting, 3D tracking and more.

Smoke 2013 is positioned as an integrated editing and effects tool. According to Autodesk’s research, editors who use a mixture of several different tools to get the job done – from editing to effects to grading – often use up to seven different software applications. Smoke is intended as a “super editor” that places all of these tools and tasks into a single, comprehensive application with a cohesive interface. The design is intended to maximize the workflow as an editor moves from editing into finishing.

Lighter system requirements

Apple is changing the technology landscape with more powerful personal workstations, like the iMac, which doesn’t fit the traditional tower design. Thunderbolt adds advanced, high-bandwidth connectivity for i/o and storage in a single cable connection.

To take advantage of these changes, Smoke 2013 has been designed to run on this new breed of system. For example, it will work on a newer MacBook Pro or iMac, connected to fast Thunderbolt storage, like a Promise Pegasus RAID array. A key change has been in the render format used by Smoke. Up until now, intermediate renders have been to uncompressed RGB 4:4:4 DPX image sequence files. While this maintains maximum quality, it quickly eats storage space and is taxing on less powerful machines. Rendering to an uncompressed RGB format is generally overkill if your camera originals started as some highly-compressed format like XDCAM or H.264. Now Smoke 2013 offers the option to render to compressed formats, such as one of the Apple ProRes codecs.
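
Some back-of-envelope numbers show why this matters. Assuming 1080p24 material, 10-bit DPX packing three channels into 4 bytes per pixel, and Apple’s published ~176 Mb/s target rate for ProRes 422 HQ at 1080p24 (all approximations):

```python
# Rough storage comparison: uncompressed DPX sequences vs. ProRes 422 HQ.
# Figures are approximations for a 1080p24 timeline.
W, H, FPS = 1920, 1080, 24

dpx_bytes_per_frame = W * H * 4                 # 10-bit RGB packed into 32 bits/pixel
dpx_gb_per_hour = dpx_bytes_per_frame * FPS * 3600 / 1e9

prores_hq_mbps = 176                             # Apple's published 1080p24 target
prores_gb_per_hour = prores_hq_mbps / 8 * 3600 / 1000

print(f"Uncompressed DPX: {dpx_gb_per_hour:.0f} GB/hour")
print(f"ProRes 422 HQ:    {prores_gb_per_hour:.0f} GB/hour")
print(f"Savings: roughly {dpx_gb_per_hour / prores_gb_per_hour:.0f}x")
```

Roughly 700 GB per hour of renders versus under 80 GB – the difference between needing a SAN and getting by on a Thunderbolt RAID.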

Another welcome change is the ability to use some of the newer Thunderbolt i/o devices. Smoke on a Mac Pro tower has been able to work with AJA KONA 3G cards, but with Smoke 2013, AJA’s new Io XT has been added to the mix. The Io XT is an external unit with most of the features and power of the KONA card. It connects in the Thunderbolt chain with storage and/or a secondary display and is the only current Thunderbolt i/o device with a loop-through connection. Thus it isn’t limited to being at the end of the chain.

While at NAB, I took a few minutes to see how comfortable this new version felt. I’ve been testing Smoke 2012 at home and quite frankly had some of the same issues other FCP and Media Composer editors have had. It has been a very deep program that required a lot of relearning before you could feel comfortable. When I sat down in front of Smoke 2013 in the NAB pod, I was able to quickly work through some effects without any assistance, primarily based on what seemed logical to me in a “standard” NLE approach. I’m not going to kid you, though. To do advanced effects still requires a learning curve, but editors do plenty of in-timeline effects that never require extensive compositing. When I compare doing this type of work in Smoke 2013 versus 2012, I’d say that the learning requirements have been cut by 60% to 75% with this new version. That’s how much the redesign improves things for beginners.

You can start from scratch editing a project strictly on Smoke 2013, but in case you are wondering, this really shouldn’t be viewed as a complete replacement for FCP 7. Instead, it’s the advanced product used to add the polish. As such, it becomes an ideal companion for a fast application used for creative cutting, like Final Cut Pro, Premiere Pro or Media Composer.

Apple’s launch of Final Cut Pro X was a disruptive event that challenged conventional thinking. Autodesk Media & Entertainment’s launch of Smoke 2013 might not cause the same sort of uproar, but it brings a world-class finishing application to the Mac at a price that is attractive to many individual users and small boutiques.

Click here for videos and tutorials about Smoke.

Click here for Autodesk’s NAB videos.

 _________________________________________________

Thunderbolt I/O Devices – A First Look

Over the years media pros have seen data protocols come and go. Some, like Fibre Channel, are still current fixtures, while others, such as SCSI, have bitten the dust. The most exciting new technology is Thunderbolt, which is a merger of PCI Express and DisplayPort technologies co-developed by Intel and Apple. Started under the code name of Light Peak, the current implementation of Thunderbolt is a bi-directional protocol that passes power, video display signals and data transfer at up to 10Gbps of throughput in both directions. According to Apple, that’s up to twelve times faster than FireWire 800. It’s also faster than Fibre Channel, which tends to be the protocol of choice in larger facilities. Peripherals can access ten watts of power through Thunderbolt, too. Like SCSI and FireWire, Thunderbolt devices can be daisy-chained with special cables. Up to six devices can be connected in series, but certain devices have to be at the end of the chain. This is typically true when a PCIe-to-Thunderbolt adapter is used.
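
Those bandwidth claims check out in round numbers. These are nominal link rates, not real-world throughput, and the Fibre Channel figure assumes the 8Gb FC common in facilities at the time:

```python
# Nominal link-rate comparison (not sustained real-world throughput).
thunderbolt_gbps = 10.0    # per direction, per channel
firewire800_gbps = 0.8
fibre_channel_gbps = 8.0   # assuming 8Gb FC

print(f"vs FireWire 800:      {thunderbolt_gbps / firewire800_gbps:.1f}x")
print(f"vs 8Gb Fibre Channel: {thunderbolt_gbps / fibre_channel_gbps:.2f}x")
```

10 Gbps against FireWire 800’s 800 Mbps works out to 12.5x, which is where Apple’s “up to twelve times faster” figure comes from.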

A single signal path can connect the computer to external storage, displays and capture devices, which provides editors with a powerful data protocol in a very small footprint. Thunderbolt technology is currently available in Apple iMac, MacBook Air, MacBook Pro and Mini computers and is starting to become available on some Windows systems. It is not currently available as a built-in technology on Mac Pros, but you can bet that if there’s a replacement tower, Thunderbolt will be a key part of the engineering design.

By its nature, Thunderbolt dictates that peripheral devices are external units. All of the processing horsepower of a PCIe card, such as a KONA or Decklink, is built into the circuitry of an external device, which is connected via the Thunderbolt cable to the host computer. I tested three Thunderbolt capture/output devices for this review: AJA Io XT, Blackmagic Design UltraStudio 3D and Matrox MXO2 LE MAX. AJA added the monitoring-only T-Tap at NAB to join the Io XT in AJA’s Thunderbolt line-up. Blackmagic Design has developed four Thunderbolt units at different price tiers. For smaller installations or mobile environments, the UltraStudio Express, Intensity Shuttle Thunderbolt or Intensity Extreme are viable solutions.

Matrox has taken a different approach by using an adapter. Any of its four MXO2 products – the standard MXO2, Mini, LE or Rack – can be used with either Thunderbolt or non-Thunderbolt workstations. Simply purchase the unit with a Thunderbolt adapter, PCIe card and/or ExpressCard/34 laptop adapter. The MXO2 hardware itself is the same; only the connection method differs, which gives maximum flexibility. The fourth company making Thunderbolt capture devices is MOTU. Their HDX-SDI was not available in time for this review, but I did have a chance to play with one briefly on the NAB show floor.

Differentiating features

All three of the tested units include up/down/cross-conversion between SD and HD formats and perform in the same fashion as their non-Thunderbolt siblings. Each has pros and cons that will appeal to various users with differing needs. For instance, the AJA Io XT is the only device with a Thunderbolt pass-through connector. The other units have to be placed at the end of a Thunderbolt path. They all support SDI and HDMI capture and output, as well as RS-422 VTR control. Both the AJA and Blackmagic units support dual-link SDI for RGB 4:4:4 image capture and output. The Matrox and AJA units use a power supply connected via a four-pin XLR, which makes it possible to operate them in the field on battery power.

The need to work with legacy analog formats or monitoring could determine your choice. This capability represents the biggest practical difference among the three. Both the MXO2 LE and UltraStudio 3D support analog capture and output, while there’s only analog output from the Io XT. The MXO2 LE uses standard BNC and XLR analog connectors (two audio channels on the LE, but more with the MXO2 or Rack), but the other two require a cable harness with a myriad of small connectors. That harness is included with the Blackmagic unit, but with AJA, you need to purchase an optional DB-25 Tascam-style cable snake for up to eight channels of balanced analog audio.

One unique benefit of the Matrox products is the optional MAX chip for accelerated H.264 processing. In my case, I tested the MXO2 LE MAX, which includes the embedded chip. When this unit is connected to a Mac computer, Apple Compressor, Adobe Media Encoder, Avid Media Composer, Telestream Episode and QuickTime perform hardware-accelerated encodes of H.264 files using the Matrox presets.

Fitting into your layout

I ran the Io XT, UltraStudio 3D and MXO2 LE through their paces connected to a friend’s new, top-of-the-line Apple iMac. All three deliver uncompressed SD or HD video over the Thunderbolt cable to the workstation. Processing to convert this signal to an encoded ProRes or DNxHD format will depend on the CPU. In short, recording a codec like ProRes4444 will require a fast machine and drives. I haven’t specifically tested it, but I suspect this task would challenge a Mac Mini using only internal drives!
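To put some rough numbers on why capture taxes the CPU and drives, here’s a quick sketch of the uncompressed rate the Thunderbolt device delivers versus the sustained write rate a ProRes capture demands. Figures are nominal (real payloads with v210 packing, audio and overhead differ), and the 330 Mbps ProRes 4444 target rate is Apple’s published figure for 1080p at 29.97:

```python
# Uncompressed 10-bit 4:2:2 HD: Y at full resolution plus Cb/Cr at half
# resolution works out to 20 bits per pixel (nominal, before v210 packing).
WIDTH, HEIGHT, FPS = 1920, 1080, 29.97
BITS_PER_PIXEL_422_10BIT = 20

uncompressed_gbps = WIDTH * HEIGHT * BITS_PER_PIXEL_422_10BIT * FPS / 1e9
print(f"Uncompressed 10-bit 4:2:2 1080p29.97: {uncompressed_gbps:.2f} Gbps")

# Apple's target rate for ProRes 4444 at 1080p29.97, converted to the
# sustained drive throughput the capture must maintain.
PRORES_4444_MBPS = 330
prores_write_mb_s = PRORES_4444_MBPS / 8
print(f"ProRes 4444 sustained write rate: about {prores_write_mb_s:.0f} MB/s")
```

So the incoming signal is on the order of 1.2 Gbps, well within Thunderbolt’s 10Gbps budget, but the encode plus a sustained ~41 MB/s write is what separates a fast workstation with a RAID from a laptop on internal drives.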

The test-bed iMac workstation was configured with a Promise Pegasus 6-drive RAID array. The iMac includes two Thunderbolt ports and the Pegasus array offers a pass-through, so I was able to test these units both directly connected to the iMac, as well as daisy-chained onto the Promise array. This system would still allow the connection of more Thunderbolt storage and/or a secondary computer monitor, such as Apple’s 27″ Thunderbolt Display. Most peripheral manufacturers do not automatically supply cables, so plan on purchasing extra Thunderbolt cables ($49 for a six-foot cable from Apple).

These units work with most of the current crop of Mac OS X-based NLEs; however, you may need to choose a specific driver or software set to match the NLE you plan to operate. For instance, AJA requires a separate additional driver to be installed for Premiere Pro or Media Composer, which is provided for maximum functionality with those applications. The same is true for Matrox and Media Composer. I ran tests with Final Cut Pro 7, X and Premiere Pro CS 5.5, but not Media Composer 6, although they do work fine with that application. Only the Blackmagic Design products, like the UltraStudio 3D, will work with DaVinci Resolve. In addition to drivers, the software installation includes application presets and utility applications. Each build includes a capture/output application, which lets you ingest and lay off files through the device, independent of any editing application.

Broadcast monitoring and FCP X

The biggest wild card right now is performance with Final Cut Pro X. Broadcast monitoring was a beta feature added in the 10.0.3 update. With the release of 10.0.4 and compatible drivers, most performance issues have stabilized and this is no longer considered beta. Separate FCP X-specific drivers may need to be installed depending on the device.

If you intend to work mainly with Final Cut Pro “legacy” or Premiere Pro, then all of these units work well. On the other hand, if you’ve taken the plunge for FCP X, I would recommend the Io XT. I never got the MXO2 LE MAX to work with FCP X (10.0.3) during the testing period, and initially the UltraStudio 3D wouldn’t work either, until the later version 9.2 drivers that Blackmagic posted mid-March. Subsequent re-testing with 10.0.4, along with checking these units at NAB, indicates that both the Blackmagic and Matrox units work well enough. There are still some issues when you play at fast-forward speeds, where the viewer and external monitor don’t stay in sync with each other. I also checked the MOTU HDX-SDI device with FCP X in their NAB booth. Performance seemed similar to that of Matrox and Blackmagic Design.

The Io XT was very fluid and tracked FCP X quite well as I skimmed through footage. FCP X does not permit control over playback settings, so you have to set that in the control panel application (AJA) or system preference pane (Blackmagic Design and Matrox) and relaunch FCP X after any change. The broadcast monitoring feature in FCP X does not add any new VTR control or ingest capability and it’s unlikely that it ever will. To ingest videotape footage for FCP X using Io XT or UltraStudio, you will have to use the separate installed capture utility (VTR Xchange or Media Express, respectively) and then import those files from the hard drive into FCP X. Going the other direction requires that you export a self-contained movie file and use the same utility to record that file onto tape. The Matrox FCP X drivers and software currently do not include this feature.

Finally, the image to the Panasonic professional monitor I was using in this bay matched the FCP X viewer image on the iMac screen using either the Io XT or UltraStudio 3D. That attests to Apple’s accuracy claims for its ColorSync technology.

Performance with the mainstream NLEs

Ironically, the best overall performance came from the end-of-life Final Cut Pro 7. In fact, all three units were incredibly responsive on this iMac/Promise combo. For example, when you use a Mac Pro with any FireWire or PCIe-connected card or device, energetic scrubbing or playing files at fast-forward speeds will result in the screen display and the external output quickly going out of sync with each other. When I performed the same functions on the iMac, the on-screen and external output stayed in sync with each of these three units. No amount of violent scrubbing caused it to lose sync. The faster data throughput of Thunderbolt enabled a more pleasant editing experience.

I ran these tests using both a direct run from the iMac’s second Thunderbolt port, as well as looped from the back of the Promise array. Neither connection seemed to make much difference in performance with ProRes and AVCHD footage. I believe that you get the most data throughput when you are not daisy-chaining devices; however, I doubt you’ll see much difference under standard editing operation.

The best experience with Premiere Pro was using the Matrox MXO2 LE MAX, although the experience with the AJA and Blackmagic Design devices was fine, too. This stands to reason, as Matrox has historically had a strong track record developing for Adobe systems with custom cards, such as the Axio board set. Matrox also installs a high-quality MPEG-2 I-frame codec for use as an intermediate preview codec. This is an alternative to the QuickTime codecs installed on the system.

Portions of this entry originally written for Digital Video Magazine.

©2012 Oliver Peters

FCP X tools, Part 5 – filter suites

Ever since Red Giant Software introduced Magic Bullet Looks, a growing trend in effects packages has been filter suites. These install as plug-ins, which are built around presets that can be previewed and adjusted in a separate browser application launched from the filter interface. A representative frame has to be sent from the host application to the preset browser in order to preview the desired look or effect with your image instead of a template image. This was a missing element at the launch of Final Cut Pro X, but has been fixed with the 10.0.3 update. Two suite sets are currently available for FCP X – Magic Bullet Looks 2 and GenArts Sapphire Edge.

Magic Bullet Looks 2 may be purchased separately or as part of the Looks Suite and is available as a plug-in for a variety of hosts, including other NLEs, After Effects, Motion and photo applications, such as Photoshop, Aperture and Lightroom. In FCP X, you access the Magic Bullet Looks browser application through the on-screen overlay button. Once launched, controls are the same as when used with any of the other hosts. Looks itself is a collection of filters that mimic various tools used in production and post, such as diffusion from a lens filter or color correction used in post. A newly-added tool is Cosmo, which is a video noise cleaning effect that’s ideal for smoothing skin textures.

Looks are created by stringing together a chain of filters into a single effect. Any of these can be freely modified within the interface. A wealth of presets is installed with the plug-in. Any custom looks you’ve created yourself and saved are added to your library. Additional “guru” packages of presets may be purchased from Red Giant Software. Any preset look that has been installed or custom looks that you’ve created are available to other hosts, as well. For example, you could create a look in FCP X and have it available in After Effects, too. Once you build and apply a look and return to FCP X, you still have the ability to mask the area where the effect is visible from the effects control panel. Naturally, the look you’ve created can be copied and pasted to another clip without relaunching the custom browser interface.

A new competitor in the world of effects suites is GenArts with Sapphire Edge. The 10.0.3 update also enabled it to work with FCP X. Sapphire filters have always been extremely powerful, high-quality effects, but some users might find the control options daunting to dial in just the right effect. They are available for nearly every editing and compositing host, but tend to be among the pricier offerings in the market. These two factors led GenArts to develop Sapphire Edge, which is designed around presets selected via a preview browser. The effects collection is based on four Sapphire filter styles – Film Damage, Film Style, Lens Flare and TV Damage. The package installs plug-ins into both the effects and transitions palettes of FCP X. Edge is less expensive than a full set of Sapphire plug-ins, thus more in line with the price structure of FCP X itself.

The Sapphire Edge preview application is launched from the effects pane and, like MB Looks, opens with the reference frame that you were parked on. Unlike MB Looks, you can apply a preset look, but you can’t tweak it or save a new preset from inside the browser. Since there are a lot of presets with Edge, the preview browser lets you search by genre, mood or style. When you return to FCP X, though, there are a number of sliders in the control pane to make adjustments to the application of the effect, which means you have latitude to customize it to your liking.

Purchasing Sapphire Edge includes a year’s subscription to FX Central, the GenArts online preset store. Edge installs a collection of over 350 presets, but additional monthly collections may be downloaded and installed with the FX Central subscription. I’m not really wild about this model, but it is one that GenArts is applying to other products, notably Sapphire 6. Frankly, I prefer working with the standard Sapphire 5 filters in After Effects over using the Edge browser in FCP X. On the plus side, the browser approach lets you check out a lot of looks more quickly than you could ever do by tweaking sliders using the regular filter set. These are two different approaches for different mindsets, and the Sapphire Edge preset approach is a good fit for FCP X.

One notable missing developer in the FCP X effects suites mix is Digital Film Tools. Their Film Stocks, Photo Copy and Tiffen Dfx packages are superb and available in most of the hosts, except FCP X. I feel that the implementation of FxPlug in X is a bit flaky, so some filters show up in the X palette, even when they don’t yet work with X. On my system, PhotoCopy shows up in the effects palette, complete with preset options, but it doesn’t work in X. The DFT filters use a separate browser application to preview, adjust and apply filters, just like MB Looks and Sapphire Edge, so I presume they have similar issues integrating into the FCP X effects architecture. Ironically, some of these show up in Motion, but not FCP X, although performance in Motion is not terribly stable either. Maybe we’ll see that in a future update.

The good news is that we now have more options from some of the most popular suppliers, but the bad news is that performance is poor. All of the suite tools are taxing on the system and don’t play back particularly well in real-time without rendering. I also found some issues with interaction. For example, if you test out certain filters in Motion ahead of applying Magic Bullet Looks, it will fail to load. That can only be fixed with a relaunch of the app. As I’ve mentioned all along, Apple’s own built-in filters and the ones people have created as Motion templates are the smoothest when left unrendered. I tend to treat these complex look effects as icing – best saved for last. I’m not big on using these as the sole place to do grading work. Tackle it that way and the rendering hit won’t be much of an issue.

UPDATE – The FCP X 10.0.4 version released on 4/10/12 “broke” MB Looks and Sapphire Edge. Both GenArts and Red Giant Software released updates on 4/13/12 to correct this issue. Please make sure you download the update if you run into any issues.


©2012 Oliver Peters

FCP X tools, Part 3 – color grade effects

I’ve posted numerous entries about using various filters and tools to accomplish color correction and grading. I’ve also taken a look at how to use the color board in FCP X. Now it’s time to see what tools are out there if you just don’t feel comfortable with the color board.

Final Cut Pro X effects resources have quickly started to fill out. Noise Industries, GenArts, Red Giant and CoreMelt are some of the popular names that have been able to modify their FxPlug filters to work with FCP X, as well as Motion, FCP 7 and After Effects. In FCP X, filter parameters are built on slider controls, which has made it difficult to implement some of the popular filters used for color grading. With the latest updates, developers have been able to start taking advantage of user interface overlays to add controls for wheels and curves, which are important tools in working with color.

Noise Industries has been adding to its roster of partners and many of these have added welcome color grading features. Nattress and Sheffield are two that come from the early FCP days and FxScript development. The Nattress curves package and Sheffield Softworks Vintage and Looks Sweet 2 packages have been updated for FxPlug and are now available for all the supported FxFactory hosts, including FCP X. Looks Sweet 2 is a stylizing package that affects lighting and vibrancy in an image, while Vintage duplicates Technicolor processes.

Nattress Curves and Levels adds a badly-needed tool to FCP X. As the name implies, you have control over curves ranging from luma-only to full RGB. The adjustments can be made using sliders or the on-screen curves display. With the recent update, the filters have been tweaked to work with video recorded in standard as well as log-adjusted gamma.

Two veteran FxFactory filter partners are PHYX and DVShade. The PHYX Color and PHYX Stylist packages offer easy control over a set of Technicolor, Bleach Bypass and other types of stylizing and lighting effects.

DVShade’s EasyLooks is an incredibly powerful filter that works entirely off of sliders. In addition to three-way color correction based on split-toning concepts, you can add diffusion, gradients, vignettes and more. The installation includes a set of presets for an easy starting point.

A new FxFactory tool from Yanobox is Moods, which is a color-wheel based grading filter. Like Nattress Curves, you have the option of displaying the controls overlay, complete with “help card” labels if you like. Grab a wheel and make balance, black-wash and brightness/exposure/gamma adjustments. You can start from scratch or apply one of the supplied presets.

The Red Giant Magic Bullet Looks Suite is generally installed for Looks (more on that in an upcoming post), but it also includes Mojo. That’s a separate filter based on the “orange and teal” feature film look. It’s simple to use and works just with sliders. Like some of the others, the installation adds a set of presets, too.

Last, but not least, is Tonalizer|VFX from Irudis. It comes in a PRO and Lite (free) version and works with FCP 7 and X, Motion 4 and 5 and Final Cut Express. It doesn’t run in After Effects or other NLEs, though. The PRO version includes a separate filter with built-in optimization for footage shot with HDSLRs using the Technicolor CineStyle camera profile. The design of Tonalizer|VFX is much like the slider controls in Adobe Lightroom and, in fact, Irudis markets it as offering photographic-style grading. Two big selling points are highlight and shadow recovery. It’s easy to use, works well and a big plus for FCP X is that it still plays well in real-time when left unrendered.

Below are a series of before and after images using some of these filters.

Tonalizer|VFX used to improve definition and vibrancy in the image.

Nattress Curves with a slight s-curve adjustment.

Nattress Curves RGB can be used to alter the balance of the image.

Yanobox Moods set to change the balance to a more sunset-like look.

Sheffield Softworks Looks Sweet 2 Super Glam filter to stylize the image.

Magic Bullet Mojo for the “blockbuster” look.

A combination of PHYX Color and Stylist filters for a bit of over-the-top punch.


DVShade EasyLooks for a cool, heavily saturated and tinted appearance.

©2012 Oliver Peters