Final Cut “Studio 2014”


A few years ago I wrote some posts about Final Cut Pro as a platform and designing an FCP-centric facility. Those options have largely been replaced by an Adobe approach built around Creative Cloud. Not everyone has warmed up to Creative Cloud. Either they don’t like the software, they dislike the rental model, or they simply don’t need much of the power offered by the various Adobe applications.

If you are looking for alternatives to a Creative Cloud-based production toolkit, then it’s easy to build your own combination from some very inexpensive solutions. Most of these are either Apple software or applications sold through the Mac App Store. As with all App Store purchases, you buy the product once and get updates for free, as long as it continues to be sold as the same product. Individual users may install the apps onto as many Mac computers as they personally own and control, all for the one purchase price. With this in mind, it’s very easy for most editors to create a powerful bundle that’s equal to or better than the old Final Cut Studio bundle – for less than what Final Cut Studio cost at full retail back in the day.

The one caveat to all of this is how entrenched you may or may not be with Adobe products. If you need to open and alter complex Illustrator, Photoshop, After Effects or Premiere Pro project files, then you will absolutely need Adobe software to do it. In that case, maybe you can get by with an older version (CS6 or earlier) or maybe trial software will work. Lastly, you could outsource to a colleague with Adobe software or simply pick up a Creative Cloud subscription on a month-by-month basis. On the other hand, if you don’t absolutely need to interact with Adobe project files, then these solutions may be all you need. I’m not trying to advocate for one over the other, but rather to add some ideas to think about.

Final Cut Pro X / Motion / Compressor

The last Final Cut Studio bundle included FCP 7, Motion, Compressor, Cinema Tools, DVD Studio Pro, Soundtrack Pro and Color. The current Apple video tools of Final Cut Pro X, Motion and Compressor cover all of the video bases, including editing, compositing, encoding, transcoding and disc burning. The latter may be a bone of contention for many – since Apple has largely walked away from the optical disc world. Nevertheless, simple one-off DVDs and Blu-ray discs can still be created straight from FCP X or Compressor. Of course, FCP X has been a mixed bag for editors, with plenty of evangelists and haters on both sides. If you square off Premiere Pro against Final Cut Pro X, then it really boils down to tracks versus trackless. Both tools get the job done. Which one do you prefer?

Motion versus After Effects is a tougher call. If you are a power user of After Effects, then Motion may seem foreign and hard to use. If the focus is primarily on motion graphics, then you can certainly get the results you want in either. There is no direct “send to” from FCP X to Motion, but on the plus side, you can create effects and graphics templates using Motion that will appear and function within FCP X. Just like with After Effects, you can also buy stock Motion templates for graphics, show opens and other types of design themes and animations.

Logic Pro X

Logic Pro X is the DAW in our package. It becomes the replacement for Soundtrack Pro and the alternative to Adobe Audition or Avid Pro Tools. It’s a powerful music creation tool, but more importantly for editors, it’s a strong single-file and multitrack audio production and post production application. You can get FCP X files to it via FCPXML or AAF (converted using X2Pro). There are a ton of plug-ins and mixing features that make Logic a solid DAW. I won’t dive deeply into this, but suffice it to say that if your main interest in using Logic is to produce a better mix, then you can learn the essentials quickly and get up and running in short order.

DaVinci Resolve

Every decent studio bundle needs a powerful color correction tool. Apple Color is gone, but Blackmagic Design’s DaVinci Resolve is a best-of-breed replacement. You can get the free Resolve Lite version through the App Store, as well as from Blackmagic’s website. It does most of what you need, so for most editors who do some color correction, there’s little reason to buy the paid version.

Resolve 11 (due out soon) adds improved editing. There is a solid synergy with FCP X, making it not only a good companion color corrector, but also a finishing editorial tool. OFX plug-ins are supported, which adds a choice of industry standard creative effects if you need more than FCP X or Motion offer.

Pixelmator / Aperture

This one’s tough. Of all the Adobe applications, Photoshop and Illustrator are the hardest to replace. There are no perfect alternatives. On the other hand, most editors don’t need all that power. If direct feature compatibility isn’t a requirement, then you’ve got some choices. One of these is Pixelmator, a very lightweight image manipulation tool. It’s a little like Photoshop in its version 4-7 days, with a bit of Illustrator tossed in. There are vector drawing and design tools and it’s optimized for Core Image, complete with a nice set of image filters. However, it does not include some of Photoshop CC’s power-user features, like smart objects, smart filters, 3D, layer groups and video manipulation. But, if you just need to doctor some images, extract or modify logos or translate various image formats, Pixelmator might be the perfect fit. For more sophistication, other choices include Corel’s Painter (not in the App Store) and Adobe Photoshop Elements (which is available in the App Store).

Although Final Cut Studio never included a photo application, the Creative Cloud does include Lightroom. Since the beginning, Apple’s Aperture and Adobe’s Lightroom have been leapfrogging each other with features. Aperture hasn’t changed much in a few years and is likely the next pro app to get the “X” treatment from Apple’s engineers. Photographers have the same type of “Chevy vs. Ford” arguments about Aperture and Lightroom as editors do about NLEs. Nevertheless, editors deal a lot with supplied images and Aperture is a great tool for organization, clean-up and image manipulation.

Other

The list I’ve outlined creates a nice set of tools, but if you need to interchange with other pros using a variety of different software, then you’ll need to invest in some “glue”. There are a number of utilities designed to go to and from FCP X. Many are available through the App Store. Examples include Xto7, 7toX, EDL-X, X2Pro, Shot Notes X, Lumberjack and many others.

For a freewheeling discussion about this topic and other matters, check out my conversation with Chris Fenwick at FCPX Grille.

©2014 Oliver Peters

Inside Llewyn Davis

df_ild_01

Fans of Joel and Ethan Coen’s eclectic brand of filmmaking should be thrilled with their latest effort, Inside Llewyn Davis. The story follows Llewyn Davis, a struggling folk singer in the Greenwich Village folk scene around 1960 – just before Bob Dylan’s early career there. Davis is played by Oscar Isaac, who most recently appeared in The Bourne Legacy. The story was inspired by the life of musician Dave Van Ronk, as chronicled in the book “The Mayor of MacDougal Street”. Although this is the Coen brothers’ most recent release, the film was actually produced in 2012 in true indie filmmaking fashion – without any firm commitment for distribution. It was picked up by CBS Films earlier this year.

The Coen brothers tackle post with a workflow that is specific to them. I had a chance to dig into that world with Katie McQuerrey, who is credited as an additional editor on Inside Llewyn Davis. McQuerrey started with the Coen brothers as they transitioned into digital post, helping to adapt their editorial style to Apple Final Cut Pro. For many of their films, she’s worn a number of hats – helping to coordinate the assistant editors, acting as a conduit to other departments and, in general, serving as another set of eyes, ears and brain while Ethan and Joel are cutting their films.

McQuerrey explained, “Ethan and Joel adapted their approach from how they used to cut on film. Ethan would pull selects from film workprint on a Moviola and then Joel would assemble scenes from these selects using a KEM. With Final Cut Pro, they each have a workstation and these are networked together. No fancy SAN management. Just Apple file sharing and a Promise storage array for media. Ethan will go through a project, review all the takes, make marks, add markers or written notes and pass it over to Joel. Ethan doesn’t actually assemble anything to a timeline. He’s only working within the bins of the broader project. All of the timeline editing of these scenes is then done by Joel.” (Although there’s been press about the Coen brothers planning to use Adobe Premiere Pro in the future, this film was still edited using Apple Final Cut Pro 7.)

Inside Llewyn Davis was filmed on 35mm over the course of a 45-day production in 2012. It wrapped on April 4th and was followed by a 20 to 24-week post schedule, ending in a final mix by the end of September. Technicolor in New York provided lab and transfer services for the production. They scanned all of the raw 35mm negative once to 2K DPX files and performed a “best light” color correction pass on those files for dailies. In addition, Technicolor also synced the sound from the mono mix of production mixer Peter Kurland’s location recordings. These were delivered to the editorial team as synced ProRes files.

McQuerrey said, “Ethan and Joel don’t cut during the shooting. That doesn’t start until the production wraps. Inside Llewyn Davis has a look for many of the scenes reminiscent of the era. [Director of photography] Bruno Delbonnel worked closely with [colorist] Peter Doyle to establish a suggested look during the dailies. These would be reviewed on location in a production trailer equipped with a 50” Panasonic plasma that Technicolor had calibrated. Once the film was locked, then Technicolor conformed the DPX files and Bruno, Ethan and Joel supervised the DI mastering of the film. Peter graded both the dailies and the final version using a [Filmlight] Baselight system. Naturally, the suggested look was honed and perfected in the final DI.”

Inside Llewyn Davis is about a musician and music is a major component of the film. The intent was to be as authentic as possible. McQuerrey continued, “There was no lip-syncing to the playback of a recorded music track. Peter [Kurland] recorded all of these live on set and that’s what ended up in the final mix. For editing, if we ever needed to separate tracks, then we’d go back to Peter’s broadcast wave file multi-track recordings, bring those into Final Cut and create ‘merged clips’ that were synced. Since Ethan and Joel’s offices are in a small building, the assistants had a separate cutting room at Post Factory in New York. We mirrored the media at both locations and I handled the communication between the two offices. Often this was done using Mac screen sharing between the computers.”

The Coen brothers approach their films in a very methodical fashion, so editing doesn’t present the kinds of challenges that might be the case with other directors. McQuerrey explained, “Ethan and Joel have a very good sense of script time to film time. They also understand how the script will translate on screen. They’ll storyboard the entire film, so there’s no improvisation for the editor to deal with. Most scenes are filmed with a traditional, single-camera set-up. This film was within minutes of the right length at the first assembly, so most of the editorial changes were minor trims and honing the cut. No significant scene lifts were made. Joel’s process is usually to do a rough cut and then a first cut. Skip Lievsay, our supervising sound editor, will do a temp mix in [Avid] Pro Tools. This cut with the temp mix will be internally screened for ‘friends and family’, plus the sound team and visual effects department. We then go back through the film top to bottom, creating a second cut with another temp mix.”

“At this stage, some of the visual effects shots have been completed and dropped into the cut. Then there’s more honing, more effects in place and finally another temp mix in 5.1 surround. This will be output to D5 for more formal screenings. Skip builds temp mixes that get pretty involved, so each time we send OMF files and change lists. Sound effects and ADR are addressed at each temp mix. The final mix was done in five days at Sony in Los Angeles with Skip and Greg Orloff working as the re-recording mixers.”

Even the most organized production includes some elements that are tough to cut. For Inside Llewyn Davis, this was the cross-country driving sequence that covers about one-and-a-half reels of the film. It includes another Coen favorite, John Goodman. McQuerrey described, “The driving scenes were all shot as green-screen composites. There are constantly three actors in the car, plus a cat. It’s always a challenge to cut this type of scene, because you are dealing with the continuity from take to take of all three actors in a confined space. The cat, of course, is less under anyone’s control. We ‘cheated’ that a bit using seamless split-screens to composite the shots in a way that the cat was in the right place. All of the windows had to be composited with the appropriate background scenery.”

“The most interesting part of the cut was how the first and last scenes were built. The beginning of the movie and the ending are the same event, but the audience may not realize at first that they are back at the beginning of the story. This was filmed only one time, but each scene was edited in a slightly different way, so initially you aren’t quite sure if you’ve seen this before or not. Actions in the first scene are abbreviated, but are then resolved with more exposition at the end.”

Originally written for Digital Video magazine

©2013 Oliver Peters

The NLE that wouldn’t die II


With echoes of Monty Python in the background, two years on, Final Cut Pro 7 and Final Cut Studio are still widely in use. As I noted in my post from last November, I still see facilities with firmly entrenched and mature FCP “legacy” workflows that haven’t moved to another NLE yet. Some were ready to move to Adobe until they learned subscription was the only choice going forward. Others maintain a fanboy’s faith in Apple that the next version will somehow fix all the things they dislike about Final Cut Pro X. Others simply haven’t found the alternative solutions compelling enough to shift.

I’ve been cutting all manner of projects in FCP X since the beginning and am currently using it on a feature film. I augment it in lots of ways with plug-ins and utilities, so I’m about as deep into FCP X workflows as anyone out there. Yet, there are very few projects in which I don’t touch some aspect of Final Cut Studio to help get the job done. Some fueled by need, some by personal preference. Here are some ways that Studio can still work for you as a suite of applications to fill in the gaps.

DVD creation

There are no more version updates to Apple’s (or Adobe’s) DVD creation tools. FCP X and Compressor can author simple “one-off” discs using their export/share/batch functions. However, if you need a more advanced, authored DVD with branched menus and assets, DVD Studio Pro (as well as Adobe Encore CS6) is still a very viable tool, assuming you already own Final Cut Studio. For me, the need to do this has been reduced, but not completely gone.

Batch export

Final Cut Pro X has no batch export function for source clips. This is something I find immensely helpful. For example, many editorial houses specify that their production company client supply edit-friendly “dailies” – especially when final color correction and finishing will be done by another facility or artist/editor/colorist. This is a throwback to film workflows and is most often the case with RED and ALEXA productions. Certainly a lot of the same processes can be done with DaVinci Resolve, but it’s simply faster and easier with FCP 7.

In the case of ALEXA, a lot of editors prefer to do their offline edit with LUT-corrected, Rec 709 images, instead of the flat, Log-C ProRes 4444 files that come straight from the camera. With FCP 7, simply import the camera files, add a LUT filter like the one from Nick Shaw (Antler Post), enable TC burn-in if you like and run a batch export in the codec of your choice. When I do this, I usually end up with a set of Rec 709 color, ProResLT files with burn-in that I can use to edit with. Since the file name, reel ID and timecode are identical to the camera masters, I can easily edit with the “dailies” and then relink to the camera masters for color correction and finishing. This works well in Adobe Premiere Pro CC, Apple FCP 7 and even FCP X.
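This kind of dailies pass doesn’t have to live inside an NLE at all. Below is a minimal sketch of the same idea – apply a viewing LUT, transcode to ProRes LT and carry the source timecode across so the dailies can be relinked later – using Python to drive ffmpeg. It’s my own illustration, not the FCP 7 batch export described above; the folder names and the LUT file are assumptions, and reel IDs and timecode burn-in are left out for brevity.

```python
#!/usr/bin/env python3
"""Rough sketch of an "edit-friendly dailies" pass: bake a Rec 709 viewing LUT
into Log-C camera masters, transcode to ProRes LT and re-embed the source start
timecode so the dailies can later be relinked to the camera originals.

Assumptions (not from the article): ffmpeg/ffprobe are installed, the camera
clips are .mov files in SOURCE_DIR, and logc_to_rec709.cube is a LUT you have."""
import subprocess
from pathlib import Path

SOURCE_DIR = Path("camera_masters")   # hypothetical folder of Log-C ProRes 4444 clips
DAILIES_DIR = Path("dailies")         # output folder for Rec 709 ProRes LT files
LUT_FILE = "logc_to_rec709.cube"      # hypothetical Log-to-Rec 709 3D LUT

def source_timecode(clip: Path) -> str:
    """Read the embedded start timecode (if any) with ffprobe."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet",
         "-show_entries", "format_tags=timecode:stream_tags=timecode",
         "-of", "default=noprint_wrappers=1:nokey=1", str(clip)],
        capture_output=True, text=True).stdout.split()
    return out[0] if out else "00:00:00:00"

def make_daily(clip: Path) -> None:
    tc = source_timecode(clip)
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(clip),
         "-vf", f"lut3d={LUT_FILE}",              # bake in the viewing LUT
         "-c:v", "prores_ks", "-profile:v", "1",  # profile 1 = ProRes LT
         "-c:a", "copy",
         "-timecode", tc,                         # re-embed the original start TC
         str(DAILIES_DIR / clip.name)],
        check=True)

if __name__ == "__main__":
    DAILIES_DIR.mkdir(exist_ok=True)
    for clip in sorted(SOURCE_DIR.glob("*.mov")):
        make_daily(clip)
```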

Timecode and reel IDs

When I work with files from the various HDSLRs, I prefer to convert them to ProRes (or DNxHD) and add timecode and reel ID info. In my eyes, this makes the file professional video media that’s much more easily dealt with throughout the rest of the post pipeline. I have a specific routine for doing this, but when some of these steps fail due to a file error, I find that FCP 7 is a good back-up utility. From inside FCP 7, you can easily add reel IDs and also modify or add timecode. This metadata is embedded into the actual media file and is readable by other applications.

Log and Transfer

Yes, I know that you can import and optimize (transcode) camera files in FCP X. I just don’t like the way it does it. The FCP 7 Log and Transfer module allows the editor to set several naming preferences upon ingest, including custom names and reel IDs. That metadata is then embedded directly into the QuickTime movie created by the Log and Transfer module. FCP X doesn’t embed name and ID changes into the media file, but rather into its own database. Consequently, this information is not transportable by simply reading the media file within another application. As a result, when I work with media from a C300, for example, my first step is still Log and Transfer in FCP 7 before I start editing in FCP X.
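If you ever want to check which of this metadata actually lives inside a QuickTime file – as opposed to only in an NLE’s database – ffprobe will show you. Here’s a quick sketch, assuming ffmpeg/ffprobe is installed; the tag names you’ll see (timecode, reel_name and so on) vary by camera and by the application that wrote the file.

```python
"""Dump the metadata tags that are actually embedded in a media file, as seen by
ffprobe. Anything printed here travels with the file itself, unlike entries that
exist only in an NLE's internal database."""
import json
import subprocess
import sys

def embedded_tags(path: str) -> dict:
    probe = json.loads(subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_format", "-show_streams", path],
        capture_output=True, text=True).stdout)
    tags = dict(probe.get("format", {}).get("tags", {}))   # container-level tags
    for stream in probe.get("streams", []):                 # plus per-stream tags
        tags.update(stream.get("tags", {}))
    return tags

if __name__ == "__main__":
    # Look for entries like 'timecode' or 'reel_name' in the output.
    for key, value in embedded_tags(sys.argv[1]).items():
        print(f"{key}: {value}")
```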

Conform and reverse telecine

A lot of cameras offer the ability to shoot at higher frame rates with the intent of playing the footage at a slower frame rate for a slow motion effect – “overcranking” in film terms. Advanced cameras like the ALEXA, RED One, EPIC and Canon C300 write a timebase reference into the file that tells the NLE that a file recorded at 60fps is to be played at 23.98fps. This is not true of HDSLRs, like a Canon 5D, 7D or a GoPro. You have to tell the NLE what to do. FCP X only does this through its Retime effect, which means you are telling the file to be played as slomo, thus requiring a render.

I prefer to use Cinema Tools to “conform” the file. This alters the file header information of the QuickTime file, so that any application will play it at the conformed, rather than recorded frame rate. The process is nearly instant and when imported into FCP X, the application simply plays it at the slower speed – no rendering required. Just like with an ALEXA or RED.
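The arithmetic behind a conform is simple: the frames are untouched and only the intended playback rate changes. The sketch below just works out what happens to a clip’s duration, with a hedged command-line analogue noted in a comment. It is not what Cinema Tools does internally – that tool rewrites the QuickTime header – so treat it purely as an illustration.

```python
"""Sketch of the arithmetic behind a frame rate conform: the frames stay the same,
only the intended playback rate changes. Example values only."""

def conform(duration_sec: float, shot_fps: float, play_fps: float):
    """Return (new duration, slow-motion factor) when a clip shot at shot_fps
    is conformed to play at play_fps."""
    factor = shot_fps / play_fps          # e.g. 59.94 / 23.976 = 2.5x slow motion
    return duration_sec * factor, factor

if __name__ == "__main__":
    new_len, factor = conform(10.0, 59.94, 23.976)
    print(f"A 10 s clip shot at 59.94 fps runs {new_len:.1f} s "
          f"({factor:.2f}x slow motion) once conformed to 23.976 fps.")
    # A rough command-line analogue (not Cinema Tools' header rewrite) would be to
    # rescale timestamps while stream-copying, e.g.:
    #   ffmpeg -itsscale 2.5 -i clip.mov -an -c copy conformed.mov
```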

Another function of Cinema Tools is reverse telecine. If a camera file was recorded with built-in “pulldown” – sometimes called 24-over-60 – additional redundant video fields are added to the file. You want to remove these if you are editing in a native 24p project. Cinema Tools will let you do this and in the process render a new, 24p-native file.

Color correction

I really like the built-in and third-party color correction tools for Final Cut Pro X. I also like Blackmagic Design’s DaVinci Resolve, but there are times when Apple Color is still the best tool for the job. I prefer its user interface to Resolve’s, especially when working with dual displays. And if you use an AJA capture/monitoring product, Resolve is a non-starter. For me, Color is the best choice when I get a color correction project from an outside editor who cut in FCP 7. I’ve also done some jobs in X and then gone to Color via Xto7 and then FCP 7. It may sound a little convoluted, but it’s pretty painless and the results speak for themselves.

Audio mixing

I do minimal mixing in X. It’s fine for simple mixes, but for me, a track-based application is the only way to go. I do have X2Pro Audio Convert, but many of the out-of-house Pro Tools mixers I work with prefer to receive OMFs rather than AAFs. This means going to FCP 7 first and then generating an OMF from within FCP 7. This has the added advantage that I can proof the timeline for errors first. That’s something you can’t do when generating an AAF, since there’s no way to open and inspect it. FCP X timelines also tend to include many muted clips that are out of sight and out of mind inside X. By going to FCP 7 first, you have a chance to clean up the timeline before the mixer gets it.

Any complex projects that I mix myself are done in Adobe Audition or Soundtrack Pro. I can get to Audition via the XML route – or I can go to Soundtrack Pro through XML and FCP 7 with its “send to” function. Either application works for me and most of my third-party plug-ins show up in each. Plus they both have a healthy set of their own built-in filters. When I’m done, simply export the mix (and/or stems) and import the track back into FCP X to marry it to the picture.

Project trimming

Final Cut Pro X has no media management function.  You can copy/move/aggregate all of the media from a single Project (timeline) into a new Event, but these files are the source clips at full length. There is no ability to create a new project with trimmed or consolidated media. That’s when source files from a timeline are shortened to only include the portion that was cut into the sequence, plus user-defined “handles” (an extra few frames or seconds at the beginning and end of the clip). Trimmed, media-managed projects are often required when sending your edited sequence to an outside color correction facility. It’s also a great way to archive the “unflattened” final sequence of your production, while still leaving some wiggle room for future trimming adjustments. The sequence is editable and you still have the ability to slip, slide or change cuts by a few frames.

I ran into this problem the other day, where I needed to take a production home for further work. It was a series of commercials cut in FCP X, from which I had recut four spots as director’s cuts. The edit was locked, but I wanted to finish the mix and grade at home. No problem, I thought. Simply duplicate the project with “used media”, create the new Event and “organize” (copies media into the new Event folder). I could live with the fact that the media was full length, but there was one rub. Since I had originally edited the series of commercials using Compound Clips for selected takes, the duping process brought over all of these Compounds – even though none was actually used in the edit of the four director’s cuts. This would have resulted in copying nearly two-thirds of the total source media. I could not remove the Compounds from the copied Event, without also removing them from the original, which I didn’t want to do.

The solution was to send the sequence of four spots to FCP 7 and then media manage that timeline into a trimmed project. The difference was 12GB of trimmed source clips instead of hundreds of gigabytes. At home, I then sent the audio to Soundtrack Pro for a mix and the picture back to FCP X for color correction. Connect the mix back to the primary storyline in FCP X and call it done!
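The “trimmed with handles” idea is easy to express in code. The sketch below is my own illustration (not any NLE’s media manager): it takes the portions of each source clip used in a sequence, pads them with handles, and merges overlapping ranges so each source file is only copied once. The clip names and frame numbers are invented for the example.

```python
"""Illustration of 'trimmed, media-managed' ranges: keep only the portions of each
source clip used in the sequence, plus user-defined handles. Example data only."""
from collections import defaultdict

HANDLE = 48  # two seconds of handles at 24 fps

# (source clip, first frame used, last frame used) for every edit in the sequence
edits = [
    ("A001_C003.mov", 1200, 1420),
    ("A001_C003.mov", 1400, 1510),   # overlaps the previous use of the same clip
    ("A002_C001.mov", 300, 450),
]

def trimmed_ranges(edits, handle=HANDLE):
    """Merge each clip's used ranges (padded with handles) into copy ranges."""
    by_clip = defaultdict(list)
    for clip, start, end in edits:
        by_clip[clip].append((max(0, start - handle), end + handle))
    result = {}
    for clip, ranges in by_clip.items():
        merged = []
        for start, end in sorted(ranges):
            if merged and start <= merged[-1][1]:      # ranges touch or overlap
                merged[-1] = (merged[-1][0], max(merged[-1][1], end))
            else:
                merged.append((start, end))
        result[clip] = merged
    return result

if __name__ == "__main__":
    for clip, ranges in trimmed_ranges(edits).items():
        for start, end in ranges:
            print(f"{clip}: copy frames {start}-{end}")
```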

I realize that some of this may sound a bit complex to some readers, but professional workflows are all about having a good toolkit and knowing how to use it. FCP X is a great tool for productions that can work within its walls, but if you still own Final Cut Studio, there are a lot more options at your disposal. Why not continue to use them?

©2013 Oliver Peters

Particle Fever

Filmmaking isn’t rocket science, but sometimes the two are kissing cousins. Such is the case with the documentary Particle Fever, where the credentials of both producer David Kaplan and director Mark Levinson include a doctorate in particle physics. Levinson has been involved in filmmaking for 28 years, starting after his graduation from Berkeley, when he found job prospects in physics in a slump. Instead he turned to his second passion – films. Levinson worked as an ADR specialist on such films as The English Patient, The Talented Mr. Ripley, Cold Mountain, and The Rainmaker. While working on those films, he built up a friendship with noted film editor Walter Murch (The Conversation, Julia, Apocalypse Now, K-19: The Widowmaker). In addition, Levinson was writing screenplays and directing some of his own independent films (Prisoner of Time). This ultimately led him to combine his two interests and pursue Particle Fever, a documentary about the research, construction and goals of building the Large Hadron Collider.

When it came time to put the polish on his documentary, Mark Levinson tapped Walter Murch as the editor. Murch explained, “I was originally only going to be on the film for three months, because I was scheduled to work on another production after that. I started in March 2012, but the story kept changing with each breaking news item from the collider. And my other project went away, so in the end, I worked on the film for 15 months and just finished the mix a few weeks ago [June 2013].” At the start of the documentary project, the outcome of the research from the Large Hadron Collider was unknown. In fact, it wasn’t until later during the edit that the scientists achieved a major success with the confirmation of the discovery of the Higgs boson as an elementary particle in July 2012. This impacted science, but also the documentary, in a major way.

Finding the story arc

Particle Fever is the first feature-length documentary that Walter Murch has edited, although archival and documentary footage has been part of a number of his films. He’d cut some films for the USIA early in his career and has advised and mixed a number of documentaries, including Crumb, about the controversial cartoonist Robert Crumb. Murch is fond of discussing the role of the editor as a participatory writer of the film in how he crafts the story through pictures and sound. Nowhere is this more true than in documentaries. According to Murch, “Particle Fever had a natural story arc by the nature of the events themselves. The machine [the Large Hadron Collider] provided the spine. It was turned on in 2008 and nine days later partly exploded, because a helium relief valve wasn’t strong enough. It was shut down for a year of repairs. When it was turned on again, it was only at half power and many of the scientists feared this was inadequate for any major discoveries. Nevertheless, even at half power, the precision was good enough to see the evidence that they needed. The film covers this journey from hope to disaster to recovery and triumph.”

Due to the cost of constructing large particle accelerators, a project like the Large Hadron Collider is a once-in-a-generation event. It is a seminal moment in science akin to the Manhattan Project or the moon launch. In this case, 10,000 scientists from 100 countries were involved in the goal of recreating the conditions just after the Big Bang and finding the Higgs boson, often nicknamed “the God particle”. Murch explained the production process, “Mark and David picked a number of scientists to follow and we told the story through their eyes without a narrator. They were equipped with small consumer cameras to self-record intermittent video blogs, which augmented the formal interviews. Initially Mark was following about a dozen scientists, but this was eventually narrowed down to the six that are featured in the film. The central creative challenge was to balance the events while getting to know the people and their roles. We also had to present enough science to understand what is at stake without overwhelming the audience. These six turned out to be the best at that and could convey their passion in a very charismatic and understandable way with a minimum of jargon.”

Murch continued, “Our initial cut was two-and-a-half hours, which was ultimately reduced to 99 minutes. We got there by cutting some people, but also some of the ‘side shoots’ or alternate research options that were explored. For example, there was a flurry of excitement related to what was thought to be discoveries of particles of ‘dark matter’ at a Minnesota facility. This covered about 20 minutes of the film, but in the final version there’s only a small trace of that material.”

Sifting to find the nuggets

As in most documentaries, the post team faced a multitude of formats and a wealth of material, including standard definition video recorded in 2007, the HDV files from the scientists’ “webcams” and Panasonic HD media from the interviews. In addition, there was a lot of PAL footage from the media libraries at CERN, the European particle physics laboratory. During the production, news coverage focused on the theoretical, though statistically unlikely, possibility that the Large Hadron Collider might have been capable of producing a black hole. This yielded even more source material to sift through. In total, the production team generated 300 hours of content and an additional 700 hours were available from CERN and the various news pieces produced about the collider.

Murch is known for his detailed editor’s codebook of scenes and dailies that he maintains for every film in a FileMaker Pro database. Particle Fever required a more streamlined approach. Murch came in at what initially appeared to be the end of the process, after Mona Davis (Fresh, Advise & Consent) had worked on the film. Murch said, “I started the process later into the production, so I didn’t initially use my FileMaker database. Mark was both the director and my assistant editor, so for the first few months I was guided by his knowledge of the material. We maintained two mirrored workstations with Final Cut Pro 7 and Mark would ingest any new material and add his markers for clips to investigate. When these bins were copied to my station, I could use them as a guide of where to start looking for possible material.”

Mapping the sound

The post team operated out of Gigantic Studios in New York, which enabled an interactive workflow between Murch and sound designer Tom Paul (on staff at Gigantic) and with composer Robert Miller. Walter Murch’s editorial style involves building up a lot of temporary sound effects and score elements during the rough cut phase and then, piece-by-piece, replacing those with finished elements as he receives them. His FCP sequence on Particle Fever had 42 audio tracks of dialogue, temp sound effects and music elements. This sort of interaction among the editor, sound designer and composer worked well with a small post team all located in New York City. By the time the cut was locked in May, Miller had delivered about an hour of original score for the film and supplied Murch with seven stereo instrumentation stems for that score to give him the most versatility in mixing.

Murch and Paul mixed the film on Gigantic’s Pro Tools ICON system. Murch offered this post trick, “When I received the final score elements from Robert, I would load them into Final Cut and then was able to copy-and-paste volume keyframes I had added to Robert’s temp music onto the final stems, ducking under dialogue or emphasizing certain dynamics of the music. This information was then automatically transferred to the Pro Tools system as part of the OMF output. Although we’d still adjust levels in the mix, embedding these volume shifts gave us a better starting point. We didn’t have to reinvent the wheel, so to speak. In the end, the final mix took four days. Long days!”

Gigantic Post offered the advantage of an on-site screening room, which enabled the producers to have numerous in-progress screenings for both scientific and industry professionals, as well as normal interested viewers. Murch explained, “It was important to get the science right, but also to make it understandable to the layman. I have more than a passing interest in the subject, but both Mark and David have Ph.D.s in particle physics, so if I ever had a question about something, all I had to do was turn around and ask. We held about 20 screenings over the course of a year and the scientists who attended our test screenings felt that the physics was accurate. But, what they also particularly liked was that the film really conveys the passion and experience of what it’s like to work in this field.” Final Frame Post, also in New York, handled the film’s grading and digital intermediate mastering.

Graphic enhancements

To help illustrate the science, the producers tapped MK12, a design and animation studio, which had worked on such films as The Kite Runner and Quantum of Solace. Some of the ways in which they expressed ideas graphically throughout the film could loosely be described as a cross between A Beautiful Mind and Carl Sagan’s PBS Cosmos series. Murch described one example, “For instance, we see Nima (one of our theorists) walking across the campus of the Institute for Advanced Study while we hear his voice-over. As he talks, formulas start to swirl all around him. Then the grass transforms into a carpet of number-particles, which then transform into an expanding universe into which Nima disappears. Eventually, this scene resolves and Nima emerges, returning on campus and walking into a building, the problematic formulas falling to the ground as he goes through the door.”

Although this was Walter Murch’s first feature documentary, his approach wasn’t fundamentally different from how he works on a dramatic film. He said, “Even on a scripted film, I try to look at the material without investing it with intention. I like to view dailies with the fresh-eyed sense of ‘Oh, where did this come from? Let’s see where this will take the story’.  That’s also from working so many years with Francis [Ford Coppola], who often shoots in a documentary style. The wedding scene in The Godfather, for instance; or the Union Square conversation in The Conversation; or any of the action scenes in Apocalypse Now all exemplify that. They are ongoing events, with their own internal momentum, which are captured by multiple cameras. I really enjoyed working on this film, because there were developments and announcements during the post which significantly affected the direction of the story and ultimately the ending. This made for a real roller coaster ride!”

Particle Fever premiered at Sheffield Doc/Fest on June 14th and won the Audience Award (split with The Act of Killing). It is currently in negotiations for distribution.

NOTE: The film will open in New York on March 5, 2014. In October 2013, Peter W. Higgs – who theorized about the boson particle named after him – was awarded the Nobel Prize in Physics, together with François Englert. For more on Walter Murch’s thoughts about editing, click here.

And finally, an interesting look at Murch’s involvement in the Rolex Mentor and Protégé program.

Originally written for Digital Video magazine

©2013 Oliver Peters

DaVinci Resolve Workflows


Blackmagic Design’s purchase of DaVinci Systems put a world-class color grading solution into the hands of every video professional. With Resolve 9, DaVinci sports a better user interface that makes it easy to run, regardless of whether you are an editor, colorist or DIT working on set. DaVinci Resolve 9 comes in two basic Mac or Windows software versions: the $995 paid version and the free Lite version. The new Blackmagic Cinema Camera software bundle also includes the full (paid) version, plus a copy of UltraScope. For facilities seeking to add comprehensive color grading services, there’s also a version with Blackmagic’s dedicated control surface, as well as Linux system configurations.

Both the paid and free versions of Resolve (currently at version 9.1) work the same way, except that the paid version offers larger-than-HD output, noise reduction and the ability to tap into additional GPU cards for hardware acceleration. Resolve runs fine with a single display card (I’ve done testing with the Nvidia GT120, the Nvidia Quadro 4000 and the ATI 5870), but requires a Blackmagic video output card if you want to see the image on a broadcast monitor.

Work in Resolve 9 generally flows left-to-right, through the tabbed pages, which you select at the bottom of the interface screen. These are broken into Media (where you access the media files that you’ll be working with), Conform (importing/exporting EDL, XML and AAF files), Color (where you do color correction), Gallery (the place to store and recall preset looks) and Deliver (rendering and/or output to tape).

Many casual users employ Resolve in these two ways: a) correcting camera files to send on to editorial, and b) color correction roundtrips with NLE software. This tutorial is intended to highlight some of the basic workflow steps associated with these tasks. Resolve is deep and powerful, so spend time with the excellent manual to learn its color correction tools, which would be impossible to cover here.

Creating edit-ready dailies – BMCC (CinemaDNG media)

The Blackmagic Cinema Camera can record images as camera raw CinemaDNG image sequences. Resolve 9 can be used to turn these into QuickTime or MXF media for editing. Files may be graded for the desired final look at this point, or the operator can choose to apply the BMD Film preset. This log preset generates files with a flat look comparable to ARRI Log-C. You may prefer this if you intend to use a Log-to-Rec 709 LUT (look-up table) in another grading application or a filter like the Pomfort Log-to-Video effect, which is available for Final Cut Pro 7/X.

Step 1 – Media: Drag clip folders into the Media Pool section.

Step 2 – Conform: Skip this tab, since the clips are already on a single timeline.

Step 3 – Color: Make sure the camera setting (camera icon) for the clips on the timeline is set to Project. Open the project settings (gear icon). Change and apply these values: 1) Camera raw – CinemaDNG; 2) White Balance – as shot; 3) Color Space and Gamma – BMD Film.

Step 4 – Deliver: Set it to render each clip individually, assign the target destination and frame rate and the naming options. Then choose Add Job and Start Render.

The free version of Resolve will downscale the BMCC’s 2.5K-wide images to 1920×1080. The paid version of Resolve will permit output at the larger, native size. Rendered ProRes files may now be directly imported into FCP 7, FCP X or Premiere Pro. Correct the images to a proper video appearance using the color correction tools or filters available within your NLE.

Creating edit-ready dailies – ARRI Alexa / BMCC (ProRes, DNxHD media)

Both the ARRI Alexa and the Blackmagic Cinema Camera can record Apple ProRes and Avid DNxHD media files to onboard storage. Each offers a similar log gamma profile that may be applied during recording in order to preserve dynamic range: Log-C for the Alexa and BMD Film for the Blackmagic camera. These profiles facilitate high-quality grading later. Resolve may be used to properly grade these images to the final look as dailies are generated, or it may simply be used to apply a viewing LUT for a more pleasing appearance during the edit.

Step 1 – Media: Drag clip folders into the Media Pool section.

Step 2 – Conform: Skip this tab, since the clips are already on a single timeline.

Step 3 – Color: Make sure the camera setting for the clips on the timeline is set to Project. Open the project settings and set this value: 3D Input LUT – ARRI Alexa Log-C or BMD Film to Rec 709.

Step 4 – Deliver: Set it to render each clip individually, assign the target destination and frame rate and the naming options. Check whether or not to render with audio. Then choose Add Job and Start Render.

The result will be new, color corrected media files, ready for editing. To render Avid-compatible MXF media for Avid Media Composer, select the Avid AAF Roundtrip from the Easy Setup presets. After rendering, return to the Conform page to export an AAF file.

Roundtrips – using Resolve together with editing applications

DaVinci Resolve supports roundtrips from and back to NLEs based on EDL, XML and AAF lists. You can use Resolve for roundtrips with Apple Final Cut Pro 7/X, Adobe Premiere Pro and Avid Media Composer/Symphony. You may also use it to go between systems. For example, you could edit in FCP X, color correct in Resolve and then finish in Premiere Pro or Autodesk Smoke 2013. Media should have valid timecode and reel IDs to enable the process to work properly.

In addition to accessing the camera files and generating new media with baked-in corrections, these roundtrips require an interchange of edit lists. Resolve imports an XML and/or AAF file to link to the original camera media and places those clips on a timeline that matches the edited sequence. When the corrected (and trimmed) media is rendered, Resolve must generate new XML and/or AAF files, which the NLE uses to link to these new media files. AAF files are used with Avid systems and MXF media, while standard XML files and QuickTime media are used with Final Cut Pro 7 and Premiere Pro. FCP X uses a new XML format that is incompatible with FCP 7 or Premiere Pro without translation by Resolve or another utility.
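Because the whole roundtrip hinges on every clip carrying a valid reel ID and source timecode, it can be worth sanity-checking an exported FCP 7 XML before handing it to Resolve. Here’s a rough sketch using Python’s standard XML parser; element paths in XMEML lists vary between exporters, so adjust it for what your NLE actually writes.

```python
"""Quick sanity check of an FCP 7 XML (XMEML) list before a Resolve roundtrip:
flag clip items that appear to be missing a reel ID or source timecode.
Element paths vary between exporters, so treat this as a starting point."""
import sys
import xml.etree.ElementTree as ET

def check_xmeml(path: str) -> None:
    root = ET.parse(path).getroot()
    for clip in root.iter("clipitem"):
        name = clip.findtext("name", default="(unnamed)")
        reel = clip.findtext(".//reel/name")
        timecode = clip.findtext(".//timecode/string")
        problems = []
        if not reel:
            problems.append("no reel ID")
        if not timecode:
            problems.append("no source timecode")
        status = ", ".join(problems) if problems else f"reel {reel}, TC {timecode}"
        print(f"{name}: {status}")

if __name__ == "__main__":
    check_xmeml(sys.argv[1])
```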

Step 1 – Avid/Premiere Pro/Final Cut Pro: Export a list file that is linked to the camera media (AAF, XML or FCPXML).

Step 2 – Conform (skip the Media tab): Import the XML or AAF file. Make sure you have set the options to automatically add these clips to the Media Pool.

Step 3 – Color: Grade your shots as desired.

Step 4 – Deliver: Easy Setup preset – select Final Cut Pro XML or Avid AAF roundtrip. Verify QuickTime or MXF rendering, depending on the target application. Change handle lengths if desired. Check whether or not to render with audio. Then choose Add Job and Start Render.

Step 5 – Conform: Export a new XML (FCP 7, Premiere Pro), FCPXML (FCP X) or AAF (Avid) list.

The roundtrip back

The reason you want to go back into your NLE is for the final finishing process, such as adding titles and effects or mixing sound. If you rendered QuickTime media and generated one of the XML formats, you’ll be able to import these new lists into FCP 7/X or Premiere Pro and those applications will reconnect to the files in their current location. FCP X offers the option to import/copy the media into its own managed Events folders.

If you export MXF media and a corresponding AAF list with the intent of returning to Avid Media Composer/Symphony, then follow these additional steps.

Step 1 – Copy or move the folder of rendered MXF media files into an Avid MediaFiles/MXF subfolder. Rename this copied folder of rendered Resolve files with a number (see the sketch after these steps).

Step 2 – Launch Media Composer or Symphony and return to your project or create a new project.

Step 3 – Open a new, blank bin and import the AAF file that was exported from Resolve. This list will populate the bin with master clips and a sequence, which will be linked to the new MXF media rendered in Resolve and copied into the Avid MediaFiles/MXF subfolder.
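Step 1 above is just a file copy, but it’s the step people trip over. Here’s a small sketch that copies the Resolve render folder into the next free numbered subfolder of Avid MediaFiles/MXF. The volume paths are assumptions – point them at your own media drive.

```python
"""Sketch for Step 1: copy Resolve's rendered MXF folder into Avid MediaFiles/MXF
under a numbered subfolder. Paths are assumptions - adjust for your media drive."""
import shutil
from pathlib import Path

RESOLVE_RENDERS = Path("/Volumes/Media/ResolveRenders/Conform_01")  # hypothetical
AVID_MXF = Path("/Volumes/Media/Avid MediaFiles/MXF")               # hypothetical

def next_numbered_folder(mxf_root: Path) -> Path:
    """Avid expects numbered subfolders (1, 2, 3 ...); find the next free one."""
    used = [int(p.name) for p in mxf_root.iterdir() if p.is_dir() and p.name.isdigit()]
    return mxf_root / str(max(used, default=0) + 1)

if __name__ == "__main__":
    destination = next_numbered_folder(AVID_MXF)
    shutil.copytree(RESOLVE_RENDERS, destination)
    print(f"Copied renders to {destination}; now import the AAF in Media Composer.")
```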

Originally written for DV magazine / Creative Planet Network

©2013 Oliver Peters