FCP X grading styles and tools

One of the aspects I enjoy about Final Cut Pro X is the wealth of tools and methods for color correction, grading or whatever you want to call the process. There simply is no other NLE on the market with as many built-in and third-party tools for making adjustments to image color and style. I'm not limiting this to color correction alone, but including the glow, diffusion and stylizing filters that are increasingly part of a grading session. It's about getting the right look for the best emotional impact, and with FCP X there are a host of choices at very little expense.

In addition, the filters being developed for FCP X include more photographic correction functions than we've been used to from previous generations of effects filters. For example, many of these plug-ins include color temperature, tint and contrast controls that add a nice dimension beyond the usual three-way correctors.

Below is a quick potpourri of plug-ins (built-in and third-party) that you can use with FCP X. My sample image is from the John Brawley Afterglow clips used to promote the Blackmagic Cinema Camera. A few of these clips have been posted online in both CinemaDNG and ProRes formats. In this case, I started with an already-corrected clip that I created in Adobe Lightroom. This is my starting point, which is typically what most editors encounter when color correcting a job. It's nice to get flat, log-profile images for grading, but that's not always the case. Often you start with a good-looking Rec 709 image that needs some pizzazz. That's what I'm demonstrating here.

Remember that getting the right look is often a matter of using a combination of filters, rather than just one.

A combination of Alex Gollner’s Levels and YUV Adjust filters. These are two of a set of free FCP X filters.

A combination of three built-in FCP X filters – Hard Light, Hue/Sat and Crisp Contrast.

CoreMelt has released several free filters, including a curve adjustment used to correct “flat” HDSLR images. Of course, you can play with it on non-flat images, too.

CoreMelt’s SliceX masking tool includes several filter variations. One allows extra color correction control within the mask area.

CineFlare has released several free sampler filters. This QuickLooks Teal filter gives you an “orange-and-teal” look. Note that Mac OS color picker controls allow you to tweak the tinting colors.

Developer Simon Ubsdell has posted a number of free filters at the FCP.co forum. This Cross Process filter simulates film processing effects, which, when pushed to extremes, offers a nice way to stylize an image.

Tim Dashwood’s Editor Essentials package includes several image adjustment tools, including Levels and Camera Gamma correction.

Digital Film Tools' Film Stocks is an external application that's accessible from FCP X via a plug-in. The adjustment is customized in the external window, with many controls designed to emulate various film stocks.

DV Shade EasyLooks is a full correction suite within a single plug-in. In addition to grading, it also offers controls for gradients, diffusion, warm/cool hue shifts and vignettes.

Rubber Monkey Software’s FilmConvert Pro is available as both a standalone application and as a plug-in. The filter version has a limited range of color correction controls, but like DFT Film Stocks, is designed to emulate film. You can also change the intensity of the grain structure by selecting between film options from 35mm to Super 8mm.

Noise Industries’ FxFactory Pro package includes several color correction filters, including Bleach Bypass, Crush Color and Film Process (their version of the Technicolor 2-strip look).

Hawaiki Color has been jointly developed by Simon Ubsdell and Lawn Road as a full-fledged color corrector, using the color wheel model. It also features blur and sharpening, plus a wealth of color controls. The unique interface may be used as a HUD overlay or surrounding the image. I'll take a detailed look at Hawaiki in a future post.

Lawn Road also offers other color correction filters, such as Color Grade. Like others, they use the Mac OS color picker controls as a form of three-way color correction without building a separate grading interface.

FCP X’s built-in Teal & Orange look. Like most of the built-in filters, slider control is kept to a minimum, but real-time (unrendered) performance is superior to third-party effects. That’s thanks to under-the-hood optimization done by the Pro Apps engineers, which isn’t available to external developers.

Luca VFX Lo-Fi look is another tool to stylize an image with grunge effects. This is an "animated" effect with built-in flickering. It also includes an image drop well for a custom pattern used within the effect.

Luca VFX Vivid Touch is more of a standard color correction filter.

Red Giant Software’s Magic Bullet Looks is the best-known external image stylizing/color correction application. Like DFT Film Stocks, the correction is done in the external application and accessed via the plug-in from FCP X.

Nattress Curves is a venerable tool that has been updated from the FCP "legacy" days to work in FCP X. It adds a valuable missing ingredient to the built-in correction tools.

This is a combination of the PHYX Color Bleach Bypass, Glow Dark and Techni2Color effects. By stacking the skip-bleach style (but with more control than usual), a localized contrast function and the 2-strip process, you end up with a truly unique look.

Pomfort's filters are designed to work as LUTs used on ARRI ALEXA images, but can also be used with other clips. Naturally, when you do that, the colorimetry is technically wrong, but it offers some interesting color options.

Sheffield Software’s Vintage filter is another that’s been ported over from FxScript.

CrumplePop's ShrinkRay X is designed to create a tilt-shift look with defocused outer edges.

FCP X’s built-in Super 8mm filter is useful, though not as realistic as FilmConvert, because the built-in effect maintains the image sharpness.

Tokyo’s Lomography Look is another Ubsdell filter posted over at FCP.co. It mimics the current photo trend of grunge images created with vintage or poor-quality lenses.

The built-in FCP X Color Board is actually one of my favorites, but you have to get used to its color control model. The slider/puck controls tend to be a bit coarse. Thanks to optimization, the performance is great and beats anything else offered for FCP X. You can stack many instances of the Color Board onto one clip, giving you great primary and secondary color correction control.

Yanobox Moods uses a different approach to color wheels within FCP X. Unlike Hawaiki, Moods uses different color science for its controls, including a silver control and a black wash (which tints the black levels).

Like Dashwood's Essentials, Ripple Training's Ripple Tools also includes a grab bag of effects, among them several color correction effects, such as Color Balance and Glow. A unique aspect of these filters is that they are all adjustment layers, applied as FCP X connected title clips. They alter any image below the adjustment layer, so a single filter may be used over more than just one clip.

©2013 Oliver Peters

The NLE that wouldn’t die II

With echoes of Monty Python in the background, two years on, Final Cut Pro 7 and Final Cut Studio are still widely in use. As I noted in my post from last November, I still see facilities with firmly entrenched and mature FCP “legacy” workflows that haven’t moved to another NLE yet. Some were ready to move to Adobe until they learned subscription was the only choice going forward. Others maintain a fanboy’s faith in Apple that the next version will somehow fix all the things they dislike about Final Cut Pro X. Others simply haven’t found the alternative solutions compelling enough to shift.

I’ve been cutting all manner of projects in FCP X since the beginning and am currently using it on a feature film. I augment it in lots of ways with plug-ins and utilities, so I’m about as deep into FCP X workflows as anyone out there. Yet, there are very few projects in which I don’t touch some aspect of Final Cut Studio to help get the job done. Some fueled by need, some by personal preference. Here are some ways that Studio can still work for you as a suite of applications to fill in the gaps.

DVD creation

There are no more version updates to Apple's (or Adobe's) DVD creation tools. FCP X and Compressor can author simple "one-off" discs using their export/share/batch functions. However, if you need a more advanced, authored DVD with branched menus and assets, DVD Studio Pro (as well as Adobe Encore CS6) is still a very viable tool, assuming you already own Final Cut Studio. For me, the need to do this has been reduced, but not completely gone.

Batch export

Final Cut Pro X has no batch export function for source clips, a function I find immensely helpful. For example, many editorial houses specify that their production company client supply edit-friendly "dailies" – especially when final color correction and finishing will be done by another facility or artist/editor/colorist. This is a throwback to film workflows and is most often the case with RED and ALEXA productions. Certainly a lot of the same processes can be done with DaVinci Resolve, but it's simply faster and easier with FCP 7.

In the case of the ALEXA, a lot of editors prefer to do their offline edit with LUT-corrected, Rec 709 images, instead of the flat, Log-C ProRes 4444 files that come straight from the camera. With FCP 7, simply import the camera files, add a LUT filter like the one from Nick Shaw (Antler Post), enable TC burn-in if you like and run a batch export in the codec of your choice. When I do this, I usually end up with a set of Rec 709 color, ProRes LT files with burn-in that I can use to edit with. Since the file name, reel ID and timecode are identical to the camera masters, I can easily edit with the "dailies" and then relink to the camera masters for color correction and finishing. This works well in Adobe Premiere Pro CC, Apple FCP 7 and even FCP X.
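
If you don't have FCP 7 handy, the same batch pass can be approximated outside of any NLE. Below is a minimal Python sketch driving the ffmpeg command line; the folder names and the alexa_logc_to_rec709.cube LUT file are hypothetical stand-ins. It applies the LUT, burns the embedded timecode into the picture and writes ProRes LT files under the original file names, which is what keeps the later relink step painless:

```python
import subprocess
from pathlib import Path

SRC = Path("camera_masters")        # hypothetical folder of Log-C masters
DST = Path("dailies")               # output folder for Rec 709 dailies
LUT = "alexa_logc_to_rec709.cube"   # hypothetical Log-C to Rec 709 LUT
FPS = "24000/1001"                  # 23.98 fps material

DST.mkdir(exist_ok=True)

for clip in sorted(SRC.glob("*.mov")):
    # read the embedded start timecode so the burn-in matches the master
    tc = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream_tags=timecode",
         "-of", "default=nw=1:nk=1", str(clip)],
        capture_output=True, text=True).stdout.strip() or "00:00:00:00"

    tc_esc = tc.replace(":", "\\:")   # drawtext needs the colons escaped
    vf = (f"lut3d={LUT},"
          f"drawtext=timecode='{tc_esc}':timecode_rate={FPS}:"
          f"x=(w-tw)/2:y=h-2*lh:fontcolor=white:box=1:boxcolor=black@0.5")

    subprocess.run(
        ["ffmpeg", "-y", "-i", str(clip), "-vf", vf,
         "-c:v", "prores_ks", "-profile:v", "1",  # profile 1 = ProRes LT
         "-c:a", "copy",
         str(DST / clip.name)],                   # same name = easy relink
        check=True)
```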

Timecode and reel IDs

When I work with files from the various HDSLRs, I prefer to convert them to ProRes (or DNxHD) and add timecode and reel ID info. In my eyes, this makes the file professional video media that's much more easily dealt with throughout the rest of the post pipeline. I have a specific routine for doing this, but when some of these steps fail due to a file error, I find that FCP 7 is a good back-up utility. From inside FCP 7, you can easily add reel IDs and also modify or add timecode. This metadata is embedded into the actual media file and readable by other applications.
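
For files that have already been transcoded, ffmpeg can handle at least the timecode half of this job with a quick rewrap. A hedged sketch (the file names are made up, and whether a given application reads the reel_name tag back is an assumption you should verify):

```python
import subprocess

# Rewrap without re-encoding, stamping a start timecode into the QuickTime
# file. ffmpeg's -timecode option writes a real tmcd track; the reel_name
# metadata tag is an assumption, since not every app reads it back.
subprocess.run([
    "ffmpeg", "-i", "MVI_0042.mov",   # hypothetical ProRes-converted clip
    "-c", "copy",                     # keep the video/audio essence as-is
    "-timecode", "14:22:10:00",
    "-metadata:s:v:0", "reel_name=A001",
    "MVI_0042_tc.mov",
], check=True)
```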

Log and Transfer

Yes, I know that you can import and optimize (transcode) camera files in FCP X. I just don't like the way it does it. The FCP 7 Log and Transfer module allows the editor to set several naming preferences upon ingest, including custom names and reel IDs. That metadata is then embedded directly into the QuickTime movie created by the Log and Transfer module. FCP X doesn't embed name and ID changes into the media file, but rather into its own database. Consequently, this information is not transportable by simply reading the media file within another application. As a result, when I work with media from a C300, for example, my first step is still Log and Transfer in FCP 7, before I start editing in FCP X.

Conform and reverse telecine

A lot of cameras offer the ability to shoot at higher frame rates with the intent of playing the footage back at a slower frame rate for a slow motion effect – "overcranking" in film terms. Advanced cameras like the ALEXA, RED One, EPIC and Canon C300 write a timebase reference into the file that tells the NLE that a file recorded at 60fps is to be played at 23.98fps. This is not true of HDSLRs, like a Canon 5D, 7D or a GoPro. You have to tell the NLE what to do. FCP X only does this through its Retime effect, which means you are telling the file to be played as slomo, thus requiring a render.

I prefer to use Cinema Tools to “conform” the file. This alters the file header information of the QuickTime file, so that any application will play it at the conformed, rather than recorded frame rate. The process is nearly instant and when imported into FCP X, the application simply plays it at the slower speed – no rendering required. Just like with an ALEXA or RED.
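
The arithmetic behind a conform is trivial, which is why the header rewrite is nearly instant: no frames are added, removed or rendered. A quick Python illustration with made-up numbers:

```python
# Conforming rewrites only the declared playback rate; frames are untouched.
shot_fps    = 60.0           # recorded ("overcranked") rate
conform_fps = 24000 / 1001   # declared rate after the conform (23.976)
frames      = 600            # a hypothetical 10-second burst at 60 fps

speed   = conform_fps / shot_fps   # ~0.4x, i.e. plays at about 40% speed
runtime = frames / conform_fps     # ~25 seconds of screen time

print(f"{speed:.3f}x speed, {runtime:.1f} s of slow motion")
```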

Another function of Cinema Tools is reverse telecine. If a camera file was recorded with built-in “pulldown” – sometimes called 24-over-60 – additional redundant video fields are added to the file. You want to remove these if you are editing in a native 24p project. Cinema Tools will let you do this and in the process render a new, 24p-native file.
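
The cadence that reverse telecine undoes is easy to see in miniature. This toy Python snippet is not how Cinema Tools is implemented; it simply prints the classic 3:2 field pattern so you can see which frames are redundant:

```python
# 3:2 pulldown in miniature: four progressive frames (A, B, C, D) become
# ten fields, i.e. five interlaced 29.97 frames. Reverse telecine detects
# this cadence and discards/rebuilds the mixed-field frames.
film    = ["A", "B", "C", "D"]
cadence = [2, 3, 2, 3]   # fields emitted per film frame

fields = [f for frame, n in zip(film, cadence) for f in [frame] * n]
video  = [fields[i] + fields[i + 1] for i in range(0, len(fields), 2)]
print(video)  # ['AA', 'BB', 'BC', 'CD', 'DD'] -> 'BC' and 'CD' mix frames
```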

Color correction

I really like the built-in and third-party color correction tools for Final Cut Pro X. I also like Blackmagic Design's DaVinci Resolve, but there are times when Apple Color is still the best tool for the job. I prefer its user interface to Resolve's, especially when working with dual displays; and if you use an AJA capture/monitoring product, Resolve is a non-starter. For me, Color is the best choice when I get a color correction project from outside, where the editor used FCP 7 to cut. I've also done some jobs in X and then gone to Color via Xto7 and then FCP 7. It may sound a little convoluted, but it's pretty painless and the results speak for themselves.

Audio mixing

I do minimal mixing in X. It's fine for simple mixes, but for me, a track-based application is the only way to go. I do have X2Pro Audio Convert, but many of the out-of-house Pro Tools mixers I work with prefer to receive OMFs rather than AAFs. This means going to FCP 7 first and then generating an OMF from within FCP 7. This has the added advantage that I can proof the timeline for errors first. That's something you can't do if you are generating an AAF, since there's no way to open and inspect it. FCP X also has a tendency to include many muted clips that stay out of your way inside X, but clutter the timeline the mixer receives. By going to FCP 7 first, you have a chance to clean up the timeline before the mixer gets it.

Any complex projects that I mix myself are done in Adobe Audition or Soundtrack Pro. I can get to Audition via the XML route – or I can go to Soundtrack Pro through XML and FCP 7 with its “send to” function. Either application works for me and most of my third-party plug-ins show up in each. Plus they both have a healthy set of their own built-in filters. When I’m done, simply export the mix (and/or stems) and import the track back into FCP X to marry it to the picture.

Project trimming

Final Cut Pro X has no media management function. You can copy/move/aggregate all of the media from a single Project (timeline) into a new Event, but these files are the source clips at full length. There is no ability to create a new project with trimmed or consolidated media – that is, where the source files from a timeline are shortened to include only the portion that was cut into the sequence, plus user-defined "handles" (an extra few frames or seconds at the beginning and end of the clip). Trimmed, media-managed projects are often required when sending your edited sequence to an outside color correction facility. It's also a great way to archive the "unflattened" final sequence of your production, while still leaving some wiggle room for future trimming adjustments. The sequence remains editable and you still have the ability to slip, slide or change cuts by a few frames.
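
The bookkeeping behind a trimmed media manage is simple enough to sketch. Assuming you know the in/out points of every use of a source clip, a pass like the following (all clip lengths and frame numbers are hypothetical) computes what a trimmed project would actually copy:

```python
HANDLES = 24  # one second of handles at 24p, on each side

def trimmed_ranges(used, clip_len, handles=HANDLES):
    """Pad each used range of a source clip with handles, then merge any
    overlaps; the merged union is what a trimmed copy would keep."""
    padded = sorted((max(0, a - handles), min(clip_len, b + handles))
                    for a, b in used)
    merged = [list(padded[0])]
    for a, b in padded[1:]:
        if a <= merged[-1][1]:          # overlaps/touches the previous range
            merged[-1][1] = max(merged[-1][1], b)
        else:
            merged.append([a, b])
    return merged

# Two cuts from the same 10,000-frame master collapse into one padded span,
# because the handles bridge the 20-frame gap between the two uses.
print(trimmed_ranges([(1200, 1290), (1310, 1400)], 10_000))
# -> [[1176, 1424]]
```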

I ran into this problem the other day, where I needed to take a production home for further work. It was a series of commercials cut in FCP X, from which I had recut four spots as director’s cuts. The edit was locked, but I wanted to finish the mix and grade at home. No problem, I thought. Simply duplicate the project with “used media”, create the new Event and “organize” (copies media into the new Event folder). I could live with the fact that the media was full length, but there was one rub. Since I had originally edited the series of commercials using Compound Clips for selected takes, the duping process brought over all of these Compounds – even though none was actually used in the edit of the four director’s cuts. This would have resulted in copying nearly two-thirds of the total source media. I could not remove the Compounds from the copied Event, without also removing them from the original, which I didn’t want to do.

The solution was to send the sequence of four spots to FCP 7 and then media manage that timeline into a trimmed project. The difference was 12GB of trimmed source clips instead of HUNDREDS of GB. At home, I then sent the audio to Soundtrack Pro for a mix and the picture back to FCP X for color correction. Connect the mix back to the primary storyline in FCP X and call it done!

I realize that some of this may sound a bit complex to some readers, but professional workflows are all about having a good toolkit and knowing how to use it. FCP X is a great tool for productions that can work within its walls, but if you still own Final Cut Studio, there are a lot more options at your disposal. Why not continue to use them?

©2013 Oliver Peters

Anatomy of editing a two camera scene

With the increase in shooting ratios and shortened production schedules, many directors turn to shooting their project with two cameras for the entire time. Since REDs and Canon HDSLRs are bountiful and reasonably priced to buy or rent, even a low budget indie film can take advantage of this. Let me say from the beginning that I'm not a big fan of shooting with two cameras. Too many directors view it as a way to get through their shooting schedule more quickly; but, in fact, they often shoot more footage than needed. Often the B-camera coverage is only 25% useful, because it wasn't properly blocked or lit. However, there are situations where shooting with two cameras works out quite well. The technique is at its most useful when shooting a dramatic dialogue scene with two or three principal actors.

Synchronization

The most critical aspect is maintaining proper sync with audio and between the two cameras. In an ideal world, this is achieved with matching timecode among the cameras and the external sound recorder. Reality often throws a curve ball, which means that more often than not, timecode drifts throughout the day, the cameras weren't properly jam-synced or some other issue crops up. The bottom line is that by the time it gets to the editor, you often cannot rely on timecode for all elements to be in sync. That's why "old school" techniques like a slate with a clapstick are ESSENTIAL. This means roll all three devices and slate both cameras. If you have to move to stand in front of the B-camera for a separate B-camera slate and clap, then you MUST do it.

When this gets to post, the editor or assistant first needs to sync audio and video for both the A-camera and B-camera for every take. If your external sound recorder saved broadcast WAV files, then usually you’ll have one track with the main mix and additional tracks for each isolated microphone used on set. Ideally, the location mixer will have also fed reference audio to both cameras. This means you now have three ways to sync – timecode, slate/clapstick and/or common audio. If the timecode does match, most NLEs have a syncing function to create merged clips with the combined camera file and external audio recording. FCP X can also sync by matching audio waveforms (if reference audio is present on the camera files). For external syncing, there’s Sync-N-Link and Sync-N-Link X (matching timecode) and PluralEyes (matching audio).
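
Under the hood, waveform syncing amounts to cross-correlating the camera's scratch audio against the recorder's track and picking the lag with the strongest match. A rough NumPy/SciPy sketch of the idea (the function name, mono-array inputs and 48 kHz rate are my assumptions, not anyone's actual implementation):

```python
import numpy as np
from scipy.signal import correlate

def sync_offset_seconds(cam, rec, sample_rate=48_000):
    """Estimate where the recorder track lines up against the camera's
    scratch audio. Both inputs are mono float arrays at the same sample
    rate; a positive result means slide the recorder clip later."""
    corr = correlate(cam, rec, mode="full", method="fft")
    lag = int(np.argmax(corr)) - (len(rec) - 1)   # offset in samples
    return lag / sample_rate
```

In practice you would correlate a short window around the slate clap rather than whole takes, which is one more reason a clean, audible clap makes the automatic tools so reliable.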

These are all great shortcuts, but there are times when none of the automatic solutions work. That's when the assistant or editor has to manually mark the visual clap on the camera files and the audio spike of the clap on the sound file, and sync the two elements based on these references. FCP X offers an additional feature: the ability to open a master clip in its own timeline (the "open in timeline" command). You can then edit directly "inside" the master clip. This is useful with external audio, because you have now embedded the external audio tracks inside the master clip for that camera file and they travel together from then on. This has an advantage over FCP X's usual synchronized clip method, in that it retains the camera source's timecode. Synchronized clips reset the timecode of that clip to 00:00:00:00.

Directing strategy

In standard film productions, a scene will be shot multiple times – first a "master" and then various alternate angles; sometimes alternative line readings; as well as pick-ups for part of a scene or cutaways and inserts showing items around the set. The "master" of the scene gets a scene number designation, such as Scene 101, Take 1, Take 2, etc. Whenever the camera is reframed or repositioned – or alternative dialogue is introduced – those recordings get a letter suffix, such as 101A, 101B and so on. With two cameras, there's also the A and B camera designation, which is usually part of the actual camera file name or embedded metadata.

In blocking a simple dialogue scene with two actors, the director would set up the master with a wide shot for the entire scene on the A-camera and maybe a medium on the lead actor within that scene on the B-camera. The B-cam may be positioned next to A-cam or on the opposite side (without crossing the line). That’s Scene 101 and typically, two or three takes will be recorded.

Next, the director will set up two opposing OTS (over the shoulder) angles of the two speaking actors for 101A. After that, opposing CU (close-up) angles for 101B. Often there's another set-up (101C) for additional items. For example, if the scene takes place in a bar, there may be extra coverage that sets up the environment, such as patrons at the bar in the background of the scene. In this example with four set-ups (101-101C) – assuming the director rolled for three takes on each set-up – coverage with two cameras automatically gives you 24 clips to choose from in editing this scene.

Editing strategy

When you mention two camera coverage, many will think of multi-cam editing routines. I never use that for this purpose, because for me, an A-cam or B-cam angle of the same take is still like a uniquely separate take. However, I do find that editing the first swipe at the scene works best when you work with A-cam and B-cam grouped together. Although a director might pick a certain take as his best or “circle” take, I make the assumption that all takes have some value for individual lines. I might start with the circle take of the scene’s master, but I usually end up editing in bits and pieces of other takes, as well. The following method works best when the actors stick largely to the script, with minimal ad libs and improvisation.

Step one is to edit the A-cam circle take of the scene master to the timeline, complete with slate. Next, edit the matching B-cam clip on top, using the slate's clap to match the two angles. (Timecode also works, of course, if A-cam and B-cam have matching timecode.) The exact way I do this varies with the NLE that I am using. In FCP X, the B-cam clip is a connected clip, while in FCP 7, Media Composer and Premiere Pro, the B-cam is on V2 and the accompanying audio is on the tracks below those from the A-cam clip. The point is to have both angles stacked and in sync. Lastly, I'll resize the B-cam clip so I see it as a PIP (picture-in-picture effect) over the A-cam image. Now, I can play through this scene and see what each camera angle of the master offers.

Step two is to do the first editing pass on the scene. I use the blade tool (or add edit) to cut across all tracks/layers/clips at each edit point. Obviously, I'll add a cut at the start of the action so I can remove the slate and run-up to the actual start of the scene. As I play through, I am making edit selections, as if I were switching cameras. The audio is edited as well – often in the middle of a line or even a word. This is fine. Once these edits are done, I will delete the front and back of these takes. Then I will select all of the upper B-cam shots (plus audio) that I don't want to use and delete these. Finally, remove the transform effects to restore the remaining B-cam clips to full screen.

At this stage I will usually move the B-cam clips down to the main track. In FCP X, I use the "overwrite to primary storyline" command to edit the B-cam clips (with audio) onto the storyline, thus replacing the A-cam clip segments that were there. This will cause the embedded external audio from the overwritten A-cam clip segments to be pushed down as connected clips. Highlight and delete these. In a track-based NLE, I may leave the B-cam clips on V2 or overwrite to V1. I'll also highlight and delete/lift the unwanted, duplicate A-cam audio. In all cases, what you want to end up with is a scene edit that checkerboards the A-cam and B-cam clips – audio and video.

Step three is to find other coverage. So far in this example, I've only used the circle take for the master of the scene. As I play the scene, I will want to replace certain line readings with better takes from the other coverage (101A, B, C, etc.), including OTS and CU shots. One reason is to use the best acting performance. Another is to balance the emotion of the scene and create the best arc. Typically in a dramatic scene, the emotion rises as you get later into the scene. To emphasize this visually, I want to use tighter shots as I get further into the scene – focusing mainly on eyes and facial expressions.

I work through the scene, seeking to replace some of the master A-cam or B-cam clip segments. I will mark the timeline section to delete/extract, find a better version of that line in another angle/take and insert/splice it into that position. FCP X has a replace function which is designed for this, but I find it to be slow and inconsistent. A fast keystroke combo of marking clips and timeline and then pressing delete, followed by insert is significantly faster. Regardless of the specific keystrokes used, the point is to build/control the emotion of the scene in ways that improve the drama and combine the best performances of each actor.

Step four is to tighten the scene. At this point, you are primarily working in the trim mode of your NLE. With FCP X, expand the audio/video so you can independently trim both elements for J-cuts and L-cuts. As you begin, you'll have some sloppy edits. Words may be slightly clipped or the cadence of speech doesn't sound right. You now have to fix this by trimming clips and adjusting audio and video edit points. FCP X is especially productive here, because the waveform display makes it easy to see where the same words from adjacent clips align.

You want the scene to flow – dramatically and technically. How natural-sounding is the delivered dialogue as a result of your edit choices? You should also be mindful of continuity, such as actors' eye lines, body positions and actions. Often actors will add dramatic pauses, long stares and verbal stumbles to the performance. This may be for valid dramatic emphasis; but it can also be over-acting, or even the actor's equivalent of saying "um" while trying to remember the next line. Your job as an editor (in support of the director) is to decide which it is and edit the scene so that it comes across in the best way possible. You can cut out this "air" by trimming the edits and/or by using tricks. For example, slip part of a line out of sync and play it under an OTS or reaction shot to tighten a pause.

Step five is to embellish the scene. This is where you add non-sync reactions, inserts and cutaways. The goal is to enhance the scene by giving it a sense of place, covering mismatched continuity and improving the drama. Your elements are the extra coverage around the set (like our patrons at the bar) or an actor's nod, smile, head turn or grimace in response to the dialogue delivered by their acting counterpart. You want to maintain the viewer's emotional involvement and help tell the story through visuals other than simply seeing a person talk. The viewer should be able to follow both sides of the conversation through dialogue and visual cues.

While I have written this post with film and television drama in mind, the same techniques apply, whether it’s a comedy, documentary or simple corporate interview. It’s a strategy for getting the most efficiency out of your edit.

©2013 Oliver Peters

CoreMelt SliceX powered by Mocha

The structure of Apple's Final Cut Pro X has brought many new and innovative plug-ins to market that extend the power of the application. One such plug-in is SliceX from CoreMelt. Developer Roger Bolton is a visual effects compositor who has been making FCP, Motion and After Effects plug-ins that fill the needs he sees coming from a VFX background. SliceX was originally developed as a mask-creation filter designed for FCP X, which has no built-in, custom masking tools. The latest version of SliceX has been improved to add keyframes to the effects parameters. It now also sports a built-in 2D planar tracker using Mocha technology licensed from Imagineer Systems.

SliceX is a set of six effects filters for Final Cut Pro X. These include shape masks for blurs, color correction, depth-of-field, object removal (cloning), vignettes and layer shapes. Each has a different set of parameters depending on the design of that filter – blur settings versus color correction sliders, for example. You can apply the filter on a single clip; or you can stack clips vertically and apply a filter to a higher clip in order to composite these clips together, enabled by the masks you have created. Each filter includes a common set of on-screen drawing tools for the mask shapes, as well as the Mocha tracking controls. Masks can be adjusted using control points with Bezier handles and can be moved, scaled and rotated along with the adjustment of edge softness.

Built-in tracking is rare in a filter (the Boris Continuum Complete effects being a notable exception), and adding the Mocha tracker elevates the level of effects work you can do inside FCP X. Most trackers are point trackers, where you have to identify one to four consistent high contrast points within an image for the tracker to follow. The Mocha tracker is a planar tracker, which means it automatically calculates tracking points based on a wider surface area of flat planes detected in the image. When you use Mocha tracking in Adobe After Effects, for instance, the clip is sent out to the separate Mocha tracking application, which creates the tracking data. This data is then used to adjust scale, position and rotation information within After Effects' transform tab.
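
You can get a feel for the planar idea with a toy example. Instead of following a few isolated points, a planar tracker fits a single geometric transform (a homography) to the motion of an entire textured surface, which is what lets it solve position, scale, rotation and perspective in one pass. Here's a rough OpenCV sketch of that concept; it is emphatically not Mocha's actual algorithm, and the function and variable names are mine:

```python
import cv2
import numpy as np

def track_plane_step(prev_gray, next_gray, roi_corners):
    """One step of a toy planar track: follow feature points inside the
    masked plane from one frame to the next, fit a homography to their
    motion, then carry the mask corners forward with that homography."""
    mask = np.zeros_like(prev_gray)
    cv2.fillConvexPoly(mask, roi_corners.astype(np.int32), 255)

    # pick trackable texture inside the masked plane region
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=7,
                                  mask=mask)
    new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray,
                                                  pts, None)
    ok = status.flatten() == 1

    # a homography maps the whole plane forward, covering translate,
    # scale, rotation and perspective in a single 3x3 matrix
    H, _ = cv2.findHomography(pts[ok], new_pts[ok], cv2.RANSAC, 3.0)
    moved = cv2.perspectiveTransform(
        roi_corners.reshape(-1, 1, 2).astype(np.float32), H)
    return moved.reshape(-1, 2)
```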

Those external round-trip steps are eliminated with SliceX. In keeping with the general FCP X philosophy of hiding the advanced technology under the hood, SliceX features a simple floating heads-up panel for Mocha and the drawing tools. Click the track arrow in the Mocha control panel to track forward or backward from the point where you are parked and the mask is adjusted based on that track. Tracks can be set for either translate (position)/scale/rotation or for perspective. The tracker runs slower than real-time while it analyzes the shot, though this pass is relatively speedy as trackers go. Once the tracking analysis is complete, FCP X will apply and run the effect in real-time without rendering. In addition to the Mocha control panel, there's also a new mask keyframe control panel. If you change the mask points within a clip, this automatically adds keyframes at every point where you have made adjustments. You can then step through or delete these keyframes – a separate function from the standard FCP X keyframe controls available within the effects Inspector window.

CoreMelt’s SliceX adds tremendous functionality to FCP X. You can use the effects to track and blur faces or act like tracked color correction nodes in DaVinci Resolve. The tracking is as simple and foolproof as it comes. Object removal is a great fix-it tool. Have an actress with a skin blemish on her cheek? Simply add the mask and offset the skin area. Track it and voila – the blemish is gone.

It’s easy to say good things about many filters. They are fun to use and give your videos unique stylized looks, but it’s rare that filters significantly improve the core function of your NLE. SliceX is one of those. It’s a must-have filter if you are serious about professional results with Final Cut Pro X. Use it on a few shots and you’ll instantly ask why every NLE doesn’t do the same – and do it so painlessly.

Originally written for Digital Video magazine

©2013 Oliver Peters

Particle Fever

Filmmaking isn't rocket science, but sometimes they are kissing cousins. Such is the case with the documentary Particle Fever, where the credentials of both producer David Kaplan and director Mark Levinson include a doctorate in particle physics. Levinson has been involved in filmmaking for 28 years, starting after his graduation from Berkeley, when he found the job prospects for physics in a slump. Instead he turned to his second passion – films. Levinson worked as an ADR specialist on such films as The English Patient, The Talented Mr. Ripley, Cold Mountain and The Rainmaker. While working on those films, he built up a friendship with noted film editor Walter Murch (The Conversation, Julia, Apocalypse Now, K-19: The Widowmaker). In addition, Levinson was writing screenplays and directing some of his own independent films (Prisoner of Time). This ultimately led him to combine his two interests and pursue Particle Fever, a documentary about the research, construction and goals of building the Large Hadron Collider.

When it came time to put the polish on his documentary, Mark Levinson tapped Walter Murch as the editor. Murch explained, “I was originally only going to be on the film for three months, because I was scheduled to work on another production after that. I started in March 2012, but the story kept changing with each breaking news item from the collider. And my other project went away, so in the end, I worked on the film for 15 months and just finished the mix a few weeks ago [June 2013].” At the start of the documentary project, the outcome of the research from the Large Hadron Collider was unknown. In fact, it wasn’t until later during the edit, that the scientists achieved a major success with the confirmation of the discovery of the Higgs boson as an elementary particle in July 2012. This impacted science, but also the documentary in a major way.

Finding the story arc

Particle Fever is the first feature-length documentary that Walter Murch has edited, although archival and documentary footage has been part of a number of his films. He'd cut some films for the USIA early in his career and has advised and mixed a number of documentaries, including Crumb, about the controversial cartoonist Robert Crumb. Murch is fond of discussing the role of the editor as a participatory writer of the film in how he crafts the story through pictures and sound. Nowhere is this more true than in documentaries. According to Murch, "Particle Fever had a natural story arc by the nature of the events themselves. The machine [the Large Hadron Collider] provided the spine. It was turned on in 2008 and nine days later partly exploded, because a helium relief valve wasn't strong enough. It was shut down for a year of repairs. When it was turned on again, it was only at half power and many of the scientists feared this was inadequate for any major discoveries. Nevertheless, even at half power, the precision was good enough to see the evidence that they needed. The film covers this journey from hope to disaster to recovery and triumph."

Due to the cost of constructing large particle accelerators, a project like the Large Hadron Collider is a once-in-a-generation event. It is a seminal moment in science akin to the Manhattan Project or the moon launch. In this case, 10,000 scientists from 100 countries were involved in the goal of recreating the conditions just after the Big Bang and finding the Higgs boson, often nicknamed “the God particle”. Murch explained the production process, “Mark and David picked a number of scientists to follow and we told the story through their eyes without a narrator. They were equipped with small consumer cameras to self-record intermittent video blogs, which augmented the formal interviews. Initially Mark was following about a dozen scientists, but this was eventually narrowed down to the six that are featured in the film. The central creative challenge was to balance the events while getting to know the people and their roles. We also had to present enough science to understand what is at stake without overwhelming the audience. These six turned out to be the best at that and could convey their passion in a very charismatic and understandable way with a minimum of jargon.”

Murch continued, “Our initial cut was two-and-a-half hours, which was ultimately reduced to 99 minutes. We got there by cutting some people, but also some of the ‘side shoots’ or alternate research options that were explored. For example, there was a flurry of excitement related to what was thought to be discoveries of particles of ‘dark matter’ at a Minnesota facility. This covered about 20 minutes of the film, but in the final version there’s only a small trace of that material.”

Sifting to find the nuggets

As in most documentaries, the post team faced a multitude of formats and a wealth of material, including standard definition video recorded in 2007, the HDV files from the scientists' "webcams" and Panasonic HD media from the interviews. In addition, there was a lot of PAL footage from the media libraries at CERN, the European particle accelerator. During the production, news coverage focused on the theoretical, though statistically unlikely, possibility that the Large Hadron Collider might have been capable of producing a black hole. This yielded even more source material to sift through. In total, the production team generated 300 hours of content and an additional 700 hours were available from CERN and the various news pieces produced about the collider.

Murch is known for his detailed editor’s codebook for scenes and dailies that he maintains for every film in a Filemaker Pro database. Particle Fever required a more streamlined approach. Murch came in at what initially appeared to be the end of the process after Mona Davis (Fresh, Advise & Consent) had worked on the film. Murch said, “I started the process later into the production, so I didn’t initially use my Filemaker database. Mark was both the director and my assistant editor, so for the first few months I was guided by his knowledge of the material. We maintained two mirrored workstations with Final Cut Pro 7 and Mark would ingest any new material and add his markers for clips to investigate. When these bins were copied to my station, I could use them as a guide of where to start looking for possible material.”

Mapping the sound

The post team operated out of Gigantic Studios in New York, which enabled an interactive workflow between Murch and sound designer Tom Paul (on staff at Gigantic) and with composer Robert Miller. Walter Murch's editorial style involves building up a lot of temporary sound effects and score elements during the rough cut phase and then, piece-by-piece, replacing those with finished elements as he receives them. His FCP sequence on Particle Fever had 42 audio tracks of dialogue, temp sound effects and music elements. This sort of interaction among the editor, sound designer and composer worked well with a small post team all located in New York City. By the time the cut was locked in May, Miller had delivered about an hour of original score for the film and supplied Murch with seven stereo instrumentation stems for that score to give him the most versatility in mixing.

Murch and Paul mixed the film on Gigantic’s Pro Tools ICON system. Murch offered this post trick, “When I received the final score elements from Robert, I would load them into Final Cut and then was able to copy-and-paste volume keyframes I had added to Robert’s temp music onto the final stems, ducking under dialogue or emphasizing certain dynamics of the music. This information was then automatically transferred to the Pro Tools system as part of the OMF output. Although we’d still adjust levels in the mix, embedding these volume shifts gave us a better starting point. We didn’t have to reinvent the wheel, so to speak. In the end, the final mix took four days. Long days!”

Gigantic Post offered the advantage of an on-site screening room, which enabled the producers to have numerous in-progress screenings for both scientific and industry professionals, as well as normal interested viewers. Murch explained, "It was important to get the science right, but also to make it understandable to the layman. I have more than a passing interest in the subject, but both Mark and David have Ph.D.s in particle physics, so if I ever had a question about something, all I had to do was turn around and ask. We held about 20 screenings over the course of a year and the scientists who attended our test screenings felt that the physics was accurate. But, what they also particularly liked was that the film really conveys the passion and experience of what it's like to work in this field." Final Frame Post, also in New York, handled the film's grading and digital intermediate mastering.

Graphic enhancements

To help illustrate the science, the producers tapped MK12, a design and animation studio, which had worked on such films as The Kite Runner and Quantum of Solace. Some of the ways in which they expressed ideas graphically throughout the film could loosely be described as a cross between A Beautiful Mind and Carl Sagan's PBS Cosmos series. Murch described one example, "For instance, we see Nima (one of our theorists) walking across the campus of the Institute for Advanced Study while we hear his voice-over. As he talks, formulas start to swirl all around him. Then the grass transforms into a carpet of number-particles, which then transform into an expanding universe into which Nima disappears. Eventually, this scene resolves and Nima emerges, returning on campus and walking into a building, the problematic formulas falling to the ground as he goes through the door."

Although this was Walter Murch’s first feature documentary, his approach wasn’t fundamentally different from how he works on a dramatic film. He said, “Even on a scripted film, I try to look at the material without investing it with intention. I like to view dailies with the fresh-eyed sense of ‘Oh, where did this come from? Let’s see where this will take the story’.  That’s also from working so many years with Francis [Ford Coppola], who often shoots in a documentary style. The wedding scene in The Godfather, for instance; or the Union Square conversation in The Conversation; or any of the action scenes in Apocalypse Now all exemplify that. They are ongoing events, with their own internal momentum, which are captured by multiple cameras. I really enjoyed working on this film, because there were developments and announcements during the post which significantly affected the direction of the story and ultimately the ending. This made for a real roller coaster ride!”

Particle Fever premiered at Sheffield Doc/Fest on June 14th and won the Audience Award (shared with The Act of Killing). It is currently in negotiations for distribution.

NOTE: The film will open in New York on March 5, 2014. In October 2013, Peter W. Higgs – who theorized about the boson particle named after him – was awarded the Nobel Prize in Physics, together with François Englert.

Originally written for Digital Video magazine

©2013 Oliver Peters