Stocking Stuffers 2013


With an eye towards Black Friday, it seemed like a good time to put up this post. There’s been so much development in tools for FCP X this year that I decided to focus my year-end post of small items strictly around some of the offerings related to FxFactory. Although there are plenty of other developers focusing on Final Cut Pro X, Noise Industries has done a good job of aggregating a divergent set of developers under one roof. Many of the items listed are for FCP X, but quite a few also work inside Adobe Premiere Pro CC.

Apple’s Final Cut Pro X uses an effects architecture based on templates tied to the underpinnings of Motion 5. Even if the user didn’t buy the Motion application, that’s the engine that drives FCP X effects. End users and developers can create innovative effects, transitions and titles simply by building an effect inside Motion and publishing it as an FCP X effect. This capability has enabled the Final Cut ecosystem to blossom with many new and useful effects that would be very complicated to replicate in most other editing or compositing applications.

Some of the newest tools are even free, such as the Andy Mees filters. Mees is well-known in FCP circles for his older FxScript plug-ins and now he’s developed a handful of useful effects for FCP X. Of particular interest is Better 3D – a 2.5/3D DVE – as well as his Elastic Aspect filter. The latter stretches 4:3 content to fit a 16:9 frame. It is designed to stretch the outer portions of the frame more than the center, in order to leave talent (usually in the center of the frame) less distorted.
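To make that idea concrete, here is a toy model of a non-uniform stretch of this kind – purely illustrative math, not Andy Mees’ actual implementation:

```javascript
// Toy model of a non-uniform 4:3 -> 16:9 stretch. Both the input x and
// the result are normalized horizontal positions in [-1, 1]: the center
// barely moves, while the edges absorb most of the stretch.
function elasticMap(x) {
  // Blend a linear term (gentle at center) with a cubic term
  // (dominant at the edges).
  return 0.75 * x + 0.25 * Math.pow(x, 3);
}

// elasticMap(0)   -> 0       (center pixel stays put)
// elasticMap(1)   -> 1       (edges still reach the frame edge)
// elasticMap(0.5) -> 0.40625 (mid-frame content pulled toward center)
```

Because positions near the center map almost 1:1 while edge positions are remapped more aggressively, a face framed center-screen stays close to its original proportions.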

The folks at Ripple Training are known for software training, but of late have also become plug-in developers with products that include Callouts, Optics, Timelines and Tools. All but the last are design themes with graphic overlays. Timelines is a set of templates for animated timeline charts. Tools is a mix of useful effects to augment FCP X. These include masks, 3D text, guides, color balance and certain stylized looks.

A similar offering is Tim Dashwood’s Editor Essentials. Dashwood develops Stereo3D tools for Final Cut, but like Ripple’s Tools, this group is an editor’s toolkit designed to make life easier. Included are letterbox/pillarbox masks, color correction adjustments, camera horizon leveling, a quick slate template, a dead pixel fixer filter and more.

Tokyo Productions developer Simon Ubsdell got into the effects game with FCP X. Some of his newest effects include Chrominator and PIPinator. The first filter turns flat titles into shiny, metallic, extruded text, complete with sheens and glows. PIPinator (as in “picture-in-picture”) is a set of preset DVE moves to fly images in, out and through the frame. Other effects include ReAnimator for dead pixels and Split Animator for various split screen effects.

I’ve mentioned Luca Visual FX a few times in my past reviews. Luca VFX is a great resource for grunge and distress looks. Some new filters and transitions include Impackt and Lo-Fi Looks. The latest – XOverlays – deviates from grunge by using a set of patterns and graphics as effects overlays. These plug-ins also include image wells within the filter, so you can alter the look by dropping in your own images. Luca has further extended XOverlays by releasing a set of bonus motion graphics, which may be used in these image wells. Design styles include bands, grids, high tech elements, ripples and visualizers.

Another new Luca offering is Hi-Tech, which is a collection of lower-thirds, simulated displays and animations. It’s a perfect package for a show, spot or film that needs a lot of sci-fi style monitors, HUD overlays and digital elements. Each of these can be customized with image drop wells, color changes and the ability to reposition items within the frame.


Stupid Raisins has focused on transitions using blocks, panels, shapes and slide effects. Their latest release is a series of title generators with built-in motion graphics, reminiscent of Apple’s LiveType. Although these animations have been integrated into Motion, it’s nice to have them easily accessible within FCP X as an effect. There are 50 titles with a variety of text animation effects. Apply the clip to the timeline and then you can easily modify fonts, text information and style.

I’ve only touched the surface of the available tools, but I’ll wrap up with PHYX’s new Flarelight filter. This package includes three glare, lens flare and glow filters, plus noise and star field generators. The lens flare in particular is very nice. It has a ton of parameters to customize the flare and keyframe its movement. Adjusting the parameters felt a little sluggish, but the effect previewed reasonably well in real-time without having to render.


idustrial revolution – one of the original FxFactory development partners – has added the XEffects Toolkit to their product family. The Toolkit is a set of 53 filters, titles and generators that cover a wide range of needs from the stylish to the utilitarian. For example, where can you find a tilt-shift filter, a telestrator overlay line and a slate/countdown clock all in the same set of effects? Also included are filters that treat standard editing effects – like a zoom – in much the same way as a Behavior in Motion.

The last plug-in I’ll mention here is Swoosh from SUGARfx. Swoosh is a set of light brush transitions, titles and generators. The transitions especially show off the power of image manipulation in FCP X. Not only are these a graphic color overlay that wipes from one image to the next, but the path of the light brush effect actually refracts the image underneath. This results in visual ripple or distortion of the incoming and outgoing shots. The transitions run in real-time on a good machine and you have control over a number of parameters, including the gradient colors and the refraction amount.

I’ve avoided mentioning color correction filters of various types, since these were covered earlier. If you missed those posts, check here and here for a refresher. I’ve mentioned these in the context of Final Cut Pro, but many of these filters will also show up and work in Motion, After Effects and Premiere Pro with the same installation. It’s a growing ecosystem of tools that makes FCP X a very interesting environment to work in.

Originally written for Digital Video magazine

© 2013 Oliver Peters

Film editing stages – Sound

Like picture editing, the completion of sound for a film also goes through a series of component parts. These normally start after “picture lock” and are performed by a team of sound editors and mixers. On small, indie films, a single sound designer/editor/mixer might cover all of these roles. On larger films, specific tasks are covered by different individuals. Depending on whether it’s one individual or a team, sound post can take anywhere from four weeks to several months to complete.

Location mixing – During original production, the recording of live sound is handled by the location mixer. This is considered mixing, because originally, multiple mics were mixed “on-the-fly” to a single mono or stereo recording device. In modern films with digital location recordings, the mixer tends to record what is really only a mixed reference track for the editors, while simultaneously recording separate tracks of each isolated microphone to be used in the actual post production mix.

ADR – automatic dialogue replacement or “looping”. ADR is the recording of replacement dialogue in sync with the picture. The actors do this while watching their performance on screen. Sometimes this is done during production and sometimes during post. ADR will be used when location audio has technical flaws. Sometimes ADR is also used to record additional dialogue – for instance, when an actor has his or her back turned. ADR can also be used to record “sanitized” dialogue to remove profanity.

Walla or “group loop” – Additional audio is recorded for groups of people. This is usually for background sounds, like guests in a restaurant. The term “walla” comes from the fact that actors were (and often still are) instructed to say “walla, walla, walla” instead of real dialogue. The point is to create a sound effect of a crowd murmuring, without any recognizable dialogue line being heard. You don’t want anything distinctive to stand out above the murmur, other than the lead actors’ dialogue lines.

Dialogue editing – When the film editor (i.e. the picture editor) hands over the locked cut to the sound editors, it generally will include all properly edited dialogue for the scenes. However, this is not prepared for mixing. The dialogue editor will take this cut and break out all individual mic tracks. They will make sure all director’s cues are removed and they will often add room tone and ambience to smooth out the recording. In addition, specific actor mics will be grouped to common tracks so that it is easier to mix and apply specific processing, as needed, for any given character.

Sound effects editing/sound design – Sound effects for a film come from a variety of sources, including live recordings, sound effects libraries and sound synthesizers. Putting this all together is the role of the sound effects editor(s). Because many have elevated the art by creating a very specific sense of place, the term “sound designer” has come into vogue. For example, the villain’s lair might always feature certain sounds that are identifiable with that character – e.g. dripping water, rats squeaking, a distant clock chiming, etc. These become thematic, just like a character’s musical theme. The sound effects editors are the ones who record, find and place such sound effects.

Foley – Foley is the art of live sound effects recording. This is often done by a two-person team consisting of a recordist and a Foley walker, who is the artist physically performing these sounds. It literally IS a performance, because the walker does this in sync to the picture. Examples of Foley include footsteps, clothes rustling, punches in a fight scene and so on. It is usually faster and more appropriate-sounding to record live sound effects than to use library cues from a CD.

In addition to standard sound effects, additional Foley is recorded for international mixes. When an actor delivers a dialogue line over a sound recorded as part of a scene – a door closing or a cup being set on a table – that sound will naturally be removed when English dialogue is replaced by foreign dialogue in international versions of the film. Therefore, additional sound effects are recorded to fill in these gaps. Having a proper international mix (often called “fully filled”) is usually a deliverable requirement by any distributor.

Music – In an ideal film scenario, a composer creates all the music for a film. He or she is working in parallel with the sound and dialogue editors. Music is usually divided between source cues (e.g. the background songs playing from a jukebox at a bar) and musical score.

Recorded songs may also be used as score elements during montages. Sometimes different musicians, other than the composer, will create songs for source cues or for use in the score. Alternatively, the producers may license affordable recordings from unsigned artists. Rarely is recognizable popular music used, unless the production has a huge budget. It is important that the producers, composer and sound editors communicate with each other, to define whether items like songs are to be treated as a musical element or as a background sound effect.

The best situation is when an experienced film composer delivers all completed music that is timed and synced to picture. The composer may deliver the score in submixed, musical stems (rhythm instruments separated from lead instruments, for instance) for greater control in the mix. However, sometimes it isn’t possible for the composer to provide a finished, ready-to-mix score. In that case, a music editor may get involved, in order to edit and position music to picture as if it were the score.

Laugh tracks – This is usually a part of sitcom TV production and not feature films. When laugh tracks are added, the laughs are usually placed by sound effects editors who specialize in adding laughs. The appropriate laugh tracks are kept separate so they can be added or removed in the final mix and/or as part of any deliverables.

Re-recording mix – Since location recording is called location mixing, the final, post production mix is called a re-recording mix. This is the point at which divergent sound elements – dialogue, ADR, sound effects, Foley and music – all meet and are mixed in sync to the final picture. On a large film, these various elements can easily take up 150 or more tracks and require two or three mixers to man the console. With the introduction of automated systems and the ability to completely mix “in the box”, using a DAW like Pro Tools, smaller films may be mixed by one or two mixers. Typically the lead mixer handles the dialogue tracks and the second and third mixers control sound effects and music. Mixing most feature films takes one to two weeks, plus the time to output various deliverable versions (stereo, surround, international, etc.).

The deliverable requirements for most TV shows and features are to create a so-called composite mix (in several variations), along with separate stems for dialogue, sound effects and music. A stem is a submix of just a group of component items, such as a stereo stem for only dialogue. The combination of the stems should equal the mix. By having stems available, the distributors can easily create foreign versions and trailers.
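As a toy illustration of that stem relationship – using hypothetical sample values, not real audio – the composite is simply the sample-wise sum of its stems:

```javascript
// Toy model: a composite mix is the sample-wise sum of its stems.
// The arrays are arbitrary illustrative "samples", not real audio.
const dialogueStem = [0.2, 0.1, 0.0, 0.3];
const effectsStem  = [0.0, 0.2, 0.1, 0.0];
const musicStem    = [0.1, 0.1, 0.1, 0.1];

// Sum any number of equal-length stems, sample by sample.
function sumStems(...stems) {
  return stems[0].map((_, i) =>
    stems.reduce((acc, stem) => acc + stem[i], 0)
  );
}

const compositeMix = sumStems(dialogueStem, effectsStem, musicStem);
// compositeMix is approximately [0.3, 0.4, 0.2, 0.4] (floating-point
// rounding aside) -- the stems recombine into the full mix.
```

This is exactly why stems are so useful downstream: replace the dialogue stem with a foreign-language one, re-sum, and you have an international version without touching effects or music.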

©2013 Oliver Peters

Film editing stages – Picture


While budding filmmakers have a good idea of what happens during the production phase of shooting a film, most have little idea about what happens in post. Both picture and sound go through lengthy and separate editorial processes. These often become a rude awakening for new directors when it comes to the time and budget requirements. Here are the basic steps every modern film goes through in getting to the finish line.

First cut – This stage goes by many names – first cut, first assembly, editor’s cut, etc. In general, this is the first version of the fully-assembled film, including all the scenes edited according to the script. Depending on the editor and the post schedule, this cut may be very rough – or it might be a reasonably polished edit. If the editing happens concurrent to the studio and location filming, then often there will be a “first assembly” and a subsequent “editor’s cut”. The former is a quick turnaround version, so that everyone can make sure the coverage is adequate. The latter is a more refined version.

Some productions employ an on-set editor who is the person generating this “first assembly”. That editor is then often replaced by the main film editor, who starts after all production is completed. In that situation, the “editor’s cut” might be completely different in style, pace and technique from the first version. No matter how you get there, the intent of this step is to faithfully represent the script without concern for length or for solving any content or script challenges.

Director’s cut – Once the editor has completed the first cut of the film, then the director steps in. He or she works with the editor to complete the cut of the film. Directors often deviate from the written scene. Sometimes this is sufficiently communicated to the editor to show up that way in the first cut. Sometimes it isn’t, because it lives in the director’s mind as the production proceeds. During the “director’s cut” phase, the director and editor work closely to adjust the cut to reflect the director’s vision.

Many directors and editors repeatedly work together on films and form a partnership of sorts. In these situations, the editor has a good idea of what the director wants and often the director only needs to give notes and review the cut periodically. Other directors like to be very “hands on” and will work closely with the editor, reviewing every take and making adjustments as needed.

Depending on the film and whether or not the director is DGA (Directors Guild), this stage will take a minimum of 20 days (DGA low budget) or 10 weeks (DGA standard) or longer. The goal is for the director and editor to come up with the best film possible, without interference from outside parties, including the producers. At this point, the film may go through severe changes, including shortening, losing and/or re-arranging scenes and even the addition of new content, like insert shots and new voice-over recordings.

Producer’s cut – After the director has a shot at the film, now it’s time to make adjustments according to studio notes, producer comments and feedback from official and unofficial test screenings. If the director hasn’t yet brought the film into line – both story-wise and length-wise – now is the time to do that. Typically most indie films are targeted at the 90-100 minute range. If your first cut or director’s cut is 120 minutes or longer, then it’s going to have to be cut down by a significant amount.

Typically you can shorten a film by 10% through trimming and shortening scenes. A reduction of 25% or more means that shots and whole scenes have to go. This can be a painful experience for the director, who has suffered through the agony, time and expense of getting these scenes and shots recorded. The editor, on the other hand, has no such emotional investment and can be more objective. Whichever way the process moves forward, the point is to get the cut to its final form.

Depending on the production, this version of the film might also include temporary sound effects, music and visual effects that have been added by the editor and/or assistants. Often this is needed to fully appreciate the film when showing it in test screenings.

Locked picture – The goal of these various editing steps is to explore all creative options in order to end up with a film that will not go through any further editing changes. This means, no revisions that change time or selected shots. The reason for a “locked picture” is so that the sound editing team and the visual effects designers can proceed with their work without the fear that changes will undo some of their efforts. Although large budget films have the luxury of making editorial changes after this point, it is unrealistic for smaller indie films. “Locking the cut” is absolutely essential if you want to get the best effort out of the post team, as well as stay within your budget.

Visual effects – If your film requires any visual effects shots, these are best tackled after picture lock. The editors will hand off the required source elements to the visual effects company or designers so they can do their thing. Editors are typically not involved in visual effects creation, other than to communicate the intent of any temp effects that have been created and to make sure the completed VFX shots integrate properly back into the picture.

Sound editorial – This will be covered in depth in the next blog post. It has its own set of steps and usually takes several weeks to several months to complete.

Conform and grade – Prior to this step, all editing has been performed with “proxy” media. During the “finishing” stage of the film, the original camera media is “conformed” to the locked cut that was handed over from the film editor. This conform step is typically run by an online editor who works in tandem with the colorist. Sometimes this is performed by the colorist and not a separate individual. On very low budget films, the film editor, online editor and colorist might all be the same person. During conforming, the objective is to frame-accurately re-create the edit, including all reframing and speed ramps, and to integrate all final visual effects shots. From this point the film goes to color correction for final grading. Here the colorist matches all shots to establish visual consistency, as well as to add any subjective looks requested by the director or director of photography. The last process is to marry the sound mix back to the picture and then generate the various deliverable masters.

©2013 Oliver Peters



One of the ways to extend functions in Adobe After Effects is through scripting. Scripts are automated macros that quickly perform tasks you could do yourself, building results without manually executing tedious, repetitive commands. Developers can create advanced scripts to automate complex creative treatments. These are installed like plug-ins, but show up as a module under the Window pulldown menu. One such script is Typemonkey – a kinetic text generator.

Kinetic Text

We’ve all seen this current design trend in TV spots and marketing videos. The copy is presented as animated words, which move into position on screen. The view shifts from one word to the next in sync with the announcer, at the reading pace of the viewer. Creating a kinetic text layout is relatively straightforward for an editor using After Effects or Motion.

The starting point for kinetic text is a large layout of stacked words. These are arranged horizontally and vertically in a bigger-than-raster field. It’s like taking a variety of building blocks and stacking them like a building. This word design can be created as a layered Photoshop document or as a series of layers in After Effects or Motion – one word per layer. To add energy and pace, you would next offset the timing of each layer and add an entry animation to the word on that layer, so that it flies, fades, rotates or types into visibility.

Once this layout is created, the entire stack of layers is viewed with a 3D camera, which in turn is animated to create the moves from one word to the next as they appear inside the raster of your composition. This brings them full screen for a moment as the reader follows the context of this text. While this process is very easy once you understand it, the time it takes to build it can be quite long. In addition, a paragraph of words will result in a lengthy series of After Effects layers in your timeline pane.
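The manual steps above can be sketched as plain logic – the numbers, words and camera targets here are hypothetical placeholders, not Typemonkey’s internals:

```javascript
// Sketch: stagger word layers in time and assign the 3D camera a
// position target per word, so each word is framed full screen when
// its turn comes. All values are illustrative.
const words = ["BIG", "IDEAS", "MOVE", "FAST"];
const compDuration = 8; // hypothetical composition length in seconds

function buildKineticLayout(words, duration) {
  const interval = duration / words.length; // evenly spaced moves
  return words.map((word, i) => ({
    word,
    startTime: i * interval,                         // staggered entry
    cameraTarget: { x: i * 400, y: (i % 2) * -300 }  // stacked layout
  }));
}

const layout = buildKineticLayout(words, compDuration);
// layout[1].startTime === 2 -- the second word animates in at 2 seconds.
```

A real build would attach these start times to layer in-points and keyframe (or wire with expressions) the camera toward each target; the point is simply that the whole layout reduces to a small amount of repeatable arithmetic, which is what makes it such a natural candidate for scripting.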

Automating the process


Typemonkey enters the picture to streamline the process and reduce or even eliminate the manual steps. Once installed, you open the Typemonkey interface module from the Window menu. Set the starting font from After Effects’ normal text control pane, paste or type your text into the Typemonkey window and press the “Do it!” button. At this point Typemonkey operates as a macro to automatically build the layers, the moves and the 3D camera animation. The final result is a timeline that shows the 3D camera layer, with all of the word layers shied (hidden from the layer list). Moves from word to word are evenly spaced across the length of the composition or selected work area, with markers at each change. This builds a very nice kinetic text composition in a matter of seconds.

Naturally, most editors and designers will want to customize the defaults, so that every composition isn’t identical. This can be achieved through both the Typemonkey pane and AE’s standard layer effects. Sliding the markers in the composition timeline will change the animation pacing of the 3D camera’s move from word to word. This lets you hold longer on some words and move more quickly through others.

The controls within the Typemonkey pane let you adjust some of the move styles and interpolations. You can also set up a series of colors, so that each word changes color as it cycles through the five palette choices. Through adjustments at both locations, designers can get quite a large range of variations from this single tool. The actual effects are performed using After Effects expressions, rather than keyframes, so you cannot easily make individual changes to the internal moves. However, you can certainly add your own keyframed transform effects on top of what Typemonkey creates.
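The five-color cycling can be modeled as a simple wrap-around lookup – the hex values here are arbitrary placeholders, not Typemonkey’s actual defaults:

```javascript
// Sketch: cycle each word through a five-color palette. After the
// fifth word, the palette wraps back to the first color.
const palette = ["#ff3b30", "#ff9500", "#ffcc00", "#4cd964", "#007aff"];

function wordColor(index) {
  return palette[index % palette.length]; // wrap-around lookup
}

// wordColor(2) -> "#ffcc00" (third word gets the third color)
// wordColor(5) -> "#ff3b30" (sixth word reuses the first color)
```

In an expression-driven setup like Typemonkey’s, something equivalent to this modulo lookup is presumably evaluated per layer, which is why every word gets a predictable color without any per-word keyframing.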

Typemonkey is a low cost tool that will pay for itself in the time saved on a single job. Obviously its use is specific to kinetic text creative treatments, but used sparingly and with taste, it’s a look that will bring your motion graphics up a notch.

©2013 Oliver Peters

Understanding Premiere Pro Transitions


Switchers from Apple Final Cut Pro to Adobe Premiere Pro might miss the wealth of inexpensive transition effects offered by third-party and hobbyist plug-in developers. Native Premiere Pro transitions, like dissolves and wipes, can be applied just like in FCP. Drop the transition on a cut and you are done. Unfortunately third-party transitions don’t work this way, leading some users to conclude that they just don’t work or that Premiere Pro is less versatile.

(EDIT: This changed somewhat a day ago, when Noise Industries released FxFactory 4.1.1. Their transitions now are drag-and-drop enabled, just like Adobe’s default transitions. For other filters, like Sapphire Edge transitions, they must still be applied as I outline in the rest of this post.)

The confusion arises because Premiere Pro filters are based on an architecture similar to that of After Effects. Therefore, applying third-party transitions in Premiere Pro needs to be done in much the same manner as in After Effects. Instead of creating a transition between two adjacent clips on the same video track, third-party transitions work by creating a transition between clips on adjacent vertical tracks. In other words, not from A to B on V1, but rather A on V1 to B on V2 or the other way around.

Here are some basic tips to make Premiere Pro’s transitions work for you. (Click on any image for an expanded view.)

Start by moving your B clip up one video track, such as from V1 to V2 or V2 to V3. The new Option + Up Arrow command works well in Premiere Pro CC. Extend the end of the A clip, the start of the B clip, or both. This should create an overlap of the two clips equal to the intended duration of the transition. Use the blade tool to add a cut on the B clip (on the higher track) at the end of the overlap.

Access your transition from the transitions group of that filter family. This will be contained within the main Video Effects folder, not the main Video Transitions folder. Drag-and-drop a third-party transition effect to the overlapping portion of the B clip.

Open the Effect Controls for that filter and set the background selection and transition direction. Set beginning and ending keyframes, or set it to use or ignore the percentage value. Typically a transition goes from 0% to 100% over the length of the clip to which it is applied. Adjust the filter controls as needed. The example that I’ve shown is a Lens Flare Dissolve from the Sapphire Edge transitions collection. With this effect, you can tweak some parameters in the Effect Controls window, but you can also pick from a wide range of presets using the Sapphire Edge presets browser. Something worth noting is that the unrendered, real-time performance of this effect is somewhat slow in FCP X, but plays very well in Premiere Pro CC.
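That 0% to 100% keyframe ramp is just linear interpolation over the overlap. A minimal sketch – a hypothetical helper for illustration, not part of Premiere Pro’s API:

```javascript
// Sketch: compute a third-party transition's percentage value at a
// given timeline time. 0% at the start of the overlap, 100% at its
// end, clamped outside that range.
function transitionPercent(time, overlapStart, overlapDuration) {
  const t = (time - overlapStart) / overlapDuration;
  return Math.min(100, Math.max(0, t * 100)); // clamp to 0-100
}

// A 1-second overlap starting at 5s: halfway through, the wipe is at 50%.
transitionPercent(5.5, 5, 1); // -> 50
```

Setting the two keyframes in the Effect Controls panel amounts to defining exactly this ramp, which is why the transition completes precisely over the length of the bladed overlap section.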

Although these steps might feel cumbersome to some users when compared with FCP’s drag-and-drop approach, they are more or less the same as in After Effects. They also offer a greater level of control than in some simpler transition implementations.


(UPDATE: If you are running an older version of FxFactory, there have been conflicts with SpeedGrade CC. Please download the FxFactory 4.1.1 update from the Noise Industries website to correct this.)

©2013 Oliver Peters