Film editing stages – Sound

Like picture editing, the completion of sound for a film also goes through a series of component parts. These normally start after “picture lock” and are performed by a team of sound editors and mixers. On small, indie films, a single sound designer/editor/mixer might cover all of these roles. On larger films, specific tasks are covered by different individuals. Depending on whether it’s one individual or a team, sound post can take anywhere from four weeks to several months to complete.

Location mixing – During original production, the recording of live sound is handled by the location mixer. This is considered mixing, because originally, multiple mics were mixed “on-the-fly” to a single mono or stereo recording device. In modern films with digital location recordings, the mixer tends to record what is really only a mixed reference track for the editors, while simultaneously recording separate tracks of each isolated microphone to be used in the actual post production mix.

ADR – automatic dialogue replacement or “looping”. ADR is the recording of replacement dialogue in sync with the picture. The actors do this while watching their performance on screen. Sometimes this is done during production and sometimes during post. ADR will be used when location audio has technical flaws. Sometimes ADR is also used to record additional dialogue – for instance, when an actor has his or her back turned. ADR can also be used to record “sanitized” dialogue to remove profanity.

Walla or “group loop” – Additional audio is recorded for groups of people. This is usually for background sounds, like guests in a restaurant. The term “walla” comes from the fact that actors were (and often still are) instructed to say “walla, walla, walla” instead of real dialogue. The point is to create a sound effect of a crowd murmuring, without any recognizable dialogue line being heard. You don’t want anything distinctive to stand out above the murmur, other than the lead actors’ dialogue lines.

Dialogue editing – When the film editor (i.e. the picture editor) hands over the locked cut to the sound editors, it generally will include all properly edited dialogue for the scenes. However, this is not prepared for mixing. The dialogue editor will take this cut and break out all individual mic tracks. They will make sure all director’s cues are removed and they will often add room tone and ambience to smooth out the recording. In addition, specific actor mics will be grouped to common tracks so that it is easier to mix and apply specific processing, as needed, for any given character.

Sound effects editing/sound design – Sound effects for a film come from a variety of sources, including live recordings, sound effects libraries and sound synthesizers. Putting this all together is the role of the sound effects editor(s). Because many have elevated the art by creating very specific senses of place, the term “sound designer” has come into vogue. For example, the villain’s lair might always feature certain sounds that are identifiable with that character – e.g. dripping water, rats squeaking, a distant clock chiming, etc. These become thematic, just like a character’s musical theme. The sound effects editors are the ones that record, find and place such sound effects.

Foley – Foley is the art of live sound effects recording. This is often done by a two-person team consisting of a recordist and a Foley walker, who is the artist physically performing these sounds. It literally IS a performance, because the walker does this in sync to the picture. Examples of Foley include footsteps, clothes rustling, punches in a fight scene and so on. It is usually faster and more appropriate-sounding to record live sound effects than to use library cues from a CD.

In addition to standard sound effects, additional Foley is recorded for international mixes. When an actor delivers a dialogue line over a sound recorded as part of a scene – a door closing or a cup being set on a table – that sound will naturally be removed when English dialogue is replaced by foreign dialogue in international versions of the film. Therefore, additional sound effects are recorded to fill in these gaps. Having a proper international mix (often called “fully filled”) is usually a deliverable requirement by any distributor.

Music – In an ideal film scenario, a composer creates all the music for a film. He or she is working in parallel with the sound and dialogue editors. Music is usually divided between source cues (e.g. the background songs playing from a jukebox at a bar) and musical score.

Recorded songs may also be used as score elements during montages. Sometimes different musicians, other than the composer, will create songs for source cues or for use in the score. Alternatively, the producers may license affordable recordings from unsigned artists. Rarely is recognizable popular music used, unless the production has a huge budget. It is important that the producers, composer and sound editors communicate with each other, to define whether items like songs are to be treated as a musical element or as a background sound effect.

The best situation is when an experienced film composer delivers all completed music that is timed and synced to picture. The composer may deliver the score in submixed, musical stems (rhythm instruments separated from lead instruments, for instance) for greater control in the mix. However, sometimes it isn’t possible for the composer to provide a finished, ready-to-mix score. In that case, a music editor may get involved, in order to edit and position music to picture as if it were the score.

Laugh tracks – This is usually a part of sitcom TV production and not feature films. When laugh tracks are added, the laughs are usually placed by sound effects editors who specialize in adding laughs. The appropriate laugh tracks are kept separate so they can be added or removed in the final mix and/or as part of any deliverables.

Re-recording mix – Since location recording is called location mixing, the final, post production mix is called a re-recording mix. This is the point at which divergent sound elements – dialogue, ADR, sound effects, Foley and music – all meet and are mixed in sync to the final picture. On a large film, these various elements can easily take up 150 or more tracks and require two or three mixers to man the console. With the introduction of automated systems and the ability to completely mix “in the box”, using a DAW like Pro Tools, smaller films may be mixed by one or two mixers. Typically the lead mixer handles the dialogue tracks and the second and third mixers control sound effects and music. Mixing most feature films takes one to two weeks, plus the time to output various deliverable versions (stereo, surround, international, etc.).

The deliverable requirements for most TV shows and features are to create a so-called composite mix (in several variations), along with separate stems for dialogue, sound effects and music. A stem is a submix of just a group of component items, such as a stereo stem for only dialogue. The combination of the stems should equal the mix. By having stems available, the distributors can easily create foreign versions and trailers.
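To make the stem/mix relationship concrete, here is a minimal Python sketch (not from the original post) of the “stems should equal the mix” check, assuming the stems and the composite were exported at unity gain and identical lengths; the function name, tolerance and synthetic audio are placeholders.

```python
import numpy as np

def stems_match_mix(dialogue, effects, music, composite, tolerance=1e-3):
    """Verify that the dialogue, effects and music stems sum back to the
    composite mix. All inputs are float arrays of identical shape
    (samples x channels), already loaded from the delivered files."""
    summed = dialogue + effects + music
    # Largest sample-level deviation between the summed stems and the mix.
    max_error = float(np.max(np.abs(summed - composite)))
    return max_error <= tolerance, max_error

# Demo with synthetic stereo "stems" (one second at 48 kHz).
rng = np.random.default_rng(0)
dlg, fx, mus = (rng.standard_normal((48000, 2)) * 0.1 for _ in range(3))
print(stems_match_mix(dlg, fx, mus, dlg + fx + mus))  # (True, 0.0)
```

In practice a QC pass like this is done by ear and by meter on the dub stage, but the arithmetic is the same: if the stems don’t null against the composite, something was left out of a stem.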

©2013 Oliver Peters

Film editing stages – Picture


While budding filmmakers have a good idea of what happens during the production phase of shooting a film, most have little idea about what happens in post. Both picture and sound go through lengthy and separate editorial processes. This often becomes a rude awakening for new directors when it comes to the time and budget requirements. These are the basic steps every modern film goes through in getting to the finish line.

First cut – This stage goes by many names – first cut, first assembly, editor’s cut, etc. In general, this is the first version of the fully-assembled film, including all the scenes edited according to the script. Depending on the editor and the post schedule, this cut may be very rough – or it might be a reasonably polished edit. If the editing happens concurrently with the studio and location filming, then often there will be a “first assembly” and a subsequent “editor’s cut”. The former is a quick turnaround version, so that everyone can make sure the coverage is adequate. The latter is a more refined version.

Some productions employ an on-set editor who is the person generating this “first assembly”. That editor is then often replaced by the main film editor, who starts after all production is completed. In that situation, the “editor’s cut” might be completely different in style, pace and technique from the first version. No matter how you get there, the intent of this step is to properly represent the intention of the script without concern for length or solving any content or script challenges.

Director’s cut – Once the editor has completed the first cut of the film, then the director steps in. He or she works with the editor to complete the cut of the film. Directors often deviate from the written scene. Sometimes this is sufficiently communicated to the editor to show up that way in the first cut. Sometimes it isn’t, because it lives in the director’s mind as the production proceeds. During the “director’s cut” phase, the director and editor work closely to adjust the cut to reflect the director’s vision.

Many directors and editors repeatedly work together on films and form a partnership of sorts. In these situations, the editor has a good idea of what the director wants and often the director only needs to give notes and review the cut periodically. Other directors like to be very “hands on” and will work closely with the editor, reviewing every take and making adjustments as needed.

Depending on the film and whether or not the director is DGA (Directors Guild), this stage will take a minimum of 20 days (DGA low budget) or 10 weeks (DGA standard) or longer. The goal is for the director and editor to come up with the best film possible, without interference from outside parties, including the producers. At this point, the film may go through severe changes, including shortening, losing and/or re-arranging scenes and even the addition of new content, like insert shots and new voice-over recordings.

Producer’s cut – After the director has a shot at the film, now it’s time to make adjustments according to studio notes, producer comments and feedback from official and unofficial test screenings. If the director hasn’t yet brought the film into line – both story-wise and length-wise – now is the time to do that. Typically most indie films are targeted at the 90-100 minute range. If your first cut or director’s cut is 120 minutes or longer, then it’s going to have to be cut down by a significant amount.

Typically you can shorten a film by 10% through trimming and shortening scenes. A reduction of 25% or more means that shots and whole scenes have to go. This can be a painful experience for the director, who has suffered through the agony, time and expense of getting these scenes and shots recorded. The editor, on the other hand, has no such emotional investment and can be more objective. Whichever way the process moves forward, the point is to get the cut to its final form.

Depending on the production, this version of the film might also include temporary sound effects, music and visual effects that have been added by the editor and/or assistants. Often this is needed to fully appreciate the film when showing it in test screenings.

Locked picture – The goal of these various editing steps is to explore all creative options in order to end up with a film that will not go through any further editing changes. This means no revisions that change timing or shot selection. The reason for a “locked picture” is so that the sound editing team and the visual effects designers can proceed with their work without the fear that changes will undo some of their efforts. Although large budget films have the luxury of making editorial changes after this point, it is unrealistic for smaller indie films. “Locking the cut” is absolutely essential if you want to get the best effort out of the post team, as well as stay within your budget.

Visual effects – If your film requires any visual effects shots, these are best tackled after picture lock. The editors will hand off the required source elements to the visual effects company or designers so they can do their thing. Editors are typically not involved in visual effects creation, other than to communicate the intent of any temp effects that have been created and to make sure the completed VFX shots integrate properly back into the picture.

Sound editorial - This will be covered in depth in the next blog post. It has its own set of steps and usually takes several weeks to several months to complete.

Conform and grade – Prior to this step, all editing has been performed with “proxy” media. During the “finishing” stage of the film, the original camera media is “conformed” to the locked cut that was handed over from the film editor. This conform step is typically run by an online editor who works in tandem with the colorist. Sometimes this is performed by the colorist and not a separate individual. On very low budget films, the film editor, online editor and colorist might all be the same person. During conforming, the objective is to frame-accurately re-create the edit, including all reframing and speed ramps, and to integrate all final visual effects shots. From this point the film goes to color correction for final grading. Here the colorist matches all shots to establish visual consistency, as well as to add any subjective looks requested by the director or director of photography. The last process is to marry the sound mix back to the picture and then generate the various deliverable masters.

©2013 Oliver Peters

Anatomy of editing a two camera scene


With the increase in shooting ratios and shortened production schedules, many directors turn to shooting their project with two cameras for the entire time. Since REDs and Canon HDSLRs are bountiful and reasonably priced to buy or rent, even a low budget indie film can take advantage of this. Let me say from the beginning that I’m not a big fan of shooting with two cameras. Too many directors view it as a way to get through their shooting schedule more quickly; but, in fact, they often shoot more footage than needed. Often the B-camera coverage is only 25% useful, because the scene was not properly blocked or lit for it. However, there are situations where shooting with two cameras works out quite well. The technique is at its most useful when shooting a dramatic dialogue scene with two or three principal actors.

Synchronization

The most critical aspect is maintaining proper sync with audio and between the two cameras. In an ideal world, this is achieved with matching timecode among the cameras and the external sound recorder. Reality often throws a curve ball, which means that more often than not, timecode drifts throughout the day, the cameras weren’t properly jam-synced, or some other issue crops up. The bottom line is that by the time the footage gets to the editor, you often cannot rely on timecode for all elements to be in sync. That’s why “old school” techniques like a slate with a clapstick are ESSENTIAL. This means roll all three devices and slate both cameras. If you have to move to stand in front of the B-camera for a separate B-camera slate and clap, then you MUST do it.

When this gets to post, the editor or assistant first needs to sync audio and video for both the A-camera and B-camera for every take. If your external sound recorder saved broadcast WAV files, then usually you’ll have one track with the main mix and additional tracks for each isolated microphone used on set. Ideally, the location mixer will have also fed reference audio to both cameras. This means you now have three ways to sync – timecode, slate/clapstick and/or common audio. If the timecode does match, most NLEs have a syncing function to create merged clips with the combined camera file and external audio recording. FCP X can also sync by matching audio waveforms (if reference audio is present on the camera files). For external syncing, there’s Sync-N-Link and Sync-N-Link X (matching timecode) and PluralEyes (matching audio).

These are all great shortcuts, but there are times when none of the automatic solutions work. That’s when the assistant or editor has to manually mark the visual clap on the camera files and the audio spike of the clap on the sound file and sync the two elements based on these references. FCP X offers an additional feature, which is the ability to open a master clip in its own timeline (“open in timeline” command). You can then edit directly “inside” the master clip. This is useful with external audio, because you have now embedded the external audio tracks inside the master clip for that camera file and they travel together from then on. This has an advantage over FCP X’s usual synchronized clip method, in that it retains the camera source’s timecode. Synchronized clips reset the timecode of that clip to 00:00:00:00.
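As a rough illustration of the waveform-matching approach mentioned above, here is a minimal Python sketch (not part of the original post) that estimates the offset between a camera’s scratch audio and the external recorder’s track by cross-correlation. Both tracks are assumed to already be loaded as mono arrays at the same sample rate; the function name and synthetic test signal are hypothetical.

```python
import numpy as np
from scipy.signal import fftconvolve

def estimate_offset(camera_audio, recorder_audio, sample_rate=48000):
    """Return the shift (in seconds) to apply to the recorder track so it
    lines up with the camera scratch track. A negative value means the
    recorder clip should be slid earlier on the timeline."""
    # Cross-correlate via FFT; reversing one signal turns convolution
    # into correlation.
    corr = fftconvolve(camera_audio, recorder_audio[::-1], mode="full")
    lag = np.argmax(corr) - (len(recorder_audio) - 1)
    return lag / sample_rate

# Synthetic demo: the "recorder" track carries the same content as the
# camera track, delayed by half a second.
sr = 48000
cam = np.random.default_rng(1).standard_normal(sr * 10)
rec = np.concatenate([np.zeros(sr // 2), cam])[: sr * 10]
print(estimate_offset(cam, rec, sr))  # roughly -0.5: slide the recorder clip earlier
```

Tools like PluralEyes and FCP X’s audio sync add a lot of robustness on top of this basic idea (filtering, windowing, handling drift), but the underlying principle of lining up the two waveforms is the same.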

Directing strategy

In standard film productions, a scene will be shot multiple times – first a “master” and then various alternate angles; sometimes alternative line readings; as well as pick-ups for part of a scene or cutaways and inserts showing items around the set. The “master” of the scene gets a scene number designation, such as Scene 101, Take 1, Take 2, etc. Whenever the camera is reframed or repositioned – or alternative dialogue is introduced – those recordings get a letter suffix, such as 101A, 101B and so on. With two cameras, there’s also the A and B camera designation, which is usually part of the actual camera file name or embedded metadata.

In blocking a simple dialogue scene with two actors, the director would set up the master with a wide shot for the entire scene on the A-camera and maybe a medium on the lead actor within that scene on the B-camera. The B-cam may be positioned next to A-cam or on the opposite side (without crossing the line). That’s Scene 101 and typically, two or three takes will be recorded.

Next, the director will set up two opposing OTS (over the shoulder) angles of the two speaking actors for 101A. After that, opposing CU (close-up) angles for 101B. Often there’s a third set-up (101C) for additional items. For example, if the scene takes place in a bar, there may be extra coverage that sets up the environment, such as patrons at the bar in the background of the scene. In this example with four setups (101-101C) – assuming the director rolled for three takes on each set-up – coverage with two cameras automatically gives you 24 clips (4 setups x 3 takes x 2 cameras) to choose from in editing this scene.

Editing strategy

When you mention two camera coverage, many will think of multi-cam editing routines. I never use that for this purpose, because for me, an A-cam or B-cam angle of the same take is still like a uniquely separate take. However, I do find that editing the first swipe at the scene works best when you work with A-cam and B-cam grouped together. Although a director might pick a certain take as his best or “circle” take, I make the assumption that all takes have some value for individual lines. I might start with the circle take of the scene’s master, but I usually end up editing in bits and pieces of other takes, as well. The following method works best when the actors stick largely to the script, with minimal ad libs and improvisation.

Step one is to edit the A-cam circle take of the scene master to the timeline, complete with slate. Next, edit the matching B-cam clip on top, using the slate’s clap to match the two angles. (Timecode also works, of course, if A-cam and B-cam have matching timecode.) The exact way I do this varies with the NLE that I am using. In FCP X, the B-cam clip is a connected clip, while in FCP 7, Media Composer and Premiere Pro, the B-cam is on V2 and the accompanying audio is on the tracks below those from the A-cam clip. The point is to have both angles stacked and in sync. Lastly, I’ll resize the B-cam clip so I see it as a PIP (picture-in-picture effect) over the A-cam image. Now, I can play through this scene and see what each camera angle of the master offers.

Step two is to do the first editing pass on the scene. I use the blade tool (or add edit) to cut across all tracks/layers/clips at each edit point. Obviously, I’ll add a cut at the start of the action so I can remove the slate and run-up to the actual start of the scene. As I play through, I am making edit selections, as if I were switching cameras. The audio is edited as well – often in the middle of a line or even word. This is fine. Once these edits are done, I will delete the front and back of these takes. Then I will select all of the upper B-cam shots (plus audio) that I don’t want to use and delete these. Finally, remove the transform effects to restore the remaining B-cam clips to full screen.

At this stage I will usually move the B-cam clips down to the main track. In FCP X, I use the “overwrite to primary storyline” command to edit the B-cam clips (with audio) onto the storyline, thus replacing the A-cam clip segments that were there. This will cause the embedded external audio from the overwritten A-cam clip segments to be pushed down as connected clips. Highlight and delete this. In a track-based NLE, I may leave the B-cam clips on V2 or overwrite to V1. I’ll also highlight and delete/lift the unwanted, duplicate A-cam audio. In all cases, what you want to end up with is a scene edit that checkerboards the A-cam and B-cam clips – audio and video.

Step three is to find other coverage. So far in this example, I’ve only used the circle take for the master of the scene. As I play the scene, I will want to replace certain line readings with better takes from the other coverage (e.g. 101A, B, C), including OTS and CU shots. One reason is to use the best acting performance. Another is to balance the emotion of the scene and create the best arc. Typically in a dramatic scene, the emotion rises as you get later into the scene. To emphasize this visually, I want to use tighter shots as I get further into the scene – focusing mainly on eyes and facial expressions.

I work through the scene, seeking to replace some of the master A-cam or B-cam clip segments. I will mark the timeline section to delete/extract, find a better version of that line in another angle/take and insert/splice it into that position. FCP X has a replace function which is designed for this, but I find it to be slow and inconsistent. A fast keystroke combo of marking clips and timeline and then pressing delete, followed by insert is significantly faster. Regardless of the specific keystrokes used, the point is to build/control the emotion of the scene in ways that improve the drama and combine the best performances of each actor.

Step four is to tighten the scene. At this point, you are primarily working in the trim mode of your NLE. With FCP X, expand the audio/video so you can independently trim both elements for J and L-cuts. As you begin, you’ll have some sloppy edits. Words may be slightly clipped or the cadence of speech doesn’t sound right. You now have to fix this by trimming clips and adjusting audio and video edit points. FCP X is especially productive here, because the waveform display makes it easy to see where the same words from adjacent clips align.

You want the scene to flow – dramatically and technically. How natural-sounding is the delivered dialogue as a result of your edit choices? You should also be mindful of continuity, such as actors’ eye lines, body positions and actions. Often actors will add dramatic pauses, long stares and verbal stumbles to the performance. This may be for valid dramatic emphasis; but, it can also be over-acting or even the verbal equivalent of saying “um” as the actor tries to remember the next line. Your job as an editor (in support of the director) is to decide which it is and edit the scene so that it comes across in the best way possible. You can cut out this “air” by trimming the edits and/or by using tricks. For example, slip part of a line out of sync and play it under an OTS or reaction shot to tighten a pause.

Step five is to embellish the scene. This is where you add non-sync reactions, inserts and cutaways. The goal is to enhance the scene by giving it a sense of place, cover mismatched continuity and to improve drama. Your elements are the extra coverage around the set (like our patrons at the bar) or an actor’s nod, smile, head turn or grimace in response to the dialogue delivered by their acting counterpart. The goal is to maintain the viewer’s emotional involvement and help tell the story through visuals other than simply seeing a person talk. You want the viewer to follow both sides of the conversation through dialogue and visual cues.

While I have written this post with film and television drama in mind, the same techniques apply, whether it’s a comedy, documentary or simple corporate interview. It’s a strategy for getting the most efficiency out of your edit.

Click here for more film editing tips.

©2013 Oliver Peters

Film Budgeting Basics

New filmmakers tackling their first indie feature will obviously ask, “What is this film going to cost to produce?” The answer to this – like many of these questions – is, “It depends.” The cost of making a film is directly related to the resources needed and the time required for each resource. That often has little to do with the time involved in actually filming the scenes.

A friend of mine, after directing his first feature, was fond of saying, “The total time of saying the words ‘roll, action, cut, print’ was probably less than an hour; but, it took me two years prior to that to have the privilege.” Cost is almost never related to return. I’ve often told budding filmmakers to consider long and hard what they are doing. They could instead take the same amount of money and throw themselves the biggest party of their life. After all the effort of making the film, you might actually have more to show for it from the party. Film returns tend to follow other media success percentages, where typically 15% are successful and 85% fail (or at least don’t make a financial return). Understanding how to maximize the value on the screen is integral to budgeting a feature film.

I often work in the realm of indie features, which includes dramatic productions and documentaries. Each of these two categories tends to break into cost tiers like these:

Dramatic films

$0 – $50,000

$200,000

$500,000

$1,000,000-$2,000,000

Over $2,000,000

Documentaries

$0 – $30,000

$50,000

$300,000-$1,500,000

Over $1,500,000

Money is always tight within these ranges. Once you get over $2,000,000, you tend to have a bit more breathing room and the ability to tackle issues by adding more resources to the equation. Production is related to time and that varies greatly between scripted films and documentaries, where the story is often evolving over time and out of the director’s control. Here is a typical rule-of-thumb timeline for the production of each.

Dramatic films – timeline

1 year to secure rights and funding

2 months of casting, scouting, preparation

1 month readying actual production logistics

2-5 weeks of production (stage and location)

8-20 weeks of picture editorial

8-20 weeks sound editorial and scoring (usually starts after picture is “locked”)

1-2 weeks of picture finish/conform/grade

1-2 weeks of audio mix (re-recording mix)

1 week to finalize all deliverables

Documentaries – timeline

The timeframe up to the start of editorial differs with every project and is an unknown.

8-60 weeks of picture editorial

8-20 weeks sound editorial and scoring (usually starts after picture is “locked”)

1-2 weeks of picture finish/conform/grade

1-2 weeks of audio mix (re-recording mix)

1 week to finalize all deliverables

__________________________________________________________

Clearly any of these categories can take longer, but in the indie/low-budget field, indecision and letting things drag out will destroy the viability of the project. You don’t have the luxury of studio film timeframes. This is where a savvy line producer, unit manager and production manager (often the same person on small films) can make or break the budget. Here are some cost variables to consider.

Cost variables that need to be evaluated and balanced

Union versus non-union.

More days of shooting versus fewer, but longer days, with overtime pay.

The size of the cast and the experience level of the actors.

Allotting adequate (non-filmed) rehearsal time.

The number of script pages (a shorter script means a less costly production).

Accurate timing of scene descriptions to determine how much production time is required for each scene.

The number of locations and location changes/distances.

Period drama versus a contemporary story.

Stage and sets versus shooting at real locations.

The number of make-up and wardrobe changes.

A production location with local crews and facilities versus bringing in resources from the outside.

Film versus digital photography.

The number of cameras.

The amount of gear (dollies, cranes, etc.).

Cost-saving tips

Investigate opportunities to partner with regional film schools.

Using a director of photography who is his own camera operator and who can supply his own cameras and lenses.

Using a location mixer with his own gear.

Using an editor with his own gear.

Eliminate the need for an elaborate “video village” and possibly reduce the need for a DIT (if you have savvy camera assistants).

Negotiate lower equipment rental costs based on fewer days per week.

Negotiate local resources for food, lodging, travel and craft services.

Explore alternatives to stages, such as empty warehouses.

Explore unsigned local musical artists for songs, scores, etc.

Hold one or more days of production in reserve (to fix “gaps” discovered during editing), in order to shoot inserts, B-roll, transitional shots, the opening title, etc.

Errors that will drive up cost

The film is too short or too long (ideal is a first cut that’s about 10% longer than target, so it can be trimmed back).

Unforeseen or poorly executed visual effects.

Judgment calls made on location to “save” time/effort on a rushed day.

Allowing the actors too much freedom to ad lib and improvise, as well as play with props.

Indecision in the edit.

Changing the edit after the cut is “locked”.

Using stock images or popular music without making provisions in advance for clearance and budgeting.

Cost-saving items that AREN’T

Failing to shoot a complete master shot as part of the coverage on complex scenes.

Using two or more cameras throughout the entire production.

Letting actors ad lib in lieu of adequate rehearsal.

Not hiring a script supervisor/continuity person.

Using blue/green-screen effects for driving shots.

Relying on low-light cameras instead of proper lighting.

Extensive use of the “video village” on set.

Limiting the amount of footage sent to the editors (send them everything, not only “circle takes”).

Short-changing the importance of the role of the data wrangler.

Not allowing adequate time or resources for proper data management.

__________________________________________________________

For reference, I put together two sample budgets a year ago, as part of a presentation at Digital Video Expo in Pasadena. They’re available for download here in Numbers, Excel and PDF versions. Feel free to manipulate the spreadsheets for your own production to see how they stack up. I break down a film/DI and a digital photography budget. As you can see, going with 35mm film adds about $175K more to the budget, largely due to stock, processing and DI costs. In a major studio feature, the difference in formats is inconsequential, but not in the million dollar indie range. I have not included a “film-out”, which will add $75-$200K.

The budget I developed, with the help of a number of experienced unit managers, represents a fairly typical, non-union, indie film. It includes most of the cost for crew, cast, production and post, but does not include such items as the cost of the script, props, sets, production office rentals, hotels, insurance, creative fees and others. As a rule-of-thumb, I’ve factored gear and stage rentals as 3-day weeks. This means you get seven days of use, but are only charged for three. In the past year, I’ve heard rates as low as 1.5-day weeks, but I don’t think you can plan on that being the norm. A 3-day or 4-day week is customary.
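To see what the rental-week convention means in dollars, here is a small Python sketch; the $750 daily rate and the five-week shoot are hypothetical figures for illustration, not numbers from the sample budgets.

```python
def rental_cost(daily_rate, weeks, billed_days_per_week=3):
    """Cost of a rental package over a number of 7-day weeks when the vendor
    only bills a fixed number of days per week (the "3-day week" convention)."""
    return daily_rate * billed_days_per_week * weeks

# Hypothetical camera package at $750/day over a 5-week shoot:
for billed_days in (1.5, 3, 4):
    print(f"{billed_days}-day week: ${rental_cost(750, 5, billed_days):,.0f}")
# 1.5-day week: $5,625 / 3-day week: $11,250 / 4-day week: $15,000
```

The spread between a 1.5-day and a 4-day week on a single line item shows why negotiating the billing convention can matter as much as negotiating the daily rate itself.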

Many states offer film production incentives, designed to entice producers to shoot a project in that state. Often local investment money and economic incentives will attract producers to a particular locale. That’s great if the state has good local crew and production resources, but if not, then you’ll have to bring in more from the outside. This adds cost for travel and lodging, some of which an enterprising producer can negotiate for trade in the form of a credit on the film. There’s no guarantee of that, though, and as it’s such a variable, this is a cost item that must be evaluated with each individual production.

Remember that post production work has to occur in some physical place. Audio post is typically done in a studio owned or rented by the audio engineer. That’s not the case for editors. If you hire a freelance film editor, you will also need to factor in the cost of the editing system, as well as a rental office in which to house the operation. Some editors can supply that as a package deal and others don’t.

Naturally, a savvy line producer can find ways to bring this budget even lower. I work a lot with the Valencia College Film Technology Program in Orlando. Over the years they have partnered with many producers to complete Hollywood-grade features. I’m not talking student films, but rather name directors and actors working alongside students and working pros to put out films destined for theatrical distribution. The films produced there often place a level of production value on the screen that’s as much as twice the actual out-of-pocket cost of production and post. All thanks to the resources and services the program has to offer.

__________________________________________________________

Most new producers have a good handle on the production phase, but post is a total black hole. As a consequence, post often gets short-changed in the budgeting process. Unfortunately, some producers try to figure out their post production costs at the point when everything is in the can, but almost all of the money has been spent. That’s in spite of the fact that post generally takes much more time than the period allotted to location and stage photography. In order to properly understand the post side of things, here are the workflows for four finishing scenarios.

Film – traditional post

Shoot on location with film – 1,000ft. of 35mm = about 10 minutes of unedited footage.

Process the negative at the lab and do a “best light” transfer to videotape or a hard drive.

The assistant editor loads and logs footage and syncs double-system audio.

The editor cuts a first cut, then the director’s cut and then the final version.

The sound team edits dialogue, ADR and sound effects (also temp music at times).

The composer writes and records the score (often in a parallel track to the above).

Sound is mixed in a re-recording session.

The editorial team generates a cut list for the negative cutter.

The negative cutter conforms the negative (physical splices).

All visual effects are added as optical effects.

Lab color timing is performed and answer prints are generated for review.

Film deliverables are generated.

Film – DI (digital intermediate) post

Shoot on location with film – 1,000ft. of 35mm = about 10 minutes of unedited footage.

Process the negative at the lab and do a “best light” transfer to videotape or a hard drive.

The assistant editor loads and logs footage and syncs double-system audio.

The editor cuts a first cut, then the director’s cut and then the final version.

The sound team edits dialogue, ADR and sound effects (also temp music at times).

The composer writes and records the score (often in a parallel track to the above).

Sound is mixed in a re-recording session.

The editorial team generates edit lists for the finishing house.

Selected shots are retransferred (or scanned), conformed and graded.

Visual effects are inserted during the conform/grade.

Digital and/or film deliverables are generated.

Digital production – camera raw photography

Shoot on location with a digital camera that records in a raw file format to a card or hard drive.

The footage is converted into a viewable form for the editors.

The assistant editor loads and logs footage and syncs double-system audio.

The editor cuts a first cut, then the director’s cut and then the final version.

The sound team edits dialogue, ADR and sound effects (also temp music at times).

The composer writes and records the score (often in a parallel track to the above).

Sound is mixed in a re-recording session.

The editorial team generates edit lists for the finishing house.

Camera raw files are conformed and color graded in a process similar to a DI.

Visual effects are inserted during the conform/grade.

Digital and/or film deliverables are generated.

Digital production – tape or file-based (not raw) photography

Shoot on location with a digital camera and recorded to tape or as files to a card or hard drive.

The assistant editor loads and logs footage and syncs double-system audio.

The editor cuts a first cut, then the director’s cut and then the final version.

The sound team edits dialogue, ADR and sound effects (also temp music at times).

The composer writes and records the score (often in a parallel track to the above).

Sound is mixed in a re-recording session.

The editorial team generates edit lists for the finishing house.

Camera files are conformed and color graded.

Visual effects are inserted during the conform/grade.

In some cases, the editing format and system are of final quality, so the same editor can do both the creative edit and the finishing.

Digital and/or film deliverables are generated.

As these workflows show, a lot goes into post beyond simply editing and mixing the film. These elements take time and determine the level of polish you present to your audience. The sample budgets I’ve compiled aren’t intended to cause sticker shock. It’s clear that getting the tally to $1 Million doesn’t take very much and that’s a pretty realistic range for a small film. Granted, I’ve worked on films done for $150,000 that looked like a lot more, but it takes a lot of work to get there, and often means leaning hard on the good graces of the crew and resources you use.

For comparison, here’s an example at The Smoking Gun that’s purported to be the working budget for M. Night Shyamalan’s The Village under the working title of The Woods. It doesn’t really matter whether it is or it isn’t the actual budget. The numbers are in line with this type of studio film, which makes it a good exercise in seeing how one can spend $70 Million on a film.

Whether you play in the studio or the independent film arena, it’s important to understand how to translate the vision of the script in a way that correlates to time and money. Once that becomes second nature, you are on your way to becoming a producer that puts the most production value on the screen for the audiences to appreciate.

©2012 Oliver Peters

Documentary Editing Tips

Of the many projects I work on, documentaries and documentary-style productions are my favorite. I find these often more entertaining and certainly more enlightening than many dramatic features and shows. It’s hard to beat reality. Documentaries present challenges for the editor, but in no other form does the editor play more of a role in shaping the final outcome. Many of them truly typify an editor’s function as the “writer” through shot selection and construction.

Structure and style

There are different ways you can build a documentary, but in the end, the objective is to end up with a film that tells an engaging story in such a way that the audience comprehends it. Structurally a documentary tends to take one of these forms:

-       Interview sound bites completely tell the story

-       The “voice of God” narrator guides you through

-       The “slice of life” story, where the viewer is a hidden observer

-       Re-enactments of events through acted scenes or readings, a la The Civil War or The Blues

-       The filmmaker as a first person guide, such as Werner Herzog

Sometimes, the best approach is a combination of all of these. You may set out to have the complete story told only through assembled sound bites, yet find that the story is never fully fleshed out. In that case, pieces of scripted narration will help clarify the story and bind disparate elements and thoughts together.

Story arc and character

The persons on screen are real, but to the audience they are no less characters in a film than a role performed by a dramatic actor. As an editor, the way you select sound bites and put them together – and the order in which these are presented throughout the film – establish not only a story arc, but also perceived heroes and villains in the minds of the audience. Viewers want a film with a logical start, building tension and ultimate resolution. Even when there is no happy ending, the editor should strive to build a story that leaves the audience with some answers or conclusion.

Remember to balance out your characters. In many interview-based stories, the same questions are posed to the various interviewees as the interviews are conducted. This is helpful to the editor, because you can balance out the different on-camera appearances by mixing up whose response you choose to use. That way, the same subject isn’t always the go-to person and you aren’t heavy with any single person. Sometimes it’s best to have one person start a thought or a statement and then conclude with another, assuming the two segments are complementary.

Objectivity

This is one of the myths taught in some film and journalism schools. The truth is that almost every documentary (and many news stories) is approached from the point-of-view and biases of the writer, producer, director and editor. You can try to portray all sides fairly, but the choice of who is interviewed or which bites are selected reflects an often subconscious bias of the person making that decision. It can also appear lopsided simply based on which subjects decided to participate.

Sometimes the effects are subtle and harmless, as in reality TV shows, where the aim is to tell the most entertaining story. In the other extreme, it can become borderline propaganda for the agenda of the filmmaker. I’m not telling you what type of film to make – just to be aware of the inevitable. If there’s a subjective point-of-view, then don’t try to hide it. Rather, make it clearly a personal statement so the audience isn’t tricked into believing the filmmakers gave a fair shake to all sides.

The art of the interview

 If your documentary tale is built out of interview clips, then a lot of your time as an editor will go into organizing the material and playing with story structure. That is, editing and re-arranging sound bites in a way to tell a complete story without the need for a narrator. Often this requires that you assemble sound bites in a way that’s quite different from the way they were recorded in linear time.

Enter the “Frankenbite”. That’s a term editors apply to two types of sound bite construction: a) splicing together parts of two or more sound bite snippets to create a new, concise statement; or b) editing a word or phrase from another part of the interview to get the right inflection, such as making a statement sound like the end of a sentence, when in fact the original part was really in mid-thought.

Personally I have no problem with any of this, but draw the line at dishonesty. It’s very important to listen to the interviews in their entirety and make sure that the elements you are splicing together aren’t taken out of context. You don’t want to create the impression that what is being said is the exact opposite of what the speaker meant to say. The point of this slicing is to collapse time and get the point across succinctly without presenting a full and possibly rambling answer. Be true to the intent and you’ll be fine.

Typically such edits are covered by cutaway shots to hide the jump cut, though some directors stylistically prefer to show the jump cut that such edits produce. This can give a certain interesting rhythm to the cut that might not otherwise be there. It also clearly tells the audience that an edit was made. It’s a stylistic approach, so pick a path and stick with it.

The beauty of the HDSLR revolution brought about by Canon is that it’s easier (and cheaper) than ever to field two-camera shoots. This is especially useful for documentary interviews. Often directors will set up two 5D or 7D cameras – one facing the subject and the other at an angle. This gives the editor two camera angles to cut with and it’s often possible to assemble edited sound bites using cuts between the two cameras at these edit points. This lets you splice together thoughts and still appear like a live switch in a TV show – totally seamless without an obvious jump cut. I’ve been able to build short shows this way working 100% from the interviews without a single cutaway shot and still have the end result appear to the audience as completely contiguous and coherent.

Mine the unrehearsed responses. Naturally that depends on the talent of the interviewer and how much he or she can get out of the interviewee. The best interviewers will warm up their subject first, go through the pro forma questions and then circle back for more genuine answers, once the interviewee is less nervous with the process. This is usually where you’ll get the better responses, so often the first half of the recording tends to be less useful. If the interviewer asks at the end, “Is there anything else you’d like to add?” – that’s where you frequently get the best answers, especially if the subject is someone who is interviewed a lot. Those folks are used to giving stock answers to all the standard questions. If their answers can be more freeform, then you’ll tend to get more unique and thoughtful points-of-view.

Organizing non-timecoded source material

 Archival footage frequently used in documentaries comes from a variety of sources, such as old home movies (on various film formats), VHS tapes and more. Before you ever start editing from these, they should be transferred with the best possible quality to a mastering format, such as Digital Betacam (for NTSC or PAL), HDCAM/HDCAM-SR (for HD) or high-quality QuickTime files (DNxHD, ProRes or uncompressed).

The point is to get these to a format that can be organized and tracked through the stages of the edit. This usually means a format that allows timecode, reel numbers or other file name coding, so that material is easy to find even if the project takes years to complete. Remember that timecode and a 4-digit reel (or source) number lets you find any single frame within 10,000 hours of footage. To make this material easier to use during the offline editing stage of the project, you may elect to make low-cost/low-res copies for editing. For example, DVCAM if on tape or ProRes Proxy or DNxHD 36 for files. Doing so means that timecode and source/reel info MUST correspond perfectly between the low-res and hi-res versions.
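To illustrate why a reel number plus timecode uniquely addresses a frame, here is a small Python sketch (not from the original post); the 24 fps, non-drop-frame assumption and the function name are mine.

```python
def absolute_frame(reel, timecode, fps=24):
    """Map a 4-digit reel number plus an HH:MM:SS:FF timecode to one unique
    frame index, assuming non-drop-frame code and a full 24-hour address
    space per reel."""
    hh, mm, ss, ff = (int(part) for part in timecode.split(":"))
    frames_into_reel = ((hh * 60 + mm) * 60 + ss) * fps + ff
    frames_per_reel = 24 * 60 * 60 * fps  # one day of timecode per reel
    return reel * frames_per_reel + frames_into_reel

print(absolute_frame(42, "01:03:10:12"))  # a single, unambiguous frame ID
```

With sources running an hour or less per reel, 10,000 reel numbers give you the roughly 10,000 hours of uniquely addressable footage mentioned above, which is why consistent reel naming and timecode matter so much on long-running documentary projects.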

Your still photo strategy

Photography and artwork are the visual lifeblood of documentaries that lack supporting film or video content. Ken Burns has elevated the technique of camera moves on still images to an art form. Clearly he’s a filmmaker known to the general public as much for this effect, which Apple branded with his name, as for his award-winning films. Yet, the technique clearly predates him and has gone by many terms over the years. A company I once worked for frequently called it “pictography”. Regardless of origin – the use of stills requires two elements: organization and motion.

There are numerous photo and still image organizing and manipulation applications, including Adobe Lightroom, Bridge, Apple iPhoto and Aperture. Each of these provides a method to catalog, rate and sort the photos. You’ll need an application with a good manipulation toolset to properly crop, color correct and/or fix damaged images. Lightroom is my personal preference, but they all get the job done.

Moves on stills can be accomplished in several ways: animated moves in software, a computer-assisted, motion control camera stand or simply a human operator doing real camera moves. Often the last method is the simplest, fastest and best looking. If that’s your choice, print large versions of the stills, put them on an easel and set up a video camera. Then record a variety of moves at different speeds, which will become source “video” for your edit session.

Another popular method is to separate components of the image into Photoshop layers. Then bring these into After Effects and design perspective moves in which the foreground elements move or grow at a different rate than the background layer. This method was popularized in The Kid Stays in the Picture. The trick to pulling this off successfully is that the Photoshop artist must fill in the background layer to replace the portion cut out for the foreground person or object. Otherwise you see a repeated section of the foreground image or possibly the cut-out area.
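For readers who want the geometry behind that parallax trick, here is a minimal Python sketch (not an After Effects script; the function name, depths and camera values are hypothetical) of the pinhole-camera math that explains why the cut-out foreground layer must grow and slide faster than the background plate.

```python
def layer_transform(depth, camera_push, camera_pan, focal=1.0):
    """Pinhole-projection sketch: how much a flat layer at a given depth
    scales and slides when a virtual camera pushes in and pans sideways."""
    new_depth = depth - camera_push
    scale = depth / new_depth                # closer layers grow faster
    shift = -focal * camera_pan / new_depth  # and slide opposite the pan, faster
    return scale, shift

# Foreground subject cut out at depth 2, background plate at depth 10:
for name, z in (("foreground", 2.0), ("background", 10.0)):
    s, dx = layer_transform(z, camera_push=0.5, camera_pan=0.2)
    print(name, round(s, 3), round(dx, 3))
# foreground scales ~1.333, background only ~1.053 for the same camera move
```

The difference in scale and drift between the two layers is exactly the gap that the Photoshop artist has to fill behind the foreground cut-out, as described above.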

Edit system organization

 There are plenty of tools at your disposal, regardless of whether you prefer Avid, FCP 7, FCP X or something else. If this project takes several years with several editors and a potpourri of formats, then Media Composer is a good bet; however, Final Cut also has its share of fans among documentary editors. Make liberal use of subclips and markers to keep yourself straight. Tools like Boris Soundbite (formerly Get) and Avid ScriptSync and PhraseFind are essential to the editors who embrace them.

I tend to not use transcripts as the basis for my edits. Nevertheless, having an electronic and/or paper transcript of interviews available to you (with general timecode locations) makes it easy to find alternatives. That can be as simple as having a copy open in Word on the same computer and using the Find function. My point is that modern tools make it very easy to tackle a wealth of content without getting buried by the footage.

The value of the finishing process

I feel that even more so than on dramatic features, documentaries benefit from high-quality finishing services. These range from simple online editing to format conversion to color grading. Since original sources often vary so widely in quality, it’s important to get the polish that a trained online/finishing editor and/or colorist can provide. Same for audio. Use the services of talented sound designers, editors and mixers to bring the mix up a notch. Nothing screams “bad” like a substandard soundtrack, no matter how striking the images are.

Clearances

It is important for the editor to keep track of the sources and usage for stock images and music. These aren’t free. Many documentary producers seem to feel they can “sweet-talk” the rights holder into donating content out of a sense of interest or altruism. That’s almost never successful. So understand the licensing issues and be wary of using images and music – even on a temporary basis – that you know will be hard to clear or too expensive to purchase.

Make sure that you have an adequate system for tracking and reporting the use of stock material, so that it can be properly bought and cleared when the film is being finished. During the rough cut, stock footage and images will usually be low-res versions with a “burn-in” or watermark. When the time comes to purchase the final high-res images, most companies require that you request the exact range of the material used based on timecode. That material will be provided as files or on tape, but there’s no guarantee that the timecode will match. Be prepared to eye-match each shot if that’s the case.

©2011 Oliver Peters