NLEs at NAB 2015

NAB is the biggest toy store in our industry. As in years past, I’ve covered it for DV magazine, where you’ll find the expanded version. The following is the segment covering the four – soon to become five – most popular NLE vendors.

Editing options largely focused on the four “A” companies – Apple, Adobe, Avid and Autodesk. Apple wasn’t officially at the show, but held private press meetings at an area hotel. Consulting company FCPworks presented a series of workflow and case study sessions at the Renaissance Hotel next door to the South Hall. This coincided with Apple’s release of updated versions of Final Cut Pro X, Motion and Compressor. FCP X 10.2 includes a number of enhancements, but the most buzz went to the addition of a new 3D text engine in FCP X and Motion. Apple’s implementation is one of the easiest to use and best-looking in any application, and performance is excellent. Two other big features fall more in line with user wish lists: built-in masking and the conversion of the color correction tool into a standard effect filter. Compressor has added a preset designed for iTunes submission. Although Apple still encourages users to go to iTunes through an approved third-party portal, this new preset makes it easier to create the proper file package necessary for delivery.

Adobe has the momentum as the up-and-coming professional editing tool. At NAB, Adobe showed technology previews of application features that will be released as part of a Creative Cloud subscription in the coming months. Premiere Pro CC now integrates more of SpeedGrade CC’s color correction capabilities through the addition of the Lumetri Color panel. This tabbed control integrates tools familiar from SpeedGrade, but also from Lightroom. Since Premiere already includes built-in masking and tracking, editors are capable of doing very sophisticated color correction right inside of Premiere. Morph Cut is a new effect that everyone cutting interviews will love. It is designed to smooth over jump cuts in a seamless manner, using advanced tracking and frame interpolation to build new “in-between” frames. After Effects adds an outstanding face tracker and improved previews: you can view design iterations, adjust composition properties, and even resize interface panels without halting composition playback. The face tracker locks onto specific points (pupils, mouth, nose), which enables accurate tracking when elements need to be composited onto an actor’s face.

Adobe is also good for out-of-the-box thinking on new technologies. Character Animator was demonstrated as a live animation tool. Using real-time facial tracking, such as from a laptop’s webcam, an animator can drive live keyframed animation of an on-screen cartoon character. You import a cartoon character as a layered Photoshop file as the starting point. When you move and talk, so does the character in real time – all controlled by the tracking. Not only can you add the real-time animation, but certain animation functions are automatically applied, like a character’s breathing motion. Another interesting tool is Candy. This is a mobile app which analyzes the tonal color scheme of photos stored on your mobile device. It creates a “look” file and stores it in your Creative Cloud library. This, in turn, can be synced with your copy of Premiere Pro CC and then applied as a color correction look to any video clip.

Avid ran the second annual Avid Connect event for members of their customer association on the weekend leading up to the NAB exhibition. Although this was the first show appearance of Media Composer 8.3.1 – Avid’s first move into true 4K editing – they did very little to promote it. That’s not to say there wasn’t any news. Several new products were announced, including the Avid Artist | DNxIO. Instead of developing their own 4K hardware, Avid opted to partner with Blackmagic Design. The DNxIO is essentially the same as the UltraStudio 4K Extreme, except with Avid’s DNxHR codec embedded into the unit. Only Avid will sell the Avid-branded version, and Avid will also provide its technical support. The DNxIO supports both PCIe and Thunderbolt host connections and can also be used with Adobe Premiere Pro CC, Apple FCP X and DaVinci Resolve running on the same workstation as Media Composer.

In an effort to attract new users to Media Composer, Avid also announced Media Composer | First. This is a free version with a reduced feature set. It’s intended as functional starter software from which users will hopefully transition to the full, paid application. However, it uses a “freemium” sales model, allowing users to extend functionality through add-on purchases. For example, Media Composer | First permits users to only store three active projects in the cloud. Additional storage for more projects can be purchased from Avid.

Autodesk’s NAB news was all about the 2016 versions of Flame, Maya and 3ds Max. Flame and Flame Premium customers gain new look development tools, including Lightbox – a GPU shader toolkit for 3D color correction – and Matchbox in the Action module. This applies fast Matchbox shaders to texture maps without leaving the 3D compositing scene. Maya 2016 received performance and ease-of-use enhancements. There are also new capabilities in Bifrost to help deliver realistic liquid simulations. 3ds Max 2016 gains a new, node-based creation graph, a new design workspace and template system, as well as other design enhancements. If you’ve been following Smoke, then this NAB was disappointing. Autodesk told me that an update is in the works, but development timing didn’t allow it to be ready in time for the show. I would presume we’ll hear something at IBC.

For editors, all eyes are on Blackmagic Design. DaVinci Resolve 12 was demonstrated, which is the first version that the company feels can compete as a full-fledged NLE. Last NAB, Resolve 11 was introduced as an online editor, but once it was out in the wild, most users found the real-time performance wasn’t up to par with other NLEs. Resolve 12 appears to have licked that issue, with a new audio engine and improved editing features. New in Resolve 12 is a multi-camera editing mode with the ability to sync angles by audio, timecode or in/out points. The new, high-performance audio engine was designed to greatly improve real-time playback, and it also supports VST and AU audio plug-ins. Editors will also be able to export projects to Pro Tools using AAF. Don’t forget that there are also updates to its color correction functions. Aside from interface and control enhancements, the most notable additions are a new keyer and a new perspective tracker. The latter will allow users to better track objects that move off-screen during a clip. Resolve 12 is scheduled to be released in July. Blackmagic acquired Fusion last year – a node-based compositing application that has historically run only on Windows. At the booth, Blackmagic previewed Fusion 8 on the Mac and announced that it will be available for Windows, Mac and Linux. Like Resolve, Fusion 8 will be offered in both a free and a paid version.

This post is an abbreviated overview written for CreativePlanetNetwork and Digital Video magazine. Click here for the full-length version to find out about more post news, as well as cameras, effects and other items presented at NAB.

©2015 Oliver Peters

Tips for Production Success – Part 2

Picking up from my last post (part 1), here are 10 more tips to help you plan for a successful production.

Create a plan and work it. Being a successful filmmaker – that is, making a living at it – is more than just producing a single film. Such projects almost never go beyond the festival circuit, even if you do think it is the “great American film”. An indie producer may work on a project for about four years, from the time they start planning and raising the funds – through production and post – until real distribution starts. Therefore, the better approach is to start small and work your way up. Start with a manageable project or film with a modest budget and then get it done on time and on budget. If that’s a success, then start the next one – a bit bigger and more ambitious. If it works, rinse and repeat. If you can make that work, then you can call yourself a filmmaker.

Budget. I have a whole post on this subject, but in a nutshell, an indie film that doesn’t involve union talent or big special effects will likely cost close to one million dollars, all in. You can certainly get by on less. I’ve cut films that were produced for under $150,000 and one even under $50,000, but that means calling in a lot of favors and having many folks working for free or on deferment. You can pull that off one time, but it’s not a way to build a business, because you can’t go back to those same resources and ask to do it a second time. Learn how to raise the money to do it right and proceed from there.

Contingencies at the end. Intelligent budgeting means leaving a bit for the end. A number of films that I’ve cut had to do reshoots or spend extra days to shoot more inserts, establishing shots, etc. Plan for this to happen and make sure you’ve protected these items in the budget. You’ll need them.

Own vs. rent. Some producers see their film projects as a way to buy gear. That may or may not make sense. If you need a camera and can otherwise make money with it, then buy it. Or if you can buy it, use it, and then resell it to come out ahead – by all means follow that path. But if gear ownership is not your thing and if you have no other production plans for the gear after that one project, then it will most likely be a better deal to work out rentals. After all, you’re still going to need a lot of extras to round out the package.

Shooting ratios. In the early 90s I worked on the post of five half-hour and hourlong episodic TV series that were shot on 35mm film. Back then shooting ratios were pretty tight. A half-hour episode is about 20-22 minutes of content, excluding commercials, bumpers, open, and credits. An hourlong episode is about 44-46 minutes of program content. Depending on the production, these were shot in three to five days and exposed between 36,000 and 50,000 feet of negative. Therefore, a typical day meant 50-60 minutes of transferred “dailies” to edit from – or no more than five hours of source footage, depending on the series. This would put them close to the ideal mark (on average) of approximately a 10:1 shooting ratio.

Today, digital cameras make life easier and with the propensity to shoot two or more cameras on a regular basis, this means the same projects today might have conservatively generated more than 10 hours of source footage for each episode. This impacts post tremendously – especially if deadline is a factor. As a new producer, you should strive to control these ratios and stay within the goal of a 10:1 ratio (or lower).
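The arithmetic behind those film-era numbers is simple to sketch. Here is a rough Python illustration, using the standard rule of thumb that 35mm 4-perf film runs at 90 feet per minute at 24 fps; the footage and runtime figures are illustrative examples, not data from any specific series:

```python
# Rough shooting-ratio math for 35mm 4-perf film at 24 fps.
# 16 frames per foot at 24 fps works out to 90 feet per minute.

FEET_PER_MINUTE = 90  # 35mm 4-perf at 24 fps

def footage_minutes(feet):
    """Convert exposed film footage to minutes of material."""
    return feet / FEET_PER_MINUTE

def shooting_ratio(source_minutes, program_minutes):
    """Minutes of source footage shot per finished program minute."""
    return source_minutes / program_minutes

source = footage_minutes(45000)      # 45,000 ft of negative -> 500 min
ratio = shooting_ratio(source, 45)   # hourlong episode: ~45 min of content
print(f"{source:.0f} min of dailies, ratio of about {ratio:.1f}:1")
```

Run with the numbers above, that mid-range example lands right around the 10:1 mark the text describes.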

Block and rehearse. The more a scene is buttoned down, the fewer takes you’ll need, which leads to a tighter shooting ratio. This means rehearse a scene and make sure the camera work is properly blocked. Don’t wing it! Once everything is ready, shoot it. Odds are you’ll get it in two to three takes instead of the five or more that might otherwise be required.

Control the actors. Unless there’s a valid reason to let your actors improvise, make sure the acting is consistent. That is, lines are read in the same order each take, props are handled at the same point, and actors consistently hit their marks each take. If you stray from that discipline, the editorial time becomes longer. If allowed to engage in too much freewheeling improvisation, actors may inadvertently paint you into a corner. To avoid that outcome, control it from the start.

Visual effects planning. Most films don’t require special effects, but there are often “invisible” fixes that can be created through visual effects. For example, combining elements of two takes or adding items to a set. A recent romantic drama I post-supervised used 76 effects shots of one type or another. If this is something that helps the project, make sure to plan for it from the outset. Adobe After Effects is the ubiquitous tool that makes such effects affordable. The results are great and there are plenty of talented designers who can assist you within almost any budget range.

Multiple cameras vs. single camera vs. 4K. Some producers like the idea of shooting interviews (especially two-shots) in 4K (for a 1080 finish) and then slice out the frame they want. I contend that often 4K presents focus issues, due to the larger sensors used in these cameras. In addition, the optics of slicing a region out of a 4K image are different than using another camera or zooming in to reframe the shot. As a result, the look that you get isn’t “quite right”. Naturally, it also adds one more component that the editor has to deal with – reframing each and every shot.

Conversely, when shooting a locked-off interview with one person on-camera, using two cameras makes the edit ideal. One camera might be placed face-on towards the speaker and the other from a side angle. This makes cutting between the camera angles visually more exciting and makes editing without visible jump cuts easier.

In dramatic productions, many new directors want to emulate the “big boys” and also shoot with two or more cameras for every scene. Unfortunately this isn’t always productive, because the lighting is compromised, one camera is often in an awkward position with poor framing, or even worse, often the main camera blocks the secondary camera. At best, you might get 25% usability out of this second camera. A better plan is to shoot in a traditional single-camera style. Move the camera around for different angles. Tweak the lighting to optimize the look and run the scene again for that view.

The script is too long. An indie film script is generally around 100 pages with 95-120 scenes. The film gets shot in 20-30 days and takes about 10-15 weeks to edit. If your script is inordinately long and takes many more days to shoot, then it will also take many more days to edit. The result will usually be a cut that is too long. The acceptable “standard” for most films is 90-100 minutes. If you clock in at three hours, then obviously a lot of slashing has to occur. You can lose 10-15% (maybe) by trimming the fat, but a reduction of 25-40% (or more) means you are cutting meat and bone. Scenes have to be lost, the story has to be re-arranged, or even more drastic solutions found. A careful reading of the script, conceived of as the finished film, can head off issues before production ever starts. Losing a scene before you shoot it saves time and money on a large scale. So analyze your script carefully.
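To put rough numbers on that, here is a sketch using the common "one script page equals about one screen minute" approximation – a rule of thumb that varies with dialogue and action density, so treat the output as a ballpark, not a promise:

```python
# Ballpark runtime math for an indie script.

PAGE_TO_MINUTES = 1.0  # common approximation; varies with dialogue density

def estimated_runtime(pages):
    """Estimate screen minutes from script page count."""
    return pages * PAGE_TO_MINUTES

def trim_needed(cut_minutes, target_minutes):
    """Fraction of the current cut that must be lost to hit the target."""
    return 1 - target_minutes / cut_minutes

print(estimated_runtime(100))           # a 100-page script: ~100 minutes
print(round(trim_needed(180, 100), 2))  # a 3-hour cut: ~44% must come out
```

A 100-page script lands naturally in the 90-100 minute zone, while a script that yields a three-hour cut forces the kind of 40%-plus reduction that cuts into meat and bone.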

Click here for Part 1.

©2015 Oliver Peters

Tips for Production Success – Part 1

Throughout this blog, I’ve written numerous tips about how to produce projects, notably indie features, with a successful outcome in mind. I’ve tried to educate on issues of budget and schedule. In these next two entries, I’d like to tackle 21 tips that will make your productions go more smoothly, finish on time, and not become a disaster during the post production phase. Although I’ve framed the discussion around indie features, the same tips apply to commercials, music videos, corporate presentations, and videos for the web.

Avoid white. Modern digital cameras handle white elements within a shot much better than in the past, but hitting a white shirt with a lot of light complicates your life when it comes to grading and directing the eye of the viewer. This is largely an issue of art direction and wardrobe. The best way to handle this is simply to replace whites with off-whites, bone or beige colors. The sitcom Barney Miller, which earned DP George Spiro Dibie recognition for getting artful looks out of his video cameras, is said to have had the white shirts washed in coffee to darken them a bit. Once the cameras were set up, the shirts still read as white on screen. The objective in all of this is to get the overall brightness into a range that is more controllable during color correction and to avoid clipping.

Expose to the right. When you look at a signal on a histogram, the brightest part is on the righthand side of the scale. By pushing your camera’s exposure towards a brighter, slightly over-exposed image (“to the right”), you’ll end up with a better-looking image after grading (color correction). That’s because when you have to brighten an image by bringing up highlights or midtones, you are accentuating the sensor noise from the camera. If the image is already brighter and the correction is to lower the levels, then you end up with a cleaner final image. Since most modern digital cameras use some sort of log or hyper gamma encoding to record a flatter signal, which preserves latitude, opening up the exposure usually won’t run the risk of clipping the highlights. In the end, a look that stretches the shadows and mids to expose more detail to the eye gives you a more pleasing and informative image than one that places emphasis on the highlight portion.

Blue vs. green-screen. Productions almost ubiquitously use green paint, but that’s wrong. Each paint color has a different luminance value. Green is brighter and should be reserved for composites where the talent is supposed to appear outdoors. Blue works best when the composited scene is indoors. Paint matters. The correct choice is still the proper version of Ultimatte blue or green paint, but many people try to cut corners on cost. I’ve even had producers go so far as to rig up a silk with a blue lighting wash and expect me to key it! When you light the subject, move them as far away from the wall as possible to avoid contamination of the color onto their hair and wardrobe. This also means: don’t have your talent stand on a green or blue floor when you aren’t intending to see the floor or to frame them from head to toe.

Rim lighting. Images stand out best when your talent has some rim lighting to separate them from the background. Even in a dark environment, seek to create a lighting scheme that achieves this rimming effect around their head and shoulders.

Tonal art direction. The various “blockbuster” looks are popular – particularly the “orange and teal” look. This style pushes skin tones warm for a slight orange appearance, while many darker background elements pick up green/blue/teal/cyan casts. Although this can be accentuated in grading, it starts with proper art direction in the set design and costuming. Whatever tonal characteristic you want to achieve, start by looking at the art direction and controlling this from step one.

Rec. 709 vs. Log. Digital cameras have nearly all adopted some method of recording an image with a flat gamma profile that is intended to preserve latitude until final grading. This doesn’t mean you have to use this mode. If you have control over your exposure and lighting, there’s nothing wrong with recording Rec. 709 and nailing the final look in-camera. I highly recommend this for “talking head” interviews, especially ones shot on green or blue-screen.

Microphone direction/placement. Every budding recording engineer working in music and film production learns that proper mic placement is critical to good sound. Pay attention to where mics are positioned, relative to where the person is when they speak. For example, if you have two people in an interview situation wearing lavaliere mics on their lapels, the proper placement would be on each person’s inner lapel – the side closer to the other person. That’s because each person will turn towards the other to address them as they speak and thus talk over that shoulder. Having the mic on this side means they are speaking into the mic. If it were on the outer lapel, they would be speaking away from the mic and the audio would tend to sound hollow. For the same reasons, when you use a boom or fishpole overhead mic, the operator needs to point the mic in the direction of the person talking. They will need to shift the mic’s direction as the conversation moves from one person to the next to follow the sound.

Multiple microphones/iso mics. When recording dialogue for a group of actors, it’s best to record their audio with individual microphones (lavs or overhead booms) and to record each mic on an isolated track. Cameras typically feature on-board recording of two to four audio channels, so if you have more mics than that, use an external multi-channel recorder. When external recording is used, be sure to still record a composite track to your camera for reference.

Microphone types. There are plenty of styles and types of microphones, but the important factors are size, tonal quality, range, and the axis of pick-up. Make sure you select the appropriate mic for the task. For example, if you are recording an actor with a deep bass voice using a lavaliere, you’d be best to use a type that gives you a full spectrum recording, rather than one that favors only the low end.

Sound sync. There are plenty of ways to sync sound to picture in double-system sound situations. Synchronizing by matched timecode is the most ideal, but even there, issues can arise. Ensure that the camera’s and sound recorder’s timecode generators don’t drift during the day – or use a single, common, external timecode generator for both. It’s generally best to also include a clapboard and, when possible, also record reference audio to the camera. If you plan to sync by audio waveforms (PluralEyes, FCP X, Premiere Pro CC), then make sure the reference signal on the camera is of sufficient quality to make synchronization possible.

Record wild lines on set. When location audio is difficult to understand, ADR (automatic dialogue replacement, aka “looping”) is required. This happens because the location recording was not of high quality due to outside factors, like special effects, background noise, etc. Not all actors are good at ADR and it’s not uncommon to watch a scene with ADR dialogue and have it jump out at you as the viewer. Since ADR requires extra recording time with the actor, this drives up cost on small films. One workaround in some of these situations is for the production team to recapture the lines separately – immediately after the scene was shot – if the schedule permits. These lines would be recorded wild and may or may not be in sync. The intent is to get the right sonic environment and emotion while you are still there on site. Since these situations are often fast-paced action scenes, sync might not have to be perfect. If close enough, the sound editors can edit the lines into place with an acceptable level of sync so that viewers won’t notice any issues. When it works, it saves ADR time down the road and sounds more realistic.

Click here for Part 2.

©2015 Oliver Peters

More Life for your Mac Pro

I work a lot with a local college’s film production technology program as an advisor, editing instructor and occasionally as an editor on some of their professional productions. It’s a unique program designed to teach hands-on, below-the-line filmmaking skills. The gear has to be current and competitive, because they frequently partner with outside producers to turn out actual (not student) products with a combination of professional and student crews. The department has five Mac Pros that are used for editing, which I’ve recently upgraded to current standards, as they get ready for a new incoming class. The process has given me some thoughts about how to get more life out of your aging Apple Mac Pro towers, which I’ll share here.

To upgrade or not

Most Apple fans drool at the new Mac Pro “tube” computers, but for many, such a purchase simply isn’t viable. Maybe it’s the cost or the need for existing peripherals or other concerns, but many editors are still opting to get as much life as possible out of their existing Mac Pro towers.

In the case of the department, four of the machines are fast 2010 quad-cores and the fifth is a late 2008 eight-core. As long as your machine is an Intel of late 2008 or newer vintage, then generally it’s upgradeable to the most current software. Early 2008 and older is really pushing it. Anything before 2009 probably shouldn’t be used as a primary workhorse system. At 2009, you are on the cusp of whether it’s worth upgrading or not. 2010 and newer would be definitely solid enough to get a few more productive years out of the machine.

The four 2010 Mac Pros are installed in rooms designated as cutting rooms. The 2008 Mac was actually set aside and largely unused, so it had the oldest configuration and software. I decided it needed an upgrade, too, although mainly as an overflow unit. This incoming class is larger than normal, so I felt that having a fifth machine might be useful, since it still could be upgraded.

Software

All five machines have largely been given the same complement of software, which means Mavericks (10.9.4) and various editing tools. The first trick is getting the OS updated, since the oldest machines were running on versions that cannot be updated via the Mac App Store. Secondly, this kind of update really works best when you do a clean install. To get the Mavericks installer, you have to download it to a machine that can access the App Store. Once you’ve done the download, but BEFORE you actually start the installation, quit out of the installer. This leaves you with the Install Mavericks application in your applications folder. This is a 4GB installer file that you can now copy to other drives.

In doing the updates, I found it best to move drives around in the drive bays, putting a blank drive in bay 1 and moving the existing boot drive to bay 2. Format the bay 1 drive and copy the Mavericks installer to it. Run the installer, but select the correct target drive, which should be your new, empty bay 1 drive and NOT the current boot drive that’s running. Once the installation is complete, set up a new user account and migrate your applications from the old boot drive to the new boot drive. I do this without the rest (no documents or preferences). Since these systems didn’t have purchased third-party plug-ins, there weren’t any authorization issues after the migration. My reason for migrating the existing apps was that some of the software, like volume-licensed versions of Microsoft Office and Apple Final Cut Studio were there and I didn’t want to track down the installers again from IT. Naturally before doing this I had already uninstalled junk, like old trial versions or other software a student might have installed in the past. Any needed documents had already been separately backed up.

Once I’m running 10.9.4 on the new boot drive, I access the App Store, sign in with the proper ID and install all the App Store purchases. Since the school has a new volume license for Adobe Creative Cloud, I also have an installer from IT to cover the Adobe apps. Once the software dance is done, my complement includes:

Apple Final Cut Studio “legacy” (FCP 7, DVD Studio Pro, Cinema Tools, Soundtrack Pro, Compressor, Motion, Color)

Apple Final Cut Pro X “new” applications and utilities (FCP X, Motion, Compressor, Xto7, 7toX, Sync-N-Link X, EDL-X, X2Pro)

Adobe Creative Cloud 2014 (Prelude, Premiere Pro, SpeedGrade, Adobe Media Encoder, Illustrator, Photoshop, After Effects, Audition)

Avid Media Composer and Sorenson Squeeze (2 machines only)

Blackmagic Design DaVinci Resolve 11

Miscellaneous applications (Toast Titanium, Handbrake, MPEG Streamclip, Pages, Numbers, Keynote, Word, Excel, Redcine-X Pro)

Internal hard drives

All Mac Pro towers support four internal drives. Last year I had upgraded two of these machines with 500GB Crucial SSDs as their boot drive. While these are nice and fast, I opted to stick with spinning drives for everything else. The performance demand on these systems is not such that there’s really a major advantage over a good mechanical drive. For the most part, all machines now have four internal 1TB Western Digital Black 7200 RPM drives. The exceptions are the two machines with 500GB SSD boot drives and the 2008 Mac, which has two 500GB drives that it came supplied with.

After rearranging the drives, the configuration is: bay 1 – boot drive, bay 2 – “Media A”, bay 3 – “Media B” and bay 4 – Time Machine back-up. The Media A and B drives are used for project files, short-term media storage, and stock sound effects and music. When these systems were first purchased, I had configured the three drives in the 2, 3 and 4 slots as a single 3TB volume, striped together as a software RAID-0. This was used as a common media drive on each of the computers. However, over this last year, one of the machines appeared to have an underperforming drive within the stripe, which was causing all sorts of media problems on that machine. Since this posed the risk of potentially losing 3TB worth of media in the future on any of the Macs, I decided to rethink the approach and split all the drives back into single volumes. I replaced the underperforming drive and changed all the machines to this four-volume configuration, without any internal stripes.

RAM and video cards

The 2010 machines originally came with ATI 5870 video cards and the 2008 an older NVIDIA card. In the course of the past year, one of the 5870 cards died and was replaced with a Sapphire 7950. In revitalizing the 2008 Mac, I decided to put one of the other 5870s into it and then replace it in the 2010 machine with another Sapphire. While the NVIDIA GTX 680 card is also a highly-regarded option, I decided to stick with the ATI/AMD card family for consistency throughout the units. One unit also includes a RED Rocket card for accelerated transcoding of RED .r3d files.

The 2010 machines have all been bumped up to 32GB of RAM (Crucial or Other World Computing). The 2008 uses an earlier vintage of RAM and originally only had 2GB installed. The App Store won’t even let you download FCP X with 2GB. It’s been bumped up to 16GB, which will be more than enough for an overflow unit.

Of these cutting rooms, only one is designed as “higher end” and that’s where most of the professional projects are cut, when the department is directly involved in post. It includes Panasonic HD plasma and Sony SD CRT monitors that are fed by an AJA KONA LHi card. This room was originally configured as an Avid Xpress Meridien-based room back in the SD days, so there are also Digibeta, DVCAM and DAT decks. These still work fine, but are largely unused, as most of the workflow now is file-based (usually RED or Canon).

In order to run Resolve with an external monitor, you need a Blackmagic Design Decklink card. I had temporarily installed a loaner in place of the KONA, but it died, so the KONA went back in. Unfortunately, with the KONA and FCP X, I cannot see video on both the Panasonic and Sony at the same time with 1080p/23.98 projects. That’s because of the limitations of what the Panasonic will accept over HDMI, coupled with the secondary processing options of the KONA. The HDMI signal needs to be progressive (P) rather than segmented frame (PsF), and this results in the conflict. In the future, we’ll probably revisit the Decklink card issue, budget permitting, possibly moving the KONA to another bay.

All four 2010 units are equipped with two 27” Apple Cinema Displays, so the rooms without external monitoring simply use one of the screens to display a large viewer in most of the software. This is more than adequate in a small cutting room. The fifth 2008 Mac has dual 20” ACDs. Although my personal preference is to work with something smaller than dual 27” screens – as the lateral distance is too great – a lot of modern software feels very crowded on smaller screens, such as the 20” ACDs. This is especially true of Resolve 11, which feels best with two 27” screens. Personally, I would have opted for dual 23” or 24” HPs or Dells, but these systems were all purchased this way and there’s no real reason to change.

External storage

Storage on these units has always been local, so in addition to the internal drives, they are also equipped with external storage. Typically users are encouraged to supply their own external drives for short edits, but storage is made available for extended projects. The main room is equipped with a large MAXX Digital array connected via an ATTO card. The four 2010 rooms each gained a LaCie 4Big 12TB array last year. These were connected on one of the FireWire 800 ports and initially configured as RAID-1 (mirror), so only half the capacity was available.

This year I reconfigured/reformatted them as RAID-5, which nets a bit over 8TB of actual capacity. To increase the data throughput, I also added CalDigit FASTA-6GU3 cards to each. This is a PCIe combo host adapter card that provides two USB 3.0 and two SATA ports. By connecting the LaCie to each of the Macs via USB 3.0, it improves the read/write speeds compared to FireWire 800. While it’s not as fast as Thunderbolt or even the MAXX array, the LaCies on USB 3.0 easily handle ProRes 1080p files and even limited use of native RED files within projects.
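The RAID-1 to RAID-5 capacity tradeoff is easy to sanity-check. Here’s a minimal sketch, assuming the 4Big is built from four 3 TB mechanisms (the disk count and sizes are an assumption for illustration):

```python
# Approximate usable capacity for the RAID levels discussed above.
# Assumes a 4-disk array of 3 TB (decimal) drives, like a LaCie 4Big 12TB.

def usable_tb(disks, disk_tb, level):
    """Approximate usable capacity (decimal TB) for a simple array."""
    if level == "raid0":              # striping only, no redundancy
        return disks * disk_tb
    if level == "raid1":              # mirroring: half the raw space
        return disks * disk_tb / 2
    if level == "raid5":              # one disk's worth of parity
        return (disks - 1) * disk_tb
    raise ValueError(f"unsupported level: {level}")

print(usable_tb(4, 3, "raid1"))   # 6.0 -- the as-shipped mirror config
print(usable_tb(4, 3, "raid5"))   # 9 -- raw; TB-vs-TiB accounting and
                                  # filesystem overhead bring the
                                  # formatted figure down to "a bit
                                  # over 8TB" as reported
```

The 9 TB (decimal) RAID-5 figure works out to roughly 8.2 TiB, which matches the “bit over 8TB” the operating system reports after formatting.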

Other

A few other enhancements were made to round out the rooms as cutting bays. First, audio. The main room uses the KONA’s analog audio outputs routed through a small Mackie mixer to supply volume to the speakers. To provide similar capabilities in the other rooms, I added a PreSonus AudioBox USB audio interface and a small Mackie mixer to each. The speakers are a mix of Behringer Truth, KRK Rokit 5 and Rokit 6 powered speaker pairs, mounted on speaker pedestals behind the Apple Cinema Displays. Signal flow is from the computer to the AudioBox via USB (or KONA in one room), the channel 1 and 2 analog outputs from the AudioBox (or KONA) into the Mackie and then the main mixer outputs to the left and right speakers. In this way, the master fader volume on the mixer is essentially the volume control for the system. This is used mainly for monitoring, but this combination does allow the connection of a microphone for input back into the Mac for scratch recordings. Of course, having a small mixer also lets you plug in another device just to preview audio.

The fifth Mac Pro isn’t installed in a room that’s designated as a cutting room, so it simply got the repurposed Roland powered near field speakers from an older Avid system. These were connected directly to the computer output.

Last, but not least, it’s the little things. When I started this upgrade round, one of the machines was considered a basket case, because it froze a lot and, therefore, was generally not used. That turned out to simply be a bad Apple Magic Mouse. The mouse would mess up, leaving the cursor frozen. Users assumed the Mac had frozen up, when in fact, it was fine. To fix this and any other potential future mouse issues, I dumped all the Apple Bluetooth mice and replaced them with Logitech wireless mice. Much better feel and the problem was solved!

©2014 Oliver Peters

NAB 2014 Thoughts

Whodathunkit? More NLEs, new cameras from new vendors and even a new film scanner! I’ve been back from NAB for a little over a week and needed to get caught up on work while decompressing. The following are some thoughts in broad strokes.

Avid Connect. My trip started early with the Avid Connect customer event. This was a corporate gathering with over 1,000 paid attendees. Avid execs and managers outlined the corporate vision of Avid Everywhere in presentations that were head-and-shoulders better than any executive presentations Avid has given in years. For many who attended, it was to see if there was still life in Avid. I think the general response was receptive and positive. Avid Everywhere is basically a realignment of existing and future products around a platform concept. That has more impact if you own Avid storage or asset management software. Less so, if you only own a seat of Media Composer or ProTools. No new software features were announced, but new pricing models were announced with options to purchase or rent individual seats of the software – or to rent floating licenses in larger quantities.

4K. As predicted, 4K was all over the show. However, when you talked to vendors and users, there was little clear direction about actual mastering in 4K. It is starting to be a requirement in some circles, like delivering to Netflix, for example; but for most users 4K stops at acquisition. There is interest for archival reasons, as well as for reframing shots when the master is HD or 2K.

Cameras. New cameras from Blackmagic Design. Not much of a surprise there. One is the bigger, ENG-style URSA, which is Blackmagic’s solution to all of the add-ons people use with smaller HDSLR-sized cameras. The biggest feature is a 10” flip-out LCD monitor. AJA was the real surprise with its own 4K Cion camera. Think KiPro Quad with a camera built around it. Several DPs I spoke with weren’t that thrilled about either camera, because of size or balance. A camera that did get everyone jazzed was Sony’s A7s, one of their new Alpha series HDSLRs. It’s 4K-capable when recorded via HDMI to an external device. The images were outstanding. Of course, 4K wasn’t everywhere. Notably not at ARRI. The news there is the Amira, a sibling to the Alexa. Both share the same sensor design, with the Amira designed as a documentary camera. I’m sure it will be a hit, in spite of being a 2K camera.

Mac Pro. The new Mac Pro was all over the show in numerous booths. Various companies showed housings and add-ons to mount the Mac Pro for various applications. Lots of Thunderbolt products on display to address expandability for this unit, as well as Apple laptops and eventually PCs that will use Thunderbolt technology. The folks at FCPworks showed a nice DIT table/cart designed to hold a Mac Pro, keyboard, monitoring and other on-set essentials.

FCP X. Speaking of FCP X, the best place to check it out was at the off-site demo suite that FCPworks was running during the show. The suite demonstrated a number of FCP X-based workflows using third-party utilities, shared storage from Quantum and more. FCP X was in various booths on the NAB show floor, but to me it seemed limited to partner companies, like AJA. I thought the occurrences of FCP X in other booths were overshadowed by Premiere Pro CC sightings. No new FCP X feature announcements or even hints were made by Apple in any private meetings.

NLEs. The state of nonlinear editing is in more flux than ever. FCP X seems to be picking up a little steam, as is Premiere Pro. Yet, still no clear market leader across all sectors. Autodesk announced Smoke 2015, which will be the last version you can buy. Following Adobe’s lead, this year they shift to a rental model for their products. Smoke 2015 diverges more from the Flame UI model with more timeline-based effects than Smoke 2013. Lightworks for the Mac was demoed at the EditShare booth, which will make it another new option for Mac editors. Nothing new yet out of Avid, except some rebranding – Media Composer is now Media Composer | Software and Sphere is now Media Composer | Cloud. Expect new features to be rolled in by the end of this year. The biggest new player is Blackmagic Design, who has expanded the DaVinci Resolve software into a full-fledged NLE. With a cosmetic resemblance to FCP X, it caused many to dub it “the NLE that Final Cut Pro 8 should have been”. Whether that’s on the mark or just irrational exuberance has yet to be determined. Suffice it to say that Blackmagic is serious about making it a powerful editor, which for now is targeted at finishing.

Death of i/o cards. I’ve seen little mention of this, but it seems to me that dedicated PCIe video capture cards are a thing of the past. KONA and DeckLink cards are really just there to support legacy products. They have less relevance in the file-based world. Most of the focus these days is on monitoring, which can be easily (and more cheaply) handled by HDMI or small Thunderbolt devices. If you looked at AJA and Matrox, for example, most of the target for PCIe cards is now to supply the OEM market. AJA supplies Quantel with their 4K i/o cards. The emphasis for direct customers is on smaller output-only products, mini-converters or self-contained format converters.

Film. If you were making a custom, 35mm film scanner – get out of the business, because you are now competing against Blackmagic Design! Their new film scanner is based on technology acquired through the purchase of Cintel a few months ago. Now Blackmagic introduced a sleek 35mm scanner capable of up to 30fps with UltraHD images. It’s $30K and connects to a Mac Pro via Thunderbolt 2. Simple operation and easy software (plus Resolve) will likely rekindle the interest at a number of facilities for the film transfer business. That will be especially true at sites with a large archive of film.

Social. Naturally NAB wouldn’t be the fun it is without the opportunity to meet up with friends from all over the world. That’s part of what I get out of it. For others it’s the extra training through classes at Post Production World. The SuperMeet is a must for many editors. The Avid Connect gala featured entertainment by the legendary Nile Rodgers and his band Chic. Nearly two hours of non-stop funk/dance/disco. Quite enjoyable regardless of your musical taste. So, another year in Vegas – and not quite the ho-hum event that many had thought it would be!

Click here for more analysis at Digital Video’s website.

©2014 Oliver Peters

 

The NLE that wouldn’t die

It’s been 18 months since Apple launched Final Cut Pro X and the debate over it continues to rage without let-up. Apple likely has good sales numbers to deem it a success, but if you look around the professional world, with a few exceptions, there has been little or no adoption. Yes, some editors are dabbling with it to see where Apple is headed with it – and yes, some independent editors are using it for demanding projects, including commercials, corporate videos and TV shows. By comparison, though, look at what facilities and broadcasters are using – or what skills are required for job openings – and you’ll see a general scarceness of FCP X.

Let’s compare this to the launch of the original Final Cut Pro (or “legacy”) over 12 years ago. In a similar fashion, FCP was the stealth tool that attracted individual users. The obvious benefit was price. At that time a fully decked out Avid Media Composer was a turnkey system costing over $100K. FCP was available as software for only $999. Of course, what gets lost in that measure is that the Avid price included the computer, monitors, wiring, broadcast i/o hardware and storage. All of this would have to be added to the FCP side and in some cases, wasn’t even possible with FCP. In the beginning it was limited to DV and FireWire only. But there were some key advantages it introduced at the start over Avid systems. These included blend modes, easy in-timeline editing, After Effects-style effects and a media architecture built upon the open, extensible and ubiquitous QuickTime foundation. Over the years, a lot was added to make FCP a powerful system, but at its core, all the building blocks were in place from the beginning.

When uncompressed SD and later HD became the must-have items, Avid was slow to respond. Apple’s partners were able to take advantage of the hardware abstraction layer to add codecs and drivers, which expanded FCP’s capabilities. Vendors like Digital Voodoo, Aurora Video Systems and Pinnacle made it possible to edit something other than DV. Users have them to thank – more so than Apple – for growing FCP into a professional tool. When FCP 5 and 6 rolled around, the Final Cut world was pretty set, with major markets set to shift to FCP as the dominant NLE. HD, color correction and XML interchange had all been added and the package was expanded with an ecosystem of surrounding applications. By the time of the launch of the last Final Cut Studio (FCP 7) in 2009, Apple’s NLE seemed unstoppable. Unfortunately FCP 7 wasn’t as feature-packed as many had expected. Along with reticence to chuck recently purchased PowerMac G5 computers, a number of owners simply stayed with FCP 5 and/or FCP 6.

When Apple discusses the number of licensees, you have to parse how they define the actual purchases. While there are undoubtedly plenty of FCP X owners, the interpretation of sales is that more seats of FCP X have been sold than of FCP 7. Unfortunately it’s hard to know what that really means. Since it’s a comparison to FCP 7 – and not every FCP 1-6 owner upgraded to 7 – it could very well be that the X number isn’t all that large. Even though Apple EOL’ed (end of life) Final Cut Studio with the launch of FCP X, it continued to sell new seats of the software through its direct sales and reseller channels. In fact, Apple seems to still have it available if you call the correct 800 line. When Apple says it has sold more of X than of 7, is it counting the total sales (including those made after the launch) or only before? An interesting statistic would be the number of seats of Final Cut Studio (FCP 7) sold since the launch of FCP X as compared to before. We’ll never know, but it might actually be a larger number. All I know is that the system integrators I personally know, who have a long history of selling and servicing FCP-based editing suites, continue to install NEW FCP 7 rooms!

Like most drastic product changes, once you get over the shock of the new version, you quickly realize that your old version didn’t instantly stop working the day the new version launched. In the case of FCP 7, it continues to be a workhorse, albeit the 32-bit architecture is pretty creaky. Toss a lot of ProRes 4444 at it and you are in for a painful experience. There has been a lot of dissatisfaction with FCP X among facility owners, because it simply changes much of the existing workflows. There are additional apps and utilities to fill the gap, but many of these constitute workarounds compared to what could be done inside FCP 7.

Many owners have looked at alternatives. These include Adobe Premiere Pro, Avid Media Composer/Symphony, Media 100 and Autodesk Smoke 2013. If they are so irritated at Apple as to move over to Windows hardware, then the possibilities expand to include Avid DS, Grass Valley Edius and Sony Vegas. Several of these manufacturers have introduced cross-grade promotional deals to entice FCP “legacy” owners to make the switch. Avid and Adobe have benefited the most in this transition. Editors who were happy with Avid in the past – or work in a market where Avid dominates – have migrated back to Media Composer. Editors who were hoping for the hypothetical FCP 8 are often making Adobe Premiere (and the Production Premium bundle) their next NLE of choice. But ironically, many owners and users are simply doing nothing and continuing with FCP 7 or even upgrading from FCP 6 to FCP 7.

Why is it that FCP 7 isn’t already long gone or on the way out by now? Obviously the fact that change comes slowly is one answer, but I believe it’s more than that. When FCP 1.0 came on the scene, its interface and operational methodology fit into the existing NLE designs. It was like a “baby Avid” with parts of Media 100 and After Effects dropped in. If you cut on a Media Composer, the transition to FCP was pretty simple. Working with QuickTime made it easy to run on most personal machines without extra hardware. Because of its relatively open nature and reliance on industry-standard interchange formats (many of which were added over time), FCP could easily swap data with other applications using EDLs, OMFs, text-based log files and XML. Facilities built workflows around these capabilities.

FCP X, on the other hand, introduced a completely new editing paradigm that not only changed how you work, but even the accepted nomenclature of editing. Furthermore, the UI design even did things like reverse the behavior of some keystrokes from how similar functions had been triggered in FCP 7. In short, forget everything you know about editing or using other editing software if you want to become proficient with FCP X. That’s a viable concept for students who may be the professional editors of the future. Or, for non-fulltime editors who occasionally have to edit and finish professional-level productions as one small part of their job. Unfortunately, it’s not a good approach if you want to make FCP X the ubiquitous NLE in established professional video environments, like post houses, broadcasters and large enterprise users.

After all, if I’m a facility manager and you can’t show me a compelling reason why this is better and why it won’t require a complete internal upheaval, then why should I change? In most shops, overall workflow is far more important than the specific features of any individual application. Gone are the differences in cost, so it’s difficult to make a compelling argument based on ROI. You can no longer make the (false) argument of 1999 that FCP will only cost you 1% of the cost of an Avid. Or use the bogus $50K edit suite ad that followed a few years later.

Which brings us to the present. I started on Avid systems as the first NLE where I was in the driver’s seat. I’ve literally cut on dozens of edit systems, but for me, Final Cut Pro “legacy” fit my style and preferences best. I would have loved a 64-bit version with a cleaned-up user interface, but that’s not what FCP X delivers. It’s also not exactly where Premiere Pro CS6 is today. I deal with projects from the outside – either sent to me or at shops where I freelance. Apple FCP 7 and Avid Media Composer continue to be what I run into and what is requested.

Over the past few months I’ve done quite a few complex jobs on FCP X, when I’ve had the ability to control the decision. Yet, I cannot get through any complex workflow without touching parts of Final Cut Studio (“legacy”) to get the job done. FCP X seems to excel at small projects where speed trumps precision and interoperability. It’s also great for individual owner-operators who intend to do everything inside FCP X. But for complex projects with integrated workflows, FCP 7 is still decidedly better.

As was the case with early FCP, where most of the editing design was there at the start, I now feel that with the FCP X 10.0.6 update, most of its editing design is also in place. It may never become the tool that marches on to dominate the market. FCP “legacy” had that chance and Apple walked away from it. It’s dubious that lightning will strike twice, but 18 months is simply too short of a timeframe in which to say anything that definitive. All I know is that for now, FCP 7 continues as the preferred NLE for many, with Media Composer a close second. Most editors, like old dogs, aren’t too eager to learn new tricks. At least that’s what I conclude, based on my own ear-to-the-ground analysis. Check back this time next year to see if that’s still the case. For now, I see the industry continuing to live in a very fractured, multi-NLE environment.

©2012 Oliver Peters

Film Budgeting Basics

New filmmakers tackling their first indie feature will obviously ask, “What is this film going to cost to produce?” The answer to this – like many of these questions – is, “It depends.” The cost of making a film is directly related to the resources needed and the time required for each resource. That often has little to do with the time involved in actually filming the scenes.

A friend of mine, after directing his first feature, was fond of saying, “The total time of saying the words ‘roll, action, cut, print’ was probably less than an hour; but, it took me two years prior to that to have the privilege.” Cost is almost never related to return. I’ve often told budding filmmakers to consider long and hard what they are doing. They could instead take the same amount of money and throw themselves the biggest party of their life. After all the effort of making the film, you might actually have more to show for it from the party. Film returns tend to follow other media success percentages, where typically 15% are successful and 85% fail (or at least don’t make a financial return). Understanding how to maximize the value on the screen is integral to budgeting a feature film.

I often work in the realm of indie features, which includes dramatic productions and documentaries. Each of these two categories tends to break into cost tiers like these:

Dramatic films

$0 – $50,000

$200,000

$500,000

$1,000,000-$2,000,000

Over $2,000,000

Documentaries

$0 – $30,000

$50,000

$300,000-$1,500,000

Over $1,500,000

Money is always tight within these ranges. Once you get over $2,000,000, you tend to have a bit more breathing room and the ability to tackle issues by adding more resources to the equation. Production is related to time and that varies greatly between scripted films and documentaries, where the story is often evolving over time and out of the director’s control. Here is a typical rule-of-thumb timeline for the production of each.

Dramatic films – timeline

1 year to secure rights and funding

2 months of casting, scouting, preparation

1 month readying actual production logistics

2-5 weeks of production (stage and location)

8-20 weeks of picture editorial

8-20 weeks sound editorial and scoring (usually starts after picture is “locked”)

1-2 weeks of picture finish/conform/grade

1-2 weeks of audio mix (re-recording mix)

1 week to finalize all deliverables

Documentaries – timeline

The timeframe up to the start of editorial differs with every project and is an unknown.

8-60 weeks of picture editorial

8-20 weeks sound editorial and scoring (usually starts after picture is “locked”)

1-2 weeks of picture finish/conform/grade

1-2 weeks of audio mix (re-recording mix)

1 week to finalize all deliverables

__________________________________________________________

Clearly any of these categories can take longer, but in the indie/low-budget field, indecision and letting things drag out will destroy the viability of the project. You don’t have the luxury of studio film timeframes. This is where a savvy line producer, unit manager and production manager (often the same person on small films) can make or break the budget. Here are some cost variables to consider.

Cost variables that need to be evaluated and balanced

Union versus non-union.

More days of shooting versus fewer, but longer days, with overtime pay.

The size of the cast and the experience level of the actors.

Allotting adequate (non-filmed) rehearsal time.

The number of script pages (a shorter script means a less costly production).

Accurate timing of scene descriptions to determine how much production time is required for each scene.

The number of locations and location changes/distances.

Period drama versus a contemporary story.

Stage and sets versus shooting at real locations.

The number of make-up and wardrobe changes.

A production location with local crews and facilities versus bringing in resources from the outside.

Film versus digital photography.

The number of cameras.

The amount of gear (dollies, cranes, etc.).

Cost-saving tips

Investigate opportunities to partner with regional film schools.

Using a director of photography who is his own camera operator and who can supply his own cameras and lenses.

Using a location mixer with his own gear.

Using an editor with his own gear.

Eliminate the need for an elaborate “video village” and possibly reduce the need for a DIT (if you have savvy camera assistants).

Negotiate lower equipment rental costs based on fewer days per week.

Negotiate local resources for food, lodging, travel and craft services.

Explore alternatives to stages, such as empty warehouses.

Explore unsigned local musical artists for songs, scores, etc.

Hold one or more days of production in reserve (to fix “gaps” discovered during editing), in order to shoot inserts, B-roll, transitional shots, the opening title, etc.

Errors that will drive up cost

The film is too short or too long (ideal is a first cut that’s about 10% longer than target, so it can be trimmed back).

Unforeseen or poorly executed visual effects.

Judgment calls made on location to “save” time/effort on a rushed day.

Allowing the actors too much freedom to ad lib and improvise, as well as play with props.

Indecision in the edit.

Changing the edit after the cut is “locked”.

Using stock images or popular music without making provisions in advance for clearance and budgeting.

Cost-saving items that AREN’T

Failing to shoot a complete master shot as part of the coverage on complex scenes.

Using two or more cameras throughout the entire production.

Letting actors ad lib in lieu of adequate rehearsal.

Not hiring a script supervisor/continuity person.

Using blue/green-screen effects for driving shots.

Relying on low-light cameras instead of proper lighting.

Extensive use of the “video village” on set.

Limiting the amount of footage sent to the editors (send them everything, not only “circle takes”).

Short-changing the importance of the role of the data wrangler.

Not allowing adequate time or resources for proper data management.

__________________________________________________________

For reference, I put together two sample budgets a year ago, as part of a presentation at Digital Video Expo in Pasadena. It’s available for download here in Numbers, Excel and PDF versions. Feel free to manipulate the spreadsheets for your own production to see how they stack up. I break down a film/DI and a digital photography budget. As you can see, going with 35mm film adds about $175K more to the budget, largely due to stock, processing and DI costs. In a major studio feature, the difference in formats is inconsequential, but not in the million dollar indie range. I have not included a “film-out”, which will add $75-$200K.

The budget I developed, with the help of a number of experienced unit managers, represents a fairly typical, non-union, indie film. It includes most of the cost for crew, cast, production and post, but does not include such items as the cost of the script, props, sets, production office rentals, hotels, insurance, creative fees and others. As a rule-of-thumb, I’ve factored gear and stage rentals as 3-day weeks. This means you get seven days of use, but are only charged for three. In the past year, I’ve heard rates as low as 1.5-day weeks, but I don’t think you can plan on that being the norm. A 3-day or 4-day week is customary.
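The rental-week math above is worth making explicit. Here’s a quick sketch of how an “N-day week” deal works; the $500/day rate is hypothetical, purely for illustration:

```python
# An "N-day week" rental deal: the gear stays out all seven days, but
# you're billed the daily rate only N times. The $500/day camera
# package rate below is a hypothetical figure, not from the budget.

def weekly_cost(day_rate, billed_days):
    """Total charge for one week of rental."""
    return day_rate * billed_days

def effective_day_rate(day_rate, billed_days, days_used=7):
    """What each of the seven days of use actually costs you."""
    return day_rate * billed_days / days_used

print(weekly_cost(500, 3))                     # 1500 for the week
print(round(effective_day_rate(500, 3), 2))    # 214.29 per day of use
print(round(effective_day_rate(500, 1.5), 2))  # 107.14 on a 1.5-day week
```

In other words, a 3-day week discounts each actual day of use to about 43% of the list rate, which is why negotiating the weekly multiplier matters so much on a long shoot.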

Many states offer film production incentives, designed to entice producers to shoot a project in that state. Often local investment money and economic incentives will attract producers to a particular locale. That’s great if the state has good local crew and production resources, but if not, then you’ll have to bring in more from the outside. This adds cost for travel and lodging, some of which an enterprising producer can negotiate for trade in the form of a credit on the film. There’s no guarantee of that, though, and as it’s such a variable, this is a cost item that must be evaluated with each individual production.

Remember that post production work has to occur in some physical place. Audio post is typically done in a studio owned or rented by the audio engineer. That’s not the case for editors. If you hire a freelance film editor, you will also need to factor in the cost of the editing system, as well as a rental office in which to house the operation. Some editors can supply that as a package deal; others can’t.

Naturally, a savvy line producer can find ways to bring this budget even lower. I work a lot with the Valencia College Film Technology Program in Orlando. Over the years they have partnered with many producers to complete Hollywood-grade features. I’m not talking student films, but rather name directors and actors working alongside students and working pros to put out films destined for theatrical distribution. The films produced there often place a level of production value on the screen that’s as much as twice the actual out-of-pocket cost of production and post. All thanks to the resources and services the program has to offer.

__________________________________________________________

Most new producers have a good handle on the production phase, but post is a total black hole. As a consequence, post often gets short-changed in the budgeting process. Unfortunately, some producers try to figure out their post production costs at the point when everything is in the can, but almost all of the money has been spent. That’s in spite of the fact that post generally takes much more time than the period allotted to location and stage photography. In order to properly understand the post side of things, here are the workflows for four finishing scenarios.

Film – traditional post

Shoot on location with film – 1,000ft. of 35mm = about 10 minutes of unedited footage.

Process the negative at the lab and do a “best light” transfer to videotape or a hard drive.

The assistant editor loads and logs footage and syncs double-system audio.

The editor cuts a first cut, then the director’s cut and then the final version.

The sound team edits dialogue, ADR and sound effects (also temp music at times).

The composer writes and records the score (often in a parallel track to the above).

Sound is mixed in a re-recording session.

The editorial team generates a cut list for the negative cutter.

The negative cutter conforms the negative (physical splices).

All visual effects are added as optical effects.

Lab color timing is performed and answer prints are generated for review.

Film deliverables are generated.
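The footage-to-runtime figure in the workflow above follows from standard 35mm geometry. A minimal sketch of the conversion, assuming 4-perf 35mm at 24 fps:

```python
# Standard 35mm math behind the rule of thumb: 4-perf 35mm packs
# 16 frames per foot, so at 24 fps the film moves at 90 feet per
# minute and a 1,000 ft camera roll runs just over 11 minutes --
# the basis of the "about 10 minutes" figure above.

FRAMES_PER_FOOT = 16   # 35mm, 4-perf pulldown
FPS = 24

def runtime_minutes(feet):
    """Runtime in minutes for a given length of 4-perf 35mm at 24 fps."""
    return feet * FRAMES_PER_FOOT / FPS / 60

print(round(runtime_minutes(1000), 1))  # 11.1 minutes per 1,000 ft roll
print(runtime_minutes(90))              # 1.0 -- i.e., 90 ft per minute
```

This is why shooting ratios translate so directly into stock and processing cost: every additional 1,000 ft roll buys only about eleven minutes of unedited material.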

Film – DI (digital intermediate) post

Shoot on location with film – 1,000ft. of 35mm = about 10 minutes of unedited footage.

Process the negative at the lab and do a “best light” transfer to videotape or a hard drive.

The assistant editor loads and logs footage and syncs double-system audio.

The editor cuts a first cut, then the director’s cut and then the final version.

The sound team edits dialogue, ADR and sound effects (also temp music at times).

The composer writes and records the score (often in a parallel track to the above).

Sound is mixed in a re-recording session.

The editorial team generates edit lists for the finishing house.

Selected shots are retransferred (or scanned), conformed and graded.

Visual effects are inserted during the conform/grade.

Digital and/or film deliverables are generated.

Digital production – camera raw photography

Shoot on location with a digital camera that records in a raw file format to a card or hard drive.

The footage is converted into a viewable form for the editors.

The assistant editor loads and logs footage and syncs double-system audio.

The editor cuts a first cut, then the director’s cut and then the final version.

The sound team edits dialogue, ADR and sound effects (also temp music at times).

The composer writes and records the score (often in a parallel track to the above).

Sound is mixed in a re-recording session.

The editorial team generates edit lists for the finishing house.

Camera raw files are conformed and color graded in a process similar to a DI.

Visual effects are inserted during the conform/grade.

Digital and/or film deliverables are generated.

Digital production – tape or file-based (not raw) photography

Shoot on location with a digital camera and recorded to tape or as files to a card or hard drive.

The assistant editor loads and logs footage and syncs double-system audio.

The editor cuts a first cut, then the director’s cut and then the final version.

The sound team edits dialogue, ADR and sound effects (also temp music at times).

The composer writes and records the score (often in a parallel track to the above).

Sound is mixed in a re-recording session.

The editorial team generates edit lists for the finishing house.

Camera files are conformed and color graded.

Visual effects are inserted during the conform/grade.

In some cases, the editing format and the system is of a level to be considered final quality and the same editor can do both the creative edit and finishing.

Digital and/or film deliverables are generated.

As these workflows show, a lot goes into post beyond simply editing and mixing the film. These elements take time and determine the level of polish you present to your audience. The sample budgets I’ve compiled aren’t intended to cause sticker shock. It’s clear that getting the tally to $1 Million doesn’t take very much and that’s a pretty realistic range for a small film. Granted, I’ve worked on films done for $150,000 that looked like a lot more, but it takes a lot of work to get there. And often leaning hard on the good graces of the crew and resources you use.

For comparison, here’s an example at The Smoking Gun that’s purported to be the working budget for M. Night Shyamalan’s The Village under the working title of The Woods. It doesn’t really matter whether it is or it isn’t the actual budget. The numbers are in line with this type of studio film, which makes it a good exercise in seeing how one can spend $70 Million on a film.

Whether you play in the studio or the independent film arena, it’s important to understand how to translate the vision of the script in a way that correlates to time and money. Once that becomes second nature, you are on your way to becoming a producer that puts the most production value on the screen for the audiences to appreciate.

©2012 Oliver Peters