Post Production Mastering Tips

The last step in commercial music production is mastering. Typically this involves making a recording sound as good as it possibly can through the application of equalization and multiband compression. In the case of LPs and CDs (remember those?), this also includes setting up the flow from one tune to the next and balancing out levels so the entire product has a consistent sound. Video post has a similar phase, which has historically been in the hands of the finishing or online editor.

That sounds so sweet

The most direct comparison between the last video finishing steps and commercial music mastering is how filters are applied in order to properly compress the audio track and to bring video levels within legal broadcast specs. When I edit projects in Apple Final Cut Pro 7 and do my own mixes, I frequently use Soundtrack Pro as the place to polish the audio. My STP mixing strategy employs tracks that route into one or more subgroup buses and then a master output bus. Four to eight tracks of content in FCP might become twenty tracks in STP. Voice-over, sync-sound, SFX and music elements get spread over more tracks and routed to appropriate subgroups. These subgroups then flow into the master bus. This gives me the flexibility to apply specific filters to a track and have fine control over the audio.

I’ll usually apply a compressor across the master bus to tame any peaks and beef up the mix. My settings involve a low compression ratio and a hard limit at -10dBFS. The objective is to keep the mix levels reasonable so as to preserve dynamic range. I don’t want to slam the meters and drive the signal hard into compression. Even when I do the complete mix in Final Cut, I will still use Soundtrack Pro simply to compress the composite mix, because I prefer its filters. When you set the reference tone to -20dBFS, these levels will match the nominal levels for most digital VTRs. If you are laying off to an analog format, such as Betacam-SP, set your reference tone to -12dBFS and match the input on the deck to 0VU.
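To put rough numbers on those levels, here’s a small Python sketch. It is illustrative arithmetic only, not Soundtrack Pro’s actual DSP: it shows how dBFS values map to linear amplitude and how a hard ceiling at -10dBFS caps peaks while leaving the -20dBFS reference tone untouched.

```python
def dbfs_to_linear(dbfs):
    """Convert a dBFS level to linear amplitude (1.0 = digital full scale)."""
    return 10 ** (dbfs / 20.0)

def hard_limit(samples, ceiling_dbfs=-10.0):
    """Clamp sample peaks at the ceiling; quieter material passes untouched."""
    ceiling = dbfs_to_linear(ceiling_dbfs)
    return [max(-ceiling, min(ceiling, s)) for s in samples]

tone = dbfs_to_linear(-20.0)          # -20dBFS reference tone = 0.1 linear
print(hard_limit([0.05, tone, 0.9]))  # the 0.9 peak is clipped to ~0.316
```

A real mastering chain would apply gentle ratio-based compression before a limiter like this, but the headroom math is the same.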

Getting ready for broadcast

The video equivalent is the broadcast safe limiting filter. Most NLEs have one, including Avid Media Composer and both old and new versions of Final Cut. This should normally be the last filter in the chain of effects. It’s often best to apply it to a self-contained file in FCP 7, a higher track in Media Composer or a compound clip in FCP X. Broadcast specs will vary with the network or station receiving your files or tapes, so check first. It’s worth noting that many popular effects, like glow dissolves, violate these parameters. You want the maximum luminance levels (white peaks) to be limited to 100 IRE and chrominance not to exceed 110, 115 or 120 IRE, depending on the specs of the broadcaster to whom you are delivering. In short, the chroma should stay within the outer ring of a vectorscope. I usually turn off any RGB limiting to avoid artifacts.
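As a back-of-the-napkin illustration of what a broadcast safe limiter does (the shipping filters are more sophisticated, and the exact ceilings used here are assumptions; always check your broadcaster’s spec):

```python
def broadcast_safe(luma_ire, chroma_pct, luma_max=100.0, chroma_max=110.0):
    """Clamp a pixel's luma (IRE) and chroma amplitude (%) to delivery
    limits. The ceilings are assumed examples; specs vary by broadcaster."""
    return min(luma_ire, luma_max), min(chroma_pct, chroma_max)

print(broadcast_safe(104.0, 118.0))  # (100.0, 110.0): both brought into spec
```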

It’s often a good idea to reduce the overall video levels by about five percent prior to the application of a broadcast safe filter, simply so you don’t clip too harshly. That’s the same principle as I’ve applied to the audio mix. For example, I will often first apply a color correction filter to slightly lower the luminance level and reduce chroma. In addition, I’ll frequently use a desaturate highlights or lows filter. As you raise midrange or highlight levels and crush shadows during color correction, the chroma is also driven higher and/or lower accordingly. Reds, blues and yellows are most susceptible, so it’s a good idea to tone down chroma saturation above 90 IRE and below 20 IRE. Most of these filters let you feather the transition range and the percentage of desaturation, so play with the settings to get the most subtle result. This keeps the overall image vibrant, but still legal.
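The feathered highlight desaturation described above boils down to a luma-dependent chroma multiplier. This is an illustrative approximation of the idea, not the actual filter math; the 90 IRE start point and 50% amount are assumed example settings.

```python
def desat_factor(luma_ire, hi_start=90.0, hi_end=100.0, amount=0.5):
    """Chroma multiplier that feathers from 1.0 (unchanged) at hi_start
    down to (1 - amount) at hi_end and above. Assumed example settings."""
    if luma_ire <= hi_start:
        return 1.0
    t = min((luma_ire - hi_start) / (hi_end - hi_start), 1.0)
    return 1.0 - amount * t

print(desat_factor(85.0))   # 1.0  (below the feather range: untouched)
print(desat_factor(95.0))   # 0.75 (halfway through the feather)
print(desat_factor(110.0))  # 0.5  (fully desaturated by the set amount)
```

A mirrored version of the same ramp handles the shadows below 20 IRE.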

Let me interject at this point that what you pay for when using a music mastering specialist are the “ears” (and brain) of the engineer and their premium monitoring environment. This should be equally true of a video finishing environment. Without proper audio and video monitoring, it’s impossible to tell whether the adjustments being made are correct. Accurate speakers, calibrated broadcast video monitors and video scopes are essential tools. Having said that, though, software scopes and modern computer displays are by no means inaccurate. For example, the software scopes in FCP X and Apple’s ColorSync technology are quite good. Tools like Blackmagic Design Ultrascope, HP DreamColor or Apple Cinema Displays do provide accurate monitoring in lower-cost situations. I’ve compared the FCP X Viewer on an iMac to the output displayed on a broadcast monitor fed by an AJA IoXT. I find that both match surprisingly well. Ultimately it comes down to trusting an editor who knows how to get the best out of any given system.

Navigating the formats

Editors work in a multi-standard world. I frequently cut HD spots that run as downconverted SD content for broadcast, as well as at a higher HD resolution for the internet. The best production and post “lingua franca” format today is 1080p/23.976. This format fits a sweet spot for the internet, Blu-ray, DVD and modern LCD and plasma displays. It’s also readily available in just about every camera at any price range. Even if your product is only intended to be displayed as standard definition today, it’s a good idea to future-proof it by working in HD.

If you shoot, edit and master at 1080p/23.976, then you can easily convert to NTSC, 720p/59.94 or 1080i/29.97 for broadcast. The last step for many of my projects is to create deliverables from my master file. Usually this involves creating three separate broadcast files (one SD and two HD) using either ProRes or uncompressed codecs. I will also generate an internet version (without bars, tone, countdown or slate) that’s a high-quality H.264 file in the 720p/23.976 format. Either .mov or .mp4 is fine.

Adobe After Effects is my tool of choice for these broadcast conversions, because it does high-quality scaling and adds proper cadences. I follow these steps.

A) Export a self-contained 1080p/23.976 ProResHQ file from FCP 7 or X.

B) Place that into a 720×486, 29.97fps After Effects D1 composition and scale the source clip to size. Generally this will be letterboxed inside of the 4:3 frame.

C) Render an uncompressed QuickTime file, which is lower-field ordered with added 2:3 pulldown.

D) Re-import that into FCP 7 or X using a matching sequence setting, add the mixed track and format it with bars, tone, countdown and slate.

E) Export a final self-contained broadcast master file.

F) Repeat the process for each additional broadcast format.
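The 2:3 pulldown that After Effects adds in step C can be sketched in a few lines. This toy function is my own illustration, not After Effects code: it shows how four 23.976p frames become ten fields, i.e. five 29.97 interlaced frames, which is exactly the 24-to-30 rate relationship.

```python
def pulldown_23(frames):
    """Map 23.976p frames to 29.97i fields with a 2:3 cadence:
    frame A gets 2 fields, B gets 3, C gets 2, D gets 3."""
    cadence = [2, 3, 2, 3]
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * cadence[i % 4])
    return fields

# Four progressive frames yield ten fields = five interlaced frames.
print(pulldown_23(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
```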

Getting back there

Archiving is “The $64,000 Question” for today’s digital media shops. File-based mastering and archiving introduces dilemmas that didn’t exist with videotape. I recommend always exporting a final mixed master file along with a split-track, textless submaster. QuickTime files support multi-channel audio configurations, so building such a file with separate stereo stems for dialogue, sound effects and music is very easy in just about any NLE. Self-contained QuickTime movies with discrete audio channels can be exported from both FCP 7 and FCP X (using Roles).

Even if your NLE can’t export multi-channel master files, export the individual submixed elements as .wav or .aif audio files for future use. In addition to the audio track configuration, remove any titles and logos. By having these two files (master and submaster), it’s very simple to make most of the future revisions you might encounter without ever having to restore the original editorial project. Naturally, one question is which codec to use for access in the future. The preferred codec families these days are Avid DNxHD, Apple ProRes, uncompressed, OP1a MXF (XDCAM) or IMX. FCP editors will tend towards ProRes and Avid editors towards DNxHD, but uncompressed is very viable with the low cost of storage. For feature films, another option to consider would be image sequences, like a string of uncompressed TIFF or DPX files.

Whichever format you standardize on, make multiple copies. LTO data tape is considered the best storage medium, but for small files, like edited TV commercial masters, DVD-ROM, Blu-ray and XDCAM media are likely the most robust. This is especially true in the case of water damage.

The typical strategy for most small users who don’t want to invest in LTO drives is a three-pronged solution.

A) Store all camera footage, elements and masters on a RAID array for near-term editing access.

B) Back-up the same items onto at least two copies of raw SATA or SSD hard drives for longer storage.

C) Burn DVD-ROM or BD-ROM copies of edited master files, submasters, project files and elements (music, VO, graphics, etc.).

A properly polished production with audio and video levels that conform to standards is an essential aspect of delivering a professional product. Developing effective mastering and archiving procedures will protect the investment your clients have made in a production. Even better, a reliable archive routine will bring you repeat business, because it’s easy to return to the project in the future.

Originally written for DV magazine/Creative Planet/NewBay Media, LLC

©2012 Oliver Peters

Moonrise Kingdom

Wes Anderson is a director known for his ability to bring interesting and quirky stories and characters to the screen. His latest film, Moonrise Kingdom, takes place on an island in New England some time in the 1960s. Two pre-teens meet, fall in love and run away together. This brings the adults together in an effort to find them, but also sparks a fantastical adventure for the two lead characters, Suzy (Kara Hayward) and Sam (Jared Gilman).

Andrew Weisblum has edited the past few Wes Anderson films (Fantastic Mr. Fox, The Darjeeling Limited), but also films for Darren Aronofsky (The Wrestler, Black Swan). In fact, his work on Black Swan earned an Oscar nomination last year for Film Editing. We recently spoke about Moonrise Kingdom and I asked how he was able to juggle such completely different film styles. Weisblum responded, “These are very different genres, but both directors are very specific with a complete vision. I’m there to help achieve that vision and it doesn’t really matter if it’s drama or comedy. Those are all good challenges.”

The Super 16mm aesthetic

An interesting similarity in these diverse films is the use of Super 16mm film. Weisblum explained, “From an organizational and technical standpoint, it’s all just files for me. Technicolor in New York handled the processing and transfers, which came to me as [Avid] DNxHD36 files on hard drives. That’s become interesting with the new digital cameras. I recently was on another project, which was shot with the [ARRI] ALEXA. I didn’t realize that it hadn’t been shot on film, until I noticed there were no keycodes and asked my assistant.”

“For Moonrise Kingdom, the aesthetic of Super 16mm fit well for the era of the ‘60s. It also allowed Wes to keep the crew small. Our two lead actors had never acted in a film, so Wes wanted to make sure they didn’t feel pressured. A lot of the locations were out in the woods and Wes was able to let them explore, feel comfortable and discover the process together. The film choice also allowed the DP [Robert Yeoman] to get some interesting angles. For instance, they used an [Aaton] A-Minima for some shots in difficult terrain. The movie also includes a number of scenes using miniatures, which were shot on 35mm film, so I was working with a mixture of film formats.”

Andrew Weisblum and assistant editor Daniel Triller worked on the film both on location in Rhode Island and during post production in New York. They cut with Avid Media Composer on Mac Pros connected to Avid shared storage, with some additional cutting on a laptop. Weisblum discussed the workflow, “The film was shot in approximately forty days on location in Rhode Island. I was cutting there at a house. Typically we would get the transfer the next day, Daniel would sync the audio and we’d watch dailies. I would screen and cut with Wes on the weekends. The production wrapped in June and we relocated to New York and cut for another eight to ten weeks. The editing went relatively quickly, but there were a lot of visual effects, so that took a bit of time until all these were completed and cut into the film.”

Weisblum continued, “The script was very concise and clear, but Wes also did something new on this film, which was to storyboard a lot of key scenes. That started for him on Mr. Fox, which was animated. Of course, it’s part of that process, but for Moonrise Kingdom, it helped him to plan out framing and timing, especially since some of the scenes were set to music. This allowed him to be very economical in filming these scenes, but also gave us a tight roadmap to follow, so the edit came together quickly and stayed close to the original script.”

Tackling sound through collaboration

How an editor handles temporary sound design and music in building the rough cut is an important ingredient of the film editing process. The Moonrise Kingdom team tackled this in a way different from other projects. Weisblum explained, “On many films, the picture editor will drop in temp sound effects that are close, but not always the perfect sound. The director and others then hear the film over and over again as scenes are previewed and screened with these effects. It often feels ‘wrong’ when they finally hear the fleshed out effects in the mix that were placed by the sound editors. To prevent this type of ‘temp love’ I sent sequences early on to [supervising sound editor/re-recording mixer] Craig Henighan. We were cutting on Media Composer and he uses Pro Tools, so bouncing material back-and-forth was easy. He could incorporate sound effects and ambiances into those scenes. We could integrate these effects into the edit, which meant we were always using the best possible options.”

“We also worked with two composers. Mark Mothersbaugh worked on the percussion pieces for the scout camp scenes. I would place his demos based on Wes’ ideas and then Peter Jarvis would flesh these out with more orchestration and performed the final versions. Alexandre Desplat wrote the romantic theme, which wound its way through the score. There’s also a lot of Benjamin Britten and Hank Williams music throughout the film.”

Visual effects help spell out the story

Although Anderson’s films don’t seem like they would be special effects movies, Moonrise Kingdom includes about 250 visual effects shots. According to Weisblum, “Dan Schrecker and Look Effects in New York handled all these shots. There are some fantastical elements in the film, but many of these effects are simply used to fill in story ‘tidbits’ and missing plot exposition. For example, adding a seaplane to the background of one shot allowed us to convey the point, without requiring a specific set of shots to build an extra little scene. These elements help the audience to understand the story in an economical fashion. Visual effects like these are a constant part of editing now. It’s no longer a specialized item. Quick shot fixes, invisible split-screens and similar effects are just another set of editorial tools.”

Tim Stipan was the DI colorist at Technicolor. According to Weisblum, “Wes is not a big proponent of power windows and aggressive secondary corrections, so it’s a straightforward grading job. The exception in Moonrise Kingdom is that specific scenes needed a different look to play into the fantasy, which is a departure for him.”

Wes Anderson has total creative control on his films, which is in keeping with their nature as independent features. Weisblum talked about their working style, “Wes is very ‘hands-on’ in the cut. He’s not big on improvisation, but is interested in the original rhythms and behaviors of the actors. When something unusual happens with a character, he’s willing to embrace it and deviate from the script in editing, when the scene plays better than the way it was written. Wes likes the actors to run many series of their dialogue lines within the takes. With the kids, he also did a lot of wild lines. I use [Avid] ScriptSync all the time and it is very helpful with indexing those performances.”

Our conversation turned to the current flux in editing technology. Weisblum offered, “I started assisting and cutting on film, but I saw that I needed to get proficient with technology. First it was Lightworks and then Avid. Once I became successful as an editor, I was primarily working on Avid systems. I want to be creative and Avid lets me do everything I need it to do. I want to spend all of my energy on the creative process and not worry about learning which new buttons to push on another piece of software that I might not use. The job of an assistant is to clear the path for the editor, so that all the editor has to do is concentrate on getting the best cut of the film. I looked forward to that day as an assistant and now that I’m there, I fully embrace that concept.”

Originally written for DV magazine (NewBay Media, LLC)

©2012 Oliver Peters

DIY Edit Suite Design Pointers

The lower cost of gear opens more opportunities for independent editors, so it’s time to look at some good practices for DIY edit facilities. The considerations in designing and constructing your own suite include power, HVAC, ergonomics, acoustics and equipment configuration. This article is intended for the small-to-medium post production user on a limited budget.

Power

Modern electronics don’t push power demands the way they did in the days of racks loaded with video gear. Most electrical circuits intended for standard office and even home use will be adequate to support the average nonlinear edit suite. If you have the ability to bring in an electrician and improve the service, then you will want a dedicated circuit (or two) for each editing room. These should be “home runs” direct from the outlets in the suite to the circuit breaker box.

Uninterruptible power supply (UPS) units are a “must have”. I recommend the larger (1200 series and up) APC units for each workstation, storage array and equipment rack. These not only supply short-term battery back-up in the event of power failure (long enough to safely close your project and power down), but also deliver stable, conditioned power to your electronics. Most units include some outlets that are backed up by the battery and others that are merely surge-protected pass-throughs. Your computer, storage and the primary display should be on the battery back-up outlets, but peripherals, like video monitors, mixers and speakers, may be plugged into the other outlets.

HVAC (heating, ventilation and air conditioning)

You typically won’t have much control over HVAC in an existing building, but be sure your service is adequate for the needs. With modern editing workstations the biggest heat generators will be large storage arrays, as well as the monitors and displays. The computer itself doesn’t generate that much ambient heat. If you have the ability to relocate large storage arrays from the room into an adjacent room or a centralized location (in the case of a SAN), then it will be easier to control the airflow, temperature and comfort in the edit suite. Likewise, the more humans in the room, the warmer it gets, so suites designed for a single editor with the occasional client won’t demand as much cooling as one that routinely has half-a-dozen folks in there working. If you can specify the HVAC design and installation, then go for a system based on high volume and low velocity. Such units move a lot of air at slower speeds through the ducts and are better for sound-sensitive situations.

Ergonomics

Comfort is important when you’re in a room for hours on end. Make sure monitor position, desk height and the chair design are optimal. The proper height for the desk surface should be between 26” and 29” to the top. When you are seated, your eyeline should be at the top portion of the computer display. These guidelines actually mean that placing a computer monitor (with its own stand) on the raised bridge of a custom tech console forces the operator to look slightly upwards, resulting in shoulder strain. I prefer consoles with no bridges or if budget permits, adjustable ones, like those from Biomorph, with a bridge unit that can be put lower than the table surface. There are no hard and fast rules, but a large flat table surface is often preferable to many of the custom tech consoles. These can be purchased at most furniture outlets or custom-built by a local millwork shop.

Chairs are a subjective choice, but it could be the most critical piece of gear you buy. There’s a reason Herman Miller Aeron chairs decorate most tech companies and it’s not just the design. These (and similarly designed products) are durable and provide back comfort over long periods of work. If you are so inclined, there is some research that indicates working while standing up is healthier. Walter Murch is a huge advocate of editing at a stand-up console. You may wish to look into this as an alternative approach and modify these recommendations accordingly.

Acoustics

Controlling sound waves boils down to issues of transmission and treatment. We are talking about edit suites and not recording studio spaces, so the acoustics don’t have to be perfect. Clearly, you want the room to be as free of annoying equipment fan and air conditioner duct noise as possible. Moving storage arrays out of the suite and using a low velocity HVAC system will take care of the bulk of that. You are mainly concerned with keeping the noise from the suite from bothering neighboring offices, as well as keeping most exterior noise out while you are working. This means adopting some studio design concepts without going overboard.

Mass is your friend. The easiest fix to cut down sound transmission is to double-up on the drywall. This means two layers on the inside and outside walls of the room. Use soundboard if you like for at least one of these. Insulate the space in the wall (spray-in foam is a good idea). Screw the drywall to the studs instead of nailing it, caulk all joints and offset the seams between the two layers. If you want even more isolation, then build a double-thick wall, with two sets of studs and more insulation. Don’t forget the ceiling, since a simple, suspended ceiling won’t prevent sound from going up and over the wall. You can cap the ceiling with drywall as well, effectively creating a room within a room. Lastly, remember the doors. Use solid-core wooden doors and tight weather stripping around the jamb and at the floor.

Sound treatment involves eliminating standing waves (reverberation or echo) caused by hard parallel surfaces and managing bass response (so the mix isn’t too thin or too boomy). If you have control over the room construction, then avoid true parallel walls by slightly slanting some of the walls as well as adding an angle to the drop ceiling. The more items in the room, such as furniture, bookshelves and wall hangings, the more natural interruption there is to the bounce of sound waves. Furthermore, you can purchase a variety of prefab sound treatment kits and ceiling tiles that will help to mitigate reflections and control bass. Auralex and Primacoustic are popular manufacturers. The good news is that many leading studio designers post examples of their rooms all over the Internet, complete with floor plans. Fifteen minutes of web searching will yield an entire library of studio design options for the ambitious.
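If you want to estimate where a room’s worst standing waves will fall, the axial mode frequencies between two parallel surfaces follow f = n·c/(2L), with c the speed of sound and L the wall spacing. A quick sketch of that arithmetic (illustrative only; real rooms have tangential and oblique modes as well):

```python
def axial_modes(length_m, count=3, c=343.0):
    """First few axial standing-wave frequencies (Hz) between two parallel
    surfaces spaced length_m apart: f_n = n * c / (2 * L)."""
    return [round(n * c / (2 * length_m), 1) for n in range(1, count + 1)]

print(axial_modes(4.0))  # e.g. [42.9, 85.8, 128.6] for 4 m wall spacing
```

Low modes like these are why bass control (and breaking up parallel walls) matters even in a modest edit suite.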

Equipment configurations

Thanks to new technologies like Intel’s Core i5 and Core i7 processors and Apple’s Thunderbolt protocol, the footprint of modern editing workstations can be smaller than ever. A small shop’s post suite that doesn’t need to handle legacy hardware can easily be powered by an Apple iMac, MacBook Pro or an HP EliteBook or Z1. Personally I still prefer a workstation, like a Mac Pro or an HP Z-series machine, but it’s no longer a given that such a unit will provide the fastest performance.

If you are building a new room, it’s insane to even contemplate purchasing a VTR of any sort. So much acquisition and delivery is file-based that it makes better business sense to utilize outside services like Digital Service Station or a similar local vendor for the few times a year when tape is a requirement. This means racks of terminal gear are reduced or even eliminated and the “747 Cockpit” staples of the edit bay, like scopes and banks of monitors, are overkill. It doesn’t hurt to integrate a wiring harness and space for one VTR, though, in the event that you need to accommodate a rental deck.

The current trend in small suite design is to build a room that’s very client-friendly and not intimidating. Usually this involves a workstation with one or two displays, near field audio monitors, a small mixer (mainly for volume control) and one or two video monitors. I like to work with dual displays (like two Apple 20” LCDs), but a single, larger monitor, like a 27” or 30”, is also quite functional. In fact, a lot of the newer software, like Smoke for Mac, Final Cut Pro X and DaVinci Resolve, is optimized for single displays. Just remember that smaller but higher-resolution displays mean that the type sizes are also smaller. For example, the 1920×1200 pixel display of a 17” Apple MacBook Pro yields tiny text on screen.
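The text-size issue is simple pixel-density arithmetic, worth a quick sketch (the 27-inch resolution used here is an assumed 2560×1440):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from a display's resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1200, 17.0)))  # 133 -> dense pixels, tiny default text
print(round(ppi(2560, 1440, 27.0)))  # 109 -> the same UI draws noticeably larger
```

The denser the pixels, the smaller a fixed-pixel interface element appears, which is exactly the 17” MacBook Pro complaint above.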

The topic of scopes and broadcast video monitors sparks lively debates among pro editors. I’ve been pretty happy with the results I get from internal software scopes in FCP, FCP X, Media Composer, Resolve and Color. A good option if you want more is Blackmagic Design’s Ultrascope. Naturally if you need external monitoring, you’ll need a capture card of some sort. That’s where Thunderbolt comes in, which allows you to daisy-chain storage (like the Promise Pegasus array) and monitoring. For instance, a single Thunderbolt port would let you connect storage, an AJA Io XT and an external Apple 27” display all in a single path. Then use the AJA unit to connect the Ultrascope and broadcast video monitors.

Color-accurate, broadcast display choices depend on your budget, need and tolerance. Sony’s OLED monitors are beautiful but pricey. I would suggest Panasonic, JVC, TV Logic or Flanders Scientific as good alternatives. If you want a large, wall-mounted screen to “wow” the client, then go with one of the Panasonic presentation-grade plasmas. Even their high-end consumer plasmas provide a wonderful image. A really nice layout – which still maintains that “bridge of the Enterprise” look – would place a 27” iMac (or 27” or 30” display) at the center of the console, with a 17” broadcast grade LCD (for video monitoring) to the left and another display with the Ultrascope signal on the right. This would be capped off with a 50” Panasonic plasma mounted on the wall.

Originally written for DV magazine/Creative Planet/NewBay Media, LLC

©2012 Oliver Peters

Avid Media Composer Tips for the FCP Switcher

The new (or renewed) interest in Avid Media Composer was fueled by the launch of Apple’s Final Cut Pro X and aided by Avid’s cross-grade promotional price. This move has many experienced FCP editors investigating the nuances of what seems like a completely foreign interface to some. Tutorial DVDs, like Steve Hullfish’s Avid Media Composer for Final Cut Pro Users from Class On Demand, are a great start. With experience, editors start to see more similarities than differences. Here are a few pointers of mine to help you find your comfort zone.

Multiple sequences  – FCP editors like working with multiple, tabbed sequences in the timeline window and find this lacking in Media Composer. In fact, it’s there, just not in the same way. Under the pulldown menu in the upper right corner of the record window (Canvas in FCP parlance), Media Composer editors have quick access to any previously opened sequence. Although only one is displayed in the timeline at any given time, you can quickly switch to another sequence by selecting it in this menu.

Displaying source waveforms and sequence clips – In FCP 7 or earlier, having tabbed sequences makes it easy to use a section of one sequence as a source to paste into another sequence. In addition, when reviewing audio-only clips, the waveform is displayed in the Viewer to easily mark in and out points based on the waveform. A similar feature exists in Media Composer. At the lower left corner of the timeline window is a toggle icon, which switches the window display between the source (Viewer) and record (Canvas) timelines. To use a sequence as an edit source, simply drag it from the bin into the source window. Then toggle the source/record icon to display the source’s timeline. The playhead bar or current timeline indicator will be bright green whenever you are working in the source timeline. If you have an audio-only clip, doing the same and enabling waveforms will display the source clip track with its corresponding waveform pattern.

Multiple projects – Another FCP favorite is the ability to have more than one project open at once. The Media Composer equivalent to this is Open Bin. This enables the editor to access any bin from any other project. Opening bins from other projects gives you full access to the master clips and associated metadata. In turn, that information is tracked as part of your current project going forward. Media Composer’s Find command is also active with these other bins.

Oversized images – Media Composer has traditionally been locked into standard broadcast NTSC, PAL and HD frame sizes. When it comes to dealing with higher-resolution images, Media Composer offers two solutions: the built-in Avid Pan & Zoom plug-in and Avid FX. Both of these function nicely for doing camera-style moves on bigger-than-TV-raster images. If you want to preserve the resolution of a 3500 x 3500 pixel still photo while zooming into it, Media Composer is perfectly capable of accomplishing the task. Edit a placeholder clip to the timeline and apply the Pan & Zoom filter to it. From the effect editor, access the high-res file, which will replace the placeholder content. Finally, program your keyframes for the move.
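Under the hood, a pan-and-zoom move is just parameter interpolation between the keyframes you set. Here’s a simplified linear-interpolation sketch of that idea (my own illustration, not Avid’s actual engine, which also offers spline easing):

```python
def interpolate_keyframes(kf, frame):
    """Linearly interpolate a parameter (e.g. zoom) between keyframes.
    kf is a sorted list of (frame, value) pairs, a simplified stand-in
    for what Pan & Zoom computes between the keyframes you program."""
    for (f0, v0), (f1, v1) in zip(kf, kf[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return v0 + t * (v1 - v0)
    return kf[-1][1] if frame > kf[-1][0] else kf[0][1]

# Zoom from 100% to 250% over 120 frames; halfway in, the zoom is 175%.
print(interpolate_keyframes([(0, 1.0), (120, 2.5)], 60))  # 1.75
```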

Avid FX – One of the “stealth” features of Media Composer is that it comes with Avid FX, an OEM version of Boris RED designed to integrate into Avid host systems. Apply an Avid FX filter to a clip and launch the separate interface to open a complete effects, titling and compositing environment. Not only can you apply effects to your timeline clips or import and manipulate high-res stills, but you can also import other QuickTime movies from your drives that aren’t part of the Avid project. Avid FX also includes its own set of Boris Continuum Complete and Final Effects Complete filters, even if the BCC or FEC products haven’t been separately installed into Media Composer.

Open timeline/frame rate mix and match – FCP is known for dealing with a wide range of formats, but in fact, Media Composer is superb at mixing frame rates and sizes in real-time. For example, freely mix SD and HD clips on an HD timeline and Media Composer will automatically scale up the SD clips. However, you can quickly change the format of the project to SD and Media Composer takes care of applying the opposite scale parameters, so that SD clips appear normal and HD clips are scaled down to fit into the SD raster. In addition, codecs and frame rates can also be mixed within the same sequence. Put 23.976fps clips into a 29.97fps timeline or vice versa and Avid’s FluidMotion technology takes care of cadence and frame rate adjustments for seamless playback and blending in real-time.

Smart Tool – Easy editing within the timeline by using the mouse or keystrokes has long been a hallmark of FCP. Avid’s product designers sought to counter this (and add their own twists) with the introduction of Smart Tool. When enabled, it offers contextual timeline editing. Hovering the cursor over the top or bottom of a clip – or close to the top, bottom, right, center or left side of a cut – will change the function you can perform when clicking and dragging with the mouse. It takes a while to get comfortable with Smart Tool and many veteran Avid editors aren’t big fans. Nevertheless, editors who’ve adapted to it can quickly re-arrange clips and trim edit points in much the same way as FCP editors use the Selection and Rolling Edit tools.

Perforation-slipping – Edit accuracy on either NLE is limited to one-frame intervals, but a little-known feature of both applications is the ability to slip audio sync in increments smaller than a video frame. While FCP allows sample-based adjustments, Media Composer does this with a technique borrowed from its film editing heritage. Start by creating your project as a film project (even for non-film media), which enables the Slip Perf Left/Right commands. Standard 35mm film uses four sprocket holes (perforations) per frame, which equates to quarter-frame accuracy. The Slip Perf commands are a great way to fine-tune sound sync on subclips from double-system recordings, such as when an HDSLR camera and a separate audio recorder were used.
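The quarter-frame arithmetic is easy to verify. This is a minimal sketch of the math only; the function name and the milliseconds-based workflow are illustrative and not part of Media Composer:

```python
# 35mm film runs 4 perforations per frame, so at 24fps one perf of
# audio slip equals 1/96 of a second (about 10.4 ms).
PERFS_PER_FRAME = 4
FPS = 24
PERF_MS = 1000 / (FPS * PERFS_PER_FRAME)  # ~10.42 ms per perf

def perfs_to_slip(offset_ms):
    """Round a measured sync offset (in milliseconds) to the nearest
    whole number of perfs for the Slip Perf Left/Right commands."""
    return round(offset_ms / PERF_MS)

print(perfs_to_slip(21))  # 2 -> slip two perfs, i.e. half a frame
```

In other words, a sync error of roughly 10 ms or more is worth a one-perf slip, which is finer than anything a whole-frame trim can fix.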

AMA – The first instinct of many young editors is to start a project by dragging media from anywhere on their hard drives into the project and begin editing. FCP facilitated this style of working, whereas Media Composer traditionally enforced a rigid routine for importing and capturing media. Avid Media Access (AMA) brings more of that “drag-and-drop” world into Media Composer. Use Link to AMA Volume(s) to mount entire drives or card volumes, such as P2 cards, or Link to AMA File(s) to link to individual files. Conceived as a process similar to FCP’s Log and Transfer, AMA also lets you edit directly from native files. The recommended workflow is a hybrid approach: load your clips via AMA, cull down to your selects, and then transcode just those clips into optimized Avid media. Either way, you have immediate access to native media, as long as there’s an AMA camera plug-in for it. This includes RED, Canon XF, P2, XDCAM and most QuickTime-based camera file formats, such as ARRI ALEXA and Canon HDSLRs.

ProRes – Ever since Apple introduced the ProRes codec family, it has been gaining share as an acquisition, edit and delivery format – thanks to the ubiquitous nature of QuickTime. It has become ingrained into many FCP editing workflows, so media compatibility is important when considering a switch. Media Composer version 6 now supports native ProRes media. Apple ProRes QuickTime files can be imported on both Mac and PC versions; in addition, Mac Media Composers can render and export natively to ProRes, as well. When the files are ingested using the standard (non-AMA) method, the ProRes files are “fast imported”, meaning the media is copied without conversion and the container is rewrapped from .mov to .mxf. Avid takes care of maintaining proper levels without the dreaded QuickTime gamma shift. For FCP editors moving to Media Composer, this means easy access to existing media files from past projects using one of two import methods – native ProRes import or AMA linking.

Resolve roundtrip – Both NLEs have good, built-in color correction tools, but Final Cut users who needed more grading horsepower turned to Apple Color. With the demise of Color, DaVinci Resolve has taken up the mantle as the preferred desktop grading tool. It becomes the perfect outboard grading companion when used in conjunction with Media Composer. Even the free Resolve Lite version can read and write MXF media. Start by exporting an AAF file for your edited Media Composer sequence. Make sure the Resolve Media Pool includes the location of your Avid media files. Then import the AAF into Resolve, which in turn links to the MXF media. Grade the clips as usual, then render and export using the Avid roundtrip preset along with the corresponding AAF file. The folder of rendered Resolve media (in the MXF format) should be renamed to a number that isn’t already in use and moved into the Avid MediaFiles/MXF subfolder. When the corresponding AAF file is imported into your Media Composer project, the new sequence will be automatically relinked to these graded MXF files.
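The folder renaming step is easy to get wrong by hand. Here’s a minimal Python sketch of just that step, assuming the standard Avid MediaFiles/MXF layout on the media drive; the function name and paths are hypothetical examples, not part of either application:

```python
import shutil
from pathlib import Path

def install_resolve_renders(render_dir, media_drive_root):
    """Move a folder of Resolve-rendered MXF files into the
    Avid MediaFiles/MXF folder on the media drive, under the next
    unused numbered subfolder name -- the layout Media Composer scans.
    Paths are hypothetical examples."""
    mxf_root = Path(media_drive_root) / "Avid MediaFiles" / "MXF"
    mxf_root.mkdir(parents=True, exist_ok=True)
    # Find the lowest number not already used as a subfolder name.
    n = 1
    while (mxf_root / str(n)).exists():
        n += 1
    dest = mxf_root / str(n)
    shutil.move(str(render_dir), str(dest))
    return dest
```

Once the numbered folder is in place, importing the roundtrip AAF relinks the sequence to the graded media, as described above.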

Open I/O – FCP users have enjoyed a wide range of capture and output hardware options, and that same flexibility came to Media Composer with version 6. Both Media Composer and Symphony run as software-only applications, as well as with capture devices from any of five third-party companies: AJA, Blackmagic Design, BlueFish444, Matrox and MOTU – plus Avid’s own gear, of course. Set up your Media Composer project as you normally would and the card or external device simply works, with no need to directly alter any card settings. Since the hardware from most existing FCP installations can still be used, Media Composer becomes another available application for the arsenal. This has made switching a “no-brainer” for many FCP editors.

Two links to 200 of the best Media Composer tutorials for beginners on the web (Mac readers should view these in Firefox, not Safari or Chrome):

Douglas Bruce’s First 100 Basic MC Tutorials

Douglas Bruce’s Next 100 Basic MC Tutorials

Originally written for DV magazine/Creative Planet/NewBay Media, LLC

©2012 Oliver Peters