Matrox MXO2 Mini

I’ve covered Matrox for a number of years. With the development of the MXO and MXO2 units, Matrox has put together one of the strongest families of I/O products available to Apple Final Cut Pro editors. The original MXO was designed to take the video signal from one of the internal graphics card’s DVI ports and turn it into a broadcast-quality signal for output and monitoring. The MXO2 was built as a more traditional ingest and output system. Instead of DVI, the MXO2 connects to the PCIe bus, via a card for the Mac Pro or an ExpressCard|34 adapter for laptops. In addition to the MXO and MXO2, the product group has grown to include the MXO2 Rack, MXO2 LE and MXO2 Mini.

From the beginning, the MXO2 was designed with the addition of future technology in mind. Last year Matrox revealed that new MXO2 products accommodate an H.264 encoding chip, which customers can purchase as an option with any new MXO2. This is branded as the MAX option and involves additional hardware integrated into the MXO2 product. It adds $400 to the price of any of the units. However, customers with older MXO2 units or other I/O hardware can still benefit from Matrox’s hardware-accelerated H.264 encoding by purchasing the standalone CompressHD PCIe card.

MXO2 Mini Configuration

I’ve been working with an MXO2 Mini (with the MAX option) that Matrox loaned for this review. It packs a lot of punch for under a grand and with the accelerated H.264 encoding, adds value beyond just I/O. The Mini is the smallest of the MXO2 products and is ideal for use with a laptop. You can purchase a unit with either a PCIe card or an ExpressCard|34 adapter, and the other interface can be added as an optional accessory. With both interface adapters, one MXO2 Mini can alternately be used with a MacBook Pro and a Mac Pro.

The back panel of the Mini sports HDMI digital ports and RCA-style analog input and output connectors. The analog video connectors can be assigned as component, composite and/or S-video from the Matrox MXO2 Mini control panel (installed into system preferences), which leaves two analog connectors for left and right audio. The HDMI ports support up to eight channels of embedded digital audio. Unlike the larger MXO2 models, the Mini has no RS-422 port for VTR control and it has to run on AC power (no battery options).

To date, the MXO2 Mini supports Apple Final Cut Studio plus Adobe After Effects and Premiere Pro. I did my testing with Final Cut Pro and, like all I/O products, installing any MXO2 adds a whole list of Easy Setups to Final Cut’s pulldown menus. Open the MXO2 control panel in system preferences and that’s where you’ll discover the true power of the Mini. You can ingest from either the analog or HDMI inputs, but both output sets are simultaneously active. The Mini uses a “main” and an “SD” channel, which can be assigned to either the HDMI or analog outputs. The main channel can follow the host application settings or be forced to always operate as 720p or 1080i. Lastly, you can set the pulldown cadence for 24p sequences and adjust SD aspect ratio conversion (center cut, letterbox, anamorphic). In spite of the small size, the MXO2 Mini gives you a wealth of control to up-, down- and cross-convert SD and HD video streams.

Who is it for?

Clearly good things come in small sizes, but some users will find the missing items, like the lack of RS-422 and SDI, a deal-breaker. Those customers are better served by the other MXO2 products. On the other hand, I see a number of markets where the Mini is just right. For instance, if you work almost exclusively with file-based acquisition (P2, XDCAM, EX, etc.) and never need to capture from a digital tape deck (Digital Betacam, HDCAM, etc.), then the Mini may be for you. Most of these editors are mainly interested in high-quality monitoring. Modern HD displays are adopting the HDMI standard, so the Mini fits the bill.

Another scenario is the multi-suite facility. In many of these operations, one room does the primary ingest and mastering to and from tape, while the other rooms are connected to shared storage and handle the editing chores. Again, only high-quality monitoring is required in these rooms, so no need to spend extra for I/O capability that is never used. In this situation, one main room equipped with an MXO2 Rack and several satellite suites with MXO2 Minis would seem like an ideal configuration.

Location monitoring with a laptop is, of course, the obvious application. It’s a small, light unit that’s ideally suited to this environment. Another nice feature in the control panel is HDMI calibration. Since low-cost rooms frequently use consumer-grade HD LCD displays in lieu of broadcast monitors, the MXO2 Mini includes calibration controls to adjust the HDMI output for the display panel it’s connected to.

Encoding

Monitoring and I/O are only part of the equation, now that MAX has been added. When the MXO2 software is installed, it adds a component to route Apple Compressor’s H.264 encoding through the accelerator chip inside the MXO2 unit. This can be deactivated from the control panel, in case you don’t want to use it for some reason. Matrox-specific selections are also added to the various preset groups in Compressor. Pick one of the Matrox versions of an H.264 setting, such as for Apple TV or an iPhone, and it will be optimized to work in conjunction with the MAX chip. You can also create and save your own custom settings.

So, how much faster is it? I took a 6:30-long 1080i Apple ProRes QuickTime movie and encoded it a number of different ways, using the MXO2 Mini, as well as other applications. The Mini was installed on my Core 2 Duo 2.4GHz MacBook Pro, which gave me a chance to compare hardware acceleration on the slower machine versus software-only encoding on my faster 8-core 2.26GHz Mac Pro. The software I used included Apple Compressor (without the Mini), Adobe Media Encoder, Sorenson Squeeze 6 and Telestream Episode Pro.

The 8-core Mac Pro – using Compressor without MAX acceleration – took about 30 minutes to encode this clip into two targets (Apple TV and iPhone). The same clip running on the laptop with the MXO2 Mini was encoded in about 20 minutes, but there I assigned a total of three presets (Apple TV, iPhone and a custom SD size). So, more encoded files in less time using a “slower” machine. When I encoded the same file on the MacBook Pro, using Compressor with MAX deactivated, the iPhone preset alone required 50 minutes to encode.
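
Normalizing those times by the number of presets gives a rough sense of the throughput difference. This is strictly back-of-envelope arithmetic based on the figures above; real encode times don’t divide perfectly evenly across presets:

```python
# Rough per-preset encode times from the tests above (minutes).
# Assumes encode time splits evenly across presets - an approximation.
mac_pro_software = 30 / 2    # 8-core Mac Pro, Compressor alone, 2 presets
macbook_max = 20 / 3         # MacBook Pro + MXO2 Mini MAX, 3 presets
macbook_software = 50 / 1    # MacBook Pro, Compressor alone, 1 preset

print(f"Mac Pro (software):     {mac_pro_software:.1f} min/preset")
print(f"MacBook Pro (MAX):      {macbook_max:.1f} min/preset")
print(f"MacBook Pro (software): {macbook_software:.1f} min/preset")
print(f"MAX speedup on the same laptop: {macbook_software / macbook_max:.1f}x")
```

By this crude measure, the MAX-equipped laptop turned out work more than twice as fast per preset as the software-only 8-core Mac Pro, and several times faster than the same laptop running Compressor alone.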

Encoding times vary depending on scaling, cropping, deinterlacing and so on. Another comparison was to convert this clip to a Blu-ray preset, where no resizing was involved. The Matrox-accelerated encoding was approximately real-time, while various software-only times on the 8-core covered a wide range from Adobe Media Encoder (fastest) at 15 minutes to Compressor at 26 minutes. The quality of the MAX-encoded files always looked clean and compression artifacts compared favorably with the best of any of these encoders.

H.264 Playback

The 1.9 software for MXO2 products was released in February and includes Matrox’s new Vetura Playback application. This player enables H.264 playback through the MXO2 ports, complete with SD and HD conversion. The intent is for field and studio playback of H.264 sources. For instance, Matrox envisions a field journalist cutting a story and encoding it to H.264 for a fast internet feed back to master control. This can then go straight to air using the Vetura Playback application and an MXO2 product.

I don’t know if news operations will see it this way or not, but at least it does work. I took a number of my test files (1080i source clips that were encoded to Apple TV and iPhone presets) and played them through Vetura. These files played back correctly through both the HDMI port to my HD display, as well as through the composite SD channel to an older NTSC TV set.

One interesting area to look at is the trend to use HDSLR cameras like the Canon EOS 5D Mark II and Canon EOS 7D for real production, including news. These hybrid cameras record H.264 1080p QuickTime movies. I took some original 5D files straight into Vetura and they played just fine in both HD and SD to the two monitors. The only quirk was that the 5D files are a true 30fps and so ran slightly fast. The video seemed to look fine, but the audio sounded sped up. Unfortunately, I didn’t have any 7D files available to test; the 7D records at a video-friendly 29.97fps rate.

The reason these hybrid cameras were originally developed was to service photojournalists who also shoot video. It seems like the Mini and Vetura application could easily play an increasing role in this ecosystem. It will be interesting to see where Matrox takes the MXO2 family in the future.

Written for Videography magazine and NewBay Media LLC

©2010 Oliver Peters


Casino Jack

One of this year’s documentary films with legitimate Sundance buzz is Casino Jack and the United States of Money. The work of award-winning director Alex Gibney, Casino Jack chronicles the rise and fall of notorious Washington lobbyist Jack Abramoff. Gibney picked up a Best Documentary Oscar for Taxi to the Dark Side in 2008. I had a chance to chat with the editor of Casino Jack, Alison Ellwood, who also shares producing credits on this film. Ellwood is a New York and Massachusetts-based editor who frequently works in conjunction with Gibney’s Jigsaw Productions. Gibney and Ellwood have collaborated on a number of uniquely American films over the past decade, including Enron: The Smartest Guys in the Room and Gonzo: The Life and Work of Dr. Hunter S. Thompson.

Like most documentary film editors, Alison Ellwood has travelled that road from the start. According to Ellwood, “I’ve largely worked on documentary productions my entire career as an editor. In fact, Tanner ’88 [the Robert Altman TV series] was my primary experience as an editor on dramas. I like working on documentaries, because I wanted to be a photojournalist when I was in school. Documentary filmmaking is just an extension of that goal.”

Given that Jack Abramoff has become so vilified in the media, I first asked how she felt he was represented in Casino Jack. Ellwood replied, “Our story was really about ‘who is he?’ Jack Abramoff is very present in this film and appears many times in archival interviews. In fact, Alex interviewed him a number of times in prison, but in the end, he declined to do a current interview on-camera for this film. What he did was elevate ‘business as usual’ in Washington to a new level. I think in certain ways, the film treats him somewhat sympathetically. It’s less about ‘here’s this rotten apple’ and more about the whole barrel being rotten. Jack is actually quite charming and funny in many of the clips we have and some of that humor shows on screen. I think he actually comes across better in this film than many would have expected.”

Casino Jack was a three-and-a-half-year project, with hundreds of hours of archival and original footage in a range of formats, like DVCPro, Betacam-SP, HDCAM, DVCAM, DVD and QuickTime movies. I discussed with Alison how she was able to wrap her head around this much material and develop the story arc. She answered, “This project took several years and during that time Jigsaw was also working on other projects. As a rough estimate, I would say that the actual editing took about fourteen months. I had my first, four-hour-long rough cut at about seven months. The biggest challenge was finding a way to make the story understandable and to humanize it. This was a complex story to tell, just like Enron, but it was bigger in scope. It involved many different countries and government leaders, both here and around the world. Enron was a financial story. We were trying to explain some of the arcane mark-to-market accounting practices in ways that the audiences could quickly grasp. In Casino Jack, the challenge was to present the complex world of politics and lobbying.”

As always, the challenge is deciding what to leave on the cutting room floor. Alison continued, “One big part of the first rough cut was Abramoff’s involvement with the [Commonwealth of the Northern] Mariana Islands.” Abramoff represented the government of the islands, which is a U.S. commonwealth. He was involved in efforts to affect Congressional action regarding the islands and businesses in the capital of Saipan.

“This was a very important story that involved serious worker abuse on the islands. It was an hour-and-a-half of the original cut and made a very good standalone story in its own right, but it was too long. We spent three months to get that down to forty minutes, but in the final cut, it’s only seventeen minutes of the film. That was tough to let go. This is such an intellectual film anyway. That part had real emotion, which helps give a film like Casino Jack impact. Ultimately we decided that we weren’t telling the Mariana Islands story, but rather that it was only a part of the whole story. Plus, it had already been well-documented by other programs.”

Another surprising editorial challenge you normally don’t associate with a documentary is comedic timing. Alison explained, “A lot of the clips are actually comical, but it’s dark humor. We thought these were funny, but were surprised when the audiences in our test screenings didn’t laugh in certain areas. That was, of course, because in the context of the story, these were quite shocking and the audience was stunned. So we made some adjustments to have these play out with the desired impact.”

The logistics of the variety and amount of footage raised its own challenges. The footage was all dubbed or downconverted to DVCAM and then ingested at the Avid 15:1 draft-quality resolution. Jigsaw uses Avid Media Composers and Unity LANshare storage, but there were multiple projects going through the shop during this time. To ease the load, Ellwood did most of her cutting during the last five or six months of the project on a MacBook Pro at her Massachusetts home. All of the 15:1 media fit onto a 1TB FireWire drive and both Jigsaw and Ellwood had duplicate copies of the media. When a cut was ready for screening, she would e-mail an Avid bin to her assistant editor at Jigsaw, who in turn would relink that to their copy of the media files.

Ellwood is firmly committed to Avid Media Composer as the editing tool of choice. In fact the inevitable Final Cut versus Media Composer question elicited this response, “I used Final Cut Pro on one project, but didn’t like it at all. I really think Media Composer is a better tool for storytelling. One big problem I had was with the behavior of locators. When I cut, I need the locators to stay linked to the clips. As I cut from a clip to a sequence or one sequence to another, I need the locators to be able to follow along, so that I have access to the notes and comments that I’ve made. Final Cut didn’t do this in the same way as I’m used to with Media Composer.”

One software feature – unique to Avid – that came into play on Casino Jack is ScriptSync. A script or transcript can be loaded as a text file into a Media Composer bin. ScriptSync will align media clips to lines of text by matching phonetics between the text lines and the audio tracks of the media files. Using ScriptSync, an editor can click on a line of text in the bin and have instant access to any and all available matching clips at that precise point within the clip. This becomes very handy when trying to find one particular answer in hours of recorded interviews.

A new feature that Ellwood would have liked to have had is Frame Rate Mix & Match. She was cutting on version 3.5 of the software and this is a feature that was introduced in Media Composer 4.0. Mix & Match lets the editor combine all the different film and video frame rates within the same sequence. The software takes care of properly adding or adjusting cadences to make the clips play correctly according to the sequence’s timebase.

Alison explained, “We had many different formats at different frame rates, so to make the offline editing easiest, I was working in a 30i [Avid Media Composer] project. Some of our footage was in PAL, which is 25fps. In order to quickly integrate those clips into the edit, the footage was literally shot off of a monitor that was displaying the playback from a PAL deck, using a 30fps NTSC camcorder. This is what I cut into the sequence. Once our offline was locked, we took the PAL tapes and did a slow-scan transfer and upconversion to HD. This new master material (along with all other archival masters) was then overcut into the offline sequence before handing over our online HD finishing to Postworks in New York.”

“A feature like Mix & Match would have been very helpful and I plan to use it in the future. We haven’t upgraded the Media Composer software yet, because we also need to upgrade the LANshare software. There are three projects on that storage, so we can’t upgrade it yet. When those are completed, then all the systems and software will be done at one time.”

Casino Jack was presented at Sundance and will be distributed through Magnolia Pictures and Participant Productions. Once a theatrical deal has been nailed down, a DI will be done on the HD master for release prints. Like most documentary editors, Alison Ellwood has been integral in shaping the story of all of these films – in essence, helping to ‘write’ the film through pictures and sound. Her next project is another Jigsaw film, which she co-directs with Gibney. This one will be a look at ‘60s pop icon Ken Kesey (author of One Flew Over the Cuckoo’s Nest) and his “Merry Pranksters”, as told through his own words and films.

Written for Videography magazine and NewBay Media LLC.

©2010 Oliver Peters

Grading with Color Wheels

Recently fellow editor Shane Ross and I were discussing the relative merits of grading in Avid Media Composer versus Apple’s FCP or Color, as well as using the Colorista plug-in. Our conversation got down to how each treated the image when you used the color wheels. After I did a quick test, it was obvious that FCP and Avid don’t process the image in quite the same way, even when you push what appears to be the equivalent control in the same direction. So I decided to dig a bit deeper.

The so-called “color wheels” (also called hue offset controls) are separate color balance controls for the shadow, midrange and highlight portions of the image. When you want to effect a change, such as make an image less red, you push the appropriate control in the opposite direction of red-yellow – towards the blue-cyan segment of the control. In addition to low/mid/high ranges, some applications also add an overall “master” balance control for the entire image.

The concept of these tools grew out of early “video shading” controls in studio cameras and color correction systems like DaVinci. To my knowledge, software color wheels first appeared in Avid Symphony a decade ago.

Since the controls work within three distinct ranges of the image, a critical element is where the crossover occurs between low-to-mid and mid-to-high – not only where, but also how gradual the transition is. Some apps and plug-ins give you control over this and some don’t. When they do, look for either a threshold control or a luma ranges control to adjust the transition point and the softness of that transition.
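
To illustrate what such ranges might look like under the hood, here is a hypothetical sketch of low/mid/high weighting with an adjustable crossover and softness. It is not how any particular application computes its ranges; the crossover points and softness values are made-up defaults:

```python
def luma_range_weights(luma, low_mid=0.33, mid_high=0.66, softness=0.1):
    """Split a 0.0-1.0 luma value into shadow/mid/highlight weights.

    Hypothetical sketch: each crossover is a linear ramp whose width is
    set by `softness`. A softness of 0 gives hard cutoffs (prone to
    contouring); larger values blend the ranges more gradually.
    """
    def ramp(x, center, width):
        # 0 below the transition, 1 above it, linear in between
        if width <= 0:
            return 1.0 if x >= center else 0.0
        t = (x - (center - width / 2)) / width
        return max(0.0, min(1.0, t))

    high = ramp(luma, mid_high, softness)
    above_low = ramp(luma, low_mid, softness)
    mid = above_low - high
    low = 1.0 - above_low
    return low, mid, high

# The three weights always sum to 1.0, so a correction applied through
# them never adds or removes overall signal.
print(luma_range_weights(0.5))   # falls squarely in the "mid" range
```

Widening `softness` spreads each correction across neighboring ranges, which is exactly why a soft threshold reads as a tint over the whole image, while a tight one risks visible contouring at the transition.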

Some points to remember:

a) Not all color wheel adjustments give you the same degree of saturation. If I push the mids to blue, some apps let me really blow out the image with blue saturation, while others only slightly tint the image.

b) A proper color balance control should increase the saturation of the color component you add (or shift to), while reducing the saturation of the complementary colors. This should be evident on a waveform monitor’s RGB parade display. In my tests, some apps only increased blue, while others also appropriately reduced the red and green color components.

c) Some have a very tight default threshold at the low/mid/high crossover points and others are very gradual. The softer the threshold or transition between ranges, the more the color balance change looks like a tint or wash over the whole image. The tighter the default transition, the more likely you will see contouring artifacts at the edge of the transition.
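
Point (b) can be sketched numerically. This is a hypothetical balance operation, not the actual math of any of the applications tested: blue gains the full adjustment, while red and green – the components of blue’s complement, yellow – each give up half of it, which is the behavior you’d want to see on an RGB parade display.

```python
def balance_toward_blue(r, g, b, amount):
    """Shift an RGB pixel (0.0-1.0 channels) toward blue.

    Hypothetical sketch: blue gains `amount`; red and green each lose
    half of it, so the complement (yellow) is reduced as blue rises.
    """
    clamp = lambda x: max(0.0, min(1.0, x))
    return clamp(r - amount / 2), clamp(g - amount / 2), clamp(b + amount)

print(balance_toward_blue(0.5, 0.5, 0.5, 0.25))  # (0.375, 0.375, 0.75)
```

A control that only raises the blue channel, by contrast, drives that one component toward clipping, which matches the blown-out behavior noted below for some of the 3-way filters.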

The following are the results of my casual testing. I took the same image I’ve used for other color grading articles. This started out as a flat RED One image, which I’ve cropped to HD and increased contrast and saturation.

That’s the starting point for these tests. I wanted to use an image that looked like what you are apt to receive, rather than a flatter image, which would actually be preferable for grading. In each of these applications I have simply increased the blue balance in the mid-range control, without adjusting any thresholds or other controls. In most cases, I’ve pushed the control as far as it would go – either to the edge of its range or to the point where I started to see some color artifacts. Most importantly, I was NOT going for the best-graded image. Instead, I’m trying to demonstrate how seemingly similar tools can actually give you wildly different results. A tool with an extreme range can often make grading more difficult, because it’s harder to achieve finesse.

At the bottom, I’ve included a series of exported images from each of these applications.

Apple Aperture

Aperture neutral

Aperture adjusted

Aperture is here only as a frame-of-reference. It offers similar color controls to some grading applications and presumably would have the most graceful processing of the image.

Avid Media Composer

Avid neutral

Avid adjusted

Avid Media Composer does a nice job of staying within the mid-range. It also tends to reduce red and green while increasing blue.

Apple Color

Apple Color neutral

Apple Color adjusted

The interesting thing about Color is that the app not only made the image blue, but it also seemed to darken it, compared with the other grading solutions. You can see this below in the exported image. The Color image was round-tripped through FCP.

Apple Final Cut Pro (3-way color correction filter)

Apple FCP 3-way neutral

Apple FCP 3-way adjusted

Apple FCP’s 3-way had the most extreme range, but it seemed to just increase blue without reducing the other colors. You’ll see that a color segment can be completely blown out by this control.

Red Giant Magic Bullet Colorista color correction filter (Apple FCP host)

Colorista adjusted

Red Giant’s Colorista grading filter is the one that many editors gravitate to when the built-in controls aren’t enough. As you see, it offers more graceful color control in FCP than the standard 3-way. Unfortunately, it’s also one of the few FCP plug-ins I use that is extremely picky about versions. If you move a sequence between systems and have a mismatch of Colorista filters installed, it can completely crash FCP or Motion. I find that it’s much better behaved in After Effects. There you also get additional tools, such as a collection of Colorista presets.

Red Giant Magic Bullet Looks

Looks neutral

Looks 3-way adjusted

Looks lift-gamma-gain adjusted

Magic Bullet Looks is another powerful third-party plug-in that lets you chain a series of internal filters together – all within its own interface. Looks offers several controls for color correction, including both a 3-way and a lift-gamma-gain control. The two sets of controls appear similar, but don’t work the same way. The lift-gamma-gain control works like Colorista, while the 3-way works more like Adobe’s built-in color correction. You’ll notice on the 3-way that the range isn’t as great and the default threshold is very tight (but adjustable). Note the contouring on the model’s shoulder blades.

Adobe Premiere Pro (3-way color correction filter)

Adobe Premiere Pro 3-way neutral

Adobe Premiere Pro 3-way adjusted

This is my least favorite filter in the batch. Like the Looks 3-way, the amount of blue added is fairly minor and the threshold is also very tight (but also adjustable). As one would expect with a tight threshold, there is also visible contouring at the bottom of her back in this image.

Synthetic Aperture Color Finesse color correction filter (Adobe After Effects host)

SA Color Finesse neutral

SA Color Finesse adjusted

Last in this set of comparisons is Synthetic Aperture’s Color Finesse 2 plug-in that ships with Adobe After Effects. Color Finesse offers a toolset that is very similar to Avid Symphony and has been included with After Effects for years. In addition to all the standard color grading models, Color Finesse also offers more advanced modes, like CMYK grading. If you have a two-monitor configuration, the Color Finesse UI displays a full-screen image on one of the monitors.

In my opinion, Synthetic Aperture Color Finesse produced the most pleasing results out of all of these video images. A version of Color Finesse is also available as a standalone grading application, which uses a similar workflow to Apple Color. In any case, it’s right inside Adobe After Effects, though many editors aren’t even aware of the power they already own if they have the CS3 or CS4 bundle!

Exported images

Starting image

Apple Aperture

Avid Media Composer

Apple Color

Apple Final Cut Pro (3-way)

Red Giant Colorista (FCP)

Magic Bullet Looks (3-way)

Magic Bullet Looks (Lift-Gamma-Gain)

Adobe Premiere Pro (3-way)

Synthetic Aperture Color Finesse

©2010 Oliver Peters

FCP Project Tips

When Avid and Final Cut editors talk smack on the internet forums, the arguments invariably get down to the strengths and weaknesses of each application when it comes to collaborative editing. There are pros and cons to each approach, but let’s take a look at how Final Cut Pro editors can organize projects in such a way that sharing can be painless.

When you compare Avid and FCP project types, you will quickly realize that Avid Media Composer projects (at the Finder level) are actually folders containing multiple bins. That’s where all the metadata lives. In the case of Final Cut Pro, the project file is a single monolithic file containing all the metadata. So in the simplest terms: FCP PROJECTS = Avid BINS.

The second important difference is how and where media is stored. Media Composer neatly stores all media in one folder on each media drive. All media that is ingested or imported (and transcoded) is placed into the target Avid media folder. The location of this folder on the hard drive cannot be altered by the user.

Final Cut Pro’s project window is primarily a media browser, so files can be dragged into the project from anywhere on any of the connected drives. You can also delete clips from the browser and that won’t affect clips that have been edited to the timeline. Avid controls the media location structure, whereas FCP leaves it up to the user. Avid’s method of media relinking is relatively bullet-proof, while FCP’s is more versatile, but also potentially problematic.

Here are some considerations to make your FCP project experience more solid.

File location – The most important consideration is file location. FCP has certain defaults, but people violate these all the time. Since you can drag content from anywhere, it’s very easy to grab an SFX out of the Soundtrack Pro library or an image from your personal Pictures folder. No problem when you are the only editor, but a potential disaster when you send the project to someone else to finish. Set up a structure of file organization and stick to it.

I personally feel that any extraneously-imported files should first be copied to a location consistent with that project, even if this means you are duplicating your media. If all of your media for a project is supposed to be on a single external drive, then first copy such files (images, VO, music, SFX, etc.) to a folder on that drive BEFORE you import these into FCP. When you hand off the media drive to another editor, no files will be missing.
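
That copy-first habit is easy to script. Here is a minimal sketch; the “Imported Assets” folder name and the category names are hypothetical, not an FCP convention – use whatever structure your shop has agreed on:

```python
import shutil
from pathlib import Path

def stage_asset(source_file, project_drive, category):
    """Copy an outside file onto the project's media drive before import.

    `category` is a subfolder such as "Music", "VO" or "Stills" -
    hypothetical names standing in for your own agreed structure.
    """
    dest_dir = Path(project_drive) / "Imported Assets" / category
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / Path(source_file).name
    shutil.copy2(source_file, dest)   # copy2 preserves timestamps
    return dest                       # import THIS path into FCP

# e.g. stage_asset("/Users/me/Pictures/logo.tif", "/Volumes/Project01", "Stills")
```

The returned path – not the original location in your Pictures folder – is what gets dragged into the FCP browser, so the project and its media travel together.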

FC Studio files – Remember that Final Cut Pro is only part of Final Cut Studio. When you use other applications in the suite, like Motion, they each create their own project files. With the exception of Motion’s master templates, LiveType and Motion project files are not stored as part of the FCP project, even though they appear in the browser. The file in the FCP browser is only a link to the real project file stored somewhere else. Part of your file strategy needs to determine where these files are to be stored. For instance, are you going to store all Motion projects in a single Motion Documents folder or are these files going to be stored with the media for that project? This becomes a crucial decision if you ever have to move the drives and relink media.

Folder structure – When two editors are working on the same project and sharing media on separate drives – or a drive that is passed back and forth – make sure you each maintain the exact same folder hierarchy. This is critical in order to properly relink projects to their media. If you don’t do this, each time you have to relink, it will become an adventure searching through the folders on the drive. If the media path is identical, except for the host computer’s name, FCP has almost no problem in linking media to an FCP project on a different computer. The easiest way for two editors to collaborate (different locations, different drives) is for the media drives to be exact duplicates or clones of each other. Then relinking should be instant whenever project files are copied and shared between the editors.
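
A quick way to verify that two media drives really do share the same hierarchy is to compare their file paths relative to each drive’s root. This is a generic sketch (the volume paths in the example are hypothetical):

```python
import os

def relative_tree(root):
    """Return the set of file paths under `root`, relative to it."""
    paths = set()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            paths.add(os.path.relpath(full, root))
    return paths

def compare_drives(drive_a, drive_b):
    """Report files that exist on one drive but not the other."""
    a, b = relative_tree(drive_a), relative_tree(drive_b)
    return sorted(a - b), sorted(b - a)

# e.g. compare_drives("/Volumes/Media_A", "/Volumes/Media_B")
```

If both returned lists are empty, the hierarchies match and relinking on the other machine should be painless.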

Multiple projects – One of FCP’s big pluses is the ability to work with multiple projects and multiple sequences open at the same time. Large FCP project files can be unwieldy, so breaking your production into multiple smaller projects helps to control this. Each FCP PROJECT file works like an Avid BIN file (more or less). Following this philosophy, you might have individual projects for each day of dailies and for your edited sequences. But remember that the FCP browser/project window holds your metadata, like comments, timecodes, etc., and this can cause some concern when working with multiple projects. For instance, the match frame command locates master clips within a project or connects to actual media files, but does not reach into other projects. Working across several projects within the same production might not be ideal for you; nevertheless, this way of working can be very streamlined.

Sharing sequences – Another use of multiple projects is when you share edits with another editor. There is no need to copy and send the complete project file each time someone makes a change. Instead, editor “A” makes changes to a copy of the sequence and places that into a new and separate project, containing only that one sequence. Send the file to editor “B”, who opens that project along with the master project. Copy the revised sequence into the master project and continue.

This way of working mimics the concept behind Avid Unity. Since Avid bins are separate files, Unity controls the user’s write access to any open bin. A hallmark of Unity shared storage is that two or more Avid editors can simultaneously work in the same open project file on different workstations. However, each can only make revisions (write permission) in a bin that they have opened first. The other editors only have read permission to such a bin. In order for two Avid Unity editors to make concurrent revisions to different segments of the SAME sequence, a copy must be made and moved into a separate bin, which is closed by the first editor and then re-opened by the other, in order to change the write permission. Once both have made their changes, one of the two editors must combine the two timelines back into a single timeline, which reflects both sets of changes. This is not unlike two FCP editors working with duplicate copies of the same FCP project file.

File and Volume Locking – Shared storage systems in their simplest form control read and write permissions through file-level or volume-level locking. File-level locking means that two editors on different computers can write to the same drive partition without the danger of overwriting each other’s files. Volume-level locking requires that drives are partitioned and each editor is assigned read-write or read-only permission to any given partition. The determination of file versus volume is controlled by the internal file system of the shared storage solution and the media management software used as the “traffic cop”.
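
File-level locking can be illustrated with ordinary advisory locks from the operating system. This is a generic POSIX sketch of the idea – not how Xsan, TerraBlock or Avid Unity actually implement their traffic management:

```python
import fcntl

def try_exclusive_lock(file_obj):
    """Attempt a non-blocking exclusive lock; True if we got it.

    Uses POSIX advisory locking (flock) purely to illustrate
    file-level write arbitration between two would-be writers.
    """
    try:
        fcntl.flock(file_obj.fileno(), fcntl.LOCK_EX | fcntl.LOCK_NB)
        return True
    except OSError:
        return False   # another "editor" already holds the lock
```

The first writer to grab the lock gets write access; anyone else is turned away until it is released, which is the essence of file-level locking – arbitration happens per file rather than per drive partition.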

Apple’s Xsan software controls media management and is a file-level based system. All editors see a single media volume mounted on their desktop. They have access to all media and Xsan controls the writing of files, such as renders. A volume-level system, like the original Facilis TerraBlock units, uses multiple partitions, where each editor is typically assigned only one volume with write permission. There are also some systems that use Ethernet connections and no “traffic cop” software. Instead, they rely on the OS and Apple’s internal networking structure to handle the traffic. One computer in the system is set up as the “server” and the media is connected locally to it. The other workstations access the media files over the LAN.

Simple Ethernet-based solutions work relatively well in small shops (up to 4 or 5 systems), but should be used largely as read-only systems. In other words, aside from writing while ingesting media, other simultaneous writing tasks should be kept local to avoid potential collisions. This means that editors should store project files, assign the Autosave target and render all media to a local drive. The downside to this suggestion, however, is that two editors working on copies of the same project will have to individually render their own sequences, since the render files won’t be on the shared storage.

Using the OS – Apple has integrated many aspects of Final Cut Pro into the fabric of OS X. FxPlug, QuickTime, Apple Events and XML-based roundtrips are all part of this. When you drag a movie file from a drive location directly into the FCP browser, it automatically shows some of the metadata shared between QT and FCP, such as timecode and reel number. When you drag the project icon of a LiveType or Motion project directly from that app’s header bar into an FCP project, you have, in fact, imported that LiveType or Motion project.

If you want to get more out of your FCP experience, then use other parts of the OS. For example, Quick Look and Cover Flow can be used to browse media files. XML has opened a small industry of third-party apps to perform tasks with FCP files that can’t be done by FCP itself. Spotlight can be used to find media files. You don’t have to ingest with FCP, but can use other capture utilities (like AJA or Telestream) instead, if that suits your purpose.
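
Spotlight searches can even be scripted through the `mdfind` command-line tool. The sketch below builds a query for QuickTime movies under a folder; the `kMDItemContentType` key is a standard Spotlight metadata attribute, while the search path is a hypothetical example:

```python
import shutil
import subprocess

def find_movies_command(search_root):
    """Build an mdfind command that lists QuickTime movies under a folder."""
    return ["mdfind", "-onlyin", search_root,
            'kMDItemContentType == "com.apple.quicktime-movie"']

cmd = find_movies_command("/Volumes/Media")
if shutil.which("mdfind"):          # mdfind is only available on macOS
    print(subprocess.run(cmd, capture_output=True, text=True).stdout)
```

Because Spotlight has already indexed the drive, a query like this returns instantly – no need to walk the whole folder tree looking for media.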

I’ve probably raised more questions than given answers, but hopefully these tips will form a starting point for better FCP organization. Now go edit!

©2010 Oliver Peters