Blackmagic Design HyperDeck Shuttle

The video industry has been moving toward complete file-based workflows, but that doesn’t replace all of the functions that traditional videotape recorders served. To bridge the gap, companies such as AJA, Blackmagic Design, Convergent Design, Sound Devices and others have developed solid state recorders for field and studio operation. I recently tested Blackmagic Design’s HyperDeck Shuttle 2, which is touted as the world’s smallest uncompressed recorder.

Blackmagic’s HyperDeck series includes the Shuttle and two Studio versions. The latter are rack-mounted VTR-replacement devices equipped with dual SSDs (solid state drives). The Shuttle is a palm-sized, battery-powered “brick” recorder. A single SSD slides into the Shuttle enclosure, which is only a bit bigger than the drive itself – enough to accommodate battery, controls and internal electronics. To keep the unit small, controls are limited to basic record and transport buttons, much like those of a consumer CD player. You can power it from an external supply, from on-board camera power or from the internal battery. The internal, non-removable, rechargeable battery holds its charge for a little over one hour of continuous operation. The purchased unit includes a 12-volt power supply and a kit of international AC plug adapters.

The HyperDeck Shuttle includes 3Gb/s SDI and HDMI for digital capture and playback. Recording formats include 10-bit uncompressed QuickTime movies, as well as Avid DNxHD 175x or 220x in either QuickTime or MXF-wrapped variations. At the time I tested this device, it would not record Apple ProRes codecs. In November, Blackmagic Design released a free software update (version 3.6), which added ProRes HQ to the uncompressed and DNxHD options. It also added closed captioning support to all HyperDeck models.

Since the unit is designed for minimal interaction, all system set-up is handled by an external software utility. Install this application on your computer, connect the HyperDeck Shuttle via USB and then you’ll be able to select recording formats and other preferences, such as whether or not to trigger recording via SDI (for on-camera operation). The unit has no menu, which means you cannot alter, rename or delete files using the button controls or the software utility. There is a display button, but that was not active in the software version that I tested.

Solid state recording

The SSD used is a standard 2.5” SATA III drive. Several different brands and types have been tested and qualified by Blackmagic Design for use with the HyperDeck units. These drives can be plugged into a generic hard drive dock, like a Thermaltake BlacX Duet, to format the drive and copy/erase any files. The SSD was Mac-formatted, so it was simply a matter of pulling the drive out of the Shuttle’s slot and plugging it into the Duet, which was connected to my Mac Pro tower. This allowed me to copy files from the drive to my computer, as well as to move files back to the SSD for later playback from the Shuttle. (At IBC, Blackmagic also announced exFAT support for the HyperDeck products.) The naming convention is simple: recorded files are labeled Capture001, Capture002 and so on. Unfortunately, it does not embed reel numbers into the QuickTime files. Placing a similarly named file in the correct format (more on that in a moment) onto the drive makes it possible to use the Shuttle as a portable master playback device for presentations, film festivals, etc.

My evaluation unit came equipped with a 240GB OCZ Vertex 3 SSD. This is an off-the-shelf drive that runs under $200 at most outlets. By comparison, a Sony 124-minute HDCAM-SR videotape is now more expensive. It’s amazing that this SSD will sustain extended 10-bit uncompressed 1080i/59.94 recording and playback, when even most small drive arrays can’t do that! In practical terms, a 240GB drive will not hold a lot of 1080i 10-bit uncompressed media, so it’s more likely that you would use Avid DNxHD 220x or Apple ProRes HQ for the best quality. You could easily fit over 90 minutes of content on the same SSD using one of these codecs and not really see any difference in image quality.
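A quick back-of-the-envelope calculation bears this out. The sketch below assumes v210-style packing for 10-bit uncompressed 4:2:2 (six pixels in 16 bytes) and the nominal 220 Mb/s DNxHD bitrate; real-world capacity will vary with audio tracks and file overhead.

```python
# Rough storage math for a 240 GB SSD (approximate figures, video only).
# 10-bit uncompressed 4:2:2 packs 6 pixels into 16 bytes (v210-style).
bytes_per_pixel = 16 / 6
frame_bytes = 1920 * 1080 * bytes_per_pixel        # one 1080 frame
uncompressed_bps = frame_bytes * 29.97 * 8         # 1080i/59.94 = 29.97 frames/sec
dnxhd_220_bps = 220e6                              # nominal DNxHD 220x bitrate

drive_bits = 240e9 * 8                             # 240 GB (decimal gigabytes)

print(f"uncompressed: {drive_bits / uncompressed_bps / 60:.0f} minutes")
print(f"DNxHD 220x: {drive_bits / dnxhd_220_bps / 60:.0f} minutes")
```

This works out to roughly 24 minutes of uncompressed video versus well over two hours of DNxHD 220x on the same drive.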

In actual use

I tested the unit with various codecs and frame rates. As a general rule, it’s not a good idea to mix different flavors on the same drive. For example, if you record both 1080i 10-bit uncompressed and 1080p/23.98 Avid DNxHD clips on the same drive, the HyperDeck Shuttle will only be able to play back the clips that match its current set-up. The Shuttle does auto-detect the incoming frame rate without the need to set that using the utility. It did seem to get “confused” in this process, making it hard to access the clips that I thought it should have been able to play. The clips are on the SSD, though, so you can still pull them off of the drive for use in editing. For standard operation, I would suggest that you set your preferences for the current production and stick to that until you are done.

Blackmagic Design sells a mounting plate as an accessory. It’s easy to install by unscrewing the HyperDeck’s back panel and screwing in the mounting plate in its place. I loaned the unit to a director of photography I work with for use with a Canon C300. Although there are common mounting holes, the DP still ended up having to use Velcro to mount both his battery and the Shuttle onto the same camera rig. The recordings looked good, but the SDI trigger did not properly stop the recording, requiring the DP to manually stop the unit with each take. Another issue for some is that it uses Mini-BNC connectors. This requires an investment in some Mini-BNC adapter cables for SDI operation, if you intend to connect it to standard BNC spigots.

Overall, the unit performed well in a variety of applications, but with a few quirks. I frequently found that it didn’t respond to my pushing the transport control buttons. I’m not sure if this was due to bad button contacts or a software glitch. It felt more like a software issue, in that once it “settled down,” stepping forward and backward through clips and pushing the play button worked correctly. The only format I was not able to play back was 24p media recorded as MXF. Nevertheless, the MXF formatting was correct, as I could drop these files right into an Avid MediaFiles folder on my media hard drive for editing with Avid Symphony.

HyperDeck Shuttle as a portable player

If you intend to use a HyperDeck Shuttle as a master playback device, then there are a few things you need to know. It can capture interlaced, progressive and progressive-segmented-frame (PsF) footage, but it will only play these out as either interlaced or progressive via the SDI connection. Playing PsF as progressive is fine for many monitors and projectors, but the signal doesn’t pass through many routers or to some other recorders. Often these broadcast devices only function with a “true” progressive signal if the format is 720p/59.94. This means that it would be unlikely that you could play a 1080p/23.98 file (captured as PsF) and record that output from the HyperDeck Shuttle to a Sony HDCAM-SR video recorder, as an example.

It is possible to export a file from your Avid NLE, copy that file to the HyperDeck’s SSD using a drive dock and play it back from the unit; however, the specs get a little touchy. The HyperDeck Shuttle records audio as 16-bit/48kHz in the Little Endian format, but Avid exports its files as Big Endian. Endianness refers to how the bytes are ordered in a 16-, 32- or 64-bit word and whether the most or least significant byte comes first. In the case of the Shuttle, this difference meant that I couldn’t get any audio output during playback. If your goal is to transfer a file to the Shuttle for duplication to another deck or playback in a presentation environment, then I would recommend that you take the time to make a real-time recording. Simply connect your NLE’s SDI output to the HyperDeck Shuttle’s SDI input and manually record to it, on-the-fly, like a tape deck.
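To illustrate what that byte reordering actually involves (this is a generic sketch, not anything from Blackmagic’s or Avid’s code), converting a 16-bit audio stream between the two orders simply reverses each sample’s two bytes:

```python
# Hypothetical example: reorder 16-bit PCM samples from big-endian
# (as Avid exports them) to little-endian (as the Shuttle records).
import array

big_endian_bytes = bytes([0x12, 0x34, 0xAB, 0xCD])   # two samples: 0x1234, 0xABCD

samples = array.array("h")          # signed 16-bit integers
samples.frombytes(big_endian_bytes)
samples.byteswap()                  # reverse the two bytes of every sample

print(samples.tobytes().hex())      # -> 3412cdab
```

The reordering itself is trivial, but it has to happen somewhere in the chain; a player that expects one order and gets the other produces silence or noise, which matches the no-audio symptom described above.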

The HyperDeck Shuttle is a great little unit for filling in workflow gaps. For example, if you don’t own any tape decks, but need to take a master to a duplication facility, you could easily use the Shuttle to transport your media and for on-site master playback. It’s a bit too quirky to be a great on-camera field recorder, but at $345 (plus the SSD), the Shuttle is an amazing value for image quality this good. As with their other products, Blackmagic Design has a history of enhancing capabilities through subsequent software updates. I expect that in the future, we’ll see the HyperDeck family grow in a similar fashion.

Originally written for DV magazine / Creative Planet Network

© 2013 Oliver Peters

Zero Dark Thirty

Few films have the potential to be as politically charged as Zero Dark Thirty. Director Kathryn Bigelow (The Hurt Locker, K-19: The Widowmaker) and producer/writer Mark Boal (The Hurt Locker, In the Valley of Elah) have evaded those minefields by focusing on the relentless CIA detective work that led to the finding and killing of Osama bin Laden by US Navy SEALs. Shot and edited in a cinéma vérité style, Zero Dark Thirty is more of a suspenseful thriller than an action-adventure movie. It seeks to tell a raw, powerful story that’s faithful to the facts without politicizing the events.

The original concept started before the raid on bin Laden’s compound occurred. It was to be about the hunt, but not finding him, after a decade of searching. The SEAL raid changed the direction of the film; but, Bigelow and Boal still felt that the story to be told was in the work done on the ground by intelligence operatives that led to the raid. Zero Dark Thirty is based on the perspective of CIA operative Maya (Jessica Chastain), whose job it is to find terrorists. The Maya character is based on a real person.

Zero Dark Thirty was filmed digitally, using ARRI Alexa cameras. This aided Kathryn Bigelow’s style of shooting by eliminating the length limitations of film mags. Most scenes were shot with four cameras and some with as many as six or seven at once. The equivalent of 1.8 million feet of film (about 320 hours) was recorded. The production ramped up in India with veteran film editor Dylan Tichenor (Lawless, There Will Be Blood) on board from the beginning.
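That footage figure is easy to sanity-check: 35mm film runs 16 frames per foot, so at 24 fps a camera consumes 90 feet per minute.

```python
# Rough check on the footage-to-hours conversion quoted above.
feet = 1_800_000
feet_per_minute = 24 / 16 * 60      # 24 fps at 16 frames/foot = 90 ft/min
hours = feet / feet_per_minute / 60
print(f"{hours:.0f} hours")         # roughly 333 hours, in line with "about 320"
```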

According to Tichenor, “I was originally going to be on location for a short time with Kathryn and Mark and then return to the States to cut. We were getting about seven hours of footage a day and I like to watch everything. When they asked me to stay on for the entire India shoot, we set up a cutting room in Chandigarh, added assistants and Avids to stay up to camera while I was there. Then I rejoined my team in the States when the production moved to Jordan. A parallel cutting room had been set up in Los Angeles, where the same footage was loaded. There, the assistants could also help pull selects from my notes, to make going through the footage and preparing to cut more manageable.”

William Goldenberg (Argo, Transformers: Dark of the Moon) joined the team as the second editor in June, after wrapping up Argo. Goldenberg explained, “This film had a short post schedule and there was a lot of footage, so they asked me to help out. I started right after they filmed the Osama bin Laden raid scene, which was one of the last locations to be shot and the first part of the film that I edited. The assembled film without the raid was about three hours long. There were forty hours of material just for the raid and it took about three weeks to a month to cut. After I finished that, Dylan and I divided up the workload to refine and hone scenes, with each making adjustments on the other’s cuts. It’s very helpful to have a second pair of eyes in this situation, bouncing ideas back and forth.”

As this was an Alexa-based production, the team in India, Jordan and London included a three-person digital lab. Tichenor explained, “This film was recorded using ARRIRAW. With digital features in the past, my editorial team has been tasked to handle the digital dailies workload, too. This means the editors are also responsible for dealing with the color space workflow issues and that would have been too much to deal with on this film. So, the production set up a three-person team with a Codex Digilab and Colorfront software in another hotel room to process the ARRIRAW files. These were turned into color-corrected Avid DNxHD media for us and a duplicate set of files for the assistants in LA.” Director of photography Greig Fraser (Snow White and the Huntsman, Killing Them Softly) was able to check in on the digilab team and tweak the one-light color correction, as well as get Tichenor’s input for additional shots and coverage he might need to help tell the story.

Tichenor continued, “Kathryn likes to set up scenes and then capture the action with numerous cameras – almost like it’s a documentary. Then she’ll repeat that process several times for each scene. Four to seven cameras keep rolling all day, so there’s a lot of footage. Plus the camera operators are very good about picking up extra shots and b-roll, even though they aren’t an official second unit team. There are a lot of ways to tell the story and Kathryn gave us – the editors – a lot of freedom to build these scenes. The objective is to have a feeling of ‘you are there’ and I think that comes across in this film. Kathryn picks people she trusts and then lets them do their job. That’s great for an editor, but you really feel the responsibility, because it’s your decisions that will end up on the screen.”

Music for the film was also handled in an unusual manner. According to Goldenberg, “On most films, a composer is contracted; you turn the locked picture over to him and he scores to that cut. Zero Dark Thirty didn’t start with a decision on a composer. Like most films, Dylan and I tried different pieces of temp music under some of the scenes that needed music. Of all the music we tried, the work of Alexandre Desplat (Argo, Moonrise Kingdom) fit the best. Kathryn and Mark showed Alexandre a cut to see if he might be interested. He loved it and found time in his schedule to score the film. Right away he wrote seven pieces that he felt were right. We cut those in to fit the scene lengths, which he then used as a template for his final score. It was a very collaborative process.”

Company 3 handled the digital intermediate mastering. Goldenberg explained, “The nighttime raid scene has a unique look. It was very dark, as shot. In fact, we had to turn off all the lights in the cutting room to even see an image on the Avid monitors. Company 3 got involved early on by color timing about ten minutes of that footage, because we were eager and excited to see what the sequence could look like when it was color timed. When it came to the final DI, the film really took on another layer of richness. We’d been looking at the one-light images so long that it actually took a few screenings to enjoy the image that we’d been missing until then.”

Both Tichenor and Goldenberg have been cutting on Avid Media Composers for years, but this film didn’t tax the capabilities of the system. Tichenor said, “This isn’t an effects-heavy film. Some parts of the stealth helicopters are CG, but in the Avid, we mainly used effects for some monitor inserts, stabilization and split screens.” Goldenberg added, “One thing we both do is build our audio tracks as LCR [left, center, right channel] instead of the usual stereo. It takes a bit more work to build a dedicated center channel, but screenings sound much better.”

Avid has very good multicamera routines, so I questioned whether these were of value with the number of cameras being used. Tichenor replied, “We grouped clips, of course, but not actual multicam. You can switch cameras easily with a grouped clip. I actually did try for one second on a scene to see if I could use the multicam split screen camera display for watching dailies, but no, there was too much going on.” Goldenberg added, “There are some scenes that – although they were using multiple cameras – the operators would be shooting completely different things. For instance, actors in a car with one camera and other cameras grabbing local flavor and street life. So multicam or group clips were less useful in those cases.”

The film’s post schedule took about four months from the first full assembly until the final mix. Goldenberg said, “I don’t think you can say the cut was ever completely locked until the final mix, since we made minor adjustments even up to the end; but, there was a point at one of the internal screenings where we all knew the structure was in place. That was a big milestone, because from there, it was just a matter of tightening and honing. The story felt right.” Tichenor explained, “This movie actually came together surprisingly well in the time frame we had. Given the amount of footage, it’s the sort of film that could easily have been in post for two years. Fortunately with this script and team, it all came together. The scenes balanced out nicely and it has a good structure.”

For additional stories:

DV’s coverage of Zero Dark Thirty’s cinematography

An interview with William Goldenberg about Argo

FXGuide talks about the visual effects created for the film.

New York Times articles (here and here) about Zero Dark Thirty

Avid interview with William Goldenberg.

DP/30 interview with sound and picture editors on ZDT.

Originally written for DV magazine / Creative Planet Network

©2012, 2013 Oliver Peters

Offline to online with 4K


The 4K buzz seems to be steam-rolling the industry just like stereo3D before it. It’s too early to tell whether it will be an immediate issue for editors or not, since 4K delivery requirements are few and far between. Nevertheless, camera and TV-set manufacturers are building important parts of the pipeline. RED Digital Cinema is leading the way with a post workflow that’s both proven and relatively accessible on any budget. A number of NLEs support editing and effects in 4K, including Avid DS, Autodesk Smoke, Adobe Premiere Pro, Apple Final Cut Pro X, Grass Valley EDIUS and Sony Vegas Pro.

Although many of these support native cutting with RED 4K media, I’m still a strong believer in the traditional offline-to-online editing workflow. In this post I will briefly outline how to use Avid Media Composer and Apple FCP X for a cost-effective 4K post pipeline. One can certainly start and finish a RED-originated project in FCP X – or Premiere Pro, for that matter – but Media Composer is still the preferred creative tool for many editing pros. Likewise, FCP X is a viable finishing tool. I realize that statement will raise a few eyebrows, but hear me out. Video passing through Final Cut is pristine, it supports the various flavors of 2K and 4K formats and there’s a huge and developing ecosystem of highly-inventive effects and transitions. This combination is a great opportunity to think outside of the box.

Offline editing with Avid Media Composer

Avid has supported native RED files for several versions, but Media Composer is not resolution independent. This means RED’s 4K (or 5K) images are downsampled to 1080p and reformatted (cropped or letterboxed) to fit into the 16:9 frame. When you shoot with a RED camera, you should ideally record in one of their 4K 16:9 sizes. The native .r3d files can be brought into Media Composer using the “Link to AMA File(s)” function. Although you can edit directly with AMA-linked files, the preferred method is to use this as a “first step”. That means you should use AMA to cull your footage down to the selected takes and then transcode those selects when you start to fine-tune your cut.

Avid’s media creation settings are the place to adjust the RED debayer parameters. Media Composer supports the RED Rocket card for accelerated rendering, but without it, Media Composer can still provide reasonable speed in software-only transcoding. Set the debayer quality to 1/4 or 1/8, and transcoding 4K clips to Avid DNxHD36 for offline editing will be closer to real-time on a fast machine, like an 8-core Mac Pro. This resolution is adequate for making your creative decisions.

When the cut is locked, export an AAF file for the edited sequence. Media should be linked (not embedded) and the AAF Edit Protocol setting should be enabled. In this workflow, I will assume that audio post is being handled by an audio editor/mixer running a DAW, such as Pro Tools, so I’ll skip any discussion of audio. That would be exported using standard AAF or OMF workflows for audio post. Note that all effects should be removed from your sequence before generating the AAF file, since they won’t be translated in the next steps. This includes any nested clips, collapsed tracks and speed ramps, which are notorious culprits in any timeline translation.

Color grading with DaVinci Resolve

Blackmagic Design’s DaVinci Resolve 9 is our next step. You’ll need the full, paid version (software-only) for bigger-than-HD output. After launching Resolve, import the Avid AAF file from Resolve’s conform tab. Make sure you check “link to camera files” so that Resolve connects to the original .r3d media and not the Avid DNxHD transcodes. Resolve will import the sequence, connect to the media and generate a new timeline that matches the sequence exported from Media Composer. Make sure the project is set for the desired 4K format.

Next, open the Resolve project settings and adjust the camera raw values to the proper RED settings. Then make sure the individual clips are set to “project” in their camera settings tab. You can either use the original camera metadata or adjust all clips to a new value in the project settings pane. Once this is done, you are ready to grade the timeline as with any other production. Resolve uses a very good scaling algorithm, so if the RED files were framed with the intent of resizing and repositioning (for example, 5K files that are to be cropped for the ideal framing within a 4K timeline), then it’s best to make that adjustment within the Resolve timeline.

Once you’ve completed the grade, set up the render. Choose the FCP XML easy set-up and alter the output frame size to the 4K format you are using. Start the render job. Resolve 9 renders quite quickly, so even without a RED Rocket card, I found that 4K ProRes HQ or 4444 rendering, using full-resolution debayering, was completed in about a 6:1 ratio to running time on my Mac Pro. When the renders are done, export the FCP XML (for FCP X) from the conform tab. I found I had to use an older version of this new XML format, even though I was running FCP X 10.0.7. It was unable to read the newest version that Resolve had exported.

Online with Apple Final Cut Pro X

The last step is finishing. Import the Resolve-generated XML file, which will in turn create the necessary FCP Event (media linked to the 4K ProRes files rendered from Resolve) and a timeline for the edited sequence. Make sure the sequence (Project) settings match your desired 4K format. Import and sync the stereo or surround audio mix (generated by the audio editor/mixer) and rebuild any effects, titles, transitions and fast/slo-mo speed effects. Once everything is completed, use FCP X’s share menu to export your deliverables.

©2013 Oliver Peters

Editing in 2013


Undoubtedly this year will continue the trend of fractured market share for edit systems. If you tally up every system in general use, your professional choices include NLE systems from Adobe, Apple, Avid, Autodesk, Boris/Media 100, Dayang, Editshare/Lightworks, Grass Valley, SGO, Sony and Quantel. In most US markets, market dominance boils down to an Adobe/Apple/Avid split. In many cases, the leader is still the now-defunct Final Cut Pro 7. Even Apple is stuck competing with itself.

By mid-2013, Final Cut Pro X will have hit its two-year anniversary. The screams of “iMovie Pro” have generally died down. Even the most diehard critics grudgingly admit that it offers many professional features. Although I don’t see it taking off in great numbers within the pro editor community during 2013, I do believe that there’s a “silent minority” of users who are testing it for their own use or as an island within a larger facility. I say “silent” because many of these folks simply are not the sort that post to forums – or haven’t yet, for fear of getting sucked into the typical pro-con arguments that invariably ensue.

There have been four typical responses to X from FCP “legacy” users: 1) adopt FCP X; 2) stick with FCP 5/6/7; 3) move/return to Media Composer; or 4) move to Premiere Pro. Maybe a few jumped platforms, too, as well as pursued PC options, like EDIUS, Vegas Pro or Avid DS. In my market (central Florida), folks have been sticking with FCP 7 in the interim. Many will start moving to Premiere Pro. That seems to be the most common trend that I see. A few are moving to Media Composer and a handful to FCP X. As far as I know, I’m the only pro editor in town who has used FCP X on real gigs. I’ve encouraged a few others to at least test the waters. In major markets, like New York or Los Angeles, I think Avid will be the biggest beneficiary of this shift.

A new wild card is Autodesk Smoke 2013. At $3500 for the software-only Mac version, I suspect it will still be too rich for the blood of most editors. FCP X’s $300 price tag (for multiple machines!) is unfortunately viewed as the “new normal”. However, if your editorial focus is advanced finishing, then Smoke may be the system for you. I think it will find its way into shops with multiple edit stations. These owners are likely to add a seat of Smoke to augment the rest of their services.

All of this points to the fact that most editors are reluctant to change. FCP 1-7 was successful because it adopted an editing paradigm that was not that far removed from that of its competitors. FCP X is a different story. It requires work to unlearn and relearn what you know about how an editing application is supposed to work. That’s scary for editors who began their pro careers within the last decade and only know FCP “legacy”. I’ve cut (on paying gigs) with well over a dozen different linear and nonlinear edit systems. If you add review systems and ones where I supervised, but wasn’t “in the seat” myself, that count is closer to two dozen. I’ve gone through at least three major editing paradigm shifts. If this disruption scares you, because it’s the first one you’ve encountered, then hold onto your hat. It’s going to get worse from here!

Here are some “crystal ball” thoughts for the coming year.

Many users will continue to try to stick with older versions of Final Cut Pro. As Mac OS continues to evolve and as more complex media formats arrive, it will become increasingly difficult to use this old 32-bit application and be efficient. I still find FCP 7 quite versatile, but I’ve just had it with out-of-memory errors and other performance issues that are now quite commonplace.

As folks migrate to an “FCP replacement”, that will most likely be Adobe Premiere Pro CS6. Expect the next version to be out later this year. Adobe has done a good job of listening to customers and I think you’ll see even more substantive improvements to Premiere Pro in this next version. You’ll also see the launch of Adobe Anywhere, which is a platform for collaborative editing. Adobe hasn’t announced specifics as to what will be required on the server side, but Anywhere will be an interesting option for enterprise users.

I don’t see major changes for Avid this year. People like to speculate that they are the next “victim” of Blackmagic Design’s annual buying spree, but I don’t see this as a reality yet (if ever). Although still running a negative balance sheet, Avid has cash in the bank and solid sales. It’s a company dedicated to the needs of pro users, so there is no stream of consumer-products cash flow to deepen its pockets. On the plus side, the products are solid and work in ways that pro users expect and are comfortable with.

Avid Media Composer is the most complex editing program there is (in terms of code), so it’s very impressive that Avid was able to move it to 64-bit with as few problems as there have been. This also means it’s hard to completely change the application. Users expect functional continuity and that cannot be sacrificed. In spite of that, new features like Smart Tool and AMA have kept Media Composer, Symphony and NewsCutter relevant for modern file-based workflows. 2013 will likely still be slow and steady for Avid, but hopefully items like resolution-independence are on the radar.

Autodesk is going to make a big push with Smoke 2013. Their biggest target with this product is the user who has heavy involvement with multiple applications to finish his/her work (like Premiere Pro + Photoshop + After Effects w/plug-ins). Smoke 2013 is designed to do all of these functions in a single application. It is also targeted at other competing finishing systems, like Avid DS. Customers now have two similar products – one on each main editing platform – and at similar (sub $10K) price points. I do think some users will try Smoke in the belief that it’s the hypothetical “FCP 8”. Those users will be disappointed. On the other hand, if you buy it for the purpose intended, then it’s the right tool for the job – conforming and advanced finishing.

This brings us to Apple Final Cut Pro X. I see pockets of use in 2013. Lots of individual users – the “one-man band” director/videographer/editor operations. Also some broadcasters (news and promos), corporate producers and event videographers. You will see some shows adopt it for post, but I think those will be in the minority. It’s important to realize that FCP X’s architecture is ideal for the direction some broadcasters want to take their infrastructure. If you want to post in 1080p/59.94 or 2K or 4K, then Final Cut Pro X is ideally suited for this challenge – more so than just about any other application.

Although Apple is less focused on the publicity gained from high-profile users, like film editors, they would certainly love to have another Cold Mountain moment. Walter Murch’s use of Final Cut on that and subsequent films gave the software some valuable street cred. Having a receptive editor and production company (like the Coen brothers or David Fincher) on the right film – at the point that the software is right AND the production is at an early enough stage – is a matter of timing. 2013 might be the year we see that. If that’s the case, it won’t affect sales volume for FCP X much, but it will change many pro users’ attitudes towards the software. Of course, don’t be surprised if Adobe gets there first!

2013 will be another fun year. More splintering of the applications in use. If you are a freelancer, then you need to know as many of them as possible. Just as 3D animators aren’t really wedded to a single animation application – relying instead on a toolkit of several – so, too, will it be for editors.

Read Scott Simmons’ blog for another take on 2013 prognostications. 

©2013 Oliver Peters