NAB 2013 Distilled

Another year – another NAB exhibition. A lot of fun stuff to see. Plenty of innovation and advances, but no single “shocker” like last year’s introduction of the Blackmagic Cinema Camera. Here are some observations based on this past week in Las Vegas.

4K

Yes, 4K was all over. I was a bit surprised that many of the pieces for a complete end-to-end solution are in place. The term 4K refers to the horizontal pixel width of the image, but two common specs are used – the DCI (film) standard of 4096 and the UltraHD (aka QuadHD) standard of 3840. Both are “4K”. Forgotten in the discussion is frame rate. Many displays were showing higher frame rates, such as 4K at 60fps. 120fps is also being discussed.

4K (and higher) cameras were there from Canon, Sony, RED, JVC, GoPro and now Blackmagic Design. Stereo3D was there, too, in pockets, but it’s all but dead (again). 4K, though, will have legs. The TV sets and distribution methods are coming into position and this is a nonintrusive experience for the viewer. SD to HD was an obvious “in your face” difference. 4K is noticeably better, but the jump isn’t as big as SD to HD – more like 720p versus 1080p. This means that consumer prices will have to continue to drop (as they will) for 4K to really catch hold, except for special venue applications. Right now, it’s pretty obvious how gorgeous 4K is when standing a few feet away from an 84” screen, but few folks can afford that yet.

Interestingly enough, you can even do live 4K broadcasts using 4K cameras and production products from Astro Designs. This will have value in live venues like sporting events and large corporate meetings. A new factor – “region of interest” – comes into play. This means you can shoot 4K and then scale/crop the portion of the image that interests you. Naturally, there was also 8K from NHK and Quantel, both of which have been at the forefront of HD and then 4K. Quantel was demonstrating 8K (downsampled to a 4K monitor) just to show their systems have the headroom for the future.

ARRI did not have a 4K camera, but the 4:3 sensor of the ALEXA XT model features 2880 x 2160 photosites. When you use a 2:1 anamorphic lens and record ARRIRAW, you effectively end up with an unsqueezed image of 5760 x 2160 pixels. Downsample that to a widescreen 2.4:1 image inside a 4096 DCI frame and you have results visually similar to a Sony or RED camera delivering in 4K. This was demonstrated in the booth and the results were quite pleasing. The ALEXA looked a bit softer than comparable displays at the Sony and RED booths, but most cinematographers would probably opt for the ARRI image, since it appears a lot closer to the look of scanned film at 4K. Part of this is inherent in ARRI’s sensor array, which includes optical filtering in-camera. Sony was showing clips from the upcoming Oblivion feature film, which was shot with an F65. To many attendees these clips looked almost too crisp.
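For readers who like to see the math, here is a minimal back-of-the-envelope sketch (Python) of the desqueeze and downsample described above. The values come from the paragraph; the final framing for 2.4:1 delivery depends on the DI pipeline and is only approximated here.

```python
# Back-of-envelope math for the ALEXA XT anamorphic path described above.
# Numbers are from the paragraph; exact DI framing will vary.

SENSOR_W, SENSOR_H = 2880, 2160     # 4:3 ARRIRAW photosite area
SQUEEZE = 2.0                        # 2:1 anamorphic lens

desqueezed_w = int(SENSOR_W * SQUEEZE)            # 5760
print(f"unsqueezed image: {desqueezed_w} x {SENSOR_H}")

# Fit the unsqueezed image into a 4096-wide DCI container.
DCI_W = 4096
scale = DCI_W / desqueezed_w
scaled_h = round(SENSOR_H * scale)                # ~1536
print(f"downsampled: {DCI_W} x {scaled_h} "
      f"(~{DCI_W / scaled_h:.2f}:1, then framed for 2.4:1 delivery)")
```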

In practical terms, most commercial, corporate, television or indie film users of 4K cameras want an easy workflow. If that’s your goal, then the best “true” 4K paths are to shoot with the Canon C500 or the Sony F55. The C500 can be paired with the (now shipping) AJA KiPro Quad to record 4K ProRes files. The Sony records in the XAVC codec (a variant of AVC-Intra). Both are ready to edit (importer plug-ins may be required) without conversions.

You can also record ARRI 2K ProRes in an ALEXA or use one of the various raw workflows (RED, Canon, Blackmagic, Sony, ARRI). Raw is nice, but adds extra steps to the process – often with little benefit over log-profile recording to an encoded file format.

Edit systems

With the shake-up that Apple’s introduction of Final Cut Pro X has brought to the market, brand dominance has been up for grabs. Apple wasn’t officially at the show, but did have some off-site presence, as well as a few staffers at demo pods. For example, they were showing the XAVC integration in an area of the Sony booth. FCP X was well-represented as part of other displays all over the floor. An interesting metric I noticed was that all of the press covering the show on video were cutting their reports on laptops using FCP X. That is a sweet spot for use of the application. No new FCP X news (beyond the features released with 10.0.8) was announced.

Adobe is currently the most aggressive in trying to earn the hearts of editors. The “next” versions of Premiere Pro, SpeedGrade, Audition and After Effects have a ton of features that respond to customer requests and will speed workflows. Adobe’s main stage demos were packed and the general consensus of most editors discussing a move away from FCP 7 (and even Avid) was a move to Adobe. In early press, Adobe mentioned working with the Coen brothers, who have committed to cutting their next film with Premiere.

The big push was for Adobe Anywhere – their answer for cloud-based editing. Although a very interesting product, it will compete in the same space as Quantel Qtube and Avid Interplay Sphere. These are enterprise solutions that require servers, storage, software and support. While the technology is promising, it will tend to be of more interest to larger news operations and educational facilities than to smaller post shops.

Avid came on with Media Composer 7 at a new price, with Symphony as an add-on option to Media Composer. The biggest features were the ability to edit with larger-than-HD video sources (output is still limited to HD), LUT support, improved media management of AMA files and background transcoding using managed folders (watch folders). In addition, Pro Tools goes to 11, with a new video engine – it can natively run Avid sequences from AAF imports – and faster-than-real-time bounce. The MC background transcode and the PT11 bounce will be time savers for Avid users and that translates into money saved.

Avid Interplay Sphere (announced last year) now works on Macs, but its main benefit is remote editing for stations that have invested in Interplay solutions. Avid is also bundling packages of ISIS storage, Interplay asset management and seats of Media Composer at even lower price points. Although still premium solutions, they are finally in a range that may be attractive to some small edit facilities and broadcasters, given that it includes installation and support.

The other NLE players include Avid DS (not shown), Quantel Pablo Rio, Autodesk Smoke 2013, Grass Valley EDIUS, Sony Vegas, Media 100 (not shown) and Lightworks. Most of these have no bearing on my market. Smoke 2013 is getting traction. Autodesk is working to get user feedback to improve the application, as it moves deeper into a market segment that is new to them. EditShare is forging ahead with Lightworks on the Mac. It looked pretty solid at the show, but expect something that’s ready for users towards the end of the year. It’s got the film credits to back it up, so a free (or near free) Mac version should shake things up even further.

One interesting addition to the market is DaVinci Resolve 10 gaining editing features. Right now the editing bells-and-whistles are still rudimentary, though all of the standard functions are there. Plus there are titles, speed changes with optical flow and a plug-in API (OpenFX). You can already apply GenArts Sapphire filters to your clips. These are applied in the color correction timeline as nodes, rather than effects added to an editing timeline. This means the Sapphire filters can be baked into any clip renders. The positioning of Resolve 10 is as an online editing tool. That means conforming, titling and trims/tweaks after grading. You now have even greater editing capabilities at the grading stage without having to return to an NLE. Ultimately the best synergy will be between FCP X and Resolve. Together the two apps make for a very interesting package and Apple seems to be working closely with Blackmagic Design to make this happen. Ironically, the editing page looks a lot like what FCP X might have looked like, had it kept tracks and dual viewers.

Final thoughts

I was reading John Buck’s Timeline on the plane. Even though we think of the linear days as having been dominated by CMX, the reality was that there were many systems, including Mach One, Epic, ISC, Strassner, Convergence, Datatron, Sony, RCA and Ampex. In Hollywood, the TV industry was split among them, which is why a common interchange standard – the EDL – was developed. For a while, Avid was the dominant tool of the nonlinear era, but the truth is that such dominance hasn’t always been the norm – nor should it be. The design dilemma of engineering versus creative was a factor from the beginning of video editing. Should a system be simple enough that producers, directors and non-technical editors can run it? Sound familiar?

When I look at the show, I am struck by how one makes buying choices. To use the dreaded car analogy, FCP X is the sports car and Avid is the truck. But the sports car is a temperamental Ferrari that does some things very well, but isn’t appropriate for others. The truck is a Tundra with all the built-in, office-on-the-road niceties.

If I were a facility manager making a purchase for a large-scale facility, it would probably still be Avid. It’s the safe bet – the “you don’t get fired for buying IBM” bet. Their innovations at the show were conservative, but they meet the practical needs of their current customers. There simply is no other system with a proven track record across all types of productions that scales from one user to massive installations. But offering conservative innovation isn’t a growth strategy. You don’t get new users that way. Media Composer has become truly complex in ways that only veteran users can accept and that has to change fast.

Apple FCP X is the wild card, of course. Apple is playing the long game, looking for the next generation of users. If FCP X weren’t an Apple product, it would receive the same level of attention as Vegas Pro, at best – also a great tool with a passionate user base, but nothing with the potential to dominate market share. The trouble is that Apple gets in its own way due to corporate secrecy. I’ve been using FCP X for a while and it certainly is a professional product. But to use it effectively, you have to change your workflow. In a multi-editor, multi-production facility, this means changing a lot of practices and retraining staff. It also means augmenting the software with a host of other applications to fix the shortcomings.

Broadening the appeal of FCP X beyond the one-man-band operations may be tough for that reason. It’s too non-standard and no one has any idea of where it’s headed. On the other hand, as an editor who’s willing to deal with new challenges, I like the fast, creative cutting performance of FCP X. This makes it a great offline editing tool in my book. I find a “start in X, finish in Resolve” approach quite intriguing.

Right now, Adobe feels like the horse to beat. They have the ear of the users and an outreach reminiscent of when Apple was in the early FCP “legacy” era. Adobe is working hard to build a community and the interoperability between applications is the best in the industry. They are only hampered by the past indifference towards Premiere that many pro users have. But that seems to be changing, with many new converts. Although Premiere Pro “next” feels like FCP 7.5, that appears to be what users really want. The direction, at least, feels right. Apple may have been “skating to where the puck will be”, but it could be that no one is following or the puck simply wasn’t going there in the first place.

For an additional look – click over to my article for CreativePlanetNetwork – DV magazine.

©2013 Oliver Peters

DaVinci Resolve Workflows


Blackmagic Design’s purchase of DaVinci Systems put a world-class color grading solution into the hands of every video professional. With Resolve 9, DaVinci sports a better user interface that makes it easy to run, regardless of whether you are an editor, colorist or DIT working on set. DaVinci Resolve 9 comes in two basic Mac or Windows software versions – the $995 paid version and the free Lite version. The new Blackmagic Cinema Camera software bundle also includes the full (paid) version, plus a copy of Ultrascope. For facilities seeking to add comprehensive color grading services, there’s also a version with Blackmagic’s dedicated control surface, as well as Linux system configurations.

Both paid and free versions of Resolve (currently at version 9.1) work the same way, except that the paid version offers larger-than-HD output, noise reduction and the ability to tap into more than one extra GPU card for hardware acceleration. Resolve runs fine with a single display card (I’ve done testing with the Nvidia GT120, the Nvidia Quadro 4000 and the ATI 5870), but requires a Blackmagic video output card if you want to see the image on a broadcast monitor.

Work in Resolve 9 generally flows left-to-right, through the tabbed pages, which you select at the bottom of the interface screen. These are broken into Media (where you access the media files that you’ll be working with), Conform (importing/exporting EDL, XML and AAF files), Color (where you do color correction), Gallery (the place to store and recall preset looks) and Deliver (rendering and/or output to tape).

Many casual users employ Resolve in these two ways: a) correcting camera files to send on to editorial, and b) color correction roundtrips with NLE software. This tutorial is intended to highlight some of the basic workflow steps associated with these tasks. Resolve is deep and powerful, so spend time with the excellent manual to learn its color correction tools, which would be impossible to cover here.

Creating edit-ready dailies – BMCC (CinemaDNG media)

The Blackmagic Cinema Camera can record images as camera raw, CinemaDNG image sequences. Resolve 9 can be used to turn these into QuickTime or MXF media for editing. Files may be graded for the desired final look at this point, or the operator can choose to apply the BMD Film preset. This log preset generates files with a flat look comparable to ARRI Log-C. You may prefer this if you intend to use a Log-to-Rec709 LUT (look up table) in another grading application or a filter like the Pomfort Log-to-Video effect, which is available for Final Cut Pro 7/X.

Step 1 – Media: Drag clip folders into the Media Pool section.

Step 2 – Conform: Skip this tab, since the clips are already on a single timeline.

Step 3 – Color: Make sure the camera setting (camera icon) for the clips on the timeline is set to Project. Open the project settings (gear icon). Change and apply these values: 1) Camera raw – CinemaDNG; 2) White Balance – as shot; 3) Color Space and Gamma – BMD Film.

Step 4 – Deliver: Set it to render each clip individually, assign the target destination and frame rate and the naming options. Then choose Add Job and Start Render.

The free version of Resolve will downscale the BMCC’s 2.5K-wide images to 1920×1080. The paid version of Resolve will permit output at the larger, native size. Rendered ProRes files may now be directly imported into FCP 7, FCP X or Premiere Pro. Correct the images to a proper video appearance by using the available color correction tools or filters within the NLE that you are using.

Creating edit-ready dailies – ARRI Alexa / BMCC (ProRes, DNxHD media)

Both the ARRI Alexa and the Blackmagic Cinema Camera can record Apple ProRes and Avid DNxHD media files to onboard storage. Each offers a similar log gamma profile that may be applied during recording in order to preserve dynamic range – Log-C for the Alexa and BMD Film for the Blackmagic. These profiles facilitate high-quality grading later. Resolve may be used to properly grade these images to the final look as dailies are generated, or it may simply be used to apply a viewing LUT for a more pleasing appearance during the edit.

Step 1 – Media: Drag clip folders into the Media Pool section.

Step 2 – Conform: Skip this tab, since the clips are already on a single timeline.

Step 3 – Color: Make sure the camera setting for the clips on the timeline is set to Project. Open the project settings and set this value: 3D Input LUT – ARRI Alexa Log-C or BMD Film to Rec 709.

Step 4 – Deliver: Set it to render each clip individually, assign the target destination and frame rate and the naming options. Check whether or not to render with audio. Then choose Add Job and Start Render.

The result will be new, color corrected media files, ready for editing. To render Avid-compatible MXF media for Avid Media Composer, select the Avid AAF Roundtrip from the Easy Setup presets. After rendering, return to the Conform page to export an AAF file.

Roundtrips – using Resolve together with editing applications

DaVinci Resolve supports roundtrips from and back to NLEs based on EDL, XML and AAF lists. You can use Resolve for roundtrips with Apple Final Cut Pro 7/X, Adobe Premiere Pro and Avid Media Composer/Symphony. You may also use it to go between systems. For example, you could edit in FCP X, color correct in Resolve and then finish in Premiere Pro or Autodesk Smoke 2013. Media should have valid timecode and reel IDs to enable the process to work properly.

In addition to accessing the camera files and generating new media with baked-in corrections, these roundtrips require an interchange of edit lists. Resolve imports an XML and/or AAF file to link to the original camera media and places those clips on a timeline that matches the edited sequence. When the corrected (and trimmed) media is rendered, Resolve must generate new XML and/or AAF files, which the NLE uses to link to these new media files. AAF files are used with Avid systems and MXF media, while standard XML files and QuickTime media are used with Final Cut Pro 7 and Premiere Pro. FCP X uses a new XML format that is incompatible with FCP 7 or Premiere Pro without translation by Resolve or another utility.

Step 1 – Avid/Premiere Pro/Final Cut Pro: Export a list file that is linked to the camera media (AAF, XML or FCPXML).

Step 2 – Conform (skip Media tab): Import the XML or AAF file. Make sure you have set the options to automatically add these clips to the Media Pool.

Step 3 – Color: Grade your shots as desired.

Step 4 – Deliver: Easy Setup preset – select Final Cut Pro XML or Avid AAF roundtrip. Verify QuickTime or MXF rendering, depending on the target application. Change handle lengths if desired. Check whether or not to render with audio. Then choose Add Job and Start Render.

Step 5 – Conform: Export a new XML (FCP7, Premiere Pro), FCPXML (FCP X) or AAF (Avid) list.

The roundtrip back

The reason you want to go back into your NLE is for the final finishing process, such as adding titles and effects or mixing sound. If you rendered QuickTime media and generated one of the XML formats, you’ll be able to import these new lists into FCP7/X or Premiere Pro and those applications will reconnect to the files in their current location. FCP X offers the option to import/copy the media into its own managed Events folders.

If you export MXF media and a corresponding AAF list with the intent of returning to Avid Media Composer/Symphony, then follow these additional steps.

Step 1 – Copy or move the folder of rendered MXF media files into an Avid MediaFiles/MXF subfolder. Rename this copied folder of rendered Resolve files with a number (see the sketch after these steps).

Step 2 – Launch Media Composer or Symphony and return to your project or create a new project.

Step 3 – Open a new, blank bin and import the AAF file that was exported from Resolve. This list will populate the bin with master clips and a sequence, which will be linked to the new MXF media rendered in Resolve and copied into the Avid MediaFiles/MXF subfolder.
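As a convenience, here is a minimal scripted sketch of Step 1. The paths and numbering scheme are assumptions for illustration; adjust them for your own drive and project.

```python
# A sketch of Step 1 above: copy the Resolve-rendered MXF folder into
# Avid MediaFiles/MXF and give the copy a numeric folder name.
# All paths are placeholders.
import shutil
from pathlib import Path

resolve_renders = Path("/Volumes/Media/ResolveRenders/MyProject")   # assumed render destination
avid_mxf_root = Path("/Volumes/Media/Avid MediaFiles/MXF")          # Avid's media location

# Pick the next unused numbered subfolder (Avid expects numeric folder names).
existing = [int(p.name) for p in avid_mxf_root.iterdir() if p.is_dir() and p.name.isdigit()]
target = avid_mxf_root / str(max(existing, default=0) + 1)

shutil.copytree(resolve_renders, target)
print(f"Copied {resolve_renders} -> {target}")
```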

Originally written for DV magazine / Creative Planet Network

©2013 Oliver Peters

Blackmagic Design HyperDeck Shuttle

The video industry has been moving towards complete file-based workflows, but that doesn’t replace all of the functions that traditional videotape recorders served. To bridge the gap, companies such as AJA, Blackmagic Design, Convergent Design, Sound Devices and others have developed solid state recorders for field and studio operation. I recently tested Blackmagic Design’s HyperDeck Shuttle 2, which is touted as the world’s smallest uncompressed recorder.

Blackmagic’s HyperDeck series includes the Shuttle and two Studio versions. The latter are rack-mounted VTR-replacement devices equipped with dual SSDs (solid state drives). The Shuttle is a palm-sized, battery-powered “brick” recorder. A single SSD slides into the Shuttle enclosure, which is only a bit bigger than the drive itself – enough to accommodate battery, controls and internal electronics. To keep the unit small, controls are limited to basic record and transport buttons, much like those of a consumer CD player. You can operate it from an external power supply, on-board camera power or its internal battery. The internal, non-removable, rechargeable battery holds its charge for a little over one hour of continuous operation. The purchased unit includes a 12-volt power supply and a kit of international AC plug adapters.

The HyperDeck Shuttle includes 3Gb/s SDI and HDMI for digital capture and playback. Recording formats include 10-bit uncompressed QuickTime movies, as well as Avid DNxHD 175x or 220x in either QuickTime or MXF-wrapped variations. At the time I tested this device, it would not record Apple ProRes codecs. In November, Blackmagic Design released a free software update (version 3.6), which added ProRes HQ to the uncompressed and DNxHD options. It also added closed captioning support to all HyperDeck models.

Since the unit is designed for minimal interaction, all system set-up is handled by an external software utility. Install this application on your computer, connect the HyperDeck Shuttle via USB and then you’ll be able to select recording formats and other preferences, such as whether or not to trigger recording via SDI (for on-camera operation). The unit has no menu, which means you cannot alter, rename or delete files using the button controls or the software utility. There is a display button, but that was not active in the software version that I tested.

Solid state recording

The SSD used is a standard 2.5” SATA III drive. Several different brands and types have been tested and qualified by Blackmagic Design for use with the HyperDeck units. These drives can be plugged into a generic hard drive dock, like a Thermaltake BlacX Duet, to format the drive and copy/erase any files. The SSD was Mac-formatted, so it was simply a matter of pulling the drive out of the Shuttle’s slot and plugging it into the Duet, which was connected to my Mac Pro tower. This allowed me to copy files from the drive to my computer, as well as to move files back to the SSD for later playback from the Shuttle. (At IBC, Blackmagic also announced ExFAT support with the HyperDeck products.) The naming convention is simple, so recorded files are labeled Capture001, Capture002 and so on. Unfortunately, it does not embed reel numbers into the QuickTime files. Placing a similarly named file in the correct format (more on that in a moment) onto the drive makes it possible to use the Shuttle as a portable master playback device for presentations, film festivals, etc.
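Since the Shuttle won’t embed reel numbers for you, one workaround is to carry a reel label in the filename when offloading clips. Here is a minimal sketch of that idea; the mount point, destination and reel name are assumptions, and it only renames copies – it doesn’t add any metadata to the files themselves.

```python
# A sketch: offload HyperDeck clips and prefix each copy with a reel label,
# since the recorder itself only writes Capture001, Capture002, ...
# Paths and the reel name are placeholders.
import shutil
from pathlib import Path

ssd = Path("/Volumes/HYPERDECK")        # SSD mounted in a drive dock (assumed)
dest = Path("/Volumes/Media/Dailies")   # offload destination (assumed)
reel = "A003"                           # reel label you assign manually

dest.mkdir(parents=True, exist_ok=True)
for clip in sorted(ssd.glob("Capture*.*")):     # .mov or .mxf, depending on setup
    new_name = f"{reel}_{clip.name}"            # e.g. A003_Capture001.mov
    shutil.copy2(clip, dest / new_name)
    print(f"{clip.name} -> {new_name}")
```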

My evaluation unit came equipped with a 240GB OCZ Vertex 3 SSD. This is an off-the-shelf drive that runs under $200 at most outlets. By comparison, a Sony 124-minute HDCAM-SR videotape is now more expensive. It’s amazing that this SSD will sustain extended 10-bit uncompressed 1080i/59.94 recording and playback, when even most small drive arrays can’t do that! In practical terms, a 240GB drive will not hold a lot of 1080i 10-bit uncompressed media, so it’s more likely that you would use Avid DNxHD 220x or Apple ProRes HQ for the best quality. You could easily fit over 90 minutes of content on the same SSD using one of these codecs and not really see any difference in image quality.
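The rough math backs this up. A sketch, assuming roughly 20 bits per pixel for uncompressed 10-bit 4:2:2 and ignoring container overhead:

```python
# Rough capacity math for a 240 GB SSD at 1080i/59.94 (approximations only).
GB = 1000 ** 3
drive = 240 * GB

# Uncompressed 10-bit 4:2:2: roughly 20 bits (2.5 bytes) per pixel.
frame_bytes = 1920 * 1080 * 2.5
uncompressed_rate = frame_bytes * (30000 / 1001)          # 29.97 frames per second
print(f"uncompressed: ~{uncompressed_rate / 1e6:.0f} MB/s, "
      f"~{drive / uncompressed_rate / 60:.0f} minutes on 240 GB")

# Avid DNxHD 220x is about 220 Mbit/s; ProRes HQ at 1080i29.97 is in the same range.
dnxhd_rate = 220e6 / 8
print(f"DNxHD 220x: ~{dnxhd_rate / 1e6:.1f} MB/s, "
      f"~{drive / dnxhd_rate / 60:.0f} minutes on 240 GB")
```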

In actual use

I tested the unit with various codecs and frame rates. As a general rule, it’s not a good idea to mix different flavors on the same drive. For example, if you record both 1080i 10-bit uncompressed and 1080p/23.98 Avid DNxHD clips on the same drive, the HyperDeck Shuttle will only be able to play back the clips that match its current set-up. The Shuttle does auto-detect the incoming frame rate without the need to set that using the utility. It did seem to get “confused” in this process, making it hard to access the clips that I thought it should have been able to play. The clips are on the SSD, though, so you can still pull them off of the drive for use in editing. For standard operation, I would suggest that you set your preferences for the current production and stick to that until you are done.

Blackmagic Design sells a mounting plate as an accessory. It’s easy to install by unscrewing the HyperDeck’s back panel and screwing in the mounting plate in its place. I loaned the unit to a director of photography that I work with for use with a Canon C300. Although there are common mounting holes, the DP still ended up having to use Velcro as a means to install both his battery and the Shuttle onto the same camera rig. The recordings looked good, but the SDI trigger did not properly stop the recording, requiring the DP to manually stop the unit with each take. Another issue for some is that it uses Mini-BNC connectors. This requires an investment in some Mini-BNC adapter cables for SDI operation, if you intend to connect it to standard BNC spigots.

Overall, the unit performed well in a variety of applications, but with a few quirks. I frequently found that it didn’t respond to my pushing the transport control buttons. I’m not sure if this was due to bad button contacts or a software glitch. It felt more like a software issue, in that once it “settled down,” stepping forward and backward through clips and pushing the play button worked correctly. The only format I was not able to play back was 24p media recorded as MXF. Nevertheless the MXF formatting was correct, as I could drop these files right into an Avid MediaFiles folder on my media hard drive for editing with Avid Symphony.

HyperDeck Shuttle as a portable player

If you intend to use a HyperDeck Shuttle as a master playback device, then there are a few things you need to know. It can capture interlaced, progressive and progressive-segmented-frame (PsF) footage, but it will only play these out as either interlaced or progressive via the SDI connection. Playing PsF as progressive is fine for many monitors and projectors, but the signal doesn’t pass through many routers or to some other recorders. Often these broadcast devices only function with a “true” progressive signal if the format is 720p/59.94. This means that it would be unlikely that you could play a 1080p/23.98 file (captured as PsF) and record that output from the HyperDeck Shuttle to a Sony HDCAM-SR video recorder, as an example.

It is possible to export a file from your Avid NLE, copy that file to the HyperDeck’s SSD using a drive dock and play it back from the unit; however, the specs get a little touchy. The HyperDeck Shuttle records audio as 16-bit/48kHz in the Little Endian format, but Avid exports its files as Big Endian. Endianness refers to how the bytes are ordered in a 16-, 32- or 64-bit word – whether the most or least significant byte comes first. In the case of the Shuttle, this difference meant that I couldn’t get any audio output during playback. If your goal is to transfer a file to the Shuttle for duplication to another deck or playback in a presentation environment, then I would recommend that you take the time to make a real-time recording. Simply connect your NLE’s SDI output to the HyperDeck Shuttle’s SDI input and manually record to it, on-the-fly, like a tape deck.
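To make the byte-order point concrete, here is a tiny illustration of the same 16-bit audio sample stored in little-endian and big-endian order. This is a sketch only, not the Shuttle’s actual file handling.

```python
# One 16-bit PCM sample, packed both ways.
import struct

sample = 12345                        # 0x3039
little = struct.pack("<h", sample)    # little-endian, as the HyperDeck records
big = struct.pack(">h", sample)       # big-endian, as in the exported file

print(little.hex(), big.hex())        # 3930 vs 3039 -- same value, bytes swapped

# Reading big-endian bytes as if they were little-endian yields a different
# number entirely, which is why wrong-endian audio is unusable, not just noisy.
print(struct.unpack("<h", big)[0])    # 14640 instead of 12345
```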

The HyperDeck Shuttle is a great little unit for filling in workflow gaps. For example, if you don’t own any tape decks, but need to take a master to a duplication facility. You could easily use the Shuttle to transport your media to them and use it for on-site master playback. It’s a bit too quirky to be a great on-camera field recorder, but at $345 (plus the SSD), the Shuttle is an amazing value for image quality that good. As with their other products, Blackmagic Design has a history of enhancing the capabilities through subsequent software updates. I expect that in the future, we’ll see the HyperDeck family grow in a similar fashion.

Originally written for DV magazine / Creative Planet Network

© 2013 Oliver Peters

Offline to online with 4K


The 4K buzz seems to be steam-rolling the industry just like stereo3D before it. It’s too early to tell whether it will be an immediate issue for editors or not, since 4K delivery requirements are few and far between. Nevertheless, camera and TV-set manufacturers are building important parts of the pipeline. RED Digital Cinema is leading the way with a post workflow that’s both proven and relatively accessible on any budget. A number of NLEs support editing and effects in 4K, including Avid DS, Autodesk Smoke, Adobe Premiere Pro, Apple Final Cut Pro X, Grass Valley EDIUS and Sony Vegas Pro.

Although many of these support native cutting with RED 4K media, I’m still a strong believer in the traditional offline-to-online editing workflow. In this post I will briefly outline how to use Avid Media Composer and Apple FCP X for a cost-effective 4K post pipeline. One can certainly start and finish a RED-originated project in FCP X – or Premiere Pro, for that matter – but Media Composer is still the preferred creative tool for many editing pros. Likewise, FCP X is a viable finishing tool. I realize that statement will raise a few eyebrows, but hear me out. Video passing through Final Cut stays pristine, it supports the various flavors of 2K and 4K formats and there’s a huge and developing ecosystem of highly-inventive effects and transitions. This combination is a great opportunity to think outside of the box.

Offline editing with Avid Media Composer

Avid has supported native RED files for several versions, but Media Composer is not resolution-independent. This means RED’s 4K (or 5K) images are downsampled to 1080p and reformatted (cropped or letterboxed) to fit into the 16:9 frame. When you shoot with a RED camera, you should ideally record in one of their 4K 16:9 sizes. The native .r3d files can be brought into Media Composer using the “Link to AMA File(s)” function. Although you can edit directly with AMA-linked files, the preferred method is to use this as a “first step”. That means you should use AMA to cull your footage down to the selected takes and then transcode those selects when you start to fine-tune your cut.

Avid’s media creation settings are the place to adjust the RED debayer parameters. Media Composer supports the RED Rocket card for accelerated rendering, but without it, Media Composer can still provide reasonable speed in software-only transcoding. Set the debayer quality to 1/4 or 1/8, and transcoding 4K clips to Avid DNxHD36 for offline editing will be closer to real-time on a fast machine, like an 8-core Mac Pro. This resolution is adequate for making your creative decisions.

When the cut is locked, export an AAF file for the edited sequence. Media should be linked (not embedded) and the AAF Edit Protocol setting should be enabled. In this workflow, I will assume that audio post is being handled by an audio editor/mixer running a DAW, such as Pro Tools, so I’ll skip any discussion of audio. That would be exported using standard AAF or OMF workflows for audio post. Note that all effects should be removed from your sequence before generating the AAF file, since they won’t be translated in the next steps. This includes any nested clips, collapsed tracks and speed ramps, which are notorious culprits in any timeline translation.

Color grading with DaVinci Resolve

Blackmagic Design’s DaVinci Resolve 9 is our next step. You’ll need the full, paid version (software-only) for bigger-than-HD output. After launching Resolve, import the Avid AAF file from Resolve’s Conform tab. Make sure you check “link to camera files” so that Resolve connects to the original .r3d media and not the Avid DNxHD transcodes. Resolve will import the sequence, connect to the media and generate a new timeline that matches the sequence exported from Media Composer. Make sure the project is set for the desired 4K format.

Next, open the Resolve project settings and adjust the camera raw values to the proper RED settings. Then make sure the individual clips are set to “project” in their camera settings tab. You can either use the original camera metadata or adjust all clips to a new value in the project settings pane. Once this is done, you are ready to grade the timeline as with any other production. Resolve uses a very good scaling algorithm, so if the RED files were framed with the intent of resizing and repositioning (for example, 5K files that are to be cropped for the ideal framing within a 4K timeline), then it’s best to make that adjustment within the Resolve timeline.

Once you’ve completed the grade, set up the render. Choose the FCP XML Easy Setup and alter the output frame size to the 4K format you are using. Start the render job. Resolve 9 renders quite quickly, so even without a RED Rocket card, I found that 4K ProRes HQ or 4444 rendering, using full-resolution debayering, was completed in about a 6:1 ratio to running time on my Mac Pro. When the renders are done, export the FCP XML (for FCP X) from the Conform tab. I found I had to use an older version of this new XML format, even though I was running FCP X 10.0.7, which was unable to read the newest version that Resolve had exported.

Online with Apple Final Cut Pro X

The last step is finishing. Import the Resolve-generated XML file, which will in turn create the necessary FCP Event (media linked to the 4K ProRes files rendered from Resolve) and a timeline for the edited sequence. Make sure the sequence (Project) settings match your desired 4K format. Import and sync the stereo or surround audio mix (generated by the audio editor/mixer) and rebuild any effects, titles, transitions and fast/slo-mo speed effects. Once everything is completed, use FCP X’s share menu to export your deliverables.

©2013 Oliver Peters

Blackmagic Cinema Camera post workflows

Digital camera development has been running in high gear for several years, outpacing any other portion of our industry. Thanks to a revolution started by RED, Nikon and Canon, videographers are now blessed with a wide range of small, affordable, high-performance imaging systems that have broken us free from the confines of the mundane 2/3” video camera. The newest entrant is the Blackmagic Cinema Camera, introduced by the industry’s favorite disrupter, Blackmagic Design. Marked by a small form factor, QuickTime or camera raw recording and a $3K price tag, Blackmagic has been able to bring to market a product that seems to have eluded many other seasoned camera manufacturers.

The basic engineering design of the Blackmagic Cinema Camera is a “sandwich” of an EF or MFT (Micro Four Thirds) lens mount, a recording device based on the HyperDeck Shuttle and a touch screen/viewfinder. It records either 1920×1080 ProResHQ QuickTime movie files or 2400×1350 CinemaDNG camera raw image sequences. (Version 1.1 software was recently released, which adds Avid DNxHD support.) The high-def QuickTime files are downsampled from the 2.5K sensor. With CinemaDNG selected, each clip is treated as a folder of individual frames, plus a broadcast wave file. Each time “record” is pressed, a new folder is created for that clip. The camera raw files maintain the full sensor resolution, allowing for high-quality reframing and digital zooms in post.
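As a quick way to see what this folder-per-clip structure looks like on a card, here is a minimal sketch that walks the clip folders and reports frame counts, rough duration and the accompanying WAV file. The mount point and frame rate are assumptions.

```python
# Summarize a CinemaDNG card: one folder per clip, each holding a DNG frame
# sequence plus a broadcast WAV. Paths and frame rate are placeholders.
from pathlib import Path

card = Path("/Volumes/BMCC_CARD")   # assumed mount point
fps = 24                            # assumed shooting frame rate

for clip in sorted(p for p in card.iterdir() if p.is_dir()):
    frames = sorted(clip.glob("*.dng"))
    wavs = [w.name for w in clip.glob("*.wav")]
    if not frames:
        continue
    print(f"{clip.name}: {len(frames)} frames (~{len(frames) / fps:.1f} s), audio: {wavs}")
```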

This isn’t a camera review, so I’ll leave the discussion of the merits of the camera in the field to others. Since the BMCC offers new options to filmmakers, it’s important to understand how to handle these files in post. (Click any of these images for an expanded view.)

Understanding camera raw

Camera raw is not an acronym. The term refers to a file that has not gone through full processing to produce a final RGB image. The full dynamic range of the sensor’s ability to capture light is maintained in raw images. Different manufacturers use different camera raw methods and profiles for individual models. When you get frequent software updates to Apple Aperture or Adobe Photoshop, it’s often to add new camera profiles to keep current with the latest Canon or Nikon offerings. In most camera raw images, ISO/exposure and color temperature/tint values are represented as metadata recorded by the camera at the time the image was captured. As metadata, these values can be altered in post and aren’t “baked in” as a permanent part of the image – as they would be with a TIFF or JPEG still. If a camera raw image appears to be slightly overexposed, post processing software allows you to recover the highlight detail by changing the ISO or exposure values.
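Here is a toy numerical illustration of why that highlight recovery works – the exposure choice is applied to linear sensor values at develop time, whereas a baked 8-bit file has already clipped everything above white. This is only a sketch, not a real debayer or raw pipeline.

```python
# Toy example: linear sensor values vs. a baked 8-bit image.
import numpy as np

sensor = np.array([0.2, 0.8, 1.3, 1.6])   # linear values, 1.0 = display white

# "Baked" file: clip at white, quantize to 8 bits -- detail above 1.0 is lost.
baked = np.round(np.clip(sensor, 0, 1.0) * 255).astype(int)
print("baked 8-bit: ", baked)             # [ 51 204 255 255]

# Raw develop: pull exposure down one stop *before* clipping -- highlights return.
reexposed = np.round(np.clip(sensor * 0.5, 0, 1.0) * 255).astype(int)
print("raw, -1 stop:", reexposed)         # [ 26 102 166 204]
```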

Adobe launched an initiative to create a common camera raw format as a type of “digital negative” file, which became the DNG standard. This was released as open source software and is available for manufacturers to use in their products as DNG (stills) and CinemaDNG (motion), thus eliminating the need to create their own new, proprietary camera raw file format. Blackmagic uses this CinemaDNG file format for its raw image sequences, which means that a wide range of applications can read, open and import these files natively. Those that include camera raw importer modules also enable you to alter the recorded settings within that application.

Correctly importing camera raw images is important. For instance, Apple Final Cut Pro X will natively read the BMCC’s CinemaDNG files, but it currently has no raw importer settings. If the “as shot” metadata makes the image appear overexposed with clipped highlights, you cannot recover that detail from within FCP X. Likewise, not all camera raw importers use the same values. An image opened at the default or the “as shot” value in DaVinci Resolve will look different than in an Adobe application. The beauty of raw, though, is that the image is within an adjustable range and any of these importers will give you good results with a few tweaks.

Image sequence workflows

Blackmagic Design includes a full copy of DaVinci Resolve 9 with the purchase of the camera and that’s obviously their recommended tool for producing editing “dailies” and final color correction. Since DNG is a still photo format, grading and conversion can be handled in other applications, too, including Adobe Photoshop, Lightroom, After Effects and Apple Aperture. Each of these applications includes camera raw controls to get the most out of the image. Currently, you cannot import the camera raw files directly into Avid Media Composer, Apple Color or Adobe Premiere Pro. SpeedGrade (with the latest updates) will read the files, but offers no specific camera raw adjustments. There the default import of CinemaDNG files renders a flat, log-style image as a starting point.

The following is a simple workflow using a photography application, such as Lightroom or Aperture. Import each folder of CinemaDNG files into the application. Now select a representative frame within that group and apply your adjustments. Since these are full-featured color correction tools, go as extreme as you like, if you intend to create the final look at this time. Once you get the appearance you want, copy-and-paste those settings to the other images in that folder. These are non-destructive changes within Lightroom and Aperture and may be altered at any time in the future. Next, export the adjusted versions as a new set of TIFFs to a separate folder on your hard drive. These TIFFs will contain the “baked in” look you have just created.

The process is a bit different in Photoshop, but there you have the option of using one of the many special tools and filters to create unique looks. For example, you can apply an oil paint or dark strokes effect for an artistic, painted style. Open a representative frame from a shot and apply the settings you want to use. As you do this, record the steps as a Photoshop Action. When you are happy with the look, use Photoshop’s Batch function to apply this saved Action to all the frames in a folder for each shot.

The CinemaDNG files are 5MB/frame in size, while the exported TIFFs are 9.8MB each. It is possible to open a TIFF image sequence in QuickTime 7 and save it as a QuickTime reference movie. That, in turn, can be used as an editing source in Final Cut Pro 7 and X. As a reference movie, if you update the files later by re-exporting TIFFs with a new look, the reference movie will also be updated to reflect these new files. FCP X offers a performance edge by being able to play these 2.5K sequences natively in real-time within a 2K or HD timeline. QuickTime reference movies are 8-bit, but I saw no visual difference when comparing these files to 10-bit uncompressed and ProRes4444 exports. I would recommend that you transcode these to proxy editing files in FCP X if you opt for the QuickTime reference method.
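For a sense of how heavy these image sequences are, here is the quick per-second and per-minute math from the file sizes quoted above, assuming a 24 fps frame rate:

```python
# Data-rate math for the per-frame sizes mentioned above (approximate).
fps = 24  # assumed frame rate

for label, mb_per_frame in [("CinemaDNG", 5.0), ("exported TIFF", 9.8)]:
    per_sec = mb_per_frame * fps
    print(f"{label}: ~{per_sec:.0f} MB/s, ~{per_sec * 60 / 1000:.1f} GB/min")
```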

After Effects offers another solution. You can open CinemaDNG image sequences, make adjustments with its camera raw importer, and then render out final, graded movie files. Naturally, plug-ins like Magic Bullet Looks add more options for custom styles. If you opt to first convert the DNG sequences to TIFFs using Lightroom or Aperture, then Avid Media Composer and Symphony will auto-detect the files as an image sequence and import them as a single media file.

DaVinci Resolve 9

DaVinci Resolve 9 is an advanced grading tool, but may also be used simply to turn the image sequences into a set of flat-looking QuickTime movie files, suitable for color correction later. Resolve 9 now includes a camera raw settings tab. Tweak the settings and then export each clip as a separate movie file. Blackmagic Design has implemented BMD Film with this camera. It’s a log-encoded color space and gamma profile that resembles ARRI’s Log-C. This profile may be selected in-camera for the QuickTime files, but may also be used as a preset in the Resolve 9 raw module (also available in the free Lite version).

So far, my testing has been limited to a handful of clips shot by Australian DP John Brawley. These are sample shots from the short film Afterglow, which was produced to showcase the Blackmagic Cinema Camera. Brawley has posted several shots online in both CinemaDNG and ProResHQ formats. The QuickTime files from the camera were encoded with the BMD Film profile, which matches the same setting when applied to camera raw files converted through Resolve 9.

BMD Film

The option of using After Effects, Photoshop, Lightroom or Aperture gives users an interesting new toolset for creating stunning images, but for most, there’s a comfort factor in using your favorite NLE or grading software. I believe the majority of users will probably stick to shooting ProResHQ files using the BMD Film log profile, because it’s a proven workflow. It preserves the dynamic range and gives you most of the latitude available from the CinemaDNG files. Camera raw files exported from Resolve 9 as ProResHQ using the BMD Film preset (without other correction) are identical to the appearance of the QuickTimes recorded in-camera. The ARRI ALEXA also shoots raw and Log-C, which makes the BMCC an interesting option as a sort of “baby ALEXA”. I haven’t intercut clips from a project that was shot with both an ALEXA and the Blackmagic camera yet, but I suspect the BMCC would work well as a good B or C camera in this type of production.

When I take the BMD Film-encoded clips into Final Cut Pro 7 or X, the values are close enough to Log-C that I can use many of the same LUTs and filters. For example, the Pomfort Alexa Look2Video filter that I use to correct Log-C into Rec 709 works equally well with BMD Film. I’ve done grading tests using a range of NLEs and color correction software and have been very impressed with the results from these Afterglow test clips. Working with the CinemaDNG or ProResHQ BMCC clips will fit into established workflows, without the need to learn new, proprietary tools. No matter what your preference – Avid, FCP X, Premiere, After Effects, Color, Resolve, Photoshop, etc. – this is one new camera that was designed with post in mind first.

Click here to see a variety of grading examples.

Originally written for Digital Video magazine.

© 2012 Oliver Peters