Hawaiki Color

df_hawaiki_1_sm

Color correction using graphical color wheels was introduced to the editing world in the Avid Symphony over a decade ago and adopted by nearly every NLE after that. Final Cut Pro “legacy” had two nice color correctors using the color wheel model, so adopters of Final Cut Pro X were disappointed to see the Color Board as the replacement. Although the additive/subtractive color math works about the same way to change the tonality of lows, mids and highlights, many users still pine for wheels instead of pucks and sliders. A pair of developers (Tokyo Productions and Lawn Road) set out to rectify that situation with Hawaiki Color. It’s the color correction tool that many Final Cut Pro X editors wish Apple had built. (Click any images in this post for an enlarged view.)

Both developers offer several different types of grading filters, which all perform similar tasks. Each has its own twists, but only Hawaiki Color includes on-screen sliders and color wheel controls. Based on how Apple designed FCP X, developers simply cannot create custom interfaces within the Inspector effects panel. They are limited to sliders and a few extras. One of these extras is the ability to tap into the Mac OS color pickers to use color swatches as tonal controls for low/mid/hi color balance. A number of grading filters use this method quite successfully.

If a developer wants to introduce more custom interface elements, then there are two routes – linking to a separate external application (Magic Bullet Looks, Digital Film Tools Film Stocks, Tiffen Dfx3, GenArts Sapphire Edge) – or placing an overlay onto the Viewer. Thanks to the latter option, a number of developers have created special overlays that become “heads up display” (HUD) controls for their plug-ins. To date, only Hawaiki Color and Yanobox Moods have used a HUD overlay to reproduce color wheels for grading.

df_hawaiki_2_sm

The Hawaiki Color grading controls can be adjusted either from the Inspector effects pane or from the on-screen HUD controls placed over the main Viewer output. Set-ups, like a reference split screen, must be done from the Inspector. The grading controls are built into three of the four frame corners with low/mid/hi/global sliders for exposure, temperature and saturation. The sliders in the fourth corner let you adjust overall hue, contrast, sharpening and blur. At the center bottom of the frame are three color wheels (low/mid/hi) for balance offsets. Once the Hawaiki Color filter is applied to the desired clips in your timeline – and you have set the filter to be displayed in a window or full screen with overlaid controls – it becomes easy to move from clip to clip in a very fast grading session.

df_hawaiki_3_sm

I ran a test using Philip Bloom’s Hiding Place short film, which he shot as part of his review of the Blackmagic Pocket Cinema Camera. He was gracious enough to offer an ungraded ProResHQ version for download, which is what I used as my test footage. The camera settings include a flat gamma profile (BMD Film), which is similar to RED’s RedLogFilm or ARRI’s Log-C and is ideal for grading. I edited this into an FCP X timeline, bladed the clip at all the cuts and then applied the Hawaiki Color filter to each segment.

df_hawaiki_4_sm

By running my Viewer on the secondary screen, setting the filter to full screen with the interface controls overlaid and placing the FCP X scopes below, I ended up with a very nice color grading environment and workflow. The unique aspect, compared to most other grading filters, is that all adjustments occur right on the image. This means your attention always stays on the image, without needing to shift between the Inspector and the Viewer or an external monitor. I did my grading using a single instance of the filter, but it is possible to stack more than one application of Hawaiki Color onto a clip or within adjustment layers. You can also use it in conjunction with any other filter. In fact, in my final version, I added just a touch of the FilmConvert Pro film emulsion filter, as well as an FCP X Color Board shape mask for a vignette effect.

df_hawaiki_5_sm

There are a few things to be mindful of. Because of the limitations developers face in creating HUDs for an FCP X effect, Hawaiki Color includes a “commit grade” button, which turns off the on-screen interface. If you don’t “commit” the grade, then the interface is baked into your rendered file and/or your exported master. Like all third-party filters, Hawaiki Color does not have the same unrendered performance as FCP X’s own Color Board. There’s “secret sauce” that Apple uses, which developers are not privy to. Frankly, there isn’t a single third-party FCP X filter that performs as well as Apple’s built-in effects. Nevertheless, Hawaiki Color performed reasonably well in real-time and didn’t get sluggish until I stacked FilmConvert and a vignette on top of it.

df_hawaiki_6_sm

I ran into an issue with Bloom’s source file, which he exports at a cropped 1920 x 816 size for a 2.40:1 aspect ratio. FCP X will fit this into a 1920 x 1080 sequence with letterboxed black pad on the top and bottom. However, I found that this affected the HUD controls once I added more filters. It also caused the color wheel controls to change position in the frame, as they are locked to the source size. The solution to avoid such issues is to place the non-standard-sized clip into a 1080p sequence and then create a Compound Clip. Then edit that Compound Clip into a new sequence where you will apply the filters. None of this is an issue with Hawaiki Color or any other filter, but rather a function of working with non-standard (for video) frame sizes within an FCP X sequence.

df_hawaiki_7_sm

As far as grading Hiding Place goes, my intent was to go for a slight retro look, like 1970s-era film. The footage lent itself to that and, with the BMD Film gamma profile, was easy to grade. I stretched exposure/contrast, increased saturation and swung the hue offsets as follows – shadows towards green, midrange towards red/orange and highlights towards blue. The FilmConvert Pro filter was set to a Canon Mark II/Standard camera profile and the KD5207 Vis3 film stock selection. This is a preset that mimics a modern Kodak negative stock with relatively neutral color. I dialed it back to 30% of its color effect, but with grain at 100% (35mm size). The effect of this was to slightly change gamma and brightness and to add grain. Finally, the Color Board vignette darkens the edges of the frame.

Click here to see my version of Hiding Place graded using Hawaiki Color. In my clip, you’ll see the final result (first half), followed by a split screen output with the interface baked in. Although I’ve been a fan of the Color Board, I really like the results I got from Hawaiki Color. Control granularity is better than the Color Board and working the wheels is simply second nature. Absolutely a bargain if it fits your grading comfort zone!

©2013 Oliver Peters / Source images ©2013 PhilipBloom.net

The NLE that wouldn’t die II

df_nledie2_sm

With echoes of Monty Python in the background, two years on, Final Cut Pro 7 and Final Cut Studio are still widely in use. As I noted in my post from last November, I still see facilities with firmly entrenched and mature FCP “legacy” workflows that haven’t moved to another NLE yet. Some were ready to move to Adobe until they learned subscription was the only choice going forward. Others maintain a fanboy’s faith in Apple that the next version will somehow fix all the things they dislike about Final Cut Pro X. Others simply haven’t found the alternative solutions compelling enough to shift.

I’ve been cutting all manner of projects in FCP X since the beginning and am currently using it on a feature film. I augment it in lots of ways with plug-ins and utilities, so I’m about as deep into FCP X workflows as anyone out there. Yet, there are very few projects in which I don’t touch some aspect of Final Cut Studio to help get the job done. Some fueled by need, some by personal preference. Here are some ways that Studio can still work for you as a suite of applications to fill in the gaps.

DVD creation

There are no more version updates to Apple’s (or Adobe’s) DVD creation tools. FCP X and Compressor can author simple “one-off” discs using their export/share/batch functions. However, if you need a more advanced, authored DVD with branched menus and assets, DVD Studio Pro (as well as Adobe Encore CS6) is still a very viable tool, assuming you already own Final Cut Studio. For me, the need to do this has been reduced, but not completely gone.

Batch export

Final Cut Pro X has no batch export function for source clips. This is something I find immensely helpful. For example, many editorial houses specify that their production company client supply edit-friendly “dailies” – especially when final color correction and finishing will be done by another facility or artist/editor/colorist. This is a throwback to film workflows and is most often the case with RED and ALEXA productions. Certainly a lot of the same processes can be done with DaVinci Resolve, but it’s simply faster and easier with FCP 7.

In the case of ALEXA, a lot of editors prefer to do their offline edit with LUT-corrected, Rec 709 images, instead of the flat, Log-C ProRes 4444 files that come straight from the camera. With FCP 7, simply import the camera files, add a LUT filter like the one from Nick Shaw (Antler Post), enable TC burn-in if you like and run a batch export in the codec of your choice. When I do this, I usually end up with a set of Rec 709 color, ProResLT files with burn-in that I can use to edit with. Since the file name, reel ID and timecode are identical to the camera masters, I can easily edit with the “dailies” and then relink to the camera masters for color correction and finishing. This works well in Adobe Premiere Pro CC, Apple FCP 7 and even FCP X.
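The relink step works because the dailies and the camera masters share the same key metadata. Here is a hedged, purely illustrative Python sketch of that matching logic (file names and dictionary fields are hypothetical, not any NLE's actual data model):

```python
# Illustrative sketch: an NLE can map an edited proxy/dailies clip back to its
# camera master because reel ID and start timecode form a unique key.
def build_index(masters):
    # masters: list of dicts with hypothetical 'path', 'reel', 'tc_start' fields
    return {(m["reel"], m["tc_start"]): m["path"] for m in masters}

masters = [
    {"path": "/raw/A001_C001.mov", "reel": "A001", "tc_start": "01:00:00:00"},
    {"path": "/raw/A001_C002.mov", "reel": "A001", "tc_start": "01:04:10:12"},
]

# A burn-in dailies clip carries the same reel and timecode as its master:
proxy = {"reel": "A001", "tc_start": "01:04:10:12"}
print(build_index(masters)[(proxy["reel"], proxy["tc_start"])])  # /raw/A001_C002.mov
```

If the dailies were renamed or re-striped with new timecode, this lookup would fail, which is why keeping file name, reel ID and timecode identical to the camera masters matters.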

Timecode and reel IDs

When I work with files from the various HDSLRs, I prefer to convert them to ProRes (or DNxHD) and add timecode and reel ID info. In my eyes, this makes the file professional video media that’s much more easily dealt with throughout the rest of the post pipeline. I have a specific routine for doing this, but when some of these steps fail, due to some file error, I find that FCP 7 is a good back-up utility. From inside FCP 7, you can easily add reel IDs and also modify or add timecode. This metadata is embedded into the actual media file and readable by other applications.

Log and Transfer

Yes, I know that you can import and optimize (transcode) camera files in FCP X. I just don’t like the way it does it. The FCP 7 Log and Transfer module allows the editor to set several naming preferences upon ingest. This includes custom names and reel IDs. That metadata is then embedded directly into the QuickTime movie created by the Log and Transfer module. FCP X doesn’t embed name and ID changes into the media file, but rather into its own database. Subsequently this information is not transportable by simply reading the media file within another application. As a result, when I work with media from a C300, for example, my first step is still Log and Transfer in FCP 7, before I start editing in FCP X.

Conform and reverse telecine

A lot of cameras offer the ability to shoot at higher frame rates with the intent of playing this at a slower frame rate for a slow motion effect – “overcranking” in film terms. Advanced cameras like the ALEXA, RED One, EPIC and Canon C300 write a timebase reference into the file that tells the NLE that a file recorded at 60fps is to be played at 23.98fps. This is not true of HDSLRs, like a Canon 5D, 7D or a GoPro. You have to tell the NLE what to do. FCP X only does this through its Retime effect, which means you are telling the file to be played as slomo, thus requiring a render.

I prefer to use Cinema Tools to “conform” the file. This alters the file header information of the QuickTime file, so that any application will play it at the conformed, rather than recorded frame rate. The process is nearly instant and when imported into FCP X, the application simply plays it at the slower speed – no rendering required. Just like with an ALEXA or RED.
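The arithmetic behind a conform is simple: the frames are untouched, only the declared playback rate changes. A rough Python sketch (not Cinema Tools' actual code) of what that rewrite implies:

```python
# Illustrative sketch: conforming keeps every recorded frame but changes the
# declared playback rate, so the same frames play over a longer duration.
def conform(frame_count, recorded_fps, conformed_fps):
    """Return (recorded duration, conformed duration, slowdown factor) in seconds."""
    recorded_duration = frame_count / recorded_fps
    conformed_duration = frame_count / conformed_fps
    slowdown = recorded_fps / conformed_fps
    return recorded_duration, conformed_duration, slowdown

# 10 seconds shot at 60fps, conformed to 23.976fps (24000/1001) for slow motion:
orig, conf, factor = conform(600, 60.0, 24000 / 1001)
print(orig, conf, factor)
```

The 600 recorded frames that lasted 10 seconds now play for about 25 seconds, roughly a 2.5x slow motion effect, with no rendering because no new frames are created.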

Another function of Cinema Tools is reverse telecine. If a camera file was recorded with built-in “pulldown” – sometimes called 24-over-60 – additional redundant video fields are added to the file. You want to remove these if you are editing in a native 24p project. Cinema Tools will let you do this and in the process render a new, 24p-native file.
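The 24-over-60 cadence can be sketched in a few lines of Python. This is a conceptual model of 2:3 pulldown and its removal, not Cinema Tools' implementation (which works on real interlaced fields, not labels):

```python
# Conceptual sketch of 2:3 pulldown: each group of 4 film frames (A, B, C, D)
# becomes 10 video fields -- A x2, B x3, C x2, D x3 -- i.e. 5 interlaced frames.
def pulldown_fields(film_frames):
    cadence = [2, 3, 2, 3]
    fields = []
    for i, frame in enumerate(film_frames):
        fields += [frame] * cadence[i % 4]
    return fields

def reverse_telecine(fields):
    # Drop the redundant duplicate fields, keeping one copy of each film frame.
    out = []
    for f in fields:
        if not out or out[-1] != f:
            out.append(f)
    return out

fields = pulldown_fields(["A", "B", "C", "D"])
print(len(fields))               # 10
print(reverse_telecine(fields))  # ['A', 'B', 'C', 'D']
```

Reverse telecine is the second function run over the whole file: the redundant fields are discarded and a new, clean 24p-native file is rendered from what remains.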

Color correction

I really like the built-in and third-party color correction tools for Final Cut Pro X. I also like Blackmagic Design’s DaVinci Resolve, but there are times when Apple Color is still the best tool for the job. I prefer its user interface to Resolve’s, especially when working with dual displays. And if you use an AJA capture/monitoring product, Resolve is a non-starter. For me, Color is the best choice when I get a color correction project from outside where the editor used FCP 7 to cut. I’ve also done some jobs in X and then gone to Color via Xto7 and then FCP 7. It may sound a little convoluted, but is pretty painless and the results speak for themselves.

Audio mixing

I do minimal mixing in X. It’s fine for simple mixes, but for me, a track-based application is the only way to go. I do have X2Pro Audio Convert, but many of the out-of-house ProTools mixers I work with prefer to receive OMFs rather than AAFs. This means going to FCP 7 first and then generating an OMF from within FCP 7. This has the added advantage that I can proof the timeline for errors first. That’s something you can’t do if you are generating an AAF without any way to open and inspect it. FCP X has a tendency to include many clips that are muted and usually out of your way inside X. By going to FCP 7 first, you have a chance to clean up the timeline before the mixer gets it.

Any complex projects that I mix myself are done in Adobe Audition or Soundtrack Pro. I can get to Audition via the XML route – or I can go to Soundtrack Pro through XML and FCP 7 with its “send to” function. Either application works for me and most of my third-party plug-ins show up in each. Plus they both have a healthy set of their own built-in filters. When I’m done, simply export the mix (and/or stems) and import the track back into FCP X to marry it to the picture.

Project trimming

Final Cut Pro X has no media management function.  You can copy/move/aggregate all of the media from a single Project (timeline) into a new Event, but these files are the source clips at full length. There is no ability to create a new project with trimmed or consolidated media. That’s when source files from a timeline are shortened to only include the portion that was cut into the sequence, plus user-defined “handles” (an extra few frames or seconds at the beginning and end of the clip). Trimmed, media-managed projects are often required when sending your edited sequence to an outside color correction facility. It’s also a great way to archive the “unflattened” final sequence of your production, while still leaving some wiggle room for future trimming adjustments. The sequence is editable and you still have the ability to slip, slide or change cuts by a few frames.
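The trimming calculation itself is straightforward. A hedged Python sketch of what a media manager computes per clip (frame numbers are made up for illustration):

```python
# Illustrative sketch of trimmed media management: keep only the portion of
# each source clip used in the sequence, plus user-defined handles, clamped
# to the actual media bounds.
def trim_with_handles(used_in, used_out, handles, media_start, media_end):
    new_in = max(media_start, used_in - handles)
    new_out = min(media_end, used_out + handles)
    return new_in, new_out

# A 5-second cut (frames 1000-1120 of a long take) with 1-second (24-frame)
# handles at 24fps:
print(trim_with_handles(1000, 1120, 24, 0, 30000))  # (976, 1144)
```

Only frames 976 through 1144 of the take get copied, instead of all 30,000, which is exactly why trimmed projects are so much smaller than copies of the full-length sources.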

I ran into this problem the other day, where I needed to take a production home for further work. It was a series of commercials cut in FCP X, from which I had recut four spots as director’s cuts. The edit was locked, but I wanted to finish the mix and grade at home. No problem, I thought. Simply duplicate the project with “used media”, create the new Event and “organize” (copies media into the new Event folder). I could live with the fact that the media was full length, but there was one rub. Since I had originally edited the series of commercials using Compound Clips for selected takes, the duping process brought over all of these Compounds – even though none was actually used in the edit of the four director’s cuts. This would have resulted in copying nearly two-thirds of the total source media. I could not remove the Compounds from the copied Event, without also removing them from the original, which I didn’t want to do.

The solution was to send the sequence of four spots to FCP 7 and then media manage that timeline into a trimmed project. The difference was 12GB of trimmed source clips instead of HUNDREDS of GB. At home, I then sent the audio to Soundtrack Pro for a mix and the picture back to FCP X for color correction. Connect the mix back to the primary storyline in FCP X and call it done!

I realize that some of this may sound a bit complex to some readers, but professional workflows are all about having a good toolkit and knowing how to use it. FCP X is a great tool for productions that can work within its walls, but if you still own Final Cut Studio, there are a lot more options at your disposal. Why not continue to use them?

©2013 Oliver Peters

NAB 2013 Distilled

df_nab2013_1

Another year – another NAB exhibition. A lot of fun stuff to see. Plenty of innovation and advances, but no single “shocker” like last year’s introduction of the Blackmagic Cinema Camera. Here are some observations based on this past week in Las Vegas.

4K

Yes, 4K was all over. I was a bit surprised that many of the pieces for a complete end-to-end solution are in place. The term 4K refers to the horizontal pixel width of the image, but two common specs are used – the DCI (film) standard of 4096 and the UltraHD (aka QuadHD) standard of 3840. Both are “4K”. Forgotten in the discussion is frame rate. Many displays were showing higher frame rates, such as 4K at 60fps. 120fps is also being discussed.

4K (and higher) cameras were there from Canon, Sony, RED, JVC, GoPro and now Blackmagic Design. Stereo3D was there, too, in pockets; but, it’s all but dead (again). 4K, though, will have legs. The TV sets and distribution methods are coming into position and this is a nonintrusive experience for the viewer. SD to HD was an obvious “in your face” difference. 4K is noticeably better, but not as much as SD to HD. More like 720p versus 1080p. This means that consumer prices will have to continue to drop (as they will) for 4K to really catch hold, except for special venue applications. Right now, it’s pretty obvious how gorgeous 4K is when standing a few feet away from an 84” screen, but few folks can afford that yet.

Interestingly enough, you can even do live 4K broadcasts, using 4K cameras and production products from Astro Designs. This will have value in live venues like sporting events and large corporate meetings. A new factor – “region of interest” – comes into play. This means you can shoot 4K and then scale/crop the portion of the image that interests you. Naturally, there was also 8K from NHK and Quantel. Both have been on the forefront of HD and then 4K. Quantel was demonstrating 8K (downsampled to a 4K monitor) just to show their systems have the headroom for the future.

ARRI did not have a 4K camera, but the 4 x 3 sensor of the ALEXA XT model features 2880 x 2160 photosites. When you use an anamorphic 2:1 lens and record ARRIRAW, you effectively end up with an unsqueezed image of 5760 x 2160 pixels. Downsample that to a widescreen 2.4:1 image inside a 4096 DCI frame and you have visually similar results as with a Sony or RED camera delivering in 4K. This was demonstrated in the booth and the results were quite pleasing. The ALEXA looked a bit softer than comparable displays at the Sony and RED booths, but most cinematographers would probably opt for the ARRI image, since it appears a lot closer to the look of scanned film at 4K. Part of this is inherent with ARRI’s sensor array, which includes optical filtering in-camera. Sony was showing clips from the upcoming Oblivion feature film, which was shot with an F65. To many attendees these clips looked almost too crisp.
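A quick back-of-envelope check of that anamorphic math (illustrative only; actual extraction and framing choices vary by production):

```python
# ALEXA XT 4x3 ARRIRAW sensor area, unsqueezed through a 2:1 anamorphic lens.
sensor_w, sensor_h = 2880, 2160
unsqueezed_w = sensor_w * 2              # horizontal unsqueeze
print(unsqueezed_w, sensor_h)            # 5760 2160 (about 2.67:1)

# Extract a 2.4:1 widescreen image and fit it into a 4096-wide DCI frame:
target_ar, dci_w = 2.4, 4096
crop_w = round(sensor_h * target_ar)     # width of the 2.4:1 crop at full height
dci_h = round(dci_w / target_ar)
print(crop_w, "->", dci_w, "x", dci_h)   # 5184 -> 4096 x 1707
```

Since the 5184 x 2160 extraction is downsampled (not upscaled) to reach the 4096-wide deliverable, the result is visually competitive with native 4K capture.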

In practical terms, most commercial, corporate, television or indie film users of 4K cameras want an easy workflow. If that’s your goal, then the best “true” 4K paths are to shoot with the Canon C500 or the Sony F55. The C500 can be paired with the (now shipping) AJA KiPro Quad to record 4K ProRes files. The Sony records in the XAVC codec (a variant of AVC-Intra). Both are ready to edit (importer plug-ins may be required) without conversions.

You can also record ARRI 2K ProRes in an ALEXA or use one of the various raw workflows (RED, Canon, Blackmagic, Sony, ARRI). Raw is nice, but adds extra steps to the process – often with little benefit over log-profile recording to an encoded file format.

Edit systems

With the shake-up that Apple’s introduction of Final Cut Pro X has brought to the market, brand dominance has been up for grabs. Apple wasn’t officially at the show, but did have some off-site presence, as well as a few staffers at demo pods. For example, they were showing the XAVC integration in an area of the Sony booth. FCP X was well-represented as part of other displays all over the floor. An interesting metric I noticed was that all press covering the show on video were cutting their reports on laptops using FCP X. That is a sweet spot for use of the application. No new FCP X news (beyond the features released with 10.0.8) was announced.

Adobe is currently the most aggressive in trying to earn the hearts of editors. The “next” versions of Premiere Pro, SpeedGrade, Audition and After Effects have a ton of features that respond to customer requests and will speed workflows. Adobe’s main stage demos were packed and the general consensus of most editors discussing a move away from FCP 7 (and even Avid) was a move to Adobe. In early press, Adobe mentioned working with the Coen brothers, who have committed to cutting their next film with Premiere.

The big push was for Adobe Anywhere – their answer for cloud-based editing. Although a very interesting product, it will compete in the same space as Quantel Qtube and Avid Interplay Sphere. These are enterprise solutions that require servers, storage, software and support, so they will tend to be of more interest to larger news operations and educational facilities than to smaller post shops.

Avid came on with Media Composer 7 at a new price, with Symphony as an add-on option to Media Composer. The biggest features were the ability to edit with larger-than-HD video sources (output is still limited to HD), LUT support, improved media management of AMA files and background transcoding using managed folders (watch folders). In addition, Pro Tools goes to 11, with a new video engine – it can natively run Avid sequences from AAF imports – and faster-than-real-time bounce. The MC background transcode and the PT11 bounce will be time savers for Avid users and that translates into money saved.

Avid Interplay Sphere (announced last year) now works on Macs, but its main benefit is remote editing for stations that have invested in Interplay solutions. Avid is also bundling packages of ISIS storage, Interplay asset management and seats of Media Composer at even lower price points. Although still premium solutions, they are finally in a range that may be attractive to some small edit facilities and broadcasters, given that it includes installation and support.

The other NLE players include Avid DS (not shown), Quantel Pablo Rio, Autodesk Smoke 2013, Grass Valley EDIUS, Sony Vegas, Media 100 (not shown) and Lightworks. Most of these have no bearing on my market. Smoke 2013 is getting traction. Autodesk is working to get user feedback to improve the application, as it moves deeper into a market segment that is new to them. EditShare is forging ahead with Lightworks on the Mac. It looked pretty solid at the show, but expect something that’s ready for users towards the end of the year. It’s got the film credits to back it up, so a free (or near free) Mac version should shake things up even further.

One interesting addition to the market is DaVinci Resolve 10 gaining editing features. Right now the editing bells-and-whistles are still rudimentary, though all of the standard functions are there. Plus there are titles, speed changes with optical flow and a plug-in API (OpenFX). You can already apply GenArts Sapphire filters to your clips. These are applied in the color correction timeline as nodes, rather than effects added to an editing timeline. This means the Sapphire filters can be baked into any clip renders. The positioning of Resolve 10 is as an online editing tool. That means conforming, titling and trims/tweaks after grading. You now have even greater editing capabilities at the grading stage without having to return to an NLE. Ultimately the best synergy will be between FCP X and Resolve. Together the two apps make for a very interesting package and Apple seems to be working closely with Blackmagic Design to make this happen. Ironically the editing mode page looks a lot like FCP X would have looked with tracks and dual viewers.

Final thoughts

I was reading John Buck’s Timeline on the plane. Even though we think of the linear days as having been dominated by CMX, the reality was that there were many systems, including Mach One, Epic, ISC, Strassner, Convergence, Datatron, Sony, RCA and Ampex. In Hollywood, the TV industry was split among them, which is why a common interchange standard of the EDL was developed. For a while, Avid became the dominant tool in the nonlinear era, but the truth is that hasn’t always been the norm – nor should it be. The design dilemma of engineering versus creative was a factor from the beginning of video editing. Should a system be simple enough that producers, directors and non-technical editors can run it? Sound familiar?

When I look at the show, I am struck by how one makes buying choices. To use the dreaded car analogy, FCP X is the sports car and Avid is the truck. But the sports car is a temperamental Ferrari that does some things very well, but isn’t appropriate for others. The truck is a Tundra with all the built-in, office-on-the-road niceties.

If I were a facility manager, making a purchase for a large scale facility, it would probably still be Avid. It’s the safe bet – the “you don’t get fired for buying IBM” bet. Their innovations at the show were conservative, but meet the practical needs of their current customers. There simply is no other system with a proven track record across all types of productions that scales from one user to massive installations. But offering conservative innovation isn’t a growth strategy. You don’t get new users that way. Media Composer has become truly complex in ways that only veteran users can accept and that has to change fast.

Apple FCP X is the wild card, of course. Apple is playing the long game looking for the next generation of users. If FCP X weren’t an Apple product, it would receive the same level of attention as Vegas Pro, at best. Also a great tool with a passionate user base, but nothing that has the potential of dominating market share. The trouble is Apple gets in its own way due to corporate secrecy. I’ve been using FCP X for a while and it certainly is a professional product. But to use it effectively, you have to change your workflow. In a multi-editor, multi-production facility, this means changing a lot of practices and retraining staff. It also means augmenting the software with a host of other applications to fix the shortcomings.

Broadening the appeal of FCP X beyond the one-man-band operations may be tough for that reason. It’s too non-standard and no one has any idea of where it’s headed. On the other hand, as an editor who’s willing to deal with new challenges, I like the fast, creative cutting performance of FCP X. This makes it a great offline editing tool in my book. I find a “start in X, finish in Resolve” approach quite intriguing.

Right now, Adobe feels like the horse to beat. They have the ear of the users and an outreach reminiscent of when Apple was in the early FCP “legacy” era. Adobe is working hard to build a community and the interoperability between applications is the best in the industry. They are only hampered by the past indifference towards Premiere that many pro users have. But that seems to be changing, with many new converts. Although Premiere Pro “next” feels like FCP 7.5, that appears to be what users really want. The direction, at least, feels right. Apple may have been “skating to where the puck will be”, but it could be that no one is following or the puck simply wasn’t going there in the first place.

For an additional look – click over to my article for CreativePlanetNetwork – DV magazine.

©2013 Oliver Peters

DaVinci Resolve Workflows

df_resolve_main

Blackmagic Design’s purchase of DaVinci Systems put a world class color grading solution into the hands of every video professional. With Resolve 9, DaVinci sports a better user interface that makes it easy to run, regardless of whether you are an editor, colorist or DIT working on set. DaVinci Resolve 9 comes in two basic Mac or Windows software versions: the $995 paid version and the free Lite version. The new Blackmagic Cinema Camera software bundle also includes the full (paid) version, plus a copy of Ultrascope. For facilities seeking to add comprehensive color grading services, there’s also a version with Blackmagic’s dedicated control surface, as well as Linux system configurations.

Both paid and free versions of Resolve (currently at version 9.1) work the same way, except that the paid version offers larger-than-HD output, noise reduction and the ability to tap into more than one extra GPU card for hardware acceleration. Resolve runs fine with a single display card (I’ve done testing with the Nvidia GT120, the Nvidia Quadro 4000 and the ATI 5870), but requires a Blackmagic video output card if you want to see the image on a broadcast monitor.

Work in Resolve 9 generally flows left-to-right, through the tabbed pages, which you select at the bottom of the interface screen. These are broken into Media (where you access the media files that you’ll be working with), Conform (importing/exporting EDL, XML and AAF files), Color (where you do color correction), Gallery (the place to store and recall preset looks) and Deliver (rendering and/or output to tape).

Many casual users employ Resolve in these two ways: a) correcting camera files to send on to editorial, and b) color correction roundtrips with NLE software. This tutorial is intended to highlight some of the basic workflow steps associated with these tasks. Resolve is deep and powerful, so spend time with the excellent manual to learn its color correction tools, which would be impossible to cover here.

Creating edit-ready dailies – BMCC (CinemaDNG media)

The Blackmagic Cinema Camera can record images as camera raw, CinemaDNG image sequences. Resolve 9 can be used to turn these into QuickTime or MXF media for editing. Files may be graded for the desired final look at this point, or the operator can choose to apply the BMD Film preset. This log preset generates files with a flat look comparable to ARRI Log-C. You may prefer this if you intend to use a Log-to-Rec709 LUT (look-up table) in another grading application or a filter like the Pomfort Log-to-Video effect, which is available for Final Cut Pro 7/X.

df_resolve_1_sm

Step 1 – Media: Drag clip folders into the Media Pool section.

Step 2 – Conform: Skip this tab, since the clips are already on a single timeline.

df_resolve_3_sm

Step 3 – Color: Make sure the camera setting (camera icon) for the clips on the timeline is set to Project. Open the project settings (gear icon). Change and apply these values: 1) Camera raw – CinemaDNG; 2) White Balance – as shot; 3) Color Space and Gamma – BMD Film.

Step 4 – Deliver: Set it to render each clip individually, assign the target destination and frame rate and the naming options. Then choose Add Job and Start Render.

The free version of Resolve will downscale the BMCC’s 2.5K-wide images to 1920×1080. The paid version of Resolve will permit output at the larger, native size. Rendered ProRes files may now be directly imported into FCP 7, FCP X or Premiere Pro. Correct the images to a proper video appearance using the color correction tools or filters available within your NLE.
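A 1D LUT is conceptually just a precomputed table that maps input code values to output values. The sketch below illustrates the mechanism only – the curve is made up for demonstration and is not the actual BMD Film, Log-C or Pomfort transform:

```python
import math

# Minimal sketch of a 1D LUT: a precomputed table mapping input code
# values to output values. The curve is illustrative only -- NOT the
# real BMD Film or Log-C transform.
LUT_SIZE = 1024

def toy_log_to_video(x):
    """Made-up log-to-video curve for normalized input x in [0, 1]."""
    return math.log1p(9 * x) / math.log(10)  # maps 0 -> 0 and 1 -> 1

lut = [toy_log_to_video(i / (LUT_SIZE - 1)) for i in range(LUT_SIZE)]

def apply_lut(value):
    """Look up a normalized pixel value (nearest entry, no interpolation)."""
    index = round(value * (LUT_SIZE - 1))
    return lut[index]

# Flat log midtones are lifted toward a normal video appearance:
print(round(apply_lut(0.5), 3))
```

A real grading LUT works the same way, just with measured curves rather than a toy formula, and often as a 3D table with interpolation between entries.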

Creating edit-ready dailies – ARRI Alexa / BMCC (ProRes, DNxHD media)

df_resolve_2_smBoth the ARRI Alexa and the Blackmagic Cinema Camera can record Apple ProRes and Avid DNxHD media files to onboard storage. Each offers a similar log gamma profile that may be applied during recording in order to preserve dynamic range: Log-C for the Alexa and BMD Film for the Blackmagic. These profiles facilitate high-quality grading later. Resolve may be used to properly grade these images to the final look as dailies are generated, or it may simply be used to apply a viewing LUT for a more pleasing appearance during the edit.

Step 1 – Media: Drag clip folders into the Media Pool section.

Step 2 – Conform: Skip this tab, since the clips are already on a single timeline.

Step 3 – Color: Make sure the camera setting for the clips on the timeline is set to Project. Open the project settings and set this value: 3D Input LUT – ARRI Alexa Log-C or BMD Film to Rec 709.

df_resolve_4_smStep 4 – Deliver: Set it to render each clip individually, and assign the target destination, frame rate and naming options. Check whether or not to render with audio. Then choose Add Job and Start Render.

The result will be new, color corrected media files, ready for editing. To render Avid-compatible MXF media for Avid Media Composer, select the Avid AAF Roundtrip from the Easy Setup presets. After rendering, return to the Conform page to export an AAF file.

Roundtrips – using Resolve together with editing applications

DaVinci Resolve supports roundtrips from and back to NLEs based on EDL, XML and AAF lists. You can use Resolve for roundtrips with Apple Final Cut Pro 7/X, Adobe Premiere Pro and Avid Media Composer/Symphony. You may also use it to go between systems. For example, you could edit in FCP X, color correct in Resolve and then finish in Premiere Pro or Autodesk Smoke 2013. Media should have valid timecode and reel IDs to enable the process to work properly.

df_resolve_5_smIn addition to accessing the camera files and generating new media with baked-in corrections, these roundtrips require an interchange of edit lists. Resolve imports an XML and/or AAF file to link to the original camera media and places those clips on a timeline that matches the edited sequence. When the corrected (and trimmed) media is rendered, Resolve must generate new XML and/or AAF files, which the NLE uses to link to these new media files. AAF files are used with Avid systems and MXF media, while standard XML files and QuickTime media are used with Final Cut Pro 7 and Premiere Pro. FCP X uses a new XML format that is incompatible with FCP 7 or Premiere Pro without translation by Resolve or another utility.

Step 1 – Avid/Premiere Pro/Final Cut Pro: Export a list file that is linked to the camera media (AAF, XML or FCPXML).

Step 2 – Conform (skip the Media tab): Import the XML or AAF file. Make sure you have set the options to automatically add these clips to the Media Pool.

Step 3 – Color: Grade your shots as desired.df_resolve_6_sm

Step 4 – Deliver: Easy Setup preset – select Final Cut Pro XML or Avid AAF roundtrip. Verify QuickTime or MXF rendering, depending on the target application. Change handle lengths if desired. Check whether or not to render with audio. Then choose Add Job and Start Render.

df_resolve_9_smStep 5 – Conform: Export a new XML (FCP7, Premiere Pro), FCPXML (FCP X) or AAF (Avid) list.

The roundtrip back

The reason you want to go back into your NLE is for the final finishing process, such as adding titles and effects or mixing sound. If you rendered QuickTime media and generated one of the XML formats, you’ll be able to import these new lists into FCP7/X or Premiere Pro and those applications will reconnect to the files in their current location. FCP X offers the option to import/copy the media into its own managed Events folders.

df_resolve_7_smIf you export MXF media and a corresponding AAF list with the intent of returning to Avid Media Composer/Symphony, then follow these additional steps.

Step 1 – Copy or move the folder of rendered MXF media files into an Avid MediaFiles/MXF subfolder. Rename this copied folder of rendered Resolve files with a number.

Step 2 – Launch Media Composer or Symphony and return to your project or create a new project.df_resolve_8_sm

Step 3 – Open a new, blank bin and import the AAF file that was exported from Resolve. This list will populate the bin with master clips and a sequence, which will be linked to the new MXF media rendered in Resolve and copied into the Avid MediaFiles/MXF subfolder.

Originally written for DV magazine / Creative Planet Network

© 2013 Oliver Peters

Blackmagic Design HyperDeck Shuttle

df_hyperdeck_01The video industry has been moving towards complete file-based workflows, but that doesn’t replace all of the functions that traditional videotape recorders served. To bridge the gap, companies such as AJA, Blackmagic Design, Convergent Design, Sound Devices and others have developed solid state recorders for field and studio operation. I recently tested Blackmagic Design’s HyperDeck Shuttle 2, which is touted as the world’s smallest uncompressed recorder.

Blackmagic’s HyperDeck series includes the Shuttle and two Studio versions. The latter are rack-mounted VTR-replacement devices equipped with dual SSDs (solid state drives). The Shuttle is a palm-sized, battery-powered “brick” recorder. A single SSD slides into the Shuttle enclosure, which is only a bit bigger than the drive itself – enough to accommodate the battery, controls and internal electronics. To keep the unit small, the controls are limited to basic record and transport buttons, much like those of a consumer CD player. You can power the unit from an external supply, on-board camera power or its internal battery. The internal, non-removable, rechargeable battery holds its charge for a little over one hour of continuous operation. The purchased unit includes a 12-volt power supply and a kit of international AC plug adapters.

The HyperDeck Shuttle includes 3Gb/s SDI and HDMI for digital capture and playback. Recording formats include 10-bit uncompressed QuickTime movies, as well as Avid DNxHD 175x or 220x in either QuickTime or MXF-wrapped variations. At the time I tested this device, it would not record Apple ProRes codecs. In November, Blackmagic Design released a free software update (version 3.6), which added ProRes HQ to the uncompressed and DNxHD options. It also added closed captioning support to all HyperDeck models.

Since the unit is designed for minimal interaction, all system set-up is handled by an external software utility. Install this application on your computer, connect the HyperDeck Shuttle via USB and then you’ll be able to select recording formats and other preferences, such as whether or not to trigger recording via SDI (for on-camera operation). The unit has no menu, which means you cannot alter, rename or delete files using the button controls or the software utility. There is a display button, but that was not active in the software version that I tested.

Solid state recording

df_hyperdeck_03The SSD used is a standard 2.5” SATA III drive. Several different brands and types have been tested and qualified by Blackmagic Design for use with the HyperDeck units. These drives can be plugged into a generic hard drive dock, like a Thermaltake BlacX Duet, to format the drive and copy/erase any files. The SSD was Mac-formatted, so it was simply a matter of pulling the drive out of the Shuttle’s slot and plugging it into the Duet, which was connected to my Mac Pro tower. This allowed me to copy files from the drive to my computer, as well as to move files back to the SSD for later playback from the Shuttle. (At IBC, Blackmagic also announced ExFAT support with the HyperDeck products.) The naming convention is simple, so recorded files are labeled Capture001, Capture002 and so on. Unfortunately, it does not embed reel numbers into the QuickTime files. Placing a similarly named file in the correct format (more on that in a moment) onto the drive makes it possible to use the Shuttle as a portable master playback device for presentations, film festivals, etc.

My evaluation unit came equipped with a 240GB OCZ Vertex 3 SSD. This is an off-the-shelf drive that runs under $200 at most outlets. By comparison, a Sony 124-minute HDCAM-SR videotape is now more expensive. It’s amazing that this SSD will sustain extended 10-bit uncompressed 1080i/59.94 recording and playback, when even most small drive arrays can’t do that! In practical terms, a 240GB drive will not hold a lot of 1080i 10-bit uncompressed media, so it’s more likely that you would use Avid DNxHD 220x or Apple ProRes HQ for the best quality. You could easily fit over 90 minutes of content on the same SSD using one of these codecs and not really see any difference in image quality.
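The back-of-the-envelope storage math works out roughly as follows. The data rates here are nominal assumptions (v210-style packing for uncompressed, the codec’s advertised bit rate for DNxHD); actual capacity varies with filesystem and container overhead:

```python
# Rough storage math for a 240 GB SSD. Data rates are nominal
# assumptions; actual file sizes vary with overhead.

def minutes_on_drive(capacity_gb, data_rate_mbps):
    """Minutes of footage at a given video data rate (in megabits/sec)."""
    capacity_megabits = capacity_gb * 1000 * 8  # decimal GB -> megabits
    return capacity_megabits / data_rate_mbps / 60

# 10-bit uncompressed 4:2:2 1080i/59.94 (v210 packing, ~21.33 bits/pixel):
uncompressed_mbps = 1920 * 1080 * 21.33 * 29.97 / 1e6  # ~1326 Mb/s
dnxhd_220x_mbps = 220  # Avid DNxHD 220x nominal bit rate

print(f"Uncompressed: {minutes_on_drive(240, uncompressed_mbps):.0f} minutes")
print(f"DNxHD 220x:   {minutes_on_drive(240, dnxhd_220x_mbps):.0f} minutes")
```

At roughly 1.3 Gb/s, the drive holds only about 24 minutes of uncompressed video, versus well over two hours at DNxHD 220x rates.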

In actual use

I tested the unit with various codecs and frame rates. As a general rule, it’s not a good idea to mix different flavors on the same drive. For example, if you record both 1080i 10-bit uncompressed and 1080p/23.98 Avid DNxHD clips on the same drive, the HyperDeck Shuttle will only be able to play back the clips that match its current set-up. The Shuttle does auto-detect the incoming frame rate without the need to set that using the utility. It did seem to get “confused” in this process, making it hard to access clips that I thought it should have been able to play. The clips are on the SSD, though, so you can still pull them off of the drive for use in editing. For standard operation, I would suggest that you set your preferences for the current production and stick to that until you are done.

df_hyperdeck_02Blackmagic Design sells a mounting plate as an accessory. It’s easy to install by unscrewing the HyperDeck’s back panel and screwing in the mounting plate in its place. I loaned the unit to a director of photography that I work with for use with a Canon C300. Although there are common mounting holes, the DP still ended up having to use Velcro as a means to install both his battery and the Shuttle onto the same camera rig. The recordings looked good, but the SDI trigger did not properly stop the recording, requiring the DP to manually stop the unit with each take. Another issue for some is that the Shuttle uses Mini-BNC connectors. This requires an investment in some Mini-BNC adapter cables for SDI operation, if you intend to connect it to standard BNC spigots.

Overall, the unit performed well in a variety of applications, but with a few quirks. I frequently found that it didn’t respond to my pushing the transport control buttons. I’m not sure if this was due to bad button contacts or a software glitch. It felt more like a software issue, in that once it “settled down,” stepping forward and backward through clips and pushing the play button worked correctly. The only format I was not able to play back was 24p media recorded as MXF. Nevertheless, the MXF formatting was correct, as I could drop these files right into an Avid MediaFiles folder on my media hard drive for editing with Avid Symphony.

HyperDeck Shuttle as a portable player

If you intend to use a HyperDeck Shuttle as a master playback device, then there are a few things you need to know. It can capture interlaced, progressive and progressive-segmented-frame (PsF) footage, but it will only play these out as either interlaced or progressive via the SDI connection. Playing PsF as progressive is fine for many monitors and projectors, but the signal doesn’t pass through many routers or to some other recorders. Often these broadcast devices only function with a “true” progressive signal if the format is 720p/59.94. This means that it would be unlikely that you could play a 1080p/23.98 file (captured as PsF) and record that output from the HyperDeck Shuttle to a Sony HDCAM-SR video recorder, as an example.

It is possible to export a file from your Avid NLE, copy that file to the HyperDeck’s SSD using a drive dock and play it back from the unit; however, the specs get a little touchy. The HyperDeck Shuttle records audio as 16-bit/48kHz in the Little Endian format, but Avid exports its files as Big Endian. Endianness refers to how the bytes are ordered in a 16-, 32- or 64-bit word and whether the most or least significant byte comes first. In the case of the Shuttle, this difference meant that I couldn’t get any audio output during playback. If your goal is to transfer a file to the Shuttle for duplication to another deck or playback in a presentation environment, then I would recommend that you take the time to make a real-time recording. Simply connect your NLE’s SDI output to the HyperDeck Shuttle’s SDI input and manually record to it, on-the-fly, like a tape deck.
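The byte-order difference is easy to see in a small sketch, using a hypothetical 16-bit sample value to show how the same PCM sample is stored under each convention:

```python
import struct

sample = 0x1234  # hypothetical 16-bit PCM sample value

little = struct.pack('<h', sample)  # little-endian: low byte first
big = struct.pack('>h', sample)     # big-endian: high byte first

print(little.hex())  # 3412
print(big.hex())     # 1234

# A device expecting little-endian misinterprets big-endian bytes:
misread = struct.unpack('<h', big)[0]
print(hex(misread))  # 0x3412 -- not the original sample
```

With every sample’s bytes swapped like this, the audio decodes as garbage rather than the intended signal, which is consistent with the silent playback described above.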

The HyperDeck Shuttle is a great little unit for filling in workflow gaps. For example, if you don’t own any tape decks, but need to take a master to a duplication facility, you could easily use the Shuttle to transport your media and use it for on-site master playback. It’s a bit too quirky to be a great on-camera field recorder, but at $345 (plus the SSD), the Shuttle is an amazing value for image quality this good. As with their other products, Blackmagic Design has a history of enhancing capabilities through subsequent software updates. I expect that in the future, we’ll see the HyperDeck family grow in a similar fashion.

Originally written for DV magazine / Creative Planet Network

© 2013 Oliver Peters