Why 4K

Ever since the launch of RED Digital Cinema, 4K imagery has become an industry buzzword. The concept stems from 35mm film post, where a 4K digital scan of a film frame is considered full resolution and a 2K scan half resolution. In the proper use of the term, 4K refers only to frame dimensions, although it is frequently and incorrectly used as an expression of visual resolution or perceived sharpness. There is no single 4K size, since it varies with how it is used and the related aspect ratio. For example, full-aperture film 4K is 4096 x 3112 pixels, while academy aperture 4K is 3656 x 2664. The RED One and EPIC use several different frame sizes. Most displays use the Quad HD standard of 3840 x 2160 (a multiple of 1920 x 1080), while the Digital Cinema Projection standard is 4096 x 2160 for 4K and 2048 x 1080 for 2K. The DCP standard is a “container” specification, which means the 2.40:1 or 1.85:1 film aspects are fit within these dimensions and the difference padded with black pixels.
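
To make the container idea concrete, here is a small sketch of the letterbox/pillarbox arithmetic. This is illustrative only – the actual DCI specification pins down the exact active-image sizes (for instance, 3996 x 2160 for 1.85:1 “flat” and 4096 x 1716 for 2.39:1 “scope”).

```python
def fit_in_container(cont_w, cont_h, aspect):
    """Fit an image of a given aspect ratio inside a DCP container;
    the leftover area is padded with black (letterbox or pillarbox)."""
    if aspect >= cont_w / cont_h:
        # image is wider than the container: full width, black bars top/bottom
        return cont_w, round(cont_w / aspect)
    # image is narrower than the container: full height, black bars left/right
    return round(cont_h * aspect), cont_h

# The 4K DCP container is 4096 x 2160 (aspect ~1.896:1)
flat = fit_in_container(4096, 2160, 1.85)    # (3996, 2160) - pillarboxed
scope = fit_in_container(4096, 2160, 2.39)   # (4096, 1714) - letterboxed
```

Note how the 1.85:1 case lands exactly on the DCI “flat” size; the 2.39:1 result differs from the official “scope” size only because DCI rounds to even pixel counts.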

Thanks to the recent interest in stereo 3D films, 4K-capable projection systems have been installed in many theaters. The same system that can display two full-bandwidth 2K signals can also be used to project a single 4K image. Even YouTube offers some 4K content, so larger-than-HD production, post and distribution has quickly gone from the lab to reality. For now, though, most distribution is still 1920 x 1080 HD or the slightly larger 2K film size.

Large sensors

The 4K discussion starts at sensor size. Camera manufacturers have adopted larger sensors to emulate the look of film for characteristics such as resolution, optics and dynamic range. Although different sensors may be of a similar physical dimension, they don’t all use the same number of pixels. A RED EPIC and a Canon 7D use similarly sized sensors, but the resulting pixels are quite different. Three measurements come into play: the actual dimensions, the maximum area of light-receiving pixels (photosites) and the actual output size of recorded frames. One manufacturer might use fewer, but larger photosites, while another might use more pixels of a smaller size that are more densely packed. There is a very loose correlation between actual pixel size, resolution and sensitivity. Larger pixels yield more stops and smaller pixels give you more resolution, but that’s not an absolute. RED has shown with EPIC that it is possible to have both.

The biggest visual attraction of large-sensor cameras appears to be the optical characteristics they offer – namely a shallower depth of field (DoF). Depth of field is a function of aperture, focal length, subject distance and the circle of confusion. Larger sensors don’t inherently create shallow depth of field and out-of-focus backgrounds. But because larger sensors require longer focal lengths for an equivalent field of view compared with standard 2/3-inch video cameras, a shallower depth of field is easier to achieve, which makes these cameras the preferred creative tool. Even if you work with a camera today that doesn’t provide a 4K output, you still gain the benefits of this engineering. If your target format is HD, you will get similar results – as they relate to these optical characteristics – regardless of whether you use a RED, an ARRI ALEXA or an HDSLR.
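
As a rough illustration of why the longer lenses matter, here is a sketch using the standard hyperfocal-distance formulas. The focal lengths and circle-of-confusion values below are typical assumptions for these sensor sizes, not measured data.

```python
def depth_of_field(f, N, s, c):
    """Total depth of field (mm) from the standard hyperfocal-distance formulas.
    f: focal length (mm), N: f-stop, s: subject distance (mm),
    c: circle of confusion (mm)."""
    H = f * f / (N * c) + f                     # hyperfocal distance
    near = s * (H - f) / (H + s - 2 * f)        # near limit of acceptable focus
    far = s * (H - f) / (H - s) if s < H else float("inf")
    return far - near

# Same f/2.8 aperture and ~3 m subject distance, framed for the same
# field of view: 2/3-inch sensor at 20mm vs. Super 35 at ~52mm (2.6x crop).
dof_23 = depth_of_field(20, 2.8, 3000, 0.011)   # ~1.45 m total DoF
dof_s35 = depth_of_field(52, 2.8, 3000, 0.025)  # ~0.46 m total DoF
```

Even though the larger sensor uses a more forgiving circle of confusion, the longer focal length needed for the same framing cuts the depth of field to roughly a third.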

Camera choices

Quite a few large-sensor cameras have entered the market in the past few years. Typically these use a so-called Super 35mm-sized sensor, meaning its dimensions are comparable to a frame of 3-perf 35mm motion picture film. Some examples are the RED One, RED EPIC, ARRI ALEXA, Sony F65, Sony F35, Sony F3 and Canon 7D, among others. That list has just grown to include the brand new Canon EOS C300 and the RED SCARLET-X. Plus, there are other variations, such as the Canon EOS 5D Mark II and EOS 1D X (even bigger sensors) and the Panasonic AF100 (Micro Four Thirds format). Most of these deliver an output of 1920 x 1080, regardless of the sensor. RED, of course, sports up to 5K frame sizes, and the ALEXA can also generate a 2880 x 1620 output when ARRIRAW is used.

This year was the first time the industry at large started to take 4K seriously, with new 4K cameras and post solutions. Sony introduced the F65, which incorporates a 20-megapixel 8K sensor. Like other CMOS sensors, the F65’s sensor uses a Bayer filter pattern, but unlike the other cameras, Sony has deployed more green photosites – one for each pixel in the 4K image. Today, this 8K sensor can yield 4K, 2K and HD images. The F65 will be Sony’s successor to the F35 and become a sought-after tool for TV series and feature film work, challenging RED and ARRI.

November 3rd became a day for competing press events when Canon and RED Digital Cinema both launched their newest offerings. Canon introduced the Cinema EOS line of cameras designed for professional, cinematic work. The first products seem to be straight out of the lineage that stems from Canon’s original XL1, or maybe even the Scoopic 16mm film camera. The launch was complete with a short Blade Runner-esque demo film produced by Stargate Studios, along with Möbius, a new short film by Vincent Laforet (the photographer who launched the 5D revolution with his short film Reverie).

The Canon EOS C300 and EOS C300 PL use an 8.3MP CMOS Super 35mm-sized sensor (3840 x 2160 pixels). For now, these only record at 1920 x 1080 (or 1280 x 720 overcranked) using the Canon XF codec. So, while the sensor is a 4K sensor, the resulting images are standard HD. The difference between this and the way Canon’s HDSLRs record is a more advanced downsampling technology, which delivers the full pixel information from the sensor to the recorded frame without line-skipping and excessive aliasing.

RED launched SCARLET-X to a fan base that has been chomping at the bit for years waiting for some version of this product. It’s far from the original concept of SCARLET as a high-end “soccer mom” camera (fixed lens, 2/3” sensor, 3K resolution with a $3,000 price tag). In fact, SCARLET-X is, for all intents and purposes, an “EPIC Lite”. It has a higher price than the original SCARLET concept, but also vastly superior specs and capabilities. Unlike the Canon release, it delivers 4K recorded motion images (plus 5K stills) and features some of the developing EPIC features, like HDRx (high dynamic range imagery).

If you think that 4K is only a high-end game, take a look at JVC. This year JVC has toured a number of prototype 4K cameras based on a proprietary new LSI chip technology that can record a single 3840 x 2160 image or two 1920 x 1080 streams for the left and right eye views of a stereo 3D recording. The GY-HMZ1U is derivative of this technology and uses dual 3.32MP CMOS sensors for stereo 3D and 2D recordings.

Post at 4K

Naturally the “heavy iron” systems from Quantel and Autodesk have been capable of post at 4K sizes for some time; however, 4K is now within the grasp of most desktop editors. Grass Valley EDIUS, Adobe Premiere Pro and Apple Final Cut Pro X all support editing with 4K media and 4K timelines. Premiere Pro even includes native camera raw support for RED’s .r3d format at up to EPIC’s 5K frames. Avid just released its 6.0 version (Media Composer 6, Symphony 6 and NewsCutter 10), which includes native support for RED One and EPIC raw media, although edited sequences are still limited to a maximum size of 1920 x 1080. With FCP X priced at just $299 and RED’s free REDCINE-X (or REDCINE-X PRO) media management and transcoding tool, you, too, can edit with relative ease on DCP-compliant 4K timelines.

Software is easy, but what about hardware? Both AJA and Blackmagic Design have announced 4K solutions using the KONA 3G or DeckLink 4K cards. Each uses four HD-SDI connections to feed four quadrants of a 4K display or projector at up to 4096 x 2160 sizes. At NAB, AJA previewed for the press its upcoming 5K technology, code-named “Riker”. This is a multi-format I/O system in development for SD up to 5K sizes, complete with a high-quality, built-in hardware scaler. According to AJA, it will be capable of handling high-frame-rate 2K stereo 3D images at up to 60Hz per eye and 4K stereo 3D at up to 24/30Hz per eye.

Even if you don’t own such a display, 27″ and 30″ computer monitors, such as an Apple Cinema Display, feature native display resolutions of up to 2560 x 1600 pixels. Sony and Christie both manufacture a number of 4K projection and display solutions. In keeping with its plans to round out a complete 4K ecosystem, RED continues in the development of REDRAY PRO, a 4K player designed specifically for RED media.

Written for DV magazine (NewBay Media, LLC)

©2011 Oliver Peters

RED post for My Fair Lidy

I’ve worked on various RED projects, but one recent, interesting example is My Fair Lidy, an independent film produced through the Valencia College Film Production Technology program. This was a full-blown feature shot entirely with RED One cameras. In this program, professional filmmakers with real projects in hand partner with a class of eager students seeking to learn the craft of film production. I’ve edited two of the films produced through the program and assisted in various aspects of post on many others. My Fair Lidy – a quirky comedy directed by program director Ralph Clemente – was shot in 17 days this summer at various central Florida locations. Two RED Ones were used – one handled by director of photography Ricardo Galé and the second by student cinematographers. My Fair Lidy was produced by SandWoman Films and stars Christopher Backus and Leigh Shannon.

There are many ways to handle the post production of native RED media and I’ve covered a number of them in these earlier posts. There is no single “best way” to handle these files, because each production is often best served by a custom solution. Originally, I felt the way to tackle the dailies was to convert the .r3d camera files into ProRes 4444 files using the RedLogFilm profile. This gives you a very flat look and a starting point very similar to ARRI ALEXA files shot with the Log-C profile. My intention would have been to finish and grade straight from the QuickTimes and never return to the .r3d files, unless I needed to fix some problems. Neutral images with a RedLogFilm gamma setting are very easy to grade and they let the colorist swing the image for different looks with ease. However, after my initial discussions with Ricardo, it was decided to do the final grade from the native camera raw files, so that we had the most control over the image, plus the ability to zoom in and reframe using the native 4K files as a source.

The dailies and editorial flow

My Fair Lidy was lensed with a 16 x 9 aspect ratio, with the REDs set to record 4096 x 2304 (at 23.98fps). In addition to a RED One and a healthy complement of grip, lighting and electrical gear, Valencia College owns several Final Cut Pro post systems and a Red Rocket accelerator card. With two REDs rolling most of the time, the latter was a godsend on this production. We had two workstations set up – one as the editor’s station with a large Maxx Digital storage array and the other as the assistant’s station, which housed the Red Rocket card. My two assistants (Kyle Prince and Frank Gould) handled all data back-up and conversion of the 4K RED files to 1920 x 1080 ProResHQ for editorial media. Using ProResHQ was probably overkill for cutting the film (any of the lower ProRes codecs would have been fine for editorial decisions), but this gave us the best possible image for any potential screenings, trailers, etc.

Redcine-X was our tool for .r3d media organization and conversion. All in-camera settings were left alone, except the gamma adjustment. The Red Rocket card handles the full-resolution debayering of the raw files, so conversion time is close to real time. The two stations were networked via AFP (Apple’s file-sharing protocol), which permitted the assistant to handle his tasks without slowing down the editor. In addition, the assistant would sync and merge the double-system, multi-track audio recordings and enter basic scene/take descriptions. Each shoot day had its own FCP project, so when done, project files and media (.r3d, ProRes and audio) were copied over to the editor’s Maxx array. Master clips from these daily FCP projects were then copied-and-pasted (and media relinked) into a single “master edit” FCP project.

For reasons of schedule and availability, I split the editing responsibilities with a second film editor, Patrick Tyler. My initial role was to bring the film to its first cut and then Patrick handled revisions with the producer and director. Once the picture was locked, I rejoined the project to cover final finishing and color grading. My Fair Lidy was on a very accelerated schedule, with sound design and music scoring running on a parallel track. In total, post took about 15 weeks from start to finish.

Finishing and grading

Since we didn’t use FCP’s Log and Transfer function or the in-camera QuickTime reference files as edit proxies, there was no easy way to get Apple Color to automatically relink clips to the original .r3d files. You can manually redirect Color to link to RED files, but this must be done one shot at a time – not exactly desirable for the 1300 or so shots in the film.

The recommended workflow is to export an XML from FCP 7, which is then opened in Redcine-X. It will correctly reconnect to the .r3d files in place of the QuickTime movies. From there you export a new XML, which can be imported into Color. Voila! A Color timeline that matches the edit using the native camera files. Unfortunately for us, this is where reality came crashing in – literally. No matter what we did, using both XMLs and EDLs, everything that we attempted to import into Color crashed the application. We also tried ClipFinder, another free application designed for RED media. It didn’t crash Color, but a significant number of shots were incorrectly linked. I suspect some internal confusion because of the A and B camera situation.

On to Plan B. Since Redcine-X correctly links to the media and includes not only controls for the raw settings, but also a healthy toolset for primary color correction, then why not use it for part of the grading process? Follow that up with a pass through Color to establish the stylistic “look”. This ended up working extremely well for us. Here are the basic steps I followed.

Step 1. We broke the film into ten reels and exported an XML file for each reel from FCP 7.

Step 2. Each reel’s XML was imported into Redcine-X as a timeline. I changed all the camera color metadata for each shot to create a neutral look and to match shots to each other. I used RedColor (slightly more saturated than RedColor2) and RedGamma2 (not quite as flat as RedLogFilm), plus adjusted the color temp, tint and ISO values to get a neutral white balance and match the A and B camera angles. The intent was to bring the image “within the goalposts” of the histogram. Occasionally I would make minor exposure and contrast adjustments, but for the most part, I didn’t touch any of the other color controls.

My objective was to end up with a timeline that looked consistent but preserved dynamic range. Essentially that’s the same thing I would do as the first step using the primary tab within Color. The nice part about this is that once I matched the settings of the shots, the A and B cameras looked very consistent.

Step 3. Each timeline was exported from Redcine-X as a single ProResHQ file with these new settings baked in. We had moved the Red Rocket card into the primary workstation, so these 1920 x 1080 clips were rendered with full resolution debayering. As with the dailies, rendering time was largely real-time or somewhat slower. In this case, approximately 10-20 minutes per reel.

Step 4. I imported each rendered clip back into FCP and placed it onto video track two over the corresponding clips for that reel to check the conforming accuracy and sync. Using the “next edit” keystroke, I quickly stepped through the timeline and “razored” each edit point on the clip from Redcine-X. This may sound cumbersome, but only took a couple of minutes for each reel. Now I had an FCP sequence from a single media clip, but with each cut split as an edit point. Doing this creates “notches” that are used by the color correction software for cuts between corrections. That’s been the basis for all “tape-to-tape” color correction since DaVinci started doing it and the new Resolve software still includes a similar automatic scene detection function today.
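
The arithmetic behind those “notches” is just a running total of clip durations. Here is a minimal sketch; the clip lengths and the 24fps non-drop timecode base are illustrative assumptions.

```python
def cut_points(durations, fps=24):
    """Return each internal cut position as (frame, timecode), given a
    list of per-clip durations in frames on a single timeline."""
    cuts, total = [], 0
    for d in durations[:-1]:              # no cut after the final clip
        total += d
        ss, ff = divmod(total, fps)       # split frames into seconds + frames
        mm, ss = divmod(ss, 60)
        hh, mm = divmod(mm, 60)
        cuts.append((total, f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"))
    return cuts

# Three clips of 48, 24 and 96 frames yield cuts at frames 48 and 72
print(cut_points([48, 24, 96]))
```

Scene-detection tools in grading systems effectively recover this same list by analyzing the picture, whereas here the edit list supplies it directly.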

Step 5. I sent my newly “notched” timeline to Color and graded as I normally would. By using the Redcine-X step as a “pre-grade”, I had done the same thing to the image as I would have done using the RED tab within Color, thus keeping with the plan to grade from the native camera raw files. I do believe the approach I took was faster and better than trying to do it all inside Color, because of the inefficiency of bouncing in and out of the RED tab in Color for each clip. Not to mention that Color really bogs down when working with 4K files, even with a Red Rocket card in place.

Step 6. The exception to this process was any shot that required a blow-up or repositioning. For these, I sent the ProRes file from dailies in place of the rendered shot from Redcine-X. In Color, I would then manually reconnect to the .r3d file and resize the shot in Color’s geometry room, thus using the file’s full 4K size to preserve resolution at 1080 for the blow-up.

Step 7. The last step was to render in Color and then “Send to FCP” to complete the roundtrip. In FCP, the reels were assembled into the full movie and then married to the mixed soundtrack for a finished film.

© 2011 Oliver Peters

Higher Ground

Timing is often everything when it comes to indie filmmaking. That’s certainly the case with Higher Ground, the directorial debut of Academy Award-nominated actress Vera Farmiga (Up In The Air, Source Code, Nothing But The Truth). The film, about the struggle and coexistence between faith and doubt, is inspired by Carolyn S. Briggs’ memoir, This Dark World. It features Farmiga in the lead role of Corinne Walker and follows her through three phases of her life. The film has appeared at the 2011 Sundance, Tribeca and Los Angeles Film Festivals and is currently in distribution through Sony Pictures Classics.

Successfully pulling off a highly-regarded, low budget feature is a challenge for anyone, but even more so, if you are the director, the lead actress and pregnant on top of that. Living in upstate New York, Farmiga happened to be ten minutes away from BCDF Pictures, a production company and facility built with the intent of facilitating indie feature film production. She decided to check them out as a possible production resource and quickly discovered a synergy that was ideal for Higher Ground. Although BCDF was prepping another film at the time, the decision was made to fast-track Higher Ground, in part to be able to film before Farmiga was too far along in her pregnancy. Within a couple of weeks, the film was in full production for a 28-day filming schedule during June 2010.

BCDF Pictures, situated in the upper Hudson River valley, is a mash-up between summer camp and the old Hollywood studio system. The founders also created a film fund, Strategic Motion Ventures, to finance the pictures produced by BCDF. They own RED One MX camera packages and the farmhouse-style facility is home to several edit suites and screening theaters, which makes it ideal for a filmmaking home base. For Higher Ground, BCDF supplied two RED packages to director of photography Michael McDonough. They also worked out various tests prior to the production that let the DoP establish a number of in-camera looks for the three time periods in the story.

Hitting the ground running

Higher Ground editor Colleen Sharp wasn’t hired until three weeks after the start of production. So, BCDF proceeded down a post production workflow path based on the assumption that the film would be edited using Apple Final Cut Pro, their primary in-house NLE platform. Head of post production Jeremy Newmark handled the one-light color correction for the RED camera dailies, transcoding them into ProRes QuickTime movies. By the time Sharp was on board, BCDF had already accumulated two-and-a-half weeks of dailies in the ProRes format.

According to Sharp, “I’ve cut one other film using Final Cut, but I feel more comfortable with [Avid] Media Composer. I suggested, if possible, it would be better if I could cut Higher Ground on an Avid, because I had to hit the ground running. Since I was starting three weeks after filming had begun, I needed to be as efficient as possible and that would be on a system that I was most comfortable with.” Of course, this added the dilemma of whether or not to re-transcode the RED files into a format native to Avid.

Good timing once again played a role. Avid had just released Media Composer version 5.0, which enabled the direct use of ProRes files through AMA (Avid Media Access), as well as limited third-party hardware support for monitoring. In addition to Final Cut systems, BCDF also owned an older Media Composer license. They were able to cost-effectively set up the Avid suite for Sharp by upgrading their older Avid software license and adding the Matrox MXO2 Mini for video output to the large screen in the edit suite.

Newmark explained, “I was concerned about whether I’d need to take the existing dailies and convert them again to DNxHD media for Colleen. I talked it over with a friend at PostWorks in New York and it seemed like using AMA would be viable. We proceeded down the road of using the ProRes files in the Avid and Colleen was able to cut the film entirely using linked AMA files. We never transcoded them into DNxHD and it worked well. Of course, at the beginning I still had the Plan B of converting everything again if the AMA idea didn’t work; but, I wanted to avoid this as it would have cost us extra time. Even though we own a Red Rocket card for fast transcoding, the crew was using two cameras the entire time and often recording very long performance takes. So, in two-and-a-half weeks, they’d already accumulated quite a large amount of footage.”

In the end, it worked better than expected for what was at that time a new software release. Higher Ground is likely the first feature film edited using strictly AMA-linked ProRes files. Thanks in part to the weak economy, the film company was able to secure off-hours packages for DI finishing in Los Angeles and sound editing and mixing at Sound One in New York. Newmark continued, “I was able to send the colorist [Adam Hawkey] an EDL and the trimmed .r3d RED camera files, as well as the looks that I’d established with the DoP. These were imported into a Nucoda system, which read the files perfectly, including the looks presets. Adam told us this worked seamlessly and gave him a great starting point to work from in grading the film. Michael [McDonough] supervised the grading over a five-day stretch.”

Anticipating the big challenges

I asked Colleen Sharp about editing challenges on the film. She replied, “The biggest challenge I’d anticipated turned out not to be an issue at all. That was working with a first-time director, who was also the lead actor. Vera was great to work with. She was new to the entire editing process and very intrigued by the possibilities. She was hands-on during the edit and very helpful. I normally work on a film during the shooting and complete an editor’s cut before I start working with the director. In this case, I wasn’t completely done with my cut before the production wrapped, so the last portion of this first cut was worked out with Vera’s involvement. They finished shooting just after the 4th of July weekend, but I didn’t have my first cut together until the third week in July. It was just under three hours long! We continued working at it until mid-October and ended up at the final length of 107 minutes. Naturally, with that much trimming, you have to lose some scenes that are painful to cut, but that’s all part of the process.”

“I’m glad to say that none of Vera’s decisions were ever based on vanity. They were only about the best performance and, with this cast, the performances were always good. One editing challenge was dealing with the number of children in the scenes. For instance, Vera’s sister Taissa plays Corinne in the younger scenes. She’d never acted before. So, you had Vera directing her sister, and she got a great performance out of her. Of course, as the editor, it’s my job to help get that performance on screen in a way that best represents the story.”

Naturally, whenever you have a lot of footage, the biggest challenge for the editor is wrestling with the sheer volume of material. Higher Ground shot about 14TB of RED footage, which translates into nearly 100 hours of raw material. Fortunately the story progresses in a linear fashion through the three periods of Corinne’s life – no parallel storylines or intercutting between different eras. To help manage the content, assistant editor Peter Saguto organized the ProRes files at the Finder level into folders based on scenes. This made sense for a Final Cut edit, and when it came time to move to Media Composer, most of this structure could be carried into Avid via AMA. As a result, Saguto didn’t have to start his logging over from scratch after the change of platforms.
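
Those figures pass a quick sanity check: 14TB spread over roughly 100 hours works out to a sustained data rate in the neighborhood of RED’s compressed raw recording (the exact REDCODE setting used on the shoot is an assumption here).

```python
terabytes = 14e12            # 14 TB of .r3d media, in decimal bytes
hours = 100                  # roughly 100 hours of raw material
rate_mb_s = terabytes / (hours * 3600) / 1e6

# ~38.9 MB/s average, close to REDCODE 36's nominal 36 MB/s
print(round(rate_mb_s, 1))
```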

In the end, the post production workflow proved to be very viable. Newmark said, “When we started this, a lot of the advice we received ended with ‘good luck – no one has ever done this before.’ I was impressed with the stability of the Avid system, compared with the Final Cut system that was being used at the same time on the other film going through BCDF.” In the future, BCDF intends to handle more films on the Avid system. Newmark continued, “We always want to let the decision be made by the cinematographers and editors whenever possible. We own RED camera packages, but we’ve also shot films with ARRI ALEXA and 35mm film depending on what’s the right approach for that film. I really think Avid is the best tool for feature film editing and I’m glad this experience worked so well. Of course, now when we have a RED show that we know will be cut on Media Composer, we transcode the RED media to DNxHD. Nevertheless, going ProRes on Higher Ground proved to be far more seamless than I would have expected.”

In its first year, BCDF Pictures produced four films: Higher Ground; Peace, Love, & Misunderstanding; The Last Keepers (formerly known as The Art of Love); and Rhymes with Bananas. They are currently in post production on Predisposed and Liberal Arts and in production on Bachelorette.

Written for DV Magazine (NewBay Media LLC)

©2011 Oliver Peters

Playing with Epic frames

As RED Digital Cinema moves beyond the RED One camera, post production folks will need to keep up with the changes in files mastered on these next-generation RED cameras. RED’s Epic camera is starting to make it into the production world in ever-increasing numbers, but to date, most NLEs on the market aren’t yet ready to accept these files. Adobe has been leading the charge, with Epic support available in Premiere Pro CS 5.5 and After Effects CS 5.5. To date, Premiere Pro is the only desktop NLE able to open media files and edit sequences using Epic frames at native sizes, such as 5120 x 2160 and 5120 x 2560.

I still advocate conversions prior to editing using RED’s free Redcine-X or The Foundry’s Storm and then editing in the NLE of your choice. If you want to start cutting straight from the camera raw Epic files, then today, Premiere Pro CS 5.5 is just about your only option. This could change with Final Cut Pro X, but we’ll have to wait and see. If you prefer Media Composer or FCP7, then for now you are limited to smaller frame sizes and only RED One files.

So far, my Epic testing has been purely experimental, with only a few test frames generously posted at RED User by Jarred Land and others. I haven’t really been able to check real-world performance – merely how the files work within Premiere Pro. To that end, I’ve focused on color manipulation. I feel there are two viable approaches to the workflow, when you are color correcting the raw files within an NLE like Premiere Pro.

Source clips set to REDcolor2/REDlogFilm

The first is to make all the color adjustments within the RED raw source settings pane. Here you can make all the raw-to-RGB adjustments, as well as subjectively adjusting curves, color balance, levels, etc. The second approach is to set a base level with the intent of doing all of your color grading using the regular NLE color correction tools, plug-ins and filters. From a standpoint of image quality, I don’t see much difference between color adjustments made within the source settings panel and those made in the timeline using standard color correction tools. With that in mind, I feel that the best workflow is the latter – use a basic raw setting that applies to all clips and then do your subjective grading in the standard environment.

One thing to point out is that Redcine-X and Storm update the .rmd (camera metadata looks) file when a clip is altered. You can use either of these applications to set the grading for a raw clip and then simply load that preset from the source settings pane in Premiere Pro or After Effects. By doing so, you can make color adjustments in Redcine-X or Storm and have those show up within the Adobe apps without any exports or renders.

The camera “look” that seems most conducive to a workflow where you grade after raw conversion is to use a flat setting that can easily be manipulated. In the newest Premiere Pro RED Importer source settings pane, this means using Color Version 2, a Color Space of REDcolor2 (or REDcolor – slightly more saturated) and a Gamma Curve of REDlogFilm. ISO, Kelvin and Tint should be adjusted to taste, but basically Kelvin/Tint should be set to a neutral white balance. An ISO value of 800 will tend to place the signal in the middle to middle-lower part of the histogram; however, experiment with the ISO setting for an optimal value. Now leave the other color controls alone.

By doing this you have effectively created an image that is very similar to the Log-C profile of an ARRI ALEXA or a scanned 35mm film negative. It provides a good neutral starting point for grading, which can be readily moved into a wide range of creative looks. In fact, this setting responds well to the built-in Cineon Converter, with a few tweaks.
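
For reference, the classic Cineon log-to-linear conversion that such an image responds to uses the standard Kodak constants: black at code 95, white at 685, 0.002 density per code value and a 0.6 display gamma. A sketch of the math:

```python
def cineon_to_linear(code, black=95, white=685):
    """Classic Cineon 10-bit log-to-linear conversion, scaled so the
    black point maps to 0.0 and the white point maps to 1.0."""
    # 0.002 density per code value divided by 0.6 display gamma
    x_black = 10 ** ((black - white) * 0.002 / 0.6)
    gain = 1.0 / (1.0 - x_black)
    return gain * 10 ** ((code - white) * 0.002 / 0.6) - gain * x_black

print(cineon_to_linear(95))   # black point -> 0.0
print(cineon_to_linear(685))  # white point -> ~1.0
```

This is why a flat, log-type RED setting "responds well" to a Cineon converter: the encoding is shaped like a scanned film negative.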

One of the biggest advantages of working this way is that you can stay within the world of all your familiar tools. Premiere Pro CS 5.5 has become much more responsive to third-party plug-ins. I’ve found that common filters like Magic Bullet Looks, Colorista II, Mojo and GenArts’ Sapphire have much-improved responsiveness compared with earlier versions. As such, it’s quite viable to grade an entire project within a Premiere Pro timeline without bouncing over to After Effects or relying on a dedicated grading application like DaVinci Resolve. In short, drop your native Epic clips into a Premiere Pro project, set the clip source settings to a neutral preset and then adjust the clips on the timeline by using the standard and/or third-party filters.

I’ve become particularly fond of the Sapphire plug-ins. Now that they work rather well inside Premiere Pro, you can quickly develop “looks” by building up a stack of filters. For instance, in one of these examples, the combination of HueSatBright, Gamma, FilmEffect, BleachBypass and GlowDarks filters results in a very rich grade. Likewise, the Epic files respond nicely to Colorista II and Magic Bullet Looks.

This is, of course, only one of many ways to work. The outlined workflow is designed to appeal to the editor who wants to work inside the NLE as much as possible. Adobe has now made it possible for Premiere Pro editors to have a viable solution when dealing with RED Epic footage. I’m sure other companies will also get up to speed, but for now Adobe is leading the pack.

Some grading examples using MB Looks

Some grading examples using MB Colorista II

Some grading examples using GenArts Sapphire filters

A grading example using MB Mojo + Sapphire

Some grading examples using the Cineon converter

©2011 Oliver Peters

Avid DS shines for Metric

With all of the Media Composer 5 news, it might be easy to miss Avid’s latest update for its flagship system, Avid DS. Version 10.3.1 (see addendum below), released in mid-July, is a small point release that introduced two huge features – improved stereoscopic 3D control and support for RED Digital Cinema’s new “color science” and the Mysterium-X sensor. The new RED capabilities are showcased in the “All Yours” music video by the band Metric. It’s the official music video for The Twilight Saga: Eclipse, which featured the track under the end credits.

I spoke with Dermot Shane, a Vancouver-based VFX/DI supervisor who specializes in using Avid DS. Shane was working with 10.3.1 (in beta) when he got the call to handle finishing for “All Yours” (directed by Brantley Gutierrez). According to Shane, “The schedule on this was very tight and changes were being made up until the last minute. That’s because the video integrates clips from the movie and there had been a few last minute changes to the cut. In fact, we ended up getting one of these clips FTP’ed to us just in time for the deadline!” The production company for Metric shot the music video scenes using a RED One with the updated Mysterium-X sensor, which offers improved dynamic range. The newest RED software also improves how the camera raw files are converted into color information. These latest RED software updates have been integrated into the RED SDK used in Avid DS 10.3.1.

Shane described the workflow on this project. “The production company had cut the offline edit on [Apple] Final Cut Pro and provided us with an EDL. Avid DS can take this EDL and relink to the original R3D camera files, which gives me direct access to the raw data from the camera files by way of RED’s SDK. It’s an easy matter to scale the images for HD and to alter any of the looks of the images, based on changes that the director might want. Because these changes are made from the camera raw files, color grading is far cleaner than if I only had a flat image to start from. Once this is adjusted, I can cache the media into the DS and everything is real-time. On this project, the caches were working in 10-bit YUV high-def, and the master was rendered directly from the RED MX files. I probably changed the color information on all but three of the 162 clips in the music video.”

The new RED Mysterium-X support came in handy on this project. Shane continued, “The new sensor is much more sensitive and Avid DS 10.3.1 let me take advantage of this. For instance, I could create three versions of a clip all linked to the same R3D file. In each of these versions, I would create a different color setting using the RED source setting controls inside Avid DS. One clip might be adjusted for the best shadow detail, another for the midrange and a third to preserve the highlights. These would then be composited into a single shot using the standard DS keyers and masks. The final image almost looks like a high dynamic range image. This is something you can’t do through standard grading techniques when the camera image has a ‘baked in’ look. It really shows the advantage of working with camera raw files.”

And what is the best thing about this new Avid DS release? “Stability,” answered Shane. “We worked around the clock for three or four days without a hiccup. That’s hard to sell people on up front, but it really matters when you are in a crunch. On this project, we literally finished about 20 minutes before the deadline. My client really appreciated the integrated environment that DS offers. Their previous projects had gone from Final Cut to a Smoke finish and a Lustre grade. These are very capable Autodesk finishing systems, but Avid DS is a complete finishing solution. You can do editing, effects and color grading all in one workstation. This makes it a lot better for the client, especially when last minute changes are made during the color correction pass.”

Stereo 3D tools have been enhanced in DS 10.3.1. Convergence tools now allow independent adjustment of 3D content for each eye. There is also real-time playback of stereoscopic containers and effects. Although “All Yours” wasn’t a stereo 3D project, I asked Shane about the new 3D tools. He replied, “So far I’ve only had a chance to do some testing with the new tools. In previous versions, I would have to go out to [The Foundry’s] Nuke and use Ocula for stereo 3D work. Our DS has the Furnace plug-in set, which includes some stereoscopic tools. With Avid DS 10.3.1, I can complete one eye, apply the same grading to the other eye, adjust the convergence and then use one of the Furnace plug-ins to tweak the minor grading differences between the left and right eye views.”

Addendum: This article was originally written prior to the 2010 IBC exhibition in Amsterdam. At that conference, Avid announced the release of Avid DS 10.5, which will be available both as a full-featured software-only version and as a turnkey solution. The software version will be available for under $10K and comes bundled with a copy of Avid Media Composer 5. Some of the features in DS 10.5 – available for the first time in a software version – include full 2K playback and REDRocket accelerator support. In addition, the software has been ported to the Windows 7 64-bit OS, making it one of the most powerful editing/VFX/grading solutions for the PC platform.

Written for Videography and DV magazines (NewBay Media LLC)

©2010 Oliver Peters