The Handmaid’s Tale

With tons of broadcast, web, and set-top outlets for dramatic television, there’s a greater opportunity than ever for American audiences to be exposed to excellent productions created outside of Hollywood or New York. Some of the most interesting series come out of Canada from a handful of production vendors. One such company is Take 5 Productions, which has worked on co-productions such as Vikings, American Gothic, and Penny Dreadful. One of its newest offerings is The Handmaid’s Tale, currently airing in ten hourlong episodes on Hulu, as well as being distributed internationally through MGM.

The Handmaid’s Tale is based on Margaret Atwood’s 1985 dystopian novel. It’s set in New England in the near future, when an authoritarian theocracy has overthrown the United States government and replaced it with the Republic of Gilead. Births have declined due to pollution and disease, so a class of women deemed fertile (the handmaids) are kept by the ruling class (the Commanders) as concubines for the purpose of bearing their children. This disturbing series, with its nods to Nazi Germany and life behind the Iron Curtain, not to mention Orwell and Kubrick, stars Elisabeth Moss (Mad Men, The One I Love, Girl, Interrupted) as Offred, one of the handmaids, as she tries to survive her new reality.

The visual tone and style of The Handmaid’s Tale was set by cinematographer-turned-director Reed Morano (Frozen River, Meadowland, The Skeleton Twins). She helmed three of the episodes, including the pilot. As with many television series, a few editors traded off the cutting duties. For this series, Julian Clarke (Deadpool, Chappie, Elysium) started the pilot, but it was wrapped up by Wendy Hallam Martin (Queer As Folk, The Tudors, The Borgias). Hallam Martin and Christopher Donaldson (Penny Dreadful, Vikings, The Right Kind of Wrong) alternated episodes through the rest of the series, with one episode cut by Aaron Marshall (Vikings, Penny Dreadful, Warrior).

Cutting a dystopian future

I recently spoke with Wendy Hallam Martin about this series and working in the Toronto television scene. She says, “As a Canadian editor, I’ve been lucky to work on some of the bigger shows. I’ve done a lot of Showtime projects, but Queer As Folk was really the first big show for me. With the interest of outlets like Netflix and Hulu, budgets have increased and Canadian TV has had a chance to produce better shows, especially the co-productions. I started on The Handmaid’s Tale with the pilot, which was the first episode. Julian [Clarke] started out cutting the pilot, but had to leave due to his schedule, so I took over. After the pilot was shot (with more scenes to come), the crew took a short break. Reed [Morano] was able to start her director’s cut before she shot episodes two and three to set the tone. The pilot didn’t lock until halfway through the season.”

One might think a mini-series that doesn’t run on a broadcast network would have a more relaxed production and post schedule, akin to a feature film. Not so with The Handmaid’s Tale, which was produced and delivered on a schedule much like any other dramatic television series. Episodes were shot in blocks of two at a time, with eight days allotted per episode. The editor’s assembly was due five days later, followed by two weeks working with the director on a director’s cut. Subsequent changes from Hulu and MGM notes resulted in a locked cut three months after the first day of production for those two episodes. Finally, it took three days to color grade and about a month for the sound edit and mix.

Take 5 has its own in-house visual effects department, which handles simple VFX, like wire removals, changing closed eyes to open, and so on. A few of the more complex VFX shots are sent to outside vendors. The episodes average about 40 VFX shots each; the season finale, however, had 70 effects shots in one scene alone.

Tackling the workload

Hallam Martin explained how they dealt with the post schedule. She continues, “We had two editors handling the shows, so there was always some overlap. You might be cutting one show while the next one was being assembled. This season we had a first and second assistant editor. The second would deal with the dailies and the first would be handling visual effects hand-offs, building up sound effects, and so on. For the next season we’ll have two firsts and one second assistant, due to the load. Reed was very hands-on and wanted full, finished tracks of audio. There were always 24 tracks of sound on my timelines. I usually handle my own temp sound design, but because of the schedule, I handed that off to my first assistant. I would finish a scene and then turn it over to her while I moved on to the next scene.”

The Handmaid’s Tale has a very distinctive look for its visual style. Much of the footage carries a strong orange-and-teal grade. The series is shot with an ARRI ALEXA Mini in 4K (UHD). The DIT on set applies a basic look to the dailies, which are then turned into Avid DNxHD36 media files by Deluxe in Toronto to be delivered to the editors at Take 5. Final color correction is handled from the 4K originals by Deluxe under the supervision of the series director of photography, Colin Watkinson (Wonder Woman, Entourage, The Fall). A 4K (UHD) high dynamic range master is delivered to Hulu, although currently only standard dynamic range is streamed through the service. Hallam Martin adds, “Reed had created an extensive ‘look book’ for the show. It nailed what [series creator] Bruce Miller was looking for. That, combined with her interview, is why the executive producers hired her. It set the style for the series.”

Another departure from network television is that episodes do not have a specific duration that they must meet. Hallam Martin explains, “Hulu doesn’t dictate exact lengths like 58:30, but they did want the episodes to be under an hour long. Our episodes range from about 50 to 59 minutes. 98% of the scenes make it into an episode, but sometimes you do have to cut for time. I had one episode that was 72 minutes, which we left that long for the director’s cut. For the final version, the producers told me to ‘go to town’ in order to pace it up and get it under an hour. This show had a lot of traveling, so through the usual trimming, but also a lot of jump cuts for the passage of time, I was able to get it down. Ironically the longest show ended up being the shortest.”

Adam Taylor (Before I Fall, Meadowland, Never a Neverland) was the series composer, but during the pilot edit, Morano and Hallam Martin had to set the style. Hallam Martin says, “For the first three episodes, we pulled a lot of sources from other film scores to set the style. Also a lot of Trent Reznor stuff. This gave Adam an idea of what direction to take. Of course, after he scored the initial episodes, we could use those tracks as temp for the next episodes, and as more episodes were completed, that increased the available temp library we had to work with.”

Post feelings

Story points in The Handmaid’s Tale are often revealed through flashbacks and Moss’ voice over. Naturally, voice over pieces affect the timing of both the acting and the edit. I asked Hallam Martin how this was addressed. She says, “The voice over was recorded after the fact. Lizzie Moss would memorize the VO and act with that in mind. I would have my assistant do a guide track for cutting and when we finally received Lizzie’s, we would just drop it in. These usually took very little adjustment thanks to her preparation while shooting. She’s a total pro.” The story focuses on many ideas that are tough to accept and watch at times. Hallam Martin comments, “Some of the subject matter is hard and some of the scenes stick with you. It can be emotionally hard to watch and cut, because it feels so real!”

Wendy Hallam Martin uses Avid Media Composer for these shows and I asked her about editing style. She comments, “I watch all the dailies from top to bottom, but I don’t use ScriptSync. I will arrange my bins in the frame view with a representative thumbnail for each take. This way I can quickly see what my coverage is. I like to go from the gut, based on my reaction to the take. Usually I’ll cut a scene first and then compare it against the script notes and paperwork to make sure I haven’t overlooked anything that was noted on set.” In wrapping up, we talked about films versus TV projects. Hallam Martin says, “I have done some smaller features and movies-of-the-week, but I like the faster pace of TV shows. Of course, if I were asked to cut a film in LA, I’d definitely consider it, but the lifestyle and work here in Toronto is great.”

(Here’s an updated interview with the editors by Steve Hullfish during season 2.)

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters

Fear the Walking Dead


When the AMC cable network decided to amp up the zombie genre with The Walking Dead, it resulted in a huge hit. Building upon that success, they’ve created a new series that could be viewed as a companion story, albeit without any overlapping characters. Fear the Walking Dead is a new, six-episode series that starts its first season on August 23. The story takes place across the country in Los Angeles and, chronologically, just before the outbreak depicted in the original series. The Walking Dead was based on Robert Kirkman’s graphic novels of the same name, and he has been involved in both versions as executive producer.

Unlike the original series, which was shot on 16mm film, Fear the Walking Dead is being shot digitally with ARRI ALEXA cameras and anamorphic lenses. That’s in an effort to separate the two visual styles, while maintaining a cinematic quality to the new series. I recently spoke with Tad Dennis, the editor of two of the six episodes in season one, about the production.

Tad Dennis started his editing career as an assistant editor on reality TV shows. He says, “I started in reality TV and then got the bump-up to full-time editing (Extreme Makeover: Home Edition, America’s Next Top Model, The Voice). However, I realized my passion was elsewhere and made the shift to scripted television. I started there again as an assistant and then was bumped back up to editing (Fairly Legal, Manhattan, Parenthood). Both types of shows really do have a different workflow, so when I shifted to scripted TV, it was good to start back as an assistant. That let me be very grounded in the process.”

Creating a new show with a shared concept

Dennis started with these thoughts on the new show, “We think of this series as more of a companion show to the other and not necessarily a spin-off or prequel. The producers went with different cameras and lenses for a singular visual aesthetic, which affects the style. In trying to make it more ‘cinematic’, I tend to linger on wider shots and make more selective use of tight facial close-ups. However, the material really has to dictate the cut.”

Three editors and three assistant editors work on the Fear the Walking Dead series, with each editor/assistant team cutting two of the six shows of season one. They are all working on Avid Media Composer systems connected to an Avid ISIS shared storage solution. Scenes were shot in both Vancouver and Los Angeles, but the editing teams were based in Los Angeles. ALEXA camera media was sent to Encore Vancouver or Encore Hollywood, depending on the shooting location. Encore staff synced sound and provided the editors with Avid DNxHD editorial media. The final color correction, conform, and finishing were also handled at Encore Hollywood.

Dennis described how post on this show differed from other network shows he’s worked on in the past. He says, “With this series, everything was shot and locked for the whole season by the first airdate. On other series, the first few shows will be locked, but then for the rest of the season, it’s a regular schedule of locking a new show each week until the end of the season. This first season was shot in two chunks for all six episodes – the Vancouver settings and then the Los Angeles scenes. We posted everything for the Vancouver scenes and left holes for the LA parts. The shows went all the way through director cuts, producer cuts, and network notes with these missing sections. Then when the LA portions came in, those scenes were edited and incorporated. This process was driven by the schedule. Although we didn’t have the pressure of a weekly airdate, the schedule was definitely tight.” Each of the editors had approximately three to four days to complete their cut of an episode after receiving the last footage. Then the directors got another four days for a director’s cut.

Often films and television shows go through adjustments as they move from script to actual production and ultimately the edit. Dennis feels this is more true of the first few shows of a new series than of an established one. He explains, “With a new series, you are still trying to establish the style. Often you’ll rethink things in the edit. As I went through the scenes, performances that were coming across as too ‘light’ had to be given more ‘weight’. In our story, the world is falling apart and we wanted every character to feel that all the way throughout the show. If a performance didn’t convey a sense of that, then I’d make changes in the takes used or mix takes, where picture might be better on one and audio better on the other.”

Structure and polish in post

In spite of the tight schedule, the editors still had to deal with a wealth of footage. Typical of most hourlong dramas, Fear the Walking Dead is shot with two or three cameras. For very specific moments, the director would have some of the footage shot at 48fps. In those cases, where cameras ran at different speeds, Dennis would treat them as separate clips. When cameras ran at the same speed, such as in dialogue scenes (for example, at 24fps for sync sound), Susan Vinci (assistant editor) would group the clips as multicam clips. He explains, “The director really determines the quality of the coverage. I’d often get really necessary options on both cameras that weren’t duplicated otherwise. So for these shows, it helped. Typically this meant three to four hours of raw footage each day. My routine is to first review the multicam clips in a split view. This gives me a sense of what the coverage is that I have for the scene. Then I’ll go back and review each take separately to judge performance.”

Dennis feels that sound is critical to his creative editing process. He continues, “Sound is very important to the world of Fear the Walking Dead. Certain characters have a soundscape that’s always associated with them and these decisions are all driven by editorial. The producers want to hear a rough cut that’s as close to airable as possible, so I spend a lot of time with sound design. Given the tight schedule on this show, I would hand off a lot of this to my long-time assistant, Susan. The sound design that we do in the edit becomes a template for our sound designer. He takes that, plus our spotting notes, and replaces, improves, and enhances the work we’ve done. The show’s music composer also supplied us with a temp library of past music he’d composed for other productions. We were able to use these as part of our template. Of course, he would provide the final score customized to the episode. This score would be based on our template, the feelings of the director, and of course the composer’s own input for what best suited each show.”

Dennis is an unabashed Avid Media Composer proponent. He says, “Over the past few years, the manufacturers have pushed to consolidate many tools from different applications. Avid has added a number of Pro Tools features into Media Composer and that’s been really good for editors. There are many tools I rely on, such as those audio tools. I use the Audiosuite and RTAS filters in all of my editing. I like dialogue to sound as it would in a live environment, so I’ll use the reverb filters. In some cases, I’ll pitch-shift audio a bit lower. Other tools I’ll use include speed-ramping and invisible split-screens, but the trim tool is what defines the system for me. When I’m refining a cut, the trim tool is like playing a precise instrument, not just using a piece of software.”

Dennis offered these parting suggestions for young editors starting out. “If you want to work in film and television editing, learn Media Composer inside and out. The dominant tool might be Final Cut or Premiere Pro in some markets, but here in Hollywood, it’s largely Avid. Spend as much time as possible learning the system, because it’s the most in-demand tool for our craft.”

Originally written for Digital Video magazine / CreativePlanetNetwork

©2015 Oliver Peters

More 4K


I’ve talked about 4K before (here, here and here), but I’ve recently done some more 4K jobs that have me thinking again. 4K means different things to different people. In terms of dimensions, there’s the issue of cinema 4K (4096 pixels wide) versus the UltraHD/QuadHD/4K 16:9 (whatever you want to call it) version of 4K (3840 pixels wide). That really doesn’t make a lot of difference, because these are close enough to be treated as the same. There’s so much hype around it, though, that you really have to wonder if it’s “the Emperor’s new clothes”.

First of all, 4K as a marketing term is not a resolution; it’s a frame dimension. As such, 4K is not four times the resolution of HD. That’s a measurement of area, not resolution. True resolution is usually measured in the vertical direction, based on the ability to resolve fine detail (regardless of the number of pixels), and therefore 4K is only twice the resolution of HD at best. 4K is also not sharpness, which is a human perception affected by many things, such as lens quality, contrast, motion, and grading. It’s worth watching Mark Schubin’s excellent webinar on the topic to get a clearer understanding of this. There’s also a very good discussion among top DoPs here about 4K, lighting, high dynamic range and more.

A lot of arguments have been made that 4K cameras using a single CMOS sensor with a Bayer-style color filter pattern don’t even deliver the resolution they claim. The reason is that in many designs 50% of the photosites are green versus 25% each for red and blue. Green is used for luminance, which determines detail, so you do not have a 1:1 relationship between green samples and the stated frame resolution of the sensor. That’s in part why RED developed 5K and 6K sensors, and it’s why Sony uses an 8K sensor (F65) to deliver a 4K image.
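To put rough numbers on this, here’s a trivial sketch (my own Python illustration; the 4096 x 2160 grid is a generic cinema-4K photosite count, not any specific camera):

```python
# Why a Bayer sensor's luma detail lags its photosite count: in each
# 2x2 Bayer block, two photosites are green, one is red, one is blue.
width, height = 4096, 2160       # illustrative "4K" photosite grid
total = width * height

green = total // 2               # green carries most of the luma detail
red = blue = total // 4

print(f"total photosites:     {total:,}")
print(f"green (luma) samples: {green:,}")   # roughly half the headline count
```

The demosaicking process interpolates the missing samples, which is exactly why the delivered luma detail falls short of the headline number.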

The perceived image quality is also not all about total pixels. The pixels of the sensor, called photosites, are the light-receiving elements of the sensor. There’s a loose correlation between pixel size and light sensitivity. For any given sensor of a certain physical dimension, you can design it with a lot of small pixels or with fewer, but larger, pixels. This roughly correlates to a sensor that’s of high resolution, but a smaller dynamic range (many small pixels) or one with lower resolution, but a higher dynamic range (large, but fewer pixels). Although the equation isn’t nearly this simplistic, since a lot of color science and “secret sauce” goes into optimizing a sensor’s design, you can certainly see this play out in the marketing battles between the RED and ARRI camps. In the case of the ALEXA, ARRI adds some on-the-sensor filtering, which results in a softer image that gives it a characteristic filmic quality.

Why do you use 4K?

With 4K there are two possible avenues. The first is to shoot 4K for the purpose of reframing and repositioning within HD and 2K timelines. Reframing isn’t a new production idea. When everyone shot on film, some telecine devices, like the Rank Cintel Mark III, sported zoom boards that permitted an optical blow-up of the 35mm negative. You could zoom in for a close-up in transfer that didn’t cost you resolution. Many videographers shoot 1080 for a 720 finish, as this allows a nice margin for reframing in post. The second is to deliver a final 4K product. Obviously, if your intent is the latter, then you can’t count on the techniques of the former in post.
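A quick back-of-the-envelope way to express that reframing margin (a hypothetical helper of my own, not a feature of any NLE):

```python
def reframe_headroom(acq_width: int, finish_width: int) -> float:
    """Maximum punch-in (as a percent of the finished frame) before you
    drop below a 1:1 pixel mapping at the finishing size."""
    return acq_width / finish_width * 100

print(reframe_headroom(3840, 1920))   # UHD acquisition, HD finish -> 200.0
print(reframe_headroom(1920, 1280))   # 1080 acquisition, 720 finish -> 150.0
```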

When you shoot 4K for HD post, workflow is an issue. Do you shoot everything in 4K or just the items you know you’ll want to reframe? How will this cut with HD and 2K content? That’s where it gets dicey, because some NLEs have good 4K workflows and others don’t. But it’s here that I contend you are getting less than meets the eye, so to speak. I have run into plenty of editors who have dropped a 4K clip into an HD timeline and then blown it up, thinking that they are really cropping into the native 4K frame and maintaining resolution. Depending on the NLE and the settings used, often they are simply blowing up an HD shot. The NLE scaled the 4K to HD first and then expanded the downscaled HD image. It didn’t crop into the actual 4K native resolution. So you have to be careful. And guess what, if the blow-up isn’t that extreme, it may not look much different than the crop.
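If you want to convince yourself of the difference, here’s a rough sketch using the Python Pillow library. The file name is hypothetical, and the Lanczos filter simply stands in for whatever resampler a given NLE actually uses:

```python
from PIL import Image

src = Image.open("clip_4k_frame.png")       # assume a 3840x2160 frame

# What editors think a punch-in does: crop into the native 4K pixels.
left, top = 960, 540                        # center 1920x1080 region
native_crop = src.crop((left, top, left + 1920, top + 1080))

# What some NLEs actually do: conform to HD first, then blow up 200%.
hd = src.resize((1920, 1080), Image.LANCZOS)
rescaled = hd.resize((3840, 2160), Image.LANCZOS)
fake_crop = rescaled.crop((left, top, left + 1920, top + 1080))

native_crop.save("crop_native.png")         # true 1:1 4K pixels
fake_crop.save("crop_after_downscale.png")  # upscaled HD pixels
```

Compare the two output files side by side and the second will be visibly softer on fine detail, even though the framing is identical.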

One thing to remember is that a 4K image that is scaled to fit into an HD timeline gains the benefits of oversampling. The result in HD will be very sharp and, in fact, will generally look better perceptually than the exact same image natively shot at an HD size. When you crop into the native image, you lose some of that oversampling effect. A 1:1 pixel relationship is the same effective image size as a 200% blow-up. Of course, it’s not the same result. When you compare the oversampled “wide shot” (4K scaled to HD) to the “close-up” (native 4K crop), the close-up will often look softer. You’ll see defects of the image, like chromatic aberration in the lens, missed critical focus and sensor noise. Instead, if you shoot a wide and then an actual close-up, that result will usually look better.

On the other hand, if you blow up the 4K-to-HD or a native HD shot, you’ll typically see a result that looks pretty good. That’s because there’s often a lot more information there than monitors or the eye can detect. In my experience, you can commonly get away with a blow-up in the range of 120% of the original image size and in some cases, as much as 150%.

To scale or not to scale

Let me point out that I’m not saying a native 4K shot doesn’t look good. It does, but often the associated workflow hassles aren’t worth it. For example, take a typical 1080p 50″ Panasonic plasma that’s often used as a client monitor in edit suites. You or your client may be sitting seven to ten feet away from it, which is closer than most people sit in a living room with a screen of that size. If I show a client the native image (4K at 1:1 in an HD timeline) compared with a separate HD image at the same framing, it’s unlikely that they’ll see a difference. Another test is to take two identical images – one native HD and the other 4K. Scale up the HD and crop down the 4K to match. In theory, the 4K should look better and sharper. In fact, sitting back on the client sofa, most won’t see a difference. It’s only when they step to about five feet in front of the monitor that a difference is obvious, and then only when looking at fine detail within the shot.

Not all scaling is equal. I’ve talked a lot about how 4K scaled to HD compares, but the results depend on the scaler that you use. For a quick shot, sure, use what your NLE has built in. For more critical operations, you might want to scale images separately. DaVinci Resolve has excellent built-in scaling and lets you pick from smooth, sharp, and bilinear algorithms. If you want a plug-in, the best I’ve found is the new Red Giant Instant 4K filter. It’s a variation of their Instant HD plug-in and works in After Effects and Premiere Pro. There are a lot of quality tweaks and, naturally, the higher the quality, the longer the render will be. Nevertheless, it offers outstanding results, and in one test that I ran, it actually provided a better look within portions of the image than the native 4K shot.
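To see how much the resampling algorithm alone matters, here’s a small Pillow sketch along the same lines (the source file name is hypothetical, and these three filters merely stand in for the smooth/sharp options a grading app exposes):

```python
from PIL import Image

src = Image.open("hd_frame.png")            # assume a 1920x1080 source

for name, flt in [("bilinear", Image.BILINEAR),
                  ("bicubic", Image.BICUBIC),
                  ("lanczos", Image.LANCZOS)]:
    src.resize((3840, 2160), flt).save(f"upscale_{name}.png")
# Sharper kernels (bicubic, Lanczos) hold edge detail noticeably better
# on a 200% blow-up than plain bilinear interpolation.
```

Dedicated plug-ins like Instant 4K go further, layering detail-enhancement controls on top of the basic resampling pass.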

In that case, it was a C500 shot of a woman on a park bench with a name badge. I had three identical versions of the shot (not counting the raw files) – the converted 4K ProRes4444 file, a converted 1080 ProRes4444 “proxy” file for editing, and the in-camera 1080 Canon XF file. I blew up the two 1080 shots using Instant 4K and cropped the 4K shot so all were of equal framing. When I compared the native 4K shot to the expanded 1080 ProRes4444 shot, the woman’s hair was sharper in the 1080 blow-up, but the letters on the name badge were better in the original. The 1080 Canon XF blow-up was softer in both areas. I think this shows that some of the controls in the plug-in may give you superior results to the original (crisper hair); but a blow-up suffers when you are using a lesser codec, like Canon’s XF (50 Mbps 4:2:2). It’s fine for native HD, but the ProRes4444 codec has twice the chroma resolution and less compression, which makes a difference when scaling an image larger. Remember, all of this pertains to viewing the image in HD.

4K deliverables

So what about working in native 4K for a 4K deliverable? That certainly has validity for high-resolution projects (films, concerts, large corporate presentations), but I’m less of a believer for television and web viewing. I’d rather have “better” pixels and not simply “more” pixels. Most of the content you watch at theaters using digital projection is 2K playback. Sometimes the master for that DCP was HD, 2K or 4K. If you are in a Sony 4K projector-equipped theater, most of the time it’s simply the projector upscaling the content to 4K as part of the projection. Even though you may see a Sony 4K logo at the head of the trailers, you aren’t watching 4K content – definitely not if it’s a stereo3D film. Yet much of this looks pretty good, doesn’t it?

Everything I said about blowing up HD by 120% or more still applies to 4K. Need to blow up a shot a bit in a 4K timeline? Go ahead, it will look fine. I think ARRI has proven this as well, taking films shot with the ALEXA all the way up to IMAX. In fact, ARRI just announced that the AMIRA will get in-camera, on-the-fly upscaling of its image, with the ability to record 4K (3840 x 2160 at up to 60fps) on the CFast 2.0 cards. They can do this because the sensor starts with more pixels than HD or 2K. The AMIRA will expose all of the available photosites (about 3.4K sensor pixels across) in what they call the “open gate” method. This image is lightly cropped to 3.2K and then scaled by a factor of 1.2, which results in UltraHD 4K recording on the same hardware. Pretty neat trick, and judging by ARRI’s image quality, I’ll bet it will look very good. Doubling down on this technique, the ALEXA XT models will also be able to record ProRes media at this 3.2K size. In the case of the ALEXA, the designers have opted to leave the upscaling to post, rather than do it in-camera.
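The arithmetic is easy to verify (the open-gate width below is approximate):

```python
open_gate = 3414        # approximate "open gate" photosite width (~3.4K)
cropped = 3200          # lightly cropped to 3.2K
print(cropped * 1.2)    # 3840.0 -> exactly the UltraHD frame width
```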

To conclude, if you are working in 4K today, then by all means continue to do so. It’s a great medium with a lot of creative benefits. If you aren’t working in 4K, then don’t sweat it. You won’t be left behind for a while, and there are plenty of techniques to get you to the same end goal as much of the 4K production that’s going on.


©2014 Oliver Peters

Amira Color Tool and your NLE

I was recently alerted to the new Amira Color Tool by Michael Phillips’ 24p blog. This is a lightweight ARRI software application designed to create custom in-camera looks for the Amira camera. You do this by creating custom color look-up tables (LUTs). The Amira Color Tool is available as a free download from the ARRI website (free registration required). Although the application is designed for the camera, you can also export looks in a variety of LUT file formats, which, in turn, may be installed and applied to footage in a number of different editing and color correction applications. I tested this in both Apple Final Cut Pro X and Avid Media Composer | Software (v8) with good results.

The Amira Color Tool is designed to correct log-C encoded footage with a straight Rec709 offset or with a custom look. ARRI offers some very good instructions, white papers, sample looks, and tutorials that cover the operation of this software. The signal flow goes from the log-C image, to the Rec709 correction, and then to the CDL-based color correction. To my eye, the math appears to be floating point, because a Rec709 conversion that throws a shot into clipping can be pulled back out of clipping in the look tab, using the CDL color correction tools. Therefore, it is possible to use this tool for shots other than ARRI Amira or Alexa log-C footage, as long as the image is sufficiently flat.

The CDL correction tools are based on slope, offset, and power. In that model, slope is equivalent to gain, offset to lift, and power to gamma. In addition to color wheels, there’s a second video-look parameters tab with hue intensities for the six main vectors (red, yellow, green, cyan, blue, and magenta). The Amira Color Tool is Mac-only and opened both QuickTime and DPX files among the clips I tested. It worked successfully with clips shot on an Alexa (log-C), a Blackmagic Cinema Camera (BMD Film profile), a Sony F3 (S-log) and a Canon 1DC (4K Canon-log). Remember that the software is designed to correct flat, log-C images, so you probably don’t want to use it with images that were already encoded with vibrant Rec709 colors.
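For reference, the standard ASC CDL transfer function behind slope, offset, and power is simple enough to sketch in a few lines. This is a generic NumPy illustration of the published CDL math, not ARRI’s actual implementation:

```python
import numpy as np

def apply_cdl(rgb, slope, offset, power):
    """Per-channel ASC CDL: out = (in * slope + offset) ** power.
    rgb is a float array normalized to 0-1."""
    out = np.asarray(rgb) * np.asarray(slope) + np.asarray(offset)
    out = np.clip(out, 0.0, None)   # a negative base has no defined power
    return out ** np.asarray(power)

pixel = np.array([0.18, 0.18, 0.18])    # mid-grey test value
print(apply_cdl(pixel, slope=(1.1, 1.0, 0.9),
                offset=(0.0, 0.01, -0.01),
                power=(1.0, 1.0, 1.2)))
```

Note that values pushed above 1.0 by slope aren’t clamped away here, which matches the floating-point behavior described above: a later stage can still pull them back out of clipping.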

FCP X

To use the Amira Color Tool, import your clip from the application’s file browser, set the look, and export a 3D LUT in the appropriate format. I used the DaVinci Resolve setting, which creates a 3D LUT as a .cube format file. To get this into FCP X, you need to buy and install a LUT filter, like Color Grading Central’s LUT Utility. To install a new LUT there, open the LUT Utility pane in System Preferences, click the “+” symbol and navigate to where the file was saved. In FCP X, apply the LUT Utility to the clip as a filter. From the filter’s pulldown selection in the inspector, choose the new LUT that you’ve created and installed. One caveat is to be careful with ARRI files. Any files recorded with newer ARRI firmware are flagged for log-C, and FCP X automatically corrects these to Rec709. Since you don’t want to double up on LUTs, make sure “log processing” is unchecked for those clips in the info tab of the inspector pane.
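If you’re curious what that exported .cube file contains, it’s just plain text: a header line with the grid size, followed by size³ RGB triples with red varying fastest. A minimal sketch that writes an identity (no-op) LUT in this format:

```python
size = 17                               # arbitrary small grid
with open("identity.cube", "w") as f:
    f.write(f"LUT_3D_SIZE {size}\n")
    for b in range(size):
        for g in range(size):
            for r in range(size):       # red varies fastest in .cube files
                f.write(f"{r/(size-1):.6f} "
                        f"{g/(size-1):.6f} "
                        f"{b/(size-1):.6f}\n")
```

A graded look simply stores non-identity triples; the LUT filter in the NLE interpolates between these grid points for each pixel.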

Media Composer

To use the custom LUTs in Media Composer, select “source settings” for the clip. Go to the color management tab and install the LUT. Now it will be available in the pull-down menu for color conversions. This color management change can be applied to a single clip or to a batch of clips within a bin.

In both cases, the source clips in FCP X and/or Media Composer will play in real-time with the custom look already applied.


©2014 Oliver Peters

NAB 2014 Thoughts

Whodathunkit? More NLEs, new cameras from new vendors and even a new film scanner! I’ve been back from NAB for a little over a week and needed to get caught up on work while decompressing. The following are some thoughts in broad strokes.

Avid Connect. My trip started early with the Avid Connect customer event. This was a corporate gathering with over 1,000 paid attendees. Avid execs and managers outlined the corporate vision of Avid Everywhere in presentations that were head-and-shoulders better than any executive presentations Avid has given in years. For many who attended, it was to see if there was still life in Avid. I think the general response was receptive and positive. Avid Everywhere is basically a realignment of existing and future products around a platform concept. That has more impact if you own Avid storage or asset management software – less so if you only own a seat of Media Composer or Pro Tools. No new software features were announced, but new pricing models were, with options to purchase or rent individual seats of the software – or to rent floating licenses in larger quantities.

4K. As predicted, 4K was all over the show. However, when you talked to vendors and users, there was little clear direction about actual mastering in 4K. It is starting to be a requirement in some circles, like delivering to Netflix, for example; but for most users 4K stops at acquisition. There is interest for archival reasons, as well as for reframing shots when the master is HD or 2K.

Cameras. New cameras from Blackmagic Design. Not much of a surprise there. One is the bigger, ENG-style URSA, which is Blackmagic’s answer to all of the add-ons people use with smaller HDSLR-sized cameras. The biggest feature is a 10” flip-out LCD monitor. AJA was the real surprise with its own 4K camera, the Cion. Think KiPro Quad with a camera built around it. Several DPs I spoke with weren’t that thrilled about either camera, because of size or balance. A camera that did get everyone jazzed was Sony’s A7s, one of their new Alpha series HDSLRs. It’s 4K-capable when recorded via HDMI to an external device. The images were outstanding. Of course, 4K wasn’t everywhere – notably not at ARRI. The news there is the Amira, a sibling to the Alexa. Both share the same sensor design, with the Amira designed as a documentary camera. I’m sure it will be a hit, in spite of being a 2K camera.

Mac Pro. The new Mac Pro was all over the show in numerous booths. Various companies showed housings and add-ons to mount the Mac Pro for various applications. Lots of Thunderbolt products on display to address expandability for this unit, as well as Apple laptops and eventually PCs that will use Thunderbolt technology. The folks at FCPworks showed a nice DIT table/cart designed to hold a Mac Pro, keyboard, monitoring and other on-set essentials.

FCP X. Speaking of FCP X, the best place to check it out was at the off-site demo suite that FCPworks was running during the show. The suite demonstrated a number of FCP X-based workflows using third-party utilities, shared storage from Quantum and more. FCP X was in various booths on the NAB show floor, but to me it seemed limited to partner companies, like AJA. The occurrences of FCP X in other booths were overshadowed by Premiere Pro CC sightings. No new FCP X feature announcements or even hints were made by Apple in any private meetings.

NLEs. The state of nonlinear editing is in more flux than ever. FCP X seems to be picking up a little steam, as is Premiere Pro. Yet there is still no clear market leader across all sectors. Autodesk announced Smoke 2015, which will be the last version you can buy outright; following Adobe’s lead, the company shifts to a rental model for its products this year. Smoke 2015 diverges more from the Flame UI model, with more timeline-based effects than Smoke 2013. Lightworks for the Mac was demoed at the EditShare booth, which will make it another new option for Mac editors. Nothing new yet out of Avid, except some rebranding – Media Composer is now Media Composer | Software and Sphere is now Media Composer | Cloud. Expect new features to be rolled in by the end of this year. The biggest new player is Blackmagic Design, which has expanded the DaVinci Resolve software into a full-fledged NLE. With a cosmetic resemblance to FCP X, it caused many to dub it “the NLE that Final Cut Pro 8 should have been”. Whether that’s on the mark or just irrational exuberance has yet to be determined. Suffice it to say that Blackmagic is serious about making it a powerful editor, which for now is targeted at finishing.

Death of i/o cards. I’ve seen little mention of this, but it seems to me that dedicated PCIe video capture cards are a thing of the past. KONA and DeckLink cards are really just there to support legacy products; they have less relevance in the file-based world. Most of the focus these days is on monitoring, which can be easily (and more cheaply) handled by HDMI or small Thunderbolt devices. Look at AJA and Matrox, for example: most of their PCIe card business now targets the OEM market. AJA supplies Quantel with its 4K i/o cards. The emphasis for direct customers is on smaller output-only products, mini-converters, or self-contained format converters.

Film. If you were making a custom, 35mm film scanner – get out of the business, because you are now competing against Blackmagic Design! Their new film scanner is based on technology acquired through the purchase of Cintel a few months ago. Now Blackmagic introduced a sleek 35mm scanner capable of up to 30fps with UltraHD images. It’s $30K and connects to a Mac Pro via Thunderbolt2. Simple operation and easy software (plus Resolve) will likely rekindle the interest at a number of facilities for the film transfer business. That will be especially true at sites with a large archive of film.

Social. Naturally NAB wouldn’t be the fun it is without the opportunity to meet up with friends from all over the world. That’s part of what I get out of it. For others it’s the extra training through classes at Post Production World. The SuperMeet is a must for many editors. The Avid Connect gala featured entertainment by the legendary Nile Rodgers and his band Chic. Nearly two hours of non-stop funk/dance/disco. Quite enjoyable regardless of your musical taste. So, another year in Vegas – and not quite the ho-hum event that many had thought it would be!

Click here for more analysis at Digital Video’s website.

©2014 Oliver Peters