Did you pick the right camera? Part 3

Let me wrap up this three-parter with some thoughts on the media side of cameras. The switch from videotape to file-based recording has added complexity, not only in specific file formats and codecs, but also in the wrapper and container structure of the files themselves. The earliest file-based camera systems from Sony and Panasonic created a folder structure on their media cards that allowed for audio and video, clip metadata, proxies, thumbnails, and more. FAT32 formatting was adopted, which imposed a 4GB file limit and created the need for clip-spanning whenever a recording exceeded 4GB.

As a result, these media cards contain a complex hierarchy of spanned files, folders, and subfolders. They often require a special plug-in for each NLE to automatically interpret the files as the appropriate media format. Some of these plug-ins are included with the NLE installation, while others require the user to download and install the camera manufacturer’s software manually.

This became even more complicated with RED cameras, which added QuickTime reference files at three resolutions so that standard media players could read the REDCODE RAW files. It got even worse when digital still cameras added video recording capabilities, creating two different sets of folder paths on the card for the video and the still media. Naturally, none of these manufacturers adopted the same architecture, leaving users with a veritable Christmas tree of discovery every time they popped in one of these cards to copy/ingest/import media.

At the risk of sounding like a broken record, I am totally a fan of ARRI’s approach with the Alexa camera platform. By adopting QuickTime wrappers and the ProRes codec family (or optionally DNxHD as MXF OP1a media), Alexa recordings use a simple folder structure containing a set of uniquely named files. These movie files include interleaved audio, video, and timecode data without the need for subfolders, sidecar files, and other extraneous information. AJA has adopted a similar approach with its Ki Pro products. From an editor’s point of view, I would much rather be handed Alexa or Ki Pro media files than those from any other camera, simply because they are the most straightforward to deal with in post.

I should point out that in a small percentage of productions, the incorporated metadata does have value. That’s often the case when high-end VFX are involved and information like lens data can be critical. However, in some camera systems, this is only tracked when doing camera raw recordings. Another instance is with GoPro 360-degree recordings. The front and back files and associated data files need to stay intact so that GoPro’s stitching software can properly combine the two halves into a single movie.

You can still get the benefit of the simpler Alexa-style workflow in post with other cameras if you do a bit of media management before ingesting the files for the edit. My typical routine for the various Panasonic, Canon, Sony, and prosumer cameras is to pull all of the media files out of their various Clip or Private folders and move them to the root folder (usually labelled by camera roll or date). I trash all of those extra folders, because none of them are useful. (RED and GoPro 360 are the only formats I leave untouched.) When a camera doesn’t generate unique file names, I run a batch renaming application to create them. There are a few formats (generally drones, ‘action’ cameras, smart phones, and image sequences) that I will transcode to some flavor of ProRes. Once I’ve done this, the edit and the rest of post become smooth sailing.
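For anyone who wants to script that routine, here is a minimal sketch in Python. It assumes you are working from a copy of the card (never the original), and the folder names, file extensions, and roll label are placeholders rather than a prescription for any particular camera format.

```python
import shutil
from pathlib import Path

# Hypothetical card-flattening sketch: copy every video file out of the
# camera's Clip/Private subfolders into a single roll folder, prefixing
# each file with the roll name so names stay unique across cards.
VIDEO_EXT = {".mov", ".mp4", ".mxf"}  # adjust for your cameras

def flatten_card(card_copy: str, dest_root: str, roll: str) -> None:
    dest = Path(dest_root) / roll
    dest.mkdir(parents=True, exist_ok=True)
    count = 0
    for f in sorted(Path(card_copy).rglob("*")):
        if f.is_file() and f.suffix.lower() in VIDEO_EXT:
            count += 1
            shutil.copy2(f, dest / f"{roll}_{count:04d}_{f.name}")

# Example usage (paths and roll name are hypothetical):
flatten_card("/Volumes/CARD_COPY", "/Volumes/Media/Project", "A001_2019-05-01")
```

A prefix like this stands in for the separate batch renaming step on cameras that don’t generate unique file names; transcoding the oddball formats to ProRes would still be a separate pass in your encoder of choice.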

While part of your camera buying decision should be based on its impact on post, don’t let that be a showstopper. You just have to know how to handle it and allow for the necessary prep time before starting the edit.

Click here for Part 2.

©2019 Oliver Peters

Did you pick the right camera? Part 2

HDR (high dynamic range) imagery and higher display resolutions start with the camera. Unfortunately that’s also where the misinformation starts. That’s because the terminology is based on displays and not on camera sensors and lenses.

Resolution

4K is pretty common, 8K products are here, and 16K may be around the corner. Resolution is commonly expressed as the horizontal pixel dimension, but actual visual resolution is meant to be measured vertically. A resolution chart uses converging lines; the point at which you can no longer distinguish between the lines is the limit of measurable resolution. That isn’t necessarily a pixel count.

The second point to mention is that camera sensors are built with photosites, which only loosely equate to pixels. The hitch is that there is no 1:1 correlation between a sensor’s photosites and the display pixels on a screen. This is made even more complicated by the Bayer-pattern sensor design used in most professional video cameras. In addition, not all 4K cameras look good when you analyze the image at 100%. For example, nearly all early and/or cheap drone and ‘action’ cameras appear substandard when you look at the image closely. The reasons include cheap plastic lenses and high compression levels.
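To make the photosite-versus-pixel distinction concrete, here is a toy demosaic sketch, assuming a simple RGGB Bayer layout. Real cameras use far more sophisticated interpolation, so treat this purely as an illustration of why a sensor’s photosite count doesn’t translate directly into measurable RGB resolution.

```python
import numpy as np

def naive_debayer_rggb(raw):
    # Each photosite records only one color. This crude approach pools a
    # 2x2 RGGB block into a single RGB pixel; real demosaicing interpolates
    # missing colors at every photosite, but the point stands: output pixels
    # are reconstructed from multiple single-color photosites.
    r  = raw[0::2, 0::2]
    g1 = raw[0::2, 1::2]
    g2 = raw[1::2, 0::2]
    b  = raw[1::2, 1::2]
    return np.dstack([r, (g1 + g2) / 2.0, b])

raw = np.random.rand(8, 8)             # stand-in for sensor data
print(naive_debayer_rggb(raw).shape)   # (4, 4, 3)
```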

The bottom line is that when a company like Netflix won’t accept an ARRI Alexa as a valid 4K camera for its original content guidelines – in spite of the number of blockbuster feature films captured using Alexas – you have to take it with a grain of salt. Ironically, if you shoot with an Alexa in its 4:3 mode (2880 x 2160) using anamorphic lenses (2:1 aspect squeeze), the expanded image results in a 5760 x 2160 (6K) frame. Trust me, this image looks great on a 4K display with plenty of room to crop left and right. Or, a great ‘scope image. Yes, there are anamorphic lens artifacts, but that’s part of the charm as to why creatives love to shoot that way in the first place.

Resolution is largely a non-issue for most camera owners these days. There are tons of 4K options and the only decision you need to make when shooting and editing is whether to record at 3840 or 4096 wide when working in a 4K mode.

Log, raw, and color correction

HDR is the ‘next big thing’ after resolution. Nearly every modern professional camera can shoot footage that can easily be graded into HDR imagery, by recording the image either as camera raw or with a log color profile. This lets a colorist stretch the highlight information up to the peak luminance levels that HDR displays are capable of. Remember that HDR video is completely different from HDR photography, which often produces very hyper-real photos. Of course, HDR will continue to be a moving target until one of the various competing standards gains sufficient traction in the consumer market.

It’s important to keep in mind that neither raw nor log is a panacea for all image issues. Both are ways to fit the linear dynamic range that the camera ‘sees’ into a video colorspace. Log does this by applying a logarithmic curve to the video, which can then be selectively expanded again in post. Raw preserves the sensor data in the recording and pushes the transformation of that data into RGB video outside of the camera. With either method, it is still possible to end up with unrecoverable highlights in your recorded image. In some cases the highlights aren’t digitally clipped, but there’s simply no information in them other than bright whiteness. There is no substitute for proper lighting, exposure control, and shaping the image aesthetically through creative lighting design. In fact, if you carefully control the image, such as in a studio interview or a dramatic studio production, there’s no real reason to shoot log instead of Rec 709. Both are valid options.
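As a rough illustration of what a log profile does, here is a toy encode/decode pair. The curve constant is arbitrary; real camera curves (LogC, S-Log3, V-Log, and so on) use published formulas, so this is only a sketch of the principle: compress the range into the limited video signal on recording, then expand it again in the grade.

```python
import numpy as np

A = 50.0  # arbitrary curve constant, for illustration only

def encode_log(linear):
    # Compress a wide linear range into a 0-1 video signal.
    return np.log1p(A * linear) / np.log1p(A)

def decode_log(encoded):
    # Inverse curve, as a grade would apply before tone-mapping for HDR or SDR.
    return np.expm1(encoded * np.log1p(A)) / A

scene_linear = np.array([0.0, 0.02, 0.18, 1.0])  # black, shadow, mid-gray, peak
encoded = encode_log(scene_linear)
print(encoded)               # shadows get a generous share of code values; highlights are compressed
print(decode_log(encoded))   # round-trips back to the original linear values
```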

I’ve graded camera raw (RED, Phantom, DJI) and log footage (Alexa, Canon, Panasonic, Sony), and in my opinion there isn’t that much magic to camera raw. Yes, you have good latitude for ISO, color temperature, and tint, but really not a lot more than with a log profile. In one case the sensor de-Bayering is done in post; in the other, it’s done in-camera. But if a shot was recorded underexposed, the raw image is still going to get noisy as you lift the ISO and/or exposure settings. There’s no free lunch, and I still stick to the mantra that you should ‘expose to the right’ during production. It’s easier to make a shot darker and keep a nice image than to go in the other direction.

Since NAB 2018, more camera raw options have hit the market with Apple’s ProRes RAW and Blackmagic RAW. While camera raw may not provide any new, magic capabilities, it does allow the camera manufacturer to record a less-compressed file at a lower data rate. However, neither of these new codecs will have much impact on post workflows until there’s a critical mass of production users, since they are camera recording codecs rather than mezzanine or mastering codecs. At the moment, only Final Cut Pro X properly handles ProRes RAW, yet there are no actual camera raw controls for it as you would find with RED camera raw settings. So in that case, there’s little benefit to raw over log, except for file size.

One popular raw codec has been Cinema DNG, which is recorded as an image sequence rather than a single movie file. Blackmagic Design cameras used it until it was replaced by Blackmagic RAW, and some drone cameras also use it. While I personally hate the workflow of dealing with image sequence files, there is one interesting aspect of cDNG. Because the format was originally developed by Adobe, processing is handled nicely by the Adobe Camera Raw module, which was designed for camera raw photographs. I’ve found that if you bring a cDNG sequence into After Effects (which uses the ACR module) rather than Resolve, you can actually dig more highlight detail out of the images, or at least with far less effort. Unfortunately, you are stuck making that settings decision on the first frame, as you import the sequence into After Effects.

The bottom line is that there is no way to make an educated decision about cameras without actually testing the images, the profile options, and the codecs with real-world footage. These have to be viewed on high quality displays at their native resolutions. Only then will you get an accurate reading of what that camera is capable of. The good news is that there are many excellent options on the market at various price points, so it’s hard to go wrong with any of the major brand name cameras.

Click here for Part 1.

Click here for Part 3.

©2019 Oliver Peters

Are you ready for a custom PC?

Why would an editor, colorist, or animator purchase a workstation from a custom PC builder, instead of one of the brand name manufacturers? Puget Systems, a PC supplier in Washington state, loaned me a workstation to delve into this question. They pride themselves on assembling systems tailor-made for creative users. Not all component choices are equal, so Puget tests the same creative applications we use every day in order to optimize their systems. For instance, Premiere Pro benefits from more CPU cores, whereas with After Effects, faster core speeds are more important than the core count.

Puget Systems also offers a unique warranty. It’s one year on parts, but lifetime free labor. This means free tech and repair support for as long as you own the unit. Even better, it also includes free labor to install hardware upgrades at their facility at any point in the future – you only pay for parts and shipping.

Built for editing

The experience starts with a consultation, followed by progress reports, test results, and photos of your system during and after assembly. These include thermal scans showing your system under load. Puget’s phone advisers can recommend a system designed specifically for your needs, whether that’s CAD, gaming, After Effects, or editing. My target was Premiere Pro and Resolve with a bit of After Effects. I needed it to be capable of dealing with 4K media using native codecs (no transcodes or proxies). 

Puget’s configuration included an eight-core Intel i9 3.6GHz CPU, 64GB RAM, and an MSI GeForce RTX 2080 Ti Ventus GPU (11GB). We put in two Samsung SSDs (a Samsung 860 Pro for OS/applications, plus a faster Samsung 970 Pro M.2 NVMe for cache) and a Western Digital Ultrastar 6TB SATA3 spinning drive for media. This PC has tons of connectivity with ports for video displays, Thunderbolt 3, USB-C, and USB 3. The rest was typical for any PC: sound card, ethernet, wifi, DVD-RW, etc. This unit without a display costs slightly over $5K USD, including shipping and a Windows 10 license. That price is in line with (or cheaper than) any other robust, high-performance workstation.

The three drives in this system deliver different speeds and are intended for different purposes. The fastest is the “D” drive, a blazingly fast NVMe SSD mounted directly onto the motherboard. It is intended for material requiring frequent, fast read/write cycles, so it’s ideal for Adobe’s cache files and previews. While you wouldn’t store the media for a large Premiere Pro project on it, it is well-suited for complex After Effects jobs, which typically deal with a smaller amount of media. While the 6TB HGST “E” drive handled the 4K media for my test projects well, in actual practice you would likely add more drives and build an internal RAID, or connect to a fast external array or NAS.

If we follow Steve Jobs’ analogy that PCs are like trucks, then this is the Ford F-350 of workstations. The unit is a tad bigger and heavier than an older Mac Pro tower. It’s built into an all-metal Fractal Design case with sound dampening and efficient cooling, resulting in the quietest workstation I’ve ever used – even the few times when the fans revved up. There’s plenty of internal space for future expansion, such as additional hard drives, GPUs, I/O cards, etc.

For anyone fretting about a shift from macOS to Windows, setting up this system couldn’t have been simpler. Puget installs a professional build of Windows 10 without all of the junk software most PC makers put there. After connecting my devices, I was up and running in less than an hour, including software installation for Adobe CC, Resolve, Chrome, MacDrive, etc. That’s a very ‘Apple-like’ experience and something you can’t touch if you build your own PC.

The proof is in the pudding

Professional users want hardware and software to fade away so they can fluidly concentrate on the creative process. I was working with 4K media and mixed codecs in Premiere Pro, After Effects, and Resolve. The Puget PC more than lived up to its reputation. It was quiet, media handling was smooth, and Premiere and Resolve timelines could play without hiccups. In short, you can stay in the zone without the system creating distractions.

I don’t work as often with RED camera raw files; however, I did load up original footage from an indie film onto the fastest SSD. This was 4K REDCODE media in a 4K timeline in Premiere Pro. Adobe gives you access to the raw settings, in addition to Premiere’s Lumetri color correction controls. The playback was smooth as silk at full timeline resolution. Even adding Lumetri creative LUTs, dissolves, and slow motion with optical flow processing did not impede real-time playback at full resolution. No dropped frames! Nvidia and RED Digital Camera have been working closely together lately, so if your future includes work with 6K/8K RED media, then a system like this deserves serious consideration.

The second concern is rendering and exporting. The RTX 2080 Ti is an Nvidia card that offers CUDA processing, a proprietary Nvidia technology. So how fast is the system? There are many variables, of course, such as scaling, filters, color correction, and codecs. When I tested the export of a single 4K Alexa clip from a 1080p Premiere Pro timeline, the export times were nearly the same between this PC and an eight-core 2013 Mac Pro. But you can’t tell much from such a simple test.

To push Premiere Pro, I used a nine-minute 1080p travelogue episode containing mostly 4K camera files. I compared export times for ProRes (new on Windows with Adobe CC apps) and Avid DNx between this PC and the Mac Pro (through Adobe Media Encoder). ProRes exports were faster than DNxHD, and the PC exports were faster than on the Mac, although the comparative times tended to be within a minute of each other. The picture was different when comparing H.264 exports using the Vimeo Full HD preset: in that test, the PC export was approximately 75% faster.

The biggest performance improvements were demonstrated in After Effects and Resolve. I used Puget Systems’ After Effects Benchmark, which includes a series of compositions that test effects, tracking, keys, caustics, 3D text, and more (based on Video Copilot’s tutorials). The Puget PC trounced the Mac Pro in this test. The PC scored a total of 969.5 points versus the Mac’s 535 out of a possible maximum score of 1,000. Resolve was even more dramatic with the graded nine-minute-long sequence sent from Premiere Pro. Export times bested the Mac Pro by more than 2.5x for DNxHD and 6x for H.264.

Aside from these benchmark tests, I also created a “witches’ brew” After Effects composition of my own. It contains ten layers of 4K media in a one-minute-long 6K composition. The background layer was blown up and defocused, while all other layers were scaled down and enhanced with a lot of color and Cycore stylized effects. A 3D camera was added to create a group move for the layers. In addition, I was working from the slower drives and not the fast SSDs on either machine. Needless to say, this one totally bogs any system down. The Mac Pro rendered a 1080 ProRes file in about 54 minutes, whereas the PC took 42 minutes. That’s not the same 2-to-1 advantage as in the benchmarks; however, that’s likely because I heavily weighted the composition with Cycore effects, which are not particularly efficient and probably introduce bottlenecks in After Effects’ processing. Nevertheless, the Puget Systems PC still maintained a decided advantage.

Conclusion

Mac vs. PC comparisons are inevitable when discussing creative workstations. Ultimately it comes down to preference – the OS, the ecosystem, and hardware options. But if you want the ultimate selection of performance hardware and want to preserve future expandability, then a custom-built PC is currently the best solution. For straightforward editing, both platforms will generally serve you well, but there are times when a top-of-the-line PC simply leaves any Mac in the dust. If you need to push performance in After Effects or Resolve, then Windows-based solutions offer the edge today. Custom systems, like those from Puget Systems, are designed with our needs in mind. That’s something you don’t necessarily get from a mainline PC maker. This workstation is a future-proof, no-compromise system that makes the switch from Mac to PC an easy and graceful transition – and with power to spare.

Originally written for RedShark News.

©2019 Oliver Peters

Mindhunter

The investigation of crime is a film topic with which David Fincher is very familiar. He returns to this genre in the new Netflix series, Mindhunter, which is executive produced by Fincher and Charlize Theron. The series tells the story of the FBI’s Behavioral Science Unit and how it became an elite profiling team, known for investigating serial criminals. It is based on the nonfiction book Mind Hunter: Inside the FBI’s Elite Serial Crime Unit, co-written by Mark Olshaker and John Douglas, a former agent in the unit who spent 25 years with the FBI. Agent Douglas interviewed scores of serial killers, including Charles Manson, Ted Bundy, and Ed Gein, who dressed himself in his victims’ skin. The lead character in the series, Holden Ford (played by Jonathan Groff), is based on Douglas. The series takes place in 1979 and centers on two FBI agents who were among the first to interview imprisoned serial killers in order to learn how they think and apply that to other crimes. Mindhunter is about the origins of modern-day criminal profiling.

As with other Fincher projects, he brought in much of the team that’s been with him through his various feature films, like Gone Girl, The Girl with the Dragon Tattoo, and Zodiac. It has also given a number of the team members the opportunity to move up in their careers. I recently spoke with Tyler Nelson, one of the four series editors, who was given the chance to move from the assistant chair to that of a primary editor. Nelson explains, “I’ve been working with David Fincher for nearly 11 years, starting with The Curious Case of Benjamin Button. I started on that as an apprentice, but was bumped up to an assistant editor midway through. There was actually another series in the works for HBO called Videosyncrasy, which I was going to edit on. But that didn’t make it to air. So I’m glad that everyone had the faith in me to let me edit on this series. I cut the four episodes directed by Andrew Douglas and Asif Kapadia, while Kirk Baxter [editor on Gone Girl, The Girl with the Dragon Tattoo, The Social Network] cut the four shows that David directed.”

Pushing the technology envelope

The Fincher post operation has a long history of trying new and innovative techniques, including its selection of editing tools. The editors cut this series using Adobe Premiere Pro CC. Nelson and the other editors are no strangers to Premiere Pro, since Baxter had cut Gone Girl with it. Nelson says, “Of course, Kirk and I have been using it for years. One of the editors, Byron Smith, came over from House of Cards, which was being cut on [Apple] Final Cut Pro 7. So that was an easy transition for him. We are all fans of Adobe’s approach to the entertainment industry and were onboard with using it. In fact, we were running on beta software, which gave us the ability to offer feedback to Adobe on features that will hopefully make it into released products and benefit all Premiere users.”

Pushing the envelope is also a factor on the production side. The series was shot with custom versions of the RED Weapon camera. Shots were recorded at 6K resolution, but framed for a 5K extraction, leaving a lot of “padding” around the edges. This allowed room for repositioning and stabilization, which is done a lot on Fincher’s projects. In fact, nearly all of the moving footage is stabilized. All camera footage was processed into EXR image sequences, in addition to ProRes files for “offline” editing. These ProRes files also received a camera LUT, so everyone saw a good representation of the color correction during the editing process. One change from past projects was to bring color correction in-house. The final grade was handled by Eric Weidt on a FilmLight Baselight X unit, working from the EXR files. The final Netflix deliverables are 4K/HDR masters. Pushing a lot of data through a facility requires robust hardware. The editors used 2013 (“trash can”) Mac Pros connected to an Open Drives shared storage system. This high-end storage system was initially developed as part of the Gone Girl workflow and uses storage modules populated entirely with SSDs.

The feature film approach

Unlike most TV series, where there’s a set schedule to deliver a new episode each week, Netflix releases all of a season’s episodes at once, which changes the dynamic of how episodes are handled in post. Nelson continues, “We were able to treat this like one long feature film. In essence, each episode is like a reel of a film. There are 10 episodes and each is 45 minutes to an hour long. We worked it as if it was an eight-and-a-half to nine hour long movie.” Skywalker Sound did all the sound post after a cut was locked. Nelson adds, “Most of the time we handed off locked cuts, but sometimes when you hear the cleaned up sound, it can highlight issues with the edit that you didn’t notice before. In some cases, we were able to go back into the edit and make some minor tweaks to make it flow better.”

As Adobe moves more into the world of dialogue-driven entertainment, a number of developers are coming up with speech-to-text solutions that are compatible with Premiere Pro. This potentially provides editors a function similar to Avid’s ScriptSync. Would something like this have been beneficial on Mindhunter, a series based on extended interviews? Nelson replies, “I like to work with the application the way it is. I try not to get too dependent on any feature that’s very specific or unique to only one piece of software. I don’t even customize my keyboard settings too much, just so it’s easier to move from one workstation to another that way. I like to work from sequences, so I don’t need a special layout for the bins or anything like that.”

“On Mindhunter we used the same ‘KEM roll’ system as on the films, which is a process that Kirk Baxter and Angus Wall [editor on Zodiac, The Curious Case of Benjamin Button, The Social Network] prefer to work in,” Nelson continues. “All of the coverage for each scene set-up is broken up into ‘story beats’. In a 10 minute take for an interview, there might be 40 ‘beats’. These are all edited in the order of last take to first take, with any ‘starred’ takes at the head of the sequence. This way you will see all of the coverage, takes, and angles for a ‘beat’ before moving on to the group for the next ‘beat’. As you review the sequence, the really good sections of clips are moved up to video track two on the sequence. Then create a new sequence organized in story order from these selected clips and start building the scene. At any given time you can go back to the earlier sequences if the director asks to see something different than what’s in your scene cut. This method works with any NLE, so you don’t become locked into one and only one software tool.”

“Where Adobe’s approach is very helpful to us is with linked After Effects compositions,” explains Nelson. “We do a lot of invisible split screen effects and shot stabilization. Those clips are all put into After Effects comps using Dynamic Link, so that an assistant can go into After Effects and do the work. When it’s done, the completed comp just pops back into the timeline. Then ‘render and replace’ for smooth playback.”

The challenge

Certainly a series like this can be challenging for any editor, but how did Nelson take to it? He answers, “I found every interview scene to be challenging. You have an eight to 10 minute interview that needs to be interesting and compelling. Sometimes it takes two days to just get through looking at the footage for a scene like that. You start with ‘How am I going to do this?’ Somewhere along the line you get to the point where ‘This is totally working.’ And you don’t always know how you got to that point. It takes a long time approaching the footage in different ways until you can flesh it out. I really hope people enjoy the series. These are dramatizations, but real people actually did these terrible things. Certainly that creeps me out, but I really love this show and I hope people will see the craftsmanship that’s gone into Mindhunter and enjoy the series.”

In closing, Nelson offered these additional thoughts. “I’d gotten an education each and every day. Lots of editors haven’t figured it out until well into a long career. I’ve learned a lot being closer to the creative process. I’ve worked with David Fincher for almost 11 years. You think you are ready to edit, but it’s still a challenge. Many folks don’t get an opportunity like this and I don’t take that lightly. Everything that I’ve learned working with David has given me the tools and I feel fortunate that the producers had the confidence in me to let me cut on this amazing show.”

Click here for Steve Hullfish’s Art of the Cut interview with Tyler Nelson and Kirk Baxter.

Click here for Scott Simmons’ interview from NAB 2018.

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters

6 Below

From IMAX to stereo 3D, theaters have invested in various technologies to entice viewers and increase ticket sales. With a tip of the hat to the past, Barco has developed a new ultrawide, 3-screen digital projection system, a similar concept to the Cinerama film theaters of the 1950s. But modern 6K-capable digital cinema cameras make the new approach possible with stunning clarity. There are currently 40 Barco Escape theaters worldwide, with the company looking for opportunities to run films designed for this format.

Enter Scott Waugh, director (Act of Valor, Need for Speed) and co-founder of the LA production company Bandito Brothers. Waugh, who is always on the lookout for new technologies, was interested in developing the first feature film to take advantage of this 3-screen, 7:1 aspect ratio for its entire running time. But Waugh didn’t want to change how he intended to shoot the film strictly for these theaters, since the film would also be distributed to conventional theaters. This effectively meant that two films needed to come out of the post-production process – one formatted for Barco Escape and one for standard 4K theaters.

6 Below (written by Madison Turner) became the right vehicle. It’s the true-life survival story of Eric LeMarque (played by Josh Hartnett), an ex-pro hockey player turned snowboarder with an addiction problem, who finds himself lost in the ice and snow of the California Sierra mountains for a week. To best tell this story, Waugh and company trekked an hour or more into the mountains above Sundance, Utah for the production.

To handle the post workflow and co-edit the film with Waugh, editor Vashi Nedomansky (That Which I Love Destroys Me, Sharknado 2, An American Carol) joined the team. Nedomansky, another veteran of Bandito Brothers who uses Adobe Premiere Pro as his axe of choice, has also helped set up Adobe-based editorial workflows for Deadpool and Gone Girl. Ironically, in earlier years Nedomansky had been a pro hockey player himself, before shifting to a career in film and video. In fact, he played against the real Eric LeMarque on the circuit.

Pushing the boundaries

The Barco Escape format projects three 2K DCPs to cover the total 6K width. To accommodate this, RED 6K cameras were used and post was done with native media at 6K in Adobe Premiere Pro CC. My first question to Nedomansky was this: why stay native? Nedomansky says, “We had always been pushing the boundaries at Bandito Brothers. What can we get away with? It’s always a question of time, storage, money, and working with a small team. We had a small 4-person post team for 6 Below, located near Sundance. So there was interest in not losing time to transcoding.

“After some testing, we settled on decked-out Dell workstations, because these could tackle the 6K RED raw files natively.” Two Dell Precision 7910 towers (20-core, 128GB RAM) with Nvidia Quadro M6000 GPUs were set up for editing, along with a third, less beefy HP quad-core computer for the assistant editor and visual effects. All three were connected to shared storage over a 10GigE network. Mike McCarthy, post-production supervisor for 6 Below, set up the system. To keep things stable, they ran Windows 7 and stayed on the same Adobe Creative Cloud version throughout the life of the production. Nedomansky continues, “We kept waiting for the 6K to not play, but it never stopped in the six weeks that we were up there. My first assembly was almost three hours long – all in a single timeline – and I was able to play it straight through without any skips or stuttering.”

There were other challenges along the way. Nedomansky explains, “Almost all of the film was done as single-camera and Josh has to carry it with his performance as the sole person on screen for much of the film. He has to go through a range of emotions and you can’t just turn that on and off between takes. So there were lots of long 10-minute takes to convey his deterioration within the hostile environmental conditions. The story is about a man lost in the wild, without much dialogue. The challenge is how to cut down these long takes without taking away from his performance. One solution was to go against the grain – using jump cuts to shorten long takes. But I wanted to look for the emotional changes or a physical act to motivate a jump cut in a way that would make it more organic. In one case, I took a 10-minute take down to 45 seconds.”

When you have a film where weather is a character, you hope that the weather will cooperate. Nedomansky adds, “One of our biggest concerns going in, was the weather. Production started in March – a time when there isn’t a lot of snow in Utah. Fortunately for us, a day before we were supposed to start shooting, they had the biggest ‘blizzard’ of the winter for four days. This saved us a lot of VFX time, because we didn’t have to create atmospherics, like snow in front of the lens. It was there naturally.”

Using the Creative Cloud tools to their fullest

6 Below features a significant number of visual effects shots. Nedomansky says, “The film has 1500 shots with 205 of them as VFX shots. John Carr was the assistant editor and visual effects artist on the film and he did all of the work in After Effects and at 6K resolution, which is unusual for films. Some of the shots included ‘day for night’ where John had to add star plates for the sky. This meant rotoscoping behind Josh and the trees to add the plates. He also had to paint out crew footprints in the snow, along with the occasional dolly track or crew member in a shot. There were also some split screens done at 6K right in Premiere Pro.”

The post schedule involved six weeks on-set and then fourteen more weeks back in LA, for a 20-week total. After that came sound post and grading (done at Technicolor). The process of correctly formatting the film for both Barco and regular theaters almost constituted posting two films. The RED camera image is 6144 x 2592 pixels, the Barco Escape frame is 6144 x 864, and the 4K extraction is 4096 x 2160. Nedomansky explains, “The Barco frame is thin and wide. It could use the full width, but not height, of the full 6K RED image. So, I had to do a lot of ‘animation’ to reposition the frame within the Barco format. For the 4K version, the framing would be adjusted accordingly. The film has about 1500 shots, but we didn’t use different takes for the two versions. I was able to do this all through reframing.”
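To put rough numbers on how much reframing room those dimensions leave, here is a small illustrative calculation. It simply restates the frame sizes quoted above; the helper function is not any tool used on the film.

```python
# Illustrative arithmetic only, using the frame sizes mentioned above.
SOURCE = (6144, 2592)                    # full 6K RED frame (width, height)
TARGETS = {
    "Barco Escape":  (6144, 864),
    "4K extraction": (4096, 2160),
}

def reframing_room(source, target):
    # Pixels of slack available for repositioning the crop within the source frame.
    return (source[0] - target[0], source[1] - target[1])

for name, size in TARGETS.items():
    print(name, reframing_room(SOURCE, size))
# Barco Escape  (0, 1728): the full width is used, so only vertical reframing is possible.
# 4K extraction (2048, 432): there is room to reframe both horizontally and vertically.
```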

In wrapping up our conversation, Nedomansky adds, “I played hockey against Eric and this added an extra layer of responsibility. He’s very much still alive today. Like any film of this type, it’s ‘based on’ the true story, but liberties are taken. I wanted to make sure that Eric would respect the result. Scott and I’ve done films that were heavy on action, but this film shows another directorial style – more personal and emotional with beautiful visuals. That’s also a departure for me and it’s very important for editors to have that option.”

6 Below was released on October 13 in cinemas.

Read Vashi’s own write-up of his post production workflow.

Images are courtesy of Vashi Visuals.

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters