Last Night in Soho

Edgar Wright has written and directed a string of successful comedies and cult classics. His latest, Last Night in Soho, is a change from this pattern. It’s a suspense thriller that would make Hitchcock proud. Eloise (Thomasin McKenzie) is a young English country girl who’s moved to the Soho area of London to study fashion design. Her nightly dreams transport her into the past of the swinging 60s and the Soho nightlife, observing and becoming strangely intertwined with the life of Sandie (Anya Taylor-Joy), an aspiring singer. Those dreams quickly evolve into nightmares as the story turns more sinister.

Last Night in Soho was edited by Paul Machliss, ACE, a frequent collaborator with Wright. Machliss edited Scott Pilgrim vs. The World, The World’s End, and Baby Driver. The latter picked up a BAFTA win and an Oscar nomination for best editing. I recently had a chance to interview Machliss during a break from cutting his next feature, The Flash.

______________________________

Music was a key plot device and impetus for much of the editing in Baby Driver. In Last Night in Soho, music of the 1960s also plays a key role, but in a less blatant manner.

Baby Driver was about capital E editing. Edgar and I took a lot of what we learned and applied it to Soho. Can we do it in a more subtle way? It’s not immediately obvious, but it’s probably as intense. It just doesn’t show itself to the same level.

Many shots that look like sophisticated visual effects were done practically, such as the dance sequence involving both lead actresses and Matt Smith. What were some of the other practical effects?

One clever shot was the bit in the phone box, where right at the end of the dance, they go in for a snog. Initially the reflection is that of Matt [Jack] and Anya [Sandie]. Then as Matt pulls back, you realize that it has become Thomasin [Eloise] in the mirror. The kiss starts with an actual mirror in place. In the middle of the kiss, an effects team yanks the mirror back to reveal Thomasin and a stand-in behind it. The only visual effect is to remove the moment when the mirror is yanked.

I’d love to say it was full of subtle things that couldn’t have been done without editing. However, it almost goes back to the days of Vaudeville or the earliest days of cinema. Less is more – those simple things are still incredibly effective.

As with Baby Driver, I presume you were editing primarily on the set initially?

I was on set every day. We did about three weeks straight of night shoots at the height of summer in Soho. I wouldn’t say we owned Soho, but we could literally run anywhere. We could park in Soho Square and wheel the trolleys wherever we needed for the purposes of filming. However, Soho didn’t close down – life carried on in Soho. It was fascinating to see the lifestyle of Soho change when you are filming from 6:00 PM till 6:00 AM. It’s in the small hours that elements of the ‘darker’ side of Soho start to appear and haunt the place. So that slightly sinister level of it hasn’t gone away.

Being on set had a lot to do with things like music playback, motion control cameras, and certainly lighting, which played an even bigger part in this film than it did in Baby Driver. I found myself somewhat responsible for the timing – making sure that Edgar’s idea of how the lighting should work in sync with the music was 100% successful on the night, because there was no fixing it afterwards.

Some editors avoid being on set to maintain objectivity. Your working style seems to be different.

My MO really is not to do the perfect assembly edit in the midst of all the madness of filming. What I’m trying to do for Edgar is a proof of concept. We know as we’re shooting that issues can arise from continuity, camera angles, and various other things. So part of what I’m doing on set is keeping an eye on all the disparate elements from a technical perspective to make sure they’re all in harmony to support Edgar’s vision. But as the editor, I still need to make time for an assembly. Sometimes that meant getting to set an hour or two early. Then I just sit at the Avid with headphones and quickly catch up on the previous day’s work, where I do try and make more of a concerted effort to cut a good scene. Then that gets passed on to the editorial department. By the time Edgar and I get back into the cutting room, we have a fully assembled film to start working on.

Lighting design – especially the neon sign outside of Ellie’s window – drives the scene transitions.

Edgar’s original brief was, “There is a point where the lighting goes from blue, white, red, blue, white, red, and then whenever she transitions into the past, it goes into a constant flashing red.” That’s something that any good lighting operator can cue quite simply. What made it more interesting is that Edgar said, “What I’d love, is for the lighting to flash subtly, but in time to every different bit of music that Eloise puts on her little record player.” Then it was like, “Oh, right, how do we do that?”

Bradley Farmer on our music team was able to break the songs down into a kind of beat sheet with all the lyrics and the chorus. Edgar would go, “According to the storyboards this is the line in the song that I’d like it to start going red, red, red.” Armed with that knowledge, I had one track of audio with the music in mono and another track with longitudinal timecode – a different hour code for every song. I would edit full screen color images of red, white, and blue as a reference on the Avid to match what the timing of the color changes should be against the track.

Next, I was able to export an XML file out of the Avid, which could be read by the lighting panel computer. The lighting operator would load these sequences in so the panel would know when to have the lights in their ‘on’ or ‘off’ state. He also had a QuickTime reference from the Avid so he could see the color changes against the burnt-in timecode and know, “Next one’s red, program the SkyPanel to go red.”
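The cue-list idea Machliss describes – each song living in its own timecode “hour,” with every lighting change becoming a timestamped event the panel can chase – can be sketched roughly like this. This is only an illustration of the data flow; the function names and the event format are hypothetical, and the actual Avid XML export the lighting panel reads is a different, proprietary structure.

```python
# Hypothetical sketch: map (song hour, seconds into song, color) cues to
# SMPTE-style timecode events a lighting panel could chase. Assumes 24fps.

FPS = 24

def to_timecode(hour, seconds, fps=FPS):
    """Convert a song's hour code plus an offset in seconds to HH:MM:SS:FF."""
    total_frames = round(seconds * fps)
    ff = total_frames % fps
    total_seconds = total_frames // fps
    ss = total_seconds % 60
    mm = total_seconds // 60
    return f"{hour:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

def build_cue_events(cues):
    """cues: list of (hour, seconds_into_song, color) -> timestamped events."""
    return [(to_timecode(h, s), color) for h, s, color in cues]

# Song assigned to hour 3: go red on a downbeat, white half a second later.
events = build_cue_events([(3, 10.0, "red"), (3, 10.5, "white")])
```

Giving each song its own hour code means a single running timecode stream unambiguously identifies both the song and the position within it – which is what let the playback and lighting systems stay in lockstep.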

Our music playback guy, Pete Blaxill, had copies of the tracks in stereo and was able to use my timecode as his base timecode. He then sent that timecode to the lighting panel. So if Edgar goes, “I now want to pick up the song from the first chorus,” then the lighting panel would chase the playback timecode. Once the sequence was set at the lighting panel, wherever Edgar wanted to go, the lighting desk knew which part of the song we were at and what the next color in the sequence was.

To make things a tad more complex Edgar wanted to shoot some of the action to the playback at 32fps so there could be a dreamlike quality to movement at certain points in the song. This meant creating a lighting map that would work at 32fps, as well as the regular 24fps version. Bradley Farmer gave me a version of the songs that were exactly 33.3% faster and pitch-corrected. I reformatted my Avid sequence so everything just went on and off a third faster for the 32fps tracks. And once again, gave the XML file to the lighting guy and the sped-up tracks to Pete for his Pro Tools system.

I realized that the motion control camera could also be triggered by external timecode from Pete’s Pro Tools system. We utilized that at the climax of ‘You’re My World’ where Anya descends the staircase in the Cafe de Paris ballroom. This was a two-layer composite shot filmed with a motion control camera in multiple passes. Thomasin does one pass descending the staircase and then we did Anya’s pass. We also had singer Beth Singh [Cilla Black] in the foreground of the shot with her backing band behind her. Pete Blaxill would hit play on his Pro Tools. The music would be coming through the foldback for Beth to mime to, the lighting switched in at the right musical point, and then at exactly the right moment in the chorus the Pro Tools timecode would trigger the MoCo. I remember sitting cross-legged on the floor out of the way in the corner and watching all this happen. It’s incredibly ‘nerdy,’ but it gave one a wonderful feeling of satisfaction to be part of the team that could make moments like this happen seamlessly.

What formats were used to capture Last Night in Soho?

This was all shot 35mm, except that we used an ARRI Alexa for the night shoots. We tested both 16mm and 35mm for these exteriors. The 16mm looked good, but was too grainy and the highlights flared out tremendously. The 35mm was okay, but required a lot more lighting to achieve the desired look. Of course with the Alexa being digital, you put it there without any extra lighting at all, look at the result, and go, “Gosh, look at the amount of detail – that’s almost usable as it is.” Just because of the variety of shots we needed, Edgar and DP Chung-hoon Chung decided to use the Alexa for exterior night locations and any interior location where we would be looking out into the street at some point – for example, the library that Eloise visits later in the film.

Does the choice of film rather than digital acquisition add some technical challenges?

Fortunately, Kodak is still a company that is churning out 35mm stock full-time. The infrastructure for it, though, is dwindling. Even in the time between doing The World’s End and Last Night in Soho, there is now only one facility in the UK that can process 35mm. And it has to process the rushes for every production that’s shooting on 35mm in the UK.

Even so, it’s great to think that a medium like 35mm can still support something as complicated as one of Edgar’s films without having to do everything in the digital domain. You’ve got the [Panavision] Millennium XL cameras – even though they’re nearing 20 years old, they’re pretty solid. 

Edgar is totally aware that he’s not shooting on an ‘archaic’ format for the sake of it, but that it’s his preferred medium of acquisition – in the same way some painters choose water-based paints as opposed to oil-based. He knows what medium he’s on and he respects that. You might say, “Yes, but the punter won’t necessarily know or appreciate what format this was shot on.” However, it helps to contribute to the feeling of our production. It’s part of the look, part of the patina – that slightly organic feel of the media brings so much to the look of Soho.

Because you are cutting on set, you’re working from a video tap off of the film camera.

Right. Once again, I had a little network set up with a large spool of Cat 5 cable connected to the Qtake system. And I would put myself out of the way in a corner of the soundstage and get on with it. I would just be a crew member quietly doing my job editing, not bothering Edgar. Sometimes he might not see me for a couple of hours into the day until he needed my input on a shot.

So that means at some point the low-resolution video tap clips have to be replaced by the actual footage after the film has been transferred.

That’s right. I used my editorial team from Baby Driver – Jerry Ramsbottom and Jessica Medlycott. Both of them were well-versed in the technique of over-cutting the low-res video tap footage once the hi-res Avid media came back from the lab. I never had to worry about that. The cutting room was also in Soho and when we were shooting there, if Edgar ever had a question about the footage or the previous day’s edit he would say, “Could we meet an hour early (before call-time) and pop into the cutting room to look at some stuff?” It also meant that we could tweak the edit before that day’s shoot and give Edgar a better idea of his goals for that day.

Please tell me about the production and post timeline, considering that this was all impacted by the pandemic.

Prep started at the beginning of 2019. I came on board in April. The shoot itself went from May till early September. We then had the director’s cut period, which due to the complexity took slightly longer than your average 10 weeks.

We worked up until the new year and had a preview screening early in January, which got some good numbers back. Edgar was able to start talking to the studio about some additional photography. We planned that shoot for the last week of March 2020. But at the last minute it was cancelled as we entered the first lockdown. However, visual effects work continued, because all of that could be done remotely. We had a weekly VFX review using Zoom and Cinesync with Double Negative for about a four-month period during the first lockdown. That was a bit of a lifesaver for us to know that the film still had a heartbeat during that time.

Things calmed down at the end of July – enough for the studio to consider allowing us to remount the additional photography. And of course, it was a new world we were living in by that stage. Suddenly we were working in ‘zones’ on set. I was assigned the zone with the video department and couldn’t go on to set and work with Edgar. We had an area – divided by plastic fencing – where we could be. We would have to maintain a distance with him on one side and me on the other. Fortunately, I had my edit-trolley modified so that the A-Grade monitor was on a swivel-mount and that’s how he was able to keep an eye on the progress of the work.

We were only the second production in England to resume at that time. I think other productions were watching us and thinking, “Would we all just drop like flies? Would COVID just go ‘BAM!’ and knock us all out?” Overall, the various PCR testing and new health and safety procedures added about an extra 20% to the budget, but it was the only way we were going to be allowed to shoot.

The reshoots went very well and we had another preview screening in October and the numbers were even better. But then we were approaching our second lockdown in the UK. However, this time Edgar and I were able to see post-production all the way through. All of our dates at Twickenham Studios for the sound mix and at Warner Bros. De Lane Lea for the grade could be honored, even though strict safety precautions were in place.

We delivered the film on December 18th of last year, having made all the various HDR and SDR versions for the UHD Blu-ray, as well as general release. We did a wonderful Dolby Vision/Dolby Atmos version, which is actually the ultimate way to see the film. Because of the pandemic and the lack of open theaters at the time, there was a school of thought that this film should be released directly onto a streaming platform. Fortunately, producers Eric Fellner and Nira Park viewed the final grade and mix at Twickenham and said, “No, no, this is a cinematic experience. It was designed to be seen on the big screen, so let’s wait. Let’s bide our time.”

Edgar Wright was also working on the documentary, The Sparks Brothers. Was the post on these two simultaneous? If so, how did this impact his availability to you during the Soho post?

The timelines to the two projects were kind of parallel. Making a documentary is about gathering tons of archival footage and obtaining new interviews. Then you can leave it with the editor to put a version of it together. I remember that Edgar had done a ton of interviews before we started shooting Soho. He’d done all of the black-and-white interviews in London or in LA pre-pandemic. The assembly of all of that footage happened during our shoot. Then when the lockdown occurred, it was very easy for Paul Trewartha, the editor of the Sparks documentary, to carry on working from home.

When we came back, the Soho and Sparks cutting rooms were both on different floors of our edit facility on Wardour Street in Soho. They were on the first floor and we were on the top floor. Edgar would literally bounce between the two. It got a little bit fraught for Edgar, because the grading and dubbing for both films happened at the same time at different facilities. I remember Edgar had to sign off on both films on December 18th. So he had to go to one facility to watch Soho with me and then he went off to watch Sparks with Paul, which I imagine gave him quite a degree of satisfaction – completing both huge projects on the same day.

To wrap it up, let’s talk edit systems. Avid Media Composer is still your go-to NLE. Right?

Avid, yeah. I’ve been running it for God knows how long now. It’s a little like touch-typing for me – it all just happens very quickly. When I watch all the dailies of a particular scene, I’m in almost a trance-like state, putting shots together very quickly on the timeline before taking a more meticulous approach. I know the ballistics of the Avid, how it behaves, how long it takes between commands. It’s still the best way for me to connect emotionally to the material. Plus on a technical level – in terms of media management, having multiple users, VFX editors, two assistant editors – it’s still the best.

In a past life you were a Smoke editor. Any closing thoughts about some of the other up-and-coming editing applications?

You certainly can’t call them the poor cousins of post-production. Especially Resolve. Our colorist, Asa Shoul, suggested I look at Resolve. He said, “Paul, you really should have a look, because not only is the cutting intuitive, but you could send me a sequence, to which I could apply a grade, and that would be instantly updated on your timeline.” Temp mixes from the sound department would work in a very similar way. I think that sort of cross-pollination of ideas from various departments, all contributing elements to a sequence, which itself is being continually updated, is a very exciting concept.

I wouldn’t be surprised if one day someone said, “Paul, we are doing this film, but we’re going to do it on Resolve, because we have this workflow in place and it’s really good.” At my advanced age of 49 [laugh], I don’t want to think, “Well, no, it’s Avid or nothing.” I think part of keeping your career fresh is the ability to integrate your skills with new workflow methods that offer up results, which would have seemed impossible ten years earlier. 

Images courtesy of Focus Features.

This article is also available at postPerspective.

For more, check out Steve Hullfish’s Art of the Cut interview with Paul Machliss.

©2021 Oliver Peters

Building a Scene

The first thing any film student learns about being an editor is that a film is not put together simply the way the editor thinks it should be. The editor is there as the right hand of the director working in service to the story.

Often a film editor will start out cutting while the film is still being shot. The director is on set or location and is focused on getting the script captured. Meanwhile, the editor is trying to “keep up to camera” and build the scenes in accordance with the script as footage is received. Although it is often said that the final edit is the last rewrite of any film, this first edited version is intended to be a faithful representation of the script as it was shot. It’s not up to the editor’s discretion to drop, change, or re-arrange scenes that don’t appear to work. At least not at this stage of the process.

Any good editor is going to do the best job they can to “sell” their cut to the director by refining the edits and often adding basic sound design and temp music. The intent is to make the story flow as smoothly as possible. Whether you call this a first assembly or the editor’s cut, this first version is usually based on script notes, possibly augmented by the director’s initial feedback during downtime from filming. Depending on the director, the editor might have broad license to use different takes or assemble alternate versions. Some directors will later go over the cut in micro detail, while others only focus on the broad strokes, leaving a lot of the editor’s cut intact.

Anatomy of a scene

Many editors make it their practice not to be on the set. Unfortunately the days of a crew watching “dailies” with the director are largely gone. Thus the editor misses seeing the initial reaction a director has to the material that has been filmed. This means that the editor’s first input will be the information written on the script and notes from the script supervisor. It’s important to understand that information.

A scene can be a complex dialogue interaction with multiple actors that may cover several pages. Or, it can be a simple transition shot to bridge two other scenes. While scenes are generally shot in multiple angles that are edited together, there are also scenes done as a single, unedited shot, called “oners.” A oner can be a complex, choreographed SteadiCam shot or it can be a simple static shot, like a conversation between a driver and passenger only recorded as a two-shot through the windshield. There are even films that are captured and edited as if they were a continuous oner, such as 1917 and Birdman or (The Unexpected Virtue of Ignorance). In fact, these films were cleverly built with seamless edits. However, individual component scenes certainly were actual oners.

The lined script

Scripts are printed as one-sided pages. When placed in a binder, you’ll have the printed text on the right and a blank facing page on the left (the backside of the previous script page). The script supervisor will physically or electronically (ScriptE) draw lines through the typed, script side of a page. These lines are labelled and represent each set-up and/or angle used to film the scene. Specific takes and notes will be written onto the left facing page.

Script scenes are numbered and systems vary around the world along with variations made by individual script supervisors. For US crews, it’s common to number angles and takes alphanumerically according to their scene numbers. A “master shot” will usually be a wide shot that covers the entire length of the scene. So for scene 48, the master shot will be labelled 48 or sometimes 48-ws, if it’s a wide shot. The scene/take number will also appear on the slate. The supervisor will draw a vertical line through the scene from the start to the end of the capture. Straight segments of the line indicate the person speaking is on camera. Wiggly or zig-zag segments indicate that portion of the scene will be on a different angle.

After the master, the director will run the scene again with different camera set-ups. Maybe it’s a tighter angle or a close-up of an individual actor in the scene. These are numbered with a letter suffix, such as 48A, 48B, and so on. A close-up might also be listed as 48A-cu, for example. Lengthy scenes can be tough to get down all at once without mistakes. So the director may film “pick-ups” – portions of a scene, often starting in the middle. Or there may be a need to record an alternate version of the scene. Pick-ups would be labelled 48-PU and an alternate would be A48. Sometimes a director will record an action multiple times in a row without stopping camera or re-slating. This might be the case when the director is trying to get a variety of actions from an actor handling a prop. Such set-ups would be labelled as a “series” (e.g. 48F-Ser).

On the left facing page, the script supervisor will keep track of these angles and note the various takes for each – 48-1, 48-2, 48-3, 48A-1, 48A-2, etc. They will also add notes and comments. For example, if a prop didn’t work or the actor missed an important line. And, of course, the take that the director really liked will be circled and is known as the “circle take.” In the days of physical film editing, only circle takes were printed from the negative to work print for the editors to use. With modern digital editing, everything is usually loaded in the editing system. The combination of drawn, lined set-ups with straight and zig-zag line segments together with circle takes provides the editor with a theoretical schematic of how a scene might be cut together.
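The US-style slate labels described above (48 for the master, 48A for a new set-up, 48-PU for a pick-up, A48 for an alternate, 48F-Ser for a series) follow a consistent enough pattern that they can be parsed programmatically. Here is a small, hypothetical sketch of such a parser – real productions and script supervisors vary, so treat the pattern as illustrative rather than a standard.

```python
import re

# Hypothetical parser for US-style slate labels: a leading letter marks an
# alternate (A48), a trailing letter marks the set-up (48A), and a dash
# suffix marks a pick-up (48-PU), series (48F-Ser), or descriptive note.

PATTERN = re.compile(
    r"^(?P<alt>[A-Z])?(?P<scene>\d+)(?P<setup>[A-Z])?(?:-(?P<suffix>\w+))?$"
)

def parse_slate(label):
    m = PATTERN.match(label)
    if not m:
        raise ValueError(f"unrecognized slate label: {label}")
    suffix = (m.group("suffix") or "").upper()
    return {
        "scene": int(m.group("scene")),
        "setup": m.group("setup"),                # None means the master shot
        "alternate": m.group("alt") is not None,  # A48 = alternate of scene 48
        "pickup": suffix == "PU",
        "series": suffix == "SER",
    }
```

For example, `parse_slate("48F-Ser")` reports scene 48, set-up F, flagged as a series, while `parse_slate("A48")` flags the alternate version of the scene.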

The myth of the circle take

A circle take indicates a take that the director preferred. However, this is often based on the script supervisor’s observation of the director’s reaction to the performance. The director may or may not actually have indicated that’s the one and only take to use. Often a circle take is simply a good performance take, where actors and camera all hit their marks, and nothing was missed. In reality, an earlier take might have been better for the beginning of the scene, but the actors didn’t make it all the way through.

There are typically three scenarios for how a director will direct the actors in a scene. A) The scene has already been rehearsed and actions defined, so the acting doesn’t change much from take to take. The director is merely tweaking nuance out of the actors to get the best possible performance. B) The director has the actors ramp up their intensity with each take. Early takes may have a more subtle performance while later takes feature more exaggerated speech and mannerisms. C) The director wants a different type of performance with each take. Maybe sarcastic or humorous for a few, but aggressive and angry for others.

Depending on the director’s style, a circle take can be a good indication of what the editor should use – or it can be completely meaningless. In scenario A, it will be pretty easy to figure out the best performances and usually circle takes and other notes are a good guide. Scenario B is tougher to judge, especially in the early days of a production. The level of intensity should be consistent for a character throughout the film. Once you’ve seen a few days of dailies you’ll have a better idea of how characters should act in a given scene or situation. It’s mainly a challenge of getting the calibration right. Scenario C is toughest. Without actually cutting some scenes together and then getting solid, direct feedback from the director, the editor is flying blind in this situation.

Let’s edit the scene

NLEs offer tools to aid the editor in scene construction. If you use Avid Media Composer, then you can avail yourself of script-based editing. This lets you organize script bins that mimic a lined script. The ScriptSync option removes some of the manual preparation by phonetically aligning ingested media to lines of dialogue. Apple Final Cut Pro editors can also use keywords to simulate dialogue lines.

A common method going back to film editing is the organization of “KEM rolls.” These are string-outs of selected takes placed back-to-back, which enables fast comparisons of different performances. In the digital world this means assembling a sequence of best takes and then using that sequence as the source for your scene edit. Adobe Premiere Pro and Media Composer are the two main NLEs that facilitate easy sequence-to-sequence editing.

The first step before you make any edit is to review all of the dailies for the scene. The circle takes are important, but other takes may also be good for portions of the scene. The director may not have picked circle takes for the other set-ups – 48A, 48B, etc. If that’s the case, you need to make that selection yourself.

You can create custom columns in a Media Composer bin. Create one custom column to rank your selections. An “X” in that column is for a good take. “XX” for one that can also be considered. Add your own notes in another custom column. Now you can use Media Composer’s Custom Sift command to show/hide clips based on these entries. If you only want to see the best takes displayed in the bin, then sift for anything with an X or XX in that first custom column. All other clips will be temporarily hidden. This is a similar function to showing Favorites in a Final Cut Pro Event. At this point you can either build a KEM Roll (selects) first or just start editing the scene.
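The ranking-and-sift workflow above can be mimicked outside the NLE with a few lines of code. This is only a stand-in to show the logic – the column names are illustrative, not Media Composer’s actual bin fields.

```python
# Stand-in for the Custom Sift workflow: rank takes with "X"/"XX" in one
# field, then filter the bin down to just the ranked clips.

bin_clips = [
    {"name": "48-3",  "rank": "X",  "notes": "best overall energy"},
    {"name": "48-5",  "rank": "XX", "notes": "good top, stumbles at end"},
    {"name": "48A-1", "rank": "",   "notes": "boom in shot"},
    {"name": "48A-2", "rank": "X",  "notes": "circle take"},
]

def sift(clips, ranks=("X", "XX")):
    """Show only the ranked takes, hiding everything else - like Custom Sift."""
    return [c for c in clips if c["rank"] in ranks]

selects = sift(bin_clips)   # 48-3, 48-5, and 48A-2 remain visible
```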

Cutting a scene together is a bit like playing chess or checkers. Continuity of actors’ positions, props, and dialogue lines often determines whether a certain construct works. If an actor ad libs the lines, you may have a lengthy scene in which certain bits of dialogue are in a different order or even completely different words from one take to the next. If you pick Take 5 for the master shot, this can block your use of some other set-ups, simply because the order of the dialogue doesn’t match. Good editing can usually overcome these issues, but it limits your options and may result in a scene that’s overly cutty.

Under ideal conditions, the lines are always said the same way and in the right order, props are always handled the same way at the same times, and actors are in their correct positions at the same points in the dialogue. Those scenes are a dream to cut. When they aren’t, that’s when an editor earns his or her pay.

When I cut a scene, I’ve reviewed the footage and made my selections. My first pass is to build the scene according to what’s in my head. Once I’ve done that I go back through and evaluate the cut. Would a different take be better on this line? Should I go to a close-up here? How about interspersing a few reaction shots? After that round, the last pass is for refinement. Tighten the edits, trim for J-cuts and L-cuts, and balance out audio levels. I now have a scene that’s ready to show to the director and hopefully put into the ongoing assembly of the film. I know the scene will likely change when I start working one-on-one with the director, but it’s a solid starting point that should reflect the intent and text of the script.

Happy editing!

©2021 Oliver Peters

Santa’s Cutting Room

Holiday time is a joyous affair, but it’s also the time of the year when retailers roll out their specials and cyber-deals. In these past almost-two years, many editors have spent a significant amount of time cutting from home. Many have been working with ad hoc set-ups, like spreading a laptop and drives across the dining room table. If that’s you, maybe now is the time to create a more comfortable and productive editing environment.

I’ve discussed the minimalist approach in the past and that’s what I’m revisiting in this post. The objective is to set up a powerful room built for modern workflows, but with a light footprint. My goal is a comfortable cutting room, not a high-end grading or mixing suite. For example, with editing in mind, you don’t necessarily need a large video reference display and AJA or Blackmagic Design I/O hardware. You can if you want to, but it’s not essential.

I’m going to describe a layout designed around a laptop with an external display. While you could certainly go with a workstation, laptops offer you mobility, should you need to move the project to a location. Modern laptops, like the new 14″ and 16″ MacBook Pros, have more than enough horsepower to compete with most desktop systems. However, if mobility is less important and you still want the latest Apple hardware, then a Mac mini or a 24″ iMac might best fit the bill. Currently these use the entry-level M1 chip. While offering plenty of power for most users, you might want to wait until 2022 before committing, since more powerful versions are expected. After a couple of years running an older 15″ MacBook Pro plus a Dell display, I reconfigured my home cutting room around a 27″ Intel iMac. This best matches my current needs, since I rarely work on-location anymore.

Since I typically work with Macs, I’m going to focus on them here. If you prefer Windows PCs, then simply substitute your favorite – maybe a Dell, HP, or even a custom Puget Systems machine. If you are purchasing a new Mac with an Apple Silicon M1 integrated SoC, then make sure the hardware and software you add is compatible with the M1 chipset, as well as the latest macOS. While I have tried to make selections that are compatible, I don’t have personal experience with each and every item, so do your own research before you buy. My whole point here is to spark some ideas that may help you home in on just the right room layout and tools for you.

Click the embedded links that appear throughout this post for more specific product information.

The room

For the sake of discussion, I’m talking about repurposing a spare bedroom, office, cave, etc. into a permanent editing suite. However, this doesn’t mean a total rehab. You certainly could do that, but I’m not advocating making the walls soundproof, adding a compression ceiling, or anything crazy. If that is your goal, then check out some of my old posts about DIY edit suite construction. There are three key considerations – power, your sitting/standing preference, and sound treatment.

Most modern residential buildings have adequate power in the normal outlets to power your gear. However, I would highly recommend bringing in an electrician to wire dedicated circuits just for your computer and accessories. After all, you don’t want your RAID on the same circuit as the kitchen microwave or toaster. In addition, add an uninterruptible power supply, like an APC Smart-UPS or CyberPower Systems UPS. A single floor unit (1500VA) will be sufficient to provide power through momentary blackouts, as well as some voltage regulation. Also include a few standard power strips.
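If you want to sanity-check whether a single 1500VA unit covers your gear, just tally the wattage of everything on the circuit against the UPS’s real-power rating. A rough back-of-the-envelope sketch in Python – the wattage figures below are illustrative guesses, not measurements, so substitute the numbers from your own equipment labels:

```python
# Rough UPS load check. All wattages here are hypothetical examples;
# read the actual figures from your gear's labels or power supplies.
loads_watts = {
    "laptop + display": 250,
    "RAID enclosure": 100,
    "powered speakers": 80,
    "dock, drive dock, misc": 50,
}

total_watts = sum(loads_watts.values())

# A typical 1500VA consumer UPS is rated for roughly 900W of real power.
ups_capacity_watts = 900

load_fraction = total_watts / ups_capacity_watts
print(f"Total load: {total_watts} W")
print(f"UPS load: {load_fraction:.0%}")  # a lighter load means longer runtime
```

Keeping the load well under the rated capacity gives you more runtime to save your work and shut down cleanly during an outage.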

The chair and desk will be the most important purchases over time. Whether you prefer to sit, stand, or mix it up is a personal choice. I like large flat table surfaces in the 3′ x 6′ range. To cover all the bases, look at something like a Jarvis Bamboo Standing Desk. Make sure you pick an adjustable height model with electrical control and add the option of cable trays for neater cable management. Your sit/stand preference will dictate whether you need a standard office chair or stool. I’m a fan of the Herman Miller Aeron and Mirra 2 models for their quality and durability, but an X-Chair or similar gaming chair is also a high-quality alternative.

Finally, the acoustic wall treatments. You may have enough furnishings and irregular surfaces in the room already to sufficiently kill sound reflections. If not, a good solution will be sound panels that can be mounted on the walls, like color-coordinated kits from Acoustical Solutions.

Computer and peripherals

As I stated up top, this room is designed around a laptop with an external display. My top laptop pick right now is Apple’s 16″ MacBook Pro with the M1 Max chip. Get the 10-core (CPU) / 32-core (GPU) model with 64GB RAM and a 1TB internal drive. Except for the drive, this is maxed out, so it should be ready for anything you toss at it. Some will prefer the 14″ MacBook Pro, which is smaller yet offers similar power. (Although some have noted that the “notch” is more intrusive on the 14″ models.)

Be sure to add an external Bluetooth keyboard, mouse, and/or trackpad. I recommend also adding a Rain Design mStand laptop stand and an Anker USB dock for additional USB-A ports. If you prefer to run your laptop in the clamshell mode and tuck it out of the way, then go for the Twelve South BookArc. In that case, the smaller 14″ MacBook Pro might be the better choice, since the laptop screen is less important in daily use.

When it comes to the display, the goal is a large primary interface display that’s good (enough) for critical video evaluation. But it’s not a dedicated video display. The laptop’s screen becomes your secondary display if you configure this as a dual-display layout. There are plenty of options, but my top three choices are the Apple Pro Display XDR, Apple’s version of the LG UltraFine 5K display, and the ASUS ProArt PA32UCG-K.

If you prefer a matched dual-display layout, instead of using the laptop screen as a second display, then you may want to consider a pair of the LG UltraFine 4K models. Remember to add AppleCare to any of the products purchased from Apple. Yes, it’s an “insurance” policy, but I have actually had to use it in the past with an older Apple laptop. I was glad I had it then.

Storage

The type and capacity of required storage will depend on the projects you tackle. If you work with many projects or a lot of media, then I would recommend two RAID choices. The Promise RAIDs are tried and true, so the Pegasus32 R4, R6, or R8 models stand out. Another interesting option is OWC’s Thunderbay Flex 8, because you can mix both SSD/NVMe and spinning drives in the same enclosure. The enclosure also adds some expansion capability.

Because you never have enough ports, look at the OWC Thunderbolt 3 Pro Dock. It may be unnecessary if you purchase the Thunderbay Flex 8, since that unit already adds some additional ports – though depending on your setup, you may need both.

I archive all client projects onto removable drives. If you work with such raw drives, make sure to get a fast drive dock, such as the OWC Drive Dock USB-C. This is handy not only for archiving, but if the dock and drive are fast enough, you can also edit directly from it, should quick revisions be required in the future. For media, my current choice is Seagate’s Ironwolf Pro line. Regardless of brand or capacity, select drives that run at 7200 RPM or faster, carry a 5-year limited warranty, and are rated for NAS use.

If you prefer to use small external drives or need them for location work, then you can’t beat the Samsung T7, G-Drive Pro, or Glyph Atom SSDs.

Audio hardware

While I think you can do without an external video monitor in this type of room, I do believe you need good audio monitoring. I’m not a fan of working and mixing 100% of the time using only headphones, so speakers and an audio interface are important. How much you invest in audio gear will depend on whether you want to work in stereo only or also in surround. While I think you can adequately mix and master simple stereo projects (for TV and the web) in this environment, I don’t recommend that for surround. For surround, the intent is purely monitoring while editing, using either an LCR or a 5.1 configuration.

The most straightforward is a stereo-only configuration. There are plenty of interface options, but my top choices are the PreSonus AudioBox USB 96, PreSonus Studio 24c, Apogee Symphony Desktop, or Focusrite Scarlett 2i2. If you want or need to monitor LCR or 5.1, then this requires an audio interface with additional discrete analog line outputs. 5.1 requires six outputs, but most units will come in configurations with eight line outputs. I have not personally tested these, but based on my research, I would recommend the PreSonus Studio 1824c, Focusrite Scarlett 18i20, or Focusrite Pro Red 16Line.

If the interface uses macOS Core Audio, then it should be compatible. However, if a separate software driver is required, then make sure that it will work with the M1 Macs and the latest macOS version. If you use powered speakers, then you can design the room without an external audio mixer, unless you prefer to add one.

Powered speakers eliminate the need for additional amps, wiring, a mixer, etc. This fits with the minimalist ethos. There are plenty of great speakers to choose from, but it’s a highly subjective choice. My top picks include Genelec, ADAM Audio, PreSonus, KRK, and M-Audio. Obviously, for stereo you’ll only need a pair of speakers; for 5.1 surround, you’ll need to purchase five matched speakers, plus a subwoofer. I also recommend speaker stands or risers – especially for any speaker sitting on the desk. These isolate vibration between the speaker and the desk and elevate the cones to the right height for your ears. I currently use speaker ISO-Stands from isoAcoustics.

Having speakers won’t eliminate the need to use headphones from time to time, so pick some good ones. AKG studio models are my go-to. Need to record any scratch tracks? Don’t forget a basic microphone. I’m fond of RØDE’s NT-USB Mini or their PodMic (with stand).

Video software / plug-ins

I’m not going to dwell on the choice of editing software (NLE). The options are all good and what you use will depend on preference or business/project requirements. I’m talking about Avid Media Composer, Adobe Premiere Pro, Apple Final Cut Pro, or DaVinci Resolve. Of course, you can also toss in Lightworks and Media 100. Of these, Premiere Pro, FCP, and Resolve will be the most up-to-date with Mac compatibility. If your needs are simple, then you are probably fine using the tools as they come. But if your needs are more advanced or you are required to interchange projects with users who are running other NLEs, then you’ll have to augment the NLE with third-party tools.

Final Cut Pro users who exchange files with Premiere Pro editors or send sequences to a mixer using Pro Tools will need to invest in interchange products from Intelligent Assistance, XMiL, and/or Marquis Broadcast. These are available through the Apple App Store.

If you perform advanced color correction, but want to stay within the NLE, then Media Composer editors can choose to add the Symphony option. Final Cut Pro editors have several choices, including plug-ins and applications from Color Finale or Coremelt. Other tools available to the various NLEs are FilmConvert’s Nitrate and CineMatch. Need more effects grunt than what the native effects and transitions offer? The best all-around package is Continuum from Boris FX.

Audio plug-ins

Like video, most of these NLEs have a solid selection of built-in audio effect/filter plug-ins. Nevertheless, a third-party tool can often handle tough sound situations better. One such area is audio clean-up. My top three choices are the Accusonus ERA bundle, Klevgrand Brusfri, and iZotope RX. The first two offer good real-time performance and work well applied to clips or tracks. RX does include some real-time plug-ins, but you are better off using the standalone application that’s part of the RX bundle.

If you need to add more processing (EQ, compression, limiting), then I really like the various FabFilter plug-ins. But, there are also some good free plug-ins from Tokyo Dawn Labs and TBProAudio.

Other software applications

To wrap it up, don’t forget about some other useful tools. Even though you may use Photoshop, it’s still a good investment to add Pixelmator Pro and Affinity Photo. Each offers some graphics capabilities that Photoshop doesn’t. For example, Photoshop no longer supports certain font types as editable text within .PSD files. However, Affinity Photo can edit these, which helps with legacy files.

You are probably using either Apple Compressor or Adobe Media Encoder to batch-encode camera footage, generate files for review, and create web deliverables. Yet, there are still formats that these skip. One solution is the free (donation requested) Shutter Encoder. Two other applications in my video player toolkit are the free VLC – an all-purpose media player – and Telestream Switch. The latter is a bit pricey, but is my go-to application for detailed video analysis and QC. It can also transcode single video files when needed. Finally, if you do a lot of work with batches of files, Better Rename is one of my most-used applications.
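Better Rename is a GUI tool, but if you’re comfortable with a little scripting, the same kind of sequential batch rename can be done in a few lines. A minimal Python sketch – the folder path, prefix, and numbering scheme here are hypothetical examples, not anything the tools above mandate:

```python
import pathlib

def batch_rename(folder, prefix, ext=".mov"):
    """Rename every file with the given extension in `folder` to a
    sequentially numbered scheme: prefix_001.mov, prefix_002.mov, ...
    Sorting first keeps the numbering stable and repeatable."""
    files = sorted(p for p in pathlib.Path(folder).iterdir() if p.suffix == ext)
    for i, p in enumerate(files, start=1):
        p.rename(p.with_name(f"{prefix}_{i:03d}{ext}"))

# Example (hypothetical path):
# batch_rename("/Volumes/Archive/clips", "interview")
```

A dedicated utility still wins for one-off jobs with previews and undo, but a script like this is handy when the same rename has to happen on every ingest.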

OK, there you have the round-up to build a powerful, yet minimalist cutting room from the ground up. Remember that minimalist does not mean cheap. Heck, I’d venture to guess that if you max out this list, you might be pushing $35K. This is an investment in your business and future, so bite off what makes sense now and leave the rest for later. In any case, enjoy the holidays and maybe this rundown will give you some ideas as to what to put on the wishlist for Santa!

©2021 Oliver Peters