Think you can mix?

Are you aspiring to be the next Chris Lord-Alge or Glyn Johns? Maybe you just have a rock ‘n roll heart. Or maybe you want to try your hand at mixing music, but don’t have the material to work with. Whatever your inspiration, Lewitt Audio – the Austrian manufacturer of high-quality studio microphones – has made it easier than ever to get started. A while back, Lewitt launched the myLEWITT site as a user community, featuring educational tips, music challenges, and free content.

Even though the listed music challenge contests may have expired, Lewitt leaves the content online and available to download for free. Simply create a free myLEWITT account to access them. These are individual .wav stem tracks of the complete challenge songs recorded using a range of Lewitt microphones. Each file is labelled with the name of the mic used for that track. That’s a clever marketing move, but it’s also handy if you are considering a mic purchase. Naturally these tracks are only for your educational and non-commercial use.

Since these are audio files and not specific DAW projects, they are compatible with any audio software. If you are a video editor, it’s certainly possible to mix these tracks in an NLE, like Premiere Pro, Media Composer, or Final Cut Pro. However, I wouldn’t recommend that. First of all, DAW applications are designed for mixing and NLEs aren’t. Second, if you are trying to stretch your knowledge, then you should use the correct tool for the job – especially if you are going to go out on the web for mixing tips and tricks from noted recording engineers and producers.

Start with a DAW

If you are new to DAW (digital audio workstation) software, then there are several free audio applications you might consider just to get started. Mac users already have GarageBand. Of course, most pros wouldn’t consider that, but it’s good enough for the basics. On the pro level, Reaper is a popular free DAW application. Universal Audio offers Luna for free, if you have a compatible UA Thunderbolt audio interface.

As a video editor, you might also be getting into DaVinci Resolve. Both the free and paid Studio versions integrate the Fairlight audio page. Fairlight, the company, had a well-respected history in audio prior to its acquisition by Blackmagic Design, which has continued to build upon that foundation. This means that not only can you do sophisticated audio mixes for video in Resolve, but there’s no reason you can’t start and end in the Fairlight page for a music project.

The industry standard is Avid Pro Tools. If you are planning to work in a professional audio environment like a recording studio, then you’ll really want to know Pro Tools. Unfortunately, Avid discontinued their free Pro Tools|First version. However, you can still get a free, full-featured 30-day trial. Plus, the subscription costs aren’t too bad. If you have an Adobe Creative Cloud subscription, then you also have access to Audition as part of the account. Finally, if you are deep into the Apple ecosystem, then I would recommend purchasing Logic Pro, which is highly regarded by many music producers. 

Taking the plunge

In preparing this blog post, I downloaded and remixed one of the myLEWITT music challenge projects – The Seeds of your Sorrow by Spitting Ibex. This downloaded as a .zip containing 19 .wav files, all labelled according to instrument and microphone used. I launched Logic Pro, brought in the tracks, and lined them up at the start so that everything was in sync. From there it’s just a matter of mixing to taste.
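Before importing, it can be worth a quick sanity check that all of the stems share the same sample rate and length – if they do, snapping them to the start of the timeline is all the sync you need. Here’s a minimal Python sketch using the standard-library wave module (the folder name is hypothetical):

```python
import glob
import wave

# Inspect every stem in the unzipped folder (the path is made up).
for path in sorted(glob.glob("seeds_of_your_sorrow/*.wav")):
    with wave.open(path, "rb") as w:
        rate = w.getframerate()
        frames = w.getnframes()
        print(f"{path}: {w.getnchannels()} ch, {rate} Hz, "
              f"{frames / rate:.2f} s")
```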

Logic is great for this type of project because of its wealth of included plug-ins. Logic is also a good host application for third-party plug-ins, such as those from iZotope, Waves, Accusonus, and others. Track stacks are a versatile Logic feature. You can group a set of tracks (like all of the individual drum kit tracks) and turn those into a track stack, which then functions like a submix bus. The individual tracks can still be adjusted, but you can also adjust levels on the entire stack. Track stacks are also great for visual organization of your track layout. You can show or hide all of the tracks within a stack simply by twirling a disclosure triangle.
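Conceptually, a track stack behaves like a summing bus: each track keeps its own fader, and the stack applies one more gain stage to the sum. Here’s a toy numpy sketch of that signal flow (the stems and gain values are invented for illustration):

```python
import numpy as np

def db_to_gain(db: float) -> float:
    """Convert a fader setting in dB to a linear gain factor."""
    return 10 ** (db / 20)

# Hypothetical drum stems, already aligned and the same length.
kick, snare, overheads = (np.random.randn(48000) * 0.1 for _ in range(3))

# Per-track faders first, then one fader on the whole stack (the bus).
stack = (kick * db_to_gain(-3.0)
         + snare * db_to_gain(-4.5)
         + overheads * db_to_gain(-6.0))
drum_bus = stack * db_to_gain(-2.0)  # trims the entire stack at once
```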

I’m certainly not an experienced music mixer, but I have mixed simple projects before. Understanding the process is part of being a well-rounded editor. In total, I spent about six hours over two days mixing the Spitting Ibex song. I’ve posted it on Vimeo as a clip with three sections – the official mix, my mix, and the unmixed/summed tracks. My mix was relatively straightforward. I wanted an R&B vibe, so no fancy left-right panning, voice distortions, or track doubling.

I mixed it totally in Logic Pro using mainly the native plug-ins for EQ, compression, reverb, amp modeling, and other effects. I also used some third-party plug-ins, including iZotope RX8 De-click and Accusonus ERA De-esser on the vocal track. As I brightened the vocal track to bring it forward in the mix, it also emphasized certain mouth sounds caused by the singer’s proximity to the mic. These plug-ins helped to tame those. I also added two final mastering plug-ins: Tokyo Dawn’s Nova for slight multi-band compression, along with FabFilter’s Pro-L2 limiter. The latter is one of the smoothest mastering plug-ins on the market and is a nice way to add “glue” to the mix.
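For the curious, here’s the crudest possible illustration of what a limiter does. A real limiter like Pro-L2 works sample-by-sample with look-ahead and attack/release smoothing, so treat this Python sketch purely as a conceptual stand-in:

```python
import numpy as np

def naive_limiter(mix: np.ndarray, ceiling_db: float = -1.0) -> np.ndarray:
    """Scale the mix so its loudest peak sits at the ceiling.
    Real limiters act per-sample with look-ahead and attack/release
    smoothing; this only shows the basic gain-reduction idea."""
    ceiling = 10 ** (ceiling_db / 20)
    peak = np.max(np.abs(mix))
    return mix * (ceiling / peak) if peak > ceiling else mix
```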

If you decide to download and play with the tracks yourself, then check out the different versions submitted to the contest, which are showcased at myLEWITT. For a more detailed look into the process, Dutch mixing/mastering engineer and YouTuber Wytse Gerichhausen (White Sea Studio) has posted his own video about creating a mix for this music challenge.

In closing…

Understand that a great music mix starts with a tight group of musicians and high-quality recordings. Without those, it’s hard to make magic. With those, you are more than three-quarters of the way there. Fortunately Lewitt has taken care of that for you.

The point of any exercise like this is to learn and improve your skills. Learn to trust your ears and taste. Should you remove the breaths in a singer’s track? Should the mix be wetter (more reverb) or not? If so, what sort of reverb space? Should the bottom end be fatter? Should the guitars use distortion or be clean? These are all creative judgements that can only be made through trial and error and repeated experimentation. If music mixing is something you want to pursue, then the Produce Like A Pro YouTube channel is another source of useful information.

Let me leave you with some pro tips. At a minimum, make sure to mix any complex project on quality nearfield monitors (assuming you don’t have an actual studio at your disposal). Test your mix in different listening environments, on different speakers, and at different volume levels to see if it translates well everywhere. If you are going for a particular sound or style, have some good reference tracks, such as commercially mastered songs, to which you can compare your mix. How did they balance the instruments? Did the reference song sound bright, boomy, or midrange-heavy? How were the dynamics and the level of compression? And finally, take a break. Every mixer gets fatigued, and mixes will often sound quite different after a break or on the next day. Sometimes it’s best to leave it and come back later with fresh ears and a fresh mind.

In any case, you can get started without spending any money. The tracks are free. Software like DaVinci Resolve is free. As with so many other tasks enabled by modern technology, all it takes is making the first move.

©2022 Oliver Peters

Pro Tips for FCP Editors

Every nonlinear editing application has strengths and weaknesses. Each experienced editor has a list of features and enhancements that they’d like to see added to their favorite tool. Final Cut Pro has many fans, but also its share of detractors, largely because of Apple’s pivot when Final Cut Pro changed from FCP7 to FCPX a decade ago. That doesn’t mean it isn’t up to professional-level work. In fact, it’s a powerful tool in its own right. But there are ways to adapt it to workflows you may miss from competing NLEs. I discuss five of these tips in my article Making Final Cut More Pro over at FCP.co.

©2022 Oliver Peters

Analogue Wayback, Ep. 1

Christmas Eve and the Radio Skywave

Any editor who’s been in the business for a few decades has certainly accumulated their share of oddball anecdotes and knowledge tied to techniques and processes lost to history. I’m no exception. And so, I’ve decided to start posting a few of these. However, I’ll start this first one in the land of AM radio.

My first official job in media was as a part-time disc jockey during my senior year of high school. It was a small, 1,000-watt AM radio station in central Florida that covered a range of musical programming – country in the morning, MOR (middle of the road) midday, Top 40 early in the evening, and more album cuts at the end of the night. That last part, 10 PM to midnight (our sign-off time), was my shift.

AM radio, like shortwave radio, transmits a signal that bounces between the ground and the ionosphere. Sunlight during the daytime excites the atoms of the ionosphere, which absorbs the radio signal instead of reflecting it, thus limiting its distance. As a result, during the day a 1,000-watt AM station’s signal can’t be heard much past the county line. Stations in different cities are therefore able to occupy the same frequency without interference.

With the sun gone at night, the ionosphere is less excited. The radio signal can punch through to a higher layer and ricochet a greater distance, thanks to a phenomenon called skywave, aka skip. If the transmitter continues to operate at full power, the signal will then travel far enough to interfere with any radio station on that same frequency in another city. To mitigate this effect and reduce interference, the FCC requires most AM radio stations to reduce power at sundown – down to 250 watts in our case.
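For a sense of scale, that sundown cut from 1,000 watts to 250 watts works out to about a 6 dB drop in transmitter power – shown here in Python for anyone who wants to check the math:

```python
import math

# Power ratio expressed in decibels: 10 * log10(P2 / P1)
print(10 * math.log10(250 / 1000))  # ≈ -6.0 dB
```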

The exceptions are clear channel stations, which are allowed to operate at full power – typically 50,000 watts. These stations are assigned a frequency that is not also allocated in another market and, therefore, won’t create interference. In central Florida, we routinely picked up a station from Nashville at night. Such stations in the US and Mexico play a key part in rock ‘n roll history and radio lore – think Wolfman Jack, American Graffiti, and ZZ Top.

At our little station, one of the DJ’s responsibilities during morning or evening shifts was to raise transmitter power at 6AM sign-on and lower it at sundown. During the holidays, shifts were re-arranged to accommodate Christmas and New Year programming. So on Christmas Eve, I was on the late afternoon shift and got off right at the time of the power change. On this night, our country music morning DJ, who was a bit of a local celebrity, was on until midnight doing his annual special Christmas Eve program. He was a rather jolly old elf who arrived for his shift ready for the night – a tray of brownies, a loaf of bread and cold cuts, and some libations.

On schedule, I reduced the transmitter power to 250 watts, did my end-of-shift paperwork, and prepared to turn things over. On my way out the door I reminded my colleague that I’d already taken care of the power shift. “That’s OK, I’ve got it,” he replied. To my surprise, he walked back to the transmitter and switched it back up to full power. A nice, strong signal for his Christmas Eve special, I suppose!

The rest of this story is secondhand. As I understand it, the station GM was driving around later that evening and realized the station had a much stronger signal than usual. He immediately knew what must have happened. The conversation went something like this (insert your own expletives to taste):

GM: “What are you doing! They must be able to hear us clear up to Georgia!”

DJ: “I have no idea what you’re talking about.”

GM: “Yes you do! Bring the power down right now before we get into serious trouble.”

DJ: “OK.”

As far as I know, that’s what happened. It’s just one of those experiences that remind you the radio business is (or at least was) a truly crazy place.

©2022 Oliver Peters

Last Night in Soho

Edgar Wright has written and directed a string of successful comedies and cult classics. His latest, Last Night in Soho, is a change from that pattern. It’s a suspense thriller that would make Hitchcock proud. Eloise (Thomasin McKenzie) is a young English country girl who’s moved to the Soho area of London to study fashion design. Her nightly dreams transport her into the past of the swinging 60s and the Soho nightlife, where she observes and becomes strangely intertwined with the life of Sandie (Anya Taylor-Joy), an aspiring singer. Those dreams quickly evolve into nightmares as the story turns more sinister.

Last Night in Soho was edited by Paul Machliss, ACE, a frequent collaborator with Wright. Machliss edited Scott Pilgrim vs. The World, The World’s End, and Baby Driver. The latter picked up a BAFTA win and an Oscar nomination for best editing. I recently had a chance to interview Machliss during a break from cutting his next feature, The Flash.

______________________________

Music was a key plot device and impetus for much of the editing in Baby Driver. In Last Night in Soho, music of the 1960s also plays a key role, but in a less blatant manner.

Baby Driver was about capital E editing. Edgar and I took a lot of what we learned and applied it to Soho. Can we do it in a more subtle way? It’s not immediately obvious, but it’s probably just as intense. It just doesn’t show itself to the same level.

Many shots that look like sophisticated visual effects were done practically, such as the dance sequence involving both lead actresses and Matt Smith. What were some of the other practical effects?

One clever shot was the bit in the phone box, where right at the end of the dance, they go in for a snog. Initially the reflection is that of Matt [Jack] and Anya [Sandie]. Then as Matt pulls back, you realize that it’s become Thomasin [Eloise] in the mirror. It was shot with a real mirror when they start kissing. In the middle of the kiss, there’s an effects team yanking the mirror back to reveal Thomasin and a stand-in behind it. The visual effect is simply to get rid of the moment when the mirror is yanked.

I’d love to say it was full of subtle things that couldn’t have been done without editing. However, it almost goes back to the days of Vaudeville or the earliest days of cinema. Less is more – those simple things are still incredibly effective.

As with Baby Driver, I presume you were editing primarily on the set initially?

I was on set every day. We did about three weeks straight of night shoots at the height of summer in Soho. I wouldn’t say we owned Soho, but we could literally run anywhere. We could park in Soho Square and wheel the trolleys wherever we needed for the purposes of filming. However, Soho didn’t close down – life carried on. It was fascinating to see the lifestyle of Soho change when you are filming from 6:00 PM till 6:00 AM. It’s in the small hours that elements of the ‘darker’ side of Soho start to appear and haunt the place. So that slightly sinister level of it hasn’t gone away.

Being on set had a lot to do with things like music playback, motion control cameras, and certainly lighting, which played an even bigger part in this film than it did in Baby Driver. I found myself somewhat responsible for the timing, making sure that Edgar’s idea of how he wanted the lighting to work in sync with the music was 100% successful on the night, because there was no fixing it afterwards.

Some editors avoid being on set to maintain objectivity. Your working style seems to be different.

My MO really is not to do the perfect assembly edit in the midst of all the madness of filming. What I’m trying to do for Edgar is a proof of concept. We know as we’re shooting that issues can arise from continuity, camera angles, and various other things. So part of what I’m doing on set is keeping an eye on all the disparate elements from a technical perspective to make sure they’re all in harmony to support Edgar’s vision. But as the editor, I still need to make time for an assembly. Sometimes that meant getting to set an hour or two early. Then I’d just sit at the Avid with headphones and quickly catch up on the previous day’s work, where I do try and make more of a concerted effort to cut a good scene. Then that gets passed on to the editorial department. By the time Edgar and I get back into the cutting room, we have a fully assembled film to start working on.

Lighting design – especially the neon sign outside of Ellie’s window – drives the scene transitions.

Edgar’s original brief was, “There is a point where the lighting goes from blue, white, red, blue, white, red, and then whenever she transitions into the past, it goes into a constant flashing red.” That’s something that any good lighting operator can cue quite simply. What made it more interesting is that Edgar said, “What I’d love is for the lighting to flash subtly, but in time to every different bit of music that Eloise puts on her little record player.” Then it was like, “Oh, right, how do we do that?”

Bradley Farmer on our music team was able to break the songs down into a kind of beat sheet with all the lyrics and the chorus. Edgar would go, “According to the storyboards this is the line in the song that I’d like it to start going red, red, red.” Armed with that knowledge, I had one track of audio with the music in mono and another track with longitudinal timecode – a different hour code for every song. I would edit full screen color images of red, white, and blue as a reference on the Avid to match what the timing of the color changes should be against the track.

Next, I was able to export an XML file out of the Avid, which could be read by the lighting panel computer. The lighting operator would load these sequences in so the panel would know when to have the lights in their ‘on’ or ‘off’ state. He also had a QuickTime reference from the Avid so he could see the color changes against the burnt-in timecode and know, “Next one’s red, program the SkyPanel to go red.”

Our music playback guy, Pete Blaxill, had copies of the tracks in stereo and was able to use my timecode as his base timecode. He then sent that timecode to the lighting panel. So if Edgar goes, “I now want to pick up the song from the first chorus,” then the lighting panel would chase the playback timecode. Once the sequence was set at the lighting panel, wherever Edgar wanted to go, the lighting desk knew which part of the song we were at and what the next color in the sequence was.

To make things a tad more complex Edgar wanted to shoot some of the action to the playback at 32fps so there could be a dreamlike quality to movement at certain points in the song. This meant creating a lighting map that would work at 32fps, as well as the regular 24fps version. Bradley Farmer gave me a version of the songs that were exactly 33.3% faster and pitch-corrected. I reformatted my Avid sequence so everything just went on and off a third faster for the 32fps tracks. And once again, gave the XML file to the lighting guy and the sped-up tracks to Pete for his Pro Tools system.
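As an editorial aside, that 33.3% figure falls straight out of the frame-rate math: 32fps footage played back at 24fps runs at three-quarter speed, so anything the actors mime to on set has to run a third faster to land back in sync. A quick check:

```python
# Shot at 32 fps but played back at 24 fps, the footage runs at 75% speed.
# On-set playback must be sped up by the inverse ratio to stay in sync.
shoot_fps, playback_fps = 32, 24
slowdown = playback_fps / shoot_fps           # 0.75 (dreamlike slow motion)
speed_up = shoot_fps / playback_fps           # 1.333...
print(f"{(speed_up - 1) * 100:.1f}% faster")  # -> 33.3% faster
```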

I realized that the motion control camera could also be triggered by external timecode from Pete’s Pro Tools system. We utilized that at the climax of ‘You’re My World’ where Anya descends the staircase in the Cafe de Paris ballroom. This was a two-layer composite shot filmed with a motion control camera in multiple passes. Thomasin does one pass descending the staircase and then we did Anya’s pass. We also had singer Beth Singh [Cilla Black] in the foreground of the shot with her backing band behind her. Pete Blaxill would hit play on his Pro Tools. The music would be coming through the foldback for Beth to mime to, the lighting switched in at the right musical point, and then at exactly the right moment in the chorus the Pro Tools timecode would trigger the MoCo. I remember sitting cross-legged on the floor out of the way in the corner and watching all this happen. It’s incredibly ‘nerdy,’ but it gave one a wonderful feeling of satisfaction to be part of the team that could make moments like this happen seamlessly.

What formats were used to capture Last Night in Soho?

This was all shot 35mm, except that we used an ARRI Alexa for the night shoots. We tested both 16mm and 35mm for these exteriors. The 16mm looked good, but was too grainy and the highlights flared out tremendously. The 35mm was okay, but required a lot more lighting to achieve the desired look. Of course with the Alexa being digital, you put it there without any extra lighting at all, look at the result, and go, “Gosh, look at the amount of detail – that’s almost usable as it is.” Just because of the variety of shots we needed, Edgar and DP Chung-hoon Chung decided to use the Alexa for exterior night locations and any interior location where we would be looking out into the street at some point – for example, the library that Eloise visits later in the film.

Does the choice of film rather than digital acquisition add some technical challenges?

Fortunately, Kodak is still a company that is churning out 35mm stock full-time. The infrastructure for it, though, is getting smaller and smaller. Even in the time between doing The World’s End and Last Night in Soho, there is now only one facility left in the UK that can process 35mm. And it has to process the rushes for every production that’s shooting on 35mm in the UK.

Even so, it’s great to think that a medium like 35mm can still support something as complicated as one of Edgar’s films without having to do everything in the digital domain. You’ve got the [Panavision] Millennium XL cameras – even though they’re nearing 20 years old, they’re pretty solid. 

Edgar is totally aware that he’s not shooting on an ‘archaic’ format for the sake of it, but that it’s his preferred medium of acquisition – in the same way some painters choose water-based paints as opposed to oil-based. He knows what medium he’s on and he respects that. You might say, “Yes, but the punter won’t necessarily know or appreciate what format this was shot on.” However, it helps to contribute to the feeling of our production. It’s part of the look, part of the patina – that slightly organic feel of the media brings so much to the look of Soho.

Because you are cutting on set, you’re working from a video tap off of the film camera.

Right. Once again, I had a little network set up with a large spool of Cat 5 cable connected to the Qtake system. And I would put myself out of the way in a corner of the soundstage and get on with it. I would just be a crew member quietly doing my job editing, not bothering Edgar. Sometimes he might not see me for a couple of hours into the day until he needed my input on a shot.

So that means at some point the low-resolution video tap clips have to be replaced by the actual footage after the film has been transferred.

That’s right. I used my editorial team from Baby Driver – Jerry Ramsbottom and Jessica Medlycott. Both of them were well-versed in the technique of over-cutting the low-res video tap footage once the hi-res Avid media came back from the lab. I never had to worry about that. The cutting room was also in Soho and when we were shooting there, if Edgar ever had a question about the footage or the previous day’s edit he would say, “Could we meet an hour early (before call-time) and pop into the cutting room to look at some stuff?” It also meant that we could tweak the edit before that day’s shoot and give Edgar a better idea of his goals for that day.

Please tell me about the production and post timeline, considering that this was all impacted by the pandemic.

Prep started at the beginning of 2019. I came on board in April. The shoot itself went from May till early September. We then had the director’s cut period, which due to the complexity took slightly longer than your average 10 weeks.

We worked up until the new year and had a preview screening early in January, which got some good numbers back. Edgar was able to start talking to the studio about some additional photography. We planned that shoot for the last week of March 2020. But at the last minute it was cancelled as we entered the first lockdown. However, visual effects work continued, because all of that could be done remotely. We had a weekly VFX review using Zoom and Cinesync with Double Negative for about a four-month period during the first lockdown. That was a bit of a lifesaver for us to know that the film still had a heartbeat during that time.

Things calmed down at the end of July – enough for the studio to consider allowing us to remount the additional photography. And of course, it was a new world we were living in by that stage. Suddenly we were working in ‘zones’ on set. I was assigned the zone with the video department and couldn’t go on to set and work with Edgar. We had an area – divided by plastic fencing – where we could be. We would have to maintain a distance with him on one side and me on the other. Fortunately, I had my edit-trolley modified so that the A-Grade monitor was on a swivel-mount and that’s how he was able to keep an eye on the progress of the work.

We were only the second production in England to resume at that time. I think other productions were watching us and thinking, “Would we all just collapse like flies? Would COVID just go ‘BAM!’ and knock us all out?” Overall, the various PCR testing and new health and safety procedures added about an extra 20% to the budget, but it was the only way we were going to be allowed to shoot.

The reshoots went very well and we had another preview screening in October and the numbers were even better. But then we were approaching our second lockdown in the UK. However, this time Edgar and I were able to see post-production all the way through. All of our dates at Twickenham Studios for the sound mix and at Warner Bros. De Lane Lea for the grade could be honored, even though strict safety precautions were in place.

We delivered the film on December 18th of last year, having made all the various HDR and SDR versions for the UHD Blu-ray, as well as general release. We did a wonderful Dolby Vision/Dolby Atmos version, which is actually the ultimate way to see the film. Because of the pandemic and the lack of open theaters at the time, there was a school of thought that this film should be released directly onto a streaming platform. Fortunately, producers Eric Fellner and Nira Park viewed the final grade and mix at Twickenham and said, “No, no, this is a cinematic experience. It was designed to be seen on the big screen, so let’s wait. Let’s bide our time.”

Edgar Wright was also working on the documentary, The Sparks Brothers. Was the post on these two simultaneous? If so, how did this impact his availability to you during the Soho post?

The timelines of the two projects were kind of parallel. Making a documentary is about gathering tons of archival footage and obtaining new interviews. Then you can leave it with the editor to put a version of it together. I remember that Edgar had done a ton of interviews before we started shooting Soho. He’d done all of the black-and-white interviews in London or in LA pre-pandemic. The assembly of all of that footage happened during our shoot. Then when the lockdown occurred, it was very easy for Paul Trewartha, the editor of the Sparks documentary, to carry on working from home.

When we came back, the Soho and Sparks cutting rooms were both on different floors of our edit facility on Wardour Street in Soho. They were on the first floor and we were on the top floor. Edgar would literally bounce between the two. It got a little bit fraught for Edgar, because the grading and dubbing for both films happened at the same time at different facilities. I remember Edgar had to sign off on both films on December 18th. So he had to go to one facility to watch Soho with me and then he went off to watch Sparks with Paul, which I imagined gave him quite a degree of satisfaction to complete both huge projects on the same day.

To wrap it up, let’s talk edit systems. Avid Media Composer is still your go-to NLE. Right?

Avid, yeah. I’ve been running it for God knows how long now. It’s a little like touch-typing for me – it all just happens very quickly. When I watch all the dailies of a particular scene, I’m in an almost trance-like state, putting shots together very quickly on the timeline before taking a more meticulous approach. I know the ballistics of the Avid, how it behaves, how long it takes between commands. It’s still the best way for me to connect emotionally to the material. Plus on a technical level – in terms of media management, having multiple users, VFX editors, two assistant editors – it’s still the best.

In a past life you were a Smoke editor. Any closing thoughts about some of the other up-and-coming editing applications?

You certainly can’t call them the poor cousins of post-production. Especially Resolve. Our colorist, Asa Shoul, suggested I look at Resolve. He said, “Paul, you really should have a look, because not only is the cutting intuitive, but you could send me a sequence, to which I could apply a grade, and that would be instantly updated on your timeline.” Temp mixes from the sound department would work in a very similar way. I think that sort of cross-pollination of ideas from various departments, all contributing elements to a sequence, which itself is being continually updated, is a very exciting concept.

I wouldn’t be surprised if one day someone said, “Paul, we are doing this film, but we’re going to do it on Resolve, because we have this workflow in place and it’s really good.” At my advanced age of 49 [laugh], I don’t want to think, “Well, no, it’s Avid or nothing.” I think part of keeping your career fresh is the ability to integrate your skills with new workflow methods that offer up results, which would have seemed impossible ten years earlier. 

Images courtesy of Focus Features.

This article is also available at postPerspective.

For more, check out Steve Hullfish’s Art of the Cut interview with Paul Machliss.

©2021 Oliver Peters

Building a Scene

The first thing any film student learns about being an editor is that a film is not put together simply the way the editor thinks it should be. The editor is there as the right hand of the director working in service to the story.

Often a film editor will start out cutting while the film is still being shot. The director is on set or location and is focused on getting the script captured. Meanwhile, the editor is trying to “keep up to camera” and build the scenes in accordance with the script as footage is received. Although it is often said that the final edit is the last rewrite of any film, this first edited version is intended to be a faithful representation of the script as it was shot. It’s not up to the editor’s discretion to drop, change, or re-arrange scenes that don’t appear to work. At least not at this stage of the process.

Any good editor is going to do the best job they can to “sell” their cut to the director by refining the edits and often adding basic sound design and temp music. The intent is to make the story flow as smoothly as possible. Whether you call this a first assembly or the editor’s cut, this first version is usually based on script notes, possibly augmented by the director’s initial feedback during downtime from filming. Depending on the director, the editor might have broad license to use different takes or assemble alternate versions. Some directors will later go over the cut in micro detail, while others only focus on the broad strokes, leaving a lot of the editor’s cut intact.

Anatomy of a scene

Many editors make it their practice not to be on the set. Unfortunately the days of a crew watching “dailies” with the director are largely gone. Thus the editor misses seeing the initial reaction a director has to the material that has been filmed. This means that the editor’s first input will be the information written on the script and notes from the script supervisor. It’s important to understand that information.

A scene can be a complex dialogue interaction with multiple actors that may cover several pages. Or, it can be a simple transition shot to bridge two other scenes. While scenes are generally shot in multiple angles that are edited together, there are also scenes done as a single, unedited shot, called “oners.” A oner can be a complex, choreographed Steadicam shot or it can be a simple static shot, like a conversation between a driver and passenger recorded only as a two-shot through the windshield. There are even films that are captured and edited as if they were one continuous oner, such as 1917 and Birdman or (The Unexpected Virtue of Ignorance). In fact, these films were cleverly built with seamless edits. However, the individual component scenes certainly were actual oners.

The lined script

Scripts are printed as one-sided pages. When placed in a binder, you’ll have the printed text on the right and a blank facing page on the left (the backside of the previous script page). The script supervisor will physically or electronically (ScriptE) draw lines through the typed, script side of a page. These lines are labelled and represent each set-up and/or angle used to film the scene. Specific takes and notes will be written onto the left facing page.

Script scenes are numbered, and numbering systems vary around the world, along with variations made by individual script supervisors. For US crews, it’s common to number angles and takes alphanumerically according to their scene numbers. A “master shot” will usually be a wide shot that covers the entire length of the scene. So for scene 48, the master shot will be labelled 48 or sometimes 48-ws, if it’s a wide shot. The scene/take number will also appear on the slate. The supervisor will draw a vertical line through the scene from the start to the end of the capture. Straight segments of the line indicate the person speaking is on camera. Wiggly or zig-zag segments indicate that portion of the scene will be covered by a different angle.

After the master, the director will run the scene again with different camera set-ups. Maybe it’s a tighter angle or a close-up of an individual actor in the scene. These are numbered with a letter suffix, such as 48A, 48B, and so on. A close-up might also be listed as 48A-cu, for example. Lengthy scenes can be tough to get down all at once without mistakes. So the director may film “pick-ups” – portions of a scene, often starting in the middle. Or there may be a need to record an alternate version of the scene. Pick-ups would be labelled 48-PU and an alternate would be A48. Sometimes a director will record an action multiple times in a row without stopping camera or re-slating. This might be the case when the director is trying to get a variety of actions from an actor handling a prop. Such set-ups would be labelled as a “series” (e.g. 48F-Ser).
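If you ever have to wrangle a pile of these labels – renaming dailies, say – the convention is regular enough to parse automatically. Here’s a rough Python sketch covering only the variants described above (real-world slating practices vary):

```python
import re

# Matches labels like "48", "48-ws", "48A-cu", "A48", "48-PU", "48F-Ser".
PATTERN = re.compile(
    r"^(?P<alt>[A-Z])?"               # leading letter = alternate scene (A48)
    r"(?P<scene>\d+)"                 # the scene number itself
    r"(?P<setup>[A-Z])?"              # trailing letter = additional set-up (48A)
    r"(?:-(?P<tag>ws|cu|PU|Ser))?$"   # optional descriptor, pick-up, or series tag
)

for label in ["48", "48A-cu", "A48", "48-PU", "48F-Ser"]:
    m = PATTERN.match(label)
    print(label, "->", m.groupdict() if m else "no match")
```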

On the left facing page, the script supervisor will keep track of these angles and note the various takes for each – 48-1, 48-2, 48-3, 48A-1, 48A-2, etc. They will also add notes and comments. For example, if a prop didn’t work or the actor missed an important line. And, of course, the take that the director really liked will be circled and is known as the “circle take.” In the days of physical film editing, only circle takes were printed from the negative to work print for the editors to use. With modern digital editing, everything is usually loaded in the editing system. The combination of drawn, lined set-ups with straight and zig-zag line segments together with circle takes provides the editor with a theoretical schematic of how a scene might be cut together.

The myth of the circle take

A circle take indicates a take that the director preferred. However, this is often based on the script supervisor’s observation of the director’s reaction to the performance. The director may or may not actually have indicated that’s the one and only take to use. Often a circle take is simply a good performance take, where actors and camera all hit their marks, and nothing was missed. In reality, an earlier take might have been better for the beginning of the scene, but the actors didn’t make it all the way through.

There are typically three scenarios for how a director will direct the actors in a scene. A) The scene has already been rehearsed and actions defined, so the acting doesn’t change much from take to take. The director is merely tweaking nuance out of the actors to get the best possible performance. B) The director has the actors ramp up their intensity with each take. Early takes may have a more subtle performance while later takes feature more exaggerated speech and mannerisms. C) The director wants a different type of performance with each take. Maybe sarcastic or humorous for a few, but aggressive and angry for others.

Depending on the director’s style, a circle take can be a good indication of what the editor should use – or it can be completely meaningless. In scenario A, it will be pretty easy to figure out the best performances and usually circle takes and other notes are a good guide. Scenario B is tougher to judge, especially in the early days of a production. The level of intensity should be consistent for a character throughout the film. Once you’ve seen a few days of dailies you’ll have a better idea of how characters should act in a given scene or situation. It’s mainly a challenge of getting the calibration right. Scenario C is toughest. Without actually cutting some scenes together and then getting solid, direct feedback from the director, the editor is flying blind in this situation.

Let’s edit the scene

NLEs offer tools to aid the editor in scene construction. If you use Avid Media Composer, then you can avail yourself of script-based editing. This lets you organize script bins that mimic a lined script. The ScriptSync option removes some of the manual preparation by phonetically aligning ingested media to lines of dialogue. Apple Final Cut Pro editors can also use keywords to approximate this kind of organization by dialogue line.

A common method going back to film editing is the organization of “KEM rolls.” These are string-outs of selected takes placed back-to-back, which enables fast comparisons of different performances. In the digital world this means assembling a sequence of best takes and then using that sequence as the source for your scene edit. Adobe Premiere Pro and Media Composer are the two main NLEs that facilitate easy sequence-to-sequence editing.

The first step before you make any edit is to review all of the dailies for the scene. The circle takes are important, but other takes may also be good for portions of the scene. The director may not have picked circle takes for the other set-ups – 48A, 48B, etc. In that case, you need to make that selection yourself.

You can create custom columns in a Media Composer bin. Create one custom column to rank your selections. An “X” in that column is for a good take. “XX” for one that can also be considered. Add your own notes in another custom column. Now you can use Media Composer’s Custom Sift command to show/hide clips based on these entries. If you only want to see the best takes displayed in the bin, then sift for anything with an X or XX in that first custom column. All other clips will be temporarily hidden. This is a similar function to showing Favorites in a Final Cut Pro Event. At this point you can either build a KEM Roll (selects) first or just start editing the scene.
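Under the hood, Custom Sift is just filtering rows on a column value, which is easy to picture in code. A toy sketch with made-up clip names:

```python
# A hypothetical bin: each clip carries a custom rank column.
clips = [
    {"name": "48-2",  "rank": "X",  "note": "best opening"},
    {"name": "48-5",  "rank": "XX", "note": "circle take"},
    {"name": "48A-1", "rank": "",   "note": "camera bump at line 3"},
]

# The "sift": show only ranked takes and hide everything else.
selects = [clip for clip in clips if clip["rank"]]
for clip in selects:
    print(clip["name"], clip["rank"], "-", clip["note"])
```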

Cutting a scene together is a bit like playing chess or checkers. Continuity of actors’ positions, props, and dialogue lines often determines whether a certain construct works. If an actor ad libs the lines, you may have a lengthy scene in which certain bits of dialogue are in a different order or even completely different words from one take to the next. If you pick Take 5 for the master shot, this can block your use of some other set-ups, simply because the order of the dialogue doesn’t match. Good editing can usually overcome these issues, but it limits your options and may result in a scene that’s overly cutty.

Under ideal conditions, the lines are always said the same way and in the right order, props are always handled the same way at the same times, and actors are in their correct positions at the same points in the dialogue. Those scenes are a dream to cut. When they aren’t, that’s when an editor earns his or her pay.

When I cut a scene, I’ve reviewed the footage and made my selections. My first pass is to build the scene according to what’s in my head. Once I’ve done that I go back through and evaluate the cut. Would a different take be better on this line? Should I go to a close-up here? How about interspersing a few reaction shots? After that round, the last pass is for refinement. Tighten the edits, trim for J-cuts and L-cuts, and balance out audio levels. I now have a scene that’s ready to show to the director and hopefully put into the ongoing assembly of the film. I know the scene will likely change when I start working one-on-one with the director, but it’s a solid starting point that should reflect the intent and text of the script.

Happy editing!

©2021 Oliver Peters