Analogue Wayback, Ep. 8

Nonlinear editing in the early days.

At the dawn of the “Hollywood East” days in central Florida, our post house, Century III, took up residence at Universal Studios Florida. As we ramped up the ability to support episodic television series production, a key member joined our team. John Elias was an A-list Hollywood TV series editor who’d moved to Florida in semi-retirement. Instead of retiring, he joined the company as our senior film editor.

Based on John’s experience in LA, our NLE of choice at that time was the Cinedco Ediflex. Like many early NLEs, the Ediflex was a Rube Goldberg contraption. The edit computer controlled 12 VHS decks designed to mimic random-access playback. The edit interface was controlled using a light pen. The on-screen display emulated numbered dialogue lines and vertical take lines somewhat like a script supervisor’s notation – only without any text for dialogue.

Shooting ratios were reasonable in those days; therefore, a day of filming was generally no more than an hour of footage. The negative would come back from the lab. The colorist would transfer and sync dailies in the telecine room, recording color-corrected footage to the camera master reels (1″ Type C). Dailies would then be copied to 3/4″ U-matic to be loaded into the Ediflex by the assistant editor. This included adding all the script information in a process called Script Mimic. The 3/4″ footage was copied to 12 duplicate VHS videocassettes (with timecode) for the Ediflex to use as its media.

The decks were industrial-grade JVC players. Shuttling and cueing performance was relatively fast with 60-minute cassettes. To edit and/or play a scene, the Ediflex would access the VHS deck closest to the desired timecode, cueing and switching between different decks to maintain real-time playback of a sequence. On most scripted shows, the editor could get through a scene or often even a complete act without the need to wait on a machine to cue or change videocassette loads.

Rough cuts were recorded back to 3/4″ for review by the director and producers. When the edit was locked, an EDL was transferred via floppy disk to the online edit bay where the film transfer masters were conformed at full quality. Since the Ediflex systems used an internal timebase corrector, image quality was reasonable, and you could easily check for issues like proper lip-sync. So, while the system was a bit clunky in operation, it was head and shoulders better for film and TV offline editing than the digital upstarts – mainly thanks to better image quality.

We leased four full Ediflex systems plus assistant stations by 1990. John and our team of offline editors cut numerous series, films, and even some custom, themed attraction programs. It was in this climate that Avid Technology came onto the scene and had to prove itself. We had seen one of the earliest Avid demos at NAB when they were still back in the “pipe-and-drape” section of the show floor. Most editors who were there will likely remember the Top Gun demo project and just how awful the image quality really was. There was no common computer media architecture, so Avid had to invent their own. As I remember, these were 4-bit images – very low-res and very posterized. But, most of the core editing functions were in place. 

Avid was making headway among commercial editors. The company was also eager to gain traction in the long-form market. As the resident post house on a major studio lot in a promising new market, making a successful sale to us was definitely of interest to them. To see if this was the right fit, Avid brought in a system for John to test. Avid legend Tom Ohanian also came down to train and work with John and run through a test project. They took an episode of one of the shows to see how the process compared with our experiences using Ediflex.

Unfortunately, this first test wasn’t great. Well into the week, the Mac’s internal drive crashed, corrupting the project file and bins. With the project gone, the media was useless, which meant starting over. Needless to say, John was not a happy camper. When the cut was done, we decided to have the series producer review the rough cut to see if the quality was acceptable. This was the Swamp Thing series – a Universal property that you can still find in distribution. The show is dark and scenes generally happen at night. As the producer reviewed the edit, it was immediately clear that the poor image quality was a no-go at that time. It was impossible to see sync or proper eyeline on anything other than close-up shots. That temporarily tanked our use of Avid for series work.

Fast forward a couple of years. Avid’s image quality had improved and there was at least one in town. As a test run, we booked time on that unit and I cut a statewide citrus campaign. This was a more successful experience, so we soon added an Avid to our facility. This eventually grew to include several Media Composers, shared storage, and even a hero room rebuilt around Avid Symphony (a separate unit from Media Composer back then). Those systems were ultimately used to cut many shows, commercials, feature films, and IllumiNations: Reflections of Earth – a show that enjoyed a 20-year run at EPCOT.

©2022 Oliver Peters

Boris FX Optics 2022

Boris FX is a respected developer of visual effects tools for video. With the introduction of Optics in 2020, Boris FX further extended that expertise into the photography market. Optics installs as a plug-in for Adobe Photoshop, Lightroom, and Bridge. Optics also installs as a standalone app that supports a variety of still image formats, including camera RAW. So, if you’ve avoided an Adobe subscription, you are still in luck. Before you go any further, I would encourage you to read my 2020 review of Optics (linked here) for an overview of how it works and how to use it.

How has Optics 2022 changed? 

Since that introduction in late 2020, Optics has gone through several free updates, but the 2022 version requires a small upgrade fee for existing users. If you are new to Optics, it’s available through subscription or perpetual licensing and includes a trial period to test the waters.

At first glance, Optics 2022 looks and operates much like the previous versions. Key changes and improvements for Optics 2022 include Mac M1 native support, Metal acceleration of most Sapphire filters, UI enhancements, and mask exchange with Photoshop. However, the big new features include the introduction of a Particle Illusion category with over 1700 emitters, more Sapphire filters, and the Beauty Studio filter set from Continuum. The addition of Particle Illusion might seem a bit odd for a photography application, but by doing so, Boris FX has enhanced Optics as a graphic design tool.

Taking those point-and-shoot photos into Optics 

I’ve used Optics since its introduction and was eager to review Optics 2022 when Boris FX contacted me. There was a local British car show this past Saturday – a superb opportunity to take some photos of vintage Jags, MGs, Minis, Bentleys, Triumphs, and Morgans on a sunny Florida weekend. To make this more real, I decided to shoot the stills with my plain vanilla iPhone SE 2020 using FiLMiC Pro’s FirstLight still photo app. Somewhere along the line, iOS and FirstLight have been updated to allow camera RAW photography. This wasn’t initially available and technically the SE doesn’t support Apple’s ProRAW codec. However, FirstLight now enables RAW recording of DNG files, which are kissing cousins of ProRAW. In RAW mode, you get the full 4:3, 12MP sensor image; alternate aspect ratios and in-app film emulations are disabled.

After a morning of checking out classic cars, I returned home, AirDropped the stills to my iMac, and started testing Optics. Since these are RAW photos, the first step in Photoshop is to make any adjustments in the Adobe Camera RAW module before the photo opens in Photoshop. Next, send the layer to Optics, which launches the Optics 2022 application and opens that image in the Optics interface. When you’ve completed your Optics adjustments, click Apply to send the image back to Photoshop as a flat, rasterized image layer or a smart filter.

Working with layers and filters

As I discussed in my 2020 post, Optics itself is a layer-based system, similar to Photoshop. Each layer has separate blend and masking controls. Typically you add one effect per layer and stack more layers as you build up the look. The interface permits you to enable/disable individual layers, compare before and after versions, and adjust the display size and resolution.

Effects are organized into categories (FilmLab, Particle Illusion, Color, Light, etc.) and then into groups of filters within each category. For example, the Stylize category includes the various Sapphire paint filters. Each filter selection includes a set of presets. When you apply a filter preset, the parameters panel allows you to fine-tune the look of that effect, so you aren’t locked into the preset.

In addition to the parameters panel, many of the effects include on-screen overlay controls for visual adjustment. This is especially helpful with the Particle Illusion effects. For instance, you can change or modify the path of a lightning bolt by moving the on-screen points of the emitter.

Handling file formats

Optics supports TIFF, JPEG, PNG, and RAW formats, so you can open those files directly in Optics without Photoshop. In the case of my DNG files, the first effect to be applied is a Develop filter. You can tweak the image values much like in the Adobe Camera RAW module. The operation for creating your look is the same as when you come from Photoshop, except that there is no Apply function. You will need to Save or Save As to export a flat, rasterized TIFF, PNG, or JPEG file.

Unlike Photoshop, Optics does not have its own layered image format. You can save and recall a set-up. So if you’ve built up a series of filter layers for a specific look, simply save that set-up as a file (minus the image itself). This can be recalled and applied to any other image and modified to adapt that set-up for the new image. If you save the file in the TIFF format, then you have the option to save it with the set-up embedded. These files can be opened back up in Optics along with the various filter layers for further editing.

As I worked through my files on my iMac, Optics 2022 performed well, but I did experience a number of crashes of the Optics application itself. When Optics crashes, you lose any adjustments made to the image in Optics. However, when I tested Optics 2022 on my mid-2014 15″ MacBook Pro using the same RAW images, the application was perfectly stable. So it could be some sort of hardware difference between the two Macs.

Here’s one workflow item to be aware of between Photoshop and Optics. If you crop an image in Photoshop, the area outside of the crop still exists, but is hidden. That full image without the crop is the layer sent to Optics. If you apply a stylized border effect, the border is applied to the edges of the full image. Therefore, some or all of the border will be cropped upon returning to Photoshop. Optics includes internal crop controls, so in that instance, you might wish to crop in Optics first, apply the border, and then match the crop for the whole image once back in Photoshop.

All in all, it’s a sweet application that really helps when you’re stuck for ideas about how to elevate an image above the mundane. Getting great results is fast and quite enjoyable – not to mention, infinitely easier than in Photoshop. Overall, Optics is a great tool for any photographer or graphic designer.

Click through the gallery images below to see further examples of looks and styles created with Boris FX Optics 2022.

©2022 Oliver Peters

Analogue Wayback, Ep. 7

Clickety, clackety, click.

Prior to the advent of Microsoft PowerPoint, large-scale corporate presentations often involved elaborate multimedia productions. Support for keynote addresses, trade shows, and board of directors meetings took the form of audio-visual productions using multiple, synchronized 35mm slide projectors as the playback system. In the 1970s and even 1980s, if you wanted to wow an audience with high-res images spread across an entire stage, that’s how it was done. There were numerous production companies and even an NAB-style trade show dedicated specifically to this art and technology. If you said that you produced corporate presentations or multimedia, everyone knew that this was what you were talking about.

All visuals had to end up on a 35mm slide. This included photographic images, graphics, and text. Proper alignment of the projectors at the time of the presentation was paramount to get the correct seamless effect. The projected light could be controlled, which enabled dissolves and composites. Any large image that was a composite of several slides spread horizontally across the stage had to be correctly divided and masked. This feathered mask let the light additively blend across slides for a seamless image.

Typical presentations played from a bank of six to 21 stacked projectors. Picture was accompanied by a powerful soundtrack. The common audio playback device was a 4-track recorder (often a Teac). This allowed for a stereo track, a guard band, and a timecode track to trigger the projectors. Audio timecode, pulses, or even punch tape were methods used for synchronization. A central controller was the brain and the AVL Eagle was the king in that field, if memory serves me right. AVL was the multimedia world’s equivalent to CMX for video editors.

At the height of multi-image technology, some systems even integrated lip-sync soundbites. Obviously you could integrate a film projector for playback. However, a second, more ingenious way was demoed at one of the shows. The on-camera speaker was filmed and those frames were copied to individual 35mm slides. The system could advance at about 20fps and maintain lip-sync with material filmed at that rate. The clip was played by cycling through the consecutive slides that made up the soundbite!

During my time in Jacksonville, our company had a separate department devoted to creating multimedia productions. One of their clients was an oil company with a presence in Alaska. In order to optimize portability for traveling, the show director developed a package using the 110 slide format. Up to nine 110 slide projectors, complete with playback system, fit neatly into a road case!

Since that time, I’ve worked numerous corporate shows as an on-site editor. The keynote presentation is often supported by three or four PowerPoint designers who are handling all of the graphics and speaker support images from their laptops. That’s a huge advancement. Nevertheless, it’s sad to have witnessed the demise of a rather ingenious presentation form.

©2022 Oliver Peters

Analogue Wayback, Ep. 6

This is the world’s most expensive stopwatch.

There are few clients who truly qualify as “the client from Hell”. Clients have their own stresses that may be unseen or unknown to the editor. Nevertheless, some create extremely stressful edit sessions. I previously wrote about the color bar fiasco in Jacksonville. That client returned for numerous campaigns. Many of the edit sessions were overnight and each was a challenge.

Editing with a client is all about the interpersonal dynamics. In this case, the agency came down with an entourage – director, creative director, account executive, BTS photographer, and others. The director had been a big-time commercial director in the days of cigarette ads on TV. When those were pulled, his business dried up. So he had a retainer deal with this agency. However, the retail spots that I was cutting were the only TV spots the agency (owned by a larger corporation as an in-house agency) was allowed to do. For much of the run, the retail spots featured a celebrity actor/spokesman, which probably explained the entourage.

Often editors complain about a client sitting next to them and starting to crowd their working space as that client edges closer to the monitor. In these sessions the creative director and director would sit on either side of me – left and right. Coming from a film background, they were less familiar with video advances like timecode and insisted on using stopwatches to time every take. Of course, given reaction times and the fact that the two never clocked the same length, there was a lot of, “Please rewind and play it again.” At least on one occasion I was prompted to point to the edit controller display and remind them that I had the world’s most expensive stopwatch right there. I could tell them exactly how long the clip was. But, to no avail.

The worst part was that the two would get into arguments with each other – across me! Part of this was just personality and part of it was that they had written the spots in the hotel room the night before the shoot. (Prior planning? Harumph!) In any case, there were numerous sessions when I just had to excuse myself from the room while they heatedly hashed it out. “Call me when you’ve made a decision.”

There was an ironic twist. One quiet gentleman in the back of the room seemed to be the arbiter. He could make a decision when neither of them would. At the beginning I had assumed that he was the person really in charge. As it turned out, he was the account executive and they largely discounted him and his opinions. Yet, he had the best understanding of their client, which is why, when all else failed, they deferred to him!

Over the course of numerous sessions we pumped out commercial campaigns in spite of the stress. But those sessions always stick in my mind as some of the quirkiest I’ve ever had.

©2022 Oliver Peters

Analogue Wayback, Ep. 5

Those are the wrong color bars.

In the late 70s our production company picked up a new commercial client. This was a home improvement retailer similar to Lowe’s or Home Depot. It was owned by a larger corporation and their in-house ad agency handled the TV spots for the retail outlet. Although headquartered in New York City, Jacksonville (where I was) provided the right combination of new stores, easy flights from NYC, and a local production company with high-end production and post capabilities.

This first outing with us was to produce a series of holiday commercials intended to run during the Christmas season. We had a mobile production unit, which back then meant a Winnebago decked out with an onboard RCA 2″ quad recorder and an RCA TKP-45 camera. Plus dolly, grip and lighting, sound, etc. The commercials were a series of choreographed spots featuring a group of singers and dancers. Quite the production. I was booked to start the edit first thing the next morning.

I arrived at work early, set up the VTRs, and mounted the camera master tape from the truck. When I first hit play, it struck me that the wrong color bar test signal was at the head of the reel. We used SMPTE bars inside the facility, but the truck engineer normally recorded full field color bars from the camera at the head of the location recordings. This tape had SMPTE bars. Hmmm….  I spun in a bit farther and hit play again (you can’t shuttle and see picture on the 2″ VTRs). Uh, oh. SMPTE bars. I quickly realized that the tape had color bars throughout. Somehow the entire production from the day before had been erased!

Next, it was time to greet the director and agency creatives for the first time as they arrived for the session and inform them that it was all gone. Just how you want to start off with a new client! Of course, being from NYC, their first reaction was to threaten a lawsuit. However, my immediate priority and our first obligation was still to deliver finished spots for air. After everyone calmed down, the production department was able to set up talent and location and do it all over again. Although the crew was scheduled to do another gig in Tampa, they managed to pull it off and still make the Tampa gig. And even better, nearly all of the talent, especially the leads, were available for the fast-turnaround reshoot.

How did this happen?

We were a 24-hour operation with three edit shifts servicing another weekly retail commercial account. Videotape edit master reels were typically prerecorded with a signal and timecode. That signal could either be black, a test pattern, or something else. We used house (SMPTE) color bars. This allowed the editor or VTR operator to work in the insert mode across any part of the tape. The second or third shift editor was usually the person who prepped these tapes, as they often had some available VTR time during these later hours.

What had happened was a true comedy of errors. I had left instructions with the evening shift editor to prep an edit master tape for me to use in the morning. During his shift, the truck engineer returned from the shoot and set the camera master reel on the counter, saying, “Here’s the tape for Oliver’s edit.” The evening editor went about his business and forgot about prepping a tape for me. On his way out the door at the shift change, the evening editor passed the task to the overnight editor, “Please prep the edit master tape for Oliver. It’s there on the counter.” Then, without reading any of the labels and stickers on the reel indicating that this was a camera master, that editor loaded the reel on the machine and hit record. That’s how I wound up with a tape full of color bars instead of raw footage of singers and dancers the next morning.

Fortunately the reshoot went well, and the client was willing to accept that this was a fluke that would never happen again. We did more sessions with them over the next couple of years. But they would never, ever again trust someone to handle the camera reel after the shoot, nor wait until the next day to start the edit. The night of the reshoot, the producer hand-carried the reel to the shop and we started the edit that evening, working all through the night. That’s a pattern that continued. If a shoot wrapped at 9PM, we’d start at 10PM and edit until the next morning as needed.

So let that be a lesson. Read the labels!

©2022 Oliver Peters