Boris FX Optics 2022

Boris FX is a respected developer of visual effects tools for video. With the introduction of Optics in 2020, the company extended that expertise into the photography market. Optics installs as a plug-in for Adobe Photoshop, Lightroom, and Bridge. It also installs as a standalone app that supports a variety of still image formats, including camera RAW. So, if you’ve avoided an Adobe subscription, you are still in luck. Before you go any further, I would encourage you to read my 2020 review of Optics (linked here) for an overview of how it works and how to use it.

How has Optics 2022 changed? 

Since that introduction in late 2020, Optics has gone through several free updates, but the 2022 version requires a small upgrade fee for existing users. If you are new to Optics, it’s available through subscription or perpetual licensing and includes a trial period to test the waters.

At first glance, Optics 2022 looks and operates much like the previous versions. Key changes and improvements for Optics 2022 include Mac M1 native support, Metal acceleration of most Sapphire filters, UI enhancements, and mask exchange with Photoshop. However, the big new features include the introduction of a Particle Illusion category with over 1700 emitters, more Sapphire filters, and the Beauty Studio filter set from Continuum. The addition of Particle Illusion might seem a bit odd for a photography application, but by doing so, Boris FX has enhanced Optics as a graphic design tool.

Taking those point-and-shoot photos into Optics 

I’ve used Optics since its introduction and was eager to review Optics 2022 when Boris FX contacted me. There was a local British car show this past Saturday – a superb opportunity to take some photos of vintage Jags, MGs, Minis, Bentleys, Triumphs, and Morgans on a sunny Florida weekend. To make this more real, I decided to shoot the stills with my plain vanilla iPhone SE 2020 using FiLMiC Pro’s FirstLight still photo app. Somewhere along the line, iOS and FirstLight have been updated to allow camera RAW photography. This wasn’t initially available and technically the SE doesn’t support Apple’s ProRAW codec. However, FirstLight now enables RAW recording of DNG files, which are kissing cousins of ProRAW. In RAW mode, you get the full 4:3, 12MP sensor image, but alternate aspect ratios and in-app film emulations are disabled.

After a morning of checking out classic cars, I returned home, AirDropped the stills to my iMac, and started testing Optics. Since these are RAW photos, the first step is to make any adjustments in the Adobe Camera RAW module before the photo opens in Photoshop. Next, send the layer to Optics, which launches the Optics 2022 application and opens that image in the Optics interface. When you’ve completed your Optics adjustments, click Apply to send the image back to Photoshop as a flat, rasterized image layer or a smart filter.

Working with layers and filters

As I discussed in my 2020 post, Optics itself is a layer-based system, similar to Photoshop. Each layer has separate blend and masking controls. Typically you add one effect per layer and stack more layers as you build up the look. The interface permits you to enable/disable individual layers, compare before and after versions, and adjust the display size and resolution.
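To make that model concrete, here is a minimal conceptual sketch in Python of how such a layer stack behaves. This is not Boris FX code; the class, the effect function, and the blending math are simplified assumptions of my own, meant only to show the idea of one effect per layer with its own blend and mask controls.

```python
import numpy as np

class EffectLayer:
    """One layer in the stack: an effect plus its own blend and mask controls."""
    def __init__(self, effect, opacity=1.0, mask=None, enabled=True):
        self.effect = effect      # callable taking and returning a float RGB image
        self.opacity = opacity    # 0.0-1.0 blend amount for this layer
        self.mask = mask          # optional per-pixel weights (H x W); None = whole frame
        self.enabled = enabled    # layers can be switched off for before/after checks

    def apply(self, image):
        if not self.enabled:
            return image
        processed = self.effect(image)
        weight = self.opacity if self.mask is None else self.opacity * self.mask[..., None]
        # Blend the processed result back over the incoming image
        return image * (1.0 - weight) + processed * weight

def render(image, layers):
    """Composite the stack in order, one effect per layer."""
    for layer in layers:
        image = layer.apply(image)
    return image

# Hypothetical example: a warm tint applied at 60% strength over a neutral gray frame
warm_tint = EffectLayer(lambda img: np.clip(img * [1.08, 1.0, 0.92], 0, 1), opacity=0.6)
result = render(np.full((4, 4, 3), 0.5), [warm_tint])
```

The key point is that each layer can be toggled, re-blended, or masked without disturbing the others, which is what makes building up a look incremental.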

Effects are organized into categories (FilmLab, Particle Illusion, Color, Light, etc.) and then into groups of filters within each category. For example, the Stylize category includes the various Sapphire paint filters. Each filter selection includes a set of presets. When you apply a filter preset, the parameters panel allows you to fine-tune the look and adjust that effect, so you aren’t locked into the preset.

In addition to the parameters panel, many of the effects include on-screen overlay controls for visual adjustment. This is especially helpful with the Particle Illusion effects. For instance, you can change or modify the path of a lightning bolt by moving the on-screen points of the emitter.

Handling file formats

Optics supports TIFF, JPEG, PNG, and RAW formats, so you can open those straight into Optics without Photoshop. In the case of my DNG files, the first effect to be applied is a Develop filter. You can tweak the image values much like in the Adobe Camera RAW module. The operation for creating your look is the same as when you come from Photoshop, except that there is no Apply function. You will need to Save or Save As to export a flat, rasterized TIFF, PNG, or JPEG file. 

Unlike Photoshop, Optics does not have its own layered image format. You can, however, save and recall a set-up. So if you’ve built up a series of filter layers for a specific look, simply save that set-up as a file (minus the image itself). This can be recalled, applied to any other image, and modified to adapt that set-up to the new image. If you save the file in the TIFF format, then you have the option to save it with the set-up embedded. These files can be opened back up in Optics along with the various filter layers for further editing.

Performance

As I worked through my files on my iMac, Optics 2022 performed well, but I did experience a number of crashes of just the Optics application. When Optics crashes, you lose any adjustments made to the image in Optics. However, when I tested Optics 2022 on my mid-2014 15″ MacBook Pro using the same RAW images, the application was perfectly stable, so the issue could be down to some hardware difference between the two Macs.

Here’s one workflow item to be aware of between Photoshop and Optics. If you crop an image in Photoshop, the area outside of the crop still exists, but is hidden. That full image without the crop is the layer sent to Optics. If you apply a stylized border effect, the border is applied to the edges of the full image. Therefore, some or all of the border will be cropped upon returning to Photoshop. Optics includes internal crop controls, so in that instance, you might wish to crop in Optics first, apply the border, and then match the crop for the whole image once back in Photoshop.

All in all, it’s a sweet application that really helps when you are stuck for ideas about how to elevate an image above the mundane. Getting great results is fast and quite enjoyable – not to mention, infinitely easier than building the same looks in Photoshop alone. Overall, Optics is a great tool for any photographer or graphic designer.

Click through the gallery images below to see further examples of looks and styles created with Boris FX Optics 2022.

©2022 Oliver Peters

Think you can mix… Round 2

A month ago I discussed LEWITT Audio’s music challenge mixing contest. While there’s no current contest, all of the past tracks are still available for download, so you can try your hand at mixing. To date, there are six songs running about 3 1/2 minutes each. LEWITT has selected a cross-section of eclectic European artists, showcasing styles that include R&B, rock, jazz, folk, swing, and punk. I mixed one of the songs for that first post, but decided to take my own suggestion and ended up mixing all six. I have posted that compilation to Vimeo.

As I previously mentioned, I don’t mix songs for a living, so why add this to my workload? Well, partially for fun, but also to learn. Consider it a video editor’s version of the “busman’s holiday.”

Each of the six songs poses different challenges. LEWITT’s marketing objective is to sell microphones and these recordings showcase those products. As such, the bands have been recorded in a live or semi-live studio environment. This means mics placed in front of instrument cabs, as well as mics placed all around the drum kit. Depending on the proximity of the band members to each other and how much acoustic baffling was used, some instrument tracks are more isolated than others. Guitars, bass, and keys might have additional direct input (DI) tracks for some songs, as well as additional overdubs for second and third parts. Track counts ranged from 12 to 28 per song. As is typical, drums had more tracks to deal with than any other instrument.

The performing styles vary widely, which also forces some engineering decisions about how you mix. Move Like A Ghost was deliciously nasty by design. Do you try to clean that up or just embrace it? 25 Reasons is closer to a live stage performance with tons of leakage between tracks.

The video component

LEWITT, in conjunction with the performers, produced videos for five of the six songs. The mixes are specifically for LEWITT, not the official release versions of any of these songs. Therefore, the LEWITT videos were designed to accompany and promote the contest tracks. This makes it easy to sync your mix to each video, which is what I did for my compilation. In the case of Move Like a Ghost, LEWITT did not produce a full video with Saint Agnes. So, I pulled a stylized live music video for the band from YouTube for my version of the mix. I assembled this reel in Final Cut Pro, but any editing was really just limited to titles and the ins/outs for each song. The point is the mix and not the editing.

On the other hand, working with the video did inform my mix. For example, if a lead instrument had a riff in the song that’s shown in the video, then I’d emphasize it just a bit more in the mix. That’s not a bad thing, per se, if it’s not overdone. One quirk I ran into was on The Palace. The tracks included several vocal passes, using different mics. I picked the particular vocal track that I thought sounded best in the mix. When I synced it up to the video, I quickly realized that one vocal line (on camera) was out-of-sync and that LEWITT’s mixers must have blended several vocal performances in their own mix. Fortunately, it was an easy fix to use that one line from a different track and then everything was in sync.

Working the DAW

One of the song downloads even included a Pro Tools session for Pro Tools users, but Logic Pro is my DAW of choice. Audition and Resolve (Fairlight) could also have been options; however, Logic comes with really good built-in plug-ins, including reverbs, EQs, compressors, and amp modelers. I used all of these, plus a few paid and free third-party plug-ins from Accusonus, Analog Obsession, iZotope, FabFilter, Klevgrand, Sound Theory, TBProAudio, and Tokyo Dawn Labs.

One big selling point for me is Logic’s track stack feature, which is a method of grouping and organizing tracks and their associated channel strips. Use stacks to organize instrument tracks by type, such as drums, guitars, keys, vocals, etc. A track stack can be a folder or a summing stack. When summing is selected, the track stack functions like a submix bus: channel strips within the stack are summed, and additional plug-ins can then be applied to the summing stack. If you think in terms of an NLE, then a track stack is a bit like a compound clip or nest. You can collapse or expand the tracks that have been grouped into the stack with a reveal button. Want to bring your visual organization from a couple of dozen tracks down to only a few? Then track stacks organized by instrument are the way to go.

For those unfamiliar with Logic Pro basics, here’s a simplified look at Logic’s signal flow. Audio from the individual track flows into the channel strip. That signal first hits any plug-ins, EQ, or filtering, and then flows out through the volume control (fader). If you need to raise or lower the volume of a track going into the plug-in chain, then you either have to adjust the input of the plug-in itself, or add a separate gain plug-in as the first effect in the chain. The volume control/fader affects the level after plug-ins have been applied. This signal is then routed through the track stack (if used). On a summing track stack, the signal flow through its channel strip works the same way – plug-ins first, then volume fader. Of course, it can get more complex with groups, sends, and side-chaining.

All track stack signals, as well as any channel not placed into a track stack, flow through the stereo out bus (on a stereo project) – again, into the plug-ins first and then out through the volume control. In addition to the stereo output bus, there’s also a master output fader, which controls the actual volume of the file written to the drive. If you place a metering plug-in into the chain of the stereo output bus, it indicates the level for peaks or loudness prior to the volume control of the stereo output AND the master output bus. Therefore, I would recommend that you ALWAYS leave both faders at their zero default, in order to get accurate readings.
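For readers who find that flow easier to follow in code, here is a minimal numeric sketch of the idea. This is a conceptual model, not Logic’s actual implementation; the class name, plug-in stand-ins, and levels are all hypothetical. It shows plug-ins running before each fader, a summing stack acting as a submix bus, and a meter placed in the stereo out’s plug-in chain reading the level before the stereo out and master faders are applied.

```python
class ChannelStrip:
    """Simplified channel strip: plug-ins process the signal first, then the fader scales it."""
    def __init__(self, plugins=None, fader_db=0.0):
        self.plugins = plugins or []   # each plug-in is a callable: signal -> signal
        self.fader_db = fader_db       # volume fader in dB, applied after the plug-ins

    def process(self, signal):
        for plugin in self.plugins:
            signal = plugin(signal)
        return signal * 10 ** (self.fader_db / 20)

def sum_bus(signals):
    """A summing stack (or output bus) simply adds its incoming channel signals."""
    return sum(signals)

# Two hypothetical drum tracks routed into a drums summing stack
kick = ChannelStrip(plugins=[lambda s: s * 1.2], fader_db=-3.0).process(0.5)   # EQ boost stand-in
snare = ChannelStrip(fader_db=-6.0).process(0.4)
drums_stack = ChannelStrip(plugins=[lambda s: min(s, 0.8)], fader_db=0.0)      # bus compressor stand-in
drums_out = drums_stack.process(sum_bus([kick, snare]))

# Stereo out: a metering plug-in in its chain reads the level BEFORE the output faders
metered = []
meter = lambda s: (metered.append(s), s)[1]   # records the level, passes the signal through
stereo_out = ChannelStrip(plugins=[meter], fader_db=0.0)
master_fader_db = 0.0
final_level = stereo_out.process(drums_out) * 10 ** (master_fader_db / 20)

# With both output faders left at 0 dB, metered[0] == final_level, so the reading is accurate.
```

Nudge either output fader off zero in the sketch and the meter reading no longer matches what actually gets written to disk, which is the whole reason for the recommendation above.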

All mixes are subjective

The approach to a mix varies from engineer to engineer. What worked best for me was to concentrate on groups of instruments. The order isn’t important, but start with drums, for instance. The kit will likely have the highest number of tracks. Work with the soloed drum tracks to get a well-defined drum sound as a complete kit. Do the same for guitars, vocals, or any other instrument. Then work with the combination to get the right overall balance. Lastly, add and adjust mastering plug-ins on the stereo output channel strip to craft the final sound.

Any mix is totally subjective and technical perfection is merely an aspiration. I personally prefer my mix of Dirty to the others. The song is fun and the starting tracks are nice and clean. But I’m certainly happy with my mixes of the others, in spite of sonic imperfections. To make sure a mix is as good as it can be, check it in different listening environments. Fortunately, Audition can still burn your mix to an audio CD. Assuming you still own a disc burner and CD players, then it’s a great portable medium to check your mix in the car or on your home stereo system. Overall, during the course of mixing and then reviewing, I probably checked this on four different speaker set-ups, plus headphones and earbuds. The latter turned out to be the best way to detect stereo imaging issues, though not necessarily the most accurate sound otherwise. But that is probably the way a large swath of people consume music these days.

I hope you enjoy the compilation if you take the time to listen. The order of the songs is:

The Seeds of your Sorrow – Spitting Ibex

25 Reasons – Louis Berry and band

The Palace – Cosmix Studios session featuring Celina Ann, with Thomas Hechenberger (guitar), Valentin Omar (keys), David Leisser (drums), and Bernhard Osanna (bass)

Home – AVEC

Dirty – Marina & the Kats

Move Like a Ghost – Saint Agnes

©2022 Oliver Peters

Think you can mix?

Are you aspiring to be the next Chris Lord-Alge or Glyn Johns? Maybe you just have a rock ‘n roll heart. Or you just want to try your hand at mixing music, but don’t have the material to work with. Whatever your inspiration, Lewitt Audio – the Austrian manufacturer of high-quality studio microphones – has made it easier than ever to get started. A while back Lewitt launched the myLEWITT site as a user community, featuring educational tips, music challenges, and free content.

Even though the listed music challenge contests may have expired, Lewitt leaves the content online and available to download for free. Simply create a free myLEWITT account to access them. These are individual .wav stem tracks of the complete challenge songs recorded using a range of Lewitt microphones. Each file is labelled with the name of the mic used for that track. That’s a clever marketing move, but it’s also handy if you are considering a mic purchase. Naturally these tracks are only for your educational and non-commercial use.

Since these are audio files and not specific DAW projects, they are compatible with any audio software. Naturally, if you are a video editor, it’s possible to mix these tracks in an NLE, like Premiere Pro, Media Composer, or Final Cut Pro. However, I wouldn’t recommend that. First of all, DAW applications are designed for mixing and NLEs aren’t. Second, if you are trying to stretch your knowledge, then you should use the correct tool for the job – especially if you are going to go out on the web for mixing tips and tricks from noted recording engineers and producers.

Start with a DAW

If you are new to DAW (digital audio workstation) software, then there are several free audio applications you might consider just to get started. Mac users already have GarageBand. Of course, most pros wouldn’t consider that, but it’s good enough for the basics. On the pro level, Reaper is a popular free DAW application. Universal Audio offers Luna for free, if you have a compatible UA Thunderbolt audio interface.

As a video editor, you might also be getting into DaVinci Resolve. Both the free and paid Studio versions integrate the Fairlight audio page. Fairlight, the company, had a well-respected history in audio prior to its acquisition by Blackmagic Design, which has continued to build upon that foundation. This means that not only can you do sophisticated audio mixes for video in Resolve, but there’s no reason you can’t start and end in the Fairlight page for a music project.

The industry standard is Avid Pro Tools. If you are planning to work in a professional audio environment like a recording studio, then you’ll really want to know Pro Tools. Unfortunately, Avid discontinued their free Pro Tools|First version. However, you can still get a free, full-featured 30-day trial. Plus, the subscription costs aren’t too bad. If you have an Adobe Creative Cloud subscription, then you also have access to Audition as part of the account. Finally, if you are deep into the Apple ecosystem, then I would recommend purchasing Logic Pro, which is highly regarded by many music producers. 

Taking the plunge

In preparing this blog post, I downloaded and remixed one of the myLEWITT music challenge projects – The Seeds of your Sorrow by Spitting Ibex. This downloaded as a .zip containing 19 .wav files, all labelled according to instrument and microphone used. I launched Logic Pro, brought in the tracks, and lined them up at the start so that everything was in sync. From there it’s just a matter of mixing to taste.
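As a rough illustration of that starting point, here is a small Python sketch (using the numpy and soundfile libraries) that loads a folder of labelled stems, lines them up from sample zero, and writes a straight, unmixed sum. The folder and file names are hypothetical, and this assumes all stems share the same sample rate and start point, as the LEWITT downloads do.

```python
import glob
import numpy as np
import soundfile as sf  # third-party library for reading/writing .wav files

# Hypothetical folder of labelled stems, e.g. "Kick_DTP640REX.wav", "LeadVocal_LCT940.wav"
stem_paths = sorted(glob.glob("seeds_of_your_sorrow_stems/*.wav"))

stems = []
sample_rate = None
for path in stem_paths:
    audio, rate = sf.read(path, always_2d=True)  # shape: (samples, channels)
    sample_rate = sample_rate or rate            # assumes every stem uses the same rate
    stems.append(audio.mean(axis=1))             # fold to mono for a simple reference sum

# All stems start at the same point, so "lining them up" is just padding to equal length
longest = max(len(audio) for audio in stems)
padded = [np.pad(audio, (0, longest - len(audio))) for audio in stems]

# The unmixed sum is the raw starting point, before any level, EQ, or effects decisions
rough_sum = np.sum(padded, axis=0)
rough_sum /= max(1.0, np.max(np.abs(rough_sum)))  # normalize so the sum doesn't clip
sf.write("rough_sum.wav", rough_sum, sample_rate)
```

That straight sum is essentially what the “unmixed/summed tracks” portion of the comparison clip mentioned below represents: every fader at unity with no processing applied.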

Logic is great for this type of project, because of its wealth of included plug-ins. Logic is also a good host application for third party plug-ins, such as those from iZotope, Waves, Accusonus, and others. Track stacks are a versatile Logic feature. You can group a set of tracks (like all of the individual drums kit tracks) and turn those into a track stack, which then functions like a submix bus. The individual tracks can still be adjusted, but then you can also adjust levels on the entire stack. Track stacks are also great for visual organization of your track layout. You can show or hide all of the tracks within a stack, simply by twirling a disclosure triangle.

I’m certainly not an experienced music mixer, but I have mixed simple projects before. Understanding the process is part of being a well-rounded editor. In total, I spent about six hours over two days mixing the Spitting Ibex song. I’ve posted it on Vimeo as a clip with three sections – the official mix, my mix, and the unmixed/summed tracks. My mix was relatively straightforward. I wanted an R&B vibe, so no fancy left-right panning, voice distortions, or track doubling.

I mixed it totally in Logic Pro using mainly the native plug-ins for EQ, compression, reverb, amp modeling, and other effects. I also used some third-party plug-ins, including iZotope RX8 De-click and Accusonus ERA De-esser on the vocal track. As I brightened the vocal track to bring it forward in the mix, it also emphasized certain mouth sounds caused by the singer’s proximity to the mic. These plug-ins helped to tame those. I also added two final mastering plug-ins: Tokyo Dawn’s Nova for slight multi-band compression, along with FabFilter’s Pro-L2 limiter. The latter is one of the smoothest mastering plug-ins on the market and is a nice way to add “glue” to the mix.

If you decide to download and play with the tracks yourself, then check out the different versions submitted to the contest, which are showcased at myLEWITT. For a more detailed look into the process, Dutch mixing/mastering engineer and YouTuber Wytse Gerichhausen (White Sea Studio) has posted his own video about creating a mix for this music challenge.

In closing…

Understand that a great music mix starts with a tight group of musicians and high-quality recordings. Without those, it’s hard to make magic. With those, you are more than three-quarters of the way there. Fortunately Lewitt has taken care of that for you.

The point of any exercise like this is to learn and improve your skills. Learn to trust your ears and taste. Should you remove the breaths in a singer’s track? Should the mix be wetter (more reverb) or not? If so, what sort of reverb space? Should the bottom end be fatter? Should the guitars use distortion or be clean? These are all creative judgements that can only be made through trial-and-error and repeated experimentation. If music mixing is something you want to pursue, then the Produce Like A Pro YouTube channel is another source of useful information.

Let me leave you with some pro tips. At a minimum, make sure to mix any complex project on quality nearfield monitors (assuming you don’t have an actual studio at your disposal). Test your mix in different listening environments, on different speakers, and at different volume levels to see if it translates universally well. If you are going for a particular sound or style, have some good reference tracks, such as commercially-mastered songs, to which you can compare your mix. How did they balance the instruments? Did the reference song sound bright, boomy, or midrange? How were the dynamics and level of compression? And finally, take a break. All mixers can get fatigued. Mixes will often sound quite different after a break or on the next day. Sometimes it’s best to leave it and come back later with fresh ears and mind.

In any case, you can get started without spending any money. The tracks are free. Software like DaVinci Resolve is free. As with so many other tasks enabled by modern technology, all it takes is making the first move.

©2022 Oliver Peters

Photo Phun 2021

The last time I posted one of these “just for fun” sets of photography samples was in 2015. So it’s high time for another one. Earlier this year I reviewed FiLMiC Pro’s iOS movie application and put together a small demo highlighting its capabilities. At the same time I installed Firstlight – FiLMiC’s iOS still photo/camera application.

Firstlight extends the power of the iPhone camera system with extra controls and features. It also adds film-style looks, including grain, vignettes, and film stock simulations. My past Photo Phun blog posts have showcased post processes that could be applied to existing images. The examples in this article used no image manipulation in post (outside of size and crop adjustments); the “look” of every sample was created in-camera, using Firstlight’s built-in features, on a stock iPhone SE 2020 – no lens attachments, filters, or tripod.

I occasionally throw shade on the true quality of the best iPhone images. If image quality is the critical factor, then give me a high-end DSLR any day. But, the iPhone is “the camera you have with you” and as such is the modern digital equivalent of the Kodak Instamatic camera, albeit with a lot better image quality. We can pixel-peep all day long, but the bottom line is that iPhones and top-of-the-line Android phones create some stunning imagery in the right hands.

I’ve shot a fair amount of 35mm slide and print film over the years as an amateur photographer. I’ve also manipulated images in post for my share of motion picture film. Any film stock emulation LUT is suspect in my eyes. You can get close, but ultimately the look designed by a company can only be an approximation. FiLMiC positions its looks as film simulations inspired by 19th and 20th century photography. I think that’s the right approach, without claiming that a given simulation is exactly the same as a specific photochemical stock from a film manufacturer or a development process at a lab.

Firstlight includes a wide range of film simulations to choose from. Bear in mind that a look will change the saturation or hue of certain colors. This will only be readily obvious if those colors are in the images that you shoot. For example, C41 Plus shifts blues towards cyan. If you have blue skies in your shot, the resulting skies will appear cyan. However, if your scene is devoid of blues, then you might not see as much effect from that simulation. Think about matching the look to the content. Shoot the right subject in B&W Noir and any shot will look like it came from a gothic tale!
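Purely to illustrate the concept (this is not FiLMiC’s algorithm; the hue band and shift amount are arbitrary assumptions), here is a small Python sketch using Pillow and numpy that nudges blue hues toward cyan. Pixels outside the blue range pass through untouched, which is why a scene without blues shows little visible change from such a simulation.

```python
import numpy as np
from PIL import Image  # Pillow

def shift_blues_toward_cyan(path_in, path_out, shift_degrees=-40):
    """Rotate hues in a rough 'blue' band toward cyan; other hues pass through unchanged."""
    img = Image.open(path_in).convert("HSV")
    h, s, v = (np.array(band, dtype=np.float32) for band in img.split())

    hue_deg = h * 360.0 / 255.0                    # PIL stores hue as 0-255
    is_blue = (hue_deg > 200) & (hue_deg < 260)    # arbitrary band around blue (~240 degrees)

    hue_deg[is_blue] += shift_degrees              # negative shift moves blue toward cyan (~180 degrees)
    hue_deg = np.clip(hue_deg, 0.0, 359.9)

    new_h = Image.fromarray((hue_deg * 255.0 / 360.0).astype(np.uint8))
    out = Image.merge("HSV", [new_h,
                              Image.fromarray(s.astype(np.uint8)),
                              Image.fromarray(v.astype(np.uint8))]).convert("RGB")
    out.save(path_out)

# Hypothetical file names:
# shift_blues_toward_cyan("car_show.jpg", "car_show_c41ish.jpg")
```

Run something like this on an image dominated by greens and browns and the before/after will look nearly identical; run it on one with a big blue sky and the shift is obvious – the same point about matching the look to the content.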

Remember that your iPhone has no true viewfinder – just the screen. If you are outside in bright sunlight, you can barely see what you are shooting, let alone the color differences between film simulations. If you intend to use a film simulation, then plan the shot out ahead of time. Become familiar with what each setting is intended to do, because that look will be baked in. If you get back to the computer and realize you made a mistake, then it’s too late. Personally, I’m keen on post rather than in-camera manipulations. However, if you feel confident in the results – go for it.

The gallery below features examples using Firstlight’s simulated film looks. Each choice has been noted on most of the screens, although some will be very obvious. Other than the changes in the film simulation, the rest of the Firstlight settings are identical for all images. The camera was set to 3:2 aspect ratio, AE mode enabled, and HDR active. Both a fine grain and a vignette at the lowest setting were also applied to each during Firstlight’s image capture.

Check out my Flickr page at this link to see some of these same images and more with added Photoshop and BorisFX Optics development and effects.

Click any gallery thumbnail below to view an enlarged slideshow of these photos.

©2021 Oliver Peters

Final Cut Pro at 10 and Other Musings

Recently Final Cut Pro (formerly Final Cut Pro X) hit its tenth anniversary. Since I’ve been a bit quiet on this blog lately due to my workload, I thought it was a good time to reflect. I recently cut a set of involved commercials using FCP. While I’ve cut literally thousands of commercials in my career, my work in recent years tends to be corporate/branding/image content in the five to ten minute range. I work in a team and the tool of choice is Premiere Pro. It’s simply a better fit for us, since the bulk of our staff and freelancers are very fluid in Adobe products and less so with Apple’s pro software. Sharing projects and elements also works better in the Adobe ecosystem.

Cutting the spots in Final Cut Pro

In the case of the four :60s, I originally budgeted about two days each, plus a few days for client revisions – eleven days in total. My objective was to complete the creative cut, but none of the finishing, since these spots involved extensive visual effects. I was covering for the client’s regular editor, who had a scheduled vacation and would finish the project. The spots were shot with a Sony Venice, simultaneously recording 6K RAW and 4K XAVC (AVC-Intra) “proxy” files. The four spots totaled over 1200 clips with approximately an hour of footage per spot. My cutting options were to work natively with the Sony RAW media in Premiere Pro or DaVinci Resolve, or to edit with the proxies in any NLE.

The Sony RAW files are large and don’t perform well playing from a shared storage system, and I didn’t want to waste time copying the location drives to the NAS. I also wanted to be able to access the media and cut the spots whether at home or at the work facility. So I opted to use the proxies, which allowed me to cut the spots in FCP. Of course, if you think of proxies as low-res files, you’d be wrong. These Sony XAVC files are high-res, camera-original files on par with 4K ProRes HQ media. If it weren’t for the VFX, these would actually be the high-quality source files used for the final edit.

I copied the proxy files to a 2TB Samsung T7 SSD portable drive. This gave me the freedom to edit wherever – either on my iMac at home or one of the iMac Pros at work. This is where Final Cut Pro comes in. When you wade through that much footage, it’s easy for an NLE to get bogged down by caching footage or for the editor to get lost in the volume of clips. Thanks to skimming and keyword collections, I was able to cut these spots far more quickly than using any of the other NLE options. I could go from copying proxy files to my first cut on a commercial within a single day. That’s half of the budgeted time.

The one wrinkle was that I had to turn over a Premiere Pro project linked to the RAW media files. There are various ways to do that, but automatic relinking is dicier with these RAW files, because each clip is within its own subfolder, similar to RED. This complicates Premiere’s ability to easily relink files. So rather than go through XtoCC, I opted to import the Sony RAW clips into Resolve, then import the FCPXML, which in turn automatically relinked to the RAW files in Resolve.

There are a few quirks in this method that you have to suss out, but once everything was correct in Resolve, I exported an XML for Premiere. In Premiere Pro, I imported that XML, made sure that Premiere linked to the RAW files, corrected any size and speed issues, removed any duplicate clips, and then the project was ready for turnover. While one could look at these steps and question the decision to not cut in Premiere in the first place, I can assure you that cutting with Final Cut was considerably faster and these roundtrip steps were minor.

Remote workflows

Over the past year, remote workflows and a general “work from home” movement have shifted how the industry moves forward. So much of what I do requires connection to shared storage, so totally working from home is impractical. These spots were the exception for me, but the client and director lived across the country. In years past, they used to fly in and work in supervised sessions with me. However, in more recent years, that work has been unattended, using various review-and-approval solutions for client feedback and revisions. Lately that’s through Frame.io. In the case of these spots, my workflow wasn’t any different than it would have been two years ago.

On the other hand, since I have worked with these clients in supervised sessions, as well as remote projects, it’s easy to see what’s been lost in this shift. Remote workflows present two huge drawbacks. The first is turnaround time. It’s inherently an inefficient process. You’ll cut a new version, upload it for review, and then wait – often for hours or even the next day. Then make the tweaks, rinse, and repeat. This impacts not only the delivery schedule, but also your own ability to book sessions and determine fair billing.

Secondly, ideation takes a back seat. When a client is in the room, you can quickly go through options, show a rearranged cut, alternate takes, and so on. Final Cut’s audition function is great for this, but it’s a wasted feature in these modern workflows. During on-prem sessions, you could quickly show a client the options, evaluate, and move on. With remote workflows, that’s harder to do and is subject to the same reply latency, so fewer options can be properly vetted in the cut.

The elephant in the room is security. I know there are tons of solutions for “drilling” into your system from home that are supposed to be secure. In reality, the only true security is to have your system disconnected from the internet – and even that isn’t totally bulletproof. As Sony Pictures, QNAP owners, Colonial Pipeline, agencies of the US government, and multiple other organizations have found out, if a bad actor wants to get into your system, they can. No amount of encryption, firewalls, VPNs, multi-factor authentication, or anything else is guaranteed to stop them. While remote access might have been a necessary evil due to COVID lockdowns, it’s not something that should be encouraged going forward.

However, I know that I’m swimming against the stream on this. Many editors/designers/colorists don’t seem to ever want to return to an office. This is at odds with surveys indicating that the majority of producers and agencies are chomping at the bit to get back to working one-on-one. Real estate and commuting costs are factors that affect such decisions, so I suspect hybrids will evolve and the situation in the future may vary geographically.

Final Cut Pro’s future

I mention the WFH dilemma because remote collaboration is one of the features that some users have encouraged Apple to build into Final Cut Pro. It’s clearly a direction Adobe has moved towards and one where Avid already has a track record.

I’m not sure that’s in Apple’s best interest. For one thing, I don’t personally believe Apple does a good job of this. Access and synchronization performance of iCloud is terrible compared with Google’s solutions. Would a professional collaboration solution really be industry-leading and robust? I highly doubt it.

Naturally Apple wants to make money, but they are also interested in empowering the creative individual – be that a professional or an enthusiast. Define those terms in whatever way you like, but the emphasis is on the individual. That direction seems to be at odds with what “pro” users think should be the case for Apple ProApps software, based on their experiences in the late years of FCP 1-7/FCP Studio (pre-X).

I certainly have my own feature request list for Final Cut Pro, but ultimately the lack of those features did not stop me from a rapid turnaround on the spots I just discussed, nor on other projects when I turn to FCP as the tool of choice. I use all four major NLEs and probably never will settle on a single “best” NLE for all cases.

The terms “YouTube content creator” and “influencer” are often used as pejoratives, but for many filmmakers and marketeers, outlets like YouTube, Facebook, and Instagram have become the new “broadcast.” I recently interviewed Alexander Fedorov for FCP.co. He’s a Russian photographer/filmmaker/vlogger who epitomizes the type of content creator for whom Apple is designing its professional products. I feel that Apple can indeed service multiple types of users, from the individual, self-taught filmmaker to the established broadcast pro. How Apple does that moving forward within a tool like Final Cut Pro is anyone’s guess. All I know is that the old measurements of what is and isn’t “pro” no longer work in so many different arenas.

©2021 Oliver Peters