Adobe’s Frame Rollout

Adobe acquired Frame.io last October. The latest Adobe Creative Cloud application updates showcase the first formal integration of Frame.io as a product within the Creative Cloud ecosystem. Frame.io had already developed a Premiere Pro integration using Adobe’s extensions architecture; however, the latest versions of Premiere Pro and After Effects add an integrated interface panel called Review with Frame.io.

Now your individual Adobe Creative Cloud subscription includes a Frame.io account at no additional charge. This includes 100GB of cloud storage (separate from existing Creative Cloud storage) for up to five projects, use by two collaborators, and unlimited access for reviewers. If you need more storage or more collaborators, then you can upgrade to a larger plan at additional cost.

Adobe Creative Cloud Team and Enterprise accounts don’t fall under this plan, and those admins will need to consult Adobe or Frame.io for a plan that best meets their needs. In other words, if you are a production company paying for an Adobe Team account with multiple users, you don’t get 100GB of “free” storage for each user. This offering is primarily designed for individual Adobe Creative Cloud subscribers.

Something to know before you start

There’s a gotcha for some existing Frame.io customers. You activate your new Frame.io service by logging in with the same email and password used for your Adobe ID. Let’s say you work freelance at a facility and are a collaborator on their Frame.io Team account. In that case, you might be using a personal email address to log into Frame.io. However, if that email is the same as the one used for your personal Adobe ID, then Frame.io does not know how to differentiate between the two.

To rectify this you need to use a different email for one of these two log-ins. This is generally a minor issue, since most people have more than one email address that they use. In my own case, I needed to change my Adobe ID email, which was a relatively quick procedure. This allows me to separately access either of the two accounts as a collaborator, based on which email I log in with.

One confusing thing I encountered was that the Frame.io account starts as a 30-day trial of a Team account, so it looks like you are going to be billed extra after the trial ends. That is not the case. I think it’s a mistake for Adobe and Frame.io to do this, because they are trying to upsell you to the paid account. Fortunately there’s no need to enter payment information up front. I wish this were clearer in the marketing details. Hopefully Adobe will correct it after the initial rollout. At the end of the 30-day trial, you will be asked whether to pay or end the trial. If you opt to end it, the account reverts to the free plan, which is the one included with your Adobe Creative Cloud subscription.

Getting started

Open the Review with Frame.io panel in Premiere Pro or After Effects and sign in using your Adobe ID. This will open your default browser and send you to the Frame.io website to complete the sign-in. As long as you stay signed in, you can access Frame.io either in your web browser or within the panel. If you sign out, then next time you’ll need to sign in again using your Adobe ID.

I won’t go into how Frame.io itself works, since there are plenty of tutorials. This integration doesn’t change any of Frame.io’s operation. The panel works like the previous extensions panel. A clip with reviewer comments can be synced to your Premiere Pro timeline for easy changes. Or you can simply work from the web portal and ignore the panel entirely. 100GB is plenty if your intent is to use Frame.io for low-resolution review files. However, if you intend a larger, more complex workflow, then you may need to upgrade your account after all.

Enter C2C

The bigger picture is that Frame.io is enthusiastically pushing its camera-to-cloud (C2C) workflow. I’m not really a big believer in this concept, but I know plenty of companies are going to announce more cloud and remote services at NAB. For many reasons, I don’t believe that all of our media will be in the cloud in a decade or two. However, I think Adobe does. In my opinion, it’s not a particularly good goal for users or the planet. But, I digress. In today’s world, what C2C offers in conjunction with the Premiere Pro integration is a Dropbox-style experience.

Let’s say your videographer is recording a corporate CEO interview in Los Angeles. The company’s PR rep is in New York and the editor in Atlanta. And there’s a very short turnaround schedule. In this basic scenario, both the videographer and editor are collaborators on a Frame.io project. While the interview is being recorded, the feed is being uploaded to Frame.io in near real-time. This requires some hardware on the camera side, or it could be done by someone on set right after the recording ends. Once it’s in Frame.io, the PR rep in NYC can access and review the takes. The editor in Atlanta also sees the footage appear in the Frame.io panel within Premiere Pro. Files can be downloaded from the panel to the editor’s drives and the edit can start right away.

Given most standard internet speeds today and the 100GB bucket, this workflow makes sense if you are uploading smaller camera proxy files. Some proxies can actually be good enough to master with – especially in fast turnaround situations. In other scenarios, the proxies might be used to start the edit and later replaced with the high-res camera originals, once received from the shoot.

I feel that such situations are far less common than the marketers want you to believe. Moving high-res files over the internet is never fast. FedEx often still offers the better option. So unless you really do need to get started right away, just wait for the media to arrive a day or so later. However, C2C for the purpose of an out-of-town producer reviewing takes remotely – especially in light of workflow changes caused by COVID over the past couple of years – has gained steam. Frame.io is clear that being an Adobe company doesn’t change its dedication to other workflows and other applications, such as Final Cut Pro. New announcements include native FilmLight Baselight integration, an app for Apple TV, and C2C partnerships with FiLMiC Pro.

If you are a current Frame.io customer without any Adobe subscription – no problem. Nothing changes for you. I’ve been using Frame.io since it launched and have been happy with the service. There are occasional glitches, but no worse than with any other internet service, including your regular email provider. Better yet, clients love the process. It’s not perfect, but it is one of the better review-and-approval services on the market. If this is your first time using Frame.io by virtue of your Adobe subscription, then you are bound to see your daily workflow enhanced.

©2022 Oliver Peters

Boris FX Optics 2022

Boris FX is a respected developer of visual effects tools for video. With the introduction of Optics in 2020, Boris FX further extended that expertise into the photography market. Optics installs as a plug-in for Adobe Photoshop, Lightroom, and Bridge. Optics is also installed as a standalone app that supports a variety of still image formats, including camera RAW. So, if you’ve avoided an Adobe subscription, you are still in luck. Before you go any further, I would encourage you to read my 2020 review of Optics (linked here) for an overview of how it works and how to use it.

How has Optics 2022 changed? 

Since that introduction in late 2020, Optics has gone through several free updates, but the 2022 version requires a small upgrade fee for existing users. If you are new to Optics, it’s available through subscription or perpetual licensing and includes a trial period to test the waters.

At first glance, Optics 2022 looks and operates much like the previous versions. Key changes and improvements for Optics 2022 include Mac M1 native support, Metal acceleration of most Sapphire filters, UI enhancements, and mask exchange with Photoshop. However, the big new features include the introduction of a Particle Illusion category with over 1700 emitters, more Sapphire filters, and the Beauty Studio filter set from Continuum. The addition of Particle Illusion might seem a bit odd for a photography application, but by doing so, Boris FX has enhanced Optics as a graphic design tool.

Taking those point-and-shoot photos into Optics 

I’ve used Optics since its introduction and was eager to review Optics 2022 when Boris FX contacted me. There was a local British car show this past Saturday – a superb opportunity to take some photos of vintage Jags, MGs, Minis, Bentleys, Triumphs, and Morgans on a sunny Florida weekend. To make this more real, I decided to shoot the stills with my plain vanilla iPhone SE 2020 using FiLMiC Pro’s FirstLight still photo app. Somewhere along the line, iOS and FirstLight have been updated to allow camera RAW photography. This wasn’t initially available and technically the SE doesn’t support Apple’s ProRAW codec. However, FirstLight now enables RAW recording of DNG files, which are kissing cousins of ProRAW. In the RAW mode, you get the full 4:3, 12MP sensor image. Alternate aspect ratios or in-app film emulations will be disabled.

After a morning of checking out classic cars, I returned home, AirDropped the stills to my iMac, and started testing Optics. Since these are RAW photos, the first step is to make any adjustments in the Adobe Camera RAW module before the photo opens in Photoshop. Next, send the layer to Optics, which launches the Optics 2022 application and opens that image in the Optics interface. When you’ve completed your Optics adjustments, click Apply to send the image back to Photoshop as a flat, rasterized image layer or a smart filter.

Working with layers and filters

As I discussed in my 2020 post, Optics itself is a layer-based system, similar to Photoshop. Each layer has separate blend and masking controls. Typically you add one effect per layer and stack more layers as you build up the look. The interface permits you to enable/disable individual layers, compare before and after versions, and adjust the display size and resolution.

Effects are organized into categories (FilmLab, Particle Illusion, Color, Light, etc.) and then into groups of filters within each category. For example, the Stylize category includes the various Sapphire paint filters. Each filter selection includes a set of presets. When you apply a filter preset, the parameters panel allows you to fine-tune that effect, so you aren’t locked into the preset.

In addition to the parameters panel, many of the effects include on-screen overlay controls for visual adjustment. This is especially helpful with the Particle Illusion effects. For instance, you can change or modify the path of a lightning bolt by moving the on-screen points of the emitter.

Handling file formats

Optics supports TIFF, JPEG, PNG, and RAW formats, so you can open those straight into Optics without Photoshop. In the case of my DNG files, the first effect to be applied is a Develop filter. You can tweak the image values much like in the Adobe Camera RAW module. The operation for creating your look is the same as when you come from Photoshop, except that there is no Apply function. You will need to Save or Save As to export a flat, rasterized TIFF, PNG, or JPEG file. 

Unlike Photoshop, Optics does not have its own layered image format. You can save and recall a set-up. So if you’ve built up a series of filter layers for a specific look, simply save that set-up as a file (minus the image itself). This can be recalled and applied to any other image and modified to adapt that set-up for the new image. If you save the file in the TIFF format, then you have the option to save it with the set-up embedded. These files can be opened back up in Optics along with the various filter layers for further editing.


As I worked through my files on my iMac, Optics 2022 performed well, but I did experience a number of crashes of the Optics application itself. When Optics crashes, you lose any adjustments made to the image in Optics. However, when I tested Optics 2022 on my mid-2014 15″ MacBook Pro using the same RAW images, the application was perfectly stable. So it could be some sort of hardware difference between the two Macs.

Here’s one workflow item to be aware of between Photoshop and Optics. If you crop an image in Photoshop, the area outside of the crop still exists, but is hidden. That full image without the crop is the layer sent to Optics. If you apply a stylized border effect, the border is applied to the edges of the full image. Therefore, some or all of the border will be cropped upon returning to Photoshop. Optics includes internal crop controls, so in that instance, you might wish to crop in Optics first, apply the border, and then match the crop for the whole image once back in Photoshop.

All in all, it’s a sweet application that really helps when stuck for ideas about what to do with an image when you want to elevate it above the mundane. Getting great results is fast and quite enjoyable – not to mention, infinitely easier than in Photoshop. Overall, Optics is a great tool for any photographer or graphic designer.

Click through the gallery images below to see further examples of looks and styles created with Boris FX Optics 2022.

©2022 Oliver Peters

Analogue Wayback, Ep. 4

Divide 24 into 60.

Back in the 1970s and as late as the early 80s, film editors working on film ruled the world of high-end regional and national TV commercials. They had the agency connections and, as such, controlled the workflow in post. Most of the top commercials were shot on film, so after the lab processed the negative, the editors (or editorial companies) took it from there. Often they would subcontract any further lab services, video finishing, and the sound mix as part of their arrangement with the client.

Timecode-based video editing was just starting to grab hold in the mid-70s. Early attempts at creative, nonlinear (“offline”) editing systems were commercially unsuccessful. What stuck were the linear (“online”) edit suites built around switchers and 2″ VTRs. Occasionally producers, directors, and agencies would go directly into an online bay and cut start-to-finish there. But at the highest end, spots were typically edited with the agency folks sitting around a flatbed film edit bench, like a KEM or Steenbeck. Online bays and editors were considered technical and were used to conform the edit, thus replacing the work previously done by labs to finish and distribute those TV spots.

Frame rate and interlaced – the two flies in the ointment

This was the world of interlaced NTSC in the US (PAL in much of the rest of the world). There was no HD, progressive scan, or 24p video (23.976). NTSC operated and continues to operate at 29.97fps using interlaced fields – i.e. two unique images (fields) per frame, each containing half the content of that whole frame. Therefore, NTSC is really 60 images per second, or expressed in technical terms, 59.94i. Film is typically shot at a true 24fps (or now offset to 23.976) where each frame is a complete image. This presents the dilemma of converting 24 film images into 60 video fields (let’s keep it simple and skip the 23.976 vs 24 and 59.94 vs 60 discussion).

Every four film frames (out of 24) corresponds to five video frames (out of 30). Telecine engineers devised the scheme of 3:2 pulldown to transfer film to video. The pattern of corresponding frames or cadence is 2:3:2:3. Consecutive film frames are alternately scanned for two or three video fields. In this cadence of so-called whole and split frames, the first film frame in the sequence covers both fields of the first video frame. The next three film frames continue the cadence until we reach the fifth film frame/sixth video frame. At this point the film and video frames sync up again and the cadence repeats itself. If you park on a single video frame and see only one film image, that’s a whole frame. If you see a blend of two film images, then that’s a split frame.
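The cadence described above can be sketched in code. This is only an illustration (not any real telecine implementation); the function name and field numbering are my own. It maps each film frame to the interlaced fields it occupies, keeping the simplification of whole 30fps/60-field numbers as in the text.

```python
# Sketch of 2:3:2:3 pulldown: map each film frame to the interlaced
# NTSC fields it occupies. Each cadence group of 4 film frames fills
# 5 video frames (10 fields), since 60 fields / 24 frames = 2.5.

def pulldown_fields(film_frame):
    """Return (video_frame, field) pairs covered by one film frame.

    Fields are numbered 1 and 2 within each video frame.
    """
    cadence = [2, 3, 2, 3]              # fields scanned per film frame
    group, pos = divmod(film_frame, 4)  # cadence group and position in it
    start = group * 10 + sum(cadence[:pos])
    pairs = []
    for f in range(start, start + cadence[pos]):
        video_frame, field = divmod(f, 2)
        pairs.append((video_frame, field + 1))
    return pairs

# Film frame 0 fills both fields of video frame 0 (a "whole" frame),
# while film frame 2 straddles video frames 2 and 3 ("split" frames).
for n in range(5):
    print(n, pulldown_fields(n))
```

Running this shows the pattern: film frames 0, 1, and 3 each fully own a video frame, film frame 2 only appears in split frames, and film frame 4 starts the cycle again on video frame 5, exactly the repeat point described above.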

The hybrid editing workflow model

This 2:3:2:3 cadence is how film aligns with interlaced NTSC video, but there’s also the issue of different counting methods. Video editors work in SMPTE timecode – 00:00:00:00 – hours, minutes, seconds, frames. Film editors work in feet and frames – 0000+00 – 16 frames/foot for 35mm and 40 frames/foot for 16mm. Where’s the rosetta stone to go between them? 

Since transfer of the negative to video was already possible in the 70s, a hybrid editing model developed. Shoot film – transfer to video – edit film – conform the film edit on video. That approach goes like this. Workprint is created by the lab from the negative. This workprint is cut by the film editor. The negative itself isn’t cut (at least in the case of most commercials). The key is to get the proper cross-reference between the two forms of media. To do that you need a common starting point for each film roll.

The key

The lab punches a hole at the start of each negative film roll and also adds edge numbers for feet and frames. The hole punch becomes 0000+00. During the film-to-tape transfer of the negative, the telecine operator/colorist syncs the hole punch to a zero timecode value on the videotape, such as 01:00:00:00. Now there’s a common starting point. 0000+00 = 01:00:00:00. In the whole and split frame cadence (2:3:2:3) the first, second, and fourth film frames have matching whole frame video frames (:00, :01, :04). The third film frame is split across two video frames (:02, :03).

Next, the film editor works through the creative edit of the commercial. When the cut of the workprint is locked, a cut list is delivered to the online edit facility. This is similar to a video EDL (edit decision list), except that events are listed in feet and frames instead of timecode. From the online video editor’s point of view this foot/frame number isn’t an absolute value, but rather an offset from the common 01:00:00:00 start point on the videotape transfer.

For each edit, the offset between the two must be calculated in order to get to the correct matching shot on the videotape. It really doesn’t matter whether the source image lands on a split or whole frame timecode, because the source is unedited camera original. However, it does matter in the timeline, because the duration of the film clip determines the out-point of the video edit. In order to maintain the proper rhythm of the film editor’s cut, certain frames will be ambiguous. The video editor will have to make a judgement call whether to trim one frame forward or backward.
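The offset calculation can be sketched as code. This is a hypothetical helper, not an actual chart or tool from that era: it assumes 35mm film at 16 frames per foot, a flat non-drop 30fps count, a 01:00:00:00 sync point at the hole punch, and it always returns the first video frame containing the film frame, leaving the ambiguous split-frame cases to the editor’s judgement.

```python
# Hypothetical feet+frames -> timecode lookup, in the spirit of the
# conversion described in the text. Assumes 35mm film (16 frames/foot),
# 3:2 pulldown, and a flat 30fps non-drop count synced so that
# 0000+00 on the workprint equals 01:00:00:00 on the videotape.

FRAMES_PER_FOOT = 16            # 35mm; 16mm would be 40
SYNC_POINT = 1 * 3600 * 30      # 01:00:00:00 expressed in video frames

def feet_frames_to_timecode(feet, frames):
    film_frame = feet * FRAMES_PER_FOOT + frames
    # Every 4 film frames occupy 5 video frames, so scale by 5/4 and
    # take the first video frame that contains this film frame. Split
    # frames remain ambiguous by one frame, as noted above.
    video_frame = (film_frame * 5) // 4
    total = SYNC_POINT + video_frame
    hh, rem = divmod(total, 108000)   # 30fps * 3600 seconds per hour
    mm, rem = divmod(rem, 1800)       # 30fps * 60 seconds per minute
    ss, ff = divmod(rem, 30)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

print(feet_frames_to_timecode(0, 0))   # hole punch: 01:00:00:00
print(feet_frames_to_timecode(1, 0))   # one foot = 16 film frames = 20 video frames
```

As a sanity check, 24 film frames (one second of film) should land exactly one second into the tape: 24 × 5/4 = 30 video frames, or 01:00:01:00.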

A cheat sheet to the rescue

I worked as an online editor in Jacksonville in the 1970s. One of our clients was the largest ad agency in Florida. Their creative director had been a film editor in an earlier life. Any time large campaigns for one of their brands were produced, he would lock himself into a cutting room at the local film lab and edit the spots on their Steenbeck. It then became our job at the post house to translate that cut into a video master based on the steps I’ve just described.

Fortunately, my boss was experienced in both film and video editing. He took it upon himself to create a handy conversion chart that cross-referenced 10 minutes of film (the approximate length of a film roll) between a foot/frame count and timecode. He even marked all of the ambiguous frames. Using this chart, it was easy to take a film number from the cut list, look up the timecode on the chart and type that into the edit controller. Preview the edit, trim as needed, and commit to the edit. Rinse and repeat.

I’m sure that this sounds rather old school to most readers, but it’s one of those arcane skills that still has validity today. Film is not quite dead yet. Avid still integrates some features of cut list conversion. As a business model, film editorial companies of the past are top commercial edit shops today, using Media Composer, Premiere, and Final Cut in place of Steenbecks and KEMs. So while the technology may have changed, many of the concepts haven’t.

©2022 Oliver Peters

Pro Tips for FCP Editors

Every nonlinear editing application has strengths and weaknesses. Each experienced editor has a list of features and enhancements that they’d like to see added to their favorite tool. Final Cut Pro has many fans, but also its share of detractors, largely because of Apple’s pivot from FCP7 to FCPX a decade ago. That doesn’t mean it’s not suited to professional-level work. In fact, it’s a powerful tool in its own right. But there are ways to adapt it to workflows you may miss from competing NLEs. I discuss five of these tips in my article Making Final Cut More Pro.

©2022 Oliver Peters

Storage for Editors 

Storage is the heart of a modern post-production facility. The size and type of storage you pick can greatly impact the facility’s efficiency. Surprisingly, the concerns and requirements around a storage network aren’t all that different, whether you’re a large or a smaller post facility.

I recently spoke with industry veterans at Molinare in London and Republic Editorial in Dallas about how they’ve addressed storage needs. 

Check out my interview at postPerspective.

©2021 Oliver Peters