Photo Phun 2022

Let’s polish off the year with another post of stills from my photography hobby. These stills were taken during this fall and Christmas season, plus a few oldies from earlier posts about Firstlight and Optics. As before, all of these images were captured with my iPhone SE using Firstlight, FiLMiC’s still photo companion to their FiLMiC Pro video capture app. Beyond the extra shooting features, Firstlight adds camera raw recording, which isn’t otherwise possible on the SE using the native camera application.

The workflow to “develop” these images started in Adobe Bridge, where it was easy to make the basic raw adjustments using the camera raw module. Bridge offers Lightroom-style control and quick processing for a folder of images. These images then went to Photoshop for cropping and resizing.

Boris FX Optics functions as both a Photoshop plug-in and a standalone application. It’s one of my favorite tools for creating looks with still photos. It goes far beyond the filters, adjustments, and effects included in applications like Photoshop alone. Nearly all image manipulation was done by roundtripping each file from Photoshop to Optics (via the plug-in) and then back. The last step in the workflow was to use the TinyJPG website to optimize the file sizes of these JPEG images. Click any image below to peruse a gallery of these stills.
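
If you’d rather script that last optimization pass than upload everything to TinyJPG, a library like Pillow can approximate it. Here’s a minimal sketch with hypothetical folder names – a simple re-save with compression settings, not TinyJPG’s own algorithm:

```python
# Minimal sketch: batch re-save JPEGs with Pillow to shrink file sizes.
# Folder names are placeholders; TinyJPG's compression is smarter than this.
from pathlib import Path
from PIL import Image

src = Path("gallery_full")   # finished JPEGs out of Photoshop/Optics
dst = Path("gallery_web")    # smaller versions for the blog gallery
dst.mkdir(exist_ok=True)

for jpg in src.glob("*.jpg"):
    img = Image.open(jpg)
    # quality/optimize/progressive trade a little fidelity for smaller files
    img.save(dst / jpg.name, "JPEG", quality=82, optimize=True, progressive=True)
```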

Enjoy the images and the rest of the holiday season. I’ll be back after we flip the page to a new year. Look for a 4-part interview in January with legendary film editor Walter Murch.

©2022 Oliver Peters

Frame.io Brings FiLMiC Pro to the Cloud

It’s not news that Frame.io has been pioneering camera-to-cloud (C2C) workflows. However, one of the newsworthy integrations announced last week was the addition of C2C capabilities for iPhone and Android users with the Filmic Pro camera application. The update has already popped up in your Filmic Pro settings if you’ve kept the app current. Frame’s C2C feature requires Filmic’s Cinematographer Kit (an in-app purchase) and a Frame.io Pro or Adobe Creative Cloud account.

Professional filming with iPhones has become common in many market sectors for both primary and secondary videography. The Filmic Pro/C2C workflow can prove worthwhile when fast turnaround and remote access become factors in your production.

Understanding the Filmic Pro C2C integration

Filmic Pro’s C2C integration is a little different than Frame’s other camera-to-cloud workflows, which are tied to Teradek devices. In those situations, the live video stream from the camera is simultaneously encoded into a low-res proxy file by the Teradek device. High-res OCF (original camera files) media is stored on the camera card and the proxies on the Teradek. The proxies are uploaded to Frame. There is some latency in triggering the proxy generation, so start and end times do not match perfectly between the OCF media and the proxies. Accurate relinks between file versions are made possible by common timecode.

An Android phone or iPhone does not require any extra hardware to handle the proxy creation or uploading. Filmic Pro encodes the proxy file after the recording is stopped, not simultaneously. Both high and low-res files are stored on the phone within Filmic’s clip library and have identical lengths and start/stop times. Filmic Pro won’t add a timecode track in this mode, so all files start from zero, albeit with unique file names. If you are shooting with multiple iPhones or double-system sound, then be sure to slate the clips so the editor can sync the files.

Testing the workflow

I have an iPhone SE and the software, so it was time to run some workflow tests in a hypothetical scenario. Here’s the premise – three members of the production team (in reality, all me, of course) are located in different cities. The videographer is on site. The producer is in another city, and he’s going to turn around a quick cut for story content. The editor is in yet a third location, and he’ll conform the high-res camera files, add effects, graphics, and color correction, and finish the mix to deliver the final product.

Click here to read the rest of the article at Pro Video Coalition

Click here for a more in-depth article about mobile filmmaking with FiLMiC Pro

©2022 Oliver Peters

Boris FX Optics 2022

Boris FX is a respected developer of visual effects tools for video. With the introduction of Optics in 2020, Boris FX further extended that expertise into the photography market. Optics installs as a plug-in for Adobe Photoshop, Lightroom, and Bridge, as well as a standalone app that supports a variety of still image formats, including camera RAW. So, if you’ve avoided an Adobe subscription, you are still in luck. Before you go any further, I would encourage you to read my 2020 review of Optics (linked here) for an overview of how it works and how to use it.

How has Optics 2022 changed? 

Since that introduction in late 2020, Optics has gone through several free updates, but the 2022 version requires a small upgrade fee for existing users. If you are new to Optics, it’s available through subscription or perpetual licensing and includes a trial period to test the waters.

At first glance, Optics 2022 looks and operates much like the previous versions. Key changes and improvements for Optics 2022 include Mac M1 native support, Metal acceleration of most Sapphire filters, UI enhancements, and mask exchange with Photoshop. However, the big new features include the introduction of a Particle Illusion category with over 1700 emitters, more Sapphire filters, and the Beauty Studio filter set from Continuum. The addition of Particle Illusion might seem a bit odd for a photography application, but by doing so, Boris FX has enhanced Optics as a graphic design tool.

Taking those point-and-shoot photos into Optics 

I’ve used Optics since its introduction and was eager to review Optics 2022 when Boris FX contacted me. There was a local British car show this past Saturday – a superb opportunity to take some photos of vintage Jags, MGs, Minis, Bentleys, Triumphs, and Morgans on a sunny Florida weekend. To make this more real, I decided to shoot the stills with my plain vanilla iPhone SE 2020 using Firstlight, FiLMiC’s still photo app. Somewhere along the line, iOS and Firstlight were updated to allow camera RAW photography. This wasn’t initially available, and technically the SE doesn’t support Apple’s ProRAW codec. However, Firstlight now enables RAW recording of DNG files, which are kissing cousins of ProRAW. In RAW mode, you get the full 4:3, 12MP sensor image. Alternate aspect ratios and in-app film emulations are disabled.

After a morning of checking out classic cars, I returned home, AirDropped the stills to my iMac, and started testing Optics. Since these are RAW photos, the first step is to make any adjustments in the Adobe Camera Raw module before each photo opens in Photoshop. Next, send the layer to Optics, which launches the Optics 2022 application and opens that image in the Optics interface. When you’ve completed your Optics adjustments, click Apply to send the image back to Photoshop as a flat, rasterized image layer or a smart filter.

Working with layers and filters

As I discussed in my 2020 post, Optics itself is a layer-based system, similar to Photoshop. Each layer has separate blend and masking controls. Typically you add one effect per layer and stack more layers as you build up the look. The interface permits you to enable/disable individual layers, compare before and after versions, and adjust the display size and resolution.

Effects are organized into categories (FilmLab, Particle Illusion, Color, Light, etc.) and then groups of filters within each category. For example, the Stylize category includes the various Sapphire paint filters. Each filter selection includes a set of presets. When you apply a filter preset, the parameters panel lets you fine-tune the look of that effect, so you aren’t locked into the preset.

In addition to the parameters panel, many of the effects include on-screen overlay controls for visual adjustment. This is especially helpful with the Particle Illusion effects. For instance, you can change or modify the path of a lightning bolt by moving the on-screen points of the emitter.

Handling file formats

Optics supports TIFF, JPEG, PNG, and RAW formats, so you can open those straight into Optics without Photoshop. In the case of my DNG files, the first effect to be applied is a Develop filter. You can tweak the image values much like in the Adobe Camera RAW module. The operation for creating your look is the same as when you come from Photoshop, except that there is no Apply function. You will need to Save or Save As to export a flat, rasterized TIFF, PNG, or JPEG file. 

Unlike Photoshop, Optics does not have its own layered image format. However, you can save and recall set-ups. So if you’ve built up a series of filter layers for a specific look, simply save that set-up as a file (minus the image itself). That set-up can then be recalled, applied to any other image, and modified to suit the new image. If you save the file in the TIFF format, you have the option to save it with the set-up embedded. These files can be opened back up in Optics, along with the various filter layers, for further editing.

Performance

As I worked through my files on my iMac, Optics 2022 performed well, but I did experience a number of crashes of the Optics application itself. When Optics crashes, you lose any adjustments made to the image in Optics. However, when I tested Optics 2022 on my mid-2014 15″ MacBook Pro using the same RAW images, the application was perfectly stable, so the crashes could come down to some hardware difference between the two Macs.

Here’s one workflow item to be aware of between Photoshop and Optics. If you crop an image in Photoshop, the area outside of the crop still exists, but is hidden. That full image without the crop is the layer sent to Optics. If you apply a stylized border effect, the border is applied to the edges of the full image. Therefore, some or all of the border will be cropped upon returning to Photoshop. Optics includes internal crop controls, so in that instance, you might wish to crop in Optics first, apply the border, and then match the crop for the whole image once back in Photoshop.

All in all, it’s a sweet application that really helps when you’re stuck for ideas about how to elevate an image above the mundane. Getting great results is fast and quite enjoyable – not to mention infinitely easier than in Photoshop alone. Overall, Optics is a great tool for any photographer or graphic designer.

Click through the gallery images below to see further examples of looks and styles created with Boris FX Optics 2022.

©2022 Oliver Peters

Photo Phun 2021

The last time I posted one of these “just for fun” sets of photography samples was in 2015, so it’s high time for another one. Earlier this year I reviewed FiLMiC Pro’s iOS movie application and put together a small demo highlighting its capabilities. At the same time I installed Firstlight – FiLMiC’s iOS still photo/camera application.

Firstlight extends the power of the iPhone camera system with extra controls and features. It also adds film-style looks, including grain, vignettes, and film stock simulations. My past Photo Phun blog posts have showcased post processes that could be applied to existing images. The examples in this article used no image manipulation in post (outside of size and crop adjustments). The “look” of each sample was created in-camera, using Firstlight’s built-in features. Everything was shot with a stock iPhone SE 2020 – no lens attachments, filters, or tripod.

I occasionally throw shade on the true quality of the best iPhone images. If image quality is the critical factor, then give me a high-end DSLR any day. But, the iPhone is “the camera you have with you” and as such is the modern digital equivalent of the Kodak Instamatic camera, albeit with a lot better image quality. We can pixel-peep all day long, but the bottom line is that iPhones and top-of-the-line Android phones create some stunning imagery in the right hands.

I’ve shot a fair amount of 35mm slide and print film over the years as an amateur photographer. I’ve also manipulated images in post for my share of motion picture film. Any film stock emulation LUT is suspect in my eyes. You can get close, but ultimately the look designed by a company can only be an approximation. FiLMiC positions its looks as film simulations inspired by 19th and 20th century photography. I think that’s the right approach – not claiming that a given simulation is exactly the same as a specific photochemical stock from a film manufacturer or a development process at a lab.

Firstlight includes a wide range of film simulations to choose from. Bear in mind that a look will change the saturation or hue of certain colors. This will only be readily obvious if those colors are in the images that you shoot. For example, C41 Plus shifts blues towards cyan. If you have blue skies in your shot, the resulting skies will appear cyan. However, if your scene is devoid of blues, then you might not see as much effect from that simulation. Think about matching the look to the content. Shoot the right subject in B&W Noir and any shot will look like it came from a gothic tale!

Remember that your iPhone has no true viewfinder – just the screen. If you are outside in bright sunlight, you can barely see what you are shooting, let alone the color differences between film simulations. If you intend to use a film simulation, then plan the shot out ahead of time. Become familiar with what each setting is intended to do, because that look will be baked in. If you get back to the computer and realize you made a mistake, then it’s too late. Personally, I’m keen on post rather than in-camera manipulations. However, if you feel confident in the results – go for it.

The gallery below features examples using Firstlight’s simulated film looks. Each choice has been noted on most of the screens, although some will be very obvious. Other than the changes in the film simulation, the rest of the Firstlight settings are identical for all images. The camera was set to 3:2 aspect ratio, AE mode enabled, and HDR active. Both a fine grain and a vignette at the lowest setting were also applied to each during Firstlight’s image capture.

Check out my Flickr page at this link to see some of these same images and more with added Photoshop and BorisFX Optics development and effects.

Click any gallery thumbnail below to view an enlarged slideshow of these photos.

©2021 Oliver Peters

Five Adobe Workflow Tips

Subscribers to Adobe Creative Cloud have a whole suite of creative tools at their fingertips. I believe most users often overlook some of the less promoted features. Here are five quick tips for your workflow. (Click on images to see an enlarged view.)

Camera Raw. Photographers know that the Adobe Camera Raw module is used to process camera raw images, such as .cr2 files. It’s a “develop” module that opens first when you import a camera raw file into Photoshop. It’s also used in Bridge and Lightroom. Many people use Photoshop for photo enhancement – working with the various filters and adjustment layer tools available. What may be overlooked is that you can use the Camera Raw Filter in Photoshop on any photo, even if the file is not raw, such as a JPEG or TIFF.

Select the layer containing the image and choose the Camera Raw Filter. This opens that image into this separate “develop” module. There you have all the photo and color enhancement tools in a single, comprehensive toolkit – the same as in Lightroom. Once you’re done and close the Camera Raw Filter, those adjustments are now “baked” into the image on that layer.

Remix. Audition is a powerful digital audio workstation application that many use in conjunction with Premiere Pro or separately for audio productions. One feature it has over Premiere Pro is the ability to use AI to automatically edit the length of music tracks. Let’s say you have a music track that’s 2:47 in length, but you want a :60 version to underscore a TV commercial. Yes, you could manually edit it, but Audition Remix turns this into an “automagic” task. This is especially useful for projects where you don’t need to have certain parts of the song time to specific visuals.

Open Audition, create a multitrack session, and place the music selection on any track in the timeline. Right-click the selection and enable Remix. Within the Remix dialogue box, set the target duration and parameters – for example, short versus long edits. Audition will calculate the number and location of edit points to seamlessly shorten the track to the approximate desired length.

Audition attempts to create edits at points that are musically logical. You won’t necessarily get an exact duration, since the value you entered is only a target. This is even more true with tracks that have a long musical fade-out. A little experimentation may be needed. For example, a target value of :59 will often yield significantly different results than a target of 1:02, thanks to the recalculation. Audition’s remix isn’t perfect, but will get you close enough that only minimal additional work is required. Once you are happy, bounce out the edited track for the shortened version to bring into Premiere Pro.

Photoshop Batch Processing. If you want to add interesting stylistic looks to a clip, then effects filters in Premiere Pro and/or After Effects usually fit the bill. Or you can go with expensive third party options like Continuum Complete or Sapphire from Boris FX. However, don’t forget Photoshop, which includes many stylized looks not offered in either of Adobe’s video applications, such as specific paint and brush filters. But, how do you apply those to a video clip?

The first step is to turn your clip into an image sequence using Adobe Media Encoder. Then open a representative frame in Photoshop to define the look. Create a Photoshop action using the filters and settings you desire. Save the action, but not the image. Then create a batch function to apply that stored action to the clean frames within the image sequence folder. The batch operation will automatically open each image, apply the effects, and save the stylized results to a new destination folder.
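
For those who prefer scripting, the batch loop itself is simple. The sketch below uses Python and Pillow purely to illustrate the idea – Pillow stands in for the Photoshop action, and the filter choices and folder names are placeholders of my own:

```python
# Illustrative only: apply one stored "look" to every frame of an image sequence.
# Pillow stands in for the Photoshop action; folder names are hypothetical.
from pathlib import Path
from PIL import Image, ImageEnhance, ImageFilter

src = Path("clean_frames")    # frames exported by Adobe Media Encoder
dst = Path("styled_frames")   # destination for the processed frames
dst.mkdir(exist_ok=True)

def apply_look(img: Image.Image) -> Image.Image:
    # A crude painterly stand-in: smooth fine detail, then boost color and contrast.
    img = img.filter(ImageFilter.ModeFilter(size=6))
    img = ImageEnhance.Color(img).enhance(1.2)
    return ImageEnhance.Contrast(img).enhance(1.1)

for frame in sorted(src.glob("*.png")):
    apply_look(Image.open(frame).convert("RGB")).save(dst / frame.name)
```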

Open that new image sequence using any app that supports image sequences (including QuickTime) and save it as a ProRes (or other) movie file. Stylized effects, like oil paint, are applied to individual frames and will vary with the texture and lighting of each frame; therefore, the effect takes on an animated appearance in the stitched movie.
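
If QuickTime isn’t handy for the final stitch, ffmpeg can assemble the frames into a ProRes file. A rough sketch, assuming a 23.976 fps sequence and placeholder file names:

```python
# Sketch: stitch a numbered image sequence into a ProRes movie with ffmpeg.
# Assumes ffmpeg is installed; frame rate and file names are placeholders.
import subprocess

subprocess.run([
    "ffmpeg",
    "-framerate", "24000/1001",              # match the original clip's frame rate
    "-i", "styled_frames/frame_%05d.png",    # numbered frames from the batch step
    "-c:v", "prores_ks", "-profile:v", "3",  # ProRes 422 HQ via the prores_ks encoder
    "stylized_clip.mov",
], check=True)
```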

After Effects for broadcast deliverables. After Effects is the proverbial Swiss Army knife for editors and designers. It’s my preferred conversion tool when I have 24p masters that need to be delivered as 60i broadcast files.

Import a 23.98 master and place it into a new composition. Scale, if needed (UHD to HD, for instance). Send to the Render Queue. Set the frame rate to 29.97, field render to Upper (for HD), and enable pulldown (any whole/split frame cadence is usually OK). Turn off Motion Blur and Frame Blending. Render for a proper interlaced broadcast deliverable file.
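
For comparison, a similar 2:3 pulldown conversion can be roughed in with ffmpeg from the command line. This sketch uses assumed file names and isn’t a substitute for the After Effects render above – check field order and cadence against your delivery spec:

```python
# Sketch: 23.976p to 29.97i via ffmpeg's telecine filter (2:3 pulldown).
# File names are placeholders; verify field order against the delivery spec.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "master_2398.mov",
    "-vf", "telecine=first_field=top:pattern=23,setfield=tff",  # pulldown, flag as top-field-first
    "-c:v", "prores_ks", "-profile:v", "3",
    "delivery_2997i.mov",
], check=True)
```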

Photoshop motion graphics. One oft-ignored (or forgotten) feature of Photoshop is that you can do layer-based video animation and editing within it. Essentially there’s a very rudimentary version of After Effects inside Photoshop. While you probably wouldn’t want to use it for video editing instead of After Effects or Premiere Pro, Photoshop does have value in creating animated lower thirds and other titles.

Photoshop provides much better text and graphic style options than Premiere Pro. The files are also more lightweight than an After Effects comp on your Premiere timeline – or than rendered animated ProRes 4444 movies. Since it’s still a Photoshop file (albeit a special version), the “edit in original” command opens the file in Photoshop for easy revisions. Let’s say you are working on a show that has 100 lower thirds that slide in and fade out. These can easily be prepped for the editor by the graphics department in Photoshop – no After Effects skills required.

Create a new file in Photoshop, turn on the timeline window, and add a new blank video layer. Add a still onto a layer for positioning reference, delete the video layer, and extend the layers and timeline to the desired length. Now build your text and graphic layers. Keyframe changes to opacity, position, and other settings for animation. Delete the reference image and save the file. This is now a keyable Photoshop file with embedded animation properties.

Import the Photoshop file into Premiere with Merged Layers and add it to your timeline. The style in Premiere should match the look created in Photoshop, and the graphic will animate based on the keyframe settings you made in Photoshop.

©2021 Oliver Peters