Photo Phun 2021

The last time I posted one of these “just for fun” sets of photography samples was in 2015, so it’s high time for another. Earlier this year I reviewed FiLMiC Pro, the company’s iOS movie application, and put together a small demo highlighting its capabilities. At the same time I installed Firstlight – FiLMiC’s iOS still photo/camera application.

Firstlight extends the power of the iPhone camera system with extra controls and features. It also adds film-style looks, including grain, vignettes, and film stock simulations. My past Photo Phun blog posts have showcased post-processing treatments applied to existing images. The examples in this article used no image manipulation in post (outside of size and crop adjustments). The “look” of all samples was created in-camera, using Firstlight’s built-in features. Everything was shot on a stock iPhone SE 2020 – no lens attachments, filters, or tripod.

I occasionally throw shade on the true quality of the best iPhone images. If image quality is the critical factor, then give me a high-end DSLR any day. But, the iPhone is “the camera you have with you” and as such is the modern digital equivalent of the Kodak Instamatic camera, albeit with a lot better image quality. We can pixel-peep all day long, but the bottom line is that iPhones and top-of-the-line Android phones create some stunning imagery in the right hands.

I’ve shot a fair amount of 35mm slide and print film over the years as an amateur photographer. I’ve also manipulated images in post for my share of motion film. Any film stock emulation LUT is suspect in my eyes. You can get close, but ultimately the look designed by a company can only be an approximation. FiLMiC positions its looks as film simulations inspired by 19th and 20th century photography. I think that’s the right approach, without claiming that a given simulation is the exact same as a specific photochemical stock from a film manufacturer or a development process at a lab.

Firstlight includes a wide range of film simulations to choose from. Bear in mind that a look will change the saturation or hue of certain colors. This will only be readily obvious if those colors are in the images that you shoot. For example, C41 Plus shifts blues towards cyan. If you have blue skies in your shot, the resulting skies will appear cyan. However, if your scene is devoid of blues, then you might not see as much effect from that simulation. Think about matching the look to the content. Shoot the right subject in B&W Noir and any shot will look like it came from a gothic tale!
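To make that kind of baked-in color shift concrete, here’s a minimal Python sketch of a hue nudge from blue toward cyan, similar in spirit to what a simulation like C41 Plus does. This is purely my own illustration – the function name, hue range, and `amount` value are assumptions, not FiLMiC’s actual math:

```python
import colorsys

def shift_blues_toward_cyan(r, g, b, amount=0.05):
    """Nudge hues near blue (h ~ 0.67) toward cyan (h ~ 0.5).
    RGB values are floats in the 0-1 range."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    if 0.55 < h < 0.75:            # only touch the blue-ish range
        h = max(0.5, h - amount)   # slide toward cyan, but not past it
    return colorsys.hsv_to_rgb(h, s, v)

# A sky blue picks up green (i.e., drifts toward cyan),
# while a neutral gray passes through untouched.
sky = shift_blues_toward_cyan(0.4, 0.6, 1.0)
gray = shift_blues_toward_cyan(0.5, 0.5, 0.5)
```

The point of the sketch is the conditional: if your scene contains no blue-range hues, the look never fires, which is why a simulation can seem subtle on one subject and dramatic on another.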

Remember that your iPhone has no true viewfinder – just the screen. If you are outside in bright sunlight, you can barely see what you are shooting, let alone the color differences between film simulations. If you intend to use a film simulation, then plan the shot out ahead of time. Become familiar with what each setting is intended to do, because that look will be baked in. If you get back to the computer and realize you made a mistake, then it’s too late. Personally, I’m keen on post rather than in-camera manipulations. However, if you feel confident in the results – go for it.

The gallery below features examples using Firstlight’s simulated film looks. The simulation used is noted on most of the images, although some are obvious at a glance. Other than the change of film simulation, the rest of the Firstlight settings are identical for all images. The camera was set to 3:2 aspect ratio, AE mode enabled, and HDR active. A fine grain and a vignette at the lowest setting were also applied to each image during Firstlight’s capture.

Check out my Flickr page at this link to see some of these same images and more with added Photoshop and BorisFX Optics development and effects.

Click any gallery thumbnail below to view an enlarged slideshow of these photos.

©2021 Oliver Peters

Final Cut Pro at 10 and Other Musings

Recently Final Cut Pro (formerly Final Cut Pro X) hit its tenth anniversary. Since I’ve been a bit quiet on this blog lately due to the workload, I thought it was a good time to reflect. I recently cut a set of involved commercials using FCP. While I’ve cut literally thousands of commercials in my career, my work in recent years tends to be corporate/branding/image content in the five to ten minute range. I work in a team and the tool of choice is Premiere Pro. It’s simply a better fit for us, since the bulk of staff and freelancers are very fluid in Adobe products and less so with Apple’s pro software. Sharing projects and elements also works better in the Adobe ecosystem.

Cutting the spots in Final Cut Pro

In the case of the four :60s, I originally budgeted about two days each, plus a few days for client revisions – eleven days in total. My objective was to complete the creative cut, but none of the finishing, since these spots involved extensive visual effects. I was covering for the client’s regular editor, who had a scheduled vacation and would finish the project. The spots were shot with a Sony Venice, simultaneously recording 6K RAW and 4K XAVC (AVC-Intra) “proxy” files. The four spots totaled over 1200 clips, with approximately an hour of footage per spot. My cutting options were to work natively with the Sony RAW media in Premiere Pro or DaVinci Resolve, or to edit with the proxies in any NLE.

The Sony RAW files are large and don’t perform well playing from a shared storage system. I also didn’t want to tie up days copying location drives to the NAS, and I wanted to be able to access the media whether cutting at home or at the work facility. So I opted to use the proxies, which allowed me to cut the spots in FCP. If you think of proxies as low-res files, you’d be wrong: these Sony XAVC files are high-res, camera-original files on par with 4K ProRes HQ media. If it weren’t for the VFX, they would actually be the high-quality source files used for the final edit.

I copied the proxy files to a 2TB Samsung T7 SSD portable drive. This gave me the freedom to edit wherever – either on my iMac at home or one of the iMac Pros at work. This is where Final Cut Pro comes in. When you wade through that much footage, it’s easy for an NLE to get bogged down by caching footage or for the editor to get lost in the volume of clips. Thanks to skimming and keyword collections, I was able to cut these spots far more quickly than using any of the other NLE options. I could go from copying proxy files to my first cut on a commercial within a single day. That’s half of the budgeted time.

The one wrinkle was that I had to turn over a Premiere Pro project linked to the RAW media files. There are various ways to do that, but automatic relinking is dicier with these RAW files, because each clip is within its own subfolder, similar to RED. This complicates Premiere’s ability to easily relink files. So rather than go through XtoCC, I opted to import the Sony RAW clips into Resolve, then import the FCPXML, which in turn automatically relinked to the RAW files in Resolve.

There are a few quirks in this method that you have to suss out, but once everything was correct in Resolve, I exported an XML for Premiere. In Premiere Pro, I imported that XML, made sure that Premiere linked to the RAW files, corrected any size and speed issues, removed any duplicate clips, and then the project was ready for turnover. While one could look at these steps and question the decision to not cut in Premiere in the first place, I can assure you that cutting with Final Cut was considerably faster and these roundtrip steps were minor.

Remote workflows

Over the past year, remote workflows and a general “work from home” movement have shifted how the industry operates. So much of what I do requires a connection to shared storage, so working entirely from home is impractical. These spots were the exception for me, but the client and director live across the country. In years past, they used to fly in and work in supervised sessions with me. In more recent years, however, that work has been unattended, using various review-and-approval solutions for client feedback and revisions. In the case of these spots, my workflow wasn’t any different than it would have been two years ago.

On the other hand, since I have worked with these clients in supervised sessions, as well as remote projects, it’s easy to see what’s been lost in this shift. Remote workflows present two huge drawbacks. The first is turnaround time. It’s inherently an inefficient process. You’ll cut a new version, upload it for review, and then wait – often for hours or even the next day. Then make the tweaks, rinse, and repeat. This impacts not only the delivery schedule, but also your own ability to book sessions and determine fair billing.

Secondly, ideation takes a back seat. When a client is in the room, you can quickly go through options, show a rearranged cut, alternate takes, and so on. Final Cut’s audition function is great for this, but it’s a wasted feature in these modern workflows. During on-prem sessions, you could quickly show a client the options, evaluate, and move on. With remote workflows, that’s harder to do and is subject to the same review latency, so fewer options can be properly vetted in the cut.

The elephant in the room is security. I know there are tons of solutions for “drilling” into your system from home that are supposed to be secure. In reality, the only true security is to have your system disconnected from the internet – and even that isn’t totally bulletproof. As Sony Pictures, QNAP owners, Colonial Pipeline, agencies of the US government, and multiple other corporations have found out, if a bad actor wants to get into your system, they can. No amount of encryption, firewalls, VPNs, multi-factor authentication, or anything else is guaranteed to stop them. While remote access might have been a necessary evil due to COVID lockdowns, it’s not something that should be encouraged going forward.

However, I know that I’m swimming against the stream on this. Many editors/designers/colorists don’t seem to ever want to return to an office. This is at odds with surveys indicating that the majority of producers and agencies are champing at the bit to get back to working one-on-one. Real estate and commuting costs are factors that affect such decisions, so I suspect hybrids will evolve and the situation in the future may vary geographically.

Final Cut Pro’s future

I mention the WFH dilemma, because remote collaboration is one of the features that Apple has been encouraged to build into Final Cut Pro by some users. It’s clearly a direction Adobe has moved towards and where Avid already has a track record.

I’m not sure that’s in Apple’s best interest. For one thing, I don’t personally believe Apple does a good job of this. Access and synchronization performance of iCloud is terrible compared with Google’s solutions. Would a professional collaboration solution really be industry-leading and robust? I highly doubt it.

Naturally Apple wants to make money, but they are also interested in empowering the creative individual – be that a professional or an enthusiast. Define those terms in whatever way you like, but the emphasis is on the individual. That direction seems to be at odds with what “pro” users think should be the case for Apple ProApps software, based on their experiences in the late years of FCP 1-7/FCP Studio (pre-X).

I certainly have my own feature request list for Final Cut Pro, but ultimately the absence of those features didn’t stop me from a rapid turnaround on the spots I just discussed – nor on other projects where I turn to FCP as the tool of choice. I use all four major NLEs and will probably never settle on a single “best” NLE for all cases.

The term “YouTube content creator” or “influencer” is often used as a pejorative, but for many filmmakers and marketers, outlets like YouTube, Facebook, and Instagram have become the new “broadcast.” I recently interviewed Alexander Fedorov, a Russian photographer/filmmaker/vlogger who epitomizes the type of content creator for whom Apple is designing its professional products. I feel that Apple can indeed serve multiple types of users, from the individual, self-taught filmmaker to the established broadcast pro. How Apple does that moving forward within a tool like Final Cut Pro is anyone’s guess. All I know is that the old yardsticks of what is and isn’t “pro” no longer work in so many different arenas.

©2021 Oliver Peters


A regrettable aspect of history and the march of time is that many interesting stories are buried or forgotten. We learn the bullet points of the past, but not the nuances that bring history alive. It’s a challenge that many documentarians seek to meet. While the WWII era is ripe with heroic tales, one unit was almost forgotten.

Women Airforce Service Pilots (aka WASP)

As WWII ramped up, qualified male pilots were sent to European and Pacific combat, leaving a shortage of stateside pilots. The WASP unit was created as a civilian auxiliary attached to the U.S. Army Air Forces. It was organized and managed by Jackie Cochran, an accomplished female aviator and entrepreneur. More than 25,000 women applied to the WASP, but only 1,830 were accepted into the program.

The WASP members engaged in military-style training at Avenger Field in Sweetwater, Texas. They wore uniforms and were given flight assignments by the military, yet they weren’t actually in the military. Their role was to handle all non-combat military flight tasks within the states, including ferrying aircraft cross-country from factories to deployment bases, serving as test pilots, and handling training tasks like towing targets and flying mock strafing runs over combat trainees. During her service, the typical WASP flew more types of aircraft than most male military pilots. Sadly, 38 WASP died during training or active duty assignments.

Although WASP members joined with the promise of their unit becoming integrated into the regular military, that never happened. As the war wound down and male pilots returned home needing jobs, the WASP units were disbanded, due in part to Congressional and media resistance. Records were sealed and classified, and the WASP were almost forgotten by history. Finally, in the late 1970s, President Carter signed legislation that recognized WASP members as veterans and authorized veterans’ benefits. In 2009, President Obama and the Congress awarded WASP members the Congressional Gold Medal.

The documentary

Documentary filmmaker Jon Anderson set out over a decade ago to tell a complete story of the WASP in a feature-length film. Anderson, a history buff, had already produced and directed one documentary about the Tuskegee Airmen, so the WASP story was the next logical subject. The task was to interview as many living WASP as possible to tell their story. The goal was not just the historical facts, but also what it was like to be a WASP, along with some of the backstory details about Cochran and the unit’s formation. The result was W.A.S.P. – A Wartime Experiment in WoManpower.

Anderson accumulated a wealth of interviews, but with limited resources. This meant that interviews were recorded mostly on DV cameras in standard definition. However, as an instructor of documentary filmmaking at Valencia College, Anderson also utilized some of the film program’s resources in the production. This included a number of re-enactments – filmed with student crews, talent, and RED cameras. The initial capture and organization of footage was handled by a previous student of his using Final Cut Pro 7.

Technical issues

Jon asked me to join the project as co-editor after the bulk of interviews and re-enactments had been compiled. Several dilemmas faced me at the front end. The project was started in FCP7, which was now a zombie application. Should I move the project to Final Cut Pro X, Premiere Pro, or Media Composer? After a bit of experimentation, the best translation of the work that had already been done was into Premiere Pro. Since we had a mix of SD and HD/4K content, what would be the best path forward – upconvert to HD or stay in standard def? HD seemed to be the best option for distribution possibilities, but that posed additional challenges.

Only portions of tapes were originally captured – not complete tapes. These were also captured with separated audio and video going to different capture folders (a feature of FCP “classic”). Timecode accuracy was questionable, so it would be nearly impossible to conform the current organized clips from the tapes at a higher resolution. But since it was captured as DV from DV tapes, there was no extra quality loss due to interim transcoding into a lower resolution file format.

Ultimately I opted to stick with what was on the drives as my starting point. Jon and I organized sequences, and I was able to borrow a Blackmagic Teranex unit. I played the various sequences out from one computer through the Teranex and captured them on a second, with the Teranex handling the SD-to-HD upconversion and de-interlacing of any interlaced footage. This left us with upscaled ProRes interviews that were 4×3 within a 16×9 HD sequence. Nearly all interviews were filmed against a black limbo background, so I then masked around each woman on camera. In addition, each was reframed to the left or right side, depending on where she faced. Now we could place the interviews against another background – either true black, a graphic, or B-roll. Finally, all clips were graded using Lumetri within Premiere Pro. My home base for video post was TinMen – an Orlando creative production company.

Refining the story

With the technical details sorted out, it was time to refine the story. As with many docs, we ended up with more possible storylines than would fit. It’s always a whittling process to reveal a story’s essence and to decide which items are best left out so that the rest remains clear. Interviews were bridged with voice-overs plus archival footage, photos, or re-enactments to fill in historical details. This went through numerous rounds of refinement with input from Jon and Rachel Becker Wright, the producer and co-editor on the film. Along the way, Rachel was researching, locating, and licensing archival footage for B-roll.

Once the bulk of the main storyline was assembled with proper voice-overs, re-enactments, and some B-roll, I turned the cut over to Rachel. She continued with Jon to refine the edit with graphics, music, and final B-roll. Sound post was handled by the audio production department at Valencia College. A nearly-final version of the 90-minute documentary was presented at a “friends and family” screening at the college.


Many readers know about the national Emmy® Awards handed out annually by the National Academy of Television Arts and Sciences (NATAS). It may be less known that NATAS includes 19 regional chapters, which also award Emmys within their regions. Awards are handed out for projects presented in that region, usually via local broadcast or streaming. Typically the award goes to the project as a whole, without additional craft categories. Anderson was able to submit a shortened version of the documentary for judging by the Suncoast regional chapter, which includes Florida, Puerto Rico, and parts of Louisiana, Alabama, and Georgia. I’m happy to say that W.A.S.P. – A Wartime Experiment in WoManpower won a 2020 regional Emmy, which included Jon Anderson, Rachel Becker Wright, Joe Stone (production designer), and myself.

Awards are nice, of course, but getting the story out about the courageous ladies of the WASP is far more important and I was happy to play a small part in that.

©2021 Oliver Peters

Producing a Short Mini-Doc with the AJA CION

AJA surprised the industry in 2014 when it rolled out its CION digital cinema 4K camera. Although not known as a camera manufacturer, it had been working on this product for over four years. Last year the company offered its Try CION promotion (ended in October), which loaned camera systems to qualified filmmakers. Even though this promotion is over, potential customers with a serious interest can still get extended demos of the camera through their regional AJA sales personnel. It was in this vein that I arranged a two-week loan of a camera unit for this review.

I’m a post guy and don’t typically write camera reviews; however, I’m no stranger to cameras either. I’ve spent a lot of time “shading” cameras (before that position was called a DIT) and have taken my turn as a studio and field camera operator. My interest in doing this review was to test the process. How easy was it to use the camera in actual production and how easy was the post workflow associated with it?

CION details

The AJA CION is a 4K digital camera that employs an APS-C CMOS sensor with a global shutter and both infrared-cut and optical low-pass filters. It can shoot in various frame sizes (from 1920×1080 up to 4096×2160) and frame rates (from 23.98 up to 120fps). Sensor scaling rather than windowing/cropping is used, which means the field of view for a given lens is the same in 4K as in 2K or HD. In other words, a 50mm lens yields the same optical framing at all recording sizes.
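The difference between scaling and windowing is easy to check with a bit of trigonometry. Here’s a rough Python sketch; the sensor width is an assumed, illustrative APS-C figure rather than AJA’s published spec:

```python
import math

def horizontal_fov(sensor_width_mm, focal_length_mm):
    """Horizontal angle of view for a given active sensor width."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

APS_C_WIDTH = 23.5      # mm, assumed active width for illustration
FULL_SENSOR_PIX = 4096  # full horizontal photosite count

for fmt, pix in [("4K", 4096), ("2K", 2048), ("HD", 1920)]:
    # Sensor scaling: the full width is always read out, then scaled down.
    scaled = horizontal_fov(APS_C_WIDTH, 50)
    # Windowing: only a centered, pix-wide region of the sensor is read out.
    windowed = horizontal_fov(APS_C_WIDTH * pix / FULL_SENSOR_PIX, 50)
    print(f"{fmt}: scaled {scaled:.1f} deg, windowed {windowed:.1f} deg")
```

With scaling, the 50mm angle of view stays constant across formats; with windowing, dropping to 2K or HD would effectively double the focal length’s reach – which is exactly the behavior the CION avoids.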

The CION records in Apple ProRes (up to ProRes 4444) using a built-in Pak media recorder. Think of this as essentially an AJA KiPro built right into the camera. Since Pak media cards aren’t FAT32-formatted like the CF or SD cards used by other cameras, you don’t run into the 4GB file-size limit that causes clip-spanning. You can also record AJA Raw externally (such as to an AJA KiPro Quad) over 3G-SDI or Thunderbolt. Video is linear without any log encoding schemes, but there are a number of gamma profiles and color correction presets.
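That 4GB ceiling matters more than it sounds at 4K data rates. A quick back-of-the-envelope calculation shows why; the bitrate here is an approximate figure for 4K ProRes 422 HQ at 23.98fps, not an exact spec:

```python
FAT32_LIMIT_BYTES = 4 * 1024**3   # FAT32's 4 GiB file-size ceiling
BITRATE_MBPS = 707                # approx. 4K ProRes 422 HQ @ 23.98fps (assumed)

bytes_per_second = BITRATE_MBPS * 1_000_000 / 8
seconds_to_limit = FAT32_LIMIT_BYTES / bytes_per_second

# On a FAT32 card, every clip longer than this would span multiple files.
print(f"4GB is reached in about {seconds_to_limit:.0f} seconds")  # roughly 49s
```

In other words, on a FAT32 card nearly every take of an interview would be split into spanned clips, which is exactly the housekeeping the Pak media format avoids.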

It is designed as an open camera system, using standard connectors for HDMI, BNC, XLR, batteries, lens mounts, and accessories. The CION uses a PL lens mount, because that’s the most open system and the one for which the best glass is made. When the AJA rep sent me the camera, it came ready to shoot and included a basic camera configuration plus accessories: some rods, an Ikan D5w monitor, a Zeiss Compact Prime 28mm lens, 512GB and 256GB solid-state Pak media cards, and a Pak media dock/reader. The only items not included – other than tripod, quick-release base plate, and head, of course – were camera batteries. The camera comes with a standard battery plate, as well as an AC power supply.

Learning the CION

The subject of this mini-doc was a friend of mine, Peter Taylor. He’s a talented luthier who builds and repairs electric and acoustic guitars and basses under his Chellee brand. He also designs and produces a custom line of electric guitar pedals. To pull this off, I partnered with the Valencia College Film Production Technology Program, with whom I’ve edited a number of professional feature films and where I teach an annual editing workshop. I worked with Ray Bracero, a budding DP and graduate of the program who helps there as an instructional assistant. This gave me the rest of the package I needed for the production, including more lenses, a B-camera for the interview, lighting, and sound gear.

Our production schedule was limited with only one day for the interview and B-roll shots in the shop. To augment this material, I added a second day of production with my son, Chris Peters, playing an original track that he composed as an underscore for the interview. Chris is an accomplished session musician and instructor who plays Chellee guitars.

With the stage set, this provided about half a day for Ray and me to get familiar with the CION, plus two days of actual production, all within the same week. If AJA was correct in designing an easy-to-use cinematic camera, then this would be a pretty good test of that concept. Ray had never run a CION before, but was familiar with REDs, Canons, and other camera brands. Picking up basic CION operation was simple. The menu is easier to navigate than on most other cameras – it uses the same structure as a KiPro – and there’s also an optional remote set-up if you want a wireless connection to the CION from a laptop.

4K wasn’t warranted for this project, so everything was recorded in 2K (2048×1080) to be used in an HD 2.35:1 sequence (1920×817). This gave me some room to reframe in post. All sync-sound shots were 23.98fps and all B-roll was shot in slow motion. The camera permits “overcranking,” meaning we shot at 59.94fps for playback at 23.98fps. The camera can go up to 120fps, but only when recording externally in AJA Raw. To keep it simple on this job, all recording was internal to the Pak media card – ProRes HQ for the sync footage and ProRes 422 for the slow motion shots.
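The overcranking math is simple but worth spelling out – shooting at 59.94 and playing back at 23.98 yields roughly 40% speed. A quick sketch:

```python
capture_fps = 59.94    # shooting frame rate ("overcranked")
playback_fps = 23.98   # timeline frame rate

speed = playback_fps / capture_fps      # fraction of real-time speed
slowdown = capture_fps / playback_fps   # how much longer the action runs

print(f"Playback speed: {speed:.1%}")                       # about 40% of real time
print(f"A 4-second action plays for {4 * slowdown:.1f}s")   # about 10 seconds
```

That 2.5x stretch is why slow motion B-roll eats through footage – and card space – faster than you’d expect from the shooting schedule alone.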

Production day(s)

The CION is largely a “what you see is what you get” camera, so don’t plan on extensive correction in post. What you see on the monitor is typically what you’ll get, so light and control your production set-up accordingly. It doesn’t have as wide a dynamic range as an ARRI ALEXA, for example. The bottom EI (exposure index) is 320, and that’s pretty much the sweet spot where you want to operate – similar to the original RED One. This means that in bright exteriors you’ll need filtering to knock down the light. There’s also not much benefit in running with a high EI: the ALEXA, for instance, looks great at 800, but that setting didn’t seem to help the CION.

Gamma profiles and color temperature settings didn’t behave as I would have expected from other cameras. With our lighting, I would have expected a white balance of 3200K; however, 4500 looked right to the eye and was, in fact, correct in post. The various gamma profiles didn’t help with clipping the way Log-C does, so we ultimately stayed with Normal/Expanded, which shifts the midrange down to give you some protection for highlights. Unfortunately, when the CION clips highlights or crushes blacks, that is actually how the signal is recorded – those areas are not recoverable. The camera’s low end is very clean and there’s a meaty midrange. We also discovered that you cannot monitor the video over SDI while recording 59.94-over-23.98 (slow motion). Fortunately, HDMI does maintain a signal, so all was good once we switched to the HDMI connection.

CION features a number of color correction presets. For Day 1 in the luthier shop, I used the Skin Tones preset. This is a normal color balance, which slightly desaturates the red-orange range, thus yielding more natural flesh tones. On Day 2 for the guitar performance, I switched to the Normal color correction preset. The guitar being played has a red sunburst paint finish and the Skin Tones preset pulled too much of the vibrance out of the guitar. Normal more closely represented what it actually looked like.

During the actual production, Ray used three Zeiss Super Speed Primes (35mm, 50mm, and 85mm) on the CION, plus a zoom on the Canon 5D B-camera. Since the locations were tight, he used an ARRI 650w light with diffusion for a key and bounced a second ARRI 150w light as the back light. The CION permits two channels of high-quality audio input (selectable line, mic, or +48v). I opted to wire straight into the camera, instead of using an external sound recorder. Lav and shotgun mics were directly connected to each channel for the interview. For the guitar performance, the amp was live-mic’ed into an Apogee audio interface (part of Chris’ recording system) and the output of that was patched into the CION at line level.

The real-time interview and performance material was recorded with the CION mounted on a tripod, but all slow motion B-roll shots were handheld. Since the camera had been rigged with a baseplate and rods, Ray opted to use it in that configuration instead of taking advantage of the nice shoulder pad on the CION. This gave him an easy grasp of the camera for “Dutch angles” and close working proximity to the subject. Although the rig was a bit cumbersome, the CION’s light weight made such quick changes possible.

Post production

As an editor, I want a camera to make life easy in post, which brought me to Apple Final Cut Pro X for the finished video. Native ProRes, easy syncing of two-camera interviews, and simple-yet-powerful color correction make FCPX a no-brainer. We recorded a little over three hours of material – 146 minutes on the CION, 37 minutes on the 5D, and 11 minutes on a C500 (for two pick-up shots). All of the CION footage consumed only about 50% of the single 512GB Pak media card. Using the Pak media dock, transfer times were fast. While Pak media isn’t cheap, the cards are very robust, and unless you are chewing through tons of 4K, you actually get a decent amount of recording time on them.

I applied only a minor amount of color correction to the CION footage. This was primarily to bring up the midrange due to the Normal/Expanded gamma profile, which naturally makes the recorded shot darker. The footage is very malleable without introducing the kind of grain-like sensor noise artifacts that I see with other cameras under a similar amount of correction. Blacks stay true black and clean. Although my intention was not to match the 5D to the CION – I had planned on a stylized correction instead – in the end I matched it anyway, since I only used two shots. Surprisingly, I was able to get a successful match.

Final thoughts

The CION achieved the design goals AJA set for it. It is easy to use, ergonomic, and gets you a good image with the least amount of fuss. As with any camera, there are a few items I’d change. For example, the front monitoring connectors are too close to the handle. Occasionally you have to press record twice to make sure you are really recording. And there’s venting on the top, which would seem to be an issue if you suddenly got caught in the rain. Overall, I was very happy with the results, but I think AJA still needs to tweak the color science a bit more.

In conjunction with FCPX for post, this camera/NLE combo rivals ARRI’s ALEXA and AMIRA for post production ease and efficiency. No transcoding. No performance hits due to taxing, native, long-GOP media. Proper file names and timecode. A truly professional set-up. At a starting point of $4,995, the AJA CION is a dynamite camera for the serious producer or filmmaker. The image is good and the workflow outstanding.

Click this link to see the final video on Vimeo.

Originally written for Digital Video magazine / Creative Planet Network

©2016 Oliver Peters

Photo Phun 2015


It’s holiday time again and a chance to take a break from serious talk about editing and the tools – sort of. I’ve done a version of this post for a few years. Usually I take a series of my photos and run them through Photoshop, Lightroom, or one of the other photography applications to create stylized treatments. This year, I figured, why not try it with Final Cut Pro X?

These images have all been processed in a custom FCP X timeline set to 2000 x 1500 pixels. I’ve used a wide range of filters, including some from the FxFactory partner family, Koji, the built-in FCP X effects, as well as my own effects published over from Motion as templates. Enjoy these as we go into the holiday season. See you in the new year!

Click any image to see a slideshow of these photos.

©2015 Oliver Peters