A Light Footprint

When I started video editing, the norm was an edit suite with three large quadruplex (2”) videotape recorders, video switcher, audio mixer, B&W graphics camera(s) for titles, and a computer-assisted, timecode-based edit controller. This was generally considered an “online edit suite”, but in many markets, this served as both “offline” (creative cutting) and “online” (finishing). Not too long thereafter, digital effects (ADO, NEC, Quantel) and character generators (Chyron, Aston, 3M) joined the repertoire. 2” quad eventually gave way to 1” VTRs and those, in turn, were replaced by digital – D1, D2, and finally Digital Betacam. A few facilities with money and clientele migrated to HD versions of these million-dollar rooms.

Towards the midpoint in the lifespan of this way of working, nonlinear editing took hold. After a few different contenders had their day in the sun, the world largely settled on Avid and/or Media 100 rooms. While a lower-cost commitment than the large online bays of the day, these nonlinear edit (NLE) bays still required custom-configured Macs, a fair amount of external storage, and proprietary hardware and monitoring to see a high-quality video image. Though crude at first, NLEs eventually proved capable of handling all video needs, including HD-quality projects and even higher resolutions today.

The trend towards smaller systems

As technology advanced, computers became faster and more powerful, storage capacities increased, and software that had required custom hardware evolved to work in a software-only mode. Today, it’s possible to operate with a fraction of the cost, equipment, and hassle of just a few years ago, let alone a room from the mid-70s. As a result, when designing or installing a new room, it’s important to question the assumptions about what makes a good edit bay configuration.

For example, today I frequently work in rooms running newer iMacs, 2013 Mac Pros, and even MacBook Pro laptops. These are all perfectly capable of running Apple Final Cut Pro X, Adobe Premiere Pro, Avid Media Composer, and other applications, without the need for additional hardware. In my interview with Thomas Grove Carter, he mentioned often working off of his laptop with a connected external drive for media. And that’s at Trim, a high-end London commercial editing boutique.

In my own home edit room, I recently set aside my older Mac Pro tower in favor of working entirely with my 2015 MacBook Pro. No more need to keep two machines synced up and the MBP is zippier in all respects. With the exception of some heavy-duty rendering (infrequent), I don’t miss using the tower. I run the laptop with an external Dell display and have configured my editing application workspaces around a single screen. The laptop is closed and parked in a BookArc stand tucked behind the Dell. But I also bought a Rain stand for those times when I need the MBP open and functioning as a second display.

Reduce your editing footprint

I find more and more editors working in similar configurations. For example, one of my clients is a production company with seven networked (NAS storage) workstations. Most of these are iMacs with few other connected peripherals. The main room has a 2013 “trash can” Mac Pro and a bit more gear, since this is the “hero” room for clients. If you are looking to downsize your editing environment, here are some pointers.

While you can work strictly from a laptop, I prefer to build it up for a better experience. Essential for me is a Thunderbolt dock. Check out OWC or CalDigit for two of the best options. This lets you connect the computer to the dock, and then everything else connects to that dock. One Thunderbolt cable to the laptop, plus power for the computer, leaves you with a clean installation and an easy-to-move computer. From the dock, I’m running a PreSonus AudioBox USB audio interface (to a Mackie mixer and speakers), a Time Machine drive, a G-Tech media drive, and the Dell display. If I were buying today, I would use the Mackie Onyx Blackjack interface instead of the PreSonus/Mackie mixer combo. The Blackjack is an all-in-one solution.

Expand your peripherals as needed

At the production company’s hero room, we have the extra need to drive some video monitors for color correction and client viewing. That room is configured much like the others, except with a Mac Pro and a connection to a QNAP shared storage solution. The latter connects over 10Gb/s Ethernet via a Sonnet Thunderbolt/Ethernet adapter.

When we initially installed the room, video to the displays was handled by a Blackmagic Design UltraStudio device. However, we had a lot of playback performance issues with the UltraStudio, especially when using FCPX. After some experimenting, we realized that both Premiere Pro and FCPX can send a fullscreen, [generally] color-accurate signal to the wall-mounted flat panel using only HDMI and no other video I/O hardware. We ended up connecting the HDMI from the dock to the display, and that’s the standard working routine when we are cutting in either Premiere Pro or Final Cut.

The rub for us is DaVinci Resolve. You must use some type of Blackmagic Design hardware product in order to get fullscreen video to a display when in Resolve. Therefore, the UltraStudio’s HDMI port connects to the second HDMI input of the large client display, and SDI feeds a separate TV Logic broadcast monitor. This allows for more accurate color rendition while grading. With Media Composer, there were no performance issues, but the audio and video signals want to go through the same device. So, if we edit in Avid, the signal chain goes through the UltraStudio, as well.

All of this means that in today’s world, you can work as lightly as you like. Laptop-only – no problem. iMac with some peripherals – no problem. A fancy, client-oriented room – still less hassle and cost than just a few short years ago. Load it up with extra control surfaces or stay light with a keyboard, mouse, or tablet. It all works today – pretty much as advertised. Gone are the days when you absolutely needed to drop a small fortune to edit high-quality video. You just have to know what you are doing and understand the trade-offs as they arise.

©2017 Oliver Peters

Adobe’s Late-2017 Creative Cloud Updates

According to Cisco, 82% of internet traffic will be video by 2021. Adobe believes over 50% of that will be produced video and not just simple user content. This means producers will be expected to produce more – working faster and smarter. In the newest Creative Cloud update, Adobe has focused on just such workflow improvements. These were previewed at IBC and will be released later this year.

Adobe Premiere Pro CC

With this release, Adobe has finally made it possible to have more than one project file open at the same time. You can move clips and sequences between open projects. In addition, projects can be locked by the user, making Premiere Pro the first NLE to enable multiple open projects and project locking within a single application. Adobe has also expanded project types to include both Team Projects (your project is in the cloud) and shared projects (your project is local). The latter is ideal for SAN/NAS environments and adds Avid-style collaboration.

Editors will enjoy specific timeline enhancements, like “close all gaps” and up to 16 label colors. The Essential Graphics panel gets some love with font filtering and a visual font preview window. Graphics templates will now include a minimum duration, so that these clips can be extended on the timeline, while leaving the fade-in and fade-out constant.

Adobe is doubling down on VR using its acquired Skybox technology. New are 19 immersive effects and transitions specific to VR projects. These are needed to properly seam wraparound edges when effects are added to VR clips. They are all GPU-only effects; however, as some VR clips can be 5K wide and larger, performance can be challenging. Nevertheless, Adobe reports decent performance with 6K VR clips at half-resolution on laptops like the HP ZBook or the 2017 15” MacBook Pro. There is also an immersive playback viewer designed for HMDs (head-mounted displays). It will display the image along with the Premiere Pro timeline window.
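
To picture why VR needs its own effects: in an equirectangular frame, the left and right edges are the same point on the sphere, so a filter that stops at the frame edge leaves a visible seam. Here’s a minimal Python sketch of the idea – my own illustration, not Adobe’s implementation – using a simple horizontal blur that wraps around:

```python
import numpy as np

def wrapped_blur(frame, radius=8):
    """Average each pixel with its horizontal neighbors, wrapping at the
    left/right edges so the seam of the 360 sphere stays invisible."""
    h, w, c = frame.shape
    out = np.zeros((h, w, c), dtype=np.float64)
    for dx in range(-radius, radius + 1):
        cols = (np.arange(w) + dx) % w   # modulo wrap: column -1 becomes w-1
        out += frame[:, cols, :]
    return (out / (2 * radius + 1)).astype(frame.dtype)

# A 1920x960 equirectangular test frame
frame = np.random.randint(0, 256, (960, 1920, 3), dtype=np.uint8)
blurred = wrapped_blur(frame)
```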

Premiere Pro’s non-VR editing updates, including shared projects, are explained well by the reTooled blog (video here).

Adobe Audition

Audition is the place to finalize your Premiere Pro mix, so a new auto-ducking mix tool has been added. This is based on Sensei, Adobe’s umbrella name for its artificial intelligence technologies. To use auto-ducking, the editor simply has to adjust sensitivity, amount of reduction, and fades, and then let Audition do the rest. Under the hood, the AI detects pauses in the dialogue and adjusts music volume accordingly.
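
Conceptually, this is sidechain ducking. The sketch below is my own Python illustration of the underlying idea – not Adobe’s Sensei code – with hypothetical parameter names loosely mirroring the panel’s sensitivity/reduction/fade controls:

```python
import numpy as np

def duck_music(dialog, music, sr, sensitivity=0.02, reduction_db=-12.0, fade_s=0.25):
    """Lower the music bed wherever the dialogue track has energy.
    Assumes two mono float arrays of equal length at sample rate sr."""
    win = max(1, sr // 20)                                   # 50 ms RMS window
    rms = np.sqrt(np.convolve(dialog ** 2, np.ones(win) / win, mode="same"))
    gain = np.where(rms > sensitivity, 10 ** (reduction_db / 20.0), 1.0)
    fade = max(1, int(fade_s * sr))                          # ramp in/out of the dip
    gain = np.convolve(gain, np.ones(fade) / fade, mode="same")
    return music * gain
```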

Other Audition enhancements include a timeline timecode overlay for the video viewer, the ability to simultaneously adjust dual-sided fades on clips, and new record and punch-in preferences for ADR work (“looping”).

After Effects

Here’s another example of this focus on time-savings. After Effects gains a new start-up window to set up the first composition. It also gains a keyboard command editor, and in this release, will add the same font previewing tools as Premiere Pro. The biggest new feature is an expansion of the expression controls. These can be tied to data files for the quick updating of template graphics. If you create a graphic – such as a map of the US with certain information displayed by colors for each state – and it’s based on a template tied to data, then changing the supporting data will automatically update the graphic. Other enhancements include GPU acceleration for third-party plug-ins that use the Mercury Playback Engine.

Character Animator

This live-capture, cartoon animation tool finally comes out of beta. A new feature will be the adjustment of the responsiveness of the animation tracking. This will permit live animation to look more hand-drawn. Actions can now be triggered by MIDI control panels. Triggers are editable in the timeline with a waveform for better matching of lip-sync.

There’s plenty of good user news, too, including the release of 6 Below, an ultra-wide film designed for the Barco three-screen format. It was edited by Vashi Nedomansky using Premiere Pro. Other Premiere Pro news includes the dramatic feature film Only The Brave, edited by Bill Fox, and Coup 53, a documentary in post being cut by Walter Murch. Both of these noted editors have been using Premiere Pro.

For more in-depth info, check out these links for a solid overview of Adobe’s soon-to-come Creative Cloud application updates:

ProVideo Coalition – Scott Simmons

Premiere Bro blog

Adobe’s own Digital Video & Audio blog

Originally written for Digital Video magazine / Creative Planet Network

©2017 Oliver Peters

Chromatic

Since its introduction six years ago, Apple Final Cut Pro X has only offered the Color Board as its color correction/grading tool. That’s in addition to some automatic correction features and stylized “look” effects. The Color Board interface is based on color swatches and puck sliders, instead of traditional color wheels, leaving many users pining for something else. To answer this need, several third-party, plug-in developers have created color corrector effects modules to fill the void. The newest of these is Chromatic from Coremelt – a veteran Final Cut plug-in developer.

The toolset

Chromatic is the most feature-rich color correction module currently available for FCPX. It offers four levels of color grading, including inside and/or outside of a mask, overall frame, and also a final output correction. When you first apply the Chromatic Grade effect to a clip, you’ll see controls appear within the FCPX inspector window. These are the final output adjustments. To access the full toolset, you need to click on the Grade icon, which launches a custom UI. Like other grading tools that require custom interfaces, Chromatic’s grading toolset opens as a floating window. This is necessitated by the FCPX architecture, which doesn’t give developers the ability to integrate custom interface panels, like you’ll find in Adobe applications. To work around this limitation, developers have come up with various ingenious solutions, including floating UI windows, HUDs (heads-up displays), and viewer overlays. Chromatic uses all of these approaches.

The Chromatic toolset includes nine correction effects, which can be stacked in any order onto a clip. These include lift/gamma/gain sliders, lows/mids/highs color wheels, auto white balance, replace color, color balance/temperature/exposure/saturation, three types of curves (RGB, HSL, and Lab), and finally, color LUTs. As you use more tools on a clip, these will stack into the floating window like layers. Click on any of these tools within the window to access those specific controls. Drag tools up or down in this window to rearrange the order of operation of Chromatic’s color correction processes. The specific controls work and look a lot like similar functions within DaVinci Resolve. This is especially true of HSL Curves, where you can control Hue vs. Sat or Hue vs. Luma.
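
Since the tool order is user-controlled, it’s worth noting that these corrections don’t commute. Coremelt doesn’t publish Chromatic’s exact math, so here is a generic lift/gamma/gain formulation in Python, just to show what the sliders do and why reordering tools in the stack changes the image:

```python
def lift_gamma_gain(x, lift=0.0, gamma=1.0, gain=1.0):
    """One common lift/gamma/gain formulation for a pixel value x in 0..1."""
    y = gain * (x + lift * (1.0 - x))   # lift raises shadows, gain scales the top
    y = min(1.0, max(0.0, y))           # clamp before the power function
    return y ** (1.0 / gamma)           # gamma bends the midtones

# Order of operations matters, just like reordering tools in the stack:
a = lift_gamma_gain(lift_gamma_gain(0.5, lift=0.1), gamma=1.2)
b = lift_gamma_gain(lift_gamma_gain(0.5, gamma=1.2), lift=0.1)
print(a, b)   # ~0.608 vs ~0.605 – close, but not the same grade
```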

Masking with the power of Mocha

Corrections can be masked, in order to affect only specific regions of the image. If you select “overall”, then your correction will affect the entire image. But if you select “inside” or “outside” of the mask, then you can grade regions of the image independently of each other. Take, for example, a common, on-camera interview situation with a darkened face in front of a brightly exposed exterior window. Once you mask around the face, you can then apply different correction tools and values to the face, as opposed to the background window. Plus, you can still apply an overall grade to the image, as well as final output adjustment tweaks with the sliders in the inspector window. That’s a total of four processes, with a number of correction tools used in each process.

To provide masking, Coremelt has leveraged its other products, SliceX and TrackX. Chromatic uses the same licensed Mocha planar tracker for fast, excellent mask tracking. In our face example, should the talent move around within the frame, then simply use the tracker controls in the masking HUD to track the talent’s movement within the shot. Once tracked, the mask is locked onto the face.

Color look-up tables (LUTs)

When you purchase Chromatic, you’ll also get a LUT (color look-up table) browser and a default collection of looks. (More looks may be purchased from Coremelt.) The LUT browser is accessible within the grading window. I’m not a huge fan of LUTs, as these are most often a very subjective approach to a scene that simply doesn’t work with all footage equally well. All “bleach bypass” looks are not equal. Chromatic’s LUT browser also enables access to any other LUTs you might have installed on your system, regardless of where they came from, as long as they are in the .cube format.
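
Sticking to .cube is less restrictive than it sounds, since it’s a simple plain-text format: a LUT_3D_SIZE declaration followed by size³ rows of RGB triples, with the red index varying fastest. A minimal Python reader – a sketch for illustration, not production code – looks something like this:

```python
def load_cube(path):
    """Minimal .cube reader: returns (size, table), where table is a flat
    list of [R, G, B] rows with the red index varying fastest."""
    size, table = None, []
    with open(path) as f:
        for raw in f:
            line = raw.strip()
            if not line or line.startswith("#"):
                continue                      # skip blanks and comments
            parts = line.split()
            if parts[0] == "LUT_3D_SIZE":
                size = int(parts[1])
            elif parts[0].replace(".", "").replace("-", "").isdigit():
                table.append([float(v) for v in parts[:3]])
            # TITLE, DOMAIN_MIN/MAX and other keywords are ignored here
    assert size and len(table) == size ** 3, "not a 3D cube LUT"
    return size, table
```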

LUTs get even more confusing with camera profiles, which are designed to expand flat-looking, log-encoded camera files into colorful Rec709 video. Under the best of circumstances these are mathematically correct LUTs developed by the camera manufacturer. These work as an inverse of the color transforms applied as the image is recorded. But in many cases, commonly available camera profile LUTs don’t come from the manufacturers themselves, but are actually reverse-engineered to behave much like the manufacturer’s own LUT. They will look good, but might not yield identical results to a true camera LUT.
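
To make “inverse of the color transforms” concrete, here’s a toy Python example. The constants are invented for illustration – they are not ARRI’s (or any manufacturer’s) actual curve – but the structure is the same: the camera applies a log curve, and the matching display transform undoes it exactly:

```python
import math

# Illustrative constants only – NOT any manufacturer's actual parameters
A, B, C, D = 5.6, 0.05, 0.25, 0.39

def camera_encode(x):
    """Scene-linear light -> log-encoded signal (what the camera records)."""
    return C * math.log10(A * x + B) + D

def display_lut(y):
    """A true 'log-to-Rec709' style transform is the exact inverse."""
    return (10 ** ((y - D) / C) - B) / A

x = 0.18   # mid-grey
assert abs(display_lut(camera_encode(x)) - x) < 1e-9   # round-trips cleanly
```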

In the case of FCPX, Apple has built in a number of licensed camera manufacturer LUTs for specific brands. These are usually auto-detected and applied to the footage without appearing as an effect in the inspector. So, for instance, with ARRI Alexa footage that was recorded as Log-C, FCPX automatically adds a LogC-to-Rec709 LUT. However, if you disable that and then subsequently add Chromatic’s LogC-to-Rec709 LUT, you’ll see quite a bit of difference in gamma levels. Apple actually uses two of these LUTs – a 2D and a 3D cube LUT. Current Alexa footage defaults to the 3D LUT, but if you change the inspector pulldown to the regular LogC LUT, you’ll see similar gamma levels to what Chromatic’s LUT shows. I’m not sure if the differences are because the LUT isn’t correct, or whether it’s an issue of where, within the color pipeline, the LUT is being inserted. My recommendation is to stick with the FCPX default camera profile LUTs and then use the Chromatic LUTs for creative looks.

In use

Chromatic is a 1.0 product and it’s not without some birthing issues. One that manifested itself is a clamping issue with 2013 Mac Pros. Apparently this depends on which model of AMD D-series GPU your machine has. On some machines with the D-500 chips, video will clamp at 0 and 100, regardless of whether or not clamping has been enabled in the plug-in. Coremelt is working on a fix, so contact them for support if you have this or other issues.

Overall, Chromatic is well-behaved as custom plug-ins go. Performance is good and rendering is fast. Remember that each tool you use on a clip is like adding an additional effects filter. Using all nine tools on a clip is like applying nine effects filters. Performance will depend on a lot of circumstances. For example, if you are working with 4K footage playing back from a fast NAS storage system, then it will take only a few applied tools before you start impacting performance. However, 1080p local media on a fast machine is much more forgiving, with very little performance impact during standard grading using a number of applied tools.

Coremelt has put a lot of work into Chromatic. To date, it’s the most comprehensive grading toolset available within Final Cut Pro X. It is like having a complete grading suite right inside of the Final Cut timeline. If you are serious about grading within the application and avoiding a roundtrip through DaVinci Resolve, then Chromatic is an essential plug-in tool to have.

©2017 Oliver Peters

Affinity Photo for iPad

UK developer Serif has been busy creating a number of Mac, Windows, and now iOS applications that challenge Adobe’s stranglehold on the imaging industry. Newest of these is Affinity Photo for the iPad. As newer iPads become more powerful – starting with the Air 2 and moving into the present with two Pro models – iOS app developers are taking notice. There have been a number of graphic and design apps available for iOS for some time, including Adobe Photoshop Express (PS Express), but none is as full-featured as Affinity Photo. There is very little compromise between the desktop version and the iPad version, making it the most sophisticated iOS application currently on the market.

Affinity Photo starts with an elegant user interface that’s broken down into five “personas”. These are essentially workspaces and include Photo, Selections, Liquify, Develop, and Tone Mapping. Various tools, specific to each persona, populate the left edge of the screen. So in Photo, that’s where you’ll find the crop, move, and brush tools, among others. The right edge displays a series of “Studios”. These often contain a set of tools, like layer management, adjustment filters, channel control, text, and so on. Everything we’ve come to expect from an advanced desktop graphics application is there. Naturally, if you own an iPad Pro with the Apple Pencil, then you can further take advantage of Serif’s support for the pressure-sensitivity of that input device.

Best of all, response is very fluid. For example, the Liquify persona offers an image mesh that you can drag around to bend or deform an image. There’s virtually no lag while doing this. Some changes require rasterization before moving on. In the case of Liquify, changes are non-destructive until you exit that persona. Then you are asked whether or not to commit to those changes. If you commit, then the distortion you’ve done in that persona is rendered to the image inside of the Photo app.

When working with photography, you’ll do your work either in the Develop or the Tone Mapping persona. As you would expect, Develop includes the standard photo enhancement tools, including color, red-eye, and lens distortion correction. There’s also detail enhancement, noise reduction, and a blemish removal tool. Tone mapping is more exotic. While intended for work with high dynamic range images, you can use these tools to create very stylized enhancement effects on non-HDR images, too.

All of this is great, but how do you get in and out of the iPad? That’s one of Affinity Photo’s best features. Like most iOS apps, you can bring in files from various cloud services like Dropbox. But being a photography application, you can also import any native iPad images from other applications, like the native Apple Photos. Therefore, if you snap a photo with your iPad camera, it’s available to Affinity Photo for enhancement. When you “save a copy” of the document, the processed file is saved to iCloud in its native .afphoto file format. These images can be accessed from iCloud on a regular Mac desktop or laptop computer. So if you also have the desktop (macOS) version of Affinity Photo, it will read the native file format, preserving all of the layer and effects information within that file. In addition, you can export a version from the iPad in a wide range of graphic formats, including Photoshop.

Affinity Photo includes sophisticated color management tools that aren’t commonly available in an iOS photo/graphics application. Exports may be saved in various color profiles. In addition, you can set various default color profiles and convert a document’s profile, such as from RGB to CMYK. While having histograms available for image analysis isn’t unusual, Affinity Photo also includes tools like waveform displays and a vectorscope, which are familiar to video-centric users.

Serif has made it very easy to get up and running for new users. At the launch screen, you have access to an interactive introduction, an extensive list of help topics, and tutorials. You can also access a series of complex sample images. When you pick one of these, it’s downloaded to your iPad where you can dive in and deconstruct or modify it to your heart’s content. Lastly, all personas include a question mark icon in the lower right corner. Touch and hold the icon and it will display the labels for all of the tools in that persona. Thus, it’s very easy to switch over if you come from a Photoshop-centric background.

Affinity Photo is a great example of what the newest iPads are capable of. Easy interchange between the iOS and macOS versions is the icing on the cake, enabling the iPad to be part of a designer’s arsenal and not simply a media consumption device.

©2017 Oliver Peters

Customize Premiere Pro with Workspaces

Most days I find myself in front of Adobe Premiere Pro CC, both by choice and by the jobs I’m booked on. Yes, I know, for some it’s got bugs and flaws, but for me it’s generally well-behaved. Given the choices out there, Premiere Pro feels the most natural to me for an efficient editing workflow.

Part of what makes Premiere Pro work for me is the ability to customize and fine-tune the user interface layout for the way I like to work or the tasks at hand. This is made possible by Adobe’s use of panels for the various tools and windows within the interface. These panels can float or be docked, stacked, or tabbed in a wonderfully large range of configuration possibilities. The Adobe CC applications come with a set of preset workspaces, but these can be customized and augmented as needed. I won’t belabor this post with an in-depth explanation of workspaces, because there are three very good explanations over at PremiereBro. (Click these links for Post 1, Post 2, and Post 3). My discussion with Simon Ubsdell made me think the topic would make a good blog post here, too.

It all starts with displays

I started my NLE journey with Avid and in the early days, two screens (preferably of a matching size) were essential. Bins on the left with viewers and timeline on the right. However, in the intervening years, screen resolution has greatly increased and developers have made their UIs work on dual and single-screen configurations. Often today, two screens can actually be too much. For example, if you have two side-by-side 27” (or larger) displays, the distance from the far left to the far right is pretty large. This makes your view of the record window quite a bit off-center. To counterbalance this issue, in a number of set-ups, I’ve taken to working with two different sized displays: a centered 27”, plus a smaller 20” display to the left. Sometimes I’ll have a broadcast display to the right. The left and right displays are at an angle, which means that my main working palette – the viewers and timeline – is dead-center on the display in front of me.

I also work with a laptop from time to time, as well as do some jobs in Final Cut Pro X. Generally a laptop is going to be the only available display and FCPX is well-optimized for single-screen operation. As a result, I’ve started to play around with working entirely on a single display – only occasionally using the landscape of the secondary display on my left when really needed. The more I work this way, the more I find that I can work almost entirely on one screen, if that screen offers a decent resolution.

So in order to optimize my workflow, I’ve created a number of custom Premiere Pro workspaces to serve my needs. (Click any of these images to see the enlarged view.)

Edit layout 1

This is the classic two-screen layout. Bins on the left and dual-viewer/timeline on the right. I use this when I have a lot of footage and need to tab a number of bins or expand a bin to see plenty of list details or thumbnails.

Edit layout 2

This layout collapses the classic layout onto a single screen, with the project panel, viewers and timeline.

Edit layout 3

This layout is the one I use most often, because most of what I need is neatly grouped as a tab or a stack on the left and right sides of a single viewer window. Note that there are actually source and record viewers, but they are stacked behind each other. So if I load a clip or match frame from the timeline, the source viewer becomes foremost for me to work with. Do an edit or go back to the timeline and the viewer switches back to the record side.

By tabbing panels on the left side, I can select the panel needed at the time. There is a logical order to what is on the left or right side. For instance, scopes are left and Lumetri Color controls on the right – thus, both can be open. Or I can drag an effect from the right pane’s Effects palette onto the Effects Control panel on the left.

Edit layout 4

This is the most minimalist of my workspaces. Just the viewers and timeline. Anything else can be opened as a floating window for temporary access. The point of this workspace is 100% focus on the timeline, with everything else hidden.

Edit layout 5

This workspace is designed for the “pancake timeline” style of editing. For example, build a “selects” timeline and then pull from that down to your main editing timeline.

Edit layout 6

This is another dual-display layout optimized for color correction. Lumetri Color and Effects Control panel flanking the viewer, with the Lumetri Scopes fullscreen on the lefthand monitor.

There are certainly plenty of other ways you can configure a workspace to suit your style. Some Premiere Pro editors like to use the secondary screen to display the timeline panel fullscreen. Or maybe use it to spread out their audio track mixer. Hence the beauty of Adobe’s design – you can make it as minimal or complex as you like. There is no right or wrong approach – simply whatever works to improve your editing efficiency.

Note: Footage shown within these UI screen grabs is courtesy of Imagine Dragons and Adobe from the Make the Cut Contest.

©2017 Oliver Peters

LumaFusion – an iOS NLE

As Apple’s iOS platform becomes more powerful, applications for it begin to rival the power and complexity of desktop software. LumaFusion is a recently introduced nonlinear video editing product from Luma Touch. Its founders created the Avid/Pinnacle/Corel iOS NLE, but LumaFusion takes a fresh approach. Luma Touch currently offers three iOS products: LumaClip (a single-clip editor), LumaFX (video effects for clips), and LumaFusion (a full-fledged NLE that integrates the features of the other two products). All three apps run under what Luma Touch dubs their Spry Engine, a framework for iOS video applications.

LumaFusion works on both the iPhone and iPad; however, the iPad version comes closest to a professional desktop experience. Ideally you’ll want one of the iPad Pros, but it runs perfectly fine on an iPad Air 2 with the A8X chip, which is what I used. I’ve tried other iOS NLEs, including Adobe Clip, iMovie, and TouchEdit, which have their pros and cons. For instance, iMovie doesn’t deal with fractional video frame rates and TouchEdit tries to mimic a flatbed film editor. This brings me to LumaFusion, which has been designed as a modern, professional-grade NLE for the iOS platform.

UPDATE: Watch this video for a rundown of the new features in version 1.4, released in September 2017.

The iOS ecosystem

Like other iOS apps that tie into the ecosystem, LumaFusion can import media from iTunes, Photos, and other third-party applications, like FiLMiC Pro. As a “pro” app, it understands various whole and fractional frame rates and sizes up to 3840 x 2160 (UHD 4K), depending on your device. However, for me, the interest is not in cutting things that I’ve shot with my iPad, but rather fitting it into an offline/online editing workflow. This means import and export are critical.

If you own an iPad Pro, then you can get an SD card reader as an accessory. With the card reader, only native DSLR movie clips can be imported into the Photos app – other file formats cannot. Typically, you are going to transfer media using cloud syncing tools, like Dropbox, Box, OneDrive, etc. LumaFusion also includes a number of royalty-free music cuts, which can be accessed through its integrated media browser.

To use it as a rough-cut tool, simply create H.264 proxies on your desktop system and sync those to the iPad using Dropbox (or another cloud service). I created a test project of about 60 clips (720p, 6Mbps, 29.97fps) that only consumed 116MB of storage space. Therefore, even a free 2GB Dropbox account would be fine. Within LumaFusion, import the files from Dropbox and start editing.
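
If ffmpeg is installed on your desktop machine, proxy creation can also be scripted. Here’s a rough Python sketch matching the specs of my test project (720p, 6 Mbps, 29.97 fps); the folder names are placeholders you’d adapt to your own layout:

```python
import pathlib
import subprocess

SRC = pathlib.Path("camera_originals")   # hypothetical folder names
DST = pathlib.Path("proxies")
DST.mkdir(exist_ok=True)

for clip in sorted(SRC.glob("*.mov")):
    out = DST / (clip.stem + "_proxy.mp4")
    subprocess.run([
        "ffmpeg", "-y", "-i", str(clip),
        "-vf", "scale=1280:720",             # 720p proxy
        "-r", "30000/1001",                  # 29.97 fps
        "-c:v", "libx264", "-b:v", "6M",     # H.264 at roughly 6 Mbps
        "-c:a", "aac", "-b:a", "128k",
        str(out),
    ], check=True)
```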

Luma Touch will soon start beta testing LumaConnect – a macOS companion application designed to facilitate offline/online editing roundtrips. It will feature automatic iOS proxy creation and the ability to relink high-res media – as well as any iOS-captured content – back on your desktop computer. LumaConnect will also allow the rendering of projects as Apple ProRes files.

User interface and editing workflow

Overall, the interface design and editing model more closely approximates Apple Final Cut Pro X than any other NLE. The app’s design is built around a media pool with various editing projects (sequences). This is a similar approach to FCPX 10.0, which had separate Events (bins) and Projects (sequences), but no combined Libraries. It’s almost like FCPX “Lite” for iOS.

There are three main windows: media browser, timeline, and a single, combo viewer. It uses fly-out panels for tools and mode changes to access clip editing and effects modules. These modules are, in fact, LumaClip and LumaFX integrated into LumaFusion. The timeline is “magnetic”, much like FCPX. Clip construction on the timeline also follows the layout of primary and connected clips, rather than discrete target tracks. A total of three integrated audio/video clips can be stacked vertically, along with another three audio-only clips, for a total of six audio “tracks”. Audio can be adjusted through a fly-out track mixer. LumaFusion includes four clip editing tools: speed and reverse, frame fit, color effects, and audio editing. In addition, there’s a multi-layered title tool, along with a number of customizable title templates to choose from. Clip-based volume and video effect adjustments can be keyframed.

Effects are pretty sophisticated and would often be GPU-accelerated on a desktop system. These include color correction, blurs, transforms, transitions, and more. You can stack a number of these onto a single clip without any impact on playback. The effects priority can be rearranged and the interface also provides an indication of how many resources you are tying up on the iPad.

The editing experience

Serious video editing on an iPad isn’t for everyone, but the more I worked with it, the more I enjoyed the experience. If you have an iPad-compatible keyboard, it follows some generic commands, including JKL playback and I and O for mark-in and mark-out. There are also a few FCPX keystrokes, like W for insert/overwrite (depending on which edit mode is selected). Unfortunately J (reverse playback) only works in the clip viewer, but not in the timeline. I’d love to see a more extensive keyboard command set. Naturally, being an iOS app, everything can be accessed via touch, which is best (though not essential) if you have the Apple Pencil for the iPad Pro.

There are a few standard editing functions that I missed. For example, there’s no “rolling-edit” trim function. If you want to move a cut point – equally trimming the left and right sides – you have to do it in the overwrite edit mode and trim the incoming or outgoing side of one of the clips. But, if you trim it back, a gap is left. J-cuts and L-cuts require that you detach the audio from the clip, as there is no way to expand an a/v clip in the timeline.

It is definitely possible to finish and export a polished piece from LumaFusion. You can also export an audio-only mix. This enables you to embellish your audio track outside of LumaFusion and then reimport and marry it to the picture for the final version. Because you can layer vertical tracks, cutting a two-camera interview piece on your iPad is pretty easy. Rough-cutting a first pass or pulling edited selects on an iPad becomes completely viable with LumaFusion.

Sharing your edit

Once you’ve edited your piece, it’s easy to share (export) your final sequence as a single audio/video file, audio-only file, project (currently only compatible with LumaFusion), or trimmed media. Be aware that there’s a disconnect between the frame rate terminology for settings versus exports. For example, with project settings, you can pick 24 or 30, which are actually 23.98 or 29.97; however, on export, you must pick between 24 and 23.98 or 30 and 29.97. Nevertheless, exports up to UHD frame sizes are fine, including downscaled sizes, if needed. So, you can import and cut in UHD and export a 1080 file. A flattened H.264 movie file of your sequence – wrapped in either an .mp4 or QuickTime .mov container – may be exported at up to 50Mbps (1080p) or 100Mbps (UHD).

If your intention is to use LumaFusion for “offline” editing, then for now, your only option is to embed “burn-in” timecode into the media that you send to the iPad. Then manually write down edit points based on the visible timecode at the cuts. The upcoming LumaConnect macOS application will make it possible to send projects to both Final Cut Pro X and Premiere Pro via XML. According to Luma Touch, they will also be adding XML export from LumaFusion as an in-app purchase, most likely before the release of LumaConnect.
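
The timecode burn-in can be scripted with ffmpeg’s drawtext filter at the same time you make the proxies. A sketch along the lines of the earlier one – the font path, file names, and start timecode here are examples to adjust for your own system:

```python
import subprocess

# drawtext burns a running timecode counter into the picture
drawtext = (
    "drawtext=fontfile=/Library/Fonts/Arial.ttf:"
    "timecode='01\\:00\\:00\\:00':rate=30000/1001:"   # start TC and frame rate
    "fontsize=36:fontcolor=white:box=1:boxcolor=black@0.5:x=40:y=h-80"
)

subprocess.run([
    "ffmpeg", "-y", "-i", "source_clip.mov",
    "-vf", drawtext,
    "-c:v", "libx264", "-b:v", "6M",
    "-c:a", "copy",
    "source_clip_tc.mp4",
], check=True)
```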

Using an iPad or iPad Pro as your only computer isn’t for everyone, but LumaFusion is definitely a tool that brings iOS editing closer to the desktop experience. To get you started, the company has posted over 30 short tutorials on their YouTube channel. Sure, there are compromises, but not as many as you might think for simple projects. Even if an iPad is only a supplemental tool, then like so many other iOS apps, LumaFusion is another way to add efficiency in the modern, mobile world.

Originally written for RedShark News.

©2017 Oliver Peters

Premiere Pro Workflow Tips

When you are editing on projects that only you touch, your working practices can be as messy as you want them to be. However, if you work on projects that need to be interchanged with others down the line, or you’re in a collaborative editing environment, good operating practices are essential. This starts at the moment you first receive the media and carries through until the project has been completed, delivered, and archived.

Any editor who’s worked with Avid Media Composer in a shared storage situation knows that it’s pretty rock solid and takes measures to assure proper media relinking and management. Adobe Premiere Pro is very powerful, but much more freeform. Therefore, the responsibility of proper media management and editor discipline falls to the user. I’ve covered some of these points in other posts, but it’s good to revisit workflow habits.

Folder templates. I like to have things neat and one way to assure that is with project folder templates. You can use a tool like Post Haste to automatically generate a new set of folders for each new production – or you can simply design your own set of folders as a template layout and copy those for each new job. Since I’m working mainly in Premiere Pro these days, my folder template includes a Premiere Pro template project, too. This gives me an easy starting point that has been tailored for the kinds of narrative/interview projects that I’m working on. Simply rename the root folder and the project for the new production (or let Post Haste do that for you). My layout includes folders for projects, graphics, audio, documents, exports, and raw media. I spend most of my time working at a multi-suite facility connected to a NAS shared storage system. There, the folders end up on the NAS volume and are accessible to all editors.
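
If you don’t use Post Haste, a few lines of Python will stamp out the same layout. The subfolder names below match the template described above; the root path and job name are hypothetical:

```python
import pathlib

SUBFOLDERS = ["projects", "graphics", "audio", "documents", "exports", "raw_media"]

def make_job(root, job_name):
    """Create the standard folder layout for a new production."""
    job = pathlib.Path(root) / job_name
    for sub in SUBFOLDERS:
        (job / sub).mkdir(parents=True, exist_ok=True)
    return job

make_job("/Volumes/NAS/Productions", "ACME_Spot_Oct2017")
```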

Media preparation. When the crew comes back from the shoot, the first priority is to back up their files to an archive drive and then copy the files again to the storage used for editing – in my case a NAS volume. If we follow the folder layout described above, then those files get copied to the production dailies or raw media folder (whatever you’ve called it). Because Premiere Pro is very fluid and forgiving with all types of codecs, formats, and naming conventions, it’s easy to get sloppy and skip the next steps. DON’T. The most important things for proper media linking are consistent locations and unique file names. Without them, future relinking, moving the project into an application like Resolve for color correction/finishing, or other processes may not link to the correct files.

Premiere Pro works better when ALL of the media is in a single common format, like DNxHD/HR or ProRes. However, for most productions, the transcoding time involved would be unacceptable. A large production will often shoot with multiple camera formats (Alexa, RED, DSLRs, GoPros, drones, etc.) and generate several cards’ worth of media each day. My recommendation is to leave the professional format files alone (like RED or Alexa), but transcode the oddball clips, like those from DJI cameras. Many of these prosumer formats place the media into various folder structures or hide them inside a package container format. I will generally move these outside of this structure so they are easily accessible at the Finder level. Media from the cameras should be arranged in a folder hierarchy of Date, Camera, and Card. Coordinate with the DIT and you’ll often get the media already organized in this manner. Transcode files as needed and delete the originals if you like (as long as they’ve been backed up first).

Unfortunately these prosumer cameras often use repeated, rather than unique, file names. Every card starts over with clip number 0001. That’s why we need to rename these files. You can usually skip renaming professional format files. It’s optional. Renaming Alexa files is fine, but avoid renaming RED or P2 files. However, definitely rename DSLR, GoPro, and DJI clips. When renaming clips I use an app called Better Rename on the Mac, but any batch renaming utility will do. Follow a consistent naming convention. Mine is a descriptive abbreviation, month/day, camera, and card. So a shoot in Palermo on July 22, using the B camera, recorded on card 4, becomes PAL0722B04_. This is prepended to the camera-generated clip name, so clip number 0057 becomes PAL0722B04_0057. You don’t need the year, because the folder location, general project info, or the embedded file info will tell you that.
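
The same convention is easy to script if you’d rather not click through a renaming utility. A small Python sketch – the paths are hypothetical, and you should test on copies before touching real dailies:

```python
import pathlib

def prefix_card(card_folder, prefix):
    """Prepend a location/date/camera/card prefix to every clip on a card."""
    for clip in sorted(pathlib.Path(card_folder).iterdir()):
        if clip.suffix.lower() in {".mov", ".mp4", ".mts"}:
            clip.rename(clip.with_name(prefix + clip.name))

# Palermo, July 22, B camera, card 4: clip 0057.MOV becomes PAL0722B04_0057.MOV
prefix_card("/Volumes/NAS/job/raw_media/0722/B/card04", "PAL0722B04_")
```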

A quick word on renaming. Stick with universal alphanumeric conventions in both the files and the folder names. Avoid symbols, emojis, etc. Otherwise, some systems will not be able to read the files. Don’t get overly lengthy in your names. Stick with upper and lower case letters, numbers, dashes, underscores, and spaces. Then you’ll be fine.

Project location. Premiere Pro has several basic file types that it generates with each project. These include the project file itself, Auto-saved project files, renders, media cache files and audio peak (.pek) files. Some of these are created in the background as new media is imported into the project. You can choose to store these anywhere you like on the system, although there are optimal locations.

Working on a NAS, there is no problem in letting the project file, Auto-saves, and renders stay in the same section of the NAS as all of your other media. I do this because it’s easy to back up the whole job at the end of the line and have everything in one place. However, you don’t want all the small, application-generated cache files to be there. While it’s an option in preferences, it is highly recommended to have these media cache files go to the internal hard drive of the workstation or a separate, external local drive. The reason is that there are a lot of these small files, and that traffic on the NAS will tend to bog down the overall performance. So set them to be local (the default).

The downside of doing this is that when another editor opens the Premiere Pro project on a different computer, these files have to be regenerated on that new system. The project will react sluggishly until this background process is complete. While this is a bit of a drag, it’s what Adobe recommends to keep the system operating well.

One other cache setting to be mindful of is the automatic delete option. A recent Premiere Pro problem cropped up when users noticed that original media was disappearing from their drives. Although this was a definite bug, the situation mainly affected users who had set the media cache to be with their original media files and had enabled automatic deletion. You are better off keeping the default location, but changing the deletion setting to manual. You’ll have to occasionally clean your caches manually, but this is preferable to losing your original content.

Premiere Pro project locking. A recent addition to Premiere Pro is project locking. This came about because of Team Projects, which are cloud-only shared project files. However, in many environments, facilities do not want their projects in the cloud. Yet, they can still take advantage of this feature. When project locking is enabled in Premiere Pro (every user on the system must do this), the application creates a temporary .prlock file next to the project file. This is intended to prevent other users from opening the same project and overwriting the original editor’s work and/or revisions.

Unfortunately, this only works correctly when you open a project from the launch window. Do not open the project by double-clicking the project file itself in order to launch Premiere Pro and that project. If you open through the launch window, then Premiere Pro prevents you from opening a locked project file. However, if you open through the Finder, then the locking system is circumvented, causing crashes and potentially lost work.
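
The behavior makes sense if you think of .prlock as a classic lock file: only code paths that actually check for the lock are protected, which is presumably why the Finder route bypasses it. Premiere’s real .prlock contents are opaque; this Python sketch just illustrates the general pattern:

```python
import os
import sys

def open_locked_project(project_path):
    """Sketch of lock-file mutual exclusion (not Premiere's actual mechanism)."""
    lock_path = project_path + ".prlock"
    try:
        # O_EXCL makes creation atomic: it fails if the lock already exists
        fd = os.open(lock_path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.write(fd, str(os.getpid()).encode())
        os.close(fd)
    except FileExistsError:
        sys.exit(f"{project_path} is locked by another editor.")
    return lock_path

lock = open_locked_project("/Volumes/NAS/job/projects/episode01.prproj")
os.remove(lock)   # release the lock when the project is closed
```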

Project layout templates. Like folder layouts, I’m fond of using a template for my Premiere Pro projects, too. This way all projects have a consistent starting point, which is good when working with several editors collaboratively. You can certainly create multiple templates depending on the nature and specs of the job, e.g. commercials, narrative, 23.98, 29.97, etc. As with the folder layout, I’ll often use a leading underscore with a name to sort an item to the top of a list, or start the name with a “z” to sort it to the bottom. A lot of my work is interview-driven with supportive B-roll footage. Most of the time I’m cutting in 23.98fps. So, that’s the example shown here.

My normal routine is to import the camera files (using Premiere Pro’s internal Media Browser) according to the date/camera/card organization described earlier. Then I’ll review the footage and rearrange the clips. Interview files go into an interview sources bin. I will add sub-bins in the B-roll section for general categories. As I review footage, I’ll move clips into their appropriate area, until the date/camera/card bins are empty and can be deleted from the project. Interviews will be grouped as multi-cam clips and edited to a single sequence for each person. This sequence gets moved into the Interview Edits sub-bin and becomes the source for any clips from this interview. I do a few other things before starting to edit, but that’s for another time and another post.

Working as a team. There are lots of ways to work collaboratively, so the concept doesn’t mean the same thing in every type of job. Sometimes it requires different people working on the same job. Other times it means several editors may access a common pool of media, while working in their own discrete projects. In any case, Premiere does not allow the same sort of flexibility that Media Composer or Final Cut Pro editors enjoy. You cannot have two or more editors working inside the same project file. You cannot open more than one project at a time. This means Premiere Pro editors need to think through their workflows in order to effectively share projects.

There are different strategies to employ. The easiest is to use the standard “save as” function to create alternate versions of a project. This is also useful for keeping project bloat low. As you edit on a project over a long period, you build up a lot of old “in progress” sequences. After a while, it’s best to save a copy and delete the older sequences. But the best approach is to organize a structure to follow.

As an example, let’s say a travel-style show covers several locations in an episode. Several editors and an assistant are working on it. The assistant would create a master project with all the footage imported and organized, interviews grouped/synced, and so on. At this point each editor takes a different location to cut that segment. There are two options. The first is to duplicate the project file for each location, open each one up, and delete the content that’s not for that location. The second option is to create a new project for each location and then import media from the master project using Media Browser. This is Adobe’s built-in module that enables the editor to access files, bins, and sequences from inside other Premiere Pro projects. When these are imported, there is no dynamic linking between the two projects. The two sets of files/sequences are independent of each other.

Next, each editor cuts their own piece, resulting in a final sequence for each segment. Back in the master project, each edited sequence can be imported – again, using Media Browser – for the purposes of the final show build and tweaks. Since all of the media is common, no additional media files will be imported. Another option is to create a new final project and then import each sequence into it (using Media Browser). This will import the sequences and any associated media files. Then use the segment sequences to build the final show sequence and tweak as needed.

There are plenty of ways to use Premiere Pro and maintain editing versatility within a shared storage situation. You just have to follow a few rules for “best practices” so that everyone will “play nice” and have a successful experience.

Click here to download a folder template and enclosed Premiere Pro template project.

©2017 Oliver Peters