DaVinci Resolve 10


Last NAB, Blackmagic Design caused everyone to perk up when it touted Resolve 10 as an online editor. Once it was released, it became a bit more obvious that Resolve was still primarily a color corrector, but one that also added editing features. This latest version has been out for a number of months (including a lengthy public beta period) and has gone through several point updates. Resolve 10 was a free update for owners of previous versions. No short review can do this program justice, due to the depth of its toolset, but let’s take a quick dive into what it has to offer.

DaVinci Resolve 10 comes in several versions for Mac and Windows, including Resolve Lite (free), Resolve Software ($995) and Resolve ($29,995), which includes the custom Resolve control surface. There are also Linux configurations. All versions will only work with Blackmagic video devices for I/O and monitoring, but these are not required for operation. In addition to mouse, trackpad and tablet control, Tangent Devices (Wave, Element), JL Cooper (Eclipse) and Avid Artist control panels may be used as lower-cost alternatives to the Resolve control surface. The free Lite version is most likely the biggest software bang-for-the-buck in the industry, but you’ll need the paid version for blur and noise reduction, 3D stereoscopic work, support for more than two GPUs and output at sizes bigger than UltraHD.

New in Resolve 10

Since Resolve 10 was a pretty thorough overhaul of the Resolve 9 interface, there are plenty of small changes throughout the application. Many functions are more streamlined and logical and will make more sense to the new user. Editing is the biggest new addition and most of the typical functions are there, including various edit modes, tracks, effects, titles, speed changes, transitions and audio. Although I really can’t envision starting any edit from scratch in Resolve 10, it’s easier than ever to make editorial changes when the client has last-minute adjustments in mind. The point is that this can now be achieved in the grading session, without having to go back into an edit bay.

A big addition is the integration of an effects architecture, using the OpenFX plug-in format. Various developers are tweaking their OpenFX filters for compatibility with Resolve 10, but I’ve already been able to test the FilmConvert film emulation plug-in. Filters are applied to clips or a complete track as a node, so there are no third-party transition effects. Since Resolve can render the timeline as a single file or as individual source clips, the applied effects are baked into the rendered media either way. Applying an OpenFX filter to a node will slow down render speeds.

Resolve 10 also gained the ability to create DCPs straight from the timeline for cinema masters. However, this only preps the project settings and does not cover the licensing fees that you need for an actual DCP export.

Nodes

Every color corrector takes a different approach to how a series of color correction adjustments is built up. Resolve uses nodes, which have become fairly sophisticated. Although it’s not a true compositor’s node tree, it does start to approach one. Node types include serial, parallel, splitters, combiners and layer mixers. These let a colorist not only string together a series of adjustments (serial nodes), but also split and recombine a signal, and create parallel node paths that are combined for a final output. The layer mixer node includes composite modes, similar to those used in Photoshop. While a lot of Resolve demos get very deep into node trees that dissect every aspect of a shot, I tend to take a simpler approach, sticking to curves and lift/gamma/gain controls. Nevertheless, if you need that power, it’s there in Resolve 10.

Strengths

DaVinci Resolve 10 – even the free Lite version – represents an amazing level of versatility. For example, many editors and DITs use it to prep media for an edit. It’s super simple to apply LUTs to log-profile camera files and spit out edit-ready, adjusted source files. Resolve is one of the fastest renderers I’ve encountered and it handles cross-format conversions quite well. For instance, it can render Avid-compliant MXF media, which is relatively uncommon. The scaling function is second to none. After Effects used to be my preferred tool for upscaling images, but I’ve found that Resolve is even better. Not only is the quality great, but you have control over the smoothness or crispness of the scaled image.

You can’t talk about Resolve without mentioning the tracker. If you apply a “power window” to a portion of a shot (like a person’s face), you need to track the movement. The tracker in Resolve is a very fast, point-cloud-style tracker. These tracks are almost always dead-on, so you never think twice about using the tracker. One of the things I especially like about Resolve is the image quality and processing. It uses 32-bit floating point math. Essentially this means that you can crank up video in one node – even past the point of clipping – and then pull it back down (recovering detail without a clip) in the next node.
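To make that concrete, here’s a minimal sketch (my own illustration, not Resolve’s actual code) of why floating point headroom matters. An 8-bit integer pipeline throws away anything pushed past its maximum, while a float pipeline keeps the overshoot so a later node can bring it back:

```python
def gain_8bit(value, gain):
    # Integer pipeline: anything pushed past 255 is clipped and lost.
    return max(0, min(255, round(value * gain)))

def gain_float(value, gain):
    # Float pipeline: values above 1.0 are kept and can be pulled back later.
    return value * gain

# 8-bit: boost then reduce -- the highlight detail is gone.
v = gain_8bit(240, 2.0)         # clips to 255
v = gain_8bit(v, 0.5)           # 128 -- the original 240 is unrecoverable

# Float: boost then reduce -- the original value survives.
f = gain_float(240 / 255, 2.0)  # 1.88, past "clipping" but still stored
f = gain_float(f, 0.5)          # 0.94 -> back to roughly 240 when displayed
```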

Weaknesses

User interfaces are a very subjective issue in color correction tools, just like they are for editing software. For me, Resolve’s layout is a weakness, because I work with a dual-display system. With Resolve you can’t place the viewer on the secondary monitor, like you can with Adobe SpeedGrade CC or Apple Color. You can place the video scopes and the new audio mixer there, but the viewer is locked to the primary screen. If you use the enhanced viewer mode, it hides the node tree. This tends to make operation awkward if you have neither a control surface nor an external broadcast monitor.

The depth of Resolve’s color correction toolkit is amazing, but it’s almost too much. For example, you have both wheels and sliders for primary control. That makes it very adaptable to different working styles, but it also makes it easy to lose track of which tool you used to make adjustments. Some things don’t make sense to me. For instance, the maximum saturation level isn’t all that large, and if you really want a shot dripping with chroma, it takes several serial nodes to get there.

A personal peeve, since I use dual 20” screens, is that something broke between Resolve 9 and Resolve 10. The earliest version of Resolve on the Mac was optimized for 1920×1080 screens (or larger), but this was subsequently corrected for smaller resolutions, like laptops and the 20” Apple Cinema displays. Apparently this has reverted a little with the latest version. With Resolve 9, the interface opened and was correctly scaled for the 20” display. With Resolve 10, the interface opens with the right edge running off the screen. You have to click the green “plus” button (one of the three buttons at the top left of every Mac OS X window) to resize the window to fit the display.

Roundtrips with your editor

DaVinci Resolve 10 has the broadest support for roundtrips of any color correction tool, translating XML (Final Cut Pro 7 and Premiere Pro), FCPXML (Final Cut Pro X), EDL and AAF (Avid) list formats. This is a bi-directional roundtrip, so you can import sequences from your NLE into Resolve 10, but then also export NLE-compatible lists that properly relink to the rendered media. When Final Cut Pro X version 10.1 was released, compatibility was broken, but that’s recently been fixed with the latest updates from each company. However, it’s still not quite perfect. I tried two very simple sequences of a few shots each. One sequence used 1920×1080 ProRes HQ files from a Blackmagic Cinema Camera. The other used native, camera raw files from a RED EPIC (various sizes and frame rates). Both sequences were cut in FCP X and the FCPXML from each imported without issue into Resolve 10.

Going the other way, back into FCP X, did present some issues. Both of the new FCPXMLs that were imported into FCP X reported error messages, although the clips and sequences imported correctly. The 1920×1080 files from the BMCC were fine. The EPIC files, which had been resized in the original FCP X timeline, were all interpreted by FCP X as 1280×720, even though Resolve 10 had correctly rendered the media as 1920×1080. These same timelines imported fine into Premiere Pro using standard XMLs.

Final thoughts

DaVinci Resolve 10 is currently the most popular color correction tool, largely because of the free version. It is powerful, though at times I feel that the correction tends to be a little harsher than with other grading applications. The interface could stand to be even more streamlined. Nevertheless, I’ve done grades that required extensive correction, which would have been impossible to achieve in any other color correction application.

It’s an essential tool that functions like a “Swiss Army knife” and as such, you owe it to yourself to spend some time learning it. The manual, written by noted colorist and author Alexis Van Hurkman, is easy to follow. Training resources include online tutorials at Color Grading Central, Ripple Training, Tao of Color and Mixing Light.

Originally written for Digital Video magazine / Creative Planet Network.

©2014 Oliver Peters

NLE Tips – Week 3


The Avid – Resolve Roundtrip Workflow

Avid Media Composer has always been regarded as the best offline editing tool and its heritage was built upon a strong offline-to-online workflow. The file-based world has complicated things and various camera formats have made life even more complex for editors. Many have become quite fond of using Blackmagic Design’s DaVinci Resolve as a great companion to Media Composer. It’s cross-platform and even the free version will do most of what you need. Here’s a step-by-step example of how you might use the combo. Relinking varies a bit based on file metadata and might need to be modified for your particular circumstances. This workflow is great with ARRI ALEXA files and will most likely work well with other similar camera formats.

Creating edit proxy files with Resolve – ALEXA files are usually Apple ProRes 4444 or ProRes HQ QuickTime files that have been recorded with a Log-C gamma profile. So, they are big files with a flat appearance. To start, launch Resolve, load the ProRes camera clips into the Media Pool (Media or Edit tab) and select/edit all of the full clips to a new timeline. In the Color tab, select “track” instead of “clip” and apply a single node. In that node, apply an ARRI Log-C-to-Rec709 LUT. Go to the Deliver tab and pick the Avid roundtrip Easy Set-up. Make sure “Individual Source Clips” is selected (not a single file), define a render location and decide whether or not to add a file name prefix or suffix (not required). Render using the DNxHD 36 codec choice.

Moving to Media Composer for the creative cut – When the render process has been completed, you’ll have a folder containing Avid MXF media and a corresponding AAF file. This media has the LUT “baked in” and has been rendered with the very lightweight DNxHD 36 codec. Drag the AAF file out of this folder to another location. Now drag this complete folder into any of your Avid MediaFiles/MXF subfolders. Unless you’ve already added extra folders there, you will typically find one existing folder (with Avid’s default label of “1”) that contains MXF media. Change the label of the new folder (the one that you’ve just dragged in) to another number, such as “2”.

Launch Media Composer, create a new project, open the first bin and import the AAF file that was created by Resolve. This bin will become populated by the color-corrected DNxHD 36 files created by Resolve. Voila, you are ready to edit your Oscar-winner! Cut until the project is locked. When you are done and are ready to move to the online or finishing phase of the edit, export an AAF file from Media Composer. Select “AAF Edit Protocol” and “Link to” media in the AAF options.

Returning to Resolve for the final grade – Launch Resolve and start a new project. Import the AAF file that you exported from Media Composer. You’ll end up with a timeline that matches your Avid cut and it will be linked to the DNxHD 36 media. You will want to relink the files back to the original camera media – the ProRes HQ or ProRes 4444 files. To do this, delete all the media in the Resolve Media Pool (Edit tab), which will make the timeline clips appear offline. Now, navigate to the folder with the original camera files and bring those into the Media Pool. Your timeline clips will now be relinked to this original camera media. You’ll recognize this because the clips on the timeline will be back to their original, flat, Log-C appearance. In some instances, Resolve may see some files as duplicates and might relink to the wrong file. In that case, you’ll see an error icon on the timeline clip. Click on it and Resolve will present a dialog window with the possible alternate media options. Pick the correct one and the clip should then be linked to the right shot. Color correct your timeline with the desired grade and any reframing.

Returning to Media Composer to complete the edit – When you’ve completed the color grading, go to the Deliver tab and pick the Avid roundtrip Easy Set-up again, but this time pick a higher-quality codec (like DNxHD 175x). Make sure to set handle lengths (usually 2-5 sec.) and render (as “Individual Source Clips” again). The result will be a new folder of rendered MXF media with the “baked in” grade, plus a new corresponding AAF file. As before, drag out this AAF file and drag the folder of rendered media into the Avid MediaFiles/MXF subfolder. Relabel the folder of this new Resolve media with a different number (such as “3”).

Launch Media Composer, open your existing project and create a new bin. Import the new AAF file, which will now populate this bin with the high-quality media. This bin will also include the sequence that you sent over to Resolve, but now linked to the high-resolution media files. In many cases, you would simply use this sequence for any final effects, titles and other adjustments.

Relinking the sequence in Media Composer – If for some reason the sequence that was “round-tripped” does not correctly reflect the edited cut as built in the offline stage, then you will need to relink a copy of that sequence to the new media. To do so, duplicate the sequence from your DNxHD 36 edit and move that copy into the bin with the 175x media. Close all other bins, except the 175x bin. Right-click the sequence and select “Relink” from the menu. Set your options to “Select Items In All Open Bins” and relink by “Timecode – Start” and “Source Name – Tape Name or Source File ID”. This will cause the sequence to be relinked to the new 175x final-quality media.

If everything worked correctly, you will have done a complete offline (creative cut) and online (finishing) workflow between Media Composer and Resolve, without the need for Avid’s traditional import or newer AMA processes!

©2014 Oliver Peters

Color Concepts and Terminology


It’s time to dive into some of the terms and concepts that brought you modern color correction software. First of all – color grading versus color correction. Many use these terms to identify different processes, such as technical shot matching versus giving a shot a subjective “look”. I do this too, but the truth of the matter is that they are the same and are interchangeable. Grading tends to be a more European way of naming the process, but it is the same as color correction.

All of our concepts stem from the film lab process known as color timing. Originally this described the person who knew how long to leave the negative in the chemical bath to achieve the desired result (the “timer”). Once the industry figured out how to manipulate color in the negative-to-positive printing process, the “color timer” was the person who controlled the color analyzer and who dialed in degrees of density and red/blue/green coloration. The Dale Grahn Color iPad application will give you a good understanding of this process. Alexis Van Hurkman also covers it in his “Color Correction Handbook”.

Electronic video color correction started with early color cameras and telecine (film-to-tape transfer or “film chain”) devices. These were based on red/blue/green color systems, where the video engineer (or “video shader”) would balance out the three components, along with exposure and black level (shadows). He or she would adjust the signal of the pick-up systems, including tubes, CCDs and photoelectric cells.

RCA added circuitry to its cameras called a chroma proc, which divided the color spectrum according to the six divisions of the vectorscope – red, blue, green, cyan, magenta and yellow. The chroma proc let the operator shift the saturation and/or hue of each one of these six slices. For instance, you could desaturate the reds within the image. Early color correction modules for film-to-tape transfer systems adopted this same circuitry. The “primary” controls manipulated the actual pick-up devices, while the “secondary” controls were downstream in the signal chain and let you further fine-tune the color according to this simple, six-vector division.


Early color correction systems were built to transfer color film to air or to videotape. They were part machine control and part color corrector. Modern color correction for post production came to be because of three key advances: memory storage, scene detection and signal decoding.

Memory storage. Once you could store and recall color correction settings, it was easy to go back and forth between camera angles or shots and apply a different setting to each. Or you could create several looks and preview those for the client. The addition of this technology was the basis for a seminal patent lawsuit, known as the Rainbow patent suit, as the battle raged over who first developed this technology.

Scene detection. Film transfer systems had to play in real-time to be recorded to videotape, which meant that shot changes had to trigger the change from one color correction setting to the next. Early systems did this via the operator manually marking an edit point (called “notching”), via an EDL (edit decision list) or through automatic scene detection circuitry. This was important for the real-time transfer of edited content, including film prints, cut negative and eventually videotape programs.

Signal decoding. The ability of color correction systems to decode a composite or component analog (and later digital) signal through added hardware shifted color correction from camera shading and film transfer to being another general post production tool at a post facility. The addition of a signal decoder board in a DaVinci unit split the input signal into RGB parts and enabled the colorist to enhance the correction of an already-edited master using the “secondary” signal electronics of the system. This enabled “tape-to-tape” color correction of edited masters. Thanks to scene detection or an EDL, color correction could be shot-to-shot and frame-accurate, with the corrected output re-encoded in real-time to a second videotape master.

Eventually the tools used in hardware-based, tape-to-tape color correction systems became standard. Quantel and Avid led the way by being first to incorporate these features into their nonlinear editing software.

_________________________________

Color correction software tends to break up its controls into primary and secondary functions. As you can see from the earlier explanations, there’s really no reason to do that, since we are no longer controlling the pick-up devices within a camera or telecine. Nevertheless, it’s terminology we seem to be comfortable with. Often secondary controls enable masking and keys to isolate color – not because it has to be that way, but because DaVinci added these features into their secondary control set. In modern correction tools, any function could happen on any layer, node, room, etc.

The core language for color manipulation still boils down to the simple controls exemplified by the Dale Grahn app. A signal can be brighter, darker, more or less “dense” (contrast) and have its colorimetry shifted by adding or subtracting red, blue or green for the overall image or in the highlight, midrange or shadow portions of the image. This basic approach can be controlled through sliders, knobs, color wheels and other user interfaces. Different software applications and plug-ins get to the same point through different means, so I’ll cover a few approaches here. Bear in mind that, since some of these actually represent somewhat different color science and math, the examples I present might not yield exactly the same results. Many controls are equivalent in their effect, though not necessarily identical in how they affect the image.

A common misconception is that shadow/mid/highlight controls on a 3-way color corrector will evenly divide the waveform into three discrete ranges. In fact, these are very large, overlapping ranges that interact with each other. If you shift a shadow luminance control up, it doesn’t typically just expand or compress the lower third of the waveform. Although some correctors act this way, most tend to shift the whole waveform up or down. If you change the color balance of the midrange, this color change will also affect shadows and highlights. The following is a quick explanation of some of the popular color control models.

Contrast/pivot/temperature/tint

Contrast and temperature controls have recently become more popular and are considered a more photographic approach to correction. When you adjust contrast, the image levels expand or stretch as viewed on a waveform. Highlights get brighter and shadows deepen. This contrast expansion centers on a pivot point, which by default is at the center of the signal. If you change the pivot slider, you are shifting the center point of this contrast expansion. In one direction, this means the contrast control will stretch the range below the pivot point more than above it. Shift the pivot slider in the other direction for the opposite effect.
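As a rough illustration, the generic math behind a contrast/pivot control looks something like the sketch below. This is a simplified formula of my own, not any particular application’s exact implementation:

```python
def apply_contrast(value, contrast, pivot=0.5):
    # Values above the pivot are pushed up, values below are pushed down.
    # Moving the pivot changes which part of the range is stretched most.
    return (value - pivot) * contrast + pivot

apply_contrast(0.8, 1.5)             # 0.95 -- highlights get brighter
apply_contrast(0.2, 1.5)             # 0.05 -- shadows deepen
apply_contrast(0.2, 1.5, pivot=0.3)  # 0.15 -- a lower pivot stretches the shadows less
```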

Color temperature and tint (also called magenta) controls balance the red/blue/green signal channels in relationship to each other. If you slide a color temperature control while watching an RGB parade display on a waveform, you’ll note that adjustments shift the red and blue channels up or down in the opposite direction to each other, while leaving green unaffected. When you adjust the tint (or magenta) slider, you are adjusting the green channel. As you raise or lower the green, both the red and blue channels move together in a compensating direction.
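In the same spirit, here’s a purely conceptual sketch of temperature and tint as channel shifts. The numbers are arbitrary and real applications weight these adjustments differently, but the channel relationships match what you see on the RGB parade:

```python
def apply_temp_tint(r, g, b, temp=0.0, tint=0.0):
    # Temperature: red and blue move in opposite directions; green stays put.
    r += temp
    b -= temp
    # Tint (magenta): green moves one way, red and blue compensate together.
    g += tint
    r -= tint
    b -= tint
    return r, g, b

# Warm the image slightly and pull a touch of green out of it.
apply_temp_tint(0.5, 0.5, 0.5, temp=0.05, tint=-0.02)
```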

Slope/offset/power

The SOP model is used for CDL (color decision list) values and breaks the signal down into luma (master), red, green and blue components, which are expressed as plus or minus values for slope, offset and power. Scratch Play’s color adjustments are a good example of the SOP model in action. Slope is equivalent to gain. Picture the waveform as a diagonal line from dark to light. As you rotate this imaginary line, the upper end – the brighter values – rises or falls. Think of the slope concept as this rotating line. As such, its results are comparable to a contrast control.

The offset control shifts the entire signal up or down, similar to other shadow or lift controls. The power control alters gamma. As you adjust power, the gamma signal is curved in a positive or negative direction, effectively making the midrange tones lighter or darker.
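The nice thing about SOP is that the underlying math is published as part of the ASC CDL. Slightly simplified (and ignoring the separate saturation operation), the per-channel transfer function looks like this:

```python
def cdl_sop(value, slope=1.0, offset=0.0, power=1.0):
    # ASC CDL per-channel math: scale by slope, shift by offset,
    # then clamp and apply power (a gamma-style curve).
    v = value * slope + offset
    v = max(0.0, min(1.0, v))
    return v ** power

# Add contrast via slope, lift the blacks a touch, darken the mids slightly.
cdl_sop(0.4, slope=1.2, offset=0.02, power=1.1)
```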

Lift/gamma/gain

The LGG model is the common method used for most 3-way color wheel-style correctors. It effectively works in a similar manner to contrast and SOP, except that the placement of controls makes more sense to most casual users. Gain, as the name implies, increases the signal, effectively expanding the overall values and making highlights brighter. Lift shifts the entire signal higher or lower. Changing a lift control to darken shadows will also have some effect on the overall image. Gamma bends the curve and effectively makes the midrange values lighter or darker.
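There’s no single agreed-upon formula for lift/gamma/gain – implementations differ from one application to the next – but one common formulation looks roughly like this sketch:

```python
def lift_gamma_gain(value, lift=0.0, gamma=1.0, gain=1.0):
    # Gain scales the whole signal, lift shifts it, gamma bends the midtones.
    # Other tools apply lift as a weighted offset or order the steps differently.
    v = value * gain + lift
    v = max(0.0, min(1.0, v))
    return v ** (1.0 / gamma)

lift_gamma_gain(0.5, lift=0.02, gamma=1.2, gain=1.1)  # midtones come up noticeably
```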

Luma ranges

The portions of the signal altered by highlight/shadow/midrange controls (like SOP, LGG or others) overlap. If you change the color balance for the midrange tones, you will also contaminate shadows and highlights with this color shift. The extent of the portion that is affected is controlled by a luma range control. Many color correction applications do not give you control over shifting the crossover points of these luma ranges. Some that do include Avid Symphony, Synthetic Aperture Color Finesse and Adobe SpeedGrade. Each offers curves or sliders to reduce or expand the area controlled by each luma range and effectively tightens or widens the overlap or crossover between the ranges.
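To picture how that overlap works, here’s a rough sketch of luma range weighting. The crossover points and the linear ramps are arbitrary examples of my own; actual correctors use their own, usually smoother, curves:

```python
def luma_weights(luma, shadow_end=0.4, highlight_start=0.6):
    # Each range gets a weight between 0 and 1; the three always sum to 1.0.
    shadow = max(0.0, 1.0 - luma / highlight_start)
    highlight = max(0.0, (luma - shadow_end) / (1.0 - shadow_end))
    midtone = 1.0 - shadow - highlight
    return shadow, midtone, highlight

# A midtone pixel still carries some shadow and highlight weight, which is
# why a midrange color shift "contaminates" the other ranges.
luma_weights(0.5)                                       # about (0.17, 0.67, 0.17)
# Moving the crossover points widens or tightens the overlap.
luma_weights(0.5, shadow_end=0.3, highlight_start=0.7)  # about (0.29, 0.43, 0.29)
```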

DaVinci Resolve includes a similar function within its log-style color wheels panel. It uses range adjustments that can limit the area affected by the balance and saturation controls. Similar results may be achieved by using HSL keyers or qualifiers that include softening controls.

Channels or printer lights

Video signals are made up of red, blue and green channel information. It is not uncommon for properly-balanced digital cameras to still maintain a green color cast to the overall image, especially if log-profile recording was used. Here, it’s best to simply balance the overall channels first to neutralize the image, rather than attempt to do this through color wheel adjustments. Some software uses actual channel controls, so it’s easy to make a base-level adjustment to the output or mix of a channel. If your software uses printer lights, you can achieve the same results. Printer lights harken back to lab color timing, using point values that equate to color analysis values. Regardless, dialing in a plus or minus red/blue/green printer light value effectively gives you the same results as altering the output value of a specific color channel.
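Conceptually, both approaches boil down to a per-channel shift. The sketch below is purely illustrative – the offset value is arbitrary, and real printer-light points map to calibrated steps rather than simple decimal offsets:

```python
def offset_channels(r, g, b, red=0.0, green=0.0, blue=0.0):
    # Whether the UI calls it a channel offset or a plus/minus printer light,
    # the net effect is a shift of that channel's output level.
    return r + red, g + green, b + blue

# Neutralize a slight green cast in a log-profile shot before any creative grading.
offset_channels(0.42, 0.47, 0.41, green=-0.04)
```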

This is just a short post to go over some of the more confusing terminology found in modern color correction software. Many applications tend to blend the color science models, so as you apply the points mentioned to your favorite tool, you may see somewhat different results. Hopefully I’ve gotten you into the ballpark, so you understand what happens the next time you twirl the knob.

©2014 Oliver Peters

LUTs and FCP X


LUTs or color look-up tables are a method of converting images from one color space or gamma profile into another. LUTs are usually a mathematically correct transform of one set of color and level values into another. For most editors and colorists, LUTs are commonly associated with log profiles that are increasingly used with various digital cameras, like an ARRI ALEXA, RED One, RED Epic or Blackmagic Design Cinema Camera.

The concept gets confusing, because there are various types of LUTs and they can be inserted into different stages of the pipeline. There are display LUTs, used to convert the viewing color space, such as from Rec. 709 (video) into P3 (digital cinema projection). These can be installed into hardware conversion boxes, monitors and within software grading applications. There are camera LUTs, which are used to convert gamma profiles, such as from log-C to Rec. 709. And finally, there are creative LUTs used for aesthetic purposes, like film stock emulation.
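Wherever a LUT sits in that pipeline, the underlying mechanism is the same: the incoming value looks up an output value in a table, interpolating between entries. Here’s a minimal sketch using a toy 1D table of my own making – real camera and display LUTs are much larger 1D or 3D tables, but the principle holds:

```python
def apply_1d_lut(value, table):
    # value is assumed to be normalized to the 0.0-1.0 range
    position = value * (len(table) - 1)
    lower = int(position)
    upper = min(lower + 1, len(table) - 1)
    frac = position - lower
    return table[lower] * (1.0 - frac) + table[upper] * frac

# A toy "lift the shadows, roll off the highlights" curve.
toy_lut = [0.0, 0.15, 0.35, 0.6, 0.8, 0.92, 1.0]
apply_1d_lut(0.5, toy_lut)   # 0.6 -- midtones pushed up by this particular curve
```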

One of the really sweet parts of Apple Final Cut Pro X is that it offers a vastly improved color pipeline that ties in closely to the underpinnings of the OS, such as ColorSync. This offers developers opportunities over FCP “legacy” and quite frankly over many other competitors. Built into the code is the ability to recognize certain camera metadata if the camera manufacturer chooses to take advantage of Apple’s SDK. ARRI, Sony and RED are among those that have done so. For example, when you import ARRI ALEXA footage that was recorded with a log-C gamma profile, a metadata flag in the file toggles on log processing automatically within FCP X. Instead of seeing the flat log-C image, you see one that has already been converted, on-the-fly, into Rec. 709 color space.

This built-in log processing comes with some caveats, though. The capability is only enabled with files recorded on ALEXA cameras with more recent firmware. It cannot be manually applied to older log-C footage, nor to any other log-encoded video file. It can only be toggled on or off without any adjustments. Finally, because this is done via under-the-hood ColorSync profile changes, it happens prior to the point any filters or color correction can be applied within FCP X itself.

A different approach has been developed by colorist Denver Riddle, known for his Color Grading Central website, products and tutorials. His new product, LUT Utility, is designed to provide FCP X editors with a better way of using LUTs for both corrective and creative color transforms. The plug-in installs into both Final Cut Pro X and Motion 5 and comes with a number of built-in LUTs for various cameras, such as the ALEXA, Blackmagic and even the Cinestyle profiles used with the Canon HDSLRs. Simply drop the filter onto a clip and select the LUT from the pulldown menu in the FCP X inspector pane. As a filter, you can freely apply any LUT selection, regardless of camera – plus, you can adjust the strength of the LUT via a slider. It can work within a series of filters applied to the same clip and can be placed upstream or downstream of any other filters, as well as within an adjustment layer (blank title effect). You can also stack multiple instances of the LUT with different settings on the same clip for creative effect.

The best part of LUT Utility is that you aren’t limited to the built-in LUTs. When you install the plug-in, a LUT Utility pane is added to System Preferences. In that pane, you can add additional LUTs sold by Color Grading Central or that you have created yourself. (External LUT files can be directly accessed within the filter when working in Motion 5.) One such package is the set of Osiris Digital Film Emulation LUTs developed jointly by Riddle and visionCOLOR. These are a set of nine film LUTs designed to mimic the looks of various film stocks. Each has two settings designed for either log or Rec. 709 video. For example, you can take an ALEXA log-C file and apply two instances of LUT Utility. Set the first filter to use the log-C-to-Rec.709 LUT. Then in the second filter, pick one of the film LUTs, but use the Rec. 709 version of it. Or, you could apply one instance of the LUT Utility filter and simply pick the same film LUT, but instead, select its log version. Both work, but will give you slightly different looks. Using the filter’s amount slider, it’s easy to fine tune the intensity of the effect.
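Under the hood, the two ideas in that paragraph are straightforward: a strength slider is just a mix between the original and the LUT-processed pixel, and stacking two LUT filters means feeding one result into the next. A hedged sketch, where lut_log_to_rec709 and lut_film_stock are placeholders for real lookup functions rather than anything shipped with the plug-in:

```python
def apply_lut_with_strength(value, lut, strength=1.0):
    # A strength of 1.0 gives the full LUT; 0.0 leaves the pixel untouched.
    graded = lut(value)
    return value * (1.0 - strength) + graded * strength

def stacked_grade(value, lut_log_to_rec709, lut_film_stock, strength=0.75):
    # First convert log to Rec. 709, then layer the film look at reduced strength.
    converted = apply_lut_with_strength(value, lut_log_to_rec709, 1.0)
    return apply_lut_with_strength(converted, lut_film_stock, strength)
```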

LUT Utility is applied as a filter, which means you can still add other color correction filters before or after it. Applying a filter, like Hawaiki Color, prior to a log conversion LUT, means that you would be adjusting color values of the log image, before converting it into Rec. 709. If you add such a filter after the LUT, then you are grading the already-converted image. Each position will give you different results, but most of this is handled gracefully, thanks to FCP X’s floating-point processing. Finally, you can also apply the LUT as a filter and then do additional corrections downstream of the filter by using the built-in Color Board tools.

I found these LUTs easy to install and use. They appear to be pretty lightweight in how they affect FCP X playback performance. I’m running a 2009 Mac Pro with a new Mavericks installation. I can apply one or more instances of the LUT Utility filter and my unrendered ProRes media plays in real-time. With the widespread use of log and log-style gamma profiles, this is one of the handiest filter sets to have if you are a heavy FCP X user. Not only are most of the common cameras covered, but the Osiris LUTs add a nice creative edge that you won’t find at this price point in competitive products. If you use FCP X for color correction and finishing, then it’s really an essential tool.

©2014 Oliver Peters

Photo phun II

Time to come back with a look at photography – just for the fun of it. Earlier this year I talked about using Pixelmator as an alternative to Photoshop. When I work with photos, I prefer to use Lightroom, Aperture and/or Photoshop (in that order). For extra effects, a touch of Tiffen Dfx, DFT Film Stocks or Magic Bullet Looks also gives you more pizzazz. While Pixelmator is pretty “lite” compared with Photoshop, it still gives most casual photographers more than enough control to enhance their images. Since it is based on Apple’s Core Image technology, it can also serendipitously take advantage of some of the FxFactory effects plug-ins.

Below is a set of images processed strictly with Pixelmator. I did use some of the FxFactory filters just because they were there, but understand that most of these effects also have native equivalents within Pixelmator. So, FxFactory filters are not an essential part of using Pixelmator as your image processing application.

Merry Christmas and Happy Holidays! See you in the new year!