CoreMelt PaintX

When Apple launched Final Cut Pro X, it was with a decidedly simplified set of video effects. This was offset by the ease with which users could create their own custom effects, using Apple Motion as a development platform. The result has been an entirely new ecosystem of low-cost, high-quality video effects. As attractive as that is, truly advanced visual effects still require knowledgeable plug-in developers who can work within the FCPX and macOS architecture to produce more powerful tools. For example, built-in tools found in other applications, such as Avid Media Composer’s Intraframe Paint or the Fusion page in DaVinci Resolve, simply aren’t within the scope of FCPX, nor of what users can create on their own through Motion templates.

To fill that need, developers like CoreMelt have been designing a range of advanced visual effects tools for the Final Cut Pro market, including effects for tracking, color correction, stabilization, and more. Their newest release is PaintX, which adds a set of Photoshop-style tools to Final Cut Pro X. As with many of CoreMelt’s other offerings, PaintX includes planar tracking, thanks to the licensing of Mocha tracking technology.

To start, drop the PaintX effect onto a clip and then launch the custom interface. PaintX requires a more elaborate control layout than the standard FCPX user interface was designed to provide. Once inside the PaintX interface window, you have a choice of ten brush functions: paint color, change color, blur, smear, sharpen, warp, clone, add noise, heal, and erase. These functions cover a range of needs, from simple wire removal to beauty enhancements and even pseudo horror makeup effects. You have control over brush size, softness, aspect ratio, angle, and opacity. The various brushes also have specific controls for their related functions, such as the blur range for the blur brush. Effects are applied in layers and actions. Each stroke is an action and both remain editable. If you aren’t the most precise artist, then the erase brush comes in handy. Did you color a bit too far outside the lines? Simply use the erase brush on that layer and trim back your excess.

Multiple brush effects can be applied to the same or different areas within the image, simply by adding a new layer for each effect. Once you’ve applied the first paint stroke, an additional brush control panel opens, allowing you to edit the brush parameters after the fact. So, if your brush size was too large or not soft enough, simply alter those settings without the need to redo the effect. Each effect can be individually tracked in either direction. The Mocha tracker offers additional features, such as transform (scale/position) versus perspective tracking, along with the ability to copy and paste tracking data between brush layers.

As a Final Cut Pro X effect, PaintX works within the standard video pipeline. If you applied color correction upstream of your PaintX filter, then that grade is visible within the PaintX interface. But if the color correction is applied downstream of the PaintX effect, you won’t see it when you open the PaintX interface. However, that correction will still be uniformly applied to the clip, including the areas altered within the PaintX effect. If you’ve “punched into” a 4K clip on an HD timeline, when you open PaintX, you’ll still see the full 4K frame. Finally, you have additional FCPX control over the opacity and mix of the applied PaintX filter.

I found PaintX to be well-behaved, even on a modest Mac like my 3-year-old laptop. However, if you don’t have a beefy Mac, keep the effect simple. The more brush effects that you apply and track in a single clip, the slower the real-time response will become, especially on under-powered machines. These effects are GPU-intensive and paint strokes are really a particle system; therefore, simple, single-layer effects are the easiest on the machine. But, if you intend to do more complex effects, like blurs and sharpens in multiple layers, then you will really want one of the more powerful Macs. Playback response is generally better once you’ve saved the effect and exited back to Final Cut. I did run into one minor issue with the clone brush on a single isolated clip, while using a 2013 Mac Pro. CoreMelt told me there have been a few early bugs with certain GPUs and is looking into the anomaly I discovered. That model in particular has been notorious for GPU issues with video effects. (Update: CoreMelt sent me a new build, which has corrected this problem.)

Originally written for RedShark News

©2018 Oliver Peters

Art Doesn’t Pay

In the short time since its 2015 launch, Frame.io has become a leading video collaboration site. Going beyond its early roots as a video review-and-approval site, Frame.io now supports numerous long-distance workflows that empower creative video professionals all around the world. Most companies feature blogs and customer profiles as just another form of marketing. For Frame.io, these are a way to give back to the post-production community. Their blog features tips and tutorials that benefit editors and other creatives, whether or not they use the company’s services in their daily workflows.

The newest outreach is Frame.io Masters, a short film series, featuring renowned filmmakers who also happen to be customers of the site. Emery Wells, CEO and co-founder, explains, “We wanted Frame.io Masters to be delivered in the voice of the creator. These are personal stories, brought to life by having each filmmaker make their own film, with no direction from us. It’s a manifestation of what Frame.io is all about – a collaborative effort that will serve as inspiration for aspiring creatives in every facet of the word.”

The inaugural video showcases sought-after Australian commercial filmmaker Mark Toia. His inspiring short film, entitled Art Doesn’t Pay, is a showreel that features a wide range of his impressive work. The title stems from what Toia was told in school. As the film demonstrates, art did indeed come to pay for Toia. Heeding his teacher’s admonition, Toia started his working career as a steelworker. But an amateur interest in photography brought professional attention that resulted in a new path following his natural artistic talent. This eventually brought him to commercial filmmaking.

Words that motivate you to use your talent

When I asked about the apparent contradiction of the title, Mark Toia replied, “My teacher said to me, ‘Art doesn’t pay’. This was a motivation for me. These words were the drive behind my push to prove him wrong.  At the end of the video, I state that ‘My teacher was wrong’. What I should have said at the end was that art does pay. Because with commerce, marketing, and general good business practices, art does pay.”

Photography, painting, and cinematography are very “hands on”, in a similar fashion to some blue collar jobs. Do these follow the so-called “10,000-hour rule”? That’s the premise that you get good at something only after having invested a lot of time doing it – thus, the 10,000 hours. But Toia wasn’t completely convinced. “Not sure about that. I’m a firm believer in natural ability. I could pick up a brush and a pencil and draw or paint real life almost instantly. This was an obvious natural gift. I had no formal training in photography, but quite quickly obtained the right eye for it and was better than most of my peers in very little time and made money from it very early. So the 10,000-hour rule may be the case for the people with a natural ability. I do not doubt that at all. For example, I wanted to be a professional motorcycle racer. I put thousands of hours into trying to be the fastest I could be. I remember this person started racing against us. He was new to the sport and quickly beat us all – and ended up winning five world titles. His name was Mick Doohan. Another natural.”

Art meets technology

Toia is obviously a natural artist, but he is also no stranger to the technology. I see his frequent posts on the RedUser forum about RED cameras and using Final Cut Pro X for editing. Toia explained, “I’m very open about not being a fan of a logo or a brand, but more a fan of time savings, speed, and performance. Not caring at all who made the app or camera. I’m not loyal to any brand. I have told Jim and Jarred of RED many times that if someone else brought out a camera that was faster, lighter, smaller, with more dynamic range, and with faster frame rates, then I would jump ship in an instant. And they know this very well, as that’s the reason why I jumped to RED in the first place. RED produces a camera that ticks most, but not all, of my boxes at the moment, hence, why I use RED. Overall I’m only loyal to a tool that gets me to the results I want quicker, at a high quality, and what gets me to bed earlier, makes me work fewer hours, and helps me make money more easily. Simple as that.”

“When it comes to editing and compositing programs, I made a point of learning them all to a high level, if only to make sure I was making the right judgment and choice of using the correct tool for the job. Flame, for instance, has a great keyer – Color Warper. But it’s a slow machine, so I only use it for that toolset. It’s actually a fantastic program, but just slow to use overall. If it could use the GPU like After Effects and Apple Motion – giving near real-time feedback – it would be a winner. Nuke is slow, too, but great for multiple layers of 3D.  Cheap, but again, very slow. After Effects is very fast for 90% of everything I do, so I use that for my 3D and 2D compositing work.”

“From an editing perspective, Avid was my go-to for many years, but FCP7 was quicker, so I jumped to that. Premiere Pro came out being able to use multiple codecs, so I used that for a couple of years, but… again it wasn’t as quick to use as FCP7. Then FCPX came onto the scene and I hated everything about it. I tried to love it, but could not get my head around the magnetic timeline. After my third attempt of trying to learn and understand it, the penny finally dropped. I got it! Now I quite literally work 30-40% faster in FCPX than any other edit program. I still work in Premiere from time to time for older projects I have. I know both programs intimately, so I can 100% say that FCPX is far more stable and quicker to use than Premiere, giving myself more time to be creative. And that’s where it wins. Speed and performance is always my drawcard. Not the logo on the box. I couldn’t care less if Google or Coca-Cola invented it. Time makes for better creative, a better end product, and better profit margins.”

Mark Toia shows us that natural talent combined with a drive for the best results is a winning combination.

Originally written for RedShark News.

©2018 Oliver Peters

Apple 2018 MacBook Pro

July was a good month for Apple power users, with the simultaneous release of Blackmagic Design’s eGPU and a refresh of Apple’s popular MacBook Pro line, including both 13″ and 15″ models. Although these new laptops retain the previous model’s form factor, they gained a bump-up in processors, RAM, and storage capacity.

Apple loaned me one of the Touch Bar space gray 15” models for this review. It came maxed out with the 8th generation 2.9 GHz 6-core Intel Core i9 CPU, 32GB DDR4 (faster) RAM, Radeon Pro 560X GPU, and a 2TB SSD. The price range on the 15″ model is pretty wide, due in part to the available SSD choices – from 256GB up to 4TB. Touch Bar 15” configurations start at $2,399 and can go all the way up to $6,699, once you spec the top upgrade for everything. My configuration was only $4,699 with the 2TB SSD. Of course, that’s before you add Apple Care (which I highly recommend for laptops) and any accessories.

Apple also released premium leather sleeves for both the 13″ and 15″ models in three colors ($199 for the 15″ size). They are pricey, of course, but not out of line with other branded, luxury products, like bags and watch bands. They fit the unit snugly and protect it when you are out and about. In addition, they serve as a good pad on rough desk surfaces or when you have the MacBook Pro on your lap. Depending on the task you are performing, the bottom surface of the MacBook Pro can get warm, but nothing to be concerned about.

Before you point me to the nearest Windows gaming machine instead, let me mention that this review really isn’t a comparison against Windows laptops, but rather a look at advances by Apple within the MacBook Pro line. But for context, I have owned six laptops to date – three PCs and three Macs. I shifted to Mac in order to have access to Final Cut Pro and have been happy with that move. The first two PCs developed stress fractures at the lid hinges before they were even a year old. The third, an HP, was solid, but after I gave it to my daughter, the power supply shorted. In addition, the hard drive became so corrupt (thank you, Windows) that it wasn’t worth trying to recover. In short, my Mac laptop experience, like that of others, has been one of good value. MacBook Pros generally last years and if you use them for actual billable work (editing, DIT, sound design, etc.), then the investment will pay for itself.

This is the fastest and best laptop Apple has made. Apple engineering has nicely balanced power, size, weight, and battery life in a way that’s hard to counter. It is expensive, but it’s hard to find an equivalent PC with these exact specs or components until you get into gaming PCs. Those a) look pretty ugly, b) tend to be larger and heavier, with lower battery life, and c) cost about the same. There’s also the sales experience. Try to navigate nearly any PC-centric laptop supplier in an effort to customize the options and it tends to become an exercise in frustration. On the other hand, Apple makes it quite easy to buy and configure its machines with the options that you want.

I do have to mention that when these MacBook Pros first came out, there was an issue of performance throttling, which was quickly addressed by Apple and fixed by a supplemental macOS release. That update had already been installed on my unit, so throttling didn’t affect any of my performance tests.

Likewise, there have been debris complaints with the first run of the “butterfly” keys used in this and the previous version of these laptops. As other reviewers’ teardowns have shown, Apple has added a membrane under the keys to help with sound dampening. Some reviewers have speculated that this also helps mitigate or even eliminate the debris issues. Whatever the reason, I liked typing on this keyboard and it did sound quieter to me. I tend to bang on keys, since I’m not a touch typist. The feel of a keyboard is very subjective and, in the course of a day, I tend to type on several vintages of Apple keyboards. In general, the keyboard on this newest MacBook Pro felt comfortable to me when used for standard typing.

What did Apple bring new to the mix?

When Apple introduced the Touch Bar in 2016, I thought ‘meh’. But after a couple of weeks with this unit, I’ve really enjoyed it, especially when an application like Final Cut Pro X extends its controls to the Touch Bar. You can switch the Touch Bar preferences to only be function keys if you like. But having control strip options makes it quick to adjust screen brightness, volume, and so on. In the case of FCPX, you also get a mini-timeline view in some modes. Even QuickTime Player calls up a small movie strip in the Touch Bar screen for the file being played.

These units also include Apple’s T2 security chip, which powers the fingerprint Touch ID and the newly added “Hey Siri” commands. The Retina screen on this laptop is gorgeous with up to 500 nits brightness and a wide color gamut. Another new addition is True Tone, which adjusts the display’s color temperature for the surrounding ambient light. That may become a more important selling point in the coming years. There is growing concern within the industry that blue light emitted from computer displays causes long-term eyesight damage. Generally, True Tone warms up the screen when under interior lighting, which reduces eye fatigue when you are working with a lot of white documents. But my recommendation is that editors, colorists, photographers, and designers turn this feature off when working on tasks that require color accuracy. Otherwise, the color balance of media will appear too warm (yellowish).

The 2018 15” MacBook Pro has four Thunderbolt 3/USB-C ports and a headphone jack. The four ports (two per side) are driven by two internal Thunderbolt 3 (40Gb/s) buses. It appears that’s one for each side, which means that plugging in two devices on one side will split the available Thunderbolt 3 bandwidth on that bus in half. However, this doesn’t seem to be much of a factor during actual use. The internal bus routing does appear to be different from the previous model, in spite of what otherwise is more or less the same hardware configuration.

Gone are all other connections, so plan on purchasing an assortment of adapters to connect peripherals, such as those ubiquitous USB thumb drives or hardware dongles (license keys). I do wish that Apple had retained at least one standard USB port. Thunderbolt 3 supports power, so no separate MagSafe port is required either. (Power supply and cable are included.) One minor downside of this is that there is no indicator LED when a full battery charge is achieved, like we used to have on the MagSafe plug.

If connected to a Thunderbolt 3 device with an adequate power supply (e.g. the LG displays or the Blackmagic eGPU sold through Apple), then a single cable can both transfer data and power the laptop. One caveat is that Thunderbolt 3 doesn’t pass a video signal in the same way as Thunderbolt 2. You cannot simply add a Thunderbolt 3-to-Thunderbolt 2 adapter and connect a typical monitor’s MiniDisplayPort plug, as was possible with Thunderbolt 2 ports. External monitors without the correct connection will need to go through a dock or monitor adapter in order to pass a video signal. (This is also true for the iMac Pros.)

Many users have taken to relying on their MacBook Pros as the primary machine for their home or office, as well as on the road. The upside of Thunderbolt connectivity is that when you get back to the office, connecting a single Thunderbolt 3 cable to the rest of your suite peripherals (dock, display, eGPU, whatever) is all you need to get up and running. Simple and clean. Stick the laptop in a cradle in clamshell mode or on a laptop stand, connect the cable, and you now have a powerful desktop machine. MacBook Pros have gained enough power in recent years that – unless your demands are heavy – they can easily service your editing, photography, and graphics needs.

Is it time to upgrade?

I own a mid-2014 15” MacBook Pro (the last series with an NVIDIA GPU), which I purchased in early 2015. Three years is often a good interval for most professional users to plan on a computer refresh, so I decided to compare the two. To start with, the new 2018 machine boots faster and apps also open faster. It’s even slightly smaller and thinner than the mid-2014 model. Both have fast SSDs, but the 2018 model is significantly faster (2645 MB/s write, 2722 MB/s read – Blackmagic Speed Test).

As with other reviews, I pulled an existing edit project for my test sequence. This timeline could be built identically in Final Cut Pro X, Premiere Pro, and Resolve – without effects unique to one specific software application. My timeline consisted of 4K Alexa ProResHQ files that had a LUT and were scaled into a 1080p sequence. A few 1080p B-roll shots were also part of this sequence. The only taxing effect was a reversed slow-motion 4K clip, using optical flow interpolation. Both machines handled 4K ProRes footage just fine at full resolution using various NLEs. Exports to ProRes and H.264 were approximately twice as fast from Final Cut Pro X on the newer MacBook Pro. The same exports from Premiere Pro took longer overall than from FCPX, but were faster on the 2018 machine, as well (see the section at the end for performance by the numbers).

If you are a fan of Final Cut Pro X, this machine is one of the best to use it on, especially if you can store your media on the internal drive. However, as an equalizer of sorts, I also ran these same test projects from an external SSD connected via USB3. While fast (200+ MB/s read/write), it wasn’t nearly as fast as the internal SSDs. Nevertheless, performance didn’t really lag behind with either FCPX or Premiere Pro. However, the optical flow clip did pose some issues. It played smoothly at “best quality” in FCPX, but oddly stuttered in the “best performance” setting. It did not play well in Premiere Pro at either full or half resolution. I also believe it contributed to the slower export times evident with Premiere Pro.

I tested a second project made up of all 4K REDCODE raw footage, which was placed into a 4K timeline. The 2018 MacBook Pro played the individual files and edited sequences smoothly when set to “best performance” in FCPX or half resolution in Premiere Pro. However, bumping the settings up to full quality caused stuttering with either NLE.

My last test was the same DaVinci Resolve project that I’ve used for my eGPU “stress” tests. These are anamorphic 4K Alexa files in a 2K DCI timeline. I stripped off all of the added filters that I had applied for the test of the eGPU, leaving a typical editing timeline with only a LUT and basic correction. This sequence played smoothly without dropping frames, which bodes well for editors who are considering a shift to Resolve as their main NLE.

Speaking of the Blackmagic eGPU tests, I had one day of overlap between the loans of the MacBook Pro and the Blackmagic eGPU. DaVinci Resolve’s real-time playback performance and exports were improved by about a 2X factor with the eGPU connected to the 15” model. Naturally, the 15” machine by itself was quite a bit faster than the 13” MacBook Pro, so the improvement with an eGPU attached wasn’t as dramatic a margin as the 13” test demonstrated. Even with this powerhouse MacBook Pro, the Blackmagic eGPU still adds value as a general appliance, as well as providing Resolve acceleration.

A note on battery life. The spec claims about 10 hours, but that’s largely for simple use, like watching web movies or listening to iTunes. Most of these activities do not cause the graphics to switch over from the integrated Intel to the Radeon Pro GPU, which consumes more power. In my editing tests with the Radeon GPU constantly on – and most of the energy saving settings disabled – I got five to six hours of battery life. That’s even when an application like FCPX was open, but minimized, without any real activity being done on the laptop.

I also ran a “heavy load” test, which involved continually looping my sample 1080 timeline (with 4K source media) full screen at “best quality” in FCPX. This is obviously a worst case scenario, but the charge only lasted about two hours. In short, the battery capacity is very good for a laptop, but one can only expect so much. If you plan on a heavy workload for an extended period of time, stay plugged in.

The 2018 MacBook Pro is a solid update that creative professionals will certainly enjoy, both in the field and even as a desktop replacement. If you bought last year’s model, there’s little reason to refresh your computer, yet. But three years or more? Get out the credit card!

_________________________________________________

Performance by the numbers

Blackmagic Design eGPU test

DaVinci Resolve renders/exports
(using the same test sequence as used for my eGPU review)

13” 2018 MacBook Pro – internal Intel graphics only
Render at source resolution – 1fps
Render at timeline resolution – 4fps

13” 2018 MacBook Pro – with Blackmagic eGPU
Render at source resolution – 5.5fps
Render at timeline resolution – 17.5fps

15” 2018 MacBook Pro – internal Radeon graphics only
Render at source resolution – 2.5fps
Render at timeline resolution – 8fps

15” 2018 MacBook Pro – with Blackmagic eGPU
Render at source resolution – 5.5fps
Render at timeline resolution – 16fps

Standard performance tests – 2018 15” MacBook Pro vs. Mid-2014
(using editing test sequence – 4K ProResHQ media)

2018 export from FCPX to ProRes  :30
2018 export from FCPX to H.264 at 10Mbps  :57
2014 export from FCPX to ProRes  :57
2014 export from FCPX to H.264 at 10Mbps  1:42

2018 export from Premiere Pro to ProRes  2:59
2018 export from Premiere Pro to H.264 at 10Mbps  2:32
2014 export from Premiere Pro to ProRes  3:35
2014 export from Premiere Pro to H.264 at 10Mbps  3:25

2018 export from Resolve to ProRes :35
2018 export from Resolve to H.264 at 10Mbps  :35
(Mid-2014 MBP was not used in this test)

Originally written for RedShark News.

©2018 Oliver Peters

Blackmagic Design eGPU

Power users have grown to rely on graphics processing units from AMD, Intel and Nvidia to accelerate a wide range of computational functions – from visual effect filters to gaming and 360VR, and even to bitcoin mining. Apple finally supports external GPUs, which can easily be added as plug-and-play devices without any hack. Blackmagic Design just released its own eGPU product for the Mac, which is sold exclusively through Apple ($699 USD). It requires macOS 10.13.6 or later, and a Thunderbolt 3 connection. (Thunderbolt 2, even with adapters, will not work.)

The Blackmagic eGPU features a sleek, aluminum enclosure that makes a fine piece of desk art. It’s of similar size and weight to a 2013 Mac Pro and is optimized for both cooling and low noise. The unit is built around the AMD Radeon Pro 580 GPU with 8GB of video memory. It delivers 5.5 teraflops of processing power and is the same GPU used in Apple’s top-end, 27” Retina 5K iMac.

Leveraging Thunderbolt 3

Thunderbolt 3 technology supports 40Gb/s of bandwidth, as well as power. The Blackmagic eGPU includes a beefy power supply that can also power and/or charge a connected MacBook Pro. There are two Thunderbolt 3 ports, four USB3.1 ports, and HDMI. Therefore, you can connect a Mac, two displays, plus various USB peripherals. It’s easy to think of it as an accelerator, but it is also an appliance that can be useful in other ways to extend the connectivity and performance of MacBook Pros. Competing products with the same Radeon 580 GPU may be a bit less expensive, but they don’t offer this level of connectivity.

Apple and Blackmagic both promote eGPUs as an add-on for laptops, but any Thunderbolt 3 Mac qualifies. I tested the Blackmagic eGPU with both a high-end iMac Pro and the base model 13” 2018 MacBook Pro with Touch Bar. This iMac Pro is configured with the more advanced Vega Pro 64 GPU (16GB VRAM). My main interest in including the iMac Pro was simply to see whether there would be enough performance boost to justify adding an eGPU to a Mac that is already Apple’s most powerful. Installation of the eGPU was simply a matter of plugging it in. A top menu icon appears on the Mac screen to let you know it’s there, and lets you safely disconnect the unit while the Mac is powered up.

Pushing the boundaries through testing

My focus is editing and color correction and not gaming or VR. Therefore, I ran tests with and without the eGPU, using Final Cut Pro X, Premiere Pro, and DaVinci Resolve (Resolve Studio 15 beta). Anamorphic ARRI Alexa ProRes 4444 camera files (2880×2160, native / 5760×2160 pixels, unsqueezed) were cut into 2K DCI (Resolve) and/or 4K DCI (FCPX, Premiere Pro) sequences. This meant that every clip got a Log-C LUT and color correction, as well as aspect ratio correction and scaling. In order to really stress the system, I added several GPU-accelerated effect filters, like glow, film grain, and so on. Finally, timed exports went back to ProRes 4444 – using the internal SSD for media and render files to avoid storage bottlenecks.

Not many applications take advantage of this newfound power, yet. Neither FCPX nor Premiere Pro utilizes the eGPU fully – or, in some cases, at all. Premiere exports were actually slower using the eGPU. In my tests, only DaVinci Resolve gained measurable acceleration from the eGPU, which also held true for a competing eGPU that I compared.

If editing, grading or possibly location DIT work is your main interest, then consider the Blackmagic eGPU a good accessory for DaVinci Resolve running on a MacBook Pro. As a general rule, lesser-powered machines benefit more from eGPU acceleration than powerful ones, like the iMac Pro, with its already-powerful, built-in Vega Pro 64 GPU.

Performance by the numbers (iMac Pro only)

To provide some context, here are the results I got with the iMac Pro:

Resolve on iMac Pro (internal V64 chip) – NO eGPU – Auto GPU config

Playback of timeline at real-time 23.976 without frames dropping

Render at source resolution – average 11fps (slower than real-time)

Render at timeline resolution – average 33fps (faster than real-time)

Resolve on iMac Pro – with BMD eGPU (580 chip) – OpenCL

Playback of timeline at real-time 23.976 without frames dropping

Render at source resolution – average 11fps (slower than real-time)

Render at timeline resolution – average 37fps (faster than real-time)

Metal

Apple’s ability to work with eGPUs is enabled by Metal. This is Apple’s framework for addressing hardware components, like graphics and central processors. The industry has relied on other frameworks, including OpenGL, OpenCL, and CUDA. The first two are open standards written for a wide range of hardware platforms, while CUDA is specific to Nvidia GPUs. Apple is deprecating all of these in favor of Metal (now Metal 2). With each coming OS update, these will become more and more “legacy” until, presumably, at some point in the future, macOS supports only Metal.

Apple’s intention is to gain performance improvements by optimizing the code at a lower level, “closer to the metal”. It is possible to do this when you only address a limited number of hardware options, which may explain why Apple has focused on using only AMD and Intel GPUs. The downside is that developers must write code that is proprietary to Apple computers. Metal is in part what gives Final Cut Pro X its smooth media handling and real-time performance. Both Premiere Pro and Resolve give you the option to select Metal when installed on Macs.
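
As a side note, it’s easy to see how an application views GPUs through Metal. Here is a minimal Swift sketch – my own illustration, not code from any of these apps – that lists every Metal device on a Mac. A connected eGPU reports itself as removable, which is how software can tell it apart from built-in graphics.

```swift
import Metal

// List every GPU that Metal exposes on this Mac (macOS only).
// A connected eGPU, such as the Blackmagic unit, reports isRemovable = true.
for device in MTLCopyAllDevices() {
    let kind = device.isRemovable ? "eGPU"
             : device.isLowPower  ? "integrated"
             : "discrete"
    let vramMB = device.recommendedMaxWorkingSetSize / (1024 * 1024)
    print("\(device.name) [\(kind)] – approx. \(vramMB) MB working set")
}
```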

In the tests that I ran, I presume FCPX only used Metal, since there is no option to select anything else. I did, however, test both Premiere Pro/Adobe Media Encoder and Resolve, first with Metal and again with OpenCL specifically selected. I didn’t see much difference in render times with either setting in Premiere/AME. Resolve showed definite differences, with OpenCL the clear winner. For now, Resolve is still optimized for OpenCL over Metal.

Power for the on-the-go editor and colorist

The MacBook Pro is where the Blackmagic eGPU makes the most sense. It gives you better performance with faster exports, and adds badly-needed connectivity. My test Resolve sequence is a lot more stressful than I would normally create. It’s the sort of sequence I would never work with in the real world on a lower-end machine, like this 13” model. But, of course, I’m purposefully pushing it through a demanding task.

When I ran the test on the laptop without the eGPU connected, it would barely play at all. Exports at source resolution rendered at around 1fps. Once I added the Blackmagic eGPU, this sequence played in real-time, although the viewer would start to drop frames towards the end of each shot. Exports at the source resolution averaged 5.5fps. At timeline resolution (2K DCI), it rendered at up to 17fps, as opposed to 4fps without it. That’s over a 4X improvement.

Everyone’s set of formats and use of color correction and filters are different. Nevertheless, once you add the Blackmagic eGPU to this MacBook Pro model, functionality in Resolve goes from insanely slow to definitely useable. If you intend to do reliable color correction using Resolve, then a Thunderbolt 3 UltraStudio HD Mini or 4K Extreme 3 is also required for proper video monitoring. Resolve doesn’t send video signals over HDMI, like Premiere Pro and Final Cut Pro X can.

It will be interesting to see if Blackmagic also offers a second eGPU model with the higher-end chip in the future. That would likely double the price of the unit. In the testing I’ve done with other eGPUs that used a version of the Vega 64 GPU, I’m not convinced that such a product would consistently deliver 2X more performance to justify the cost. This Blackmagic eGPU adds a healthy dose of power and connectivity for current MacBook Pro users and that will only get better in the future.

I think it’s clear that Apple is looking towards eGPUs as a way to enhance the performance of its MacBook Pro line, without compromising design, battery life, and cooling. Cable up to an external device and you’ve gained back horsepower that wouldn’t be there in the standard machine. After all, you mainly need this power when you are in a fixed, rather than mobile, location. The Blackmagic eGPU is portable enough, so as long as you have electrical power, you are good to go.

In his review of the 2018 MacBook Pro, Ars Technica writer Samuel Axon stated, “Apple is trying to push its own envelope with the CPU options it has included in the 2018 MacBook Pro, but it’s business as usual in terms of GPU performance. I believe that’s because Apple wants to wean pro users with serious graphics needs onto external GPUs. Those users need more power than a laptop can ever reasonably provide – especially one with a commitment to portability.”

I think that neatly sums it up, so it’s nice to see Blackmagic Design fill in the gaps.

Originally written for RedShark News.

©2018 Oliver Peters

Beyond the Supernova

No one typifies hard-driving, instrumental guitar rock better than Joe Satriani. The guitar virtuoso – known to his fans as Satch – has sixteen studio albums under his belt, along with several EPs, live concert recordings, and compilations. In addition to his solo tours, Satriani founded G3, a series of short tours that feature Satriani along with a changing cast of two other all-star solo guitarists, such as Steve Vai, Yngwie Malmsteen, Guthrie Govan, and others. In another side project, Satriani is the guitarist for the supergroup Chickenfoot, which is fronted by former Van Halen lead singer Sammy Hagar.

The energy behind Satriani’s performances was captured in the new documentary film, Beyond the Supernova, which is currently available on the Stingray Qello streaming channel. This documentary grew out of the general behind-the-scenes coverage of Satriani’s 2016 and 2017 tours in Asia and Europe, to promote his 15th studio album, Shockwave Supernova. Tour filming was handled by Satriani’s son, ZZ (Zachariah Zane) – an up-and-coming young filmmaker. The tour coincided with Joe Satriani’s 60th birthday and the 30th anniversary of his multi-platinum-selling album Surfing with the Alien. These elements, as well as capturing Satriani’s introspective nature, provided the ingredients for a more in-depth project, which ZZ Satriani produced, directed, and edited.

According to Joe Satriani in an interview on Stingray’s PausePlay, “ZZ was able to capture the real me in a way that only a son would understand how to do; because I was struggling with how I was going to record a new record and go in a new direction. So, as I’m on the tour bus and backstage – I guess it’s on my face. He’s filming it and he’s going ‘there’s a movie in here about that. It’s not just a bunch of guys on tour.’”

From music to filmmaking

ZZ Satriani graduated from Occidental College in 2015 with a BA in Art History and Visual Arts, with a focus on film production. He moved to Los Angeles to start a career as a freelance editor. I spoke with ZZ Satriani about how he came to make this film. He explained, “For me it started with skateboarding in high school. Filmmaking and skateboarding go hand-in-hand. You are always trying to capture your buddies doing cool tricks. I gravitated more to filmmaking in college. For the 2012 G3 Tour, I produced a couple of web videos that used mainly jump cuts and were very disjointed, but fun. They decided to bring me on for the 2016 tour in order to produce something similar. But this time, it had to have more of a story. So I recorded the interviews afterwards.”

Although ZZ thinks of himself as primarily an editor, he handled all of the backstage, behind-the-scenes, and interview filming himself, using a Sony PXW-FS5 camera. He comments, “I was learning how to use the camera as I was shooting, so I got some weird results – but in a good way. I wanted the footage to have more of a filmic look – to have more the feeling of a memory, than simply real-time events.”

The structure of Beyond the Supernova intersperses concert performances with events on the tour and introspective interviews with Joe Satriani. The multi-camera concert footage was supplied by the touring support company and is often mixed with historical footage provided by Joe Satriani’s management team. This enabled ZZ to intercut performances of the same song, not only from different locations, but even different years, going back to Joe Satriani’s early career.

The style of cutting the concert performances is relatively straightforward, but the travel and interview bridges that join them together have more of a stream-of-consciousness feel to them and are often quite psychedelic. ZZ says, “I’m not a big [Adobe] After Effects guy, so all of the ‘effects’ are practical and built up in layers within [Adobe] Premiere Pro. The majority of ‘effects’ dealt with layering, blending and cropping different clips together. It makes you think about the space within the frame – different shapes, movement, direction, etc. I like playing around that way – you end up discovering things you wouldn’t have normally thought of. Let your curiosity guide you, keep messing with things and you will look at everything in a new way. It keeps editing exciting!”

Premiere Pro makes the cut

Beyond the Supernova was completely cut and finished in Premiere Pro. ZZ explains why: “Around 2011-12, I made the switch from [Apple] Final Cut Pro to Premiere Pro while I was in a film production class. They informed us that was the new standard, so we rolled with it and the transition was very smooth. I use other apps in the Adobe suite and I like the layout of everything in each one, so I’ve never felt the need to switch to another NLE.”

ZZ Satriani continues, “We had a mix of formats to deal with, including the need to upscale some of the standard definition footage to HD, which I did in software. Premiere handled the PXW-FS5’s XAVC-L codec pretty well in my opinion. I didn’t transcode to ProRes, since I had so much footage, and not a lot of external hard drive space. I knew this might make things go more slowly – but honestly, I didn’t notice any significant drawbacks. I also handled all of the color correction, using Premiere’s Lumetri color controls and the FilmConvert plug-in.” Satriani created the sound design for the interview segments, but John Cuniberti (who has also mixed Joe Satriani’s albums) re-mixed the live concert segments in his studio in London. The final 5.1 surround mix of the whole film was handled at Skywalker Sound.

The impetus pushing completion was entry into the October 2017 Mill Valley Film Festival. ZZ says, “I worked for a month putting together the trailer for Mill Valley. Because I had already organized the footage for this and an earlier teaser, the actual edit of the film came easily. It took me about two months to cut – working by myself in the basement on a [2013] Mac Pro. Coffee and burritos from across the street kept me going.” 

Introspection brings surprises

Fathers and sons working together can often be an interesting dynamic and even ZZ learned new things during the production. He comments, “The title of the film evolved out of the interviews. I learned that Joe’s songs on an album tend to have a theme tied to the theme of the album, which often has a sci-fi basis to it. But it was a real surprise to me when Joe explained that Shockwave Supernova was really his character or persona on stage. I went, ‘Wait! After all these years, how did I not know that?’”

As with any film, you have to decide what gets cut and what stays. In concert projects, the decision often comes down to which songs to include. ZZ says, “One song that I initially thought shouldn’t be included was Surfing with the Alien. It’s a huge fan favorite and such an iconic song for Joe. Including it almost seemed like giving in. But, in a way it created a ‘conflict point’ for the film. Once we added Joe’s interview comments, it worked for me. He explained that each time he plays it live that it’s not like repeating the past. He feels like he’s growing with the song – discovering new ways to approach it.”

The original plan for Beyond the Supernova after Mill Valley was to showcase it at other film festivals. But Joe Satriani’s management team thought that it coincided beautifully with the release of his 16th studio album, What Happens Next, which came out in January of this year. Instead of other film festivals, Beyond the Supernova made its video premiere on AXS TV in March and then started its streaming run on Stingray Qello this July. Qello is known as a home for classic and new live concerts, so this exposes the documentary to a wider audience. Whether you are a fan of Joe Satriani or just rock documentaries, ZZ Satriani’s Beyond the Supernova is a great peek behind the curtain into life on the road and some of the thoughts that keep this veteran solo performer fresh.

Images courtesy of ZZ Satriani.

©2018 Oliver Peters

Hawaiki AutoGrade

The color correction tools in Final Cut Pro X are nice. Adobe’s Lumetri controls make grading intuitive. But sometimes you just want to click a few buttons and be happy with the results. That’s where AutoGrade from Hawaiki comes in. AutoGrade is a full-featured color correction plug-in that runs within Final Cut Pro X, Motion, Premiere Pro and After Effects. It is available from FxFactory and installs through the FxFactory plug-in manager.

As the name implies, AutoGrade is an automatic color correction tool designed to simplify and speed-up color correction. When you install AutoGrade, you get two plug-ins: AutoGrade and AutoGrade One. The latter is a simple, one-button version, based on global white balance. Simply use the color-picker (eye dropper) and sample an area that should be white. Select enable and the overall color balance is corrected. You can then tweak further, by boosting the correction, adjusting the RGB balance sliders, and/or fine-tuning luma level and saturation. Nearly all parameters are keyframeable, and looks can be saved as presets.
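
Under the hood, a one-click balance like this boils down to simple per-channel gains. Here is a minimal sketch of the general idea in Swift, assuming linear RGB values normalized to 0-1 – my own illustration of the technique, not Hawaiki’s actual math:

```swift
struct RGB { var r, g, b: Double }

// Compute per-channel gains so that a sampled "should be white" patch
// becomes neutral (equal R, G, and B).
func whiteBalanceGains(from sample: RGB) -> RGB {
    let target = (sample.r + sample.g + sample.b) / 3.0
    return RGB(r: target / sample.r, g: target / sample.g, b: target / sample.b)
}

// Apply those gains to any pixel, clamping to a legal 0...1 range.
func applyBalance(_ gains: RGB, to pixel: RGB) -> RGB {
    return RGB(r: min(pixel.r * gains.r, 1.0),
               g: min(pixel.g * gains.g, 1.0),
               b: min(pixel.b * gains.b, 1.0))
}
```

This is also why re-sampling a different area changes the result – each sample simply produces a different set of gains.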

AutoGrade One is just a starter, though, for simple fixes. The real fun is with the full version of AutoGrade, which is a more comprehensive color correction tool. Its interface is divided into three main sections: Auto Balance, Quick Fix, and Fine-Tune. Instead of a single global balance tool, the Auto Balance section permits global correction, as well as any combination of white, black, and/or skin correction. Simply turn on one or more desired parameters, sample the appropriate color(s), and enable Auto Balance. This tool will also raise or lower luma levels for the selected tonal range.

Sometimes you might have to repeat the process if you don’t like the first results. For example, when you sample the skin on someone’s face, sampling rosy cheeks will yield different results than if you sample the yellowish highlights on a forehead. To try again, just uncheck Auto Balance, sample a different area, and then enable Auto Balance again. In addition to an amount slider for each correction range, you can also adjust the RGB balance for each. Skin tones may be balanced towards warm or neutral, and the entire image can be legalized, which clamps video levels to 0-100.

Quick Fix is a set of supplied presets that work independently of the color balance controls. These include some standards, like cooling down or warming up the image, the orange-and-teal look, adding an s-curve, and so on. They are applied at 100% and, to my eye, felt a bit harsh at this default. To tone down the effect, simply lower the amount slider.

Fine-Tune rounds it out when you need to take a deeper dive. This section is built as a full-blown, 3-way color corrector. Each range includes a luma and three color offset controls. Instead of wheels, these controls are sliders, but the results are the same as with wheels. In addition, you can adjust exposure, saturation, vibrance, temperature/tint, and even two different contrast controls. One innovation is a log expander, designed to make it easy to correct log-encoded camera footage, in the absence of a specific log-to-Rec709 camera LUT.

Naturally, any plug-in could always offer more, so I have a minor wish list. I would love to see five additional features: film grain, vignette, sharpening, blurring/soft focus, and a highlights-only expander. There are certainly other individual filters that cover these needs, but having it all within a single plug-in would make sense. This would round out AutoGrade as a complete, creative grading module, servicing user needs beyond just color correction looks.

AutoGrade is a deceptively powerful color corrector, hidden under a simple interface. User-created looks can be saved as presets, so you can quickly apply complex settings to similar shots and set-ups. There are already many color correction tools on the market, including Hawaiki’s own Hawaiki Color, but the price is very attractive, making AutoGrade a superb tool to have in your kit. It’s a fast way to color-grade that’s ideal for users who are new to color correction and experienced colorists alike.

©2018 Oliver Peters

More about ProRes RAW

A few weeks ago I wrote a two-part post – HDR and RAW Demystified. In the second part, I covered Apple’s new ProRes RAW codec. I still see a lot of misinformation on the web about what exactly this is, so I felt it was worth an additional post. Think of this post as an addendum to Part 2. My apologies up front, if there is some overlap between this and the previous post.

_____________________________

Camera raw codecs have been around since before RED Digital Camera brought out their REDCODE RAW codec. At NAB, Apple decided to step into the game. RED brought the innovation of recording the raw signal as a compressed movie file, making on-board recording and simplified post-production possible. Apple has now upped the game with a codec that is optimized for multi-stream playback within Final Cut Pro X, thus taking advantage of how FCPX leverages Apple hardware. At present, ProRes RAW is incompatible with all other applications. The exception is Motion, which will read and play the files, but with incorrect default – albeit correctable – video levels.

ProRes RAW is only an acquisition codec and, for now, can only be recorded externally using an Atomos Shogun Inferno or Sumo 19 monitor/recorder, or in-camera with DJI’s Inspire 2 or Zenmuse X7. Like all things Apple, the complexity is hidden under the surface. You don’t get the type of specific raw controls made available for image tweaking, as you do with RED. But ProRes RAW will cover the needs of most camera raw users, making this the raw codec “for the rest of us”. At least that’s what Apple is banking on.

Capturing in ProRes RAW

The current implementation requires a camera that exports a camera raw signal over SDI, which in turn is connected to the Atomos, where the conversion to ProRes RAW occurs. Although no one is very specific about the exact process, I would presume that Atomos’ firmware is taking in the camera’s form of raw signal and rewrapping or transforming the data into ProRes RAW. This means that the Atomos firmware would require a conversion table for each camera, which would explain why only a few Sony, Panasonic, and Canon models qualify right now. Others, like ARRI Alexa or RED cameras, cannot yet be recorded as ProRes RAW. The ProRes RAW codec supports 12-bit color depth, but it depends on the camera. If the SDI output to the Atomos recorder is only 10-bit, then that’s the bit-depth recorded.

Until more users buy or update these specific Atomos products – or more manufacturers become licensed to record ProRes RAW onboard the camera – any real-world comparisons and conclusions come from a handful of ProRes RAW source files floating around the internet. That, along with the Apple and Atomos documentation, provides a pretty solid picture of the quality and performance of this codec group.

Understanding camera raw

All current raw methods depend on single-sensor cameras that capture a Bayer-pattern image. The sensor uses a monochrome mosaic of photosites, which are filtered to register the data for light in the red, green, or blue wavelengths. Nearly all of these sensors have twice as many green receptors as red or blue. At this point, the sensor is capturing linear light across the maximum dynamic range that the camera’s exposure range and sensor allow. It’s just an electrical signal being turned into data, without compression (within the sensor). The signal can be recorded as a camera raw file, with or without compression. Alternatively, it can also be converted directly into a full-color video signal and then recorded – again, with or without compression.
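
To make the mosaic concrete, here is a toy Swift sketch of an RGGB Bayer layout – purely an illustration of the pattern, not any camera’s actual sensor code:

```swift
// Each photosite records only one channel; full RGB pixels are produced
// later by de-Bayering (interpolating the two missing channels per site).
enum Channel { case red, green, blue }

func bayerChannel(x: Int, y: Int) -> Channel {
    switch (y % 2, x % 2) {
    case (0, 0): return .red    // even row, even column
    case (0, 1): return .green  // even row, odd column
    case (1, 0): return .green  // odd row, even column
    default:     return .blue   // odd row, odd column
    }
}
// Over any 2x2 block this yields one red, two greens, and one blue –
// the "twice as many green receptors" noted above.
```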

If the RGGB photosite data (camera raw) is converted into RGB pixels, then sensor color information is said to be “baked” into the file. However, if the data is stored in raw form and only converted to RGB later in post, the sensor data is preserved intact until much deeper into the post process. Basically, the choice boils down to whether that conversion is best performed within the camera’s electronics or later via post-production software.

The effect of compression may also be less destructive (fewer visible artifacts) with a raw image, because data, rather than video, is being compressed. However, converting the file to RGB does not mean that the wider dynamic range is being lost. That’s because most camera manufacturers have adopted logarithmic encoding schemes, which allow a wide color space and a high dynamic range (big exposure latitude) to be carried through into post. HDR standards are still in development and have been in testing for several years, completely independent of whether or not the source files are raw.
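
As a rough illustration of how log encoding carries that latitude, here is a toy encode/decode pair. The constants are arbitrary assumptions for this sketch – real curves like Log-C or S-Log3 use their own published math – but the principle is the same: each stop of linear light gets an equal share of the encoded signal range.

```swift
import Foundation

// Toy log curve: scene-linear 1.0 = mid-grey, with `stops` of range
// above and below mid-grey. Not any vendor's actual curve.
let stops = 6.0

func logEncode(_ linear: Double) -> Double {
    guard linear > 0 else { return 0.0 }
    return max(0.0, min(1.0, (log2(linear) + stops) / (2.0 * stops)))
}

func logDecode(_ encoded: Double) -> Double {
    return pow(2.0, encoded * 2.0 * stops - stops)
}

// Mid-grey lands at code value 0.5; a highlight six stops above it hits 1.0.
// logEncode(1.0) == 0.5, logEncode(64.0) == 1.0
```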

ProRes RAW compression

ProRes RAW and ProRes RAW HQ are both compressed codecs with roughly the same data footprint as ProRes and ProRes HQ. Both raw and standard versions use a variable bitrate form of compression, but in different ways. Apple explains it this way in their white paper: 

“As is the case with existing ProRes codecs, the data rates of ProRes RAW are proportional to frame rate and resolution. ProRes RAW data rates also vary according to image content, but to a greater degree than ProRes data rates. 

With most video codecs, including the existing ProRes family, a technique known as rate control is used to dynamically adjust compression to meet a target data rate. This means that, in practice, the amount of compression – hence quality – varies from frame to frame depending on the image content. In contrast, ProRes RAW is designed to maintain constant quality and pristine image fidelity for all frames. As a result, images with greater detail or sensor noise are encoded at higher data rates and produce larger file sizes.”

ProRes RAW and HDR do not depend on each other

One of my gripes when watching some of the ProRes RAW demos on the web – and reading the related forum comments – is that ProRes RAW is being conflated with HDR. This is simply inaccurate. Raw applies to both SDR and HDR workflows. HDR workflows do not depend on raw source material. One of the online demos I saw recently immediately started with an HDR FCPX Library. The demo ProRes RAW clips were imported and looked blown out. This made for a dramatic example of recovering highlight information. But it was wrong!

If you start with an SDR FCPX Library and import these same files, the default image looks great. The hitch here is that these ProRes RAW files were shot with a Sony camera and a default LUT is applied in post. That’s part of the file’s metadata. To my knowledge, all current, common camera LUTs are based on conversion to the Rec709 color space, not HDR or wide gamut. If you set the inspector’s LUT tab to “none” in either SDR or HDR, you get a relatively flat, log image that’s easily graded in whatever direction you want.

What about raw-specific settings?

Are there any advantages to camera raw in the first place? Most people will point to the ability to change ISO values and color temperature. But these aren’t actually something inherently “baked” into the raw file. Instead, this is metadata, dialed in by the DP on the camera, which optimizes the images for the sensor. ISO is a sensitivity concept based on the older ASA standard for film exposure. In modern digital cameras, it is actually an exposure index (EI), which is how some refer to it. (RedShark’s Phil Rhodes goes into depth in this linked article.)

The bottom line is that EI is a cross-reference to that camera sensor’s “sweet spot”. 800 on one camera might be ideal, while 320 is best on another. Changing ISO/EI has the same effect as changing gain in audio. Raising or lowering ISO/EI values means that you can either see better into the darker areas (with a trade-off of added noise) – or you see better highlight detail, but with denser dark areas. By changing the ISO/EI value in post, you are simply changing that reference point.
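
Put in arithmetic terms, re-rating in post is just a gain offset measured in stops. A hypothetical helper makes the point:

```swift
import Foundation

// The brightness shift, in stops, from re-rating a clip's EI in post.
// This is the same math as a gain change – no new sensor data is created.
func stopOffset(fromEI old: Double, toEI new: Double) -> Double {
    return log2(new / old)
}

stopOffset(fromEI: 800, toEI: 1600)  // +1.0 stop brighter rendering
stopOffset(fromEI: 800, toEI: 400)   // -1.0 stop darker rendering
```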

In the case of ProRes RAW and FCPX, there are no specific raw controls for any of this. So it’s anyone’s guess whether changing the master level wheel or the color temp/tint sliders within the color wheels panel is doing anything different for a ProRes RAW file than doing the same adjustment for any other RGB-encoded video file. My guess is that it’s not.

In the case of RED camera files, you have to install a camera raw plug-in module in order to work with the REDCODE raw codec inside of Final Cut Pro X. There is a lot of control of the image, prior to tweaking with FCPX’s controls. However, the amount of image control for the raw file is significantly greater for a REDCODE file in Premiere Pro than inside of FCPX. Again, my suspicion is that most of these controls take effect after the conversion to RGB, regardless of whether the slider lives in a specific camera raw module or in the app’s own color correction controls. For instance, changing color temperature within the camera raw module has no correlation to the color temperature control within the app’s color correction tools. It is my belief that few of these actually adjust file data at the raw level, regardless of whether this is REDCODE or ProRes RAW. The conversion from raw to RGB is proprietary with every manufacturer.

What is missing in the ProRes RAW implementation is any control over the color science used to process the image, along with de-Bayering options. Over the years, RED has reworked/improved its color science, which theoretically means that a file recorded a few years ago can look better today (using newer color science math) than it originally did. You can select among several color science models, when you work with the REDCODE format. 

You can also opt to lower the de-Bayering resolution to 1/2, 1/4, 1/8, etc. for a RED file.  When working in a 1080p timeline, this speeds up playback performance with minimal impact on the visible resolution displayed in the viewer. For full-quality conversion, software de-Bayering also yields different results than hardware acceleration, as with the RED Rocket-X card. While this level of control is nice to have, I suspect that’s the sort of professional complication that Apple seeks to avoid.

The main benefit of ProRes RAW may be a somewhat better-quality image carried into post at a lower file size. To get the comparable RGB image quality you’d need to go up to uncompressed, ProRes 4444, or ProRes 4444 XQ – all of which become very taxing in post. Yet, for many standard productions, I doubt you’ll see that great of a difference. Nevertheless, more quality with a lower footprint will definitely be welcomed.

People will want to know whether this is a game-changer or not. On that count, probably not – at least, not until there are a number of in-camera options. If you don’t edit – and finish – with FCPX, then it’s a non-starter. If you shoot with a camera that records in a high-quality log format, like an ARRI Alexa, then you won’t see much difference in quality or workflow. And if you shoot with a RED camera, ProRes RAW actually gives you less control over your image than REDCODE does. On the other hand, it’s a definite improvement over all raw workflows that capture in image sequences. And it breathes some life into an older camera, like the Sony FS700. So, on balance, ProRes RAW is an advancement, but just not one that will affect as large a part of the industry as the rest of the ProRes family has.

(Images courtesy of Apple, FilmPlusGear, and OffHollywood.)

©2018 Oliver Peters