Everest VR and DaVinci Resolve Studio

In April of 2017, world-famous climber Ueli Steck died while preparing to climb both Mount Everest and Mount Lhotse without the use of bottled oxygen. Ueli’s close friends Jonathan Griffith and Sherpa Tenji attempted to finish the project, with director/photographer Griffith capturing the entire story. The result is the 3D VR documentary, Everest VR: Journey to the Top of the World. It was produced by Facebook’s Oculus and teased at last year’s Oculus Connect event. Post-production was completed in February and the documentary is being distributed through Oculus’ content channel.

Veteran visual effects artist Matthew DeJohn joined the team to handle end-to-end post as a producer, visual effects supervisor, and editor. DeJohn’s background spans camera, editing, and visual effects, with extensive experience in traditional visual effects, 2D-to-3D conversion, and 360 virtual reality. Before going freelance, he worked at In3, Digital Domain, Legend3D, and VRTUL.

As an editor, DeJohn was familiar with most of the usual tools, but opted to use Blackmagic’s DaVinci Resolve Studio and Fusion Studio applications as the post-production hub for the Everest VR documentary. Posting stereoscopic, 360-degree content can be quite challenging, so I took the opportunity to speak with DeJohn about using DaVinci Resolve Studio on this project.

_______________________________________________________

[OP] Please tell me a bit about your shift to DaVinci Resolve Studio as the editing tool of choice.

[MD] I have had a high comfort level with Premiere Pro and also know Final Cut Pro. Premiere has good VR tools and there’s support for it. In addition to these tools, I was using Fusion Studio in my workflow, so it was natural to look at DaVinci Resolve Studio as a way to combine my Fusion Studio work with my editorial work.

I made the switch about a year and a half ago and it simplified my workflow dramatically. It integrated a lot of different aspects all under one roof – the editorial page, the color page, the Fusion page, and the speed to work with high-res footage. From an editing perspective, all the tools I was used to are there, in what I would argue is a cleaner interface. Sometimes software just collects features over time. DaVinci Resolve Studio is early in its editorial development trajectory, but it’s still deep. Yet it doesn’t feel like it has a lot of baggage.

[OP] Stereo and VR projects can often be challenging, because of the large frame sizes. How did DaVinci Resolve Studio help you there?

[MD] Traditionally, 360 content uses a 2:1 aspect ratio – so 4K x 2K. If it’s going to be a stereoscopic 360 experience, then you stack a left-eye and a right-eye image on top of each other. It ends up being 4K x 4K square – two 4K x 2K frames stacked on top of each other. With DaVinci Resolve Studio and the graphics card I have, I can handle a 4K x 4K full online workflow. This project was to be delivered as 8K x 8K. The hardware I had wasn’t quite up to it, so I used an offline/online approach. I created 2K x 2K proxy files and then relinked to the full-resolution sources later. I just had to unlink the timeline and then reconnect it to another bin with my 8K media.
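[Editor’s note: DeJohn doesn’t say which tool generated his proxy files, so the following is only a hedged illustration of the step – a minimal Python sketch that batch-creates 2K x 2K proxies from 8K x 8K stacked masters, assuming ffmpeg is installed. The folder names and the ProRes Proxy codec choice are placeholders, not his actual pipeline.]

```python
# Minimal sketch: batch-create 2K x 2K proxies from 8K x 8K top/bottom
# stereo 360 masters. Because both eyes live in one stacked frame, a single
# scale operation downsizes them together. Assumes ffmpeg is on the PATH.
import subprocess
from pathlib import Path

SOURCE_DIR = Path("media/8k_masters")    # hypothetical folder layout
PROXY_DIR = Path("media/2k_proxies")
PROXY_DIR.mkdir(parents=True, exist_ok=True)

for src in sorted(SOURCE_DIR.glob("*.mov")):
    dst = PROXY_DIR / src.name           # same names keep relinking trivial
    subprocess.run([
        "ffmpeg", "-i", str(src),
        "-vf", "scale=2048:2048",        # 8192 -> 2048 on both axes
        "-c:v", "prores_ks", "-profile:v", "0",  # profile 0 = ProRes Proxy
        "-c:a", "copy",                  # pass any audio through untouched
        str(dst),
    ], check=True)
```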

You can cut a stereo project just looking at the image for one eye, then conform the other eye, and then combine them. I chose to cut with the stacked format. My editing was done looking at the full 360 unwrapped, but my review was done through a VR headset from the Fusion page. From there I was also able to review the stereoscopic effect on a 3D monitor. 3D monitoring can also be done on the color page, though I didn’t use that feature on this project.

[OP] I know that successful VR is equal parts production and post. And that post goes much more smoothly with a lot of planning before anyone starts. Walk me through the nuts and bolts of the camera systems and how Everest VR was tackled in post.

[MD] Jon Griffith – the director, cameraman, and alpinist – a man of many talents – utilized a number of different systems. He used the Yi Halo, which is a 17-camera circular array. Jon also used the Z CAM V1 and V1 Pro cameras. All were stereoscopic 360 camera systems.

The Yi Halo camera used the Jump cloud stitcher from Google. You upload material to that service and it produces an 8K x 8K final stitch, as well as a 2K x 2K proxy stitch. I would cut with the 2K x 2K and then conform to the 8K x 8K. That was for the earlier footage. The Jump stitcher is no longer active, so for the more recent footage Jon switched to the Z CAM systems. For those, he would run the footage through Z CAM’s WonderStitch auto-stitching application. For the final, we would either clean up any stitching artifacts in Fusion Studio or restitch in Mistika VR.

Once we had done that, we would use Fusion Studio for any rig removal and fine-tuned adjustments. No matter how good these cameras and stitching applications are, they can fail in some situations – for instance, if the subject is too close to the camera or walks between seams. There’s quite a bit of compositing/fixing that needs to be done, and Fusion Studio was used heavily for that.

[OP] Everest VR consists of three episodes ranging from just under 10 minutes to under 17 minutes. A traditional cinema film, shot conservatively, might have a 10:1 shooting ratio. How does that sort of ratio equate on a virtual reality film like this?

[MD] As far as shots captured versus used, 80-85% of the clips ended up in the final piece. That’s a pretty high figure, but Jon captured every shot for a reason, often with challenging setups – sometimes on the side of an ice waterfall. Obviously there weren’t many retakes. Of course, the running time of the raw footage would result in a much higher ratio. That’s because we had to let the cameras run for an extended period of time. It takes a while for a climber to make his way up a cliff face!

[OP] Both VR and stereo imagery present challenges in how shots are planned and edited. Not only for story and pacing, but also to keep the audience comfortable without the danger of motion-induced nausea. What was done to address those issues with Everest VR?

[MD] When it comes to framing, bear in mind there really is no frame in VR. Jon has a very good sense of what will work in a VR headset. He constructed shots that make sense for that medium, staging the action appropriately and without any camera moves. The action moved around you as the viewer. As such, the story flows and the imagery doesn’t feel slow, even though the camera doesn’t move. When they were on a cliffside, he would spend a lot of time rigging the camera system. It would be floated off the side of the cliff enough so that we could paint the rigging out. Then you just see the climber coming up next to you.

The editorial language is definitely different for 360 and stereoscopic 360. Where you might normally have shots that run for three seconds or so, our shots go for 10 to 20 seconds, so the action on-screen really matters. The cutting pace is slower, but what’s happening within the frame isn’t. During editing, we would plan from cut to cut exactly where we believed the viewer would be looking. We would make sure that as we went to the next shot, the scene would be oriented to where we wanted the viewer to look. It was really about managing the 360 hand-off between shots, so that viewers could follow the story without having to whip their heads from one side of the frame to the other to follow the action.

In some cases, like an elevation change – where someone is climbing at the top of the view and the next cut is someone climbing below – we would use audio cues. The entire piece was mixed in third-order ambisonics (16 channels), which gives you spatial awareness both around you and vertically. If the viewer was looking up, an audio cue from below would trigger them to look down at the subject for the next shot. A lot of that orchestration happens in the edit, as well as the mix.

[OP] Please explain what you mean by the orientation of the image.

[MD] The image comes out of the camera system at a fixed orientation, but based on your edit, you will likely need to change that. For the shots where we needed to adjust the XYZ orientation, we would add a PanoMap node in the Fusion page within DaVinci Resolve Studio and shift the orientation as needed. That would show up live in the edit page. This way we could change what would become the center of the view.
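[Editor’s note: to picture what that re-orientation does along one axis – in an equirectangular frame, a pure yaw rotation (spinning the view about the vertical axis) is just a horizontal pixel shift with wraparound. The toy NumPy sketch below illustrates the geometry only; it is not DeJohn’s Fusion workflow. Pitch and roll require a full spherical remap, which is what a node like PanoMap provides.]

```python
# Toy illustration: re-centering an equirectangular (lat-long) 360 frame
# in yaw is just a horizontal roll of the pixels. For a stacked stereo
# frame, the same shift applies identically to both eye images.
import numpy as np

def yaw_rotate(frame: np.ndarray, yaw_degrees: float) -> np.ndarray:
    """Shift the frame horizontally so a new heading becomes the center."""
    width = frame.shape[1]
    shift = int(round(yaw_degrees / 360.0 * width))
    return np.roll(frame, -shift, axis=1)   # wraps around the 360 seam

# Example: move the view center 90 degrees to the right on a 4K x 2K frame.
frame = np.zeros((2048, 4096, 3), dtype=np.uint8)
recentered = yaw_rotate(frame, 90.0)
```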

The biggest 3D issue is to make sure the vertical alignment is done correctly. For the most part these camera systems handled it very well, but there are usually some corrections to be made. One of these corrections is to flatten the 3D effect at the poles of the image. The stereoscopic effect requires that images be horizontally offset. There is no correct way to achieve this at the poles, because we can’t guarantee how the viewer’s head is oriented when they look at the poles. In traditional cinema, the stereo image can affect your cutting, but with our pacing, there was enough time for a viewer to re-converge their view to a different distance comfortably.

[OP] Fusion was used for some of the visual effects, but when do you simply use the integrated Fusion page within DaVinci Resolve Studio versus a standalone version of the Fusion Studio application?

[MD] All of the orientation was handled by me during the edit by using the integrated Fusion page within DaVinci Resolve Studio. Some simple touch-ups, like painting out tripods, were also done in the Fusion page. There are some graphics that show the elevation of Everest or the climbers’ paths. These were all animated in the Fusion page and then they showed up live in the timeline. This way, changes and quick tweaks were easy to do and they updated in real-time.

We used the standalone version of Fusion Studio for some of the more complex stitches and for fixing shots. Fusion Studio is used a lot in the visual effects industry, because of its scriptability, speed, and extensive toolset. Keith Kolod was the compositor/stitcher for those shots. I sent him the files to work on in the standalone version of Fusion Studio. This work was a bit heavier and would take longer to render. He would send those back and I would cut those into the timeline as a finished file.

[OP] Since DaVinci Resolve Studio is an all-in-one tool covering edit, effects, color, and audio, how did you approach audio post and the color grade?

[MD] The initial audio editing was done in the edit and Fairlight pages of DaVinci Resolve Studio. I cut in all of the temp sounds and music tracks to get the bone structure in place. The Fairlight page allowed me to get in deeper than a normal edit application would. Jon recorded multiple takes for his narration lines. I would stack those on the Fairlight page as audio layers and audition different takes very quickly just by rearranging the layers. Once I had the take I liked, I left the others there so I could always go back to them – but only the top layer is active.

After that, I made a Pro Tools turnover package for Brendan Hogan and his team at Impossible Acoustic. They did the final mix in Pro Tools, because there are some specific built-in tools for 3D ambisonic audio. They took the bones, added a lot of Foley, and did a much better job of the final mix than I ever could.

I worked on the color correction myself. The way this piece was shot, you only had one opportunity to get up the mountain. At least on the actual Everest climb, there aren’t a lot of takes. I ended up doing color right from the beginning, just to make sure the color matched for all of those different cameras. Each had a different color response and log curve. I wanted to get a base grade from the very beginning just to make sure the snow looked the same from shot to shot. By the time we got to the end, there were very minimal changes to the color. It was mainly to make sure that the grade we had done while looking at Rec. 709 monitoring translated correctly to the headset, because the black levels are a bit different in the headsets.

[OP] In the end, were you 100% satisfied with the results?

[MD] Jon and Oculus held us to a high standard with regard to the stitch and the rig removals. As a visual effects guy, there’s always something, if you look really hard! (laughs) Every single shot is a visual effects shot in a show like this. The tripod always has to be painted out. The cameraman always needs to be painted out if he didn’t hide well enough.

The Yi Halo doesn’t actually capture the bottom 40 degrees out of the full 360. You have to make up that bottom part with matte painting to complete the 360. Jon shot reference photos and we used those in some cases. There is a lot of extra material in a 360 shot, so it’s all about doing a really nice clone paint job within Fusion Studio or the Fusion page of DaVinci Resolve Studio to complete the 360.

Overall, as compared with all the other live-action VR experiences I’ve seen, the quality of this piece is among the very best. Jon’s shooting style, his drive for a flawless experience, the tools we used, and the skill of all those involved helped make this project a success.

This article was originally written for Creative Planet Network.

©2020 Oliver Peters

DaVinci Resolve Editor Keyboard

Blackmagic Design doubled down on advanced editing features in 2019 by introducing a new editing mode to DaVinci Resolve 16 called the cut page. They also added a dedicated editor’s keyboard – something that warms the heart of any editor who started their career in a linear edit suite. After some post-NAB feedback and adjustment, the keyboard is finally ready for prime time, running with DaVinci Resolve 16.1 (currently in public beta) or later.

Blackmagic Design’s Grant Petty comes from a broadcast engineering background and knows how fast tape editing was with the right controller. Speed is lost using a mouse-centric, drag-and-drop approach, so the DaVinci Resolve keyboard is designed to put speed back into modern edit workflows. Blackmagic Design was kind enough to loan me a keyboard for a couple of weeks of testing for this review.

Hardware design

The keyboard is very reminiscent of Sony’s BVE keyboards of the past. That’s not simply cosmetic – there are a number of plastic editing keyboards with a shuttle knob – it’s about precision engineering. The DaVinci Resolve search dial (jog/shuttle/scroll wheel) truly feels like it has the same type of ballistics and tactile feedback that a Sony dial gave you. The DaVinci Resolve keyboard is built into a sturdy metal case with keycaps designed to take some pounding. Blackmagic intends for the keyboard to last and will offer replacement parts as needed. In short, don’t think of this as a product you’ll have to toss out in a few years.

The keyboard connects via USB-C, but it also worked on the USB 3.0 connection of a two-year-old iMac and MacBook Pro using a USB-A to USB-C cable. The back of the keyboard includes two additional USB-A ports for a thumb drive, mouse, or a DaVinci Resolve license key (“dongle”). The keyboard is wider than a standard extended keyboard due to dedicated edit keys on the left and the search dial on the right. It has a replaceable wrist rest on the front edge and adjustable feet to elevate the keyboard angle.

The Cut Page

The Editor Keyboard is optimized for the cut and edit pages. It does work as a standard keyboard in the color, Fairlight, and Fusion pages. However, I found the dial operation in those modes to be rather finicky. Outside of DaVinci Resolve, it’s a generic QWERTY keyboard, but the special edit keys and dial will not work with other editing software.

It’s hard to talk about the keyboard without delving into the cut page. While the keyboard works effectively and correctly in the edit page, you’ll still find yourself needing the mouse, which defeats the purpose. In short, the design motivation is fast editing where your hands never leave the keyboard. That ideal plays out best in the cut page and the two have been developed in tandem.

While the DaVinci Resolve cut page shares many similarities with Apple’s Final Cut Pro X, Blackmagic Design software engineers added a number of unique functions that improve editing speed. The best of these is the source tape view. The bin can be sorted into timecode, camera, duration, or name order using dedicated keys and then viewed as if from a single source – essentially a virtual string-out. You can scroll through the footage with the search dial as effortlessly as with the FCPX skimming function. Large, dedicated buttons for source and timeline, in and out, and sort methods make for easy navigation and quick assembly. Smart edit and special function buttons, such as the unique “close-up” button (which automatically does a basic punch-in on high-res footage), round out the picture.

The cut page itself has a number of other unique features that are beyond the scope of this article. Nevertheless, one unique tool that is worth mentioning is the dual timeline view. The timeline pane is divided into a top mini-display of the full timeline, while the lower area always shows the zoomed-in section of the timeline at the current time indicator (cursor). You never have to zoom in and zoom out to navigate your timeline. The search dial makes it a breeze to quickly scroll through the full timeline (top) and then hit the jog key to zero in on the frame you want (bottom).

Trimming is where the dial shines. Dedicated keys quickly select in-point, out-point, roll, slip, or slide trimming. Simply hit the key and DaVinci Resolve automatically jumps to the nearest cut point. Then use the search dial for the rest. As you adjust the head or tail of a cut, the rest of the timeline ripples accordingly. It’s one of the best trim models of any NLE.

Some additional thoughts

I do have a few quibbles. Trim functions in the cut and edit pages are inconsistent with each other. The cut page uses a similar model to FCPX, where audio and video from the clip are combined into a single timeline clip rather than on separate tracks. Unfortunately, Blackmagic Design has yet to implement a way to expand a/v clips and perform L-cut or J-cut trimming on the cut page. You’ll have to shift to the edit page to perform those.

This is a right-handed device, so left-handed editors will have the same dilemma that left-handed guitar players encounter. In addition, these are imprinted keycaps based on DaVinci Resolve’s default keyboard map. If you use a custom layout or one of the other keyboard maps that DaVinci Resolve offers, then the QWERTY command portion of the keyboard becomes less useful.

The search dial will not override the J-K-L or the space bar play commands. In order to jog once the sequence is playing, you must first hit the K key or the space bar to stop playback before you can properly jog through frames. Otherwise, playback continues the minute you let go of the dial.

Conclusion

This keyboard is addictive. But is its $995 (USD) price tag justified? That’s steep, but many plastic gaming keyboards run up to $200 and some even $500 – without any extra pointers, dials, or keys. I’ve also found precision metal keyboards with force-sensitive pointers priced as high as $3,000. Given that, Blackmagic Design may be in the right ballpark. Just like control surfaces for grading or mixing, this keyboard isn’t for everyone. If you are already a fast, keyboard-oriented editor, then the DaVinci Resolve Editor Keyboard may not make you faster. Likewise, a Final Cut Pro X editor who flies by skimming with a mouse is also going to have a hard time justifying the expense, not to mention a shift to a different application.

This keyboard is designed for DaVinci Resolve editors and not colorists. It’s for facilities that intend to deploy DaVinci Resolve as their full-time editing application. I could easily see DaVinci Resolve and this keyboard used in a fast turnaround edit environment, like broadcast news. Under that scenario, it will certainly enhance speed and workflow, especially for editors who want to make the most out of the new cut page.

Originally written for RedShark News.

Be sure to also check out Scott Simmons’ review at ProVideoCoalition.

©2019 Oliver Peters

NAB Show 2019

This year the NAB Show seemed to emphasize its roots – the “B” in National Association of Broadcasters. Gone or barely visible were the fads of past years, such as stereoscopic 3D, 360-degree video, virtual/augmented reality, drones, etc. Not that these are gone – they have merely refocused on the smaller segment of market share that reflects reality. There’s not much point in promoting stereo 3D at NAB if most of the industry goes ‘meh’.

Big exhibitors of the past, like Quantel, RED, Apple, and Autodesk, are gone from the floor. Quantel products remain as part of Grass Valley (now owned by Belden), which is the consolidation of Grass Valley Group, Quantel, Snell & Wilcox, and Philips. RED decided last year that small, camera-centric shows were better venues. Apple – well, they haven’t been on the main floor for years, but even this year, there was no off-site, Final Cut Pro X stealth presence in a hotel suite somewhere. Autodesk, which shifted to a subscription model a couple of years ago, had a demo suite in the nearby Renaissance Hotel, focusing on its hero product, Flame 2020. Smoke for Mac users – tough luck. It’s been over for years.

This was a nuts-and-bolts year, with many exhibits showing new infrastructure products. These appeal to larger customers, such as broadcasters and network facilities. Specifically, the industry is shifting to an IP-based infrastructure for signal routing, control, and transmission. This replaces the copper and fiber wiring of the past, along with the devices (routers, video switchers, etc.) at either end of the wire. Companies that might have appeared less relevant, like Grass Valley, are back in a strong sales position. Other companies, like Blackmagic Design, are being encouraged by their larger clients to fulfill those needs. And as ever, consolidation continues – this year VizRT acquired NewTek, an early player in video-over-IP with its proprietary NDI protocol.

Adobe

The NAB season unofficially started with Adobe’s pre-NAB release of the CC2019 update. For editors and designers, the hallmarks of this update include a new freeform bin window view and adjustable guides in Premiere Pro, plus content-aware video fill in After Effects. These are solid additions in response to customer requests, which is something Adobe has focused on. A smaller, but no less important, feature is Adobe’s ongoing effort to improve media performance on the Mac platform.

As in past years, their NAB booth was an opportunity to present these new features in-depth, as well as showcase speakers who use Adobe products for editing, sound, and design. Part of the editing team from the series Atlanta was on hand to discuss the team’s use of Premiere Pro and After Effects in their ‘editing crash pad’.

Avid

For many attendees, NAB actually kicked off on the weekend with Avid Connect, a gathering of Avid users (through the Avid Customer Association), featuring meet-and-greets, workshops, presentations, and ACA leadership committee meetings. While past product announcements at Connect have been subdued from the vantage of Media Composer editors, this year held a major surprise. Avid revealed its Media Composer 2019.5 update (scheduled for release at the end of May) as part of a host of other updates. Most of these apply to companies that have invested in the full Avid ecosystem, including Nexis storage and Media Central asset management. While those are superb, they only apply to a small percentage of the market. Let’s not forget Avid’s huge presence in the audio world, thanks to the dominance of Pro Tools – now with Dolby Atmos support. With the acquisition of Euphonix years back, Avid has become a significant player in the live and studio sound arena. Various examples of its S-series consoles in action were presented.

Since I focus on editing, let me discuss Media Composer a bit more. The 2019.5 refresh is the first major Media Composer overhaul in years. It started in secret last year. 2019.5 is the first iteration of the new UI, with more to be updated in coming releases. In short, the interface has been modernized and streamlined in ways that attract newer, younger users without alienating established editors. Its panel design is similar to Adobe’s approach – i.e. interface panels can be docked, floated, stacked, or tabbed. Panels that you don’t want to see may be closed or simply slid to the side and hidden. Need to see a hidden panel again? Simply slide it back open from the edge of the screen.

This isn’t just a new skin. Avid has overhauled the internal video pipeline with 32-bit floating point color and an uncompressed DNx codec. Project formats now support up to 16K. Avid is also compliant with the specs of the Netflix Post Alliance and the ACES logo program.

I found the new version very easy to use and a welcome change; however, it will require some adaptation if you’ve been using Media Composer for a long time. In a nod to the Media Composer heritage, the weightlifter (aka ‘liftman’) and scissors icons (for lift and extract edits) are back. Even though Media Composer 2019.5 is just in early beta testing, Avid felt good enough about it to use this version in its workshops, presentations, and stage demos.

One of the reasons to go to NAB is for the in-person presentations by top editors about their real-world experiences. No one can top Avid at this game, since they can easily tap a host of Oscar, Emmy, BAFTA, and Eddie award winners. The highlight for many this year was the presentation at Avid Connect and/or at the show by the Oscar-winning picture and sound editing/mixing team for Bohemian Rhapsody. It’s hard not to gather a standing-room-only crowd when you close your talk with the Live Aid finale sequence played in kick-ass surround!

Blackmagic Design

Attendees and worldwide observers have come to expect a surprise NAB product announcement out of Grant Petty each year, and he certainly didn’t disappoint this time. Before I get into that, there were quite a few products released, including for IP infrastructures, 8K production and post, and more. Blackmagic is a full-spectrum video and audio manufacturer that long ago moved into the ‘big leagues’. This means that, just like Avid or Grass Valley, they have to respond to pressure from large users to develop products designed around their specific workflow needs. In the BMD booth, many of those development fruits were on display, like the new HyperDeck Extreme 8K HDR recorder and the ATEM Constellation 8K switcher.

The big reveal for editors was DaVinci Resolve 16. Blackmagic has steadily been moving into the editorial space with this all-in-one, edit/color/mix/effects/finishing application. If you have no business requirement for – or emotional attachment to – one of the other NLE brands, then Resolve (free) or Resolve Studio (paid) is an absolute no-brainer. Nothing can touch the combined power of Resolve’s feature set.

New for Resolve 16 is an additional editorial module called the Cut Page. At first blush, the design, layout, and operation are amazingly similar to Apple’s Final Cut Pro X. Blackmagic’s intent is to make a fast editor where you can start and end a time-sensitive project without the complexities of the Edit Page. However, it’s just another tool, so you could work entirely in the Cut Page, start in the Cut Page and refine your timeline in the Edit Page, or skip the Cut Page altogether. Resolve offers a buffet of post tools that are at your disposal.

While Resolve 16’s Cut Page does elicit a chuckle from experienced FCPX users, it offers some new twists. For example, there’s a two-level timeline view – the top section is the full-length timeline and the bottom section is the zoomed-in detail view. The intent is quick navigation without the need to constantly zoom in and out of long timelines. There’s also an automatic sync detection function. Let’s say you are cutting a two-camera show. Drop the A-camera clips onto the timeline and then go through your B-camera footage. Find a cut-away shot, mark in/out on the source, and edit. It will ‘automagically’ edit to the in-sync location on the timeline. I presume this is matched by either common sound or timecode. I’ll have to see how this works in practice, but it demos nicely. Changes to other aspects of Resolve were minor and evolutionary, except for one other notable feature: the Color Page added its own version of content-aware video fill.

Another editorial product addition – tied to the theme of faster, more efficient editing – was a new edit keyboard. Anyone who ever cut in the linear days – especially those who ran Sony BVE9000/9100 controllers – will feel very nostalgic. It’s a robust keyboard with a high-quality, integrated jog/shuttle knob. The feel is very much like controlling a tape deck in a linear system, with fast shuttle response and precise jogging. The precision is far better than any of the USB controllers, like a Contour Shuttle. Whether or not enough people will have interest in shelling out $1,025 for it remains to be seen. It’s a great tool, but are you really faster with one than with FCPX’s skimming and a standard keyboard and mouse?

Ironically, if you looked around the Blackmagic Design booth, there did seem to be a nostalgic homage to Sony hardware of the past. As I said, the edit keyboard is very close to a BVE9100 keyboard. Even the style of the control panel on the HyperDecks – and the look of the name badges on those panels – is very much Sony’s style. As humans, this appeals to our desire for something other than the glass interfaces we’ve been dealing with for the past few years. Michael Cioni (Panavision, Light Iron) called this ‘tactile attraction’ in his excellent Faster Together Stage talk. It manifests itself not only in these types of control surfaces, but also in skeuomorphic designs applied to audio filter interfaces, or in the emotion created in the viewer when a colorist adds film grain to digital footage.

Maybe Grant is right and these methods are really faster in a pressure-filled production environment. Or maybe this is simply an effort to appeal to emotion and nostalgia by Blackmagic’s designers. (Check out Grant Petty’s two-hour 2019 Product Overview for more in-depth information on Blackmagic Design’s new products.)

8K

I won’t spill a lot of words on 8K. It seems kind of silly when most delivery is HD, and even SD in some places. A lot of today’s production is in 4K, but really only for future-proofing. But the industry has to sell newer and flashier items, so it has moved on to 8K pixel resolution (7680 x 4320). Much of this is driven by Japanese broadcasters and manufacturers, who are pushing into 8K. You can laugh or roll your eyes, but NAB had many examples of 8K production tools (cameras and recorders) and display systems. Of course, it’s NAB, making it hard to tell how many of these are only prototypes and not yet ready for actual production and delivery.

For now, it’s still a 4K game, with plenty of mainstream product – not only cameras and NLEs, but items like AJA’s Ki Pro family. The Ki Pro Ultra Plus records up to four channels of HD or one channel of 4K in ProRes or DNx. The newest member of the family is the Ki Pro GO, which records up to four channels of HD (25Mbps H.264) onto removable USB media.

Of course, the industry never stops, so while we are working with HD and 4K, and looking at 8K, the developers are planning ahead for 16K. As I mentioned, Avid already has project presets built-in for 16K projects. Yikes!

HDR

HDR – or high dynamic range – is about where it was last year. There are basically four formats vying to become the final standard used in all production, post, and display systems. While there are several frontrunners and edicts from distributors to deliver HDR-compatible masters, there still is no clear path. If you shoot in log or camera raw with nearly any professional camera produced within the past decade, you have originated footage that is HDR-compatible. But none of the low-cost post solutions make this easy, and without the right monitoring environment, you are wasting your time. If anything, those waters are muddier this year. There were a number of HDR displays throughout the show, but there were also a few labelled as using HDR simulation. I saw a couple of those at TV Logic. Yes, they looked gorgeous, and yes, they were receiving an HDR signal. I found out that the ‘simulation’ part of the description meant that the display was bright (up to 350 nits), but not bright enough to qualify as ‘true’ HDR (1,000 nits or higher).

As in past transitions, we are certainly going to have to rely on some ‘glue’ products. For me, that’s AJA again. Through their relationship with Colorfront, AJA offers two HDR products: the HDR Image Analyzer and the FS-HDR converter. The latter was introduced last year as a real-time frame synchronizer and color converter to go between SDR and HDR display standards. The new Analyzer is designed to evaluate color space and gamut compliance. Just remember, no computer display can properly show you HDR, so if you need to post and deliver HDR, proper monitoring and analysis tools are essential.

Cameras

I’m not a cinematographer, but I do keep up with cameras. Nearly all of this year’s camera developments were evolutionary: new LF (large format sensor) cameras (ARRI), 4K camcorders (Sharp, JVC), and a full-frame mirrorless camera from Nikon (with ProRes RAW recording coming in a future firmware update). Most of the developments were targeted at live broadcast production, like sports and megachurches. Ikegami had an 8K camera to show, but their real focus was on 4K and IP camera control.

RED, a big player in the cinema space, was only present in a smaller demo room, so you couldn’t easily compare their 8K imagery against others on the floor. And let’s not forget Sony and Panasonic. While ARRI has been a favorite, due to the ‘look’ of the Alexa, Sony (Venice) and Panasonic (VariCam and now EVA-1) are also well-respected digital cinema tools that create outstanding images. For example, Sony’s booth featured an amazing, theater-sized 8K micro-pixel LED display system. Some of the sample material shown was of the Rio Carnival, shot with anamorphic lenses on a 6K full-frame Sony Venice camera. Simply stunning.

Finally, let’s not forget Canon’s line-up of cinema cameras, from the C100 to the C700 FF. To complement these, Canon introduced its new line of Sumire Prime lenses at the show. The C300 has been a staple of documentary films, including the Oscar-winning film, Free Solo, which I had the pleasure of watching on the flight to Las Vegas. Sweaty palms the whole way. It must have looked awesome in IMAX!

(For more on RED, cameras, and lenses at NAB, check out this thread from DP Phil Holland.)

It’s a wrap

In short, NAB 2019 had plenty for everyone, including smaller markets, like products for education seminars. One that I ran across was Cinamaker. They were demonstrating a complete multi-camera set-up using four iPhones and an iPad. The iPhones are the cameras (additional iPhones can be used as isolated sound recorders) and the iPad is the ‘switcher/control room’. The set-up can be wired or wireless, but camera control, video switching, and recording are all handled at the iPad. This can generate the final product, or the project can be transferred to a Mac (with the line cut and camera iso media, plus an edit list) for re-editing/refinement in Final Cut Pro X. Not too shabby, given the market that Cinamaker is striving to address.

For those of us who like to use the NAB Show exhibit floor as a miniature yardstick for the industry, one of the trends to watch is what type of gear is used in the booths and press areas. Specifically, one NLE over another, or one hardware platform versus the other. On that front, I saw plenty of Premiere Pro, along with some Final Cut Pro X. Hardware-wise, it looked like Apple versus HP. Granted, PC vendors, like HP, often supply gear to use in the booths as a form of sponsorship, so take this with a grain of salt. Nevertheless, I would guess that I saw more iMac Pros than any other single computer. For PCs, it was a mix of HP Z4, Z6, and Z8 workstations. HP and AMD were partner-sponsors of Avid Connect and they demoed very compelling set-ups with these Z-series units configured with AMD Radeon cards. These are very powerful workstations for editing, grading, mixing, and graphics.

©2019 Oliver Peters

The State of the NLE 2019

It’s a new year, but that doesn’t mean the editing software landscape will change drastically in the coming months. For all intents and purposes, professional editing options boil down to four choices: Avid Media Composer, Adobe Premiere Pro, Apple Final Cut Pro X, and Blackmagic Design DaVinci Resolve. Yes, I know Vegas, Lightworks, Edius, and others are still out there, but those are far off on the radar by comparison (no offense meant to any happy practitioners of these tools). Naturally, since blogs are mainly about opinions, everything I say from here on is purely conjecture – although it’s informed by my own experiences with these tools and my knowing many of the players involved on the respective product design and management teams, past and present.

Avid continues to be the go-to NLE in the feature film and episodic television world. That’s certainly a niche, but it’s a niche that determines the tools developed by designers for the broader scope of video editing. Apple officially noted two million users for Final Cut Pro X last year and I’m sure it’s likely to be at least 2.5M by now. Adobe claims Premiere Pro to be the most widely used NLE by a large margin. I have no reason to doubt that statement, but I have also never seen any actual stats. I’m sure through the Creative Cloud subscription mechanism Adobe not only knows how many Premiere Pro installations have been downloaded, but probably has a good idea as to actual usage (as opposed to simply downloading the software). Bringing up the rear in this quartet is Resolve. While certainly a dominant color correction application, I don’t yet see it as a key player in the creative editing (as opposed to finishing) space. With the stage set, let’s take a closer look.

Avid Media Composer

Editors who have moved away from Media Composer, or who have never used it, like to throw shade on Avid and its marquee product. But loyal users – who include some of the biggest names in film editing – stick by it due in part to familiarity, but also to its collaborative features and overall stability. As a result, the development pace and rate of change are somewhat slow compared with the other three. In spite of that, Avid is currently on a schedule of solid, incremental updates nearly every month – each of which chips away at a long feature request list. The most recent one dropped on December 31st. Making significant changes without destroying the things that people love is a difficult task. Development pace is also hindered by the fact that each of these developers is also chasing changes in the operating system, particularly Apple and macOS. Sometimes you get the feeling that it’s two steps forward, one step back.

As editors, we focus on Media Composer, but Avid is a much bigger company than just that, with its fingers in sound, broadcast, storage, cloud, and media management. If you are a Pro Tools user, you are just as concerned about Avid’s commitment to you as editors are. Like any large company, Avid must advance not just a single core product, but its whole ecosystem of products. Yet it still must push the features in these products forward, because that’s what gets users’ attention. In an effort to improve its attraction to new users, Avid has introduced subscription plans and free versions to make it easier to get started. They now cover editing and sound needs with a lower cost of entry than ever before.

I started nonlinear editing with Avid and it will always hold a spot in my heart. Truth be told, I use it much less these days. However, I still maintain current versions for the occasional project need plus compatibility with incoming projects. I often find that Media Composer is the single best NLE for certain tasks, mainly because of Avid’s legacy with broadcast. This includes issues like proper treatment of interlaced media and closed captioning. So for many reasons, I don’t see Avid going away any time soon, but whether or not they can grow their base remains an unknown. Fortunately many film and media schools emphasize Avid when they teach editing. If you know Media Composer, it’s an easy jump to any other editing tool.

Adobe Premiere Pro CC

The most widely used NLE? At least from what I can see around me, it’s the most used NLE in my market, including individual editors, corporate media departments, and broadcasters. Its attraction comes from a) the versatility in editing with a wide range of native media formats, and b) the similarity to – and viable replacement for – Final Cut Pro “legacy”. It picked up steam partly as a reaction to the Final Cut Pro X roll-out and users have generally been happy with that choice. While the shift by Adobe to a pure subscription model has been a roadblock for some (who stopped at CS6), it’s also been an advantage for others. I handle the software updates at a production company with nine edit systems and between the Adobe Creative Cloud and Apple Mac App Store applications, upgrades have never been easier.

A big criticism of Adobe has been Premiere’s stability. Of course, that’s based on forum reads, where people who have had problems will pipe up. Rarely does anyone ever post how uneventful their experience has been. I personally don’t find Premiere Pro to be any less stable than any other NLE or application. Nonetheless, working with a mix of oddball native media will certainly tax your system. Avid and Apple get around this by pushing optimized and proxy media. As such, editors reap the benefits of stability. And the same is true with Premiere. Working with consistent, optimized media formats (transcoded in advance) – or working with Adobe’s own proxies – results in a more stable project and a better editing experience.

Avid Media Composer is the dominant editing tool in major markets, but mainly in the long-form entertainment media space. Many of the top trailer and commercial edit shops in those same markets use Premiere Pro. Again, that goes back to the FCP7-to-Premiere Pro shift. Many of these companies had been using the old Final Cut rather than Media Composer. Since some of these top editors also cut features and documentaries, you’ll often see them use Premiere on the features that they cut, too. Once you get below the top tier of studio films and larger broadcast network TV shows, Premiere Pro has a much wider representation. That certainly is good news for Adobe and something for Avid to worry about.

Another criticism is that of Adobe’s development pace. Some users believed that moving to a subscription model would speed the development pace of new versions – independent of annual or semi-annual cycles. Yet cycles still persist – much to the disappointment of those users. This gets down to how software is actually developed, keeping up with OS changes, and to some degree, marketing cycles. For example, if there’s a big Photoshop update, then it’s possible that the marketing “wow” value of a large Premiere Pro update might be overshadowed and needs to wait. Not ideal, but that’s the way it is.

Just because it’s possible doesn’t mean that users really want to constantly deal with automatic software updates that they have to keep track of. This is especially true with After Effects and Premiere Pro, where old project files often have to be updated once you update the application – and those updates are not backwards compatible. Personally, I’m happy to restrict that need to a couple of times a year.

Users have the fear that a manufacturer is going to end-of-life their favorite application at some point. For video users, this was made all too apparent by Apple and FCPX. Neither Apple nor Adobe has been exempt from killing off products that no longer fit their plans. Markets and user demands shift. Photography is an obvious example here. In recent years, smart phones have become the dominant photographic device, which has enabled cloud-syncing and storage of photos. Adobe and Apple have both shifted the focus for their photo products accordingly. If you follow any of the photo blogs, you’ll know there’s some concern that Adobe Lightroom Classic (the desktop version) will eventually give way completely to Lightroom CC (the cloud version). When a company names something as “classic”, you have to wonder how long it will be supported.

If we apply that logic to Premiere Pro, then the new Adobe Rush comes to mind. Rush is a simpler, nimbler, cross-platform/cross-device NLE targeted at users who produce video starting with their smart phone or tablet. Since there’s also a desktop version, one could certainly surmise that in the future Rush might replace Premiere Pro in the same way that FCPX replaced FCP7. Personally, I don’t think that will happen any time soon. Adobe treats certain software as core products – Photoshop, Illustrator, and After Effects are such products. Premiere Pro may or may not be viewed that way internally, but certainly more so now than ever in the past. Premiere Pro is being positioned as a “hub” application with connections to companion products, like Prelude and Audition. For now, Rush is simply an interesting offshoot to address a burgeoning market. It’s Adobe’s second NLE, not a replacement. But time will tell.

Apple Final Cut Pro X

Apple released Final Cut Pro X in the summer of 2011 – going on eight years now. It’s a versatile, professional tool that has improved greatly since that 2011 launch and gained a large and loyal fan base. Many FCPX users are also Premiere Pro users and the other way around. It can be used to cut nearly any type of project, but the interface design is different from the others, making it an acquired taste. Being a Mac-only product and developed within the same company that makes the hardware and OS, FCPX is optimized to run on Macs more so than any cross-platform product can be. For example, the fluidity of dealing with 4K ProRes media on even older Macs surpasses that of any other NLE.

Prognosticating Apple’s future plans is a fool’s errand. Some guesses have put the estimated lifespan of FCPX at 10 years, based in part on the lifespan of FCP “legacy”. I have no idea whether that’s true or not. Often when I read interviews with key Apple management (as well as off-the-record, casual discussions I’ve had with people I know on the inside), it seems like a company that actually has less of a concrete plan when it comes to “pro” users. Instead, it often appears to approach them with an attitude of “let’s throw something against the wall and see what sticks”. The 2013 Mac Pro is a striking example of this. It was clearly innovative and a stellar exhibit for Apple’s “think different” mantra, yet it was a product that obviously was not designed by actually speaking with that product’s target users. Apple’s current “shunning” of Nvidia hardware seems like another example.

One has to ask whether a company so dominated by the iPhone is still agile enough to respond to the niche market of professional video editors. While Apple products (hardware and software) still appeal to creatives and video professionals, it seems like the focus with FCPX is towards the much broader sphere of pro video. Not TV shows and feature films (although that’s great when it comes) – or even high-end commercials and trailers – but rather the world of streaming channels, social media influencers, and traditional publishers who have shifted to an online media presence from a print legacy. These segments of the market have a broad range of needs. After all, so-called “YouTube stars” shoot with everything from low-end cameras and smart phones all the way up to Alexas and REDs. Such users are equally professional in their need to deliver a quality product on a timetable, and I believe that’s the part of the market that Apple seeks to address with FCPX.

If you are in the world of the more traditional post facility or production company, then those users listed above may be market segments that you don’t see or possibly even look down upon. I would theorize that among the more traditional sectors, FCPX may have largely made the inroads that it’s going to. Its use in films and TV shows (with the exception of certain high-profile, international examples) doesn’t seem to be growing, but I could be wrong. Maybe the marketing is just behind or it no longer has PR value. Regardless, I do see FCPX as continuing strong as a product. Even if it’s not your primary tool, it should be something in your toolkit. Apple’s moves to open up ProRes encoding and offering LumaForge and Blackmagic eGPU products in their online store are further examples that the pro customer (in whatever way you define “pro”) continues to have value to them. That’s a good thing for our industry.

Blackmagic Design DaVinci Resolve

No one seems to match the development pace of Blackmagic Design. DaVinci Resolve underwent a wholesale transformation from a tool that was mainly a high-end color corrector into an all-purpose editing application. Add to this the fact that Blackmagic has acquired and integrated a number of companies, whose tools have been modernized and folded into Resolve. Blackmagic now offers a post-production solution with some similarities to FCPX, while retaining a traditional, track-based interface. It includes modes for advanced audio post (Fairlight) and visual effects (Fusion) that have been adapted from those acquisitions. Unlike past all-in-one applications, Resolve’s modal pages retain the design and workflow specific to the task at hand, rather than making them fit into the editing application’s interface design. All of this has happened in very short order and across three operating systems, making Blackmagic’s pace the envy of the industry.

But a fast development pace doesn’t always translate into a winning product. In my experience each version update has been relatively solid. There are four ways to get Resolve (free and paid, Mac App Store and reseller). That makes it a no-brainer for anyone starting out in video editing, but who doesn’t have the specific requirement for one application over another. I have to wonder though, how many new users go deep into the product. If you only edit, there’s no real need to tap into the Fusion, Fairlight, or color correction pages. Do Resolve editors want to finish audio in Fairlight or would they rather hand off the audio post and mix to a specialist who will probably be using Pro Tools? The nice thing about Resolve is that you can go as deep as you like – or not – depending on your mindset, capabilities, and needs.

On the other hand, is the all-in-one approach better than the alternatives: Media Composer/Pro Tools, Premiere Pro/After Effects/Audition, or Final Cut Pro X/Motion/Logic Pro X? I don’t mean for the user, but rather the developer. Does the all-in-one solution give you the best product? The standalone version of Fusion is more full-featured than the Fusion page in Resolve. Fusion users are rightly concerned that the standalone will go away, leaving them with a smaller subset of those tools. I would argue that there are already unnecessary overlaps in effects and features between the pages. So are you really getting the best editor or is it being compromised by the all-in-one approach? I don’t know the answer to these questions. Resolve for me is a good color correction/grading application that can also work for my finishing needs (although I still prefer to edit in something else and roundtrip to/from Resolve). It’s also a great option for the casual editor who wants a free tool. Yet in spite of all its benefits, I believe Resolve will still be a distant fourth in the NLE world, at least for the next year.

The good news is that there are four great editing options in the lead and even more coming from behind. There are no bad choices and with a lower cost than ever, there’s no reason to limit your knowledge to only one. After all, the products that are on top now may be gone in a decade. So broaden your knowledge and define your skills by your craft – not your tools!

©2019 Oliver Peters

Edit Collaboration and Best Practices

There are many workflows that involve collaboration, with multiple editors and designers working on the same large project or group of projects. Let me say up front that if you want the best possible collaborative experience with multiple editors, then work with Avid Media Composer. Full stop. I have worked both sides of the equation and, without a doubt, Media Composer connected to Avid Unity/ISIS/NEXIS shared storage is simply not matched by Final Cut Pro, Final Cut Pro X, Premiere Pro, or any other editing software/storage/cloud combination. Everything else is a compromise, which is why feature film and TV series editorial teams continue to select Avid solutions as their first choice.

In spite of that, there are many reasons to use other editing tools. I work most of the time in Adobe Premiere Pro CC and freelance at a shop with nine edit workstations connected to shared storage. We work mainly in Adobe Creative Cloud applications and our projects involve a lot of collaboration. Some of these are corporate videos that are frequently edited and revised by different editors. Some are entertainment shows, cut by a small editorial team focused on those shows. For some projects, Premiere Pro is the perfect tool. For others, we have to develop strategies to adapt Premiere to our workflow.

With that in mind, the following are tips and best practices that I’ll share for what has worked best for us over the past three years, while working on large projects with a team of editors. Although it applies to our work with Premiere Pro, the same would generally be true if we were working with Apple Final Cut Pro X instead.

Organization. We organize all projects into a specific folder structure, using a Post Haste template. All media files – camera footage, audio, graphic elements, etc. – go into common folders, so editors know where to look to find things. When new camera footage comes in, files are organized as “dailies” into specific folders by date, camera, and camera card. Non-pro formats, like GoPro and DSLR footage, will be batch-renamed to reflect the project, date, and camera card (see the sketch below). The objective is to have unique file names for each and every media file.
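A hedged sketch of that renaming step in Python – the naming convention and folder layout here are invented for illustration, not our shop’s actual script:

```python
# Hypothetical sketch of the batch-rename step: give every file on a camera
# card a unique name of the form PROJECT_DATE_CARD_NNNN.ext.
from pathlib import Path

def rename_card(card_dir: Path, project: str, date: str, card: str) -> None:
    """Rename all media files on one camera card with a unique prefix."""
    for i, f in enumerate(sorted(card_dir.iterdir()), start=1):
        if f.is_file():
            new_name = f"{project}_{date}_{card}_{i:04d}{f.suffix.lower()}"
            f.rename(f.with_name(new_name))

# Example: GoPro card A001 from a 2018-03-14 shoot day.
rename_card(Path("dailies/20180314/A001"), "PROJECT", "20180314", "A001")
```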

Optimized, transcoded, or proxy media. Depending on the performance and amount of media, you may need to do some prep work before even starting the edit process. Premiere and FCPX work well with some media formats and not with others. NAS/SAN storage is particularly taxing, especially once you get to resolutions greater than HD. If you want the most fluid experience in a shared workflow, then you will likely need to transcode proxy files from within the application. The reason to stay inside of FCPX or Premiere Pro is so that frame size offsets are properly tracked. Once proxies have been transcoded, it’s a simple matter of toggling between the proxy media (best playback performance) and full-resolution media (best image quality).

On the other hand, if you’d rather stick to full-resolution, native media, then some formats will have to be transcoded into “optimized” media. For instance, GoPro 4K footage is terrible to edit with natively. It should always be transcoded to ProRes or DNxHD before editing, if you don’t want to go the proxy route. This can be done inside or outside of the application and is an easy task with DaVinci Resolve, EditReady, Adobe Media Encoder, or Apple Compressor.

Finally, if you have image sequences from a drone or other source, forget trying to edit from these off of a network. Transcode them right away into some format of master movie file. I find Resolve to be the best tool for this. It’s fast and since these are often camera raw files, you can apply a base grade to them as a starting point for future color correction.
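As a hedged command-line illustration of that step (Resolve or EditReady would do this through a GUI), wrapping a numbered image sequence into a single ProRes master with ffmpeg might look like the sketch below. The frame pattern, frame rate, and DPX format are assumptions – most camera raw formats would still need a raw-capable tool like Resolve first.

```python
# Sketch: wrap a numbered image sequence into one master movie file, so
# editors never have to pull thousands of stills across the network.
# Assumes ffmpeg on the PATH and a pattern like drone_000001.dpx.
import subprocess

def sequence_to_movie(pattern: str, fps: str, output: str) -> None:
    subprocess.run([
        "ffmpeg",
        "-framerate", fps,      # declare the sequence's intended frame rate
        "-i", pattern,          # e.g. "drone_%06d.dpx"
        "-c:v", "prores_ks", "-profile:v", "3",   # profile 3 = ProRes HQ
        output,
    ], check=True)

sequence_to_movie("drone_%06d.dpx", "23.976", "drone_master.mov")
```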

Break up your projects. Depending on the type and size of the job and number of editors working on it, you may choose to work in multiple Premiere projects. There may be a master file where all media is imported and initially organized. Then there may be multiple projects that are offshoots from this for component parts. In a corporate environment, it could be several different videos cut from a single, larger set of media. In a feature film, there could be different Premiere projects for each reel of the film.

Since Premiere Pro employs project locking, any project opened by one editor can also be opened in a read-only mode by other editors. Editors can have multiple Premiere projects open at one time. Thus, it’s simple to bring in elements from one project into another, even while they are all open. This workflow mimics Avid’s bin-locking strategy.

It helps to keep project files streamlined as progress on the production extends over time. You want to keep the number of sequences in any given project small. Periodically duplicate your project(s), strip out old sequences from the current project, and archive the older project files.

As a general note, while working to build the creative story edits – i.e. “offline editing” – you will want to keep plug-in filter effects to a minimum. In fact, it’s generally a good idea to keep the plug-in selection on each system small, so that all workstations in this shared environment are able to have the same set of installed plug-ins. The same is true of fonts.

Finishing stages of post. There are generally two paths in the finishing, aka “online editing” stage. Either all final color correction and assembly of effects is completed within Premiere Pro, or there is a roundtrip through a color correction application, like Blackmagic Design DaVinci Resolve. The same holds true for audio, where a separate sound editor/designer/mixer may handle the finishing touches in Avid Pro Tools.

To accomplish an easy roundtrip with Resolve, create a sequence with all color correction and effects removed. Flatten the video to a single track (if possible), and remove the audio or do a simple stereo mixdown for reference. Ideally, media with mixed frame rates should be addressed as slow motion in the edited sequence. Avoid modifying the frame rate through any sort of “interpret” function within the application. Export an XML or AAF and send that and the associated media to Resolve. When color correction is complete, you can render the entire timeline at the sequence resolution as a single master file.

Conversely, if you want to send it back to Premiere Pro for final assembly and to complete the roundtrip, then render individual clips at their source resolution with handles of one to two seconds. Back in Premiere, re-apply titles, insert completed visual effects, and add any missing plug-in effects.

With audio post, there will be no roundtrip of elements, since the mixer will deliver a completed mixed stereo or surround track. This should be imported into Premiere (or Resolve if the final master is created in Resolve) and married back to the final video sequence. The mixer should also supply “stems” – the individual dialogue, music, and sound effects (D/M/E) submix tracks.

Mastering. Final sequences should be exported in a master file format (ProRes, DNxHD/HR, uncompressed) in at least two forms: 1) master with final mix and titles, and 2) textless submaster with split-track audio (multiple channels containing the D/M/E stems). All of these files are stored within the same job-based folder structure outlined at the top. It is quite common that future revisions will be made using the textless submaster rather than re-opening the full project, or that it may be used as source material in another edit.
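For the textless submaster with split-track audio, the muxing itself is mechanical once the stems are delivered. A hedged sketch, assuming ffmpeg and hypothetical file names – in practice this export would usually happen straight out of Premiere or Resolve:

```python
# Sketch: marry a textless picture master with D/M/E stems as separate
# audio tracks in one submaster file. File names are placeholders.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "show_textless.mov",     # textless video master
    "-i", "stem_dialogue.wav",
    "-i", "stem_music.wav",
    "-i", "stem_effects.wav",
    "-map", "0:v", "-map", "1:a", "-map", "2:a", "-map", "3:a",
    "-c:v", "copy",                # leave the ProRes/DNx video untouched
    "-c:a", "pcm_s24le",           # 24-bit PCM for each stem track
    "show_textless_submaster.mov",
], check=True)
```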

Another aspect of finishing the project is media consolidation. This means taking the final sequence and generating a new project file from it. That file contains only the elements from the sequence, along with a copy of the media used, where each file has been trimmed to the portion within the sequence (plus handles). This is the Project Manager function in Premiere Pro. Unfortunately, Premiere is not consistently good at this task. Some formats will be properly trimmed, while others will be copied in their entirety. That’s OK for a :10 take, but a bummer when it’s a 30-minute interview.
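The trim operation at the heart of consolidation is simple to express, even if Premiere’s implementation is uneven. A hedged sketch of trimming one source clip to its used range plus handles, assuming ffmpeg – a stream copy keeps it fast and lossless, though cuts land on keyframe boundaries for long-GOP media:

```python
# Sketch: copy only the used portion of a source clip, padded with handles,
# which is the core of media consolidation. Assumes ffmpeg on the PATH.
import subprocess

HANDLE = 2.0   # seconds of handle per side, per the guidance above

def trim_with_handles(src: str, dst: str, in_sec: float, out_sec: float) -> None:
    start = max(0.0, in_sec - HANDLE)
    duration = (out_sec - in_sec) + 2 * HANDLE
    subprocess.run([
        "ffmpeg", "-ss", f"{start:.3f}", "-i", src,
        "-t", f"{duration:.3f}",
        "-c", "copy",              # stream copy: no re-encode
        dst,
    ], check=True)

# Example: an interview clip used from 12:00 to 12:30 of its source.
trim_with_handles("A010_interview.mov", "trimmed/A010_interview.mov", 720.0, 750.0)
```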

The good news is that if you went through the Resolve roundtrip workflow and rendered individual clips, then effectively Resolve has already done media consolidation as a byproduct. In addition, if your source media is 4K, but you only finished in HD, the Resolve renders will be 4K. If in the future, you need to deliver the same master in 4K, everything is already set. Of course, that assumes that you didn’t do a lot of “punching in” and reframing in your edit sequence.

Cloud-based services. Often collaboration requires a distributed team, when not everyone is under one roof. While Adobe does offer cloud-based team editing methods, this doesn’t really work when editors are on different Creative Cloud accounts or when the collaboration is between an editor and a graphic designer/animator/VFX artist working in non-Adobe tools. In that case the old standbys have been Dropbox, Box, or Google Drive. Syncing is easy and relatively reliable. However, these are really just designed for sharing assets. But when this involves a couple of editors and each has a local, mirrored set of media, then simple sharing/syncing of only small project files makes for a working collaborative method.

Frame.io is the newbie here, with updated extension tools designed for in-application workspace panels within Final Cut Pro X, After Effects, and Premiere Pro. While they tout the ease of moving full-resolution media into their cloud, including camera files, I really wouldn’t recommend doing that. It’s simply not very practical on most projects. But for sharing cuts using a standard review-and-approve workflow, Frame.io definitely hits most of the buttons.

©2018 Oliver Peters