ARRI ALEXA post, part 5

A commercial case study

Upon my return from NAB, I dove straight into post on a set of regional commercials for Hy-Vee, a Midwest grocer. I’ve worked with this client, agency and director for a number of years, and all previous projects had been photographed on 35mm, transferred to Digital Betacam and finished in a common, standard-definition post workflow. The new spots featured celebrity chef Curtis Stone and, instead of film, Director/DP Toby Phillips opted to produce them with the ARRI ALEXA. This gave us the opportunity to cut and finish in HD. Although we mastered in 1080p/23.98, delivery formats included 720p versions for the web and cinema, along with reformatted 14×9 SD spots for broadcast.

The beauty of ALEXA is that you can take the Apple ProRes QuickTime camera files straight into the edit without any transcoding delays. I was cutting these at TinMen, a local production company, on a fast 12-core Mac Pro connected to a Fibre Channel SAN, so there was no slowdown working with the ProRes 4444 files. Phillips shot with two ALEXAs and a Canon 5D, plus double-system sound. The only conversion involved was getting the 5D files into ProRes, using my standard workflow. The double-system sound served mainly as a back-up, since the audio was also tethered to the ALEXA, which records two tracks of high-quality sound.
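For the 5D conversion, any ProRes-capable transcoder will do the job. Below is a minimal sketch of one way to batch it, assuming ffmpeg with its prores_ks encoder is installed – the folder paths are hypothetical and this isn’t necessarily the exact workflow used on this project.

```python
# A minimal sketch of batch-transcoding Canon 5D H.264 clips to ProRes 422 HQ.
import subprocess
from pathlib import Path

SOURCE = Path("/Volumes/SAN/commercials/5D_cards")   # hypothetical paths
DEST = Path("/Volumes/SAN/commercials/5D_prores")
DEST.mkdir(parents=True, exist_ok=True)

for clip in sorted(SOURCE.glob("*.MOV")):
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-c:v", "prores_ks", "-profile:v", "3",   # profile 3 = ProRes 422 HQ
        "-c:a", "pcm_s16le",                      # uncompressed PCM audio
        str(DEST / clip.name),
    ], check=True)
```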

On location, the data wrangler used the Pomfort Silverstack ARRI Set application to offload, back up and organize files from the SxS cards to hard drive. Silverstack lets you review and organize the footage and write a new XML file based on this organization. Since the week-long production covered several different spots, the hope was to organize files according to commercial and scene. In general, this concept worked, but I ran into problems with how Final Cut Pro reconnects media files. Copying the backed-up camera files to the SAN changes the file path, and FCP wouldn’t automatically relink the imported XML master clips to the corresponding media. Normally, in this case, once you reconnect the first file, the rest in a similar path will also relink. Unfortunately, using the Silverstack XML meant I had to restart the reconnect routine every few clips, since that XML bridges information across various cards. Instead of the Silverstack-generated XML, I decided to use the camera-generated XML files, which meant only going through the reconnect dialogue once per card.
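As an aside, another way to sidestep the repeated reconnects would be to rewrite the media paths inside the XML before importing it. The sketch below is purely illustrative: the pathurl element is the standard field in FCP’s xmeml format, but the path prefixes are made up, and a production version would also need to restore the xmeml DOCTYPE declaration that Python’s ElementTree drops.

```python
# Rewrite the <pathurl> entries in a Final Cut Pro XML (xmeml) so the master
# clips point at the media's new home on the SAN.
import xml.etree.ElementTree as ET

OLD_PREFIX = "file://localhost/Volumes/ShuttleDrive/"    # where the cards were backed up
NEW_PREFIX = "file://localhost/Volumes/SAN/AlexaMedia/"  # where the copies now live

tree = ET.parse("silverstack_export.xml")
for node in tree.iter("pathurl"):
    if node.text and node.text.startswith(OLD_PREFIX):
        node.text = NEW_PREFIX + node.text[len(OLD_PREFIX):]

tree.write("silverstack_export_relinked.xml", encoding="UTF-8", xml_declaration=True)
```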

It’s worth noting that the QuickTime files written by the ARRI ALEXA somehow differ from what FCP expects to see. When you import these files into FCP, you frequently run into two error prompts: the “media isn’t optimized” message and the “file attributes don’t match” message. Both of these are bogus and the QuickTime files work perfectly well in FCP, so when you encounter such messages, simply click “continue” and proceed.


Dealing with Log-C in the rough cut

As I’ve discussed in numerous posts, one of the mixed blessings of the camera is the Log-C profile. It’s ARRI’s unique way of squeezing a huge dynamic range into the ALEXA’s recorded signal, but it means editors need to understand how to deal with it. Since these spots wouldn’t go through the standard offline-online workflow, it was up to me as the editor to create the “dailies”. I’ve mentioned various approaches to LUTs (color look-up tables), but on this project I used the standard FCP color correction filter to change the image from its flat Log-C appearance to a more pleasing Rec 709 look. On this 12-core Mac Pro, ProRes 4444 clips (with an unrendered color correction filter applied) played smoothly and with full video quality on a ProRes HQ timeline. Since the client was aware of how much better the image would look after grading – and because in the past they had participated in film transfer and color correction sessions – seeing the flat Log-C image didn’t pose a problem.

From my standpoint, it was simply a matter of creating a basic setting and then quickly pasting that filter to clips as I edited them into the timeline. One advantage of using the color correction filter instead of a proper LUT is that it allowed me to subjectively tweak a shot for the client without adding another filter. If a shot looked a little dark (compared with a “standard” setting), I would quickly brighten it as I went along. Like most commercial sessions, I would usually have several versions roughed in before the client really started to review anything. In reality, their exposure to the uncorrected images was less frequent than you might think. As such, the “apply filter as you go” method works well in the spot editorial world.

Moving to finishing

New Hat colorist Bob Festa handled the final grading of these spots on a FilmLight Baselight system. There are a couple of ways to send media to a Baselight, but the decision was made to send DPX files corresponding to the cut sequence. Since I was sending a string of over ten commercials to be graded, I was concerned about the volume of raw footage to ship. There is a bug in the ALEXA/FCP process involving FCP’s Media Manager: when you media manage and trim the camera clips, many are not correctly written and result in partial clips with a “-v” suffix. If you media manage, but take the entire length of a clip, then FCP’s Media Manager seems to work correctly. To avoid sending too much footage, I only sent an assembled sequence with the entire series of spots strung out end-to-end. I extended all shots to add built-in handles and removed any of my filters, leaving the uncorrected shots with pad.

Final Cut Pro doesn’t export DPX files, but Premiere Pro does. So… a) I exported an XML from FCP, b) imported that into Premiere Pro, and c) exported the Premiere Pro timeline as DPX media. I also generated an EDL to serve as a “notch list”, which lined up with all the cuts and divided the long image sequence into a series of shots with edit points – ready to be color corrected.
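To show what a “notch list” amounts to, here is a rough sketch that writes a CMX3600-style EDL marking every cut in the strung-out sequence. The real EDL came straight out of the NLE; the cut points in the sketch are hypothetical.

```python
# Write a CMX3600-style "notch list" whose events simply mark each cut, so a
# long DPX image sequence can be split back into shots. 23.98 fps material is
# counted against a 24-frame, non-drop base.

def tc(frames, fps=24):
    ff = frames % fps
    ss = (frames // fps) % 60
    mm = (frames // (fps * 60)) % 60
    hh = frames // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

cuts = [0, 113, 247, 390, 528]   # first frame of each shot (hypothetical)
end = 672                        # total length of the strung-out sequence

print("TITLE: NOTCH LIST")
print("FCM: NON-DROP FRAME")
for i, start in enumerate(cuts, 1):
    stop = cuts[i] if i < len(cuts) else end
    rec_in, rec_out = tc(start), tc(stop)
    # source and record times match, since the list only marks the cuts
    print(f"{i:03d}  AX       V     C        {rec_in} {rec_out} {rec_in} {rec_out}")
```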

After a supervised color correction session at New Hat, the graded shots were rendered as a single uncompressed QuickTime movie. I imported that file and realigned the shots with my cuts (removing handles) to now have a set of spots with the final graded clips in place of the Log-C camera footage.

Of course, spot work always involves a few final revisions, and this project was no exception. After a round of agency and client reviews, we edited for a couple of days to revise a few spots and eliminate alternate versions before sending the spots to the audio mixing session. Most of these changes were simple trims that could be done within the handle length I had on the graded footage. However, a few alternate takes were selected, and in some cases I had to extend a shot beyond my handles. As a result, about a dozen shots (out of more than ten commercials) had to be newly graded, which meant a second round at New Hat. We skipped the DPX pass and instead sent an EDL and the raw footage as QuickTime ProRes 4444 camera files for only the revised clips. Festa was able to match his previous grades, render new QuickTimes of the revised shots and ship a hard drive back to us.

[Video: “brand introduction” commercial]

Reformatting

Our finished masters were ProRes HQ 1920×1080 23.98fps files, but think of these only as intermediates. The actual spots that run on broadcast are 4×3 NTSC. Phillips had framed his shots protecting for 4×3, but in order to preserve some of the wider visual aspect ratio, we decided to finish with a 14×9 framing. This means the 4×3 frame has a slight letterbox, with smaller black bars at the top and bottom than a full 16×9 letterbox would have. Compared with the usual 4×3 center-crop, less of the left and right edges of the 16×9 HD frame is cropped off. I don’t like how FCP handles the addition of pulldown (to turn 23.98 into 29.97 fps) and I’m not happy with its scaling quality when downconverting HD to SD. My “go-to” solution is to use After Effects as the conversion utility for the best results.

From Final Cut, I exported a self-contained, textless QuickTime movie (HD 23.98). This was placed into an After Effects 720 x 486 D1 composition and scaled to match a 14×9 framing within that comp. I rendered an uncompressed QuickTime file out of After Effects (29.97 fps, field-rendered with added 3:2 pulldown). The last step was to bring this 720 x 486 file back into FCP, place it on an NTSC 525i timeline, add and reposition all graphics, and finish the masters.
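For the curious, the arithmetic behind the 14×9 framing works out roughly as shown below. This is only a back-of-the-envelope check, since After Effects deals with the non-square D1 pixels itself and the actual comp values may round differently.

```python
# Back-of-the-envelope numbers for the 14x9 downconversion.
SRC_W, SRC_H = 1920, 1080   # 16:9 HD master
D1_W, D1_H = 720, 486       # NTSC D1 raster (4:3 display)

# Cropping the 16:9 frame to 14:9 keeps 14/16 of the width (vs. 12/16 for a
# full 4:3 center-crop), so less of the sides is lost.
kept_w = SRC_W * 14 / 16                 # 1680 px, i.e. 120 px trimmed per side

# Inside a 4:3 (12:9) frame, a 14:9 image spans 12/14 of the height.
active_lines = D1_H * 12 / 14            # about 417 lines of picture
bar = (D1_H - active_lines) / 2          # about 35-line bars top and bottom

print(f"side crop: {(SRC_W - kept_w) / 2:.0f} px per side")
print(f"active picture: ~{active_lines:.0f} lines, letterbox bars: ~{bar:.0f} lines")
```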

Most of these steps are not unusual if you do a lot of high-end spot work. In the past, 35mm spots would be rough cut from one-light “dailies”. Transfer facilities would then retransfer selects in supervised color correction sessions and an online shop would conform this new film transfer to the rough cut. Although many of the traditional offline-online approaches are changing, they aren’t going away completely. The tricks learned over the past 40 years of this workflow still have merit in the digital world and can provide for rich post solutions.

Sample images

Log-C profile from camera

Nick Shaw Log-C to Rec 709 LUT (interface)

Nick Shaw Log-C to Rec 709 LUT (result)

Final image after Baselight grading

© 2011 Oliver Peters

ARRI ALEXA post, part 4

Local producers have started real productions with the ARRI ALEXA, so my work has moved from the theoretical to the practical. As an editor, working with footage from ALEXA is fun. The ProRes files are easily brought into FCP, Premiere Pro or Media Composer via import or AMA with little extra effort. The Rec 709 color profile looks great, but if the DP opts for the Log-C profile, grading is a snap. Log-C, as I wrote before, results in an image akin to a film scan of uncorrected 35mm negative. It’s easy to grade and you end up with gorgeous natural colors. There’s plenty of range to swing the image in different ways for many different looks.

Working with the Log-C profile in front of a client takes a bit of strategy, depending on the NLE you are using. Under the best of circumstances, you’d probably want to process the images first and work with offline-resolution editing clips (like Avid DNxHD36 or Apple ProRes Proxy) with a color correction LUT “baked” into the image – much like one-light “dailies” for film-originated projects.
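As an illustration of that dailies-style approach, here is a minimal sketch of one way to render LUT-baked proxies outside the NLE, assuming ffmpeg with its lut3d filter and a Log-C-to-Rec 709 LUT saved as a .cube file. Dedicated dailies tools handle the same job with far more control; the paths and LUT file name are hypothetical.

```python
# Render low-res editing proxies with a Log-C to Rec 709 LUT baked in.
import subprocess
from pathlib import Path

LUT = "logc_to_rec709.cube"             # hypothetical LUT file
SRC = Path("/Volumes/SAN/AlexaMedia")   # hypothetical paths
DST = Path("/Volumes/SAN/AlexaProxies")
DST.mkdir(parents=True, exist_ok=True)

for clip in sorted(SRC.glob("*.mov")):
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-vf", f"lut3d={LUT},scale=960:540",      # bake in the LUT, then downscale
        "-c:v", "prores_ks", "-profile:v", "0",   # profile 0 = ProRes Proxy
        "-c:a", "copy",
        str(DST / clip.name),
    ], check=True)
```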

Many projects don’t permit that much advance prep time, though, so editors often must deal with it as part of the edit session. This primarily applies to commercial and corporate work. Workflows for feature film and TV projects should follow a more traditional course, with prep time built into the front end, but that’s another blog post.

Strategies

There are LUT (color look-up table) filters for FCP, but unfortunately many of them struggle with real-time performance. The best performance comes from using the native filters, even though the result might not technically be the correct curve. That’s OK, because most of the time you simply want a good-looking image for the client to see while you are doing the creative cut. Apple Final Cut Pro and Adobe Premiere Pro both require that you apply a filter to each clip on the timeline. This has an impact on your workflow, because you have to add filters as you go.

One good approach, which balances FCP performance with an accurate LUT, is the Log-C to Rec 709 plug-in developed by Nick Shaw at Antler Post. It not only corrects the profile, but adds other features, like “burn-in” displays. If you leave your FCP timeline’s RT settings at dynamic/dynamic, unrendered clips with this filter applied will drop frames. Changing those settings to a full frame rate and/or high video quality will yield real-time playback at full quality on a current Mac.

ARRI offers a web-based LUT Generator, which is free to use if you register at the ARRIDIGITAL site. You can create LUTs in various formats, but the one that has worked best for me is the Apple Color .mga version. This can be properly imported and applied in Apple Color, where it may be used simply for viewing or optionally baked into the rendered files as part of the color correction.
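To give a sense of what such a LUT encodes, here is an illustrative, one-dimensional version of the math: decode Log-C to scene-linear values using the commonly published Log-C (EI 800) constants, then re-encode with the Rec 709 transfer curve. ARRI’s generator also applies tone mapping and is format-aware, so treat this strictly as a sketch.

```python
# A simple 1D sketch of a Log-C to Rec 709 conversion.

# Log-C (EI 800) decode constants
cut, a, b = 0.010591, 5.555556, 0.052272
c, d, e, f = 0.247190, 0.385537, 5.367655, 0.092809

def logc_to_linear(t):
    if t > e * cut + f:
        return (10 ** ((t - d) / c) - b) / a
    return (t - f) / e

def rec709_encode(x):
    x = max(0.0, min(1.0, x))                 # clip, ignoring highlight roll-off
    return 4.5 * x if x < 0.018 else 1.099 * x ** 0.45 - 0.099

# Print a 33-point 1D LUT
for i in range(33):
    v = i / 32
    print(f"{v:0.4f} -> {rec709_encode(logc_to_linear(v)):0.4f}")
```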

You can also use Red Giant’s free Magic Bullet LUT Buddy. This filter can be used to create and/or read LUTs. Apply it to a clip in Final Cut Pro, Premiere Pro or After Effects, read in the .mga file and render. Lastly, the Adobe apps also include a Cineon conversion filter. Apply this in Premiere Pro or After Effects and tweak as needed. On a fast machine, Premiere Pro CS 5.5 plays clips with the Cineon converter applied, in real-time without rendering.

Avid Media Composer and Adobe After Effects currently have the best routines, because you can add color correction to an upper layer and everything underneath is adjusted.

After Effects actually treats this as an “adjustment layer”, like in Photoshop, while Media Composer simply lets you add filters to a blank track – effectively doing the same thing as an adjustment layer. You still won’t see the source clip as a corrected image, but once it is placed on the timeline, the correction is applied and the image appears richer.

In the case of Avid Media Composer, this can also include some third-party filters beyond its own color correction mode – for example, GenArts Sapphire or Magic Bullet Looks. Media Composer is able to play these clips at full quality, even though they are unrendered, giving it a performance edge over FCP.

Cutting spots in Log-C

I recently cut a set of national spots for Florida Film & Tape (a local production company) on a late-model Apple dual-processor PowerMac G5, running FCP 6.0.6. It was equipped with a fast SCSI RAID and an AJA Kona card. That’s a perfectly good set-up for most SD and HD post. In fact, I’ve previously edited spots photographed on 35mm film and the RED One camera for the same client and same production company on this system. G5s were manufactured and sold before ProRes was ever released, but in spite of that, I was able to work with the 1920×1080 23.98fps ProRes 4444 files that were shot. I placed my selected clips on an uncompressed timeline and started cutting. The client had already seen a Rec 709 preview out of the camera, so he understood that the image would look fine after grading. Therefore, there was no need to cut with a corrected image. That was good, because adding any sort of color correction filter to a large amount of footage would have really impacted performance on this computer.

In order to make the edit as straightforward and efficient as possible, I first assembled a timeline of all the “circle takes” so the director (Brad Fuller) and the client could zero in on the best performances. Then I assembled these into spots and applied a basic color correction filter to establish an image closer to the final. At this point, I rendered the spots and started to fine-tune the edit, re-rendering the adjustments as I went along. This may sound more cumbersome than it was, since I was editing at online quality the entire time (uncompressed HD). Given the short turnaround time, this was actually the fastest way to work. The shoot and post (edit, grade, mix) were completed in three consecutive days!

Once the picture was locked, I proceeded to the last steps – color grading the spots and formatting versions for various air masters. I decided to grade these spots using the Magic Bullet Colorista (version 1) plug-in. There was no need to use Apple Color and Colorista works fine on the G5. I removed the basic filter I had applied to the clips for the edit and went to work with Colorista. It does a good job with the Log-C images, including adding several layers for custom color-correction masks. As flat as the starting images are, it’s amazing how far you can stretch contrast and increase saturation without objectionable noise or banding.

I’ll have more to write about ALEXA post in the coming weeks as I work on more of these projects. This camera has garnered buzz, thanks to a very filmic image and its ease in post. It’s an easy process to deal with if your editing strategy is planned out.

©2011 Oliver Peters

Rise of the Preditor

Producer + Editor = Preditor. It’s a word that seems to generate derision from many traditional, professional editors. The concept that one person can and should handle all of the aspects of post is characterized as a “jack-of-all-trades and a master of none”. While that may be true, it doesn’t change the fact that many news operations, reality TV shops and broadcast creative services departments are adopting the model. That’s based on the concept that producers and editors can merge job skills and that the combined role can be handled by a single individual.

What is often described as a “professional editor” is really a concept that only applies to Hollywood, unionized workplaces or to the short slice of time when linear videotape editing was the norm. During the 1970s – 1990s, video post production was handled in very expensive edit suites. The editors originally came out of the engineering ranks and were considered more technical than creative, since part of their job was making sure that broadcast standards were met.

This evolved and attitudes changed around the introduction of Avid’s original nonlinear systems. These early units were quite expensive, so although the editor roles were creative, the business model didn’t change much from the linear operations. With the entry of Apple Final Cut Pro, the last decade or so has been viewed by many as a “race to the bottom”. The tools are cheaper than ever and it is perceived that no specialized skills are needed to operate the editor’s toolkit.

If you look at this outside of the scope of major film productions or those three decades and go back to the way standard small-to-medium market film production was handled before the ‘70s, you will see that the concept of a preditor actually predates the modern video world. In fact, most filmmakers would have been considered preditors. In the days when every commercial, corporate film (then called “industrials”), news story or documentary was shot and posted on film, it was quite common for the cinematographer, editor, director and producer to be the same person. They required a lab for some of the finishing services, but by and large, one person – or a very small team – handled many of the roles.

Many TV stations had an in-house “special projects” producer, who was often a one-man-band filmmaker doing small feature pieces or investigative journalism. The days of a news reporter/photographer driving around with a Canon Scoopic or Bolex 16mm film camera on the front seat are not that far removed from the modern video journalist shooting with a small handheld video camera and cutting a story with FCP on his laptop. I see this model proliferating throughout the video production world. I would suggest the concept of the “traditional professional” is in decline – to be replaced in larger numbers by the “new professional”. That’s who Apple and Adobe are servicing in the development of Final Cut Studio and the Creative Suite. Clearly it’s the focus of Final Cut Pro X.

Avid, Autodesk and Quantel are still trying to hang on to the old definitions and make their margins on the niche that is still working in that space. The larger fortunes are in addressing the needs of the “new professional” – the preditor. After all, if you define “professional” by the delivery of the end product (documentaries, broadcast news, sports, commercials) – rather than by the number of years a person has worked in a specialized position – then is the person who put it together any less professional than an experienced, seasoned pro?

This is even true in the feature film world. The Coen brothers have received several nominations for best editing Oscars. I’d bet they don’t view their knowledge of Final Cut as being at the same level as a so-called “professional” editor who might be an expert in manipulating the software. Yet, I doubt anyone would consider them as anything other than film professionals and talented editors.

Clearly my heart is on the side of the seasoned pro and I do think that in many cases that editor will deliver a better product. But in the end, I’ve seen enough compelling pieces edited by less skilled individuals who had a great vision. I suppose you can call that “good enough”, though I contend that’s really the wrong way to look at it.

So in this new model, what sort of technical skills should a person have? What advice is there for people who are about to enter college, are currently in a college program or are early in their careers? First, I would offer that many college “digital media” programs are probably a better starting point than the more traditional “film programs”. Of course, that’s not a blanket statement, as many schools are adjusting their curricula to stay relevant.

A model I see popping up a lot is that of a hands-on producer who shoots his or her own projects. Typically this is with HDSLR, P2 or XDCAM camcorders. The editing tool of choice is Final Cut – although in many instances, FCP is only used for a basic rough cut of the base layer of video. Then the project is actually finished in After Effects. You would think that Motion with FCP or Premiere Pro with After Effects would be the better choices. That may be true, but many shops decided on FCP a while back, and Premiere Pro has yet to achieve the same street credibility as a professional editing tool. Likewise, Motion never gained the sort of commercial success that After Effects has.

If you are trying to plan out your future – regardless of your age – these are the five skill sets you need to master for success as a “new video professional”:

Producing – Learn everything you can about project organization, budgeting, directing talent and, in general, running an efficient location shoot or studio set.

Production/camera – Learn to be a one-man-band on location and in the studio. If you have a crew – great. But, you personally need to understand lighting, sound and the basics of cinematography. With the new file-based technology, this includes camera-specific data wrangling functions, whether that’s RED, P2 or something else.

Editing – Learn Final Cut Pro and Final Cut Studio inside and out. Don’t stop at editing. Make sure you know how to use Color for grading and Soundtrack Pro for mixing, and extend that learning to Final Cut Pro X. Don’t limit your skills to this one tool. Schedule learning excursions to Avid Media Composer, Adobe Premiere Pro, Grass Valley EDIUS and others.

Finishing/graphics – Since After Effects is the tool of choice for many, you really need to understand how this application works and how to get from FCP (or another NLE) to AE and back.

Encoding/delivery – This is the last stage and more delivery is file-based than ever before. You no longer have a duplication technician or VTR operator to fall back on. It’s just you. So this means you need to understand how to encode for the web, DVD, Blu-ray and various other client deliverables.
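As a simple, hypothetical illustration of that last step, the sketch below batch-encodes a folder of finished masters into H.264 MP4s for web review, assuming ffmpeg is installed. Broadcast, DVD and Blu-ray deliverables each carry their own specs.

```python
# Batch-encode finished masters to web-friendly H.264 MP4s.
import subprocess
from pathlib import Path

MASTERS = Path("masters")        # hypothetical folder of ProRes masters
WEB = Path("web_delivery")
WEB.mkdir(exist_ok=True)

for master in sorted(MASTERS.glob("*.mov")):
    subprocess.run([
        "ffmpeg", "-i", str(master),
        "-c:v", "libx264", "-preset", "slow", "-crf", "20",
        "-pix_fmt", "yuv420p",                  # broad player compatibility
        "-c:a", "aac", "-b:a", "192k",
        "-movflags", "+faststart",              # start playing before full download
        str(WEB / (master.stem + ".mp4")),
    ], check=True)
```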

Whether you view the “new video professional” as the modern preditor or the filmmaker of five decades ago, it’s the same basic concept. What was old is new again. No need to complain. Time to learn, adapt and grow! Or get out of the business and run an ice cream truck!

©2011 Oliver Peters

Grass Valley EDIUS 6

Editing software manufacturers continue to expand features by adding native support for a variety of new camera codecs. EDIUS 6, the newest version of Grass Valley’s primary editing platform, advances this endeavor.

EDIUS 6 is exclusively a Windows-based NLE, running under a wide range of configurations. I was able to test it on a new HP EliteBook 8740w with the Windows 7 64-bit Professional OS installed. This is a robust, top-of-the-line mobile workstation with more horsepower than most desktop units. I was operating in a software-only configuration, but EDIUS 6 can also be integrated with Grass Valley’s line of STORM and SPARK i/o cards when you use a tower instead of a mobile system.

The EDIUS 6 mantra – “Edit Anything”

Grass Valley promotes EDIUS 6 around the ability to throw nearly any codec or format at it and be immediately ready to edit – in real time, without the need to render. This is of particular interest to P2 and XDCAM shooters. My system natively supported a wide range of professional acquisition codecs, including the newly added Canon XF MPEG2 4:2:2 codec. Possibly even more important – you can also drag-and-drop Canon H.264 files from a Canon EOS 5D or 7D and start editing right away. The “sweet spot” in my tests was Panasonic’s AVC-Intra codec. This is supposed to be very computationally intensive, but both 50 and 100 Mb/s files were a breeze to work with in EDIUS 6 on this workstation. With many prosumer and even professional cameras adopting some form of AVCHD, EDIUS 6 users will be happy to know that it can also handle real-time, multi-stream editing of AVCHD content.

It’s important to stress the “native” part. A lot of manufacturers claim native support, but in actual practice transcode the media to another format upon ingest – or, at the very least, rewrap the file container into a different type of file. EDIUS 6 does neither. If you ingest media from an OHCI device (FireWire), such as an HDV camcorder, the media stays native. Do you shoot P2 or XDCAM-HD? Simply edit straight from the media files, or copy them to your local storage and access the media directly from the native folder and file structure of that format. Other than the time it takes to copy the files, the media is immediately accessible to EDIUS 6 and ready to edit. This difference becomes critical in time-sensitive workflows, like broadcast news.

The EDIUS 6 user interface sports four color bars in the timeline to indicate CPU stress: blue (no rendering), green (rendered), orange (may or may not need rendering) and red (needs rendering). When you edit a sequence, EDIUS 6 estimates where rendering is needed, so a lot of the timeline is highlighted with an orange bar. It is only when you actually play the timeline that EDIUS 6 updates this estimate with accurate information. Unfortunately, this may mean you have to play the sequence at least once all the way through. When clips are rendered, EDIUS 6 uses the codec defined by the project setting, which could be Grass Valley HQ (an 8-bit intermediate codec), HQX (10-bit) or uncompressed. Grass Valley HQ/HQX is equivalent to Apple ProRes 422 or Avid DNxHD and includes several selectable quality/data rate settings.

EDIUS 6 uses a buffer system that preloads up to 15 frames and 512MB, so as long as you don’t go below a user-defined threshold in the buffer, no rendering is required. The usual culprits tax the CPU: compute-intensive codecs, 3D DVE effects and many video layers. The user preferences offer a degree of render control that’s better than most, so you can choose to render the whole timeline, part of it or only the overloaded (red) areas. In a typical timeline consisting of some mixed-format media, color-correction, transitions and a few titles, the required rendering was very minimal.

EDIUS 6 now features the ability to work at standards up to 4K digital cinema (4096 x 2160). Unfortunately, EDIUS 6 does not natively support RED’s .r3d camera raw format – about the only format larger than HD (1920 x 1080) that most users are likely to encounter. RED users considering EDIUS 6 will want to first transcode/render/export the camera’s media files into an editable format using RED’s free Redcine-X. Then EDIUS 6 becomes a very viable editing solution for RED projects.

I did render one of my RED files at 2K (2048 x 1024) as an uncompressed QuickTime movie. This worked fine in an EDIUS 6 HD timeline – displaying with a slight letterbox. You can set the project settings to this size and work in a native 2K project, but bear in mind that all sequences within a given project must use the same frame size – changing the settings of an existing project resets all of its sequences to the new size. When I changed my test project from HD 16:9 to 2K 2:1, all previous sequences changed to the new size and aspect, and clips on those timelines were scaled to fit the new sequence size.
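For reference, the “slight letterbox” is easy to predict, assuming the 2K clip is simply fit to the width of the HD sequence:

```python
# Quick arithmetic behind the letterbox when a 2K (2:1) clip is fit to the
# width of an HD (16:9) sequence.
src_w, src_h = 2048, 1024   # 2K clip
seq_w, seq_h = 1920, 1080   # HD sequence

scale = seq_w / src_w       # 0.9375
scaled_h = src_h * scale    # 960 active lines
bar = (seq_h - scaled_h) / 2
print(f"scale {scale:.4f}, picture {scaled_h:.0f} lines, bars {bar:.0f} lines each")
```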

The editing experience

This software includes features that professional editors expect. The interface is highly customizable, including colors, icons, window layouts and keyboard shortcuts. The organization of bins, folders and clips inside the project is independent of where and how your actual media files are stored. You can add media clips to your project from anywhere on the drives and reorganize them inside the project by creating folders specific to that project. EDIUS 6 supports multiple sequences and the ability to have several open at once in a tabbed timeline layout. There’s a good multi-camera editing mode that has been expanded to 16 cameras.

EDIUS 6 doesn’t use a popular plug-in API for its video effects, so you won’t find a lot of third-party filter packages on the market specifically coded for EDIUS 6. This new version does include a bridge function to access existing Adobe After Effects and VST plug-ins for use as video and audio effects. I did not have After Effects installed, so I wasn’t able to test this feature. The built-in effects interface is pretty weak in my opinion. I’m not sure why, but plug-in effects filters are divided into two folders – System Preset Video Filters and Video Filters. There aren’t a lot of choices, and each time you click on an effect to open and modify it, a separate window opens for that filter. In short, there’s no single effects control panel as in other editors.

I feel that the interface could benefit from a modern overhaul. There also seems to be some incompatibility between the interface itself and the graphics card. For instance, when you grab a window to drag and resize the docked panels, there’s a redraw problem. The windows stutter and glitch as you drag the mouse. It’s fine again when you are done. I also tried to run EDIUS 6 under Parallels on my Mac Pro. That’s a definite no-go. The software works, but it’s very slow to react. As a point of comparison, I ran that same experiment with Vegas Pro and it actually performs pretty well under virtualization.

EDIUS 6 users are loyal Windows editors, of course, so performance with media on a modern PC is the biggest selling point, and that’s an area where the application really excels. In actual editing, EDIUS 6 lives up to the real-time, “Edit Anything” claim. You can toss a mix of codecs, sizes and frame rates on the same timeline, complete with effects and 3D DVE transitions, and playback stays real-time without any hiccups. I was able to view the sequence in full-screen playback on the HP’s built-in 17” DreamColor display for an absolutely gorgeous image – clean playback without any dropped frames. When it comes to composites, I found that with AVC-Intra 100 I was able to get about four layers of 2D picture-in-picture effects before it started to drop frames when left unrendered. Not too shabby.

The newly added 4K support gives Grass Valley something to build on for the future. Few NLEs handle media well when it exceeds HD sizes, so to see uncompressed 2K files playing smoothly on a laptop is pretty impressive. Likewise, performance with the Canon H.264 files was also very smooth, but I’d still recommend transcoding these files first, since the direct import doesn’t bring in any metadata, like timecode. On the other hand, if you just want to drag-and-drop the files into an editor and quickly bang out a sequence for the fastest possible turnaround time, EDIUS 6 is hard to beat. That’s appealing to folks first working with video, thanks to the HDSLR revolution. EDIUS 6 might be light on glitz, but it delivers when fast, nuts-and-bolts editing is the primary concern.

Written for Videography magazine (NewBay Media LLC).

©2011 Oliver Peters