Tips for Production Success – Part 2

Picking up from my last post (part 1), here are 10 more tips to help you plan for a successful production.

Create a plan and work it. Being a successful filmmaker – that is, making a living at it – is more than just producing a single film. Such projects almost never go beyond the festival circuit, even if you do think it is the “great American film”. An indie producer may work on a project for about four years, from the time they start planning and raising the funds – through production and post – until real distribution starts. Therefore, the better approach is to start small and work your way up. Start with a manageable project or film with a modest budget and then get it done on time and on budget. If that’s a success, then start the next one – a bit bigger and more ambitious. If it works, rinse and repeat. If you can make that work, then you can call yourself a filmmaker.

Budget. I have a whole post on this subject, but in a nutshell, an indie film that doesn’t involve union talent or big special effects will likely cost close to one million dollars, all in. You can certainly get by on less. I’ve cut films that were produced for under $150,000 and one even under $50,000, but that means calling in a lot of favors and having many folks working for free or on deferment. You can pull that off one time, but it’s not a way to build a business, because you can’t go back to those same resources and ask to do it a second time. Learn how to raise the money to do it right and proceed from there.

Contingencies at the end. Intelligent budgeting means leaving a bit for the end. A number of films that I’ve cut had to do reshoots or spend extra days to shoot more inserts, establishing shots, etc. Plan for this to happen and make sure you’ve protected these items in the budget. You’ll need them.

Own vs. rent. Some producers see their film projects as a way to buy gear. That may or may not make sense. If you need a camera and can otherwise make money with it, then buy it. Or if you can buy it, use it, and then resell it to come out ahead – by all means follow that path. But if gear ownership is not your thing and if you have no other production plans for the gear after that one project, then it will most likely be a better deal to work out rentals. After all, you’re still going to need a lot of extras to round out the package.

Shooting ratios. In the early 90s I worked on the post of five half-hour and hourlong episodic TV series that were shot on 35mm film. Back then shooting ratios were pretty tight. A half-hour episode is about 20-22 minutes of content, excluding commercials, bumpers, open, and credits. An hourlong episode is about 44-46 minutes of program content. Depending on the production, these were shot in three to five days and exposed between 36,000 and 50,000 feet of negative. Therefore, a typical day meant 50-60 minutes of transferred “dailies” to edit from – or no more than about five hours of source footage per episode, depending on the series. This would put them close to the ideal mark (on average) of approximately a 10:1 shooting ratio.
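
To see roughly where that 10:1 figure comes from, here is a back-of-the-envelope check in Python. The only assumption not stated above is that 4-perf 35mm runs through the camera at about 90 feet per minute at 24fps.

# Rough check of the film-era shooting ratio (assumes 4-perf 35mm = ~90 ft/min at 24fps)
for feet_exposed in (36_000, 50_000):
    minutes_of_dailies = feet_exposed / 90     # 400 to ~555 minutes of negative
    ratio = minutes_of_dailies / 45            # vs. roughly 45 minutes of hourlong program content
    print(f"{feet_exposed} ft -> {minutes_of_dailies:.0f} min of dailies, about {ratio:.0f}:1")
# Prints roughly 9:1 and 12:1 – in the ballpark of the 10:1 average mentioned above.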

Today, digital cameras make life easier and with the propensity to shoot two or more cameras on a regular basis, this means the same projects today might have conservatively generated more than 10 hours of source footage for each episode. This impacts post tremendously – especially if deadline is a factor. As a new producer, you should strive to control these ratios and stay within the goal of a 10:1 ratio (or lower).

Block and rehearse. The more a scene is buttoned down, the fewer takes you’ll need, which leads to a tighter shooting ratio. This means rehearse a scene and make sure the camera work is properly blocked. Don’t wing it! Once everything is ready, shoot it. Odds are you’ll get it in two to three takes instead of the five or more that might otherwise be required.

Control the actors. Unless there’s a valid reason to let your actors improvise, make sure the acting is consistent. That is, lines are read in the same order each take, props are handled at the same point, and actors consistently hit their marks each take. If you stray from that discipline, the editorial time becomes longer. If allowed to engage in too much freewheeling improvisation, actors may inadvertently paint you into a corner. To avoid that outcome, control it from the start.

Visual effects planning. Most films don’t require special effects, but there are often “invisible” fixes that can be created through visual effects. For example, combining elements of two takes or adding items to a set. A recent romantic drama I post-supervised used 76 effects shots of one type or another. If this is something that helps the project, make sure to plan for it from the outset. Adobe After Effects is the ubiquitous tool that makes such effects affordable. The results are great and there are plenty of talented designers who can assist you within almost any budget range.

Multiple cameras vs. single camera vs. 4K. Some producers like the idea of shooting interviews (especially two-shots) in 4K (for a 1080 finish) and then slicing out the frame they want. I contend that 4K often presents focus issues, due to the larger sensors used in these cameras. In addition, the optics of slicing a region out of a 4K image are different from using another camera or zooming in to reframe the shot. As a result, the look that you get isn’t “quite right”. Naturally, it also adds one more component that the editor has to deal with – reframing each and every shot.

Conversely, when shooting a locked-off interview with one person on-camera, using two cameras makes the edit ideal. One camera might be placed face-on towards the speaker and the other from a side angle. This makes cutting between the camera angles visually more exciting and makes editing without visible jump cuts easier.

In dramatic productions, many new directors want to emulate the “big boys” and also shoot with two or more cameras for every scene. Unfortunately this isn’t always productive, because the lighting is compromised, one camera is often in an awkward position with poor framing, or even worse, often the main camera blocks the secondary camera. At best, you might get 25% usability out of this second camera. A better plan is to shoot in a traditional single-camera style. Move the camera around for different angles. Tweak the lighting to optimize the look and run the scene again for that view.

The script is too long. An indie film script is generally around 100 pages with 95-120 scenes. The film gets shot in 20-30 days and takes about 10-15 weeks to edit. If your script is inordinately long and takes many more days to shoot, then it will also take many more days to edit. The result will usually be a cut that is too long. The acceptable “standard” for most films is 90-100 minutes. If you clock in at three hours, then obviously a lot of slashing has to occur. You can lose 10-15% (maybe) through trimming the fat, but a reduction of 25-40% (or more) means you are cutting meat and bone. Scenes have to be lost, the story has to be re-arranged, or even more drastic solutions have to be found. A careful reading of the script – visualizing it as the finished film – can head off issues before production ever starts. Losing a scene before you shoot it can save time and money on a large scale. So analyze your script carefully.

Click here for Part 1.

©2015 Oliver Peters

Tips for Production Success – Part 1

Throughout this blog, I’ve written numerous tips about how to produce projects, notably indie features, with a successful outcome in mind. I’ve tried to educate on issues of budget and schedule. In these next two entries, I’d like to tackle 21 tips that will make your productions go more smoothly, finish on time, and not become a disaster during the post production phase. Although I’ve framed the discussion around indie features, the same tips apply to commercials, music videos, corporate presentations, and videos for the web.

Avoid white. Modern digital cameras handle white elements within a shot much better than in the past, but hitting a white shirt with a lot of light complicates your life when it comes to grading and directing the eye of the viewer. This is largely an issue of art direction and wardrobe. The best way to handle this is simply to replace whites with off-whites, bone or beige colors. The sitcom Barney Miller, which earned DP George Spiro Dibie recognition for getting artful looks out of his video cameras, is said to have had the white shirts washed in coffee to darken them a bit. The apparent whiteness came back once the cameras were properly set up. The objective in all of this is to get the overall brightness into a range that is more controllable during color correction and to avoid clipping.

Expose to the right. When you look at a signal on a histogram, the brightest part is on the right-hand side of the scale. By pushing your camera’s exposure towards a brighter, slightly over-exposed image (“to the right”), you’ll end up with a better-looking image after grading (color correction). That’s because when you have to brighten an image by bringing up highlights or midtones, you accentuate the sensor noise from the camera. If the image is already brighter and the correction is to lower the levels, then you end up with a cleaner final image. Since most modern digital cameras use some sort of log or hyper gamma encoding to record a flatter signal, which preserves latitude, opening up the exposure usually won’t run the risk of clipping the highlights. In the end, a look that stretches the shadows and mids to expose more detail to the eye gives you a more pleasing and informative image than one that places emphasis on the highlight portion.

Blue vs. green-screen. Productions almost ubiquitously use green paint, but green isn’t automatically the right choice. Each paint color has a different luminance value. Green is brighter and should be reserved for composites where the talent is supposed to appear to be outside. Blue works best when the composited scene is an interior. Paint matters. The correct paint to use is still the proper version of Ultimatte blue or green paint, but many people try to cut corners on cost. I’ve even had producers go so far as to rig up a silk with a blue lighting wash and expect me to key it! When you light the subject, move them as far away from the wall as possible to avoid contamination of the color onto their hair and wardrobe. This also means don’t have your talent stand on a green or blue floor when you aren’t intending to see the floor or to see them from head to toe.

Rim lighting. Images stand out best when your talent has some rim lighting to separate them from the background. Even in a dark environment, seek to create a lighting scheme that achieves this rimming effect around their head and shoulders.

Tonal art direction. The various “blockbuster” looks are popular – particularly the “orange and teal” look. This style pushes skin tones warm for a slight orange appearance, while many darker background elements pick up green/blue/teal/cyan casts. Although this can be accentuated in grading, it starts with proper art direction in the set design and costuming. Whatever tonal characteristic you want to achieve, start by looking at the art direction and controlling this from step one.

Rec. 709 vs. Log. Digital cameras have nearly all adopted some method of recording an image with a flat gamma profile that is intended to preserve latitude until final grading. This doesn’t mean you have to use this mode. If you have control over your exposure and lighting, there’s nothing wrong with recording Rec. 709 and nailing the final look in-camera. I highly recommend this for “talking head” interviews, especially ones shot on green or blue-screen.

Microphone direction/placement. Every budding recording engineer working in music and film production learns that proper mic placement is critical to good sound. Pay attention to where mics are positioned, relative to where the person is when they speak. For example, if you have two people in an interview situation wearing lavaliere mics on their lapels, the proper placement would be on each person’s inner lapel – the side closer to the other person. That’s because each person will turn towards the other to address them as they speak and thus talk over that shoulder. Having the mic on this side means they are speaking into the mic. If it were on their outer lapel, they would be speaking away from the mic and the audio would tend to sound hollow. For the same reasons, when you use a boom or fish pole overhead mic, the operator needs to point the mic in the direction of the person talking. They will need to shift the mic’s direction as the conversation moves from one person to the next in order to follow the sound.

Multiple microphones/iso mics. When recording dialogue for a group of actors, it’s best to record their audio with individual microphones (lavs or overhead booms) and to record each mic on an isolated track. Cameras typically feature on-board recording of two to four audio channels, so if you have more mics than that, use an external multi-channel recorder. When external recording is used, be sure to still record a composite track to your camera for reference.

Microphone types. There are plenty of styles and types of microphones, but the important factors are size, tonal quality, range, and the axis of pick-up. Make sure you select the appropriate mic for the task. For example, if you are recording an actor with a deep bass voice using a lavaliere, you’d be best to use a type that gives you a full spectrum recording, rather than one that favors only the low end.

Sound sync. There are plenty of ways to sync sound to picture in double-system sound situations. Synchronizing by matched timecode is ideal, but even there, issues can arise. Make sure the camera’s and sound recorder’s timecode generators don’t drift during the day – or use a single, common, external timecode generator for both. It’s generally best to also include a clapboard and, when possible, also record reference audio to the camera. If you plan to sync by audio waveforms (PluralEyes, FCP X, Premiere Pro CC), then make sure the reference signal on the camera is of sufficient quality to make synchronization possible.

Record wild lines on set. When location audio is difficult to understand, ADR (automatic dialogue replacement, aka “looping”) is required. This happens because the location recording was not of high quality due to outside factors, like special effects, background noise, etc. Not all actors are good at ADR and it’s not uncommon to watch a scene with ADR dialogue and have it jump out at you as the viewer. Since ADR requires extra recording time with the actor, this drives up cost on small films. One workaround in some of these situations is for the production team to recapture the lines separately – immediately after the scene was shot – if the schedule permits. These lines would be recorded wild and may or may not be in sync. The intent is to get the right sonic environment and emotion while you are still there on site. Since these situations are often fast-paced action scenes, sync might not have to be perfect. If close enough, the sound editors can edit the lines into place with an acceptable level of sync so that viewers won’t notice any issues. When it works, it saves ADR time down the road and sounds more realistic.

Click here for Part 2.

©2015 Oliver Peters

More Life for your Mac Pro

I work a lot with a local college’s film production technology program as an advisor, editing instructor and occasionally as an editor on some of their professional productions. It’s a unique program designed to teach hands-on, below-the-line filmmaking skills. The gear has to be current and competitive, because they frequently partner with outside producers to turn out actual (not student) products with a combination of professional and student crews. The department has five Mac Pros that are used for editing, which I’ve recently upgraded to current standards, as they get ready for a new incoming class. The process has given me some thoughts about how to get more life out of your aging Apple Mac Pro towers, which I’ll share here.

To upgrade or not

Most Apple fans drool at the new Mac Pro “tube” computers, but for many, such a purchase simply isn’t viable. Maybe it’s the cost or the need for existing peripherals or other concerns, but many editors are still opting to get as much life as possible out of their existing Mac Pro towers.

In the case of the department, four of the machines are fast 2010 quad-cores and the fifth is a late 2008 eight-core. As long as your machine is an Intel of late 2008 or newer vintage, then generally it’s upgradeable to the most current software. Early 2008 and older is really pushing it. Anything before 2009 probably shouldn’t be used as a primary workhorse system. At 2009, you are on the cusp of whether it’s worth upgrading or not. 2010 and newer would be definitely solid enough to get a few more productive years out of the machine.

The four 2010 Mac Pros are installed in rooms designated as cutting rooms. The 2008 Mac was actually set aside and largely unused, so it had the oldest configuration and software. I decided it needed an upgrade, too, although mainly as an overflow unit. This incoming class is larger than normal, so I felt that having a fifth machine might be useful, since it still could be upgraded.

Software

All five machines have largely been given the same complement of software, which means Mavericks (10.9.4) and various editing tools. The first trick is getting the OS updated, since the oldest machines were running on versions that cannot be updated via the Mac App Store. Secondly, this kind of update really works best when you do a clean install. To get the Mavericks installer, you have to download it to a machine that can access the App Store. Once you’ve done the download, but BEFORE you actually start the installation, quit out of the installer. This leaves you with the Install Mavericks application in your applications folder. This is a 4GB installer file that you can now copy to other drives.

In doing the updates, I found it best to move drives around in the drive bays, putting a blank drive in bay 1 and moving the existing boot drive to bay 2. Format the bay 1 drive and copy the Mavericks installer to it. Run the installer, but select the correct target drive, which should be your new, empty bay 1 drive and NOT the current boot drive that’s running. Once the installation is complete, set up a new user account and migrate your applications from the old boot drive to the new boot drive. I do this without the rest (no documents or preferences). Since these systems didn’t have purchased third-party plug-ins, there weren’t any authorization issues after the migration. My reason for migrating the existing apps was that some of the software, like volume-licensed versions of Microsoft Office and Apple Final Cut Studio were there and I didn’t want to track down the installers again from IT. Naturally before doing this I had already uninstalled junk, like old trial versions or other software a student might have installed in the past. Any needed documents had already been separately backed up.

Once I’m running 10.9.4 on the new boot drive, I access the App Store, sign in with the proper ID and install all the App Store purchases. Since the school has a new volume license for Adobe Creative Cloud, I also have an installer from IT to cover the Adobe apps. Once the software dance is done, my complement includes:

Apple Final Cut Pro Studio “legacy” (FCP 7, DVD Studio Pro, Cinema Tools, Soundtrack Pro, Compressor, Motion, Color)

Apple Final Cut Pro X  “new” applications and utilities (FCP X, Motion, Compressor, Xto7, 7toX, Sync-N-Link X, EDL-X, X2Pro)

Adobe Creative Cloud 2014 (Prelude, Premiere Pro, SpeedGrade, Adobe Media Encoder, Illustrator, Photoshop, After Effects, Audition)

Avid Media Composer and Sorenson Squeeze (2 machines only)

Blackmagic Design DaVinci Resolve 11

Miscellaneous applications (Toast Titanium, Handbrake, MPEG Streamclip, Pages, Numbers, Keynote, Word, Excel, Redcine-X Pro)

Internal hard drives

All Mac Pro towers support four internal drives. Last year I had upgraded two of these machines with 500GB Crucial SSDs as their boot drive. While these are nice and fast, I opted to stick with spinning drives for everything else. The performance demand on these systems is not such that there’s really a major advantage over a good mechanical drive. For the most part, all machines now have four internal 1TB Western Digital Black 7200 RPM drives. The exceptions are the two machines with 500GB SSD boot drives and the 2008 Mac, which has two 500GB drives that it came supplied with.

After rearranging the drives, the configuration is: bay 1 – boot drive, bay 2 – “Media A”, bay 3 – “Media B” and bay 4 – Time Machine back-up. The Media A and B drives are used for project files, short term media storage and stock sound effects and music. When these systems were first purchased, I had configured the three drives in the 2, 3 and 4 slots as a single 3TB volume – a RAID-0 software stripe. This was used as a common media drive on each of the computers. However, over this last year, one of the machines appeared to have an underperforming drive within the stripe, which was causing all sorts of media problems on this machine. Since this posed the risk of potentially losing 3TB worth of media in the future on any of the Macs, I decided to rethink the approach and split all the drives back into single volumes. I replaced the underperforming drive and changed all the machines to this four-volume configuration, without any internal stripes.

RAM and video cards

The 2010 machines originally came with ATI 5870 video cards and the 2008 an older NVIDIA card. In the course of the past year, one of the 5870 cards died and was replaced with a Sapphire 7950. In revitalizing the 2008 Mac, I decided to put one of the other 5870s into it and then replace it in the 2010 machine with another Sapphire. While the NVIDIA GTX 680 card is also a highly-regarded option, I decided to stick with the ATI/AMD card family for consistency throughout the units. One unit also includes a RED Rocket card for accelerated transcoding of RED .r3d files.

The 2010 machines have all been bumped up to 32GB of RAM (Crucial or Other World Computing). The 2008 uses an earlier vintage of RAM and originally only had 2GB installed. The App Store won’t even let you download FCP X with 2GB. It’s been bumped up to 16GB, which will be more than enough for an overflow unit.

Of these cutting rooms, only one is designed as “higher end” and that’s where most of the professional projects are cut, when the department is directly involved in post. It includes Panasonic HD plasma and Sony SD CRT monitors that are fed by an AJA KONA LHi card. This room was originally configured as an Avid Xpress Meridien-based room back in the SD days, so there are also Digibeta, DVCAM and DAT decks. These still work fine, but are largely unused, as most of the workflow now is file-based (usually RED or Canon).

In order to monitor Resolve’s output on an external video display, you need a Blackmagic Design DeckLink card. I had temporarily installed a loaner in place of the KONA, but it died, so the KONA went back in. Unfortunately with the KONA and FCP X, I cannot see video on both the Panasonic and Sony at the same time with 1080p/23.98 projects. That’s because of the limitations of what the Panasonic will accept over HDMI, coupled with the secondary processing options of the KONA. The HDMI signal needs to be P rather than PsF, and this results in the conflict. In the future, we’ll probably revisit the DeckLink card issue, budget permitting, possibly moving the KONA to another bay.

All four 2010 units are equipped with two 27” Apple Cinema Displays, so the rooms without external monitoring simply use one of the screens to display a large viewer in most of the software. This is more than adequate in a small cutting room. The fifth 2008 Mac has dual 20” ACDs. Although my personal preference is to work with something smaller than dual 27” screens – as the lateral distance is too great – a lot of the modern software feels very crowded on smaller screens, such as the 20” ACDs. This is especially true of Resolve 11, which feels best with two 27” screens. Personally I would have opted for dual 23” or 24” HPs or Dells, but these systems were all purchased this way and there’s no real reason to change.

External storage

Storage on these units has always been local, so in addition to the internal drives, they are also equipped with external storage. Typically users are encouraged to supply their own external drives for short edits, but storage is made available for extended projects. The main room is equipped with a large MAXX Digital array connected via an ATTO card. Each of the four 2010 rooms gained a LaCie 4Big 12TB array last year. These were connected on one of the FireWire 800 ports and initially configured as RAID-1 (mirror), so only half the capacity was available.

This year I reconfigured/reformatted them as RAID-5, which nets a bit over 8TB of actual capacity. To increase the data throughput, I also added CalDigit FASTA-6GU3 cards to each. This is a PCIe combo host adapter card that provides two USB 3.0 and two SATA ports. By connecting the LaCie to each of the Macs via USB 3.0, it improves the read/write speeds compared to FireWire 800. While it’s not as fast as Thunderbolt or even the MAXX array, the LaCies on USB 3.0 easily handle ProRes 1080p files and even limited use of native RED files within projects.
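
For those keeping score on the capacity change, here is a quick sketch of the RAID-5 math, assuming the 12TB 4Big is built from four 3TB disks:

# RAID-5 capacity sketch for a "12TB" four-disk array (assumed to be 4 x 3TB)
disks, disk_tb = 4, 3.0
usable_tb = (disks - 1) * disk_tb        # one disk's worth of space goes to parity -> 9.0 TB (decimal)
usable_tib = usable_tb * 1e12 / 2**40    # ~8.2 TiB once counted in binary units
print(usable_tb, round(usable_tib, 2))
# The gap between 9TB and "a bit over 8TB" is mostly decimal-vs-binary accounting plus formatting overhead.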

Other

A few other enhancements were made to round out the rooms as cutting bays. First, audio. The main room uses the KONA’s analog audio outputs routed through a small Mackie mixer to supply volume to the speakers. To provide similar capabilities in the other rooms, I added a PreSonus AudioBox USB audio interface and a small Mackie mixer to each. The speakers are a mix of Behringer Truth, KRK Rokit 5 and Rokit 6 powered speaker pairs, mounted on speaker pedestals behind the Apple Cinema Displays. Signal flow is from the computer to the AudioBox via USB (or the KONA in one room), the channel 1 and 2 analog outputs from the AudioBox (or KONA) into the Mackie, and then the main mixer outputs to the left and right speakers. In this way, the master fader on the mixer is essentially the volume control for the system. This is used mainly for monitoring, but the combination does allow the connection of a microphone for input back into the Mac for scratch recordings. Of course, having a small mixer also lets you plug in another device just to preview audio.

The fifth Mac Pro isn’t installed in a room that’s designated as a cutting room, so it simply got the repurposed Roland powered near field speakers from an older Avid system. These were connected directly to the computer output.

Last, but not least, it’s the little things. When I started this upgrade round, one of the machines was considered a basket case, because it froze a lot and, therefore, was generally not used. That turned out to simply be a bad Apple Magic Mouse. The mouse would mess up, leaving the cursor frozen. Users assumed the Mac had frozen up, when in fact, it was fine. To fix this and any other potential future mouse issues, I dumped all the Apple Bluetooth mice and replaced them with Logitech wireless mice. Much better feel and the problem was solved!

©2014 Oliver Peters

Avid Everywhere

It’s interesting to see that in spite of a lot of press, the Avid Everywhere concept still results in confusion. Avid has certainly been enunciating it since last year, with a full roll-out at NAB this past April. For whatever reason, Avid Everywhere seems to be lumped together with Adobe Anywhere in the minds of many. Maybe it’s the similarity of names or the fact that they both have a cloud component, but they aren’t the same thing. Avid Everywhere is a corporate vision, while Adobe Anywhere is a specific product (more on that later).

Vision and strategy

Avid Technology is a company with a diverse range of hardware and software products, covering content creation (video, audio, graphics, news), asset management, audio/video hardware i/o, consoles and control surfaces, storage and servers. In an effort to consolidate and rebrand a wide-ranging set of offerings, Avid has repackaged these existing (and future) products under the banner of Avid Everywhere. This is a marketing strategy designed to convey the message that whatever your media needs might be, Avid has a product or service to satisfy that need. This is coupled to a community of users that can benefit from their common use of Avid products.

This vision positions Avid’s products as a “platform”, in the same way that Windows, Mac OS X, iOS, Android, Apple hardware and PC hardware are all platforms. Within this platform concept, the products become stratified into product tiers or “suites”. Bear in mind that “suite” really refers to a group of products and not specifically a collection of hardware or software that you purchase as a single unit. The base layer of this platform contains the various software hooks that tie the products together – for example, APIs required to use Media Composer software with Interplay asset management or in an ISIS SAN environment. This is called the Avid MediaCentral Platform.

On top of this sits the Storage Suite, which consists of the various Avid storage solutions, such as ISIS, along with news play-out servers. The next tier is the Media Suite, which encompasses the Interplay asset management and iNews newsroom products. In the transition to the Avid Everywhere strategy, you’ll see a lot of references on Avid’s website and in their marketing literature to “formerly Interplay ___”. That’s because Avid is in the process of rebranding these products into something with a “Media ___” name.

Most users who are editing and audio professionals will mainly associate Avid with the Artist Suite tier. This is the layer of content creation tools, including Media Composer, Pro Tools, Sibelius and the control surfaces that came out of Digidesign and Euphonix, including the Artist panels. If you are a single user of Media Composer, Pro Tools or Sibelius and own no other Avid infrastructure, like ISIS or Interplay, then the entire Avid Everywhere media platform doesn’t touch you very much for now.

The top layer of the platform chart is MediaCentral | UX, which was formerly known as Interplay Central. This is a web front-end that allows you to browse, log and notate Interplay assets from a desktop computer, laptop or mobile device. Although the current iteration is targeted at news production, the concept is story-centric and could provide functionality in other arenas, such as drama and reality series production.

Surrounding the entire structure are support services (tech support and professional integration services) plus a private and public marketplace. Media Composer software has included a Marketplace menu item for a few versions. Until now, this has been a web portal to buy plug-ins and stock footage. The updated vision for this is more along the lines of services like SoundCloud, Adobe’s Behance service or the files section of Creative Cloud. For example, let’s say you are a composer that uses Pro Tools. You create licensable music tracks and post them to the Marketplace. Other users can browse the Marketplace and find your tracks, complete with licensing and payment arrangements. To make this work, the Avid MediaCentral Platform includes things like proper security to enable such transactions.

All clouds are not the same

I started this post with the comment that I feel many editors confuse Adobe Anywhere and Avid Everywhere. I believe that’s because they mistakenly interpret Avid Everywhere as the specific version of the Media Composer product that enables remote-access editing. As I’ve explained above, Everywhere is a concept and vision, not a product. That specific Media Composer product (formerly Interplay Sphere) is now branded as Media Composer | Cloud. As a product, it most closely approximates Adobe Anywhere, but there are key differences.

Adobe Anywhere is a system that requires a centralized server and storage. Any computer with Premiere Pro CC or CC 2014 can remotely access the assets on this system, which streams proxy media back to that computer. All the “heavy lifting” is done at the central site and the editor’s Premiere Pro is effectively working only as a local front-end. The operation does not allow hybrid editing with a combination of local and remote assets. All local assets have to be uploaded to the server and then streamed back to the editor. That’s because Anywhere manages the assets for multiple editors during collaborative workflows and handles project versioning. If you are working on an Anywhere production, you always have to be connected to the network.

In contrast, Media Composer | Cloud is primarily a plug-in that works with an otherwise standard version of the Media Composer software. In order for it to function, the “home base” facility must have an appropriate Interplay/ISIS infrastructure so that Media Composer | Cloud can talk to it. In Avid marketing parlance “you’ve got to get on the platform” for some of these things to work.

Media Composer | Cloud permits hybrid editing. For example, a news videographer in the field can be editing at the proverbial Starbucks using local assets. Maybe part of the story requires access to past b-roll footage that lives back at the station on its newsroom storage. Through Media Composer | Cloud and Interplay, the videographer can access those files as proxies and integrate them into the piece. Meanwhile, local assets can be uploaded back to the station. When the piece is cut, a “publish” command (an AAF of the sequence) goes back to the station for quick turnaround to air. Media Composer | Cloud, by its nature, doesn’t require continuous connection, so editing can continue during transit, such as in a vehicle.

While not everything about Avid Everywhere has been fully implemented yet, it certainly is an aggressive strategy. It is an attempt to move the company as a whole into areas beyond just editing software, while still allowing users and owners to leverage their Avid assets into other opportunities.

©2014 Oliver Peters

The Ouch of 4K Post

4K is the big buzz. Many in the post community are wondering when the tipping point will be reached – the point at which their clients demand 4K masters. 4K acquisition has been with us for a while and has generally proven to be useful for its creative options, like reframing during post. This has been possible long before the introduction of the RED One camera, if you were shooting on film. But acquiring in 4K and higher is quite a lot different than working through a complete 4K post production pipeline.

There are a lot of half-truths surrounding 4K, so let me tackle a couple. When we talk about 4K, the moniker applies only to frame dimensions in pixels, not resolution, as in sharpness. There are several 4K dimensions, depending on whether you mean cinema specs or television specs. The cinema projection spec is 4096 x 2160 (1.9:1 aspect ratio) and within that, various aspects and frame sizes can be placed. The television or consumer spec is 3840 x 2160 (16:9 or 1.78:1 aspect ratio), which is an even multiple of HD at 1920 x 1080. That’s what most consumer 4K TV sets use. It is referred to by various labels, such as Ultra HD, UHD, UHDTV, Quad HD, 4K HD and so on. If you are delivering a digital cinema master it will be 4096 pixels wide, but if you deliver a television 4K master, it will be 3840 pixels wide. Regardless of which format your deliverable will be, you will most likely want to acquire at 4096 x 2304 (16:9) or larger, because this gives you some reframing space for either format.

This brings us to resolution. Although the area of the 4K frame is 4x that of a 1080p HD frame, the actual resolution is only theoretically 2x better. That’s because resolution is measured based on the vertical dimension and is a factor of the ability to resolve small detail in the image (typically based on thin lines of a resolution chart). True resolution is affected by many factors, including lens quality, depth of field, accuracy of the focus, contrast, etc. When you blow up a 35mm film frame and analyze high-detail areas within the frame, you often find them blurrier than you’d expect.
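
A quick illustration of that area-versus-resolution distinction, using the UHD and HD dimensions mentioned earlier:

# Pixel count vs. linear resolution: UHD against 1080p HD
hd_w, hd_h = 1920, 1080
uhd_w, uhd_h = 3840, 2160
area_factor = (uhd_w * uhd_h) / (hd_w * hd_h)   # 4.0 -> four times as many pixels
linear_factor = uhd_h / hd_h                    # 2.0 -> only twice the resolving potential (vertical)
print(area_factor, linear_factor)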

This brings us to post. The push for 4K post comes from a number of sources, but many voices in the independent owner-operator camp have been the strongest. These include many RED camera owners, who successfully cut their own material straight from the native media of the camera. NLEs, like Adobe Premiere Pro CC and Apple Final Cut Pro X, make this a fairly painless experience for small, independent projects, like short films and commercials. Unfortunately it’s an experience that doesn’t extrapolate well to the broader post community, which works on a variety of projects and must interchange media with numerous other vendors.

The reason 4K post seems easy and viable to many is that the current crop of 4K cameras works with highly compressed codecs and many newer computers have been optimized to deal with these codecs. Therefore, if you shoot with a RED (Redcode), Canon 1DC (Motion-JPEG), AJA Cion (ProRes), BMD URSA (ProRes) or Sony F55 (XAVC), you are going to get a tolerable post experience using post-ready, native media or by quickly transcoding to ProRes. But that’s not how most larger productions work. A typical motion picture or television show will take the camera footage and process it into something that fits into a known pipeline. This usually means uncompressed DPX image sequences, plus proxy movies for the editors. This allows a base level of color management that can be controlled through the VFX pipeline without each unit along the way adding their own color interpretation. It also keeps the quality at its highest by avoiding further decompression/recompression cycles, as well as variations among debayering methods.

Uncompressed or even mildly compressed codecs mean a huge storage commitment for an ongoing facility. Here’s a quick example. I took a short RED clip that was a little over 3 minutes long. It was recorded as 4096 x 2304 at 23.976fps. This file was a bit over 7GB in its raw form. Then I converted this to these formats with the following results:

ProRes 4444 – 27GB

ProRes HQ (also scaled to UHD 3840 x 2160) – 16GB

Uncompressed 10-Bit – 116GB

DPX images (10-bits per channel) – 173GB

TIFF images (8-bits per channel) – 130GB

As you can see, storage requirements increase dramatically. This can be mitigated by tossing out some data, as the ProRes 4444 versus down-sampled ProRes HQ comparison shows. It’s worth noting that I used the lower DPX and TIFF color depth options, as well. At these settings, a single 4K DPX frame is 38MB and a single 4K TIFF frame is 28MB.
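
The per-frame figures are easy to verify from the pixel dimensions alone. A sketch, ignoring the small file headers:

# Uncompressed frame sizes for the 4096 x 2304 example above
w, h = 4096, 2304
dpx_frame = w * h * 4          # 10-bit RGB DPX packs each pixel into a 32-bit word -> ~37.7 MB
tiff_frame = w * h * 3         # 8-bit RGB TIFF, one byte per channel -> ~28.3 MB
dpx_per_sec = dpx_frame * 24   # at 24fps that's roughly 0.9 GB for every second of footage
print(dpx_frame / 1e6, tiff_frame / 1e6, dpx_per_sec / 1e9)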

For comparison, a complete 90-100 minute feature film mastered at 1920 x 1080 (23.976fps) as ProRes HQ will consume about 110-120GB of storage. UHD is still 4x the frame area, so if we use the ProRes HQ example above, 30x that 3 min. clip would give us the count for a typical feature. That figure comes out to 480GB.

This clearly has storage ramifications. A typical indie feature shot with two RED cameras over a one-month period, will likely generate about 5-10TB of media in the camera original raw form. If this same media were converted to ProRes444, never mind uncompressed, your storage requirements just increased to an additional 16-38TB. Mind you this is all as 24p media. As we start talking 4K in television-centric applications around the world, this also means 4K at 25, 30, 50 and 60fps. 60fps means 2.5x more storage demands than 24p.
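
Scaling the 3-minute test clip up to feature length makes the point. A sketch using the sizes listed above:

# Feature-length storage estimate from the 3-minute UHD ProRes HQ clip (16GB)
clip_gb, clip_min, feature_min = 16, 3, 90
feature_gb = clip_gb * (feature_min / clip_min)   # ~480 GB for a 90-minute UHD ProRes HQ master
feature_60p_gb = feature_gb * (60 / 24)           # ~1,200 GB if the same material were 60fps
print(feature_gb, feature_60p_gb)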

The other element is system performance. Compressed codecs work when the computer is optimized for them. RED has worked hard to make Redcode easy to work with on modern computers. Apple ProRes enjoys near ubiquitous playback support. ProRes HQ, even at 4K, will play reasonably well from a two-drive RAID-0 stripe on my Mac Pro. Redcode plays if I lower the debayer quality. Once you start getting into uncompressed files and DPX or TIFF image sequences, it takes a fast drive array and a fast computer to get anything approaching consistent real-time playback. Therefore, the only viable workflow is an offline-online editorial system, since creative editorial generally requires multiple streams of simultaneous media.

This workflow gets even worse with other cameras. One example is the Canon C500, which records 4K camera raw files to an external recorder, such as the Convergent Design Odyssey 7Q. These are proprietary Canon camera raw files, which cannot be natively played by an NLE. These must first be turned into something else using a Canon utility. Since the Odyssey records to internal SSDs, media piles up pretty quickly. With two 512GB SSDs, you get 62 minutes of record time at 24fps if you record Canon 4K raw. In the real world of production, this becomes tough, because it means you either have to rent or buy numerous SSDs for your shoot or copy and reuse as you go. Typically transferring 1TB of data on set is not a fast process.
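
Working backwards from those record times gives a feel for the data rate and for how long the on-set offloads take. A sketch – the 150 MB/s sustained copy speed is my assumption, not a figure from the camera spec:

# Implied Canon 4K raw data rate from 62 minutes on two 512GB SSDs
total_gb, record_min = 2 * 512, 62
rate_mb_s = total_gb * 1000 / (record_min * 60)            # ~275 MB/s while rolling at 24fps
copy_speed_mb_s = 150                                      # assumed sustained speed of an on-set transfer drive
offload_hours = total_gb * 1000 / copy_speed_mb_s / 3600   # ~1.9 hours to offload a full pair of SSDs
print(round(rate_mb_s), round(offload_hours, 1))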

Naturally there are ways to make 4K post efficient and not as painful as it might otherwise be. But it requires a commitment to hardware resources. It’s not conducive to easy desktop post running off of a laptop, like DV and even HD have been. That’s why you still see Autodesk Smokes, Quantel Pablo Rios and other high-end systems dominate at the leading facilities. Think, plan and buy before you jump in.

©2014 Oliver Peters

Avid Media Composer | Software v8

At NAB Avid presented its Avid Everywhere concept. While Everywhere is an over-arching marketing concept – involving “the cloud”, storage, asset management, a marketplace and more – for most independent editors, Avid is all about Media Composer and/or Pro Tools. Given that, there’s very little in the Everywhere concept that affects these users. The most salient part is a restructuring of licensing and software options.

Media Composer and the options

Avid’s flagship NLE is now known as Media Composer | Software and version numbers are only internal, rather than part of the product branding. Avid released Media Composer version 8.0 in May, but it is only known as Media Composer. Added to this are three options: Media Composer | Symphony, Media Composer | NewsCutter and Media Composer | Cloud. NewsCutter, which always was a variation of Media Composer, is now sold as an option, which adds news-centric features to the interface. Media Composer | Cloud (formerly known as Interplay Sphere) is essentially a plug-in to Media Composer that allows remote access to an Avid asset management and storage system. NewsCutter and Cloud require a larger facility infrastructure, so I’ll skip them in this discussion. They have little bearing on what most independent editors do.

Two other past options, PhraseFind and ScriptSync, are currently not available, as these are based on a phonetic search engine technology licensed from Nexidia. Avid and Nexidia are in current discussions for a new licensing arrangement. While many editors rely on this technology, most do not. It is important to realize that Avid’s script integration and the internal Find tool are not completely tied to this technology and continue to work fine. The Nexidia options add a level of automation to the process through a phonetic match-up between waveforms and typed text.

Without ScriptSync, you can still create script-based bins, but the alignment of takes to script lines has to be done manually. Without PhraseFind, you can still search for text found in bin fields, but you cannot search by audio. Nexidia sells its own products and also licenses another application for editors, which is sold through BorisFX as Soundbite. This is a standalone application geared to Final Cut and Premiere, but it is not compatible with Media Composer. Until this gets resolved, Avid has advised editors who are dependent on ScriptSync or PhraseFind not to upgrade past Media Composer version 7 software. Resellers still have these options available, in a version that is compatible with earlier versions of Media Composer.

Enter the new model

Media Composer version 8 is the first release of the application under the new licensing guidelines. You can now buy or rent Media Composer using three methods: perpetual license (own), subscription (rental) or floating license. The latter applies to larger facilities that are interested in purchasing “packs” of 20 or 50 perpetual licenses, which can be assigned to various machines as their production needs shift. The subscription license is based on an annual commitment ($49.99/mo-individual) or month-by-month ($74.99/mo-individual) rental and may be used by individuals or facilities. For example, facilities may have a number of perpetual licenses, but need to add a few seats of Media Composer for several months to accommodate an incoming, short-term production. They could choose to augment their “owned” licenses with additional subscription licenses to get through this immediate production crunch.

Most customers are likely to be interested in the changes in how you “own” the software, as the perpetual license model has changed from that of the past. When you now purchase Media Composer | Software, the cost is $1299, which covers the cost of the software plus one year of Avid support and any upgrades within the course of that year. (The actual support portion of that includes unlimited tech support over the web and one tech support phone call per month.) Customers still interested in a hardware license key (dongle) may purchase one for an additional $500. The Symphony option adds $749 to the bill. Current Media Composer owners (MC 6.5 or higher) can upgrade to MC 8 simply by purchasing a single year of support at $299 before the end of 2014. No matter how they got there (new purchase or renewal of an existing license) the software license is now on the current plan.
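As a rough comparison of the purchase paths, here is what three years looks like at the prices quoted above (individual license, no Symphony option, assuming today’s prices hold):

# Three-year cost comparison at current published prices
perpetual = 1299 + 299 * 2      # purchase (includes year 1 support) plus two annual renewals -> $1,897
sub_annual = 49.99 * 36         # annual-commitment subscription -> ~$1,800
sub_monthly = 74.99 * 36        # month-to-month subscription -> ~$2,700
print(perpetual, round(sub_annual, 2), round(sub_monthly, 2))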

The important thing is that you have to renew again at the end of the first year of support. This is where the complaints have come in. As long as you renew your support contract each year at $299 (current price) then you will get any Avid updates to the software without having to purchase a separate software upgrade. (In the past, a Media Composer version upgrade has been more expensive than that year of support.) However, if you decide to let the support lapse for a year and then decide you want an upgrade, you will have to repurchase the product and any options anew.

Let’s say you bought Media Composer with the Symphony option – $1299 + $749. Hypothetically, by the end of the first year, Media Composer | Software has moved up to v8.5 and then you decide not to renew. From that point on, your version is “frozen” and cannot be upgraded. A year later, Media Composer | Software v10 comes out with enough compelling features to get you back on board. You cannot renew your v8.5 software license to upgrade, but instead have to purchase the current version Media Composer and Symphony again. Now you have two licenses: MC v8.5 and MC v10. Both work, but the older one is not upgradeable while the newer one is, as long as you renew its support contract after the end of the first year from the time of purchase.

Third-party bundles

In addition to the Nexidia issue, Avid now offers fewer third-party applications and effects as a bundle with the software. With the last few versions, you received Avid DVD, AvidFX, Sorenson Squeeze and BorisFX BCC filters (BCC only with the Symphony option). Avid DVD is no longer being developed. Variations of the others are now sold with a separate Production Pack third-party bundle. It gets a little confusing, because the options vary a bit between the perpetual and subscription models. If you buy the software, you now only get the NewBlueFX Titler Pro 1 and a starter set of their filters. “Lite” versions of Sorenson Squeeze and BCC (4 effects only) are offered with the subscription model. Since these are third-party products, you can still purchase them independently and existing versions that you already own will continue to work with Media Composer. BorisFX is offering upgrade deals to their products from past versions. Since AvidFX is simply an OEM version of Boris RED, one of their current deals is to upgrade from AvidFX to Boris RED 5.5 for $295. You can also upgrade to BCC 9 AVX for $599.

It’s a shame to lose the tools that were included in the past, but it really boils down to a consequence of the industry’s “race to the bottom”. At the prices that Avid currently sells Media Composer | Software, there simply is no margin left over to make third-party bundling deals. Developers aren’t going to accept a pittance just to be packaged with Media Composer. From the customer’s angle, you still have a decent set of audio and video filters included with Media Composer, including the NewBlueFX starter filters, Avid Illusion effects and the built-in Animatte effects tools. If you need more than that, you’ll simply have to purchase other plug-ins.

What to do

You own Media Composer version 7. What should you do now? The good news is that there’s no urgency to upgrade. MC v8 is essentially the same as v7.0.4, except with a new resident license tool (Application Manager). There are no new compelling features in MC v8 itself. Avid has promised one or more upgrades to happen during the year and resolution independence has been mentioned as a technology that will come to Media Composer (although with no specific commitment to a timeframe). You have until the end of the year to spend your $299 for support and get onto version 8.x. The smart money is advising to wait a few months and see what the next update brings. If it’s compelling, then you can take advantage of the deal and purchase the annual support, which gives you access to the new software (if you are on a recent version of Media Composer). The advantage to this is that the one-year clock starts at that time, so the later in the year that you do this, the longer you have before you need to renew again.

Changes like this always create a certain amount of tension. That’s been clear in the debates around Adobe’s shift to subscription with Creative Cloud. Users will inevitably compare the new costs to their old upgrade patterns and what the software used to cost them. I’m not sure that’s entirely fair, since financial pressures change and none of these companies have ever said that changes to their pricing wouldn’t happen, if it’s necessary. It seems to me that Avid has adopted the best blend of purchase and rental that I’ve seen among the NLE companies. There’s an incentive to stay current with the software, which is both to Avid’s and the customer’s advantage. If you were a loyal user who stayed current and always bought the upgrades when they came out, then the new deal is better for you financially. If you tended to sit on old versions and only sporadically upgraded, then you are likely to pay more this way. No right or wrong – just the way it is.

©2014 Oliver Peters

Final Cut “Studio 2014”

A few years ago I wrote some posts about Final Cut Pro as a platform and designing an FCP-centric facility. Those options have largely been replaced by an Adobe approach built around Creative Cloud. Not everyone has warmed up to Creative Cloud. Either they don’t like the software or they dislike the software rental model or they just don’t need much of the power offered by the various Adobe applications.

If you are looking for alternatives to a Creative Cloud-based production toolkit, then it’s easy to build your own combination with some very inexpensive solutions. Most of these are either Apple software or others that are sold through the Mac App Store. As with all App Store purchases, you buy the product once and get updates for free, so long as it continues to be sold as the same product. Individual users may install the apps onto as many Mac computers as they personally own and control, all for the one purchase price. With this in mind, it’s very easy for most editors to create a powerful bundle that’s equal to or better than the old Final Cut Studio bundle – at less than its full retail price back in the day.

The one caveat to all of this is how entrenched you may or may not be with Adobe products. If you need to open and alter complex Illustrator, Photoshop, After Effects or Premiere Pro project files, then you will absolutely need Adobe software to do it. In that case, maybe you can get by with an old version (CS6 or earlier) or maybe trial software will work. Lastly you could outsource to a colleague with Adobe software or simply pick up a Creative Cloud subscription on a month-by-month rental. On the other hand, if you don’t absolutely need to interact with Adobe project files, then these solutions may be all you need. I’m not trying to advocate for one over the other, but rather to add some ideas to think about.

Final Cut Pro X / Motion / Compressor

The last Final Cut Studio bundle included FCP 7, Motion, Compressor, Cinema Tools, DVD Studio Pro, Soundtrack Pro and Color. The current Apple video tools of Final Cut Pro X, Motion and Compressor cover all of the video bases, including editing, compositing, encoding, transcoding and disc burning. The latter may be a bone of contention for many – since Apple has largely walked away from the optical disc world. Nevertheless, simple one-off DVDs and Blu-ray discs can still be created straight from FCP X or Compressor. Of course, FCP X has been a mixed bag for editors, with many evangelists and haters on all sides. If you square off Premiere Pro against Final Cut Pro X, then it really boils down to tracks versus trackless. Both tools get the job done. Which one do you prefer?

Motion versus After Effects is a tougher call. If you are a power user of After Effects, then Motion may seem foreign and hard to use. If the focus is primarily on motion graphics, then you can certainly get the results you want in either. There is no direct “send to” from FCP X to Motion, but on the plus side, you can create effects and graphics templates using Motion that will appear and function within FCP X. Just like with After Effects, you can also buy stock Motion templates for graphics, show opens and other types of design themes and animations.

Logic Pro X

Logic Pro X is the DAW in our package. It becomes the replacement for Soundtrack Pro and the alternative to Adobe Audition or Avid Pro Tools. It’s a powerful music creation tool, but more importantly for editors, it’s a strong single file and multitrack audio production and post production application. You can get FCP X files to it via FCPXML or AAF (converted using X2Pro). There are a ton of plug-ins and mixing features that make Logic a solid DAW. I won’t dive deeply into this, but suffice it to say that if your main interest in using Logic is to produce a better mix, then you can learn the essentials quickly and get up and running in short order.

DaVinci Resolve

Every decent studio bundle needs a powerful color correction tool. Apple Color is gone, but Blackmagic Design’s DaVinci Resolve is a best-of-breed replacement. You can get the free Resolve Lite version through the App Store, as well as from Blackmagic’s website. For most editors who do some color correction, it does most of everything you need, so there’s little reason to buy the paid version.

Resolve 11 (due out soon) adds improved editing. There is a solid synergy with FCP X, making it not only a good companion color corrector, but also a finishing editorial tool. OFX plug-ins are supported, which adds a choice of industry standard creative effects if you need more than FCP X or Motion offer.

Pixelmator / Aperture

This one’s tough. Of all the Adobe applications, Photoshop and Illustrator are the hardest to replace. There are no perfect alternatives. On the other hand, most editors don’t need all that power. If direct feature compatibility isn’t a need, then you’ve got some choices. One of these is Pixelmator, a very lightweight image manipulation tool. It’s a little like Photoshop in the version 4-7 stages, with a mix of Illustrator tossed in. There are vector drawing and design tools and it’s optimized for Core Image, complete with a nice set of image filters. However, it does not include some of Photoshop CC’s power user features, like smart objects, smart filters, 3D, layer groups and video manipulation. But, if you just need to doctor some images, extract or modify logos or translate various image formats, Pixelmator might be the perfect fit. For more sophistication, another choice (not in the App Store) is Corel’s Painter, as well as Adobe Photoshop Elements (also available at the App Store).

Although Final Cut Studio never included a photo application, the Creative Cloud does include Lightroom. Since the beginning, Apple’s Aperture and Adobe’s Lightroom have been leapfrogging each other with features. Aperture hasn’t changed much in a few years and is likely the next pro app to get the “X” treatment from Apple’s engineers. Photographers have the same type of “Chevy vs. Ford” arguments about Aperture and Lightroom as editors do about NLEs. Nevertheless, editors deal a lot with supplied images and Aperture is a great tool to use for organization, clean up and image manipulation.

Other

The list I’ve outlined creates a nice set of tools, but if you need to interchange with other pros using a variety of different software, then you’ll need to invest in some “glue”. There are a number of utilities designed to go to and from FCP X. Many are available through the App Store. Examples include Xto7, 7toX, EDL-X, X2Pro, Shot Notes X, Lumberjack and many others.

For a freewheeling discussion about this topic and other matters, check out my conversation with Chris Fenwick at FCPX Grille.

©2014 Oliver Peters