IllumiNations 2000: Reflections of Earth


One of the coolest projects I’ve ever worked on is about to enter its tenth year. In 2009, the nighttime lagoon show at Walt Disney World’s EPCOT theme park enters what may well be the last year of its current version. Many theme park attractions are refreshed or changed periodically and I suspect that, the economy notwithstanding, it will be time to revamp this popular show as well. IllumiNations 2000: Reflections of Earth was designed to usher in the new millennium and was to be activated on New Year’s Eve 1999. For logistical reasons, Reflections of Earth was actually fired up in October 1999. This marked the culmination of a nearly yearlong effort on the part of the video team and a total of several years for the show designers.

 

Themed attractions are often out of the ordinary and this was no exception. IllumiNations 2000: Reflections of Earth closes out each night at EPCOT in a celebration of fire, fireworks, lasers, fountains and a 29-foot-tall globe that opens like flower petals – all tied together with an outstanding music score. The globe is mounted on a floating barge. A series of LED video screens in the shape of five continental masses (North and South America, Eurasia, Africa and Australia) is mounted onto the skeletal struts of the globe. Much of the show plays out as video on these screens in harmony with the music and fireworks.

 

Unlike other Disney shows, Reflections didn’t consist of a cast of Disney characters, but instead was designed as a celebration of humankind. The show loosely takes the audience on a journey that starts with the big bang and symbolically progresses through the formation of earth, water and land, then to the creation of plants and animals and finally to the introduction of humans and their impact through transportation, architecture, creativity and communication. This story is told visually in a combination of stock footage, original footage and some animation.

 


 

Reflections of Earth was driven by show director Don Dorsey, an independent creative consultant who handled every aspect of the show’s design, including music, video and fireworks. The score itself is an outstanding piece of work composed by Gavin Greenaway. It features an orchestra consisting of London Symphony and Royal Philharmonic players, recorded at Abbey Road Studios. Greenaway is closely associated with Hans Zimmer and mixed the final score at Zimmer’s Media Ventures facility. Even if you’ve never visited EPCOT, you’ve probably heard pieces of this score under ABC network special event promos and bumpers.

 

Creating the video content

 

Century III at Universal Studios Florida won the bid to produce the video portion of Reflections. In the year that we developed the video content for the earth globe, I served as one of CIII’s project managers, as well as the lead editor/compositor. As one of the leads, I headed up a remarkable team of artists, animators, directors and editors who contributed to the success of Reflections of Earth.

 

You’d expect that screens on a 29-foot-tall globe would be of extremely high resolution, but in fact, the opposite is true. The five continental masses amount to about 15,000 LED clusters. These are electronically linked to pixels on the graphics display card of the computer playing back the files. The video itself appears as a flat map of the earth that fits into a frame size of only 364×160 pixels. Of course, just like a real map, much of this is blank where the oceans are. Our world map aligns to the upper left corner of the display card and these pixels correspond to their companion LEDs on the globe.
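
To make the pixel-to-LED relationship concrete, here’s a minimal Python sketch of the idea – not the show’s actual player code, and the lookup table and function names are hypothetical. Every raster position that falls on a continent drives one LED cluster, while ocean positions are simply never wired:

```python
from PIL import Image

FRAME_W, FRAME_H = 364, 160  # the frame size cited above

def frame_to_led_values(frame_path, led_map):
    """led_map: {(x, y): cluster_id} -- a hand-wired lookup from raster
    coordinates to physical LED clusters (hypothetical structure)."""
    frame = Image.open(frame_path).convert("RGB")
    assert frame.size == (FRAME_W, FRAME_H)
    # each wired pixel's RGB value becomes its cluster's color;
    # anything not in led_map (the oceans) is ignored
    return {cluster: frame.getpixel(xy) for xy, cluster in led_map.items()}
```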

 


 

Early on, we had lobbied for a doubling of the screens’ resolution to at least 720 pixels wide. This idea was quickly dropped, because such an increase actually scales as a square factor that would affect not only the screens, but also the weight on the barge, the time needed to hand-wire the LED matrices and, last but not least, the budget. The barge and globe sit at the center of EPCOT’s World Showcase Lagoon and audience viewing distances range from 500 to 700 feet. In reality, the resolution of these 364×160 pixel files turned out to be acceptable.
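
The square-factor argument is easy to verify with a bit of arithmetic (the 15,000-cluster figure comes from earlier in the article):

```python
# doubling linear pixel density quadruples the pixel (and LED) count
orig_w, orig_h = 364, 160
raster = orig_w * orig_h                 # 58,240 raster positions
doubled = (2 * orig_w) * (2 * orig_h)    # 232,960 positions
print(doubled / raster)                  # 4.0 -- a square factor
print(15_000 * 4)                        # ~60,000 clusters to hand-wire
```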

 

Test, test, test …

 

We actually spent quite a lot of time testing images for their readability. The company that manufactured the screens and the player software was able to provide us with a sample panel that eventually ended up as the Australia screen. Even cheated to a proportion larger than Australia really is relative to the other land masses, this panel was only about 40×40 pixels – about the size of a computer desktop icon.

 


 

Between the lack of pixel resolution and the different ways in which LED screens display contrast, saturation and gamma, a lot of testing was required. Our Australia test screen was set up at the Magic Kingdom maintenance area in a location where we could actually view images at the proper distance. If an image was discernible on Australia, then there would be no difficulty identifying images on larger screens, like North America. In the end, thousands of images were reviewed for technical and creative suitability. Eventually this process whittled the content down to roughly 400 stock clips that made the cut.
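
You can roughly approximate this kind of readability test on a desktop today: crunch an image down to the Australia panel’s ~40×40 pixels, then blow it back up with nearest-neighbor sampling to see what survives. A quick Pillow sketch, offered only as an approximation – the real tests used the physical LED panel at the proper viewing distance:

```python
from PIL import Image

def preview_on_panel(path, panel=(40, 40), display=(400, 400)):
    img = Image.open(path).convert("RGB")
    small = img.resize(panel, Image.LANCZOS)       # what the panel "sees"
    return small.resize(display, Image.NEAREST)    # enlarged for judging

preview_on_panel("test_shot.jpg").show()  # hypothetical file name
```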

 

Putting it all together

 

As the lead editor for this show, I ended up doing most of the post on an Avid Media Composer (version 7.2) with AVR-77 as the best image resolution. Media Composer isn’t usually considered a compositor, but it was ideal for this project, with some sections having over 50 video tracks. Not every 4×3 image neatly fit a sweet spot inside our oddly-shaped land mass screens. Many clips had to be augmented to blend or extend portions of the image to bleed outside of the matte formed by the continent’s shape. This task was handled on the Avid Media Illusion compositor. [Illusion was eventually discontinued by Avid. Many of its features ended up as part of the Avid DS tool set.] Once a clip was fixed, I’d edit, assemble and composite it into each continent as a full-screen image (720×486). The tracks for the five continental masses would then be nested into a timeline combining all five sets of composites into a master sequence resembling the flattened map of the earth.
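
Here’s a rough Pillow sketch of that final assembly idea: each continent’s composite is laid onto the flat world map through its land-mass matte. In the actual workflow this happened inside the Avid at 720×486 before downscaling; the mattes, file names and offsets below are hypothetical stand-ins.

```python
from PIL import Image

def assemble_world_map(continents, canvas_size=(364, 160)):
    """continents: list of (composite_path, matte_path, (x, y) offset)."""
    canvas = Image.new("RGB", canvas_size, "black")  # oceans stay blank
    for comp_path, matte_path, offset in continents:
        comp = Image.open(comp_path).convert("RGB")
        matte = Image.open(matte_path).convert("L")  # white = land mass
        canvas.paste(comp, offset, matte)            # paste through the matte
    return canvas
```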

 


 

The master timeline would be exported, scaled and cropped into the 364×160 AVI file. We soon realized that the late-90s-era PC on the barge didn’t have the performance to play and decode compressed files, so these AVIs are actually uncompressed. In fact, the show is divided into seven files, which are played back through custom playlist software. Reflections of Earth is edited to music and the show is driven by timecode. The start of the video is only triggered once and continues in free run for the entire performance. The music plays separately from an external multitrack source, so our biggest fear was that the video would drift out of sync. Much to our delight, these AVIs held sync and, in fact, I’ve never seen the video hiccup in all the years that I’ve been back to see the show.
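
That “trigger once, then free run” behavior is worth a quick illustration. A minimal Python sketch of the concept – the real player was custom software on the barge PC, so everything here is a hypothetical stand-in:

```python
import time

def free_run(trigger, playlist, play_clip):
    """playlist: list of (clip_path, duration_seconds) tuples."""
    trigger()                  # the single timecode trigger at show start
    start = time.monotonic()   # everything free-runs off this one clock
    elapsed = 0.0
    for clip, duration in playlist:
        # schedule each clip against the original trigger point, not the
        # previous clip, so small errors can't accumulate across the show
        while time.monotonic() - start < elapsed:
            time.sleep(0.001)
        play_clip(clip)
        elapsed += duration
```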

 

The audience’s delight

 

One of the unique aspects of working on a show like Reflections of Earth is that you can actually see it as often as you like and experience it with a real audience – feeling their reactions anew. We designed the show with a rotating globe and different images on all sides. Through a bit of guesswork, we settled on a speed of three rotations per minute. This means that you see any given image for a maximum of 20 seconds – even less, when you consider how long the audience needs to lock onto the sweet spot for each shot. By design, you will not see all the images during one performance of the show. The next time you see the show, or if you stand in a different place, the experience will be new again. Hopefully you’ll discover images and think about concepts and ideas that are different from the last time.

 


 

 

Don Dorsey’s intent was not to be literal in how and why our team placed certain images onto specific screens. Sometimes images not specific to European or Asian history or culture might be placed on the Eurasian continental mass, simply because it provided the ideal canvas for that image. A fun moment of serendipity came in the transportation section. We had used an image of a Viking ship, which fit nicely across our Eurasian canvas and instantly communicated the idea of ancient travel and the spirit of exploration. EPCOT happens to have a Norway pavilion among the many countries that surround the lagoon. These attractions employ foreign students who work as part of the EPCOT cast. The first night that Reflections of Earth ran publicly, some of these cast members were outside watching the show. It just so happened that the rotation aligned the Viking ship to face Norway – resulting in a resounding cheer from the Norwegians. I’ve been back a number of times and seen this happen more than once.

 

Disney doesn’t release attendance figures, but if Reflections of Earth makes its full ten-year run, it has been estimated that at least 70 million people will have seen this show. There are few projects that any of us have worked on with that sort of an audience! IllumiNations 2000: Reflections of Earth was intended to celebrate the start of the new millennium with an expression of hope for humankind. So far that has been a challenge, but it’s nice to know that spirit is still alive somewhere. Here’s hoping we will rekindle that spirit worldwide some day soon.

 

© 2008 Oliver Peters

Final Cut Pro as a Platform


I’ve had a lot of hits on my Avid vs. FCP post, but when I showed it to some of my friends over at Apple, they suggested one point of view that I hadn’t considered: namely, that Final Cut Pro should be viewed not just as another NLE, but rather as a development platform. I think that argument could be made for After Effects or Pro Tools, but since FCP is tightly integrated with Apple hardware, the Mac OS and QuickTime, the development possibilities go much deeper. Although the Final Cut Pro ecosystem isn’t open source software, the development community shares many of the same approaches as the open source community.

 

Apple’s Final Cut Studio – and particularly Final Cut Pro – revolves around several open architectures, including core technologies of Mac OS X itself, QuickTime, XML, FxScript, FxPlug, Apple Events, Log and Transfer, MIDI, FireWire and the PCI Express hardware bus. These allow outside developers to produce hardware and software products that extend the functionality of Final Cut Studio beyond Apple’s own development. QuickTime is the most obvious and that’s the media architecture upon which FCP is based. If you develop for QuickTime, most likely that product will work with a whole host of QuickTime-compliant products, including After Effects, Media 100 and Premiere Pro, as well as Final Cut.

 

Hardware

 

On the hardware side, Apple has chosen to leave video development to others. As graphics display cards become more powerful and processing is increasingly offloaded to the GPU, Final Cut Pro is able to ride on the power of OpenGL and newer, faster products manufactured by Nvidia and ATI. Core Apple technologies like Core Image and FxPlug rely on this power. Video I/O is another area that Apple has left to others. Today, AJA, Blackmagic Design, MOTU and Matrox make the main products that editors use for ingest and output of full-bandwidth video. If you toss FireWire and audio interfaces into this mix, then you can also include companies like Convergent Design, Thomson (Canopus) and Digidesign. Add to this list past developers, such as Digital Voodoo, Pinnacle and Aurora, and you can see that the hardware development sphere that services Final Cut Pro is quite a bit larger than for nearly all other NLE companies.

 

Plug-ins

 

On the software side, the greatest number of developers would be those designing plug-ins. Current versions of FCP work with two video effects plug-in APIs: FxScript and FxPlug. FxScript is the older API and you can find many free or low-cost effects filters on the Internet created by enterprising editors and dedicated software developers. FCP even includes the FXBuilder tool, which allows users to modify or create effects and transitions on their own. FxScript plug-ins generally can’t be copy-protected, which is why many of them are offered for free.

 

On the other hand, FxPlug is a more advanced framework for effects that takes advantage of OpenGL. Effects can be of higher quality and greater complexity, while maintaining a level of real-time performance based on OpenGL acceleration. As a more advanced API, it also lets developers protect their filters, and at the same time one set of filters can be used not only in FCP but also in Motion. One developer, Noise Industries, even offers the ability to create your own effects using Apple’s developer tool, Quartz Composer. The FxFactory filters offered by Noise Industries and its partners can be installed in FCP, Motion and now also After Effects on Intel Macs.

 

Less known is that Final Cut’s Log and Transfer module is also a plug-in API. This is the conduit currently used by RED and Panasonic to import native media into Final Cut Pro. In the case of Panasonic, DVCPRO HD is simply rewrapped, but AVC-Intra is transcoded into ProRes 422. The newest version for RED now permits importing REDCODE files as rewrapped native media. Add to this the fact that FCP also utilizes three audio filter APIs: FCP’s own, Mac Audio Units and VST. Altogether, this makes Final Cut Studio one of the richest plug-in environments of any NLE.

 

In contrast, Avid is locked into one flavor of MXF as a media structure, which primarily uses Avid codecs. The exception is compatibility with many of the DV and AVC-Intra codecs. You cannot take QuickTime or AVI file formats natively into an Avid NLE without transcoding the media upon ingest. Hardware is limited to Avid’s own and the plug-in architecture is strictly AVX, which only works with Avid products.

 

When you compare these two situations, it boils down to the fact that there are simply more developers creating products for use with Final Cut Pro, Final Cut Studio or Final Cut Express than there are for Avid Media Composer, Symphony or DS. The reason is partly that there is a larger market for developers on the Final Cut side, but it is also a more open development environment. This provides an opportunity for many different programmers and hardware designers to jump in – some who know what they are doing and some who don’t. Apple has thus far chosen to let the market weed things out, which has proven to be quite successful.

 

Extensibility

 

Another area in which Final Cut has served as a platform is in the creation of companion applications. Here again, if you compare the number of utilities designed to work with the various Final Cut Studio applications against those for Avid, Quantel, Autodesk and others, the edge goes to Final Cut. A key enabler is that Apple makes extensive use of XML and Apple Events. XML is the basis of such apps as those produced by The Assistant Editor, XMiL, Spherico and xm|Edit. XML lets developers not only extract useful information from FCP projects and sequences, but also change that information and import it back with new and different data and results.
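
As an illustration of that round trip, here’s a hedged Python sketch that walks a Final Cut Pro XML export and lists clip names. The element names follow the general shape of FCP’s “xmeml” interchange format, but treat them as assumptions rather than a spec reference:

```python
import xml.etree.ElementTree as ET

tree = ET.parse("project.xml")          # an XML export from FCP
root = tree.getroot()                   # <xmeml> root element (assumed)
for clip in root.iter("clipitem"):      # every clip used in a sequence
    print(clip.findtext("name"))
    # a utility could also modify elements here...
tree.write("project_modified.xml")      # ...and write a file to re-import
```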

 

Apple Events is less well known. This is a core OS X technology that applications use to communicate with each other. An interesting application that utilizes this is Digital Heaven’s Loader – a new FCP companion utility to control the import of non-footage files, such as graphics and music tracks. Through Apple Events, Loader can ask FCP for a list of open projects and, if FCP isn’t busy (rendering or capturing), it sends a list of project file locations back to Loader. Other useful Digital Heaven products that augment Final Cut Studio include Final Print (print FCP bins, markers, sequences), Big Time (adds a large timecode window display to the user interface), Automotion (create a series of lower-third titles using Motion templates) and MovieLogger (review and log QuickTime media files outside of FCP).

 

Other NLE vendors certainly have their own partners, which generally operate under contractual licensing agreements. This controls the quality of the third-party products, but it doesn’t encourage a quickly-growing developer community. Often a powerful product is made even more attractive through such developers. Witness After Effects, Photoshop and Pro Tools. The same open model benefits Apple and Final Cut users by specifically extending the capabilities of the Final Cut product family. The best part is that this is happening at a pace that exceeds what any one company can do using only its internal resources.

 

Click here for a list of Final Cut Pro / Studio resources.

 

© 2008 Oliver Peters

Apple Aperture 2 for Video Pros


Programs for two-dimensional graphics fall into three categories: design, paint and photography. Adobe Photoshop has been the “Swiss Army Knife” software that most video professionals rely on to do all of these functions, but its main strength is image layout and design. Realistic painting that mimics natural media like oils and chalks continues to be the hallmark of Corel Painter. Neither application is much help if you need to organize hundreds of images, so programs like Apple’s iPhoto, Corel’s Paint Shop Pro Photo X2 and Google’s Picasa have come to fill that void for legions of photographers worldwide. These serve the needs of most amateurs, but if you’re a pro who needs industrial strength photo organization and manipulation software, then Apple Aperture and Adobe Photoshop Lightroom lead the pack. Both applications offer similar features and Adobe and Apple have been responding to each other tit-for-tat with new features in every software update – all to the benefit of the user.

 

In early 2008, Apple released Aperture 2, which was quickly followed by the 2.1 update. Aperture 2 added 100 new features, but the biggest improvement was faster performance, enabling quicker previews and image browsing. Aperture 2.1 introduced a plug-in architecture that has opened Aperture to a large field of third-party developers. To date, about 70 plug-ins have been developed for functions that include image manipulation, export, file transfer and Apple Automator workflow scripts. Apple has targeted professional photographers as the main customers for Aperture 2 and offers extensive support, such as video tutorials, on its website. There’s also a growing community of users and developers focused on Aperture and its plug-ins.

 

Organization

 

Documentary films and corporate image videos make extensive use of photographs to tell the story. Aperture 2’s file management is the biggest selling point for video producers and editors. Images in your library are organized by projects, albums, books and light tables. You can store master images in the Aperture library or link to other folders. There’s a new Quick Preview mode that rapidly updates images during browsing. When not in Quick Preview mode, Aperture 2 loads the full-resolution images into the viewer from the master files (if they are available on the mounted drives). You may use the Loupe (a photographer’s magnifying glass) to isolate and analyze a portion of the photo at a 1:1 pixel size, which is accessed from the master image, or just zoom the image to its actual size if you prefer. If the drive with the master images is not mounted, then Aperture displays a high-resolution proxy image. Standard image corrections can be made when master files are available and these are applied as non-destructive filters, like adjustment layers in Photoshop, so your master image is never altered. Corrections are applied only to an exported image, thereby “baking in” these changes to a new version of the photo.
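
The non-destructive model is worth a quick sketch. Here’s a toy version in Python – a generic illustration of the concept, not Aperture’s implementation: adjustments live as an ordered list of operations beside the untouched master and are only applied when a version is exported.

```python
from PIL import Image, ImageEnhance

def export_version(master_path, adjustments, out_path):
    """adjustments: ordered list of (operation, amount) tuples."""
    img = Image.open(master_path).convert("RGB")
    for op, amount in adjustments:
        if op == "contrast":
            img = ImageEnhance.Contrast(img).enhance(amount)
        elif op == "brightness":
            img = ImageEnhance.Brightness(img).enhance(amount)
    img.save(out_path)   # the master file itself is never modified

export_version("master.tif", [("contrast", 1.2)], "version_01.jpg")
```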

 

You can add custom metadata to each photo, which can be used to automatically populate smart albums. For example, as you browse and evaluate images, a rating or keyword can be applied to each selected still. Smart albums can be tied to certain metadata, so as you apply the right criteria, these images instantly show up in the appropriate smart album. Photos in an album, smart album, book or light table are linked back to the original image files in the project. As you adjust the non-destructive color settings, these changes ripple through to all the instances of that image in an album or light table. There are numerous templates and now export plug-ins to send images to locations outside of the Aperture 2 environment. The application is tightly integrated with Apple’s MobileMe web service, but other options via third-party exporters include Facebook, Flickr, Gmail and Picasa, to name a few. This makes Aperture 2 the ideal tool for location managers, casting directors, producers and directors who like to post photos to a web location for quick and easy client review. If you are a Final Cut Pro editor, there’s even a plug-in to send selected images out as a Final Cut Pro sequence, complete with a choice of transition effects.

 

Image tools

 

Imposing structure on a ton of photos is very important, but in the end, it’s all about image quality. In the documentary scenario, many stills given to the editor require a lot of clean-up, like dust-busting and cropping. Newer snapshots may require red-eye correction. These tasks have been traditional Photoshop strengths, but are actually better handled in a photo-centric application like Aperture 2. The tools include straightening and cropping, as well as a variety of color balance and enhancement filters. The image adjustment toolset is rounded out by non-destructive retouching brushes (repair, clone, healing) and vignettes. If you shoot camera RAW photography, Aperture 2 supports a wide variety of camera models plus the Adobe DNG format, and it has gained new RAW fine-tuning tools.

 

Photographers can now tether certain Nikon and Canon digital SLR cameras to the computer and capture their images directly into Aperture 2. You might not think this applies to video editors, but I’ve done a lot of projects where old photographic prints had to be scanned or shot with a video camera. Tethered operation for copy stand work seems like a much better and faster way to accomplish this task!

 

While we’re talking about camera RAW images, I have to quickly point out that the .R3D format of RED Digital Cinema’s RED One camera is not supported in Aperture 2. One of the beauties of shooting with RED is that the high-resolution progressive frames also make great stills for print campaigns derived from the same shoot – much like pulling a frame out of the 35mm negative after a film shoot. The RedAlert application can export 2K and 4K stills in the TIFF format, so it’s a simple task to import these into Aperture 2 for further manipulation. Aperture 2 isn’t as complex as Photoshop and its photographic tools are more comfortable for most directors of photography. So, it’s the ideal place for a DP to import sample stills from RED and do a quick grade for the director, client or colorist as a reference for the intended look.

 

Plug-ins

 

Aperture has offered an “edit with” feature since the beginning, which lets you designate an application like Photoshop as an external image editor. The new third-party image filters are accessed through the same “edit with” menu selection. Unlike other plug-in formats, these filters open as separate applications with their own interfaces. Apple got the ball rolling by integrating a full-featured Dodge and Burn filter. Tiffen and DFT joined the party, as did traditional photography software and Photoshop filter vendors, like Nik Software (Color Efex Pro, Silver Efex Pro) and Picture Code (Noise Ninja). Unlike Aperture’s own internal tools, these filter changes are destructive; when you use one, a copy of the image is created with the effect applied, so you aren’t locked into that result.

 

Written by Oliver Peters for Videography magazine and NewBay Media, L.L.C.

Blindsided – Case Study of Editing a Documentary

Walter Murch has explained film editing as part plumbing, part performance and part writing. Plumbing is understanding the workflow. Performance is the inherent feeling of knowing where to make the best cut to establish pace and rhythm, much the same as a musician. The third component – the film editor as writer – is most true when cutting documentaries. In dramatic films, the editor helps shape the story and often ends up with a film that is radically different from the script; but, in documentaries, the editor commonly creates that script through the process of juxtaposing images and sounds.

 

A few years ago I began the cut of Blindsided, which chronicles the story of teenager Jared Hara’s struggle with going blind (from Leber’s Hereditary Optic Neuropathy) and the emotional turmoil of the family during that initial period. A false start by another editor resulted in the project coming to me. By then, the production was complete, with footage already transcribed and digitized. Nothing had actually been cut yet, so I was able to start with a clean slate. Some documentaries take years to shoot, but the bulk of this footage was recorded over three weeks in the midst of Florida’s series of hurricane encounters in the fall of 2004.

 

The footage consisted predominantly of on-camera interviews with friends, doctors and family taped with a Sony F900 high-def camera. Other content included a fishing trip to Canada, Jared at school, a hurricane relief concert performance by the band Shinedown and coverage of Jared and two friends tubing from Ft. Lauderdale to the Bahamas in 4-5 foot seas. In total, about 50-60 hours of 24p HD content. The Bahamas trip was the original impetus for this production, as the Haras had hoped to attract attention to this medical disorder. It also provided an important diversion for Jared to help take his mind off of the reality of permanently going blind.

 

To aid in the production, the Haras brought in Talia Osteen – a family friend and then USC film student – as the director. The project quickly morphed from simple coverage of the event into a full-blown documentary film with an eye towards a Sundance submission. As the concept grew, so did the production, which added Digibeta helicopter aerials to the tube crossing, as well as Panasonic DVX-100 “B-unit” camcorders and multitrack audio recording to the Shinedown concert. Talia had finished school and was moving on to her first film career job, so when I joined the team in mid-2005, it was sans writer or director.

 

Where to begin?

 

Cutting Blindsided fit well into an approach I use with most unscripted projects. The first thing you have to do is review the footage, listen to all the interviews and start culling the useable from the useless. I tend not to work from transcripts, because I like to work with the footage that’s in front of me. Some editors swear by transcriptions and software like Avid’s ScriptSync, but I often find that “paper edits” don’t work well. The resulting dialogue edits simply sound odd, because the spoken inflections don’t match. However, transcripts do prove useful later on, when the producer or director asks for alternate sound bites. They can quickly find the dialogue on paper or in a Word document and locate the closest timecode (part of the transcription). Then it’s simple to call up the right clip in your NLE for a preview.

 

I’ll make editorial decisions about key sound bites for each person (presumably their best statements) and edit these into a “selects” sequence. Next, I’ll copy-and-paste appropriate segments from each person into new topic-related sequences. Then I duplicate those sequences and start cutting them down. For example, if three people talk about the same subject in the same way, I’ll pick the best of the three and eliminate the other two. I now have a grouping of all the relevant sound bites, but can always step back to an earlier version to restore a comment I might have cut. So, like eating an elephant, you start one bite at a time!

 

The story arc

 

At this point, a story structure – or arc – starts to emerge. In Blindsided, that arc covered the disease and the two-year-long degradation of Jared’s eyesight, then the downward spiral of the family – finally leading to how they have coped with the situation. This sounds clean and concise, but there were several messy parts. One of our challenges was to weave in what might really be considered too many stories and “events”. For example, the Haras are Jewish, but their best friends are Muslim. This is tied to the friendship of the two sons, who met as junior hockey players. Friendship was a key ingredient of the larger story.

 

The trip to Canada, the tubing trip and the concert were events – tangents to the main theme. At the Shinedown concert, Jared appeared on stage with the band, playing guitar on one number in front of an audience of about 6,000 people. This provided an uplifting ending, but it, too, was an event. So editorially, I had to balance these moments – which added a nice level of production value – with the real meat of the story, told largely in less visually-interesting interviews. The concert created a natural ending, but we wanted to be careful not to have it be too long or too triumphant. We didn’t want the viewer to forget the rest of the story and just focus on the end. Furthermore, this story has no happy “Hollywood” ending. The reality is that Jared is blind for the rest of his life.

 

Letting the story tell itself

 

My inclination was to have the interview sound bites tell the story without the use of a “voice of God” narrator. In crafting unscripted films, you first edit a “radio cut” – concentrating on making the sound work without worrying about visual coverage. Think radio play! In other words – is there a complete story? At the beginning, I didn’t think I could avoid some narration and planned on bringing in a script writer to add narrative bridges between segments – essentially treating the story like chapters in a book. By the time I finished my two-hour-long first cut, Talia and producer David Coleman had stepped back into the picture and helped refine the story. We tightened and rearranged segments and sound bites to eliminate the need for any announcer. This structure had been Talia’s vision from the start and I was glad to succeed in that goal. Talia worked with us for a few weeks and then David and I continued with further refinements. Sundance didn’t pick up Blindsided, which gave us time to screen the film for some informal focus group audiences. Their feedback helped to guide our revisions.

 

Blindness from Leber’s is genetically passed down on the maternal side, so Mark (Jared’s dad) irrationally blamed his wife Ellen (Jared’s mother). These feelings brought the family to the brink of dissolution. Mark had been brutally honest during the on-camera interviews about his emotions at that time and their impact on the rest of the family. He wanted to keep as much of this as possible in the finished film, to show the reality of what people go through when confronted by such a family tragedy. It’s an important part of what is, in essence, a grieving process that families have to pass through. The Haras stayed together, so the point was to show that you can regain a somewhat normal, yet different, life by working through this. However, some of our audience feedback indicated a strong negative reaction to Mark’s honesty – so much so that it tainted their opinion of the show.

 

The second issue was a bit harder to deal with. In an effort to involve Jared in activities that would help him cope and maintain a passion for life, the Haras engaged in “projects”, like the tubing trip, the fishing trip and helping Jared discover a talent for playing guitar. A small percentage of the focus audiences made comments like, “I’d trade my eyesight for the life this kid’s having.” Obviously these people didn’t get it, yet it was something we needed to address editorially. These two issues guided us in toning down some areas of the cut and accentuating others.

 

Adding the spice

 

Even with the wealth of original footage, I still needed more video, especially to cover sound bite edits within the interviews. Family photos and home videos helped flesh out the story and punctuate poignant moments. These included digital stills, scanned photos and nearly every video format imaginable, including VHS, Mini-DV, DVD and Mini-DVD-RAM. I even asked the second unit DP to spend a day with his own HDV camcorder at the Haras’ home to shoot cutaway shots around the house.

 

The first online edit took place in mid-2006. Since I’d used Apple Final Cut Pro for the offline edit, I uprezzed the footage in FCP, as well. The principal HD footage was 23.98fps, but the offline editing had been done from DVCAM copies, so the project was 29.97fps. The chances of this documentary going to 35mm film were slight, so we decided on a 1080i HD finish at 29.97fps. This maintained the quality of the interlaced SD footage, which would inevitably get softer if de-interlaced and converted to 24p. I especially wanted to keep the maximum quality of the dramatic aerials of the boys on an inner tube crossing to the Bahamas with extremely choppy water. All SD footage was upconverted to HD using a Teranex Mini, so even the DVX concert shots cut well against the angles from the F900. It took about a week and a half to uprez and color grade the show in FCP.

 

Blindsided went on to a series of successful film festival appearances. We made a few further trims in 2007 and then again in 2008, when Blindsided was accepted by HBO Family for airing on both the HBO Family channel and the main HBO channel. I don’t believe editors should get a writing credit for this kind of project. In my mind, it’s all part of earning the title of film editor. On the other hand, Blindsided provides the perfect example of how a “script” is created simply through the language of editing.

 

For more about the ongoing story, go to www.blindsidedthemovie.com.

 

Written by Oliver Peters for Videography magazine and NewBay Media, L.L.C.