Sorenson Squeeze 8.5 and 8.5 Pro

For many editors, the preferred video encoding application is Sorenson Squeeze. It’s one of the top encoders for both Mac and PC platforms and also comes bundled with Avid Media Composer in its third-party software package. Recently, Sorenson introduced the optional Pro version, which enables the encoding of Avid DNxHD MXF media, as well as Dolby Pro Audio and Apple ProRes QuickTime codecs (Mac only).

Sorenson launched Squeeze 8.5 with more features and faster encoding, especially for users with CUDA-enabled NVIDIA GPU cards. CUDA acceleration was introduced with Squeeze 7 to speed up presets that use the MainConcept H.264/AVC codec; 8.5 also accelerates MPEG-4, WebM, QuickTime and adaptive bit rate encoding. Other new features include additional input and output formats, some interface enhancements and 5GB of permanent storage on the Sorenson 360 video hosting site (included with the 8.5 purchase).

I use several different encoders and have always been a fan of Squeeze’s straightforward interface, which is organized around formats and/or workflows. Settings are easy to customize with granular control, and modified presets may be saved as favorites. Although I typically import a few files, set my encoding requirements and let it run, Squeeze is also designed to import from a camera or work automatically from a watch folder.
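For readers who want to experiment with the watch-folder idea outside of Squeeze, here is a minimal Python sketch of the general pattern: poll a folder, hand any new clip to an encoder and park the original when done. It is not how Squeeze works internally – the folder names are hypothetical and ffmpeg stands in for whatever preset you would actually apply.

```python
# Minimal watch-folder loop (illustrative only, not Squeeze's implementation).
# ffmpeg is used as a stand-in encoder; folder names are hypothetical.
import shutil
import subprocess
import time
from pathlib import Path

WATCH = Path("watch")      # incoming clips land here
OUTPUT = Path("encoded")   # finished files are written here
DONE = Path("done")        # originals are parked here after encoding

for d in (WATCH, OUTPUT, DONE):
    d.mkdir(exist_ok=True)

def encode(src: Path) -> None:
    """Apply a simple stand-in 'preset': H.264 at 5 Mbps with AAC audio."""
    dst = OUTPUT / (src.stem + ".mp4")
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(src),
         "-c:v", "libx264", "-b:v", "5M", "-c:a", "aac", str(dst)],
        check=True,
    )

while True:
    for clip in sorted(WATCH.glob("*.mov")):
        encode(clip)
        shutil.move(str(clip), str(DONE / clip.name))
    time.sleep(10)  # poll every ten seconds
```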

The workflow aspect is not to be overlooked. You can set up Blu-ray and DVD disc burns, upload to various web destinations and include e-mail notification – all within a single encoding preset. Burning “one-off” review DVDs for a client is as simple as importing the file, applying the DVD workflow preset and loading the blank media when prompted. If you use the web for client review and approval, then it’s handy to have the Sorenson 360 account built in. This is Sorenson Media’s video hosting site running on Amazon servers, which gives you a reliable backbone. You can set up player skins and access controls to use the site as an outward-facing presence for clients – or embed the videos into your own site.

Aside from the improved speed and encoding quality of 8.5, the Pro version is a great front-end tool for video editors, too. For instance, if you don’t want to edit with the native media format, you can convert QuickTime files into Avid-compliant MXF media – or take Canon C300 MXF clips and convert them to ProRes for use in Final Cut. These Pro features reinforce Squeeze’s standing among video editors as a top encoding solution.
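As a rough illustration of the kind of conversion described above – and only an illustration, since Squeeze Pro uses its own presets – a ProRes QuickTime transcode can also be produced with an open-source tool such as ffmpeg. The file names here are hypothetical.

```python
# Illustrative ProRes 422 HQ transcode using ffmpeg's prores_ks encoder;
# not the Squeeze Pro workflow, and the file names are hypothetical.
import subprocess

subprocess.run(
    ["ffmpeg", "-i", "C300_clip.mxf",          # e.g. a Canon C300 MXF source
     "-c:v", "prores_ks", "-profile:v", "3",   # profile 3 = ProRes 422 HQ
     "-c:a", "pcm_s16le",                      # uncompressed PCM audio
     "C300_clip_prores.mov"],
    check=True,
)
```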

Originally written for Digital Video magazine (NewBay Media, LLC).

©2012 Oliver Peters

Looper

One of the fun films of this year is Looper, a sci-fi/time-travel adventure from director Rian Johnson (Brick, The Brothers Bloom). In the story, Joe – a mob killer – is played by both Joseph Gordon-Levitt and Bruce Willis: Gordon-Levitt as the present-day Joe and Willis as his future self. An ambitious film like this typically requires the work of numerous visual effects companies. The task of creating the futuristic environments fell to a relatively new northern California effects shop, Atomic Fiction.

Co-founders Kevin Baillie and Ryan Tudhope started their careers as talented, enthusiastic teenagers who, through persistence, landed slots on the Star Wars: Episode I pre-visualization effects team working at Skywalker Ranch. This led to a decade of work as compositors and effects supervisors on a host of blockbusters (Hellboy, Sin City, Pirates of the Caribbean, Transformers), thanks to long stints at The Orphanage and ImageMovers Digital. With the demise of those companies, Tudhope and Baillie decided to combine their talents and build a visual effects company around a new model – one that could take advantage of the latest in software and post-production concepts, like working in the cloud.

Setting the tone

I spoke with Ryan Tudhope about the work Atomic Fiction did for Looper, as well as some of his thoughts on this new business model. Tudhope explained how they landed the job: “We’ve been fans of Rian Johnson’s films and love the gritty reality of his stories. So we were all over Looper as soon as it was announced. We also knew that Looper’s visual effects supervisor, Karen Goulekas, was a meticulous and seasoned supervisor and would be looking to pull together the best team possible. Atomic Fiction’s art director Brian Flora and I explained how we were combining our incredibly talented team with a new, lower-cost business model. Karen saw an opportunity to utilize us, initially for concept design and later on approximately 80 shots. All of this encompassed the film’s digital environments, which included futuristic city aerials, building design and modifications, set extensions and so on. Everything from wide establishing shots down to street-level views of buildings and scenery.”

The crew at Atomic Fiction took many of their cues from Blade Runner. Tudhope continued, “We wanted Looper to have a classic feel and looked to Blade Runner for inspiration, with its industrial tone and signature style, the lens flares and so on. But we also wanted to find our own way, so the two cities in Looper are more grounded in reality. They are more run down. We used common elements to tie our shots together, such as graffiti and shelters for squatters, tents, etc. We designed objects and technology – like generators and antennas – that would be attached onto existing buildings as you know them today. The story takes place in two future cities – Kansas City and Shanghai, which is shown as the more prosperous of the two. Shanghai had to be newer, shinier and far more futuristic-looking. San Francisco doubled as Shanghai in one particular shot, so we had to transform aerials of the bay into this future version of Shanghai.”

To achieve their vision, Atomic Fiction relied heavily on elements from the film’s 35mm anamorphic plates whenever possible. Tudhope explained, “We are big believers in having something real in the shot. You get so much good stuff from the plate photography that you don’t have to create from scratch, as you would if it were all CG – for example, the water and building textures and overall atmospherics. Sometimes we were able to use existing buildings and simply modify them. You get inspired by objects that are actually in the shot, which can be used as you transform it. In one of the bay aerials, there was a real barge on the water that we were able to enhance. In Blade Runner, the effects team made extensive use of miniatures and relied on fewer shots to tell the story. I often feel that modern effects films tend to overdo it, while the classics let the audience breathe for a moment. I believe Looper will have some of that classic feel.”

“One of the coolest, but also most challenging, aspects of Looper was the film format. The film was shot on anamorphic 35mm. Match moves were a real problem for us because of extremely complex lens distortion patterns, heavy grain and extensive warping at the edge of the frame during focus pulls. On the other hand, we had a great time matching the anamorphic lens flares that were already in the footage. It was great reference.” In total, Atomic Fiction took five weeks to develop the original concept art and design and then about four months to deliver the finished effects.

Tools of the trade

Like any visual effects house, Atomic Fiction taps into a wide range of 2D and 3D software to get the job done. Tudhope described their operation, “We rely heavily on off-the-shelf software, but we tie it together with custom tools and functionality. For asset builds, animation and most lighting, [Autodesk] Maya is the tool of choice, but we tend to use [Autodesk] 3ds Max for matte painting projections and CG environments. All of the matte painting is done in [Adobe] Photoshop and the final composites are done with [The Foundry’s] Nuke. All three companies have been great partners for us and dedicate substantial resources to the professional market.”

“The combination of 3D and 2D can be very efficient, because you can take 3D building models and reuse them from different angles without the need to draw them again from scratch. We typically put together digital environment teams, pairing 2D and 3D artists based on their strengths and what’s needed for any given shot. Sometimes you have to go back and forth between 2D and 3D. The key is being able to look at a shot and know why something isn’t working and then make the necessary adjustments. That’s harder to learn than figuring out how to bend the tools to meet the task. In general, the industry’s 2D/3D pipeline could still use some improvement. For instance, matte painters using Photoshop can easily put a lot of nuance into a shot through hundreds of layers and composite modes to get haze and glows and other details just right. That’s something that combines well inside Photoshop, but gets subtly altered when you try to pass those layers out to other tools.”

A new business model

Baillie and Tudhope realized great talent would be an essential ingredient in launching Atomic Fiction, but they also felt their industry was due for innovation. They wanted to leverage new technologies to keep the company nimble, yet still produce the high-quality visual effects their team was known for.

Tudhope discussed their thinking behind establishing the new company. “From our experience at other shops, we feel that the right number in any one location is around 40 or 50 employees. We have about 40 now and that number seems to be a sweet spot for efficiency, crew morale and maintaining a sense of team and company culture. We designed the facility with the cloud in mind. When you build local hardware resources like a render farm, you end up buying for peak capacity, which means the system is underutilized much of the time. Instead, with the help of our partners at ZYNC, we jumped headfirst into Amazon’s EC2 cloud services. By moving rendering to the cloud, instead of owning the hardware locally, you start to treat it like a utility, such as electricity. You only pay for what you use. This means that rendering can literally be scaled from as little as the Macs on the artists’ desks to as many cores as you need. You have a lower total operating cost, and you don’t have to pass unused equipment expenses on to the next client.”

“We render both 2D and 3D in the cloud, using V-Ray for 3D renders and Nuke for comps. ZYNC provides the software to manage the process from end to end. To make it work efficiently, we have an extremely fast internet connection – we literally push terabytes of data back and forth. Fortunately, the initial upload of our shots to the cloud only takes a few minutes. After that, many revisions to a shot only require sending the changed data, which makes subsequent updates and renders just as fast as a local render farm.”
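Tudhope doesn’t detail ZYNC’s transfer mechanism, but the “send only what changed” idea can be sketched in a few lines: hash every file in a shot, compare against the hashes from the last sync and upload only the differences. The manifest file and folder layout below are assumptions made for the sake of the example.

```python
# Conceptual sketch of incremental upload: hash each file and re-send only
# those whose content changed since the last sync. Not ZYNC's actual system.
import hashlib
import json
from pathlib import Path

MANIFEST = Path("sync_manifest.json")  # hypothetical record of previous hashes

def file_hash(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def changed_files(project_dir: Path) -> list:
    old = json.loads(MANIFEST.read_text()) if MANIFEST.exists() else {}
    new, to_upload = {}, []
    for path in sorted(project_dir.rglob("*")):
        if path.is_file():
            digest = file_hash(path)
            new[str(path)] = digest
            if old.get(str(path)) != digest:
                to_upload.append(path)
    MANIFEST.write_text(json.dumps(new, indent=2))
    return to_upload

# e.g. hand changed_files(Path("shots/shot_010")) to your uploader of choice
```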

Security is always a concern when you talk about cloud-based services for the studios, and Atomic Fiction has taken the issue head-on. Tudhope explained, “When we pitch a studio, we are often prepared with all sorts of data as to why the process is secure. In most cases, they are actually quite eager to exploit the cost savings and quality improvements – via more and faster artist iterations – that cloud rendering provides. The reality is that many effects shops – especially smaller ones – don’t even have dedicated firewalls between them and the internet. We take security very seriously, with high-end firewalls, a well-engineered internal network architecture and heavy encryption of data going into and out of the cloud. Despite these precautions, we are careful to only process small slices of a shot – not edited scenes with audio – in the cloud. Those micro-level components pose a much smaller security risk for our clients. We believe that the most important security measure of all is the professionalism of your staff and imparting to them how important the issue of security is.”
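The article doesn’t describe Atomic Fiction’s encryption setup, but client-side encryption before upload captures the general idea. A minimal sketch using the third-party Python cryptography package might look like this (key handling is deliberately simplified).

```python
# Generic illustration of encrypting assets before they leave the facility;
# not Atomic Fiction's actual pipeline. Requires: pip install cryptography
from pathlib import Path
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice the key lives in a secure store
cipher = Fernet(key)

def encrypt_for_upload(src: Path) -> Path:
    dst = src.with_suffix(src.suffix + ".enc")
    dst.write_bytes(cipher.encrypt(src.read_bytes()))
    return dst

def decrypt_after_download(src: Path) -> Path:
    dst = src.with_suffix("")  # strip the trailing .enc
    dst.write_bytes(cipher.decrypt(src.read_bytes()))
    return dst
```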

Looper opens across the country in September. Check it out to see how Atomic Fiction has used the cloud and off-the-shelf tools to transform the reality of today into the cities of tomorrow.

More from FxGuide, Wired and Movieline.

Originally written for Digital Video magazine (NewBay Media, LLC).

©2012 Oliver Peters

Red Giant Software Arsenal

Thanks to the growth of the internet, laptops and now tablets, physical portfolios and demo reels have increasingly been replaced by digital alternatives. The newest of these is Red Giant Software’s Arsenal, an application for the Apple iPad designed to present your creative work to clients right on the tablet.

Arsenal is easy to use and supports the presentation of both still images and video. You can import images by syncing folders with your iPad, or via FTP and Dropbox. Multiple presentations can be organized in the Light Table display. Arsenal uses a series of Collections, each of which can hold several Strips. Once you pick a Collection to load, the available Strips are displayed. For a Strip of photographs, simply swipe the Strip with your finger – a standard iPad gesture – to scroll through the images and see what is available in that Strip. Tap any image to display it full screen. From full-screen mode, you can swipe left or right to change images, set up a slideshow with automatic advance (three, five or eight seconds) or access any image from a small film strip at the bottom.

Arsenal offers a set of editing controls to add, move or delete images, name Collections and Strips (with choices of font style and size and theme colors), as well as add your logo at the top. Syncing is a big part of the application. You can e-mail a customer an Arsenal file that’s synced via Dropbox. The customer can run this file on their iPad using a free Arsenal Reader application and the images stay updated via Dropbox. In addition, you can also e-mail any image to another person from within Arsenal as a standard e-mail attachment.

Arsenal currently supports all three iPad models, JPEG and PNG images, plus MOV and MP4 video files. There’s full support for the new iPad’s Retina display, with images up to 5,000 pixels (3,000 for the original iPad model). I tested Arsenal on my first-generation iPad, using still photos from my Olympus point-and-shoot camera as well as older 35mm slide scans. A few of these exceeded 3,000 pixels, but didn’t present any problems.

I prefer manual sync for my iPad and have an “iPad Transfer” folder on my desktop computer where I copy files headed for the iPad. Stills or movies that I’ve synced are accessible to Arsenal and can be organized into Collections and Strips. 720p/23.98 H.264 MOV files at 5Mbps play nicely on the iPad and look great full screen on this first-generation device. My only complaint is that videos don’t get a thumbnail for quick visual recognition. Remember that productions use stills in lots of ways, including casting photos, location scouting stills and behind-the-scenes shots. All of these uses can find a home in Arsenal.
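For anyone curious how to hit that spec, one way (of many) to generate a 720p/23.98 H.264 file at roughly 5 Mbps is with ffmpeg; the file names are hypothetical and this is just an approximation of the files described above, not an exact recipe.

```python
# One way (among many) to produce a 720p/23.98 H.264 MOV at roughly 5 Mbps;
# ffmpeg is used as an example encoder and the file names are hypothetical.
import subprocess

subprocess.run(
    ["ffmpeg", "-i", "master.mov",
     "-vf", "scale=1280:720",            # 720p frame size
     "-r", "24000/1001",                 # 23.98 fps
     "-c:v", "libx264", "-b:v", "5M",    # H.264 at about 5 Mbps
     "-c:a", "aac", "-b:a", "160k",
     "ipad_review_copy.mov"],
    check=True,
)
```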

Originally written for Digital Video magazine (NewBay Media, LLC).

©2012 Oliver Peters