Apple Pivots – WWDC 2020

Monday saw Apple’s first virtual WWDC keynote presentation. This was a concise, detail-packed 108-minute webcast covering the range of operating system changes affecting all of Apple’s product lines. I shared my initial thoughts and reactions here and here. If you want some in-depth info, then John Gruber’s follow-up interview with Apple’s Craig Federighi and Greg Joswiak is a good place to start. With the dust settling, I see three key takeaways from WWDC for Mac users.

Apple Silicon

The move to Apple Silicon marks the third processor transition for the Mac. Apple has been using Intel CPUs for 15 years. This shift moves Mac computers to the same CPU family as Apple’s mobile platforms. The new CPUs are based on Arm SoC (system on a chip) technology. Arm originally stood for Acorn RISC Machine and is a technology developed by Arm Holdings PLC, which licenses it to other companies that are then free to develop their own chip designs. As far as we know, Apple’s Arm chips will be manufactured in foundries owned by TSMC in Taiwan. While any hardware shift can be disconcerting, the good news is that Apple already has more than a decade-long track record with these chips, thanks to the iPhone and iPad. (Click here for more details on RISC versus CISC chip architectures.)

macOS software demos were ostensibly shown on a Mac with 16GB of RAM running an A12Z Bionic chip – the same CPU as in the current iPad Pro. That’s an 8-core, 64-bit processor with an integrated GPU. It also includes cores for the Neural Engine, which Apple uses for machine learning functions.

Apple is making a Developer Transition Kit (DTK) available that includes a Mac mini in this configuration, which implies that was the type of machine used for the keynote demo. It’s hard to know if that was really the case. Nevertheless, performance appeared good – notably with Final Cut Pro X playing three streams of 4K ProRes at full resolution. Not earth-shattering, but decent if that was actually the level of machine used. Federighi clarified in the Gruber interview that the DTK is only intended to get developers comfortable with the new chip and the new OS version. It does not represent a machine that is even close to what will eventually ship, nor will the first Apple Silicon chips in shipping Macs be A12Z CPUs.

The reason to shift to Arm-based CPUs is the promise of better performance at lower power consumption. Lower power also means less heat, which is great for phones and tablets. That’s also great for laptops, but less critical on desktop computers – where the extra thermal headroom makes higher clock speeds a viable option. According to Apple, the transition to Apple Silicon will take two years. Presumably that means that by the end of those two years, all new Macs going forward will use Arm-based CPUs.

Aside from the CPU itself, what about Thunderbolt and the GPU? The A12Z has an integrated graphics unit, just like the Intel chips Apple currently uses. However, for extra horsepower Apple has been using AMD GPUs – and in years past, NVIDIA. Will the relationship with AMD continue, or will Apple Silicon also cover the extra graphics grunt? Thunderbolt is a technology licensed by Intel. Will that continue, and will Apple be able to integrate Thunderbolt into its new Macs? The developer Mac mini does not include Thunderbolt 3 ports. Can we expect some type of new I/O format in these upcoming Macs – USB 4, for example? (Click here for another concise hardware overview.)

macOS Big Sur – Apple dials it to 11

The second major reveal (for Mac users) was the next macOS after Catalina – Big Sur. This is the first macOS designed natively for Arm-based Macs, but it is coded to be compatible with Intel, too. It will also be out by the end of the year. Big Sur is listed as OS version 11.0, thus dropping the OS X (ten) product branding. Apple has posted a list of compatible Macs, and quite a few users are already running a beta version of Big Sur on their current Intel Macs. Anecdotally, most reports indicate that it’s stable and that the majority of applications (though not all) work just fine.

Apple has navigated such transitions before – particularly from PowerPC to Intel. (The PowerPC used a RISC architecture.) Apple is making developer tools available (Universal 2) that will allow software companies to compile apps that run natively on both Intel and Arm platforms. In addition, Big Sur will include Rosetta 2, a translation layer designed to run Intel apps on Arm-based Macs. Apple says that Rosetta 2 will actually convert some apps into native versions upon installation. Boot Camp appears to be gone, but virtualization of Linux was mentioned. This makes sense, because some Linux distributions already run on Arm processors. We’ll have to wait and see whether virtualization will also apply to Windows. Federighi did appear to say that directly booting into a different OS would not be possible.
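
For developers, there’s a practical wrinkle here: an app may want to know whether it is running natively or under translation. Apple documents a sysctl key, sysctl.proc_translated, for that check. Here’s a minimal Swift sketch – the key name comes from Apple’s Rosetta documentation, while the wrapper function itself is illustrative:

```swift
import Foundation

// Checks the documented "sysctl.proc_translated" key to see whether
// the current process is running under Rosetta 2 translation.
// Returns true if translated, false if native, nil if undetermined.
func isRunningUnderRosetta() -> Bool? {
    var translated: Int32 = 0
    var size = MemoryLayout.size(ofValue: translated)
    let result = sysctlbyname("sysctl.proc_translated", &translated, &size, nil, 0)
    if result == -1 {
        // ENOENT means the key doesn't exist, e.g. an Intel Mac on an older OS.
        return errno == ENOENT ? false : nil
    }
    return translated == 1
}
```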

Naturally, users want to know if their favorite application is going to be ready on day one. Apple has been working closely with Adobe and Microsoft to ensure that Creative Cloud and Office applications will run natively. Those two are biggies. If you can’t run Photoshop or Illustrator correctly, then you’ve lost the entire design community – photographers, designers, web developers, ad agencies, etc. Likewise, if Word, Excel, or PowerPoint don’t work, you’ve lost every business enterprise. Apologies to video editors, but those two segments are more important to Mac sales than all the video apps combined.

Will the operating systems merge?

Apple has said loudly and clearly that it doesn’t intend to merge macOS and iOS/iPadOS. Both the Mac and mobile businesses are very profitable, and those products fulfill vastly different needs. It doesn’t make any business sense to mash them together in some way. You can’t run Big Sur on an iPad Pro, nor can you run iPadOS 14 on a Mac. Yet they are more than cousins under the hood. You will be able to run iOS/iPadOS applications, such as games, on an Arm-based Mac with Big Sur. Whether a given app is available is up to the developer, who retains the option to mark it as iOS-only when the application is submitted to the App Store.

Technology aside, the look, feel, style, and design language are now very close between these two operating system branches. Apple has brought widgets back to the Mac. You now have notification and control centers on all platforms that function in the same manner. The macOS Finder and iPad Files windows look similar to each other. Overall, the design language has been unified, as typified by a return to more color and a common chiclet design for the icons. After all, nearly every Apple product has shared a similar style with rounded corners, going back to the original 1984 Mac.

Are we close to actually having true macOS on an iPad Pro? Or a Mac with touch or Apple Pencil support? Probably not. I know a lot of Final Cut Pro X users have speculated about seeing “FCPX Lite” on the iPad, especially since we already have Adobe’s Rush and LumaTouch’s LumaFusion. The speculation is that some stripped-down version of FCPX on the iPad should be a no-brainer. Unfortunately, iPads still lack a proper computer-style file system, standard I/O, and robust external storage workflows. So there are still reasons to optimize operating systems and applications differently for each hardware form factor. But a few years on – who knows?

Should you buy a Mac now?

If you need a Mac to get your work done and can’t wait, then by all means buy one now – especially if you are a “pro” user requiring beefier hardware and performance. On the other hand, if you can hold off, you should probably wait until the first new machines are out in the wild. Then we’ll all have a better idea of price and performance. Apple still plans to release more Intel Macs within the next two years. The prevailing wisdom is that the current state of Apple’s A-series chips means lower-end consumer Macs will come first. Obviously a new 12″ MacBook and the MacBook Air would be prime candidates. After that, possibly a Mac mini or even a smaller iMac.

If that is in fact what happens, then high-end MacBook Pros, iMacs, iMac Pros, and the new Mac Pro will all continue running Intel processors until comparable (or better) Apple Silicon chips become available. Apple is more willing to shake things up than any other tech company. In spite of that, it has a pretty good track record of supporting existing products for quite a few years. For example, Big Sur is supposed to be compatible with MacBook Pros and Mac Pros going back to 2013. And even though the FCPX product launch was rough, FCP7 continued to work for many years afterward, through several OS updates.

I just don’t agree with people who grouse that Apple abandons its existing customers whenever one of these changes happens. History simply does not bear that out. Change is always a challenge – more for some types of users than others. But we’ve been here before. This change shouldn’t cause too much hand-wringing, especially if Apple Silicon delivers on its promise.

Click here for a nice recap of some of the other announcements from WWDC.

©2020 Oliver Peters

The Missing Mac

High-end Mac users waited six years for Apple to release its successor to the cylindrical 2013 Mac Pro. That was a unit derided by some and ideal for others. Its unique shape earned it the nickname of the “trash can.” Love it or hate it, that Mac Pro lacked the ability to expand and grow with the times. Nevertheless, many are still in daily service, being used to crank out great work.

The 2019 Mac Pro revitalized Apple’s tower configuration – dubbed the “cheese grater” design. If you want expandability, this one has it in spades. But at a premium price that puts it way above the cost of a decked-out 2013 Mac Pro. Unfortunately for many users, this leaves a gap in the product line – both in features and in price range.

If you want a powerful Mac in the $3,000 – $5,000 range without a built-in display, then there is none. I really like the top-spec versions of the iMac and iMac Pro, but if I already own a display, would like to only use an Apple Pro Display XDR, or want an LG, Dell, Asus, etc., then I’m stuck. Naturally, one approach would be to buy a 16″ MacBook Pro and dock it to an external display, using the MacBook Pro as a second display or in a clamshell configuration. I’ve discussed that in various posts, and it’s one way nimble editing shops like Trim in London tend to work.

Another option would be the Mac mini, which comes closest to filling this void. It recently got a slight bump up in specs, but it’s missing 8-core CPU options and an advanced graphics card. The best 6-core configuration might actually be a serviceable computer, but I would imagine that effects requiring GPU acceleration will be hampered by the Intel UHD 630 built-in graphics. The Mini does tick a lot of the boxes, including Wi-Fi, Bluetooth, four Thunderbolt 3/USB-C ports, HDMI 2.0, two USB 3.0 ports, plus Ethernet and headphone jacks.

I’ve tested both the previous Mac mini iteration (with and without an eGPU) and the latest 16″ MacBook Pro. Both were capable Macs, but the 16″ truly shines. I find it hard to believe that Apple couldn’t have created a Mac mini with the same electronics as the loaded 16″ MacBook Pro. After all, once you remove the better speaker system, keyboard, and battery from the lower case of the laptop, you have about the same amount of “guts” as in a Mac mini. I think you could make the same calculation with the iMac electronics. Even if the Mini case needed to be a bit taller, I don’t see why this wouldn’t be technically possible.

Here’s a hypothetical Mac mini spec (similar to the MacBook Pro) that could be a true sweet spot:

  • 2.4GHz 8-core, 9th-generation Intel Core i9 processor (or faster)
  • 64GB 2666MHz DDR4 memory
  • AMD Radeon Pro 5600M GPU with 8GB HBM2 memory
  • 1TB SSD storage (or higher – up to 8TB)
  • 10 Gigabit Ethernet

Such a configuration would likely land in the $3,000 – $5,000 range, based on the BTO options of the current Mini, iMac, and MacBook Pro. Of course, if you bump the internal SSD from 1TB to the higher capacities, the total price will go up. In my opinion, it should be easy for Apple to supply such a version without significant re-engineering. I recognize that if you went with a Xeon-based configuration, like that of the iMac Pro, then the task would be a bit more challenging, in part due to power demands and airflow. Naturally, an even higher-spec’d Mac like this in the $5,000 – $10,000 range would also be appealing, but that would likely be a bridge too far for Apple.

What I ultimately want is a reasonably powerful Mac without being forced to also purchase an Apple display, since that isn’t always the best option – and without spending as much as a car costs to get there. I understand that such a unit wouldn’t have the ability to add more cards, but then neither do the iMacs and MacBook Pros, so I really don’t see this as a huge issue. I feel that this configuration would be an instant success with many editors. Plug in the display and storage of your choice and Bob’s your uncle.

I’m not optimistic. Maybe Apple has run the calculation that such a version would rob sales from the 2019 Mac Pro or iMac Pros. Or maybe they simply view the Mini as fitting into a narrow range of server and home computing use cases. Whatever the reason, it seems clear to me that there is a huge gap in the product line that could be served by a Mac Mini with specs such as these.

On the other hand, Apple’s virtual WWDC is just around the corner, so we can always hope!

©2020 Oliver Peters

It’s great… Until it isn’t

No, that picture at the top isn’t a map of Covid-19 hotspots. It’s the map of US users affected by this week’s Adobe sign-in outage. This past Wednesday (May 27th), affected users across all of Adobe’s various cloud products were unable to sign into their accounts – locking them out of using any installed Adobe products. But not every user – only those who needed to sign in to enable their applications.

Adobe products, like Creative Cloud, are paid on a monthly or prorated annual basis. You sign in one time to activate the account on that device, and you are good to go until renewal time, as long as you stay signed into the cloud license manager application. In theory, you don’t need to be continually connected to the internet for the applications to function. However, once a month Adobe’s servers are pinged, and you may be prompted to sign in again. So if Wednesday was your machine’s day to “phone home,” if you hadn’t used the apps in a while, or if you were signing in after having previously signed out, then your Adobe cloud manager application was unable to connect to the server and activate (or reactivate) your software product(s). Just like that, a day of billable time flushed away.
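
To see why only some users were hit, consider a hypothetical Swift sketch of how such a monthly check-in might work. This illustrates the general pattern only – it is not Adobe’s actual implementation, and the 30-day window is an assumption:

```swift
import Foundation

// A hypothetical license manager's monthly check-in, sketched to show
// why an outage locks out only some users: everything hinges on when
// a given machine last validated successfully.
struct LicenseState {
    var lastSuccessfulCheckIn: Date
    let checkInInterval: TimeInterval = 30 * 24 * 60 * 60  // assumed ~30 days

    // Apps stay enabled as long as the last check-in is recent enough.
    func isEntitled(now: Date = Date()) -> Bool {
        now.timeIntervalSince(lastSuccessfulCheckIn) < checkInInterval
    }

    // Called when the client pings the licensing server.
    mutating func checkIn(serverReachable: Bool, now: Date = Date()) {
        if serverReachable {
            lastSuccessfulCheckIn = now
        }
        // If the server is down, nothing updates. Users still inside their
        // window keep working; anyone whose window expires that day - or
        // who signs in fresh - is locked out.
    }
}
```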

Before you grab the torches and pitchforks, it may be useful to revisit the pros and cons of the various software licensing models.

Subscription

Quite a few companies have adopted the subscription – aka software as a service or SaaS – model. The argument is that you never actually own any software, regardless of the company or the application (read any EULA). Rather, you pay for the right to use it over a specified period of time – monthly, annually, or perpetually. Adobe decided to go all-in on subscription plans, arguing that the upfront costs were cheaper for the user, the plans offered a better ROI with access to many more Adobe products, and that this provided a predictable revenue stream to fuel more frequent product updates.

Generally, these points have been realized and the system works rather well most of the time. Yes, you can argue that over time you pay more for your CC subscription than in the old CS days (assuming you skipped a few CS updates). But if you are an active facility, production company, or independent contractor, then it’s a small monthly business expense, just like your internet or electric bill, which is easily absorbed against the work you are bringing in. The software cost has shifted from cap-ex to op-ex.

That is all true – unless you have no revenue coming in or are merely working with media as a hobby. In addition, once you stop subscribing, all past project files – whether that’s Premiere Pro, Lightroom, Photoshop, InDesign, etc. – can no longer be opened until you renew.

Unfortunately, when you hit a day like Wednesday, all rational arguments go out the window.

The App Store model

If you are a Final Cut Pro X user, then Wednesday might have stirred the urge to say, “I told you so.” I get that. The App Store method of purchasing/installing/updating software works well. You only have to sign in for new purchases, new Mac installations, and occasionally for updates.

However, don’t be smug. Certain applications that Apple sees as a service, like News, occasionally prompt you to sign in with your Apple ID again before you can use that software. This is true even if you haven’t subscribed to any paid magazines or newspapers. In a scenario such as Adobe’s Wednesday outage, I can imagine that you would be just as locked out. Not necessarily from your creative apps, like FCPX, but rather from certain software/services, such as News, iTunes, Music, etc. As someone who uses my iCloud e-mail account quite a lot through web browsers, I can’t count the number of times access has been hampered.

License managers

Similar to the App Store or Adobe Creative Cloud, some companies use license manager software that’s installed onto your machine. This is a common method for plug-in developers, such as FxFactory and Waves. It’s a way of centralizing the installation and authorization of that software.

FxFactory follows the App Store approach and includes purchasing and update features. Others, like Waves and Avid Link, are designed to activate and/or move licenses between machines, based on the company’s stored online database of serial numbers. You typically do not need to stay online or signed in unless you are making changes or updates, or your system has changed in some way, such as a new OS installation. These work well, but they aren’t bulletproof – as many Media Composer editors can attest.

Activation codes

One of the oldest methods is simply to provide the user with a serial number/activation code. Install the software and activate the license with the supplied code number. If you need to move the software to a different machine, then you will typically have to deactivate that application on the first machine and activate it again on the second machine. You only have to be connected to the company’s servers during these activation times. Some companies also offer offline methods for activation in the event you don’t have internet access.

Seems simple, right? Well, not necessarily. First of all, if you have a lot of software that uses this method, that’s a lot of serial numbers to keep track of. Second, some companies only give you a limited number of times you can deactivate and reactivate the software. If that’s the case, you can’t really move the license back and forth between two machines every other week. And if your motherboard dies with the software activated, you will likely have to jump through hoops to get the company to deactivate the number on their server so that you can activate it again on the repaired machine. That’s because the new motherboard makes it register as a completely different machine. Finally, some of these companies also require you to occasionally sign in or reactivate the software in order to continue using it.
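
As a hypothetical illustration (not any specific vendor’s implementation), the server-side bookkeeping behind this model might look something like this Swift sketch – the activation and deactivation limits are invented for the example:

```swift
// Hypothetical server-side record for one serial number: it tracks which
// machine IDs the license is active on and how many deactivations remain.
// A replaced motherboard presents a new machine ID, which is why a dead
// board strands the activation until support resets the record.
struct SerialRecord {
    let serialNumber: String
    let maxActivations: Int = 1          // assumed: one machine at a time
    var deactivationsRemaining: Int = 5  // assumed limit
    var activeMachineIDs: Set<String> = []

    mutating func activate(machineID: String) -> Bool {
        guard activeMachineIDs.count < maxActivations else { return false }
        activeMachineIDs.insert(machineID)
        return true
    }

    mutating func deactivate(machineID: String) -> Bool {
        guard deactivationsRemaining > 0,
              activeMachineIDs.remove(machineID) != nil else { return false }
        deactivationsRemaining -= 1
        return true
    }
}
```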

Hardware license key

Ah, the “dongle.” When Avid switched to software licensing, many Media Composer editors approached it with the attitude of “from my cold, dead hands.” And so, Avid still maintains hardware licensing for many Avid systems. Likewise, Blackmagic Design has also shipped dongles for DaVinci Resolve and Fusion. iLok devices, common among Pro Tools users, are another variant of this. Dongles, which are actually USB hardware keys, make it simple to move authorization between machines. That’s useful if you maintain a fleet of rental systems. No internet required. Just a USB port.

Unfortunately, dongles are subject to forgetfulness, loss, breakage, theft, and even counterfeiting. A friend reminded me that when Avid Symphony first came out and a system cost $100K, dongle theft was a very real issue. That’s likely less the case now, because software is so cheap by comparison. I do know of film schools that ran their Media Composer dongles on USB extension cables strung to the inside of their Mac Pros – lock the case for viable theft prevention.

Free

The Holy Grail – right? Or so many users believe. It’s the model Blackmagic uses for the standard versions of Resolve and Fusion. Many plug-in developers use the free model on a few plug-ins just to get you interested in their other paid products. And the ease of making Motion templates for Final Cut Pro X has created a homegrown hobbyist/developer market of free or extremely cheap effects and graphics templates.

Even though some commercial software is free, you are only granted the right to use it, not ownership – often in exchange for user data, so that you can be marketed to in the future. As a business plan for a commercial developer, this model is only sustainable because of other revenue, like hardware sales. And in practice, even the Mac App Store model, with its “buy once” policy, is close to free when you own and personally control a number of Macs.

There are pros and cons to all of these models. They all work well until they don’t. When there’s a hiccup, roll with the punches, or contact support if it’s appropriate. With some luck, there will be a speedy resolution and you’ll be back up and running in no time.

©2020 Oliver Peters

Video Technology 2020 – Shared Storage

Shared storage used to be the domain of “heavy iron” facilities, with Avid, Facilis, and earlier Apple Xserve systems providing the horsepower. Thanks to advances in networking and Ethernet technology, shared storage is now accessible to any user. Whether built-in or via adapters, modern computers can tap into 1Gbps, 10Gbps, and even higher networking speeds. Most computers can natively access Gigabit Ethernet (1Gbps) networks – adequate for SD and HD workflows. Computers designed for the pro video market increasingly sport built-in 10GbE ports, enabling comfortable collaboration with 4K media and up. Some of today’s most popular shared storage vendors include QNAP, Synology, and LumaForge.
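
Those rules of thumb fall out of simple arithmetic. Here’s a quick Swift sketch – the bitrates are rough approximations of ProRes 422 HQ data rates, and the 80% efficiency factor is an assumed allowance for protocol overhead:

```swift
// Back-of-the-envelope check of how many simultaneous playback streams
// a network link can sustain. Bitrates are illustrative approximations;
// real-world throughput also loses capacity to protocol overhead,
// modeled here with a simple efficiency factor.
func maxStreams(linkGbps: Double, streamMbps: Double, efficiency: Double = 0.8) -> Int {
    let usableMbps = linkGbps * 1000 * efficiency
    return Int(usableMbps / streamMbps)
}

let hd1080p30 = 220.0   // ~ProRes 422 HQ at 1080p30 (approximate)
let uhd2160p30 = 880.0  // ~4x the HD rate for UHD (approximate)

print(maxStreams(linkGbps: 1, streamMbps: hd1080p30))    // 1GbE: ~3 HD streams
print(maxStreams(linkGbps: 1, streamMbps: uhd2160p30))   // 1GbE: 0 UHD streams
print(maxStreams(linkGbps: 10, streamMbps: uhd2160p30))  // 10GbE: ~9 UHD streams
```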

This technology will become more widespread in 2020, with systems easier to connect and administer, making shared storage as plug-and-play as any local drive. Network Attached Storage (NAS) systems can service a single workstation or multiple users. In fact, companies like QNAP even offer consumer versions of these products designed to operate as home media servers. Even LumaForge sells a version of its popular Jellyfish through the online Apple Store. A simple online connection guide will get you up and running – no IT department required. This is ideal for the individual editor or small post shop.

Expect 2020 to bring higher connection speeds, such as 40GbE, and even more widespread NAS adoption. It’s not just a matter of growth. These vendors are also interested in extending the functionality of their products beyond being a simple bucket for media. NAS systems will become full-featured media hubs. For example, if you are an Avid user, you are familiar with their Media Central concept. In essence, the shared storage solution becomes a platform for various other applications, including the editing software, plus management apps for user permission control, media queries, and more. Like Avid, the other vendors are exploring similar extensibility through third-party apps, such as Axle Video, Kyno, Hedge, Frame.io, and others. As such, a shared storage network becomes a whole that is greater than the sum of its parts.

Along with increased functionality, expect changes in the hardware, too. Modern NAS hardware is largely based on RAID arrays of spinning mechanical drives. As solid state (SSD) storage becomes more affordable, many NAS vendors will offer products featuring RAID arrays configured with SSDs or even NVMe storage – or a mixture of the two, with the SSD-based units used for short-term projects or cache files. Eventually the cost will come down enough that large storage volumes can be cost-effectively populated with only SSDs. Don’t expect to be purchasing 100TB of SSD storage at a reasonable price in 2020; however, that is the direction in which we are headed. In this coming year, at least, mechanical drives will still rule. Nevertheless, expect some percentage of your storage inventory to be SSD-based before long.

Click here for more on shared storage solutions.

Originally written for Creative Planet Network.

©2020 Oliver Peters

Video Technology 2020 – The Cloud

The “cloud” is merely a collection of physical data centers in multiple locations around the world – not much different from a small storage center you might have. Of course, these facilities employ more advanced systems for power, redundancy, and security than you do. When you work with one of the companies marketing cloud-based editing or a review-and-approval service, like Frame.io or Wipster, they provide the user-facing interface, but they are actually renting storage space from one of the big three cloud providers – Google, Amazon, or Microsoft.

There are three reasons that I’m skeptical about ubiquitous, cloud-based editing (with media at native resolutions) in the short term: upload speeds, cost, and security.

Speed

5G (fifth-generation wireless) is the technology predicted to offer adequate speeds and low latency for native 4K (and higher) media. While 5G will be a great advancement for many things, it’s a short-range signal requiring more transmission sites than current wireless technology. Full coverage in most metro areas, let alone widespread geographical coverage worldwide, will take many years to deploy. Other than potential camera-to-cloud uploads of proxy media in the field, 5G won’t soon be the killer solution. Current technology still dictates that if you want the fastest possible upload speeds for large amounts of data, you have to tap in as close as possible to the internet’s backbone.

Cost

Cloud storage is cheap, but extensive upload and download times aren’t. Unfortunately, modern video resolutions also generate huge amounts of data on every shoot. Uploading native 4K media for a week-long production is considerably more expensive than paying FedEx overnight charges to ship drives. What about long-term storage? Let’s say that all of your native media is in the cloud and you pay according to a monthly or annual subscription plan. But what if you want to stop? That media will have to be downloaded and stored locally, which will incur download (egress) charges, as well as your time to pull everything down.
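
To put rough numbers on that, here’s a back-of-the-envelope Swift sketch. The 1TB-per-day shoot volume, the 100Mbps uplink, and the efficiency factor are all illustrative assumptions, not measurements from any particular production:

```swift
import Foundation

// Rough upload-time estimate for native camera files.
func uploadHours(terabytes: Double, uploadMbps: Double, efficiency: Double = 0.8) -> Double {
    let bits = terabytes * 8_000_000_000_000         // TB -> bits (decimal units)
    let seconds = bits / (uploadMbps * 1_000_000 * efficiency)
    return seconds / 3600
}

// Assume ~1 TB of dailies per shoot day over a 100 Mbps uplink:
let hoursPerDay = uploadHours(terabytes: 1, uploadMbps: 100)
print(String(format: "%.1f hours per day", hoursPerDay))  // ~27.8 hours

// At these assumed rates the upload can't even keep pace with a single
// day's footage - which is why overnighting a drive often wins.
```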

Security

Think these sites are unequivocally secure? Look at any data hack at a major company. Security is such a concern in our business that most major movie studios won’t let their editors connect their computers to the internet. Many make those editors check their cell phones at the door. No matter how secure the cloud may be, it’s going to be a hard sell, except for limited slices of the production, such as cloud-based VFX rendering.

I do believe 2020 will be a year in which many will take advantage of some modes of long-distance, cloud-based editing services using low-res proxy media. Increasingly, some services will be used to move dailies and deliverables around the globe via the cloud. But that’s a far cry from cloud-based editing becoming the norm. One scenario many will experiment with is to store the edit project files in the cloud, with the media mirrored locally at each edit site. This way, only the lightweight files used for edit collaboration need to be moved over the internet. Think of this as Google Docs for editing. Adobe already offers a version of this, but I suspect you’ll see others, including solutions for Final Cut Pro X. So while true cloud-based editing is not a near-term solution, bits and pieces will become increasingly commonplace.
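
To see why that model is so lightweight, here’s a hypothetical Swift sketch of the idea – the types and fields are invented for illustration: the synced project carries only small references, and each site relinks them against its own local mirror of the media.

```swift
import Foundation

// Hypothetical illustration of the "project in the cloud, media mirrored
// locally" model: the synced project stores only lightweight references,
// and each edit site relinks them against its own local media volume.
struct MediaReference: Codable {
    let fileName: String
    let checksum: String   // verifies both sites hold identical media
}

struct SharedProject: Codable {
    var title: String
    var clips: [MediaReference]
}

// Each site resolves references against its own mirror of the footage.
func relink(_ ref: MediaReference, localMediaRoot: URL) -> URL? {
    let candidate = localMediaRoot.appendingPathComponent(ref.fileName)
    return FileManager.default.fileExists(atPath: candidate.path) ? candidate : nil
}

// Only the small project file ever crosses the internet; the heavyweight
// camera files never leave each site's local storage.
```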

Originally written for Creative Planet Network.

©2020 Oliver Peters