Apple recently announced that they will soon begin a two-year transition away from using Intel chips in their computers to using in-house ARM chips instead. How will this major paradigm shift affect creatives who use Macs for their work?
A huge number of photographers, videographers, and other creatives rely on Mac computers day in and day out for their work. Apple is known for creating seamless systems that work reliably, and its recent announcement that it will transition from Intel chips to its own in-house ARM chips over the next two years should only increase the reliability of the ecosystem. But what impact will the move have on creatives who use Macs? There are still a ton of technical details to iron out, but here are some of the broad things we can expect.
Better Battery Life
ARM chips are generally more power-efficient than comparable x86 chips, and that can translate to thinner and lighter MacBooks with better battery life. Creatives often run intensive applications that drain power quickly, and for those on the go, particularly photographers and filmmakers who frequently travel for work, battery life can be one of the most important features of a device.
Less Heat
An ARM processor generally produces less heat than an equivalent Intel chip. That can mean longer periods of sustained high-level performance when working on processor-intensive tasks. Lower-end Macs, which tend to be the most thermally constrained, should also see performance benefits.
Better Continuity Between Mobile Devices and Macs
iPhones and iPads are already on ARM chips and have been since their respective introductions. With ARM chips being introduced in Macs, apps should be able to work on both platforms with much greater ease, with the biggest hurdle likely being simply adjusting for different input methods (touchscreen versus mouse and keyboard). This is fantastic in two ways. First, it means you will likely be better able to grab your iPad on your way out the door and pick up where you left off on your Mac with little to no change in interface or capabilities. Second, it means there will be a greater availability of apps on both platforms, particularly since you will likely be able to run iOS apps on the Mac.
It Could Strongly Encourage Companies to Reexamine Their Software Code
It is no secret that apps like Lightroom are bloated and slower than they should be given what they are doing and the hardware they are running on. If you have used Lightroom for the iPad, which was developed from the ground up, you have probably noticed that it is often much, much faster than its desktop counterpart. In fact, given the smoother performance, the great touchscreen performance, and the fantastic screen of the iPad Pro, I actually do the majority of my Lightroom work on my tablet now. The switch to ARM chips could be the impetus companies like Adobe need to rebuild their core apps and modernize them, which could translate to significant performance improvements down the road.
An Instant Performance Improvement Due to ARM64's Architecture
The ARM64 (AArch64) architecture supports roughly twice as many general-purpose registers as x64: 31 versus 16. These registers temporarily store data for the processor, and having more of them helps the CPU avoid falling back on slower memory and caches. Put up against a comparable x64 architecture, you can expect a performance improvement of around 25%.
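As a rough illustration of why register count matters, consider a routine where many temporary values are all live at once. The sketch below is a minimal, hypothetical Swift example (the function and its data are invented for demonstration, and it is not a benchmark): with 31 general-purpose registers on AArch64 versus 16 on x86-64, the compiler has more room to keep values like these in registers instead of spilling them to the stack. Whether spills actually occur depends on the compiler and optimization level.

```swift
// Illustrative only: a routine with many simultaneously live temporaries.
// On AArch64 (31 general-purpose registers, x0-x30) the optimizer has far
// more room to keep these values in registers; on x86-64 (16 GPRs) the
// same code is more likely to spill some of them to the stack.
func checksum(_ p: [Int]) -> Int {
    let a = p[0] * 2, b = p[1] + a, c = p[2] - b, d = p[3] * c
    let e = p[4] + d, f = p[5] * e, g = p[6] - f, h = p[7] + g
    // All eight temporaries are still live here, so they compete for
    // general-purpose registers at the same time.
    return a + b + c + d + e + f + g + h
}

print(checksum([1, 2, 3, 4, 5, 6, 7, 8]))
```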
No More Boot Camp
Boot Camp, which allows Intel-based Macs to run Windows in a dual-boot setup, will not be included on future ARM-based Macs. That does not rule out virtualization, and in fact, virtualization is the likely route forward.
Your Apps Will Be Fine
If you are worried about your apps suddenly not working on the new processors, everything will be fine. When Apple switched from PowerPC to Intel, it bundled macOS with Rosetta, a translation layer that let PowerPC apps run on the new computers. The new ARM Macs will ship with Rosetta 2, which will allow you to run your x86 applications as needed; from the user experience side of things, it will likely look like nothing has changed at all. It is not yet clear exactly what performance to expect from x86 apps running through Rosetta 2, but Apple says it is quite good, as demonstrated on the Developer Transition Kit. For apps that support both x86 and ARM, Apple has introduced the "Universal 2" binary, which contains a version of compiled code for each architecture. For the end user, it will look largely like business as usual.
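For the curious, a process can even ask the system whether it is currently being translated. Apple documents a sysctl.proc_translated flag for the Rosetta environment; the small Swift wrapper below (the function name is mine) is a minimal sketch of how to read it, not production code.

```swift
import Darwin

// Returns true when the current process is being translated by Rosetta 2.
// Apple exposes this via the sysctl.proc_translated flag: 1 = translated,
// 0 = running natively. The key does not exist on Intel-only systems.
func isTranslatedByRosetta() -> Bool {
    var translated: Int32 = 0
    var size = MemoryLayout<Int32>.size
    if sysctlbyname("sysctl.proc_translated", &translated, &size, nil, 0) == -1 {
        return false // key not found: not running under Rosetta
    }
    return translated == 1
}

print(isTranslatedByRosetta() ? "Running under Rosetta 2" : "Running natively")
```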
Developers Are Already Starting to Prepare
Apple has already made Developer Transition Kits available, which are essentially Mac Minis with the A12Z chip (the same one used in the 2020 iPad Pro) inside. These will let developers get to work on applications built for ARM chips, making the transition significantly smoother.
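To give a flavor of what that preparation can involve, Swift already offers compile-time architecture conditions, so a universal app can gate architecture-specific code paths per slice of the binary. A minimal sketch (the printed strings are just for illustration):

```swift
// Compile-time architecture check: Swift's built-in arch() condition
// selects a code path for each slice of a universal binary.
#if arch(arm64)
let architecture = "arm64 (Apple silicon)"
#elseif arch(x86_64)
let architecture = "x86_64 (Intel)"
#else
let architecture = "unknown"
#endif

print("This slice of the binary was compiled for \(architecture)")
```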
Continued Support of Intel-Based Macs
There are millions upon millions of Intel-based Macs in use right now, and many of them are quite new and nowhere near the end of their service lives. In fact, Apple just released the third generation of the Mac Pro, its flagship desktop and most powerful computer, less than a year ago, in 2019. The company is not going to flip a switch and leave its Intel-based hardware in the dust. In fact, Apple stated that it plans to support Intel-based Macs for "years." What that means precisely is of course not yet clear, but given that the company will not even complete the hardware transition for two years and surely does not want to pull the rug out from under millions of Mac users, I would say it is fair to assume Intel-based Mac users can expect to receive updates and support for at least the next five to six years, and likely for the reasonable service lives of their machines.
Business as Usual With Some Benefits
Developers certainly have a lot of work ahead of them, but for users, though this transition is a huge shift for Apple, it seems there is not much to worry about and some great perks to look forward to.
I just hope people don't expect the full performance of an x86 CPU from an ARM chip designed for mobile.
With this continuity stuff, is Apple not admitting that their MacBook Pro is too thin to house a full i7 or i9 CPU?
Thermal throttling is an issue on the MacBook and Mac Mini computers, so their move is not to make them bulkier, but rather to install a CPU with a completely different instruction set and hope that Adobe and other developers will follow.
To me, this is Apple's last shot at staying on their feet.
Lots of misinformation here... First off, you CAN get a MacBook Pro with an i7 or i9 RIGHT NOW. Performance will likely be BETTER than Intel chips; iPad Pros are already faster than many laptops out there, including some from Apple. Intel has been too slow to move to the smaller process sizes that Apple and Qualcomm have been using in smartphones for years now, and they haven't been able to integrate many of the custom features Apple designs for the A-series chips found in iOS devices, such as the secure element that enables Touch ID and Face ID, or the Neural Engine for much faster machine learning stuff. Not only have they been unable to keep up, they've been dragging down the Mac line for years because of the constant delays in new processors. Apple's last shot? LOL! Honestly, if Microsoft starts putting real resources behind Windows on ARM, this could be the beginning of the end for Intel. Sure, it's a risky move for Apple in many ways, but hardly a "last shot."
You are dead wrong...
The last five generations of CPUs have been more than enough for photographers. The majority of post-processing software, including Lightroom, makes use of fewer cores and faster clock speeds.
Video is an entirely different matter, and that depends on which software you are running... Adobe or FCP.
ARM is going to be optimized for native applications and customized for Mac hardware. It's going to be faster and more efficient.
I just hope that Adobe does NOT use this as an excuse to drop Lightroom Classic for the Mac... During the keynote, Apple was like "look, here's Photoshop and Lightroom running on Apple silicon," but it was obviously the new Lightroom... Otherwise, these machines should be pretty damn nice when they roll out, with better battery life, better performance, and more features that developers can use, such as the Neural Engine for AI stuff.
This is the worst thing about macOS: new OS updates EOL software at an annoying rate.
Unfortunately, that's part of the macOS/iOS architecture. It's been good in that it allows Apple to change things faster -- they don't have to drag around as much legacy. But if they move too quickly, they lose developer support. It's not going to change going forward, either.
That's exactly what I expect. LR Classic is outdated software by design (it's actually a Lua interface on top of ACR), and I don't expect Adobe to port it.
The attractiveness of the "Apple Silicon" approach is pretty clear. For Apple, anyway. Apple's Mac ecosystem has been a falling percentage of their overall business every year. The Mac needs more models than iOS does, plus all kinds of extra software support, testing, etc. So unifying the two architectures means lower development costs in software. But it also suggests lower costs in silicon, as they're going to deliver iPad-like SoCs for Macs. That's not the same processor as in the iPad or iPhone, but just as the iPad generally gets an upgraded version of the same basic architecture as the iPhone, the Mac processor will be an iPad chip with more CPU and GPU cores.
Next comes the GPU. That's a concern for anyone doing high-end work. If you're happy with Intel graphics, you're probably going to be okay with Apple graphics, especially since they drive their silicon and "Metal" API together rather than needing to optimize for OpenGL. On the other hand, any professional application is going to need OpenGL. I don't think Apple's going to be competitive with Nvidia or AMD in ten years, much less by 2022.
That could also get really, really expensive. The SoC saves money -- one fairly low pin-count chip, with DRAM in a POP module on top. But if you chase Intel, Nvidia, and AMD too far down the performance rabbit hole, you can't make it an SoC anymore. Too much heat, since you can't trade performance for low power anymore and still be competitive at the high end. So now it's not one chip; it's a separate GPU and CPU. So you need dozens of PCI Express lanes, making the CPU even hotter. The GPU needs its own graphics memory; it's not going to be competitive with shared memory anyway. And GPUs aren't just for graphics these days, but computation. And of course, pro systems, Mac Pro-class systems, can have four GPUs.
How many Mac Pros does Apple sell in a year? I don't think they can make a radically different chip for such a small market. Sure, they could still support AMD or even Nvidia GPUs for the high end, but they're going to need those PCIe lanes, those four or so DDR4 channels, etc... and that's how your 200-400 pin SoC becomes a 4094-pin CPU.
Now of course, that's more a videography thing than a photography thing, though we do get GPU acceleration in Photoshop. I've had a 64 GiB memory system since 2013, specifically for photographic needs (large composites). Not everyone does, but you're not going to get those large memories if Apple keeps costs low and architectures identical by sticking to POP modules. At least not yet... I guess we'll see where that goes in 2022!
I'm just not convinced that this 1st gen Apple silicon is going to bring any benefits to the consumer. I know Apple are very good at not making direct comparisons and hiding behind marketing jargon. Plus they have a user base that is more or less locked in, so even if this first round isn't up to scratch it probably still won't hurt their sales.
Are you *the* Dave Haynie who developed the Zorro buses of the Amiga and more? I remember a sentence you(?) put in the dev papers of the Amiga 3000+: "I believe what I see, I see what I believe."
Yeah, I'm the Zorro III guy (the original Amiga team designed the Zorro bus -- I did the 32-bit version using the same connector). Most of the quotes in papers and schematics were from songs or books, a thing I swiped from Stephen King's books. I think this one comes from Lewis Carroll's Mad Hatter.
Great! Thank you for responding. I received the former developer papers of the 3000+. We were a pretty big user group, the "Amiga User Group Switzerland" (AUGS, which somehow still exists, though I quit in the mid-'90s), and from about 1989 on we operated several UUCP nodes with Amigas using USRobotics HST modems. C= got us developer status back then. I was and still am very impressed by your work. I was a hardware hacker working on CPU cards for the 68020 for the Amiga 1000 (which I still have, along with my A3000UX). It is an honour to "meet" you. You were one of my idols back then. Cheers!
Cheers! It's great to "meet" folks who were inspired by the work we did back then. That's the part that's lasted, even after most people have forgotten Commodore and Amiga.
That's right! (Thanks!) Those were great IT pioneering times. The Amiga was state of the art back then (I went from a ZX80 to a CBM 3000, was impressed by an Intertec Superbrain with two Z80 CPUs that my math teacher used to solve elliptic encryption problems, then to a C64, then the Amiga). I was as fascinated by computing as by photography; the former became my profession, the latter my passion.
Your apps will be fine as long as the new Rosetta 2 is included in macOS. Remember what happened with PPC to Intel? I don't like the road Apple is choosing to take: a chip soldered onto the mainboard to guard all components, memory soldered on board, SSD soldered on board. Repairability = almost none.
I moved on to Linux - it was hard, and it took me years to get calibrated colors - but in the end, it's my machine; I'm the master of it. Not Apple, not Google, not Microsoft, not Oracle, not Amazon. And I want it to stay this way, no matter what that takes.
There's still the possibility that you'll be able to buy Intel CPUs inside a Mac - but it'll cost you even more than before. No one knows, and it'll depend on how smart Apple buyers will be.
I don't like being locked into an ecosystem - and what we see on the market is that you have to make a choice, and then, kaboom, you're locked in with no escape possible. Your data is theirs!
I guess Apple is giving up on the commercial market. It was a brilliant move when Steve Jobs switched to Intel: it allowed Macs to run Windows via Boot Camp, and that gave Macs the ability to run PC apps, such as Microsoft Office. This increased sales, as many people wanted to use their Macs in the office but were limited by software. Like it or not, most companies use PC-based software. In my work, the software (CAD/CAE) is almost all PC based, so having the ability to run PC apps under VMware or Boot Camp makes it possible for me to use a Mac. VMware works better when the processor is the same. By changing the processor architecture and losing Boot Camp, Apple will be taking away the ability to run PC software, and VMware will not work as well. I personally don't think this is a wise move for Apple. They will lose commercial sales because of this. I rue the day I will have to give up my Mac at the office, but I fear this change is moving that way.
Yup... x86 Macs with Boot Camp allowed Apple to spend less on development, and it also took away one big barrier to entry in many businesses: if Apple fails, we can still run Windows. That wasn't an uninteresting question in the early x86 Mac days.
Today, of course, no one believes Apple's in any danger of failure. And once again, moving processors will save them money, as they'll swap multi-chip x86 processors for an ARM system-on-chip across at least most of their product line.
I don't see much of a viable way for them to support the Mac Pro, since there just aren't enough Mac Pros made for Apple to build an ARM version of a Xeon. Though maybe with some advanced techniques like AMD's chiplets it wouldn't be entirely out of the question, it still looks like a big NRE bill for maybe a few hundred thousand CPUs. They'd still need to support multiple AMD/Nvidia-class GPUs, too, and four to eight DDR4 memory buses, not a simple POP module, to feed the 8-64 CPU cores you're going to get on a modern desktop workstation.
That doesn't mean Apple can't sell a machine called Mac Pro -- they of course define what that means. But matching today's Mac Pro on performance, much less delivering the substantial performance increase they'll need by 2022 (the year Apple claims they'll be ARM-only)? It will be interesting to watch!
What about Capture One Pro then?