Is Apple Pushing Photographers to Use Windows?

For years, I've been the biggest supporter of everyone using a Mac, except gamers. Especially if you're a photographer or graphic designer, it just makes sense, and it always has. But as current events unfold, it's becoming harder and harder to stick with the platform, no matter how great it actually is.

The Good About Apple

The Mac operating system is what makes it so great. It's not so much the hardware; although the hardware is very nice and high quality, it's still commodity hardware, using Intel processors just like a PC. For the user, it's not really about the hardware anymore. The operating system (when you know how to use it as a power user) is what makes things so efficient and effective for graphics professionals in general, and photographers especially.

The macOS Finder is truly incredible, as I mentioned before in my Mac tips. Little things like right-clicking the header of an open document to open that file's containing folder (or jump anywhere in its tree with ease) are vastly superior to Windows.

The reliability of macOS is probably a decade ahead of Windows, no joke. There's no comparison on reliability.

I won't just make a statement like that without explaining why. The reason is that Apple builds its OS to run only on its own hardware, meaning it knows exactly which configurations of Apple computers are running macOS, since it manufactured them all. Windows, in contrast, has to generalize many things to be able to work on a near-infinite number of different hardware configurations. On PCPartPicker alone you could configure a PC with thousands of different setups, and the one operating system, Windows, has to try to work with all of them. That is far more difficult than making an OS work with your own handful of specifically built machines, so by that alone the Mac's reliability is very solid, before even getting into the BSD-derived base system that macOS runs on.

Apple Versus Windows: The Pros and Cons

Either operating system, Windows or macOS, is capable of running the programs and getting the job done. They differ greatly in workflow, but they run the same programs with roughly the same capabilities. Windows even has a few little things that are better than the Mac, such as the ability to customize extra mouse buttons if you have, say, a seven-button mouse. The Mac has never been able to utilize those buttons, and that's a real shame, because that one little thing can make a tremendous difference in efficiency. As nice as that feature is, it doesn't make up for all the benefits macOS provides, but it's something.

Bottom line: a capable computer user with above-novice skills can use either operating system and get their work done.

Why Apple Is Making a Grave Mistake

Here's the scenario: one of my workstations is a 2006-2008-era Mac Pro, and when it was new it was leading edge: quad-core, 32 GB RAM, 512 MB GPU. The Apple operating system is designed well, unlike Windows, which I believe is planned obsolescence given the way the registry is structured; Windows actually slows down the longer you use it. macOS stays the exact same speed, but the perceived speed changes as software continues to develop and become more demanding. Cameras get more megapixels, software gains features that are more processor- and GPU-intensive, and those are the things making my 06-08 Mac Pro not work as well as it once did. It's the same speed it was in 2008, but that's not good enough by today's demanding standards.
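The arithmetic behind that perceived slowdown is easy to sketch. Here's a rough back-of-the-envelope calculation (my assumption: a single uncompressed 16-bit RGB layer; real files with multiple layers, masks, and undo history are several times larger):

```python
# Rough memory footprint of one uncompressed image layer:
# bytes = pixels * channels * bytes_per_channel
def layer_mb(megapixels, channels=3, bytes_per_channel=2):
    """Approximate size in MB of a 16-bit RGB layer."""
    return megapixels * 1_000_000 * channels * bytes_per_channel / 1_000_000

print(layer_mb(12))  # 72.0  -- a 2008-era 12 MP camera
print(layer_mb(50))  # 300.0 -- a modern 50 MP body
```

Multiply that by a handful of layers plus undo history, and the same 2008 workstation is pushing gigabytes per open file, which is why it feels slower even though nothing about the machine itself changed.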

Apple's Mac Pro releases are few and far between. The last release was the 2013 Mac Pro, and it was absolutely cutting edge, with a second-generation PCIe SSD, good GPUs, fast RAM, Thunderbolt, etc., but it's now four years old already, and in computer terms that is an eternity.

So, what are my options for replacing the 06-08 Mac Pro workstation? I could buy a used 2013 Mac Pro for still close to $2,000. That's a lot of money for already-four-year-old technology, and while they are fast and work well, they're well behind the current curve of processors and hardware architecture, such as the i7-7700. For less money than a used 2013 Mac Pro costs, one could build a system with an 8 GB GPU, an i7 at 4.2 GHz, DDR4 RAM, and an SSD. That computer, performance-wise, will run circles around a 2013 Mac Pro. Back a few of those specs down and you're around $1,000 for a really fast, modern-architecture system.

That makes it very difficult to justify a four-year-old used computer with half the performance, or less, for twice the money. Yes, macOS is ideal for what I do, but at some point hardware that is 20x faster has to make a difference. Yes, there will be some negatives to deal with on the Windows side (such as needing some kind of antivirus), along with a few positive little things like the mouse-button customization.

Further doubling down on its mishandling of the pro market, Apple announced at the keynote that there won't be a new Mac Pro desktop; instead there will be a new iMac Pro, which does boast some nice specs, but I hate iMacs for professional use. They're a great home or family computer, but I require more customization than that. What if I don't like the screen size? What if I want extra hard drives or a different GPU? With a Windows build, if I want a different GPU I can just pop out the existing card and drop in a new one, no big deal.

Then there's the cost of this new iMac Pro: a rumored whopping $5,500 or more.

So Apple is leaving a certain market of professionals behind with the path it's headed down: a really expensive high-end iMac that I don't want anyway, or a lot of money for really old technology, with no in-between. That's a rough place to be, because I truly love macOS and what it gives my workflow. Windows will definitely be cumbersome, but the hardware is so much further advanced that it's starting to look like the lesser of two evils.

Apple is essentially forcing my hand, and likely many others'. There will be a few professional markets left for them, since some folks may like and be OK with an iMac Pro and have the $5,000-plus to buy one. The rest of us are left with a tough choice. In the past, I've been happy to live with four-year-old hardware to avoid dealing with Windows, but with no Mac Pro in production the performance gap only grows. At some point it makes more sense to put up with the hassle of Windows for the performance, while saving a bunch of money and keeping future upgrade options open without having to buy into a whole new system.

It seems Apple is gearing heavily toward the consumer market, and as a business decision that makes sense, since there are a lot more regular consumers than graphics professionals. But it also seems like a huge mistake to abandon the original customer base that made the Apple computer so strong and helped it evolve into what it is today.

What do you think? Is this really the end of the line for feasibly using Apple computers for professional photography?



Jonathan Krier

Great write-up! Also, for those of us who use the Adobe suite, what seems like Adobe throttling its Mac software is most definitely frustrating. I know they now favor the PC over Apple, but come on, the lag and speed I encounter are terrible compared to less powerful PCs I use.

That alone has made me think about switching, but Airdrop and iMessage keep me here.

PC still has the stigma, whether you believe it to be true or not, that its hardware has a short shelf life compared to Apple's. From my experience this is true, and it's why I switched to Apple a few years ago. The price tags of PC hardware are tempting, but I can't switch back given my experience and that of those around me.

Leif Sikorski

Personally, I don't think Adobe is "favoring" PCs; in my opinion it has more to do with how drivers are updated on Windows versus the Mac.
Over the last few years, the biggest progress (besides SSDs/M.2) has been in utilization of the GPU. Almost every professional photo or video application makes use of it. GPU vendors like Nvidia and AMD work heavily on improving it and very often release new drivers for Windows, but on macOS updates have to go through Apple and system updates (from my understanding). It's something not just Adobe struggles with; PhaseOne (Capture One) wrote an article about the problem a year or so ago. It's a fast-evolving technology, but Apple throttles it because it doesn't let GPU vendors update anything related to OpenCL and the like.
Many professional programs that use the GPU have had issues on Macs that never happened in the Windows versions. That's why tips like "turn off GPU acceleration" are quite common on the macOS side, but not on Windows.

Jonathan Krier

Good to know, thank you! I appreciate the enlightenment.

Anonymous

The graphics card is not an important hardware point for Photoshop.

Maggie Mabon

Actually mate, it is. Your graphics card directly relates to how much video ram you have and your video ram processes all of your image data.

Anonymous

No, it is not.
1. Yes (and obviously), the video card has the video RAM.
2. Read the term in context: VIDEO RAM. As in "video," i.e., high frame rates and surface-texture rendering. Not photography.

It's apparently a surprisingly common misconception, because I seem to be the only one here criticizing the mistaken notion that Photoshop requires a fancy video card. It doesn't even materially benefit from one. Rendering moving images in real time benefits from a powerhouse video card; photo editing does not. If you want to survive in multiplayer online gaming like Halo, spend a grand on a video card. If you want to do expert Photoshop editing, save your money.

Please read the reply below I made to Bill Larkin and see the links. Happy to be corrected if you think you have a correction, but I'd like it to be fact-based.

Bill Larkin

The GPU has everything to do with common daily tasks in Photoshop, including but not limited to the live brush, brush-size changes with full tip view, GPU-accelerated rotation, etc. If it didn't matter, Adobe wouldn't have a GPU settings area in Photoshop.

Anonymous

The menu you posted does not support the point you are trying to make. Read it. It just says you can turn features off if your machine does not run well with them on, i.e., if your card does not support certain features like OpenCL (or OpenGL). This is not a function of your card's power. Being able to turn it off is a backdoor trick for older hardware compatibility. Lots of low- to moderate-cost cards support OpenGL/CL.

I refer you back to Northrup, or at least I invite willing learners to watch his very clear advice. The machine he has there, with its common graphics card, is fantastic, even at two years old. Blinding by any standard.

Nolan Henley

We have a newer iMac with an AMD R9 in it, and in About This Mac it doesn't even register that the graphics card is installed (Nvidia fan here) because of some error. This computer will not render half of the adjustments, you cannot use scrubby zoom, and you can forget about rendering files larger than a gig. You might be right that you don't need a graphics card, but Photoshop is buttery smooth with a GTX 1080 under the hood :)

Anonymous

Congratulations! But:

1. I never said, "you don’t need a graphics card."

2. Your very specific example is not particularly useful for generalization. You have one setup showing some error, which probably invalidates its usefulness, and another that works well. It does not follow that you can make a general case from what you see as a difference.

3. I'm not the perfect source of all info. I know some things, have some schooling and experience, and do some research. I have common sense, and I read critically. That's why, when I see an absolute expert (Tony Northrup) agree with what I know, I feel I am probably on the right track.

>> Actually mate, it is. Your graphics card directly relates to how much video ram you have and your video ram processes all of your image data.

Speaking as a professional programmer, this is utter BS. There are some advantages to having a more powerful GPU, but the data for the image being worked on by a graphics application resides by default in main system memory.

Bill Larkin

Yes, the GPU is very important for Photoshop.

Anonymous

I say it is not, but I'm happy to be proved wrong on tech matters, because it's like winning by losing. :) Another factor overlooked in this piece that I was going to comment on, but did not, is disk speed. Disk speed is the main bottleneck, not RAM or the CPU. Disk speed, and bus speed. I use dual 10,000 rpm drives in a RAID 0 for my OS, which was the fastest thing you could do back when I built it. Today SSDs are faster, but you'd still build a RAID setup of some type. The next step is getting the Photoshop/Lightroom swap files off the disk holding the OS and onto a separate disk; that really makes it fly.
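If you want to sanity-check that bottleneck claim on your own machine, here's a crude sequential-write test (a sketch for a Linux shell with GNU `dd`; the path and size are placeholders you'd point at the drive under test):

```shell
# Write 256 MB of zeros and force it to disk; dd's last line reports throughput.
dd if=/dev/zero of=/tmp/dd_write_test bs=1M count=256 conv=fdatasync 2>&1 | tail -n 1
rm -f /tmp/dd_write_test
```

Run it once against the OS drive and once against the drive holding your scratch files; the gap between those two numbers is roughly the ceiling on what moving the swap files can buy you.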

I digress; back to the point. Think about it: why would a graphics card matter for still images? A fast graphics card is all about frame rates and rendering surfaces, both non-factors in editing a single photo. The rotation of a single image (via OpenGL) is a puny task, so that is a non-issue. You don't need a gamer card, and that is where you save real money on your PC (which I suggest geeky people build from components, at least partly for the fun of it).


If you don't wish to believe me, perhaps Tony Northrup will convince you:

Graphics card info starts at 15:45 if you're impatient, but the whole video is excellent (albeit older) info, beautifully presented. The key line is repeated as the last line, in classic pro tech-writer style: "... save yourself some money and get yourself a cheap graphics card."

Short text list version:


And I'd like to emphasize that my view is not (and yours should not be) based on the celebrity status of an Internet opinion alone (although Northrup is a go-to source of information on digital photography). Rather, it's simply computing hardware 101 (and old info at that), and if it does not ring true for you, it's time to restudy the topic.

As for the five down-thumbs on my correct original note: whoops!

John, this video is old and no longer accurate. As far as I know, nowadays OpenCL acceleration makes a tremendous difference in rendering speed whenever one develops raws using Camera Raw/Lightroom.

I personally use Capture One for raw editing, and with OpenCL acceleration turned off for rendering, the image stutters when I make editing changes (dragging a slider), whereas it renders smoothly in perfect real time (at 96 Hz, my screen's refresh rate) whenever I turn OpenCL on with my GTX 1080.

The same is true for OpenCL-assisted image exporting in Capture One: I recorded a three-times-faster rendering speed with this feature turned on compared to CPU-only rendering.

I can also monitor my GPU use: all of its power is maxed out whenever I make editing changes (I can also hear the fan ramp up its speed!), and then it throttles down once the image is rendered. This shows the GPU is fully utilized for still-image rendering.
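For anyone who wants to watch this themselves, one way (assuming an Nvidia card, where the `nvidia-smi` utility ships with the driver) is to query utilization while exporting:

```shell
# Print GPU utilization and memory use once; append "-l 1" to poll every second.
if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi --query-gpu=utilization.gpu,memory.used --format=csv
else
    echo "nvidia-smi not found (Nvidia driver not installed)"
fi
```

AMD and integrated GPUs need a different tool, so treat this as a sketch for the Nvidia case only.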

Anonymous

1. It's from 2015, which is relatively new. And if you peruse the comments you will see "Photographic Utopia" asks for an update, to which Tony N. replied just 11 months ago: "Really, nothing much has changed. The PC industry has ground to a halt." So in effect it's one year old. The principles behind it include older established ideas (the RAID setup) that are rock solid.

2. I never suggested that OpenGL/CL are not important. The incorrect assumption is that they do much better with a high-end card. Yes, you want them activated. But, and this is the point, you don't need a gamer card for OpenGL/CL operations. Looking at Newegg just as an example, I see endless numbers of OpenGL-compatible cards at the lower end of the price spectrum, under $300 and down to far less. There are fewer offerings with CL, but still some at low cost. Note: the high end of the graphics-card spectrum, in my mind, is over $500 to thousands of dollars.

3. So the question, properly put, is: with the understanding that yes, you have OpenGL/CL engaged, is a low-priced card materially slower than a very high-priced card? My understanding is, Tony's is, and yours should be: no.

Happy to be shown to be wrong, I learn from that.

John, what you have said is also substantially incorrect. GPU acceleration is not just on or off; more powerful GPUs process data faster.

BUT there is a price curve, i.e., past a certain point returns diminish.

Anonymous

Your statement is a non sequitur. A slightly faster cpu is faster too, but it does not matter in real time work flows.

OpenGL is either on or off: if your card does not support it, it's off, while a card that supports it suffices as long as it has the recommended RAM. That is the basic hurdle, and it says so on the Adobe support page for graphics cards, with a memory recommendation of around 2 GB. Now, I'm not paid for my time to do research to write an article here, but I do wind up doing some to comment. My look-see shows that an above-average card meeting Adobe's specs can be had for under $250, nowhere near a gamer card. Any more card is in fact a waste of money, as Tony Northrup advises. And of course my own experience agrees.

FYI, I offered a correction to Northrup too: he never mentions a separate disk for the Lightroom and Photoshop swap files. They should be on a disk that does not hold the Windows swap file; that really improves performance.

Here's a question for you; I am curious, how do you monitor your GPU? What utility does that for you? I'd be interested to know, thanks.

The "substance" (as in substantial) we are discussing is best-practices advice, perhaps for neophytes or non-nerds who need to cut to the chase, save money, and get on with learning and working. My view concurs with Northrup's, and not because he said it. However, he is probably as good an authority as any, and surely better than any writer here.

>>>Your statement is a non sequitur. A slightly faster cpu is faster too, but it does not matter in real time work flows.<<<

Ahh, the telltale combination of pomposity and distortion...

A "slightly faster cpu" won't matter. A *MUCH* faster one will. No one was restricting the discussion to "slight" anything, John.

And yes, this matters even during "real-time workflows," insofar as that horrible phrasing means anything.

More power changes what you can do without waiting around: for example, wavelet splits, film simulations, even something as simple as running a Gaussian filter with a wide radius.

Now, this extra power might not help you, because very possibly you don't know what the hell you are doing and stick to using only very basic tools. But, as people have already told you, although you were too silly to listen, other people have different needs.

...Not everything is about *you*, John. It's a harsh truth, I know...

Anonymous

Pardon, excuse the Latin: "A non sequitur, in formal logic, is an invalid argument. In a non sequitur, the conclusion is either true or false, but the argument nonetheless asserts the conclusion to be true and is thus fallacious." I hope that helps.

If you watched the Northrup video, which I posted as the source of authority rather than implying I am one, you know he states that "some filters" benefit from a top-end video card. Like your blur. If you spend all day blurring images (like you do your written replies), then yeah, it will matter to your workflow. I wonder how your volume of work compares to his. In any case, direct your criticisms from now on to the expert: Tony Northrup.

I'm concerned that willing learners not be misled by bad information. Willing learners should understand that a good editing machine does not require a top-end video card. P.S. Oh, and did you stuff some lame slights about me in your rickety post? I didn't notice.

Have a great day!


Anonymous

Desperate stuff, lol!

Elan Govan

Never used an Apple computer... more of a "devil you know" kind of issue here.

Just get an iMac, either Pro or regular. They're great and work well. You can still add extras. Don't overthink it. Personally, I'm glad the iMac Pro is what it is; the modular trash can was a clunky option.

Bill Larkin

See, that's the problem: with an iMac you are married to THAT screen, which looks nice but is NOT pro level like an Eizo or similar for a high-end color workflow. That's why the iMac is really better suited to home use than a professional environment. I know some photographers use them; I just cannot.

Anonymous

You have that wrong, Bill. The pro rule of thumb is that you do not need an Eizo-type monitor unless you are doing graphic design work for offset-printing output, as when you are distributing advertising via print media nationwide using many print vendors. As with our interaction above on graphics cards, I could whip up the reference material, but I'm not paid to take the time, so... ;)

Photography is no longer tied to offset printing as it was in the heyday of the glossy magazine, so the need for triple-the-cost Eizos (and some other specific brands) is no more. I haven't used my Spyder color-matching device in years.

Most photo work (for readers here especially) goes to either the Internet or inkjet output, where precise color-matching monitors like Eizo are not important. What you want for digital photography is a large, high-resolution display with lots of workspace and a proper Adobe RGB (or better) color gamut. You then tune it up with black-and-white step scales to get the brightness right, and you are good to go. Other features matter according to taste or convenience. If there is a criticism of the iMac monitor, it's the gloss, which I personally find annoying, though some folks like it. They are fine otherwise.


I must note that I do not wish to be taken as a shoot-from-the-hip critic; I step up when I sincerely feel I have noticed a technical error worth correcting, or a matter of aesthetics and practice where my experience is useful. It does take some time and effort to do so, and I am not afraid to be shown wrong. I'd like to keep it friendly and mutually productive.

Peter Guyton

It's not like an iMac can't run an Eizo as a second screen.

Anonymous

But you would not want to; it's not appropriate for the work. You could nearly buy a Canon L lens with the money you'd save.

David Boyars

Sounds like you need a Mac mini! I wanted a Mac that still gives me some expandability. Apple does not make the computer I want, so I made my own.
I'm sure lots of people have heard of Hackintoshes, but I didn't want mine to look like a gaming computer.
With some careful planning, I made the computer I wish Apple made. My desktop can fit inside any backpack, with a footprint smaller than the Mac mini's. It's not as thin, but using non-specialized components kept the price under $500. Let me know if you're interested in hearing more about it.

Anonymous

I give an up-thumb even though I personally disagree, because it's solid advice, well written. The best point is the idea that you can skip all the thinking with a Mac if you'd rather not become a computer nerd. You can, and that path to results is a big relief for many users.

Bill, you are right, but here's the point: Apple really doesn't care. This is a company with perhaps the biggest market cap in the world, something a little-known company called GE lost several years ago. Apple doesn't NEED to care about the creative professional as it once did. If you walked into 10 coffee shops (the present-day "libraries") and looked at the folks on laptops, the vast majority are consumers. Period. Apple has always been smart, crafty even: from developing disruptive software and technology (Final Cut Pro, for example, which it eventually abandoned and killed off, replacing it with FCP X, a version that generated a greater market share for them) to, years ago, essentially handing out computers to schools, building its future market. Once upon a time we creatives (and musicians) were Apple-dependent. As creatives we have to focus on the technology and players who will let us get where we need to go, and be willing to adapt. I bought a PC laptop on top of my Apple gear over a year ago because I didn't want to be limited by one manufacturer's hardware decisions versus my creative needs (I like having three USB ports; I like not being forced into new, expensive storage or having to carry peripherals).
Thanks for the wake-up-call article.
