Is Apple Pushing Photographers to Use Windows?

For years, I've been the biggest advocate of Macs for everyone except gamers. Especially if you are a photographer or graphic designer, it just makes sense and always has. But as current events unfold, it's becoming harder and harder to stick with the platform, no matter how great it actually is.

The Good About Apple

The Mac operating system is what makes it so great. It's not so much the hardware, although the hardware is very nice and high quality. That said, it's still commodity hardware, and Apple uses Intel processors just like any PC. For the "user," it's not really about the hardware anymore. The operating system (when you know how to use it as a power user) is what makes things so efficient and effective for graphics professionals in general, and photographers especially.

The MacOS Finder is truly incredible, as I mentioned before in my Mac tips. Little things like right-clicking the header of an open document to open that file's containing folder (or navigate anywhere in its tree with ease) are vastly superior to Windows.

The reliability of MacOS is probably a decade ahead of Windows, no joke. There's no comparison on reliability.

I won't just make a statement like that without explaining why. The reason is that Apple builds its OS to run only on its own hardware, meaning it knows exactly which configurations of Apple computers are running the OS, since it manufactured them all. Windows, in contrast, has to "generalize" many things to work across a near-infinite number of different hardware configurations. On PCPartPicker alone you could configure thousands of different setups, and the one operating system, Windows, has to try to work with all of them. That is a lot more difficult than making an OS work with your own handful of specifically built machines, so the Mac's reliability is very solid on that basis alone, before even getting into the BSD-derived base system that MacOS runs on.
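
To put a rough number on that explosion of configurations, here's a back-of-the-envelope sketch in Python. The part counts are made-up placeholders, not real PCPartPicker figures, but they show how quickly independent part choices multiply:

    # Hypothetical counts of interchangeable parts per category.
    cpus, boards, gpus, ram_kits, drives = 20, 30, 15, 25, 40

    # Every combination is a distinct hardware configuration that
    # Windows must be prepared to boot and run on.
    combos = cpus * boards * gpus * ram_kits * drives
    print(f"{combos:,} possible builds")  # -> 9,000,000 possible builds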

Apple Versus Windows: The Pros and Cons

Either operating system, Windows or MacOS, is capable of running the programs and getting the job done. They differ greatly in workflow, but they run the same programs with roughly the same capabilities. Windows even has a few little things that are better than Mac, such as the ability to customize extra mouse buttons if you have, say, a seven-button mouse. Mac has never been able to utilize those buttons, and that's a real shame, because that one little thing can make a tremendous difference in efficiency. As cool as that feature is, it doesn't make up for all the benefits MacOS provides, but it's something.

Bottom line: a capable computer user with above-novice computer skills can use either operating system and get their work done.

Why Apple Is Making a Grave Mistake

Here's the scenario: one of my workstations is a 2006-2008-era Mac Pro, and when it was new it was leading edge: quad-core, 32 GB RAM, 512 MB GPU. The Apple operating system is designed well, unlike Windows, which I believe is planned obsolescence in the way the registry is structured; Windows actually slows down the longer you use it. MacOS will stay the exact same speed; however, the perceived speed changes as software continues to develop and become more demanding. Cameras get higher megapixel counts, and software gains new features that are more processor- and GPU-intensive; those are the things making my '06-'08 Mac Pro not work as well as it once did. It's the same speed it was in 2008, but that's not good by today's demanding standards.
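
To make "more demanding" concrete, here's a rough Python sketch of how raw pixel data alone grows with megapixel count (uncompressed 16-bit RGB, ignoring layers, history states, and caches, which multiply it further):

    # Uncompressed buffer size for one open image, in megabytes.
    def image_buffer_mb(megapixels, channels=3, bytes_per_channel=2):
        return megapixels * 1_000_000 * channels * bytes_per_channel / 1e6

    for mp in (8, 24, 50):
        print(f"{mp} MP -> {image_buffer_mb(mp):.0f} MB per open image")
    # 8 MP -> 48 MB, 24 MP -> 144 MB, 50 MP -> 300 MB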

Apple's Mac Pro releases are few and far between, and the last was the 2013 Mac Pro. It was absolutely cutting edge with its second-generation PCIe SSD, good GPU, fast RAM, Thunderbolt, etc., but it's now four years old, and in computer terms that is an eternity.

So, what are my options for replacing the '06-'08 Mac Pro workstation? I could buy a used 2013 Mac Pro for still close to $2,000. That's a lot of money for already four-year-old technology, and while those machines are fast and work well, they are well behind the current curve of fast processors and hardware architecture, such as the i7 7700. For example, for less money than a used 2013 Mac Pro costs, one could build a machine with an 8 GB GPU, an i7 at 4.2 GHz, DDR4 RAM, and an SSD. That computer, performance-wise, will run circles around a 2013 Mac Pro. You can back a few of those specs down and be around $1,000 for a really fast modern-architecture system.

That makes it very difficult to buy a four-year-old used computer with half (or less) the performance for twice the money. Yes, MacOS is ideal for what I do, but at some point hardware that is 20x faster has to make a difference. Yes, there will be some negatives about the Windows OS to deal with (such as running some kind of antivirus), as well as a few positive little things like the mouse button customization.

Further doubling down on Apple's mishandling of the pro market, they announced at the keynote that there won't be a new Mac Pro desktop, but rather a new iMac Pro, which does boast some nice specs. But I hate iMacs for professional use. They are a great home or family computer, but I require more customization than that. What if I don't like the screen size? What if I want extra hard drives or a different GPU? With a Windows build, if I want a different GPU I can just pop out the existing one and add a new one, no big deal.

Then there's the cost of this new iMac Pro at a rumored whopping $5,500 or more. 

So Apple is leaving a certain market of professionals behind with the path it's headed down: a really expensive high-end iMac that I don't want anyway, or a lot of money for really old technology, with no in-between. That's a rough place to be, because I truly love MacOS and what it gives my workflow. Windows will definitely be cumbersome, but the hardware is so much further ahead in speed that it now appears to be the lesser of two evils.

Apple is essentially forcing my hand, and likely many others'. There will be a few professional markets left for them, since some folks may like and be OK with an iMac Pro and have the $5,000-plus to buy one. The rest of us are left with a tough choice. In the past, I was happy to live with four-year-old hardware to avoid dealing with Windows, but the performance gap is only growing due to the lack of Mac Pro production, so at some point it makes more sense to deal with the hassle of Windows for the performance, while saving a bunch of money and gaining future upgrade options without having to buy into a whole new system.

It seems Apple is gearing heavily toward the consumer market, and as a business decision that makes sense, since there are a lot more regular consumers than graphics professionals. But it also seems like a huge mistake to abandon the original customer base that made the Apple computer so strong and helped it evolve into what it is today.

What do you think? Is this really the end of the line for feasibly using Apple computers for professional photography?


Bill is an automotive and fashion inspired photographer in Reno, NV. Bill specializes in photography workflow and website optimization, with an extensive background in design and programming.

168 Comments

Great write-up! Also, for those of us who use the Adobe suite, Adobe doing what looks like throttling of its Mac software is most definitely frustrating. I know they now favor PC over Apple, but come on: the lag and slowness I encounter sucks compared to less powerful PCs I use.

That alone has made me think about switching, but Airdrop and iMessage keep me here.

PC still has the stigma, whether you believe it to be true or not, that its hardware has a short shelf life compared to Apple's. From my experience this is true, and it is why I switched to Apple a few years ago. The price tags of PC hardware are tempting, but I can't switch back given my experience and that of those around me.

Personally, I don't think Adobe is "favoring" PCs; in my opinion it has more to do with how drivers are updated on Windows vs. Mac.
In recent years the biggest progress (besides SSDs/M.2) has been in utilization of the GPU. Almost every professional photo/video application makes use of it. GPU vendors like Nvidia and AMD work heavily on improving it and release new drivers for Windows very often, but on Mac OS everything has to go through Apple and system updates (from my understanding). It's something not just Adobe is struggling with; PhaseOne (Capture One) also wrote an article about the problem a year or so ago. It's a fast-evolving technology, but Apple throttles it down because they don't let GPU vendors update anything related to OpenCL and such things.
Many professional programs that make use of the GPU have had issues on Macs that never happened in the Windows version. That's why tips like "turn off GPU acceleration" are quite common on the MacOS side, but not on Windows.
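
For the curious: applications typically discover what GPU/OpenCL support the installed driver exposes at runtime, and that driver is exactly the piece that lags behind on Mac. A minimal sketch in Python, assuming the pyopencl package and a working OpenCL driver are installed:

    import pyopencl as cl

    # List every OpenCL platform/device pair the installed drivers expose,
    # along with the driver version the OS is actually shipping.
    for platform in cl.get_platforms():
        for device in platform.get_devices():
            print(platform.name, "->", device.name,
                  "| driver:", device.driver_version)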

Good to know, thank you! I appreciate the enlightenment.

The graphics card is not an important hardware point for Photoshop.

Actually mate, it is. Your graphics card directly relates to how much video ram you have and your video ram processes all of your image data.

No, it is not.
1. Yes (and obviously) the vid card has the vid ram.
2. Read the term in context: VIDEO ram. As in "video" ie high frame rates & surface texture rendering. Not photography.

It's apparently a surprisingly common misconception, because I seem to be the only critic here of the mistaken notion that Photoshop requires a fancy video card. It doesn't even materially benefit from one. Rendering moving images in real time benefits from a powerhouse vid card, photo editing does not. You want to survive in multi-player online gaming like Halo, spend a $grand on a vid card. You want to do expert Photoshop editing, save your money.

Please read the reply below I made to Bill Larkin and see the links. Happy to be corrected if you think you have a correction, but I'd like it to be fact-based.

The GPU has everything to do with common daily tasks in Photoshop, including but not limited to the live brush, brush size changes with full tip view, GPU-accelerated rotation, etc. If it didn't matter, Adobe wouldn't have a GPU settings area in PS.

The menu you posted does not support the point you are trying to make. Read it. It just says you can turn off features if your machine does not run well with them on, i.e., if your card does not support certain features such as OpenCL (or OpenGL). This is not a function of your card's power. Being able to turn it off is a backdoor trick for older hardware compatibility. Lots of low- to moderate-cost cards utilize OpenGL/CL.

I refer you back to Northrup, or at least I invite willing learners to watch his very clear advice. The machine he has there, with its common graphics card, is fantastic, even at two years old. Blinding by any standard.

We have a newer iMac with an AMD R9 in it; in About This Mac it doesn't even register that the graphics card is installed (Nvidia fan here) because of some error. This computer will not render half of the adjustments, you cannot use scrubby zoom, and you can forget about rendering files larger than a gig. You might be right that you don't need a graphics card, but Photoshop is buttery smooth with a GTX 1080 under the hood :)

Congratulations! But:

1. I never said, "you don’t need a graphics card."

2. Your very specific example is not particularly useful for the purpose of generalization. You have one setup that is showing some error, which probably invalidates its usefulness; another works well. It does not then follow that you can make a general case from what you see as a difference.

3. I'm not the perfect source of all info. I know some things, have some schooling & experience, and do some research. I have common sense, I read critically. This is why when I see an absolute expert (Tony Northrup) agree with what I know, I feel I am probably on the right track.

>> Actually mate, it is. Your graphics card directly relates to how much video ram you have and your video ram processes all of your image data.

Speaking as a professional programmer, this is utter BS. There are some advantages to having a more powerful GPU, but the data for the image being worked on by a graphics application resides by default in main system memory.
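
To illustrate the point, a minimal sketch in Python, assuming NumPy and pyopencl are installed (the image dimensions are made up): the pixel buffer lives in system RAM by default, and it only reaches the GPU if the application explicitly copies it to the device.

    import numpy as np
    import pyopencl as cl

    # A 24 MP, 16-bit RGB image: allocated in main system memory (~144 MB).
    img = np.zeros((6000, 4000, 3), dtype=np.uint16)

    # Only a GPU-accelerated code path performs an explicit copy to VRAM.
    ctx = cl.create_some_context()
    mf = cl.mem_flags
    dev_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=img)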

Yes, the GPU is very important for Photoshop.

I say it is not, but I'm happy to be proved wrong on tech matters, because it's like winning by losing. :) Another factor overlooked in this piece that I was going to comment on but did not is disk speed. Disk speed is the main bottleneck, not RAM or the CPU: disk speed, and the bus speed. I use dual 10,000 rpm drives in RAID 0 for my OS, which was the fastest thing you could do back when I built it. Today SSDs are faster, but you'd still build a RAID setup of some type. The next issue is getting the Photoshop/Lightroom swap files off of the disk holding the OS and onto a separate disk; that really makes it fly.
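
If you want to check where your own scratch disk stands, here's a rough write-throughput sketch in Python. OS write caching makes the number approximate, and the file name is just a placeholder; point it at the drive you want to test:

    import os
    import time

    def write_throughput_mb_s(path, mb=512):
        block = os.urandom(1024 * 1024)  # 1 MB of incompressible data
        t0 = time.perf_counter()
        with open(path, "wb") as f:
            for _ in range(mb):
                f.write(block)
            f.flush()
            os.fsync(f.fileno())  # force the data to the physical disk
        return mb / (time.perf_counter() - t0)

    print(f"{write_throughput_mb_s('scratch_test.bin'):.0f} MB/s")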

I digress; back to the point. Think about it: why would a graphics card matter for still images? A fast graphics card is all about frame rates and rendering surfaces, both non-factors in editing a single photo. The rotation of a single image (via OpenGL) is a puny task, so that is a non-issue. You don't need a gamer card, and that is where you save real money on your PC (which I suggest geeky people build from components, at least partly for the fun of it).

~~~

If you don't wish to believe me, perhaps Tony Northrup will convince you:

Graphics card info starts at 15:45 if you are impatient, but the whole vid is excellent (albeit old) info, beautifully presented. The key line is repeated as the last line, in classic pro tech-writer style: "... save yourself some money and get yourself a cheap graphics card":

https://www.youtube.com/watch?v=WbsyglPqK6o

Short text list version:
https://northrup.photo/reviews/computers-for-editing/

~~

And I'd like to emphasize that my view is not (and yours should not be) based on the celebrity status of an Internet opinion alone (although Northrup is a "go-to" source of information on digital photography). Rather it's simply computing hardware 101 (and old info at that) and if it does not ring true for you, it's time you restudy the topic.

As for the 5 down thumbs on my correct original note - whoops!

John, this video is old and no longer accurate. As far as I know, nowadays OpenCL acceleration makes a tremendous difference in rendering speed whenever one develops raws using Camera Raw/Lightroom.

I personally use Capture One for raw editing, and with OpenCL acceleration turned off, the image stutters when I make editing changes (dragging a slider), whereas it renders smoothly in perfect real time (at 96 Hz, my screen's refresh rate) whenever I turn on OpenCL on my GTX 1080.

The same is true for OpenCL-assisted image exporting in Capture One: I recorded a three-times-faster rendering speed with the feature turned on compared to CPU-only rendering.

I can also monitor my GPU use, and all of its power is maxed out whenever I make editing changes (I can also hear the fan ramp up its speed!); it then throttles down once the image is rendered. That shows the GPU is fully utilized for still-image rendering.
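
If anyone wants to reproduce that observation, here's a bare-bones way to watch utilization on an Nvidia card from Python; it assumes the nvidia-smi tool that ships with Nvidia's drivers is on your PATH:

    import subprocess
    import time

    # Poll GPU utilization once per second while you drag sliders.
    for _ in range(30):
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True,
        )
        print(f"GPU utilization: {out.stdout.strip()}%")
        time.sleep(1)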

1. It's from 2015, which is relatively recent. And if you peruse the comments you will see "Photographic Utopia" ask, "how about an update to this?", to which Tony N. replied just 11 months ago: "Really, nothing much has changed. The PC industry has ground to a halt." So in effect it's one year old. The principles behind it include older, established ideas (the RAID setup) that are rock solid.

2. I never suggested that OpenGL/CL are not important. The incorrect assumption is that they do much better with a high-end card. Yes, you want them activated. But, and this is the point, you don't need a gamer card for OpenGL/CL operations. Looking at Newegg, just as an example, I see endless numbers of OpenGL-compatible cards at the lower end of the price spectrum, from under $300 down to far lower. There are fewer offerings for CL, but still some at low cost. Note: the high end of the graphics card spectrum, in my mind, is over $500 to thousands of dollars.

3. So the question, properly put, is: with the understanding that you have OpenGL/CL engaged, is a low-priced card materially slower than a very high-priced card? My understanding is, Tony's is, and yours should be: no.

Happy to be shown to be wrong, I learn from that.

John - What you have said is also substantially incorrect. GPU acceleration is not just on or off - more powerful GPUs process data faster.

BUT there is a price curve - ie you get past a certain point and returns diminish.

Your statement is a non sequitur. A slightly faster cpu is faster too, but it does not matter in real time work flows.

OpenGL is simply on or off depending on whether your card supports it, and a card that supports it suffices as long as it has the recommended RAM. That is the basic hurdle, and it says so on the Adobe support page for graphics cards, with a memory recommendation of around 2 GB. Now, I'm not paid for my time to do research and write an article here, but I do wind up doing some to comment. My look-see shows an above-average card that meets Adobe's specs can be had for under $250, nowhere near a gamer card. Any more card is in fact a waste of money, as Tony Northrup advises. And of course my own experience agrees.

FYI, I offered a correction to Northrup too: he never mentions a separate disk for the Lightroom and Photoshop swap files. They should be on a disk that does not hold the Windows swap file; that really improves performance.

Here's a question for you; I am curious, how do you monitor your GPU? What utility does that for you? I'd be interested to know, thanks.

The "substance" (as in substantial) we are discussing is best practices advice, perhaps for neophytes or non-nerds needing to cut to the chase, save $ and get on with learning/working. My view concurs with Northrup, and not because he said it. However he is probably as good an authority as any, and surely better than any writer here.

>>>Your statement is a non sequitur. A slightly faster cpu is faster too, but it does not matter in real time work flows.<<<

Ahh, the telltale combination of pomposity and distortion...

A "slightly faster cpu" won't matter. A *MUCH* faster one will. No one was restricting the discussion to "slight" anything, John.

And yes, this matters even during "real-time workflows," insomuch as that horrible phrasing means anything.

More power changes what you can do without waiting around: for example, wavelet splits, film simulations, even something as simple as running a Gaussian filter with a wide radius.

Now, this extra power might not help you, because very possibly you don't know what the hell you are doing and stick to using only very basic tools. But, as people have already told you, although you were too silly to listen, other people have different needs.

...Not everything is about *you*, John. It's a harsh truth, I know...

Pardon, excuse the Latin; "A non sequitur, in formal logic, is an invalid argument. In a non sequitur, the conclusion is either true or false, but the argument nonetheless asserts the conclusion to be true and is thus fallacious." I hope that helps.

If you watched the Northrup vid, which I posted as the source of authority rather than implying I am one, you know he states that "some filters" benefit from a top-end vid card. Like your blur. If you spend all day blurring images (like you do your written replies), then yeah, it will matter to your workflow. I wonder: how does your volume of work compare to his? In any case, direct your criticisms from now on to the expert, Tony Northrup.

I'm concerned that willing learners not be misled by bad information. Willing learners should understand that a good editing machine does not require a top-end vid card. P.S. Oh, and did you stuff some lame slights about me into your rickety post? I didn't notice.

Have a great day!

John's education continued below....

Desperate stuff, lol!

Never used an Apple computer... it's more of a "devil you know" kind of issue here.

Just get an iMac. Either Pro or regular. They’re great and work well. You can still add extras. Don’t overthink it. Personally I’m glad the iMac Pro is what it is. The modular trash can was a clunky option.

See, that's the problem: with an iMac, you are married to THAT screen, which looks nice but is NOT pro level like an Eizo or similar for a high-end color workflow. That's why the iMac is really better suited to home use than to a professional environment. I know some photographers use them; I just cannot.

You have that wrong, Bill. The pro rule of thumb is that you do not need an Eizo-type monitor unless you are doing graphic design work related to offset printing output, as when you are distributing advertising via print media nationwide using many print vendors. As with our interaction above on graphics cards, I can whip up the reference material, but I'm not paid to take the time, so... ;)

Photography is no longer tied to offset printing as it was in the heyday of the glossy magazine, so the need for triple-the-cost Eizos (and some other specific brands) is no more. I've not used my color Spyder calibration device in years.

Most photo work (for readers here especially) is destined for either the Internet or inkjet output, where precise color-matching monitors like Eizos are not important. What you want for digital photography is a large, high-res display for lots of workspace, with a proper Adobe RGB (or better) color gamut. You then tune it up with B&W step scales to get the brightness right, and you are good to go. Other features matter according to taste or convenience. If there is a criticism of the iMac monitor, it's the gloss, which I personally find annoying but some folks like. They are fine otherwise.

~~

I must note that I do not wish to be taken as a shoot-from-the-hip critic; I step up when I sincerely feel I have noticed a technical error worth correcting, or in matters of aesthetics and practice where my experience is useful. It does take some time and effort to do so, and I am not afraid to be shown to be wrong. I'd like to keep it friendly and mutually productive.

It's not like an iMac can't run an Eizo as a second screen.

But you would not want to; it's not appropriate for the work. You could nearly buy a Canon L lens with the money you'd save.

Sounds like you need a Mac Mini! I wanted a Mac that still gives me some expandability. Apple does not make the computer I want, so I made my own.
I'm sure lots of people have heard of Hackintoshes, but I didn't want mine to look like a gaming computer.
With some careful planning, I made the computer I wish Apple made. My desktop can fit inside any backpack, with a footprint smaller than the Mac Mini's. It's not as thin, but using non-specialized components kept the price under $500. Let me know if you're interested in hearing more about it.

I give an up-thumb though I personally disagree, because it's solid advice, well written. The best point is the idea that you can skip the thinking with a Mac if you don't want to become a computer nerd. You can, and that is a path to results that is a big relief for many users.

Bill, you are right, but here's the point: Apple really doesn't care. This is a company with perhaps the biggest market cap in the world, something a little company called GE lost several years ago. Apple doesn't NEED to care about the creative professional as it once did. If you walked into 10 coffee shops (the "libraries" of today) and looked at the folks on laptops, the vast majority are consumers. Period. Apple has always been smart, crafty even: from developing disruptive software and technology (Final Cut Pro, for example, which they eventually abandoned and killed off to replace with a version, FCP X, that generated a greater market share for them) to, years ago, essentially handing out computers to schools; they were building their future market. Once upon a time, we creatives (and musicians) were Apple-dependent. As creatives we have to focus on the technology and the players who will let us get where we need to go, and be willing to adapt. I bought a PC laptop on top of my Apple gear over a year ago because I didn't want to be limited by one manufacturer's hardware decisions versus my creative needs (I like having three USB ports; I like not being forced into new, expensive storage or having to carry peripherals).
Thanks for the wake-up-call article.

Agree. Apple runs a very successful business with control over its hardware and OS. Its business model has evolved over time, with more focus now on the general consumer than on specialty niches. The consumer market is where the money and growth reside. I use both Apple and Windows 10 products. Apple products are very well made, but many PC products have more advanced and progressive hardware at a lower price. A continuing trade-off. Thanks!

For me, this is where my problem with Windows lies. My day job is as a software developer, so by far most of my life is lived in the Windows ecosystem. I used to build all my systems years ago. It was actually challenging and fun researching every component that went into making a beast! That was 25 years ago, however, and I'm an old fart now, LOL. I'm still really learning my art, and I don't have the time or the desire to run down Windows idiosyncrasies and do what I did 25 years ago. That's where Apple comes in for me. I had to use my daughter's weak little MacBook (no Pro) once; it was a dream compared to my much more powerful Windows 10 notebook that needed repair. I swore my next machine would be an Apple. Right now, I have my eye on the new iMacs coming out. If Apple provides this easier path to doing what I really want to be doing, that's a reasonable trade-off.

It seems that you talk without having any idea what you are talking about. Sure, Windows back in the 2004 era was horrible, but starting with Windows 7, 8, 8.1, and now 10, they are miles ahead of OSX. Windows 10 runs really smoothly.

I worked with OSX for around four years. I was really excited because many "artists" were using it, and I thought it was amazing. Little did I know, the people using it were either hipsters trying to impress their customers or "geeks" who didn't even know the OS had a right click.

I used it with an open mind. At the workplace they bought me, at first, a MacBook Pro, then an iMac, an iMac 5K, a Mac Pro. The funny thing is they were different machines with different specs, but Premiere was always crashing, After Effects ran slow, and rendering was a pain.

At home I have a custom-built workstation (while you may see it as a con, I love the fact that I can mix and match whatever hardware I want and not let Apple decide what's good for me), and it runs buttery smooth. No crashes, no lag, just as you would expect.

While I think there are some people who truly love the brand and have had a connection with it for many years, and I respect that, I cannot stand the "artists" and hipsters who buy a Mac because... well, because it's gray and it impresses the clients...

I don't want this to sound like a rant (or do I? :D), but I am honestly tired of this nonsense.

PS: I love my iPad 2017 (non-Pro) :D

Regarding the industry as a whole, it depends... I know three media trusts that run all post-production on Windows. Also, some of the production houses I work/worked with made the switch from OSX to Windows some 4-5 years ago.
Of course, this is based on personal experience. In your area it might be the other way around.

Regarding the level of education of Mac users, again from personal experience, it's the other way around: no knowledge, "I will buy what's cool," and so on. The number of non-hipsters/geeks that I ran into who had a Mac was really low.

An app not running well equates to a problem with the OS, because there is a limited amount of hardware and the OS is "optimized" to run buttery smooth.
Since OSX is a "professional" post-production OS, I expect it to run smoothly.

The above post describes my experience with Mac users; your mileage may vary or be completely different.

"That's fine, but it's not statistically meaningful. Are you also privy to the maintenance of their systems?"

I will agree with you that anecdotal evidence garnered from a few businesses is not statistically significant, but if you're attempting to argue the contrary, it might help to post a link to a recent study or survey of what types of systems are used across creative industries.

From my understanding, there has actually been a fairly large and gradual migration away from the Mac platform over the past decade or so, due in part to some of the issues brought up in the article, but also largely due to the cost-benefit analysis of Macs vs. Windows-based systems. In short, Windows has not only caught up a lot in terms of performance and stability, but software and peripheral companies are increasingly embracing both platforms, meaning that the gap between the two is now smaller than ever. Over the years, many businesses have decided to take advantage of that smaller gap in performance (since saving money is always a good thing), and you'll find more creative professionals using Windows-based systems today than probably ever before. For similar reasons of cost and better options, creatives are also increasingly opting to migrate to Linux-based systems as well as "Hackintosh" systems. When you take into account the number of creatives working around the world for whom the premium price of Macs simply makes them unfeasible options, and who are now plugged into the global creative marketplace, such a shift certainly makes sense.

Granted, I don't know how big the shift has been, and I do know that the highest echelons of the creative industries are still dominated by the Mac platform, since they have the benefit of larger budgets, and when you're running a high-end business, any deficiency in stability or time spent getting your staff and workflow acclimated to a new platform is a lot of lost money (the same reason you have multi-million-dollar corporations still running Windows XP or Vista). I suspect the shift is largely happening among small to mid-sized businesses as well as off-shore businesses (mostly China, India, Korea, etc.), which can be much more agile than a larger entity.

--------------------------

"Regarding the level of education of Mac users again from personal experience is the other way around."
Yes, there is a correlation between higher income and higher education level that can't be denied. However, a higher education level does not necessarily equate to greater familiarity with operating technology. Just walk around your average university campus and see how many highly educated professors can barely turn on their projector to give a lecture. Or look at the tremendous number of people with tons of disposable income and top-notch education who would be lost if you asked them to format a hard drive. Many of these people get by just fine, however, because you don't need to understand how your tools work in order to live day to day. You only need them to work, and if you have disposable income, you can hire someone to do the work for you, pay someone to fix things when something fails, or just buy a new device. It's only the specific subset of people who have an interest in technology and also possess the income to pursue that interest who benefit in this case.

More often than not, however, if we're not talking about suburban technology programs with middle school robotics clubs, it actually tends to be those with less disposable income that are more intimately familiar with their technology because it represents not only a larger investment, but also a potentially irreplaceable loss in the event of a malfunction. In general, poor people have to learn how to repair and maintain things because they're not easily replaced. Rich people, while they often have more luxury (and resources if they choose) to do so, generally don't have that kind of pressure. They can forego their knowledge of how their tools work to focus on other things. Mind you, there's nothing intrinsically wrong with this. It would probably be similarly odd (and a waste of time) for a CEO of a company to sit down and start illustrating marketing material.

--------------------------

"It should go without saying that an OS can't guarantee an app will run properly."
While I would generally agree with this, I think a legitimate criticism can be leveled if an application is not running properly on a platform like OSX, which prides itself on its tightly controlled ecosystem and the resulting stability. While it may not be the fault of the operating system per se, for a professional application to be released on the OSX platform and not function properly would certainly represent a quality-control failure on multiple levels.

This is rich, "... higher disposable income tends to correlate with a greater level of education." If that's so I must be filthy rich without knowing it.

Choosing a weak word to enforce a truism is no credit to the writer.

You simply parrot a common misconception: that knowledge that results in income is somehow more important as education than other knowledge. It's not, of course. The world's finest historian probably makes 1/10th of 1% of what a good hedge fund manager makes. Who's better educated?

News for you: most of the computers in use in UK public service are Windows. "Disposable income"... made me laugh.

I got it, but I'm happy to help. Your comment was glaring for its acceptance of a shallow truism. This induces laughter under the general rule that "lame" is funny. Hey, it's better than anger, right?

If you thought Windows 7 and 8 were "miles ahead of OSX" ...then that's all I need to know about your opinion regarding the smoothness of Windows 10.

...And here I was, wondering if Windows 10 might be decent! :-(

In the Apple Pencil box there is also a female-to-female Lightning adapter you can use to charge it with your iPad charger. BTW, the other way of charging is not that bad. Imagine you are sitting on a bus or a train and just need to charge your Pencil but don't have access to a power outlet: stick it in your iPad and you get one hour of use from one minute of charging!

The Pencil charges to full in about 10 minutes or less, so it's really not that big a deal. To me, it's worth it. I use my iPad Pro for digital art. No other stylus (that I've tried) comes close, IMO. It even feels more natural than a Wacom (also IMO).

Agreed. As a photography instructor who teaches a Lightroom course, I can't tell you how frustrating it is that on all Mac OS updates after Yosemite, when an SD card is inserted into a Mac (those that still have an SD card reader, of course), up pops the Photos app. You can tell it to forget the card, but if you format your SD card, it thinks it's another card. My go-to response on Yosemite was to just delete or zip up the Photos app. Now Apple has made it impossible to do so. This is completely asinine.

On my full-sized keyboard with number pad, made by Apple, the Home/End and Page Up/Page Down keys don't work as they should without installing third-party software, which I have to run each and every time the computer is restarted. This is completely asinine. What the hell do "Home," "End," "Page Up," and "Page Down" even mean in the Mac world if not those very things?

Finder is a joke. A complete joke. Again, with students I spend time getting their sidebar to actually show the Pictures folder, enabling the path bar, turning on image previews, etc., etc. Copy and paste? Nope. Cut and paste? Nope. Make a tab, drag images to that tab, drag around to the folder you want to copy/move them to, and hope you don't let go too early. Either that, or make a new Finder window and drag things between them. It's completely kludgy. So much so that early on I bought a great program called Path Finder. I would be lost without it.

I have not and will not upgrade any of my machines past Yosemite, because I no longer trust that what has worked in the past will work after these upgrades. I have an old Wacom Intuos 2 tablet that works perfectly fine, but trying to install it on anything newer than Yosemite doesn't work.

I love my Macs when they work, but having to bludgeon them into behaving like an efficient operating system should is a lot of work.

El Capitan FTW.

Spot on. I currently use an iMac 5K with maxed-out specs as my studio machine. I tried to love the iPad Pro for my on-the-road needs, but the lack of color space control (screen calibration) killed that for me (although for casual/social media photo use, the iPad Pro/Affinity Photo combination rocks). So I keep using a 2015 MacBook Pro Retina for my on-the-road needs; it's way more than I need (in computing power AND bulk), but I simply cannot justify replacing it with one of the newfangled MBPs. I played around with a Wacom MobileStudio Pro for a while, but that Windows stuff just didn't work for me. I really hope there will be options available in 2-3 years when I plan on updating the IT supporting the business... If not, I may have to bite the bullet and switch to Windows after all; at least all the software I use (Capture One, Lightroom, Photoshop, Affinity Photo, ImagePrint) runs on both OSs. :(

"Windows even has a few little things that are better than Mac, such as the ability to customize extra mouse buttons if you have say, a 7-button mouse. Mac has never been able to utilize those buttons"

WRONG. It's just not built into macOS. But I used a multi-button Microsoft Trackball Explorer with my Mac for many years. All I had to do was install Microsoft's driver. Sheesh!

Also, it's macOS, not "Mac OS".

Lots of good points here, but you have absolutely no clue about the latest Windows and its stability. I built myself a new workstation 13 months ago (after having used a Mac for 6 months), and it's been running 24/7 ever since. Not one glitch from the OS or any of the programs. Nothing. Nada. I'm a member of several pro user groups where Mac users always talk about Photoshop crashes and other "minor" issues like trouble with Wacom tablets. Well, none of that occurs on my three Windows 10 machines. I have my custom build plus a Microsoft Surface Pro 4 and a top-specced Surface Book with Performance Base. For those interested, I made a blog post about my build here. It's ancient hardware by now, I know, but it still runs circles around even the most powerful Mac. For a lot less money.

https://erohne.wordpress.com/2017/02/10/how-to-build-a-powerful-computer...

Cheers,
Eivind Rohne / instagram.com/erohne

The con with Macs isn't software or hardware. It's the arrogant users! ;-)

I was joking, but seriously, I think whatever you're used to is the most efficient. I can't stand my stupid phone (it isn't smart) after about three months of having one! Whenever I ask someone how to get around something I think is inefficient, the reply is, "That's just how it works." :-)
