Ryzen 9 3900X Versus Intel i9-9900K: Which CPU Is Better for Creatives?

If you have been thinking of building your own desktop computer for creative work, such as photo retouching or 4K video editing, then your CPU is one of the main choices to consider. Should you choose the new 12-core Ryzen 9 3900X or the 8-core Intel i9-9900K?

Up until July 2019, Intel had pretty much been the leader when it comes to the most powerful consumer CPUs, but with the release of the new 12-core AMD Ryzen 9 3900X on July 7th, AMD is pulling ahead of Intel for the first time in a while.

At roughly the same price, the 12-core Ryzen seems like a much better value at first glance, considering it beats the Intel i9-9900K at certain tasks. Also notable: the new AMD Ryzen processors support PCI Express 4.0, while Intel's 9th-generation processors still use PCIe 3.0.


If future-proofing your next computer build is important to you, then the Ryzen 9 3900X might be the way to go. Having said that, there are still some things that the 8-core Intel i9-9900K does better than the 12-core Ryzen 9 3900X.

Check out the included video for some in-depth tests to help you determine which processor is better for your workflow. One other thing to check when choosing a processor is that all of the other components in your build are compatible with it, such as your motherboard, RAM, CPU cooler, and graphics card.

Which CPU would you choose for your next desktop computer build?



Black Z Eddie .

I'm so itchin' to build with the 3900x. Machine I have now was built about six and a half years ago. It runs well for what I do. But, I have this impulsive need to build something new.

I plan on just replacing the main components:

More memory. I think for proper image processing these days you need 32 GB.

Black Z Eddie .

Not necessarily, not for photos. On Capture One, even after hours and hours of editing 24mp and 42mp files, I'm barely at 8-9 of 16 gigs memory usage.

The size of a 42 MP image in 32-bit mode (which is what I'd say is appropriate for modern algorithms) is ~0.5 GB. If you have a few layers and work on a few images in Photoshop, you can easily get to something like 8-12 GB. Since most people run something in the background (the OS itself, a browser, a music player, etc.), you can easily reach the point where the OS uses virtual RAM.
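The arithmetic behind that ~0.5 GB figure can be sketched as follows. This is a minimal back-of-the-envelope estimate, assuming 3 RGB channels at 32 bits (4 bytes) per channel and ignoring any per-application overhead; the function and layer/image counts are illustrative, not taken from any particular editor.

```python
# Back-of-the-envelope RAM estimate for editing high-resolution images.
# Assumes 3 RGB channels at 32 bits (4 bytes) per channel; real editors
# add their own overhead (caches, undo history, an alpha channel, etc.).

def image_size_gb(megapixels, channels=3, bytes_per_channel=4):
    """Uncompressed in-memory size of one image layer, in GB."""
    return megapixels * 1e6 * channels * bytes_per_channel / 1e9

single = image_size_gb(42)        # one 42 MP layer
working_set = single * 4 * 4      # e.g. 4 layers across 4 open images

print(f"one layer:   {single:.2f} GB")       # ~0.50 GB
print(f"working set: {working_set:.1f} GB")  # ~8.1 GB
```

With a couple of layered documents open at once, the working set lands in the 8-12 GB range the comment describes, before counting the OS and background applications.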

Black Z Eddie .

I don't use Photoshop. I use Capture One as my main raw converter, and over 90% of my edits happen in it.

But, for the heck of it, I opened up three 42 MP files in Affinity Photo (an alternative to Photoshop). It does take up quite a bit of memory: without layers, AP is using 4.6 gigs. I'm sure there are those that use raw files directly like this; I'm just generalizing and speculating, but I think most photographers don't.

Again, when you open them, check whether they're in 16-bit or 32-bit mode. The math I wrote above is a fact of life, so the numbers suggest that for 42 MP images and above I'd go for 32 GB.

Black Z Eddie .

1. Wrong. Your "math" doesn't factor in how the software and/or OS will handle it.

2. If I've said it two times, I've said it two million times: I don't use Photoshop.

3. Re-read my last two sentences from the 20th.

The calculation I made has nothing to do with the host. It is the best case for handling a 42 MP RGB image using 32-bit data. Indeed, the host might add its own additional data on top.

Since most of us work with hosts which use the layer concept and handle multiple images in parallel, 16 GB might start to feel tight.

Anyone can choose not to work in 32-bit, not to use layers, not to process multiple images in parallel, or just buy more RAM.

Rob Davis

“Check out our 17 minute video to find out which processor is right for you!”

*smacks head on desk*

Monitors with high Adobe RGB and sRGB gamut coverage are typically 60 Hz, so any gaming scores above 60 FPS are meaningless to me. I only care about a CPU that can handle heavy workloads, and for the last two years, that has been Ryzen.

I bought the 3900x, a few minor bugs with motherboard BIOS settings because it’s a new architecture, but overall it’s been great. The last few days I’ve been archiving (compressing) old videos with Handbrake while using my computer to do other tasks and even game a little. Back in my Intel i7 days, it was fine for doing single tasks like rendering or exporting photos in Lightroom, but my computer was basically off-limits during that time. That is the power of more cores and something I rarely see mentioned.

Have you tested Lightroom performance?

I have, and I find it much faster than my other computers (Intel 6700K and AMD 1700). However, it only uses four cores when converting uncompressed Sony A7RIII files to DNG within the program, but it uses all cores when exporting DNGs. There are lots of these inconsistencies throughout Adobe's hodgepodge of a program.

Tony Tumminello

When I eventually replace my home desktop (still rocking a Phenom II X4 955) it'll absolutely be another AMD chip; I'm currently looking at the R7 3700X over the Intel 9700K: lower power consumption, PCIe 4.0 future-proofing, more L2 and L3 cache, comes with a stock cooler that seems to review well and saves additional money, and I'd rather have more threads to use my computer while rendering than have faster threads that lock up the computer. Oh yeah, and the AMD chip is less expensive. I don't game anymore, so it seems like a no-brainer for my use case.

Matt Rennells

For the target demographic of this website, the Premiere Pro, Photoshop, and Handbrake performance is covered from 2:00-2:45. It shows exactly what you'd expect: a 12-core/24-thread processor outperforms an 8-core/16-thread one.

Tyler Chappell

There's just no real reason to buy an Intel machine anymore. Basically none whatsoever. Not to mention the fact that Intel is a whopping two years behind AMD in bringing 7nm CPUs to market. Intel has always offered extremely poor value for money compared to AMD, and things are no different in 2019. Intel's extraordinary lack of innovation for the past decade has allowed AMD to catch them with their pants down, and it is astonishing that despite all their billions and substantially larger market share, a smaller company like AMD can outmaneuver them.
AMD is the superior company with the superior product lineup, and without the horrible reputation and anti-competitive and anti-consumer practices Intel (and nVidia) are known for.

Only upgrade if it would make your work faster and more efficient. Otherwise, spend the money on knowledge and health, both mental and physical.

William Howell

This comment is meant to re-ignite the computer wars of yore: I would never build my own computer, I use a Mac!

Przemek Lodej

Here it goes...3...2...1...damn Apple lover fanboy :) :)