Why Photographers Need to Start Asking: What GPU Does My Computer Have?

Your computer's graphics card isn't just for games anymore. With recent updates to essential core products, including Lightroom and Photoshop, there are a number of good reasons to start paying attention to your GPU.

What's A GPU Doing Anyway?

Your computer's graphics card, whether a discrete chip from Nvidia or AMD or graphics built into the CPU, helps create what you see on your computer's screen. For photographers, that used to be its only job. Video editors have enjoyed GPU-accelerated workflows for a while now, while photographers were left waiting on the CPU to perform the work involved in making adjustments. That has been changing, with Adobe releasing updates that move more and more tools in Lightroom and Photoshop over to the GPU. GPU-enabled processing can be much faster than CPU-bound tools, typically showing a preview on the fly or enabling features that would be prohibitively slow on the CPU.

The distinction between CPU- and GPU-preferred tasks is technical, but it's important to recognize that only some features will benefit from GPU acceleration. To get a sense of the features that require, or are greatly improved by, a GPU, check out this list from Adobe. In Photoshop, all of the following features can take advantage of your GPU:

  • Camera Raw
  • Image Size – Preserve Details
  • Select Focus
  • Blur Gallery - Field Blur, Iris Blur, Tilt-Shift, Path Blur, Spin Blur (OpenCL accelerated)
  • Smart Sharpen (Noise Reduction – OpenCL accelerated)
  • Perspective Warp
  • Select and Mask (OpenCL accelerated)
  • Scrubby Zoom
  • Birds Eye View
  • Flick Panning
  • Smooth Brush Resizing

Support has just been greatly expanded in Lightroom. Per Adobe: "Using Process Version 5, most adjustments are now GPU accelerated. For example, full acceleration can improve how fast you see results as you move the Texture slider. Using the GPU also helps Camera Raw keep up with the demands of 4K, 5K and larger displays." This speedup, especially with a high-performing GPU, can make culling quicker and editing smoother.

For video editing, a powerful GPU is even more important. Simple clips may not show a drastic performance improvement with any acceptable GPU, but once you start stacking GPU-accelerated effects or high-resolution clips, you'll need a recent, higher-end card. Based on testing from Puget Systems, you should have a minimum of 4GB of VRAM for 1080p footage, rising to 6GB at 4K.
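Those thresholds can be expressed as a quick sketch in Python. The function name and resolution labels are my own illustration; the 4GB and 6GB figures are the Puget Systems numbers quoted above.

```python
def recommended_vram_gb(resolution: str) -> int:
    """Minimum VRAM (GB) suggested for Premiere timelines at a given resolution.

    Figures are from the Puget Systems testing cited in the article;
    the resolution labels used here are just an illustration.
    """
    minimums = {"1080p": 4, "4K": 6}
    if resolution not in minimums:
        raise ValueError(f"no guideline quoted for {resolution!r}")
    return minimums[resolution]
```

So a 4GB card clears the bar for 1080p work but falls short of the 6GB suggested for 4K timelines.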

This GTX 970 works for Lightroom and Photoshop, but would probably struggle under a heavy workload in Premiere.

Updating Drivers

To take full advantage of these features, you'll want a newer card and the latest drivers. For example, running a driver that is a few months out of date will preclude the use of your GPU in Adobe Premiere. Fortunately, it's easy to update the drivers.

On Apple's computers, GPU updates are handled through the system update mechanism, so just run and install any outstanding updates.

On Windows, you'll want to confirm whether you have an AMD or Nvidia graphics card, if any. Depending on the computer, you may have a sticker indicating this, or you may see a mention of Nvidia or AMD when you right-click on your desktop. Some computers have neither, instead relying on the graphics built into the CPU; these are often updated via Windows Update. If you can't find it by looking around, open Device Manager: press the Windows key + R to bring up the Run dialog, type "devmgmt.msc" without the quotes, and press Enter. This opens a list of all the hardware in your computer. Click the arrow next to Display Adapters to see whether a GPU is installed, as well as its manufacturer.
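If you're comfortable with a little scripting, the same check can be done from Python. This is a best-effort sketch, not a definitive tool: it shells out to `wmic` on Windows (the same information Device Manager shows under Display Adapters), `system_profiler` on macOS, and `lspci` elsewhere, and the function name is my own.

```python
import platform
import subprocess

def detect_display_adapters():
    """Best-effort list of display adapter names; returns [] if detection fails."""
    system = platform.system()
    if system == "Windows":
        # Same information Device Manager shows under Display Adapters
        cmd = ["wmic", "path", "win32_VideoController", "get", "name"]
    elif system == "Darwin":
        cmd = ["system_profiler", "SPDisplaysDataType"]
    else:
        cmd = ["lspci"]  # GPUs show up as "VGA" or "3D controller" entries
    try:
        out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
    except (OSError, subprocess.CalledProcessError):
        return []  # tool missing or failed; fall back to checking by hand
    lines = [line.strip() for line in out.splitlines() if line.strip()]
    if cmd[0] == "lspci":
        lines = [l for l in lines if "VGA" in l or "3D" in l]
    return lines
```

If the relevant tool isn't available, the function simply returns an empty list and you're back to checking Device Manager by hand.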

Once you've identified the brand of card, download the latest drivers from the manufacturer. If you have an Intel CPU and no additional graphics card, you can use Intel's tool to automatically identify and update your drivers. If you have an AMD CPU with no card, or an AMD graphics card, AMD offers a tool on their support page to automatically perform the updates. Lastly, if you have an Nvidia card, you can use Nvidia's automated tool, GeForce Experience, although I recommend manually searching for and downloading the drivers, as Nvidia requires creating an account just to use their tool.

Once you've updated your drivers, make sure you're running the newest versions of your editing software. New GPU supported functionality is being added frequently, so consider updating when possible.

Future tools may rely increasingly on GPU performance to improve the editing experience, since GPUs have been improving at a greater rate while gains in CPU single-threaded performance have decelerated. Beyond raw speed, GPUs are powering developments in the field of deep learning. Deep-learning-based tools are still in their very early stages, but software like Gigapixel AI, an intelligent upscaling tool that actually synthesizes new detail, shows the promise of this field.

Upgrading Your GPU

If you're having difficulty enabling these features, or want to know what to upgrade to, Adobe provides some guidelines: the card should have been released in 2014 or later, with over 2GB of VRAM. This is a pretty low bar to hit, but if you are running an older computer, you should consider upgrading to take advantage of these features.
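Expressed as a quick Python check (the function name is mine; the thresholds are Adobe's guidelines as stated above):

```python
def meets_adobe_gpu_bar(release_year: int, vram_gb: float) -> bool:
    """Adobe's rough guideline quoted above: a card released in 2014 or later,
    with over 2GB of VRAM."""
    return release_year >= 2014 and vram_gb > 2
```

By this measure, even a midrange card from the last few years comfortably qualifies, while a 2012-era card does not, regardless of its memory.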

Typically, you'll only be able to upgrade the card in a desktop computer, as the laptop form factor requires the GPU to be mounted directly to the motherboard. Have a desktop and are looking to upgrade? Consider the Nvidia GTX 1660 Ti. It is more than capable in Photoshop and Lightroom, while its 6GB of VRAM means 4K editing in Premiere is no problem. Recent trends in GPU prices also make this the most affordable option that doesn't compromise performance.

This external GPU enclosure shows how a regular desktop card can be added to a laptop - a previously impossible task.

If you have a recent laptop with Thunderbolt 3, and need more graphics power, consider an external GPU enclosure. These boxes allow you to slot in a desktop style graphics card, while wiring it up to your laptop over a single cable. Enclosures can cost a fair bit, especially considering you'll still need to buy a card to put in it, but present the only upgrade option for most laptops. For an idea of the products available, check out the Sonnet eGFX Breakaway Box.

Depending on your existing equipment, taking advantage of the new GPU based improvements to photo and video editing may be as simple as updating your drivers and software. Even if you need to upgrade some hardware, consider the benefits of a faster and smoother editing experience — it may make more of a difference than you think.


Alex Coleman is a travel and landscape photographer. He teaches workshops in the American Southwest, with an emphasis on blending the artistic and technical sides of photography.


I'll save you all a read...

Don't buy Apple anything basically is what the article says.

That's neither said nor implied anywhere in the article. Most of Apple's computers that are targeted at professionals come with an adequate GPU - there's just less to be said about updating/upgrading them, and correspondingly less space devoted to it in the article. All the same benefits apply regardless of Apple vs Windows.

I'm a Linux and Windows PC user; both my laptop and desktop are good machines. But in this regard, we need to thank Apple for creating OpenCL and then sharing it with the likes of Intel, AMD (ATi), and Nvidia by creating the Khronos Compute Working Group. If it weren't for this, we'd still be using OpenGL-based applications, with slow and hard-to-work-with code. Apple allowed us PC users to actually have full access to our GPUs, with AMD OpenCL cores and Nvidia CUDA cores.
Microsoft tried with DirectCompute within DirectX, but OpenCL is still the main choice for both GPU brands.
So, if you buy Apple, you'll have access to this technology as you do on PC, Apple having been the pioneer.

Such an ill-informed article.

Incomplete at best.

Hi Leigh - this is an overview aimed at a general audience, showing why it is important to have an updated GPU when working with photo/video post-processing.

Where did you see any ill-informed remarks? I'd be happy to correct any mistakes.

This whole article feels like it was written by someone utterly unfamiliar with how PCs work. Pretty dreadful stuff.

Hi Nicholas - mind pointing out any errors? I'd be happy to correct them.

To polish this article, I will also say that many laptops have dual graphics cards, and they need to be set up the right way for the GPU to help with computing.
You can open the Resource Monitor on a PC to see the actual workload on your GPU and check whether your settings are working.
Exporting video with Adobe Media Encoder on the GPU is a great way to see the difference versus the CPU.
Camera Raw with GPU was long expected. Thanks, Adobe.

Good point. The dual GPU situation with laptops can be tricky.

Actually, no. You can define your main GPU in the OS. My laptop has an i7 8750H with integrated graphics and a GTX 1050, and I've completely shut down the Intel GPU. It hurt my battery a LOT, but I don't get applications crashing when GPU A or GPU B is called.

Actually yes. By your own admission, you're having a dramatic performance impact (hurt your battery a lot) just to get it to work.

Update drivers? You know what Nvidia is doing to Pascal based cards right?
Since the RTX cards came out and with the sales not being so good, they had the bright idea of crippling the GTX 10 series via driver update.
There are already reports of people having lost up to 15% performance in some game titles due to this practice and most are returning to legacy drivers or versions prior to RTX release.

I have a GTX 1070 Ti 8GB paired with a Ryzen 7 1700X CPU; it works flawlessly in Premiere, and possibly will in GPU-based photography software too. OpenCL, thanks to the gods of IT, is still a relatively new API, launched with Mac OS X Snow Leopard back in 2008.
Having OpenCL take full control of the GPU will be nice, especially for Lightroom. I just hope to have all those CUDA cores on steroids working well.

Broadly speaking, newer drivers will be better than older drivers. Typically, the performance improvements, feature improvements, and security improvements are all worth the update. The minutiae of which exact driver version is best is beyond the scope of the article; for example, the 399 branch of Nvidia's drivers has the best DX11 performance, but makes other tradeoffs.

I run an eGPU for Resolve 15 on my Thinkpad X1 Carbon and it's amazing. Now that Lightroom is also finally catching up with the times, I hope it's going to make syncing 200+ timelapse images much faster.
The beauty of an eGPU is that you can maintain a small-footprint laptop that is easy to carry and can last 10+ hours on battery, with the power of a desktop GPU when you get home and start editing. Plus, laptops nowadays last 5+ years no problem, meaning I can keep upgrading the GPU in the enclosure to suit more power-hungry apps, 50+ megapixel images, or 8K video editing later on, without the need to switch laptops.
Case in point - my 2009 MBP still runs strong, but it's crippled by its GPU/CPU combo and can only be used for light editing, but if it could be connected to a eGPU and run apps that use that instead of the CPU, it'd be fine for pretty much everything I need a computer for.

I agree, eGPUs can open up a number of possibilities for small and light machines that can still put in work at the desk.

Readers thinking of upgrading a desktop computer to an improved graphics card need to be aware that the power supply in most desktops will NOT have adequate power to support the additional and usually power hungry card. In many cases the power supply can be upgraded for about US $50 so that the computer will support the new card. Be sure to check before upgrading the graphics card.

If it is a prebuilt system from Dell or similar, that's possible, but it will definitely depend on the computer. If you already have a GPU, however, newer cards are typically as efficient or better at the same tier.

Not true. I'd say prebuilt systems from vendors could be impossible to upgrade due to custom-shaped power supply units that won't take a standard replacement.

I like your article and give you credit for raising awareness to an important piece of the hardware required in post work.

I would only add though the whole Apple/Nvidia issue where Apple refuses to support Nvidia any more, and many of us with Nvidia GPU's are being left out in the cold.

I will be forced to move away from Mac, because in my opinion, the AMD card options for the older Mac Pros are garbage compared to the Nvidia options. I also found that the AMD cards take up way too much space per unit of performance as well.

I think it's criminal to completely drop all support for a major supplier such as the Nvidia line of GPUs, and for me, it's the last straw with Apple.

While it's unfortunate that Apple has pursued that course, you aren't entirely out of options. AMD's newer gen of cards aren't a complete disaster. The Radeon VII and 5700XT could be an option, and there is an expectation of higher end Navi parts TBA.

But yes, I really don't like Apple's practices for desktops and pro level computers in general. The value of building your own computer really can't be beat.

When building or buying a complete PC, always get the best GPU that you can afford. The performance increase is huge. Also SSDs, all day, every day. Mechanical drives for storage.

NVMe is best but very, very expensive. 3,000MB/s is incredible.

Just read your article with interest. My problem is that my GeForce GTX 1050 Ti is constantly running at 98-100% on the 3D element of the GPU. Any suggestions for a new GPU in the £200-300 range? My PC has an i5 chip, an SSD for the C drive, and 32GB of DDR4 3000MHz RAM.