Are Your Photos Future-Proof?

A number of products premiering at CES 2019 give an indication of where photography and videography are headed. Even without new equipment, there are some changes you can make to ensure your images and video are ready for the future.

The content we create as photographers and videographers has to be displayed at some point, in some medium. What used to be shown as darkroom prints or glossy photos from a lab has evolved into primarily digital displays. The jump to LCD displays brought with it a number of considerations, particularly around resolution and color space. The typical finished image for digital display is now an sRGB JPEG not usually bigger than 3,000 or so pixels on the long edge, and web and social media force heavy compression and resizing beyond that.

A number of these standard attributes may change soon, as larger color gamuts gain support in consumer devices. Apple has already rolled out its Wide Color gamut, Display P3 (the company's take on DCI-P3), to a number of devices, including the iPhone 7 and newer. Expanded gamut support is still challenging, especially on the web, but given the excellent resolution and color gamut of Apple's devices, photographers should consider exporting photos specifically for use in a mobile portfolio. The 4K UHD spec also calls for DCI-P3 support, making it a likely winner in future color gamut wars. Shooting raw offers photographers the best chance of accommodating future color space developments, and good color management practices will remain an important consideration for commercial shooters.
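As a rough sketch of what "exporting for a wider gamut" involves mechanically, the snippet below embeds an ICC profile into a JPEG using Python's Pillow library. Pillow can only synthesize an sRGB profile on its own, so the Display P3 case is shown as a comment pointing at a hypothetical `DisplayP3.icc` file you would supply yourself; the sRGB profile here is just a stand-in to make the example self-contained.

```python
from io import BytesIO

from PIL import Image, ImageCms

# Build an ICC profile to embed. Pillow can synthesize an sRGB profile;
# for Display P3 you would instead load a profile file you provide, e.g.:
#   profile_bytes = open("DisplayP3.icc", "rb").read()  # hypothetical path
srgb = ImageCms.ImageCmsProfile(ImageCms.createProfile("sRGB"))
profile_bytes = srgb.tobytes()

# A flat stand-in image; in practice this would be your exported photo.
img = Image.new("RGB", (3000, 2000), color=(120, 80, 200))

# Embed the profile so color-managed viewers interpret the pixels correctly.
buf = BytesIO()
img.save(buf, format="JPEG", quality=90, icc_profile=profile_bytes)

# Round-trip check: the profile travels inside the file.
buf.seek(0)
reopened = Image.open(buf)
print(reopened.info.get("icc_profile") is not None)  # → True
```

Embedding the profile matters because an untagged wide-gamut JPEG will usually be assumed to be sRGB and displayed with desaturated color.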

Typical resolutions are also changing rapidly: many phones now have 1080p or higher displays, 4K is increasingly common on computers, and 8K is already on the horizon for TVs. Ultra-wide 21:9 LCDs like the LG 38UC99 also offer new possibilities, such as editing a standard-ratio image with room left over for tool palettes. All told, resolution is set to matter more after stagnating for quite a while. Camera bodies are already there, with even consumer bodies putting out 24-megapixel files, but photographers will need to get more comfortable sharing higher-resolution finished images across different platforms.

Finally, alongside these new display standards, mobile users' internet speeds are poised to jump ahead with the rollout of 5G. Taken together, these shifts mean photographers should consider changing their approach to finished images. The standard 2K sRGB JPEG might not cut it anymore. Instead, consider increasing the resolution, dialing back compression, and getting ready for wider color spaces. I'm reminded of William Gibson's line that "the future is already here; it's just not very evenly distributed." Your clients may be on the latest and greatest devices, so your images should look right at home on them.
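As a sketch of that kind of export, the Pillow snippet below resizes an image to a target long edge (without ever upscaling) and saves a lightly compressed JPEG. The 4,000-pixel long edge and quality of 92 are illustrative assumptions on my part, not standards; tune them for your own platforms.

```python
from io import BytesIO

from PIL import Image

def export_for_display(img, long_edge=4000, quality=92):
    """Resize so the long edge matches the target, then save a lightly
    compressed JPEG. The defaults are illustrative, not a standard."""
    scale = long_edge / max(img.size)
    if scale < 1:  # shrink only; never upscale a finished image
        new_size = (round(img.size[0] * scale), round(img.size[1] * scale))
        img = img.resize(new_size, Image.LANCZOS)
    buf = BytesIO()
    # subsampling=0 keeps full chroma resolution; quality=92 is light compression
    img.save(buf, format="JPEG", quality=quality, subsampling=0)
    return buf.getvalue()

# A 6000x4000 stand-in for a 24-megapixel frame.
photo = Image.new("RGB", (6000, 4000), color=(40, 90, 160))
data = export_for_display(photo)
exported = Image.open(BytesIO(data))
print(exported.size)  # → (4000, 2667)
```

Disabling chroma subsampling and easing off the quality slider trades file size for the fine color detail that higher-resolution, wider-gamut screens can actually show.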

Lead image by Tomasz Frankowski.

3 Comments

Nicolas KIEFFER:

Hmmm, "4K increasingly common on computers"? Seriously? Common???

We have that small but loud share of gamers, and some videographers/photographers rich enough to get color-accurate, high-res monitors... but nobody can say it is common.
The price of the monitor plus a graphics card good enough to drive a 4K panel still keeps the vast majority of PC owners out of reach. Even on desktop computers, upgrading an FHD monitor to a 4K one is rarer than swapping the graphics card for a better one!

Of course, the market share of 4K panels is growing, but it is far from a tsunami, and far from common.

Oh, and by the way, the cheap 4K panels are usually so bad at color fidelity and rendering that you can drop a lousy 2K sRGB JPEG onto such a panel without worrying at all, especially if you put clarity at max.

Alex Coleman:

The article is examining developing trends in technology, including the growing market for 4K displays. Three of the top four best-selling monitors on B&H are 4K, and HDMI has supported 4K since the 1.4 spec back in 2009.

Unless you are gaming, graphics cards aren't a consideration: even the cheapest cards on the market, as well as most motherboards, can drive 4K monitors without any difference in performance.

Nicolas KIEFFER:

Maybe in the USA 4K monitors are selling well. It may be the trendiest spec for a monitor, but the reality is here:
http://gs.statcounter.com/screen-resolution-stats

Just look at the resolution chart. See how many FHD panels are still in use, even though upgrading from them is easier than ever nowadays?

You are living in a geek world, where there is an obvious need for good color fidelity, high acutance, and wide-gamut displays, but that is far from mainstream.

I work in IT; I already know that most hardware can feed a stream to a 4K panel, but the vast majority of people are unaware of what 4K or 8K really stands for.
And most of them are unable to distinguish a lousy upscaled 1080p feed displayed on a 4K panel from an excellent FHD feed displayed on a top-notch FHD panel.

So, of course, the market and the manufacturers are trying to make us buy 4K panels now, and as prices come down they will gain more and more market share, but no way can you say it is common.
I want to replace the two FHD panels I use for my own photographic work, and I am not looking for a 4K display but for an excellent QHD or FHD panel, 4K staying out of reach and not WAF compliant.

And as I already said, when the masses seek new hardware to improve their PC, they will sooner look for a better processor or graphics card than replace their still-working, almost-new FHD monitor with a 4K/8K panel.

TV sets are going 4K, as manufacturers make more and more of them and high-quality FHD sets cost almost as much as good 4K panels. But 4K sources in Europe are so scarce that it is almost pointless, and that disappointment works against mass adoption.