It's Time to Consider Switching Your Network to 10 Gb/s

If you're a photographer or videographer, you probably already have some sort of external storage. As your workload ramps up, it may become more convenient to house all of your files on, and work directly from, a network-attached storage (NAS) device. A NAS can offer practically unlimited storage, redundancy for your data, and simultaneous access from multiple computers, and these boxes are now faster than ever.

Synology has been our NAS brand of choice for the last few years. We have been running the Fstoppers office off of an old Synology 8-bay NAS. Our unit has four 1 Gbps Ethernet jacks in the back that all run out to a Netgear switch, which in turn connects to the eight computers we have in the office. Surprisingly, this tiny "server" has been powerful enough for five of us to work off of it all at once for years. But as we have added more employees and started shooting 4K video, our NAS has finally started to slow down and fill up. As I write this, we only have 3 TB left, and we expect to fill that up in the next few weeks.

When we upgrade our NAS, we will be moving over to a 10 Gbps system, and you may want to consider it as well. 10 Gbps technology has been available for quite some time, but it never really took hold in the consumer market. Luckily, that is beginning to change.

Synology has just announced the DS1817, the first consumer NAS to come standard with dual 10 Gbps Ethernet jacks, for just $849 (without the drives). This price is quite impressive considering that 10 Gb Ethernet cards for desktop computers can run over $500. With the right drives, this NAS is capable of 1,577 MB/s read speeds, which means you shouldn't see any dip in performance working off this NAS versus an internal SSD. When you need more storage, you can expand to a total of 18 drives by adding two DX517 expansion units. If you have four computers or fewer, you could plug each of them directly into the DS1817, but if you have more, like us, you will need to purchase a 10 Gbps switch.
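As a rough sanity check on those numbers (simple unit conversion; the 1,577 MB/s figure is Synology's quoted spec, everything else here is assumed arithmetic), a single 10 Gbps link carries at most about 1,250 MB/s, so a read speed of 1,577 MB/s is only reachable with both 10 GbE ports working together:

```python
# Back-of-the-envelope bandwidth check (unit math only, not vendor specs).
def gbps_to_mb_per_s(gbps):
    """Convert gigabits per second to megabytes per second (8 bits per byte)."""
    return gbps * 1000 / 8

single_port = gbps_to_mb_per_s(10)  # one 10 GbE port: 1250.0 MB/s ceiling
both_ports = 2 * single_port        # dual ports combined: 2500.0 MB/s ceiling

# The quoted 1,577 MB/s read speed fits between the two ceilings,
# so it requires more than one port but is plausible with both.
print(single_port, both_ports)
```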

Keep in mind that to take full advantage of this NAS, you will need a computer with a 10 Gbps Ethernet card. Apple has already announced that its new iMac Pro will come standard with 10 Gb Ethernet, and for many Apple users this may be reason enough to upgrade, but laptops do not have the option for 10 Gb Ethernet cards (at least not yet). Windows users should expect to spend around $250 on a 10 Gbps PCIe network card, but we have found many cheaper options that we will report on in the future.

I think all photographers and videographers should have a NAS for storage, but for many shooters, 10 Gbps may be overkill. If you don't need to work off of your NAS at speeds comparable to an internal SSD, then 10 Gb certainly isn't necessary. If you aren't batch culling thousands of raw files or editing 4K footage, standard 1 Gb Ethernet should be plenty fast for you. It has worked well for us for the last six years.

Moving to 10 Gbps certainly isn't going to be cheap, but for us it's a necessary investment in our business. Luckily, prices have come down significantly in the last few years, making 10 Gbps networks a realistic option for small studios rather than something reserved for gigantic corporations.



This is exactly what I need, thanks for the article :)

Ricky Kharawala:

There are options for those of you who have compatible USB Type-C laptops and can use this adapter to utilize the 10 Gbps. By the way, thanks, Lee, for the informative article! I've been a reader of Fstoppers for a few years now!

Jon G:

We have two of the AkiTio Thunder2 10G network adapters that we use with our older MacBook Pros, and I can confirm they work great with the latest Synology NAS units. USB-C is not needed, by the way; these units connect to your machine over Thunderbolt 2.

Tam Nguyen:

At 10 Gbps (gigabits per sec), which is 1250 MB/s (megabytes per sec), it's basically 3x faster than your theoretical speed for what I assume is RAID 5, which is around 400 MB/s. Mind you, that 400 MB/s is sequential read in theory; in practice, it's much slower than advertised. I think it's a bit overkill.

Also, don't forget that your router is only capable of transmitting so fast on wire; it's a lot slower on wireless.

I think you'd be better off investing in other bottlenecks instead of the NAS. Your pipeline can only go as fast as your slowest link.

There isn't really much of a bottleneck when the article describes the need for a card, cables, and a switch. No need to involve the router. While the router handles the IP addressing, switches can handle the speed without issue, as the packets go straight from PC to switch to 10 Gb/s NAS. They never hit the router.

Thanks for this explanation; I never knew how to phrase it to Google it. I wasn't sure if all data hit the router. Currently, though, my NAS is plugged into the router, which WILL cause a bottleneck.

Patrick Hall:

Tam, the router we will use is going to have probably 16 10Gb ports and 2 or even 4 ports connecting to the Synology (20-40GB total bandwidth between NAS and Switch). Just like our current 1GB box, this will be a bonded pair so all 4 connections should be seen as one.

From there each computer will have one Cat6a cable to transfer files to each computer.

At the moment we are able to get about 110mbps off our connection but it dips to about 25 when 4 computers all pull from the NAS at the same time.

I think the real bottleneck will not be the NAS, switch, or network adapters but potentially the SATA drives, the fastest of which can only do 6 Gb/s at the moment. Of course, we will test this new system in an open room before running cables everywhere, but I have to believe we can at least double our speed, if not triple it.

As for wireless, who does that? I know Lee dreams of a day we can edit wirelessly over Wi-Fi, but that is not a viable option anytime soon.
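Those observed numbers roughly match a simple shared-link model. This is only a sketch under assumptions not in the thread (about 6% protocol overhead, bandwidth split evenly among simultaneous readers; real contention is messier), but it shows why per-machine speed collapses on a 1 Gb link and recovers at 10 Gb:

```python
# Toy model of N computers reading from a NAS over a shared link.
# Assumptions: ~6% protocol overhead, even split among readers.
def usable_mb_per_s(link_gbps, overhead=0.06):
    """Usable throughput of a link in MB/s after protocol overhead."""
    return link_gbps * 1000 / 8 * (1 - overhead)

def per_reader_mb_per_s(link_gbps, readers):
    """Throughput each machine sees if the link is split evenly."""
    return usable_mb_per_s(link_gbps) / readers

print(round(per_reader_mb_per_s(1, 1)))   # ~118 MB/s: close to the 110 observed
print(round(per_reader_mb_per_s(1, 4)))   # ~29 MB/s: four machines pulling at once
print(round(per_reader_mb_per_s(10, 4)))  # ~294 MB/s each on a 10 Gb link
```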

Lee Morris:

I think by router Tam means wireless router and when you are saying router you mean 10Gb switch. And you mean 110 "MBps" and not "mbps" (Bytes and bits)

Patrick Hall:

son of a bitch....yes I mean all of those things

Tam Nguyen:

Thank you Lee. Yes.

Ariel Martini:

If you work with photography only, this is probably overkill anyway, even if you're batch culling thousands of raw files. Even GbE is overkill (regular Ethernet is fine).

Patrick Hall:

We do video with multiple users connecting to it. I think we all know the real photo bottleneck is Adobe's Lightroom.

Bill Larkin:

Agreed. Speed, speed and more speed is always a good thing :)

I never understood the point of a NAS. Why not just take a 5-year-old desktop you have sitting around, toss in a faster Ethernet card and some drives, and make that the server? That has to be cheaper than $849. Feel free to school me on this.

Jon G:

If you have an unused desktop machine sitting around, this is definitely an option if you're on a budget and willing to spend a lot of time setting it up and maintaining it. However, modern NAS units come with several major advantages: super low maintenance requirements (stripped down OS, many tasks automated); mobile app connectivity and remote monitoring for when you're on the go; expandability well beyond the number of drives that even the biggest desktop machines can accommodate; cross-platform compatibility; and cutting-edge filesystems with error-correcting capability. In my experience, when you buy a top-end NAS, the value really comes from the built-in software, not the hardware.

After trying to go the desktop route myself in the past, and wasting a bunch of time trying to get AFP & SMB to work reliably in a multi-platform environment (you'll soon discover Apple's new "SMBX" client that replaced Samba on macOS is really broken), and discovering various limitations with ext4 or Microsoft's ReFS filesystems, I finally bought a Synology and let the DiskStation software do the hard part for me. No regrets.

Patrick Hall:

We currently have 12 or more drives in our NAS; can you fit all those in a PC tower?

My old motherboard only has 8 SATA ports, and my old case has 9 3.5" bays. A new motherboard can have 10 SATA ports, plus 3 M.2 slots, plus the PCIe slots if you really wanted to get above that (though M.2 and SATA lanes are usually shared unless you spend >$400).
Jon G makes a good software argument, but the hardware cost still isn't there, even if you go brand new.

Patrick Hall:

I'd be interested to see this build and price. Our Synology NAS has four 1 Gb RJ-45 connectors, so your PC build would need those added via the PCIe slots. The M.2 drives could be interesting; I'm not sure if the Synology systems use those drives, but keep in mind those things are crazy expensive for only 1 TB of data (we have started upgrading 4 and 6 TB drives to 10 TB drives now).

And speaking of upgrading, the Synology system makes it super easy to basically just pull a drive out and replace it with another one. I will say the process of rebuilding onto the new drive is taking up to 5 days on the 10 TB drives, but maybe it takes that long with a homemade build and software too? It's obviously not hard to open a PC tower and replace drives, but I wouldn't feel comfortable hot-swapping them while the computer is on, and it's not quite as easy as the push-and-pull trays Synology has.

Finally, I would also assume the Synology hardware will hold its value better than a homemade NAS computer. So if you ever decide to upgrade or sell your Synology, you will probably still get 50% of your money back on it.

Moving to Synology from Drobo has been the greatest networking decision we have made, and for us, photographers who don't understand computer networking very well, the self-managing software that comes with Synology boxes is easy to use and navigate.

One thing to keep in mind is that the limiting factor will be the I/O subsystem. A desktop system is great for a single user; the great advantage of NAS systems is that they can be used by multiple users at once. Moreover, while RAID can be done on a motherboard, to do it right really requires a dedicated card. Software RAID is truly substandard and, for heavily used systems, more prone to failure.

I couldn't figure out how to share a Newegg list (~$776), so I recreated it on Amazon (~$730 once you add in the OS).
I'm sure you'd want to adapt things: Windows Pro maybe, a beefier PSU if the HDDs need it, etc. Add $20 for a low-end mouse/keyboard combo if needed.
A new motherboard with built-in 10 Gb bumped up the price by $200, since it also needed a newer CPU for the newer socket. You could then use M.2 drives for 'current/working' folders and 7,200 rpm drives for storage.
We could get into RAID controllers with more SATA ports, but that's over my head.
Buying a Synology seems very easy, but if I'm upgrading to a 4K editing rig, my 4-year-old desktop will just be sitting there.

Michael DeStefano:

This is the route I have taken and always recommend to tech-savvy people, but the reality is that the ease of use mentioned above in a Synology NAS system is all software. I built my own server with 64 TB of storage from leftover PC parts I had sitting around, so the only cost was the HDDs. I use NAS4Free to run the whole thing and love it.

I've installed several Synology systems for large studios and absolutely love the software it runs. For a simple and straightforward experience, it can't be beat.

Ariel Martini:

the same reason why people buy apple

Andrea Re Depaolini:

I have one main concern about this article. How can the speed of the network get you "SSD-like" performance if you mount HDDs in your NAS? I know a 10 Gb network is theoretically faster than a SATA III connection, but while your PC is connected to the NAS at 10 Gbit, the drives in the NAS are still connected via SATA, so the bottleneck is still 6 Gb. And it's an HDD, not an SSD, so there's really no way this NAS is reaching "internal SSD-like" performance. Still, I approve of this migration for the kind of work you guys do.

Patrick Hall:

I would like to know the technical answer to this question too. My understanding is that because of the way a NAS RAID works, when you pull a file there is a good chance you aren't pulling it from one drive but rather from 2-5 drives at once. Since the redundancy is built into the entire system, I think you are theoretically using multiple drives to increase the speed (sort of like how a striped RAID is faster than a single drive).

We have the smaller Synology NAS box that uses four SSD drives (Synology DS416) and it is amazing for projects we do on the road, but the total storage is only about 4 TB, which fills up quickly. Oh, how I dream of the day we can get 6 TB SSDs at a reasonable price!
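The multi-drive effect being discussed can be sketched with toy numbers (assumptions: roughly 200 MB/s of sequential read per spinning drive, idealized linear scaling across the array; real RAID read scaling is less than linear). The point is that the array can already outrun a 1 Gb link, so the network, not the drives, is what the 10 Gb upgrade fixes:

```python
# Toy estimate: large sequential reads in a striped/RAID array pull from
# many drives at once, but what the client sees is capped by the network.
def array_read_mb_per_s(n_drives, per_drive_mb=200):
    """Idealized aggregate sequential read speed of the array in MB/s."""
    return n_drives * per_drive_mb

def delivered_mb_per_s(n_drives, link_gbps):
    """Speed the client actually sees: the slower of array and network."""
    network_cap = link_gbps * 1000 / 8  # link ceiling in MB/s
    return min(array_read_mb_per_s(n_drives), network_cap)

print(delivered_mb_per_s(8, 1))   # 125.0: a 1 Gb link hides the array's speed
print(delivered_mb_per_s(8, 10))  # 1250.0: at 10 Gb the drives can stretch out
```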

Michael DeStefano:

You should try striping SSDs in your desktop tower for scratch drives and running software; it really speeds everything up. Until I upgraded to an NVMe M.2 SSD for my OS, I had 3 SSDs striped running it, and it was sooo smooth.
However, I tend to build these things for fun more than for needing the extra speed, haha.

Patrick Hall:

Is it really that much faster? I've read that upgrading from a 7,200 rpm hard drive to an SSD is a much more noticeable increase in speed than going from one SSD to two striped SSDs for your OS. I've had an SSD for my OS with a normal data drive in my tower for probably a decade now but never went down the stripe route.

Michael DeStefano:

Well, not all SSDs are the same, and speed at that point is kind of relative. If one SSD is super fast, will you notice if it's even faster in regular usage? Is it worth the extra cost? I'm a tech geek, so I do it more for fun than usefulness, haha. There are other bottlenecks. However, yes, the speed can be double a single SSD's average speed.

Andrea Re Depaolini:

You're right, I said a stupid thing regarding SATA being the limit. That said, RAID 5 doesn't always read from multiple drives because of the way it stores data, so the speedup would not be constant. Anyway, I have a hard time seeing an HDD reach the speed of an SSD because of the access time.

Patrick Hall:

What's the fastest you've seen files dragged to your computer? The fastest I've ever seen is an SD card dumped to an SSD through USB 3, and that was like 120 MB/s. Maybe the bottleneck is in the computer itself. My goal is to just get my NAS to always pull 100 MB/s, even when others are pulling too. At the moment it pulls 100 but drops to about 25-50 when multiple users are pulling at the same time.

Andrea Re Depaolini:

Don't get me wrong, I think the solution explained in this article is the best you could use right now for a network drive while staying at a reasonable price.
It's fast, but not "SSD-fast" as I read in the article ("you shouldn't see any dip in performance from working off this NAS or an internal SSD").
