OK, I'm toying with the idea of upgrading my home network to 10GbE, and I need a sense check, i.e. someone to (constructively) shoot holes in my plan, tell me where I'm going wrong, and maybe what I can do to overcome it. Let's not get hung up on why; I need this because of reasons, and that is all that matters. There are a few affordable options on the market now which offer GbE and SFP+ ports on an unmanaged switch. This will likely be an incremental upgrade, starting with a 10GbE link between my PC, the switch and my NAS/server box. For the NAS I can buy a cheap SFP+ NIC and run a DAC cable to the switch. The main issue I see is that my PC is ITX, so I'd be looking for a 10GbE-over-USB solution - maybe some kind of SFP+ to USB 3.0/USB-C converter. So, what do you think? Doable?
IIRC, it's doable with Thunderbolt, because Macs etc... not sure if it's doable with USB... EDIT: even then it looks like they're all RJ45/Cat rather than SFP linky EDIT: QNAP does 5GbE -> USB 3.0 adaptors; not seeing any 10GbE -> USB 3.1+
Yeah, I was just looking at the QNAP dongle, because 5GbE is probably enough to make the HDD transfer rate the bottleneck. USB C is compatible with Thunderbolt, just slower; but still enough to deal with 10GbE.
You can use USB-C devices in a Thunderbolt 3 port... IIRC you [still] can't use Thunderbolt 3 devices in a plain USB-C port... So the QNAP adaptor should work on a Mac's TB port... but the TB->SFP adaptor probably wouldn't work if you plugged it into your board's USB-C port...
I've been looking at the QNAP QSW-308-1C switch as a starting point - it seems to offer some pretty awesome flexibility. The switch plus DAC cable, SFP+ NIC and the 5GbE QNAP USB dongle could be had for ~£350. I've just had a £250 rebate too... I might treat myself this Christmas - it's a lot of money, but a two- or threefold increase in transfer speed is a massive boost.
I don't need a big switch, and it's the inclusion of multiple port types that interests me. I looked at the Mikrotik CRS305-1G-4S+IN too, because it's cheap, but it's too limiting; and the SFP modules needed to allow GbE as well would push up the cost. I haven't seen much in the way of used 10GbE stuff that would work out any cheaper once I've coughed up for the extras to make it work - but that's why I made this thread. If you have knowledge to drop, I'd really appreciate it.
Naff that. Use PCIe bifurcation (supported by damn near any 1xx/2xx/3xx series ITX mobo, and some 9x/8x/7x with a custom BIOS if you ask nicely, e.g. ASRock): split the x16 slot with a riser, give x8 to your GPU and the other x8 to the NIC. There are vanishingly few cases where PCIe 3.0 x8 will cause measurable bottlenecking with current GPUs (it's a "well, maybe if you have two 2080 Tis in SLI..." issue). Other options: M.2 to PCIe (gets you a PCIe 3.0 x4 slot), U.2 to PCIe (also PCIe 3.0 x4), or SATA Express to PCIe (gets you a PCIe 2.0 x2 slot). PCIe 2.0 x2 tops out around 8 Gbit/s usable, so it would slightly cap a 10GbE port, but it's still a huge step up from GbE.
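For anyone wanting to sanity-check those slot options, here's a back-of-the-envelope calculation of usable PCIe bandwidth per generation and lane count (the encoding overheads are from the PCIe spec; the slot-to-adapter mapping is just the options mentioned above):

```python
def pcie_gbps(gen: int, lanes: int) -> float:
    """Usable bandwidth in Gbit/s for a PCIe link, after encoding overhead."""
    # PCIe 2.0: 5 GT/s per lane with 8b/10b encoding -> 4 Gbit/s usable.
    # PCIe 3.0: 8 GT/s per lane with 128b/130b encoding -> ~7.88 Gbit/s usable.
    per_lane = {2: 5.0 * 8 / 10, 3: 8.0 * 128 / 130}[gen]
    return per_lane * lanes

print(f"M.2/U.2 adapter (3.0 x4): {pcie_gbps(3, 4):.1f} Gbit/s")
print(f"SATA Express    (2.0 x2): {pcie_gbps(2, 2):.1f} Gbit/s")
```

So an M.2 or U.2 adapter has bandwidth to spare (~31.5 Gbit/s), while SATA Express lands at 8 Gbit/s - just under a fully loaded 10GbE link, hence the caveat above.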
What storage do you have in your ITX PC? Are you genuinely pushing data rates that max out even 1GbE on that box alone? You'd need sustained large-file copy performance above roughly 110 MB/s - about what 1GbE delivers in practice - before that's the bottleneck. Even with NVMe, are you genuinely getting that level of throughput? USB 3.0 manages around 450 MB/s in practice IIRC. As others have said, Thunderbolt goes higher in theory, but I'm not sure there's demand for 10GbE NICs in this space, which means either £££ or they just don't exist. The reason I ask about I/O throughput on your ITX is that a 10GbE backbone and uplink to your NAS might actually provide enough bandwidth to max out several 1GbE connections at the same time, i.e. it removes the storage-to-LAN bottleneck at the NAS? I guess it depends what you're trying to fix right now.
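To put rough numbers on those link speeds, here's a quick conversion from line rate to real-world copy throughput (the 94% efficiency factor is an assumption - typical for large sequential transfers after Ethernet framing and TCP overhead, not a measured figure):

```python
# Assumed payload efficiency for large sequential copies over TCP.
EFFICIENCY = 0.94

def mbytes_per_sec(link_gbps: float) -> float:
    """Approximate real-world MB/s for a given Ethernet line rate in Gbit/s."""
    return link_gbps * 1000 / 8 * EFFICIENCY

for name, gbps in [("1GbE", 1), ("2.5GbE", 2.5), ("5GbE", 5), ("10GbE", 10)]:
    print(f"{name}: ~{mbytes_per_sec(gbps):.0f} MB/s")
```

That works out to roughly 118, 294, 588 and 1175 MB/s respectively - which is why 1GbE copies plateau around 110 MB/s, and why even the 5GbE dongle would be a big jump.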
PCIe 3.0 x8 does impact performance. Not majorly, but enough that it negated the benefit of my meagre CPU upgrade (1900X to 2920X). I wanted to put more NVMe drives in at the same time, and I only have two x16 slots, one of which already holds 4x NVMe, so I dropped my GPU down to the x8 slot thinking there would be no impact. But many of my post-upgrade gaming benchmarks were slower despite more cores, higher clocks and extra L3 cache; switching my 1080 Ti back to x16, performance improved as expected. Bit of a bummer really, as I want it all. In reality, once I game at normal res rather than 1080p (which I was only doing to exercise the CPU) it will probably have negligible impact, but I was surprised to see a difference.
I have 3 TB of SSD storage in my PC. Like I said - this is something I want but, fine... I'm moving around a lot of 4K footage - slowly replacing most of the video library with higher-definition copies, as well as storing raw and processed 4K video from the drone (not huge amounts yet, but I don't see this number going anywhere but up). Moving tens or hundreds of gigs over to my NAS is not an uncommon event, and waiting for it to complete at ~109 MB/s is driving me up the ****ing wall. Even doubling or tripling that rate will be a win in my eyes. As an aside, general backups and array rebuilds/expansions/transplants will see a benefit too. I'm not trying to be rude, but this is not an "if" situation, it's a "when and how". I know my main rig isn't ideal to fit in this setup, but I ain't changing it, and building a similarly specced PC just to handle video won't save me any money. That's why I'm reaching out to the forum for ideas.
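The waiting time is easy to quantify. A quick sketch of how long a big copy takes at the current ~109 MB/s versus a roughly tripled rate (the 330 MB/s figure is an assumption standing in for "what a 5GbE dongle might realistically deliver"):

```python
def transfer_minutes(gigabytes: float, rate_mb_s: float) -> float:
    """Minutes to move a given number of GB at a sustained MB/s rate."""
    return gigabytes * 1000 / rate_mb_s / 60

# A 200 GB batch of 4K footage:
print(f"{transfer_minutes(200, 109):.1f} min at 1GbE (~109 MB/s)")   # roughly half an hour
print(f"{transfer_minutes(200, 330):.1f} min at ~3x that rate")      # about ten minutes
```

Going from half an hour to ten minutes per batch is exactly the "doubling or tripling" win described above.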
It's something to look at, but it would mean sacrificing my NVMe OS drive, unless I throw £200+ at one of those QNAP 10GbE/M.2 cards; assuming one would work off a third-party M.2-to-PCIe adapter.
Completely with you David, I want to do similar for similar reasons. Whilst my NAS at the moment can't support that speed, it's only a matter of time, and I can definitely saturate a 10Gb connection once the NAS is upgraded. It will be like having a remote SSD once done - really fast network storage. Who doesn't want that?
No worries dude, I was just trying to understand what the driver is. I wasn't sure if the NAS was constrained or where the issue is. Clearly getting 10GbE into your ITX is the challenge here. I found this https://www.amazon.co.uk/dp/B06XKWQPYB but I'd obviously consider that cost pretty insane. I also wonder if you could install a Thunderbolt 3 PCIe expansion chassis, and instead of installing a graphics card, install a PCIe 10GbE card?
Hol up there, while you may have stonking loads of bandwidth available for copying very large contiguous files, as fast as a local SSD it sure ain't. The big killer is latency (and by association, IOPS): rather than a single hop back and forth across the PCIe bus for the CPU to grab a given bit of data from a drive, you now need to:
- Traverse the PCIe bus to the NIC
- Have the NIC do its thang
- Packets travel to the switch, the switch does its thang, packets travel to the NAS NIC (admittedly this can reasonably be assumed to be 'wire speed' for all intents and purposes, unless you have a really weird network setup)
- The NAS NIC does its thang
- Hop over the PCIe bus to the NAS CPU
- The NAS CPU performs whatever filesystem processing is necessary to retrieve the file you want (could be anything from hewing it out of a deduped ZFS vdev to 'just' a JBOD access operation)
- Hop back and forth over the NAS PCIe or SATA bus
- The retrieved data now has to come back through that whole chain again

While there are DMA techniques that can speed things up, you're then looking at a SAN rather than a NAS, and some whole new worlds of headaches to discover.
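The chain above can be turned into a toy model of why latency, not bandwidth, kills small-I/O performance: for synchronous small reads, each operation waits one full round trip, so per-thread IOPS is capped at 1 / round-trip latency. The latency figures below are illustrative assumptions, not measurements of any particular hardware:

```python
def max_sync_iops(round_trip_us: float) -> float:
    """Upper bound on synchronous, single-threaded IOPS for a given
    round-trip latency in microseconds: one I/O per round trip."""
    return 1_000_000 / round_trip_us

# Assumed round-trip latencies (illustrative):
print(max_sync_iops(80))    # local NVMe, ~80 us  -> 12500.0 IOPS per thread
print(max_sync_iops(500))   # NAS over 10GbE, ~500 us -> 2000.0 IOPS per thread
```

Even with generous assumptions, the extra hops cost roughly an order of magnitude in small-I/O responsiveness - which is why it feels nothing like a local SSD for anything other than big sequential copies.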
Sure, OK, it's not like an SSD in every sense, but like David I'm shifting multiple 4K camera recordings after encodes, as I like to keep the uncompressed source backed up. Sequential bandwidth rather than random access is what matters here.
Yeah, I've looked at those and I consider the cost to be insane too. Now there's a thought... definitely a fairly straightforward solution if a tad on the pricey side. I recall an article somewhere about a guy building his own box - not sure if it was TB though.
I just bought a RTX 2060 from Goatee which he had in an external chassis box - I didn't need the box. He was thinking of keeping it but maybe he would sell it separately? Might cut down the cost a bit to give it a try.
Which ITX motherboard do you have? Do you not have any internal USB 3 header you could move something onto, or a free slot?