I’m looking at getting a 10 gigabit network switch. I only have 3 devices that could use that speed right now but I do plan on upgrading things over time.

Any recommendations?

  • Lem453@lemmy.ca · 46 points · 2 months ago

    The comments here saying not to bother with 10GbE are surprising considering this is the selfhosted community, not a random home-networking help forum. Dismissing a reasonable request from someone who is building a homelab is not a good way to grow niche communities like this one on the fediverse.

    10GbE has come down in price a lot recently, but it is of course still more expensive than 1GbE.

    Ideas for switches: https://www.servethehome.com/the-ultimate-cheap-10gbe-switch-buyers-guide-netgear-ubiquiti-qnap-mikrotik-qct/

    https://www.servethehome.com/nicgiga-s25-0501-m-managed-switch-review-5-port-2-5gbe-and-sfp-realtek/

    For a router: https://www.servethehome.com/everything-homelab-node-goes-1u-rackmount-qotom-intel-review/

    • czardestructo@lemmy.world · 12 points · 2 months ago

      I bought all the gear to do 10GbE but ultimately went back to 1 gig, simply because of the power consumption. The switch alone used 20 W at idle and each NIC burned 8 W, and I couldn't justify it.
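      To put that draw in perspective, here's a back-of-the-envelope yearly cost sketch. The 20 W switch and 8 W per NIC figures come from the comment above; the NIC count and electricity rate are assumed example values:

```python
# Rough yearly cost of the extra 10GbE power draw.
# 20 W switch idle and 8 W per NIC are from the comment above;
# the NIC count and electricity rate are assumptions for illustration.

SWITCH_IDLE_W = 20    # switch idle draw (from the comment)
NIC_W = 8             # per-NIC draw (from the comment)
NICS = 2              # e.g. one in the server, one in the desktop (assumed)
RATE_PER_KWH = 0.30   # assumed electricity price, $/kWh

total_w = SWITCH_IDLE_W + NICS * NIC_W        # 36 W, running 24/7
kwh_per_year = total_w * 24 * 365 / 1000      # ≈ 315 kWh/yr
cost_per_year = kwh_per_year * RATE_PER_KWH

print(f"{total_w} W continuous -> {kwh_per_year:.0f} kWh/yr "
      f"≈ ${cost_per_year:.0f}/yr")
```

      Roughly $95/yr at that rate, so the "can't justify it" call is easy to follow; at cheaper rates or with fewer NICs the math shifts.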

      • Lem453@lemmy.ca · 6 points · 2 months ago

        Very reasonable. FWIW, SFP+ uses way less power than RJ45 for 10GbE, if that's an option.

    • Neshura@bookwormstory.social · 19 points · 2 months ago

      Personally, going 10G on my networking gear has significantly improved my self-hosting experience, especially when it comes to file transfers. 1G can just be extremely slow when you're dealing with large amounts of data, so I also don't really understand why people recommend against 10G here of all places.
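      The speed difference is easy to estimate. This is a rough sketch, assuming ~85% of line rate as sustained throughput (an assumption; real numbers depend on disks, protocol overhead, and CPU):

```python
# Rough transfer-time comparison for a bulk copy at different link speeds.
# The 85% efficiency figure is an assumption, not a measurement.

def transfer_minutes(size_gb: float, link_gbit: float,
                     efficiency: float = 0.85) -> float:
    """Minutes to move size_gb gigabytes over a link_gbit link."""
    gbits = size_gb * 8                         # GB -> gigabits
    return gbits / (link_gbit * efficiency) / 60

for link in (1, 2.5, 10):
    print(f"{link:>4} GbE: {transfer_minutes(500, link):6.1f} min for 500 GB")
```

      A 500 GB copy that ties up the link for well over an hour at 1 GbE drops to under ten minutes at 10 GbE, assuming the disks can keep up.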

      • tburkhol@lemmy.world · 4 points (1 down) · 2 months ago

        And X11. There are a few server tasks that I just find easier with a GUI, and they feel kind of laggy over 1G. Not to mention an old Windows program running in WINE over X forwarding. There are all kinds of things you can do, internally, to eat up bandwidth.

      • JustEnoughDucks@feddit.nl · 5 points · 2 months ago

        I think it has to do with the difference in data volume between self-hosters and data hoarders.

        Example: a self-hoster with an RPi Home Assistant setup and an N100 server with some paperwork, photos, Nextcloud, and a small Jellyfin library.

        A few terabytes of storage, and their goal is to replace services they paid for in an efficient manner. Large data transfers will happen extremely rarely and would be limited in size, likely backing up some important documents or family photos. Maybe they have a few hundred Mbit of internet, max.

        Vs

        A data hoarder with a 500 TB RAID array who indexes all media possible, has every retail game sold for multiple consoles, has taken 10k RAW photos, runs multiple daily and weekly backups to different VPS storage, hosts a public website, has >gigabit internet, and is seeding 500 torrents at any given time.

        I would venture to guess that option 1 is the vast majority of cases in selfhosting, and 10Gb networking is much more expensive for limited benefit for them.

        Now on a data hoarding community, option 2 would be a reasonable assumption and could benefit greatly from 10Gb.

        Also, 10Gb is great for companies, which are less likely to be posting in a selfhosted community.

        • Neshura@bookwormstory.social · 8 points · 2 months ago

          I somewhat disagree that you have to be a data hoarder for 10G to be worth it. For example, I've got a headless Steam client on my server that has my larger games installed (~2 TB in all, so not data hoarder territory), which lets me install and update those games at ~8 Gbit/s. That in turn lets me run a leaner desktop PC, since I can just uninstall the larger games as soon as I stop playing them daily, and it saves me time when Steam inevitably fails to auto-update a game on my desktop before I want to play it.

          Arguably a niche use case, but it exists alongside other such niche use cases. So if someone comes into this community asking how best to implement 10G networking, I will assume they have (or at least think they have) such a use case and want to improve that situation a bit.