Just had this idea pop into my mind. Instead of relying on volunteers mirroring package repositories all around the world, why not use the BitTorrent protocol to shift at least some of the load onto the users, and thus increase download speeds as well as decrease latency?

  • ಠ_ಠ@infosec.pub · 1 month ago · +7/−2

    Some distros do this already.

    Alternative downloads

    There are several other ways to get Ubuntu including torrents, which can potentially mean a quicker download, our network installer for older systems and special configurations and links to our regional mirrors for our older (and newer) releases.

    BitTorrent is a peer-to-peer download network that sometimes enables higher download speeds and more reliable downloads of large files. You need a BitTorrent client on your computer to enable this download method.

    https://ubuntu.com/download/alternative-downloads

  • sorrybookbroke@sh.itjust.works · 1 month ago · +13

    That’s actually a really interesting idea. Windows even does something similar, or at one point did, with system updates.

    Peer-to-peer packages would have some privacy and potential security issues, of course, but I like the thought.

    • delirious_owl@discuss.online · 1 month ago · +3

      Good lord, and Windows doesn’t have a way to verify that its ISOs are authentic. Do they sign this p2p payload in any way? Seems like a great opportunity to spread a worm.
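      Whatever the transport (HTTP mirror or p2p swarm), the usual safeguard is checking the downloaded image against a checksum published over a trusted channel. A minimal sketch in Python; the function names are my own, not part of any distro’s tooling:

```python
# Sketch of out-of-band checksum verification; function names are invented
# for illustration, not part of any real distro's tooling.
import hashlib
import hmac

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file through SHA-256 so a multi-GB ISO never sits in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_image(path: str, expected_hex: str) -> bool:
    """Compare against a checksum obtained over a trusted channel,
    e.g. an HTTPS page or a GPG-signed SHA256SUMS file."""
    return hmac.compare_digest(sha256_of(path), expected_hex.lower())
```

      Note that the checksum itself must come from somewhere trustworthy (HTTPS page, signed SHA256SUMS file); a checksum served by the same untrusted swarm proves nothing.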

  • delirious_owl@discuss.online · 1 month ago · +46/−2

    What are you talking about? All that torrent traffic that my ISP sees is definitely Linux ISOs.

    Just doing my part

  • atzanteol@sh.itjust.works · 1 month ago · +50

    BitTorrent would likely increase latency, not lower it. The BitTorrent protocol is very inefficient for small files and large numbers of files (https://wiki.debian.org/DebTorrent - see “Problems”).

    But I think your question is really “why not use p2p to distribute files?”, to which I think the answer is likely “because they don’t need to.” It would add complication and overhead to maintain. An FTP/HTTP server is pretty simple to set up and maintain, and the tools to do so already exist. You can also use round-robin DNS to gain some redundancy and a bit of load spreading without much effort.
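    Round-robin DNS is just a hostname with several A records, returned in rotating order so successive clients land on different mirrors. A tiny simulation of that behavior (the IP addresses are invented for the sketch):

```python
import itertools

# Hypothetical mirror IPs behind one hostname (addresses are documentation
# addresses, invented for this sketch).
MIRROR_IPS = ["203.0.113.10", "203.0.113.11", "203.0.113.12"]

def round_robin(ips):
    """Yield the address list rotated one step per query, which is roughly
    what a round-robin DNS server does for successive lookups."""
    for i in itertools.count():
        k = i % len(ips)
        yield ips[k:] + ips[:k]

queries = round_robin(MIRROR_IPS)
# Each client typically connects to the first address it receives,
# so over many lookups the load spreads across all three mirrors.
first_choices = [next(queries)[0] for _ in range(6)]
```

    No client-side support is needed, which is part of why it beats a custom p2p layer for simple redundancy.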

    • Omega_Jimes@lemmy.ca · 1 month ago · +7

      BitTorrent is nice for getting ISOs, but I would pull my hair out if I tried to download patches with it.

    • lemmyvore@feddit.nl · 1 month ago · +29

      OP is talking about distributing packages and updates peer to peer, not just the install media. AFAIK no distro does that.

  • biribiri11@lemmy.ml · 1 month ago (edited) · +3

    Another thing not mentioned yet is maintenance overhead. These distros operate around the clock, all over the world, with talent from the likes of RH and co. Far fewer of the people who run your mirrors know how to maintain a torrent tracker (or similar), and on top of that, I haven’t really seen any good BitTorrent caching methods. Support would also need to be added to your package manager of choice.

    It also comes down to most clients having asymmetric bandwidth, and to the fact that most users don’t have every package installed and can therefore distribute only a very small fraction of the total distro. Those users probably don’t want to be constantly uploading, either. I also can’t imagine torrents are much fun to work with when it comes to distributing constantly changing package manager metadata.
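    A rough back-of-envelope illustrates the point; every number below is an assumption picked for illustration, not a measurement of any real distro or ISP:

```python
# Back-of-envelope only: all numbers are illustrative assumptions.
repo_packages = 100_000       # order of magnitude for a large distro repo
installed_packages = 2_000    # packages on a typical desktop install
fraction_seedable = installed_packages / repo_packages   # -> 0.02, i.e. 2%

download_mbps = 100           # assumed home downlink
upload_mbps = 5               # assumed asymmetric uplink
uplink_fraction = upload_mbps / download_mbps            # -> 0.05
```

    Under these assumptions a typical user could seed about 2% of the archive, at a twentieth of their download speed, which is why a swarm of desktops is a poor substitute for a mirror.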

  • makeasnek@lemmy.ml · 1 month ago (edited) · +13

    There is an apt variant that can do this, but nobody uses it. BitTorrent isn’t great, overhead-wise, for lots of small files.

    IPFS is better suited to this than torrents. The question is always “how much should the client seed before it stops, and how long should it keep trying before it gives up?” I agree something like this should exist; I have no problem quickly re-donating any bandwidth I use.
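    That “how much / how long to seed” question could be expressed as a simple client-side policy. This is a hypothetical sketch of such a policy, not the behavior of any real client:

```python
def should_keep_seeding(uploaded: int, downloaded: int,
                        started_at: float, now: float,
                        target_ratio: float = 1.0,
                        max_seconds: float = 3600.0) -> bool:
    """Hypothetical policy: keep seeding until we've re-donated the bandwidth
    we used (upload/download ratio >= target) or a time budget runs out,
    whichever comes first. All thresholds are made-up defaults."""
    if now - started_at >= max_seconds:
        return False                      # time budget exhausted
    if downloaded > 0 and uploaded / downloaded >= target_ratio:
        return False                      # bandwidth fully re-donated
    return True
```

    The ratio target caps what each user gives back, and the time budget stops clients from idling in a swarm forever; both would presumably be user-configurable.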

  • recarsion@discuss.tchncs.de · 1 month ago · +2

    To add to everything else mentioned: many places (schools, workplaces) don’t allow any use of BitTorrent, even for legal content. A guy at my uni got yelled at for torrenting a Linux ISO. Not to mention that, depending on where you live, your ISP might take an interest in that activity unless you’re using a VPN.

  • Rogue@feddit.uk · 1 month ago · +41

    I suspect that if this were enabled by default there would be an uproar from people annoyed that the distro was stealing their bandwidth, and if it were opt-in then very few people would use it.

    Windows Update uses peer to peer to distribute updates. It’s one of the first things I always disable.

  • GravitySpoiled@lemmy.ml · 1 month ago (edited) · +25

    One reason is privacy, and hence security. If you share a package, you also share the information that your system contains the outdated package “xy”, which has a backdoor and can be exploited by an attacker.

    I’m not sure whether that argument holds for atomic image distros, since you share the whole image. And the tracker could just disable the old image as soon as the new one arrives.

      • Lemmchen@feddit.de · 1 month ago · +4

        But as a third party you cannot know which clients are using an outdated HTTP mirror. With BitTorrent you can see every participating peer, and some of them are probably end-user machines (depending on the actual implementation of OP’s suggestion).

    • vort3@lemmy.ml · 1 month ago · +11

      For the longest time, people wondered: how do bees fly without bumping into each other? There are so many of them!

      To find out, people used high-speed cameras, and then they were shocked to find that bees actually do bump into each other.

      Isn’t it ridiculous that we just take our assumptions about something we have no idea about as facts?

      • moreeni@lemm.ee (OP) · 1 month ago (edited) · +1

        Yeah, not my brightest idea, that’s for sure. Still, I don’t know about these “many distros” everyone is talking about; the big distros rely on volunteer-run mirrors, from what I’ve been able to find.