Hello selfhosted! Sometimes I have to transfer big files or large amounts of small files in my homelab. I've been using rsync, but specifying the IP address, the folders and everything is a bit fiddly. I thought about writing a bash script, but before I do that I wanted to ask you about your favourite way to achieve this. Maybe I'm missing out on an awesome tool I wasn't even thinking about.
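For the bash-script route mentioned above, a minimal sketch of such an rsync wrapper (the host name and target directory are made-up placeholders, not from the post):

```shell
#!/usr/bin/env bash
# push: copy a file or directory to the homelab server without
# retyping the address and paths every time.
# HOST and REMOTE_BASE are placeholders; set them for your own network.
HOST="nas.local"
REMOTE_BASE="/srv/share"

push() {
    # -a archive mode (recursion, permissions, times), -z compress,
    # --partial lets interrupted transfers resume
    rsync -az --partial --progress "$1" "${HOST}:${REMOTE_BASE}/"
}
```

After sourcing this, `push ~/Downloads/big.iso` is all the typing needed.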

Edit: I settled on SFTP in my GUI file manager for now. When I have some spare time I'll look into the other options too. Thank you for the helpful information.

  • node815@lemmy.world · 7 days ago

    I work from home, but even though my two systems (home and work) are on the same LAN, they don't see each other for file sharing. I get paid via direct deposit like everyone else, which means my pay stubs are all electronic. I print those out and then use WinSCP to copy them over to my desktop. No other files are ever sent.

    At home, depending on the number of files, I either use SFTP via FileZilla or, if the mood strikes me and it's a single file, I'll just use scp if I'm already on the CLI, which seems to be most of the time these days when I'm working on my personal servers. I've found that SFTP is faster at transferring than doing a copy/paste over the NFS share to the same drive.
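    For reference, the scp case is a one-liner (host and paths here are examples, not from the comment; wrapped in functions for reuse):

    ```shell
    # copy one file to the server (host and paths are examples)
    scp_file() { scp "$1" "me@192.168.1.50:/home/me/docs/"; }

    # copy a whole directory recursively
    scp_tree() { scp -r "$1" "me@192.168.1.50:/home/me/"; }

    # usage: scp_file ./report.pdf
    ```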

  • Turboblack@lemmy.world · 5 days ago

    You can use a regular FTP server with administrator and user rights: give write access to the accounts that add files, and guest access to those who just take. At home I transfer files this way from computer to computer without connecting them to a common network. What could be simpler? Why invent schemes with keys or bash scripts when there is 40-year-old technology that just works great? To open FTP it's enough to enter the IP address in the file explorer.

    • pirat@lemmy.world · 5 days ago

      A single folder synced between all of them, or a separate folder for each, syncing everything to a single device?

  • motsu@lemmy.world · 7 days ago

    SMB share if it's desktop to desktop. If it's from phone to PC, I throw it on Nextcloud on the phone, then grab it from the web UI on the PC.

    SMB is the way to go if you have identity set up, since your PC auth will carry over to the connection to the SMB share. Nextcloud is less typing otherwise, since you can just have persistent auth in the app / web UI.
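    On a Linux client, mounting such an SMB share from the CLI might look like this (server, share and mountpoint names are examples; needs the cifs-utils package):

    ```shell
    # Mount //nas.local/<share> at /mnt/<share>; uid/gid make the
    # mounted files owned by the current user.
    mount_share() {
        sudo mount -t cifs "//nas.local/$1" "/mnt/$1" \
            -o "username=me,uid=$(id -u),gid=$(id -g)"
    }
    # usage: mount_share media
    ```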

    • BeardedGingerWonder@feddit.uk · 5 days ago

      Solid Explorer on Android is pretty useful too; it can access the SMB share. I use Nextcloud for photo backup, but usually Solid Explorer for one-off file transfers.

    • boreengreen@lemm.ee · edited · 7 days ago

      As I understand it, establishing the connection relies on a relay server. So this would not work on a local network without one, and it would by default try to reach a server on the internet to make connections.

  • MasterBlaster@lemmy.world · 7 days ago

    By “homelab”, do you mean your local network? I tend to use shared folders, KDE Connect, or WebDAV.

    I like WebDAV, which I can activate on Android with DAVx5 and Material Files, and I use it for Joplin.

    The nice thing about this setup is that I also have a certificate-secured OpenVPN, so in a pinch I can access it all remotely when necessary by activating that VPN, then disconnecting.
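    On a Linux client, the same WebDAV share can be mounted as a normal directory with davfs2 (URL and mountpoint are examples; needs the davfs2 package):

    ```shell
    # Mount a WebDAV URL at /mnt/dav; credentials can be stored in
    # /etc/davfs2/secrets, otherwise mount prompts for them.
    mount_dav() { sudo mount -t davfs "$1" /mnt/dav; }
    # usage: mount_dav https://nas.local/dav
    ```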

  • Xanza@lemm.ee · 7 days ago

    rclone. I have a few helper functions:

    fn mount { rclone mount http: X: --network-mode }
    fn kdrama {|x| rclone --multi-thread-streams=8 --checkers=2 --transfers=2 --ignore-existing --progress copy http:$x nas:Media/KDrama/$x --filter-from ~/.config/filter.txt }
    fn tv {|x| rclone --multi-thread-streams=8 --checkers=2 --transfers=2 --ignore-existing --progress copy http:$x nas:Media/TV/$x --filter-from ~/.config/filter.txt }
    fn downloads {|x| rclone --multi-thread-streams=8 --checkers=2 --transfers=2 --ignore-existing --progress copy http:$x nas:Media/Downloads/$x --filter-from ~/.config/filter.txt }

    So I download something to my seedbox, then use rclone lsd http: to get the exact name of the folder/files, and run tv "filename" and it runs my function. Pulls all the files (based on filter.txt) using multiple threads to the correct folder on my NAS. Works great, and maxes out my connection.
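    The filter.txt referenced by --filter-from isn't shown; an rclone filter file generally looks something like this (the patterns are guesses for illustration; rules are checked top to bottom and the first match wins):

    ```
    # drop junk first (first matching rule wins)
    - *sample*
    - *.nfo
    # keep video and subtitle files
    + *.mkv
    + *.mp4
    + *.srt
    # exclude everything else
    - *
    ```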

  • lemmylommy@lemmy.world · 7 days ago

    WinSCP for editing server config

    Rsync for manual transfers over slow connections

    ZFS send/receive for what it was meant for

    Samba for everything else that involves mounting on clients or other servers.
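    For the ZFS line, send/receive over SSH typically looks like this (pool, dataset and host names are examples):

    ```shell
    # Snapshot, then send only the delta between two snapshots to
    # another machine, which receives it into a backup dataset.
    replicate() {
        zfs snapshot "tank/data@$2"
        zfs send -i "tank/data@$1" "tank/data@$2" |
            ssh backup-host zfs receive tank/backup/data
    }
    # usage: replicate yesterday today
    ```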

  • raldone01@lemmy.world · 5 days ago

    Um. So you're not gonna like this, but I just connect with VS Code Remote-SSH and drag and drop them from the OS file explorer into the VS Code one.

    So, long story short: scp, I guess.

    • GamingChairModel@lemmy.world · 5 days ago

      Yeah, I mean I do still use rsync for the stuff that would take a long time, but for one-off file movement I just use a mounted network drive in the normal file browser, including on Windows and macOS machines.

    • theorangeninja@sopuli.xyz (OP) · 7 days ago

      Sounds very straightforward. Do you have a Samba docker container running on your server, or how do you do that?

        • GamingChairModel@lemmy.world · 5 days ago

          Yeah, if OP has command line access through rsync then the server is already configured to allow remote access over NFS or SMB or SSH or FTP or whatever. Setting up a mounted folder through whatever file browser (including the default Windows Explorer in Windows or Finder in macOS) over the same protocol should be trivial, and shouldn't require any additional server-side configuration.

      • Lv_InSaNe_vL@lemmy.world · 7 days ago

        I don't have a docker container; I just have Samba running on the server itself.

        I do have an ownCloud container running, which is mapped to a directory, and I have that shared out through Samba so I can access it through my file manager. But that's unnecessary, because ownCloud is kind of trash.

      • Kit@lemmy.blahaj.zone · 7 days ago

        I have two servers, one Mac and one Windows. For the Mac I just map directly to the SMB share; for the Windows it's a standard network share. My desktop runs Linux and connects to both with ease.

      • drkt@lemmy.dbzer0.com · edited · 7 days ago

        I just type sftp://[ip, domain or SSH alias] into my file manager and browse it as a regular folder

          • drkt@lemmy.dbzer0.com · edited · 6 days ago

            Linux is truly extensible, and that is the part of it I both love and struggle to explain the most.
            I can sit at my desktop developing code that physically resides on my server, and interact with it from my laptop. This doesn't require any strange janky setup; it's just SSH. It's extensible.

            • blackbrook@mander.xyz · edited · 6 days ago

              I love this so much. When I first switched to Linux, being able to just list a bunch of server aliases along with the private-key references in my .ssh/config made my life SO much easier than the redundantly maintained and hard-to-manage PuTTY and WinSCP configurations on Windows.
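              Such an entry looks like this (host alias, address and key path are examples); afterwards plain `ssh nas`, or `sftp://nas` in a file manager, just works:

              ```
              # ~/.ssh/config — one block per machine
              Host nas
                  HostName 192.168.1.50
                  User me
                  IdentityFile ~/.ssh/id_ed25519
              ```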