When running

rsync -Paz /home/sbird "/run/media/sbird/My Passport/sbird"

as someone suggested, I ran into an out-of-storage error midway. Why is this? My home folder's disk usage is about 385 GiB, and there is at least 800 GiB of free space on the external SSD (which already holds things like photos and documents). Does rsync make duplicate copies of everything or something? That would be kind of silly. Or is it some other issue?

Note that the SSD is from a reputable brand (Western Digital), so it is unlikely to be reporting a fake amount of storage.

  • degenerate_neutron_matter@fedia.io · 8 hours ago

    BTRFS supports compression and deduplication, so the actual disk space used might be less than the total size of your home directory. I’d run du -sh --apparent-size /home/sbird to check how large your home dir actually is. If it’s larger than 780 GiB, there’s your problem. Otherwise there might be hardlinks which rsync is copying multiple times; add the -H flag to copy hardlinks as hardlinks.
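    A quick sketch of that comparison (standard GNU du; the path is just your home dir from the post):

        du -sh /home/sbird                   # on-disk usage, after any compression/dedup
        du -sh --apparent-size /home/sbird   # logical size, i.e. what rsync actually has to write

    If the second number is much bigger than the first, transparent compression on the source is hiding the real transfer size.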

    • sbird@sopuli.xyz (OP) · 8 hours ago

      383G for /home/sbird (definitely not more than 780G), so that is strange. Using -H doesn't help anyway, since the external SSD is exFAT (which, from a quick search, doesn't support hardlinks).

      • degenerate_neutron_matter@fedia.io · 7 hours ago

        You can rerun the du command with --count-links, which makes du count hardlinked files once per link instead of once per inode. If that shows more than 780 GiB, you have a lot of hardlinks somewhere; you can narrow them down by rerunning the command on each subdirectory of your home directory.
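        If you'd rather find them directly, something like this rough sketch (assuming GNU find) lists every file with more than one hard link, highest link count first:

            find /home/sbird -xdev -type f -links +1 -printf '%n %p\n' | sort -rn | head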

        Your options would be to delete the extra hardlinks to shrink the total, exclude them from the rsync with --exclude, or reformat the SSD with a filesystem that supports hardlinks.
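        For the --exclude route, it would look something like this; .cache/ is just a placeholder for wherever the hardlinks turn out to live:

            rsync -Paz --exclude='.cache/' /home/sbird "/run/media/sbird/My Passport/sbird"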

        • sbird@sopuli.xyz (OP) · 7 hours ago

          With --count-links it is still just 384G, so hardlinks are probably not the issue?