When running

rsync -Paz /home/sbird "/run/media/sbird/My Passport/sbird"

as someone suggested, I ran into an out-of-storage error midway. Why is this? My home folder uses about 385 GiB, and the external SSD has at least 800 GiB free (it already holds things like photos and documents). Does rsync make duplicate copies of everything or something? That would be kind of silly. Or is it some other issue?

Note that the SSD is from a reputable brand (Western Digital), so it is unlikely to be misreporting its capacity.

  • bleistift2@sopuli.xyz · 8 hours ago

    You can store the output of rsync in a file by using rsync ALL_THE_OPTIONS_YOU_USED > rsync-output.txt. This creates a file called rsync-output.txt in your current directory which you can inspect later.

    This, however, means that you won’t see the output right away. You can also use rsync ALL_THE_OPTIONS_YOU_USED | tee rsync-output.txt, which will both create the file and display the output on your terminal as it is produced.
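    One caveat: the out-of-storage error itself goes to stderr, so a plain > (or tee on stdout alone) would miss it; redirect stderr into the pipe too. A sketch in a scratch directory, with two printf lines standing in for a real rsync run (the error text shown is illustrative):

    ```shell
    # Work in a scratch directory so we don't clutter anything:
    cd "$(mktemp -d)"

    # Capture BOTH streams. rsync prints progress on stdout and errors
    # (like "No space left on device") on stderr; 2>&1 merges them
    # before tee writes the file and echoes to the terminal.
    # The braces stand in for: rsync -Paz SRC DST
    { printf 'sent 385 bytes\n'
      printf 'rsync: write failed: No space left on device (28)\n' >&2
    } 2>&1 | tee rsync-output.txt
    ```

    With your real command, that would be rsync -Paz SRC DST 2>&1 | tee rsync-output.txt.
    
    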

    • sbird@sopuli.xyzOP · 8 hours ago

      Having a quick scroll through the output file (neat tip with the > to get a text file, thanks!), nothing immediately jumps out at me. There aren’t any repeated folders or anything like that at a glance. Anything I should look out for?

      • bleistift2@sopuli.xyz · 8 hours ago

        You checked 385GiB of files by hand? Is that size made up by a few humongously large files?
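        One quick way to answer that is to sort a du listing by size. A toy sketch with throwaway files (on real data you’d point it at your home directory, e.g. du -ah ~ | sort -rh | head):

        ```shell
        # Work in a scratch directory with two files of very different
        # sizes, then list the largest entries first (GNU du/sort):
        tmp=$(mktemp -d)
        head -c 100000 /dev/zero > "$tmp/big"
        head -c 10     /dev/zero > "$tmp/small"
        du -a "$tmp" | sort -rn | head -n 3
        ```

        The first line is the directory total; any humongously large file would show up right below it.
        
        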

        I suggest using uniq to check whether there are duplicate lines in there (uniq’s input must be sorted first). If you still have the output file from the previous step, and it’s called rsync-output.txt, run sort rsync-output.txt | uniq -dc. This will print each duplicated line along with its number of occurrences.
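        To see what that pipeline reports, here’s a toy run on a made-up listing (file names are examples, not from your transfer):

        ```shell
        # A fake transfer log in a scratch directory, with one path
        # appearing twice:
        cd "$(mktemp -d)"
        printf '%s\n' docs/a.txt pics/b.jpg docs/a.txt > listing.txt

        # uniq -d keeps only duplicated lines, -c prefixes each with
        # its count; sort first because uniq only spots adjacent repeats:
        sort listing.txt | uniq -dc
        ```

        Here it prints one line for docs/a.txt with a count of 2; an empty result means no line repeats.
        
        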

        • sbird@sopuli.xyzOP · 8 hours ago

          When using uniq, nothing is printed (I’m assuming that means no duplicates?)

          • bleistift2@sopuli.xyz · 8 hours ago

            I’m sorry. I was stupid. If you had duplicates due to a file system loop or symlinks, they would all be under different names. So you wouldn’t be able to find them with this method.

            • sbird@sopuli.xyzOP · 7 hours ago

              Running du with --count-links, as suggested by another user, returns 384G (so that isn’t the problem, it seems)
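              For anyone curious what --count-links actually changes, a quick sketch with a throwaway hard link (a GNU coreutils option; by default du counts hard-linked data once):

              ```shell
              # Two directory entries pointing at the same 1 MiB of data:
              tmp=$(mktemp -d)
              head -c 1048576 /dev/zero > "$tmp/data"
              ln "$tmp/data" "$tmp/data-hardlink"

              du -sk "$tmp"                 # counts the shared data once
              du -sk --count-links "$tmp"   # counts it once per link
              ```

              So if --count-links and plain du agree (384G here), hard links aren’t inflating the copy.
              
              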

              • bleistift2@sopuli.xyz · 7 hours ago

                du --count-links only counts hard-linked files multiple times. I assumed you had a symlink loop that rsync would have tried to unwrap.

                For instance:

                $ ls -l
                foo -> ./bar
                bar -> ./foo
                

                If rsync followed those links (it doesn’t by default — -a copies symlinks as symlinks — but it does with --copy-links), you’d end up with foo, bar, foo/bar, bar/foo, foo/bar/foo, bar/foo/bar, foo/bar/foo/bar, ad infinitum, in the target directory.
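                That loop is easy to reproduce, and the kernel refuses to resolve it, which is why tools that follow symlinks get into trouble. A sketch:

                ```shell
                # Recreate the two-link loop in a scratch directory:
                cd "$(mktemp -d)"
                ln -s ./bar foo
                ln -s ./foo bar

                readlink foo          # prints ./bar (one hop, no resolution)
                # Actually resolving the chain fails with ELOOP
                # ("Too many levels of symbolic links"):
                cat foo 2>&1 || true
                ```

                Because resolution fails outright rather than expanding forever, anything that tried to follow the links would error out here.
                
                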

      • confusedpuppy@lemmy.dbzer0.com · 8 hours ago

        If you don’t spot any recursion issues, I’d suggest looking for other causes and not spending too much time here. At least now you have some troubleshooting knowledge going forward. Best of luck figuring out the issue.