Thanks, but even though it’s on a plugged-in HDD, I don’t actually care about any of that data. None of it is sensitive. It might be useful, potentially, but it isn’t unique: if somehow my .zim file for Wikipedia got corrupted, I could download it again from https://library.kiwix.org/#lang=eng&category=wikipedia or elsewhere in ~30min (just checked).

What I’m trying to highlight here is the process more than the actual outcome.
TL;DR: yes, if one is serious about not just getting data but also storing it, one should periodically verify that it is still intact. What I want to highlight, though, is knowing how to do that at all in the first place. Anyway, you are right that a proper long-term solution requires understanding how (cold) storage actually works. My heuristic is that it’s like canned food (which I don’t use much): it might last a while, but not forever.
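For anyone wondering what "verify periodically" can look like in practice, here is a minimal sketch (my own illustration, not any specific tool; the `checksums.json` manifest name and the CLI shape are made up for the example): record SHA-256 hashes of everything once, then re-run on a schedule and re-download anything flagged as corrupted.

```python
# Minimal integrity-check sketch (illustrative, not a real tool):
# write a SHA-256 manifest once, then re-run later to compare.
import hashlib
import json
import sys
from pathlib import Path

MANIFEST = Path("checksums.json")  # hypothetical manifest file name

def sha256(path: Path) -> str:
    """Hash a file in 1 MiB chunks so large .zim files don't fill RAM."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def snapshot(root: Path) -> None:
    """Record a checksum for every file under root."""
    sums = {str(p): sha256(p) for p in root.rglob("*") if p.is_file()}
    MANIFEST.write_text(json.dumps(sums, indent=2))

def verify(root: Path) -> None:
    """Compare current file contents against the recorded manifest."""
    sums = json.loads(MANIFEST.read_text())
    for name, expected in sums.items():
        p = Path(name)
        status = "MISSING" if not p.exists() else (
            "OK" if sha256(p) == expected else "CORRUPTED")
        print(f"{status}\t{name}")

if __name__ == "__main__":
    cmd, root = sys.argv[1], Path(sys.argv[2])
    snapshot(root) if cmd == "snapshot" else verify(root)
```

Run something like `python check.py snapshot /mnt/hdd` once, then `python check.py verify /mnt/hdd` every few months (e.g. from a cron job); the standard `sha256sum -c` does essentially the same job.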
> I thought the point of backing stuff up was to have things in case just downloading it again isn’t a viable option?
It can be, but not for me. For me the point is to test what’s actually feasible and usable. It can be Wikipedia on my HDD, but it could also be SO (Stack Overflow) on a microSD in a RPi… or something totally different on another piece of hardware with another piece of storage. It depends on the context.
So again, sure, having the data itself feels nice, but in practice I never really needed it. If my HDD died tomorrow, I would shrug. If the Kiwix library stopped working tomorrow, I’d be disappointed, but I could get the .zim files elsewhere, e.g. from torrent trackers.

IMHO the point isn’t the files, the point is usable knowledge.
Edit: to be clear, this isn’t philosophy; you can see exactly what I mean, and even HOW I do it (and when), in the edit history of my public wiki or my git repositories.