Watch out for flash data corruption. A lot of cheap flash media (USB sticks, SD cards, SSDs) can lose data after just a few years of offline storage. IIRC the stored charge slowly leaks out of the cells via quantum tunneling through the insulating oxide.
So either look for media that guarantee long cold-storage retention (lots of businesses need to keep stuff for 10 years for tax reasons), or occasionally plug the drive in and let the controller do its housekeeping.
Using older flash tech can be useful here. You don't always need the highest-density storage if you want to maintain files for a long time. Cells built on a much larger process node hold their charge far longer, which makes for a much more stable form of storage.
It’s more that NAND flash stores data as a small electric charge trapped in each cell. Over time, that charge dissipates. If you power the storage device every once in a while, the controller gets a chance to refresh weak cells, which minimizes the chance of data loss.
Here’s a video explaining why it happens to Wii Us after being powered off for a while. https://youtu.be/JHME4zLs6Qs
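The "plug it in once in a while" advice above can be turned into a crude script. A minimal sketch: read every file on the mounted drive end to end, on the assumption (true for many but not all controllers) that reads hitting weakening cells trigger the drive's error correction and block relocation. The `read_scrub` name and the mount-point argument are hypothetical, just for illustration:

```python
import os

def read_scrub(mount_point, chunk_size=1 << 20):
    """Read every file under mount_point once, forcing the drive to
    fetch each block. On many (not all) flash controllers, a read that
    hits weakening cells triggers error correction and block relocation,
    so an occasional full read-through acts as a crude refresh.
    Returns the total number of bytes read."""
    total = 0
    for root, _dirs, files in os.walk(mount_point):
        for name in files:
            path = os.path.join(root, name)
            try:
                with open(path, "rb") as f:
                    # Read in 1 MiB chunks so huge files don't fill RAM.
                    while chunk := f.read(chunk_size):
                        total += len(chunk)
            except OSError as e:
                # An I/O error here is itself a useful warning sign.
                print(f"read error (possible corruption?): {path}: {e}")
    return total
```

Run it against the drive's mount point (e.g. `read_scrub("/media/usb")`) every few months. Whether the controller actually rewrites weak blocks on read varies by device, so treat this as a complement to, not a substitute for, powered storage.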
Thanks, but even though it’s on an HDD that stays plugged in, I don’t really care about any of that data. None of it is sensitive. It might be useful, potentially, but it’s not unique: if somehow my .zim file for Wikipedia got corrupted, I could download it again from https://library.kiwix.org/#lang=eng&category=wikipedia or elsewhere in ~30 min (just checked).
What I’m trying to highlight here is more the process than the actual outcome.
TL;DR: yes, if one is actually serious about getting and storing data, they should periodically verify that it is indeed fine. What I want to highlight first, though, is knowing how to do it at all. Anyway, you are right that for a proper long-run solution one must understand how (cold) storage actually works. My heuristic: it’s like canned food (which I don’t use much), it might last a while, but not forever.
I thought the point of backing stuff up was to have things in case just downloading them again isn’t a viable option?

It can be, but not for me. To me the point is to test what’s actually feasible and usable. It can be Wikipedia on my HDD, but it could also be SO on a microSD or a RPi … or something totally different on another piece of hardware with another piece of storage. It will depend on the context.
So again, sure, having the data itself feels nice, but in practice I never really needed it. If my HDD died tomorrow, I would shrug. If the Kiwix library stopped working tomorrow, I’d be disappointed, but I could get the .zim files elsewhere, e.g. from torrent trackers.
IMHO the point isn’t files, the point is usable knowledge.
Edit: to be clear, this isn’t philosophy; you can see exactly what I mean, and even HOW I do it (and even when), in the edits of my public wiki or my git repositories.