I’m guessing it wouldn’t work for a variety of reasons, but having cameras digitally sign the image plus its metadata could be interesting.
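Just to make the idea concrete, here’s a rough sketch of what that signing step might look like (Python with the pyca/cryptography library; the key generation, metadata fields, and image bytes are all made up for illustration, and in a real camera the key would live in a secure element rather than being generated in software):

    import json
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Stand-in for a per-device key burned into the camera at the factory.
    camera_key = Ed25519PrivateKey.generate()

    def sign_capture(image_bytes: bytes, metadata: dict) -> bytes:
        # Sign the image together with its metadata so neither can be
        # swapped out afterwards without breaking the signature.
        payload = image_bytes + json.dumps(metadata, sort_keys=True).encode()
        return camera_key.sign(payload)

    image = b"...raw sensor data..."
    meta = {"timestamp": "2024-05-01T12:00:00Z", "gps": [51.5, -0.1]}
    sig = sign_capture(image, meta)

    # Anyone holding the camera's public key can verify the image/metadata pair.
    try:
        camera_key.public_key().verify(sig, image + json.dumps(meta, sort_keys=True).encode())
        print("signature checks out")
    except InvalidSignature:
        print("image or metadata was tampered with")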
Anything that exists now and that people want to be able to authenticate in the future, they can publish a hash of today.
So long as people trust that the hash really was published today, then later, once that kind of content becomes fakable, they can still trust that this particular item existed before the faking capability was developed.
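The mechanics of the publish-the-hash part are basically just this (a minimal sketch; the file name is made up, and where you actually publish the digest, say a newspaper ad or some append-only public log, is the part doing the real work):

    import hashlib

    def file_digest(path: str) -> str:
        # Stream the file through SHA-256 so it works for large photos/videos.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    # Publish this digest somewhere people agree is timestamped and immutable.
    # Later, anyone holding the original file can recompute the digest and
    # confirm the file existed before the faking capability did.
    print(file_digest("photo_2024-05-01.jpg"))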
Sounds like NFT talk to me, sailor
They could make it difficult to open up the camera and extract its signing key, but only one person has to pull it off for the entire system to become untrustworthy.
In theory you could have a central authority that keeps track of cameras whose keys have been used to sign known-fake images, but then you’re trusting that authority not to invalidate someone’s keys for doing something it disagrees with. And it still wouldn’t prevent someone from buying a new camera, extracting the key themselves, and signing fraudulent images with a fresh, trusted key.
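If you sketch what that revocation check would even look like (hypothetical names, and the “authority” here is just a published set of key fingerprints), it also shows why it does nothing against a freshly extracted key that has never been flagged:

    import hashlib

    # Hypothetical list published by the central authority: fingerprints of
    # camera keys already caught signing known-fake images.
    REVOKED_FINGERPRINTS = {
        "placeholder-fingerprint-of-a-burned-key",
    }

    def key_fingerprint(public_key_bytes: bytes) -> str:
        return hashlib.sha256(public_key_bytes).hexdigest()

    def accept_image(public_key_bytes: bytes, signature_is_valid: bool) -> bool:
        # Reject even a cryptographically valid signature if its key has been
        # revoked. A fresh key pulled out of a newly bought camera still passes,
        # which is exactly the hole described above.
        if key_fingerprint(public_key_bytes) in REVOKED_FINGERPRINTS:
            return False
        return signature_is_valid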