Special shout out to the person who committed a gigabyte memory dump a few years ago. Even with a shallow clone, it’s pretty darn slow now.
We can’t rewrite history to remove it since other things rely on the commit IDs not changing.
Oh well.
If you want to be the team’s hero, I’ve had good luck removing old commits using git filter-repo: https://hibbard.eu/erase-sensitive-files-git-history-filter-repo/
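For anyone following along, here’s roughly what that looks like (the repo URL and file path are placeholders):

    # work on a fresh clone; filter-repo refuses to run on an unclean repo by default
    git clone git@example.com:org/repo.git repo-rewrite
    cd repo-rewrite
    # drop every version of the offending file from all of history
    git filter-repo --path assets/huge.bin --invert-paths
    # filter-repo removes the origin remote as a safety measure; re-add it and force-push
    git remote add origin git@example.com:org/repo.git
    git push --force origin master

Everyone else then has to re-clone (or hard-reset), which is exactly the coordination cost being discussed here.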
Oh right. Oof.
I would be working to arrange an accident for those other things. Those other things probably need to be retired.
Sounds like a flawed workflow, if this didn’t go through at least code review. Was it committed directly to master?
Curious to know what kind of system relies on hashes not changing? Technically the hashes don’t change; a new set of commits is made. The history diverges, and you can still keep the old master around for a while if you need it, even cherry-pick patches onto it…
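Concretely, keeping the old line alive is a one-liner before the rewrite (branch names here are just for illustration):

    # snapshot the pre-rewrite history under another name
    git branch old-master master
    # ...rewrite master with filter-repo...
    # later, carry an urgent fix over to the old line if something still needs it
    git checkout old-master
    git cherry-pick <commit-on-new-master>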
The guy at work who managed git before me didn’t quite have the knowledge I do and was not using LFS. In one of the main repos a 200 MB binary was pushed 80+ times, and it’s not the only file this happened to. Even if you do a shallow clone, you eventually need to deepen the history anyway. It’s a nightmare.
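For what it’s worth, git-lfs ships a migration command for exactly this situation; it rewrites history too, so the same hash-changing caveat applies (the *.bin pattern is just an example):

    # see which file types dominate the history first
    git lfs migrate info --everything
    # then rewrite all branches so matching files become LFS pointers
    git lfs migrate import --everything --include='*.bin'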
Why did you merge it?
Why can’t you just remove the file?
Because the git history still keeps the file in full, so it can be restored when you check out a commit from before it was deleted.
And you cannot tell git to simply forget about it, as that requires a rewrite of history, which changes the hashes of every commit touched (and of all commits after them).
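The cascade is easy to see if you look at a raw commit object; its content includes the parent’s hash, so rewriting any ancestor changes every descendant:

    # print the raw commit object
    git cat-file -p HEAD
    # tree <hash>
    # parent <hash>   <- rewriting the parent changes this line, which changes
    #                    this commit's hash, and so on all the way down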
They most likely did remove it in a later commit. However, the commit that added it cannot be removed, as OP said. So for git to work properly, the file is still stored in the repository in case someone checks out a commit where it was present.
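If anyone’s hunting for which blobs are bloating their repo, plain git plumbing is enough, no extra tools needed:

    # list every blob in history with its size, largest last
    git rev-list --objects --all |
      git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' |
      awk '$1 == "blob"' |
      sort -k3 -n |
      tail -n 10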