洪 民憙 (Hong Minhee)@lemmy.ml to Open Source@lemmy.ml · English · 23 hours ago
Histomat of F/OSS: We should reclaim LLMs, not reject them (writings.hongminhee.org)
Cross-posted to: foss@beehaw.org
fakasad68@lemmy.ml · edited 21 hours ago
Checking whether a proprietary LLM running in the "cloud" has been trained on a piece of GPL code would probably be harder than checking whether a proprietary binary contains a piece of GPL code, though.
☆ Yσɠƚԋσʂ ☆@lemmy.ml · 6 hours ago
Not necessarily; the models can often be tricked into spilling the beans about what they were trained on.
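For illustration, here is a minimal sketch of the kind of memorization probe that comment hints at, assuming a model you can load locally with Hugging Face transformers; the model name and the GPL snippet below are placeholders, not anything from the thread. For a cloud-hosted model you would send the same prefix through its API instead of calling generate locally, and a single verbatim match is suggestive rather than conclusive.

```python
# Hypothetical sketch: check whether a model reproduces a known GPL snippet verbatim
# when prompted with its opening lines. Placeholders throughout.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; swap in the model under scrutiny
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Opening lines of a GPL-licensed function (placeholder text).
prefix = "static int gpl_example_function(struct foo *f)\n{\n"
# The lines that actually follow in the original source (placeholder text).
expected = "    if (!f)\n        return -EINVAL;"

inputs = tok(prefix, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=64, do_sample=False)
completion = tok.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)

# A verbatim continuation is strong (but not conclusive) evidence the snippet
# appeared in the training data.
print("verbatim continuation reproduced:", expected in completion)
```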