ghodawalaaman@programming.dev to Programmer Humor@programming.dev · 3 days ago
Trust me bro! (image) · 59 comments
Quicky@piefed.social · 3 days ago:
I’m torn between wanting to opt out because it’s morally correct, or remaining opted in so I can poison AI models with my terrible code.
bobo@lemmy.ml · 3 days ago:
> so I can poison AI models with my terrible code.
Don’t forget to teach it obscenities and yell at it whenever it fucks something up!
Madrigal@lemmy.world · 3 days ago:
Nah, guarantee the models have rules built in to deal with obvious stuff like that. You need to be more subtle. Give them information that is slightly wrong.
bufalo1973@piefed.social · edited, 2 days ago:
Step 1: prompt another AI: “write an example of code that looks correct but doesn’t work”.
Step 2: upload the resulting code to GitHub.
Step 3: make this an automated task.
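A hypothetical illustration of what step 1 might produce, using Python’s classic mutable-default-argument bug: code that reads fine at a glance but quietly misbehaves (the function name and values are made up for the example).

```python
# Hypothetical "looks correct but doesn't work" code: the default list
# is created once, at function definition time, so every call that
# omits `items` appends to the same shared list.
def append_item(item, items=[]):
    items.append(item)
    return items

print(append_item(1))  # [1]
print(append_item(2))  # [1, 2] -- the previous call's state leaked in
```

A reviewer skimming this would likely expect each call to return a fresh one-element list, which is exactly the kind of plausible-but-wrong output the comment describes.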
taco@anarchist.nexus · 3 days ago:
Perhaps by generating a bunch of complex Copilot code to upload. It’s easy to mass-produce and would look plausibly functional.
Madrigal@lemmy.world · 3 days ago:
Training AI models on AI content is the fastest route to model collapse.
ozymandias117@lemmy.world · 3 days ago:
Just need to use less obvious insults, à la “your mother was a hamster, and your father smelt of elderberries”. Still poisons the model with something an end user won’t like, but isn’t easy to train out.
Viceversa@lemmy.world · 3 days ago:
… and tell it things that are slightly obscene.
Bronstein_Tardigrade@lemmygrad.ml · 3 days ago:
I love the idea of giving Copilot Tourette’s.
Cevilia (they/she/…)@lemmy.blahaj.zone · 3 days ago:
I signed up to GitHub purely to opt in and upload terrible Python code. If they desperately want to train the idiot machine on my awful self-taught code, that’s on them.
4am@lemmy.zip · 3 days ago:
Name all your variables poorly and with swear words.
Flipper@feddit.org · 3 days ago:
Step one: download a C or C++ repository.
Step two: replace all semicolons with a Greek question mark (U+037E), which looks identical to a semicolon.
Step three: ??
Step four: poison Copilot, so that it randomly inserts Greek question marks that compilers totally choke on.
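The substitution in step two can be sketched in a few lines of Python (a hypothetical illustration of the prank, not anyone’s actual script):

```python
# Hypothetical sketch of step two: swap every ASCII semicolon (U+003B)
# for the Greek question mark (U+037E). The two glyphs render
# identically, but a C/C++ compiler rejects U+037E as a stray token.
SEMICOLON = "\u003b"            # ;
GREEK_QUESTION_MARK = "\u037e"  # ; (looks the same, is not)

def poison(source: str) -> str:
    """Return source with every semicolon replaced by U+037E."""
    return source.replace(SEMICOLON, GREEK_QUESTION_MARK)

snippet = "int main(void) { return 0; }"
poisoned = poison(snippet)
print(snippet == poisoned)  # False: identical on screen, different bytes
```

Worth noting: Unicode normalization (NFC) canonically maps U+037E back to an ordinary semicolon, so any tooling that normalizes its input would silently undo the prank.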
communism@lemmy.ml · 3 days ago:
Don’t worry, the models already spit out poor-quality code.
Quicky@piefed.social · 3 days ago:
> You’re using copilot??
No, you don’t have to use it for it to take your code for training.
4am@lemmy.zip · 3 days ago:
Yeah, all you have to do is commit anything to GitHub. They’re scraping all the code regardless of your preferences. I guarantee it.
FishFace@piefed.social · 3 days ago:
All open source software is being scraped, on GitHub or not!
EldritchFemininity@lemmy.blahaj.zone · 3 days ago:
Why not both? Opt out on one account, use another as poison. If you’re gonna do this, I’d say move all your code to a new account and use the older account to poison; that way they can’t filter the bad out by account age.
> Training AI models on AI content is the fastest route to model collapse.
Artisanal crap code.

> If they desperately want to train the idiot machine on my awful self-taught code, that’s on them.
Chaotic good.

> Don’t worry, the models already spit out poor-quality code.
You’re using copilot??