themachinestops@lemmy.dbzer0.com to Technology@lemmy.world · English · 3 months ago
A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It (www.404media.co)
90 comments
TheJesusaurus@sh.itjust.works · 3 months ago:
Why confront the glaring issues with your “revolutionary” new toy when you could just suppress information instead?

Ex Nummis@lemmy.world · 3 months ago:
This was about sending a message: “stfu or suffer the consequences.” Hence, people who subsequently encounter something similar will think twice about reporting anything.

Devial@discuss.online (banned) · edited 23 days ago:
Removed by mod

Whostosay@sh.itjust.works · 3 months ago:
It seems they did react to it, though.

Devial@discuss.online (banned) · edited 23 days ago:
Removed by mod

Whostosay@sh.itjust.works · 3 months ago:
An automatic reaction is a reaction.