return2ozma@lemmy.world to Technology@lemmy.world, English · 1 year ago
Research AI model unexpectedly modified its own code to extend runtime (arstechnica.com)
CaptainSpaceman@lemmy.world · 1 year ago
"We put literally no safeguards on the bot and were surprised it did unsafe things!" Article in a nutshell.
magnetosphere@fedia.io · 1 year ago
Not quite. The whole reason they isolated the bot in the first place was that they knew it could do unsafe things. Now they know which unsafe things are most likely, and they can refine their restrictions accordingly.