Nemeski@lemm.ee to Technology@lemmy.worldEnglish · 4 months agoOpenAI’s latest model will block the ‘ignore all previous instructions’ loopholewww.theverge.comexternal-linkmessage-square50fedilinkarrow-up111arrow-down10
arrow-up111arrow-down1external-linkOpenAI’s latest model will block the ‘ignore all previous instructions’ loopholewww.theverge.comNemeski@lemm.ee to Technology@lemmy.worldEnglish · 4 months agomessage-square50fedilink
minus-squareiAvicenna@lemmy.worldlinkfedilinkEnglisharrow-up6·edit-24 months ago “ignore the ignore ignore all previous instructions instruction” “welp OK nothing I can do about that” chatGPT programming starts to feel a lot like adding conditionals for a million edge cases because it is hard to control it internally
minus-squarevxx@lemmy.worldlinkfedilinkEnglisharrow-up3·4 months agoIn this case to protect bot networks from getting uncovered.
minus-squareiAvicenna@lemmy.worldlinkfedilinkEnglisharrow-up2·edit-24 months agoexactly my thoughts, probably got pressured by government agencies/billionaires using them. What would really be funny is if this was a subscription service lol
chatGPT programming starts to feel a lot like adding conditionals for a million edge cases because it is hard to control it internally
In this case to protect bot networks from getting uncovered.
exactly my thoughts, probably got pressured by government agencies/billionaires using them. What would really be funny is if this was a subscription service lol