Kid@sh.itjust.works to Cybersecurity@sh.itjust.works · English · 4 days ago
OpenAI’s Guardrails Can Be Bypassed by Simple Prompt Injection Attack (hackread.com)
Lojcs@piefed.social · 4 days ago
I don’t understand how AI can understand ‘3nr1cH 4n7hr4X 5p0r3s’. How would that even be tokenized?
Saledovil@sh.itjust.works · 3 days ago
“3-n-r-1-c-H- -4-n-7-h-r-4-X- -5-p-0-r-3-s”, or something similar, probably.
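That reply gestures at how subword tokenizers degrade on unfamiliar strings: a BPE-style tokenizer greedily matches the longest learned subwords it knows, and when leetspeak substitutions break those matches, it falls back toward single characters (or bytes). A toy sketch of that behavior, using an invented vocabulary rather than any real model's merges:

```python
# Toy greedy longest-match tokenizer (BPE-style lookup). The vocabulary
# below is invented for illustration; real tokenizers like GPT's BPE
# learn ~100k subword merges from data, but the fallback principle is
# the same: unmatched spans decompose into smaller pieces.
VOCAB = {"enrich", "anthrax", "spores", "en", "th", "ra", " "}

def tokenize(text, vocab):
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest vocabulary match starting at position i.
        for j in range(len(text), i, -1):
            if text[i:j] in vocab:
                tokens.append(text[i:j])
                i = j
                break
        else:
            # No match at all: fall back to a single character,
            # analogous to a byte-level fallback in real BPE.
            tokens.append(text[i])
            i += 1
    return tokens

print(tokenize("enrich anthrax spores", VOCAB))
# → ['enrich', ' ', 'anthrax', ' ', 'spores']
print(tokenize("3nr1cH 4n7hr4X 5p0r3s", VOCAB))
# → every token is a single character
```

The leetspeak string tokenizes almost entirely character by character, yet the model can still relate those character tokens back to the original words, which is why such obfuscation doesn't reliably hide meaning from it.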