Kid@sh.itjust.works to Cybersecurity@sh.itjust.works · English · 19 days ago
A nearly undetectable LLM attack needs only a handful of poisoned samples - Help Net Security (www.helpnetsecurity.com)
Kairos@lemmy.today · 19 days ago
Is the “attack” the fact that LLMs fundamentally can’t distinguish between instructions and data?