Kid@sh.itjust.worksM to Cybersecurity@sh.itjust.worksEnglish · 1 month agoA nearly undetectable LLM attack needs only a handful of poisoned samples - Help Net Securitywww.helpnetsecurity.comexternal-linkmessage-square1linkfedilinkarrow-up10arrow-down10
arrow-up10arrow-down1external-linkA nearly undetectable LLM attack needs only a handful of poisoned samples - Help Net Securitywww.helpnetsecurity.comKid@sh.itjust.worksM to Cybersecurity@sh.itjust.worksEnglish · 1 month agomessage-square1linkfedilink
minus-squareKairos@lemmy.todaylinkfedilinkEnglisharrow-up1·1 month agoIs the “attack” the fact that LLM fundamentally can’t distinguish between instructions and data?