
Boosted by aral@mastodon.ar.al ("Aral Balkan"):
davidgerard@circumstances.run ("David Gerard") wrote:
oh this is comedy gold. you can prompt-inject a chatbot via unicode fuckery: invisible "tag" characters the UI renders as nothing, but the model still reads
https://embracethered.com/blog/posts/2024/hiding-and-finding-text-with-unicode-tags/
"which attacks are LLMs vulnerable to?" "yes"