Kevin Lin
quasicoherence.bsky.social
Kevin Lin
@quasicoherence.bsky.social
One issue with LLMs is that there is no fundamental solution to prompt injection. In theory, any post on the board can be a prompt injection that compromises every single LLM that reads it.
February 2, 2026 at 4:39 PM