Prompt Injection Attack
What is a Prompt Injection Attack? A prompt injection attack is a cybersecurity threat that targets large language models (LLMs) and generative AI systems by manipulating the prompts or instructions given to the model. In a prompt injection attack, an attacker crafts malicious input designed to override the model’s intended behavior, bypass built-in safeguards, [...]
