Prompt Injection Detection in LLaMA-based Chatbots using LLM Guard
-
Updated
Jul 5, 2025 - Jupyter Notebook
Prompt Injection Detection in LLaMA-based Chatbots using LLM Guard
Add a description, image, and links to the llmguard topic page so that developers can more easily learn about it.
To associate your repository with the llmguard topic, visit your repo's landing page and select "manage topics."