Session
Prompt Hacking and How to Safeguard Your LLM with Nvidia NeMo Guardrails
Presented at Bell Cloud Day Conference in Toronto & Montreal - June 2024
Presenting different methods bad actors use to circumvent traditional LLM prompt guards via prompt injection and prompt hacking and how to safeguard your LLM from these techniques using NeMo Guardrails from Nvidia. Will walk through how to setup NeMo Guardrails, how to implement various guards and rails and demo them in action using the NeMo Guardrails server.

Farshad Ghodsian
Sr. Technical Product Manager - AI Infrastructure & MLOps @ AMD
Links
Please note that Sessionize is not responsible for the accuracy or validity of the data provided by speakers. If you suspect this profile to be fake or spam, please let us know.
Jump to top