Attack References
It’s disturbingly easy to trick AI into doing something deadly
GhostStripe attack haunts self-driving cars by making them ignore road signs
MadRadar hack can make self-driving cars 'hallucinate' imaginary vehicles and veer dangerously off course
Two big computer vision papers boost prospect of safer self-driving vehicles
References
SecML: Secure and Explainable Machine Learning in Python
ChatGPT Prompt Engineering for Developers
Prompt Engineering Guide
Google's Generative AI learning path
A jargon-free explanation of how AI large language models work
Pinecone Makes Accurate, Fast, Scalable Generative AI Accessible to Organizations Large and Small with Launch of its Serverless Vector Database
Pinecone Vector Database
Free Training Building Applications with Vector Databases
The Databricks Data Intelligence Platform
Attention in transformers, visually explained