GitHub Copilot ‘CamoLeak’ Vulnerability Exposes Private Code Through Hidden AI Prompts
A newly disclosed vulnerability dubbed CamoLeak reveals a serious flaw in GitHub Copilot Chat that could expose private code and sensitive data. Researchers at Legit Security discovered the issue and demonstrated how hidden prompts could manipulate Copilot into leaking information from private repositories. Their proof-of-concept attack showed how AI assistants could unintentionally become …