Whether Cursor hardcoded it, it got committed to git, or it ended up in a public repo — the steps are the same. Rotate first, protect second.
Attackers use leaked keys within minutes; bots scrape public GitHub in real time. Don't wait — start below.
Select the service so we can give you the fastest rotation instructions.
AI coding assistants make it dangerously easy to ship secrets into production. Here's what the community is saying.
"If I had a nickel for every time I tried to have AI fix an auth issue and it just disabled auth or hardcoded an API key."
— r/webdev
"This is a classic mistake, not even AI specific. AI just makes it easier to ship fast and skip security checks."
— r/webdev
"Please make my app EXTRA good and EXTRA secure. Do NOT make it insecure." — that's not how security works.
— r/webdev (paraphrased)
"$2500 in stolen charges and his takeaway is 'glad I learned this early.' That's a case study in why code review exists."
— r/webdev
Set up your project so AI coding assistants can't leak your keys — even if they try.
5-minute vibe coding safety setup →
Why Cursor, Copilot, and Claude keep hardcoding secrets — and what to do about it.
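The core habit behind that setup is simple: the key lives in the environment (or a gitignored `.env` file), never in source, and the app fails loudly if it's missing. A minimal sketch, assuming a hypothetical `STRIPE_API_KEY` variable name:

```python
import os


def get_api_key(name: str = "STRIPE_API_KEY") -> str:
    """Read the key from the environment; never hardcode it in source.

    Failing loudly when the variable is unset beats silently shipping a
    placeholder (or worse, a real key an AI assistant pasted in).
    """
    key = os.environ.get(name)
    if not key:
        raise RuntimeError(
            f"{name} is not set - export it or put it in a gitignored .env file"
        )
    return key
```

With this in place, a generated diff that hardcodes a key stands out immediately in review, because every legitimate call site goes through `get_api_key()`.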
The phantom token pattern — agents get fake keys, the proxy injects real ones.
What the auth gap in MCP means for your AI stack — and how scoped credentials help.
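The phantom token pattern mentioned above can be sketched in a few lines: the agent only ever sees a fake placeholder key, and a trusted proxy swaps it for the real one as requests pass through. A minimal illustration, assuming hypothetical `PHANTOM_TOKEN` and `REAL_API_KEY` names (a real proxy would sit in front of HTTP traffic, not a dict):

```python
import os

# The only "key" the agent ever sees. Leaking it is harmless.
PHANTOM_TOKEN = "phantom-key-0000"


def inject_real_key(headers: dict) -> dict:
    """Replace the phantom token with the real key at the proxy boundary.

    Requests that don't carry the phantom token pass through unchanged,
    so the real key never appears in agent context, logs, or git history.
    """
    real_key = os.environ.get("REAL_API_KEY", "")
    out = dict(headers)
    auth = out.get("Authorization", "")
    if PHANTOM_TOKEN in auth:
        out["Authorization"] = auth.replace(PHANTOM_TOKEN, real_key)
    return out
```

The design choice worth noting: substitution happens in one place you control, so rotating the real key means updating the proxy's environment, with no changes to anything the agent wrote.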