Decrypt (@Decrypt) · 2026-04-21 22:02
Researchers say a prompt injection bug in Google's Antigravity AI coding tool could have let attackers run commands, despite safeguards. https://t.co/aguTa8W4FB ...