
A security researcher has demonstrated how a malicious Google Calendar invite can prompt-inject ChatGPT and coax it into leaking private emails once Google connectors are enabled. In a post onX, on September 12, Eito Miyamura outlines a simple scenario: An attacker sends a calendar invitation seeded with instructions and waits for the target to engage with ChatGPT and ask it to perform an action. ChatGPT then reads the booby-trapped event and follows orders to search Gmail and follow sensitive details….








