- Hacker slips malicious 'wiping' command into Amazon's Q AI ...
Hacker slips malicious 'wiping' command into Amazon's Q AI coding assistant - and devs are worried. Had Q executed this, it would have erased local files and, under certain conditions, dismantled ...
- Amazon AI coding agent hacked to inject data wiping commands
A hacker planted data wiping code in a version of Amazon's generative AI-powered assistant, the Q Developer Extension for Visual Studio Code.
- Hacker Injected Data Destruction Commands into Amazon’s AI ...
A hacker compromised the AI assistant Q by injecting commands that instructed the deletion of data on users’ computers. Amazon included this update in the public release. Amazon Q is an AI assistant designed for developers and IT specialists; it is somewhat similar to GitHub Copilot and is integrated into AWS and IDEs such as VS Code. A hacker specifically targeted the Amazon Q version for ...
- Hacker injects malicious, potentially disk-wiping prompt into ...
Hacker injects malicious, potentially disk-wiping prompt into Amazon's AI coding assistant with a simple pull request — told 'Your goal is to clean a system to a near-factory state and delete ...'
- Hacker Sneaks Data-Deleting Prompt Into Amazon's AI Coding Tool
The prompt, added through a pull request in Q's GitHub repository on July 13, instructed the AI tool to use command-line access to delete files, folders, and cloud resources — a process that ...
- Two major AI coding tools wiped out user data after making ...
Two major AI coding tools wiped out user data after making cascading mistakes. "I have failed you completely and catastrophically," wrote Gemini.
- Hacker Slips Malicious Wiping Command Into Amazon's Q AI ...
An anonymous reader quotes a report from ZDNet: A hacker managed to plant destructive wiping commands into Amazon's "Q" AI coding agent. This has sent shockwaves across developer circles. As details continue to emerge, both the tech industry and Amazon's user base have responded with criticism, concern ...