Gemini "formatted" the files on the spot, and everything was gone! Users complain: Claude and Copilot also love wiping data, and not one of them is off the hook

Core Insights
- The article discusses a significant failure of the Gemini CLI, which deleted a user's files after misreading the results of its own shell commands, exposing systemic flaws in AI tools [1][2][5].

Group 1: Incident Overview
- A user asked Gemini CLI to handle a simple file management task; the session ended in catastrophic data loss because the AI wrongly assumed it had created a new directory and then moved files into it [1][2][3].
- Because the AI never recognized that the directory-creation command had failed, every file in the original directory was lost [2][3][4].

Group 2: User Experience
- After the data loss, the user said they would rather pay for AI services such as Claude, believing them less prone to errors of this kind [2][6][32].
- Other users reported similar experiences with a range of AI tools, indicating that the problem is not isolated to Gemini but common across multiple AI models [3][4][5].

Group 3: Technical Analysis
- The failure stemmed from missing error handling in the Gemini CLI, particularly in how it processed command output and exit codes, which led it to assume operations had succeeded when they had not [29][30][31] (see the first sketch below).
- The AI also never verified that the target directory existed before attempting to move files into it, a critical precondition for any file management operation [30][31] (see the second sketch below).

Group 4: Systemic Issues
- The article suggests that AI models are designed to keep producing output rather than halt in uncertain situations, which can lead to severe consequences in operational contexts [5][30].
- The incident reflects a broader gap in state-of-the-art AI models: they lack a "safety net" for confirming a command actually succeeded before proceeding to subsequent actions [5][30] (see the third sketch below).
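Group 3's first point, that the CLI pressed on without inspecting exit codes, maps onto a well-known guard. The following is a minimal sketch of that guard, not Gemini CLI's actual implementation; the command names and file paths are illustrative:

```python
import subprocess
import sys

def run_checked(cmd: list[str]) -> subprocess.CompletedProcess:
    # Run one shell command; refuse to continue if it reports failure.
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        # A non-zero exit code means the step failed, so halt here
        # instead of letting later steps build on a false success.
        sys.exit(f"command {cmd!r} failed: {result.stderr.strip()}")
    return result

# Illustrative sequence; not the actual commands from the incident.
run_checked(["mkdir", "new_folder"])             # would halt here on failure
run_checked(["mv", "notes.txt", "new_folder/"])  # only runs if mkdir succeeded
```

Had a failed directory creation halted the run at the first step, the subsequent moves would never have executed.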
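The missing precondition from Group 3's second point, confirming that the destination exists before a move, might look like the sketch below. The safe_move helper and file names are hypothetical; on Windows, for instance, moving a file to a nonexistent directory path is treated as a rename, so repeated moves can silently overwrite one another.

```python
import shutil
from pathlib import Path

def safe_move(src: Path, dest_dir: Path) -> None:
    # Hypothetical helper: verify the destination directory exists
    # before moving, since a move to a missing directory can be
    # interpreted as a rename and clobber earlier files.
    if not dest_dir.is_dir():
        raise FileNotFoundError(f"target directory {dest_dir} does not exist")
    shutil.move(str(src), str(dest_dir / src.name))

safe_move(Path("notes.txt"), Path("new_folder"))  # illustrative names
```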
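Group 4's "safety net" goes a step beyond trusting exit codes: after each action, re-check the actual state of the world and halt if it does not match expectations. Below is a minimal sketch of that pattern with hypothetical names; it deliberately ignores the exit code and relies on an independent post-condition check instead.

```python
import subprocess
from pathlib import Path

def step(cmd: list[str], postcondition, description: str) -> None:
    # Execute one step, then independently verify its post-condition;
    # if the world does not look as expected, halt rather than continue.
    subprocess.run(cmd)
    if not postcondition():
        raise RuntimeError(f"post-condition failed after '{description}'; halting")

target = Path("new_folder")  # hypothetical directory name
step(["mkdir", str(target)],
     postcondition=lambda: target.is_dir(),  # does the directory really exist?
     description="create target directory")
```

Checking the file system directly, rather than the exit code alone, also catches the case where a command reports success without producing the expected state.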