Hallucinated packages are the sleeper threat. LLMs regularly invent package names that don't exist. One study found that nearly 20% of AI-recommended packages were fabrications, and 43% of those hallucinated names appeared consistently across queries.
AI hallucinations are creating a new attack vector known as "slopsquatting." An attacker registers the fake package names that AI tools repeatedly recommend, fills them with malicious code, and waits for unsuspecting developers or AI systems to install them. It is a sobering attack, because it exploits a flaw inherent to the models themselves.
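One practical mitigation is to gate installs behind a vetted allowlist, so a hallucinated name never reaches `pip install`. The sketch below is illustrative, not a complete defense: the `KNOWN_GOOD` set is a hypothetical stand-in for a pinned, reviewed lockfile, and in practice you would also cross-check names against the registry itself (PyPI exposes package metadata at `https://pypi.org/pypi/<name>/json`) and inspect signals like package age and download counts.

```python
# Minimal pre-install gate against slopsquatting: only names on a vetted
# allowlist pass through. KNOWN_GOOD is a hypothetical example set; load
# it from your organization's reviewed lockfile in real use.
KNOWN_GOOD = {"requests", "numpy", "flask"}

def vet_packages(requested):
    """Split requested package names into (allowed, suspect) lists."""
    allowed = [p for p in requested if p.lower() in KNOWN_GOOD]
    suspect = [p for p in requested if p.lower() not in KNOWN_GOOD]
    return allowed, suspect

# A plausible-sounding but unvetted name gets flagged instead of installed.
allowed, suspect = vet_packages(["requests", "flask-restful-pro"])
print(allowed)   # ['requests']
print(suspect)   # ['flask-restful-pro']
```

The key design point is to fail closed: anything not explicitly vetted is treated as suspect, which is exactly the posture that defeats an attacker squatting on names no human ever chose.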