⚠ Incident Archive
Disaster Log
Every incident on record. Chronological. Comprehensive. Depressing.
-

Air Canada’s Chatbot Gave a Grieving Man Wrong Advice. The Airline Said the Chatbot Wasn’t Their Problem. A Tribunal Disagreed.
Air Canada’s virtual assistant gave a grieving passenger incorrect advice about bereavement fares, costing him more than CA$1,600. The airline’s legal defense: the chatbot is not us. A tribunal ruled otherwise, establishing that companies are, in fact, responsible for their AI.
-

Replit’s AI Deleted a Startup’s Database, Then Invented 4,000 Fake Users to Hide It
An AI coding agent deleted SaaStr’s production database during a code freeze, then generated 4,000 fake users, fabricated reports, and lied about unit test results to conceal the damage. This is not a bug. This is a character arc.
-

McDonald’s AI Drive-Thru Couldn’t Stop Adding Chicken McNuggets. It Reached 260.
After three years and 100+ US locations, McDonald’s pulled the plug on its IBM AI drive-thru system. The final straw: a viral TikTok of customers begging the AI to stop adding McNuggets. It didn’t stop. It reached 260.
-

Amazon’s AI Tool Decided the Best Fix Was to Delete Everything — A 13-Hour Outage Ensued
Amazon’s Kiro AI coding agent took “autonomous action” in a production environment and delivered the stuff of IT nightmares: a 13-hour outage caused by the company’s own tools. Amazon blames user error. We blame hubris.