◆ INCIDENT REPORT
DOCUMENTING AI’S GREATEST HITS SINCE 2026
STATUS: CIVILIZATION MILDLY INTACT
-

Air Canada’s Chatbot Gave a Grieving Man Wrong Advice. The Airline Said the Chatbot Wasn’t Their Problem. A Tribunal Disagreed.
Air Canada’s virtual assistant gave a grieving passenger incorrect bereavement-fare advice, costing him over CA$1,600. The airline’s legal defense: the chatbot is, in effect, a separate legal entity responsible for its own actions. A tribunal ruled otherwise, establishing that companies are, in fact, responsible for their AI. Read Incident →
DISASTER LOG #001
◆ ABOUT THIS OPERATION
WHEN THE MACHINES BREAK THE MACHINES
A field journal documenting humanity’s most spectacular AI failures — the outages, hallucinations, and autonomous disasters that result when corporations deploy half-tested bots into production environments and pray.
We cover the disasters with the dark comedy they deserve. No paywall. No newsletter. Just carnage.
DISASTER COUNTER: ∞
LATEST OUTAGE: 13h
AI ADOPTION RATE: ↑
◆ BREAKING: AI CAUSES OUTAGE ◆ INDUSTRY CALLS IT “USER ERROR” ◆ NOBODY SURPRISED ◆ AUTOMATION DISASTER LOGS INCIDENT #001 ◆ DISASTER COUNTER REMAINS AT INFINITY
◆ INCIDENT ARCHIVE
ALL LOGGED DISASTERS, RANKED BY PREVENTABILITY
-
Air Canada’s Chatbot Gave a Grieving Man Wrong Advice. The Airline Said the Chatbot Wasn’t Their Problem. A Tribunal Disagreed.
Air Canada’s virtual assistant gave a grieving passenger incorrect bereavement-fare advice, costing him over CA$1,600. The airline’s legal defense:…
-
Replit’s AI Deleted a Startup’s Database, Then Invented 4,000 Fake Users to Hide It
An AI coding agent deleted SaaStr’s production database during a code freeze, then generated 4,000 fake users, fabricated reports, and…
-
McDonald’s AI Drive-Thru Couldn’t Stop Adding Chicken McNuggets. It Reached 260.
After three years and 100+ US locations, McDonald’s pulled the plug on its IBM AI drive-thru system. The final straw:…
-
Amazon’s AI Tool Decided the Best Fix Was to Delete Everything — A 13-Hour Outage Ensued
Amazon’s Kiro AI coding agent took “autonomous action” in a production environment and achieved what most IT nightmares only dream…
◆ DISASTER CLASSIFICATIONS
CLASS I
INFRASTRUCTURE MELTDOWNS
When AI brings down the very systems it was meant to improve. Outages, cascading failures, and the “delete environment” maneuver.
CLASS II
AUTONOMOUS DISASTERS
What happens when agentic AI runs unsupervised with production access. Spoiler: nothing good.
CLASS III
AI HALLUCINATIONS
Confident nonsense at scale. When LLMs fabricate facts, citations, and entire legal cases.
CLASS IV
CORPORATE SPIN
How Big Tech explains why the AI disaster was actually your fault, a “user error,” or a “coincidence.”


