Digg came back from the dead in January. Open beta launched January 14, 2026. Within hours, AI-generated spam had swamped every moderation tool the team could deploy, and by March 14 the site was gone — not from lack of interest but from an infestation that couldn't be outrun.[1] Two months from launch to shutdown. That's the current shelf life of a platform without aggressive identity verification.

Bot traffic surpassed human traffic in 2024 for the first time in a decade — 51% of all web traffic according to Imperva, with malicious bots alone accounting for 37%.[2] Cloudflare, measuring from its own network, puts the crossover further out: CEO Matthew Prince expects bots to command the majority of all internet traffic by 2027.[3] Overall AI traffic rose 187% in 2025, but the sharpest spike was in agentic browser traffic — AI systems autonomously navigating the web, not people directing them — which grew 7,851% year-over-year.[4] The internet is filling up with software that pretends to be people, and the ratio gets worse every quarter.

That pressure is what pushed Reddit — a platform built on pseudonymity, where usernames are identities and the culture treats anonymity as something close to a constitutional right — to announce that suspected bot accounts will need to verify they're human using passkeys, biometrics, World ID's iris scanning, or in some countries, government-issued ID.[5] Reddit removes roughly 100,000 automated accounts per day. Its co-founder Alexis Ohanian said the quiet part out loud: "I just don't know how to sell face-scanning to redditors or even lurkers."[6]

He's right not to know. Every method for proving you're human now requires revealing something about who you are, and every one of them trades away something the internet was built to protect. The people designing these trades are the same ones who profit from what gets collected.