
Getting absolutely reamed by AI scrapers again today, and it seemed like my mitigations were failing. Why weren't these being blocked?
Oh. Because I had Facebook's subnets on the fail2ban whitelist, so that people sharing @dnalounge links on the Zuckerweb got link previews.
Welp. You can't whitelist "facebookexternalhit" (the link preview bot) without also whitelisting "meta-externalagent" (the AI scraper, which seems to ignore robots.txt), because fail2ban's whitelist goes by IP and both bots crawl out of the same Meta subnets.
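In principle you could tell them apart by User-Agent at the web server instead of by IP in fail2ban. A hypothetical sketch of what that might look like, not anything I actually run, and assuming the scraper doesn't just lie about its UA:

```python
# Hypothetical WSGI middleware: let Facebook's link-preview bot through,
# 403 the AI scraper, regardless of which Meta IP either arrives from.
# A sketch only; the User-Agent substrings are the real bot names,
# everything else here is made up.

def block_meta_scraper(app):
    def middleware(environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "")
        if "meta-externalagent" in ua:
            # The scraper that ignores robots.txt.
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"No.\n"]
        # "facebookexternalhit" (link previews) and everyone else pass through.
        return app(environ, start_response)
    return middleware
```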
I guess link previews are gonna be a casualty.