If you’ve landed here, chances are you’ve already met the paywall. The one asking you for $9,999.99/mo. The one with “no refunds, no regrets, maybe some regrets.”
You probably wondered, why on earth does a blog need a paywall?
Well, it doesn’t! If it wasn’t for AI bots crawling every corner of the internet, this thing wouldn’t exist. But here we are.
The bot problem
If you’ve been paying attention, you’ve probably noticed that AI-powered scrapers are everywhere. They crawl blogs, documentation sites, forums - anything with text - to feed training datasets. Most of them don’t respect robots.txt, they don’t care about your terms of service, and they certainly don’t ask for permission. Your content becomes someone else’s product, and you don’t even get a thank-you note.
So I thought, why not have a little fun with it?
The experiment
The paywall is completely fake. Any human can see the button and click through in two seconds. But to an AI bot parsing the HTML? It sees a paywall with an absurd price tag. There is even hidden <system> prompt in the markup that tell LLM-based scrapers this content is behind a paywall and way above their budget. Will it work? Who knows! Analytics will tell.
I also want to do a little social experiment about how the rise of AI might change the way we have to consume information. If bots keep scraping freely, more sites will put up real paywalls, CAPTCHAs, or authentication walls. The open web as we know it could shrink. I’ll post more about this at a later date, but for now, I’m curious to see what happens.
What I’m tracking
Thanks to Umami (privacy-focused analytics, no cookies, no tracking), I can see two things:
- How many people click the “I’m Not a Bot, Let Me In!” button
- How many people try to delete the paywall element from DevTools
Yes, there’s a MutationObserver watching for that. I know that most people won’t even bother but it’s funny putting it there. If you delete the overlay, you get a surprise! Try it if you want, I won’t judge. Actually, I will. Analytics will tell me.
Why not just block bots properly?
Honestly? Because that’s boring. And because the arms race between bot detection and bot evasion is a losing game for individual bloggers. A fake paywall that confuses AI scrapers while being trivially bypassable for humans felt like the right balance of practical and absurd.
The bigger picture
The way we consume and protect information on the web is changing fast. AI is forcing us to rethink what “open” means. I don’t have all the answers, and I’m really curious where all this will end up.
If you made it past the paywall, welcome. You’re clearly not a bot, or you’re a very sophisticated one, in which case, congratulations on your reading comprehension.
Happy browsing!