this post was submitted on 07 Jul 2025
571 points (98.3% liked)

Open Source

[–] [email protected] 21 points 4 days ago (7 children)

What advantage does this software provide over simply banning bots via robots.txt?

[–] [email protected] 83 points 4 days ago

Robots.txt expects the client to respect the rules, for instance by identifying itself as a scraper.

AI scrapers don't respect this trust, and thus robots.txt is meaningless.
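
For context, here's a rough Python sketch of what a well-behaved crawler is supposed to do before fetching a page (the URL and user-agent below are just placeholders). Nothing stops a bad-faith scraper from skipping this check entirely, which is the whole problem.

```python
# Minimal sketch of what a polite crawler does before fetching a page.
# The target URL and user-agent string are placeholders, not any real bot.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://example.org/robots.txt")
rp.read()  # fetch and parse the site's robots.txt

user_agent = "ExampleBot"  # a polite bot identifies itself honestly
url = "https://example.org/private/data.html"

if rp.can_fetch(user_agent, url):
    print("Allowed to fetch", url)
else:
    print("robots.txt asks us not to fetch", url)
    # A polite crawler stops here; a bad-faith scraper just fetches anyway.
```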

[–] [email protected] 42 points 4 days ago

Well, now that y'all put it that way, I think it was pretty naive of me to think that these companies, whose business model is basically theft, would honour a lousy robots.txt file...

[–] [email protected] 28 points 4 days ago

TL;DR: You should have both due to the explicit breaking of the robots.txt contract by AI companies.

AI scrapers generally don't obey robots.txt. That file just tells scrapers what they shouldn't scrape and relies entirely on their good faith. Many AI companies have explicitly chosen not to comply with robots.txt, breaking that contract, so this is a system that makes non-compliant scrapers get stuck in a black hole of junk and waste their own time.

It's a countermeasure, not a solution, but it's far less complex than options that outright block these connections and then leave you getting pounded with retries. This way the scraper bot gets stuck for a while and you don't burn as many of your own resources blocking it over and over again.
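
To make that concrete, here's a toy sketch of the "black hole of junk" idea: every response is slow and only links to more junk pages, so a scraper that ignores robots.txt burns its own time crawling nothing. This is only an illustration of the concept, not the actual tool's implementation; the paths, delay, and junk generator are made up.

```python
# Toy sketch of a scraper tarpit: every page is slow to load and links only
# to more junk pages, so a non-compliant scraper wastes its own time.
# NOT how the real tool works; paths and timings are invented for illustration.
import random
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

class TarpitHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        time.sleep(2)  # drip-feed responses to slow the scraper down
        # Generate a page of meaningless links that lead back into the maze.
        links = "".join(
            f'<a href="/maze/{random.getrandbits(32):x}">page</a>\n'
            for _ in range(20)
        )
        body = f"<html><body>{links}</body></html>".encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), TarpitHandler).serve_forever()
```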

[–] [email protected] 30 points 4 days ago

The scrapers ignore robots.txt. It doesn't really ban them - it just asks them not to access things, and they're programmed by assholes.

[–] [email protected] 14 points 4 days ago

The difference is:

  • robots.txt is a promise without a door
  • Anubis is a physical closed door that opens after some time (rough sketch below)
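
Roughly, the "door" works like a proof-of-work challenge: the visitor has to burn some CPU finding a hash that meets a difficulty target before the server lets it through, which is cheap for one human but expensive at scraper scale. Here's a hedged sketch of that idea; the challenge format and difficulty are illustrative, not Anubis's actual protocol.

```python
# Hedged sketch of a proof-of-work "door": the client must find a nonce whose
# hash starts with a given number of zero hex digits before it gets access.
# Challenge format and difficulty are made up for illustration.
import hashlib

def solve(challenge: str, difficulty: int) -> int:
    """Client side: burn CPU until sha256(challenge + nonce) has enough leading zeros."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce
        nonce += 1

def verify(challenge: str, nonce: int, difficulty: int) -> bool:
    """Server side: a single cheap hash to check the client really did the work."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

if __name__ == "__main__":
    challenge = "made-up-challenge-token"   # would normally be random per visitor
    difficulty = 4                          # more zeros = the door stays closed longer
    nonce = solve(challenge, difficulty)    # the "some time" spent at the closed door
    print("solved with nonce", nonce, "valid:", verify(challenge, nonce, difficulty))
```
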
[–] [email protected] 6 points 4 days ago

The problem is AI doesn't follow robots.txt, so Cloudflare and Anubis each developed a solution.

[–] [email protected] -5 points 4 days ago

I mean, you could have read the article before asking, it's literally in there...