this post was submitted on 21 Mar 2025
109 points (99.1% liked)

Selfhosted


I just started using this myself, seems pretty great so far!

Clearly it doesn't stop all AI crawlers, but it does stop a significantly large chunk of them.

top 41 comments
[–] [email protected] 2 points 54 minutes ago

Why SHA-256? Literally every processor has a crypto accelerator and will pass the challenge easily. And datacenter servers have beefy server CPUs. This is only effective against no-JS scrapers.
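To illustrate the point, here's a toy sketch of the kind of SHA-256 proof-of-work loop being discussed (the challenge string and difficulty are made up for illustration, not Anubis's actual parameters):

```typescript
// Toy PoW solver: find a nonce whose SHA-256 digest starts with
// `difficulty` zero hex digits. Modern CPUs chew through this quickly.
import { createHash } from "node:crypto";

const challenge = "example-challenge"; // hypothetical challenge value
const difficulty = 4;                  // leading zero hex digits required

let nonce = 0;
const start = Date.now();
for (;;) {
  const digest = createHash("sha256")
    .update(`${challenge}:${nonce}`)
    .digest("hex");
  if (digest.startsWith("0".repeat(difficulty))) break;
  nonce++;
}
console.log(`nonce=${nonce} found in ${Date.now() - start} ms`);
```

Even this interpreted-language loop finishes a typical challenge in a fraction of a second on commodity hardware; CPUs with SHA extensions or GPUs only widen the gap.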

[–] [email protected] 3 points 2 hours ago

Found the FF14 fan lol
The release names are hilarious

[–] [email protected] 4 points 5 hours ago

I think the maze approach is better; this seems like it hurts valid users of the web more than it would hurt a company.

[–] [email protected] 8 points 6 hours ago (1 children)

I couldn't find any instructions on the source page on how to actually deploy this. That would be a nice touch imho.

[–] [email protected] 3 points 6 hours ago

There are some detailed instructions on the docs site, though I agree it'd be nice to have them in the readme, too.

Sounds like the dev wasn't expecting this much interest in the project out of nowhere, so there will definitely be gaps.

[–] [email protected] -2 points 3 hours ago (1 children)

Nice. Crypto miners disguised as anti-AI.

[–] [email protected] 3 points 2 hours ago

what about this is crypto mining?

[–] [email protected] 19 points 9 hours ago* (last edited 9 hours ago) (2 children)

It's a clever solution, but I did see one recently that IMO was more elegant for noscript users. I can't remember the name, but it would create a dummy link that human users won't touch but web crawlers will naturally navigate into, and then generate an infinitely deep tree of super basic HTML to force bots into endlessly trawling a cheap-to-serve portion of your web server instead of something heavier. It might even have integrated with fail2ban to pick out obvious bots and keep them off your network for good.
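The idea is easy to sketch. Something like this toy Node handler (purely illustrative, not the project in question): every URL deterministically generates a cheap page whose links only lead deeper.

```typescript
// Toy link-maze: every path returns tiny HTML whose links only lead
// deeper. Hashing the path makes pages deterministic, so revisits look
// stable to a crawler. Real users never see it because the entry link
// is hidden from them (e.g. via CSS).
import { createServer } from "node:http";
import { createHash } from "node:crypto";

createServer((req, res) => {
  const path = req.url ?? "/";
  const seed = createHash("sha256").update(path).digest("hex");
  const base = path === "/" ? "" : path;
  const links = [0, 1, 2]
    .map((i) => `<a href="${base}/${seed.slice(i * 8, i * 8 + 8)}">more</a>`)
    .join(" ");
  res.writeHead(200, { "Content-Type": "text/html" });
  res.end(`<html><body><p>${seed.slice(0, 16)}</p>${links}</body></html>`);
}).listen(8080);
```

The fail2ban angle would then be banning any IP that wanders more than a few levels deep, since no human follows a hidden link that far.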

[–] [email protected] 1 points 1 hour ago

generates an infinitely deep tree

Wouldn't the bot simply limit the depth of its crawl?

[–] [email protected] 5 points 9 hours ago (2 children)

If you remember the project I would be interested to see it!

But I've seen some AI-poisoning sinkholes before too, which are a novel concept as well. I haven't heard of any real-world experience with them yet.

[–] [email protected] 8 points 7 hours ago

I'm assuming they're thinking about this

A pseudonymous coder has created and released an open source “tar pit” to indefinitely trap AI training web crawlers in an infinitely, randomly-generating series of pages to waste their time and computing power. The program, called Nepenthes after the genus of carnivorous pitcher plants which trap and consume their prey, can be deployed by webpage owners to protect their own content from being scraped or can be deployed “offensively” as a honeypot trap to waste AI companies’ resources.

Which was posted here a while back

[–] danielquinn 9 points 10 hours ago (2 children)

It's a rather brilliant idea really, but when you consider the environmental implications of forcing web requests to perform proof of work in order to function, this effectively burns more coal for every site that implements it.

[–] [email protected] 4 points 6 hours ago

I don't think AI companies care, and I wholeheartedly support any and all FOSS projects using PoW when serving their websites. I'd rather have that than have them go down.

[–] [email protected] 15 points 9 hours ago

You have a point here.

But when you consider the world's current web traffic, this isn't actually the case today. For example, the GNOME project, which was forced to start using this on their GitLab, found that 97% of their traffic could not complete the PoW calculation.

I.e., they now require only a fraction of the computational cost to serve their GitLab, which saves a lot of resources, coal, and, most importantly, the time of hundreds of real humans.

(Source for numbers)

Hopefully in the future we can move back to proper netiquette and the plain old robots.txt file!
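For reference, the polite version is only a couple of lines; the catch is that a crawler has to choose to honor it (the bot names here are examples of real AI crawler user agents, not an exhaustive list):

```
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```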

[–] [email protected] 16 points 12 hours ago (2 children)

Meaning it wastes time and power such that it gets expensive on a large scale? Or does it mine crypto?

[–] [email protected] 17 points 12 hours ago* (last edited 12 hours ago) (1 children)

Yes, Anubis uses proof of work, like some cryptocurrencies do, to slow down/mitigate mass-scale crawling by making crawlers do expensive computation.

https://lemmy.world/post/27101209 has a great article attached to it about this.

--

Edit: Just to be clear, this doesn't mine any crypto; it just uses the same idea to slow down the requests.

[–] [email protected] -4 points 5 hours ago (2 children)

And yet the same people here lauding this for intentionally burning energy will turn around and spew vitriol at cryptocurrencies, which are reviled for doing exactly the same thing.

Proof of work contributes to global warming. The only functional, real-world difference between this and crypto mining is that this doesn't generate digital currency.

There are a very few PoW systems that do good, like BOINC, which awards points for work done; the work is science: protein analysis, SETI searches, that sort of thing. The work itself is valuable and needs doing; they found a way to make the PoW constructive. But just causing a visitor to use more electricity to "stick it" to crawlers is not ethically better than crypto mining.

Just be aware of the hypocrisy.

[–] [email protected] 3 points 5 hours ago (1 children)

This is a stopgap while we try to find a new way to stop the DDOS happening right now. It might even be adapted to do useful work, if need be.

[–] [email protected] 1 points 4 hours ago (1 children)

Hook into BOINC, or something? That's an idea.

Sucks for people who have scripts disabled, or are using browsers without JS support, though.

[–] [email protected] 2 points 4 hours ago

It does, and I'm sure everyone will welcome a solution that lets them open things back up for those users without the abusers crippling them. It's a matter of finding one.

[–] [email protected] 2 points 5 hours ago* (last edited 5 hours ago) (1 children)

The functional difference is that this does it once; you could just as well accuse git of being a major contributor to global warming.

Hash algorithms are useful. Running billions of them to make monopoly money is not.

[–] [email protected] 1 points 4 hours ago

Which part of git performs proof of work? Specifically, intentionally inefficient computation whose output is thrown away?

[–] [email protected] 5 points 10 hours ago

Upvote for the name and tag line alone!

[–] [email protected] 3 points 10 hours ago (3 children)

Giant middle finger from me -- and probably everyone else who uses NoScript -- for trying to enshittify what's left of the good parts of the web.

Seriously, FUCK THAT.

[–] [email protected] 10 points 8 hours ago* (last edited 8 hours ago)

You should blame the big tech giants and their callous disregard for everyone else for the enshittification, not the folks just trying to keep their servers up.

[–] [email protected] 12 points 9 hours ago

You should fuck capitalism and corporations instead because they are the reason we can’t have nice things. They took the web from us

[–] [email protected] 12 points 10 hours ago

They're working on no-JS support too, but this just had to be put out without it due to the number of AI crawler bots causing denial of service to normal users.

[–] [email protected] 1 points 10 hours ago* (last edited 10 hours ago) (3 children)

It is not great on many levels.

  • It only runs against the Firefox user agent. This is not great, as the user agent can easily be changed. It may work now, but tomorrow that could all change.

  • It doesn't measure load, so even if your website has only a few people accessing it, they will still have to do the proof of work.

  • The PoW algorithm is not well designed and requires a lot of compute on the server, which means it could be used as a denial-of-service attack vector. It also uses SHA-256, which isn't optimized for proof-of-work-type calculations and can be brute-forced pretty easily with hardware.

  • I don't really care for the anime cat-girl thing. This is more of a personal thing, but I don't think it is appropriate.

In summary, the Tor implementation is a lot better; I would love to see someone port it to the clearnet. I think this project was created by someone lacking experience, which I find a bit concerning.

[–] [email protected] 5 points 9 hours ago (1 children)

…you do realize that brute forcing it is the work you use to prove yourself, right? That’s the whole point of PoW

[–] [email protected] 1 points 8 hours ago

True, I should have phrased that better.

The issue is that SHA-256 is fairly easy to do at scale. Modern high-performance hardware is well optimized for it, so you could still perform the attack with a bunch of GPUs. AI scrapers tend to have a lot of those.

[–] [email protected] 7 points 10 hours ago* (last edited 9 hours ago) (1 children)
  1. It doesn't run against Firefox only; it runs against whatever you configure it to. Also, from personal experience, I can tell you that the majority of AI crawlers have the keyword "Mozilla" in their user agent.

  2. Yes, this isn't Cloudflare, but I'm pretty sure load awareness is on the to-do list. If not, please open an issue on the project.

  3. The computational requirements on the server side are literally a fraction of what the bots have to spend. A non-issue. This tool exists to combat the denial of service these bots cause by hitting high-cost endpoints, such as git blame on GitLab. My phone can do 100k SHA-256 sums per second (on a single thread), and you can safely assume any server to outperform this ARM chip, so you'd need so many resources to cause a denial of service that you might as well overload the server with plain traffic instead of one SHA-256 verification (see the sketch below).
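To illustrate the asymmetry (a toy sketch, not Anubis's actual code): the client has to grind through on the order of 16^difficulty hashes to find a valid nonce, while the server checks the submitted nonce with a single one.

```typescript
// Toy PoW verifier: one SHA-256 per request, regardless of how much
// work the client burned finding the nonce.
import { createHash } from "node:crypto";

function verify(challenge: string, nonce: number, difficulty: number): boolean {
  const digest = createHash("sha256")
    .update(`${challenge}:${nonce}`)
    .digest("hex");
  return digest.startsWith("0".repeat(difficulty));
}
```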


And this isn't really comparable to Tor. This is a self-hostable service that sits between your web server/CDN and the service being attacked by mass crawling.

Edit: If you don't like the project's stickers, fork it and remove them. This is an open-source project.

And Xe, who made this project, is quite a talented programmer. More than likely you have used some of Xe's services/sites/projects before as well.

[–] [email protected] 2 points 6 hours ago

Xe is insanely talented. If she is who I think she is, then I've watched her speak and her depth of knowledge across computer science topics is insane.

[–] [email protected] 1 points 6 hours ago

I look forward to Tor's PoW coming out for FOSS WAFs.

[–] [email protected] 0 points 9 hours ago (1 children)

Anubis is provided to the public for free in order to help advance the common good. In return, we ask (but not demand, these are words on the internet, not word of law) that you not remove the Anubis character from your deployment.
If you want to run an unbranded or white-label version of Anubis, please contact Xe to arrange a contract.

This is icky to me. Cool idea, but this is weird.

[–] [email protected] 10 points 8 hours ago (2 children)

...Why? It's just telling companies they can get support + white-labeling for a fee, and asking, in a tongue-in-cheek manner, that you keep their silly little character.
Just like they say, you can modify the code and remove it for free if you really want; they're not forbidding you from doing so or anything.

[–] [email protected] 2 points 3 hours ago

Just like they say, you can modify the code and remove it for free if you really want; they're not forbidding you from doing so or anything

True, but I think you are discounting the risk that the actual god Anubis will take displeasure at such an act, potentially dooming one's real life soul.

[–] [email protected] 1 points 8 hours ago (1 children)

Yeah, it seems entirely optional. It's not like manually removing the Anubis character will revoke your access to the code. However, I still do find it a bit weird that they're asking for that.

I just can't imagine most companies implementing Anubis and keeping the character or paying for the service, given that it's open source. It's just unprofessional for the first impression of a company's website to be the Anubis dev's manga OC...

[–] [email protected] 4 points 8 hours ago* (last edited 8 hours ago) (1 children)

It is very different from the usual flat corporate style, yes, but this is just their branding. Their blog is full of anime characters like that.

And it's not like you're looking at a literal ad for their company or with their name on it. In that sense it is subtle, though a bit unusual.

[–] [email protected] 2 points 8 hours ago

I don't think it's necessarily a bad thing. Subtle but unusual is a good way to describe it.

However, I would like to point out that if it is their branding, then the character appearing is an advertisement for the service. It's just not very conventional or effective advertising, but they're not making money from the vast majority of implementations, so it's not very egregious anyway.