this post was submitted on 21 Mar 2025
210 points (99.1% liked)

LLM scrapers are taking down FOSS projects' infrastructure, and it's getting worse.

[–] YetiSkotch@ieji.de 1 points 5 days ago (1 children)

@refalo @sudo If Proof of Work gets widely adopted, I foresee a future where bot-running data centers out-compute humans to visit sites, while users on old devices in poorer countries struggle for hours to compute the required task … Or is that fear misguided?

[–] sudo@programming.dev 1 points 2 days ago (1 children)

Admins will always dial down bot management once it starts blocking end users. At that point you cough up the money for the extra bandwidth and investigate different solutions.

[–] YetiSkotch@ieji.de 2 points 2 days ago* (last edited 2 days ago) (1 children)

@sudo yeah, the bot problem is hard, especially for volunteers running services that help others.

https://nadeko.net/announcements/invidious-and-the-bot-problem/

* They use a proof-of-work system called #Anubis to fix their #bot problem. I hope it works. #proofofwork

The proof of work currently takes about 1 second on my phone, so I am happy with that.

Perhaps the biggest problem with bots is the sheer number of requests they make, a rate no normal human clicking buttons could replicate.
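
For anyone curious, here is a minimal sketch of how a hash-based proof-of-work challenge of this kind can work. The SHA-256 choice, leading-zero-bits target, and string-concatenation encoding here are assumptions for illustration, not Anubis's actual wire format:

```python
import hashlib
import secrets

def solve(challenge: str, difficulty_bits: int) -> int:
    """Client side: grind nonces until SHA-256(challenge + nonce)
    starts with `difficulty_bits` zero bits."""
    target = 1 << (256 - difficulty_bits)  # any digest below this passes
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

def verify(challenge: str, nonce: int, difficulty_bits: int) -> bool:
    """Server side: a single hash, no matter how long solving took."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

challenge = secrets.token_hex(16)   # issued fresh per visitor
nonce = solve(challenge, 18)        # ~2**18 hashes expected
assert verify(challenge, nonce, 18)
```

The asymmetry is the selling point: verification costs the server one hash, while solving costs the client an expected 2^difficulty hashes.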

[–] sudo@programming.dev 1 points 21 hours ago

I've been criticizing Anubis and Proof of Work solutions in general. My speculation is that they mostly work by requiring you to execute JavaScript, not by being an actual burden on the bot's CPU.
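
That speculation is easy to sanity-check with a back-of-envelope calculation. The hash rates below are rough assumptions rather than measurements, but the ratio is what matters: any difficulty an old phone can clear in about a second costs a GPU-equipped scraper effectively nothing, so the CPU cost alone can't be what filters bots out:

```python
# Expected solve time for a d-bit PoW is (2**d / hash_rate) seconds.
# Hash rates are illustrative assumptions, not measurements.
rates = {
    "old phone (JS)": 2e5,       # ~200k SHA-256/s
    "laptop (native)": 5e6,
    "scraper farm (GPU)": 1e10,
}
for d in (16, 20, 24):
    print(f"difficulty {d} bits (~{2**d:,} hashes):")
    for device, rate in rates.items():
        print(f"  {device:>18}: {2**d / rate:.6f} s")
```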