this post was submitted on 08 Mar 2025
763 points (98.1% liked)
Technology
you are viewing a single comment's thread
I don't mind AI. It is simply a reflection of whoever is in charge of it. Unfortunately, we have monsters who direct humans and AI alike to commit atrocities.
We need to get rid of the demons, else humanity as a whole will continue to suffer.
not even that. it's an inherently more regressive version of whatever data that person feeds it.
there are two arguments for deploying this shit outside of very narrow laboratory uses, where everyone was already using other statistical models:
A. this is one last grasp at Fukuyama's 'end of history', one last desperate scream of a liberal order that wants to be regressive shitheads and build the abdication machine as its grand industrial-philosophical project, so they can do whatever horrible shit they want and claim they're still compassionate and only doing it because the computer said so.
B. this is a project by literal monarchists. people who wish to kill democracy, to murder truth and collaboration and replace them with blind tribalistic loyalty to a führer/king. the rhetoric coming from a lot of the funders of these things supports this.
this technology is existentially evil, and will be the end of our society either way. it must be stopped. the people who work on it must be stopped. the people who fund it must be hanged.
I mean yes, but it can be VERY useful in those narrow laboratory use cases.
I'm skeptical but open to that. it's just that these models are being pushed into literally everything, to the point they're hard to avoid. I can't think of another kind of specialized lab tool that has had that done. I do not own, nor have I ever owned, a sample centrifuge. I don't have CRISPR tools. I have never, outside of academic settings, opened Wolfram Alpha on my home computer. even AutoCAD and SolidWorks are specialist tools, and I haven't touched any version of either in years.
because these models, while not good for anything anyone should ever actually want outside a lab setting, are also very very good for fascism. they do everything a fascist needs done, aside from the actual physical killing.
and I don't think the level of development and deployment these tools get, along with the wildly inflated price of the hardware to run them (or anything else), the death of web search, the damage to academic journals, etc., adds up to a net benefit. not even to specialized researchers who have uses for specialized versions of them as the statistical tool they are, and certainly not to those fields over the long term.
Why shouldn't they have long-term benefits for researchers?
Reminds me a bit of when CRISPR got big: people worried to no end about potential dangers, designer babies, bioterrorism ("everybody can make a killer virus in their garage now"), etc. In reality, it has been a huge leap forward for molecular biology and has vastly helped research, cancer treatment, drug development and many other things. I think machine learning could have a similar impact. It's already being used in the development of new drugs, in genomics, and in the detection of tumours, just to name a few areas.
because murdering truth is not good for science. fascism is not good for science funding. researchers use search engines all the time. academia is struggling with an LLM fraud problem.