Badland9085

joined 2 years ago
[–] Badland9085@lemm.ee 3 points 3 weeks ago

I don’t think I’ve ever known of a “designated tourist area” in Japan. I lived there for a month in a town where I doubt any foreign tourists go, and nobody harassed me or my friends. The locals and police even helped us out when one of my friends lost their wallet, and they were super patient about it too.

[–] Badland9085@lemm.ee 2 points 3 weeks ago (1 children)

I’ve read (in Japanese, written by Japanese people) that those places will actually welcome you in if you speak Japanese and are respectful of the place and the other patrons, the latter two being conditions any respectable human should meet anyway. It could be that they’ve put those signs up due to xenophobia, but there are also owners who just don’t want to deal with people who don’t respect their culture.

[–] Badland9085@lemm.ee 3 points 3 weeks ago

Which will in turn affect business. Nobody can buy their stuff if nobody has the money for it, unless their plan is just to keep the wealthy alive while making them poorer at the same time.

It might take a while for this to play out though.

[–] Badland9085@lemm.ee 6 points 3 weeks ago

Fwiw, the level of fentanyl crossings isn’t even high; it’s simply a wedge issue inflated by the Trump administration to stave off dissenters and forcibly push past Congress to do whatever they want.

[–] Badland9085@lemm.ee 2 points 1 month ago

You could create an account that blocks communities for news and technology, and any other communities with a high likelihood of reporting on current events. Just switch to that account on days when you just don’t want to read such news, for whatever reason you may have (it’s understandable; it can be draining).

This should be a no-brainer, but Lemmy doesn’t really filter anything out by default unless the admins decide to, so that works as long as you’ve created the account on a fairly well-managed instance. Given the current news cycle, especially in the Western, English-speaking world, you otherwise won’t be able to escape Trump and Musk, especially when they’re dominating headlines by literally affecting the lives of millions, if not billions, of people.

[–] Badland9085@lemm.ee 8 points 1 month ago

For some, human pride and dignity have literally no value, or are something they hold onto only to trade away whenever convenient, especially in a world that will pay for them, so they’re treated just like a commodity.

[–] Badland9085@lemm.ee 2 points 1 month ago

It’s the 21st century. Many of us are educated enough and have a strong enough image of what a country is. Any country may try to annex any land, but it will almost always face resistance. Even in the event of a full annexation, you can’t stop the people from revolting, essentially making your country look as miserable as possible to everyone. Heck, even the full cleansing of an entire population won’t guarantee long-lasting stability on annexed lands; people will hide, repopulate, teach their descendants about their past, and forever torment your nation and its people, however horrifying a worldview that may sound.

I remember reading somewhere that some department in the US government has a paper on its inability to annex or even control foreign lands and their people. Essentially, it doesn’t matter that the USA has the most powerful military in the history of humanity; it cannot conquer the minds of people today, and it would suffer from instability for a very long time.

[–] Badland9085@lemm.ee 201 points 1 month ago

Says the company that literally crawled the Internet without anyone’s permission to train their damn model.

Rules for thee, not for me.

[–] Badland9085@lemm.ee 2 points 1 month ago

You come from a healthy background, is what I’m hearing. And that’s good, and I don’t mean that in a derogatory way. What you have there is absolutely the right mindset. These tools are made by humans, who have their own set of problems they want to solve with their tools. A given tool may not be the best, but it can work pretty damn well.

However, it’s also not uncommon to see communities rage and fight over the superiority of their tools, if not simply shun those they think are inferior. It’s blatantly childish or tribalistic behaviour, depending on how you look at humanity. And you’ll see this outside of programming too: in the office, in town, on the streets. People engage in this behaviour to show that “I am on your side”, for whichever side they think is the right or superior one, based on factors like a perception of group size, of power, or of closeness. It appeals to a common human desire to belong to a strong group. It appeals to the human desire to feel safe. And when you start looking at it that way, it’s not too different from how animals behave. It’s important to note that not all humans feel the same amount of desire for this sort of tribe, or would give in to that desire and engage in such behaviours, but it’s not surprising to see.

In any case, this article is essentially a callout of that sort of toxic behaviour, done for the sake of feeling superior, that exists within the programming community, to the point where some may even say it’s a major subculture.

[–] Badland9085@lemm.ee 5 points 1 month ago (1 children)

Didn’t something similar happen just a couple months ago?

[–] Badland9085@lemm.ee 7 points 1 month ago

This. Any time someone tries to tell me that AGI will come within the next 5 years given what we’ve seen, I roll my eyes. I don’t see a pathway where LLMs become what’s needed for AGI. They may be a part of it, but a non-critical part at best. If you can’t reduce hallucinations to the point of being virtually indistinguishable from misunderstanding a sentence due to vagueness, it’s useless for AGI.

Our distance from true AGI (not some goalpost moved by corporate interests) has not significantly shrunk since before LLMs became a thing, in my very harsh opinion, barring the knowledge and research gained by those who are actually working towards AGI. Just as we always thought, before 2020, that AI would come one day, maybe soon, it’s no different now. LLMs alone barely close that gap. At best, they give us the illusion of it.

[–] Badland9085@lemm.ee 8 points 1 month ago

Is there anyone here who’s familiar with the paper(s) mentioned in the article? I’d actually like to read them, so if you are, it’d be great if you could share them with me. I couldn’t really find them in the article, unless they’re just hidden under one of its links.

I found the following paper with the authors mentioned:

https://arxiv.org/abs/2501.11582

But I’m not sure if that’s it. It does bear some resemblance to the topic, though. My search-fu isn’t getting me far with just author names.