jsomae

joined 11 months ago
[–] [email protected] 6 points 1 month ago (1 children)

Lix is the better open-source Lemmings.

[–] [email protected] 4 points 1 month ago

As with crypto, dedicated hardware should be able to do better than GPUs, if the model is pinned down.

[–] [email protected] 1 points 1 month ago (1 children)

The first few episodes were so uncomfortable I couldn't get through them. :(

[–] [email protected] 2 points 1 month ago

I think Bloom Into You is good. However, you'll need to switch to the manga halfway through since the anime is incomplete (and will probably never be finished).

[–] [email protected] 5 points 1 month ago (2 children)

In the case of Discord, it's not convenience, it's the network effect.

[–] [email protected] 1 points 1 month ago

Wikipedia suggests F-Droid is a "free and open-source app store."

[–] [email protected] 17 points 1 month ago* (last edited 1 month ago) (4 children)

60 seconds in 24 hours seems too prone to false positives. What if you forget and take a nap? What if there's a power outage? What if your phone breaks unexpectedly?

[–] [email protected] 2 points 1 month ago* (last edited 1 month ago)

Obviously it depends on your GPU. A crypto mine you'll leave running 24/7. On a recent MacBook, an LLM will run at several tokens per second, so yeah, for long responses it could take more than a minute. But most people aren't going to be running an LLM for hours on end. Even if they do -- big deal, it's a single GPU, and that's negligible compared to running your dishwasher, using your oven, or heating your house.
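
To put very rough numbers on that, here's a back-of-envelope sketch -- the wattages and durations are my own assumptions, not measurements:

```python
# Back-of-envelope energy comparison; every figure here is a rough assumption.
GPU_WATTS = 300        # assumed draw of a single consumer GPU under load
DISHWASHER_KWH = 1.5   # assumed energy for one dishwasher cycle
OVEN_KWH = 2.0         # assumed energy for roughly an hour of oven use

llm_hours = 1.0        # a full hour of heavy local LLM use
llm_kwh = GPU_WATTS * llm_hours / 1000

print(f"1 h of local LLM use: ~{llm_kwh:.2f} kWh")
print(f"One dishwasher cycle: ~{DISHWASHER_KWH:.2f} kWh")
print(f"An hour of oven use:  ~{OVEN_KWH:.2f} kWh")
```

Even under those generous assumptions, an hour of heavy local LLM use is a fraction of one dishwasher cycle.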

[–] [email protected] 2 points 1 month ago (1 children)

Exactly. Talking. Violence isn't going to make more leftists.

That said, call me paranoid, but I think three-letter organizations are the main obstacle to organizing. I don't know what to do about that.

[–] [email protected] 12 points 1 month ago* (last edited 1 month ago) (4 children)

I don't have a source for that, but the most power any locally-run program can draw is basically the sum of a few things: maxed-out GPU usage, maxed-out CPU usage, and maxed-out disk access. The GPU is by far the most power-hungry of these, and modern video games already push the GPU about as hard as they can get away with.

Running an LLM locally can at most max out the GPU, putting it in the same ballpark as a video game. Typical usage is to run it for a few seconds, read the answer, then submit another query, so it isn't busy 100% of the time, unlike a video game (which stays active the whole session; GPU usage dips only when you're in a menu, for instance).
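
Here's a tiny sketch of that duty-cycle argument -- the wattage and percentages are assumptions for illustration, not benchmarks:

```python
# GPU energy over a one-hour session; duty cycles and wattage are assumptions.
GPU_WATTS = 300           # assumed full-load GPU draw

GAMING_DUTY_CYCLE = 0.9   # GPU busy ~90% of a gaming session
LLM_DUTY_CYCLE = 0.1      # GPU busy ~10% of a chat session (short bursts)

session_hours = 1.0
gaming_kwh = GPU_WATTS * GAMING_DUTY_CYCLE * session_hours / 1000
llm_kwh = GPU_WATTS * LLM_DUTY_CYCLE * session_hours / 1000

print(f"An hour of gaming:         ~{gaming_kwh:.2f} kWh")
print(f"An hour of local LLM chat: ~{llm_kwh:.2f} kWh")
```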

Data centers drain so much power because they run a very large number of machines at the same time.

[–] [email protected] 5 points 1 month ago (3 children)

Agreed -- how do we make more leftists, though?

[–] [email protected] 21 points 1 month ago (12 children)

Running an LLM locally takes less power than playing a video game.
