this post was submitted on 01 Jul 2025
2130 points (98.5% liked)

Microblog Memes

8537 readers
4224 users here now

A place to share screenshots of Microblog posts, whether from Mastodon, tumblr, ~~Twitter~~ X, KBin, Threads or elsewhere.

Created as an evolution of White People Twitter and other tweet-capture subreddits.

Rules:

  1. Please put at least one word relevant to the post in the post title.
  2. Be nice.
  3. No advertising, brand promotion or guerrilla marketing.
  4. Posters are encouraged to link to the toot or tweet etc in the description of posts.

founded 2 years ago
[–] [email protected] 166 points 2 weeks ago* (last edited 2 weeks ago) (30 children)

I had my energy company remove their LVTC smart meter this week after they started using it to shut off our condenser unit during our 100 degree days

The fact that it exists at all is bad enough, but they were doing this at a time when our AC was already malfunctioning due to low refrigerant. On the day they first shut it off, our house reached 94 degrees.

The program the previous owner signed up for, the one that lets them do this, came with a fucking two-dollar-a-month discount.

I use a smart thermostat to optimize my home conditioning - having a second meter fucking with my schedule ends up making us all miserable. Energy providers need to stop fucking around and just build out their infrastructure to handle worst case peak loads, and enable customers to install solar to reduce peak loading to begin with.

The other thing that kills me about this is that our provider administers our city's solar electric subsidy program themselves. When I had them come out to give us a quote, they inflated their price by more than 100% because they knew what our electricity bill was. All they did was take our average monthly bill and multiply it by the repayment period. I could have been feeding more energy into the grid at their peak load if they hadn't tried scamming me.

FUCK private energy providers.

[–] [email protected] 29 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

How tf can a meter shut off an appliance? Did you also have smart breakers from them?

Anyway, absolutely ridiculous.

[–] [email protected] 58 points 2 weeks ago

It's separate from the main meter and connected directly at the condenser unit.

It monitors power draw and acts as a relay when the provider sends a shutoff signal. The thermostat thinks the system is still going, and the fans still push air through the vents, but the coils aren't being cooled anymore so the air gets hot and musty.

[–] [email protected] 93 points 2 weeks ago (2 children)

Yeah, that thing that nobody wanted? Everybody has to have it. Fuck corporations and capitalism.

[–] [email protected] 26 points 2 weeks ago (10 children)

Just like screens in cars, and MASSIVE trucks. We don't want this. Well, some dumbass Americans do, but intelligent people don't need a 32 ton 6 wheel drive pickup to haul jr to soccer.

[–] [email protected] 23 points 2 weeks ago

Massive trucks? They need those trucks for truck stuff, like this giant dilhole parking with his wife to go to Aldi today. Not even a flag on the end of that ladder, it filled a whole spot by itself.

My couch wouldn't fit in that bed, and every giant truck I see is sparkling shiny and looks like it hasn't done a day of hard labor, much like the drivers.

[–] [email protected] 22 points 2 weeks ago (1 children)

Oh, and you don't want it and want the non-smart model instead? You can still buy it, for 3x the price.

[–] [email protected] 84 points 2 weeks ago (1 children)

Also they can build nuclear power generators for the data centers but never for the residential power grid.

[–] [email protected] 82 points 2 weeks ago (16 children)

Worse is Google, which insists on shoving a terrible AI-based result in your face every time you do a search, with no way to turn it off.

I'm not telling these systems to generate images of cow-like girls, but I'm getting AI shoved in my face all the time whether I want it or not. (I don't).

[–] [email protected] 29 points 2 weeks ago

Then I guess it's time to stop using Google!

[–] [email protected] 13 points 2 weeks ago (3 children)

And including the word "fuck" in your query no longer stops it.

[–] [email protected] 67 points 2 weeks ago (1 children)

Meanwhile I'm downtown in my city cleaning windows in office buildings that are 75% empty, but the heat or AC is blasting on completely empty floors and most of the lights are on.

[–] [email protected] 59 points 2 weeks ago (2 children)

When I'm told there are power issues and to conserve power, I drop my AC to 60 and leave all my lights on. The only way for them to fix the grid is to break it.

[–] [email protected] 17 points 2 weeks ago

Literally rolling coal to own the cons

[–] [email protected] 12 points 2 weeks ago
[–] [email protected] 50 points 2 weeks ago* (last edited 2 weeks ago) (19 children)

Let’s do the math.

Let's take an SDXL porn model, with no 4-step speed augmentations, no hand-written quantization/optimization schemes like SVDQuant, or anything like that; just an early, raw, inefficient implementation:

https://www.baseten.co/blog/40-faster-stable-diffusion-xl-inference-with-nvidia-tensorrt/#sdxl-with-tensorrt-in-production

So 2.5 seconds on an A100 for a single image. Let’s batch it (because that’s what’s done in production), and run it on the now popular H100 instead, and very conservatively assume 1.5 seconds per single image (though it’s likely much faster).

That’s on a 700W SXM Nvidia H100. Usually in a server box with 7 others, so let’s say 1000W including its share of the CPU and everything else. Let’s say 1400W for networking, idle time, whatever else is going on.

That's about 2 kJ, or roughly 0.6 watt-hours.

…Or about the energy of browsing Lemmy for 30-60 seconds. And again, this is a high estimate, but also only a fraction of a second of usage for a home AC system.
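If you want to sanity-check that arithmetic, here's a minimal sketch in Python using the figures above; the 3,500 W central-AC draw is my own assumed figure for the comparison:

```python
# Back-of-envelope energy for one batched SDXL image, using the figures above
seconds_per_image = 1.5       # conservative batched H100 latency (assumed above)
server_watts = 1400           # H100 + CPU share + networking/idle overhead (assumed above)

joules = server_watts * seconds_per_image      # ~2100 J, i.e. about 2 kJ
watt_hours = joules / 3600                     # ~0.58 Wh

central_ac_watts = 3500                        # assumed draw of a running central AC unit
ac_seconds = joules / central_ac_watts         # ~0.6 s of AC runtime per image

print(f"{joules:.0f} J = {watt_hours:.2f} Wh = {ac_seconds:.1f} s of central AC")
```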


…So yeah, booby pictures take very little energy, and the usage is going down dramatically.

Training light, open models like DeepSeek or Qwen or SDXL takes very little energy, as does running them. The GPU farms they use are tiny, and dwarfed by something like an aluminum plant.

What slurps energy is AI Bros like Musk or Altman trying to brute-force their way to a decent model by scaling out instead of increasing efficiency, and mostly they're blowing that out of proportion to try to hype the market and convince it that AI will be expensive and grow infinitely (so people will give them money).

That isn’t going to work very long. Small on-device models are going to be too cheap to compete.

https://escholarship.org/uc/item/2kc978dg

So this is shit, they should be turning off AI farms too, but your porn images are a drop in the bucket compared to AC costs.


TL;DR: There are a bazillion things to flame AI Bros about, but inference for small models (like porn models) is objectively not one of them.

The problem is billionaires.

[–] [email protected] 34 points 2 weeks ago (3 children)

I don't disagree with you, but most of the energy people complain about AI using goes into training the models, not running them. Once they're trained, it's fast to get what you need out of them, but making the next version takes a long time.

[–] [email protected] 13 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

Only because of brute force over efficient approaches.

Again, look up DeepSeek's FP8/multi-GPU training paper, and some of the code they published. They used a microscopic fraction of what OpenAI or xAI are using.

And models like SDXL or Flux are not that expensive to train.

It doesn’t have to be this way, but they can get away with it because being rich covers up internal dysfunction/isolation/whatever. Chinese trainers, and other GPU constrained ones, are forced to be thrifty.

[–] [email protected] 33 points 2 weeks ago

Crossposted to the Fuck AI community

[–] [email protected] 26 points 2 weeks ago* (last edited 2 weeks ago) (4 children)

We're going away, folks, and nothing of any true value will be lost, except all the species that did live in homeostasis with the Earth and that we're taking with us in our species' avarice-induced murder-suicide.

[–] [email protected] 12 points 2 weeks ago (4 children)

Carlin had some good material, but this is an absolutely stupid mindset. We can cause an extreme level of ecological damage. Will the planet eventually recover? Quite possibly. But that's not a certainty, and in the meantime we're triggering a mass extinction precisely because irresponsible humans figure there's no way we can hurt the Earth, and that it's self-important hubris to think we can.

But the time we're living through and the time we're heading into are all the proof we should need that it's actually hubris to assume our actions have no meaningful impact.

[–] [email protected] 20 points 2 weeks ago

I have a crazy theory that requests like these will actually push people to care more about and take action on global warming.

[–] [email protected] 18 points 2 weeks ago

Laughs in Total Recall

[–] [email protected] 16 points 2 weeks ago (1 children)
[–] [email protected] 14 points 2 weeks ago (4 children)

we had three-tiddied aliens in Total Recall, like 40 years ago. we don't need AI to give us more tits.

[–] [email protected] 16 points 2 weeks ago

> we don't need … more tits.

Blasphemy!!

[–] [email protected] 15 points 1 week ago

Could someone please help me save some power and just post the image with the 5 tits so I don't need to have it regenerated de novo?

[–] [email protected] 14 points 2 weeks ago (2 children)

I tried to make an image of a woman with 5 tits but got distracted and got married to a rock

[–] [email protected] 12 points 2 weeks ago* (last edited 2 weeks ago) (7 children)

1 prompt averages about 1 Wh of electricity; a typical AC draws about 1,500 W, so that's roughly 2.4 seconds of AC per prompt.
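(A minimal check of that arithmetic, taking the 1 Wh per prompt and 1,500 W AC figures above as given:)

```python
# seconds of AC runtime per prompt, using the averages quoted above
wh_per_prompt = 1.0      # assumed average energy per prompt
ac_watts = 1500          # assumed average AC power draw
seconds_of_ac = wh_per_prompt * 3600 / ac_watts
print(seconds_of_ac)     # 2.4
```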

Energy capacity is really not a problem first-world countries should be facing. We have this solved, and you're just taking the bait of blaming normal dudes using minuscule amounts of power while billionaires fly private jets for afternoon getaways.

[–] [email protected] 20 points 2 weeks ago (7 children)

They are blaming the billionaires (or their companies), for making the thing nobody wanted so they can make money off of it. The guy making a five-breasted woman is a side effect.

And sure, that one image only uses a moderate amount of power. But there are still giant data centers dedicated to this purpose, gobbling up tons of power and evaporating tons of water for power generation and cooling. And all that is before considering the training of the models (which you'd better believe they're doing continuously to try to come up with better ones).

[–] [email protected] 12 points 2 weeks ago* (last edited 2 weeks ago) (29 children)

I know she's exaggerating, but this post yet again underscores how nobody understands that it's training AI that is computationally expensive. Deploying an AI model draws power comparable to running a high-end videogame. How can people hope to fight back against things they don't understand?

[–] [email protected] 30 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

She's not exaggerating, if anything she's undercounting the number of tits.

[–] [email protected] 24 points 2 weeks ago (3 children)

I mean, continued use of AI encourages the training of new models. If nobody used the image generators, they wouldn't keep trying to make better ones.

[–] [email protected] 20 points 2 weeks ago (29 children)

It's closer to running 8 high-end video games at once. Sure, from a scale perspective it's further removed from training, but it's still fairly expensive.
