this post was submitted on 01 Jul 2025
2104 points (98.4% liked)

Microblog Memes

[–] [email protected] 49 points 2 days ago* (last edited 2 days ago) (24 children)

Let’s do the math.

Let’s take an SDXL porn model, with no 4-step speed augmentations, no hand-written quantization/optimization schemes like SVDQuant, or anything; just an early, raw, inefficient implementation:

https://www.baseten.co/blog/40-faster-stable-diffusion-xl-inference-with-nvidia-tensorrt/#sdxl-with-tensorrt-in-production

So: 2.5 seconds on an A100 for a single image. Let’s batch it (because that’s what’s done in production) and run it on the now-popular H100 instead, and very conservatively assume 1.5 seconds per image (though it’s likely much faster).

That’s on a 700W SXM Nvidia H100. Usually in a server box with 7 others, so let’s say 1000W including its share of the CPU and everything else. Let’s say 1400W for networking, idle time, whatever else is going on.

That’s about 2.1 kJ, or roughly 0.6 watt-hours per image.
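The arithmetic above can be sanity-checked in a few lines; 1.5 s/image and 1400 W are the conservative assumptions from this comment:

```python
# Energy per generated image, under the comment's assumptions:
# 1.5 s per image at 1400 W of total draw (GPU share plus CPU,
# networking, idle overhead).
seconds_per_image = 1.5
watts = 1400
joules = seconds_per_image * watts   # 2100 J, i.e. ~2.1 kJ
watt_hours = joules / 3600           # ~0.58 Wh per image
print(f"{joules:.0f} J = {watt_hours:.2f} Wh")
```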

…Or about the energy of browsing Lemmy for 30-60 seconds. And again, this is a high estimate, yet it’s still only a fraction of a second of a home AC system’s usage.


…So yeah, booby pictures take very little energy, and the usage is going down dramatically.

Training light, open models like DeepSeek, Qwen, or SDXL takes very little energy, as does running them. The GPU farms they use are tiny, dwarfed by something like an aluminum plant.

What slurps energy is AI bros like Musk or Altman trying to brute-force their way to a decent model by scaling out instead of increasing efficiency, and mostly they’re blowing that out of proportion to try to hype the market and convince investors that AI will be expensive and grow infinitely (so people will give them money).

That isn’t going to work very long. Small on-device models are going to be too cheap to compete.

https://escholarship.org/uc/item/2kc978dg

So this is shit; they should be turning off AI farms too, but your porn images are a drop in the bucket compared to AC costs.


TL;DR: There are a bazillion things to flame AI Bros about, but inference for small models (like porn models) is objectively not one of them.

The problem is billionaires.

[–] [email protected] 2 points 2 days ago (9 children)

I'm really OOTL when it comes to AI GHG impact. How is it any worse than crypto farms, or streaming services?

How do their outputs stack up to traditional emitters like Ag and industry? I need a measuring stick

[–] [email protected] 2 points 1 day ago (1 children)

How is it any worse than crypto farms, or streaming services?

These two things are so different.

Streaming services are extremely efficient; they tend to be encode-once and decode-on-user's-device. Video was for a long time considered a tough thing to serve, so engineers put tons of effort into making it efficient.

Cryptocurrency is literally designed to be as wasteful as possible while still being feasible. "Proof-of-work" (how Bitcoin and many other currencies operate) means that mining algorithms must waste as much computation as they can get away with, doing pointless operations just to prove they tried. It's an abomination.
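The "pointless operations just to prove they tried" point can be shown with a toy proof-of-work sketch (simplified; real Bitcoin mining hashes a block header against a numeric target, not a hex prefix):

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Toy proof-of-work: grind nonces until the SHA-256 digest starts
    with `difficulty` zero hex characters. None of this hashing computes
    anything useful; it only proves that work was burned."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce
        nonce += 1

# Each extra zero of difficulty multiplies the expected work by 16.
print(mine("example block", 4))
```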

[–] [email protected] 1 points 1 day ago (1 children)

I legit don't know much about tech, and it's showing. I didn't know streaming was so efficient.

What I'm trying to get at (I still have to read the article in the parent comment) is: how is AI any worse than crypto? As far as I can tell, crypto's impact, while bad, was relatively minor, and it drastically decreased in popularity; it's kind of logical that AI does the same, unless its impact is way higher.

Meanwhile we have cargo ships burning bunker crude.

[–] [email protected] 2 points 1 day ago (1 children)

If you are expecting AI to not have much impact and turn out to be a bubble, then I guess there isn't much reason to believe it will have much environmental impact. If you expect AI to not be a fad, then yeah, it could have big environmental consequences if we can't find renewable power and coolant. If AI is all it is hyped up to be, then it would dwarf the rest of humanity's power consumption down to a footnote. So it really depends on how bullish you are about AI, or at least how bullish you expect the market to be going forward.

Regarding proof-of-work crypto: well, Bitcoin is currently at its all-time high in value, exceeding USD$100k/BTC, so I'm not sure I buy the idea that it's less popular, though perhaps people aren't reporting on it as much. If the power consumption of crypto has levelled off, which I don't know if it has, it might be because mining rigs are expensive to build and the yield goes down over time as more bitcoin is mined. (The same presumably holds for other proof-of-work crypto: as more coins are mined, the marginal yield of further mining decreases.)
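The declining yield mentioned above is baked into Bitcoin's protocol: the block subsidy started at 50 BTC and halves every 210,000 blocks (roughly every four years). A minimal sketch of that schedule:

```python
def block_subsidy(height: int) -> float:
    """Bitcoin block subsidy in BTC: 50 at launch, halved every
    210,000 blocks. (Ignores the satoshi rounding of the real client.)"""
    return 50.0 / (2 ** (height // 210_000))

print(block_subsidy(0))        # 50.0   (2009 launch)
print(block_subsidy(210_000))  # 25.0   (first halving, 2012)
print(block_subsidy(840_000))  # 3.125  (2024 halving)
```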

[–] [email protected] 2 points 23 hours ago (1 children)

Honestly, all of this is really interesting. It's a whole side of humanity that I very much do NOT think about or follow. I spent the last decade much, much too busy stomping through the forest, so I really didn't follow anything during that time. A new game or phone came out? Sure, cool, I might look that up. When I finally emerged from the fens, sodden and fly-bitten, I was very much out of the loop, despite the algorithm trying to cram articles about NFTs, crypto, etc., down my throat. I actually tend to avoid tech stuff because it's too much of a learning curve at this point. I get the fundamentals, but beyond that I don't dig in.

I agree with you on the bubble; it depends on the size. I guess my original take is: how could it actually get bigger than it is? I just don't see how it can scale beyond being in phones or used in basic data analysis, like another Google. The AIs can definitely get more advanced, sure, but with that should come some sort of efficiency. We're also seemingly on the cusp of quantum computing, which I imagine would reduce power requirements.

Meanwhile (and not to detract from the environmental concerns AI could pose) we have very, very real and very, very large environmental concerns that need addressing. Millions of cubic metres of sulphur are sitting in stockpiles in northern Alberta, and threatening the Athabasca river. That's not even close to the top of the list of things we need to focus on before we can get out in front of the damage AI can cause.

We're in a real mess.

[–] [email protected] 2 points 22 hours ago (1 children)

The AIs can definitely get more advanced, sure, but with that should come some sort of efficiency.

This is what AI researchers/pundits believed until roughly 2020, when it was discovered that you could brute-force your way to more capable AIs (the so-called "scaling laws") just by massively scaling up existing algorithms. That's essentially what tech companies have been doing ever since. Nobody knows what the limit on this is going to be, but as far as I know, nobody has any good evidence to suggest that we're near the limit of what's going to be possible with scaling.

We’re also seemingly on the cusp of quantum computing, which I imagine would reduce power requirements.

Quantum computers are not simply faster versions of regular computers. Quantum computing has efficiency advantages for some particular algorithms, such as breaking certain types of encryption. As far as I'm aware, nobody is really looking to replace ordinary computers with quantum computers in general. Even if they did, I don't think anyone has thought of a way to accelerate AI using quantum computing; and even if there were a way, it would presumably require quantum computers something like 15 orders of magnitude more powerful than the ones we have today.

We have very, very real and very, very large environmental concerns that need addressing.

Yeah. I don't think AI is really at the highest level of concern for environmental impact, especially since it is looking plausible it will lead to investing in nuclear power, which would be a net positive IMO. (Coolant could still be an issue though.)

[–] [email protected] 2 points 21 hours ago (1 children)

How do they brute force their way to a better algorithm? Just trial and error? How do they check outcomes to determine that their new model is good?

I don't expect you to answer those musings - you've been more than patient with me.

Honestly, I'm a tree hugger, and the fact that we aren't going for nuclear simply because of smear campaigns and changes in public opinion is insanity. We already treat some mining wastes in perpetuity, or plan to have them entombed for the rest of time - how is nuclear waste any different?

[–] [email protected] 2 points 19 hours ago

It's not brute-force to a better algorithm per se. It's the same algorithm, exactly as "stupid," just with more force (more numerous and powerful GPUs) running it.

There are benchmarks to check whether the model is "good" -- for instance, how well the model does on standardized tests similar to the SATs (researchers are very careful to ensure that the questions do not appear anywhere on the internet, so the model can't just memorize the answers).
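At its simplest, that kind of benchmark scoring is just accuracy over a held-out answer key; the answer letters below are hypothetical:

```python
def accuracy(predictions: list[str], answers: list[str]) -> float:
    """Fraction of held-out benchmark questions answered correctly."""
    correct = sum(p == a for p, a in zip(predictions, answers))
    return correct / len(answers)

# Hypothetical multiple-choice run: model got 3 of 4 right.
print(accuracy(["B", "C", "A", "D"], ["B", "C", "D", "D"]))  # 0.75
```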
