this post was submitted on 13 Feb 2025
1202 points (99.4% liked)

Lemmy Shitpost

 
top 50 comments
[–] [email protected] 60 points 1 day ago (2 children)

The year is 2045. My grandson runs up to me with a handful of black cords

"Poppy, you know computers, right? I need to connect my Jongo 64k display, but it has CFONG-K6 port, and my Pimble box only has a Holoweave port, have you got an adapter?"

Sadly I sift through the strange cords without a shred of recognition. Truly my time on this mortal coil is coming to a wrap

[–] [email protected] 9 points 1 day ago (2 children)

Don't worry, maybe they misjudged the size of the asteroid and 2032 is it.

[–] [email protected] 4 points 1 day ago

Cool, I won't have to update my date formats.

[–] [email protected] 2 points 1 day ago* (last edited 1 day ago)

Honestly, that'd be kinda sick. Imagine seeing the wall of fire, coming to cleanse the Earth once again. Hopefully the next ones to inherit it don't fuck it up as bad as we did.

[–] [email protected] 9 points 22 hours ago (2 children)

Kinda already happening with USB-C

[–] [email protected] 5 points 13 hours ago (1 children)

USB-C uses the DisplayPort protocol in many cases.

[–] [email protected] 0 points 6 hours ago

Yeah, but in the context of the physical connector that doesn't matter; you can run almost any protocol over almost any connector if you want to (no bitrate guarantees).

[–] [email protected] 77 points 1 day ago (7 children)
[–] [email protected] 1 points 5 hours ago

Hell, remember dot-matrix printer terminals?

[–] [email protected] 10 points 1 day ago (4 children)

VGA and DVI honestly were both killed off way too soon. Both will perfectly drive a 1080p 60 fps display, and the only real reason to use HDMI or DisplayPort with such a display is if that matches your graphics output.

The biggest shame is that DVI didn't take off with its dual-monitor or USB features. Seriously, there was a DVI dual-link with USB spec, so you could legitimately use a single cable (with screws to prevent accidental disconnects) to connect your computer to all of your office peripherals. Instead we had to wait for Thunderbolt to recreate those features, but worse and more likely to drop out.

[–] [email protected] 9 points 1 day ago (1 children)

DVI -- sure, but if you think 1080p over VGA looks perfect you should probably get your eyes checked.

[–] [email protected] 2 points 23 hours ago (1 children)

I wouldn't be surprised if it varies by monitor, but I've encountered plenty of monitors professionally where I could not tell the difference between the VGA and HDMI inputs. I can absolutely tell when a user has one of the cheap adapters in the mix, though, and I generally make a point of getting those adapters out of production as much as possible: not only do they noticeably fuck with the signal (most users don't care, but I can see it at least), they also tend to fail and create unnecessary service calls.

[–] [email protected] 2 points 21 hours ago* (last edited 21 hours ago)

We are still using VGA on current-gen installations for the primary display on the most expensive patient monitoring system products in the world. If there's a second display, it gets DisplayPort. 1080p is still the max resolution.

load more comments (3 replies)
[–] [email protected] 23 points 1 day ago (1 children)

Remember? I still use it for my second monitor. My first interaction with DVI was also on that monitor, probably 10-15 years ago at this point. Going from VGA to DVI-D made everything much clearer and sharper. I keep using this setup because the monitor has a great stand and doesn't take up much space with its 4:3 aspect ratio. 1280x1024 is honestly fine for having voice chat, Spotify, or some documentation open.

[–] [email protected] 3 points 1 day ago

Hell yeah. My secondary monitor is a 1080p120 shitty TN panel from 2011. I remember the original stand had a big “3D” logo, because, remember those NVIDIA shutter glasses?

Connecting it is a big sturdy DVI-D cable that, come to think of it, is older than my child, my cars, and any of my pets.

[–] [email protected] 36 points 1 day ago (1 children)

Remember it? I work on PCs with DVI connected monitors every day.

[–] [email protected] 14 points 1 day ago (1 children)

Hell, I still use VGA for my work computer. I have the DisplayPort connected to the gaming laptop, and VGA connected to the work CPU. (My monitors are old, and I don't care)

[–] [email protected] 4 points 1 day ago

My monitors are old, and I don't care

Sung to the tune of Jimmy crack corn.

[–] [email protected] 15 points 1 day ago* (last edited 1 day ago)

My monitor is 16 years old (1080p, and that's enough for me); I can use DVI or HDMI. The HDMI input is not great when using a computer with that specific model.

So I've been using DVI for 16 years.

load more comments (2 replies)
[–] [email protected] 87 points 1 day ago* (last edited 1 day ago) (19 children)

I mean, it could... but if you run the math on a 4K vs an 8K monitor, you'll find that for most common monitor and TV sizes, and the distance you're sitting from them...

It basically doesn't make any perceptible difference.

Human eyes have the equivalent of a maximum resolution: a maximum angular resolution.

You'd have to have superhuman vision to notice a difference in almost any setup that doesn't involve you owning a penthouse or mansion. It really only makes sense if you have a TV the size of an entire wall of a studio apartment, use it for a Tokyo / Times Square style giant building-wall advertisement, or completely replace projection theatres with gigantic active screens.

This doesn't have 8K on it, but basically: buying an 8K monitor to use at a desk is completely pointless unless your face is less than a foot away from it, and it only makes sense for a living-room TV if said TV is something like 15+ feet wide and 7+ feet tall.
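For anyone who wants to sanity-check the angular-resolution argument, here is a rough back-of-the-envelope sketch (my own illustrative numbers, not from the comment): it assumes ~60 pixels per degree (about 1 arcminute) as the 20/20 acuity limit, a 16:9 panel, and a 65-inch TV viewed from roughly 8 feet.

```python
import math

# Back-of-the-envelope sketch: how many pixels of a given panel fall within
# one degree of visual angle at a given viewing distance? The 60 px/deg acuity
# limit, 65" diagonal, and 96" viewing distance are assumed illustrative values.

def pixels_per_degree(diagonal_in, res_h, res_v, distance_in):
    """Horizontal pixels packed into one degree of visual angle at this distance."""
    aspect = res_h / res_v
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect ** 2)
    ppi = res_h / width_in                                   # pixels per inch
    one_degree_in = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * one_degree_in

for label, (h, v) in {"4K": (3840, 2160), "8K": (7680, 4320)}.items():
    ppd = pixels_per_degree(diagonal_in=65, res_h=h, res_v=v, distance_in=96)
    note = " (already past the ~60 px/deg acuity limit)" if ppd > 60 else ""
    print(f"{label}: ~{ppd:.0f} px/deg{note}")
```

On those assumptions even the 4K panel already out-resolves 20/20 vision at that distance (~114 px/deg vs ~227 px/deg for 8K), which is the commenter's point: the extra 8K pixels land below the eye's angular resolution.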

[–] [email protected] 11 points 1 day ago

Yes. This. Resolution is already high enough for everything, except maybe wearables (e.g. VR goggles).

HDMI 2.1 can already handle 8K 10-bit color at 60 Hz, and 2.2 can do 240 Hz.
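Quick sanity check on the bandwidth side (my own arithmetic, not from the comment): 8K60 at 10-bit colour only squeezes into HDMI 2.1's link rate with the help of Display Stream Compression, since the raw pixel rate alone exceeds the usable payload.

```python
# Rough bandwidth arithmetic, a sketch rather than anything quoted from the spec:
# uncompressed pixel data for 8K60 at 10-bit RGB versus the ~42.7 Gbit/s of
# usable payload on HDMI 2.1's 48 Gbit/s FRL link (16b/18b coding).
# Blanking intervals are ignored, so the real uncompressed requirement is even
# higher; in practice 8K60 10-bit over HDMI 2.1 leans on DSC compression.
width, height, refresh_hz = 7680, 4320, 60
bits_per_pixel = 3 * 10                      # RGB, 10 bits per channel

raw_gbit_s = width * height * refresh_hz * bits_per_pixel / 1e9
hdmi_2_1_payload_gbit_s = 48 * 16 / 18       # ~42.7 Gbit/s usable

print(f"8K60 10-bit, uncompressed pixels: {raw_gbit_s:.1f} Gbit/s")
print(f"HDMI 2.1 usable payload:          {hdmi_2_1_payload_gbit_s:.1f} Gbit/s")
print(f"Fits without compression: {raw_gbit_s <= hdmi_2_1_payload_gbit_s}")
```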

[–] [email protected] 8 points 1 day ago (3 children)

Commercial digital cinema projectors aren't even 8K, and movies look sharper than ever. Only 70mm IMAX looks better, and that's equivalent to roughly 8K to 12K, but IMAX screens (the real ones, not digital LieMAX) are gigantic.

Also, screen technology has advanced faster than the rest of the pipeline. Making a movie in a full 8K pipeline would be way too expensive. It's only in recent years that studios upgraded to a complete 4K pipeline, from camera to final rendered image, including every step in between. And that's mostly because Netflix forced studios to do so.

True native, non-upscaled 8K content won't be here for a long while.

load more comments (3 replies)
load more comments (17 replies)
[–] [email protected] 141 points 1 day ago (2 children)

DVI is the Gen X of video connectors

[–] [email protected] 45 points 1 day ago (18 children)

Where do my boys Component and S-Video end up?

[–] [email protected] 66 points 1 day ago (2 children)

in my box of cables, for one!

load more comments (2 replies)
load more comments (17 replies)
load more comments (1 replies)
[–] [email protected] 20 points 1 day ago (1 children)

Nobody noticed the Dark Side of the Moon cover in the background.

[–] [email protected] 2 points 22 hours ago

It's a woke poster. /s

[–] [email protected] 32 points 1 day ago (1 children)

VGA was just analog; it wasn't that the supported resolutions weren't HD.

[–] [email protected] 27 points 1 day ago* (last edited 1 day ago) (5 children)

100% right. I know it can handle 1920x1080 @ 60 Hz, and it can handle up to 2048x1536.
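For a sense of why that works on the analog side, here is a rough Python sketch (my own assumed numbers, not from the comment) of the pixel clock those modes need on a VGA output.

```python
# Rough sketch: a VGA mode needs a pixel clock of roughly
# h_total * v_total * refresh, where the totals include blanking. The ~25%
# blanking overhead is an assumed round figure; real GTF/CVT timings vary.
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    return width * height * (1 + blanking_overhead) * refresh_hz / 1e6

for w, h in [(1920, 1080), (2048, 1536)]:
    print(f"{w}x{h} @ 60 Hz: ~{approx_pixel_clock_mhz(w, h, 60):.0f} MHz pixel clock")
```

That lands around 156 MHz for 1080p60 and roughly 236 MHz for 2048x1536@60; late VGA-era graphics cards commonly shipped 350-400 MHz RAMDACs, so both modes sit comfortably within the analog output's reach.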

[–] [email protected] 5 points 1 day ago (1 children)

I remember my buddy getting a whole bunch of ViewSonic CRTs from his dad, who worked at a professional animation studio. They could do up to 2048×1536 and they looked amazing, but were heavy as fuck for LAN parties lol. I loved that monitor though; when I finally 'upgraded' to an LCD screen it felt like a downgrade in a lot of ways, except for desk real estate.

[–] [email protected] 2 points 1 day ago (1 children)
[–] [email protected] 3 points 1 day ago (2 children)

Trinitron was better because it was flat.

[–] [email protected] 1 points 16 hours ago

Sony just used to make good quality electronics, in general.

[–] [email protected] 2 points 1 day ago
load more comments (4 replies)
[–] [email protected] 47 points 1 day ago (7 children)

The meaning of "high res" wasn't changed, though. Also, VGA was able to output Full HD as well.

load more comments (7 replies)
[–] [email protected] 28 points 1 day ago* (last edited 1 day ago) (8 children)

It already happened to them multiple times; that's why we are on HDMI 2.2, which can go up to:

7680 × 4320 @ 60 FPS

HDMI 1.0 could only reach:

1920 × 1080 @ 60 FPS

The only reason it still works is that they keep changing the specifications.

And I think I can confidently say we will never need more than 8K, since we are reaching the physical limits of the human eye; at the very least, it will never be considered low resolution.

load more comments (8 replies)