30 is acceptable for most games, but anything where the gameplay is mainly the movement itself (platformers, racing, first-person shooters) needs to hit 60. I could go lower than 30 for the visuals in a lot of games, but that's the threshold where the interface starts feeling unresponsive, and that really gets to me.
60 FPS. I can't stand an unstable framerate, so I'd rather lower quality/effects if I can't hold a constant 60 FPS.
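For illustration, the "lower settings until 60 holds" approach comes down to a frame-time budget of roughly 16.7 ms. Here's a rough, hypothetical sketch of an auto-scaler that steps quality down when recent frames exceed that budget; the tier names, window size, and threshold are invented, not taken from any game:

```python
# Hypothetical sketch: step the quality tier down whenever recent frames
# exceed the 60 FPS budget (~16.7 ms per frame). All names here are invented.
import time
from collections import deque

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS                      # ~16.7 ms per frame
QUALITY_TIERS = ["ultra", "high", "medium", "low"]   # made-up tiers

class QualityScaler:
    def __init__(self, window=120):
        self.frame_times = deque(maxlen=window)      # rolling window of recent frame times
        self.tier = 0                                # start at "ultra"

    def record(self, frame_time):
        self.frame_times.append(frame_time)
        avg = sum(self.frame_times) / len(self.frame_times)
        # If the rolling average blows the budget, drop one tier and start fresh.
        if avg > FRAME_BUDGET and self.tier < len(QUALITY_TIERS) - 1:
            self.tier += 1
            self.frame_times.clear()
        return QUALITY_TIERS[self.tier]

# Usage: time each frame of the (imaginary) render loop and feed it in.
scaler = QualityScaler()
for _ in range(3):
    start = time.perf_counter()
    time.sleep(0.02)                                 # stand-in for a 20 ms frame (50 FPS)
    print(scaler.record(time.perf_counter() - start))
```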
My personal minimum is a stable 40 FPS, which is roughly where I start noticing the lower framerate even without paying attention to it.
At 30 FPS I need time to get used to it, and I usually underclock (or rather, power-limit) my GPU to hit an average of 50, unless the game in question is either highly unstable (e.g. Helldivers 2) or so light I don't have to care (e.g. Selaco).
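As an aside on the power-limit approach: on NVIDIA cards this is typically done through nvidia-smi rather than from inside the game. A rough sketch, assuming an NVIDIA GPU with the driver tools on the PATH and admin rights (the commenter doesn't say what hardware they run); the 200 W figure is just an example value:

```python
# Rough sketch: cap GPU board power with nvidia-smi so the driver clocks down
# to stay under it. NVIDIA-only, needs admin rights; 200 W is an arbitrary example.
import subprocess

def set_power_limit(watts: int) -> None:
    # Query the supported range first so we don't request an invalid value.
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.min_limit,power.max_limit",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()[0]
    lo, hi = (float(x) for x in out.split(","))
    if not lo <= watts <= hi:
        raise ValueError(f"{watts} W is outside the supported range {lo:.0f}-{hi:.0f} W")
    # -pl sets the board power limit in watts.
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

set_power_limit(200)
```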
I think I'd feel like a millionaire if I ever got 90 on a high refresh monitor. Lol. I like me poor and not too spoiled.
@penquin I've never owned a monitor with more than 60 Hz because the newer ones are too expensive.
So I got used to panels with bad colors and "low" refresh rates. No FreeSync or G-Sync, just good old vsync 😂
I'm also not sure my 3070 could drive a 120 Hz 2K screen at high settings in games.
Back in the days of CRT displays I had a 120Hz Trinitron, paired with a video card and 3D goggles (which shuttered each eye in turn) to give 60Hz per eye.
No way could that system or video card keep up with anything more modern than Turok 1, but it was nice for the couple of years it was good enough.
I wish I still had that Trinitron; I'd need a deeper desk though.
@psud I do like my displays because they're so old and still working. One of them is so old that the most modern connection it has is DVI; to get that working I had to buy an adapter for my graphics card.
Because I have two displays, their colors are so different that if I move a window between them so it shows on both, I can visually see how broken the colors are on each.
They also have ghosting issues with fast movement. I still like them.
I started playing on a PC in the 90s so as long as it’s above 40 with consistent frame pacing it’s fine. Those VRR displays and games targeting 40 are a game changer for me and why I play on Xbox with a modern LG OLED.
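The "40 with consistent frame pacing" point comes down to simple arithmetic: on a fixed-refresh panel, pacing is only even when the refresh rate divides cleanly by the frame rate, which is why 40 FPS targets work well on 120 Hz displays (and why VRR sidesteps the question entirely). A quick illustration with assumed example numbers:

```python
# Arithmetic behind "consistent frame pacing": a frame rate is evenly paced on a
# fixed-refresh display only if the refresh rate divides cleanly by it.
def pacing(target_fps: int, refresh_hz: int) -> str:
    frame_ms = 1000 / target_fps                    # time each frame is shown
    refreshes_per_frame = refresh_hz / target_fps   # whole number => even pacing
    even = refreshes_per_frame.is_integer()
    return (f"{target_fps} FPS on {refresh_hz} Hz: {frame_ms:.1f} ms/frame, "
            f"{refreshes_per_frame:g} refreshes/frame -> "
            f"{'even pacing' if even else 'uneven pacing without VRR'}")

for fps, hz in [(40, 60), (40, 120), (30, 60), (60, 120)]:
    print(pacing(fps, hz))
```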
For shooters, especially competitive ones, as high as possible up to my monitor's refresh rate (165Hz). Everything else 60 FPS is fine. Even 30 FPS can be fine, especially if I'm playing something on Switch.
Am I the only one who just plays games and doesn't know what FPS he's getting? If it plays, I'm good.
Or... maybe I am missing out on something? Lol
I used to be like that, and I have no idea how or when I started caring. But you know, I'm turning that shit off as of now and will not look at it.
I am 100% with you. There must be something to it if it's that important to so many people, but I genuinely can't tell the difference as long as it's stable.
And if it does make a difference, for competitive games wouldn't you want it to be consistent between all players instead of "better" based on whoever has more horsepower? It all makes no sense to me.
45
Also, 5
@penquin does it have to be first person? If third person is allowed I'd say Warframe. If not, classic Doom with mods
Shit, I knew a comment like this would come up. I was asking specifically about refresh rate, not first-person shooter games. Let me fix the title 😁
@penquin oh! Well in that case I used to be a 1080p 60Hz monitor kinda guy, and about a year ago I had to upgrade to dual 1440p 165Hz monitors.
While I can definitely feel the difference, the drop to 60 FPS is barely noticeable, and even 30 FPS is acceptable.
I grew up with slower machines, so sub-30 was fairly normal; even older consoles targeted 30 and faltered below that. At this point I'll take anything above what's acceptable for film (24 FPS).
So far, all my mentality/generation folks. <3
I just don't care about FPS as long as it's 25 or higher. Once you get down into the 20s, you start seeing the jitter.
@penquin like, I can tell the difference under 60, and I can tell it gets choppy under like, 40? But I probably don't make a comment about the "lag" or framerate dropping until it's below 20-30
100%. I can absolutely tell, but I just don't care. I'm here for the fun. Playing God of War with my son, fighting all these bosses, getting into it and yelling is just way too much fun to worry about FPS.
@penquin sometimes it's even more exciting overcoming the FPS drops, especially when I can tell why it's happening and/or if it's only temporary/rare. I've definitely caused my fair share during some overly modded Doom setups