this post was submitted on 19 May 2024
125 points (93.7% liked)

Technology

[–] [email protected] 6 points 6 months ago (2 children)

Competitive (professional) gamers?

Seems there are diminishing returns, but at least some gains are measurable at 360.

[–] [email protected] 3 points 6 months ago (2 children)

I thought that 60 Hz was enough for most games, and that 120 or 144 Hz was better for shooters and other real-time games. However, it reaches a point where the human eye can't notice the difference even if it tried.

Honestly, pushing the framerate too high is just a waste of GPU power and electricity.

[–] [email protected] 10 points 6 months ago

A better way to look at this is frametime.

At 60 FPS/Hz, a single frame is displayed for 16.67ms. At 120 Hz, a single frame is displayed for 8.33ms. At 240 Hz, a single frame is displayed for 4.17ms. A difference of over 8ms per frame (60 vs 120) is quite noticeable for many people, and over 4ms (120 vs 240) is as well, but the impact is only half as large. So you get diminishing returns pretty quickly.
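To make the math concrete: frametime is just 1000 / refresh rate, and the time saved per doubling halves each step. A quick sketch (my own illustration, not from the thread):

```python
# Frametime (ms) at common refresh rates, and how much each doubling saves.
# Illustrates the diminishing-returns point: each step saves half as much time.

def frametime_ms(hz: float) -> float:
    """Milliseconds each frame stays on screen at a given refresh rate."""
    return 1000.0 / hz

rates = [60, 120, 240, 480, 1000]
for prev, curr in zip(rates, rates[1:]):
    saved = frametime_ms(prev) - frametime_ms(curr)
    print(f"{prev:>4} Hz -> {curr:>4} Hz: "
          f"{frametime_ms(prev):.2f} ms -> {frametime_ms(curr):.2f} ms "
          f"(saves {saved:.2f} ms per frame)")
```

Going 60 to 120 Hz saves 8.33ms per frame, but 480 to 1000 Hz saves only about 1ms, which is why the upgrades feel smaller and smaller.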

Now I'm not sure how noticeable 1000 Hz would be to pretty much anyone as I haven't seen a 1000 Hz display in action yet, but you can definitely make a case for 240 Hz and beyond.

[–] [email protected] 4 points 6 months ago (1 children)

It's pretty easy to discern refresh rate with the human eye if you try. Just move your cursor back and forth really quickly. The number of ghost cursors in the trail it leaves behind (which, by the way, exist only in your perception) is inversely proportional to the refresh rate.

[–] [email protected] 2 points 6 months ago (1 children)

Sure, but wasting double or triple the resources for that isn't fine. There are very few places in games where it's even a gain; outside of those super competitive titles it hardly matters.

[–] [email protected] 4 points 5 months ago (1 children)

Yeah I agree with you, but I was just refuting your claim that it's not perceivable even if you try.

[–] [email protected] 0 points 5 months ago

Oh yeah, I've read and heard plenty of people say that they definitely notice it. I'm lucky enough not to, because most ARPGs don't run at 60 FPS in intense combat, let alone 120 FPS on an RTX 3080 lmao.

I was talking more about the jump from 240 and beyond, where I find it surprising that people notice the upgrade during intense gaming encounters rather than while calmly checking or testing. I guess there are people who do notice, but again, running games at such a high frame rate is very expensive for the GPU and a waste most of the time.

I'm just kinda butthurt that people act like screens below 120 Hz are bad, when most games I play hardly run a smooth 60 FPS. The market will follow, and in a few years we'll hardly have what I consider normal monitors, and cards will just eat way more electricity for very small gains.