I can’t do 30fps any more.



I've been thinking this over all day but in the end I've landed on the idea that games can only run on the hardware that exists at the time. And by that I don't mean the most cutting-edge stuff - I mean the mass market hardware.

 

If Geoff Crammond or David Braben and Ian Bell had waited for more powerful hardware, you lose that grand widening of what's possible in a game for the widest possible audience and you lose those years of influence on other developers.

 

If id Software had waited a year or two to release Doom so hardware could catch up with their engine...

 

Nintendo never make games for anything other than Nintendo hardware, so imagine they'd waited another console generation to release Ocarina of Time. And then again with Breath of the Wild...

 

If Blizzard hadn't just whacked an execution timer on their turn-based dungeon crawler, set it to 20fps and pushed it out the door...

 

If SEGA had waited a few more years so Virtua Racing or the original Daytona could have run at 60fps...

 

In each case you lose more in possibilities if those games were hypothetically punted forward in time than you do by having a game that a vocal minority consider to have "unacceptable" performance. I know most of those are old games but who knows where we'll be in 5-10 years? Maybe there's going to be a push for procedural generation because traditional development techniques can't handle the scope expected in newer games, or some other tech that we can't imagine yet - there's going to need to be people to do stuff first, on the hardware that exists at the time.

4 minutes ago, mikeyl said:

I’ve got 120Hz enabled on most things unless the choice is for prettier, but to say that you can’t go back to under 60, or refuse to play some of the best games ever in the format the creators intended, is wild. It’s an artificial, made-up barrier.

 

Just for info, Hz and fps aren't the same thing. If you have a 120Hz TV, everything you're playing on Series X is output at 120Hz, but if a game is running at 60fps the TV is displaying each frame twice. Or (shudder) four times at 30fps. Or six times at 20fps, the true gamer framerate. You shouldn't ever have to decrease the refresh rate unless something about the game or display is broken (like the near-black flickering at 120Hz/30fps with VRR enabled on current LG TVs - although just turning off VRR works around that too).
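The frame-repetition maths can be sketched in a few lines of Python (illustrative only; it assumes the game holds a steady framerate that divides evenly into the panel's refresh rate, with VRR off - note that at 20fps a 120Hz panel actually shows each frame six times):

```python
# How many refresh cycles a fixed-refresh display spends showing each
# rendered frame. Sketch only: assumes a steady framerate that divides
# evenly into the refresh rate, with VRR disabled.
def repeats_per_frame(refresh_hz: int, fps: int) -> int:
    if refresh_hz % fps != 0:
        raise ValueError("framerate must divide the refresh rate evenly")
    return refresh_hz // fps

for fps in (120, 60, 30, 20):
    print(f"{fps}fps on a 120Hz panel: each frame shown "
          f"{repeats_per_frame(120, fps)} time(s)")
```

At 60fps on a 120Hz panel that's 2 repeats per frame, 4 at 30fps and 6 at 20fps, which is why dropping the TV's refresh rate gains you nothing.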

6 minutes ago, matt0 said:

I've been thinking this over all day but in the end I've landed on the idea that games can only run on the hardware that exists at the time. […]

 

It's a good argument for tiered consoles. I think the Series S/X dichotomy is a brilliant idea because you're not paying for things your TV can't display or your eyes don't care about. The problem is when you can't brute force an acceptable framerate by playing the PC version or getting the "pro" console instead. I assume that everyone in this thread arguing that the Switch is fine is not going to bother with the Switch Pro unless certain games are exclusive to it?

7 minutes ago, matt0 said:

I've been thinking this over all day but in the end I've landed on the idea that games can only run on the hardware that exists at the time. And by that I don't mean, the most cutting edge stuff - I mean the mass market hardware.

This is a bit of an odd post, because absolutely nobody has said this (plus it is stupidly obvious anyway).

 

What has been said and acknowledged is that developers make a conscious choice to target 30/60/eleventy billion fps. If Nintendo wanted BotW to run at 60, they could've done it. It would've had some compromises, but then I guess equally for some people doing what they did and aiming for 30fps was also a compromise.

 

The point is, this is not something magical that developers don't have any control over; it is always a choice and, as has been said, is largely a case of shinies vs frames (I accept that is a rather reductive argument).


They will do it, in a few months, with the Switch Pro. I would argue it's not so much shinies versus frames as scope and size versus performance though. I'm sure they would have liked the world to be more dense and detailed too, but the size of the world was probably the driving factor for all their other design decisions. I wouldn't have wanted a smaller game world in exchange for better graphics, but I'd have taken one in exchange for better performance, given the option. Nintendo almost seem to bake the concept of the remaster/re-release into their games now though. Everything they've made since the Wii U looks brilliant running at higher resolutions and framerates because of their carefully chosen art styles. I've often wondered if that's something they take into account during development.


Anecdotally, we're keen on a Switch Pro in this house not because of BotW, which still looks stunning every time I see the kids fire it up, or Hyrule Warriors getting janky, but because our island in ACNH is complex enough to occasionally drop frames. For whatever reason - because there really isn't much going on, and it never occurred at first, while the island was simpler - it stands out as the game where the whole family comments that "ACNH is getting buggy again". Is it otherwise 30fps? I assume so, but I've never really sat down and figured it out. The point is, had it continued to hit whatever its frame rate cap is, no one here would have batted an eyelid.

 

I should note that we've only got the demo of Hyrule Warriors, but the kids have played through it several times and never once complained about the dip in frames, which is actually a problem for me.

3 hours ago, matt0 said:

 

How much more did the PC you emulated Ocarina Of Time on cost than an N64 though?

 

I might have you mixed up with another poster, but aren't you someone who usually has a top of the range GPU in a mahoosive PC setup at any given time? You're basically saying an experience on cheaper hardware is unacceptable compared to your experience on top of the range hardware.

 

I've flip flopped between PC and console gaming over the years, there's been times when I've been ahead of the curve (having even a mid budget gaming PC in the late 90s felt like being in another universe compared to the twilight years of the PS1), somewhere in the middle of the curve (Amiga 1200 in 1992) and right at the bottom of the curve (Amiga 1200 in 1996). I kept up with the console generations from the Dreamcast onwards but only joined the outgoing gen in 2018 when I got an Xbox One S for game pass. I currently have a reasonably hefty gaming PC, but one that will need a GPU upgrade in the next year or so (RX 5700 XT at the moment).

 

Maybe if I'd always had top of the range hardware I'd see things differently, but the thing is Ocarina of Time wasn't borderline unplayable when it came out, although it might be a struggle to adjust to it now - coming off the back of ST/Amiga 3D games or Star Fox on the SNES it was a massive step up. And the 3DS version isn't unacceptable now because it's 30fps; it's a tidy little remaster that plays fine. It sounds like you've had a rarefied experience of gaming, a better experience, but one that cost far more, and you're telling people who haven't had that rarefied experience that the experiences they've had are unacceptable and in some cases that they must have a "brain injury".

 

It's pretty ridiculous.

 

Nah, I've just bought a PC to replace one which was over 8 years old and completely knackered. I've done virtually all of my gaming on console for a few years. The PS5 and especially the Series X have been brilliant for me as I've gotten to go back and play things that I felt didn't hit the performance mark first time (like Ghost of Tsushima). When I got my Series X I downloaded practically every game on Game Pass or from my prior purchases, went through them one by one and immediately deleted anything that was still 30fps!

 

I didn't say that, incidentally. I said people who claim they can't tell the difference between very different framerates are either faking it to prove a point, justifying purchases to themselves, or their head goo doesn't work. Of course it goes in the opposite direction too, where people convince themselves they have to play everything at ultrawide 240fps because they've just spent a packet on a new monitor. But we're talking about baselines here, and I have always argued that the absolute minimum baseline for performance should be 60fps, because lower framerates look like a flipbook and verge on an accessibility issue.

6 minutes ago, TehStu said:

Anecdotally, we're keen on a Switch Pro in this house not because BotW, which still looks stunning every time I see the kids fire it up, or Hyrule Warriors getting janky, but because our island in ACNH is complex enough to occasionally drop frames. […]

I should note that we've only got the demo of Hyrule Warriors, but the kids have played through several times and never once complained about the dip in frames […]

 

The Link's Awakening remake does this too. It'll inexplicably switch between 60 and 30fps for no apparent reason in scenes that don't look at all taxing. I swear every time you walk into one of the phone hint rooms the framerate switches when you answer the phone. Weird.

 

Has Hyrule Warriors been patched since this video that shows it running at 9fps docked? That's why I didn't bother with it.

 

 

 

42 minutes ago, Moz said:

 

It's a good argument for tiered consoles. […] I assume that everyone in this thread arguing that the switch is fine is not going to bother with the Switch Pro unless certain games are exclusive to it?

Of course I’ll have a Switch Pro; I’m not some kind of tired martyr for the cause or anything. But that doesn’t mean what we’ve had until now has been subpar, and it won’t diminish the enjoyment I’ve already had.

17 minutes ago, Moz said:

 

The Link's Awakening remake does this too. […]

Has Hyrule Warriors been patched since this video that shows it running at 9fps docked? That's why I didn't bother with it.

 

 

 

No, it’s still quite janky despite being patched; it’s perfectly playable though.

1 hour ago, Gabe said:

This is a bit of an odd post, because absolutely nobody has said this (plus it is stupidly obvious anyway). […]

What has been said and acknowledged is that developers make a conscious choice to target 30/60/eleventy billion fps. […]

 

I was following on from my earlier post about older games but there are posts in here about how the most powerful handheld available at the time was underpowered.

 

And I don't think it's as simple as saying any game could run at 60fps if the developers wanted it to. If you look at older games, at what point do you get Metal Gear Solid and Ocarina of Time running at 60fps? When they're just flat polygons? Even if they're mechanically identical, are they still the same game at that point? There might not be a point of compromise you can reach before the developer's vision is lost. There's only so much you can get out of a PS1 or N64, in the same way there's only so much you can get out of a Switch or a base Xbox One or PS4. To take BotW as a more recent example, can you just dial back the graphics, or is there a bottleneck in the physics engine? What else needs to be dialed back - if you compromise the physics, the world size, the graphics, is there a point where it's not the game the developers wanted to make anymore? I suspect most games can be dialed back to run at 60fps, but what about a game like No Man's Sky, which is really pushing procedural generation beyond anything done before? Again, none of us actually know.

 

And if 30fps is unacceptable, then what? You just don't make games for lower-powered hardware? That's the hardware that exists. That's what I'm talking about.

19 minutes ago, matt0 said:

 

I might be arguing some hyper specific tangent that's only of interest to myself but... I was following on from my earlier post about older games […]

And if 30fps is unacceptable then what? You just don't make games for lower-powered hardware? That's the hardware that exists. That's what I'm talking about.

Yeah, this is exactly it, and Breath of the Wild is quite a good example to use because you can tell the decision to go with 30fps went on to affect how they designed the game; from there it's a balancing act between ambition and performance until it's finished.
 

The only way every game could run at 60fps is if they all ran on PC, and even then you have a user base made up of all different kinds of set-ups and you'd struggle - in fact PC is the worst for that, as it's all about ever more powerful hardware and more demanding graphics settings rather than getting the best optimisation out of (relatively) fixed hardware, as on console.

52 minutes ago, Moz said:

 

Nah I've just bought a PC to replace one which was over 8 years old and completely knackered. […] I said people who claim they can't tell the difference between very different framerates are either faking to prove a point, justifying purchases to themselves, or their head goo doesn't work. […] I have always argued that the absolute minimum baseline for performance should be 60fps because lower framerates look like a flipbook and verge on an accessibility issue.

 

Fair enough on the head injury thing, I hadn't fully grasped the point you were making, apologies.

 

But an accessibility issue? The baseline? Did you think Doom in 1993 at 35fps looked like a flipbook? Do you think Metal Slug does? If it's a genuine accessibility issue, how come Diablo 2 isn't a long-forgotten curio instead of one of the most fondly remembered games of all time, and why were people fine with PS1s and N64s? Most people are fine with games that run lower than 60fps. They were in the 90s. They are now. Standards are higher now, frame rates across the board are higher now, and that's great. But the idea that anything less than 60fps is unacceptable? That's just your personal taste.

 

Nobody is the lowest common denominator gaming equivalent of "a Coldplay listening Tory voter" because they're okay with a lower frame rate. Everyone telling you 30fps was "fine"? That's because it was fine, for them.

 

I can't stand mouse aiming at less than 60fps, but I can play the same game happily enough on console with a pad. It's vaguely weird but it's just me personally. It's not a baseline for anyone else who doesn't have that hang-up.


2D games get away with it much more easily than 3D games. Doom's OK at 35fps but Quake 1 at 30fps is horrible, for example. It depends on the control method too; anything which involves a first or third person camera, a cursor or a crosshair is infinitely worse.

 

There are some useful things devs can do to ameliorate the problem. Naughty Dog are very good at tuning their motion blur and aiming just right so that 30fps isn't so much of an issue, but it's still subpar. And it's taken a long time to get even to this point, having suffered through multiple generations of terrible post-processing effects which only exist to hide a low framerate.

 

The fact remains that people have different tolerances and baselines for these things. Research shows that fighter pilots can identify an aircraft shown on screen for 1/220th of a second, whereas only the very extremes of society (the elderly, and people with genuine visual impairment or the aforementioned brain injuries) truly can't perceive framerates above 24fps. If people are telling you that sub-60fps is inadequate for them as a baseline and causes visual discomfort to the point that some games become unplayable, why not believe them? What's the difference between that and replying to every thread about the lack of colourblind options saying "it works fine for me, I can see exactly what I'm shooting at, stop being a usability snob, stop being a colour elitist"? It's incredibly tedious because the argument many people make - that 30fps is too low for comfort - has never changed once in literally decades. And yet in response you see the same cycle of people claiming it's completely fine, that anyone who says otherwise is elitist and not a true gamer, and that they can't tell the difference anyway. Then they buy the remastered version of the game on the next console and claim it's brilliant and how did they ever live without it. It gets tied into console and platform wars bullshit, and I've never given a shit about that because I'm dedicated and agnostic enough to buy all the platforms eventually, if not immediately.

 

The same goes for VR. The Virtual Boy failed because a) it was dogshit and b) it made everyone sick. We now know, thanks to lots of research, that VR needs to be a good 70Hz bare minimum to avoid making people feel queasy. But people have different tolerances to that too. And yet we work on the assumption that making users feel sick and annoyed isn't a good idea and set a baseline accordingly. But you can guarantee that if the Virtual Boy came out today, the usual suspects would be lining up to say it's brilliant even though it's awful, because they're having so much fun and they like the taste of sick anyway. In fact I imagine people probably did exactly that when they played BotW with the Labo!

 

It's probably completely fair to say we've never been able to make 60fps a baseline on all platforms, for various technical or commercial reasons. And many amazing games which also happened to have unfortunate framerates wouldn't have been possible without waiting a while for the hardware to catch up with the vision of the creator - which, incidentally, some developers do. But those reasons aren't good enough any more. The mainstream is moving toward 120Hz and beyond, and 60fps+ is now a back-of-the-box selling point, especially with backward compatibility. Microsoft and Sony are both doing a brilliant job in that regard. All I'm saying is, let's keep it that way. Forever. At the very, very least, exclusives should wherever possible have an unlocked framerate option so they can be easily ported to future generations and benefit from the increased horsepower. Aside from a few really old games and oddities, that's how the PC works. We've proven that early adopters and hobbyists are willing to put up with increased complexity on console, and that Coldplay Tories can at least tolerate it and even understand it when it comes to choosing between platforms or SKUs. So let's keep taking the good things that PC does and the good things that consoles do and combine them. Microsoft seem to finally get it after a long time out in the wilderness, so I have hope.


On a recent Digital Foundry video they made the point that with the advent of 120Hz displays, there might be an option this gen for developers to target 40fps as a compromise. I wonder whether this would just create too many options, but it was certainly an interesting idea. 
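For anyone wondering why that 40fps mode needs a 120Hz display, the frame-time arithmetic works out neatly. A rough Python sketch (illustrative numbers only):

```python
# Frame-time arithmetic behind the 40fps-on-120Hz idea. For a smooth
# cadence each frame has to occupy a whole number of refresh cycles,
# and 40fps divides evenly into 120Hz but not into 60Hz.
def frame_time_ms(fps: int) -> float:
    return 1000 / fps

def refreshes_per_frame(refresh_hz: int, fps: int) -> float:
    return refresh_hz / fps

for fps in (30, 40, 60):
    print(f"{fps}fps: {frame_time_ms(fps):.1f}ms per frame, "
          f"{refreshes_per_frame(120, fps):g} refreshes at 120Hz, "
          f"{refreshes_per_frame(60, fps):g} at 60Hz")
```

In frame-time terms 40fps (25.0ms) sits exactly halfway between 30fps (33.3ms) and 60fps (16.7ms), and it maps to a clean 3 refreshes per frame at 120Hz but an impossible 1.5 at 60Hz, which is why it was never an option on 60Hz sets.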

2 hours ago, TehStu said:

Anecdotally, we're keen on a Switch Pro in this house not because BotW, which still looks stunning every time I see the kids fire it up, or Hyrule Warriors getting janky, but because our island in ACNH is complex enough to occasionally drop frames. […]

I should note that we've only got the demo of Hyrule Warriors, but the kids have played through several times and never once complained about the dip in frames […]

In my case it's because I have a Lite (which I've been delighted with until now), but I wouldn't mind playing MHR on the TV. And I don't see the point of the current full-size Switch for me.

 

Hyrule Warriors AoC definitely pushes the system beyond its limits; I don't mind it that much though.

58 minutes ago, Moz said:

The fact remains that people have different tolerances and baselines for these things.

 

At the risk of dragging this out again - which I don't want to do, because I agree with most of your last post - your earlier posts came across like you were saying the exact opposite of this, especially the weird Coldplay/Tories stuff.

 

I jumped to conclusions and went off on one with the PC elitism stuff, but that's frequently been the angle people have come from on here when arguing about frame rates.

 

I'm not 100% sure how I feel about multi-tiered console generations, to be honest. I've always thought one of the strengths of consoles was the fixed spec, but with different display rates and resolutions to support even within one console SKU, it's not a fixed experience anymore.

1 hour ago, Moz said:

2D games get away with it much more easily than 3D games. […] If people are telling you that sub-60fps is inadequate for them as a baseline and causes visual discomfort to the point that some games become unplayable, why not believe them? […]

The same goes for VR. The Virtual Boy failed because a) it was dogshit and b) it made everyone sick. […] The mainstream is moving toward 120Hz and beyond, and 60fps+ is now a back-of-the-box selling point […]

Being unable to tolerate a lower frame rate than 60fps is not a visual impairment.
 

And you’re right, the Virtual Boy failed. But some of the most successful videogames and videogame consoles of all time ran at much less than 60fps, if you’re using that metric, so clearly they are fine.
 

60fps is always preferable, because it’s demonstrably better, and hopefully with hardware and display advances it will become more common now; it’s certainly headed in the right direction. But there are still plenty of excellent games that play just fine at 30fps.
 


The Last Guardian. Objectively a great game, apparently, but subjectively a game with an appalling variable frame rate on base PS4.

Cyberpunk 2077. A great game for those of us able to play it as intended, crippled by a fucked framerate on base PS4.

 

“30fps is fine” inverse snobbery forgets that 30fps often drops well below 30fps, which is not a good experience for anyone.


I have been playing games continuously since the Commodore 64. I can remember 2D machines slowing down or flickering because they hit their sprite limit. I played and loved N64 games that occasionally hit single digit framerates. 

 

As such, I have a wide tolerance for this stuff. I'd not really fancy going back to the worst of the N64, particularly on modern TVs, but while I can notice framerates it just doesn't bother me. Stutters or poor frame pacing can be annoying, but I played through the Hyrule Warriors demo and Sekiro and didn't really care.

 

Second, games have a feel linked to the framerate - how fast characters run and so on. You speed it up and it doesn't always work. I knocked maybe 300 hours into Dark Souls and the remaster felt just wrong for ages and ages. And as noted above, Perfect Dark is way more twitchy than intended on Xbox.

 

Is the game fun? I can get over it. 

3 minutes ago, footle said:

The Last Guardian. Objectively a great game, apparently, but subjectively a game with an appalling variable frame rate on base PS4.

Cyberpunk 2077. A great game for those of us able to play it as intended, crippled by a fucked framerate on base PS4.

 

“30fps is fine” inverse snobbery forgets that 30fps often drops well below 30fps, which is not a good experience for anyone.

Yeah there’s a big difference between a stable 30fps and an unstable frame rate. 


 

12 minutes ago, kensei said:

Second, games have a feel linked to the framerate. How fast characters run etc. You speed it up and it doesn't always work. I knocked maybe 300 hours into Dark Souls and the remaster felt just wrong for ages and ages. And as noted above, Perfect Dark is way more twitchy than intended on Xbox. 

 

I secretly love that shit. Dark Souls 2 had that brilliant bug where things like projectiles, jumping and equipment durability were linked to framerate. I kept saying the stupid little urns which shoot corrosive goo at you were doing an insane amount of damage, and that it took about three weapons' worth of durability to get past them. I thought I was going completely insane until the bugs came to light. I also played the whole of Vanquish at 144fps, meaning I was taking more than double damage. On hard. The power of mouse aim! I did notice the difficulty spike and thought it was deliberate, to compensate for the improved aim. More games should do that.
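A classic cause of bugs like these is updating game state by a fixed amount per rendered frame instead of scaling by the frame's duration (delta time), so doubling the framerate doubles the effect. A minimal sketch of the difference, with made-up numbers rather than anything from the actual games:

```python
def drain_per_frame(durability, frames):
    # Frame-rate-dependent: a fixed cost on every rendered frame, so
    # doubling the framerate doubles the drain over the same real time.
    for _ in range(frames):
        durability -= 1.0
    return durability

def drain_per_second(durability, seconds, fps):
    # Frame-rate-independent: scale the per-frame cost by delta time,
    # so the total drain depends only on elapsed real time, not fps.
    dt = 1.0 / fps
    for _ in range(int(seconds * fps)):
        durability -= 30.0 * dt  # 30 durability per second at any fps
    return durability

# One second of corrosive-goo contact at 30fps vs 60fps:
print(drain_per_frame(100.0, 30))        # 70.0
print(drain_per_frame(100.0, 60))        # 40.0 - double the damage
print(drain_per_second(100.0, 1.0, 30))  # 70.0
print(drain_per_second(100.0, 1.0, 60))  # 70.0 - same at any framerate
```

The per-frame version is exactly the Dark Souls 2 / Vanquish failure mode: perfectly invisible on the framerate the developers tested at, and silently wrong everywhere else.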

12 minutes ago, Stanley said:

Being unable to tolerate a lower frame rate than 60fps is not a visual impairment.

 

Sure it is: there are research papers out there detailing techniques for altering framerate on the fly to help people see content more clearly. That's essentially what motion smoothing is once you strip away the marketing: interpolated frames to remove headache-inducing panning judder. I also wear glasses and had multiple brain injuries from a car accident when I was 11, so maybe low framerates give me more trouble than the average person. It's more likely down to "training", however: just like fighter pilots can train themselves to identify an image shown for 1/255th of a second, I played a LOT of first-person shooters at high framerates growing up, and switching to games with low or unstable framerates is so noticeable that it's hard to adjust. Impossible in some cases. I tried the Switch version of Apex Legends the other day and was literally laughing out loud for a couple of minutes trying to complete the tutorial before I gave up. That's some crazy nonsense.


You also said you played mainly on 360 and PS4 until recently, so clearly your “impairment” wasn’t much of an issue during that time. Look, I get what you’re saying, but describing something you don’t like as an impairment is kind of insulting to people who actually suffer from real problems with their sight.

23 minutes ago, Stanley said:

Yeah there’s a big difference between a stable 30fps and an unstable frame rate. 

I suppose 30 fps becoming unstable is a greater issue than 60. I do wish we'd do away with hyperbole like "inverse snobbery", though.

Just now, TehStu said:

I suppose 30 fps becoming unstable is a greater issue than 60. I do wish we'd do away with hyperbole like "inverse snobbery", though.

Yes, me too. Obviously I would always prefer a stable 60fps (an unstable 60 can be just as egregious), but a nice stable 30fps is fine, depending on the game of course. I played Dishonored 2 recently on Series X, and it’s only 30fps, albeit rock solid, and I thought it looked and played great; yet there were people calling it an unplayable mess. So clearly people do have varying tolerances, but it’s not inverse snobbery.

