
I can’t do 30fps any more.


Pob


On 17/03/2021 at 17:09, Kevvy Metal said:

I basically want 60 fps for everything now :/

 

The only way you'll ever manage that is to get used to paying for and playing on a powerful PC. The consoles will never guarantee that, and nor should they; you can't square a circle sometimes.

 

I'm imagining the thread that gets created in 2030 or 2040 where people will be demanding 120+ fps on everything :) PCs are in the early stages of that, so given how long it's taken for console-only gamers to demand in greater numbers something that has existed in gaming for decades, they'll eventually get a taste for it and the expectations baseline will move up.


13 hours ago, mushashi said:

 

The only way you'll ever manage that is to get used to paying for and playing on a powerful PC. The consoles will never guarantee that, and nor should they; you can't square a circle sometimes.

 

I'm imagining the thread that gets created in 2030 or 2040 where people will be demanding 120+ fps on everything :) PCs are in the early stages of that, so given how long it's taken for console-only gamers to demand in greater numbers something that has existed in gaming for decades, they'll eventually get a taste for it and the expectations baseline will move up.


Nah, the PS2 was pretty great at producing a lot of 60 fps games. Loads in fact with interlaced field-rendering techniques. 

It also helped that SD resolution never changed. 

If you design and build your game with 60 fps in mind it can be done, it’s a choice. My prior 8 years in game dev certainly showed this. 


I was looking at starting a "smoke and mirrors" thread. I'm becoming more disillusioned with modern gaming, and my recent foray into PS2 modding has me playing Ridge Racer 5 (the import version, so no borders), and then Sega Rally 2006, which I hadn't heard of until recently. It makes me think something has gone wrong somewhere. Why am I playing Need For Speed: Hot Pursuit on a PS4 and it's not 60fps? A remastered PS3 game. And yet the PS2 was doing great stuff at 60fps 21 years ago.


It's like Kevvy said in the post above yours: it's a choice the devs make. I really hope, with Microsoft making a big deal out of 60fps and it being a selling point of various remasters, that more devs will make that choice. It just looks, feels, and plays better.


4 minutes ago, dumpster said:

Why am I playing Need For Speed: Hot Pursuit on a PS4 and it's not 60fps? A remastered PS3 game. And yet the PS2 was doing great stuff at 60fps 21 years ago.

Because PS3 was targeting 1080p resolution, while PS2 was 480p.  Absolutely hhheeeeuuuuugggee difference. There's more to it than that, but that's the big one. 
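To put rough numbers on that gap, here is a back-of-the-envelope sketch. The resolutions are nominal output targets only (many games on both machines rendered below them), so treat this as an illustration rather than a measurement:

```python
# Back-of-the-envelope pixel budget comparison, PS2 era vs PS3 era.
# Nominal output resolutions only; many games rendered below these.
ps2_w, ps2_h = 640, 480      # SD (480i/480p)
ps3_w, ps3_h = 1920, 1080    # Full HD (1080p)

ps2_pixels = ps2_w * ps2_h   # 307,200 pixels per frame
ps3_pixels = ps3_w * ps3_h   # 2,073,600 pixels per frame

print(f"Pixels per frame ratio: {ps3_pixels / ps2_pixels:.2f}x")  # 6.75x

# Factor in frame rate: pixels that must be produced every second.
print(f"PS2 @ 60fps: {ps2_pixels * 60:,} px/s")   # 18,432,000
print(f"PS3 @ 30fps: {ps3_pixels * 30:,} px/s")   # 62,208,000
```

Even at half the frame rate, a 1080p target demands more than three times the pixel throughput of a 480p game running at 60fps, before you even count the rising per-pixel shading cost of newer games.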


17 minutes ago, Thor said:

Because PS3 was targeting 1080p resolution, while PS2 was 480p.  Absolutely hhheeeeuuuuugggee difference. There's more to it than that, but that's the big one. 

Agreed, but does that mean that, on a like-for-like basis, your PS2 connected to an SD CRT was effectively more powerful than a PS4, because the TV industry went HD and then 4K? I think that's an interesting issue.


On 20/03/2021 at 23:09, Kevvy Metal said:


Nah, the PS2 was pretty great at producing a lot of 60 fps games. Loads in fact with interlaced field-rendering techniques. 

It also helped that SD resolution never changed. 

If you design and build your game with 60 fps in mind it can be done, it’s a choice. My prior 8 years in game dev certainly showed this. 

 

As a percentage of the games on the system, 60Hz titles are in the minority. So yes, it's a choice, but the wider market has voted with its feet for decades that advancements in graphical frippery trump 60Hz gameplay on home consoles, which is why some of the major 60Hz PS2 games that got sequels abandoned performance to push the graphical envelope.

 

Graphical frippery sells to more people than performance does, so is this console generation finally the one where that changes for the majority? That you had to go all the way back to the year 2000 to show that major devs can do 60Hz if they want to is kind of telling: it indicates they were happy to abandon it in the pursuit of better graphics since then, as Insomniac, Naughty Dog, Criterion and KojiPro all did.

 

Rez is an interesting example from that era. It was conceived as a Dreamcast exclusive until the demise of SEGA, so they could have aimed for 60fps, but the hardware was too weak to reach the graphical quality they wanted, so they settled for 30fps instead; it didn't matter to the review scores, and it only got upgraded to 60fps in the PS2 port. The Xbox was more powerful than the PS2, yet Microsoft didn't push the games they paid for to reach 60fps. Despite their hardware power advantage, they chose higher resolutions and better graphics instead.

 

As I said, if you have to have 60Hz or better in 99.9% of games, invest in a PC; consoles won't give you that for everything like you want, as most people don't care. Even Nintendo fans don't bang on about their first-party 60fps advantage as some plus point for why the games are good. It certainly makes games play better, but it don't make shit into wine either :P


Started playing Minecraft Dungeons on the XSX, obviously nice and smooth. Later in the day I moved to the XB1 and holy shit it's so jarring dropping to 30 FPS. I mean, I soon stopped thinking about it, but I thought I was going to get motion sick at first (am prone to it).


I was watching the Digital Foundry analysis of NFS: Hot Pursuit on the new consoles earlier, and pondering the original game being 30fps. I can't imagine people accepting that kind of game being locked to 30 any more. I think things are changing.


I don't want to be a dick, but the PC lads have been saying this for... 20 years? Before we all dropped back to 60Hz for a bit to accommodate flatscreen monitors, it was common for us to play at higher framerates, which came back into vogue about ten years ago with the Catleap monitor and now virtually all gaming monitors. And finally 120Hz is part of the HDMI 2.1 spec, so we've come full circle.

60Hz/fps should have been the minimum spec from the moment it was possible on commercial TVs. It has been an incredibly aggravating experience to be told for literally two decades that playing games at 30fps or less is fine. It's bloody rubbish and you can all admit it now. There's loads of "I've been spoiled lately and now I can't go back" talk, which is great. I'm really glad the tide is turning, because 30fps or below is really not acceptable and never has been for the majority of games. Having to play utter classics like Bloodborne running like a child's janky flipbook is a damn tragedy.

 

I've got a pretty hard stance on it now. Last Of Us 2 effectively sunsetted 30fps gaming for me and I probably won't bother trying to adapt to 30fps in future. I've even experimented with the dreaded frame interpolation (motion smoothing) to make Switch games like Luigi's Mansion and Link's Awakening run at a bodged 60fps, and it worked surprisingly well on an LG CX. Certainly better than playing at 30fps, anyway.
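For anyone curious what motion smoothing is actually doing: real TVs estimate per-block motion vectors, which is far beyond a forum post, but the core idea of synthesizing an in-between frame can be illustrated (very crudely) with a plain blend of two neighbouring frames:

```python
# Naive frame interpolation ("motion smoothing" in spirit only).
# Real TVs estimate per-block motion vectors; this straight 50/50 blend
# is the simplest possible illustration of synthesizing a frame between
# two real ones to double the apparent frame rate.
def blend_frames(frame_a, frame_b, t=0.5):
    """Linearly interpolate two greyscale frames (lists of rows) at time t."""
    return [
        [(1 - t) * a + t * b for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(frame_a, frame_b)
    ]

# Two tiny 2x2 "frames": a bright pixel moving one column to the right.
frame1 = [[255, 0],
          [0,   0]]
frame2 = [[0, 255],
          [0,   0]]

# The synthetic in-between frame ghosts the pixel across both positions,
# which is exactly the smearing artefact cheap interpolation produces.
print(blend_frames(frame1, frame2))  # [[127.5, 127.5], [0.0, 0.0]]
```

That ghosting is why naive blending looks soapy; the motion-vector approach in a TV like the CX shifts content along its estimated path instead of smearing it, at the cost of added input lag.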


No, because it doesn’t fucking matter. Get over yourselves, most of the computer games I played on PS4 were 30fps, from Destiny 1 to Last of Us 2, and I had a whale of a time. They weren’t invalidated to the overwhelming majority because of some imperceptible frame rate boost. Shiny beautiful graphics is what matters to the masses, for the screenshots, for the TV spots and for the pure eye candy.


Get over it. Looking forward to the shiny graphics arms race restarting with Last of Us 3 etc and we can marvel once more at artistic merit rather than frame rate militancy.


On 22/03/2021 at 21:11, dumpster said:

I was looking at starting a "smoke and mirrors" thread. I'm becoming more disillusioned with modern gaming, and my recent foray into PS2 modding has me playing Ridge Racer 5 (the import version, so no borders), and then Sega Rally 2006, which I hadn't heard of until recently. It makes me think something has gone wrong somewhere. Why am I playing Need For Speed: Hot Pursuit on a PS4 and it's not 60fps? A remastered PS3 game. And yet the PS2 was doing great stuff at 60fps 21 years ago.

 

As soon as the main way to promote your game was through video trailers rather than screenshots in magazines or on the back of a box, the shinies took priority over frame rate. I hope that devs target 60fps this gen, as it does seem like the positive reaction to these modes in backwards compatible games will have taken some by surprise and may influence thinking going forwards.


6 hours ago, mikeyl said:

No, because it doesn’t fucking matter. Get over yourselves, most of the computer games I played on PS4 were 30fps, from Destiny 1 to Last of Us 2, and I had a whale of a time. They weren’t invalidated to the overwhelming majority because of some imperceptible frame rate boost. Shiny beautiful graphics is what matters to the masses, for the screenshots, for the TV spots and for the pure eye candy.


Get over it. Looking forward to the shiny graphics arms race restarting with Last of Us 3 etc and we can marvel once more at artistic merit rather than frame rate militancy.

That’s the spirit. Take the interactivity out of games and call it artistic merit.


1 hour ago, footle said:

That’s the spirit. Take the interactivity out of games and call it artistic merit.

It depends on the game. If Breath of the Wild had required 60fps to be playable then it wouldn't exist, or not for a good while anyway. As long as the developer fashions the game around the technical constraints, 30fps can be just as effective as 60fps.


8 hours ago, mikeyl said:

No, because it doesn’t fucking matter. Get over yourselves, most of the computer games I played on PS4 were 30fps, from Destiny 1 to Last of Us 2, and I had a whale of a time. They weren’t invalidated to the overwhelming majority because of some imperceptible frame rate boost. Shiny beautiful graphics is what matters to the masses, for the screenshots, for the TV spots and for the pure eye candy.


Get over it. Looking forward to the shiny graphics arms race restarting with Last of Us 3 etc and we can marvel once more at artistic merit rather than frame rate militancy.

 

Maybe if you've got a brain injury you'd struggle to tell the difference. If you've got no choice, you'll kid yourself it's OK, like we did when we played TLOU before the remaster came out, or with Destiny 1 before 2 came out on PC. 60fps needs to be the minimum because anything else is unacceptable. Notice I didn't say 120fps, 240fps or 360fps, the current standards on PC. It's not "militancy" to demand a baseline of acceptable performance in a consumer product, just like it's not militancy to expect basic accessibility options.


A fucking brain injury, because people who don’t give a fuck about FPS are what exactly? Say it.

 

It’s not ‘unacceptable’, it’s been fine for the majority for ages and will be again. It’s niche elitism where you’ve all convinced yourselves that good games are now shit games because they don’t meet some spoilt kid criteria.


Here's a smattering of games that were "unacceptable child's janky flipbooks" on their original release:

 

Metal Slug, Doom, Diablo, Diablo 2, Halo 3, Halo ODST, Halo Reach, PGR 1-4, Bloodborne, Demon's Souls, Shadow of The Colossus, Ocarina Of Time, Majora's Mask, Breath of the Wild, Metal Gear Solid, Final Fantasy 7.

 

Or maybe they're not "unacceptable child's janky flipbooks", maybe they're some of the best games ever, made by incredibly talented studios to run as they saw fit on the affordable hardware of the time? Would they have been better at 60fps or higher? Of course, yes (with the possible exception of Metal Slug, which I think wouldn't feel like Metal Slug anymore). Were they unacceptable? Of course they fucking weren't.

 

I'm someone who used to turn Quake 3 down to minimum settings to get 75fps on my shitty Voodoo Banshee card. I care about frame rates (and frame pacing and input lag and all of that). But the truth is, you could make a 60fps game on the PS1. You could make a 60fps game on literally every piece of gaming hardware ever released. Developers chose fidelity over frame rate over and over again and will continue to do so. Not because they were making "unacceptable" games, but because it doesn't matter to most people.


I was all excited for ray tracing and all the associated bells and whistles, to the point that I thought I'd always pick the quality mode over the performance mode, because it's got to look better, right?

I was wrong. 60fps looks better to me now; well, it certainly runs better.

On PC, 1440p @ 60 is now my sweet spot too. Cyberpunk might look a bit better with all the RT stuff on, but it just runs so much better without it.


Just now, Opinionated Ham Scarecrow said:

Shadow of the Colossus was pretty unplayable though. Don't get me wrong, lovely game but boy did it run like ass and I knew that before I really knew what FPS was.

 

It was rough, but widely regarded as a masterpiece at the time. Unplayable is a big exaggeration.


It's really not niche. NES, SNES, Mega Drive, Master System etc. games output at 60Hz (or 50Hz PAL). CRT PC monitors usually ran at anything from 72Hz to 120Hz, although some games were capped at lower framerates to suit their netcode or CPU/engine timings: Doom was capped at 35fps, Quake at 72fps. When flatscreens came out they were mostly capped at 60Hz.

 

Sub-30fps has certainly always been the standard in film, and that's now becoming an issue for content with no natural interpolation between frames, where panning shots create too much judder on modern screens. That's why dejudder and deblur options exist.

 

The difference is, 3D games came along with wildly higher CPU/GPU requirements, and developers had to figure out the lowest technically playable framerate. Ocarina of Time running at 20fps wasn't an artistic choice like shooting in black and white or in 4:3; it was the absolute minimum viable product they could get away with.
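The trade-off developers were making is easiest to see as a per-frame time budget; this is plain arithmetic, nothing engine-specific:

```python
# Per-frame time budget at common frame rate targets.
# A game targeting N fps has 1000/N milliseconds to simulate and
# render each frame before it misses its refresh window.
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available per frame at the given frame rate."""
    return 1000.0 / fps

for fps in (20, 30, 60, 120):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):6.2f} ms per frame")

# Dropping from 60fps to 30fps buys roughly 16.7ms of extra budget per
# frame, which is what developers historically spent on geometry,
# lighting and resolution instead.
extra = frame_budget_ms(30) - frame_budget_ms(60)
print(f"Extra budget at 30fps vs 60fps: {extra:.2f} ms")
```

Seen this way, Ocarina's 20fps is a 50ms budget per frame, three times what a 60fps game gets, which is exactly why the minimum viable framerate was so tempting on weak hardware.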

 

9 minutes ago, matt0 said:

Here's a smattering of games that were "unacceptable child's janky flipbooks" on their original release:

 

Metal Slug, Doom, Diablo, Diablo 2, Halo 3, Halo ODST, Halo Reach, PGR 1-4, Bloodborne, Demon's Souls, Shadow of The Colossus, Ocarina Of Time, Majora's Mask, Breath of the Wild, Metal Gear Solid, Final Fantasy 7.

 

Or maybe they're not "unacceptable child's janky flipbooks", maybe they're some of the best games ever, made by incredibly talented studios to run as they saw fit on the affordable hardware of the time? Would they have been better at 60fps or higher? Of course, yes (with the possible exception of Metal Slug, which I think wouldn't feel like Metal Slug anymore). Were they unacceptable? Of course they fucking weren't.

 

I'm someone who used to turn Quake 3 down to minimum settings to get 75fps on my shitty Voodoo Banshee card. I care about frame rates (and frame pacing and input lag and all of that). But the truth is, you could make a 60fps game on the PS1. You could make a 60fps game on literally every piece of gaming hardware ever released. Developers chose fidelity over frame rate over and over again and will continue to do so. Not because they were making "unacceptable" games, but because it doesn't matter to most people.

 

I still fondly remember several of my mates getting together to each download a floppy-sized RAR chunk of the gigantic 25MB Ocarina of time ROM, bringing them to school, reassembling the RAR file, extracting the ROM, then burning it to five then-extremely expensive CDs in the library, just so we could emulate it on UltraHLE instead of playing it at 20fps. 

 

All of the games you've mentioned suffer greatly for their low framerates. Some are borderline unplayable and others are extremely ugly as a result. Most of the console games mentioned I played on PC though, but the rest I suffered through with a heavy heart that they ran so terribly. Bloodborne is still the tragic sore thumb in the list because it hasn't been remade, ported or emulated. Not only does it have an awful framerate but it has awful frametiming too. It's probably already been posted but I still can't believe the astonishing work done to get it running at an acceptable framerate on modded consoles recently.

Yeah, people are quick to rewrite history. At the time of its release, Colossus was regarded as a next-generation experience before next gen had arrived (the 360 and PS3 were just around the corner, if you remember). Obviously with hindsight it's easy to assess it by more modern standards, but at the time there was nothing else quite like it on such a scale; it was really quite impressive.


3 minutes ago, Moz said:

I'm actually looking at 10+ year old threads on GameFAQs right now where people argue the GameSpot and IGN reviews were wrong to mention the low framerate, because they've completed it 11 times and the framerate is perfect. Not much changes, does it?

It's a question of ambition: should Mario 64, Ocarina, Colossus, Breath of the Wild etc. not have been released until it was possible to make them run at 60fps on affordable hardware?


They're all easily emulatable so not really a problem. Holding software hostage on shitty hardware is the issue. See Bloodborne, or BOTW2 no doubt. It's not a question of ambition, it's a question of acceptable baseline. Normal people listen to Coldplay and vote for the Tories while the rest of us have to live in their mad collective reality unfortunately.


27 minutes ago, Moz said:

 

I still fondly remember several of my mates getting together to each download a floppy-sized RAR chunk of the gigantic 25MB Ocarina of time ROM, bringing them to school, reassembling the RAR file, extracting the ROM, then burning it to five then-extremely expensive CDs in the library, just so we could emulate it on UltraHLE instead of playing it at 20fps. 

 

All of the games you've mentioned suffer greatly for their low framerates. Some are borderline unplayable and others are extremely ugly as a result. Most of the console games mentioned I played on PC though, but the rest I suffered through with a heavy heart that they ran so terribly. Bloodborne is still the tragic sore thumb in the list because it hasn't been remade, ported or emulated. Not only does it have an awful framerate but it has awful frametiming too. It's probably already been posted but I still can't believe the astonishing work done to get it running at an acceptable framerate on modded consoles recently.

How much more did the PC you emulated Ocarina Of Time on cost than an N64 though?

 

I might have you mixed up with another poster, but aren't you someone who usually has a top of the range GPU in a mahoosive PC setup at any given time? You're basically saying an experience on cheaper hardware is unacceptable compared to your experience on top of the range hardware.

 

I've flip-flopped between PC and console gaming over the years. There have been times when I've been ahead of the curve (having even a mid-budget gaming PC in the late 90s felt like being in another universe compared to the twilight years of the PS1), somewhere in the middle of the curve (Amiga 1200 in 1992) and right at the bottom of the curve (Amiga 1200 in 1996). I kept up with the console generations from the Dreamcast onwards, but only joined the outgoing gen in 2018 when I got an Xbox One S for Game Pass. I currently have a reasonably hefty gaming PC, but one that will need a GPU upgrade in the next year or so (RX 5700 XT at the moment).

 

Maybe if I'd always had top-of-the-range hardware I'd see things differently, but the thing is, Ocarina of Time wasn't borderline unplayable when it came out, although it might be a struggle to adjust to it now; coming off the back of ST/Amiga 3D games or Star Fox on the SNES, it was a massive step up. And the 3DS version isn't unacceptable now because it's 30fps; it's a tidy little remaster that plays fine. It sounds like you've had a rarefied experience of gaming, a better experience, but one that cost far more, and you're telling people who haven't had that rarefied experience that the experiences they've had are unacceptable and in some cases that they must have a "brain injury".

 

It's pretty ridiculous.

