
mushashi

Members
  • Content Count: 15,760
  • Joined

Profile Information

  • Gender
    Not Telling

Recent Profile Visitors

8,753 profile views
  1. Yeah, I think that was the first time I became aware of them, though they were active before the first 3D consoles and the founders are ex-game devs. Seems they did dev tools for a whole bunch of platforms when they were independent:
       • Sony PlayStation (PSY-Q -> SDevTC)
       • SEGA Saturn (PSY-Q)
       • SEGA 32X
       • SEGA MegaDrive/Genesis (SNASM68k)
       • SEGA Mega-CD (SNASM2)
       • Super Nintendo Entertainment System (PSY-Q)
       • Commodore Amiga 1200 and 600
       • Williams Phoenix Arcade Board
       • Nintendo 64 (SN64 -> ProDG)
     There was at least one other company competing …
  2. It may be a silly notion, but I've heard some of the vocal partisan gaming podcasters say exactly that, they ain't putting up with 30fps console games anymore. I think they'll have a problem with that stance once the generation arrives for real, rather than the transition period we are currently in, where last-gen assets leave enough headroom to enable 60fps. Or they'll have to put up with reconstructed-resolution games being the norm, as I expect you'll only get 2 out of 3 from the Holy Trinity the majority of the time: Graphics / Rendering Resolution / Frame Rate. PC gamers are already …
  3. Do you use an external framerate limiter? The 3 options are Special K, RTSS and then the NV control panel, ranked in order of how consistent their frame pacing is; the downside with Special K is that it's obtuse to use, but it's near perfect in terms of frame pacing. The built-in limiters in most games should in theory be best, with the lowest added input latency, but most of the time they're broken. (A minimal frame-pacing sketch follows after the post list.)
  4. The primary gaming market for JRPGs is considered to be of a similar age to the characters in most JRPGs, who else has the fucking time in Japan to play such bloated games? Ryu ga Gotoku, even in its 'original' form, is essentially an off-shoot/evolution of the JRPG genre; they just replaced the turn-based battle system with a real-time one. It still features random encounters, levelling and item management.
  5. It's because, for whatever reason, game developers can't/don't bother ensuring the 30fps modes on their PC ports are actually properly functional (if they exist at all; sometimes a 40Hz mode is the lowest supported frame cap), but then again, they often don't do the same for their 60Hz frame rate caps either. At least on PC, you can do it for them 99.9% of the time. Also, once you've used enough of them, you begin to realise that monitors don't all actually have the exact same refresh rate; they differ slightly, which makes things more complicated if you are after … (Some worked numbers on refresh-rate drift follow after the post list.)
  6. Their engine uses a built-in DLSS alternative, which explains how they freed up the resources to get a full RTGI solution running; it's not native 4K/60fps. The consoles still don't have enough grunt to do GI and reflections together. The really interesting test will be RDNA2 vs Ampere on PC. On consoles they have a custom RT solution, while on PC it has to go via DX12U. Their CTO is great, a no-bullshit person, so I'd be confident they can deliver on XSX and PS5 what he claims they can. He says a bunch of interesting things in this interview: htt…
  7. Game devs already have access to way more detailed per-frame performance metric data as-is. That kind of reporting has in the past been responsible for performance gains on consoles over time, as they use it for working out optimisations. There used to be independent companies solely making those kinds of tools for consoles, at least as far back as the 1990s: https://www.snsystems.com/ DF making this stuff more mainstream raises the technical quality bar for console games, and you could argue it partly fuels the ever-increasing demands too, but ignorant, technically ill… (A sketch of that kind of per-frame metric follows after the post list.)
  8. Apparently this was Edgar Wright's top choice for films he'd like to remake, so at least he should be committed to doing it if it's a passion project: https://www.yahoo.com/entertainment/edgar-wright-direct-stephen-king-200301898.html?guccounter=1 Though the article ends with a list of other projects he also seems keen to do, so it might not come out that soon if he has a backlog of other work to clear first.
  9. If length were correlated with great character development, daily soap operas would have the deepest characters ever created. US TV series length has been dropping over time, and was primarily commercially driven by syndication, nothing to do with artistic requirements. Few other countries do 22+ episodes per run. An interesting article on the trend in US TV show episode counts and why fewer is better: https://www.vulture.com/2015/06/10-episodes-is-the-new-13-was-the-new-22.html
  10. I think they aren't lying; cost reductions have been getting worse every generation. The PS1 and PS2 could be cost-reduced down to $100 or less, the PS3 never came close to that, and the PS4 hasn't managed to get down to where the PS3 did either. I think the current boxes will continue that upward trend in minimum pricing. This would mean any cost-reduced smaller XSX, which would have been the XSS equivalent, would only be commercially viable a very long time from now, so they decided to get a weaker one out at launch instead, rather than a cheaper same-power one much …
  11. Moore's observation back in the 1960s was that transistor density at the same cost was doubling every year; he later had to double that time interval to every 2 years. I think the evidence isn't particularly strong that companies can currently offer twice the transistor density for a similar price every 2 years. If you think there is a strong link between increased transistor counts and performance increases, then a 7-year gap between the consoles should have resulted in an 8X+ leap in performance for a similar chip cost. Using the widely marketed TFLOPs measurement, the PS4 was at 1.8 TFLOPs, so Moore's Law … (The arithmetic is worked through after the post list.)
  12. Crytek can be forgiven for the direction they took; CryEngine was developed in an era when Intel was promising 10GHz CPUs and people like Gabe Newell were sceptical of multicore CPUs. The only reason CPUs had to go multicore is the death of Dennard scaling, which broke down around 2006, meaning increasing clock speeds could no longer be easily achieved. It's taken over a decade for the problems mentioned by Gabe Newell to be solved, and even then core utilisation still isn't anywhere near perfect. (The power relation behind Dennard scaling follows after the post list.)
  13. The Xbox Series S was launched now because Microsoft's engineering team believe Moore's Law is defunct. The X1X relied on it to deliver a performance increase (a similar-size chip to the X1 that could cram in more transistors because of Moore's Law). With it slowing to a glacial pace and the cost of manufacturing going up, it'll take much longer now to reach the situation that allowed Microsoft to deliver the same performance leap without having to cut their own throat. It took Sony 3 years to roughly double the performance of the PS4, which for some people was considered …
  14. Even with a DLSS equivalent, the consoles can't match Nvidia, as AMD's RT hardware acceleration is significantly weaker. Control on console already has to resort to a DLSS equivalent to run acceptably; its RT is checkerboarded and reduced in scope and features, while PC is native-resolution RT with more RT effects. The Medium has similar RT cutbacks versus Nvidia on PC. Gran Turismo 7 will make use of the same trick to reach acceptable performance. (A toy checkerboard-reconstruction sketch follows after the post list.)
  15. xCloud uses customised Xbox One S-based server blades, so only games that are native to the X1S work on it. They are promising upgrades to X1X-based blades at some point.
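On the external frame limiters in post 3: a minimal sketch of the sleep-then-spin pacing loop that such limiters are generally built around. This is not how Special K, RTSS or the NV limiter are actually implemented; the `render_frame` callback, the 60fps cap and the 2ms spin margin are illustrative assumptions.

```python
import time

def run_frame_limited(render_frame, fps_cap=60.0, n_frames=600, spin_margin=0.002):
    """Sleep-then-spin frame limiter: sleep away most of each frame's
    budget (cheap but imprecise), then spin-wait the last ~2ms
    (precise but burns CPU) to hit each deadline exactly."""
    frame_budget = 1.0 / fps_cap
    next_deadline = time.perf_counter() + frame_budget
    for _ in range(n_frames):
        render_frame()
        remaining = next_deadline - time.perf_counter()
        if remaining > spin_margin:
            # OS sleeps are only ~1ms accurate, so stop short of the deadline
            time.sleep(remaining - spin_margin)
        while time.perf_counter() < next_deadline:
            pass  # spin-wait the final stretch for tight frame pacing
        # Advance from the previous deadline, not from "now": one late
        # frame then doesn't push every later frame off the fixed grid.
        next_deadline += frame_budget

run_frame_limited(lambda: None, n_frames=60)  # ~1 second demo at 60fps
```

Scheduling each deadline from the previous one rather than from the current time is what keeps pacing on a fixed grid; that consistency is the quality the post ranks the three limiters on.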
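Some worked numbers for the refresh-rate point in post 5, assuming a strict 60fps cap against a panel that actually refreshes at 59.94Hz (a common NTSC-derived rate, used here purely as an example):

```python
# A "60Hz" panel that really runs at 59.94Hz drifts against a strict
# 60fps frame cap; these rates are typical examples, not measurements
# of any particular monitor.
nominal_hz = 60.0
actual_hz = 59.94

frame_cap_ms = 1000.0 / nominal_hz   # 16.667 ms per capped frame
refresh_ms   = 1000.0 / actual_hz    # 16.683 ms per panel refresh

surplus_per_frame_ms = refresh_ms - frame_cap_ms
frames_until_stutter = frame_cap_ms / surplus_per_frame_ms

print(f"cap frame time:     {frame_cap_ms:.3f} ms")
print(f"panel refresh time: {refresh_ms:.3f} ms")
# ~0.017 ms of error accumulates every frame; after roughly 1000
# frames (about 17 seconds at 60fps) a whole frame of drift has built
# up and you get a visible hitch unless the cap matches the real rate.
print(f"frames until a full frame of drift: {frames_until_stutter:.0f}")
```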
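For the per-frame performance metric data mentioned in post 7, a toy illustration of the kind of summary such tooling produces. This is not SN Systems' actual tooling; the "1% low" statistic and the sample trace are assumptions chosen to show why per-frame data beats a plain average.

```python
import statistics

def frame_time_report(frame_times_ms):
    """Summarise a trace of per-frame timings: average fps plus the
    '1% low' (average fps across the slowest 1% of frames), which
    exposes stutter that the overall average hides."""
    ordered = sorted(frame_times_ms, reverse=True)      # slowest first
    worst_1pct = ordered[:max(1, len(ordered) // 100)]
    avg_fps = 1000.0 / statistics.mean(frame_times_ms)
    low_fps = 1000.0 / statistics.mean(worst_1pct)
    return {"avg_fps": round(avg_fps, 1), "1%_low_fps": round(low_fps, 1)}

# A mostly 16.7ms trace with occasional 50ms spikes: the average still
# reads close to 60fps, but the 1% low gives the stutter away.
trace = [16.7] * 990 + [50.0] * 10
print(frame_time_report(trace))   # {'avg_fps': 58.7, '1%_low_fps': 20.0}
```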
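Putting numbers to post 11, taking the post's own assumptions at face value (a density doubling every 2 years translating one-for-one into performance, and the PS4's marketed 1.8 TFLOPs as the baseline):

```latex
2^{7/2} \approx 11.3\times \quad \text{(full 7-year gap)}, \qquad
2^{6/2} = 8\times \quad \text{(hence ``8X+'')}, \qquad
1.8 \times 8 = 14.4\ \text{TFLOPs}
```

So on those assumptions, a similar-cost chip seven years on should land somewhere around 14 to 20 TFLOPs, which is the yardstick the post is measuring the new consoles against.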
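The Dennard scaling point in post 12, in equation form. This is the standard dynamic-power relation (nothing specific to CryEngine), with alpha the switching activity factor, C the switched capacitance, V the supply voltage and f the clock frequency:

```latex
P_{\text{dynamic}} \propto \alpha\, C\, V^{2} f
```

While process shrinks could still lower V and C in step with transistor size, f could rise at roughly constant power density; once supply voltage stopped scaling around 2006 (largely due to leakage), any further clock increase blew the power budget, which is why the extra transistors went into more cores at modest clocks instead.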
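On the checkerboarded RT in post 14, a toy sketch of the basic checkerboard-rendering idea: shade only half the pixels each frame in a checkerboard pattern and fill the other half from the previous frame. Real implementations reproject the previous frame along motion vectors and handle occlusions; this NumPy version skips all of that, so it only holds up for a static image.

```python
import numpy as np

def checkerboard_reconstruct(fresh, prev_full, parity):
    """Combine the half of the image shaded this frame (pixels where
    (x + y) % 2 == parity) with last frame's result for the rest."""
    h, w = prev_full.shape
    ys, xs = np.mgrid[0:h, 0:w]
    rendered = ((xs + ys) % 2) == parity   # pixels shaded this frame
    out = prev_full.copy()                 # start from last frame
    out[rendered] = fresh[rendered]        # overwrite the fresh half
    return out

# Alternate parity each frame: at steady state every pixel is at most
# one frame old, for roughly half the per-frame shading cost.
prev = np.zeros((4, 4))
fresh = np.ones((4, 4))                    # stand-in for new shading
print(checkerboard_reconstruct(fresh, prev, parity=0))
```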