
Is Ray Tracing all that?



19 hours ago, mushashi said:

 

The Xbox Series S was launched now because Microsoft's engineering team believe Moore's Law is defunct.

 

 

 

Claims the marketing department, who want the S to seem like a sound long-term investment.

 

I totally think we have diminishing returns, but MS would give that answer, regardless of the truth of their plans.

Link to post
Share on other sites

Video showing various games with RT. A game I’ve never heard of before called Into the Light actually uses RT for gameplay purposes, so it cannot be played without it.

 

 

54 minutes ago, LaveDisco said:

 

Claims the marketing department, who want the S to seem like a sound long-term investment.

 

I totally think we have diminishing returns, but MS would give that answer, regardless of the truth of their plans.

 

TBF I don't think the marketing department have pushed or even claimed that. 

 

Their sales pitch is that it's a cheap option for those on HD screens who want a good-enough next-gen experience.

1 hour ago, LaveDisco said:

 

Claims the marketing department, who want the S to seem like a sound long-term investment.

 

I totally think we have diminishing returns, but MS would give that answer, regardless of the truth of their plans.

So the S isn't just an X at a lower resolution? That's marketing fluff and not something demonstrated by tedious DF videos? 

5 hours ago, Broker said:

Genuinely thought the Control video and people doing posts about how amazing it is were joking for a second. It makes me sad that hardware improvements have tailed off so badly that this is what we’ve got to get excited about now: slightly better reflections. It’s an absolutely pitiful improvement in image quality for the insane cost in hardware usage.

 

I realise that you are too cool for triple-A games, but this is silly hyperbole.

 

5 hours ago, Broker said:

I think the Minecraft option is hilarious. It’s a bit better looking but renders the game completely unplayable as you’re destroying your draw distance and removing the ability to actually distinguish between the blocks. It’s a fancy way for people with a very expensive new graphics card to demonstrate how useful their graphics card would be if it was 1999 and they didn’t actually want to play their games. 

 

Yes, it is 'a bit better looking', but that is the point. It is a tech demo.

 

The rest is gibberish.

 

 

Insert oldmanshoutingatclouds.gif...


This has to be the most underwhelming "next-gen" graphical advance since Perfect Dark Zero made every surface look like it was wrapped in clingfilm. Under no circumstances am I halving my framerate for reflective windows and puddles that look like mirrors. Turning all that shit off first chance I get. When even Digital Foundry have to do a 400% zoom on a coffee pot to show off the amazing differences, I'm fairly sure this isn't something that's really going to have a material impact on my gaming experience.

 

I understand that this tech has potential when it's used for things like global illumination (the Metro Exodus section in that video up there actually looks pretty good) and even more so when it becomes the default method of lighting a scene, but we're years away from that on console still. Puddles and windows for us plebs until at least the mid-gen refresh. For now all that stuff will have to remain the domain of the kind of PC gamer who likes to talk about their "rig".

3 hours ago, footle said:

RISC involves throwing out half the software stack

 

For clarity, that was a general point about keeping up with Moore's law; I wasn't suggesting they switch the PS5 or Series X over to a completely different architecture as part of a mid-gen refresh.

 

I take your point about an increase in cost, but I think you'll be able to put out an approx 20 TF machine in 3-4 years for about the launch price at the start of this gen. Even if the silicon costs a bit more, stuff like the SSD will likely cost much less.

 

6 minutes ago, Garwoofoo said:

This has to be the most underwhelming "next-gen" graphical advance since Perfect Dark Zero made every surface look like it was wrapped in clingfilm. Under no circumstances am I halving my framerate for reflective windows and puddles that look like mirrors. Turning all that shit off first chance I get.

 

It may well be underwhelming on consoles at present, but let's not forget that it is very early in the generation and that consoles are having to force the tech in a way that Nvidia-equipped PCs are not. So, to be fair, they are doing pretty well (from what I've seen on PS5 at least). If surfaces 'look like mirrors' then that is as much poor implementation as anything, but it's not really surprising that a new technology is abused as devs try to shoehorn in the most screamingly obvious implementation. The real magic is in the subtlety of light behaving like light, and the take-it-for-granted sense of realism that imparts to an environment.

 

 

6 minutes ago, Garwoofoo said:

For now all that stuff will have to remain the domain of the kind of PC gamer who likes to talk about their "rig".

 

:facepalm:

 

 

 

16 minutes ago, Garwoofoo said:

This has to be the most underwhelming "next-gen" graphical advance since Perfect Dark Zero made every surface look like it was wrapped in clingfilm. Under no circumstances am I halving my framerate for reflective windows and puddles that look like mirrors. Turning all that shit off first chance I get. When even Digital Foundry have to do a 400% zoom on a coffee pot to show off the amazing differences, I'm fairly sure this isn't something that's really going to have a material impact on my gaming experience.

 

I understand that this tech has potential when it's used for things like global illumination (the Metro Exodus section in that video up there actually looks pretty good) and even more so when it becomes the default method of lighting a scene, but we're years away from that on console still. Puddles and windows for us plebs until at least the mid-gen refresh. For now all that stuff will have to remain the domain of the kind of PC gamer who likes to talk about their "rig".


This is quickly turning into console vs PC. 

1 minute ago, simms said:


This is quickly turning into console vs PC. 

:) Yep. Although there isn't much difference between the two these days to be fair. Well, except price perhaps. In the end, if I'd spent what a decent PC + GPU costs these days, I'd expect things to be incredibly shiny considering the investment I'd made. Although the games will still be targeted at the lowest common denominator, which will be the consoles. Part of the reason I'm kind of happy I went with a Series X over a PC upgrade in the end: a third of the cost, and at least a decent upgrade over my current PC in terms of future games. Saying that, my recent experience with The Medium showed me that my current PC is more than up to the job of giving a great experience. I didn't notice that much difference between the two, and it ran smoothly enough on my PC.


Saying all that, I'm more interested in them using the new power available to bring new game ideas and experiences to the table, rather than shiny graphics with some whizzy lighting effects. Reminds me a bit of lens flare, which was overused back in the PS1 days.

2 hours ago, TehStu said:

So the S isn't just an X at a lower resolution? That's marketing fluff and not something demonstrated by tedious DF videos? 

 

Don’t get me wrong, I like the S. It’s great. I have one. I don’t know what MS will do. But downplaying mid-gen refreshes is something each company does until they announce one. Hell, Nintendo did it the day before they announced the New 3DS.

21 minutes ago, LaveDisco said:

 

Don’t get me wrong, I like the S. It’s great. I have one. I don’t know what MS will do. But downplaying mid-gen refreshes is something each company does until they announce one. Hell, Nintendo did it the day before they announced the New 3DS.

Guessing games will be scalable like they are on the PC. As far as I understand it, the CPU is the same as the Series X's (just clocked a touch slower) and the GPU has fewer compute units. They'll just have a couple of target builds in the dev environment. I think all the extra memory in the Series X is mainly for the extra VRAM associated with 4K assets.

1 hour ago, LaveDisco said:

 

Don’t get me wrong, I like the S. It’s great. I have one. I don’t know what MS will do. But downplaying mid-gen refreshes is something each company does until they announce one. Hell, Nintendo did it the day before they announced the New 3DS.

I think the next Series refresh won't be a refresh by definition, if you see what I mean. But yeah, there's a new device coming at some point. 


This is a weird thread, generally. These new consoles are bonkers, they're currently shitting out 4k@60 for fun, and it looks gorgeous. It's only when we get into this RT lark that suddenly they're cheap, underpowered PCs you could make for the same money. 

 

I exaggerate, but you know what I mean. I'm 100% fine with a $500 console handily beating a $500 PC but not a $500 graphics card. Plus you get to pick framerate or shinies. Nice. Normally, shinies are foisted upon you.

 

Let's check back in on real time Pixar films when a $500 box can reasonably be expected to do it :)

4 minutes ago, TehStu said:

This is a weird thread, generally. These new consoles are bonkers, they're currently shitting out 4k@60 for fun, and it looks gorgeous. It's only when we get into this RT lark that suddenly they're cheap, underpowered PCs you could make for the same money. 

 

I exaggerate, but you know what I mean. I'm 100% fine with a $500 console handily beating a $500 PC but not a $500 graphics card. Plus you get to pick framerate or shinies. Nice. Normally, shinies are foisted upon you.

 

Let's check back in on real time Pixar films when a $500 box can reasonably be expected to do it :)


To be honest, if you take gameplay out of the equation and just do a real-time demo, then you are going to get incredibly close.
 

It all goes to pot when you have to bung in an actual ‘point’ to these graphics, and that’s when downgrades are obviously going to happen.

3 hours ago, simms said:

This is quickly turning into console vs PC. 

 

Yeah, and it is a bit stupid in the context of the original question.

 

RT is a goalpost-moving tech, but it is still very much in its infancy, so we pay a lot for it in terms of hardware, as is always the case. Current consoles might not be especially capable at the moment but will certainly get better over the course of the generation, albeit with the hardware cap of the AMD cards limiting that a little. Does the lack of RT stop me playing Demon's Souls, or Astro, or Sackboy on the PS5? Of course not. Would I want the devs to compromise performance in order to shoehorn RT into those (or any) games? No, and if there is an option I'll always choose frame rate over fancier graphics. That doesn't mean I won't enjoy experiencing RT in all its shininess on platforms that are able to implement it, nor does its current niche application diminish the technology per se.

 

I tell you what though, as it hasn't happened for Sony and Microsoft this generation, I would like to see a Switch 2 equipped with an Nvidia chip that has a few discrete RT cores and DLSS capability.

 

 


I feel the same way about RT as the people who say they turn off HDR because it's shit (which makes me start punching my phone), so I have to accept I'm pig ignorant of the value of RT because I haven't appreciated it properly yet, and in about three years or something I'll be going on about it loads.

41 minutes ago, mwaawm said:

Bah Ray tracing got old back in the days of my Atari ST


I got old waiting for an Atari ST to finish the render: 320x200? In colour? A couple of hours...

 

GFA Raytrace, right? ST Format cover disc. Wireframe cylinders and balls, all night long...

4 minutes ago, MattyP said:

Juggler on the Amiga... Wonder how many Amigas that demo sold. The first ray-traced animation I saw blew my mind back in the day! :)

 

 

 

 

 

[gif: the Amiga Juggler demo]

Looks good for a Switch game.

On 08/02/2021 at 10:52, petrolgirls said:

 

People have been predicting the end of Moore's law for decades, yet silicon keeps getting faster. To describe Moore's law as having already slowed to a glacial pace makes little sense:

 

[Chart: Moore's Law transistor counts, 1970-2020]

 

 

Cramming more transistors into a single chip will slow in the years to come, but there are several other variables to increase performance: RISC, specialisation and increased parallelism will all play their part. For me the mid-gen refresh will be sold as an improvement to global illumination; it's a strangely pessimistic view that AMD in 3-4 years' time won't have caught up to where Nvidia are now.

 

 

Moore's observation, back in the 1960s, was that transistor density at the same cost was doubling every year; he later had to double that time interval. I think the evidence isn't particularly strong that companies can currently offer twice the transistor density for a similar price every 2 years. If you think there is a strong link between increased transistor counts and performance increases, then a 7-year gap between the consoles should have resulted in an 8X+ leap in performance for a similar chip cost.

 

Using the widely marketed TFLOP measurement, the PS4 was at 1.8 TFLOPs, so Moore's Law should have been able to deliver a 14.4+ TFLOP beast solely with the expected 8X+ increase in transistor count, let alone design changes to the architecture.
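That arithmetic can be sanity-checked with a quick back-of-envelope script (just a sketch: the 1.8 TFLOPs figure and the 8X floor come from the post above, and the 2-year doubling period is the commonly quoted form of Moore's Law):

```python
# Naive Moore's Law projection: capability doubles every `doubling_period` years.
# Figures taken from the post above (PS4 at 1.8 TFLOPs, 7-year console gap).
def moores_law_multiplier(years: float, doubling_period: float = 2.0) -> float:
    """Expected multiplier if transistor counts double every `doubling_period` years."""
    return 2.0 ** (years / doubling_period)

ps4_tflops = 1.8
gap_years = 7  # PS4 (2013) -> PS5 (2020)

multiplier = moores_law_multiplier(gap_years)  # 2^3.5, a bit over 11x
print(f"Strict projection: {ps4_tflops * multiplier:.1f} TFLOPs")
print(f"Post's conservative 8x floor: {ps4_tflops * 8:.1f} TFLOPs")
```

Three full doublings over 7 years give the 8X floor the post uses (1.8 x 8 = 14.4 TFLOPs); a strict doubling every 2 years would project slightly higher still.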

 

The PC GPU market is only managing to double performance every 3-4 years now, so if Moore's Law isn't dead, something is going horribly wrong elsewhere.

 

Nvidia's current mainstream flagship, the RTX 3080, has roughly double the raster performance of the GTX 1080 Ti, took over 3 years to arrive after it, and needs more watts to achieve that performance increase. It also illustrates why TFLOPs is a useless measurement of performance most of the time: 2.6X the TFLOPs doesn't directly translate into actual useful game-rendering performance currently.

 

Given the state of AMD's RT implementation, I'd wait until the PS6 before we see useful implementations that aren't full of compromises on consoles, as that is long enough to get a useful jump. The only bright spot for AMD's implementation is its flexibility, so how it works can be tweaked and customised, unlike Nvidia's more powerful black-box dedicated RT cores.

 

I would agree that other avenues will need to be explored to make up for pure generalist power, which is precisely what Nvidia have already done with their dedicated RT and Tensor cores, and what Apple is doing with their chip designs with multiple specialist bits included.

 

 

 

On 08/02/2021 at 13:10, LaveDisco said:

 

Claims the marketing department, who want the S to seem like a sound long-term investment.

 

I totally think we have diminishing returns, but MS would give that answer, regardless of the truth of their plans.

 

I think they aren't lying; cost reductions have been getting worse every generation. The PS1 and PS2 could be cost-reduced down to $100 or less, the PS3 never came close to that, and the PS4 hasn't managed to get down to where the PS3 managed either. I think the current boxes will continue that upwards trend in minimum pricing.

 

This means any cost-reduced, smaller XSX that would have been the XSS equivalent would only become commercially viable a very long time from now, so they decided to launch a weaker one instead, rather than a cheaper same-power one much, much later.

13 hours ago, mushashi said:

Using the widely marketed TFLOP measurement, the PS4 was at 1.8 TFLOPs, so Moore's Law should have been able to deliver a 14.4+ TFLOP beast solely with the expected 8X+ increase in transistor count, let alone design changes to the architecture.

 

The PC GPU market is only managing to double performance every 3-4 years now, so if Moore's Law isn't dead, something is going horribly wrong elsewhere.

 

In your sample of one, the PS4-to-PS5 jump marginally underperformed versus Moore's law; the Xbox One to Series X jump was exactly on target. Not really evidence of Moore's law "slowing to a glacial pace" as you claimed initially. I'm not disputing that computer dies are starting to butt up against quantum limits, but I think it's premature to characterise computing as having flatlined to the extent that mid-gen upgrades are off the cards this time round. Guess time will tell.
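For what it's worth, the two jumps being argued over can be lined up against the naive projection (a sketch only; the launch TFLOP figures below are the widely reported marketing numbers, and as noted upthread TFLOPs is a rough proxy for real performance at best):

```python
# Actual gen-over-gen TFLOP multipliers vs a doubling-every-2-years projection.
# Launch TFLOP figures are the widely reported marketing numbers (an assumption
# of this sketch, not figures from this thread).
consoles = {
    "PS4 -> PS5":           (1.84, 10.28, 7),  # (old TFLOPs, new TFLOPs, years apart)
    "Xbox One -> Series X": (1.31, 12.15, 7),
}

for name, (old, new, years) in consoles.items():
    actual = new / old
    projected = 2.0 ** (years / 2.0)  # strict 2-year doubling
    print(f"{name}: {actual:.1f}x actual vs {projected:.1f}x projected")
```

How close either jump looks to "on target" depends heavily on which doubling period you assume, which is much of what the disagreement above comes down to.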

