
Xbox Series X | S


djbhammer


So let's assume DLSS isn't possible on the new consoles. What should devs be aiming for, IQ-wise? Surely it would be better to aim for 1440p or FauxK and use the hardware resources elsewhere, seeing as apparently most people can't even notice the difference past 1080p?

 

I think 99% of people would be happier with 1440p/FauxK/60 than 4K/30 next gen if it meant a higher framerate and more shinies?


5 minutes ago, AI1 said:

So let's assume DLSS isn't possible on the new consoles. What should devs be aiming for, IQ-wise? Surely it would be better to aim for 1440p or FauxK and use the hardware resources elsewhere, seeing as apparently most people can't even notice the difference past 1080p?

 

I think 99% of people would be happier with 1440p/FauxK/60 than 4K/30 next gen if it meant a higher framerate and more shinies?


I’d prefer 4K/60 and fewer shinies, but, you know...


There’s all sorts of tricks they can use. Quantum Break rendered at an internal resolution of 720p and used a form of temporal upscaling to make it appear higher res. It was flawed but worked well for the kind of aesthetic they were going for. 
 

The new consoles both also support Variable Rate Shading, which allows certain areas of the image to be shaded at a less-than-per-pixel rate – so even though the image is rendered at 4K, not every pixel is shaded individually, reducing GPU load. This works well for large parts of the screen that look much the same – not every single pixel of a big blue sky needs to be shaded individually, for example.
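As a rough illustration of why this helps, here's a back-of-the-envelope sketch – the 40% coarse-shaded fraction and the 2×2 rate are assumed numbers for the sake of the example, not any real game's figures or a real graphics API:

```python
# Back-of-the-envelope VRS saving: assume 40% of a 4K frame (sky, flat
# walls) is shaded at a coarse 2x2 rate, i.e. one shading invocation
# covers four pixels, while the rest is shaded per-pixel as normal.

WIDTH, HEIGHT = 3840, 2160
total_pixels = WIDTH * HEIGHT

coarse_fraction = 0.4   # assumed share of the frame that tolerates coarse shading
coarse_rate = 4         # 2x2 VRS: 1 shading invocation per 4 pixels

fine_invocations = total_pixels * (1 - coarse_fraction)
coarse_invocations = total_pixels * coarse_fraction / coarse_rate
invocations = fine_invocations + coarse_invocations

saving = 1 - invocations / total_pixels
print(f"Shading invocations: {invocations:,.0f} ({saving:.0%} fewer than per-pixel)")
```

So even shading less than half the frame coarsely cuts pixel-shader work by roughly 30%, while the output image stays 4K.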
 

There’s a good video on it by Digital Foundry. 


1 hour ago, AI1 said:

So let's assume DLSS isn't possible on the new consoles. What should devs be aiming for, IQ-wise? Surely it would be better to aim for 1440p or FauxK and use the hardware resources elsewhere, seeing as apparently most people can't even notice the difference past 1080p?

 

I think 99% of people would be happier with 1440p/FauxK/60 than 4K/30 next gen if it meant a higher framerate and more shinies?

Running with your assumption, non-proprietary reconstruction techniques have improved a lot over the current generation. My stance is that it doesn't really make sense for any game to push native 4K when so much performance can be clawed back by targeting a lower resolution – somewhere north of 1440p – and reconstructing the image up from there.
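To put numbers on how much performance is on the table here (assuming shading cost scales roughly with pixel count, and picking 1620p as an arbitrary "north of 1440p" example):

```python
# Pixel-count comparison between native 4K and common reconstruction
# base resolutions. Assumes rendering cost scales roughly linearly with
# pixels, which holds better for pixel-bound passes than fixed-cost ones.

def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)   # 8,294,400 pixels
p1440 = pixels(2560, 1440)       # 1440p
p1620 = pixels(2880, 1620)       # example "north of 1440p" base resolution

for name, count in [("1440p", p1440), ("1620p", p1620)]:
    print(f"{name}: {count / native_4k:.0%} of native 4K's pixels")
```

Rendering at 1620p touches only around 56% of the pixels of native 4K, which is roughly where the clawed-back performance comes from.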

 

I think the Series S complicates things, however. The hardware is supposedly feature-equivalent to the Series X but designed to push a quarter of the resolution, so if the latter is rendering games at less than native 4K, it stands to reason the former will have to scale proportionately, too. In that case games on Series S are either rendering below 1080p – which will be a lot more noticeable than a game not being 'true' 4K – or other graphical features will need to be further cut or toned down.

 

In other words, the existence of the Series S might push the Series X into hitting native 4K even though I don't think it's the best use of the hardware.


14 minutes ago, Ferine said:

Running with your assumption, non-proprietary reconstruction techniques have improved a lot over the current generation. My stance is that it doesn't really make sense for any game to push native 4K when so much performance can be clawed back by targeting a lower resolution – somewhere north of 1440p – and reconstructing the image up from there.

 

I think the Series S complicates things, however. The hardware is supposedly feature-equivalent to the Series X but designed to push a quarter of the resolution, so if the latter is rendering games at less than native 4K, it stands to reason the former will have to scale proportionately, too. In that case games on Series S are either rendering below 1080p – which will be a lot more noticeable than a game not being 'true' 4K – or other graphical features will need to be further cut or toned down.

 

In other words, the existence of the Series S might push the Series X into hitting native 4K even though I don't think it's the best use of the hardware.

 

1080p is a quarter of 4K. The S should have at least a third of the GPU power of the X, so there's some headroom if the X version isn't at native 4K. That's assuming everything scales with resolution, which is probably a fair assumption for current-gen rendering. With next-gen stuff like ML upscaling and whizzy virtual geometry, who knows.
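Sanity-checking that arithmetic in a quick sketch – the one-third GPU-power figure is the poster's estimate rather than a confirmed spec, and the model assumes cost scales purely with resolution:

```python
# Does a third of the GPU power comfortably drive a quarter of the pixels?
# A headroom factor above 1 means spare capacity under this simple model.

pix_4k = 3840 * 2160
pix_1080 = 1920 * 1080
resolution_ratio = pix_1080 / pix_4k    # 1080p as a fraction of 4K's pixels

gpu_ratio = 1 / 3                       # assumed Series S : Series X compute ratio
headroom = gpu_ratio / resolution_ratio

print(f"1080p is {resolution_ratio:.0%} of 4K; headroom factor = {headroom:.2f}")
```

Under those assumptions the S has about a third more compute per pixel than it strictly needs, which is the headroom the post is pointing at.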

 

 


14 hours ago, Broker said:

Surely you just build the same thing but with a cheaper GPU? Which, given how ridiculously expensive they are, is a big saving.

 

I think in bulk that saving might not be as large as you're hoping, but even so it can be dodgy, because a GPU kinda isn't just a GPU: it's perfectly possible, and in some cases desirable, to offload other jobs onto it as well, especially later in a generation when you're hunting for every MHz. Physics calculations especially can be suited to GPUs.


1 minute ago, Dudley said:

 

I think in bulk that saving might not be as large as you're hoping, but even so it can be dodgy, because a GPU kinda isn't just a GPU: it's perfectly possible, and in some cases desirable, to offload other jobs onto it as well, especially later in a generation when you're hunting for every MHz. Physics calculations especially can be suited to GPUs.


This is true, but I suspect even a small amount of ray tracing will eat up any spare capacity, and those Zen 2 cores can be used for physics.


Maybe it's time for "they're just basically PCs" to cost like a PC. This price thing is dumb; no one will suddenly be swayed by the price of one versus the other. We've had a whole generation to dig in and commit to the backfire effect, plus our respective digital libraries.


With Microsoft's strategy they can afford not to take a huge loss: the games will all be playable on all of your devices, and Game Pass is cross-gen, even cross-device with PC. If they do release a Series S as well, I can see that selling very well – I'd definitely be interested in a mini Series X that's 1080p-focused.


I saw this price rumoured a few days ago by someone who reckoned shops had it in their systems with an RRP of $599 too.

 

Perhaps they're taking the government's approach and leaking info to see what the reaction is. Also, if they can afford to go lower, then $500 doesn't sound so bad.

 

Thanks to Brexit, $ = £, but I still can't quite believe it'll be £600. I feel £500 is going to be the limit for the mass market; I'll be surprised if they go over that.


44 minutes ago, Ninja Doctor said:

The word ‘estimated’ is doing heavy lifting in that giveaway legalese text. 


I was curious about this stuff so I had a look, and the prize value is there because you can get taxed on the value of contest prizes in the US. So there’s a big incentive to not underestimate how much a prize is worth, lest you leave a winner with a big tax bill.


38 minutes ago, Alex W. said:


I was curious about this stuff so I had a look, and the prize value is there because you can get taxed on the value of contest prizes in the US. So there’s a big incentive to not underestimate how much a prize is worth, lest you leave a winner with a big tax bill.

 

Yep, Dan from Giant Bomb won that Taco Bell wedding and had a massive tax bill to pay for it; it was mentioned on an episode of the Beastcast one week.

 

Also, if the price gets leaked out as $599, maybe that incentivises Sony to detail their price at $550, and then M$ come in and smash a proper $449 Series X announcement on the table :)


On 17/08/2020 at 22:10, Alex W. said:

Can someone who knows more than me explain what the “GDDR6” bits are? Their board pics show RAM off the die, so is that just the interface on the APU?

Memory interfaces. It has 10 × 32-bit channels – effectively 20 × 16-bit channels, since each interface has 2 × 16-bit links to its GDDR6 chip. (I forgot GDDR6 changed from 1 × 32-bit to 2 × 16-bit per chip.)
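For illustration, here's the bandwidth implied by that layout – the 14 Gbps per-pin rate is an assumption in this sketch, albeit the figure commonly quoted for the Series X's GDDR6:

```python
# Peak bandwidth from the interface layout: 10 x 32-bit GDDR6 channels
# (each chip wired as 2 x 16-bit links) form a 320-bit aggregate bus.

channels = 10
bits_per_channel = 32
bus_width_bits = channels * bits_per_channel      # 320-bit bus

pin_rate_gbps = 14                                # Gbps per pin (assumed)
bandwidth_gbps = bus_width_bits * pin_rate_gbps   # total Gb/s across the bus
bandwidth_gbs = bandwidth_gbps / 8                # convert to GB/s

print(f"{bus_width_bits}-bit bus @ {pin_rate_gbps} Gbps/pin = {bandwidth_gbs:.0f} GB/s")
```

That lands on 560 GB/s, which matches the peak figure Microsoft has quoted for the Series X's fast memory pool.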


8 hours ago, TehStu said:

Bizarre urge to make that in Minecraft, if you're wondering about my mental health.

 

Edit - everyone sees IO and sings it like a Disney dwarf, right? 

 

I will now! :lol:

 
