The official 'can AMD finally deliver a competitive GPU' thread



Watched the Gamers Nexus review last night, which really showed up the difference in 4K, DLSS and ray tracing performance, and confirmed that SAM doesn't seem to make any real difference, so that took a little of the shine off.  Boo!

 

That said, my current monitor can only handle 1440p and my primary use case is actually VR, so I think I'm still good from a rasterisation angle. DLSS could be very interesting if applied to boost VR games, but I've not seen DLSS applied to VR content, so that's untested at the moment. Without a VR-focused performance assessment of these graphics cards it's hard to know which is best. At the end of the day, I'm happy that I have a new card and it's a big step up from my 980TI.

 

Managed to find time over lunch to install the 6800XT.  Installation was super smooth.  Not had time to test things out properly.

 

First thing I noticed was just how quiet it sounds. *WARNING* I'm a little deaf ... however, my deafness is with mid to high pitches; low-pitched noises are actually much clearer to me, because low-pitched rumblings often "drown out" normal sound and appear louder ... if that makes any sense. I was worried I had accidentally pulled out a fan cable somewhere ... nope, all good ... just very quiet.

 

I won't be able to give the card a proper go in VR until the weekend - looking forward to PC2 and F1 2018 (via Vorpx) to gauge the level of difference. Got the Oculus v23 update last night, so I'm all set.

49 minutes ago, Simbo said:

I won't be able to give the card a proper go in VR until the weekend - looking forward to PC2 and F1 2018 (via Vorpx) to gauge the level of difference. Got the Oculus v23 update last night, so I'm all set.

 

Probably worth mentioning if you're going to dabble in VR: if you're going the wireless route via Virtual Desktop, don't forget to change the video encode format. IIRC, AMD's better on H.264 and Nvidia's better on HEVC via NVENC. A small change that's easy to forget about when changing cards, but it makes a major difference in reducing lag and frame stutters! Enjoy!


Cheers @Siri. Never thought of that. I'll need to give Virtual Desktop a go. It's such a faff getting Oculus Link set up. You get one taste of wireless VR (native Quest games) and you never look at that cable the same way again.

 

While I am here, thanks for your advice on swapping over the cards ... worked perfectly first time, so cheers for that too!

46 minutes ago, JackG4 said:

It makes me feel warm and cosy inside to know that the 6800 bots accidentally bought and sold out all of 2004's remaining 6800 GT stock. I hope they were excessively scalped.

I have literally just learnt that there was also a card called... the 6800 XT... yep, Nvidia had an identically named card :lol:! Let's hope some scalping schmucks ended up with them too, lol.

 

Also, this... woah!

 

[attached image]

 

I don't think we will be seeing path-traced games on the next-gen consoles any time soon!

On 19/11/2020 at 18:18, 5R7 said:

[attached image]

 

I don't think we will be seeing path-traced games on the next-gen consoles any time soon!

 

The settings must be pretty different, or it runs better on the non-Nvidia-exclusive MAX GRAPHIX settings, but then again, Microsoft haven't mentioned it again since showing this tech demo:

[embedded video]
46 minutes ago, mushashi said:

 

The settings must be pretty different, or it runs better on the non-Nvidia-exclusive MAX GRAPHIX settings, but then again, Microsoft haven't mentioned it again since showing this tech demo:

 

 

 

The oddest thing is that it was Nvidia that MADE the path-traced Minecraft... they created the entire thing, complete with the redone textures and all that... it does make one wonder if that really was captured on XSX.

 

I'd like to see what the XSX does with Quake 2 RTX as well, really.

 

Edit - bold, ridiculous prediction!

 

The last gen was hamstrung by terrible CPUs; this gen, it will be the fairly poor RT performance of the AMD chipset that holds things back to 30fps!

 

Now wait! The standard rasterisation performance is good, as evidenced by the Gears 5 benchmark on XSX that shows a sort of 2080/2080 Super-ish level, which would make sense given it has 52 CUs - about 13% fewer than a 6800's 60 - and as a fixed spec, great! Loads of room to use that!

 

BUT, from what we have seen of the RT performance... it's more like a 2060 Super in that regard... not nearly enough grunt to actually do RT at 60fps...

 

SO, expect graphics and performance options, where you get one or the other but not both... apart from Sony first-party stuff, which will look great as it will target 30fps, and it will just be that.


I don’t know much about graphics cards, but I’ve been looking into the 6800XT to use for rendering on my Mac and wondered if one of you knowledgeable lot could answer a couple of questions.

 

I know they’re practically impossible to get hold of at the moment, but Overclockers are saying they’ll have some up for preorder on the 25th. The price seems to be £679.99. Is this the RRP, or will prices fluctuate like they did with Nvidia cards the last time I was in the market for one? Is that the price I should be looking for?

 

I also believe there aren’t any Mac drivers yet, but will these be coming soon in an update to Big Sur?

 

I also read that eGPU boxes won’t work with the new M1 chip. This seems weird to me, as Big Sur is the first macOS to properly allow them, so why would they be stopping that already? It’s just a limitation of these first-gen M1 machines, right?


They’ll fluctuate in price.
 

Overclockers still have thousands of preorders from the 3080 launch, many of which won’t be fulfilled for months. Don’t buy there.

 

No support on M1 is certainly a limitation of the initial chips. Who knows whether that limitation will remain; it’s the kind of limitation Apple might choose to make to simplify things and ensure they have complete control over your computer.


Thanks @footle

 

When they are finally available, what kind of price should I be looking at, then? And which manufacturer? I noticed the PowerColor cards seem to be well thought of, but like I say, I’m new to this game.

 

I honestly can’t see Apple locking off eGPUs going forward. They’ve been banging on for years now about trying to win the pro market back, and they’ve finally got OS-level eGPU support. They’d be mental to do that, but this is Apple we’re talking about.

10 minutes ago, krenzler said:

Isn't the Ampere architecture generally better for this non-gaming stuff - like rendering etc.?

I’ve no idea what that is, sorry!

 

All I know is Apple are only allowing AMD GPUs these days, so I’m currently stuck on High Sierra with an Nvidia 1080Ti. I really want to upgrade my OS for various reasons, so I’m looking at the best way to do it.

 

Edit: Just to say, one of the reasons for the upgrade is for speed, so any card I get would really need to be better than my current one.

15 minutes ago, krenzler said:

Isn't the Ampere architecture generally better for this non-gaming stuff - like rendering etc.?

Apple fell out with Nvidia.

 

A 6800XT would be a good and significant improvement on a 1080Ti, though perhaps not £600 worth of improvement. I’d wait until:

- someone tries one on an Apple Mac and reports the results.

- there’s some announcement about long-term eGPU support: they introduced it when they weren’t selling their own GPU capability.

- the cards aren’t quite so supply-constrained: that £680 is a premium over what the competitive RRP is meant to be.

14 minutes ago, footle said:

Apple fell out with Nvidia.

 

A 6800XT would be a good and significant improvement on a 1080Ti, though perhaps not £600 worth of improvement. I’d wait until:

- someone tries one on an Apple Mac and reports the results.

- there’s some announcement about long-term eGPU support: they introduced it when they weren’t selling their own GPU capability.

One of the reasons I’m looking to upgrade is that being stuck on High Sierra means a lot of the newer software I need to use (Adobe CC) isn’t compatible, so I’m stuck on an old version. Not something I can really carry on with, as sometime soon I’ll no doubt be receiving files I won’t be able to open.

 

What is Apple’s own GPU capability? I know the GPU is built into the M1 chips, but I’m currently on a 2019 MacBook Pro and don’t plan on getting a new one for a couple of years at least, so I need some way to render using an eGPU.

 

I suppose getting a couple of years out of it, as long as the support is there, would be cost effective enough for me.


The M1, but I’m expecting they’ll announce something more powerful for pros next year. You’re kind of between a rock (a reasonable GPU, but a petulant OS supplier) and a hard place (limited stock of GPUs that might work but no one’s demonstrated, which may have no driver support yet, and a large-scale hardware transition).

 

Buy an older AMD GPU that’s in stock - not as good as the 1080Ti, but it may work with a newer version of macOS as a holding pattern?

5 hours ago, krenzler said:

Isn't the Ampere architecture generally better for this non-gaming stuff - like rendering etc.?

 

Yes, it's a lopsided, data centre-centric design, which is why it scales worse than RDNA2 at lower rendering resolutions but is a beast at compute: they doubled the FP32 compute capability without doing the same for the rest of the chip (it's why the 3080's headline 8704 "CUDA cores" is really 68 SMs with doubled-up FP32 units).

 

 

18 hours ago, 5R7 said:

 

The oddest thing is that it was Nvidia that MADE the path-traced Minecraft... they created the entire thing, complete with the redone textures and all that... it does make one wonder if that really was captured on XSX.

 

I'd like to see what the XSX does with Quake 2 RTX as well, really.

 

 

Minecraft DXR runs at 720p to achieve 60fps most of the time on an RTX 2060, and the XSX is around that power level for RT (based on the evidence shipped so far), so I can believe it was real. The game uses DirectX 12 Ultimate features, which is a universal next-gen Microsoft ecosystem feature set.
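
(Side note for anyone curious what "uses DirectX 12 Ultimate features" means in practice: the game asks the API what ray tracing tier the GPU exposes, rather than checking for an Nvidia badge. A minimal sketch of that query - generic D3D12 boilerplate under my assumptions, not anything from the actual Minecraft code:)

#include <windows.h>
#include <d3d12.h>
#include <cstdio>

// Sketch: ask D3D12 which ray tracing tier the GPU/driver exposes.
// Link against d3d12.lib; error handling trimmed for brevity.
int main()
{
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device available");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                &opts5, sizeof(opts5));

    if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_1)
        std::puts("DXR tier 1.1 supported (Turing/Ampere/RDNA2 class)");
    else if (opts5.RaytracingTier == D3D12_RAYTRACING_TIER_1_0)
        std::puts("DXR tier 1.0 supported");
    else
        std::puts("No hardware DXR support");

    device->Release();
    return 0;
}

The same check passes on any DX12 Ultimate hardware, which is why the demo running on XSX is plausible.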

 

I assume they don't want to release it at present if it has to render at sub-1080p to get stable performance; the tech demo fluctuated between 30 and 60fps according to the DF video, and the XSS would definitely struggle, which might be a tad problematic for the marketing department.

 

Quake 2 RTX would be similarly bad in terms of performance; full path-tracing seems too much for RDNA2 to handle at currently accepted rendering resolutions.

 

On 19/11/2020 at 11:28, Gabe said:

 

Each time AMD are in consoles, people think it could mean great things for them, yet Nvidia still smashes them in market share (even with the low effort they have put in over the years), so I wouldn't hold my breath on that. These cards are clearly a good step up, but team green have a vastly bigger R&D budget to keep the lead in ray tracing, and I fully expect them to do so. Not sure how that situation changes without AMD releasing something much, much better and much, much cheaper.

 

At the very least though, this should mean Nvidia actually do more than twiddle their thumbs with poor £/performance iterations over the next few years. Maybe.

 

I also think you've confused Nvidia with Intel ;)

 

For a company accused of laziness and phoning it in, they've innovated more than AMD and provided as-big-or-bigger leaps in performance in the same time frame. You've just had to pay for it.

 

AMD's problem in graphics is that the competitor has way more money and is led by a ruthless leader who knows how to win and doesn't like losing. Just see how quickly Nvidia announced they too were supporting the PCIe resizable BAR trick that AMD are promoting (as Smart Access Memory), as a bit of recent evidence.

 

According to a story published years ago, AMD's first choice of graphics card manufacturer to buy wasn't ATI - it was Nvidia. The deal fell through because Jensen wanted to be CEO; it would likely have altered history if they had let him have the top job. Or maybe he had a narrow escape and got to turn Nvidia into what it is today instead.

5 hours ago, footle said:

The M1, but I’m expecting they’ll announce something more powerful for pros next year. You’re kind of between a rock (a reasonable GPU, but a petulant OS supplier) and a hard place (limited stock of GPUs that might work but no one’s demonstrated, which may have no driver support yet, and a large-scale hardware transition).

 

Buy an older AMD GPU that’s in stock - not as good as the 1080Ti, but it may work with a newer version of macOS as a holding pattern?

Yeah, I’ve been looking at a 5700XT, but that’s roughly £400 and I don’t know what kind of performance hit I’d have to take. Knowing my luck, I’d get one of those and then Apple would announce an OS update supporting the newer cards.

 

Fuck knows why they have to be so opaque about things. As much as I love using Macs, they’re a complete pain in the arse with stuff like this.

3 hours ago, mushashi said:

 

 

Minecraft DXR runs at 720p to achieve 60fps most of the time on an RTX 2060, and the XSX is around that power level for RT (based on the evidence shipped so far), so I can believe it was real. The game uses DirectX 12 Ultimate features, which is a universal next-gen Microsoft ecosystem feature set.

 

I assume they don't want to release it at present if it has to render at sub-1080p to get stable performance; the tech demo fluctuated between 30 and 60fps according to the DF video, and the XSS would definitely struggle, which might be a tad problematic for the marketing department.

 

Quake 2 RTX would be similarly bad in terms of performance; full path-tracing seems too much for RDNA2 to handle at currently accepted rendering resolutions.

 

 

Helpfully, Gamers Nexus have a handy review of the 6800 showing Minecraft with RT at 1080p... I'd guess the XSX could do a solid 1080p/30.

The 2060, with DLSS, can run it at 1080p/60, or 4K/30!?

[attached image: Gamers Nexus Minecraft RTX benchmark chart]

 

 


Was able to get some time to try out my RX 6800 XT. My weekend went through the full Gartner hype cycle: started off at the "Peak of Inflated Expectations", then plummeted very quickly into the "Trough of Disillusionment" - I am now on the "Slope of Enlightenment"!

 

From the Peak of Inflated Expectations to landing face first in the Trough of Disillusionment:

 

I was expecting big things moving from a 980TI to the 6800XT. My primary use case is VR, so that is where I started. First up was Project Cars 2. Started the game, noticed the resolution was indeed nicer on my Quest 2 (as it was before when running with my 980TI), so I enabled metrics logging and started to experiment with the graphics settings. Long story short, I was disappointed that I couldn't nudge any of my settings up from Low/Off (textures/complexity etc.).

 

Then I tried F1 2018 via Vorpx - the resolution was never great on my 980TI, but somehow ... somehow ... this was even worse. It was like an old DOS game - I swear it appeared to be running at the old 320x200 resolution (or whatever it was). Between those failed first attempts and the Gamers Nexus review, I was pretty disappointed and started to consider whether I should sell the AMD and try to get a 3080.

 

Also tried to give Quake II RTX a go - that just rubbed salt into the wounds, as it wouldn't launch: up pops an error message, "you need an RTX compatible card, thicko", or words to that effect. I believe NVIDIA opened this up to more cards in their range, but I can't see a way to run it on AMD. It looks like it runs on proprietary NVIDIA ray tracing Vulkan extensions. I gather that AMD are planning Vulkan ray tracing support and that a standard set of extensions is in the pipeline.
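
(For anyone else poking at this, the split is visible right in the Vulkan extension list: Quake II RTX was written against Nvidia's proprietary VK_NV_ray_tracing, while the standard everyone is converging on is VK_KHR_ray_tracing_pipeline. A rough sketch of how an app can probe for either path - generic Vulkan boilerplate assuming up-to-date headers, not the actual Quake II RTX code:)

#include <cstdio>
#include <cstring>
#include <vector>
#include <vulkan/vulkan.h>

// Does the physical device advertise the named extension?
static bool hasExtension(VkPhysicalDevice gpu, const char* name)
{
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> props(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, props.data());
    for (const auto& p : props)
        if (std::strcmp(p.extensionName, name) == 0)
            return true;
    return false;
}

void reportRayTracingPaths(VkPhysicalDevice gpu)
{
    // Nvidia's proprietary extension - the one Quake II RTX shipped against.
    if (hasExtension(gpu, VK_NV_RAY_TRACING_EXTENSION_NAME))
        std::puts("VK_NV_ray_tracing available (Nvidia-only path)");

    // The cross-vendor Khronos standard that AMD drivers should expose
    // once their Vulkan ray tracing support ships.
    if (hasExtension(gpu, VK_KHR_RAY_TRACING_PIPELINE_EXTENSION_NAME))
        std::puts("VK_KHR_ray_tracing_pipeline available (vendor-neutral path)");
}

Until the game is ported to the KHR extensions (or AMD implements the NV ones, which seems unlikely), it simply refuses to start on Radeon cards.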

 

Climbing out of the Trough and edging back up the Slope of Enlightenment:

 

Ghost Recon Breakpoint - fired this up and noticed the graphics options appeared to have changed ... everything was set to Ultra (not by me). Started to look into the Radeon software, and it looks like there is a setting that bumps up game settings automatically - I realised it may have tried to apply (non-VR) optimisations to things I play in VR. As for GRB itself, it was great to jump from 1080p to 1440p (as high as my monitor supports) and have everything bumped up to the max. OK, in reality all I really noticed was more grass blowing in the wind, but at least I felt that my new card was able to achieve something.

 

Decided I should have another go at F1 2018. Disabled the automatic settings thingy in the AMD software and also, on a whim, opted into the F1 2018 DX12 beta. Gave it another go with Vorpx. It ran butter smooth ... while keeping the quality settings low, I was able to bump the game resolution much, much higher than I ever had before. Jackpot! Had a couple of time trial laps around Albert Park with a huge grin on my face. Finally, things started to turn the corner.

 

Tried Project Cars 2 again - still no improvement - so I'm at a loss what to do here.

 

However, I did also give Assetto Corsa Competizione a go. This ran great from the outset - a large, obvious improvement straight away. The UI was crystal clear, the graphics were great, and the OculusDebugTool confirmed I still had lots of headroom to spare. This was a big turning point, as I gather ACC is pretty hard on systems in VR. It's a joy with the increased resolution of the Oculus Quest 2, and I think that's needed, as the fonts were quite small to read (on my CV1, anyway).

 

Observations:


I've realised how hard it is to meaningfully test and compare games when I've upgraded both my Oculus headset and my graphics card. The improved screen resolution of the Quest 2 does ask big questions of the GPU - but I only realise now how much of that is masked by the underlying software. v23 of the Oculus runtime makes it easier to see that there is automatic tuning of the target rendering resolution, which I guess varies from person to person depending on their PC build. (It explains why a friend of mine running a 1060 said he couldn't see an improvement moving from CV1 to Quest 2 for PC games over Oculus Link.) With hindsight, I should have recorded the recommended values it chose for my 980TI and Quest 2 combo, but I have a vague recollection it was running at a lower resolution multiplier (e.g. 0.8x), which has now improved to 1.0x. Note: still only targeting 72Hz, mind - again, I had expected 90Hz would be easy ... it's not.

 

I think there is something funny going on with PC2, but in fairness there is a large resolution bump in game - the quality improvement is obvious when looking at just how crystal clear the in-game steering wheel LCDs appear, and there is an improvement looking further into the distance; the apex is much clearer. But I am still disappointed that I can't nudge the settings up from low to medium (e.g. textures) - there is no headroom for improvement.

 

F1 2018 (on DX12) and ACC completely met my expectations. I have a choice of staying at 72Hz and pushing up graphics quality, or trying 90Hz and seeing how I get on. Very pleased with the graphics as they are running right now, and it's nice to have so much headroom to play with.

 

Read an article comparing 15 VR games between the RX 6800 XT and the 3080, which suggested that both cards are broadly similar. That said, their choice of a Vive Pro means a lower screen resolution than the Quest 2, so it might not be stressing the cards as much. Wouldn't mind seeing similar results for higher-resolution headsets, e.g. the Quest 2 / HP G2 - maybe one card performs better than the other at higher resolutions?

 

Summary:

 

I've still got more games to test: Elite Dangerous, Star Wars Squadrons, Oculus exclusives such as Asgard's Wrath and, of course, Half-Life: Alyx. My initial confidence in the card took a big knock over the weekend - a really huge knock - but it's starting to recover. Maybe a 3080 would have been a safer choice, but it's early days and maybe some tinkering will improve things further.

 

If ray tracing is important to you, it's clear that the 3080 is the better card. I'm also keen to understand whether DLSS will have huge ramifications for high-end VR headsets longer term, as I can see scenarios where new games plus Vorpx work out rather nicely. (I'm looking at you, F1 2020.)

 

I have had some mixed experiences and I'm still not convinced I've backed the right horse, but it's early days. I am genuinely happy with the F1 2018 and ACC performance, so I'll give it a chance to grow on me. I'll wait a year and see what the landscape looks like - maybe I'll end up loving my card, or maybe 3080TIs will be widely available by then - I can always sell it, but I'll keep the 6800XT for now.

 

Edit: forgot to mention - my Oculus Graphics Performance settings (defaulting to recommended) are 3616x1840 @ 72Hz. I've run the debug tool to confirm 72Hz is achieved steadily - no drop down to 36fps ASW etc. (I never thought to check whether my 980TI / Quest 2 combo was running with ASW, so I've no accurate baseline to compare against.)

12 hours ago, Simbo said:

Long bit snipped....

 

Summary:

 

I've still got more games to test: Elite Dangerous, Star Wars Squadrons, Oculus exclusives such as Asgard's Wrath and, of course, Half-Life: Alyx. My initial confidence in the card took a big knock over the weekend - a really huge knock - but it's starting to recover. Maybe a 3080 would have been a safer choice, but it's early days and maybe some tinkering will improve things further.

 

If ray tracing is important to you, it's clear that the 3080 is the better card. I'm also keen to understand whether DLSS will have huge ramifications for high-end VR headsets longer term, as I can see scenarios where new games plus Vorpx work out rather nicely. (I'm looking at you, F1 2020.)

 

I have had some mixed experiences and I'm still not convinced I've backed the right horse, but it's early days. I am genuinely happy with the F1 2018 and ACC performance, so I'll give it a chance to grow on me. I'll wait a year and see what the landscape looks like - maybe I'll end up loving my card, or maybe 3080TIs will be widely available by then - I can always sell it, but I'll keep the 6800XT for now.

 

Edit: forgot to mention - my Oculus Graphics Performance settings (defaulting to recommended) are 3616x1840 @ 72Hz. I've run the debug tool to confirm 72Hz is achieved steadily - no drop down to 36fps ASW etc. (I never thought to check whether my 980TI / Quest 2 combo was running with ASW, so I've no accurate baseline to compare against.)

Nice one, Simbo. Interesting to see.

 

I was just about ready to set myself up for a 2pm 6800XT F5 race today, when the partner cards are out, as I've got VR coming soon.

 

The AMD cards are broadly similar until you get to 4K res, where Nvidia leads by a not insignificant margin. I know the Quest's native res is almost 4K total, but is it not actually rendering lower and upscaling? Or does that only apply to untethered games?

43 minutes ago, IcEBuRN said:

Nice one, Simbo. Interesting to see.

 

I was just about ready to set myself up for a 2pm 6800XT F5 race today, when the partner cards are out, as I've got VR coming soon.

 

The AMD cards are broadly similar until you get to 4K res, where Nvidia leads by a not insignificant margin. I know the Quest's native res is almost 4K total, but is it not actually rendering lower and upscaling? Or does that only apply to untethered games?

Oh, where are they available today? I don’t know whether to just try and grab one now, in the hope that Apple sorts out the drivers sooner rather than later.

3 minutes ago, JPL said:

Oh, where are they available today? I don’t know whether to just try and grab one now, in the hope that Apple sorts out the drivers sooner rather than later.

If it goes like other launches: Scan, Ebuyer etc. Apparently there's more stock as well, but we'll see. Both AMD and Nvidia have already crushed my soul this season.

