
Graphics 22/23 GPU Thread - RTX4000 AND RX7000 Series

Discussion in 'Hardware' started by The_Crapman, 19 Sep 2022.

  1. David

    David μoʍ ɼouმ qᴉq λon ƨbԍuq ϝʁλᴉuმ ϝo ʁԍɑq ϝμᴉƨ

    Joined:
    7 Apr 2009
    Posts:
    17,712
    Likes Received:
    6,216
    Jealousy is a cruel mistress, Daniel.
     
    The_Crapman likes this.
  2. adidan

    adidan Guesswork is still work

    Joined:
    25 Mar 2009
    Posts:
    20,154
    Likes Received:
    5,949
    I do love a cruel mistress. We each have our kinks, David.
     
    David likes this.
  3. IanW

    IanW Grumpy Old Git

    Joined:
    2 Aug 2003
    Posts:
    9,473
    Likes Received:
    3,032
    [image]
     
    adidan and The_Crapman like this.
  4. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    15,466
    Likes Received:
    4,048
    You cannot monitor VRAM usage, so whatever you thought you knew on that, erase it. It is impossible to monitor exactly what the game is actually calling for, and it always has been. TBH, if there were a way to monitor it properly, Nvidia would have upped their VRAM a long time ago. It's just like how, for many years, Crossfire did not work, and AMD knew it. The problem was there was no way to prove it, so people kept saying "I don't have any stuttering" and "fine on my PC!", which of course was total BS. Then Nvidia released FCAT, AMD were totally outed, and people realised that those who said it was crappy and stuttered were actually 100% correct. It was generating runt frames, which added to the FPS but in reality made it stutter like a bitch. That is how brainwashed people become.

    So like I said, there is no way to categorically prove there is not enough VRAM. None. The way to find out is when your game starts running like warm buttocks, but again, there is no way to statistically prove it. Watch a few videos on it: all they will show is games lagging and stuttering (most don't even know the cause of this*) and/or textures looking terrible or missing completely. Note how none of the charts show actual VRAM use, because as I said, there is no real way to know.

    As for allocation, on the other hand, it differs wildly between manufacturers, so those figures you are seeing on your card mean nothing. Even cards from the same manufacturer will allocate wildly different amounts. I.e., if you have more VRAM, the game will allocate more. You don't have more than I do, hence why the game is allocating less. It doesn't mean your card is using, or can use, less than mine.

    Example: COD MW (the new one) used over 10GB on my 2080 Ti. On a 2080 it "used" around 7GB.

    Because that is not what the card needs or is using, it is what the game is allocating.

    Because, like I said, there is no reliable way to see how much is actually being used, and tools to prove it do not exist.

    *Texture streaming. This used to be done by way of your paging file on the HDD; imagine what that was like on a spinning disc. The problem is, even through RAM it is still total ass, because system RAM is far "slower" than VRAM. I.e., VRAM is a totally different beast: terrible latency, BUT over a 1/4 mile it is very fast.
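
    If anyone wants to poke at this themselves, here's a rough sketch using the nvidia-ml-py (pynvml) Python bindings, assuming they're installed. Note that the numbers NVML hands back are what the driver has allocated, not what the game is actually touching every frame, which is exactly the allocation-vs-usage gap being argued about here:

[CODE]
# Rough sketch: query GPU memory via NVML (pip install nvidia-ml-py).
# These figures are what the driver has ALLOCATED, not what the game
# actively reads/writes each frame.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM allocated: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")

# Per-process breakdown. usedGpuMemory can come back as None on Windows/WDDM,
# which is part of why "true" per-game usage is so hard to pin down.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = "N/A" if proc.usedGpuMemory is None else f"{proc.usedGpuMemory / 2**20:.0f} MiB"
    print(f"PID {proc.pid}: {used}")

pynvml.nvmlShutdown()
[/CODE]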
     
    Last edited: 29 Jul 2023
  5. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    15,466
    Likes Received:
    4,048
    Dude, with all respect, you don't know what you are talking about. Not being nasty, but you don't seem to understand. More research needed, I feel.
     
  6. sandys

    sandys Multimodder

    Joined:
    26 Mar 2006
    Posts:
    5,025
    Likes Received:
    779
    Yup, VRAM's a funny one. There's the local GPU RAM, but if you don't have enough you will still use system RAM, which is far slower; management of assets is handled for you for the most part. More dedicated VRAM is generally better, but it is true that Nvidia's architecture and drivers tend to do more with less.

    Amusingly, even on APUs, where you have no dedicated VRAM, you get better performance if you reserve a section of system RAM as VRAM, because the game can leave assets cached there, with no loading and decompression etc. required. Again, the more memory reserved for this the better, hence why if you run an APU you should have a decent amount of RAM.

    I think you can see GPU memory and shared GPU memory usage in tools?
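
    On Windows you can get at the same "Dedicated"/"Shared GPU memory" figures Task Manager shows via the GPU performance counters. A rough sketch below pulls them from Python through PowerShell's Get-Counter; the counter names are an assumption for Windows 10/11 builds and may differ elsewhere:

[CODE]
# Rough sketch (Windows 10/11): read the GPU adapter memory performance counters,
# the same source Task Manager's "Dedicated"/"Shared GPU memory" figures come from.
# Counter paths are assumed; check with 'typeperf -q "GPU Adapter Memory"' on your box.
import subprocess

ps = (
    "Get-Counter '\\GPU Adapter Memory(*)\\Dedicated Usage',"
    "'\\GPU Adapter Memory(*)\\Shared Usage' | "
    "ForEach-Object { $_.CounterSamples } | "
    "ForEach-Object { '{0} = {1:N0} bytes' -f $_.Path, $_.CookedValue }"
)
out = subprocess.run(["powershell", "-NoProfile", "-Command", ps],
                     capture_output=True, text=True)
print(out.stdout)
[/CODE]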
     
  7. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    15,466
    Likes Received:
    4,048
    You can see allocation, that is it, i.e. what the game is allocating in VRAM. Some games have a slider in the menu so you can see if it is going to allocate too much, but you cannot measure actual use in reality. This is why Keef is confused. He thinks his 4070 uses less VRAM than my 6950 XT. That isn't what is going on, obviously. The game sees I have more, and allocates more.

    BTW Keef, all of the terms you use like DLSS etc. come under the "cheating" bracket to me. I buy AMD 100% because of their REAL raster performance, and not what Nvidia have convinced people into thinking is actual raster performance by way of cheating.

    Edit in: BTW, this is why I think the next gen of AMD APUs are (well, I know, given the rumours) going to be extremely fast, because DDR5 is now hitting 8000 MT/s, which is what will give them the biggest boost. It still won't be anywhere near as fast as GDDR, but it will go a long way to making them run a lot faster.
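
    To put rough numbers on that (back-of-envelope only, assuming dual-channel DDR5-8000 against, say, the 18 Gbps GDDR6 on a 256-bit card like the 6950 XT):

[CODE]
# Back-of-envelope memory bandwidth, assumed configs (not measured figures):
# dual-channel DDR5-8000 on an APU vs 18 Gbps GDDR6 on a 256-bit bus.
ddr5_gb_s  = 8000e6 * 8 * 2 / 1e9   # 8000 MT/s x 8 bytes/channel x 2 channels = 128 GB/s
gddr6_gb_s = 18e9 * 256 / 8 / 1e9   # 18 Gbps per pin x 256 pins / 8 bits      = 576 GB/s

print(f"DDR5-8000, dual channel : {ddr5_gb_s:.0f} GB/s")
print(f"GDDR6 18 Gbps, 256-bit  : {gddr6_gb_s:.0f} GB/s")  # still ~4.5x the APU's bandwidth
[/CODE]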

    Oh and Keef. Also something to think about.

    I would bet my hat Nvidia know how much VRAM is being used, and have the tools to know, and this is why they deliberately hobble their GPUs. Planned obsolescence I think HWUB called it.

    But yes, basically, if the tools existed and you could chart it, lmfao, they would be dead on launch. ALL of them. But they are not, are they? Because Nvidia know. They also likely know that they can block these sorts of tools, and I would bet they have, too.

    That is why you don't get charts on it in reviews etc.
     
    Last edited: 29 Jul 2023
  8. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    15,466
    Likes Received:
    4,048
    [image: COD MW VRAM readout]

    It wasn't over 10GB, it was exactly that much. That is COD MW. So how did it run really well on a 2080 without that VRAM? Allocation. Obviously, if a game actually needed that much, an 8GB card would be in deep doodoo from the off.
     
  9. keef247

    keef247 Modder

    Joined:
    11 Aug 2006
    Posts:
    1,619
    Likes Received:
    244
    Ha, you clearly didn't read anything I said :hehe: you've just repeated back to me what I said to you in the first place (in three different posts and on another thread).

    I'm well aware that the game allocates more in reserve if it sees you have more VRAM. Hence why I said that before you did.

    I have literally said this three times.

    And you weren't even aware of how to get the Adrenalin overlay up when I asked you for output on your FPS/power usage the other day.

    You also don't undervolt or manually overclock, you just click the auto button, so it sounds like you're greatly out of touch with the Adrenalin software, let alone Afterburner etc.

    So what do you mean there isn't an OSD showing allocated vs used live data? I suggest you learn to use the monitoring output in Adrenalin/Afterburner... and look at EVERY video online comparing live output of FPS/VRAM/GPU/CPU usage!

    Why do you think there's some hidden secret a GPU company is hiding and some conspiracy about it? If that were the case we wouldn't be able to monitor temps or anything else, and wouldn't have friggen sensors on everything, fml.

    Hence why it's relied on when showing statistics for overclocking, benchmarking and tweaking for stability/smoothness...

    It literally shows you how much VRAM is being used on the fly, as does the software for Nvidia.

    You're just assuming I mean the point where you initially set up the graphics settings, where it shows you a rough idea of what it'll allocate. My point was that real-time reality greatly differs from that figure, and if you'd bothered to look up a side-by-side of our cards playing the same games at the same res and settings with real-time output on usage, like I suggested, you'd see I have a point: the game merely allocates more in reserve for cards with more to spare, but in reality they either use roughly the same as each other, or one card is better optimised for a given game than the other so uses less or more...

    As for raw raster FPS, are you skim reading again mate? I said three times I can match your FPS without DLSS in every game, then beat it with DLSS on...

    For example, 4K high NATIVE in TLOU; then I can beat that WITH DLSS as well, whilst running ultra settings at 92 FPS, and get higher than you using FSR, as we know FSR isn't as effective.

    I clearly said multiple times it'll match your FPS with AND without DLSS...

    You also mentioned on here, lol, how you're using FSR in TLOU, so you're now contradicting yourself. I remember reading you were getting, iirc, 52-58 FPS at 4K, so you used FSR to get 60+, which is exactly the same as me, only I do that at ultra and you were using high.

    I can also beat it at RT, so hardly a bad card is it... All whilst using less total power at the wall, including a monitor, amp and speakers, than the raw TBP of your GPU... So hardly shite is it!

    I suggest you actually read what I put in future, as you've literally jumped in half-cocked stating things as facts that I've already said first. I.e. we are both correct on those facts...

    As for your COD example, as I've said from the start now four times, what gets used depends on the card/game. Hence why I said yesterday, for example: don't you think it's strange how, at the same settings/FPS/res, our cards vary in actual VRAM usage on live statistics - i.e. you could have two different 10GB cards and they'd allocate differently because they're two different cards!

    Regardless of this, as long as you have some VRAM left on the table and are getting the FPS/settings/res you want, with or without your choice of FSR/DLSS/native, then you're winning...

    THAT was my point in the first place: that we are both able to achieve the same results in 90% of games (unless it's, as you say, something biased heavily towards VRAM like FC6), so it's impressive, isn't it, how well the 4070 is dealing with VRAM usage/efficiency...

    Either way I'm not gonna repeat myself again... The fact you've literally quoted me back at me proves you weren't even listening, and neither was anyone else... So it's just a waste of time...

    I'm well aware how hardware works, along with overclocking/undervolting/memory allocation etc. So being condescending, saying I know nothing when you're, lmfao, quoting my own convo back at me, just proves who's being a dick here, doesn't it.
     
    Last edited: 29 Jul 2023
  10. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    15,466
    Likes Received:
    4,048
    Your 4070 is about equal to a 3080, which is around 30% slower than a 6950 XT.

    As for what I do with it? yeah, I don't spend hours of my life trying to save on power.

    That is why it uses a lot more power. Because it is a lot faster. Efficiency on the 6000 series was excellent compared to Ampere. Nvidia knew this too. Problem is you can't get around that when the card has a very big die and goes very fast.

    TBH, the 4070 is one of the worst cards of that gen. For the price-to-performance and the 12GB of VRAM it's pretty terrible. However, I never said a word because that was down to you: your money, your life. It is also why I tried not to spend many, many hours messing with my rig and changing it all, because I would have had to put it all back. Even the resolution settings, etc.

    Older? Yup. Wiser? Won't go into that. Let's just say I like me a nice easy life now.
     
    SuperHans123 likes this.
  11. keef247

    keef247 Modder

    Joined:
    11 Aug 2006
    Posts:
    1,619
    Likes Received:
    244
    It isn't a lot more powerful. YOU stated you had to turn on FSR to be able to hit 60+ FPS at 4K in TLOU, yet just now you were giving it all that about native - contradicting yourself big time...

    I can get THE SAME FPS as you at 4K high in TLOU without DLSS/FSR, then beat you with DLSS on.

    There are like a handful of games, esports shite I've no interest in running uncapped, that you'll beat me in at the high end of triple figures. But who cares?

    When it comes down to what matters, the most demanding visuals at 60 FPS, native or with 'cheating', we can match each other, and when we both use 'cheating' I WILL beat you, as DLSS 3.0 and frame gen shit on FSR. FACT.

    Seeing as you mainly play at 4K on a TV, with or without FSR, at 60 FPS, then as long as it hits a locked 60, whether that's with 'cheating' or not, it's good isn't it... I can do the same, but with the 'cheating' on I can beat you, and in many games with RT on too, thanks to DLSS 3.0/frame gen being way better than FSR...

    Go get 92 FPS at 4K ultra with ray tracing on Psycho in Cyberpunk with DLSS 3.0/frame generation, or 1440p at 120+ FPS, proven everywhere, then come back to me about how I'm slower... You can't hit those figures.

    It doesn't use more power because it's faster; it uses more power because it's a BINNED chip that didn't make the grade (hence the term), so they roided it up to the gills and marketed it as its own thing entirely rather than chucking them away because they didn't meet the strict requirements of the 6900 XT...

    That's the whole point of binned chips, surely you know this?
    Instead of chucking them away they just name them something else, tweak them, clock the tits off them, then sell them as a separate product!

    Hence why yours uses 140W MORE than a 6900 XT yet isn't worlds apart in terms of performance, unlike a 7900 XT/XTX is...

    The 6700 XT is slower than mine, or a 3080 etc., by miles and uses 250W TBP, so go work that one out... So wattage doesn't equal more FPS/performance; it's just on a bigger, less efficient process node, with older tech and less optimisation.

    Hence why they're rubbish compared to other cards when you look at the data for idle/video playback/multi-monitor power draw...

    You know full well I'm team AMD/ATI, but I won't be biased and refuse to buy something if it does what I want, especially at a more efficient rate. Who wants to chuck away 300W that could be powering my TV with watts to spare? I.e. it costs me pretty much nothing to rinse my games excessively rather than clock-watching, and I never worry about the bill...

    If it was a let-down I'd have sent it back; I even said this to you in our inbox messages, that if it was shite I'd simply send it back... Why would I 'put up' with it, or sing its praises, unless I was genuinely impressed with its performance vs wattage, and then doubly impressed by what it'll do overclocked and undervolted...
    I wouldn't lie; I'd come on here and say, well, I gave Nvidia a chance, I should have listened, it's shite, I've sent it back and bought the card I should have in the first place, which was the same price!

    It's not like I couldn't afford it, is it... I even came in £600 under budget...

    Until you actually put one in your own rig and see for yourself that all this lack-of-VRAM stuff is utter horseshit, and merely comes down to the game in question, its usage at the time, and how the chosen card works with it, you will only have snake-oil BS from trolls/sponsors/biased people online to quote vs real hands-on experience.

    You know for a fact I'd be fuming if it didn't do what I want, and I'd be laughing and telling you how I'd sent it back and bought your card, which I did plan to do in the first place, as you're FULLY aware!

    I love the fact that I've stated what I've achieved and, instead of saying "oh, actually that's pretty decent for that card/TBP/your situation", all I've got back is childish playground "no, I'm the best, you're shite, cause I've read online that X does Y..."

    So I suggest you do the research hands-on next time vs speculating based on what teh interwebz claims from trolls/shills/sponsored BS - obviously once you've plugged my card into your rig, tested it, recorded your results, then done a back-to-back with yours at the same settings in the aforementioned games with and without DLSS 3.0/frame gen, then come back to me... But you won't, you'll just rely on the trend/fanboy opinion online...

    I coulda just bought a 7900 XT/XTX and slated your card the same, but I didn't. My point the whole time has been how mine matches yours on 90% of FPS/settings/res, then will beat it thanks to the clever DLSS 3.0/frame gen tech that's on it - which isn't a dig, it's just me being impressed with the next-gen tech it has, because the AMD stuff doesn't have it, as you're WELL aware... And me being an AMD guy, this has obviously impressed me vs going without it if I'd gone for yours or a 6800 XT...

    I'm out. Unsubbed!
     
    Last edited: 29 Jul 2023
  12. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    15,466
    Likes Received:
    4,048
    Not sure where you are getting your figures from, but...

    [image: benchmark chart]

    [image: benchmark chart]

    In the games I play it is 30%. However, had they included the games where the 4070 runs out of VRAM and goes down to 1FPS it would be a lot more. That never happens on a 6950XT. Not at any resolution.

    Not sure why you seem to take it all so personally; no idea. You are about the only person I have ever seen speaking positively about the 4070. Every reviewer panned it, and it was almost as bad as the 4060 Ti. In fact, they only get "better" as you pay more and more, which of course was all deliberate, and why the 4070 and the cards below it have been ragged on so hard.

    The rest? seems like angry ranting to me dude, sorry.
     
  13. The_Crapman

    The_Crapman World's worst stuntman. Lover of bit-tech

    Joined:
    5 Dec 2011
    Posts:
    7,887
    Likes Received:
    4,151
    I have ignored what you said because I have no interest in reading a giant wall of rambling, which I have told you before. Begrudgingly, I have gone back and read it, because now, like a child desperate for attention, you've started name-calling.

    So, where do I begin? Well, there's actually not an awful lot said despite the pages of text. "You" have noticed that games allocate as much VRAM as they can and don't actually use it :jawdrop:. Shocked. I am shocked. Well, not really. This is a well-known thing and has been true for a long time. So I can ignore the thing we all already knew.

    You say your 4070 can handle 4K in a particular game, in your particular system, at whatever settings it was. Well done. You used a card. How does this help in a debate about whether that card would be faster/better value/have more longevity than the same card with more VRAM? It does not.

    You originally really wanted an AMD card but bought Nvidia and are happy. Welcome to purchaser bias.

    So, what have you provided that is in any way related to, valuable to, or a demonstration of my theory that adding more VRAM to 4000-series cards will not help performance? Nothing.

    You claim I have quoted and regurgitated other people's work, when, to the best of my knowledge, no one else is talking about it from this viewpoint. What I have done is form a theory and find evidence that supports it from sources unrelated to my line of argument, so it's fairly independent. Unfortunately, there are very few sources, because Nvidia (in my opinion) are hiding their shame of a card that is the 4060 Ti 16GB. They know more VRAM will not help in a meaningful way (again, in my opinion), so they haven't sampled it, have forbidden AIBs from sampling it, and are delaying the inevitable (imo) backlash when independent reviews show it to be ********.

    But this is just my opinion: a largely unsupported theory, being discussed with a reasonable-minded person who keeps things nice and succinct and relevant and puts forward points that make me question myself, rather than rambling on aimlessly at great length about things that are unrelated and irrelevant.

    Peace out.
     
    SuperHans123 likes this.
  14. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    15,466
    Likes Received:
    4,048
    TBH I kinda knew deep down he was going to go Nvidia. I did a load of testing for him last week on the 6950 XT, and I should have known not to spend hours upon hours on it as he had already made his mind up. Which is fine, I just wish I hadn't spent all afternoon doing that. Whatever, I tried to help.

    He then told me he was buying a 4070. That's fine. As I told him earlier? it's his money and his life.

    I then get dragged into an argument on here about how great a 4070 is. That's the part of this that has been lost on me, and I obviously don't get it. Almost like he is trying to force me into accepting it is better or something? IDK. Not going there. Already wasted far too much time on this.

    I wish him luck though. Trying to convince the internet he made the correct decision will be a very difficult task.
     
  15. SuperHans123

    SuperHans123 Multimodder

    Joined:
    27 Dec 2013
    Posts:
    2,206
    Likes Received:
    435
    Set your budget, then read/watch as many respected reviews as you can of cards within your budget.
    Make your choice.
    Play games.
    Stop staring at frame rate counters.
     
    Vault-Tec, IanW and The_Crapman like this.
  16. IanW

    IanW Grumpy Old Git

    Joined:
    2 Aug 2003
    Posts:
    9,473
    Likes Received:
    3,032
  17. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    15,466
    Likes Received:
    4,048
  18. SuperHans123

    SuperHans123 Multimodder

    Joined:
    27 Dec 2013
    Posts:
    2,206
    Likes Received:
    435
    Reading some reviews and benchmarks yesterday, that 7600 looks OK value for money.
     
  19. Spanky

    Spanky Multimodder

    Joined:
    14 Apr 2010
    Posts:
    1,101
    Likes Received:
    313
    I think at the initial release price of £299 it didn't look too appealing. At £230 it's a bloody good card. If people's performance expectations don't exceed the cost of the card then they will be very happy. I had an RX 6600 I used for a while and I thought it was great.
     
    Vault-Tec and SuperHans123 like this.
  20. IanW

    IanW Grumpy Old Git

    Joined:
    2 Aug 2003
    Posts:
    9,473
    Likes Received:
    3,032
