
Graphics Nvidia 1100/2000 Series Thread.

Discussion in 'Hardware' started by The_Crapman, 7 Aug 2018.

  1. kuroidoragon

    kuroidoragon Minimodder

    Joined:
    14 Oct 2012
    Posts:
    190
    Likes Received:
    6
    Do we need to rob a bank or sell a kidney again to be able to afford these ?

    I thought the prices were going to come down after the whole Bitcoin mining thing, as they promised. Obviously they haven't.

    Nvidia RTX cards can be pre-ordered now for $600 for the 2070, $750 for the 2080, and $1200 for the 2080 Ti with some variation by manufacturer.
     
  2. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,062
    Likes Received:
    970
    Nvidia reveals that raytracing can mean raytracing but doesn't necessarily mean raytracing...

    https://www.overclock3d.net/news/software/nvidia_clarifies_-_rtx_in_games_doesn_t_mean_ray_tracing/1

    :wallbash::wallbash::wallbash:
     
  3. The_Crapman

    The_Crapman World's worst stuntman. Lover of bit-tech

    Joined:
    5 Dec 2011
    Posts:
    7,680
    Likes Received:
    3,939
    MLyons likes this.
  4. loftie

    loftie Multimodder

    Joined:
    14 Feb 2009
    Posts:
    3,173
    Likes Received:
    262
    MLyons and The_Crapman like this.
  5. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
  6. yuusou

    yuusou Multimodder

    Joined:
    5 Nov 2006
    Posts:
    2,878
    Likes Received:
    955
    Truth isn't truth.
     
  7. Sentinel-R1

    Sentinel-R1 Chaircrew

    Joined:
    13 Oct 2010
    Posts:
    2,389
    Likes Received:
    408
    But $1200 is definitely $1200. That's one thing they're sure of.
     
    Otis1337 and The_Crapman like this.
  8. Wakka

    Wakka Yo, eat this, ya?

    Joined:
    23 Feb 2017
    Posts:
    2,117
    Likes Received:
    673
  9. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,132
    Likes Received:
    6,727
    Is this the point at which I smugly point out that my article on the matter very clearly splits the games into supports-raytracing and supports-DLSS? Engaging smug mode...

     
  10. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    So here's something I've been trying to work out, and it would be handy if someone could either confirm or clarify it for me: AFAIK games mainly depend on FP32 calculations, while machine learning/AI mainly uses matrices of FP16 calculations, yes?

    If so, does ray tracing mainly depend on FP64 calculations?
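    The precision gap the question is pointing at is easy to see from Python's standard library: `struct` can round-trip a value through IEEE half (FP16, format `'e'`), single (FP32, `'f'`) and double (FP64, `'d'`) precision. A quick illustrative sketch:

```python
import struct

# Round-trip a value through a given IEEE floating-point width and see
# what precision survives. 'e' = FP16, 'f' = FP32, 'd' = FP64.
def round_trip(fmt, value):
    return struct.unpack(fmt, struct.pack(fmt, value))[0]

third = 1.0 / 3.0
half = round_trip('e', third)    # FP16: ~3 decimal digits of precision
single = round_trip('f', third)  # FP32: ~7 decimal digits of precision

print(half)    # 0.333251953125
print(single)  # 0.3333333432674408
```

FP16's 10-bit mantissa is plenty for the statistical weights in ML inference but nowhere near enough for geometry, which is why games stick to FP32.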
     
  11. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    Quiet, you! How are we supposed to get outraged over not actually reading what an acronym means and making an incorrect guess if you just go and spell it out so clearly?
    AIUI, the RT cores use FP32 for ray intersection calculations and offload INT operations to the Tensor cores for the ray preselection calculations (basically using ML-derived statistics to make a good guess as to where the bounce rays need to point in order to usefully contribute to the image, rather than just spewing as many rays as possible omnidirectionally with most being discarded or contributing little to the final pixel). Turing also allows FP and INT operations to execute at the same time.
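    For a sense of the FP32 intersection maths the RT cores accelerate in fixed-function hardware, here's a plain-Python sketch of the standard Möller-Trumbore ray/triangle intersection test (purely illustrative; the hardware pipeline obviously doesn't look like this):

```python
# Möller-Trumbore ray/triangle intersection, written out in plain Python.
def ray_hits_triangle(origin, direction, v0, v1, v2, eps=1e-7):
    """Return the distance t along the ray to the hit, or None for a miss."""
    def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
    def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
    def cross(a, b): return (a[1]*b[2]-a[2]*b[1],
                             a[2]*b[0]-a[0]*b[2],
                             a[0]*b[1]-a[1]*b[0])

    edge1, edge2 = sub(v1, v0), sub(v2, v0)
    pvec = cross(direction, edge2)
    det = dot(edge1, pvec)
    if abs(det) < eps:           # ray parallel to the triangle's plane
        return None
    inv_det = 1.0 / det
    tvec = sub(origin, v0)
    u = dot(tvec, pvec) * inv_det           # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    qvec = cross(tvec, edge1)
    v = dot(direction, qvec) * inv_det      # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(edge2, qvec) * inv_det          # distance along the ray
    return t if t > eps else None

# Ray fired from z=-1 straight down +z at a triangle in the z=0 plane:
print(ray_hits_triangle((0, 0, -1), (0, 0, 1),
                        (-1, -1, 0), (1, -1, 0), (0, 1, 0)))  # 1.0
```

Every ray bounce runs tests like this against candidate triangles, which is why doing it on dedicated silicon rather than the shader cores matters so much.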
     
    Corky42 likes this.
  12. loftie

    loftie Multimodder

    Joined:
    14 Feb 2009
    Posts:
    3,173
    Likes Received:
    262
    Nvidia spent half the keynote banging on about RTX and ray tracing, showing examples of RTX on and off with ray tracing being displayed. They then talked about DLSS, with no mention of RTX, and showed an example with it on and off, again with no mention of RTX.

    Their own dev site lists
    and also doesn't mention specifically DLSS, though I appreciate NGX is there. If RTX is a fusion of all of those parts, one of which is ray tracing, if you remove the ray tracing then how can it still be RTX? Surely that's like removing water from cola and saying you've still got cola?

    Props to Gareth for picking up the split in the games; I thought I'd read mention of it before the OC3D link, and now I guess I know where! Gareth doesn't mention that all the games Nvidia listed come under RTX games. Most people, myself included, would have looked at that slide of games and seen RTX, after being shown lots of examples of RTX on/off all including ray tracing, their own platform slide showing ray tracing is part of it, and (seen afterwards) Nvidia's own description stating it's a fusion of, in part, ray tracing, and probably assumed that RTX games would all include ray tracing.

    If, along with the ray tracing part of RTX, we also remove the AI bit and just keep rasterisation, can we now claim that all games are RTX games?

    NB: I used the AnandTech keynote coverage because I couldn't be bothered to sit through the presentation again; I switched off enough the first time.

    PS I'm actually interested to see how DLSS affects performance in games, I assume it could give it a boost vs normal AA.
     
  13. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    So how much do the GeForce RTX cards borrow from the professional cards? It seems that over the years (5-10) Nvidia's uarchs were slowly diverging; things mainly used for AI and/or CAD/CAM were not really needed in a gaming-orientated GPU. Do the RTXs allow Nvidia to put what was essentially redundant silicon on down-binned parts to good use?
     
  14. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,132
    Likes Received:
    6,727
    I think you're misinterpreting what Nvidia's saying there (and you're not alone): The RTX platform includes ray tracing, deep learning, and Other Shiny Buzzwords; you could be using any combination of these, including only ray tracing or only deep learning, and you're still using a feature of the RTX platform. Using a feature of RTX? You're using RTX.

    To extend your highlighting from that documentation: 'The Nvidia RTX platform fuses ray tracing, deep learning and rasterisation [...]'

    If I buy a car which "fuses great off-road capabilities with extreme in-city efficiency" but only ever use it in the city, that doesn't mean the manufacturer was trying to mislead anyone when it said it offered both.

    They're basically, and pretty much always have been, the same thing: GeForce gets mainstream memory and artificially-crippled floating-point performance, along with a licence agreement forbidding you to use 'em in a data centre; Quadro is the same thing hardware-wise but with fewer lower-end options, a couple of extra higher-end options, and ECC memory (but only external to the GPU); Tesla is the same thing but with the graphics ports sliced off. They're all the same GPUs, effectively, just with different stuff strapped to 'em to suit each market segment.

    Yes, there are other technical differences, but that's a basic overview of Nvidia's (and AMD's) market strategy: build one GPU family, sell it to as many markets as possible.
     
  15. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    With Volta being the exception, never making its way down to consumer cards (with the exception of the Titan V, which is about as 'consumer' as the Quadro GV100).
     
  16. Sentinel-R1

    Sentinel-R1 Chaircrew

    Joined:
    13 Oct 2010
    Posts:
    2,389
    Likes Received:
    408
    Wonder how long before cryptocurrency starts making use of Tensor cores?

    £2k GPUs, anyone?
     
  17. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    I get what you're saying, and perhaps I did a bad job of explaining what I meant. At what point in slicing things off does it no longer resemble the original uarch? Arguably CUDA 'cores' are an example of what you and I mean: they're included in GeForce cards because they're in the Tesla/Quadro range, but the utility of GPGPU in games is questionable.

    I'm not saying it's useless or isn't used in gaming just that it's not used to a great extent, and it seems Tensor 'cores' are similar in that they're dead handy for machine learning tasks but of questionable value in a gaming card.

    Do the RTXs allow Nvidia to put what was essentially under-utilised silicon on down-binned parts to better use?
     
  18. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,132
    Likes Received:
    6,727
    Err... The CUDA cores are the graphics cores. No CUDA cores, no 3D acceleration. While Nvidia calls 'em CUDA cores, they're more properly termed unified shader cores: CUDA is a very clever way of making non-shader applications run on shader cores, which is why we call it general-purpose graphics processing unit (GPGPU) and not general-purpose processing unit. So, you've got that exactly backwards: CUDA cores exist because games need 'em, and the Tesla range came about because a very clever person at Nvidia went "hey, I bet I can Do Science on those buggers." And he was right.
    This, though, you've got exactly right. Deep learning isn't really something games need to do on the desktop and the Tensor cores were developed solely for Quadro/Tesla/Drive purposes, but Nvidia's already found a way to go "see, you need these really" to gamers by working out a way to run better supersampling on 'em (and by using them to accelerate the RT cores, too.)
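    For reference, the brute-force baseline DLSS is competing with is plain supersampling (SSAA): shade at a higher resolution, then average blocks down to display pixels. A minimal sketch of that averaging step (DLSS aims to approximate this kind of result from fewer shaded samples, using a network running on the Tensor cores):

```python
# Brute-force supersampling downsample: average each 2x2 block of a
# 2H x 2W greyscale image into one display pixel.
def downsample_2x(image):
    h, w = len(image) // 2, len(image[0]) // 2
    return [
        [
            (image[2*y][2*x] + image[2*y][2*x+1]
             + image[2*y+1][2*x] + image[2*y+1][2*x+1]) / 4.0
            for x in range(w)
        ]
        for y in range(h)
    ]

# A hard black/white edge shaded at 2x resolution...
hi_res = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [0, 1, 1, 1],
]
# ...comes out with a softened (anti-aliased) edge pixel at display size.
print(downsample_2x(hi_res))  # [[0.0, 1.0], [0.5, 1.0]]
```

Shading 4x the pixels to get that softening is expensive, which is the gap a learned upscale is meant to close.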
    Not sure I understand the question, I'm afraid. In my defence I've had some kind of illness the past couple of days that's made me feel sick and dizzy, and my brain ain't firing on all cylinders.
     
    MLyons likes this.
  19. loftie

    loftie Multimodder

    Joined:
    14 Feb 2009
    Posts:
    3,173
    Likes Received:
    262
    I may well be misinterpreting Nvidia's buzzwords, but as I said: if you spend the entire time talking about RTX and showing ray tracing every time, but when you bring up DLSS you're suddenly talking about NGX and there are no RTX stamps on the demo picture, it's not hard to see why.

    To counter your car analogy, both the off-road capabilities and the city efficiencies would be used in their counterpart location, albeit to a different degree. The car doesn't stop using its off-road capabilities if it's trying to climb a steep hill in a city, nor does it stop using its fuel efficiencies off-road whenever the situation allows its engine to be fuel efficient. (My cola analogy was tastier, btw; might be less healthy :p )

    To me 'fuses' is the key word: sure, the platform fuses the capabilities of all the things in their photo, but if a game makes use of RTX I would assume that it makes use of that entire fusion, not just part of it. And as I said, Nvidia didn't help with their presentation. I'd also go back to

    because rasterisation is part of RTX. How much can we remove from RTX and have it still be RTX? IMO none; in Nvidia's opinion you can remove the ray tracing part. They should have labelled the games as something else, maybe using a label such as "ray tracing enhanced" or RT games, and labelled the slides RT on/off, since that's what they called their ray tracing core.

    Hope you feel better!
     
  20. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,132
    Likes Received:
    6,727
    All but one, basically. If you're using any part of the RTX platform, you're using the RTX platform. I use Linux, which is a platform that can do everything from making sure your self-driving car doesn't crash into buildings to making satellites work, but I only use it to argue with people on internet fora. Doesn't mean I'm not using Linux, though. Not using it fully, but I'd wager nor is anybody else.

    To bring it back to RTX itself, if you're using some kind of RTX-specific rasterisation feature, then yes: you could be doing only rasterisation and still say "I'm using RTX." If what you're doing isn't RTX-specific, though, then you can't - even if it runs on RTX. (Though you could say "my software is RTX compatible," but then so is everyone else's.)
    Ta. Getting a bit tired of it now, would quite like it to go away!
     
