
Graphics Nvidia 1100/2000 Series Thread.

Discussion in 'Hardware' started by The_Crapman, 7 Aug 2018.

  1. loftie

    loftie Multimodder

    Joined:
    14 Feb 2009
    Posts:
    3,173
    Likes Received:
    262
    I think we could discuss RTX forever, but I imagine we're both gonna see things from different points of view. I do see where you're coming from though, and I understand what you're getting at with your analogies. I still think they've left it ambiguous (maybe deliberately so?), both in their presentation and on their dev page.

    Grab yourself all the chicken soup Gareth!
     
    MLyons likes this.
  2. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,085
    Likes Received:
    6,635
    Oh, aye, they're being deliberately cheeky, no question about that.
    Chicken soup for the soul! L'chaim!
     
    MLyons likes this.
  3. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,059
    Likes Received:
    970
    Doesn't make a presentation where half the list of RTX games only uses a tiny fraction of it any less shady, especially when mixed with the whole preorders-with-no-reviews-yet thing...
     
  4. stealth80

    stealth80 Minimodder

    Joined:
    1 Feb 2015
    Posts:
    343
    Likes Received:
    9
    I agree with you, but AMD did the same with Ryzen - pre-orders before reviews. I think it's just the way business is handled now.

    With this price hike I'm officially boycotting Nvidia. I was considering doing a new build based around an 8700K and 2080 Ti, but a £200 hike over the 1080 Ti with no reviews yet... it's like a 25% price increase. Again - they did it with the 1080, and I said so at the time; the problem then was I didn't wanna sell my G-Sync. Well, now I have, and I'd rather CrossFire 2x Vega 64s for less than a 2080 Ti than give Nvidia any more of my money.
     
  5. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    Not really; Compute Unified Device Architecture 'cores' is just Nvidia's name for its programmable parallel processors. In the GeForce range they're mainly used for graphics-related arithmetic, while in the professional range of cards they can be tasked with computational work other than graphics.

    The general-purpose processing that GPUs get used for outside of graphics is just a variation on the arithmetic needed for graphics work. CUDA is just Nvidia's fancy name for programmable parallel processors; there was 3D acceleration before CUDA, after all, and programming the parallel processors to do other stuff came afterwards.

    Requiring the programmable parallel processors to do more than FP32 calculations adds complexity but isn't much use for graphics workloads, hence why it seems to me that all these extra features have been added to graphics processors so they can work better on non-graphics work - unless, as I suspect RTX is attempting, those extra but underused (in the GeForce range) features can be put to good use in graphics workloads.
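    To put that in concrete terms, here's a minimal CUDA sketch (the kernel names and the tiny test harness are just made up for illustration): the same arithmetic written for FP32 and FP64 runs on the same programmable parallel processors, and consumer GeForce parts simply provide the FP64 path at a small fraction of the FP32 rate, because graphics almost never asks for it.

    ```cuda
    #include <cstdio>
    #include <cuda_runtime.h>

    // The same "a*x + y" arithmetic in FP32 and FP64. Both run on the same
    // programmable parallel processors ("CUDA cores"); consumer GeForce parts
    // just offer far less FP64 throughput because graphics rarely needs it.
    __global__ void axpy_fp32(int n, float a, const float *x, float *y)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];   // typical graphics-style FP32 maths
    }

    __global__ void axpy_fp64(int n, double a, const double *x, double *y)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];   // same maths, wider units: mostly a compute feature
    }

    int main()
    {
        const int n = 1 << 20;
        float *x, *y;
        cudaMallocManaged(&x, n * sizeof(float));
        cudaMallocManaged(&y, n * sizeof(float));
        for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

        axpy_fp32<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);
        cudaDeviceSynchronize();

        printf("y[0] = %f (expect 5.0)\n", y[0]);
        cudaFree(x);
        cudaFree(y);
        return 0;
    }
    ```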

    That doesn't sound good, hope you feel better soon. :(
     
  6. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,085
    Likes Received:
    6,635
    I'm entirely happy to assume I'm being stupid owing to either my illness or just the fact that I'm stupid, but as far as I'm aware this isn't right. The CUDA cores are what Nvidia called "unified shaders" or "scalar processors", which replaced the single-purpose vertex and fragment shader hardware of everything up to and including the GeForce 7 series. When CUDA came around, the cores didn't stop being unified shaders; CUDA simply repurposed them for new workloads. Because Nvidia knows the power of branding and because it's the only one that offers CUDA support, the "unified shaders" became "CUDA cores."

    The same thing applies, though: you take the CUDA cores away from any Nvidia graphics card from the GeForce 8 series upwards and it'll no longer do any 3D acceleration, because it won't have any unified shaders any more - and it already lost its vertex and fragment shaders, so it's now trying to do 3D rendering with no shaders at all. Which, y'know, won't work.

    (If you want to get even more technical, no current Nvidia graphics card has any shaders at all: it has scalar processors which can run shader or CUDA workloads, and which Nvidia calls "CUDA cores.")
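    For what it's worth, here's a purely illustrative sketch of what "scalar processors that can run shader or CUDA workloads" looks like in practice: the core job of a vertex shader - multiplying each vertex position by a 4x4 matrix - written as an ordinary CUDA kernel. The names are mine rather than anything from Nvidia's toolkit, but the point is that it's the same arithmetic on the same hardware either way.

    ```cuda
    #include <cuda_runtime.h>

    // Illustrative only: the heart of a vertex shader (position x 4x4 matrix)
    // expressed as a CUDA kernel. Whether this arithmetic arrives through a
    // graphics API as a shader or through CUDA as a kernel, it executes on the
    // same scalar processors Nvidia brands "CUDA cores".
    __global__ void transform_vertices(const float4 *in, float4 *out,
                                       const float *m,   // 4x4 matrix, row-major
                                       int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;

        float4 v = in[i];
        out[i] = make_float4(
            m[0]  * v.x + m[1]  * v.y + m[2]  * v.z + m[3]  * v.w,
            m[4]  * v.x + m[5]  * v.y + m[6]  * v.z + m[7]  * v.w,
            m[8]  * v.x + m[9]  * v.y + m[10] * v.z + m[11] * v.w,
            m[12] * v.x + m[13] * v.y + m[14] * v.z + m[15] * v.w);
    }
    // Launch with e.g. transform_vertices<<<(n + 255) / 256, 256>>>(d_in, d_out, d_m, n);
    ```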
     
  7. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    I get the impression we're talking about the same thing: yes, it went from vertex and fragment processors, to unified shaders, to CUDA 'cores', but in my eyes they all trace their heritage back to accelerating 3D graphics. That acceleration mainly uses specific arithmetical computations, yet it seems to have picked up a fair amount of arithmetical baggage along the way that has nothing to do with accelerating 3D graphics.

    That's fair enough, but making a single chip where two-thirds of it isn't being used (*am I using the right word?) probably isn't great.

    *When I say "isn't being used" I don't mean it's not doing anything at all. If I design an ASIC to perform FP64 calculations and only send it an FP32 calculation every clock cycle, then while it's calculating that FP32 it uses the same amount of resources (power, silicon area, memory bandwidth, etc.) as if I had asked it to do an FP64. Would underutilised be more correct?
     
  8. The_Crapman

    The_Crapman World's worst stuntman. Lover of bit-tech

    Joined:
    5 Dec 2011
    Posts:
    7,653
    Likes Received:
    3,909
    Just wait until next year's International Lighting Artists Conference....
     
  9. RedFlames

    RedFlames ...is not a Belgian football team

    Joined:
    23 Apr 2009
    Posts:
    15,401
    Likes Received:
    2,996
    Is it ever off? :p

    Also if you don't have that as a t-shirt I'll be mildly disappointed.
     
    MLyons likes this.
  10. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    When the venerable GeForce 256 launched, nobody was doing transform and lighting in hardware, so all that die area was 'wasted', with all existing games using software T&L.
    PBR didn't kill off texture artists, just changed the skillset required. Lighting Designers exist for IRL lighting for buildings, structures, spaces, etc. Similar skills will be needed for games too.
     
  11. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    I've thought about it more on the way home and I really didn't do a good job of getting what was inside my head out onto the screen. :)

    Wasted is maybe the wrong word. Basically what I was trying to say is that a 'core' inside a GPU, however we want to define that, is designed to process X bits of data per clock cycle, be it 8, 16, 32, or 64 bits - did I get that right?

    If I design each core to handle 32 bits and I send it 16 bits each clock cycle, then I'm processing half the data but using the same resources as if I had processed 32 bits. I could use rapid packed maths, but that comes with drawbacks.
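    (For the curious: rapid packed maths is AMD's name for it, but the rough CUDA-side equivalent - shown below purely as an illustrative sketch - is packing two FP16 values into one 32-bit register with the half2 type, which needs FP16-capable hardware, roughly compute capability 5.3 and up.)

    ```cuda
    #include <cuda_fp16.h>

    // Packed FP16: two 16-bit values share one 32-bit register, so a single
    // instruction performs two multiplies. That's the idea behind "rapid
    // packed math"; the drawback is the reduced precision of FP16.
    __global__ void scale_fp16x2(const __half2 *in, __half2 *out, __half2 s, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            out[i] = __hmul2(in[i], s);   // two FP16 multiplies in one go
    }
    ```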

    I wonder if I'm making any sense anymore. :)
     
  12. The_Crapman

    The_Crapman World's worst stuntman. Lover of bit-tech

    Joined:
    5 Dec 2011
    Posts:
    7,653
    Likes Received:
    3,909
    Well if you're going to drink a crap American lager on the job you best up your skill set!
     
  13. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,085
    Likes Received:
    6,635
    Closest I've got is an Ace Rimmer T-shirt, I'm afraid!
    I think I understand now (the rum's probably helping): you're not saying the CUDA cores are Quadro/Tesla-specific, you're saying that over the years they've been modified to be better suited to GPGPU workloads at the potential expense of traditional GPU workloads. Aye, it's hard to argue against that one, I reckon.
     
  14. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    I knew I was being infuriating and more than a bit Tim Nice-But-Dim, but OMG, I've driven you to drink. o_O ;)
     
    The_Crapman and Gareth Halfacree like this.
  15. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,059
    Likes Received:
    970
  16. Sentinel-R1

    Sentinel-R1 Chaircrew

    Joined:
    13 Oct 2010
    Posts:
    2,379
    Likes Received:
    404
    Gamers Nexus tore that article apart on Saturday too. New senior editor, apparently. I don't think he'll be at Tom's much longer...
     
    MLyons likes this.
  17. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,059
    Likes Received:
    970
    No tears would be shed if he was kicked to the curb; we certainly don't need the excessive feeding of hype and preorder culture that infests gaming in hardware as well.
     
  18. MLyons

    MLyons 70% Dev, 30% Doge. DevDoge. Software Dev @ Corsair Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    3 Mar 2017
    Posts:
    4,177
    Likes Received:
    2,744
    Anfield and adidan like this.
  19. RedFlames

    RedFlames ...is not a Belgian football team

    Joined:
    23 Apr 2009
    Posts:
    15,401
    Likes Received:
    2,996
  20. TheMadDutchDude

    TheMadDutchDude The Flying Dutchman

    Joined:
    23 Aug 2013
    Posts:
    4,739
    Likes Received:
    523
    I've got someone at work taking my 1080 for $400, which I will upgrade to a 1080 Ti for an additional $100. They're crazy 'cheap' now.

    The lack of competition from AMD is really showing us that NVIDIA can charge whatever they'd like, and that they'll sell in the boatloads. It's sad, but that's the way the industry goes. Just look at Intel... :lol:
     
