
Graphics Dumping nVidia

Discussion in 'Hardware' started by thehippoz, 1 Jul 2011.

  1. thehippoz

    thehippoz What's a Dremel?

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    I've finally decided to dump nvidia and go ati.. just bought a 6970 and will probably buy another one for the other machine

    mainly because I use cuda and think it's a joke.. seeing guys with ati roll over cuda in everything from hashes to now mining.. we've come to the conclusion that folding at home is definitely paid off

    I've been to stanford quite a few times, and all those guys in the engineering wing are asian.. I could see huang walking down the halls saying 'ok guys here's how we're going to do it'

    was talking about this yesterday with a friend.. notice it's hard to get 5850s anymore? last year we knew something wasn't right when guys with 4 nvidia cards in a cuda rig pulling insane amounts of power were getting wiped out by one 5850, and now mining has pretty much pegged it

    maybe they have a partnership going with nvidia to give them folding and other apps.. I mean there's only 2 video card companies- without one, the other can't survive.. weird you don't see ati supporting it like you see nvidia

    like when vlc added gpu acceleration last year.. ati wasn't even supported in windows! people out of the fruit loop now know how bad nvidia hardware actually is in comparison

    or in layman's terms, nvidia markets like crazy.. here's an old thread from back in 2009 that explains it pretty well.. but he forgot to add- folding is also paid off and reach around friendly :blush:

    http://hardforum.com/showthread.php?p=1034740407
     
  2. Bozwell

    Bozwell What's a Dremel?

    Joined:
    26 Jun 2011
    Posts:
    26
    Likes Received:
    1
    Hmmm. A conspiracy theory with just a hint of racism.

    Maybe no-one is doing anyone any favours, and ATI's architecture just lends itself to number crunching more than Nvidia's? Who could blame Nvidia for being more interested in making things look better rather than crunching numbers faster? It is a video card after all.

    It's how it has always been. ATI is ahead, then Nvidia, then ATI, etc. Same with CPUs: AMD vs Intel.
     
    MazzaB likes this.
  3. greypilgers

    greypilgers What's a Dremel?

    Joined:
    23 Jan 2011
    Posts:
    442
    Likes Received:
    23
    In most real-game FPS comparisons I look at, ATi seems to be quite close to Nvidia until AA is turned on; then ATi falls much further behind. I would rather have pretty screens than be able to crunch numbers quicker.

    Swings and roundabouts
    Horses for courses
    Spades and Shovels
    (Insert cliche here)

    :D
     
  4. Zurechial

    Zurechial Elitist

    Joined:
    21 Mar 2007
    Posts:
    2,045
    Likes Received:
    99
    I stopped reading here. What the ****? You've posted a mixture of good sense and absolute drivel in the past, but lately your posts are just getting worse and worse, with more and more ridiculous leaps in logic.
    The quoted comment is the kind of thing that gets people on my ignore list here.
     
  5. thehippoz

    thehippoz What's a Dremel?

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    just the way things work sometimes.. I'm pretty sure I know more about the asian community than you guys do xD that's the only thing that can explain the folding farce for so long
     
  6. docodine

    docodine killed a guy once

    Joined:
    10 Feb 2007
    Posts:
    5,084
    Likes Received:
    160
    ???

    Explain
     
  7. Teelzebub

    Teelzebub Up yours GOD,Whats best served cold

    Joined:
    27 Nov 2009
    Posts:
    15,796
    Likes Received:
    4,484
    Lol, what do you expect from someone who wears a pumpkin on their arse?? lol
     
  8. Sloth

    Sloth #yolo #swag

    Joined:
    29 Nov 2006
    Posts:
    5,634
    Likes Received:
    208
    The 5850 is no longer in production, so it being hard to get is no big surprise. A couple of months ago Sapphire (iirc) acquired a few remaining 5850s, which they then sold at quite low prices; that was pretty much the last sale of new 5850s.

    Furthermore, if AMD hardware is so shockingly better, why don't you have some hard proof rather than theoretical estimates based on specifications? Gaming performance certainly shows that AMD doesn't have some mysterious advantage.
     
    Last edited: 1 Jul 2011
  9. thehippoz

    thehippoz What's a Dremel?

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    hehe well let's just say asians stick together like white on rice

    pretty cool what's happening in opencl.. guys who are optimizing opencl right now are finding even faster ways to exploit ati cards too.. nvidia not even in the picture- why I think conspiracy.. I posted this in the folding forum here on bit, I think late last year when I was playing around with pyrit, might have been earlier this year

    it ended with the seasoned folders calling me a troll, go figure

    ah that's why then.. thanks sloth

    I was trying to get one before the tax law for online purchases in cali went into effect today
     
  10. BRAWL

    BRAWL Dead and buried.

    Joined:
    16 Aug 2010
    Posts:
    2,668
    Likes Received:
    186
    Number crunching... tragically is the way forwards.
     
  11. thehippoz

    thehippoz What's a Dremel?

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    yeah it is

    just in number crunching.. in gaming nvidia still has it (probably down to drivers and twimtbp)- guess it depends on what you use it for
     
  12. Sexton

    Sexton Minimodder

    Joined:
    2 Jun 2010
    Posts:
    621
    Likes Received:
    19
    For me, I would rather have my eyes pulled out and eaten by a rapist than own another ATI card. God awful things. The drivers are just ridiculous - with the last one I owned, I think I spent more time looking at a BSOD than at what I wanted to see, and even when it did manage to do what it was supposed to, it didn't do it very well. Turning AA on was like asking a badger to drive me into town - not something that would end well.

    Thank christ NVIDIA get everything right. Couldn't give a monkey's toss about numbers... a graphics card's job is to make things look pretty and run smoothly - thankfully NVIDIA are good at that, unlike ATI, which just makes cheap crap that looks awful, doesn't work and has a total lifetime of about a week (if you're lucky).

    Really have never seen why ATI became so popular... they were perhaps ahead of the GPU game for a few weeks, but NVIDIA rightfully took back their top spot not long after and I don't see that changing any time soon... at least I hope not.
     
  13. mjb501

    mjb501 What's a Dremel?

    Joined:
    20 Jun 2010
    Posts:
    37
    Likes Received:
    7
    I did GPGPU for my MRes last year, and I can tell you there is no great nVidia conspiracy.

    GPGPU started by using HLSL and GLSL shaders to do the GPU processing in applications, so back then the fastest gaming card was also the best GPGPU card. (This was before ATI moved to vector processors.)

    Then nVidia brought CUDA along, which meant programmers without a background in graphics programming could do GPGPU. It was therefore amazingly popular, which meant all the universities bought nVidia GPUs and the academics learnt CUDA.
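
    To give a rough flavour of why that mattered, here's a bare-bones CUDA C sketch of an element-wise addition (illustrative only, error checking stripped out): no shaders, no graphics pipeline, just a function that runs once per element and a launch that spawns enough threads to cover the array.

    // Minimal CUDA C sketch: add two arrays element-wise, one thread per element.
    // Illustration only - error checking omitted for brevity.
    #include <cuda_runtime.h>
    #include <cstdio>

    __global__ void addKernel(const float *a, const float *b, float *out, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)                          // guard the tail when n isn't a multiple of the block size
            out[i] = a[i] + b[i];
    }

    int main()
    {
        const int n = 10000;                // the same 10,000 floats as the worked example below
        const size_t bytes = n * sizeof(float);

        float *ha = new float[n], *hb = new float[n], *hout = new float[n];
        for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

        float *da, *db, *dout;              // device copies
        cudaMalloc((void **)&da, bytes);
        cudaMalloc((void **)&db, bytes);
        cudaMalloc((void **)&dout, bytes);
        cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

        const int threads = 256;
        const int blocks = (n + threads - 1) / threads;   // enough blocks to cover every element
        addKernel<<<blocks, threads>>>(da, db, dout, n);
        cudaMemcpy(hout, dout, bytes, cudaMemcpyDeviceToHost);

        printf("out[0] = %f\n", hout[0]);   // expect 3.0
        cudaFree(da); cudaFree(db); cudaFree(dout);
        delete[] ha; delete[] hb; delete[] hout;
        return 0;
    }

    Compare that with having to express the same calculation as a pixel shader rendering into a texture, and it's easy to see why CUDA took off with non-graphics people.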

    OpenCL hasn't been around as long as CUDA and requires you to learn another API, which may be a much better API than CUDA, but most programmers tend to stick to the best documented one (check out Amazon: there are far more CUDA books than OpenCL ones). Also, a great deal of research projects have been done in CUDA, meaning it is easier to find references, and methods for getting the utmost performance from a CUDA application are better known. Look how little use DirectCompute gets at the moment compared to CUDA, and that has the backing of Microsoft!

    I don't think nVidia are bribing Stanford, but they do clearly know CUDA better, which is why nVidia GPUs get much better performance. That's not to say nVidia aren't giving them help or early access to stuff, but AMD could offer the same help!

    Now let's discuss the different processors used by nVidia and ATI, as this may help to explain why nVidia appear to win when, on paper, they shouldn't.

    VLIW5 (AMD 5000 series): A single vector processor can perform the SAME instruction on a vector with up to five components.

    VLIW4 (AMD 6000 series): A single vector processor can perform the SAME instruction on a vector with up to four components.

    SCALAR (nVidia): Each component of a vector is processed on its own scalar processor.

    Right, to demonstrate the differences between the types of processors and why they make raw processing numbers useless, we are going to assume that each processor can perform the same number of operations per second and has the same read and write speed to memory.

    To use your example cards (and a 5970 for the old VLIW5), I want to perform an addition between two sets of 10,000 floating point numbers on the GPU:

    AMD Radeon HD 5970: 3200 ALUs - (640 Vector processors giving 3200 ALUs)

    You'd think it can add 3,200 floats concurrently, so it would take 3 iterations with every processor busy and then a final pass where 400 of them do an addition. This would be true if I did/could vectorise the values; however, let's say I can't, or am lazy. Now I can only do 640 additions per iteration, so it would actually take over 15 iterations, quite a difference!

    AMD Radeon HD 6990: 3072 ALUs - (768 Vector Processors giving 3072 ALUs)

    It would take just over 3 iterations vectorised, but unvectorised it can only do 768 additions per iteration, so 13 (and a bit). VLIW4 reduces the penalty because there are more actual vector processors.

    Nvidia GTX 590: 1024 ALUs - (1024 Scalar Processors giving 1024 ALUs)
    As it is scalar, it would take 9 and a bit iterations either way: slower than AMD when you can vectorise, but much faster when you can't.

    (I know this is based on a lot of assumptions and doesn't take into account some of the important stuff like memory bandwidth or clock speed, plus the fact that the example GPUs are dual-GPU cards, which makes the maths even more suspect!)

    The above example should demonstrate the gap between stuff that can be vectorised and stuff that can't be, and why you can get such massive differences in performance.
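
    If you want to play with the numbers yourself, here is a tiny host-side sketch (my own illustration, using the same simplifying assumptions as above: every ALU does one add per iteration and memory is free) that reproduces the iteration counts:

    // Host-side sketch: reproduce the iteration counts from the example above.
    // Same assumptions as the post: one add per ALU per iteration, memory ignored.
    #include <cstdio>

    struct Card {
        const char *name;
        double vectorProcs;   // number of vector (or scalar) processors
        double vectorWidth;   // ALUs per processor: 5 for VLIW5, 4 for VLIW4, 1 for scalar
    };

    int main()
    {
        const double n = 10000.0;   // floats to add

        const Card cards[] = {
            { "Radeon HD 5970 (VLIW5)",   640.0, 5.0 },   // 640 x 5 = 3200 ALUs
            { "Radeon HD 6990 (VLIW4)",   768.0, 4.0 },   // 768 x 4 = 3072 ALUs
            { "GeForce GTX 590 (scalar)", 1024.0, 1.0 },  // 1024 scalar ALUs
        };

        for (const Card &c : cards) {
            double vectorised   = n / (c.vectorProcs * c.vectorWidth); // every ALU busy
            double unvectorised = n / c.vectorProcs;                   // only one ALU per processor used
            printf("%-26s vectorised: %5.2f iterations, unvectorised: %5.2f iterations\n",
                   c.name, vectorised, unvectorised);
        }
        return 0;
    }

    It spits out roughly 3.1 vs 15.6 for the 5970, 3.3 vs 13.0 for the 6990 and 9.8 either way for the GTX 590, which is where the numbers above come from.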

    The "improvement" in the Radeon 6000 Series over the 5000 Series and why it can have better performance which fewer ALUs is because reducing the vector size from 5 units to 4, is a better fit for most vector maths that the GPU will perform, especially in games fits where 3D vector calcuations and colours are typically 4 units rather than 5. This meant that 5000 Series had some processors idle for a lot of the time if it didnt have special coding.

    Basically it is the same debate as CPU vs GPGPU, except here it is scalar processors (nVidia) vs vector processors (ATI): each is good at certain things but has penalties for others, so you code for the majority hardware.

    Anyway, basically you can't tell anything from RAW numbers; it depends on so much stuff that you HAVE TO benchmark in actual software to get even close to a realistic idea of the relative performance!

    I had better declare that I used DirectCompute for my MRes, and I have used both ATI and nVidia GPUs in the research and in my own builds.

    Also remember that most cross-platform games are ported to PC from the Xbox 360 because of code compatibility, so the shaders are written to work on the X1900, which used scalar processors, like nVidia still use!
     
    Guest-16, thetrashcanman and Jipa like this.
  14. Pete J

    Pete J Employed scum

    Joined:
    28 Sep 2009
    Posts:
    7,255
    Likes Received:
    1,822
    Either way, do a few benchmarks for us (gaming and otherwise) when you get two 6970s in Crossfire hippoz!
     
  15. kingred

    kingred Surfacing sucks!

    Joined:
    27 Mar 2008
    Posts:
    2,462
    Likes Received:
    87
    I love the way tons of people gleefully let themselves be pigeon-holed into their respective tech-based fanboy boxes, easily segmented and marketed to!
     
  16. outlawaol

    outlawaol Geeked since 1982

    Joined:
    18 Jul 2007
    Posts:
    1,935
    Likes Received:
    65
    I've had both brands and currently run ATI/AMD. I'd use either one in all honesty. I tend to buy at the lower end of the scale with hardware; new tech is just expensive TBH (getting cheaper, indeed, but still).

    My point is one thing: SLI/Crossfire. I've got a 5850 and it does what I want, and I'm now looking at getting another to Crossfire it with because it's cheap to do so. I'll effectively give my graphics a 70-80% boost in performance. I honestly believe this is the way to do upgrades without having to buy everything all over again. I just don't see how people can dump a complete multi-GPU setup (like a 4870 or a 460) to buy a single card that will perform a bit better than that setup (yes, one card, but really? A $500 purchase?!). So buy mid-range ($300 or so), then 1-2 years later buy the same GPU again and Crossfire it (about $150 or so), and bam! An instant decent upgrade that'll run nearly anything once again. I was looking at selling my 5850 and going to 2x 6970s, which would have cost upwards of $700, but I spent $160 instead and it'll still rock the socks off of everything.

    Now if budget is not an issue who cares, rebuild and buy every 2-3 weeks... But only rich nutcases do such things... :D
     
  17. play_boy_2000

    play_boy_2000 ^It was funny when I was 12

    Joined:
    25 Mar 2004
    Posts:
    1,618
    Likes Received:
    146
    SLI/xfire is a waste of money imo. You just end up spending the money elsewhere (dual PCIe mobo/PSU/cooling/power bill/etc.) and at the end of the day, the only thing you're guaranteed to get is the annoying whirr of 2 fans.
     
  18. Jipa

    Jipa Avoiding the "I guess.." since 2004

    Joined:
    5 Feb 2004
    Posts:
    6,367
    Likes Received:
    127
    And then there's the guy who has the same experiences with Nvidia cards, so as a result we're all really stupid to buy either manufacturer's cards? So then, let's just recommend Intel IGPs to everyone, as other solutions are clearly absolute rubbish and don't work? :rolleyes::duh:

    Thank you! I hope that helps the OP.

    Dual PCIe mobo? How many modern mobos are there that do not support CFX? PSU? Most people have enough oomph on the PSU anyway. Cooling? What? Just what? Power bill? The CFX set isn't really going to draw any more power than the comparable single-GPU solution.
     
  19. law99

    law99 Custom User Title

    Joined:
    24 Sep 2009
    Posts:
    2,390
    Likes Received:
    63
    WTF man? So you decide to ignore all logic and go with some "birds of a feather flock together" type nonsense?
     
  20. thehippoz

    thehippoz What's a Dremel?

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    get the cards in tomorrow.. 6970 and 6770.. if these work well I'll get another 6970 for the other rig

    I already know the numbers though- can run some compares.. and law- it's really the only thing that makes sense as to why the ati client is so bad at folding.. I know good opencl programmers could look at the ati folding client and make the changes, so maybe I'm trippin

    mjb has a good handle on maybe why nvidia leads here.. but it's funny to me how guys with nvidia cards used for brute force will defend their cards to the hilt (probably cause they've invested so much in them)

    like one guy on another forum (who will remain anonymous) built a rig just for crunching and was getting around 80k a second with 4 gtx480's.. oh bow to the mighty! xD

    his claim to me was that the reason he only builds nvidia is because ati misses a lot.. now get this: his logic was nvidia never misses and gives the best results without any do-overs

    of course this goes against our own tests with a 5850 that does 80k/sec by itself.. there's a lot of disinformation out there
     
