
Blogs Nvidia says Fermi is great, but who knows?

Discussion in 'Article Discussion' started by Cutter McJ1b, 18 Jan 2010.

  1. crazyceo

    crazyceo What's a Dremel?

    Joined:
    24 Apr 2009
    Posts:
    563
    Likes Received:
    8
    Christ, get off the high horse and read the posts for a change.

    I haven't compared anything with future, yet-to-be-released products. It was the other ATi fanboys claiming ATi will still be "ALL CONQUERING!" after Fermi is released.

    Yes, you are right, the current crop of cards do not go far enough. DX11? Who's going to immediately benefit from that? The community has only just got to grips with DX10. Making money? AMD made money last year due to Intel paying them off. Without it, they made losses! Review scores? That just proves my point that the new 5xxx is just a very small step up from the 4xxx series. If you purchased a high-end 4xxx last year, are you going to spend all that money again on a high-end 5xxx this year? Obviously not.

    As to the £110 GTX260, Aria had them running for a while, but popularity obviously took over and they sold that batch out. Be quicker next time.

    Just because YOU own one, doesn't make it the greatest.
     
  2. memeroot

    memeroot aged and experienced

    Joined:
    31 Oct 2009
    Posts:
    1,215
    Likes Received:
    19
    Only 3D really seems to offer the benefits that I'm looking for from the new batch of cards... I'd rather have that plus the power benefits of the ATI cards, but there doesn't seem to be that option at the moment.

    hence waiting to see what Fermi offers
     
  3. barndoor101

    barndoor101 Bring back the demote thread!

    Joined:
    25 Oct 2009
    Posts:
    1,694
    Likes Received:
    110
    dude, just chill. the only reason ati fanboys are saying the 5000 series is all-conquering is because right now it is. the HD5870 is twice as fast as the next single-GPU card (GTX285).

    just remember that the last time there was a massive leap forward (like 3x the performance of the last gen) was the 8800 (3 yrs ago), and nvidia haven't been able to repeat that trick since.

    for someone building a gaming PC right now, the HD58xx series is what you are forced to get. fermi might be released in march, but when is general availability? 1-2 months down the line? also, don't expect a huge leap forward compared to the 58xx series. latest i heard was a 15-20% increase at best (with a chip almost twice as big and right on the edge of the ATX power envelope).

    i have 2x HD4890s in CF. i have considered getting a HD5870 to replace them, just because i think the two cards are noisy, and for eyefinity. but now i might just wait for the 58xx series refresh.
     
  4. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,577
    Likes Received:
    196
    Actually no one has.
     
  5. barndoor101

    barndoor101 Bring back the demote thread!

    Joined:
    25 Oct 2009
    Posts:
    1,694
    Likes Received:
    110
    that's my point - no one has been able to make such a huge leap forward since, so to expect it from every new product generation is asking too much.
     
  6. crazyceo

    crazyceo What's a Dremel?

    Joined:
    24 Apr 2009
    Posts:
    563
    Likes Received:
    8
    "Also, dont expect a huge leap forward when compared to the 58xx series. Latest i heard was a 15-20% increase at best "

    Based on what evidence? No one is quoting facts about the structure of Fermi, because no one knows.

    Why can't we expect every new release to push the limits like the 8800GTX did? That's the yardstick they all have to aim for. Have ATi done it in the last 4 years? No! Have Nvidia recaptured that excellence? On the current product base, clearly no! Will Fermi recapture it? No one, including you, has any idea, beyond ATi fanboy rumours and wishes that it won't.

    That's the problem with the current ATi range: it doesn't push far enough past the 4xxx series to warrant the upgrade for most people.

    As a community, we have to demand that ATi and Nvidia push those boundaries, not just rush out the next replacement with DX11 or an HDMI port or two and a new number at the front.

    Why is that asking too much?

    You can settle for far less if you like, but I'm not parting with £400+ on a card that gives me nothing more than an HDMI port.
     
  7. barndoor101

    barndoor101 Bring back the demote thread!

    Joined:
    25 Oct 2009
    Posts:
    1,694
    Likes Received:
    110
    then don't. no one is forcing you to.

    but think about it this way: how much R&D money do you think it takes to create a product 3-4 times better than the previous generation? do you think these companies will spend that much EVERY generation? of course they won't (which is why you find 8800 tech in the current gen of nvidia cards). it is in their interest to deliver an evolutionary change rather than a revolutionary one, simply because some people (like most who frequent these forums) will stay at the bleeding edge and pay good money for that extra edge, as insignificant as it is. then there are people like yourself who haven't upgraded since the 8800 - no extra money going into nvidia's pockets from the 200 series.

    maybe they could recoup their R&D spending every revolutionary generation, but they make more money on the evolutionary path.
     
  8. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,577
    Likes Received:
    196
    You could say the same for the X1950 ---> HD2xxx transition, or the 9800Pro users ----> X800XTs...

    It's unreasonable to expect a company to make a product 3x+ faster than its current one, and to be honest, the HD5xxx is 2x faster than the previous gen. Take a look - the only reason it doesn't seem so is that we've had X2 cards of late.
     
  9. crazyceo

    crazyceo What's a Dremel?

    Joined:
    24 Apr 2009
    Posts:
    563
    Likes Received:
    8
    BHAAHHHHHH!

    Sorry, just pretending to be a sheep like you!

    It would be cutting edge if it was good. Sadly it isn't! Let's just wait for Fermi and then decide, and not be an ATi fanboy like you!

    The companies SHOULD be spending the money from their successes on the R&D for the next generation of cards. Nvidia didn't, and thus the GTX200 range, although good, just wasn't great. ATi had been playing catch-up for a few years after getting spanked by Nvidia, and therefore had the time to develop an OK card, but again it wasn't great.

    I could put my hand in my pocket today and happily go for a 3-way system, but I won't until I see how the next generation pans out. You go follow the rest of your sheep and potentially waste your money.

    I'll wait and see what Fermi brings to the table.
     
  10. thehippoz

    thehippoz What's a Dremel?

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    ati said in that interview with bit-tech.. they won't be ahead of fermi at launch, but they will be later in the year.. they've had their card out since last september.. but unlike nvidia, i doubt they've been sitting on their hands the whole time expecting to milk the competition

    really, both companies are money milkers.. since they were caught price fixing already, i don't see how you can like one over the other.. just go with price to performance (not to mention features) - it'd be kinda lame to buy nvidia right now..

    you gotta admit nvidia's dropped the ball this round.. they're only just releasing fermi and ati has had all this time; they will counter, they said it themselves.. i'd like to see the leobeater card myself

    i'm probably getting the 5850 when the price drops some.. pretty sure they won't have much to compete with that card pricewise
     
  11. barndoor101

    barndoor101 Bring back the demote thread!

    Joined:
    25 Oct 2009
    Posts:
    1,694
    Likes Received:
    110
    not being a sheep, just being a realist. i realise that these companies' first loyalty is to their creditors and investors, and they have to make as much money as possible. perhaps you should remove your head from your arse, then you might see this too.
     
  12. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,577
    Likes Received:
    196
    This is where your argument somewhat fails. You can't realistically expect any, I repeat, any company to use all of its resources on one project: that's first of all risky, secondly expensive, thirdly makes the investors unhappy, and finally just unrealistic. Not to mention that the G80/G92 chips were good enough that almost nothing could catch up to them - but look at the time it took to get there...

    Late 2005 --> 2007 for the G80, and it only took a year for the G80 to be replaced by the GT200. If they hadn't replaced the G80, they would've been stomped on (well, even more) by the RV7xx chips.

    Also, in this market top performance doesn't win anything, and in fact isn't even the goal; the goal is to one-up the opposing team just slightly, or enough to convince consumers to purchase your product. If they had G80-like advances every generation, well frankly, we wouldn't need new GPUs for a while.
     
  13. thehippoz

    thehippoz What's a Dremel?

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    they could have kept on top, like intel does with amd, if they'd kept going.. i think they were riding on their success as the fastest single-gpu cards for quite a while and milking that.. i really hope fermi is all they say it is - but i've learned not to believe it until i see it..

    same with ati.. remember the hype after the 8800gtx was released - i put off buying until that pos ati released. i was a big fan of the older x800 - even us ati fanboys jumped ship over to the g80, as nvidia had fixed the image quality issues and managed to make ati's new card look clueless

    price-wise we were looking at 600+ on the gtx.. considering how good it was at the time - i bought one, and it kept its value long enough to sell it for 350 on ebay and go with the 9800gtx on release.. that card was a downgrade in certain respects, like supersampling

    i think that's what buyers into nvidia thought about the g92.. it was nothing but a cheap way for them to make a sub-par 8800gtx.. it wasn't until the 200 series that we saw a real jump up

    i can say nvidia cards hold their value.. i've been through 3 generations now and expect to sell off this 260 for around 80 bucks - but i'm probably going over to that 5850, unless fermi is affordable or so far over-the-top ridiculous (like the g80 was) that it's worth the price

    not really likin nvidia right now
     
  14. IanW

    IanW Grumpy Old Git

    Joined:
    2 Aug 2003
    Posts:
    9,199
    Likes Received:
    2,706
  15. crazyceo

    crazyceo What's a Dremel?

    Joined:
    24 Apr 2009
    Posts:
    563
    Likes Received:
    8
    White noise, nothing more. Realist? Again, you bring nothing to the conversation.
     
  16. crazyceo

    crazyceo What's a Dremel?

    Joined:
    24 Apr 2009
    Posts:
    563
    Likes Received:
    8
    "This is where your argument somewhat fails. You can't realistically expect any, I repeat, any company to use all of it's resources on one project, that's first of all risky, secondly expensive, thirdly makes the investors unhappy, and finally it is just unrealistic. Not to mention that the G80/G92 chips were good enough that almost nothing could catch up to it, but look at the time it took for them to get there..."

    Once it was there, it stuck around for over 2 years before anyone even came close to it. Don't you think in that time they could have come up with something better?

    It's naive to think that just because they did it once they couldn't do it again, or that ATi couldn't release a product making the same level of impact. It's almost as if both companies are just making do, since there really isn't anything out there software-wise to tax them.

    Both companies are making money (ATi especially, after the Intel handout), so why not push the boundaries?

    Otherwise, every release will still be held up against the 8800GTX and asked whether it has made that level of impact. Unless Fermi can, the question will go on being asked of both companies.
     
  17. barndoor101

    barndoor101 Bring back the demote thread!

    Joined:
    25 Oct 2009
    Posts:
    1,694
    Likes Received:
    110
    Ah, my mistake. I didn't realise that 'bringing something to the conversation' meant calling anyone who disagrees with you a fanboy and then insulting them.

    Until Nvidia release Fermi, I and a lot of other people will be sceptical about anything they say, based on their past deeds.
     
  18. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,577
    Likes Received:
    196
    Remember that the G80 was a revolutionary chip, much like the RV3xx chips. The GT200 was an evolution or refinement of the G80, as was the G92, although the G92 really shouldn't count since it's more or less a die shrink.

    Look at their histories: every DX iteration there's usually a revolutionary card followed by cards that refine the process (from the shader-pipeline method, refined up until the G70 and RV5xx, to the shader cores introduced with the G80/RV6xx series and refined even to this day). From the RV3xx, which marked DX9, to the G80, these were revolutionary for sure, but in between most people didn't need to upgrade every gen, rather every other gen.

    That's because the high-end market isn't the most profitable; the big money lies in price/performance for OEMs and mid-to-low-range consumers.

    If you look back, many GPUs were held against the 9800PRO, which even today is quite formidable. And until we find a new revolutionary approach for GPUs, they will be compared to the 8800GTX, because it was, well, a damn good performing product for the shader-core method.
     
  19. crazyceo

    crazyceo What's a Dremel?

    Joined:
    24 Apr 2009
    Posts:
    563
    Likes Received:
    8
     
  20. barndoor101

    barndoor101 Bring back the demote thread!

    Joined:
    25 Oct 2009
    Posts:
    1,694
    Likes Received:
    110
    quoting fail. i haven't called you a fanboy in the slightest, yet you persist in doing it.

    you say you wanted AMD to push the boundaries. how could they have done this when the intel settlement happened AFTER the HD58xx shipped? you can't reinvent the wheel every generation when you have no money (although the gfx part of AMD - ATi - is the part which makes the most money).
     