Blogs What’s next for the GeForce 500-series?

Discussion in 'Article Discussion' started by Lizard, 10 Nov 2010.

  1. Lizard

    Lizard @ Scan R&D

    Joined:
    17 Feb 2007
    Posts:
    2,890
    Likes Received:
    34
  2. rehk

    rehk New Member

    Joined:
    21 Feb 2010
    Posts:
    48
    Likes Received:
    2
    I predict wallets suddenly becoming lighter if any of this is even slightly true.
     
  3. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    10,970
    Likes Received:
    325
    Wow, everywhere I look, people are praising the 500 series. Is it really that good? So good that it makes the 400-series cards obsolete?

    Of course, by the time a full lineup has been released, it'd probably be a year since each 500-series card's corresponding 400-series card came to market.
     
  4. okenobi

    okenobi New Member

    Joined:
    3 Nov 2009
    Posts:
    1,231
    Likes Received:
    35
    And when is this mythical 560 due to arrive? I'm assuming it's a ways off yet....
     
  5. Snips

    Snips I can do dat, giz a job

    Joined:
    14 Sep 2010
    Posts:
    1,940
    Likes Received:
    66
    I know I'm biased towards the green team, but even I can't believe AMD would let such an excellent head start slip through their fingers. I know you guys must have the cards in hand, but can't you just let slip the potential of the AMD cards?
     
  6. flipman

    flipman New Member

    Joined:
    21 May 2010
    Posts:
    20
    Likes Received:
    0
    "wow i never saw this coming" so lets wait and see if it will make this differences ,AMD is near the release of there cards
     
  7. mrbens

    mrbens New Member

    Joined:
    15 Aug 2009
    Posts:
    511
    Likes Received:
    4
    Nvidia do seem to be impressing at the moment. I'm delighted with my 460 and can't wait till around March to buy another for SLI.

    C'mon Clive!
     
  8. TWeaK

    TWeaK Member

    Joined:
    28 Jan 2010
    Posts:
    521
    Likes Received:
    7
    In their defense, most of their advantage was lost due to TSMC screwing up 32nm. That was the cat they were going to let out of the bag after Nvidia responded, instead TSMC threw the bag in a river.
     
  9. new_world_order

    new_world_order 4.0 GHz Dremel

    Joined:
    9 Nov 2010
    Posts:
    37
    Likes Received:
    0
    ROFL!!!!!!!!!!!!!!!!!!! :)
     
  10. memeroot

    memeroot aged and experianced

    Joined:
    31 Oct 2009
    Posts:
    1,215
    Likes Received:
    19
    indeed - looking forward to them... but WHEN!
     
  11. Lord-Vale3

    Lord-Vale3 His Tremendousness

    Joined:
    1 Dec 2009
    Posts:
    301
    Likes Received:
    8
    That's interesting. nVidia seems to have jumped right back onto its feet.
     
  12. chrisb2e9

    chrisb2e9 Dont do that...

    Joined:
    18 Jun 2007
    Posts:
    4,059
    Likes Received:
    46
    This is how it goes. One company has the lead for a while, and then the other one will; back and forth, and so on.
     
  13. Xtrafresh

    Xtrafresh It never hurts to help

    Joined:
    27 Dec 2007
    Posts:
    2,999
    Likes Received:
    100
    Granted, the 580 is a good card that has truly unlocked the potential of the Fermi architecture, but I'm not a big fan of this kind of wild conjecture. For instance, the GF104 is a completely different design from the GF100 (of which the GF110 is a direct derivative). There's no basis at all to believe it has as much headroom left to be unlocked as the GF100 did; for a start, the GTX 460 doesn't have any unused SMs like the GTX 480 did, does it?

    Going even further, you could say that the 6870 is the successor to the 5770, and it shows a 70% performance increase (min framerate, 19x12, here: http://www.bit-tech.net/hardware/graphics/2010/11/09/nvidia-geforce-gtx-580-review/7), so by the same logic the 6970 will also feature a 70% performance hike over the 5870, obliterating nVidia till kingdom come.

    Of course, there are a million things wrong with all that, just as your ludicrous maths has no merit to it whatsoever. I'd expect to see these kinds of posts from nVidia fanboys, but you're supposed to be a respected journalistic outlet. Yeah, fine, it's a blog and some speculation is fine, but please don't become the next Charlie.
     
  14. WarrenJ

    WarrenJ Well-Known Member

    Joined:
    14 Oct 2009
    Posts:
    2,764
    Likes Received:
    251
    Personally, I would like to see a dual-GPU card à la the GTX 295. Maybe two 460s or two 470s? Are they on their way yet?
     
  15. borandi

    borandi New Member

    Joined:
    27 Jan 2010
    Posts:
    128
    Likes Received:
    1
    The newer fp16 capabilities were on the GF104 and 106 already. Only the GF100 cards didn't have them.
     
  16. Evildead666

    Evildead666 New Member

    Joined:
    27 May 2004
    Posts:
    340
    Likes Received:
    4
    This seems a bit too much AMD bashing for a blog on the 500 series.
    "...ATi (they're gone btw) could be truly scuppered."
    "...the Radeon HD 6850 1GB and HD 6870 1GB will look laughably underpowered in comparison."

    Your PR is as strong as Nvidia's, and you state you have info about the 6900 series as well, which could influence some people (insider knowledge and all that).

    So basically you're giving free PR to Nvidia, whilst trashing AMD's new 6xxx lineup.

    And yet this is 'speculation' ?

    I'm not pro- any side, but I do take exception to someone's personal speculative blog being on the front page of what should be a respectable tech site.
    Being respected means not being biased. If Bit-Tech is leaning one way or the other, put THAT on the front page and change your name to NvNews or something.

    This should not be on the Front page.
     
    Ficky Pucker likes this.
  17. schmidtbag

    schmidtbag New Member

    Joined:
    30 Jul 2010
    Posts:
    1,082
    Likes Received:
    10
    I think now that Nvidia has finally built that supercomputer, they can finally spend their money and attention on their video cards. IMO, the GTX 500 series is just them milking their old architecture for what it's worth, but I'm sure the GTX 600 series will be much faster, consume much less power, produce much less heat and noise, and probably be smaller too. Although I currently use ATI, I've always been an Nvidia fan - they've always tried really hard to be the best; they just couldn't afford it this past year, but I'm sure they'll come back.

    Although I'm glad at how well ATI/AMD have been doing (they really needed it), I'm REALLY disappointed in their HD 6000 series so far. It's barely better; I sure hope the HD 69xx cards will live up to expectations.


    I do feel that bit-tech is a little biased towards Nvidia. In the article I read yesterday about the GTX 580, it did do great, but it also wasn't #1 in every test; bit-tech was acting as though it was #1 in all of them. I checked guru3d.com, and it was placed #2 in almost every test.
     
  18. Phoenixlight

    Phoenixlight New Member

    Joined:
    29 Oct 2010
    Posts:
    64
    Likes Received:
    0
    "...the Radeon HD 6850 1GB and HD 6870 1GB will look laughably underpowered in comparison..."
    Not cool.
     
  19. fingerbob69

    fingerbob69 Member

    Joined:
    5 Jul 2009
    Posts:
    801
    Likes Received:
    16
    "...the Radeon HD 6850 1GB and HD 6870 1GB will look laughably underpowered in comparison..."
    Not cool.

    But not right.

    What's more, the 580 isn't a new chip in terms of architecture; it's a mostly fixed 480. If we want to start getting our knickers in a twist over the name on the box, à la 68xx vs 67xx, then this card should rightly be called the GTX 485.
     
  20. leveller

    leveller Yeti Sports 2 - 2011 Champion!

    Joined:
    1 Dec 2009
    Posts:
    1,107
    Likes Received:
    24
    This is the bit of the article that has my attention:

    "Unfortunately, while we know some interesting things about the forthcoming ATI Radeon HD 6900-series, we’re still not allowed to tell you anything about it"

    Which can be taken either way. I hate secrets :(
     