News Nvidia predicts 570x GPU performance boost

Discussion in 'Article Discussion' started by Sifter3000, 27 Aug 2009.

  1. Sifter3000

    Sifter3000 I used to be somebody

    Joined:
    11 Jul 2006
    Posts:
    1,766
    Likes Received:
    26
  2. schwabman

    schwabman New Member

    Joined:
    27 Aug 2009
    Posts:
    4
    Likes Received:
    0
    So that's what, a 57,000% increase? Huang, I am not buying what you are selling!
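    For what it's worth, the factor-to-percentage arithmetic checks out; a throwaway Python snippet (variable names just illustrative):

    ```python
    # Going from 1x to 570x is an increase of 569x over the baseline,
    # so express that as a percentage of the starting performance.
    factor = 570
    pct_increase = (factor - 1) * 100
    print(pct_increase)  # 56900 -- so "57,000%" is about right
    ```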
     
  3. V3ctor

    V3ctor Tech addict...

    Joined:
    10 Dec 2008
    Posts:
    583
    Likes Received:
    2
    ...and then comes ATi with 569 1/2x for half price... :D:D
     
  4. yakyb

    yakyb i hate the person above me

    Joined:
    10 Oct 2006
    Posts:
    2,064
    Likes Received:
    36
    Umm, any idea why it's exactly 570x?
     
  5. 13eightyfour

    13eightyfour Formerly Titanium Angel

    Joined:
    9 Sep 2003
    Posts:
    3,413
    Likes Received:
    115
    It's probably the number of times they can rebrand the same card as something new. :D
     
  6. GFC

    GFC New Member

    Joined:
    7 Nov 2008
    Posts:
    118
    Likes Received:
    0
    Yeah... and let me guess, it's gonna be the size of the moon, right?
     
  7. DaMightyMouse

    DaMightyMouse New Member

    Joined:
    1 Aug 2008
    Posts:
    49
    Likes Received:
    2
    Probably will need a nuclear reactor just to power it...
     
  8. bogie170

    bogie170 New Member

    Joined:
    11 Aug 2008
    Posts:
    340
    Likes Received:
    5
    And 2 Nuclear Reactors for SLI :p
     
  9. biebiep

    biebiep New Member

    Joined:
    12 Dec 2007
    Posts:
    101
    Likes Received:
    3
    Bullcrap.
    nVidia's raytracing acceleration that they use in demos is soooooo ugly compared to real raytracing.

    They'll need that 570x just to be at the same level as what a CPU CAN do.
     
  10. l3v1ck

    l3v1ck Fueling the world, one oil well at a time.

    Joined:
    23 Apr 2009
    Posts:
    12,955
    Likes Received:
    17
    It's gone up a lot in the last five years, but not 570 times. So why would we expect it to go up that much in the next five years?
     
  11. RotoSequence

    RotoSequence Lazy Lurker

    Joined:
    6 Jan 2004
    Posts:
    4,588
    Likes Received:
    7
    Because they think it will sell more product NOW, of course.
     
  12. xaser04

    xaser04 Ba Ba Ba BANANA!

    Joined:
    27 Jun 2008
    Posts:
    2,195
    Likes Received:
    170
    I wonder if these cards will simply be rebadged 8800GT's with clever *cough* marketing?!
     
  13. Zero_UK

    Zero_UK New Member

    Joined:
    12 Aug 2008
    Posts:
    661
    Likes Received:
    9
    No no no no... everyone and Bit-tech, you've misread the data all wrong.

    He's on about PRICE increases... NOT performance. Performance will just increase 0.1% from the next-gen rebranded cards that allow a slightly better overclock.

    ;)
     
  14. damienVC

    damienVC New Member

    Joined:
    25 Aug 2009
    Posts:
    225
    Likes Received:
    17
    Hmmm - that sounds interesting! Something I've been trying to achieve for years through various (legal) mind-altering methods...!
     
  15. Ross1

    Ross1 New Member

    Joined:
    15 Feb 2009
    Posts:
    194
    Likes Received:
    5
    Nvidia's marketing/PR/spin department is one of the most odious around. This, their incredibly annoying naming/rebranding of GPUs, their constant catfights with Intel (which isn't just words; see there being no SLI on P35/P45)... it doesn't do anyone any good.
     
  16. PQuiff

    PQuiff New Member

    Joined:
    7 Sep 2001
    Posts:
    557
    Likes Received:
    4
    Hmm... I think all the large companies like Nvidia, MS, Sony etc. should have someone outside with a big whip. Any time an employee tries to leave, they should get whipped back into the building. Thus, no more of these rapidly escalating stupid comments would come out. And we might actually get some good kit, not just the rebranded stuff we get now.

    I remember when the first GeForce (the 256, to be exact) came out. I nearly pooped myself, it was such a good card, and light years ahead of anything else out there. I had just bought a Number Nine graphics card, but chucked it in the bin for the new GeForce. Can't see me doing something like that for this gen's stuff.

    Bit of subjective stuff here:
    Quake, Jun 96
    Quake 3, Dec 99 - say 100% better (gfx-wise)?
    Half-Life 2, Nov 04 - say 200% better (gfx-wise than Quake)?
    Unreal Tourney 3, Dec 07 - 250-300% better (gfx-wise than Quake)?

    That's over 10 years... and it's a bit of a dodgy comparison and highly subjective... but I'd love to see how that equates to the gfx cards' speed and memory etc. You'd prolly find that a gfx card that is 500% faster won't lead to mind-blowing images or games performance.

    Still, fingers crossed.
     
  17. Denis_iii

    Denis_iii New Member

    Joined:
    1 Jan 2007
    Posts:
    1,224
    Likes Received:
    14
    lmao classic
     
  18. Denis_iii

    Denis_iii New Member

    Joined:
    1 Jan 2007
    Posts:
    1,224
    Likes Received:
    14
    And their crappy mobile drivers, at least for my M7900GTX.
     
  19. l3v1ck

    l3v1ck Fueling the world, one oil well at a time.

    Joined:
    23 Apr 2009
    Posts:
    12,955
    Likes Received:
    17
    Anyone who bought an 8800GTX when it first came out must be feeling pretty smug right now. They're still selling what are effectively die-shrunk versions several years later.
    Let's hope the early DX11 cards offer the same jump in performance over the DX10 cards that the early DX10 cards did over DX9 cards. The leaked specifications for ATi's upcoming cards certainly seem promising.
     
  20. Comet

    Comet New Member

    Joined:
    21 Jan 2009
    Posts:
    27
    Likes Received:
    0
    What NVIDIA's CEO is saying reflects the behind-closed-doors war that is happening. The hardware market is changing drastically. AMD/ATI was in bad shape and NVIDIA was reigning just a couple of years ago. But AMD made one hell of a move that is shaking things up: it bought ATI. It is now the first chip maker that has the "tools" to combine the CPU and the GPU in one single integrated platform.

    Those in the industry know well that this is the way to go. CPUs are becoming more parallel, but GPUs show a lot of promise in AI and physics processing. What you really need is a better fusion between the two. Intel is openly stating that their next CPUs will be able to suit many gamers' needs, and developers are also giving clues that what Intel is saying is not a lie. Intel is attacking NVIDIA's and AMD's markets. NVIDIA is trying to catch up, but it knows it has to have a foot in other platforms as they emerge, hence "THEIR OWN CPU" combined with a GPU. NVIDIA is investing heavily in Atom-based PCs because that's an area where a GPU can still make a huge difference and they can increase profits.

    Let's be clear about one thing. However the market evolves, you can't have a single processor doing AI, physics, graphics, gameplay and all that heavy stuff. You need a hybrid solution. That's what AMD did. It upped the stakes and sent a clear warning to competitors: "To HELL with you. We're moving forward. We're building a hybrid platform. No more depending on third parties."

    Developers are sending the message that as GPUs and CPUs come together, it doesn't make sense for them to talk to each other in different ways. Future platforms will share a "common" language, so to speak, and true parallelism regardless of the core technology. What this means is that highly parallel workloads such as physics, graphics and AI will interchange threads between cores in both the CPU and GPU as if they were one.

    Now imagine this. Imagine that you want to build a platform for cloud computing. You've got a hybrid GPU/CPU chip that interchanges threads. You add another hybrid GPU/CPU chip that shares the same interchange capability. The two of them together can do the same work, interchanging threads between them as if they were one. The more you add, the more efficient the system gets, and you don't lose processing power. They're all interchanging threads with each other; there is no unused core, as happens in the solutions available today. That's what these guys are aiming for.

    When NVIDIA's CEO states that GPUs will get a 500x boost and the CPU only a fraction of that, what he really wants to say is: "We've got loads of experience in parallel processing. Our future solutions will be able to handle both traditional CPU and GPU tasks with ease."
     