News: PCIe 2.0 announced

Discussion in 'Article Discussion' started by Da Dego, 16 Jan 2007.

  1. Da Dego

    Da Dego Brett Thomas

  2. will.

    will. A motorbike of jealousy!

    Oh dear god! Just one year without technological advances... please... I'll be so happy.
     
  3. Sloth

    Sloth #yolo #swag

    But I just got PCIe and DDR2! Curse you technological advances!
     
  4. M4RTIN

    M4RTIN What's a Dremel?

    What's the point? Okay, the extra power is "needed", but that could be provided by leads straight from the PSU. The extra bandwidth is a complete waste.
     
  5. DXR_13KE

    DXR_13KE BananaModder

    [rant]DAMN GRAPHIC CARD COMPANIES THAT MAKE STUPID POWER HUNGRY GRAPHIC CARDS!!!!! COPY INTEL GODDAMNIT!!!!![/rant]

    Hmm... DDR3 sounds nice. Same plan for me: wait for reviews and stable hardware.
     
  6. Bladestorm

    Bladestorm What's a Dremel?

    Better that they add something they don't yet need (but eventually might) now, while they're doing a revision anyway, than need a version 3.0 another year or two down the road.
     
  7. MrWillyWonka

    MrWillyWonka Chocolate computers galore!

    Wow, my computer is so out of date, yet it plays modern games well. I'll settle for my rig until the end of the year at least! I wonder how long it will be until the current 1.1 standard's bandwidth is actually used up completely by a card?
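    As a rough illustration of that question, here's a back-of-envelope sketch in Python. The per-lane rates are the usual published figures (250 MB/s per direction for PCIe 1.x, 500 MB/s for 2.0, after 8b/10b encoding overhead); the streaming workload is purely an assumed example, not a measurement.

    [code]
    # Back-of-envelope: peak PCIe link bandwidth versus an assumed streaming workload.
    LANE_BW_1_1 = 250e6   # bytes/s per lane, per direction (2.5 GT/s, 8b/10b encoding)
    LANE_BW_2_0 = 500e6   # bytes/s per lane, per direction (5.0 GT/s, 8b/10b encoding)

    def link_bw(lanes, per_lane):
        """Peak one-direction bandwidth of a link, in bytes per second."""
        return lanes * per_lane

    # Hypothetical per-second load: 100 MB of texture uploads plus a
    # 1920x1200, 32-bit framebuffer read back at 60 fps (both assumed figures).
    workload = 100e6 + 1920 * 1200 * 4 * 60

    print(f"PCIe 1.1 x16 peak: {link_bw(16, LANE_BW_1_1) / 1e9:.1f} GB/s each way")  # ~4.0
    print(f"PCIe 2.0 x16 peak: {link_bw(16, LANE_BW_2_0) / 1e9:.1f} GB/s each way")  # ~8.0
    print(f"Assumed workload:  {workload / 1e9:.2f} GB/s")                           # ~0.65
    [/code]

    On those (assumed) numbers, a single card is nowhere near saturating a 1.1 x16 link; boards that split the lanes between two cards are a different story, as later posts point out.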
     
  8. Tulatin

    Tulatin The Froggy Poster

    The look of that new 8-pin connector worries me slightly. It appears they're taking the current 12V EPS connector and inverting the wiring. WHY? Are they stupid, or something? There WILL be users who WILL mix the two up, and then you'll have instant card death. If they want to use ridiculous standards, can't they at least follow ones that already exist?

    Along with that, does it worry anyone else that the shot of the dual connectors on the 8800 there shows off the fact that it can take the new 8-pinner?
     
  9. SteveyG

    SteveyG Electromodder

    Surely they should work on reducing the power consumption of these cards! If you think about it (and compare it with how much other electrical appliances around the house use), 185W is pretty insane really, just for the GPU. :eeek:
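    To put that 185W figure in some perspective, a quick running-cost estimate; only the wattage comes from the post, while the hours per day and the tariff are assumptions picked for illustration.

    [code]
    # Rough yearly running cost of a 185 W GPU under load.
    # The 185 W figure is from the post; hours/day and price per kWh are assumed.
    GPU_POWER_W = 185
    HOURS_PER_DAY = 4        # assumed gaming time
    PRICE_PER_KWH = 0.10     # assumed tariff, GBP

    kwh_per_year = GPU_POWER_W / 1000 * HOURS_PER_DAY * 365
    print(f"Energy per year: {kwh_per_year:.0f} kWh")               # ~270 kWh
    print(f"Cost per year:   £{kwh_per_year * PRICE_PER_KWH:.2f}")  # ~£27 at the assumed tariff
    [/code]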
     
  10. kosch

    kosch Trango in the Mango

    I'm still using VLB, what's all this PCI-E nonsense? :D
     
  11. TheoGeo

    TheoGeo What are these goddamn animals?!

    Ahhhh, Wonka, you're not penski!

    Seems pointless, but hopefully it means it will be longer before the next step up.
     
  12. olly_lewis

    olly_lewis What's a Dremel?

    Though there are advances in every part of a computer's hardware, steps like this show that the industry is constantly moving forward with new technologies, developments and insight. Of course, it'll be a while before PCI-E 2.0 is released to Joe Public, and as we all know, you wait for the reviews to come in, you wait for the technology to reach the mainstream, and then you hand over your hard-earned cash...
     
  13. flabber

    flabber What's a Dremel?

    They make it seem like v2, but isn't this just "taking the easiest solution"? Instead of making bigger connectors, shouldn't they be working their behinds off to make the video cards give more performance per watt? I mean, two 8800 GTXs plus the rest of your high-end system will soak up half the output of a power plant, lol!

    It's about time they started to realise that, imho ;)
     
  14. dragontail

    dragontail 5bet Bluffer

    FFS, stop making new standards when we don't *need* new standards!! At least it's backwards compatible; if it wasn't, I would be a lot more pissed off ¬_¬
     
  15. Ramble

    Ramble Ginger Nut

    A standards group (the PCI-SIG) actually defines the PCIe standard, not Nvidia or AMD.
     
  16. LoneArchon

    LoneArchon What's a Dremel?

    There are rumors that the R600s will use an 8-pin and a 6-pin connector, but that may be for backwards compatibility.

    I also notice that with the 8800, maybe the next revision will use a single 8-pin. I also agree with the need to change the connector so it's different from the 12V EPS.
     
  17. specofdust

    specofdust Banned

    Sorry folks, but this kinda is necessary.

    I've seen benchies of high-end SLI systems, and in dual 8x slots the cards perform less well than in dual 16x slots. The difference is around 5-20 percent depending on the game. Do you really want to be losing 20% after spending £500 on graphics cards?

    Yes, this can be remedied by having a southbridge with extra lanes, but that's a hackish solution, and it basically requires the board to have extra chips (a southbridge, in AMD boards' case) just to provide sufficient bandwidth. I hate tech advances that make my kit obsolete as much as any of you; I upgrade rarely and dislike fat tech just for fat tech's sake. However, I don't think this is a case of that. I think this really is necessary given the cards we're seeing today.
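    The lane arithmetic behind that point, as a quick sketch (per-lane rates are the standard published figures, not benchmark results): a 2.0 x8 link has the same peak bandwidth as a 1.1 x16 link, which is why dual-x8 SLI boards are the ones that stand to gain.

    [code]
    # Peak one-direction bandwidth for the slot configurations discussed above.
    PER_LANE = {"PCIe 1.1": 250e6, "PCIe 2.0": 500e6}   # bytes/s per lane, per direction

    for gen, per_lane in PER_LANE.items():
        for lanes in (4, 8, 16):
            print(f"{gen} x{lanes}: {lanes * per_lane / 1e9:.1f} GB/s")

    # PCIe 2.0 x8 (4.0 GB/s) matches PCIe 1.1 x16, so a dual-x8 board on the new
    # signalling rate gives each card what a full 1.1 x16 slot provides today.
    [/code]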
     
  18. hitman012

    hitman012 Minimodder

    What's wrong with them upgrading things when it's even backwards compatible? It's called future-proofing, and you'd probably be complaining in a year if they hadn't, because your brand new card wouldn't work as well in a 16x slot. Best to get the architecture out there and in use before it's really needed.

    Yes, they'll let you plug them into one another and blow everything up :rolleyes:. Notice the offset last row of pins to stop that happening.
     
  19. randosome

    randosome Banned

    What's the word on this, btw?

    I mean, at the moment, if you have SLI cards then you get x8 on each card, which does impact performance on each card by something like 5% (at least).
    On top of that, higher speed = the possibility of using fewer lanes = fewer wires = cheaper boards.
    So is the extra bandwidth wasted? I don't really think so.

    On top of that, if you introduce the specification now, it will be in place when people actually need it, instead of being scrambled together once they already do.
     
  20. M4RTIN

    M4RTIN What's a Dremel?

    I looked at some P965 SLI benchmarks compared to the 975X (in other words, 16x and 4x compared to two 8x lanes), and unless you were running something higher than an X1950 XTX there was barely anything in it. Maybe 5%, and in all honesty I doubt many people could tell the performance hit. That's what I was basing my thoughts on.

    However, I've no idea what any 8800 GTX SLI testing has been done on, so god knows... anyone got numbers on that in different lanes?
     