
News Rumour - AMD Radeon HD 7000 supports PCI-E 3

Discussion in 'Article Discussion' started by arcticstoat, 29 Jul 2011.

  1. Goty

    Goty New Member

    Joined:
    13 Dec 2005
    Posts:
    411
    Likes Received:
    4
     
  2. Sloth

    Sloth #yolo #swag

    Joined:
    29 Nov 2006
    Posts:
    5,634
    Likes Received:
    208
Someone correct me if I'm wrong, but in situations where a chipset can only support a set number of PCI-E lanes, wouldn't this allow motherboard manufacturers to set up boards for SLI/CF using two x8 PCI-E 3 slots and still get the same performance as two x16 PCI-E 2 slots? If so, it seems like a pretty good upgrade, even if current cards can't saturate 16 lanes.
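    For anyone who wants to check the lane maths, here's a rough sketch using the headline per-lane rates from the PCI-SIG specs (5 GT/s with 8b/10b encoding for PCIe 2.0, 8 GT/s with 128b/130b for PCIe 3.0); the helper function is just illustrative:

    ```python
    # Per-lane usable bandwidth, per direction: raw transfer rate scaled
    # by the encoding overhead (8b/10b for Gen2, 128b/130b for Gen3).
    def lane_bandwidth_gbit(transfer_gt_s, payload_bits, encoded_bits):
        return transfer_gt_s * payload_bits / encoded_bits

    pcie2 = lane_bandwidth_gbit(5.0, 8, 10)      # 4.00 Gbit/s (~500 MB/s)
    pcie3 = lane_bandwidth_gbit(8.0, 128, 130)   # 7.88 Gbit/s (~985 MB/s)

    print(f"PCIe 2.0 x16: {pcie2 * 16 / 8:.2f} GB/s")  # 8.00 GB/s
    print(f"PCIe 3.0 x8:  {pcie3 * 8 / 8:.2f} GB/s")   # 7.88 GB/s
    ```

    So a Gen3 x8 slot comes within about 1.5% of a Gen2 x16 slot - near enough to identical in practice.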
     
  3. TAG

    TAG New Member

    Joined:
    7 Aug 2002
    Posts:
    313
    Likes Received:
    9
Then how does CrossFire/SLI work?
I'm not talking about spreading transistors individually, but spreading blocks of them, on the scale of maybe cutting a chip into four and spreading it over an area twice its size.
     
    Last edited: 29 Jul 2011
  4. thehippoz

    thehippoz New Member

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
Yeah, not shabby... double the bandwidth and still within spec for current power supplies.
     
  5. Action_Parsnip

    Action_Parsnip New Member

    Joined:
    3 Apr 2009
    Posts:
    720
    Likes Received:
    40
I was quoting this: "They're skipping 32nm and going straight to 28nm.
Next gen GPUs will hopefully use a lot less power." I never wrote that; it's someone else's quote.

A new process has never really meant a trend of falling power consumption for GPUs.
     
  6. Wwhat

    Wwhat Member

    Joined:
    2 Oct 2005
    Posts:
    263
    Likes Received:
    1
If you try to squeeze more power through a PCIE connector you'd need to use more pins, breaking compatibility, unless you add a secondary edge connector at the rear of the PCIE slot. The motherboard would also get too hot unless you used multiple traces, and since running those clear across the motherboard would be undoable, you'd need power connectors near the slots. At that point it seems much simpler to stick with the current system of putting the connectors on the graphics cards.
The placement and design of those is an open discussion though, I'd say.
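    The rough numbers behind that: at 12V, more watts through the slot means proportionally more current, and it's current that cooks traces. A minimal sketch, assuming for simplicity that the 12V rail carries the whole load:

    ```python
    # Current needed to deliver a given power over the 12 V rail. In
    # reality the slot splits power across 12 V and 3.3 V rails, but the
    # 12 V rail carries the bulk of it.
    def current_amps(watts, volts=12.0):
        return watts / volts

    for watts in (75, 150, 300):
        print(f"{watts:>3} W -> {current_amps(watts):.1f} A through the slot's pins and traces")

    # Resistive heating goes as I^2 * R, so pushing 4x the power through
    # the same copper (75 W -> 300 W) means roughly 16x the heat in it.
    ```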
     
  7. fluxtatic

    fluxtatic New Member

    Joined:
    25 Aug 2010
    Posts:
    507
    Likes Received:
    5
300W is the total power draw. The max allowed by the spec for the PCI-E connector itself is 75W, which follows what Tattysnuc already said - so the limit is likely a lot lower than he realised. And I'd guess that is the limitation - you can't suck too much power through traces on a board before things start catching fire ;)
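    That 300W ceiling is just the sum of the power sources the spec allows a single card, as this little breakdown shows (the per-connector caps are the spec's own figures):

    ```python
    # Power budget for a single PCI-E add-in card under the spec: the
    # slot itself is capped at 75 W (those motherboard traces), and the
    # rest has to come from auxiliary PEG connectors.
    POWER_SOURCES_W = {
        "x16 slot":  75,
        "6-pin PEG": 75,
        "8-pin PEG": 150,
    }

    total = sum(POWER_SOURCES_W.values())
    print(f"slot + 6-pin + 8-pin = {total} W")  # 300 W, the spec's ceiling
    ```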

    Does it make sense? I think so. After replacing my water heater with a much more efficient model, I felt justified in getting a video card that takes a single 6-pin connector. That is, power draw is one of the things that limits me in what hardware I get, to some degree. It doesn't make sense to me to run a machine drawing nearly a kilowatt from the wall (extreme example, but they do exist) just because I want my shadows to be extra shadowy and my explosions extra explodey when I'm gaming. I'm not passing judgment on that, though - if that's what you dig, go for it. For me, though, give me a card with the power draw of the 5770 with the capabilities of the GTX570 and that would be all I need for a good long while. Not that it's likely to happen anytime soon.
     
  8. dyzophoria

    dyzophoria Member

    Joined:
    3 May 2004
    Posts:
    391
    Likes Received:
    1
Honestly, 500 watt GPUs? And nobody is worried about their monthly electricity bill? lol. I hope they can find ways to deliver more performance with less power consumption. Imagine having to practically rewire your house just to get the next generation of computers running - that's just impractical.
     
  9. TAG

    TAG New Member

    Joined:
    7 Aug 2002
    Posts:
    313
    Likes Received:
    9
    Wouldn't a watercooled, overclocked GTX590 hit 500W already?
     
  10. fingerbob69

    fingerbob69 Member

    Joined:
    5 Jul 2009
    Posts:
    801
    Likes Received:
    16
    "... give me a card with the power draw of the 5770 with the capabilities of the GTX570 and that would be all I need for a good long while. Not that it's likely to happen anytime soon."

    nVidia cards of late seem to have much higher power draws, so you could be waiting an awful long time there.

You might not have so long to wait to get 6950 performance for a 5770 power draw, though: currently they draw the same at idle, and the 6950 is within 18.5% at load.
     
  11. TAG

    TAG New Member

    Joined:
    7 Aug 2002
    Posts:
    313
    Likes Received:
    9
Not sure where you're getting your numbers from.

guru3d finds the following:
6950: 158W
5770: 93W + 20W (idle) = 113W

From these numbers, the 6950 uses 40% more power than the 5770, or the 5770 uses 28% less power than the 6950.
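    The percentage arithmetic, for anyone following along (figures as quoted above):

    ```python
    # Load power figures quoted above (guru3d).
    w_6950 = 158        # W
    w_5770 = 93 + 20    # W: load figure plus the 20 W idle delta quoted above

    more = (w_6950 / w_5770 - 1) * 100   # ~39.8% -> "40% more"
    less = (1 - w_5770 / w_6950) * 100   # ~28.5% -> "28% less"
    print(f"6950 draws {more:.0f}% more than the 5770; "
          f"the 5770 draws {less:.0f}% less than the 6950")
    ```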
     
  12. Evildead666

    Evildead666 New Member

    Joined:
    27 May 2004
    Posts:
    340
    Likes Received:
    4
Yup, and boards could also do 4x PCIe 3 x8, which would be the equivalent of 4x PCIe 2 x16.

I think it basically means we should start seeing motherboards with all PCIe 3 x16 slots, but some running at x8 speed and the GPU ones at x16 speed. Nothing but full-length slots, though.
    PCI is dead and buried.
     
  13. west

    west New Member

    Joined:
    3 Jun 2011
    Posts:
    51
    Likes Received:
    0
    "PCI is dead and buried."

    I don't think so. Look at any consumer motherboard.
     
  14. Elledan

    Elledan New Member

    Joined:
    4 Feb 2009
    Posts:
    948
    Likes Received:
    34
I'm waiting for Intel's 3D transistor technology to make it into GPUs. Now that should knock down the power usage something serious :)

    *keeps waiting for Ivy Bridge to be released*
     
  15. Farting Bob

    Farting Bob New Member

    Joined:
    21 Jan 2009
    Posts:
    469
    Likes Received:
    13
A reference 590 has a TDP of 365W, which is out of spec for PCIe, but in reality, if you have the PSU to handle it, it's not going to break anything. Even with a crazy WC overclock, though, you'd be unlikely to hit 500W. That's liquid nitrogen territory, I suspect.
     
  16. TAG

    TAG New Member

    Joined:
    7 Aug 2002
    Posts:
    313
    Likes Received:
    9
I doubt that's LN2 territory. Two standard-clocked GTX 580s would use in excess of 500W, and overclocking a GTX 590 to GTX 580 clocks wouldn't require LN2, now would it?

Its limits are in its power circuitry, not in the cooling.
This is already almost hitting 500W aircooled.
Considering that a 570 uses 213W, I'd say this overclocked 590 uses 457W.
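    As a first-order sanity check: dynamic power in CMOS scales roughly with frequency times voltage squared, P ∝ f·V². The clocks below are the published ones (607MHz for a stock 590, 772MHz for a stock 580); the voltages are illustrative guesses, not measurements:

    ```python
    # First-order CMOS dynamic power scaling: P ~ f * V^2.
    def scaled_power(p_stock_w, f_stock, f_oc, v_stock, v_oc):
        return p_stock_w * (f_oc / f_stock) * (v_oc / v_stock) ** 2

    P_590 = 365  # W, reference GTX 590 TDP mentioned above

    clock_only = scaled_power(P_590, 607, 772, 1.0, 1.0)    # ~464 W
    with_bump  = scaled_power(P_590, 607, 772, 0.94, 1.00)  # ~525 W (hypothetical voltages)

    print(f"580 clocks, no voltage bump: ~{clock_only:.0f} W")
    print(f"580 clocks with a small bump: ~{with_bump:.0f} W")
    ```

    Even the clock-only estimate lands right in the neighbourhood of that 457W figure.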
     
    Last edited: 2 Aug 2011
  17. play_boy_2000

    play_boy_2000 It was funny when I was 12

    Joined:
    25 Mar 2004
    Posts:
    1,417
    Likes Received:
    48
Sadly true. I just had a quick peek at my local computer store as well as Newegg; in some categories the number of devices available in PCI still outweighs PCIe by as much as 8:1, with only non-RAID SATA add-on cards exceeding a 1:1 ratio (USB is close; SATA/SAS RAID cards dominate the other way).
     
  18. slothy89

    slothy89 MicroModder

    Joined:
    17 Feb 2011
    Posts:
    145
    Likes Received:
    5
Of course two GTX 580s would exceed 500W... combined.

These numbers are on a per-card basis. Plus, the 590 is not two full 580s on the one PCB; many components are shared, hence the lower power ceiling. *rolls eyes*

The suggested possibility of higher-bandwidth storage cards (RAID), or graphics cards getting equal bandwidth from half the lane count, seems a more reasonable outcome. Even the GTX 580 sees only a very minor improvement at x16 over x8.

In 5 years we might have a real need for more than x16 PCIe 2, but for the near future it's enough.
     
  19. Haphestus

    Haphestus ....the folding under dog

    Joined:
    29 Jan 2010
    Posts:
    242
    Likes Received:
    6
    50 years.........try 5-10 :D
     
  20. TAG

    TAG New Member

    Joined:
    7 Aug 2002
    Posts:
    313
    Likes Received:
    9
    457W ...
    Close enough
     
    Last edited: 2 Aug 2011