
News G80 launch specs released

Discussion in 'Article Discussion' started by Da Dego, 5 Oct 2006.

  1. TRG

    TRG New Member

    Joined:
    24 Jun 2006
    Posts:
    23
    Likes Received:
    0
    I already have a two-power-supply system, but my case doesn't have room for another... perhaps I should buy a new one...
     
  2. r4tch3t

    r4tch3t hmmmm....

    Joined:
    17 Aug 2005
    Posts:
    3,166
    Likes Received:
    48
    What do you mean no room? Is your title true? What's a Dremel? You can always fit another in somewhere.
    I agree though, the power requirements are a bit high. If it is drawing that much power, that cooling solution must be extremely efficient.
     
  3. Marquee

    Marquee Mac Pro Modder

    Joined:
    22 Jul 2005
    Posts:
    558
    Likes Received:
    1
    This new memory bus is weird to me. It's like they took a 256-bit bus and added another 128-bit bus. That's not normal; they normally double the number rather than add a little to it. I am really not happy to hear the numbers coming out of Nvidia. The max bandwidth is now 86GB/s, and the 7950GX2 was pumping out close to 80GB/s anyway.

    Nvidia needs to release something like the GPU in the PS3: something with a multi-core GPU and multiple memory banks. At least give us 1GB of RAM. Most workstations use GPUs with 1GB of RAM, so why can't gamers get the same?
     
  4. Aankhen

    Aankhen New Member

    Joined:
    15 Oct 2005
    Posts:
    406
    Likes Received:
    0
    I desperately want DX10 (I love me some Crysis!), but if both ATI and NVIDIA are going to pull a NetBurst with regards to power consumption and heat dissipation... I honestly don't know. Maybe I'd be better off living an ascetic life in a cave somewhere.
     
  5. GrahamC

    GrahamC New Member

    Joined:
    21 Jun 2006
    Posts:
    53
    Likes Received:
    0
    That power consumption is a big concern for me as well, but I'll have to wait and see. The point made about TCO is very relevant, I think, and maybe we need to shift our viewpoint away from individual items to the whole system over a set time scale. With the way graphics cards are going, why does a multi-core GPU have to mean two cores of the same design? One core could be a low-power, highly tuned OS/UI core used most of the time, and the other a high-power 3D/DX10 core used for your FPS games etc. Add to this increased power control and we have power saving on a large scale.
     
  6. Cthippo

    Cthippo Can't mod my way out of a paper bag

    Joined:
    7 Aug 2005
    Posts:
    6,783
    Likes Received:
    101
    There was a mention of the whole power consumption issue in the new Maximum PC magazine this month, and they mentioned that since a normal 110V outlet can only supply something like 15A, we may soon have to plug our PCs into the 220V outlet next to the oven and the dryer.
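    A quick back-of-the-envelope check of that outlet limit (assuming a standard North American 110V/15A circuit, as the magazine describes):

    ```python
    # Rough power ceiling for a standard North American wall circuit.
    volts = 110
    amps = 15
    max_watts = volts * amps  # power (W) = voltage (V) * current (A)
    print(max_watts)  # 1650 W before the breaker trips
    ```

    So a whole circuit tops out around 1.65kW; a 200W-class card plus the rest of a loaded system still fits, but not with much headroom if anything else shares the circuit.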
     
  7. Duste

    Duste Sierra my delta, bravo!

    Joined:
    1 Oct 2006
    Posts:
    818
    Likes Received:
    0
    Lucky for me the custom case I'm thinking of building will have room for dual 400W rackmount PSUs! :D
     
  8. Marquee

    Marquee Mac Pro Modder

    Joined:
    22 Jul 2005
    Posts:
    558
    Likes Received:
    1
    If they had more than one core they could scale the GPU cores: when there is no work, they run at a lower speed, and when there is work all the GPUs are turned on and run at full speed. ATI has the right idea with the new version of CrossFire. The dual GPUs can either work on alternating frames, so GPU 1 does frame one while GPU 2 does frame two, or they divide the screen into many little boxes and the GPUs divide the workload.
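    Those two split modes can be sketched in a few lines (a toy illustration with made-up function names, not ATI's actual driver API): alternate-frame rendering hands whole frames to GPUs round-robin, while split-frame rendering tiles a single frame across them.

    ```python
    # Toy illustration of two multi-GPU work-sharing schemes (not real driver code).

    def alternate_frame(frames, num_gpus):
        """Round-robin whole frames across GPUs (alternate-frame rendering)."""
        return {g: [f for i, f in enumerate(frames) if i % num_gpus == g]
                for g in range(num_gpus)}

    def split_frame(tiles, num_gpus):
        """Divide one frame's tiles evenly across GPUs (split-frame/tiled rendering)."""
        return {g: tiles[g::num_gpus] for g in range(num_gpus)}

    print(alternate_frame([0, 1, 2, 3], 2))  # {0: [0, 2], 1: [1, 3]}
    ```

    Either way the load is shared; the trade-off is that alternate-frame mode scales better but adds a frame of latency, while tiling keeps latency down at the cost of trickier load balancing.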
     
  9. Warrior_Rocker

    Warrior_Rocker Holder of the sacred iron

    Joined:
    26 Jun 2005
    Posts:
    938
    Likes Received:
    1
    Soon we will need two towers: one simply to provide 1000 amps of DC power to the one that is sucking up 900 amps. And then Nvidia can start its own electric company under the flag of "if you need power for the graphics card we sold you, look no further" :eyebrow:
     
  10. aggies11

    aggies11 New Member

    Joined:
    4 Jul 2006
    Posts:
    180
    Likes Received:
    1
    Are the wattage terms used in computers the same as in other electronics, e.g. the traditional 100W light bulb?

    Because if it is, then a 200W card isn't really that bad. I mean, turn off the lights in your room before you start to game, and you've essentially made up the power savings.

    Am I out to lunch on this one? (never did pay too much attention in electronics class ;) )

    What's the typical power consumption of a 19" CRT or LCD, or 5.1 speakers?

    Aggies
     
  11. mclean007

    mclean007 Officious Bystander

    Joined:
    22 May 2003
    Posts:
    2,035
    Likes Received:
    15
    erm...yes. A Watt is a Watt is a Watt. And a PC drawing, say, 500W is really quite a hefty power drain. If you left it running loaded 24/7 (folding on CPU and GPU for example), you'd be looking at an extra 12 kWh per day, or over £400 a year in electricity at 10p per unit.
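    That arithmetic checks out; as a quick sanity check (using the same assumed figures: a 500W draw, 24/7, at 10p per kWh):

    ```python
    # Yearly electricity cost of a 500 W PC left loaded 24/7 at 10p per kWh.
    watts = 500
    pence_per_kwh = 10
    kwh_per_day = watts * 24 / 1000                           # 12.0 kWh per day
    pounds_per_year = kwh_per_day * 365 * pence_per_kwh / 100  # convert pence to £
    print(kwh_per_day, round(pounds_per_year))  # 12.0 438
    ```

    So "over £400 a year" is right on the money at those rates.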
     
  12. DarkReaper

    DarkReaper Alignment: Sarcastic Good

    Joined:
    9 Jan 2004
    Posts:
    1,751
    Likes Received:
    0
    But as aggies11 said, turning off a few light bulbs whilst you game will even things out - lots of people just never turn lights off which is an equally hefty drain.
     
  13. aggies11

    aggies11 New Member

    Joined:
    4 Jul 2006
    Posts:
    180
    Likes Received:
    1
    Re: mclean007

    Ah, I guess that does add up. Here in Canada electricity prices are a bit tamer ($0.05 (CAD) per kWh), so worst case it's more in the line of $200 (CAD), and realistically, if you game 6hrs per day, it drops to ~$50 (CAD), which is basically the price of a PC game. That makes it a rather small cost when the total cost of PC gaming is considered.

    But that's only over here; 500W is definitely a lot scarier if you are paying your prices :(

    Aggies
     
  14. Aankhen

    Aankhen New Member

    Joined:
    15 Oct 2005
    Posts:
    406
    Likes Received:
    0
    I know a lot of people have an obsession with gaming (and general computing) in the dark, but I can't handle it. The glare from a monitor that's the single source of light in a room is too much for my eyes, and I'm not into masochism. :)
     
  15. -EVRE-

    -EVRE- New Member

    Joined:
    12 May 2004
    Posts:
    372
    Likes Received:
    1
    I don't get it... Most of you are concerned about the power this thing is going to consume. If you really want to save some power, try turning off lights in rooms you're not in.

    Either no one knows or no one cares about the true meaning of having 128 unified shaders clocked at 1350MHz.

    I posed the question after a little research on what is known about unified shaders in the Xbox 360: a piece of hardware that can't be benchmarked like a PC with our favorite games due to its architecture. (And it is based on ATI's design.)

    ATI's Xbox GPU

    So... maybe this will make you curious about how it will perform, vs. complaining that it's going to jack your utility bill up a few $.
     
  16. M4RTIN

    M4RTIN New Member

    Joined:
    11 Sep 2006
    Posts:
    1,259
    Likes Received:
    3
    Well, considering the 360 is very impressive with "only" 48 shaders, the G80 should be pretty damn impressive...

    Oblivion on one card may finally look good.
     
  17. trig

    trig god's little mistake

    Joined:
    10 Aug 2006
    Posts:
    2,835
    Likes Received:
    42
    I'll wait and see what ATI offers, since it will most likely have GDDR4... if it's as power hungry and sounds like my wife's hairdryer, I'll pass: sell my X1800XT on eBay, upgrade to the X1950XTX, and call it a year and a half.
     
  18. speedfreek

    speedfreek New Member

    Joined:
    9 Nov 2005
    Posts:
    1,453
    Likes Received:
    1
    It's the heat that concerns me. Think of the size of heatsink you would need, and how fast a fan you would need. I hate the sound of leafblowers in my room, plus you'll have copper thieves stealing the 2lb heatsink you have in there.
     
  19. Highland3r

    Highland3r New Member

    Joined:
    25 Jul 2003
    Posts:
    7,553
    Likes Received:
    16
    The card's actually not that noisy. Temperature-wise it's not that bad either. I was quite surprised when I found out how cool it actually runs...
     