
What does TDP mean, Nvidia?

Discussion in 'Article Discussion' started by Claave, 11 Nov 2010.

  1. Claave

    Claave You Rebel scum

    Joined:
    29 Nov 2008
    Posts:
    691
    Likes Received:
    12
  2. Snips

    Snips I can do dat, giz a job

    Joined:
    14 Sep 2010
    Posts:
    1,940
    Likes Received:
    66
     You can't open up a can of worms purely based on a marketing department. ALL the manufacturers do it: Nvidia, Intel, AMD, Ann Summers, all claiming their products do this or do that. Real-world bench tests can prove otherwise, so let's just keep it at that. I constantly moan about the AMD marketing, something I'm serving time for, but I fell into that trap of believing the pap and then seeing the crap. Be warned, you're gonna get it bigtime, Clive.
     
  3. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,987
    Likes Received:
    706
     Perhaps it's the power draw for the actual GPU, rather than the 293W measured for the whole card?

     But I agree with the article: why did Nvidia choose to quote power consumption rather than heat output? It doesn't exactly make them look good; it only adds to the confusion.
     
  4. Picarro

    Picarro What's a Dremel?

    Joined:
    9 Jun 2009
    Posts:
    3,331
    Likes Received:
    134
    Unless they went for the "MOAR NUMBERS IS MOAR!!1!!!!1" tactic .. Wouldn't really surprise me
     
  5. greigaitken

    greigaitken Minimodder

    Joined:
    26 Aug 2009
    Posts:
    431
    Likes Received:
    14
     Besides the 1/2W for sound, surely ALL the power consumed is turned into heat? Surely it's better to know how much power the device will consume, so that you know how much you'll need to supply it, which should also be the same as how much heat it produces. Where else would the energy go?
     
  6. barrkel

    barrkel What's a Dremel?

    Joined:
    31 Jan 2007
    Posts:
    82
    Likes Received:
    1
    Thanks to the magic of conservation of energy, the power used by solid-state hardware (no moving parts) is all but equal to the amount of heat it needs to dissipate.
     
  7. mclean007

    mclean007 Officious Bystander

    Joined:
    22 May 2003
    Posts:
    2,035
    Likes Received:
    15
     This is kind of an odd blog post - essentially all of the power input of a computer component is output as waste heat. This isn't some kind of physical machine where some of the power is actually used to move something from A to B or to light a room or whatever. A CPU or GPU which uses 100W outputs 100W of heat; the logic (work) done by the chip comes from the entropy gap between (low entropy) electricity and (high entropy) low-level waste heat. So drawing a distinction between measuring TDP by power input and by heat output is pure semantics - the figure is the same.

     Okay, so Nvidia may be selling themselves short by using the TDP for the entire card rather than just the GPU, but ultimately this is a less misleading measure. What users really care about is (a) how much electricity it uses (i.e. how much it's going to cost in electricity bills); and (b) how much heat it will dump into the case (or room, if it exhausts to the rear as many GPUs now do). The two things are the same.
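
     To put rough numbers on (a) and (b), here's a minimal back-of-the-envelope sketch; the 300W draw, four hours a day of gaming and the £0.12/kWh price are all assumed purely for illustration:

     ```python
     # What a card's power draw means in practice. All figures are
     # illustrative assumptions: a 300W card, 4 hours of gaming per day,
     # electricity at 0.12 GBP/kWh.

     draw_watts = 300.0        # power the card pulls under load
     hours_per_day = 4.0
     price_per_kwh = 0.12      # GBP

     # (a) running cost: energy = power x time
     kwh_per_day = draw_watts / 1000.0 * hours_per_day
     cost_per_year = kwh_per_day * price_per_kwh * 365

     # (b) heat dumped into the case/room: for solid-state hardware this is
     # essentially equal to the power drawn (conservation of energy)
     heat_watts = draw_watts

     print(f"Energy used per day: {kwh_per_day:.1f} kWh")
     print(f"Cost per year:       {cost_per_year:.2f} GBP")
     print(f"Heat output:         {heat_watts:.0f} W (same as the draw)")
     ```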
     
  8. sleepygamer

    sleepygamer More Metal Than Thou

    Joined:
    24 Apr 2010
    Posts:
    1,064
    Likes Received:
    72
    Ann Summers Scientist does a press conference:

    "Our new product, the 'Mysterious Passion' nightdress, complete with stockings and boots has been shown in tests to increase the TDP of your significant other by upto 20%. Please ensure that you have adequate hardware to tame this increase in heat."

    Hee...
     
    theflatworm likes this.
  9. Bakes

    Bakes What's a Dremel?

    Joined:
    4 Jun 2010
    Posts:
    886
    Likes Received:
    17
    Clive,

     nVidia is actually using that definition in order to fool the consumer. Whilst Intel and AMD/ATI define TDP as the power drawn under every condition (so an Intel TDP is the absolute maximum the chip would use), nVidia defines it 'over standard operating conditions', meaning games. So an ATI TDP and an nVidia TDP can mean wildly different things.

    For example:

     [Image: BeHardware's measured power-draw chart for the GeForce GTX 480/470]

    Original source: http://www.behardware.com/articles/787-5/report-nvidia-geforce-gtx-480-470.html

     That site measured the actual power input using a special PCIe riser card. As you can see, the ATI 5870 is very close to its published TDP of 188W, but the GTX 480 is far higher: the published TDP is around 260W, but under FurMark (i.e. full load, rather than the load you'll see in games) it increases by around 40W.

     This makes the nVidia TDP values even less comparable: whilst ATI's TDP values roughly correlate to the power draw, nVidia's only correlate to the power draw whilst playing games; running a high-stress test such as FurMark or Folding will increase the power draw massively.
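
     A quick sketch of that comparison, using the approximate figures quoted above (both measured values are illustrative, not BeHardware's exact readings):

     ```python
     # Published TDP vs. measured draw, using the approximate figures quoted
     # above. The measured numbers are illustrative, not exact readings.

     cards = {
         # name: (published TDP in W, approx. FurMark draw in W)
         "ATI 5870": (188, 190),   # roughly matches its published TDP
         "GTX 480":  (260, 300),   # FurMark lands ~40W over the published figure
     }

     for name, (tdp, measured) in cards.items():
         over = (measured - tdp) / tdp * 100
         print(f"{name}: published {tdp}W, measured {measured}W ({over:+.0f}%)")
     ```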
     
    Last edited: 11 Nov 2010
  10. HourBeforeDawn

    HourBeforeDawn a.k.a KazeModz

    Joined:
    26 Oct 2006
    Posts:
    2,637
    Likes Received:
    6
     @Bakes: can't get the image to load. Also, do they have a pic of this special riser card?
     
  11. MaverickWill

    MaverickWill Dirty CPC Mackem

    Joined:
    26 Apr 2009
    Posts:
    2,658
    Likes Received:
    186
     Not helped by the fact that nVidia are apparently using circuitry on the card itself to prevent full power draw on the 580. Anandtech managed to circumvent it here, which shows, if we subtract the single-card result from the SLI result, a difference in power usage of 325W. Assuming an efficiency of 87% (a good, efficient PSU), I make that 283W.
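
     Spelled out, the arithmetic is just: DC power delivered to the card = wall-socket delta × PSU efficiency. A minimal sketch, with the 87% efficiency assumed as above:

     ```python
     # Wall-socket readings include PSU losses, so the DC power actually
     # delivered to the card is the wall delta times the PSU's efficiency.

     wall_delta_watts = 325.0   # SLI result minus single-card result, at the wall
     psu_efficiency = 0.87      # assumed: a good, efficient PSU

     card_draw = wall_delta_watts * psu_efficiency
     print(f"Estimated card draw: {card_draw:.0f} W")  # ~283 W
     ```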
     
  12. jrs77

    jrs77 Modder

    Joined:
    17 Feb 2006
    Posts:
    3,483
    Likes Received:
    103
     TDP is totally uninteresting, tbh.

     Maximum power draw at the AC plug is a far better measurement, as it's what you pay for on your electricity bill.

     So it would be great if manufacturers just introduced a new term for it: MPD.

     As it stands, TDP is totally flawed, and a short example illustrates it well.

     Fuel consumption for cars is measured in litres/100km (standard triple-mix), for example, and that refers to the fuel you actually put in the tank.
     Now, if a manufacturer instead quoted the kWh of useful work needed to drive 100km, that would totally ruin the picture, as car engines vary between 30-50% efficiency.
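
     A minimal sketch of that analogy, assuming petrol at roughly 9.5 kWh per litre and an illustrative 8 L/100km car:

     ```python
     # The same "useful energy per 100km" maps to very different amounts of
     # fuel in the tank depending on engine efficiency. Assumed figures:
     # petrol at roughly 9.5 kWh per litre, a car using 8 L/100km.

     kwh_per_litre = 9.5
     fuel_per_100km = 8.0                              # litres in the tank

     energy_in_tank = fuel_per_100km * kwh_per_litre   # ~76 kWh per 100 km

     for efficiency in (0.30, 0.50):
         useful_kwh = energy_in_tank * efficiency
         print(f"{efficiency:.0%} efficient engine: {useful_kwh:.0f} kWh of "
               f"useful work from the same {fuel_per_100km:.0f} L of fuel")
     ```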
     
    Last edited: 11 Nov 2010
  13. Jampotp

    Jampotp What's a Dremel?

    Joined:
    11 May 2010
    Posts:
    51
    Likes Received:
    1
    +1 :thumb:
     
  14. Phil Rhodes

    Phil Rhodes Hypernobber

    Joined:
    27 Jul 2006
    Posts:
    1,415
    Likes Received:
    10
    But those numbers should be so nearly identical as to make no difference.

    Other than the power required to run the fan (a couple of watts) and the energy required to drive the PCIe bus lines and DVI outputs (not insignificant, but not much), all of the energy supplied to the card will eventually be dumped as heat.

    If everyone measured TDP as power input, we'd be much better off.
     
  15. alwayssts

    alwayssts What's a Dremel?

    Joined:
    20 Feb 2009
    Posts:
    31
    Likes Received:
    2
     I think this discussion is interesting, and I've mentioned it in the comments of articles on a couple of different sites, because it really could change the perception of a product, or what is allowable under the PCIe spec. I'm glad the people at bit-tech publicly exposed this issue, as I'm sure I'm not the only one who thinks there needs to be a standard.

     Not everyone looks at charts and graphs like me, or perhaps you. Even if they choose to inform themselves beyond a TDP rating, independent reviews have variables, such as power supplies and their efficiency. A consistent test platform used to test/meet a PCIe-spec TDP does not seem much to ask. Both nVidia and AMD conform to the PCIe spec, yet each can define how the spec is measured? That's ridiculous. FurMark/OCCT seems like a fair test for a workstation card, but something like Crysis makes sense for consumer products.

     Compare the 4870 X2 and the 5970, for example. The 4870 X2 has a TDP of 286W; the 5970 has a TDP of 294W. In reality, under a normal gaming scenario, the former uses close to its TDP while the 5970 uses less than 225W. Hell, the 5970 uses less power gaming than a GTX 580! The confusion is simply because AMD made a stupid decision, starting with the 5000 series, to change their TDP testing. FurMark/OCCT are not representative of the power/heat of the 5000/6000 series at all, and I hope AMD changes back. I understand some may think nVidia is twisting numbers, and that's a fair argument if you agree with AMD's method, but I happen to believe nVidia is calling this correctly and AMD is shooting themselves in the foot. Who is to say a 5870 X2 wouldn't come in under 300W using nVidia's method? It likely would. Why cock-block themselves?

    As I've said before, it will be interesting to see if and/or when dual GF104/Cayman products are released, and if this discrepancy still exists.
     
  16. jrs77

    jrs77 Modder

    Joined:
    17 Feb 2006
    Posts:
    3,483
    Likes Received:
    103
     Like I said there already: manufacturers simply need to agree on "maximum power draw" at 100% load and state this number as "MPD" instead of using anything else.
     
  17. Xtrafresh

    Xtrafresh It never hurts to help

    Joined:
    27 Dec 2007
    Posts:
    2,999
    Likes Received:
    100
     Clive, do I understand correctly that you are criticising the nVidia marketing department for not spinning this number to their advantage? :lol:

     @jrs77: nVidia and ATI disagree about what 100% load is, though. A card runs hotter and uses more power in FurMark than when playing games, yet both generate 100% load if you believe GPU-Z.
     
  18. Altron

    Altron Minimodder

    Joined:
    12 Dec 2002
    Posts:
    3,186
    Likes Received:
    61
     The real issue here is the wording:

    "TDP is a measure of maximum power draw over time in real world applications"

    What's "maximum power draw over time"? That's not an engineering term. You can have a maximum instantaneous power draw, and an average power draw over time, but not a "maximum power draw over time".

    And what's "real world applications" - I feel like this would exclude articifial benchmarks that run every part of the chip at 100% utilization.

     But splitting hairs over "input power" and "power wasted as heat" - that's just stupid. The only things that produce an actual energy output other than heat are the fan (kinetic energy in the air, plus sound) and the very low-power signals going out over the video cables and the PCIe data lanes. That amount of energy is marginal compared to the power spent as heat.
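
     A quick sketch of the instantaneous-vs-average distinction, using made-up per-second readings:

     ```python
     # Peak instantaneous draw vs. average draw over an interval, with
     # made-up per-second readings for illustration.

     samples_watts = [220, 310, 245, 298, 260, 330, 240]

     peak = max(samples_watts)                          # 330
     average = sum(samples_watts) / len(samples_watts)  # ~272

     print(f"Peak instantaneous draw: {peak} W")
     print(f"Average draw over time:  {average:.0f} W")
     ```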
     
  19. thehippoz

    thehippoz What's a Dremel?

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
     I didn't even know there wasn't a standard until I read this... you know Nvidia, though: they'll spin it to look better for themselves every time, even if what they're talking about has wood screws in it.
     
  20. jrs77

    jrs77 Modder

    Joined:
    17 Feb 2006
    Posts:
    3,483
    Likes Received:
    103
     100% load = 100% of all the card can possibly deliver before it starts frying its components.

     I don't give a **** about benchmarks etc...
     