Hardware BFGTech GeForce 8600 GTS OC2 ThermoIntelligence

Discussion in 'Article Discussion' started by Tim S, 14 Sep 2007.

  1. Tim S

    Tim S Well-Known Member

    Joined:
    8 Nov 2001
    Posts:
    18,879
    Likes Received:
    76
  2. christopher3393

    christopher3393 New Member

    Joined:
    14 Sep 2007
    Posts:
    5
    Likes Received:
    0
    Nice review! The one aspect I'd like to know more about is fan noise. My understanding is that the stock 8600 GTS fan idles around 40 dB and ramps up from there. BFG does provide a footnote rating the fan at roughly 39 dB, which IMO is already not quiet even by fairly flexible standards. No reviewer has tested this with a sound meter, but most say something like "not bad, fairly quiet, a little quieter than stock..." Interesting how noise is glossed over on some products.
     
  3. Tim S

    Tim S Well-Known Member

    Joined:
    8 Nov 2001
    Posts:
    18,879
    Likes Received:
    76
    Hi there, welcome to the forums - thanks for signing up to ask your questions.

    The loudest thing in the test system wasn't the graphics card - believe it or not, it was the Zalman CNPS 9500, and that's not exactly "noisy". This makes any kind of sound testing fairly difficult. Also, it's not entirely accurate to just record sound in an open environment - proper noise measurement requires an enclosure designed for monitoring noise, and our open-plan office certainly isn't one of those.

    Speaking subjectively with a lot of experience using just about every graphics cooler available in the last four years, you will not hear this in a case once the drivers have been installed. :)
     
  4. Bluephoenix

    Bluephoenix Spoon? What spoon?

    Joined:
    3 Dec 2006
    Posts:
    968
    Likes Received:
    1
    One thing this will be attractive for is HTPCs - at least as soon as Dell makes an HDCP-compliant 30" display :thumb:
     
  5. xion

    xion New Member

    Joined:
    23 Aug 2006
    Posts:
    165
    Likes Received:
    0
    The more reviews I read of mid-range DX10 hardware like this, the more pleased I am with my decision to buy the little gem that is the X1950 Pro. That puppy was meant to be a stop-gap for me until I decided which route to take, but I still haven't found a game it won't play on full details at 1280x1024, at the same time as powering my other monitor/TV. Being HDCP compliant just added the sweetener. It's going to take some BIG differences in DX10 games, and even bigger price drops, for me to consider "upgrading" any time soon...
     
    Last edited: 14 Sep 2007
  6. -EVRE-

    -EVRE- New Member

    Joined:
    12 May 2004
    Posts:
    372
    Likes Received:
    1
    71°C and 80°C!!!!!!!

    I don't like the sound of that...

    I had no idea that's how hot those cards ran!
     
  7. Tim S

    Tim S Well-Known Member

    Joined:
    8 Nov 2001
    Posts:
    18,879
    Likes Received:
    76
    Yeah, it's a shame - not BFG's fault though. :(
     
  8. Mankz

    Mankz 5318008

    Joined:
    15 Jan 2006
    Posts:
    14,508
    Likes Received:
    473
    It costs far too much for what it is. If it were £120, like the 7600 GT OCs were, I might have been tempted.
     
  9. xion

    xion New Member

    Joined:
    23 Aug 2006
    Posts:
    165
    Likes Received:
    0
    Couldn't agree more. BFG seem to be one of the few companies that do something different in order to get the most out of the kit. As for the quite impressive clocks, that's potentially free money - higher returns, maybe - but to invest in the development of an improved thermal solution that performs better than some aftermarket coolers is commendable, and quite a big CAPEX risk. IMHO the lifetime/10-year warranty tends to be a marketing ploy though: if anything is likely to go wrong, chances are it's in the first week, let alone after a year. TBF, it's a "nice to have" feature/extra.

    It's clear that partners are having to do a whole lot more to differentiate themselves from the crowd than ever before (throwing in a "free" XXXL T-shirt). With the number of reference board releases, there can't be much profit margin available after the price wars on mid-range consumer products; the only way to secure a chunk of the release-frenzy buying is to get your card out of the door and onto the shelves first. Ultimately that means 20 versions of the same product priced within a pound of each other, with only the sticker and box to separate them.

    BFG clearly seem to pull out all the stops. I hope it pays off until the next graphics evolution.

    My 2p's worth, anyhoo!
     
  10. christopher3393

    christopher3393 New Member

    Joined:
    14 Sep 2007
    Posts:
    5
    Likes Received:
    0
    Thanks Tim. Very helpful.
     
  11. HourBeforeDawn

    HourBeforeDawn a.k.a KazeModz

    Joined:
    26 Oct 2006
    Posts:
    2,637
    Likes Received:
    6
  12. Tim S

    Tim S Well-Known Member

    Joined:
    8 Nov 2001
    Posts:
    18,879
    Likes Received:
    76
    SLI is only slower at 1024x768 0xAA - that's because it's CPU-limited at that resolution. As you increase resolution/AA, the SLI scaling comes into effect. If you look at 1280x1024 0xAA, the performance gap isn't that large either (but the load has increased). :)
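    The CPU-limit effect described above can be sketched with a toy model: frame rate is capped by whichever side finishes last, the CPU preparing frames or the GPU(s) rendering them. All the numbers below are made up for illustration, not benchmark data from the review, and the 90% SLI scaling efficiency is an assumption.

```python
# Toy model of CPU-limited vs GPU-limited frame rates under SLI.
# Numbers are hypothetical, purely to illustrate the effect.

def frame_rate(cpu_fps, gpu_fps_single, num_gpus, sli_efficiency=0.9):
    """Effective fps = min(CPU rate, combined GPU rate)."""
    gpu_fps = gpu_fps_single * (1 + (num_gpus - 1) * sli_efficiency)
    return min(cpu_fps, gpu_fps)

cpu_fps = 120  # hypothetical CPU limit, roughly resolution-independent

# At low resolution a single card already outruns the CPU, so a
# second card adds nothing - SLI can even look "slower" with overhead:
low_single = frame_rate(cpu_fps, gpu_fps_single=130, num_gpus=1)  # 120
low_sli = frame_rate(cpu_fps, gpu_fps_single=130, num_gpus=2)     # 120

# At high resolution/AA the GPU is the bottleneck, so scaling appears:
high_single = frame_rate(cpu_fps, gpu_fps_single=45, num_gpus=1)  # 45
high_sli = frame_rate(cpu_fps, gpu_fps_single=45, num_gpus=2)

print(low_single, low_sli)    # identical: CPU-limited
print(high_single, high_sli)  # second GPU now helps
```

    In other words, the gap between single-card and SLI results only opens up once the GPU load (resolution/AA) is high enough that the CPU is no longer the bottleneck.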
     
  13. HourBeforeDawn

    HourBeforeDawn a.k.a KazeModz

    Joined:
    26 Oct 2006
    Posts:
    2,637
    Likes Received:
    6
    Ah okay, that makes sense. Hmm, I wonder if they could program SLI so that in such a situation it shuts off the second GPU and treats it as non-SLI. But I guess that would only work if this is common in other games as well, which I'm not sure about - never used SLI, I'm an ATI guy.
     