
Does 'Prefer High Performance' mode make any difference?

Discussion in 'Tech Support' started by Tripwires, 20 Jul 2012.

  1. Tripwires

    Tripwires What's a Dremel?

    Joined:
    15 Jun 2012
    Posts:
    70
    Likes Received:
    0
    Hi guys and girls,

    Something I've always wondered about with Windows 7 and also Nvidia graphics cards.

    For a start, Windows 7 defaults to the 'Balanced' power plan, presumably meaning the CPU clocks up when it's needed and clocks down when it isn't.

    Nvidia cards are similar.

    However, what difference, if any, would there be in setting both of these to 'prefer high performance' mode?

    I did this with Windows and noticed that my CPU stayed at 4.5GHz at all times, even when idle.

    Is there any discernible benefit when gaming?

    Does it degrade the lifespan of components unnecessarily?

    Would be interested to hear your well-founded thoughts :):thumb:
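
    The clock-scaling behaviour described above can be sketched as a toy model. To be clear, this is only an illustration of the idea, not Windows' actual governor, and the clock figures are invented:

    ```python
    # Toy sketch of 'Balanced' vs 'High performance' CPU scaling.
    # Not Windows' real frequency governor; the clocks are made up.

    F_MIN, F_MAX = 1600, 4500  # MHz; hypothetical idle and boost clocks

    def balanced_clock(load):
        """Scale the clock with demand: idle -> F_MIN, full load -> F_MAX."""
        return F_MIN + (F_MAX - F_MIN) * load

    def high_performance_clock(load):
        """Ignore demand and pin the clock at maximum, as observed above."""
        return F_MAX

    print(balanced_clock(0.0))          # idle: clocks right down
    print(balanced_clock(1.0))          # full load: boosts to maximum
    print(high_performance_clock(0.0))  # idle: still pinned at maximum
    ```

    In this sketch, the only difference 'High performance' makes is at low load, which matches the observation of a CPU sitting at 4.5GHz while idle.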
     
  2. munkey

    munkey What's a Dremel?

    Joined:
    24 May 2012
    Posts:
    62
    Likes Received:
    2
    I wouldn't expect it to make any major difference, except that your electricity usage would go up, since running at higher clock speeds takes more energy.

    I'm sure someone else here would know better though. :)
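
    The electricity point can be put in rough numbers. All the figures here are assumptions for illustration: say pinning the clocks burns an extra 30 W at idle, the PC idles 8 hours a day, and electricity costs £0.14 per kWh:

    ```python
    # Back-of-envelope cost of 'High performance' mode at idle.
    # All three inputs are assumed figures, not measurements.
    extra_watts = 30       # assumed extra idle draw at full clocks
    hours_per_day = 8      # assumed daily idle time
    price_per_kwh = 0.14   # assumed GBP per kWh

    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    cost_per_year = kwh_per_year * price_per_kwh
    print(f"~{kwh_per_year:.1f} kWh/year, ~£{cost_per_year:.2f}/year")
    ```

    So under these assumptions it's on the order of £10 a year, i.e. noticeable on the bill but not dramatic.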
     
  3. noizdaemon666

    noizdaemon666 I'm Od, Therefore I Pwn

    Joined:
    15 Jun 2010
    Posts:
    6,084
    Likes Received:
    784
    Pretty much this. Downclocking the CPU when it's largely inactive is good, as it reduces heat output and voltage, keeping lifespan as long as possible. Setting Windows power options is mainly for laptops, so you can decide how much power you'd like to save. I'd leave it on Balanced, then alter anything you think needs altering. (I changed how quickly it turns my HDDs and monitor off.)
     
  4. Tripwires

    Tripwires What's a Dremel?

    Joined:
    15 Jun 2012
    Posts:
    70
    Likes Received:
    0
    Thank you very much guys.

    I guess the same would be true of graphics cards?

    Only once, when playing Arma 2, I was getting an OK frame rate but not a great one (30fps ish),
    and my graphics card wasn't at 100% usage. Couldn't it be giving me better fps if I told it, "yes, 30fps is OK but you can do better, give me full clock speed even when you're not using all of your brain"?

    Or something like that haha.

    I'm probably completely wrong and cards don't work like this at all.

    I think what I'm trying to say is: doesn't how well Balanced mode works depend on how good the component is at apportioning its resources? When I saw my GPU was not at full usage in a very intensive part of a very intensive game, it got me wondering.
     
  5. noizdaemon666

    noizdaemon666 I'm Od, Therefore I Pwn

    Joined:
    15 Jun 2010
    Posts:
    6,084
    Likes Received:
    784
    When your graphics card isn't under full load, it means one of two things.

    1. You've got vsync on, and it only needs some of the available power to meet your monitor's refresh rate.

    2. Other parts of your system are bottlenecking it.

    So you can't really tell your GPU to do more work; they're designed to run at 99% load most of the time.
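
    Point 1 above can be sketched with some made-up numbers: with vsync on a 60Hz monitor, the GPU only has to finish each frame within the ~16.7ms refresh interval, so a fast card sits partly idle while a slow one pegs at 100% and drops below 60fps:

    ```python
    # Toy model of GPU load under vsync. The render times below are
    # invented for illustration; real frame times vary constantly.

    def gpu_load_with_vsync(render_ms, refresh_hz=60):
        """Fraction of each refresh interval the GPU spends rendering."""
        interval_ms = 1000 / refresh_hz  # ~16.7 ms at 60 Hz
        return min(render_ms / interval_ms, 1.0)

    print(gpu_load_with_vsync(8))   # fast card: partial load, capped at 60fps
    print(gpu_load_with_vsync(25))  # slow card: 100% load, fps falls below 60
    ```

    On this model, a GPU reading below 100% under vsync isn't wasted headroom waiting to be unlocked; it simply finished the frame early and has nothing to do until the next refresh.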
     
  6. Tripwires

    Tripwires What's a Dremel?

    Joined:
    15 Jun 2012
    Posts:
    70
    Likes Received:
    0
    Hmmm interesting, thanks.

    I can't see that I'd have any bottlenecks, so it must be the former, as I always use vsync. I think screen tearing is worse than a poor frame rate.
     
