
News: Gigabyte Z68 boards to use ‘virtual GPU’ software

Discussion in 'Article Discussion' started by arcticstoat, 6 May 2011.

  1. arcticstoat

    arcticstoat New Member

    Joined:
    19 May 2004
    Posts:
    916
    Likes Received:
    13
  2. perplekks45

    perplekks45 LIKE AN ANIMAL!

    Joined:
    9 May 2004
    Posts:
    5,710
    Likes Received:
    321
    Interesting. Sounds a bit like nVidia's Optimus.
     
  3. c0ldfused

    c0ldfused What's a Dremel?

    Joined:
    19 Aug 2010
    Posts:
    31
    Likes Received:
    0
    Sounds like a great idea, I use my PC for surfing the net or watching movies but have a 560Ti for gaming.
    Could save money on the bills in the long run.
     
  4. Flibblebot

    Flibblebot Smile with me

    Joined:
    19 Apr 2005
    Posts:
    4,640
    Likes Received:
    150
    But if everything is output through Sandy Bridge, does that mean having to use the motherboard's video connector? And does that mean you'll lose the ability to use multiple monitors?

    It sounds like a cool idea in theory, but I'll wait to see what it's like in practice before getting excited.
     
  5. thelaw

    thelaw New Member

    Joined:
    10 Sep 2010
    Posts:
    1,096
    Likes Received:
    27
    Hybrid transitions between discrete graphics and the CPU's integrated graphics, eh?

    I agree with the comments: the biggest issue will be designing the system to feel flawless, switching the discrete graphics card off when you're just in Windows and back on when the system demands graphics processing power, without a delay/lag... it just sounds like an extra feature to charge more for a Z68 board...
     
  6. Deders

    Deders New Member

    Joined:
    14 Nov 2010
    Posts:
    4,048
    Likes Received:
    106
    Quicksync?
     
  7. Guest-16

    Guest-16 Guest

    The ASUS Z68 boards can do QuickSync even if you plug the display output into the graphics cards because they have additional power hardware; it doesn't matter what output you use. AFAIK Gigabyte boards cannot do this. Hopefully bit-tech will test this :)
     
  8. TheStockBroker

    TheStockBroker Well-Known Member

    Joined:
    19 Nov 2009
    Posts:
    1,517
    Likes Received:
    106
    From the article: "The idea is that the software will take advantage of the media-centric capabilities of the Sandy Bridge GPU, which Intel claims is capable of encoding video faster and with less power consumption than mid-range discrete GPUs. This might sound unlikely, but it is plausible"

    Has Bit-Tech not yet done their own tests?

    Everywhere else is showing quite literally unbelievable encoding results.

    TSB
     
  9. xaser04

    xaser04 Ba Ba Ba BANANA!

    Joined:
    27 Jun 2008
    Posts:
    1,959
    Likes Received:
    99
    This is the biggest "niggle" affecting Optimus. Users of the M11x R2 have noted many times that games simply refuse to use the discrete GPU, defaulting to the integrated one instead, even after following the correct Optimus procedure (adding the game to the "whitelist").

    Hopefully this won't affect the way this (Lucid's solution) is supposed to work.
     
  10. azrael-

    azrael- I'm special...

    Joined:
    18 May 2008
    Posts:
    3,846
    Likes Received:
    124
    The way Virtu (and nVidia's Optimus/Synergy) works is by copying the content of the discrete GPU's frame buffer to the internal GPU's frame buffer. The problem here, as I see it, is that this may incur quite a substantial performance penalty. Although, if Virtu is a hit (and perhaps if Intel should buy Lucid or do their own Virtu), this process might move into hardware at some point.
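    The frame-buffer copy described above can be sized with a quick back-of-envelope sum (a rough sketch; the 1080p desktop, 32-bit colour and 60fps are illustrative assumptions, not measured Virtu figures):

```python
# Estimate the traffic a Virtu-style frame-buffer copy generates.
# Assumed: 1920x1080 desktop, 32-bit colour, 60 frames per second.
width, height = 1920, 1080
bytes_per_pixel = 4                       # 32-bit RGBA
fps = 60

frame_bytes = width * height * bytes_per_pixel
traffic_per_second = frame_bytes * fps    # bytes copied each second

print(f"One frame: {frame_bytes / 1e6:.1f} MB")
print(f"Copy traffic: {traffic_per_second / 1e9:.2f} GB/s")
```

    At roughly half a gigabyte per second the raw copy is small next to GPU memory bandwidth, so any penalty is more likely to come from latency and synchronisation than from the bytes moved.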

    Oh, and regarding the Z68 chipset. I haven't really been able to figure out what all the commotion is. Z68 is the same chipset as P67 and H67 as well as all the other revisions of Cougar Point. The difference comes from which parts of the chipset are fused off and which aren't. It's called artificial market segmentation, and it's a game Intel just loves to play... :)
     
  11. TWeaK

    TWeaK New Member

    Joined:
    28 Jan 2010
    Posts:
    521
    Likes Received:
    7
    This. How will the GPU pass its display data to the system? Will it go back over the PCI-E connection? I know we're nowhere near saturating that connection, but it strikes me as pretty inefficient to have data shuffled around like that. Surely it would've been better to output whatever the Sandy Bridge GPU renders through the discrete card.
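    For scale, that traffic can be compared against the link itself (a sketch assuming a 1080p/60fps desktop on a PCI-E 2.0 x16 slot; the ~8 GB/s figure is the nominal per-direction throughput, ignoring protocol overhead):

```python
# Fraction of a PCI-E 2.0 x16 link consumed by copying a 1080p
# frame buffer back to the host 60 times per second.
pcie2_x16_bytes_per_s = 8.0e9      # ~8 GB/s usable in each direction
frame_bytes = 1920 * 1080 * 4      # 32-bit pixels
traffic = frame_bytes * 60         # bytes per second at 60 fps

fraction = traffic / pcie2_x16_bytes_per_s
print(f"{fraction:.1%} of the link")   # roughly 6% of available bandwidth
```

    So the copy uses only a few percent of the slot's bandwidth; inefficient-feeling, as TWeaK says, but nowhere near saturation.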
     
  12. Bungletron

    Bungletron Well-Known Member

    Joined:
    25 May 2010
    Posts:
    1,169
    Likes Received:
    62
    I am an M11x R2 owner, and as far as I am concerned the issue was fixed when the updated drivers were released last summer. As long as you are happy to add programs to the whitelist, I would say the Optimus system is flawless: cool and quiet doing run-of-the-mill stuff, a performance boost when discrete graphics are required, and the transition is seamless.

    If this system works like Optimus it would be very useful. For example, I use two computers as a main rig and a server, and running the rig 24 hours a day is inefficient. If the discrete card were not used when not gaming, it would certainly be more viable to use one machine as both gaming rig and server; I would cut my power consumption in half.
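    That saving can be ballparked (a sketch; the 100 W idle overhead for a gaming rig versus a lean server, and the £0.13/kWh tariff, are illustrative assumptions, not measured figures):

```python
# Rough annual cost of the extra idle draw a discrete card and
# big PSU add to a machine left on 24/7 as a server.
extra_watts = 100          # assumed idle overhead of the gaming rig
hours_per_year = 24 * 365
price_per_kwh = 0.13       # assumed tariff, GBP

kwh = extra_watts * hours_per_year / 1000
cost = kwh * price_per_kwh
print(f"{kwh:.0f} kWh/year, about £{cost:.2f}")
```

    Under those assumptions the idle overhead alone is worth a three-figure sum per year, which is the gap a working Virtu/Optimus-style switch would close.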

    I would hope that all video is routed internally (likely through the motherboard's video connector, which may negate its use in multi-monitor setups, as was mentioned). One thing I am unsure about is the mention of software: as far as I am aware, Optimus is mostly a hardware implementation, and surely that is where the flawless, full-performance transitions must come from. I fear any implementation in software would add overhead and cripple performance; we shall see.
     
  13. dunx

    dunx ITX is where it's at !

    Joined:
    1 Sep 2010
    Posts:
    463
    Likes Received:
    13
    Be honest! How many of you actually look at the power consumption?

    dunx

    P.S. i7-960 + i7-870+ i7-870+i5-655K+Q6600....
     
  14. wuyanxu

    wuyanxu still wants Homeworld 3

    Joined:
    15 Aug 2007
    Posts:
    10,588
    Likes Received:
    231
    Some Gigabyte boards don't have a VGA output; most likely they just desoldered the P67 chipset, soldered on a Z68 and off it goes.

    So Nvidia Optimus (aka Synergy) won't work on these boards any more, as it requires the monitor to be plugged into the integrated video output. This press release is simply Gigabyte's way of saying "there, have some duct tape for this issue we are too lazy to fix".


    Using Nvidia Optimus for Nvidia cards, and AMD's solution for AMD cards, would be much better, considering they already have strict game-driver test procedures.
     
  15. Hawkest

    Hawkest I got some 4GB new RAM

    Joined:
    22 Jun 2009
    Posts:
    257
    Likes Received:
    4
    I thought Hydra wasn't as impressive as it was made out to be... have Lucid got this right?
     
  16. Deders

    Deders New Member

    Joined:
    14 Nov 2010
    Posts:
    4,048
    Likes Received:
    106
    I think that has more to do with multi-GPU gaming performance, where there will be overhead translating the ATI and Nvidia driver code into something compatible with each other. I think with the Z68 implementation only one GPU will be used at a time, for whatever purpose it's best at.
     
  17. Denis_iii

    Denis_iii New Member

    Joined:
    1 Jan 2007
    Posts:
    1,224
    Likes Received:
    14
    Is Z68 the X58 replacement?
     
  18. l3v1ck

    l3v1ck Fueling the world, one oil well at a time.

    Joined:
    23 Apr 2009
    Posts:
    12,945
    Likes Received:
    17
    I'd be interested in this as a way to save battery life on laptops, but on desktops it isn't an issue for me, as I'd expect these boards to carry a slight price premium over the standard boards without this technology.
     
  19. bobwya

    bobwya Custom PC Migrant

    Joined:
    3 May 2009
    Posts:
    193
    Likes Received:
    1
    Quite often, since we had a new electricity meter fitted - one which appears to actually measure our household usage (6 computers at peak, plus various gadgets)... :-(
    I preferred the old meter, which didn't appear to notice I was running a 4GHz Core i7 920 and dual 2.6GHz Opteron 270s, both with 8800GTXs, plus an HD array, 24/7... Just about as inefficient as you can get (since the motherboard the Opterons are in doesn't support AMD PowerNow! properly, and the 8800GTX doesn't have a low-voltage 2D mode)!
     
  20. Deders

    Deders New Member

    Joined:
    14 Nov 2010
    Posts:
    4,048
    Likes Received:
    106
    Now and then I like to see how much power certain games and processes take up; Crysis 2 consistently draws the most for me, nearly 400W from the wall. Only the last Protoss level from StarCraft 2 came close.
     