
Hardware Asus Maximus IV Gene-Z Review

Discussion in 'Article Discussion' started by arcticstoat, 18 Jul 2011.

  1. Guest-16

    Guest-16 Guest

    Yes they do!

    Maximus IV Extreme
    http://www.asus.com/Motherboards/Intel_Socket_1155/Maximus_IV_Extreme/

    and the Extreme-Z with Z68.


    We're just better at making motherboards :cooldude:

    ;)
     
  2. slothy89

    slothy89 MicroModder

    Joined:
    17 Feb 2011
    Posts:
    145
    Likes Received:
    5
    ASUS seems to have the upper hand on Intel's 1155 iteration. Very happy with my P8P67 Pro.
    If only I had the money; this board would make a killer replacement for my ageing Core2Quad media/games PC that sits under the TV.
     
  3. tigertop1

    tigertop1 What's a Dremel?

    Joined:
    23 Apr 2009
    Posts:
    65
    Likes Received:
    0
    OK, and I appreciate your response, but in the greater scheme of things such on-costs are minimal in a production run when weighed against the advantage of being two steps ahead of the opposition. It's going to happen some time, so why not now on a premium board? I already use a PCIe USB3 card. All it does is take up a valued PCIe slot, as opposed to having the functionality built in. The same applies to SATA 6Gbps.
     
  4. Guest-16

    Guest-16 Guest

    But it's not in the actual chipset design. Intel simply doesn't provide it for any board.
     
  5. KidMod-Southpaw

    KidMod-Southpaw Super Spamming Saiyan

    Joined:
    28 Sep 2010
    Posts:
    12,592
    Likes Received:
    558
    I just need it, too bad we're not all able to afford these nice things. I'll go with the P8P67.
     
  6. Xir

    Xir Modder

    Joined:
    26 Apr 2006
    Posts:
    5,412
    Likes Received:
    133
  7. tigertop1

    tigertop1 What's a Dremel?

    Joined:
    23 Apr 2009
    Posts:
    65
    Likes Received:
    0
    OK, I understand. Let's hope that the next Intel chipset introduction will deal with this. We seem to be at an intermediate stage in the evolution towards USB3 and SATA 6Gbps. Having downloaded a large program to a USB3 flash drive and then seemingly instantly loaded it onto my MSI P67A mobo equipped with a SATA 6Gbps hard drive, I'm a total convert to the upgrade.
     
  8. hurrakan

    hurrakan What's a Dremel?

    Joined:
    13 May 2010
    Posts:
    56
    Likes Received:
    1
    Don't know, but here is the recent AnandTech review of the UD3:

    http://www.anandtech.com/show/4493/gigabyte-z68xud3hb3-review

    However, their benchmarks are different from the Bit-Tech benchmarks, so it's hard to compare them directly.
     
  9. casheye

    casheye What's a Dremel?

    Joined:
    21 Jul 2011
    Posts:
    1
    Likes Received:
    0
    I just bought this board, and discovered why it is fast at stock speed.
    It allows the 2600/2600K CPU to run at 3.8GHz on all four cores at full load, whereas a reference board would run at 3.5GHz when all four cores are stressed.

    This is actually overclocking.
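    For reference, a minimal sketch of the arithmetic behind this, assuming the i7-2600K's documented turbo bins (+4/+3/+2/+1 multiplier steps for 1/2/3/4 active cores above the 34x base multiplier); the "overclock" described above amounts to the board applying the single-core bin to all four cores:

    ```python
    # Sketch of Sandy Bridge turbo behaviour, assuming the i7-2600K's
    # published bins: base 34x multiplier on a 100 MHz base clock, with
    # +4/+3/+2/+1 turbo bins for 1/2/3/4 active cores.
    BCLK_MHZ = 100
    BASE_MULT = 34
    TURBO_BINS = {1: 4, 2: 3, 3: 2, 4: 1}  # active cores -> extra multiplier bins

    def turbo_ghz(active_cores, extra_bins=0):
        """Clock in GHz; extra_bins models a board forcing higher bins than stock."""
        mult = BASE_MULT + TURBO_BINS[active_cores] + extra_bins
        return BCLK_MHZ * mult / 1000

    print(turbo_ghz(4))                 # 3.5 -> stock all-core turbo
    print(turbo_ghz(4, extra_bins=3))   # 3.8 -> board forcing the max bin on all cores
    ```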
     
  10. Xir

    Xir Modder

    Joined:
    26 Apr 2006
    Posts:
    5,412
    Likes Received:
    133
    Hmmm yes, they see a difference between the C300 and the Vertex 3 as well... not as much as Bit-Tech though.
     
  11. Edvuld

    Edvuld Minimodder

    Joined:
    2 Aug 2005
    Posts:
    230
    Likes Received:
    0
    Is it possible to use the second PCIe x16 slot for a single graphics card configuration? Will it only run at x8 speed even if I buy a Gen3 model?

    I might have to do this to get my Zalman 7700 to fit; unsure about that at the moment :/
     
  12. debs3759

    debs3759 Was that a warranty I just broke?

    Joined:
    10 Oct 2011
    Posts:
    1,769
    Likes Received:
    92
    Slot 1 will work at x16 or x8. Slot 2 will only work at x8. That's the same as the original Gene-Z (before I set up my watercooling, I had to run my old GPU as an x8 device in slot 2, because my Noctua NH-D14 covered most of the motherboard!).
     
  13. Edvuld

    Edvuld Minimodder

    Joined:
    2 Aug 2005
    Posts:
    230
    Likes Received:
    0
    Ok, thank you. At least I know it'll work. Is there any performance loss?
     
  14. debs3759

    debs3759 Was that a warranty I just broke?

    Joined:
    10 Oct 2011
    Posts:
    1,769
    Likes Received:
    92
    I'm not a gamer, so my most GPU-intensive work is folding, which isn't affected by the bandwidth.

    I think the best way to see whether you're fully using the GPU when it's in the x8 slot is to use a monitoring tool, such as EVGA Precision or MSI Afterburner. Both will help you to overclock (although the EVGA software doesn't store voltage changes between sessions) and have a tool to show how much of the GPU you're using, clock speeds, etc.
     
  15. Marvin-HHGTTG

    Marvin-HHGTTG CTRL + SHIFT + ESC

    Joined:
    10 Oct 2010
    Posts:
    1,187
    Likes Received:
    58
    There's not a massive difference between x8 and x16, depending on the card. Dual-GPU cards will be affected far more than single-GPU cards. If it's a 6950-level card, it will make little (~2-5% max) difference; a GTX 580-level card will be affected more. As the slot is PCI-E 3.0, when Ivy Bridge comes out (and thus PCI-E 3.0 becomes relevant), in theory the x8 slot will be able to provide as much bandwidth as an x16 PCI-E 2.0 slot, but IIRC that will only work with PCI-E 3.0-compatible cards, so it won't make any difference to older cards.
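    The x8 Gen3 ≈ x16 Gen2 claim above follows from the published line rates and encoding overheads (Gen2: 5 GT/s with 8b/10b encoding; Gen3: 8 GT/s with 128b/130b). A quick sketch of the arithmetic:

    ```python
    # Approximate usable PCIe slot bandwidth from the spec's per-lane line
    # rates and encoding overheads (Gen2: 5 GT/s, 8b/10b; Gen3: 8 GT/s, 128b/130b).

    def lane_bandwidth(gen):
        """Usable bandwidth per lane in GB/s (decimal), after encoding overhead."""
        if gen == 2:
            return 5e9 * (8 / 10) / 8 / 1e9     # 0.5 GB/s per lane
        if gen == 3:
            return 8e9 * (128 / 130) / 8 / 1e9  # ~0.985 GB/s per lane
        raise ValueError("unsupported generation")

    def slot_bandwidth(gen, lanes):
        return lane_bandwidth(gen) * lanes

    print(f"x16 Gen2: {slot_bandwidth(2, 16):.1f} GB/s")  # 8.0 GB/s
    print(f"x8  Gen3: {slot_bandwidth(3, 8):.1f} GB/s")   # 7.9 GB/s
    ```

    So an x8 Gen3 link comes within a couple of percent of an x16 Gen2 link, which is why the distinction only matters once both the card and the platform speak PCI-E 3.0.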
     