Hardware Gigabyte GA-Z68AP-D3 Review

Discussion in 'Article Discussion' started by arcticstoat, 11 Oct 2011.

  1. arcticstoat

    arcticstoat New Member

    Joined:
    19 May 2004
    Posts:
    916
    Likes Received:
    13
  2. Hustler

    Hustler Well-Known Member

    Joined:
    8 Aug 2005
    Posts:
    1,009
    Likes Received:
    30
    "they only have Crossfire X certification - there's no SLI."

    Oh, great, give us the inferior dual-card system, why don't you...

    If you're only going to support one, then at least let it be Nvidia's, which seems to be a far more stable and reliable dual-card system.

    Seriously, Gigabyte, I do wish you'd used SLI.
     
    Last edited by a moderator: 11 Oct 2011
  3. Xir

    Xir Well-Known Member

    Joined:
    26 Apr 2006
    Posts:
    5,249
    Likes Received:
    88
    Z68 boards have been just a tenner or so more expensive than the comparable P67 boards, so I really can't stand hearing this complaint any more.
    Fast response (and sometimes encoding on the iGPU) for very little extra. How is this not a win?
     
  4. favst89

    favst89 New Member

    Joined:
    23 Jul 2010
    Posts:
    390
    Likes Received:
    13
    I guess more features without much added cost is a good thing, especially while maintaining the overclocking ability.

    As for Crossfire vs SLI, I haven't tried either, so I can't comment on which is better.
    I believe SLI certification requires two slots with x8 lanes each, whereas Crossfire only requires x8 and x4. I assume this means extra cost for components/wiring of some form. It may also cost Gigabyte a fee to have the certification in the first place.
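    The certification rule described above can be sketched in a few lines. This is only an illustration of the lane-width requirements as stated in the post, not Nvidia's or AMD's actual licensing terms:

    ```python
    # Sketch of the rule above (an assumption from the post, not official
    # licensing terms): SLI wants at least two slots running at x8 or wider,
    # while Crossfire will accept an x8 + x4 split.

    def supports_sli(slot_lanes):
        """slot_lanes: lane widths of the populated PCIe slots, e.g. [8, 8]."""
        return len(slot_lanes) >= 2 and all(width >= 8 for width in slot_lanes)

    def supports_crossfire(slot_lanes):
        return len(slot_lanes) >= 2 and all(width >= 4 for width in slot_lanes)

    # Sandy Bridge's 16 CPU lanes split x8/x8 would pass both checks;
    # an x16/x4 arrangement only passes the Crossfire one.
    print(supports_sli([8, 8]), supports_crossfire([8, 8]))    # True True
    print(supports_sli([16, 4]), supports_crossfire([16, 4]))  # False True
    ```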
     
  5. damien c

    damien c Mad FPS Gamer

    Joined:
    31 Aug 2010
    Posts:
    2,713
    Likes Received:
    97
    SLI or Crossfire on Sandy Bridge is not as good as it could be, simply because I don't think any of the boards will give true x16 to each slot.

    I know when I was running SLI'd GTX 580s, the Nvidia drivers always warned me on install that the bottom card was running slowly because it wasn't in a high-speed slot.

    As for this board not having SLI, it could put off people who may want to SLI some cheap Nvidia cards, like GTX 560s, but what it offers is fine for those who decide to use two ATI cards or a single GTX 590.

    Seems like a decent board for someone on a budget, but anyone with enough cash for a proper SLI rig needs to spend more on a board, and unless you can get one that gives both cards x16 speed, I think they should look at X58 setups, which give each card the full speed available.

    Good job, Gigabyte, keep it up!
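    For context on the x16-per-slot point, here is a rough back-of-the-envelope calculation, assuming PCIe 2.0 (which both Sandy Bridge and X58 provide) at roughly 500 MB/s per lane per direction after encoding overhead:

    ```python
    # Rough per-slot bandwidth behind the x16 vs x8 argument above.
    # Assumes PCIe 2.0: ~500 MB/s per lane, per direction, after
    # 8b/10b encoding overhead.

    MB_PER_LANE = 500  # PCIe 2.0, one direction

    def slot_bandwidth_gb(lanes):
        """GB/s per direction for a slot running at the given lane width."""
        return lanes * MB_PER_LANE / 1000

    for lanes in (16, 8, 4):
        print(f"x{lanes}: {slot_bandwidth_gb(lanes):.1f} GB/s per direction")
    # x16: 8.0 GB/s, x8: 4.0 GB/s, x4: 2.0 GB/s
    ```

    An x8 slot still offers 4 GB/s each way, which is why x8/x8 dual-card setups on Sandy Bridge lose far less performance in practice than the halved number suggests; the x4 slot is where the driver warnings come from.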
     
  6. hurrakan

    hurrakan New Member

    Joined:
    13 May 2010
    Posts:
    56
    Likes Received:
    1
    Why weren't the benchmarks for the "Asus Maximus IV Gene-Z" included in the charts in this review?
     
  7. hurrakan

    hurrakan New Member

    Joined:
    13 May 2010
    Posts:
    56
    Likes Received:
    1
    "The one useful feature the more expensive chipset has is Intel Smart Response"

    This comment is misleading. It's only true for people with a single, small-capacity SSD. For people with several SSDs, Smart Response is the most useless feature.
     
  8. Christopher N. Lew

    Christopher N. Lew Folding in memory of my father

    Joined:
    23 Apr 2009
    Posts:
    1,347
    Likes Received:
    44
    Does Smart Response make a difference in the real world?
     
  9. Xir

    Xir Well-Known Member

    Joined:
    26 Apr 2006
    Posts:
    5,249
    Likes Received:
    88
    For Smart Response, I think there was a test right here on bit-tech somewhere?
    For Virtu transcoding, check Tom's or Anand (as the bit-tech article tested the iGPU and dGPU the wrong way round).
    :waah:
     
  10. phuzz

    phuzz This is a title

    Joined:
    28 May 2004
    Posts:
    1,700
    Likes Received:
    24
    Does SLI or Crossfire actually work yet?
    As in, does it actually provide faster frame rates than a comparable cost single card system without loads of headaches?
     
  11. YingKo

    YingKo New Member

    Joined:
    8 Sep 2003
    Posts:
    57
    Likes Received:
    0
    Seriously, a parallel port?
     
  12. fluxtatic

    fluxtatic New Member

    Joined:
    25 Aug 2010
    Posts:
    507
    Likes Received:
    5
    If you're dropping $120 on a mobo, I doubt you really care much about XFire or SLI, honestly. Are you really going to cheap out on the board and then deal with dual-card headaches on cheap cards? Why not buy a single, better card in the first place? If money were no object, I might go in for a multi-card setup, but it seems to be demonstrated over and over again that it doesn't scale as nicely as you might like. Aside from that, driver issues seem to be a constant PITA. Sure, some games will scale monstrously under XF/SLI, but it mostly seems to be an e-peen thing.

    As to favst89's comment, that may be the case (cba to check atm), but the board should be able to provide that, since SB does have 16 lanes (or are there other components eating into those lanes?). I'd guess he's dead on about the fee part, though. Maybe XFire is just cheap enough that Gigabyte can make it a tick-box feature. Even an enthusiast might think it pretty lame that their shiny new mobo won't do either one, whether or not it matters to them personally. If I could have got my new board for a few bucks less without XF, I would have in a heartbeat, if that was the only thing that got dropped. (I do find it funny that mine is quad-GPU certified when the slots run at x16/x4, though - how well would that even work, shoving two 6990s in there? Not to mention spending 10x the cost of the board on the video cards - yes, I know the 6990s aren't the only dual-GPU AMD cards, but shiny new mobo! Need shiny new video cards!)

    Word to YingKo - why the hell would I be buying a Z68 board if I needed a parallel and/or serial port? In most scenarios I can think of, Z68 would gain me no advantage in that sort of setup - better off getting a low-end ASRock or Biostar AMD board and shoving an Athlon II X2 in there - a hell of a lot cheaper, too, even compared with a cheap Intel board. Not counting a CNC controller or similar, I suppose - who was it mentioning their greatly stripped-down XP install running the CNC machine? Total focus on running the CNC control so the part doesn't get hosed by WinUpdate being 'helpful' or the like. Would the superior platform help in that situation?
     
  13. accountlink

    accountlink New Member

    Joined:
    1 Nov 2011
    Posts:
    1
    Likes Received:
    0
    What heatsink/cooler did you use to achieve 4.9GHz with this Gigabyte mobo?

    I have the Gigabyte GA-Z68AP-D3-B3 with an i5 2500K, using a Cooler Master Hyper 212+, and I'm able to overclock to 4.5GHz with no problems.

    I'd like to replicate my success at 4.9GHz if possible using the cooler you guys tested with. Thx.
     