
Build Advice Mini-ITX or MicroATX daily use (lots of video encoding)

Discussion in 'Hardware' started by Rapture2k4, 7 Apr 2011.

  1. Rapture2k4

    Rapture2k4 What's a Dremel?

    Joined:
    24 Aug 2009
    Posts:
    37
    Likes Received:
    0
    So I've decided I would like to upgrade from a Q9550 to something a bit more modern and smaller.

    Budget: $400-$500 (CPU, Heatsink, Mobo, and RAM only)

    What I need:
    Improved video/audio encoding
    Improved Adobe suite performance
    Improved gaming capabilities
    4+ SATA ports (II or III)
    4+ USB 2.0/3.0 ports
    Ability to reuse HD 6850 to full capacity (if possible)
    Small footprint.

    What I don't care about/don't need:
    Overclocking
    Benchmark chasing
    SLI/Crossfire

    I am trying not to be too wordy (or vague, for that matter), but I can't decide on the CPU/mobo combo I need. I was looking at the Sandy Bridge chips and getting lost in the many different reviews of them. Would it be beneficial to go for the i7-2600K just so I can get Hyper-Threading, or will an i5-2400/2500 (K or not) be better? If so, which, and why? I've also been confused by how a lot of Mini-ITX boards that have a PCIe x16 slot are only electrically x4. Is this true, and is it even a factor? Will my gfx card be hampered by a Mini-ITX system? Finally, I'm looking for a CPU heatsink that is about the same height as my gfx card (installed); any recommendations?

    Currently eyeing:

    GIGABYTE GA-H67N-USB3-B3 (2xHDMI????) or
    ASUS P8H67-I DELUXE (will need to buy RAM)
    i5-2400 Sandy Bridge
    Scythe Big Shuriken SCBSK-1000

    Thank you for reading.

    P.S. I will be building a custom case and posting the whole build on here!
     
  2. Deders

    Deders Modder

    Joined:
    14 Nov 2010
    Posts:
    4,053
    Likes Received:
    106
    The non-K Sandy Bridge CPUs paired with an H67 motherboard will be able to use the Quick Sync feature on the on-die graphics chip to greatly increase video encoding speed, sometimes by up to 6x that of a normal CPU. (Apparently some motherboards will let you use a graphics card alongside the Intel chip, but I've not seen any; there may be a solution with an H67 board and a Lucid chip.)
     
  3. Marine-RX179

    Marine-RX179 What's a Dremel?

    Joined:
    24 May 2010
    Posts:
    406
    Likes Received:
    14
    Ignore.
     
    Last edited: 7 Apr 2011
  4. 3lusive

    3lusive Minimodder

    Joined:
    5 Feb 2011
    Posts:
    1,120
    Likes Received:
    45
    The Asus P8H67-I Deluxe uses 204-pin SO-DIMM DDR3 RAM (instead of standard 240-pin) because of the number of features crammed onto the mobo, so that's something to take into account. It's mITX, as you say, not mATX, so you obviously lose some expansion features by going with that kind of board. Any H67 board will take a dedicated GPU at full functionality - just not SLI or CrossFire, though some do support multi-GPU setups with the second PCIe slot running at x4 instead of x16.

    Seeing as you're putting a 6850 in it, you don't need a K version, because you obviously can't overclock on H67 and you won't be using the integrated HD 3000 graphics you get with K-series chips. The 2400 is the same chip as the 2500, just clocked a little lower.

    If I were you, I would forget the Deluxe and get the mATX Asus P8H67-M Pro - it has more PCIe slots, takes normal-sized RAM (four DIMM slots instead of two), and still has USB 3.0 support, 4+ SATA ports, etc. (it doesn't have onboard Wi-Fi, though). If you definitely need mITX, the Deluxe should take your gfx card without any qualms (it lists 1x PCI Express 2.0 x16), and an i5-2500 would go nicely with it. If you go 2600, you're paying a large premium for the Hyper-Threading, which is not going to give you 'that' much more grunt, but you can see some benchmarks here.
     
  5. Deders

    Deders Modder

    Joined:
    14 Nov 2010
    Posts:
    4,053
    Likes Received:
    106
    It's not a lack of support for a dedicated GPU; the problem is that when you plug one in, the on-die Intel GPU with the Quick Sync video encoding features is disabled. There may be ways round this, though.
     
  6. 3lusive

    3lusive Minimodder

    Joined:
    5 Feb 2011
    Posts:
    1,120
    Likes Received:
    45
    Can't see there being a way around it until the Z68 boards are released, and honestly I don't think that's much of a deal for him unless he's encoding every day and has the necessary software which supports Quick Sync in the first place.
     
  7. Deders

    Deders Modder

    Joined:
    14 Nov 2010
    Posts:
    4,053
    Likes Received:
    106
    Is there no software for it yet? Would have thought they would have sorted that at least. There were talks of being able to use a Lucid chip to link the GPUs, and some motherboards apparently allow you to plug a second monitor in to enable the chip.
     
  8. 3lusive

    3lusive Minimodder

    Joined:
    5 Feb 2011
    Posts:
    1,120
    Likes Received:
    45
    Going off Intel's website:

    Arcsoft MediaConverter*
    Arcsoft MediaImpression*
    Corel Digital Studio*
    CyberLink MediaEspresso*
    CyberLink PowerDirector*
    Elemental Badaboom*
    MainConcept*
    Movavi Video Converter*
    Roxio Creator*

    support Quick Sync. I use Corel VideoStudio X4 (recently upgraded from using X2 for years), and it supposedly supports it, but the program has been so buggy for me that I haven't had the chance to properly use it (not necessarily a Quick Sync issue). Anyway, unless he's using one of those programs he's not really going to benefit from it, although I'm sure more and more software providers will be implementing it now.
     
  9. IvanIvanovich

    IvanIvanovich будет глотать вашу душу.

    Joined:
    31 Aug 2008
    Posts:
    4,870
    Likes Received:
    252
    Maybe take a look at the Zotac H67ITX-C-E? I'd also go for the 2400, and maybe do a Corsair H50 for the cooler.
     
  10. azazel1024

    azazel1024 What's a Dremel?

    Joined:
    3 Jun 2010
    Posts:
    487
    Likes Received:
    10
    You can always go 2600 if you want Hyper-Threading; there's no point in spending the extra money on the 2600K when you aren't going to overclock. As for the 2400 vs the 2500, they are "different chips" in name only. You might get a 2400 up to higher clocks than a 2500, but it's unlikely. All quad-core SB chips are effectively the same silicon - as far as I know, even the hyperthreaded 2600(K) vs the 2400, 2500, etc. The difference is which features are activated on the silicon (all the same transistors are on the chip; some are just disabled), plus chip binning.

    Binning is taking a batch of chips, testing them at a certain frequency, and looking at the results: power draw, voltage fluctuation, stability, etc. There are certain parameters to meet to qualify for each level of chip. So out of 100 chips they make, they might get 40 that meet the qualifications for a 2400, 40 that meet the qualifications for a 2500, and 20 that meet the qualifications for a 2600. They then set aside a few of the 2500- and 2600-qualified chips and activate multiplier unlocking to turn them into K parts, and they unlock Hyper-Threading on the 2600 and 2600K chips. Slap the labels on the chips and package them up.

    So physically, pretty much all quad-core SB chips are identical. The production process leads to some chip-to-chip variation, and this is where binning comes in to determine how a chip is going to be packaged and sold. Obviously "higher end" chips need to meet much more stringent qualifications to be sold as such, so fewer of them qualify, and they cost more because the process yields fewer of them.
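    A toy sketch of that binning idea (the quality scores, labels, and thresholds here are invented purely for illustration, not real Intel test criteria):

```python
import random

random.seed(0)  # reproducible toy run

# Pretend each chip off the line gets a "quality" score from testing
# (power draw, stability, etc.). Stricter bins demand higher scores,
# so fewer chips qualify for them. Labels/thresholds are made up.
BINS = [("2600", 0.8), ("2500", 0.6), ("2400", 0.0)]  # (label, minimum score)

def bin_chip(score):
    """Assign a chip to the strictest bin whose threshold it meets."""
    for label, threshold in BINS:
        if score >= threshold:
            return label

counts = {}
for _ in range(100):
    label = bin_chip(random.random())
    counts[label] = counts.get(label, 0) + 1

print(counts)  # the strictest bin collects the fewest chips on average
```

    Same silicon in every case; only the score it tested at decides the label it ships with.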

    Lucid has introduced, or is shortly going to introduce, a software solution that allows the SB GPU to be encoding at the same time as a discrete video card is working. The downside is that it only works on H67/H65 boards and the upcoming Z68 boards; P67 boards have the onboard GPU disabled.
     
  11. 3lusive

    3lusive Minimodder

    Joined:
    5 Feb 2011
    Posts:
    1,120
    Likes Received:
    45
    What I meant is that, for all intents and purposes, the 2400 and 2500 are the same chip, in the sense that they're both quad-core i5 Sandy Bridge CPUs but one runs at 3.1GHz and the other at 3.3GHz... I don't see why you have to make it more complicated (especially when he won't be overclocking them on an H67 board).

    As for the OP's budget: he said he had about $400-500 to spend on mobo, CPU, RAM, and heatsink. An i7-2600 with the Asus Deluxe H67 mobo and 4GB of RAM will come to about $500, or he can save $90 and get the i5-2500 but lose HT (he may well make use of it if he is regularly encoding, but if he's not, he is better off getting the 2500 and maybe 8GB of RAM instead).
     
  12. Rapture2k4

    Rapture2k4 What's a Dremel?

    Joined:
    24 Aug 2009
    Posts:
    37
    Likes Received:
    0
    WoW!

    Thank you all for the insightful replies. I see this topic can make people whip out the boxing gloves, lol.

    In any event, I'm an open source kind of guy, so it appears none of my software supports (or plans to support) Quick Sync any time soon. I was hoping the money I spent on the Adobe suite would benefit from Sandy Bridge, but it appears they are holding out until the next major release, which means more money :grr:

    Like I said, overclocking is not something I am interested in. One of the reasons I am interested in Sandy Bridge is that it does it automagically when needed.

    I read somewhere that the Z68 chipset (or maybe even current chipsets) will be able to bypass this by enabling the onboard GPU and hooking a monitor up to it. I think it was AnandTech or somewhere like that. I am definitely interested in the implications of Z68, though. As we all know, rumor and implementation are always a far cry apart.

    For the money, I think you are right. I just hope this board follows the mATX standard and is really 9.6x9.6 inches. I like the layout as well. The biggest plus is I won't need to buy RAM (hopefully), as I will be able to reuse my G.Skill DDR3-1333 RAM. I am still eyeing that mITX board, though. It looks purdy and has some neat features. The SO-DIMM RAM isn't a big deal to me; it actually makes more sense to me to use SO-DIMMs on a small board like that. But what do I know...

    Thank you all for the replies!

    I'm still trying to find a clear answer on PCIe slots on mITX boards. Anyone know the difference between logically x16 and logically x4?
     
  13. Deders

    Deders Modder

    Joined:
    14 Nov 2010
    Posts:
    4,053
    Likes Received:
    106
    x4 is a quarter of the bandwidth of x16
     
  14. Rapture2k4

    Rapture2k4 What's a Dremel?

    Joined:
    24 Aug 2009
    Posts:
    37
    Likes Received:
    0
    Sorry, typo.

    What's the difference between logically x16 and electrically x4?

    i.e. a board is advertised as having an x16 slot, but it is only electrically x4.
     
  15. Deders

    Deders Modder

    Joined:
    14 Nov 2010
    Posts:
    4,053
    Likes Received:
    106
    It's the same size as an x16 slot, but it will only work at x4 speeds.
     
  16. azazel1024

    azazel1024 What's a Dremel?

    Joined:
    3 Jun 2010
    Posts:
    487
    Likes Received:
    10
    That. The slot mechanically fits an x16 card, but only 4 of the lanes actually function, so you can run an x16 PCIe video card in it, but it'll only communicate over those 4 lanes and is thus limited to x4 speeds.
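    To put rough numbers on that (a quick sketch, assuming PCIe 2.0's theoretical peak of 500 MB/s per lane, per direction; real-world throughput is lower):

```python
# Back-of-the-envelope PCIe 2.0 slot bandwidth comparison.
PCIE2_MB_PER_LANE = 500  # MB/s per lane, per direction (theoretical peak)

def slot_bandwidth_gb(lanes):
    """Theoretical one-way bandwidth of a slot with `lanes` active lanes, in GB/s."""
    return lanes * PCIE2_MB_PER_LANE / 1000.0

print(slot_bandwidth_gb(16))  # full x16 slot: 8.0 GB/s
print(slot_bandwidth_gb(4))   # electrically x4: 2.0 GB/s
```

    So an electrically x4 slot has a quarter of the headline bandwidth; whether a mid-range card like the 6850 actually feels that in games is a separate question.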

    I also wish some open source stuff (I am a HandBrake guy) would adopt Quick Sync, but I don't see it happening. The good news is that a 2600 has a crapload of horsepower to encode with on its own.
     
