
Hardware Zotac GeForce 9300 (MCP7a) Motherboard

Discussion in 'Article Discussion' started by Tim S, 15 Oct 2008.

  1. Tim S

    Tim S OG

  2. The boy 4rm oz

    The boy 4rm oz Project: Elegant-Li

    Looks like a fantastic HTPC board. I really like the colour combo of orange and blue. I may use that in a future mod lol.

    Once again a very nice review.
     
  3. R3veNG

    R3veNG You were R3veNG'ed >:-)

    In the last section:

    Zotac's board isn't the fastest or most capable out there, but for just being stable and for HTPC playback it's the best Intel solution currently available. The Nvidia GeForce 9300 ...

    Shouldn't that be Nvidia?

    Great review!
     
  4. Tim S

    Tim S OG

    It means the best integrated graphics solution for Intel processors.
     
  5. Jojii

    Jojii hardware freak

    What? No overclocking section with your cascade phase change setup? You are losing your touch.
     
  6. Guest-16

    Guest-16 Guest

    Can't do jack all overclocking because there's no CPU voltage adjustment anyway :p. I tried and got about 80MHz extra on the FSB, which is pants.

    GPU overclocks like a rocket though - we maxed it out on the Nitro :D
     
  7. Xir

    Xir Modder

    I know I keep repeating myself but...

    An HTPC board without "analogue" TV out?

    S-Video...Component?

    Xir
     
  8. Cupboard

    Cupboard I'm not a modder.

    How does it do in games? A quick benchmark on something not too demanding would be nice to see!
    I do like the colour scheme, and it seems a good board as long as you aren't trying to overclock it - which you aren't going to do in an HTPC. I suppose you might want to underclock, though.
     
  9. Guest-16

    Guest-16 Guest

    I don't have the time or space in the office right now for games, as I'd have to retest ALL the other AMD boards again. We reviewed it as an HTPC and general productivity board, since that's what most people will be using it for. Not many people really use GeForce Boost or the mGPU to play at 800x600 with everything turned off.

    Xir - only very, very few boards ever offered component without an adapter, and none do these days. It's all HDMI, VGA and DVI!
     
  10. Tim S

    Tim S OG

    I've got some plans to include the GeForce 9300 board in my 4550 review that's currently being worked on... :)
     
  11. Guest-16

    Guest-16 Guest

    There, that's the reason I didn't - of course, Tim had it covered...

    :worried:
     
  12. [USRF]Obiwan

    [USRF]Obiwan What's a Dremel?

    At first I thought, "Aah, a micro-ATX HTPC board!" Then I stopped reading when I saw the IDE and FDD connectors and, most importantly, the active cooling solution.
     
  13. Renoir

    Renoir What's a Dremel?

    From the other GeForce 9 series article:
    According to the Nvidia website, that is not the case.
    I thought one of the changes in the G45 relative to previous Intel IGPs was that it had a built-in TMDS transmitter (not TDMS), making it easier for motherboard manufacturers to implement digital video outputs. Is that not the case? Also, the 690G supported dual digital outputs, so the 9300 is not the first.

    I believe that should be "also at 2.1GHz".

    According to this, that's not necessarily the prime reason for the lack of performance.

    On page 10:
    I think you mean maximum, not minimum.

    The performance of the G45 with the Casino Royale Blu-ray looks suspicious to me. The CPU usage seems so high that it might be a software issue somewhere.
    What role do the shaders have in video decoding? Don't these chips have dedicated logic for decoding video, e.g. UVD?
    Correct me if I'm wrong, but doesn't the 9300 have 16 shaders to the 8200/8300's 8?
    Is that referring to the "edge enhancement" slider?

    I have to say the HD HQV score for the 9300 was very impressive, if not all that relevant for Blu-ray movies, as you pointed out.

    Overall, a promising chipset which will only get better with driver and BIOS revisions.

    Nice review, Bindi.
     
  14. Guest-16

    Guest-16 Guest

    The 690G didn't, and nor does the 780G - both offer either DVI or HDMI, not both at once; there's a digital switch behind the rear I/O that selects between them.

    Intel's G45 still doesn't have a built-in TMDS transmitter (a consistent typo - D is closer than M :p) - the mini-ITX board has two. It's cheaper for Intel not to have to pay HDCP and HDMI licences, and because they have a massive OEM business, most of those customers still want VGA only. This is probably also why G45 has an HDCP repeater issue.

    On the website it's referring to the motherboard above, I think - it says DDR3-1333 without mentioning DDR2. Nvidia were clear about it not including IDE, and the Zotac motherboard would not include a JMB386 controller if it didn't need it :)

    Yeah, it's missing the advanced path - there's a setting for it in the drivers, and I was told it was disabled, but I forgot to put that in. I was waiting for more BIOS details about the other two settings as well. :duh: Either way, unlinked mode is still **** and the memory settings are completely FUBAR for DDR2 right now.

    Yep, it was 2.1GHz and 2.13GHz.

    They are? (Unless Tim changed it...) I started recording the maximum (which, in hindsight, I should have done from the start) to see if playback was always smooth and to show the overhead.
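
    If anyone fancies replicating that at home, something along these lines would do it - just a rough sketch in Python using the psutil library (the one-second sample interval and the run length here are arbitrary choices, not what we actually used):

        import psutil

        def sample_cpu_usage(duration_s=60, interval_s=1.0):
            """Poll overall CPU usage during playback; report average and max."""
            samples = []
            for _ in range(int(duration_s / interval_s)):
                # cpu_percent blocks for interval_s and returns usage over that window
                samples.append(psutil.cpu_percent(interval=interval_s))
            return sum(samples) / len(samples), max(samples)

        if __name__ == "__main__":
            avg, peak = sample_cpu_usage(duration_s=60)
            print(f"average: {avg:.1f}%  maximum: {peak:.1f}%")

    Run it while the film plays and the maximum tells you whether there's headroom left, which is the point of recording it.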

    It could well be - I'll revisit it when I come to try the E5200 in the G45 review I've got coming.


    No, the 8200/8300 has 16 according to the GPU-Z screenshot I took in the Jetway 8200 article.

    Decoding has its own dedicated hardware, but the shaders do the deinterlacing and some other processing - resizing too, IIRC. But it's all down to the software to use it efficiently - Intel's Clear Video deinterlacing is fantastic compared to AMD's and Nvidia's.
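
    For a rough idea of what the deinterlacing step actually does, here's a toy "bob" deinterlace in Python/NumPy - purely illustrative, and nothing like the drivers' real shader-based, adaptive implementations:

        import numpy as np

        def bob_deinterlace(frame: np.ndarray, keep_even: bool = True) -> np.ndarray:
            """Naive 'bob': keep one field and line-double it to full height."""
            field = frame[0::2] if keep_even else frame[1::2]  # every other scanline
            return np.repeat(field, 2, axis=0)                 # duplicate each line

        # Example: a fake interlaced 480-line grayscale frame
        frame = np.random.randint(0, 256, size=(480, 720), dtype=np.uint8)
        progressive = bob_deinterlace(frame)
        assert progressive.shape == frame.shape

    The clever (and expensive) part is everything bob doesn't do - motion-adaptive blending between fields - which is where shader power and driver quality show up.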

    Indeed it is, and it's an arbitrary number, but I try to include as much info as possible for comparison because it's such a personal point of view.

    Cheers dude :D I always find my stress level increases when I see your posts though :p:p:p
     
  15. Renoir

    Renoir What's a Dremel?

    The info I was going off came from our discussion of the topic here (1st post), which was based on the info here.
    Fair enough.
    Hmmm, interesting theory - I like it.
    I agree that only mentioning DDR3 is suspicious, but there's nothing on the page to suggest it's associated with any particular motherboard. The inclusion of the JMB386 does suggest it's dropped support for PATA, but it's not conclusive, given that Intel supports Gigabit Ethernet on the ICH10 yet a lot of boards still opt for an auxiliary controller instead - so there is precedent.
    No worries about forgetting. I agree that the memory situation is crap and certainly suggests the product launch was rushed.
    Sorry for not being clear. The graphs are correct, but the preceding sentence - "we recorded the average and MINIMUM CPU usage during this section of the film" - is what I was referring to.
    Look forward to it.
    The article here says it's 8, which is backed up by benchmarks here & here showing the 9300 significantly outperforming the 8300. This would also make sense given that the 9300 is built on a 65nm process while the 8300 is built on an 80nm process (I realise some of the "extra" die space is taken up by the memory controller that isn't present on the 8300).
    I would have thought the post-processing would take roughly the same CPU usage on each chipset, just that one might do it better than another if it has more shader power. I agree that Intel have really nailed SD DVD playback, which is still the most important aspect for me personally.
    As I'm sure is clear by now, the more info the better as far as I'm concerned, so don't change a thing :thumb:
    LOOOOOOOOOL. Breaking news: forum poster causes author blood pressure issues (the bit-tech community would never forgive me :D)
     
  16. Xir

    Xir Modder

    An adapter is fine... but is there an adapter from "HDMI, VGA and DVI" to S-Video?

    I guess as long as Philips manages to sell DVD players that only have SCART and component out (so you can hook them up to your beamer in the worst possible quality), there's not enough of a problem.

    Of course this problem will fade over the next few years... but then again, PCs are only used for a few years.

    Flatscreens really started selling around here 3-4 years ago... HD-ready (720) without HDMI, DVI or VGA in.
    For about 1-2 years now, the mainstream models (still HD-ready 720) have been equipped with VGA in. Woohoo!

    Xir
     
  17. Renoir

    Renoir What's a Dremel?

    Any comments on my post, Bindi, especially regarding the issues of PATA support and 16 vs 8 shaders?

    Also, in case you're not already reading it, I recommend the following blogger for information on Intel integrated graphics. In particular, his post on the HDCP repeater issue is very interesting and suggests that your theory above may not be correct.
     
  18. Tim S

    Tim S OG

    Each TPC (thread processing cluster) in G8x/G9x-derived GPUs has 16 shaders, split into two blocks of eight SPs. All GPUs after G80 have eight texture address/filtering units per TPC, while G80 had four texture address/eight texture filtering units per TPC. These are shared between the two blocks of eight SPs, and each texture unit is accessible by ANY of the 16 stream processors in that TPC.

    I can't see how Nvidia could make a part with fewer than 16 stream processors if it's derived from G8x/G9x, or fewer than 24 stream processors if it's derived from GT200, as there would have to be some fairly substantial changes to the transistor layout.

    Pretty much all of the GPUs Nvidia launches follow on from when the new architecture is launched (OK, GT200 is hardly 'new', but it is tweaked), and the number of TPCs and ROPs is simply scaled to suit the price point Nvidia is trying to hit with a particular GPU. The actual TPCs and ROPs remain the same - they're self-sufficient, and the number of them is scalable. If GeForce 8200 were an eight-SP part, it would be as much of a new architecture as GT200 is compared to G80/G92, which moved to 24 SPs and 8 TMUs per TPC. :)

    Further supporting this is the fact that Nvidia could not do mismatched SLI, due to what I've been told are scheduling issues and massive driver overhead. To the best of my knowledge, the only cards that support GeForce Boost are cards with 16 stream processors, and it's because of this 'problem' Nvidia has had with getting mismatched SLI to scale properly.
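
    To put that scaling argument in back-of-the-envelope terms (a sketch only - the per-TPC figures are the ones above, and the function itself is just illustrative):

        # Per-TPC shader counts from the post above: G8x/G9x = 16 SPs, GT200 = 24 SPs.
        SP_PER_TPC = {"G8x/G9x": 16, "GT200": 24}

        def total_stream_processors(architecture: str, tpcs: int) -> int:
            """Nvidia scales parts by TPC count, so SP totals come in fixed steps."""
            return SP_PER_TPC[architecture] * tpcs

        # A G8x/G9x-derived IGP with a single TPC gives 16 SPs;
        # an 8-SP part would mean redesigning the TPC itself.
        print(total_stream_processors("G8x/G9x", 1))  # 16
        print(total_stream_processors("GT200", 1))    # 24

    That's why an 8-SP 8200 doesn't fit the pattern: you can only remove whole TPCs, not half of one.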
     
  19. Renoir

    Renoir What's a Dremel?

    Thanks for the reply, Tim! I've just read another couple of reviews which also say it's 16 shaders, so I suppose we have to put it down to an error on AnandTech's side. In that case, what do you attribute the significantly increased performance to?

    Also, do you have any idea why most motherboard manufacturers appear to be releasing mobos with only single-link DVI when the chipset supports dual-link?
     
  20. Guest-16

    Guest-16 Guest

    No. S-Video, like composite, requires a TV encoder - like the old ATI Rage Theatre or Philips SAA chips in early hardware. At some point, for some GPUs, this was incorporated onto the silicon or done in drivers (I'm not sure), but either way it's all about VGA, DVI and HDMI these days, and occasionally component. It's simply a cost choice for motherboards - they've never really offered it with commitment. Even graphics cards are dropping 7-pin support here and there.

    Ren - I'll read the rest of your post in a minute and edit this, but I had Nvidia sitting there telling me the 8200 was an 8-shader part when I questioned whether it was 16; yet unless GPU-Z is consistently wrong, I can only assume it's 16. GPU-Z currently doesn't read the 9300, so I can't check the difference.

    IIRC, Tech Report originally confirmed our finding of 16 for the 8200 too - maybe I'm wrong, but I remember reading it.

    It can mismatch clocks with the 8400 GS (a 16-shader part) fine, but mismatching shader quantities is far more difficult.

    EDIT: IDE - maybe Nvidia originally included it, but it's broken in the final silicon? The chipset is heavily delayed, after all. The difference with Intel GbE is that the Intel solution costs more than a Marvell or Realtek chip, which companies already have contracts with (they buy huuuuge quantities for every board), whereas using native IDE requires no extra chip = saved $$$.

    690G - I've no board to check any more, and I can't remember for the life of me. AMD also told me the 780G's PCI-E x16 lane could be split into two x8s, but in actual fact it couldn't. They also said the SB700+ would have HyperFlash support... Our discussion about LVDS was different - the LVDS hardware is (usually) external to the GPU and powers a display on its own. I'm not doubting the slide, but my question is: why would AMD go backwards from dual digital support on the 690G to a single digital output on the 780G?

    The 9300 is faster simply because it's paired with an Intel CPU and has direct access to main memory without going through the CPU. I would guess it's mostly CPU-driven at very low resolutions and graphics settings - the 9300 is even faster than the 790GX, and it has a lower clock.

    As for shader processing - I think driver optimisation is a bigger limiter of performance than raw shader power. It also depends on the playback software - PowerDVD/Archos/WinDVD/MPC-HC all handle the playback percentages differently.
     