
Hardware Intel HD 4000 Investigation

Discussion in 'Article Discussion' started by brumgrunt, 28 May 2012.

  1. misterd77

    misterd77 New Member

    Joined:
    18 Apr 2011
    Posts:
    96
    Likes Received:
    1
    the A8 K chip is currently around £98 (due for a massive price drop when Trinity hits), while the Intel i5 3570K is £170. The AMD GPU portion is about 15% faster than the Intel chip, and has a few built-in technologies that Intel can't match for legal reasons. Also, I think AMD will wipe the floor with them when it comes to driver support. So, to sum up: the Intel chip is over 70% more expensive, and 15% slower (GPU).......
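For what it's worth, the price gap quoted above works out to roughly 73%. A quick check, using only the two prices from the post (the variable names are just for illustration):

```python
# Price premium of the Core i5 3570K over the A8, per the figures quoted above.
amd_price = 98     # GBP, A8 K chip
intel_price = 170  # GBP, Core i5 3570K

premium = (intel_price - amd_price) / amd_price
print(f"{premium:.0%}")  # prints "73%"
```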
     
  2. Fingers66

    Fingers66 Kiwi in London

    Joined:
    30 Apr 2010
    Posts:
    8,113
    Likes Received:
    611
    Don't hold your breath, Intel's iGPU driver development and release management is pants.
     
  3. Merglet

    Merglet New Member

    Joined:
    28 May 2012
    Posts:
    10
    Likes Received:
    2
    Um. How were the "roles reversed" from the first two tests? You said all throughout the article that the AMD chipset was better, then for Diablo 3 that the roles were reversed and the AMD chipset was far better. Lost your train of thought? Good article except for that blunder, though.
     
  4. KayinBlack

    KayinBlack Currently Rebuilding

    Joined:
    2 Jul 2004
    Posts:
    5,491
    Likes Received:
    285
    I've got one of these running now, and I call shenanigans. Loaded up Guild Wars, a seven-year-old game, and it stuttered and lagged so badly that I couldn't play. Put in a simple HD 5770: just fine. Ping was OK (before I get a question about that), but 1920x1200 completely crippled it in an old MMO designed to run on almost anything. It honestly looked like the old Intel integrated graphics we had to disable DX9 on just to get it running.

    Only problem with the 5770 is that it's not mine. I'll be getting a new GPU in a week or so, but I was just trying to hold out till then. But gaming capable this ain't.
     
  5. 2bdetermine

    2bdetermine New Member

    Joined:
    2 Apr 2009
    Posts:
    74
    Likes Received:
    0
    There isn't anything to investigate about the Intel HD 4000. It's rubbish! Like every other IGP that came before, it only exists to torment PC gaming. The AMD solution, on the other hand, offers some hope and isn't trying to kill off PC gaming in the process.
     
  6. nuc13ar

    nuc13ar New Member

    Joined:
    5 Jul 2010
    Posts:
    72
    Likes Received:
    0
    NO GTA IV/EFLC benchmark?!! BLAH

    In title. I wanted to know whether the increased IPC of Ivy Bridge over the Stars architecture made enough of a difference to let the HD 4000 catch up to the 6550D at 1024x768/1280x1024. I'll have to wait for other websites to review this chip, I guess :(
     
  7. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,575
    Likes Received:
    189
    Well I think it's fair to say that it's definitely a step up, and not made for 1920x1200.

    That said, it's clear that tests done these days are more CPU-limited for APUs. Note that if the game were run on the A10 it'd perform much more admirably at a higher resolution. I'd say this is still a step forward; back then, integrated graphics from Intel were only good for powering the desktop.

    Now at least it can do something. Not to mention it renders most bottom-barrel GPUs obsolete. Surely that is something to be proud of? Admittedly though, Intel's GPUs are hardly amazing. And the HD 5770, although old, is still about 2x faster than the GPU on the A8 and A10.
     
  8. Blademrk

    Blademrk Why so serious?

    Joined:
    21 Nov 2003
    Posts:
    3,982
    Likes Received:
    83
    The "roles reversed" comment was in regard to the older Intel chipset, which came off better than the newer Intel chipset in the Diablo 3 test; it had nothing to do with the performance of the AMD chipset.
     
  9. PCBuilderSven

    PCBuilderSven New Member

    Joined:
    3 Oct 2010
    Posts:
    130
    Likes Received:
    1
    Desktop Trinity chips will only make this lead bigger; AMD's mobile A10 APU graphics are significantly faster than mobile Ivy Bridge's graphics.
     
  10. kent thomsen

    kent thomsen New Member

    Joined:
    9 Jul 2011
    Posts:
    38
    Likes Received:
    0
    When you use your computer for audio production like I do, it's a problem that the GPU (an HD 4770 in my case) doesn't recognise the audio sequencing software as a 3D job. Which is fair, it isn't. As a result it will stay at 250 MHz on the core and never scale up to 750 MHz, full speed.

    And since the recording software is more demanding on the GPU than you might think, with VU-meters or plug-ins like reverbs and compressors pulsing away, it can get very glitchy. Especially when you zoom in or out on multiple tracks at a time.

    It will play CoD at 1680x1050 with everything maxed without a hassle, though, because the game is seen as 3D, so the clock scales up.

    Is there a similar scaling going on inside the HD 4000? Or is it running all it can all the time?

    Because if it doesn't scale down, it would be fine for my purposes.
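    The behaviour described above — the driver holding a low 2D clock unless it classifies the workload as 3D — can be sketched as a toy model. The clocks are the ones quoted in the post; the classification rule is purely illustrative, not actual driver logic:

```python
# Toy model of driver-managed GPU clock scaling (illustrative only).
CLOCK_2D = 250  # MHz, idle/2D profile (the HD 4770 figure quoted above)
CLOCK_3D = 750  # MHz, full-speed 3D profile

def gpu_clock(workload_is_3d: bool) -> int:
    """Return the core clock the driver would pick for a workload.

    The driver only ramps to the 3D clock when it classifies the
    workload as 3D; a DAW's GPU-accelerated meters stay at 250 MHz.
    """
    return CLOCK_3D if workload_is_3d else CLOCK_2D

print(gpu_clock(False))  # audio sequencer UI -> stays at the 2D clock
print(gpu_clock(True))   # a game (e.g. CoD)  -> ramps to the 3D clock
```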
     
  11. Deders

    Deders Well-Known Member

    Joined:
    14 Nov 2010
    Posts:
    4,053
    Likes Received:
    106
    Slightly off topic but is there an option to turn power saving features like this off in the Catalyst Control Center, or even windows power options?
     
  12. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,575
    Likes Received:
    189
    Of course, just run CCC Overclocking at full blast all the time.
     
  13. kent thomsen

    kent thomsen New Member

    Joined:
    9 Jul 2011
    Posts:
    38
    Likes Received:
    0
    @Deders; Elton:

    Sorry for being eager and off-topic.

    I'm afraid it isn't possible to make the GPU run at full speed through CCC overclocking. I can only adjust the 3D settings, i.e. over- or underclock the 3D frequency, not the 250 MHz 2D setting.

    That would create a lot of noise as well.

    I'll say thanks for the replies and take my question to a more suitable forum.

    Thank You.
     
  14. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,575
    Likes Received:
    189
    Hmm, I wonder if you can adjust RivaTuner to turn off the voltage/power saving controls?
     
  15. Gradius

    Gradius IT Consultant

    Joined:
    3 Feb 2009
    Posts:
    284
    Likes Received:
    1
    Finally AMD is on the road again!
     
  16. Material

    Material Soco Amaretto Lime

    Joined:
    13 Apr 2010
    Posts:
    633
    Likes Received:
    25
    I only said that the roles were reversed for the HD 4000 and the AMD Radeon HD 6450, which they were.
     
  17. noizdaemon666

    noizdaemon666 I'm Od, Therefore I Pwn

    Joined:
    15 Jun 2010
    Posts:
    5,562
    Likes Received:
    472
    No, you'd have to edit the vBIOS to crank up the clocks of the lower profiles, or to disable them entirely.
     
  18. thehippoz

    thehippoz New Member

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    Interesting article.. I don't really mess with integrated GPUs though - I still use a netbook with XP/Linux and blinds to save power.. I know someone who has a newer gaming laptop; it's freaking huge and weighs a ton.. he carries it to class with all of his books and it pretty much always has to be plugged in xD

    Intel integrated graphics from 10 years ago could play Guild Wars.. guess it's the res
     
    Last edited: 4 Jun 2012
  19. DC74

    DC74 Doh!

    Joined:
    4 Jan 2011
    Posts:
    71
    Likes Received:
    2
    Had to laugh when the article said it was tested in Diablo III. That game's been in development so long that had I started growing a beard when they announced they were making a 3rd installment, I'd be able to rival ZZ Top by now. Yes, the game is newly released, but having tested it for 2 months prior to release, it's hardly groundbreaking; it looks more like an up-to-date version of the D2 engine with multiplayer added as a main feature, and that's about it.
     
  20. rovit44

    rovit44 New Member

    Joined:
    6 Oct 2011
    Posts:
    5
    Likes Received:
    0
    It was said above: "Is anyone using Virtu MVP with the HD4000 and a discrete card? I've not been able to get any non-corrupt results from it." I read (can't find it again) that Virtu MVP is best used with a discrete card similar to the on-chip GPU, which would appear to be the HD 6450. Would this setup be happy, or is there a better card to add to a Z77 mobo for use with Virtu? Or should Virtu be ignored and a wholly better card used?
     