News Intel teases Iris Pro 5200 Haswell IGP

Discussion in 'Article Discussion' started by Gareth Halfacree, 2 May 2013.

  1. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    13,001
    Likes Received:
    2,127
  2. blackerthanblack

    blackerthanblack Active Member

    Joined:
    17 Sep 2004
    Posts:
    667
    Likes Received:
    26
    While it's good that they're upgrading their mediocre GPUs, I think the buyers most likely to buy the higher-end CPUs (to which the best GPUs are attached) are also the ones most likely to buy a dedicated GPU card to go with them. Gamers and those working with 3D models will almost certainly require the grunt of a separate GPU rather than something that will just get by.

    The ones who would benefit the most are in the space where AMD currently sits with its APUs. But Intel CPUs in this space will be handicapped by lower-spec GPUs.
     
  3. GeorgeK

    GeorgeK Swinging the banhammer Super Moderator

    Joined:
    18 Sep 2010
    Posts:
    8,373
    Likes Received:
    385
    My thoughts exactly - why would you buy a 4770K and then use the IGP...
     
  4. GoodBytes

    GoodBytes How many wifi's does it have?

    Joined:
    20 Jan 2007
    Posts:
    12,300
    Likes Received:
    709
    Here is why I am not excited about Intel chips:
    -> The highest-end chip has performance similar to a GeForce 650M.
    -> Performance per watt is actually very low: 84W, while the GeForce 650M draws a fraction of that.
    -> By the time it comes out and reaches store shelves, its GPU performance will only be close to that of a roughly two-year-old dedicated GPU.
    -> Driver support

    Here is why I am excited:
    -> "Do you want free fries with your ketchup?" What I mean is that Intel is pushing hard on its GPU solution. They are saying to people and OEMs: high-power GPUs are needed.
    This MIGHT make OEMs think about putting mid-to-low-range dedicated Nvidia/AMD GPUs in laptops again. Right now, OEMs see no value in adding a dedicated graphics card: it costs room on the board... a lot of it... and it costs more for them to add. The Intel GPU is free, and in the CPU. Besides, most people just surf the web... and not even that (you can argue about hardware-accelerated web browsers): they use Google and Facebook, both super primitive in design. I am sure they would run fine on a good old Motorola Razr (the old slim flip phone). So there is no demand for a dedicated GPU, especially since the Win8 interface is so simple.
     
  5. azazel1024

    azazel1024 New Member

    Joined:
    3 Jun 2010
    Posts:
    487
    Likes Received:
    10
    The 4770K IGP is nice in that if you aren't a gamer, the graphics are more than sufficient for pretty much anything you'd want to do. The improved IGP also means better GPGPU for the occasional application that supports it (say, Photoshop CS6). For a gamer, sure, it isn't going to be sufficient.

    For the 4770R, that is a 65W TDP, NOT 84W. The clocks are slightly lower, but not by a huge amount. Keep in mind, the 650M + CPU is the total TDP you should be comparing against. This is also a desktop part. The 5200 graphics are going to be in laptop parts as well, supposedly 47W TDP and maybe 35/37W parts too.

    A 47W TDP quad-core Haswell CPU with 650M-equivalent graphics FAR exceeds even a 35W TDP quad-core Ivy Bridge processor plus a discrete 650M in total TDP and performance per watt. FAR FAR FAR exceeds.

    For most desktop users, 650M-equivalent graphics is way more than they need and, heck, even for some gamers that is plenty of performance. It is better performance than what I am running right now with my 3570 + 5570 GPU (granted, I am probably getting a 7790 in the nearish future).

    For mobile users, 650M level of performance is probably enough, even for a lot of gamers.

    It isn't like Intel was ever likely to suddenly come out with something with 10x the performance. In the end, unless someone comes up with some kind of truly revolutionary design, you are limited by the process technology and node size as to how much performance you can pack into a certain TDP. You'll never get a 77W CPU + 150W discrete GPU into the same package at only 84W of TDP. It isn't possible. What Intel has effectively done, though, is probably shoehorn a 50W TDP CPU and a 25W TDP GPU into the same die, and between various clock gating and throttling get the whole thing into a 65W package (though the CPU and GPU are not both going to be able to truly run 100% flat out together... but how often when you are gaming do you need all cores of the CPU hitting 100%?)
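    The power-budget arithmetic in that paragraph can be sketched in a few lines. This is purely illustrative: the 50W/25W split and the sharing factor are the post's own back-of-the-envelope assumptions, not official Intel specifications, and `package_tdp` is a hypothetical helper.

    ```python
    def package_tdp(cpu_tdp_w, gpu_tdp_w, sharing_factor):
        """Estimate the package TDP when the CPU and GPU share a power budget.

        sharing_factor < 1.0 models clock gating and throttling preventing
        both blocks from running 100% flat out at the same time.
        """
        return (cpu_tdp_w + gpu_tdp_w) * sharing_factor

    # Assumed split: ~50 W CPU + ~25 W GPU squeezed into a 65 W package,
    # which implies a sharing factor of roughly 65/75 ≈ 0.87.
    combined = package_tdp(50, 25, 65 / 75)
    print(round(combined))  # 65, matching the 4770R's rated TDP
    ```

    The point of the sharing factor is that the rated package TDP can be lower than the sum of the parts, exactly because the two blocks are never allowed to max out simultaneously.
    
    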

    Broadwell comes along and the node-size decrease opens another level of performance, and so on.

    Intel is never going to have the equivalent of high-end discrete graphics shoehorned into its CPUs. It might have barely mid-level graphics shoehorned in, though, and that is going to be more than enough for probably 90% of the computer market. Between Intel's process and node-size advantages, for the same GPU TDP it can probably create a design at least 50-100% more powerful than what Nvidia and AMD can do. So if Intel has a processor with 30W set aside for the CPU and 17W set aside for the GPU, that GPU is probably more like the equivalent of a 34W discrete GPU. Not going to set the world on fire... but still not terrible.
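    The process-advantage claim above works out as simple multiplication. The 50-100% efficiency advantage is the poster's assumption rather than measured data, and `discrete_equivalent_w` is a hypothetical helper named for this sketch.

    ```python
    def discrete_equivalent_w(igp_budget_w, perf_per_watt_advantage):
        """Discrete-GPU wattage an integrated-GPU budget is 'worth' at a
        given performance-per-watt advantage (1.5 = 50% better, 2.0 = 100%)."""
        return igp_budget_w * perf_per_watt_advantage

    # A 17 W integrated GPU budget at a 50-100% efficiency advantage
    # behaves like a 25.5-34 W discrete part.
    low = discrete_equivalent_w(17, 1.5)
    high = discrete_equivalent_w(17, 2.0)
    print(low, high)
    ```

    The upper bound of that range is where the "17W set aside for the GPU is more like a 34W discrete GPU" figure comes from.
    
    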

    I can tell you this: if I could have the 5200 in my next laptop, I doubt I'd really be itching for better graphics any year soon. Sadly, I know it won't be coming to ULV processors.

    Though I am hoping that maybe with Broadwell (roughly when I might upgrade from my Ivy Bridge based ULV "ultrabook") we'll see some ~25W TDP quad-core processors with at least GT2-equivalent graphics and something like a reasonable price tag. Even if there are still ~35W and ~45W laptop parts, something in between 17W and 35W would be nice, especially if it had better graphics and four cores, unlike the 17W TDP parts. 25W is low enough that you can still create a rather thin and lightweight design for a 13-14 inch laptop without breaking the bank.
     
  6. kenco_uk

    kenco_uk I unsuccessfully then tried again

    Joined:
    28 Nov 2003
    Posts:
    9,696
    Likes Received:
    308
    The 'R' model sounds good for a Wintel mITX PC hooked up to the living room TV.
     
  7. GoodBytes

    GoodBytes How many wifi's does it have?

    Joined:
    20 Jan 2007
    Posts:
    12,300
    Likes Received:
    709
    Intel is talking about its highest-end CPU. So 84W.

    Yes, but the GPU is far slower as well.

    Not really. It's borderline solid gaming performance for THIS year's games... and maybe next year's. It won't last 3 years, though.
     
  8. leexgx

    leexgx CPC hang out zone (i Fix pcs i do )

    Joined:
    28 Jun 2006
    Posts:
    1,352
    Likes Received:
    8
    So 3x faster, but still slow. At least everything from the Intel HD up can handle 1080p.
     
  9. xrain

    xrain Member

    Joined:
    26 Jan 2004
    Posts:
    403
    Likes Received:
    21
    BGAs aren't permanent; just get yourself a BGA rework station and you can swap out BGAs all day. :dremel:
     
  10. Harlequin

    Harlequin Well-Known Member

    Joined:
    4 Jun 2004
    Posts:
    7,071
    Likes Received:
    179
    So Intel are trying to catch up with AMD? Who are just about to put GDDR5 on-die for the next APU and, along with GCN, bump it up to something around a 7730.
     
  11. rollo

    rollo Well-Known Member

    Joined:
    16 May 2008
    Posts:
    7,700
    Likes Received:
    99
    The only reason they are bothering is because the likes of Apple and Microsoft have requested better onboard graphics for their ultra-portable product ranges.

    AMD is a non-player for both of these manufacturers.

    On the desktop front, the GPU from both AMD and Intel is overkill for HTPC builds and, for the most part, underpowered as a gamer chip. AMD is approaching playable levels in certain games at 1080p (with their fastest chip), but a lot of games require massively reduced settings to get a playable frame rate.


    Intel is not even close to this. But I'm not sure Intel actually sees the desktop as a market with the numbers it is looking for from a product.

    On mobile, Intel's chip looks good on paper: low power draw, decent performance. It could and will sell well to OEMs. AMD needs to work on their power consumption to be a player in the mobile space again, as their chips are too hungry for mobile applications at this time.

    Personally, I would rather Intel gave us a cheaper 6-core chip in the same die space and booted the GPU into touch, and I'd guess a lot of us on this forum would feel the same way about it.
     
  12. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    13,001
    Likes Received:
    2,127
    No, the Iris Pro is in the R suffix model, not the K - which means a 65W TDP, not 84W.
     
  13. GoodBytes

    GoodBytes How many wifi's does it have?

    Joined:
    20 Jan 2007
    Posts:
    12,300
    Likes Received:
    709
  14. Harlequin

    Harlequin Well-Known Member

    Joined:
    4 Jun 2004
    Posts:
    7,071
    Likes Received:
    179
  15. azazel1024

    azazel1024 New Member

    Joined:
    3 Jun 2010
    Posts:
    487
    Likes Received:
    10
    Nope, the 4770K will only have 4600 graphics, NOT the 5200 graphics. The 5200 graphics (Iris Pro) is going to be in the 4770R... which has a TDP of 65W.

    So in this case, no, you are incorrect.
     
  16. ChromeX

    ChromeX New Member

    Joined:
    12 Aug 2004
    Posts:
    1,605
    Likes Received:
    22
    Yup, because after spending who knows how much on the bundle, I'll have to spend another £1000+ on a rework station to separate the CPU and the board, both of which are now useless, since they can't be used for anything else!
     