
News: Gigabyte leak points to Trinity-based Athlon X4 chips

Discussion in 'Article Discussion' started by Gareth Halfacree, 13 Sep 2012.

  1. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,130
    Likes Received:
    6,717
  2. Tangster

    Tangster Butt-kicking for goodness!

    Joined:
    23 May 2009
    Posts:
    3,085
    Likes Received:
    151
    But...why? The only advantage Athlon has over similarly priced i3 and Pentium parts is the graphics.
     
  3. xaser04

    xaser04 Ba Ba Ba BANANA!

    Joined:
    27 Jun 2008
    Posts:
    2,550
    Likes Received:
    467
    Main question would be: "why not?"

    Why scrap perfectly usable CPUs just because they have a defective GPU, when instead you can sell them and still make some money?

    A Trinity-based 3.4GHz quad-core Athlon will still be perfectly adequate for a wide range of tasks as long as it is priced correctly.
     
  4. Adnoctum

    Adnoctum Kill_All_Humans

    Joined:
    27 Apr 2008
    Posts:
    486
    Likes Received:
    31
    If you have otherwise functional dies that have borked graphics, you have two options:
    1) Dump them in the bin,
    2) Package them up and flog them cheaply.
    Option one basically flushes the money AMD spent on R&D, development and production of the flawed die down the toilet, while option two allows AMD to recover some of the money spent.

    You can't say it will be unattractive compared to the similarly priced i3/Pentium, because no price for the Athlon is mentioned. Until there is a price we can't gauge how attractive a product it is, and the market will determine how it is priced. If it is too high compared to the i3, it won't sell and the product will flop. If it is priced well then people will buy it.
    As to who would buy it, make it cheap enough and it will attract someone. Of more concern would be the requirement for a discrete graphics card on top of the CPU/MB.
    There are two types of customers who might be attracted to the GPU-less APU:
    1) OEM PC manufacturers who could bundle graphics cards with their systems,
    2) Those who are unnaturally attracted to AMD CPU/Nvidia GPU combinations.
     
  5. barny2767

    barny2767 What's a Dremel?

    Joined:
    27 Sep 2011
    Posts:
    172
    Likes Received:
    8
    I'm attracted to AMD CPU/Nvidia GPU combinations. My Phenom II X4 and GTX 680 get along very well, and if the Athlon APU without a GPU is cheap, why not?
     
  6. Adnoctum

    Adnoctum Kill_All_Humans

    Joined:
    27 Apr 2008
    Posts:
    486
    Likes Received:
    31
    But are you unnaturally so? <Inquisitive raised eyebrows>

    My thoughts were more along the lines of the lower end of Nvidia discrete cards that are rather pointless in the face of AMD and Intel CPUs-with-GPUs. And there are always those who pine for AMD APUs with Nvidia graphics onboard.
     
  7. barny2767

    barny2767 What's a Dremel?

    Joined:
    27 Sep 2011
    Posts:
    172
    Likes Received:
    8
    There is something unnatural about having AMD with Nvidia, but I like being the odd one out. I have always had AMD CPUs, about five of them I think, and always Nvidia GPUs, apart from last year when I had two 6790s in CrossFire, didn't like them, and went back to the green team.
     
  8. GuilleAcoustic

    GuilleAcoustic Ook ? Ook !

    Joined:
    26 Nov 2010
    Posts:
    3,277
    Likes Received:
    72
    I have an Intel CPU with an AMD GPU on an Nvidia mobo xD
     
  9. Adnoctum

    Adnoctum Kill_All_Humans

    Joined:
    27 Apr 2008
    Posts:
    486
    Likes Received:
    31
    Ugh, that's just sick.
     
  10. Slash88

    Slash88 Just starting out

    Joined:
    2 Jan 2010
    Posts:
    125
    Likes Received:
    5
    I have an AMD CPU with an Nvidia graphics card... feels good man.
     
  11. MrJay

    MrJay You are always where you want to be

    Joined:
    20 Sep 2008
    Posts:
    1,290
    Likes Received:
    36
    Nice move, as long as the pricing is correct!
     
  12. fluxtatic

    fluxtatic What's a Dremel?

    Joined:
    25 Aug 2010
    Posts:
    507
    Likes Received:
    5
    Word.

    Still, I don't think these are aimed at us. Unless some of you who have spawned are going to build a cheap gamer's box for your larva(e), a lot of these will end up in cheap Dells along with whatever "not defective enough to throw it out" GPUs Dell can get a good price on.

    I like that they finally managed a quad-core at 65W in a mainstream proc. I don't like that there's no mention of L3 cache.

    Funny to think that the "on-board graphics" of yore are nearly dead (I think there are still a handful of old-skool AM2 and/or Atom boards on Newegg with chipset graphics).
     
  13. Adnoctum

    Adnoctum Kill_All_Humans

    Joined:
    27 Apr 2008
    Posts:
    486
    Likes Received:
    31
    The Trinity cores don't have any L3, so if you want/need L3 on AMD then you are going to have to step up to the upcoming Vishera FX range.

    AMD thinks that L3 at this price point is currently* a waste of power and silicon, and to be frank I agree. For the vast majority of workloads done by the vast majority of users, L3 adds very little while costing money, transistors and milliwatts.

    I say "currently" because that may change, with later iterations of AMD HSA APUs using it to aid GPGPU tasks.
     
  14. tyaty1

    tyaty1 What's a Dremel?

    Joined:
    14 Sep 2012
    Posts:
    1
    Likes Received:
    0
    Well, it suits someone who wants mid-range discrete graphics for gaming with a cheap CPU that is mostly unrestricted feature-wise (unlike the Pentium).

    The IGP is dead weight next to a 7770/GTX 460, at least until heterogeneous computing becomes common. Cape Verde's own power-saving capabilities make the IGP's advantages non-existent.
     
  15. [USRF]Obiwan

    [USRF]Obiwan What's a Dremel?

    Joined:
    9 Apr 2003
    Posts:
    1,721
    Likes Received:
    5
    Considering that the GPU part is non-functional, I find the 65W TDP a bit on the high side.
     
