
News AMD unveils its Trinity A10, A8, A6 and A4 desktop chips

Discussion in 'Article Discussion' started by Gareth Halfacree, 27 Sep 2012.

  1. Harlequin

    Harlequin Modder

    Joined:
    4 Jun 2004
    Posts:
    7,131
    Likes Received:
    194

    Adding some AMD flavouring to the charts posted previously, you're looking at comparable performance ON CHIP to a stand-alone `low midrange` card from 18 months ago.
     
  2. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,084
    Likes Received:
    6,635
    Given that the GeForce GT 640 beats my GeForce 9800, and that the A10s beat the GeForce GT 640, I think I may have found my next upgrade. It'll be nice to drop the power draw a bit.

    (And yes, I'm aware of the weaker CPU performance relative to Intel's current architecture - but I've got a Core 2 Duo E8400 at the moment, so it's likely to be an improvement there, too.)
     
  3. GuilleAcoustic

    GuilleAcoustic Ook ? Ook !

    Joined:
    26 Nov 2010
    Posts:
    3,277
    Likes Received:
    72
    I'm pretty certain that an A10-5700 beats my current Q6600.
     
  4. Harlequin

    Harlequin Modder

    Joined:
    4 Jun 2004
    Posts:
    7,131
    Likes Received:
    194
    It would be interesting to see gaming performance at `mid` settings (or maybe high), as the IGP is the strong selling point; and of course don't forget to use 1866 or faster RAM ;)
     
  5. GuilleAcoustic

    GuilleAcoustic Ook ? Ook !

    Joined:
    26 Nov 2010
    Posts:
    3,277
    Likes Received:
    72
    1866MHz is a must-have. Can't wait for the next-gen APU, based on the Graphics Core Next architecture and with a shared CPU/GPU memory space, meaning higher bandwidth between CPU and GPU. The next gen is aiming at 512 stream processors (equal to an HD 7750).
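
    To put rough numbers on why RAM speed matters so much here: the IGP has no dedicated VRAM, so its bandwidth is whatever system memory provides, and the CPU cores are competing for the same pool. A quick back-of-the-envelope sketch (nominal dual-channel DDR3 figures, not measurements):

    ```python
    # Theoretical peak bandwidth for dual-channel DDR3.
    # Nominal figures only; real-world throughput is lower.

    def ddr3_peak_bandwidth_gb_s(transfer_rate_mt_s, channels=2, bytes_per_transfer=8):
        """Peak GB/s: transfers per second x bytes per transfer x channels."""
        return transfer_rate_mt_s * 1e6 * bytes_per_transfer * channels / 1e9

    for speed in (1333, 1600, 1866, 2133):
        print(f"DDR3-{speed}: {ddr3_peak_bandwidth_gb_s(speed):.1f} GB/s")

    # DDR3-1333: 21.3 GB/s
    # DDR3-1600: 25.6 GB/s
    # DDR3-1866: 29.9 GB/s
    # DDR3-2133: 34.1 GB/s
    ```

    That jump from 1333 to 1866 is an extra ~40% of theoretical bandwidth, which is why faster RAM tends to show up directly in IGP frame rates.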
     
  6. rollo

    rollo Modder

    Joined:
    16 May 2008
    Posts:
    7,887
    Likes Received:
    131
    Intel needs AMD, and they would prefer them to be competitive, or the antitrust regulators will rip them to shreds.

    For power users (anyone using a discrete graphics card), Intel is far and away the choice to pick.

    For people who buy Dell, I don't think they even read what's in the box before they hit buy, as long as it's cheap.

    Then you have businesses, and this is where AMD might gain some points, as power usage is critical in this area. If they can get a CPU/GPU combo that uses very little power, businesses will go for it.

    They can't survive just selling HTPC chips, no doubts there. Their days of competing at the top are over; they said so themselves that they are now the budget/mid-range company. Let's hope it's an improvement, for both Intel's and AMD's sakes.
     
  7. [PUNK] crompers

    [PUNK] crompers Dremedial

    Joined:
    20 May 2008
    Posts:
    2,909
    Likes Received:
    50
    Agreed, I can see them doing well in this sector. I just ordered some low-profile graphics cards for work because we got some DVI monitors and the old machines only support VGA. I'm pretty sure they would jump at this for their average office (non-dev) machines.
     
  8. azazel1024

    azazel1024 What's a Dremel?

    Joined:
    3 Jun 2010
    Posts:
    487
    Likes Received:
    10
    If you want to get technical, probably about 100% of computer users benefit, as a lot of tasks load the CPU to near 100%, even things as simple as loading an application. So, technically, yeah, everyone does.

    Now, how many will notice a minor difference in performance? Not many. How many will notice a big difference in performance? Tons.

    I have 3 kids under 5. I probably spend about 3-4hrs gaming per week at most, once the kids are in bed. However, I do a reasonable amount of photography and a little video editing and/or transcoding, and I can tell you, I don't have the budget for a $3,000 system, but I did manage to scrape up a $600 budget this time around (I can generally manage that about every 2-3 years). The performance difference from going with a higher-end CPU most definitely saves me time, so I can focus on things like being a father, chores around the house, etc. The extra maybe $70 I spent probably saves me on average 20-30 minutes a week when you account for time savings in things like image editing.

    I can definitely tell you my time is worth a lot more than $70, especially when you go out to 20-30 minutes x 52 weeks a year. That's an extra maybe 25 hours a year of my life where I am not sitting waiting for an image to finish processing, an application to load, something to export, etc. That works out to under $3 per hour. Well worth the investment.
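
    Putting those same estimates in one place, as a rough check (the figures are the ones above, so treat the output as the same ballpark guess):

    ```python
    # Rough cost-per-hour check using the estimates above.

    extra_cost_usd = 70        # extra spent on the higher-end CPU
    hours_saved_per_year = 25  # ~20-30 minutes/week over 52 weeks

    print(f"${extra_cost_usd / hours_saved_per_year:.2f} per hour saved")  # $2.80
    ```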

    Just my 2 cents though. It might not have been worth it if I had been spending a couple of thousand dollars for the extra performance, but two tanks of gas' worth of expense for all of that time back in my life? Worth it.

     
  9. azazel1024

    azazel1024 What's a Dremel?

    Joined:
    3 Jun 2010
    Posts:
    487
    Likes Received:
    10
    One last note: NO, the listed TDP does not give ANY headroom for overclocking. That would push the TDP higher. The TDP is the maximum thermal design power of the chip under the absolute worst-case load AT THE RATED CLOCK SPEED. Realistically a user, even with benchmarking software, is unlikely to ever push a chip to true 100% loading across every transistor inside the chip (fully loading CPU, GPU, memory controller, L1, L2, L3, etc., etc., ad nauseam). Also, manufacturers will often bin chips together into the same TDP classification even if, under the worst-case scenario, they won't actually hit the rated TDP.

    In general, however, the "top of the line" chip in a TDP class CAN and WILL hit those TDP numbers in the worst case, and under typical 100% loading of CPU/GPU (if an APU, or just CPU if not) the chip as a whole is likely to consume roughly 90-95% of the rated TDP.

    The TDP numbers account for zero overclocking headroom (and most modern chips with turbo boost can, and sometimes briefly will, exceed TDP while boosting their CPU and/or GPU, until the CPU/heatsink thermal capacity is reached and they clock back down slightly. Intel's stuff at least will, especially the ULV parts).

    Overclock your chip without overvolting and you'll likely see a modest, roughly linear increase in power consumption: a 10% overclock means roughly a 10% increase in power draw. Overvolt, however, and you'll see the linear increase from the overclock multiplied by a quadratic increase from the raised voltage. So a 20% overclock with a 10% overvoltage would see about a 20% power increase from the raised clock rate and an additional roughly 21% increase from the raised voltage. Raise the voltage 20% and that factor becomes roughly 44%. So a 20% overclock needing a 20% overvoltage to achieve it would, in the end, lead to a roughly 73% overall increase in power consumption.
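
    That rule of thumb (dynamic power scaling roughly linearly with frequency and with the square of voltage, P ~ f x V^2) is easy to sanity-check; a minimal sketch, ignoring static/leakage power:

    ```python
    # Rule-of-thumb dynamic power scaling: P ~ f * V^2.
    # Ignores static/leakage power, so the output is a rough estimate only.

    def relative_power(clock_increase, voltage_increase=0.0):
        """Relative power draw; arguments are fractions, e.g. 0.2 for +20%."""
        return (1 + clock_increase) * (1 + voltage_increase) ** 2

    print(f"{relative_power(0.10):.2f}x")        # 1.10x: +10% clock, stock voltage
    print(f"{relative_power(0.20, 0.10):.2f}x")  # 1.45x: +20% clock, +10% voltage
    print(f"{relative_power(0.20, 0.20):.2f}x")  # 1.73x: +20% clock, +20% voltage
    ```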

    Yeah, that is a hell of a lot of extra power. That is why overclocking tends to decrease performance per watt, especially when you need to overvolt the chip to achieve higher clock rates.

    The locked 5700, with its 65W TDP, also clocks in 400MHz slower than the unlocked part and 200MHz slower under turbo (and under turbo it may well exceed TDP until it runs out of thermal headroom). Going with linear scaling, it should use roughly 11% less power at regular clocks and about 5% less under turbo. The top-of-the-line part probably won't actually hit 100W exactly, but the part below it probably just barely ekes in at 65W or less, and is likely a lot more aggressive about clocking down from its max turbo speed, especially if the GPU is active as well, to stay within its TDP limit.
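
    Plugging the clocks into the same linear model gives those figures (assuming AMD's published base/turbo clocks of 3.8/4.2GHz for the A10-5800K and 3.4/4.0GHz for the A10-5700):

    ```python
    # Linear clock-to-power scaling for the 5700 vs 5800K comparison
    # (base/turbo clocks: A10-5800K at 3.8/4.2GHz, A10-5700 at 3.4/4.0GHz).

    def power_saving(slow_ghz, fast_ghz):
        """Fractional power saving from running at slow_ghz instead of fast_ghz."""
        return 1 - slow_ghz / fast_ghz

    print(f"base:  {power_saving(3.4, 3.8):.0%} less power")  # 11% less
    print(f"turbo: {power_saving(4.0, 4.2):.0%} less power")  # 5% less
    ```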
     
  10. GuilleAcoustic

    GuilleAcoustic Ook ? Ook !

    Joined:
    26 Nov 2010
    Posts:
    3,277
    Likes Received:
    72
    Again, this is a very specific use case. Photo editing, like rendering or any CPU-intensive task, is not average-Joe use. In your case, the choice of a better CPU is 500% justified, but for all the grandfathers, uncles, etc., those chips are more than enough.

    I'm asked twice a month to build a sub-€400 computer for family or relatives. The tasks are always: Internet, Word, holiday photo storage/viewing, movies/series. Maybe it's hard to believe, but that's what 90% of computer users will ever do with their computers.

    Our vision on this forum is biased because we game, do 3D modelling/rendering, photography, etc., and a little more CPU/GPU power saves us lots of time. But we are a negligible fraction of all computer users.
     
  11. rollo

    rollo Modder

    Joined:
    16 May 2008
    Posts:
    7,887
    Likes Received:
    131
    What Guille says is very true. I've been asked a few times myself, and the main uses are always Facebook or general web browsing, plus some school work. None of these tasks requires a top-end CPU.
     
  12. MSHunter

    MSHunter Minimodder

    Joined:
    24 Apr 2009
    Posts:
    2,467
    Likes Received:
    55
    Sorry to bring up tested facts, but the performance difference has zero effect on any FPS games and little to no effect on almost all RTS games. The only noticeable difference will be in encoding high-def video or programming.

    That was an older AMD F1 vs. an i5-2500K, tested with some friends: absolutely no difference in FPS or performance.

    The only game so far that showed a difference was SupCom 2, with shadows on and 1,000 units on 8-player super-large maps.
     
