
Hardware AMD A8-3850 Review

Discussion in 'Article Discussion' started by arcticstoat, 30 Jun 2011.

  1. Jipa

    Jipa Avoiding the "I guess.." since 2004

    Joined:
    5 Feb 2004
    Posts:
    6,363
    Likes Received:
    125
    It does support something called Dual Graphics (Hybrid CrossFire), but so far it only works with the latest low-end HD 6-series cards...

    Also, the HD5770 isn't so much "better" as it is a "different option". Personally I'd much rather just have the 100W setup than a more expensive 200W one that still isn't really a powerhouse anyway.
     
  2. murraynt

    murraynt Well-Known Member

    Joined:
    6 Jun 2009
    Posts:
    4,234
    Likes Received:
    128
    Have to agree with the usage being limited.

    If you want a work machine, a 955 plus an onboard GPU like in the 880 chipset would be better, and likewise an overclocked 250 with a 5770 if you wanted to game.

    Perfect base for a lappy but not a desktop really.
     
  3. Yslen

    Yslen Lord of the Twenty-Seventh Circle

    Joined:
    3 Mar 2010
    Posts:
    1,966
    Likes Received:
    48
    OEMs have been churning out PCs with mid-range CPUs and low-end integrated graphics for years. Now they can sell these for the same price and offer actual decent gaming performance at console quality levels. This will be a hit for sure.

    Wait, didn't Apple sign some kind of deal with AMD regarding using these chips? I'm sure I read and commented on that news story a few months back.
     
  4. OCJunkie

    OCJunkie OC your Dremel too

    Joined:
    19 Apr 2011
    Posts:
    619
    Likes Received:
    19
    Pretty impressive value, looks like my next HTPC build might be based on this.
     
  5. John_T

    John_T Member

    Joined:
    3 Aug 2009
    Posts:
    532
    Likes Received:
    21
    That's fair enough, thanks for letting us know why. If you do get time at some point it would be interesting to see, as I reckon it would probably ramp up in quality again...


    This. 100% this.

    Of course if you spend 'only' £40-£50 more you can get a more powerful machine. But then if you 'only' spend £40-£50 more than that you'll get a more powerful machine still. And £40-£50 more than that, and £40-£50 more than that, and on and on, etc, etc. I think these chips, and the machines that will be built around them, are going to be aimed at the people who want to draw the line early, not the ones who keep thinking: 'If I just spend a little more...'.

    With all the other advantages these chips will bring to those systems - and the physical size of the system being a big one - I can see this being really desirable to a large segment of people.

    As a few other people have already pointed out, for very large groups of people (business users, families on a budget), the average PC has already passed the amount of processing power they need to do the things they want anyway. The fact that a different PC, which costs more money, has even more processing power still is going to be completely irrelevant to some people.

    A PC that's half the physical size of their old one, that can play modern games on their small to medium sized screen at a reasonable quality, and is still as cheap as their old machine was - I think some people will lap that up.
     
  6. MrJay

    MrJay You are always where you want to be

    Joined:
    20 Sep 2008
    Posts:
    1,290
    Likes Received:
    36
     
  7. alialias

    alialias New Member

    Joined:
    1 Dec 2010
    Posts:
    59
    Likes Received:
    2
    Well, I just built a gaming system and am worried about the amount of power it uses while I'm just browsing the web/watching TV etc.
    This would be ideal: a low-power PC, and an excuse for a new build!
     
  8. Neogumbercules

    Neogumbercules New Member

    Joined:
    14 Aug 2004
    Posts:
    2,464
    Likes Received:
    29
    I've been plotting a small-sized emulation box that runs super quiet with very low power requirements. Maybe I should consider this.
     
  9. Farfalho

    Farfalho New Member

    Joined:
    27 Nov 2009
    Posts:
    424
    Likes Received:
    2
    Hope you follow this through ;)

    If the HD6450 has 512GB, I want one free just for spotting that "secret" :p
    After all, it's we or your? I want to be sure that's mine xD

    Now for the article itself: I'm happy AMD took the ageing K10 architecture and made it work really well. For the mid-to-high-end mainstream desktop processors I don't want to know about them, bring forth Bulldozer, but in this market segment I'm predicting a huge success for AMD. Finally a really good product, perfect for its market segment.
     
  10. Bloody_Pete

    Bloody_Pete Technophile

    Joined:
    11 Aug 2008
    Posts:
    7,405
    Likes Received:
    554
    The problem that most people have forgotten is:

    Who is this chip aimed at?

    Budget end home user and office machines.

    Now, what happened a couple of months back? SB got released. What happens 4-6 months after every Intel launch? It dumps all of its old stock to the OEMs, flooding the budget home user and desktop market with very, very cheap chips and motherboards to clear its warehouses (remember, Intel always produces more than it can sell). I see this as the biggest challenge for this series of processors, not the i3s...
     
  11. fluxtatic

    fluxtatic New Member

    Joined:
    25 Aug 2010
    Posts:
    507
    Likes Received:
    5
    Just wait until Dell & HP start churning out desktops with APUs...these could be monster money-makers for the OEMs. And how many people here have a pre-built desktop? Remember, these aren't aimed directly at the bit-tech crowd. Had I known how long it was going to take to put my Zacate build together, I might as well have waited for one of these, at least serving as the replacement for my wife's aging desktop. Once prices settle out a bit, it would be spot-on in that role.

    Granted, I'm specifically not buying Intel, which limits my options a bit, but I could think of a few things this might be nice for. However, I'll be skipping this, using the Zacate for my wife and later as a NAS/HTPC. My own next desktop will be Bulldozer. So I might still pick one of these up for the wife at some point.
     
  12. secu

    secu New Member

    Joined:
    12 Jul 2007
    Posts:
    5
    Likes Received:
    0
    It seems that everybody forgot about that. This is the main reason why, even if Intel made a CPU ten times faster than AMD's, I wouldn't buy it.
     
  13. azazel1024

    azazel1024 New Member

    Joined:
    3 Jun 2010
    Posts:
    487
    Likes Received:
    10
    Certainly true vis-à-vis Intel and their illegal practices before. That isn't to say it doesn't still leave a bad taste in my mouth, but Intel just makes a better CPU, and in most cases one with better processing power per cost as well.

    As it stands for the last several years and as it looks to the future, Intel doesn't need to engage in anti-competitive practices to stay ahead of AMD, because their chip technology is easily several years more advanced.

    I am not attempting to defend Intel's prior behavior, as it is reprehensible, but one could argue that even if Intel hadn't engaged in anti-competitive behavior, AMD wouldn't be too much further ahead. The fact is, though, we really don't know. It isn't like AMD had the better product for dozens of years before Intel pulled ahead; it was for what, something around 2 years, give or take? Intel went into it with a massive lead in market share, and AMD wasn't too far off the years when their processors weren't fully "x86 compliant", so a lot of people didn't trust their CPUs (remember the K5 and K6 stuff??? I do).

    Intel's practices likely wouldn't have changed the introduction of their Core architecture, which massively outperformed AMD. I just don't see how AMD in 2 short years would have managed to garner enough of the market to outcompete Intel financially, or even to catch up enough that they would be vastly ahead of where they are now in technology and R&D.

    Again, it might be a very different world for AMD if Intel hadn't engaged in those practices; it just isn't possible to know without opening a portal to a parallel dimension where Intel hadn't done it.

    As it stands today, CPU technology-wise, AMD's current product is at least 4 years behind Intel. Their Llano stuff performs no better than Core 2 Intel parts (last I checked, my 2.93GHz Core 2 Duo performs about on par with an AMD X2 at slightly higher clocks, so even Core 2 parts are slightly faster clock for clock), and those were introduced 4+ years ago now. The built-in GPU is nice, and in a laptop that really is revolutionary: long battery life, still pretty decent performance, and the ability to enjoy some nice games at lower settings. Since a lot of entry- and mid-range laptops have "low-res" screens around 1366x768, the higher-end low-power Llano parts are actually pretty good there (in upper-mid-range and high-end laptops they still just aren't that good; an SB CPU + discrete mobile GPU doesn't use that much more power and is a much better CPU+GPU combo).

    For a desktop, though, most people aren't running monitors only capable of 1366x768; most are running 900p or 1080p. The new Llano desktop parts are really only capable of running "lower-end" games at 1600x900 with most details off at playable frame rates, and 1080p with any details turned on really isn't doable in most games.
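    To put that resolution jump in numbers (plain arithmetic, not benchmark data from these chips): 1080p pushes roughly 44% more pixels than 1600x900, which is why settings that are playable at 900p often aren't at 1080p.

```python
# Pixel counts for the common desktop resolutions mentioned above
resolutions = {
    "1366x768": (1366, 768),
    "1600x900": (1600, 900),
    "1920x1080": (1920, 1080),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in sorted(pixels.items(), key=lambda item: item[1]):
    print(f"{name}: {count:,} pixels")

# How much more work 1080p is compared with 900p
ratio = pixels["1920x1080"] / pixels["1600x900"]
print(f"1080p vs 900p: {ratio:.2f}x the pixels")  # 1.44x
```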

    So yes, you still support casual gamers like my wife, but you probably aren't going to attract even most people who go, "You know what, I'd like a little WoW and maybe some Battlefield 3 to dally with on the side." Not someone I'd consider a hardcore gamer or even a heavy enthusiast, but someone who has a game or two they play and maybe messes with the occasional other game.

    Or if they do, they're getting a discrete GPU as well, and probably not getting the best bang-for-the-buck system possible. Yes, I have seen some tests of the Llano desktop parts with a discrete GPU and they do pretty well, but they still don't have the CPU power that an Intel SB (or anything post-Core 2, and even some Core 2 parts) has for non-gaming tasks, and there hasn't been enough testing of Llano + discrete GPU to see if they'd even be on par in most games compared with an Intel SB + that same discrete GPU.
     
  14. kenco_uk

    kenco_uk I unsuccessfully then tried again

    Joined:
    28 Nov 2003
    Posts:
    9,696
    Likes Received:
    308
    I'd like to see Hybrid SLI and obviously these results again but overclocked. Does the APU have the ability to output to 3 devices, a la Eyefinity?
     
  15. CAT-THE-FIFTH

    CAT-THE-FIFTH New Member

    Joined:
    5 Apr 2009
    Posts:
    53
    Likes Received:
    0
  16. Snips

    Snips I can do dat, giz a job

    Joined:
    14 Sep 2010
    Posts:
    1,940
    Likes Received:
    66
    Yet again the integrity of Bit-Tech testing methods is questioned by AMD supporters.

    Don't shoot the messenger ladies, Bit-Tech are telling it exactly the way it is.
     
  17. CAT-THE-FIFTH

    CAT-THE-FIFTH New Member

    Joined:
    5 Apr 2009
    Posts:
    53
    Likes Received:
    0
    I own an i3 2100 and a Q6600 overclocked to around 3GHz. One of my mates owns a 3GHz Phenom II X4 too, and the i3 2100 is still slower than both the overclocked Q6600 and the Phenom II X4 at stock speeds for video encoding using HandBrake. I find that for most games the Core i3 2100 tends to be ahead, though.
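    For anyone wanting to make this kind of comparison themselves, a simple way to express encode results is average throughput in frames per second (the numbers below are hypothetical placeholders, not measurements from these chips):

```python
def encode_fps(total_frames: int, wall_seconds: float) -> float:
    """Average encoding throughput: frames encoded per second of wall time."""
    return total_frames / wall_seconds

# Hypothetical timings for the same 3,000-frame clip on two machines
phenom_fps = encode_fps(3000, 100.0)  # 30.0 fps
i3_fps = encode_fps(3000, 125.0)      # 24.0 fps
print(f"Phenom II X4: {phenom_fps:.1f} fps, i3 2100: {i3_fps:.1f} fps")
```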

    http://www.tomshardware.co.uk/amd-a8-3850-llano,review-32222-17.html


    http://www.silentpcreview.com/article1211-page6.html


    http://www.pcper.com/reviews/Graphi...-Review-Can-AMD-compete-Sandy-Bridge/Media-En


    http://www.hardwarezone.com/product-guide/view/187305/review/187314/page:7


    OTOH, Quick Sync does help massively when you are making encodes for portable devices. Quality is better with a CPU-only encode, but it is a very useful feature IMHO.
     
    Last edited: 3 Jul 2011
  18. Material

    Material Soco Amaretto Lime

    Joined:
    13 Apr 2010
    Posts:
    633
    Likes Received:
    25
    Up on the front page.
     
  19. John_T

    John_T Member

    Joined:
    3 Aug 2009
    Posts:
    532
    Likes Received:
    21
    Nice one - cheers for that!
     
  20. Byron C

    Byron C And now a word from our sponsor

    Joined:
    12 Apr 2002
    Posts:
    6,477
    Likes Received:
    1,399
    *raises hand* I did. In fact it wasn't recently either, it was last year. I bought a 512MB GeForce 9600 last April - already well out of date - and haven't upgraded since. This suits my gaming needs just fine. I don't play at 1920x1200, as the highest res I'll ever use is 1080p. Any game I've thrown at it is handled without a problem - including modern games such as ME2, StarCraft 2, Fallout New Vegas, etc... Sure, there's an occasional chug now and then (though this is more to do with general performance than a GPU bottleneck) and I can't whack the AA all the way up, but I can usually get the highest level of detail - even on a 1080p HDTV. I could do with a new GPU, but I'm not really sure that I'll get that much extra benefit for my investment. If I drop £70-£100 or so on a new GPU, there's very little chance that it'll make much tangible difference in the quality of visuals I see - other than maybe being able to whack the AA up all the way.
     