I'm a huge fan of "good enough", but AMD have fallen so far behind that choosing an APU over an Intel CPU and discrete card, in form factors which allow for it, lands on the wrong side of the line. Good enough does not mean making huge sacrifices to upgrade prospects and initial performance for the sake of a £20 note. A Pentium Anniversary Edition has an upgrade path ending in the hefty 4790K, and a discrete GPU will be changeable over the years. Buy cheap, buy twice. It's a cruel irony that it can be more expensive to occupy a financially limited position than not, but given the differences in price here, there is no excuse to choose the vastly more limited system, in both initial performance and future prospects.

Perhaps it would be for the best if AMD spun off the GPU division, which is still potent though cash-starved and subject to poor management: the Fury X is a lovely ITX-sized flagship. The only value I can see in the CPU arm, no pun intended, is the x86 license. Consumer software has not become hugely multi-threaded, as some dreamed it would, nor has the fact that all three next-gen consoles use AMD hardware resulted in PC games performing better on AMD hardware; it's a much more complex equation than that.

My first CPU was AMD, as was my second, but AMD have been stunted ever since the Core 2 Duo replaced the ghastly Pentium 4-based Pentium Ds, and that was years ago. As I said in the first paragraph, being poor is expensive. AMD's CPUs simply can't recover as-is; Intel utterly dominates now, and even AMD's trick-shot APU is in danger of being replaced by a low-end Intel CPU with Iris Pro graphics. Intel played dirty and won, but it's not as if AMD haven't tried pulling fast ones: they hyped Bulldozer up, moar coarz0rz, got people buying the motherboards, and then revealed the abysmal FX-8150. eBay told the story of how that was received, and the aging 2500K, when juiced up, is still better than anything AMD have for general purpose use or gaming.
There's just no competition. And that's partly why we're still looking at four-core mainstream CPUs from Intel heading into 2016, when the Q6600 came out at the start of 2007. Ironically, that is AMD's only hope: CPUs just aren't the limiting factor they used to be, and the push is to make them even less limiting via software, not brute CPU hardware. Barring 'content creators', we probably all have more powerful CPUs than we really make use of, but we all frequently max out our GPUs, pushing them as hard as possible, because that workload scales better: software can be programmed around whatever GPU power the platform offers, whereas adjusting for hard real-time limits in CPU performance would alter the software in probably unworkable ways.

I personally think VR is going to shake things up, as if a reset button has been hit. Three-way monitor setups are going to be quaint relics of how people used to sit down in front of computers, like when we see a mainframe operator gleefully entering code on a 10" monitor as monoliths of silicon and wire looms tower around them. This whole sitting down at a computer has got to change: a powerful computing unit hidden somewhere, linked wirelessly to a lagless VR headset, sensors for everything, massive information processing to recreate environments, truly natural input/control methods... I'm rambling now, but putting too much stock in how technology is currently used, and just thinking "more of that", is not really seeing how it will be used, how it will innovate to evolve.