I think there's a lot more to this topic. Maxwell was designed to be a very scalable architecture, usable in everything from mobile parts all the way up to high-end desktops. Nvidia went this way because they have substantial laptop and mobile interests, and at this point it makes sense for them to have a single fully fledged architecture which can be used across all of those platforms. AMD, on the other hand, have no such interests. Given that their mobile Adreno division was sold off to Qualcomm (in 2009, for $65 million USD), they don't have the same incentive to produce such a scalable architecture. They are far more likely aiming for a high-performance design capable of out-doing Maxwell. But bear in mind that these architectural decisions were probably made 3-5 years ago; it's not as if AMD can now say, "Ahh look guys, those Nvidia fellas have done a power-efficient one, maybe we should do that with ours too." The decisions are made and the chips are well on their way to being released. They will already know whether they can compete on power efficiency, and my guess is that they can't. That's not a bad thing: they've got a different target for the architecture, and as long as it competes with Maxwell sufficiently, they can always price it to an appropriate point in the market.

Another point: people are very quick to jump on the "GTX 480 is the work of Satan" bandwagon. I still have two in my desktop. I know that puts me "behind the times" by modern standards, but in applications which scale well across both cards I comfortably get GTX 680 levels of performance. Gaming at 1920x1200 this is plenty, and any meaningful upgrade would likely mean a GTX 970, a pair of them, or something equivalent, which would be largely wasted at this resolution. As for temps, they're really not that bad. Yes, they're power hungry, and yes, they run a little hotter than some (but not all) GPUs, but they're perfectly manageable. It's not as if I'm trying to hold my desktop against my face while using it; I would rather it used all of its available TDP to give me as much performance as my hardware is capable of.

The only factor which sadly limits this setup is the lack of VRAM for games like Watch Dogs and other titles written for the modern consoles and then loosely ported back to the PC. It was obvious several years ago, when the rumours began to circulate that the consoles would have 8GB of RAM shared between the GPU and CPU, that lazily ported titles would emerge needing more VRAM than was present on the average card of the day. For some time now, when friends have asked me for a spec which is "future proof", I have advised them to buy cards with at least 4GB of VRAM (the typical amount developers allocate to the GPU on the consoles); a quick way to check how close an existing card runs to its VRAM limit is sketched at the end of this post. Friends with, for example, a pair of 4GB GTX 680s can therefore still use all of that shading horsepower without being bottlenecked by a lack of on-card memory.

As for 4K, I have tried several screens and found the vast majority to be lacklustre. The Dell 24" IPS screen was nice, but at such a high PPI the resolution is largely wasted; the 28" is a non-IPS TN panel (and therefore vomit inducing); and the Asus 31.5" is nice but still priced well out of the reach of most consumers. I would say 4K is still an unwise investment at the current time; it requires a shift in what consumers consider acceptable graphically.
Even with very powerful setups it requires compromise, which is not something that people who can afford this technology are used to experiencing. GPUs have not yet scaled to the point where they can comfortably drive the resolution, and frankly, with the finalised display connectors not yet readily available on the market and FreeSync adoption still to be seen, it would be nigh on foolish to buy an expensive high-end panel at the current time; simply opt for a quality 1440p IPS panel instead.
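As promised above, for anyone who wants to sanity-check how close a card they already own is running to its VRAM limit, here is a minimal sketch using the CUDA runtime API. It is Nvidia-only, assumes the CUDA toolkit is installed, and the file name vram_check.cu is just my own choice, not anything official:

```cpp
// Hypothetical sketch: report total and currently free VRAM per Nvidia GPU
// using the CUDA runtime API. Build with e.g.:  nvcc vram_check.cu -o vram_check
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int device_count = 0;
    if (cudaGetDeviceCount(&device_count) != cudaSuccess || device_count == 0) {
        std::printf("No CUDA-capable GPU found.\n");
        return 1;
    }

    for (int dev = 0; dev < device_count; ++dev) {
        cudaSetDevice(dev);

        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, dev);

        size_t free_bytes = 0, total_bytes = 0;
        cudaMemGetInfo(&free_bytes, &total_bytes);  // free/total device memory in bytes

        const double gib = 1024.0 * 1024.0 * 1024.0;
        std::printf("GPU %d (%s): %.2f GiB total, %.2f GiB currently free\n",
                    dev, prop.name, total_bytes / gib, free_bytes / gib);
    }
    return 0;
}
```

Run it while a game is alt-tabbed in the background and the "free" figure gives a rough idea of how much headroom is left before the card starts swapping textures over the PCIe bus.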