Yeah, the whole thing is rather comical:
- 1080 with an AIO included? £538.99 at Scan, and it's in stock.
- Vega with an AIO included? £679.99 at Scan, and maybe they'll have it at the end of the month.

Don't want to spend that much? Not a problem:
- 1070 ITX? In stock and under £400.
- 1070 with a comically over-specced air cooler? In stock and under £400.
- 1070 with an included EK waterblock? In stock and under £400.
- Sucking blower-fan version of Vega 56? Ehh, maybe at some point shops will start taking preorders...
AMD are undoing a lot of the good work they did for their image and public opinion here... dangerous territory. Vega needed to be cheaper to look like a good option; to INCREASE the price is a nut shot to everyone out there looking for a proper upgrade from their 290/390s. Expect a lot of Freesync monitors to crop up on forum trading areas, Gumtree, etc.
This price increase, if it's at AMD's instigation, is just suicidal, and a massive F You to AMD's customers. If your card only competes with an older, more efficient and cheaper rival, then surely the last thing you want to do is hike the price of your product? Madness. And it makes AMD's marketing chat about Freesync being cheaper than G-Sync look like a bit of a piss take - sure, the monitor is cheaper, so we've just hiked the price of the GPU! Thanks for buying AMD!
I'm just pinning my hopes on AIB cards. From what I've heard they're looking decent, but they keep the juicy info away from me around here.
Indeed, it has been a bit of a disaster. My 290 will have to survive until Navi or a Vega price reduction (which is unlikely with the current price inflation).
AMD trying to play off Freesync monitors being cheaper, and therefore AMD cards being better value, is just bollox at this end of the market anyway. Let's be real here: most people spending £400-500 on a GPU aren't going to be put off by a G-Sync monitor costing an extra £200 vs its Freesync version - it just gets worked into an already considerable budget. In the mid-range, back when RX 470/480s were affordable, it was a thing. But at the high end? Nah, I don't buy it, AMD...
That argument would hold more water if Vega had to waste a bunch of die area on professional features. But it doesn't: no ECC for in-flight maths (and it's not even clear whether ECC on the HBM2 is enabled or not), no 1/2-rate FP64 performance (just 1/16 rate), and not even certified drivers for any Vega SKU. AMD are producing two lines, Polaris and Vega, both eschewing the professional market for the gaming market.
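To put that 1/16 rate in perspective, here's a rough back-of-envelope calculation. The shader count and clock below are assumptions for illustration (roughly Vega 64's 4096 shaders at a ~1.54 GHz boost), not figures from this thread:

```python
# Back-of-envelope GPU FP throughput: shaders * clock * 2 ops/clock (FMA).
# Assumed figures: ~4096 shaders, ~1.536 GHz boost (approx. Vega 64 spec).

def fp_tflops(shaders, clock_ghz, ops_per_clock=2):
    """Theoretical single-precision throughput in TFLOPS."""
    return shaders * clock_ghz * ops_per_clock / 1000.0

fp32 = fp_tflops(4096, 1.536)   # ~12.6 TFLOPS single precision
fp64_gaming = fp32 / 16         # 1/16 rate as shipped: well under 1 TFLOPS
fp64_pro = fp32 / 2             # what a 1/2-rate "pro" die would manage

print(f"FP32: {fp32:.1f} TFLOPS")
print(f"FP64 @ 1/16 rate: {fp64_gaming:.2f} TFLOPS")
print(f"FP64 @ 1/2 rate:  {fp64_pro:.1f} TFLOPS")
```

So a 1/2-rate part would have eight times the double-precision throughput of what actually shipped — which is the point: the consumer die isn't carrying that hardware around.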
I'd definitely agree with that if you're talking about 1080 Ti prices, possibly 1080; once you hit 1070 prices, though, I'd probably disagree. But this only applies if you're buying a GPU and monitor at the same time. If you're just buying a GPU at the moment, I'd imagine most people who don't already have an adaptive sync screen wouldn't really factor it in as a concern. Let's face it: if you're OK with 1070/V56 performance without adaptive sync now, then if you were to buy a new screen afterwards you might just go without adaptive sync. At that point, whichever card is cheapest would be the one to go for, unless you have other specific requirements.
I think in all honesty they were hoping that games would use GCN. That is why they've been making "Fermi" since the 7970. If games did use it, then yes, the Vega 64 would indeed be faster than the Titan XP. But that isn't happening, because to do so game devs would have to forsake Nvidia and do lots more work. So basically they're making mining cards that are a bit hot and not very well behaved power-wise, but hey, I guess at least they're selling, right? As for the liquid-cooled 64? I paid less than that for a Titan XP. And Titan XPs have been selling for as low as £625, usually fetching £650. I paid slightly more for mine as LP allowed me to make payments.
The Vega 64 isn't worth the money, but the Vega 56 looks promising, trading blows with the 1070 and the 1080 as well. Sadly, though, there will be no board partner cards.
Hoping developers will rewrite their working code to fit your new architecture has never worked, and is never going to work *coughItaniumcoughHSAcough*. The best you can do is do the rewriting yourself and provide the optimised code as drop-in modules. Nvidia did this (GameWorks); AMD did not.
Yeah, and I for one am sick of AMD promising the earth from their current technology "just as soon as devs catch up". If I'm buying a product then I need it to do the business right now, and if it doesn't then I'm not going to buy it based on speculation that it might do the business at some indeterminate point in the future.
Well, that was even quicker than I thought. Well done AMD, it's quite the achievement making Nvidia look like the good guys!
Oh yeah, totally. It was the same sort of thing with Bulldozer. With proper threading it was actually pretty decent, but no one supported it lol.
AFAIK it does, though. Vega, when used in their professional range (WX 9100 & SSG), does come with all the features you've identified. Yes, they're not in the RX and FE range, but then I suspect Vega was never designed for the consumer and prosumer markets, in the same way that Zen was designed to target primarily the enterprise, embedded and semi-custom markets. I suspect the RX and FE Vega chips are little more than Vega chips that didn't make the cut for Instinct or professional cards. EDIT: They even said back in 2015 that they were prioritizing investment in the enterprise, embedded and semi-custom business.
On the Gamers Nexus undervolting live stream last night, they mentioned that another site had tested Vega in workstation roles and it was good. He didn't say much more, though, and annoyingly I can't remember which site he mentioned. It doesn't seem to make mining 'better'; it just stops the drop in performance. As a comparison, Pascal was unaffected. Sauce. Since prices were already beginning to return to normal, let's hope the lack of a drop in performance won't affect things much. Otherwise we're all gonna get shafted again. Edit: Just seen it did actually give mining performance a bit of an uplift. Serves me right for skim reading!