Discussion in 'Article Discussion' started by Gareth Halfacree, 15 Mar 2016.
Outlines new GPU roadmap, too.
an all-in-one sealed-loop liquid cooler
"compute performance".. Is "compute" no longer a verb?
Whoopsie - fixed, ta!
"Compute performance" is a term referring to a processor's performance at computing - distinct from, for example, its clock speed in hertz. For GPUs and CPUs, "compute performance" typically means floating-point operations per second (FLOPS).
No worries! It was like a tongue-twister.
As for the card? Disappointing. Far too much money and, tbh, not enough VRAM. Well, actually, let me rephrase that: there's enough VRAM on paper, but lately we've been getting bloated console textures, and in at least two recent games the Fury X does not have enough VRAM.
I see they are also craftily cashing in on VR, so all of a sudden this is mainly for VR.
Wasn't there a new API (I can't remember if it was AMD's Mantle or part of DX12) where the VRAM can be used independently (8GB effective) rather than having to be mirrored (4GB effective)?
Apparently yes - DX12 can do that.
DX12 can also support more than one GPU, yet neither of the DX12 games released so far works with more than one.
So I'm guessing that for all of this magic and wonder to work the game needs to be coded from the ground up using DX12 and not botched in for higher sales at a later date. That worries me.
Edit: Actually, let me go into why. We have already seen two games that bring the Fury X to its knees at 4K. BLOPS III ended up being patched by Treyarch to reduce the graphical settings on cards with less than 6GB of VRAM, so at 4K I am only allowed to set High details rather than "Extra". Before they released this patch (and after, if you know how to hack it out) the game would most often just go to a black screen and crash the PC. On the odd occasion it would actually start and run, but soon after it would crawl to a halt and eventually hang to a black screen.
Rise of the Tomb Raider is actually rather different. This does allow you to set everything to Very High, but from the off it is absolutely rife with problems. It stutters badly, especially when loading a new level. It also freezes for up to a minute before carrying along on its merry way. I looked into this on AMD's website (under driver notes) and AMD simply said that using Very High settings could lead to instability, so they recommended against it. At no point did they say it was because of VRAM, but it's pretty obvious, given that 980 Tis in SLI suffer no such ills.
And this is really worrying, because it will be a really long time before DX12 is being used properly (the first few DX11 games were really resource-heavy too). So for the foreseeable future this could become a trend (NFS comes out soon - can't wait to try it on that) and we could see 6GB becoming a requirement for high-end gaming. I'm sure the new consoles have 8GB total, with 1GB going to the system and something else reserved before 6GB is available to the GPU. And it shows.
I remember when the Fury X launched I said 4GB of VRAM would likely not be enough for 4K gaming in the future; people said HBM would cover the lack of VRAM. As people are now finding out, 4GB really is not enough - proven by the fact that Nvidia's 980 Ti with 6GB of RAM runs both games fine.
DX12 is a long way off using memory like that for SLI or CFX; it will take a massive game launch that Nvidia or AMD gets behind and says "we want to support this feature". My guess is this time next year at the earliest.
I don't even think 6GB will suffice for 4K, not without sacrifices.
I think the general consensus was that it would be enough due to the bandwidth and overall speed. However, when a game immediately wants to load in 6GB's worth of textures all at once, because that's what the consoles are doing, then it becomes a problem.
4gb of HBM is enough IMO. It's just terrible coding and optimisation that is making it not enough.
Expecting devs to improve their coding has been some mythical PC-gamer speak for as long as I've been a PC gamer. Even when Crysis first launched, the GPU companies avoided most of the scrutiny - it was still seen as a dev problem.
Any dev that has pushed the envelope in graphics has suffered abuse because of it.
There's only been one good PC port of a console game in the last three to five years, and that's GTA V - and that took over a year of additional dev time. Would everyone accept that kind of delay for every major game? I have my doubts on that one.
Konami's Fox Engine also proves what can be done with a very scalable engine.
Most of the new engines are struggling to scale correctly across a wide range of hardware; most of them really struggle on low-end kit, and even more so when maxed out.
Have you played BLOPS III?
It looks like doodoo, even with all of the "extra" settings applied.
All those DX12 games where NV cards fall on their arse, yeah? Working well, then.
There are only three DX12 games that I can list; only two have actually fully launched, and the other was in alpha/beta at last check.
They're not proper DX12 titles, though. Both started out life as DX11 games and have been made to work using DX12. For AMD to show off how good their cards are in DX12, the game will most likely need to be coded from the ground up to support that functionality. It's pretty obvious that's the case, because "multi-adapter" isn't working yet - and it should be a given, right? They've used DX12, which is supposed to support it with no extra work.
With Ashes of the Singularity I reckon it was done purely to make extra sales. I bought it just to see how it ran and found it wasn't a game I would normally play or buy.
It will take a couple of years for proper DX12 games to appear, and that's if it even catches on. It took absolutely ages for a proper DX11 title to come along. I know most will mention Dirt 2, but most of that could be run on DX10 if you hacked it; it was only the tessellation that was DX11-only.
And by that time all of the hardware we have now will be woefully out of date.
Which, it must be remembered, is why developers do anything. No developer ever worked on a feature for a game 'cos they thought it'd make it less popular.
TBH Gareth I doubt I would have even heard of the game had it not been for DX12 and a benchmark battle on OCUK.
I only wanted to see how my card stacked up compared to a Titan X. There was pretty much nothing in it.
I then uninstalled it.
Edit: I wonder if anyone has actually looked into exactly what DX12 features these two games are using. I mean, you could literally add one tiny part and say "Hey look, DX12 game, innit!" See also: Dirt 2's DX11.
One of the touted features of Windows 10 was cross compatibility between Xbox One and PC. Are there any games that you can play online yet? I don't really want to buy an Xbox, but I would like to play online with friends.