Discussion in 'Article Discussion' started by bit-tech, 12 Nov 2018.
Oh they have potential.
I'll just get one and test it out on a RTX game. Oh. Hm. Yeah. Nah.
Was there no way to hold back the release until there was actually a game out there that could show what it can do, why they're so excited, and why they're charging uber cash money?
I imagine it would have been received slightly better if people could actually see what it could, maybe, potentially do.
Quality is subjective whereas FPS is not. To a degree, image quality doesn't affect gameplay (waits for all the examples where it does), whereas FPS does.
I guess the real reason most people dismissed ray tracing is that when you're shooting sprites in the face, you don't pay much attention if you can't see yourself in said sprite's Ray-Bans, but you do notice if those sprites move like you're trying to stream an episode of TWD down a dial-up modem.
I don't think many are dismissing the potential these cards hold for the quality of graphics to come, but as you point out in the article, the combination of price point and absolute performance make these cards a very weak value proposition. From a commercial standpoint I think this makes complete sense for Nvidia. When your competition has a weak product, push out in a new direction that will move the industry forward. They are starting with very high pricing to cream off the early adopters, and I expect a price cut before the 30XX series appears which I'm hopeful will represent much better value for money, and a meaningful upgrade from the 1080 Ti.
Until then, Nvidia will have to live with weaker sales than previous launches, which I expect they anticipated.
Of course we shot it down, one or two demos aren't going to change the mind of anyone (especially when one of those demos is for an EA game).
By the time it actually turns up in more than a handful of games, the Nvidia RTX 2380 Ti (or whatever it ends up being called) will make a complete laughingstock out of the by-then museum piece 2080 Ti.
Or to put it in different words: the rejection of ray tracing as something worth a penny has an expiry date. Eventually it will be worth something; that time is just not now.
The 20XX cards are a first generation product. They're expensive, they're not especially power efficient, they're faster but the cost offsets that and the main selling point is a technology that doesn't yet exist outside of a controlled environment.
Fine, growing pains are what they are, but they aren't a good value proposition to anyone who wants a GPU to actually play games on. Perhaps in the next three years Raytracing will become the standard of game lighting, but it might turn out to be the next Hairworks too. Games like STALKER and FEAR already did amazing jobs on lighting by using clever tricks and careful planning, more than a decade ago now, so I suspect even with Raytracing the next EA title won't look especially better than the year before.
For a foaming at the mouth FPS player like me it's all academic anyway. I want frames per second, I turn the graphics down because the shinies make the game world harder to read. Any card that can hold 120+fps is just fine.
The RTX situation reminds me of John Cleese's famous question 'is the fog funny?' It's very easy for Nvidia and the tech press to focus on the amazing potential of the new tech, but in the here and now, does it make the game better? I guess we'll see, maybe, when they get around to it.
I just don't see the appeal.
Sure, they will cope with ray tracing in games better than the 1080/ti when those games arrive in any real volume but, by that time, we'll be well into our second generation of RTX cards.
I mean, fine, let the must-have early adopters take another one in the nads when Nvidia makes their £1,200 purchase obsolete in 6-8 months. For the rest of us, though? Nah.
Unless 20XX cards can offer the same performance and the additional reflections - not interested.
I'd normally be one of the people saying that graphical fidelity matters, even in games where you generally wouldn't notice it, like COD, Battlefield, and a few others. But with the 2080 Ti apparently unable to maintain, or even reach, 60fps at 1080p with ray tracing on, I have to wonder why I'd bother with something I won't use because it kills performance, and which costs so much more that, IIRC, I could buy two 1080 Tis for less money than the MSI, Asus, or EVGA third-party-cooled cards.
I understand they need to recover the development cost, but £1,050 for a single GPU that offers around 20% to 30% more performance at basically double the price of the previous card is just too much to ask. If someone was building a new machine, they could put that extra money towards a better CPU or SSD instead of some currently useless tech that looks good, sure, but is worthless if you don't play the games that use it when they come out, or that add it later via a patch.
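To put rough numbers on that value argument: taking the post's figures at face value (double the price, 20-30% more performance, and the quoted £1,050), a quick cost-per-unit-of-performance calculation shows the premium even in the best case. All figures here are illustrative assumptions from the post, not benchmarks.

```python
# Illustrative price/performance arithmetic using the rough figures above:
# ~£1,050 for the new card vs. roughly half that for the old one,
# with +30% performance taken as the best case. Not benchmark data.

def cost_per_perf(price_gbp: float, relative_perf: float) -> float:
    """Price divided by relative performance (old card = 1.0)."""
    return price_gbp / relative_perf

old_card = cost_per_perf(525.0, 1.0)    # hypothetical previous-gen price
new_card = cost_per_perf(1050.0, 1.3)   # double the price, +30% performance

# Even granting the best-case 30% uplift, each unit of performance
# costs roughly 54% more on the new card.
premium = new_card / old_card - 1
print(f"old: £{old_card:.0f}/perf, new: £{new_card:.0f}/perf, premium: {premium:.0%}")
```

So even the most generous reading of the uplift leaves the new card well behind on value per pound.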
It's just like GeForce Experience: I don't use it because when I tested it, it dropped performance by up to 20%, and as I currently use OBS and a second machine for recording, I'll take that extra ~20% performance. Even when I was recording on the same machine I played on, I still used OBS instead for the better features.
Ray tracing may become the next big thing in graphics, but it's going to be years before it does, because we need not only the performance to improve but also the list of games supporting it to grow to basically every game.
As it is, I don't expect it to take off until AMD has hardware that offers decent performance; given that only Nvidia hardware is currently capable of it, people are going to have to pay through the nose for it.
Essentially there's no compelling reason, as of today, to justify the financial cost of being an early adopter of an unproven and as yet unutilised technology. By the time enough titles arrive that make use of ray-tracing, we're likely to be onto the second generation of these cards which will almost certainly be more capable.
Speaking of AMD, David Wang, senior VP of engineering, has recently said that "AMD will definitely respond to DXR" and that "the spread of Ray-Tracing's game will not go unless the GPU will be able to use Ray-Tracing in all ranges from low end to high end".
I have to agree with him, it's going to be a struggle to get developers to adopt ray tracing when the only hardware that supports it costs 500 odd pounds, it's a niche within a niche.
Sure, you shouldn't dismiss them, but you probably shouldn't buy them either.
After god knows how many years of PC gaming, this last round of GPUs pushed me over the edge. Yes, I largely blame GPU pricing.
I decided to abandon PC gaming for the new releases coming up (BF5 being the one I’m looking forward to most). The prices are just ridiculous. Over the years a group of about 10 of us PC gamers have dwindled to 2 and now to none. All of us have gone console or just gave up.
The last 2 of us have just joined everyone else still playing (Xbox for us). Sure, the controller isn't great compared to a PC, but I just refuse to keep ponying up ridiculous money. It's more fun playing with friends, and the sofa experience ain't bad at all. The One X in particular is quite a powerhouse, all for less than a mid-range GPU.
I just have other priorities in life than spending thousands on a gaming PC now, of which the GPU accounts for a huge chunk. The performance of ray tracing on the RTX cards was a joke as well; hardly a compelling argument for their vast prices. If you have money to burn, though, go for it.
The only thing I will be giving these RTX cards is the finger. Consider them dismissed.
The RTX release kind of reminds me of the dedicated PhysX cards you could once buy. Only when PhysX got absorbed into the mainstream GPU architecture did its adoption skyrocket.
There's a lot of "by the time Raytracing is implemented we'll be 1/2/3 generations down the line" talk, which is all rather silly, for two reasons:
- RT implementation in engines (Unreal and Unity make up a supermajority of modern title development) means any developer with even a vague interest in dynamic lighting (i.e. everyone outside the retro sprite-like arena) will implement RT as soon as possible to eliminate the headaches of cubemap baking. Turning "hit bake, wait minutes/hours, see results, iterate" into "make a change, see results in real time or with a sub-second delay" is a powerful enough tool on its own to justify a live RT implementation in your particular flavour of game engine. Once that implementation is there, rolling it out for real-time use on clients becomes a case of choosing optimisations rather than building the implementation itself. With DXR & VKR available in the Unreal mainline branch (suggested as 4.22) later this year, and in Unity at some point (their bass-ackwards excuse for a roadmap lacks most of their development features, go figure), a wait of months rather than multiple years is more likely.
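For a sense of how low the barrier is on the engine side: in the 4.22-era Unreal previews, turning on the experimental DXR path is reportedly a matter of a couple of config flags plus launching under DX12. The exact variable names below are assumptions based on those early builds; check the release notes for your engine version before relying on them.

```ini
; ConsoleVariables.ini (hypothetical 4.22-era setup; verify against your engine's docs)
r.RayTracing=1                 ; enable the experimental ray tracing pipeline
r.SkinCache.CompileShaders=1   ; skin cache shaders, needed for RT on skinned meshes
```

The editor then needs to be launched with the `-dx12` flag so the DXR-capable RHI is active, which is consistent with the point above: the per-developer cost of flipping RT on is tiny once the engine ships the implementation.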
- GPU architecture development is not fast. It hasn't been for a while, and it likely never will be again (the same applies to the CPU realm). The time of periodic process shrinking bumping up performance and dropping prices has also long ended: for the last 7 years at least (28nm) cost/transistor has been going up, clock speeds have remained stagnant, and perf/watt at max clocks has remained mostly static (total perf/watt has improved from radical advances in power-gating). To achieve performance gains, new architectures need to either radically change internally, or add additional specific-case functionality. Radical changes can take the better part of a decade, so adding fixed-function blocks has been the order of the day.
I think this would have been better as a standalone 'FPS v Image Quality' feature.
Non RT GPUs have like 99% marketshare (all consoles + all AMD GPUs + all pre 2xxx Nvidia GPUs + all Intel GPUs), so yeah... implementing RT will be an extra step on top of "normal" development for a long time to come.
I suggest you read the paragraph immediately following that quote.
That makes zero mention of games resulting from the "new way of development" being automatically compatible with said 99% of cards.
i.e. The "old way" will still be an absolute requirement unless you forgot to mention that the "new way" produces backwards compatible games.
Or look at it another way: you're expecting the same developers who frequently can't be bothered to unbundle options in the menu, add FOV sliders, or add proper AA methods (no, FXAA isn't good enough) to do extra work. Even if it were as little work as ticking a checkbox to add RT support, I wouldn't count on them actually ticking that box.
What else is he going to say? AMD's GPUs don't have ray tracing hardware; they can't exactly say "don't buy a GPU unless it can ray trace", can they? It's also noticeable how they pulled high-end Navi; I suspect because they know it's pointless releasing a high-end gaming GPU without ray tracing hardware.
As for the RTX bashing, well, tbh 90% of GPU forum talk is red vs green, and if RTX/DLSS is successful then red is screwed, so it's not surprising it gets bashed. The thing to remember is that forum talk means practically nothing in the real world. For example, the 970 destroyed the 290/390 in sales despite most forums continually pointing out how the 290 was faster and the 970 only had 3.5GB of memory; you'd think AMD should have dominated sales if you just read the forums, but Nvidia had something like 85% market share.
I think pretty well everyone agrees that ray tracing is the future; the question is how long it will take. On one hand, it's not in the consoles or AMD cards, so why implement it? On the other hand, Nvidia has nearly all of the high-end PC gaming market and the money to burn to get devs to adopt it. As it's fairly simple to implement and will be part of all the big engines everyone uses, I'm guessing a lot of games will get some limited ray tracing support in the next year.
Reminds me very much of when HDR lighting arrived. There was that patch for Far Cry: it hit performance and was a bit OTT, but it looked amazing compared to what was there before. I had a 6800 GT, and it really wasn't up to the job (frame rates dropped into the 20s), but image quality was so much better I couldn't bring myself to turn it off.