Discussion in 'Article Discussion' started by bit-tech, 4 Oct 2019.
The start of ray tracing seems a bit like the start of PhysX or HDR: nice but gimmicky at first, but as hardware catches up it will become the norm.
"Future of Gaming or Unnecessary Gimmick?" I think it can be both.
As mentioned above, when something like this first comes out and only runs on expensive hardware and limited software then it's somewhat of an unnecessary gimmick that you can choose to buy into if you like it enough. Then if it's good enough it will gradually become the norm which also makes it the future of gaming, but only time will tell whether it will make this step (I think ray tracing probably will and already is to some extent).
Going to have to disagree here.
I played the game at this resolution because, quite frankly, it's the only resolution at which this game can maintain a 50-60fps lock with all settings at max, including RT, and it looked appalling. The 'noise' that is inherent to ray-traced rendering was terrible, like a super heavy film-grain filter over all the surfaces and textures.
This game only really looks 'amazing' at native 4K, where the 'noise' is now small enough to be (almost) unnoticeable.
...of course at 4K it only runs at 15-20fps, but that's another matter.
The future of real-time rendering mirrors the past of offline rendering (AKA 3DCGI for movies and TV):
Start with basic raycasting (e.g. Tron), then move to raster (e.g. RenderMan), then to a raytracing/raster mix, then to full unbiased path tracing (e.g. modern RenderMan).
Real-time rendering engines (videogames) are following the same sequence: we started with raycasting (e.g. Wolfenstein 3D), moved to raster (e.g. Doom, and everything since), and are now entering the hybrid rendering world.
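For anyone curious what that first step actually boils down to: the heart of both raycasting and full ray tracing is just a ray/primitive intersection test. A toy sketch (my own illustration, not anything from the article) for the classic ray-sphere case:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t along a unit-length ray to the nearest
    sphere intersection, or None if the ray misses."""
    # Vector from sphere centre to ray origin
    oc = tuple(o - c for o, c in zip(origin, center))
    # Quadratic coefficients reduce to b and the discriminant
    b = sum(d * v for d, v in zip(direction, oc))
    disc = b * b - (sum(v * v for v in oc) - radius * radius)
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = -b - math.sqrt(disc)
    return t if t > 0 else None  # nearest hit in front of the origin

# Ray starting at z = -5 pointing toward +z, unit sphere at the origin:
# the ray enters the sphere at t = 4
print(ray_sphere_hit((0, 0, -5), (0, 0, 1), (0, 0, 0), 1.0))
```

A raycaster fires one of these per screen column or pixel and stops at the first hit; a path tracer fires millions per frame and keeps bouncing, which is why the hardware gap took decades to close.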
As someone who purchased one of the Ageia PhysX cards back in the day, I'm all for getting on board with new tech. The only thing holding me back from investing in an RTX card is the sheer price hurdle. I already have a 1080ti, which cost a pretty penny 2 years ago but still has plenty of life left in it. I've also gone the 1080p/144hz route for gaming as at the time, there weren't many monitors that ticked all the boxes (G-Sync, IPS, 1440p, high refresh, ultra-wide, etc...). That's slowly changing but I can't see myself dropping 2 grand on a new GPU/Monitor setup unless I win tonight's roll-over!
I wholeheartedly agree with you. The game looked crispest when played maxed out at 1440p on my 1080Ti without RTX. It looked distinctly fuzzy at the same res on the 2080Ti with all RTX gubbins cranked up. This was observed both with and without DLSS, but was obviously better looking without.
In the end I played through the whole game on a 2060 with one of the RTX settings switched off (the window reflections one iirc) and the others turned down a bit. This felt like a compromise worth making, as it sharpened the visuals up and coupled with G-sync I had no discernible FPS issues.
I think Control is the first example of a game where ray-tracing has been used to make the environment feel more realistic, as opposed to prettier, but the image quality compromise was a fairly bitter pill to swallow.
Personally I'm not interested in ray tracing in the slightest. I'm much happier with high fps and conventional settings.
I also think it has a lot to do with the games you are playing. I tend to play first-person shooters, and stopping to look at the quality of a reflection is something I basically never do.
I want things to look good - i.e. I want water to look like water, not a slab of blue - but that's about it, and games already do that for me without any need for ray tracing.
I'm not saying that it's pointless to me, I'm just not going to sacrifice any fps and I am certainly not going to pay extra for it.
I honestly think that trying to advance graphics is ruining some games; Battlefield being a prime example. It used to be a great game with huge maps and graphics that, whilst good for the time, were never outstanding. Today it's totally the opposite: the visuals look great and, oh wow, it has ray tracing, but the maps are tiny and subsequently the gameplay is awful in comparison to what went before.
Ray tracing is a neat feature but quite low down on my list of GFX must-haves.
High polygon counts.
Don't care enough about it to invest money - if it is automagically included in my next GPU without a steep premium, fine. Otherwise, meh.
I imagine my 1080ti will be fine for a while yet.
A 2080 Ti gets around 25fps in Quake 2 with ray tracing on high at 4K; in other words, ray tracing will be a gimmick until Nvidia more than doubles the performance of their current cards.
I'd say the way Nvidia has implemented it is a gimmick, but a more sensible approach will be the future: it needs to be available to everyone, not just those with deep pockets, and not just people who buy Nvidia cards.
Funny, I said pretty much the same thing about VR, about four years ago.
RTX is pretty much a 'rich kids' toy' - like VR was 4yrs ago - if it becomes mainstream within 5 yrs of going live, it'll be worth buying into.
RTX is a proprietary standard and I think it all depends on how next gen consoles and AMD in general implement ray tracing. It is the future, I'm sure. But whether that future is 2021 or 2024 I'm not so sure. And yes, that means I can't see it being mainstream next year.
It's not really comparable to VR. VR requires separate additional hardware and a different approach to gaming.
I think while it is in itself very different, the course of mainstream adoption will be similar to G-Sync/FreeSync:
-AMD will release an implementation after Nvidia.
-There'll be some exclusivity and deals.
-Availability will trickle down the price points on both sides.
-It will become common and lose its novelty.
-Platform exclusivity/deals will wane with the maturity and commonness of the tech.
-It will become standard where it is appropriate.
RTX implements DXR, which is Microsoft's standard that AMD can also implement and which the next Xbox will use, so it's not really an Nvidia thing; they were just the first to come out with specialised hardware to do it.
This reminds me most closely of the arrival of HDR lighting. I remember turning that on in Far Cry (which got patched to use it) and thinking it looked amazing, but it tanked my frame rates (on my 6800GT). There were also fanboy wars then because Nvidia did DX9c and AMD only did DX9b, which couldn't do full HDR. HL2 did the cut-down DX9b HDR and had less of a performance hit. So lots of "only DX9c and Nvidia does it properly" vs "AMD's DX9b looks almost as good and the fps are much better". Anyway, the next AMD cards got DX9c and all games transitioned to HDR lighting over the next few years.
I expect exactly the same will happen here - AMD will get decent ray tracing support in a gen or two and in a few years all games will be using DXR (or the Vulkan equivalent).
As for is it worth it - well tbh lighting improvements are always worth it. Just look at a game from 10 years ago vs a modern game and while you will notice it's a bit blockier and the textures aren't quite as sharp, what tends to stand out the most is how flat the lighting is. 10 years ago you probably thought that lighting was great, but the moment you get used to better lighting it ages badly. Same will be true for all the ray tracing improvements imo.
An even closer analogy would be Hardware Transform & Lighting. If you go look up GeForce 256 reviews, you'll see some startling similarities to RTX cards: "It's really expensive, the new tech looks good but very little supports it, what does support it doesn't look that much better, and you can buy a previous-gen card that's almost as fast for much less". Of course, history tells us that Hardware T&L became the universal standard.
Or Unified Shaders, which had the same round of complaints (LOL 8800 Ultra, $830 launch price in 2007 ~ $1000 today) and the same end result of universal adoption, as well as "but CUDA is Nvidia only, it'll never catch on!" missing that CUDA was the implementation, not the technology.
I'm playing Control with DLSS and ray tracing on at the moment and getting about 75fps most of the time at 3440x1440 using one overclocked 2080 Ti. It is very shiny, but due to the DLSS scaling it doesn't quite feel like I'm making the most of my pixels. Feels... almost there. RTX and the fact that Nvidia decided to support FreeSync is why I decided to give the Ti a shot (after selling everything I could find in my cupboard of things I'm not using enough these days). Almost got caught out by the A vs non-A thing, which feels like having two separate products with the same name just to catch people out.
I think gaze tracking and the ability to only put the detail in the bits you are looking at might actually give us all ray tracing and generally higher resolutions/fps soon (in and out of VR).