
Windows Ray Tracing...Is it a lot of old arse?

Discussion in 'Gaming' started by SuperHans123, 31 Jul 2022.

  1. Gunsmith

    Gunsmith Maximum Win

    Joined:
    23 Sep 2005
    Posts:
    9,777
    Likes Received:
    2,352
    DLSS 1.0 on Control was ****; I ended up beating it at native res, but it's since been patched up and runs really nicely.
     
  2. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,062
    Likes Received:
    970
    I love RT Q2, just not for the reason Nvidia wants me to love it.

    The fact that a 3090 Ti can't even manage the absolute bare minimum of 60 FPS at 4K in a decades-old game specifically remade by Nvidia for the purpose of sucking the d**k of raytracing very clearly demonstrates that actual full raytracing in modern games is still many, many years away from being viable, as GPUs aren't even remotely close to offering enough power.
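    For a sense of the scale being described here, below is a rough back-of-the-envelope sketch of the ray budget a full path tracer would need at 4K/60 FPS; the per-pixel figures (samples, bounces, shadow rays) are illustrative assumptions, not measurements from Quake II RTX or any particular GPU.

    ```python
    # Back-of-the-envelope ray budget for full path tracing at 4K / 60 FPS.
    # All per-pixel figures are assumptions for illustration, not measurements
    # from Quake II RTX or any specific GPU.

    width, height = 3840, 2160       # 4K resolution
    fps = 60                         # target frame rate
    samples_per_pixel = 4            # assumed paths traced per pixel
    bounces_per_path = 3             # assumed bounces per path
    shadow_rays_per_bounce = 1       # assumed light-sampling rays per bounce

    pixels = width * height
    rays_per_pixel = samples_per_pixel * bounces_per_path * (1 + shadow_rays_per_bounce)
    rays_per_frame = pixels * rays_per_pixel
    rays_per_second = rays_per_frame * fps

    print(f"pixels per frame: {pixels:,}")
    print(f"rays per frame:   {rays_per_frame:,}")
    print(f"rays per second:  {rays_per_second / 1e9:.1f} billion")
    ```

    Even with these modest assumptions the total lands at roughly twelve billion rays per second, which goes some way to explaining why current implementations lean so heavily on denoising and upscaling.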
     
  3. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,130
    Likes Received:
    6,718
  4. boiled_elephant

    boiled_elephant Merom Celeron 4 lyfe

    Joined:
    14 Jul 2004
    Posts:
    6,907
    Likes Received:
    1,190
    This is an interesting point; I've always had this feeling about a lot of mods and mod packs for older games, like the Elder Scrolls titles. People mod Skyrim or Morrowind into looking great, but inevitably it just starts a "whoops, I made a clean spot" problem where every single asset, texture, animation, lighting effect, map and shader needs redoing to make it coherent. The worst modding results (like a lot of the Morrowind ones) come from modders not understanding this problem. Graphical coherence is more important than the polish on any individual part.

    Which sheds light (hah) on why RT hasn't made a splash yet. Games are not designed around it; they're developed conventionally and then retrofitted, like how 90% of 3D films looked crap and gave 3D a bad name.

    We need the RT equivalent of Avatar: a game made exclusively and specifically for RT, one that leverages it as a core part of the design philosophy. I don't agree that it needs to be integrated into gameplay, just that it should be central to the entire graphical design process. That won't happen while it's still an optional extra on a minority of cards that incurs a performance penalty.
     
    SuperHans123 likes this.
  5. David

    David μoʍ ɼouმ qᴉq λon ƨbԍuq ϝʁλᴉuმ ϝo ʁԍɑq ϝμᴉƨ

    Joined:
    7 Apr 2009
    Posts:
    17,447
    Likes Received:
    5,851
    With respect, no, we don't. Why should we be focussed on what will essentially become something as commonplace as T&L or AA?

    RT is getting a lot of press because Nvidia chose it as their battleground: a USP over AMD's offerings. With it nigh on impossible to separate comparably tiered cards on rasterisation performance and, worse still, AMD's 6000 series offering better non-RT performance than the 30 series, Nvidia's marketing is doubling down on RT.

    I own a 3090, and it's great, but I ain't touching the Kool-Aid.
     
    Vault-Tec likes this.
  6. boiled_elephant

    boiled_elephant Merom Celeron 4 lyfe

    Joined:
    14 Jul 2004
    Posts:
    6,907
    Likes Received:
    1,190
    Sorry, I worded it badly. I didn't mean that the industry should prioritize RT. What I'm getting at is that for RT to be properly comprehensible to the average consumer, we need a game or two that prioritize RT in their graphical development from the ground up, to show what it actually is and what difference it makes. Judging it based on its current implementations leaves people guessing as to whether it's snake oil or not. We, as customers, need to be able to decide for ourselves whether we care about RT with all the evidence in front of us.

    This is the opportunity that Avatar (and to a lesser extent, Gravity) finally afforded us - we could see exactly what 3D was, what it could offer at its best. That was valuable insight into the technology: we finally understood that most 3D implementations were crap not because 3D is irredeemably crap, but because they were bad implementations.

    Apart from the customers, it was also an invaluable opportunity for the industry in question to learn and study. To my immense disappointment, the industry learned the wrong lessons from the Avatar/Gravity era. They weighed up the cost-benefit ratio of a huge, expensive "proper" 3D project like Avatar compared to an affordable crappy retrofit, and decided that crappy retrofits are the way to go. In the long term, this cowardly executive decision eroded the public's interest in 3D and meant the end of big, ambitious 3D projects.

    Half-Life: Alyx was the same sort of experiment for the medium of VR. The full and final result of that experiment is still unfolding. It may reveal that VR was never going to go mainstream; it may have stoked the fire enough to finally get it going. Time will tell.

    "Proper" RT games would give the industry the same opportunity to see what the tech can offer and see how much interest - and economic viability - there is in the tech. I personally suspect it's a giant titwank waste of time, and I agree with earlier comparisons to Bloom and HDR, but I want to know, dammit.
     
  7. Mr_Mistoffelees

    Mr_Mistoffelees The Bit-Tech Cat. New Improved Version.

    Joined:
    26 Aug 2014
    Posts:
    5,250
    Likes Received:
    2,484
    I looked at some sample pictures showing RT in WoW Shadowlands. I have a GTX 1070 Ti so can't use it, but I don't feel I missed out at all.
     
  8. RedFlames

    RedFlames ...is not a Belgian football team

    Joined:
    23 Apr 2009
    Posts:
    15,417
    Likes Received:
    3,010
    I'm not sure how you'd make RTRT integral to a game in such a way that it wasn't just some visual gloss that can be turned off.

    A stealth level involving properly ray-traced mirrors is about all I can think of...
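    As a toy illustration of how a reflection could be made gameplay-relevant rather than just visual gloss, here's a minimal sketch under simplifying assumptions (a flat, axis-aligned mirror, no occluders, hand-rolled vector maths): a guard "spots" the player if the straight line from the guard's eye to the player's mirror image passes through the mirror's surface.

    ```python
    # Toy sketch of a gameplay rule that depends on a reflection: a guard spots
    # the player if the line from the guard's eye to the player's mirror image
    # crosses the mirror's surface. The flat axis-aligned mirror and the empty
    # room (no occluders) are simplifying assumptions for illustration.

    from dataclasses import dataclass

    @dataclass
    class Vec3:
        x: float
        y: float
        z: float
        def __sub__(self, o): return Vec3(self.x - o.x, self.y - o.y, self.z - o.z)
        def __add__(self, o): return Vec3(self.x + o.x, self.y + o.y, self.z + o.z)
        def scale(self, s):   return Vec3(self.x * s, self.y * s, self.z * s)
        def dot(self, o):     return self.x * o.x + self.y * o.y + self.z * o.z

    def mirror_image(p, plane_point, normal):
        """Reflect point p across the mirror plane."""
        return p - normal.scale(2.0 * (p - plane_point).dot(normal))

    def seen_in_mirror(guard, player, plane_point, normal, rect_min, rect_max):
        """True if the guard -> reflected-player segment crosses the mirror rectangle."""
        target = mirror_image(player, plane_point, normal)
        direction = target - guard
        denom = direction.dot(normal)
        if abs(denom) < 1e-9:
            return False                     # sight line runs parallel to the mirror
        t = (plane_point - guard).dot(normal) / denom
        if not 0.0 < t < 1.0:
            return False                     # mirror isn't between guard and image
        hit = guard + direction.scale(t)     # where the sight line meets the wall
        return rect_min.x <= hit.x <= rect_max.x and rect_min.y <= hit.y <= rect_max.y

    # Mirror on the z = 0 wall, spanning x, y in [0, 2] x [0, 2].
    plane_point, normal = Vec3(0, 0, 0), Vec3(0, 0, 1)
    guard, player = Vec3(1, 1, 3), Vec3(1.5, 1, 2)
    print(seen_in_mirror(guard, player, plane_point, normal,
                         Vec3(0, 0, 0), Vec3(2, 2, 0)))   # True: the guard spots the player
    ```

    In a real engine this would more likely be a ray query against actual scene geometry than a hand-rolled plane test, but the hook is the same: the reflection becomes something the level design can rely on, not just something that can be switched off.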
     
  9. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,062
    Likes Received:
    970
    No one has dared to push games to the technological limits since the original Crysis, and can you really blame game developers for not doing it when some of the most successful games are ones that run on a literal potato?
    Stuff like League of Legends, Minecraft or Fortnite has proven that you do not need to chase bleeding-edge graphics tech for a game to become popular.

    Besides, hardware is nowhere near fast enough yet to handle proper raytracing in a game with the detail level associated with modern games, so even if someone was planning to make a "proper" raytraced game they would probably be targeting 2030 or beyond for a release.
     
  10. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,130
    Likes Received:
    6,718
    My all-time favourite game is Infocom's A Mind Forever Voyaging... which doesn't even have graphics. The game I've probably spent the most time playing is NetHack... which only has graphics if you can be bothered to install a tileset.

    Graphics shmaphics. Give me gameplay!
     
  11. boiled_elephant

    boiled_elephant Merom Celeron 4 lyfe

    Joined:
    14 Jul 2004
    Posts:
    6,907
    Likes Received:
    1,190
    I mean, I must admit I agree. I've not given a hoot about graphics since about 2007. I enjoy Morrowind and FFVII more than Crysis or Witcher 3.

    Still interested to see what could be done with a sort of raytracing-focused project, though. It must have some value or unique qualities.

    Right?
     
  12. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    14,943
    Likes Received:
    3,717
    I'm with David, sorry.

    In fact I think if one killer app game came out it would be bad.

    I say that because when the PhysX cards launched, no one wanted them. Then they released what was, IMO, a killer app (Mirror's Edge), which was totally amazing, so suddenly everyone wanted PhysX cards.

    Sadly nothing else worth bothering with came out for it after that. There were a couple like Mafia II, but the effects were absolutely not worth spending the extra for.

    It absolutely did not sway my decision to buy Nvidia this time around. In fact, at first I was a little furious with AMD because, whilst it absolutely doesn't matter to me, they were charging every penny Nvidia were for cards with fewer features. However, given their recent reality slap, I got a 3090-level card (if you ignore RT) for less than a 3080. Quite a bit less when I bought, and it has more VRAM and a far superior cooling solution.

    Now? AMD's cards are quite a good chunk cheaper, and faster, than their counterparts.
     
  13. boiled_elephant

    boiled_elephant Merom Celeron 4 lyfe

    Joined:
    14 Jul 2004
    Posts:
    6,907
    Likes Received:
    1,190
    Fair. I understand where you're coming from, I'm just less committed to keeping up with the industry's foibles and experimentations, and so don't mind seeing them faff about with new technologies. It might waste someone's money, but it won't be mine.

    I had no idea Mirror's Edge was PhysX-focused; PhysX sort of passed me by. I think by the time I was caught up enough to turn it on in games, they'd stopped implementing it. These things only affect your experience if you concern yourself with them in the first place, after all.
     
  14. David

    David μoʍ ɼouმ qᴉq λon ƨbԍuq ϝʁλᴉuმ ϝo ʁԍɑq ϝμᴉƨ

    Joined:
    7 Apr 2009
    Posts:
    17,447
    Likes Received:
    5,851
    That's the thing though, isn't it? They haven't stopped implementing it - it's just so commonplace that it isn't remarked on any more.

    Which is sorta the point I was trying to make.
     
    Last edited: 28 Aug 2022
  15. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,062
    Likes Received:
    970
    Yep, PhysX never went away; the dedicated PhysX chips just got wiped out when Nvidia bought the company making them, and PhysX became something handled by the GPU.
     
