AMD and Nvidia need to step up to the 4K challenge

Discussion in 'Article Discussion' started by Dogbert666, 30 Sep 2014.

  1. SchizoFrog

    SchizoFrog What's a Dremel?

    Joined:
    5 May 2009
    Posts:
    1,574
    Likes Received:
    8
    This seems rather contradictory to me. You comment that most GPUs are Intel's, yet claim that PC gaming is built on RTS and sim titles, which are usually two of the most demanding genres out there. ARMA, Total War, Flight Simulator X... even going back to Company of Heroes and others, these are all extremely demanding games that need more power than the average FPS to run. As for the modding community, consoles are moving in that direction too. As for the pad issue: if developers can get a pad to work for an FPS, I'm sure they can get sports games to work with M+K. It wouldn't be as good, but if more games needed pads, more people would buy them. All Microsoft has to do is release a dongle for its Xbox controllers, job done.

    You may be right, but that is at least another five years down the line, and I don't think either Sony or Microsoft will be in a rush to replace their current generations of consoles. Look how long PCs and TVs have been using 1080p, and yet the new consoles don't exactly handle that with ease.
     
  2. schmidtbag

    schmidtbag What's a Dremel?

    Joined:
    30 Jul 2010
    Posts:
    1,082
    Likes Received:
    10
    Lol, I never said it made sense or should be that way. I'm just stating that the PC has an edge with RTS games, and that the average PC user, according to Steam, has a crappy GPU.
     
  3. Star*Dagger

    Star*Dagger What's a Dremel?

    Joined:
    30 Nov 2007
    Posts:
    882
    Likes Received:
    11
    Mentioning consoles is risking committing the Console Heresy.

    They have been and always will be inferior, and have little to offer a conversation about the future of PC Gaming.
     
  4. Parge

    Parge the worst Super Moderator

    Joined:
    16 Jul 2010
    Posts:
    13,022
    Likes Received:
    618
    Why does everyone keep saying 'most GPUs are Intels'?

    You are totally misinterpreting the stats. The headline here is actually that 90% of GPUs are NOT Intel.
     
  5. woods

    woods What's a Dremel?

    Joined:
    27 Apr 2010
    Posts:
    83
    Likes Received:
    0
    I just upgraded my monitor to an Asus 29" 2560x1080 four months ago, and got a GTX 770 last week. I'm well happy with these, probably for the next four years, by which time 4K will cost the same as or less than these two investments.
     
  6. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,062
    Likes Received:
    970
    I've tried playing with a gamepad, my hands just cramped up and hurt like hell, so I'll stick to Keyboard and Mouse.
     
  7. SchizoFrog

    SchizoFrog What's a Dremel?

    Joined:
    5 May 2009
    Posts:
    1,574
    Likes Received:
    8
    You have to admit that you are in a minority there then, to just not be able to play with gamepads at all. On a console people don't have the choices that PC gamers have but when it comes to PC gaming I use what is best for the job. Racing and sports games, gamepad. RTS, RPG, FPS, M+K.
     
  8. rollo

    rollo Modder

    Joined:
    16 May 2008
    Posts:
    7,887
    Likes Received:
    131
    You mean 90% of GPUs, not including integrated, are not Intel.

    Intel owns 60% of the total GPU market, for the record. And that percentage has gone up in recent years, as their GPUs are now powerful enough for most people's casual gaming needs.
     
  9. Chris_Waddle

    Chris_Waddle Loving my new digital pinball machine

    Joined:
    26 Mar 2009
    Posts:
    860
    Likes Received:
    61
    Personally I'm now getting sick to death of the push to make resolutions higher in order to make games 'prettier'. I've spent vast fortunes going SLI, Tri-SLI, Crossfire and now Tri-Crossfire.

    For me, they sort of work, but they never quite work as they should. I've built a machine that should run 5760x1920 easily, but it doesn't. I'm still playing games at 1920x1080 because that works, and I feel that my money has been wasted.

    It's all well and good making games look great, but if they won't run smoothly then it's pointless.

    To me, the problem is still the consoles. Gaming studios still make games for consoles. PCs are an afterthought, and studios are not going to spend money developing games to run perfectly at 4K when a console can't manage it.

    It doesn't matter how much you spend on graphics cards: the games aren't being designed to run at 4K, and game quality is being compromised to make them run on consoles and look great.

    I can offer one example. My friends and I get together four or five times a year for a 'lads' night' where we play(ed) Xbox, ever since the original Xbox. Tiger Woods golf was always one we all enjoyed.
    As the years went on, the graphics got far better but the game got far worse. It got to the stage where hitting the ball just off the fairway counted as out of bounds!
    The programmers spent so much time making the game look good that they stopped making it 'realistic'. I say realistic tongue-in-cheek, but the point is that it's no longer a 'golf game'. It looks great, but the gameplay is woeful.

    If I hit a bad shot that is in bounds but lands on another fairway or in the trees, I should have to play it from there. Because 'pretty graphics' are all that matters now, I end up out of bounds instead, because the game hasn't been programmed for that; it takes up too many resources, and they need all the 'disc space' they have to make the graphics look good for what they have mapped.

    I'm not saying we shouldn't all want 4K, but it's pointless (from a gaming perspective) until games companies support it. No amount of GPU grunt is going to make a game run great if the games companies don't care or don't support it.
     
  10. theshadow2001

    theshadow2001 [DELETE] means [DELETE]

    Joined:
    3 May 2012
    Posts:
    5,284
    Likes Received:
    183
    AMD released (I think) the 7000 series cards. Nvidia commented on how underwhelmed they were with the competition. Nvidia then released the GK104 chip as the flagship x80 model. They subsequently released the Titan super-range on GK110 silicon. If AMD had released a better product, the Titan line might only exist as the x80 range.

    I don't think they are holding back between generations, but I do think Nvidia have shifted each SKU up a level as a result of AMD either holding back development or not being good enough to compete.
     
  11. Xir

    Xir Modder

    Joined:
    26 Apr 2006
    Posts:
    5,412
    Likes Received:
    133
    Hmmm, that reminds me, could these games revive Force Feedback?

    Three words:

    Wii Sports golf :hehe:

    (looks awful but is fun to play)
     
  12. fluxtatic

    fluxtatic What's a Dremel?

    Joined:
    25 Aug 2010
    Posts:
    507
    Likes Received:
    5
    Hallelujah, brother.

    My big upgrade for the year is a new CPU (FX 6300, I'll have it Thursday) and shortly, a GTX 750 Ti. The 650 wasn't enough of an upgrade from my GTX 550 Ti, and the 650 Ti Boost didn't stick around for long. Getting the itch to upgrade, I happened across the 750 Ti. Double the performance of the 550 Ti at half the power. Or, put another way, as good as or slightly better than the GTX 480. Remember, this was Nvidia's $500 flagship just four years ago. The 750 Ti has that same performance at barely more than half that power, for less than a third of the price.

    Consider the 980. It just knocked off every GPU that came before it at less power. The only thing that's consistently close is the R9 295X2, a $1500 monster that sucks nearly three times the power. And it's Crossfire, with all the bugs and quirks that come with it.

    As others have mentioned, Nvidia's still stuck on TSMC's 28nm node. It might be a good long while, with so many A8s and Kraits to ship first, before Nvidia and AMD get their hands on TSMC's 20nm line.

    But no, some spoiled children want to cry today that they can't play games at 4K with a $300 or less video card? Give me a ****ing break.
     
  13. Byron C

    Byron C Multimodder

    Joined:
    12 Apr 2002
    Posts:
    10,009
    Likes Received:
    4,639
    Well this all got a little bit feisty, didn't it? :)

    I don't know what you lot are doing - aside from multi-screen stuff - but since I upgraded earlier this year I've found only one game so far that really stresses my PC out: modded Skyrim, and even then it's only the ENB presets that really slow things down. FYI I run a Core i5 4670K (stock speeds), a 2GB GTX 760 and 8GB of RAM - hardly "high end" by the standards of many here.

    Admittedly the most modern game I play with any regularity is Kerbal Space Program, but there are very few "mainstream" games out there that make me go weak at the knees. There are a few I'm looking forward to - GTAV, Star Citizen and Elite: Dangerous (which is indeed doing a pretty good job of making my knees, amongst other body parts, quiver) - but I really don't think I'm going to struggle to run any of them with my current machine. Admittedly GTAV is a bit of a worry, but only because Rockstar handled the port of GTAIV so incredibly badly that even on the highest of high end systems at the time it still wasn't smooth.

    I really don't give a hoot about 4K - sure it looks sharper, but does it really make things that much better? Will it make games any more interesting or fun to play? I seriously doubt it. Higher resolutions, smoother lines, prettier effects, more pixels...? These things don't make good games.

    The thing that concerns me the most when it comes to PC gaming is the fact that I need a power supply capable of delivering over 600 watts, and many people here would think nothing of running a 700W or 800W power supply. My GPU & CPU alone can use up to 254 watts at peak load. Frankly that's insane. Nvidia are heading in the right direction with Maxwell if you ask me; the 750 Ti was an incredible achievement, and from what I've seen so far the GTX 9xx series continues that trend. If I can massively reduce power consumption but still have equal - or even better - performance, then that's something I'm more than willing to pay for.
     
  14. atlas

    atlas What's a Dremel?

    Joined:
    9 Jun 2011
    Posts:
    38
    Likes Received:
    0
    Great article, and certainly a problem I'm facing at the moment. I've gotten by with my 5870 for years and it still plays the latest games without issue, but I'm very keen to upgrade to a 4K display, and what's holding me back is the graphics cards. I'm tempted by the 970, but it still isn't good enough.
     
  15. r3loaded

    r3loaded Minimodder

    Joined:
    25 Jul 2010
    Posts:
    1,095
    Likes Received:
    31
    A lack of competition is often cited as the reason Nvidia holds back its "big" chips (GF100, GF110, GK110), but the reality is that these are physically enormous chips. Big chips always give issues with yield rates: given a fixed wafer size and a constant, random defect rate across the wafer, big chips are more likely to contain a defect than small chips, purely because of their area. Hence, it takes longer to get these bigger chips to market at a reasonable price, especially on a new process. From a business perspective it's better to get the X04 chips out if they're already competitive, then bring the X00/X10 big guns out once the process matures.
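
    The area argument above can be sketched with a simple first-order Poisson yield model. To be clear, the defect density and die areas below are my own illustrative assumptions, not figures from the post:

    ```python
    import math

    def poisson_yield(die_area_mm2, defect_density_per_cm2):
        """First-order Poisson yield model: Y = exp(-D * A).

        Assumes defects land randomly and uniformly on the wafer,
        and that any single defect kills the die.
        """
        area_cm2 = die_area_mm2 / 100.0
        return math.exp(-defect_density_per_cm2 * area_cm2)

    # Illustrative die areas in the GK104/GK110 ballpark, with an
    # assumed defect density of 0.5 defects per square centimetre.
    small = poisson_yield(294, 0.5)  # mid-size chip
    big = poisson_yield(561, 0.5)    # big chip, roughly double the area
    print(f"small die yield: {small:.2f}, big die yield: {big:.2f}")
    ```

    With these assumed numbers the mid-size die yields roughly 23% and the big die roughly 6%: doubling the area costs far more than half the yield, which is exactly why the big chips arrive later and dearer.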
     
  16. Parge

    Parge the worst Super Moderator

    Joined:
    16 Jul 2010
    Posts:
    13,022
    Likes Received:
    618
    No. Amongst Steam users, 51% are Nvidia, 30% are AMD, and only 19% are Intel.

    The rest of the market is entirely irrelevant, as they aren't gamers: they don't have Steam, and if you are a PC gamer in 2014 you almost certainly have Steam.
     
    Last edited: 1 Oct 2014
  17. [USRF]Obiwan

    [USRF]Obiwan What's a Dremel?

    Joined:
    9 Apr 2003
    Posts:
    1,721
    Likes Received:
    5
    I still think it's not the lack of competition but the lack of games that brings GPUs to their knees. Since 90% of games are console ports, have to work on consoles, or are small indie titles, everything runs smoothly on a GTX 5-, 6- or 7-series card at HD resolution.

    There is no 'new Crysis' that destroys the latest and greatest GPU with new and enhanced graphics modes, environment calculations and other amazing stuff graphics-wise. That is what made people buy new graphics cards: to have that more realistic game environment, to gawp at moving jungles and twisting leaves as you pass by, or to see rain fall and flow down the road you stand on. There is absolutely nothing in higher resolutions that makes you go... wow!
     
  18. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,062
    Likes Received:
    970
    I suspect it's simply a size issue: fingers on the buttons, but too much hand left over where the bloody thing ends, leaving no comfortable way to hold it. One size fits all obviously doesn't apply to me, and I doubt I'm alone.
    Let's face it, it can't be terribly hard for a major company like Sony to release the damn thing in small, normal and large, but they haven't done so, and neither have the countless rip-offs.
     
  19. dolphie

    dolphie What's a Dremel?

    Joined:
    6 Jul 2012
    Posts:
    650
    Likes Received:
    14
    It is going to take a couple of years for graphics cards to cope with 4K screens for gaming.
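
    A back-of-the-envelope pixel count shows why (the resolutions are the standard ones; the comparison is my own, not dolphie's):

    ```python
    # 4K UHD pushes exactly four times the pixels of 1080p, so shading,
    # fill-rate and memory-bandwidth demands scale up hard per frame.
    res_1080p = 1920 * 1080  # 2,073,600 pixels
    res_4k = 3840 * 2160     # 8,294,400 pixels
    print(res_4k / res_1080p)
    ```

    Four times the pixels per frame, at the same frame rate, is a rough proxy for why a card comfortable at 1080p falls well short at 4K.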
     
  20. rollo

    rollo Modder

    Joined:
    16 May 2008
    Posts:
    7,887
    Likes Received:
    131
    Graphics in games are not the talking point they once were. Crysis looked awesome at the time; now it looks outdated and runs on a poorly optimised engine.

    Optimisation will bring gamers who want 4K closer to it. PC-only AAA titles are actually few and far between; outside of MMOs I'd struggle to name the last PC-exclusive big-budget title.

    4K and VR both have huge hardware requirements. Expecting AMD or Nvidia to deliver a card that can hack either for under £200 (the mainstream GPU price) is a long way off, if it ever happens for either company.

    I think we will need a new GPU player to really push GPU development. Mantle is already doomed to failure; Metal will be more of a success, but Apple-only. OpenGL needs a lot of optimisation issues fixed.

    I doubt AMD or Nvidia have the money to take everyone to 4K or VR at an affordable price.
     