
News Graphics card market shrinks by 42.7 percent

Discussion in 'Article Discussion' started by CardJoe, 17 Mar 2009.

  1. CardJoe

    CardJoe Freelance Journalist

    Joined:
    3 Apr 2007
    Posts:
    11,343
    Likes Received:
    292
  2. Fod

    Fod what is the cheesecake?

    Joined:
    26 Aug 2004
    Posts:
    5,802
    Likes Received:
    133
    Could it be a case of stupidly short product lifecycles and people just not feeling the need to upgrade?

    My 8800 Ultra still runs all my games fine, and there have been two major card releases since it was cutting edge. Why would I want to upgrade it?
     
  3. azrael-

    azrael- I'm special...

    Joined:
    18 May 2008
    Posts:
    3,846
    Likes Received:
    124
    ...and a fair number of nVidia rebra.. ahem, *completely new* graphics products. :)
     
  4. Narishma

    Narishma New Member

    Joined:
    21 Feb 2008
    Posts:
    134
    Likes Received:
    0
    I think that's the main reason. Most games currently coming out are ports of console games and don't need anything more powerful than an 8800, unless you play at ridiculously high resolutions.
     
  5. yuusou

    yuusou Well-Known Member

    Joined:
    5 Nov 2006
    Posts:
    1,950
    Likes Received:
    258
    Your common gamer (kids) doesn't even know what a rebrand is. If the GPU they have plays the only games they play (WoW, CSS, etc.) then they aren't going to buy. Unless there's a big difference from one card to the next, and from this generation's games to the next generation's, of course the market is going to shrink. Since 2002(?), Nvidia has gone all the way from the GF4 to the 9 series and beyond!
     
  6. Journeyer

    Journeyer Well-Known Member

    Joined:
    31 Aug 2006
    Posts:
    3,033
    Likes Received:
    97
    I have no immediate plans to upgrade my graphics card, but I will if the 9800GX2 starts showing signs of weakness in upcoming games. I play at 1920x1200, which the card handles brilliantly so far in the games I play, and the economic climate will not deter me from upgrading if I feel the need. Then again, I think I'll concentrate on my build instead. :D
     
  7. PQuiff

    PQuiff New Member

    Joined:
    7 Sep 2001
    Posts:
    557
    Likes Received:
    4
    LOL at the gfx card makers... that's what you get when you try to take your customers for a ride.

    Maybe if they left more time between new gfx card releases and spent more time on research, giving us a genuinely faster card rather than just a few percent, people would be more inclined to pay for a new one.
     
    DarkLord7854 likes this.
  8. yakyb

    yakyb i hate the person above me

    Joined:
    10 Oct 2006
    Posts:
    2,064
    Likes Received:
    36
    Yeah, come out with something new and powerful and I may buy it.
     
  9. Singularity

    Singularity ******* Operator from Hell

    Joined:
    2 Mar 2008
    Posts:
    583
    Likes Received:
    4
    Yeah, my first thought was, "Wait, were there any really new graphics cards in the last year or so?"
    And I agree about nVidia's "innovation" :D
     
  10. antaresIII

    antaresIII tephigram

    Joined:
    25 Feb 2009
    Posts:
    168
    Likes Received:
    4
    Waiting for the new 40nm ATI cards; once they arrive I'm going to build a new Core i7 system around one.
    In the meantime, sales are probably slower.
     
  11. DXR_13KE

    DXR_13KE BananaModder

    Joined:
    14 Sep 2005
    Posts:
    9,118
    Likes Received:
    363
    +1
     
  12. Xir

    Xir Well-Known Member

    Joined:
    26 Apr 2006
    Posts:
    5,249
    Likes Received:
    88
    I'll be upgrading from my X1800XT soon... but it's still playing everything at 1280x1024.
    (I'll be upgrading to a bigger monitor as well, though, so I'll need the grunt.)
     
  13. Comet

    Comet New Member

    Joined:
    21 Jan 2009
    Posts:
    27
    Likes Received:
    0
    We can't comment on this without talking about how the computer and graphics markets are tied to the PC gaming market.

    The PC gaming market is fine. The only difference from ten years ago is that the PC now has competition. From a pure technology perspective the competition isn't anything special; it's how the competition does business that's different. A couple of companies are both hyping their platforms and delivering a good out-of-the-box experience. This is what I've been saying all along, and what companies like AMD, Intel, NVIDIA and even Microsoft don't seem to listen to.

    PC gaming has always been the technology pusher. As consoles gain market penetration, developers don't invest as much in pushing graphics and physics, since consoles don't evolve as fast as PCs. Today's computer technology is way beyond the consoles. Windows 7 and the new DX11 cards bring a lot of new stuff, but whether developers invest in it in the short term depends greatly on how the market responds to the next Windows release. Look around and you'll see that most games still rely on DX9-style graphics. There aren't many games pushing hardware-accelerated physics or DX10-style effects. And we don't even need to mention those technologies: something as simple as bigger and better textures could be achieved on today's hardware, yet developers choose not to. Why? Because consoles can't handle the bigger textures that only 1GB cards can. Keep in mind how rare this is: we're in the second DX10 generation of graphics cards and these technologies still haven't had much penetration.

    The end result is that we can play almost every game out there with AA and AF at 1080p on a graphics card costing no more than $150. That isn't necessarily a bad thing, but it is bad for the companies creating the new tech: if people don't need anything better, they'll stick with what they have. Obviously, in the short term companies like NVIDIA and AMD make money from the chips they've put in consoles, but in the long run they lose money, because the customers who pay for the technology no longer see any need to buy better technology. Operating system makers such as Microsoft get hit too, because, let's face it, one of the reasons people upgraded their PCs to a new OS was to play games.

    The problem is that the companies that depend on the PC architecture for their money are also shy about uniting to give it the treatment it needs. A gamer mode built into Windows could have been made by now. A hardware-based anti-piracy system could already be implemented, just as it exists on the consoles. Or an agreement that every PC carries a label stating whether it is gaming capable; that doesn't necessarily mean it's more expensive, only that it has a graphics card capable of decent gaming performance. The Windows-promoted rating system is a good thing, but games need to carry those ratings on the box, or else it won't have much success. Today you can get a PC that is leaps and bounds faster and better than any console at an affordable price; the only real problems are marketing and presentation.

    The PC has the advantage of letting people pick different programs for what they want: Steam, X-Fire, Gamespy. But it wouldn't be bad to have an easy-to-use game mode built into Windows, just as there is on the consoles. Another point really is marketing. These companies could use the PC Gaming Alliance to market the PC as a good AND cheap gaming machine; they could share the costs and it would end up helping every single company.

    Final point: THE GAMING PC SHOULD NOT BE SEEN AS A DIFFERENT BEAST FROM THE CONSOLES. I keep seeing marketing, news and reviews that literally put the PC on one side and the consoles on the other. Sorry, but from my perspective a gaming PC is no different from a console. It's an entertainment machine that lets me watch movies, play games and so on. So why separate the two?

    These comments don't necessarily mean PC gaming is doing badly. It isn't. Not at all. It's very, very profitable. But as in every market, when competition appears, people will switch products.
    I think the connection is clear: some people switching to consoles + PC gamers who don't see the need for anything better = lower sales.
     
  14. wgy

    wgy New Member

    Joined:
    28 Jul 2008
    Posts:
    305
    Likes Received:
    15
    I'm running an 8800GTS 640MB.

    The price of an upgrade vs. the performance gain isn't worth it atm.
     
  15. DarkLord7854

    DarkLord7854 New Member

    Joined:
    22 Jun 2005
    Posts:
    4,643
    Likes Received:
    121
    Sorry, didn't read; giant blocks of text are hard to read and follow. :blah:


    On-topic though... I agree, releasing 60 versions of the same card doesn't exactly help the market. That, and there's so little difference between the cards except for the ultra high-end versions of each series. Why can't they just introduce three or four versions of a series and leave it at that?

    Meh.
     
  16. Xtrafresh

    Xtrafresh It never hurts to help

    Joined:
    27 Dec 2007
    Posts:
    2,999
    Likes Received:
    100
    My first thought was: "Conclusion: scare tactics work!"
    People have been told so many times that times are bad that they've started believing it and stopped buying.

    Then I thought... wait a minute! Q4 2007 was a much more interesting time to upgrade. Many people were moving from 6600s and 7-series cards to the 8800 series, and even more were moving there from the ATI side. It was a great, sensible upgrade that offered real performance increases, unlocking quality settings and resolutions that were previously unheard of.

    Now, though, we're still coasting on the fantastic action we had in the summer. Very few demanding games have come out since then, and current hardware is well able to play anything we can imagine throwing at it. Even a single sub-$200 HD4870 or GTX260 is very capable of handling anything you throw at it at 1920x1200. Even at the upper-extreme 30" resolution of 2560x1600 a lot of games are playable, and if you have the money for a 30" screen, you can just as easily pop in a second card for higher settings and super-playability.

    Why upgrade?
     
  17. perplekks45

    perplekks45 LIKE AN ANIMAL!

    Joined:
    9 May 2004
    Posts:
    5,754
    Likes Received:
    332
    Nice comment, Comet. One little mistake, though:

    We're actually in the 3rd generation of DX10 hardware, at least nVidia is [G8x, G9x, G200], whereas AMD is in its 2nd generation [R600, R700].

    Other than that, I don't agree with everything you said, but in parts I couldn't agree more. The part about consoles holding back games' development in particular.

    Let's just wait and see what they say in 2010 after the release of DX11/SM5.0 hardware. It'll be the same story as with Vista/DX10.
     
  18. nicae

    nicae New Member

    Joined:
    25 Nov 2008
    Posts:
    128
    Likes Received:
    0
    I am sooo gonna upgrade my 8800GT for a 9800GT or a GTS250!


    Not.
     
  19. War-Rasta

    War-Rasta New Member

    Joined:
    22 May 2002
    Posts:
    398
    Likes Received:
    0
    This sums it up perfectly.
     
  20. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,575
    Likes Received:
    189
    We need a hiatus, so the GPU makers can make more compelling cards, like back in the X1900 and 7900 days, when an upgrade from an X850 or a 6800 Ultra was something to marvel at.

    And we need more amazing games. Crysis is two years old and it remains the top benchmark; we need replacements!
     