
How it all shakes out

Discussion in 'Article Discussion' started by Tim S, 24 Jul 2006.

  1. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
  2. scq

    scq What's a Dremel?

    Joined:
    4 Mar 2005
    Posts:
    879
    Likes Received:
    6
    I don't like the idea of GPU/CPU integration. While it may be easier for mainstream consumers to be able to play new games with little fuss, the GPU would no longer be easily upgradable, unless you upgrade the processor as well.

    It would be like buying a $1000 FX with an integrated 7900GTX, and then having to buy another $1000 FX, which is only slightly faster, just to get an 8900GTX (or whatever it may be called by then).
     
  3. aggies11

    aggies11 What's a Dremel?

    Joined:
    4 Jul 2006
    Posts:
    180
    Likes Received:
    1
    Agreed. But the GPU-on-chip solution isn't really geared for the high-end "niche" market that Wil talks about. It's more for the low-end mainstream up to the middle of the market. These are the kind of people who don't upgrade often to begin with.

    Look at the transistor count of the current GPU kings. They dwarf CPUs in complexity. You can't reasonably expect to "sneak" that thing onto the CPU. It's arguably *more* complex. It's gonna be stand-alone for a while.

    The end of the article is actually the most interesting part, I think: the potential for this to make high-end gaming smaller, more niche.

    I think this is exactly the sort of thing Epic's CEO was talking about recently when he identified Intel as gaming's biggest threat.

    If they throw crappy GPUs on all their chips:
    - Everybody can play games now, TONS more people, and the gaming market explodes
    - These GPUs suck and can't play high-end games. If you're a business/developer, do you make your game for the 1 million high-end gamers, or the 100 million mass market? Bigger markets = more money. Gaming becomes dumbed down for the mainstream, and high-end gaming as we know it dies. I hope you like Sudoku... :p

    Aggies
     
    Last edited: 24 Jul 2006
  4. fev

    fev Industry Fallout

    Joined:
    13 Aug 2003
    Posts:
    0
    Likes Received:
    20
    i don't like the idea of AMD + ATi
    it's always been, at least in my eyes,
    green 'n' green (nvidia + AMD)
    blue 'n' red (figure it out)
     
  5. Da Dego

    Da Dego Brett Thomas

    Joined:
    17 Aug 2004
    Posts:
    3,913
    Likes Received:
    1
    But now it's christmas all year long!
     
  6. RTT

    RTT #parp

    Joined:
    12 Mar 2001
    Posts:
    14,120
    Likes Received:
    74
    I'm biased, but I thought this article rocked. Consider me informed!
     
  7. LoneArchon

    LoneArchon What's a Dremel?

    Joined:
    15 Jun 2004
    Posts:
    425
    Likes Received:
    0
    Well, one way they could take the tech is having a dedicated GPU socket, or making the GPU use the same socket as the CPU with a dedicated chip for it. AMD has talked about this, with co-processors plugging into the spare socket on a dual-socket motherboard and communicating over the HyperTransport channel. If they go down this road, it could make upgrades less expensive.
     
  8. Reins

    Reins What's a Dremel?

    Joined:
    18 Jul 2006
    Posts:
    51
    Likes Received:
    0
    I agree. I can't find these ads anymore, but I can remember seeing ads that said "...and remember, the 7900GTX runs best with an AMD processor", or something to that effect.

    Hopefully things won't shake out the way the article says they will for gamers, because the article makes me sad. :waah:
    Rising prices of GPUs = bad
    Gaming becoming more niche = bad

    Perhaps as time goes on there won't be any more high-end PC gaming, and all of the high-end gaming will be done on consoles. :shrug: I guess we'll always have them. Perhaps by that time I'll be out of my gaming phase.

    I can't wait to see what Nvidia does; they have a couple of options in front of them as to how they want to play this.
     
    Last edited: 24 Jul 2006
  9. K.I.T.T.

    K.I.T.T. Hasselhoff™ Inside

    Joined:
    1 Jan 2005
    Posts:
    624
    Likes Received:
    1
    Like a lot of the people who have already posted, I hope that what is suggested in the article doesn't come true under any circumstances, because I personally think it's a poor idea. Fair enough, it will make workstations and your average family systems more compact for the capability, but realistically, despite the minuscule number of high-end gamers, I don't think the companies could kill off high-end and bleeding-edge PC gaming. Besides, the graphics systems would still have to be around and be developed, because someone has got to make the chips for consoles. If it does happen, though, it will be a sad time, since everyone knows that to play a good FPS you can't beat a keyboard and mouse, and console gaming with a KB and mouse just isn't the same. Furthermore, even high-def TVs don't have the resolutions that computer TFTs and CRTs have, and require even more power to be spent on AA to make the game look half decent after being massacred by scaling.

    Nvidia don't have that many options... they have only two long-term futures as I see it. They can either continue producing graphics systems with few or no mobos to put them in, because all the other companies have gone integrated, and eventually be forced to close down or go integrated and mid-range like the rest... OR... they can start producing their own CPUs, mobos and graphics solutions that work together. But due to the incompatibility with other hardware, that would have to be very expensive, and they'd probably be forced into my 'alternative' future in the end if they didn't want to close down.
     
  10. Bursar

    Bursar What's a Dremel?

    Joined:
    6 May 2001
    Posts:
    757
    Likes Received:
    4
    Joy. So now my £200 mid range CPU becomes a £300 mid range CPU with a low end GPU tacked on...

    And if you thought processor naming was confusing now, wait until you have four variations of the same CPU that include different GPU cores.
     
  11. Jhonbus

    Jhonbus What's a Dremel?

    Joined:
    1 May 2005
    Posts:
    120
    Likes Received:
    0
    I'm not sure I like this much. Sure, I want AMD and ATI both to stay in the game for the sake of creating competition (In fact I use AMD and ATI at the moment.) But if this comes at the cost of sacrificing our ability to choose our processors and graphics solutions independently, it's a very bad thing.
     
  12. suicidal-kid

    suicidal-kid What's a Dremel?

    Joined:
    10 Mar 2006
    Posts:
    26
    Likes Received:
    0
    Well, as an ATI-Intel person, I'm out of luck.
     
  13. valium

    valium What's a Dremel?

    Joined:
    15 Oct 2003
    Posts:
    288
    Likes Received:
    0
    The High-End crowd is basically having their PC turned into an upgradeable console.

    When this takes off, I predict that AMD will decline even more in sales, due to this merger and its GPU/CPU unification no longer allowing us to choose our favourite brand of GPU.

    On another note, what does this say about ATi? Are nVidia's GPU sales really hurting them so badly that they need to be assimilated by a bigger CPU vendor just to keep breathing? Has Crossfire failed to succeed in what is now a multi-GPU world? (imo, yes it has)

    The question now is, will Intel forego its onboard GPUs and ally itself with nVidia? And if it does, how much control will the nVidia team have over its design/R&D/etc.?
     
  14. jjsyht

    jjsyht Hello, my name is yuri

    Joined:
    19 Jun 2004
    Posts:
    136
    Likes Received:
    0
    Same for me. AMD-nVidia.

    Personally I wouldn't mind having the CPU+GFX combination, where the GFX is comparable to a 7600GT. Since a 7300 can be considered already available integrated (the 6150 chipset), the 7600GT seems a good guess for CPU integration.

    EDIT: I just re-realised... it's AMD+ATI, so a mid-range ATI GFX, to the tune of 'I don't want an ATI GFX!!!'. A Conroe+7600GT in one package = :thumb:
     
  15. CAL3H

    CAL3H What's a Dremel?

    Joined:
    18 Mar 2006
    Posts:
    31
    Likes Received:
    0
    Bah, I'm a green+green person and don't like the idea of the merger. It looks like ATi have ended up just annoying Intel for the time being, and nVidia are left in the dark with respect to their AMD link: 'you make the CPUs, we'll make the chipsets and graphics'. Isn't this partly demonstrated in their refusal to provide Intel with SLI capability for the 965/975 boards?

    I know it's probably irrelevant here, but look at the Mac community too: the current MacBook Pro line contains X1600s. Future driver focus goes out the window, and the new line-up of MacBook Pros that emerges will have either integrated Intel or some nVidia chip in them. I know there have been a few Macs with nVidia inside, but the vast majority seem to have been ATi. If there really is going to be nVidia in the iPod (a rumour which may have come from this very merger originally being a rumour), then maybe Apple will switch to nVidia for serious graphics power. I see this as unlikely, however, as if Intel get their way I'm sure the Macs will all have Intel GPU/CPU combos. A Kentsfield modified to 2xCPU, 2xGPU: is that a possibility?

    The idea of a fixed CPU & GPU manufacturer partnership would take all the fun out of PC building: no more options (to a certain extent). I wonder how long this move has been in the making. For AMD it looks like a temporary checkmate. How soon will it backfire when Intel launch Conroe at an even lower price, with more cores, and nVidia give in to Intel's wishes? Conroe SLI on Intel boards: sure, nVidia lose some chipset sales, but they could capitalise on the SLI + Intel combination, smile at their integration into iPods (arguably the fastest-selling gadget), and find their way into the Mac market (which stays with one brand for a long period).

    AMD may have decided this one on an alcohol induced evening...
     
  16. Flibblebot

    Flibblebot Smile with me

    Joined:
    19 Apr 2005
    Posts:
    4,828
    Likes Received:
    295
    But this has already happened. The vast majority of gamers (and I'm not talking hard-core here, I'm talking the kind of people who buy a PC to do the accounts on, then buy the odd game here and there) are playing games with integrated graphics. Remember when The Sims 2 came out? BBs everywhere were inundated with people trying to get it to work on their crappy Intel integrated graphics.
    The kind of people who buy graphics cards are almost power users by default.

    I don't think we're going to see games developed only for the bottom end of the market - everyone likes a challenge, and that includes developers!
     
  17. K.I.T.T.

    K.I.T.T. Hasselhoff™ Inside

    Joined:
    1 Jan 2005
    Posts:
    624
    Likes Received:
    1
    Then again, you must bear in mind that no company, no matter how environmentally friendly, no matter how many baby foxes leap forth from its energy bills, no matter how big and ol'-nice-guy it is, is going to make a game 'as a challenge' in the knowledge that it will lose money... it just doesn't happen.
     
  18. specofdust

    specofdust Banned

    Joined:
    26 Feb 2005
    Posts:
    9,571
    Likes Received:
    168
    Well, like I said in the other thread, I'm worried. The article was great; it really explained why this was done and how it'll affect the relevant parties, to an extent. But there is plenty of unknowable stuff, and there are plenty of things I'm concerned about.

    The Nvidia nForce + AMD combination has produced some of the finest overclocking seen in recent years; will that now stop? ATi, from what I understand, have been a bit... well... crap at making mobo chipsets; will AMD now use them exclusively? For the few generations of tech that I've been into PCs, people have just had to choose their CPU, and the rest worked in any mobo. Things split a bit with SLI and Crossfire, but most of us don't use dual cards, so again that wasn't a huge deal. My major concern now would be that we start to see platforms, as Wil talked about, that lock you into partner technologies, or into other products from the same company.

    So much is up in the air. Even with all the talk, this came totally unexpected to me, and I imagine to many others. It'll be very interesting to see how the dust settles, but until it does I'll be worried that we may be seeing the end of an era of unparalleled flexibility and choice.
     
  19. r4tch3t

    r4tch3t hmmmm....

    Joined:
    17 Aug 2005
    Posts:
    3,166
    Likes Received:
    48
    I am also a green-green fan. Although Conroe is the king right now, AMD will come back. But now, with the merger, we will have less choice.
    I wouldn't have thought it would be AMD-ATI; if anything, I thought AMD-nVidia would be more likely, given the nForce chipsets.
    The GPU on a CPU: déjà vu. At first, CPUs were good for everything; then graphics became too much for them and the GPU daughter was born. Now it's moving back in?
    DON'T MOVE BACK IN WITH YOUR PARENTS!!!!!
    Well, that's my illogical $0.02
     
  20. jjsyht

    jjsyht Hello, my name is yuri

    Joined:
    19 Jun 2004
    Posts:
    136
    Likes Received:
    0
    True (for KITT's post), but currently only the lowest-end GFX are integrated. With OSes (Vista & Mac OS) demanding serious 3D capabilities, integration will involve 'better' GFX.
    If Intel keep pushing their crappy graphics chipsets, it's gonna kill the games market.
    If Intel improve their graphics offerings (e.g. CPU+GFX integration), it's gonna explode the gaming market, with proper GPUs capable of running new games at 'acceptable' GFX quality. Thus we can all have our one-time fix of NFS Most Wanted.

    As the article points out, it was the same for the co-processor, etc. At first they were too much for the CPU, but now they can be integrated. High-end GPUs will NOT be integrated (well... who knows), but the other GPUs could be, and hopefully will.
     
    Last edited: 24 Jul 2006