
How it all shakes out

Discussion in 'Article Discussion' started by Tim S, 24 Jul 2006.

  1. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    spec - I'm just waiting on answers to a few questions from NVIDIA at the moment. I might give Intel and a few others a call tomorrow and see what they have to say about the deal. :)
     
  2. RotoSequence

    RotoSequence Lazy Lurker

    Joined:
    6 Jan 2004
    Posts:
    4,588
    Likes Received:
    7
    Interesting view, Wil; it's an interesting vision, but I see things going slightly differently. x86's weakness right now is its singular, monolithic design. The movement has been towards multiple cores in order to enable higher performance via multiple data requests. Sun's Niagara did this, and its tons of simple cores seem to work pretty well. Now, Intel is hell-bent on pushing instruction parallelism. Why?

    From a raw executable standpoint, Intel's push doesn't appear to make sense. I've mentioned previously that no one is going to use more than four cores effectively with today's processing patterns; the rest will sit idle. However, the predicted next generation of processors is going to represent a technical left turn that others predict will be comparable to the introduction of the Pentium processor itself. The Inquirer brought up the fact that you can refresh a product a lot more often when you create a relatively simple, efficient little design that can be cut and pasted in different patterns to deliver higher performance. GPUs have been doing this since their inception, and Intel wants to cash in on it; monolithic CPUs take a tremendous amount of effort and years of research and development to conceive. Conroe, for example, probably took roughly five years to develop.
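    The "cut and paste" idea can be sketched roughly as below (a minimal illustration only - the eight-core count, the kernel and all names are assumptions, not any real product). The same small job runs on every core over its own slice of the data, so adding copies of the core adds throughput without redesigning the core itself.

        // Sketch: data-parallel work split across many identical cores.
        // Core count and the per-element operation are illustrative assumptions.
        #include <cstddef>
        #include <functional>
        #include <thread>
        #include <vector>

        // The "simple core" job: the same tiny operation on a slice of the data.
        void kernel(std::vector<float>& data, std::size_t begin, std::size_t end) {
            for (std::size_t i = begin; i < end; ++i)
                data[i] = data[i] * 2.0f + 1.0f;   // stand-in for any per-element work
        }

        int main() {
            const std::size_t cores = 8;             // assumed number of cores
            std::vector<float> data(1 << 20, 1.0f);

            std::vector<std::thread> workers;
            const std::size_t chunk = data.size() / cores;
            for (std::size_t c = 0; c < cores; ++c) {
                std::size_t begin = c * chunk;
                std::size_t end   = (c + 1 == cores) ? data.size() : begin + chunk;
                workers.emplace_back(kernel, std::ref(data), begin, end);
            }
            for (auto& w : workers) w.join();        // throughput scales with core count
        }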

    With the jump to 64-bit processing, we are no longer dealing with the imposed 4GB memory addressing limit. We can now fling information around right, left and center, and we are starting to see, in the console wars, integrated memory architectures shared between the graphics and the processor. Conceivably this will come to PCs too, and the RAM for the processor and the graphics will be one and the same. There won't be a memory wall between the two, either.

    Within a few years, graphics themselves will be sucked into the metaphorical black hole that is the CPU. This is where I start to disagree with you, Wil. CPUs aren't going to pull in a monolithic graphics chip just for the sake of unity; graphics itself will become x86-based altogether. I don't foresee a processor that contains a GPU; I see a processor that is so parallelized that the GPU concept can't hold a candle to the raw x86 processing power the new "general processing unit" would have to offer, thanks to Intel's parallelism efforts. The video accelerator won't be necessary because the processor will have the grunt to do all the work on its own.

    That's why AMD would need ATI to survive. AMD has great engineers, but they can't create parallel technology in such an extreme manner. ATI can; they've been doing it with graphics for years. Intel has been developing it for years the hard way; AMD is just going to buy its way in and get the job done more easily. It's much like Malcolm's line from Jurassic Park: "you stood on the shoulders of geniuses, and you took the next step". Why earn the knowledge the hard way? In business, doing things the hard way destroys you.

    It's a big gamble, but AMD/ATI could conceivably become hugely powerful over the next five to ten years, and Intel will be following suit. Where does this leave Nvidia? I honestly don't know; that is still up in the air. In the meantime, though, things are going to continue much as they always have - that is, until Core 2's successor, Nehalem, or the generation after it, sees the light of day.

    Sorry if this is a bit rambling :blush:
     
    Last edited: 25 Jul 2006
  3. Silver51

    Silver51 I cast flare!

    Joined:
    24 Jul 2006
    Posts:
    2,962
    Likes Received:
    287
    I may be alone in this, but I have a feeling that integrating a GPU onto the CPU instead of the motherboard won't change things that dramatically in the near future. Add-in graphics solutions with features such as dedicated memory should be in demand for serious gaming for a while yet.

    Actually, I’d be interested to know what Microsoft thinks about this.
     
  4. jjsyht

    jjsyht Hello, my name is yuri

    Joined:
    19 Jun 2004
    Posts:
    136
    Likes Received:
    0
    I think MS would like to have a proper GPU in every PC sold - no limit on the abilities of Vista.
    GPU/CPU integration may be for the far future, but in the near future isn't AMD just trying to sell a 'platform', like Intel?


    Imagine an integrated Conroe/7600 on a nano-ITX board. HD video, HD audio, Gigabit Ethernet - an above-mid-range gaming PC smaller than a laptop. :jawdrop:
     
    Last edited: 25 Jul 2006
  5. Reins

    Reins What's a Dremel?

    Joined:
    18 Jul 2006
    Posts:
    51
    Likes Received:
    0
    Lol, well put.
     
  6. DXR_13KE

    DXR_13KE BananaModder

    Joined:
    14 Sep 2005
    Posts:
    9,139
    Likes Received:
    382
    By this, do you mean that I could buy a... say... six-core Intel CPU and use all of them for processing power, with an extra graphics card for higher-end gaming; or use 5 cores for processing and 1 for low-end graphics; or 4 cores for processing, 1 for graphics and 1 for physics; or 5 cores for processing, 1 for physics and an extra graphics card for high-end graphics... etc.? Correct?

    Then this means this rocks: you could have 3 cores for the CPU and 3 cores for the GPU and have enough power to play just about any game. Kewl.

    Or even... 3 CPU cores, 1 PPU core, 2 GPU cores plus an uber graphics card. Could it work?

    This looks promising; it may be my future processor.
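    A rough sketch of how that kind of split could look in software (purely hypothetical - the six-core chip, the role names and the mix are assumptions for illustration, not anything announced): each core is simply handed a different workload.

        // Sketch: dividing a hypothetical six-core chip between general work,
        // graphics and physics. Roles and counts are illustrative only.
        #include <cstdio>
        #include <thread>
        #include <vector>

        enum class Role { General, Graphics, Physics };

        // Each "core" runs whichever workload it has been assigned.
        void run(Role role, int id) {
            switch (role) {
                case Role::General:  std::printf("core %d: general-purpose work\n", id); break;
                case Role::Graphics: std::printf("core %d: software rendering\n", id);   break;
                case Role::Physics:  std::printf("core %d: physics simulation\n", id);   break;
            }
        }

        int main() {
            // One of the mixes suggested above: 4 cores for processing,
            // 1 for graphics and 1 for physics.
            std::vector<Role> plan = { Role::General, Role::General, Role::General,
                                       Role::General, Role::Graphics, Role::Physics };
            std::vector<std::thread> cores;
            for (int i = 0; i < static_cast<int>(plan.size()); ++i)
                cores.emplace_back(run, plan[i], i);
            for (auto& c : cores) c.join();
        }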
     
  7. stephen2002

    stephen2002 What's a Dremel?

    Joined:
    23 Sep 2003
    Posts:
    183
    Likes Received:
    0
    I, for one, would not mind having some sort of super-chip with so much power that it can handle all of the fancy stuff that GPUs do now. I think it is going to take a long time to get there, and even when we do, there will always be room for add-in boards for the high-end folks. I'm thinking a lot more high-end than games - think graphics for films and other "give me as much power as possible and I'll show you why I need twice as much" industries.
     
  8. -EVRE-

    -EVRE- What's a Dremel?

    Joined:
    12 May 2004
    Posts:
    372
    Likes Received:
    1
    *scratches chin* Nvidia-based CPU? I like it!
     
  9. valium

    valium What's a Dremel?

    Joined:
    15 Oct 2003
    Posts:
    288
    Likes Received:
    0
    Something just dawned on me: nVidia acquired ULi not too long back. It kind of makes me wonder if nVidia is going to be tightening the thumbscrews to make upcoming ATi products scarce.
     
  10. K.I.T.T.

    K.I.T.T. Hasselhoff™ Inside

    Joined:
    1 Jan 2005
    Posts:
    624
    Likes Received:
    1
    I see where you're coming from and it makes sense; the only bit I've got a niggle with is the above. There are, as far as I can see, two main reasons why CPUs wouldn't have a chance in hell of replicating or replacing a mid-range GPU (the one in my sig rig is pretty mid-range now) as things stand. The first, as you said, is the memory (which has already been covered); the second is that a powerful GPU will completely smother a high-end CPU for complexity (look at the transistor counts) - in some cases it can be almost twice as complex.

    So unless there's an overall drop in computer graphics (which is completely nonsensical), I think the best thing to happen would be to integrate a socket on the motherboard for low- to mid-range GPU solutions (with memory shared with the CPU), or, as Roto suggests, to parallelise things and use spare cores to emulate the work of a dedicated GPU. Then, as long as there's a universal add-in slot (like AGP or PCI-E x16) for an upgradeable graphics system, everyone will be happy. Mass producers like Dell could churn out 'it'll do' run-of-the-mill systems that use the CPU-integrated or emulated graphics to keep costs, noise, power usage, size and so on down, and custom builders (like the good people on these forums, at bit-tech and other such communities) could take the same base components and slot in a high-end graphics solution. I know that would leave us with almost the same situation we have now, but unless there is a huge jump in CPU complexity you cannot expect one to do the work of a GPU well. The way a dedicated PPU's work can be done just as effectively by a GPU proves it: the memory interface and the complexity of the processing unit let it do what used to be the CPU's job so much more effectively.

    Sorry for another 'chain-of-thought' bomb, but I just had to say it :blush:
     
    Last edited: 25 Jul 2006
  11. Guest-16

    Guest-16 Guest

    I can definitely see some sort of CPU/GPU. After all, you already have HyperMemory cards that have local cache and then use main system memory; being on-die will reduce that overhead and let you connect the GPU directly to the memory controller. Suddenly you have a Vista-enabled integrated solution. Win-win.

    GPUs are exceptionally fast because they don't have to deal with x86 instructions; they have their own APIs and more specific instruction sets. They will become more general floating-point computers, but not x86 crunchers.
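    The contrast can be sketched like this (pure illustration, not any real GPU API - the pixel format and the 'shader' function are assumptions). The GPU-style job amounts to "apply this one small function to every element", which is easy to keep fast in hardware, whereas a general x86 core has to cope with arbitrary branching, addressing modes and legacy instructions.

        // Sketch of the GPU-style model: a fixed, narrow job run over every
        // pixel, rather than an arbitrary x86 instruction stream.
        // The pixel format and shading function are illustrative assumptions.
        #include <cstdint>
        #include <vector>

        struct Pixel { std::uint8_t r, g, b; };

        // A "shader": one small pure function with no dependence on global
        // state, so hardware can run thousands of copies of it in lockstep.
        Pixel shade(Pixel in) {
            return { static_cast<std::uint8_t>(in.r / 2),
                     static_cast<std::uint8_t>(in.g / 2),
                     in.b };
        }

        int main() {
            std::vector<Pixel> frame(640 * 480);   // one frame's worth of pixels
            for (auto& p : frame)                  // a GPU does this loop in parallel hardware
                p = shade(p);
        }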
     
  12. scq

    scq What's a Dremel?

    Joined:
    4 Mar 2005
    Posts:
    879
    Likes Received:
    6
    Good points. The fact is, if you tack crappier GPUs onto CPUs, or offer crappy hardware to "don't care to be informed" customers, PC gaming will probably end pretty quickly, as people will realise that games look better on cheaper consoles - and sadly, the PC enthusiast market just isn't enough to sustain PC game development on its own.

    As for the fear of ATI+AMD forever, I doubt that would come true. It would be suicide for AMD to force all AMD users onto ATI. Just look at Intel: just because they produce graphics chips (albeit integrated) as well as CPUs, it doesn't mean your Pentium D or soon-to-be Core 2 Duo won't run off an nVidia/VIA/SiS/ATi etc. chipset.

    I think that things can only get better. With the technologies of two of the biggest CPU and GPU companies combined, we're bound to see some innovation - FO SHIZZLE.
     
  13. Nature

    Nature Minimodder

    Joined:
    21 Nov 2005
    Posts:
    492
    Likes Received:
    1
    I agree with RotoSequence,

    Unification and simplification of the physical hardware in computers is, and has been, the direction of IT and PCs. I think that a motherboard five years from now will have three different physical interfaces: CPU, RAM, and multipurpose expansion slots (something like PCI slots). I'm not suggesting dependency on onboard VRAM or a system-wide RAM turbo-cache, but more what RotoSequence was saying about the evolutionary destiny of the CPU: it just needs to do everything... including render superb graphics.

    When the boys and girls from ATi and AMD emerge from their "think tank", they will have taken back the performance crown... rest assured.

    To me, this abrupt merger between the Canadians and the Californians is nothing more than the product of weak executive support and terrible marketing from AMD, and dead-end results from ATi's single-GPU mindset. Declining stock and being democratically "consumered out" could be attributed to a lack of vision from both companies.
     
  14. K.I.T.T.

    K.I.T.T. Hasselhoff™ Inside

    Joined:
    1 Jan 2005
    Posts:
    624
    Likes Received:
    1
    Yeah, I see where you're coming from, because more and more I've noticed/realised that the CPU is no longer really a central processor, more a central co-ordinator: all the tasks which are hard work for a CPU have been handed out to individual processing units, so each can be optimised for the relatively small set of instructions it has to cope with.

    Nature is right, things do need to be brought back together now. I guess you could look at this as having been the development period for what's going to happen next: you get all the separate pieces to work well, then you put them together, and if you pull it off the total will definitely be greater than the sum of the parts, since you'd be removing a lot of the overheads that cause delay in the system.

    I think it'll be a while before we see a CPU able to do the work of a dedicated GPU, unless some clever person 'out there' manages to make the instructions for graphical processing a lot more efficient for the processor (instead of the processor for the instructions, as it is now) and reduces the complexity.

    EDIT: sorry, it's another big one
     
  15. Guest-16

    Guest-16 Guest

    No, because graphics requires a specific instruction set to stay fast; CPUs and x86 instructions are very inefficient for it. GPUs will become more general-purpose, but they won't be replaced wholesale by CPUs as some x86 behemoth. CPUs might get more execution units and smaller processing elements, which make them easier to build and expand like GPUs, but both will remain essentially separate entities even when on the same die. You might even get cache and memory-controller sharing, for example.
     
  16. Bursar

    Bursar What's a Dremel?

    Joined:
    6 May 2001
    Posts:
    757
    Likes Received:
    4
    No one knows how it will work at the moment; it's all pie-in-the-sky talk. However, it will take more than some fancy hardware design to make it all work.

    WinXP Home EULA:
    WinXP Pro EULA:
    My bolding for emphasis. I don't think the EULA for Vista is available yet (at least not in its final form). Of course, MS might not see multiple cores as the same thing as multiple physical CPU chips. But don't be surprised to find the word 'processor' replaced by the word 'core' in future EULAs...
     
  17. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    We had this discussion a few days ago in the thread for the last column we wrote. Microsoft allows you to have 'as many' cores as you can pack into one socket, but multiple processors are a different thing altogether.
     
  18. K.I.T.T.

    K.I.T.T. Hasselhoff™ Inside

    Joined:
    1 Jan 2005
    Posts:
    624
    Likes Received:
    1

    I thought that was the case, but judging by other people's posts they were suggesting (at least the way I read it) that we were going to see multiple general-purpose processors on one die that could do anything - physics, graphics, AI, x86 instructions, etc.

    Therefore, is it not sensible to have another socket on the motherboard for the graphics processor - which will still be a separate entity even if it shares a die with the CPU - so that upgrading is a lot easier? But that runs into the problem of needing either separate or shared memory: if you got them to share, say, x GB of GDDR3 or GDDR4, you'd still be stuck with a bottleneck at the memory controller. Which brings it back to putting them on the same die - but if you do that, you have to replace both CPU and GPU at the same time, which is a bit silly if I'm honest.

    This is going to be interesting...
     
  19. Bursar

    Bursar What's a Dremel?

    Joined:
    6 May 2001
    Posts:
    757
    Likes Received:
    4
    Aye, I found that info on MS's site not long after I posted. I don't frequent all the forums here so I didn't see the other discussion.

    But I still wouldn't be surprised to see the EULA change - maybe not with Vista, but with whatever comes after.
     
  20. Guest-16

    Guest-16 Guest

    Well, as stated in the article, stuff like AI/physics is a VERY specific market. For most users, TurboCache/HyperMemory-style graphics using main system memory is more than fast enough, and that's where most of the market is: integrated. But this will enable AMD to get back at Intel's integrated market by having something fast, cheap and on-chip that is Vista-enabled. There will always be a market for dedicated graphics cards, because you cannot physically fit 512MB+ of GDDRx memory onto a CPU die as well, and the high-end GPUs are massive cores which require 8-10 PCB layers rather than the 4-6 of a motherboard. At the moment it's one major area where NV have placed themselves and make some money, so they will be keen to keep it, and Intel will want to keep it because Conroe and its successors will/should be good gaming processors.

    Admittedly, I haven't read everyone's replies though.


    The EULA won't change; MS has stated its per-socket position for years.
     