
Hardware Nvidia Analyst Day: Biting Back at Intel

Discussion in 'Article Discussion' started by Tim S, 14 Apr 2008.

  1. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
  2. Spaceraver

    Spaceraver Ultralurker

    Joined:
    19 Jan 2006
    Posts:
    1,363
    Likes Received:
    5
    So we have another war on our hands.. Good.. That means nice prices..
     
  3. p3n

    p3n What's a Dremel?

    Joined:
    31 Jan 2002
    Posts:
    778
    Likes Received:
    1
    If Nvidia keeps up this retarded product nomenclature/progress, I hope Intel squashes them.
     
  4. r4tch3t

    r4tch3t hmmmm....

    Joined:
    17 Aug 2005
    Posts:
    3,166
    Likes Received:
    48
    Hmm, that bit about the video encoding on the GPU sounds very interesting. My current choice in CPU is based heavily on the fact I will be encoding DVDs fairly regularly.
     
  5. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    Yeah, it's something I'm really excited about. I've been a big advocate of quad-core for its media encoding capabilities... but when you can speed that up by 10-20x with a GPU, why would you want to buy a quad-core CPU? That's an order of magnitude faster, and it's only going to get faster and faster with more SPs. :)
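    To put that claimed 10-20x in perspective, here's a quick back-of-envelope in Python (the encode time is purely illustrative, not a benchmark):

        # Hypothetical figure: assume a quad-core CPU takes 60 minutes to
        # encode a two-hour DVD, then apply the claimed GPU speedups.
        cpu_minutes = 60  # assumed, for illustration only
        for speedup in (10, 20):
            print(f"{speedup}x -> {cpu_minutes / speedup:.0f} minutes")
        # 10x -> 6 minutes
        # 20x -> 3 minutes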
     
  6. Kipman725

    Kipman725 When did I get a custom title!?!

    Joined:
    1 Nov 2004
    Posts:
    1,753
    Likes Received:
    0
    The man speaks the truth for the moment; I doubt it will be the case for many more years, though.
     
  7. LeMaltor

    LeMaltor >^_^

    Joined:
    3 Oct 2003
    Posts:
    2,103
    Likes Received:
    27
    Can someone explain what Fab, IP, IA and sandbagging mean please? Thanks >_<
     
  8. Blademrk

    Blademrk Why so serious?

    Joined:
    21 Nov 2003
    Posts:
    3,988
    Likes Received:
    86
    Fab = Fabrication
    IP = Intellectual Property
    IA = Intel Architecture (x86?)
    sandbagging = delaying releases / products
     
  9. chicorasia

    chicorasia What's a Dremel?

    Joined:
    8 Jan 2008
    Posts:
    84
    Likes Received:
    0
    Just for kicks, I ran a benchmark on my parents' Dell - a Pentium Dual-Core at 3.00GHz with 1GB of RAM, nothing fancy, a basic productivity machine.

    Using the integrated Intel GMA X3100: 303 3DMarks
    Using a discrete GeForce 7300GT: 2,100 3DMarks

    I didn't run it using a GeForce 8800GT, but I'd expect that to reach at least 10,000 3DMarks.

    Oh well, a tenfold increase in gaming performance over the next few years won't be enough....

    <RANT>My MacBook has a GMA950 integrated graphics chipset. Apple claims I can use it to drive an external monitor at 1920x1200. I have it connected to a 22" monitor at 1680x1050, and the image is chock full of artifacts and rendering errors, even now as I am simply browsing the web. The same thing happens on a newer MacBook (GMA X3100) and on an Intel Mac mini (GMA950). Surprisingly enough, an older PPC Mac mini, with an integrated Radeon 9250, has no problems driving this monitor.</RANT>
     
  10. johnmustrule

    johnmustrule What's a Dremel?

    Joined:
    12 Jan 2006
    Posts:
    345
    Likes Received:
    3
    I like the competition, but Intel's got a mountain to climb. I think AMD's and Intel's solutions are going to be similar, which will be bad for Nvidia; but if Nvidia establishes a physics model with AMD, then they might stand a chance of swinging users to adopt the "GPU physics" option. That's still hard to justify considering everyone under the sun is still going to have a CPU, although they'll need a GPU to play any reasonable games. As far as Mental Ray goes, it's amazing: I just started using it in 3ds Max, and it's fast and high quality! As a CGI artist (a novice at best), I'd say water reflections are going to be the biggest beneficiary of raytracing, but subdivided surfaces, physical fluids and realistic hair are the most important improvements Nvidia discussed. Or maybe in three years Mental Ray will be efficient enough to run on future hardware.
     
  11. Cupboard

    Cupboard I'm not a modder.

    Joined:
    30 Jan 2007
    Posts:
    2,148
    Likes Received:
    30
    Last edited: 14 Apr 2008
  12. frontline

    frontline Punish Your Machine

    Joined:
    24 Jun 2007
    Posts:
    825
    Likes Received:
    12
    The problem for Nvidia at the moment is that, despite being rivals in the CPU manufacturing business, Intel and AMD/ATI appear to have a cosy relationship.
     
  13. Jojii

    Jojii hardware freak

    Joined:
    12 Dec 2007
    Posts:
    122
    Likes Received:
    1
    Care to describe said hints? Mondays call for juicy speculation.
     
  14. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    The actual details of chip specifications are pretty scant, but the graph on page four titled "Era of Visual Computing" suggests that we're going to see a respectable performance boost with the new generation. I also believe we'll see some new features added: things like C++ support in CUDA (2.0), some form of tessellator (or subdivision surfaces) engine, and some optimisations to enable geometry synthesis and more realistic hair.

    There were some more, but these are the ones that I remember because they were very obvious. The other hints came from the way certain things were said by Huang, Tamasi and Hegde in particular. :)
     
  15. xtremeownage

    xtremeownage What's a Dremel?

    Joined:
    27 Jan 2008
    Posts:
    13
    Likes Received:
    0
    Why Intel would say integrated graphics are better is beyond me... I for one hate it when companies decide to lie to consumers....

    Integrated graphics don't run games like BioShock or Crysis at Medium settings, and definitely not at High or Very High. If they do, the frame rates will be as poor as shown in the figures. They won't be able to run Crysis on High in the next 3 years. Anyone who purchases this garbage from Intel's attempt to enter the graphics market is just decreasing the number of real next-gen graphics engines that will be running Half-Life 3, Crysis 2 and 3, and god knows what amazing games are around. If it weren't for Nvidia you wouldn't have had the Xbox with Halo and Halo 2, plus the Nvidia RSX GPU in the PlayStation 3! Besides, why buy integrated graphics for games you won't be able to run? ^^

    I'm so disappointed some people decided to take Intel's side. My only assumption is that they probably don't own a graphics card or don't play games. I love Intel for its processors, not its graphics. Companies should specialise in what they are good at instead of butting in and making things messy in an already loved and advanced industry. (CPU: 4 cores ----> GPU: 128 cores!)

    For Intel to point the finger at discrete graphics cards and say they are not the future is like saying, OK, let's scrap all the visual goodness and go back to playing shoddy games without graphics on DOS machines. Those words spoken by Intel at the conference really annoyed many, and I'm glad the Nvidia CEO gave Intel the response they surely deserved. I'm surprised ATI hasn't said anything... their GPU is in the Xbox 360, and it runs Half-Life 2: Episode Two on PCs really well.
    It's hard enough to give gamers the real visual experience already, so please, Intel, go away.... I'm so annoyed now. I'm disappointed the Cell processor in the PS3 wasn't brought to desktops, because I would have scrapped Intel's junk. (It's good junk, but old compared to the PS3's processing power.)

    I can already see it. Noobs brainwashed into buying this Larrabee (or whatever Intel's IGP/CPU is called).... They go into the shop, get a PC and try to play Crysis or Crysis 2, ahahaha, then say WTF! After a night of cursing and ranting, they return the PC after 2 days and ask for their money back. Next, they remember reading this post and go out and buy a GeForce 10,000GT, lol. Man oh man. They eventually realise their screen explodes into life; absorbed by the rich goodness, the stunning shadows, it's like being in heaven, he or she says, eyes wide open, mouth dribbling.

    If you are really thick-headed after reading what I have said, you would go out to the shop and buy an Intel integrated graphics motherboard or PC to play BioShock in DX10 (High settings).... For all our sakes, don't.

    (Sorry for my English if it is bad; it's not my first language.)

    YAYAYA, my GPU is a 9800GX2 (birthday gift, ahaha) for now... eat that, Intel. Beat this f**ker's frame rates on High settings in the next 2 years and I'll kill myself.
    Listen to this guy: "PCs are not for gaming." (Yes, that's true... PCs are for making games! So we still need the graphics cards to make the games, dumbass.)
    PCs are for making games for consoles etc. Consoles rock. PC gaming is for hardcore gamers playing large-scale strategy games like Galactic Civilizations 2 and Supreme Commander with a mouse for quick actions, plus rich, high-end graphics that surpass the console versions. The PC still remains and will continue until they make a console with an upgradable GPU and a mouse.
     
  16. Anakha

    Anakha Minimodder

    Joined:
    6 Sep 2002
    Posts:
    587
    Likes Received:
    7
    Again, little mention that ray tracing + CUDA = massive win! Ray tracing is a very simple algorithm that is naturally parallel, and the more cores (or in this case, "stream processors") you can throw at it, the better.

    With 128 "cores", each tracing a single ray, on a 1600x1200 screen with 10 "bounce" rays per pixel, you would only need a 9MHz clock per core to sustain a 60Hz display: (1600 × 1200 × 60 × 10) / 128 = 9,000,000 rays per core per second. Considering most GPUs run in the hundreds of MHz (if not thousands), that leaves room for a VERY detailed scene with LOTS of reflection. All you'd really need is a way to use that ray-traced image directly on the card (i.e. output from the stream processors straight to the framebuffer to be rendered) and you have real-time ray tracing OOTB.

    I'd put a dollar down for that. Anyone else?
     
  17. EmJay

    EmJay What's a Dremel?

    Joined:
    28 Jun 2007
    Posts:
    316
    Likes Received:
    0
    This reminds me of the sparring between Boeing and Airbus a few years back - Boeing talked up the size of their largest planes, and effectively goaded Airbus into one-upping them by making something even bigger. As soon as Airbus had sunk too much money into it to pull back, Boeing scrapped all their plans and announced that they'd be focusing on fuel efficiency instead - an extremely popular choice with the airlines. +1 for Boeing.

    I'm wondering if Intel is doing the same thing here: making lots of noise about ray tracing, Larrabee, and the death of the GPU, coaxing nVidia into making even bigger (and more expensive) graphics cards, and then quietly making something totally different. People are screaming that current integrated graphics are worthless for gaming, and that even a 10x improvement won't be enough, but let's not forget that the technology for decent graphics already exists. It's not like Intel has to research everything from scratch; they just need to find a version that doesn't infringe anyone's IP and start building it into their systems. Still not an easy task, but I'll bet they're going to push nVidia into catering to the gaming market (which is only a fraction of the computing industry as a whole), while they quietly eat up everything else. The GPU may still exist, but I'm guessing the 'average computer' won't include one in another five years.
     
  18. xtremeownage

    xtremeownage What's a Dremel?

    Joined:
    27 Jan 2008
    Posts:
    13
    Likes Received:
    0
    Like I said, EmJay, the technology has been around for as long as I can remember, but the point I'm making is: unless the GPU is on a separate card rather than integrated, integrated graphics are not for gaming.

    The games that will come out in the future will push Nvidia and ATI to create better GPUs for system vendors like Dell, Apple and Alienware to put in their systems. The average low-end PC from Dell today is about $500, and it has integrated Intel graphics. If you purchase this machine, either (1) your budget was $500 or below, or (2) it is an office PC that never uses the integrated graphics... no one buys integrated graphics and uses it for the latest games. As a consumer in this industry, I can already see that integrated graphics will need as many as 128 cores to match today's GPU power. Playing games on a future IGP in, say, 2 years would mean 4 cores x 10 = 40 cores: not sufficient.

    The whole point of this argument between the two is Intel saying you don't need GPUs and should use integrated graphics instead, which is absurd considering the looooooooooow performance you get from it.

    I'm guessing many-core processors from Intel will be revealed much, much later, though there is a prototype in place. At the pace GPUs are going, game developers will opt for GPUs over integrated graphics. ATI's GPU has over 400 cores: the ATI Radeon™ HD 3870 X2, though outperformed by the Nvidia 9800GX2, is so advanced that I don't see Intel catching up in 5 years, because by then GPUs will cross the 1,000-core mark, processing teraflops of data and producing the most sophisticated graphics and physics ever seen. With GPU manufacturing processes shrinking, even Intel's 80-core prototype will have a hard time pulling off the spectacles that ATI and Nvidia cards will be producing. The world economy is growing, and every decade it becomes cheaper for the average user to buy a GPU, thanks to rising incomes and shrinking manufacturing processes that cut costs.
    Other than buying PCs for office work, people will buy handheld devices like an iPhone 2, lol: plug in a wireless keyboard and a USB monitor, run word processors and spreadsheets, and do whatever work we please. See Nvidia's APX 2500:

    http://www.nvidia.com/object/hh_games_demos.html

    Buying a PC with furious processing power would be a leisure thing. I myself don't see the need for PCs with Intel integrated graphics. In fact, laptops today using Nvidia GPUs make PCs with integrated graphics look ancient. Handhelds will be for office work... PCs will be for playing amazing games. Consoles may replace PC gaming, and maybe graphics vendors like Nvidia will provide the horsepower for them. I think the motherboard will soon shrink in size; the GPU could be the size of my wallet, who knows. THE POINT IS the GPU will LIVE ON, and integrated graphics WILL ALWAYS SUCK and won't be needed once most people can afford a GPU.
     
  19. Bladestorm

    Bladestorm What's a Dremel?

    Joined:
    14 Dec 2005
    Posts:
    698
    Likes Received:
    0
    Every time I think maybe the GPU market has finally settled to the point where I can feel justified in buying a new GPU (primarily to finally play BioShock, let's face it), I hear of something big coming out in another month or two and I put it off again.

    It's getting to me a bit. A contributing factor, I think, is that when I put my current PC together, the 7900GS came out between me buying my parts and actually finishing the build (I had to mod the case a bit to get the watercooling sorted first, and my dad had an accident with a too-long screw, leading to a radiator repair that delayed things by a couple of weeks), and it delivered something like 30% more performance for a chunk less cash (and given the GT was £220 at the time, that one stung a fair bit!).

    Right now I'm very tempted both by a significantly factory-overclocked 8800GT 512MB for £129 and by a somewhat overclocked 8800GS 512MB for £165, both of which seem like nice value... but if they are only good value because something much better (and/or more forward-looking) is coming out in a month or two, it might all be a false economy >.<
     
  20. r4tch3t

    r4tch3t hmmmm....

    Joined:
    17 Aug 2005
    Posts:
    3,166
    Likes Received:
    48
    Doesn't quite work like that. You would definitely need more than one clock to calculate the vector of the reflected bounce, plus how much of the intensity should be kept and how much dispersion it needs. And that would only be the ray tracing: you still have to calculate where everything is in 3D space so the rays can be traced against it (although that may be done by the CPU). So let's say it takes 50 clocks to calculate a bounce: that means you would need around 450MHz, call it ~500MHz, and that's just for the ray tracing; you still have to render the geometry etc.
    Don't quote me on this, as I don't have a deep understanding of how ray tracing works (I know the basic idea) or of how it would affect other areas of graphics processing. If ray tracing were as easy as (resolution × refresh rate × bounces) / (number of stream processors × clock speed), it would have been done years ago instead of building raster engines to do the work.
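    As a rough illustration of that revised estimate (the 50 clocks per bounce is a guess, as said, not a measured figure):

        # Same back-of-envelope as above, but charging ~50 clocks per ray
        # instead of one ray per clock (an assumed cost, not a measured one).
        width, height, fps, bounces, cores = 1600, 1200, 60, 10, 128
        clocks_per_ray = 50  # assumed

        clock_hz = width * height * fps * bounces * clocks_per_ray / cores
        print(f"{clock_hz / 1e6:.0f} MHz per core")  # -> 450 MHz, i.e. ~500MHz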
     