
Graphics Oh my! What have I done?

Discussion in 'Hardware' started by Jux_Zeil, 16 Oct 2010.

  1. GoodBytes

    GoodBytes How many wifi's does it have?

    Joined:
    20 Jan 2007
    Posts:
    12,300
    Likes Received:
    710
    Forget what I said... I refreshed my Windows Experience rating and got 7.2 on my GPU.
    I guess the earlier score was a bad test, or the last time I ran it (I honestly don't recall) was when I installed Windows 7 and probably didn't have the latest graphics card drivers (the Win7 built-in drivers?!).
     
  2. MSHunter

    MSHunter Minimodder

    Joined:
    24 Apr 2009
    Posts:
    2,467
    Likes Received:
    55
    The 8800 was the first card to support CUDA and, correct me if I'm wrong, but PhysX is a CUDA program.

    Or was it the first for folding?
     
  3. Rofl_Waffle

    Rofl_Waffle What's a Dremel?

    Joined:
    24 Mar 2010
    Posts:
    504
    Likes Received:
    12
    Any card from the 8600 up can support PhysX.

    PhysX is basically a marketing gimmick. It's not the only physics engine. It's not the most popular physics engine. It's not the most advanced physics engine. In fact, it's the most inefficient physics engine.

    Except for the ten or so games Nvidia paid to feature PhysX, nobody in their right mind would actually make their game use PhysX. No online game can rely on it either, because otherwise over half the PC market and the entire console market couldn't run it.

    Not to mention PhysX drops your frame rates like crazy, because GPUs suck at physics calculations, so you have to use dedicated cards. Meanwhile everyone is buying quad-core computers with multi-threading: much more powerful cores sitting right there, waiting to crunch those numbers.

    PhysX doesn't account for even 1% of physics implementations.

    The only reason Nvidia advertises it is so you'll stick with Nvidia cards and use your old one for PhysX, even though it has little to no benefit. That's also why they locked out dedicated PhysX if you have any ATI card in the system: just to make sure you buy from Nvidia only.

    Nvidia still makes good video cards, though. I just have to trash their marketing ********. I wouldn't put a second card in my computer for PhysX, as it's only good for producing heat until I'm playing one of the ten single-player games that feature PhysX.
     
    Last edited: 17 Oct 2010
  4. GoodBytes

    GoodBytes How many wifi's does it have?

    Joined:
    20 Jan 2007
    Posts:
    12,300
    Likes Received:
    710
    Actually, Rofl, you are wrong.
    It has been demonstrated (and not by Nvidia) that GPUs are incredibly fast at advanced math such as physics. They were never designed for it; it was simply discovered. That is why we have OpenCL and CUDA. To calculate physics, you use math, and all Nvidia provides is a language that makes it easier to implement than doing things manually (I can't comment on other systems).

    In fact, the next version of MATLAB will use the GPU to accelerate the calculations it performs.
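To illustrate the data parallelism being described here, a minimal Python sketch (my own, not Nvidia's API). It uses NumPy on the CPU as a stand-in for a GPU kernel; libraries such as CuPy expose the same array API backed by CUDA. The point is that one physics update is applied to every particle at once, which is exactly the shape of work GPUs chew through.

```python
import numpy as np

# Illustrative sketch only: one Euler integration step applied to
# every particle simultaneously. NumPy runs this on the CPU, but the
# same whole-array expression maps directly onto a GPU kernel, which
# is why GPUs turn out to be so fast at this kind of physics math.

def step(pos, vel, dt=0.01):
    """Advance all particles one step under gravity, in parallel."""
    g = np.array([0.0, -9.81, 0.0])
    vel = vel + g * dt           # same add for every particle
    pos = pos + vel * dt         # same multiply-add for every particle
    return pos, vel

n = 100_000
pos = np.zeros((n, 3))
vel = np.zeros((n, 3))
pos, vel = step(pos, vel)        # 100,000 particles updated at once
```

The same loop written one particle at a time is what a single CPU core would grind through serially; the array form is the part a GPU can spread across thousands of threads.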

    The problem with Nvidia's idea of PhysX is that it's exclusive to Nvidia GPUs.
    Another problem is that Nvidia paid game studios to implement it, which is also a mistake. Why would a game studio implement it on its own when it can just wait until Nvidia comes in with a briefcase full of money?

    What Nvidia failed to do is demonstrate, and convince gamers, that PhysX fills a need and that there is demand for it, so that studios actually implement it.
    All Nvidia did was show you a game with PhysX next to a game without any physics at all, instead of showing how much faster the game runs with PhysX compared to other systems, or what it can actually bring to the gaming environment. So far there is not one recent game where I said, "I wish that when I hit a box, the box broke really well", because that is already implemented on the CPU.

    Now it's too late, I think. With quad cores, or even dual cores, games can run smoothly and still leave the CPU enough room to handle this. It's a bit like dedicated sound cards. In the old days, a dedicated sound card was a serious performance increase, as the CPU did virtually nothing for the audio side of the game. But that changed when the sound card manufacturers were too busy partying instead of continuing to focus on game improvement, and the result is that when dual-core processors came along, no gamer who cared about sound quality needed one.

    Another problem is the proprietary system used. Because PhysX is exclusive to Nvidia and doesn't run on anything other than Nvidia GPUs, game manufacturers see no point in implementing it. Why should Nvidia get the monopoly? Same for the Creative EAX sound engine... we all know it's equally a gimmick. If Nvidia genuinely did its best to make PhysX API calls fall back properly to the CPU when the feature is turned off or there is no Nvidia GPU, THEN Nvidia would have a chance to grow, especially if it made PhysX much easier to implement than any other solution. Opening up and giving the system to ATI would be best for Nvidia, as Nvidia could show that it runs better on their GPUs. But obviously Nvidia doesn't want to do this, as it would launch another sub-war with ATI, and it would be humiliating if ATI released, on day one, a card that massacred Nvidia's own cards at PhysX.
     
  5. Jux_Zeil

    Jux_Zeil What's a Dremel?

    Joined:
    30 Apr 2009
    Posts:
    493
    Likes Received:
    17
    It's the architecture of the GPU that counts, not the model number of the card's reference design. We all know that nVidia loves to confuse us with its model-naming tactics.

    Any CUDA-capable card can support PhysX. It's just that nVidia have taken it upon themselves to make people buy a mid-range card in order to be able to use it now.


    The same could be said of the touch screen on the iPhone. It's a gimmick I can live without, but try telling that to the people who love them and are surgically attached to them.


    There are a few more than ten, and if you check I'm almost positive that PhysX is implemented in some Xbox 360 and PS3 games as well. Online games can use it, because the packets of data sent over the net or LAN are just points of reference; it's up to the host machine to draw the game to those points. You wouldn't have two monitors right next to each other to check that every little fragment of the car you just destroyed split in exactly the same way, so as long as it blew up spectacularly on each monitor, who's counting?
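A hypothetical sketch of that point (illustrative only, not any real engine's netcode): the state that goes over the wire is just the reference points, while each client scatters its own cosmetic debris locally, so the fragments need not match between machines.

```python
import random

# Hypothetical example: replicated state is shared by every client;
# cosmetic debris is generated locally and may differ per machine.
replicated = {"car_pos": (120.0, 0.0, 45.0), "car_destroyed": True}

def spawn_debris(local_seed, count=20):
    """Scatter debris around the replicated explosion point using a
    client-local RNG; only the explosion itself is authoritative."""
    rng = random.Random(local_seed)
    x, y, z = replicated["car_pos"]
    return [(x + rng.uniform(-2, 2), y, z + rng.uniform(-2, 2))
            for _ in range(count)]

client_a = spawn_debris(local_seed=1)  # two clients, two seeds:
client_b = spawn_debris(local_seed=2)  # same explosion, different bits
```

Both machines agree the car blew up and where, but each draws its own shrapnel, which is why client-side physics effects don't break an online game.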

    As GoodBytes has said, the GPU is actually the best maths processor to run such intensive code. If you get the Custom PC mag you will have read the review of Mafia II, covering the difference a dedicated PhysX GPU makes to a game's graphics and how much PhysX can cripple the CPU and frame rates without one (even on an ATI-nVidia setup).

    As for the 1%, I bet that of all the workstations crunching frames for the latest big animated movie, only a handful don't use nVidia Quadro setups. They are still considered among the best extreme high-end systems you can get for graphics-intensive and simulation (physics- and logic-based algorithm) programs. Shame I can't afford £15,000 for a Quadro SLI box. :waah:

    Tell that to the chap who lost 14 fps with his ATI-nVidia setup. I can't remember where I read it now, but he wasn't happy to just turn PhysX off, so it must make quite a difference to him too. It turned out his cracked drivers wouldn't install over the official ones until he deleted them from the cache.
    I also like the fact that when I rip/convert/edit movies, the extra card helps get the job done quicker.
     
  6. Ph4ZeD

    Ph4ZeD What's a Dremel?

    Joined:
    22 Jul 2009
    Posts:
    3,806
    Likes Received:
    143
    It really doesn't change the fact that the only reason Mafia 2 has PhysX is due to nVidia contributing huge sums to the development of the game, and it makes 0 difference to the gameplay. Crysis and BC2 have physics which dramatically affects the gameplay but *shock* the developers saw no reason to add PhysX.
     
  7. Tangster

    Tangster Butt-kicking for goodness!

    Joined:
    23 May 2009
    Posts:
    3,085
    Likes Received:
    151
    Why on earth would you need Quadro SLI in a render box? All those workstations crunching frames for the next movie use minimal graphics and focus hugely on RAM and CPU performance. It's likely that most of them are multi-Xeon/Opteron rigs.
     
  8. nightblade628

    nightblade628 Minimodder

    Joined:
    10 Dec 2009
    Posts:
    263
    Likes Received:
    11
    PhysX is the catchphrase of the day today; tomorrow it will be Raytracing. Half the job these companies have is searching for the next Holy Grail of computing, the other half of the job is convincing consumers that they should care just as much. My gaming is perfectly okay without DX11, or PhysX, or raytracing - would it be nice if all of these could be perfectly implemented right now? Yes. Is my gaming experience destroyed without them? Of course not.
     
  9. GoodBytes

    GoodBytes How many wifi's does it have?

    Joined:
    20 Jan 2007
    Posts:
    12,300
    Likes Received:
    710
    Sorry, but it's heavily GPU-based.
    They use GPUs to render the movie frame by frame; they actually use GPU farms to render it. The high-end CPUs are there because the rendering process is not exclusively on the GPU; it still needs the CPU. When you work with high-end Quadros or even Tesla GPUs, you don't want any bottleneck, so high-speed memory and CPUs are used.
     
  10. Tangster

    Tangster Butt-kicking for goodness!

    Joined:
    23 May 2009
    Posts:
    3,085
    Likes Received:
    151
    Huh? When did everyone switch over to GPU render engines? I thought most films currently in production were still using CPU-based engines, since it's a pain in the ass to switch mid-production, with possible switchovers for next-gen films.
     
  11. Krayzie_B.o.n.e.

    Krayzie_B.o.n.e. What's a Dremel?

    Joined:
    2 Aug 2009
    Posts:
    427
    Likes Received:
    6
    PhysX has some cool features, but it can all be done through the CPU, Havok, and OpenCL.

    Nvidia just made a dumb business decision buying Ageia, and the new Intel and AMD chips will zip right through PhysX without a dedicated GPU.

    Even the new HD 6000 series should have enough power to do PhysX and tessellation. Upcoming PC hardware has the horsepower to handle these instructions; it boils down to proper programming techniques.

    Anyone still gaming on an 8800 GTX, please lay off the nachos and soda for a month and upgrade your GPU for the sake of gaming evolution.

    The GTX 470 is an OK purchase, I guess, but GTX 460 SLI with 1GB cards would be better than a single GTX 480 and run cooler.

    Nvidia should just allow the GTX 460 to do tri-SLI and sell 1GB GTX 460s by the shipload.
     
  12. xXSebaSXx

    xXSebaSXx Minimodder

    Joined:
    21 Aug 2010
    Posts:
    496
    Likes Received:
    45
    But see... the problem with your statement is that the big animation houses don't do GPU rendering for final production, so that claim is wrong. Render farms at big animation houses are made of powerful CPUs with loads of RAM, since those two components do most of the rendering work. Why? Because animation houses don't leave anything to chance when it comes to their product, so parametric animations, or animations based on computer simulations, are almost non-existent... They may rough out a blocking animation via parameters or blending or curves, but in the end each frame will be done by hand.

    Yes... the actual workstations (the units the artists work on) will most likely be equipped with PRO-type GPUs, but then again... I've seen studios use regular high-end desktop GPUs with great success. It really isn't necessary to spend 2K on a QuadroFX card when all it's going to power is the viewport so that the artist can enable more eye-candy when modelling/animating... For that purpose a normal $500 card will do the job without a problem.
     
  13. GoodBytes

    GoodBytes How many wifi's does it have?

    Joined:
    20 Jan 2007
    Posts:
    12,300
    Likes Received:
    710
    xXSebaSXx, you do realize that Nvidia's gaming GPU revenue is small compared to their Quadro line, which is aimed at businesses with specific needs, like film studios, simulation, research and multi-display setups. Nvidia dominates that market. Do you think Nvidia is really hurt by the poor sales of the GeForce GTX 400 series, or by being unable to release an nForce chipset for the Core i series processors because Intel is being a bully? Of course not.

    Proof: http://www.nvidia.com/page/success_delight.html
    Avatar, The Da Vinci Code, Open Season and The Last Airbender, to name a few.
     
  14. xXSebaSXx

    xXSebaSXx Minimodder

    Joined:
    21 Aug 2010
    Posts:
    496
    Likes Received:
    45
    That was not my argument... I realize that NVidia dominates the PRO market; I didn't even address that in my reply.

    My reply was aimed more at the poster's statement that animation studios have render farms filled with $15K machines equipped with multi-GPU Quadro setups. Yes, they may be $15K boxes, but that price is most likely driven by them having dual Xeons inside and loads of RAM.

    In everything I've read about this, render farms are just loads of computers with powerful CPUs and good-quality RAM, since most of the rendering is done by the CPU anyway. I think the GPU is used in raytracing and shadow mapping, but those are baked in separate passes anyway... It still leaves the CPU/RAM combo to do the largest portion of the work.

    I guess I could have just said:
    There is a difference between a workstation and a render-farm PC.
    Workstation = where the artist does the work --> most likely equipped with a QuadroFX card, but it could function just as well with a high-end desktop card. Remember that the workstation is just there so the artist can see what they're working on... A more powerful GPU here only equals more eye-candy in the viewport; it does not equal faster render times.

    Render PC = plain black box (or hundreds of them) with powerful CPUs and tons of RAM, there to crunch the actual frames out.
     
  15. B1GBUD

    B1GBUD ¯\_(ツ)_/¯ Accidentally Funny

    Joined:
    29 May 2008
    Posts:
    3,558
    Likes Received:
    558
    Don't the original Ageia PhysX cards run quite happily paired with ATI cards? I'm pretty sure when I had my X1900 XT the Ageia card sat next to it without a grumble.
     
  16. SlowMotionSuicide

    SlowMotionSuicide Come Hell or High Water

    Joined:
    16 May 2009
    Posts:
    835
    Likes Received:
    20

    Will probably take the better part of a month before there's any real availability, though.

    No need to feel bad about your purchase; in our hobby, things become obsolete the moment you actually buy them ;)
     
  17. Jux_Zeil

    Jux_Zeil What's a Dremel?

    Joined:
    30 Apr 2009
    Posts:
    493
    Likes Received:
    17
    Crysis and BC2 probably used the 'Quantum Effects' engine built into the normal cards' routines instead; that's why they were crippling all GFX cards at the time. Still physics processing, just not very good on an underpowered nVidia card that's running the eye candy as well.

    Sorry guys, didn't want to start an argument. I think you are both right, though, as I'm sure the older render farms are still used for the standard frames (look at how much they cost, even back then) and the CUDA-based rendering is for the real-time live TV stuff and the heavy special effects. I'm no pro, but I would still love to play with some of that kit. :sigh:

    Yes, they did, because they weren't owned by nVidia then.

    You're right there. I've learnt that if you don't jump straight into the sea and instead just dip your toes in, you hesitate and never get in.
     
  18. ShakeyJake

    ShakeyJake My name is actually 'Jack'.

    Joined:
    5 May 2009
    Posts:
    921
    Likes Received:
    71
    The evolution of graphics in video games, maybe.

    Even if a more powerful GPU could make gaming better, most people have other things to spend their money on. That's like me asking why you have a dual core and not a quad. Please upgrade to a quad, or even a hex, for the sake of multi-threaded evolution.
     
