Nvidia... ATi... AMD... Intel... What is the difference, I ask... "nvidiaShader3.0r0xoxorsz!!11... atipwnZinHL2... 20-frames-per-second advantage... fastassass 15fps advantage... 15 pipelines, 4 pipelines, up-yours-pipeline... nvidia owns you," some may answer, but really, what is the difference? Is there a big difference between 90 and 120 FPS? "Well, a 30 FPS difference," some might say, and back in 1998 30 FPS was a lot, but is there a difference to your eyes between 90 and 120, when most eyes stop noticing somewhere around 75-85 FPS? Do 5 extra minutes make a big difference when the movie already takes 40 minutes to encode? I think what matters most is what we use our hardware for, how we use it, and what we achieve by using it. I would like the tech world to focus more on innovative software and varied uses of technology, rather than increasing the raw power of hardware only to achieve the same or similar results a little faster than with the older technology.
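The 90-vs-120 FPS point holds up if you look at frame times rather than frame rates: the same 30 FPS gap buys far less time per frame at the high end than at the low end. A quick back-of-the-envelope sketch (the function name is just for illustration):

```python
# Frame time in milliseconds for a given frames-per-second rate.
def frame_time_ms(fps):
    return 1000.0 / fps

# Going from 30 to 60 FPS shaves ~16.7 ms off every frame,
# but going from 90 to 120 FPS shaves only ~2.8 ms.
delta_low = frame_time_ms(30) - frame_time_ms(60)    # ~16.67 ms
delta_high = frame_time_ms(90) - frame_time_ms(120)  # ~2.78 ms
print(round(delta_low, 2), round(delta_high, 2))
```

Same 30 FPS headline difference, roughly one sixth of the real per-frame benefit.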
@Cheap Mod Wannabe: y'know dude, you're absolutely right. Sure, it's nice to have the latest and greatest, but I really don't need it. I'm sitting here in front of an AMD XP 2000+ with 512 MB RAM and an ATI 9600. Just about the only thing I do spend money on is the storage array (this machine has 240 GB and I have almost a couple of terabytes on the network), and that's because I actually *need* that much storage space. When I 'backup' a DVD, I set the process going and leave it in the background. If it takes 2 1/2 hours or 3 hours, it's not that important to me. When I watch a movie on my PC (usually because I can watch it on one screen whilst working on the other), if it plays OK, I'm happy. About the only games I play are Sim City, Command & Conquer and Black & White... and the occasional Q3A when I'm feeling angry. I can get about 100-120 FPS in Q3A with all quality settings at maximum (it's easy if you know how to optimise), so the latest and greatest uber-SLI gfx card isn't gonna make it any better. What I would really like to see is a RAID card I can boot my machine from that *doesn't* cost an arm and a leg. Hell, 10 years ago we had ISA I/O cards we could boot from; how hard can it be to make a RAID card bootable without me having to mess around adding drivers? I'd like a USB cam that just works, without me having to patch and recompile my kernel because the company that makes it won't release its drivers, so someone else has to reverse engineer them. Sure, I'd like my DVD burner to go a little faster, but that's only because I use it several times a day, and it's really not that big of a deal. I can wait 12 minutes just as easily as I can wait 10.
When I burn a CD I burn at 32x, because I know that's going to be fine, and it's only a few minutes slower than 52x, so it's my own fault, I guess (but I still want a DVD burner that can burn a DVD faster than it takes me to type out growisofs). As the saying goes, "I just upgraded my computer, now I can crash in half the time!" At the end of the day, I don't care if it takes 10 minutes or 12 minutes; I want something that works, every time, which is pretty much what I have, and I got it without spending a zillion quid on the latest and greatest. `Zidane
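The "only a few minutes slower" claim checks out even in the best case for the faster drive. A rough sketch, assuming the idealized constant-speed model where 1x CD speed is 150 KiB/s and the drive holds its rated speed for a whole 700 MiB disc (real CAV drives only reach rated speed near the outer edge, so actual times are longer and the gap smaller):

```python
# Idealized burn time for a 700 MiB CD-R at a constant rated speed.
CD_1X_KIB_PER_S = 150.0   # 1x CD data rate = 150 KiB/s
DISC_KIB = 700 * 1024     # 700 MiB disc capacity

def burn_minutes(speed_rating):
    return DISC_KIB / (CD_1X_KIB_PER_S * speed_rating) / 60.0

for x in (32, 52):
    print(f"{x}x: {burn_minutes(x):.1f} min")
```

Roughly 2.5 minutes at 32x versus about 1.5 minutes at 52x under this model: around a minute of theoretical difference for the extra wear and coasters.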
I think it's pretty much down to what you want to use your computer for. My compy isn't that fast: a 2500+ OC'ed to 2 GHz and an R9800 Pro, with 256 MB in dual channel. I can play games like HL2; it's just the level of detail I suffer with... I think technology is advancing a bit too slowly; it's always playing catch-up with games. You can't even play top games at top detail with a decent 80+ CONSTANT fps...
It makes a difference up to your refresh rate; beyond that you have to upgrade your monitor, not your graphics card. My last upgrade more than halved my video encoding time. Mmm, dual core.
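The refresh-rate point can be made concrete: with vsync on, the monitor can only display as many frames per second as its refresh rate, so rendered frames beyond that are wasted. A minimal sketch (the helper name is hypothetical):

```python
# With vsync, the displayed frame rate is capped by the monitor's
# refresh rate: extra rendered frames never reach the screen.
def effective_fps(rendered_fps, refresh_hz):
    return min(rendered_fps, refresh_hz)

print(effective_fps(120, 85))  # a 120 FPS card on an 85 Hz CRT shows 85
print(effective_fps(60, 85))   # below the cap, the GPU is the bottleneck
```

So once the card outruns the monitor, the next worthwhile upgrade really is the display.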
I usually try to run games at maximum detail smoothly by turning off the little unnoticeable things via the console... i.e. HDR turned down a notch, dropping back to SM2.0, a little bit of overclocking... usually runs great. I just buy the fastest DVD burner if it's at the right price... £30 for a 16x dual-layer DVD writer is pretty good.
I do that as well, but I try to attain a solid FPS, at least 50... I ALWAYS turn off shadows; they waste so much processing power and don't really add THAT much. It depends on the game, though: if it's something like Splinter Cell, you lose a lot of atmosphere; if it's Need for Speed, I don't care about that shadow under the car with the neon, y'know.
@ou7blaze: technology isn't advancing too slowly; they just make us think it does... They force us to buy that stuff to play the newest and "coolest" games (GF 7800GTX etc.). Sure it's fast, but it's basically old crap... They (Nvidia, Intel etc.) could manufacture much faster and maybe even cheaper hardware, BUT they sell us that s*** to make more profit... (believe me, it's true). Why do they sell us expensive graphics cards that are obviously held back by (almost?!) ANY CPU on the market and tell us we NEED them to play the games? And by the time "normal" people can afford one and it can unleash its "full power", they've already got something "better" with "wicked" features like SM 3.0, HDR, transparency anti-aliasing, dual core etc., which suddenly every game uses? And every time you think you've got the most "kick-ass" system on the market, they bring out something better which every new game needs to run nicely at high quality... And I bet that in some lab they already have a GeForce 9 and a 10 GHz processor, or maybe something even faster... Get what I'm trying to say here? It's a mean, mean world. (Correct me if I'm wrong; this is what I heard/read...)
I think it's less true of graphics cards than it is of CPUs; usually a new generation of graphics cards brings a large improvement in performance, compared to the incremental increases we see with processors. However, occasionally you get a sudden shift in the way we think about these components - take dual-core processors, for example. In a few years, I think single-core machines won't be widely available, because dual core is a complete change for the better, not a gradual move forward. It's not worth upgrading from an Athlon 3800+ to a 4000+ by any means, but it's definitely worth (if you have the money) going from a 3800+ to its dual-core equivalent.
Personally, I don't like to jump on the bandwagon of new CPUs or video cards or chipsets or what have you when they are fresh on the market. Simply put, it's a waste of money and a lot of aggravation. It seems that every time they release something, it's full of bugs. I'm primarily an Intel kind of guy, and the highest I'll go is a 630. I don't think I'll benefit from a dual core until well into next year. I didn't jump on the bandwagon and get a 915 board or a 925X or XE board, because I didn't like the chipsets or the prices associated with the memory. I see myself jumping on dual core once it's nice and mature, probably mid next year or later, when they have the heat issues sorted and the architecture has shrunk. I really don't want to deal with a CPU that can double as a home heater, or with a 7800GTX or what have you. I refuse to spend more than 250 bucks on any component of my PC. I buy late and overclock, and that's my philosophy. But I must admit that upgrading can lead to a slippery slope of power hungriness and empty-wallet syndrome.
I agree, CMW. Admittedly, I could go out and spend £££ on the latest stuff, an A64 for example, but I don't need it; this system (sig) does everything I need it to, and more. It burns CDs and DVDs fast enough, and like you say, I leave them in the background anyway. We need more new tech and less "updated" old tech, IMHO.
But if software is to evolve and get better, doesn't it require the necessary hardware to make it work? I agree to a point that many new products hitting the shelves are only slightly better than what's in our current rigs, but compared to, say, five years ago, it's a major difference. My path of tech was a PII 350 MHz, to an XP1800+, to a P4 3.2 GHz... I have noticed a significant change at each step - I can now run apps that just were not possible before. So, I have advanced in tech use because the tech world has allowed me to. I say keep everything evolving, and let's see what becomes of it...
Yeah, but you don't NEED 80+ FPS in a game. You don't NEED 2x 7800GTX to run Doom 3 and have it look nice.
Of course not. But games are never gonna get any better if game developers are stuck with the same hardware day after day. Fast forward one year from now - for example - benchmarking Half-Life 4 on 2x 7800 in SLI mode gets just 35 fps... hmmm... looks like we need a new graphics card already... Today's graphics cards are not just for today's software, but for the stuff we don't even know about yet...
I too find it laughable that a GFX card comes out in 15 different flavours of the same thing... But that is how evolution works: every "next-gen" component is a slight update of its predecessor. I run a 2600+ (Mobile Sempron and Athlon XP respectively) on both my laptop and my desktop PC, and that is more than enough for me. I don't run XXL RAM that costs 3x more because it shaves 1 second off a useless SuperPi calculation; I'd rather add another 512 MB stick so that my office apps and CAD drawings go a bit faster. But even then, the difference between 512 and 1024 isn't that noticeable in my case. I don't even run dual-channel RAM; there's just no need for it.

But back to the thread: the reason for the slow advance in this world is pure economics. There are always those macho men/women (we don't discriminate) who want to brag about "what's under the hood"... Hey, look at me, I've got an A64 FX-57, 2x 7800GTX in SLI, 2 GB of XXL RAM, 2x Raptors... it cost me 3000€!! Look at me, I'm great... And then his friend comes in: hey, I've got all that, but watercooled... Then he surely has to get triple-cascade cooling, because he isn't "hot" anymore... It's all just about the EGO.

But I must admit, I dream too sometimes... I sometimes surf to the local PC webshop and assemble my "dream" PC... It sets me back 3000-5000€. Then I assemble a PC I could really use... that I would really use... and then I realise that I'm already working at one. The only thing I would invest in is working comfort: a larger screen or something.

But what you've read/heard about 10 GHz PCs and GeForce 9s and 10s, don't believe it... The technology isn't that much further along than what they actually sell. Come on, what a blow it would be for (for example) Intel if AMD released a stable 9 GHz CPU while they were "only" at 4 GHz... It's, like I said, economics combined with evolution... And all of that plays nicely on people's egos, shaking a lot of money out of them. Now, what is the moral of this dumb story?
Tech advances are needed, but we don't all need the bleeding-edge stuff... Most of us don't even need it... But just because we don't need it doesn't mean we won't buy it, to brag about it (I'm thinking of several sigs on this forum...). And true, the software has to evolve "on par" with the hardware... but reality shows us that the software evolves on par with the expenses... How long have we been waiting for XP 64-bit? And this fits in with the above... the economy...
Yes, but for software to evolve, more power isn't necessary at all. I'm sure if you sat and thought about it, there would be loads of extra features you could think of for your favourite piece of software. The only thing I can't run nicely on my comp now is HL2/CS:S, and by nicely I mean at >30 fps at 1024 or so with low jaggies. I can manage Far Cry at decent enough settings, and it looks the balls.
Agreed, for office software there isn't really a need for a hardware evolution, but for cutting-edge games there is... But it's a sword that cuts both ways. Example: I'm a game developer who wants to incorporate the latest cutting-edge tech in my brand new game. So I contact a GFX card producer and get the latest tech... My programmers work and work to create a perfect game for it, real eye candy... But when they are finished, the card is "outdated"... We get the newer card, adapt the code, incorporate new features... Eventually we catch up, and the game is released... But then there are only a happy few who can really use all the eye candy... And this isn't good for the economics of the game producer. And money makes the world go around.

But what I would like in the tech development scene is streamlining... Back in the old days, programmers would create a piece of code, test that it worked properly, and then go back to the drawing board to take out unneeded bits and optimize formulas and calculations... so that everything was smaller and faster. Nowadays, they create code, test it, and it doesn't matter much that it eats up memory, because RAM and HDs are dirt cheap (compared to 10-20 years ago). If there were more optimization, more streamlining, then there wouldn't be as much need for bleeding-edge hardware.
You don't NEED anything except air, food, water and shelter; that doesn't mean having a house and a microwave etc. isn't better. If you don't feel you could make use of faster hardware, don't buy any; what's the problem? Other people can and do make use of faster hardware. So much whining about nothing in this thread. And whilst I don't need 80 fps in a game, I much prefer to play it at 80 fps than at 30 fps if possible. Dual-core CPUs have been a big jump in tech, imo; dual-core CPUs are just so much better than single cores, even now when there are few multi-threaded apps about.
The PC in my sig runs every game I own great. I play CS:S at 1280x1024 at around 40 FPS with everything at medium to high. I can even run Chaos Theory at settings at or above medium, and it runs great. I have only recently hit a problem, and that's with the F.E.A.R. demo. At 640x480 on high it'll run at 30 FPS, but if I change to 800x600 and drop the settings to low, I get 50-100 FPS. Granted, yes, my video card is well over 3 years old (but running strong), but I am not too upset. If I could afford a new card I would get one, but the one I have is fine for what I have. Screw cutting-edge technology and its high costs; I don't need it. Half the PCs I own are running P2s, and they work great with XP, some older games and word processing.