I heard somewhere that there was no point in a framerate over 40, as the human eye cannot tell the difference between that and a higher framerate. That could have been worded better... _C
I did think about monitor refresh rates, but some monitors are capable of higher than 120Hz, and the human eye can detect more than 40fps, depending on the situation. I read something a while back saying that a very brief white flash can be detected by the eye (one lasting a time equivalent to 200fps or so). Just to clarify, I do realise that in most situations 40-50fps is plenty and most people would not be able to tell the difference. Anyway, I just wondered about the source of this really; I would like to read it. Where did this, and the other "facts" in this poll, come from?
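For reference, here is the simple arithmetic behind those figures, assuming frame time is just 1/fps (a rough Python sketch, nothing more):

```python
# A frame rate of N fps means each frame is on screen for 1000/N milliseconds,
# so a flash "equivalent to 200fps" is roughly a 5 ms pulse.

def frame_time_ms(fps: float) -> float:
    """Duration of a single frame, in milliseconds, at the given frame rate."""
    return 1000.0 / fps

for fps in (30, 40, 60, 85, 120, 200):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")

# Output:
#  30 fps ->  33.3 ms per frame
#  40 fps ->  25.0 ms per frame
#  60 fps ->  16.7 ms per frame
#  85 fps ->  11.8 ms per frame
# 120 fps ->   8.3 ms per frame
# 200 fps ->   5.0 ms per frame
```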
Actually, I would suspect this is a common thing. At my place of business, the Intel people work on Intel things and are not allowed to look at AMD things and vice-versa (due to possible legal ramifications). As a consequence, they typically do not care about the technology developed on the opposing side.
AMD and Intel share technology all the time to keep things roughly the same. Look at AMD64 and Intel's EM64T: Intel had to take the technology from AMD because AMD made it first, and things need to be the same, or similar enough, for Windows etc. to run right.
Not at all. It’s best to think of AMD and Intel as two rival gangs who have agreed only that they will not fight an all-out war to snuff one another out. There are basically only two ways AMD and Intel “agree” on things or “share” technologies: one is at the behest of Microsoft and the other is at the behest of the market. In either case, Intel generally has the upper hand, because it has influence over both. Now, I am not hating on AMD, my favorite systems are AMD, I am just stating the plain facts. Intel employs 80,000 people and AMD employs 14,000. Not only does Intel have nearly a six-times size advantage, but it probably has a higher ratio of income and development related to the PC market; AMD actually makes a lot of money selling internal parts for cell phones and the like. So, like it or not, Intel is in the driver’s seat.

When Intel decides a technology will help them sell CPUs, they will protect it closely. Only if they see a greater advantage will they “open it” to the general market. Generally, they will set up an independent group (like the PCI-SIG and USB-IF) that will take over the technology and make it “open” to the world. Thus AMD, as part of the general market, gets access to these things, and in that sense, yes, they “share”. Most people just think these are universal market developments and don’t even realize that Intel designed and developed them and then released them for the good of the entire market (and to influence its direction).

But Intel is the most strict as well. Sub-contractors who work on projects with “Intel Classified Material” (which is anything processor- or chipset-related) are forbidden by contract to work on any AMD products for approximately six weeks after they have seen anything Intel. The aim is to give people time to forget the Intel details and minimize the chance of something Intel developed working its way into an AMD design. I think AMD is a little less strict, only requiring AMD sub-contractors not to work on any Intel projects at the same time as an AMD project. At any rate, the downtime of engineers who aren’t allowed to work on another company’s stuff is expensive; thus, when a contractor starts on an Intel project, they basically stay on Intel projects.

Probably nothing shows how closely Intel guards its “stuff” better than the names (Pentium, Itanium, etc.). They specifically went to this convention to keep competitors (AMD and Cyrix) from having products that seemed similar to theirs. They took this further and designed their own sockets and such. If they were co-operating, there would be a standard socket, or even an adapter, so you could exchange AMD and Intel CPUs. Obviously, Intel doesn’t want that. Even the new MoBo layout, BTX, basically excludes AMD.

Microsoft is the only common ground AMD and Intel really have. Generally speaking, they both need to keep MS happy, and unfortunately for them, that means Microsoft can write some requirements and they both have to comply. They will comply in different ways for sure, and Microsoft has limited pull (especially against Intel), but this is the only time they will “appear” to work together. Really, though, MS and Intel work things out, then MS tells AMD what it is going to need to do. Just because you see two people in the same city, you shouldn’t assume they arrived in the same car; one might have even come to town on a train.

So, as for AMD64 – it’s not really a new technology, it’s an extension of the existing platform. There really is no trick to it.
The basic design could have been done in the ’80s, sat on a shelf for 20 years, and not needed a revision when implemented today. It really is an oversimplified thing. So, realistically, the logical development would have happened first at Intel, where it would have been dismissed as a novelty, and then been picked up at AMD as a marketing tool more than anything. Certainly, Intel would not have needed to call AMD to figure out the implementation, even if AMD really had developed it first. Realistically, the only important/groundbreaking technology AMD has beaten Intel with is the northbridge integrated into the CPU. So it is interesting to note how many people love the insignificant AMD64 hype and how many ignore/deny the northbridge built into the CPU, which actually is a significant development.
http://www.ca1clan.com/fpscompare.zip is a good program I found for comparing frame rates; it always shows people who say you can't tell the difference over 60fps that they are wrong. Personally, I can't stand games that run under 60fps, with 85-120 being my most comfortable range.
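For anyone who can't grab the download, here's a minimal sketch of the same kind of side-by-side test, written in Python with the standard tkinter module purely for illustration (it is not the linked tool): two squares travel at roughly the same speed, but one is updated at ~30fps and the other at ~60fps, so you can judge by eye which looks smoother.

```python
import tkinter as tk

WIDTH = 600  # canvas width in pixels

root = tk.Tk()
root.title("Frame-rate comparison sketch")
canvas = tk.Canvas(root, width=WIDTH, height=160, bg="black")
canvas.pack()

slow = canvas.create_rectangle(0, 30, 40, 70, fill="red")      # updated ~30 times/sec
fast = canvas.create_rectangle(0, 100, 40, 140, fill="green")  # updated ~60 times/sec

def step(item, interval_ms, dx):
    # Move the square right by dx pixels, wrap at the right edge, and reschedule
    # the next update after interval_ms milliseconds.
    canvas.move(item, dx, 0)
    if canvas.coords(item)[0] > WIDTH:
        canvas.move(item, -(WIDTH + 40), 0)
    root.after(interval_ms, step, item, interval_ms, dx)

# Both squares cover roughly the same distance per second; only the update rate differs.
step(slow, 33, 8)   # ~30 updates per second, 8 px per step
step(fast, 16, 4)   # ~60 updates per second, 4 px per step
root.mainloop()
```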
Not any that I've ever played. The maximum visible framerate depends on the game for me: for instance, a flight sim is usually fine at under 30fps because it is slow-moving, whereas in a fast-moving first-person shooter or racing game I'd prefer over 60 or so. The whole "the human eye cannot distinguish between 45fps and 60+fps" thing is a load of rubbish IMO; in very fast-moving games there is definitely a difference with very high framerates.
Well, if you cap them yourself like that, that leaves a lot of unused processor time for quality settings or bigger screens.
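That's basically what a frame limiter does: after rendering each frame, it sleeps away the rest of the frame budget instead of immediately rendering again. Here's a rough Python sketch of the idea; the `render_frame` callback and the 60fps cap are just placeholders for whatever a real game loop would do.

```python
import time

def run_capped(render_frame, cap_fps=60, frames=300):
    budget = 1.0 / cap_fps               # time allotted to each frame, in seconds
    next_deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()
        next_deadline += budget
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)        # idle time an uncapped loop would burn on extra frames
        else:
            next_deadline = time.perf_counter()  # running behind; don't try to catch up

# Example: a "frame" that only takes ~5 ms of work still gets paced out to 60fps,
# leaving the rest of each 16.7 ms budget free.
run_capped(lambda: time.sleep(0.005), cap_fps=60)
```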
Well, we all have different eyes/brains, right? Last week, 2-3 of us were trying to fix a problem and were hovering over a particular screen for a few hours straight. It was just text/Windows XP. Eventually, we gave up and fetched a "specialist". As he approached, about 20 feet from the computer in question, he screamed at the top of his lungs, "How can you guys sit in front of this thing and work?" I was like - wha? He immediately went into the properties screen and changed it from 60Hz to 75Hz. LOL - I couldn’t see a bit of difference, but either he was clairvoyant or it was a significant difference to him.
Go on then. WITHOUT looking at any other pages, give us a definition of Hyperthreading and Hypertransport. Your time starts now.
Yeah, I can't understand how people can tolerate 60Hz (unless it's a TFT). I can't tolerate anything less than 85Hz; even 75Hz has noticeable flicker. I have a 19" monitor though; it's not so bad on smaller ones.
This was a ViewSonic E771, which is a nice 17" display. I just don't see any difference at all (60/75/80). LOL. Why would it be less of a problem on TFTs? The only thing I can think of is that they might hold their brightness/intensity better between scans. EDIT: Well, poop, a TFT shouldn't have a "scan" in the traditional sense; I guess I am not sure exactly what is going on behind a TFT. So we should all have to use CRTs only, until I catch up on the technology. They must just need to do a digital update of the on/off status for each pixel, as they are obviously not doing a traditional analog sweep to re-illuminate the screen.
Yup. You don't see flicker on TFTs because they don't flicker - only the sub-pixels that need to change are updated each frame. What matters is the rise/fall time of the sub-pixels - the time it takes for, say, a red sub-pixel to turn on or off. Screens with lower response times do this faster, so you see less blurring/ghosting. I'm not sure myself why TFTs need refresh rates at all, though I suspect it has to do with vsync - with it enabled on a 60Hz screen, framerates are capped at 60fps rather than 75/85/100 as they would be with a faster refresh rate. EDIT - Doom 3 is capped at 60fps, which is far more usual than 30.
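To put rough numbers on that vsync cap (assuming plain double buffering, so a finished frame can only be shown on the next refresh): the effective rate is the refresh rate divided by the number of refresh intervals each frame takes to render. A quick Python sketch of the arithmetic:

```python
import math

def effective_fps(render_ms: float, refresh_hz: float = 60.0) -> float:
    # With vsync and double buffering, a frame that misses a refresh waits for
    # the next one, so each frame occupies a whole number of refresh intervals.
    interval_ms = 1000.0 / refresh_hz
    intervals_per_frame = max(1, math.ceil(render_ms / interval_ms))
    return refresh_hz / intervals_per_frame

for render_ms in (5, 10, 16, 17, 25, 40):
    print(f"render time {render_ms:>2} ms -> {effective_fps(render_ms):5.1f} fps on a 60 Hz screen")

# render time  5 ms ->  60.0 fps on a 60 Hz screen
# render time 10 ms ->  60.0 fps on a 60 Hz screen
# render time 16 ms ->  60.0 fps on a 60 Hz screen
# render time 17 ms ->  30.0 fps on a 60 Hz screen
# render time 25 ms ->  30.0 fps on a 60 Hz screen
# render time 40 ms ->  20.0 fps on a 60 Hz screen
```

Note how a frame that takes 17 ms just misses the 16.7 ms refresh window and drops straight to 30fps - that cliff is why the cap at the refresh rate matters so much.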