Staring at the sun

Discussion in 'Article Discussion' started by Tim S, 25 Jan 2007.

  1. EQC

    EQC What's a Dremel?

    Joined:
    23 Mar 2006
    Posts:
    220
    Likes Received:
    0

    I was thinking the same thing a few weeks ago...then I realized the problem: If you have a 16.67ms response time, and a 60Hz refresh rate, then it's possible that your monitor NEVER displays the right picture -- the pixels will always be in "transition" between 2 frames...and by the time they get to displaying a frame, it's time to start the transition to the next frame.

    Of course, if the "16ms" is black to white, and most transitions are grey-to-grey (ie: they'd happen faster), then most of the time things will look fine.

    So, I'm thinking, for your eyes, maybe it's best if the pixels look right "most" of the time -- ie: during the 16.67ms each frame is displayed (@60Hz), you want your eyes to see the right image for at least 16.67/2 ms. So, if you've got an 8ms response time or better, you should be all good for sure in every situation.


    Let me finish, though, by saying I've got a three-year-old Dell UltraSharp 1703... I think its quoted response time is 25ms, and I've never picked up on any ghosting. So, I might have "slow eyes" or something... or maybe I just don't deal much with fast-moving images. But I'd imagine, then, that 16ms is pretty good for most people, and 8ms response times at 60Hz ought to be just about perfect. I'd like to see some "blind" tests on these guys who claim to need a 2ms response time... can they really tell the difference between an 8ms and a 1ms monitor?
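To put numbers on the frame-time argument in this post, here is a quick sketch (my own helper functions, using the thread's example figures, not anything from the article):

```python
# At 60Hz each frame is on screen for 1000/60 ~= 16.67ms. If the panel's
# pixel response time is at least that long, pixels never finish settling.

def frame_time_ms(refresh_hz):
    """Time one frame stays on screen, in milliseconds."""
    return 1000.0 / refresh_hz

def settled_fraction(response_ms, refresh_hz):
    """Fraction of each frame interval the pixel spends fully settled
    (0.0 means it is always mid-transition)."""
    ft = frame_time_ms(refresh_hz)
    return max(0.0, (ft - response_ms) / ft)

print(settled_fraction(16.67, 60))  # 0.0 -> always in transition
print(settled_fraction(8.0, 60))    # 0.52 -> settled just over half the frame
print(settled_fraction(2.0, 60))    # 0.88
```

This matches the post's rule of thumb: an 8ms panel shows the correct image for more than half of each 60Hz frame.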
     
  2. David_Fitzy

    David_Fitzy I modded a keyboard once....

    Joined:
    8 Jan 2004
    Posts:
    206
    Likes Received:
    2
    For everything but the latest gaming and encoding, a mid-range rig from 5 years ago will suffice (I know, my rig is one). A bigger problem, I think, is computer stores irresponsibly selling the latest and greatest hardware to anyone who comes in saying "yeah I wanna do email n stuff".

    I think the Wii makes a good point too. From what I understand it's no more powerful than the GameCube, but it's selling millions because all the games are designed to be fun. Nintendo are simply saying: who needs photo-realism when you can just have fun? I say this because whenever I see a photo of people at a LAN party, they all look bored staring into their photo-realistic games. In theory, switching a LAN party to a Wii party would have everyone enjoying themselves whilst cutting the energy bill by something ridiculous.
     
  3. Da Dego

    Da Dego Brett Thomas

    Joined:
    17 Aug 2004
    Posts:
    3,913
    Likes Received:
    1
    I don't normally like to comment on my own columns, but some good points have been expressed here and I wanted to clarify something I don't think I made clear. I'm not of the mindset of doing this totally for "environmental" or "green" computing. If you notice, I didn't even use the words "carbon footprint" or "carbon" anything. I don't think I explained WHY I didn't well enough, though.

    My point is that "bigger, bigger, bigger" is simply not sustainable. We're to the point of diminishing returns - not just for the environment, but for our pocketbooks and even just for the sake of computing. We ARE being wasteful. More chips, more heat... The fact that an 8800 GTX can go over 90C should tell you something. Energy that is wasted as heat cannot be used as processing power. Therefore, why not work on making something run cooler, and only somewhat more powerful?

    As Nexxo mentioned, we're getting to the point where we're building HUGE computers and HUGE cooling systems...to get how much performance increase over a midrange part? And for what, a speed your monitor doesn't allow you to benefit from anyway? It's diminishing returns at its finest. Your extra 100 quid will get you 10% performance but 30% higher heat output (waste) while drawing down 40% more power (which costs more money).

    A good example - Quad SLI. It doesn't matter how few enthusiasts actually use it - why did they waste so much research on developing something that has no tangible benefit? It's priced out of the market, and it is ungodly wasteful when it's in use (for the few who persist in buying it anyhow)... that effort could have been used to develop a better part that does something more than win a frame-rate count. So why did they do it? Because having that framerate count matters in marketing to you and me, because WE'RE focused on more, more, more.

    My point is that there is a third route that hasn't been explored between something that puts out the heat of a thermonuclear reactor and a complete integrated solution - that of making the TCO of the higher-end be lower overall. Let the high end get cheaper, better, more efficient and we can even see some increase in games quality - older hardware won't be as expensive to upgrade, so people will do it more often.

    Research in the current way of thinking is starting to now net us 10% return for 40% investment. Though I don't deny there are still some benefits to pushing the envelope, we could take the massive benefits we already have and start refining them to make more useful products now.
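The diminishing-returns point in this post can be made concrete with a perf-per-watt comparison (the percentages are the post's illustrative figures, not measurements):

```python
# Paying extra for +10% performance while drawing +40% power makes the
# high-end part *less* efficient than the baseline, not more.

def perf_per_watt(perf, watts):
    """Performance units delivered per watt drawn."""
    return perf / watts

base = perf_per_watt(100, 200)   # baseline: 100 "perf units" at 200W
high = perf_per_watt(110, 280)   # +10% performance, +40% power draw
print(base, high)                # 0.5 vs ~0.393: efficiency drops ~21%
```

Under these assumed numbers, the extra spend buys a part that wastes proportionally more of its input as heat, which is exactly the column's argument.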
     
    Last edited: 25 Jan 2007
  4. Clocked

    Clocked Yar! It be drivin' me nuts...

    Joined:
    11 May 2004
    Posts:
    525
    Likes Received:
    2
    Quite right David, although I expect that could get rather messy cos there's always the random weirdo who would turn up with a full bladder... :duh:

    I reckon if you're gonna do the whole energy efficiency thing you gotta apply it to everything you use or do. It's no good just doing half a job.
    I find it really annoying that there was never any major push (I mean one getting global coverage in the news every night or day) for economy with energy or anything with an environmental impact until we started to see the effects of global warming.
    Primarily this is the weather: since the floods, tsunamis, tornadoes and hurricanes have hit, people are starting to wake up, and the simple fact of the matter is that the damage is done. The climate has changed already and it's going to be near impossible to get it back to the natural equilibrium. Look at the amount of forest that gets chopped down (remember kids - photosynthesis takes CO2 in and expels O2, amongst other things) and then people wonder why there are so many greenhouse gases... ultimately it's gonna take a massive effort by everyone concerned to make any impact.

    Also, graphics cards rendering frame rates faster than the output device can display - what is the point? Why did the graphics companies even make this possible? Surely the energy used for this would be better spent rendering really nice shiny things for us to run around and play with?

    My 2 pence anyway, any change is appreciated
     
  5. Spaced_invader

    Spaced_invader What's a Dremel?

    Joined:
    25 Sep 2002
    Posts:
    493
    Likes Received:
    0
    and that monkey had better be trained to be a butler...
    I always wanted a monkey butler...
     
  6. Tyinsar

    Tyinsar 6 screens 1 card since Nov 17 2007

    Joined:
    26 Jul 2006
    Posts:
    2,287
    Likes Received:
    28
    :clap: :thumb:
    That's half the reason I never started.
    I like the way you're thinking, but how many of us are running CPUs from VIA now? (Though I think they'd have more customers if the up-front cost were in line with similarly performing systems - not that you can buy new P2 & P3 chips anymore.)
     
  7. Sh0ckwave

    Sh0ckwave What's a Dremel?

    Joined:
    26 Jan 2007
    Posts:
    2
    Likes Received:
    0
    Firstly, I've been reading Bit-Tech for a while now and most of the articles have been up to a high standard.

    This one, however, is not. Power consumption - you claim we will need 2kW power supplies, yet according to the article the total consumption of just about the most power-hungry system you can build today is only 437.5W.

    After all, no one needs a 185W GPU; if you're concerned about power, buy a slower card that uses 50W. It's just like buying a V8 and then complaining about how much fuel it uses.

    It's only the high-end models that use that much. For example, the 8800 GTX might use 185W, but the next-gen mid-range card might only use 75W and be just as fast. So why don't you just buy a mid-range card and play games at a low resolution or framerate, if you don't notice the extra framerate anyway?
     
  8. Cthippo

    Cthippo Can't mod my way out of a paper bag

    Joined:
    7 Aug 2005
    Posts:
    6,783
    Likes Received:
    102
    First off @ Dego, I think you're both right and wrong. The power consumption and ROI is totally out of whack, but I think you're underestimating the e-penis factor, which is perhaps the MOST important element in high-end hardware purchasing. People aren't buying these components because they need them, but because they want the biggest e-penis and have the cash to burn to get it. Efficiency and cost are not significant issues.

    I've been struggling with this issue for a while now, though from a slightly different angle. I switched to Linux a while back, and that means that high-end gaming is not really an option (and before Glider says it, yes I know it's possible, but I'm still too much of a noob). Sure I can dual boot to XP64 and game, but in practice I never do. What that means for me is that there is really no point in upgrading. New CPUs, GPUs and whatever else aren't going to make Bit load faster or make my email more readable, so why do I need them? I have to admit this is hard for me, because I want two dual-core CPUs and quad SLI. I don't need it, I have no real use for it, but I still want it. Basically, it's an e-penis issue. I want a unique, high-performance computer that is, well, unique (recognizing this fact, the working name for my conceptual upgrade / mod is the e-penis pump :naughty: ).

    Where I'm at now is trying to figure out a path forward. Since there is no point in upgrading the inside of my computer (except cooling), I might have to bite the bullet and try actually modding the outside of my computer. :dremel:
     
  9. Techno-Dann

    Techno-Dann Disgruntled kumquat

    Joined:
    22 Jan 2005
    Posts:
    1,672
    Likes Received:
    27
    A very good article! You've managed to put into words a lot of the things I've been thinking about for quite a while.

    I'm coming at this from the perspective of a gamer. I play games. However, I've come to a rather simple conclusion: More performance does not make you a better gamer.

    I'm running two generation old hardware, with an X800 and an Opty 165. At my 17" LCD's native 1280x1024, I get perfectly smooth framerates in everything from Half-Life 2 to Company of Heroes. Furthermore, I can, and have, thrashed people with 7900GTXOMGSLI! at 1600x1200, or whatever their monster 27" widescreen was pushing.

    There is a point, of course, where your hardware slows you down. But, when your framerate is playable, and the settings are high enough that you can see people, there's really no point to upgrading further. To quote the old saying, enough is as good as a feast.

    When the day comes that my computer can't do what I want it to (and with an upgrade to a mid-range DX10 card when the CoH DX10 patch comes out, it'll be a while), I'll upgrade to something that can. I'm looking at Merom-based Mini-ITX boards, especially the ones that have PCI-Express 16x slots. I like the idea of having a computer that's as easy to move around as one of my TFTs.
     
  10. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    First off, welcome to the forums.

    With the way things are spiralling out of control, we will need 2kW PSUs soon. A pair of R600s or GeForce 8800s (plus a physics card, or maybe four R600s/GeForce 8800s, since 680a has four PCI-E x16 slots) and a pair of AMD K8L quad-core chips in a 4x4 motherboard is going to use a phenomenal amount of power. Of course, whether you need all of that horsepower is another question entirely.

    As you are right to point out, there is certainly no need for a 2kW power supply in the system listed, but it's not uncommon to see that kind of system requiring 600W or 700W for good measure. The minimum recommended for an 8800 GTX is 450W if my memory serves me correctly, but that doesn't take quad-core CPUs (or overclocked ones, for that matter), more than 2GB of memory or more than one hard drive into account. The last generation of graphics cards required a minimum of 400W for one high-end card, and the generation before that was 350W if I recall. Minimum recommended power requirements have gone up.

    Also, while Core 2 reduced power consumption by X, increasing performance by Y over Netburst, a lot of the credits that Intel put in the kudos bank with Core 2 got lost when it raised the power envelope back up to 130W with Kentsfield.

    I am not saying that Kentsfield is a bad product, but it was part of a space race between Intel and AMD. Intel wanted to be first to "quad-core" (even though it's not a native part; not that it matters at the moment) and that's why the first quad-core chips increased power back up to where Intel's Netburst chips were before. It would have been nice to think that quad-core would have only increased power envelopes from 65W to somewhere in the region of 90W, but that's unlikely to happen without a process shrink or a significant clock speed reduction (the Core 2 Quad Q6600 has a 105W TDP). The 45nm process shrink on Core 2 won't see the light of day until later this year. That's after AMD is scheduled to launch K8L, meaning that Intel wouldn't have been first with a quad-core chip had it waited.
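The PSU-sizing reasoning in this post can be sketched as a back-of-the-envelope sum. All the wattages below are illustrative guesses in the spirit of the thread's figures, not vendor specifications:

```python
# Add up rough component draws, then pad with headroom for overclocking,
# extra drives/memory, and PSU efficiency losses - which is why vendors
# recommend far more than the components nominally draw.
components = {
    "quad-core CPU": 130,      # Kentsfield-class TDP mentioned in the thread
    "GPU": 185,                # 8800 GTX-class figure from the thread
    "motherboard + RAM": 60,   # assumed
    "drives + fans": 40,       # assumed
}
draw = sum(components.values())
headroom = 1.3                 # assumed ~30% margin
recommended = draw * headroom
print(draw, round(recommended))  # 415W draw -> ~540W recommended
```

So a system nominally drawing ~415W plausibly ends up with a 500-600W recommendation, well short of 2kW but well above the bare component sum.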
     
  11. Da Dego

    Da Dego Brett Thomas

    Joined:
    17 Aug 2004
    Posts:
    3,913
    Likes Received:
    1
    Hi Sh0ckwave. I just wanted to say 2 things.

    1) welcome to the forums.

    2) You kind of proved the entire point of my article while telling me I don't know what I'm talking about.

    This was the exact point of my article - the higher-end of the midrange systems does a great job, so rather than building these rigs for e-penis length (as mentioned above), why aren't we encouraging the companies to spend their R&D budgets to find new ways to increase real efficiency rather than OMG!!11R0X0RCLOCKSPEED++?
     
  12. Nexxo

    Nexxo * Prefab Sprout – The King of Rock 'n' Roll

    Joined:
    23 Oct 2001
    Posts:
    34,540
    Likes Received:
    1,930
    But what is wrong with an elegant, efficient system?

    As I said before, when I think about performance cars, I think about the Lotus Exige, not a Hummer. "The master shows in keeping it simple" as the Germans say. It is a different philosophy of what constitutes an e-penis, but if we can brag about our mobile phone being smaller than someone else's... well, we can make the conceptual leap.

    People generally really like my Metaversa. But what you have there, essentially, is a dual processor PC in a midi-tower box, cooled to a reasonable temperature by a small-ish radiator that is sufficient. The whole thing was a balance of compromise. It is not the fastest, biggest, coolest system around by a long shot; it has no fancy baybusses or controller software, no SLI, no RAID. It is just a fairly compact, well-balanced, well designed rig. People like it for the design, the detail and the engineering. Not for the size of the hardware.

    It is all about what we consider sexy. Small can be sexy. Effectiveness rather than raw power can be sexy. Low power requirements rather than big cooling can be sexy.
    How many of us are running Pentium Ms? A really efficient yet powerful CPU - motherboards were created for them just so people could put them in their desktop PCs. That is the direction in which we need to go. It is what allowed Intel to make dual-core CPUs with a TDP of only 65 Watts.
     
  13. Ringold

    Ringold What's a Dremel?

    Joined:
    27 Jul 2006
    Posts:
    48
    Likes Received:
    0
    It seems at this point we're almost all in agreement, I think, then. But a few things I'll point out.

    We'll probably never have a 2kW system, for a pretty simple reason: most home outlets can't reliably supply much over 1kW over an indefinite period of time. I sure as hell wouldn't want to try it, at least! I think in theory the max is 1800 watts, but still. At CES I'm sure we all saw the dual-SLI high-end rig running on Corsair's 500-or-so-watt PSU at load.

    Second point: there's variety; we can get more performance with better battery life in smaller devices than ever before. That's really what is important for the future, that combination. Actual power use almost doesn't matter as long as that holds. We can also get the super-high end, and everything in between. There's something for everybody!

    And that leads to the final point. They make 8800GTX SLI Kentsfield death machines because they're profitable, and they serve a good R&D purpose, which is to force their engineers to see how far they can go (because to hit higher performance numbers without banging up against a thermal wall, efficiency has to improve, which filters down to the low end). The key part, though, is that they're profitable, and people want them. Their reasons for wanting them don't really matter. Companies that lose sight of the profit motive just don't fare too well.

    Besides, the enthusiast community would be something of a paradox if it railed against high-end, low-volume flagship products and then included an overclocking part in every review. Let's be honest, too: overclocking destroys energy efficiency, and as soon as we load up ATITool or tweak the BIOS voltage and FSB up, up and up, we've really stopped caring, at least partly, about the light bill. At the very least, an overclocker is saying that energy efficiency hasn't got bad enough yet that he won't OC. And likely the only reason an OC'er doesn't have a high-end rig is because of insufficient income. Hell, why buy a low-end rig and struggle to OC it to high-end specs when you can get a high-end part and OC it to the stratosphere (assuming that money, again, is the discriminating factor there)?


    And as an aside... writing this from a Turion 64 that could be running at 1.8GHz but which I have undervolted to 800MHz @ 0.8V. Excellent performance and battery life (except for this darn Flash ad that's maxing out my CPU); it does everything I want productivity-wise. This sort of performance with this little power and this battery life (4.5-5 hours) wasn't possible a few years ago. I'd call that energy efficiency progress.
     
  14. randosome

    randosome Banned

    Joined:
    17 Sep 2006
    Posts:
    226
    Likes Received:
    0
    Did you leave your machines on and fold before, and are they still on all the time but not folding? Or do you now turn them off completely, instead of leaving them folding?

    I mean, I leave my PC on 24/7 (it's an MCE PC) so I leave it folding, because it's going to be using power either way. I'm just wondering whether folding itself is the cause of the electricity bill, or if it was the fact you left your PCs on to fold.
     
  15. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    I used to fold 24/7 and obviously leave my PC on because of that. Now I'm a bit more conservative - it's not off every night, but it is off most nights. :)
     
  16. Tyinsar

    Tyinsar 6 screens 1 card since Nov 17 2007

    Joined:
    26 Jul 2006
    Posts:
    2,287
    Likes Received:
    28
    At work (retail, lumber yard actually) several of our computers are old Pentium 1s & 2s because all they run is a dumb terminal program. When it's time to replace these someone in an office gets an upgrade and their old system (mostly P4s) becomes a terminal. Some, like myself, use design software and require more processing power but most other activity happens on the server and thus these new office computers are - with the exception of e-mail and the odd Word or Excel file - effectively dumb terminals. Can I convince the management to "go green" and buy Epias or such? No - I can't even get the office manager to turn off the CRT on the server - which she might only glance at a couple of times a week. :wallbash: The problem is that these energy efficient solutions are out of the mainstream distribution channels and not competitive as far as up front cost goes (which is all some -short sighted- people can see). :sigh:

    I think if Intel really cared about the environment they'd try to push these chips into the mainstream - They are available but you have to go out of your way to get one and then the whole system costs more. - Their recent comments about mini-ITX give me hope though. :thumb:

    There's this wonderful little feature called "Hibernate" - it works great (wakes up when it needs to then goes back to sleep) for mine (an undervolted Athlon XP-M 2500+ running @1200).

    Edit: :clap: 750 posts = custom title :D
     
  17. speedfreek

    speedfreek What's a Dremel?

    Joined:
    9 Nov 2005
    Posts:
    1,453
    Likes Received:
    1
    Excellent article, though I am biased towards it, because in a way this article said what I always thought: this stuff isn't needed for decent resolutions and fps, but people feel they need it just to say they have it.
    But it isn't just computers, as Nexxo points out with his analogy. I see people driving around in full-size SUVs when all they need is little more than a Golf for 99% of their driving. Most of the efficiency improvements just get wasted by someone being less efficient somewhere else in the overall design: more efficient engines, then heavier, larger vehicles; more efficient CPUs and GPUs, then more opportunity to produce heat.
    Which makes me ask: is that the only way that technology can really progress?
     
  18. ikra

    ikra What's a Dremel?

    Joined:
    2 Oct 2006
    Posts:
    183
    Likes Received:
    0
    Let's say you have a system that can run all the games you ever need at 200fps, but your monitor can only give you a maximum of 80fps, and your eyes can only tell 30fps and nothing more. All in all, you have wasted 120-170 of those frames - at least 60% of the work, all lost in the air, FPS that you don't need at all. Wasted energy, wasted hardware components... wasted money that could have gone to someone who needs it more (i.e. charity). For a £2000 system, all you really needed was £800, and that setup would be more than enough to let you play the oh-so-nice beautiful games.

    But hey, some people have more money than others... what better to spend it on? And to the poster above: progress right now is driven basically by competition and not passion for the whole of humanity... which is really very, very sad.
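The wasted-frames arithmetic in the post above, made explicit (fps figures are the post's hypotheticals):

```python
# Frames rendered beyond what the monitor can show are pure waste:
# the GPU computed them, but no one ever sees them.
rendered = 200
displayed = 80                  # monitor refresh cap from the post
wasted = rendered - displayed
print(wasted, wasted / rendered)  # 120 frames wasted, 60% of the work
```

If you additionally cap "useful" frames at the post's claimed 30fps perceptual limit, the waste rises to 170 frames, which is where the post's "120-170" range comes from.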
     
  19. Tyinsar

    Tyinsar 6 screens 1 card since Nov 17 2007

    Joined:
    26 Jul 2006
    Posts:
    2,287
    Likes Received:
    28
    Ok, I'm a little groggy at the moment so someone please correct me if I'm wrong.

    According to this, a CRT with a refresh rate of 60Hz flashes a light 60 times per second. This means that no matter what the frame rate, my eyes have 60 frames per second flashed into them. I don't know about you, but 60Hz on a CRT makes my eyes feel like they've been rubbed in sand, and I perceive it flickering. LED Christmas lights and some fluorescent lights also flicker in my vision (power here is 60Hz). At 70Hz a CRT is borderline for me, but the flicker is gone at 75Hz. This tells me that my eyes perceive somewhere close to 70fps.
     
  20. ikra

    ikra What's a Dremel?

    Joined:
    2 Oct 2006
    Posts:
    183
    Likes Received:
    0
    Dunno, let them explain... but that's the reason why films, TV and all that are less than 30 frames per second and they look flawless... you gotta ask someone else to explain why we can notice the flickering.
     