Discussion in 'Article Discussion' started by Da Dego, 16 Jan 2007.
Oh dear god! Just one year without technological advances... please... I'll be so happy.
But I just got PCIe and DDR2! Curse you technological advances!
What's the point? Okay, the extra power is "needed", but that can be provided by leads directly from the PSU. The extra bandwidth is a complete waste.
[rant]DAMN GRAPHIC CARD COMPANIES THAT MAKE STUPID POWER HUNGRY GRAPHIC CARDS!!!!! COPY INTEL GODDAMNIT!!!!![/rant]
Hmmm... DDR3 sounds nice. It's the same thing for me: wait for reviews and stable hardware.
Better that they add something they don't yet need (but eventually might) now, while they're coming up with a revision anyway, than need a version 3.0 another year or two down the road.
Wow, my computer is so out of date, yet it plays modern games well. I'll settle for my rig until the end of the year at least! Wonder how long it will be until the current 1.1 standard's bandwidth is actually used up completely by a card?
The look of that new 8-pin connector worries me slightly. It appears that they're taking the current 12V EPS connector and inverting the wiring. WHY? Are they stupid, or something? There WILL be users who WILL mix the two up, and then you'll have instant card death. If they want to use ridiculous standards, can't they at least follow ones already existing?
Along with that, does it worry anyone else that the shot of the dual connectors present on the 8800 there shows off the fact that it can take the new 8-pinner?
Surely they should work on reducing the power consumption of these cards! If you think about it (and how much electrical appliances round the house use), 185W is pretty insane really just for the GPU.
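For a rough sense of scale, here's a quick running-cost sketch; the daily hours and the tariff are assumed round numbers for illustration, not figures from the article:

[code]
# Rough running-cost sketch for a 185 W GPU at full load.
# hours_per_day and pence_per_kwh are assumed figures -- adjust.

watts = 185
hours_per_day = 4    # assumed daily gaming time
pence_per_kwh = 10   # assumed UK tariff, c. 2007

kwh_per_year = watts * hours_per_day * 365 / 1000
cost_pounds = kwh_per_year * pence_per_kwh / 100
print(f"~{kwh_per_year:.0f} kWh/year, ~£{cost_pounds:.0f}/year")
# ~270 kWh/year, ~£27/year
[/code]

So on those assumptions the GPU alone burns roughly as much over a year as a fridge, and that's before the rest of the system.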
I'm still using VLB. What's all this PCI-E nonsense?
Ahhhh, Wonka, you're not penski!
Seems pointless, but hopefully it means it will be longer before the next step up.
Though it seems that there are advances in every part of a computer's hardware, these are steps that show the industry is constantly moving forward with new technologies, developments and insight. Of course, it'll be a while before PCI-E 2.0 is released to Joe Public, and as we all know, you wait for the reviews to come in, you wait for the technology to be used in the mainstream, and then you hand over your hard-earned cash...
They make it seem like v2.0, but isn't this just "taking the easiest solution"? Instead of making bigger connectors, shouldn't they be working their behinds off to make the video cards give more performance per watt? I mean, two 8800 GTXs plus the rest of your high-end system will draw half the output of a power plant, lol! It's about time they started to realize that, imho.
FFS, stop making new standards when we don't *need* new standards!! At least it's backwards compatible; if it wasn't, I would be a lot more pissed off ¬_¬
A standards group (the PCI-SIG) actually defines the PCIe standard, not Nvidia or AMD.
There are rumors that the R600s will use an 8-pin and a 6-pin connector, but that may be for backwards compatibility.
I also notice that with the 8800, maybe the next revision will use one 8-pin. I also agree with the need to change the connector so it is different from the 12V EPS.
Sorry folks, but this kinda is necessary.
I've seen benchies of high-end SLI systems, and in dual x8 slots the cards perform less well than in dual x16 slots. The difference is around 5-20 percent depending on the game. Do you really want to be losing 20% after spending £500 on graphics cards?
Yes, this can be remedied by having a southbridge with extra lanes, but that's a hackish solution, and basically requires the board to have extra chips (a southbridge, in AMD boards' case) just to be able to provide sufficient bandwidth. I hate tech advances that make my kit obsolete as much as any of you; I upgrade rarely and dislike fat tech just for fat tech's sake. However, I don't think this is a case of that. I think this really is necessary given the cards we're seeing today.
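If you want the rough numbers behind that, here's a quick back-of-envelope sketch using the PCIe spec's per-lane rate (PCIe 1.x signals at 2.5 GT/s, and 8b/10b encoding leaves about 250 MB/s per lane, per direction); real-world throughput is lower, so the figures are just illustrative:

[code]
# Back-of-envelope: raw PCIe 1.x link bandwidth per direction.
# 2.5 GT/s per lane with 8b/10b encoding -> ~250 MB/s per lane.

GEN1_MB_PER_LANE = 250

for lanes in (8, 16):
    print(f"x{lanes} PCIe 1.x link: {lanes * GEN1_MB_PER_LANE} MB/s per direction")

# x8  PCIe 1.x link: 2000 MB/s per direction
# x16 PCIe 1.x link: 4000 MB/s per direction
[/code]

Dual x8 halves the headroom each card gets, which is where that 5-20% hit can come from once a card actually saturates the link.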
What's wrong with them upgrading things when it's even backwards compatible? It's called future-proofing, and you'd probably be complaining in a year if they hadn't, because your brand new card wouldn't work as well in a 16x slot. Best to get the architecture out there and in use before it's really needed.
Yes, they'll let you plug them into one another and blow everything up. Notice the offset last row of pins to stop that happening.
What's the word on this, btw?
I mean, atm, if you have SLI cards then you get x8 on each card, which does impact performance on each card by like 5% (at least).
On top of that, higher speed = possibility to use fewer lanes = fewer wires = cheaper boards.
So is the extra bandwidth wasted? I don't really think so.
On top of that, if you introduce the specification now, it will be in place by the time people actually need it, instead of scrambling to introduce it once they already do.
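A quick sketch of the fewer-lanes point, again assuming the spec rates per direction after encoding overhead (250 MB/s per lane for 1.x, 500 MB/s for 2.0):

[code]
# PCIe 2.0 doubles the per-lane signalling rate (2.5 -> 5 GT/s),
# so a board can halve the lane count and keep the same bandwidth.

MB_PER_LANE = {1: 250, 2: 500}  # per direction, after 8b/10b encoding

gen1_x16 = 16 * MB_PER_LANE[1]
gen2_x8 = 8 * MB_PER_LANE[2]
print(gen1_x16, gen2_x8)  # 4000 4000 -> x8 at 2.0 matches x16 at 1.x
[/code]

Half the lanes means half the traces to route, which is the cheaper-boards argument.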
I looked at some P965 SLI benchmarks compared to 975X, in other words x16 and x4 compared to two x8 lanes, and unless you were running something higher than an X1950 XTX there was barely anything in it; maybe 5%, and in all honesty I doubt many people could tell the performance hit. That was what I was basing my thoughts on.
However, I've no idea what any 8800 GTX SLI testing has been done on, so god knows... anyone got numbers on that with different lane setups?