Sorry if this has been asked many times, but I don't have time to trawl the forums. I'm currently thinking about building a new PC. The funds aren't totally together yet, so it's not urgent. The question is whether spending the extra cash (about £10) on an Ivy Bridge CPU over a Sandy Bridge is worth it. I don't plan on using the integrated graphics for anything taxing, so that's not a factor. But is the increase in heat worth the money? -W
For 10 pounds? It is. It's not about the increased heat, but the electricity savings that add up over time, while having a slightly faster CPU.
So just to clarify: for the same power rating you get higher speeds. But if the temps are higher, won't that make the fan use more power (motors have a high power consumption)?
Sandy Bridge is rated at 95W, Ivy Bridge at 77W. That's an 18W difference. I don't know of any CPU cooler that uses that much power, even at full speed.
That is TDP. And that PC would have to run 24/7 for years. Do the maths: convert the wattage difference to kilowatts, multiply it by 24 and then by 365, multiply it by your electricity rate per kWh, and scratch your head at that low result.
When I replaced two 580s with two 680s I noticed a saving of a nice 300 watts; that's when it's sort of worth it. For the sake of completeness, the cost difference:
95W - 77W = 18W
18W x 24h = 432Wh (0.432 kWh) per day
432Wh x 7 = 3,024Wh (3.024 kWh) per week
3,024Wh x 52 = 157,248Wh (157.248 kWh) per year
At 25 pence per kWh (UK average; cost varies a lot, I just took one of the five rates I bothered to look at), that works out to about £39 a year at 24/7 usage, which is more than the price difference between Sandy and Ivy Bridge at stock settings. Shockingly more than I thought it would be.
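The worst-case sum above can be sketched in a few lines. This is just the thread's own arithmetic (18W TDP gap, 24/7 load, 25p/kWh UK rate, all assumptions from the posts above); using 365 days rather than 52 weeks gives a figure a few pence higher.

```python
# Worst-case annual cost of the TDP gap, assuming 24/7 full load
# and a 25p/kWh rate (both assumptions taken from this thread).
watts_diff = 95 - 77                                # Sandy TDP - Ivy TDP = 18 W
kwh_per_year = watts_diff * 24 * 365 / 1000         # ~157.7 kWh/year
cost_per_kwh = 0.25                                 # GBP, rough UK average
annual_cost = kwh_per_year * cost_per_kwh
print(f"{kwh_per_year:.1f} kWh/year -> £{annual_cost:.2f}/year")
# -> 157.7 kWh/year -> £39.42/year
```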
But that assumes 24/7 flat-out usage, doesn't it? In reality, with the power-saving measures Intel put in, plus sleep/off etc. and general usage habits, that figure would be massively lower, or am I wrong?
@rollo: And now back to the calculator, because TDP is one thing and real power consumption is another. For some reason the bit-tech review showed a rather huge idle power consumption difference, which somehow isn't present anywhere else. Everywhere else the measured difference at idle was pretty much zero, 1-2 watts at best (which is kind of expected, as the idle power usage of Sandy Bridge was already very low, in the region of 20-30W, so you can hardly get down by even a single watt).

Sure, at load you might see maybe a 10 watt difference (again, keep in mind the difference between TDP and real power usage: the Pentium G620 has a TDP of 65W, yet I had problems getting the power consumption of the whole system above 50W). But how big a part of the day does your PC run at full load? Unless you fold 24/7, not much. Let's say you run your CPU at full load for 2-3 hours per day (a gaming session): your savings will be one tenth of the value you calculated, that is £3.90, or even less. So for this optimistic usage pattern, the power consumption difference would take 2 to 3 years to show up.

What I was trying to say is that unless you fully load your CPU 24/7, the difference in cost is so small that it's irrelevant as a deciding factor. PCI-E 3 is a deciding factor, worse thermal performance is a deciding factor; power consumption, not so much.

PS: Most computer fans use 1-4W.
Isn't it mostly a question of how much OC'ing you intend to do? If that's the raison d'être, then Sandy Bridge would probably be the better choice. In any other case Ivy Bridge wins. Well, unless you were thinking of _upgrading_ from SB to IB; that's not worth it at all.
I posted a worst-case scenario; I didn't have time for the million usage scenarios. I know graphics cards are the bulk of the power draw either way.
I'm not sure that PCIe 3.0 is worth it with current GPUs. I don't see any GPU taking advantage of it for a year (or probably two). Just look at the negligible performance difference between PCIe x8 and PCIe x16.
Yet our current crop of cards is a lot faster than a 5870... http://www.hardwarecanucks.com/foru...s/53901-nvidia-geforce-gtx-690-review-25.html This is a bit more up to date. PCIe 3.0 will see a handful of FPS increase, but not much to write home about.
... with a GTX 690. Graphics cards using a single GPU won't see any benefit from PCIe 3.0... and if you can afford a GPU that is limited by PCIe x16, then you won't mind paying a €10 premium on the CPU.
If the PC is not being used 24/7 the power usage/energy saving is a moot point. I would say just go for the newer tech; absolutely no reason to get SB now unless you're trying to save money by buying second hand.
Just to throw something else into the mix. When I bought a month or two ago, I got a Z77 board so I'm ready for all that new shiz and have USB3 native... and then I dropped a Sandy Bridge Pentium in there for no money. I'm not overclocking or gaming with current titles, but the machine's silent, perfectly quick, and I can drop an Ivy Bridge CPU in whenever I want. Just a thought.