My trusty old 1366 rig has had its 'never enjoyed overclocking' i7 920 swapped out for a Xeon X5660, which is currently running at 3.8GHz on a low profile air cooler @ 73 degrees under Prime95 (idle = 43 degrees). It was also happy at 4.1GHz, but that took the under-load temps to over 80, so I've cranked it back until I'm confident I'm not going to kill it. Real basic overclock, just changed the base clock. Some questions:

1. How happy are Xeons at higher temps, say 80+?
2. For my needs it's not going to be worth replacing the cooler with something like an AIO water job, as it'll be a hassle to fit in my old school Coolermaster ATC. Any thoughts on that?

This is mainly a workstation to proof-of-concept a bit of VR dev using UE4. The old 2GB GTX 770 in the machine will do for now, but will be swapped out for something in the 10 series at a later date. So...

3. If I put in a high end card such as a 1080 or 1080 Ti, how much of that card's potential am I going to see at 1440p/VR type resolutions? I'm assuming I'm likely to get an average of 80+% of it depending on activity.
4. There are apparently some benchmark results around somewhere (Hardware Unboxed YouTube?) showing a GPU benched against lots of different CPUs. Can I find it? Hell no.
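For anyone following along, the "just changed the base clock" bit is simple arithmetic: core clock = BCLK × multiplier. A minimal sketch, assuming the stock X5660 figures of a 133 MHz base clock and a 21× multiplier (boards and BIOS readouts may report these slightly differently):

```python
# Rough BCLK overclock arithmetic for a Westmere Xeon X5660.
# Assumed stock figures: 133 MHz base clock x 21 multiplier = ~2.8 GHz.
STOCK_MULT = 21

def core_clock_ghz(bclk_mhz, mult=STOCK_MULT):
    """Core clock in GHz for a given base clock (MHz) and multiplier."""
    return bclk_mhz * mult / 1000

def bclk_for_target(target_ghz, mult=STOCK_MULT):
    """Base clock (MHz) needed to hit a target core clock."""
    return target_ghz * 1000 / mult

print(round(core_clock_ghz(133), 2))  # ~2.79 GHz stock
print(round(bclk_for_target(3.8)))    # ~181 MHz BCLK for 3.8 GHz
print(round(bclk_for_target(4.1)))    # ~195 MHz BCLK for 4.1 GHz
```

Note that pushing BCLK also drags up the uncore, QPI and memory clocks unless you drop their ratios to compensate, which is why BCLK overclocks need a bit more babysitting than multiplier overclocks.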
1. IIRC you can run 1366 CPUs in the 80s all day with no penalty. They were bloody hot power guzzlers. My 950 used to hit 89c @ 4GHz.
2. On higher core count chips AIOs usually offer the best cooling.
3. Quite a lot, providing you can get around 4GHz out of the CPU. Remember, the higher the res, the more the emphasis shifts to the GPU, not the CPU.
4. No, because reviewers very rarely do articles that include older hardware. I game at 1440p with a Fury X and a 3.1GHz 8-core Ivy and it's more than good enough.
73c under Prime is absolutely fine. 80c is more or less where I'd cut it off for those chips. You're going to need a powerful card to do VR, and the 770 definitely won't cut it. You should see a huge gain, at least 75%, over the 770. I'd wager that you'd see 100%+ improvement in some titles. The easiest way to find GPU comparisons is to use the Anandtech site which can be found here: http://www.anandtech.com/bench/GPU16/1489
Xeons don't mind heat that much - 99% of servers run passive heatsinks with Deltas at the front and some kind of baffle directing the airflow. However, servers are also 99% of the time kept in temperature controlled rooms. I wouldn't run them over 80, really.
A poor-airflow case and a low profile cooler will be the problem; the Xeon should run cooler than that. I would find a slightly more competent cooler, or be happy at a slightly lower clock. In games which are nicely threaded I would have thought a 1080 wouldn't be terribly bottlenecked at that resolution, but it will vary from game to game.
Guinevere - I've got a Corsair H115i AND a S1366 Extreme Edition air cooler off the 980X for sale here: https://forums.bit-tech.net/showthread.php?t=321676 Either would help you out.
http://ark.intel.com/products/47921/Intel-Xeon-Processor-X5660-12M-Cache-2_80-GHz-6_40-GTs-Intel-QPI Tcase 81.3c. You can run these in the 80s all day, all night. They were hot, greedy power guzzlers. The 950, IIRC, was 140w. By the time you clocked it to 4GHz it would guzzle down about 220w - i.e. no better than Bulldozer, just much faster. Now, whether I would want to run one in the 80s? No, I doubt I would. My 950 was crap and used to go into the 80s on an NH-D14, so I never did bother overclocking it hard. It wasn't that I was worried about the CPU; I just didn't want all that heat being dumped into my tiny office.
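That 140w-to-220w jump is roughly what the standard CMOS dynamic power approximation (P scales with frequency times voltage squared) predicts. A back-of-the-envelope sketch, where the voltages are illustrative assumptions rather than measured figures:

```python
# Sanity check of the "140w stock -> ~220w at 4GHz" claim using the
# standard dynamic power approximation: P scales with f * V^2.
# The voltages below are illustrative assumptions, not measured values.
def scaled_power(p_stock_w, f_stock_ghz, f_oc_ghz, v_stock, v_oc):
    """Estimate overclocked power draw from stock power, clocks and vcore."""
    return p_stock_w * (f_oc_ghz / f_stock_ghz) * (v_oc / v_stock) ** 2

# i7 950: ~3.06 GHz stock, taken as ~140w per the post above; assume
# 1.20 V stock vcore and 1.35 V for a 4 GHz overclock.
print(round(scaled_power(140, 3.06, 4.0, 1.20, 1.35)))  # ~232 w
```

The exact number depends heavily on the vcore your chip needs, but the shape of it explains why these chips got so hot so fast once pushed.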
https://www.techpowerup.com/reviews/ASUS/GTX_1080_Ti_Strix_OC/29.html The 770 has similar performance to the 960, so it will be night and day. Just thinking out loud, but it might be worth considering a graphics card that comes with an AIO, such as the MSI Seahawk, if you can squeeze the rad somewhere it dumps the heat outside the case. That way you don't get the limitations of the stock GPU cooler, and there's no extra heat for the CPU cooler to deal with.
Yeah the 770 is knocking on a bit now, and wasn't that much faster than a 680. It would be a stratospheric upgrade IMO.
Thanks guys. I didn't phrase my question quite right. I know a decent GPU will knock the 770 for six and benchmarks are everywhere. It's more a case of: "What % of a 1080Ti's maximum performance will I see on my Xeon at say 4GHz compared to something like a 7700K?" I think it's obvious that a "Best in class" CPU will outperform my Xeon, but by how much when running the same GPU and same tasks? Yes I know it'll vary from title to title. I'm basically trying to avoid buying a new rig, which is what I'll have to do if I can't make this one last a while longer. As I may be buying a new GPU, Rift & Vive I'm trying to avoid adding Case, Mobo, CPU, Ram, M.2 etc to the shopping list!
It all depends. Does the game like a fast CPU? Does it like cores? Does it like fast memory, like a few that we have seen lately? I don't know. Personally I would not bother with the 1080 Ti, though. My reasons? Simple: I don't think the CPU is going to be fast enough, and to extract all of the 1080 Ti's performance you NEED water. Without it you'll only get 10% or so over a 1080, and that's with the latest, fastest CPU. I think I could cap out that Xeon at a 1080 and then upgrade the whole lot later on.
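The "what % of the card will I see" question boils down to a limiting-factor model: your frame rate is capped by whichever of CPU or GPU is slower for that title at that resolution. A toy sketch with purely illustrative numbers (not benchmark results):

```python
# Toy bottleneck model: delivered frame rate is capped by whichever of
# the CPU or GPU is the slower limit for a given title and resolution.
# All numbers below are illustrative, not benchmark results.
def delivered_fps(cpu_fps_cap, gpu_fps_cap):
    return min(cpu_fps_cap, gpu_fps_cap)

def gpu_utilisation(cpu_fps_cap, gpu_fps_cap):
    """Fraction of the GPU's potential frame rate actually delivered."""
    return delivered_fps(cpu_fps_cap, gpu_fps_cap) / gpu_fps_cap

# Hypothetical 1440p case: the GPU could render 100 fps, but the older
# CPU can only prepare 85 frames' worth of draw calls per second.
print(gpu_utilisation(85, 100))  # 0.85 -> you see 85% of the card
# Push the resolution up and the GPU cap drops below the CPU cap:
print(gpu_utilisation(85, 70))   # 1.0 -> fully GPU-bound
```

Which is why the "higher res shifts the emphasis to the GPU" advice earlier in the thread holds: at 1440p/VR resolutions a clocked Westmere hexacore mostly stays out of the way, except in heavily CPU-bound titles.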
TBH, you'd be better off with the regular GTX 1080 - it should have come down in price with the release of the 1080 Ti (hunt around for a good deal). It's got more than enough power for VR, and you don't have to worry so much about bottlenecking... If that's too expensive then the GTX 1070 can be had for around £300-325 these days (Amazon warehouse).
Yup, indeed. I have seen them changing hands for less than £400. Even at full price, the $ per FPS of the 1080 Ti is very, very poor - far worse than the 1080's.