
Hardware: How many CPU cores do games need?

Discussion in 'Article Discussion' started by Sifter3000, 5 Jul 2010.

  1. Kojak

    Kojak Who loves ya baby

    Joined:
    3 May 2010
    Posts:
    251
    Likes Received:
    13
    That's true
     
  2. MorpheusUK

    MorpheusUK a Noob that knows something

    Joined:
    24 Sep 2009
    Posts:
    111
    Likes Received:
    3
    I find my E8300 (running at sinful stock speeds) & GF260 are still more than capable of running all my games with a fair amount of eye candy @ 1680x1050.

    Would be good to see a line-up of same-speed CPUs and where cache plays a part.

    I may be wrong here, but I think Flight Sim X may make use of a multi-core gaming rig.
     
  3. John_T

    John_T Minimodder

    Joined:
    3 Aug 2009
    Posts:
    533
    Likes Received:
    23
    An excellent idea for an article, and I can see a LOT of work has gone into it (so congratulations for that), but...

    ...I can't help feeling some of the comments on here do have a point, in that the testing may itself have been inherently flawed. That's not to suggest the article is a waste of time or anything (it certainly isn't, it's good), it's just that to me it looks more complicated than simply looking at the FPS achieved - and looking at the same results as you, I'm not sure I draw the same conclusions.

    For instance, as some have already mentioned, how do we know that some of the games aren't being GPU-limited instead of CPU-limited, thus causing additional cores simply not to take up the workload? That some games may not utilise multiple CPU cores efficiently doesn't automatically mean they won't use them at all. Surely the similar frame rates at the top end leave that open as at least a possibility? As does the fact that the CPU cores aren't maxing out at 100% - or in some cases coming anywhere close to it?
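
    As an aside, the "cores aren't maxing out" point is easy to sanity-check at home by logging per-core load while a benchmark runs. A rough Linux-only sketch (my own illustration, not anything from the article), assuming g++ and the /proc/stat counter format:

    // percore_load.cpp - print per-core CPU utilisation once a second (Linux).
    // Build: g++ -std=c++17 -O2 percore_load.cpp -o percore_load
    #include <cctype>
    #include <chrono>
    #include <fstream>
    #include <iostream>
    #include <sstream>
    #include <string>
    #include <thread>
    #include <vector>

    struct CoreTimes { unsigned long long busy = 0, total = 0; };

    // One snapshot of the per-core "cpuN ..." counters from /proc/stat.
    static std::vector<CoreTimes> snapshot() {
        std::vector<CoreTimes> cores;
        std::ifstream stat("/proc/stat");
        std::string line;
        while (std::getline(stat, line)) {
            if (line.compare(0, 3, "cpu") != 0 || line.size() < 4 || !std::isdigit(line[3]))
                continue; // skip the aggregate "cpu" line and non-cpu lines
            std::istringstream in(line);
            std::string label;
            unsigned long long user, nice, sys, idle, iowait, irq, softirq;
            in >> label >> user >> nice >> sys >> idle >> iowait >> irq >> softirq;
            CoreTimes t;
            t.total = user + nice + sys + idle + iowait + irq + softirq;
            t.busy = t.total - idle - iowait;
            cores.push_back(t);
        }
        return cores;
    }

    int main() {
        // Run the game or timedemo alongside this; a couple of busy cores with
        // the rest near idle suggests the load sits on only those cores.
        for (int sample = 0; sample < 30; ++sample) {
            auto a = snapshot();
            std::this_thread::sleep_for(std::chrono::seconds(1));
            auto b = snapshot();
            for (std::size_t i = 0; i < a.size() && i < b.size(); ++i) {
                double total = double(b[i].total - a[i].total);
                double busy = double(b[i].busy - a[i].busy);
                std::cout << "core" << i << ' '
                          << (total > 0 ? 100.0 * busy / total : 0.0) << "%  ";
            }
            std::cout << '\n';
        }
    }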

    Ignoring for a moment the FPS charts and looking solely at the graphs of CPU performance, it seems a very mixed bag to me. In games like Crysis and Call of Duty: Modern Warfare 2 it's very clearly two or three CPU cores carrying the bulk of the load, but as you yourself noted, that isn't the case in Battlefield: Bad Company 2 or Colin McRae: Dirt 2.

    If the point of this exercise was not to test the prowess of the specific CPU or GPU themselves, but rather to test the take-up of multiple CPU cores, would it not have been better to have used the most powerful graphics card available, the 5970, and to have underclocked the CPU (down to perhaps 2.0 or even 1.5GHz), thus doing much more to eliminate any potential GPU limitation?

    It just seems to me that the conclusion that two or three cores do most of the work, and therefore a quad-core CPU is the best upgrade, isn't necessarily true. It's true now - with our current generation of graphics cards potentially limiting the impact of six cores - but it may not be true when the next gen of cards comes out, and I should think most people here upgrade their graphics cards far more often than their CPUs (I do anyway).

    Obviously money is the deciding factor for most people (so it's an exercise in balancing anyway), but for those that can afford it, I'd personally guess from these charts that six CPU cores could become relevant far sooner than perhaps expected...
     
  4. Lance

    Lance Ender of discussions.

    Joined:
    6 May 2010
    Posts:
    3,220
    Likes Received:
    134
    Great article.

    I am very glad I upgraded to an i7 (from a dual-core E2160). Would I have gone all the way to a 980X? No. But I don't feel that I'm missing out because of it any more.
     
  5. John_T

    John_T Minimodder

    Joined:
    3 Aug 2009
    Posts:
    533
    Likes Received:
    23
    I was going to mention that game too, as an aside! (Then I decided I'd already written too much anyway.)

    I'd love to see Flight Sim X tested more - it can still cane most PCs today (especially with some of the add-ons installed), and it's older than Crysis...
     
  6. Action_Parsnip

    Action_Parsnip What's a Dremel?

    Joined:
    3 Apr 2009
    Posts:
    720
    Likes Received:
    40
    It's the fighting over cache. The other two cores light up periodically with minor tasks and start cutting into the cache reserve. The single-core results are massaged higher by that one core having near-exclusive access to 12MB of L3 cache. Leading me on to.....


    This article has issues. Namely, for a site that pushes Crysis as a means to test GPUs, CPUs, SSDs and HDDs, they ran a timedemo, which as we all SHOULD KNOW disables A.I. in this game. The test is running without A.I., and if it's a timedemo of the harbour level I would guess physics wouldn't come into play either. Timedemos do not represent in-game performance in Crysis. This is a fact. You've gone and cut out a chunk of CPU load.

    Also, the conclusion doesn't mention that multi-GPU setups require more CPU power to mitigate the driver overhead of the second/third/fourth GPU. Seeing as 'gamers' are referred to in the conclusion, surely a mention of a reasonably common 'gamer's' setup would have been a good idea.

    Pull MW1 from the results. Even the second one is a walk in the park for any GPU of any worth.
    Why was X3 not included in the test, when it's in your CPU tests? Why not throw a bone to the idea that more cores may be more useful by testing GTA4? Or should all those dual-core people run around with 20% traffic density...

    Last, but not least, nothing is made of the fact that the 980 processor has 12 MEGABYTES OF L3 CACHE. On what planet does a dual core have 12MB to use? So why not mention that when referring to dual-cores in the conclusion? People round here now think their 3MB L2 and 2MB L2 CPUs are doing them fine and that a quad-core well endowed with cache won't see them any better. Disagree with that point? Someone tell me, on the strength of this article, how you would ascertain that.
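
    For anyone who wants to see cache capacity show up in numbers, here's a rough pointer-chase sketch (my own illustration, nothing from the article; the sizes where the latency jumps depend on your CPU's actual L1/L2/L3):

    // cache_sweep.cpp - time dependent loads over growing working sets; latency
    // steps up each time the working set spills out of L1, L2 and then L3.
    // Build: g++ -std=c++17 -O2 cache_sweep.cpp -o cache_sweep
    #include <chrono>
    #include <iostream>
    #include <numeric>
    #include <random>
    #include <utility>
    #include <vector>

    int main() {
        std::mt19937 rng(42);
        for (std::size_t bytes = 32 * 1024; bytes <= 64 * 1024 * 1024; bytes *= 2) {
            std::size_t n = bytes / sizeof(std::size_t);
            std::vector<std::size_t> next(n);
            std::iota(next.begin(), next.end(), std::size_t{0});

            // Sattolo's algorithm: a single-cycle permutation, so the chase
            // visits every slot before repeating (and defeats the prefetcher).
            for (std::size_t i = n - 1; i > 0; --i) {
                std::uniform_int_distribution<std::size_t> d(0, i - 1);
                std::swap(next[i], next[d(rng)]);
            }

            const std::size_t steps = 10'000'000;
            std::size_t idx = 0;
            auto t0 = std::chrono::steady_clock::now();
            for (std::size_t s = 0; s < steps; ++s) idx = next[idx];
            auto t1 = std::chrono::steady_clock::now();

            double ns = std::chrono::duration<double, std::nano>(t1 - t0).count() / steps;
            std::cout << bytes / 1024 << " KB working set: " << ns
                      << " ns/load (sink " << idx << ")\n";
        }
    }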
     
  7. Action_Parsnip

    Action_Parsnip What's a Dremel?

    Joined:
    3 Apr 2009
    Posts:
    720
    Likes Received:
    40
    Tell Bindi that going back to 'highest playable' settings as a form of results, like in the bit-tech of old, is the right way forward.
     
  8. thehippoz

    thehippoz What's a Dremel?

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    well, it's been well known for quite some time.. since early 2007 actually, that cpu cache plays little part in gaming- it's the speed you want.. like I still remember the budget e7400 was doing 4.5GHz OCs, and in gaming benchmarks it blew the socks off the quads of the day- they had no chance of seeing those numbers

    the results spoke for themselves.. this is all really old news, at least over here.. the reason you would want more cores is for other applications unrelated to gaming, like 3d rendering.. to have 6+ cores for just gaming is, well, kind of retarded :D
     
  9. Action_Parsnip

    Action_Parsnip What's a Dremel?

    Joined:
    3 Apr 2009
    Posts:
    720
    Likes Received:
    40
    Incorrect. You have not done your reading.
     
  10. thehippoz

    thehippoz What's a Dremel?

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    well, you can read all you want.. I'm going by what us real gamers were benching back then.. we figured out really quickly that cache doesn't matter- it's the clock speed where you see the actual gains

    funny too, the lower-cache chips produced much higher clocks.. quads are nice but limited in how far you could push them.. nowadays is another matter.. 4GHz is common- as a gamer you're looking at clock speed and number of cores; cache doesn't matter

    guess what I'm saying is.. if you have a choice between a high-cache chip that doesn't clock.. or a low-cache chip that does- you go with the low-cache chip
     
  11. Action_Parsnip

    Action_Parsnip What's a Dremel?

    Joined:
    3 Apr 2009
    Posts:
    720
    Likes Received:
    40
    It does help form opinions.
     
  12. l3v1ck

    l3v1ck Fueling the world, one oil well at a time.

    Joined:
    23 Apr 2009
    Posts:
    12,956
    Likes Received:
    17
    So games work best on three cores. The Xbox 360 has three cores. As most PC games these days are developed for the Xbox too, I can't say this result is a surprise.
    I wouldn't expect four or more cores to be required until the next generation of consoles comes out.
     
  13. theineffablebob

    theineffablebob What's a Dremel?

    Joined:
    16 Jan 2010
    Posts:
    4
    Likes Received:
    0
    The Bad Company 2 benchmark is kind of useless if you only test it in single-player.

    Multiplayer is what most people play, and that's where the game taxes the CPU the most. Dual-core users have reported massive boosts going to a quad-core - as high as double the average frame rate.
     
  14. Hamish

    Hamish What's a Dremel?

    Joined:
    25 Nov 2002
    Posts:
    3,649
    Likes Received:
    4
    Still a good multicore test!
     
    Xen0phobiak likes this.
  15. Star*Dagger

    Star*Dagger What's a Dremel?

    Joined:
    30 Nov 2007
    Posts:
    882
    Likes Received:
    11
    It is STUNNING that some of the cyber-luddites have interpreted this as an endorsement of dual cores, when in fact it clearly states that quads are needed.

     
  16. Brookstone

    Brookstone What's a Dremel?

    Joined:
    6 Jul 2010
    Posts:
    1
    Likes Received:
    0
    Nice article and many thanks!

    I don't have much computer hardware knowledge, so deciding on a proper configuration for a new PC is a headache for me.

    Now I'm still running an Intel E7400, and it performs everything very smoothly. From your conclusions in the article, I have no reason to upgrade it currently.

     
    Last edited: 15 Jul 2010
  17. Guest-16

    Guest-16 Guest

    It really doesn't. The Koreans in our demo react differently every single time. The main guy even gets stuck between rocks sometimes so I have to start the test again.

    We never recommend multi-GPU setups here at bit-tech ;) Plus it would take exponentially longer to test. I might save it for another day: 5770 CF/GTX 460 SLI.

    It's not worth testing something just because it uses all the cores. It's worth doing if PEOPLE PLAY IT! ;) GTA 4 only narrowly missed the list, though, I admit. I did want more games, but we can't do everything. Crysis is the ONLY exception, because it's still our most clicked page. Like I said, we didn't even bother with SupCom, since we wanted to know about popular, new games.

    It also has triple-channel memory, which most CPUs don't. The article was about CPU scaling under the same conditions. A baseline is required for scientific analysis; then you extrapolate the conclusions by changing a single factor (or minimising the changes). You can read how a Core 2 Duo compares to a Core i5/i7 in our CPU performance articles. If I had used an AMD CPU, people would complain that an Intel is faster, and if I had used a quad - any quad: i5 or Core 2 Quad - people would want to know how the new six-core CPUs handle games.

    Do you understand what the inclusive L3 cache actually does? It's a snoop filter. It stops cores from spending time probing each other's L1/L2 caches to see whether the data they hold is newer than their own. All L1/L2 info is copied to L3 when something is changed, and the cores simply ask the L3 instead of disturbing the other cores' work. Whether you have two or six cores using it makes no difference, because main memory still does the donkey work. It's not L2 cache like on Core 2s.
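
    To put a number on what cross-core cache-line traffic costs - the kind of probing the snoop filter exists to cut down - here's a quick false-sharing sketch (a standalone illustration, not one of our benchmarks): two threads bump counters that either share a 64-byte line or sit on separate lines.

    // false_sharing.cpp - two threads increment counters that either share a
    // cache line (so the line ping-pongs between cores under the coherence
    // protocol) or sit on separate lines. The shared case is much slower.
    // Build: g++ -std=c++17 -O2 -pthread false_sharing.cpp -o false_sharing
    #include <atomic>
    #include <chrono>
    #include <iostream>
    #include <thread>

    constexpr long kIters = 50'000'000;

    struct SameLine { std::atomic<long> a{0}, b{0}; };  // adjacent: one 64B line
    struct Padded {
        alignas(64) std::atomic<long> a{0};             // own cache line
        alignas(64) std::atomic<long> b{0};             // own cache line
    };

    template <typename Counters>
    double run() {
        Counters c;
        auto t0 = std::chrono::steady_clock::now();
        std::thread t1([&] { for (long i = 0; i < kIters; ++i)
                                 c.a.fetch_add(1, std::memory_order_relaxed); });
        std::thread t2([&] { for (long i = 0; i < kIters; ++i)
                                 c.b.fetch_add(1, std::memory_order_relaxed); });
        t1.join();
        t2.join();
        return std::chrono::duration<double>(std::chrono::steady_clock::now() - t0).count();
    }

    int main() {
        std::cout << "same cache line:      " << run<SameLine>() << " s\n";
        std::cout << "separate cache lines: " << run<Padded>() << " s\n";
    }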

    theineffablebob - I appreciate that, but it's impossible to get consistent and reliable results in MP in any game. I wish there were a way to test it accurately, but we tried blowing stuff up as much as possible and shooting lots of people in SP to get the physics and AI working.

    Action - we got more complaints doing highest-playable settings than we do with FPS. People just want to see numbers and draw their own conclusions. It's also impractical to test, because it takes 5x as long (I'm not exaggerating), and unless you've got THE SAME SETUP that we're running - again, scientifically speaking - changing multiple factors relates it to no one.

    OK, all the cores aren't being maxed out, but that just tells us you don't need a powerful CPU to play the game. Those cores are still being used somehow. If we turned down the resolution and detail to relax the stress on the GPU, then that wouldn't be representative of a high-end gamer's setup, and it wouldn't relate directly to the other games tested. I appreciate that it doesn't answer the question on the article front page directly, and I will change the conclusion on Dirt 2 to highlight that the game is not as CPU-intensive as it is GPU-intensive.

    No, the exercise is to relate it to people's actual gaming PCs as much as possible - fast cards, both NV and ATI, and a high resolution. I could have run the resolution at 1280x1024, but that doesn't relate to people's PCs.

    Also, if we turned the CPU speed down we'd start getting a **** frame rate, and people just wouldn't understand and would complain, because that's all they look at. It would also make the gap artificially big for threaded games and not others. If the code is threaded and the cores are there, the engine will use them, regardless of MHz, but I admit it's more difficult to decipher.
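
    That point in sketch form (illustrative only, not engine code): split a workload over however many hardware threads the machine reports, and it spreads across the cores that exist, whatever their clock speed.

    // scale_demo.cpp - a toy "threaded engine" step: divide N work units over
    // all available hardware threads, the way a threaded game engine would.
    // Build: g++ -std=c++17 -O2 -pthread scale_demo.cpp -o scale_demo
    #include <algorithm>
    #include <cmath>
    #include <iostream>
    #include <thread>
    #include <vector>

    int main() {
        const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
        const std::size_t n = std::size_t{1} << 24;
        std::vector<float> data(n, 1.0f);

        std::vector<std::thread> pool;
        for (unsigned c = 0; c < cores; ++c) {
            pool.emplace_back([&, c] {
                // Each worker takes a contiguous slice; nothing here depends on
                // clock speed - the work just spreads over whatever cores exist.
                std::size_t begin = n * c / cores, end = n * (c + 1) / cores;
                for (std::size_t i = begin; i < end; ++i)
                    data[i] = std::sqrt(data[i] * 3.14159f);
            });
        }
        for (auto& t : pool) t.join();
        std::cout << "processed " << n << " elements on " << cores << " threads\n";
    }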

    Graphics card performance is stagnating. Look at the performance difference between a 5870 and a 5970, or even a 4870 X2 and a 5870. It's not massive - it's several FPS. For what, a billion extra transistors, give or take? Dare I even point out that Fermi is now a 3bn-transistor chip, and look how that turned out ;)

    We're not talking about the next generation anyway - we cannot evaluate what doesn't exist. Maybe Windows 8 will be so highly threaded that it makes every multicore magical??? :p What if you upgrade to a six-core now, then AMD Fusion/Intel Sandy Bridge makes it look like a piece of toast in 12 months? You'd be quite unhappy. :lol:

    Also, brand new game engines usually take just as long as CPUs to evolve, and they very rarely (Crysis/GTA4 being the exceptions) tax a system beyond what it can currently do, in order to maximise sales. It's generally a slow progression. I remember we were saying "dual cores are enough" two years ago, and that argument is only now coming to the end of its life.
     
  18. Horizon

    Horizon Dremel Worthy

    Joined:
    30 May 2008
    Posts:
    765
    Likes Received:
    10
    There, there, SupCom, don't listen to what the mean man says - I still love you.
     
  19. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,577
    Likes Received:
    196
    The reason we don't see that much gain, though, is simply this: with dual GPUs, and better-developed drivers for dual GPUs, you don't see that great a jump any more, because the returns have diminished.

    If you think about it, we're getting roughly 100% performance increases every generation - aside from the G92, of course - with about 100% from the HD4870 to the HD5870.

    At the moment dual cores are still enough, especially with a bit of overclocking, but extra cores allow more room to breathe and for multi-tasking.
     
  20. Pete J

    Pete J Employed scum

    Joined:
    28 Sep 2009
    Posts:
    7,251
    Likes Received:
    1,812
    Err, have you got the same graphs as everyone else?

    The data's enough to make me think about disabling two cores to see if I can up the clock rate on the remaining two. In reality I won't, though, as I can't be arsed :blah:.
     