News Core i7 a waste of money for gamers, says Nvidia

Discussion in 'Article Discussion' started by Tim S, 23 Apr 2009.

  1. damikez

    damikez What's a Dremel?

    Joined:
    24 Apr 2009
    Posts:
    1
    Likes Received:
    0
    Have you noticed that the quality settings are set to "medium" or "mainstream"? At those settings the GPU isn't maxed out, which is why you can see the increased performance of the i7 920.

    I have a C2D E6550 (2.33GHz) and recently got a GTX 275 (replacing my 9800 GTX). Playing Crysis with all quality settings maxed out, my FPS at 2.33GHz is around 33; at 3.57GHz it's just around 35. Clearly, the GPU is the bottleneck.

    At this point in time, and probably until the second half of next year, CPU performance won't be that critical when it comes to gaming.
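The bottleneck argument above can be sketched with a toy model: per frame, the CPU and GPU each need a certain amount of time, and the frame rate is set by whichever is slower. The millisecond figures below are made-up illustrations (not the poster's actual measurements), chosen only to mimic the reported behaviour.

```python
# Toy bottleneck model: the frame rate is limited by whichever stage
# (CPU or GPU) takes longer per frame.
def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# Hypothetical numbers: the GPU needs ~30 ms per frame at maxed settings,
# the CPU ~12 ms at 2.33GHz. Overclocking the CPU to 3.57GHz shrinks its
# share of the work, but the GPU still sets the pace.
gpu_ms = 30.0
stock = fps(12.0, gpu_ms)                 # CPU at 2.33GHz -> ~33 FPS
overclocked = fps(12.0 * 2.33 / 3.57, gpu_ms)  # CPU at 3.57GHz -> still ~33 FPS
print(stock, overclocked)
```

In this model a CPU overclock moves the needle only once CPU time actually exceeds GPU time, which matches the 33 vs 35 FPS observation: the card, not the chip, is the limit.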
     
  2. metarinka

    metarinka What's a Dremel?

    Joined:
    9 Feb 2003
    Posts:
    1,844
    Likes Received:
    3

    I want to call out right now and say that audio/music production generally isn't that CPU-intensive, especially when any good audio interface will be handling the sound processing.

    I do quite a bit of music production myself, 100% soft synths, on a Core 2 Duo 8400 with 4 gigs of RAM and an E-MU 1820M. Never once has my CPU usage crept past 50% in Ableton Live, and that's going full out with 20+ tracks and a heavy amount of chaining and ReWiring into Reason 4.

    I've heard of a few physics-modelling synths that take up some CPU power, but please tell me what you are doing audio-wise that justifies an i7.

    Compared to video or image editing, audio editing is relatively low-powered when using a proper audio interface.
     
  3. dyzophoria

    dyzophoria Minimodder

    Joined:
    3 May 2004
    Posts:
    392
    Likes Received:
    1
    What's with NVIDIA nowadays? Instead of bashing everybody else, why not just spend their time researching a new GPU? From the way I see it, NVIDIA has only two things in mind these days: 1) bash Intel, 2) rebadge every product they have.
     
  4. V3ctor

    V3ctor Tech addict...

    Joined:
    10 Dec 2008
    Posts:
    583
    Likes Received:
    2
    Q6600 Cheesecake... best Intel CPU :D We have quad cores... and those *******s aren't even used at the max of their potential... i7 is just a waste of money (unless you have a socket 939 AMD, or a P4 3.4 HT). Maybe the new "Sandy Bridge" architecture in 2011 will make me switch my Q6600 and my 4870.
     
  5. xaser04

    xaser04 Ba Ba Ba BANANA!

    Joined:
    27 Jun 2008
    Posts:
    2,306
    Likes Received:
    235
    Probably because they sell chipsets/motherboards that can take a Core 2 Duo/Quad, yet don't have anything to run with the i7.

    After doing a little testing myself when I had an HD 4870 X2, I found there was a noticeable difference in minimum framerates between running the card on my old Core 2 Duo (3GHz) and my new i7 920 (2.66GHz). Of course, this difference would be minimal with the card I have now (downgraded to an HD 4850 as I realised I didn't need the GPU horsepower).

    Personally, given how the prices of DDR3 have dropped along with 'affordable' X58 motherboards, if I wanted a decent quad-core system I wouldn't bother with Core 2s anymore and would move straight to the i7. Of course, if I wanted a pure gaming PC with a fast single GPU, I would go for a mid-range Core 2 Duo and clock it.
     
  6. lewchenko

    lewchenko Minimodder

    Joined:
    17 Dec 2007
    Posts:
    367
    Likes Received:
    5
    I normally upgrade to the latest and greatest, but when Core i7 came out I was not impressed. Despite the rave reviews that magazines and websites were giving it, for a gamer... the improvements could not be cost-justified.

    So I upgraded my Core 2 E6750 chip to a quad-core Q9550 and overclocked it to 3.8GHz... I then had money to spare to upgrade my old 8800 GTX as well, to a GTX 260 Core 216 XFX Black Edition. Now it eats games for breakfast.

    Those upgrades cost me a fraction of what building a new Core i7 machine would have... mainly due to the cost of a new motherboard, new memory and a new CPU/cooler (plus a new GPU on top).

    In this recession, people with Core 2s could get much better bang for buck by upgrading... not replacing.
     
  7. perplekks45

    perplekks45 LIKE AN ANIMAL!

    Joined:
    9 May 2004
    Posts:
    6,928
    Likes Received:
    1,041
    I think we all agree by now that upgrading to i7 is only a good [read: future-proof] idea if you're not already on a recent C2D, C2Q or Phenom CPU.

    If I had anything to say at either of the big companies I'd just fire half my marketing staff for being nothing but stupid kids.
     
  8. Shielder

    Shielder Live long & prosper!

    Joined:
    26 Jul 2007
    Posts:
    596
    Likes Received:
    0
    The new helicopter sim Black Shark (very, very good; gonna install it when I can afford it!) has a little utility that can utilise up to 8 cores (i7 with HT) for the game. I have played it on my father-in-law's rig and it is awesome! My 6-year-old loves it too...

    Andy
     
  9. jimmymcjimmy1

    jimmymcjimmy1 What's a Dremel?

    Joined:
    26 Apr 2009
    Posts:
    4
    Likes Received:
    0
    I have got to say that until I read the latest Custom PC CPU guide, I was under the impression that the i7 CPUs had taken PC performance to a whole new level, and I was seriously contemplating my next upgrade from my Asus P5K Premium mobo / Q6600 based rig.

    What the CPC CPU guide has highlighted to me is that, with my Q6600 clocked at 3.4GHz, the substantial cost of an i7 upgrade for a relatively small gain in overall PC performance is simply not worth it at the present time. I would also agree with the CPC analysis that a more cost-effective upgrade path from a Q6600 would be to install a Q9650. So yes, in my opinion the man from Nvidia is bang on the nail!
     
  10. thehippoz

    thehippoz What's a Dremel?

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    Well, I wouldn't say anywhere near bang on... he's trying to sell Nvidia SLI chipsets over Intel's, which is the only reason he's even talking... and Nvidia chipsets are crap compared to Intel's, IMO. Not to mention SLI is a big pile of marketing... the single-slot X2 cards are a better way to go if you really need the push for a 30" monitor.

    The reason I say this is that you're drawing way too much power to justify what you get back with SLI... they would love everyone to fall for SLI (and they do... common sense tells you two is better than one)... twice the money! Oh yeah, let's throw in a 3rd card to run PhysX! :D And they laugh all the way to the bank.
     
  11. adam_bagpuss

    adam_bagpuss Have you tried turning it off/on ?

    Joined:
    24 Apr 2009
    Posts:
    4,235
    Likes Received:
    151
    If i7 is a waste of money, then so is SLI.

    2x GPU does not equal 2x performance, and it varies completely across games: some like SLI and boost frames a bit, others hate it and can actually be worse.

    Nvidia shouldn't be commenting on what's a waste of money, since by the same logic they would also have to say don't buy SLI, because it's a waste of money too.

    At least an i7 does other things besides gaming, at which it's amazingly fast.

    2x GPU is good for only one thing: gaming. That's it. Well, maybe a few other 3D apps, but that's it.

    i7 920 + single high-end GPU = fast performance at everything.

    Slow/average CPU + 2x GPU = fast gaming, mediocre performance at everything else.

    Now which one would you pick?
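The "2x GPU does not equal 2x performance" point can be illustrated with an Amdahl-style scaling sketch: only some fraction of per-frame work actually splits across cards, while sync and driver/CPU submission overhead stay serial. The scaling fractions below are hypothetical, not measured SLI numbers for any real game.

```python
# Amdahl-style sketch: a fraction p of per-frame work scales across
# n_gpus cards; the remaining (1 - p) stays serial regardless of GPU count.
def sli_speedup(p, n_gpus=2):
    return 1.0 / ((1.0 - p) + p / n_gpus)

# Hypothetical per-game scaling fractions:
print(sli_speedup(0.9))   # a game that likes SLI     -> ~1.82x
print(sli_speedup(0.4))   # a game that scales poorly -> ~1.25x
```

Even a well-scaling title falls short of 2x for twice the cost and power draw, and a poorly-scaling one barely moves, which is the spread between games the post describes.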
     