
Hardware Intel Core i5-4670K (Haswell) CPU Review

Discussion in 'Article Discussion' started by Meanmotion, 12 Jun 2013.

  1. fluxtatic

    fluxtatic What's a Dremel?

    Joined:
    25 Aug 2010
    Posts:
    507
    Likes Received:
    5
    Not sure what Intel or AMD could do to make it better, but this is why the PC market is starting its downward slide - how often over the last three generations have you seen something to the effect of "meh, my Core 2 is still competitive enough"?

    For me, as an AMD sort, I'm not feeling a big urge to dump my Phenom II X3 - OC'd to 3.2, it feels snappy enough for most anything I use it for. It seems to crunch through massive Excel spreadsheets a bit faster than the first-gen Core i3 I have at work.

    Next will likely be a VGA upgrade - my GTS450's a bit long in the tooth. By the time I'm really itching to upgrade my processor, Steamroller will likely be out and it will be time for an entire platform upgrade.
     
  2. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    386
    Yea, because server hardware never makes the transition to desktops, does it :duh:
    You best tell these guys that they are wasting their time, otherwise they are going to be wasting a lot of money :hehe:

    http://www.crucial.com/promo/DDR4.aspx
    And you best get some emails off to the rest of the world, as it seems they are all about to make a terrible mistake.
    http://flyingsuicide.net/articles/what-ddr4-will-mean-to-the-desktop-user/
     
  3. Guest-16

    Guest-16 Guest

    Well I'd rather believe my wife who actually works in the memory industry, but there we go :) Things can change I suppose.
     
  4. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,577
    Likes Received:
    196
    Funny how your article mentions a release date of around 2014-2015. So Bindi's pretty much right then.

    Just pointing it out. :D :thumb:
     
  5. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    386
    Well, if you want to say a release date of 2014-15 is as Bindi said, that it's not coming to the desktop.
    I'm not sure how something rumoured to be supported with Skylake, also expected in 2014-15, is something
    "not likely to become a desktop product" :confused:
     
  6. ZeDestructor

    ZeDestructor Minimodder

    Joined:
    24 Feb 2010
    Posts:
    226
    Likes Received:
    4
    Intel is going for the ultrathin/tablet market, because that's where the big money is. As much as I don't like it, that's the reality of today's market. Soon we'll be back to using top-end Xeons in our desktop builds as the mainstream desktop dies in favour of laptops....

    Haswell moves the VRM into the CPU. It needs a whole new motherboard layout and power delivery circuit as a result. Might as well change the socket to "simplify" consumers' lives, since you need a new motherboard anyway.

    If you have a high-end, current-gen GPU, you'll be seeing your CPU limiting things in games, but then again, you don't game, so all is well for you.

    As I said above, it's not your limiting factor as of now. And even if you don't need/want to upgrade, others do.

    For instance, my Penryn Core 2 Duo Latitude is visibly slower than my much newer dual-core Sandy Bridge X220 Tablet. I don't see it when using it for office or web browsing, but start compiling stuff or building bitfiles for FPGAs in Xilinx ISE and suddenly the difference is visible without even breaking out the stopwatches.
     
  7. Guest-16

    Guest-16 Guest

    Will we still have a desktop market by then? :D
     
  8. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    386
    God I hope so :waah: :hehe:
    But as ZeDestructor said
    "Soon we'll be back to using top-end Xeons in our desktop builds as the mainstream desktop dies in favour of laptops...."
    Damn those tablets and laptops, we were here first. Stop stealing my CPUs and get off the blooming grass. :p
     
  9. Hakuren

    Hakuren What's a Dremel?

    Joined:
    17 Aug 2010
    Posts:
    156
    Likes Received:
    0
    So basically Haswell is a Nehalem fork made to be more energy efficient. Cute.

    i7 920 going strong, and I'm keeping it for the foreseeable future. Jumping from C2D to LGA1366 was like switching a Ferrari for a MiG-25 Foxbat. There is absolutely no reason to upgrade (except some catastrophic occurrence) if you have anything from Nehalem upwards.
     
  10. Xir

    Xir Modder

    Joined:
    26 Apr 2006
    Posts:
    5,397
    Likes Received:
    118
    Wasn't it possible to overclock not by raising the base frequency but by enabling more boost steps?

    So it'd run at a stock 3.4 but boost to 4.4 instead of 3.8?
     
  11. rayson

    rayson Damn sure it was legal

    Joined:
    23 Jul 2010
    Posts:
    238
    Likes Received:
    0
    Yes, but I don't play anything other than BF3 and maybe Borderlands 2. Skyrim is not really my kind of game, nor are any of the CPU-limited games I can think of.
     
  12. Harlequin

    Harlequin Modder

    Joined:
    4 Jun 2004
    Posts:
    7,086
    Likes Received:
    180
    AMD dual cores are currently just as `fast` in BF3 as an octo-core anything - it's very GPU limited.
     
  13. ZeDestructor

    ZeDestructor Minimodder

    Joined:
    24 Feb 2010
    Posts:
    226
    Likes Received:
    4
    I don't mind actually. I'm more than happy to see the mainstream market collapse in favour of tablets and laptops. No more poorly designed desktops all over the place, and with any luck and the help of Ultrabooks, poorly designed laptops too.

    You, good sir, have clearly not used a Sandy/Ivy Bridge machine. Just the improved core and new memory controller puts it ahead of the Nehalems quite a bit, and then you whack 'em up to 4.4+GHz and it simply DESTROYS a Nehalem-based machine.

    Nehalem should start CPU-limiting around now as well: it can't clock as high, and doesn't have the IPC to keep up either. If you're on Nehalem, you should start planning for the generation after Broadwell (the 14nm version of Haswell). If Intel maintains its 10-15% increase in performance per "tock" (major architectural overhaul on a proven fab node), that generation would end up 33-52% faster than Nehalem. Unless you upgrade with Broadwell, of course.
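
    As a quick sanity check on those figures, compounding three "tocks" (Nehalem -> Sandy Bridge -> Haswell -> the post-Broadwell generation) at the assumed 10-15% per-generation gain works out like this (a rough back-of-the-envelope sketch, not a benchmark):

```python
# Rough sketch: compound an assumed 10-15% per-"tock" gain over three tocks
# (Nehalem -> Sandy Bridge -> Haswell -> post-Broadwell generation).

def cumulative_speedup(per_tock_gain: float, tocks: int) -> float:
    """Total speedup factor after compounding `tocks` generational gains."""
    return (1 + per_tock_gain) ** tocks

low = cumulative_speedup(0.10, 3)   # conservative per-tock estimate
high = cumulative_speedup(0.15, 3)  # optimistic per-tock estimate
print(f"{low:.2f}x to {high:.2f}x faster than Nehalem")
```

    So under those assumptions the post-Broadwell chips land at roughly 1.33x to 1.52x a Nehalem, clock-for-clock.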

    Core 2 folks should be upgrading right now, to either Ivy Bridge or Haswell.

    Naturally, only upgrade if you actually need it enough. If you're only ever gonna browse facebook and GMail and type out the odd document, a first-gen Core Duo machine should do fine for a while longer if you have enough RAM...

    On the subject of forks, well, if you look close enough, you can trace Haswell all the way back to the Intel 4004 :p

    That's how most people overclock Sandy Bridge and newer machines. I just raised the maximum multiplier on all cores to 44x and VCore by 0.05V. The OS CPU governor then manages how high it goes between 1.6GHz and 4.4GHz. Any decent motherboard should let you set that per number of enabled cores. All you really do is boost Turbo up. You can also play with BCLK, but I'm not touching that. Ever. It's just too painful.
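
    For anyone unfamiliar with multiplier overclocking, the arithmetic behind those numbers is just BCLK x multiplier (a simplified sketch; 100MHz is the stock base clock on these platforms):

```python
# Simplified sketch of Sandy Bridge+ clock arithmetic: frequency = BCLK x multiplier.
# 100MHz is the stock base clock; multiplier overclocking leaves it untouched.

def core_freq_ghz(multiplier: int, bclk_mhz: float = 100.0) -> float:
    """Core frequency in GHz for a given multiplier and base clock."""
    return multiplier * bclk_mhz / 1000.0

print(core_freq_ghz(16))  # 1.6 GHz - the idle floor the governor drops to
print(core_freq_ghz(44))  # 4.4 GHz - the raised all-core turbo ceiling
```

    Raising only the multiplier leaves the BCLK, and everything else clocked off it, alone, which is why it's the painless route compared with BCLK tweaking.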

    EDIT:

    The desktop market will always remain: there is always a segment of the market that needs faster machines - people who need a dual-Xeon box with dual Quadro K6000s (Quadro Titan, if you will) while also having an attached compute farm of some description. These guys upgrade nearly every generation, spending upwards of 10k per machine over a fleet of hundreds of machines, much like in the olden days of computing, before Xeons were a dedicated line, when you grumbled under your beard about having an integrated maths co-processor and VRAM was the only thing that mattered in graphics cards.

    *sigh*

    Those were the days.... when tech support was actually technical in nature, and the guy at the other end knew his ISA from his PCI...
     
    Last edited: 14 Jun 2013
  14. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    386
  15. mucgoo

    mucgoo Minimodder

    Joined:
    9 Dec 2010
    Posts:
    1,602
    Likes Received:
    41
  16. ZeDestructor

    ZeDestructor Minimodder

    Joined:
    24 Feb 2010
    Posts:
    226
    Likes Received:
    4
    I'm planning to wait until Skylake-E for my next platform upgrade.

    And there's a good reason for the socket change: new power delivery because of the integrated VRM. Such was the case for many older Intel CPUs on the LGA775 socket (you couldn't use a Core 2 in an early P4 board, thus negating any point in retaining the socket).

    325-1000 USD.
     
  17. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    386
    Yea, when you mentioned the change of socket due to the VRMs no longer being on the MoBo a few posts back, I had one of those Dohh moments :duh: :hehe:
    Although in my defence, Haswell-E is going to change the socket yet again, from a normal LGA 2011 to an LGA 2011-3 apparently.

    This is from the article, which gives more details than Gareth Halfacree had back in February.
    (maybe worth holding off on MoBo upgrade for some).
     
  18. ZeDestructor

    ZeDestructor Minimodder

    Joined:
    24 Feb 2010
    Posts:
    226
    Likes Received:
    4
    Same reason Haswell changes socket.
     
  19. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    386
    Sorry, I may be being dense here :worried:

    But nothing is technically going to change from Haswell to Haswell-E; the only reason they give for the change of socket is to "ensure only LGA 2011-3 are assembled with LGA 2011-3 sockets"

    Technically you could run a Haswell-E in a normal LGA 2011 socket if it wasn't for them changing the layout of the pins from 12x15 to 14x13.
     
  20. ZeDestructor

    ZeDestructor Minimodder

    Joined:
    24 Feb 2010
    Posts:
    226
    Likes Received:
    4
    Nope. Haswell wouldn't be happy in an X79 platform. Not happy at all. Like, set itself on fire not happy. Like I said, new power requirements thanks to the integrated VRMs.

    Unlike what many people think, Intel has actual, technical reasons for changing sockets: they shifted from LGA775 to LGA1156 because of the integrated memory controller. Then they swapped to LGA1155 because Sandy Bridge moved the PCIe controller into the CPU, and now to LGA1150 because of embedded VRMs.

    On the high-end side, they created LGA1366 for the triple-channel RAM, then LGA2011 for quad-channel AND integrated PCIe, and now LGA2011-3 to suit the updated power delivery.

    They could have kept the sockets and just required a new chipset and/or boards (as they did for C2D), but they opted to change the socket as well, probably to keep confusion to a minimum.
     