
Hardware Intel Core i7-5930K and Core i7-5820K Review

Discussion in 'Article Discussion' started by Dogbert666, 3 Sep 2014.

  1. Dogbert666

    Dogbert666 *Fewer Lover of bit-tech Administrator

    Joined:
    17 Jan 2010
    Posts:
    1,678
    Likes Received:
    181
  2. David

    David μoʍ ɼouმ qᴉq λon ƨbԍuq ϝʁλᴉuმ ϝo ʁԍɑq ϝμᴉƨ

    Joined:
    7 Apr 2009
    Posts:
    17,419
    Likes Received:
    5,791
    Has there been a snafu with the game benchmarks? They're surprisingly uniform ;)
     
  3. enbydee

    enbydee Minimodder

    Joined:
    10 Jul 2014
    Posts:
    509
    Likes Received:
    199
    It was the same story with the 5960X review; GPU limited even on lower settings with a 780, apparently.
     
  4. loftie

    loftie Multimodder

    Joined:
    14 Feb 2009
    Posts:
    3,173
    Likes Received:
    262
    What's going on with the idle power consumption graph? You've got two entries for both CPUs being tested.
     
  5. SpAceman

    SpAceman What's a Dremel?

    Joined:
    1 Nov 2010
    Posts:
    267
    Likes Received:
    4
    You might need to go for some games that are a bit more CPU reliant.
     
  6. SchizoFrog

    SchizoFrog What's a Dremel?

    Joined:
    5 May 2009
    Posts:
    1,574
    Likes Received:
    8
    The second-to-last sentence should say i7-4790K, not 4970K.
     
  7. Combatus

    Combatus Bit-tech Modding + hardware reviews Lover of bit-tech Super Moderator

    Joined:
    16 Feb 2009
    Posts:
    2,761
    Likes Received:
    89
    Please read the entire article, specifically the overclocking section ;) If you run the RAM at 2,133MHz, as we did with the extra tests, you can run the motherboard completely at default settings. If you use 2,666MHz and above, this automatically changes the CPU strap to 125MHz. The short story is that those stock power readings will be higher, so we've included both sets of numbers so you can see what happens when using 2,133MHz RAM versus anything at 2,666MHz and above.
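    To illustrate roughly how the strap interacts with the clocks, here's a quick sketch. The multipliers and memory ratio below are illustrative assumptions for the example, not the exact values from the review:

    ```python
    # Rough illustration of how the X99 BCLK strap relates to CPU and RAM clocks.
    # The multiplier and memory ratio here are assumed values for the example only.

    def effective_clocks(strap_mhz, cpu_multiplier, mem_ratio):
        """Return (CPU clock in MHz, DDR4 transfer rate in MT/s)."""
        cpu_mhz = strap_mhz * cpu_multiplier
        ram_mts = strap_mhz * mem_ratio
        return cpu_mhz, ram_mts

    # Default 100MHz strap: a ~21.33x memory ratio lands on roughly DDR4-2133.
    print(effective_clocks(100, 35, 21.33))   # ~3,500MHz core, ~2,133MT/s RAM

    # 2,666MHz+ RAM pushes the board onto the 125MHz strap; the CPU multiplier is
    # dropped to keep the core clock similar, but stock/idle power behaviour changes.
    print(effective_clocks(125, 28, 21.33))   # ~3,500MHz core, ~2,666MT/s RAM
    ```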
     
    Last edited: 3 Sep 2014
  8. Combatus

    Combatus Bit-tech Modding + hardware reviews Lover of bit-tech Super Moderator

    Joined:
    16 Feb 2009
    Posts:
    2,761
    Likes Received:
    89
    Yep, we've had a few comments about this. We were pushed for time getting these out, but we'll be looking to include a game that's more CPU-dependent and/or dropping in another card for SLI/CrossFire when we get around to looking at the motherboards.
     
    Last edited: 3 Sep 2014
  9. Combatus

    Combatus Bit-tech Modding + hardware reviews Lover of bit-tech Super Moderator

    Joined:
    16 Feb 2009
    Posts:
    2,761
    Likes Received:
    89
    Good spot! Thanks!
     
  10. Maki role

    Maki role Dale you're on a roll... Lover of bit-tech

    Joined:
    9 Jan 2012
    Posts:
    1,724
    Likes Received:
    151
    Gah, I'm still frustrated by the whole 5960X being the 8-core deal. I don't see much point in upgrading from one 6-core to another (coming from a 3930K). The extra cores would definitely be handy for my usage, but whether it's worth not only the cost of the CPU but also the motherboard and RAM is another point entirely. Something just feels odd about upgrading the RAM but not increasing the capacity. At current prices, anything more than 32GB would simply be monstrously expensive.

    Also, seems Mod of the Month is lagging behind again :( Please don't let the same thing happen as last year where we ended up missing months.
     
  11. DbD

    DbD Minimodder

    Joined:
    13 Dec 2007
    Posts:
    519
    Likes Received:
    14
    It's the first Intel 6-core platform that's not stupidly expensive. Shame it doesn't overclock particularly well, so IPC isn't as good as the cheap quad cores we're all using.
     
  12. loftie

    loftie Multimodder

    Joined:
    14 Feb 2009
    Posts:
    3,173
    Likes Received:
    262
    I should probably read the entire article, but I generally don't read the overclocking section, and I wouldn't expect the pretty graphs to be explained there. Maybe add an NB by the graphs? Got more questions.
     
  13. GuilleAcoustic

    GuilleAcoustic Ook ? Ook !

    Joined:
    26 Nov 2010
    Posts:
    3,277
    Likes Received:
    72
    Good performance for a single CPU, but it's too expensive and requires a dedicated GPU. For the price I'd still prefer to build an ITX render farm. I keep thinking that the future of computing is in offloaded computation.
     
  14. Combatus

    Combatus Bit-tech Modding + hardware reviews Lover of bit-tech Super Moderator

    Joined:
    16 Feb 2009
    Posts:
    2,761
    Likes Received:
    89
    No problem, I've just added this to the idle graph description!
     
  15. GuilleAcoustic

    GuilleAcoustic Ook ? Ook !

    Joined:
    26 Nov 2010
    Posts:
    3,277
    Likes Received:
    72
    Same issue here. I've just moved from a Q6600 with a GeForce 9300 IGP to an i5-4570 with a GTX 770. While it is a lovely machine, especially in a small case like the Lian Li PC-V353... I still think it's too big.

    I remember learning 3D at school, 10 years ago, on a single-core Athlon 64 with 1GB of RAM and a 64MB GeForce4 Ti. I learned to model and animate using low-poly meshes, then subdivision for preview and rendering. My only issue was that rendering my three-minute animation for my diploma took a full month running 24/7 at 100% CPU. I was a student back then and had to live without the computer for a month.

    The idea of having rendering nodes is primarily to offload the main rig, so you can still work on something while the farm renders a scene (even if it renders more slowly than the main rig). Plus, it's expandable.

    I've been looking at the FPGA-based "Caustic" cards to help with previewing on the main rig. You could then have a fast CPU (modellers hardly use more than one core when modelling) with the IGP, and use the FPGA for ray-traced previews. The only problem at the moment is the price: $1,500 for the dual-FPGA card and $800 for the single-FPGA one.

    http://santyhammer.blogspot.fr/2012/12/imaginationcaustic-graphics-r2500r2100.html

     
  16. Maki role

    Maki role Dale you're on a roll... Lover of bit-tech

    Joined:
    9 Jan 2012
    Posts:
    1,724
    Likes Received:
    151
    I've simply decided that GPU-based rendering is the way forward for my work. Usually I can compress it all down to within 6GB, which means I can use a pair of Titans. They absolutely spank my CPU in terms of speed; a 30-minute render may only take a couple of minutes using them, which in my book is phenomenal. The handy thing is that GPU memory is continuing to rise at a rapid rate. I wouldn't be surprised if the Maxwell Titan equivalent features 12GB, much like the K6000 does now. When GPU VRAM stacks are approaching system memory quantities, you know things are changing. I for one hope this trend continues.
     
  17. theshadow2001

    theshadow2001 [DELETE] means [DELETE]

    Joined:
    3 May 2012
    Posts:
    5,284
    Likes Received:
    183
    I have to question the usefulness of the gaming benchmarks. At least the older reviews used CPU-bound games like Skyrim and Total War. Why bother testing something in a manner that doesn't show the differences between the CPUs? Look at The Tech Report's coverage of the 5960X for gaming benchmarks that actually highlight the differences between CPUs.
     
  18. loftie

    loftie Multimodder

    Joined:
    14 Feb 2009
    Posts:
    3,173
    Likes Received:
    262
    Awesome. So the way I understand it, running the RAM at 2,666MHz auto-overclocks the CPU by pushing the BCLK up to 125MHz, so you guys had to lower the BCLK back down to 100MHz? This pushes the power consumption up at idle, but not under load, and there's no change in voltage?
     
  19. Vallachia

    Vallachia What's a Dremel?

    Joined:
    3 Feb 2011
    Posts:
    42
    Likes Received:
    2
    16/8/4 is not the only option for multi-GPU setups with the 5820K. 8/8/8/4 is also achievable, but so far, looking at the fine print of motherboard manuals, it seems Gigabyte are the only ones offering 8/8/8/4 lane splitting. Neither ASRock nor MSI allows it, and I'm not sure about Asus yet.

    To be clear, the motherboard must support 8/8/8 with the 5820K by including additional clock gens. The only mobos I can say for sure support this are the GA-X99-UD4 and GA-X99-UD3.
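    The lane arithmetic is easy to sanity-check. Here's a quick illustrative sketch, assuming 28 CPU lanes on the 5820K and 40 on the 5930K/5960X:

    ```python
    # Check whether a PCIe lane split fits within a CPU's available lanes.
    # 28 lanes is the 5820K's allocation; 40 lanes for the 5930K and 5960X.

    def fits(cpu_lanes, split):
        used = sum(split)
        return used <= cpu_lanes, used

    for split in [(16, 8, 4), (8, 8, 8, 4), (16, 16, 8)]:
        ok, used = fits(28, split)
        print(split, "->", used, "lanes,", "OK on a 5820K" if ok else "needs a 40-lane CPU")
    ```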
     
  20. Maki role

    Maki role Dale you're on a roll... Lover of bit-tech

    Joined:
    9 Jan 2012
    Posts:
    1,724
    Likes Received:
    151
    I thought LuxRender now supports GPU acceleration? Since you're using Blender, I'm surprised you haven't looked into Cycles. It's now incredibly proficient, featuring GPU-accelerated particles etc., although SSS is still CPU-only I believe (not for long, I imagine). To me, the boost isn't just in the actual render time, but in my workflow. Being able to alter materials in real time in the viewport is insanely efficient; it must have cut a lot of my work in half, time-wise. In the same way, it works brilliantly for posing, setting up scenes and lighting, as you can instantly see a decent-quality representation of the final image. The best part about Cycles, though, is that it supports so many GPUs. I use Titans because they have stacks of VRAM, but it works with plenty of other cards too, and OpenCL support is even making a comeback. Obviously it depends on individual usage, but Cycles is fast becoming a serious rendering option.
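    For anyone wanting to try it, switching Cycles over to the GPU only takes a couple of lines in Blender's Python console. A minimal sketch, assuming a Blender 2.7x-era build and a CUDA-capable card (newer versions expose these settings in a different place):

    ```python
    # Minimal sketch: enable GPU rendering for Cycles from Blender's Python console.
    # Assumes a Blender 2.7x-era build and a CUDA-capable card; later Blender
    # releases move these settings under bpy.context.preferences instead.
    import bpy

    bpy.context.user_preferences.system.compute_device_type = 'CUDA'  # select the CUDA backend
    bpy.context.scene.cycles.device = 'GPU'                           # render this scene on the GPU
    ```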
     