
MSI MEG Z390 Ace Review

Discussion in 'Article Discussion' started by bit-tech, 23 Oct 2018.

  1. bit-tech

    bit-tech Supreme Overlord Lover of bit-tech Administrator

    Joined:
    12 Mar 2001
    Posts:
    3,676
    Likes Received:
    138
    Read more
     
  2. Guest-44432

    Guest-44432 Guest

    It would be interesting to see how my Gigabyte Aorus Z390 Master stacks up.
    I'm yet to overclock my 9900K, but at stock, running the AIDA64 stress test with an HSF, it peaks at 85°C (4.7GHz all cores). (I noticed the vcore was shooting up to 1.45V at stock.)
     
    Last edited by a moderator: 24 Oct 2018
  3. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    I wish motherboard manufacturers would stop covering up heatsinks with plastic bits. That's not directed at MSI specifically, as they're all at it. And stop cooling the NAND memory while you're at it.
     
  4. perplekks45

    perplekks45 LIKE AN ANIMAL!

    Joined:
    9 May 2004
    Posts:
    7,545
    Likes Received:
    1,769
    Given the current inflation of Intel CPU prices, there is no way I'd use them.

    i9 9900K:
    [​IMG]

    i7 8700K:
    [​IMG]

    i7 7700K: (yes, that's almost 375 Euros... *sigh*)
    [​IMG]
     
  5. Guest-56605

    Guest-56605 Guest

    Simon, truly, sometimes, fella, you have more money than sense - forking out that sort of cash for a product that is still inherently flawed with numerous vulnerabilities, and at the resolutions you game at not much faster than the 2700X...
     
  6. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,058
    Likes Received:
    969
    Remember that PCs are basically infinitely reusable... reassign them to HTPC duties, give away parts to friends and family, etc. Just because some people here upgrade a bit too often doesn't mean the old stuff won't get used any more.
     
  7. Guest-56605

    Guest-56605 Guest

    Whilst I appreciate all of what you’ve said, nah, I stand by my original post.
     
  8. Combatus

    Combatus Bit-tech Modding + hardware reviews Lover of bit-tech Super Moderator

    Joined:
    16 Feb 2009
    Posts:
    2,761
    Likes Received:
    89
    It's quite wide-ranging. The 8350K, which was awesome at £150 (basically a 7600K but for £60 less), is now well over £200. I'm sure prices will settle down, but I'd definitely wait. We'll have our 9600K and 9700K reviews up soon too.

    Next on the list!!!
     
    Guest-44432 likes this.
  9. Guest-44432

    Guest-44432 Guest

    The upgrade was more towards VR, as the 2700X was holding me back to 45fps, whereas I get 90fps in the games I play. Also, I have to say, as I stream and use the CPU to encode, the 9900K seems to allow me to stream at 4K 30fps without taxing my GPU in VR.

    Anyway, I also managed to get 5GHz on all cores stable at 1.260V, with temps not peaking past 90°C on an HSF using the AIDA64 stress test.

    Full-system power draw under stress was under 500W, and the CPU's max temp in the benchmarks was 68°C.

    [​IMG]

    Also, results from my 2700X @ 4.2GHz in 3DMark 11:

    [​IMG]

    Results from the 9900K @ 5GHz in 3DMark 11:

    [​IMG]

    3DMark, 9900K @ 5GHz:

    [​IMG]
     
  10. Guest-56605

    Guest-56605 Guest

    Whilst I applaud your enthusiasm and synthetic bench numbers, nah, sorry - the premium you pay for a few hours per day of virtual enjoyment isn't worth it to me. Personally, I've drawn the line at a set point with both the current CPU and GPU tax (yes, Intel and Nvidia, I'm talking about you).
     
  11. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    Yes, we get it, you have a hate-on for Intel and Nvidia (and ignore the markup at retailer level). What you determine is good value for you is not what everyone else determines is good value for them.
     
    The_Crapman likes this.
  12. Guest-56605

    Guest-56605 Guest

    It has nothing to do with a hate-on, as you put it. Nvidia has introduced as-yet unproven hardware to the market (as in, nothing software-wise is or will be able to take advantage of it for months or years to come) at an inflated premium. Intel, on the other hand, is only reacting to AMD's new share of the market after declaring years ago that it couldn't be done, again at a premium.

    I fail to see how hate comes into the equation; seeing through the veiled BS of tech manufacturers isn't hate...
     
  13. sandys

    sandys Multimodder

    Joined:
    26 Mar 2006
    Posts:
    4,907
    Likes Received:
    722
    Surely the thing holding you back in VR is the GPU; VR is fairly low-res, isn't it? My gen-1 Ryzen is pretty much on par with the results you've posted for the 9900K, despite being generations behind and having a ~20% clock-speed disadvantage, as we have the same GPU.
     
  14. Guest-44432

    Guest-44432 Guest

    After owning three Ryzen processors (the 1600X, 1800X, and 2700X), the 9900K just feels snappier on the desktop and is noticeably faster in games, especially VR.

    VR is not really low-res when you consider that it has to render 1920x1200, twice (once per eye), at 90fps.

    GPU usage never seemed to go above 50%, as my 2700X couldn't keep above 90fps. So, due to ASW, I was limited to 45fps; either that, or I had to drop the quality right down.

    The 9900k is capable of giving me 90fps in VR, so for me it's a win. :)
     
  15. Guest-44432

    Guest-44432 Guest

    VR rendering: 1920x1200 x2 x90 = 414,720,000 pixels per second. (More GPU-dependent.)

    VR rendering: 1920x1200 x2 x45 = 207,360,000 pixels per second. (More CPU-dependent.)

    4K rendering: 3840x2160 x60 = 497,664,000 pixels per second.

    So that is why the 9900K is better for VR. :)
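    For anyone who wants to reproduce those numbers, here's a minimal sketch of the same arithmetic in Python (the resolutions, view counts, and refresh rates are just the figures quoted above, not measured data):

    ```python
    # Rough pixel-throughput arithmetic from the post above.
    def pixels_per_second(width: int, height: int, views: int, fps: int) -> int:
        """Raw pixels rendered per second for a given resolution, view count, and frame rate."""
        return width * height * views * fps

    scenarios = {
        "VR, 1920x1200 per eye, 2 eyes, 90fps": pixels_per_second(1920, 1200, 2, 90),
        "VR with ASW, 2 eyes, 45fps":           pixels_per_second(1920, 1200, 2, 45),
        "4K flat screen, 60fps":                pixels_per_second(3840, 2160, 1, 60),
    }

    for name, rate in scenarios.items():
        print(f"{name}: {rate:,} pixels/s")  # e.g. 414,720,000 for the 90fps VR case
    ```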
     
    Last edited by a moderator: 25 Oct 2018
  16. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    On top of that, VR is latency sensitive rather than throughput sensitive. That means it responds better to operations being completed more quickly, so single core speed has an outsized benefit as game engines are extremely lightly threaded for the core render loop (out of necessity).
    Plus the render resolution is higher than the panel resolution to account for the non-rectilinear optics, leaving you with a 2700x1600 eye-buffer render target for the Rift CV1 without supersampling, or 2700x1600x2x90 = 777,600,000 pixels/s.
    Supersampling is highly encouraged and has an excellent perceptual effect out to 2x* (though diminishing returns past ~1.5x) due to aliasing causing distant objects to 'shimmer' noticeably.
    At 1.5x SS, that's 4050x2400x2x90 = 1,749,600,000 pixels/s.
    At 2x SS that's 5400x3200x2x90 = 3,110,400,000 pixels/s.

    These huge pixel counts that make UHD cry are why techniques like Lens Matched Shading, Variable Rate shading, etc are useful. If you can only apply a single scaling factor per image, you get a big benefit in the centre of the view but waste pixels around the periphery.

    * I'm using Oculus' SS factor notation, which is linear per axis, meaning 2x takes 100x100 to 200x200. Valve use total-pixel-count SS factors, so to go from 100x100 to 200x200 would be denoted as 4x.
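    To make the two notations concrete, here's a small sketch of the same maths (the 2700x1600 per-eye buffer and 90Hz are the figures quoted above; the notation conversion is just the per-axis factor squared):

    ```python
    # Supersampling arithmetic for the Rift CV1 figures quoted above.
    # Oculus-style SS factors are linear per axis; Valve-style factors count
    # total pixels, so valve_factor = oculus_factor ** 2.

    BASE_W, BASE_H = 2700, 1600  # per-eye render target, no supersampling
    EYES, HZ = 2, 90

    def vr_pixel_rate(oculus_ss: float) -> float:
        """Pixels per second at a given Oculus-style (per-axis) SS factor."""
        return (BASE_W * oculus_ss) * (BASE_H * oculus_ss) * EYES * HZ

    def oculus_to_valve(oculus_ss: float) -> float:
        """Convert a per-axis SS factor to Valve's total-pixel-count factor."""
        return oculus_ss ** 2

    for ss in (1.0, 1.5, 2.0):
        print(f"{ss}x Oculus SS (= {oculus_to_valve(ss):.2f}x Valve SS): "
              f"{vr_pixel_rate(ss):,.0f} pixels/s")
    # -> 777,600,000 / 1,749,600,000 / 3,110,400,000 pixels/s respectively
    ```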
     
    Guest-44432 likes this.
  17. sandys

    sandys Multimodder

    Joined:
    26 Mar 2006
    Posts:
    4,907
    Likes Received:
    722
    Whilst I don't disagree with what you are saying, VR res is little more than 1080p. On average, with a 1080 Ti, a 5GHz chip will provide a ~9-10% uplift over Ryzen, but at 1080p a 2080 Ti will provide on average a 20% uplift, so ditching the 1080 for a 2080 would, to me, make more sense for similar money.

    That MEG seems like a nice enough board, but blimey, that price is heading to HEDT level while seemingly offering nothing more? I guess that is what the review means by stepping on toes.

    A HEDT board with an older-gen chip will give you quad-channel RAM, more PCIe lanes, etc., and will overclock and perform similarly, will it not? (I'll be honest, Intel's line-up confuses me, so I could be wrong here.)
     
    Last edited: 25 Oct 2018
  18. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    That simply is not even close to being true. At 1.5x SS, it's out by a factor of roughly 10: 4050x2400x2 is about 19.4 million pixels per frame, versus about 2.1 million for 1920x1080.
    It will not, as every benchmark of the last decade of HEDT chips will tell you.
     
  19. sandys

    sandys Multimodder

    Joined:
    26 Mar 2006
    Posts:
    4,907
    Likes Received:
    722
    OK, so you are saying it is more pixels to push, which means you will need more GPU rather than CPU, as at higher rendering resolutions you are GPU-limited?
     
  20. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    See the first part of my previous post.
     