
Reviews AMD Ryzen 7 3800X Review

Discussion in 'Article Discussion' started by bit-tech, 10 Sep 2019.

  1. bit-tech

    bit-tech Supreme Overlord Lover of bit-tech Administrator

    Joined:
    12 Mar 2001
    Posts:
    3,676
    Likes Received:
    138
     
  2. perplekks45

    perplekks45 LIKE AN ANIMAL!

    Joined:
    9 May 2004
    Posts:
    7,552
    Likes Received:
    1,791
    I still don't think going with the 3700X was a bad idea. The 3800X is marginally better, but it's also 18% more expensive in Germany right now. I haven't seen a single test result yet that shows the 3800X to be 18% faster.
     
  3. Zak33

    Zak33 Staff Lover of bit-tech Administrator

    Joined:
    12 Jun 2017
    Posts:
    263
    Likes Received:
    54
    Historically I have always been a bit miffed when better and better CPUs are launched in rapid succession, feeling that if the whole range had launched in one go, people could have made better choices.

    But here, in 2019, I'm really enjoying watching Ryzen take the battle to the top, month by month, having not just awesome multi-core/multi-thread oomph, but ever-increasing clock speeds AND an ability to use RAM bandwidth too.
    I still own a 1700 - I feel that 65W for that much oomph is still a revelation, and here we are looking at raw muscle power at only 105W and, with good cooling, frankly astounding ability for really very normal money: £380. I mean, what graphics card is top of its game at £380?
     
  4. The_Crapman

    The_Crapman World's worst stuntman. Lover of bit-tech

    Joined:
    5 Dec 2011
    Posts:
    7,679
    Likes Received:
    3,937
    A second hand 1080ti?
     
  5. Zak33

    Zak33 Staff Lover of bit-tech Administrator

    Joined:
    12 Jun 2017
    Posts:
    263
    Likes Received:
    54
    Not top of its game though, is it? Really? It launched in March 2017.
     
  6. Wakka

    Wakka Yo, eat this, ya?

    Joined:
    23 Feb 2017
    Posts:
    2,117
    Likes Received:
    673
    It depends on your expectations, I guess. To me, a £150 8GB RX580 would play anything I want happily at 1080p. A £380 5700XT would set me up for the next 2-3 years.
     
  7. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,131
    Likes Received:
    6,725
    Playing devil's advocate, what CPU has 2,944 cores and 8GB of high-speed RAM on board for <insert price of RTX 2080 here, I can't be faffed looking it up>?
     
  8. Zak33

    Zak33 Staff Lover of bit-tech Administrator

    Joined:
    12 Jun 2017
    Posts:
    263
    Likes Received:
    54
    If we could all run just a GPU, many of us would. But we can't, so... we're a bit stuck with that. I don't know anyone who can run Windows on just a 1080 Ti, nor game on it, nor run Excel on it alone.

    I believe the devil needs a new advocate.

    Wakka has a good point... £150 buys what he needs. But it's not "top of its game", as it's been superseded several times over.
     
  9. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,131
    Likes Received:
    6,725
    Nor do I know anyone who can do any of that on a Ryzen 3800X without also buying a graphics card, given it has no iGPU...

    My point was, you're comparing apples to oranges. Like, I don't know of any McLaren F1 cars that are at the top of their game for the amount of cash I splashed on my Vauxhall, but it doesn't mean I'm going to win at Nürburgring in the famvan.

    A CPU is a lot cheaper to make than a GPU: it's physically smaller, it has fewer cores, it has a *lot* less memory on board, it doesn't need a bunch of supporting stuff like voltage regulators 'cos they're on the motherboard... So, yeah, a top-of-the-range consumer CPU is going to cost less than a top-of-the-range consumer GPU.
     
  10. bawjaws

    bawjaws Multimodder

    Joined:
    5 Dec 2010
    Posts:
    4,283
    Likes Received:
    891
    Aye, comparing top-of-the-line CPUs and GPUs on price is a fool's errand. And in any case, is the 3800X really "top of its game" when there are faster CPUs, such as *checks notes* the 3900X? On that basis, what does "top of its game" actually mean? Absolute best performance? Best performance for the price?
     
  11. Zak33

    Zak33 Staff Lover of bit-tech Administrator

    Joined:
    12 Jun 2017
    Posts:
    263
    Likes Received:
    54
    I think perhaps it's because my era of old boy remembers a time when the best CPU was $999 and the best graphics card was half that.

    2006-ish was such an era: the Core 2 Extreme was a grand, and an X1900 XTX / 7900 GTX was literally half that.
    So "top of its game" has reversed.

    They are indeed, as you say, chalk and cheese. Or perhaps cheese and crackers... they aren't the same, but they work together.

    Back to my original point (which I agree with myself on and expect no one else to agree with, cos that's how it is) - a CPU that's under £400 and is frankly one of the ultimates, vs a £400 GPU which is decidedly mid-range.

    AND... the CPU will work in many motherboards that are already being used for previous CPUs.

    I think it's a wonderful thing.
     
  12. perplekks45

    perplekks45 LIKE AN ANIMAL!

    Joined:
    9 May 2004
    Posts:
    7,552
    Likes Received:
    1,791
    I guess we can all agree that it's a good thing to have AMD back in the high-end market. But the real money is made elsewhere, and judging by the recent set of slides from the blue side of town, the next crisis for Intel might already be brewing.
     
  13. The_Crapman

    The_Crapman World's worst stuntman. Lover of bit-tech

    Joined:
    5 Dec 2011
    Posts:
    7,679
    Likes Received:
    3,937
    If you ignore the £1,700 2990WX or the £1,900 i9-9980XE. You know, THE "top of its game" CPUs.

    So.... :eyebrow:
     
  14. Wakka

    Wakka Yo, eat this, ya?

    Joined:
    23 Feb 2017
    Posts:
    2,117
    Likes Received:
    673
    Yeah, Extreme Edition CPUs still exist. And, if anything, they have got even more extreme in their pricing - especially when you account for the astronomical motherboard costs for their respective tier of product. Even if you ignore the silly "limited edition" or absolute halo boards, high-end X570 and X299 boards are well north of £400 nowadays. I don't remember the early ROG boards costing that much compared to the mid-range offerings (but happy to be corrected :p).
     
  15. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,131
    Likes Received:
    6,725
    Ah - my perspective is largely informed by... a very different era of Old Boy. An era when CPUs were only swappable if you were handy with a soldering iron and had a matching crystal, and when discrete graphics cards were something only swanky buggers had.
    Ah, if we're talking historical... <pulls up chair, puffs on pipe>

    Skipping the IBM Monochrome Display Adapter (no graphics support) and Colour Graphics Adapter, let's go right to the Enhanced Graphics Adapter. Depending on the amount of VRAM you wanted on it, IBM's official EGA board would set you back between $500 and $1,000. Third-party compatibles were between $300 and $600. At the time of launch, Intel's shiniest new part was the 286 family (scroll down to Page 8, there's a great - though aggrandised - story about the origins of the 286 in there), though the 386 would follow on a year later. The 386 cost $299 at launch for the 12MHz version.

    So, for some of the earliest discrete graphics cards and user-replaceable CPUs, we're seeing the GPU costing way more than the CPU - double or more for an official IBM part, or roughly equal to double for a third-party part.

    Skipping forward - and ignoring the 386SX, which was cheaper than the 386 - Intel launched the 486 at $950 - "nearly three times as costly as the 386," per the Wall Street Journal at the time. Ignoring specialist stuff, you might have had an ATi VGA Wonder 16 back then, which would have set you back $499 or $699 for either 256KB or 512KB VRAM. Cheaper than the 486 - and even more so when the VGA Edge-16 came out, dropping a few features to lower the cost.

    Skipping ahead to 1999 and cards with hardware transform and lighting for 3D acceleration, this copy of PCMania from 1999 tells me that a high-end graphics card of the time - it being three years on, the Rage family not being up there - would set you back around 40,000 Peseta - or around $265 at the 1999 exchange rate. Intel's shiniest CPU in 1999 was the Pentium III, and the Coppermine 600MHz version (which launched in August, after the first-run Katmai parts) was $670-740 'depending on your region'.

    Your 2006 example: the Core 2 family was, indeed, launched that year, and you were indeed looking at $999 for the Core 2 Extreme. The shiny new thing was the Radeon X1950 XTX - replaced a year later by the Radeon HD family - which would give you a dollar change from $430.

    Interestingly, a year later, the launch of a new DirectX version would render the older cards largely obsolete; skipping to 2008, the Radeon HD 3870 X2 was what you wanted, combining two shiny GPUs into a single card - yet somehow still only costing $449. That was the year the Core 2 family went away in favour of the Nehalem-based Core parts - and Intel was still asking $999 at the top end.

    Let's skip a decade or so, and come bang up to date. The 3800X from the review is $400 on NewEgg, whereas a top-end RX Vega 64 is twice the price. The 3800X isn't AMD's top-end non-server part, though: that'd be the Threadripper 2990WX at $1,800, at least until the Zen 2 Threadrippers pop up. Then, of course, there's Intel: $2,000 for a Core i9 Extreme, anyone?

    What have we learned from all this? That I really, really like doing historical research, and that I'll jump on any excuse to do something that isn't the work I'm supposed to be doing... but more seriously, that in the early days of GPUs it was commonplace for the GPU to cost several times more than the CPU, then it flipped, and now it has flipped again - so long as you don't stray into HEDT territory, in which case CPU pricing is still king by quite some margin.
     
    jb0 likes this.
  16. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    I think the problem is that for the last decade (maybe more) there has been no such thing as 'a' top-of-the-range CPU.
    Any given CPU may have fantastic single-threaded performance and a relatively small number of cores, or a huge number of cores and relatively pants single-threaded performance (because there tend to be complaints if you try to push the better part of a kilowatt into a few square centimetres), and a huge sliding scale of trade-offs in between. If your workload is gaming, or another workload limited by single-threaded performance, the 'top of the range' will be one chip. If your workload is non-GPU ray-traced rendering, then the top of the range will be a completely different chip. If you have a huge database to serve queries from, then the top of the range may be yet another chip again, because it has an obtuse memory interface. And so on and so forth.
     
  17. Zak33

    Zak33 Staff Lover of bit-tech Administrator

    Joined:
    12 Jun 2017
    Posts:
    263
    Likes Received:
    54
    Ain't no point posting in here, you lot are too bloody clever/specific for your own good. It's like walking a minefield with heavy boots on. They MIGHT protect my toes, but I'm still gonna lose a bollock.

    I'm going back to arguing about Waitrose minted peas and Aldi
     
  18. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    Oh boy, now you're really going to get the trolls out!
     
    Zak33 likes this.
  19. zx128k

    zx128k What's a Dremel?

    Joined:
    8 Oct 2019
    Posts:
    8
    Likes Received:
    0
    Basically, the main issue with the 3800X's performance in games is the fact that the chiplet design increases memory latency by about 10ns. The 3800X is comparatively weak on the memory side, but its more powerful cores completely destroy a 9900K.

    The focus of overclocking a 3800X is twofold:

    1. Getting the CPU as cold as possible so that its all-core boost will be higher. With the ABBA BIOS there is no point in touching the cores. If you have a 3800X that can do 4.4GHz on all cores @ 1.325V, then you have the same as me.
    2. Getting the RAM speed as high and the RAM timings as tight as possible: true latency (ns) = clock cycle time (ns) x number of clock cycles (CL) - see https://uk.crucial.com/gbr/en/memory-performance-speed-latency (worked through in the sketch below).
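    To make that Crucial formula concrete, here's a minimal sketch in plain Python; the DDR4-3600/3800/2666 kits and CL values below are illustrative assumptions, not figures taken from this thread:

```python
# Crucial's rule: true latency (ns) = clock cycle time (ns) x CAS latency (CL)

def true_latency_ns(transfer_rate_mt_s: float, cas_latency: int) -> float:
    # DDR transfers twice per clock, so the I/O clock in MHz is MT/s divided by 2.
    clock_mhz = transfer_rate_mt_s / 2        # e.g. 3600 MT/s -> 1800 MHz
    cycle_time_ns = 1000.0 / clock_mhz        # nanoseconds per clock cycle
    return cycle_time_ns * cas_latency

# Illustrative kits (assumed examples):
print(round(true_latency_ns(3600, 16), 2))  # DDR4-3600 CL16 -> 8.89 ns
print(round(true_latency_ns(3800, 16), 2))  # DDR4-3800 CL16 -> 8.42 ns
print(round(true_latency_ns(2666, 19), 2))  # DDR4-2666 CL19 -> 14.25 ns
```

    The point of chasing both speed and timings shows up in those numbers: raising the transfer rate shrinks the cycle time, and holding (or lowering) the CL at the higher speed is what actually cuts true latency.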

    With the 9900K you have CPU cores that are weak against the 3800X; the 3800X does not need a core overclock to win against the 9900K. What the 9900K should do is overclock to 5GHz and then add DDR4-4000 or faster RAM at the tightest timings possible - that's the only way to beat a 3800X in games. If you pair stock 2666 or 3200 RAM with a 5GHz all-core overclock, a 3800X with a RAM overclock will destroy that build like it was nothing. With the 9900K you have to tighten your RAM timings and get really fast RAM to beat the 3800X.

    3800X RAM overclocking: a standard 3600CL16 kit overclocked to 3800, with the Infinity Fabric overclocked to 1900MHz as per AMD's instructions.
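    For anyone wondering where the 1900 figure comes from: on Zen 2 the memory clock (MCLK) is half the DDR4 transfer rate, and AMD's guidance is to keep the Infinity Fabric clock (FCLK) coupled 1:1 with it. A quick sketch of that arithmetic, with assumed example speeds, in the same plain-Python style as above:

```python
# Zen 2 rule of thumb: MCLK (MHz) = DDR4 transfer rate (MT/s) / 2, and FCLK is kept
# at 1:1 with MCLK to avoid the latency penalty of dropping to a 2:1 divider.
# (Example speeds below are assumptions for illustration.)

def mclk_and_fclk(transfer_rate_mt_s):
    mclk = transfer_rate_mt_s // 2  # memory clock in MHz
    fclk = mclk                     # 1:1 coupled Infinity Fabric clock
    return mclk, fclk

print(mclk_and_fclk(3600))  # (1800, 1800) - the kit at its rated speed
print(mclk_and_fclk(3800))  # (1900, 1900) - the 3800 / IF 1900 setup described above
```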

    Performance in Ryzen DRAM Calculator: 102.3 seconds in the Easy test.

    RAM timings were set as above in Ryzen Master, with the cores left at stock.


    Games performance

    9900K stock with its default 2666MHz RAM: average 109.9fps.
    3800X stock with IF 1900 and 3600 RAM run at 3800 with tightened timings (AMD recommended DDR4-3600 for reviews): average 148fps.


    Source: https://www.techspot.com/review/1877-core-i9-9900k-vs-ryzen-9-3900x/

    9900K setup:
    DDR4-3600 CL16 memory
    i9-9900K, all cores at 5GHz
    RTX 2080 Ti
    1080p, highest settings

    Average: 124fps

    3800X setup:
    DDR4-3600 CL16 memory @ IF 1900 / 3800 with tightened CL16 timings
    3800X, stock all-core
    RTX 2080
    1080p, highest settings

    Average: 139fps


    In our first game, Shadow of the Tomb Raider, we beat a 9900K @ 5GHz all-core with 3600CL16 RAM. The 9900K has a 2080 Ti and the 3800X has a 2080. The lead is 15fps, or 12%, for the 3800X.
     
    Last edited: 11 Oct 2019
  20. zx128k

    zx128k What's a Dremel?

    Joined:
    8 Oct 2019
    Posts:
    8
    Likes Received:
    0