AMD Ryzen 7 3800X Review

Discussion in 'Article Discussion' started by bit-tech, 10 Sep 2019.

  1. bit-tech

    bit-tech Supreme Overlord Staff Administrator

    Joined:
    12 Mar 2001
    Posts:
    2,418
    Likes Received:
    43
     
  2. perplekks45

    perplekks45 LIKE AN ANIMAL!

    Joined:
    9 May 2004
    Posts:
    5,869
    Likes Received:
    409
    I still don't think going with the 3700X was a bad idea. The 3800X is marginally better, but it's also 18% more expensive in Germany right now. I haven't seen a single test result yet that shows the 3800X to be 18% faster.
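    A quick back-of-envelope way to frame that (a minimal Python sketch; the 18% premium is from the post above, while the ~3% performance uplift is an illustrative assumption, not a measured result):

        def value_ratio(price_premium, perf_gain):
            """Performance-per-currency of the dearer chip vs the cheaper one."""
            return (1 + perf_gain) / (1 + price_premium)

        premium = 0.18  # 3800X costs ~18% more than the 3700X (per the post)
        gain = 0.03     # assumed ~3% average uplift - illustrative only

        print(f"Relative value: {value_ratio(premium, gain):.2f}")
        # ~0.87 -> at those numbers the 3800X delivers ~13% less performance per euro.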
     
  3. Zak33

    Zak33 Staff Staff Administrator

    Joined:
    12 Jun 2017
    Posts:
    240
    Likes Received:
    46
    Historically I have always been a bit miffed when better and better CPUs are launched in rapid succession, feeling as though, if the whole range had launched in one go, people could make better choices.

    But here, in 2019, I'm really enjoying watching Ryzen take the battle to the top, month by month, having not just awesome multi-core/multi-thread oomph, but ever-increasing clock speeds AND an ability to use RAM bandwidth too.
    I still own a 1700 - I feel that 65W for that much oomph is still a revelation, and here we are looking at raw muscle power at only 105W and, with good cooling, frankly astounding ability at really very normal money: £380. I mean, what graphics card is top of its game at £380?
     
  4. Jeff Hine

    Jeff Hine Nothing special

    Joined:
    8 May 2009
    Posts:
    1,340
    Likes Received:
    161
    None... but maybe that's the kind of pricing that they should be at.
     
  5. The_Crapman

    The_Crapman Don't phone it's just for fun.

    Joined:
    5 Dec 2011
    Posts:
    4,288
    Likes Received:
    986
    A second-hand 1080 Ti?
     
  6. Zak33

    Zak33 Staff Staff Administrator

    Joined:
    12 Jun 2017
    Posts:
    240
    Likes Received:
    46
    Not top of its game though, is it? Really? It launched in March 2017.
     
  7. Wakka

    Wakka Yo, eat this, ya?

    Joined:
    23 Feb 2017
    Posts:
    2,027
    Likes Received:
    590
    It depends on your expectations, I guess. To me, a £150 8GB RX580 would play anything I want happily at 1080p. A £380 5700XT would set me up for the next 2-3 years.
     
  8. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    12,703
    Likes Received:
    1,974
    Playing devil's advocate, what CPU has 2,944 cores and 8GB of high-speed RAM on board for <insert price of RTX 2080 here, I can't be faffed looking it up>?
     
  9. Zak33

    Zak33 Staff Staff Administrator

    Joined:
    12 Jun 2017
    Posts:
    240
    Likes Received:
    46
    If we could all run just a GPU, many of us would. But we can't, so... we're a bit stuck with that. I don't know anyone who can run Windows on just a 1080 Ti, nor game on it, nor run Excel on it.

    I believe the devil needs a new advocate.

    Wakka has a good point... £150 buys what he needs. But it's not "top of its game" as it's been superseded several times over.
     
  10. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    12,703
    Likes Received:
    1,974
    Nor do I know anyone who can do any of that on a Ryzen 3800X without also buying a graphics card, given it has no iGPU...

    My point was, you're comparing apples to oranges. Like, I don't know of any McLaren F1 cars that are at the top of their game for the amount of cash I splashed on my Vauxhall, but it doesn't mean I'm going to win at Nürburgring in the famvan.

    A CPU is a lot cheaper to make than a GPU: it's physically smaller, it has fewer cores, it has a *lot* less memory on board, it doesn't need a bunch of supporting stuff like voltage regulators 'cos they're on the motherboard... So, yeah, a top-of-the-range consumer CPU is going to cost less than a top-of-the-range consumer GPU.
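    To put a rough number on the die-size argument (a back-of-envelope Python sketch using the standard gross-dies-per-wafer approximation; the die areas are illustrative ballpark figures, not official specs):

        import math

        def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
            """Gross dies per wafer: area term minus an edge-loss term (ignores yield)."""
            d, a = wafer_diameter_mm, die_area_mm2
            return int(math.pi * (d / 2) ** 2 / a - math.pi * d / math.sqrt(2 * a))

        WAFER = 300  # mm, the standard wafer size
        for name, area in [("small CPU die (~75 mm^2)", 75),
                           ("large GPU die (~500 mm^2)", 500)]:
            print(f"{name}: ~{dies_per_wafer(WAFER, area)} per wafer")
        # Roughly 865 vs 111 - several times more CPU dies per wafer, before you
        # even count the GPU's on-board memory and voltage regulators.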
     
  11. bawjaws

    bawjaws Well-Known Member

    Joined:
    5 Dec 2010
    Posts:
    3,438
    Likes Received:
    360
    Aye, comparing top-of-the-line CPUs and GPUs on price is a fool's errand. And in any case, is the 3800X really "top of its game" when there are faster CPUs, such as *checks notes* the 3900X? On that basis, what does "top of its game" actually mean? Absolute best performance? Best performance for the price?
     
  12. Zak33

    Zak33 Staff Staff Administrator

    Joined:
    12 Jun 2017
    Posts:
    240
    Likes Received:
    46
    I think perhaps it's because my era of old boy remembers a time when the best CPU was $999 and the best graphics card was half that.

    2006-ish was an era of that. A Core 2 Extreme was a grand and an X1900 XTX / 7900 GTX was literally half that.
    So "top of its game" has reversed.

    They are indeed, as you say, chalk and cheese. Or perhaps cheese and crackers... they aren't the same, but they work together.

    Back to my original point (which I agree with myself on and expect no one else to agree with, cos that's how it is) - a CPU that's under £400 and is frankly one of the ultimates, vs a £400 GPU which is decidedly mid-range.

    AND... the CPU will work in many motherboards that are already being used for previous CPUs.

    I think it's a wonderful thing.
     
  13. perplekks45

    perplekks45 LIKE AN ANIMAL!

    Joined:
    9 May 2004
    Posts:
    5,869
    Likes Received:
    409
    I guess we can all agree that it's a good thing to have AMD back in the high-end market. But the real money is made elsewhere, and judging by the recent set of slides from the blue side of town, the next crisis for Intel might be brewing already.
     
  14. The_Crapman

    The_Crapman Don't phone it's just for fun.

    Joined:
    5 Dec 2011
    Posts:
    4,288
    Likes Received:
    986
    If you ignore the £1,700 2990WX or the £1,900 i9-9980XE. You know, THE "top of its game" CPUs.

    So.... :eyebrow:
     
  15. Wakka

    Wakka Yo, eat this, ya?

    Joined:
    23 Feb 2017
    Posts:
    2,027
    Likes Received:
    590
    Yeah, Extreme Edition CPUs still exist... and, if anything, they've got even more extreme in their pricing - especially when you account for the astronomical motherboard cost for their respective-tier products. Even if you ignore the silly "limited edition" or absolute halo boards, high-end X570 and X299 boards are well north of £400 nowadays. I don't remember the early ROG boards costing that much compared to the mid-range offerings (but happy to be corrected :p).
     
  16. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    12,703
    Likes Received:
    1,974
    Ah - my perspective is largely informed by... a very different era of Old Boy. An era when CPUs were only swappable if you were handy with a soldering iron and had a matching crystal, and when discrete graphics cards were something only swanky buggers had.
    Ah, if we're talking historical... <pulls up chair, puffs on pipe>

    Skipping the IBM Monochrome Display Adapter (no graphics support) and Colour Graphics Adapter, let's go right to the Enhanced Graphics Adapter. Depending on the amount of VRAM you wanted on it, IBM's official EGA board would set you back between $500 and $1,000. Third-party compatibles were between $300 and $600. At the time of launch, Intel's shiniest new part was the 286 family (scroll down to Page 8, there's a great - though aggrandised - story about the origins of the 286 in there), though the 386 would follow on a year later. The 386 cost $299 at launch for the 12MHz version.

    So, for some of the earliest discrete graphics cards and user-replaceable CPUs, we're seeing the GPU costing way more than the CPU - double or more for an official IBM part, or anywhere from roughly equal to double for a third-party part.

    Skipping forward - and ignoring the 386SX, which was cheaper than the 386 - Intel launched the 486 at $950 - "nearly three times as costly as the 386," per the Wall Street Journal at the time. Ignoring specialist stuff, you might have had an ATi VGA Wonder 16 back then, which would have set you back $499 or $699 for either 256KB or 512KB VRAM. Cheaper than the 486 - and even more so when the VGA Edge-16 came out, dropping a few features to lower the cost.

    Skipping ahead to 1999 and cards with hardware transform and lighting for 3D acceleration, this copy of PCMania from 1999 tells me that a high-end graphics card of the time - it being three years on, the Rage family not being up there - would set you back around 40,000 Peseta - or around $265 at the 1999 exchange rate. Intel's shiniest CPU in 1999 was the Pentium III, and the Coppermine 600MHz version (which launched in August, after the first-run Katmai parts) was $670-740 'depending on your region'.

    Your 2006 example: the Core 2 family was indeed launched that year, and you were looking at $999 for the Core 2 Extreme. The shiny new part was the Radeon X1950 XTX - replaced a year later by the Radeon HD family - which would give you a dollar change from $430.

    Interestingly, a year later, the launch of a new DirectX version would render the older cards largely obsolete; skipping to 2008, the Radeon HD 3870 X2 was what you wanted, combining two shiny GPUs into a single card - yet somehow still only costing $449. That was the year the Core 2 family went away in favour of the Nehalem-based Core parts - and Intel was still asking $999 at the top end.

    Let's skip a decade or so, and come bang up to date. The 3800X from the review is $400 on NewEgg, whereas a top-end RX Vega 64 is twice the price. The 3800X isn't AMD's top-end non-server part, though: that'd be the Threadripper 2990WX at $1,800, at least until the Zen 2 Threadrippers pop up. Then, of course, there's Intel: $2,000 for a Core i9 Extreme, anyone?

    What have we learned from all this? That I really, really like doing historical research, and that I'll jump on any excuse to do something that isn't the work I'm supposed to be doing... but more seriously, that in the early days of GPUs it was commonplace for the GPU to cost several times more than the CPU, then it flipped, and now it has flipped again - so long as you don't stray into HEDT territory, in which case CPU pricing is still king by quite some margin.
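    For anyone who wants the whole ramble as numbers, here's the same story as GPU:CPU launch-price ratios (a small Python sketch using only the prices quoted above, taking midpoints where a range was given - forum-quoted figures, not audited data):

        eras = [
            ("EGA vs 386 (mid-80s)",               750, 299),  # EGA $500-1,000; 386 $299
            ("VGA Wonder 16 vs 486 (1989)",        599, 950),  # $499-699 vs $950
            ("T&L card vs Pentium III (1999)",     265, 705),  # ~$265 vs $670-740
            ("X1950 XTX vs Core 2 Extreme (2006)", 429, 999),
            ("RX Vega 64 vs 3800X (2019)",         800, 400),  # "twice the price"
        ]
        for era, gpu, cpu in eras:
            print(f"{era}: GPU/CPU = {gpu / cpu:.2f}")
        # >1 means the GPU cost more; <1 means the CPU did. The ratio starts
        # at ~2.5, drops well below 1, then climbs back to 2 - the flip, and
        # the flip back, described above.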
     
    jb0 likes this.
  17. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,301
    Likes Received:
    313
    I think the problem is that for the last decade (maybe more) there has been no such thing as 'a' top-of-the-range CPU.
    Any given CPU may have fantastic single-threaded performance and a relatively small number of cores, or a huge number of cores and relatively pants single-threaded performance (because there tend to be complaints if you try to push the better part of a kilowatt into a few square centimetres), and a huge sliding scale of tradeoffs in between.

    If your workload is gaming, or anything else limited by single-thread performance, the 'top of the range' will be one chip. If your workload is non-GPU raytraced rendering, the top of the range will be a completely different chip. If you have a huge database to serve queries from, the top of the range may be yet another chip again, because it has an obtuse memory interface. And so on and so forth.
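    To make that concrete, here's a rough Amdahl's-law sketch in Python (both chips are hypothetical; the point is only that the 'winner' depends on how parallel the workload is):

        def throughput(core_speed, cores, parallel_frac):
            """Relative throughput: the serial part runs on one core, the rest on all of them."""
            serial = (1 - parallel_frac) / core_speed
            parallel = parallel_frac / (core_speed * cores)
            return 1 / (serial + parallel)

        fast_few = (1.3, 8)    # hypothetical: 30% faster cores, but only 8 of them
        slow_many = (1.0, 32)  # hypothetical: baseline cores, 32 of them

        for p in (0.5, 0.9, 0.99):
            a, b = throughput(*fast_few, p), throughput(*slow_many, p)
            print(f"{p:.0%} parallel: fast-few {a:.1f}x, slow-many {b:.1f}x "
                  f"-> {'fast-few' if a > b else 'slow-many'} wins")
        # At 50% parallel the fast-few chip wins; at 90%+ the many-core chip
        # runs away with it - so 'top of the range' depends on the workload.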
     