Reviews AMD Radeon RX Vega 64 and RX Vega 56 Review

Discussion in 'Article Discussion' started by bit-tech, 11 Aug 2017.

  1. Guest-56605

    Guest-56605 Guest

    Going back to what V-T said about price gouging, it's a little-known fact, but the AMD CPU and GPU divisions are, to all intents and purposes, still separate companies under a unified brand. There's certainly no love shown or shared between them even now, as the GPU division still views itself as a suppressed and very much resentful ATi.
     
  2. adidan

    adidan Guesswork is still work

    Joined:
    25 Mar 2009
    Posts:
    19,805
    Likes Received:
    5,592
    Sounds like the $499 is an introductory offer that won't last forever.
    I'm more interested in the 56 myself, but if the not-known-before-launch MSRP includes a price discount that then disappears, I think it's a really bad show on AMD's part.
     
  3. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    That seems odd, as arguably AMD's recent CPU and GPU designs both target the same market, with post-fabrication changes made to better address other markets. That's not to say each division doesn't see itself as a separate company, but there must be some overarching decision-making going on.
     
  4. Chicken76

    Chicken76 Minimodder

    Joined:
    10 Nov 2009
    Posts:
    952
    Likes Received:
    32
    Didn't the Fury X have a 4096-bit memory bus? The table on page 1 shows it had a 1024-bit one.
     
  5. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    Someone needs to update the MSRP in the article, as apparently the $499 for the V64 was a launch price only.
     
  6. Wakka

    Wakka Yo, eat this, ya?

    Joined:
    23 Feb 2017
    Posts:
    2,117
    Likes Received:
    673
    So half of that £/$200 advantage of FreeSync vs G-Sync goes straight out the window on MSRP alone, another £/$25-50 will evaporate if you want a properly cooled, non-ear-bleeder AIB version, and the rest will no doubt be more than covered by awful availability, and subsequent price gouging, because of poor yields...

    So there is now precisely ZERO reason to consider a Vega 64.

    /slow clap
     
  7. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,062
    Likes Received:
    970
    The AMD statement in that KitGuru article is weird. What is the AMD PR department smoking?
     
  8. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    That must have been added in the last hour; goes off to have a read.

    It is a bit weird, isn't it? They only seem to be talking about stock levels and not price. :confused:

    Having thought about it some more, in effect what they're saying is that a V64 on its own costs $499, but those were limited in stock, and from now on you're only going to be able to buy a V64 with two "free" games for an extra $100.
     
    Last edited: 18 Aug 2017
  9. rollo

    rollo Modder

    Joined:
    16 May 2008
    Posts:
    7,887
    Likes Received:
    131
    AMD are smoking something and no one is buying.

    In the UK the price gap from a 1080 to a Vega 64 is £150. Once board partners do their job with all-in-one coolers, the Vega 64 will be priced above the 1080 Ti at all tiers, but without the performance to back it up.

    AMD are too late and too slow to matter. Even if you have a FreeSync monitor, the 1080 Ti is faster and will still max out the refresh rate of said monitor, and the better ones are all 144Hz anyway.
     
  10. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    14,979
    Likes Received:
    3,741
    I read today that, apparently, the rebate AMD gave retailers for the £430 and $499 cards meant AMD made nothing on them. Their fault entirely: an obese die with very spenny memory tends to do that. This is why Nvidia stopped making Fermi after Fermi.
     