Do new chips perform better than release day versions?

Discussion in 'Hardware' started by Farting Bob, 12 Jan 2012.

  1. Farting Bob

    Farting Bob What's a Dremel?

    Joined:
    21 Jan 2009
    Posts:
    469
    Likes Received:
    13
    Difficult to explain what I mean in the short title, but essentially: will a chip manufactured near the end of its life cycle perform differently in any way from an identically designed chip you could buy on release day?

    As an example: if you purchased 1000 ATI 5870 chips when it was first released way back in the day, would they average out to be slightly worse (power consumption, performance, overclocking or any metric really) than 1000 5870 chips produced in the last few batches before ATI decides to stop making them?
    With the manufacturing process and yields improving over the year or two that a specific chip is in production, would an otherwise identically designed chip from a later batch have fewer bad transistors and therefore perform better, or less leaky transistors and therefore consume less power?

    I know each individual chip will perform slightly better or worse than others (overclocking is a great example of this; some chips just overclock better). Would the average chip migrate towards the better overclocking numbers over time, or have marginally lower power consumption at the reference speed?
    I would assume so, even if the difference is near impossible to measure in the real world. Just something I was wondering to myself.
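
    A rough way to picture that last question (with purely invented numbers, not real 5870 data) is to treat each chip's maximum stable clock as a draw from a distribution whose centre and spread shift slightly as the process matures:

        import random
        import statistics

        # Purely illustrative sketch: model each chip's maximum stable clock (MHz)
        # as a normal draw. Both sets of numbers below are invented assumptions,
        # not measured data for any real GPU.
        random.seed(0)

        def sample_batch(mean_mhz, sigma_mhz, n=1000):
            """Simulate the max stable clocks of n chips from one point in a process's life."""
            return [random.gauss(mean_mhz, sigma_mhz) for _ in range(n)]

        early_batch = sample_batch(mean_mhz=900, sigma_mhz=40)  # release-day silicon (assumed)
        late_batch = sample_batch(mean_mhz=920, sigma_mhz=30)   # matured process (assumed)

        for name, batch in (("early batch", early_batch), ("late batch", late_batch)):
            batch.sort()
            print(f"{name}: mean {statistics.mean(batch):.0f} MHz, "
                  f"worst 5% {batch[len(batch) // 20]:.0f} MHz, "
                  f"best 5% {batch[-(len(batch) // 20)]:.0f} MHz")

    If the later distribution really is tighter and centred a little higher, both the average chip and the worst chip you could buy improve; whether that shows up in a real card also depends on binning and where the vendor sets the reference clocks.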
     
  2. padrejones2001

    padrejones2001 Puppy Love

    Joined:
    17 Jun 2004
    Posts:
    1,434
    Likes Received:
    15
    In short, yes. Every manufacturer releases revisions of their products, whether they're graphics cards, processors, motherboards, monitors, etc. These revisions will typically be printed right on the product, so that the end user can easily find out which revision they have.

    As for processors specifically, there are different steppings released over time, but it has more to do with the batch as a whole. Sometimes manufacturers will cherry-pick processors that perform especially well or run cooler and release them as "golden samples." This is especially common in the graphics card industry.
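
    A minimal sketch of that binning / "golden sample" idea, assuming invented clock thresholds rather than any vendor's real criteria:

        import random

        # Illustrative binning sketch: sort a batch of chips into bins by the highest
        # clock each one validates at. The thresholds are invented for illustration,
        # not any real manufacturer's binning rules.
        random.seed(1)
        chips = [random.gauss(900, 40) for _ in range(1000)]  # max stable clock in MHz (assumed)

        bins = {"golden sample": [], "standard": [], "salvage / lower SKU": []}
        for clock in chips:
            if clock >= 970:        # the top few percent get cherry-picked
                bins["golden sample"].append(clock)
            elif clock >= 850:      # comfortably meets the reference spec
                bins["standard"].append(clock)
            else:                   # misses the reference spec, sold as a cut-down part
                bins["salvage / lower SKU"].append(clock)

        for name, group in bins.items():
            print(f"{name}: {len(group)} chips")

    If yields improve over a product's life, more chips clear the higher thresholds, which is one reason the heavily factory-overclocked cards tend to show up later on.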
     
  3. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,577
    Likes Received:
    196
    Well, if you don't count binning, then I would presume that later batches would tend to be at the very least slightly more overclockable. Note that the EVGA Cheesecake or Classified cards didn't appear until a while later, and the super-OC'able GPUs tend to come out a bit later in their life cycles.

    Over time, as a process matures, the chips tend to get better in general. Look at CPUs: revisions usually do quite a bit of good.
     
