Difficult to explain what I mean in a short title, but essentially: will a chip manufactured near the end of its life cycle perform differently in any way from an identically designed chip you could buy on release day?

As an example: if you had purchased 1000 ATI 5870 chips when the part was first released way back in the day, would they average out slightly worse (in power consumption, performance, overclocking headroom, or any metric really) than 1000 5870 chips produced in the last few batches before ATI decided to stop making them?

With manufacturing process improvements and yield gains over the year or two that a specific chip stays in production, would otherwise identically designed chips from later batches have fewer defective transistors, and therefore perform better, or have less leaky transistors, and so consume less power? I know each individual chip performs slightly better or worse than others — overclocking is a great example of this, since some chips just overclock better. Would the average chip migrate towards the better overclocking numbers over time, or show marginally lower power consumption at the reference clock speed?

I would assume so, even if the difference is near impossible to measure in the real world. Just something I was wondering to myself.