A few tech bombshells have dropped recently, some of which have been due for a while. The unfortunate truth for the traditional PC segment is year-on-year decline, while servers are the ever-growing battlefield. RISC servers that used to dominate are now niche, and since most of the world's economies will stay uncertain for at least the next few years, that uncertainty is already swaying business decisions, which in turn push design decisions. But why should bit-techers care? Because the tech you know and love today is moving further outwards - to server and mobile. In one sense this has profound ramifications for the PC industry, but DON'T jump on an "it's gonna die" bandwagon. It may decline, but it'll remain important from a core-infrastructure perspective. I'm not going to write about the PC market though, because it's more of an unknown - what's clearer is the three-way clash that's due in the next three years in high-performance workstations and servers.

Intel

Advantages: With every new Xeon release, Intel is taking massive gulps out of the mission-critical, super-priced RISC market that Itanium and IBM dominated. This is mostly down to growing faith in x86 hardware and software, as well as the massive cost difference. With businesses under pressure to cut costs but also grow data centres, it's the only logical answer. After the AMD roadmap announcement I was wondering about Larrabee, but having seen the latest article that it now becomes the new Knights co-processor, Intel is clearly back in the strongest position again. Intel has invested billions into its "100-core" IA hardware and has plenty of IP in Larrabee, so it would never drop the project. Finally, without doubt, Intel also has the best process tech in the business: better than IBM, TSMC or GloFo, and it has a brand name that's as strong as or stronger than those companies'.
It's already got a roadmap to extend this lead, and Intel has demonstrated over many years the IP and technology to do TSV stacking and core tiling. No one else has.

Unknowns: Intel is going to increase its compute density via PCI-Express or some other custom interconnect, with 50 x86 cores on 22nm - but Intel doesn't have a good record with many-core projects to date, and a new interconnect will increase development costs for partners. Ultimately Intel is in an extremely powerful position; however, it's all still technical demonstrations - not a real product within a previously shown scope to achieve it.

http://arstechnica.com/business/new...e-of-the-clouds-future-according-to-intel.ars

AMD

Advantages: AMD's ATI IP is yielding huge dividends, and its recent demonstration of future Fusion parts - the complete unification of CPU/APU and GPU across the entire PC platform - is extremely encouraging. Its roadmap is encouraging, and even though it's late on Bulldozer, its mainstream APUs are already very promising. Clearly its core team is very focused on it, but it should have had enough thinking time to get it right, as it's already three years late! The tech is likely to appeal to workstations and smaller servers rather than expansive, expensive racks, but that's fine for AMD. It doesn't have Intel's brand anyway, nor does it have the cash to invest in making mission-critical parts. AMD also has considerable experience in dealing with thousands of cores per chip, which Intel does not.

Unknowns: GloFo's process tech continues to lag behind and hold up AMD. While AMD improves its relationship with TSMC, it is still heavily invested in and linked with GloFo, BUT AMD is the ONLY company demanding that GloFo invests in the latest technologies and research.
Depending on how GloFo's investors want to play it, it could go either way: use the advanced process tech angle to invest in research, IBM and Intel style, or aim a bit more for the mass market and try to secure as many fabless firms as possible. AMD doesn't have the budget to compete with Intel, and huge revisions and changes are extremely risky. The company has launched several completely new products in the last few months - more than ever in its history - but the very core CPU technology still hasn't evolved. History isn't favourable to AMD on this front: the Phenom TLB bug, the Bulldozer delay, etc. AMD doesn't have a captain on its ship, so there's no clue what direction the company might end up going. Right now its directors want more fashionable mobile technology in the portfolio, which is arguably both too late and liable to stretch the company too thin. AMD's weight is very little compared to Intel's, so unless major changes are demanded by the market and it's first to get there (as with 64-bit Opterons), it will be very difficult for it to force change. Most crucial of all: how does this Bulldozer replacement help PC gaming?

Nvidia and ARM

Advantages: Nvidia has both cutting-edge ARM tech and high-performance GPGPU hardware, along with increasing support for CUDA. It is TSMC's biggest customer and benefits from that position. ARM performance is exploding and could very well find itself in servers shortly, as every year it makes major, successful updates that the markets simply cannot get enough of. For its size, its R&D output versus revenue is fantastic. Nvidia will continue to push CUDA and Tesla, as high-performance workstation margins are worth it, and its company direction is always very strong from the top down.

Unknowns: Nvidia has very few ARM engineers, and while its current schedule appears good, I've heard it is apparently struggling to achieve its performance goals.
CUDA remains a niche compared to the support x86 has, and Nvidia relies on the performance of TSMC, which struggled at 40nm and is struggling at 28nm too. It's generally thought that Nvidia's performance roadmaps in its presentations are, at best, optimistic. Whereas Intel is pushing Atom into servers, ARM's evolution is too slow for servers: it has no 64-bit support, no ECC support, and it would require very high-bandwidth IO hubs and high-density memory arrays in order to support several Tesla cards, for example. ARM also has many partners - several bigger than Nvidia - so it will never make hardware specifically to match Nvidia's needs. Nvidia lacks the control it needs to pull off anything more competitive than just the mobile market. If ARM cannot make it for Nvidia, Nvidia will not be able to control the entire platform and will ultimately lose out. Whereas AMD's roadmap is too PC-centric, Nvidia's takes the opposite ends: mobile and server parts. Nvidia no longer has ANY control over the PC platform; it has been successfully marginalised by its competitors and by its own actions. Outside of mobile, ARM code and drivers are supported far less than x86; however, if Microsoft makes a version of Win 8 for workstations, it could be a viable option. It will depend to some extent on the reception for Win 8 at the consumer level, but I can't see ARM happening easily for servers - there's no history of reliability and security. Of most concern, though, is that Nvidia talks less and less about PC gaming, and has shown with Fermi that PC gaming is increasingly on the "can also do" agenda.
So you're predicting a fast die-off of Itanium2 and the EPIC instruction set, then? I would be, but I'm just asking. What I'm finding interesting is the VLIW4/5 question and what impact that will have on GPGPU. With AMD's Southern Islands apparently set to run on a new instruction set, it could pick up some serious speed over what came before, making it a viable alternative to CUDA - as well as a problem for Intel, whose Knights co-processor I'm not convinced is anything other than Itanic 3. The future will be decided by instruction sets and interconnects - kinda like in the early days of supercomputers.
I liked my Itanium machines. They really didn't go down very often. Other than that, I have no idea - I just made sure they ran. I did have some bad servers, but the Itaniums weren't among them.
Wow, I don't usually care about server architecture but the above post was very, very interesting! Thanks Bindi!
Itanic will die off as a platform, as will RISC for everything except the most niche, mission-critical things. In fact, RISC is going in the complete OPPOSITE direction to the PC platform - less platform-centric, more "open" in its design. IBM will keep making PowerPC for it, as it's supported by massive research funding and university programmes afaik. If anything this will make mission-critical parts more customised and potentially more secure for government institutions. Imagine personal verification at the very hardware level before any execution.

Southern Islands will still be VLIW4, but with more granular power management and on 28nm to pack more in. AMD learnt a lot from the 6000 series and I wouldn't be surprised if it tweaked the design more towards Nvidia's SM model. High-K metal gate from TSMC will finally help keep the temps down too. Next year's part will be the fusion between CPU and GPU that requires a new IA - by then TSMC's 28nm will be mature.

Knights has been on the cards for years, and yes, in some ways it's Itanic 3, as Intel has been showing 100-core IA parts for the last four years already. It could go one of two routes within a 250W thermal budget: pull out what was useful from IA and merge it into a beefed-up x86-64 core - that means a TON of transistors, and 50 large cores will only work at low GHz. The alternative is to copy Sony/Toshiba/IBM's Cell: small, in-order (Atom-esque) cores that are lightning fast - exactly like GPU shaders, in fact - except with a hefty dash of IA pipelining. Crank up the MHz and the fast local memory, with lashings of L2 cache as a snoop filter, and it'll not only compensate for the lack of OoO execution but also keep the many cores in check. Throw in a Sandy Bridge-style ring bus to some ECC GDDR(-like) tech and this could work. Intel was just waiting for TSV technology and process technology to evolve to the point where it could release it.
It's not the first generation of Intel's Knights that will be impressive - it's the SECOND generation, with built-in routers, TSV and stacked memory. Larrabee on 90nm and 65nm was colossally power hungry. Larrabee was an arrogant mismanagement aimed at Nvidia - Intel thought it could do a GPU - but the core technology was always aimed at the server market, where the money is. The extra time has given it a considerable QA advantage (remember that SMT took Intel about five years to perfect: it was originally intended to launch with the first P4, but got delayed until 1.5 generations later). Intel's QA advantage will be considerable compared to AMD, which I doubt has done much prep for its lofty ambitions. This is why you should invest in Intel now.