Discussion in 'Article Discussion' started by bit-tech, 9 Nov 2017.
This is huge news, and AMD should be worried... Even Nvidia. I suspected in the other Raja thread that Intel would outright buy the entire RTG division from AMD (would give them the opportunity to focus purely on Zen successors), but to build it from the ground up, I did not expect!
Remember that they tried this once before and gave up after two years and no products - though, admittedly, they didn't have Mr. Radeon at the helm then.
Will be interesting to see what he can do with the huge resources of Intel behind him.
RTG was always held back by a (relatively) low budget and size of the design team, compared to Nvidia.
I suspect a lot of hardware engineers will be getting their CVs ready for the hiring binge that will come.
Just a shame that we're probably going to have to wait until at least 2020 before we see anything come to market that gives consumers a genuine alternative to Nvidia & AMD (assuming they're still around).
AMD are probably a lot less worried than Nvidia. I'd even go as far as saying that he probably had AMD's blessing to move to Intel, as AMD simply don't have the funds to prevent Nvidia from becoming the only player in machine learning, AI, VR, AR, IoT, and all those other markets that need graphics processing chops.
I figured Intel would buy all the IP that comes with the RTG (as well as the engineers), but I don't know how much of that is tied up in the CPU business too.
We were all shocked when AMD bought ATI, I wonder if Nvidia will be weighing up the figures for buying out AMD? Ryzen with integrated Pascal/Volta to fight Intel/ATI?!
I doubt Nvidia are particularly worried yet.
CUDA has the GPGPU market well in hand, and on the graphics side GCN has been unable to effectively challenge in perf/watt or perf/area for generations (perf/$ is dependent on how much you want to shave your margin, and AMD's have been slimmed to near-zero for Vega). A GCN-based Intel discrete GPU would be no real challenger for them even if it were fabbed by Intel rather than GloFo.
A completely new architecture - or a resurrected ghost of Larrabee / graphics-focussed Xeon Phi derivative, which is what I'd bet on turning up to leverage Intel's existing work - will take years to create from whole cloth. Nvidia have a handy head-start, and have hardly been resting on their laurels.
To quote myself in the thread on AMD and Intel partnering on a laptop chip.
"Call me cynical but what if this is just a way for Intel to get close engineering knowledge of Radeon gfx allowing them to reverse engineer their own graphics core for future APU use?"
I'm going to say I called this!
On a more serious note, this is interesting news and I look forward to the inevitable fall out of either:
A - Another Larrabee flop
B - A competitive design making it a 3 manufacturer market
C - Intel pull a 'Core' and make something so ridiculously good we will all want it (this is unlikely, even if just because games just won't be optimised for anything new)
The fabrication of silicon doesn't work on time frames of 'yet' - it takes years from design to market - and Nvidia is a long way from having the GPGPU market well in hand, especially considering it's a market with the potential for huge growth in the future.
You only have to look at Google's TPU, Apple's 'neural engine' and the plethora of smaller start-ups to see Nvidia are a long way from having a developing market all to themselves.
I suspect he got frustrated at AMD not being able to compete due to the shoestring budget he almost certainly had to work with. Money is something that Intel has plenty of. From a competitive standpoint it'll take 3-5 years to develop something new I guess, so not a major problem for either Nvidia or AMD for quite a while. More immediate would be if Intel goes on a hiring spree and starts sucking talent out of the other companies - particularly AMD as Raja will know who the good people are and will be able to offer them packages AMD can't compete with.
Ah, the much vaunted Laughabee - it promised so much and delivered so little. I was beginning to wonder when Intel would dip its toe in the discreet GPU market again.
A discreet GPU seems like a great idea - I hate these in-your-face GPUs, all blingy and LED and so noisy.
I also like discrete GPUs
Nothing quite like a discreet discrete GPU.