
News: Intel releases discrete GPU prototype details

Discussion in 'Article Discussion' started by bit-tech, 20 Feb 2018.

  1. bit-tech

    bit-tech Supreme Overlord Staff Administrator

    Joined:
    12 Mar 2001
    Posts:
    1,349
    Likes Received:
    22
     
  2. Guest-16

    Guest-16 Guest

    This is absolutely a prototype. It looks like a modified GPU core from a CPU spun out, and the FPGA is just the usual add-on to test and tune this new functionality before it's spun into the main silicon. The FPGA has to replace the Uncore area from a CPU, which requires an amount of 'new' engineering. I strongly believe they won't leave that as an external chip, unless it's really a time-to-market issue.

    It's not so much that low power is the focus, more that it's a factor. Even the high-performance models have to be 'low power' to be performance-per-watt competitive with Nvidia. The IVR and Turbo functionality is pretty unique for GPUs, but Intel has that experience from CPUs. The IVR will let them simplify power input and lower the cost of the platform (card), which makes it more price-competitive. It says Intel is really pushing its unique IP where it can to gain an advantage, because its EUs are likely to be less competitive per-unit.
     
  3. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    2,548
    Likes Received:
    134
    Knights Corner (first gen Xeon Phi) shipped with the Gen-derived texture units on the die but unused. Larrabee's graphics target was quietly shuffled under a rug, but it still shipped to great success for all its other uses. KNF, KNC and KNL are all Larrabee.
     
  4. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    10,752
    Likes Received:
    893
    ...isn't that what I said?
     
  5. Zak33

    Zak33 Staff Staff Administrator

    Joined:
    12 Jun 2017
    Posts:
    192
    Likes Received:
    35
    not in that exact order.

    //pulls out Les Dawson playing a piano, with all the right notes, but not in the right order
     
  6. Hustler

    Hustler Member

    Joined:
    8 Aug 2005
    Posts:
    983
    Likes Received:
    21
    Lol, I remember loads of people at the time in various forums posting 'I'm going to wait for Larrabee', thinking it was going to blow ATI & Nvidia out of the water... real-time ray tracing, 60fps!!!
     
  7. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    2,548
    Likes Received:
    134
    Not really. The efforts for Larrabee weren't "redirect[ed] into the Many Integrated Core (MIC) co-processor initiative that would become the Xeon Phi server-centric product range", Xeon Phi literally is the Larrabee die. All that died of Larrabee was the idea to sell it in a box labelled 'GPU' to consumers, the actual product that was demonstrated as 'Larrabee' shipped just as shown, texture samplers and all. All the things gen 1 Xeon Phi was sold as doing were part of Project Larrabee, with the use as a discrete GPU being in addition to those uses (though for consumers, most only ever looked at the graphics side and ignored all the rest).
     
  8. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    10,752
    Likes Received:
    893
    I think we're arguing semantics: Larrabee was supposed to be a graphics card. No Larrabee graphics card ever hit the market. The R&D done for Larrabee instead became a general-purpose accelerator card for the server and HPC markets. What was a GPU initiative became the MIC initiative, and hit the market in a very different form than originally promised (i.e. there's no video output on any Xeon Phi product.)

    The Larrabee project, which was specifically to build and launch a high-end discrete graphics card, died. The MIC project was born from its ashes, and is still going. Calling MIC Larrabee is as inaccurate as calling the Quark a Pentium - actually, more so, because at least both the Quark and the Pentium are CPUs...

    I understand your perspective: that Larrabee is the die, the design of which lives on in MIC. I wasn't writing about just the die, though; I was writing about the Larrabee project to launch a graphics card, which was killed off (and remains killed off, 'cos what Intel's showing off at ISSCC is absolutely knack-all to do with Larrabee.)
     
    Last edited: 20 Feb 2018
  9. yuusou

    yuusou Well-Known Member

    Joined:
    5 Nov 2006
    Posts:
    1,678
    Likes Received:
    155
     
  10. mi1ez

    mi1ez Active Member

    Joined:
    11 Jun 2009
    Posts:
    1,404
    Likes Received:
    13
    They've only just paired with AMD for mobile parts...
     
  11. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    2,548
    Likes Received:
    134
    One of the Larrabee project goals was to produce a discrete GPU, but while that was what people latched onto as "OMG Intel is making a GPU!", it was only a small part of the Larrabee project. Originally, Larrabee was called "Simple, Massive Array of Cores" (SMAC). It was conceived as a big parallel processor first, with "hey, why could this not also be a GPU?" as the 'big demo' to show it off. Amusingly, it kind of went full circle as part of the development process: they started with full-up x86 cores, pared off SSE as awkward and bloaty and wrote their own instruction set (SMACNI, then LRBNI), then SSE/MMX/AVX/etc. got pushed back in for KNL for compatibility, and LRBNI got mooshed in with them. That weird mooshing is what we now know as 'AVX-512', and it made its way out of Larrabee into Core.
    SMAC is Larrabee is MIC is Xeon Phi. They're all the same project (and for KNF and KNC, literally the exact same dies), just with different names. Claiming Xeon Phi is not Larrabee is like claiming Ryzen is not Zen.
    To quote Tom Forsyth directly:
    tl;dr: Xeon Phi = Larrabee. So sayeth the guy who was designing it.
     
  12. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    10,752
    Likes Received:
    893
    You're not wrong - to quote the 2007 unveiling article: "Otellini believes Larrabee has applications in supercomputing, in financial services, and physics and health applications. Most importantly, Otellini also stated that Larrabee would be very good at graphics."

    However, Intel's exclusive focus at that time was visual computing, and it specifically stated it would "have a competitive graphics product [in 2009]." 2008: Tom Forsyth himself announces that "Larrabee is going to render DirectX and OpenGL games through rasterisation," i.e. that it's a graphics card. From the same link: "There's only one way to render the huge range of DirectX and OpenGL games out there, and that's the way they were designed to run – the conventional rasterisation pipeline. That has been the goal for the Larrabee team from day one, and it continues to be the primary focus of the hardware and software teams." (And isn't it interesting that, per your quote, Forsyth dramatically changed his tune on that front after the Larrabee GPU was cancelled?)

    Siggraph 2008 press release: "The first product based on Larrabee will target the personal computer graphics market and is expected in 2009 or 2010. [...] Initial product implementations of the Larrabee architecture will target discrete graphics applications, support DirectX and OpenGL, and run existing games and programs. Additionally, a broad potential range of highly parallel applications including scientific and engineering software will benefit from the Larrabee native C/C++ programming model."

    Note the 'additionally': in 2008 the primary focus of Larrabee was still personal computing graphics, not HPC acceleration. 2009: An IDF demonstration uses Quake Wars and talks exclusively about enthusiast computing. From the same piece: "Maloney confirmed that the first products based on Larrabee would be discrete graphics cards, and also revealed that the Larrabee architecture will eventually be integrated into the CPU."

    2009: Larrabee is cancelled, though Intel is careful to keep future products on the table by announcing the cancellation only of the "first Larrabee product [which] will not be launched as a standalone discrete graphics product." That's then it for Larrabee, until it resurfaces as MIC (fun fact: I was at the ISSCC conference for the original Knights Ferry card. Hefty thing, that!)

    However again, contrary to your assertions, Knights Ferry/MIC is not unadulterated Larrabee, nor even a direct descendent of Larrabee. Per Wikipedia's entry on Knights Ferry, Knights Landing is "a derivative of the Larrabee project and other Intel research including the Single-chip Cloud Computer," which uses Intel's own press release as a citation ("Products build upon Intel's history of many-core related research including Intel's Larrabee program and Single-chip Cloud Computer.")

    As I said earlier, though, we're arguing semantics. You're in no way wrong in what you're saying, but neither am I: given that every piece of coverage Larrabee has ever had on this site is exclusively about the cancelled GPU implementation, it's reasonable for me to use the word to refer to that GPU implementation - even though the technology itself lives on, in modified form, as MIC.

    TL;DR: When I say Larrabee, I'm talking about the cancelled graphics implementation which, as both Intel and Forsyth explained at the time, had as its primary focus the commercial release of high-end discrete graphics products - a commercial release that never happened. When you say Larrabee, you're talking about the research project as a whole, with all its derivatives and secondary and tertiary foci, which was merged with SCC and others to produce MIC. Neither of us is wrong.
     
  13. Anfield

    Anfield Well-Known Member

    Joined:
    15 Jan 2010
    Posts:
    3,855
    Likes Received:
    194
    Depends on the definition of external.
    Look at the generic universal interface they are using for their collab with AMD chips, the line between external and internal GPUs may well be about to vanish if they start using the same approach for their own GPUs.
     
  14. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    2,548
    Likes Received:
    134
    Wikipedia is simply wrong in this case: KNF is Larrabee, as is KNL. The "don't be a GPU anymore" edict was dropped after KNF had already gone gold and was fabbing. There weren't any other super-secret non-GPU bunch-of-x86-cores dies in development. Apart from lacking a DVI connector soldered to the board, the only difference between KNF as a GPU and KNF as a compute card is the software running on it.
    Think of it like Nvidia's GV100: Put it on an SXM2 PCB without a video output and it's a pure compute card for the datacentre. Put it on a PCB with a PCIe card-edge and video outputs and it's a Titan GPU. Exact same die, different names.
    My assertion is basically: "MIC" was never a separate project from Larrabee for it to be 'merged' into, and Larrabee shipped twice to paying customers. It was literally the same team, the same chip, the same die, the same board (barring a video output being soldered on), just with a second name change (SMAC -> Larrabee -> MIC). Two generations of Larrabee dies shipped with a 'Xeon Phi' label attached (and one generation, KNF, even shipped to developers in its GPU guise too, with a few still knocking around) before the project was renamed MIC.
    SCC, however, was dropped just like Terascale before it. The VLIW IA cores used in both (on the Polaris die for Terascale, and on Rock Creek for SCC) were abandoned, and only Larrabee's x86 cores were ever implemented in shipping products.
     
  15. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    10,752
    Likes Received:
    893
    Wikipedia is literally quoting Intel.
    Would it help any if I changed "announcing the Larrabee project" to "announcing the Larrabee GPU project" in the article, to make it clear the paragraph is referring to the planned-then-cancelled consumer GPU product?
     
  16. Corky42

    Corky42 What did walle eat for breakfast?

    Joined:
    30 Oct 2012
    Posts:
    8,468
    Likes Received:
    191
    I'm not going to pretend I know what I'm talking about, but I was reading this blog post from Tom Forsyth, who was apparently on the team involved with the Larrabee project, and he seems to be saying it wasn't a GPU - at least I think that's what he's saying when he says "Larrabee was never primarily a graphics card". Sorry if that just confuses things even more. :)
     
  17. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    2,548
    Likes Received:
    134
    That's the article I linked earlier.
    My main complaint was that the claim "Sadly, while wafers, die photographs, and even live demos would follow, by late 2009 Intel would cancel its plans in favour of redirecting the research into the Many Integrated Core (MIC) co-processor initiative that would become the Xeon Phi server-centric product range" is not correct, as the wafers and dies as demonstrated were literally released unmodified as Xeon Phi, for two generations. Stick a 'Xeon Phi' sticker over the 'Larrabee' label on the box, still call the dies themselves Knights Corner/Landing, and release as Xeon Phi instead of as Larrabee.
     
  18. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    10,752
    Likes Received:
    893
    The wafer was shown off as "hey, this is going to be a graphics card." The die photograph was shown off as "hey, this is going on a graphics card." The live demos were "hey, look at this graphics card we made." The plan to launch a graphics card was cancelled. Are we in agreement so far?

    The work done on making a Larrabee graphics card was then used to launch the MIC products, including reusing the dies. That, however, does not mean that the Larrabee graphics card was not cancelled, because it was. MIC/Xeon Phi is not a graphics card.

    Thus, I stand firmly by my statement: the Larrabee graphics card project - which given that's what every single linked article is talking about, and the article itself is comparing Intel's latest discrete graphics card efforts to the aborted Larrabee graphics card, I'd assume everyone was able to figure out was the context here - was showcased between 2007 and 2009, then cancelled.

    I've added the word "GPU" to the article, but I ain't making any further changes. What I wrote is both accurate and, to my mind, entirely clear in its context. I'm sorry that you disagree.
     
  19. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    2,548
    Likes Received:
    134
    To again quote Tom Forsyth (no longer filtered by Intel's PR department), note the highlights:
    My contention is that despite the marketing copy being "Larrabee is totally 100% GPU! Not a pound for air-to-ground, pay no attention to the Knights behind the curtain", this is not actually true. The "If Intel had wanted a kick-ass graphics card, they already had a very good graphics team begging to be allowed to build a nice big fat hot discrete GPU - and the Gen architecture is such that they'd build a great one, too" line is also extremely relevant, as it indicates that 'build a great big Gen chip' may well be the avenue Intel would pursue for a discrete GPU in the future, if it were to design one GPU-first rather than run a compute-first chip as a GPU.
     
  20. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    10,752
    Likes Received:
    893
    And to again quote Tom Forsyth from before the Larrabee graphics card project failed, and fail it did: "There's only one way to render the huge range of DirectX and OpenGL games out there, and that's the way they were designed to run – the conventional rasterisation pipeline. That has been the goal for the Larrabee team from day one, and it continues to be the primary focus of the hardware and software teams."

    So, which is it? Was being a graphics card for games "the goal for the Larrabee team from day one [and] the primary focus of the hardware and software teams," or was "Larrabee never primarily a graphics card?" 'cos it can't be both. Either it was a graphics card first and foremost and he's rewriting history so as not to have been in charge of a failed product, or it wasn't a graphics card first and he was being less than truthful - or 'led' by Intel's PR team - when he said that making a graphics card was Larrabee's goal from day one.

    As I said and continue to say: the article is referring to Intel's absolutely-failed, totally-cancelled, and never-launched Larrabee graphics card project, and mentions as an aside that the R&D which went into said absolutely-failed, totally-cancelled, and never-launched Larrabee graphics card project lives on as MIC. That's accurate, and I see no need to change it.
     