
News Movidius unveils $100 USB neural network co-processor

Discussion in 'Article Discussion' started by Gareth Halfacree, 29 Apr 2016.

  1. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    11,508
    Likes Received:
    1,335
  2. schmidtbag

    schmidtbag New Member

    Joined:
    30 Jul 2010
    Posts:
    1,082
    Likes Received:
    10
    If these can be used for BOINC then I'm definitely interested. If these can also be overclocked then even better. Otherwise, I have no interest in it.
     
  3. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    11,508
    Likes Received:
    1,335
    Are there visual-processing projects on BOINC? If so, and if the project maintainer adds support for the Movidius VPU, then yes, it could be used for BOINC.
     
  4. schmidtbag

    schmidtbag New Member

    Joined:
    30 Jul 2010
    Posts:
    1,082
    Likes Received:
    10
    I'm not sure, but I don't see why it wouldn't be possible. For example, there are experimental drivers for the USB Bitcoin ASICs.
     
  5. Cthippo

    Cthippo Can't mod my way out of a paper bag

    Joined:
    7 Aug 2005
    Posts:
    6,783
    Likes Received:
    101
    I may be talking completely out my arse here (as usual), but wouldn't the market for these be more for GPU manufacturers to put on graphics cards? Perhaps as a daughter card using the SLI / CrossFire architecture?
     
  6. dunx

    dunx ITX is where it's at !

    Joined:
    1 Sep 2010
    Posts:
    463
    Likes Received:
    13
    When GPUs are running at multiple teraflops, what use is a 0.15-teraflop add-on device?
    Sure, it's energy efficient, but who wants 40 of these hanging off a bunch of USB hubs?

    dunx

    P.S. Great solution ONLY if 150 GFLOPS does all you need within a minimal power envelope.
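The power-envelope point can be put into rough numbers. A back-of-envelope GFLOPS-per-watt comparison, using illustrative figures (the ~1.2 W stick draw and ~180 W GPU draw are assumptions for the sake of the arithmetic, not numbers from the article):

```python
def gflops_per_watt(gflops: float, watts: float) -> float:
    """Simple efficiency metric: compute throughput per watt consumed."""
    return gflops / watts

# Quoted ~150 GFLOPS for the stick at an assumed ~1.2 W
fathom = gflops_per_watt(150, 1.2)
# An assumed mid-range desktop GPU of the era: ~5000 GFLOPS at ~180 W
gpu = gflops_per_watt(5000, 180)

print(round(fathom, 1))  # ~125 GFLOPS/W
print(round(gpu, 1))     # ~28 GFLOPS/W
```

On those assumptions the stick is several times more efficient per watt, even though its absolute throughput is tiny next to a GPU, which is exactly the trade-off dunx describes.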
     
  7. schmidtbag

    schmidtbag New Member

    Joined:
    30 Jul 2010
    Posts:
    1,082
    Likes Received:
    10
    GPUs can already do this effectively via OpenCL and/or OpenCV. Also, in case you're not aware, the purpose of SLI and CrossFire is to combine the processing power of multiple GPUs, but you don't need either if each GPU is processing something independently. For example, you can have two Nvidia GPUs of different generations and use the second one for PhysX.

    I think this device is probably geared toward high-tech security camera servers, where they might not have a GPU but don't have the processing power to efficiently do image recognition. I otherwise don't really see why anyone would need this, because for home use, even an IGP is good enough.
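For context on why this kind of workload suits a GPU or VPU: image recognition pipelines mostly reduce to convolutions, where every output pixel is computed independently and can therefore run in parallel. A toy 2D convolution in plain Python (illustrative only; no Movidius API or OpenCV call is assumed here):

```python
def conv2d(image, kernel):
    """Valid-mode 2D convolution over nested lists; each output pixel
    is independent, which is why GPUs and VPUs accelerate this well."""
    kh, kw = len(kernel), len(kernel[0])
    oh = len(image) - kh + 1
    ow = len(image[0]) - kw + 1
    out = [[0.0] * ow for _ in range(oh)]
    for y in range(oh):
        for x in range(ow):
            acc = 0.0
            for ky in range(kh):
                for kx in range(kw):
                    acc += image[y + ky][x + kx] * kernel[ky][kx]
            out[y][x] = acc
    return out

# A 4x4 intensity ramp and a horizontal edge-detection (Sobel-style) kernel
img = [[c + r for c in range(4)] for r in range(4)]
sobel_x = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
print(conv2d(img, sobel_x))  # constant gradient -> uniform response
```

A security-camera box would run thousands of these per frame, which is the kind of fixed, parallel workload a dedicated low-power chip can take off the CPU.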
     
  8. Wwhat

    Wwhat Member

    Joined:
    2 Oct 2005
    Posts:
    263
    Likes Received:
    1
    It's interesting that they list as a requirement for PC use 'a decently recent Linux, like Ubuntu, and Python 3'.
    Surely they could build libraries for other platforms like Windows and OS X, and it seems they should have done so before their big marketing campaign.
    I have the impression that an awful lot of development, rightly or wrongly, happens on non-Linux systems when we're talking about the PC hardware platform.
     
  9. Wwhat

    Wwhat Member

    Joined:
    2 Oct 2005
    Posts:
    263
    Likes Received:
    1
    I just saw that they have now partnered with FLIR (the thermal-imaging company) to embed their chip in the thermal sensor to do all the processing, which let them shrink their modules once again. So if you were an investor, it does look good for Movidius's future: they seem to be finding adoption, and it's not just a momentary flash of media attention.

    As for the GPU as a competitor, I think we know its limitations by now, since DirectCompute and OpenCL have been around a while, and the Movidius chip seems to do things GPUs so far haven't brought to the public. So let's hope Nvidia doesn't buy them up and kill them, as they did with the previous dedicated matrix-calculation chip.
     
  10. ModSquid

    ModSquid Member

    Joined:
    16 Apr 2011
    Posts:
    455
    Likes Received:
    9
    Whilst I won't pretend to understand what visual computing is (although it does seem camera-centric), the point above about differing generations of card still offering a boost to your system deserves more publicity.
     
