Discussion in 'Article Discussion' started by bit-tech, 21 Jul 2017.
Excuse my ignorance, but what can be done with these?
S'basically a device which does the type of floating-point arithmetic that deep-learning, AI, and computer vision projects require really, really quickly and in a low power envelope. It's much faster than a CPU of equivalent power draw doing the same stuff, and while it's slower than a GPU-based accelerator it draws around 1W of power instead of, what, 300W?
Traditionally, for computer vision stuff on embedded devices, you'd farm the data off to a massively powerful server somewhere and process it there; this lets you stick a USB device into any cheap off-the-shelf low-power device - right down to a Raspberry Pi Zero, if you want - and do the processing locally instead. Faster, cheaper, and probably even lower power than streaming the data off for remote processing.
As for what you can do with it? Well, unless you're doing something with the Caffe deep-learning framework, then... not a lot, really.
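If you are in the Caffe camp, for what it's worth, the basic flow is: compile your trained Caffe model into the stick's graph format with the SDK tools, then drive it from the host over USB. A rough sketch of what that host side looks like with the v1 NCSDK Python API (function names as I remember them from the SDK docs, and obviously this only runs with the stick actually plugged in, so treat it as illustrative):

```python
# Illustrative sketch of the Movidius NCSDK v1 Python workflow.
# Assumes: the 'mvnc' package from the NCSDK is installed, a stick is
# plugged in, and 'graph' is a file produced by the SDK's model compiler
# from a trained Caffe network. Not runnable without the hardware.
from mvnc import mvncapi as mvnc
import numpy

# Find and open the first attached Neural Compute Stick.
devices = mvnc.EnumerateDevices()
device = mvnc.Device(devices[0])
device.OpenDevice()

# Load the pre-compiled network graph onto the stick.
with open('graph', 'rb') as f:
    graph_blob = f.read()
graph = device.AllocateGraph(graph_blob)

# Push one input tensor (e.g. a preprocessed image as float16) and
# read the inference result back.
input_tensor = numpy.zeros((224, 224, 3), dtype=numpy.float16)  # placeholder input
graph.LoadTensor(input_tensor, 'user object')
output, user_obj = graph.GetResult()

# Tidy up.
graph.DeallocateGraph()
device.CloseDevice()
```

The point being: all the heavy lifting happens on the stick itself, and the host (even a Pi Zero) just shuffles tensors over USB.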
That's all well and good, but can you run Doom on it?