
IBM’s brain-chip finds home in Samsung machine-vision project

Samsung has unveiled a machine-vision platform based on IBM’s TrueNorth processor, a chip designed to operate in a manner inspired by the human brain and to deliver substantial compute power from a far smaller power budget. The Samsung system, called the Dynamic Vision Sensor, looked very impressive in a demonstration at Samsung’s Advanced Institute of Technology.

The South Korean giant says that the chip allows its camera to see the world at 2,000 frames per second (fps), which requires very high processing bandwidth. What’s remarkable is that the system uses only 300 milliwatts of power.
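To put that in perspective, here is a rough back-of-envelope comparison of the data rate of conventional full frames at 2,000 fps against an event-based readout of the kind a dynamic vision sensor uses, where only pixels that change are reported. The resolution, bit depths and 1% activity figure below are illustrative assumptions, not numbers from Samsung’s demo.

```python
# Illustrative only: compare raw data rates of full frames vs sparse events.
# Resolution, bit depth, event size and activity level are assumptions,
# not figures published by Samsung or IBM.

width, height = 640, 480        # assumed VGA-class sensor
fps = 2_000                     # frame rate quoted in the article
bits_per_pixel = 8              # assumed greyscale frames

frame_stream = width * height * bits_per_pixel * fps
print(f"Full frames at {fps} fps: {frame_stream / 1e9:.1f} Gbit/s")

# An event-based sensor only reports pixels whose brightness changes.
active_fraction = 0.01          # assume 1% of pixels change per frame interval
bits_per_event = 40             # assumed x, y, timestamp and polarity
event_stream = width * height * active_fraction * fps * bits_per_event
print(f"Sparse events:           {event_stream / 1e9:.2f} Gbit/s")
```

Under those assumptions the sparse event stream is roughly twenty times smaller than the equivalent full-frame stream, which goes some way to explaining the tiny power figure.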

Also this week, the pair continued their collaboration around IBM’s TrueNorth neurosynaptic (neuromorphic) chip, which came out of the DARPA SyNAPSE program and was built on Samsung’s 28nm silicon manufacturing process. The chip’s 5.4bn transistors power 4,096 on-chip cores, which collectively consume only 70mW of power. The work was published in Science, in collaboration with Cornell Tech.
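Simple arithmetic on those published figures shows how frugal each core is; the sketch below only divides the numbers quoted above.

```python
# Arithmetic on the figures quoted above: 70 mW shared across 4,096 cores.
total_power_mw = 70
cores = 4_096
per_core_uw = total_power_mw / cores * 1_000
print(f"~{per_core_uw:.1f} microwatts per core")   # roughly 17 uW per core
```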

Samsung says that the machine-vision platform’s high speed suits 3D mapping applications, including safety systems in cars, as well as fine-detail gesture recognition. It’s a similar area to the one Intel has been exploring with its RealSense technology, which has begun appearing in laptops as a way of recognizing gestures. Samsung has an obvious need for such features in its consumer electronics and smartphones.

On stage, the Samsung team demonstrated how the system could be used to provide gesture control for a TV. The camera could recognize hand waves, finger waves, finger pinches, and a closed fist at around ten feet from the camera. As achievements go, the recognition itself isn’t actually that impressive.

However, the minuscule power requirement is outstanding: the demo system used around 1% of the power of a standard laptop, and around 10% of that of a typical smartphone. The low power draw also means the chips emit less heat while operating, and Samsung believes this will let it group the chips closely together in clusters, scaling performance by adding more chips to an array rather than by adding more cores to a chip, as ARM and x86 designs conventionally do.
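Those percentages are easy to sanity-check against the 300mW figure quoted earlier; the laptop and smartphone draws below are illustrative assumptions rather than measurements from the demo.

```python
# Sanity check of the power comparison. The laptop and smartphone figures
# are assumed typical draws, not measurements from Samsung's demo.
demo_w = 0.3        # 300 mW demo system, per the article
laptop_w = 30.0     # assumed laptop draw under load
phone_w = 3.0       # assumed smartphone draw
print(f"vs laptop:     {demo_w / laptop_w:.0%}")   # ~1%
print(f"vs smartphone: {demo_w / phone_w:.0%}")    # ~10%
```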

IBM’s chips have been deployed at the Lawrence Livermore National Laboratory (LLNL), in a project that will evaluate the safety of the USA’s nuclear arsenal by powering a supercomputing system that simulates the security and deterioration of the stockpile. The chips should help remove any need for underground testing.

IBM has also said that the USA’s Air Force Research Laboratory is investigating using TrueNorth in video analytics to spot anomalies, as well as for detecting network and security intrusions in computer systems, and more mundane tasks like text recognition for archiving. A more headline-grabbing investigation is under way into providing unmanned aerial vehicles (UAVs, or drones) with the ability to make autonomous flight decisions.

In power-constrained applications like these, the new processor architecture has great potential to increase device capabilities, whether in the next generation of Predator UAVs or simply a connected video intercom system.

It’s not just power consumption that limits these deployments. The internet bandwidth needed to send data back to the cloud for analysis can make certain projects non-starters, and introduce crippling latency into others. With storage getting faster and cheaper by the day, being able to hold large reference libraries and databases at the network edge, and to run these kinds of computation on them there, opens up a whole realm of possibilities that simply isn’t available today.
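A quick back-of-envelope calculation shows why. The sensor stream, uplink speed and round-trip time below are illustrative assumptions, but they capture how easily a cloud-only design runs out of bandwidth and picks up latency.

```python
# Illustrative only: why shipping raw sensor data to the cloud can fail.
# The stream size, link speed and round-trip time are assumptions.
sensor_mbit_s = 250      # e.g. the sparse event stream estimated earlier
uplink_mbit_s = 20       # assumed broadband uplink
cloud_rtt_ms = 60        # assumed round trip to a cloud region

coverage = uplink_mbit_s / sensor_mbit_s
print(f"The uplink can carry about {coverage:.0%} of the stream")
print(f"Every cloud round trip adds at least {cloud_rtt_ms} ms of latency")
```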
