
BrainChip debuts development kit for advanced neural networks

The Akida Development Environment is a complete machine-learning framework to create spiking neural networks

By Brian Santo, contributing writer

BrainChip Holdings Ltd. has launched a development kit for creating spiking neural networks (SNNs). It is decidedly not for beginners. SNNs, an approach also described as neuromorphic computing, are a newer type of neural network that can be trained more rapidly than classic neural networks and can keep learning on the fly once deployed.

SNNs can be used for a variety of applications, including public safety, transportation, agricultural productivity, financial security, cybersecurity, and health care. BrainChip’s first two commercial applications are facial recognition and a system designed for casinos. There is also intense interest in using SNNs in data communications network edge equipment, with the goal of making edge systems not only faster but also capable of adapting to network conditions and requirements almost in real time. Intel, Altera, Xilinx, IBM, and Nvidia are among the companies interested.

The short story is that SNN specialist BrainChip is offering its new Akida Development Environment, a machine-learning framework for the creation, training, and testing of SNNs. The framework supports the development of edge and enterprise products targeting the company’s own upcoming neuromorphic system-on-a-chip, the Akida NSoC.


BrainChip declined to say anything more than that the NSoC will be supported by the development kit and that it is due sometime in the third quarter of 2018. That leaves open several questions, including what the NSoC specifically is and what improvement it might represent in the development of SNNs. Developers currently using the kit can do so on x86-based systems, according to the company’s website.

The longer story about why SNNs are interesting involves a quick dive into the history of neural-network development.

The brain has billions of neurons, and during brain activity and any learning process, neurons connect with their neighbors. These connections form pathways, and each learned thing has its own distinct pathway or pattern of pathways.

Classic neural networks try to emulate this by starting with nodes — “neurons” — that are all connected to each other. At first, the connections between any two neurons are weak, but as new data is added and as processed results get fed back in, the strength of the connection between each pair of neurons — referred to as a “weight” — keeps getting adjusted. Distinct processing pathways form as the system learns.
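To make the weight-adjustment idea concrete, here is a textbook sketch (a classic perceptron update rule, not anything specific to BrainChip’s products): a single artificial neuron learns the logical AND function by repeatedly feeding its errors back into its weights.

```python
# Minimal sketch of the classic weight-adjustment idea: a single neuron
# learns logical AND via the perceptron update rule.
def step(x):
    return 1 if x > 0 else 0

# Inputs and target outputs for logical AND.
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # connection weights, initially weak
b = 0.0          # bias
lr = 0.1         # learning rate

for _ in range(20):                 # repeated passes over the data
    for (x1, x2), target in samples:
        out = step(w[0] * x1 + w[1] * x2 + b)
        err = target - out          # feed the result back in...
        w[0] += lr * err * x1       # ...and adjust the weights
        w[1] += lr * err * x2
        b += lr * err

print([step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in samples])
# → [0, 0, 0, 1], matching the AND targets
```

A real network repeats this kind of adjustment across thousands or millions of connections, which is where the training cost discussed below comes from.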

There are several drawbacks to this approach, however. A fully connected network evaluates weights between just about every possible pair of neurons, which sounds complicated and is. You can keep adding more neurons, but the capabilities and performance of a classic neural network do not scale directly with the number of neurons. Also, it often takes huge data sets to train a classic neural network before it can be used, and the training process can be time-consuming.

There needed to be a better way, and of course, there are several, starting with some of the ways that biological brains behave. Biological brains can learn things in a manner that is far simpler than the way that classic neural networks work. Insect brains, for example, have a deceptively simple architecture, and learning appears to be oddly noisy (in engineering terms) and an entirely feed-forward process (see “The key to smarter, faster AI likely found by modeling moth brains”). Similar processes can be observed in the larger, nominally more sophisticated brains of higher-order animals.

That’s not the only difference between biological brains and electronic brains, but it is emblematic in that biological brains and biological learning processes are often conceptually simpler than classic neural networks. More recent generations of neural-network technology attempt to emulate this type of simplicity. Two promising alternatives are convolutional neural networks (CNNs) and SNNs. In terms of commercialization, CNNs are ahead.

CNNs are based on an algorithmic approach that is quite computationally intensive compared with SNNs. CNNs generally require training prior to deployment. Part of the reason is that the training process relies on backpropagation — feeding results back into the system for reinforcement.

SNNs rely on “threshold” logic. Incoming signals accumulate until a neuron hits some preset threshold, and then it fires or “spikes,” after which its accumulated potential goes back to a reset level. It’s a feed-forward process (remember the moth?).
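That accumulate-spike-reset behavior can be sketched in a few lines of Python, the scripting language the Akida kit is built around. This is a deliberate toy simplification of an integrate-and-fire neuron, not the Akida implementation:

```python
# Toy integrate-and-fire neuron illustrating "threshold" logic:
# input accumulates until it crosses a threshold, the neuron spikes,
# and the accumulated potential resets.
def run_spiking_neuron(inputs, threshold=1.0, reset=0.0):
    potential = reset
    spikes = []
    for value in inputs:
        potential += value            # feed-forward accumulation
        if potential >= threshold:    # threshold crossed...
            spikes.append(1)          # ...emit a spike
            potential = reset         # ...and reset
        else:
            spikes.append(0)
    return spikes

print(run_spiking_neuron([0.4, 0.4, 0.4, 0.9, 0.2]))
# → [0, 0, 1, 0, 1]
```

Note there is no backward pass anywhere in the loop, which is the structural difference from the backpropagation-trained networks described above.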

Of course, these are simplified explanations, but they lead directly to the ramifications for datacom networks. CNNs are compute-intensive, which translates directly into power consumption. Compute-intensive processes, therefore, tend to get shunted to data centers deep in the cloud. SNNs are not compute-intensive; being lower-power by nature, they are well-suited to edge computing. The ability to learn on the fly also recommends them for edge computing, BrainChip explained.

And that brings us back to the company’s Akida Development Environment. The company is looking for customers ready to explore innovative applications for an innovative technology.

The kit includes an Akida Execution Engine, data-to-spike converters, and a model zoo of pre-created SNN models. The framework leverages the Python scripting language and its associated tools and libraries.

The first (and, to date, only) data-to-spike converter available with the kit is a pixel-to-spike converter. Obviously, anyone interested in using the kit in the near future will have some sort of visual recognition problem to solve. BrainChip said that it plans to follow with other converters for audio and big data requirements in cybersecurity, financial information, and IoT data. Users are also able to create their own proprietary data-to-spike converters to be used within the development environment.
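BrainChip hasn’t detailed how its pixel-to-spike converter works, but one common scheme in the SNN literature is rate coding, in which brighter pixels fire more often across a window of time steps. The sketch below is a generic, hypothetical illustration of that idea, not BrainChip’s actual converter:

```python
# Hypothetical rate-coding converter: map 0-255 pixel intensities to
# binary spike trains, where brighter pixels produce more spikes.
def pixels_to_spike_trains(pixels, steps=10):
    """Return one spike train of length `steps` per pixel intensity."""
    trains = []
    for p in pixels:
        rate = p / 255                   # normalized firing rate, 0.0-1.0
        n_spikes = round(rate * steps)   # spikes to emit in the window
        train = [1 if i < n_spikes else 0 for i in range(steps)]
        trains.append(train)
    return trains

for train in pixels_to_spike_trains([0, 128, 255], steps=8):
    print(train)
# → [0, 0, 0, 0, 0, 0, 0, 0]
#   [1, 1, 1, 1, 0, 0, 0, 0]
#   [1, 1, 1, 1, 1, 1, 1, 1]
```

A user-written converter along these lines (for audio samples, packet statistics, or sensor readings instead of pixels) is presumably what the "proprietary data-to-spike converters" mentioned above would look like in spirit.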

The development kit is recommended for engineers who already have experience with neural networks. In an email exchange, Bob Beachler, BrainChip’s senior vice president of marketing and business development, told Electronic Products, “If the person is familiar with developing and training artificial neural networks [either convolutional neural networks or spiking neural networks] using TensorFlow, Caffe, PyTorch, Theano, or a similar machine-learning framework, then they would have no problem creating SNNs using the Akida Development Environment.”

“If they are RTL engineers used to designing digital logic, then it would not be intuitive to them,” added Beachler. “They would need to take a class or two to learn how to create/train ANNs and learn Python scripting language.” (ANNs are artificial neural networks. They have conceptual similarities to SNNs, and the two terms are sometimes used interchangeably.)

The Akida Development Environment is currently available through an early-access program to approved customers. The company doesn’t know when it will become generally available and did not provide pricing information. Contact BrainChip for more information or to request the Akida Development Environment.



