By Brian Santo, contributing writer
Researchers at the University of Washington have developed a relatively simple neural network that mimics biological neural systems. The performance of the new model points to the possibility of building AIs that are less complex yet far more efficient at learning because of it. At the same time, the research, published on the arXiv preprint repository, yielded new insight into how living creatures learn, or at least into how some creatures learn some things.
The most common path to emulating the effectiveness of biological neural systems has been to create increasingly complex artificial intelligences with increasingly complicated machine-learning capabilities. Biological systems that outperform AIs sometimes aren't all that complex, however, and living creatures often learn far more quickly, drawing on a handful of experiences where AIs require enormous data sets.
Starting with these observations, UW researchers set out to devise a relatively simple neural-network model that mimics the uncomplicated structure of a moth's neurological system.
The University of Washington has been analyzing insect biology for decades; this research team chose moths because UW labs have already thoroughly mapped their neurological systems. They already knew that moths can learn smells after experiencing them only a few times. Despite that relative simplicity, however, precisely how a moth's neurological system works while it learns remained unclear.
Most neural networks operate on the principle of backpropagation. With this technique, the network's output is compared against the desired result, and the resulting error is fed backward through the system to recalculate the weights between neurons (essentially the strength of the connections between them) on every training pass.
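To make that compare-and-adjust loop concrete, here is a minimal sketch of backpropagation on a toy two-layer network. Every name, layer size, and task in it is invented for illustration; this is not the researchers' code, just the standard technique their model departs from.

```python
import numpy as np

# Toy two-layer network trained by backpropagation (illustrative only).
rng = np.random.default_rng(0)
W1 = rng.normal(0, 0.5, (4, 8))   # input -> hidden weights
W2 = rng.normal(0, 0.5, (8, 1))   # hidden -> output weights

x = rng.normal(0, 1, (16, 4))                          # toy inputs
y = (x.sum(axis=1, keepdims=True) > 0).astype(float)   # toy targets

lr = 0.1
for step in range(1000):
    # Forward pass.
    h = np.tanh(x @ W1)
    out = 1 / (1 + np.exp(-(h @ W2)))   # sigmoid output

    # Backward pass: the output error is propagated back through the
    # network to compute how each weight should change.
    err = out - y                        # error gradient (sigmoid + cross-entropy)
    dW2 = h.T @ err
    dh = (err @ W2.T) * (1 - h**2)       # chain rule through tanh
    dW1 = x.T @ dh

    # Every weight is recalculated on every pass -- the repeated
    # compare-and-adjust loop described above.
    W2 -= lr * dW2 / len(x)
    W1 -= lr * dW1 / len(x)
```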
Biological systems rarely do anything like this. Instead, they are commonly organized as feed-forward cascades.
The beginning of the cascade in hawk moths is a set of about 30,000 chemical receptor neurons (RNs), which feed signals into an antennal lobe (AL). The AL contains roughly 60 isolated clusters of cells (called glomeruli; it pays to enhance your word power!), each of which focuses on a single feature of an odor stimulus. The AL, the researchers say, is inherently noisy; they liken it to a pre-amplifier, “providing gain control and sharpening of odor representations.”
Signals from the AL are forwarded to a structure called the mushroom body (MB). The MB contains roughly 4,000 cells (Kenyon cells) associated with forming memories. Signals then pass through two more ancillary structures, each numbering in the tens of cells, whose function is believed to be reading out the signals from the MB. These sparser structures act as noise filters, the researchers wrote: noise isn't eliminated, but it is reduced enough for effective learning.
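For readers who think in code, the cascade can be sketched as below. The layer sizes (30,000 receptor neurons, about 60 glomeruli, roughly 4,000 Kenyon cells, a readout numbering in the tens) come from the description above; the random wiring, noise level, activation functions, and sparsity threshold are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Layer sizes from the article; everything else is an illustrative guess.
N_RN, N_AL, N_MB, N_READOUT = 30_000, 60, 4_000, 10

# Fixed random connectivity between stages (the real wiring is structured).
W_rn_al = rng.normal(0, 1 / np.sqrt(N_RN), (N_RN, N_AL))
W_al_mb = rng.normal(0, 1 / np.sqrt(N_AL), (N_AL, N_MB))
W_mb_out = rng.normal(0, 1 / np.sqrt(N_MB), (N_MB, N_READOUT))

def forward(odor, k_active=200):
    """One feed-forward pass: no error signal ever flows backward."""
    # AL: noisy "pre-amplifier" stage -- gain control plus added noise.
    al = np.tanh(odor @ W_rn_al) + rng.normal(0, 0.1, N_AL)
    # MB: only the k most strongly driven Kenyon cells fire (sparsity).
    mb = al @ W_al_mb
    thresh = np.partition(mb, -k_active)[-k_active]
    mb = np.where(mb >= thresh, mb, 0.0)
    # Small readout stage interprets the sparse MB code.
    return mb @ W_mb_out

response = forward(rng.normal(0, 1, N_RN))
```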
The process does not work at all without octopamine, described as a neuromodulator. Release of the chemical is triggered by a reward — for example, the moth finding sugar to consume. When a moth finds a reward, the octopamine that is released stimulates enhanced activity in the AL and MB. The practical effect of this enhanced activity is to strengthen the connections between correlated neurons in the moth’s neurological system. The mechanism is called Hebbian learning; the extent to which the strength of neuronal connections can be changed is called Hebbian plasticity.
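In code, a reward-gated Hebbian update can be as simple as the toy rule below. The proportional form (weight change scales with pre-synaptic activity times post-synaptic activity, gated here by an octopamine signal) is the standard Hebbian rule; the specific function, learning rate, and layer sizes are assumptions for illustration, not the paper's exact equations.

```python
import numpy as np

def hebbian_update(W, pre, post, octopamine, lr=0.01):
    """Reward-gated Hebbian update (illustrative, not the paper's exact rule).

    "Neurons that fire together wire together": each weight grows in
    proportion to the product of pre- and post-synaptic activity, but
    only when octopamine (released on reward) is present to gate it.
    """
    return W + lr * octopamine * np.outer(pre, post)

rng = np.random.default_rng(2)
W = rng.normal(0, 0.1, (60, 4_000))   # e.g., AL -> MB connections
pre = rng.random(60)                  # AL activity for one odor
post = rng.random(4_000)              # resulting MB activity

W = hebbian_update(W, pre, post, octopamine=1.0)  # moth found sugar
W = hebbian_update(W, pre, post, octopamine=0.0)  # no reward: no change
```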
The UW researchers built a mathematical model that mimics all of this, and their simulated moths learned quickly from minimal simulated odor inputs. The model's results closely match the behavior observed in actual moths, strongly suggesting that the model is accurate.
If so, that will have ramifications both for biology and for neural networks.
Because the model's behavior was so similar to that of actual biological systems, the researchers expect that they now have a clearer understanding of the mechanisms at work in living creatures. The olfactory/neurological systems of moths are structurally similar to those of many other creatures, they noted.
Their work also suggests a new path to explore for machine learning. “Specifically,” they wrote in their paper, “our experiments elucidate mechanisms for fast learning from noisy data that rely on cascaded networks, sparsity, and Hebbian plasticity.”
Image source: Wikimedia Commons