
Roses are red, are you single, we wonder? ‘Cos this moth-brain AI can read your phone number

Y’think we’re stretching this Valentine’s date thing too far?

The software, dubbed MothNet, can apparently discern handwritten digits with 75 per cent to 85 per cent accuracy, given 15 to 20 training samples of each number. That’s not bad considering it takes thousands of training examples for more traditional neural networks to achieve 99 per cent accuracy.

Its masterminds, Charles B. Delahunt and J. Nathan Kutz, both at the University of Washington in the US, built MothNet by modeling the olfactory network – the part of the brain that processes smells – found in the Carolina sphinx moth, also known as the tobacco hawk moth (Manduca sexta). That section of a bug’s grey matter is relatively straightforward, we’re told, making it ideal for experimentation.

“The moth olfactory network is among the simplest biological neural systems that can learn,” the pair’s paper describing their work stated.


MothNet’s computer code, according to the boffins, contains layers of artificial neurons to simulate the bug’s antenna lobe and mushroom body, which are common parts of insect brains.

Crucially, instead of recognizing smells, the duo taught MothNet to identify handwritten digits in the MNIST dataset. This database is often used to train and test pattern recognition in computer vision applications.

The academics used supervised learning to train MothNet, feeding it about 15 to 20 images of each digit from zero to nine, and rewarding it when it recognized the numbers correctly.
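To make that concrete, here’s a minimal sketch of how such a tiny training set might be carved out of MNIST, using Keras’s bundled loader purely as a convenience. The 15-per-digit figure comes from the paper; everything else is illustrative and not the researchers’ actual code.

```python
# Sketch: carve a ~15-samples-per-class training set out of MNIST,
# mirroring the small training budget described in the article.
import numpy as np
from tensorflow.keras.datasets import mnist

(x_train, y_train), (x_test, y_test) = mnist.load_data()

SAMPLES_PER_DIGIT = 15  # the paper cites 15 to 20 examples per digit
rng = np.random.default_rng(0)

few_x, few_y = [], []
for digit in range(10):
    idx = np.flatnonzero(y_train == digit)          # all training images of this digit
    chosen = rng.choice(idx, SAMPLES_PER_DIGIT, replace=False)
    few_x.append(x_train[chosen])
    few_y.append(y_train[chosen])

few_x = np.concatenate(few_x) / 255.0   # normalise pixel values to [0, 1]
few_y = np.concatenate(few_y)
print(few_x.shape)  # (150, 28, 28)
```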

Receptor neurons in the artificial brain processed the incoming images, and passed the information down to the antenna lobe, which learned the features of each number. This lobe was connected, by a set of projection neurons, to the sparse mushroom body. This section was wired up to extrinsic neurons, each ultimately representing an individual integer between zero and nine.
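That plumbing maps fairly naturally onto a toy feed-forward pass. The sketch below is our own illustration, not MothNet itself: the layer sizes, sparsity level, and random weights are guesses, with only the ten readout neurons and the rough receptor → antenna lobe → projection → mushroom body → extrinsic ordering taken from the description above.

```python
# Illustrative signal path: pixels -> receptor neurons -> antenna lobe ->
# projection neurons -> sparse mushroom body -> 10 extrinsic (readout) neurons.
import numpy as np

rng = np.random.default_rng(1)

N_PIXELS  = 28 * 28   # flattened MNIST image
N_AL      = 60        # antenna-lobe units; loosely echoes the "60 processing units"
                      # Delahunt mentions, though its placement here is our guess
N_MB      = 2000      # mushroom-body neurons: a large, sparsely active layer
N_READOUT = 10        # one extrinsic neuron per digit

W_receptor = rng.normal(size=(N_AL, N_PIXELS)) * 0.01   # receptors -> antenna lobe
W_project  = (rng.random((N_MB, N_AL)) < 0.15) * 1.0    # sparse projection neurons -> mushroom body
W_readout  = np.ones((N_READOUT, N_MB)) * 0.01          # mushroom body starts fully connected to readouts

def forward(image):
    """Propagate one 28x28 image through the toy moth-style network."""
    al = np.tanh(W_receptor @ image.ravel())     # antenna lobe accentuates contrasts
    mb = np.maximum(W_project @ al, 0.0)         # mushroom body: rectified activity
    threshold = np.quantile(mb, 0.95)            # keep only the top ~5% most active units (sparse coding)
    mb = np.where(mb >= threshold, mb, 0.0)
    return W_readout @ mb                        # extrinsic neuron activations

scores = forward(rng.random((28, 28)))
print(scores.shape)  # (10,)
```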

There are “60 processing units” that are mapped to each pixel in the image, Delahunt explained to The Register.

“The antenna lobe serves to accentuate distinctions between digits, as it does for odors,” he added. “Each mushroom body neuron starts out connected to each readout neuron. As training progresses, some of these connections – the active ones for the assigned digit – strengthen and the rest wither away.”
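That strengthen-and-wither behaviour reads like a reward-gated, Hebbian-style update with pruning. The snippet below is a rough interpretation of the quote, not the authors’ actual learning rule; the learning rate, decay, and pruning threshold are made up.

```python
# Sketch of the strengthen-and-wither idea: reward-gated, Hebbian-style update
# on the mushroom-body -> readout connections, followed by pruning.
import numpy as np

def hebbian_update(W_readout, mb_activity, label, lr=0.05, decay=0.01):
    """Strengthen readout connections co-active with the correct digit,
    let everything else slowly wither, and prune the weakest links."""
    W = W_readout.copy()
    W[label] += lr * mb_activity   # reward: potentiate active synapses onto the right readout neuron
    W -= decay * W                 # all other connections gradually wither
    W[W < 1e-4] = 0.0              # prune connections that have withered away
    return W

# Usage (illustrative): W_readout = hebbian_update(W_readout, mb_activity, label=3)
```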

This withering leaves each extrinsic neuron representing one individual digit. After training, showing MothNet a hand-scribbled number from the MNIST dataset it hadn’t seen before made it light up the corresponding extrinsic neuron for that digit. Well, most of the time.
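In code terms, that “lighting up” amounts to taking the argmax over the ten extrinsic-neuron activations – again an illustrative sketch rather than the published implementation.

```python
# Sketch: classify an unseen digit by whichever extrinsic neuron fires hardest.
import numpy as np

def classify(image, forward_fn):
    """Return the digit whose extrinsic (readout) neuron responds most strongly."""
    scores = forward_fn(image)      # ten readout activations, one per digit
    return int(np.argmax(scores))

# e.g. prediction = classify(x_test[0] / 255.0, forward)  # using the sketches above
```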

Half mechanical brain

MothNet achieved 75 per cent to 85 per cent accuracy, the paper stated, despite relatively few training examples, seemingly outperforming more traditional neural networks when given the same amount of training data.

Convolutional neural networks typically need thousands of examples per digit from the MNIST handwriting dataset to reach near-perfect accuracy.
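For a sense of the comparison point, here is the sort of small convolutional network one might pit against the same 150-odd images – a hypothetical Keras baseline, not anything from the paper; with so little data it would normally fall well short of the near-perfect accuracy CNNs reach on the full dataset.

```python
# Sketch of a small CNN baseline for the same tiny training set.
# Architecture and hyperparameters are illustrative only.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# e.g. model.fit(few_x[..., None], few_y, epochs=20)  # ~150 images rarely gets near 99%
```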

“The results demonstrate that even very simple biological architectures hold novel and effective algorithmic tools applicable to [machine learning] tasks, in particular tasks constrained by few training samples or the need to add new classes without full retraining,” the duo’s paper declared.

It shows that even the simple biological neural network of an insect brain can be taught basic image-recognition tasks, and can potentially exceed other models when training examples and processing resources are scarce. The researchers believe that these biological neural networks (BNNs) can be “combined and stacked into larger, deeper neural nets.”

“The success of live BNNs at a wide range of tasks argues for the potential of [neural networks] built with a biological toolkit to succeed at [machine learning] tasks,” the eggheads concluded.

The paper, which emerged this week, was submitted to the workshop track of the International Conference on Learning Representations, due to take place in Vancouver, Canada, at the end of April.

Inspired

Neural networks are modeled very loosely on how the brain is connected, but how comparable artificial minds are to biological grey matter is open to debate.

“Neural networks were originally inspired by the brain,” Delahunt told us. “Then as the techniques developed, deep neural networks became their own thing, just as a car is sort of inspired by a horse but is totally different. In one sense, our work is an effort to drink afresh at the well of bio-inspiration.

“Whether these techniques will stay somewhat close to their biological origins is unclear. It is certainly the case that biological brains have many, many more architectures and tricks than are found in the moth. So carefully analyzing other BNNs will yield novel insights that can be applied to machine learning.”

So far, this technique of borrowing an insect’s brain wiring is still undergoing scrutiny – it’s rather fun, but it may prove no more efficient than a well-optimized convolutional neural network. ®