A new study from MIT neuroscientists has found that one of the latest generation of so-called “deep neural networks” matches the object recognition ability of the primate brain. This improved understanding of how the primate brain works could lead to better artificial intelligence and, someday, new ways to repair visual dysfunction, notes Charles Cadieu, a postdoc at the McGovern Institute and the paper’s lead author.
For this study, the researchers first measured the brain’s object recognition ability. Led by graduate students Hong and Majaj, the team implanted arrays of electrodes in the IT cortex of macaque monkeys, as well as in area V4, a part of the visual system that feeds into the IT cortex. This allowed the researchers to see the population of neurons that responded to each object the animals looked at. Each image activated a different population of neurons in the deep neural network and in the macaque brain. “Through each of these computational transformations, through each of these layers of networks, certain objects or images get closer together, while others get further apart,” Cadieu says. The accuracy of each model was then judged by whether it grouped similar objects into clusters resembling those found in the brain’s neural populations.
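The clustering comparison described above can be sketched in code. The paper does not publish its analysis pipeline here, so the following is only a minimal illustration of the general idea: represent each image as a vector of responses (model units or recorded neurons), compute pairwise distances between images in each representation, and check whether the model places the same images close together and far apart as the brain does. All names and the toy data below are hypothetical.

```python
import numpy as np

def rdm(features):
    """Pairwise Euclidean distances between the response vectors
    for each image (rows = images, columns = units/neurons)."""
    diffs = features[:, None, :] - features[None, :, :]
    return np.sqrt((diffs ** 2).sum(axis=-1))

def representational_similarity(model_features, neural_features):
    """Correlate the upper triangles of the two distance matrices.
    A high value means the model and the brain agree on which
    images lie close together and which lie far apart."""
    iu = np.triu_indices(len(model_features), k=1)
    return np.corrcoef(rdm(model_features)[iu],
                       rdm(neural_features)[iu])[0, 1]

# Toy data: 8 "images" sharing a 5-dimensional latent object identity,
# projected into a 100-unit model layer and a 50-neuron recording.
rng = np.random.default_rng(0)
latent = rng.normal(size=(8, 5))
model = latent @ rng.normal(size=(5, 100))
neural = latent @ rng.normal(size=(5, 50))

print(representational_similarity(model, neural))
```

Because both toy representations are random projections of the same latent structure, their distance patterns correlate strongly; a model with no relation to the neural data would score near zero.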
“The fact that the models predict the neural responses and the distances of objects in neural population space shows that these models encapsulate our current best understanding as to what is going on in this previously mysterious portion of the brain,” says James DiCarlo, a professor of neuroscience and head of MIT’s Department of Brain and Cognitive Sciences and the senior author of a paper describing the study in the Dec. 18 issue of the journal PLoS Computational Biology.
As reported on TechEnablement, researchers have found that deep neural networks can recognize faces better than people. See “GaussianFace: Computers Claimed to Beat Humans in Recognizing Faces”. IBM has also been making news with its SyNAPSE chip, a hardware neural network, or “bee brain on a chip,” that consumes just 70 mW.