Quanta Magazine
The two processes are not exactly the same, however. When a deep neural network is trained to recognize an image, it proceeds in two stages: forward propagation first and then backpropagation, when the “learning” occurs. During the first stage, neurons in the input layer encode features of the image and pass them on. Then neurons in the hidden layers perform computations and send their results up to the output layer, which spits out its prediction of the image, like “cat.” But if the image was actually of a dog, then it’s up to the backpropagation algorithm to come in and fix what went wrong by adjusting the weights that connect neurons.
These changes are based on calculating how every neuron could contribute less to the overall error, starting with the neurons at the top, nearest the output layer, and then moving backward through each layer. If the backpropagation algorithm estimates that increasing a given neuron’s activity will improve the output prediction, for example, then that neuron’s weights will increase. The goal is to change all the connections in the neural network — each one a tiny bit in the right direction — until the output predictions are correct more often.
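In code, that forward-then-backward cycle looks roughly like the sketch below. The tiny layer sizes, sigmoid activations, learning rate and stand-in “image features” are illustrative assumptions, not details of any particular network discussed in the article.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny network: 4 input features -> 3 hidden neurons -> 1 output neuron
W1 = rng.normal(scale=0.5, size=(4, 3))
W2 = rng.normal(scale=0.5, size=(3, 1))
lr = 0.1

x = rng.random(4)         # stand-in for the image features
target = np.array([1.0])  # stand-in for the correct label ("cat" = 1)

for step in range(1000):
    # Forward propagation: activity flows from the input layer upward.
    h = sigmoid(x @ W1)           # hidden-layer activity
    y = sigmoid(h @ W2)           # output prediction

    # Backpropagation: start from the error at the output and work backward,
    # estimating how much each neuron contributed to it.
    err = y - target
    delta_out = err * y * (1 - y)                  # output-layer "blame"
    delta_hid = (delta_out @ W2.T) * h * (1 - h)   # hidden-layer "blame"

    # Nudge every connection a tiny bit in the direction that reduces error.
    W2 -= lr * np.outer(h, delta_out)
    W1 -= lr * np.outer(x, delta_hid)
```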
For decades, researchers had tried to figure out how the brain might perform something like backpropagation to solve the credit assignment problem. Backpropagation itself is not biologically plausible because, among other things, real neurons can’t just stop processing the external world and wait for backpropagation to begin — if they did, we’d end up with lapses in our vision or hearing.
Naud and Richards’ new model got around this with a simple change in the canonical understanding of how neurons communicate with each other. We’ve long known that neurons act as bits, capable of only two outputs, either sending a spike of electrical activity to another neuron or not sending it — either a 1 or a 0. But it’s also true that neurons can send a “burst” of spikes in quick succession. And doing so has been shown to change the connections between neurons, making bursts a natural candidate for solving the credit assignment problem. In the new model, the team considered neuron bursts a third output signal, a stream of 1s so close together it effectively becomes a 2. Rather than encoding anything about the external world, the 2 acts as a “teaching signal” to tell other neurons whether to strengthen or weaken their connections to each other, based on the error accrued at the top of the circuit.
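As a toy illustration of that three-valued code, the snippet below reads a neuron’s output within a short time window as 0 (silence), 1 (an isolated spike) or 2 (a burst). The 16-millisecond threshold for “quick succession” and the example spike trains are assumptions chosen purely for illustration.

```python
def read_output(spike_times_ms, burst_gap_ms=16.0):
    """Map a list of spike times (in ms) to the 0 / 1 / 2 code."""
    if len(spike_times_ms) == 0:
        return 0                                    # silence
    gaps = [b - a for a, b in zip(spike_times_ms, spike_times_ms[1:])]
    if any(gap <= burst_gap_ms for gap in gaps):
        return 2                                    # burst: the "teaching signal"
    return 1                                        # ordinary single spiking

print(read_output([]))                  # -> 0
print(read_output([12.0]))              # -> 1
print(read_output([12.0, 17.5, 21.0]))  # -> 2
```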
But for this teaching signal to solve the credit assignment problem without hitting “pause” on sensory processing, their model required another key piece. Naud and Richards’ team proposed that neurons have separate compartments at their top and bottom that process the neural code in completely different ways.
“[Our model] shows that you really can have two signals, one going up and one going down, and they can pass one another,” said Naud.
To make this possible, their model posits that treelike branches receiving inputs on the tops of neurons are listening only for bursts — the internal teaching signal — in order to tune their connections and decrease error. The tuning happens from the top down, just like in backpropagation, because in their model, the neurons at the top are regulating the likelihood that the neurons below them will send a burst. The researchers showed that when a network has more bursts, neurons tend to increase the strength of their connections, whereas the strength of the connections tends to decrease when burst signals are less frequent. The idea is that the burst signal tells neurons that they should be active during the task, strengthening their connections, if doing so decreases the error. An absence of bursts tells neurons that they should be inactive and may need to weaken their connections.
At the same time, the branches on the bottom of the neuron treat bursts as if they were single spikes — the normal, external world signal — which allows them to continue sending sensory information upward in the circuit without interruption.
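A cartoon of that division of labor, written under the assumption of a simple running-average expectation for bursts rather than the paper’s actual equations, might look like this:

```python
# Not the paper's learning rule, just a sketch of the idea: the "top"
# compartment nudges a connection up when bursts arrive more often than
# expected and down when they arrive less often, while the "bottom"
# compartment counts every event as an ordinary spike and keeps relaying
# sensory activity. The learning rate and averaging constant are illustrative.

def top_compartment_update(weight, burst, expected_burst_rate, lr=0.05):
    """Strengthen when bursts exceed expectation, weaken when they fall short."""
    return weight + lr * (float(burst) - expected_burst_rate)

def bottom_compartment_drive(event):
    """Treat 1 (spike) and 2 (burst) alike: a spike either happened or it didn't."""
    return 1.0 if event > 0 else 0.0

weight, expected = 0.5, 0.2
for event in [2, 1, 0, 2, 2, 1]:          # stream of outputs from an upstream neuron
    burst = (event == 2)
    weight = top_compartment_update(weight, burst, expected)
    expected = 0.9 * expected + 0.1 * float(burst)   # slow running average of burst rate
    drive = bottom_compartment_drive(event)          # sensory signal keeps flowing
    print(f"event={event}  drive={drive}  weight={weight:.3f}")
```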
“In retrospect, the idea presented seems logical, and I think that this speaks for the beauty of it,” said João Sacramento, a computational neuroscientist at the University of Zurich and ETH Zurich. “I think that’s brilliant.”
Others had tried to follow a similar logic in the past. Twenty years ago, Konrad Kording of the University of Pennsylvania and Peter König of Osnabrück University in Germany proposed a learning framework with two-compartment neurons. But their proposal lacked many of the specific details in the newer model that are biologically relevant, and it was only a proposal — they couldn’t prove that it could actually solve the credit assignment problem.
“Back then, we simply lacked the ability to test these ideas,” Kording said. He considers the new paper “tremendous work” and will be following up on it in his own lab.
With today’s computational power, Naud, Richards and their collaborators successfully simulated their model, with bursting neurons playing the role of the learning rule. They showed that it solves the credit assignment problem in a classic task known as XOR, which requires learning to respond when one of two inputs (but not both) is 1. They also showed that a deep neural network built with their bursting rule could approximate the performance of the backpropagation algorithm on challenging image classification tasks. But there’s still room for improvement, as the backpropagation algorithm was still more accurate, and neither fully matches human capabilities.
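For reference, the XOR task asks a network to output 1 exactly when one of its two binary inputs is 1. No single layer of weights can do this, which is why it is a classic test of whether credit can be assigned to hidden-layer neurons. The hand-wired miniature network below only shows what a solution looks like; it is not the team’s burst-based simulation.

```python
def step(x, threshold):
    return 1 if x >= threshold else 0

def xor_net(a, b):
    h_or  = step(a + b, 1)        # hidden unit 1: fires if either input is on
    h_and = step(a + b, 2)        # hidden unit 2: fires only if both are on
    return step(h_or - h_and, 1)  # output: "or" but not "and" -> exclusive or

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(f"XOR({a}, {b}) = {xor_net(a, b)}")
```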
“There’s got to be details that we don’t have, and we have to make the model better,” said Naud. “The main goal of the paper is to say that the sort of learning that machines are doing can be approximated by physiological processes.”
AI researchers are also excited, since figuring out how the brain approximates backpropagation could ultimately improve how AI systems learn, too. “If we understand it, then this may eventually lead to systems that can solve computational problems as efficiently as the brain does,” said Marcel van Gerven, chair of the artificial intelligence department at the Donders Institute at Radboud University in the Netherlands.
The new model suggests the partnership between neuroscience and AI could also move beyond our understanding of each one alone and instead find the general principles that are necessary for brains and machines to be able to learn anything at all.
“These are principles that, in the end, transcend the wetware,” said Larkum.
https://www.quantamagazine.org/brain-bursts-can-mimic-famous-ai-learning-strategy-20211018/