Champalimaud Foundation Reports Findings in Neural Computation (Approximating Nonlinear Functions With Latent Boundaries in Low-Rank Excitatory-Inhibitory Spiking Networks)

By a News Reporter-Staff News Editor at Robotics & Machine Learning Daily News – New research on Computation - Neural Computation is the subject of a report. According to news originating from Lisbon, Portugal, by NewsRx correspondents, research stated, “Deep feedforward and recurrent neural networks have become successful functional models of the brain, but they neglect obvious biological details such as spikes and Dale’s law. Here we argue that these details are crucial in order to understand how real neural circuits operate.”

Our news journalists obtained a quote from the research from Champalimaud Foundation, “Towards this aim, we put forth a new framework for spike-based computation in low-rank excitatory-inhibitory spiking networks. By considering populations with rank-1 connectivity, we cast each neuron’s spiking threshold as a boundary in a low-dimensional input-output space. We then show how the combined thresholds of a population of inhibitory neurons form a stable boundary in this space, and those of a population of excitatory neurons form an unstable boundary. Combining the two boundaries results in a rank-2 excitatory-inhibitory (EI) network with inhibition-stabilized dynamics at the intersection of the two boundaries. The computation of the resulting networks can be understood as the difference of two convex functions and is thereby capable of approximating arbitrary non-linear input-output mappings. We demonstrate several properties of these networks, including noise suppression and amplification, irregular activity and synaptic balance, as well as how they relate to rate network dynamics in the limit that the boundary becomes soft. Finally, while our work focuses on small networks (5-50 neurons), we discuss potential avenues for scaling up to much larger networks.”
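The quoted passage compresses the core construction: each neuron's spiking threshold is a hyperplane in a low-dimensional latent space, the combined thresholds of one population trace out a convex piecewise-linear boundary, and the difference between an inhibitory (stable) and an excitatory (unstable) boundary can approximate an arbitrary nonlinear input-output mapping. The minimal Python sketch below illustrates that difference-of-convex idea only; it is not the authors' code, and the target function, the curvature constant c, and the tangent-line placement are illustrative assumptions.

```python
import numpy as np

def convex_boundary(x, slopes, offsets):
    """Max over linear pieces: the piecewise-linear boundary traced out
    by the combined thresholds of one population (one line per neuron)."""
    # x: (n_points,); slopes, offsets: (n_neurons,)
    return np.max(slopes[None, :] * x[:, None] + offsets[None, :], axis=1)

def tangent_pieces(f, df, anchors):
    """Tangent lines to a convex function at the anchor points;
    their pointwise max recovers the function from below."""
    slopes = df(anchors)
    offsets = f(anchors) - slopes * anchors
    return slopes, offsets

# Target nonlinear input-output mapping (an arbitrary illustrative choice).
x = np.linspace(-2.0, 2.0, 400)
target = np.sin(2 * x) * np.exp(-x**2 / 2)

# Difference-of-convex split: target = g - h, with g = target + c*x^2 and
# h = c*x^2, where c is chosen large enough to make both parts convex.
c = 3.0

def g(t):
    return np.sin(2 * t) * np.exp(-t**2 / 2) + c * t**2

def dg(t):
    return (2 * np.cos(2 * t) - t * np.sin(2 * t)) * np.exp(-t**2 / 2) + 2 * c * t

def h(t):
    return c * t**2

def dh(t):
    return 2 * c * t

# One linear piece ("threshold") per neuron, 12 neurons per population.
anchors = np.linspace(-2.0, 2.0, 12)

# One convex boundary per population; the network output is their difference.
boundary_I = convex_boundary(x, *tangent_pieces(g, dg, anchors))
boundary_E = convex_boundary(x, *tangent_pieces(h, dh, anchors))
approx = boundary_I - boundary_E

print("max |error|:", np.max(np.abs(approx - target)))
```

In the networks described in the paper, the linear pieces arise from rank-1 synaptic weights and spiking dynamics rather than from explicit tangent fitting as above; the sketch only shows why subtracting two convex piecewise-linear boundaries yields a flexible nonlinear approximator.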

Lisbon, Portugal, Europe, Computation, Neural Computation

2024

Robotics & Machine Learning Daily News

ISSN:
Year, Volume (Issue): 2024 (May 7)