Improving learning efficiency of recurrent neural network through adjusting weights of all layers in a biologically-inspired framework

Xiao Huang, Wei Wu, Peijie Yin, Hong Qiao

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

3 Citations (Scopus)

Abstract

Brain-inspired models have become a focus of the artificial intelligence field. As a biologically plausible network, the recurrent neural network in the reservoir computing framework has been proposed as a popular model of cortical computation because of its complicated dynamics and highly recurrent connections. Unlike methods that adjust only the readout weights, as in liquid computing theory, or only the internal recurrent weights, we take inspiration from the global modulation that human emotions exert on cognition and motion control, and introduce a novel reward-modulated Hebbian learning rule that trains the network by adjusting not only the internal recurrent weights but also the input weights and readout weights together, using solely delayed, phasic rewards. Experimental results show that the proposed method can train a recurrent neural network in the near-chaotic regime to complete motion-control and working-memory tasks with higher accuracy and learning efficiency.
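The training scheme described in the abstract — a reward-modulated Hebbian rule that updates input, recurrent, and readout weights from a single delayed scalar reward — can be sketched roughly as follows. This is a minimal NumPy toy, not the authors' exact rule: the network sizes, leak rate, noise scale, reward baseline, and the toy sine-wave task are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and hyperparameters (assumptions, not from the paper)
n_in, n_rec, n_out = 2, 80, 1
g = 1.5          # recurrent gain > 1 puts the reservoir near the chaotic regime
dt = 0.1         # leak rate of the rate units

W_in  = rng.normal(0.0, 1.0 / np.sqrt(n_in),  (n_rec, n_in))
W_rec = rng.normal(0.0, g   / np.sqrt(n_rec), (n_rec, n_rec))
W_out = rng.normal(0.0, 1.0 / np.sqrt(n_rec), (n_out, n_rec))

reward_baseline = [0.0]  # running-average reward, used for variance reduction

def trial(u_seq, y_target, lr=5e-4, sigma=0.05):
    """One trial: run the network with exploratory noise, accumulate Hebbian
    eligibility traces on all three weight matrices, then apply a single
    delayed, phasic scalar reward at the end of the trial."""
    global W_in, W_rec, W_out
    x = np.zeros(n_rec)
    e_in, e_rec, e_out = (np.zeros_like(W) for W in (W_in, W_rec, W_out))
    sq_err = 0.0
    for t, u in enumerate(u_seq):
        r = np.tanh(x)
        xi = sigma * rng.normal(size=n_rec)      # reservoir exploration noise
        x = (1 - dt) * x + dt * (W_rec @ r + W_in @ u) + xi
        zeta = sigma * rng.normal(size=n_out)    # readout exploration noise
        y = W_out @ np.tanh(x) + zeta
        # Hebbian eligibility: perturbation (postsynaptic) x presynaptic activity
        e_in  += np.outer(xi, u)
        e_rec += np.outer(xi, r)
        e_out += np.outer(zeta, np.tanh(x))
        sq_err += float(np.sum((y - y_target[t]) ** 2))
    reward = -sq_err / len(u_seq)                # delayed scalar reward only
    advantage = reward - reward_baseline[0]      # compare to running baseline
    reward_baseline[0] += 0.1 * (reward - reward_baseline[0])
    W_in  += lr * advantage * e_in               # all three layers are adjusted
    W_rec += lr * advantage * e_rec
    W_out += lr * advantage * e_out
    return -reward                               # mean squared error of the trial

# Toy task (an assumption): map a constant 2-D input to one period of a sine
T = 50
u_seq = np.tile([0.5, -0.3], (T, 1))
y_target = 0.5 * np.sin(np.linspace(0.0, 2.0 * np.pi, T))
errors = [trial(u_seq, y_target) for _ in range(200)]
```

The key design point mirrored from the abstract is that no per-timestep error signal reaches the weights: each matrix only accumulates a noise-times-activity eligibility trace during the trial, and a single phasic reward at trial end gates all three updates.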

Original language: English
Title of host publication: 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 873-879
Number of pages: 7
ISBN (Electronic): 9781509061815
DOI: 10.1109/IJCNN.2017.7965944
Publication status: Published - 30 Jun 2017
Externally published: Yes
Event: 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Anchorage, United States
Duration: 14 May 2017 - 19 May 2017

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Volume: 2017-May

Conference

Conference: 2017 International Joint Conference on Neural Networks, IJCNN 2017
Country/Territory: United States
City: Anchorage
Period: 14/05/17 - 19/05/17


Cite this

Huang, X., Wu, W., Yin, P., & Qiao, H. (2017). Improving learning efficiency of recurrent neural network through adjusting weights of all layers in a biologically-inspired framework. In 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Proceedings (pp. 873-879). Article 7965944 (Proceedings of the International Joint Conference on Neural Networks; Vol. 2017-May). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/IJCNN.2017.7965944