In-sensor reservoir computing for language learning via two-dimensional memristors

Linfeng Sun, Zhongrui Wang, Jinbao Jiang, Yeji Kim, Bomin Joo, Shoujun Zheng, Seungyeon Lee, Woo Jong Yu, Bai Sun Kong, Heejun Yang*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

216 Citations (Scopus)

Abstract

The dynamic processing of optoelectronic signals carrying temporal and sequential information is critical to various machine learning applications, including language processing and computer vision. Despite extensive efforts to emulate the visual cortex of the human brain, physically separated sensing, memory, and processing units incur large energy/time overhead and extra hardware costs. The challenge is further intensified by the tedious training of conventional recurrent neural networks for edge deployment. Here, we report in-sensor reservoir computing for language learning. High dimensionality, nonlinearity, and fading memory for the in-sensor reservoir were achieved via two-dimensional memristors based on tin sulfide (SnS), which uniquely has dual-type defect states associated with Sn and S vacancies. Our in-sensor reservoir computing achieves 91% accuracy in classifying short sentences, offering a low-training-cost, real-time solution for processing temporal and sequential signals in machine learning applications at the edge.
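The reservoir-computing scheme the abstract describes can be illustrated in software with a conventional echo-state-style reservoir: a fixed random recurrent map provides high dimensionality, a tanh activation provides nonlinearity, a contracting recurrence provides fading memory, and only a linear readout is trained, which is why training cost is low. The sketch below is a generic software analogy, not the memristor hardware of the paper; all dimensions, the toy "sentence" data, and the ridge-regression readout are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not from the paper): 4-word vocabulary, 100 reservoir nodes
n_in, n_res = 4, 100

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # spectral radius < 1 -> fading memory

def reservoir_state(seq, leak=0.3):
    """Map an input sequence to its final high-dimensional reservoir state."""
    x = np.zeros(n_res)
    for u in seq:
        # Leaky nonlinear update: older inputs decay (fading memory)
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
    return x

def encode(words):
    """Toy 'sentence': a sequence of one-hot word vectors."""
    return [np.eye(n_in)[w] for w in words]

# Two hypothetical sentence classes distinguished by word order
class_a = [encode([0, 1, 2]), encode([0, 1, 3]), encode([0, 2, 3])]
class_b = [encode([3, 2, 1]), encode([3, 2, 0]), encode([3, 1, 0])]

X = np.array([reservoir_state(s) for s in class_a + class_b])
y = np.array([0] * 3 + [1] * 3)

# Train only the linear readout via ridge regression -- the low-cost step
T = np.eye(2)[y]                                    # one-hot targets
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ T)

pred = np.argmax(X @ W_out, axis=1)
print("training accuracy:", np.mean(pred == y))
```

In the paper, the fixed nonlinear reservoir is realized physically by the SnS memristors' volatile dynamics, so only the final readout layer needs to be trained off-device.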

Original language: English
Article number: eabg1455
Journal: Science Advances
Volume: 7
Issue number: 20
Publication status: Published - May 2021
