神经网络结构搜索在脑数据分析领域的研究进展

Translated title of the contribution: Survey on Neural Architecture Search for Brain Data Analysis

Qing Li, Qi Xin Wang, Zi Yu Li, Zhi Yuan Zhu, Shi Hao Zhang, Hao Nan Mou, Wen Ting Yang, Xia Wu*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Neural architecture search (NAS) is an important part of automated machine learning and has been widely used in multiple fields, including computer vision and speech recognition. NAS can search for the optimal deep neural network structure for specific data, scenarios, and tasks. In recent years, NAS has been increasingly applied to brain data analysis, significantly improving performance in multiple application areas, such as brain image segmentation, feature extraction, and auxiliary diagnosis of brain diseases. Such research has demonstrated the advantages of low-energy automated machine learning in the field of brain data analysis. NAS-based brain data analysis is one of the current research hotspots, but it still faces certain challenges. At present, few review articles in this field are available for reference worldwide. This study conducts a detailed survey and analysis of the relevant literature from different perspectives, including search frameworks, search spaces, search strategies, research tasks, and experimental data. A systematic summary of brain datasets that can be used for NAS training is also provided. In addition, the challenges and future research directions of NAS in brain data analysis are discussed.
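To make the search space / search strategy / performance-estimation split mentioned above concrete, the following is a minimal illustrative sketch only (not taken from the article): a toy random-search NAS loop in Python. The search-space entries and the evaluate() scoring are hypothetical stand-ins; a real pipeline would train each candidate network on brain-imaging data and use its validation score.

import random

# Search space: each architecture is a choice of depth, width, and operation
# (hypothetical options, for illustration only).
SEARCH_SPACE = {
    "num_layers": [2, 4, 6],
    "hidden_units": [32, 64, 128],
    "op": ["conv3x3", "conv5x5", "dilated_conv"],
}

def sample_architecture(space):
    """Search strategy (here: uniform random sampling) draws one candidate."""
    return {key: random.choice(values) for key, values in space.items()}

def evaluate(arch):
    """Performance estimation stub: returns a placeholder proxy score.

    In practice this would train (or partially train) the candidate network
    and report its validation accuracy on the target brain-data task.
    """
    return random.random()

def random_search(space, budget=20):
    """Keep the best-scoring architecture found within the search budget."""
    best_arch, best_score = None, float("-inf")
    for _ in range(budget):
        arch = sample_architecture(space)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

if __name__ == "__main__":
    arch, score = random_search(SEARCH_SPACE)
    print("best architecture:", arch, "proxy score:", round(score, 3))

Surveyed NAS methods differ mainly in how sample_architecture and evaluate are realized, for example gradient-based or evolutionary search strategies and weight-sharing or proxy-task evaluation instead of full training.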

Translated title of the contribution: Survey on Neural Architecture Search for Brain Data Analysis
Original language: Chinese (Traditional)
Pages (from-to): 1682-1702
Number of pages: 21
Journal: Ruan Jian Xue Bao/Journal of Software
Volume: 35
Issue number: 4
DOIs
Publication status: Published - Apr 2024
Externally published: Yes
