Towards human-like and transhuman perception in AI 2.0: a review

Yong-hong Tian, Xi-lin Chen, Hong-kai Xiong, Hong-liang Li, Li-rong Dai, Jing Chen, Jun-liang Xing, Jing Chen, Xi-hong Wu, Wei-min Hu, Yu Hu, Tie-jun Huang*, Wen Gao

*Corresponding author for this work

Research output: Contribution to journal › Review article › Peer-review

32 Citations (Scopus)

Abstract

Perception is the interaction interface between an intelligent system and the real world. Without sophisticated and flexible perceptual capabilities, it is impossible to create advanced artificial intelligence (AI) systems. One of the most significant features of the next-generation AI, called ‘AI 2.0’, will be that it is empowered with intelligent perceptual capabilities that can simulate the mechanisms of the human brain and are likely to surpass the human brain in terms of performance. In this paper, we briefly review state-of-the-art advances across different areas of perception, including visual perception, auditory perception, speech perception, and perceptual information processing and learning engines. On this basis, we envision several R&D trends in intelligent perception for the forthcoming era of AI 2.0, including: (1) human-like and transhuman active vision; (2) auditory perception and computation in actual auditory settings; (3) speech perception and computation in natural interaction settings; (4) autonomous learning of perceptual information; (5) large-scale perceptual information processing and learning platforms; and (6) urban omnidirectional intelligent perception and reasoning engines. We believe these research directions should be highlighted in future plans for AI 2.0.

Original language: English
Pages (from-to): 58-67
Number of pages: 10
Journal: Frontiers of Information Technology and Electronic Engineering
Volume: 18
Issue: 1
DOI
Publication status: Published - 1 Jan 2017
