TY - GEN
T1 - AdaBoostNet
T2 - 2019 IEEE International Conference on Signal, Information and Data Processing, ICSIDP 2019
AU - Zhou, Shichao
AU - Zhao, Baojun
AU - Tang, Linbo
AU - Jing, Donglin
AU - Pan, Yu
AU - Huang, Yun
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/12
Y1 - 2019/12
N2 - Hierarchical neural networks [e.g., deep neural networks (DNNs)] have recently gained increasing attention for image classification tasks. Most previous works stack multiple levels of feature learning modules in the hope that higher-level modules can represent more abstract semantics of the image data. However, training such a hierarchical network is typically cast as a time-consuming non-convex optimization problem, and its effectiveness for image feature representation critically depends on expertise in parameter tuning with various ad hoc tricks. To address these issues, we advocate a biologically inspired hierarchical neural network. One key philosophy is that the higher-level modules in the network should correct misclassifications induced by the lower ones. Given this idea, a sequential stacking strategy for basic feature learning modules is presented. In practice, an efficient layer-wise learning and decision aggregation method is applied to boost the hierarchical network, which differs from naive layer cascading and end-to-end fine-tuning; we term it AdaBoostNet. Moreover, the basic feature learning module is set as an extreme learning machine (ELM), an effective and cheaply optimized model. Experiments are conducted on benchmark datasets to evaluate our claims. The results show that the proposed network achieves comparable or better image classification accuracy and training efficiency than traditional DNNs and hierarchical ELMs.
AB - Hierarchical neural networks [e.g., deep neural networks (DNNs)] have recently gained increasing attention for image classification tasks. Most previous works stack multiple levels of feature learning modules in the hope that higher-level modules can represent more abstract semantics of the image data. However, training such a hierarchical network is typically cast as a time-consuming non-convex optimization problem, and its effectiveness for image feature representation critically depends on expertise in parameter tuning with various ad hoc tricks. To address these issues, we advocate a biologically inspired hierarchical neural network. One key philosophy is that the higher-level modules in the network should correct misclassifications induced by the lower ones. Given this idea, a sequential stacking strategy for basic feature learning modules is presented. In practice, an efficient layer-wise learning and decision aggregation method is applied to boost the hierarchical network, which differs from naive layer cascading and end-to-end fine-tuning; we term it AdaBoostNet. Moreover, the basic feature learning module is set as an extreme learning machine (ELM), an effective and cheaply optimized model. Experiments are conducted on benchmark datasets to evaluate our claims. The results show that the proposed network achieves comparable or better image classification accuracy and training efficiency than traditional DNNs and hierarchical ELMs.
KW - Extreme Learning Machine
KW - Feature Learning
KW - Adaboost
KW - Hierarchical Neural Network
KW - Image Classification
UR - http://www.scopus.com/inward/record.url?scp=85091915850&partnerID=8YFLogxK
U2 - 10.1109/ICSIDP47821.2019.9173507
DO - 10.1109/ICSIDP47821.2019.9173507
M3 - Conference contribution
AN - SCOPUS:85091915850
T3 - ICSIDP 2019 - IEEE International Conference on Signal, Information and Data Processing 2019
BT - ICSIDP 2019 - IEEE International Conference on Signal, Information and Data Processing 2019
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 11 December 2019 through 13 December 2019
ER -