Abstract
Thanks to its supervised parameter generation strategy and non-iterative training mechanism, the deep stochastic configuration network (DSCN) achieves highly efficient modelling in scenarios of relatively low problem complexity. However, growing numbers of hidden layers and increasing amounts of training data pose a challenge to the implementation of DSCN. To address this problem, we propose a Dense DSCN with a Hybrid Training mechanism (HT-DDSCN), which extends the DSCN architecture to a densely connected form and combines three typical optimisation techniques with one universal control strategy to optimise the computation of the output weights. Extensive experiments on four benchmark regression problems show that HT-DDSCN significantly improves the generalisation ability and stability of the DSCN.
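The stochastic configuration idea the abstract builds on — incrementally adding randomly generated hidden nodes under a supervisory acceptance criterion, then solving the output weights non-iteratively by least squares — can be sketched roughly as follows. This is a minimal illustrative sketch of a shallow stochastic configuration network, not the paper's HT-DDSCN: the residual-projection score used here is a simplified stand-in for the full supervisory inequality, and all data, node counts, and scales are assumptions for the toy example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (assumption: any smooth target works here)
X = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
y = np.sin(2 * np.pi * X).ravel()

def scn_fit(X, y, max_nodes=50, candidates=30, scale=10.0, tol=1e-3):
    """Greedy stochastic configuration sketch:
    1. draw several random candidate sigmoid nodes,
    2. keep the one whose output best explains the current residual
       (a simplified surrogate for the supervisory condition),
    3. re-solve the output weights by least squares (non-iterative)."""
    n = X.shape[0]
    H = np.empty((n, 0))      # hidden-layer output matrix
    e = y.copy()              # current residual
    params, beta = [], np.empty(0)
    for _ in range(max_nodes):
        best, best_val = None, -np.inf
        for _ in range(candidates):
            w = rng.uniform(-scale, scale, X.shape[1])
            b = rng.uniform(-scale, scale)
            h = 1.0 / (1.0 + np.exp(-(X @ w + b)))
            # score: squared projection of the residual onto this node
            val = (e @ h) ** 2 / (h @ h)
            if val > best_val:
                best_val, best = val, (w, b, h)
        w, b, h = best
        params.append((w, b))
        H = np.column_stack([H, h])
        # output weights in closed form (least squares), no backprop
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        e = y - H @ beta
        if np.sqrt(np.mean(e ** 2)) < tol:
            break
    return params, beta

params, beta = scn_fit(X, y)
```

The dense extension described in the paper would additionally feed each layer's output to all subsequent layers; the hybrid training mechanism replaces the plain least-squares step above with a combination of optimisation techniques for the output weights.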
| Original language | English |
|---|---|
| Pages (from-to) | 301-314 |
| Number of pages | 14 |
| Journal | International Journal of Computing Science and Mathematics |
| Volume | 15 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - 2022 |
Keywords
- DSCN
- broad learning system
- deep stochastic configuration network
- extreme learning machine
- generalisation ability
- hybrid training
- neural networks with random weights
- random vector functional link net
- randomised algorithms
- randomised neural networks
- stochastic configuration network