Abstract
The convergence properties of CMAC in both batch and incremental learning are analyzed. Earlier conclusions about CMAC convergence, which were derived under the condition that the articulation matrix is positive definite, are strengthened into less restrictive, more general conclusions that require no additional conditions. Based on the new conclusions, an improved CMAC algorithm with a self-optimizing learning rate is proposed. Simulation results confirm the new conclusions and demonstrate the advantages of the improved algorithm.
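The abstract does not spell out the training procedure, so the following is only a minimal sketch of the kind of incremental (per-sample) CMAC learning whose convergence is being analyzed: a 1-D CMAC with overlapping quantization cells and a fixed learning rate. The class name, tiling scheme, and all parameter values are illustrative assumptions, not the paper's algorithm (in particular, the self-optimizing learning-rate rule proposed in the paper is not reproduced here).

```python
import numpy as np

class CMAC:
    """Minimal 1-D CMAC sketch (illustrative, not the paper's algorithm).

    Each input activates c consecutive cells of a weight table; the
    output is the sum of the active weights (Albus-style CMAC).
    """

    def __init__(self, n_tiles=32, c=8, x_min=0.0, x_max=1.0, lr=0.5):
        self.c = c                # generalization width: active cells per input
        self.n_tiles = n_tiles    # quantization resolution of the input range
        self.w = np.zeros(n_tiles + c)  # weight table (margin for offsets)
        self.x_min, self.x_max = x_min, x_max
        self.lr = lr              # fixed learning rate (the paper instead
                                  # proposes a self-optimizing rate)

    def _active(self, x):
        # Quantize x; the c cells starting at that index are active.
        q = int((x - self.x_min) / (self.x_max - self.x_min) * self.n_tiles)
        q = min(max(q, 0), self.n_tiles - 1)
        return list(range(q, q + self.c))

    def predict(self, x):
        return self.w[self._active(x)].sum()

    def update(self, x, target):
        # Incremental LMS-style update: spread the output error
        # equally over the c active weights.
        idx = self._active(x)
        err = target - self.w[idx].sum()
        self.w[idx] += self.lr * err / self.c

# Demo: learn sin(2*pi*x) incrementally, one sample at a time.
net = CMAC()
xs = (np.arange(32) + 0.5) / 32
for _ in range(300):
    for x in xs:
        net.update(x, np.sin(2 * np.pi * x))
errors = [abs(net.predict(x) - np.sin(2 * np.pi * x)) for x in xs]
```

With a fixed rate in a suitable range the per-sample error at the training points shrinks over repeated sweeps; the paper's analysis concerns exactly when such iterations converge, without extra conditions on the articulation matrix.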
Original language | English |
---|---|
Pages (from-to) | 61-74 |
Number of pages | 14 |
Journal | Neural Processing Letters |
Volume | 14 |
Issue number | 1 |
DOIs | |
Publication status | Published - Aug 2001 |
Keywords
- Batch learning
- CMAC
- Incremental learning
- Learning convergence
- Neural networks