Abstract
Convolutional neural networks (CNNs) perform well in many domains, but their high computational complexity and heavy data transfer between the storage module and the computation module limit their application. The Winograd algorithm can accelerate the convolution layers, which consume the most computing resources in a CNN. In this letter, a cache structure and a data access method called Jump-Step flow are designed for the Winograd algorithm to reduce data transfer. The Winograd PE replaces multiplications with additions to accelerate the convolution operation, allowing it to produce 4 output elements in 3 clock cycles, which corresponds to 72 OPs and 2× better DSP efficiency than average. With this cache structure and data access method, each element is read from the storage module only once even when different tiles share data, achieving a 4× reduction in data transfer between the PE and the storage module compared with the general Winograd approach, and a 9× reduction compared with conventional convolution.
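As an illustration of the multiply-for-add trade the abstract describes (a minimal sketch, not the paper's actual PE design), the 1-D Winograd minimal filtering algorithm F(2,3) below produces 2 convolution outputs with only 4 multiplications, versus the 6 a direct 3-tap convolution needs; the 2-D F(2×2,3×3) form behind the 4-elements-per-tile figure nests this same construction.

```python
def winograd_f23(d, g):
    """Winograd minimal filtering F(2,3): two outputs of a 3-tap
    1-D convolution over four inputs d[0..3] with filter g[0..2],
    using 4 multiplications instead of the direct method's 6."""
    # Transform-domain products: the only data-by-filter multiplications.
    m1 = (d[0] - d[2]) * g[0]
    m2 = (d[1] + d[2]) * (g[0] + g[1] + g[2]) / 2
    m3 = (d[2] - d[1]) * (g[0] - g[1] + g[2]) / 2
    m4 = (d[1] - d[3]) * g[2]
    # Inverse transform: additions and subtractions only.
    return [m1 + m2 + m3, m2 - m3 - m4]

def direct_conv3(d, g):
    """Reference: direct 3-tap convolution (6 multiplications)."""
    return [sum(d[i + k] * g[k] for k in range(3)) for i in range(2)]
```

The filter-side transforms such as (g0 + g1 + g2)/2 can be precomputed once per filter, so at run time each pair of outputs costs 4 multiplies plus a handful of additions, which is the kind of trade a Winograd PE exploits in hardware.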
Original language | English |
---|---|
Title of host publication | IET Conference Proceedings |
Publisher | Institution of Engineering and Technology |
Pages | 634-639 |
Number of pages | 6 |
Volume | 2020 |
Edition | 9 |
ISBN (Electronic) | 9781839535406 |
DOIs | |
Publication status | Published - 2020 |
Event | 5th IET International Radar Conference, IET IRC 2020 - Virtual, Online. Duration: 4 Nov 2020 → 6 Nov 2020 |
Conference
Conference | 5th IET International Radar Conference, IET IRC 2020 |
---|---|
City | Virtual, Online |
Period | 4/11/20 → 6/11/20 |
Keywords
- Cache structure
- CNN
- Jump-Step flow
- Winograd algorithm