A cache structure and corresponding data access method for Winograd algorithm

Zhixin Zhang, Zhiheng Li*, He Chen

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

Convolutional neural networks perform well in many application domains but incur high computational complexity and heavy data transfer between the storage module and the computation module, which limits their application. The Winograd algorithm can accelerate the convolution layer, which is the most computationally expensive layer in a CNN. In this letter, a cache structure and a corresponding data access method, called Jump-Step flow, are designed for the Winograd algorithm to reduce data transfer. The Winograd PE replaces multiplications with additions to accelerate the convolution operation. This allows it to compute 4 output elements in 3 clock cycles, corresponding to 72 OPs and 2× better DSP efficiency than average. With the proposed cache structure and data access method, each element is read from the storage module only once, even when different tiles share data, achieving a 4× reduction in data transfer between the PE and the storage module compared with the general Winograd approach, and a 9× reduction compared with conventional convolution.
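The multiplication-reducing idea the abstract describes can be sketched with the classic 1-D Winograd F(2,3) transform (2 outputs of a 3-tap filter from a 4-element input tile, using 4 multiplications instead of the direct method's 6). This is a minimal illustrative example only: the paper's PE produces 4 elements per tile and its exact transform and cache structure are not given in the abstract, so the function below is an assumption, not the authors' design.

```python
def winograd_f23(d, g):
    """Winograd F(2,3): 2 convolution outputs from a 4-element input tile d
    and a 3-tap filter g, using 4 multiplications (m1..m4) plus additions."""
    d0, d1, d2, d3 = d
    g0, g1, g2 = g
    # The filter-side sums (g0+g1+g2)/2 and (g0-g1+g2)/2 can be precomputed
    # once per filter, leaving only 4 runtime multiplications per tile.
    m1 = (d0 - d2) * g0
    m2 = (d1 + d2) * (g0 + g1 + g2) / 2
    m3 = (d2 - d1) * (g0 - g1 + g2) / 2
    m4 = (d1 - d3) * g2
    return [m1 + m2 + m3, m2 - m3 - m4]

def direct_f23(d, g):
    """Reference: direct sliding-window convolution (6 multiplications)."""
    return [sum(d[i + j] * g[j] for j in range(3)) for i in range(2)]
```

Note that consecutive 4-element input tiles overlap by 2 elements; this overlap is exactly the kind of inter-tile data sharing that a cache structure like Jump-Step flow exploits so that each element is fetched from storage only once.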

Original language: English
Title of host publication: IET Conference Proceedings
Publisher: Institution of Engineering and Technology
Pages: 634-639
Number of pages: 6
Volume: 2020
Edition: 9
ISBN (Electronic): 9781839535406
DOIs
Publication status: Published - 2020
Event: 5th IET International Radar Conference, IET IRC 2020 - Virtual, Online
Duration: 4 Nov 2020 - 6 Nov 2020

Conference

Conference: 5th IET International Radar Conference, IET IRC 2020
City: Virtual, Online
Period: 4/11/20 - 6/11/20

Keywords

  • CACHE STRUCTURE
  • CNN
  • JUMP-STEP FLOW
  • WINOGRAD ALGORITHM

