MMReLU: A Simple and Smooth Activation Function with High Convergence Speed

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

Activation functions have a major effect on the performance of deep networks. In recent years, there has been increasing interest in the construction of novel activation functions. In this paper, we introduce a novel non-monotonic activation function, named the Moreau Mish Rectified Linear Unit (MMReLU). It is simple, efficient, and robust compared with Mish, ReLU, and other common activations. Experimental results on several classical datasets demonstrate that MMReLU outperforms its counterparts in both convergence speed and accuracy. We show that MMReLU can enhance the capacity of neural networks, and in particular their convergence speed, without changing the network structure.
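
Illustrative note (not from the paper): the abstract's claim that a new activation can be dropped in without changing the network structure can be sketched in PyTorch as below. The MMReLU formula itself is not given on this page and is not reproduced here; Mish, defined as x * tanh(softplus(x)), is used as a stand-in, and the layer sizes are arbitrary assumptions.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class Mish(nn.Module):
        # Smooth, non-monotonic baseline activation: x * tanh(softplus(x)).
        # The paper's MMReLU would be implemented as a module of the same shape.
        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return x * torch.tanh(F.softplus(x))

    def make_mlp(activation: nn.Module) -> nn.Sequential:
        # Identical architecture regardless of which activation is plugged in.
        return nn.Sequential(
            nn.Linear(784, 256),
            activation,
            nn.Linear(256, 10),
        )

    # Swapping ReLU for Mish (or MMReLU, once implemented) leaves the
    # network structure untouched; only the nonlinearity changes.
    relu_net = make_mlp(nn.ReLU())
    mish_net = make_mlp(Mish())
    print(mish_net(torch.randn(4, 784)).shape)  # torch.Size([4, 10])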

Original language: English
Title of host publication: 2021 7th International Conference on Computer and Communications, ICCC 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1444-1448
Number of pages: 5
ISBN (Electronic): 9781665409506
DOIs
Publication status: Published - 2021
Event: 7th International Conference on Computer and Communications, ICCC 2021 - Chengdu, China
Duration: 10 Dec 2021 - 13 Dec 2021

Publication series

Name: 2021 7th International Conference on Computer and Communications, ICCC 2021

Conference

Conference: 7th International Conference on Computer and Communications, ICCC 2021
Country/Territory: China
City: Chengdu
Period: 10/12/21 - 13/12/21

Keywords

  • activation function
  • simple
  • smooth
