MMReLU: A Simple and Smooth Activation Function with High Convergence Speed

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

Activation functions have a major effect on the performance of deep networks. In the past few years, there has been increasing interest in the construction of novel activation functions. In this paper, we introduce a novel non-monotonic activation function, named the Moreau Mish Rectified Linear Unit (MMReLU). It is simple, efficient, and robust compared with Mish, ReLU, and other activations. Experimental results on several classical datasets demonstrate that MMReLU outperforms its counterparts in both convergence speed and accuracy. We show that the capacity of neural networks can be enhanced by MMReLU without changing the network structure, particularly in terms of convergence speed.
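
The closed form of MMReLU is not given on this page, so the sketch below only illustrates the drop-in idea the abstract describes: an activation module (here the published Mish, x·tanh(softplus(x))) replacing ReLU without altering the network structure. The helper `build_mlp` and the layer sizes are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Mish(nn.Module):
    """Mish activation (Misra, 2019): f(x) = x * tanh(softplus(x))."""
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * torch.tanh(F.softplus(x))

def build_mlp(act: nn.Module) -> nn.Sequential:
    # Hypothetical architecture; the structure is identical regardless
    # of which activation module is passed in.
    return nn.Sequential(
        nn.Flatten(),
        nn.Linear(784, 256),
        act,
        nn.Linear(256, 10),
    )

relu_net = build_mlp(nn.ReLU())
mish_net = build_mlp(Mish())  # drop-in swap, identical parameter count
```

A smooth activation such as Mish (or, per the paper's claim, MMReLU) can be substituted this way with no change to layer shapes or parameter counts, which is what allows a like-for-like comparison of convergence speed and accuracy.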

Original language: English
Title of host publication: 2021 7th International Conference on Computer and Communications, ICCC 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1444-1448
Number of pages: 5
ISBN (electronic): 9781665409506
DOI
Publication status: Published - 2021
Event: 7th International Conference on Computer and Communications, ICCC 2021 - Chengdu, China
Duration: 10 Dec 2021 - 13 Dec 2021

Publication series

Name: 2021 7th International Conference on Computer and Communications, ICCC 2021

Conference

Conference: 7th International Conference on Computer and Communications, ICCC 2021
Country/Territory: China
City: Chengdu
Period: 10/12/21 - 13/12/21
