No. / Title / Activation function / Equation or notes

1. Rethinking the Role of Activation Functions in Deep Convolutional Neural Networks for Image Classification. Activation: Sigmoid-ReLU.


2. Research on convolutional neural network based on improved ReLU piecewise activation function. Activation: Softsign-ReLU.


3. X-ray weld image classification using improved convolutional neural network. Activation: LReLU-Softplus.


4. Improved Convolutional Neural Network Based on Fast Exponentially Linear Unit Activation Function. Activation: FELU (derived from ELU).


5. The Influence of the Activation Function in a Convolution Neural Network Model of Facial Expression Recognition. Activation: LS-ReLU.


6. A new Conv2D model with modified ReLU activation function for identification of disease type and severity in cucumber plant. Activation: MReLU (extension of ReLU).


7. RSigELU: A nonlinear activation function for deep neural networks. Activation: RSigELU (a combination of ReLU, sigmoid, and ELU).


8. Funnel Activation for Visual Recognition. Activation: FReLU (builds on ReLU and PReLU).


9. ALReLU: A different approach on Leaky ReLU activation function to improve Neural Networks Performance. Activation: ALReLU.


10. Parametric Deformable Exponential Linear Units for deep neural networks. Activation: PDELU.


11. Elastic exponential linear units for convolutional neural networks. Activation: EELU.


12. QReLU and mQReLU: Two novel quantum activation functions to aid medical diagnostics. Activation: QReLU and mQReLU. A two-step quantum approach was applied to ReLU: its solution for positive values (R(z) = z, ∀ z > 0) and the Leaky ReLU solution for negative values (R(z) = α × z, ∀ z ≤ 0, where α = 0.01) were taken as the starting point for the quantum modification.
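The Leaky ReLU starting point quoted in row 12 (before the quantum modification, which is not detailed here) can be sketched in NumPy; the function name and vectorized form are my own, not the paper's:

```python
import numpy as np

def leaky_relu(z: np.ndarray, alpha: float = 0.01) -> np.ndarray:
    """Starting point cited for QReLU:
    R(z) = z for z > 0, and R(z) = alpha * z for z <= 0, with alpha = 0.01."""
    return np.where(z > 0, z, alpha * z)
```

With the default alpha = 0.01, positive inputs pass through unchanged and negative inputs are scaled down by a factor of 100.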

13. A Dynamic ReLU on Neural Network. Activation: DReLU. A modified ReLU with a dynamic threshold that changes in real time according to the range of values in each batch.

14. Improving Convolutional Neural Network Expression via Difference Exponentially Linear Units. Activation: DELU.


15. Algorithm Research on Improving Activation Function of Convolutional Neural Networks. Activation: Softplus-ReLU.

