| No. | Author / Reference / Title | Activation function | Equation / Notes |
|---|---|---|---|
| 1 | Rethinking the Role of Activation Functions in Deep Convolutional Neural Networks for Image Classification | Sigmoid-ReLU | |
| 2 | Research on convolutional neural network based on improved ReLU piecewise activation function | Softsign-ReLU | |
| 3 | X-ray weld image classification using improved convolutional neural network | LReLU-Softplus | |
| 4 | Improved Convolutional Neural Network Based on Fast Exponentially Linear Unit Activation Function | FELU (derived from ELU) | |
| 5 | The Influence of the Activation Function in a Convolution Neural Network Model of Facial Expression Recognition | LS-ReLU | |
| 6 | A new Conv2D model with modified ReLU activation function for identification of disease type and severity in cucumber plant | M-ReLU (extension of ReLU) | |
| 7 | RSigELU: A nonlinear activation function for deep neural networks | RSigELU (combination of ReLU, sigmoid, and ELU) | |
| 8 | Funnel Activation for Visual Recognition | FReLU (ReLU-PReLU) | |
| 9 | ALReLU: A different approach on Leaky ReLU activation function to improve Neural Networks Performance | ALReLU | |
| 10 | Parametric Deformable Exponential Linear Units for deep neural networks | PDELU | |
| 11 | Elastic exponential linear units for convolutional neural networks | EELU | |
| 12 | QReLU and m-QReLU: Two novel quantum activation functions to aid medical diagnostics | QReLU and m-QReLU | A two-step quantum approach applied to ReLU: the ReLU solution for positive inputs, $R(z) = z \ \forall z > 0$, and the Leaky ReLU solution for negative inputs, $R(z) = \alpha z \ \forall z \le 0$ with $\alpha = 0.01$, serve as the starting point for the quantum modification (see the first sketch after this table). |
| 13 | A Dynamic ReLU on Neural Network | DReLU | A modified ReLU, D-ReLU, with a dynamic threshold that changes in real time according to the range of values in each batch (see the second sketch after this table). |
| 14 | Improving Convolutional Neural Network Expression via Difference Exponentially Linear Units | DELU | |
| 15 | Algorithm Research on Improving Activation Function of Convolutional Neural Networks | Softplus-ReLU | |
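
For concreteness, the following is a minimal NumPy sketch of the classical starting point described for QReLU and m-QReLU in row 12: ReLU on positive inputs and Leaky ReLU with $\alpha = 0.01$ on negative inputs. The function name `qrelu_starting_point` is illustrative; the quantum modification that the cited paper builds on top of this baseline is not reproduced here.

```python
import numpy as np

def qrelu_starting_point(z: np.ndarray, alpha: float = 0.01) -> np.ndarray:
    """Classical baseline behind QReLU / m-QReLU (per row 12):
    ReLU for z > 0, Leaky ReLU (alpha = 0.01) for z <= 0.
    The quantum step applied on this baseline is not shown."""
    return np.where(z > 0, z, alpha * z)

# Example: negative inputs are scaled by alpha instead of being zeroed.
print(qrelu_starting_point(np.array([-2.0, -0.5, 0.0, 1.5])))
# [-0.02  -0.005  0.     1.5  ]
```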
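
Likewise, a hedged sketch of the D-ReLU idea from row 13: a ReLU whose cut-off is recomputed for every batch from the range of that batch's values. The fraction `k` and the min-plus-fraction-of-range rule below are assumptions made for illustration; the exact thresholding rule is defined in the cited paper.

```python
import numpy as np

def drelu(x: np.ndarray, k: float = 0.05) -> np.ndarray:
    """Illustrative D-ReLU: the threshold is derived per batch from
    the batch's value range (assumed form, not the paper's exact rule)."""
    theta = x.min() + k * (x.max() - x.min())  # batch-dependent threshold
    return np.where(x > theta, x, 0.0)         # keep values above it, zero the rest

# Example: the cut-off adapts to each batch's value range.
batch = np.array([-3.0, -1.0, 0.5, 2.0])
print(drelu(batch))  # theta = -2.75 -> [ 0.  -1.   0.5  2. ]
```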