ReLU stands for Rectified Linear Unit. It is one of the most widely used activation functions in deep learning: it returns the input unchanged when the input is positive and zero otherwise, i.e. f(x) = max(0, x). It appears in almost all convolutional neural networks and many other deep learning models.
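A minimal sketch of the function described above, using NumPy (the function name `relu` is my own choice, not from any particular framework):

```python
import numpy as np

def relu(x):
    # ReLU: f(x) = max(0, x) -- passes positive values through, zeros out negatives
    return np.maximum(0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # negatives become 0, positives are unchanged
```

In practice you would usually use a framework's built-in version (e.g. `torch.nn.ReLU` or `tf.nn.relu`) rather than defining your own.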
/u/impressive_clock7032
, 2023-05-29, 05:58:39