Serf

Serf: Understanding Log-Softplus ERror Activation Function. When it comes to artificial neural networks and their deep learning algorithms, activation functions…
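
For quick reference, Serf is usually defined as f(x) = x * erf(ln(1 + e^x)), i.e. the input scaled by the error function of its softplus. A minimal sketch, assuming PyTorch is available and using an illustrative helper name `serf`:

```python
import torch
import torch.nn.functional as F

def serf(x: torch.Tensor) -> torch.Tensor:
    # Serf(x) = x * erf(softplus(x)) = x * erf(ln(1 + e^x))
    return x * torch.erf(F.softplus(x))

x = torch.linspace(-3.0, 3.0, 7)
print(serf(x))
```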

CReLU

Introduction to CReLU. CReLU, or Concatenated Rectified Linear Units, is an activation function used in deep learning. It involves concatenating…
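
The concatenation the excerpt mentions is of ReLU applied to the input and to its negation, which doubles the feature dimension. A minimal sketch, assuming PyTorch and an illustrative helper name `crelu`:

```python
import torch
import torch.nn.functional as F

def crelu(x: torch.Tensor, dim: int = 1) -> torch.Tensor:
    # Concatenate ReLU(x) and ReLU(-x) along `dim`, doubling its size
    return torch.cat((F.relu(x), F.relu(-x)), dim=dim)

x = torch.randn(8, 16, 32, 32)   # (batch, channels, height, width)
print(crelu(x).shape)            # torch.Size([8, 32, 32, 32])
```

Because the output has twice as many channels as the input, any layer that follows a CReLU needs its input size adjusted accordingly.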

Mish

When it comes to neural networks, activation functions are a fundamental component. They are responsible for determining whether a neuron…
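
For quick reference, Mish is commonly defined as f(x) = x * tanh(softplus(x)). A minimal sketch, assuming PyTorch and an illustrative helper name `mish` (recent PyTorch releases also expose torch.nn.functional.mish):

```python
import torch
import torch.nn.functional as F

def mish(x: torch.Tensor) -> torch.Tensor:
    # Mish(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + e^x))
    return x * torch.tanh(F.softplus(x))

x = torch.linspace(-3.0, 3.0, 7)
print(mish(x))
```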

ReLU6

ReLU6: A Modified Version of Rectified Linear Unit. Machine learning algorithms are rapidly changing the computational landscape of artificial intelligence…
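
The modification is simply capping the ReLU output at 6, i.e. f(x) = min(max(0, x), 6). A minimal sketch, assuming PyTorch and an illustrative helper name `relu6` (PyTorch also provides torch.nn.functional.relu6):

```python
import torch

def relu6(x: torch.Tensor) -> torch.Tensor:
    # ReLU6(x) = min(max(0, x), 6): a ReLU clipped at 6
    return torch.clamp(x, min=0.0, max=6.0)

x = torch.tensor([-2.0, 0.5, 3.0, 8.0])
print(relu6(x))   # tensor([0.0000, 0.5000, 3.0000, 6.0000])
```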