Softsign Activation

The Softsign activation function is one of the many activation functions that researchers have developed for use in neural networks. It is sometimes used in place of more popular activation functions, such as sigmoid and ReLU, and it has its own advantages and disadvantages. Below, we take a closer look at how it works, its pros and cons, and some examples of its use in image classification applications.

How Softsign Activation Works

The Softsign activation function is defined as:

$$f(x) = \frac{x}{1 + |x|}$$

As the equation shows, the Softsign activation function takes an input x and squashes it by dividing it by one plus its absolute value, so the output always lies in the open interval (-1, 1).

When the input to the Softsign activation function is large and positive, the output approaches 1; when the input is large and negative, the output approaches -1. Near zero, the function is approximately linear (f(x) ≈ x), so small inputs produce correspondingly small outputs. In essence, the Softsign activation function behaves like the tanh function, but it saturates polynomially rather than exponentially, so its gradient vanishes more slowly in the tails during backpropagation.
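To make the shape concrete, here is a minimal NumPy sketch of the function; the function and variable names are our own, not from any particular library:

```python
import numpy as np

def softsign(x):
    """Softsign activation: x / (1 + |x|), bounded in (-1, 1)."""
    return x / (1.0 + np.abs(x))

x = np.array([-100.0, -1.0, 0.0, 1.0, 100.0])
print(softsign(x))  # [-0.990099 -0.5  0.  0.5  0.990099]
print(np.tanh(x))   # [-1. -0.76159416  0.  0.76159416  1.] -- saturates faster
```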

Pros and Cons of Softsign Activation

Like other activation functions, Softsign has its own set of advantages and disadvantages that make it more or less appropriate in certain applications. Here are some of the pros and cons of using Softsign activation:

Pros

  • Softsign has a smooth gradient that varies continuously with its input and saturates polynomially rather than exponentially. This mitigates (though does not eliminate) vanishing-gradient problems in deep neural networks.
  • Softsign is bounded in (-1, 1) and similar in shape to the tanh function. This makes it convenient to plug into other models and theoretical analyses.
  • The Softsign function is differentiable everywhere, with its derivative peaking at 1 at the origin, making it well suited to gradient-based optimization and learning applications (see the sketch after this list).
  • Softsign is a non-linear function that can be useful in multi-layer neural networks when the goal is to classify complex patterns.
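The derivative of Softsign is f'(x) = 1 / (1 + |x|)^2, which is continuous everywhere and equals 1 at the origin. The short sketch below (again using our own NumPy helper, not a library function) shows that it decays polynomially, while tanh's gradient decays exponentially:

```python
import numpy as np

def softsign_grad(x):
    """Derivative of softsign: 1 / (1 + |x|)**2."""
    return 1.0 / (1.0 + np.abs(x)) ** 2

x = np.array([0.0, 1.0, 10.0, 100.0])
print(softsign_grad(x))       # [1.0  0.25  ~8.3e-03  ~9.8e-05] -- polynomial decay
print(1.0 - np.tanh(x) ** 2)  # [1.0  0.42  ~8.2e-09  ~0.0]     -- exponential decay
```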

Cons

  • Softsign is susceptible to outliers. When the magnitude of the input is very large, the output saturates near ±1 and carries almost no information about the input (see the sketch after this list).
  • Although Softsign saturates more slowly than sigmoid or tanh, its gradient still shrinks toward zero as the magnitude of the input grows, which can slow training once units enter the saturated regime.
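To see the saturation concretely, the sketch below shows that once Softsign has saturated, even an input 100 times larger barely moves the output:

```python
import numpy as np

def softsign(x):
    return x / (1.0 + np.abs(x))

# Once saturated, a 100x larger input changes the output by less than 0.01,
# so downstream layers receive almost no information about the difference.
print(softsign(100.0))    # 0.990099...
print(softsign(10000.0))  # 0.999900...
```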

Examples of Softsign Activation in Neural Networks

There are many examples of Softsign activation being used in neural networks, particularly in image classification tasks. Here are some examples, followed by a minimal usage sketch:

  • MNIST Handwritten Digit Database: The MNIST dataset is a popular database of handwritten digits used to train machine learning models. It contains 60,000 samples for training and 10,000 samples for testing. In a 2016 paper, researchers used the Softsign activation function in their convolutional neural network to achieve an accuracy of 99.73% on the dataset.
  • ImageNet Classification: Researchers from Google trained a deep convolutional neural network (CNN) called Inception V2 on the ImageNet dataset using the Softsign activation function. The network was able to achieve a top-5 error rate of 4.80%, which was better than any other model available at the time.
  • Facial Expression Recognition: In a 2016 study, researchers used Softsign activation along with other activation functions in a deep neural network to recognize facial expressions. They were able to achieve an accuracy of 73.9%, which was higher than the state-of-the-art at the time.
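As a concrete illustration of how Softsign can be dropped into an image classifier, here is a minimal sketch assuming TensorFlow/Keras, which exposes softsign as a built-in activation. The architecture, layer sizes, and optimizer are illustrative choices of ours, not those used in the papers above:

```python
import tensorflow as tf

# Load and normalize MNIST.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A small fully connected classifier using the built-in softsign activation.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="softsign"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))
```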

The Softsign activation function is a non-linear function that squashes its input by dividing it by one plus its absolute value. It has several advantages, including a smooth, slowly decaying gradient and a bounded output, but it also has some disadvantages, such as saturating on large-magnitude inputs. Despite its drawbacks, Softsign activation has been used effectively in many machine learning applications, particularly in image classification tasks.
