A Novel Activation Function in Convolutional Neural Network for Twitter Text Classification in Deep Learning

Authors

  • R. Balamurugan, S. Pushpa

Abstract

The activation layer is one of the important components that influence the performance of a neural network. An ideal activation function should have certain characteristics, such as a non-vanishing gradient and unboundedness. This paper proposes a new activation function that possesses many of these ideal statistical characteristics. It is expressed mathematically as f(x) = ReLU(x) + x*Sigmoid(x), combining the advantages of both the ReLU and Sigmoid activation functions. When implemented in a Deep Convolutional Neural Network (DCNN), the proposed activation function shows better performance on Twitter sentiment classification than other commonly used activation functions in DCNNs and other deep learning algorithms.
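The abstract's formula can be sketched directly. Below is a minimal NumPy implementation of f(x) = ReLU(x) + x*Sigmoid(x); the function and variable names are illustrative, not from the paper, and this is only the element-wise activation, not the full DCNN described in the work:

```python
import numpy as np

def sigmoid(x):
    # Standard logistic sigmoid: 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Rectified linear unit: max(0, x)
    return np.maximum(0.0, x)

def proposed_activation(x):
    # f(x) = ReLU(x) + x * Sigmoid(x), as stated in the abstract.
    # For x > 0 both terms contribute (non-vanishing, unbounded gradient);
    # for x <= 0 only x * Sigmoid(x) remains, giving a small nonzero signal
    # instead of the flat zero of plain ReLU.
    return relu(x) + x * sigmoid(x)
```

For example, proposed_activation(0.0) is 0, while negative inputs yield small negative outputs rather than exactly zero, which avoids "dead" units.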

Published

2020-12-01

Issue

Section

Articles