# Neuron Archives

Let’s say you are using activation function X in the hidden layers of a neural network. At a particular neuron, for a given input, you get the output “-0.0001”. Which of the following activation functions could X represent?

A) ReLU
B) tanh
C) SIGMOID
D) None of these
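
The question hinges on the output ranges of these activations: ReLU returns values in [0, ∞), sigmoid in (0, 1), and tanh in (-1, 1), so only tanh can produce a small negative value like -0.0001. Below is a minimal Python sketch illustrating this; the input value used is chosen purely for illustration.

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x), so the output is never negative
    return np.maximum(0.0, x)

def sigmoid(x):
    # Sigmoid: output is always in the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # tanh: output lies in (-1, 1), so small negative values are possible
    return np.tanh(x)

# Hypothetical pre-activation value chosen to show which function
# can yield a slightly negative output
x = -0.0001

for name, fn in [("ReLU", relu), ("sigmoid", sigmoid), ("tanh", tanh)]:
    print(f"{name}({x}) = {fn(x):.6f}")

# Expected behaviour:
#   ReLU(-0.0001)    = 0.000000   (clipped to zero)
#   sigmoid(-0.0001) ≈ 0.499975   (strictly positive)
#   tanh(-0.0001)    ≈ -0.000100  (negative output is possible)
```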