Hidden Unit Specialization in Layered Neural Networks: ReLU vs. Sigmoidal Activation

Oct 16, 2019

View paper on arXiv
