Dual Rectified Linear Units: A Replacement for Tanh Activation Functions in Quasi-Recurrent Neural Networks

Oct 31, 2017
