
Dimitri Nowicki

Sampling-based Gradient Regularization for Capturing Long-Term Dependencies in Recurrent Neural Networks

Feb 13, 2017
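The title suggests regularizing back-propagated gradient norms at randomly sampled timesteps so that long-range error signals neither vanish nor explode. Below is a minimal, hypothetical sketch of that general idea in PyTorch; the exact objective, sampling scheme, and function name are assumptions for illustration, not the paper's actual algorithm.

```python
import torch

def sampled_gradient_penalty(hidden_states, n_samples=2):
    """Hypothetical sketch: penalize deviation from unit norm of gradients
    back-propagated from the last hidden state to a few randomly sampled
    earlier timesteps. `hidden_states` is a list [h_1, ..., h_T] of tensors
    that are still part of the autograd graph.
    """
    h_T = hidden_states[-1]
    penalty = h_T.new_zeros(())
    for _ in range(n_samples):
        # Sample an earlier timestep t < T.
        t = torch.randint(0, len(hidden_states) - 1, (1,)).item()
        h_t = hidden_states[t]
        # Random direction gives a stochastic estimate of how the Jacobian
        # d h_T / d h_t scales a back-propagated signal.
        v = torch.randn_like(h_T)
        (g,) = torch.autograd.grad(h_T, h_t, grad_outputs=v,
                                   retain_graph=True, create_graph=True)
        # Encourage the propagated gradient to keep roughly unit norm.
        penalty = penalty + (g.norm() / v.norm() - 1.0) ** 2
    return penalty / n_samples
```

In use, such a term would be added to the task loss (with some weight) before calling backward; `create_graph=True` keeps the penalty itself differentiable.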

Norm-preserving Orthogonal Permutation Linear Unit Activation Functions (OPLU)

Jan 31, 2017
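OPLU is a norm-preserving activation whose output is a permutation of its input: each pair of units is replaced by its (max, min), so the Jacobian is a permutation matrix and both activations and back-propagated gradients keep their norm. A minimal NumPy sketch follows; the pairing of adjacent units is an illustrative assumption.

```python
import numpy as np

def oplu(x):
    """OPLU activation: sort each pair of units into (max, min).

    The output is a permutation of the input, so the L2 norm of the
    activations (and of gradients flowing back through it) is preserved.
    """
    x = np.asarray(x, dtype=float)
    assert x.shape[-1] % 2 == 0, "OPLU pairs units, so the width must be even"
    a, b = x[..., 0::2], x[..., 1::2]           # pair adjacent units (assumed pairing)
    out = np.empty_like(x)
    out[..., 0::2] = np.maximum(a, b)           # first unit of each pair gets the max
    out[..., 1::2] = np.minimum(a, b)           # second unit gets the min
    return out

# Quick check: the norm is unchanged by the activation.
v = np.random.randn(8)
print(np.linalg.norm(v), np.linalg.norm(oplu(v)))
```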