APTx: better activation function than MISH, SWISH, and ReLU's variants used in deep learning

Sep 14, 2022
Figures 1–4 from the paper.
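As a quick illustration of the activation the paper proposes, here is a minimal sketch, assuming the APTx form φ(x) = (α + tanh(βx)) · γx with the commonly cited defaults α = 1, β = 1, γ = 0.5 (parameter names and defaults are taken from the paper's description of APTx; verify against the paper before use):

```python
import math

def aptx(x, alpha=1.0, beta=1.0, gamma=0.5):
    """APTx activation: (alpha + tanh(beta * x)) * gamma * x.

    With alpha=1, beta=1, gamma=0.5 it closely approximates MISH
    while needing only one tanh evaluation per input.
    """
    return (alpha + math.tanh(beta * x)) * gamma * x

# Behaves like the identity for large positive inputs
# (tanh saturates at 1, so phi(x) -> 2 * gamma * x = x),
# and vanishes for large negative inputs (tanh -> -1).
```

Because tanh is cheaper to evaluate once than the softplus-plus-tanh composition inside MISH, this form trades a small amount of shape flexibility for lower per-activation cost.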


View paper on arXiv.
