Yueyao Yu

Why "classic" Transformers are shallow and how to make them go deep

Dec 11, 2023

POViT: Vision Transformer for Multi-objective Design and Characterization of Nanophotonic Devices

May 17, 2022

Householder-Absolute Neural Layers For High Variability and Deep Trainability

Jun 08, 2021

Variability of Artificial Neural Networks

May 20, 2021

AuxBlocks: Defense Adversarial Example via Auxiliary Blocks

Feb 18, 2019