Steven Walton

StyleNAT: Giving Each Head a New Perspective
Nov 10, 2022

Design Amortization for Bayesian Optimal Experimental Design
Oct 07, 2022

Neighborhood Attention Transformer
Apr 14, 2022

SeMask: Semantically Masked Transformers for Semantic Segmentation
Dec 23, 2021

ConvMLP: Hierarchical Convolutional MLPs for Vision
Sep 18, 2021

Escaping the Big Data Paradigm with Compact Transformers
Apr 12, 2021