Linghan Zheng

RecurFormer: Not All Transformer Heads Need Self-Attention

Oct 10, 2024

Unveiling and Controlling Anomalous Attention Distribution in Transformers

Jun 26, 2024

Top in Chinese Data Processing: English Code Models

Jan 25, 2024

CAINNFlow: Convolutional block Attention modules and Invertible Neural Networks Flow for anomaly detection and localization tasks

Jun 08, 2022