Shanshan Lao

Weight-Inherited Distillation for Task-Agnostic BERT Compression

May 16, 2023

Rethinking Knowledge Distillation via Cross-Entropy

Aug 22, 2022

Attentions Help CNNs See Better: Attention-based Hybrid Image Quality Assessment Network

Apr 22, 2022

MANIQA: Multi-dimension Attention Network for No-Reference Image Quality Assessment

Apr 21, 2022