Seonghak Kim

Maximizing Discrimination Capability of Knowledge Distillation with Energy-based Score

Nov 24, 2023

Cosine Similarity Knowledge Distillation for Individual Class Information Transfer

Nov 24, 2023

Robustness-Reinforced Knowledge Distillation with Correlation Distance and Network Pruning

Nov 23, 2023

A$^3$: Accelerating Attention Mechanisms in Neural Networks with Approximation

Feb 22, 2020