
Jialiang Tang

Hybrid Data-Free Knowledge Distillation

Dec 18, 2024

Learn from Balance: Rectifying Knowledge Transfer for Long-Tailed Scenarios

Sep 12, 2024

Direct Distillation between Different Domains

Jan 12, 2024

Distribution Shift Matters for Knowledge Distillation with Webly Collected Images

Jul 21, 2023