
Ziliang Zong

Reduce, Reuse, Recycle: Improving Training Efficiency with Distillation

Nov 01, 2022

Learning Omnidirectional Flow in 360-degree Video via Siamese Representation

Aug 07, 2022

Network Binarization via Contrastive Learning

Jul 16, 2022

Lipschitz Continuity Retained Binary Neural Network

Jul 16, 2022

Win the Lottery Ticket via Fourier Analysis: Frequencies Guided Network Pruning

Jan 30, 2022

Measure Twice, Cut Once: Quantifying Bias and Fairness in Deep Neural Networks

Oct 08, 2021

Lipschitz Continuity Guided Knowledge Distillation

Aug 29, 2021

Simon Says: Evaluating and Mitigating Bias in Pruned Neural Networks with Knowledge Distillation

Jun 15, 2021

Parallel Blockwise Knowledge Distillation for Deep Neural Network Compression

Dec 05, 2020

Egok360: A 360 Egocentric Kinetic Human Activity Video Dataset

Oct 15, 2020