Zhangchi Zhu

Exploring Feature-based Knowledge Distillation For Recommender System: A Frequency Perspective

Nov 16, 2024

Are LLM-based Recommenders Already the Best? Simple Scaled Cross-entropy Unleashes the Potential of Traditional Sequential Recommenders

Aug 26, 2024

Understanding the Role of Cross-Entropy Loss in Fairly Evaluating Large Language Model-based Recommendation

Feb 22, 2024

Contrastive Learning with Negative Sampling Correction

Jan 13, 2024

From Input to Output: A Multi-layer Knowledge Distillation Framework for Compressing Recommendation Models

Nov 08, 2023

Robust Positive-Unlabeled Learning via Noise Negative Sample Self-correction

Aug 01, 2023