
Haode Zhang

Minimizing PLM-Based Few-Shot Intent Detectors

Jul 13, 2024

Revisit Few-shot Intent Classification with PLMs: Direct Fine-tuning vs. Continual Pre-training

Jun 08, 2023

Asymmetric feature interaction for interpreting model predictions

May 12, 2023

New Intent Discovery with Pre-training and Contrastive Learning

May 25, 2022

Fine-tuning Pre-trained Language Models for Few-shot Intent Detection: Supervised Pre-training and Isotropization

May 15, 2022

Effectiveness of Pre-training for Few-shot Intent Classification

Sep 13, 2021