Zih-Ching Chen

How to Learn a New Language? An Efficient Solution for Self-Supervised Learning Models Unseen Languages Adaption in Low-Resource Scenario
Nov 27, 2024

NeKo: Toward Post Recognition Generative Correction Large Language Models with Task-Oriented Experts
Nov 08, 2024

Leave No Knowledge Behind During Knowledge Distillation: Towards Practical and Effective Knowledge Distillation for Code-Switching ASR Using Realistic Data
Jul 15, 2024

PEFT for Speech: Unveiling Optimal Placement, Merging Strategies, and Ensemble Techniques
Jan 04, 2024

How to Estimate Model Transferability of Pre-Trained Speech Models?
Jun 01, 2023

CHAPTER: Exploiting Convolutional Neural Network Adapters for Self-supervised Speech Models
Dec 01, 2022

Exploring Efficient-tuning Methods in Self-supervised Speech Models
Oct 10, 2022

Learning Facial Liveness Representation for Domain Generalized Face Anti-spoofing
Aug 16, 2022

AdapterBias: Parameter-efficient Token-dependent Representation Shift for Adapters in NLP Tasks
Apr 30, 2022