Ruchao Fan

AlignFormer: Modality Matching Can Achieve Better Zero-shot Instruction-Following Speech-LLM

Dec 02, 2024

CTC-GMM: CTC guided modality matching for fast and accurate streaming speech translation

Oct 07, 2024

Benchmarking Children's ASR with Supervised and Self-supervised Speech Foundation Models

Jun 15, 2024

SOA: Reducing Domain Mismatch in SSL Pipeline by Speech Only Adaptation for Low Resource ASR

Jun 15, 2024

UniEnc-CASSNAT: An Encoder-only Non-autoregressive ASR for Speech SSL Models

Feb 14, 2024

Towards Better Domain Adaptation for Self-supervised Models: A Case Study of Child ASR

Apr 28, 2023

A CTC Alignment-based Non-autoregressive Transformer for End-to-end Automatic Speech Recognition

Apr 15, 2023

Acoustic-aware Non-autoregressive Spell Correction with Mask Sample Decoding

Oct 16, 2022

CTCBERT: Advancing Hidden-unit BERT with CTC Objectives

Oct 16, 2022

DRAFT: A Novel Framework to Reduce Domain Shifting in Self-supervised Learning and Its Application to Children's ASR

Jun 16, 2022