Ji Won Yoon

Speed-up of Data Analysis with Kernel Trick in Encrypted Domain
Jun 14, 2024

EM-Network: Oracle Guided Self-distillation for Sequence Learning
Jun 14, 2023

MCR-Data2vec 2.0: Improving Self-supervised Speech Pre-training via Model-level Consistency Regularization
Jun 14, 2023

Development of deep biological ages aware of morbidity and mortality based on unsupervised and semi-supervised deep learning approaches
Feb 01, 2023

Inter-KD: Intermediate Knowledge Distillation for CTC-Based Automatic Speech Recognition
Nov 28, 2022

HuBERT-EE: Early Exiting HuBERT for Efficient Speech Recognition
Apr 13, 2022

Oracle Teacher: Towards Better Knowledge Distillation
Nov 05, 2021

Speech Intention Understanding in a Head-final Language: A Disambiguation Utilizing Intonation-dependency
Nov 10, 2018

An Efficient Model Selection for Gaussian Mixture Model in a Bayesian Framework
Jul 03, 2013

Statistical Denoising for single molecule fluorescence microscopic images
Jun 07, 2013