Kuluhan Binici

MEDSAGE: Enhancing Robustness of Medical Dialogue Summarization to ASR Errors with LLM-generated Synthetic Dialogues

Aug 26, 2024

Condensed Sample-Guided Model Inversion for Knowledge Distillation

Aug 25, 2024

LLMs are not Zero-Shot Reasoners for Biomedical Information Extraction

Aug 22, 2024

Generalizing Teacher Networks for Effective Knowledge Distillation Across Student Architectures

Jul 22, 2024

CRISP: Hybrid Structured Sparsity for Class-aware Model Pruning

Nov 24, 2023

Visual-Policy Learning through Multi-Camera View to Single-Camera View Knowledge Distillation for Robot Manipulation Tasks

Mar 13, 2023

Robust and Resource-Efficient Data-Free Knowledge Distillation by Generative Pseudo Replay

Jan 09, 2022

Preventing Catastrophic Forgetting and Distribution Mismatch in Knowledge Distillation via Synthetic Data

Aug 11, 2021