
Xiaoyu Liu

Guangxi Normal University, China

Diffusion-VLA: Scaling Robot Foundation Models via Unified Diffusion and Autoregression

Dec 04, 2024

Ensuring Safety and Trust: Analyzing the Risks of Large Language Models in Medicine

Nov 20, 2024

FANCL: Feature-Guided Attention Network with Curriculum Learning for Brain Metastases Segmentation

Oct 29, 2024

Predicting 30-Day Hospital Readmission in Medicare Patients: Insights from an LSTM Deep Learning Model

Oct 23, 2024

Single-stage TTS with Masked Audio Token Modeling and Semantic Knowledge Distillation

Sep 17, 2024

Joint Semantic Knowledge Distillation and Masked Acoustic Modeling for Full-band Speech Restoration with Improved Intelligibility

Sep 14, 2024

U-MedSAM: Uncertainty-aware MedSAM for Medical Image Segmentation

Aug 03, 2024

Deep Mutual Learning among Partially Labeled Datasets for Multi-Organ Segmentation

Jul 17, 2024

Multi-Granularity Semantic Revision for Large Language Model Distillation

Jul 14, 2024

Multi-Stage Balanced Distillation: Addressing Long-Tail Challenges in Sequence-Level Knowledge Distillation

Jun 19, 2024