
Ningyuan Xi

Multi-Party Supervised Fine-tuning of Language Models for Multi-Party Dialogue Generation

Dec 06, 2024

Dual-Layer Training and Decoding of Large Language Model with Simultaneously Thinking and Speaking

Sep 18, 2024

Alleviating Hallucinations in Large Language Models with Scepticism Modeling

Sep 10, 2024

A Practice of Post-Training on Llama-3 70B with Optimal Selection of Additional Language Mixture Ratio

Sep 10, 2024