
Yiqun Yao

Sketch: A Toolkit for Streamlining LLM Operations
Sep 05, 2024

Open-domain Implicit Format Control for Large Language Model Generation
Aug 08, 2024

52B to 1T: Lessons Learned via Tele-FLM Series
Jul 03, 2024

Tele-FLM Technical Report
Apr 25, 2024

CatCode: A Comprehensive Evaluation Framework for LLMs On the Mixture of Code and Text
Mar 04, 2024

FLM-101B: An Open LLM and How to Train It with $100K Budget
Sep 17, 2023

2x Faster Language Model Pre-training via Masked Structural Growth
May 04, 2023

Research without Re-search: Maximal Update Parametrization Yields Accurate Loss Prediction across Scales
Apr 29, 2023

MUSER: MUltimodal Stress Detection using Emotion Recognition as an Auxiliary Task
May 17, 2021

Concept Learning through Deep Reinforcement Learning with Memory-Augmented Neural Networks
Nov 15, 2018