
Skander Moalla

Investigating Low-Rank Training in Transformer Language Models: Efficiency and Scaling Analysis

Jul 13, 2024

Building on Efficient Foundations: Effectively Training LLMs with Structured Feedforward Layers

Jun 24, 2024

No Representation, No Trust: Connecting Representation, Collapse, and Trust Issues in PPO

May 01, 2024

SMACv2: An Improved Benchmark for Cooperative Multi-Agent Reinforcement Learning

Dec 14, 2022