Kale-ab Tessera

How much can change in a year? Revisiting Evaluation in Multi-Agent Reinforcement Learning

Dec 13, 2023

Efficiently Quantifying Individual Agent Importance in Cooperative MARL

Dec 13, 2023

Generalisable Agents for Neural Network Optimisation

Nov 30, 2023

Are we going MAD? Benchmarking Multi-Agent Debate between Language Models for Medical Q&A

Nov 29, 2023

Reduce, Reuse, Recycle: Selective Reincarnation in Multi-Agent Reinforcement Learning

Mar 31, 2023

On pseudo-absence generation and machine learning for locust breeding ground prediction in Africa

Nov 06, 2021

Mava: a research framework for distributed multi-agent reinforcement learning

Jul 03, 2021

Keep the Gradients Flowing: Using Gradient Flow to Study Sparse Network Optimization

Feb 02, 2021