Uni-MoE: Scaling Unified Multimodal LLMs with Mixture of Experts

May 18, 2024


View paper on arXiv