Yi Tay

Vibe-Eval: A hard evaluation suite for measuring progress of multimodal language models

May 03, 2024

Reka Core, Flash, and Edge: A Series of Powerful Multimodal Language Models

Apr 18, 2024

PaLI-X: On Scaling up a Multilingual Vision and Language Model

May 29, 2023

PaLM 2 Technical Report

May 17, 2023

Symbol tuning improves in-context learning in language models

May 15, 2023

Recommender Systems with Generative Retrieval

May 08, 2023

UniMax: Fairer and more Effective Language Sampling for Large-Scale Multilingual Pretraining

Apr 18, 2023

CoLT5: Faster Long-Range Transformers with Conditional Computation

Mar 17, 2023

Larger language models do in-context learning differently

Mar 08, 2023

The Flan Collection: Designing Data and Methods for Effective Instruction Tuning

Feb 14, 2023