
Arkil Patel

Universal Adversarial Triggers Are Not Universal

Apr 24, 2024

Evaluating In-Context Learning of Libraries for Code Generation

Nov 16, 2023

MAGNIFICo: Evaluating the In-Context Learning Ability of Large Language Models to Generalize to Novel Interpretations

Oct 18, 2023

Understanding In-Context Learning in Transformers and LLMs by Learning to Learn Discrete Functions

Oct 04, 2023

Simplicity Bias in Transformers and their Ability to Learn Sparse Boolean Functions

Nov 22, 2022

When Can Transformers Ground and Compose: Insights from Compositional Generalization Benchmarks

Oct 31, 2022

Revisiting the Compositional Generalization Abilities of Neural Sequence Models

Mar 14, 2022

Are NLP Models really able to Solve Simple Math Word Problems?

Mar 12, 2021

On the Computational Power of Transformers and Its Implications in Sequence Modeling

Jun 16, 2020