Jackson Petty

How Does Code Pretraining Affect Language Model Task Performance?
Sep 06, 2024

The Illusion of State in State-Space Models
Apr 12, 2024

GPQA: A Graduate-Level Google-Proof Q&A Benchmark
Nov 20, 2023

Debate Helps Supervise Unreliable Experts
Nov 15, 2023

In-context Learning Generalizes, But Not Always Robustly: The Case of Syntax
Nov 13, 2023

How Abstract Is Linguistic Generalization in Large Language Models? Experiments with Argument Structure
Nov 08, 2023

The Impact of Depth and Width on Transformer Language Model Generalization
Oct 30, 2023

(QA)$^2$: Question Answering with Questionable Assumptions
Dec 20, 2022

Do Language Models Learn Position-Role Mappings?
Feb 08, 2022

Transformers Generalize Linearly
Sep 24, 2021