
Azalia Mirhoseini

Synthetic Data Generation & Multi-Step RL for Reasoning & Tool Use

Apr 07, 2025

Reasoning-SQL: Reinforcement Learning with SQL Tailored Partial Rewards for Reasoning-Enhanced Text-to-SQL

Apr 01, 2025

How Do Large Language Monkeys Get Their Power (Laws)?

Feb 24, 2025

CodeMonkeys: Scaling Test-Time Compute for Software Engineering

Jan 24, 2025

That Chip Has Sailed: A Critique of Unfounded Skepticism Around AI for Chip Design

Nov 15, 2024

Large Language Monkeys: Scaling Inference Compute with Repeated Sampling

Jul 31, 2024

Training of Physical Neural Networks

Jun 05, 2024

CHESS: Contextual Harnessing for Efficient SQL Synthesis

May 27, 2024

CATS: Contextually-Aware Thresholding for Sparsity in Large Language Models

Apr 12, 2024

Hydragen: High-Throughput LLM Inference with Shared Prefixes

Feb 07, 2024