Keith Rush

Global Convergence of Multiplicative Updates for the Matrix Mechanism: A Collaborative Proof with Gemini 3

Mar 19, 2026

Correlated Noise Mechanisms for Differentially Private Learning

Jun 09, 2025

Communication-Efficient Language Model Training Scales Reliably and Robustly: Scaling Laws for DiLoCo

Mar 12, 2025

Streaming DiLoCo with overlapping communication: Towards a Distributed Free Lunch

Jan 30, 2025

Fine-Tuning Large Language Models with User-Level Differential Privacy

Jul 10, 2024

Cascade-Aware Training of Language Models

May 29, 2024

FAX: Scalable and Differentiable Federated Primitives in JAX

Mar 11, 2024

(Amplified) Banded Matrix Factorization: A unified approach to private training

Jun 13, 2023

Convergence of Gradient Descent with Linearly Correlated Noise and Applications to Differentially Private Learning

Feb 02, 2023

Federated Automatic Differentiation

Jan 18, 2023