
Samuel Horvath

Double Momentum and Error Feedback for Clipping with Fast Rates and Differential Privacy

Feb 17, 2025

Aequa: Fair Model Rewards in Collaborative Learning via Slimmable Networks

Feb 07, 2025

CYCle: Choosing Your Collaborators Wisely to Enhance Collaborative Fairness in Decentralized Learning

Jan 21, 2025

Initialization using Update Approximation is a Silver Bullet for Extremely Efficient Low-Rank Fine-Tuning

Nov 29, 2024

FedPeWS: Personalized Warmup via Subnetworks for Enhanced Heterogeneous Federated Learning

Oct 03, 2024

Redefining Contributions: Shapley-Driven Federated Learning

Jun 01, 2024

Remove that Square Root: A New Efficient Scale-Invariant Version of AdaGrad

Mar 05, 2024

Rethink Model Re-Basin and the Linear Mode Connectivity

Feb 05, 2024

Efficient Conformal Prediction under Data Heterogeneity

Dec 25, 2023

Handling Data Heterogeneity via Architectural Design for Federated Visual Recognition

Oct 23, 2023