Michael Diskin

A critical look at the evaluation of GNNs under heterophily: are we really making progress?

Feb 22, 2023

SWARM Parallelism: Training Large Models Can Be Surprisingly Communication-Efficient

Jan 27, 2023

Training Transformers Together

Jul 07, 2022

Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees

Oct 07, 2021

Secure Distributed Training at Scale

Jun 21, 2021

Distributed Deep Learning in Open Collaborations

Jun 18, 2021