Pierre H. Richemond

The Edge of Orthogonality: A Simple View of What Makes BYOL Tick

Feb 09, 2023

SemPPL: Predicting pseudo-labels for better contrastive representations

Jan 12, 2023

Continuous diffusion for categorical data

Dec 15, 2022

Categorical SDEs with Simplex Diffusion

Oct 26, 2022

Zipfian environments for Reinforcement Learning

Mar 15, 2022

BYOL works even without batch statistics

Oct 20, 2020

Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning

Jun 13, 2020

Biologically inspired architectures for sample-efficient deep reinforcement learning

Nov 25, 2019

Static Activation Function Normalization

May 03, 2019

Combining learning rate decay and weight decay with complexity gradient descent - Part I

Feb 07, 2019