Rodolphe Jenatton

CMAP

Pi-DUAL: Using Privileged Information to Distinguish Clean from Noisy Labels

Oct 10, 2023

Three Towers: Flexible Contrastive Learning with Pretrained Image Models

May 29, 2023

When does Privileged Information Explain Away Label Noise?

Mar 03, 2023

Scaling Vision Transformers to 22 Billion Parameters

Feb 10, 2023

Massively Scaling Heteroscedastic Classifiers

Jan 30, 2023

On the Adversarial Robustness of Mixture of Experts

Oct 19, 2022

Plex: Towards Reliability using Pretrained Large Model Extensions

Jul 15, 2022

Multimodal Contrastive Learning with LIMoE: the Language-Image Mixture of Experts

Jun 06, 2022

Transfer and Marginalize: Explaining Away Label Noise with Privileged Information

Feb 18, 2022

Predicting the utility of search spaces for black-box optimization: a simple, budget-aware approach

Dec 16, 2021