
Matthias Bethge

Object segmentation from common fate: Motion energy processing enables human-like zero-shot generalization to random dot stimuli

Nov 03, 2024

Centaur: a foundation model of human cognition

Oct 26, 2024

In Search of Forgotten Domain Generalization

Oct 10, 2024

Adaptation Odyssey in LLMs: Why Does Additional Pretraining Sometimes Fail to Improve?

Oct 08, 2024

A Practitioner's Guide to Continual Multimodal Pretraining

Aug 26, 2024

Reflecting on the State of Rehearsal-free Continual Learning with Pretrained Models

Jun 13, 2024

Identifying latent state transition in non-linear dynamical systems

Jun 06, 2024

The Entropy Enigma: Success and Failure of Entropy Minimization

May 08, 2024

Wu's Method can Boost Symbolic AI to Rival Silver Medalists and AlphaGeometry to Outperform Gold Medalists at IMO Geometry

Apr 11, 2024

No "Zero-Shot" Without Exponential Data: Pretraining Concept Frequency Determines Multimodal Model Performance

Apr 08, 2024