
Prashant Bhat

Mitigating Interference in the Knowledge Continuum through Attention-Guided Incremental Learning

May 22, 2024

IMEX-Reg: Implicit-Explicit Regularization in the Function Space for Continual Learning

Apr 28, 2024

TriRE: A Multi-Mechanism Learning Paradigm for Continual Knowledge Retention and Promotion

Oct 12, 2023

BiRT: Bio-inspired Replay in Vision Transformers for Continual Learning

May 08, 2023

Task-Aware Information Routing from Common Representation Space in Lifelong Learning

Feb 14, 2023

Task Agnostic Representation Consolidation: a Self-supervised based Continual Learning Approach

Jul 13, 2022

Consistency is the key to further mitigating catastrophic forgetting in continual learning

Jul 11, 2022

Distill on the Go: Online knowledge distillation in self-supervised learning

Apr 20, 2021