Michal Lukasik

Metric-aware LLM inference
Mar 07, 2024

It's an Alignment, Not a Trade-off: Revisiting Bias and Variance in Deep Models
Oct 13, 2023

What do larger image classifiers memorise?
Oct 09, 2023

ResMem: Learn what you can and memorize the rest
Feb 03, 2023

Large Language Models with Controllable Working Memory
Nov 09, 2022

Preserving In-Context Learning ability in Large Language Model Fine-tuning
Nov 01, 2022

Robust Distillation for Worst-class Performance
Jun 13, 2022

HD-cos Networks: Efficient Neural Architectures for Secure Multi-Party Computation
Oct 28, 2021

Leveraging redundancy in attention with Reuse Transformers
Oct 13, 2021

Teacher's pet: understanding and mitigating biases in distillation
Jul 08, 2021