PreQuant: A Task-agnostic Quantization Approach for Pre-trained Language Models

May 30, 2023


View paper on arXiv