HyunJin Kim

PEMA: Plug-in External Memory Adaptation for Language Models

Nov 14, 2023

CTMQ: Cyclic Training of Convolutional Neural Networks with Multiple Quantization Steps

Jun 26, 2022

PLAM: a Posit Logarithm-Approximate Multiplier for Power Efficient Posit-based DNNs

Feb 18, 2021

Effects of Approximate Multiplication on Convolutional Neural Networks

Jul 20, 2020