
Jinuk Kim

LayerMerge: Neural Network Depth Compression through Layer Pruning and Merging

Jun 18, 2024

HyperCLOVA X Technical Report

Apr 13, 2024

Efficient Latency-Aware CNN Depth Compression via Two-Stage Dynamic Programming

Jan 28, 2023

Dataset Condensation via Efficient Synthetic-Data Parameterization

Jun 02, 2022

What Changes Can Large-scale Language Models Bring? Intensive Study on HyperCLOVA: Billions-scale Korean Generative Pretrained Transformers

Sep 10, 2021