
Jihoon Ko

TensorCodec: Compact Lossy Compression of Tensors without Strong Data Assumptions

Sep 20, 2023

NeuKron: Constant-Size Lossy Compression of Sparse Reorderable Matrices and Tensors

Feb 09, 2023

BeGin: Extensive Benchmark Scenarios and An Easy-to-use Framework for Graph Continual Learning

Nov 26, 2022

Effective Training Strategies for Deep-learning-based Precipitation Nowcasting and Estimation

Feb 17, 2022

Learning to Pool in Graph Neural Networks for Extrapolation

Jun 11, 2021

MONSTOR: An Inductive Approach for Estimating and Maximizing Influence over Unseen Social Networks

Jan 24, 2020