
Vladimír Boža

Two Sparse Matrices are Better than One: Sparsifying Neural Networks with Double Sparse Factorization

Sep 27, 2024

Fast and Optimal Weight Update for Pruned Large Language Models

Jan 01, 2024

Merging of neural networks

Apr 21, 2022

Dynamic Pooling Improves Nanopore Base Calling Accuracy

May 16, 2021

Nanopore Base Calling on the Edge

Nov 09, 2020