Kees Vissers

EcoFlow: Efficient Convolutional Dataflows for Low-Power Neural Network Accelerators
Feb 04, 2022

FAT: Training Neural Networks for Reliable Inference Under Hardware Faults
Nov 11, 2020

Efficient Error-Tolerant Quantized Neural Network Accelerators
Dec 16, 2019

Synetgy: Algorithm-hardware Co-design for ConvNet Accelerators on Embedded FPGAs
Nov 21, 2018

Scaling Binarized Neural Networks on Reconfigurable Logic
Jan 27, 2017

FINN: A Framework for Fast, Scalable Binarized Neural Network Inference
Dec 01, 2016