T. Patrick Xiao

Analog Bayesian neural networks are insensitive to the shape of the weight distribution

Jan 09, 2025

Analog fast Fourier transforms for scalable and efficient signal processing

Sep 27, 2024

An out-of-distribution discriminator based on Bayesian neural network epistemic uncertainty

Oct 18, 2022

Shape-Dependent Multi-Weight Magnetic Artificial Synapses for Neuromorphic Computing

Nov 22, 2021

On the Accuracy of Analog Neural Network Inference Accelerators

Sep 12, 2021

Device-aware inference operations in SONOS nonvolatile memory arrays

Apr 02, 2020

Plasticity-Enhanced Domain-Wall MTJ Neural Networks for Energy-Efficient Online Learning

Mar 04, 2020

Evaluating complexity and resilience trade-offs in emerging memory inference machines

Feb 25, 2020