André Bourdoux

Distributed PMCW Radar Network in Presence of Phase Noise

May 15, 2024

Active Inference in Hebbian Learning Networks

Jun 22, 2023

FMCW Radar Sensing for Indoor Drones Using Learned Representations

Jan 06, 2023

Fusing Event-based Camera and Radar for SLAM Using Spiking Neural Networks with Continual STDP Learning

Oct 09, 2022

Learning to SLAM on the Fly in Unknown Environments: A Continual Learning Approach for Drones in Visually Ambiguous Scenes

Aug 27, 2022

Continuously Learning to Detect People on the Fly: A Bio-inspired Visual System for Drones

Feb 20, 2022

Learning Event-based Spatio-Temporal Feature Descriptors via Local Synaptic Plasticity: A Biologically-realistic Perspective of Computer Vision

Nov 04, 2021

Fail-Safe Human Detection for Drones Using a Multi-Modal Curriculum Learning Approach

Sep 28, 2021

A 2-μJ, 12-class, 91% Accuracy Spiking Neural Network Approach For Radar Gesture Recognition

Aug 24, 2021

A Low-Complexity Radar Detector Outperforming OS-CFAR for Indoor Drone Obstacle Avoidance

Jul 15, 2021