Abstract: Depth estimation with a single-photon LiDAR is often performed with a matched filter, which is, however, error-prone in the presence of background noise. A commonly used technique to reject background noise is the rank-ordered mean (ROM) filter previously reported by Shin \textit{et al.} (2015). ROM rejects noisy photon arrival timestamps by retaining only a small range of timestamps around the median within each pixel's local neighborhood. Despite the promising performance of ROM, its theoretical performance limit is unknown. In this paper, we theoretically characterize the ROM performance by showing that ROM fails when the reflectivity drops below a threshold predetermined by the depth and the signal-to-background ratio, and that its accuracy undergoes a phase transition at this cutoff. Based on our theory, we propose an improved signal extraction technique that selects tight timestamp clusters. Experimental results show that the proposed algorithm improves depth estimation performance over ROM by three orders of magnitude at the same signal intensities, and achieves high image fidelity at noise levels as high as 17 times that of the signal.
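The core ROM idea can be sketched as follows: pool the timestamps from a pixel's spatial neighborhood, average only the central order statistics, and keep the pixel's own timestamps that fall within a window of that rank-ordered mean. The sketch below is illustrative only; the `trim` fraction and `window` width are assumed tuning parameters, not values from Shin et al. (2015).

```python
import numpy as np

def rom_filter(timestamps, pixel, neighbors, trim=0.25, window=2e-9):
    """Rank-ordered-mean rejection for one pixel (illustrative sketch).

    timestamps: dict mapping pixel index -> 1-D array of photon arrival
    times in seconds; trim and window are assumed tuning parameters.
    """
    # Pool timestamps from the pixel's spatial neighborhood.
    pool = np.sort(np.concatenate([timestamps[p] for p in neighbors]))
    # Rank-ordered mean: average only the central order statistics,
    # discarding a trim fraction from each tail to suppress background.
    lo = int(trim * pool.size)
    rom = pool[lo:pool.size - lo].mean()
    # Keep only this pixel's timestamps near the ROM value; background
    # detections, being uniform in time, are mostly rejected.
    ts = timestamps[pixel]
    return ts[np.abs(ts - rom) <= window]
```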
Abstract: In single-photon light detection and ranging (SP-LiDAR) systems, the histogram distortion due to hardware dead time fundamentally limits the precision of depth estimation. To compensate for dead time effects, the photon registration distribution is typically modeled as a Markov-chain self-excitation process. However, this model is discrete and computationally expensive, which hinders potential neural network applications and fast simulation. In this paper, we overcome the modeling challenge by proposing a continuous parametric model. We introduce a Gaussian-uniform mixture model (GUMM) and periodic padding to address high noise floors and noise slopes, respectively. By deriving and implementing a customized expectation-maximization (EM) algorithm, we achieve accurate histogram matching in scenarios that were deemed difficult in the literature.
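For concreteness, a minimal EM sketch for a Gaussian-uniform mixture on photon timestamps is given below. It implements only the generic GUMM updates; the paper's customized EM, the periodic padding step, and the noise-slope handling are not reproduced here, and all parameter values are assumptions.

```python
import numpy as np

def gumm_em(t, T, iters=50, sigma=1e-9, w=0.5):
    """Generic EM for a Gaussian-uniform mixture model (GUMM) on
    photon timestamps t in [0, T); an illustrative sketch only."""
    mu = np.median(t)  # crude initialization of the pulse location
    for _ in range(iters):
        # E-step: responsibility of the Gaussian (signal) component.
        g = w * np.exp(-0.5 * ((t - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
        u = (1.0 - w) / T  # uniform (background) component density
        r = g / (g + u)
        # M-step: update mixing weight and Gaussian parameters.
        w = r.mean()
        mu = np.sum(r * t) / np.sum(r)
        sigma = np.sqrt(np.sum(r * (t - mu) ** 2) / np.sum(r))
    return mu, sigma, w
```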
Abstract: Single-photon Light Detection and Ranging (LiDAR) systems are often equipped with an array of detectors for improved spatial resolution and sensing speed. However, given a fixed amount of flux produced by the laser transmitter across the scene, the per-pixel Signal-to-Noise Ratio (SNR) decreases as more pixels are packed into a unit space. This presents a fundamental trade-off between the spatial resolution of the sensor array and the SNR received at each pixel. In this paper, we theoretically characterize this fundamental limit. By analyzing the photon arrival statistics and introducing a series of new approximation techniques, we derive the Mean Squared Error (MSE) of the maximum-likelihood estimator of the time delay. The theoretical predictions align well with simulations and real data.
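As an illustration of how such an MSE prediction could be checked empirically, the following Monte-Carlo sketch simulates photon arrivals as a Gaussian pulse plus a uniform background and computes the empirical MSE of a grid-search maximum-likelihood delay estimate. All waveform and flux parameters are assumed, and the estimator is a generic ML grid search, not necessarily the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_mse(tau=5e-9, T=100e-9, sigma=1e-9, n_sig=20, n_bkg=20, trials=1000):
    """Monte-Carlo MSE of a grid-search ML time-delay estimate (sketch).
    Signal photons ~ N(tau, sigma^2); background photons ~ Uniform[0, T)."""
    grid = np.linspace(0.0, T, 2001)   # candidate time delays
    p_sig = n_sig / (n_sig + n_bkg)    # signal fraction (assumed known)
    se = np.empty(trials)
    for i in range(trials):
        t = np.concatenate([rng.normal(tau, sigma, n_sig),
                            rng.uniform(0.0, T, n_bkg)])
        # Mixture density of every photon under every candidate delay.
        dens = (p_sig * np.exp(-0.5 * ((t[:, None] - grid) / sigma) ** 2)
                / (sigma * np.sqrt(2.0 * np.pi)) + (1.0 - p_sig) / T)
        tau_hat = grid[np.argmax(np.log(dens).sum(axis=0))]
        se[i] = (tau_hat - tau) ** 2
    return se.mean()
```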
Abstract: We present Etymo (https://etymo.io), a discovery engine to facilitate artificial intelligence (AI) research and development. It aims to help readers navigate the large number of AI-related papers published every week by using a novel form of search that finds relevant papers and displays related papers in a graphical interface. Etymo constructs and maintains an adaptive similarity-based network of research papers as an all-purpose knowledge graph for ranking, recommendation, and visualisation. The network is constantly evolving and can learn from user feedback to adjust itself.
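The abstract does not specify how the similarity network is constructed; as a purely hypothetical sketch, one could build a k-nearest-neighbour graph over TF-IDF representations of paper abstracts, as below. The feature choice, the similarity measure, and k are all assumptions, not Etymo's actual pipeline.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def build_similarity_graph(abstracts, k=5):
    """k-nearest-neighbour similarity graph over paper abstracts
    (hypothetical sketch; not Etymo's actual feature pipeline)."""
    # TF-IDF vectors as a stand-in document representation.
    vecs = TfidfVectorizer(stop_words="english").fit_transform(abstracts)
    sim = cosine_similarity(vecs)
    np.fill_diagonal(sim, 0.0)  # ignore self-similarity
    # Edge list: each paper links to its k most similar papers.
    return {i: list(np.argsort(sim[i])[::-1][:k]) for i in range(len(abstracts))}
```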