Abstract: We present three biometric datasets (iCarB-Face, iCarB-Fingerprint, iCarB-Voice) containing face videos, fingerprint images, and voice samples, collected inside a car from 200 consenting volunteers. The data was acquired using a near-infrared camera, two fingerprint scanners, and two microphones, while the volunteers were seated in the driver's seat of the car. The data collection took place while the car was parked both indoors and outdoors, and different "noises" were added to simulate the non-ideal biometric data capture that may be encountered in real-life driver recognition. Although the datasets are specifically tailored to in-vehicle biometric recognition, their utility is not limited to the automotive environment. The iCarB datasets, which are available to the research community, can be used to: (i) evaluate and benchmark face, fingerprint, and voice recognition systems (we provide several evaluation protocols); (ii) create multimodal pseudo-identities to train/test multimodal fusion algorithms; (iii) create Presentation Attacks from the biometric data to evaluate Presentation Attack Detection algorithms; (iv) investigate demographic and environmental biases in biometric systems, using the provided metadata. To the best of our knowledge, ours are the largest and most diverse publicly available in-vehicle biometric datasets. Most other datasets contain only one biometric modality (usually face), whereas our datasets consist of three modalities, all acquired in the same automotive environment. Moreover, iCarB-Fingerprint appears to be the first publicly available in-vehicle fingerprint dataset. Finally, the iCarB datasets offer a rare level of demographic diversity among the 200 data subjects, including a 50/50 gender split, skin colours spanning the whole Fitzpatrick scale, and a wide age range (18-60+). These datasets will therefore be valuable for advancing biometrics research.
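As a rough illustration of use (i) above, the sketch below computes a False Match Rate (FMR) and a False Non-Match Rate (FNMR) at a fixed decision threshold from lists of genuine and impostor comparison scores. The scores, the threshold, and the helper name rateAbove are invented for this example and are not taken from the iCarB evaluation protocols.

```cpp
// Illustrative only: a minimal sketch of verification-style benchmarking
// (FMR / FNMR at a fixed threshold). The scores and the threshold below
// are hypothetical and are not part of the iCarB datasets or protocols.
#include <iostream>
#include <vector>

// Fraction of scores at or above the decision threshold.
double rateAbove(const std::vector<double>& scores, double threshold)
{
    std::size_t count = 0;
    for (double s : scores)
        if (s >= threshold) ++count;
    return static_cast<double>(count) / scores.size();
}

int main()
{
    std::vector<double> genuine  = {0.91, 0.84, 0.77, 0.95, 0.62};  // same-identity comparison scores
    std::vector<double> impostor = {0.12, 0.35, 0.72, 0.08, 0.55};  // cross-identity comparison scores

    double threshold = 0.70;
    double fmr  = rateAbove(impostor, threshold);        // impostors wrongly accepted
    double fnmr = 1.0 - rateAbove(genuine, threshold);   // genuine users wrongly rejected

    std::cout << "FMR @ "  << threshold << " = " << fmr  << "\n"
              << "FNMR @ " << threshold << " = " << fnmr << "\n";
    return 0;
}
```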
Abstract: Geometry is a fundamental part of robotics, and various representation frameworks have been proposed over the years. Recently, geometric algebra has gained attention for its ability to unify many of these earlier representations in a single algebra. While efficient open-source implementations of geometric algebra already exist, none of them is targeted at robotics applications. We address this shortcoming with our library gafro. This article presents an overview of the implementation details as well as a tutorial of gafro, an efficient C++ library targeting robotics applications using geometric algebra. The library focuses on conformal geometric algebra, so various geometric primitives as well as rigid body transformations are available for computation. The modelling of robotic systems is another important aspect of the library: it implements various algorithms for computing the kinematics and dynamics of such systems, as well as objectives for optimisation problems. The software stack is completed by Python bindings in pygafro and a ROS interface in gafro_ros.
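For readers unfamiliar with conformal geometric algebra (CGA), the sketch below works the standard conformal-point embedding and its inner-product distance identity out by hand, using the usual basis e1, e2, e3, e_inf, e_0. It is a minimal illustration of the mathematics such a library builds on; it is not the gafro API, and all type and function names in it are hypothetical.

```cpp
// Illustrative only: the conformal-point embedding underlying CGA, written
// out by hand. This is NOT the gafro API; it is a plain sketch of the maths.
#include <array>
#include <cmath>
#include <iostream>

// Coefficients of a conformal point P = x*e1 + y*e2 + z*e3 + 0.5*|x|^2*e_inf + e_0.
struct ConformalPoint
{
    std::array<double, 3> euclidean;  // e1, e2, e3 coefficients
    double e_inf;                     // coefficient of the point at infinity
    double e_0;                       // coefficient of the origin
};

// Embed a Euclidean point into the conformal model.
ConformalPoint embed(double x, double y, double z)
{
    double sq = x * x + y * y + z * z;
    return {{x, y, z}, 0.5 * sq, 1.0};
}

// Inner product of two conformal points, using the metric
// e_inf . e_0 = -1 and e_inf . e_inf = e_0 . e_0 = 0.
// For embedded points this equals -0.5 * squared Euclidean distance.
double inner(const ConformalPoint& a, const ConformalPoint& b)
{
    double euclidean = a.euclidean[0] * b.euclidean[0]
                     + a.euclidean[1] * b.euclidean[1]
                     + a.euclidean[2] * b.euclidean[2];
    return euclidean - a.e_inf * b.e_0 - a.e_0 * b.e_inf;
}

int main()
{
    ConformalPoint p1 = embed(1.0, 2.0, 3.0);
    ConformalPoint p2 = embed(4.0, 6.0, 3.0);

    double distance = std::sqrt(-2.0 * inner(p1, p2));  // recover the Euclidean distance
    std::cout << "distance = " << distance << "\n";     // expected: 5
    return 0;
}
```

The fact that the inner product of two embedded points directly encodes their squared Euclidean distance is one reason CGA is convenient for representing geometric primitives and rigid body transformations.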
Abstract: Programming methods for industrial robots are time-consuming and often require operators to have knowledge of both robotics and programming. To reduce the costs associated with reprogramming, various augmented reality interfaces have recently been proposed to provide users with more intuitive means of controlling robots in real time and programming them without having to write code. However, most solutions require the operator to be close to the real robot's workspace, which implies either removing the robot from the production line or shutting down the whole production line because of safety hazards. We propose a novel augmented reality interface that gives users the ability to model a virtual representation of a workspace, which can be saved and reused to program new tasks or adapt old ones without being co-located with the real robot. As in previous interfaces, operators can then program robot tasks or control the robot in real time by manipulating a virtual robot. We evaluate the intuitiveness and usability of the proposed interface in a user study in which 18 participants programmed a robot manipulator for a disassembly task.