Abstract: Affordance detection from visual input is a fundamental step in autonomous robotic manipulation. Existing solutions to the problem of affordance detection rely on convolutional neural networks. However, these networks do not consider the spatial arrangement of the input data and miss parts-to-whole relationships. Therefore, they fall short when confronted with novel, previously unseen object instances or new viewpoints. One solution to overcome such limitations is to resort to capsule networks. In this paper, we introduce the first affordance detection network based on dynamic tree-structured capsules for sparse 3D point clouds. We show that our capsule-based network outperforms current state-of-the-art models in viewpoint invariance and parts-segmentation of new object instances on a novel dataset used exclusively for evaluation, which is publicly available at github.com/gipfelen/DTCG-Net. In the experimental evaluation we show that our algorithm outperforms current affordance detection methods when grasping previously unseen objects, thanks to our capsule network enforcing a parts-to-whole representation.
Abstract: In this paper we present the Software Testing, AI and Robotics (STAIR) Learning Lab. STAIR is an initiative started at the University of Innsbruck to bring robotics, Artificial Intelligence (AI) and software testing into schools. In the lab, physical and virtual learning units are developed in parallel and in sync with each other. Its core learning approach is based on the development of both a physical and a simulated robotics environment. In both environments, AI scenarios (such as traffic sign recognition) are deployed and tested. We present and focus on our newly designed MiniBot, which is built on hardware designed for educational and research purposes, as well as the simulation environment. Additionally, we describe first learning design concepts and a showcase scenario (i.e., AI-based traffic sign recognition) with different exercises which can easily be extended.