Manipulating objects with robotic hands is a challenging task: not only the pose of the robot's end effector but also the fingers of the hand must be controlled and coordinated. Human demonstrations of movements are an intuitive and data-efficient way to guide a robot's behavior. We propose a modular framework with an automatic embodiment mapping that transfers human hand motions, recorded via motion capture, to robotic systems. We evaluate our approach on eight challenging tasks in which a robotic arm with a mounted robotic hand must grasp and manipulate deformable objects or small, fragile material.