Abstract: A common limitation of autonomous tissue manipulation in robotic minimally invasive surgery (MIS) is the absence of force sensing and control at the tool level. Recently, our team developed haptics-enabled forceps that can simultaneously measure the grasping and pulling forces during tissue manipulation. Building on this design, we present a method to automate tissue traction with controlled grasping and pulling forces. Specifically, the grasping stage relies on a controlled grasping force, while the pulling stage is guided by a controlled pulling force. Notably, during the pulling process, simultaneous control of both grasping and pulling forces is also enabled through force decoupling, allowing more precise tissue traction. The force controller is built upon a static model of tissue manipulation that considers the interaction between the haptics-enabled forceps and soft tissue. The efficacy of this force control approach is validated through a series of experiments comparing the targeted, estimated, and actual reference forces. To verify the feasibility of the proposed method in surgical applications, various tissue resections are conducted on ex vivo tissues using a dual-arm robotic setup. Finally, we discuss the benefits of multi-force control in tissue traction, evidenced by comparative analyses of the ex vivo tissue resections. The results affirm the feasibility of automatic tissue traction using micro-sized forceps with multi-force control, suggesting its potential to advance autonomous MIS. A video demonstrating the experiments can be found at https://youtu.be/8fe8o8IFrjE.
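To make the control scheme summarized above concrete, the following is a minimal sketch of one step of simultaneous grasping/pulling force control with a linear decoupling stage. It is not the paper's implementation: the decoupling matrix, the proportional control law, the gains, and all names are hypothetical placeholders standing in for the controller built on the static tissue-manipulation model.

```python
# Minimal sketch (assumed, not the authors' controller) of decoupled
# grasping/pulling force control during tissue traction.
import numpy as np

# Hypothetical decoupling matrix: maps raw sensed forces
# [f_grasp_raw, f_pull_raw] to decoupled grasping/pulling components.
DECOUPLE = np.array([[1.0, -0.15],
                     [0.0,  1.00]])

def decouple(raw_forces: np.ndarray) -> np.ndarray:
    """Remove cross-axis interference between grasping and pulling readings."""
    return DECOUPLE @ raw_forces

def control_step(raw_forces, f_grasp_ref, f_pull_ref,
                 kp_grasp=0.02, kp_pull=0.05):
    """One proportional step; returns jaw and pulling increments (mm)."""
    f_grasp, f_pull = decouple(np.asarray(raw_forces, dtype=float))
    d_jaw  = kp_grasp * (f_grasp_ref - f_grasp)   # tighten/loosen the grasp
    d_pull = kp_pull  * (f_pull_ref  - f_pull)    # advance/retract the arm
    return d_jaw, d_pull

if __name__ == "__main__":
    # Example: sensed 0.8 N grasp / 0.3 N pull, targets 1.0 N / 0.5 N.
    print(control_step([0.8, 0.3], f_grasp_ref=1.0, f_pull_ref=0.5))
```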
Abstract: Current minimally invasive surgical robots lack force sensing that is robust to temperature and electromagnetic variation while remaining compatible with micro-sized instruments. This paper presents a multi-axis force sensing module that can be integrated with micro-sized surgical instruments such as biopsy forceps. The proposed miniature sensing module mainly consists of a flexure, a camera, and a target. The deformation of the flexure is obtained from the pose variation of the top-mounted target, which is estimated by the camera using a proposed pose estimation algorithm. The external force is then estimated from the flexure's displacement and stiffness matrix. Integrating the sensing module, we further develop a pair of haptics-enabled forceps and realize multi-modal force sensing, including touching, grasping, and pulling, as the forceps manipulate tissue. To minimize unexpected sliding between the forceps' clips and the tissue, we design a micro-level actuator that drives the forceps and compensates for the motion introduced by the flexure's deformation. Finally, a series of experiments is conducted to verify the feasibility of the proposed sensing module and forceps, including an automatic robotic grasping procedure on ex vivo tissues. The results indicate that the sensing module estimates external forces accurately and that the haptics-enabled forceps can potentially realize multi-modal force sensing for task-autonomous robotic surgery. A video demonstrating the experiments can be found at https://youtu.be/4UUTT_hiFcI.
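To illustrate the sensing principle described above, the snippet below sketches the force-estimation step: the external force is recovered from the flexure's displacement via a linear stiffness model, F = K Δx. The stiffness values and the pose-difference interface are illustrative assumptions; in the paper, a calibrated stiffness matrix and the camera-based pose estimation algorithm would supply the real inputs.

```python
# Minimal sketch (assumed values) of force estimation from flexure
# deflection, following the linear model F = K @ dx.
import numpy as np

# Hypothetical 3x3 stiffness matrix (N/mm), e.g. obtained by calibration.
K = np.diag([2.5, 2.5, 8.0])

def estimate_force(rest_pose: np.ndarray, current_pose: np.ndarray) -> np.ndarray:
    """Estimate the external force from the target's translation (mm).

    rest_pose / current_pose: 3-vectors of the target position as
    recovered by the camera-based pose estimation.
    """
    dx = current_pose - rest_pose        # flexure deflection
    return K @ dx                        # linear elastic model: F = K * dx

if __name__ == "__main__":
    f = estimate_force(np.zeros(3), np.array([0.10, -0.05, 0.02]))
    print(f"Estimated force (N): {f}")
```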