Jonas Tebbe, Yapeng Gao
The chair is equipped with a KUKA Agilus robot with 6 degrees of freedom. Motivated by the KUKA commercial (https://www.youtube.com/watch?v=tIIJME8-au8), we are teaching the robot to play table tennis.
A high-speed vision system was set up to identify a table tennis ball in the scene and infer its 3D position. To find the ball in an image, we use three filters which take the ball's movement, color, and shape into account. Its 3D position is estimated by triangulation using the ball's pixel coordinates in both camera frames. To speed up the image processing, a region-of-interest method is employed if a ball was found in the previous frame. This is schematically shown in the figure above. Based on the position information, the trajectory state is estimated by an extended Kalman filter and predicted into the future using a standard aerodynamic force model.
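The triangulation step can be sketched with the standard linear (DLT) method: given the two calibrated cameras' projection matrices and the ball's pixel coordinates in both frames, the 3D position is the least-squares solution of a small homogeneous system. This is a minimal illustration of the general technique, not the project's actual implementation; the matrix names are placeholders.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one ball position from two
    calibrated cameras. P1, P2 are the 3x4 projection matrices and
    uv1, uv2 the ball's pixel coordinates in each camera frame."""
    A = np.stack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    # The homogeneous solution is the right singular vector belonging
    # to the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize to a 3D point
```

With noisy detections the same system is solved in a least-squares sense, so the estimate degrades gracefully rather than failing outright.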
The spin influences the trajectory of the ball via the Magnus force. Thus, it is beneficial to consider spin information in order to improve the trajectory prediction. A high-speed monocular vision system tracks the spin of the flying table tennis ball. It uses the brand logo imprinted on the ball. The brand vectors are fitted into a plane using RANSAC. The rotation axis corresponds to the normal vector of this plane. The angular speed is calculated by dividing the angle between two adjacent vectors by the elapsed time.
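The geometric idea can be sketched as follows: as the ball rotates, the tips of the observed logo vectors trace a circle, so a RANSAC plane fit to those tips yields the rotation axis as the plane normal, and the angle between adjacent vectors over the frame interval gives the angular speed. This is a sketch of the idea under those assumptions; parameter names and thresholds are hypothetical, not the authors' implementation.

```python
import numpy as np

def fit_rotation_axis(tips, n_iter=200, thresh=0.01, seed=None):
    """RANSAC plane fit to the tips of the logo direction vectors.
    The rotating logo's tip positions lie on a circle; the normal of
    the fitted plane is the ball's rotation axis (up to sign)."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(tips, float)
    best_n, best_count = None, -1
    for _ in range(n_iter):
        i, j, k = rng.choice(len(pts), size=3, replace=False)
        n = np.cross(pts[j] - pts[i], pts[k] - pts[i])
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue  # degenerate sample: nearly collinear points
        n /= norm
        # Inliers lie close to the hypothesized plane.
        count = np.sum(np.abs((pts - pts[i]) @ n) < thresh)
        if count > best_count:
            best_count, best_n = count, n
    return best_n

def angular_speed(v1, v2, dt):
    """Angle between two adjacent logo vectors divided by the elapsed
    time between the two frames."""
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.arccos(np.clip(cos, -1.0, 1.0)) / dt
```

At a few hundred frames per second, the per-frame rotation stays small enough that the angle between adjacent vectors is unambiguous.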
The robot is controlled by the KR C4 controller using the KRL programming language. We can establish a connection to the controller PC over the network using the KUKA EthernetKRL package. We use approximated motions, which allow the trajectory of the robot to change before it arrives at the initial target. Thus we are able to send an initial guess of the hitting point ahead of time and then send corrections with more precise information about the hitting point as we get to see more of the actual trajectory of the ball.
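On the PC side, EthernetKRL exchanges XML messages over TCP with the controller; the exact element and attribute names are defined in the EKI configuration on the controller. The sketch below shows the general pattern of streaming an initial hitting point and later corrections. The `<Hit>` tag, attribute names, IP address, and port are all hypothetical placeholders and must match your own EKI configuration; this is not the project's actual protocol.

```python
import socket

def send_hit_point(sock, x, y, z):
    """Send one hitting-point message (coordinates in mm) as an XML
    fragment. The <Hit> element and X/Y/Z attributes are assumed names
    that would have to match the EthernetKRL configuration file."""
    msg = f'<Hit X="{x:.1f}" Y="{y:.1f}" Z="{z:.1f}" />'
    sock.sendall(msg.encode("ascii"))

# Usage (controller address and port are placeholders):
# with socket.create_connection(("172.31.1.147", 54600)) as sock:
#     send_hit_point(sock, 650.0, -120.0, 900.0)   # initial guess
#     send_hit_point(sock, 662.5, -115.3, 905.1)   # refined prediction
```

Because the robot executes approximated motions, each new message simply retargets the ongoing motion instead of restarting it.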
Jonas Tebbe, Yapeng Gao, Marc Sastre-Rienitz, and Andreas Zell. A Table Tennis Robot System using an Industrial KUKA Robot Arm. In Pattern Recognition, GCPR 2018 (accepted for publication).