Robot: Tocabi (TOrque Controlled compliAnt BIped)

The robot was designed to match the height of an adult male. Its name, Tocabi (도깨비, a goblin), comes from a tall, strong, and helpful creature in Korean folk tales, reflecting the creators' wish for a world where humanoid robots help and coexist with humans. Each arm has 8 degrees of freedom, and the large workspace this provides is a major advantage of the robot.

Mechanical Design

Height: 1800 mm
Weight: 100 kg
33 DOF (2 + 16 + 3 + 12)
Reduction ratio: 100:1

Hardware

CPU Intel i7-10700
Sensoray 826 DAQ
Parker / Kollmorgen actuators
Elmo Motor Driver
Homing method using proximity sensors and output encoders

Software

Ubuntu 20.04
Linux 5.4.133 with Xenomai 3.1.1
ROS Middleware
Simple Open EtherCAT Master 1.4.0
MuJoCo Simulator

Control

2 kHz control frequency
2 kHz dynamics update
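
As a minimal illustration only (not the actual Xenomai real-time implementation), a fixed-rate 2 kHz loop advances an absolute deadline by the 0.5 ms period every cycle, which keeps the average rate correct even when a single cycle runs long:

```python
import time

PERIOD_S = 1.0 / 2000.0  # 2 kHz -> 0.5 ms control period

def run_control_loop(step_fn, n_cycles):
    """Run step_fn at a fixed 2 kHz rate using absolute deadlines.

    Advancing the deadline (rather than sleeping a fixed amount after
    each step) avoids drift from per-iteration jitter. A real-time
    system would use clock_nanosleep on an absolute deadline instead.
    """
    deadline = time.monotonic()
    for i in range(n_cycles):
        step_fn(i)                 # read sensors, update dynamics, write torques
        deadline += PERIOD_S       # absolute next deadline
        remaining = deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)

ticks = []
run_control_loop(ticks.append, 10)
print(len(ticks))  # 10 cycles executed
```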

Link

Tocabi GitHub :
https://github.com/saga0619/dyros_tocabi
Tocabi GUI GitHub :
https://github.com/saga0619/tocabi_gui
KRoC 2020 YouTube (Korean) :
https://www.youtube.com/watch?v=vLzAvPVPdFY

System

  • Vive Base Station
  • Vive Tracker
  • Vive Pro 2 / Valve Index
  • SenseGlove Development Kit
  • Manus Xsens Gloves
  • Sony Action Cam FDR-X3000
  • Jabra Speak 410
  • Adafruit Flexible DotStar Matrix
  • SNU Soft Sensor

HMD

Experiencing realistic control and vision is important in an avatar system. For a realistic experience, our main focus was depth perception and real-time vision. A stereoscopic view was used for depth perception: two cameras let the operator perceive depth through the screen. For low video-streaming latency, the NDI protocol was applied; HDMI-to-NDI streaming has 0.2-0.4 s of latency on our system.
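
The depth cue from the two cameras follows the standard stereo relation Z = f·B/d (focal length times baseline over disparity). A small sketch with hypothetical camera parameters (the numbers below are illustrative, not our system's calibration):

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth of a point from a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: 700 px focal length, 6.5 cm baseline (~human IPD)
print(stereo_depth(700.0, 0.065, 35.0))  # ~1.3 m
```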

Hand / Haptic

Expressing Emotion

Communicating the operator's intentions to the recipient is important in an avatar system. Our avatar system can express emotions through voice and facial expressions. The microphone of the Valve Index and the Jabra speaker are directly connected to deliver the operator's voice. A 5-inch LCD display at the mouth expresses the operator's emotions through animation. Animations are selected by recognizing emotions in the operator's voice through the Google NLP API.
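
A hedged sketch of the mapping step: given a sentiment score in [-1, 1] (the range returned by Google Cloud Natural Language sentiment analysis), pick a facial animation. The thresholds and animation names below are our own illustrative choices, not the values used on the robot:

```python
def select_animation(sentiment_score):
    """Map a sentiment score in [-1.0, 1.0] to a mouth-display animation.

    Thresholds and animation names are illustrative placeholders.
    """
    if sentiment_score > 0.25:
        return "smile"
    if sentiment_score < -0.25:
        return "frown"
    return "neutral"

print(select_animation(0.8))   # smile
print(select_animation(-0.6))  # frown
print(select_animation(0.0))   # neutral
```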

Safety

A life in which robots and humans coexist is no longer just imagination. Robots large and small are beginning to enter our lives, although humanoids with two or more arms will take a little longer. We are preparing the avatar robot for safe use; among other measures, several emergency-stop methods have been implemented, allowing the avatar robot to be operated more safely.


Walking

Our two-legged robot is a humanoid that resembles a human. It is important for an avatar robot to take over human functions, and when interaction with humans is required, a humanoid is the more suitable option. Therefore, for the mobility of the two-legged avatar robot, we did our best to enable the robot to walk on two legs. The operator uses a pedal to plan footsteps, and ZMP and CoM trajectories are generated from those footsteps. For a more robust gait, feedback terms and additional techniques are then applied. As a result, our robot Tocabi can change direction and walk for long periods.
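
The ZMP-to-CoM step can be sketched with the linear inverted pendulum model (LIPM): for a constant ZMP p and constant CoM height z_c, the CoM obeys ẍ = ω²(x − p) with ω = √(g/z_c), which has the closed-form solution x(t) = p + (x₀ − p)cosh(ωt) + (ẋ₀/ω)sinh(ωt). This is a generic illustration of the model, not the controller in the repository:

```python
import math

G = 9.81  # gravity, m/s^2

def lipm_com(x0, xd0, p_zmp, z_c, t):
    """Closed-form CoM trajectory of the linear inverted pendulum model.

    x0, xd0 : initial CoM position and velocity
    p_zmp   : (constant) ZMP position for this phase
    z_c     : constant CoM height
    Returns (x, xd) at time t.
    """
    w = math.sqrt(G / z_c)  # natural frequency of the pendulum
    c, s = math.cosh(w * t), math.sinh(w * t)
    x = p_zmp + (x0 - p_zmp) * c + (xd0 / w) * s
    xd = (x0 - p_zmp) * w * s + xd0 * c
    return x, xd

# A CoM starting exactly on the ZMP with zero velocity stays there
x, xd = lipm_com(0.0, 0.0, 0.0, 0.8, 0.4)
print(x, xd)  # 0.0 0.0
```

Footstep planning amounts to choosing the ZMP sequence p so that the resulting CoM motion stays bounded; any CoM offset from the ZMP grows with cosh(ωt), which is why feedback is needed for a robust gait.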


Motion Retargeting

The avatar robot shall be able to perform the same actions of the operator. To convey the operator’s behavior to the robot, we measured the operator’s posture using Vive Tracker, one of the VR off-the-shelf products. However, it is not easy to implement operator’s motion in a robot because the human body structure and robot structure are different. We created the robot’s desired behavior by changing task priorities depending on the situation. In addition, to control the degree of freedom of the robot’s upper and lower body, an optimization-based inverse kinematic controller was designed to balance the robot and perform various tasks according to its priorities.