
How to run a Teleoperation scenario?


With the Teleoperation module, you can run five different types of experiments, depending on your needs and on the technology you are using.

Important: When you start the experiment with the Oculus, the inertial frame on the perception side is set to the initial pose of the Oculus headset. Therefore, the user should stand straight when the experiment starts.

  • Before running, follow the installation steps here to make sure you are on the correct branch of this module and of all the dependent modules.

  • If you have installed the module correctly, you will see the application files in yarpmanager -> Entities -> Applications, from which you can launch the demos.

Important: If you are using the Oculus to teleoperate the human arm motions to the robot arms, update the scaling factor and install the module again if it was not up to date. The scaling factor is the ratio between the robot arm length and the teleoperation user's arm length: $$ ScalingFactor=\frac{robotArmLength}{humanArmLength} $$ You should update it for both the left arm and the right arm. The parameters are in /app/robots/iCubGenova04/leftHandRetargetingParams.ini and /app/robots/iCubGenova04/rightHandRetargetingParams.ini.
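For example (the arm lengths below are purely illustrative, not measured values), a robot arm of 0.50 m and a user arm of 0.62 m give

$$ ScalingFactor = \frac{0.50\,\text{m}}{0.62\,\text{m}} \approx 0.81 $$

which you would write in both the left-arm and right-arm parameter files.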

Pre-steps

  • Ensure the Oculus is calibrated: follow the steps mentioned here. If you are using an Oculus Quest, when you first connect the headset via cable to the Oculus laptop, a prompt will appear asking to enable the Quest Link; select Enable (or OK).

  • Ensure yarprun is running on the icub-virtualizer laptop (the Alienware or Oculus laptop) with the icub user account.

    • To run yarprun on the Alienware laptop, select it in yarpmanager -> Cluster -> icub-virtualizer and use Run Selected.
    • If that does not work, run `yarp run --server /icub-virtualizer --login` from the PowerShell of the Alienware laptop.
  • Ensure yarprun is running on icub-head and icub30 as well; you can start it using yarpmanager.

  • Application name: go to the iCubStartup_wbd application in the applications list.

    Run the following modules:

    • yarplogger icub30 -start -no_stop
    • yarprobotinterface icub-head -config icub_wbd.xml
    • yarpmotorgui icub30 -from homePoseBalancing.ini
  • In a new terminal, run `yarp rpc /wholeBodyDynamics/rpc` and type `calib all 300` to calibrate the FT sensors (at calibration time the robot feet must not be on the floor: the robot is held up by the crane). See the example session after this list.

  • If you are using the Cyberith Virtualizer treadmill, run the VirtualizerControlPanel.exe application located in Documents/icub_ws/Cyberith/cybSDK_tools_controlpanel_main on the Alienware laptop. Before stepping inside the virtualizer, rotate it three times and move it vertically three times in order to calibrate it. You can check the changes in the Cyberith virtualizer application if you connect it. Afterwards, disconnect and close the application.

  • If you are using Xsens, calibrate it with the NposeWalk option and make sure the calibration accuracy level is Good. A detailed description of the Xsens calibration procedure is available in the human-dynamics-estimation repository.
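As a reference, the command-line part of the pre-steps can be sketched as the following terminal session (the commands and port names are the ones listed above; the `>>` line only illustrates what you type at the yarp rpc prompt):

```
# Fallback: start the yarprun server on the Alienware laptop (PowerShell)
yarp run --server /icub-virtualizer --login

# FT sensor calibration, with the robot lifted by the crane
yarp rpc /wholeBodyDynamics/rpc
>> calib all 300
```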

Scenario Steps

1. Joypad + Oculus

  1. Application name: go to the DCM_walking_retargeting application in the applications list:

    • run all the modules listed in the application
    • connect all the ports
    • the user should receive the image in the Oculus headset
  2. Press z on the Alienware keyboard to decrease the image size, and r to reset the image center, so that the user can see better.

  3. Using the Oculus joypad, press the A button to prepare the robot, so that it goes to its initial configuration.

  4. Put the robot on the floor.

  5. Using the Oculus joypad, press the X button to start walking and start teleoperating the robot.

  6. Use the left joystick of the joypad (directional pad) to control the robot's walking.

  7. Enjoy teleoperating the robot!

2. Oculus + Virtualizer

  1. Application name: go to the DCM_walking_retargeting_(Virtualizer) application in the applications list:

    • run all the modules listed in the application
    • connect all the ports
    • the user should receive the image in Oculus
  2. Press z on the Alienware keyboard to decrease the image size, and r to reset the image center, so that the user can see better.

  3. Using the Oculus joypad, press the A button to prepare the robot, so that it goes to its walking position.

  4. Put the robot on the floor.

  5. Using the Oculus joypad, press the X button to start walking and start teleoperating the robot.

  6. If you walk forwards/backwards inside the virtualizer, the robot walks forwards/backwards. If you rotate inside the virtualizer, the robot rotates the same way.

    • Note: while you rotate, keep your head straight.
  7. Enjoy teleoperating the robot!

3. Xsens + Oculus

  1. Application name: go to the DCM_walking_retargeting_(Xsens) application in the applications list:

    • run all the modules listed in the application
    • connect all the ports
    • the user should receive the image in Oculus
  2. Press z on the Alienware keyboard to decrease the image size, and r to reset the image center, so that the user can see better.

  3. Using the Oculus joypad, press the A button to prepare the robot, so that it goes to its walking position.

  4. Put the robot on the floor.

  5. Using the Oculus joypad, press the X button to start walking and start teleoperating the robot.

  6. Use the left joystick of the joypad (directional pad) to control the robot's walking.

  7. Enjoy teleoperating the robot!

    Notes:

    i. Be careful not to move the Xsens headband while you are wearing the headset.

    ii. If the headband moves, you can perform a secondary calibration to remove the offsets:

     - `yarp rpc /HumanStateProvider/rpc:i`

     - `help`: gives information about the available commands.

     - `calibrate`: performs the whole-body calibration. The posture of the subject should be the N-pose (zero configuration of the robot).

     - `calibrate linkName`: performs the calibration for the given link. The posture of the subject should be such that the joints between the given link and its parent link are at zero in the robot configuration.

     N.B. The calibration can be performed only for those links onto which sensor measurements are mapped, i.e. the fake links that can be found in the [configuration](https://github.com/robotology/human-dynamics-estimation/blob/add-secondary-calibration/conf/xml/RobotStateProvider_iCub.xml#L47).

     - `reset`: removes all the secondary calibration matrices.

     - `reset linkName`: removes the secondary calibration for the given link.
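A typical secondary-calibration session at the RPC prompt could therefore look like the sketch below (the commands are the ones listed above; `LeftFoot` is only a placeholder and must be replaced with a link name taken from the configuration file):

```
yarp rpc /HumanStateProvider/rpc:i
>> help                 # list the available commands
>> calibrate            # whole-body calibration, subject standing in N-pose
>> calibrate LeftFoot   # placeholder link name: calibrate a single link
>> reset                # remove all the secondary calibration matrices
```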
    

4. Xsens + Oculus + Virtualizer

  1. Application name: go to the Xsens_Retargeting_Visualization application in the applications list, in order to visualize the human/robot motions and the IK in rviz:

    • run all the modules listed in the application
    • connect all the ports
  2. Application name: go to the DCM_walking_retargeting_(Virtualizer_Xsens) application in the applications list:

    • run all the modules listed in the application
    • connect all the ports
    • the user should receive the image in Oculus
  3. Press z on the Alienware keyboard to decrease the image size, and r to reset the image center, so that the user can see better.

  4. Using the Oculus joypad, press the A button to prepare the robot, so that it goes to its walking position.

  5. Put the robot on the floor.

  6. Using the Oculus joypad, press the X button to start walking and start teleoperating the robot.

  7. If you walk forwards/backwards inside the virtualizer, the robot walks forwards/backwards. If you rotate inside the virtualizer, the robot rotates the same way.

  8. Enjoy teleoperating the robot!

    Notes: the same secondary-calibration notes as in scenario 3 (Xsens + Oculus) apply here: do not move the Xsens headband while wearing the headset and, if it moves, perform the secondary calibration through `yarp rpc /HumanStateProvider/rpc:i`.
    

5. Yoga-Retargeting

  1. Application name: go to the Xsens_Retargeting_Visualization application in the applications list, in order to visualize the human/robot motions and the IK in rviz:

    • run all the modules listed in the application
    • connect all the ports
  2. Application name: go to the Yoga_Retargeting_Xsens application in the applications list:

    • run all the modules listed in the application
    • connect all the ports
  3. Follow the procedure described in whole-body-controller in order to retarget the robot. The only difference is that you should run the model located in controllers/retargeting-floating-base-balancing-torque-control/startModelWithStaticGui.m.

  4. Enjoy teleoperating the robot!