This repo provides minimal hands-on code for the class XAI615: a real-world robot pick-and-place demo.
The code builds on the following repositories:
- [MuJoCo] https://github.com/deepmind/mujoco
- [Perception] https://github.com/NVlabs/UnseenObjectClustering
- [UniversalRobot] https://github.com/ros-industrial/ur_modern_driver
This repo is tested on the following environment:
- Ubuntu: 20.04
- Python: 3.8.10
- mujoco: 2.3.2
MuJoCo Engine
pip install mujoco
pip install mujoco-python-viewer
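For reference, below is a minimal sketch of loading a model and stepping it in the interactive viewer with these two packages; the `scene.xml` path is a placeholder, not a file shipped with this repo.

```python
# Minimal sketch: load an MJCF model and step it in the interactive viewer.
# "scene.xml" is a placeholder path, not a file from this repo.
import mujoco
import mujoco_viewer

model = mujoco.MjModel.from_xml_path("scene.xml")
data = mujoco.MjData(model)
viewer = mujoco_viewer.MujocoViewer(model, data)

while viewer.is_alive:
    mujoco.mj_step(model, data)   # advance physics by one timestep
    viewer.render()               # draw the current state
viewer.close()
```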
RG2 - Gripper
pip install pymodbus==2.4.0
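A rough sketch of commanding the RG2 over Modbus TCP with pymodbus 2.4.0 is shown below. The Compute Box IP address, unit id, and register layout (target force, target width, control flag) are assumptions and should be checked against your OnRobot documentation and network setup.

```python
# Rough sketch (pymodbus 2.4.0 API): command an OnRobot RG2 over Modbus TCP.
# The IP address, unit id (65), and register layout (force, width, control)
# are assumptions; verify them against the OnRobot Compute Box manual.
from pymodbus.client.sync import ModbusTcpClient

GRIPPER_IP = "192.168.1.1"   # placeholder Compute Box address
UNIT_ID = 65                 # assumed Modbus unit id for the RG2

client = ModbusTcpClient(GRIPPER_IP, port=502)
client.connect()

# Target force (1/10 N), target width (1/10 mm), control flag (1 = grip).
target_force = 400   # 40.0 N
target_width = 600   # 60.0 mm
client.write_registers(0, [target_force, target_width, 1], unit=UNIT_ID)

client.close()
```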
Below is a list of files and their descriptions:
Realworld
- Gripper: OnRobot RG2 gripper usage
- Get Topic: Read the /joint_states topic (the current joint values); a minimal subscriber sketch follows this list
- Solve IK: Solve inverse kinematics
- Perception: Segmentation using the UnseenObjectClustering method
- Pick-n-Place: Pick-and-place demo
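As referenced above, here is a minimal sketch of reading the current joint values, assuming ROS 1 (rospy) and the /joint_states topic published by the UR driver.

```python
# Minimal sketch, assuming ROS 1 (rospy): read the current joint values
# from the /joint_states topic published by the UR driver.
import rospy
from sensor_msgs.msg import JointState

def joint_state_callback(msg):
    # msg.name and msg.position are parallel lists of joint names / angles (rad)
    joints = dict(zip(msg.name, msg.position))
    rospy.loginfo("Current joints: %s", joints)

rospy.init_node("joint_state_listener")
rospy.Subscriber("/joint_states", JointState, joint_state_callback)
rospy.spin()
```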
Simulation
- Solve inverse kinematics with various methods: [General, Augmented, Nullspace projection, Repulse, RRT-Star]
- Trajectory planning methods: [Quintic, Minimum Jerk, Linear movement]; a minimum-jerk sketch follows this list
- Velocity profile methods: [Trapezoidal, S-Spline]
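As an illustration of the trajectory-planning side, minimum-jerk interpolation between two joint configurations can be sketched as follows. The function name, time step, and array shapes are illustrative and not taken from this repo's code.

```python
# Illustrative sketch (not this repo's implementation): minimum-jerk
# interpolation q(t) = q0 + (qf - q0) * (10 s^3 - 15 s^4 + 6 s^5), s = t/T.
import numpy as np

def min_jerk_trajectory(q0, qf, duration, dt=0.002):
    q0, qf = np.asarray(q0, float), np.asarray(qf, float)
    n_steps = int(duration / dt) + 1
    s = np.linspace(0.0, 1.0, n_steps)
    blend = 10 * s**3 - 15 * s**4 + 6 * s**5   # zero velocity/acceleration at both ends
    return q0 + (qf - q0) * blend[:, None]     # shape: (n_steps, n_joints)

# Example: interpolate a 6-DoF arm from home to a target configuration in 2 seconds.
traj = min_jerk_trajectory(np.zeros(6), np.ones(6) * 0.5, duration=2.0)
```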
Final Demo
- Stack object: Stack one object on top of another.
- Pick-n-Place only one object: Pick and place a single object via a finite state machine (see the sketch after this list).
- Pick-n-Place multiple objects: Pick and place multiple objects.
- Pick-n-Place multiple objects using only MuJoCo: Pick and place multiple objects using only the MuJoCo simulator.
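The single-object demo is driven by a finite state machine. A hypothetical, stripped-down version of that control flow might look like the following; the state names and the robot/gripper helpers are placeholders, not this repo's API.

```python
# Hypothetical sketch of an FSM-driven pick-and-place loop; state names and
# the move/grip helpers are illustrative placeholders, not this repo's API.
def pick_and_place_fsm(robot, gripper, pick_pose, place_pose):
    state = "APPROACH_PICK"
    while state != "DONE":
        if state == "APPROACH_PICK":
            robot.move_to(pick_pose)      # move to the grasp pose
            state = "GRASP"
        elif state == "GRASP":
            gripper.close()               # close the RG2 on the object
            state = "APPROACH_PLACE"
        elif state == "APPROACH_PLACE":
            robot.move_to(place_pose)     # carry the object to the place pose
            state = "RELEASE"
        elif state == "RELEASE":
            gripper.open()                # release the object
            state = "DONE"
    return True
```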