Project Overview:
This project demonstrates the development of an autonomous robot capable of navigating a given environment while avoiding obstacles. It combines ROS2-based simulation, a hardware implementation built around a Raspberry Pi and an Arduino, and real-world testing, showcasing the end-to-end communication between all of the software and hardware components.
Table of Contents:
- Introduction
- Features
- System Architecture
- Hardware Setup
- Software Components
- Installation Guide
- Usage
- Simulation and Real-World Results
- Schematic Diagrams and Communication Flow
- Testing
- Acknowledgements
Introduction:
This project focuses on designing and implementing an autonomous robot for navigation in unknown environments. ROS2 handles control and path planning, the Raspberry Pi manages sensor data, and the Arduino controls the motor drivers. The aim is for the robot to navigate using real-time LiDAR data for obstacle detection and encoder data for odometry.
The project was divided into two phases:
- Simulation: Using Gazebo and RViz for testing the navigation algorithms.
- Real-World Implementation: Developing and testing the physical robot.
Features:
- Autonomous Navigation: The robot can autonomously navigate while avoiding obstacles.
- Sensor Fusion: LiDAR and encoder data are processed in real time to aid navigation.
- ROS2 Integration: ROS2 nodes handle the robot's communication, path planning, and sensor processing.
- Simulation: The environment was simulated in Gazebo and visualized using RViz.
- Real-World Testing: The robot was tested in various environments to validate the simulation results.
System Architecture:
The system follows a modular architecture in which sensor processing and the control algorithms are split across separate ROS2 nodes. The path planner combines A* search with live sensor data for real-time decision-making; a minimal node sketch follows the component list below.
The system consists of:
- Raspberry Pi: Acts as the brain, running ROS2 and processing sensor data.
- Arduino: Controls the motor drivers and handles the low-level motor commands.
- Local Computer: Used for simulations, running Gazebo and RViz.
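To make the node-based split concrete, here is a minimal rclpy sketch of an obstacle-stop node that subscribes to the LiDAR scan and publishes velocity commands, the same subscribe/process/publish pattern the project's sensor and control nodes follow. The node name, topic names, and distance threshold are illustrative assumptions, not the project's actual code; the real path planning is delegated to Nav2 with A*.

```python
# Illustrative only: node name, topics, and threshold are assumptions, not the repository's code.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist


class ObstacleStop(Node):
    def __init__(self):
        super().__init__('obstacle_stop')
        self.cmd_pub = self.create_publisher(Twist, 'cmd_vel', 10)
        self.create_subscription(LaserScan, 'scan', self.on_scan, 10)

    def on_scan(self, scan: LaserScan):
        cmd = Twist()
        # Keep only valid readings, then stop if anything is closer than 0.5 m.
        valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
        cmd.linear.x = 0.0 if (valid and min(valid) < 0.5) else 0.1
        self.cmd_pub.publish(cmd)


def main():
    rclpy.init()
    rclpy.spin(ObstacleStop())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```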
Hardware Setup:
- Single-Board Computer: Raspberry Pi 4, handling high-level operations.
- Motor Controller: Arduino Mega 2560 for controlling motor drivers.
- Motors: DC motors with encoders for precise control.
- Sensors:
- LiDAR: For distance measurement and obstacle detection.
- Encoders: For wheel odometry (a pose-update sketch follows the hardware description below).
The robot chassis was built to house all components. A four-cell Li-ion battery pack (16.8 V) powers the drive train, with 9.7 V used for the motors and the rest powering the motor driver, while a separate 5 V power bank supplies the Raspberry Pi 4 and the LiDAR.
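For context on how the encoder readings become odometry, below is a short sketch of the standard differential-drive pose update from encoder ticks. The wheel radius, wheel separation, and ticks-per-revolution values are placeholders rather than measurements from this robot; in practice a ros2_control differential-drive controller performs this integration and publishes the odometry.

```python
import math

# Placeholder geometry; the real values depend on the wheels, gearbox, and chassis.
WHEEL_RADIUS = 0.033      # metres
WHEEL_SEPARATION = 0.17   # metres between the two drive wheels
TICKS_PER_REV = 360       # encoder ticks per wheel revolution


def update_pose(x, y, theta, d_ticks_left, d_ticks_right):
    """Integrate one pair of encoder deltas into the robot pose (differential drive)."""
    d_left = 2 * math.pi * WHEEL_RADIUS * d_ticks_left / TICKS_PER_REV
    d_right = 2 * math.pi * WHEEL_RADIUS * d_ticks_right / TICKS_PER_REV
    d_center = (d_left + d_right) / 2.0              # distance travelled by the robot centre
    d_theta = (d_right - d_left) / WHEEL_SEPARATION  # change in heading (radians)
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta
```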
Software Components:
- ROS2 Humble: The middleware for communication between the sensors and motor controllers.
- Gazebo: For simulating the environment.
- RViz: For visualizing the robot’s sensors and movements.
- Python and C++: Used for writing control nodes and processing sensor data.
- Libraries (a launch sketch showing how some of these fit together follows this list):
  - ros2_control
  - geometry_msgs
  - sensor_msgs
  - teleop_twist_joy
  - ros-humble-navigation2
  - slam_toolbox
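As a sketch of how some of these packages fit together, the launch file below brings up SLAM and joystick teleoperation. It is illustrative only; the executables and parameters chosen here are assumptions and may differ from the launch files shipped in this repository (launch_sim.py, launch_robot.launch.py, rplidar.launch.py).

```python
# Illustrative bring-up sketch, not the repository's launch file.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        # Online SLAM from the LiDAR scans (slam_toolbox).
        Node(
            package='slam_toolbox',
            executable='async_slam_toolbox_node',
            name='slam_toolbox',
            output='screen',
            parameters=[{'use_sim_time': False}],
        ),
        # Manual driving with a gamepad while testing (teleop_twist_joy).
        Node(
            package='teleop_twist_joy',
            executable='teleop_node',
            name='teleop_twist_joy',
            output='screen',
        ),
    ])
```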
Installation Guide:
Ensure that you have the following installed:
- Ubuntu 22.04 with ROS2 Humble
- Python 3.10
- Git
- Clone the Repository:
  git clone https://github.com/codeflamer/rover.git
  cd rover
- Install ROS2 Dependencies:
  sudo apt update
  sudo apt install ros-humble-desktop python3-argcomplete
- Build the ROS2 Workspace:
  colcon build
  source install/setup.bash
Usage:
To run the simulation in Gazebo:
ros2 launch rover launch_sim.py
To run on the physical robot:
- Power up the robot and ensure the Raspberry Pi is connected.
- Ensure the Arduino is properly connected to the Raspberry Pi for effective communication.
- Run the ROS2 nodes:
ssh <user>@<raspberry-pi-ip>
ros2 launch rover rplidar.launch.py
ros2 launch rover launch_robot.launch.py
cd worlds && ros2 run nav2_map_server map_saver_cli -f front-room.yaml
ros2 launch rover navigation_launch.py use_sim_time:=false map_subscribe_transient_local:=true
Simulation and Real-World Results:
Below are images and a video showcasing the robot navigating in a simulated environment:
MAP of the indoor environment obtained using slam_toolbox:
The real-world tests demonstrated the robot successfully navigating around obstacles in various indoor environments:
PICTURE OF INDOOR ENVIRONMENT:
Schematic Diagrams and Communication Flow:
Below is a diagram of the hardware setup showing the connections between the Raspberry Pi, Arduino, motors, and encoders:
This diagram illustrates the communication flow between the Raspberry Pi, Arduino, and the local computer:
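To complement the diagram, here is a hedged sketch of the Raspberry Pi side of the Pi-to-Arduino link, assuming a plain USB serial connection (pyserial) with a newline-terminated text protocol. The port, baud rate, and message format are assumptions; the actual protocol depends on the Arduino firmware used in this project.

```python
import serial  # pyserial

# Assumed settings: the real port, baud rate, and message format are defined by the firmware.
PORT = '/dev/ttyACM0'
BAUD = 115200


def send_wheel_speeds(link: serial.Serial, left: float, right: float) -> None:
    """Send left/right wheel speed targets as a simple newline-terminated text line."""
    link.write(f'{left:.3f},{right:.3f}\n'.encode('ascii'))


if __name__ == '__main__':
    with serial.Serial(PORT, BAUD, timeout=1.0) as link:
        send_wheel_speeds(link, 0.1, 0.1)  # drive slowly forward
        reply = link.readline().decode('ascii', errors='ignore').strip()
        print('Arduino replied:', reply)   # e.g. an acknowledgement from the firmware
```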
Testing:
- Simulation: Tested with varying path planning strategies.
- Real-World Tests: Conducted in indoor environments only.
Acknowledgements:
Special thanks to:
- The ROS2 community for excellent documentation and resources.
- Gazebo developers for their simulation environment.
- Articulated Robotics