Commit c8c8dd1 (parent f8463e1)

* Copied release_4_2, which itself is a copy of dev, and removed 2 accidental merges (61f4742, cb50317)
* Fixed small bug in UI
* Merged from master

Showing 83 changed files with 4,003 additions and 493 deletions.
# A Guide to using the Intel Realsense T265 sensor with Donkeycar

----

* **Note:** Although the Realsense T265 can be used with an Nvidia Jetson Nano, it's a bit easier to set up with a Raspberry Pi (we recommend the RPi 4, with at least 4GB of memory). The Intel Realsense D4XX series can also be used with Donkeycar as a regular camera (support for its depth-sensing data is coming soon), and we'll add instructions for that when it's ready.

Original T265 path follower code by [Tawn Kramer](https://github.com/tawnkramer/donkey)

----
## Step 1: Set up Donkeycar

## Step 2: Set up Librealsense on the Raspberry Pi

Using the latest version of Raspbian (tested with Raspbian Buster) on the RPi, follow [these instructions](https://github.com/IntelRealSense/librealsense/blob/master/doc/installation_raspbian.md) to set up Intel's Realsense libraries (Librealsense) and their dependencies.
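Once the install is done, a quick way to sanity-check the T265 before wiring it into Donkeycar is to read a few pose frames directly. This is only a minimal sketch, and it assumes the `pyrealsense2` Python bindings were built or installed along with Librealsense (they are not part of Donkeycar itself):

```
# Minimal T265 sanity check (assumes pyrealsense2 is installed alongside Librealsense)
import pyrealsense2 as rs

pipe = rs.pipeline()
cfg = rs.config()
cfg.enable_stream(rs.stream.pose)   # the T265 provides a 6-DOF pose stream
pipe.start(cfg)

try:
    for _ in range(50):                      # read a short burst of frames
        frames = pipe.wait_for_frames()
        pose = frames.get_pose_frame()
        if pose:
            data = pose.get_pose_data()
            print("position x=%.3f y=%.3f z=%.3f" %
                  (data.translation.x, data.translation.y, data.translation.z))
finally:
    pipe.stop()
```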
## Step 3: Set up the Donkeycar path follower app
After you've done that, set up the app directory with this:

```
donkey createcar --path ~/follow --template path_follower
```

Then run it:

```
cd ~/follow
python3 manage.py drive
```
Once it's running, open a browser on your laptop and enter this in the URL bar: `http://<your nano's IP address>:8887`

The rest of the instructions from Tawn's repo:
When you drive, this will draw a red line for the path and a green circle for the robot location.

* Mark a nice starting spot for your robot. Be sure to put it right back there each time you start.
* Drive the car in some kind of loop. You will see the red line show the path.
* Hit X on the PS3/PS4 controller to save the path.
* Put the bot back at the start spot.
* Then hit the "select" button (on a PS3 controller) or "share" (on a PS4 controller) twice to go to pilot mode. This will start driving on the path. If you want it to go faster or slower, change this line in the `myconfig.py` file: `THROTTLE_FORWARD_PWM = 530`
* Check the bottom of `myconfig.py` for some settings to tweak: PID values, map offsets and scale, things like that. You might want to start by downloading and using the `myconfig.py` file from my repo, which has some known-good settings and is otherwise a good place to start.

Some tips:
* When you start, the green dot will be in the top-left corner of the box. You may prefer to have it in the center. If so, change `PATH_OFFSET = (0, 0)` in the `myconfig.py` file to `PATH_OFFSET = (250, 250)`.

* For a small course, you may find that the path is too small to see well. In that case, change `PATH_SCALE = 5.0` to `PATH_SCALE = 10.0` (or more, if necessary).

* If you're not seeing the red line, that means a path file has already been written. Delete `donkey_path.pkl` (`rm donkey_path.pkl`) and the red line should show up.

* It defaults to recording a path point every 0.3 meters. If you want the path to be smoother, change this line in `myconfig.py` to a smaller number: `PATH_MIN_DIST = 0.3`
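For reference, here is a sketch of how those settings might look together in `~/follow/myconfig.py` after the tweaks above (the values are only illustrative; tune them for your own car and course):

```
# Path follower settings in myconfig.py (illustrative values)
THROTTLE_FORWARD_PWM = 530   # raise or lower this to drive faster or slower in pilot mode
PATH_OFFSET = (250, 250)     # move the green dot from the top-left corner to the center of the box
PATH_SCALE = 10.0            # enlarge the drawn path so small courses are easier to see
PATH_MIN_DIST = 0.3          # meters between recorded path points; smaller values give a smoother path
```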
# Lidar

A Lidar sensor can be used with Donkeycar to provide obstacle avoidance or to help navigate on tracks with walls. Lidar data is recorded along with the camera data during training and can also be used for training.

![Donkey lidar](../assets/lidar.jpg)

## Supported Lidars

We currently only support the RPLidar series of sensors, but will be adding support for the similar YDLidar series soon.

We recommend the [$99 A1M8](https://amzn.to/3vCabyN) (12m range).
## Hardware Setup

Mount the Lidar underneath the camera canopy as shown above (the RPLidar A2M8 is used there, but the A1M8 mounting is the same). You can velcro the USB adapter under the Donkey plate and use a short USB cable to connect to one of your RPi or Nano USB ports. It can be powered by the USB port, so there's no need for an additional power supply.

## Software Setup

Lidar requires the glob library to be installed. If you don't already have that, install it with `pip3 install glob2`

Also install the Lidar driver: `pip3 install rplidar`

Right now Lidar is only supported with the basic template. Install it as follows:

`donkey createcar --path ~/lidarcar --template basic`

Then go to the lidarcar directory and edit the `myconfig.py` file to ensure that the Lidar is turned on. The upper and lower limits should be set to reflect the areas you want your Lidar to "look at", omitting the areas that are blocked by parts of the car body. An example is shown below. For the RPLidar series, 0 degrees is in the direction of the motor (in the case of the A1M8) or cable (in the case of the A2M8).
```
# LIDAR
USE_LIDAR = True
LIDAR_TYPE = 'RP'        # (RP|YD)
LIDAR_LOWER_LIMIT = 44   # angles that will be recorded. Use this to block out obstructed areas on your car
LIDAR_UPPER_LIMIT = 136  # and/or to avoid looking backwards. Note that for the RP A1M8 Lidar, "0" is in the direction of the motor
```
![Lidar limits](../assets/lidar_angle.png)
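If you want to confirm the Lidar is working and see which readings fall inside your limits before wiring it into Donkeycar, a small standalone script like the one below can help. It is only a sketch: it uses the `rplidar` driver installed above and assumes the USB adapter shows up as `/dev/ttyUSB0` (adjust the device path and the limits for your setup):

```
# Standalone RPLidar check (not the Donkeycar part itself)
from rplidar import RPLidar

LOWER, UPPER = 44, 136           # same idea as LIDAR_LOWER_LIMIT / LIDAR_UPPER_LIMIT
lidar = RPLidar('/dev/ttyUSB0')  # adjust if your USB adapter appears elsewhere

try:
    for scan in lidar.iter_scans():
        # each scan is a list of (quality, angle in degrees, distance in mm) tuples
        in_view = [(angle, dist) for (_, angle, dist) in scan if LOWER <= angle <= UPPER]
        if in_view:
            angle, dist = min(in_view, key=lambda m: m[1])   # closest return inside the window
            print("nearest obstacle: %.0f mm at %.0f degrees" % (dist, angle))
except KeyboardInterrupt:
    pass
finally:
    lidar.stop()
    lidar.disconnect()
```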
## Odometry

Odometry is a way to calculate the speed and distance travelled by the car by measuring the rotation of its wheels with a sensor called a rotary encoder. The encoder can be mounted on the motor, on the main drive shaft, or on individual wheels. The advantage of using an encoder is that it "closes the loop" with your throttle: the car can reliably command an actual velocity, rather than just issuing a motor setting that produces a faster or slower velocity depending on the slope of the track, the surface, or mechanical friction in your drive train while turning. In short, an encoder gives you much better control over your speed.
Encoders come in various forms:

* Quadrature encoders use hall-effect sensors to measure magnetic pulses as the shaft turns. They have the advantage of being very precise, as well as being able to tell the difference between forward and reverse rotation.
* Single-output encoders are like quadrature encoders, but they cannot determine the direction of motion.
* Optical encoders are typically an LED/light-sensor pair with a slotted disc between them. As the disc rotates, the light is interrupted and those pulses are counted. These sensors are cheap and easy to install, but they cannot determine the direction of rotation.

There are several ways to read encoders with Donkey:

* Directly with the Raspberry Pi's GPIO pins. This is best for optical encoders, since they don't generate as many pulses as a quadrature encoder and the RPi will miss fewer of them as it task-swaps between the various Donkeycar parts.
* With an Arduino or Teensy. This is best for quadrature encoders, since the Arduino/Teensy is 100% devoted to counting pulses. It transmits the count to the RPi via the USB serial port when requested by Donkeycar, which lightens the processing load for the RPi.
* With an Astar board. This is just a fancy Arduino, but if you have one, it makes for a neat installation.
## Supported Encoders

Examples of rotary encoders that are supported:

* Optical encoder sensors and discs: [available from many sources on Amazon](https://amzn.to/3s05QmG)
* Quadrature encoders: [larger, cheaper](https://amzn.to/3liBUjj); [smaller, more expensive](https://www.sparkfun.com/products/10932)

## Hardware Setup

How you attach your encoder is up to you and depends on which kind of encoder you're using. For example, [here's](https://diyrobocars.com/2020/01/31/how-to-add-an-encoder-to-the-donkeycar-chassis/) one way to put a quadrature encoder on the main drive shaft. [Here](https://guitar.ucsd.edu/maeece148/index.php/Project_encoders) is a more complex setup with dual encoders.

But this is the easiest way to do it, with a cheap and simple optical encoder on the main drive shaft of a standard Donkeycar chassis (if your chassis is different, the same overall approach should work, although you may have to find a different place to mount the sensor):
First, unscrew the plate over the main drive shaft. Tilt the rear wheels back a bit and you should be able to remove the shaft.

![drive shaft](../assets/driveshaft.jpg)

Now enlarge the hole in the optical encoder disc that came with your sensor (use a drill or Dremel grinding stone) so you can slip it onto the shaft. Stretch a rubber grommet over the shaft (you can use the sort typically included with servos to mount them, but any one of the right size will do) and push it into the encoder disc hole. If you don't have a grommet, you can wrap tape around the shaft until it's large enough to hold the disc firmly. Once you've ensured it's in the right place, use a few drops of superglue or hot glue to hold it in place.

![encoder disc on shaft](../assets/encoder1.jpg)

![encoder disc on shaft](../assets/encoder2.jpg)

Cut out a small notch (marked in pencil here) in the plate covering the drive shaft, so you can mount the encoder sensor there, ensuring that the disc can turn freely in the gap in front of the steering servo.

![drive plate](../assets/cuthere.jpg)

Now replace the plate and drill two holes so you can screw in the encoder sensor. Slide the disc along the shaft so that it doesn't bind on the sensor.

![encoder in place](../assets/encoder_inplace.jpg)

Use three female-to-female jumper cables to connect the sensor to your RPi GPIO pins as follows: connect the sensor's GND, V+ (which might say 5V or 3.3V) and data pin (which will say "OUT" or "D0") to the RPi's Ground, 5V, and GPIO 13 pins as shown here (if your sensor has four pins, ignore the one that says "A0"):

![wiring diagram](../assets/encoder_wiring.jpg)
## Software Setup

Enable odometry in `myconfig.py`:

```
HAVE_ODOM = True          # Do you have an odometer/encoder?
ENCODER_TYPE = 'GPIO'     # What kind of encoder? GPIO|Arduino|Astar
MM_PER_TICK = 12.7625     # How much travel with a single tick, in mm. Roll your car a meter and divide 1,000 by the total ticks measured
ODOM_PIN = 13             # If using GPIO, which GPIO board-mode pin to use as input
ODOM_DEBUG = False        # Write out values on vel and distance as it runs
```
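To make the GPIO option concrete, here is a rough standalone sketch of what the encoder reading amounts to (the actual Donkeycar odometry part handles this for you): count pulses on the data pin and convert ticks to distance and velocity with `MM_PER_TICK`. It assumes the `RPi.GPIO` library and the wiring shown above, with the data pin on board-mode pin 13:

```
# Standalone GPIO encoder check (illustrative; the Donkeycar odometry part does this internally)
import time
import RPi.GPIO as GPIO

ODOM_PIN = 13          # board-mode pin, matching ODOM_PIN above
MM_PER_TICK = 12.7625  # from your calibration

ticks = 0

def on_tick(channel):
    global ticks
    ticks += 1

GPIO.setmode(GPIO.BOARD)
GPIO.setup(ODOM_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)
GPIO.add_event_detect(ODOM_PIN, GPIO.RISING, callback=on_tick)

try:
    last_ticks, last_time = 0, time.time()
    while True:
        time.sleep(0.5)
        now, current = time.time(), ticks
        velocity = ((current - last_ticks) * MM_PER_TICK / 1000.0) / (now - last_time)  # m/s
        print("distance: %.2f m  velocity: %.2f m/s" % (current * MM_PER_TICK / 1000.0, velocity))
        last_ticks, last_time = current, now
except KeyboardInterrupt:
    pass
finally:
    GPIO.cleanup()
```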
If you are using an Arduino or Teensy to read your encoder, select `'Arduino'` in the `myconfig.py` line above. The microcontroller should be flashed using the Arduino IDE with [this sketch](https://github.com/zlite/donkeycar/tree/master/donkeycar/parts/encoder/encoder). Check the sketch using the `test_encoder.py` code in the Donkeycar `tests` folder to make sure you've got your encoder plugged into the right pins, or edit it to reflect the pins you are using.