Examples Clean-Up (#1408)
gerth2 authored Sep 15, 2024
1 parent 596c875 commit 9e6a066
Showing 269 changed files with 9,342 additions and 3,730 deletions.
8 changes: 4 additions & 4 deletions docs/source/docs/additional-resources/best-practices.md
Original file line number Diff line number Diff line change
@@ -15,9 +15,9 @@
- Make sure you take advantage of the field calibration time given at the start of the event:
- Bring your robot to the field at the allotted time.
- Turn on your robot and pull up the dashboard on your driver station.
- Point your robot at the target(s) and ensure you get a consistent tracking (you hold one target consistently, the ceiling lights aren't detected, etc.).
- If you have problems with your pipeline, go to the pipeline tuning section and retune the pipeline using the guide there. You want to make your exposure as low as possible with a tight hue value to ensure no extra targets are detected.
- Move the robot close, far, angled, and around the field to ensure no extra targets are found anywhere when looking for a target.
- Point your robot at the AprilTag(s) and ensure you get consistent tracking (you hold one AprilTag consistently, the ceiling lights aren't detected, etc.).
- If you have problems with your pipeline, go to the pipeline tuning section and retune the pipeline using the guide there.
- Move the robot close, far, angled, and around the field to ensure no extra AprilTags are found.
- Go to a practice match to ensure everything is working correctly.
- After field calibration, use the "Export Settings" button in the "Settings" page to create a backup.
- Do this for each coprocessor on your robot that runs PhotonVision, and name your exports with meaningful names.
@@ -26,4 +26,4 @@
- This effectively works as a snapshot of your PhotonVision data that can be restored at any point.
- Before every match, check the ethernet connection going into your coprocessor and that it is seated fully.
- Ensure that exposure is as low as possible and that you don't have the dashboard up when you don't need it to reduce bandwidth.
- Stream at as low of a resolution as possible while still detecting targets to stay within bandwidth limits.
- Stream at as low of a resolution as possible while still detecting AprilTags to stay within field bandwidth limits.
7 changes: 6 additions & 1 deletion docs/source/docs/apriltag-pipelines/multitag.md
@@ -24,7 +24,7 @@ This multi-target pose estimate can be accessed using PhotonLib. We suggest usin
```{eval-rst}
.. tab-set-code::
.. code-block:: java
.. code-block:: Java
var result = camera.getLatestResult();
if (result.getMultiTagResult().estimatedPose.isPresent) {
@@ -38,6 +38,11 @@ This multi-target pose estimate can be accessed using PhotonLib. We suggest usin
if (result.MultiTagResult().result.isPresent) {
frc::Transform3d fieldToCamera = result.MultiTagResult().result.best;
}
.. code-block:: Python
# Coming Soon!
```
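The snippets above yield a field-to-camera transform; robot code typically chains that with a known camera-to-robot offset to recover the robot's field pose. Independent of PhotonLib's `Transform3d` type, that composition can be sketched in 2D — all names and numbers below are illustrative assumptions, not values from the examples:

```python
import math

def compose(a, b):
    """Chain two 2D transforms (x, y, theta_rad): apply b in the frame defined by a."""
    ax, ay, at = a
    bx, by, bt = b
    return (ax + bx * math.cos(at) - by * math.sin(at),
            ay + bx * math.sin(at) + by * math.cos(at),
            at + bt)

# Hypothetical numbers: camera at field (4.0, 2.0) facing +Y,
# mounted 0.3 m ahead of the robot center.
field_to_camera = (4.0, 2.0, math.radians(90.0))
camera_to_robot = (-0.3, 0.0, 0.0)
field_to_robot = compose(field_to_camera, camera_to_robot)  # robot sits 0.3 m behind the camera
```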

:::{note}
38 changes: 27 additions & 11 deletions docs/source/docs/contributing/building-photon.md
@@ -243,19 +243,11 @@ The program will wait for the VSCode debugger to attach before proceeding.

### Running examples

You can run one of the many built in examples straight from the command line, too! They contain a fully featured robot project, and some include simulation support. The projects can be found inside the photonlib-java-examples and photonlib-cpp-examples subdirectories, respectively. The projects currently available include:
You can run one of the many built-in examples straight from the command line, too! They contain a fully featured robot project, and some include simulation support. The projects can be found inside the photonlib-*-examples subdirectories for each language.

- photonlib-java-examples:
- aimandrange:simulateJava
- aimattarget:simulateJava
- getinrange:simulateJava
- simaimandrange:simulateJava
- simposeest:simulateJava
- photonlib-cpp-examples:
- aimandrange:simulateNative
- getinrange:simulateNative
#### Running C++/Java

To run them, use the commands listed below. PhotonLib must first be published to your local maven repository, then the copy PhotonLib task will copy the generated vendordep json file into each example. After that, the simulateJava/simulateNative task can be used like a normal robot project. Robot simulation with attached debugger is technically possible by using simulateExternalJava and modifying the launch script it exports, though unsupported.
PhotonLib must first be published to your local maven repository, then the copy PhotonLib task will copy the generated vendordep json file into each example. After that, the simulateJava/simulateNative task can be used like a normal robot project. Robot simulation with attached debugger is technically possible by using simulateExternalJava and modifying the launch script it exports, though not yet supported.

```
~/photonvision$ ./gradlew publishToMavenLocal
@@ -268,3 +260,27 @@ To run them, use the commands listed below. PhotonLib must first be published to
~/photonvision/photonlib-cpp-examples$ ./gradlew copyPhotonlib
~/photonvision/photonlib-cpp-examples$ ./gradlew <example-name>:simulateNative
```

#### Running Python

PhotonLibPy must first be built into a wheel.

```
> cd photon-lib/py
> buildAndTest.bat
```

Then, you must enable using the development wheels. robotpy will use pip behind the scenes, and this bat file tells pip about your development artifacts.

Note: This is best done in a virtual environment.

```
> enableUsingDevBuilds.bat
```

Then, run the examples:

```
> cd photonlib-python-examples
> run.bat <example name>
```
2 changes: 1 addition & 1 deletion docs/source/docs/description.md
@@ -3,7 +3,7 @@
## Description

PhotonVision is a free, fast, and easy-to-use vision processing solution for the *FIRST* Robotics Competition. PhotonVision is designed to get vision working on your robot *quickly*, without the significant cost of other similar solutions.
Using PhotonVision, teams can go from setting up a camera and coprocessor to detecting and tracking targets by simply tuning sliders. With an easy to use interface, comprehensive documentation, and a feature rich vendor dependency, no experience is necessary to use PhotonVision. No matter your resources, using PhotonVision is easy compared to its alternatives.
Using PhotonVision, teams can go from setting up a camera and coprocessor to detecting and tracking AprilTags and other targets by simply tuning sliders. With an easy to use interface, comprehensive documentation, and a feature rich vendor dependency, no experience is necessary to use PhotonVision. No matter your resources, using PhotonVision is easy compared to its alternatives.

## Advantages

37 changes: 24 additions & 13 deletions docs/source/docs/examples/aimandrange.md
@@ -4,36 +4,47 @@ The following example is from the PhotonLib example repository ([Java](https://g

## Knowledge and Equipment Needed

- Everything required in {ref}`Aiming at a Target <docs/examples/aimingatatarget:Knowledge and Equipment Needed>` and {ref}`Getting in Range of the Target <docs/examples/gettinginrangeofthetarget:Knowledge and Equipment Needed>`.
- Everything required in {ref}`Aiming at a Target <docs/examples/aimingatatarget:Knowledge and Equipment Needed>`.

## Code

Now that you know how to both aim and get in range of the target, it is time to combine them both at the same time. This example will take the previous two code examples and make them into one function using the same tools as before. With this example, you now have all the knowledge you need to use PhotonVision on your robot in any game.
Now that you know how to aim toward the AprilTag, let's also drive to the correct distance from the AprilTag.

To do this, we'll use the *pitch* of the target in the camera image and trigonometry to figure out how far away the robot is from the AprilTag. Then, like before, we'll use the P term of a PID controller to drive the robot to the correct distance.
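Separate from the linked example code, the pitch-to-distance trigonometry and the range P term can be sketched as follows. The mounting heights, camera pitch, and gain are made-up values for illustration, not values from the example:

```python
import math

# Assumed geometry (illustrative only): meters and radians.
CAMERA_HEIGHT_M = 0.25
TARGET_HEIGHT_M = 1.45
CAMERA_PITCH_RAD = math.radians(15.0)

def distance_to_target_m(target_pitch_rad):
    """Floor distance to the target, from the target's pitch in the camera image."""
    return (TARGET_HEIGHT_M - CAMERA_HEIGHT_M) / math.tan(CAMERA_PITCH_RAD + target_pitch_rad)

def forward_command(current_range_m, goal_range_m, kp=0.5):
    """P term only: drive forward when farther from the target than the goal range."""
    return kp * (current_range_m - goal_range_m)
```

Note that a target higher in the image (larger pitch) yields a smaller computed distance, which is what drives the robot toward the correct range.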

```{eval-rst}
.. tab-set::
.. tab-item:: Java
.. rli:: https://raw.githubusercontent.com/PhotonVision/photonvision/ebef19af3d926cf87292177c9a16d01b71219306/photonlib-java-examples/aimandrange/src/main/java/frc/robot/Robot.java
.. rli:: https://raw.githubusercontent.com/gerth2/photonvision/adb3098fbe0cdbc1a378c6d5a41126dd1d6d6955/photonlib-java-examples/aimandrange/src/main/java/frc/robot/Robot.java
:language: java
:lines: 42-111
:lines: 84-131
:linenos:
:lineno-start: 42
:lineno-start: 84
.. tab-item:: C++ (Header)
.. rli:: https://raw.githubusercontent.com/PhotonVision/photonvision/ebef19af3d926cf87292177c9a16d01b71219306/photonlib-cpp-examples/aimandrange/src/main/include/Robot.h
:language: cpp
:lines: 27-71
.. rli:: https://raw.githubusercontent.com/gerth2/photonvision/adb3098fbe0cdbc1a378c6d5a41126dd1d6d6955/photonlib-cpp-examples/aimandrange/src/main/include/Robot.h
:language: c++
:lines: 25-63
:linenos:
:lineno-start: 27
:lineno-start: 25
.. tab-item:: C++ (Source)
.. rli:: https://raw.githubusercontent.com/PhotonVision/photonvision/ebef19af3d926cf87292177c9a16d01b71219306/photonlib-cpp-examples/aimandrange/src/main/cpp/Robot.cpp
:language: cpp
:lines: 25-67
.. rli:: https://raw.githubusercontent.com/gerth2/photonvision/adb3098fbe0cdbc1a378c6d5a41126dd1d6d6955/photonlib-cpp-examples/aimandrange/src/main/cpp/Robot.cpp
:language: c++
:lines: 58-107
:linenos:
:lineno-start: 25
:lineno-start: 58
.. tab-item:: Python
.. rli:: https://raw.githubusercontent.com/gerth2/photonvision/adb3098fbe0cdbc1a378c6d5a41126dd1d6d6955/photonlib-python-examples/aimandrange/robot.py
:language: python
:lines: 44-98
:linenos:
:lineno-start: 44
```
42 changes: 26 additions & 16 deletions docs/source/docs/examples/aimingatatarget.md
@@ -1,45 +1,55 @@
# Aiming at a Target

The following example is from the PhotonLib example repository ([Java](https://github.com/PhotonVision/photonvision/tree/master/photonlib-java-examples/aimattarget)/[C++](https://github.com/PhotonVision/photonvision/tree/master/photonlib-cpp-examples/aimattarget)).
The following example is from the PhotonLib example repository ([Java](https://github.com/PhotonVision/photonvision/tree/master/photonlib-java-examples/aimattarget)).

## Knowledge and Equipment Needed

- Robot with a vision system running PhotonVision
- Target
- Ability to track a target by properly tuning a pipeline
- A robot
- A camera mounted rigidly to the robot's frame, centered and pointed forward.
- A coprocessor running PhotonVision with an AprilTag or ArUco 2D pipeline.
- [A printout of Apriltag 7](https://firstfrc.blob.core.windows.net/frc2024/FieldAssets/Apriltag_Images_and_User_Guide.pdf), mounted on a rigid and flat surface.

## Code

Now that you have properly set up your vision system and have tuned a pipeline, you can now aim your robot/turret at the target using the data from PhotonVision. This data is reported over NetworkTables and includes: latency, whether there is a target detected or not, pitch, yaw, area, skew, and target pose relative to the robot. This data will be used/manipulated by our vendor dependency, PhotonLib. The documentation for the Network Tables API can be found {ref}`here <docs/additional-resources/nt-api:Getting Target Information>` and the documentation for PhotonLib {ref}`here <docs/programming/photonlib/adding-vendordep:What is PhotonLib?>`.
Now that you have properly set up your vision system and have tuned a pipeline, you can now aim your robot at an AprilTag using the data from PhotonVision. The *yaw* of the target is the critical piece of data that will be needed first.

For this simple example, only yaw is needed.
Yaw is reported to the roboRIO over Network Tables. PhotonLib, our vendor dependency, is the easiest way to access this data. The documentation for the Network Tables API can be found {ref}`here <docs/additional-resources/nt-api:Getting Target Information>` and the documentation for PhotonLib {ref}`here <docs/programming/photonlib/adding-vendordep:What is PhotonLib?>`.

In this example, while the operator holds a button down, the robot will turn towards the goal using the P term of a PID loop. To learn more about how PID loops work, how WPILib implements them, and more, visit [Advanced Controls (PID)](https://docs.wpilib.org/en/stable/docs/software/advanced-control/introduction/index.html) and [PID Control in WPILib](https://docs.wpilib.org/en/stable/docs/software/advanced-controls/controllers/pidcontroller.html#pid-control-in-wpilib).
In this example, while the operator holds a button down, the robot will turn towards the AprilTag using the P term of a PID loop. To learn more about how PID loops work, how WPILib implements them, and more, visit [Advanced Controls (PID)](https://docs.wpilib.org/en/stable/docs/software/advanced-control/introduction/index.html) and [PID Control in WPILib](https://docs.wpilib.org/en/stable/docs/software/advanced-controls/controllers/pidcontroller.html#pid-control-in-wpilib).
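The button-held P-turn logic can be sketched in a few lines. The gain and sign convention here are illustrative assumptions, not values from the example:

```python
KP_TURN = 0.01  # hypothetical gain, rotation command per degree of yaw error

def turn_command(button_held, target_visible, target_yaw_deg):
    """P term only: rotate to drive the target's yaw toward zero while the button is held."""
    if not (button_held and target_visible):
        return 0.0
    # Negative sign: a target to the right (positive yaw) demands a right turn,
    # which is a negative rotation command in this assumed convention.
    return -KP_TURN * target_yaw_deg
```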

```{eval-rst}
.. tab-set::
.. tab-item:: Java
.. rli:: https://raw.githubusercontent.com/PhotonVision/photonvision/ebef19af3d926cf87292177c9a16d01b71219306/photonlib-java-examples/aimattarget/src/main/java/frc/robot/Robot.java
.. rli:: https://raw.githubusercontent.com/gerth2/photonvision/adb3098fbe0cdbc1a378c6d5a41126dd1d6d6955/photonlib-java-examples/aimattarget/src/main/java/frc/robot/Robot.java
:language: java
:lines: 41-98
:lines: 77-117
:linenos:
:lineno-start: 41
:lineno-start: 77
.. tab-item:: C++ (Header)
.. rli:: https://raw.githubusercontent.com/PhotonVision/photonvision/ebef19af3d926cf87292177c9a16d01b71219306/photonlib-cpp-examples/aimattarget/src/main/include/Robot.h
.. rli:: https://raw.githubusercontent.com/gerth2/photonvision/adb3098fbe0cdbc1a378c6d5a41126dd1d6d6955/photonlib-cpp-examples/aimattarget/src/main/include/Robot.h
:language: c++
:lines: 27-53
:lines: 25-60
:linenos:
:lineno-start: 27
:lineno-start: 25
.. tab-item:: C++ (Source)
.. rli:: https://raw.githubusercontent.com/PhotonVision/photonvision/ebef19af3d926cf87292177c9a16d01b71219306/photonlib-cpp-examples/aimattarget/src/main/cpp/Robot.cpp
.. rli:: https://raw.githubusercontent.com/gerth2/photonvision/adb3098fbe0cdbc1a378c6d5a41126dd1d6d6955/photonlib-cpp-examples/aimattarget/src/main/cpp/Robot.cpp
:language: c++
:lines: 25-52
:lines: 56-96
:linenos:
:lineno-start: 25
:lineno-start: 56
.. tab-item:: Python
.. rli:: https://raw.githubusercontent.com/gerth2/photonvision/adb3098fbe0cdbc1a378c6d5a41126dd1d6d6955/photonlib-python-examples/aimattarget/robot.py
:language: python
:lines: 46-70
:linenos:
:lineno-start: 46
```
58 changes: 0 additions & 58 deletions docs/source/docs/examples/gettinginrangeofthetarget.md

This file was deleted.

Binary file added docs/source/docs/examples/images/poseest_demo.gif
4 changes: 1 addition & 3 deletions docs/source/docs/examples/index.md
@@ -4,8 +4,6 @@
:maxdepth: 1
aimingatatarget
gettinginrangeofthetarget
aimandrange
simaimandrange
simposeest
poseest
```
