PTU totally off when exceeding limit #87

Open
lucasb-eyer opened this issue Nov 18, 2014 · 24 comments

@lucasb-eyer
Member

You can test by doing the following; note that position is in radians:

rostopic pub --once /ptu/cmd sensor_msgs/JointState "header:
  seq: 0
  stamp: {secs: 0, nsecs: 0}
  frame_id: ''
name: ['tilt', 'pan']
position: [0, 0]
velocity: [0.6, 0.6]
effort: [1, 1]"

This moves it to what it thinks is 0,0. Now, make it rotate more than it can:

rostopic pub --once /ptu/cmd sensor_msgs/JointState "header:
  seq: 0
  stamp: {secs: 0, nsecs: 0}
  frame_id: ''
name: ['tilt', 'pan']
position: [0, 4]
velocity: [0.6, 0.6]
effort: [1, 1]"

4 radians is about 229°, well beyond 180°, so it will hit the hardware block at around 181°. Now, go back to 0:

rostopic pub --once /ptu/cmd sensor_msgs/JointState "header:
  seq: 0
  stamp: {secs: 0, nsecs: 0}
  frame_id: ''
name: ['tilt', 'pan']
position: [0, 0]
velocity: [0.6, 0.6]
effort: [1, 1]"

As you will notice, it goes far past what was previously 0, by exactly the amount it couldn't rotate in the 4-radian step:

rostopic pub --once /ptu/cmd sensor_msgs/JointState "header:
  seq: 0
  stamp: {secs: 0, nsecs: 0}
  frame_id: ''
name: ['tilt', 'pan']
position: [0, 0.8585]
velocity: [0.6, 0.6]
effort: [1, 1]"
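For reference, the 0.8585 figure above is consistent with a hardware stop near 180°. A quick sanity check (assuming, for simplicity, the block sits at exactly π rad; the issue text says ~181°):

```python
import math

# Commanded pan target (rad) and the assumed hardware limit (~180 deg).
commanded = 4.0
limit = math.pi  # assumption: block at exactly pi rad; actually ~181 deg

# The driver believes the full 4 rad were reached, so its internal zero
# drifts by exactly the part of the rotation the hardware swallowed.
offset = commanded - limit
print(round(offset, 4))  # 0.8584, matching the 0.8585 observed above
```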

Don't have the time to fix this now though.

@lucasb-eyer
Member Author

This is most likely the root cause of strands-project/strands_recovery_behaviours#19

@marc-hanheide
Member

Did you hear a funny sound? That sounds to me as if your PTU hit an obstacle. If it does, it is miscalibrated, and only a restart fixes that. This can also happen when it goes too fast, not only when it hits the limits. Indeed, we have disabled limit checking: https://github.com/strands-project/scitos_drivers/blob/hydro-devel/flir_pantilt_d46/launch/ptu46.launch#L11

This is so we can look down at tables and actually pan to 180, which exceeds the pre-set factory limits. So yes, one has to be very careful with that.

@ghost

ghost commented Nov 18, 2014

I would say that is exactly what was happening; the weird sound already made me uncomfortable. What I don't get, however, is that the angle in the recovery behaviour was only 179, so not even the full 180 (and it can supposedly go to something like 182), yet we were still having these issues. I might of course be wrong, and some other component actually "overturned" the PTU, but I think these issues started to occur when the recovery behaviour was enabled (i.e. today).


@lucasb-eyer
Member Author

Yes, we heard the sound, and I think it comes from hitting the hardware limit; you can see a small knob in the front that it hits when reaching the limit. I only tried these extreme values for debugging, but this is a real problem in practice, because someone, somewhere exceeds the limits, and then our PTU's 0 is off, which hinders docking and potentially much more.

Either we re-enable the limits, or the driver-node needs to keep track of the actual real-world orientation of the PTU. As-is, it seems the driver just accepts whatever number it gets and assumes the PTU gets to that orientation, which explains the above behaviour. This last part is the culprit; checking limits is just a (workable) workaround.
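The workaround half of this could look like the sketch below: clamp each commanded position to a software limit before it reaches the hardware, so the driver's assumed position can never diverge from what the PTU actually reaches. The limit values and function names here are hypothetical and are not taken from the actual flir_pantilt_d46 driver:

```python
# Hypothetical software-limit clamp for incoming pan/tilt commands.
# Limit values are illustrative, not the real D46 factory limits.
PAN_LIMITS = (-3.13, 3.13)   # rad, just inside the ~181 deg hardware block
TILT_LIMITS = (-0.82, 0.52)  # rad, illustrative

def clamp(value, limits):
    lo, hi = limits
    return max(lo, min(hi, value))

def safe_command(pan, tilt):
    """Clamp a commanded pose so the driver's internal position
    stays consistent with what the hardware can actually reach."""
    return clamp(pan, PAN_LIMITS), clamp(tilt, TILT_LIMITS)

print(safe_command(4.0, 0.0))  # (3.13, 0.0): the 4 rad request is clamped
```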

@cdondrup
Member

I would strongly argue against reintroducing the limits. Turning the PTU 180 degrees is very useful in many cases.

@marc-hanheide
Member

Yes, the limits are very tight; we wouldn't be able to look back with them in place. The problem is that the PTU uses a servo, i.e. it doesn't measure its position but counts the ticks sent to the motor. So if it's off, it stays off until recalibrated; nothing we can do about that. It can still happen with the limits in place, e.g. when going fast and having one of the cables trapped. It might however be reasonable to reduce the speed (I don't know what it's set to at the minute). I had to do that for the deployment as well, as our PTU simply wasn't working robustly.
We could introduce software limits, but I don't think that will solve the issue here. Also, as it's a servo, it cannot overshoot by design; if it does, it was already decalibrated.

@lucasb-eyer
Member Author

I'm always for fixing the root of a problem instead of working around, but I'm not for spending tomorrow hacking on PTU code :)

Of course, I could also hunt down the culprit node, get mad at its author, rinse&repeat for all other times this will happen...

@marc-hanheide
Member

And you are sure it isn't just the speed or you somehow blocked the PTU?

@lucasb-eyer
Member Author

I'll have to check the speed tomorrow. No blocking, no.

@marc-hanheide
Member

OK, let's leave it for now. I want to see it happening myself ;-)

@lucasb-eyer
Member Author

There's no recalibration code in the ROS PTU driver, so I have to switch the whole robot off and on again for the PTU to recalibrate, which of course kills the long-term idea. Let's see how things go today.

@cdondrup
Member

You could try to reconfigure the EBC the PTU is on. Same effect.

@lucasb-eyer
Member Author

Great idea, thanks!

@nilsbore
Member

I confirmed the issue you had with backtrack; 179 seems too far. Since our robot is patrolling the whole day, I can't test the maximum angle. @lucasb-eyer @AlexanderHermans Would you guys be able to see how far the pan can go with tilt=30? Then I can change the values in the backtrack code. Would be much appreciated.

@lucasb-eyer
Member Author

Do you have a better plan for testing this, other than going almost there and then stepping 1-by-1 and listening for the noise?

@nilsbore
Member

You don't need to go to the absolute maximum. Maybe just go to something like 175 and see if it's ok after turning back and forth a couple of times.

@lucasb-eyer
Member Author

Ok, will do later today.

@marc-hanheide
Member

Just wondering if there might be a conflict with @cdondrup's "look at nav goal" behaviour that kicks in here sending the PTU out of limits?

@cdondrup
Member

"Look at nav goal" just makes the head move; no PTU involved.

@marc-hanheide
Member

Of course. Stupid me.

@nilsbore
Member

@lucasb-eyer If you haven't gotten there yet, could you try @marc-hanheide's suggestion of -175 instead? I'll go with that if it seems better.

@cdondrup
Member

I agree, if you want to turn the PTU to the back use -180. Prevents it from getting tangled up in the cables.
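One way to honor this convention mechanically is a tiny helper that wraps requested pan angles into [-π, π), so a "look back" request always comes out as -180° rather than +180°. This is a hypothetical sketch, not code from the driver:

```python
import math

def wrap_pan(angle):
    """Map a requested pan angle (rad) into [-pi, pi), so a 'look back'
    request resolves to -pi rather than +pi (keeps cable slack)."""
    return (angle + math.pi) % (2 * math.pi) - math.pi

print(wrap_pan(math.radians(180)))  # -pi: turn to the back via the negative side
print(round(math.degrees(wrap_pan(math.radians(185))), 1))  # -175.0
```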


@lucasb-eyer
Member Author

We ran for a whole day with multiple backtracking instances and in the evening, the PTU was still perfectly aligned, so it doesn't seem to be the backtracking. Didn't get the chance to test docking yet, though @cdondrup said that works fine for them.
