Connection stalled in multithreaded app with 1.6.x branch #598
Comments
Thanks for testing the 1.6.x branch. There is already multithreaded support in the Python client, though I have certainly been trying to improve it. The 1.6.x branch adds an optional timeout parameter to wait_for_publish(), so you can bound how long the call blocks. I'd be interested in fixing the underlying problem of course; do you have some example code you could share?
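To illustrate the semantics of that timeout, here is a minimal stdlib-only sketch, with a hypothetical `MockMessageInfo` class standing in for paho's `MQTTMessageInfo` (the real class lives in paho-mqtt; this mock only models the wait-with-timeout behaviour):

```python
import threading

class MockMessageInfo:
    """Hypothetical stand-in for paho's MQTTMessageInfo, illustrating
    the timeout semantics of wait_for_publish() in the 1.6.x branch."""
    def __init__(self):
        self._published = threading.Event()

    def _on_publish(self):
        # Called by the network loop when the broker acknowledges the message.
        self._published.set()

    def is_published(self):
        return self._published.is_set()

    def wait_for_publish(self, timeout=None):
        # Block until the message is published, or until `timeout`
        # seconds elapse; without a timeout this can block forever.
        self._published.wait(timeout)

info = MockMessageInfo()
# Simulate the network loop acknowledging the publish after 0.1 s.
threading.Timer(0.1, info._on_publish).start()
info.wait_for_publish(timeout=2)  # returns once published, or after 2 s
print(info.is_published())
```

With the timeout, a stalled publish no longer hangs the calling thread indefinitely; the caller regains control and can decide what to do next.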
I'm planning on a release around the end of September.
Side question: is there some sort of is-loop check? It would make sense for the parent process to be able to tell whether the 'loop' logic has stalled or halted, no? I am checking is_connected() of course, but that could be misleading if the loop thread has failed, right?
Sorry for the late reply. Here is a link to my GitHub project: https://github.com/SoftwareAG/cumulocity-python-device-onboarding
Is it possible that a reconnection happened and you are using QoS 0 messages? If yes, it's probably fixed by #796.
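The relevance of QoS here is that at QoS >= 1 the client keeps unacknowledged messages and re-sends them after a reconnect, while a QoS 0 message in flight at disconnect time is simply lost. A rough illustrative model of that difference (the `OutboundQueue` class below is a sketch, not paho's actual implementation):

```python
from collections import deque

class OutboundQueue:
    """Illustrative model of client-side queuing across a reconnect:
    QoS 0 messages are fire-and-forget; QoS >= 1 messages stay queued
    until the broker acknowledges them."""
    def __init__(self):
        self.inflight = deque()

    def publish(self, payload, qos):
        if qos >= 1:
            self.inflight.append(payload)  # tracked until acknowledged
        # QoS 0: sent once, never tracked

    def on_reconnect(self):
        # Unacknowledged QoS >= 1 messages are re-sent after reconnect;
        # QoS 0 messages lost during the disconnect are gone.
        return list(self.inflight)

q = OutboundQueue()
q.publish(b"sensor-reading", qos=0)
q.publish(b"alarm", qos=1)
print(q.on_reconnect())  # only the QoS 1 message survives the reconnect
```

If the stalled wait_for_publish() calls correspond to QoS 0 messages dropped during a silent reconnect, the waiting thread would indeed block forever, which matches the reported symptom.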
Hi,
I have 5 Python 3 processes, each running 2 threads.
Each process has an MQTT connection shared by the 2 threads.
After a random amount of time, some threads stall in the wait_for_publish() method.
My understanding is that you're working on supporting multithreading in the 1.6.x branch, but apparently it is not there yet.
Currently, the only working workaround is not to use wait_for_publish() but instead to wait in a loop with a timeout. If the timeout is reached, we disconnect and then reconnect.
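That workaround can be sketched roughly as follows, with a threading.Event standing in for the publish acknowledgement and the reconnect call stubbed out (all names here are illustrative, not paho API):

```python
import threading

def publish_with_recovery(published, reconnect, timeout=5.0, retries=3):
    """Wait for a publish to complete; if it stalls past `timeout`
    seconds, tear the connection down and reconnect, then wait again."""
    for _ in range(retries):
        if published.wait(timeout):
            return True   # publish confirmed
        reconnect()       # stalled: disconnect and reconnect
    return False          # still stalled after all retries

# Demo: the "broker" confirms the publish right after the first reconnect.
published = threading.Event()
attempts = []

def fake_reconnect():
    attempts.append("reconnect")
    published.set()  # pretend the retransmit goes through immediately

ok = publish_with_recovery(published, fake_reconnect, timeout=0.1)
print(ok, attempts)
```

The bounded wait keeps the worker threads responsive: a stalled connection costs at most `timeout * retries` seconds instead of hanging forever.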
Do you have some update to share, and maybe a release date?
Many thanks in advance.
Best regards,
Cyril