
Possible memory leak in [1.2] #3408

Open
do-not-set-2fa opened this issue Jul 19, 2024 · 17 comments
Labels
multistream (Related to Janus 1.x), needinfo

Comments

@do-not-set-2fa

What version of Janus is this happening on?
master (from 2024-07-18)
const char *janus_version_string = "1.2.4";

Have you tested a more recent version of Janus too?
Yes

Was this working before?
Yes

Is there a gdb or libasan trace of the issue?
libasan report

Additional context
Simple PeerConnection via the Lua plugin, cleanup of all sessions, and shutdown of Janus

do-not-set-2fa added the multistream (Related to Janus 1.x) label on Jul 19, 2024
do-not-set-2fa changed the title from "[1.2]" to "Possible memory leak in [1.2]" on Jul 19, 2024
@lminiero
Member

I see references to incoming requests, which makes me think there were still active connections to Janus when you shut it down.

@do-not-set-2fa
Author

Ok, I will check it again, but first vacation ;)

@do-not-set-2fa
Author

I repeated the same steps and killed all sessions (I made sure the session_list admin request returned an empty list), but got the same leaks from incoming requests. Could it be that the session reclaim timer keeps them in memory until they are fully cleared?

@atoppi
Member

atoppi commented Jul 23, 2024

Just test by terminating after the session reclaim timeout has expired, e.g. if it's 30 seconds, wait 60 seconds after the last destroy and then kill the server.

@do-not-set-2fa
Author

I have a 120-second reclaim timeout and waited more than 3 minutes, but got the same libasan output.

@lminiero
Member

You can try uncommenting the REFCOUNT_DEBUG define in refcount.h and recompiling, which should show more info on the references Janus still has at shutdown.
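(For reference, the define ships commented out in src/refcount.h; a sketch of the edit, with exact placement possibly differing across versions:)

```c
/* src/refcount.h -- uncomment the debug define and rebuild Janus.
 * With this enabled, Janus tracks all reference-counted objects and
 * prints the ones still referenced at shutdown. */
#define REFCOUNT_DEBUG
```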

@atoppi
Member

atoppi commented Jul 23, 2024

I suspect that janus_lua_session_free is never being called, since the Lua session is the only thing getting malloc'ed when creating the plugin session.
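(For context: Janus plugin sessions are reference-counted, and the free callback registered at creation only runs once the last reference is dropped. A minimal sketch of that pattern, with names modeled on janus_lua.c; illustrative only, not the actual plugin code:)

```c
#include <glib.h>
#include "refcount.h"   /* janus_refcount helpers from the Janus tree */

typedef struct janus_lua_session {
	/* ... plugin state ... */
	janus_refcount ref;
} janus_lua_session;

/* Invoked by janus_refcount_decrease() when the count reaches zero.
 * If some reference is never released, this callback never fires and
 * ASan reports the session (the only malloc'ed object) at shutdown. */
static void janus_lua_session_free(const janus_refcount *ref) {
	janus_lua_session *session = janus_refcount_containerof(ref, janus_lua_session, ref);
	g_free(session);
}

static janus_lua_session *janus_lua_session_create(void) {
	janus_lua_session *session = g_malloc0(sizeof(janus_lua_session));
	/* Register the free callback together with the initial reference */
	janus_refcount_init(&session->ref, janus_lua_session_free);
	return session;
}
```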

@atoppi
Member

atoppi commented Jul 23, 2024

@do-not-set-2fa are you able to repro with one of the stock Lua scripts?

@atoppi
Member

atoppi commented Jul 24, 2024

I was trying to repro with echotest.lua and bumped into a memory leak that seems different:

==52251==ERROR: LeakSanitizer: detected memory leaks

Direct leak of 640 byte(s) in 8 object(s) allocated from:
    #0 0x58f97aeac743 in malloc (/usr/local/janus/bin/janus+0x27b743) (BuildId: 1a85cdaa67c38c85f1006f74bd2cbbe7db8a5234)
    #1 0x7c5ae62e8af9 in g_malloc /usr/src/glib2.0-2.80.0-6ubuntu3.1/debian/build/deb/../../../glib/gmem.c:100:13
    #2 0x7c5ae63416f1 in g_system_thread_new /usr/src/glib2.0-2.80.0-6ubuntu3.1/debian/build/deb/../../../glib/gthread-posix.c:1265:12
    #3 0x7c5ae275c1fb in janus_lua_method_pushevent /home/atoppi/src/janus-gateway/src/plugins/janus_lua.c:554:3
    #4 0x7c5ae1c56f7d in luaD_precall /build/lua5.3-LmUA4D/lua5.3-5.3.6/src/ldo.c:434:12
    #5 0x7c5ae1c617f8 in luaV_execute /build/lua5.3-LmUA4D/lua5.3-5.3.6/src/lvm.c:1134:13
    #6 0x7c5ae1c4f888 in luaD_rawrunprotected /build/lua5.3-LmUA4D/lua5.3-5.3.6/src/ldo.c:142:3
    #7 0x7c5ae1c522e7 in lua_resume /build/lua5.3-LmUA4D/lua5.3-5.3.6/src/ldo.c:664:12
    #8 0x7c5ae1c6e97f in auxresume /build/lua5.3-LmUA4D/lua5.3-5.3.6/src/lcorolib.c:39:12
    #9 0x7c5ae1c6eba9 in luaB_coresume /build/lua5.3-LmUA4D/lua5.3-5.3.6/src/lcorolib.c:60:7
    #10 0x7c5ae1c56f7d in luaD_precall /build/lua5.3-LmUA4D/lua5.3-5.3.6/src/ldo.c:434:12
    #11 0x7c5ae1c617f8 in luaV_execute /build/lua5.3-LmUA4D/lua5.3-5.3.6/src/lvm.c:1134:13
    #12 0x7c5ae1c57237 in luaD_call /build/lua5.3-LmUA4D/lua5.3-5.3.6/src/ldo.c:499:5
    #13 0x7c5ae1c572ab in luaD_callnoyield /build/lua5.3-LmUA4D/lua5.3-5.3.6/src/ldo.c:509:3
    #14 0x7c5ae1c572ab in lua_callk /build/lua5.3-LmUA4D/lua5.3-5.3.6/src/lapi.c:925:5
    #15 0x7c5ae2783c5b in janus_lua_scheduler /home/atoppi/src/janus-gateway/src/plugins/janus_lua.c:2533:4
    #16 0x7c5ae6311c81 in g_thread_proxy /usr/src/glib2.0-2.80.0-6ubuntu3.1/debian/build/deb/../../../glib/gthread.c:835:20
    #17 0x58f97aeaa22c in asan_thread_start(void*) asan_interceptors.cpp.o

This is a leak for sure, since the amount of leaked bytes grows with every echotest session that is started and closed. Wondering if the root cause might be similar.
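(The trace bottoms out in g_system_thread_new from janus_lua_method_pushevent: 640 bytes over 8 objects is 80 bytes per spawned thread, which matches a GThread handle whose caller-side reference is never dropped. A hedged sketch of that GLib pattern and its usual fix; illustrative, not necessarily what #3409 does:)

```c
#include <glib.h>

static gpointer push_event_helper(gpointer data) {
	/* ... deliver the event to the Janus core asynchronously ... */
	return NULL;
}

/* Leaky: the GThread* returned by g_thread_try_new() carries a
 * reference owned by the caller; discarding it leaks the handle
 * (~80 bytes) every time an event is pushed. */
static void push_event_leaky(void) {
	g_thread_try_new("lua pushevent", push_event_helper, NULL, NULL);
}

/* Fixed: release the caller's reference right away; the thread keeps
 * running and is cleaned up when the helper returns. */
static void push_event_fixed(void) {
	GError *error = NULL;
	GThread *thread = g_thread_try_new("lua pushevent", push_event_helper, NULL, &error);
	if(thread != NULL)
		g_thread_unref(thread);
	else if(error != NULL)
		g_error_free(error);
}
```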

@atoppi
Member

atoppi commented Jul 24, 2024

#3409 should fix the leak mentioned in the message above

@atoppi
Member

atoppi commented Jul 24, 2024

With #3409 I can't reproduce memory leaks anymore with echotest.lua

@do-not-set-2fa
Author

do-not-set-2fa commented Jul 25, 2024 via email

@do-not-set-2fa
Author

@atoppi which transport plugin did you use?

Have you tried with http?

@atoppi
Member

atoppi commented Aug 5, 2024

@do-not-set-2fa it's the same with both ws and http. No leak when using this patch and the official echotest.lua.

@do-not-set-2fa
Author

Ok, this would explain the slow leak that I am experiencing over a long period of time (even though I can't seem to get the same libasan report as you do; I'm probably doing something wrong :)

I'll wait for the merge and a new tag, and then start deploying it to see the effect over time.

Thanks a lot @atoppi and @lminiero

@atoppi
Member

atoppi commented Sep 4, 2024

@do-not-set-2fa the PR has been merged and will soon be backported to 0.x too.

@do-not-set-2fa
Author

do-not-set-2fa commented Sep 4, 2024 via email
