change how user "debug" files in debug_files.tar.gz are handled
#8719
Moving my thoughts here from #8699 (comment), about copying the user debug files to WEB_DIR so that in the UI they can be fetched from the scheduler [1]. OTOH this could be done in AdjustSites when WEB_DIR is prepared, instead of both copying debug_files.tar.gz there and creating a symlink for the debug/ directory:

CRABServer/scripts/AdjustSites.py lines 228 to 233 in e8149dd

And leave for a future optimization to simply put the files in S3, like we do for twlog, w/o creating the tarball. Having a tar saves on repeated calls from the client to crabserver (to get preapproved URLs) and to S3. The idea was that when CRABClient is executed from sites with a large RTT, the fewer HTTP calls, the better. Maybe better to settle on the final solution and do all changes in one go, avoiding code that is put there "for a few months" and then is hard to remove.

[1] CRABServer/src/script/task_info.js lines 231 to 245 in e8149dd
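The RTT argument above can be made concrete with a small sketch. The function names and the two-calls-per-upload accounting (one presigned-URL request to crabserver plus one PUT to S3) are my assumptions for illustration, not the actual client code; the point is only that one tarball costs a constant number of round trips while per-file uploads cost a multiple of the file count.

```python
import io
import tarfile

def round_trips(n_files, use_tarball):
    """Hypothetical count of HTTP calls: each upload is assumed to need one
    presigned-URL request to crabserver plus one PUT to S3."""
    if use_tarball:
        return 2          # one URL request + one PUT for the whole tarball
    return 2 * n_files    # one URL request + one PUT per file

def make_debug_tarball(files):
    """Pack {name: bytes} into an in-memory .tar.gz, like debug_files.tar.gz."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w:gz") as tar:
        for name, data in files.items():
            info = tarfile.TarInfo(name=name)
            info.size = len(data)
            tar.addfile(info, io.BytesIO(data))
    return buf.getvalue()

# Two debug files: 4 calls uploaded individually, 2 as a single tarball.
files = {"crabConfig.py": b"config = ...", "scriptExe.sh": b"#!/bin/sh\n"}
tarball = make_debug_tarball(files)
print(round_trips(len(files), use_tarball=False))  # 4
print(round_trips(len(files), use_tarball=True))   # 2
```

On a high-RTT link every saved round trip matters, which is why dropping the tarball in favor of per-file S3 uploads is a trade-off and not a pure simplification.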
Currently we put in the S3 cache debug_files.tar.gz, which contains the user's config and script exe, to help operators debug problems and to create a new task to submit in crab recovery.
But in order to make those files visible in the UI, we need a URL which points to the text version. We currently do this by placing the files in WEB_DIR/debug and fetching them via the scheduler.
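A minimal sketch of that mechanism: unpack the tarball's members into WEB_DIR/debug so that each one is an individual text file the scheduler can serve at a plain URL. The function name `publish_debug_files` and the extraction step are assumptions for illustration, not the actual AdjustSites code.

```python
import io
import pathlib
import tarfile
import tempfile

def publish_debug_files(tarball_path, web_dir):
    """Hypothetical helper: extract every member of debug_files.tar.gz into
    WEB_DIR/debug, so each becomes fetchable as WEB_DIR/debug/<name>."""
    debug_dir = pathlib.Path(web_dir) / "debug"
    debug_dir.mkdir(parents=True, exist_ok=True)
    with tarfile.open(tarball_path, mode="r:gz") as tar:
        tar.extractall(path=debug_dir)
    return sorted(p.name for p in debug_dir.iterdir())

# Usage with a throwaway tarball standing in for debug_files.tar.gz
with tempfile.TemporaryDirectory() as tmp:
    tarball = pathlib.Path(tmp) / "debug_files.tar.gz"
    with tarfile.open(tarball, mode="w:gz") as tar:
        for name in ("crabConfig.py", "scriptExe.sh"):
            data = b"# placeholder\n"
            info = tarfile.TarInfo(name=name)
            info.size = len(data)
            tar.addfile(info, io.BytesIO(data))
    published = publish_debug_files(tarball, pathlib.Path(tmp) / "WEB_DIR")
    print(published)  # ['crabConfig.py', 'scriptExe.sh']
```

Doing this once when WEB_DIR is prepared would avoid shipping both the tarball and the loose files, at the cost of still depending on the scheduler to serve them.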
Also, for historical reasons the TW sends to the scheduler both the tarball and the explicit files, which is a duplication of content.
Can we do something better, simpler, and easier to document and understand?

Since those files are small, the duplication in sending from TW to scheduler is not a big worry, but the confusion and lack of documentation are bad.