Rebuild aarch64 only + pytorch 2.4.0 for numpy support #267
Conversation
Hi! This is the friendly automated conda-forge-linting service. I just wanted to let you know that I linted all conda-recipes in your PR.
Force-pushed from 8bdad90 to b77f56f.
Force-pushed ("…nda-forge-pinning 2024.10.01.09.51.54") from b77f56f to 9222451.
One more failure:
at test time we need glibc 2.27.... How do we get around this @conda-forge/cuda?
Unfortunately, there isn't a great way to run tests for packages that require a glibc newer than 2.17. We have been skipping tests in conda-forge CI until a compatible system (like the Alma Linux 8 images) is available. There's not a very good workaround AFAIK, but I think that
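A minimal sketch of the kind of guard a recipe's test script could use, assuming a glibc floor of 2.27 as discussed above. This is a hypothetical helper, not code from the feedstock; the real recipes may instead skip the test section entirely.

```shell
# Hypothetical helper (not from the feedstock): report whether the
# detected glibc meets a required minimum, using sort -V for a
# version-aware comparison.
glibc_at_least() {
  need="$1"; have="$2"
  # sort -V puts the lowest version first; if the lowest is $need,
  # then $have >= $need.
  [ "$(printf '%s\n%s\n' "$need" "$have" | sort -V | head -n1)" = "$need" ]
}

# Best-effort detection; empty if ldd is unavailable or non-glibc.
have=$(ldd --version 2>/dev/null | head -n1 | grep -oE '[0-9]+\.[0-9]+$' || true)
if glibc_at_least 2.27 "$have"; then
  echo "glibc $have is new enough; running runtime tests"
else
  echo "glibc ${have:-unknown} < 2.27; skipping runtime tests"
fi
```

The comparison is done with `sort -V` rather than string comparison so that, e.g., 2.9 correctly sorts below 2.17.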
Thank you for your fast reply. Will include a comment to this effect and skip things.
Q: Can we piggyback #264 in this PR since we're rebuilding aarch64? It seems to me it should be as simple as setting
AFAICT this release already has that fix. Please see comment: #264 (comment)
Agree with Bradley. @h-vetinari did this in FAISS; perhaps he could share more about what he needed there.

Though note the packages then also build in an environment with a newer GLIBC, which means the packages could have newer GLIBC symbols. Admittedly, the ARM CUDA packages already need a newer GLIBC, so in practice that may already be a reasonable move. Have found maintaining a custom

Ideally we solve this by working through GLIBC 2.28 support in conda-forge (conda-forge/conda-forge.github.io#1941), which can then be used here too. Though I understand the value of intermediate solutions; hence the comments above.
No it is not, see #264 (comment). Please also take a look at the CI log:
I do not want to try. I am doing everything I can to contain the scope of this PR. This is a "damage control PR" to unblock many others. You may branch off this PR and try the fix in a different branch, but build mods have been difficult to test and validate. |
@leofang I am willing to try for Python 3.13 PR + 2.4.1 + libprotobuf migration. |
It is a one-liner change, and the worst-case scenario is that it has no effect (i.e. no perf improvement) at run time; at build time, regardless of whether it is set, it'd always build fine, so I considered it a zero-cost trial and was hoping not to waste the CI resources (storage and bandwidth). I can certainly add this one-liner change in a standalone PR if that is preferred. Just a drive-by comment since I was pinged (as part of the CUDA team).
I'm sorry, this whole PR was a "one-liner change" that already took a month to resolve. Again, I'll click merge for a green PR, but for now, I really don't want to do it.
FYI, one issue addressed by this PR (#266) was filed by a Grace Hopper (GH200) user @jcwomack, which #264 will certainly help too
Sorry for the frustration; I wasn't aware of the one-month span, nor was I anticipating any potential pushback. I only looked at the PR start time, which was from earlier today. It is OK if we rebuild the package in the next PR. Thank you for considering.
That's OK. I am more than happy to help review PRs, and I do look for your help and guidance. If you want to help, do subscribe to the notifications on this repo.
Yeah, unfortunately it is difficult to prioritize things. Sorry for that.
If you want, you can help by forking off here and pulling in a few migrations. For what it's worth, I am trying to build on my own machine to make sure the tests pass now, before pushing and going to sleep....
No, this is what the sysroot is all about. As long as our compilers use the right sysroot version (through the stdlib jinja), there's no risk to use the newer images. This is also how it worked to build for cos6 (~glibc 2.12) in a cos7 image. |
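To illustrate the point about the sysroot: in conda-forge, the glibc floor the binaries link against is pinned by the stdlib jinja rather than by the image's own glibc. The snippet below is a hypothetical excerpt, not the feedstock's actual recipe; the version value is an assumption based on the 2.27 requirement discussed above.

```yaml
# meta.yaml (hypothetical excerpt): the stdlib jinja pulls in the
# sysroot package, so the glibc symbols available at link time are
# fixed independently of the Docker image's glibc.
requirements:
  build:
    - {{ compiler("c") }}
    - {{ compiler("cxx") }}
    - {{ stdlib("c") }}

# conda_build_config.yaml (hypothetical excerpt):
c_stdlib_version:      # linux-only; assumption: the floor this package needs
  - "2.27"
```

With the sysroot pinned this way, building inside a newer image (e.g. one with glibc 2.28) does not by itself raise the packages' runtime glibc requirement, which is the point made about building for cos6 inside a cos7 image.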
I know @hmaarrfk is working hard on unblocking this, so I'll defer to him for the purposes of this PR, but I'd much prefer bumping the image rather than skipping the tests.
Ah, just saw the auto merge label. Let's pick this up in the next PR then |
The reason for this build is that I'm pretty sure every aarch64 package for PyTorch 2.4 was built without numpy support for the last month. This should bring us back in line, allowing other improvements. Happy to use the new image next build.
Closes #266
Closes #256
Hi! This is the friendly automated conda-forge-webservice.
I've started rerendering the recipe as instructed in #266.
If I find any needed changes to the recipe, I'll push them to this PR shortly. Thank you for waiting!
Here's a checklist to do before merging.