Couldn't find a setup script in /tmp/easy_install-a8g3h_h2/scipy-1.14.0.tar.gz #799
I think the README.md is incorrectly suggesting Python 3.9 in one place (elsewhere it suggests 3.10). The error in your logs suggests using 3.10 here.
I ran into this issue too, and it looks like scipy changed their install method with version 1.14. I was able to work around it by editing setup.py to pin the last working version, "scipy==1.13.1", under install_requires.
The above did work for me, but I got stuck on another error.
Seems like several dependencies are conflicting for me. Any thoughts on what I can do to get this running? Sorry, I'm a n00b with Python and pip configuration management.
For that one I changed it to 'tokenizers==0.13.4.rc3', since that was the last version <0.14.
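Putting the two pins above together, the relevant part of setup.py might look like this — a sketch, not the repo's actual file; only the two pinned lines come from this thread, and the rest of the dependency list stays as the project ships it:

```python
# Hypothetical excerpt of setup.py -- only install_requires changes.
install_requires = [
    "scipy==1.13.1",           # last scipy before 1.14, which dropped the old
                               # setup.py-based install path easy_install needs
    "tokenizers==0.13.4.rc3",  # last tokenizers release below 0.14
    # ...keep the project's remaining dependencies unchanged...
]
```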
I keep getting an installation error on the last step:
How do I fix this? I followed the conda installation instructions step by step on Debian and can't get past this error. Even if I manually copy that file into the tmp directory, the random folder name changes each time, so the installer won't pick it up. How do we install this on Debian 12?
UPDATE: I found another fix in another thread to fix that one problem, but then I ran into a series of other problems that I was ultimately unable to solve:
I changed "tokenizers" to "tokenizers==0.11.1". The installation finally finishes, but the program doesn't actually work. For example, if I run "python tortoise/socket_server.py", I get the following output:
So then I tried adding:
'spacy',
to my setup.py file. Now I get this:
Does anybody know how to fix this? How do I manually uninstall tokenizers 15? Where are those files stored? Can I manually install these dependencies in conda? "pip install [package]" doesn't work.
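As for where those files are stored: the stdlib importlib.metadata module can report where pip put a distribution without invoking pip at all. A small sketch — `locate` is a hypothetical helper name, not part of any library:

```python
from importlib import metadata

def locate(dist_name):
    """Return the directory a pip-installed distribution lives in, or None."""
    try:
        dist = metadata.distribution(dist_name)
    except metadata.PackageNotFoundError:
        return None
    # locate_file("") resolves to the site-packages directory holding the package
    return str(dist.locate_file(""))

print(locate("tokenizers"))  # a site-packages path, or None if not installed
```

From there, `pip uninstall tokenizers` run inside the same environment (or, as a last resort, deleting the `tokenizers*` entries in that directory) removes the bad version before reinstalling a pinned one.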
OK, so after trying a lot of different stuff, I think I finally tricked it into installing tokenizers 13.1. Now, running "python tortoise/socket_server.py" gets me this error:
Any help would be greatly appreciated.
EDIT AGAIN: OK, disregard everything up to the spacy issue. Turns out I had things really messed up, and actually the following issue is the furthest I got:
Can someone please post up-to-date installation instructions? The conda one 100% does not work, at least on Debian, and neither does the single-line pip command, because Debian doesn't let you use pip that way. Then if I try to set up a Python venv, I just get a torrent of cryptic errors, and it's hard to tell what it even wants.
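On Debian 12 the system Python is marked "externally managed" (PEP 668), which is why the bare pip command is refused; a virtual environment sidesteps that. The stdlib venv module can build one, equivalent to running `python3 -m venv .venv` — a sketch; the `.venv` name and the editable install at the end are assumptions, not project instructions:

```python
import venv
from pathlib import Path

env_dir = Path(".venv")
# with_pip=True bootstraps pip inside the new environment via ensurepip
# (on Debian this needs the python3-venv package installed).
venv.EnvBuilder(with_pip=True).create(env_dir)

# The environment's own interpreter (POSIX layout; Windows uses Scripts/).
py = env_dir / "bin" / "python"
print(py)
# Then, from a shell inside the repo:  .venv/bin/python -m pip install -e .
```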