[Question] Why is there no pipenv / poetry for installing dependencies? #69
For what it's worth, I adapted this project to use poetry, and would recommend it! It's not really in a state that would be compatible with a pull request any more though 😑
@dmontagu Could you provide more details about that? I currently use pipenv for the editor and add commands to the dockerfiles to install dependencies as required. In a previous project I used pipenv to install globally in the docker container. Is this similar to what you did?
@omrihar some discussion here of how I'm using pyproject.toml and poetry to generate a requirements.txt which I add to the container and use to install dependencies. Basically, I use `poetry export` to produce a pinned requirements file, `pip install` it in the image, and then run a final `poetry install`. (I think I can actually drop the final poetry install step, but I ran into some inconsistencies in the past where it helped.)
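A minimal sketch of that export-then-pip flow (the commands are standard poetry/pip; the file name is illustrative):

```sh
# Export poetry's resolved, pinned dependencies to a pip-compatible file
poetry export -f requirements.txt > requirements.txt

# The Docker build then only needs pip to install the pinned versions:
#   COPY requirements.txt /app/
#   RUN pip install -r /app/requirements.txt
```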
I use poetry locally, as per the out-of-the-box instructions, for local development, and I have a set of Dockerfile instructions to install the Python dependencies in the Docker containers. No need to go through a `requirements.txt`. If you would like to avoid installing poetry in the container, you can also generate a `requirements.txt` file from inside a poetry shell and use it with pip. The latest poetry version (unreleased yet) should provide a way to generate a requirements file too (last I heard).

Bottom line is that I need to have full control of the version numbers being installed, and to be sure that my builds are frozen on specific versions (that I have tested) and that accidental upstream changes will not creep into later builds (unless I manually bump version numbers, after testing them). I've been bitten many times by that before.
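One concrete way to do that, as a sketch (assuming the project's poetry-managed virtualenv is active):

```sh
# From inside the poetry shell / virtualenv:
pip freeze > requirements.txt
# The resulting pinned file can then be installed with plain pip in the image,
# with no poetry installation required in the container.
```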
@stratosgear See the issue I linked above for a discussion of precisely why I want to install the exported requirements with pip first. Also related: if you don't use a poetry-generated virtual environment (I see you are skipping that step), it is difficult to run `poetry install` as a non-root user.
Wow, that was a long thread. So, if I understood correctly, you are annoyed that every time you make a change to the `pyproject.toml`, the Docker build reinstalls the whole set of dependencies. Yes, I am annoyed too, but never so much as to change the above procedure. It is a tricky task, and I never found easy-to-follow instructions on how to do it, just convoluted workarounds.

I do use the poetry-generated virtual environment on my local development machine (along with some dev dependencies), but the production builds are done with poetry inside the containers, without the dev dependencies.

Regarding using root for `poetry install` in the containers, I never even thought about it; I am installing everything in the containers as the default (root) user.

It would be sweet if there was an easy, straightforward way of addressing all the issues that you mentioned, but the complexity level is too much for my taste. I wish there was a way to get notified when you find the golden recipe for all that.
Yes, I think you understand correctly. The complexity level is too much for my taste too 😄. I think the tweak to the process of doing a pip install prior to poetry has already more than paid off for me, but maybe I do more builds (with changes to pyproject.toml) than most. (I also have hundreds of megs of dependencies due to scientific-stack stuff, so it is painfully slow to re-download all packages.) I also have some projects with "interesting" build processes (e.g., one that builds cython and cmake extensions); it is extremely painful to iteratively debug the build process if every change triggers a fresh pip install. Also, I think using
Help me get this: if you have already installed with pip, what does a `poetry install` on top of that offer? And you haven't even touched the case of installing on an Alpine base image, where you cannot install from wheels and have to download and compile everything (numpy, healpy, etc.; been there, done that, by biting the bullet and... waiting :(. Multistage builds would surely help, but I never invested in them.)

Do you have a current implementation of your install procedure in any public repo? I might be able to learn something from there...
I don't have any public repos using this implementation, but here is the dockerfile that is used to build a common base image in one of my projects:

FROM python:3.7
RUN pip install --upgrade pip
RUN useradd -m worker \
&& mkdir /app \
&& chown -R worker:worker /app
USER worker
ENV POETRY_VERSION=1.0.0b1 \
POETRY_VIRTUALENVS_CREATE=false \
PYTHONFAULTHANDLER=1 \
PYTHONUNBUFFERED=1 \
PYTHONHASHSEED=random \
PIP_NO_CACHE_DIR=off \
PIP_DISABLE_PIP_VERSION_CHECK=on \
PIP_DEFAULT_TIMEOUT=100
# Install Poetry, and set up PATH
# See https://github.com/sdispater/poetry/issues/1301 for pros/cons of this approach
RUN curl -sSL https://raw.githubusercontent.com/sdispater/poetry/master/get-poetry.py | POETRY_PREVIEW=1 python
ENV PYTHONPATH=/app \
HOME=/home/worker \
PATH="/home/worker/.local/bin:/home/worker/.poetry/bin:${PATH}"
WORKDIR /app
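# Copy and install the poetry-exported requirements first, so this layer is
# rebuilt only when the exported pins change (see the export commands below)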
COPY --chown=worker:worker ./app/requirements/requirements-poetry.txt /app/requirements/
RUN pip install \
--user \
-r requirements/requirements-poetry.txt
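# Then copy the full requirements directory and install requirements.txt,
# also searching requirements/wheels for any locally provided wheels (--find-links)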
COPY --chown=worker:worker ./app/requirements /app/requirements
RUN pip install \
--user \
--find-links=requirements/wheels \
-r requirements/requirements.txt
COPY --chown=worker:worker ./app /app

I generate requirements-poetry.txt with:

poetry export -f requirements.txt > requirements_tmp.txt
mv requirements_tmp.txt requirements/requirements-poetry.txt
# Remove "extra" flags not used by pip:
sed -i "" 's/ extra == "[^"]*"//g' requirements/requirements-poetry.txt

The sed step just strips the extra == "..." markers, since pip doesn't use them.
You'll notice the non-root worker user and the --user flags on the pip installs. (There is a known issue with poetry where it tries to install globally if no virtualenv is being used; this causes permissions errors if not running as root.)

The project using the dockerfile above doesn't have any extensions, so it works okay without any pre-built wheels in requirements/wheels. There are various ways to get such wheels built and into the image. I also haven't tried using multistage builds for this stuff yet, but that also seems like it could be useful.
For what it's worth, I am posting my updated version of how I deal with dependencies, in the hope that it might help someone (or, even better, that someone improves upon it). I use poetry locally for development. I have a script (I use invoke to automate many such little things) that autogenerates a frozen requirements.txt from the poetry project.
That frozen requirements.txt is what the container builds install from. Then on my CI pipeline I use a multistage build.
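A minimal sketch of such a multistage build (image names and paths are hypothetical; it assumes the frozen requirements.txt approach described above):

```dockerfile
# Builder stage: install all pinned dependencies once into an isolated prefix
FROM python:3.7 AS builder
COPY requirements.txt /tmp/requirements.txt
RUN pip install --prefix=/install -r /tmp/requirements.txt

# Final stage: copy only the installed packages into a clean base image
FROM python:3.7-slim
COPY --from=builder /install /usr/local
COPY ./app /app
WORKDIR /app
```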
Then I use that base image to construct further images (with my fastapi server, or celery workers, etc.). It seems to be working fine so far, although it is indeed quite a process to get it right... Thanks
Thanks for the discussion everyone! ☕ The latest version is now based on Poetry, for local development and integrated into the Dockerfile builds.
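As a rough sketch of what that integration can look like in a backend image (the base image and flags here are illustrative, loosely following the COPY/poetry install steps visible in the build log quoted further down):

```dockerfile
FROM python:3.9
# Install poetry itself and disable virtualenv creation inside the image
RUN pip install poetry && poetry config virtualenvs.create false
WORKDIR /app/
# Copy only the dependency manifests first so this layer is cached
COPY ./app/pyproject.toml ./app/poetry.lock* /app/
# Install from the lock file; include dev dependencies only for dev builds
ARG INSTALL_DEV=false
RUN bash -c "if [ $INSTALL_DEV == 'true' ] ; then poetry install --no-root ; else poetry install --no-root --no-dev ; fi"
COPY ./app /app
```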
Assuming the original issue was solved, it will be automatically closed now. But feel free to add more comments or create new issues.
fastapi#69 fastapi#123 fastapi#144 fastapi/full-stack-fastapi-template@00297f9

Commit 00297f9 gitignored poetry.lock. This commit will add poetry.lock to version control with Git to avoid dependency resolution errors during Docker builds.

There is no established convention for working with Poetry in Docker, so developers have to consider each use case individually. See: python-poetry/poetry#1879 (comment)

In this project, the Dockerfile copies poetry.lock into the Docker image, but there's no step to generate poetry.lock in the first place. Without poetry.lock, dependency resolution errors are commonly seen, such as:

```text
❯ bash scripts/test.sh
WARNING: The following deploy sub-keys are not supported and have been ignored: labels
WARNING: The following deploy sub-keys are not supported and have been ignored: labels
WARNING: The following deploy sub-keys are not supported and have been ignored: labels
WARNING: The following deploy sub-keys are not supported and have been ignored: labels
WARNING: The following deploy sub-keys are not supported and have been ignored: labels
db uses an image, skipping
flower uses an image, skipping
pgadmin uses an image, skipping
proxy uses an image, skipping
queue uses an image, skipping
Building backend
[+] Building 15.3s (8/10)
 => [internal] load build definition from backend.dockerfile                                      0.2s
 => => transferring dockerfile: 797B                                                              0.1s
 => [internal] load .dockerignore                                                                 0.1s
 => => transferring context: 2B                                                                   0.0s
 => [internal] load metadata for ghcr.io/br3ndonland/inboard:fastapi-python3.9                    0.3s
 => [1/6] FROM ghcr.io/br3ndonland/inboard:fastapi-python3.9@sha256:5591f436a37490a1569afd9e55ae  0.0s
 => [internal] load build context                                                                 0.0s
 => => transferring context: 64.67kB                                                              0.0s
 => CACHED [2/6] COPY ./app/pyproject.toml ./app/poetry.lock* /app/                               0.0s
 => CACHED [3/6] WORKDIR /app/                                                                    0.0s
 => ERROR [4/6] RUN bash -c "if [ true == 'true' ] ; then poetry install --no-root ; else poetr  14.4s
------
 > [4/6] RUN bash -c "if [ true == 'true' ] ; then poetry install --no-root ; else poetry install --no-root --no-dev ; fi":
Skipping virtualenv creation, as specified in config file.
Installing dependencies from lock file
Warning: The lock file is not up to date with the latest changes in pyproject.toml. You may be getting dependencies. Run update to update them.

  SolverProblemError

  Because app depends on sqlalchemy-stubs (^0.3) which doesn't match any versions, version solving failed.

  at /opt/poetry/lib/poetry/puzzle/solver.py:241 in _solve
      237│             packages = result.packages
      238│         except OverrideNeeded as e:
      239│             return self.solve_in_compatibility_mode(e.overrides, use_latest=use_latest)
      240│         except SolveFailure as e:
    → 241│             raise SolverProblemError(e)
      242│
      243│         results = dict(
      244│             depth_first_search(
      245│                 PackageNode(self._package, packages), aggregate_package_nodes
------
executor failed running [/bin/sh -c bash -c "if [ $INSTALL_DEV == 'true' ] ; then poetry install --no-root ; else poetry install --no-root --no-dev ; fi"]: exit code: 1
ERROR: Service 'backend' failed to build : Build failed
```
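A sketch of the fix described above (directory names are assumed; `poetry lock` and the git commands are standard):

```sh
# Stop gitignoring poetry.lock, then generate and commit it
cd backend/app                 # assumed location of pyproject.toml
poetry lock                    # resolve dependencies and write poetry.lock
git add pyproject.toml poetry.lock
git commit -m "Add poetry.lock to version control"
```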
I would assume that one or the other (or even a requirements.txt) would be used for setting up the python dependencies.
I've seen so MANY nice libraries/abstractions already used throughout this cookiecutter that I'm surprised there is no strict way of controlling Python package dependency versions.