Error installing vLLM on NVIDIA Jetson AGX Orin #1872

MausamJain opened this issue Jul 12, 2024 · 0 comments

I have tried installing with pip, from the git source, and by building with setup.py, but none of these succeeded.
Below are the details of my setup:

jtop 4.2.8 (c) 2024, Raffaello Bonghi [[email protected]]
Website: https://rnext.it/jetson_stats
Platform
  Machine: aarch64
  System: Linux
  Distribution: Ubuntu 22.04 Jammy Jellyfish
  Release: 5.15.136-tegra
  Python: 3.10.12
Hardware
  Model: NVIDIA Jetson AGX Orin Developer Kit
  699-level Part Number: 699-13701-0005-500 S.0
  P-Number: p3701-0005
  Module: NVIDIA Jetson AGX Orin (64GB RAM)
  SoC: tegra234
  CUDA Arch BIN: 8.7
  L4T: 36.3.0
  Jetpack: 6.0
  Serial Number: [hidden]
Libraries
  CUDA: 12.2.140
  cuDNN: 8.9.4.25
  TensorRT: 8.6.2.3
  VPI: 3.1.5
  Vulkan: 1.3.204
  OpenCV: 4.8.0 with CUDA: NO
Hostname: ubuntu
Interfaces
  eth0: 10.0.1.30
  docker0: 172.17.0.1
Operating System: Linux - 5.15.136-tegra - aarch64
Compiler: /usr/bin/c++
Compiler Version: 11.4.0 (Ubuntu 11.4.0-1ubuntu1~22.04)

With pip -

(bot_venv) lftds@ubuntu:~/Documents/copilot$ pip install vllm
Collecting vllm
Using cached vllm-0.5.1.tar.gz (790 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... error
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> [16 lines of output]
Traceback (most recent call last):
File "/home/lftds/Documents/copilot/bot_venv/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in
main()
File "/home/lftds/Documents/copilot/bot_venv/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
File "/home/lftds/Documents/copilot/bot_venv/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
return hook(config_settings)
File "/tmp/pip-build-env-wsi_ghvs/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 327, in get_requires_for_build_wheel
return self._get_build_requires(config_settings, requirements=[])
File "/tmp/pip-build-env-wsi_ghvs/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 297, in _get_build_requires
self.run_setup()
File "/tmp/pip-build-env-wsi_ghvs/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 313, in run_setup
exec(code, locals())
File "", line 432, in
File "", line 353, in get_vllm_version
RuntimeError: Unknown runtime environment
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.
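
If I read vLLM's setup.py correctly, the "RuntimeError: Unknown runtime environment" comes from get_vllm_version when none of the supported platforms (CUDA, ROCm, CPU, ...) is detected, and that detection seems to go through whichever torch the isolated build environment can import. As a sanity check (a generic command, not specific to vLLM), this is how I can see what CUDA support the venv's torch reports:

python -c "import torch; print(torch.__version__, torch.version.cuda, torch.cuda.is_available())"
# On Jetson, a CPU-only aarch64 torch wheel prints None for the CUDA version,
# which I assume is enough to make the platform detection above fail.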

From git source -

(bot_venv) lftds@ubuntu:~/Documents/copilot$ git clone https://github.com/vllm-project/vllm
Cloning into 'vllm'...
remote: Enumerating objects: 22588, done.
remote: Total 22588 (delta 0), reused 0 (delta 0), pack-reused 22588
Receiving objects: 100% (22588/22588), 21.67 MiB | 8.61 MiB/s, done.
Resolving deltas: 100% (16741/16741), done.
(bot_venv) lftds@ubuntu:~/Documents/copilot$ cd vllm/
(bot_venv) lftds@ubuntu:~/Documents/copilot/vllm$ pip install .
Processing /home/lftds/Documents/copilot/vllm
Installing build dependencies ... done
Getting requirements to build wheel ... error
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> [16 lines of output]
Traceback (most recent call last):
File "/home/lftds/Documents/copilot/bot_venv/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in
main()
File "/home/lftds/Documents/copilot/bot_venv/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
File "/home/lftds/Documents/copilot/bot_venv/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
return hook(config_settings)
File "/tmp/pip-build-env-x2sad19s/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 327, in get_requires_for_build_wheel
return self._get_build_requires(config_settings, requirements=[])
File "/tmp/pip-build-env-x2sad19s/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 297, in _get_build_requires
self.run_setup()
File "/tmp/pip-build-env-x2sad19s/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 313, in run_setup
exec(code, locals())
File "", line 432, in
File "", line 353, in get_vllm_version
RuntimeError: Unknown runtime environment
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.
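
Because the source checkout fails with the same "Unknown runtime environment" error, one workaround I plan to try (hedged, not yet verified on Jetson): install a CUDA-enabled aarch64 torch wheel into bot_venv first, then build without pip's isolated build environment so that setup.py detects that torch:

pip install --no-build-isolation .
# Assumes a CUDA-enabled torch and the other build requirements from pyproject.toml
# (cmake, ninja, packaging, setuptools, wheel, ...) are already installed in bot_venv.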

Building with setup.py -

(bot_venv) lftds@ubuntu:~/Documents/copilot/vllm$ python setup.py bdist_wheel
running bdist_wheel
running build
running build_py
creating build
creating build/lib.linux-aarch64-cpython-310
creating build/lib.linux-aarch64-cpython-310/vllm
copying vllm/tracing.py -> build/lib.linux-aarch64-cpython-310/vllm
copying vllm/model_executor/layers/fused_moe/configs/E=8,N=7168,device_name=AMD_Instinct_MI300X.json -> build/lib.linux-aarch64-cpython-310/vllm/model_executor/layers/fused_moe/configs
copying vllm/model_executor/layers/fused_moe/configs/E=8,N=14336,device_name=AMD_Instinct_MI300X.json -> build/lib.linux-aarch64-cpython-310/vllm/model_executor/layers/fused_moe/configs
running build_ext
-- The CXX compiler identification is GNU 11.4.0
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Build type: RelWithDebInfo
-- Target device: cuda
-- Could NOT find Python (missing: Python_INCLUDE_DIRS Interpreter Development.Module Development.SABIModule) (found version "3.10.12")
CMake Error at cmake/utils.cmake:10 (message):
Unable to find python matching:
/home/lftds/Documents/copilot/bot_venv/bin/python.
Call Stack (most recent call first):
CMakeLists.txt:43 (find_python_from_executable)

-- Configuring incomplete, errors occurred!
See also "/home/lftds/Documents/copilot/vllm/build/temp.linux-aarch64-cpython-310/CMakeFiles/CMakeOutput.log".
Traceback (most recent call last):
File "/home/lftds/Documents/copilot/vllm/setup.py", line 430, in
setup(
File "/home/lftds/Documents/copilot/bot_venv/lib/python3.10/site-packages/setuptools/init.py", line 103, in setup
return distutils.core.setup(**attrs) .
.
.
.
.
.
.
.
.
.
.
File "/home/lftds/Documents/copilot/vllm/setup.py", line 175, in configure
subprocess.check_call(
File "/usr/lib/python3.10/subprocess.py", line 369, in check_call
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['cmake', '/home/lftds/Documents/copilot/vllm', '-DCMAKE_BUILD_TYPE=RelWithDebInfo', '-DCMAKE_LIBRARY_OUTPUT_DIRECTORY=/home/lftds/Documents/copilot/vllm/build/lib.linux-aarch64-cpython-310/vllm', '-DCMAKE_ARCHIVE_OUTPUT_DIRECTORY=build/temp.linux-aarch64-cpython-310', '-DVLLM_TARGET_DEVICE=cuda', '-DVLLM_PYTHON_EXECUTABLE=/home/lftds/Documents/copilot/bot_venv/bin/python', '-DNVCC_THREADS=1']' returned non-zero exit status 1.
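
The "Could NOT find Python (missing: Python_INCLUDE_DIRS Interpreter Development.Module Development.SABIModule)" line makes me suspect the CPython development headers are missing system-wide. A guess at the prerequisite (Ubuntu 22.04 package name assumed):

sudo apt-get install python3.10-dev
# Provides Python.h and libpython3.10 so that CMake's FindPython can satisfy Development.Module.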

(bot_venv) lftds@ubuntu:~/Documents/copilot/vllm$ cmake -DPYTHON_EXECUTABLE=/home/lftds/Documents/copilot/bot_venv/bin/python ..
CMake Error: The source directory "/home/lftds/Documents/copilot" does not appear to contain CMakeLists.txt.
Specify --help for usage, or press the help button on the CMake GUI.

(bot_venv) lftds@ubuntu:~/Documents/copilot/vllm$ cd /home/lftds/Documents/copilot/vllm
mkdir -p build
cd build
cmake -DPYTHON_EXECUTABLE=/home/lftds/Documents/copilot/bot_venv/bin/python -DPYTHON_INCLUDE_DIR=/usr/include/python3.10 -DPYTHON_LIBRARY=/usr/lib/x86_64-linux-gnu/libpython3.10.so ..
-- The CXX compiler identification is GNU 11.4.0
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Build type:
-- Target device: cuda
CMake Error at CMakeLists.txt:45 (message):
Please set VLLM_PYTHON_EXECUTABLE to the path of the desired python version
before running cmake configure.

-- Configuring incomplete, errors occurred!
See also "/home/lftds/Documents/copilot/vllm/build/CMakeFiles/CMakeOutput.log".
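
For the manual configure, CMakeLists.txt:45 wants VLLM_PYTHON_EXECUTABLE rather than PYTHON_EXECUTABLE, so a retry would look roughly like this (a sketch that reuses the cache variables setup.py passed above):

cd /home/lftds/Documents/copilot/vllm/build
cmake .. -DVLLM_TARGET_DEVICE=cuda \
  -DVLLM_PYTHON_EXECUTABLE=/home/lftds/Documents/copilot/bot_venv/bin/python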
