
Fatal error: xxx: No such file or directory #15518

Open
LOTEAT opened this issue Sep 4, 2024 · 6 comments

Comments


LOTEAT commented Sep 4, 2024

I want to install Apollo 9.0 on a server with an RTX 4090 GPU; the NVIDIA driver version is 535.161.07 and the CUDA version is 12.2.

While compiling the source code following the tutorial, I encountered this error:

(04:15:37) ERROR: /apollo/modules/drivers/lidar/vanjeelidar/BUILD:8:17: Compiling modules/drivers/lidar/vanjeelidar/src/vanjeelidar_component.cpp failed: (Exit 1): crosstool_wrapper_driver_is_not_gcc failed: error executing command external/local_config_cuda/crosstool/clang/bin/crosstool_wrapper_driver_is_not_gcc -MD -MF ... (remaining 116 arguments skipped)
In file included from modules/drivers/lidar/vanjeelidar/src/vanjeelidar_component.cpp:18:0:
./modules/drivers/lidar/vanjeelidar/src/vanjeelidar_component.h:20:10: fatal error: vanjee_driver/api/lidar_driver.hpp: No such file or directory
 #include <vanjee_driver/api/lidar_driver.hpp>

While looking for a solution, I found this problem mentioned in an earlier issue. One of the suggested fixes was to delete the vanjeelidar folder. After deleting it, I hit another error:

In file included from modules/drivers/lidar/seyond/src/seyond_driver.cpp:16:0:
./modules/drivers/lidar/seyond/src/seyond_driver.h:29:10: fatal error: seyond/sdk_common/inno_lidar_api.h: No such file or directory
 #include "seyond/sdk_common/inno_lidar_api.h"

After deleting that one too, I found that yet another LiDAR driver package could not be found.
Because the GPU is an RTX 4090, I made the following additional changes when configuring Docker:

  1. Modify docker/scripts/dev_start.sh
    VERSION_X86_64="dev-x86_64-18.04-20231128_2222"
  2. Modify third_party/centerpoint_infer_op/workspace.bzl
"""Loads the paddlelite library"""

# Sanitize a dependency so that it works correctly from code that includes
# Apollo as a submodule.
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

def clean_dep(dep):
    return str(Label(dep))

def repo():
    http_archive(
        name = "centerpoint_infer_op-x86_64",
        sha256 = "038470fc2e741ebc43aefe365fc23400bc162c1b4cbb74d8c8019f84f2498190",
        strip_prefix = "centerpoint_infer_op",
        urls = ["https://apollo-pkg-beta.bj.bcebos.com/archive/centerpoint_infer_op_cu118.tar.gz"],
    )

    http_archive(
        name = "centerpoint_infer_op-aarch64",
        sha256 = "e7c933db4237399980c5217fa6a81dff622b00e3a23f0a1deb859743f7977fc1",
        strip_prefix = "centerpoint_infer_op",
        urls = ["https://apollo-pkg-beta.bj.bcebos.com/archive/centerpoint_infer_op-linux-aarch64-1.0.0.tar.gz"],
    )
  3. Modify third_party/paddleinference/workspace.bzl
"""Loads the paddlelite library"""

# Sanitize a dependency so that it works correctly from code that includes
# Apollo as a submodule.
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

def clean_dep(dep):
    return str(Label(dep))

def repo():
    http_archive(
        name = "paddleinference-x86_64",
        sha256 = "7498df1f9bbaf5580c289a67920eea1a975311764c4b12a62c93b33a081e7520",
        strip_prefix = "paddleinference",
        urls = ["https://apollo-pkg-beta.cdn.bcebos.com/archive/paddleinference-cu118-x86.tar.gz"],
    )

    http_archive(
        name = "paddleinference-aarch64",
        sha256 = "048d1d7799ffdd7bd8876e33bc68f28c3af911ff923c10b362340bd83ded04b3",
        strip_prefix = "paddleinference",
        urls = ["https://apollo-pkg-beta.bj.bcebos.com/archive/paddleinference-linux-aarch64-1.0.0.tar.gz"],
    )
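The first of the three edits above can be scripted. A minimal sketch, demonstrated on a scratch file so it is safe to run anywhere; in a real checkout, point it at docker/scripts/dev_start.sh instead (the original VERSION string below is a stand-in, not the actual value shipped in the repo):

```shell
# Pin the x86_64 dev image tag (step 1 above). "$f" is a scratch copy;
# replace it with docker/scripts/dev_start.sh in the Apollo repo root.
f=$(mktemp)
# Stand-in for the original VERSION_X86_64 line (actual value may differ):
echo 'VERSION_X86_64="dev-x86_64-18.04-20230800_0000"' > "$f"
sed -i 's/^VERSION_X86_64=.*/VERSION_X86_64="dev-x86_64-18.04-20231128_2222"/' "$f"
result=$(cat "$f")
echo "$result"
rm -f "$f"
```

After editing the workspace.bzl files, Bazel will re-fetch the archives on the next build and verify them against the pinned sha256 values.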

Therefore, I suspect that the image version substituted into docker/scripts/dev_start.sh is not compatible with the current LiDAR drivers. I used the original dev_start.sh to create the Docker environment. However, I ran into yet another problem:

(04:43:37) ERROR: /apollo/modules/perception/lidar_detection/BUILD:25:12: Compiling modules/perception/lidar_detection/detector/point_pillars_detection/anchor_mask_cuda.cu failed: (Exit 1): crosstool_wrapper_driver_is_not_gcc failed: error executing command external/local_config_cuda/crosstool/clang/bin/crosstool_wrapper_driver_is_not_gcc -MD -MF ... (remaining 49 arguments skipped)
nvcc fatal   : Unsupported gpu architecture 'compute_89'
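For context: 'compute_89' is the Ada Lovelace architecture (RTX 40-series), and nvcc only accepts it from CUDA 11.8 onward, so this error usually means the toolchain inside the container is older than 11.8. A quick, hedged way to check what the container's nvcc actually supports:

```shell
# List the GPU architectures this nvcc can target; compute_89 must appear
# for an RTX 4090 build to succeed. Guarded so it degrades gracefully
# when nvcc is not on PATH.
if command -v nvcc >/dev/null 2>&1; then
  out=$(nvcc --version; nvcc --list-gpu-arch)
else
  out="nvcc not found in PATH"
fi
echo "$out"
```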

Could you please tell me how to solve this problem?

@LOTEAT LOTEAT changed the title Fatal error: xxx_driver/api/xxx.hpp: No such file or directory Fatal error: xxx: No such file or directory Sep 4, 2024

IvDmNe commented Sep 9, 2024

This issue seems related to your bug.

@thundersd

Same problem here. I have an RTX 4080 and solved it by deleting all three drivers. You can try that if you don't need them.


LOTEAT commented Sep 10, 2024

Same problem here. I have an RTX 4080 and solved it by deleting all three drivers. You can try that if you don't need them.

Yes, it works. Thanks!
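For reference, the deletion workaround can be sketched as below. The vanjeelidar and seyond directory names come from the build errors earlier in this thread; the hesai2 name is an assumption, since the third driver is only referred to as "hsLiDAR"/Hesai later on.

```shell
# Hypothetical sketch: remove the LiDAR driver packages whose vendor SDK
# headers are missing, so the build skips them. APOLLO_ROOT defaults to
# /apollo, the source root inside the dev container.
APOLLO_ROOT="${APOLLO_ROOT:-/apollo}"
for drv in vanjeelidar seyond hesai2; do
  dir="$APOLLO_ROOT/modules/drivers/lidar/$drv"
  if [ -d "$dir" ]; then
    echo "removing $dir"
    rm -rf "$dir"
  else
    echo "not present: $dir"
  fi
done
```

Only do this if you do not need those sensors; the directories can be restored from git later.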


LOTEAT commented Sep 10, 2024

This issue seems related to your bug.

Thank you for your answer. When I followed the tutorial in that issue, I found that rt_legacy.h does not contain #ifdef __aarch64__. I suspect the Apollo code was modified after that issue was filed, which led to the missing drivers. Rolling back to the previous version should allow it to compile successfully.

@thundersd

Same problem here. I have an RTX 4080 and solved it by deleting all three drivers. You can try that if you don't need them.

Yes, it works. Thanks!

Apollo's Docker environment does still have some problems with the latest RTX 40-series graphics cards.

I solved the Hesai LiDAR problem by installing the SDK manually: run /apollo/docker/build/install_hesai2_driver.sh. I guess a similar approach will also work for the Vanjee and Seyond LiDARs, but I have not tried it yet. If you give it a try, please let me know whether it works.


LOTEAT commented Sep 10, 2024

Same problem here. I have an RTX 4080 and solved it by deleting all three drivers. You can try that if you don't need them.

Yes, it works. Thanks!

Apollo's Docker environment does still have some problems with the latest RTX 40-series graphics cards.

I solved the Hesai LiDAR problem by installing the SDK manually: run /apollo/docker/build/install_hesai2_driver.sh. I guess a similar approach will also work for the Vanjee and Seyond LiDARs, but I have not tried it yet. If you give it a try, please let me know whether it works.

The Hesai driver can be installed by running /apollo/docker/build/install_hesai2_driver.sh. However, there is no bash script for installing the Seyond driver, and after installing the Vanjee driver, running ./apollo.sh build reports many errors. This method doesn't seem to work.
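When trying a manual SDK install, one quick sanity check is whether the headers the build complains about actually landed somewhere the compiler will look. A sketch, with header paths taken from the include errors earlier in this thread (the search directories are assumptions; the SDKs may install elsewhere):

```shell
# Check whether the vendor SDK headers the build expects are present
# under the common system include directories.
missing=0
for hdr in vanjee_driver/api/lidar_driver.hpp seyond/sdk_common/inno_lidar_api.h; do
  found=$(find /usr/include /usr/local/include -path "*/$hdr" 2>/dev/null | head -n 1)
  if [ -n "$found" ]; then
    echo "found:   $found"
  else
    echo "missing: $hdr"
    missing=$((missing + 1))
  fi
done
echo "$missing header(s) missing"
```

If a header is present but the build still fails, the Bazel toolchain may simply not have that directory on its include path.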
