
How to use it in maui #1370

Open
Barshan-Mandal opened this issue Aug 16, 2024 · 7 comments

Comments

@Barshan-Mandal

I want to use TorchSharp for Android and iOS. How can I do it?

@yueyinqiu
Contributor

It might be pretty hard... #1083

@Barshan-Mandal
Author

But there must be a way, since as we know C# on its own is not capable of doing much of what torch does.

@Barshan-Mandal
Author

Please make an easy and seamless integration. You may use PyTorch Mobile.

@asieradzk

You're not supposed to put libtorch on user devices; it's too massive. You pretty much have two options:

  1. Export your models as ONNX and use that for inference.
  2. Build an ASP.NET Core server application hosting the TorchSharp stuff and expose it to the Android/iOS frontend.

@Barshan-Mandal
Author

> You're not supposed to put libtorch on user devices; it's too massive. You pretty much have two options:
>
>   1. Export your models as ONNX and use that for inference.
>   2. Build an ASP.NET Core server application hosting the TorchSharp stuff and expose it to the Android/iOS frontend.

But how is ONNX available for embedded devices?

@asieradzk

> > You're not supposed to put libtorch on user devices; it's too massive. You pretty much have two options:
> >
> >   1. Export your models as ONNX and use that for inference.
> >   2. Build an ASP.NET Core server application hosting the TorchSharp stuff and expose it to the Android/iOS frontend.
>
> But how is ONNX available for embedded devices?

Something exists. For instance, Unity has already made two ONNX inference engines that run on mobile. The most recent one is called Sentis; maybe you can use that? https://unity.com/products/sentis

@NiklasGustafsson
Contributor

TorchSharp supports the platforms that libtorch supports: the CPU backend on Windows x64, macOS M1/M2/M3, and Linux x64. In addition, both Windows and Linux support the libtorch CUDA backends.
