
deployment #3

Open
ARCIST-AI opened this issue Jun 10, 2024 · 3 comments

Comments

@ARCIST-AI commented Jun 10, 2024

Hi, I tried the install on the command line.
Screenshot 2024-06-10 at 17 01 59
I got many repetitions of the behaviour shown in the screenshot, and then the following: (base) alimac@Alistairs-Mac-mini ~ % >....
./server -c 2048 --embeddings -m /Users/alimac/llamanet/models/huggingface/microsoft/Phi-3-mini-4k-instruct-gguf/Phi-3-mini-4k-instruct-q4.gguf --port 8000
node:events:498
      throw er; // Unhandled 'error' event
      ^

Error: spawn ./server ENOENT
    at ChildProcess._handle.onexit (node:internal/child_process:286:19)
    at onErrorNT (node:internal/child_process:484:16)
    at process.processTicksAndRejections (node:internal/process/task_queues:82:21)
Emitted 'error' event on ChildProcess instance at:
    at ChildProcess._handle.onexit (node:internal/child_process:292:12)
    at onErrorNT (node:internal/child_process:484:16)
    at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
  errno: -2,
  code: 'ENOENT',
  syscall: 'spawn ./server',
  path: './server',
  spawnargs: [
    '-c',
Any ideas?

@cocktailpeanut (Contributor) commented:

Can you check whether the server file is actually located somewhere inside your ~/llamanet path?
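One quick way to do that check from a terminal (the depth and name pattern are just a guess at where an installer might place the binary):

```shell
# Look for a "server" executable anywhere under ~/llamanet.
find ~/llamanet -maxdepth 5 -name 'server*' -type f

# Or just list the whole tree to see what actually got installed.
ls -R ~/llamanet | head -50
```

If `find` prints nothing, the binary was never downloaded, which would explain the ENOENT.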

@ARCIST-AI (Author) commented Jun 12, 2024

@cocktailpeanut ~/llamanet just has /models in it. I also wanted to ask about using llamanet in conjunction with this: https://discord.gg/uMaGSHNjzc. It is recommended to run it in a Docker container, and I was wondering whether I should have llamanet on board already, or whether adding the lines to the Docker initialization would work anyway.

@ARCIST-AI (Author) commented:

Title of the Issue: Issues with "no valid release" and missing imports

Updated Description:

Hello,

I've encountered a persistent issue where I receive a "no valid release" message when trying to run Llamanet using npx llamanet@latest. Additionally, I've noticed that there are missing imports in the package, which might be contributing to the problem.

Details:

Environment: macOS, Node.js v22.2.0, npm v10.7.0

Commands Run:
npm cache clean --force
npm install -g llamanet@latest

Issues Encountered:
1. Persistent "no valid release" message despite successful reinstallation.
2. Missing imports that are not specified in the documentation or error logs but are implied by the functionality described.

Could you please look into these issues? Any guidance on resolving the "no valid release" message and details on the expected imports would be greatly appreciated. Thank you for your assistance!

Here's the message:
Screenshot 2024-06-13 at 19 17 17

Labels: None yet
Projects: None yet
Development: No branches or pull requests
2 participants