In this project, Ollama is used as a local inference server to run any model it supports directly on your system. For this example, I have used the Mistral model, and the goal is to perform some operations on a provided file.
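Below is a minimal sketch of how such a setup can be wired together. It assumes Ollama is running locally on its default port (11434) with the `mistral` model already pulled; the file name `input.txt` and the summarization prompt are placeholders for whatever operation you want to run on your file.

```python
# Minimal sketch: send a file's contents to a local Ollama server.
# Assumes `ollama serve` is running on localhost:11434 and `ollama pull mistral`
# has already been done. Only the Python standard library is used.
import json
import urllib.request


def ask_mistral(prompt: str) -> str:
    """Send a single prompt to the local Ollama /api/generate endpoint."""
    payload = json.dumps({
        "model": "mistral",
        "prompt": prompt,
        "stream": False,  # return the full response in one JSON object
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # "input.txt" is a placeholder for the file you want to process.
    with open("input.txt", "r", encoding="utf-8") as f:
        text = f.read()
    print(ask_mistral(f"Summarize the following file:\n\n{text}"))
```

The same pattern works for other operations (extraction, classification, rewriting): only the prompt changes, while the local Ollama server and the chosen model stay the same.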