vijaykeshri/localllm

This project uses Ollama as a local inference server, so it can run any model that Ollama supports directly on your system. This example uses the Mistral model to perform operations on a provided file.
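The repository's code is not reproduced here, but a minimal sketch of the idea might look like the following. It assumes Ollama is running locally on its default port (11434) with `mistral` already pulled via `ollama pull mistral`, and that the "operation" is summarizing a text file; the file name, prompt, and function name are illustrative, not the project's actual API.

```python
# Minimal sketch (assumptions, not the repository's actual code): send a
# file's contents to a local Ollama server and ask Mistral to summarize it.
import requests

# Ollama's default HTTP endpoint for single-turn generation.
OLLAMA_URL = "http://localhost:11434/api/generate"


def summarize_file(path: str) -> str:
    """Read a text file and ask the local Mistral model to summarize it."""
    with open(path, "r", encoding="utf-8") as f:
        text = f.read()

    response = requests.post(
        OLLAMA_URL,
        json={
            "model": "mistral",
            "prompt": f"Summarize the following text:\n\n{text}",
            "stream": False,  # return one JSON object instead of a stream
        },
        timeout=120,
    )
    response.raise_for_status()
    # The non-streaming response carries the generated text under "response".
    return response.json()["response"]


if __name__ == "__main__":
    print(summarize_file("input.txt"))  # hypothetical input file
```

Setting `"stream": False` keeps the example simple: Ollama returns a single JSON object rather than a stream of partial tokens, which is usually what you want for one-shot file processing.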
