Install
On Linux, install with a single command:
curl -fsSL https://ollama.com/install.sh | sh
Update
To update, simply run the same command again:
curl -fsSL https://ollama.com/install.sh | sh
Translation integration
Run systemctl edit ollama.service and add the following configuration:

[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=*"

Do not edit /etc/systemd/system/ollama.service directly, because that file is overwritten when Ollama is updated.
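With OLLAMA_HOST=0.0.0.0 the server listens on all interfaces (default port 11434), so other machines on the network can reach it. A minimal sketch of checking reachability from a client, assuming Ollama answers GET / with an HTTP 200 banner; the helper name and the 192.168.1.10 address are placeholders, not part of Ollama itself:

```python
import urllib.request
import urllib.error

def server_is_up(base_url: str, timeout: float = 2.0) -> bool:
    """Return True if something answers HTTP at base_url.

    A running Ollama server responds to GET / with status 200 and a
    plain-text banner, so a 200 here suggests the service is reachable.
    """
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

# Example: check a LAN server after setting OLLAMA_HOST=0.0.0.0
# (replace 192.168.1.10 with your server's actual address).
print(server_is_up("http://192.168.1.10:11434"))
```

If this returns False, check that the service was restarted after editing (systemctl restart ollama) and that no firewall blocks port 11434.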
Install the Immersive Translate extension in your browser, then go to the extension settings -> Translation Services -> OpenAI -> Settings. The model can be set manually to one you have downloaded with Ollama.
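Under the hood, an OpenAI-style client such as the extension posts chat requests to Ollama's OpenAI-compatible endpoint. A sketch of the kind of request involved, assuming a reachable server; the host, model name ("qwen2"), and helper function are illustrative, not fixed names:

```python
import json

def build_translate_request(base_url: str, model: str, text: str):
    """Build an OpenAI-style chat request for an Ollama server."""
    # Ollama exposes an OpenAI-compatible API under /v1.
    url = f"{base_url}/v1/chat/completions"
    payload = {
        "model": model,  # a model previously pulled with `ollama pull`
        "messages": [
            {"role": "system", "content": "Translate the user's text into English."},
            {"role": "user", "content": text},
        ],
    }
    return url, json.dumps(payload)

# Placeholder address; point this at your own Ollama server.
url, body = build_translate_request("http://192.168.1.10:11434", "qwen2", "你好")
print(url)
```

POSTing that body with a Content-Type of application/json would return an OpenAI-style chat completion, which is why the extension's OpenAI provider works unchanged against Ollama.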
vscode integration
Install the Continue extension in VS Code, open the extension sidebar, and click the settings button at the bottom to edit the configuration and use Ollama. Here is a reference configuration:

"models": [
  {
    "title": "Llama Code 13b",
    "provider": "ollama",
    "model": "codellama:13b",
    "apiBase": "https://xxx.xxx.xxx"
  }
],
"tabAutocompleteModel": {
  "title": "deepseekCoder 2",
  "provider": "ollama",
  "model": "deepseek-coder-v2",
  "apiBase": "https://xxx.xxx.xxx"
},
"embeddingsProvider": {
  "provider": "ollama",
  "model": "nomic-embed-text",
  "apiBase": "https://xxx.xxx.xxx"
}