v0.7.0
Jarvis can now work completely offline! (Continue reading)
This release adds two new model interfaces.
### Google PaLM
- If you have access to it (it's free), you can use it for chat and for related notes.
### Custom OpenAI-like APIs
- This allows Jarvis to use custom endpoints and models that have an OpenAI-compatible interface.
- Example: [tested] OpenRouter (for ebc000) setup guide
- Example: [not tested] Azure OpenAI (previously requested)
- Example: [tested] Locally served GPT4All (for laurent, and everyone else who showed interest) setup guide
- This is an open-source, offline model (in fact, you may choose from several available models) that you can install and run on a laptop. It can be used for chat, and potentially also for related notes. (Embeddings didn't work for me, probably due to a GPT4All issue, but related notes already support the offline USE model.)
- Running your own server is not an ideal solution for an offline model, as it may be technically challenging for some users, but at the moment this workaround looks like the only viable one, and it doesn't involve many steps.
- Example: [not tested] LocalAI
- This is another self-hosted server that supports many models, in case you run into issues with GPT4All.
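As a rough illustration of what "OpenAI-compatible interface" means here, the sketch below builds a standard chat-completion request that any such server (OpenRouter, Azure OpenAI, a local GPT4All or LocalAI instance) is expected to accept. The helper function, the port number, and the model name are assumptions for the example, not part of Jarvis; check your server's own setup guide for the actual address and model list.

```python
import json
from urllib import request

def build_chat_request(base_url, model, messages, api_key=None):
    """Return (url, headers, body) for an OpenAI-style chat completion.

    Hypothetical helper for illustration only; Jarvis handles this internally.
    """
    # All OpenAI-compatible servers expose the same endpoint path.
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {"Content-Type": "application/json"}
    if api_key:  # local servers often run without authentication
        headers["Authorization"] = f"Bearer {api_key}"
    body = json.dumps({"model": model, "messages": messages}).encode()
    return url, headers, body

# Example against a locally served GPT4All instance (port and model name
# are assumptions; substitute your own).
url, headers, body = build_chat_request(
    "http://localhost:4891/v1",
    "gpt4all-model",
    [{"role": "user", "content": "Hello"}],
)
# To actually send the request (requires a running server):
# req = request.Request(url, data=body, headers=headers)
# reply = json.loads(request.urlopen(req).read())
```

Because only the base URL changes, the same request shape works whether the endpoint is a hosted service or a server on your own laptop, which is what makes the fully offline setup possible.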
Full Changelog: v0.6.0...v0.7.0