Allow using local LLMs
It would be good if we could select a local LLM provider, e.g. Ollama.
6 votes
Christopher Aaby commented
I second this. It would:
* make the service usable offline
* make a vital difference in privacy
* allow much more customization and professional use

This is already available, for free, with open source software... it may not be a feature every user wants, but it would take the Options+ AI "implementation" from quaint to professionally useful.
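For context, Ollama already exposes a simple HTTP API on localhost, so a "local provider" option would mostly mean letting the user point the extension at that endpoint. A minimal sketch of the request such an option would need to build (the endpoint and request fields are Ollama's documented defaults; the model name and helper function are illustrative assumptions):

```python
import json

# Ollama's default local endpoint for single-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt, model="llama3"):
    # /api/generate takes a JSON body with the model name and prompt;
    # stream=False requests one complete JSON response instead of a stream.
    return json.dumps({"model": model, "prompt": prompt, "stream": False})

payload = build_request("Summarize this page in one sentence.")
print(payload)
```

Sending `payload` as a POST body to `OLLAMA_URL` (with any HTTP client) would return the completion from the locally running model, with nothing leaving the machine.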