Using a simple export or .env file to allow local LLMs to be used more easily and readily #899
rosegarden-coder
started this conversation in
Ideas
Replies: 1 comment
-
You are right, we can do better here @rosegarden-coder. I agree, and thanks for raising this. +1 to this.
-
Right now, it is cumbersome to use Ollama. One has to run a very long command line like this:
Or use the script with the quick fix described in #787.
Why does Ollama have to be a second-class citizen? I am using 70b local models and granite3-dense:8b for embedding, with solid results.
As I read the tutorial, it says:
Why can't Ollama models be set in a .env file? paper-qa is a gem, but Ollama could be used with it more easily and readily.
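To sketch what this proposal could look like: a `.env` file holding the Ollama model names, plus a small stdlib-only loader. The variable names (`PQA_LLM`, `PQA_EMBEDDING`, `OLLAMA_API_BASE`) and model tags are illustrative assumptions, not an existing paper-qa interface.

```python
import os

# Hypothetical .env contents (names and models are illustrative only):
#   PQA_LLM=ollama/llama3.1:70b
#   PQA_EMBEDDING=ollama/granite3-dense:8b
#   OLLAMA_API_BASE=http://localhost:11434

def load_env(path: str = ".env") -> None:
    """Minimal .env parser: reads KEY=VALUE lines into os.environ.

    Blank lines and '#' comments are skipped; existing environment
    variables are not overwritten (setdefault), so the shell still wins.
    """
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```

With something like this, the long command line collapses to `load_env()` at startup (or an equivalent built into paper-qa), and the model choice lives in one editable file instead of shell history.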