Ollama support? #44
Comments
Any updates on this?

This would be great. Please develop this :) Any update on this?

I'm making this now locally.
Hello, hope you're doing well! First of all, to use Ollama models you need to save the model's embeddings in the kb_s2 folder. Step 1 is to generate these embeddings for retrieval purposes. After that, you'll need to adapt the model's output to match the input format expected by the agents; this is the issue I'm currently working on. That said, I can share my script for loading and integrating Ollama models into the package.
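The embedding step described above can be sketched roughly as follows. This is a hypothetical illustration, not code from the shared branch: the `/api/embeddings` endpoint is Ollama's documented local HTTP API, while the function names and the JSON-per-document layout inside kb_s2 are assumptions.

```python
# Sketch: generate an embedding via a locally running `ollama serve`
# and persist it into the kb_s2 folder for later retrieval.
# NOTE: function names and on-disk layout are illustrative assumptions.
import json
import os
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/embeddings"

def build_payload(model: str, text: str) -> dict:
    """Request body for Ollama's embeddings endpoint."""
    return {"model": model, "prompt": text}

def fetch_embedding(model: str, text: str) -> list:
    """Call the local Ollama server and return the embedding vector."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, text)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["embedding"]

def save_embedding(kb_dir: str, doc_id: str, vector: list) -> str:
    """Persist one embedding into the kb_s2 folder as a JSON file."""
    os.makedirs(kb_dir, exist_ok=True)
    path = os.path.join(kb_dir, f"{doc_id}.json")
    with open(path, "w") as f:
        json.dump({"id": doc_id, "embedding": vector}, f)
    return path
```

With `ollama serve` running, `save_embedding("kb_s2", "doc1", fetch_embedding("llama3", "some text"))` would write one vector per document.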
I'd need a more thorough walkthrough to be able to replicate this. Good work, though!
You need to set the provider's API key to an empty string. @Julianvvz https://github.com/SylvainVerdy/Agent-S/tree/ollama_support (I use `ollama serve` on the command line to run Ollama.) My issue is in manager.py, in the `_generate_dag` method. This new branch worked for me: https://github.com/SylvainVerdy/Agent-S/blob/ollama_support_working (lots of edits in the core folder, etc.; I may have to clean up the code). The LLM managed to automatically launch Spotify on my Windows computer.
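The output-adaptation problem mentioned in this thread (local models often wrap their JSON plan in markdown fences, which breaks downstream parsing in methods like `_generate_dag`) can be sketched as below. This is a minimal illustration under that assumption; `extract_json` is a hypothetical helper, not part of the actual Agent-S API.

```python
# Sketch: strip optional markdown code fences from a model reply before
# parsing it as JSON, so a local Ollama model's output can be fed to the
# agent. The helper name is illustrative, not the project's real API.
import json
import re

def extract_json(raw: str) -> dict:
    """Pull a JSON object out of a possibly fenced model reply."""
    match = re.search(r"```(?:json)?\s*(\{.*\})\s*```", raw, re.DOTALL)
    text = match.group(1) if match else raw
    return json.loads(text)
```

For example, both a fenced reply like `'```json\n{"nodes": []}\n```'` and a bare `'{"nodes": []}'` would parse to the same dict.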