Error when configuring an Ollama model following the conf.yaml.example template #112

Open
Albertyao1993 opened this issue May 13, 2025 · 5 comments

@Albertyao1993

Following the conf.yaml.example file, I configured an Ollama model locally. After running main.py, I get an error. Do I need to configure an additional OPENAI_API_KEY?
```yaml
BASIC_MODEL:
  model: "ollama/qwen3:14b"
  base_url: "http://localhost:11434"  # Local service address of Ollama, which can be started/viewed via `ollama serve`
```
```
> uv run main.py
Traceback (most recent call last):
  File "/home/guoqiang/developments/deer-flow/main.py", line 14, in <module>
    from src.workflow import run_agent_workflow_async
  File "/home/guoqiang/developments/deer-flow/src/workflow.py", line 6, in <module>
    from src.graph import build_graph
  File "/home/guoqiang/developments/deer-flow/src/graph/__init__.py", line 4, in <module>
    from .builder import build_graph_with_memory, build_graph
  File "/home/guoqiang/developments/deer-flow/src/graph/builder.py", line 8, in <module>
    from .nodes import (
  File "/home/guoqiang/developments/deer-flow/src/graph/nodes.py", line 14, in <module>
    from src.agents.agents import coder_agent, research_agent, create_agent
  File "/home/guoqiang/developments/deer-flow/src/agents/__init__.py", line 4, in <module>
    from .agents import research_agent, coder_agent
  File "/home/guoqiang/developments/deer-flow/src/agents/agents.py", line 13, in <module>
    from src.llms.llm import get_llm_by_type
  File "/home/guoqiang/developments/deer-flow/src/llms/llm.py", line 48, in <module>
    basic_llm = get_llm_by_type("basic")
  File "/home/guoqiang/developments/deer-flow/src/llms/llm.py", line 42, in get_llm_by_type
    llm = _create_llm_use_conf(llm_type, conf)
  File "/home/guoqiang/developments/deer-flow/src/llms/llm.py", line 27, in _create_llm_use_conf
    return ChatOpenAI(**llm_conf)
  File "/home/guoqiang/developments/deer-flow/.venv/lib/python3.12/site-packages/langchain_core/load/serializable.py", line 125, in __init__
    super().__init__(*args, **kwargs)
  File "/home/guoqiang/developments/deer-flow/.venv/lib/python3.12/site-packages/pydantic/main.py", line 214, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
  File "/home/guoqiang/developments/deer-flow/.venv/lib/python3.12/site-packages/langchain_openai/chat_models/base.py", line 600, in validate_environment
    self.root_client = openai.OpenAI(**client_params, **sync_specific)  # type: ignore[arg-type]
  File "/home/guoqiang/developments/deer-flow/.venv/lib/python3.12/site-packages/openai/_client.py", line 114, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
```
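A stdlib-only sketch of why the error fires. The `fake_openai_client` stub below is hypothetical, standing in for `openai.OpenAI`; the point is that `_create_llm_use_conf` expands the YAML mapping directly into the client constructor, so if neither the config nor the environment supplies a key, the client refuses to start:

```python
import os

# Make sure no ambient key masks the behavior in this demo.
os.environ.pop("OPENAI_API_KEY", None)

def fake_openai_client(**params):
    # Hypothetical stand-in for openai.OpenAI: accepts api_key as a keyword
    # argument and falls back to the OPENAI_API_KEY environment variable.
    api_key = params.get("api_key") or os.environ.get("OPENAI_API_KEY")
    if api_key is None:
        raise RuntimeError("The api_key client option must be set")
    return {**params, "api_key": api_key}

# A config with no api_key, like the conf.yaml above -> the error in the traceback.
llm_conf = {"model": "ollama/qwen3:14b", "base_url": "http://localhost:11434"}
try:
    fake_openai_client(**llm_conf)
except RuntimeError as e:
    print("raised:", e)

# Any placeholder key satisfies the check (Ollama itself ignores the value).
client = fake_openai_client(**llm_conf, api_key="fake")
print(client["api_key"])
```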

@naiheSH

naiheSH commented May 13, 2025

Remove the `.example` suffix from `conf.yaml.example`, then run again.
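For the record, that step is just copying the template. A sketch run in a throwaway directory (in deer-flow itself the equivalent is `cp conf.yaml.example conf.yaml` at the repo root):

```shell
# Demonstrate the copy in a scratch directory so nothing real is touched.
tmp=$(mktemp -d)
printf 'BASIC_MODEL:\n  model: "ollama/qwen3:14b"\n' > "$tmp/conf.yaml.example"
cp "$tmp/conf.yaml.example" "$tmp/conf.yaml"
ls "$tmp"
```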

@Albertyao1993
Author

> Remove the `.example` suffix from `conf.yaml.example`, then run again.

I followed the official documentation: I made a copy as `conf.yaml` with `cp` and configured Ollama in it. Then I got the error saying `OPENAI_API_KEY` must be set.

@naiheSH

naiheSH commented May 13, 2025

You need to configure the Doubao key in conf.yaml.

> Remove the `.example` suffix from `conf.yaml.example`, then run again.

> I followed the official documentation: I made a copy as `conf.yaml` with `cp` and configured Ollama in it. Then I got the error saying `OPENAI_API_KEY` must be set.

@ai-srcflow

This configuration is really rough. It can't even work on the first try. They copied langmanus and still couldn't get it right.

@yin3331

yin3331 commented May 13, 2025

This works. The most important parts: append `/v1` to the base URL, and do not prefix the model name with `ollama/`.

```yaml
BASIC_MODEL:
  model: "mistral-small3.1:24b"
  api_key: fake
  base_url: "http://localhost:11434/v1"
```
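A small sketch of why the `/v1` suffix matters. The assumption here is that deer-flow reaches Ollama through Ollama's OpenAI-compatible API, where an OpenAI-style client joins its request path onto whatever `base_url` it is given; without `/v1`, the request hits a path Ollama's OpenAI-compatible layer does not serve:

```python
def chat_endpoint(base_url: str) -> str:
    # An OpenAI-style client appends the request path to base_url.
    return base_url.rstrip("/") + "/chat/completions"

# Without /v1 the request misses Ollama's OpenAI-compatible route:
print(chat_endpoint("http://localhost:11434"))
# With /v1 it lands on the endpoint Ollama actually serves:
print(chat_endpoint("http://localhost:11434/v1"))
```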
