
langchain_openai compatibility is too poor #147


Closed
highkay opened this issue May 14, 2025 · 1 comment

Comments

@highkay

highkay commented May 14, 2025

  1. With a ModelScope inference endpoint, it complains that `enable_thinking` was not sent (a possible workaround is sketched below the list).
  2. Most of OpenRouter's free models don't support tool calling.
  3. chutes.ai returns a 500 error; the endpoint itself is correct, so presumably an unsupported parameter is being sent.
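
For item 1, a minimal workaround sketch, assuming the ModelScope-compatible endpoint only needs an `enable_thinking` flag present in the request body. The endpoint URL and model name below are placeholders, and whether `extra_body` is accepted directly by the constructor depends on the langchain_openai version (older versions can pass `model_kwargs={"extra_body": {...}}` instead):

```python
# Hypothetical sketch, not an official fix: inject enable_thinking into the
# request body so the ModelScope-compatible endpoint stops rejecting the call.
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="Qwen/Qwen3-32B",                              # placeholder model name
    base_url="https://api-inference.modelscope.cn/v1",   # placeholder endpoint
    api_key="YOUR_API_KEY",
    # Recent langchain_openai versions expose `extra_body`; on older versions
    # use model_kwargs={"extra_body": {"enable_thinking": False}} instead.
    extra_body={"enable_thinking": False},
)

print(llm.invoke("ping").content)
```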

The engineering side also isn't great. Items 1 and 3 come down to the choice of stack: moving off litellm only to fall into the langchain_openai pit. As for item 2, smolagents shows it can be handled with a CodeAgent. And relying on the model's native context window as the memory solution is pretty crude.

Conclusion: for now the practical value looks low. General-purpose agent frameworks like this all target weak use cases; the positioning doesn't seem to have been thought through yet.

@MagicCube
Collaborator

The documentation already states clearly that reasoning models are not supported at the moment, mainly because reasoning models have many compatibility problems when outputting JSON.
Open source is hard, and open-sourcing LLM applications is even harder; we cannot yet test every mainstream model and gateway on the market. The OpenAI gateway is without question the most mainstream one today.
