Add Groq model support to LLMClient (#1977) #2165
Conversation
Looks good already. Some minor remarks :)
… redundant seed assignment.
- Added _supports_json_format(model: str) to check for JSON support based on the model name.
- Users can now specify JSON format via the 'format' parameter instead of it being hardcoded.
- Removed the duplicate assignment of extra_params['seed'] = seed in GroqClient.complete().
Hi, I left some more comments.
Fixed previous issues and added one comment that needs your attention in groq_client.py
Here https://github.com/Giskard-AI/giskard/pull/2165/files/10a4b68e3841dedb23605c1f5d3f333621942d0f#r2120776154
LGTM, one final thing is to mention it in our docs: https://docs.giskard.ai/en/stable/open_source/setting_up/index.html.
I will update it by today and let you know. :)
I have added the Groq client to the documentation.
@kunjanshah0811 now we need to make sure all tests pass.
Solved the linting error.
@kunjanshah0811 there are some remaining errors. Before pushing, could you make sure they pass locally?
I ran both. Let me know if anything else is needed from my end. @davidberenstein1957 😀 Note: some tests require Hugging Face authentication, so it is done locally via
Awesome @kunjanshah0811. Thanks a lot for the help.
…e-multilingual-MiniLM-L12-v2
Fixed the minor errors. @davidberenstein1957 thanks a lot for the review. :)
Some minor tweaks for the dependencies and we are good to go. Thanks a lot for the help.
pyproject.toml
Outdated
@@ -59,7 +61,13 @@ dev = [
    "mistralai>=1",
    "boto3>=1.34.88",
    "scikit-learn==1.4.2",
    "bert-score>=0.3.13",
why did you duplicate these dependencies here? Is there a specific reason? I'm not sure if we rely on them directly and they might be included in the other libraries, like langchain already?
why did you duplicate these dependencies here? Is there a specific reason? I'm not sure if we rely on them directly and they might be included in the other libraries, like langchain already?
I added those dependencies because I ran into import errors during local testing, and adding them directly resolved the issues at the time. But you're right, other libraries like langchain might already pull some of them in, and I may have added them redundantly without double-checking afterwards. I'm happy to remove any that aren't needed directly.
giskard/llm/client/__init__.py
Outdated
I don't think we should set groq as the default LLM client. Could we revert the modifications here?
I don't think we should set groq as the default LLM client. Could we revert the modifications here?
Thank you for the feedback. Before I make any changes, I want to ensure I understand your request correctly:
- I'll remove Groq from being automatically selected as a default LLM client by modifying the get_default_llm_api() function in __init__.py.
- I'll keep the following in place so Groq can still be used when explicitly selected:
  - The Groq client implementation (groq_client.py)
  - The Groq dependency in pyproject.toml
  - The test case for the Groq client
  - "groq" as a valid option in the API list (users can still explicitly set GSK_LLM_API="groq")
Is this understanding correct? If you'd prefer I completely remove all Groq-related changes, please let me know.
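The behavior described in this exchange can be sketched as follows. The function name get_default_llm_api() comes from the thread, but the environment-variable handling and the "openai" fallback are assumptions, not the PR's actual code:

```python
import os

# "groq" stays a valid explicit choice via GSK_LLM_API, but is never
# auto-selected; the set of valid values and the fallback default
# ("openai") are illustrative assumptions.
_SUPPORTED_APIS = {"openai", "azure", "groq"}

def get_default_llm_api() -> str:
    llm_api = os.getenv("GSK_LLM_API", "openai").lower()
    if llm_api not in _SUPPORTED_APIS:
        raise ValueError(f"Unsupported GSK_LLM_API value: {llm_api!r}")
    return llm_api
```

With this shape, a user who wants Groq opts in with GSK_LLM_API="groq", while everyone else keeps the existing default behavior untouched.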
Hey @kunjanshah0811, yes exactly! Just reverting llm/client/__init__.py should be fine; you can keep groq_client.py as it is.
Hey @kunjanshah0811, yes exactly! Just reverting llm/client/__init__.py should be fine, you can keep groq_client.py as it is.
@henchaves I have made suggested changes with the recent commit. Please have a look and let me know. Thanks a lot 🚀.
docs/open_source/setting_up/index.md
Outdated
# Note: Groq does not currently support embedding models
# Use another provider for embeddings if needed
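Since Groq exposes no embedding models, a setup pairing a Groq LLM with an embedding model from another provider might look like the sketch below. The set_llm_model / set_embedding_model calls are commented out because the exact Giskard API surface and model names are assumptions here, not taken from the PR:

```python
import os

# Placeholder keys: Groq handles chat completions, while a second
# provider (hypothetically OpenAI here) handles embeddings.
os.environ["GROQ_API_KEY"] = "gsk-..."
os.environ["OPENAI_API_KEY"] = "sk-..."

# import giskard
# giskard.llm.set_llm_model("groq/llama3-8b-8192")
# giskard.llm.set_embedding_model("text-embedding-3-small")  # non-Groq provider
```

See https://docs.litellm.ai/docs/embedding/supported_embedding for the full list of embedding providers that can fill the second role.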
Perfect! Thanks.
Could you just mention that users can visit https://docs.litellm.ai/docs/embedding/supported_embedding to see the full list of supported providers?
Perfect! Thanks. Could you just mention to access https://docs.litellm.ai/docs/embedding/supported_embedding to see the full list of supported providers?
- I'll add the documentation reference you suggested in docs/open_source/setting_up/index.md, linking to LiteLLM's supported embedding providers.
yes, thanks!
Thanks a lot @kunjanshah0811! The PR is ready to be merged 🙂
Thank you so much @davidberenstein1957 and @henchaves for your constant support! 🙏✨
@davidberenstein1957 @henchaves @alexcombessie
Description
This PR adds support for using Groq as an additional LLM client in Giskard, without replacing the default LiteLLM client. It enables developers to integrate Groq models like llama3-8b-8192 by configuring and initializing GroqClient explicitly.
Related Issue
Closes: Set any groq model as default LLMClient for giskard #1977
Type of Change
Checklist
- I have read the CODE_OF_CONDUCT.md document.
- I have read the CONTRIBUTING.md guide.
- I have updated pdm.lock by running pdm update-lock (only applicable when pyproject.toml has been modified).
Usage Example
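The example code itself was not captured in this export. A minimal sketch of explicit usage, consistent with the PR description's "configuring and initializing GroqClient explicitly" (the import path and the set_default_client hook are assumptions, so those lines are commented out):

```python
import os

# Placeholder key; a real Groq API key is required at runtime.
os.environ["GROQ_API_KEY"] = "gsk-..."

# Explicit opt-in: Groq is not the default client.
# from giskard.llm.client.groq_client import GroqClient
# import giskard
#
# client = GroqClient(model="llama3-8b-8192")
# giskard.llm.set_default_client(client)
```

Because the client is opt-in, existing Giskard setups keep using the default LiteLLM client unless the user wires Groq in as shown.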