Add Groq model support to LLMClient (#1977) #2165
Merged: henchaves merged 20 commits into Giskard-AI:main from kunjanshah0811:add-groq-client-support on Jul 3, 2025.
Commits (20):
930099f  feat(llm): add GroqClient support to set any Groq model as default LL…
10a4b68  "Address PR feedback: make JSON format support less hardcoded and fix…
97ac6c6  Fix: Issues of previous comments
b9d3f01  Docs: Added Groq Client
97dc016  Fix: Linting Error
1659045  Fix: Resolve test failures and ensure all tests pass locally
3bc5b74  Merge branch 'main' into add-groq-client-support
4f27293  Add get_config in groq_client.py file (kunjanshah0811)
ef817bf  Fix: code formatting via pre-commit
3a6362a  Fix: ValueError: Could not load model sentence-transformers/paraphras…
ff02a9e  Fix: Remove duplicate dependencies and organize LLM-related packages
8af4202  Apply suggestions from code review
a70fbb6  fix: removed extra spaces in pyproject.toml (davidberenstein1957)
1fb5174  Merge branch 'add-groq-client-support' of https://github.com/kunjansh…
7c86562  refactor(llm): Revert making Groq a default LLM client
ff8d141  docs: Add note about Groq embedding support
c0f3a5b  Merge branch 'main' into add-groq-client-support
9ad79ec  undo changes in llm/client/__init__ (henchaves)
92981aa  update docs order (henchaves)
7c5b277  update pdm.lock (henchaves)
llm/client/__init__.py:

    @@ -10,4 +10,5 @@
         "set_llm_api",
         "set_default_embedding",
         "set_embedding_model",
    +    "GroqClient"
     ]
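The diff above only adds "GroqClient" to the module's `__all__` list. As a reminder of what that controls, here is a minimal sketch using a throwaway module (the module and helper names are illustrative, not Giskard's real code):

```python
import sys
import types

# Build a throwaway module that exports one public class via __all__.
mod = types.ModuleType("demo_llm_client")
exec(
    "class GroqClient: ...\n"
    "class _PrivateHelper: ...\n"
    "__all__ = ['GroqClient']\n",
    mod.__dict__,
)
sys.modules["demo_llm_client"] = mod

# A star import copies only the names listed in __all__:
namespace = {}
exec("from demo_llm_client import *", namespace)
```

After the star import, `GroqClient` is bound in `namespace` while `_PrivateHelper` is not; listing the client in `__all__` is what makes it part of the package's public surface.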
groq_client.py (new file, +93 lines):

    from typing import Optional, Sequence

    from dataclasses import asdict
    from logging import warning
    import logging

    from ..config import LLMConfigurationError
    from ..errors import LLMImportError
    from . import LLMClient
    from .base import ChatMessage

    try:
        from groq import Groq
        import groq
    except ImportError as err:
        raise LLMImportError(flavor="llm") from err


    AUTH_ERROR_MESSAGE = (
        "Could not authenticate with Groq API. Please make sure you have configured the API key by "
        "setting GROQ_API_KEY in the environment."
    )


    def _supports_json_format(model: str) -> bool:
        if "llama-3.3-70b-versatile" in model:
            return True

        if model == "llama-3.1-8b-instant" or model == "gemma2-9b-it":
            return True

        return False


    logger = logging.getLogger(__name__)


    class GroqClient(LLMClient):
        def __init__(
            self,
            model: str = "llama-3.3-70b-versatile",  # Default model for Groq
            client: Groq = None,
            json_mode: Optional[bool] = None,
        ):
            logger.info(f"Initializing GroqClient with model: {model}")
            self.model = model
            self._client = client or Groq()
            self.json_mode = json_mode if json_mode is not None else _supports_json_format(model)
            logger.info("GroqClient initialized successfully")

        def complete(
            self,
            messages: Sequence[ChatMessage],
            temperature: float = 1.0,
            max_tokens: Optional[int] = None,
            caller_id: Optional[str] = None,
            seed: Optional[int] = None,
            format=None,
        ) -> ChatMessage:
            logger.info(f"GroqClient.complete called with model: {self.model}")
            logger.info(f"Messages: {messages}")

            extra_params = dict()

            if seed is not None:
                extra_params["seed"] = seed

            if self.json_mode:
                if format not in (None, "json", "json_object"):
                    warning(f"Unsupported format '{format}', ignoring.")
                    format = None

                if format == "json" or format == "json_object":
                    extra_params["response_format"] = {"type": "json_object"}

            try:
                completion = self._client.chat.completions.create(
                    model=self.model,
                    messages=[asdict(m) for m in messages],
                    temperature=temperature,
                    max_tokens=max_tokens,
                    **extra_params,
                )
            except groq.AuthenticationError as err:
                raise LLMConfigurationError(AUTH_ERROR_MESSAGE) from err

            self.logger.log_call(
                prompt_tokens=completion.usage.prompt_tokens,
                sampled_tokens=completion.usage.completion_tokens,
                model=self.model,
                client_class=self.__class__.__name__,
                caller_id=caller_id,
            )

            msg = completion.choices[0].message

            return ChatMessage(role=msg.role, content=msg.content)
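Stripped of the API call, the request-parameter assembly in `complete()` can be sketched as a standalone function (the helper name `build_extra_params` is mine, but the branching mirrors the code above):

```python
from typing import Optional


def build_extra_params(json_mode: bool, seed: Optional[int], format: Optional[str]) -> dict:
    """Sketch of GroqClient.complete()'s extra request parameters."""
    extra_params = {}
    if seed is not None:
        extra_params["seed"] = seed
    if json_mode:
        # Unsupported formats are ignored (the client logs a warning here).
        if format not in (None, "json", "json_object"):
            format = None
        # JSON output is only requested when the model supports it.
        if format in ("json", "json_object"):
            extra_params["response_format"] = {"type": "json_object"}
    return extra_params
```

For example, `build_extra_params(True, 42, "json")` yields `{"seed": 42, "response_format": {"type": "json_object"}}`, while a model without JSON support (`json_mode=False`) never sends `response_format`, even if `format="json"` was requested.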
I don't think we should set groq as the default LLM client. Could we revert the modifications here?
Thank you for the feedback. Before I make any changes, I want to ensure I understand your request correctly:

- I'll remove Groq from being automatically selected as a default LLM client by modifying the get_default_llm_api() function in __init__.py.
- I'll keep the following in place so Groq can still be used when explicitly selected:

Is this understanding correct? If you'd prefer I completely remove all Groq-related changes, please let me know.
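The point of contention is the automatic selection, not the client itself. A hedged sketch of the kind of environment-based fallback a get_default_llm_api() function performs (the GSK_LLM_API variable name and the exact fallback chain are assumptions for illustration, not necessarily Giskard's real logic):

```python
import os


def pick_default_llm_api() -> str:
    """Illustrative default-API selection: an explicit setting wins,
    and groq is never auto-selected (per the review feedback)."""
    explicit = os.environ.get("GSK_LLM_API")  # assumed variable name
    if explicit:
        return explicit.lower()
    # Without an explicit choice, fall back to a conservative default
    # rather than auto-detecting groq from GROQ_API_KEY.
    return "openai"
```

Under this scheme a user opts into Groq explicitly (e.g. by setting the variable to "groq" or instantiating GroqClient directly), which is exactly the behavior the reviewer asked to preserve.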
Hey @kunjanshah0811, yes exactly! Just reverting llm/client/__init__.py should be fine; you can keep the groq_client.py as it is.
@henchaves I have made suggested changes with the recent commit. Please have a look and let me know. Thanks a lot 🚀.