
Update models.py: added Groq API integration #995


Open · wants to merge 1 commit into main

Conversation

ldagar315

feat: add GroqApiModel class for Groq API integration

This commit introduces the GroqApiModel class, which mirrors the structure and usage of HfApiModel from the smolagents library. The new class targets the Groq API endpoint and supports chat completions with configurable parameters such as temperature, max tokens, top-p, and streaming responses.

Key changes:

  • Added GroqApiModel with a default model_id of "qwen" (overridable via parameter).
  • Supports both streaming and non-streaming responses.
  • Implements a similar interface to HfApiModel for consistency within the codebase.

This implementation facilitates seamless integration with Groq's API and offers flexibility in choosing different models.
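As a rough illustration of the parameters listed above, here is a hedged sketch (not the PR's actual code) of the JSON body such a class might POST to Groq's OpenAI-style chat-completions endpoint. The endpoint URL and parameter names follow Groq's public API; the defaults (the "qwen" model id from the commit message, temperature, token limits) are illustrative assumptions.

```python
import json

# Groq's chat-completions endpoint (OpenAI-compatible path)
GROQ_CHAT_URL = "https://api.groq.com/openai/v1/chat/completions"

def build_chat_payload(messages, model_id="qwen", temperature=0.7,
                       max_tokens=1024, top_p=1.0, stream=False):
    """Assemble the JSON body for a chat-completion request.

    The keyword arguments correspond to the configurable parameters
    the commit description mentions; defaults here are illustrative.
    """
    return {
        "model": model_id,
        "messages": messages,
        "temperature": temperature,
        "max_tokens": max_tokens,
        "top_p": top_p,
        "stream": stream,
    }

payload = build_chat_payload([{"role": "user", "content": "Hello"}])
body = json.dumps(payload)  # ready to send with any HTTP client + API key
```

Setting `stream=True` would switch the endpoint to server-sent-event chunks, which is the distinction between the streaming and non-streaming paths listed in the key changes.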

@sysradium
Contributor

Hey, cool stuff. Is it more beneficial than https://docs.litellm.ai/docs/providers/groq, which is already there?

@ldagar315
Author

Hey there. No, not especially beneficial. It took me some time to discover that Groq can also be integrated through LiteLLM (I wasn't familiar with LiteLLM). Since Groq is very widely used and didn't have a connector of its own, I decided to add one.

@edlee123

edlee123 commented Mar 22, 2025

Since Groq is OpenAI-compatible, I'm curious what the nuances would be between using GroqApiModel and OpenAIServerModel?

Was just looking at: https://console.groq.com/docs/openai
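To make the comparison concrete: because Groq exposes an OpenAI-compatible API, an OpenAI-style client can simply be pointed at Groq's base URL. A hedged sketch of the configuration involved (the model name is illustrative, and the `GROQ_API_KEY` environment variable is an assumption; the client call itself is shown only as a comment):

```python
import os

# Groq's OpenAI-compatible base URL, per https://console.groq.com/docs/openai
GROQ_OPENAI_BASE = "https://api.groq.com/openai/v1"

def groq_server_model_kwargs(model_id="llama-3.1-8b-instant"):
    """Keyword arguments for an OpenAI-compatible client targeting Groq."""
    return {
        "model_id": model_id,                          # illustrative model
        "api_base": GROQ_OPENAI_BASE,
        "api_key": os.environ.get("GROQ_API_KEY"),     # assumed env var
    }

kwargs = groq_server_model_kwargs()
# e.g. smolagents.OpenAIServerModel(**kwargs)  # not executed here
```

Under that reading, a dedicated GroqApiModel would mainly save the user from supplying `api_base` by hand.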

3 participants