# Supported LLM Providers and Models
Our platform integrates with multiple Large Language Model (LLM) providers to offer you flexibility and choice in selecting the most suitable AI models for your needs. Each provider offers unique capabilities and model variations, allowing you to leverage state-of-the-art AI technology through a unified interface. This document outlines the currently supported providers, their available models, and the requirements for using them.
| Provider | Prefix | Required API Key | Available Models | Model names URL |
|---|---|---|---|---|
| Google | google_genai | GOOGLE_API_KEY | gemini-2.5-pro-preview-03-25, gemini-2.5-flash-preview-04-17, gemini-2.0-flash, gemini-2.0-flash-lite | https://ai.google.dev/gemini-api/docs/models |
| Anthropic | anthropic | ANTHROPIC_API_KEY | claude-3-7-sonnet-latest, claude-3-5-haiku-latest, claude-3-5-sonnet-latest | https://docs.anthropic.com/en/docs/about-claude/models/all-models |
| OpenAI | openai | OPENAI_API_KEY | gpt-4.1, gpt-4.1-mini, gpt-4.1-nano, o4-mini, o3, o3-mini | https://platform.openai.com/docs/models |
| Mistral AI | mistralai | MISTRAL_API_KEY | codestral-latest, mistral-large-latest, mistral-small-latest, open-mistral-nemo | https://docs.mistral.ai/getting-started/models/models_overview/ |
| Groq | groq | GROQ_API_KEY | gemma2-9b-it, llama-3.3-70b-versatile, llama-3.1-8b-instant, llama-guard-3-8b | https://console.groq.com/docs/models |
| DeepSeek | deepseek | DEEPSEEK_API_KEY | deepseek-chat, deepseek-reasoner | https://api-docs.deepseek.com/quick_start/pricing |
## How it works
When using these models in your application:
- Each model requires its corresponding API key to be set in your user settings
- The model name must be prefixed with the provider's prefix. Examples:
    - Google: `google_genai:gemini-2.0-flash`
    - Anthropic: `anthropic:claude-3-7-sonnet-latest`
    - OpenAI: `openai:gpt-4.1`
- The system will automatically:
    - Validate the presence of the required API key
    - Strip the provider prefix when needed (for providers like Anthropic and Groq)
    - Initialize the appropriate client with the correct endpoints and configurations
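The resolution steps above can be sketched roughly as follows. This is a hypothetical illustration under assumptions: the function name, the exact set of prefix-stripping providers, and the error handling are not taken from the platform's implementation:

```python
import os

# Sketch of the validation/stripping flow described above (illustrative only).
REQUIRED_API_KEYS = {
    "google_genai": "GOOGLE_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "openai": "OPENAI_API_KEY",
}
# Assumed: providers whose clients expect the bare model name, per the docs above.
STRIP_PREFIX = {"anthropic", "groq"}

def resolve_model(qualified_name: str) -> tuple[str, str]:
    """Split 'provider:model', check the API key, and return (provider, model name)."""
    provider, sep, model = qualified_name.partition(":")
    if not sep or not model:
        raise ValueError(f"Expected 'provider:model', got {qualified_name!r}")
    env_var = REQUIRED_API_KEYS.get(provider)
    if env_var is None:
        raise ValueError(f"Unsupported provider: {provider}")
    if not os.environ.get(env_var):
        raise RuntimeError(f"Missing API key: set {env_var} in your user settings")
    # Some clients want the bare model name; others take the qualified one.
    return provider, model if provider in STRIP_PREFIX else qualified_name
```

For example, `resolve_model("anthropic:claude-3-7-sonnet-latest")` would strip the prefix, while an OpenAI model name would pass through unchanged.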
## Future Updates
We are actively working on expanding our supported providers and models. Future updates will include:

- Additional language model providers
- New model versions as they become available
- Enhanced capabilities and specialized models
- Support for more regional endpoints and deployment options
Please check our documentation regularly for updates on newly supported models and providers.