What you’ll learn:
- Available providers
- Provider comparison
- Choosing a provider
Supported Providers
- OpenAI - GPT-4, GPT-3.5, and other OpenAI models
- Anthropic - Claude Sonnet, Opus, and Haiku models
- Google - Gemini Pro and other Google models
- Vercel AI SDK - Unified interface for multiple providers (recommended for TypeScript)
- Mistral - Mistral models
- DeepSeek - DeepSeek models via OpenAI-compatible API
- TogetherAI - Models hosted on TogetherAI
- xAI - Grok models
- Any OpenAI-compatible API - Use the OpenAI provider with a custom base URL (see the sketch below)
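For the last item, the idea is that any service exposing an OpenAI-compatible API can be reached by overriding the OpenAI client's base URL. Below is a minimal sketch using the official OpenAI Node SDK; the endpoint URL, model name, and environment variable are placeholders, and Metorial's own provider wiring on top of this is not shown.

```typescript
// Sketch: pointing the OpenAI client at an OpenAI-compatible endpoint.
// The base URL, model name, and env var below are placeholders.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.example-provider.com/v1", // hypothetical OpenAI-compatible endpoint
  apiKey: process.env.EXAMPLE_PROVIDER_API_KEY,
});

const completion = await client.chat.completions.create({
  model: "example-model", // placeholder model ID for the target provider
  messages: [{ role: "user", content: "Hello!" }],
});

console.log(completion.choices[0].message.content);
```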
Choosing a Provider
For TypeScript projects, we recommend using the Vercel AI SDK provider (@metorial/ai-sdk). It offers the best developer experience with built-in streaming support, unified tool handling, and compatibility with multiple model providers.
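As a rough sketch of what this looks like in practice: the generateText call and the @ai-sdk/openai import are standard Vercel AI SDK usage, while metorialTools is a hypothetical stand-in for however @metorial/ai-sdk exposes a session's tools; consult that package's documentation for the actual API.

```typescript
// Sketch: using the Vercel AI SDK with Metorial-provided tools.
// metorialTools is a hypothetical placeholder, NOT the real @metorial/ai-sdk API.
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

declare function metorialTools(sessionId: string): Record<string, any>; // assumed shape

const { text } = await generateText({
  model: openai("gpt-4o"),            // any AI SDK model provider works here
  tools: metorialTools("my-session"), // hypothetical: Metorial tools in AI SDK tool format
  prompt: "Summarize the latest deployment logs.",
});

console.log(text);
```

The benefit of this shape is that the tool plumbing and streaming are handled by the AI SDK, so the same call works regardless of which model provider you plug into the model field.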
For Python projects, use the provider-specific packages like MetorialAnthropic or MetorialOpenAI. These work directly with the official Python SDKs from each provider.
For production applications, consider factors like model capabilities, pricing, rate limits, and latency when choosing your provider. Metorial makes it easy to switch providers later if your needs change.
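One reason switching is inexpensive on the TypeScript path is that the AI SDK isolates the provider behind the model argument, so swapping providers is a one-line change. A hedged sketch; the model IDs and the environment toggle are illustrative, not Metorial configuration:

```typescript
// Sketch: switching providers with the Vercel AI SDK is a change to the model argument.
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { anthropic } from "@ai-sdk/anthropic";

const useAnthropic = process.env.USE_ANTHROPIC === "1"; // example toggle, not a Metorial setting

const { text } = await generateText({
  model: useAnthropic ? anthropic("claude-3-5-sonnet-latest") : openai("gpt-4o"),
  prompt: "Draft a short release note.",
});

console.log(text);
```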