AI Gateway Provider Options
AI Gateway can route your AI model requests across multiple AI providers. Each provider offers different models, pricing, and performance characteristics. By default, AI Gateway automatically chooses providers for you, with the goal of providing fast and dependable responses.
Provider options allow you to control how your requests are distributed across providers, customizing the order of provider queries and fallbacks.
You can read more about the provider options feature in the AI SDK documentation.
You can use the `order` array to specify the sequence in which providers should be attempted. Providers are identified by their slug string; you can find the slugs in the table of available providers below.
You can also copy a provider's slug using the copy button next to the provider's name on a model's detail page. In the Vercel Dashboard:
- Click the AI tab
- Click the Model List sub-tab on the left
- Click a model entry in the list

The bottom section of the page lists the available providers for that model; the copy button next to a provider's name copies its slug for pasting.
First, ensure you have the necessary package installed:
```bash
pnpm install ai@beta
```
Use the `providerOptions.gateway.order` configuration:

```typescript
// app/api/chat/route.ts
import { streamText } from 'ai';

export async function POST(request: Request) {
  const { prompt } = await request.json();

  const result = streamText({
    model: 'anthropic/claude-4-sonnet',
    prompt,
    providerOptions: {
      gateway: {
        order: ['bedrock', 'anthropic'], // Try Amazon Bedrock first, then Anthropic
      },
    },
  });

  return result.toUIMessageStreamResponse();
}
```
In this example:
- The gateway will first attempt to use Amazon Bedrock to serve the Claude 4 Sonnet model
- If Amazon Bedrock is unavailable or fails, it will fall back to Anthropic
- Other providers (like Vertex AI) are still available but will only be used after the specified providers
You can check which provider was actually used by inspecting the provider metadata in the response.
```typescript
// app/api/chat/route.ts
import { streamText } from 'ai';

export async function POST(request: Request) {
  const { prompt } = await request.json();

  const result = streamText({
    model: 'anthropic/claude-4-sonnet',
    prompt,
    providerOptions: {
      gateway: {
        order: ['bedrock', 'anthropic'],
      },
    },
  });

  // Log which provider was actually used
  console.log(await result.providerMetadata);

  return result.toUIMessageStreamResponse();
}
```
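Because the resolved `providerMetadata` object is keyed by provider, its keys indicate which provider contributed metadata to the response. A small helper like the following can surface that in application code (the helper name and the sample metadata shape are illustrative, not part of the AI SDK):

```typescript
// Illustrative helper (not part of the AI SDK): given the object resolved
// from `result.providerMetadata`, return the provider keys it contains.
function providersUsed(
  metadata: Record<string, Record<string, unknown>>,
): string[] {
  return Object.keys(metadata);
}

// Hypothetical metadata shape, as might be resolved after a request
// served by Amazon Bedrock:
const sampleMetadata = {
  bedrock: { usage: { inputTokens: 12, outputTokens: 48 } },
};

console.log(providersUsed(sampleMetadata)); // ['bedrock']
```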
You can view the available models for a provider in the Model List section under the AI tab in your Vercel dashboard.
| Provider Slug | Provider Name |
| --- | --- |
| anthropic | Anthropic |
| bedrock | Amazon Bedrock |
| cerebras | Cerebras |
| cohere | Cohere |
| deepinfra | DeepInfra |
| deepseek | DeepSeek |
| fireworks | Fireworks |
| groq | Groq |
| inception | Inception |
| mistral | Mistral |
| morph | Morph |
| openai | OpenAI |
| perplexity | Perplexity |
| vertex | Vertex AI |
| xai | xAI |
Provider availability may vary by model. Some models may only be available through specific providers or may have different capabilities depending on the provider used.
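If you assemble the `order` array dynamically (for example, from configuration), it can be worth validating slugs against the table above before sending a request, since a misspelled slug will not match any provider. A minimal sketch, assuming the helper name and error message are illustrative:

```typescript
// Provider slugs from the table above.
const KNOWN_SLUGS = new Set([
  'anthropic', 'bedrock', 'cerebras', 'cohere', 'deepinfra', 'deepseek',
  'fireworks', 'groq', 'inception', 'mistral', 'morph', 'openai',
  'perplexity', 'vertex', 'xai',
]);

// Illustrative helper: build a `providerOptions` value for the gateway,
// throwing early on slugs that are not in the table above.
function gatewayOrder(slugs: string[]) {
  for (const slug of slugs) {
    if (!KNOWN_SLUGS.has(slug)) {
      throw new Error(`Unknown provider slug: ${slug}`);
    }
  }
  return { gateway: { order: slugs } };
}

console.log(gatewayOrder(['bedrock', 'anthropic']));
// { gateway: { order: ['bedrock', 'anthropic'] } }
```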