
Provider Options

Last updated January 21, 2026

The OpenResponses API lets you configure AI Gateway behavior using providerOptions. The gateway namespace gives you control over provider routing, fallbacks, and restrictions.

Set up automatic fallbacks so that when your primary model is unavailable, requests route to backup models in order. Use the models array to specify the fallback chain.

fallbacks.ts
const apiKey = process.env.AI_GATEWAY_API_KEY;
 
const response = await fetch('https://ai-gateway.vercel.sh/v1/responses', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${apiKey}`,
  },
  body: JSON.stringify({
    model: 'anthropic/claude-sonnet-4.6',
    input: [{ type: 'message', role: 'user', content: 'Tell me a fun fact about octopuses.' }],
    providerOptions: {
      gateway: {
        models: ['anthropic/claude-sonnet-4.6', 'openai/gpt-5.4', 'google/gemini-3-flash'],
      },
    },
  }),
});

Control the order in which providers are tried using the order array. AI Gateway will attempt providers in the specified order until one succeeds.

routing.ts
const apiKey = process.env.AI_GATEWAY_API_KEY;
 
const response = await fetch('https://ai-gateway.vercel.sh/v1/responses', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${apiKey}`,
  },
  body: JSON.stringify({
    model: 'google/gemini-3-flash',
    input: [{ type: 'message', role: 'user', content: 'Explain quantum computing in one sentence.' }],
    providerOptions: {
      gateway: {
        order: ['google', 'openai', 'anthropic'],
      },
    },
  }),
});

Restrict requests to specific providers using the only array. This ensures your requests only go to approved providers, which can be useful for compliance or cost control.

restriction.ts
const apiKey = process.env.AI_GATEWAY_API_KEY;
 
const response = await fetch('https://ai-gateway.vercel.sh/v1/responses', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${apiKey}`,
  },
  body: JSON.stringify({
    model: 'zai/glm-4.7',
    input: [{ type: 'message', role: 'user', content: 'What makes a great cup of coffee?' }],
    providerOptions: {
      gateway: {
        only: ['zai', 'deepseek'],
      },
    },
  }),
});

You can set per-provider timeouts for BYOK credentials to trigger fast failover when a provider is slow to respond. Pass providerTimeouts in providerOptions.gateway:

"providerOptions": {
  "gateway": {
    "providerTimeouts": {
      "byok": { "anthropic": 3000, "bedrock": 5000 }
    }
  }
}

For full details, limits, and response metadata, see Provider Timeouts.
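Embedded in a full request, the fragment above might look like the sketch below. The timeout values are taken from the fragment (milliseconds per provider); the helper function names are illustrative, not part of the API.

```typescript
const apiKey = process.env.AI_GATEWAY_API_KEY;

// Build a request body with per-provider BYOK timeouts, mirroring the
// JSON fragment above (3s for Anthropic, 5s for Bedrock).
function buildTimeoutRequestBody(prompt: string) {
  return {
    model: 'anthropic/claude-sonnet-4.6',
    input: [{ type: 'message', role: 'user', content: prompt }],
    providerOptions: {
      gateway: {
        providerTimeouts: {
          // Fast failover: give up on a slow provider after this many ms.
          byok: { anthropic: 3000, bedrock: 5000 },
        },
      },
    },
  };
}

async function askWithTimeouts(prompt: string) {
  const response = await fetch('https://ai-gateway.vercel.sh/v1/responses', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(buildTimeoutRequestBody(prompt)),
  });
  return response.json();
}
```

Combined with the models fallback array shown earlier, a timeout on the primary provider lets the request move on to the next entry instead of waiting.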

Use caching: 'auto' in the request body to let AI Gateway automatically add cache markers for providers that require them (like Anthropic). For full details, supported providers, and examples, see Automatic Caching.
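Following the pattern of the earlier examples, enabling automatic caching is one extra field in the request body. A minimal sketch (the helper function name is illustrative; the model slug is reused from the examples above):

```typescript
const apiKey = process.env.AI_GATEWAY_API_KEY;

// Build a request body with automatic caching enabled. With
// caching: 'auto', the gateway adds cache markers for providers
// (like Anthropic) that require them explicitly.
function buildCachedRequestBody(prompt: string) {
  return {
    model: 'anthropic/claude-sonnet-4.6',
    input: [{ type: 'message', role: 'user', content: prompt }],
    caching: 'auto',
  };
}

async function askWithCaching(prompt: string) {
  const response = await fetch('https://ai-gateway.vercel.sh/v1/responses', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(buildCachedRequestBody(prompt)),
  });
  return response.json();
}
```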

