SDKs & APIs
Last updated January 21, 2026
AI Gateway provides drop-in compatible APIs that let you switch providers by changing a base URL; no code rewrites are required. Use the same SDKs and tools you already know, with access to 200+ models from every major provider.
Point your existing SDK to the gateway:
AI SDK:

```bash
npm i ai @ai-sdk/openai
```

```ts
import { generateText } from 'ai';

const { text } = await generateText({
  model: 'anthropic/claude-sonnet-4.6',
  prompt: 'Hello!',
});
```

OpenAI Chat Completions API:

```ts
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.AI_GATEWAY_API_KEY,
  baseURL: 'https://ai-gateway.vercel.sh/v1',
});

const response = await client.chat.completions.create({
  model: 'anthropic/claude-sonnet-4.6',
  messages: [{ role: 'user', content: 'Hello!' }],
});
```

OpenAI Responses API:

```ts
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.AI_GATEWAY_API_KEY,
  baseURL: 'https://ai-gateway.vercel.sh/v1',
});

const response = await client.responses.create({
  model: 'anthropic/claude-sonnet-4.6',
  input: 'Hello!',
});
```

Anthropic Messages API:

```ts
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic({
  apiKey: process.env.AI_GATEWAY_API_KEY,
  baseURL: 'https://ai-gateway.vercel.sh',
});

const message = await client.messages.create({
  model: 'anthropic/claude-sonnet-4.6',
  max_tokens: 1024,
  messages: [{ role: 'user', content: 'Hello!' }],
});
```

OpenResponses:

```ts
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.AI_GATEWAY_API_KEY,
  baseURL: 'https://ai-gateway.vercel.sh/openresponses/v1',
});

const response = await client.responses.create({
  model: 'anthropic/claude-sonnet-4.6',
  input: 'Hello!',
});
```

- No vendor lock-in: Switch between Claude, GPT, Gemini, and other models without changing your code
- Unified billing: One invoice for all providers instead of managing multiple accounts
- Built-in fallbacks: Automatic retry with alternative providers if one fails
- Streaming support: Real-time responses with SSE across all compatible endpoints
- Full feature parity: Tool calling, structured outputs, vision, and embeddings work exactly as documented
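The fallback behavior described above is handled server-side by the gateway, so your client code does not change when a provider fails. As a rough illustration of the pattern, here is a minimal client-side sketch; the `tryModels` helper and the second model ID are illustrative assumptions, not part of the gateway API:

```typescript
// Illustrative sketch only: the gateway performs fallback for you on the
// server. This shows the equivalent client-side idea of trying a list of
// models in order until one succeeds.
type Call = (model: string) => Promise<string>;

async function tryModels(models: string[], call: Call): Promise<string> {
  let lastError: unknown;
  for (const model of models) {
    try {
      return await call(model); // first successful response wins
    } catch (err) {
      lastError = err; // remember the failure, try the next model
    }
  }
  throw lastError; // every model failed
}
```

You would pass `tryModels` a list like `['anthropic/claude-sonnet-4.6', 'openai/gpt-4o']` and a function that makes the actual request; with the gateway's built-in fallbacks, this loop is unnecessary.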
| API | Best for | Documentation |
|---|---|---|
| AI SDK (recommended) | Normalizes provider differences, works with AI Gateway automatically | Streaming, Structured outputs, Tools |
| OpenAI Responses API | OpenAI Responses API users | Streaming, Tools, Structured output |
| OpenAI Chat Completions API | Existing OpenAI integrations, broad language support | Chat, Tools, Embeddings |
| Anthropic Messages API | Claude Code, Anthropic SDK users | Messages, Tools, Files |
| OpenResponses | New projects, provider-agnostic design | Streaming, Tools, Vision |
| Python | Python developers | Async, Streaming, Frameworks |
- New project? Use AI SDK. It handles provider differences for you and supports streaming, structured outputs, tool calling, and reasoning across all providers.
- Using the OpenAI SDK? The OpenAI Responses API and Chat Completions API both work by changing your base URL.
- Using Claude Code or the Anthropic SDK? Use the Anthropic Messages API for native feature support.
- Want a provider-agnostic REST API? Use OpenResponses.
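All of the OpenAI-compatible endpoints stream responses as server-sent events. As a hedged sketch of what arrives on the wire with `stream: true` on the Chat Completions API, each event is a `data:` line carrying a JSON chunk, terminated by a `[DONE]` sentinel; the `extractDelta` helper below is illustrative, not part of any SDK (the SDKs expose streaming as async iterables and handle this parsing for you):

```typescript
// Illustrative: parse one SSE line from an OpenAI-style Chat Completions
// stream. Returns the text delta, or null for non-data lines, keep-alive
// comments, and the [DONE] end-of-stream sentinel.
function extractDelta(line: string): string | null {
  if (!line.startsWith('data: ')) return null;
  const payload = line.slice('data: '.length);
  if (payload === '[DONE]') return null; // end-of-stream marker
  const chunk = JSON.parse(payload);
  return chunk.choices?.[0]?.delta?.content ?? null;
}
```

For example, the line `data: {"choices":[{"delta":{"content":"Hel"}}]}` yields the delta `"Hel"`; concatenating deltas reconstructs the full response.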
- Get your API key to start making requests
- Browse available models to find the right model for your use case
- Set up observability to monitor usage and debug requests