OpenAI Codex

Last updated March 16, 2026

OpenAI Codex is OpenAI's agentic coding tool. You can configure it to use Vercel AI Gateway, enabling you to:

  • Route requests through multiple AI providers
  • Monitor traffic and spend in your AI Gateway Overview
  • View detailed traces in Vercel Observability under AI
  • Use any model available through the gateway

Configure Codex to use AI Gateway through its configuration file, so the settings persist across sessions.

  1. Follow the installation instructions on the OpenAI Codex repository to install the Codex CLI tool.

  2. Set your AI Gateway API key in your shell configuration file, for example in ~/.zshrc or ~/.bashrc:

    export AI_GATEWAY_API_KEY="your-ai-gateway-api-key"

    After adding this, reload your shell configuration:

    source ~/.zshrc  # or source ~/.bashrc
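    Before moving on, you can confirm the key is actually exported in your current shell. A quick sketch (the variable name matches the `env_key` used in the config below):

    ```shell
    # Quick check: confirm the key is exported before launching Codex.
    if [ -n "${AI_GATEWAY_API_KEY}" ]; then
      echo "AI_GATEWAY_API_KEY is set"
    else
      echo "AI_GATEWAY_API_KEY is missing" >&2
    fi
    ```

    If the check reports the key as missing, make sure you reloaded the shell configuration file you edited (or open a new terminal).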
  3. Open ~/.codex/config.toml and add the following:

    ~/.codex/config.toml
    [model_providers.vercel]
    name = "Vercel AI Gateway"
    base_url = "https://ai-gateway.vercel.sh/v1"
    env_key = "AI_GATEWAY_API_KEY"
    wire_api = "responses"
     
    [profiles.vercel]
    model_provider = "vercel"
    model = "openai/gpt-5.2-codex"

    The configuration above:

    • Sets up a model provider named vercel that points to the AI Gateway
    • References your AI_GATEWAY_API_KEY environment variable
    • Creates a vercel profile that uses the Vercel provider
    • Specifies openai/gpt-5.2-codex as the default model
    • Uses wire_api = "responses" for the OpenAI Responses API format
  4. Start Codex with the vercel profile:

    codex --profile vercel

    Vercel AI Gateway routes your requests. To confirm, check your AI Gateway Overview in the Vercel dashboard.

  5. To use a different model, update the model field in your config:

    ~/.codex/config.toml
    [profiles.vercel]
    model_provider = "vercel"
    model = "anthropic/claude-sonnet-4.5"
    # Or try other models:
    # model = "google/gemini-3-flash"
    # model = "openai/o3"

    When using non-OpenAI models through the gateway, you may see warnings about model metadata not being found. These warnings are safe to ignore since the gateway handles model routing.

  6. Alternatively, define multiple profiles so you can switch models without editing the config each time. Add each profile to your config file:

    ~/.codex/config.toml
    [model_providers.vercel]
    name = "Vercel AI Gateway"
    base_url = "https://ai-gateway.vercel.sh/v1"
    env_key = "AI_GATEWAY_API_KEY"
    wire_api = "responses"
     
    [profiles.vercel]
    model_provider = "vercel"
    model = "openai/gpt-5.2-codex"
     
    [profiles.fast]
    model_provider = "vercel"
    model = "openai/gpt-4o-mini"
     
    [profiles.reasoning]
    model_provider = "vercel"
    model = "openai/o3"
     
    [profiles.claude]
    model_provider = "vercel"
    model = "anthropic/claude-sonnet-4.5"

    Switch between profiles using the --profile flag:

    codex --profile vercel
    codex --profile claude
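    If you switch profiles often, a small shell helper can save typing. This hypothetical `cx` function (not part of the Codex CLI; add it to your shell configuration file if you find it useful) forwards a short name to the `--profile` flag, defaulting to the `vercel` profile:

    ```shell
    # Hypothetical helper (not part of the Codex CLI): launch Codex with a
    # named profile, defaulting to "vercel" when no argument is given.
    cx() {
      codex --profile "${1:-vercel}"
    }

    # Usage:
    #   cx          # runs: codex --profile vercel
    #   cx claude   # runs: codex --profile claude
    ```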
