• Deploy xmcp servers with zero configuration

    Vercel now supports xmcp, a framework for building and shipping MCP servers with TypeScript, with zero configuration.

    xmcp uses file-based routing to create tools for your MCP server.

    my-project/
    ├── src/
    │   ├── middleware.ts
    │   └── tools/
    │       ├── greet.ts
    │       └── search.ts
    ├── package.json
    ├── tsconfig.json
    └── xmcp.config.ts

    File-based routing using xmcp
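The xmcp.config.ts file at the project root configures the server. A minimal sketch, assuming an `XmcpConfig` type and an `http` transport option (check the xmcp documentation for the authoritative shape):

```typescript
// xmcp.config.ts -- minimal configuration sketch. The option names here
// are assumptions; see the xmcp documentation for the exact config shape.
import { type XmcpConfig } from "xmcp";

const config: XmcpConfig = {
  // Serve the MCP server over HTTP (xmcp also supports a stdio transport).
  http: true,
};

export default config;
```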

    Once you've created a file for your tool, you define it with a default export, a pattern familiar from other file-based routing frameworks. Below, we create a greet tool.

    // src/tools/greet.ts
    import { z } from "zod";
    import { type InferSchema } from "xmcp";

    // Tool input schema
    export const schema = {
      name: z.string().describe("The name of the user to greet"),
    };

    // Tool metadata
    export const metadata = {
      name: "greet",
      description: "Greet the user",
    };

    // Tool implementation
    export default async function greet({ name }: InferSchema<typeof schema>) {
      const result = `Hello, ${name}!`;
      return {
        content: [{ type: "text", text: result }],
      };
    }

    Learn more about deploying xmcp to Vercel in the documentation.

  • AI Gateway is now generally available

    AI Gateway is now generally available, providing a single unified API to access hundreds of AI models with transparent pricing and built-in observability.

    With sub-20ms latency routing across multiple inference providers, AI Gateway delivers:

    • Transparent pricing with no markup on tokens (including Bring Your Own Keys)

    • Automatic failover for higher availability

    • High rate limits

    • Detailed cost and usage analytics

    You can use AI Gateway with the AI SDK or through the OpenAI-compatible endpoint. With the AI SDK, switching models is as simple as changing the model string.

    Get started with a single API call:

    import { streamText } from 'ai'

    const result = streamText({
      model: 'openai/gpt-5',
      prompt: 'How can AI Gateway not have a markup on tokens?',
    })
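The OpenAI-compatible endpoint can also be called directly over HTTP. A sketch of such a request, assuming the standard OpenAI chat-completions wire format (the base URL shown is an assumption; check the AI Gateway docs for the exact endpoint):

```typescript
// Sketch: calling AI Gateway's OpenAI-compatible endpoint directly.
// The base URL is an assumption; the request body follows the OpenAI
// chat-completions wire format.
const GATEWAY_BASE_URL = "https://ai-gateway.vercel.sh/v1";

// Build a chat-completions request body for a given model and prompt.
function buildChatRequest(model: string, prompt: string) {
  return {
    model, // provider/model slug, e.g. "openai/gpt-5" as in the AI SDK example
    messages: [{ role: "user", content: prompt }],
  };
}

// Send the request with the built-in fetch (Node 18+).
async function chat(apiKey: string, model: string, prompt: string) {
  const res = await fetch(`${GATEWAY_BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(buildChatRequest(model, prompt)),
  });
  return res.json();
}
```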

    Read more about the announcement, learn more about AI Gateway, or get started now.

  • Introducing Streamdown: Open source Markdown for AI streaming

    Streamdown is a new open source, drop-in Markdown renderer built for AI streaming. It powers the AI Elements Response component, but can also be used standalone to give developers a fully composable, independently managed option with npm i streamdown.

    Streamdown is designed to handle unterminated chunks, interactive code blocks, math, and other cases that are unreliable with existing Markdown packages.
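To see why unterminated chunks are a problem, consider a stream cut off mid-markup. A toy sketch of the idea (this is an illustration, not Streamdown's actual implementation) that auto-closes a dangling bold marker:

```typescript
// Toy illustration of the unterminated-chunk problem (not Streamdown's code):
// a stream may end mid-markup, e.g. "some **bold te" with the closing **
// not yet received. A streaming-aware renderer can close such markers so
// the partial output still formats correctly.
function autoCloseBold(chunk: string): string {
  // Count occurrences of the "**" delimiter; an odd count means one is open.
  const delimiters = chunk.split("**").length - 1;
  return delimiters % 2 === 1 ? chunk + "**" : chunk;
}
```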

    It's available now, and ships with:

    • Tailwind typography styles: Preconfigured classes for headings, lists, and code blocks

    • GitHub Flavored Markdown: Tables, task lists, and other GFM features

    • Interactive code blocks: Shiki highlighting with built-in copy button

    • Math support: LaTeX expressions via remark-math and KaTeX

    • Graceful chunk handling: Proper formatting for unterminated Markdown chunks

    • Security hardening: Safe handling of untrusted content with restricted images and links

    You can get started with AI Elements:

    npx ai-elements@latest add response

    Or as a standalone package:

    npm i streamdown

    Read the docs and upgrade your AI-powered streaming.