• x402-mcp enables x402 payments in MCP

    Introducing x402-mcp, a library that integrates with the AI SDK to bring x402 paywalls to Model Context Protocol (MCP) servers, letting agents discover and pay for MCP tools easily and securely.

    With x402-mcp, you can define MCP servers with paidTools that require payment to run, enabling account-less, low-latency, anonymous payments directly in AI workflows. Payments confirm in ~100–200ms, with fees under $0.01 and support for minimums under $0.001.

    Getting started is easy. Here's how you can define a paid tool:

    import { createPaidMcpHandler } from "x402-mcp";
    import z from "zod";

    const handler = createPaidMcpHandler(
      (server) => {
        server.paidTool(
          "add_numbers",
          {
            // declare a price of $0.001
            price: 0.001
          },
          { a: z.number(), b: z.number() },
          async (args) => {
            // ...your tool call
          }
        );
      },
      { recipient: process.env.WALLET_ADDRESS }
    );

    export { handler as GET, handler as POST };

    And integrating with AI SDK MCP Clients takes just one function to enable payments:

    import { experimental_createMCPClient as createMCPClient } from "ai";
    import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";
    import { withPayment } from "x402-mcp";

    const mcpClient = await createMCPClient({
      transport: new StreamableHTTPClientTransport(url),
    }).then((client) => withPayment(client, { account: process.env.PRIVATE_KEY }));

    const tools = await mcpClient.tools();
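    For intuition, the handshake that withPayment automates looks roughly like the toy simulation below: the first call comes back with HTTP 402 and the payment terms, the client attaches a payment proof, and the retry succeeds. All names, the payment-proof format, and the server logic here are illustrative stand-ins, not the real x402 wire protocol.

    ```typescript
    // Toy model of a 402-gated tool call. Illustrative only.
    type PaymentRequired = { status: 402; price: number; recipient: string };
    type Ok = { status: 200; body: string };

    // Stand-in server: rejects unpaid calls with 402 plus payment terms,
    // accepts calls that carry a payment proof.
    function callTool(paymentProof?: string): PaymentRequired | Ok {
      if (!paymentProof) {
        return { status: 402, price: 0.001, recipient: "0xRecipient" };
      }
      return { status: 200, body: "3" }; // e.g. the result of add_numbers(1, 2)
    }

    // Stand-in client: on 402, construct a payment for the quoted terms and
    // retry — the loop a payment-wrapped client performs for you.
    function callWithPayment(): Ok {
      const first = callTool();
      if (first.status === 200) return first;
      const proof = `paid:${first.price}:${first.recipient}`; // stand-in for a signed payment
      const second = callTool(proof);
      if (second.status === 200) return second;
      throw new Error("payment flow failed");
    }
    ```

    The point of the wrapper is that tool callers never see the 402 leg: they invoke the tool once and the payment round-trip happens inside.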

    Read more about x402 or try our full stack x402 AI Starter Kit.

    x402 AI Starter

    Fullstack AI demo of x402 with Next.js, AI SDK, AI Gateway, AI Elements and the Coinbase CDP

    Deploy

  • New Vercel CLI login flow

    Authorize Device Flow

    The vercel login command now uses the industry-standard OAuth 2.0 Device Flow, making authentication more secure and intuitive. You can sign in from any browser-capable device.

    When approving a login, be sure to verify the location, IP, and request time before granting access to your Vercel account.

    Email-based login (vercel login your@email.com) and flags such as --github, --gitlab, --bitbucket, --oob, and --team are deprecated. Beginning February 1, 2026, these methods will no longer be supported.

    Upgrade today with npm i vercel@latest

    Learn more in the docs.

  • LongCat Flash Chat model is now supported in Vercel AI Gateway

    You can now access LongCat Flash Chat, a new model from Meituan focused on agentic tool use, through Vercel AI Gateway with no other provider accounts required. The model dynamically activates parameters based on contextual demands.

    AI Gateway lets you call the model with a consistent, unified API and just a single string update; track usage and cost; and configure performance optimizations, retries, and failover for higher-than-provider-average uptime.

    To use it with the AI SDK v5, start by installing the package:

    pnpm i ai

    Then set the model to meituan/longcat-flash-chat:

    import { streamText } from 'ai'

    const result = streamText({
      model: 'meituan/longcat-flash-chat',
      prompt: 'How does dynamic parameter activation work for AI models?'
    })

    AI Gateway includes built-in observability, Bring Your Own Key support, and intelligent provider routing with automatic retries.

    Learn more about AI Gateway and access the model here.