Vercel AI SDK Integration

Expose MachineMarket operations as tools using the Vercel AI SDK's tool() helper with Zod parameter schemas. The same tool set works with both generateText and streamText.

Prerequisites

  • Vercel AI SDK (ai)
  • An LLM provider package (e.g., @ai-sdk/openai)
  • zod
  • The MachineMarket SDK
terminal
npm install ai @ai-sdk/openai zod @machinemarket/sdk

Define tools

tools.ts
import { tool } from "ai";
import { z } from "zod";
import { MachineMarket } from "@machinemarket/sdk";

const mm = new MachineMarket({
  baseUrl: "https://api.machinemarket.ai",
});

export const machineMarketTools = {
  getPricing: tool({
    description: "Get MachineMarket VPS pricing tiers and durations",
    parameters: z.object({}),
    execute: async () => {
      return await mm.getPricing();
    },
  }),

  spawnServer: tool({
    description:
      "Provision a VPS server with MachineMarket. " +
      "Requires a USDC payment tx_hash on Base.",
    parameters: z.object({
      tier: z.enum(["Nano", "Small", "Medium", "Large"])
        .describe("Server tier"),
      template: z.enum(["base", "node", "python", "agent"])
        .describe("Software template"),
      duration: z.enum(["1h", "24h", "7d", "30d"])
        .describe("Server lifetime"),
      tx_hash: z.string()
        .describe("USDC payment transaction hash"),
      wallet_address: z.string()
        .describe("Sender wallet address"),
      region: z.string().optional()
        .describe("Region (default: fsn1)"),
    }),
    execute: async (params) => {
      return await mm.spawn(params);
    },
  }),

  getInstance: tool({
    description: "Get details of a MachineMarket VPS instance",
    parameters: z.object({
      id: z.string().describe("Instance UUID"),
    }),
    execute: async ({ id }) => {
      return await mm.getInstance(id);
    },
  }),

  destroyServer: tool({
    description: "Immediately destroy a MachineMarket VPS instance",
    parameters: z.object({
      id: z.string().describe("Instance UUID to destroy"),
    }),
    execute: async ({ id }) => {
      return await mm.destroyInstance(id);
    },
  }),
};

Use with generateText

example.ts
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { machineMarketTools } from "./tools";

const { text } = await generateText({
  model: openai("gpt-4o"),
  tools: machineMarketTools,
  maxSteps: 5,
  prompt:
    "Check MachineMarket pricing, then provision a Small Node.js " +
    "server for 24h using tx_hash 0xabc...def " +
    "from wallet 0x1234...5678.",
});

console.log(text);
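The prompt above ends with a freshly spawned server, which typically needs a moment to boot before it is reachable. One option is to poll getInstance until it reports a ready state. This is an illustrative sketch only: pollUntil is not part of any SDK here, and the status values are assumptions about the instance payload.

```typescript
// Generic polling helper (illustrative, not part of the MachineMarket SDK).
async function pollUntil<T>(
  fn: () => Promise<T>,
  done: (value: T) => boolean,
  attempts = 20,
  delayMs = 3000
): Promise<T> {
  for (let i = 0; i < attempts; i++) {
    const value = await fn();
    if (done(value)) return value;
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  throw new Error("timed out waiting for condition");
}

// Stub standing in for mm.getInstance(id); assumes a `status` field that
// moves from "provisioning" to "running".
let calls = 0;
const fakeGetInstance = async () => ({
  status: calls++ < 2 ? "provisioning" : "running",
});

const instance = await pollUntil(
  fakeGetInstance,
  (i) => i.status === "running",
  10,
  10
);
console.log(instance.status); // "running"
```

In real code, swap fakeGetInstance for () => mm.getInstance(id) and tune attempts/delayMs to your tier's boot time.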

Use in a Next.js route handler

app/api/chat/route.ts
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";
import { machineMarketTools } from "@/lib/tools";

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai("gpt-4o"),
    tools: machineMarketTools,
    maxSteps: 5,
    messages,
    system:
      "You are an infrastructure assistant. " +
      "You can provision and manage VPS servers using MachineMarket. " +
      "Servers are paid with USDC on Base.",
  });

  return result.toDataStreamResponse();
}
Tip
With maxSteps, the Vercel AI SDK handles the tool-calling loop automatically: the model calls your tools, reads the results, and keeps reasoning without any extra orchestration code.

Next steps