Use Kontext integrations as tools in any Vercel AI SDK application. The toKontextTools adapter converts your connected integrations into Vercel AI-compatible tools that work with generateText, streamText, and generateObject.

Install

npm install @kontext-dev/js-sdk ai @ai-sdk/openai

Complete example

A CLI application that connects to Kontext, loads tools from the user’s integrations, and runs a multi-step conversation:
import { createKontextClient } from "@kontext-dev/js-sdk/client";
import { toKontextTools } from "@kontext-dev/js-sdk/ai";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import open from "open";
import http from "node:http";
import readline from "node:readline";

function waitForCallback(): Promise<string> {
  return new Promise((resolve, reject) => {
    const server = http.createServer((req, res) => {
      const url = new URL(req.url ?? "/", "http://localhost:3333");
      if (url.pathname === "/callback") {
        res.writeHead(200, { "Content-Type": "text/html" });
        res.end("<h1>Authenticated. You can close this tab.</h1>");
        server.close();
        resolve(url.toString());
      } else {
        // Respond to stray requests (e.g. /favicon.ico) so they don't hang.
        res.writeHead(404);
        res.end();
      }
    });
    server.listen(3333);
    server.on("error", reject);
  });
}

const client = createKontextClient({
  clientId: "app_your-app-id",
  redirectUri: "http://localhost:3333/callback",
  onAuthRequired: async (url) => {
    await open(url.toString());
    return await waitForCallback();
  },
});

await client.connect();
const { tools, systemPrompt, integrations } = await toKontextTools(client);

const rl = readline.createInterface({
  input: process.stdin,
  output: process.stdout,
});
const prompt = await new Promise<string>((resolve) =>
  rl.question("You: ", resolve)
);
rl.close();

const result = await generateText({
  model: openai("gpt-4o"),
  tools,
  system: systemPrompt,
  prompt,
  maxSteps: 5,
});

console.log(result.text);
await client.disconnect();

toKontextTools return value

import { toKontextTools } from "@kontext-dev/js-sdk/ai";

const { tools, systemPrompt, integrations } = await toKontextTools(client);
The function returns three things:
  • tools — A Record<string, CoreTool> compatible with the Vercel AI SDK. Each tool maps to an action from your connected integrations. The adapter also injects a parameterless integration-management helper tool that returns a fresh integrations-management link, which the model can share for connect, manage, and reconnect flows.
  • systemPrompt — A pre-built string that tells the LLM which integrations are connected and which are not. Pass this as the system parameter to give the model context about available capabilities.
  • integrations — A readonly IntegrationInfo[] array with the current state of each integration (connected: true/false, plus optional reason and connectUrl). Use this to build UI indicators or conditional logic.
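The `integrations` array can drive a status line or conditional logic before you start a conversation. A minimal sketch, assuming each `IntegrationInfo` exposes a `name` field alongside the documented `connected`, `reason`, and `connectUrl` properties (the `summarizeIntegrations` helper and the mock data are illustrative, not part of the SDK):

```typescript
// Hypothetical helper: render connection state for a CLI status line.
interface IntegrationInfo {
  name: string;
  connected: boolean;
  reason?: string;
  connectUrl?: string;
}

function summarizeIntegrations(integrations: readonly IntegrationInfo[]): string {
  return integrations
    .map((i) =>
      i.connected
        ? `${i.name}: connected`
        : `${i.name}: not connected${i.reason ? ` (${i.reason})` : ""}`
    )
    .join("\n");
}

// With mock data (in a real app, pass the array from toKontextTools):
console.log(
  summarizeIntegrations([
    { name: "github", connected: true },
    { name: "slack", connected: false, reason: "token expired" },
  ])
);
```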

System prompt

The generated systemPrompt includes a list of all integrations attached to your application and their connection status. This lets the model know what tools are available and respond appropriately when a user asks for something that requires a disconnected integration. It also tells the model to call the integration-management helper tool for a fresh integrations management link (instead of reusing old links) and to prefer Kontext tools over local shell/CLI commands when Kontext can satisfy the integration request. You don’t need to write integration-awareness logic yourself. The prompt handles it.
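If your application has its own instructions, layer them on top of the generated prompt rather than replacing it, so the integration-awareness text survives. A minimal sketch (the placeholder string stands in for the real `systemPrompt` from `toKontextTools`):

```typescript
// Placeholder for the string returned by toKontextTools.
const kontextSystemPrompt = "Connected integrations: github, slack.";
const appInstructions =
  "Answer concisely. Confirm with the user before posting to external services.";

// Pass the combined string as the `system` parameter.
const system = [kontextSystemPrompt, appInstructions].join("\n\n");
console.log(system);
```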

Multi-step conversations with maxSteps

Set maxSteps to allow the model to call multiple tools in sequence before returning a final response. Without this, the model makes one tool call and stops.
const result = await generateText({
  model: openai("gpt-4o"),
  tools,
  system: systemPrompt,
  prompt: "Find my open GitHub PRs and post a summary to Slack",
  maxSteps: 5,
});
Each step is one LLM round-trip. A value of 5 covers most multi-tool workflows. Increase it for complex chains that need more back-and-forth.
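After a multi-step run, the Vercel AI SDK exposes the individual round-trips on `result.steps`, each carrying its tool calls. A hypothetical helper for logging or budget checks, sketched against mock step data rather than a live `generateText` result:

```typescript
// Minimal shape of a step for this sketch; the real AI SDK step object
// carries more fields (text, usage, tool results, ...).
interface StepLike {
  toolCalls: { toolName: string }[];
}

// Count how many tool calls the model made across all steps.
function countToolCalls(steps: StepLike[]): number {
  return steps.reduce((n, s) => n + s.toolCalls.length, 0);
}

// In a real app: countToolCalls(result.steps). With mock data:
console.log(
  countToolCalls([
    { toolCalls: [{ toolName: "github_list_prs" }] },
    { toolCalls: [{ toolName: "slack_post_message" }] },
    { toolCalls: [] }, // final text-only step
  ])
); // → 2
```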

Streaming with streamText

Replace generateText with streamText to stream the response as it’s generated:
import { streamText } from "ai";

const result = streamText({
  model: openai("gpt-4o"),
  tools,
  system: systemPrompt,
  prompt,
  maxSteps: 5,
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
Tools still execute between steps. The stream delivers text tokens as the model produces them, with pauses while tool calls run.
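If you want to surface those pauses, the AI SDK's `fullStream` interleaves text deltas with tool events. A sketch of a renderer that prints tool-call progress to stderr while accumulating text, driven here by a mock stream in place of a real `streamText` result (the part shapes below are simplified):

```typescript
// Simplified stream-part shapes for this sketch.
type StreamPart =
  | { type: "text-delta"; textDelta: string }
  | { type: "tool-call"; toolName: string }
  | { type: "tool-result"; toolName: string };

// Mock stream standing in for result.fullStream.
async function* mockStream(): AsyncGenerator<StreamPart> {
  yield { type: "tool-call", toolName: "github_list_prs" };
  yield { type: "tool-result", toolName: "github_list_prs" };
  yield { type: "text-delta", textDelta: "You have 3 open PRs." };
}

// Accumulate text; log tool activity so the pause is visible.
async function render(stream: AsyncIterable<StreamPart>): Promise<string> {
  let text = "";
  for await (const part of stream) {
    if (part.type === "text-delta") text += part.textDelta;
    else if (part.type === "tool-call") console.error(`[calling ${part.toolName}…]`);
  }
  return text;
}

console.log(await render(mockStream())); // → You have 3 open PRs.
```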

Custom result formatting

Pass a formatResult function to control how tool outputs are presented to the model:
const { tools, systemPrompt } = await toKontextTools(client, {
  formatResult: (result) => {
    return { summary: result.content, timestamp: Date.now() };
  },
});
This is useful when tool outputs are large and you want to trim or reshape them before the model sees them.
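One common reshaping is a hard size cap. A sketch of a truncating `formatResult`, where the 2000-character limit is an arbitrary choice and `result.content` follows the shape shown in the example above:

```typescript
// Cap a string at `max` characters, marking the cut.
function truncate(text: string, max = 2000): string {
  return text.length <= max ? text : `${text.slice(0, max)}… [truncated]`;
}

// Usage with toKontextTools:
//   formatResult: (result) => ({ summary: truncate(String(result.content)) })

console.log(truncate("a".repeat(5), 3)); // → aaa… [truncated]
console.log(truncate("hi")); // → hi
```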

Auth patterns

CLI applications

For CLI tools, use onAuthRequired to open the browser and return the callback URL:
import http from "node:http";
import open from "open";

function waitForCallback(): Promise<string> {
  return new Promise((resolve, reject) => {
    const server = http.createServer((req, res) => {
      const url = new URL(req.url ?? "/", "http://localhost:3333");
      if (url.pathname === "/callback") {
        res.writeHead(200, { "Content-Type": "text/html" });
        res.end("<h1>Authenticated. You can close this tab.</h1>");
        server.close();
        resolve(url.toString());
      } else {
        // Respond to stray requests (e.g. /favicon.ico) so they don't hang.
        res.writeHead(404);
        res.end();
      }
    });
    server.listen(3333);
    server.on("error", reject);
  });
}

const client = createKontextClient({
  clientId: "app_your-app-id",
  redirectUri: "http://localhost:3333/callback",
  onAuthRequired: async (url) => {
    await open(url.toString());
    return await waitForCallback();
  },
});
The SDK does not start a local callback server for you. Your application must receive the callback URL and either return it from onAuthRequired or pass it to client.auth.handleCallback().
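Whichever route you take, it helps to confirm a received request actually targets the registered redirect URI before treating it as the OAuth callback. A hypothetical guard (not part of the SDK):

```typescript
// Check that a received URL matches the redirectUri's origin and path.
function isCallbackUrl(rawUrl: string, redirectUri: string): boolean {
  const received = new URL(rawUrl);
  const expected = new URL(redirectUri);
  return received.origin === expected.origin && received.pathname === expected.pathname;
}

console.log(
  isCallbackUrl("http://localhost:3333/callback?code=abc", "http://localhost:3333/callback")
); // → true
console.log(
  isCallbackUrl("http://localhost:3333/favicon.ico", "http://localhost:3333/callback")
); // → false
```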

Web applications

In browser environments, use the React SDK for popup-based OAuth, or handle the redirect flow manually:
const client = createKontextClient({
  clientId: "app_your-app-id",
  redirectUri: "https://yourapp.com/callback",
  onAuthRequired: async (url) => {
    window.location.href = url.toString();
  },
});

Next steps

  • React SDK — Build browser-based chat UIs with OAuth popup support.
  • Cloudflare Agents — Deploy AI applications on Cloudflare Workers with Durable Object storage.
  • How Kontext Works — Understand the Kontext architecture and credential flow.