
Kong Releases Volcano: A TypeScript, MCP-Native SDK for Building Production-Ready AI Agents with LLM Reasoning and Real-World Actions


Kong has open-sourced Volcano, a TypeScript SDK that composes multi-step agent workflows across multiple LLM providers with native Model Context Protocol (MCP) tool use. The release coincides with broader MCP capabilities in Kong AI Gateway and Konnect, positioning Volcano as the developer SDK in an MCP-governed control plane.

  • Why Volcano SDK? Because nine lines of code are faster to write and easier to maintain than 100+.
  • Without Volcano SDK: you'd need 100+ lines handling tool schemas, context management, provider switching, error handling, and HTTP clients.
  • With Volcano SDK: nine lines.
import { agent, llmOpenAI, llmAnthropic, mcp } from "volcano-ai";


// Setup: two LLMs, two MCP servers
const planner = llmOpenAI({ model: "gpt-5-mini", apiKey: process.env.OPENAI_API_KEY! });
const executor = llmAnthropic({ model: "claude-4.5-sonnet", apiKey: process.env.ANTHROPIC_API_KEY! });
const database = mcp("https://api.company.com/database/mcp");
const slack = mcp("https://api.company.com/slack/mcp");


// One workflow
await agent({ llm: planner })
  .then({
    prompt: "Analyze last week's sales data",
    mcps: [database]  // Auto-discovers and calls the right tools
  })
  .then({
    llm: executor,  // Switch to Claude
    prompt: "Write an executive summary"
  })
  .then({
    prompt: "Post the summary to #executives",
    mcps: [slack]
  })
  .run();

What does Volcano provide?

Volcano exposes a compact, chainable API, .then(...).run(), that passes intermediate context between steps while switching LLMs per step (e.g., plan with one model, execute with another). It treats MCP as a first-class interface: developers hand Volcano a list of MCP servers, and the SDK performs tool discovery and invocation automatically. Production features include automatic retries, per-step timeouts, connection pooling for MCP servers, OAuth 2.1 authentication, and OpenTelemetry traces/metrics for distributed observability. The project is released under Apache-2.0.
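To make the context-threading idea concrete, here is a minimal, self-contained sketch of how a chainable .then(...).run() pipeline can pass intermediate context between steps. This is purely illustrative; the class and type names below are invented for this sketch and are not Volcano's actual internals.

```typescript
// Illustrative only: a minimal chainable pipeline in the spirit of
// Volcano's .then(...).run() shape. "Pipeline" and "Step" are
// hypothetical names, not part of the volcano-ai API.

type Step = (context: string) => Promise<string>;

class Pipeline {
  private steps: Step[] = [];

  // Queue a step; each step receives the previous step's output as context.
  then(step: Step): Pipeline {
    this.steps.push(step);
    return this;
  }

  // Run the steps in order, threading the intermediate context through.
  async run(initial = ""): Promise<string> {
    let context = initial;
    for (const step of this.steps) {
      context = await step(context);
    }
    return context;
  }
}

// Usage: two stand-in "LLM" steps, each transforming the running context.
const pipeline = new Pipeline()
  .then(async (ctx) => `${ctx}analyzed;`)
  .then(async (ctx) => `${ctx}summarized;`);

pipeline.run("start:").then((result) => {
  console.log(result); // "start:analyzed;summarized;"
});
```

The real SDK layers LLM calls, MCP tool invocation, and error handling on top of this basic shape, but the core ergonomic, a fluent chain whose steps share accumulated context, is the same.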

Here are the key features of the Volcano SDK:

  • Chainable API: build multi-step workflows with a concise .then(...).run() pattern; context flows between steps.
  • MCP-native tool use: pass MCP servers; the SDK auto-discovers and invokes the right tools in each step.
  • Multi-provider LLM support: mix models (e.g., planning with one, executing with another) within one workflow.
  • Streaming of intermediate and final results for responsive agent interactions.
  • Retries & timeouts configurable per step for reliability under real-world failures.
  • Hooks (before/after step) to customize behavior and instrumentation.
  • Typed error handling to surface actionable failures during agent execution.
  • Parallel execution, branching, and loops to express complex control flow.
  • Observability via OpenTelemetry for tracing and metrics across steps and tool calls.
  • OAuth support & connection pooling for secure, efficient access to MCP servers.
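The per-step retries and timeouts above are the kind of reliability plumbing the SDK is meant to save you from writing by hand. As a rough sketch of what that plumbing looks like (the function and option names here, withReliability and StepOptions, are invented for illustration and are not Volcano's API):

```typescript
// Hypothetical sketch of per-step retry + timeout handling. Not Volcano's
// code; it only illustrates the reliability behavior the SDK advertises.

interface StepOptions {
  retries: number;    // how many times to re-attempt a failed step
  timeoutMs: number;  // per-attempt deadline
}

// Reject if the step does not settle within the deadline.
function withTimeout<T>(p: Promise<T>, ms: number): Promise<T> {
  return new Promise((resolve, reject) => {
    const timer = setTimeout(() => reject(new Error("step timed out")), ms);
    p.then(
      (v) => { clearTimeout(timer); resolve(v); },
      (e) => { clearTimeout(timer); reject(e); },
    );
  });
}

// Run a step, retrying on failure up to the configured number of attempts.
async function withReliability<T>(
  step: () => Promise<T>,
  opts: StepOptions,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= opts.retries; attempt++) {
    try {
      return await withTimeout(step(), opts.timeoutMs);
    } catch (err) {
      lastError = err;
    }
  }
  throw lastError;
}

// Usage: a flaky step that succeeds on its second attempt.
let calls = 0;
const flaky = async () => {
  calls++;
  if (calls < 2) throw new Error("transient failure");
  return "ok";
};

withReliability(flaky, { retries: 2, timeoutMs: 1000 }).then((r) => {
  console.log(r); // "ok", reached after one retry
});
```

Multiply this by every step, provider, and MCP connection in a workflow and the "100+ lines of glue" framing from the intro becomes easy to believe.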

Where does it fit in Kong's MCP architecture?

Kong's Konnect platform adds several MCP governance and access layers that complement Volcano's SDK surface:

  • AI Gateway gains MCP gateway features such as server autogeneration from Kong-managed APIs, centralized OAuth 2.1 for MCP servers, and observability over tools, workflows, and prompts in Konnect dashboards. These provide uniform policy enforcement and analytics for MCP traffic.
  • The Konnect Developer Portal can be turned into an MCP server so AI coding tools and agents can discover APIs, request access, and consume endpoints programmatically, reducing manual credential workflows and making API catalogs accessible through MCP.
  • Kong's team also previewed MCP Composer and MCP Runner to design, generate, and operate MCP servers and integrations.

Key Takeaways

  • Volcano is an open-source TypeScript SDK for building multi-step AI agents with first-class MCP tool use.
  • The SDK provides production features (retries, timeouts, connection pooling, OAuth, and OpenTelemetry tracing/metrics) for MCP workflows.
  • Volcano composes multi-LLM plans/executions and auto-discovers/invokes MCP servers/tools, minimizing custom glue code.
  • Kong paired the SDK with platform controls: AI Gateway/Konnect add MCP server autogeneration, centralized OAuth 2.1, and observability.

Kong's Volcano SDK is a pragmatic addition to the MCP ecosystem: a TypeScript-first agent framework that aligns developer workflow with enterprise controls (OAuth 2.1, OpenTelemetry) delivered via AI Gateway and Konnect. The pairing closes a common gap in agent stacks (tool discovery, auth, and observability) without inventing new interfaces beyond MCP. This design prioritizes protocol-native MCP integration over bespoke glue, reducing operational drift and closing auditing gaps as internal agents scale.


Check out the GitHub repo and technical details.


Michal Sutter is a data science professional with a Master of Science in Data Science from the University of Padova. With a solid foundation in statistical analysis, machine learning, and data engineering, Michal excels at transforming complex datasets into actionable insights.
