Building AI agents is the new gold rush. But every developer knows the biggest bottleneck: getting the AI to actually talk to your data. Today, travel giant Agoda is tackling this problem head-on. They have officially released APIAgent, an open-source tool designed to turn any REST or GraphQL API into a Model Context Protocol (MCP) server with zero code and zero deployments.
The Problem: The 'Integration Tax'
Until recently, if you wanted your AI agent to check flight prices or query a database, you had to write a custom tool. When Anthropic introduced the Model Context Protocol (MCP), it created a standard way for Large Language Models (LLMs) to connect to external tools.
However, even with MCP, the workflow is tedious. A developer must:
- Write a new MCP server in Python or TypeScript.
- Define every tool and its parameters manually.
- Deploy and maintain that server.
- Update the code every time the underlying API changes.
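To give a sense of the scale, here is what the second step alone looks like for a single endpoint: a hand-maintained tool declaration following the MCP tool schema (the `search_flights` tool and its fields are an illustrative example, not a real Agoda API). A custom server repeats this for every endpoint and must be edited on every schema change:

```python
import json

# One hand-written MCP tool definition of the kind a custom server must
# declare per endpoint. Field names (name, description, inputSchema)
# follow the MCP tool schema; the endpoint itself is made up.
search_flights_tool = {
    "name": "search_flights",
    "description": "Search flights between two airports on a given date.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "origin": {"type": "string"},
            "destination": {"type": "string"},
            "date": {"type": "string", "format": "date"},
        },
        "required": ["origin", "destination", "date"],
    },
}

print(json.dumps(search_flights_tool, indent=2))
```

APIAgent's pitch is that none of this boilerplate needs to exist: the equivalent information already lives in the API's own schema.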
The Agoda team calls this the 'integration tax.' For a company with thousands of internal APIs, writing thousands of MCP servers is not practical. APIAgent is their answer to this scaling problem.
What Is APIAgent?
APIAgent is a universal MCP server. Instead of writing custom logic for every API, you use APIAgent as a proxy. It sits between your LLM (like Claude or GPT-4) and your existing APIs.
The tool is built on a specific technical stack:
- FastMCP: Powers the MCP server layer.
- OpenAI Agents SDK: Handles the language model orchestration.
- DuckDB: An in-process SQL engine used for SQL post-processing.
The 'magic' lies in its ability to understand API documentation. You provide a definition of your API (an OpenAPI specification for REST, or a schema for GraphQL) and APIAgent handles the rest.
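Conceptually, turning an OpenAPI spec into tools reduces to walking the spec's `paths` object and emitting one tool description per operation. A stdlib-only sketch of that idea (the inline spec fragment is made up, and this is a simplification of whatever APIAgent actually does internally):

```python
import json

# A tiny made-up OpenAPI fragment standing in for a real spec.
OPENAPI_SPEC = json.loads("""
{
  "paths": {
    "/hotels/search": {
      "get": {
        "operationId": "searchHotels",
        "parameters": [
          {"name": "city", "in": "query", "schema": {"type": "string"}},
          {"name": "limit", "in": "query", "schema": {"type": "integer"}}
        ]
      }
    }
  }
}
""")

def discover_tools(spec):
    """Turn each OpenAPI operation into an MCP-style tool description."""
    tools = []
    for path, methods in spec.get("paths", {}).items():
        for method, op in methods.items():
            tools.append({
                "name": op.get("operationId", f"{method}_{path}"),
                "method": method.upper(),
                "path": path,
                "params": {p["name"]: p["schema"]["type"]
                           for p in op.get("parameters", [])},
            })
    return tools

print(discover_tools(OPENAPI_SPEC))
```

Because the tool list is derived from the spec rather than hand-written, it stays current whenever the spec changes.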
How It Works
The architecture is straightforward. APIAgent acts as a gateway. When a user asks an AI agent a question, the flow looks like this:
- The Request: The user asks, 'Show me the top 10 hotels in Bangkok with the most reviews.'
- Schema Introspection: APIAgent automatically inspects the API schema to understand the available endpoints and fields.
- The SQL Layer (DuckDB): This is the secret sauce. If the API returns 10,000 unsorted rows, APIAgent uses DuckDB to filter, sort, and aggregate that data locally via SQL before sending the concise result back to the LLM.
- The Response: The JSON data travels back through APIAgent, which formats it for the AI to read.
This system uses dynamic tool discovery. You can point APIAgent at any URL, and it automatically generates the necessary tools for the LLM without manual mapping.
Key Feature: 'Recipe' Learning
One of the key features is Recipe Learning. When a complex natural language query executes successfully, APIAgent can extract the trace and save it as a 'Recipe.'
- These recipes are parameterized templates.
- The next time a similar question is asked, APIAgent uses the recipe directly.
- This skips the expensive LLM reasoning step, which significantly reduces latency and cost.
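A minimal sketch of the recipe idea, under the assumption that a recipe boils down to a parameterized query template keyed by the question's shape (the shape key, template text, and helper names below are all illustrative, not APIAgent's actual internals):

```python
from string import Template

# In-memory recipe store: maps a normalized question shape to a saved
# parameterized template.
recipes = {}

def save_recipe(shape, sql):
    """Record the query produced by a successful LLM-planned execution."""
    recipes[shape] = Template(sql)

def run_from_recipe(shape, **params):
    """Replay a saved recipe with new parameters, skipping the LLM."""
    tpl = recipes.get(shape)
    return tpl.substitute(params) if tpl else None

# The first time, the query goes through full LLM reasoning and is saved:
save_recipe(
    "top_n_hotels_by_reviews",
    "SELECT name FROM hotels WHERE city = '$city' "
    "ORDER BY reviews DESC LIMIT $n",
)

# A similar later question hits the recipe directly, with no LLM call:
print(run_from_recipe("top_n_hotels_by_reviews", city="Bangkok", n=10))
```

Replaying a template is a dictionary lookup plus string substitution, which is why recipes cut both latency and token cost compared with re-running the reasoning loop.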
Key Takeaways
- Universal Protocol Bridge: APIAgent acts as a single, open-source proxy that converts any REST or GraphQL API into a Model Context Protocol (MCP) server. This removes the need to write custom boilerplate code or maintain individual MCP servers for every internal microservice.
- Zero-Code Schema Introspection: The tool is 'configuration-first.' By simply pointing APIAgent at an OpenAPI spec or GraphQL endpoint, it automatically introspects the schema to understand endpoints and fields, then exposes these to the LLM as functional tools without manual mapping.
- Advanced SQL Post-Processing: It integrates DuckDB, an in-process SQL engine, to handle complex data manipulation. If an API returns thousands of unsorted rows or lacks specific filtering, APIAgent uses SQL to sort, aggregate, or join the data locally before delivering a concise answer to the AI.
- Performance via 'Recipe Learning': To address high latency and LLM costs, the agent features Recipe Learning. It records the successful execution trace of a natural language query and saves it as a parameterized template.
- Security-First Architecture: The system is 'safe by default,' operating in a read-only state. Any mutating actions (such as POST, PUT, or DELETE requests) are strictly blocked by the proxy unless a developer explicitly whitelists them in the YAML configuration file.
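The safety gate described above amounts to a method check in the proxy. A hedged sketch, assuming the whitelist reduces to (method, path) pairs parsed from the YAML config (the function name and entries are illustrative):

```python
# Mutating HTTP methods are blocked unless explicitly whitelisted,
# mirroring the "safe by default" behaviour described above.
MUTATING = {"POST", "PUT", "PATCH", "DELETE"}

def is_allowed(method, path, whitelist):
    """Read-only requests pass; mutations need an explicit whitelist entry."""
    method = method.upper()
    if method not in MUTATING:
        return True
    return (method, path) in whitelist

# Whitelist as it might look after parsing a YAML config (made-up entry):
whitelist = {("POST", "/bookings")}

print(is_allowed("GET", "/hotels", whitelist))       # True: read-only
print(is_allowed("DELETE", "/bookings", whitelist))  # False: not whitelisted
print(is_allowed("POST", "/bookings", whitelist))    # True: explicitly allowed
```

Defaulting to read-only means an LLM that hallucinates a destructive call cannot get it past the proxy unless a human has opted that exact action in.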
Check out the PR here.


