Data & AI
Ignite · 5 min
BEGINNER

Zero-Toil AI Enablement: Bridging LLM Agents to Microservices with MCP-on-Envoy

This proposal introduces MCP‑on‑Envoy, a lightweight server using Envoy’s ext_proc filter to connect LLM agents with existing REST‑based microservices via OpenAPI mappings. It enables agentic workflows without code changes or service rewrites, simplifying enterprise integration and offering an open‑source bridge between AI agents and legacy architectures.


Jens Kat, ING

When & where

Wednesday, April 1, 13:05-13:10
Room 10 (Zaal 10)
Description
AI agents are becoming first‑class API consumers, yet enterprises already operate large REST‑based microservice ecosystems that should not be rebuilt for agentic workflows. To enable LLM agents without burdening hundreds of teams, we built a thin MCP (Model Context Protocol) server integrated with Envoy’s ext_proc filter. It maps agent tool calls to existing OpenAPI‑defined REST endpoints, bridging LLMs to an existing microservices architecture with no code changes. We’re open‑sourcing MCP‑on‑Envoy for others to adopt.
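The core idea of mapping OpenAPI operations to MCP tool calls can be illustrated with a minimal sketch. This is not the MCP‑on‑Envoy implementation; the function name `tools_from_openapi` and the spec fragment are hypothetical, but the input follows the OpenAPI 3.x structure and the output follows the MCP tool shape (`name`/`description`/`inputSchema`):

```python
# Hypothetical sketch: derive MCP-style tool definitions from an OpenAPI spec,
# so each existing REST operation becomes a tool an LLM agent can call.

def tools_from_openapi(openapi: dict) -> list[dict]:
    """Map each OpenAPI operation to an MCP-style tool definition."""
    tools = []
    for path, methods in openapi.get("paths", {}).items():
        for method, op in methods.items():
            tools.append({
                # Prefer the operationId; fall back to a method/path name.
                "name": op.get("operationId", f"{method}_{path}"),
                "description": op.get("summary", ""),
                # Expose the operation's parameters as a JSON Schema object,
                # the shape MCP uses for tool inputs.
                "inputSchema": {
                    "type": "object",
                    "properties": {
                        p["name"]: p.get("schema", {})
                        for p in op.get("parameters", [])
                    },
                },
            })
    return tools

# Tiny example spec (hypothetical endpoint, for illustration only).
spec = {
    "paths": {
        "/accounts/{id}": {
            "get": {
                "operationId": "getAccount",
                "summary": "Fetch an account by id",
                "parameters": [
                    {"name": "id", "in": "path",
                     "schema": {"type": "string"}},
                ],
            }
        }
    }
}

print(tools_from_openapi(spec)[0]["name"])  # getAccount
```

In the talk's architecture this translation would happen once per service from its published OpenAPI document; the Envoy ext_proc filter then rewrites each incoming tool call into the matching REST request, which is why service teams need no code changes.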
Speakers
Jens Kat

ING

Netherlands

Jens Kat is a senior engineer at ING, where he leads the platform team responsible for ING’s API service mesh. His work focuses on large‑scale API architectures across the control plane and data plane, and on enabling new capabilities, such as LLM agent integrations, for thousands of microservices in highly regulated environments.
