Wednesday
Room 1
16:20 - 17:20
(UTC±00)
Talk (60 min)
MCP Demystified
Model Context Protocol (MCP) is emerging as a de facto standard for integrating tools with LLMs. As with most new technology, especially anything related to AI, it is shrouded in hype and confusion. What exactly is MCP, how is it implemented, and what can it do and not do?
This talk explains the purpose and goals of MCP: solving the problem of integrating large language models with other systems in a consistent, interoperable way.
- What was the state of the art for integrating LLMs and tools prior to MCP?
- What were the problems and limitations of those approaches?
- How does MCP resolve those limitations?
The talk then dives deep into how MCP is implemented by building an MCP server from scratch.
The audience will discover how MCP uses established tech such as JSON-RPC and standard I/O (stdio) to define a common integration pattern for building AI solutions. Once these nuts and bolts are laid bare, the demonstration moves on to solve a real-world problem via the server implementation.
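To give a flavour of the from-scratch build, here is a minimal sketch of a stdio JSON-RPC loop. The method names (initialize, tools/list, tools/call) come from the MCP specification; the echo tool, server name, and simplified error handling are illustrative placeholders rather than the talk's actual demo code.

```python
# Bare-bones sketch of an MCP-style server: JSON-RPC 2.0 messages arrive
# one per line on stdin, and responses are written to stdout.
import json
import sys

def handle(request: dict) -> dict | None:
    method = request.get("method")
    if method == "initialize":
        result = {
            "protocolVersion": "2024-11-05",
            "capabilities": {"tools": {}},
            "serverInfo": {"name": "demo-server", "version": "0.1.0"},  # illustrative
        }
    elif method == "tools/list":
        result = {"tools": [{
            "name": "echo",  # hypothetical tool for illustration
            "description": "Echo the input text back to the caller",
            "inputSchema": {"type": "object",
                            "properties": {"text": {"type": "string"}}},
        }]}
    elif method == "tools/call":
        text = request["params"]["arguments"].get("text", "")
        result = {"content": [{"type": "text", "text": text}]}
    else:
        return None  # notifications (e.g. notifications/initialized) need no reply
    return {"jsonrpc": "2.0", "id": request["id"], "result": result}

for line in sys.stdin:  # one JSON-RPC message per line
    if not line.strip():
        continue
    response = handle(json.loads(line))
    if response is not None:
        sys.stdout.write(json.dumps(response) + "\n")
        sys.stdout.flush()
```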
Finally, the talk explains the less-used capabilities of MCP beyond tools – for example, how the “sampling” capability allows a server to initiate communication with the LLM, a reversal of the typical tool pattern.
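As a rough illustration of sampling, the request a server sends to the client has approximately the following shape. The sampling/createMessage method name is taken from the MCP specification; the prompt, id, and token limit are purely illustrative.

```python
# Illustrative shape of a sampling request: the *server* asks the *client*
# to run an LLM completion on its behalf.
sampling_request = {
    "jsonrpc": "2.0",
    "id": 42,  # arbitrary example id
    "method": "sampling/createMessage",
    "params": {
        "messages": [{
            "role": "user",
            "content": {"type": "text",
                        "text": "Summarise the tool output above in one sentence."},
        }],
        "maxTokens": 200,
    },
}
```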
Outline:
- Why do we want to integrate tools with LLMs?
- Prior state of the art (ChatGPT plugins, LangChain tools) and their limitations
- MCP – what it is
- MCP – how it solves the problems
- Deep dive – What is stdio?
- Deep dive – What is JSON-RPC?
- Deep dive – The steps in the MCP communication protocol (see the sketch after this outline)
- Real-world problem – Implement an MCP server to solve … (problem TBC)
- Beyond tools – what else MCP can do and why sampling matters
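To give a feel for the protocol deep dive, a client-side message sequence might look like the following sketch. The method names follow the MCP specification; the ids, client name, and the echo tool call are assumptions made for illustration.

```python
# Hypothetical walk-through of the core MCP message sequence (client side).
handshake = [
    # 1. The client opens the session and advertises its capabilities.
    {"jsonrpc": "2.0", "id": 1, "method": "initialize",
     "params": {"protocolVersion": "2024-11-05", "capabilities": {},
                "clientInfo": {"name": "demo-client", "version": "0.1.0"}}},
    # 2. After the server's initialize result, the client confirms readiness.
    {"jsonrpc": "2.0", "method": "notifications/initialized"},
    # 3. The client discovers which tools the server offers.
    {"jsonrpc": "2.0", "id": 2, "method": "tools/list"},
    # 4. The client, driven by the LLM, invokes a tool by name.
    {"jsonrpc": "2.0", "id": 3, "method": "tools/call",
     "params": {"name": "echo", "arguments": {"text": "hello"}}},
]
```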